Teresa Scassa - Blog


The British Columbia Court of Appeal has ruled that the BC Privacy Commissioner’s enforcement order against Clearview AI is both reasonable and enforceable. Clearview AI is a US-based company that scrapes photographs from the internet, including from social media websites, to build a massive facial recognition database which it offers as a service to law enforcement (very broadly defined). At the time complaints were first lodged with Canadian privacy commissioners, the database held over 3 billion images. Today the number is estimated at around 70 billion.

The order against the company followed a joint investigation report (from the federal Privacy Commissioner and the Commissioners of British Columbia, Alberta and Quebec). The laws of BC, Alberta, and Canada all contain exceptions to the requirements of knowledge and consent for the collection, use and disclosure of personal information where that information is “publicly available”. Clearview AI sought to rely on that exception, arguing that it needed no consent to collect and use personal information such as photographs that were available on the internet.

The term “publicly available” is defined in narrow terms in the regulations, and the BC Court of Appeal found that the Commissioner’s interpretation of this exception to exclude information posted on social media sites was reasonable. In another judicial review application that challenged a similar order against Clearview AI from the Alberta Privacy Commissioner, the Alberta Court of King’s Bench also found the interpretation to be reasonable. However, that court struck down part of the exception in the regulations, finding that it breached Clearview AI’s right to freedom of expression under the Canadian Charter of Rights and Freedoms. Charter arguments were not raised before the BC courts, and so the reasonable interpretation of the BC regulation stands in BC. (You can find my discussion of the Alberta court decision and its implications here).

The Court also found reasonable the BC Commissioner’s ruling that the scraping of photographs from the internet to create a massive facial recognition database was not a purpose that “a reasonable person would consider appropriate in the circumstances.” This baseline privacy norm is shared by the laws of Canada, Alberta and BC. The result of the BC Court of Appeal decision is therefore a clear win for the BC Privacy Commissioner – and frankly, for BC residents. Although the window of time is still open for Clearview AI to seek leave to appeal to the Supreme Court of Canada, without a constitutional angle to this case it is hard to see why the Supreme Court would consider it necessary to review the BC Court of Appeal’s ruling on this interpretation of BC law.

What is perhaps most interesting about this decision is the strong signal it sends about privacy in a digital age. Clearview had argued (as it did in Alberta) that the province’s laws do not apply to its activities. The Court of Appeal disagreed, noting that the test for a “real and substantial connection” to the jurisdiction is necessarily contextual. It framed that context as “the internet as it exists today.” (at para 51) Writing for the unanimous court, Justice Iyer noted that “Clearview’s success as a business depends on its ability to acquire facial data on a global scale to build the databank on which its search engine runs” (at para 52). She observed that the scale of the company’s activities and its inability to exclude BC from its data scraping “supports a conclusion that BC’s relationship to Clearview is substantial, not incidental” (at para 52). She also noted that BC’s private sector data protection law is quasi-constitutional in nature, making transnational enforcement in a global digital age important. She rejected Clearview AI’s argument that just because PIPA is important within BC, its reach should not extend beyond the province’s borders, stating that: “PIPA is simply one of many legislative and common law mechanisms through which the protection of personal privacy is achieved. The importance of the public interest in protecting that fundamental right is highly relevant in the sufficient connection analysis.” (at para 54)

Clearview AI’s business model and the scale of its activities were clearly relevant to the conclusion on jurisdiction. Justice Iyer stated that:

[T]his case is not about the ‘incidental touching’ of a person’s publicly available data. It is about a systematic acquisition of facial data regardless of jurisdiction that enables an enterprise to commercially exploit that information by disclosing it to law enforcement and other entities who are interested in connecting with an individual. (at para 61)

In these circumstances, the Court concluded that BC’s Personal Information Protection Act applies, giving the Commissioner jurisdiction.

These findings on jurisdiction clearly reinforce both the importance of privacy protection and the significant impact of contemporary technology on privacy. Other statements in the decision also highlight this reality. In comments that are relevant to the anticipated reform (in the way that the arrival of the Easter Bunny is anticipated – with childlike faith that becomes cynical over the years) of Canada’s Personal Information Protection and Electronic Documents Act (PIPEDA), Justice Iyer reminds us of the Federal Court of Appeal’s admonition that PIPEDA (and its substantially similar counterparts) “does not aim to balance competing rights, it balances a need [of organizations to use personal data] with a right” (at para 82). The BC Court of Appeal decision joins the growing list of decisions in Canada that highlight the importance of privacy rights – particularly in the face of invasive transnational technologies and business models.

 

Published in Privacy

The Commission d’accès à l’information du Québec (CAI) has released a decision regarding a pilot project to use facial recognition technology (FRT) in Métro stores in Quebec. When this is paired with a 2023 investigation report of the BC Privacy Commissioner regarding the use of FRT in Canadian Tire Stores in that province, there seems to be an emerging consensus around how privacy law will apply to the use of FRT in the retail sector in Canada.

Métro had planned to establish a biometric database to enable the use of FRT at certain of its stores operating under the Métro, Jean Coutu and Super C brands, on a pilot basis. The objective of the system was to reduce shoplifting and fraud. The system would function in conjunction with video surveillance cameras installed at the entrances and exits to the stores. The reference database would consist of images of individuals over the age of majority who had been linked to security incidents involving fraud or shoplifting. Images of all shoppers entering the stores would be captured on the video surveillance cameras and then converted to biometric face prints for matching with the face prints in the reference database.

The CAI initiated an investigation after receiving notice from Métro of the creation of the biometric database. The company agreed to put its launch of the project on hold pending the results of the investigation.

The Quebec case involved the application of Quebec’s Act respecting the protection of personal information in the private sector (PPIPS) as well as its Act to establish a legal framework for information technology (LFIT). The LFIT requires an organization that is planning to create a database of “biometric characteristics and measurements” to disclose this fact to the CAI no later than 60 days before it is to be used. The CAI can impose requirements and can also order the use suspended or the database destroyed if it is not in compliance with any such orders or if it “otherwise constitutes an invasion of privacy” (LFIT art. 45).

Métro argued that the LFIT required individual consent only for the use of a biometric database to ‘confirm or verify’ the identity of an individual (LFIT art. 44). It maintained that its proposed use was different – the goal was not to confirm or verify the identities of shoppers; rather, it was to identify ‘high risk’ shoppers based on matches with the reference database. The CAI rejected this approach, noting the sensitivity of biometric data. Given the quasi-constitutional status of Canadian data protection laws, the CAI found that a ‘large and liberal’ approach to interpretation of the law was required. The CAI found that Métro was conflating the separate concepts of “verification” and “confirmation” of identity. In this case, the biometric faceprints in the probe images would be used to search for a match in the “persons of interest” database. Even if the goal of the generation of the probe images was not to determine the precise identity of all customers – or to add those face prints to the database – the underlying goal was to verify one attribute of the identity of shoppers – i.e., whether there was a match with the persons of interest database. This brought the system within the scope of the LFIT. The additional information in the persons of interest database, which could include the police report number, a description of the past incident, and related personal information, would facilitate the further identification of any matches.

Métro also argued that the validation or confirmation of identity did not happen in one single process and that therefore art. 44 of the LFIT was not engaged. The CAI dismissed what it described as the compartmentalisation of the process. Instead, the law required a consideration of the combined effect of all the steps in the operation of the system.

The company also argued that it had obtained the consent required under art. 12 of the PPIPS. It maintained that the video cameras captured shoppers’ images with their consent, as there was notice of the use of the cameras and the shoppers continued into the stores. It argued that the purposes for which it used the biometric data were consistent with the purposes for which the security cameras were installed, making it a permissible secondary use under art. 12(1) of PPIPS. The CAI rejected this argument, noting that it was not a question of a single collection and a related secondary use. Rather, the generation of biometric faceprints from images captured on video is an independent collection of personal data. That collection must comply with data protection requirements and cannot be treated as a secondary use of already collected data.

The system proposed by Métro would be used on any person entering the designated stores; as such, it was effectively a condition of entry. Individuals would have no ability to opt out and still shop, and there were no alternatives to participation in the FRT scheme. Not only is consent not possible for the general population entering the stores, but those whose images become part of the persons of interest database would also have no choice in the matter.

Métro argued that its obligation to protect its employees and the public outweighed the privacy interests of its customers. The CAI rejected this argument, noting that this was not the test set out in the LFIT, which asked instead whether the database of biometric characteristics “otherwise constitutes an invasion of privacy” (art. 45). The CAI was of the view that to create a database of biometric characteristics and to match these characteristics against face prints generated from data captured from the public without their consent, in circumstances where the law required consent, amounted to a significant infringement of privacy rights. The Commission emphasized again the highly sensitive character of the personal data and issued an order prohibiting the implementation of the proposed system.

The December 2023 BC investigation report was based on that province’s Personal Information Protection Act. It followed a commissioner-initiated investigation into the use by several Canadian Tire stores in BC of FRT systems integrated with video surveillance cameras. As in the Métro pilot, biometric face prints were generated from the surveillance footage and matched against a persons-of-interest database. The stated goals of the systems were similar as well – to reduce shoplifting and enhance the security of the stores. As was the case in Quebec, the BC Commissioner found that the generation of biometric face prints was a new collection of personal information that required express consent. The Commissioner found that the stores had not provided adequate notice of collection, making the issue of consent moot. However, he went on to find that even if there had been proper notice, express consent had not been obtained, and consent could not be implied in the circumstances. The collection of biometric faceprint data of everyone entering the stores in question was not for a purpose that a reasonable person would consider appropriate, given the acute sensitivity of the data collected and the risks to the individual that might flow from its misuse, inaccuracy, or from data breaches. Interestingly, in BC, the four stores under investigation removed their FRT systems soon after receiving the notice of investigation. During the investigation, the Commissioner found little evidence to support the need for the systems, with store personnel admitting that the systems added little to their normal security functions. He chastised the retailers for failing both to conduct privacy impact assessments prior to adoption and to put in place measures to evaluate the effectiveness and performance of the systems.

An important difference between the two cases relates to the ability of the CAI to be proactive. In Quebec, the LFIT requires notice to be provided to the Commissioner of the creation of a biometric database in advance of its implementation. This enabled it to rule on the appropriateness of the system before privacy was adversely impacted on a significant scale. By contrast, the systems in BC were in operation for three years before sufficient awareness surfaced to prompt an investigation. Now that powerful biometric technologies are widely available for retail and other uses, governments should be thinking seriously about reforming private sector privacy laws to provide for advance notice requirements – at the very least, for biometric systems.

Following both the Quebec and BC cases, it is difficult to see how broad-based FRT systems integrated with store security cameras could be deployed in a manner consistent with data protection laws – at least under current shopping business models. This suggests that such uses may be emerging as a de facto no-go zone in Canada. Retailers may argue that this reflects a problem with the law, to the extent that it interferes with their business security needs. Yet if privacy is to mean anything, there must be reasonable limits on the collection of personal data – particularly sensitive data. Just because something can be done does not mean it should be. Given the rapid advance of technology, we should be carefully attuned to this. Being FRT face-printed each time one goes to the grocery store for a carton of milk may simply be an unacceptably disproportionate response to an admittedly real problem. It is a use of technology that places burdens and risks on ordinary individuals who have not earned suspicion, and who may have few other choices for accessing basic necessities.

 


A battle over the protection of personal information in the hands of federal political parties (FPPs) has been ongoing now for several years in British Columbia. The BC Supreme Court has just released a decision which marks a significant defeat for the FPPs in their quest to ensure that only minimal privacy obligations apply to their growing collection, use and disclosure of personal information. Although the outcome only green-lights the investigation by BC’s Office of the Information and Privacy Commissioner (OIPC) into the Liberal, New Democrat and Conservative parties’ compliance with the province’s Personal Information Protection Act (PIPA), it is still an important victory for the complainants. The decision affirms the constitutional applicability of PIPA to the FPPs. The tone of the decision also sends a message. It opens with: “The ability of an individual to control their personal information is intimately connected to their individual autonomy, dignity and privacy.” Justice Weatherill confirms that “These fundamental values lie at the heart of democracy” (at para 1).

The dispute originated with complaints brought in 2019 by three BC residents (the complainants) who sought access under PIPA to their personal information in the hands of each of the three main FPPs in their BC ridings. They wanted to know what information had been collected about them, how it was being used, and to whom it was being disclosed. This access right is guaranteed under PIPA. By contrast, no federal law – whether relating to privacy or to elections – provides an equivalent right with respect to political parties. The Canada Elections Act (CEA) was amended in 2018 to include a very limited obligation for FPPs to have privacy policies approved by the Chief Electoral Officer (CEO), published, and kept up to date. These provisions did not include access rights, oversight, or a complaints mechanism. When the responses of the FPPs to the complainants’ PIPA requests proved inadequate, the complainants filed complaints with the OIPC, which initiated an investigation.

Disappointingly, the FPPs resisted this investigation from the outset. They challenged the constitutional basis for the investigation, arguing that the BC law could not apply to FPPs. This issue was referred to an outside adjudicator, who heard arguments and rendered a decision in March 2022. He found that the term “organization” in PIPA included FPPs that collected information about BC residents and that PIPA’s application to the FPPs was constitutional. In April 2022, the FPPs individually filed applications for judicial review of this decision. The adjudicator ruled that he would pause his investigation until the constitutional issues were resolved.

In June of 2023, while the judicial review proceedings were ongoing, the government tabled amendments to the CEA in Bill C-47. These amendments (now passed) permit FPPs to “collect, use, disclose, retain and dispose of personal information in accordance with the party’s privacy policy” (s. 385.1). Section 385.2(3) states: “The purpose of this section is to provide for a national, uniform, exclusive and complete regime applicable to registered parties and eligible parties respecting their collection, use, disclosure, retention and disposal of personal information”. The amendments were no doubt intended to reinforce the constitutional arguments being made in the BC litigation.

In his discussion of these rather cynical amendments, Justice Weatherill quoted extensively from statements of the Chief Electoral Officer of Canada before the Senate Standing Committee on Legal and Constitutional Affairs in which he discussed the limitations of the privacy provisions in the CEA, including the lack of substantive rights and the limited oversight/enforcement. The CEO is quoted as stating “Not a satisfactory regime, if I’m being perfectly honest” (at para 51).

Support for extending privacy obligations to political parties has been gaining momentum, particularly in light of increasingly data-driven strategies, the use of profiling and targeting by political parties, concerns over the security of such detailed information, and general frustration over politicians being able to set their own rules for conduct that would be considered unacceptable by any other actors in the public and private sectors. Perhaps sensing this growing frustration, the federal government introduced Bill C-65 in March of 2024. Among other things, this bill would provide some enforcement powers to the CEO with respect to the privacy obligations in the CEA. Justice Weatherill declined to consider this Bill in his decision, noting that it might never become law and was thus irrelevant to the proceedings.

Justice Weatherill ruled that BC’s PIPA applies to organizations, and that FPPs active in the province fall within the definition of “organization”. The FPPs argued that PIPA should be found inoperative to the extent that it is incompatible with federal law under the constitutional doctrine of paramountcy. They maintained that the CEA addressed the privacy obligations of political parties and that the provincial legislation interfered with that regime. Justice Weatherill disagreed, citing the principle of cooperative federalism. Under this approach, the doctrine of paramountcy receives a narrow interpretation, and where possible “harmonious interpretations of federal and provincial legislation should be favoured over interpretations that result in incompatibility” (at para 121). He found that while PIPA set a higher standard for privacy protection, the two laws were not incompatible. PIPA did not require FPPs to do something that was prohibited under the federal law – all it did was provide additional obligations and oversight. There was no operational conflict between the laws – FPPs could comply with both. Further, there was nothing in PIPA that prevented the FPPs from collecting, using or disclosing personal information for political purposes. It simply provided additional protections.

Justice Weatherill also declined to find that the application of PIPA to FPPs frustrated a federal purpose. He found that there was no evidence to support the argument that Parliament intended “to establish a regime in respect of the collection and use of personal information by FPPs” (at para 146). He also found that the evidence did not show that it was a clear purpose of the CEA privacy provisions “to enhance, protect and foster the FPPs’ effective participation in the electoral process”. He found that the purpose of these provisions was simply to ensure that the parties had privacy policies in place. Nothing in PIPA frustrated that purpose; rather, Justice Weatherill found that even if there was a valid federal purpose with respect to the privacy policies, “PIPA is in complete alignment with that purpose” (at para 158).

Justice Weatherill also rejected arguments that, under the doctrine of interjurisdictional immunity, the federal government’s legislative authority over federal elections could not be impaired by BC’s PIPA. According to this argument, the Chief Electoral Officer was to have the final say over the handling of personal information by FPPs. The FPPs argued that elections could be disrupted by malefactors who might use access requests under PIPA in a way that could lead to “tying up resources that would otherwise be focused on the campaign and subverting the federal election process” (at para 176). Further, if other provincial privacy laws were extended to FPPs, it might mean that FPPs would have to deal with multiple privacy commissioners, bogging them down even further. Justice Weatherill rejected these arguments, stating:

Requiring FPPs to disclose to British Columbia citizens, on request, the personal information they have about the citizen, together with information as to how it has been used and to whom it has been disclosed has no impact on the core federal elections power. It does not “significantly trammel” the ability of Canadian citizens to seek by lawful means to influence fellow electors, as was found to have been the case in McKay. It does not destroy the right of British Columbians to engage in federal election activity. At most, it may have a minimal impact on the administration of FPPs. This impact is not enough to trigger interjurisdictional immunity. All legislation carries with it some burden of compliance. The petitioners have not shown that this burden is so onerous as to impair them from engaging with voters. (at para 182).

Ultimately, Justice Weatherill ruled that there was no constitutional barrier to the application of PIPA. The result is that the matter goes back to the OIPC for investigation and determination on the merits. It has been a long, drawn-out and expensive process so far, but at least this decision is an unequivocal affirmation of the application of basic privacy principles (at least in BC) to the personal information handling practices of FPPs. It is time for Canada’s political parties to accept obligations similar to those imposed on private sector organizations. If they want to collect, use and disclose data in increasingly complex data-driven voter profiling and targeting activities, they need to stop resisting the commensurate obligations to treat that information with care and to be accountable for their practices.

