The Clearview AI saga has a new Canadian instalment. In December 2024, the British Columbia Supreme Court rendered a decision on Clearview AI’s application for judicial review of an order issued by the BC Privacy Commissioner. This post explores that decision and some of its implications. The first part sets the context, the second discusses the judicial review decision, and the third looks at the ramifications of the larger (and ongoing) legal battle for Canadian privacy law.
Context
Late in 2021, the Privacy Commissioners of BC, Alberta, Quebec and Canada issued a joint report on their investigation into Clearview AI (my post on that report is here). Clearview AI, a US-based company, had created a massive facial recognition technology (FRT) database from images scraped from the internet, which it marketed to law enforcement agencies around the world. The investigation was launched after a story broke in the New York Times about Clearview’s activities. Although Canadian police services initially denied using Clearview AI, the RCMP later admitted that it had purchased two licences. Other Canadian police services made use of promotional free accounts.
The joint investigation found that Clearview AI had breached the private sector data protection laws of the four investigating jurisdictions by collecting and using sensitive personal information without consent, and by doing so for purposes that a reasonable person would not consider appropriate in the circumstances. The practices also violated Quebec’s Act to establish a legal framework for information technology. Clearview AI disagreed with these conclusions. It indicated that it would temporarily cease its operations in Canada but maintained that it was entitled to scrape content from the public web. After Clearview AI failed to respond to the recommendations in the joint report, the Commissioners of Quebec, BC and Alberta issued orders against the company. These orders required Clearview AI to cease offering its services in their jurisdictions, to make best efforts to stop collecting the personal information of those within their respective provincial boundaries, and to delete personal information in its databases that had been improperly collected from those within their boundaries. No order was issued by the federal Commissioner, who does not have order-making powers under the Personal Information Protection and Electronic Documents Act (PIPEDA). He could have applied to the Federal Court for an order but chose not to do so (more on that in Part 3 of this post).
Clearview AI declined to comply with the provincial orders, other than to note that it had already temporarily ceased operations in Canada. It then applied for judicial review of the orders in each of the three provinces.
To date, only the challenge to the BC Order has been heard and decided. In the BC application, Clearview argued that the Commissioner’s decision was unreasonable. Specifically, it argued that BC’s Personal Information Protection Act (PIPA) did not apply to Clearview AI, that the information it scraped was exempt from consent requirements because it was “publicly available information”, and that the Commissioner’s interpretation of purposes that a reasonable person would consider appropriate in the circumstances was unreasonable and failed to consider Charter values. In his December 2024 decision, Justice Shergill of the BC Supreme Court disagreed, upholding the Commissioner’s order.
The BC Supreme Court Decision on Judicial Review
Justice Shergill confirmed that BC’s PIPA applies to Clearview AI’s activities, notwithstanding the fact that Clearview AI is a US-based company. He noted that applying the ‘real and substantial connection’ test, which considers the nature and extent of connections between a party’s activities and the jurisdiction in which proceedings are initiated, leads to that conclusion. There was evidence that Clearview AI’s database had been marketed to and used by police services in BC, as well as by the RCMP, which polices many parts of the province. Further, Justice Shergill noted that Clearview’s data scraping practices were carried out worldwide and captured data about BC individuals, including, in all likelihood, data from websites hosted in BC. Interestingly, he also found that Clearview’s scraping of images from social media sites such as Facebook, YouTube and Instagram created a sufficient connection, as these sites “undoubtedly have hundreds of thousands if not millions of users in British Columbia” (at para 91). In reaching his conclusion, Justice Shergill emphasized “the important role that privacy plays in the preservation of our societal values, the ‘quasi-constitutional’ status afforded to privacy legislation, and the increasing significance of privacy laws as technology advances” (at para 95). He also found that there was nothing unfair about applying BC’s PIPA to Clearview AI, as the company “chose to enter British Columbia and market its product to local law enforcement agencies. It also chooses to scrape data from the Internet which involves personal information of people in British Columbia” (at para 107).
Sections 12(1)(e), 15(1)(e) and 18(1)(e) of PIPA provide exceptions to the requirement of knowledge and consent for the collection, use and disclosure of personal information where “the personal information is available to the public” as set out in the regulations. The PIPA Regulations include “printed or electronic publications, including a magazine, book, or newspaper in printed or electronic form.” Similar exceptions are found in the federal PIPEDA and in Alberta’s Personal Information Protection Act. Clearview AI had argued that public internet websites, including social media sites, fell within the category of electronic publications and that their scraping was thus exempt from consent requirements. The Commissioners disagreed, and Clearview AI challenged this interpretation as unreasonable.
Justice Shergill found that the Commissioners’ conclusion that social media websites fell outside the exception for publicly available information was reasonable. The BC Commissioner was entitled to read the list in the PIPA Regulations as a “narrow set of sources” (at para 160). Justice Shergill reviewed the reasoning in the joint report for why social media sites should be treated differently from the other types of publications mentioned in the exception. These reasons include the fact that social media sites are dynamic rather than static, and that individuals exercise a different level of control over their personal information on social media platforms than on news or other such sites. Although the legislation may require a balancing of privacy rights with private sector interests, Justice Shergill found that it was reasonable for the Commissioner to conclude that privacy rights should be given precedence over commercial interests in the overall context of the legislation. Referencing the Supreme Court of Canada’s decision in Lavigne, Justice Shergill noted that “it is the protection of individual privacy that supports the quasi-constitutional status of privacy legislation, not the right of the organization to collect and use personal information” (at para 174). An individual’s ability to control what happens to their personal information is fundamental to the autonomy and dignity protected by privacy rights, and “it is thus reasonable to conclude that any exception to these important rights should be interpreted narrowly” (at para 175).
Clearview AI argued that posting photos to social media sites reflected an individual’s autonomous choice to surrender the information to the public domain. Justice Shergill preferred the Commissioner’s interpretation, which considered the sensitivity of the biometric information and the impact its collection and use could have on individuals. He referenced the Supreme Court of Canada’s decision in R. v. Bykovets (my post on this case is here), which emphasized that individuals “may choose to divulge certain information for a limited purpose, or to a limited class of persons, and nonetheless retain a reasonable expectation of privacy” (at para 162, citing para 46 of Bykovets).
Clearview AI also argued that the Commissioner was unreasonable in not taking Charter values into account in his interpretation of PIPA. In particular, the company was of the view that freedom of expression, which guarantees the right both to communicate and to receive information, extended to the ability to access and use publicly available information without restriction. Although Justice Shergill found that the Commissioner could have been more direct in his consideration of Charter values, the decision was still not unreasonable on this point. The Commissioner did not engage with the Charter values issues at length because he did not consider the law to be ambiguous: Charter values-based interpretation comes into play to help resolve ambiguities in the law. As Justice Shergill noted, “It is difficult to understand how Clearview’s s. 2(b) Charter rights are infringed through an interpretation of ‘publicly available’ which excludes it from collecting personal information from social media websites without consent” (at para 197).
Like its counterpart legislation in Alberta and at the federal level, BC’s PIPA contains a section that articulates the overarching principle that any collection, use or disclosure of personal information must be for purposes that a reasonable person would consider appropriate in the circumstances. This means, among other things, that even if the exception to consent had applied in this case, the collection and use of the scraped personal information would still have had to be for a reasonable purpose.
The Commissioners had found that, overall, Clearview’s scraping of vast quantities of sensitive personal information from the internet to build a massive facial recognition database was not for a purpose that a reasonable person would consider appropriate in the circumstances. Clearview AI preferred to characterize its purpose as providing a service for the benefit of law enforcement and national security. In their joint report, the Commissioners had rejected this characterization, noting that it did not justify the massive, widespread scraping of personal information by a private sector company. Further, the Commissioners had noted that such an activity could have negative consequences for individuals, including cybersecurity risks and the risk that errors could lead to reputational harm. They also observed that the activity contributed to “broad-based harm inflicted on all members of society, who find themselves under continual mass surveillance by Clearview based on its indiscriminate scraping and processing of their facial images” (at para 253). Justice Shergill found that the record supported these conclusions, and that the Commissioners’ interpretation of reasonable purposes was reasonable.
Clearview AI also argued that the Commissioner’s Order was “unnecessary, unenforceable or overbroad”, and should thus be quashed (at para 258). Justice Shergill accepted the Commissioner’s argument that the order was necessary because Clearview had only temporarily suspended its services in Canada, leaving open the possibility that it would offer its services to Canadian law enforcement agencies in the future. He also accepted the Commissioner’s argument that compliance with the order was possible, noting that Clearview had agreed to certain steps for ceasing collection and removing images in its settlement of an Illinois class action lawsuit. The order required the company to use “best efforts”, an implicit acknowledgement that a perfect solution was likely impossible. Clearview argued that a “best efforts” standard was too vague to be enforceable; Justice Shergill disagreed, noting that courts often use “best efforts” language. Further, and quite interestingly, Justice Shergill noted that “if it is indeed impossible for Clearview to sufficiently identify personal information sourced from people in British Columbia, then this is a situation of Clearview’s own making” (at para 279). He noted that “[i]t is not an answer for Clearview to say that because the data was indiscriminately collected, any order requiring it to cease collecting data of persons present in a particular jurisdiction is unenforceable” (at para 279).
Implications
This is a significant decision, as it upholds the Commissioner’s interpretations of important provisions of BC’s PIPA. These provisions are similar to ones in Alberta’s PIPA and in the federal PIPEDA. However, it is far from the end of the Clearview AI saga, and there is much to continue to watch.
In the first place, the BC Supreme Court decision is already under appeal to the BC Court of Appeal. If the Court of Appeal upholds this decision, it will be a major victory for the BC Commissioner. Yet, either way, there is likely to be a further application for leave to appeal to the Supreme Court of Canada. It may be years before the issue is finally resolved. In the interim, data protection laws in BC, Alberta and at the federal level might well be reformed. It will therefore also be important to examine any new bills to see whether the provisions at issue in this case are addressed in any way or left as is.
In the meantime, Clearview AI has also filed for judicial review of the orders of the Quebec and Alberta commissioners, and these applications are moving forward. All three orders (BC, Alberta and Quebec) are based on the same joint findings. A decision by either or both of the Quebec and Alberta superior courts that the orders are unreasonable could strike a significant blow to the united front that Canada’s commissioners are increasingly showing on privacy issues that affect all Canadians. There is therefore a great deal riding on the outcomes of these applications. In any event, regardless of the outcomes, expect applications for leave to appeal to the Supreme Court of Canada. Leave to appeal is less likely to be granted if all three provincial courts of appeal take a similar approach to the issues. At this point, it is impossible to predict how this litigation will play out.
It is notable that the Privacy Commissioner of Canada, who has no order-making powers under PIPEDA but who can apply to the Federal Court for an order, declined to do so. Under PIPEDA, such an application requires a hearing de novo by the Federal Court, which means that, unlike the courts conducting the judicial review proceedings in the provinces, the Federal Court need not show any deference to the federal Commissioner’s findings. Instead, the Court would proceed to a determination of the issues after hearing and considering the parties’ evidence and argument. One might wonder whether the rather bruising decision of the Federal Court in Privacy Commissioner v. Facebook (which was subsequently overturned by the Federal Court of Appeal) influenced the Commissioner not to roll the dice on seeking an order with so much at stake. It is sobering that a hearing de novo before the Federal Court could upset the apple cart of the Commissioners’ attempts to co-ordinate efforts, reduce duplication and harmonize interpretation. Yet it also means that if this litigation saga ends with the conclusion that the orders are reasonable and enforceable, BC, Alberta and Quebec residents will have received results in the form of orders requiring Clearview to delete images and to geo-fence any future collection of images to protect those within those provinces (orders that will still need to be made enforceable in the US), while Canadians elsewhere in the country will not. Canadians will need the long-promised but as-yet-undelivered reform of PIPEDA to address the federal Commissioner’s ability to issue orders, ones that would be subject to judicial review with appropriate deference rather than second-guessed by the Personal Information and Data Protection Tribunal proposed in Bill C-27.
Concluding thoughts
Despite rulings from privacy and data protection commissioners around the world that Clearview AI is in breach of their respective laws, and notwithstanding two class action lawsuits in the US under the Illinois Biometric Information Privacy Act, the company has continued to grow its massive FRT database. At the time of the Canadian investigation, the database was said to hold 3 billion images. Current reports place this number at over 50 billion. Given the company’s resistance to complying with Canadian law, this raises the question of what it will take to motivate compliance by resistant organizations. As the proposed amendments to Canada’s federal private sector privacy laws wither on the vine after neglect and mismanagement in their journey through Parliament, this becomes a pressing and important question.