
Thursday, 04 February 2021 10:06

How Might Bill C-11 Affect the Outcome of a Clearview AI-type Complaint?

Written by Teresa Scassa

A joint ruling from the federal Privacy Commissioner and his provincial counterparts in Quebec, B.C., and Alberta has found that U.S.-based company Clearview AI breached Canadian data protection laws when it scraped photographs from social media websites to create the database it used to support its facial recognition technology. According to the report, the database contained the biometric data of “a vast number of individuals in Canada, including children.” Investigations under public sector data protection laws into complaints about police use of Clearview AI’s services are still ongoing.

The Commissioners’ findings are unequivocal. The information collected by Clearview AI is sensitive biometric data. Express consent was required for its collection and use, and Clearview AI did not obtain consent. The company’s argument that consent was not required because the information was publicly available was firmly rejected. The Commissioners described Clearview AI’s actions as constituting “the mass identification and surveillance of individuals by a private entity in the course of commercial activity.” (at para 72) In defending itself, Clearview AI put forward arguments that were clearly at odds with Canadian law. It also resisted the jurisdiction of the Canadian Commissioners, notwithstanding the fact that it collected the personal data of Canadians and offered its commercial services to Canadian law enforcement agencies. Clearview AI did not accept the Commissioners’ findings, and “has not committed to following” the recommendations.

At the time of this report, Bill C-11, a bill to reform Canada’s current private sector data protection law, is before Parliament. The goal of this post is to consider what difference Bill C-11 might make to the outcome of complaints like this one, should the bill be passed into law. I consider both the substantive provisions of the bill and its new enforcement regime.

Consent

As under the current Personal Information Protection and Electronic Documents Act (PIPEDA), consent is a core requirement of Bill C-11. To collect, use or disclose personal information, an organization must either obtain valid consent, or its activities must fall into one of the exceptions to consent. In the Clearview AI case, there was no consent, and the disputed PIPEDA exception to the consent requirement was the one for ‘publicly available personal information’. While this exception seems broad on its face, to qualify, the information must fall within the parameters set out in the Regulations Specifying Publicly Available Personal Information. These regulations focus on certain categories of publicly available information – such as registry information (land titles, for example), court registries and decisions, published telephone directory information, and public business information listings. In most cases, the regulations provide that the use of the information must also relate directly to the purposes for which it was made public. The regulations also contain an exception for “personal information that appears in a publication, including a magazine, book or newspaper, in printed or electronic form, that is available to the public, where the individual has provided the information.”

The interpretation of this provision was central to Clearview AI’s defense of its practices. It argued that social media postings were “personal information that appears in a publication.” The Commissioners adopted a narrow interpretation, consistent with this being an exception in quasi-constitutional legislation. They distinguished between the types of publications mentioned in the exception and uncurated, dynamic social media sites. The Commissioners noted that, unlike with newspapers or magazines, individuals retain a degree of control over the content of their social media sites. They also observed that to find that all information on the internet falls within the publicly available information exception “would create an extremely broad exemption that undermines the control users may otherwise maintain over their information at the source.” (at para 65) Finally, the Commissioners observed that the exception applied only to information provided by the data subject, but that photographs were scraped by Clearview AI regardless of whether they were posted by the data subject or by someone else.

Would the result be any different under Bill C-11? In section 51, Bill C-11 replicates the “publicly available information exception” for collection, use or disclosure of personal information. Like PIPEDA, it also leaves the definition of this term to regulations. However, Canadians should be aware that there has been considerable pressure to expand the regulations so that personal information shared on social media sites is exempted from the consent requirement. For example, in past hearings into PIPEDA reform, the House of Commons ETHI Committee at one point appeared swayed by industry arguments that PIPEDA should be amended to include websites and social media within this exception. Bill C-11 does not resolve this issue; but if passed, it might well be on the table in the drafting of regulations. If nothing else, the Clearview AI case provides a stark illustration of just how important this issue is to the privacy of Canadians.

However, data scrapers may be able to look elsewhere in Bill C-11 for an exception to consent. Bill C-11 contains new exceptions to consent for “business operations”, which I have criticized here. One of these exceptions would almost certainly be relied upon by a company in Clearview AI’s position if the bill were passed. The exceptions allow for the collection and use of personal information without an individual’s knowledge or consent if, among other things, it is for “an activity in the course of which obtaining the individual’s consent would be impracticable because the organization does not have a direct relationship with the individual” (s. 18(2)(e)). A company that scrapes data from social media sites to create a facial recognition database would find it impracticable to get consent because it has no direct relationship with any of the affected individuals. The exception seems to fit.

That said, s. 18(1) does set some general guardrails. The one that seems relevant in this case is that the exceptions to consent are only available where “a reasonable person would expect such a collection or use for that activity”. Hopefully, collection of images from social media websites to fuel facial recognition technology would not be something that a reasonable person would expect; certainly, the Commissioners would not find it to be so. In addition, section 12 of Bill C-11 requires that information be collected or used “only for purposes that a reasonable person would consider appropriate in the circumstances” (a requirement carried over from PIPEDA, s. 5(3)). In their findings, the Commissioners ruled that the collection and use of images by Clearview AI was for a purpose that a reasonable person would find inappropriate. The same conclusion could be reached under Bill C-11.

There is reason to be cautiously optimistic, then, that Bill C-11 would lead to the same result on a similar set of facts: the conclusion that the wholesale scraping of personal data from social media sites to build a facial recognition database without consent is not permitted. However, the scope of the exception in s. 18(2)(e) is still a matter of concern. The more exceptions that an organization pushing the boundaries feels it can wriggle into, the more likely it is to engage in privacy-compromising activities. In addition, there may be a range of different uses for scraped data, and “what a reasonable person would expect” is a rather squishy buffer between privacy and wholesale data exploitation.

Enforcement

Bill C-11 is meant to substantially increase enforcement options when it comes to privacy. Strong enforcement is particularly important in cases where organizations are not interested in accepting the guidance of regulators. This is certainly the case with Clearview AI, which expressly rejected the Commissioners’ findings. Would Bill C-11 strengthen the regulator’s hand?

The Report of Findings in this case reflects the growing trend of having the federal and provincial commissioners that oversee private sector data protection laws jointly investigate complaints involving issues that affect individuals across Canada. This cooperation is important as it ensures consistent interpretation of what is meant to be substantially similar legislation across jurisdictions. Nothing in Bill C-11 would prevent the federal Commissioner from continuing to engage in this cross-jurisdictional collaboration – in fact, subsection 116(2) expressly encourages it.

Some will point to the Commissioner’s new order-making powers as another way to strengthen his enforcement hand. Under Bill C-11, the Commissioner could order an organization to take measures to comply with the legislation or to cease activities that contravene it (s. 92(2)). This is a good thing. However, these orders are subject to appeal to the new Personal Information Protection and Data Tribunal (the Tribunal). By contrast, the orders of the Commissioners of BC and Alberta are final, subject only to judicial review.

In addition, it is not just the orders of the Commissioner that are appealable under Bill C-11, but also his findings. This raises questions about how the new structure under Bill C-11 might affect cooperative inquiries like the one in this case. Conclusions shared with other Commissioners can be appealed by respondents to the Tribunal, which owes no deference to the Commissioner on questions of law. As I and others have already noted, the composition of the Tribunal is somewhat concerning; Bill C-11 would require only that a minimum of one member of the Tribunal have expertise in privacy law. While it is true that proceedings before the Federal Court under PIPEDA are de novo, and thus the Commissioner is afforded no formal deference in that context either, access to the Federal Court is more limited than the wide-open appeal route to the Tribunal would be. The Bill C-11 structure really seems to shift the authority to interpret and apply the law away from the Commissioner and toward the mysterious and not necessarily expert Tribunal.

Bill C-11 also has a much-touted new power to issue substantial fines for breach of the legislation. Interestingly, however, this does not seem to be the kind of case in which a fine would be available. Fines, provided for under s. 93(1) of Bill C-11, are available only with respect to breaches of certain obligations under the statute, which are listed in that provision. Playing fast and loose with the requirement to obtain consent is not one of them. This is interesting, given the supposedly central place consent plays within the Bill. Further thought might need to be given to the list of ‘fine-able contraventions’.

Overall, then, although C-11 could lead to a very similar result on similar facts, the path to that result may be less certain. It is also not clear that there is anything in the enforcement provisions of the legislation that will add heft to the Commissioner’s findings. In practical terms, the decisions that matter will be those of the Tribunal, and it remains to be seen how well this Tribunal will serve Canadians.
