Displaying items by tag: pipa
Tuesday, 27 May 2025 05:18
New Clearview AI Decision Has Implications for OpenAI Investigation

The Alberta Court of King's Bench has issued a decision in Clearview AI's application for judicial review of an Order made by the province's privacy commissioner. The Commissioner had ordered Clearview AI to take certain steps following a finding that the company had breached Alberta's Personal Information Protection Act (PIPA) when it scraped billions of images – including those of Albertans – from the internet to create a massive facial recognition database marketed to police services around the world. The court's decision is a partial victory for the commissioner. It is interesting and important for several reasons – including for its relevance to generative AI systems and the ongoing joint privacy investigation into OpenAI. These issues are outlined below.

Brief Background

Clearview AI became notorious in 2020 following a New York Times article which broke the story on the company's activities. Data protection commissioners in Europe and elsewhere launched investigations, which overwhelmingly concluded that the company had violated applicable data protection laws. In Canada, the federal privacy commissioner joined forces with the Quebec, Alberta and British Columbia (BC) commissioners, each of which has private sector jurisdiction. Their joint investigation report concluded that their respective laws applied to Clearview AI's activities because there was a real and substantial connection to their jurisdictions. They found that Clearview collected, used and disclosed personal information without consent, and that no exceptions to consent applied. The key exception advanced by Clearview AI was the exception for "publicly available information". The Commissioners found that the scope of this exception, which was similarly worded in the federal, Alberta and BC laws, required a narrow interpretation, and that the definition in the regulations enacted under each of these laws did not include information published on the internet. The commissioners also found that, contrary to shared legislative requirements, Clearview AI's collection and use of the personal information was not for a purpose that a reasonable person would consider appropriate in the circumstances. The report of findings made a number of recommendations that Clearview ultimately did not accept.

The Quebec, BC and Alberta commissioners all have order-making powers (which the federal commissioner does not). Each of these commissioners ordered Clearview to correct its practices, and Clearview sought judicial review of each of these orders. The decision of the BC Supreme Court (which upheld the Commissioner's order) is discussed in an earlier post. The decision from Quebec has yet to be issued. In Alberta, Clearview AI challenged the commissioner's jurisdiction on the basis that Alberta's PIPA did not apply to its activities. It also argued that the Commissioner's interpretation of "publicly available information" was unreasonable. In the alternative, Clearview AI argued that "publicly available information", as interpreted by the Commissioner, was an unconstitutional violation of its freedom of expression. It also contested the Commissioner's finding that Clearview did not have a reasonable purpose for collecting, using and disclosing the personal information.

The Jurisdictional Question

Courts have established that Canadian data protection laws will apply where there is a real and substantial connection to the relevant jurisdiction.
Clearview AI argued that it was a US-based company that scraped most of its data from social media websites mainly hosted outside of Canada, and that therefore its activities took place outside of Canada and its provinces. Yet, as Justice Feasby noted, "[s]trict adherence to the traditional territorial conception of jurisdiction would make protecting privacy interests impossible when information may be located everywhere and nowhere at once" (at para 50). He noted that there was no evidence regarding the actual location of the servers of social media platforms, and that Clearview AI's scraping activities went beyond social media platforms. Justice Feasby ruled that he was entitled to infer from available evidence that images of Albertans were collected from servers located in Canada and in Alberta. He observed that in any event, Clearview marketed its services to police in Alberta, and its voluntary decision to cease offering those services did not alter the fact that it had been doing business in Alberta and could do so again. Further, the information at issue in the order was personal information of Albertans. All of this gave rise to a real and substantial connection with Alberta.

Publicly Available Information

The federal Personal Information Protection and Electronic Documents Act (PIPEDA) contains an exception to the consent requirement for "publicly available information". The meaning of this term is defined in the Regulations Specifying Publicly Available Information. The relevant category is found in s. 1(e), which specifies "personal information that appears in a publication, including a magazine, book or newspaper, in printed or electronic form, that is available to the public, where the individual has provided the information." Alberta's PIPA contains a similar exception (as does BC's law), although the wording is slightly different. Section 7(e) of the Alberta regulations creates an exception to consent where:

(e) the personal information is contained in a publication, including, but not limited to, a magazine, book or newspaper, whether in printed or electronic form, but only if
(i) the publication is available to the public, and
(ii) it is reasonable to assume that the individual that the information is about provided that information; [My emphasis]
In their joint report of findings, the Commissioners found that their respective "publicly available information" exceptions did not include social media platforms. Clearview AI made much of the wording of Alberta's exception, arguing that even if it could be said that the PIPEDA language excluded social media platforms, the use of the words "including but not limited to" in the Alberta regulation made it clear that the list was not closed, nor was it limited to the types of publications referenced. In interpreting the exceptions for publicly available information, the Commissioners emphasized the quasi-constitutional nature of privacy legislation. They found that privacy rights should receive a broad and expansive interpretation and that exceptions to those rights should be interpreted narrowly. The commissioners also found significant differences between social media platforms and the more conventional types of publications referenced in their respective regulations, making it inappropriate to broaden the exception. Justice Feasby, applying reasonableness as the appropriate standard of review, found that the Alberta Commissioner's interpretation of the exception was reasonable.

Freedom of Expression

Had the court's decision ended there, the outcome would have been much the same as the result in the BC Supreme Court. However, in this case, Clearview AI also challenged the constitutionality of the regulations. It sought a declaration that if the exception were interpreted as limited to books, magazines and comparable publications, then this violated its freedom of expression under s. 2(b) of the Canadian Charter of Rights and Freedoms. Clearview AI argued that its commercial purpose of scraping the internet to provide information services to its clients was expressive and therefore protected speech. Justice Feasby noted that Clearview's collection of internet-based information was bot-driven rather than carried out by humans. Nevertheless, he found that "scraping the internet with a bot to gather images and information may be protected by s. 2(b) when it is part of a process that leads to the conveyance of meaning" (at para 104). Interestingly, Justice Feasby noted that since Clearview no longer offered its services in Canada, any expressive activities took place outside of Canada and thus were arguably not protected by the Charter. However, he acknowledged that the services had at one point been offered in Canada and could be again. He observed that "until Clearview removes itself permanently from Alberta, I must find that its expression in Alberta is restricted by PIPA and the PIPA Regulation" (at para 106). Having found a prima facie breach of s. 2(b), Justice Feasby considered whether this was a reasonable limit demonstrably justified in a free and democratic society under s. 1 of the Charter. The Commissioner argued that the expression at issue in this case was commercial in nature and thus of lesser value. Justice Feasby was not persuaded by category-based assumptions of value; rather, he preferred an approach in which the regulation of commercial expression is consistent with and proportionate to its character. Justice Feasby found that the Commissioner's reasonable interpretation of the exception in s. 7 of the regulations meant that it would exclude social media platforms or "other kinds of internet websites where images and personal information may be found" (at para 118).
He noted that this is a source-based exception – in other words, some publicly available information may be used without knowledge or consent, but not other similar information. The exclusion depends on the source of the personal information and not on the purpose for which it is used. Justice Feasby expressed concern that the same exception that would exclude the scraping of images from the internet for the creation of a facial recognition database would also apply to the activities of search engines widely used by individuals to gain access to information on the internet. He thus found that the publicly available information exception was overbroad, stating: "Without a reasonable exception to the consent requirement for personal information made publicly available on the internet without use of privacy settings, internet search service providers are subject to a mandatory consent requirement when they collect, use and disclose such personal information by indexing and delivering search results" (at para 138). He stated: "I take judicial notice of the fact that search engines like Google are an important (and perhaps the most important) way individuals access information on the internet" (at para 144). Justice Feasby also noted that while it was important to give individuals some level of control over their personal information, "it must also be recognized that some individuals make conscious choices to make their images and information discoverable by search engines and that they have the tools in the form of privacy settings to prevent the collection, use, and disclosure of their personal information" (at para 143). His constitutional remedy – striking the words "including, but not limited to magazines, books, and newspapers" from the regulation – was designed to allow "the word 'publication' to take its ordinary meaning which I characterize as 'something that has been intentionally made public'" (at para 149).

The Belt and Suspenders Approach

Although excising part of the publicly available information definition seems like a major victory for Clearview AI, in practical terms it is not. This is because of what the court refers to as the law's "belt and suspenders approach". This metaphor suggests that there are two ways to hold up privacy's pants – and loosening the belt does not remove the suspenders. In this case, the suspenders are the clause, found in PIPA as well as in its federal and BC counterparts, that limits the collection, use and disclosure of personal information to only that which "a reasonable person would consider appropriate in the circumstances". The court ruled that the Commissioner's conclusion that the scraping of personal information was not for purposes that a reasonable person would consider appropriate in the circumstances was reasonable and should not be overturned. This analysis, set out in the joint report of findings, emphasized that the company's mass data scraping involved over 3 billion images of individuals, including children. It was used to create biometric face prints that would remain in Clearview's databases even if the source images were removed from the internet, and it was carried out for commercial purposes. The commissioners also found that the purposes were not related to the reasons why individuals might have shared their photographs online, could be used to the detriment of those individuals, and created a risk of significant harm.
Continuing with his analogy to search engines, Justice Feasby noted that Clearview AI's use of publicly available images was very different from the use of the same images by search engines. The different purposes are essential to the reasonableness determination. Justice Feasby stated: "The 'purposes that are reasonable' analysis is individualized such that a finding that Clearview's use of personal information is not for reasonable purposes does not apply to other organizations and does not threaten the operations of the internet" (at para 159). He noted that the commercial dimensions of the use are not determinative of reasonableness. However, he observed that "where images and information are posted to social media for the purpose of sharing with family and friends (or prospective friends), the commercialization of such images and information by another party may be a relevant consideration in determining whether the use is reasonable" (at para 160). The result is that Clearview AI's scraping of images from the public internet violates Alberta's PIPA. The court further ruled that the Commissioner's order was clear and specific, and capable of being implemented. Justice Feasby required Clearview AI to report within 50 days on its good faith progress in taking steps to cease the collection, use and disclosure of images and biometric data collected from individuals in Alberta, and to delete images and biometric data in its database that are from individuals in Alberta.

Harmonized Approaches to Data Protection Law in Canada

This decision highlights some of the challenges facing the growing collaboration and cooperation among privacy commissioners in Canada when it comes to interpreting key terms and concepts in substantially similar legislation. Increasingly, the commissioners engage in joint investigations where complaints involve organizations operating in multiple Canadian jurisdictions. While this occurs primarily in the private sector context, it is not exclusively the case, as a recent joint investigation by the BC and Ontario commissioners into a health data breach demonstrates. Joint investigations conserve regulator resources and save private sector organizations from having to respond to multiple similar and concurrent investigations. In addition, joint investigations can lead to harmonized approaches to and interpretations of shared concepts in similar legislation. This is a good thing for creating certainty and consistency for those who do business across Canadian jurisdictions. However, harmonized approaches are vulnerable to multiple judicial review applications, as was the case following the Clearview AI investigation. Although the BC Supreme Court found that the BC Commissioner's order was reasonable, the Alberta Court of King's Bench decision demonstrates that a common front can be fractured. Justice Feasby found that a slight difference in wording between Alberta's regulations and those in BC and at the federal level was sufficient to justify finding the scope of Alberta's publicly available information exception to be unconstitutional. Harmonized approaches may also be vulnerable to unilateral legislative change. In this respect, it is worth noting that an Alberta report on the impending reform of PIPA recommends "that the Government take all necessary steps, including through proposing amendments to the Personal Information Protection Act, to improve alignment of all provincial privacy legislation, including in the private, public and health sectors" (at p. 13).
The Elephant in the Room: Generative AI and Data Protection Law in Canada

In his reasons, Justice Feasby made Google's search functions a running comparison for Clearview AI's data scraping practices. Perhaps a better example would have been the data scraping that takes place in order to train generative AI models. However, the court may have avoided that example because there is an ongoing investigation by the Alberta, Quebec, BC and federal commissioners into OpenAI's practices. The findings in that investigation are overdue – perhaps the delay has, at least in part, been caused by anticipation of what might happen with the Alberta Clearview AI judicial review. The Alberta decision will likely present a conundrum for the commissioners. Reading between the lines of Justice Feasby's decision, it is entirely possible that he would find that the scraping of the public internet to gather training data for generative AI systems would both fall within the exception for publicly available information and be for a purpose that a reasonable person would consider appropriate in the circumstances. Generative AI tools are now widely used – perhaps even more widely than search engines, since these tools are now also embedded in search engines themselves. To find that personal information indiscriminately available on the internet cannot be collected and used in this way because consent is required would be fundamentally impractical. In the EU, the legitimate interest exception in the GDPR provides latitude for such use without consent, and recent guidance from the European Data Protection Supervisor suggests that legitimate interests, combined where appropriate with Data Protection Impact Assessments, may address key data protection issues. In this sense, the approach taken by Justice Feasby seems to carve a path for data protection in a GenAI era in Canada by allowing data scraping of publicly available sources on the Internet in principle, subject to the limit that any such collection, or any ensuing use or disclosure of the personal information, must be for purposes that a reasonable person would consider appropriate in the circumstances. However, this is not a perfect solution. In the first place, unlike the EU approach, which ensures that other privacy protective measures (such as privacy impact assessments) govern this kind of mass collection, Canadian law remains outdated and inadequate. Further, the publicly available information exceptions – including Alberta's, even after its constitutional nip and tuck – also emphasize that, to use the language of Alberta's PIPA, it must be "reasonable to assume that the individual that the information is about provided the information". In fact, there will be many circumstances in which individuals have not provided the information posted online about them. This is the case with photos from parties, family events and other social interactions. Further, social media – and the internet as a whole – is full of non-consensual images, gossip, anecdotes and accusations. The solution crafted by the Alberta Court of King's Bench is therefore only a partial one. A legitimate interest exception would likely serve much better in these circumstances, particularly if it is combined with broader governance obligations to ensure that privacy is adequately considered and assessed. Of course, before this happens, the federal government's privacy reform measures in Bill C-27 must be resuscitated in some form or another.
Published in Privacy
Monday, 17 March 2025 06:16
Clearview AI and Compliance with Canadian Privacy Laws

The Clearview AI saga has a new Canadian instalment. In December 2024, the British Columbia Supreme Court rendered a decision on Clearview AI's application for judicial review of an order issued by the BC Privacy Commissioner. This post explores that decision and some of its implications. The first part sets the context, the second discusses the judicial review decision, and the third looks at the ramifications of the larger (and ongoing) legal battle for Canadian privacy law.

Context

Late in 2021, the Privacy Commissioners of BC, Alberta, Quebec and Canada issued a joint report on their investigation into Clearview AI (My post on this order is here). Clearview AI, a US-based company, had created a massive facial recognition technology (FRT) database from images scraped from the internet, which it marketed to law enforcement agencies around the world. The investigation was launched after a story broke in the New York Times about Clearview's activities. Although Canadian police services initially denied using Clearview AI, the RCMP later admitted that it had purchased two licences. Other Canadian police services made use of promotional free accounts. The joint investigation found that Clearview AI had breached the private sector data protection laws of the four investigating jurisdictions by collecting and using sensitive personal information without consent and by doing so for purposes that a reasonable person would not consider appropriate in the circumstances. The practices also violated Quebec's Act to establish a legal framework for information technology. Clearview AI disagreed with these conclusions. It indicated that it would temporarily cease its operations in Canada but maintained that it was entitled to scrape content from the public web. After the company failed to respond to the recommendations in the joint report, the Commissioners of Quebec, BC and Alberta issued orders against it. These orders required Clearview AI to cease offering its services in their jurisdictions, to make best efforts to stop collecting the personal information of those within their respective provincial boundaries, and to delete personal information in its databases that had been improperly collected from those within their boundaries. No order was issued by the federal Commissioner, who does not have order-making powers under the Personal Information Protection and Electronic Documents Act (PIPEDA). He could have applied to the Federal Court for an order but chose not to do so (more on that in Part 3 of this post). Clearview AI declined to comply with the provincial orders, other than to note that it had already temporarily ceased operations in Canada. It then applied for judicial review of the orders in each of the three provinces. To date, only the challenge to the BC Order has been heard and decided. In the BC application, Clearview argued that the Commissioner's decision was unreasonable. Specifically, it argued that BC's Personal Information Protection Act (PIPA) did not apply to Clearview AI, that the information it scraped was exempt from consent requirements because it was "publicly available information", and that the Commissioner's interpretation of purposes that a reasonable person would consider appropriate in the circumstances was unreasonable and failed to consider Charter values. In his December 2024 decision, Justice Shergill of the BC Supreme Court disagreed, upholding the Commissioner's order.
The BC Supreme Court Decision on Judicial Review

Justice Shergill confirmed that BC's PIPA applies to Clearview AI's activities, notwithstanding the fact that Clearview AI is a US-based company. He noted that applying the 'real and substantial connection' test – which considers the nature and extent of connections between a party's activities and the jurisdiction in which proceedings are initiated – leads to that conclusion. There was evidence that Clearview AI's database had been marketed to and used by police services in BC, as well as by the RCMP, which polices many parts of the province. Further, Justice Shergill noted that Clearview's data scraping practices were carried out worldwide and captured data about BC individuals including, in all likelihood, data from websites hosted in BC. Interestingly, he also found that Clearview's scraping of images from social media sites such as Facebook, YouTube and Instagram also created a sufficient connection, as these sites "undoubtedly have hundreds of thousands if not millions of users in British Columbia" (at para 91). In reaching his conclusion, Justice Shergill emphasized "the important role that privacy plays in the preservation of our societal values, the 'quasi-constitutional' status afforded to privacy legislation, and the increasing significance of privacy laws as technology advances" (at para 95). He also found that there was nothing unfair about applying BC's PIPA to Clearview AI, as the company "chose to enter British Columbia and market its product to local law enforcement agencies. It also chooses to scrape data from the Internet which involves personal information of people in British Columbia" (at para 107).

Sections 12(1)(e), 15(1)(e) and 18(1)(e) of PIPA provide exceptions to the requirement of knowledge and consent for the collection, use and disclosure of personal information where "the personal information is available to the public" as set out in regulations. The PIPA Regulations include "printed or electronic publications, including a magazine, book, or newspaper in printed or electronic form." Similar exceptions are found in the federal PIPEDA and in Alberta's Personal Information Protection Act. Clearview AI had argued that public internet websites, including social media sites, fell within the category of electronic publications and that their scraping was thus exempt from consent requirements. The commissioners disagreed, and Clearview AI challenged this interpretation as unreasonable. Justice Shergill found that the Commissioners' conclusion that social media websites fell outside the exception for publicly available information was reasonable. The BC Commissioner was entitled to read the list in the PIPA Regulations as a "narrow set of sources" (at para 160). Justice Shergill reviewed the reasoning in the joint report for why social media sites should be treated differently from the other types of publications mentioned in the exception. These reasons include the fact that social media sites are dynamic rather than static, and that individuals exercise a different level of control over their personal information on social media platforms than on news or other such sites. Although the legislation may require a balancing of privacy rights with private sector interests, Justice Shergill found that it was reasonable for the Commissioner to conclude that privacy rights should be given precedence over commercial interests in the overall context of the legislation.
Referencing the Supreme Court of Canada's decision in Lavigne, Justice Shergill noted that "it is the protection of individual privacy that supports the quasi-constitutional status of privacy legislation, not the right of the organization to collect and use personal information" (at para 174). An individual's ability to control what happens to their personal information is fundamental to the autonomy and dignity protected by privacy rights, and "it is thus reasonable to conclude that any exception to these important rights should be interpreted narrowly" (at para 175). Clearview AI argued that posting photos to social media sites reflected an individual's autonomous choice to surrender the information to the public domain. Justice Shergill preferred the Commissioner's interpretation, which considered the sensitivity of the biometric information and the impact its collection and use could have on individuals. He referenced the Supreme Court of Canada's decision in R. v. Bykovets (my post on this case is here), which emphasized that individuals "may choose to divulge certain information for a limited purpose, or to a limited class of persons, and nonetheless retain a reasonable expectation of privacy" (at para 162, citing para 46 of Bykovets). Clearview AI also argued that the Commissioner was unreasonable in not taking into account Charter values in his interpretation of PIPA. In particular, the company was of the view that the freedom of expression, which guarantees the right both to communicate and to receive information, extended to the ability to access and use publicly available information without restriction. Although Justice Shergill found that the Commissioner could have been more direct in his consideration of Charter values, his decision was still not unreasonable on this point. The Commissioner did not engage with the Charter values issues at length because he did not consider the law to be ambiguous – Charter values-based interpretation comes into play in helping to resolve ambiguities in the law. As Justice Shergill noted, "It is difficult to understand how Clearview's s. 2(b) Charter rights are infringed through an interpretation of 'publicly available' which excludes it from collecting personal information from social media websites without consent" (at para 197).

Like its counterpart legislation in Alberta and at the federal level, BC's PIPA contains a section that articulates the overarching principle that any collection, use or disclosure of personal information must be for purposes that a reasonable person would consider appropriate in the circumstances. This means, among other things, that even if the exception to consent had applied in this case, the collection and use of the scraped personal information would still have had to be for a reasonable purpose. The Commissioners had found that, overall, Clearview's scraping of vast quantities of sensitive personal information from the internet to build a massive facial recognition database was not for a purpose that a reasonable person would find appropriate in the circumstances. Clearview AI preferred to characterize its purpose as providing a service to the benefit of law enforcement and national security. In their joint report, the Commissioners had rejected this characterization, noting that it did not justify the massive, widespread scraping of personal information by a private sector company.
Further, the Commissioners had noted that such an activity could have negative consequences for individuals, including cybersecurity risks and the risk that errors could lead to reputational harm. They also observed that the activity contributed to "broad-based harm inflicted on all members of society, who find themselves under continual mass surveillance by Clearview based on its indiscriminate scraping and processing of their facial images" (at para 253). Justice Shergill found that the record supported these conclusions, and that the Commissioners' interpretation of reasonable purposes was reasonable. Clearview AI also argued that the Commissioner's Order was "unnecessary, unenforceable or overbroad", and should thus be quashed (at para 258). Justice Shergill accepted the Commissioner's argument that the order was necessary because Clearview had only temporarily suspended its services in Canada, leaving open the possibility that it would offer its services to Canadian law enforcement agencies in the future. He also accepted the Commissioner's argument that compliance with the order was possible, noting that Clearview had accepted certain steps for ceasing collection and removing images in its settlement of an Illinois class action lawsuit. The order required the company to use "best efforts", in an implicit acknowledgement that a perfect solution was likely impossible. Clearview argued that a "best efforts" standard was too vague to be enforceable; Justice Shergill disagreed, noting that courts often use "best efforts" language. Further, and quite interestingly, Justice Shergill noted that "if it is indeed impossible for Clearview to sufficiently identify personal information sourced from people in British Columbia, then this is a situation of Clearview's own making" (at para 279). He noted that "[i]t is not an answer for Clearview to say that because the data was indiscriminately collected, any order requiring it to cease collecting data of persons present in a particular jurisdiction is unenforceable" (at para 279).

Implications

This is a significant decision, as it upholds interpretations of important provisions of BC's PIPA. These provisions are similar to ones in Alberta's PIPA and in the federal PIPEDA. However, it is far from the end of the Clearview AI saga, and there is much to continue to watch. In the first place, the BC Supreme Court decision is already under appeal to the BC Court of Appeal. If the Court of Appeal upholds this decision, it will be a major victory for the BC Commissioner. Yet, either way, there is likely to be a further application for leave to appeal to the Supreme Court of Canada. It may be years before the issue is finally resolved. In that time, data protection laws in BC, Alberta and at the federal level might well be reformed. It will therefore also be important to examine any new bills to see whether the provisions at issue in this case are addressed in any way or left as is. In the meantime, Clearview AI has also filed for judicial review of the orders of the Quebec and Alberta commissioners, and these applications are moving forward. All three orders (BC, Alberta and Quebec) are based on the same joint findings. A decision by either or both of the Quebec and Alberta superior courts that the orders are unreasonable could strike a significant blow to the united front that Canada's commissioners are increasingly showing on privacy issues that affect all Canadians. There is therefore a great deal riding on the outcomes of these applications.
In any event, regardless of the outcomes, expect applications for leave to appeal to the Supreme Court of Canada. Leave to appeal is less likely to be granted if all three provincial courts of appeal take a similar approach to the issues. It is at this point impossible to predict how this litigation will play out. It is notable that the Privacy Commissioner of Canada, who has no order-making powers under PIPEDA but who can apply to the Federal Court for an order, declined to do so. Under PIPEDA, such an application requires a hearing de novo by the Federal Court – this means that, unlike in the judicial review proceedings in the other provinces, the Federal Court need not show any deference to the federal Commissioner's findings. Instead, the Court would proceed to a determination of the issues after hearing and considering the parties' evidence and argument. One might wonder whether the rather bruising decision of the Federal Court in Privacy Commissioner v. Facebook (which was subsequently overturned by the Federal Court of Appeal) might have influenced the Commissioner not to roll the dice by seeking an order with so much at stake. That a hearing de novo before the Federal Court could upset the apple cart of the Commissioners' attempts to co-ordinate efforts, reduce duplication and harmonize interpretation is sobering. Yet it also means that if this litigation saga ends with the conclusion that the orders are reasonable and enforceable, BC, Alberta and Quebec residents will have received results in the form of orders requiring Clearview to delete images and to geo-fence any future collection of images to protect those within those provinces (orders which will still need to be made enforceable in the US) – while Canadians elsewhere in the country will not. Canadians will need long-promised but as yet undelivered reform of PIPEDA to address the ability of the federal Commissioner to issue orders – ones that would be subject to judicial review with appropriate deference, rather than second-guessed by the Personal Information and Data Protection Tribunal proposed in Bill C-27.
Concluding Thoughts

Despite rulings from privacy and data protection commissioners around the world that Clearview AI is in breach of their respective laws, and notwithstanding two class action lawsuits in the US under the Illinois Biometric Information Privacy Act, the company has continued to grow its massive FRT database. At the time of the Canadian investigation, the database was said to hold 3 billion images. Current reports place this number at over 50 billion. Considering the company's resistance to complying with Canadian law, this raises the question of what it will take to motivate compliance by resistant organizations. As the proposed amendments to Canada's federal private sector privacy laws wither on the vine after neglect and mismanagement in their journey through Parliament, this becomes a pressing and important question.
Published in Privacy