Displaying items by tag: right to be forgotten
Tuesday, 02 September 2025 06:48
Right to Be Forgotten Findings Raise Issues About Privacy Commissioner's Powers and Canadian Privacy Law Reform
Canada’s Privacy Commissioner has released a set of findings that recognize a right to be forgotten (RTBF) under the Personal Information Protection and Electronic Documents Act (PIPEDA). The complainant’s long legal journey began in 2017, when they complained that a search of their name in Google’s search engine returned news articles from many years earlier regarding an arrest and criminal charges relating to engaging in sexual activity without disclosing their HIV-positive status. Although these reports were accurate at the time they were published, the charges were stayed shortly afterwards because the complainant posed no danger to public health. Charging guidelines for the offence in question indicated that no charges should be laid where there is no realistic possibility that HIV could be transmitted. The search results contain none of that information. Instead, they publicly disclose the HIV status of the complainant and create the impression that their conduct was criminal in nature. As a result of the linking of their name to these search results, the complainant experienced – and continues to experience – negative consequences including social stigma, loss of career opportunities and even physical violence.
Google’s initial response to the complaint was to challenge the jurisdiction of the Privacy Commissioner to investigate the matter under PIPEDA, arguing that PIPEDA did not apply to its search engine functions. The Commissioner referred this issue to the Federal Court, which found that PIPEDA applied. That decision was (unsuccessfully) appealed by Google to the Federal Court of Appeal. When the matter was not appealed further to the Supreme Court of Canada, the Commissioner began his investigation, which resulted in the current findings. Google has indicated that it will not comply with the Commissioner’s recommendation to delist the articles so that they do not appear in a search using the complainant’s name. This means that an application will likely be made to the Federal Court for a binding order. The matter is therefore not yet resolved.
This post considers three issues. The first relates to the nature and scope of the RTBF in PIPEDA, as found by the Commissioner. The second relates to the Commissioner’s woeful lack of authority when it comes to the enforcement of PIPEDA. Law reform is needed to address this, yet Bill C-27, which would have given greater enforcement powers to the Commissioner, died on the order paper. The government’s intentions with respect to future reform and its timing remain unclear. The third point also addresses PIPEDA reform. I consider the somewhat fragile footing for the Commissioner’s version of the RTBF given how Bill C-27 had proposed to rework PIPEDA’s normative core.
The Right to be Forgotten (RTBF) and PIPEDA
In his findings, the Commissioner grounds the RTBF in an interpretation of s. 5(3) of PIPEDA:
5(3) An organization may collect, use or disclose personal information only for purposes that a reasonable person would consider are appropriate in the circumstances.
This is a core normative provision in PIPEDA. For example, although organizations may collect personal information with the consent of the individual, they cannot do so if the collection is for purposes that a reasonable person would not consider appropriate in the circumstances.
This provision (or at least one very similar to it in Alberta’s Personal Information Protection Act) was recently found to place important limits on the scraping of photographs from the public internet by Clearview AI to create a massive facial recognition technology (FRT) database. Essentially, even though the court found that photographs posted on the internet were publicly available and could be collected and used without consent, they could not be collected and used to create an FRT database, because this was not a purpose a reasonable person would consider appropriate in the circumstances.
The RTBF would function in much the same way when it comes to the operations of platform search engines. Those search engines – such as Google’s – collect, use and disclose information found on the public internet when they return search results to users in response to queries. When searches involve individuals, search results may direct users to personal information about that individual. That is acceptable – as long as the information is being collected, used and disclosed for purposes a reasonable person would consider appropriate in the circumstances. In the case of the RTBF, according to the Commissioner, the threshold will be crossed when the privacy harms caused by the disclosure of the personal information in the search results outweigh the public interest in having that information shared through the search function. In order to make that calculation, the Commissioner articulates a set of criteria that can be applied on a case-by-case basis. The criteria include:
a. Whether the individual is a public figure (e.g. a public office holder, a politician, a prominent business person, etc.);
b. Whether the information relates to an individual’s working or professional life as opposed to their private life;
c. Whether the information relates to an adult as opposed to a minor;
d. Whether the information relates to a criminal charge that has resulted in a conviction or where the charges were stayed due to delays in the criminal proceedings;
e. Whether the information is accurate and up to date;
f. Whether the ability to link the information with the individual is relevant and necessary to the public consideration of a matter under current controversy or debate;
g. The length of time that has elapsed since the publication of the information and the request for de-listing. (at para 109)
In this case, the facts were quite compelling, and the Commissioner had no difficulty finding that the information at issue caused great harm to the complainant while providing no real public benefit. This led to the de-listing recommendation – which would mean that a search for the complainant’s name would no longer turn up the harmful and misleading articles – although the content itself would remain on the web and could be reached using other search criteria.
The Privacy Commissioner’s ‘Powers’
Unlike his counterparts in other jurisdictions, including the UK, EU member countries, and Quebec, Canada’s Privacy Commissioner lacks suitable enforcement powers. PIPEDA was Canada’s first federal data protection law, and it was designed to gently nudge organizations into compliance. It has been effective up to a point. Many organizations do their best to comply proactively, and the vast majority of complaints are resolved prior to investigation.
Those that result in a finding of a breach of PIPEDA contain recommendations to bring the organization into compliance, and in many cases, organizations voluntarily comply with the recommendations. The legislation works – up to a point.
The problem is that the data economy has evolved dramatically since PIPEDA’s enactment. There is a great deal of money to be made from business models that extract large volumes of data that are then monetized in ways that are beyond the comprehension of individuals who have little choice but to consent to obscure practices laid out in complex privacy policies in order to receive services. Where complaint investigations result in recommendations that run up against these extractive business models, the response is increasingly to disregard the recommendations. Although there is still the option for a complainant or the Commissioner to apply to the Federal Court for an order, the statutory process set out in PIPEDA requires the Federal Court to hold a hearing de novo. In other words, notwithstanding the outcome of the investigation, the court hears both sides and draws its own conclusions. The Commissioner, despite his expertise, is owed no deference.
In the proposed Consumer Privacy Protection Act (CPPA) that was part of the now-defunct Bill C-27, the Commissioner was poised to receive some important new powers, including order-making powers and the ability to recommend the imposition of steep administrative monetary penalties. Admittedly, these new powers came with some clunky constraints that would have put the Commissioner on training wheels in the privacy peloton of his international counterparts. Still, it was a big step beyond the current process of having to ask the Federal Court to redo his work and reach its own conclusions. Bill C-27, however, died on the order paper with the last federal election. The current government is likely in the process of pep-talking itself into reintroducing a PIPEDA reform bill, but as yet there is no clear timeline for action. Until a new bill is passed, the Commissioner is going to have to make do with his current, woefully inadequate enforcement tools.
The Dangers of PIPEDA Reform
Assuming a PIPEDA reform bill will contain enforcement powers better adapted to a data-driven economy, one might be forgiven for thinking that PIPEDA reform will support the nascent RTBF in Canada (assuming that the Federal Court agrees with the Commissioner’s approach). The problem, however, is that there could be some uncomfortable surprises in PIPEDA reform. Indeed, this RTBF case offers a good illustration of how tinkering with PIPEDA may unsettle current interpretations of the law – and might do so at the expense of privacy rights. As noted above, the Commissioner grounded the RTBF on the strong and simple principle at the core of PIPEDA, expressed in s. 5(3), which I repeat here for convenience:
5(3) An organization may collect, use or disclose personal information only for purposes that a reasonable person would consider are appropriate in the circumstances.
The Federal Court of Appeal has told us that this is a normative standard – in other words, the fact that millions of otherwise reasonable people may have consented to certain terms of service does not on its own make those terms something that a reasonable person would consider appropriate in the circumstances. The terms might be unduly exploitative but leave individuals with little or no choice.
The reasonableness inquiry sets a standard for the level of privacy protection an individual should be entitled to in a given set of circumstances. Notably, Bill C-27 sought to disrupt the simplicity of s. 5(3), replacing it with the following:
12 (1) An organization may collect, use or disclose personal information only in a manner and for purposes that a reasonable person would consider appropriate in the circumstances, whether or not consent is required under this Act.
(2) The following factors must be taken into account in determining whether the manner and purposes referred to in subsection (1) are appropriate:
(a) the sensitivity of the personal information;
(b) whether the purposes represent legitimate business needs of the organization;
(c) the effectiveness of the collection, use or disclosure in meeting the organization’s legitimate business needs;
(d) whether there are less intrusive means of achieving those purposes at a comparable cost and with comparable benefits; and
(e) whether the individual’s loss of privacy is proportionate to the benefits in light of the measures, technical or otherwise, implemented by the organization to mitigate the impacts of the loss of privacy on the individual.
Although s. 12(1) is not so different from s. 5(3), the government saw fit to add a set of criteria in s. 12(2) that would shape any analysis in a way that leans the decision-maker towards accommodating the business needs of the organization over the privacy rights of the individual. Paragraphs 12(2)(b) and (c) explicitly require the decision-maker to consider the legitimate business needs of the organization and the effectiveness of the particular collection, use or disclosure in meeting those needs. In an RTBF case, this might mean thinking about how indexing the web and returning search results meets the legitimate business needs of a search engine company and does so effectively. Paragraph 12(2)(d) then asks whether there are “less intrusive means of achieving those purposes at a comparable cost and with comparable benefits”. This too focuses on the organization. Not only is this criterion heavily weighted in favour of business in terms of its substance – less intrusive means should be of comparable cost – the issues it raises are ones about which an individual challenging the practice would have great difficulty producing evidence. While the Commissioner has greater resources, these are still limited. The fifth criterion returns us to the issue of privacy, but it asks whether “the individual’s loss of privacy is proportionate to the benefits [to the organization] in light of the measures, technical or otherwise, implemented by the organization to mitigate the impacts of the loss of privacy on the individual”.
The criteria in s. 12(2) fall over themselves to nudge a decision-maker towards finding privacy-invasive practices to be “for purposes that a reasonable person would consider appropriate in the circumstances” – not because a reasonable person would find them appropriate in light of the human right to privacy, but because an organization has a commercial need for the data and has fiddled about a bit to attempt to mitigate the worst of the impacts. Privacy essentially becomes what the business model will allow – the reasonable person is now an accountant.
It is also worth noting that by the time a reform bill is reintroduced (and, if we dare to imagine it, actually passed), the Federal Court may have weighed in on the RTBF under PIPEDA, putting us another step closer to clarifying whether there is a RTBF in Canada’s private sector privacy law. Assuming that the Federal Court largely agrees with the Commissioner and his approach, if something like s. 12 of the CPPA becomes part of a new law, the criteria developed by the Commissioner for the reasonableness assessment in RTBF cases will be supplanted by the rather ugly list in s. 12(2). Not only will this cast doubt on the continuing existence of a RTBF, it may well doom it.
And this is not the only established interpretation or approach that would be unsettled by such a change. The Commissioner’s findings in the RTBF investigation demonstrate the flexibility and simplicity of s. 5(3). When a PIPEDA reform bill returns to Parliament, let us hope that s. 12(2) is no longer part of it.
Published in
Privacy
Friday, 09 July 2021 14:00
A First Step Along the Path to a Right to Be Forgotten in Canada?
The Federal Court has issued its decision in a reference case brought by the Privacy Commissioner of Canada regarding the interpretation of his jurisdiction under the Personal Information Protection and Electronic Documents Act (PIPEDA). The reference relates to a complaint against Google about its search engine, and implicates the so-called ‘right to be forgotten’. Essentially, the complainant in that case seeks an order requiring Google to de-index certain web pages that show up in searches for his name and that contain outdated and inaccurate sensitive information.
Google’s response to the complaint was to challenge the jurisdiction of the Commissioner to investigate. It argued that its search engine functions were not a ‘commercial activity’ within the meaning of PIPEDA and that PIPEDA therefore did not apply. It also argued that its search engine served a journalistic or literary function, which is excluded from the application of PIPEDA under s. 4(2)(c). The Canadian Broadcasting Corporation (CBC) and the Samuelson-Glushko Canadian Internet Policy and Public Interest Clinic (CIPPIC) both intervened.
Associate Chief Justice Gagné ruled that the Commissioner has jurisdiction to deal with the complaint. In this sense, the ruling simply enables the Commissioner to continue with his investigation of the complaint and to issue his Report of Findings – something that could no doubt generate fresh fodder for the courts, since a finding that Google should de-index certain search results would raise interesting freedom of expression issues. Justice Gagné’s decision, however, focuses on whether the Commissioner has jurisdiction to proceed. Her ruling addresses 1) the commercial character of Google’s search engine activity; 2) whether Google’s activities are journalistic in nature; and 3) the relevance of the quasi-constitutional status of PIPEDA. I will consider each of these in turn.
1) The Commercial Character of Google’s Search Engine
Largely for division of powers reasons, PIPEDA applies only to the collection, use or disclosure of personal information in the course of “commercial activity”. Thus, if an organization can demonstrate that it was not engaged in commercial activity, it can escape the application of the law. Justice Gagné found that Google collected, used and disclosed information in offering its search engine functions. The issue, therefore, was whether it engaged in these practices “in the course of commercial activity”.
Justice Gagné noted that Google is one of the most profitable companies in existence and that most of its profits came from advertising revenues. Although Google receives revenues when a user clicks on an ad that appears in search results, Google argued that not all search results generate ads – this depends on whether other companies have paid to have the particular search terms trigger their ads. In the case of a search for an ordinary user’s name, it is highly unlikely that the search will trigger ads in the results. However, Justice Gagné noted that advertisers can also target ads to individual users of Google’s search engine based on data that Google has collected about that individual from their online activities.
According to Justice Gagné, “even if Google provides free services to the content providers and the user of the search engine, it has a flagrant commercial interest in connecting these two players.” (at para 57) She found that search engine users trade their personal data in exchange for the search results that are displayed when they conduct a search. Their data is, in turn, used in Google’s profit-generating activities. She refused to ‘dissect’ Google’s activities into those that are free to users and those that are commercial, stating that the “activities are intertwined, they depend on one another, and they are all necessary components of that business model.” (at para 59) She also noted that “unless it is forced to do so, Google has no commercial interest in de-indexing or de-listing information from its search engine.” (at para 59)
2) Is Google’s Search Engine Function Journalistic in Nature?
PIPEDA does not apply to activities that are exclusively for journalistic purposes. This is no doubt to ensure that PIPEDA does not unduly interfere with the freedom of the press. Google argued that its search engine allowed users to find relevant information, and that in providing these services it was engaged in journalistic purposes. Justice Gagné observed that, depending upon the person, a search by name can reveal a broad range of information from multiple and diverse sources. In this way, Google facilitates access to information, but, in her view, it does not perform a journalistic function. She noted: “Google has no control over the content of search results, the search results themselves express no opinion, and Google does not create the content of the search results.” (at para 82)
She adopted the test set out in the earlier decision in A.T. v. Globe24h.com, whereby an activity qualifies as journalism if “its purpose is to (1) inform the community on issues the community values, (2) it involves an element of original production, and (3) it involves a ‘self-conscious discipline calculated to provide an accurate and fair description of facts, opinion and debate at play within a situation’.” (at para 83) Applying the test to Google’s activities, she noted that Google did more than just inform a community about matters of interest, and that it did not create or produce content. She observed as well that “there is no effort on the part of Google to determine the fairness or the accuracy of the search results.” (at para 85) She concluded that the search engine functions were not journalistic activity – or that if they were, they were not exclusively so. As a result, the journalistic purposes exception did not exempt Google from the application of PIPEDA.
3) The Relevance of the Quasi-Constitutional Status of PIPEDA
The Supreme Court of Canada has ruled that both public and private sector data protection laws in Canada have quasi-constitutional status. What this means in practical terms is less clear. Certainly it means that they are recognized as laws that protect rights and/or values that are of fundamental importance to a society. For example, in Lavigne, the Supreme Court of Canada stated that the federal Privacy Act served as “a reminder of the extent to which the protection of privacy is necessary to the preservation of a free and democratic society” (at para 25). In United Food and Commercial Workers, the Supreme Court of Canada found that Alberta’s private sector data protection law also had quasi-constitutional status and stated: “The ability of individuals to control their personal information is intimately connected to their individual autonomy, dignity and privacy. These are fundamental values that lie at the heart of a democracy.” (at para 19)
What this means in practical terms is increasingly important as questions are raised about the approach to take in upcoming reforms of private sector data protection laws. For example, the Privacy Commissioner of Canada has criticized Bill C-11 (a bill to reform PIPEDA) for not adopting a human rights-based approach to privacy – one that is explicitly grounded in human rights values. By contrast, Ontario, in its White Paper proposing a possible private sector data protection law for that province, indicates that it will adopt a human rights-based approach. One issue at the federal level might be the extent to which the quasi-constitutional nature of a federal data protection law does the work of a human rights-based approach when it comes to shaping interpretation of the statute. The decision in this reference case suggests that the answer is ‘no’. In fact, the Attorney General of Canada specifically intervened on this point, arguing that “[t]he quasi-constitutional nature of PIPEDA does not transform or alter the proper approach to statutory interpretation” (at para 30). Justice Gagné agreed. The proper approach is set out in this quote from Driedger in Lavigne (at para 25): “the words of an Act are to be read in their entire context and in their grammatical and ordinary sense harmoniously with the scheme of the Act, the object of the Act, and the intention of Parliament.”
In this case, the relevant words of the Act – “commercial activity” and “journalistic purposes” – were interpreted by the Court in accordance with ordinary interpretive principles. I do not suggest that these interpretations are wrong or problematic. I do find it interesting, though, that this decision makes it clear that an implicit human rights-based approach is far inferior to making such an approach explicit through actual wording in the legislation. This is a point that may be relevant as we move forward with the PIPEDA reform process.
Next Steps
Google may, of course, appeal this decision to the Federal Court of Appeal. If it does not, the next step will be for the Commissioner to investigate the complaint and to issue his Report of Findings. The Commissioner has no order-making powers under PIPEDA. If an order is required to compel Google to de-index any sites, this will proceed via a hearing de novo in Federal Court. We are still, therefore, a long way from a right to be forgotten in Canada.
Published in
Privacy
Wednesday, 17 April 2019 09:06
Right to Be Forgotten Reference to Federal Court Attracts Media Concern
A recent decision on a motion before the Federal Court marks the progress of the Privacy Commissioner’s reference case on whether the Personal Information Protection and Electronic Documents Act (PIPEDA) includes a right to be forgotten. In an earlier report following the OPC’s consultation on digital reputation, the Privacy Commissioner had indicated that he was of the view that PIPEDA, in its unamended form, provided for a right to be forgotten that could be exercised against search engines.
The reference, launched on October 10, 2018, is linked to a complaint filed with the Office of the Privacy Commissioner (OPC) by an individual against Google. The complainant is concerned that Google searches of his name produce links to news articles that he alleges “are outdated and inaccurate and disclose sensitive information such as his sexual orientation and a serious medical condition” (at para 6). The complainant’s view is that by providing prominent links to these articles, Google is breaching PIPEDA. He is seeking to have these results de-indexed, which means that they would no longer appear in Google search results. De-indexing does not involve the removal of content from the source websites. Basically, the articles would still be out there, but they would not appear in Google search results. Unless similar orders were made against other search engines such as Bing, the content would still be findable using those engines.
The Commissioner has referred two questions to the Federal Court. First, he seeks to know whether Google’s search engine activities constitute the “commercial activity” necessary to bring these activities within the scope of PIPEDA, which applies to the collection, use or disclosure of personal information in the course of commercial activity. The second question is whether Google’s search engine activities, even if commercial, fall within the exception to PIPEDA’s application where personal information is collected, used or disclosed “for journalistic, artistic or literary purposes and for no other purpose” (s. 4(2)(c)).
Google and the Attorney General of Canada were given notice of the reference and are entitled to become parties to it. Google has challenged the scope of the reference. It seeks to add the question of whether, if PIPEDA does apply to the search engine’s activities and a de-indexing order is made, such an order would violate s. 2(b) of the Canadian Charter of Rights and Freedoms. This motion to expand the scope of the reference has not yet been heard.
The CBC, along with a coalition of other Canadian media organizations, brought motions seeking to be added as parties to the original reference. Their concern is that the Commissioner’s interpretation of the scope of PIPEDA as including a right to be forgotten is a violation of the freedom of expression guaranteed by s. 2(b) of the Charter. Their argument is based on the principle that the right of expression includes the right to receive information, and that measures taken to limit access to information in the news media thus breach the Charter. By bringing their motion, the media outlets sought to be added as parties, with the right to introduce evidence and make argument before the Court. The motion was heard by Prothonotary Tabib, who rendered her decision on March 1. She began by noting that since the motion was being heard prior to any decision on Google’s motion to expand the scope of proceedings, party status would be considered only with respect to the original reference questions.
She was critical of the motion on the basis that it proceeded “from the fundamental assumption that the Court’s determination of the jurisdictional questions in a way that confers jurisdiction on the OPC to investigate the underlying complaint will inevitably result in deindexing lawful news media content from Internet search results” (at para 17). She noted that in fact the reference questions were directed towards the issue of whether the Commissioner had jurisdiction in the matter. If the outcome of the reference was a finding that there was jurisdiction, the Commissioner would still have to investigate, would have to find the complaint well-founded, and would have to determine whether de-indexing was an appropriate remedy. The Commissioner can only make non-binding recommendations, so no Charter rights would be violated unless the matter proceeded to a recommendation to de-index with which Google voluntarily complied. If Google refused to comply, the complainant or the Commissioner could bring the matter to the Federal Court seeking a binding order, but the Court would hold a hearing de novo and might reach different conclusions. Basically, the prothonotary was of the view that the matter was a long way from breaching anyone’s Charter rights. She noted that “[t]he media parties’ reliance on assumptions as to the ultimate result to form the cornerstone of their argument conflates all subsequent steps and determinations into the preliminary issue” (at para 18).
Prothonotary Tabib considered Rule 104(1)(b) of the Federal Courts Rules, which empowers the Court to order a person to be joined as a party. She focused on the issue of whether the presence of the media parties was necessary “for a full and effectual determination” of all of the issues in the reference. The media companies argued that their presence was necessary since the results of the reference would be binding on them. Prothonotary Tabib noted:
The media parties’ arguments thus essentially rest on the underlying assumption that what is truly at issue in this reference is the constitutionality of the Privacy Commissioner’s “intended” institution of a deindexing process in respect of lawful news content from Internet search results. However, as determined above, that is not what is truly at issue in this reference. What is at issue here is only whether Google is subject to or exempt from the application of Part 1 of PIPEDA in respect of how it collects, uses or discloses personal information in the operation of its search engine service when it presents search results in response to an individual’s name. (at para 36)
She observed that the only direct effect of the outcome of the reference would be the Commissioner’s decision to proceed with the investigation of the complaint against Google. She also noted that any freedom of expression impact that might ultimately flow from this matter would be shared by all internet content providers, as well as all those who used Google’s search engines. If the Charter interests of the media entitled them to be parties, then there was virtually no limit to who could be a party – which would be an absurd and unmanageable result.
In her view, it would be more appropriate for the media companies to seek intervenor status. However, she found that their motion did not address the issues they would need to establish for intervenor status. In brief, they failed to show how their contributions to the argument would be distinct from what Google would provide as a party to the reference. The motions were dismissed, with leave for the media companies to reapply to intervene once Google’s motion to vary the scope of the reference is decided.
Published in
Privacy
Wednesday, 31 January 2018 14:59
OPC Report on Online Reputation Misses the Mark on the Application of PIPEDA to Search Engines
The Office of the Privacy Commissioner of Canada has released its Draft Position on Online Reputation. It’s an important issue and one that is of great concern to many Canadians. In the Report, the OPC makes recommendations for legislative change and proposes other measures (education, for example) to better protect online reputation. However, the report has also generated considerable controversy for the position it has taken on how the Personal Information Protection and Electronic Documents Act (PIPEDA) currently applies in this context. In this post I will focus on the Commissioner’s expressed view that PIPEDA applies to search engine activities in a way that would allow Canadians to request the de-indexing of personal information from search engines, with the potential to complain to the Commissioner if these demands are not met.
PIPEDA applies to the collection, use and disclosure of personal information in the course of commercial activity. The Commissioner reasons, in this report, that search engines are engaged in commercial activity even if search functions are free to consumers – an example is the placement of ads in search results. According to the Commissioner, because search engines can provide search results that contain (or lead to) personal information, these search engines are collecting, using and disclosing personal information in the course of commercial activity.
With all due respect, this view seems inconsistent with current case law. In 2010, the Federal Court in State Farm Mutual Automobile Insurance Co. v. Canada (Privacy Commissioner) ruled that an insurance company that collected personal information on behalf of an individual it was representing in a lawsuit was not collecting that information in the course of commercial activity. This was notwithstanding the fact that the insurance company was a commercial business. The Court was of the view that, in essence, the information was being collected on behalf of a private person (the defendant) so that he could defend a legal action – a private and non-commercial matter to which PIPEDA did not apply. Quite tellingly, at para 106, the court stated: “if the primary activity or conduct at hand, in this case the collection of evidence on a plaintiff by an individual defendant in order to mount a defence to a civil tort action, is not a commercial activity contemplated by PIPEDA, then that activity or conduct remains exempt from PIPEDA even if third parties are retained by an individual to carry out that activity or conduct on his or her behalf.”
The same reasoning applies to search engines. Yes, Google makes a lot of money, some of which comes from its search engine functions. However, the search engines are there for anyone to use, and the relevant activities, for the purposes of the application of PIPEDA, are those of the users. If a private individual carries out a Google search for his or her own purposes, that activity does not amount to the collection of personal information in the course of commercial activity. If a company does so for its commercial purposes, then that company – and not Google – will have to answer under PIPEDA for the collection, use or disclosure of that personal information. The view that Google is on the hook for all searches is not tenable. It is also problematic for the reasons set out by my colleague Michael Geist in his recent post.
I also note with some concern the way in which the “journalistic purposes” exception is treated in the Commissioner’s report.
This exception is one of several designed to balance privacy with freedom of expression interests. In this context, the argument is that a search engine facilitates access to information, and is a tool used by anyone carrying out online research. This is true, and for the reasons set out above, PIPEDA does not apply unless that research is carried out in the course of commercial activities to which the statute would apply. Nevertheless, in discussing the exception, the Commissioner states:
Some have argued that search engines are nevertheless exempt from PIPEDA because they serve a journalistic or literary function. However, search engines do not distinguish between journalistic/literary material. They return content in search results regardless of whether it is journalistic or literary in nature. We are therefore not convinced that search engines are acting for “journalistic” or “literary” purposes, or at least not exclusively for such purposes as required by paragraph 4(2)(c).
What troubles me here is the statement that “search engines do not distinguish between journalistic and literary material”. They don’t need to. The nature of what is sought is not the issue. The issue is the purpose. If an individual uses Google in the course of non-commercial activity, PIPEDA does not apply. If a journalist uses Google for journalistic purposes, PIPEDA does not apply. The nature of the content that is searched is immaterial. The quote goes on to talk about whether search engines act for journalistic or literary purposes – that too is not the point. Search engines are tools. They are used by actors. It is the purposes of those actors that are material, and it is to those actors that PIPEDA will apply – if they are collecting, using or disclosing personal information in the course of commercial activity.
The Report is open for comment until April 19, 2018.
Published in
Privacy