Monday, 23 November 2020 13:42
With a New Federal Bill Before Parliament, Is There Still a Case for Ontario to Enact its Own Private Sector Data Protection Law?
The federal government’s new Bill C-11 to reform its antiquated private sector data protection law has landed on Parliament’s Order Paper at an interesting moment for Ontario. Earlier this year, Canada’s largest province launched a consultation on whether it should enact its own private sector data protection law that would apply instead of the federal law to intraprovincial activities.

The federal Personal Information Protection and Electronic Documents Act (PIPEDA) was enacted in 2000, a time when electronic commerce was on the rise, public trust was weak, and transborder flows of data were of growing economic importance. Canada faced an adequacy assessment under the European Union’s Data Protection Directive in order to keep data flowing to Canada from the EU. At the time, only Quebec had its own private sector data protection law. Because a federal law in this area was on a somewhat shaky constitutional footing, PIPEDA’s compromise was that it would apply nationally to private sector data collection, use or disclosure in the course of commercial activity, unless a province had enacted “substantially similar” legislation. In such a case, the provincial statute would apply within the province, although not to federally regulated industries or where data flowed across provincial or national borders. British Columbia and Alberta enacted their own statutes in 2004. Along with Quebec’s law, these were declared substantially similar to PIPEDA. The result is a somewhat complicated private sector data protection framework made workable by co-operation between federal and provincial privacy commissioners. Those provinces without their own private sector laws have seemed content with PIPEDA – and with letting Ottawa pick up the tab for its oversight and enforcement.
Twenty years after PIPEDA’s enactment, data-thirsty technologies such as artificial intelligence are in the ascendant, public trust has been undermined by rampant overcollection, breaches and scandals, and transborder data flows are ubiquitous. The EU’s 2018 General Data Protection Regulation (GDPR) has set a new and higher standard for data protection, and Canada must act to satisfy a new adequacy assessment. Bill C-11 is the federal response.

There are provisions in Bill C-11 that tackle the challenges posed by the contemporary data environment. For example, organizations will have to provide upfront a “general account” of their use of automated decision systems that “make predictions, recommendations or decisions about individuals that could have significant impacts on them” (s. 62(1)(c)). The right of access to one’s personal information will include a right to an explanation of any prediction, recommendation or decision made using an automated decision system (s. 63(3)). There are also new exceptions to consent requirements for businesses that seek to use their existing stores of personal information for new internal purposes. C-11 will facilitate some sharing of de-identified data for “socially beneficial purposes”. These are among the Bill’s innovations.

There are, however, things that the Bill does not do. Absent from Bill C-11 is anything specifically addressing the privacy of children or youth. In fact, the Bill reworks the meaning of “valid consent” so that it is no longer assessed in terms of the ability of those at whom a product or service is targeted to understand the consequences of their consent. This undermines privacy, particularly for youth. Ontario could set its own course in this area. More importantly, perhaps, there are some things that a federal law simply cannot do. It cannot tread on provincial jurisdiction, which leaves important data protection gaps.
These include employee privacy in provincially regulated sectors, the non-commercial activities of provincial organizations, and provincial political parties. The federal government clearly has no stomach for including federal political parties under the Consumer Privacy Protection Act (CPPA) that Bill C-11 would enact. Yet the province could act – as BC has done – to impose data protection rules on provincial parties. There is also the potential to build more consistent norms, as well as some interoperability where necessary, across the provincial public, health and private sectors under a single regulator.

The federal bill may also not be best suited to meet the spectrum of needs of Ontario’s provincially regulated private sector. Many of the bill’s reforms target the data practices of large corporations, including those that operate transnationally. The enhanced penalties and enforcement mechanisms in Bill C-11 are much needed, but are oriented towards penalizing bad actors whose large-scale data abuses cause significant harm. Make no mistake – we need C-11 to firmly regulate the major data players. And, while a provincial data protection law must also have teeth, it would be easier to scale such a law to the broad diversity of small and medium-sized enterprises in the Ontario market. This is not just in terms of penalties but also in terms of the compliance burden. Ontario’s Information and Privacy Commissioner could play an important role here as a conduit for information and education and as a point of contact for guidance. Further, as the failed Sidewalk Toronto project demonstrated, the province is rife with opportunities for public-private technology partnerships. Having a single regulator and an interoperable set of public and private sector data protection laws could offer real advantages in simplifying compliance and making the environment more attractive to innovators, while at the same time providing clear norms and a single point of contact for affected individuals.
In theory as well, the provincial government would be able to move quickly if need be to update or amend the law. The wait for PIPEDA reform has been excruciating, and it is not over yet: Bill C-11 may not be passed before we have to go to the polls again. That said, timely updating has not been a hallmark of either BC’s or Alberta’s regime.

Drawbacks of a new Ontario private sector data protection law would include further multiplication of the number of data protection laws in Canada, and the regulatory complexity this can create. A separate provincial law will also mean that Ontario will assume the costs of administering a private sector data protection regime. This entails the further risk that budget measures could be used by future governments to undermine data protection in Ontario. Still, the same risks – combined with considerably less control – exist with federal regulation. There remains a strong and interesting case for Ontario to move forward with its own legislation.
Published in
Privacy
Wednesday, 18 November 2020 11:29
It’s not you, it’s me? Why does the federal government have a hard time committing to the human right to privacy?
It’s been a busy privacy week in Canada. On November 16, 2020, Canada’s Department of Justice released its discussion paper as part of a public consultation on reform of the Privacy Act. On November 17, the Minister of Industry released the long-awaited bill to reform Canada’s private sector data protection legislation. I will be writing about both developments over the next while. But in this initial post, I would like to focus on one overarching and obvious omission in both the Bill and the discussion paper: the failure to address privacy as a human right.

Privacy is a human right. It is declared as such in international instruments to which Canada is a signatory, such as the Universal Declaration of Human Rights and the International Covenant on Civil and Political Rights. Data protection is only one aspect of the human right to privacy, but it is an increasingly important one. The modernized Convention 108 (Convention 108+), a data protection convention originating with the Council of Europe but open to any country, puts human rights front and centre. Europe’s General Data Protection Regulation also directly acknowledges the human right to privacy, and links privacy to other human rights. Canada’s Privacy Commissioner has called for Parliament to adopt a human rights-based approach to data protection, both in the public and private sectors.

In spite of all this, the discussion paper on reform of the Privacy Act is notably silent with respect to the human right to privacy. In fact, it reads a bit like the script for a relationship in which one party dances around commitment, but just can’t get out the words “I love you”. (Or, in this case, “Privacy is a human right”.) The title of the document is a masterpiece of emotional distancing. It begins with the words: “Respect, Accountability, Adaptability”. Ouch.
“Respect” is the first of the three pillars for reform of the Act, and represents “Respect for individuals based on well established rights and obligations for the protection of personal information that are fit for the digital age.” Let’s measure that against the purpose statement from Convention 108+: “The purpose of this Convention is to protect every individual, whatever his or her nationality or residence, with regard to the processing of their personal data, thereby contributing to respect for his or her human rights and fundamental freedoms, and in particular the right to privacy.” Or, from article 1 of the GDPR: “This Regulation protects fundamental rights and freedoms of natural persons and in particular their right to the protection of personal data.” The difference is both substantial and significant.

The discussion paper almost blurts it out… but again stops short in its opening paragraph, which refers to the Privacy Act as “Canada’s quasi-constitutional legal framework for the collection, use, disclosure, retention and protection of personal information held by federal public bodies.” This is the romantic equivalent of “I really, really like spending time with you at various events, outings and even contexts of a more private nature.” The PIPEDA reform bill, which dropped into our laps on November 17, does mention the “right to privacy”, but the reference is in the barest terms. Note that Convention 108+ and the GDPR identify the human right to privacy as being intimately linked to other human rights and freedoms (which it is).
Section 5 of Bill C-11 (the Consumer Privacy Protection Act) talks about the need to establish “rules to govern the protection of personal information in a manner that recognizes the right to privacy of individuals with respect to their personal information and the need of organizations to collect, use or disclose personal information for purposes that a reasonable person would consider appropriate in the circumstances.” This is pretty much what was already in PIPEDA, and it falls far short of the statements quoted from Convention 108+ and the GDPR. In the PIPEDA context, the argument has been that “human rights” are not within exclusive federal jurisdiction, so talking about human rights in PIPEDA just makes the issue of its constitutionality more fraught. Whether or not this argument holds water (it doesn’t), the same excuse does not exist for the federal Privacy Act. The Cambridge Analytica scandal (in which personal data was used to subvert democracy), concerns over uses of data that will perpetuate discrimination and oppression, and complex concerns over how data is collected and used in contexts such as smart cities all demonstrate that data protection is about more than a narrow view of a person’s right to privacy. Privacy is a human right that is closely linked to the enjoyment of other human rights and freedoms. Recognizing privacy as a human right does not mean that data protection will not require some balancing. However, it does mean that in a data-driven economy and society we keep fundamental human values strongly in focus. We’re not going to get data protection right if we cannot admit these connections and clearly state that data protection is about the protection of fundamental human rights and freedoms. There. Is that so hard?
Tuesday, 06 October 2020 13:56
BC Court of Appeal is Open to Broadening Privacy Recourse
The BC Court of Appeal has handed down a decision that shakes up certain assumptions about recourse for privacy-related harms in that province – and perhaps in other provinces as well. The decision relates to a class action lawsuit filed after a data breach. The defendant had stored an unencrypted copy of a database containing customer personal information on its website. The personal information included: “names, addresses, email addresses, telephone numbers, dates of birth, social insurance numbers, occupations, and, in the case of credit card applicants, their mothers' birth names.” (at para 4) This information was accessed by hackers. By the time of this decision, some of the information had been used in phishing scams but the full extent of its use is still unknown. As is typical in privacy class action lawsuits, the plaintiffs sought certification on multiple grounds. These included: “breach of contract, negligence, breach of privacy, intrusion upon seclusion, breach of confidence, unjust enrichment and waiver of tort.” (at para 6) The motions judge certified only claims in contract, negligence, and the federal common law of privacy. The defendants appealed, arguing that the remaining grounds were not viable and that the action should not have been certified. They also argued that a class action lawsuit was not the preferable procedure for the resolution of the common issues. While the plaintiffs cross-appealed the dismissal of the claim for breach of confidence, they did not appeal the decision that there was no recourse for breach of privacy or the tort of intrusion upon seclusion under BC law. This post focuses on what I consider to be the three most interesting issues in the case. These are: whether there is recourse for data breaches other than via data protection legislation; whether the tort of breach of privacy exists in B.C.; and whether there is a federal common law of privacy.

1. Is PIPEDA a complete code?

The defendants argued that the class action lawsuit was not the preferred procedure because the federal Personal Information Protection and Electronic Documents Act (PIPEDA) constituted a “complete code in respect of the collection, retention, and disclosure of personal information by federally-regulated businesses, and that no action, apart from the application to the Federal Court contemplated by the Act can be brought in respect of a data breach.” (at para 18) Justice Groberman, writing for the unanimous Court, noted that while it was possible for a statute to constitute a complete code intended to fully regulate a particular domain, it is not inevitable. He observed that the Ontario Court of Appeal decision in Hopkins v. Kay had earlier determined that Ontario’s Personal Health Information Protection Act (PHIPA) did not constitute a complete code when it came to regulating personal health information, allowing a lawsuit to proceed against a hospital for a data breach. In Hopkins, the Ontario Court of Appeal noted that PHIPA was primarily oriented towards addressing systemic issues in the handling of personal health information, rather than dealing with individual disputes. Although there was a complaints mechanism in the statute, the Commissioner had the discretion to decline to investigate a complaint if a more appropriate procedure were available. Justice Groberman noted that PIPEDA contained a similar provision in s. 12.
He observed that “[t]his language, far from suggesting that the PIPEDA is a complete code, acknowledges that other remedies continue to be available, and gives the Commissioner the discretion to abstain from conducting an investigation where an adequate alternative remedy is available to the complainant.” (at para 28) In his view, PIPEDA is similarly oriented towards addressing systemic problems and preventing future breaches: “[w]hile there is a mechanism to resolve individual complaints, it is an adjunct to the legislative scheme, not its focus.” (at para 29) He also found it significant that PIPEDA addressed private rather than public sector data protection. He stated: “[w]ithin a private law scheme, it seems to me that we should exercise even greater caution before concluding that a statute is intended to abolish existing private law rights.” (at para 30) He concluded that nothing in PIPEDA precluded other forms of recourse for privacy harms.

2. Do common law privacy torts exist in BC?

In 2012 the Ontario Court of Appeal recognized the privacy tort of intrusion upon seclusion in Jones v. Tsige. However, since British Columbia has a statutory privacy tort in its Privacy Act, the motions judge (like other BC judges before him) concluded that the statutory tort displaced any possible common law tort in BC. Justice Groberman was clearly disappointed that the plaintiffs had chosen not to appeal this conclusion. He stated: “In my view, the time may well have come for this Court to revisit its jurisprudence on the tort of breach of privacy.” (at para 55) He proceeded to review the case law usually cited as supporting the view that there is no common law tort of breach of privacy in BC. He distinguished the 2003 decision in Hung v. Gardiner on the basis that in that case the judge at first instance had simply stated that he was not convinced by the authorities provided that such a tort existed in BC.
On appeal, the BCCA agreed with the judge’s conclusion on an issue of absolute privilege, and found it unnecessary to consider any of the other grounds of appeal. The BCCA decision in Mohl v. University of British Columbia is more difficult to distinguish because in that case the BCCA stated “[t]here is no common-law claim for breach of privacy. The claim must rest on the provisions of the [Privacy] Act.” (Mohl at para 13) Nevertheless, Justice Groberman indicated that while this statement was broad, “it is not entirely clear that it was intended to be a bold statement of general principle as opposed to a conclusion with respect to the specific circumstances of Mr. Mohl's case. In any event, the observation was not critical to this Court's reasoning.” (at para 62) Justice Groberman concluded that “The thread of cases in this Court that hold that there is no tort of breach of privacy, in short, is a very thin one.” (at para 64) He also noted that the privacy context had considerably changed, particularly with the Ontario Court of Appeal’s decision in Jones v. Tsige. He stated:

It may be that in a bygone era, a legal claim to privacy could be seen as an unnecessary concession to those who were reclusive or overly sensitive to publicity, though I doubt that that was ever an accurate reflection of reality. Today, personal data has assumed a critical role in people's lives, and a failure to recognize at least some limited tort of breach of privacy may be seen by some to be anachronistic. (at para 66)

He indicated that the Court of Appeal might be inclined to reconsider the issue were it to be raised before them, although he could not do so in this case since the plaintiffs had not appealed the judge’s ruling on this point.

3. There is no federal common law of privacy

However keen Justice Groberman might have been to hear arguments on the common law tort of privacy, he overturned the certification of the privacy claims as they related to the federal common law of privacy. He characterized this approach as ‘creative’, but inappropriate. He noted that while common law principles might evolve in areas of federal law (e.g. maritime law), in cases where there was shared jurisdiction such as in privacy law, there was no separate body of federal common law distinct from provincial common law. He stated “there is only a single common law, and it applies within both federal and provincial spheres.” (at para 76) More specifically, he stated:

Where an area of law could be regulated by either level of government, it is not sensible to describe the situation in which neither has enacted legislation as being a situation of either "federal" or "provincial" common law. It is simply a situation of the "common law" applying. The plaintiffs cannot choose whether to bring their claims under "federal" or "provincial" common law as if these were two different regimes. (at para 86)

Because the claim advanced by the plaintiff had nothing to do with any specific area of federal jurisdiction, Justice Groberman rejected the idea that a cause of action arose under “federal” common law. Overall, this decision is an interesting one. Clearly the Court of Appeal is sending strong signals that it is time to rethink recourse for breach of privacy in the province. It may now be that there is both a statutory and a common law action for breach of privacy. If this is so, it will be interesting to see what scope is given to the newly recognized common law tort.
“Complete code” arguments have arisen in other lawsuits relating to breach of privacy; the BCCA’s response in this case adds to a growing body of jurisprudence that rejects the idea that data protection laws provide the only legal recourse for the mishandling of personal data. Finally, a number of class action lawsuits have asserted the “federal common law of privacy”, even though it has been entirely unclear what this is. The BCCA suggests that it is a fabrication and that no such distinct area of common law exists.
Monday, 17 August 2020 08:54
Data Protection Laws and Political Parties: No Half Measures
This is a copy of my submission in response to the Elections Canada consultation on Political Communications in Federal Elections. The consultation closes on August 21, 2020. Note that this submission has endnotes, which are at the end of the document. Where possible these include hyperlinks to the cited sources.

16 August 2020

I appreciate the invitation to respond to Elections Canada’s consultation on the overall regulatory regime that governs political communications in federal elections. I hold the Canada Research Chair in Information Law and Policy at the University of Ottawa, where I am also a law professor. I provide the following comments in my capacity as an individual. The consultation raises issues of great importance to Canadians. My comments will focus on Discussion Paper 3: The Protection of Electors’ Personal Information in the Federal Electoral Context.[1] Concerns over how political parties handle personal information have increased steadily over the years. Not surprisingly, this coincides with the rise of big data analytics and artificial intelligence (AI) and the capacity of these technologies to use personal data in new ways, including profiling and manipulation. Discussion Paper 3 homes in on the Cambridge Analytica scandal[2] and its implications for the misuse of personal data for voter manipulation. This egregious case illustrates why, in a big data environment, we need to seriously address how voter personal data is collected, used and disclosed.[3] The potential misuse of data for voter manipulation is an expanding threat.[4] Yet this kind of high-profile voter manipulation scandal is not the only concern that Canadians have with how their personal information is handled by political parties.
Additional concerns include lax security;[5] unwanted communications;[6] targeting based on religion, ethnicity or other sensitive grounds;[7] data sharing;[8] lack of transparency;[9] and voter profiling.[10] In addition, there is a troubling lack of transparency, oversight and accountability.[11] All of these are important issues, and they must be addressed through a comprehensive data protection regime.[12] Public concern and frustration with the state of data protection for Canadians when it comes to political parties has been mounting. There have been reports and studies,[13] op-eds and editorials,[14] privacy commissioner complaints,[15] a competition bureau complaint,[16] and even legal action.[17] There is a growing gulf between what Canadians expect when it comes to the treatment of their personal data and the obligations of political parties. Canadians now have two decades of experience with the Personal Information Protection and Electronic Documents Act (PIPEDA),[18] which governs the collection, use, and disclosure of personal data in the private sector. Yet PIPEDA does not apply to political parties, and there is a very wide gap between PIPEDA’s data protection norms and the few rules that apply to federal political parties. There is also considerable unevenness in the regulatory context for use of personal data by political parties across the country. For example, B.C.’s Personal Information Protection Act (PIPA)[19] already applies to B.C. political parties, and while there have been some problems with compliance,[20] the democratic process has not been thwarted. A recent interpretation of PIPA by the B.C. Privacy Commissioner also places federal riding offices located in B.C.
under its jurisdiction.[21] This means that there are now different levels of data protection for Canadians with respect to their dealings with federal parties, depending upon the province in which they live and whether, if they live in B.C., they are interacting with their riding office or with the national party itself. Further, if Quebec’s Bill 64 is enacted, it would largely extend the province’s private sector data protection law to political parties. Ontario, which has just launched a consultation on a new private sector data protection law for that province, is considering extending it to political parties.[22] Internationally, the EU’s General Data Protection Regulation (GDPR)[23] applies to political parties, with some specially tailored exceptions. Frankly put, it is becoming impossible to credibly justify the lack of robust data protection available to Canadians when it comes to how their personal data is handled by political parties. Lax data protection is neither the rule in Canada, nor the norm internationally.

There are points at which Discussion Paper 3 is overly defensive about the need for political parties to collect, use and disclose personal information about voters in the course of their legitimate activities. This need is not contested. But for too long it has gone virtually unrestrained and unsupervised. To be clear, data protection is not data prohibition. Data protection laws explicitly acknowledge the need of organizations to collect, use and disclose personal information.[24] Such laws set the rules to ensure that organizations collect, use, and disclose personal data in a manner consistent with the privacy rights of individuals. In addition, they protect against broader societal harms that may flow from unrestrained uses of personal data, including, in the political context, the manipulation of voters and subversion of democracy.

1. Information provided to parties by Elections Canada

Discussion Paper 3 sets out the current rules that protect electors’ personal information. For the most part, they are found in the Canada Elections Act (CEA).[25] In some instances, these rules provide less protection than comparable provincial election laws. For example, security measures, including the use of fictitious information in lists of electors to track unauthorized uses, are in place in some jurisdictions, but not at the federal level. Discussion Paper 3 notes that while such measures are not part of the CEA, best practices are provided for in Elections Canada guidelines.[26] These guidelines are not mandatory and are insufficient to protect electors’ information from deliberate or unintentional misuse. The CEA also contains new provisions requiring political parties to adopt privacy policies and to publish these online. While such privacy policies offer some improved degree of transparency, they do not provide for adequate enforcement or accountability. Further, they do not meet the threshold, in terms of prescribed protections, of the fair information principles that form the backbone of most data protection laws, including PIPEDA. There are some matters that should be addressed by specific provisions in the CEA. These relate to information that is shared by Elections Canada with political parties, such as the list of electors. The CEA should maintain accountability for this information by imposing security obligations on parties or candidates who receive the list of electors. It would be appropriate in those circumstances to have specific data breach notification requirements relating to the list of electors contained in the CEA. However, with respect to the wealth of other information that political parties collect or use, they should have to comply with PIPEDA and be accountable under PIPEDA for data breaches.

2. Fair Information Principles Approach

Discussion Paper 3 takes the position that fair information principles should be applied to political parties, and frames its questions in terms of how this should be accomplished. There are two main options. One is to craft a set of rules specifically for political parties, which might be incorporated into the CEA, with oversight by either the Privacy Commissioner and/or the Chief Electoral Officer. Another is to make political parties subject to PIPEDA, and to add to that law any carefully tailored exceptions necessary in the political context. The latter approach is better for the following reasons:

· The data protection landscape in Canada is already fragmented, with separate laws for federal and provincial public sectors; separate laws for the private sector, including PIPEDA and provincial equivalents in B.C., Alberta and Quebec; and separate laws for personal health information. There is a benefit to simplicity and coherence. PIPEDA can be adapted to the political context. There are many obligations which can and should be the same whether for private sector organizations or political parties. If particular exceptions tailored to the political context are required, these can be added.

· Political parties in BC (including federal riding associations) are already subject to data protection laws. Quebec, in Bill 64, proposes to make political parties subject to its private sector data protection law. The same approach should be followed federally.

· It is expected that PIPEDA will be amended in the relatively short term to bring it into line with the contemporary big data context. Creating separate norms in the CEA for political parties risks establishing two distinct privacy schemes which may not keep up with one another as the data context continues to evolve. It is much simpler to maintain one set of norms than to have two sets of privacy norms that are initially similar but that diverge over time.
3. Fair Information Principles: Specific Provisions

Discussion Paper 3 considers certain of the Fair Information Principles and how they apply to political parties. This discussion seems to assume in places that the solution will be to introduce new provisions in the CEA, rather than applying PIPEDA to political parties, subject to certain exceptions. For example, the first question under Accountability asks “Besides publishing their privacy policies, what other requirements could parties be subject to in order to make them accountable for how they collect, use and disclose personal information?”[27] As noted above, my view is that political parties should be subject to PIPEDA. The “other requirements” needed are those found in PIPEDA. There is no need to reinvent the wheel for political parties.

On the issue of data breaches, I note with concern that Discussion Paper 3 takes an overly cautious approach. For example, it states, presumably referring to PIPEDA, that “There are also penalties for organizations that knowingly fail to report a breach, which could be ruinous for a smaller party.”[28] In the first place, these penalties are for knowingly failing to report a breach, not for experiencing a breach. A party that experiences a data breach that creates a real risk of serious harm to an individual (the reporting threshold) and does not report it should not complain of the fines that are imposed for this failure. Secondly, the amounts set out in the legislation are maximum fines, and courts have discretion in imposing them. In any event, a class action lawsuit following a data breach is much more likely to be the ruination of a smaller party; liability for such a data breach could be mitigated by being able to demonstrate not only that the party complied with data protection norms but that it also responded promptly and appropriately when the breach took place. In my view, the data breach notification requirements can and should be applied to political parties.
Discussion Paper 3 also floats the idea of a voluntary code of practice as an alternative to parties being subject to data protection laws. It states: “A voluntary code may be more palatable to political parties than legislated change, while at the same time moving towards increasing electors’ privacy”.[29] It is fair to say that ‘soft’ guidance with no enforcement is always more palatable to those to whom it would apply than real obligations. However, we are long past the time for a gentle transition to a more data protective approach. Political parties have embraced big data and data analytics and now collect, use, and disclose unprecedented amounts of personal information. They need to be subject to the same data protection laws as other actors in this environment. While those laws may need a few carefully tailored exceptions to protect the political process, on the whole, they can and should apply. It would be wasteful, confusing, and unsatisfactory to create a parallel regime for data protection and political parties in Canada. Given their embrace of the big data environment and their expanding use of personal data, these parties should be held to appropriate and meaningful data protection norms, with oversight by the Privacy Commissioner of Canada. Federal political parties should be subject to PIPEDA with some carefully tailored exceptions. [1] Elections Canada, Discussion Paper 3: The Protection of Electors’ Personal Information in the Federal Electoral Context, May 2020, online: https://www.elections.ca/content.aspx?section=res&dir=cons/dis/compol/dis3&document=index&lang=e. [2] See, e.g.: Office of the Privacy Commissioner of Canada, PIPEDA Report of Findings #2019-004: Joint investigation of AggregateIQ Data Services Ltd. 
by the Privacy Commissioner of Canada and the Information and Privacy Commissioner for British Columbia, November 26 2019, online: https://www.priv.gc.ca/en/opc-actions-and-decisions/investigations/investigations-into-businesses/2019/pipeda-2019-004/. [3] Cherise Seucharan and Melanie Green, “A B.C. scandal has pulled back the curtain on how your online information is being used”, November 29, 2019, online: https://www.thestar.com/vancouver/2019/11/29/heres-how-companies-and-political-parties-are-getting-their-hands-on-your-data.html. [4] Brian Beamish, 2018 Annual Report: Privacy and Accountability for a Digital Ontario, Office of the Information and Privacy Commissioner of Ontario, June 27, 2019, at p. 30, online: https://www.ipc.on.ca/wp-content/uploads/2019/06/ar-2018-e.pdf. Office of the Information and Privacy Commissioner of British Columbia, “Investigation Report P19-01: Full Disclosure: Political parties, campaign data, and voter consent”, February 6, 2019, online: https://www.oipc.bc.ca/investigation-reports/2278. [5] Joan Bryden, “Elections Canada chief warns political parties are vulnerable to cyberattacks”, 4 February 2019, Global News, online: https://globalnews.ca/news/4925322/canada-political-parties-cyberattack-threat/; Office of the Information and Privacy Commissioner of British Columbia, “Investigation Report P19-01: Full Disclosure: Political parties, campaign data, and voter consent”, February 6, 2019, at 6 (noting the number of complaints received relating to lax security practices), and pp. 27-31 (outlining security issues), online: https://www.oipc.bc.ca/investigation-reports/2278. [6] Office of the Information and Privacy Commissioner of British Columbia, “Investigation Report P19-01: Full Disclosure: Political parties, campaign data, and voter consent”, February 6, 2019, at 22, online: https://www.oipc.bc.ca/investigation-reports/2278. 
Note that the complaint that led to the ruling that that province’s Personal Information Protection Act applied to federal riding associations in B.C. was based on an unconsented-to use of personal data. See: OIPC BC, Courtenay-Alberni Riding Association of The New Democratic Party of Canada, Order No. P19-02, 28 August 2019, online: https://www.oipc.bc.ca/orders/2331. [7] See, e.g.: Michael Geist, “Why Political Parties + Mass Data Collection + Religious Targeting + No Privacy Laws = Trouble”, October 11, 2019, online: http://www.michaelgeist.ca/2019/10/why-political-parties-mass-data-collection-religious-targeting-no-privacy-laws-trouble/; Sara Bannerman, Julia Kalinina, and Nicole Goodman, “Political Parties’ Voter Profiling Is a Threat to Democracy”, The Conversation, 27 January 2020, online: https://thetyee.ca/Analysis/2020/01/27/Political-Parties-Profiling-Democracy/. [8] See: Office of the Information and Privacy Commissioner of British Columbia, “Investigation Report P19-01: Full Disclosure: Political parties, campaign data, and voter consent”, February 6, 2019, at 25, online: https://www.oipc.bc.ca/investigation-reports/2278. [9] Colin Bennett, “They’re spying on you: how party databases put your privacy at risk”, iPolitics, September 1, 2015, online: https://ipolitics.ca/2015/09/01/theyre-spying-on-you-how-party-databases-put-your-privacy-at-risk/. [10] Colin J. Bennett, “Canadian political parties are gathering more and more data on voters all the time. It’s time we regulated what data they glean, and what they can do with it”, Policy Options, 1 February 2013, online: https://policyoptions.irpp.org/magazines/aboriginality/bennett/. [11] See, e.g.: Yvonne Colbert, “What's in your file? Federal political parties don't have to tell you”, CBC, 30 July 2019, online: https://www.cbc.ca/news/canada/nova-scotia/privacy-federal-political-parties-transparency-1.5226118; Katharine Starr, “Privacy at risk from Canadian political parties, says U.K. 
watchdog”, CBC, 10 November 2018, online: https://www.cbc.ca/news/politics/uk-information-commissioner-canadian-parties-data-privacy-1.4898867. [12] Federal, Provincial and Territorial Privacy Commissioners of Canada support meaningful privacy obligations for political parties. See: Securing Trust and Privacy in Canada’s Electoral Process: Resolution of the Federal, Provincial and Territorial Information and Privacy Commissioners, Regina, Saskatchewan, September 11-13, 2018, online: https://www.priv.gc.ca/en/about-the-opc/what-we-do/provincial-and-territorial-collaboration/joint-resolutions-with-provinces-and-territories/res_180913/. [13] See, e.g.: Colin J. Bennett and Robyn M. Bayley, “Canadian Federal Political Parties and Personal Privacy Protection: A Comparative Analysis”, March 2012, online: https://www.priv.gc.ca/en/opc-actions-and-decisions/research/explore-privacy-research/2012/pp_201203/; Colin Bennett, “Data Driven Elections and Political Parties in Canada: Privacy Implications, Privacy Policies and Privacy Obligations”, (April 12, 2018). Canadian Journal of Law and Information Technology, Available at SSRN: https://ssrn.com/abstract=3146964; Colin J. Bennett, “Privacy, Elections and Political Parties: Emerging Issues For Data Protection Authorities”, 2016, online: https://www.colinbennett.ca/wp-content/uploads/2016/03/Privacy-Elections-Political-Parties-Bennett.pdf; House of Commons, Standing Committee on Access to Information, Privacy and Ethics, Democracy Under Threat: Risks and Solutions in the Era of Disinformation and Data Monopoly (December 2018), online: <https://www.ourcommons.ca/Content/Committee/421/ETHI/Reports/RP10242267/ethirp17/ethirp17-e.pdf>, archived: https://perma.cc/RV8T-ZLWW. 
[14] See, e.g.: Samantha Bradshaw, “Data-protection laws must be extended to political parties”, Globe and Mail, 22 March 2018, online: https://www.theglobeandmail.com/opinion/article-data-protection-laws-must-be-extended-to-political-parties/; Michael Morden, “Politicians say they care about privacy. So why can political parties ignore privacy law?”, Globe and Mail, 29 May 2019, online: https://www.theglobeandmail.com/opinion/article-politicians-say-they-care-about-privacy-so-why-can-political-parties/; Colin Bennett, “Politicians must defend Canadians’ online privacy from Big Tech – and from politicians themselves”, Globe and Mail, 26 December 2019, online: https://www.theglobeandmail.com/opinion/article-politicians-must-defend-canadians-online-privacy-from-big-tech-and/; Sabrina Wilkinson, “Voter Privacy: What Canada can learn from abroad”, OpenCanada.org, 4 October 2019, online: https://www.opencanada.org/features/voter-privacy-what-canada-can-learn-abroad/; Fraser Duncan, “Political Parties and Voter Data: A Disquieting Gap in Canadian Privacy Legislation”, Saskatchewan Law Review, June 21, 2019, online: https://sasklawreview.ca/comment/political-parties-and-voter-data-a-disquieting-gap-in-canadian-privacy-legislation.php; Colin Bennett, “They’re spying on you: how party databases put your privacy at risk”, iPolitics, September 1, 2015, online: https://ipolitics.ca/2015/09/01/theyre-spying-on-you-how-party-databases-put-your-privacy-at-risk/. [15] See: Office of the Information and Privacy Commissioner of British Columbia, “Investigation Report P19-01: Full Disclosure: Political parties, campaign data, and voter consent”, February 6, 2019, at 25, online: https://www.oipc.bc.ca/investigation-reports/2278; OIPC BC, Courtenay-Alberni Riding Association of The New Democratic Party of Canada, Order No. P19-02, 28 August 2019, online: https://www.oipc.bc.ca/orders/2331. 
[16] See: Rachel Aiello, “Major political parties under competition probe over harvesting of Canadians' personal info”, CTV News, 15 January 2020, online: https://www.ctvnews.ca/politics/major-political-parties-under-competition-probe-over-harvesting-of-canadians-personal-info-1.4768501. [17] Rachel Gilmore, “Privacy group going to court over alleged improper use of voters list by Liberals, Tories and NDP”, CTV News, 10 August 2020, online: https://www.ctvnews.ca/politics/privacy-group-going-to-court-over-alleged-improper-use-of-voters-list-by-liberals-tories-and-ndp-1.5058556. [18] SC 2000, c 5, http://canlii.ca/t/541b8. [19] SBC 2003, c 63, http://canlii.ca/t/52pq9. [20] Office of the Information and Privacy Commissioner of British Columbia, “Investigation Report P19-01: Full Disclosure: Political parties, campaign data, and voter consent”, February 6, 2019, at 22, online: https://www.oipc.bc.ca/investigation-reports/2278. [21] OIPC BC, Courtenay-Alberni Riding Association of The New Democratic Party of Canada, Order No. P19-02, 28 August 2019, online: https://www.oipc.bc.ca/orders/2331. [22] Ministry of Government and Community Services, “Ontario Private Sector Privacy Reform: Improving private sector privacy for Ontarians in a digital age”, 13 August 2020, online: https://www.ontariocanada.com/registry/showAttachment.do?postingId=33967&attachmentId=45105. [23] Regulation (EU) 2016/679 (General Data Protection Regulation), OJ L 119, 4 May 2016, p. 1–88; online: https://gdpr-info.eu/. [24] See, e.g., PIPEDA, s. 3. [25] SC 2000, c 9, http://canlii.ca/t/53mhm. [26] Elections Canada, Guidelines for the Use of the List of Electors, https://www.elections.ca/content.aspx?section=pol&document=index&dir=ann/loe_2019&lang=e. [27] Elections Canada, Discussion Paper 3: The Protection of Electors’ Personal Information in the Federal Electoral Context, May 2020, at 11, online: https://www.elections.ca/content.aspx?section=res&dir=cons/dis/compol/dis3&document=index&lang=e. [28] Ibid at 16. [29] Ibid at 17.
Published in
Privacy
Thursday, 13 August 2020 16:02
Ontario launches consultation on a new private sector data protection law
The Ontario Government has just launched a public consultation and discussion paper to solicit input on a new private sector data protection law for Ontario. Currently, the collection, use and disclosure of personal information in Ontario is governed by the Personal Information Protection and Electronic Documents Act (PIPEDA). This is a federal statute overseen by the Privacy Commissioner of Canada. PIPEDA allows individual provinces to pass their own private sector data protection laws so long as they are ‘substantially similar’. To date, Quebec, B.C. and Alberta are the only provinces to have done so. Critics of this move by Ontario might say that there is no need to add the cost of overseeing a private sector data protection law to the provincial budget when the federal government currently bears this burden. Some businesses might also balk at having to adapt to a new data protection regime. While many of the rules might not be significantly different from those in PIPEDA, there are costs involved simply in reviewing and assessing compliance with any new law. Another argument against a new provincial law might relate to the confusion and uncertainty that could be created around the application of the law, since it would likely only apply to businesses engaged in intra-provincial commercial activities and not to inter-provincial or international activities, which would remain subject to PIPEDA. Although these challenges have been successfully managed in B.C., Alberta and Quebec, there is some merit in having a single, overarching law for the whole of the private sector in Canada. Nevertheless, there are many reasons to enthusiastically embrace this development in Ontario. First, constitutional issues limit the scope of application of PIPEDA to organizations engaged in the collection, use or disclosure of personal information in the course of commercial activity. 
This means that those provinces that rely solely on PIPEDA for data protection regulation have important gaps in coverage. PIPEDA does not apply to employees in provincially regulated sectors; non-commercial activities of non-profits and charities are not covered, nor are provincial (or federal, for that matter) political parties. The issue of data protection and political parties has received considerable attention lately. B.C.’s private sector data protection law applies to political parties in B.C., and this has recently been interpreted to include federal riding associations situated in B.C. Bill 64, a bill to amend data protection laws in Quebec, would also extend the application of that province’s private sector data protection law to provincial political parties. If Ontario enacts its own private sector data protection law, it can (and should) extend it to political parties, non-commercial actors or activities, and provide better protection for employee personal data. These are all good things. A new provincial law will also be designed for a digital and data economy. A major problem with PIPEDA is that it has fallen sadly out of date and is not well adapted to the big data and AI environment. For a province like Ontario that is keen to build public trust in order to develop its information economy, this is a problem. Canadians are increasingly concerned about the protection of their personal data. The COVID-19 crisis appears to have derailed (once again) the introduction of a bill to amend PIPEDA and it is not clear when such a bill will be introduced. Taking action at the provincial level means no longer being entirely at the mercy of the federal agenda. There is something to be said as well for a law, and a governance body (in this case, it would be the Office of the Ontario Information and Privacy Commissioner) that is attuned to the particular provincial context while at the same time able to cooperate with the federal Commissioner. 
This has been the pattern in the other provinces that have their own statutes. In Alberta and B.C. in particular, there has been close collaboration and co-operation between federal and provincial commissioners, including joint investigations into some complaints that challenge the boundaries of application of federal and provincial laws. In addition, Commissioners across the country have increasingly issued joint statements on privacy issues of national importance, including recently in relation to COVID-19 and contact-tracing apps. National co-operation combined with provincial specificity in data protection could offer important opportunities for Ontario. In light of this, the consultation process opens an exciting new phase for data protection in Ontario. The task will not simply be to replicate the terms of PIPEDA or even the laws of Alberta and B.C. (all of which can nonetheless provide useful guidance). None of these laws is particularly calibrated to the big data environment (B.C.’s law is currently under review), and there will be policy choices to be made around many of the issues that have emerged in the EU’s General Data Protection Regulation. This consultation is an opportunity to weigh in on crucially important data protection issues for a contemporary digital society, and on a made-in-Ontario statute.
Published in
Privacy
Friday, 24 April 2020 06:56
Facebook Judicial Review Application Raises Fairness Issues
On April 15, 2020, Facebook filed an application for judicial review of the Privacy Commissioner’s “decisions to investigate and continue investigating” Facebook, seeking to quash the Report of Findings issued on April 25, 2019. This joint investigation, involving the B.C. and federal privacy commissioners, was carried out in the wake of the Cambridge Analytica scandal. The Report of Findings found that Facebook had breached a number of its obligations under the federal Personal Information Protection and Electronic Documents Act (PIPEDA) and B.C.’s Personal Information Protection Act (PIPA). [As I explain here, it is not possible to violate both statutes on the same set of facts, so it is no surprise that nothing further has happened under PIPA]. The Report of Findings set out a series of recommendations. It also contained a section on Facebook’s response to the recommendations, in which the commissioners chastise Facebook. The Report led to some strongly worded criticism of Facebook by the federal Privacy Commissioner. On February 6, 2020, the Commissioner referred the matter to the Federal Court for a hearing de novo under PIPEDA. The application for judicial review is surprising. Under the Federal Courts Act, a party has thirty days from the date of a decision affecting it to seek judicial review. For Facebook, that limitation period ran out long ago. Further, section 18.1 of the Federal Courts Act provides for judicial review of a decision, but a Report of Findings is not a decision. The Commissioner does not have the power to make binding orders. Only the Federal Court can do that, after a hearing de novo. The decisions challenged in the application for judicial review are therefore the “decisions to investigate and to continue investigating” Facebook. 
In its application for judicial review, Facebook argues that the complainants lacked standing because they did not allege that they were users of Facebook or that their personal information had been impacted by Cambridge Analytica’s activities. Instead, they raised general concerns about Facebook’s practices leading to the Cambridge Analytica scandal. This raises the issue of whether a complaint under PIPEDA must be filed by someone directly affected by a company’s practice. The statute is not clear. Section 11(1) of PIPEDA merely states: “An individual may file with the Commissioner a written complaint against an organization for contravening a provision of Division 1 or 1.1 or for not following a recommendation set out in Schedule 1.” Facebook’s argument is that a specific affected complainant is required even though Facebook’s general practices might have left Canadian users vulnerable. This is linked to a further argument by Facebook that the investigation lacked a Canadian nexus since there was no evidence that any data about Canadians was obtained or used by Cambridge Analytica. Another argument raised by Facebook is that the investigation amounted to a “broad audit of Facebook’s personal information management practices, not an investigation into a particular PIPEDA contravention” as required by s. 11(1) of PIPEDA. Facebook argues that the separate audit power under PIPEDA has built-in limitations, and that the investigation power is much more extensive. It argues, essentially, that the investigation was an audit without the limits. Facebook also argues that the report of findings was issued outside of the one-year time limit set in s. 13(1) of PIPEDA. In fact, it was released after thirteen rather than twelve months. Finally, Facebook argues that the investigation carried out by the Commissioner lacked procedural fairness and independence. 
The allegations are that the sweeping scope of the complaint made against Facebook was not disclosed until shortly before the report was released, and that as a result Facebook had been unaware of the case it had to meet. It also alleges a lack of impartiality and independence on the part of the Office of the Privacy Commissioner in the investigation. No further details are provided. The lack of timeliness of this application may well doom it. Section 18.1 of the Federal Courts Act sets the thirty-day time limit from the date when the party receives notice of the decision it seeks to challenge; the decision in this case is the decision to initiate the investigation, which would have been communicated to Facebook almost two years ago. Although judges have discretion to extend the limitation period, and although Facebook argues it did not receive adequate communication regarding the scope of the investigation, even then its application comes almost a year after the release of the Report of Findings. Perhaps more significantly, it comes two and a half months after the Commissioner filed his application for a hearing de novo before the Federal Court. The judicial review application seems to be a bit of a long shot. Long shot though it may be, it may be intended as a shot across the bows of both the Office of the Privacy Commissioner and the federal government. PIPEDA is due for reform in the near future. Better powers of enforcement for PIPEDA have been on the government’s agenda; better enforcement is a pillar of the Digital Charter. The Commissioner and others have raised enforcement as one of the major weaknesses of the current law. In fact, the lack of response by Facebook to the recommendations of the Commissioner following the Report of Findings was raised by the Commissioner as evidence of the need for stronger enforcement powers. One of the sought-after changes is the power for the Commissioner to issue binding orders. 
This application for judicial review, regardless of its success, puts on the record concerns about procedural fairness that will need to be front of mind in any reforms that increase the powers of the Commissioner. As former Commissioner Jennifer Stoddart pointed out in a short article many years ago, PIPEDA creates an ombuds model in which the Commissioner plays a variety of roles, including promoting and encouraging compliance with the legislation, mediating and attempting early resolution of disputes, and investigating and reporting on complaints. Perhaps so as to give a degree of separation between these roles and any binding order of compliance, it is left to the Federal Court to issue orders after a de novo hearing. Regardless of its merits, the Facebook application for judicial review raises important procedural fairness issues even within this soft-compliance model, particularly since the Commissioner took Facebook so publicly to task for not complying with its non-binding recommendations. If PIPEDA were amended to include order-making powers, attention to procedural fairness issues would be even more crucial. Order-making powers might require clearer rules around procedures, as well as potentially greater separation of functions within the OPC, or possibly the creation of a separate adjudication body (e.g., a privacy tribunal).
Published in
Privacy
Monday, 30 March 2020 07:20
Interesting amendments to Ontario's health data and public sector privacy laws buried in omnibus bill
Given that we are in the middle of a pandemic, it is easy to miss the amendments to Ontario’s Personal Health Information Protection Act (PHIPA) and the Freedom of Information and Protection of Privacy Act (FIPPA) that were part of the omnibus Economic and Fiscal Update Act, 2020 (Bill 188) which whipped through the legislature and received Royal Assent on March 25, 2020. There is much that is interesting in these amendments. The government is clearly on a mission to adapt PHIPA to the digital age, and many of the new provisions are designed to do just that. For example, although many health information custodians already do this as a best practice, a new provision in the law (not yet in force) will require health information custodians that use digital means to manage health information to maintain an electronic audit log. Such a log must detail the identity of anyone who deals with the information, as well as the date and time of any access or handling of the personal information. The Commissioner may request a custodian to provide him with the log for audit or review. Clearly this is a measure designed to improve accountability for the handling of digital health information and to discourage snooping (which is also further discouraged by an increase in the possible fine for snooping found later in the bill). The amendments will also create new obligations for “consumer electronic service providers”. These companies offer services to individuals to help manage their personal health information. The substance of the obligations remains to be further fleshed out in regulations; the obligations will not take effect until the regulations are in place. The Commissioner will have a new power to order that a health information custodian or class of custodians cease providing personal health information to a consumer electronic service provider. Presumably this will occur in cases where there are concerns about the privacy practices of the provider. 
Interestingly, at a time when there is much clamor for the federal Privacy Commissioner to have new enforcement powers to better protect personal information, the PHIPA amendments give the provincial Commissioner the power to levy administrative penalties against “any person” who, in the opinion of the Commissioner, has contravened the Act or its regulations. The administrative penalties are meant either to serve as ‘encouragement’ to comply with the Act, or as a means of “preventing a person from deriving, directly or indirectly, any economic benefit as a result of contravention” of PHIPA. The amount of the penalty should reflect these purposes and must be in accordance with regulations. The amendments also set a two-year limitation period, running from the date of the most recent contravention, for the imposition of administrative penalties. In order to avoid the appearance of a conflict of interest, administrative penalties are paid to the Minister of Finance of the province. These provisions await the enactment of regulations before taking effect. The de-identification of personal information is a strategy relied upon to carry out research without adversely impacting privacy, but the power of data analytics today raises serious concerns about reidentification risk. It is worth noting that the definition of “de-identify” in PHIPA will be amended, pending the enactment of regulations, so that it can require the removal of any information “in accordance with such requirements as may be prescribed.” The requirements for de-identification will thus be made more adaptable to changes in technology. The above discussion reflects some of the PHIPA amendments; readers should be aware that there are others, and these can be found in Bill 188. Some take effect immediately; others await the enactment of regulations. I turn now to the amendments to FIPPA, which is Ontario’s public sector data protection law. 
To understand these amendments, it is necessary to know that the last set of FIPPA amendments (also pushed through in an omnibus bill) created and empowered “inter-ministerial data integration units”. This was done to facilitate inter-department data sharing with a view to enabling a greater sharing of personal information across the government (as opposed to the more siloed practices of the past). The idea was to allow the government to derive more insights from its data by enabling horizontal sharing, while still protecting privacy. These new amendments add to the mix the “extra-ministerial data integration unit”, which is defined in the law as “a person or entity, or an administrative division of a person or entity, that is designated as an extra-ministerial data integration unit in the regulations”. The amendments also give to these extra-ministerial data integration units many of the same powers to collect and use data as are available to inter-ministerial data integration units. Notably, however, an extra-ministerial data integration unit, according to its definition, need not be a public-sector body. It could be a person, a non-profit, or even a private sector organization. It must be designated in the regulations, but it is important to note the potential scope. These legislative changes appear to pave the way for new models of data governance in smart city and other contexts. The Institute for Clinical Evaluative Sciences (ICES) is an Ontario-based independent non-profit organization that has operated as a kind of data trust for health information in Ontario. It is a “prescribed entity” under s. 
45 of PHIPA, which has allowed it to collect “personal health information for the purpose of analysis or compiling statistical information with respect to the management of, evaluation or monitoring of, the allocation of resources to or planning for all or part of the health system, including the delivery of services.” It is a trusted institution, but public sector data protection laws have limited its ability to expand its data analytics to integrate other relevant data. In many ways, these amendments to FIPPA are aimed at better enabling ICES to expand its functions, and it is anticipated that ICES will be designated in the regulations. However, the amendments are cast broadly enough that there is room to designate other entities, enabling the sharing of municipal and provincial data with newly designated entities for the purposes set out in FIPPA, which include: “(a) the management or allocation of resources; (b) the planning for the delivery of programs and services provided or funded by the Government of Ontario, including services provided or funded in whole or in part or directly or indirectly; and (c) the evaluation of those programs and services.” The scope for new models of governance for public sector data is thus expanded. Both sets of amendments – to FIPPA and to PHIPA – are therefore interesting and significant. They are also buried in an omnibus bill. Last year, the Ontario government launched a Data Strategy Consultation that I have criticized elsewhere for being both rushed and short on detail. The Task Force was meant to report by the end of 2019; not surprisingly, given the unrealistic timelines, it has not yet reported. It is not even clear that a report is still contemplated. While it is true that technology is evolving rapidly and that there is an urgent need to develop a data strategy, the continued lack of transparency and the failure to communicate clearly about steps already underway is profoundly disappointing. 
One of the pillars of the data strategy was meant to be privacy and trust. Yet we have already seen two rounds of amendments to the province’s privacy laws pushed through in omnibus bills with little or no explanation. Many of these changes would be difficult for the lay person to understand or contextualize without assistance; some are frankly almost impenetrable. Ontario may have a data strategy. It might even be a good one. However, it seems to be one that can only be discovered or understood by searching for clues in omnibus bills. I realize that we are currently in a period of crisis and resources may be needed elsewhere at the moment, but this obscurity predates the pandemic. Transparent communication is a cornerstone of trust. It would be good to have a bit more of it.
Published in
Privacy
Tuesday, 24 March 2020 09:17
Private sector data, privacy and the pandemic
The COVID-19 pandemic has sparked considerable debate and discussion about the role of data in managing the crisis. Much of the discussion has centred around personal data, and in these discussions the balance between privacy rights and the broader public interest is often a focus of debate. Invoking the general ratcheting up of surveillance after 9-11, privacy advocates warn of the potential for privacy invasive emergency measures to further undermine individual privacy even after the crisis is over. This post will focus on the potential for government use of data in the hands of private sector companies. There are already numerous examples of where this has taken place or where it is proposed. The nature and intensity of the privacy issues raised by these uses depends very much on context. For the purposes of this discussion, I have identified three categories of proposed uses of private sector data by the public sector. (Note: My colleague Michael Geist has also written about 3 categories of data – his are slightly different). The first category involves the use of private sector data to mine it for knowledge or insights. For example, researchers and public health agencies have already experimented with using social media data to detect the presence or spread of disease. Some of this research is carried out on publicly accessible social media data and the identity of specific individuals is not necessary to the research, although geolocation generally is. Many private sector companies sit on a wealth of data that reveals the location and movements of individuals, and this could provide a rich source of data when combined with public health data. Although much could be done with aggregate and deidentified data in this context, privacy is still an issue. One concern is the potential for re-identification. 
Yet the full nature and scope of concerns could be highly case-specific and would depend upon what data is used, in what form, and with what other data it is combined. Government might, or might not, be the lead actor when it comes to the use of private sector data in this way. Private sector companies could produce analytics based on their own stores of data. They might do so for a variety of reasons, including experimentation with analytics or AI, a desire to contribute to solutions, or to provide analytics services to public and private sector actors. There is also the potential for public-private collaborations around data. Private sector companies acting on their own would most likely publish only aggregate or deidentified data, possibly in the form of visualizations. If the published information is not personal information, this type of dissemination is possible, although these companies would need to be attentive to reidentification risks. In cases where personal data is shared with the public sector, there might be other legal options. The Personal Information Protection and Electronic Documents Act (PIPEDA) contains a research exception that allows organizations to disclose information without consent “for statistical, or scholarly study or research, purposes that cannot be achieved without disclosing the information, [and] it is impracticable to obtain consent”. Such disclosure under s. 7(3)(f) requires that the organization inform the Commissioner in advance of any such disclosure, presumably to allow the Commissioner to weigh in on the legitimacy of what is proposed. The passage of a specific law, most likely on an emergency basis, could also enable disclosure of personal information without consent. Such an option would be most likely to be pursued where the government seeks to compel private sector companies to disclose information to them.
Ideally, any such law would set clear parameters on the use and disposal of such data, and could put strict time limits on data sharing to coincide with the state of emergency. A specific law could also provide for oversight and accountability.

The second category is where information is sought by governments in order to specifically identify and track individuals in order to enable authorities to take certain actions with respect to those individuals. An example is where cell phone location data of individuals who have been diagnosed with the disease is sought by government officials so that they can retrospectively track their movements to identify where infected persons have been and with whom they have had contact (contact-tracing). This might be done in order to inform the public of places and times where infected persons have been (without revealing the identity of the infected person) or it might be done to send messages directly to people who were in the vicinity of the infected person to notify them of their own possible infection. In such cases, authorities access and make use of the data of the infected person as well as the data of persons in proximity to them. Such data could also be used to track movements of infected persons in order to see if they are complying with quarantine requirements. For example, public authorities could combine data from border crossings post-spring break with cell phone data to see if those individuals are complying with directives to self-quarantine for 14 days. The use of private sector data in this way could be problematic under existing Canadian privacy law. Telcos are subject to PIPEDA, which does not contain an exception to the requirement for consent that would be an easy fit in these circumstances. However, PIPEDA does permit disclosure without consent where it is ‘required by law’. A special law, specific to the crisis, could be enacted to facilitate this sort of data sharing.
Any such law should also contain its own checks and balances to ensure that data collection and use is appropriate and proportional. Israel provides an example of a country that enacted regulations to allow the use of cell phone data to track individuals diagnosed with COVID-19. A podcast on this issue by Michael Geist featuring an interview with Israeli law professor Michael Birnhack exposes some of the challenges with this sort of measure. In a decision issued shortly after the recording of the podcast, the Israeli Supreme Court ruled that the regulations failed to meet the appropriate balance between privacy and the demands of the public health crisis. The case makes it clear that it is necessary to find an appropriate balance between what is needed to address a crisis and what best ensures respect for privacy and civil liberties. It is not an all-or-nothing proposition – privacy or public health. It is a question of balance, transparency, accountability and proportionality. It is interesting to note that, in this context, at least one country has asked individuals to voluntarily share their location and contact information. Singapore has developed an app called TraceTogether that uses Bluetooth signals to identify the phones of other app users that are within two metres of each user. The design of the app includes privacy protective measures. Sharing personal data with appropriate consent is easily permitted under public and private sector laws so long as appropriate safeguards are in place.

A third category of use of personal information involves the public sharing of information about the movements of individuals known to be infected with the virus. Ostensibly this is in order to give people information they may need to protect themselves from unwanted exposure. South Korea offers an example of such measures – it has provided highly detailed information about the location and movements of infected persons; the detail provided could lead to identification.
Given that, in Canada at least, testing has been limited due to insufficient resources, a decision to release detailed information about those who test positive could serve to stigmatize those persons while giving others a false sense of security. Some have raised concerns that such measures would also discourage individuals from coming forward to be tested or to seek treatment out of concerns over stigmatization. In Canada, the disclosure of specific personal health information of individuals – or information that could lead to their identification – is an extreme measure that breaches basic personal health information protection requirements. It is hard to see on what basis the public release of this type of information could be at all proportionate.

A common theme in all of the debates and discussions around data and privacy in the current context is that exceptional circumstances call for exceptional measures. The COVID-19 pandemic has spurred national and regional governments to declare states of emergency. These governments have imposed a broad range of limitations on citizen activities in a bid to stop the spread of the virus. The crisis is real, and the costs to human life, health and the economy are potentially devastating. Sadly, it is also the case that while many do their best to comply with restrictions, others flout them to greater or lesser extents, undermining the safety of everyone. In this context, it is not surprising that more drastic, less voluntary measures are contemplated, and that some of these will have important implications for privacy and civil liberties. Privacy and civil liberties, however, are crucially important values and should not be casual victims of pandemic panic. A careful balancing of interests can be reflected not just in the measures involving the collection and use of data, but also in issues of oversight, transparency, accountability, and, perhaps most importantly, in limits on the duration of collection and use.
Monday, 02 March 2020 08:41
Privacy investigations into Clearview AI in Canada: Old laws and new tech
Clearview AI and its controversial facial recognition technology have been making headlines for weeks now. In Canada, the company is under joint investigation by federal and provincial privacy commissioners. The RCMP is being investigated by the federal Privacy Commissioner after having admitted to using Clearview AI. The Ontario privacy commissioner has expressed serious concerns about reports of Ontario police services adopting the technology. In the meantime, the company is dealing with a reported data breach in which hackers accessed its entire client list. Clearview AI offers facial recognition technology to ‘law enforcement agencies.’ The term is not defined on their site, and at least one newspaper report suggests that it is defined broadly, with private security (for example university campus police) able to obtain access. Clearview AI scrapes images from publicly accessible websites across the internet and compiles them in a massive database. When a client provides them with an image of a person, they use facial recognition algorithms to match the individual in the image with images in its database. Images in the database are linked to their sources which contain other identifying information (for example, they might link to a Facebook profile page). The use of the service is touted as speeding up all manner of investigations by facilitating the identification of either perpetrators or victims of crimes. This post addresses a number of different issues raised by the Clearview AI controversy, framed around the two different sets of privacy investigations. The post concludes with additional comments about transparency and accountability.

1. Clearview AI & PIPEDA

Canada’s Personal Information Protection and Electronic Documents Act (PIPEDA) applies to the collection, use and disclosure of personal information by private sector organizations engaged in commercial activities. Although Clearview AI is a U.S.
company, PIPEDA will still apply if there is a sufficient nexus to Canada. In this case, the service clearly captures data about Canadians, and the facial recognition services are marketed to Canadian law enforcement agencies. This should be enough of a connection. The federal Privacy Commissioner is joined in his investigation by the Commissioners of Quebec, B.C. and Alberta. Each of these provinces has its own private sector data protection laws that apply to organizations that collect, use and disclose personal information within the borders of their respective province. The joint investigation signals the positive level of collaboration and co-operation that exists between privacy commissioners in Canada. However, as I explain in an earlier post, the relevant laws are structured so that only one statute applies to a particular set of facts. This joint investigation may raise important jurisdictional questions similar to those raised in the Facebook/Cambridge Analytica joint investigation and that were not satisfactorily resolved in that case. It is a minor issue, but nonetheless one that is relevant and interesting from a privacy governance perspective. The federal Commissioner’s investigation will focus on whether Clearview AI complied with PIPEDA when it collected, used and disclosed the personal information which populates its massive database. Clearview AI’s position on the legality of its actions is clearly based on U.S. law. It states on its website that: “Clearview searches the open web. Clearview does not and cannot search any private or protected info, including in your private social media accounts.” In the U.S., there is much less in the way of privacy protection for information in ‘public’ space. In Canada however, the law is different. 
Although there is an exception in PIPEDA (and in comparable provincial private sector laws) to the requirement of consent for the collection, use or disclosure of “publicly available information”, this exception is cast in narrow terms. It is certainly not broad enough to encompass information shared by individuals through social media. Interestingly, in hearings into PIPEDA reform, the House of Commons ETHI Committee at one point seemed swayed by industry arguments that PIPEDA should be amended to include websites and social media within the exception for “publicly available personal information”. In an earlier post, I argued that this was a dangerous direction in which to head, and the Clearview AI controversy seems to confirm this. Sharing photographs online for the purposes of social interaction should not be taken as consent to use those images in commercial facial recognition technologies. What is more, the law should not be amended to deem it to be so. To the extent, then, that the database contains personal information of Canadians that was collected without their knowledge or consent, the conclusion will likely be that there has been a breach of PIPEDA. The further use and disclosure of personal information without consent will also amount to a breach. An appropriate remedy would include ordering Clearview AI to remove all personal information of Canadians that was collected without consent from its database. Unfortunately, the federal Commissioner does not have order-making powers. If the investigation finds a breach of PIPEDA, it will still be necessary to go to Federal Court to ask that court to hold its own hearing, reach its own conclusions, and make an order. This is what is currently taking place in relation to the Facebook/Cambridge Analytica investigation, and it makes somewhat of a mockery of our privacy laws.
Stronger enforcement powers are on the agenda for legislative reform of PIPEDA, and it is to be hoped that something will be done about this before too long.
2. The Privacy Act investigation

The federal Privacy Commissioner has also launched an investigation into the RCMP’s now admitted use of Clearview AI technology. The results of this investigation should be interesting. The federal Privacy Act was drafted for an era in which government institutions generally collected the information they needed and used directly from individuals. Governments, in providing all manner of services, would compile significant amounts of data, and public sector privacy laws set the rules for governance of this data. These laws were not written for our emerging context in which government institutions increasingly rely on data analytics and data-fuelled AI services provided by the private sector. In the Clearview AI situation, it is not the RCMP that has collected a massive database of images for facial recognition. Nor has the RCMP contracted with a private sector company to build this service for it. Instead, it is using Clearview AI’s services to make presumably ad hoc inquiries, seeking identity information in specific instances. It is not clear whether or how the federal Privacy Act will apply in this context. If the focus is on the RCMP’s ‘collection’ and ‘use’ of personal information, it is arguable that this is confined to the details of each separate query, and not to the use of facial recognition on a large scale. The Privacy Act might simply not be up to addressing how government institutions should interact with these data-fuelled private sector services. The Privacy Act is, in fact, out of date and clearly acknowledged to be so. The Department of Justice has been working on reforms and has attempted some initial consultation. But the Privacy Act has not received the same level of public and media attention as has PIPEDA. And while we might see reform of PIPEDA in the not too distant future, reform of the Privacy Act may not make it onto the legislative agenda of a minority government.
If this is the case, it will leave us with another big governance gap for the digital age. If the Privacy Act is not to be reformed any time soon, it will be very interesting to see what the Privacy Commissioner’s investigation reveals. The interpretation of section 6(2) of the Privacy Act could be of particular importance. It provides that: “A government institution shall take all reasonable steps to ensure that personal information that is used for an administrative purpose by the institution is as accurate, up-to-date and complete as possible.” In 2018 the Supreme Court of Canada issued a rather interesting decision in Ewert v. Canada, which I wrote about here. The case involved a Métis man’s challenge to the use of actuarial risk-assessment tests by Correctional Services Canada to make decisions related to his incarceration. He argued that the tests were “developed and tested on predominantly non-Indigenous populations and that there was no research confirming that they were valid when applied to Indigenous persons.” (at para 12). The Corrections and Conditional Release Act contained language very similar to s. 6(2) of the Privacy Act. The Supreme Court of Canada ruled that this language placed an onus on the CSC to ensure that all of the data it relied upon in its decision-making about inmates met that standard – including the data generated from the use of the assessment tools. This ruling may have very interesting implications not just for the investigation into the RCMP’s use of Clearview’s technology, but also for public sector use of private sector data-fueled analytics and AI where those tools are based upon personal data. The issue is whether, in this case, the RCMP is responsible for ensuring the accuracy and reliability of the data generated by a private sector AI system on which they rely. One final note on the use of Clearview AI’s services by the RCMP – and by other police services in Canada. 
A look at Clearview AI’s website reveals its own defensiveness about its technologies, which it describes as helping “to identify child molesters, murderers, suspected terrorists, and other dangerous people quickly, accurately, and reliably to keep our families and communities safe.” Police service representatives have also responded defensively to media inquiries, and their admissions of use come with very few details. If nothing else, this situation highlights the crucial importance of transparency, oversight and accountability in relation to these technologies that have privacy and human rights implications. Transparency can help to identify and examine concerns, and to ensure that the technologies are accurate, reliable and free from bias. Policies need to be put in place to reflect clear decisions about what crimes or circumstances justify the use of these technologies (and which ones do not). Policies should specify who is authorized to make the decision to use this technology and according to what criteria. There should be record-keeping and an audit trail. Keep in mind that technologies of this kind, if unsupervised, can be used to identify, stalk or harass strangers. It is not hard to imagine someone using this technology to identify a person seen with an ex-spouse, or even to identify an attractive woman seen at a bar. They can also be used to identify peaceful protestors. The potential for misuse is enormous. Transparency, oversight and accountability are essential if these technologies are to be used responsibly. The sheepish and vague admissions of use of Clearview AI technology by Canadian police services are a stark reminder that there is much governance work to be done around such technologies in Canada even beyond privacy law issues.
Tuesday, 11 February 2020 10:13
Liberals and Conservatives seem to agree on privacy reform but with very different approaches
A recent story in iPolitics states that both the Liberals and the Conservatives support strengthening data protection laws in Canada, although it also suggests they may differ as to the best way to do so. The Liberals have been talking about strengthening Canada’s data protection laws – both the Privacy Act (public sector) and the Personal Information Protection and Electronic Documents Act (PIPEDA) (private sector) – since well before the last election, although their emphasis has been on PIPEDA. The mandate letters of both the Ministers of Justice and Industry contained directions to reform privacy laws. As I discuss in a recent post, these mandate letters speak of greater powers for the Privacy Commissioner, as well as some form of “appropriate compensation” for data breaches. There are also hints at a GDPR-style right of erasure, a right to withdraw consent to processing of data, and rights of data portability. With Canada facing a new adequacy assessment under the EU’s General Data Protection Regulation (GDPR), it is perhaps not surprising to see this inclusion of more EU-style rights. Weirdly, though, the mandate letters of the Minister of Industry and the Minister of Heritage also contain direction to create the new role of “Data Commissioner” to serve an as-yet unclear mandate. The concept of a Data Commissioner comes almost entirely out of the blue. It seems to have been first raised before the ETHI Committee on February 7, 2019 by Dr. Jeffrey Roy of Dalhousie University. He referenced in support of this idea a new Data Commissioner role being created in Australia as well as the existence of a UK Chief Data Officer. How it got from an ETHI Committee transcript to a mandate letter is still a mystery. If this, in a nutshell, is the Liberals’ plan, it contains the good, the worrisome, and the bizarre.
Strengthening PIPEDA – both in terms of actual rights and enforcement of those rights is a good thing, although the emphasis in the mandate letters seems very oriented towards platforms and other issues that have been in the popular press. This is somewhat worrisome. What is required is a considered and substantive overhaul of the law, not a few colourful and strategically-placed band-aids. There is no question that the role of the federal Privacy Commissioner is front and centre in this round of reform. There have been widespread calls to increase his authority to permit him to issue fines and to make binding orders. These measures might help address the fundamental weakness of Canada’s private sector data protection laws, but they will require some careful thinking about the drafting of the legislation to ensure that some of the important advisory and dispute resolution roles of the Commissioner’s office are not compromised. And, as we learned with reform of the Access to Information Act, there are order-making powers and then there are order-making powers. It will not be a solution to graft onto the legislation cautious and complicated order-making powers that increase bureaucracy without advancing data protection. The bizarre comes in the form of the references to a new Data Commissioner. At a time when we clearly have not yet managed to properly empower the Privacy Commissioner, it is disturbing that we might be considering creating a new bureaucracy with apparently overlapping jurisdiction. The mandate letters suggest that the so-called data commissioner would oversee (among other things?) data and platform companies, and would have some sort of data protection role in this regard. His or her role might therefore overlap with both those of the Privacy Commissioner and the Competition Bureau. It is worth noting that the Competition Bureau has already dipped its toe into the waters of data use and abuse. The case for a new bureaucracy is not evident. 
The Conservatives seem to be opposed to the creation of the new Data Commissioner, which is a good thing. However, Michelle Rempel Garner was reported by iPolitics as rejecting “setting up pedantic, out of date, ineffectual and bloated government regulatory bodies to enforce data privacy.” It is not clear whether this is simply a rejection of the new Data Commissioner’s office, or also a condemnation of the current regulatory approach to data protection (think baby and bath water). Instead, the Conservatives seem to be proposing creating a new data ownership right for Canadians, placing the economic value of Canadians’ data in their hands. This is a bad idea for many reasons. In the first place, creating a market model for personal data will do little to protect Canadians. Instead, it will create a context in which there truly is no privacy because the commercial exchange of one’s data for products and services will include a transfer of any data rights. It will also accentuate existing gaps between the wealthy and those less well off. The rich can choose to pay extra for privacy; others will have no choice but to sell their data. Further, the EU, which has seriously studied data ownership rights (and not just for individuals) has walked away from them each time. Data ownership rights are just too complicated. There are too many different interests in data to assign ownership to just one party. If a company uses a proprietary algorithm to profile your preferences for films or books, is this your data which you own, or theirs because they have created it? What is much more important is the recognition of different interests in data and the strengthening, through law, of the interests of individuals. This is what the GDPR has done. Rights of data portability and erasure, the right to withdraw consent to processing, and many other rights within the GDPR give individuals much stronger interests in their data, along with enforcement tools to protect those interests. 
Those strengthened interests are now supporting new business models that place consumers at the centre of data decision-making. Open banking (or consumer-directed banking), currently being studied by the Department of Finance in Canada, is an example of this, but there are others as well. The fix, in the end, is relatively simple. PIPEDA needs to be amended to both strengthen and expand the existing interests of individuals in their personal data. It also needs to be amended to provide for appropriate enforcement, compensation, and fines. Without accountability, the rights will be effectively meaningless. It also needs to happen sooner rather than later.
(With thanks to my RA Émilie-Anne Fleury who was able to find the reference to the Data Commissioner in the ETHI Committee transcripts)