Teresa Scassa - Blog

Bill S-4, the Digital Privacy Act, has received royal assent and is now law. This bill amends Canada’s Personal Information Protection and Electronic Documents Act (PIPEDA). PIPEDA, Canada’s private sector data protection statute, has been badly in need of updating for some time now. Although it only came into force in 2001, the technologies impacting personal information and the growing private sector thirst for such data have changed dramatically, rapidly outstripping the effectiveness of the legislation. There have been many calls for the reform of PIPEDA (perhaps most notably from successive Privacy Commissioners). The Digital Privacy Act addresses a handful of issues, some quite important, but leaves much more to be done. In this post I consider three of the changes: new data sharing powers for private sector organizations, data breach notification requirements, and a new definition of consent.

At least one of the amendments is considered a step backwards by privacy advocates. A new s. 7(3)(d.1) allows private sector organizations to share personal information among themselves, without the knowledge or consent of the individuals to whom the information pertains, for the purposes of investigating breaches of “agreements” or laws. Although originally seen as a measure that would make it easier for organizations such as banks to investigate complex fraud schemes involving a fraudster dealing with multiple organizations, the provision has become the target of significant criticism from privacy advocates, given the growing awareness of the vulnerability of individuals to snooping and information sharing of all kinds. Keep in mind that an “agreement” can be a user agreement with an ISP, the terms of use of a web site or other online service, or any other contract between an individual and an organization. The provision means that any company that suspects that one of the terms of an agreement to which it is party has been breached can ask other companies to share information – without the knowledge or consent of the individual and without a court order – in order to investigate the potential breach. There is a profound lack of transparency and accountability in the data sharing enabled by this provision. True, such sharing is not mandatory – an organization can refuse to share the information requested under this provision. This amendment places an onus on individuals to pressure organizations to give them clearer and more robust assurances regarding whether and how their personal information will be shared.

The amendments also add data breach notification requirements to PIPEDA. This is a change long sought by privacy advocates. Essentially, the law will require an organization that has experienced a data security breach to report the breach to the Privacy Commissioner “if it is reasonable in the circumstances to believe that the breach creates a real risk of significant harm to an individual” (s. 10.1). Affected individuals must also be notified in the same circumstances. “Significant harm” is defined in the legislation as including “bodily harm, humiliation, damage to reputation or relationships, loss of employment, business or professional opportunities, financial loss, identity theft, negative effects on the credit record and damage to or loss of property.” Whether there is a “real risk” of these types of harm is determined by considering two factors spelled out in the legislation: the sensitivity of the information at issue, and the likelihood that it is being misused or may be misused in the future. Any other “prescribed factor” must also be taken into account, leaving room to include other considerations in the regulations that will be required to implement these provisions. The real impact of these data breach notification provisions will largely turn on how “real risk” and “significant harm” are interpreted and applied. It is important to note as well that these provisions are the one part of the new law that is not yet in force. The data breach notification provisions are peppered throughout with references to “prescribed” information or requirements. This means that regulations are required before they can take effect, and it is not clear what the timeline for any such regulations might be. Those who have been holding their breath waiting for data breach notification requirements may just have to give in and inhale now in order to avoid asphyxiation.

One amendment that I find particularly interesting is a brand new definition of consent. PIPEDA is a consent-based data protection regime. That is, it is premised on the idea that individuals make free and informed choices about who gets to use their personal information and for what purposes. Consent is, of course, becoming something of a joke. There are too many privacy policies, and they are too long and too convoluted for people either to read them all or to understand them. It doesn’t help that they are often framed in very open-ended terms that give no clear indication of how personal information will be used by the organization seeking consent. In this context, the new definition is particularly intriguing. Section 6.1 of the statute now reads:

6.1 For the purposes of clause 4.3 of Schedule 1, the consent of an individual is only valid if it is reasonable to expect that an individual to whom the organization’s activities are directed would understand the nature, purpose and consequences of the collection, use or disclosure of the personal information to which they are consenting.

This is a rather astonishing threshold for consent – and one that is very consumer-friendly. It requires that the individual understand “the nature, purpose and consequences” of the collection, use or disclosure of the personal information to which they consent. In our networked, conglomerated and big-data dominated economy, I am not sure how anyone can fully understand the consequences of the collection, use or disclosure of much of their personal information. Given a generous interpretation, this provision could prove a powerful tool for protecting consumer privacy. Organizations should take note. At the very least it places a much greater onus on them to formulate clear, accessible and precise privacy policies.

Published in Privacy

The Privacy Commissioner of Canada has issued his findings in relation to the investigation of multiple complaints by Canadians regarding the collection, use and disclosure of their personal information by a company based in Romania. The company, Globe24h, operates a website which it describes as a “global database of public records”. This database contains a substantial number of decisions from Canadian courts and administrative tribunals. Some of this content was acquired by scraping court or tribunal websites, or websites such as CanLII. (I wrote about this situation earlier here.)

The problem, from a privacy point of view, is that many court and tribunal decisions contain a great deal of personal information. For example, a decision from a divorce case might provide considerable detail about personal assets. Immigration or refugee determination hearings similarly might reveal sensitive personal information. As Commissioner Therrien noted in his findings, the “highly detailed, highly sensitive personal information” found in the decisions that were the focus of the complaints in this case “could have negative reputation impacts (including financial information, health information, and information about children)” (at para 27). Globe24h offers a fee-based service for removal of personal information. A number of the complainants in this case had paid up to 200 euros to have their information removed from decisions in the database.

The Romanian company responded to the investigation by arguing that the Office of the Privacy Commissioner of Canada had no jurisdiction over its activities, and that even if it did, Canada’s Personal Information Protection and Electronic Documents Act did not apply because the company was engaged in journalistic activities. Alternatively, it argued that it was making use of publicly available information, for which consent is not required under PIPEDA. In this admittedly long blog post, I look at a number of different issues raised in the Commissioner’s findings. You can jump ahead if you like to: Open courts principle and privacy; Extended territorial jurisdiction; Journalism exception; Publicly available personal information; or Crown copyright – the unspoken issue.

 

Open courts principle and privacy

The open courts principle – which provides transparency for the justice system in Canada – dictates that decision-makers provide reasons for their decisions and that these decisions be publicly accessible. In the old days, decisions were published in law reports or made available for consultation at court offices. Either way, anyone interested in a particular case had to make some effort to track it down. Decisions were indexed according to subject matter, but were not easily searchable by individual names. The capacity to make court decisions publicly available on the Internet has dramatically increased the ability of the public to access court decisions (and, given the high cost of legal services and the growing number of self-represented litigants, it is not a moment too soon). However, public availability of court decisions on the Internet can raise significant privacy issues for individuals involved in litigation. There is a big difference between accepting that a court decision in one’s case will be published in the interests of transparency and having one’s personal information sucked up and spit out by search engines as part of search results unrelated to the administration of justice.

The main response to this problem to date (from the Canadian Judicial Council’s 2005 Model Policy for Access to Court Records) has been for courts to require the use of technological measures on court websites (and on websites such as CanLII) to prevent search engines from indexing the full text of court decisions. This means that those searching online using a particular individual’s name would not find personal details from court proceedings caught up in the search results. However, these restrictions are imposed through licence terms only on entities such as CanLII; the general copyright licences on court websites place no such restrictions on the reproduction and use of court decisions. Of course, placing restrictions on the searchability and usability of published decisions can also be a barrier to their innovative reuse. A better approach – or at least a complementary one – might be to be more restrained in the sharing of personal information in published decisions. This latter approach is one recommended by the Office of the Privacy Commissioner of Canada for administrative tribunals. It is unevenly adopted by courts and tribunals in Canada.
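The “technological measures” contemplated by the Model Policy can be as simple as a robots exclusion file or a per-page noindex directive. A minimal sketch of how a site might ask crawlers not to index its decisions (the /decisions/ path is hypothetical, not taken from any actual court website):

```
# robots.txt at the site root – asks compliant crawlers
# to skip the directory containing published decisions
User-agent: *
Disallow: /decisions/
```

Alternatively, a per-page directive such as <meta name="robots" content="noindex"> in a decision’s HTML achieves a similar result. Both measures depend on search engines voluntarily honouring the convention, which is precisely why a scraper that republishes the decisions elsewhere, without those measures, defeats them.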

While the open courts principle and how Canadian courts and tribunals implement it are relevant to the problem in this case, the Commissioner’s decision does not address these issues. The complaints focussed on the activities of the Romanian company and not on how courts and tribunals manage personal information. Nevertheless, this issue is, to a large extent, at the heart of the problem in this case.

Extended territorial jurisdiction

Under basic international law principles, countries cannot apply their laws outside of their own borders. So how could Canadian law apply to a Romanian company’s activities? The answer lies in what my co-authors and I call extended territorial jurisdiction. This arises where activities outside a country’s borders are nonetheless closely connected to that country. After receiving over 20 complaints from Canadians regarding the hosting of their personal information on the Globe24h website, the Privacy Commissioner chose to apply Canada’s PIPEDA to the Romanian company. He did so on the basis that the company was collecting, using and disclosing personal information in the course of commercial activities (key triggers for PIPEDA’s application) and that its activities had a “real and substantial connection” to Canada. This connection was found in the fact that the company chose to include Canadian court and tribunal decisions in its database; that it sourced this material from websites located in Canada; that it accepted requests from Canadians to remove their personal information from its databases; and that it charged Canadians a fee to perform this service. While the company would be subject to Romanian data protection law in general, the Commissioner did not see this as an impediment to applying Canadian law in the specific circumstances. He noted that “It is commonplace in today’s global environment that organizations with an online presence may be subject to data protection laws in multiple jurisdictions depending on the nature of their activities.” (at para 100)

This approach is consistent with that taken by the Office of the Privacy Commissioner of Canada since the Federal Court handed down its decision on this point in Lawson v. Accusearch Inc. Of course, taking jurisdiction over a party in another country and being able to enforce outcomes in accordance with Canadian law are separate matters. In any event, the Privacy Commissioner is relatively toothless even within Canada; in the case of offshore companies, any positive results depend largely upon a respondent’s willingness to cooperate with investigations and to change its practices with some gentle nudging. In this case, there appears to have been a change of practice on the part of Globe24h, although the extent and durability of this change remain to be seen.

Journalism exception

I have previously written about the rather broad and open-ended exception to the application of PIPEDA to the collection, use or disclosure of personal information for “journalistic purposes”. Journalism is capable of a fairly broad interpretation, and in an era of disintermediated information and commentary, a broad approach to this exception is warranted. This may be even more the case given the Supreme Court of Canada’s recent admonition that privacy laws must be balanced against freedom of expression. However, an overly broad approach could exclude large swaths of activity from the scope of PIPEDA.

In this case, Globe24h argued that by providing a database of legal information it was entitled to benefit from the journalistic purposes exception. The Commissioner adopted a definition of “journalism” put forward by the Canadian Association of Journalists (CAJ). According to this definition, journalism is an activity that has as its goal the communication of information, in a format that has “an element of original production” and that “provides clear evidence of a self-conscious discipline calculated to provide an accurate and fair description of facts, opinion and debate at play within a situation” (at para 52). The definition is interesting, but it may be under-inclusive when it comes to balancing freedom of expression and privacy. This remains an open question. Using this definition, the Commissioner found that the database of public records compiled by Globe24h was not journalism. In particular, he was of the view that the purpose of the database was to generate revenue through various means, including charging individuals who wish to have their personal information removed. He also found that the database did not embody the “original production” required by the CAJ’s definition, and concluded that “Globe24h is republishing information already available online through Canadian court and tribunal websites in a manner that enables the information to be located by search engines, which would not otherwise be possible, so as to profit from individuals’ desire to have this practice stop” (at para 66).

While there may be an argument that this website does not serve journalistic purposes, the analysis here relies heavily upon the Commissioner’s conclusion that the site’s primary motivation is to derive revenue from individuals who are concerned about their privacy. It is not clear whether, without that element, he would have found that the journalism exception applied. The importance of this poorly worded exception – and the potential of narrow interpretations to conflict with the freedom of expression – leaves one wishing for clearer guidance.

Publicly available personal information

Globe24h also argued that it made use of publicly available personal information. PIPEDA expressly permits the collection, use and disclosure of such information without consent so long as it is used for the purposes for which it was collected and made publicly available. According to the Commissioner, the purpose for which the court decisions were made publicly available was “to promote transparency in the judicial system” (at para 93). He went on to state that “the purpose for publishing court findings online does not include the association of such findings with individuals’ names in online search results” (at para 92). The point here, I think, is that search engine indexing shifts uses of this information away from transparency and towards data mining or snooping; the latter are not consistent with the purposes for which the information was made publicly available.

However, it should be noted that in this case the assessment of purpose drifts into how the information might be accessed or manipulated by third parties – not by the respondent. This is rather tricky territory. It is a kind of secondary liability in the data protection context: court decisions are made publicly available to anyone around the world; the respondent creates a database that aggregates court decisions from multiple jurisdictions and makes them available. In doing so, it enhances the searchability of the decisions by freeing them from technological restrictions. Has it done anything to take it outside the exception? Is the possibility that this new searchability might lead to improper uses of the information by others enough to find that the use does not fall within the exception? My point here is that the problem of excessive personal information in published court decisions seems to be pushed onto those who publish this information (and who thus facilitate the open courts principle), rather than resting with the courts, which perhaps should be more careful in deciding what personal information is required to serve the open courts principle and what information is not.

Crown copyright – the unspoken issue

In Canada, court and tribunal decisions are covered by Crown copyright. This lies behind the courts’ ability to dictate licence terms to those who publish these decisions. Recent amendments to the Copyright Act also make it an infringement to circumvent technological protection measures on copyright protected works. Had the Romanian website been publishing court decisions in contravention of the user licence provided by court websites, or circumventing court-mandated technological protection measures that blocked the indexing of the court decisions by search engines, then the courts themselves might have sought takedown of these materials or insisted upon compliance with their licence terms. These terms, however, do not appear in the licence for federal court decisions, for decisions of Ontario superior courts, or for decisions of the BC Supreme Court – and this is just a sample. Whether courts should use copyright restrictions to protect privacy values is an interesting question, particularly in an era of increasingly open government. Whether it is realistic or feasible to do so is another good question – if it is not, then the privacy issues must be addressed at source. In any event, it may be time for the Canadian Judicial Council to revisit its digitally archaic 2005 policy.

The individuals affected by Globe24h turned to the Privacy Commissioner for help when they experienced privacy invasions as a result of the company’s activities. They found a sympathetic ear, and the Commissioner may have achieved some results for them. One can ask, though, where the courts and tribunals have been in all of this. As noted earlier, they should take the lead in addressing privacy issues in their decisions. In addition, while Crown copyright may be an anachronism with the potential to limit free speech, as long as the government clings to it in the face of calls for reform it might consider using it on occasion in circumstances such as these, where inadequate measures designed to protect privacy have failed Canadians and something more is required.

 

Published in Privacy

Canada’s Information Commissioner has tabled a report in Parliament that has deeply troubling implications.

The scandal-in-the-making is a product of three pretty well-known characteristics of the current government – first, they have been utterly committed to dismantling and destroying every trace of the Long-Gun Registry established under the former Liberal government; second, their commitment to transparency and accountability is situational at best; and third, they have a tendency to bury important and sometimes controversial amendments in omnibus budget implementation bills.

Here’s what happened: The Conservative government was determined to do away with the long-gun registry. It introduced a bill on October 25, 2011, which was eventually passed into law as the Ending the Long-Gun Registry Act (ELGRA). This statute came into effect on April 5, 2012. However, no doubt anticipating the demise of the registry, an unnamed individual filed an access to information request on March 27, 2012. This applicant sought an electronic copy of all records in the Canadian Firearms Registry relating to firearms that were neither prohibited nor restricted. These were the specific records slated to be destroyed under s. 29 of the ELGRA.

Shortly after the coming into force of the ELGRA, the Information Commissioner wrote to the Minister of Public Safety and Emergency Preparedness to remind him that records relating to the Long-gun Registry that were the subject of requests under the Access to Information Act that were filed before the coming into effect of the ELGRA would have to be retained until the access requests had been dealt with (including any court proceedings flowing from these requests). The Minister responded, giving the Commissioner assurances that the RCMP would “abide by the right of access.”

The applicant eventually received a response to his request for records, but he was not satisfied with it. He was of the opinion that the information provided was incomplete, and he was also concerned that the RCMP had gone ahead and destroyed responsive records. The Information Commissioner investigated and agreed that the response was incomplete. She also concluded that responsive records had been destroyed by the RCMP, notwithstanding the fact that it knew the records were subject to a right of access. The destruction by government entities of records subject to a right of access is an offence under s. 67.1 of the Access to Information Act.

On March 26, 2015, the Information Commissioner informed the Attorney General of Canada, the Hon. Peter MacKay, of the possible commission of this offence. She also notified the Minister of Public Safety that in her view the complaint was well-founded. She recommended that any responsive records still in the possession of the RCMP be provided to the applicant. The Minister responded, indicating that he had no intention of following this recommendation.

Up to this point, the situation reveals a government committed to destroying all traces of the long-gun registry, and, as a result, unwilling to respond to an access request that would provide an applicant with data from the registry prior to its destruction. The Prime Minister’s response as reported by the CBC was: “[T]o be perfectly clear, the government is clarifying the information act to make sure it is in full conformity with Parliament's already expressed wishes on the long-gun registry that the RCMP has executed as they were required to do according to the law.”

It is clear that the access request slipped through the cracks between the introduction of the bill in October 2011 and its passage into law. It is also clear that granting access to the records would go against the intent expressed in the legislation to destroy the registry. The merits or demerits of the long-gun registry have already been the subject of much heated debate, but the battle over its continued existence is at an end. What is troubling is that the “loophole” existed, that a perfectly legitimate access to information request was filed, that the Minister of Public Safety committed to preserve records until outstanding access requests had been dealt with, and that the information was nonetheless destroyed.

What the government should have done was to address the access issue in the ELGRA in the first place. The wisdom of backdating the law to suspend access to information requests retroactively to the date the Bill was introduced in Parliament could have been debated as part of the legislation to put an end to the long-gun registry. Having omitted to do this, what the government has done instead is add to its budget implementation bill (Bill C-59) a series of provisions that retroactively remove the right of access to the long-gun registry data. The right of access is terminated on the date the long-gun Bill was introduced into Parliament (October 25, 2011). It effectively also removes any obligation to retain records, and makes their destruction legitimate. It also removes any liability of the Crown or its agents or employees with respect to the destruction of records.

It is true that these provisions will “fix” the oversight in the original long-gun Bill. However, as the Information Commissioner points out, they also retroactively absolve the RCMP of having destroyed records when it was clearly illegal to do so, and when the Minister of Public Safety had committed to the preservation of the records pending the resolution of outstanding access requests. The actions appear to have been illegal under the law as it stood at the time. Any pot smoker with a conviction for possession will tell you that it doesn’t matter what you think the law SHOULD be; what matters is what the law actually says when you carry out the transgressive act. Unless, of course, you have a legislative time machine that you can use to change the law at the time of your transgression. The Conservative government has such a legislative time machine. It is yet another one of those distasteful omnibus bills that offer a convenient sidestep to democratic debate and accountability.

This, ultimately, is the real problem and central matter for concern. The long-gun registry is – well – long gone. There was indeed a legislative loophole that created a problematic gap for a government that had committed to the total destruction of the registry records. But the ability to use omnibus bills to rewrite history and to absolve conduct that was both illegal and contrary to government assurances is ugly. And, as the Information Commissioner suggests, it is perhaps also a very dangerous precedent.

Published in Privacy

The Ontario Court of Appeal recently allowed a proposed class action proceeding for breach of privacy. This on its own is not unusual – such proceedings are increasingly common in Canada. (See earlier post on this subject here). What is particularly interesting about this decision is that the Court of Appeal ruled that Ontario’s Personal Health Information Protection Act (PHIPA) did not pose a barrier to tort proceedings. It had been argued that the provincial legislation created a “complete code” for dealing with breaches of personal information protection in the health care context in Ontario, and that tort law recourse was therefore not possible. This is an important decision for health care consumers, as class action litigation is emerging as an important means of redress and accountability for failures to adequately protect personal information. The decision should also send a wake-up call to hospitals and other health information custodians in the province.

In Hopkins v. Kay the representative plaintiff alleged that her medical records – along with those of 280 other patients at the Peterborough Regional Health Centre – had been improperly accessed by a hospital employee. The legal claim was based on the tort of intrusion upon seclusion, and the key issue was whether such recourse was precluded by the existence of PHIPA.

Writing for the unanimous Court, Justice Sharpe framed his analysis around two issues: first, whether there was a legislative intent to create a complete code when PHIPA was enacted; and second, whether the case law supported a conclusion that in the circumstances the jurisdiction of the Superior Court to consider a tort claim was ousted.

The relevance of the “complete code” issue is that if the legislature intended to create a complete code to deal with personal health information protection, then, by implication, it intended to preclude any separate tort recourse. In considering whether the intent was to create a complete code, Justice Sharpe drew on three criteria articulated by the Nova Scotia Court of Appeal in Pleau v. Canada: 1) Is the dispute resolution process established by the legislation consistent with exclusive jurisdiction? 2) What is the essential character of the dispute, and is it regulated by the legislation such that the intervention of the court would be inconsistent with the scheme? 3) Is the scheme capable of affording “effective redress”?

Justice Sharpe noted that PHIPA laid out an elaborate scheme governing the protection of personal health information. However, he found that while the statute “does contain a very exhaustive set of rules and standards for custodians of personal health information, details regarding the procedure or mechanism for the resolution of disputes are sparse” (at para 37). He observed that oral hearings were not at all typical – in most cases, complaints were dealt with through written submissions. Further, apart from the right to make representations, there were no procedural guarantees in the statute. Justice Sharpe observed that the statute also allowed the Commissioner to refuse to consider a complaint where there was another more appropriate recourse. He found that this suggested that PHIPA was not meant to be an exclusive and comprehensive code.

The Court also found it significant that under PHIPA an award of damages could not be made by the Commissioner, and could only be obtained by way of a separate proceeding brought in the Superior Court. Justice Sharpe found that this suggested that the Commissioner was not meant “to play a comprehensive or expansive role in dealing with individual complaints.” (at para 44) He concluded that “PHIPA provides an informal and highly discretionary review process that is not tailored to deal with individual claims, and it expressly contemplates the possibility of other proceedings.” (at para 45)

The second factor in the analysis required the court to consider the essential character of the claim, in order to determine whether a decision to assume jurisdiction would be consistent with the legislative scheme. The appellants argued that the claim was for nothing more than a breach of the PHIPA obligations, and that allowing the claim in tort to proceed would allow PHIPA to be circumvented. Justice Sharpe disagreed, noting that much more was required to make out the tort claim than to establish a breach of obligations under PHIPA. For example, the tort required a demonstration of intentional or reckless conduct, carried out without lawful justification, and in circumstances that a reasonable person would regard as highly offensive. On the whole, Justice Sharpe found that allowing the tort action to proceed in court would not undermine the scheme created under PHIPA.

The third consideration was whether the statute provided effective redress. Justice Sharpe found that PHIPA gave the Commissioner a great deal of discretion when it came to the resolution of complaints, including the authority to decide not to proceed with a complaint. He also found that the complaints investigation process in PHIPA was generally meant to address systemic issues rather than to provide an effective recourse for individuals harmed by the improper handling of their personal information. Justice Sharpe noted that “Even if the Commissioner investigates a complaint, his primary objective in achieving an appropriate resolution will not be to provide an individual remedy to the complainant, but rather to address systemic issues.” (at para 59) Because of the broad discretion given to the Commissioner, any complainant whose complaint was not investigated would face “an expensive and uphill fight” to seek judicial review of the decision not to proceed. Justice Sharpe therefore concluded that the legislature had not intended to create a comprehensive code to deal with the consequences of the misuse of personal health information.

The second issue considered by the Court was whether case law prevented the pursuit of the tort claim. Other courts had found that there was no right of action at common law where a statute provided a comprehensive scheme for redress. The leading case in this area is Seneca College v. Bhadauria, in which the Supreme Court of Canada ruled that the Ontario Human Rights Code precluded a separate common law tort of discrimination.

Justice Sharpe distinguished the Human Rights Code from PHIPA. He noted that the recourse under the Human Rights Code provided for awards of damages, whereas the Commissioner under PHIPA had no authority to award damages. Further, under PHIPA the Commissioner had a great deal of discretion to decide to proceed or not with a complaint, and chose to exercise that discretion so as to focus on systemic issues. By contrast, the Human Rights Code created a mechanism which focussed on the hearing of individual complaints. The two statutes were thus quite different. Justice Sharpe also distinguished two other cases involving labour relations legislation in which the courts refused to consider disputes that in their view should properly have been dealt with through arbitration or grievance mechanisms. Justice Sharpe noted that such proceedings provided an “accessible mechanism for comprehensive and efficient dispute resolution, and consequently form an important cornerstone of labour relations.” (at para 69) This was in contrast to PHIPA, where the Commissioner had given clear priority to the resolution of complaints raising systemic issues.

The Court concluded that there was nothing in PHIPA to support the view that the legislature intended to create an exhaustive code providing recourse for failures in the protection of personal health information. He found that permitting individuals to pursue claims at common law would not undermine PHIPA. He also found that the PHIPA scheme was such that in some cases individuals would not have effective redress under that statute. In the result, Ontarians now have the option of bringing tort claims for the mishandling of their personal health information. The case will also be of interest in other jurisdictions with personal information protection legislation.

Published in Privacy

A news story from January 2015 puts squarely into focus some of the challenges of privacy and open government.

The story centred on the Canadian legal information website CanLII, although the privacy issues it raises relate more directly to how government institutions protect personal information when seeking to comply with open courts and open government principles.

CanLII, a non-profit corporation that is managed by the Federation of Law Societies of Canada, is a tremendously important information resource in Canada. Since its inception, it has become instrumental in ensuring that Canadians have free online access to primary Canadian legal materials. It follows in the tradition of other Legal Information Institutes in the United States, Australia and Britain/Ireland. CanLII includes all federal and provincial statutes and regulations, case law from all federal and provincial courts, and case law from a growing number of administrative tribunals. Prior to CanLII’s appearance on the scene, these materials were found either on the shelves of law libraries, or were accessible through commercial databases that charged fees for access. In essence, they were not easily accessible to Canadians without significant effort or cost. In a legal system in which “ignorance of the law is no excuse”, and in which an ever-growing number of Canadians have no choice but to represent themselves in legal proceedings, this kind of public access seems essential. CanLII’s efforts to liberate these legal materials make an interesting story with plenty of open government lessons. (I have written about the evolution of CanLII here.)

The news story that broke in January related to a Romanian website that had scraped the content from CanLII and reposted it to another website hosted in Romania. In doing so, it allowed for the circumvention of technological measures put in place by CanLII that prevented Google from indexing terms found in court and tribunal decisions. These measures were put in place by CanLII largely to protect the privacy of individuals whose names and personal information may feature in court decisions. By contrast, the Romanian materials are fully searchable.

This situation raises several interesting issues of privacy and open government. At first glance, it may look like a failure of CanLII’s efforts to put into place effective technological measures to protect individual privacy. (CanLII has reportedly upgraded its technological protections, although the cases initially scraped from the site remain out of its control.) But CanLII is really only the second line of defence. The first line of defence is, of course, the courts and tribunals themselves, which provide case law to CanLII and, increasingly, publish it through their own websites.

The problem of “public personal information” is a thorny one, and it arises in this context as well as in many others. Public personal information is information that is legally public (government registry information, for example, or information in court decisions). While this information has long been public in nature, its widespread, immediate and limitless distribution was never contemplated in the pre-internet age in which decisions to make it public were made. Thus, there are important privacy issues surrounding how and under what conditions such information is made public, as well as how the public interest in openness should be balanced against individual rights to privacy in an internet and big data age.

In Canada, the open courts principle means that the proceedings of courts are open to public scrutiny – it’s a fundamental principle that justice must not only be done, it must be seen to be done. This means that, barring exceptional circumstances, court and tribunal hearings are public, as are the decisions reached in those cases. Not only does this serve transparency and accountability values, but the publication of court and tribunal decisions also allows lawyers and members of the public to consult these decisions to better understand the law, and to learn how courts and tribunals interpret and apply legislation. In exceptional circumstances, courts may issue publication bans in relation to certain court hearings; courts may also order certain personal information (including, in some cases, names of individuals) redacted from court decisions. For example, in decisions involving young offenders, only initials are used. The names of victims of sexual assaults may also be redacted.

In the pre-internet dark ages, the redaction of names and other personal information from court decisions was less significant because these decisions did not circulate widely. Few members of the public, for example, were curious enough to go down to a law library to trawl through case reporters in the hope of spotting the name of someone they knew. Internet access and online publication of decisions change things significantly. Fully searchable databases of court and administrative tribunal decisions can leave individuals substantially exposed with respect to a very broad range of personal information. Decisions in divorce cases may include a detailed account of assets and liabilities, and may also recount grim details of personal conduct. Decisions of workers’ compensation tribunals may contain significant amounts of personal health information; the same can be said of human rights tribunals, pension and disability tribunals, and so on. In many civil cases where plaintiffs allege damages for anxiety, stress, or depression caused by the harm they suffered, courts may engage in a detailed discussion of the evidence presented. In personal injury lawsuits, there may be considerable discussion of personal health information. This is just a sample of some of the personal information that may be found in court decisions. In digital form, this information is available to nosy neighbors, malefactors, and data miners alike.

Courts and tribunals publish their decisions in conformity with the open courts principle. Online publication, however, raises significant privacy concerns that must be balanced against the open courts principle. The Canadian Judicial Council has considered this issue, and has issued guidelines for courts as to how to prepare decisions for online publication. The Office of the Privacy Commissioner of Canada has also weighed in on the issue with respect to the practices of federal administrative tribunals. The problem is, of course, that these guidelines are not mandatory, and, as Christopher Berzins has noted, there is no consistent approach across the broad range of courts and tribunals in Canada. Further, in some cases, there may be genuine debate about whether certain details are required in order to meet the open courts principle. For example, if we are to understand why a certain award of damages is made in a particular case, we need to understand the nature of the evidence presented, and how the judge assessed that evidence.

So much for the first line of defence. Ideally, courts and tribunals, prior to making decisions available for online publication, should address privacy issues. Many do, some do not. Not all do so to the same extent or in the same way. In some cases, the open courts principle will outweigh privacy considerations – although whether technical or other solutions should be instituted is an excellent question. The fact remains that much personal information ends up being published online through important resources such as CanLII. CanLII itself has introduced a second line of defence – technological measures to ensure that the personal information is not accessible through search engines. What the story about the Romanian website has taught us is that this line of defence is entirely porous. It has also taught us that as more and more public personal information is made available in formats that allow for easy dissemination, greater attention needs to be paid – by courts and by governments at all levels – to the challenges of public personal information.

Published in Privacy

Last week I wrote about a very early ‘finding’ under Canada’s Personal Information Protection and Electronic Documents Act which raises some issues about how the law might apply in the rapidly developing big data environment. This week I look at a more recent ‘finding’ – this time 5 years old – that should raise red flags regarding the extent to which Canada’s laws will protect individual privacy in the big data age.

In 2009, the Assistant Privacy Commissioner Elizabeth Denham (who is now the B.C. Privacy Commissioner) issued her findings as a result of an investigation into a complaint by the Canadian Internet Policy and Public Interest Clinic into the practices of a Canadian direct marketing company. The company combined information from different sources to create profiles of individuals linked to their home addresses. Customized mailing lists based on these profiles were then sold to clients looking for individuals falling within particular demographics for their products or services.

Consumer profiling is a big part of big data analytics, and today consumer profiles will draw upon vast stores of personal information collected from a broad range of online and offline sources. The data sources at issue in this case were much simpler, but the lessons that can be learned remain important.

The respondent organization used aggregate geodemographic data, which it obtained from Statistics Canada, and which was sorted according to census dissemination areas. This data was not specific to particular identifiable individuals – the aggregated data was not meant to reveal personal information, but it did give a sense of, for example, distribution of income by geographic area (in this case, by postal code). The company then took name and address information from telephone directories so as to match the demographic data with the name and location information derived from the directories. Based on the geo-demographic data, assumptions were made about income, marital status, likely home-ownership, and so on. The company also added its own assumptions about religion, ethnicity and gender based upon the telephone directory information – essentially drawing inferences based upon the subscribers’ names. These assumptions were made according to ‘proprietary models’. Other proprietary models were used to infer whether the individuals lived in single or multi-family dwellings. The result was a set of profiles of named individuals with inferences drawn about their income, ethnicity and gender. CIPPIC’s complaint was that the respondent company was collecting, using and disclosing the personal information of Canadians without their consent.
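The matching technique described above can be sketched in a few lines of code. Everything below – the census figures, the directory entries, and the name-based inference rule – is invented for illustration; the company’s actual “proprietary models” were never disclosed.

```python
# Illustrative sketch of the profiling technique described above.
# All data and inference rules here are hypothetical.

# Aggregate geodemographic data, keyed by postal code (census dissemination area).
# On its own, this is information about neighbourhoods, not individuals.
census_by_postal = {
    "K1A": {"median_income": 95000, "home_ownership_rate": 0.8},
    "H2X": {"median_income": 42000, "home_ownership_rate": 0.3},
}

# Name and address information taken from a telephone directory.
directory = [
    {"name": "Leslie O'Keefe", "postal": "K1A"},
    {"name": "Marie Tremblay", "postal": "H2X"},
]

def infer_from_name(name):
    """A toy stand-in for the name-based 'proprietary model':
    guesses ethnicity from the surname. The guess may well be wrong."""
    surname = name.split()[-1]
    ethnicity = "Irish" if surname.startswith("O'") else "unknown"
    return {"assumed_ethnicity": ethnicity}

# Matching the two sources converts aggregate data into per-person profiles:
# each named individual now carries the attributed income, home-ownership
# and ethnicity assumptions.
profiles = []
for entry in directory:
    profile = {**entry,
               **census_by_postal[entry["postal"]],
               **infer_from_name(entry["name"])}
    profiles.append(profile)

print(profiles[0])
```

The point of the sketch is that neither input is “personal information” in isolation, yet the join attributes neighbourhood-level statistics and name-based guesses to identifiable individuals.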

The findings of the Assistant Privacy Commissioner (APC) are troubling for a number of reasons. She began by characterizing the telephone directory information as “publicly available personal information”. Under PIPEDA, information that falls into this category, as defined by the regulations, can be collected, used and disclosed without consent, so long as the collection, use and disclosure are for the purposes for which it was made public. Telephone directories fall within the Regulations Specifying Publicly Available Information. However, the respondent organization did more than simply resell directory information.

Personal information is defined in PIPEDA as “information about an identifiable individual”. The APC characterized the aggregate geodemographic data as information about certain neighborhoods, and not information about identifiable individuals. She stated that “the fact that a person lives in a neighborhood with certain characteristics” was not personal information about that individual.

The final piece of information associated with the individuals in this case was the set of assumptions about, among other things, religion, ethnicity and gender. The APC characterized these as “assumptions”, rather than personal information – after all, the assumptions might not be correct.

Because the respondent’s clients provided the company with the demographic characteristics of the group it sought to reach, and because the respondent company merely furnished names and addresses in response to these requests, the APC concluded that the only personal information that was collected, used or disclosed was publicly available personal information for which consent was not required. (And, in case you are wondering, allowing people to contact individuals was one of the purposes for which telephone directory information is published – so the “use” by companies of sending out marketing information fell within the scope of the exception).

And thus, by considering each of the pieces of information used in the profile separately, the respondent’s creation of consumer profiles from diffuse information sources fell right through the cracks in Canada’s data protection legislation. This does not bode well for consumer privacy in an age of big data analytics.

The most troubling part of the approach taken by the APC is that which dismisses “assumptions” made about individuals as being merely assumptions and not personal information. Consumer profiling is about attributing characteristics to individuals based on an analysis of their personal information from a variety of sources. It is also about acting on those assumptions once the profile is created. The assumptions may be wrong, the data may be flawed, but the consumer will nonetheless have to bear the effects of that profile. These effects may be as minor as being sent advertising that may or may not match their activities or interests; but they could be as significant as decisions made about entitlements to certain products or services, about what price they should be offered for products or services, or about their desirability as a customer, tenant or employee. If the assumptions are not “actual” personal information, they certainly have the same effect, and should be treated as personal information. Indeed, the law accepts that personal information in the hands of an organization may be incorrect (hence the right to correct personal information), and it accepts that opinions about an individual constitute their personal information, even though the opinions may be unfair.

The treatment of the aggregate geodemographic information is also problematic. On its own, it is safe to say that aggregate geodemographic information is information about neighborhoods and not about individuals. But when someone looks up the names and addresses of the individuals living in an area and matches that information to the average age, income and other data associated with their postal codes, then they have converted that information into personal information. As with the ethnicity and gender assumptions, the age, income, and other assumptions may be close or they may be way off base. Either way, they become part of a profile of an individual that will be used to make decisions about that person. Leslie O’Keefe may not be Irish, he may not be a woman, and he may not make $100,000 a year – but if he is profiled in this way for marketing or other purposes, it is not clear why he should have no recourse under data protection laws.

Of course, the challenge faced by the APC in this case was how to manage the ‘balance’ set out in s. 3 of PIPEDA between the privacy interests of individuals and the commercial need to collect, use and disclose personal information. In this case, to find that consent – that cornerstone of data protection laws – was required for the use and disclosure of manufactured personal information would be to hamstring an industry built on the sale of manufactured personal information. As the use – and the sophistication – of big data and big data analytics advances, organizations will continue to insist that they cannot function or compete without the use of massive stores of personal information. If this case is any indication, decision makers will be asked to continue to blur and shrink the edges of key concepts in the legislation, such as “consent” and “personal information”.

The PIPEDA complaint in this case dealt with relatively unsophisticated data used for relatively mundane purposes, and its importance may be too easily overlooked as a result. But how we define personal information and how we interpret data protection legislation will be of enormous importance as the role of big data analytics in our lives continues to grow. Both this decision and the one discussed last week offer some insights into how Canada’s data protection laws might be interpreted or applied – and they raise red flags about the extent to which these laws are adequately suited to protecting privacy in the big data era.

Published in Privacy

A long-past and largely forgotten ‘finding’* from the Office of the Privacy Commissioner of Canada offers important insights into the challenges that big data and big data analytics will pose for the protection of Canadians’ privacy and consumer rights.

Thirteen years ago, former Privacy Commissioner George Radwanski issued his findings on a complaint that had been brought against a bank. The complainant had alleged that the bank had wrongfully denied her access to her personal information. The requirement to provide access is found in the Personal Information Protection and Electronic Documents Act (PIPEDA). The right of access also comes with a right to demand the correction of any errors in the personal information in the hands of the organization. This right is fundamentally important, and not just to privacy. Without access to the personal information being used to inform decision-making, consumers have very little recourse of any kind against adverse or flawed decision-making.

The complainant in this case had applied for and been issued a credit card by the bank. What she sought was access to the credit score that had been used to determine her entitlement to the card. The bank had relied upon two credit scores in reaching its decision. The first was the type produced by a credit reporting agency – in this case, Equifax. The second was an internal score generated by the bank using its own data and algorithm. The bank was prepared to release the former to the complainant, but refused to give her access to the latter. The essence of the complaint, therefore, was whether the bank had breached its obligations under PIPEDA to give her access to the personal information it held about her.

The Privacy Commissioner’s views on the interpretation and application of the statute in this case are worth revisiting 13 years later as big data analytics now fuel so much decision-making regarding consumers and their entitlement to or eligibility for a broad range of products and services. Credit reporting agencies are heavily regulated to ensure that decisions about credit-worthiness are made fairly and equitably, and to ensure that individuals have clear rights to access and to correct information in their files. For example, credit reporting legislation may limit the types of information and the data sources that may be used by credit reporting agencies in arriving at their credit scores. But big data analytics are now increasingly relied upon by all manner of organizations that are not regulated in the same way as credit-reporting agencies. These analytics are used to make decisions of similar importance to consumers – including decisions about credit-worthiness. There are few limits on the data that is used to fuel these analytics, nor is there much transparency in the process.

In this case, the bank justified its refusal to disclose its internal credit score on two main grounds. First, it argued that this information was not “personal information” within the meaning of PIPEDA because it was ‘created’ internally and not collected from the consumer or any other sources. The bank argued that this meant that it did not have to provide access, and that in any event, the right of access was linked to the right to request correction. The nature of the information – which was generated based upon a proprietary algorithm – was such that it was not “facts” that could be open to correction.

The argument that generated information is not personal information is a dangerous one, as it could lead to a total failure of accountability under data protection laws. The Commissioner rejected this argument. In his view, it did not matter whether the information was generated or collected; nor did it matter whether it was subject to correction or not. The information was personal information because it related to the individual. He noted that “opinions” about an individual were still considered to be personal information, even though they are not subject to correction. This view of ‘opinions’ is consistent with subsequent findings and decisions under PIPEDA and comparable Canadian data protection laws. Thus, in the view of the Commissioner, the bank’s internally generated credit score was the complainant’s personal information and was subject to PIPEDA.

The bank’s second argument was more successful, and is problematic for consumers. The bank argued that releasing the credit score to the complainant would reveal confidential commercial information. Under s. 9(3)(b) of PIPEDA, an organization is not required to release personal information in such circumstances. The bank was not arguing so much that the complainant’s score itself was confidential commercial information; rather, what was confidential were the algorithms used to arrive at the score. The bank argued that these algorithms could be reverse-engineered from a relatively small sample of credit scores. Thus, a finding that such credit scores must be released to individuals would leave the bank open to the hypothetical situation where a rival might organize or pay 20 or so individuals to seek access to their internally generated credit scores in the hands of the bank, and that set of scores could then be used to arrive at the confidential algorithms. The Commissioner referred this issue to an expert on algorithms and concluded that “although an exact determination of a credit-scoring model was difficult and highly unlikely, access to customized credit scores would definitely make it easier to approximate a bank’s model.”
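The expert’s conclusion that access to a set of customized scores “would definitely make it easier to approximate a bank’s model” is easy to illustrate in a deliberately simplified setting. Assume – purely for illustration, since the bank’s actual model was never disclosed – that the internal score is a linear function of a handful of known applicant attributes. Then the scores of “20 or so individuals” with known attributes pin down the hidden coefficients by ordinary least squares:

```python
import numpy as np

rng = np.random.default_rng(42)

# A hypothetical linear scoring model with 5 secret coefficients.
n_features = 5
true_weights = rng.uniform(-2.0, 2.0, n_features)  # the "confidential algorithm"

# Suppose 20 individuals obtain their own customized scores via access
# requests, and a rival knows their underlying attributes.
m_samples = 20
X = rng.normal(size=(m_samples, n_features))  # known applicant attributes
scores = X @ true_weights                     # the released internal scores

# Least squares recovers the hidden weights from the released scores.
recovered, *_ = np.linalg.lstsq(X, scores, rcond=None)

print(np.allclose(recovered, true_weights))
```

A real scoring model would be nonlinear, noisy, and built on attributes outsiders cannot fully observe – which is why an *exact* determination was described as “difficult and highly unlikely” – but the sketch shows why even a small sample of scores leaks information about the model.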

The Commissioner noted that under s. 9(3)(b) there has to be some level of certainty that the disclosure of personal information will reveal confidential commercial information before disclosure can be refused. In this case, the Commissioner indicated that he had “some difficulty believing that either competitors or rings of algorithmically expert fraud artists would go to the lengths involved.” He went on to say that “[t]he spectre of the banks falling under systematic assault from teams of loan-hungry mathematicians is simply not one I find particularly persuasive.” Notwithstanding this, he ruled in favour of the bank. He noted that other banks shared the same view as the respondent bank, and that competition in the banking industry was high. Since he had found it was technically possible to reverse-engineer the algorithm, he was of the view that he had to find that the release of the credit score would reveal confidential commercial information. He was satisfied with the evidence the bank supplied to demonstrate how closely guarded the credit-scoring algorithm was. He noted that in the UK and Australia, relatively new guidelines required organizations to provide only general information regarding why credit was denied.

The lack of transparency of algorithms used in the big data environment becomes increasingly problematic the more such algorithms are used. Big data analytics can be used to determine credit-worthiness – and such determinations are made not just by banks but by all manner of companies that extend consumer credit through loans, don’t-pay-for-a-year deals, purchase-by-installment, store credit cards, and so on. They can also be used to determine who is entitled to special offers or promotions, for price discrimination (where some customers are offered better prices for the same products or services), and in a wide range of other contexts. Analytics may also be used by prospective employers, landlords or others whose decisions may have important impacts on people’s lives. Without algorithmic transparency, it might be impossible to know whether the assumptions, weightings or scoring factors are biased, influenced by sexism or racism (or other discriminatory considerations), or simply flawed.

There may be some comfort to be had that in this case the Commissioner was allowed to have access to the scoring model used. He stated that he found it innocuous – although it is not clear what kind of scrutiny he gave it. After all, his mandate extended only to decisions relating to the management of personal information, and did not extend to issues of discrimination. It is also worth noting that the Commissioner seems to suggest that each case must be decided on its own facts, and that what the complainant stood to gain and the respondent stood to lose were relevant considerations. In this case, the complainant had not been denied credit, so in the Commissioner’s view there was little benefit to her in the release of the information to be weighed against the potential harm to the bank. Nevertheless, the decision raises a red flag around transparency in the big data context.

In the next week or so I will be posting a ‘Back to the Future II’ account of another, not quite so old, PIPEDA finding that is also significant in the big data era. Disturbingly, this decision eats away at Commissioner Radwanski’s conclusion on the issue of “personal information” as it relates to generated or inferred information about individuals. Stay tuned!



* Because the Privacy Commissioner of Canada has no order-making powers, he can only issue “findings” in response to complaints filed with the office. The ‘findings’ are essentially opinions as to how the Act applies in the circumstances of the complaint. If the complaint is considered well-founded, the Commissioner can also make recommendations as to how the organization should correct its practices. For binding orders or compensation the complainant must first go through the complaints process and then take the matter to the Federal Court. Few complainants do so. Thus, while findings are non-binding and set no precedent, they do provide some insight into how the Commissioner would interpret and apply the legislation.


Published in Privacy

In the fall of 2014, Quebec’s Commission d’accès à l’information, which is responsible for overseeing Quebec’s private sector data protection legislation, ruled that the province’s law applied to Rogers Communications Inc., a federally regulated company. The company had been the subject of a complaint that it had violated Quebec’s data protection law when it required new cellular phone subscribers to provide two pieces of identification, and then recorded the identification numbers on the furnished ID documents. Administrative Judge Lina Desbiens ruled that the complaint was well-founded. In her view, while it was legitimate to ask to see identification for the purposes of establishing the identity of the client, it was not necessary to record the identification numbers. Further, she found that the document ID numbers were not necessary for the purposes of carrying out a credit check – other information would suffice for this purpose.

The issue of the application of the Quebec data protection statute is the more interesting part of this decision. Because Rogers is part of the federally-regulated telecommunications industry, the federal Personal Information Protection and Electronic Documents Act (PIPEDA) applies to its activities. Certainly there have been plenty of cases in which PIPEDA has been applied to Rogers or to its sister telecommunications companies.[1] From Rogers’ point of view, if the federal Act applied, then the provincial statute did not. Judge Desbiens disagreed. She noted that s. 81 of the Act Respecting the Protection of Personal Information in the Private Sector gave the Commission jurisdiction over “any matter relating to the protection of personal information as well as into the practices of a person who carries on an enterprise and who collects, holds, uses or communicates such information to third persons.” She read this to mean that the Commission’s jurisdiction extended to the collection, use or disclosure of personal information by any business operating in Quebec. Since Rogers operated its business in Quebec, it was thus subject to the provincial law. Although the federal law might also apply to Rogers, Judge Desbiens found that it would only apply to the exclusion of the provincial law where the application of the provincial law would affect, in some significant way, the exercise of federal jurisdiction. In this case, she observed, Rogers was a telecommunications company, but the decision as to what pieces of identification it could require from new customers and what information it could record was not something that would affect in any way federal jurisdiction over telecommunications.

Judge Desbiens cited in support of her position several other decisions of the Commission d’accès à l’information in which the Quebec legislation was applied to companies in federally regulated industries. Notably, however, the facts addressed in these decisions predated the coming into effect of PIPEDA. Judge Desbiens also cited the more recent case of Nadler c. Rogers Communications Inc. This case involved a civil suit for breach of privacy, and while the court considered the Quebec private sector data protection statute in its reasons, no argument appears to have been made regarding jurisdictional issues.

Judge Desbiens’ ultimate conclusion was that it was possible for a company to comply with both the federal and provincial statutes by satisfying the stricter of the two sets of norms. In any event, she expressed the view that her decision on the merits did not diverge from the position of the federal Privacy Commissioner on similar issues.[2]

The decision that both federal and provincial data protection statutes apply to federally regulated companies doing business in Quebec seems problematic. On the one hand, federally regulated companies are frequently subject to provincial laws in some of their day-to-day business activities. This is why, for example, some banking products or services are not available in all provinces. Arguably, therefore, it should not matter that a federally-regulated company is required to comply with provincial data protection norms. However, the situations are not equivalent. In the case of personal information, the federal government has provided a national scheme that specifically applies to federally regulated businesses. While Judge Desbiens is most likely correct that there would be little difference in the outcome of this case under PIPEDA, it should not necessarily be assumed that this would be so on a different set of facts. And, while it is true that the data protection decision in this case does not interfere with federal jurisdiction over telecommunications, it does seem clearly to trench upon federal jurisdiction over data protection in the federally regulated private sector.

 



[1] For just a few examples, see: Kollar v. Rogers Communications Inc., 2011 FC 452, http://www.canlii.org/en/ca/fct/doc/2011/2011fc452/2011fc452.pdf; Buschau v. Rogers Communications Inc., 2011 FC 911, http://www.canlii.org/en/ca/fct/doc/2011/2011fc911/2011fc911.pdf; Johnson v. Bell Canada, [2009] 3 FCR 67, 2008 FC 1086; Henry v. Bell Mobility, 2014 FC 555.

[2] The Commission cited several documents published on the website of the Office of the Privacy Commissioner of Canada. These include: Collection of Drivers’ Licence Numbers Under Private Sector Privacy Legislation, https://www.priv.gc.ca/information/pub/guide_edl_e.asp; Best Practices for the Use of Social Insurance Numbers in the Private Sector, https://www.priv.gc.ca/resource/fs-fi/02_05_d_21_e.asp; and Photo Identification Guidance, https://www.priv.gc.ca/resource/fs-fi/02_05_d_34_tips_e.asp.

Published in Privacy

Class action law suits for breach of privacy are becoming increasingly common in Canada. For example, the B.C. Supreme Court, the Ontario Superior Court, and Newfoundland and Labrador Supreme Court have all recently certified class action law suits in relation to alleged privacy breaches.

The use of the class action law suit can be a useful solution to some of the problems that plague the victims of privacy breaches. These difficulties include:

1) The lack of any other meaningful and effective recourse for a large scale privacy breach. Complaints regarding a large-scale privacy breach by a private sector corporation can be made to the Privacy Commissioner of Canada under the Personal Information Protection and Electronic Documents Act (PIPEDA) (or to his provincial counterparts in B.C., Quebec or Alberta, depending upon the nature of the corporation and its activities). However, the federal privacy commissioner can only investigate and issue a report with non-binding recommendations. He has no order-making powers. Further, there is no power to award damages. An individual who feels they have been harmed by a privacy breach must, after receiving the Commissioner’s report, make an application to Federal Court for compensation. Damage awards in Federal Court under PIPEDA have been very low, ranging from about $0 to $5000 (with a couple of outlier exceptions). This amount of damages will not likely compensate for the time and effort required to bring the legal action, let alone the harm from the privacy breach. Perhaps more importantly, a few thousand dollars may not be a significant deterrent for companies whose practices have led to the privacy breach. The Privacy Commissioner’s Office has called for reform of PIPEDA to include order making powers, and to give the Commissioner the authority to impose significant fines on companies whose conduct leads to significant privacy harms. Yet legislative reform in this area does not seem to be on the current government’s agenda.

2) The problem of establishing damages in privacy cases. It can be very difficult to establish damages in cases where privacy rights have been breached. For example, although a company’s data breach might affect tens or even hundreds of thousands of individuals, it may be very difficult for any of those individuals to show that the data breach has caused them any actual harm. Even if one or more of these individuals suffers identity theft, it may be impossible to link this back to that particular data breach. While all of the affected individuals may suffer some level of anxiety over the security of their personal information, it is hard to put a dollar value on this kind of anxiety – and courts have tended to take a rather conservative view in evaluating such harm. It simply might not be worth it for any individual to bring legal action in such circumstances – even if they were to succeed, their damages would likely not even come close to making the litigation worth their while.

3) The inaccessibility of justice on an individual scale. Frankly, the majority of Canadians are not in a financial position to take anyone to court for breach of privacy. (Those in the province of Quebec might be slightly better off in this regard, as privacy rights are much clearer and better established in private law in that province than they are elsewhere in Canada.) It should be noted that those few individuals who have sought damages in Federal Court for PIPEDA breaches have been self-represented – legal representation would simply be too costly given the stakes. A suit for the tort of invasion of privacy or for breach of a statutory privacy tort would be considerably more complex than an application for damages under PIPEDA. Damage awards in privacy cases are so low that litigation is not a realistic solution for most.

In this context it is not surprising that the class action law suit for breach of privacy is catching on in Canada. Such law suits allow large numbers of affected individuals to seek collective recourse. As mentioned earlier, the British Columbia Supreme Court recently certified a class action law suit against Facebook for breach of privacy rights protected under British Columbia’s Privacy Act. The claim in Douez v. Facebook, Inc. related to Facebook’s Sponsored Stories “product”. Advertisers who paid to make use of this product could use the names and likenesses of Facebook users in “sponsored stories” about their products or services. These “sponsored stories” would then be sent to the contacts of the person featured in the story. The court found that between September 9, 2012 and March 10, 2013, 1.8 million B.C. residents were featured in Sponsored Stories. The plaintiffs argued that this practice violated their privacy. Although the issues have not yet been litigated on their merits, the certification of the class action law suit allows the privacy claims to proceed on behalf of the significant number of affected individuals.

In Evans v. Bank of Nova Scotia, Justice Smith of the Ontario Superior Court of Justice certified a class action law suit against the Bank of Nova Scotia. In that case, an employee of the bank had, over a period of almost five years, accessed highly confidential personal banking information of 643 customers. In June of 2012, the Bank notified these customers that there may have been unauthorized access to their banking information; 138 of these individuals later informed the bank that they were victims of identity theft or fraud. The bank employee subsequently admitted that he had channelled the banking information through his girlfriend to individuals who sought to use the information for illegal purposes. The lawsuit claims damages for invasion of privacy and negligence, among other things, and argues that the bank should be held vicariously liable for the actions of its employee.

Most recently, in Hynes v. Western Regional Integrated Health Authority, the Newfoundland and Labrador Supreme Court certified a class action law suit against the Health Authority after it was discovered that an employee had improperly accessed 1,043 medical records without authorization. The information accessed included name and address information, as well as information about diagnostic and medical procedures at the hospital. This case is an example of where it may be difficult to assess or quantify the harm suffered by the particular individuals as a result of the breach, as it is not known how the information may have been used. The plaintiffs argued that both the statutory privacy tort in Newfoundland and the common law tort of intrusion upon seclusion were applicable, and that the Health Authority should be held vicariously liable for the acts of its employee. They also argued that the Health Authority had been negligent in its care of their personal information. The court found that the arguments raised met the necessary threshold at the class action certification stage – the merits remain to be determined once the case ultimately proceeds to trial.

What these three cases demonstrate is that class action law suits may give individuals a useful recourse in cases where data breaches have exposed their personal information and perhaps left them vulnerable to identity theft or other privacy harms. Such law suits may also act as a real incentive for companies to take privacy protection seriously. The cost of defending a class action law suit, combined with the possibility of a very substantial damages award (or settlement), and the potential reputational harm from high profile litigation, all provide financial incentives to properly safeguard personal information.

This may be welcome news for those who are concerned about what seems to be a proliferation of data breaches. It should not, however, let the federal government off the hook in terms of strengthening Canada’s private sector data protection legislation and giving the Privacy Commissioner more effective tools to act in the public interest to protect privacy by ensuring compliance with the legislation.

 

Published in Privacy

Just over a year ago, in Information and Privacy Commissioner of Alberta v. United Food and Commercial Workers, Local 401 the Supreme Court of Canada struck down Alberta’s Personal Information Protection Act (PIPA) on the basis that it violated the freedom of expression guaranteed by s. 2(b) of the Canadian Charter of Rights and Freedoms. The case arose after a union was found to have violated PIPA by collecting and using video and photo images of people crossing its picket lines in the course of a labour dispute without the consent of those individuals. The union was ultimately successful in its arguments that the limitations on the collection, use and disclosure of personal information without consent contained in PIPA violated its freedom of expression. (You can read more about this decision in my earlier blog post here).

As a remedy, the Supreme Court of Canada struck down the entire statute, but put in place a suspension of invalidity for a period of one year. This amount of time was considered reasonable for the Alberta legislature to amend the legislation to bring it into conformity with the Charter. The year passed without legislative action, and at the last minute the government scrambled to obtain an extension. The Court granted a six month extension on October 30, 2014.

The Alberta government has now introduced a bill to amend PIPA to bring it into conformity with the Charter. Bill 3 is framed in fairly narrow terms. In essence, it creates a new exception to the general rule that there can be no collection, use or disclosure of personal information without consent. This exception is specifically for trade unions. The collection, use or disclosure without consent is permissible if it is “for the purpose of informing or persuading the public about a matter of significant public interest or importance relating to a labour relations dispute involving the trade union” (proposed new sections 14.1, 17.1, and 20.1). The information collected, used or disclosed must be “reasonably necessary” for that purpose, and, in the circumstances, it must be reasonable to collect, use or disclose that information without consent.

The new provisions attempt to strike a balance between the right to privacy and the freedom of expression of trade unions. While it will now be permissible to collect, use or disclose personal information without consent in the context of a labour dispute, there is no blank cheque. Rather than exempt trade unions from the application of PIPA altogether, the new provisions set out the circumstances in which unions may act, and these actions will be under the supervision of the Office of the Information and Privacy Commissioner (OIPC). A person whose information is collected, used or disclosed without their consent by a union may still complain to the OIPC; the OIPC will get to determine if the union’s purpose was to inform or persuade the public “about a matter of significant public interest or importance relating to a labour relations dispute involving the trade union”. This wording is interesting – actions by a trade union taken in support of another trade union may not qualify, nor may actions carried out by a trade union to protest a government’s policies. Further, an adjudicator might decide that the information was collected, used or disclosed in relation to a matter that was not of significant public interest or importance. Whether this exception strikes the right balance is an open question which may arise in the course of some future dispute.

The issue of the balance between the freedom of expression and privacy is an extremely interesting one, and it arises in other contexts under private sector data protection legislation. These competing rights are purportedly balanced, for example, by provisions that exempt journalistic, artistic and literary endeavors from the application of the statute in certain circumstances. However, as the United Food case demonstrates, these exceptions do not necessarily capture all of the actors who may have information of public interest that they wish to communicate. A few years ago I wrote an article about the “journalistic purposes” exception that is found in Alberta’s PIPA, as well as in B.C.’s Personal Information Protection Act and the federal Personal Information Protection and Electronic Documents Act. I argued that this exception may not strike the right balance between the right to journalistic freedom of expression and privacy. In the first place, it is not clear who is meant to be entitled to the exception (what are journalistic, artistic or literary purposes, and who gets to assert them?). Secondly, the exceptions are structured so that once it is decided that the acts in question fall within the exception, there can be no oversight to determine whether the manner in which the personal information was collected, used or disclosed went beyond what was reasonable for legitimately informing the public.

Although the United Food saga may be approaching its close, the issues around the balance between freedom of expression and privacy are far from being resolved. Expect to see these issues surfacing in cases arising under private sector data protection legislation (as was the case with United Food) as well as in other privacy contexts.

Note: I recently posted about a privacy law suit that raised freedom of expression issues. It can be found here.

 

Published in Privacy