Teresa Scassa - Blog

Displaying items by tag: data protection

Note: The following are my speaking notes for my appearance on February 23, 2017 before the House of Commons Standing Committee on Access to Information, Privacy and Ethics (ETHI). ETHI is currently engaged in a review of PIPEDA. My colleague Dr. Florian Martin-Bariteau also appeared before the same committee. His remarks are found here.

Thank you for the invitation to meet with you today and to contribute to your study of the Personal Information Protection and Electronic Documents Act. I am a professor at the University of Ottawa, Faculty of Law, where I hold the Canada Research Chair in Information Law. I am appearing in my personal capacity.

We are facing a crisis of legitimacy when it comes to personal data protection in Canada. Every day there are new stories about data hacks and breaches, and about the surreptitious collection of personal information by devices in our homes and on our persons that are linked to the Internet of Things. There are stories about how big data profiling impacts the ability of individuals to get health insurance, obtain credit or find employment. There are also concerns about the extent to which state authorities access our personal information in the hands of private sector companies. PIPEDA, as it currently stands, is inadequate to meet these challenges.

My comments are organized around the theme of transparency. Transparency is fundamentally important to data protection and it has always played an important role under PIPEDA. At a fundamental level, transparency means openness and accessibility. In the data protection context it means requiring organizations to be transparent about the collection, use and disclosure of personal information; and it means the Commissioner must be transparent about his oversight functions under the Act. I will also argue that it means that state actors (including law enforcement and national security organizations) must be more transparent about their access to and use of the vast stores of personal information in the hands of private sector organizations.

Under PIPEDA, transparency is at the heart of the consent-based data protection scheme. Transparency is central to the requirement for companies to make their privacy policies available to consumers, and to obtain consumer consent to collection, use or disclosure of personal information. Yet this type of transparency has come under significant pressure and has been substantially undermined by technological change on the one hand, and by piecemeal legislative amendment on the other.

The volume of information that is collected through our digital, mobile and online interactions is enormous, and its actual and potential uses are limitless. The Internet of Things means that more and more, the devices we have on our person and in our homes are collecting and transmitting information. They may even do so without our awareness, and often on a continuous basis. The result is that there are fewer clear and well-defined points or moments at which data collection takes place, making it difficult to say that notice has been provided and consent obtained in any meaningful way. In addition, the number of daily interactions and activities that involve data collection have multiplied beyond the point at which we are capable of reading and assessing each individual privacy policy. And, even if we did have the time, privacy policies are often so long, complex, and vague that reading them does not provide much of an idea of what is being collected and shared, with or by whom, or for what purposes.

In this context consent has become a joke, although unfortunately the joke is largely on the consumer. The only parties capable of saying that our current consent-based model still works are those that benefit from consumer resignation in the face of this ubiquitous data harvesting.

The Privacy Commissioner’s recent consultation process on consent identifies a number of possible strategies to address the failure of the current system. There is no quick or easy fix – no slight change of wording that will address the problems around consent. This means that, on the one hand, there need to be major changes in how organizations achieve meaningful transparency about their data collection, use and disclosure practices. There must also be a new approach to compliance that gives considerably more oversight and enforcement powers to the Commissioner. The two changes are inextricably linked. The broader public protection mandate of the Commissioner requires that he have the necessary powers to take action in the public interest. The technological context in which we now find ourselves is so profoundly different from what it was when this legislation was enacted in 2001 that to talk of only minor adjustments to the legislation ignores the transformative impacts of big data and the Internet of Things.

A major reworking of PIPEDA may in any event be well overdue, and it might have important benefits that go beyond addressing the problems with consent. I note that if one were asked to draft a statute as a performance art piece that evokes the problems with incomprehensible, convoluted and contorted privacy policies and their effective lack of transparency, then PIPEDA would be that statute. As unpopular as it might seem to suggest that it is time to redraft the legislation so that it no longer reads like the worst of all privacy policies, this is one thing that the committee should consider.

I make this recommendation in a context in which all those who collect, use or disclose personal information in the course of commercial activity – including a vast number of small businesses with limited access to experienced legal counsel – are expected to comply with the statute. In addition, the public ideally should have a fighting chance of reading this statute and understanding what it means in terms of the protection of their personal information and their rights of recourse. As it is currently drafted, PIPEDA is a convoluted mishmash in which the normative principles are not found in the law itself, but are rather tacked on in a Schedule. To make matters worse, the meaning of some of the words in the Schedule, as well as the principles contained therein, are modified by the statute, so that it is not possible to fully understand the rules and exceptions without engaging in a complex connect-the-dots exercise. After a series of piecemeal amendments, PIPEDA now consists in large part of a growing list of exceptions to the rules around collection, use or disclosure without consent. While the OPC has worked hard to make the legal principles in PIPEDA accessible to businesses and to individuals, the law itself is not accessible. In a recent case involving an unrepresented applicant, Justice Roy of the Federal Court expressed the opinion that for a party to “misunderstand the scope of the Act is hardly surprising.”

I have already mentioned the piecemeal amendments to PIPEDA over the years as well as concerns over transparency. In this respect it is important to note that the statute has been amended so as to increase the number of exceptions to the consent that would otherwise be required for the collection, use or disclosure of personal information. For example, paragraphs 7(3)(d.1) and (d.2) were added in 2015, and permit organizations to share personal information between themselves for the purposes of investigating breaches of an agreement or actual or anticipated contraventions of the laws of Canada or a province, or to detect or suppress fraud. These are important objectives, but I note that no transparency requirements were created in relation to these rather significant powers to share personal information without knowledge or consent. In particular, there is no requirement to notify the Commissioner of such sharing. The scope of these exceptions creates a significant transparency gap that undermines personal information protection. This should be fixed.

PIPEDA also contains exceptions that allow organizations to share personal information with government actors for law enforcement or national security purposes without notice to, or the consent of, the individual. These exceptions also lack transparency safeguards. Given the huge volume of highly detailed personal information, including location information, that is now collected by private sector organizations, the lack of mandatory transparency requirements is a glaring privacy problem. The Department of Innovation, Science and Economic Development has created a set of voluntary transparency guidelines for organizations that choose to disclose the number of requests they receive and how they deal with them. It is time for there to be mandatory transparency obligations around such disclosures, whether in the form of public reporting, reporting to the Commissioner, or a combination of both. These obligations should apply to both public and private sector actors.

Another major change that is needed to enable PIPEDA to meet the contemporary data protection challenges relates to the powers of the Commissioner. When PIPEDA was enacted in 2001 it represented a fundamental change in how companies were to go about collecting, using and disclosing personal information. This major change was made with great delicacy; PIPEDA reflected an ombuds model which allowed for a light touch with an emphasis on facilitating and cajoling compliance rather than imposing and enforcing it. Sixteen years later and with exabytes of personal data under the proverbial bridge, it is past time for the Commissioner to be given a new set of tools in order to ensure an adequate level of protection for personal information in Canada.

First, the Commissioner should have the authority to impose fines on organizations in circumstances where there has been substantial or systemic non-compliance with privacy obligations. Properly calibrated, such fines can have an important deterrent effect, which is currently absent in PIPEDA. They also represent transparent moments of accountability that are important in maintaining public confidence in the data protection regime.

The toolbox should also include the power for the Commissioner to issue binding orders. I am sure that you are well aware that the Commissioners in Quebec, Alberta and British Columbia already have such powers. As it stands, the only route under PIPEDA to a binding order runs through the Federal Court, and then only after a complaint has passed through the Commissioner’s internal process. This is an overly long and complex route to an enforceable order, and it requires an investment of time and resources that places an unfair burden on individuals.

I note as well that PIPEDA currently does not provide any guidance as to damage awards. The Federal Court has been extremely conservative in damage awards for breaches of PIPEDA, and the amounts awarded are unlikely to have any deterrent effect other than to deter individuals who struggle to defend their personal privacy. Some attention should be paid to establishing parameters for non-pecuniary damages under PIPEDA. At the very least, these will assist unrepresented litigants in understanding the limits of any recourse available to them.

Thank you for your attention, and I welcome any questions.

Published in Privacy

The Federal Court of Canada has ordered a Romanian company and its sole proprietor to cease publishing online any Canadian court or tribunal decisions containing personal information. It has also awarded damages against the company’s owner. The decision flows from an application made pursuant to s. 14 of the Personal Information Protection and Electronic Documents Act (PIPEDA). The applicant had complained to the Privacy Commissioner of Canada regarding the activities of the defendant and his website Globe24h.com. The Commissioner ruled the complaint well-founded (my comment on this finding is here). However, since the Commissioner has no power to make binding orders or to award damages, the applicant pursued the matter in court. (Note that the lack of order-making powers is considered by many to be a weakness of PIPEDA, and the Commissioner has suggested to Parliament that it might be time for greater enforcement powers.)

Globe24h.com is a Romania-based website operated by the respondent Radulescu. The site re-publishes public documents from a number of jurisdictions, including Canada. The Canadian content is scraped from CanLII and from court and tribunal websites. This scraping is contrary to the terms of use of those sites. The Canadian court websites and CanLII also prevent the indexing of their websites by search engines; this means that a search for an individual by name will not turn up court or tribunal decisions in which that individual is named. This practice is meant to balance the privacy of individuals with the public interest in having broad access to court and tribunal decisions. Such decisions may contain considerable amounts of personal information as they may relate to any kind of legal dispute including family law matters, employment-related disputes, discrimination complaints, immigration proceedings, bankruptcy cases, challenges to decisions on pensions or employment insurance, criminal matters, disputes between neighbors, and so on. In contrast, the Globe24h.com website is indexed by search engines; as a result, the balance that courts and tribunals in Canada have attempted to strike is substantially undermined.

The applicant in this case was one of many individuals who had complained to the Office of the Privacy Commissioner (OPC) after finding that a web search for their names returned results containing personal information from court decisions. The applicant, like many others, had sought to have his personal information removed from the Globe24h website. However, the “free removal” option offered by the site could take half a year or more to process. The alternative was to pay to have the content removed. Those who had opted to pay for removal found that they might have to pay again and again if the same information was reproduced in more than one document or in multiple versions of the decision hosted on the Globe24h web site.

The first issue considered by the Federal Court was whether PIPEDA could apply extraterritorially to Globe24h.com. In general, a country’s laws are not meant to apply outside its boundaries. Although the Federal Court referred to the issue as one of extraterritorial application of laws, it is more akin to what my co-authors and I have called extended territoriality. In other words, PIPEDA will apply to activities carried out in Canada and with impacts in Canada – even though the actors may be located outside of Canada. The internet makes such situations much more common. In this case, Radulescu engaged in scraping data from websites based in Canada; the information he collected included personal information of Canadians. He then, through his company, charged individuals fees to have their personal information removed from his website. The Court found that in these circumstances, PIPEDA would apply.

It was clear that the respondent had collected, used and disclosed the personal information of the applicant without his consent. Although Radulescu did not appear before the Federal Court, he had interacted with the OPC during the course of the investigation of the complaint against Globe24h. In that context, he had argued that he was entitled to benefit from the exception in PIPEDA which permits the collection, use and disclosure of personal information without consent where it is for journalistic purposes. There is little case law that addresses head-on the scope of the “journalistic purposes” exception under PIPEDA. Justice Mosley found that the criteria proposed by the Canadian Association of Journalists, and supported by the OPC, provide a “reasonable framework” to define journalistic purposes:


. . . only where its purpose is to (1) inform the community on issues the community values, (2) it involves an element of original production, and (3) it involves a “self-conscious discipline calculated to provide an accurate and fair description of facts, opinion and debate at play within a situation.” (at para 68)

Justice Mosley found that “journalistic purposes” required something more than making court decisions available for free over the internet without any value-added content. He also noted that the statutory exception applies only where the collection, use or disclosure of personal information is for journalistic purposes and for no other purpose. Here, he found that the respondent had other purposes – namely to profit from charging people to remove their personal information from the website.

The respondent had also argued that he was entitled to benefit from the exception to the consent requirement because the information he collected, used and disclosed was ‘publicly available’. This exception is contained in PIPEDA and in regulations pertaining to publicly available information. While court and tribunal decisions fall within the definition of publicly available information, the exception to the consent requirement is only available where the collection, use or disclosure of the information relates “directly to the purpose for which the information appears in the record or document.” (Regs, s. 1(d)). In this case, Justice Mosley found that the respondent’s purpose did not relate directly to the reasons why the personal information was included in the decisions. Specifically, the inclusion of personal information in court decisions is to further the goals of the open courts principle, whereas, in the words of Justice Mosley, the respondent’s purpose “serves to undermine the administration of justice by potentially causing harm to participants in the justice system.” (at para 78)

PIPEDA contains a requirement that limits data collection, use or disclosure by an organization to only where it is “for purposes that a reasonable person would consider are appropriate in the circumstances.” (s. 5(3)). Justice Mosley noted that the Canadian Judicial Council’s policies on the online publication of court decisions strongly discourage the indexing of such decisions by search engines in order to strike a balance between open courts and privacy. This led Justice Mosley to conclude that the respondent did not have a bona fide business interest in making court decisions available in a way that permitted their indexing by search engines. Therefore the collection, use and disclosure of this information was not for purposes that a reasonable person would consider to be appropriate.

Having found that the respondent had breached PIPEDA, Justice Mosley next considered the issue of remedies. The situation was complicated in this case by the fact that the respondent is based in Romania. This raised issues of whether the court should make orders that would have an impact in Romania, as well as the issue of enforceability. The applicant was also pursuing separate remedies in Romania, and Justice Mosley noted that a court order from Canada might assist in those efforts. The OPC argued that it would be appropriate for the Court to make an order with a broader impact than just the applicant’s particular circumstances. The number of other complaints received by both CanLII and the OPC about personal information contained in decisions hosted on the Romanian site was indicative of a systemic issue. Justice Mosley was also influenced by the OPC’s argument that a broad order could be used by the applicant and by others to persuade search engines to de-index the pages of the respondent’s websites. Accepting that PIPEDA enabled him to address systemic and not just individual problems, Justice Mosley issued a declaration that the respondent had violated PIPEDA, and ordered that he remove all Canadian court and tribunal decisions that contain personal information. He also ordered that the respondent take steps to ensure that these decisions are removed from search engine caches. The respondent was also ordered to refrain from any further copying or publishing of Canadian court or tribunal decisions containing personal information in a manner that would violate PIPEDA.

The applicant also sought damages for breach of PIPEDA. Damages awards have been a weak spot under PIPEDA. The Federal Court has been extremely conservative in awarding damages; this tendency has not been helped by the fact that the overwhelming majority of applications have been brought by self-represented litigants. In this case, Justice Mosley accepted that the breach was egregious, and noted the practice of the respondent to profit from exploiting the personal information of Canadians. He also noted that the level of disclosure of personal information was extensive because of the bulk downloading and publishing of court decisions. Finally, he noted that the respondent “has also acted in bad faith in failing to take responsibility and rectify the problem” (at para 103). In the circumstances, one might have expected an order of damages far in excess of the modest $5000 ultimately ordered by Justice Mosley. This amount seems disproportionate to the nature of the breach, as well as to the impact it had on the applicant and the extensive steps he has had to take to try to address the problem. Even though recovering any amount from the respondent might be no more than a pipe dream in the circumstances, the amount set in this case would seem to lack any deterrent effect and is hardly proportionate to the nature of the breach.

Overall, this decision is an important one. It confirms the application of PIPEDA to the collection, use or disclosure of personal information of Canadians that is linked to Canada, even where the respondent is located in another country. It also provides clarification of the exceptions to consent for journalistic purposes and for publicly available information. In this regard, the court’s careful reading of these exceptions prevents them from being used as a broad licence to exploit personal information. The court’s reasoning with respect to its declaration and its order is also useful, particularly as it applies to the sanctioning of offshore activities. The only weakness is in the award of damages; this is a recurring issue with PIPEDA and one that may take legislative intervention to address.

Published in Privacy

Yesterday I appeared before the House of Commons’ Standing Committee on Access to Information, Privacy and Ethics, along with Professor David Lyon of Queen’s University and Professor Lisa Austin of the University of Toronto. The Committee is considering long overdue reform of the Privacy Act, and we had been invited to speak on this topic.

All three of us urged the Committee to take into account the very different technological environment in which we now find ourselves. Professor Lyon cogently addressed the changes brought about by the big data context. Although the Privacy Act as it currently stands largely addresses the collection, use and disclosure of personal information for “administrative purposes”, all three of us expressed concerns over the access to and use by government of information in the hands of the private sector, and the use of information in big data analytics. Professor Austin in particular emphasized the need to address not just the need for accuracy in the data collected by government but also the need to assess “algorithmic accuracy” – the quality/appropriateness of algorithms used to analyse large stores of data and to draw conclusions or predictions from this data. She also made a clear case for bringing Charter considerations into the Privacy Act – in other words, for recognizing that in some circumstances information collection, disclosure or sharing that appears to be authorized by the Privacy Act might nevertheless violate the Canadian Charter of Rights and Freedoms. There was also considerable discussion of information-sharing practices both within government and between our government and other governments, both foreign and domestic.

The Committee seemed very interested and engaged with the issues, which is a good sign. Reform of the Privacy Act will be a challenging task. The statute as a public sector data protection statute is sorely out of date. However, it is also out of context – in other words, it was drafted to address an information context that is radically different from that in which we find ourselves today. Many of the issues that were raised before the Committee yesterday go well beyond the original boundaries of the Privacy Act, and the addition of a few provisions or a few tweaks here and there will not come close to solving some of these privacy issues – many of which overlap with issues of private sector data protection, criminal law and procedure, and national security.

The notes related to my own remarks to the Committee are available below.

Written Notes for Comments by Professor Teresa Scassa to the House of Commons’ Standing Committee on Access to Information, Privacy and Ethics, June 14, 2016

Thank you for the opportunity to address this Committee on the issue of reform of the Privacy Act.

I have reviewed the Commissioner’s recommendations on Privacy Act reform and I am generally supportive of these proposals. I will focus my remarks today on a few specific issues that are united by the theme of transparency. Greater transparency with respect to how personal information is collected, used and disclosed by government enhances privacy by exposing practices to comment and review and by enabling appropriate oversight and accountability. At the same time, transparency is essential to maintaining public confidence in how government handles personal information.

The call for transparency must be situated within our rapidly changing information environment. Not only does technology now enable an unprecedented level of data collection and storage, enhanced analytic capacity has significantly altered the value of information in both public and private sectors. This increased value provides temptations to over-collect personal information, to share it, mine it or compile it across departments and sectors for analysis, and to retain it beyond the period required for the original purposes of its collection.

In this regard, I would emphasize the importance of the recommendation of the Commissioner to amend the Privacy Act to make explicit a “necessity” requirement for the collection of personal information, along with a clear definition of what ‘necessary’ means. (Currently, s. 4(1) of the Privacy Act requires only that personal information “relate[] directly to an operating program or activity of the institution”.) The goal of this recommendation is to curtail the practice of over-collection of personal information. Over-collection runs counter to the expectations of the public who provide information to government for specific and limited purposes. It also exposes Canadians to enhanced risks where negligence, misconduct or cyberattack result in data breaches. Data minimization is an important principle that is supported by data protection authorities around the world and that is reflected in privacy legislation. The principle should be explicit and up front in a reformed Privacy Act. Data minimization also has a role to play in enhancing transparency: clear limits on the collection of personal information serve transparency goals, whereas over-collection encourages the re-purposing of information, improper use and over-sharing.

The requirement to limit collection of information to specific and necessary purposes is tied to the further requirement on government to collect personal information directly from the individual “where possible” (s. 5(1)). This obviously increases transparency as it makes individuals directly aware of the collection. However, this requirement relates to information collected for an “administrative purpose”. There may be many other purposes for which government collects information, and these fall outside the privacy protective provisions of the Privacy Act. This would include information that is disclosed to a government investigative body at its request in relation to an investigation or the enforcement of any law, or that is disclosed to government actors under court orders or subpoenas. Although such information gathering activities may broadly be necessary, they need to be considered in the evolving data context in which we find ourselves, and privacy laws must adapt to address them.

Private sector companies now collect vast stores of personal information, and this information often includes very detailed, core-biographical information. It should be a matter of great concern, therefore, that the permissive exceptions in both PIPEDA and the Criminal Code enable the flow of massive amounts of personal information from the private sector to government without the knowledge or consent of the individual. Such requests/orders are often (although not always) made in the course of criminal or national security investigations. The collection is not transparent to the individuals affected, and the practices as a whole are largely non-transparent to the broader public and to the Office of the Privacy Commissioner (OPC).

We have heard the most about this issue in relation to telecommunications companies, which are regularly asked or ordered to provide detailed information to police and other government agents. It should be noted, however, that many other companies collect personal information about individuals that is highly revelatory about their activities and choices. It is important not to dismiss this issue as less significant because of the potentially anti-social behaviour of the targeted individuals. Court orders and requests for information can and do encompass the personal information of large numbers of Canadians who are not suspected of anything. The problem of tower dump warrants, for example, was highlighted in a recent case before the Ontario Superior Court of Justice (R. v. Rogers Communications (2016 ONSC 70)) (my earlier post on this decision can be found here). The original warrant in that case sought highly detailed personal information of around 43,000 individuals, the vast majority of whom had done nothing other than use their cell phones in a certain area at a particular time. Keep in mind that the capacity to run sophisticated analytics will increase the attractiveness of obtaining large volumes of data from the private sector in order to search for an individual linked to a particular pattern of activity.

Without adequate transparency regarding the collection of personal information from the private sector, there is no way for the public to be satisfied that such powers are not abused. Recent efforts to improve transparency (for example, the Department of Innovation, Science and Economic Development’s voluntary transparency reporting guidelines) have focused on private sector transparency. In other words, there has been an attempt to provide a framework for the voluntary reporting by companies of the number of requests they receive from government authorities, the number they comply with, and so on. But these guidelines are entirely voluntary, and they also only address transparency reporting by the companies themselves. There are no legislated obligations on government actors to report in a meaningful way – whether publicly or to the OPC – on their harvesting of personal information from private sector companies. I note that the recent attempt by the OPC to audit the RCMP’s use of warrantless requests for subscriber data came to an end when it became clear that the RCMP did not keep specific records of these practices.

In my view, a modernization of the Privacy Act should directly address this enhanced capacity of government institutions to access the vast stores of personal information in the hands of the private sector. The same legislation that permits the collection of personal information from private sector companies should include transparency reporting requirements where such collection takes place. In addition, legislative guidance should be provided on how government actors who obtain personal information from the private sector, either by request or under court order, should deal with this information. Specifically, limits on the use and retention of this data should be imposed.

It is true that both the Criminal Code and PIPEDA enable police forces and investigative bodies under both federal and provincial jurisdiction to obtain personal information from the private sector under the same terms and conditions, and that reform of the Privacy Act in this respect will not address the transparency and accountability of provincial actors. This suggests that issues of this kind might also fruitfully be addressed in the Criminal Code and in PIPEDA, but that is no reason not to address them in the Privacy Act as well. To the extent that government institutions are engaged in the indirect collection of personal information, the Privacy Act should provide for transparency and accountability with respect to such activities.

Another transparency issue raised by the Commissioner relates to information-sharing within government. Technological changes have made it easier for government agencies and departments to share personal information – and they do so on what the Commissioner describes as a “massive” scale. The Privacy Act enables personal information sharing within and between governments, domestically and internationally, in specific circumstances – for investigations and law enforcement, for example, or for purposes consistent with those for which it was collected. (Section 8(2)(a) allows for sharing “for the purpose for which the information was obtained or compiled by the institution or for a use consistent with that purpose”). Commissioner Therrien seeks amendments that would require information-sharing within and between governments to take place according to written agreements in a prescribed form. Not only would this ensure that information sharing is compliant with the legislation, it would offer a measure of transparency to a public that has a right to know whether and in what circumstances information they provide to one agency or department will be shared with another – or whether and under what conditions their personal information may be shared with provincial or foreign governments.

Another important transparency issue is mandatory data breach reporting. Treasury Board Secretariat currently requires that departments inform the OPC of data security breaches; yet the Commissioner has noted that not all comply. As a result, he is asking that the legislation be amended to include a mandatory breach notification requirement. Parliament has recently amended PIPEDA to include such a requirement. Once these provisions take effect, the private sector will be held to a higher standard than the public sector unless the Privacy Act is also amended. Any amendments to the federal Privacy Act to address data security breach reporting would have to take into account the need for both the Commissioner and affected individuals to be notified where there has been a breach that meets a certain threshold for potential harm, as will be the case under PIPEDA. The PIPEDA amendments will also require organizations to keep records of all breaches of security safeguards regardless of whether they meet the harm threshold that triggers a formal reporting requirement. Parliament should impose a requirement on those bodies governed by the Privacy Act both to keep records of this kind and to submit them to the OPC. Such records would be helpful in identifying patterns or trends either within a single department or institution or across departments or institutions. The ability to identify issues proactively and to address them either where they arise or across the federal government can only enhance data security – something which is becoming even more urgent in a time of increased cybersecurity threats.

Published in Privacy

The Federal Court has released a decision in a case that raises important issues about transparency and accountability under Canada’s private sector privacy legislation.

The Personal Information Protection and Electronic Documents Act (PIPEDA) governs privacy with respect to the collection, use and disclosure of personal information by private sector organizations. Under PIPEDA, individuals have the right to access their personal information in the hands of private sector organizations. The right of access allows individuals to see what information organizations have collected about them. It is accompanied by a right to have incorrect information rectified. In our datified society, organizations make more and more decisions about individuals based upon often complex profiles built with personal information from a broad range of sources. The right of access allows individuals to see whether organizations have exceeded the limits of the law in collecting and retaining personal information; it also allows them the opportunity to correct errors that might adversely impact decision-making about them. Unfortunately, our datified society also makes organizations much more likely to insist that the data and algorithms used to make decisions or generate profiles, along with the profiles themselves, are all confidential business information and thus exempt from the right of access. This is precisely what is at issue in Bertucci v. Royal Bank of Canada.

The dispute in this case arose after the Bertuccis – a father and son who had banked with RBC for 35 and 20 years respectively, and who also held business accounts with the bank – were told by RBC that the bank would be closing their accounts. The reason given for the account closure was that the bank was no longer comfortable doing business with them. Shortly after this, the Bertuccis made a request, consistent with their right of access under PIPEDA, to be provided with all of their personal information in the hands of RBC, including information as to why their bank accounts were closed. RBC promptly denied the request, stating that it had already provided its reason for closing the accounts and asserting that it had a right under its customer contracts to unilaterally close accounts without notice. It also indicated that it had received no personal information from third parties about the Bertuccis and that all of the information that they sought was confidential commercial information.

RBC relied upon paragraph 9(3)(b) of PIPEDA, which essentially allows an organization to refuse to provide access to personal information where “to do so would reveal confidential commercial information”. On receiving RBC’s refusal to provide access, the Bertuccis complained to the Office of the Privacy Commissioner. The OPC investigated the complaint and ultimately sided with RBC, finding that it was justified in withholding the information. In reaching this conclusion, the OPC relied in part on an earlier Finding of the Privacy Commissioner which I have previously critiqued, precisely because of its potential implications for transparency and accountability in the evolving big data context.

In reaching its conclusion on the application of paragraph 9(3)(b) of PIPEDA, the OPC apparently accepted that the information at issue was confidential business information, noting that it was “treated as confidential by RBC, including information about the bank’s internal methods for assessing business-related risks.” (At para 10)

After having their complaint declared unfounded by the OPC, the applicants took the issue to the Federal Court. Justice Martineau framed the key question before the court in these terms: “Can RBC refuse to provide access to undisclosed personal information it has collected about the applicants on the grounds that its disclosure in this case would reveal confidential commercial information?” (at para 16)

RBC’s position was that it was not required to justify why it might close an account. It argued that if it is forced to disclose personal information about a decision to close an account, then it is effectively stripped of its prerogative to not provide reasons. It also argued that any information that it relied upon in its risk assessment process would constitute confidential business information. This would be so even if the information were publicly available (as in the case of a newspaper article about the account holder). The fact that the newspaper article was relied upon in decision-making would be what constituted confidential information – providing access to that article would de facto disclose that information.

The argument put forward by RBC is similar to the one accepted by the OPC in its earlier (2002) decision which was relied upon by the bank and which I have previously criticized here. It is an argument that, if accepted, would bode very ill for the right of access to personal information in our big data environment. Information may be compiled from all manner of sources and used to create profiles that are relied upon in decision-making. To simply accept that information used in this way is confidential business information because it might reveal how the company reaches decisions slams shut the door on the right of access and renders corporate decision-making about individuals, based upon the vast stores of collected personal information, essentially non-transparent.

The Bertuccis argued that PIPEDA – which the courts have previously found to have a quasi-constitutional status in protecting individual privacy – makes the right of access to one’s personal information the rule. An exception to this rule would have to be construed narrowly. The applicants wanted to know what information led to the closure of their accounts and sought as well to exercise their right to have this information corrected if it was inaccurate. They were concerned that the maintenance on file of inaccurate information by RBC might continue to haunt them in the future. They also argued that RBC’s approach created a two-tiered system for access to personal information. Information that could be accessed by customers whose accounts were not terminated would suddenly become confidential information once those accounts were closed, simply because it was used in making that decision. They argued that the bank should not be allowed to use exceptions to the access requirement to shelter itself from embarrassment at having been found to have relied upon faulty or inadequate information.

Given how readily the OPC – the guardian of Canadians’ personal information in the hands of private sector organizations – accepted RBC’s characterization of this information as confidential, Justice Martineau’s decision is encouraging. He largely agreed with the position of the applicants, finding that the exceptions to the right to access to one’s personal information must be construed narrowly. Significantly, Justice Martineau found that courts cannot simply defer to a bank’s assertion that certain information is confidential commercial information. He placed an onus on RBC to justify why each withheld document was considered confidential. He noted that in some circumstances it will be possible to redact portions of reports, documents or data that are confidential while still providing access to the remainder of the information. In this case, Justice Martineau was not satisfied that the withheld information met the standard for confidential commercial information, nor was he convinced that some of it could not have been provided in redacted form.

Reviewing the documents at issue, Justice Martineau began by finding that a list of the documents relied upon by the bank in reaching its decision was not confidential information, subject to certain redactions. He noted as well that much of what was being withheld by the bank was “raw data”. He distinguished this raw data from the credit scoring model that was found to be confidential information in the 2002 OPC Finding mentioned above: the raw data was not confidential in nature and had not, when it was created, been treated as confidential by the bank. He also noted that the standard for withholding information in response to an access request is very high.

Justice Martineau gave RBC 45 days to provide the applicants with all but a few of the documents which the court agreed could be withheld as confidential commercial information. Although the applicants had sought compensatory and punitive damages, he found that it was not an appropriate case in which to award damages.

Given the importance of this decision in the much broader big data and business information context, RBC is likely to appeal it to the Federal Court of Appeal. If so, it will certainly be an important case to watch. The issues it raises are crucial to the future of transparency and accountability of corporations with respect to their use of personal information. In light of the unwillingness of the OPC to stand up to the bank both in this case and in earlier cases regarding assertions of confidential commercial information, Justice Martineau’s approach is encouraging. There is a great deal at stake here, and this case will be well worth watching if it is appealed.


Published in Privacy

The department formerly known as Industry Canada (now Innovation, Science and Economic Development or ISED) has just released a discussion paper that seeks public input on the regulations that will accompany the new data breach notification requirements in the Personal Information Protection and Electronic Documents Act (PIPEDA).

The need to require private sector organizations in Canada to report data breaches was first formally identified in the initial review of PIPEDA carried out in 2007. The amendments to the statute were finally passed into law in June of 2015, but they will not take effect until regulations are enacted that provide additional structure to the notification requirements. The discussion paper seeks public input prior to drafting and publishing regulations for comment and feedback, so please stop holding your breath. It will still take a while before mandatory data breach notification requirements are in place in Canada.

The new amendments to the legislation make it mandatory for organizations to report data breaches to the Privacy Commissioner if those breaches pose “a real risk of significant harm to an individual” (s. 10.1). An organization must also notify any individuals for whom the breach poses “a real risk of significant harm” (s. 10.1(3)). The form and contents of these notifications remain to be established by the regulations. A new s. 10.2 of PIPEDA will also require an organization that has suffered a reportable breach to notify any other organization or government institution of the breach if doing so may reduce the risk of harm. For example, such notifications might include ones to credit reporting agencies or law enforcement officials. The circumstances which trigger this secondary notification obligation remain to be fleshed out in the regulations. Finally, a new s. 10.3 of PIPEDA will require organizations to keep records of all data breaches, not just those that reach the threshold for reporting to the Privacy Commissioner. In theory these records might enable organizations to detect flaws in their security practices. They may also be requested by the Commissioner, providing potential for oversight of data security at organizations. The content of these records remains to be determined by the new regulations.

From the above, it is clear that the regulations that will support these statutory data breach reporting requirements are fundamentally important in setting their parameters. The ISED discussion paper articulates a series of questions relating to the content of the regulations on which it seeks public input. The questions relate to how to determine when there is a “real risk of significant harm to an individual”; the form and content of the notification that is provided to the Commissioner by an organization that has experienced a breach; the form, manner and content of notification provided to individuals; the circumstances in which an organization that has experienced a breach must notify other organizations; and the form and content of records kept by organizations, as well as the period of time that these records must be retained.

It is certain that ISED will receive many submissions from organizations that are understandably concerned about the impact that these regulations may have on their operations and legal obligations. Consumer and public interest advocacy groups will undoubtedly make submissions from a consumer perspective. Individuals are also welcome to contribute to the discussion. Some questions are particularly relevant to how individuals will experience data breach notification. For example, if an organization experiences a breach that affects your personal information and that poses a real risk of harm, how would you like to receive your notification? By telephone? By mail? By email? And what information would you like to receive in the notification? What level of detail about the breach would you like to have? Do you want to be notified of measures you can take to protect yourself? Do you want to know what steps the organization has taken and will take to protect you?

Anyone with an interest in this issue, whether personally or on behalf of a group or an organization, has until May 31, 2016 to provide written submissions to ISED. The discussion paper and questions can be found here.

Published in Privacy

Technology has enabled the collection and sharing of personal information on a massive scale, and governments have been almost as quick as the private sector to hoover up as much of it as they can. They have also been as fallible as the private sector – Canada’s federal government, for example, has experienced a substantial number of data breaches in the last few years.

What has not kept pace with technology has been the legislation in place to protect privacy. Canada’s federal Privacy Act, arguably a ground-breaking piece of legislation when it was first enacted in 1983, has remained relatively untouched throughout decades of dramatic technological change. Despite repeated calls for its reform, the federal government has been largely unwilling to update this statute that places limits on its collection, use and disclosure of personal information. This may be changing with the new government’s apparent openness to tackling the reform of both this statute and the equally antiquated Access to Information Act. This is good news for Canadians, as each of these statutes has an important role to play in holding a transparent government accountable for its activities.

On March 10, 2016 Federal Privacy Commissioner Daniel Therrien appeared before the Standing Committee on Access to Information, Privacy and Ethics, which is considering Privacy Act reform. The Commissioner’s statement identified some key gaps in the statute and set out his wish list of reforms.

As the Commissioner pointed out, technological changes have made it easier for government agencies and departments to share personal information – and they do so on what he describes as a “massive” scale. The Privacy Act currently has little to offer to address these practices. Commissioner Therrien is seeking amendments that would require information sharing within the government to take place according to written agreements in a prescribed form. Not only would this ensure that information sharing is compliant with legal obligations to protect privacy, it would offer a measure of transparency to a public that has a right to know whether and in what circumstances information they provide to one agency or department will be shared with another.

The Commissioner is also recommending that government institutions be explicitly required under the law to safeguard the personal information in their custody, and to report data breaches to the Office of the Privacy Commissioner. It may come as a surprise to many Canadians that such a requirement is not already in the statute – its absence is a marker of how outdated the law has become. Since 2014, the Treasury Board of Canada, in its Directive on Privacy Practices, has imposed mandatory breach reporting on all federal government institutions, but this is not a legislated requirement, nor is there recourse to the courts for non-compliance.

The Commissioner is also seeking more tools in his enforcement toolbox. Under the Privacy Act as it currently stands, the Commissioner may make recommendations to government institutions regarding their handling of personal information. These recommendations may then be ignored. While he notes that “in the vast majority of cases, government departments do eventually agree to implement our recommendations”, it is clear that this can be a long, drawn-out process with mixed results. Currently, the only matters that can be taken to court for enforcement are denials by institutions to provide individuals with access to their personal information. The Commissioner is not seeking the power to directly compel institutions to comply with the OPC’s recommendations; rather, he recommends that an institution that receives recommendations from the Office of the Privacy Commissioner have two choices: it may implement the recommendations, or it may go to court for a declaration that it does not need to comply. On this model, relatively prompt compliance would presumably become the default.

The Commissioner is also seeking an amendment that would require government institutions to conduct privacy impact assessments before the launch of a new program or where existing programs are substantially modified. Again, you would think this would be standard practice by now. It does happen, but the Commissioner diplomatically describes current PIAs as being “sometimes uneven” in both their quality and timeliness. The Commissioner would also like to see a legislated requirement that government bills that will have an impact on privacy be sent to the OPC for review before being tabled in Parliament.

The Commissioner seeks additional amendments to improve transparency in relation to the government’s handling of personal information. Currently, the Commissioner files an annual report to Parliament. He may also issue special reports. The Commissioner recommends that he be empowered under the legislation “to report proactively on the practices of government”. He also recommends extending the Privacy Act to all government institutions. Some are currently excluded, including the Prime Minister’s Office and the offices of Ministers. He also recommends allowing all individuals whose personal information is in the hands of a federal government institution to have a right of access to that information (subject, of course, to the usual exceptions). Currently only Canadian citizens and those present in Canada have access rights.

This suite of recommendations is so reasonable that most Canadians would be forgiven for assuming these measures were already in place. Given the new government’s pre- and post-election commitments to greater transparency and accountability, there may be reason to hope we will finally see the long-overdue reform of the Privacy Act.

Published in Privacy

Bill S-4, the Digital Privacy Act has received royal assent and is now law. This bill amends Canada’s Personal Information Protection and Electronic Documents Act (PIPEDA). PIPEDA, Canada’s private sector data protection statute, has been badly in need of updating for some time now. Although it only came into being in 2001, the technologies impacting personal information and the growing private sector thirst for such data have changed dramatically, rapidly outstripping the effectiveness of the legislation. There have been many calls for the reform of PIPEDA (perhaps most notably from successive Privacy Commissioners). The Digital Privacy Act addresses a handful of issues, some of them quite important, but leaves much more to be done. In this post I consider three of the changes: new data sharing powers for private sector organizations, data breach notification requirements, and a new definition of consent.

At least one of the amendments is considered a step backwards by privacy advocates. A new s. 7(3)(d.1) allows private sector organizations to share personal information between themselves, without the knowledge or consent of the individuals to whom the information pertains, for the purposes of investigating breaches of “agreements” or laws. Originally seen as a measure that would make it easier for organizations such as banks to investigate complex fraud schemes that might involve a fraudster dealing with multiple organizations, the provision has become the target of significant criticism by privacy advocates as awareness grows of the vulnerability of individuals to snooping and information sharing of all kinds. Keep in mind that an “agreement” can be a user agreement with an ISP, the terms of use of a web site or other online service, or any other contract between an individual and an organization. The provision means that any company that suspects that one of the terms of an agreement to which it is party has been breached can ask other companies to share information – without the knowledge or consent of the individual and without a court order – in order to investigate this potential breach. There is a profound lack of transparency and accountability in the data sharing enabled by this provision. True, such sharing is not mandatory – an organization can refuse to share the information requested under this provision. This amendment places an onus on individuals to pressure organizations to give them clearer and more robust assurances regarding whether and how their personal information will be shared.

The amendments will also add to PIPEDA data breach notification requirements. This is a change long sought by privacy advocates. Essentially, the law will require an organization that has experienced a data security breach to report the breach to the Privacy Commissioner “if it is reasonable in the circumstances to believe that the breach creates a real risk of significant harm to an individual.” (s. 10.1) Affected individuals must also be notified in the same circumstances. “Significant harm” is defined in the legislation as including “bodily harm, humiliation, damage to reputation or relationships, loss of employment, business or professional opportunities, financial loss, identity theft, negative effects on the credit record and damage to or loss of property.” Whether there is a “real risk” of these types of harm is determined by considering two factors spelled out in the legislation: the sensitivity of the information at issue, and the likelihood that it is being misused or may be misused in the future. Any other “prescribed factor” must also be taken into account, leaving room to include other considerations in the regulations that will be required to implement these provisions. The real impact of these data breach notification provisions will largely turn on how “real risk” and “significant harm” are interpreted and applied. It is important to note as well that these provisions are the one part of the new law that is not yet in force. The data breach notification provisions are peppered throughout with references to “prescribed” information or requirements. This means that to come into effect, regulations are required. It is not clear what the timeline is for any such regulations. Those who have been holding their breath waiting for data breach notification requirements may just have to give in and inhale now in order to avoid asphyxiation.

One amendment that I find particularly interesting is a brand new definition of consent. PIPEDA is a consent-based data protection regime. That is, it is premised on the idea that individuals make free and informed choices about who gets to use their personal information and for what purposes. Consent is, of course, becoming somewhat of a joke. There are too many privacy policies, and they are too long and too convoluted for people either to have the time to read them all or to be capable of understanding them. It doesn’t help that they are often framed in very open-ended terms which do not give a clear indication of how personal information will be used by the organization seeking consent. In this context, the new definition is particularly intriguing. Section 6.1 of the statute now reads:

6.1 For the purposes of clause 4.3 of Schedule 1, the consent of an individual is only valid if it is reasonable to expect that an individual to whom the organization’s activities are directed would understand the nature, purpose and consequences of the collection, use or disclosure of the personal information to which they are consenting.

This is a rather astonishing threshold for consent – and one that is very consumer-friendly. It requires that the individual understand “the nature, purpose and consequences” of the use of their personal information to which they consent. In our networked, conglomerated and big-data dominated economy, I am not sure how anyone can fully understand the consequences of the collection, use or disclosure of much of their personal information. Given a fulsome interpretation, this provision could prove a powerful tool for protecting consumer privacy. Organizations should take note. At the very least it places a much greater onus on them to formulate clear, accessible and precise privacy policies.

Published in Privacy

Last week I wrote about a very early ‘finding’ under Canada’s Personal Information Protection and Electronic Documents Act which raises some issues about how the law might apply in the rapidly developing big data environment. This week I look at a more recent ‘finding’ – this time 5 years old – that should raise red flags regarding the extent to which Canada’s laws will protect individual privacy in the big data age.

In 2009, Assistant Privacy Commissioner Elizabeth Denham (who is now the B.C. Privacy Commissioner) issued her findings following an investigation into a complaint by the Canadian Internet Policy and Public Interest Clinic (CIPPIC) about the practices of a Canadian direct marketing company. The company combined information from different sources to create profiles of individuals linked to their home addresses. Customized mailing lists based on these profiles were then sold to clients looking for individuals falling within particular demographics for their products or services.

Consumer profiling is a big part of big data analytics, and today consumer profiles will draw upon vast stores of personal information collected from a broad range of online and offline sources. The data sources at issue in this case were much simpler, but the lessons that can be learned remain important.

The respondent organization used aggregate geodemographic data, which it obtained from Statistics Canada, and which was sorted according to census dissemination areas. This data was not specific to particular identifiable individuals – the aggregated data was not meant to reveal personal information, but it did give a sense of, for example, distribution of income by geographic area (in this case, by postal code). The company then took name and address information from telephone directories so as to match the demographic data with the name and location information derived from the directories. Based on the geo-demographic data, assumptions were made about income, marital status, likely home-ownership, and so on. The company also added its own assumptions about religion, ethnicity and gender based upon the telephone directory information – essentially drawing inferences based upon the subscribers’ names. These assumptions were made according to ‘proprietary models’. Other proprietary models were used to infer whether the individuals lived in single or multi-family dwellings. The result was a set of profiles of named individuals with inferences drawn about their income, ethnicity and gender. CIPPIC’s complaint was that the respondent company was collecting, using and disclosing the personal information of Canadians without their consent.
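The matching process described above can be sketched in a few lines of Python. Everything here is invented for illustration – the sample postal code, the field names, and the `name_based_inferences` stand-in for the company's proprietary models – but it shows how combining three individually innocuous sources produces a named profile:

```python
# Hypothetical reconstruction of the profiling described above.
# All data and inference rules are invented for illustration; the
# company's actual "proprietary models" are not public.

# Aggregate geodemographic data keyed by postal code: information
# about neighbourhoods, not about any identifiable individual.
area_stats = {
    "K1A 0A1": {"est_income": 92000, "likely_homeowner": True},
}

# Telephone directory entries: "publicly available personal information".
directory = [
    {"name": "J. Tremblay", "postal_code": "K1A 0A1"},
]

def name_based_inferences(name):
    """Stand-in for the proprietary name-to-demographics models."""
    return {"inferred_ethnicity": "assumed", "inferred_gender": "assumed"}

profiles = []
for entry in directory:
    profile = dict(entry)                                     # directory data
    profile.update(area_stats.get(entry["postal_code"], {}))  # neighbourhood averages
    profile.update(name_based_inferences(entry["name"]))      # name-based assumptions
    profiles.append(profile)
```

Each resulting profile ties a named individual at a home address to income, home-ownership, ethnicity and gender attributes, even though no single source contained that combination of information about the person.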

The findings of the Assistant Privacy Commissioner (APC) are troubling for a number of reasons. She began by characterizing the telephone directory information as “publicly available personal information”. Under PIPEDA, information that falls into this category, as defined by the regulations, can be collected, used and disclosed without consent, so long as the collection, use and disclosure are for the purposes for which it was made public. Telephone directories fall within the Regulations Specifying Publicly Available Information. However, the respondent organization did more than simply resell directory information.

Personal information is defined in PIPEDA as “information about an identifiable individual”. The APC characterized the aggregate geodemographic data as information about certain neighborhoods, and not information about identifiable individuals. She stated that “the fact that a person lives in a neighborhood with certain characteristics” was not personal information about that individual.

The final piece of information associated with the individuals in this case was the set of assumptions about, among other things, religion, ethnicity and gender. The APC characterized these as “assumptions”, rather than personal information – after all, the assumptions might not be correct.

Because the respondent’s clients provided the company with the demographic characteristics of the group it sought to reach, and because the respondent company merely furnished names and addresses in response to these requests, the APC concluded that the only personal information that was collected, used or disclosed was publicly available personal information for which consent was not required. (And, in case you are wondering, allowing people to contact individuals was one of the purposes for which telephone directory information is published – so the “use” by companies of sending out marketing information fell within the scope of the exception).

And thus, by considering each of the pieces of information used in the profile separately, the respondent’s creation of consumer profiles from diffuse information sources fell right through the cracks in Canada’s data protection legislation. This does not bode well for consumer privacy in an age of big data analytics.

The most troubling part of the approach taken by the APC is that which dismisses “assumptions” made about individuals as being merely assumptions and not personal information. Consumer profiling is about attributing characteristics to individuals based on an analysis of their personal information from a variety of sources. It is also about acting on those assumptions once the profile is created. The assumptions may be wrong, the data may be flawed, but the consumer will nonetheless have to bear the effects of that profile. These effects may be as minor as being sent advertising that may or may not match their activities or interests; but they could be as significant as decisions made about entitlements to certain products or services, about what price they should be offered for products or services, or about their desirability as a customer, tenant or employee. If the assumptions are not “actual” personal information, they certainly have the same effect, and should be treated as personal information. Indeed, the law accepts that personal information in the hands of an organization may be incorrect (hence the right to correct personal information), and it accepts that opinions about an individual constitute their personal information, even though the opinions may be unfair.

The treatment of the aggregate geodemographic information is also problematic. On its own, it is safe to say that aggregate geodemographic information is information about neighborhoods and not about individuals. But when someone looks up the names and addresses of the individuals living in an area and matches that information to the average age, income and other data associated with their postal codes, then they have converted that information into personal information. As with the ethnicity and gender assumptions, the age, income, and other assumptions may be close or they may be way off base. Either way, they become part of a profile of an individual that will be used to make decisions about that person. Leslie O’Keefe may not be Irish, he may not be a woman, and he may not make $100,000 a year – but if he is profiled in this way for marketing or other purposes, it is not clear why he should have no recourse under data protection laws.

Of course, the challenge faced by the APC in this case was how to manage the ‘balance’ set out in s. 3 of PIPEDA between the privacy interests of individuals and the commercial need to collect, use and disclose personal information. In this case, to find that consent – that cornerstone of data protection laws – was required for the use and disclosure of manufactured personal information would be to hamstring an industry built on the sale of manufactured personal information. As the use – and the sophistication – of big data and big data analytics advances, organizations will continue to insist that they cannot function or compete without the use of massive stores of personal information. If this case is any indication, decision makers will be asked to continue to blur and shrink the edges of key concepts in the legislation, such as “consent” and “personal information”.

The PIPEDA complaint in this case dealt with relatively unsophisticated data used for relatively mundane purposes, and its importance may be too easily overlooked as a result. But how we define personal information and how we interpret data protection legislation will have enormous importance as the role of big data analytics in our lives continues to grow. Both this decision and the one discussed last week offer some insights into how Canada’s data protection laws might be interpreted or applied – and they raise red flags about the extent to which these laws are adequately suited to protecting privacy in the big data era.

Published in Privacy

Class action law suits for breach of privacy are becoming increasingly common in Canada. For example, the B.C. Supreme Court, the Ontario Superior Court, and Newfoundland and Labrador Supreme Court have all recently certified class action law suits in relation to alleged privacy breaches.

The use of the class action law suit can be a useful solution to some of the problems that plague the victims of privacy breaches. These difficulties include:

1) The lack of any other meaningful and effective recourse for a large scale privacy breach. Complaints regarding a large-scale privacy breach by a private sector corporation can be made to the Privacy Commissioner of Canada under the Personal Information Protection and Electronic Documents Act (PIPEDA) (or to his provincial counterparts in B.C., Quebec or Alberta, depending upon the nature of the corporation and its activities). However, the federal privacy commissioner can only investigate and issue a report with non-binding recommendations. He has no order-making powers. Further, there is no power to award damages. An individual who feels they have been harmed by a privacy breach must, after receiving the Commissioner’s report, make an application to Federal Court for compensation. Damage awards in Federal Court under PIPEDA have been very low, ranging from about $0 to $5000 (with a couple of outlier exceptions). This amount of damages will not likely compensate for the time and effort required to bring the legal action, let alone the harm from the privacy breach. Perhaps more importantly, a few thousand dollars may not be a significant deterrent for companies whose practices have led to the privacy breach. The Privacy Commissioner’s Office has called for reform of PIPEDA to include order making powers, and to give the Commissioner the authority to impose significant fines on companies whose conduct leads to significant privacy harms. Yet legislative reform in this area does not seem to be on the current government’s agenda.

2) The problem of establishing damages in privacy cases. It can be very difficult to establish damages in cases where privacy rights have been breached. For example, although a company’s data breach might affect tens or even hundreds of thousands of individuals, it may be very difficult for any of those individuals to show that the data breach has caused them any actual harm. Even if one or more of these individuals suffers identity theft, it may be impossible to link this back to that particular data breach. While all of the affected individuals may suffer some level of anxiety over the security of their personal information, it is hard to put a dollar value on this kind of anxiety – and courts have tended to take a rather conservative view in evaluating such harm. It simply might not be worth it for any individual to bring legal action in such circumstances – even if they were to succeed, their damages would likely not even come close to making the litigation worth their while.

3) The inaccessibility of justice on an individual scale. Frankly, the majority of Canadians are not in a financial position to take anyone to court for breach of privacy. (Those in the province of Quebec might be slightly better off in this regard, as privacy rights are much clearer and better established in private law in that province than they are elsewhere in Canada.) It should be noted that those few individuals who have sought damages in Federal Court for PIPEDA breaches have been self-represented – legal representation would simply be too costly given the stakes. A suit for the tort of invasion of privacy or for breach of a statutory privacy tort would be considerably more complex than an application for damages under PIPEDA. Damage awards in privacy cases are so low that litigation is not a realistic solution for most.

In this context it is not surprising that the class action law suit for breach of privacy is catching on in Canada. Such law suits allow large numbers of affected individuals to seek collective recourse. As mentioned earlier, the British Columbia Supreme Court recently certified a class action law suit against Facebook for breach of privacy rights protected under British Columbia’s Privacy Act. The claim in Douez v. Facebook, Inc. related to Facebook’s Sponsored Stories “product”. Advertisers who paid to make use of this product could use the names and likenesses of Facebook users in “sponsored stories” about their products or services. These “sponsored stories” would then be sent to the contacts of the person featured in the story. The court found that between September 9, 2012 and March 10, 2013, 1.8 million B.C. residents were featured in Sponsored Stories. The plaintiffs argued that this practice violated their privacy. Although the issues have not yet been litigated on their merits, the certification of the class action law suit allows the privacy claims to proceed on behalf of the significant number of affected individuals.

In Evans v. Bank of Nova Scotia, Justice Smith of the Ontario Superior Court of Justice certified a class action law suit against the Bank of Nova Scotia. In that case, an employee of the bank had, over a period of almost five years, accessed the highly confidential personal banking information of 643 customers. In June of 2012, the Bank notified these customers that there may have been unauthorized access to their banking information; 138 of these individuals later informed the bank that they were victims of identity theft or fraud. The bank employee subsequently admitted that he had channelled the banking information through his girlfriend to individuals who sought to use the information for illegal purposes. The lawsuit claims damages for invasion of privacy and negligence, among other things, and argues that the bank should be held vicariously liable for the actions of its employee.

Most recently, in Hynes v. Western Regional Integrated Health Authority, the Newfoundland and Labrador Supreme Court certified a class action law suit against the Health Authority after it was discovered that an employee had accessed 1,043 medical records without authorization. The information accessed included name and address information, as well as information about diagnostic and medical procedures at the hospital. This case is an example of a situation where it may be difficult to assess or quantify the harm suffered by particular individuals as a result of the breach, as it is not known how the information may have been used. The plaintiffs argued that both the statutory privacy tort in Newfoundland and the common law tort of intrusion upon seclusion were applicable, and that the Health Authority should be held vicariously liable for the acts of its employee. They also argued that the Health Authority had been negligent in its care of their personal information. The court found that the arguments raised met the necessary threshold at the class action certification stage – the merits remain to be determined when the case ultimately proceeds to trial.

What these three cases demonstrate is that class action law suits may give individuals a useful recourse in cases where data breaches have exposed their personal information and perhaps left them vulnerable to identity theft or other privacy harms. Such law suits may also act as a real incentive for companies to take privacy protection seriously. The cost of defending a class action law suit, combined with the possibility of a very substantial damages award (or settlement), and the potential reputational harm from high profile litigation, all provide financial incentives to properly safeguard personal information.

This may be welcome news for those who are concerned about what seems to be a proliferation of data breaches. It should not, however, let the federal government off the hook in terms of strengthening Canada’s private sector data protection legislation and giving the Privacy Commissioner more effective tools to act in the public interest to protect privacy by ensuring compliance with the legislation.

 

Published in Privacy
Wednesday, 02 July 2014 07:07

Privacy and Open Government

The public-oriented goals of the open government movement promise increased transparency and accountability of governments, enhanced citizen engagement and participation, improved service delivery, economic development and the stimulation of innovation. In part, these goals are to be achieved by making more and more government information public in reusable formats and under open licences. The Canadian federal government has committed to open government, and is currently seeking input on its implementation plan. The Ontario government is also in the process of developing an open government plan, and other provinces are at different stages of development of open government. Progress is also occurring at the municipal level across Canada, with notable open data and/or open government initiatives in Vancouver, Toronto, and Ottawa (to give a few examples).


Yet open government brings with it some privacy challenges that are not explicitly dealt with in existing laws for the protection of privacy. While there is some experience with these challenges in the access to information context (where privacy interests are routinely balanced against the goals of transparency and accountability; see my posting on a recent Supreme Court of Canada decision on this issue), this experience may not be well adapted to developments such as open data and proactive disclosure, nor may it be entirely suited to the dramatic technological changes that have affected our information environment. In a recent open-access article, I identify three broad privacy challenges raised by open government. The first is how to balance privacy with transparency and accountability in the context of “public” personal information (for example, registry information that may now be put online and broadly shared). The second challenge flows from the disruption of traditional approaches to privacy based on a collapse of the distinctions between public and private sector actors. The third challenge is the potential for open government data – even if anonymized – to contribute to the big data environment in which citizens and their activities are increasingly monitored and profiled.

I invite you to have a look at this article, which is published in (2014) 6 Future Internet 397-413.

Published in Privacy
