Teresa Scassa - Blog

The Ontario Superior Court of Justice has just approved the settlement of a class action lawsuit against Home Depot over a data privacy breach that took place in 2014. Both the settlement agreement and the decision by Justice Perell offer some interesting insights into privacy class actions in Canada.

Between April 11, 2014 and September 13, 2014, Home Depot’s payment system was hacked by criminals who used malware to skim data from credit card purchases at self-serve stations. When Home Depot discovered the breach, it notified potentially affected customers through the French and English press in Canada. It also sent out over half a million emails to potentially affected customers in Canada. The emails apologized for the breach and confirmed that the malware had been eradicated. Customers were assured that they would not be held responsible for fraudulent charges to their credit card accounts, and they were offered free credit monitoring and identity theft insurance.

Although the breach led to complaints against Home Depot being filed with the privacy commissioners of Alberta, Quebec, B.C. and Canada, the commissioners all concluded that Home Depot had not breached their respective private sector data protection statutes. The fact that Home Depot had acted quickly and decisively to notify customers and to offer them protection also clearly influenced Justice Perell in his decision on the settlement agreement. He noted that Home Depot “apparently did nothing wrong”, and that it “responded in a responsible, prompt, generous and exemplary fashion to the criminal acts perpetrated on it by the computer hackers.” (at para 74)

After the breach, which affected customers in the U.S. and Canada, a number of class action lawsuits were filed in both countries. The U.S.-based suits were consolidated into a single action which led to a settlement. The U.S. agreement was used as a template for the Canadian settlement. Under the terms of the settlement agreement put before Justice Perell, Home Depot admitted no wrongdoing. In exchange for releasing their claims against Home Depot, class members would be entitled to access a settlement fund of $250,000 available to compensate them for any actual expenses incurred as a result of the data breach, up to a maximum of $5,000 per claimant. The agreement also provided for class members to access free credit monitoring to a cap of $250,000. Justice Perell noted that given the cost of bulk purchases of credit card monitoring, this amount would allow between 2,500 and 5,000 of the class members to access credit monitoring. In order to be entitled to any funds or credit monitoring, class members would have to file a claim form by October 29, 2016. Under the terms of the agreement, Home Depot would assume the costs of notifying class members and of administering the funds. Any money not distributed from the funds at the end of the claims period could be used to offset these costs. Justice Perell approved these terms of the settlement agreement.

The agreement also provided for a sum of $360,000 plus HST to be paid to the class action lawyers for legal fees, costs and disbursements. Small sums were also provided for in the agreement as honoraria for the representative plaintiffs in the class, although Justice Perell declined to approve these amounts, noting that honoraria were not appropriate in this case. He noted that “Compensation for a representative plaintiff may only be awarded if he or she has made an exceptional contribution that has resulted in success for the class.” (at para 80)

In assessing the settlement agreement, Justice Perell made it clear that the value of the settlement for class members was at most $400,000. He noted that in terms of compensation very little might actually be paid out. No class members would have had to cover the cost of fraudulent credit card charges and, in the time since the breach, there were no documented cases of identity theft related to this breach. He noted that the only information obtained through the hack was credit card information; other identity details used in identity theft such as driver’s licence data or social insurance numbers, were never stolen. He thus found it “highly unlikely” that the $250,000 fund would be used for damage awards. He also expressed doubt whether, given the short deadline in the agreement, the $250,000 fund for identity theft insurance would be used up.

Given the modest value of the settlement agreement, Justice Perell would not approve the $360,000 bill for legal fees and disbursements. Instead, he set the amount at $120,000. He noted that to do otherwise would pay class counsel more than would be received by the class members. He noted as well that in his view the case against Home Depot was very weak: the data breach was the result of a criminal hack; the privacy commissioners had found no wrongdoing on the part of Home Depot; and Home Depot had not attempted to cover it up and instead had acted promptly to notify customers and to help them mitigate any possible harm. Further, he noted that “by the time the actions against Home Depot came to be settled, there were no demonstrated or demonstrable losses by the Class Members” (at para 101). Justice Perell observed that while class counsel may have incurred higher fees than what were being awarded, there is a degree of risk with any class proceeding. He noted that “class counsel should not anticipate that every reasonably commenced class action will be remunerative and a profitable endeavor.” (at para 103)

The result is interesting on a number of fronts. Clearly Home Depot found it less costly to settle than to proceed with the litigation, even though Justice Perell seems to be of the view that it would have won its case. The case illustrates just how costly data breaches can be, even for companies that have done nothing wrong and are themselves victims of criminal activities. In terms of the class action lawsuit, as with many data breaches, proof of actual harm to the class members was difficult to come by, making losses quite speculative. Further, as litigation of this kind tends to proceed slowly, the lack of harm to class members becomes increasingly apparent in cases where there is no evidence that the illegally obtained data has been used by the malefactors. The result in this case suggests that in class action lawsuits related to privacy breaches, class members who do not suffer actual pecuniary loss should not expect significant payouts; and companies that are not at fault in the breach and that act promptly to assist affected customers may substantially reduce (or eliminate) their liability. These factors may affect decisions by class counsel to launch class action lawsuits where the link between the breach and actual harm is weak, or where defendants are not obviously at fault.


Published in Privacy

Yesterday I appeared before the House of Commons’ Standing Committee on Access to Information, Privacy and Ethics, along with Professor David Lyon of Queen’s University and Professor Lisa Austin of the University of Toronto. The Committee is considering long overdue reform of the Privacy Act, and we had been invited to speak on this topic.

All three of us urged the Committee to take into account the very different technological environment in which we now find ourselves. Professor Lyon cogently addressed the changes brought about by the big data context. Although the Privacy Act as it currently stands largely addresses the collection, use and disclosure of personal information for “administrative purposes”, all three of us expressed concerns over the access to and use by government of information in the hands of the private sector, and the use of information in big data analytics. Professor Austin in particular emphasized the need to address not just the need for accuracy in the data collected by government but also the need to assess “algorithmic accuracy” – the quality/appropriateness of algorithms used to analyse large stores of data and to draw conclusions or predictions from this data. She also made a clear case for bringing Charter considerations into the Privacy Act – in other words, for recognizing that in some circumstances information collection, disclosure or sharing that appears to be authorized by the Privacy Act might nevertheless violate the Canadian Charter of Rights and Freedoms. There was also considerable discussion of information-sharing practices both within government and between our government and other foreign or domestic governments.

The Committee seemed very interested and engaged with the issues, which is a good sign. Reform of the Privacy Act will be a challenging task. The statute as a public sector data protection statute is sorely out of date. However, it is also out of context – in other words, it was drafted to address an information context that is radically different from that in which we find ourselves today. Many of the issues that were raised before the Committee yesterday go well beyond the original boundaries of the Privacy Act, and the addition of a few provisions or a few tweaks here and there will not come close to solving some of these privacy issues – many of which overlap with issues of private sector data protection, criminal law and procedure, and national security.

The notes related to my own remarks to the Committee are available below.

Written Notes for Comments by Professor Teresa Scassa to the House of Commons’ Standing Committee on Access to Information, Privacy and Ethics, June 14, 2016

Thank you for the opportunity to address this Committee on the issue of reform of the Privacy Act.

I have reviewed the Commissioner’s recommendations on Privacy Act reform and I am generally supportive of these proposals. I will focus my remarks today on a few specific issues that are united by the theme of transparency. Greater transparency with respect to how personal information is collected, used and disclosed by government enhances privacy by exposing practices to comment and review and by enabling appropriate oversight and accountability. At the same time, transparency is essential to maintaining public confidence in how government handles personal information.

The call for transparency must be situated within our rapidly changing information environment. Not only does technology now enable an unprecedented level of data collection and storage, enhanced analytic capacity has significantly altered the value of information in both public and private sectors. This increased value provides temptations to over-collect personal information, to share it, mine it or compile it across departments and sectors for analysis, and to retain it beyond the period required for the original purposes of its collection.

In this regard, I would emphasize the importance of the recommendation of the Commissioner to amend the Privacy Act to make explicit a “necessity” requirement for the collection of personal information, along with a clear definition of what ‘necessary’ means. (Currently, s. 4(1) of the Privacy Act requires only that personal information “relate[] directly to an operating program or activity of the institution”.) The goal of this recommendation is to curtail the practice of over-collection of personal information. Over-collection runs counter to the expectations of the public who provide information to government for specific and limited purposes. It also exposes Canadians to enhanced risks where negligence, misconduct or cyberattack result in data breaches. Data minimization is an important principle that is supported by data protection authorities around the world and that is reflected in privacy legislation. The principle should be explicit and up front in a reformed Privacy Act. Data minimization also has a role to play in enhancing transparency: not only do clear limits on the collection of personal information serve transparency goals; over-collection encourages the re-purposing of information, improper use and over-sharing.

The requirement to limit collection of information to specific and necessary purposes is tied to the further requirement on government to collect personal information directly from the individual “where possible” (s. 5(1)). This obviously increases transparency as it makes individuals directly aware of the collection. However, this requirement relates to information collected for an “administrative purpose”. There may be many other purposes for which government collects information, and these fall outside the privacy protective provisions of the Privacy Act. This would include, for example, information that is disclosed to a government investigative body at its request in relation to an investigation or the enforcement of any law, or that is disclosed to government actors under court orders or subpoenas. Although such information gathering activities may broadly be necessary, they need to be considered in the evolving data context in which we find ourselves, and privacy laws must adapt to address them.

Private sector companies now collect vast stores of personal information, and this information often includes very detailed, core-biographical information. It should be a matter of great concern, therefore, that the permissive exceptions in both PIPEDA and the Criminal Code enable the flow of massive amounts of personal information from the private sector to government without the knowledge or consent of the individual. Such requests/orders are often (although not always) made in the course of criminal or national security investigations. The collection is not transparent to the individuals affected, and the practices as a whole are largely non-transparent to the broader public and to the Office of the Privacy Commissioner (OPC).

We have heard the most about this issue in relation to telecommunications companies, which are regularly asked or ordered to provide detailed information to police and other government agents. It should be noted, however, that many other companies collect personal information about individuals that is highly revelatory about their activities and choices. It is important not to dismiss this issue as less significant because of the potentially anti-social behaviour of the targeted individuals. Court orders and requests for information can and do encompass the personal information of large numbers of Canadians who are not suspected of anything. The problem of tower dump warrants, for example, was highlighted in a recent case before the Ontario Superior Court of Justice (R. v. Rogers Communications, 2016 ONSC 70) (my earlier post on this decision can be found here). The original warrant in that case sought highly detailed personal information of around 43,000 individuals, the vast majority of whom had done nothing other than use their cell phones in a certain area at a particular time. Keep in mind that the capacity to run sophisticated analytics will increase the attractiveness of obtaining large volumes of data from the private sector in order to search for an individual linked to a particular pattern of activity.

Without adequate transparency regarding the collection of personal information from the private sector, there is no way for the public to be satisfied that such powers are not abused. Recent efforts to improve transparency (for example, the Department of Innovation, Science and Economic Development’s voluntary transparency reporting guidelines) have focused on private sector transparency. In other words, there has been an attempt to provide a framework for the voluntary reporting by companies of the number of requests they receive from government authorities, the number they comply with, and so on. But these guidelines are entirely voluntary, and they also only address transparency reporting by the companies themselves. There are no legislated obligations on government actors to report in a meaningful way – whether publicly or to the OPC – on their harvesting of personal information from private sector companies. I note that the recent attempt by the OPC to audit the RCMP’s use of warrantless requests for subscriber data came to an end when it became clear that the RCMP did not keep specific records of these practices.

In my view, a modernization of the Privacy Act should directly address this enhanced capacity of government institutions to access the vast stores of personal information in the hands of the private sector. The same legislation that permits the collection of personal information from private sector companies should include transparency reporting requirements where such collection takes place. In addition, legislative guidance should be provided on how government actors who obtain personal information from the private sector either by request or under court order should deal with this information. Specifically, limits on the use and retention of this data should be imposed.

It is true that both the Criminal Code and PIPEDA enable police forces and investigative bodies under both federal and provincial jurisdiction to obtain personal information from the private sector under the same terms and conditions, and that reform of the Privacy Act in this respect will not address transparency and accountability of provincial actors. This suggests that issues of transparency and accountability of this kind might also fruitfully be addressed in the Criminal Code and in PIPEDA, but this is no reason not to also address it in the Privacy Act. To the extent that government institutions are engaged in the indirect collection of personal information, the Privacy Act should provide for transparency and accountability with respect to such activities.

Another transparency issue raised by the Commissioner relates to information-sharing within government. Technological changes have made it easier for government agencies and departments to share personal information – and they do so on what the Commissioner describes as a “massive” scale. The Privacy Act enables personal information sharing within and between governments, domestically and internationally, in specific circumstances – for investigations and law enforcement, for example, or for purposes consistent with those for which it was collected. (Section 8(2)(a) allows for sharing “for the purpose for which the information was obtained or compiled by the institution or for a use consistent with that purpose”). Commissioner Therrien seeks amendments that would require information-sharing within and between governments to take place according to written agreements in a prescribed form. Not only would this ensure that information sharing is compliant with the legislation, it would offer a measure of transparency to a public that has a right to know whether and in what circumstances information they provide to one agency or department will be shared with another – or whether and under what conditions their personal information may be shared with provincial or foreign governments.

Another important transparency issue is mandatory data breach reporting. Treasury Board Secretariat currently requires that departments inform the OPC of data security breaches; yet the Commissioner has noted that not all comply. As a result, he is asking that the legislation be amended to include a mandatory breach notification requirement. Parliament has recently amended PIPEDA to include such a requirement. Once these provisions take effect, the private sector will be held to a higher standard than the public sector unless the Privacy Act is also amended. Any amendments to the federal Privacy Act to address data security breach reporting would have to take into account the need for both the Commissioner and for affected individuals to be notified where there has been a breach that meets a certain threshold for potential harm, as will be the case under PIPEDA. The PIPEDA amendments will also require organizations to keep records of all breaches of security safeguards regardless of whether they meet the harm threshold that triggers a formal reporting requirement. Parliament should impose a requirement on those bodies governed by the Privacy Act to both keep and to submit records of this kind to the OPC. Such records would be helpful in identifying patterns or trends either within a single department or institution or across departments or institutions. The ability to identify issues proactively and to address them either where they arise or across the federal government can only enhance data security – something which is becoming even more urgent in a time of increased cybersecurity threats.



A recent news story from the Ottawa area raises interesting questions about big data, smart cities, and citizen engagement. The CBC reported that Ottawa and Gatineau have contracted with Strava, a private sector company, to purchase data on cycling activity within their municipal boundaries. Strava makes a fitness app that can be downloaded for free onto a smart phone or other GPS-enabled device. The app uses the device’s GPS capabilities to gather data about the user’s routes travelled. Users then upload their data to Strava to view the data about their activities. Interested municipalities can contract with Strava Metro for aggregate de-identified data regarding users’ cycling patterns over a period of time (Ottawa and Gatineau have apparently contracted for 2 years’ worth of data). According to the news story, their goal is to use this data in planning for more bike-friendly cities.

On the face of it, this sounds like an interesting idea with a good objective in mind. And arguably, while the cities might create their own cycling apps to gather similar data, it might be cheaper in the end for them to contract for the Strava data rather than to design and then promote the use of their own apps. But before cities jump on board with such projects, there are a number of issues that need to be taken into account.

One of the most important issues, of course, is the quality of the data that will be provided to the city, and its suitability for planning purposes. The data sold to the city will only be gathered from those cyclists who carry GPS-enabled devices and who use the Strava app. This raises the question of whether some cyclists – those, for example, who use bikes to get around to work, school or to run errands and who aren’t interested in fitness apps – will not be included in planning exercises aimed at determining where to add bike paths or bike lanes. Is the data more likely to come from spandex-wearing, affluent, hard core recreational cyclists than from other members of the cycling community? The cycling advocacy group Citizens for Safe Cycling in Ottawa is encouraging the public to use the app to help the data-gathering exercise. Interestingly, this group acknowledges that the typical Strava user is not necessarily representative of the average Ottawa cyclist. This is in part why they are encouraging a broader public use of the app. They express the view that some data is better than no data. Nevertheless, it is fair to ask whether this is an appropriate data set to use in urban planning. What other data will be needed to correct for its incompleteness, and are there plans in place to gather this data? What will the city really know about who is using the app and who is not? The purchased data will be deidentified and aggregated. Will the city have any idea of the demographic it represents? Still on the issue of data quality, it should be noted that some Strava users make use of the app’s features to ride routes that create amusing map pictures (just Google “strava funny routes” to see some examples). How much of the city’s data will reflect this playful spirit rather than actual data about real riding routes is a question also worth asking.

Some ethical issues arise when planning data is gathered in this way. Obviously, the more people in Ottawa and Gatineau who use this app, the more data there will be. Does this mean that the cities have implicitly endorsed the use of one fitness app over another? Users of these apps necessarily enable tracking of their daily activities – should the city be encouraging this? While it is true that smart phones and apps of all variety are already harvesting tracking data for all sorts of known and unknown purposes, there may still be privacy implications for the user. Strava seems to have given good consideration to user privacy in its privacy policy, which is encouraging. Further, the only data sold to customers by Strava is deidentified and aggregated – this protects the privacy of app users in relation to Strava’s clients. Nevertheless, it would be interesting to know if the degree of user privacy protection provided was a factor for either city in choosing to use Strava’s services.

Another important issue – and this is a big one in the emerging smart cities context – relates to data ownership. Because the data is collected by Strava and then sold to the cities for use in their planning activities, it is not the cities’ own data. The CBC report makes it clear that the contract between Strava and its urban clients leaves ownership of the data in Strava’s hands. As a result, this data on cycling patterns in Ottawa cannot be made available as open data, nor can it be otherwise published or shared. It will also not be possible to obtain the data through an access to information request. This will surely reduce the transparency of planning decisions made in relation to cycling.

Smart cities and big data analytics are very hot right now, and we can expect to see all manner of public-private collaborations in the gathering and analysis of data about urban life. Much of this data may come from citizen-sensors as is the case with the Strava data. As citizens opt or are co-opted into providing the data that fuels analytics, there are many important legal, ethical and public policy questions which need to be asked.


The Federal Court has released a decision in a case that raises important issues about transparency and accountability under Canada’s private sector privacy legislation.

The Personal Information Protection and Electronic Documents Act (PIPEDA) governs privacy with respect to the collection, use and disclosure of personal information by private sector organizations. Under PIPEDA, individuals have the right to access their personal information in the hands of private sector organizations. The right of access allows individuals to see what information organizations have collected about them. It is accompanied by a right to have incorrect information rectified. In our datified society, organizations make more and more decisions about individuals based upon often complex profiles built with personal information from a broad range of sources. The right of access allows individuals to see whether organizations have exceeded the limits of the law in collecting and retaining personal information; it also allows them the opportunity to correct errors that might adversely impact decision-making about them. Unfortunately, our datified society also makes organizations much more likely to insist that the data and algorithms used to make decisions or generate profiles, along with the profiles themselves, are all confidential business information and thus exempt from the right of access. This is precisely what is at issue in Bertucci v. Royal Bank of Canada.

The dispute in this case arose after the Bertuccis – a father and son who had banked with RBC for 35 and 20 years respectively, and who also held business accounts with the bank – were told by RBC that the bank would be closing their accounts. The reason given for the account closure was that the bank was no longer comfortable doing business with them. Shortly after this, the Bertuccis made a request, consistent with their right of access under PIPEDA, to be provided with all of their personal information in the hands of RBC, including information as to why their bank accounts were closed. RBC promptly denied the request, stating that it had already provided its reason for closing the accounts and asserting that it had a right under its customer contracts to unilaterally close accounts without notice. It also indicated that it had received no personal information from third parties about the Bertuccis and that all of the information that they sought was confidential commercial information.

RBC relied upon paragraph 9(3)(b) of PIPEDA, which essentially allows an organization to refuse to provide access to personal information where “to do so would reveal confidential commercial information”. On receiving RBC’s refusal to provide access, the Bertuccis complained to the Office of the Privacy Commissioner. The OPC investigated the complaint and ultimately sided with RBC, finding that it was justified in withholding the information. In reaching this conclusion, the OPC relied in part on an earlier Finding of the Privacy Commissioner which I have previously critiqued, precisely because of its potential implications for transparency and accountability in the evolving big data context.

In reaching its conclusion on the application of paragraph 9(3)(b) of PIPEDA, the OPC apparently accepted that the information at issue was confidential business information, noting that it was “treated as confidential by RBC, including information about the bank’s internal methods for assessing business-related risks.” (at para 10)

After having their complaint declared unfounded by the OPC, the applicants took the issue to the Federal Court. Justice Martineau framed the key question before the court in these terms: “Can RBC refuse to provide access to undisclosed personal information it has collected about the applicants on the grounds that its disclosure in this case would reveal confidential commercial information?” (at para 16)

RBC’s position was that it was not required to justify why it might close an account. It argued that if it is forced to disclose personal information about a decision to close an account, then it is effectively stripped of its prerogative to not provide reasons. It also argued that any information that it relied upon in its risk assessment process would constitute confidential business information. This would be so even if the information were publicly available (as in the case of a newspaper article about the account holder). The fact that the newspaper article was relied upon in decision-making would be what constituted confidential information – providing access to that article would de facto disclose that information.

The argument put forward by RBC is similar to the one accepted by the OPC in its earlier (2002) decision which was relied upon by the bank and which I have previously criticized here. It is an argument that, if accepted, would bode very ill for the right of access to personal information in our big data environment. Information may be compiled from all manner of sources and used to create profiles that are relied upon in decision-making. To simply accept that information used in this way is confidential business information because it might reveal how the company reaches decisions slams shut the door on the right of access and renders corporate decision-making about individuals, based upon the vast stores of collected personal information, essentially non-transparent.

The Bertuccis argued that PIPEDA – which the courts have previously found to have a quasi-constitutional status in protecting individual privacy – makes the right of access to one’s personal information the rule. An exception to this rule would have to be construed narrowly. The applicants wanted to know what information led to the closure of their accounts and sought as well to exercise their right to have this information corrected if it was inaccurate. They were concerned that the maintenance on file of inaccurate information by RBC might continue to haunt them in the future. They also argued that RBC’s approach created a two-tiered system for access to personal information. Information that could be accessed by customers whose accounts were not terminated would suddenly become confidential information once those accounts were closed, simply because it was used in making that decision. They argued that the bank should not be allowed to use exceptions to the access requirement to shelter itself from embarrassment at having been found to have relied upon faulty or inadequate information.

Given how readily the OPC – the guardian of Canadians’ personal information in the hands of private sector organizations – accepted RBC’s characterization of this information as confidential, Justice Martineau’s decision is encouraging. He largely agreed with the position of the applicants, finding that exceptions to the right of access to one’s personal information must be construed narrowly. Significantly, Justice Martineau found that courts cannot simply defer to a bank’s assertion that certain information is confidential commercial information. He placed the onus on RBC to justify why each withheld document was considered confidential. He noted that in some circumstances it will be possible to redact portions of reports, documents or data that are confidential while still providing access to the remainder of the information. In this case, Justice Martineau was not satisfied that the withheld information met the standard for confidential commercial information, nor was he convinced that some of it could not have been provided in redacted form.

Reviewing the documents at issue, Justice Martineau began by finding that a list of the documents relied upon by the bank in reaching its decision was not confidential information, subject to certain redactions. He observed that much of what was being withheld by the bank was “raw data”, which he distinguished from the credit scoring model found to be confidential in the 2002 OPC Finding mentioned above. This raw data was not confidential information, and had not been treated as confidential by the bank when it was created. He noted as well that the standard for withholding information in response to an access request is a high one.

Justice Martineau gave RBC 45 days to provide the applicants with all but a few of the documents which the court agreed could be withheld as confidential commercial information. Although the applicants had sought compensatory and punitive damages, he found that it was not an appropriate case in which to award damages.

Given the importance of this decision in the much broader big data and business information context, RBC is likely to appeal it to the Federal Court of Appeal. The issues it raises are crucial to the future of transparency and accountability of corporations with respect to their use of personal information. In light of the unwillingness of the OPC to stand up to the bank, both in this case and in earlier cases regarding assertions of confidential commercial information, Justice Martineau’s approach is encouraging. There is a great deal at stake here, and the case will be well worth watching if it is appealed.

Published in Privacy

The department formerly known as Industry Canada (now Innovation, Science and Economic Development or ISED) has just released a discussion paper that seeks public input on the regulations that will accompany the new data breach notification requirements in the Personal Information Protection and Electronic Documents Act (PIPEDA).

The need to require private sector organizations in Canada to report data breaches was first formally identified in the initial review of PIPEDA carried out in 2007. The amendments to the statute were finally passed into law in June of 2015, but they will not take effect until regulations are enacted that provide additional structure to the notification requirements. The discussion paper seeks public input prior to drafting and publishing regulations for comment and feedback, so don’t hold your breath. It will still take a while before mandatory data breach notification requirements are in place in Canada.

The new amendments to the legislation make it mandatory for organizations to report data breaches to the Privacy Commissioner if those breaches pose “a real risk of significant harm to an individual” (s. 10.1). An organization must also notify any individuals for whom the breach poses “a real risk of significant harm” (s. 10.1(3)). The form and contents of these notifications remain to be established by the regulations. A new s. 10.2 of PIPEDA will also require an organization that has suffered a reportable breach to notify any other organization or government institution of the breach if doing so may reduce the risk of harm. For example, such notifications might include ones to credit reporting agencies or law enforcement officials. The circumstances that trigger this secondary notification obligation remain to be fleshed out in the regulations. Finally, a new s. 10.3 of PIPEDA will require organizations to keep records of all data breaches, not just those that reach the threshold for reporting to the Privacy Commissioner. In theory, these records might enable organizations to detect flaws in their security practices. They may also be requested by the Commissioner, providing potential for oversight of data security at organizations. The content of these records remains to be determined by the new regulations.

From the above, it is clear that the regulations that will support these statutory data breach reporting requirements are fundamentally important in setting their parameters. The ISED discussion paper articulates a series of questions relating to the content of the regulations on which it seeks public input. The questions relate to how to determine when there is a “real risk of significant harm to an individual”; the form and content of the notification that is provided to the Commissioner by an organization that has experienced a breach; the form, manner and content of notification provided to individuals; the circumstances in which an organization that has experienced a breach must notify other organizations; and the form and content of records kept by organizations, as well as the period of time that these records must be retained.

It is certain that ISED will receive many submissions from organizations that are understandably concerned about the impact these regulations may have on their operations and legal obligations. Consumer and public interest advocacy groups will undoubtedly make submissions from a consumer perspective. Individuals are also welcome to contribute to the discussion. Some questions are particularly relevant to how individuals will experience data breach notification. For example, if an organization experiences a breach that affects your personal information and that poses a real risk of harm, how would you like to receive your notification? By telephone? By mail? By email? And what information would you like to receive in the notification? What level of detail about the breach would you like to have? Do you want to be notified of measures you can take to protect yourself? Do you want to know what steps the organization has taken and will take to protect you?

Anyone with an interest in this issue, whether personally or on behalf of a group or an organization, has until May 31, 2016 to provide written submissions by e-mail to ISED. The discussion paper and questions can be found here.

Published in Privacy

Technology has enabled the collection and sharing of personal information on a massive scale, and governments have been almost as quick as the private sector to hoover up as much of it as they can. They have also been as fallible as the private sector – Canada’s federal government, for example, has experienced a substantial number of data breaches in the last few years.

What has not kept pace with technology has been the legislation in place to protect privacy. Canada’s federal Privacy Act, arguably a ground-breaking piece of legislation when it was first enacted in 1983, has remained relatively untouched throughout decades of dramatic technological change. Despite repeated calls for its reform, the federal government has been largely unwilling to update this statute that places limits on its collection, use and disclosure of personal information. This may be changing with the new government’s apparent openness to tackling the reform of both this statute and the equally antiquated Access to Information Act. This is good news for Canadians, as each of these statutes has an important role to play in holding a transparent government accountable for its activities.

On March 10, 2016 Federal Privacy Commissioner Daniel Therrien appeared before the Standing Committee on Access to Information, Privacy and Ethics, which is considering Privacy Act reform. The Commissioner’s statement identified some key gaps in the statute and set out his wish list of reforms.

As the Commissioner pointed out, technological changes have made it easier for government agencies and departments to share personal information – and they do so on what he describes as a “massive” scale. The Privacy Act currently has little to offer to address these practices. Commissioner Therrien is seeking amendments that would require information sharing within the government to take place according to written agreements in a prescribed form. Not only would this ensure that information sharing is compliant with legal obligations to protect privacy, it would offer a measure of transparency to a public that has a right to know whether and in what circumstances information they provide to one agency or department will be shared with another.

The Commissioner is also recommending that government institutions be explicitly required under the law to safeguard the personal information in their custody, and to report data breaches to the Office of the Privacy Commissioner. It may come as a surprise to many Canadians that such a requirement is not already in the statute – its absence is a marker of how outdated the law has become. Since 2014, the Treasury Board of Canada, in its Directive on Privacy Practices, has imposed mandatory breach reporting on all federal government institutions, but this is not a legislated requirement, nor is there recourse to the courts for non-compliance.

The Commissioner is also seeking more tools in his enforcement toolbox. Under the Privacy Act as it currently stands, the Commissioner may make recommendations to government institutions regarding their handling of personal information. These recommendations may then be ignored. While he notes that “in the vast majority of cases, government departments do eventually agree to implement our recommendations”, it is clear that this can be a long, drawn out process with mixed results. Currently, the only matters that can be taken to court for enforcement are denials by institutions to provide individuals with access to their personal information. The Commissioner is not seeking the power to directly compel institutions to comply with its recommendations; rather, he recommends that an institution that receives recommendations from the Office of the Privacy Commissioner have two choices. They may implement the recommendations or they may go to court for a declaration that they do not need to comply. On this model, relatively prompt compliance would presumably become the default.

The Commissioner is also seeking an amendment that would require government institutions to conduct privacy impact assessments before the launch of a new program or where existing programs are substantially modified. Again, you would think this would be standard practice by now. It does happen, but the Commissioner diplomatically describes current PIAs as being “sometimes uneven” in both their quality and timeliness. The Commissioner would also like to see a legislated requirement that government bills that will have an impact on privacy be sent to the OPC for review before being tabled in Parliament.

The Commissioner seeks additional amendments to improve transparency in relation to the government’s handling of personal information. Currently, the Commissioner files an annual report to Parliament. He may also issue special reports. The Commissioner recommends that he be empowered under the legislation “to report proactively on the practices of government”. He also recommends extending the Privacy Act to all government institutions. Some are currently excluded, including the Prime Minister’s Office and the offices of Ministers. He also recommends allowing all individuals whose personal information is in the hands of a federal government institution to have a right of access to that information (subject, of course, to the usual exceptions). Currently only Canadian citizens and those present in Canada have access rights.

This suite of recommendations is so reasonable that most Canadians would be forgiven for assuming these measures were already in place. Given the new government’s pre- and post-election commitments to greater transparency and accountability, there may be reason to hope we will finally see the long-overdue reform of the Privacy Act.

Published in Privacy

I was at the United Nations last week for an Expert Group Meeting on Moving from commitments to results in building effective, accountable and inclusive institutions at all levels. On February 18, 2016, I gave a presentation on balancing privacy with transparency in open government. This is a challenging issue, and one that is made even more so by digitization, information communication technologies and the big data environment.

Open access to government information and data serves the goals of greater transparency and greater public trust in government. It is essential in fighting corruption, but it is also important in holding governments to account for their decision-making and for their spending of public funds. However, transparency must be balanced against other considerations, including privacy. Privacy is a human right that protects the dignity, autonomy and integrity of individuals. Beyond this, the protection of personal information in the hands of governments also enhances public trust in government and can contribute to citizen engagement.

How, then, does one balance privacy with transparency when it comes to information in the hands of government? There are no easy answers. My slides from my presentation can be found here, and these slides contain some links to some other publicly available work on this topic.

Published in Privacy

A recent decision of the Ontario Superior Court of Justice has expanded the scope of the tort of invasion of privacy in Ontario. This is an important development, given that the tort was only recognized for the first time by the Ontario Court of Appeal in 2012. The rapid expansion of private recourses for invasion of privacy is not surprising. Technology has amplified privacy risks, and highly publicized incidents of data breaches, snooping, shaming, and identity theft have dramatically increased public awareness of the risks and harms of privacy invasive activity.

Doe 464533 v. D. involved a defendant who posted an intimate video of the plaintiff on a pornography website without her knowledge or consent. The two had been in a relationship which began when they were in high school and ended shortly afterwards. The plaintiff moved away to attend university and remained in regular contact with the defendant. He began pressuring her to send him an intimate video of herself. She refused to do so for a time, but eventually gave in to repeated requests. The defendant had assured her that no one else would see the video. As it turns out, he posted the video to a porn site on the same day he received it. He also showed it to other young men from the high school he had attended with the plaintiff.

The posting of the video and its aftermath were devastating to the plaintiff who suffered from depression and anguish. Justice Stinson observed that at the time of the hearing, 4 years after the incident, she was still “emotionally fragile and worried that the video may someday resurface and have an adverse impact on her employment, her career or her future relationships.” (at para 14)

There are two significant aspects to the court’s decision in this case. The first is that it expands the privacy tort recognized by the Ontario Court of Appeal in Jones v. Tsige. In that case, a bank employee had improperly accessed customer information for her own purposes. The Court of Appeal was prepared to recognize at least one aspect of the broad tort of invasion of privacy – that of “intrusion upon seclusion”. In other words, one who snoops or hacks their way into the personal information of another can be held liable for this invasion. The facts of Doe 464533 did not fit within the boundaries of ‘intrusion upon seclusion’. The defendant did not improperly access the plaintiff’s personal information. She sent it to him directly. However, she did so on the understanding that the material would remain strictly private. In breach of this understanding, the defendant posted the information online and shared it with common acquaintances. Justice Stinson characterized this as another branch of the broad tort of invasion of privacy – the “public disclosure of embarrassing private facts about the plaintiff”. Justice Stinson observed that “[i]n the electronic and Internet age in which we all now function, private information, private facts and private activities may be more and more rare, but they are no less worthy of protection.” (at para 44) He adopted a slightly modified version of the American Restatement (second) of Torts’ formulation of this branch of the tort:

One who gives publicity to a matter concerning the private life of another is subject to liability to the other for invasion of the other’s privacy, if the matter publicized or the act of the publication (a) would be highly offensive to a reasonable person, and (b) is not of legitimate concern to the public. (at para 46)

The recognition of this branch of the tort is an important development given that it now clearly provides recourse for those who are harmed by the publication of private facts about themselves. There are limits – the tort will only be available where the material published “would be highly offensive to a reasonable person”. Further, if the facts are ones that there is a public interest in knowing (for example, the publication of information about a person’s involvement in corrupt or illegal activity), there will be no liability. However, in an era in which “revenge porn” is a known phenomenon, the tort may provide a deterrent effect in some instances, and a basis for recourse in others.

The other interesting aspect of this decision is the damage award. The plaintiff had decided to commence her action under the Court’s Simplified Procedure. This meant that the maximum she could ask for in damages was $100,000. Justice Stinson ordered the maximum amount with little hesitation – which suggests that he might have awarded even more extensive damages had there been no cap. This is surely interesting, as damage awards for breach of privacy (either the tort or recourses under private sector data protection laws in Canada) have generally been quite small. In Jones v. Tsige, the Court had awarded only $10,000 in damages and had indicated that the normal range for such damages would be up to a maximum of $20,000 where no direct financial losses could be shown.

In Doe 464533, Justice Stinson found the harm suffered by the plaintiff from the publication of the video to be analogous to the harm suffered in cases of sexual assault and battery. He fixed an amount of $50,000 in general damages for the past and ongoing effects of the defendant’s actions. He also awarded $25,000 in aggravated damages relating to the particularly offensive behaviour of the defendant. According to Justice Stinson, the defendant’s breach of trust was “an affront to their relationship that made the impact of his actions even more hurtful and painful for the plaintiff.” (at para 59) He further awarded $25,000 in punitive damages for the defendant’s reckless disregard for the plaintiff, noting that the defendant had not apologized, nor shown any remorse. He noted as well the highly blameworthy nature of the defendant’s conduct, the vulnerability of the plaintiff, and the significant harm the plaintiff had suffered. Justice Stinson also intended the punitive damage award to have a deterrent effect: “it should serve as a precedent to dissuade others from engaging in similar harmful conduct.” (at para 62) In addition to the total award of $100,000 in damages, the judge ordered a further $5,500 in prejudgment interest and $36,208.73 in legal costs.

The recognition of the new tort, combined with the court’s approach to quantifying the harm suffered from this form of privacy invasive activity, should sound a warning to those who seek to use the internet as a means to expose or humiliate others.

Published in Privacy

Recent debates about enhanced police and national security surveillance powers in Canada have drawn attention to the vulnerability of Canadians’ privacy rights in the absence of proper safeguards and oversight. This problem is particularly acute in our big data economy, where participation in the economy – simply by being consumers of products and services – leaves a detailed trail of data in the hands of private sector actors. The Criminal Code provides for extensive access by police to personal information in the hands of third parties through its warrant system. Laws such as the Personal Information Protection and Electronic Documents Act (PIPEDA) also allow private sector companies to provide law enforcement and other government entities with personal information, without the knowledge and consent of the individual. This is often done in response to a court order or search warrant; however, PIPEDA also permits voluntary sharing even without a warrant in some circumstances.

The courts have had to play an important role in placing limits on the extent of access by state authorities to Canadians’ personal information. Just this week, in another significant decision, Justice Sproat of the Ontario Superior Court issued a long-awaited ruling in R. v. Rogers Communication (2016 ONSC 70) on the constitutional limitations on “tower dump” warrants.

The original tower dump warrants in this case were issued to police who were investigating a jewellery store robbery in Toronto. The police believed that the unidentified suspects had used cell phones during or just after the robbery. They asked the court for an order requiring the relevant cell phone service providers (in this case Rogers and Telus) to provide a dump of all of the data from cell phone towers that might have picked up and transmitted these calls within a window of time surrounding the robbery. On Telus’ estimate, compliance with the original order would have required it to provide data relating to at least 9,000 customers. Rogers estimated that it would need to provide the records of 34,000 subscribers. In addition to the data regarding all of the customers who had placed calls through those towers, the police also sought their name and address information, the names and contact information of all of the individuals whom these people called, and credit card and bank information on file for the callers. The police subsequently revised their request, seeking a much more limited amount of data. However, Rogers and Telus pursued their Charter case, arguing that a court ruling on the constitutional legitimacy of this type of data request was necessary to protect not just their own interests but those of their customers.

The Court agreed that the customers of telecommunications companies had a reasonable expectation of privacy in their cell phone data and that if Rogers and Telus could not proceed with the Charter claims, it would be difficult for these issues to be effectively litigated. It agreed to hear and rule on the Charter arguments notwithstanding that the police had withdrawn their initial request for the data and notwithstanding the fact that the Charter rights in question belonged to thousands of private citizens and not to the Telcos directly.

Justice Sproat did not hesitate in ruling that the original production orders sought in this case were overly broad and that they infringed the Charter rights of the individuals whose data would have been captured by them. He found that the orders “went far beyond what was reasonably necessary to gather evidence concerning the commission of the crimes under investigation” (at para 42). He then went on to formulate a set of guidelines for police seeking tower dump warrants. He premised his guidelines on the “fundamental principles of incrementalism and minimal intrusion” (at para 63). He emphasized as well the requirement for police who seek such a warrant to explain “clearly in the information to obtain how requested data relates or does not relate to the investigation.” (at para 64)

The guidelines and their more detailed articulation can be found at paragraph 63 of the decision. In summary, they require that the police provide:

1. A statement or explanation that demonstrates that the officer seeking the production order is aware of the principles of incrementalism and minimal intrusion and has tailored the requested order with that in mind;

2. An explanation as to why all of the named locations or cell towers, and all of the requested dates and time parameters are relevant to the investigation;

3. An explanation as to why all of the types of records sought are relevant;

4. Any other details or parameters which might permit the target of the production order to conduct a narrower search and produce fewer records;

5. A request for a report based on specified data instead of a request for the underlying data itself;

6. If there is a request for the underlying data, there should be a justification for that request; and

7. Confirmation that the types and amounts of data that are requested can be meaningfully reviewed.

These are important guidelines that seek to limit the reach of state authorities into the private lives of Canadians to only that information which is genuinely necessary to investigate criminal activity.

It is worth noting that Justice Sproat declined to consider post-seizure safeguards in relation to tower dump data. Where a production order legitimately allows police to seek tower dump data, nothing in the Criminal Code provides any guidance as to what safeguards should govern the security and retention of this data. These are important issues – we are all painfully aware of the rising number of public and private sector data security breaches, and of cases of excessive retention and careless destruction of no-longer useful personal information. According to Justice Sproat, issues regarding retention of this data are best left to the legislator. Given the vast amount of personal information now capable of collection from the private sector through the host of different production orders available under the Criminal Code, Parliament should be strongly encouraged to address this issue. In the meantime, it would be good to see police forces develop policies regarding the retention and destruction of personal information obtained under warrants that is no longer necessary for its original purpose.

Published in Privacy

The rise of big data analytics, combined with a movement at all levels of government in Canada towards open data and the proactive disclosure of government information, has created a context in which privacy interests are increasingly likely to conflict with the goals of transparency and accountability. In some cases these conflicts may be small and easily reconciled, but in other cases they may be more substantial. In every case, some means of reconciling the conflict must be found: where privacy and transparency conflict, which value should prevail and under what conditions?

Conflicts between transparency and privacy have been seen recently in, for example, concerns expressed over the amount of personal information that might be found in court and tribunal decisions that are published online. Sunshine lists – lists of salaries of public employees that are over a certain amount – also raise issues. Provinces that publish such lists have tended to do so using file formats that do not lend themselves to easy digital manipulation. But of course these modest technological barriers are routinely overcome, and individual name and salary information is absorbed into the big data universe for purposes quite distinct from meeting a government’s transparency objectives. Open municipal data files may include information about specific individuals: for example, a database of all home renovation permit applications would have privacy implications for those individuals who applied for such permits. Even if names were redacted, it is easy enough to identify the owners of any homes for which renovation permits were obtained. In some cases, the level of connection may be less direct. For example, a public restaurant inspection record that cited kitchen staff at a small local restaurant for failure to wash their hands on a specific inspection date might indirectly reveal the identity of the persons who did not wash their hands, particularly if the staff of the restaurant is quite small. And, of course, in the big data context, even anonymized data, or data that is not personal information on its face, can be matched with other available data to identify specific individuals.

The point is not that the disclosure of such information must be avoided at all costs – rather, the issue is how to determine where to draw the line between privacy and transparency, and what steps might be taken to protect privacy while still ensuring transparency. No new legislative framework has been created to specifically guide the move towards open government in Canada, notwithstanding the fact that government data is fuel for the engines of big data.

In a paper that has just been published by the Alberta Law Review, my co-author Amy Conroy and I explore these issues, using a recent Supreme Court of Canada decision as a departure point for our analysis. Although the Court’s decision in Ministry of Community Safety and Correctional Services v Information and Privacy Commissioner (Ontario) (Ministry of Community Safety) does not specifically address either open data or proactive disclosure, the case nevertheless offers important insights into the gaps in both legislation and case law in this area.

In our paper we consider the challenges inherent in the release of government data and information either through pro-active disclosure or as open data. A key factor in striking the balance between transparency and privacy is the definition of personal information – information that is not personal information has no privacy implications. Another factor is, of course, the meaning given to the concept of transparency. Our paper considers how courts and adjudicators understand transparency in the face of competing claims to privacy. We challenge the simple equation of the release of information with transparency and argue that the coincidence of open government with big data requires new approaches that are informed by the developing relationship between privacy and transparency.

“Promoting Transparency While Protecting Privacy in Open Government in Canada” by Amy Conroy and Teresa Scassa is published in (2015) 53:1 Alberta Law Review 175-206. A pre-print version is available here.

Published in Privacy
