Teresa Scassa
Thursday, 22 November 2018 06:51
Bill C-86 proposes positive changes for Official Marks regime: Comments to the Standing Senate Committee on Banking, Trade and Commerce
The following is a copy of remarks I made in an appearance before the Standing Senate Committee on Banking, Trade and Commerce on November 21, 2018. The remarks are about proposed amendments to the Trade-marks Act found in the omnibus Bill C-86. I realize that not everyone is as interested in official marks as I am. To get a sense of what the fuss is all about, you can have a look at my posts about the overprotection of Olympic Marks, the dispute over the Spirit Bear mark, the struggle to register a trademark in the face of a wrongly granted and since-abandoned official mark, Canada Post’s official marks for POSTAL CODE and CODE POSTAL, and a previous private member’s bill to reform official marks.

Canada’s official marks regime has long been criticized by lawyers, academics and the Federal Court. In fact, it is the Federal Court that has, over the years, created some much-needed boundaries for these “super marks”. The problems with official marks are well known, but they have largely been ignored by Parliament. It is therefore refreshing to see the proposed changes in ss. 215 and 216 of Bill C-86. Sections 215 and 216 address only one of the problems with the official marks regime. Although it is an important one, it is worth noting that there is more that could be done. The goal of my remarks will be to identify what I see as two shortfalls of ss. 215 and 216.

Official marks are a subcategory of prohibited marks, which may not be adopted, used or registered unless consent is provided. They are available to “public authorities”. A public authority need only ask the Registrar of Trademarks to give public notice of its adoption and use of an official mark for that mark to be protected. There are no limits on what can be adopted. There are no registration formalities, no examination or opposition proceedings. Until the very recent decision of the Federal Court in Quality Program Services Inc. v. Canada, it seemed nothing prevented a public authority from obtaining an official mark that was identical to or confusing with an already registered trademark. While Quality Program Services at least provides consequences for adopting a confusing official mark, it is currently under appeal and it is not certain that the decision will be upheld. This is another instance of the Federal Court trying to set boundaries for official marks that simply have not been set in the legislation.

Official marks are theoretically perpetual in duration. They remain on the register until they are either voluntarily withdrawn by the ‘owner’ (and owners rarely think to do this), or until a successful (and costly) action for judicial review results in one being struck from the Register. Until the Ontario Association of Architects decision in 2002 tightened up the meaning of ‘public authority’, official marks were handed out like Halloween candy, and many entities that were not ‘public authorities’ were able to obtain official marks. Many of these erroneously issued official marks continue to exist today; in fact, the Register of Trademarks has become cluttered with official marks that are either invalid or no longer in use.

Sections 215 and 216 address at least part of this last problem. They provide an administrative process through which either the Registrar or any person prepared to pay the prescribed fee can have an official mark invalidated if the entity that obtained the mark “is not a public authority or no longer exists.” This is a good thing. I would, however, suggest one amendment to the proposed new s. 9(4).
Where it is the case (as per the new s. 9(3)) that the entity that obtained the official mark was not a public authority or has ceased to exist, s. 9(4) allows the Registrar to give public notice that subparagraph (1)(n)(iii) “does not apply with respect to the badge, crest, emblem or mark”. As currently worded, this is permissive – the Registrar “may” give public notice of non-application. In my view, it should read:

(4) In the circumstances set out in subsection (3), the Registrar may, on his or her own initiative or shall, at the request of a person who pays a prescribed fee, give public notice that subparagraph (1)(n)(iii) does not apply with respect to the badge, crest, emblem or mark.

There is no reason why a person who has paid a fee to establish the invalidity of an official mark should not have the Registrar give public notice of that invalidity.

I would also suggest that the process for invalidating official marks should extend to those that have not been used within the preceding three years – in other words, something parallel to s. 45 of the Trade-marks Act, which provides an administrative procedure to remove unused registered trademarks from the Register. There are hundreds of ‘public authorities’ at federal and provincial levels across Canada, and they adopt official marks for all sorts of programs and initiatives, many of which are relatively transient. There should be a means by which official marks can simply be cleared from the Register when they are no longer used. Thus, I would recommend adding new subsections 9(5) and (6) to the effect that:

(5) The Registrar may at any time – and, at the written request of a person who pays a prescribed fee, made after three years from the date that public notice was given of an official mark, shall, unless the Registrar sees good reason to the contrary – give notice to the public authority requiring it to furnish, within three months, an affidavit or a statutory declaration showing that the official mark was in use at any time during the three year period immediately preceding the date of the notice and, if not, the date when it was last so in use and the reason for the absence of such use since that date.

(6) Where, by reason of the evidence furnished to the Registrar or the failure to furnish any evidence, it appears to the Registrar that an official mark was not used at any time during the three year period immediately preceding the date of the notice and that the absence of use has not been due to special circumstances that excuse it, the Registrar shall give public notice that subparagraph (1)(n)(iii) does not apply with respect to the badge, crest, emblem or mark.

These are my comments on the changes to the official marks regime that most closely relate to the amendments in Bill C-86. The regime has other deficiencies which I would be happy to discuss.
Published in
Trademarks
Wednesday, 31 October 2018 07:50
Statistics Canada faces backlash over collection of personal financial information (or: Teaching an old law new tricks)
A Global News story about Statistics Canada’s collection of detailed financial data of a half million Canadians has understandably raised concerns about privacy and data security. It also raises interesting questions about how governments can or should meet their obligations to produce quality national statistics in an age of big data.

According to Andrew Russell’s follow-up story, Stats Canada plans to collect detailed customer information from Canada’s nine largest banks. The information sought includes account balances, transaction data, and credit card and bill payments. It is unclear whether the collection has started.

As a national statistical agency, Statistics Canada is charged with the task of collecting and producing data that “ensures Canadians have the key information on Canada's economy, society and environment that they require to function effectively as citizens and decision makers.” Canadians are perhaps most familiar with providing census data to Statistics Canada, including more detailed data through the long form census. However, the agency’s data collection is not limited to the census. Statistics Canada’s role is important, and the agency has considerable expertise in carrying out its mission and in protecting privacy in the data it collects. This is not to say, however, that Statistics Canada never makes mistakes and never experiences privacy breaches. One of the concerns, therefore, with this large-scale collection of frankly sensitive data is the increased risk of privacy breaches.

The controversial collection of detailed financial data finds its legislative basis in this provision of the Statistics Act:

13 A person having the custody or charge of any documents or records that are maintained in any department or in any municipal office, corporation, business or organization, from which information sought in respect of the objects of this Act can be obtained or that would aid in the completion or correction of that information, shall grant access thereto for those purposes to a person authorized by the Chief Statistician to obtain that information or aid in the completion or correction of that information. [My emphasis]

Essentially, this provision confers enormous power on Stats Canada to request “documents or records” from third parties. Non-compliance with a request is an offence under s. 32 of the Act, which carries a penalty on conviction of a fine of up to $1000. A 2017 amendment to the legislation removed the possibility of imprisonment for this offence.

In case you were wondering whether Canada’s private sector data protection legislation offers any protection when it comes to companies sharing customer data with Statistics Canada, rest assured that it does not. Paragraph 7(3)(c.1) of the Personal Information Protection and Electronic Documents Act provides that an organization may disclose personal information without the knowledge or consent of an individual where the disclosure is:

(c.1) made to a government institution or part of a government institution that has made a request for the information, identified its lawful authority to obtain the information and indicated that [. . .]

(iii) the disclosure is requested for the purpose of administering any law of Canada or a province

According to the Global News story, Statistics Canada notified the Office of the Privacy Commissioner about its data collection plan and obtained the Commissioner’s advice.
In his recent Annual Report to Parliament, the Commissioner reported on Statistics Canada’s growing practice of seeking private sector data:

We have consulted with Statistics Canada (StatCan) on a number of occasions over the past several years to discuss the privacy implications of its collection of administrative data – such as individuals’ mobile phone records, credit bureau reports, electricity bills, and so on. We spoke with the agency about this again in the past year, after a number of companies contacted us with concerns about StatCan requests for customer data.

The Commissioner suggested that Stats Canada might consider collecting only data that has been deidentified at source, rather than detailed personal information (a toy illustration of what that might involve appears below). He also recommended an ongoing assessment of the necessity and effectiveness of such programs. The Commissioner also indicated that one of the problems with the controversial data collection by Statistics Canada is its lack of openness. He stated: “many Canadians might be surprised to learn the government is collecting their information in this way and for this purpose.”

While part of this lack of transparency lies in the decision not to be more upfront about the data collection, part of it lies in the fact that the legislation itself – while capable of being read to permit this type of collection – clearly does not expressly contemplate it. Section 13 was drafted in a pre-digital, pre-big data era. It speaks of “documents or records”, and not “data”. While it is possible to interpret it so as to include massive quantities of data, the original drafters no doubt contemplated collection activity on a much more modest scale. If section 13 really does include the power to ask any organization to share its data with Stats Canada, then it has become potentially limitless in scope. At the time it was drafted, the limits were inherent in the analogue environment. There was only so much paper Stats Canada could ask for, and only so much paper it had the staff to process. In addition, there was only so much data that entities and organizations collected, because they experienced the same limitations. The digital era means not only that there is a vast and increasing amount of detailed data collected by private sector organizations, but that this data can be transferred in large volumes with relative ease, and can be processed and analyzed with equal facility.

Statistics Canada is not the only national statistical organization to be using big data to supplement and enhance its data collection and generation. In some countries where statistical agencies struggle with a lack of human resources and funding, big data from the private sector offer opportunities to meet the data needs of their governments and economies. Statistical agencies everywhere recognize the potential of big data to produce more detailed, fine-grained and reliable data about many aspects of the economy. For example, the United Nations maintains a big data project inventory that catalogues experiments by national statistical agencies around the world with big data analytics.

Remember the cancellation of the long form census by the Harper government? This was not a measure to protect Canadians’ privacy by collecting less information; it was motivated by a belief that better and more detailed data could be sought using other means – including reliance on private sector data.
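To make the Commissioner’s “deidentified at source” suggestion a little more concrete, here is a minimal sketch of what a bank-side pseudonymization step might look like before any record is transmitted. Every field name and transformation here is invented for illustration; nothing reflects the actual practices of Statistics Canada or the banks, and a salted hash plus generalized fields reduces, but does not eliminate, re-identification risk.

```python
import hashlib

def deidentify_at_source(record: dict, salt: str) -> dict:
    """Pseudonymize a customer record before it leaves the bank.

    Illustrative only: replaces the direct identifier with a salted
    one-way hash and generalizes quasi-identifiers.
    """
    token = hashlib.sha256((salt + record["account_id"]).encode()).hexdigest()
    return {
        "account_token": token,                        # stable pseudonym; no name or number
        "region": record["postal_code"][:3],           # forward sortation area only
        "balance_band": round(record["balance"], -3),  # rounded to the nearest $1,000
    }

# The statistical agency would receive only the generalized record:
print(deidentify_at_source(
    {"account_id": "12345678", "postal_code": "K1N6N5", "balance": 14250.75},
    salt="agency-provided-secret"))
```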
It may well be that Statistics Canada needs the power to collect digital data to assist in data collection programs that serve national interests. However, the legislation that authorizes such collection must be up to date with our digital realities. Transparency requires an amendment to the legislation that would specifically enable the collection and use of digital and big data from the private sector for statistical purposes. Debate over the scope and wording of such a provision would give both the public and the potential third party data sources an opportunity to identify their concerns. It would also permit the shaping of limits and conditions that are specific to the nature and risks of this form of data collection.
Published in
Privacy
Sunday, 21 October 2018 11:37
Digital governance and Sidewalk Toronto: Some thoughts on the latest proposal
Late in the afternoon of Monday, October 15, 2018, Sidewalk Labs released a densely packed slide deck outlining its new and emerging data governance plan for the Sidewalk Toronto smart city development. The plan was discussed by Waterfront Toronto’s Digital Strategy Advisory Panel at their meeting on Thursday, October 18. I am a member of that panel, and this post elaborates upon the comments I made at that meeting.

Sidewalk Labs’ new data governance proposal builds upon the Responsible Data Use Policy Framework (RDUPF) document released by Sidewalk Labs in May 2018. It is, however, far more than an evolution of that document – it is a different approach reflecting a different smart city concept. It is so different that Ann Cavoukian, advisor to Sidewalk Labs on privacy issues, resigned on October 19. The RDUPF had made privacy by design its core focus and promised the anonymization of all sensor data. Cavoukian cited the fact that the new data governance framework contemplated that not all personal information would be deidentified as a reason for her resignation.

Neither privacy by design nor data anonymization is a privacy panacea, and the RDUPF document had a number of flaws. One of them was that by championing deidentification of personal information as the key to responsible data use, it very clearly addressed privacy concerns relating to only a subset of the data that would inevitably be collected in the proposed smart city. In addition, by focusing on privacy by design, it did little to address the many other data governance issues the project faced.

The new proposal embraces a broader concept of data governance. It is cognizant of privacy issues but also considers issues of data control, access, reuse, and localization. In approaching data governance, Sidewalk is also proposing a ‘civic data trust’ as a governance model. Sidewalk has made it clear that this is a work in progress and that it is open to feedback and comment. It received some at the DSAP meeting on Thursday, and more is sure to come.

My comments at the DSAP focused on two broad issues. The first was data and the second was governance. I prefaced my discussion of these by warning that in my view it is a mistake to talk about data governance using either of the Sidewalk Labs documents as a departure point. This is because these documents embed assumptions that need to be examined rather than simply accepted. They propose a different starting point for the data governance conversation than I think is appropriate, and as a result they unduly shape and frame that discussion.

Data

Both the RDUPF and the current data governance proposal discuss how the data collected by the Sidewalk Toronto development will be governed. However, neither document actually presents a clear picture of what those data are. Instead, both documents discuss a subset of data. The RDUPF discussed only depersonalized data collected by sensors. The second discusses only what it defines as “urban data”:

Urban Data is data collected in a physical space in the city, which includes:
● Public spaces, such as streets, squares, plazas, parks, and open spaces
● Private spaces accessible to the public, such as building lobbies, courtyards, ground-floor markets, and retail stores
● Private spaces not controlled by those who occupy them (e.g. apartment tenants)

This is very clearly only a subset of smart cities data. (It is also a subset that raises a host of questions – but those will have to wait for another blog post.)
In my view, any discussion of data governance in the Sidewalk Toronto development should start with a mapping out of the different types of data that will be collected, by whom, for what purposes, and in what form. It is understood that this data landscape may change over time, but at least a mapping exercise may reveal the different categories of data, the issues they raise, and the different governance mechanisms that may be appropriate depending on the category.

By focusing on deidentified sensor data, for example, the RDUPF did not address personal information collected in relation to the consumption of many services that will require identification – e.g., for billing or metering purposes. In the proposed development, what types of services will require individuals to identify themselves? Who will control such data? How will it be secured? What will the policies be with respect to disclosure to law enforcement without a warrant? What transparency measures will be in place? Will service consumption data also be deidentified and made available for research? In what circumstances? I offer this as an example of a different category of data that still requires governance, and that still needs to be discussed in the context of a smart cities development. This type of data would also fall outside the category of “urban data” in the second governance plan, making that plan only a piece of the overall data governance required, as there are many other categories of data that are not captured by “urban data”. The first step in data governance must be for all involved to understand what data is being collected, how, why, and by whom.

The importance of this is also made evident by the fact that between the RDUPF and the new governance plan, the very concept of the Sidewalk Toronto smart city seems to have changed. The RDUPF envisioned a city in which sensors were installed by Sidewalk, and Sidewalk was committing to the anonymization of any collected personal information. In the new version, the model seems to be the smart city as a technology platform on which any number of developers will be invited to build. As a result, the data governance model proposes an oversight body to provide approval for new data collection in public spaces, and to play some role in the sharing of the collected data if appropriate. This is partly behind the resignation of Ann Cavoukian. She objected to the fact that this model accepts that some new applications might require the collection of personal information, and so deidentification could not be an upfront promise for all data collected.

The technology-platform model seems responsive to concerns that the smart city would effectively be subsumed by a single corporation. It allows other developers to build on the platform – and by extension to collect and process data. Yet from a governance perspective this is much messier. A single corporation can make bold commitments with respect to its own practices; it may be difficult or inappropriate to impose these on others. It also makes it much more difficult to predict what data will be collected and for what purposes. This does not mean that the data mapping exercise is not worthwhile – many kinds and categories of data are already foreseeable, and mapping data can help to understand different governance needs. In fact, it is likely that a project this complex will require multiple data governance models.

Governance

The second point I tried to make in my 5 minutes at the Thursday meeting was about data governance.
The new data governance plan raises more questions than it answers. One glaring issue seems to be the place for our already existing data governance frameworks. These include municipal and provincial Freedom of Information and Protection of Privacy Acts and PIPEDA. They may also include the City of Toronto’s open data policies and platforms.

There are very real questions to be answered about which smart city data will be private sector data and which will be considered to be under the custody or control of a provincial or municipal government. Government has existing legal obligations about the management of data that are under its custody or control, and these obligations include the protection of privacy as well as transparency. A government that decides to implement a new data collection program (traffic cameras, GPS trackers on municipal vehicles, etc.) would be the custodian of this data, and it would be subject to relevant provincial laws.

The role of Sidewalk Labs in this development challenges, at a very fundamental level, the understanding of who is ultimately responsible for the collection and governance of data about cities, their services and infrastructure. Open government data programs invite the private sector to innovate using public data. But what is being envisaged in this proposal seems to be a privatization of the collection of urban data – with some sort of ‘trust’ model put in place to soften the reality of that privatization.

The ‘civic data trust’ proposed by Sidewalk Labs is meant to be an innovation in data governance, and I am certainly not opposed to the development of innovative data governance solutions. However, the use of the word “trust” in this context feels wrong, since the model proposed is not a data trust in any real sense of the word. This view seems to be shared by civic data trust advocate Sean MacDonald in an article written in response to the proposal. It is also made clear in this post by the Open Data Institute which attempts to define the concept of a civic data trust. In fact, it is hard to imagine such an entity being created and structured without significant government involvement.

This perhaps is at the core of the problem with the proposal – and at the root of some of the pushback the Sidewalk Toronto project has been experiencing. Sidewalk Labs is a corporation – an American one at that – and it is trying to develop a framework to govern vast amounts of data collected about every aspect of city life in a proposed development. But smart cities are still cities, and cities are public institutions created and structured by provincial legislation and with democratically elected councils. If data is to be collected about the city and its residents, it is important to ask why government is not, in fact, much more deeply implicated in any development of both the framework for deciding who gets to use city infrastructure and spaces for data collection, and what data governance model is appropriate for smart cities data.
Published in
Privacy
Thursday, 04 October 2018 05:43
Artist sued in Canada for copyright infringement for AI-related art project
A lawsuit filed in Montreal this summer raises novel copyright arguments regarding AI-generated works. The plaintiffs are artist Amel Chamandy and Galerie NuEdge Fine Arts (which sells and exhibits her art). They are suing artist Adam Basanta for copyright and trademark infringement. (The trademark infringement arguments are not discussed in this post.) Mr. Basanta is a world-renowned new media artist who experiments with AI in his work. (See the Globe and Mail story by Chris Hannay on this lawsuit here.)

According to a letter dated July 4, filed with the court, Mr. Basanta’s current project is “to explore connections between mass technologies, using those technologies themselves.” He explains his process in a video which can be found here. Essentially, he has created what he describes as an “art-factory” that randomly generates images without human input. The images created are then “analyzed by a series of deep-learning algorithms trained on a database of contemporary artworks in economic and institutional circulation” (see the artist’s website). The images used in the database of artworks are found online. Where the analysis finds a match of more than 83% between one of the randomly generated images and an image in the database, the randomly generated image is presented online with the percentage match, the title of the painting it matches, and the artist’s name. This information is also tweeted out. The image of the painting that matches the AI image is not reproduced or displayed on the website or on Twitter.

One of Mr. Basanta’s images was an 85.81% match with a painting by Ms. Chamandy titled “Your World Without Paper”. This information was reported on Mr. Basanta’s website and Twitter accounts along with the machine-generated image which resulted in the match. The copyright infringement allegation is essentially that “the process used by the Defendant to compare his computer generated images to Amel Chamandy’s work necessarily required an unauthorized copy of such a work to be made.” (Statement of Claim, para 30) Ms. Chamandy claims statutory damages of up to $20,000 for the commercial use of her work. Mr. Basanta, for his part, argues that there is no display of Ms. Chamandy’s work, and therefore no infringement.

AI has been generating much attention in the copyright world. AI algorithms need to be ‘trained’, and this training requires that they be fed a constant supply of text, data or images, depending upon the algorithm. Rights holders argue that the use of their works in this way without consent is infringement. The argument is that the process requires unauthorized copies to be fed into the system for algorithmic analysis. Debates have raged in the EU over a text-and-data mining exception to copyright infringement, which would make this type of use of copyright-protected works acceptable so long as it is for research purposes. Other uses would require clearance for a fee. There has already been considerable debate in Europe over whether research is a broad enough basis for the exception and what activities it would include. If a similar exception is to be adopted in Canada in the next round of copyright reform, we will face similar challenges in defining its boundaries.

Of course, the Chamandy case is not the conventional text and data mining situation. The copied image is not used to train algorithms. Rather, it is used in an analysis to assess similarities with another image. But such uses are not unknown in the AI world.
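For readers curious about how such a comparison might work under the hood, here is a minimal sketch of an embedding-based similarity check. To be clear, the court filings do not disclose how Mr. Basanta’s system actually scores similarity; the embedding representation, the cosine-similarity measure, and the 0.83 threshold (echoing the 83% figure reported in the case) are all assumptions made purely for illustration.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two image-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def find_matches(generated: np.ndarray, database: dict, threshold: float = 0.83):
    """Yield (title, artist, percent match) for artworks whose embedding
    similarity to the generated image exceeds the threshold.

    `database` maps (title, artist) -> embedding vector; how embeddings
    are produced (e.g., by a trained neural network) is left abstract.
    """
    for (title, artist), artwork_vec in database.items():
        score = cosine_similarity(generated, artwork_vec)
        if score > threshold:
            yield title, artist, round(score * 100, 2)

# Toy demo with random 512-dimensional "embeddings".
rng = np.random.default_rng(0)
artwork = rng.standard_normal(512)
db = {("Your World Without Paper", "Amel Chamandy"): artwork}
generated = artwork + 0.1 * rng.standard_normal(512)  # a near-duplicate, to force a match
for title, artist, pct in find_matches(generated, db):
    print(f"{pct}% match: '{title}' by {artist}")
```

Note that no image from the database is ever reproduced by this kind of pipeline; only the similarity score and the matching work’s metadata are reported, which is the crux of Mr. Basanta’s “no display” argument.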
Facial recognition technologies match live captured images with stored face prints. In this case, the third party artwork images are like the stored face prints. It is AI, just not the usual text and data mining paradigm. This should also raise questions about how to draft exceptions, or to interpret existing exceptions, to address AI-related creativity and innovation. In the US, some argue that the ‘fair use’ exception to infringement is broad enough to support text and data mining uses of copyright-protected works, since the resulting AI output is transformative. Canada’s fair dealing provisions are less generous than U.S. fair use, but it is still possible to argue that text and data mining uses might be ‘fair’. Canadian law recognizes fair dealing for the purposes of research or private study, so if an activity qualifies as ‘research’ it might be fair dealing. The fairness of any dealing requires a contextual analysis. In this case the dealing might be considered fair since the end result only reports on similarities but does not reproduce any of the protected images for public view. The problem, of course, with fair dealing defences is that each case turns on its own facts. The fact-dependent inquiry necessary for a fair dealing defence could be a major brake on innovation and creativity – either by dissuading uses out of fear of costly infringement claims or by driving up the cost of innovation by requiring rights clearance in order to avoid being sued.

The claim of statutory damages here is also interesting. Statutory damages were introduced in s. 38.1 of the Copyright Act to give plaintiffs an alternative to proving actual damage. For commercial infringements, statutory damages can range from $500 to $20,000 per work infringed; for non-commercial infringement the range is $100 to $5,000 for all infringements and all works involved. A judge’s actual award of damages within these ranges is guided by factors that include the need for deterrence and the conduct of the parties. Ms. Chamandy asserts that Mr. Basanta’s infringement is commercial, even though the commercial dimension is difficult to see. It would be interesting to consider whether the enhancement of his reputation or profile as an artist, or any increase in his ability to obtain grants, would be considered “commercial”.

Beyond the challenge of identifying what is commercial activity in this context, this case opens a window into the potential impact of statutory damages in text and data mining activities. If such activities are considered to infringe copyright and are not clearly within an exception, then in Canada a commercial text and data miner who consumes – say – 500,000 different images to train an algorithm might find themselves, even on the low end of the spectrum, liable for $250 million in statutory damages. Admittedly, the Act contains a clause that gives a judge the discretion to reduce an award of statutory damages if it is “grossly out of proportion to the infringement”. However, not knowing what a court might do, or by how much the damages might be reduced, creates uncertainty that can place a chill on innovation. Although in this case there may well be a good fair dealing defence, the realities of AI would seem to require either a clear set of exceptions to clarify infringement issues, or some other scheme to compensate creators which expressly excludes resort to statutory damages.
The vast number of works that might be consumed to train an algorithm for commercial purposes makes statutory damages, even at the low end of the scale, potentially devastating and creates a chill.
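The arithmetic behind that exposure is straightforward. As a back-of-the-envelope calculation using the s. 38.1 per-work range for commercial infringement and the 500,000-image example above (figures purely illustrative):

```python
# Hypothetical statutory damages exposure for a commercial text-and-data
# miner, using the s. 38.1 per-work range and the example from the post.
works_consumed = 500_000
low_per_work, high_per_work = 500, 20_000  # dollars per work infringed

print(f"Low end:  ${works_consumed * low_per_work:,}")   # $250,000,000
print(f"High end: ${works_consumed * high_per_work:,}")  # $10,000,000,000
```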
Published in
Copyright Law
Wednesday, 12 September 2018 13:44
Smart cities data - governance challenges
This post gives a brief overview of a talk I am giving September 12, 2018, on a panel hosted by the Centre for Law, Technology and Society at uOttawa. The panel title is ‘Smart and the City’.
This post (and my presentation) explores the concept of the ‘smart’ city and lays the groundwork for a discussion of governance by exploring the different types of data collected in so-called smart cities.

Although the term ‘smart city’ is often bandied about, there is no common understanding of what it means. Anthony Townsend has defined smart cities as “places where information technology is combined with infrastructure, architecture, everyday objects, and even our bodies to address social, economic, and environmental problems.” (A. Townsend, Smart Cities: Big Data, Civic Hackers, and the Quest for a New Utopia (New York: W.W. Norton & Co., 2013), at p. 15) This definition emphasizes the embedding of information technologies within cities with the goal of solving a broad range of urban problems. Still, there is uncertainty as to which cities are ‘smart’ or at what point a city passes the invisible ‘smart’ threshold.

Embedded technologies are multiple and ever-evolving, and many are already in place in the cities in which we live. Technologies that have become relatively commonplace include smart transit cards, GPS systems on public vehicles (e.g., buses, snowplows, emergency vehicles), smart metering for utilities, and surveillance and traffic cameras. Many of the technologies just identified collect data; smart technologies also process data using complex algorithms to generate analytics that can be used in problem identification and problem solving. Predictive policing is an example of a technology that generates information based on input data and complex algorithms.

While it is possible for a smart city to be built from the ground up, this is not the most common type of smart city. Instead, most cities become ‘smarter’ by increments, as governments adopt one technology after another to address particular needs and issues. While both from-the-ground-up and incremental smart cities raise important governance issues, it is the from-the-ground-up projects (such as Sidewalk Toronto) that get the most public attention. With incremental smart cities, the piecemeal adoption of technologies often occurs quietly, without notice, and thus potentially without proper attention being paid to important overarching governance issues such as data ownership and control, privacy, transparency, and security.

Canada has seen two major smart cities initiatives launched in the last year. These are the federal government’s Smart Cities Challenge – a contest between municipalities to fund the development of smart cities projects – and the Sidewalk Toronto initiative to create a from-the-ground-up smart development in Toronto’s Quayside area. Although Canadian cities have been becoming ‘smart’ by increments for some time now, these two high-profile initiatives have sparked discussion of the public policy issues, bringing important governance issues to the forefront.

These initiatives, like many others, have largely been conceived of and presented to the public as technology, infrastructure, and economic development projects. Rather than acknowledging up front the need for governance innovation to accompany the emerging technologies, governance tends to get lost in the hype. Yet it is crucial. Smart cities feed off data, and residents are primary sources. Much of the data collected in smart cities is personal information, raising obvious privacy issues. Issues of ownership and control over smart cities data (whether personal or non-personal) are also important.
They are relevant to who gets to access and use the data, for what purposes, and for whose profit. The public outcry over the Sidewalk Toronto project (examples here, here and here) clearly demonstrates that cities are not just tech laboratories; they are the places where we try to live decent and meaningful lives.

The governance issues facing so-called smart cities are complex. They may be difficult to disentangle from the prevailing ‘innovate or perish’ discourse. They are also rooted in technologies that are rapidly evolving. Existing laws and legal and policy frameworks may not be fully adequate to address smart cities challenges. This means that the governance issues raised by smart cities may require a rethinking of the existing law and policy infrastructure almost at pace with the emerging and evolving technologies.

The complexity of the governance challenges may be better understood when one considers the kind of data collected in smart cities. The narrower the categories of data, the more manageable data governance in the smart city will seem. However, the nature of information technologies, including the types and locations of sensors, and the fact that many smart cities are built incrementally, require a broad view of the types of data at play in smart cities. Here are some kinds of data collected and used in smart cities:
· traditional municipal government data (e.g. data about registrants or applicants for public housing or permits; data about water consumption, infrastructure, waste disposal, etc.)
· data collected by public authorities on behalf of governments (e.g. electrical consumption data; transit data, etc.)
· sensor data (e.g. data from embedded sensors such as traffic cameras, GPS devices, environmental sensors, smart meters)
· data sourced from private sector companies (e.g. data about routes driven or cycled from companies such as Waze or Strava; social media data, etc.)
· data from individuals as sensors (e.g. data collected about the movements of individuals based on signals from their cell phones; data collected by citizen scientists; crowd-sourced data, etc.)
· data that is the product of analytics (e.g. predictive data, profiles, etc.)
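One way to get a handle on this diversity is a simple data inventory recording, for each holding, who collects it, why, in what form, and who controls it. A minimal sketch follows; the schema, field values and example entries are all invented for illustration, and a real mapping exercise would be far more detailed.

```python
from dataclasses import dataclass

@dataclass
class DataHolding:
    """One row in a hypothetical smart-city data inventory."""
    description: str  # what the data is
    collector: str    # who collects it
    purpose: str      # why it is collected
    form: str         # identifiable / deidentified / aggregate
    custodian: str    # who controls it (city, public agency, vendor)

inventory = [
    DataHolding("transit card taps", "transit authority",
                "fare collection", "identifiable", "public agency"),
    DataHolding("traffic camera feeds", "municipality",
                "traffic management", "deidentified", "city"),
    DataHolding("cycling route traces", "private app vendor",
                "product analytics", "identifiable", "vendor"),
]

# Even a toy inventory makes governance gaps visible - for instance,
# identifiable holdings that sit outside public-sector custody:
for h in inventory:
    if h.form == "identifiable" and h.custodian == "vendor":
        print(f"Outside public access/privacy law? {h.description}")
```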
Public sector access to information and protection of privacy legislation provides some sort of framework for transparency and privacy when it comes to public sector data, but such legislation is clearly not well adapted to the diversity of smart cities data. While some data will be clearly owned and controlled by the municipality, other data will not be. Further, the increasingly complex relationship between public and private sectors around input data and data analytics means that there will be a growing number of conflicts between rights of access and transparency on the one hand, and the protection of confidential commercial information on the other. Given that few ‘smart’ cities will be built from the ground up (with the potential for integrated data governance mechanisms), the complexity and diversity of smart cities data and technologies creates a stark challenge for developing appropriate data governance.
(Sorry to leave a cliff hanger – I have some forthcoming work on smart cities data governance which I hope will be published by the end of this year. Stay tuned!)
Published in
Privacy
Monday, 27 August 2018 06:54
Judge rebuffs tax authority's "fishing expedition" in utility company's databases
A recent Federal Court decision highlights the risks to privacy that could flow from unrestrained access by government to data in the hands of private sector companies. It also demonstrates the importance of judicial oversight in ensuring transparency and the protection of privacy.

The Income Tax Act (ITA) gives the Minister of National Revenue (MNR) the power to seek information held by third parties where it is relevant to the administration of the income tax regime. However, where the information sought is about unnamed persons, the law requires judicial oversight. A judge of the Federal Court must review and approve the information “requirement”. Just such a matter arose in Canada (Minister of National Revenue) v. Hydro-Québec. The MNR sought information from Hydro-Québec, the province’s electrical utility, about a large number of its business customers. Only a few classes of customers, such as heavy industries that consume very large amounts of electricity, were excluded. Hydro itself did not object to the request and was prepared to fulfil it if ordered to do so by the Federal Court. The request was considered by Justice Roy, who noted that because the information was about unnamed and therefore unrepresented persons, it was “up to the Court to consider their interests.” (at para 5)

Under s. 231.2(3) of the ITA, before ordering the disclosure of information about unnamed persons, a judge must be satisfied that:

(a) the person or group is ascertainable; and

(b) the requirement is made to verify compliance by the person or persons with any duty or obligation under this Act.

The information sought from Hydro in digital format included customer names, business numbers, full billing addresses, addresses of each place where electricity is consumed, telephone numbers associated with the account, billing start dates and, if applicable, end dates, and any late payment notices sent to the customer.

Justice Roy noted that no information had been provided to the court to indicate whether the MNR had any suspicions about the tax compliance of business customers of Hydro-Québec. Nor was there much detail about what the MNR planned to do with the information. The documents provided by the MNR, as summarized by the Court, stated that the MNR was “looking to identify those who seem to be carrying on a business but failed to file all the required income tax returns.” (at para 14) However, Justice Roy noted that there were clearly also plans to share the information with other groups at the Canada Revenue Agency (CRA). These groups would use the information to determine “whether the individuals and companies complied with their obligations under the ITA and the ETA”. (at para 14)

Justice Roy was sympathetic to the need of government to have powerful means of enforcing tax laws that depend upon self-reporting of income. However, he found that what the MNR was attempting to do under s. 231.2 went too far. He ruled that the words used in that provision had to be interpreted in light of “the right of everyone to be left alone by the state”. (at para 28) He observed that it is clear from the wording of the Act that “Parliament wanted to limit the scope of the Minister’s powers, extensive as they are.” (at para 68)

Justice Roy carefully reviewed past jurisprudence interpreting s. 231.2(3). He noted that the section has always received a strict interpretation by judges.
In past cases where orders had been issued, the groups of unnamed persons about whom information was sought were clearly ascertainable, and the information sought was “directly related to these taxpayers’ tax status because it is financial in nature.” (at para 63) In the present case, he found that the group was not ascertainable, and the information sought “has nothing to do with tax status.” (at para 63) In his view, the aim of the request was to determine the identity of business customers of Hydro-Québec. The information was not sought in relation to a good faith audit with a proper factual basis. Because it was a fishing expedition meant to determine who might suitably be audited, the group of individuals identified by Hydro-Québec could not be considered “ascertainable”, as required by the law. Justice Roy noted that no information was provided to demonstrate what “business customer” meant. He observed that “the Minister would render the concept of ‘ascertainable group’ meaningless if, in the context of the ITA, she may claim that any group is an ascertainable group.” (at para 78) He opined that giving such a broad meaning to “ascertainable” could be an abuse that would lead to violations of privacy by the state.

Justice Roy also found that the second condition of s. 231.2(3) was not met. Section 231.2(3)(b) required that the information be sought in order “to verify compliance by the person or persons in the group with any duty or obligation under this Act.” He observed that the MNR was seeking an interpretation of this provision that would amount to: “Any information the Minister may consider directly or indirectly useful”. (at para 80) Justice Roy favoured a much more restrictive interpretation, limiting it to information that could “shed light on compliance with the Act.” (at para 80) He found that “the knowledge of who has a business account with Hydro-Québec does not meet the requirement of a more direct connection between the information and documents and compliance with the Act.” (at para 80)

The MNR had argued that if the two conditions of s. 231.2(3) were met, then a judge was required to issue the authorization. Because Justice Roy found the two conditions were not met, the argument was moot. Nevertheless, he noted that even if he had found the conditions to be met, he would still have had the discretion to deny the authorization if granting it would harm the public interest. In this case, there would be a considerable invasion of privacy “given the number of people indiscriminately included in the requirement for which authorization of the Court is being sought.” (at para 88) He also found that the fact that digital data was sought increased the general risk of harm. He observed that “the applicant chose not to restrict the use she could make of the large quantity of information she received” (at para 91) and that it was clearly planned that the information would be shared within the CRA.

Justice Roy concluded that even if he erred in his interpretation of the criteria in s. 231.2(3), and these criteria had to be given a broad meaning, he would still not have granted the authorization, on the basis that “judicial intervention is required to prevent such an invasion of the privacy of many people in Quebec.” (at para 96) Such intervention would particularly be required where “the fishing expedition is of unprecedented magnitude and the information being sought is far from serving to verify compliance with the Act.” (at para 96)

This is a strong decision which clearly protects the public interest. It serves to highlight the privacy risks in an era where both private and public sectors amass vast quantities of personal information in digital form. Although the ITA provides a framework to ensure judicial oversight in order to limit potential abuses, there are still far too many other contexts where information flows freely and where there may be insufficient oversight, transparency or accountability.
Published in
Privacy
Monday, 20 August 2018 09:40
Personal information in tribunal decisions: still seeking a balance with the open courts principle
The report of an investigator for Ontario’s Office of the Information and Privacy Commissioner (OIPC) into personal information contained within a published tribunal decision adds to the debate around how to balance individual privacy with the open courts principle.

In this case (Privacy Complaint No. PC17-9), the respondent is the Ontario Human Rights Tribunal (OHRT), established under the Ontario Human Rights Code. The OHRT often hears matters involving highly sensitive personal information. Where an adjudicator considers it relevant to their decision, they may include this information in their written reasons. Although a party may request that the decision be anonymized to protect their personal information, OHRT adjudicators have been sparing in granting requests for anonymization, citing the importance of the open courts principle.

The OIPC investigated after receiving a complaint about the reporting of sensitive personal information in an OHRT decision. The interesting twist in this case was that the personal information at issue was not that of the person who had complained to the OHRT (the ‘OHRT complainant’), and whose complaint had led to the tribunal hearing. Rather, it was the personal information of the OHRT complainant’s sister and mother. The complaint to the OIPC was made by the sister (the ‘OIPC complainant’) on behalf of herself and her mother. Although the sister’s and mother’s names were not used in the OHRT decision, they argued that they were easily identifiable since they lived in a small town and shared a distinctive surname with the OHRT complainant.

The OIPC investigator agreed. She noted that the information at the heart of the complaint consisted of “the applicant’s name, the applicant’s mother’s age, the mother’s primary language, the number of medications the applicant’s mother was taking, the reason for the medication, the state of the mother’s memory and the city the complainant resides in.” (at para 19) The investigator found that although the names of the OIPC complainant and her mother were not mentioned, their relationship to the OHRT complainant was. She observed: “Given that the applicant’s name is available, the uniqueness of the names and the size of the community, it is reasonable to assume that someone reading the decision would be able to identify her mother and sister and connect the information in the decision to them.” (at para 26)

Since the OHRT is a public body, and the information at issue was personal information, the OIPC complainant argued that the OHRT had breached the province’s Freedom of Information and Protection of Privacy Act (FIPPA) by publishing this information in its decision. For its part, the OHRT argued that the information was exempted from the application of FIPPA under s. 37 of that Act because it was “personal information that is maintained for the purpose of creating a record that is available to the general public”. It argued that it has an adjudicative mandate under the Human Rights Code and that the Statutory Powers Procedure Act (SPPA) permits it to determine its own practices and procedures. Although neither the Code nor the SPPA addresses the publication of decisions, the OHRT had decided that as a matter of practice its decisions would be published, including on the public legal information website CanLII. The OHRT also argued that its proceedings were subject to the open courts principle.
This argument was supported by a recent Ontario Superior Court decision (discussed here) which confirmed that the open courts principle applies to the decisions of statutory tribunals.

The investigator agreed with the OHRT. She observed that “[o]penness at tribunals tends to improve the quality of testimony and for that reason is conducive to the pursuit of truth in adjudicative proceedings.” (at para 56) She noted as well that the other elements of the open courts principle, including “oversight of decision-makers, the integrity of the administration of justice, and the educational and democracy-enhancing features of open courts” (at para 57), were all linked to the Charter value of freedom of expression. She accepted that the publication of reasons for decision was part of the openness principle, and concluded that: “The publication of decisions is an aspect of the Tribunal’s control over its own process and the information that is included in the Tribunal’s decisions is within the adjudicator’s discretion in providing reasons for those decisions.” (at para 65) She noted that many public values were served by the publication of the Tribunal’s decisions: “The publication of its decisions supports public confidence in the justice system, serves an educational purpose, promotes accountability by the Tribunal for its decision-making, and ensures that the public has the information necessary to exercise the Charter right to freedom of expression.” (at para 66) As a result, she concluded that s. 37 of FIPPA excluded the published decisions from the application of the privacy provisions of the Act.

This seems like an appropriate conclusion given the legislative framework. However, it does raise two general points of importance with respect to how the OHRT deals with personal information in its decisions.

First, human rights legislation exists in an attempt to provide recourse and redress for those who experience discrimination in contexts which closely affect their lives, such as employment, accommodation, and the receipt of services. The prohibited grounds of discrimination are ones which touch on highly personal and intimate aspects of peoples’ lives, relating to sexual identity, national origin, religion, and mental or physical disability, to provide but a few examples. Personal information of this kind is generally considered highly sensitive. The spectre that it will be published – online – alongside an individual’s name might be daunting enough to prevent some from seeking redress under the legislation at all. For example, fear that the online publication of one’s mental health information might make it difficult to find future employment could prevent a person from filing a complaint of discrimination. This would seem to subvert the purpose of human rights legislation. And yet, human rights tribunals have been reticent in granting requests for anonymization, citing the open courts principle.

Secondly, this case raises the further issue of how the sensitive personal information of third parties – who were neither witnesses before the tribunal nor complainants to the OHRT – ended up in a decision published online, and for which the Tribunal had refused an anonymization request.
The OIPC investigator concluded her report by recommending that the OHRT “continue to apply data minimization principles in the drafting of its decisions and include only personal information necessary to achieve the purpose of those decisions.” (at para 72) In the absence of clear directives for dealing with the online publication of personal information in court or tribunal decisions, and appropriate training for adjudicators, this gentle reminder seems to be the best that complainants can hope for. It is not good enough.

One need only recall the complaints to the Office of the Privacy Commissioner of Canada about the offshore website that had scraped decisions from CanLII and court websites in order to make them available in fully indexable form over the internet to realize that we have important unresolved issues about how personal information is published and disseminated in court and tribunal decisions in Canada.
Published in
Privacy
Tuesday, 07 August 2018 09:19
Ontario Court of Appeal Confirms Order to Disclose Physician Billing Information
Some years ago, a reporter from the Toronto Star filed an access to information request to obtain the names of the top 100 physician billers to the Ontario Health Insurance Plan (OHIP). She also sought the amounts billed and the physicians’ fields of specialization. The information was in the hands of the Ministry of Health and Long-Term Care, and the request was made under Ontario’s Freedom of Information and Protection of Privacy Act (FIPPA). The Ministry refused to disclose the records on the basis that they constituted the physicians’ personal information. An adjudicator with the Ontario Information and Privacy Commissioner’s Office disagreed, and ordered disclosure. An appeal by the Ontario Medical Association (OMA) to the Ontario Divisional Court was unsuccessful (discussed here). On August 3, 2018, the Ontario Court of Appeal dismissed the OMA’s further appeal of that decision.

The relatively brief and unanimous Court of Appeal decision made short work of the OMA’s arguments. The Court found that the adjudicator’s determination that the information was not personal information was reasonable. FIPPA specifically excludes from the definition of personal information “the name, title, contact information or designation of an individual that identifies the individual in a business, professional or official capacity”. The OMA had argued that the disclosure of the names in conjunction with the billing information meant that the disclosure would include personal information that “describes an individual’s finances, income, assets, liabilities…”. FIPPA provides in s. 21(3) that the disclosure of personal information is presumptively an invasion of privacy when it falls within this category. However, the Court found that the billing information constituted “the affected physicians’ gross revenue before allowable business expenses such as office, personnel, lab equipment, facility and hospital expenses.” (at para 25) The Court agreed with the adjudicator that the gross billing information did not reveal the actual income of the physicians. It stated: “where, as here, an individual’s gross professional or business income is not a reliable indicator of the individual’s actual personal finances or income, it is reasonable to conclude not only that the billing information is not personal information as per s. 2(1), but also that it does not describe “an individual’s finances [or] income”, for the purpose of s. 21(3)(f).” (at para 26)

The OMA had resisted disclosure because the billing information might give the public, who might not understand the costs associated with running a medical practice, a distorted idea of the physicians’ personal finances. Ironically, the Court found that billing information and actual income were so different from one another that the billing information did not amount to personal information.

The OMA had also objected to what it considered to be the OIPC’s changed position on the nature of this type of information; in the past, the OIPC had accepted that this information was personal information and had not ordered disclosure. The Ontario Court of Appeal observed that the adjudicator was not bound to follow precedent; it also observed that there were differences of opinion in past OIPC decisions on this issue, and that no clear precedent existed in any event.

The decision is an important one for access to information.
A publicly funded health care system consumes substantial resources, and there is a public interest in understanding, analyzing, critiquing and discussing how those resources are spent. The OMA was understandably concerned that public discussions not centre on particular individuals. However, governments have been moving towards greater transparency when it comes to monies paid to specific individuals and businesses, whether they are contractors or even public servants. As the Court of Appeal noted, FIPPA balances access to information with the protection of personal privacy. The public interest clearly prevailed in this instance.
Wednesday, 25 July 2018 12:29
Social media profiles and PIPEDA's "Publicly Available Information" Exception to Consent
A recent Finding from the Office of the Privacy Commissioner of Canada contains a consideration of the meaning of “publicly available information”, particularly as it relates to social media profiles. This issue is particularly significant given a recent recommendation by the ETHI committee in its Report on PIPEDA reform. PIPEDA currently contains a very narrowly framed exception to the requirement of consent for “publicly available information”. ETHI had recommended amending the definition to make it “technologically neutral”. As I argued here, such a change would make it open season for the collection, use and disclosure of the social media profiles of Canadians. The Finding, issued on June 12, 2018, came after multiple complaints were filed by Canadians about the practices of a New Zealand-based social media company, Profile Technology Ltd (PTL). The company had obtained Facebook user profile data from 2007 and 2008 under an agreement with Facebook. While its plan might originally have been to create a powerful search engine for Facebook, in 2011 it launched its own social media platform and used the Facebook data to populate that platform with profiles. Individuals whose profiles were created on the site had the option of ‘claiming’ them. PTL also provided two avenues for individuals who wished to delete the profiles. If an email address had been part of the original data obtained from Facebook and was associated with the PTL profile, a user could log in using that email address and delete the account. If no email address was associated with the profile, the company required individuals to set up a helpdesk ticket and to provide copies of official photo identification. A number of the complainants to the OPC indicated that they were unwilling to share their photo IDs with a company that had already collected, used and disclosed their personal information without their consent. The complainants’ concerns were not simply that their personal information had been taken and used to populate a new social media platform without their consent. They also felt harmed by the fact that the data used by PTL was from 2007-2008, and did not reflect any changes or choices they had since made. One complaint received by the OPC related to the fact that PTL had reproduced a group that had been created on Facebook, but that had since been deleted from Facebook. Within this group, allegations had been made about the complainant that he/she considered defamatory and bullying. The complainant objected to the fact that the group persisted on PTL and that the PTL platform did not permit changes to public groups at the behest of single individuals, on the basis that it treated the group description “as part of the profile of every person who has joined that group, therefore modifying the group would be like modifying all of those people’s profiles and we cannot modify their profiles without their consent.” (at para 55) It should be noted that although the data was initially obtained by PTL under licence from Facebook, Facebook’s position was that PTL had used the data in violation of the licence terms. Facebook had commenced proceedings against PTL in 2013 which resulted in a settlement agreement. There was some back and forth over whether the terms of the agreement had been met, but no information was available regarding the ultimate resolution. The Finding addresses a number of interesting issues.
These include the jurisdiction of the OPC to consider this complaint about a New Zealand-based company, the sufficiency of consent, and data retention limits. This post focuses only on the issue of whether social media profiles are “publicly available information” within the meaning of PIPEDA. PTL argued that it was entitled to benefit from the “publicly available information” exception to the requirement of consent for the collection and use of personal information because the Facebook profiles of the complainants were “publicly available information”. The OPC disagreed. It noted that the exception for “publicly available information”, found in ss. 7(1)(d) and 7(2)(c.1) of PIPEDA, is defined by regulation. The applicable provision is s. 1(e) of the Regulations Specifying Publicly Available Information, which requires that “the personal information must appear in a publication, the publication must be available to the public, and the personal information has to have been provided by the individual.” (at para 87) The OPC rejected PTL’s argument that “publication” included public Facebook profiles. In its view, the interpretation of “publicly available information” must be “in light of the scheme of the Act, its objects, and the intention of the legislature.” (at para 89) It opined that neither a Facebook profile nor a ‘group’ was a publication. It noted that the regulation makes it clear that “publicly available information” must receive a restrictive interpretation, and reflects “a recognition that information that may be in the public domain is still worthy of privacy protection.” (at para 90) The narrow interpretation of this exception to consent is consistent with the fact that PIPEDA has been found to be quasi-constitutional legislation. In finding that the Facebook profile information was not publicly available information, the OPC considered that the profiles at issue “were created at a time when Facebook was relatively new and its policies were in flux.” (at para 92) Thus it would be difficult to determine that the intention of the individuals who created profiles at that time was to share them broadly and publicly. Further, at the time the profiles were created, they were indexable by search engines by default. In an earlier Finding, the OPC had determined that this default setting “would not have been consistent with users’ reasonable expectations and was not fully explained to users” (at para 92). In addition, the OPC noted that Facebook profiles were dynamic, and that their ‘owners’ could update or change them at will. In such circumstances, “treating a Facebook profile as a publication would be counter to the intention of the Act, undermining the control users otherwise maintain over their information at the source.” (at para 93) This is an interesting point, as it suggests that the dynamic nature of a person’s online profile prevents it from being considered a publication – it is more like an extension of a user’s personality or self-expression. The OPC also noted that even though the profile information was public, to qualify for the exception it had to be contributed by the individual. This is not always the case with profile information – in some cases, for example, profiles will include photographs that contain the personal information of third parties. This Finding, which is not a decision and is not binding on anyone, shows how the OPC interprets the “publicly available information” exception in its home statute.
A few things are interesting to note:
· The OPC finds that social media profiles (in this case from Facebook) are different from “publications” in the sense that they are dynamic and reflect an individual’s changing self-expression.
· Allowing the capture and re-use, without consent, of self-expression from a particular point in time robs the individual not only of control over their personal information but also of control over how they present themselves to the public. This too makes profile data different from other forms of “publicly available information” such as telephone or business directory information, or information published in newspapers or magazines.
· The OPC’s discussion of Facebook’s problematic privacy practices at the time the profiles were created muddies the discussion of “publicly available information”. Even a finding that Facebook had appropriate consent practices should not change the conclusion that social media profiles are not “publicly available information” for the purposes of the exception.
It is also worth noting that a complaint against PTL to the New Zealand Office of the Privacy Commissioner proceeded on the assumption that PTL did not require consent because the information was publicly available. In fact, the New Zealand Commissioner ruled that no breach had taken place. Given the ETHI Report’s recommendation, it is important to keep in mind that the definition of “publicly available information” could be modified (although the government’s response to the ETHI report indicates some reservations about the recommendation to change the definition of publicly available information). Because the definition is found in a regulation, a modification would not require legislative amendment. As is clear from the ETHI report, there are a number of industries and organizations that would love to be able to harvest and use social media platform personal information without the need to obtain consent. Vigilance is required to ensure that these regulations are not altered in a way that dramatically undermines privacy protection.
Friday, 13 July 2018 13:30
Supreme Court's decision on (non) disclosure of aggregate health data in big tobacco litigation has few takeaways for privacy, big data
The Supreme Court of Canada has issued its unanimous decision in The Queen v. Philip Morris International Inc. This appeal arose out of an ongoing lawsuit brought by the province of British Columbia against tobacco companies to recover the health care costs associated with tobacco-related illnesses in the province. Similar suits brought by other provincial governments are at different stages across the country. In most cases, the litigation is brought under provincial legislation passed specifically to enable and to structure this recourse. The central issue in this case concerned the degree of access to be provided to Philip Morris International (PMI) to the databases relied upon by the province to calculate tobacco-related health care costs. PMI wanted access to the databases in order to develop its own experts’ opinions on the nature and extent of these costs, and to challenge the opinions to be provided by provincial experts who would have full access to the databases. Although the databases contained aggregate, de-identified data, the government refused access, citing the privacy interests of British Columbians in their health care data. As a compromise, it offered limited and supervised access to the databases at a Statistics Canada Data Centre. Although the other tobacco company defendants accepted this compromise, PMI did not, and sought a court order granting it full access. The court at first instance and later the Court of Appeal for British Columbia sided with PMI and ordered that access be provided. The SCC overturned this order. This case had been watched with interest by many because of the broader issues on which it might have shed some light. On one view, the case raised issues about how to achieve fairness in litigation where one party relies on its own vast stores of data – which might include confidential commercial data – and the other party seeks to test the validity or appropriateness of analytics based on this data. What level of access, if any, should be granted, and under what conditions? Another issue of broader interest was what measures, including the deemed undertaking rule, are appropriate to protect privacy where potentially re-identifiable personal information is sought in discovery. Others were interested in knowing what parameters the court might set for assessing the re-identification risk where anonymized data are disclosed. Those who hoped for broader takeaways for big data, data analytics and privacy are bound to be disappointed in the decision. In deciding in favour of the BC government, the Supreme Court largely confined its decision to an interpretation of the specific language of the Tobacco Damages and Health Care Costs Recovery Act. The statute offered the government two ways to proceed against tobacco companies – it could seek damages related to the health care costs of specific individuals, in which case the health records of those individuals would be subject to discovery, or it could proceed in a manner that considered only aggregate health care data. The BC government chose the latter route. Section 2(5) set out the rules regarding discovery in an aggregate action. The focus of the Supreme Court’s interpretation was s. 2(5)(b) of the Act, which reads:
2(5)(b) the health care records and documents of particular individual insured persons or the documents relating to the provision of health care benefits for particular individual insured persons are not compellable except as provided under a rule of law, practice or procedure that requires the production of documents relied on by an expert witness [My emphasis]
While it was generally accepted that this meant that the tobacco companies could not have access to individual health care records, PMI argued that the aggregate data was not a document “relating to the provision of health care benefits for particular individual insured persons”, and therefore its production could be compelled. The Supreme Court disagreed. Writing for the unanimous court, Justice Brown defined both “records” and “documents” as “means of storing information” (at para 22). He therefore found that the relevant databases “are both “records” and “documents” within the meaning of the Act.” (at para 22) He stated:
Each database is a collection of health care information derived from original records or documents which relate to particular individual insured persons. That information is stored in the databases by being sorted into rows (each of which pertains to a particular individual) and columns (each of which contains information about the field or characteristic that is being recorded, such as the type of medical service provided). (at para 22)
He also observed that many of the fields in the database were filled with data from individual patient records, making the databases “at least in part, collections of health care information taken from individuals’ clinical records and stored in an aggregate form alongside the same information drawn from the records of others.” (at para 23) As a result, the Court found that the databases qualified under the legislation as “documents relating to the provision of health care benefits for particular individual insured persons”, whether or not those individuals were identified within the database.
Perhaps the most interesting passage in the Court’s decision is the following:
The mere alteration of the method by which that health care information is stored — that is, by compiling it from individual clinical records into aggregate databases — does not change the nature of the information itself. Even in an aggregate form, the databases, to the extent that they contain information drawn from individuals’ clinical records, remain “health care records and documents of particular individual insured persons”. (at para 24)
A reader eager to draw lessons for other contexts might see the Court as saying that aggregate data derived from personal data are still personal data. This would certainly be important in the context of current debates about whether the de-identification of personal information removes it from the scope of private sector data protection laws such as the Personal Information Protection and Electronic Documents Act. But it would be a mistake to read that much into this decision. The latter part of the quoted passage grounds the Court’s conclusion on this point firmly in the language of the BC tobacco legislation. Later in the decision, the Court specifically rejects the idea that a “particular” individual under the BC statute is the same as an “identifiable individual”. Because the case is decided on the basis of the interpretation of s. 2(5)(b), the Court neatly avoids a discussion of what degree of re-identification risk would turn aggregate or anonymized data into information about identifiable individuals. This topic is also of great interest in the big data context, particularly in relation to data protection law. Whether any degree of re-identification risk could be sufficiently mitigated by the deemed undertaking rule so as to permit discovery also remains unexplored territory; those looking for a discussion of the relationship between re-identification risk and the deemed undertaking rule will have to wait for a different case.