Teresa Scassa - Blog


Thursday, 04 April 2019 12:54

Open Banking & Data Ownership

On April 4, 2019 I appeared before the Senate Standing Committee on Banking, Trade and Commerce (BANC), which has been holding hearings on Open Banking following the launch of a public consultation on Open Banking by the federal government. Open banking is an interesting digital innovation initiative with both potential and risks. I wrote earlier about open banking and some of the privacy issues it raises here. I was invited by the BANC Committee to discuss ‘data ownership’ in relation to open banking. The text of my opening remarks to the committee is below. My longer paper on Data Ownership is here.

_______________

Thank you for this invitation and opportunity to meet with you on the very interesting subject of Open Banking, and in particular on data ownership questions in relation to open banking.

I think it is important to think about open banking as the tip of a data iceberg. In other words, if Canada moves forward with open banking, this will become a test case for rendering standardized data portable in the hands of consumers with the goal of providing them with more opportunities and choices while at the same time stimulating innovation.

The question of data ownership is an interesting one, and it is one that has become of growing importance in an economy that is increasingly dependent upon vast quantities of data. However, the legal concept of ‘ownership’ is not a good fit with data. There is no data ownership right per se in Canadian law (or in law elsewhere in comparable jurisdictions, although in the EU the idea has recently been mooted). Instead, we have a patchwork of laws that protect certain interests in data. I will give you a very brief overview before circling back to data portability and open banking.

The law of confidential information exists to protect interests in information/data that is kept confidential. Individuals or corporations are often said to ‘own’ confidential information. But the value of this information lies in its confidentiality, and this is what the law protects. Once confidentiality is lost, so is exclusivity – the information is in the public domain.

The Supreme Court of Canada in 1988 also weighed in on the issue of data ownership – albeit in the criminal law context. They ruled in R. v. Stewart that information could not be stolen for the purposes of the crime of theft, largely because of its intangible nature. Someone could memorize a confidential list of names without removing the list from the possession of its ‘owner’. The owner would be deprived of nothing but the confidentiality of and control over the information.

It is a basic principle of copyright law that facts are in the public domain. There is good reason for this. Facts are seen as the building blocks of expression, and no one should have a monopoly over them. Copyright protects only the original expression of facts. Under copyright law, it is possible to have protection for a compilation of facts – the original expression will lie in the way in which the facts are selected or arranged. It is only that selection or arrangement that is protected – not the underlying facts. This means that those who create compilations of fact may face some uncertainty as to the existence and scope of any copyright. The Federal Court of Appeal, for example, recently ruled that there was no copyright in the Ontario Real Estate Board’s real estate listing data.

Of course, the growing value of data is driving some interesting arguments – and decisions – in copyright law. A recent Canadian case raises the possibility that facts are not the same as data under copyright law. This issue has also arisen in the US. Some data are arguably ‘authored’, in the sense that they would not exist without efforts to create them. Predictive data generated by algorithms are an example, or data that require skill, judgment and interpretation to generate. Not that many years ago, Canada Post advanced the argument that they had copyright in a postal code. In the US, a handful of cases have recognized certain data as being ‘authored’, but even in those cases, copyright protection has been denied on other grounds. According ownership rights over data – and copyright law provides a very extended period of protection – would create significant issues for expression, creation and innovation.

The other context in which the concept of data ownership arises is in relation to personal information. Increasingly we hear broad statements about how individuals ‘own’ their personal information. These are not statements grounded in law. There is no legal basis for individuals to be owners of their personal information. Individuals do have interests in their personal information. These interests are defined and protected by privacy and data protection laws (as well as by other laws relating to confidentiality, fiduciary duties, and so on). The GDPR in Europe was a significant expansion/enhancement of these interests, and reform of PIPEDA in Canada – if it ever happens – could similarly enhance the interests that individuals have in their personal data.

Before I speak more directly of these interests – and in particular of data portability – I want to just mention why it is that it is difficult to conceive of interests in personal data in terms of ownership.

What personal data could you be said to own, and what would it mean? Some personal data is observable in public contexts. Do you own your name and address? Can you prevent someone from observing you at work every day and deciding you are regularly late and have no dress sense? Is that conclusion your personal information or their opinion? Or both? If your parents’ DNA might reveal your own susceptibility to particular diseases, is their DNA your personal information? If an online bookstore profiles you as someone who likes to read Young Adult Literature – particularly vampire themed – is that your personal information or is it the bookstore’s? Or is it both? Data is complex and there may be multiple interests implicated in the creation, retention and use of various types of data – whether it is personal or otherwise. Ownership – a right to exclusive possession – is a poor fit in this context. And the determination of ownership on the basis of the ‘personal’ nature of the data will overlook the fact that there may be multiple interests entangled in any single datum.

What data protection laws do is define the nature and scope of a person’s interest in their personal information in particular contexts. In Canada, we have data protection laws that apply with respect to the public sector, the private sector, and the health sector. In all cases, individuals have an interest in their personal information which is accompanied by a number of rights. One of these is consent – individuals generally have a right to consent to the collection, use or disclosure of their personal information. But consent for collection is not required in the public sector context. And PIPEDA has an ever-growing list of exceptions to the requirements for consent to collection, use or disclosure. This shows how the interest is a qualified one. Fair information principles reflected in our data protection laws place a limit on the retention of personal information – when personal information collected by an organization is no longer required for the purpose for which it was collected, the organization’s obligation is to dispose of it securely – not to return it to the individual. The individual has an interest in their personal information, but they do not own it. And, as data protection laws make clear, the organizations that collect, use and disclose personal information also have an interest in it – and they may also assert some form of ownership rights over their stores of personal information.

As I mentioned earlier, the GDPR has raised the bar for data protection world-wide. One of the features of the GDPR is that it greatly enhances the nature and quality of the data subject’s interest in their personal information. The right to erasure, for example, limited though it might be, gives individuals control over personal information that they may have, at one time, shared publicly. The right of data portability – a right that is reflected to some degree in the concept of open banking – is another enhancement of the control exercised by individuals over their personal information.

What portability means in the open banking context is that individuals will have the right to provide access to their personal financial data to a third party of their choice (presumably from an approved list). While technically they can do that now, it is complicated and not without risk. In open banking, the standard data formats will make portability simple, and will enhance the ability to bring the data together for analysis and to provide new tools and services. Although individuals will still not own their data, they will have a further degree of control over it. Thus, open banking will enhance the interest that individuals have in their personal financial information. This is not to say that it is without risks or challenges.

 

Ongoing litigation in Canada over the recovery by provincial governments of health care costs related to tobacco use continues to raise interesting issues about the intersection of privacy, civil procedure, and big data analytics. A March 7, 2019 decision by the New Brunswick Court of Queen’s Bench (Her Majesty the Queen v. Rothmans Inc.) picks up the threads left hanging by the rather muted decision of the Supreme Court of Canada in The Queen v. Philip Morris International Inc.

The litigation before the Supreme Court of Canada arose from the BC government’s attempt to recover tobacco-related health care costs in that province. The central issue concerned the degree of access that one of the big tobacco defendants, Philip Morris International (PMI), should have to the databases relied upon by the province to calculate tobacco-related health care costs. PMI wanted access to the databases in order to develop its own experts’ opinions on the nature and extent of these costs, and to challenge the opinions to be provided by provincial experts who would have full access to the databases. Although the databases contained aggregate, de-identified data, the government denied access, citing the privacy interests of British Columbians in their health care data. As a compromise, it offered limited and supervised access to the databases at a Statistics Canada Research Data Centre. While the other tobacco company defendants accepted this compromise, PMI did not, and sought a court order granting it full access.

The Supreme Court of Canada’s decision was a narrow one. It interpreted the applicable legislation as making health care records and documents of individuals non-compellable in litigation for recovery of costs based on aggregate health care data. The Court considered the health databases to be “records” and “documents” and therefore not compellable. However, its decision touched only on the issue of whether PMI was entitled to access the databases to allow its own experts to prepare opinions. The Court did not address whether a defendant would be entitled to access the databases in order to challenge the plaintiff’s expert’s report that was created using the database information. Justice Brown, who wrote for the unanimous Court, stated: “To be clear, the databases will be compellable once "relied on by an expert witness": s. 2(5)(b). A "statistically meaningful sample" of the databases, once anonymized, may also be compelled on a successful application under ss. 2(5)(d) and 2(5)(e).” (at para 36) In response to concerns about trial fairness, Justice Brown noted the early stage of the litigation, and stated that: “Within the Act, the Legislature has provided a number of mechanisms through which trial fairness may be preserved. Specifically, s. 2(5)(b) itself requires that any document relied upon by an expert witness be produced.” (at para 34) He also observed that:

 

[Section] 2(5)(d) permits a court, on application, to order discovery of a "statistically meaningful sample" of any of the records and documents that are otherwise protected by s. 2(5)(b). No defendant has yet made such an application and thus no court has yet had reason to consider what would constitute a "statistically meaningful sample" of the protected documents. (at para 35)

The Supreme Court of Canada therefore laid the groundwork for the motions brought to the New Brunswick Court of Queen’s Bench under essentially similar legislation. Section 2 of New Brunswick’s Tobacco Damages and Health Care Costs Recovery Act is more or less identical to the provisions considered by the Supreme Court of Canada. Sections 2(5)(b), (d) and (e) of the Act provide:

2(5). . .

(b) the health care records and documents of particular individual insured persons or the documents relating to the provision of health care benefits for particular individual insured persons are not compellable except as provided under a rule of law, practice or procedure that requires the production of documents relied on by an expert witness,

. . .

(d) notwithstanding paragraphs (b) and (c), on application by a defendant, the court may order discovery of a statistically meaningful sample of the documents referred to in paragraph (b) and the order shall include directions concerning the nature, level of detail and type of information to be disclosed, and

(e) if an order is made under paragraph (d), the identity of particular individual insured persons shall not be disclosed and all identifiers that disclose or may be used to trace the names or identities of any particular individual insured persons shall be deleted from any documents before the documents are disclosed.

Thus, the provisions allow for discovery of documents relied upon by the government, subject to an obligation to deidentify them.

An expert witness for the Province of New Brunswick had produced several reports relying on provincial health care data. The province maintained that for privacy reasons the defendant should not have direct access to the data, even though it was deidentified in the database. It offered instead to provide recourse through a Statistics Canada Research Data Centre. The defendant sought “a "statistically meaningful sample" of clinical health care records concerning 1,273 individual insured persons in New Brunswick, under the authority of subsections 2(5)(d) and (e) of the Act.” (at para 2) It also sought a production order for “all Provincial administrative databases and national survey data” that was relied upon by the Province’s expert witness in preparing his reports. In addition, it sought access to data from other provincial health databases that were not relied upon by the expert in his report – the defendant was interested in assessing the approaches he chose not to pursue in addition to those he actually pursued. The province argued that it had provided sufficient access to relevant data through the Statistics Canada RDC, which implemented appropriate safeguards to protect privacy.

Justice Petrie first considered whether the access via Statistics Canada was adequate and he concluded that it was not. He noted that one of the other defendants in the litigation had filed an access to information request with Statistics Canada and had thereby learned of some of the work carried out by the province’s expert witness, including some “calculations and analysis” that he had chosen not to rely upon in his work. While the defendants were not prejudiced by this disclosure, they used it as an example of a flaw in the system administered by Stats Canada since its obligations under the Access to Information Act had led to the disclosure of confidential and privileged information. They argued that they could be prejudiced in their own work through Stats Canada by access to information requests from any number of entities with interests adverse to theirs, including other provincial governments. Justice Petrie sided with the defendants. He found that: “the Province's production of the data and materials relied upon by Dr. Harrison only within the confines and authority of a third party to this litigation, StatsCan/RDC poses a real risk to the confidentiality and privilege that must be accorded to the defendants and their experts.” (at para 66) He also stated:

 

The risk of potential premature or inadvertent disclosure, as determined by StatsCan, presents an unfair obstacle to the defendants' experts if required to undertake their analysis only within StatsCan/RDC. In short, the StatsCan Agreement terms and conditions are overly restrictive and likely pose a serious risk to trial fairness. I am of the view that less restrictive options are available to the Court and ones that more fairly balance trial fairness with the risks to any privacy breach for individual New Brunswickers. (at para 65)

These less restrictive options stem from the Court’s own power to “provide for directions on production and to protect the personal and sensitive information of individuals.” (at para 68) Justice Petrie found that “there are no applicable restrictions under privacy legislation to prohibit the Court from ordering document production outside of the StatsCan/RDC in the circumstances.” (at para 72) He rejected arguments that the Statistics Act prevented such disclosures, ruling that custody and control over the health data remained shared between the province and Stats Canada, and that the court could order the province to disclose it. Further, he found:

 

Where, as here, the Province has served the defendants with five expert reports of Dr. Harrison and indicated their intention to call him as a witness at trial, I find that subsection 2(5)(b) of the Act expressly requires production of the materials "relied upon" by the expert in the ordinary course. I am confident that the Court is capable of fashioning an order which would adequately address any privacy or reidentification concerns while, at the same time, imposing more balanced measures on the defendants and/or their experts. (at para 82)

These measures could include a direction by the court that no party attempt to identify specific individuals from the deidentified data.

On the issue of the disclosure of a statistically meaningful sample of health records, the defendant sought a sample from over 1200 New Brunswick patients. The legislation specifically provides in s. 2(5)(d) that a court may order discovery of “a statistically meaningful sample of the documents”, so long as they are deidentified. Justice Petrie found that there was a statutory basis for making this order, so long as privacy could be preserved. He rejected the province’s argument that the only way to do this was through the Stats Canada RDC. Instead, he relied upon the court’s own powers to tailor orders to the circumstances. He stated: “I am of the view that there is a satisfactory alternative to the StatsCan/RDC Agreement on terms that can allow for any re-identification risks to be properly addressed by way of a consent order preferably, and if not, by way of further submissions and ruling of this Court.” (at para 131)

On the issue of privacy and the deidentified records in the statistically meaningful sample, Justice Petrie stated:

 

Even if individuals might be able to be re-identified, which I am not convinced, it is not clear why the defendants would ever do so. [. . .] With respect to this request for an individual's personal health records, the Province has suggested no other alternative to such a sample, nor any alternative to the suggested approach on "anonymization" of the information. (at para 141)

He granted the orders requested by the defendants and required the parties to come to terms on a consent order to protect privacy in a manner consistent with his reasons.

This decision raises issues that are more interesting than those that were before the Supreme Court of Canada, mainly because the court is required in this case to specifically address the balance between privacy and fairness in litigation. The relevant legislation clearly does not require defendants to accept the plaintiff’s analyses of health data at face value; they are entitled to conduct their own analyses to test the plaintiff’s evidence, and they are permitted to do so using the data directly and not through some intermediary. While this means that sensitive health data, although anonymized, will be in the hands of the defendant tobacco companies, the court is confident that the rules of the litigation process, including the implied undertaking rule and the power of the court to set limits on parties’ conduct, will be sufficient to protect privacy. Although this court seems to believe that reidentification is not likely to be possible (a view that is certainly open to challenge), even if it were possible, a direction from the court that no analyses designed to permit identification take place is considered sufficient.

(This post is admittedly on the long side - if you have read the case and all you want are my thoughts on the difference between majority and minority opinions, feel free to skip to "Concluding thoughts" at the end.)

On February 14, 2019 the Supreme Court of Canada released its long-awaited decision in R. v. Jarvis, a case in which a high school teacher was prosecuted for voyeurism after he used a pen camera to make multiple recordings of female students’ cleavage while he talked to them in hallways or labs at school. Jarvis was acquitted at trial on the basis that the judge was not persuaded beyond a reasonable doubt that the recordings were for a sexual purpose, which was an element of the crime. The Ontario Court of Appeal found that the recordings were for a sexual purpose, but they upheld the acquittal on the basis that the students had no reasonable expectation of privacy at school. (My post on the ONCA decision is here).

The only issue before the Supreme Court of Canada (SCC) was “whether the Court of Appeal erred in finding that the students recorded by Mr. Jarvis were not in circumstances that give rise to a reasonable expectation of privacy for the purposes of s. 162(1) of the Criminal Code.” (at para 4). The SCC ruled unanimously that the students had a reasonable expectation of privacy and that a conviction should be entered in the case. However, the Court split on how they reached that conclusion. Six judges opted for a contextual approach to the reasonable expectation of privacy that set out a non-exhaustive list of nine considerations to take into account in determining whether a person has been observed or recorded in circumstances giving rise to an expectation of privacy. In reaching this interpretation, these judges relied in part on ‘reasonable expectation of privacy’ jurisprudence developed by the Court under s. 8 of the Charter. The three minority judges rejected the use of privacy jurisprudence developed in the criminal context, where the interests of the state are pitted against those of the individual. They also disagreed with the majority’s list of factors to consider in assessing a reasonable expectation of privacy. The minority would have kept only those four of the nine factors that could be linked to elements of the offence in s. 162(1).

The importance of this decision lies in the contextual approach taken by the majority to the reasonable expectation of privacy. This approach moves us away from the troubling dichotomy between public and private space which seems to inform the decision of the majority of the Court of Appeal. While the location of the person who is being subject to observation or recording is one of the factors to take into account, it is only one of them. Similarly, awareness of or consent to potential observation or recording is only a consideration and is not on its own determinative. The contextual approach also permits consideration of the relationship between the parties.

In this case, Jarvis had been charged with the crime of voyeurism under s. 162(1) of the Criminal Code. It is useful to reproduce the relevant parts of this provision:

162 (1) Every one commits an offence who, surreptitiously, observes — including by mechanical or electronic means — or makes a visual recording of a person who is in circumstances that give rise to a reasonable expectation of privacy, if

[. . . ]

(c) the observation or recording is done for a sexual purpose.

For there to be a conviction, Jarvis’ recordings would have to have been of students “in circumstances that give rise to a reasonable expectation of privacy.” The recordings were made when Jarvis engaged individual students or small groups of students in conversation in the school’s hallways or common areas.

The majority’s approach to interpretation

The majority’s interpretation of the phrase “circumstances that give rise to a reasonable expectation of privacy” is important, particularly since the majority of the ONCA had focused predominantly on location in determining whether a reasonable expectation of privacy arose on the facts. The majority of the SCC had some important things to say on the issue of privacy in public space. While acknowledging that expectations of privacy “will generally be at their highest when a person is in a traditionally ‘private’ place from which she has chosen to exclude all others”, (at para 37), Chief Justice Wagner nonetheless affirmed that a person does not lose all expectation of privacy because she is in public. He stated: “a person may be in circumstances where she can expect to be the subject of certain types of observation or recording but not to be the subject of other types.” (at para 38) He continued: “being in a public or semi-public space does not automatically negate all expectations of privacy with respect to observations or recording”. (at para 41)

The Chief Justice noted that the wording of s. 162(1) also supported the view that a reasonable expectation of privacy was not tied to location. In the first place, that provision speaks of “circumstances” giving rise to a reasonable expectation of privacy. It identifies three possible situations, the first of which is tied to location (where a person is “in a place in which a person can reasonably be expected to be nude . . . or to be engaged in explicit sexual activity”: s. 162(1)(a)). (at para 44) But paragraph 162(1)(c) merely refers to situations where “the observation or recording is done for a sexual purpose”. This latter provision contains no element of location.

The majority ruled that the jurisprudence developed under s. 8 of the Charter, which provides a right to be free from unreasonable search or seizure, could be used in interpreting the concept of “reasonable expectation of privacy”. This is a point on which the minority justices differed sharply. Section 8 of the Charter essentially provides an accused with what amounts to privacy protection from state intrusion. The concept of a “reasonable expectation of privacy” is a key element of a s. 8 analysis. However, as the majority notes, it is also a term used in other contexts – both civil and criminal. Interestingly, those civil contexts in which the phrase is used in Canadian legislation are predominantly found in relatively new statutes that provide tort recourse for the non-consensual distribution of intimate images. The phrase appears in legislation of this kind in Nova Scotia, Newfoundland, Alberta, Saskatchewan and Manitoba.

The majority noted that the Court’s s. 8 jurisprudence requires a contextual analysis of the reasonable expectation of privacy. Further, the case law teaches us that privacy is not an ‘all-or-nothing’ concept and that “simply because a person is in circumstances where she does not expect complete privacy does not mean that she waives all reasonable expectations of privacy.” (at para 61) Privacy is differently affected by recordings than by passing observations. Further, the impact of new and emerging technologies needs to be carefully considered. It is possible that “technology may allow a person to see or hear more acutely, thereby transforming what is ‘reasonably expected and intended to be a private setting’ into a setting that is not.” (at para 63) The majority also noted that “‘reasonable expectation of privacy’ is a normative rather than a descriptive standard.” (at para 68) This means that a person’s expectation of privacy should not be determined simply on the basis of whether there is a risk that they might be observed or recorded. If this were the case, advances in technology would shrink reasonable expectations of privacy to nothingness. As a result, the majority framed the core question as “whether that person was in circumstances in which she would reasonably have expected not to be the subject of the observation or recording at issue.” (at para 70)

Applying the contextual approach

For the majority, the determination of whether a person was in “circumstances that give rise to a reasonable expectation of privacy” should be guided by a non-exhaustive list of contextual considerations. These considerations should include:

1. The location the person was in when she was observed or recorded

2. The nature of the impugned conduct, that is, whether it consisted of observation or recording

3. Awareness of or consent to potential observation or recording

4. The manner in which the observation or recording was done

5. The subject matter or content of the observation or recording

6. Any rules, regulations or policies that governed the observation or recording in question

7. The relationship between the person who was observed or recorded and the person who did the observing or recording

8. The purpose for which the observation or recording was done

9. The personal attributes of the person who was observed or recorded

Applying these factors to the case before them, the majority noted that the videos were taken at school. The majority of the Court of Appeal had considered schools to be public places. However, the majority of the SCC found that schools are not entirely ‘public’ in nature. Access is restricted, and schools are “subject to formal rules and informal norms of behaviour, including with respect to visual recording, that may not exist in other quasi-public locations”. (at para 73) They noted that the young women were not merely observed, they were recorded – and they were unaware that recording was taking place. Although the ONCA had taken into account the fact that students were aware of continuous recording by security cameras in schools, the majority of the SCC ruled that “not all forms of recording are equally intrusive” and “there are profound differences between the effect on privacy resulting from the school’s security cameras and that resulting from Mr. Jarvis’ recordings” (at para 75). The majority found Jarvis’s recordings were “far more intrusive than casual observation, security camera surveillance or other types of observation or recording that would reasonably be expected by people in most public places, and in particular, by students in a school environment.” (at para 76)

In considering the content of the recordings, the majority noted that while the recordings were of students engaging in normal school activities, they focused close-up on their faces and breasts. The videos targeted specific students rather than capturing general scenes of school activity. The majority stated: “the videos do not show students merging into the “situational landscape”; rather, they single out these students, make them personally identifiable, and allow them to be subjected to intensive scrutiny.” (at para 80).

On the issue of rules and policies, the majority noted that there was a formal school board policy that prohibited the making of recordings of this kind. While the existence of such rules or policies is not determinative, and their weight might vary depending on the circumstances, in this case, the policy gave clear support to a finding of a reasonable expectation of privacy on the part of the students. Jarvis’ behaviour was outside of the clearly established norms for teachers at school.

The seventh factor is important in this case. It relates to the relationship between the perpetrator and the person being observed or recorded. The majority found that a relationship of trust existed between teachers and students. The Chief Justice wrote: “It is inherent in this relationship that students can reasonably expect teachers not to abuse their position of authority over them, and the access they have to them, by making recordings of them for personal, unauthorized purposes” (at para 84). Of all of the factors in the majority’s list, this is the one that makes it most clear that a reasonable expectation of privacy does not rely simply on factors related to location, awareness, or the logistics of the observation or recording. Perhaps because of this, it is one of the factors the minority justices rejected.

The majority also considered the purpose of the recording. Since conviction for voyeurism under s. 162(1)(c) requires that the observation or recording be for sexual purposes, this seems a bit redundant. However, the consideration is part of a framework for determining a reasonable expectation of privacy more generally – and presumably in contexts other than just s. 162(1) of the Criminal Code. Thus, for example, the fact that the school had video cameras in public spaces did not infringe on the students’ reasonable expectations of privacy, but Jarvis’ recordings did – a key reason (though not the only one) for this was linked to the purpose of the recordings. The majority of the Court of Appeal, by contrast, had fixed on location as crucial to the reasonable expectation of privacy; citing the public nature of schools and the already existing surveillance cameras, they found the students had no reasonable expectation of privacy.

The final factor considered by the majority was the “personal attributes” of the affected persons. In this case, it meant taking into account that the people recorded were high school students. Justice Wagner noted that there is evidence of a “societal consensus” that children have “greater privacy rights than similarly situated adults.” (at para 86).

After applying these criteria to the facts, the majority easily concluded that the young women recorded by Jarvis had a reasonable expectation of privacy. Justice Wagner wrote: “A student attending class, walking down a school hallway or speaking to her teacher certainly expects that she will not be singled out by the teacher and made the subject of a secretive, minutes-long recording or series of recordings focusing on her body.” (at para 90). Interestingly, he also indicated that he might have ruled the same way if the recordings had been made by a stranger on a public street.

The minority opinion

Justice Rowe wrote for the three judges in the minority. Although they too found that a conviction should be entered in this case, they had two main points of disagreement with the majority justices. The first was that, in their view, s. 8 case law should not be used in interpreting what a “reasonable expectation of privacy” is for the purposes of a criminal offence. They noted that s. 8 case law evolved to address the reasonable expectations of privacy that individuals have vis-à-vis the state. Section 162(1) involved the Crown having to prove that one individual encroached on the reasonable expectation of privacy of another; according to Justice Rowe, this was something very different from redressing “[t]he power imbalance of the police as agents of the state vis-à-vis a citizen that is at the heart of the preoccupations under s. 8 of the Charter”. (at para 102)

Justice Rowe also considered that s. 8 had been interpreted to protect personal, territorial and information privacy. By contrast, in his view, s. 162(1) of the Criminal Code “can relate only to the protection of one’s physical image, a subcategory of personal privacy, itself a subcategory of that which is protected under s. 8”. (at para 102).

The minority justices also took issue with the majority’s list of contextual factors. Instead, they found that only four of the nine factors are actually required by the wording of s. 162(1) taken as a whole. These are: location; the subject matter of the observation or recording; the purpose for which it was made; and the complainant’s awareness of the observation or recording. For the minority justices, the five other factors identified by the majority are relevant only to sentencing. Thus, for the minority, the existence of a relationship of trust is not a factor in assessing whether a person is guilty of voyeurism.

Justice Rowe notes that the voyeurism offences in the Criminal Code were the first “to include a complainant’s reasonable expectation of privacy as an element of the offence.” (at para 118) Since voyeurism is a sexual offence, he argued that the concept of a reasonable expectation of privacy had to be interpreted with regard to “personal autonomy and sexual integrity”. In his view, the privacy interest in s. 162(1):

is meant to protect a privacy interest in one’s image against observations or recordings that are, first, surreptitious and, second, objectively sexual in content or purpose. This privacy interest itself, where it is substantially and not trivially engaged (e.g. by merely uncouth or ill-mannered behavior), is founded on the twin interests of the protection of sexual integrity and the autonomy to control one’s personal visual information. (at para 128)

In the context of the voyeurism offence, the minority justices were of the view that “Infringing a person’s reasonable expectation of privacy in the context of the voyeurism offence can be conceptualized as crossing a threshold where the law prioritizes the observed person’s interest in protecting their autonomy and sexual integrity over the accused’s liberty of action.” (at para 132)

Such an approach to privacy does not depend solely on location. While location is relevant, it is not determinative. For the minority justices, a privacy infringement occurs “when that which is unknown/unobserved becomes known/observed without the person having put this information forward.” (at para 136) Although a person may be undressed in some public places such as a change room, they might reasonably expect to be observed, yet they would “maintain an essential privacy interest that can be infringed by surreptitious observation or recording, with or without the use of technology, which allows more invasive access to the subject’s image than would otherwise be possible.” (at para 137)

Ultimately, the minority justices found that the students had a reasonable expectation “regarding how their bodies would be observed in the classrooms and hallways of their school” (at para 146). They found that Jarvis’ recordings “went beyond the access that the students allowed in this setting, thus infringing their autonomy”. They were also of a sexual nature, leading to the conclusion that the students’ sexual integrity was infringed.

Concluding Thoughts

The majority’s decision will likely be welcomed by many in the privacy community who had become concerned by the fact that many lower courts, in different contexts, had suggested that there can be no reasonable expectation of privacy in public space. In a society in which public space is increasingly penetrated by technology that permits surveillance and recording (the majority, for example, mentioned drones, but Jarvis’ pen camera is also an example), a contextual approach to privacy is far more useful than any distinction based on concepts of private and public space. The majority also includes the concept of relationships of trust or authority in its analysis. In Jarvis, it is hard to ignore the fact that the teacher was in a position of both trust and authority over the students. Youths should be able to trust that the adults who have authority over them will not surreptitiously record images of them for sexual purposes regardless of where they are located. The relationship is surely a factor in the reasonableness of any expectation of privacy. The majority’s contextual approach feels right in these circumstances.

At the same time, the minority is correct in noting that s. 8 jurisprudence has evolved to answer the question of whether and when individuals have a reasonable expectation of privacy vis-à-vis the state. As Justice Rowe observes in Jarvis, s. 162(1) is an offence that defines the circumstances in which a person’s liberty to act crosses the line and becomes criminal. His approach, which links the expectation of privacy to considerations present in the wording of the offence (including location, purpose of recording, the subject matter of the observation or recording, and the complainant’s awareness of the filming), is meant to keep the offence more narrowly focused to preserve the balance between one person’s liberty and the other person’s autonomy and sexual integrity. As noted earlier, the language “reasonable expectation of privacy” also appears in the laws of those provinces that have made it a tort to disseminate intimate images without consent. For the minority justices, the issue is whether the offender has made public something that the victim had not wished to have public – something that undermines her autonomy and sexual integrity.

The problem with the minority approach, however, may lie in what made this case – one that must have seemed like a no-brainer to so many – go all the way to the Supreme Court of Canada before a conviction was entered. The trial judge in this case obviously struggled with his own perceptions that the young women in question were ‘putting it out there’. He wrote: “[i]t may be that a female student’s mode of attire may attract a debate about appropriate reactions of those who observe such a person leading up to whether there is unwarranted and disrespectful ogling” (Trial decision, at para 46). Perhaps the Court of Appeal’s focus on the public nature of the school and its hallways was also influenced by the idea that women’s bodies in public spaces are there for consumption. Without the majority’s contextual approach – one that directs us to consider a range of factors including the youth of victims and relationships of trust – the decisions from the courts below are perhaps proof enough that a more pared-down focus on “autonomy and sexual integrity” may just not cut it.

Thursday, 07 February 2019 08:09

Ontario Launches Data Strategy Consultation

On February 5, 2019 the Ontario Government launched a Data Strategy Consultation. This comes after a year of public debate and discussion about data governance issues raised by the proposed Quayside smart cities development in Toronto. It also comes at a time when the data-thirsty artificial intelligence industry in Canada is booming – and hoping very much to be able to continue to compete at the international level. Add to the mix the view that greater data sharing between government departments and agencies could make government ‘smarter’, more efficient, and more user-friendly. The context might be summed up in these terms: the public is increasingly concerned about the massive and widespread collection of data by governments and the private sector; at the same time, both governments and the private sector want easier access to more and better data.

Consultation is a good thing – particularly with as much at stake as there is here. This consultation began with a press release that links to a short text about the data strategy, and then a link to a survey which allows the public to provide feedback in the form of answers to specific questions. The survey is open until March 7, 2019. It seems that the government will then create a “Minister’s Task Force on Data” and that this body will be charged with developing a draft data strategy that will be opened for further consultation. The overall timeline seems remarkably short, with the process targeted to wrap up by Fall 2019.

The press release telegraphs the government’s views on what the outcome of this process must address. It notes that 55% of Canada’s big data vendors are located in Ontario, and that the government plans “to make life easier for Ontarians by delivering simpler, faster and better digital services.” The goal is clearly to develop a data strategy that harnesses the power of data for use in both the private and public sectors.

If the Quayside project has taught anyone anything, it is that people do care about their data in the hands of both public and private sector actors. The press release acknowledges this by referencing the need for “ensuring that data privacy and protection is paramount, and that data will be kept safe and secure.” Yet perhaps the Ontario government has not been listening to all of the discussions around Quayside. While the press release and the introduction to the survey talk about privacy and security, neither document addresses the broader concerns that have been raised in the context of Quayside, nor those that are raised in relation to artificial intelligence more generally. There are concerns about bias and discrimination, transparency in algorithmic decision-making, profiling, targeting, and behavioural modification. Seamless sharing of data within government also raises concerns about mass surveillance. There is also a need to consider innovative solutions to data governance and the role the government might play in fostering or supporting these.

There is no doubt that the issues underlying this consultation are important ones. It is clear that the government intends to take steps to facilitate intra-governmental sharing of data as well as greater sharing of data between government and the private sector. It is also clear that much of that data will ultimately be about Ontarians. How this will happen, and what rights and values must be protected, are fundamental questions.

As is the case at the provincial and federal level across the country, the laws which govern data in Ontario were written for a different era. Not only are access to information and protection of privacy laws out of date; data-driven practices also increasingly impact areas such as consumer protection, competition, credit reporting, and human rights. An effective data strategy might need to reach out across these different areas of law and policy.

Privacy and security – the issues singled out in the government’s documents – are important, but privacy must mean more than the narrow view of protecting identifiable individuals from identity theft. We need robust safeguards against undue surveillance, assurances that our data will not be used to profile or target us or our communities in ways that create or reinforce exclusion or disadvantage; we need to know how privacy and autonomy will be weighed in the balance against the stimulation of the economy and the encouragement of innovation. We also need to consider whether there are uses to which our data should simply not be put. Should some data be required to be stored in Canada, and if so in what circumstances? These and a host of other questions need to be part of the data strategy consultation. Perhaps a broader question might be why we are talking only about a data strategy and not a digital strategy. The approach of the government seems to focus on the narrow question of data as both an input and output – but not on the host of other questions around the digital technologies fueled by data. Such questions might include how governments should go about procuring digital technologies, the place of open source in government, the role and implication of technology standards – to name just a few.

With all of these important issues at stake, it is hard not to be disappointed by the form and substance of at least this initial phase of the government's consultation. It is difficult to say what value will be derived from the survey which is the vehicle for initial input. Some of the questions are frankly vapid. Consider question 2:

2. I’m interested in exploring the role of data in:

creating economic benefits

increasing public trust and confidence

better, smarter government

other

There is no box in which to write in what the “other” might be. And questions 9 to 11 provide sterling examples of leading questions:

9. Currently, the provincial government is unable to share information among ministries requiring individuals and businesses to submit the same information each time they interact with different parts of government. Do you agree that the government should be able to securely share data among ministries?

Yes

No

I’m not sure

10. Do you believe that allowing government to securely share data among ministries will streamline and improve interactions between citizens and government?

Yes

No

I’m not sure

11. If government made more of its own data available to businesses, this data could help those firms launch new services, products, and jobs for the people of Ontario. For example, government transport data could be used by startups and larger companies to help people find quicker routes home from work. Would you be in favour of the government responsibly sharing more of its own data with businesses, to help them create new jobs, products and services for Ontarians?

Yes

No

I’m not sure

In fairness, there are a few places in the survey where respondents can enter their own answers, including questions about what issues should be put to the task force and what skills and experience members should have. Those interested in data strategy should be sure to provide their input – both now and in the later phases to come.

Tuesday, 22 January 2019 16:56

Canada's Shifting Privacy Landscape

Note: This article was originally published by The Lawyer’s Daily (www.thelawyersdaily.ca), part of LexisNexis Canada Inc.

In early January 2019, Bell Canada caught the media spotlight over its “tailored marketing program”. The program will collect massive amounts of personal information, including “Internet browsing, streaming, TV viewing, location information, wireless and household calling patterns, app usage and the account information”. Bell’s background materials explain that “advertising is a reality” and that customers who opt into the program will see ads that are more relevant to their needs or interests. Bell promises that the information will not be shared with third party advertisers; instead it will enable Bell to offer those advertisers the ability to target ads to finely tuned categories of consumers. Once consumers opt in, their consent is presumed for any new services that they add to their account.

This is not the first time Bell has sought to collect vast amounts of data for targeted advertising purposes. In 2015, it terminated its short-lived and controversial “Relevant Ads” program after an investigation initiated by the Privacy Commissioner of Canada found that the “opt out” consent model chosen by Bell was inappropriate given the nature, volume and sensitivity of the information collected. Nevertheless, the Commissioner’s findings acknowledged that “Bell’s objective of maximizing advertising revenue while improving the online experience of customers was a legitimate business objective.”

Bell’s new tailored marketing program is based on “opt in” consent, meaning that consumers must choose to participate and are not automatically enrolled. This change and the OPC’s apparent acceptance of the legitimacy of targeted advertising programs in 2015 suggest that Bell may have brought its scheme within the parameters of PIPEDA. Yet media coverage of the new tailored ads program generated public pushback, suggesting that the privacy ground has shifted since 2015.

The rise of big data analytics and the stunning recent growth of artificial intelligence have sharply changed the commercial value of data, its potential uses, and the risks it may pose to individuals and communities. After the Cambridge Analytica scandal, there is also much greater awareness of the harms that can flow from consumer profiling and targeting. While conventional privacy risks of massive personal data collection remain (including the risk of data breaches, and enhanced surveillance), there are new risks that impact not just privacy but consumer choice, autonomy, and equality. Data misuse may also have broader impacts than just on individuals; such impacts may include group-based discrimination, and the kind of societal manipulation and disruption evidenced by the Cambridge Analytica scandal. It is not surprising, then, that both the goals and potential harms of targeted advertising may need rethinking, along with the nature and scope of the data on which they rely.

The growth of digital and online services has also led to individuals effectively losing control over their personal information. There are too many privacy policies, they are too long and often obscure, products and services are needed on the fly and with little time to reflect, and most policies are ‘take-it-or-leave-it’. A growing number of voices are suggesting that consumers should have more control over their personal information, including the ability to benefit from its growing commercial value. They argue that companies that offer paid services (such as Bell) should offer rebates in exchange for the collection or use of personal data that goes beyond what is needed for basic service provision. No doubt, such advocates would be dismayed by Bell’s quid pro quo for its collection of massive amounts of detailed and often sensitive personal information: “more relevant ads”. Yet money-for-data schemes raise troubling issues, including the possibility that they could make privacy something that only the well-heeled can afford.

Another approach has been to call for reform of the sadly outdated Personal Information Protection and Electronic Documents Act. Proposals include giving the Privacy Commissioner enhanced enforcement powers, and creating ‘no go zones’ for certain types of information collection or uses. There is also interest in creating new rights such as the right to erasure, data portability, and rights to explanations of automated processing. PIPEDA reform, however, remains a mirage shimmering on the legislative horizon.

Meanwhile, the Privacy Commissioner has been working hard to squeeze the most out of PIPEDA. Among other measures, he has released new Guidelines for Obtaining Meaningful Consent, which took effect on January 1, 2019. These guidelines include a list of “must dos” and “should dos” to guide companies in obtaining adequate consent.

While Bell checks off many of the ‘must do’ boxes with its new program, the Guidelines indicate that “risks of harm and other consequences” of data collection must be made clear to consumers. These risks – which are not detailed in the FAQs related to the program – obviously include the risk of data breach. The collected data may also be of interest to law enforcement, and presumably it would be handed over to police with a warrant. A more complex risk relates to the fact that internet, phone and viewing services are often shared within a household (families or roommates) and targeted ads based on viewing/surfing/location could result in the disclosure of sensitive personal information to other members of the household.

Massive data collection, profiling and targeting clearly raise issues that go well beyond simple debates over opt-in or opt-out consent. The privacy landscape is changing – both in terms of risks and responses. Those engaged in data collection would be well advised to be attentive to these changes.

In Netlink Computer Inc. (Re), the British Columbia Supreme Court dismissed an application for leave to sue a trustee in bankruptcy for the alleged improper disposal of assets of a bankrupt company that contained the personal information of the company’s customers.

The issues at the heart of the application first reached public attention in September 2018 when a security expert described in a blog post how he noticed that servers from the defunct company were listed for sale on Craigslist. Posing as an interested buyer, he examined the computers and found that their unwiped hard drives contained what he reported as significant amounts of sensitive customer data, including credit card information and photographs of customer identification documents. Following the blog post, the RCMP and the BC Privacy Commissioner both launched investigations. Kipling Warner, who had been a customer of the defunct company Netlink, filed lawsuits against Netlink, the trustee in bankruptcy which had disposed of Netlink’s assets, the auction company Able Solutions, which sold the assets, and Netlink’s landlord. All of the lawsuits include claims of breach of statutory obligations under the Personal Information Protection and Electronic Documents Act, breach of B.C.’s Privacy Act, and breach of B.C.’s Personal Information Protection Act. The plan was to have the lawsuits certified as class action proceedings. The action against Netlink was stayed due to the bankruptcy. The B.C. Supreme Court decision deals only with the action against the trustee, as leave of the court must be obtained in order to sue a trustee in bankruptcy.

As Master Harper explained in his reasons for decision, the threshold for granting leave to sue a trustee in bankruptcy is not high. The evidence presented in the claim must advance a prima facie case. Leave to proceed will be denied if the proposed action is considered frivolous or vexatious, since such a lawsuit would “interfere with the due administration of the bankrupt’s estate by the trustee” (at para 9). Essentially the court must balance the competing interests of the party suing the trustee and the interest in the efficient and timely wrapping up of the bankrupt’s estate.

The decision to dismiss the application in this case was based on a number of factors. Master Harper was not impressed by the fact that the multiple lawsuits brought against different actors all alleged the same grounds. He described this as a “scattergun approach” that suggested a weak evidentiary foundation. The application was supported by two affidavits: one from Mr. Warner, which Master Harper described as being based on inadmissible ‘double hearsay’, and one from the blogger, Mr. Doering. While Master Harper found that the Doering affidavit contained first-hand evidence from Doering’s investigation into the servers sold on Craigslist, he noted that Doering himself had not been convinced by the seller’s statements about how he came to be in possession of the servers. The Master noted that this did not provide a basis for finding that it was the trustee in bankruptcy who was responsible. The Master also noted that although an RCMP investigation had been launched at the time of the blog post, it had since concluded with no charges being laid. The Master’s conclusion was that there was no evidence to support a finding that any possible privacy breach “took place under the Trustee’s ‘supervision and control’.” (at para 58)

Although the application was dismissed, the case does highlight some important concerns about the handling of personal information in bankruptcy proceedings. Not only can customer databases be sold as assets in bankruptcy proceedings; Mr. Doering’s blog post also raised the spectre of computer servers and computer hard drives being disposed of without properly being wiped of the personal data that they contain. Although he dismissed the application to file suit against the Trustee, Master Harper did express some concern about the Trustee’s lack of engagement with some of the issues raised by Mr. Warner. He noted that no evidence was provided by the Trustee “as to how, or if, the Trustee seeks to protect the privacy of customers when a bankrupt’s assets (including customer information) are sold in the bankruptcy process.” (at para 44) This is an important issue, but it is one on which there is relatively little information or discussion. A 2009 blog post from Quebec flags some of the concerns raised about privacy in bankruptcy proceedings; a more recent post suggests that while larger firms are more sophisticated in how they deal with personal information assets, the data in the hands of small and medium-sized firms that experience bankruptcy may be more vulnerable.

Digital and data governance is challenging at the best of times. It has been particularly challenging in the context of Sidewalk Labs’ proposed Quayside development for a number of reasons. One of these is (at least from my point of view) an ongoing lack of clarity about who will ‘own’ or have custody or control over all of the data collected in the so-called smart city. The answer to this question is a fundamentally important piece of the data governance puzzle.

In Canada, personal data protection is a bit of a legislative patchwork. In Ontario, the collection, use or disclosure of personal information by the private sector, and in the course of commercial activity, is governed by the federal Personal Information Protection and Electronic Documents Act (PIPEDA). However, the collection, use and disclosure of personal data by municipalities and their agencies is governed by the Municipal Freedom of Information and Protection of Privacy Act (MFIPPA), while the collection, use and disclosure of personal data by the province is subject to the Freedom of Information and Protection of Privacy Act (FIPPA). The latter two statutes – MFIPPA and FIPPA – contain other data governance requirements for public sector data. These relate to transparency, and include rules around access to information. The City of Toronto also has information management policies and protocols, including its Open Data Policy.

The documentation prepared for the December 13, 2018 Digital Strategy Advisory Panel (DSAP) meeting includes a slide that sets out implementation requirements for the Quayside development plan in relation to data and digital governance. A key requirement is: “Compliance with or exceedance of all applicable laws, regulations, policy documents and contractual obligations” (page 95). This is fine in principle, but it is not enough on its own to say that the Quayside project must “comply with all applicable laws”. At some point, it is necessary to identify what those applicable laws are. This has yet to be done. And the answer to the question of which laws apply in the context of privacy, transparency and data governance, depends upon who ultimately is considered to ‘own’ or have ‘custody or control’ of the data.

So – whose data is it? It is troubling that this remains unclear even at this stage in the discussions. The fact that Sidewalk Labs has been asked to propose a data governance scheme suggests that Sidewalk and Waterfront may be operating under the assumption that the data collected in the smart city development will be private sector data. There are indications buried in presentations and documentation that also suggest that Sidewalk Labs considers that it will ‘own’ the data. There is a great deal of talk in meetings and in documents about PIPEDA, which also indicates that there is an assumption between the parties that the data is private sector data. But what is the basis for this assumption? Governments can contract with a private sector company for data collection, data processing or data stewardship – but the private sector company can still be considered to act as an agent of the government, with the data being legally under the custody or control of the government and subject to public sector privacy and freedom of information laws. The presence of a private sector actor does not necessarily make the data private sector data.

If the data is private sector data, then PIPEDA will apply, and there will be no applicable access to information regime. PIPEDA also has different rules regarding consent to collection than are found in MFIPPA. If the data is considered ultimately to be municipal data, then it will be subject to MFIPPA’s rules regarding access and privacy, and it will be governed by the City of Toronto’s information management policies. These are very different regimes, and so the question of which one applies is quite fundamental. It is time for there to be a clear and forthright answer to this question.

Law and the “Sharing Economy”: Regulating Online Market Platforms is a new, peer-reviewed collection of papers co-edited by Derek McKee, Finn Makela and myself. The book is the product of a workshop held in January 2017. It was published in late November 2018 by the University of Ottawa Press in both print and open access PDF formats.

The title of the book uses scare quotes around ‘Sharing Economy’ because of the deep ambivalence felt about the term amongst contributors to the volume, and the inability to find a suitable alternative. The term ‘sharing economy’ is used by some to suggest an alternative to the market; others have used it to describe activities taking place over large, commercial platforms. And, while some of the platforms use the rhetoric of helping ordinary individuals make ends meet by providing them with the ability to commercialize (‘share’) underutilized resources, the reality is that the rise of large platform companies has drawn other kinds of resources into the ‘sharing economy’. These resources may include, for example, living spaces once rented out on a long-term basis that now turn greater profits as short-term accommodation. Platform companies have had broad disruptive impacts. Our authors consider their impacts on licensing regimes, alternative dispute resolution, legal normativity, local governance, specific industries, and labour rights. They also consider platform companies’ digital data, their relationship to international trade agreements, and the competition law and policy issues they raise.

The collection of papers in this book offers “a set of diverse lenses through which we can examine both the sharing economy and its broader social impacts, and from which certain key themes emerge” (introduction, p. 5). The book is organized into five broad themes: Technologies of Regulation; Regulating Technology; The Space of Regulation – Local to Global; Regulating Markets; and Regulating Labour. The papers reflect a diversity of perspectives. Some explore issues in the context of specific platforms such as Airbnb or Uber; others consider the issues raised by the ‘sharing economy’ more broadly. A Table of Contents for the book is found below.


Law and the “Sharing Economy”: Regulating Online Market Platforms

Derek McKee, Finn Makela, Teresa Scassa, eds.


Table of contents

Introduction

Derek McKee, Finn Makela and Teresa Scassa

Technologies of regulation

Peer Platform Markets and Licensing Regimes

Derek McKee

The False Promise of the Sharing Economy

Harry Arthurs

The Fast to the Furious

Nofar Sheffi

Regulating technology

The Normative Ecology of Disruptive Technology

Vincent Gautrais

Information Law in the Platform Economy: Ownership, Control and Reuse of Platform Data

Teresa Scassa

The space of regulation: local to global

Urban Cowboy E-Capitalism Meets Dysfunctional Municipal Policy-Making: What the Uber Story Tells Us About Canadian Local Governance

Mariana Valverde

The Sharing Economy and Trade Agreements: The Challenge to Domestic Regulation

Michael Geist

Regulating markets

Should Licence Plate Owners be Compensated when Uber Comes to Town?

Eran Kaplinsky

Competition Law and Policy Issues in the Sharing Economy

Francesco Ducci

Regulating labour

The Legal Framework for Digital Platform Work: The French Experience

Marie-Cécile Escande-Varniol

Uber and the Unmaking and Remaking of Taxi Capitalisms: Technology, Law and Resistance in Historical Perspective

Eric Tucker

Making Sense of the Public Discourse on Airbnb and Labour: What About Labour Rights?

Sabrina Tremblay-Huet


On November 23, 2018, Waterfront Toronto hosted a Civic Labs workshop in Toronto. The theme of the workshop was Smart City Data Governance. I was asked to give a 10 minute presentation on the topic. What follows is a transcript of my remarks.

Smart city governance relates to how smart cities govern themselves and their processes; how they engage citizens and how they are transparent and accountable to them. Too often the term “smart city” is reduced to an emphasis on technology and on technological solutionism – in other words, “smart cities” are presented as a way in which to use technology to solve urban problems. In its report on Open Smart Cities, Open North observes that “even when driven in Canada by good intentions and best practices in terms of digital strategies, . . . [the smart city] remains a form of innovation and efficiency-driven technological solutionism that is not necessarily integrated with urban plans, with little or no public engagement and little to no relation to contemporary open data, open source, open science or open government practices”.

Smart city governance puts the emphasis on the “city” rather than the “smart” component, focusing attention on how decisions are made and how the public is engaged. Open North’s definition of the Open Smart City is in fact a normative statement about digital urban governance:

An Open Smart City is where residents, civil society, academics, and the private sector collaborate with public officials to mobilize data and technologies when warranted in an ethical, accountable and transparent way to govern the city as a fair, viable and liveable commons and balance economic development, social progress and environmental responsibility.

This definition identifies the city government as playing a central role, with engagement from a range of different actors, and with particular economic, social and environmental goals in mind. This definition of a smart city involves governance in a very basic and central way – stakeholders are broadly defined and they are engaged not just in setting limits on smart cities technology, but in deciding what technologies to adopt and deploy and for what purposes.

There are abundant interesting international models of smart city governance – many of them arise in the context of specific projects often of a relatively modest scale. Many involve attempts to find ways to include city residents in both identifying and solving problems, and the use of technology is relevant both to this engagement and to finding solutions.

The Sidewalk Toronto project is somewhat different, since it is not a City of Toronto smart city initiative. Rather, it is the tri-governmental entity Waterfront Toronto that has been given the lead governance role. This has proved challenging: while Waterfront Toronto has a public-oriented mandate, it is not a democratically elected body, and its core mission is to oversee the transformation of specific brownfield lands into viable communities. This is important to keep in mind in thinking about governance issues. Waterfront Toronto has had to build public engagement into its governance framework in ways that are different from a municipal government. Federal and provincial privacy commissioners and representatives from federal and provincial governments feed into its governance, as does the DSAP, and there has been public outreach. There will also be review of and consultation on the Master Innovation and Development Plan (MIDP) once it is publicly released. But this is a different model from city government, and it may set the project apart in important ways from other smart city initiatives in Canada and around the world.

Setting aside for a moment the smart cities governance issue, let’s discuss data governance. The two are related – especially with respect to the issue of what data is collected in the smart city and for what purposes.

Broadly speaking, data governance goes to the question of how data will be stewarded (and by whom) and for what purposes. Data governance is about managing data. As such, it is not a new concept. Data governance is a well-established practice in both private and public sector contexts. Most commonly it takes place within a single organization, which develops practices and protocols to manage its existing and future data. Governance issues include considering who is responsible for the data, who is entitled to set the rules for access to and reuse of it, how those rules will be set, and who will profit or benefit from the data and on what terms. It also includes addressing issues such as data security, standards, interoperability, and localization. Where the data include personal information, compliance with privacy laws is an aspect of data governance. But governance is not limited to compliance – for example, an organization may adopt higher standards than those required by privacy law, or may develop novel approaches to managing and protecting personal information.

There are many different data governance models. Some (particularly in the public sector) are shaped by legislation, regulations and government policies. Others may be structured by internal policies, standards, industry practice, and private law instruments such as contracts or trusts. As the term is commonly used, data governance does not necessarily implicate citizen involvement or participation in the same way as “smart city governance” does – it is the “city” part of “smart city governance” that brings into focus democratic principles of transparency, accountability, engagement and so on. However, where there is a public sector dimension to the collection or control of data, then public sector laws, including those relating to transparency and accountability, may apply.

With the rise of the data economy, data sharing is becoming an important activity for both public and private sector actors. As a result, new models of data governance are needed to facilitate data sharing. There are many different benefits that flow from data sharing. It may be carried out for financial gain, or it may be done to foster innovation, enable new insights, stimulate the economy, increase transparency, solve thorny problems, and so on. There are also different possible beneficiaries. Data may be shared amongst a group of entities, each of which will find advantages in the mutual pooling of their data resources. Or it may be shared broadly in the hope of generating new data-based solutions to existing problems. In some cases, data sharing has a profit motive. The diversity of actors, beneficiaries, and motivations makes it necessary to find multiple, diverse and flexible frameworks and principles to guide data sharing arrangements.

Open government data regimes are an important example of a data governance model for data sharing. Many governments have decided that opening government data is a significant public policy goal, and have done a tremendous amount of work to create the infrastructure not just for sharing data, but for doing so in a useful, accessible and appropriate manner. This means the development of standards for data and metadata, and the development of portals and search functions. It has meant paying attention to issues of interoperability. It has also required governments to consider how best to protect privacy and confidential information, or information that might impact on security issues. Once open, the sharing frameworks are relatively straightforward: open data portals typically offer data to anyone, with no registration requirement, under a simple open licence.

Governments are not the only ones developing open data portals – research institutions are increasingly searching for ways in which to publicly share research outputs including publications and data. Some research data infrastructures support sharing, but not necessarily on fully open terms – this requires another level of consideration as to the policy reasons for limiting access, how to limit access effectively, and how to set and ensure respect for appropriate limits on reuse.

The concept of a data trust has also received considerable attention as a means of data sharing. The term data trust is now so widely and freely used that it does not have a precise meaning. In its publication “What is a Data Trust?”, the Open Data Institute (ODI) identifies at least five different concepts of a data trust, and provides examples of each:

· A data trust as a repeatable framework of terms and mechanisms.

· A data trust as a mutual organisation.

· A data trust as a legal structure.

· A data trust as a store of data.

· A data trust as public oversight of data access.

The diversity of “data trusts” means that there are a growing number of models to study and consider. However, it also makes it a little dangerous to talk about “data trust” as if it has a precise meaning. With data trusts, the devil is very much in the details. If Sidewalk Labs is to propose a ‘data trust’ for the management of data gathered in the Sidewalk Toronto development, then it will be important to probe into exactly what the term means in this context.

What Sidewalk Labs is proposing is a particular vision of a data trust as a data governance model for data sharing in a smart city development. It is admittedly a work in progress, and it has some fairly particular characteristics. For example, not only is it a framework to set the parameters for sharing the subset of data (defined by Sidewalk Labs as “urban data”) collected through the project, it also contemplates providing governance for any proposals by third parties who might want to engage in the collection of new kinds, categories or volumes of data.

In thinking about the proposed ‘trust’, some questions I would suggest considering are:

1) What is the relationship between the proposed trust and the vision for smart city governance? In other words, to what extent are the public and/or public sector decision-makers engaged in determining what data will be governed by the trust, for whose benefit, and on what terms sharing will take place?

2) A data governance model does not make up for the absence of robust smart city governance up front (in identifying the problems to be solved, the data to be collected to solve them, etc.). If this piece is missing, then discussion of the trust may involve discussing the governance of data where there is no group consensus or input as to its collection. How should this be done (if at all)?

3) A data governance model can be created for the data of a single entity (e.g. an open government portal, or a data governance framework for a corporation); but it can also be developed to facilitate data sharing between entities, or even between a group of entities and a broader public. So an important question in the Sidewalk Toronto context is: what model is this? Is this Sidewalk Labs data that is being shared? Or is it Waterfront’s? Or the City’s? Who has custody/control or ownership of the data that will be governed by the ‘trust’?

4) Data governance is crucial with respect to all data held by an entity. Not all data collected through the Sidewalk Toronto project will fall within Sidewalk’s definition of “urban data” (for which the ‘trust’ is proposed). If the data governance model under consideration only deals with a subset of data, then there must be some form of data governance for the larger set. What is it? And who determines its parameters?

The following is a copy of remarks I made in an appearance before the Standing Senate Committee on Banking, Trade and Commerce, on November 21, 2018. The remarks are about proposed amendments to the Trade-marks Act found in the omnibus Bill C-86. I realize that not everyone is as interested in official marks as I am. To get a sense of what the fuss is all about, you can have a look at my posts about the overprotection of Olympic Marks, the dispute over the Spirit Bear mark, the struggle to register a trademark in the face of a wrongly granted (and since abandoned) official mark, Canada Post’s official marks for POSTAL CODE and CODE POSTAL, and a previous private member’s bill to reform official marks.

Canada’s official marks regime has long been criticized by lawyers, academics and the Federal Court. In fact, it is the Federal Court that has, over the years, created some much needed boundaries for these “super marks”. The problems with official marks are well known, but they have largely been ignored by Parliament. It is therefore refreshing to see the proposed changes in ss. 215 and 216 of Bill C-86.

Sections 215 and 216 address only one of the problems with the official marks regime. Although it is an important one, it is worth noting that there is more that could be done. The goal of my remarks will be to identify what I see as two shortfalls of ss 215 and 216.

Official marks are a subcategory of prohibited marks, which may not be adopted, used or registered unless consent is provided. They are available to “public authorities”. A public authority need only ask the Registrar of Trademarks to give public notice of its adoption and use of an official mark for that mark to be protected. There are no limits to what can be adopted. There are no registration formalities, no examination or opposition proceedings. Until the very recent decision of the Federal Court in Quality Program Services Inc. v. Canada, it seemed nothing prevented a public authority from obtaining an official mark that was identical to or confusing with an already registered trademark. While Quality Program Services at least provides consequences for adopting a confusing official mark, it is currently under appeal and it is not certain that the decision will be upheld. This is another instance of the Federal Court trying to set boundaries for official marks that simply have not been set in the legislation.

Official marks are theoretically perpetual in duration. They remain on the register until they are either voluntarily withdrawn by the ‘owner’ (and owners rarely think to do this), or until a successful (and costly) action for judicial review results in one being struck from the Register. Until the Ontario Association of Architects decision in 2002 tightened up the meaning of ‘public authority’, official marks were handed out like Halloween candy, and many entities that were not ‘public authorities’ were able to obtain official marks. Many of these erroneously-issued official marks continue to exist today; in fact, the Register of Trademarks has become cluttered with official marks that are either invalid or no longer in use.

Sections 215 and 216 address at least part of this last problem. They provide an administrative process through which either the Registrar or any person prepared to pay the prescribed fee can have an official mark invalidated if the entity that obtained the mark “is not a public authority or no longer exists.” This is a good thing. I would, however, suggest one amendment to the proposed new s. 9(4). Where it is the case (as per the new s. 9(3)) that the entity that obtained the official mark was not a public authority or has ceased to exist, s. 9(4) allows the Registrar to give public notice that subparagraph (1)(n)(iii) “does not apply with respect to the badge, crest, emblem or mark”. As it is currently worded, this is permissive: the Registrar “may” give public notice of non-application. In my view, it should read:

(4) In the circumstances set out in subsection (3), the Registrar may, on his or her own initiative or shall, at the request of a person who pays a prescribed fee, give public notice that subparagraph (1)‍(n)‍(iii) does not apply with respect to the badge, crest, emblem or mark.

There is no reason why a person who has paid a fee and established the invalidity of an official mark should not be entitled to have the Registrar give public notice of this.

I would also suggest that the process for invalidating official marks should extend to those that have not been used within the preceding three years – in other words, something parallel to s. 45 of the Trade-marks Act which provides an administrative procedure to remove unused registered trademarks from the Register. There are hundreds of ‘public authorities’ at federal and provincial levels across Canada, and they adopt official marks for all sorts of programs and initiatives, many of which are relatively transient. There should be a means by which official marks can simply be cleared from the Register when they are no longer used. Thus, I would recommend adding new subsections 9(5) and (6) to the effect that:

(5) The Registrar may at any time – and, at the written request of a person who pays a prescribed fee, made after three years from the date that public notice was given of an official mark, shall, unless the Registrar sees good reason to the contrary – give notice to the public authority requiring it to furnish, within three months, an affidavit or a statutory declaration showing that the official mark was in use at any time during the three year period immediately preceding the date of the notice and, if not, the date when it was last so in use and the reason for the absence of such use since that date.

(6) Where, by reason of the evidence furnished to the Registrar or the failure to furnish any evidence, it appears to the Registrar that an official mark was not used at any time during the three year period immediately preceding the date of the notice and that the absence of use has not been due to special circumstances that excuse it, the Registrar shall give public notice that subparagraph (1)‍(n)‍(iii) does not apply with respect to the badge, crest, emblem or mark.

These are my comments on changes to the official marks regime that most closely relate to the amendments in Bill C-86. The regime has other deficiencies which I would be happy to discuss.

