Teresa Scassa - Blog

Teresa Scassa

Monday, 25 July 2022 06:34

Bill C-27 and Children’s Privacy

Note: This is the fifth in a series of posts on Canada's Bill C-27 which, among other things, will reform Canada's private sector data protection law.

Bill C-27, the bill to amend Canada’s existing private sector data protection law, gives particular attention to the privacy rights of minors in a few instances. This is different from the current law, and it is a change since the previous (failed) reform bill, Bill C-11. The additions to Bill C-27 respond to concerns raised by privacy advocates and scholars regarding Bill C-11’s silence on children’s privacy.

Directly addressing children’s privacy has been a bit of a struggle for this government, which seems particularly sensitive to federal-provincial division of powers issues. After all, it is the provinces that get to determine the age of majority. A private sector data protection law that defined a child in terms of a particular age range for the purposes of consent, for example, might raise constitutional hackles. Further, many of the privacy issues that concern parents the most are ones that fall at least to some extent within provincial jurisdiction. Consider the issues around children’s privacy and educational technologies used in schools. While many of those technologies are sourced from the private sector, the schools themselves are subject to provincial public sector data protection laws, and so, the schools’ adoption and use of these technologies is governed by provincial legislation. That said, children still spend a great deal of time online; their toys are increasingly connected to the Internet of Things; their devices and accompanying apps capture and transmit all manner of data; and they, their parents and friends post innumerable pictures, videos and anecdotes about them online. Children have a clear interest in private sector data protection.

The government’s modest response to concerns about children’s privacy in Bill C-27 no doubt reflects this constitutional anxiety. The most significant provision is found in s. 2(2), which states that “For the purposes of this Act, the personal information of minors is considered to be sensitive information.” Note that the reference is to ‘minors’ and not ‘children’, and no attempt is made to define the age of majority.

If you search Bill C-27 for further references to minors, you will find few. Two important ones are found in s. 55, which deals with the right of erasure. This right, which allows an individual to request the deletion of their data, has a number of significant exceptions. However, two of these exceptions do not apply in the case of the data of minors (see my post on the right of erasure). The first allows an organization to deny a request for erasure if "the disposal of the information would have an undue adverse impact on the accuracy or integrity of information that is necessary to the ongoing provision of a product or service to the individual in question". The second allows an organization to deny a request for deletion if the data is subject to a data retention policy. Neither exception applies in the case of the data of minors. This is important, as it will allow minors (or those acting on their behalf) to obtain deletion of data – even outside the organization's regular disposal schedule.

The Personal Information Protection and Electronic Documents Act currently links valid consent to a person's capacity to understand "the nature, purpose and consequences of the collection, use or disclosure of the personal information to which they are consenting" (s. 6.1). Bill C-11 would have eliminated this requirement for valid consent. Responding to criticisms, the government has added in Bill C-27 a requirement that consent must be sought "in plain language that an individual to whom the organization's activities are directed would reasonably be expected to understand" (s. 15(4)). It is good to see this element returned to the reform bill, even if it is a little half-hearted compared to PIPEDA's s. 6.1. In this regard, Bill C-27 is an improvement over C-11. (See my post on consent in Bill C-27.)

Although no other provisions are drafted specifically for minors, declaring that the personal information of minors is considered 'sensitive' is significant in a Bill that requires organizations to give particular attention to the sensitivity of personal data in a range of circumstances. For example, an organization's overall privacy management program must take into account both the volume and sensitivity of the information that the organization collects (s. 9(2)). The core normative principle in the legislation, which limits the collection, use and disclosure of personal information to that which a reasonable person would consider appropriate in the circumstances, also requires a consideration of the sensitivity of personal data (s. 12(2)(a)). In determining whether an organization can rely upon implied consent, the sensitivity of the information is a relevant factor (s. 15(5)). Organizations, in setting data retention limits, must take into account, among other things, the sensitivity of personal data (s. 53(2)), and they must provide transparency with respect to those retention periods (s. 62(2)(e)). The security safeguards developed for personal data must take into account its sensitivity (s. 57(1)). When there is a data breach, the obligation to report the breach to the Commissioner depends upon a real risk of significant harm – one of the factors in assessing such a risk is the sensitivity of the personal data (s. 58(8)). When data are de-identified, the measures used for de-identification must take into account the sensitivity of the data, and the Commissioner, in exercising his powers, duties or functions, must also consider the sensitivity of the personal data dealt with by an organization (s. 109).

The characterization of the data of minors as ‘sensitive’ means that the personal data of children – no matter what it is – will be treated as sensitive data in the interpretation and application of the law. In practical terms, this is not new. The Office of the Privacy Commissioner has consistently treated the personal data of children as sensitive. However, it does not hurt to make this approach explicit in the law. In addition, the right of erasure for minors is an improvement over both PIPEDA and Bill C-11. Overall, then, Bill C-27 offers some enhancement to the data protection rights of minors.

As part of my series on Bill C-27, I will be writing about both the proposed amendments to Canada's private sector data protection law and the part of the Bill that will create a new Artificial Intelligence and Data Act (AIDA). So far, I have been writing about privacy, and my posts on consent, de-identification, data-for-good, and the right of erasure are already available. Posts on AIDA will follow, although I still have a bit more territory on privacy to cover first. In the meantime, as a teaser, perhaps you might be interested in playing a bit of statutory MadLibs…

Have you ever played MadLibs? It’s a paper-and-pencil game where someone asks the people in the room to supply a verb, noun, adverb, adjective, or body part, and the provided words are used to fill in the blanks in a story. The results are often absurd and sometimes hilarious.

The federal government's proposal in Bill C-27 for an Artificial Intelligence and Data Act really lends itself to a game of statutory MadLibs. This is because some of the most important parts of the bill are effectively left blank – either the Minister or the Governor-in-Council is tasked in the Bill with filling out the details in regulations. Do you want to play? Grab a pencil, and here goes:

Company X is developing an AI system that will (insert definition of 'high impact system'). It knows that this system is high impact because (insert how a company should assess impact). Company X has established measures to mitigate potential harms by (insert measures the company took to comply with the regulations) and has also recorded (insert records it kept), and published (insert information to be published).

Company X also had its system audited by an auditor who is (insert qualifications). Company X is being careful, because if it doesn’t comply with (insert a section of the Act for which non-compliance will count as a violation), it could be found to have committed a (insert degree of severity) violation. This could lead to (insert type of proceeding).

Company X, though, will be able to rely on (insert possible defence). However, if (insert possible defence) is unsuccessful, Company X may be liable to pay an Administrative Monetary Penalty if they are a (insert category of ‘person’) and if they have (insert factors to take into account). Ultimately, if they are unhappy with the outcome, they can launch a (insert a type of appeal proceeding).

Because of this regulatory scheme, Canadians can feel (insert emotion) at how their rights and interests are protected.

Bill C-27, which will amend Canada’s private sector data protection law, contains a right of erasure. In its basic form, this right allows individuals to ask an organization to dispose of the personal information it holds about them. It is sometimes referred to as the right to be forgotten, although the right to be forgotten has different dimensions that are not addressed in Bill C-27. Bill C-27’s predecessor, Bill C-11, had proposed a right of erasure in fairly guarded terms: individuals would be able to request the disposal only of information that the organization had obtained from the individual. This right would not have extended to information the organization had collected through other means – by acquiring that information from other organizations, scraping it from the internet, or even creating it through profiling algorithms. Section 55 of Bill C-27 (“disposal at individual’s request”) brings some interesting changes to this limitation. Significantly, it extends the right of erasure to the individual’s personal information that “is under the organization’s control”. Nevertheless, in doing so, it also adds some notable restrictions.

First, Bill C-27’s right of erasure will only apply in three circumstances. The first, set out in s. 55(1)(a), is where the information was collected, used or disclosed in contravention of the Act. Basically, if an organization had no right to have or use the personal data in the first place, it must dispose of the information at the request of the individual.

The second situation, set out in s. 55(1)(b), is where an individual has withdrawn their consent to the collection, use or disclosure of the information held by the organization. Perhaps a person agreed to allow an organization to collect certain data in addition to the data considered necessary to providing a particular product or service. If that person decides they no longer want the organization to collect this additional data, not only can they withdraw consent to its continued collection, they can exercise their right to erasure and have the already-collected data deleted.

Finally, s. 55(1)(c) allows an individual to request deletion of personal data where the information is no longer necessary for the continued provision of a product or service requested by the individual. If an individual ceases to do business with an organization, for example, and does not wish the organization to retain their personal information, they can request its deletion. Here, the expansion of the right to include all personal information under the organization's control can be important. For example, if you terminate your contract with a streaming service, you could request deletion not just of the customer data you provided and your viewing history, but also of the organization's inexplicable profile of you as someone who loves zombie movies.

Where an organization has acceded to a request for disposal of personal data, it is also obliged, under s. 55(4), to inform “any service provider” to which it has transferred the data to dispose of them. The organization is responsible for ensuring this takes place. Note, however, that the obligation is only to inform any service provider, defined in the bill as an entity that “provides services for or on behalf” of the organization to assist it in fulfilling its purposes. The obligation to notify does not extend to those to whom the data may have been sold.

There are, however, important exceptions to this expanded right of erasure. Subsection 55(2) would allow an organization to refuse to dispose of data under s. 55(1)(b) or (c) in circumstances where it is inseparable from the personal data of another person (for example, that embarrassing photo of you partying with others that someone else posted online); other legal requirements require the organization to retain the information; or the organization requires the data for a legal defence or legal remedy.

A few other exceptions are potentially more problematic. Paragraph 55(2)(d) creates an exception to the right of erasure where:

(d) the information is not in relation to a minor and the disposal of the information would have an undue adverse impact on the accuracy or integrity of information that is necessary to the ongoing provision of a product or service to the individual in question;

For example, this might apply in the case where an individual remains in a commercial relationship with an organization, but has withdrawn consent to a particular use or disclosure of their data and has requested its deletion. If the organization believes that deleting the information would adversely affect the integrity of the product or service it continues to provide to the individual, it can refuse deletion. It will be interesting to see how this plays out. Whether deletion would impair the integrity of the product or service being supplied may be a matter of opinion. If an individual finds an organization's recommendation service based on past purchases or views to be largely useless, seeking deletion of data about their viewing history will not impact the integrity of the service from the individual's point of view – but the organization might have a different opinion.

In Bill C-27, the government responded to criticisms that its predecessor, Bill C-11, did nothing to specifically deal with children’s privacy. Bill C-27 addresses the privacy of minors in specific instances, and the right of erasure is one of them. Interestingly, the right of erasure prevails under s. 55(2)(d) for minors, presumably even when the erasure would have an “undue adverse impact on the accuracy or integrity of information that is necessary to the ongoing provision of a product or service”. It seems that minors will get to choose between deletion and adverse impacts, while those over the age of majority will have to put up with retention and uses of their personal data to which they object.

Another exception to the right also applies only to those past the age of majority. Paragraph 55(2)(f) provides that an organization may refuse a request for disposal of personal information if:

(f) the information is not in relation to a minor and it is scheduled to be disposed of in accordance with the organization’s information retention policy, and the organization informs the individual of the remaining period of time for which the information will be retained.

What this means is that if an organization has a retention policy that conforms to s. 53 of Bill C-27 (one that provides for the destruction of personal information once it is no longer necessary for the purposes for which it was collected, used or disclosed), then it can refuse a request for erasure – unless, of course, it is a minor who requests erasure. In that case, they must act in advance of the normal disposal schedule. This provision was no doubt added to save organizations from the burden of having to constantly respond to requests for erasure of personal data. For large swathes of personal data, for example, they can prepare a standard response that informs a requestor of their retention policy and provides the timetable on which the data will be deleted once it is no longer necessary to fulfill the purposes for which it was collected. If this provision can also be relied upon when an individual ceases to do business with an organization and requests the deletion of their information, then the right of erasure in Bill C-27 will become effectively useless in the case of any company with a data retention policy. Except, of course, for minors.

Finally, organizations will be given the right to refuse to consider requests for deletion that are "vexatious or made in bad faith". Let's hit pause here. This exception is there to protect commercial entities against data subjects. I understand that organizations do not want to be subject to mass campaigns for data deletion – or serial requests by individuals – that overwhelm them. That might happen. However, the standard form email that will be part of the 'regular deletion schedule' exception discussed above will largely suffice to address this problem. Organizations now have enormous abilities to collect massive amounts of personal data and to use these data for a wide variety of purposes. Many do this responsibly, but there are endless examples of overcollection, over-retention, excessive sharing, poor security, and outright abuses of personal data. The right of erasure is a new right for individuals to help them exercise greater control over their personal data in a context in which such data are often flagrantly misused. To limit this right based on what an organization considers vexatious is a demonstration of how the balance in Bill C-27 leans towards the free flow and use of personal data rather than the protection of privacy.

It is important to note that there is yet another limit on the right of erasure, which is found in Bill C-27's definition of 'dispose'. According to this definition, dispose means "to permanently and irreversibly delete personal information or to anonymize it". Thus, an organization can choose to anonymize personal data, and once it has done so, the right of erasure is not available. (See my post on anonymized and de-identified data for what 'anonymized' means.) Section 2(3) of Bill C-27 also removes the right of erasure where information is merely de-identified (pseudonymized). This seems like an internal contradiction in the legislation. Disposal means deletion or rigorous anonymization – but, under s. 2(3), a company can just pseudonymize to avoid a request for disposal. The difference seems to be that pseudonymized data may still eventually need to be disposed of under data retention limits, whereas anonymized data can be kept forever.

All told, as a right that is meant to give more control to individuals, the right of erasure in Bill C-27 is a bit of a bust. Although it allows an individual to ask an organization to delete data (and not just data that the individual provided), the right is countered by a great many bases on which the organization can avoid it. It’s a bit of a ‘Canadian compromise’ (one of the ones in which Canadians get compromised): individuals get a new right; organizations get to side-step it.


[Note: This is my third in a series of posts on the new Bill C-27 which will reform private sector data protection law in Canada and which will add a new Artificial Intelligence and Data Act. The previous two posts addressed consent and de-identification/anonymization.]

In 2018 a furore erupted over media reports that Statistics Canada (StatCan) sought to collect the financial data of half a million Canadians from Canadian banks to generate statistical data. Reports also revealed that it had already collected a substantial volume of personal financial data from credit agencies. The revelations led to complaints to the Privacy Commissioner, who carried out an investigation and issued an interim and a final report. One outcome was that StatCan worked with the Office of the Privacy Commissioner of Canada to develop a new approach to the collection of such data. Much more recently, there were expressions of public outrage when media reported that the Public Health Agency of Canada (PHAC) had acquired de-identified mobility data about Canadians from Telus in order to inform its response to the COVID-19 pandemic. This led to hearings before the ETHI Standing Committee of the House of Commons, and resulted in a report with a series of recommendations.

Both of these instances involved attempts by government institutions or agencies to make use of existing private sector data to enhance their analyses or decision-making. Good policy is built on good data; we should support and encourage the responsible use of data by government in its decision-making. At the same time, however, there is clearly a deep vein of public distrust in government – particularly when it comes to personal data – that cannot be ignored. Addressing this distrust requires both transparency and strong protection for privacy.

Bill C-27, introduced in Parliament in June 2022, proposes a new Consumer Privacy Protection Act to replace the aging Personal Information Protection and Electronic Documents Act (PIPEDA). As part of the reform, this private sector data protection bill contains provisions that are tailored to address the need of government – as well as the commercial data industry – to access personal data in the hands of the private sector.

Two provisions in C-27 are particularly relevant here: sections 35 and 39. Section 35 deals specifically with the sharing of private sector data for statistical and research purposes; s. 7(3)(f) of PIPEDA contains a similar exception. Section 39, which deals with the use of data for "socially beneficial purposes", is entirely new. Both s. 35 and s. 39 appeared in C-27's predecessor, Bill C-11. Only section 35 has been changed since C-11 – and a small change significantly broadens its scope.

Section 35 of Bill C-27 provides:

35 An organization may disclose an individual’s personal information without their knowledge or consent if

(a) the disclosure is made for statistical purposes or for study or research purposes and those purposes cannot be achieved without disclosing the information;

(b) it is impracticable to obtain consent; and

(c) the organization informs the Commissioner of the disclosure before the information is disclosed.

This provision would enable the kind of data sharing by the private sector that was involved in the StatCan example mentioned above, and that was previously enabled by s. 7(3)(f) of PIPEDA. As is currently the case under PIPEDA, s. 35 would allow for the sharing of personal information without an individual's knowledge or consent. It is important to note that there is no requirement that the personal information be de-identified or anonymized in any way (see my earlier post on de-identification and anonymization here). The remainder of s. 35 imposes the only limitations on such sharing. One of these relates to purpose. The sharing must be for "statistical purposes" (but note that StatCan is not the only organization that engages in statistical activities, and such sharing is not limited to StatCan). It can also be for "study or research purposes". Bill C-11, like PIPEDA, had referred to "scholarly study or research purposes". The removal of 'scholarly' substantially enlarges the scope of this provision (for example, market research and voter profile research would no doubt count). There is a further qualifier – the statistical, study, or research purposes have to be ones that "cannot be achieved without disclosing the information". However, they do not have to be 'socially beneficial' (although there is an overarching provision in s. 5 that requires that the purposes for collecting, using or disclosing personal information be ones that a 'reasonable person would consider appropriate in the circumstances'). Section 35(b) (as is the case under PIPEDA's s. 7(3)(f)) also requires that it be impracticable to obtain consent. This is not much of a barrier: if you want to use the data of half a million individuals, it is simply not practical to seek their consent. Finally, the organization must inform the Commissioner of the disclosure prior to it taking place. This provides a thin film of transparency. Another nod and a wink to transparency is found in s. 62(2)(b), which requires organizations to provide a 'general account' of how they apply "the exceptions to the requirement to obtain an individual's consent under this Act".

Quebec’s Loi 25 also addresses the use of personal information in the hands of the private sector for statistical and research purposes without individual consent. Unlike Bill C-27, it contains more substantive guardrails:

21. A person carrying on an enterprise may communicate personal information without the consent of the persons concerned to a person or body wishing to use the information for study or research purposes or for the production of statistics.

The information may be communicated if a privacy impact assessment concludes that

(1) the objective of the study or research or of the production of statistics can be achieved only if the information is communicated in a form allowing the persons concerned to be identified;

(2) it is unreasonable to require the person or body to obtain the consent of the persons concerned;

(3) the objective of the study or research or of the production of statistics outweighs, with regard to the public interest, the impact of communicating and using the information on the privacy of the persons concerned;

(4) the personal information is used in such a manner as to ensure confidentiality; and

(5) only the necessary information is communicated.

The requirement of a privacy impact assessment (PIA) in Loi 25 is important, as is the condition that this assessment consider the goals of the research or statistical activity in relation to the public interest and to the impact on individuals. Loi 25 also contains important limitations on how much information is shared. Bill C-27 addresses none of these issues. At the very least, as is the case under Quebec law, there should be a requirement to conduct a PIA with similar considerations – and to share it with the Privacy Commissioner. Since this is data sharing without knowledge or consent, there could even be a requirement that the PIAs be made publicly available, with appropriate redactions if necessary.

Some might object that there is no need to incorporate these safeguards in the new private sector data protection law since those entities (such as StatCan) who receive the data have their own secure policies and practices in place to protect data. However, under s. 35 there is no restriction on who may receive data for statistical, study or research purposes, and no reason to assume that they have appropriate safeguards in place. If they do, then the PIA can reflect this.

Section 39 addresses the sharing of de-identified personal information for socially beneficial purposes. Presumably, this would be the provision under which, in the future, mobility data might be shared with an agency such as PHAC. Under s. 39:

39 (1) An organization may disclose an individual’s personal information without their knowledge or consent if

(a) the personal information is de-identified before the disclosure is made;

(b) the disclosure is made to

(i) a government institution or part of a government institution in Canada,

(ii) a health care institution, post-secondary educational institution or public library in Canada,

(iii) any organization that is mandated, under a federal or provincial law or by contract with a government institution or part of a government institution in Canada, to carry out a socially beneficial purpose, or

(iv) any other prescribed entity; and

(c) the disclosure is made for a socially beneficial purpose.

(2) For the purpose of this section, socially beneficial purpose means a purpose related to health, the provision or improvement of public amenities or infrastructure, the protection of the environment or any other prescribed purpose.

This provision requires that shared information must be de-identified, although, as noted in my earlier post, de-identification in Bill C-27 no longer means what it did in C-11. The data shared may have only direct identifiers removed, leaving individuals easily identifiable. The disclosure must be for socially beneficial purposes, and it must be to a specified or prescribed entity. I commented on the identical provision in C-11 here, so I will not repeat those earlier concerns in detail. They remain unaddressed in Bill C-27. The most significant gap is the lack of a requirement for a data governance agreement to be in place between the parties based upon the kinds of considerations that would be relevant in a privacy impact assessment.

Where the sharing is to be with a federal government institution, the Privacy Act should provide additional protection. However, the Privacy Act is itself an antediluvian statute that has long been in need of reform. It is worth noting that while the doors to data sharing are opened in Bill C-27, many of the necessary safeguards – at least where government is concerned – are left for another statute in the hands of another department, and that lies who-knows-where in the government’s legislative agenda (although rumours are that we might see a Bill this fall [Warning: holding your breath could be harmful to your health.]). In its report on the sharing of mobility data with PHAC, ETHI calls for much greater transparency about data use on the part of the Government of Canada, and also calls for enhanced consultation with the Privacy Commissioner prior to engaging in this form of data collection. Apart from the fact that these pieces will not be in place – if at all – until the Privacy Act is reformed, the exceptions in sections 35 and 39 of C-27 apply to organizations and institutions outside the federal government, and thus, can involve institutions and entities not subject to the Privacy Act. Guardrails should be included in C-27 (as they are, for example, in Loi 25); yet, they are absent.

As noted earlier, there are sound reasons to facilitate the use of personal data to aid in data-driven decision-making that serves the public interest. However, any such use must protect individual privacy. Beyond this, there is also a collective privacy dimension to the sharing of even anonymized human-derived data. This should also not be ignored. It requires greater transparency and public engagement, along with appropriate oversight by the Privacy Commissioner. Bill C-27 facilitates use without adequately protecting privacy – collective or individual. Given the already evident lack of trust in government, this seems either tone-deaf or deeply cynical.


This is the second post in a series on Bill C-27, a bill introduced in Parliament in June 2022 to reform Canada's private sector data protection law. The first post, on consent provisions, is found here.

In a data-driven economy, data protection laws are essential to protect privacy. In Canada, the proposed Consumer Privacy Protection Act in Bill C-27 will, if passed, replace the aging Personal Information Protection and Electronic Documents Act (PIPEDA) to govern the collection, use and disclosure of personal information by private sector organizations. Personal information is defined in Bill C-27 (as it was in PIPEDA) as “information about an identifiable individual”. The concept of identifiability of individuals from information has always been an important threshold issue for the application of the law. According to established case law, if an individual can be identified directly or indirectly from data, alone or in combination with other available data, then those data are personal information. Direct identification comes from the presence of unique identifiers that point to specific individuals (for example, a name or a social insurance number). Indirect identifiers are data that, if combined with other available data, can lead to the identification of individuals. To give a simple example, a postal code on its own is not a direct identifier of any particular individual, but in a data set with other data elements such as age and gender, a postal code can lead to the identification of a specific individual. In the context of that larger data set, the postal code can constitute personal information.
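To make the postal code example concrete, here is a minimal sketch in Python (with entirely hypothetical data and field names) of how indirect identifiers, none of which is identifying on its own, can combine to single out one individual in a dataset:

# Hypothetical records released without names: each field alone seems innocuous.
records = [
    {"postal_code": "K1A 0A6", "age": 34, "gender": "F", "diagnosis": "asthma"},
    {"postal_code": "K1A 0A6", "age": 61, "gender": "M", "diagnosis": "diabetes"},
    {"postal_code": "M5V 2T6", "age": 34, "gender": "F", "diagnosis": "migraine"},
]

def matching(records, **quasi_identifiers):
    """Return the records that match every supplied quasi-identifier."""
    return [r for r in records
            if all(r.get(k) == v for k, v in quasi_identifiers.items())]

# Someone who knows only that their neighbour is a 34-year-old woman in
# postal code K1A 0A6 can check how many records fit that description.
candidates = matching(records, postal_code="K1A 0A6", age=34, gender="F")
if len(candidates) == 1:
    # A unique match: the combination of 'non-identifying' fields has
    # identified the individual and exposed the sensitive attribute.
    print("Re-identified; diagnosis:", candidates[0]["diagnosis"])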

As the desire to access and use more data has grown in the private (and public) sector, the concepts of de-identification and anonymization have become increasingly important in dealing with personal data that have already been collected by organizations. The removal of both direct and indirect identifiers from personal data can protect privacy in significant ways. PIPEDA did not define ‘de-identify’, nor did it create particular rules around the use or disclosure of de-identified information. Bill C-11, the predecessor to C-27, addressed de-identified personal information, and contained the following definition:

de-identify means to modify personal information — or create information from personal information — by using technical processes to ensure that the information does not identify an individual or could not be used in reasonably foreseeable circumstances, alone or in combination with other information, to identify an individual

This definition was quite inclusive (information created from personal information, for example, would include synthetic data). Bill C-11 set a relative standard for de-identification – in other words, it accepted that de-identification was sufficient if the information could not be used to identify individuals “in reasonably foreseeable circumstances”. This was reinforced by s. 74 which required organizations that de-identified personal information to use measures that were proportionate to the sensitivity of the information and the way in which the information was to be used. De-identification did not have to be perfect – but it had to be sufficient for the context.

Bill C-11’s definition of de-identification was criticized by private sector organizations that wanted de-identified data to fall outside the scope of the Act. In other words, they sought either an exemption from the application of the law for de-identified personal information, or a separate category of “anonymized” data that would be exempt from the law. According to this view, if data cannot be linked to an identifiable individual, then they are not personal data and should not be subject to data protection law. For their part, privacy advocates were concerned about the very real re-identification risks, particularly in a context in which there is a near endless supply of data and vast computing power through which re-identification can take place. These concerns are supported by research (see also here and here). The former federal Privacy Commissioner recommended that it be made explicit that the legislation would apply to de-identified data.

The changes in Bill C-27 reflect the power of the industry lobby on this issue. Bill C-27 creates separate definitions for anonymized and de-identified data. These are:

anonymize means to irreversibly and permanently modify personal information, in accordance with generally accepted best practices, to ensure that no individual can be identified from the information, whether directly or indirectly, by any means.

[. . .]

de-identify means to modify personal information so that an individual cannot be directly identified from it, though a risk of the individual being identified remains. [my emphasis]

Organizations will therefore be pleased that there is now a separate category of “anonymized” data, although such data must be irreversibly and permanently modified to ensure that individuals are not identifiable. This is harder than it sounds; there is, even with synthetic data, for example, still some minimal risk of reidentification. An important concern, therefore, is whether the government is actually serious about this absolute standard, whether it will water it down by amendment before the bill is enacted, or whether it will let interpretation and argument around ‘generally accepted best practices’ soften it up. To ensure the integrity of this provision, the law should enable the Privacy Commissioner to play a clear role in determining what counts as anonymization.

Significantly, under Bill C-27, information that is ‘anonymized’ would be out of scope of the statute. This is made clear in a new s. 6(5) which provides that “this Act does not apply in respect of personal information that has been anonymized”. The argument to support this is that placing data that are truly anonymized out of scope of the legislation creates an incentive for industry to anonymize data, and anonymization (if irreversible and permanent) is highly privacy protective. Of course, similar incentives can be present if more tailored exceptions are created for anonymized data without it falling ‘out of scope’ of the law.

Emerging and evolving concepts of collective privacy take the view that there should be appropriate governance of the use of human-derived data, even if it has been anonymized. Another argument for keeping anonymized data in scope relates to the importance of oversight, given re-identification risks. Placing anonymized data outside the scope of data protection law is contrary to the recent recommendations of the ETHI Standing Committee of the House of Commons following its hearings into the use of de-identified private sector mobility data by the Public Health Agency of Canada. ETHI recommended that the federal laws be amended “to render these laws applicable to the collection, use, and disclosure of de-identified and aggregated data”. Aggregated data is generally considered to be data that has been anonymized. The trust issues referenced by ETHI when it comes to the use of de-identified data reinforce the growing importance of notions of collective privacy. It might therefore make sense to keep anonymized data within scope of the legislation (with appropriate exceptions to maintain incentives for anonymization) leaving room for governance of anonymization.

Bill C-27 also introduces a new definition of “de-identify”, which refers to modifying data so that individuals cannot be directly identified. Direct identification has come to mean identification through specific identifiers such as names, or assigned numbers. The new definition of ‘de-identify’ in C-27 suggests that simply removing direct identifiers will suffice to de-identify personal data (a form of what, in the GDPR, is referred to as pseudonymization). Thus, according to this definition, as long as direct identifiers are removed from a data set, an organization can use data without knowledge or consent in certain circumstances, even though specific individuals might still be identifiable from those data. While it will be argued that these circumstances are limited, the exception for sharing for ‘socially beneficial purposes’ is disturbingly broad given this weak definition (more to come on this in a future blog post). In addition, the government can add new exceptions to the list by regulation.
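For illustration only, here is a minimal sketch in Python (hypothetical fields and helper names) of what removing only direct identifiers looks like – essentially pseudonymization, since the indirect identifiers, and often a re-identification key, remain with the organization:

import secrets

# The organization may keep a key mapping pseudonyms back to the direct
# identifiers it removed - which is why this is pseudonymization, not anonymization.
key_map = {}

def strip_direct_identifiers(record):
    """Remove only direct identifiers (name, account number) and assign a pseudonym."""
    pseudonym = secrets.token_hex(8)
    key_map[pseudonym] = {"name": record["name"], "account_no": record["account_no"]}
    deidentified = {k: v for k, v in record.items() if k not in ("name", "account_no")}
    deidentified["id"] = pseudonym
    return deidentified

customer = {
    "name": "Jane Doe",          # direct identifier - removed
    "account_no": "0012345",     # direct identifier - removed
    "postal_code": "K1A 0A6",    # indirect identifiers survive the process,
    "age": 34,                   # so the individual may remain identifiable
    "viewing_history": ["zombie film", "zombie film", "documentary"],
}

print(strip_direct_identifiers(customer))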

The reference in the definition of ‘de-identify’ only to direct identification is meant to be read alongside s. 74 of Bill C-27, which provides:

74 An organization that de-identifies personal information must ensure that any technical and administrative measures applied to the information are proportionate to the purpose for which the information is de-identified and the sensitivity of the personal information.

Section 74 remains unchanged from Bill C-11, where it made more sense, since it defined de-identification in terms of direct or indirect identifiers using a relative standard. In the context of the new definition of ‘de-identify’, it is jarring, since de-identification according to the new definition requires only the removal of direct identifiers. What this, perhaps, means is that although the definition of de-identify only requires removal of direct identifiers, actual de-identification might mean something else. This is not how definitions are supposed to work.

In adopting these new definitions, the federal government sought to align its terminology with that used in Quebec's Loi 25, which reformed that province's public and private sector data protection laws. The Quebec law provides, in a new s. 23, that:

[. . .]

For the purposes of this Act, information concerning a natural person is anonymized if it is, at all times, reasonably foreseeable in the circumstances that it irreversibly no longer allows the person to be identified directly or indirectly.

Information anonymized under this Act must be anonymized according to generally accepted best practices and according to the criteria and terms determined by regulation.

Loi 25 also provides that data is de-identified (as opposed to anonymized) “if it no longer allows the person concerned to be directly identified”. At first glance, it seems that Bill C-27 has adopted similar definitions – but there are differences. First, the definition of anonymization in Loi 25 uses a relative standard (not an absolute one as in C-27). It also makes specific reference not just to generally accepted best practices, but to criteria and terms to be set out in regulation, whereas in setting standards for anonymization, C-27 refers only to “generally accepted best practices”. [Note that in its recommendations following its hearings into the use of de-identified private sector mobility data by the Public Health Agency of Canada, the ETHI Committee of Parliament recommended that federal data protection laws should include “a standard for de-identification of data or the ability for the Privacy Commissioner to certify a code of practice in this regard.”]

Second, and most importantly, in the Quebec law, anonymized data does not fall outside the scope of the legislation –instead, a relative standard is used to provide some flexibility while still protecting privacy. Anonymized data are still subject to governance under the law, even though the scope of that governance is limited. Further, under the Quebec law, recognizing that the definition of de-identification is closer to pseudonymization, the uses of de-identified data are more restricted than they are in Bill C-27.

Further, in an eye-glazing bit of drafting, s. 2(3) of Bill C-27 provides:

2(3) For the purposes of this Act, other than sections 20 and 21, subsections 22(1) and 39(1), sections 55 and 56, subsection 63(1) and sections 71, 72, 74, 75 and 116, personal information that has been de-identified is considered to be personal information.

This is a way of saying that de-identified personal information remains within the scope of the Act except where it does not. Yet, data that has only direct identifiers stripped from it should always be considered personal information, since the re-identification risk, as noted above, could be very high. What s. 2(3) does is allow de-identified data to be treated as anonymized (out of scope) in some circumstances. For example, s. 21 allows organizations to use 'de-identified' personal information for internal research purposes without knowledge or consent. The reference in s. 2(3) amplifies this by providing that such information is not considered personal information. As a result, presumably, other provisions in Bill C-27 would not apply. This might include data breach notification requirements – yet if information is only pseudonymized and there is a breach, it is not clear why such provisions should not apply. Pseudonymization might provide some protection to those affected by a breach, although it is also possible that the key was part of the breach, or that individuals remain re-identifiable in the data. The regulator should have jurisdiction.

Subsection 22(1) allows for the use and even the disclosure of de-identified personal information between parties to a prospective business transaction. In this context, the de-identified information is not considered personal information (according to s. 2(3)) and so the only safeguards are those set out in s. 22(1) itself. Bizarrely, s. 22(1) makes reference to the sensitivity of the information – requiring safeguards appropriate to its sensitivity, even though it is apparently not considered personal information. De-identified (not anonymized) personal information can also be shared without knowledge or consent for socially beneficial purposes under s. 39(1). (I have a blog post coming on this provision, so I will say no more about it here, other than to note that given the definition of 'de-identify', such sharing seems rash and the safeguards provided are inadequate.)

Section 55 provides for a right of erasure of personal information; since information stripped of direct identifiers is not personal information for the purposes of section 55 (according to s. 2(3)), this constitutes an important limitation on the right of erasure. If data are only pseudonymized, and if the organization retains the key, then why is there no right of erasure? Section 56 addresses the accuracy of personal information. Personal information de-identified according to the definition in C-27 would also be exempted from this requirement.

In adopting the definitions of 'anonymize' and 'de-identify', the federal government meets a number of public policy objectives. It enhances the ability of organizations to make use of data. It also better aligns the federal law with Quebec's law (at least at the definitional level). The definitions may also create scope for other privacy protective technologies such as pseudonymization (which is what the definition of de-identify in C-27 probably really refers to) or different types of encryption. But the approach it has adopted creates the potential for confusion, for risks to privacy, and for swathes of human-derived data to fall 'outside the scope' of data protection law. The government view may be that, once you stir all of Bill C-27's provisions into the pot and add a healthy dose of "trust us", the definition of "de-identify" and its exceptions are not as problematic as they seem at first glance. Yet this seems like a peculiar way to draft legislation. The definition should say what it is supposed to say, rather than have its defects mitigated by a smattering of other provisions in the law and by faith in the goodness of others. And, in any event, the exceptions still lean towards facilitating data use rather than protecting privacy.

In a nutshell, C-27 has downgraded the definition of de-identification from C-11. It has completely excluded from the scope of the Act anonymized data, but has provided little or no guidance beyond “generally accepted best practices” to address anonymization. If an organization claims that their data are anonymized and therefore outside of the scope of the legislation, it will be an uphill battle to get past the threshold issue of anonymization in order to have a complaint considered under what would be the new law. The organization can simply dig in and challenge the jurisdiction of the Commissioner to investigate the complaint.

All personal data, whether anonymized or ‘de-identified’ should remain within the scope of the legislation. Specific exceptions can be provided where necessary. Exceptions in the legislation for the uses of de-identified information without knowledge or consent must be carefully constrained and reinforced with safeguards. Further, the regulator should play a role in establishing standards for anonymization and de-identification. This may involve consultation and collaboration with standards-setting bodies, but references in the legislation must be to more than just “generally accepted best practices”.

Note: this is the first in a series of blog posts on Bill C-27, also known as An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act.

Bill C-27 is a revised version of the former Bill C-11 which died on the order paper just prior to the last federal election in 2021. The former Privacy Commissioner called Bill C-11 ‘a step backwards’ for privacy, and issued a series of recommendations for its reform. At the same time, industry was also critical of the Bill, arguing that it risked making the use of data for innovation too burdensome.

Bill C-27 takes steps to address the concerns of both privacy advocates and those from industry with a series of revisions, although there is much that is not changed from Bill C-11. Further, it adds an entirely new statute – the Artificial Intelligence and Data Act (AIDA) – meant to govern some forms of artificial intelligence. This series of posts will assess a number of the changes found in Bill C-27. It will also consider the AIDA.

_________________________________

The federal government has made it clear that it considers consent to be a cornerstone of Canadian data protection law. They have done so in the Digital Charter, in Bill C-11 (the one about privacy), and in the recent reincarnation of data protection reform legislation in Bill C-27. On the one hand, consent is an important means by which individuals can exercise control over their personal information; on the other hand, it is widely recognized that the consent burden has become far too high for individuals who are confronted with long, complex and often impenetrable privacy policies at every turn. At the same time, organizations that see new and emerging uses for already-collected data seek to be relieved of the burden of obtaining fresh consents. The challenge in privacy law reform has therefore been to make consent meaningful, while at the same time reducing the consent burden and enabling greater use of data by private and public sector entities. Bill C-11 received considerable criticism for how it dealt with consent (see, for example, my post here, and the former Privacy Commissioner’s recommendations to improve consent in C-11 here). Consent is back, front and centre in Bill C-27, although with some important changes.

Section 15 of Bill C-27 reaffirms that consent is the default rule for collection, use or disclosure of personal information, although the statute creates a long list of exceptions to this general rule. One criticism of Bill C-11 was that it removed the definition of consent in s. 6.1 of PIPEDA, which provided that consent “is only valid if it is reasonable to expect that an individual to whom the organization’s activities are directed would understand the nature, purpose and consequences of the collection, use or disclosure of the personal information to which they are consenting.” Instead, Bill C-11 simply relied upon a list of information that must be provided to individuals prior to consent. Bill C-27’s compromise is found in the addition of a new s. 15(4) which requires that the information provided to individuals to obtain their consent must be “in plain language that an individual to whom the organization’s activities are directed would reasonably be expected to understand.” This has the added virtue of ensuring, for example, that privacy policies for products or services directed at youth or children must take into account the sophistication of their audience. The added language is not as exigent as s. 6.1 (for example, s. 6.1 requires an understanding of the nature, purpose and consequences of the collection, use and disclosure, while s. 15(4) requires only an understanding of the language used), so it is still a downgrading of consent from the existing law. It is, nevertheless, an improvement over Bill C-11.

A modified s. 15(5) and a new s. 15(6) also muddy the consent waters. Subsection 15(5) provides that consent must be express unless it is appropriate to imply consent. The exception to this general rule is the new subsection 15(6) which provides:

(6) It is not appropriate to rely on an individual’s implied consent if their personal information is collected or used for an activity described in subsection 18(2) or (3).

Subsections 18(2) and (3) list business activities for which personal data may be collected or used without an individual's knowledge or consent. At first glance, it is unclear why it is necessary to provide that implied consent is inappropriate in such circumstances, since no consent is needed at all. However, because s. 18(1) sets out certain criteria for collection without knowledge or consent, it is likely that the goal of s. 15(6) is to ensure that no organization circumvents the limited guardrails in s. 18(1) by relying instead on implied consent. The potential breadth of s. 18(3) (discussed below), combined with s. 2(3), makes it difficult to distinguish between the two; in any event, the cautious organization will comply with s. 18(3) rather than rely on implied consent.

The list of business activities for which no knowledge or consent is required for the collection or use of personal information is pared down from that in Bill C-11. The list in C-11 was controversial, as it included some activities which were so broadly stated that they would have created gaping holes in any consent requirement (see my blog post on consent in C-11 here). The worst of these have been removed. This is a positive development, although the provision creates a backdoor through which other exceptions can be added by regulation. Further, Bill C-27 has added language to s. 12(1) to clarify that the requirement that the collection, use or disclosure of personal information must be “only in a manner and for purposes that a reasonable person would consider appropriate in the circumstances” applies “whether or not consent is required under this Act.”

[Note that although the exceptions in s. 18 are to knowledge as well as consent, s. 62(2)(b) of Bill C-27 will require that an organization provide plain language information about how it makes use of personal information, and how it relies upon exceptions to consent “including a description of any activities referred to in subsection 18(3) in which it has a legitimate interest”.]

Bill C-27 does, however, contain an entirely new exception permitting the collection or use of personal data without knowledge or consent. This is found in s. 18(3):

18 (3) An organization may collect or use an individual’s personal information without their knowledge or consent if the collection or use is made for the purpose of an activity in which the organization has a legitimate interest that outweighs any potential adverse effect on the individual resulting from that collection or use and

(a) a reasonable person would expect the collection or use for such an activity; and

(b) the personal information is not collected or used for the purpose of influencing the individual’s behaviour or decisions.

So as not to leave this as open-ended as it seems at first glance, a new s. 18(4) sets conditions precedent for the collection or use of personal information for ‘legitimate purposes’:

(4) Prior to collecting or using personal information under subsection (3), the organization must

(a) identify any potential adverse effect on the individual that is likely to result from the collection or use;

(b) identify and take reasonable measures to reduce the likelihood that the effects will occur or to mitigate or eliminate them; and

(c) comply with any prescribed requirements.

Finally, a new s. 18(5) requires the organization to keep a record of its assessment under s. 18(4) and to be prepared to provide a copy of that assessment to the Commissioner at the Commissioner's request.

It is clear that industry had the ear of the Minister when it comes to the addition of s. 18(3). A 'legitimate interest' exception was sought in order to enable the use of personal data without consent in a broader range of circumstances. Such an exception is found in the EU's General Data Protection Regulation (GDPR). Here is how it is worded in the GDPR:

6(1) Processing shall be lawful only if and to the extent that at least one of the following applies:

[. . . ]

(f) processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child.

Under the GDPR, an organization that relies upon legitimate interests instead of consent must take into account, among other things:

6(4) [. . . ]

(a) any link between the purposes for which the personal data have been collected and the purposes of the intended further processing;

(b) the context in which the personal data have been collected, in particular regarding the relationship between data subjects and the controller;

(c) the nature of the personal data, in particular whether special categories of personal data are processed, pursuant to Article 9, or whether personal data related to criminal convictions and offences are processed, pursuant to Article 10;

(d) the possible consequences of the intended further processing for data subjects;

(e) the existence of appropriate safeguards, which may include encryption or pseudonymisation.

Bill C-27’s ‘legitimate interests’ exception is different in important respects from that in the GDPR. Although Bill C-27 gives a nod to the importance of privacy as a human right in a new preamble, the human rights dimensions of privacy are not particularly evident in the body of the Bill. The ‘legitimate interests’ exception is available unless there is an “adverse effect on the individual” that is not outweighed by the organization’s legitimate interest (as opposed to the ‘interests or fundamental freedoms of the individual’ under the GDPR). Presumably it will be the organization that does this initial calculation. One of the problems in data protection law has been quantifying adverse effects on individuals. Data breaches, for example, are shocking and distressing to those impacted, but it is often difficult to show actual damages flowing from the breach, and moral damages have been considerably restricted by courts in many cases. Some courts have even found that the ordinary stress and inconvenience of a data breach are not compensable harm, since such breaches have become a routine part of life. If ‘adverse effects’ on individuals are reduced to quantifiable effects, the ‘legitimate interests’ exception will be far too broad.

This is not to say that the ‘legitimate interests’ provision in Bill C-27 is incapable of facilitating data use while at the same time protecting individuals. There is clearly an attempt to incorporate some checks and balances, such as reasonable expectations and a requirement to identify and mitigate any adverse effects. But what C-27 does is take something that, in the GDPR, was meant to be a limited exception to consent and make it a potentially more mainstream basis for the use of personal data without knowledge or consent. It is able to do this because, rather than reinforce the centrality and importance of privacy rights, it places privacy on an uneasy par with commercial interests in using personal data. The focus on ‘adverse effects’ runs the risk of equating privacy harm with quantifiable harm, thus trivializing the human and social value of privacy.

Note: The following is my response to the call for submissions on the recommendations following the third review of Canada’s Directive on Automated Decision-Making. Comments are due by June 30, 2022. If you are interested in commenting, please consult the Review Report and the Summary of Key Issues and Proposed Amendments. Comments can be submitted by e-mail.

The federal Directive on Automated Decision-Making (DADM) and its accompanying Algorithmic Impact Assessment tool (AIA) are designed to provide governance for the adoption and deployment of automated decision systems (ADS) by Canada’s federal government. Governments are increasingly looking to ADS in order to speed up routine decision-making processes and to achieve greater consistency in decision-making. At the same time, there are reasons to be cautious. Automated decision systems carry risks of incorporating and replicating discriminatory bias. They may also lack the transparency required of government decision-making, particularly where important rights or interests are at stake. The DADM, which has been in effect since April 2019 (with compliance mandatory no later than April 2020), sets out a series of obligations related to the design and deployment of automated decision-making systems. The extent of the obligations depends upon a risk assessment, and the AIA is the tool by which the level of risk of the system is assessed.

Given that this is a rapidly evolving area, the DADM provides that it will be reviewed every six months. It is now in its third review. The first two reviews led to the clarification of certain obligations in the DADM and to the development of guidelines to aid in its interpretation. This third review proposes a number of more substantive changes. This note comments on some of these changes and proposes an issue for future consideration.

Clarify and Broaden the Scope

A key recommendation in this third round of review relates to the scope of the DADM. Currently, the DADM applies only to ‘external’ services of government – in other words services offered to individuals or organizations by government. It does not apply internally. This is a significant gap when one considers the expanding use of ADS in the employment context. AI-enabled decision systems have been used in hiring processes, and they can also be used to conduct performance reviews and to make or assist in decisions about promotions and internal workforce mobility. The use of AI tools in the employment context can have significant impacts on the lives and careers of employees. It seems a glaring oversight not to include such systems in the governance regime for ADM. The review team has recommended expanding the scope of the DADM to include internal as well as external services. They note that this move would also extend the DADM to any ADS used for “grants and contributions, awards and recognition, and security screening” (Report at 11). This is an important recommendation and one which should be implemented.

The review team also recommends a clarification of the language regarding the application of the DADM. Currently it puts within its scope “any system, tool, or statistical models used to recommend or make an administrative decision about a client”. Noting that “recommend” could be construed as including only those systems that recommend a specific outcome, as opposed to systems that process information on behalf of a decision-maker, the team proposes replacing “recommend” with “support”. This too is an important recommendation which should be implemented.

Periodic Reviews

Currently the DADM provides for its review every six months. This was always an ambitious review schedule. No doubt it was motivated by the fact that the DADM was a novel tool designed to address a rapidly emerging and evolving technology with potentially significant implications. The idea was to ensure that it was working properly and to promptly address any issues or problems. In this third review, however, the team recommends changing the review period from six months to two years. The rationale is that the six-month timetable makes it challenging for the team overseeing the DADM (which is constantly in a review cycle), and makes it difficult to properly engage stakeholders. They also cite the need for the DADM to “display a degree of stability and reliability, enabling federal institutions and the clients they serve to plan and act with a reasonable degree of confidence.” (Report at 12).

This too is a reasonable recommendation. While more frequent reviews were important in the early days of the DADM and the AIA, reviews every six months seem unduly burdensome once initial hiccups are resolved. A six-month review cycle engages the team responsible for the DADM in a constant cycle of review, which may not be the best use of resources. The proposed two-year review cycle would allow more experience to be garnered with the DADM and AIA, enabling a more substantive assessment of the issues that arise. Further, a two-year window is much more realistic if stakeholders are to be engaged in a meaningful way. Being asked to comment on reports and proposed changes every six months seems burdensome for anyone – including an already stretched civil society sector. The review document suggests that Canada’s Chief Information Officer could request completion of an off-cycle review if the need arose, leaving room for the possibility that a more urgent issue could be addressed outside of the two-year review cycle.

Data Model and Governance

The third review also proposes amendments to provide for what it describes as a more ‘holistic’ approach to data governance. Currently, the DADM focuses on data inputs – in other words on assessing the quality, relevance and timeliness of the data used in the model. The review report recommends the addition of an obligation to establish “measures to ensure that data used and generated by the Automated Decision System are traceable, protected, and appropriately retained and disposed of in accordance with the Directive on Service and Digital, Directive on Privacy Practices, and Directive on Security Management”. It also recommends amendments to extend testing and assessment beyond data to underlying models, in order to assess both data and algorithms for bias or other problems. These are positive amendments which should be implemented.

Explanation

The review report notes that while the DADM requires “meaningful explanations” of how automated decisions were reached, and while guidelines provide some detail as to what is meant by explainability, there is still uncertainty about what explainability entails. The Report recommends adding language in Appendix C, in relation to impact assessment, that will set out the information necessary for ‘explainability’. This includes:

  • The role of the system in the decision-making process;
  • The training and client data, their source and method of collection, if applicable;
  • The criteria used to evaluate client data and the operations applied to process it; and
  • The output produced by the system and any relevant information needed to interpret it in the context of the administrative decision.

Again, this recommendation should be implemented.

Reasons for Automation

The review also recommends requiring those developing ADM systems for government to specifically identify why it was considered necessary or appropriate to automate the existing decision-making process. The Report refers to a “clear and demonstrable need”. This is an important additional criterion as it requires transparency as to the reasons for automation – and that these reasons go beyond the fact that vendor-demonstrated technologies look really cool. As the authors of the review note, requiring justification also helps to assess the parameters of the system adopted – particularly if the necessity and proportionality approach favoured by the Office of the Privacy Commissioner of Canada is adopted.

Transparency

The report addresses several issues that are relevant to the transparency dimensions of the DADM and the accompanying AIA. Transparency is an important element of the DADM, and it is key both to the legitimacy of the adoption of ADS by government and to their ongoing use. Without transparency in government decision-making that impacts individuals, organizations and communities, there can be no legitimacy. There are a number of transparency elements that are built into the DADM. For example, there are requirements to provide notice of automated decision systems, a right to an explanation of decisions that is tailored to the impact of the decision, and a requirement not just to conduct an AIA, but to publish the results. The review report includes a number of recommendations to improve transparency. These include a recommendation to clarify when an AIA must be completed and released, greater transparency around peer review results, more explicit criteria for explainability, and adding additional questions to the AIA. These are all welcome recommendations.

At least one of these recommendations may go some way to allaying my concerns with the system as it currently stands. The documents accompanying the report (slide 3 of the summary document) indicate that there are over 300 AI projects across 80% of federal institutions. However, at the time of writing, only four AIAs had been published on the open government portal. There is clearly a substantial lag between development of these systems and release of the AIAs. The recommendation that an AIA be not just completed but also released prior to the production of the system is therefore of great importance to ensuring transparency.

It may be that some of the discrepancy in the numbers is attributable to the fact that the DADM came into effect in 2020 and did not apply to projects that were already underway at that time. For transparency’s sake, I would also recommend that a public register of ADS be created that contains basic information about all government ADS. This could include their existence and function, as well as some transparency regarding explainability, the reasons for adoption, and measures taken to review, assess and ensure the reliability of these systems. Although it is too late, in the case of these systems, to perform a proactive AIA, there should be some form of reporting tool that can be used to provide important information, for transparency purposes, to the public.

Consideration for the Future

The next review of the DADM and the AIA should also involve a qualitative assessment of the AIAs that have been published to date. If the AIA is to be a primary tool not just for assessing ADS but for providing transparency about them, then the completed assessments need to be of good quality. Currently there is a requirement to conduct an AIA for a system within the scope of the DADM – but there is no explicit requirement for it to be of a certain quality. A quick review of the four AIAs currently available online shows some discrepancy between them in terms of the quality of the assessment. For example, the project description for one such system is an unhelpful 9-word sentence that does not make clear how AI is actually part of the project. This is in contrast to another that describes the project in a 14-line paragraph. These are clearly highly divergent in terms of the level of clarity and detail provided.

The first of these two AIAs also seems to contain contradictory answers to the AIA questionnaire. For example, the answer to the question “Will the system only be used to assist a decision-maker” is ‘yes’. Yet the answer to the question “Will the system be replacing a decision that would otherwise be made by a human” is also ‘yes’. Either one of these answers is incorrect, or the answers do not capture how the respondent interpreted these questions. These are just a few examples. It is easy to see how use of the AIA tool can range from engaged to pro forma.

The obligations imposed on departments with respect to ADS vary depending upon the risk assessment score. This score is evaluated through the questionnaire, and one of the questions asks “Are clients in this line of business particularly vulnerable?” In the AIA for an access to information (ATIP) tool, the answer given to this question is “no”. Of course, the description of the tool is so brief that it is hard to get a sense of how it functions. However, I would think that the clientele for an ATIP portal would be quite diverse. Some users will be relatively sophisticated (e.g., journalists or corporate users). Others will be inexperienced. For some of these users, the information sought may be highly important, as they may be seeking access to government information to right a perceived wrong, to find out more about a situation that adversely affects them, and so on. In my view, this assessment of the vulnerability of the clients is not necessarily accurate. Yet the answer provided contributes to a lower overall score and thus a lower level of accountability. My recommendation for the next round of reviews is to assess the overall effectiveness of the AIA tool in terms of the information and answers provided and in terms of their overall accuracy.

I note that the review report recommends adding questions to the AIA in order to improve the tool. Quite a number of these call for free-text answers, which must be drafted by the party completing the AIA. Proposed questions include ones relating to the user needs the system is meant to address, how the system will meet those needs, and the effectiveness of the system in meeting those needs, along with reasons for this assessment. Proposed questions will also ask whether non-AI-enabled solutions were considered, and if so, why AI was chosen as the preferred method. A further question asks what the consequences would be of not deploying the system. This additional information is important both to assessing the tool and to providing transparency. However, as noted above, the answers will need to be clear and sufficiently detailed in order to be of any use.

The AIA is crucial to assessing the level of obligation and to ensuring transparency. If AIAs are pro forma or excessively laconic, then no matter how finely tuned the DADM is, it will not achieve the desired results. The review committee’s recommendation that plain language summaries of peer review assessments also be published will provide a means of assessing the quality of the AIAs, and is thus an important recommendation for strengthening both transparency and compliance.

A final issue that I would like to address is that, to achieve transparency, people will need to be able to easily find and access the information about the systems. Currently, AIAs are published on the Open Government website. There, they are listed alphabetically by title. This is not a huge problem right now, since there are only four of them. As more are published, it would be helpful to have a means of organizing them by department or agency, or by other criteria (including risk/impact score) to improve their findability and usability. Further, it will be important that any peer review summaries are linked to the appropriate AIAs. In addition to publication on the open government portal, links to these documents should be made available from department, agency or program websites. It would also be important to have an index or registry of AI in the federal sector – including not just those projects covered by the DADM, but also those in production prior to the DADM’s coming into force.

[Note: I have written about the DADM and the AIA from an administrative law perspective. My paper, which looks at the extent to which the DADM addresses administrative law concerns regarding procedural fairness, can be found here.]


On March 30, 2022 Alberta introduced Bill 13, the Financial Innovation Act. The Bill aims to create a regulatory sandbox for innovators in the growing financial technology (fintech) sector. This is a sector in which there is already considerable innovation and development – with more to come as Canada moves towards open banking. (Canada just appointed a new open banking lead on March 22, 2022). In addition to open banking, we are seeing a proliferation of cryptocurrencies, growing interest in central bank digital currencies, and platform-based digital currencies.

The concept of a regulatory sandbox is gaining traction in different sectors. Some forms of innovation in the new digital and data-driven economy run up against regulatory frameworks designed for more conventional forms of technological development. The existing regulatory system becomes a barrier to innovation – not because the innovation is necessarily harmful or undesirable, but simply because it does not fit easily within the conventional framework. A regulatory sandbox is meant to give innovators some regulatory flexibility to develop their products or services, while at the same time allowing regulators to experiment with tailoring regulation to the emerging technological environment.

Some examples of regulatory sandboxes in Canada include one developed by the Canadian Securities Administrators largely for the emerging fintech sector (the CSA Regulatory Sandbox), a Health Canada regulatory sandbox for advanced therapeutic products, and the Law Society of Ontario’s legal tech regulatory sandbox. These are sandboxes developed by regulatory bodies that provide flexibility within their existing regulatory frameworks. What is different about Alberta’s Bill 13 is that it legislates a broader regulatory sandbox. The Bill provides for qualified participants to receive exemptions from rules within multiple existing regulatory frameworks, including rules under the Loan & Trust Corporation Act and the Credit Union Act (among others – see s. 8 of the Bill) – as well as provincial privacy legislation.

Access to and use of personal data will be necessary for fintech apps, and existing privacy legislation can create challenges in this context. Certainly, for open banking to work in Canada, the federal government’s Personal Information Protection and Electronic Documents Act will need to be amended. Bill C-11, which died on the order paper in 2021, contained a provision that would have allowed for the creation of sector-specific data mobility frameworks via regulation. A provision of this kind, for example, would have facilitated open banking. With such an approach, privacy protection is not abandoned; rather, it is customized.

Alberta’s Bill 13 appears to be designed to provide some form of customization in order to protect privacy while facilitating innovation. Section 5 of the Bill provides that when a company seeks an exemption from provisions of the Personal Information Protection Act (PIPA), this application for exemption must be reviewed by Alberta’s Information and Privacy Commissioner. The Commissioner is empowered to require the company to provide it with all necessary information to assess the request. The Commissioner may then approve or deny the exemption outright, or approve it subject to terms and conditions. The Commissioner may also withdraw any previously granted approval. The role of the IPC is thus firmly embedded in the legislation. Section 8, which empowers the Minister to grant a certificate of acceptance to a sandbox participant, provides that the Minister may grant an exemption to any provision of PIPA only with the prior written approval of the Commissioner and only on terms and conditions jointly agreed to by the Minister and the Commissioner. Similarly, the Minister’s power to add, amend or revoke an exemption to PIPA in s. 10(4) of the Act can only be exercised in conjunction with the Information and Privacy Commissioner. The Commissioner retains the power to withdraw a written approval (s. 10(5)) and doing so will require the Minister to promptly revoke the exemption.

Bill 13 also provides for transparency with respect to regulatory sandbox exemptions via requirements to publish information about sandbox participants, exemptions, terms and conditions imposed on them, expiry dates, and any amendments, revocations or cancellations of certificates of acceptance.

Given the federal-provincial division of powers, the scope of Bill 13 is somewhat limited, as it cannot provide exemptions to federal regulatory requirements. While Credit Unions are under provincial jurisdiction, banks are federally regulated, and the federal private sector data protection law – PIPEDA – also applies to interprovincial flows of data. Nevertheless, s. 19 of the Bill provides for reciprocal agreements between Alberta and “other governments that have a regulatory sandbox framework, or agencies of those other governments”. There is room here for collaboration and co-operation.

Bill 13 is clearly designed to attract fintech startups to Alberta by providing a more supple regulatory environment in which to operate. This is an interesting bill, and one to watch as it moves through the legislature in Alberta. Not only is it a model for a legislated regulatory sandbox, but its approach to addressing privacy issues is also worth some examination.

On March 29, I appeared before Ontario's Standing Committee on Social Policy on the topic of the government's proposed Bill 88. My statement, which builds on an earlier post about this same bill, is below. Note that the Bill has since received Royal Assent. No definition of electronic monitoring (such as the one proposed below) was added to the bill by amendment. None of the amendments proposed by the Ontario Information and Privacy Commissioner were added.

Remarks by Teresa Scassa to the Standing Committee on Social Policy of the Ontario Legislature – Hearing on Bill 88, An Act to enact the Digital Platform Workers' Rights Act, 2022 and to amend various Acts

March 29, 2022

Thank you for this invitation to appear before the Standing Committee on Social Policy. My name is Teresa Scassa and I hold the Canada Research Chair in Information Law and Policy at the University of Ottawa.

The portion of Bill 88 that I wish to address in my remarks is that dealing with electronic monitoring of employees in Schedule 2. This part of the Bill would amend the Employment Standards Act to require employers with 25 or more employees to put in place a written policy on electronic monitoring and to provide employees with a copy. This is an improvement over having no requirements at all regarding employee monitoring. However, it is only a small step forward, and I will address my remarks to why it is important to do more, and where that might start.

Depending on the definition of electronic monitoring that is adopted (and I note that the bill does not contain a definition), electronic monitoring can include such diverse practices as GPS tracking of drivers and vehicles; cellphone tracking; and video camera surveillance. It can also include tracking or monitoring of internet usage, email monitoring, and the recording of phone conversations for quality control. Screen-time and key-stroke monitoring are also possible, as is tracking to measure the speed of task performance. Increasingly, monitoring tools are paired with AI-enabled analytics. Some electronic monitoring is for workplace safety and security purposes; other monitoring protects against unauthorized internet usage. Monitoring is now also used to generate employee metrics for performance evaluation, with the potential for significant impacts on employment, retention and advancement. Although monitoring was carried out prior to the pandemic, pandemic conditions and remote work have spurred the adoption of new forms of electronic monitoring. And, while electronic monitoring used to be much easier to detect (for example, surveillance cameras mounted in public view were obvious), much of it is now woven into the fabric of the workplace or embedded on workplace devices and employees may be unaware of the ways in which they are monitored and the uses to which their data will be put. The use of remote and AI-enabled monitoring services may also see employee data leaving the country, and may expose it to secondary uses (for example, in training the monitoring company’s AI algorithms).

An amendment that requires employers to establish a policy that gives employees notice of any electronic monitoring will at least address the issue of awareness of such monitoring, but it does very little for employee privacy. This is particularly disappointing since there had been some hope that a new Ontario private sector data protection law would have included protections for employee privacy. Privacy protection in the workplace is typically adapted to that context – it does not generally require employee consent for employment-related data collection. However, it does set reasonable limits on the data that is collected and on the purposes to which it is put. It also provides for oversight by a regulator like the Ontario Information and Privacy Commissioner (OIPC), and provides workers with a means of filing complaints in cases where they feel their rights have been infringed. Privacy laws also provide additional protections that are increasingly important in an era of cyber-insecurity, as they can address issues such as the proper storage and deletion of data, and data breach notification. In Canada, private sector employees have this form of privacy protection in Quebec, BC and Alberta, as well as in the federally-regulated private sector. Ontarians should have it too.

Obviously, Bill 88 will not be the place for this type of privacy protection. My focus here is on changes to Bill 88 that could enhance the small first step it takes on this important issue.

First, I would encourage this committee to recommend the addition of a definition of ‘electronic monitoring’. The broad range of technologies and applications that could constitute electronic monitoring and the lack of specificity in the Bill could lead to underinclusive policies from employers who struggle to understand the scope of the requirement. For example, do keypad entry systems constitute electronic monitoring? Are vehicular GPS systems fleet management devices or electronic employee monitoring or both? I propose the following definition:

“electronic monitoring” is the collection and/or use of information about an employee by an employer, or by a third party for the benefit of an employer, by means of electronic equipment, computer programs, or electronic networks. Without limiting the generality of the foregoing, this includes collection and use of information gathered by employer-controlled electronic equipment, vehicles or premises, video cameras, electronic key cards and key pads, mobile devices, or software installed on computing devices or mobile devices.

Ontario’s Privacy Commissioner has made recommendations to improve the employee monitoring provisions of Bill 88. She has proposed that the Bill be amended to require that a digital copy of every electronic monitoring policy drafted to comply with it be submitted to her office. This would be a small additional obligation that would not expose employers to complaints or liability. It would allow the OIPC to gather important data on the nature and extent of electronic workplace monitoring in Ontario. It would also give the OIPC insight into current general practices and emerging best practices. It could be used to understand gaps and shortcomings. Data gathered in this way could help inform future law and policy-making in this area. For example, I note that the Lieutenant Governor in Council will have the power under Bill 88 to make regulations setting out additional requirements for electronic monitoring policies, terms or conditions of employment related to electronic monitoring, and prohibitions related to electronic monitoring. The Commissioner’s recommendation would enhance both transparency and data gathering when it comes to workplace surveillance.


Note: My paper The Surveillant University: Remote Proctoring, AI and Human Rights is forthcoming in the Canadian Journal of Comparative and Contemporary Law. It explores a necessity and proportionality approach to the adoption by universities of remote proctoring solutions. Although the case discussed in the post below addresses a different set of issues, it does reflect some of the backlash and resistance to remote proctoring.

In 2020, the remote AI-enabled exam proctoring company Proctorio filed a copyright infringement and breach of confidence lawsuit against Ian Linkletter, a BC-based educational technologist. It also obtained an interim injunction prohibiting Linkletter from downloading or sharing information about Proctorio’s services from its Help Center or online Academy. Linkletter had posted links on Twitter to certain ‘unlisted’ videos on the company’s YouTube channel. His tweets were highly critical of the company and its AI-enabled exam surveillance software. He responded to the suit and the interim injunction with an application to have the underlying action thrown out under BC’s Protection of Public Participation Act (PPPA). This anti-SLAPP (strategic litigation against public participation) statute allows a court to dismiss proceedings that arise from an expression on a matter of public interest made by the applicant. On March 11, 2022, Justice Milman of the BC Supreme Court handed down his decision rejecting the PPPA application.

Linkletter first became concerned with Proctorio (a service to which the University of British Columbia (UBC) subscribed at the time) after a UBC student had her chat logs with Proctorio published online by the company when she complained about the service she received during an exam. In order to learn more about Proctorio, Linkletter developed a ‘sandbox’ course for which he was the instructor. This enabled him to access Proctorio’s online Help Center and its ‘Academy’ via UBC. These sites provide information and training to instructors. The Help Center had a number of videos available through YouTube. The URLs for these videos were unlisted, which meant that they were not searchable through YouTube’s site, although anyone with the link could access the video. Mr. Linkletter posted some of these links to Twitter, expressing his concerns with the contents of the videos. The company disabled the links and created new ones. Linkletter also posted a screenshot of the Academy website with a message indicating that the original links were not available.

Justice Milman did not hesitate to find that the applicant had expressed himself on a matter of public interest. He noted that the software adopted by UBC “has generated controversy, there and elsewhere, due to concerns about its perceived invasiveness and what is thought by some to be its disparate and discriminatory impacts on some students.” (at para 3). The onus shifted to the respondent Proctorio to demonstrate the substantial merit of its proceedings, the lack of a valid defence by the applicant, and the seriousness of the harm it would suffer relative to the public interest in the expression. The threshold to be met by Proctorio was to demonstrate “that there are grounds to believe that its underlying claim is legally tenable and supported by evidence that is reasonably capable of belief such that the claim can be said to have a real prospect of success” (at para 56).

Proctorio’s lawsuit is essentially based on three intellectual property claims. The first of these was a breach of confidence claim relating to the unlisted YouTube video links. To succeed with this claim, Proctorio must show that the information at issue is confidential; that the circumstances under which it was communicated gave rise to an obligation of confidence; and that the defendant made unauthorized use of the information to the detriment of the party communicating it. Justice Milman found that the respondent met the threshold of ‘substantial merit’ on this cause of action.

What Linkletter posted publicly on Twitter were links to videos. Proctorio claimed that it was these videos (along with a screen shot of a message on its Academy website) that were the confidential information it sought to protect. Although there are a number of factors that a court will take into account in assessing the confidentiality of information, the information must have a confidential nature and the party seeking to protect it must have taken appropriate steps to protect its confidentiality.

Unlisted YouTube video links are not publicly searchable, yet anyone with the link can access the content – and YouTube’s terms of service permit the sharing of unlisted links. However, Justice Milman found that Linkletter accessed Proctorio’s videos (and their links) via Proctorio’s website, which had its own terms of service to which Linkletter had clicked to agree. Those terms prohibit the copying or duplication of the materials found in their Help Centre – although they do not identify any of the content as confidential information. Canadian courts have found users of websites to be bound by terms of service regardless of whether they have read them; it is not a stretch to find that Linkletter had a contractual obligation not to share the contents. However, when it comes to taking the steps necessary to protect the confidentiality of information, one can question whether terms of service buried in links on a website – and that do not specifically identify the material as confidential – constitute a confidentiality or non-disclosure agreement. There was evidence that much of the material could be found elsewhere on the internet. It was also available to tens of thousands of instructors who were given access to the site at the discretion of university clients, not Proctorio. Justice Milman noted that “none of the videos stated on their face that they were commercially sensitive or should be kept from public view” (at para 64). He also found that “the choice to make them available on a public platform like YouTube when more secure options could have been used, dilutes the strength of Proctorio’s case” (at 64). In these circumstances, the court’s ruling that the confidential information claim had sufficient merit seems generous. In order to make out a claim of breach of confidence, it is also necessary for the plaintiff to show that the defendant made use of the information to the company’s detriment. Although the information was used to criticize the company, it is hard to see how Proctorio suffered any real damage particular to this breach of confidence. Much of the content was available through other sources, and the court described the company’s assertions that the videos could permit students to game their algorithms or could reveal their algorithmic secrets to competitors as ‘speculative’. Nonetheless, Justice Milman found enough here to satisfy the Proctorio’s onus to repel the PPPA application.

The copyright infringement argument depended upon a finding that the sharing of a hyperlink amounted to the sharing of the content that could be accessed by following the link. In spite of the fact that there is Canadian case law that suggests that sharing hyperlinks is not copyright infringement, Justice Milman was prepared to distinguish these cases. He found it significant that the materials were not publicly available except to those who had access to the links; sharing the links amounted to more than just pointing people to information otherwise available on the internet. Having found likely infringement, Justice Milman next considered available defences. He found that Linkletter did not meet the test for fair dealing as set out by the Supreme Court of Canada in CCH Canadian. It was conceded by Proctorio that Linkletter passed the first part of the fair dealing test – that the dealing was for a purpose listed in ss. 29, 29.1 or 29.2 of the Copyright Act. Presumably it was for the purposes of criticism or comment, although this is not made explicit in the decision. In assessing the fair dealing criteria, however, Justice Milman found that Linkletter’s circulation of the links on social media militated against fair dealing, as did the fact that anyone who followed the link had access to the full work.

On ‘alternatives to the dealing’, Justice Milman noted that rather than share the videos publicly, Linkletter could have reported on what he saw in the videos (although he had earlier found the videos (or the links to the videos – it is not entirely clear) to be confidential information). He could also have referred to other publicly available sources on the contents of the videos to make his point. On the issue of the nature of the work, Justice Milman found that the works were confidential (thus working against a finding of fair dealing) “even if most of the information in the videos was already available elsewhere on the internet”. Oddly, then, the fair dealing analysis not only underscores the fact that the material was largely publicly available but also suggests that an alternative to providing links to the videos was to discuss their contents freely. This suggests that the issue was not really the confidentiality of the content, but the fact that Linkletter had breached contractual terms of service in order to provide access to it.

On the final fair dealing criterion, the effect of the dealing on the work, Justice Milman found that by making the videos available through their links, “Mr. Linkletter created a risk that Proctorio’s product would be rendered less effective for its intended purposes (because students could more easily anticipate how instructors can configure the settings) and its proprietary information more readily available to competitors.” (at para 112). He conceded that this risk was ‘speculative’ given the amount of information about Proctorio’s services already in the public domain. Justice Milman found that, on balance, the fair dealing defence was not available to Linkletter. He also found that the defence of ‘user-generated content’ was not applicable.

Justice Milman declined to find that there had been circumvention of technical protection measures by Linkletter. He found that Linkletter had gained access to the materials by legitimate means. His subsequent copyright infringing acts were carried out without avoiding, bypassing, removing, deactivating or impairing any effective technology, device or component as required by s. 41.1 of the Copyright Act.

The final element of the test under the PPPA is that the interest of the plaintiff in carrying on with the action must outweigh its deleterious effects on expression and public participation. Justice Milman found that this test was met, notwithstanding the fact that he also found that the “corresponding harm that Proctorio has been able to demonstrate is limited” (at para 124). He found that the risks identified by Proctorio of students circumventing its technology or competitors learning how its software worked were “unlikely to materialize”. Nonetheless, he found that Linkletter’s actions “compromised the integrity of its Help Center and Academy screens, which were put in place in order to segregate the information made available to instructors and administrators from that intended for students and members of the public” (at para 126). He credited the interim injunction for limiting the adverse impacts in this regard. However, he was critical of the broad scope of that injunction and narrowed it to ensure that Linkletter was not enjoined from sharing or linking to content available from public sources. Justice Milman also noted that Linkletter remained free to express his views, as have been others who have also criticized Proctorio online.

The breach of copyright and breach of confidence claims in this case are weak, although their consideration is admittedly superficial given that this is not a decision on the merits. The court found just enough in the copyright and breach of confidence claims to keep them on the right side of the PPPA. Clearly Proctorio objects to the provision of direct public access to its instructional videos beyond the tens of thousands of instructors who have access to them each year – and who are apparently otherwise free to discuss their content in public fora. In this case, Proctorio quickly mitigated any harm by changing the links in question. It could also deny Linkletter access to its services on the basis that he breached the terms of use, and can better protect its content by no longer providing it as unlisted content on YouTube. The narrowed injunction leaves Linkletter free to criticize Proctorio and to link to other publicly available information on the internet. In the circumstances, even if the underlying lawsuit is not a SLAPP suit, as Justice Milman concludes, it is hard to fathom why it should continue to consume scarce judicial resources.
