Monday, 11 March 2024 15:45
Investigation of AI-Enabled Remote Proctoring Software under Public Sector Privacy Law Leads to AI Recommendations

Ontario’s Information and Privacy Commissioner has released a report on an investigation into McMaster University’s use of artificial intelligence (AI)-enabled remote proctoring software. In it, Commissioner Kosseim makes findings and recommendations under the province’s Freedom of Information and Protection of Privacy Act (FIPPA), which applies to Ontario universities. Interestingly, noting the absence of provincial legislation or guidance regarding the use of AI, the Commissioner provides additional recommendations on the adoption of AI technologies by public sector bodies.

AI-enabled remote proctoring software saw a dramatic uptake during the pandemic as university classes migrated online. It was also widely used by professional societies and accreditation bodies. Such software monitors those writing online exams in real time, recording both audio and video, and using AI to detect anomalies that may indicate cheating. Certain noises or movements generate ‘flags’ that lead to further analysis by AI and ultimately by the instructor. If the flags are not resolved, academic integrity proceedings may ensue. Although many universities, including the respondent McMaster, have since returned to in-person exam proctoring, AI-enabled remote exam surveillance remains an option where in-person invigilation is not possible, including in courses delivered online to students in diverse and remote locations.

The Commissioner’s investigation related to McMaster University’s use of two services offered by the US-based company Respondus: Respondus LockDown Browser and Respondus Monitor. LockDown Browser is software downloaded by students onto their computers that blocks access to the internet and to other files on the computer during an exam. Respondus Monitor is the AI-enabled remote proctoring application. This post focuses on Respondus Monitor.

AI-enabled remote proctoring systems have raised concerns about both privacy and broader human rights issues. These include the intrusiveness of constant audio and video monitoring, the capture of data from private spaces, uncertainty over the treatment of personal data collected by such systems, adverse impacts on already marginalized students, and the heightened stress and anxiety that come from both constant surveillance and easily triggered flags. The broader human rights issues, however, are an uncomfortable fit with public sector data protection law.

Commissioner Kosseim begins with the privacy issues, finding that Respondus Monitor collects personal information that includes students’ names and course information, images of photo identification documents, and sensitive biometric data in audio and video recordings. Because the McMaster University Act empowers the university to conduct examinations and appoint examiners, the Commissioner found that the collection was carried out as part of a lawfully authorized activity. Although exam proctoring had chiefly been conducted in person prior to the pandemic, she found that there was no “principle of statute or common law that would confine the method by which the proctoring of examinations may be conducted by McMaster to an in-person setting” (at para 48). Further, she noted that even post-pandemic, there might still be reasons to continue to use remote proctoring in some circumstances.
She found that the university had a legitimate interest in attempting to curb cheating, noting that evidence suggested an upward trend in academic integrity cases, with a particular spike during the pandemic. She observed that “by incorporating online proctoring into its evaluation methods, McMaster was also attempting to address other new challenges that arise in an increasingly digital and remote learning context” (at para 50).

Under FIPPA, the collection of personal information must be necessary to a lawfully authorized activity carried out by a public body. Commissioner Kosseim found that the information captured by Respondus Monitor – including the audio and video recordings – was “technically necessary for the purpose of conducting and proctoring the exams” (at para 60). Nevertheless, she expressed concerns over the increased privacy risks that accompany this continual surveillance of examinees. She was also troubled by McMaster’s assertion that it “retains complete autonomy, authority, and discretion to employ proctored online exams, prioritizing administrative efficiency and commercial viability, irrespective of necessity” (at para 63). She found that the necessity requirement in s. 38(2) of FIPPA applied, and that efficiency or commercial advantage could not displace it. She noted that the kind of personal information collected by Respondus Monitor was particularly sensitive, creating “risks of unfair allegations or decisions being made about [students] based on inaccurate information” (at para 66). In her view, “[t]hese risks must be appropriately mitigated by effective guardrails that the university should have in place to govern its adoption and use of such technologies” (at para 66).

FIPPA obliges public bodies to provide adequate notice of the collection of personal information. Commissioner Kosseim reviewed the information made available to students by McMaster University. Although she found that it provided students with useful information overall, students had to locate different pieces of information on different university websites. The need to check multiple sites to get a clear picture of the operation of Respondus Monitor did not satisfy the notice requirement, and the Commissioner recommended that the university prepare a “clear and comprehensive statement either in a single source document, or with clear cross-references to other related documents” (at para 70).

Section 41(1) of FIPPA limits the use of personal information collected by a public body to the purpose for which it was obtained or compiled, or to a consistent purpose. Although the Commissioner found that the analysis of the audio and video recordings to generate flags was consistent with the purpose for which that information was collected, the use by Respondus of samples of the recordings to improve its own systems – or to allow third-party research – was not. On this point, there was an important difference in interpretation. Respondus appeared to define personal information as personal identifiers such as names and ID numbers; it treated audio and video clips that lacked such identifiers as “anonymized”. Under FIPPA, however, audio and video recordings of individuals are personal information. No provision was made for students either to consent to or opt out of this secondary use of their personal information.
Commissioner Kosseim noted that Respondus had made public statements that, when operating in some jurisdictions (including California and EU member states), it did not use audio or video recordings for research or to improve its products or services. She recommended that McMaster obtain a similar undertaking from Respondus not to use its students’ information for these purposes. The Commissioner also noted that Respondus’ treatment of the audio and video recordings as anonymized data meant that it did not have adequate safeguards in place for this personal information.

Respondus’ Terms of Service provide that the company reserves the right to disclose personal information for law enforcement purposes. Commissioner Kosseim found that McMaster should require, in its contract with Respondus, that Respondus notify it promptly of any compelled disclosure of its students’ personal information to law enforcement or to government, and limit any such disclosure to the specific information it is legally required to disclose. She also set a retention limit of one year for the audio and video recordings, with Respondus to confirm deletion at the end of this period.

One of the most interesting aspects of this report is the section titled “Other Recommendations”, in which the Commissioner addresses the adoption of an AI-enabled technology by a public institution in a context in which “there is no current law or binding policy specifically governing the use of artificial intelligence in Ontario’s public sector” (at para 134). The development and adoption of these technologies is outpacing the evolution of law and policy, leaving important governance gaps. In May 2023, Commissioner Kosseim and Commissioner DeGuire of the Ontario Human Rights Commission issued a joint statement urging the Ontario government to put in place an accountability framework for public sector AI. Even as governments acknowledge that these technologies create risks of discriminatory bias and other potential harms, there remains little to govern AI systems outside the piecemeal coverage offered by existing laws such as, in this case, FIPPA. Although the Commissioner’s interpretation and application of FIPPA addressed issues relating to the collection, use and disclosure of personal information, important issues remain that cannot be addressed through privacy legislation.

Commissioner Kosseim acknowledged that McMaster University had “already carried out a level of due diligence prior to adopting Respondus Monitor” (at para 138). Nevertheless, given the risks and potential harms of AI-enabled technologies, she made a number of further recommendations. The first was to conduct an Algorithmic Impact Assessment (AIA) in addition to a Privacy Impact Assessment. She suggested that the federal government’s AIA tool could be a useful guide until one is developed for Ontario. An AIA could give the adopter of an AI system better insight into the data used to train its algorithms, and could assess impacts on students that go beyond privacy (including discrimination, increased stress, and harms from false positive flags). She also called for meaningful consultation and engagement with those affected by the technology, taking place both before the adoption of the system and on an ongoing basis thereafter.
Although the university may have had to react very quickly given that the first COVID shutdown occurred shortly before an exam period, an iterative engagement process even now would be useful “for understanding the full scope of potential issues that may arise, and how these may impact, be perceived, and be experienced by others” (at para 142). She noted that this type of engagement would allow adopters to be alert and responsive to problems both prior to adoption and as they arise during deployment. She also recommended that the consultations include experts in both privacy and human rights, as well as those with technological expertise.

Commissioner Kosseim also recommended that the university consider providing students with ways to opt out of the use of these technologies other than by requesting accommodations related to disabilities. She noted that “AI-powered technologies may potentially trigger other protected grounds under human rights that require similar accommodations, such as color, race or ethnic origin” (at para 147). On this point, it is worth noting that remote proctoring software creates a context in which some students may need to be accommodated for disabilities or other circumstances that have nothing to do with their ability to write their exam, but that instead affect the way in which the proctoring systems read their faces, interpret their movements, or process the sounds in their homes. Commissioner Kosseim encouraged McMaster University “to make special arrangements not only for students requesting formal accommodation under a protected ground in human rights legislation, but also for any other students having serious apprehensions about the AI-enabled software and the significant impacts it can have on them and their personal information” (at para 148).

Commissioner Kosseim also recommended that there be an appropriate level of human oversight to address the flagging of incidents during proctoring. Although flags were to be reviewed by instructors before deciding whether to proceed to an academic integrity investigation, the Commissioner found it unclear whether there was a mechanism for students to challenge or explain flags prior to escalation to the investigation stage. She recommended that there be such a procedure and, if one already existed, that it be explained clearly to students.

She further recommended that a public institution’s inquiry into the suitability of an AI-enabled technology for adoption should take into account more than just privacy considerations. For example, the public body’s inquiries should consider the nature and quality of the training data. Further, the public body should remain accountable for its use of AI technologies “throughout their lifecycle and across the variety of circumstances in which they are used” (at para 165). Not only should the public body monitor the performance of the tool and alert the supplier to any issues, but the supplier should also be under a contractual obligation to inform the public body of any issues that arise with the system.

The outcome of this investigation offers important lessons and guidance for universities – and for other public bodies – regarding the adoption of third-party AI-enabled services.
For the many Ontario universities that adopted remote proctoring during the pandemic, the report’s recommendations should prompt those still using these technologies to revisit their contracts with vendors – and to consider putting in place processes to measure and assess the impact of these technologies. Although some of these recommendations fall outside the scope of FIPPA, the advice is still sage and likely anticipates what one can only hope is imminent guidance for Ontario’s public sector.