Privacy Reports
Decision Information
The Office of the Information and Privacy Commissioner of Ontario (IPC) received complaints from students at the University of Waterloo (the university) regarding “smart” snack vending machines installed on campus by a third-party service provider. The complaints alleged that the machines appeared to use facial recognition technology that was collecting facial images without consent or proper notice.
In this report, I find that the machines used cameras and face detection technology to capture identifiable facial images amounting to a collection of personal information within the meaning of section 38(1) of the Freedom of Information and Protection of Privacy Act (FIPPA). Further, I find that this collection did not comply with section 38(2) of FIPPA and, therefore, was a privacy breach. I also find that affected individuals were not given notice of the collection, as required under section 39(2) of FIPPA.
Although the university had reasonable contractual safeguards in place with the third-party service provider, it was unaware that personal information was being collected through the machines’ face detection technology. This oversight was due to shortcomings in the university’s procurement process for the vending machines which failed to apply the necessary level of due diligence by conducting a privacy impact assessment, or requiring prospective service providers to do so, in order to identify and assess the privacy implications of the technology.
In this report, I recommend that the university take adequate steps in the procurement process to ensure it evaluates third-party service providers and any technology to be used, and fulfills its obligations to protect personal information under its control in accordance with section 4(1) of Regulation 460 under FIPPA.
Decision Content
REVISED PRIVACY COMPLAINT REPORT
PRIVACY COMPLAINT
University of Waterloo
August 27, 2025
Summary: The Office of the Information and Privacy Commissioner of Ontario (IPC) received complaints from students at the University of Waterloo (the university) regarding “smart” snack vending machines installed on campus by a third-party service provider. The complaints alleged that the machines appeared to use facial recognition technology that was collecting facial images without consent or proper notice.
In this report, I find that the machines used cameras and face detection technology to capture identifiable facial images amounting to a collection of personal information within the meaning of section 38(1) of the Freedom of Information and Protection of Privacy Act (FIPPA). Further, I find that this collection did not comply with section 38(2) of FIPPA and, therefore, was a privacy breach. I also find that affected individuals were not given notice of the collection, as required under section 39(2) of FIPPA.
Although the university had reasonable contractual safeguards in place with the third-party service provider, it was unaware that personal information was being collected through the machines’ face detection technology. This oversight was due to shortcomings in the university’s procurement process for the vending machines which failed to apply the necessary level of due diligence by conducting a privacy impact assessment, or requiring prospective service providers to do so, in order to identify and assess the privacy implications of the technology.
In this report, I recommend that the university take adequate steps in the procurement process to ensure it evaluates third-party service providers and any technology to be used, and fulfills its obligations to protect personal information under its control in accordance with section 4(1) of Regulation 460 under FIPPA.
Statutes Considered: The Freedom of Information and Protection of Privacy Act, R.S.O. 1990, c. F.31 (FIPPA), sections 2(1), 38(1) and (2), and 39(2); Regulation 460 under FIPPA, section 4(1); The University of Waterloo Act, 1972, S.O. 1972, c. 200, ss. 2, 4, 14.1(c) and (f); and the Legislation Act, 2006, S.O. 2006, c. 21, Sched. F, s. 92(1).
Orders and Investigation Reports Considered: Order PO-1880, Order 11, Privacy Complaint Reports PI21-00001, PR16-40, MC10-2, MC07-68; Privacy Investigation Report PC12-39; Joint Investigation of the Cadillac Fairview Corporation Limited by the Privacy Commissioner of Canada, the Information and Privacy Commissioner of Alberta, and the Information and Privacy Commissioner for British Columbia.
Cases Considered: Cash Converters Canada Inc. v Oshawa (City), 2007 ONCA 502, 86 OR (3d) 401; Reference re Assisted Human Reproduction Act, 2010 SCC 61; Reference re Genetic Non-Discrimination Act, 2020 SCC 17; Reference re Securities Act, 2011 SCC 66, [2011] 3 SCR 837; Ontario Criminal Code Review Board v. Hale, 47 O.R. (3d) 201 (C.A.).
Secondary Sources: Mahmud Jamal, “Is PIPEDA Constitutional?”, 43 Can. Bus. L.J. 434 (2006), at pp. 442, 448; Michel Bastarache, “The Constitutionality of PIPEDA: A Reconsideration in the Wake of the Supreme Court of Canada’s Reference re Securities Act”, June 2012, at pp. 4-6, 11-12; Josh Nisker, “PIPEDA: A Constitutional Analysis”, Canadian Bar Review, Vol. 85, p. 317 (2006), at pp. 326-329.
PREAMBLE:
The original Privacy Complaint Report referred to a software company identified by the University of Waterloo (the university) as the subcontractor that provided the face detection software used in its intelligent vending machines (IVMs). After the report was publicly issued, the software company in question informed the IPC that its face detection software was not the software used in the university’s IVMs. This office confirmed the claim with the university, which then provided the correct information identifying a different software provider. Accordingly, the report was revised and this Revised Privacy Complaint Report was issued to reflect the correction. Although the technical facts differ somewhat from the original report, the findings remain the same.
BACKGROUND:
[1] In February 2024, news media reported that students at the university had raised privacy concerns about smart snack vending machines installed on their main campus, after one of the machines malfunctioned and displayed the following software error message:
Invenda.Vending.FacialRecognition.App.exe – Application Error[1]
[2] Additionally, it was reported that the machines were supplied by Adaria Vending Services Limited (Adaria), were owned by MARS Wrigley (MARS), and were manufactured by Invenda Group (Invenda).[2]
[3] The Office of the Information and Privacy Commissioner of Ontario (IPC) received complaints from the university’s students about the machines. The students believed that the machines had captured images of their faces and complained that they were not informed of the apparent use of facial analytics technology. They also complained that they had not consented to the collection of their facial images, which they considered to be personal information.
[4] As such, the students believed that the university had breached their privacy under the Freedom of Information and Protection of Privacy Act (FIPPA). As a remedial measure, they requested that the machines be removed from campus.
[5] An investigation was commenced into this matter, and written representations and relevant materials were requested and received from the university.
The University, Adaria, MARS, Invenda and the Vending Machines
[6] The machines at issue are intelligent vending machines (IVMs) manufactured by Invenda, a software company that makes automated retail devices smart and connects them to a single digital platform.[3]
[7] In response to this office’s request for information about the matter, the university advised that, in October 2023, it entered into an agreement with Adaria for the provision of unattended snack vending services through 29 machines on its main campus (the Agreement). Under the Agreement, Adaria was responsible for maintaining, monitoring and stocking the IVMs, which it installed on the university’s campus in December 2023.
[8] According to the university, Adaria either purchased or leased the machines from MARS and MARS contracted with Invenda to manufacture and supply the IVMs that were installed on the campus. The university also advised that it did not contract with MARS or Invenda and was informed there was no contract between Adaria and Invenda concerning the IVMs.
[9] Regarding the technology used in the IVMs, the university explained that the machines were equipped with a computer running a customized installation of the Windows operating system that Invenda calls “Invenda OS”, which collected data from the IVMs and transmitted it over the Internet to a cloud service operated by Invenda, known as the Invenda Cloud.
[10] The university further advised that, without its knowledge, the IVMs used face detection technology that collected demographic data.
The University’s Jurisdictional Challenges
[11] Throughout this investigation, the university raised several grounds challenging the IPC’s jurisdiction to investigate the potential collection of personal information through the operation of the IVMs on its campus, as well as the scope of any such investigation. While the university ultimately responded to our requests for information, its cooperation with the investigation came with the following caveat:
The University continues to question the jurisdiction of the Commissioner to carry out an investigation into an alleged breach of personal information through vending machines on its campus but intends to cooperate with the Commissioner so that this matter can be resolved.
[12] First, the university took the position that the Personal Information Protection and Electronic Documents Act (PIPEDA), and not FIPPA, applied to the operation of its commercial activities based on the federal power of Parliament over “the regulation of trade and commerce.” In support of this position, the university cited a publication by the Office of the Privacy Commissioner of Canada (OPC) concerning the application of PIPEDA to Municipalities, Universities, Schools, and Hospitals.[4] In the university’s view, the sale of food items through the vending machines constituted a commercial activity unrelated to its core purpose of providing post-secondary education, with the result that PIPEDA would govern any collection of personal information by the vending machines to the exclusion of FIPPA.
[13] Second, the university cited Invenda's assertion that the IVMs did not, in fact, collect any personal information. It argued that, if no personal information has been collected, then the IPC cannot have jurisdiction to conduct an investigation under FIPPA.
[14] Third, while the university accepted that it is accountable for the collection, use or disclosure of personal information by third party service providers, and is responsible for their management, it maintained that it did not contract with Adaria to collect personal information using face detection technology. Because any collection of personal information by the IVMs fell outside the scope of the contracted services, the university argued that Adaria was not its service provider in this context. On this basis, the university submitted that any investigation into the matter by the IPC should be limited to its vendor due diligence and management practices.
[15] Finally, the university maintained there was no need for the IPC to investigate the matter because the face detection technology in the IVMs was disabled in February 2024, the machines were removed from campus in March 2024, with no plans to reinstall them, and all data that was collected has since been permanently deleted.
[16] Turning to the constitutional argument, the university appears to take the position that Parliament’s power over federal trade and commerce under section 91(2) of the Constitution Act, 1867 means that PIPEDA’s regulation of personal information in the context of the university’s commercial activities operates to the exclusion of FIPPA.
[17] The university’s position is not supported by the provisions of FIPPA establishing its scope and application. Nor is this position supported by any judicial authority.
[18] The views expressed in the OPC’s publication with respect to the “core” activities of provincial and municipal institutions are not reflected in the provisions of FIPPA. Notwithstanding the potential application of PIPEDA to the commercial activities of third-party vendors, the university retains control over personal information of its students collected by commercial vendors operating on its behalf and under its authority. This gives rise to commensurate responsibilities to ensure by contractual or other means that vendors’ activities in the handling of such personal information comply with the provisions of FIPPA.[5] Whether or not the collection of personal information has a commercial purpose or component has no bearing on the university’s obligations to protect the personal information under its control in accordance with FIPPA.[6]
[19] There is a broad consensus among the judiciary and legal commentators that the relationship between organizations and individuals in the handling of their personal information falls within the “property and civil rights” head of power at section 92(13) of the Constitution Act, 1867 or “generally all matters of a merely local or private nature in the province” at section 92(16).[7] Nonetheless, the constitutional division of powers jurisprudence recognizes that a particular subject matter can have both federal and provincial aspects, in which case the “double aspect doctrine” allows for the concurrent application of both federal and provincial legislation.[8]
[20] In a 2006 article supporting the constitutional validity of PIPEDA under the federal trade and commerce power, Mahmud Jamal (now a Justice of the Supreme Court of Canada) stated that “the protection of personal information arguably has a double aspect” and, as such, “both levels of government may legislate in one jurisdictional field for two different purposes.” He further opined that “the protection of personal information is arguably not a single matter” but “cuts across both federal and provincial fields and is not assigned exclusively to either level of authority”.[9]
[21] To the extent that minimum national privacy standards are seen as necessary to promote Canadian economic interests, legal scholars have concluded that PIPEDA may be viewed as constitutionally valid under the trade and commerce power at section 91(2) pursuant to the “double aspect doctrine”.[10] That is the conclusion reached in a 2022 legal opinion commissioned by the OPC examining the constitutional validity of proposed successor legislation to PIPEDA, entitled “The Constitutional Validity of Bill C-11, the Digital Charter Implementation Act” (“OPC Opinion”),[11] where the authors state:
… [T]he regulation of privacy can exist under both federal and provincial jurisdiction. Parliament is entitled to protect privacy in so far as it impacts national economic interests — jurisdictional silos are not required. Parliament’s jurisdiction to promote the economy through the protection of privacy does not strip provincial ability to further protect privacy (emphasis added). [12]
[22] The OPC Opinion underscores the conclusion that PIPEDA cannot override the privacy provisions of FIPPA or oust the oversight and reporting functions of the Commissioner under FIPPA in relation to the activities of the university at issue here, including its intra-provincial commercial activities.
[23] Accordingly, and notwithstanding the potential application of PIPEDA, it is clear that FIPPA also applies to any personal information collected by or on behalf of the university, including in the course of its commercial activities.
[24] The remainder of the university’s so-called jurisdictional challenges actually comprise the substantive issues to be examined in this investigation report. Despite Invenda’s assertion that no personal information was collected by the IVMs, that issue remains to be determined and will be examined below. The university’s position that it did not contract with Adaria to collect personal information using face detection technology, and in fact was unaware of the use of this technology, raises questions about whether adequate privacy protections were, or should have been, in place with respect to the university’s procurement and tendering processes.
[25] While it is acknowledged that the IVMs have been removed from campus grounds, the circumstances of this investigation require assurance that any personal information collected by the IVMs, along with any derivative information, has in fact been deleted or destroyed and will not be collected in this way again in the future.
[26] All of this is in aid of the IPC fulfilling its mandate under section 58 of FIPPA to report annually to the Legislative Assembly on the effectiveness of FIPPA in protecting personal privacy, including an assessment of the extent to which institutions are complying with FIPPA and the IPC’s recommendations with respect to the practices of particular institutions.[13]
ISSUES:
- Did the university’s use of face detection technology result in a collection of personal information on the university’s behalf under FIPPA?
- Did the collection of personal information comply with sections 38 and 39 of FIPPA?
- Did the university have reasonable measures in place to protect personal information in accordance with section 4(1) of Regulation 460 under FIPPA?
DISCUSSION:
A. Did the university’s use of face detection technology result in a collection of personal information on the university’s behalf under FIPPA?
[27] Section 2(1) of FIPPA defines “personal information” as “recorded information about an identifiable individual.”
Recorded information is information recorded in any format, including by electronic means or otherwise.[14]
[28] Information is “about” an individual when it refers to them in their personal capacity, which means that it reveals something of a personal nature about them. Information is about an “identifiable individual” if it is reasonable to expect that the individual can be identified from the information, either by itself or if combined with other information.[15]
[29] Section 2(1) lists examples of personal information. The list is not exhaustive, meaning that other kinds of information may also be personal information.[16]
[30] In this matter, images of individuals’ faces were captured by the IVMs and data relating to these images, including estimates of their gender and age,[17] was generated. Accordingly, I must determine whether these images and any resulting data qualify as “personal information” as defined in section 2(1).
The Face Detection Technology
[31] The university advised that the IVMs were equipped with an optical sensor that continuously captured visual input. According to the university, the sensor’s resolution was too low for the device to be considered a camera or to create an identifiable image. However, as I discuss in the analysis section of this report below, based on additional information provided by the university about the resolution of the images captured by the optical sensor, the sensor is more accurately characterized as the equivalent of a camera.
[32] The university advised that the face detection software in the IVMs used the sensor to count people who passed in front of, looked at, or interacted with, the IVMs, as well as to detect faces by analyzing the image of the object (i.e., a person’s face) that was close to the IVM. Using a third-party software library, the software would process detected faces to estimate the gender (male or female) and age (together, the demographic data) of consumers. The university explained that the software assumed these objects were people walking by the IVM and that these events were recorded and uploaded to the Invenda Cloud.
[33] When a sale occurred, the university advised that the IVMs would record a timestamp, the item purchased and the demographic data. The university also advised that the IVMs collected heatmap data to track where users touched the interactive display. It would then upload all this data to the Invenda Cloud.
[34] According to the university, the technology could estimate the demographic data at most 80% of the time, with the remaining 20% or more of transactions recorded as “unknown” for this data, and its accuracy confidence level was 60-80%.
[35] The university submitted that the face detection software in the IVMs processed facial images in a way that respected individual privacy. More specifically, the university asserted that the software:
- performed face detection, not face recognition and, therefore, did not identify or track individuals;
- processed captured facial images in real time and, therefore, at no point was this data permanently recorded or transmitted elsewhere; and
- produced only anonymized data that could not be linked to any specific individual.
[36] The face detection software at issue was a third-party audience measurement tool[18] that detected people and faces.
[37] The university stated that the face detection software was installed by Invenda, and that the “FacialRecognition.App.exe” application referenced on the malfunctioning IVM’s display screen is an Invenda component, responsible for receiving data from third-party audience measurement services to create metadata stored for statistical purposes.
[38] The university provided this office with manuals and relevant documents related to the optical sensor and face detection software, as well as the demographic detection and people counting features in the IVMs (the IVM Technology Materials).
[39] Based on the IVM Technology Materials, the third-party provider of the software claimed that the system “do[es] not involve the collection of any sensitive personal or privacy data. No photos or videos are recorded, stored, or transmitted over the Internet.” The provider further explained that the software “provides real-time people counting and face analysis to generate…fully anonymized metadata” and that “no specific personal characteristics are stored locally nor sent to a remote location at any moment…”.
[40] The IVM Technology Materials also confirmed that the optical sensor delivered a continuous video stream that was analyzed by the face detection software, which used an object detector trained to identify faces and people within a general population in each video frame.
[41] According to these materials, when a person entered the field of view of an IVM’s optical sensor, the demographic detection and people counting processes simultaneously took these steps:
1. For demographic detection, once the software detected a face, it captured the image, extracted the individual’s facial features and processed them in real time without storing them locally or remotely in the Invenda Cloud. The facial features were then compared to hundreds of thousands of faces in a third-party software library to estimate the individual’s gender and age. This process was repeated continuously for each video frame.
2. For people counting, once the software detected a person, it compared the number of people detected in the current video frame to those in the previous frame. The software increased the people count by the number of persons in the current frame who did not match any person detected in the preceding frame. However, the characteristics used to make this match were not saved to disk. This process was repeated continuously for each video frame.
[42] Along with these explanations, the IVM Technology Materials also contained two example input images: a demographic detection input facial image and a people counting input image.
[43] Moreover, according to the IVM Technology Materials, when a sale occurred, the IVMs would record and upload only the following data to the Invenda Cloud:
- Timestamp: DateTime
- Age: Integer (number)
- Gender: String (“Female” or “Male”)
- Number of people: Integer (number)
[44] The university also advised that a payment record was created for each sale.
[45] Based on the foregoing information about the face detection technology, and, in particular, the input facial image from which consumers’ facial features were extracted, I find in the analysis section below that the use of the IVMs on campus resulted in the collection of personally identifiable images.
Sales and Payment Information
[46] The university advised that, to provide it with sales data and inventory tracking information, Adaria installed payment collection software (i.e., smartVend) in the IVMs, which accepted payment by credit or debit card, mobile wallets and campus cards.
[47] The university advised that, when a person purchased a snack, the card number was collected and that the following transaction data was recorded:
- a timestamp of the payment device confirmation;
- the success or failure of the payment process; and
- the amount of the transaction.
[48] According to the university, this data was stored in the IVMs, sent to the Invenda Cloud, and was only used for statistical analysis and reconciling the number of products dispensed with those purchased.
[49] The university advised that Adaria did not store any payment data to reduce security risks related to cardholder information. The university also explained that Adaria’s software, which collected payment data, was separate from the face detection software, and that payment data was processed directly between the payment device and the payment service selected by Invenda's customers. Consequently, the university advised that the payment data cannot be matched to the other records uploaded to the Invenda Cloud.
Analysis:
Collection of Personal Information
[50] There was no dispute that the IVMs captured video images of individuals’ faces on the university’s campus. However, the university argued that the resolution of the optical sensor in the IVMs was too low for the device to be considered a camera or create identifiable images of individuals.
[51] Respectfully, I disagree. At the outset, I note that the university did not provide any evidence to support its position.
[52] The IVM Technology Materials stated that the optical sensors in the IVMs provided a stream of input images, which were then processed by the software to detect facial images, capture them and extract individual “face features.” In my view, this sequence involving continuous facial image input, detection and capture prior to the feature extraction process is functionally equivalent to the operation of a camera system. Accordingly, in my view, the optical sensors that capture facial images for processing by the software constituted a camera, both in purpose and effect.
[53] The university advised that the optical sensors (hereinafter, the cameras) in the IVMs had a resolution of either 1280x720 or 640x480. My review of relevant sources has found that a camera with a 640x480 resolution can capture identifiable images provided the subject is within sufficient proximity.[19] A 1280x720 resolution would offer even greater clarity and detail.
[54] In my view, conditions for the cameras to capture identifiable images were favourable because:
- the IVMs were installed indoors, in specific rooms and common areas (such as lobbies and a lounge) on the university’s campus[20], which likely provided sufficient natural and/or artificial lighting, a factor that typically enhances image clarity;
- the cameras were fixed in a static position reducing the chances of motion blur and related image distortion; and
- the cameras were likely positioned close to individuals, particularly users actively engaging with the machines, increasing the likelihood of capturing identifiable images.
[55] For these reasons, I find that the resolution of the cameras installed in the IVMs was sufficient to create an identifiable image of an individual.
[56] The university also submitted that no personal information was involved as the images captured by the face detection technology were stored only temporarily for a few milliseconds before being permanently deleted. Therefore, in the university’s view, the images did not qualify as recorded information about an identifiable individual.
[57] Moreover, the university maintained that none of the output records were about identifiable individuals and, therefore, were not personal information under FIPPA.
[58] In support of this position, the university submitted an Officer Certificate from Adaria in which a director certified that Adaria “does not have in its possession or control, biometric personal information (which includes images or videos of faces or people, retinal scans, facial patterns, or facial recognition data) obtained in connection with the Agreement or in connection with any biometric sensors installed in any vending machines serviced by the Company in connection with the Agreement.”
[59] The university also provided a copy of a letter it received from MARS and a letter that Adaria received from Invenda, both regarding the IVMs.
[60] MARS’ letter stated that “the Invenda technology used in [the IVMs did] not collect, manage, retain, or process any personally identifiable information” and clarified that the IVMs “[did] not have the capability to capture, retain or transmit imagery.”
[61] Invenda’s letter, titled “Statement Regarding Demographic Trend Detection Technology,” stated:
As the producer of the Invenda IoT solution, the Invenda smart vending machine, and its associated software, we formally warrant that the demographic detection software integrated into the smart vending machine operates entirely locally. It does not engage in storage, communication, or transmission of any imagery or personally identifiable information. The software conducts local processing of digital image maps derived from the USB optical sensor in real-time, without storing such data on permanent memory mediums or transmitting it over the Internet to the Cloud.
The output generated from processing the video stream manifests as a simple file in JSON format…[21]
It is imperative to note that the Invenda Software does not possess the capability to recognize any individual's identity or any other form of personal information.
Further, Invenda's smart vending machines and its software are compliant with General Data Protection Regulation and equivalent data privacy policies.
[62] Despite the assertions in these letters and the university’s submissions, it is clear that the use of the face detection technology in the IVMs resulted in a collection of personal information under FIPPA, however transient. As noted above, I was provided with a sample “demographic detection input facial image” captured by the cameras that was of photographic quality, clearly showing an individual’s identifiable facial features. These facial images were temporarily stored in memory, even if only for milliseconds, before being processed by the face detection software.
[63] This office has previously held that information collected by video cameras in the nature of personally identifiable images qualifies as personal information under section 2(1) of FIPPA.[22]
[64] Pursuant to section 38(1) of FIPPA, such information does not need to be permanently recorded to qualify as “personal information.” Section 38(1) expands the definition of “personal information” for the purposes of sections 38 and 39 of FIPPA, as follows:
“personal information” includes information that is not recorded and that is otherwise defined as “personal information” under this Act.
[65] Moreover, I do not accept the university’s claim that the face detection software operated entirely in real-time, as the face detection technology in the IVMs temporarily stored images of individuals’ faces in memory to generate the demographic data. This conclusion is reinforced by the IVM Technology Materials, which indicate that a “motion-based method” is employed for people counting, a method that requires multiple frames to be considered concurrently. This means that some number of image frames for people counting purposes would need to be kept in memory during processing in order to compare one frame to the next. Where a face was detected in one of those frames for demographic purposes, that frame would likewise be retained temporarily in memory for a brief period of time.
[66] I note that the OPC, the Office of the Information and Privacy Commissioner of Alberta, and the Office of the Information and Privacy Commissioner of British Columbia (the “Commissioners”) reached a similar conclusion in a report of their joint investigation of the Cadillac Fairview Corporation Limited (CFCL) (the Commissioners’ Report).[23]
[67] In that report, the Commissioners assessed whether CFCL was collecting and using personal information without consent through facial detection technology (i.e., Anonymous Video Analytics (AVA)) in mall directories. The Commissioners did not accept CFCL’s assertion that the technology operated solely in real time, noting that facial images were briefly stored during processing[24] and that the resulting data was used afterwards. Despite the short retention period, the Commissioners found this still amounted to a collection of personal information.[25]
[68] I note that an application by plaintiffs to certify a class action against CFCL for the tort of intrusion upon seclusion was recently dismissed by the British Columbia Supreme Court in Cleaver v. Cadillac Fairview Corp. Ltd.[26] The plaintiffs’ application was based, in part, on the Commissioners’ Report, which the Court stated could only be admitted for context “but not for the truth of its contents.” Consequently, the contents of the Commissioners’ Report could not be relied on by the plaintiffs or their witnesses. Further, the Court observed that the focus of a certification application “is not on the merits or the weight of the evidence but rather on the appropriate form of the action.”
[69] In considering the criteria for class certification, the Court found no factual basis to demonstrate that the plaintiffs could self-identify as “having viewed a wayfinding directory at one or more of the shopping malls during the relevant periods” and “no rational relationship” between that proposed class definition and the fundamental common issues raised by the plaintiffs.
[70] In case the Court was wrong on that point, it went on to consider whether there was some basis in fact for those common issues, namely, “that a facial image of an individual was recorded and used to create biometric and personal information about that individual”.
[71] The Court could not consider the plaintiffs’ expert testimony as it relied exclusively on the Commissioners’ Report, which was deemed inadmissible for the truth of its contents. Therefore, the Court accepted the only evidence before it, which was that of an expert retained by CFCL. CFCL’s expert attested that the technology in question “takes images ... [of] a face” and that the “captured” facial images were then converted to anonymized “embedding numbers” which could not be used to identify an individual or “to recover the original facial image.”[27] While the Court accepted this evidence, it ultimately observed there was “no basis in fact that any facial images were recorded by the cameras located at the Directories.” The Court’s finding in this respect can only be reconciled with the expert’s evidence if it used the word “recorded” to mean “permanently recorded.”
[72] In any event, for the purposes of the present investigation, section 38(1) of FIPPA does not require personal information to be “recorded” in order to be “collected.” Therefore, the issue of whether or not the facial images were “recorded” as a component of the common issues examined by the Court in the class action lawsuit against CFCL has no bearing on whether personal information in the form of facial images was “collected” within the meaning of FIPPA.
[73] Further, where the Court stated “there is no basis in fact for the allegation that the Data contains personal information within the meaning of the relevant statutes,” it was not referring to the facial images that were “captured,” but rather to the numerical “Data” which the Court defined in its reasons as the “data generated by the Software.”
[74] Finally, an application to certify a proceeding as a class action does not involve a trial on the merits, but rather a determination based on admissible evidence that certain certification criteria have been met. The Commissioners’ Report was not admissible evidence that could be weighed or tested against other evidence, and, in any event, it was not part of the Court’s role to weigh evidence in the certification application. Given the limited use of the Commissioners’ Report for context only, the Court’s reasons at no point addressed or called into question the Commissioners’ finding that “the captured images ... held in memory for a very short period ... represent[ed] a collection of personal information” within the meaning of their respective statutes.
[75] For essentially the same reasons as those in the Commissioners’ Report, I find that the capture of images by the face detection technology in the IVMs qualifies as a collection of personal information as defined at section 38(1) of FIPPA. Regardless of whether the original image used to create the demographic data was permanently retained in storage, the image was “taken” or “collected” by the equipment’s camera and stored for a sufficient period of time to be converted to another format. From any practical perspective, including the perspective of the individual whose facial image was captured, this is a collection of personal information that must find its authority in the provisions of FIPPA to be lawful.
[76] As noted above, the university claimed that the IVM software converted recorded facial images into an anonymous format that could not be linked to any specific individual. I can confirm that our investigation into this matter has found no evidence to suggest that personal information, beyond the initial temporary capture of facial images, was retained and further used by these vendors.
B. Did the collection of personal information comply with sections 38 and 39 of FIPPA?
Authorization to collect personal information
[77] Section 38(2) of FIPPA limits the circumstances in which personal information may be collected. This section states:
No person shall collect personal information on behalf of an institution unless the collection is expressly authorized by statute, used for the purposes of law enforcement or necessary to the proper administration of a lawfully authorized activity.
[78] In order for an institution to collect personal information in compliance with the requirements of FIPPA, one or more of the three criteria specified in section 38(2) must be satisfied. I have not been referred to any statute that could be said to expressly authorize the collection of personal information in this case, nor is there any basis to find that personal information collected by the vending machines was used or intended to be used for the purposes of law enforcement. It therefore falls to be determined whether the collection of the personal information at issue on behalf of the university is necessary for the proper administration of a lawfully authorized activity.
[79] The university is continued as a corporation under section 2 of The University of Waterloo Act, 1972,[28] pursuant to which it has “all the property, rights, powers and privileges which it now has, holds, possesses or enjoys” and which, pursuant to section 4, includes “all powers necessary and incidental to the satisfaction and furtherance of its objects as a University.”
More specifically, under section 14.1(c) and (f), the Board of Governors of the university has all powers necessary or convenient to perform its duties, which include the power to plan and implement the physical and operational development of the university and to establish and collect fees and charges for academic tuition and for services of any kind which may be offered by the University. Further, pursuant to section 92(1) of the Legislation Act,[29] the university has the power as a corporation to contract by its corporate name and to acquire, hold and dispose of personal property.
[80] Given the breadth of the university’s powers, there can be no doubt that the installation and operation of vending machines on the university’s premises as a component of its larger food services operations, and entering into a contract for that purpose, is a lawfully authorized activity. The question remains whether the collection of personal information is necessary for the proper administration of the activity in question, namely the sale of beverages and snacks to members of the public.
[81] The test for determining whether this requirement is satisfied was articulated by the Court of Appeal in Cash Converters Canada Inc. v. Oshawa (City) ("Cash Converters") as follows:
... [T]he institution must show that each item or class of personal information that is to be collected is necessary to properly administer the lawfully authorized activity. Consequently, where the personal information would merely be helpful to the activity, it is not "necessary" within the meaning of the Act. Similarly, where the purpose can be accomplished another way, the institution is obliged to choose the other route.[30]
[82] As described at paragraphs 51 to 54 above, sales from the vending machines involved payment by credit or debit card, mobile wallets and campus cards. The associated payment data was not stored by Adaria but was processed directly between the payment device and the selected payment service in order to reduce security risks related to cardholder information. Further, this information was kept separate from the face detection software and the sales and inventory tracking data. It therefore appears that the collection of personal cardholder information in connection with the payment process was no more than necessary for the purposes of processing each transaction and, in my view, satisfied the test articulated in Cash Converters.
[83] In contrast, I have been provided with no evidence or argument by the university, nor am I aware of any facts or circumstances, by which the collection of facial images by the IVMs for conversion to demographic data is necessary for the operation of the machines or the completion of individual transactions. While the collection and conversion of facial images may enable vendors to better understand the demographics of the population they serve and to track sales and adjust inventories, any benefit from collecting that information is at best merely helpful, and not necessary to administer a lawfully authorized activity.
[84] My findings in this regard stand in contrast to the IPC’s 2024 McMaster University Privacy Complaint Report,[31] which examined the use of online proctoring software by McMaster to capture audio and video recordings of students who were taking their exams remotely during the COVID-19 pandemic. In the absence of direct human oversight, the recording function of this online proctoring service gave instructors the ability to review first-hand any exam sessions in which the technology flagged suspicious behaviour consistent with cheating. In this circumstance, the individual items and classes of personal information collected by the software on behalf of the university were necessary for the purpose of conducting and proctoring the exams.
[85] As a result, the Commissioner was satisfied in that case that the collection of personal information by the proctoring software was “necessary” within the meaning of section 38(2) for the proper administration of the university’s lawfully authorized activity of conducting and proctoring online exams during the pandemic, and in other circumstances that could be justified post-pandemic.[32]
[86] The collection of facial images through the university’s use of the IVMs’ facial detection technology falls far short of the necessity test articulated in Cash Converters and found to be satisfied in the McMaster Report.
[87] Accordingly, I find that the collection of facial images by the IVMs was not expressly authorized by statute, used for the purposes of law enforcement or necessary to the proper administration of a lawfully authorized activity, and thus was not in compliance with section 38(2) and amounted to a privacy breach.
Notice of collection
[88] Subject to exceptions[33], none of which apply in this case, section 39(2) of FIPPA requires that individuals be informed when their personal information is being collected. This section states:
Where personal information is collected on behalf of an institution, the head shall, unless notice is waived by the responsible minister, inform the individual to whom the information relates of,
(a) the legal authority for the collection;
(b) the principal purpose or purposes for which the personal information is intended to be used; and
(c) the title, business address and business telephone number of a public official who can answer the individual’s questions about the collection.
[89] The notice required by section 39(2) may be provided orally, in writing or in any manner that meaningfully informs the individual.[34]
[90] In this matter, the university did not inform individuals that the IVMs were using face detection technology that captured their images. As a result, the affected individuals were not given the opportunity to avoid the collection of their personal information either by choosing not to use the machines or by avoiding coming within their camera range. They were also not given any of the required information under section 39(2).
[91] In light of my finding above that the collection of personal information in the form of identifiable facial images was not in accordance with section 38(2), the university could not have provided notice of the legal authority for the collection in compliance with section 39(2).
[92] Even if the university could be said to have legal authority to collect facial images of users of the IVMs (which I find it did not), the university did not provide any meaningful notice of such collection in accordance with the requirements of section 39(2). In fact, it provided no such notice at all.
[93] In conclusion, I find that the university did not, and could not, provide notice of the collection of personal information in accordance with section 39(2) of FIPPA.
C. Did the university have reasonable measures in place to protect personal information in accordance with section 4(1) of Regulation 460 under FIPPA?
[94] FIPPA requires that institutions protect personal information that they hold or that is collected and held on their behalf.[35]
[95] This is especially important when institutions work with third parties who may collect or access personal information.[36] A third party can be any outside person, business, or organization that provides a service to or works for an institution.[37]
[96] Accordingly, when institutions contract with third parties, they must still follow FIPPA and, therefore, they need contractual and oversight measures in place to ensure compliance with this legislation.[38]
[97] Regulation 460, made under FIPPA, establishes rules for the security of records that an institution has. Specifically, section 4(1) of Regulation 460 states:
Every head shall ensure that reasonable measures to prevent unauthorized access to the records in his or her institution are defined, documented and put in place, taking into account the nature of the records to be protected.
[98] These requirements “appl[y] throughout the life-cycle of a given record, from the point at which it is collected or otherwise obtained, through all of its uses, and up to and including its eventual disposal.”[39] Also, “reasonable measures” do not need to cover every possible scenario to prevent unauthorized access; rather, they need to be “fair and suitable under the circumstances”.[40]
[99] In Privacy Complaint Report PR16-40, the investigator noted that section 4(1) of Regulation 460 does not prescribe a “one-size-fits-all” approach to security:
It does not set out a list of measures that every institution must put in place regardless of circumstance. Instead, it requires institutions to have “reasonable” measures and ties those measures to the “nature” of the records to be protected. It follows that the same security measures may not be required of all institutions. Depending on the nature of the records to be protected, including their sensitivity, level of risk and the types of threats posed to them, the required measures may differ among institutions.
[100] In this matter, the third-party service provider was Adaria with whom the university had contracted to provide snack vending services.
[101] Accordingly, I must determine whether the university had reasonable measures in place to protect any personal information that may have been involved in the provision of these services.
[102] As part of my investigation, I reviewed copies of the university’s snack vending services Request for Proposal (the RFP), Adaria’s proposal in response to the RFP (Adaria’s Proposal) and the Agreement.
[103] In assessing these documents, I considered the IPC’s “Privacy and Access in Public Sector Contracting with Third Party Service Providers” guidance document, which sets out best practices for ensuring privacy when preparing for and entering into agreements with service providers.[41] I also considered the IPC’s “Planning for Success: Privacy Impact Assessment Guide” which assists institutions in conducting a privacy impact assessment (PIA) to assess compliance with FIPPA.
[104] Below, I find that the university may have had reasonable contractual measures in place insofar as it required Adaria, through what appear to be standard-type provisions, to protect any personal information involved in the delivery of the snack vending services. These contractual provisions were included despite the university’s assertion that it was not aware of any personal information being collected as part of the face detection technology embedded in the IVMs.
[105] However, beyond the written agreement, it was the university’s overall procurement process for the RFP that failed to identify and assess the privacy implications of this technology offered by its third-party service provider and their suppliers. Specifically, the university’s procurement process did not engage the level of due diligence necessary to uncover the fact that the technology being offered by Adaria involved the collection of visual images of IVM users, contrary to section 38(2) of FIPPA. The fact that the university originally identified the wrong software supplier as part of this investigation is further evidence of its lack of knowledge and understanding of what exactly it was procuring and who the various subcontractors were.
The University’s Vendor Management Program
[106] The university advised that it had a vendor management program in place to safeguard the personal information of individuals on campus prior to the installation of the IVMs.
[107] The university also advised that when a department seeks to implement a new business process or redesign an existing process or service involving a substantial change to the processing of personal information, it is required to complete an Information Risk Assessment (IRA) based on IPC guidance.[42]
[108] These requirements involve completing an IRA intake questionnaire for review by the university’s Privacy Officer and Information Security Officer, who provide recommendations on privacy protection and data security.
[109] The university’s IRA program aims to:
- Identify and understand privacy and security risks associated with new or redesigned university processes or services that use sensitive information, including personal information.
- Prevent or reduce privacy and security issues.
- Enhance privacy and security protection.
- Help Information Stewards decide whether to move forward with the initiative.[43]
[110] Where the university decides to complete an IRA as part of its procurement process, the questionnaire includes the following two questions:[44]
- Will it involve use of new technology or one known to impact privacy that could raise significant privacy risks (e.g., biometrics, smart cards, drug testing, or technology with surveillance capabilities)?
- Will it involve creation or modification of databases that will contain Restricted or Highly Restricted Information, including Personal Information? In particular, will the data be sensitive or relate to a significant number of people, or will the initiative link separate databases or create files that index or point to Personal Information on such databases?
[111] The university advised that typically, its Procurement Department, in consultation with the department that is to work with the vendor, prepares the request for proposal for vendors handling personal information. The university also advised that recommendations from this IRA process, when undertaken, are included in the proposal.
[112] In this case, the university advised that the Procurement and Food Services departments created the RFP and that it contained all the standard personal information and data security requirements found in the university’s requests for proposals. However, the university advised that it did not complete an IRA as part of the procurement process for the IVMs because it believed that only payment card and related financial information would be collected. As such, the department did not anticipate any significant change in how personal information was being processed.
The RFP, Adaria’s Proposal and the Agreement
[113] The RFP was posted in June 2023, and the successful respondent was responsible for providing and maintaining vending services on the university’s campus.
[114] The RFP informed respondents that the university must handle protected information in accordance with FIPPA. It also stated that the successful respondent, upon entering into an agreement and handling protected information, would be required to provide details on their data protection practices. The contract would only be awarded if the university was satisfied with these practices.
[115] Further, the RFP required that respondents provide security documents, such as information security policy, data privacy policy, incident response plan, diagrams showing the flow of personal or sensitive information, audit reports and security certifications. It also required that they provide a “[d]escription of the roles and responsibilities of any additional agents and sub-contractors who will be involved in providing the Snack Vending Services”.
[116] With respect to data, the RFP noted that it was important for the “Food Services management team to have just-in-time information on the performance of [their] Snack Vending business with accurate accountability to sales and volumetric reporting” and required that bidders provide “reporting on volume and sales per machine location.”
[117] Regarding the IVMs, the RFP specifically asked respondents to “include a photo or catalogue image of the type of Snack Vending machines to be provided to the [university], including the SKU number of the equipment and its specifications” and “[o]utline the technologies to be proposed including technology used to manage inventory levels”. Further, for each SKU model, respondents had to provide information about its data or network requirements.
[118] Adaria’s Proposal informed the university that it had “partnered with Mars, Frito-Lay to test product innovations at vending machines” and that MARS Intelligent Vending Machines would be installed. The proposal stated that MARS was “implementing their new Intelligent Mars Machines and we are working with Mars on this deployment in Canada.” It also included a brochure on these machines indicating that they required “High Traffic/Dwell Time.”
[119] Regarding the technology in the machines, Adaria’s Proposal stated:
...we have implemented a proprietary remote monitoring system, utilized on all vending equipment we deploy. This is a miniature device that is fitted into vending machines and transmits sales data to our backend servers, which processes the information into online reports that we use to replenish our machines more efficiently.
[120] The proposal explained that this remote monitoring system (i.e. smartVend) “is an in-house developed telemetry system that provides accurate sales data and item-level inventory tracking of all deployed vending machines.”
[121] Adaria’s Proposal also stated that they “will not use any additional agents or subcontractors to perform the requirements of [the] RFP”.
[122] Under the Agreement, the university owned all personal information “submitted to or created by” Adaria, who agreed to handle it in compliance with all applicable privacy laws. The Agreement specifically required the university, Adaria and Adaria’s agents and subcontractors to handle and protect personal information in compliance with FIPPA. It also defined “personal information/data” to include the definition of “personal information” under FIPPA.
[123] Before disclosing any confidential information due to legal requirements or government requests, the Agreement required Adaria to notify the university, giving it the chance to prevent or limit the disclosure. The Agreement also required that Adaria, along with its agents and subcontractors, continue protecting confidential information even after the Agreement ends.
[124] The Agreement included comprehensive security provisions requiring Adaria to implement appropriate safeguards and security controls to protect personal information from unauthorized or malicious use, as well as maintain a formal security and breach incident response program.
[125] Upon termination of the Agreement, Adaria and its agents and subcontractors were required to return all data (including confidential information) to the university. They were also required to permanently destroy all digital copies of the data in a way that made recovery impossible and to provide written confirmation that this was done.
[126] Finally, under the Agreement, the university had the discretion to hire a qualified third party to conduct information security audits.
Analysis
[127] In Privacy Investigation Report PC12-39 and Privacy Complaint Report PR16-40, this office considered relevant contractual provisions for determining whether an institution met its obligations to ensure that all reasonable steps were taken to protect the privacy and security of personal information under its control in accordance with section 4(1) of Regulation 460. These provisions include:
- Ownership of data
- Collection, Use, and Disclosure
- Confidential Information
- Notice of Compelled Disclosure
- Subcontracting
- Security
- Retention and Destruction
- Audits
[128] In this matter, the Agreement appears to have contained all the appropriate standard clauses necessary to protect personal information involved in the delivery of snack vending services by Adaria.
[129] Despite the apparent adequacy of these contractual measures to protect any personal information collected, the fact remains that the collection of personal information through the IVMs’ face detection system was not authorized.
[130] As mentioned above, the university was not aware that the IVMs contained face detection technology that collected personal information. Based on the Officer Certificate from Adaria, it appears that they too were unaware of this collection. It was only after an IVM malfunctioned on campus that the university became aware that data was being collected and accessed by Invenda.
[131] The RFP did not request vending machines that use face detection technology to collect personal information or provide demographic data analytics. Further, Adaria’s Proposal did not explicitly refer to the face detection technology in the IVMs or indicate that additional agents or subcontractors would be used to fulfill the RFP’s requirements.
[132] However, Adaria’s Proposal did inform the university of Adaria’s collaboration with MARS to test product innovations and that new MARS Intelligent Vending Machines would be installed that required high traffic and/or dwell time (i.e., a lot of people passing by and/or lingering around the machines).
[133] The university advised that an IRA was not completed because it did not anticipate a significant change in the processing of personal information based on the belief that only financial information would be collected by the IVMs. However, given that Adaria’s Proposal informed the university that it would be working with MARS to implement and deploy new intelligent MARS machines in Canada, this should have caught the attention of the university and alerted it to the possible privacy implications necessitating further scrutiny.
[134] Accordingly, if the university had conducted an IRA during its procurement and tendering processes, it likely would have identified key red flags, specifically, that the machines used new innovative technology with surveillance capabilities involving a significant number of people.
[135] In my view, doing so would have highlighted and prevented, or mitigated, the risk of the unauthorized collection of personally identifiable images that occurred on the university’s campus.
[136] This omission highlights the importance of having not only good contractual templates in place, but also sound and robust information management practices throughout the entire procurement process, from planning, tendering, vendor selection, contracting, agreement management, right up to and including termination.
[137] Regarding procurement planning, the IPC’s Contracting with Third Party Service Providers Guide advises that institutions should, among other things, consider:
- Defining the records to develop a clear understanding of whether personal information will be involved in the procurement project or initiative; and
- Identifying privacy and security risks (using a privacy impact assessment) before starting the procurement process or signing an agreement, creating measures to mitigate those risks, and considering whether the service provider should also conduct a security assessment of its services, processes or technology before entering into the agreement.
[138] The IPC’s Contracting with Third Party Service Providers Guide also recommends that institutions define the types of records and personal information the service provider will have access to and be responsible for processing, building upon this step taken during procurement planning.
[139] Further, while the IPC’s PIA Guide notes that a PIA is not required where an institution determines that a project will not involve personal information, it also recommends that this conclusion and the reasoning behind it be documented.[45] In this matter, I was not provided with any information regarding the university’s conclusions or rationale for not conducting a PIA in relation to the RFP.
[140] Finally, although the RFP required bidders to describe the technologies proposed for use in the vending machines and provide details about each machine’s data or network requirements, it did not require bidders to conduct a security assessment of the technology to be implemented in the machines.
[141] In my view, had Adaria conducted a PIA during the procurement process for the RFP, it would have evaluated the technology used in the new IVMs to be deployed by MARS in Canada, likely identified that this technology involved Invenda and a third-party provider of face detection software, and recognized the associated risk of unauthorized collection of personally identifiable images.
[142] Moreover, regarding the tendering process, the IPC’s Contracting with Third Party Service Providers Guide recommends that institutions “[u]ndertake any activities required to collect sufficient information to evaluate service providers.”
Since the university did not conduct an IRA or PIA, or request that Adaria do so, the university failed to gather enough information to assess whether Adaria’s service delivery would comply with FIPPA.
[143] Based on the above, I find that although the university had reasonable contractual measures in place with Adaria to ensure the privacy and security of personal information under its control, it failed to carry out the necessary due diligence to uncover the fact that the delivery of IVM services it was contracting for would involve the collection of personal information contrary to FIPPA.
[144] Accordingly, I find that the university was deficient in its procurement process by failing to conduct an IRA or PIA, or to require that Adaria do so, considering the information provided in the Adaria Proposal.
[145] Because of this deficiency, the university missed key red flags relating to the technology used in the IVMs indicating that there was a potential privacy risk. It also failed to gather sufficient information to properly evaluate the third-party service providers and their suppliers. These deficiencies resulted in the university being unaware of the face detection technology in the IVMs that collected personally identifiable images without authorization.
[146] Shortly after the software error appeared on the IVM, and immediately upon becoming aware of the face detection technology in the IVMs, the university disabled the face detection software and removed all IVMs from its campus. The university also received assurances from Invenda that all the collected data, including data generated in processing the images, had been permanently deleted.
[147] Although the risks associated with the continued use of the IVMs have now been averted, I am making several recommendations to the university to prevent such a situation from recurring.
[148] Specifically, when planning to enter into an agreement with a third-party vendor that may involve the deployment of new “smart” technologies, I recommend that the university conduct a PIA (or, as in this case, an IRA) and also require that prospective third-party service providers do the same as part of the procurement process. While this stands as a recommendation today, the requirement to conduct a PIA in similar circumstances became explicit and mandatory under recent changes to FIPPA that took effect on July 1, 2025.
[149] I also recommend that the university not only include appropriate privacy and security-related clauses in its agreements with third-party vendors, but also follow the IPC Guidance on Contracting with Third Party Service Providers to ensure that it identifies, assesses and mitigates any potential risks to personal information throughout the entire procurement process, including during the planning, tendering, vendor selection, agreement management and termination phases.
CONCLUSIONS:
Based on the results of the investigation, I have reached the following conclusions:
- The university’s use of face detection technology resulted in a collection of “personal information” within the meaning of section 2(1) and section 38(1) of FIPPA.
- The collection of personal information was contrary to section 38(2) of FIPPA.
- The university did not, and could not, provide notice of the collection of personal information as required by section 39(2) of FIPPA.
- The university had reasonable contractual measures in place with its third-party service provider to ensure the privacy and security of personal information under its control.
- However, the university was deficient in its overall procurement process by failing to identify that personal information would be collected as part of the IVM services it was contracting for, contrary to section 38(2) of FIPPA. This failure resulted in an inability to properly evaluate the third-party service providers and their suppliers and to make an appropriate selection. As such, the university fell short of its obligation to protect the personal information under its control in accordance with section 4(1) of Regulation 460.
RECOMMENDATIONS:
Given the conclusions reached in this investigation, I make the following recommendations, which the university has agreed to implement:
- The university should review its privacy policies to ensure that any future collection of personal information complies with sections 38(2) and 39(2) of FIPPA.
- When planning to enter into agreements with third-party service providers, particularly those involving the use of new or “smart” technologies, the university should ensure that it carries out the necessary due diligence to identify, assess and mitigate any potential risks to personal information throughout the entire procurement process, including during the planning, tendering, vendor selection, agreement management and termination phases.
This includes conducting a PIA and/or requiring that the third-party service provider do so as part of the procurement process where appropriate, following the IPC Guidance on Contracting with Third Party Service Providers, and including appropriate standard privacy and security clauses in all contracts with third-party vendors.
Original Signed by:

John Gayle
Investigator

August 27, 2025
[1] https://www.theguardian.com/world/2024/feb/23/vending-machine-facial-recognition-canada-univeristy-waterloo, https://mobilesyrup.com/2024/02/27/university-of-waterloo-vending-machine-facial-recognition/ and 'Facial recognition' error message on vending machine sparks concern at University of Waterloo | CTV News.
[2] See footnote 1.
[4] The Application of PIPEDA to Municipalities, Universities, Schools, and Hospitals - Office of the Privacy Commissioner of Canada.
[5] I note that the agreement the university entered into with Adaria provided that the university owned all personal information “submitted to or created by” Adaria, who agreed to handle it in compliance with all applicable privacy laws.
[6] An example of an IPC Privacy Investigation Report relating to an institution’s commercial activities is found in Privacy Complaint PR16-40; Ontario Lottery and Gaming Corp. (Re), [2019] O.I.P.C. No. 11 (QL).
[7] Reference re Assisted Human Reproduction Act, 2010 SCC 61; Reference re Genetic Non-Discrimination Act, 2020 SCC 17; Mahmud Jamal, “Is PIPEDA Constitutional”, 43 Can. Bus. L.J. 434 (2006), at pp. 442, 448; Michel Bastarache, “The Constitutionality of PIPEDA: A Reconsideration in the Wake of the Supreme Court of Canada’s Reference re Securities Act”, June 2012, at pp. 4-6, 11-12; Josh Nisker, “PIPEDA: A Constitutional Analysis”, Canadian Bar Review, Vol. 85, p. 317 (2006), at pp. 326-329.
[8] Reference re Securities Act, 2011 SCC 66, [2011] 3 SCR 837, para. 66.
[9] Mahmud Jamal, “Is PIPEDA Constitutional”, 43 Can. Bus. L.J. 434 (2006) at pp. 442, 448.
[10] The constitutional validity of PIPEDA has not been tested in the courts. Jamal, at pp. 448-449; Nisker, at pp. 326, 329, 343.
[11] At that time Bill C-27 had received second reading.
[12] Addario Law Group, LLP, The Constitutional Validity of Bill C-11, the Digital Charter Implementation Act, March 2022, PDF, p. 11.
[13] Section 58 of FIPPA.
[14] See the definition of “record” under section 2(1) of FIPPA.
[15] Order PO-1880, upheld on judicial review in Ontario (Attorney General) v. Pascoe, [2002] O.J. No. 4300 (C.A.); 2001 CanLII 26053 (ON IPC).
[16] Order 11, [1988] O.I.P.C. No. 11.
[17] See Invenda’s brochure regarding IVM at https://a.storyblok.com/f/184550/x/e94dbdaff3/product-catalogue-ovm-ivm_2024.pdf?cv=1714124864949.
[18] The university advised that Invenda did not customize the software for use within the IVMs.
[19] For example, in the technical paper “Forensic Facial Comparison: Current Status, Limitations, and Future Directions”, the authors found that “a minimum horizontal pixel count of 10-16 per face for a known face and 20 pixels for an unknown face is considered the bare minimum for successful identification in frontal view.”
See https://www.mdpi.com/2079-7737/10/12/1269. Further, in the “Perfect Pixel Count – Meeting your Operational Requirements” report, the authors mapped the horizontal pixels of a face needed to make facial identifications and found that only 20 horizontal pixels were needed. See https://www.axis.com/files/feature_articles/ar_perfect_pixel_count_55971_en_1402_lo.pdf. In the case at hand, the IVMs used a minimum horizontal pixel count of 640.
[20] The locations of the IVMs were set out in a Schedule attached to the Agreement.
[21] JavaScript Object Notation (JSON) is a standard text-based format for representing structured data based on JavaScript object syntax. It is commonly used for transmitting data in web applications.
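For illustration only, the following sketch shows how structured data of the general kind at issue might be serialized to JSON text for transmission; the field names and values here are hypothetical assumptions for demonstration, not taken from Invenda’s actual software:

```python
import json

# Hypothetical payload: field names are illustrative assumptions only,
# not drawn from the software examined in this investigation.
payload = {
    "machine_id": "IVM-001",
    "event": "face_detected",
    "estimated_age_range": "18-25",
    "estimated_gender": "unknown",
}

# JSON is a text-based format: the structured data above becomes a
# plain string suitable for transmission in a web application.
encoded = json.dumps(payload)
print(encoded)

# Parsing the text recovers the original structure.
decoded = json.loads(encoded)
```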
[22] Privacy Complaint Reports MC10-2 and MC07-68, among others, where this information qualified as “personal information” under paragraph (a) under the definition of “personal information” in section 2(1) of FIPPA.
[23] Joint Investigation of the Cadillac Fairview Corporation Limited by the Privacy Commissioner of Canada, the Information and Privacy Commissioner of Alberta, and the Information and Privacy Commissioner for British Columbia (“CFCL Report”).
[24] CFCL Report, at para. 48, confirming “that the entirety of the age and gender estimation process summarized above occurs in milliseconds and that each image of a face is only stored in computer memory for the duration of that process.”
[25] CFCL Report, at paras. 60-63.
[26] 2025 BCSC 910.
[27] Ibid., paras. 42, 43 and 49.
[28] The University of Waterloo Act, 1972, SO 1972, c 200, ss. 2, 4, 14.1(c), (f).
[30] 2007 ONCA 502, at para. 40.
[31] Privacy Complaint Report PI21-00001.
[32] Ibid, paras. 58-61. It is noteworthy that the Commissioner expressed ongoing reservations about the use of online proctoring technology in conjunction with artificial intelligence technologies and the need to develop privacy protective guardrails in this connection.
[33] Under section 39(2), the responsible minister (the Chairman of the Management Board of Cabinet) may waive the notice requirement and section 39(3) of FIPPA sets out exceptions.
[34] IPC Practices No. 8 Providing Notice of Collection and page 12 of the IPC’s “Guidelines for the Use of Video Surveillance, October 2015”, both available on the IPC’s website.
[35] See Part III Protection of Individual Privacy of FIPPA.
[36] IPC Practice No. 18 “How to Protect Personal Information in the Custody of a Third Party”.
[37] See footnote 47.
[38] Ontario Criminal Code Review Board v. Hale, 47 O.R. (3d) 201 (C.A.) and Privacy Complaint Report PR16-40.
[39] Privacy Complaint Report MI10-5.
[40] Investigation Report I93-044M.
[41] https://www.ipc.on.ca/en/resources/privacy-and-access-public-sector-contracting-third-party-service-providers
[42] Information Risk Assessment | Information and Privacy | University of Waterloo: https://uwaterloo.ca/privacy/information-risk-assessment.
[43] See footnote 42.
[44] See footnote 42.
[45] See page 5 in the IPC’s PIA Assessment Guide.