In its ongoing quest to update Canadian federal private sector privacy law, the federal government recently introduced the Digital Charter Implementation Act, 20221 (Bill C-27) in the House of Commons, revisiting many aspects of its predecessor, Bill C-11, a bill that was introduced in 2020 but died on the order paper upon the dissolution of Parliament in 2021.
If enacted, Bill C-27 would:
- Create the Consumer Privacy Protection Act (CPPA), replacing Part 1 of the Personal Information Protection and Electronic Documents Act (PIPEDA),2 Canada’s current private sector privacy law.
- Create the Personal Information and Data Protection Tribunal Act (PIDPTA), which would establish a tribunal that would hear the Office of the Privacy Commissioner of Canada’s (OPC’s) recommendations on administrative monetary penalties and appeals from certain inquiry findings and compliance orders of the OPC.
- Enact the Artificial Intelligence and Data Act (AIDA) to regulate “international and interprovincial trade and commerce in AI systems” and prohibit certain conduct that could result in serious harm to individuals and their interests.3
Despite the many criticisms of the previous bill, much of the framework of Bill C-27 draws on language from Bill C-11, and it is likely that the new bill will face some of the same criticisms. Our prior commentary on Bill C-11 can be found here and our webinar on Bill C-11 is available here. So, what is new and noteworthy about Bill C-27?
The Consumer Privacy Protection Act (CPPA) – The Highlights
Shift in Focus – Although it’s a small and subtle change, the “Summary” at the beginning of Bill C-27 no longer refers to “recognizing the needs of organizations to collect, use or disclose personal information in the course of commercial activities” – it now simply refers to “taking into account” those needs. Arguably that change in terminology represents a diminution of organizations’ rights when dealing with personal information.
Anonymized vs. De-identified Information – Bill C-27 draws a distinction between anonymizing and de-identifying personal information. To “anonymize” will mean to “irreversibly and permanently modify personal information, in accordance with generally accepted best practices, to ensure that no individual can be identified from the information, whether directly or indirectly, by any means,”4 whereas to “de-identify” personal information will mean that the information is modified so that “an individual cannot be directly identified from it, though a risk of the individual being identified remains.”5 The CPPA will not apply to “anonymized” information but it will apply to de-identified personal information (with certain exceptions). Given that evolving technologies make re-identification of personal information in datasets a moving target, how will the law treat those organizations that are handling information that was once anonymized but later is re-engineered to uncover personal information? How will clients draw the line between information that is truly anonymous versus information that has merely been de-identified? These will be the difficult questions in applying Bill C-27 going forward.
Legitimate Interests – PIPEDA is based on the principle that express or implied consent is required to collect, use, or disclose personal information, unless an exception applies. In an attempt to build flexibility into Canadian privacy legislation, Bill C-11 had previously introduced the concept of organizations collecting or using personal information without consent for various “business activities,” subject to certain qualifications. This aspect of the prior bill was heavily criticized, particularly the business activity that permitted collection or use without consent where the individual’s consent was “impracticable” because the organization did not have a direct relationship with the individual. Parliament has obviously heard the criticisms because that particular form of “business activity” has been eliminated, along with another that referred to due diligence to reduce commercial risk.
However, Bill C-27 relies on the GDPR-inspired concept of an organization’s “legitimate interest,” which permits collection and use of personal information without consent where that legitimate interest outweighs any potential adverse effect on the individual, subject to certain qualifications and conditions precedent. The conditions precedent are noteworthy because they will require organizations, prior to any collection or use of personal information based on their “legitimate interest,” to analyze and document how the organization will meet those conditions. The analysis must take into account potential adverse effects on the individual, and the organization must identify and take reasonable measures to reduce the likelihood that those effects will occur, or to mitigate or eliminate them. The organization must produce this analysis on request by the OPC. Clients will need to be mindful that “legitimate interests” cannot be claimed capriciously as a basis for collecting or using personal information without consent. Some form of serious privacy impact assessment must be undertaken and documented in order to rely on this basis for collecting or using personal information. Note also that “legitimate interest” applies only to collection and use; it does not apply to disclosure.
Consent – While Bill C-11 introduced the elements of valid consent, Bill C-27 now goes further and requires that those elements must be described in “plain language” that the individual “would reasonably be expected to understand”. While the intent is to avoid obscure and impenetrable “legalese” in privacy policies, this plain language may be a difficult standard to implement in practice, given varying comprehension levels among members of the public and the challenge in describing complex data uses in simple terms. Although implied consent is permitted depending on the reasonable expectations of the individual and the sensitivity of the information in issue, Bill C-27 also contains a statement that an organization cannot rely on implied consent where it is relying upon the “business activities” or “legitimate interests” bases for collecting or using personal information. This highlights the importance of documenting the organization’s rationale for relying upon business activities or legitimate interests – in other words, a vague notion that consent is implied in those types of specific circumstances will not suffice.
Collection, Retention, and Disposal of Personal Information – The new CPPA reiterates PIPEDA’s overarching principle6 that organizations can collect, use, or disclose personal information for purposes that a reasonable person would consider appropriate in the circumstances. However, the CPPA now goes further to state that it is no longer just the purposes that the reasonable person must consider as appropriate – it is also the manner of collection. Additionally, the CPPA makes explicit that this principle applies regardless of whether or not consent is required under the statute.7 With respect to retention periods for personal information, the CPPA proposes that an organization must take into account the sensitivity of the information when determining retention periods8 and privacy policies must make explicit references to the retention periods applicable to sensitive personal information.9
As in the case of Bill C-11, Bill C-27 makes explicit that individuals have a right to have their information disposed of upon request. However, in addition to expanding the list of situations in which organizations can refuse this request, Bill C-27 now augments the scope of the information for which the request can be made. It is no longer simply information collected by the organization; Bill C-27 now refers to personal information under the organization’s control, which could include information received from third parties and information in the hands of service providers. Organizations will be required to ensure that service providers dispose of the information, which will make for challenging contract negotiations with those service providers.
Minors – PIPEDA is silent on the treatment of personal information of children. However, the OPC has made youth privacy a priority for a number of years. If enacted, Bill C-27 would provide some clarity on how personal information of children should be treated. The bill makes explicit that personal information of minors is sensitive personal information. The upshot for clients who handle children’s personal information is that they will have to re-evaluate the totality of their privacy practices to ensure that this information is appropriately handled in light of its status as “sensitive” personal information.
Cross-border Transfers – Organizations operating in a global context often ask whether personal information can be exported outside of Canada. Bill C-11 did not address this issue except to require disclosure of international/interprovincial transfers in privacy policies. Bill C-27 takes the same approach. This may be seen as a benefit for organizations, but it may argue against Canada maintaining its adequacy ruling with the EU, where cross-border transfers of personal data are subject to stringent controls.
Charities and Political Parties – Bill C-27 reverts from the broad Bill C-11 definition of “commercial activity” to PIPEDA’s definition, which includes selling, bartering, or leasing of donor, membership, or other fundraising lists. For charities, this will mean that they will continue to be subject to federal privacy law when conducting those activities. Like PIPEDA and Bill C-11, Bill C-27 is silent on the application of the CPPA to political parties.
Substantial Penalties – Bill C-27 maintains the structure of the regulatory regime established in Bill C-11 through the creation of the Personal Information and Data Protection Tribunal (Tribunal), discussed below, with power to impose penalties of up to the greater of C$10 million and 3% of the organization’s gross global revenue. It also repeats the criminal sanctions in Bill C-11 which can range in severity up to the greater of C$25 million and 5% of the organization’s annual gross global revenue.
Bill C-27 would extend the list found in Bill C-11 of circumstances in which the OPC may recommend the imposition of administrative monetary penalties to the Tribunal to include:
Newly added sections in Bill C-27 where administrative monetary penalties may be imposed:10

| Provision | Requirement |
| --- | --- |
| Subsection 9(1) | Implementation and maintenance of a privacy management program that includes policies, practices, and procedures. |
| Subsection 11(1) | Requirement to ensure that service providers provide a level of protection equivalent to the protection that the transferring organization is required to provide. |
| Subsection 12(3) | Requirement to determine the purposes for which personal information is being collected, used, or disclosed and to record those purposes before using or disclosing that information. |
| Subsection 12(4) | Requirement to record any new purpose before using or disclosing personal information for that new purpose. |
| Subsections 15(1) and (7) | Requirement to obtain an individual’s valid consent for the collection, use, or disclosure of the individual’s personal information, unless an exception applies. |
| Subsection 17(2) | Requirement, upon receiving a request to withdraw consent, to inform the individual of the consequences of the withdrawal and to cease the collection, use, or disclosure of the individual’s personal information. |
| Subsections 55(1) and (4) | Requirement to dispose of personal information upon request by the individual and to inform service providers to whom the organization has transferred the information. |
| Section 61 | Requirement that a service provider that suffers a data breach notify the organization that controls the personal information. |
| Subsection 62(1) | Requirement that information explaining the organization’s policies and practices be made readily available, in plain language. |
It should not be forgotten that, like Bill C-11, Bill C-27 provides for a private right of action for individuals who have suffered loss or injury as a result of an organization’s contravention of the CPPA, where: the OPC has made a finding of contravention that has not been appealed and the time for appeal has expired; the Tribunal has dismissed an appeal of such a finding; the Tribunal has substituted its own finding of a contravention of the CPPA; or the organization has committed an offence under the CPPA.11
Expansion of the OPC’s Powers – The CPPA would expand the powers of the OPC to permit the OPC to conduct inquiries into complaints and recommend administrative monetary penalties to the Tribunal.12 Importantly, in Bill C-27’s iteration of the CPPA, the OPC would be granted the power to authorize an organization’s use of de-identified personal information to identify an individual, if the OPC believes it would be in the best interest of the individual to do so.13 In exercising its powers, the OPC must consider the purpose of the CPPA, the size and revenue of organizations, the volume and sensitivity of personal information under the organization’s control, and matters of public interest.14
The Personal Information and Data Protection Tribunal Act (PIDPTA)
Like its predecessor Bill C-11, Bill C-27 would create the Personal Information and Data Protection Tribunal. Our previous commentary on PIDPTA can be found here. The only revisions in Bill C-27, as compared to Bill C-11, lie in the makeup of the Tribunal and the Tribunal’s powers. Under Bill C-27, the Tribunal must contain at least three members with experience in the field of information and privacy law, an increase from the one-member minimum under Bill C-11. Whereas Bill C-11 set out that the Tribunal and its members would have all the powers of a commissioner under Part I of the Inquiries Act and the power to make interim decisions, Bill C-27 sets out that the Tribunal has all the powers, rights, and privileges that are vested in a superior court of record.15 Further, under Bill C-27’s proposal, any Tribunal decisions may be made an order of the Federal Court or any superior court and be enforceable as such.16 To make a Tribunal decision an order of a court, the Tribunal may either follow that court’s practices and procedures or file a certified copy of the Tribunal’s decision with the registrar of the court.17
The Artificial Intelligence and Data Act (AIDA)
Probably the most significant difference between Bill C-11 and Bill C-27 is the introduction of a new statute, the Artificial Intelligence and Data Act (AIDA), which is intended to regulate international and interprovincial trade and commerce in artificial intelligence systems and would prohibit certain conduct in relation to artificial intelligence systems that may result in serious harm to individuals or that may harm their interests. The AIDA defines an “artificial intelligence system” as a “technological system that, autonomously or partly autonomously, processes data related to human activities through the use of a genetic algorithm, a neural network, machine learning or another technique in order to generate content or make decisions, recommendations or predictions.”18 Although Bill C-11 contained the concept of “automated decision systems” requiring organizations to explain how those systems have been used to make decisions about individuals, and those requirements have largely been repeated in Bill C-27, the AIDA goes further by referring to autonomous or semi-autonomous processing of data. The focus of the AIDA is to impose certain obligations on persons who are carrying out “regulated activities” in the course of international or interprovincial trade and commerce and on persons who are responsible for artificial intelligence systems, particularly “high-impact systems,” the criteria of which are to be set by regulation.
Requirements for Persons Responsible for AI Systems
- Anonymized Data: persons who carry out regulated activities and make anonymized data available for use in AI systems must establish measures with respect to the manner in which those data are anonymized and the use or management of anonymized data.19
- Assessments of High-Impact Systems: persons responsible for AI systems must assess whether they are high-impact systems.20
- Risk Mitigation: persons responsible for high-impact systems must establish measures to identify, assess, and mitigate risks of harm or biased output that could result from use of the system.21 These risk mitigation measures and their effectiveness must also be monitored.22
- Keeping General Records: persons who carry out any regulated activities must keep records of the measures related to anonymized data and risk mitigation, and of the reasons supporting their assessment of whether the AI system for which they are responsible is a high-impact system.23
- Plain-Language Public Disclosure: persons who make available or manage the operation of a high-impact AI system must publish, on a publicly available website, plain-language descriptions that include descriptions of (a) how the system is intended to be or is used, (b) the types of content that the system is intended to generate or does generate and the decisions, recommendations, or predictions that it is intended to or does make, (c) mitigation measures taken to reduce the risk of harm or biased output, as required by section 8 of the AIDA, and (d) other information prescribed by regulation.24 Additionally, persons that are responsible for high-impact systems must notify the Minister if use of the system results in or is likely to result in material harm, which is characterized as physical or psychological harm to an individual, damage to an individual’s property, or economic loss to an individual.25
Powers of the Minister – The AIDA also lays out the powers of the Minister, who may be any member of the Queen’s Privy Council designated by the Governor in Council. The Minister has fairly broad powers to request records from any person who carries out regulated activities, and to require records where the Minister has reasonable grounds to believe that the use of a high-impact system could result in harm or biased output.26
Furthermore, for any of the requirements for persons responsible for AI systems, the Minister may also order an audit or engage an independent auditor, following which the Minister may order the person to implement any measure to address items referred to in the audit report.27 The Minister may also require cessation of the use or making available of a high-impact AI system if the Minister has reasonable grounds to believe there is a serious risk of imminent harm.28
In the event of contravention of any section of the Act, or in the event of an AI system posing a serious risk of imminent harm, the Minister may publish, on a publicly available website, information about the AI system to encourage compliance or to prevent the harm from occurring. The Minister cannot publish confidential business information or personal information.29
Administrative Monetary Penalties – Separate and apart from the penalties that were listed in the CPPA, the AIDA will have its own administrative monetary penalty scheme for violations and contravention of the statute, to be set out in regulations. As of the writing of this Cassels Comment, no guidance has yet been issued on what the regulations might say in that respect, but the AIDA makes it clear that any administrative monetary penalties are meant to promote compliance, and not to punish.30
Anyone who contravenes sections 6–12 of the AIDA is guilty of an offence. The fines are as set out below:31
- On conviction on indictment:
- Fine of not more than the greater of C$10 million and 3% of the person’s gross global revenues in its financial year before the one in which the person is sentenced, in the case of a person who is not an individual, OR a fine at the discretion of the court, in the case of an individual
- On conviction on summary conviction:
- Fine of not more than the greater of C$5 million and 2% of the person’s gross global revenues in its financial year before the one in which the person is sentenced, in the case of a person who is not an individual, OR a fine of not more than $50,000, in the case of an individual
However, persons can be found not guilty of these offences if they establish that they exercised due diligence to prevent the commission of the offence.32 Furthermore, it is sufficient proof of an offence that it was committed by an employee, agent, or mandatary of the accused, whether or not that person has been identified or prosecuted, unless the accused establishes that the offence was committed without its knowledge.33
Criminal Offences – The AIDA also would make certain prohibited activities criminal offences. This includes the following:
- Possession or use of personal information: It is an offence if, for the purpose of designing, developing, using, or making available for use an AI system, a person possesses or uses personal information, knowing or believing that the information was obtained or derived, directly or indirectly, as a result of (a) an offence under an Act of Parliament or a provincial legislature, or (b) an act or omission anywhere that, if it had occurred in Canada, would have constituted such an offence.34
- Making an AI system available for use: It is an offence if a person, knowing or being reckless as to whether the use of an AI system is likely to cause serious physical or psychological harm to an individual or substantial damage to an individual’s property, makes the AI system available for use and the system causes such harm or damage, OR, with intent to defraud the public and cause substantial economic loss to an individual, makes an AI system available for use and its use causes that loss.35
Anyone who commits one of the offences above can face the following penalties:36
- On conviction on indictment:
- Fine of not more than the greater of C$25 million and 5% of the person’s gross global revenues in its financial year before the one in which the person is sentenced, in the case of a person who is not an individual, OR a fine at the discretion of the court or term of imprisonment of up to 5 years less a day or both, in the case of an individual.
- On conviction on summary conviction:
- Fine of not more than the greater of C$20 million and 4% of the person’s gross global revenues in its financial year before the one in which the person is sentenced, in the case of a person who is not an individual, OR a fine of not more than $100,000 or a term of imprisonment of up to two years less a day, or both, in the case of an individual.
Bill C-27 has just started its journey through Parliament. When Parliament resumes in September, we will continue to track its progress and keep our readers advised of significant developments.
1 Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts, 1st Sess, 44th Parl, 2022 (first reading 16 June 2022) [“Bill C-27”].
2 Personal Information Protection and Electronic Documents Act, SC 2000, c 5 [“PIPEDA”].
3 Bill C-27, supra note 1 at Part 3, cl 4.
4 Ibid at Part 1, cl 2.
6 PIPEDA, supra note 2 at Part 1, cl 5(3).
7 Bill C-27, supra note 1 at Part 1, cl 12(1).
8 Ibid at Part 1, cl 53(2).
9 Ibid at Part 1, cl 62(2)(e).
10 Ibid at Part 1, cl 94(1).
11 Ibid at Part 1, cl 107.
12 Ibid at Part 1, cl 89, 90 and 94.
13 Ibid at Part 1, cl 116.
14 Ibid at Part 1, cl 109.
15 Ibid at Part 2, cl 16(1).
16 Ibid at Part 2, cl 16(2).
17 Ibid at Part 2, cl 16(3).
18 Ibid at Part 3, cl 2.
19 Ibid at Part 3, cl 6.
20 Ibid at Part 3, cl 7.
21 Ibid at Part 3, cl 8.
22 Ibid at Part 3, cl 9.
23 Ibid at Part 3, cl 10.
24 Ibid at Part 3, cl 11(1) and (2).
25 Ibid at Part 3, cl 12.
26 Ibid at Part 3, cl 13 and 14.
27 Ibid at Part 3, cl 15 and 16.
28 Ibid at Part 3, cl 17.
29 Ibid at Part 3, cl 28.
30 Ibid at Part 3, cl 29(2).
31 Ibid at Part 3, cl 30.
32 Ibid at Part 3, cl 30(4).
33 Ibid at Part 3, cl 30(5).
34 Ibid at Part 3, cl 38.
35 Ibid at Part 3, cl 39.
36 Ibid at Part 3, cl 40.