With bills on privacy, artificial intelligence, and cyber security moving through the federal legislature, and with the second phase of Quebec’s privacy reform coming into effect, 2023 is set to be a pivotal year for privacy and data protection in Canada.
In June 2022, the Canadian government introduced Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts. Through the Bill’s Consumer Privacy Protection Act (CPPA), Canada seeks to replace its current federal private-sector privacy law—the Personal Information Protection and Electronic Documents Act—with a modernized and strengthened privacy and data protection legal framework. The envisioned regime includes reinforced accountability rules and consent requirements, new enforcement tools and powers, and new individual rights. The Bill completed its second reading on 24 April 2023 and will now be studied in committee.
In terms of strengthening accountability, the CPPA codifies past decisions and guidance from the regulator by introducing a definition of “control,” clarifying that information is deemed to be under the control of “the organization that decides to collect it and that determines the purposes for its collection, use or disclosure” (s7(2)). It also obliges organizations to implement and maintain a “privacy management program,” which must be developed in consideration of the volume and sensitivity of the personal information under their control (s9(2)). Organizations must further identify and record each of the purposes for which they collect, use or disclose personal information, and do so at or before the time of collection (s12(3)).
The CPPA also brings greater clarity to the notions of transparency and consent. In particular, under s15(3), the CPPA will require that organizations disclose specified information, at or before the time consent is sought, in “plain language”.
The language through which this information is presented must also be sufficiently adapted to the target audience such that it would be reasonable to expect that audience to “understand” the content of the notice. The CPPA also codifies rules around implied consent, which organizations can rely on if doing so is “appropriate in the circumstances” in light of the reasonable expectations of the individual and the sensitivity of the information.
While the CPPA clarifies what constitutes valid consent to make it more meaningful, it also introduces several consent exceptions. The “specified business activity” exception, for example, exempts organizations from obtaining an individual’s consent if the processing is carried out for the purpose of a specified business activity other than influencing behaviour or decisions, and falls within the individual’s reasonable expectations (s18). Another notable consent exception may be relied upon by organizations that collect or use (but do not disclose) personal information for a purpose in which they have a legitimate interest that outweighs any potential adverse effect on individuals, provided the purpose does not involve influencing their behaviour or decisions and falls within the individual’s reasonable expectations. The CPPA also offers consent exceptions for the processing of de-identified information for socially beneficial purposes or internal R&D and analytics.
The CPPA includes a variety of new individual rights, including, for instance, a right to disposal. Under this new right, individuals would be able to request, under certain conditions, that an organization permanently and irreversibly delete their personal information under the organization’s control (s55). The CPPA also introduces a right to mobility, which essentially functions as a limited right to data portability, and, finally, a right to be informed of automated decision-making: organizations will be obliged to make readily available, in plain language, a general account of their use of automated decision systems to make predictions, recommendations or decisions that could have a significant impact on individuals (ss62 & 72).
The CPPA would also substantially expand the powers of Canada’s Privacy Commissioner, granting it new authority to conduct inquiries, issue binding compliance orders, and recommend penalties to the Data Protection Tribunal created by Bill C-27. The maximum penalty for all the contraventions in a given recommendation taken together is the higher of C$10,000,000 and 3% of the organization’s gross global revenue in its previous financial year (s95). Certain more egregious conduct could constitute an offence leading to a fine of a maximum of the higher of C$25,000,000 and 5% of the organization’s gross global revenue in its previous financial year (s128). Finally, the CPPA will introduce a private right of action for individuals affected by a contravention of the Act (s107).
Bill C-27 also introduces the Artificial Intelligence and Data Act (AIDA), which is the first Canadian attempt at regulating private sector development and use of AI. AIDA adopts a principles-based approach to regulation, and focuses on preventing various forms of harm to individuals and their property as well as biased output.
In terms of scope, AIDA would regulate “artificial intelligence systems,” which are defined as “technological system[s] that, autonomously or partly autonomously, [process] data related to human activities through the use of a genetic algorithm, a neural network, machine learning or another technique in order to generate content or make decisions, recommendations or predictions” (s2). AIDA would also broadly apply to “regulated activities” carried out in the course of international or interprovincial trade and commerce (s5(1)), seemingly intending to capture most if not all aspects of AI development and use.
AIDA would require that a person responsible for an AI system assess whether that system qualifies as a “high-impact system” (ss 7 & 10(1)). If so, the person responsible must establish measures to identify, assess and mitigate the risks of harm or biased output that could result from the use of such a system, and establish measures to monitor related compliance (ss 8-9).
AIDA would also create a nuanced transparency regime for high-impact systems that encompasses both intended and actual use. Persons making AI systems available for use will be required to publish a plain-language explanation of the intended use of the AI system, and the decisions, recommendations or predictions that it is intended to make (s11(1)). Similarly, persons managing the operations of an AI system (e.g. organizations putting it to use) will be required to publish a plain-language explanation of the actual use of the AI system, and the decisions, recommendations or predictions that it makes (s11(2)). There are also reporting obligations linked to high-impact systems, wherein a person responsible for a high-impact system must report whether uses of the system result or are likely to result in material forms of harm outlined in AIDA (s12).
In terms of enforcement, AIDA will introduce an administrative monetary penalty scheme by regulation. Penal fines under AIDA would be comparable to those introduced under the CPPA, with the maximum penalty for certain offences being C$25,000,000, or, if greater, 5% of the organization’s global gross revenues in its previous financial year (s40).
Finally, Canada has also introduced a new cyber security bill that would impose obligations on organizations operating in industries of national importance (critical infrastructure). The Bill, Bill C-26, An Act respecting cyber security, amending the Telecommunications Act and making consequential amendments to other Acts, would create the new Critical Cyber Systems Protection Act (CCSPA), which includes mandatory cyber security programs and cyber incident reporting, backed by administrative monetary penalties for non-compliance.
The CCSPA will impose duties on “designated operators”, i.e. operators designated by the government that “operate a work or carry on an undertaking or business [that is within the legislative authority of (the federal) Parliament] in respect of a vital service or vital system”. The duties envisioned include the obligation for a designated operator to establish a cyber security program in respect of the critical cyber systems it manages (s9). The CCSPA will also create a reporting obligation for designated operators in respect of cyber security incidents (s17), and an obligation to keep records related to such incidents and other documents, including, among others, mitigation steps and implementation measures taken (s30).
The CCSPA also provides for the issuing of “Cyber Security Directions” (CSD) which are orders requiring operators to comply with cyber security protection measures for critical cyber systems (ss20-21). These directions may require designated operators to take specific actions in response to emerging cyber threats and other developments. The CCSPA will also allow regulators to issue administrative monetary penalties, with maximum penalties of up to $15,000,000 (s91).
In 2021, Quebec’s government adopted Bill 64, An Act to modernize legislative provisions as regards the protection of personal information, amending the province’s Act respecting the protection of personal information in the private sector (ARPPIPS). While certain Bill 64 requirements came into force in 2022, the majority will come into effect on September 22, 2023. The amended ARPPIPS will offer a modernized privacy regime inspired by the GDPR.
In the realm of accountability and governance, organizations will be required to establish and implement policies and practices relating to the protection of personal information (s3.2). Detailed information about these policies and practices must be available on the organization’s website in clear and simple terms. Additionally, organizations will be required to conduct a privacy impact assessment (PIA) prior to the acquisition, development or redesign of an information system or electronic service delivery project involving any processing of personal information (s3.3). Organizations will also be obliged, as of September, to ensure that the privacy settings of technological products and services offered to the public provide the highest level of confidentiality by default, without any intervention by the individual, though browser cookies are expressly exempt (s9.1).
With regard to reinforced transparency obligations, notice requirements coming into force in September will require that organizations collecting personal information publish pre-emptive or concurrent notices regarding collection. The notice must outline the purposes and means of collection; the individual’s rights of access and rectification and right to withdraw consent; and, as applicable, the names or categories of third parties to whom the personal information might be disclosed and the possibility that it may be communicated outside of Quebec (s8). Organizations will also be required to inform individuals of any collection of personal information using a technology that includes functions allowing the individual to be identified, located or profiled (profiling includes the analysis of a person’s work performance, economic situation, health, personal preferences, interests or behaviour), and of the means available to activate such functions, which will have to be deactivated by default (s8.1).
New limitations created by the ARPPIPS include the requirement to conduct a PIA prior to communicating personal information outside of Quebec (s17), and the requirement to conclude a written contract, that includes specific undertakings, with service providers before transferring any personal information (s18.3).
Individuals will also be able to avail themselves of new rights. Specifically, a new right to de-indexation is created where the dissemination of an individual’s personal information contravenes the law or a court order or otherwise causes serious injury to their reputation and privacy (s28.1). When an organization makes a decision exclusively through automated tools processing personal information, individuals will also have a right to be informed thereof, to request additional information, and the right to submit observations to a designated person within the organization (s12.1).
Finally, the enforcement arsenal of the ARPPIPS is substantially reinforced, with the creation of administrative monetary penalties of a maximum of $10,000,000 or, if greater, 2% of worldwide turnover of the preceding fiscal year, to be imposed directly by the Quebec regulator. In addition, courts will have the power to issue penal fines of up to $25,000,000 or, if greater, 4% of worldwide turnover of the preceding fiscal year. A new private right of action is also created for individuals who suffer injury caused by an unlawful contravention of the ARPPIPS where the infringement is intentional or results from a gross fault (ss90.1-93.1).
2023 is a pivotal year for data protection and privacy in Canada. Organizations doing business in Canada will need to ensure that their privacy governance program and processing practices comply with the new provincial and federal rules. BLG’s Privacy and Data Protection team is available to help your organization navigate these new requirements.