Unmasking Deception: Canada Weighs in on the Dark Side of Deepfakes

  • April 30, 2024
  • Jennifer R. Davidson, partner, Technology, Privacy and Cybersecurity Law, and Victoria Di Felice, articling student, of Deeth Williams Wall LLP

In today’s digital age, seeing is no longer believing. Deepfake technologies are shaking up the legal landscape as Canada grapples with how to catch up to these AI imposters. Deepfakes are hyper-realistic media manipulations created using artificial intelligence (AI), where images, audio, or videos are either digitally altered or fully generated by AI to convincingly replace one person’s likeness with another’s.[i] These sophisticated synthetic multimedia manipulations are blurring the lines between reality and fiction.

DEEPFAKES IN THE WILD

Not all use of deepfake technology is nefarious or deceptive. Deepfake technology can be, and is being, used for entertainment purposes, such as augmenting video game characters or developing satirical content.[ii] However, deepfakes spark major concerns when the images, voices and even movements of real people are manipulated to create content that makes it look like the people portrayed are saying or doing things they never said or did.

A 2023 report analyzing the state of deepfakes found that non-consensually altered pornographic clips constitute 98% of all deepfake videos found online.[iii] In late January 2024, sexually explicit deepfakes of singer Taylor Swift went viral on X (formerly known as Twitter). To prevent further distribution, X made Swift’s name unsearchable for 48 hours. Despite this effort, the images were still viewed millions of times before being taken down, with one of the images viewed more than 45 million times.[iv]

Unfortunately, Swift is just one of many victims of pornographic deepfake videos, a crime for which 99% of the individuals targeted are women.[v] Most do not have Taylor Swift’s resources to compel the removal of these images from the internet. The creation and dissemination of non-consensual deepfake pornography not only infringes on a person’s right to control their own image and identity but can also cause irreversible harm to their reputation and mental health.[vi]

The use of deepfake videos to perpetrate fraud is reaching new heights. In February of this year, an employee of a multinational finance company was tricked into handing over $25 million to criminals who used deepfake technology to stage a video conference call populated by convincing recreations of the employee’s colleagues, including the chief financial officer. During the meeting, the employee was instructed to make the transfer and, in dutifully following his apparent superiors, inadvertently handed the money over to the criminals.[vii]

It’s not just corporations being targeted. Last year, a couple living in Newfoundland and Labrador received a call from a voice they recognized as their son’s, claiming to be in trouble and needing their urgent help. They handed over nearly $10,000 in cash to bail him out. Meanwhile, their son was at his own home and knew nothing of the supposed emergency.[viii] These incidents demonstrate the growing sophistication of deepfake-enabled scams, which can impose significant financial consequences on their victims.

In the last two years, deepfakes of politicians have become increasingly common. In March of this year, a deepfake advertisement depicting Justin Trudeau recommending a cryptocurrency exchange was posted on YouTube.[ix] And in September of last year, days before Slovakia’s parliamentary elections, a fake audio recording of one of the candidates boasting about how he had rigged the election surfaced online.[x]

In an era where videos go viral globally in mere seconds through platforms like Instagram, TikTok, and X, and reach audiences who assume that the images they see are authentic, deepfakes can easily spread misinformation and shape public opinion.[xi] The potential use of deepfakes in upcoming elections creates serious concerns that require swift government action.

THE CANADIAN GOVERNMENT’S RESPONSE  

The Canadian Government has taken steps to regulate the nefarious uses of deepfake technologies. The most recent legislative developments include:  

Bill C-63: The Online Harms Act

On February 26 of this year, the Government of Canada introduced Bill C-63, which would enact the Online Harms Act (the Bill).[xii] This is the first piece of federal legislation to explicitly address deepfakes. The Bill aims to hold social media services accountable for “harmful content” hosted on their platforms and to create stronger online protections for everyone in Canada. Among other things, “harmful content” includes “intimate content communicated without consent” and explicitly captures content of this nature created by deepfakes.[xiii] This recognition of the need to regulate the distribution of non-consensual intimate images is long overdue. While several provinces, including Alberta, Nova Scotia, Manitoba, British Columbia, Saskatchewan, New Brunswick, Newfoundland and Labrador, and Prince Edward Island, have passed laws dealing with the distribution of non-consensual intimate images, many of them do not provide any specific recourse for deepfakes.[xiv]

Although a step in the right direction, the Bill does not address harmful deepfake content other than sexual content. As the Bill moves through the legislative process, it remains subject to change; those changes will hopefully expand the scope of harmful content to capture other forms of deepfakes, such as those depicting individuals expressing opinions that could damage their reputation or credibility.

Bill C-65: Amendments to the Canada Elections Act

On March 20, 2024, the Canadian government tabled Bill C-65, which seeks to amend the Canada Elections Act (the Elections Act) and provides a comprehensive set of measures meant to safeguard election integrity and enhance trust in Canada’s electoral process.[xv] Although not yet explicitly set out in the first version of the Bill, there have been reports that section 480.1 of the Elections Act, which is directed at the impersonation of certain people involved in the election process, will be expanded to cover deepfakes.[xvi] In the meantime, Elections Canada has launched an online tool called “ElectoFacts” that Canadians can use to verify whether information they have come across about the federal electoral process is accurate.[xvii]

While Bill C-65 and the ElectoFacts tool mark a significant step forward in curbing the spread of misinformation, Canada trails behind the United States in addressing this issue. Since January of last year, 41 states have introduced legislation to regulate election-related deepfakes, and 11 states have already enacted such laws.[xviii] This underscores a pressing need for Canada to expedite its legislative response, ideally before the next federal election is called.

CONCLUSION

As AI advancements make deepfakes more convincing, our ability to distinguish fact from fiction drastically diminishes.[xix] It is time to accelerate our legal response so that available recourse keeps pace with the swiftly evolving deepfakes reshaping our digital reality.

 

[ii] Ibid; WIPO, “Artificial intelligence: deepfakes in the entertainment industry” (June 2022), online: WIPO <https://www.wipo.int/wipo_magazine/en/2022/02/article_0003.html>.

[iii] Home Security Heroes, “2023 State of Deepfakes” (2023), online: Home Security Heroes <https://www.homesecurityheroes.com/state-of-deepfakes/#key-findings>.

[iv] Eva Zhu, “Will the Taylor Swift AI deepfakes finally make governments take action?” (31 Jan 2024), online: CBC <https://www.cbc.ca/arts/commotion/will-the-taylor-swift-ai-deepfakes-finally-make-governments-take-action-1.7100874>.

[v] Supra note 3.

[vi] Halle Nelson, “Taylor Swift and the Dangers of Deepfake Pornography” (8 Feb 2024), online: National Sexual Violence Resource Center <https://www.nsvrc.org/blogs/feminism/taylor-swift-and-dangers-deepfake-pornography#:~:text=According%20to%20an%20AI%2Ddeveloped,altered%20images%20of%20underage%20girls>.

[vii] Heather Chen & Kathleen Magramo, “Finance worker pays out $25 million after video call with deepfake ‘chief financial officer’” (4 Feb 2024), online: CNN <https://www.cnn.com/2024/02/04/asia/deepfake-cfo-scam-hong-kong-intl-hnk/index.html>.

[viii] Mark Quinn, “N.L. family warns others not to fall victim to the same deepfake phone scam that cost them $10K” (29 Mar 2023), online: CBC <https://www.cbc.ca/news/canada/newfoundland-labrador/deepfake-phone-scame-1.6793296>.

[x] Curt Devine, Donie O’Sullivan & Sean Lyngaas, “A fake recording of a candidate saying he’d rigged the election went viral. Experts say it’s only the beginning” (1 Feb 2024), online: CNN <https://www.cnn.com/2024/02/01/politics/election-deepfake-threats-invs/index.html>.

[xi] Supra note 1; Government of Canada, “Disinformation, Deepfakes, and the Human Response” (16 Nov 2023), online: Canada <https://www.canada.ca/en/security-intelligence-service/corporate/publications/the-evolution-of-disinformation-a-deepfake-future/disinformation-deepfakes-and-the-human-response.html>.

[xii] Bill C-63, An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts, 1st session, 44th Parliament, 2024 (first reading 26 February 2024).

[xiii] Ibid, Online Harms Act, s 2(1): See definition of “intimate content communicated without consent”, subsection (b).

[xiv] Intimate Images Protection Act, SBC 2023, c 11 (BC); Protecting Victims of Non-consensual Distribution of Intimate Images Act, SA 2017, c P-26.9 (AB); Intimate Images and Cyber-protection Act, SNS 2017, c 7 (NS); The Intimate Image Protection Act, CCSM c I87 (MB); The Privacy (Intimate Images – Additional Remedies) Amendment Act, 2022, SS 2022, c 29 (SK); Intimate Images Unlawful Distribution Act, SNB 2022, c 1 (NB); Intimate Images Protection Act, RSPEI 1988, c I-9.1 (PEI); Intimate Images Protection Act, SNL 2018, c I-22 (NL).

[xv] Bill C-65, An Act to amend the Canada Elections Act, 1st session, 44th Parliament, 2024 (first reading 20 March 2024), “Summary” preamble.

[xvi] Darren Major, “Liberals introduce legislation amending Elections Act as part of agreement with NDP” (20 Mar 2024), online: CBC <https://www.cbc.ca/news/politics/elections-act-update-legislation-1.7149657>; Rachel Aiello, “Liberals table elections law reforms aimed at making it easier to vote, harder to meddle” (20 Mar 2024), online: CTV News <https://www.ctvnews.ca/politics/liberals-table-electoral-reform-legislation-that-could-change-the-way-voters-cast-their-ballots-1.6814769>.

[xvii] Elections Canada, “New resource to counter misinformation and disinformation about the electoral process” (9 Jan 2024), online: Elections Canada <https://www.elections.ca/content.aspx?section=med&dir=pre&document=jan0924&lang=e>.

[xviii] Public Citizen, “Tracker: State Legislation on Deepfakes in Elections” (last updated 5 April 2024), online: Public Citizen <https://www.citizen.org/article/tracker-legislation-on-deepfakes-in-elections/>.

[xix] Patrick Tucker, “Deepfakes Are Getting Better, Easier to Make, and Cheaper” (6 August 2020), online: Defense One <https://www.defenseone.com/technology/2020/08/deepfakes-are-getting-better-easier-make-and-cheaper/167536/>.

Any article or other information or content expressed or made available in this Section is that of the respective author(s) and not of the OBA.