What Should You Do If Someone Creates a Deepfake of You?

Column by Ronaldo Lemos published in Folha de São Paulo.


February 6, 2024


In less than five minutes, fake pornographic content can be produced and cause lasting trauma.

Last week, the world was shocked by the spread of fake pornographic images of singer Taylor Swift. These deepfakes went viral on Twitter (X), quickly amassing millions of views. Although platforms acted swiftly to remove most images and temporarily made the singer’s name unsearchable, the damage had already been done.

If this can happen to Swift, what about the rest of us? She’s currently the most prominent celebrity, earning $2 billion last year. She has a team of lawyers and image managers at her disposal, yet she couldn’t stop the viral spread of deepfakes.

So, what should you do if this happens to you in Brazil? Unfortunately, creating pornographic deepfakes has become disturbingly simple. Several apps can generate nude images from a single photo, while others allow users to superimpose a victim’s face onto a pornographic video.

If this happens to you, taking immediate steps to minimize the damage is crucial. The first step is to document the crime. Gather evidence of the images, the platforms they appeared on, and the accounts or profiles responsible for sharing them.

Next, notify the platforms as soon as possible. Under Brazil’s Civil Rights Framework for the Internet (Marco Civil da Internet), platforms are required to remove images and videos featuring nudity or sexual acts posted without the consent of those depicted once they have been notified. Most platforms have a dedicated reporting channel for this type of content, which victims can use to request the immediate removal of the deepfake. The relevant links can be found on Facebook’s Help Center and Google’s “Remove Information” page.

Another crucial step is to file a police report. If a cybercrime unit is available in your state, report the crime there (such units exist in most states). If not, file a report at a regular police station or online to protect your rights.

You can also pursue legal action against the individuals who distributed the images and against the platforms if they fail to remove the content promptly after notification. Under the Civil Rights Framework, it’s possible to obtain connection records to help identify those behind the crime, even if the account is anonymous or fake.

If the victim is an adult, the penalty for distributing non-consensual pornographic images or videos is 1 to 5 years in prison. If the victim is a minor, under the Child and Adolescent Statute (ECA), those responsible face a prison sentence of 3 to 6 years. If the deepfake also constitutes cyberbullying, additional penalties of 2 to 4 years in prison may apply.

It’s important to note that the law does not distinguish whether the scene is real or fake. Many legal experts, including this columnist, argue that the current law already covers deepfakes.

Unfortunately, most pornographic deepfakes today originate in schools and involve minors. Often, these are classmates trying to humiliate someone – usually girls. This makes it essential for parents and guardians to educate their children about the serious consequences of creating and sharing deepfakes. While it may take less than five minutes to make a pornographic deepfake, the repercussions can be devastating and long-lasting.

What’s out: Generation 9-9-6 in China (those who work from 9 a.m. to 9 p.m., six days a week).

What’s in: Generation 躺平 (“tang ping”, or “lying flat”) in China, who refuse to work.

What’s next: Generation 佛系 (fo xi) in China, who aspire to a simple life like Buddha’s.