
Battling ‘Deepfakes': What to Know About NY Bill to Outlaw Altered Pornographic Images


After a Long Island man was sentenced for taking photos from the social media accounts of 14 women, dating to when they were in middle and high school, altering them to make them sexually explicit and then posting them on a porn website, there is renewed discussion about the surge in such images and how to go after people who do similar things.

Patrick Carey, who continued posting the fake images until hours before his 2021 arrest, also shared the women's personal identifying information, including full names, phone numbers and addresses, and encouraged other users of the porn site to harass and threaten them with violence, according to court documents.

The images had been altered using what is commonly known as a "deepfake," a technique by which Carey convincingly superimposed the victims' faces onto separate images of women engaged in sexual conduct.

Carey was sentenced to six months in jail, a term that parents of the victims considered "light." But Nassau County District Attorney Anne Donnelly said there was only one reason he is serving any time at all: Investigators found a single image of an underage girl that he used in one of the explicit deepfakes.

That means that the other deepfake content he created, all of which was clearly explicit and outrageous, didn't technically break any laws in New York.

Now, the district attorney is looking to outlaw such images through legislation. But before explaining the proposed law, it's important to know what deepfakes are.

What Are Deepfakes?

A “deepfake” is a video, audio or image that uses machine learning to create a convincing imitation of a real person’s likeness or voice. While some deepfakes can be innocuous, they have also been used for revenge porn, fake news and fraud. 

In another 2021 case, a Pennsylvania mom was accused of creating deepfake videos and photos of underage girls on her daughter’s cheer squad in a prolonged effort to harass them and get them kicked off the team.

Tom Cruise became the subject of viral deepfake videos on TikTok around the same time.

Developers have raced to better understand and track deepfakes to prevent the spread of misinformation. In June 2021, AI researchers at Facebook and Michigan State University said they had developed software that can reveal where deepfakes originated.

Are Deepfakes Illegal?

Not under current New York law, but some would be under a new bill.

Donnelly has proposed legislation that would establish a number of felony and misdemeanor crimes related to deepfakes in order to deter such behavior, with the most serious offenses punishable by up to seven years in prison upon conviction.

"New York State currently lacks the adequate criminal statutes to protect victims of ‘deepfake’ pornography, both adults and children," Donnelly said. "That is why I am proposing the legislature take up the ‘Digital Manipulation Protection Act,’ that would close the loopholes in the law that allow sexual predators and child pornographers to create sexually explicit digitally manipulated images and evade prosecution."

If passed, the "Digital Manipulation Protection Act" would create five new crimes that would apply to those who create sexually explicit deepfakes without the permission of the people pictured. Those crimes, which include specific protections for minors, are:

  • Unlawful Publication of a Sexually Explicit Depiction of an Individual (a proposed class A misdemeanor)
  • Unlawful Dissemination of a Sexually Explicit Depiction of an Individual (a proposed class B misdemeanor)
  • Unlawful Distribution of a Sexually Explicit Depiction of a Child in the First Degree (a proposed class D felony)
  • Unlawful Distribution of a Sexually Explicit Depiction of a Child in the Second Degree (a proposed class E felony)
  • Unlawful Access of a Sexually Explicit Depiction of a Child (a proposed class A misdemeanor)

It's not immediately clear how much support the bill has in the New York State Legislature, but Donnelly said the Carey case proves that an update to current laws is necessary to keep up with new technologies and to prevent the same thing from happening to other victims.

"We cannot protect New Yorkers without making these changes," she said.
