
By: Michael J. McConnell, New York Personal Injury Lawyer

The Take It Down Act was signed into federal law this week after receiving widespread bipartisan support. In essence, the law seeks to protect victims of revenge porn – including “deepfake” images created by artificial intelligence – through tougher penalties and requirements for sites to take down such content within 48 hours after being notified by a victim.

In this article, I will briefly discuss the revenge porn problem in the age of artificial intelligence, the key details of the Take It Down Act, and how it will help victims moving forward.

The Revenge Porn Problem

Revenge porn has plagued society for quite some time. The classic and perhaps most common example is when a former intimate partner shares nude photos or explicit videos online of their former partner without their consent after a breakup or divorce. This is typically done as a way of embarrassing or “getting back at” someone.

The problem has also existed for quite some time among high school and college students. Whether it is a classmate obtaining and sharing such material through social media, Snapchat, text message, or any other platform, the consequences can range from emotional distress all the way to suicide. Indeed, there have been countless tragic stories of teens committing suicide after someone shared or threatened to share an intimate image of them without their consent.

More recently, the problem has expanded to AI-generated “deepfake” photos that appear to be real-life intimate images. In some cases, the publisher of the image knows the victim personally, such as a classmate or former partner. In others, the publisher is a complete stranger to the victim, which is typical in instances where a deepfake image of a celebrity has been shared online.

In any event, the public awareness from some of these cases appears to have been a major motivating factor in Congress deciding to move forward with the Take It Down Act. Legislators from both sides of the aisle highlighted these stories in support of passing this legislation. Senator Ted Cruz, who cosponsored the bill with Senator Klobuchar, stated that the story of Elliston Berry in particular motivated this legislation.

Elliston Berry was just a fourteen-year-old girl when a classmate used an artificial intelligence program to create a fake but realistic-looking nude photograph of her, as well as of several other female students. The photo then spread like wildfire through the school by way of Snapchat. Berry’s mother relentlessly tried to get Snapchat to take the image down, but it took over nine months for that to happen.

The trauma that victims – particularly children – suffer from revenge porn, including deepfakes, is significant and well documented. It can lead to depression, anxiety, and countless other psychological and mental health issues. As mentioned above, it has led to suicide on far too many occasions as well. Obviously, the longer something stays up online, the more people will see it and thus the greater the trauma the victim will suffer.

The issue in many of the revenge porn situations prior to the passage of the Take It Down Act was that there was a lack of federal regulation on what the duties of online platforms were regarding taking this content down. Additionally, proponents of the bill believed that, for the most part, the criminal penalties that some states enacted for publishing non-consensual intimate images including AI-generated ones were insufficient to truly tackle the problem and deter future conduct.

What Exactly Does the Take It Down Act Do?

The Take It Down Act makes it a federal crime to publish or threaten to publish non-consensual intimate images (NCII), including AI-created deepfakes. Additionally, online platforms are now required by law to take such content down within 48 hours of a victim’s request, as well as make reasonable efforts to remove duplicates and reposts.

It is important for victims of revenge porn to understand that the Take It Down Act makes clear that consent to create an image does not mean someone has consent to share it. In other words, even if a victim shares an intimate photograph or consents to an AI-created one, that does not give anyone the right to share it with anyone else. Anyone who shares such an image without the victim’s consent can be criminally punished under this law.

Individuals who are convicted under the Take It Down Act can be sent to prison for up to two years if the victim is an adult, or up to three years if the victim is a minor. Platforms that fail to remove the content can be fined by the Federal Trade Commission (FTC).

Why the Take It Down Act Matters

The intent of the law is clearly to prevent the widespread problem of revenge porn. It makes clear to potential perpetrators that they can and will be held responsible for publishing or threatening to publish intimate content without someone’s consent.

This is particularly important for survivors of domestic abuse, sexual abuse, and sex trafficking. It is well documented that abusers tend to use these intimate images as leverage against their victims.

In fact, just last week in the criminal trial against P. Diddy, we heard testimony from key witness Cassie Ventura that Diddy had threatened to release videos of her performing sexual acts in the so-called “freak offs” as a means of intimidation to assert control over her.

Such a tactic is a common one in abusive relationships, but the good news is that the Take It Down Act sends a clear message to abusers that even just the mere act of threatening to publish intimate content of someone without their consent is enough to send them to prison for several years.

Additionally, in the context of high school students and minors in general, the ability to get platforms to quickly remove these images might help mitigate the harm when a student inevitably does share non-consensual intimate images online.

Finally, many school systems and universities are beginning to do a better job of educating students about the harm caused by such conduct. To the extent they can prevent the problem and make clear to students that such conduct will not be tolerated, the fewer of these tragic cases we may see.

Final Thoughts

The Take It Down Act is an important step forward in protecting potential victims of online revenge porn, including AI-generated content. Publishing or threatening to publish such content can result in significant criminal punishment, including prison time. Platforms can also be fined for failing to remove the content within 48 hours of a verified request. All in all, the law is a win for potential victims of revenge porn, including survivors of domestic abuse, sexual abuse, and human trafficking, as these vulnerable groups are commonly targets of revenge porn.