Brian Long, CEO of Adaptive Security, on sextortion: “It’s against the law and it’s really, really harmful and dangerous to people”
AI tools have made it easier for bad actors to produce fake images to use in sextortion, but lawmakers and those affected are fighting back.

The world is waking up to one of the malicious ends to which some people are putting artificial intelligence tools: creating deepfake nudes. In 2023, Security Hero reported a 550% increase in the total number of deepfake videos online compared with 2019 figures.
At the time, one in every three deepfake tools allowed users to create deepfake pornography, according to the company, whose mission is to protect people from identity theft. The Security Hero analysis found that women are far and away the primary targets of deepfake pornography, appearing in 99% of the material generated.
Bad actors are producing and sharing altered images not only of famous people but also of people they've met in passing, seen pictures of online or even their own classmates. Research by Thorn last year found that nearly a third of individuals between the ages of 13 and 20 had heard of deepfake nudes, and that one in eight knew someone who had been a victim of them.
In some cases, the perpetrator thinks it is humorous, creates the images for their own pleasure, or uses them to bully the victim. These images have also been used in sextortion, whereby a victim is coerced into sending money or more explicit images of themselves under the threat of the forgery being made public.
Victims fight back as AI-driven sexual exploitation grows
As the world wakes up to this unsettling trend, lawmakers, individuals and victims are standing up and taking action. Last year, Congress passed the Take It Down Act nearly unanimously and President Donald Trump signed it into law. Whereas victims previously could wait months to have nonconsensual intimate images and deepfakes removed from online platforms, the law now requires platforms to take such images down within 48 hours of being notified of them.
First lady Melania Trump says the “Take It Down Act," which criminalizes nonconsensual sharing of sexually explicit content and AI-generated deepfakes, is a "national victory that will help parents and families protect children from online exploitation." https://t.co/1N2V1FHBtf pic.twitter.com/de3t9ec5yS
— ABC News (@ABC) May 19, 2025
And lawmakers haven’t stopped there. The Senate recently passed the DEFIANCE Act, which allows victims to sue individuals who produce and distribute nonconsensual deepfake pornography; the bill is now pending a vote in the House. This came after Elon Musk’s AI chatbot Grok caused a furor when reports surfaced that it was being used to generate sexualized images and non-consensual deepfake nudes, including of minors.
“It’s against the law and it’s really, really harmful and dangerous to people”
One of the companies on the frontline trying to stop the spread of non-consensual deepfakes is Adaptive Security, which uses a proactive approach to analyze behaviors and events to protect against and adapt to threats before they happen.
The company, along with Pathos Consulting Group, is working with Elliston Berry, whose classmate shared a deepfake nude image of her when she was 14 years old, to help educate students, parents and school staff about non-consensual deepfakes. The now 16-year-old, who joined Melania Trump at last year’s State of the Union, developed a 17-minute training course that Adaptive Security is providing free to schools and parents.
Elliston Berry, a teenage victim of explicit AI deepfakes, helped develop a free program to teach parents and schools how to protect their kids from AI-driven abuse.https://t.co/oVue0G5QLz
— Daily Citizen (@DailyCitizen_) January 16, 2026
Speaking to CNN about the Take It Down Act, CEO Brian Long said: “It’s not just for the potential victims, but it’s also for the potential perpetrators of these types of crimes. They need to understand that this isn’t a prank, right? … It’s against the law and it’s really, really harmful and dangerous to people.”
He has also spoken up about the next set of legislation before the House. “The DEFIANCE Act is designed to close that gap by making consequences more predictable for bad actors and giving victims a clearer route to relief,” Long told Cybernews. “In this category of abuse, speed matters as much as the final outcome.”
