A new Florida law establishing felony criminal penalties for the creation, possession, or distribution of nonconsensual sexually explicit images generated by artificial intelligence took effect on October 1. The legislation is one of the most stringent state-level responses to the growing problem of AI-driven online harassment.
The law, known as House Bill 757, allows victims to pursue civil damages and mandates that online platforms develop systems for removing the harmful content. This move positions Florida at the forefront of states taking decisive action against the misuse of deepfake technology.
Key Takeaways
- Florida's new law (HB 757) makes creating or sharing AI-generated explicit images without consent a felony.
- Penalties range up to 15 years in prison, with harsher sentences for offenders who intend to distribute the material.
- Victims can now sue for a minimum of $10,000 in damages per violation.
- The law requires websites and social media apps to implement a 48-hour takedown process for such content by the end of 2025.
- This state law adds criminal charges, going a step beyond existing federal regulations.
Details of Florida's New Legislation
Effective October 1, House Bill 757 addresses what are commonly known as "deepfakes." These are digital files—including images, videos, or audio—that have been manipulated with artificial intelligence to falsely depict a person saying or doing something they never did.
The law specifically targets sexually explicit deepfakes, where an individual's face is placed onto pornographic material or an ordinary photo is altered to create nudity without their permission. This form of digital abuse has seen a significant increase with the public availability of AI tools.
Criminal Penalties and Legal Framework
The legislation created a new section in state law, Section 800.045, which outlines the criminal offenses. Under this new section, it is a third-degree felony to create, solicit, or knowingly possess an AI-altered sexual image of someone without their consent. A conviction for a third-degree felony in Florida can result in up to five years in prison.
The penalties are more severe for those who distribute the material. Possession with the intent to share or distribute these images is classified as a second-degree felony, which carries a potential prison sentence of up to 15 years. This tiered approach aims to punish distributors more harshly than those who only possess the content.
Building on Federal Law
Florida's law complements the federal Take It Down Act, which was signed in May. The national act requires websites to remove nonconsensual intimate images, whether real or AI-generated, within 48 hours of a victim's request. However, the federal law does not include criminal penalties. Florida's HB 757 adds the critical element of prosecution and potential prison time for offenders.
New Protections and Recourse for Victims
Beyond criminal charges, HB 757 provides victims with powerful new tools to seek justice and regain control over their digital identity. A key component of the law is the right for victims to file civil lawsuits against the creators and distributors of deepfake content.
The law sets statutory damages starting at $10,000 per violation, plus attorney's fees. Importantly, each individual image counts as a separate offense, so damages accumulate quickly: five fabricated images, for example, would expose an offender to at least $50,000 in statutory damages, creating a significant financial deterrent.
Platform Accountability and Content Removal
The law also places new responsibilities on technology companies. By December 31, 2025, websites and applications must establish and maintain a clear reporting system for victims to request the removal of nonconsensual AI-generated images.
Once a platform receives a valid report, it has 48 hours to remove the flagged content and make reasonable efforts to delete any copies. Companies that comply with these takedown requests in good faith are given liability protection, shielding them from lawsuits even if a removal is later found to be mistaken.
The Human Impact of Digital Abuse
The need for such strong legislation is illustrated by the experiences of victims like Elliston Berry, a teenager from Texas. At 14, she discovered that a classmate had taken a photo from her social media, used an AI application to digitally remove her clothing, and circulated the fake nude image at her school.
"This was not a great way to wake up, and I was truly disgusted," Berry stated. "My brain went to so many different areas. I had so many questions but I was just speechless."
Berry, now 16, described the experience as feeling like her innocence was erased "pixel by pixel." She faced the difficult task of explaining the situation to her mother and trying to convince her peers that the images were not real.
A Family's Fight for Change
Her mother, Anna McAdams, found that existing systems were not equipped to handle this new form of abuse. "The school wouldn’t really help us, and then our local law enforcement didn’t know what to do with it," McAdams said. "It was kind of like we were just running in circles."
Frustrated by the lack of options, McAdams and her daughter decided to share their story publicly. Their advocacy work included meetings with lawmakers in Washington, D.C., and helped inform the creation of the national Take It Down Act. Their story highlights the emotional and psychological toll that deepfake abuse can have on individuals and their families.
A Rapidly Growing Problem
According to monitoring groups, reports of nonconsensual deepfake cases increased by nearly 400% in 2024. The vast majority of victims are women and minors, demonstrating a clear and disturbing trend in how this technology is being weaponized.
Steps for Victims of Deepfake Abuse
Legal experts and law enforcement officials advise victims to take specific, immediate actions to protect themselves and build a case against the perpetrators. The combination of new federal and state laws provides a clear path forward.
If you are a victim of this type of abuse, consider the following steps:
- Preserve All Evidence: Take screenshots of the images and the conversations where they were shared. Copy any relevant links and record usernames, dates, and times. This documentation is crucial for any legal action.
- Report to the Platform: Use the website or app's reporting tools immediately. Under the federal Take It Down Act, platforms are required to act on valid removal requests for nonconsensual intimate images within 48 hours.
- Contact Law Enforcement: In Florida, it is important to file a police report. When doing so, reference Section 800.045 of the Florida Statutes, which explicitly defines the creation and distribution of deepfake pornography as a felony.
- Seek Legal Counsel: In addition to criminal charges, victims can pursue a civil lawsuit to seek financial damages and obtain a court order for the permanent removal of the images.
As Elliston Berry noted, the experience can have lasting effects on a person's mental health and create fear about future opportunities. Proactive conversations between parents and children about online safety are becoming increasingly important in the age of artificial intelligence.