Deepfake Cyberbullying Surges in Schools as AI Makes Explicit Images Easy to Create
AI tools are making it easier for students to produce realistic, sexually explicit deepfakes of classmates, fueling a new wave of cyberbullying that can cause long-term trauma. A high-profile Louisiana middle school case led to criminal charges and highlighted gaps in school responses. Experts urge updated policies, clearer reporting paths and open conversations between parents, students and educators.

Schools across the United States are confronting a rapid rise in student-created deepfakes: realistic, AI-generated images and videos that depict classmates in sexually explicit scenarios. The spread of these manipulated materials can result in prolonged harassment, public humiliation and severe emotional trauma for victims.
State Laws and Legal Action
The problem gained national attention this fall when AI-generated nude images circulated at a Louisiana middle school. One targeted student was expelled following a fight related to the images, and two boys were ultimately charged in the case, in what are believed to be the first prosecutions under Louisiana’s recently enacted statute.
Legislatures are responding: by 2025, at least half of U.S. states had passed laws addressing generative AI deepfakes, including measures that specifically target simulated child sexual abuse material, according to the National Conference of State Legislatures. Students elsewhere have faced charges or expulsions in states including Florida, Pennsylvania and California, and a Texas elementary school teacher was charged with using AI to create illicit images of students.
How Deepfakes Have Evolved
Deepfakes began as niche projects and political smear tools, but the technology has become widely accessible. "You can do it on an app, you can download it on social media, and you don’t have to have any technical expertise whatsoever," said Sergio Alexander, a research associate at Texas Christian University. This ease of use has driven a dramatic spike in reports: the National Center for Missing & Exploited Children said reports of AI-generated child sexual abuse material rose from 4,700 in 2023 to 440,000 in the first six months of 2025.
Why Schools May Be Unprepared
Experts warn that many schools have not updated policies or training to address AI-manipulated images. Sameer Hinduja, co-director of the Cyberbullying Research Center, urges schools to explicitly update rules and educate staff and students so young people do not assume adults are unaware or powerless.
“This incident highlights a serious concern that all parents should address with their children,” Lafourche Parish Sheriff Craig Webre said in a news release.
Psychological Impact
Unlike a rumor or a nasty text, a deepfake is a visual artifact that can spread quickly and reappear over time, compounding victims' distress. Many young people develop anxiety, depression and social withdrawal after being targeted. "They literally shut down because it makes it feel like there’s no way they can prove this is not real — because it does look 100% real," Alexander said.
How Parents and Schools Should Respond
Parents and educators can take practical steps: open conversations, clear reporting pathways, tightened policies and partnerships with law enforcement and platform providers. Experts recommend normalizing talk about manipulated content and creating a nonpunitive environment so students feel safe reporting incidents.
Laura Tierney, founder and CEO of The Social Institute, offers the SHIELD framework as a response roadmap:
- S — Stop: Do not forward or share the image.
- H — Huddle: Seek a trusted adult immediately.
- I — Inform: Report to the social platform where the content is posted.
- E — Evidence: Note who shared the content and when, but avoid downloading the file.
- L — Limit: Restrict social media access while the situation is addressed.
- D — Direct: Connect the victim to counseling, school staff, or law enforcement as needed.
Takeaways
Deepfake cyberbullying is a growing, cross-cutting problem that requires updated school policies, informed parents, better education for students, and coordinated responses with platforms and law enforcement to protect victims and deter abuse.
The Associated Press provided reporting for this story. AP retains editorial responsibility for the content.