
DeepFakes Aren't Fake. Here's Why.

More and more AI-generated nudes, or “deepfakes,” are appearing in schools. That’s because young people are regularly being served ads about apps that “undress” their classmates. While some may call these images “fake,” the harm they cause to a person’s privacy, dignity, and bodily autonomy is very real. In this video, a student generates an AI nude of his classmate, and talks to his friend about the consequences. He learns the truth about “deepfakes” and how these AI tools are being used to create harmful, non-consensual content. Whether it’s a “joke” or something more serious, AI-generated imagery can have real-world consequences for mental health and reputations. [AMZ-187]

Featured Resources:

This video is the third of five in a series on “Combatting the Manosphere.” The videos and corresponding collateral material are produced by AMAZE in collaboration with the LinkUp Lab, a project of Equimundo and Futures Without Violence.

LinkUp Lab

Youth

Digital boundaries are just as important as physical ones. Using someone’s image without their consent—especially to create a “nudified” version—is a violation of their body and a form of sexual violence. The reason “deepfakes” aren’t really fake is that they cause long-lasting emotional harm, including feelings of shame, fear, and a loss of safety for the person depicted. The image can also follow them around for the rest of their life, just like a nasty rumor that won’t go away no matter how many times they explain it’s untrue.

There are real, long-lasting consequences for the person creating the AI nude too. Not only do schools suspend or even expel students who create these images, but laws also make it illegal to generate a naked image of someone without their consent. Many times, the police get involved. And if the person depicted in the image is under 18, the images are often legally classified as child sexual abuse material, which carries even harsher punishment.

So if a friend talks about or shows you an AI nude, it’s important you tell them it’s not okay with you. Just like in the video, it’s often helpful to ask someone to put themselves in that person’s position: “How would you feel if someone shared a naked picture of you behind your back?” Pushing back might feel scary, but it’s a form of kindness that can prevent your friend from causing serious harm (and getting in trouble themselves too).

It’s also important you not share these images. That causes further harm and can have serious legal consequences. If you see this content on a social platform, report it immediately. If you or someone you know is affected, tell a trusted adult so you don’t have to handle it alone. If a friend is targeted, remember that they are the victim of a crime. Offer to help them use reporting tools rather than asking how it happened.

FAQs

Can someone make a realistic looking nude image of me even if I don’t post nudes?

Yes. The technology only needs a few regular photos or a short video of your face to create a realistic “nudified” version or video. This is why it’s important to keep your social media profiles private and only accept follow requests from people you know in real life.

Is there any way to get an image deleted once it's out there?

Yes, but it is very difficult because online images spread quickly. The best steps are to report the content on the platform and talk to a trusted adult who can help you handle the situation. There are even more resources at https://takeitdown.ncmec.org/

What should I do if I find a nude image of myself or someone I know online?

First, do not delete the evidence—take screenshots or screen recordings. Then, report the content directly on the platform, and use a tool like TakeItDown.ncmec.org (if you are under 18) or StopNCII.org (if you are over 18). These tools create a “digital fingerprint” of the image on your device so platforms can find and block it without you ever having to send the actual photo to anyone.

Parents

As AI technology becomes more accessible, many young people encounter “undressing apps” without even seeking them out. It is essential to help youth recognize that digital harm is serious. Just like in the video, it’s often helpful to ask your child to put themselves in a victim’s position: “How would you feel if someone shared a naked picture of you behind your back?”

If your child is struggling to understand how something that is “fake” can be such a big deal, one comparison that might help is a rumor. Rumors are usually untrue, but once they start and get repeated over and over, they are very hard to debunk. Even when people know a rumor about someone isn’t true, they may still use it to joke about and tease that person. In the same way, the effect of an AI nude is real even if the content is not. Plus, once an image is online, it becomes very difficult to control what happens to it. It could follow someone into college or prevent them from getting a job. Their grandparents could stumble upon it today, and their kids could find it in the future.

In your conversations, ensure young people understand that creating or sharing AI nudes is against the law. If it involves minors, it is considered child sexual abuse material. Schools often expel students for this, and criminal charges can easily be filed.

Lead any conversation on this topic with a focus on how AI-generated nudes impact the victim. While it’s important your child realizes there are legal consequences for themselves as well, their key takeaway shouldn’t just be “that’s illegal and I’ll get in trouble,” but rather, “that’s harmful, nonconsensual, and wrong to do to someone.”

As with many topics AMAZE covers, you don’t need “one big talk.” Bring it up while driving, doing dishes, or during a commercial break. You can also start by listening more than you talk, asking open-ended questions to learn how they see these technologies being used in their world. If they admit to seeing these images, stay non-judgmental. Focus on the harm the technology causes rather than reacting with immediate discipline. This keeps the door open for them to come to you if they are ever targeted.

Conversation Starters

Model consent: Ask your child for permission before you take or post a photo of them.

This teaches them that they have agency over their own image and should respect the same in others. Then you can build empathy by asking how they would feel if someone altered an image of them or someone they loved and shared it around. Remind them that images online spread rapidly and are nearly impossible to fully remove.

Use current events to start an open-ended conversation.

Try something like, “I saw a video today about ‘undressing apps’ that use AI to create fake images. Have you or your friends seen ads for those on social media?” Or, “There’s a new law called the Take It Down Act that helps people remove fake images of themselves from the internet. Do you think kids at school know that creating those images is actually a crime?”

Educators

Educators play a vital role in creating a school culture that values bodily autonomy (in-person and online) and bystander intervention. As AI technology becomes more accessible, many students encounter “undressing apps” through social media advertisements without ever seeking them out. For school staff, it is essential to help youth recognize that digital harm is a serious violation of a peer’s dignity. In a classroom or advisory setting, it can be effective to ask students to consider the perspective of the person targeted: “How would you feel if a private, violating image of you was circulated throughout the school without your knowledge?”

If students struggle to understand how a “fake” image can cause harm, the comparison to a rumor is a powerful teaching tool. Like a rumor, an AI nude is fabricated, yet once it is shared, the social consequences are immediate and difficult to reverse. Even when others know an image is generated by AI, they may still use it to tease or harass the victim. Educators can help students see that the emotional impact—shame, fear, and loss of safety—is real regardless of how the content was created. Students should also understand that once online, it becomes very difficult to control what happens to that image. Their grandparents could stumble upon it today, and their kids could find it in the future.

It is vital to ensure students understand the severe consequences surrounding this technology. It is illegal to create or share AI nudes, and it is a violation of school conduct codes that often results in suspension or expulsion. Furthermore, when these images involve minors, they are legally classified as child sexual abuse material. School staff should be clear that these actions can lead to permanent criminal records under laws like the Take It Down Act.

When addressing these incidents, the focus should remain on the impact on the victim and the importance of bodily autonomy. The goal is for students to move beyond a fear of “getting in trouble” and toward a fundamental understanding that creating such content is a form of sexual violence. Educators should also address the gender dynamics often present in these situations, as the technology is frequently used to target girls and women, reinforcing harmful power imbalances.

Rather than relying on a single assembly, look for “teachable moments” in digital citizenship lessons or media literacy discussions. By maintaining a non-judgmental and approachable tone, staff can encourage students to report these images rather than passing them on. When students feel they can talk to a trusted teacher without immediate, reflexive punishment, they are more likely to speak up as bystanders, which is the most effective way to stop the spread of harmful AI-generated content within a school community.

National Sex Education Standards

CHR.2.CC.2 - Bodily Autonomy and Personal Boundaries

Define bodily autonomy and personal boundaries

View all CHR.2.CC.2 Videos

CHR.2.CC.3 - Consent
CHR.5.CC.2 - The Relationship between Consent, Personal Boundaries, and Bodily Autonomy

Explain the relationship between consent, personal boundaries, and bodily autonomy

View all CHR.5.CC.2 Videos

CHR.8.INF.2 - Impact of Technology and Social Media on Relationships

Evaluate the impact of technology (e.g., use of smart phones, GPS tracking) and social media on relationships (e.g., consent, communication)

View all CHR.8.INF.2 Videos

CHR.10.INF.2 - Potentially Positive and Negative Roles of Technology and Social Media

Analyze the potentially positive and negative roles of technology and social media on one’s sense of self and within relationships

View all CHR.10.INF.2 Videos

IV.2.CC.1 - Child Sexual Abuse

Define child sexual abuse and identify behaviors that would be considered child sexual abuse

View all IV.2.CC.1 Videos

IV.5.CC.1 - Child Sexual Abuse, Sexual Harassment, and Domestic Violence

Define child sexual abuse, sexual harassment, and domestic violence and explain why they are harmful and their potential impacts

View all IV.5.CC.1 Videos

IV.10.CC.1 - State and Federal Laws Related to Intimate Partner and Sexual Violence

Identify the state and federal laws related to intimate partner and sexual violence (e.g., sexual harassment, sexual abuse, sexual assault, domestic violence)

View all IV.10.CC.1 Videos

IV.10.CC.2 - Types of Abuse

Describe the types of abuse (e.g., physical, emotional, psychological, financial, and sexual) and the cycle of violence as it relates to sexual abuse, domestic violence, dating violence, and gender-based violence

View all IV.10.CC.2 Videos

IV.12.INF.1 - Attitudes and Beliefs about Interpersonal and Sexual Violence

Analyze how peers, family, media, society, culture, and a person’s intersecting identities can influence attitudes and beliefs about interpersonal and sexual violence

View all IV.12.INF.1 Videos