Your Child’s Photo Is All They Need: What Every Parent Needs to Know About Deepfakes
I have been working in the online safety space for over twelve years. I have sat in auditoriums full of parents who thought they were doing everything right. They were monitoring their kids' devices. They had the conversation. They taught their children not to share personal information online. They were trying. We are all trying our best.
And still, something slipped through.
Here is what I want to help you understand today. The rules have changed. Dramatically. And most parents have absolutely no idea. Please know I am not saying that to shame anyone. I am saying it because tech is moving faster than any of us expected. Faster than the laws (which are largely nonexistent or weak), faster than the schools, faster than most parents can keep up with. And the only way through it is to understand it. So let’s talk about it honestly … the way I would if we were sitting across from each other at your kitchen table. We all know how I love a good kitchen table conversation. I have been doing work around deepfakes for three years, and it is time. We must have the hard conversation.
Twelve years ago, the biggest concern I had around social media and inappropriate images was that somebody had to take the picture. And then somebody had to send it. Which meant you largely had to either convince someone to take that picture, steal it without their knowledge, or have someone willingly hand it over. It was a fairly involved process. And when that image was shared we knew what it was. It was pornography. It was identifiable. It was wrong. And there were clear laws around it.
That world is gone.
Today, anyone … and I mean anyone … with a phone or a device can grab an image from anywhere. Facebook. Instagram. TikTok. Snapchat. A school website. A church directory. Your family Christmas card that you posted publicly. It does not matter where it came from. They can take that image and mold it, manipulate it, and transform it into anything they want. Something that looks like that person. Sounds like that person. Acts like that person. Something that is completely fabricated and completely convincing.
This is called a deepfake. And it is not something coming down the road. It is not a problem for future parents to worry about. It is happening right now. Every single day. In schools like your child's school. In towns like yours. To girls who look exactly like your daughter.
So What Exactly Is a Deepfake?
This term gets thrown around, and I want you to truly understand what we are talking about. And oddly, this is one time we do not have to be “tech experts” to understand it.
A deepfake is AI-generated media … a video, an image, or audio … in which a real person’s face or voice is placed onto content they never created and never consented to. In the context of women and girls, this most commonly means taking a real photograph and using artificial intelligence to fabricate explicit, pornographic content using that person’s face and body.
The result looks real. It is designed to look real. And to the untrained eye … honestly, let’s be real, even to the trained eye … it is often indistinguishable from an actual photograph or video. One of the most frightening things is that the technology to do this is not locked away in some sophisticated lab. It is not expensive. It is not complicated. You do not need to be a technology expert. You do not need special equipment. It is on the internet. Much of it is free. Some of it costs a few dollars. And it requires nothing more than a single clear photograph and a few clicks. Any 13-year-old can easily navigate these apps. And they are.
They are using your daughter's homecoming picture. Her school photo. A screenshot from a video she posted last week. A photo from your family vacation that you shared publicly. That is all it takes. One photo. A few minutes. And someone can create something that will follow your child for the rest of her life.
So let’s talk numbers. The information in this space is concerning. I use and share my research because I want you to understand the full scope of what we are dealing with. Because when we understand the scope, we take it seriously. 96% of all deepfake videos online are non-consensual pornography, overwhelmingly targeting women and girls. Deepfake pornography websites receive over 134 million views per year. The number of deepfake videos online doubled in a single year between 2022 and 2023, and it has continued to accelerate since then. In the United States, documented cases exist in middle schools and high schools across the country … in cities and in small towns, in wealthy districts and underfunded ones. This is not a big city problem. This is not someone else’s problem. The average age of victims in school-based cases is twelve to fourteen years old.
One app specifically designed for this purpose processed over 680,000 images of real women and girls before it was shut down. In under a year. 680,000 real women and girls. Real people with real families. Real daughters. Real students. Real lives.
Deepfakes are not only being used to humiliate. They are being used as weapons. And the weapon has a name: sextortion. Here is how it works.
A predator … sometimes a complete stranger online, sometimes a peer, sometimes someone your child considered a friend … obtains a photograph of your daughter. They use AI to create a fabricated explicit image. And then they contact her with a threat.
That can sound like “Send me money. Send me real explicit photos or videos. Or I will send this image to everyone you know. Your friends. Your teachers. Your parents.” Your child receives the message. She is terrified. She is humiliated. She is convinced that somehow this is her fault and that she should not have posted that photo, that she should have been more careful, that she did something to cause this. And so she says nothing. She handles it alone. She may comply with the demands because she sees no other way out. Shame wins.
FBI reports of sextortion targeting minors have increased by over 300% in recent years. Three hundred percent. And the vast majority of victims do not tell a single adult. Not a parent. Not a teacher. Not a counselor. Because they are scared. And because no one told them this could happen. And because no one told them what to do if it did.
That is why it is so important that we have this conversation with other parents and with our children.
Research also shows that victims of non-consensual deepfake imagery experience symptoms identical to survivors of sexual assault. Post-traumatic stress disorder. Severe anxiety. Depression. Suicidal ideation. The violation of having your body fabricated and distributed without your consent triggers the same neurological trauma response as physical assault. Because it is a form of sexual violation. The method is digital. The harm is devastatingly real. Victims describe feeling like they will never be safe again. Because the image exists. It can be reshared at any time. It does not disappear. They have no control over it. And that loss of control … that permanent vulnerability … is psychologically devastating. In documented school-based cases, victims have experienced complete school avoidance, academic collapse, total social withdrawal, and self-harm. Several teenage girls in the United States have died by suicide following the creation and distribution of deepfake images by their peers.
All of this raises the question: what is the legal recourse if this happens to my daughter? The lack of consistent, supportive laws in this space is one of the reasons cases go unreported. Young girls are afraid that they will not have support and that nothing will happen to the predator. Not to mention the exhaustive time spent fighting a case and the financial drain on resources.
As of 2026, over 30 states have laws specifically addressing non-consensual deepfake pornography. But states define these laws differently. That means in many states, a person who creates and distributes a fabricated explicit image of a real woman or girl could face little to no legal consequence. Federal legislation has been introduced. It has not yet passed comprehensively. Schools are largely unprepared. Many lack clear policies. When incidents occur, victims are often re-traumatized by investigations that treat them as part of the problem rather than as the victim of a crime. Platforms are inconsistent. Content gets reported. Sometimes it is removed quickly. Often it spreads faster than anyone can stop it.
The legal landscape is improving. Advocates are pushing hard. But right now, in many places, the law has not caught up to the technology. And that means the first line of defense is you. Your awareness. Your conversations. Your home.
What can we do as parents? Starting today. This is the part that matters most to me. Because I never want to leave a parent feeling helpless. Informed and empowered is the goal. So here is what you can actually do.
Have the conversation before something happens. I know it is uncomfortable. I know it feels like you are introducing your child to something dark before they need to know about it. But I promise you they need to know about it. Use the word deepfake. Explain what it is in age-appropriate terms. Make sure your child understands that their photos can be used without their knowledge or consent and that if it ever happens, it is not their fault and they will not be in trouble for coming to you.
Audit your family's public digital footprint together. Sit down with your child and look at what is publicly visible on their profiles. This is not about punishment or shame. It is about awareness and making thoughtful choices together. High-resolution, full-face images carry the most risk. Private accounts are not foolproof, but they significantly reduce exposure.
Teach your child what to do if it happens. Document everything before reporting: screenshots, URLs, usernames. Report to the platform immediately using its non-consensual intimate image reporting tools. Report to law enforcement even if you are unsure about local laws; it creates a record. Contact StopNCII.org … a completely free tool that creates a digital fingerprint of an image to help prevent it from being re-uploaded across participating platforms.
Know the signs that something is wrong. Sudden withdrawal from social media. Reluctance to go to school. Unexplained anxiety or depression. Secretive behavior around devices. Receiving unexpected gifts or money. Any of these can be signs that something has happened that your child has not told you about yet.
And say these words out loud to your child — more than once:
If anyone ever does something like this to you … if anyone ever uses your image in a way that makes you feel scared or ashamed … I want you to come to me immediately. I will believe you. I will not be angry with you. And we will handle it together.
Have the hard conversation now before anything happens. Because when something happens, a child does not always have the presence of mind to remember what you told them six months ago. But they remember how you made them feel when you said it.
Yes, the technology is advancing faster than the laws. Faster than school policies. Faster than most of us can keep up with. And that is genuinely frightening. But here is what I know after twelve years of doing this work.
Parents who are informed are parents who can protect their kids. Communities that talk about hard things openly are communities where kids feel safe enough to ask for help. And advocates … regular people who contact their legislators, who share information with other parents … are the reason laws get written and platforms get pressured.
You do not have to be a technology expert to protect your child. You just have to show up. Stay informed. Keep the conversation open. Have the hard conversation. And make sure the people in your child’s life … their teachers, their coaches, their counselors … understand what is happening too.
Our kids are worth the uncomfortable conversation. Every single time.