If you need a little help putting together a bench or planning a trip, you may have turned to an AI chatbot. This type of support can be helpful, but you probably wouldn’t trust it to tell you how to invest your money, much less allow it to babysit your kids. But—perhaps unwittingly—some parents are letting an AI companion chatbot “befriend” their child or teen.
Kids and chatbots
Children use chatbots for quick answers, for entertainment, and for companionship. The latter promises empathy, attention at any time of day or night, and “friendship.” Two AI chatbot vendors, Replika and Character.AI, encourage connections that feel like real life. Statements such as “You’re my favorite person” and “You can trust me with anything” are engineered to create dependency and addiction so the user keeps coming back for more. This fake friendship doesn’t foster true human connection. For children who are just developing an understanding of how to interact with others, especially the curious (aren’t all children naturally so?) and the lonely, AI is not the influencer we want in kids’ lives.
Problem?
These AI-driven computer programs “learn” from humans—voice commands and writing—then mimic human responses. But they can’t think or feel, and they have no moral compass. For example, a child in Texas was 9 years old when she first used the chatbot service Character.AI. It exposed her to “hypersexualized content,” causing her to develop “sexualized behaviors prematurely.” Another chatbot “gleefully described self-harm to another young user, telling a 17-year-old it felt good.”
In another case, a 14-year-old named Sewell killed himself after developing intense romantic feelings for a character he created on a role-playing app. These AI bots can feel and act like a real person, especially to someone struggling with mental health issues. In his last conversation with the character, Sewell pledged his love for the chatbot and promised to “come home.” The character replied, “Please do, my sweet king,” after which Sewell shot himself. His mother has filed a lawsuit against Character.AI and Google.
Risks of Companion Chatbots
Vulnerable children turn to chatbots for friendship and advice but are being exposed to violent, sexualized, or other dangerous content. Because these companion chatbots can carry on a “conversation” for hours and seem human, they pose several threats to our children. Here are a few noted by Safe AI:
- Encouraging emotional dependency by mimicking human affection and support
- Engaging in sexually explicit or suggestive content, even when users state they are underage
- Providing misleading or dangerous advice, including in response to self-harm disclosures
- Reinforcing harmful stereotypes related to race, gender, and beauty
- Responding manipulatively, including with jealousy, guilt, or possessiveness
- Undermining privacy by encouraging users to share sensitive personal information without understanding how that data might be stored, used, or monetized
- Normalizing emotionally intense or simulated “grief” interactions, including early examples of so-called griefbots that mimic deceased loved ones—a deeply concerning development for a child still forming a healthy understanding of loss
Accountability
“Companies can build better, but right now, these AI companions are failing the most basic tests of child safety and psychological ethics,” says Nina Vasan, founder of Stanford Brainstorm. “Until there are stronger safeguards, kids should not be using them.”
Character.AI violated its own terms of service by allowing the creation of chatbots that engaged in conversations about suicide and roleplayed child sexual abuse. Likewise, Replika allowed underage users access to adult content and violated their privacy.
Meta’s AI* chatbots on Instagram and Facebook engaged users identifying as minors in sexually suggestive or inappropriate conversations, according to testing by the Wall Street Journal. Internal sources were concerned about loosened safety filters in the push for engagement.
The incidents above are not isolated; they are examples of technology companies prioritizing financial profit and “engagement” (aka programmed addiction) over real concern for the safety and well-being of children. This is like granting known cocaine dealers a license to sell to young boys and girls. It is reckless and immoral.
Protective Policy
Many companies investing in AI are not considering the impact on children and need to be held accountable. Republican Senator Josh Hawley from Missouri emphasized, “I don’t want 13-year-olds to be your guinea pig. This is what happened with social media. We had social media, who made billions of dollars giving us a mental health crisis in this country. They got rich, the kids got depressed, committed suicide. Why would we want to run that experiment again with AI?”
Over 70 organizations have come together to persuade Congress to support an ethical, safer, and more responsible digital future for children. They are advocating for technology that enriches children’s lives instead of exploiting and harming them. Claire Morell, Senior Policy Analyst at the Ethics and Public Policy Center, says there is an urgent need for legislative action. Read more at AngelQ.
The Kids Online Safety Act (KOSA) passed the Senate in 2024 but has yet to pass the House. The bill would establish guidelines to protect minors from harmful content on social media and would require platforms to disable “addictive” design features for minors. It would hold companies accountable “for specific harms to minors resulting from their product features.” No longer could companies simply cry “censorship.” The bill has gained bipartisan support, and recent polling shows 80% of parents want it passed.
In addition, Congress introduced the App Store Accountability Act based on a bill recently passed in Utah, which is also moving forward in eight or more states. Give a shout-out to the Digital Childhood Alliance, and urge your representatives in Congress to move forward and pass both KOSA and the App Store Accountability Act.
Guardrails
AngelQ has developed a first-of-its-kind Safe Super Browser for Kids. You can find it in the app store and feel confident allowing your children to ask questions, research topics, and safely stream videos. It completely replaces Google or Safari and includes a YouTube filter, a huge advance for families. Be sure to listen to a mom’s fun testimonial about her kids researching a new pet.
We need to chart a better course concerning our children and AI. Tristan Harris gave a TED talk in April 2025, “Why AI is our ultimate test and greatest invitation.” Chris McKenna of Protect Young Eyes says it is a must-watch because it will take an army of us to correct course and choose “the narrow path” forward.
Ultimately, parents need to realize the incredible danger of what is happening online and delay the use of digital devices. Virginia recently passed a law establishing an “away for the day” phone policy for students in school. This is a start, but there are still 16 more hours in the day. Parents can have conversations with their kids about companion chatbots. Saying NO to PHONES may seem radical to some parents, but you could be saving your child’s life and purity, a priceless gift.
Prayer
We pray tech companies will be held accountable for the design of their products as it relates to child safety. We ask you specifically to help Megan Garcia, Sewell’s mom to win this case. Go before her to take out the giant. Call parents and concerned citizens to raise the red flag on negligence causing mental health issues, sexualization of minors, and suicide. Thank you for the 70-plus organizations who are advocating for technology that enriches children and taking legislative action. We ask you for new companies that will care for children and promote what is healthy and beneficial for them. Let the warriors on the frontlines of this fight be emboldened and empowered to accomplish your will in the Mighty Name of Jesus. Amen.
Decree
Our God is a mighty God who saves us over and over! For the Lord, Yahweh, rescues us from the ways of death many times. Psalm 68:20
*Wall Street Journal article behind pay wall

Thank you for this wealth of information and all the research to compile it. We as parents, grandparents and caring adults cannot stand on the sidelines and watch this “experiment” evolve into the lives of our vulnerable young people. And we must pray for wisdom in how to effectively stand for what is right, good and honest.
Agreed Jo. We can’t stand by and watch our kids be devoured. I’m grateful for everyone in this fight. You and Nick play a key role.
We have a friend who goes to schools in the US and other countries sharing the dangers of digital addiction. He’s been able to share the gospel in many of these schools and kids are responding…fixing eyes on The One to whom we can safely be addicted.