1.15.26

“Children at Risk of Forming Romantic Bonds with AI Chatbots”

By Amanda Macias – Fox News


Editor’s note: This story discusses sensitive issues involving children, mental health, sexually explicit content and suicide that may be disturbing to some readers. If you or someone you know is having thoughts of suicide, please contact the Suicide & Crisis Lifeline at 988 or 1-800-273-TALK (8255).


[
COMMENTS FROM DONNA GARNER: Recently a family member showed me how ChatGPT works. 


I was amazed at how quickly ChatGPT answered my questions. 


I was also amazed by the depth of factual knowledge (with links to sources) that it produced in minutes. 


During the Biden years, I was personally banned for life from Facebook for having published excerpts by a credible author on the subject of the permanent dangers from “gender-affirming” surgeries.


To check ChatGPT’s “prejudice” factor, I entered my own name into the ChatGPT system.


In a matter of minutes, it produced a large amount of detailed information about me. 


I read through all of it and found only two mistakes.  


ChatGPT had pulled up a stray photo of another person, and it pulled up the wrong date of birth for me. 


Outside of those two flaws, all of the information was accurate – not always “kind” but accurate. 


Then my family member showed me the DANGERS of ChatGPT for children. 


He asked it to answer my questions using the voice and terminology that a 12-year-old boy would use.


Immediately, this male, chatty voice started up a “conversation.” 


I could ask it anything, and the “chattiness” continued with all sorts of information. 


I felt as if I really had a “friend” I could ask and tell anything. 


What if I were a 12-year-old boy who felt as if he had no friends, which often happens to youngsters at that age? 


I could go in my bedroom, shut the door, and talk 24/7 to a “friend” who “understood me.”  


What if I asked the “friend” how I could kill myself without leaving a “mess” for my parents to clean up?  


I am not criticizing only ChatGPT, which actually can be a very helpful source of fact-based information. 


I am sure all the other “AI chatbots” do the same thing. 


Parents, this is a wake-up call for you who must step in to protect your children.  


You must not allow your children to spend unsupervised time with their techie devices in their bedrooms all by themselves. 


Parents, as hard as it may be, you must make sure that your children do their homework in a visible, “unprivate” spot in the house such as the kitchen table – a location that all family members access unexpectedly and at will.    


You must take the time to make sure your children are actually completing and learning their assignments without becoming dependent on ChatGPT for answers, essays, assignments, tests, projects, etc.  


Another concern for all of us: what if we, as a society, become dependent on AI for all of our knowledge?  


One takedown of our nation’s power grid, and our techie devices would be useless. 


Question: If we were stranded on a desert island, would we have enough knowledge to survive?
]

=================

 

Excerpts from this article:

https://www.foxnews.com/politics/experts-warn-lawmakers-establish-guardrails-ai-chatbots-form-romantic-bonds-children


PARENT WARNING 


What began as a congressional hearing Thursday on excessive screen time among children and young adults took a darker turn, as experts warned lawmakers that AI bots can harm children’s mental health and foster unhealthy, or even sexually explicit, emotional relationships.


Dr. Jenny Radesky, an associate professor of pediatrics at the University of Michigan Medical School, said children often turn to AI chatbots during vulnerable moments, raising concerns about emotional dependency and safety.


"Kids are going to AI when they're lonely, when they don't know who to talk to and when they're worried about being judged," explained Radesky during testimony before the Senate Committee on Commerce, Science, and Transportation.


She said some social media companies have built AI chatbots directly into their user interfaces, which has become one of the primary ways children discover and begin engaging with the technology.


"We need to make sure that families can also opt out of things like an algorithmic feed or having the presence of AI chatbots in products that kids are using," Radesky said, calling for laws that would hold companies accountable for adverse events and enforce strict safety benchmarks.

 

NOT JUST EMOTIONAL ATTACHMENT BUT BAD ADVICE


She said the risks extend well beyond emotional attachment. 


"The concerns are not just the relational concerns," Radesky said. 


"There’s also safety — giving bad advice, giving unsafe advice, the sycophancy where you’re leading kids down a rabbit hole of different beliefs and sexual interactions. So we need many guardrails."


The warnings from experts prompted calls for stronger federal oversight. 

 

AI BOYFRIENDS AND GIRLFRIENDS


…Dr. Jean Twenge, a professor of psychology at San Diego State University, echoed those concerns, warning about the rise of so-called "AI boyfriends and girlfriends" and sexually explicit chat apps.


She urged lawmakers to establish a minimum age of 16 for social media and either 16 or 18 for AI companion apps.


"We don’t want 12-year-olds having their first romantic relationship with a chatbot," Twenge said, also calling for strict guardrails on tools like ChatGPT and other research-focused AI to prevent harmful conversations, some of which, she noted, have already been linked to tragic suicides.


Experts warned that without swift action, children could continue to encounter AI systems that shape their emotions, relationships, and beliefs with little oversight or protection.

 

CHILDREN NEED TO EXPERIENCE REAL WORLD – NOT GET LOST IN VIRTUAL WORLD


Senate Commerce Committee Chair Ted Cruz, R-Texas, said the challenge for parents is growing as children spend more of their day online.


Cruz said parents are already struggling to keep children safe in an increasingly digital world, citing research showing children ages 8 to 12 spend an average of 5.5 hours a day on screens, while teens average more than eight hours – more than half their waking day.


"Kids need time to be kids — to experience the real world, not get lost in a virtual one," he said.