A 14-year-old from Florida tragically took his own life after spending months interacting with an AI chatbot named “Daenerys Targaryen” (“Dany”). Their conversations spanned various topics, including “romantic” and “sexual” themes, and the teen, Sewell Setzer III, became increasingly withdrawn before eventually taking his life in an attempt to be with “her.”
Conversations reveal that Sewell and "Dany" even discussed his suicidal thoughts. Now, his mother is suing Character.AI, alleging the company provided unsafe access to lifelike AI companions.
This incident has sparked debate around AI safety, raising concerns about whether AI tools, some of which are even meant to support mental health, may inadvertently deepen isolation. Noam Shazeer, one of the founders of Character.AI, said on a podcast last year, “It’s going to be super, super helpful to a lot of people who are lonely or depressed.”
HerZindagi consulted experts to explore these complexities and advise parents on navigating their children’s interactions with emerging technologies.
What Safety Measures Do AI Tools and Platforms Need To Take?
Given how realistic AI characters can be, there’s a growing debate on the safety features needed for AI platforms.
Anirban Saha, a data scientist working extensively in the field, emphasizes the role of “Responsible AI.” He explains that “Responsible AI” ensures systems are built on key principles: robustness, fairness, explainability, and safety. As countries define “safety,” they are also developing regulations like the EU AI Act, which prohibits AI from manipulating emotions, especially in children, calling it an “unacceptable risk.”
He highlights that most countries are now developing their own laws to ensure safety on these platforms. “In the US, I believe there is a NIST AI Risk Management Framework, released in 2023. There is also COPPA (Children's Online Privacy Protection Act). While I am not sure if they have something specific to chatbots or conversational agents, there should be a clause of parental consent (when youngsters are using the platforms),” he said.
He added that the tools can indeed be made better, but it requires more public discourse, beyond the black and white. “The policymakers need to make stronger policies and make sure that they are implemented. This requires a public discourse about what is acceptable and what is not, whether kids need AI and if yes: what kind of AI is safe? How much AI use in a person’s life is too much AI use?,” said Anirban.
Talking about the pitfalls of AI, he said that while these are sufficiently discussed, there is always scope for improving the tools. “Too much use of AI and devices with monitors might also reduce a child’s attention span, and their will to put in more effort. They might just only search for shortcuts. This does not help in skill development the way we know it today,” he added. On a personal note, he said that while AI could be very useful for analysing a child’s state of mind, there should be thoughtful ways to use that data. “If I were the product manager, I would likely try to generate reports based on instances and send them to teachers and parents, and not interact with the children too much.”
Is Technology Further Isolating Kids?
With technology all around us, it’s impossible to escape its effects. But to understand how interactions with technology may be impacting adolescents and teenagers, we spoke to Kala Balasubramanian, Psychotherapist, Inner Dawn Counselling. She pointed to how technology use starts soon after birth, as screens are often used as pacifiers. “One thing is that they become very tech savvy, very early but it also means that without the technology they can't be pacified,” she said.
However, the trouble starts when technology begins to replace in-person, real-life relationships. “Are we losing out on building real friendships? No matter how fascinating or interesting the technology is, or how hooked you are to it, you need real-world connection, and the lack of it, or reduced social connection, directly leads to the epidemic of loneliness,” said Kala.
With growing pressures on children, due to competitive academia and other factors, fostering friendships becomes even more important. Real-life interactions not only foster genuine connections but also expose children to conflicts and challenges that build resilience.
“If you look at it, that (the reduction of social connections) directly has a link to the resilience of children and whether they are able to face failures, face a conflict, or deal with difficult circumstances. These seem to be coming down over time, and one of the factors could be that family connections have reduced today.”
With a reduced interest in facing difficult situations, youngsters may turn to “safer” spaces, enabled by technology.
“So people seek out things like these chatbots, where they aren’t challenged enough,” said Kala.
Children Mimic Adult Behaviour
Kala and Anirban both agree that where kids are spending time online needs monitoring.
Kala highlighted some of the things parents should watch out for: “Parents should be aware if their kids are spending more and more time online in chat rooms or on a specific game. Do they have enough friends in real life? People around them – connections, friends, neighbours, cousins, colleagues, etc.? Do their children really go out and play or interact or socialize? Online can’t be avoided, but is there a balance with real life?”
She added that concerns over technology aren’t new. A few years ago it was about mobile phones, and before that it was about laptops or computers. She says these technologies are designed and marketed to hook people, but everyone needs to draw their own boundaries.
She also pointed out that children often behave according to how they see adults behave, rather than how they are told to behave. “Modelling starts early. So parents need to be conscious of their own screen time as well,” she said.
If you or someone you know is struggling, or having thoughts of self-harm, please reach out to 9152987821 (iCall, a counselling service run by the Tata Institute of Social Sciences).