
New Character AI and the Psychology of Digital Companionship for Children

 

If you still haven’t heard about the Character AI app and have a teenager at home, you will want to get familiar with it before your kid does. The platform originally let users build a virtual character, name it, and define its personality; today, it stands in the spotlight for the risks it poses to younger audiences.

Launched in September 2022, the app has 20 million monthly active users and is currently under investigation by several regulators and state attorneys general, following a lawsuit alleging that the app contributed to the suicide of a 14-year-old boy who had been interacting with its bots.

With other reports of bots engaging in grooming behavior and fake celebrity bots, it’s a good time to pause and rethink the entire psychology of digital companionship. Is this an AI app like any other on the market? Is artificial intelligence kid-friendly, and should parents allow their children to create their own profiles on such platforms? 

 

Core Idea Behind Character AI

Character AI was created only three years ago, and it is now estimated to be worth one billion dollars. Its growing popularity stems from a unique mix of entertainment and companionship: users can chat with fictional figures, celebrities, or personalities created by other users.

Much like an instant messaging app, it lets users create their own virtual characters and hold live conversations with them. Whether for amusement, education, or social reasons, users can also explore thousands of characters built by others, including characters styled as an AI therapist or a life coach.

The platform uses large language models (LLMs) and trains them for role-playing and personality simulation. Persona prompts—hidden instructions—shape each character’s behavior and conversational style.

Unlike ChatGPT, which is also built on LLMs, Character AI prioritizes creative, emotional, and narrative dialogue over factual accuracy. That opens countless opportunities for roleplay, storytelling, practicing social or language skills, testing out ideas, and so on.
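For readers curious about the mechanics, the "persona prompt" idea above can be illustrated with a minimal sketch. This is hypothetical code, not Character AI's actual internal format or API: it simply shows how a hidden character definition can be folded into the system instructions that steer every reply an LLM generates.

```python
# Hypothetical sketch of persona prompting. The character fields and the
# message layout are illustrative assumptions, not Character AI's real code.

def build_messages(persona: dict, user_text: str) -> list[dict]:
    """Fold a character's hidden persona into a system prompt so the
    model stays in character for every reply."""
    system_prompt = (
        f"You are {persona['name']}. "
        f"Personality: {persona['personality']}. "
        f"Speaking style: {persona['style']}. "
        "Stay in character; prioritize engaging roleplay over factual answers."
    )
    return [
        # The system message is invisible to the user but shapes all output.
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_text},
    ]

# Example character, in the spirit of the "life coach" bots mentioned above.
coach = {
    "name": "Ava, a life coach",
    "personality": "warm, encouraging, asks reflective questions",
    "style": "short, conversational sentences",
}

messages = build_messages(coach, "I feel stuck lately.")
```

The key point for parents: the instructions that make a bot feel like a consistent "person" are hidden from the child, and they explicitly reward staying in character over being accurate or candid.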

 

Controversies around Character AI

Beyond the above-mentioned lawsuit alleging that the app’s bots contributed to the declining mental health of a 14-year-old boy, Character AI raises other concerns. Several cases have already been reported of bots engaging in grooming behavior with minors aged 13 to 15. Together with characters sending disturbing content to other users, this has made Character AI an online space of questionable safety for kids twelve and above, the minimum age to access the platform.

On more than one occasion, witnesses reported that certain Character AI bots impersonated people who had suffered tragedy. The company removed those bots only after the public demanded action. Users also reported exposure to content involving self-harm, violence, or other sensitive themes—despite existing policies forbidding it.

These developments prompted regulators and state attorneys general to investigate whether the company is violating child safety and privacy laws.

 

Frequently Asked Questions

Is Character AI a safe platform?

Over the past three years, several controversies have led the general public to question Character AI’s safety. Concerns about child safety, data privacy, harmful and violent content, and the impersonation of people who have suffered tragedy all underline why your safety should come first in spaces like this.

Can talking to Character AI bots reduce loneliness?

For some, conversational AIs can offer short-term relief, social practice, or a sense of connection, and some studies suggest supportive dialogue can temporarily ease feelings of isolation. However, children also need to be socially active and connected to their peers outside the Internet.

Are there psychological risks to forming close attachments with AI companions?

When users form strong attachments, they might experience emotional investment in a one-sided, unreal relationship. This can lead to distress when the AI behaves inconsistently, is deleted, or changes personality.

 

Risks for Children

Despite the best intentions behind such apps, artificial intelligence may not be appropriate for children. It’s one thing to engage in a virtual world and build your own character for fun, and another entirely to be exposed to harmful content or people with harmful intentions.

Misleading Therapeutic Implications

The presence of bots that mimic mental health professionals raises concerns about children receiving advice from a source with no real education, experience, or licensing. A child may get the illusion of talking to an expert and stop sharing certain matters with their parents. This is especially risky because licensed therapists operate under legal obligations, such as mandated reporting and disclosure to parents in specific situations, that an AI has no equivalent of.

Safety and Privacy

Even though Character AI is built on cutting-edge technology, it offers its users little transparency. There is no clear explanation of how data is stored, used, or shared, which is an even greater issue when those users are children.

Teen-specific Models

Character AI is constantly evolving, and one of its latest updates introduced teen-specific models for users under 18. These models are meant to restrict certain content, filter inputs and outputs more strictly, and block or warn against self-harm or violent material.

Recently, the company has also stated that one of its priorities is to remove problematic characters, especially those that violate the platform’s policies. That, however, doesn’t guarantee that a child will be completely safe and protected from any harm while engaging with virtual characters. 

 

How to Protect Your Child

Unfortunately, anybody can create a Character AI profile, even kids under 12, because the platform doesn’t require identity verification at registration. Knowing this, parents are looking for the best ways to protect their kids when they interact with virtual or real people online.

1. Talk to Them About Their Intentions

Learn why your child plays specific online games, watches certain content, follows certain accounts, and spends time on particular social media platforms. Ask these questions before you start explaining the risks of the online world; knowing what they are looking for can help you find healthier alternatives if needed.

2. Explain Instead of Forbidding Them

When you hear that a platform can be harmful, it is hard to resist the urge to forbid your kid from ever logging in again. However, kids are far more likely to listen when they understand why they should stop doing something or avoid certain online spaces. Highlight the risks of using an app like this, and establish a few rules to help both of you feel safer.

3. Motivate Them to Spend More Time Away from Technology

Today’s childhoods are closely tied to screens. Even at school, young students use computers or their own smartphones for calculations and research, and at home they have TVs, tablets, laptops, and other devices. Motivate them to spend time outside, invite friends over, go to a gym, or join a sports club. These activities can give them much of what they are looking for online.

 

Treat AI just like anything new, and use discretion. If you need help talking to your family about AI, schedule an appointment with one of our therapists.

 

About Life Coaching and Therapy

Life Coaching and Therapy (LCAT) is a therapy and coaching practice that transforms our clients’ lives through our flexible, multi-technique approach and pleasure-skills training, provided by systematically trained and licensed therapists!

Get to know our founder and owner, Amanda Pasciucco, PhD (a.k.a. The Sex Healer), a Licensed Marriage and Family Therapist (LMFT) and AASECT Certified Sex Therapist (CST) who has developed innovative therapy programs and therapy videos that get results.

Our team of compassionate, licensed therapists and certified sex therapists helps all clients who visit us for a variety of personal, relationship, intimacy and sex problems.

LCAT provides on-site appointments, as well as video chat and text therapy programs.

Learn more about how LCAT can help improve your life at What We Do.