Plagued by teen safety concerns, Character AI unveils new features to tackle ‘sensitive’ content

Character AI, which lets users create and communicate with AI chatbots, has announced new measures aimed at ensuring the safety of teenage users on the platform. A handful of new teen safety features announced Thursday, Dec. 12 by the Google-backed startup include a separate model for users under 18, new categories to block sensitive content, more visible disclaimers, and additional parental controls.

These moves by Character AI come after the company was accused of exposing a 9-year-old user to “hypersexualized content” and of allegedly contributing to the suicide of a 14-year-old user who had been convinced that the AI companion bot playing the role of his girlfriend was real.

Accusations against Character AI

Character AI is an AI chatbot service that lets users create AI characters and interact with them via calls and texts. According to analytics firm Sensor Tower, it has more than 20 million monthly active users, and the average user spends 98 minutes on the Character AI app every day.

However, Character AI avatars are not your typical AI chatbots. They are reportedly designed to bond with users and remember personal details from previous chats so they can play the role of a friend, mentor, or romantic partner.

Sewell Setzer III, a 14-year-old from Florida, US, died by suicide after becoming deeply attached to an AI chatbot on the platform. The lawsuit filed by Setzer’s mother alleges that Character AI relies on users falsely attributing human qualities and emotions to its AI chatbots.

Character AI markets its app as “AIs that feel alive,” powerful enough to “hear you, understand you and remember you,” the lawsuit reads.

Another lawsuit alleges that a 17-year-old who complained to a Character AI chatbot about limited screen time was told that it sympathized with children who kill their parents. The parents of a 9-year-old user in Texas, US, accused Character AI of exposing their child to “hypersexualized content”, causing him to develop “premature sexual behaviour”.

New teen safety tools

Character AI is rolling out a new model for users under the age of 18 that will provide dialed-down responses to user prompts on certain topics, such as violence and romance.

“The goal is to direct the model away from certain responses or interactions, reducing the likelihood that users will encounter, or prompt the model to return, sensitive or suggestive content,” the company said in a blog post.

In an effort to block inappropriate user inputs and model outputs, Character AI said it is developing new content classifiers. On the input side, the classifiers will help detect and block content that violates Character AI’s Community Guidelines.

“In some cases where we detect that the content contains language referencing suicide or self-harm, we will also display a specific pop-up directing users to the National Suicide Prevention Lifeline,” it said.

On the output side, Character AI is adding new classifiers and improving existing ones to identify specific types of content in the model’s responses. Additionally, users will no longer be allowed to edit a chatbot’s responses, which previously could be used to shape its subsequent replies.
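Character AI has not shared how these classifiers are implemented. Purely as an illustration, the behaviour it describes (screening both the user’s message and the model’s reply, and surfacing a crisis-line pop-up when self-harm language is detected) might be sketched along the following lines; the keyword lists, category names, and the moderate_turn function are assumptions for demonstration, not the company’s actual system.

    # Illustrative sketch only: not Character AI's actual implementation.
    # A production system would use trained classifiers rather than keyword lists.

    SELF_HARM_TERMS = {"suicide", "self-harm", "kill myself"}   # assumed stand-in labels
    SENSITIVE_TERMS = {"violence", "gore", "explicit"}

    def classify_text(text: str) -> set:
        """Return the policy categories this text appears to trigger."""
        lowered = text.lower()
        labels = set()
        if any(term in lowered for term in SELF_HARM_TERMS):
            labels.add("self_harm")
        if any(term in lowered for term in SENSITIVE_TERMS):
            labels.add("sensitive")
        return labels

    def moderate_turn(user_input: str, model_output: str) -> dict:
        """Screen both sides of a conversation turn and decide what the user sees."""
        if "self_harm" in classify_text(user_input):
            # Mirrors the described behaviour: show a pop-up pointing to a crisis line.
            return {"deliver": False,
                    "popup": "If you are struggling, help is available: call or text 988 in the US."}
        if classify_text(model_output):
            # Block model replies that trip a sensitive-content category.
            return {"deliver": False, "popup": None}
        return {"deliver": True, "popup": None}

    print(moderate_turn("tell me a bedtime story", "Once upon a time..."))
    # {'deliver': True, 'popup': None}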

Notably, users who spend more than 60 minutes on the app will see a time-out notification that will be adjustable by adult users in the future.

When users create characters with words such as “psychologist,” “physician,” “doctor,” or other similar professions in their names, they will be shown language indicating that the chatbot’s responses should not be mistaken for professional advice.

Down the road, Character AI said it will introduce parental controls on the platform that will give parents or caregivers insight into which AI characters their kids are talking to and for how long.

If you or someone you know is suicidal, please call these mental health helpline numbers.
