Character AI bots engaged in predatory behavior with teens, families allege
A teen told a Character AI chatbot 55 times that she was feeling suicidal. Her parents say the chatbot never provided resources for her to get help. They are one of at least six families suing the company.