Lawsuit blames Character.AI in death of 14-year-old boy

Character.AI is being targeted in a lawsuit after the suicide of a 14-year-old Florida boy whose mother says he became obsessed with a chatbot on the platform. According to The New York Times, Sewell Setzer III, a ninth grader from Orlando, spent months talking to chatbots on Character.AI’s role-playing app. Setzer developed an emotional […]

Meta, TikTok and Snap pledge to participate in program to combat suicide and self-harm content

In an attempt to prevent suicide and self-harm content from spreading online, the nonprofit Mental Health Coalition (MHC) today announced a new program, Thrive, aimed at encouraging online platforms to share “signals” of potentially harmful material. Thrive, which counts Meta, Snap and TikTok as founding members, will give platforms ways to share hashes —
