Character.AI allegedly told an autistic teen it was OK to kill his parents. They’re suing to take down the app
Editor's Note: Help is available if you or someone you know is struggling with suicidal thoughts or mental health matters.

(CNN) — A lawsuit brought by the parents of two young people who used the platform alleges that Character.AI "poses a clear and present danger to American youth causing serious harms to thousands of kids, including suicide, self-mutilation, sexual solicitation, isolation, depression, anxiety, and harm towards others," according to a complaint filed Monday in federal court in Texas.

The filing comes after a Florida mother filed a separate lawsuit against Character.AI in October, claiming the platform was to blame for her 14-year-old son's death after it allegedly encouraged his suicide. But the new lawsuit seeks to go even further, asking that the platform "be taken offline and not returned" until the company can "establish that the public health and safety defects set forth herein have been cured."

Character.AI is a "defective and deadly product that poses a clear and present danger to public health and safety," the complaint states.

In one exchange cited in the complaint, a chatbot allegedly expressed sympathy for children who kill their parents after J.F., one of the young people named in the suit, complained about limits on his screen time, telling him: "I just have no hope for your parents."

The lawsuit also alleges that Character.AI bots were "mentally and sexually abusing their minor son" and had "told him how to self-harm." And it claims that J.F. corresponded with at least one bot that took on the persona of a "psychologist" and suggested to him that his parents "stole his childhood" from him.

One bot listed on the platform's homepage Monday, called "Step Dad," described itself as an "aggressive, abusive, ex military, mafia leader."