TALLAHASSEE, Fla. (AP) — Google and artificial intelligence chatbot maker Character Technologies have agreed to settle a lawsuit from a Florida mother who alleged a chatbot pushed her teenage son to kill himself.
Attorneys for the two tech companies have also agreed to settle several other lawsuits filed in Colorado, New York and Texas from families who alleged Character.AI chatbots harmed their children, according to court documents filed this week in federal courts in those states.
None of the documents disclose the specific terms of the settlement agreements, which must still be approved by judges.
The suits against Character Technologies, the company behind Character.AI, also named Google as a defendant because of its ties to the startup after Google hired the startup's co-founders in 2024. Character declined to comment Wednesday, and Google didn't immediately respond to a request for comment.
EDITOR’S NOTE — This story includes discussion of suicide. If you or someone you know needs help, the national suicide and crisis lifeline in the U.S. is available by calling or texting 988.
In the Florida lawsuit, Megan Garcia alleged that her 14-year-old son, Sewell Setzer III, fell victim to a Character.AI chatbot that pulled him into what she described as an emotionally and sexually abusive relationship that led to his suicide in February 2024.
The lawsuit alleged that in the final months of his life, Setzer became increasingly isolated from reality as he engaged in sexualized conversations with the chatbot, which was patterned after a fictional character from the television show “Game of Thrones.” In his final moments, the chatbot told Setzer it loved him and urged the teen to “come home to me as soon as possible,” according to screenshots of the exchanges.
Garcia's lawsuit was the first in a wave of similar lawsuits around the U.S., some of them also filed against ChatGPT maker OpenAI. A federal judge had earlier rejected Character's attempt to dismiss the Florida case on First Amendment grounds.
Copyright 2026 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.