From the maybe-we-should-have-done-that-to-start dept:

The chatbot company Character.AI will ban users 18 and under from conversing with its virtual companions beginning in late November after months of legal scrutiny.

The change comes after the company, which lets users create characters and hold open-ended conversations with them, faced tough questions over how these AI companions affect the mental health of teens and users in general, including a lawsuit over a child’s suicide and a proposed bill that would ban minors from conversing with AI companions.

“We’re making these changes to our under-18 platform in light of the evolving landscape around AI and teens,” the company wrote in its announcement. “We have seen recent news reports raising questions, and have received questions from regulators, about the content teens may encounter when chatting with AI and about how open-ended AI chat in general might affect teens, even when content controls work perfectly.”

  • DoGeeseSeeGod@lemmy.blahaj.zone
    24 hours ago

    It’s probably the legal battles, mostly. But I gotta wonder how much of the decision was based on that recent headline about Grok AI asking a 12-year-old to send nudes. Accidentally creating something that sometimes attempts to make CSAM is not a good look, and neither are the lawsuits if a kid actually did it.