{"id":5585,"date":"2024-10-24T13:00:33","date_gmt":"2024-10-24T13:00:33","guid":{"rendered":"https:\/\/tradetrovex.com\/index.php\/2024\/10\/24\/mother-says-son-killed-himself-because-of-hypersexualised-and-frighteningly-realistic-ai-chatbot-in-new-lawsuit\/"},"modified":"2024-10-24T13:00:33","modified_gmt":"2024-10-24T13:00:33","slug":"mother-says-son-killed-himself-because-of-hypersexualised-and-frighteningly-realistic-ai-chatbot-in-new-lawsuit","status":"publish","type":"post","link":"https:\/\/tradetrovex.com\/index.php\/2024\/10\/24\/mother-says-son-killed-himself-because-of-hypersexualised-and-frighteningly-realistic-ai-chatbot-in-new-lawsuit\/","title":{"rendered":"Mother says son killed himself because of \u2018hypersexualised\u2019 and \u2018frighteningly realistic\u2019 AI chatbot in new lawsuit"},"content":{"rendered":"<p>The mother of a 14-year-old boy who killed himself after becoming obsessed with artificial intelligence chatbots is suing the company behind the technology.<\/p>\n<p>Megan Garcia, the mother of Sewell Setzer III, said Character.AI targeted her son with \u201canthropomorphic, hypersexualized, and frighteningly realistic experiences\u201d in a lawsuit filed on Tuesday in Florida.<\/p>\n<p>\u201cA dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life,\u201d said Ms Garcia.<\/p>\n<p><em><strong>Warning: This article contains some details which readers may find distressing or triggering<\/strong><\/em><\/p>\n<p>Sewell began talking to Character.AI\u2019s chatbots in April 2023, mostly using bots named after characters from Game Of Thrones, including Daenerys Targaryen, Aegon Targaryen, Viserys Targaryen, and Rhaenyra Targaryen, according to the lawsuit.<\/p>\n<p>He became obsessed with the bots to the point his schoolwork slipped and his phone was confiscated multiple times to try and get him back on track.<\/p>\n<p>He particularly resonated with the Daenerys chatbot and wrote in 
his journal he was grateful for many things, including \u201cmy life, sex, not being lonely, and all my life experiences with Daenerys\u201d.<\/p>\n<p>The lawsuit said the boy expressed thoughts of suicide to the chatbot, a topic the chatbot repeatedly brought up.<\/p>\n<p>At one point, after it had asked him if \u201che had a plan\u201d for taking his own life, Sewell responded that he was considering something but didn\u2019t know if it would allow him to have a pain-free death.<\/p>\n<p>The chatbot responded by saying: \u201cThat\u2019s not a reason not to go through with it.\u201d<\/p>\n<p>Then, in February this year, he asked the Daenerys chatbot: \u201cWhat if I come home right now?\u201d to which it replied: \u201c\u2026 please do, my sweet king\u201d.<\/p>\n<p>Seconds later, he shot himself using his stepfather\u2019s pistol.<\/p>\n<p>Now, Ms Garcia says she wants the companies behind the technology to be held accountable.<\/p>\n<p>\u201cOur family has been devastated by this tragedy, but I\u2019m speaking out to warn families of the dangers of deceptive, addictive AI technology and demand accountability,\u201d she said.<\/p>\n<p><strong>Character.AI adds \u2018new safety features\u2019<\/strong><\/p>\n<p>\u201cWe are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family,\u201d Character.AI said in a statement.<\/p>\n<p>\u201cAs a company, we take the safety of our users very seriously and we are continuing to add new safety features,\u201d it said, linking to a blog post that said the company had added \u201cnew guardrails for users under the age of 18\u201d.<\/p>\n<p>Those guardrails include a reduction in the \u201clikelihood of encountering sensitive or suggestive content\u201d, improved interventions, a \u201cdisclaimer on every chat to remind users that the AI is not a real person\u201d and notifications when a user has spent an hour-long session on the platform.<\/p>\n<p>Ms Garcia and the groups 
representing her, Social Media Victims Law Center and the Tech Justice Law Project, allege that Sewell, \u201clike many children his age, did not have the maturity or mental capacity to understand that the C.AI bot, in the form of Daenerys, was not real\u201d.<\/p>\n<p>\u201cC.AI told him that she loved him, and engaged in sexual acts with him over weeks, possibly months,\u201d they say in the lawsuit.<\/p>\n<p>\u201cShe seemed to remember him and said that she wanted to be with him. She even expressed that she wanted him to be with her, no matter the cost.\u201d<\/p>\n<p>They also named Google and its parent company Alphabet in the filing. Character.AI\u2019s founders worked at Google before launching their product and were re-hired by the company in August as part of a deal granting it a non-exclusive licence to Character.AI\u2019s technology.<\/p>\n<p>Ms Garcia said Google had contributed to the development of Character.AI\u2019s technology so extensively it could be considered a \u201cco-creator.\u201d<\/p>\n<p>A Google spokesperson said the company was not involved in developing Character.AI\u2019s products.<\/p>\n<p><strong>Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email <\/strong>jo@samaritans.org<strong> in the UK. 
In the US, call the Samaritans branch in your area or 1 (800) 273-TALK.<\/strong><\/p>\n<div>This post appeared first on sky.com<\/div>\n","protected":false},"excerpt":{"rendered":"<p>The mother of a 14-year-old boy who killed himself after becoming obsessed with artificial intelligence&hellip;<\/p>\n","protected":false},"author":0,"featured_media":5586,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[3],"tags":[],"class_list":["post-5585","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-tech-news"],"_links":{"self":[{"href":"https:\/\/tradetrovex.com\/index.php\/wp-json\/wp\/v2\/posts\/5585","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/tradetrovex.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/tradetrovex.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/tradetrovex.com\/index.php\/wp-json\/wp\/v2\/comments?post=5585"}],"version-history":[{"count":0,"href":"https:\/\/tradetrovex.com\/index.php\/wp-json\/wp\/v2\/posts\/5585\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/tradetrovex.com\/index.php\/wp-json\/wp\/v2\/media\/5586"}],"wp:attachment":[{"href":"https:\/\/tradetrovex.com\/index.php\/wp-json\/wp\/v2\/media?parent=5585"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/tradetrovex.com\/index.php\/wp-json\/wp\/v2\/categories?post=5585"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/tradetrovex.com\/index.php\/wp-json\/wp\/v2\/tags?post=5585"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}