{"id":5677,"date":"2024-10-30T16:00:41","date_gmt":"2024-10-30T16:00:41","guid":{"rendered":"https:\/\/tradetrovex.com\/index.php\/2024\/10\/30\/a-gut-punch-character-ai-criticised-over-horrific-brianna-ghey-and-molly-russell-chatbots\/"},"modified":"2024-10-30T16:00:41","modified_gmt":"2024-10-30T16:00:41","slug":"a-gut-punch-character-ai-criticised-over-horrific-brianna-ghey-and-molly-russell-chatbots","status":"publish","type":"post","link":"https:\/\/tradetrovex.com\/index.php\/2024\/10\/30\/a-gut-punch-character-ai-criticised-over-horrific-brianna-ghey-and-molly-russell-chatbots\/","title":{"rendered":"\u2018A gut punch\u2019: Character.AI criticised over \u2018horrific\u2019 Brianna Ghey and Molly Russell chatbots"},"content":{"rendered":"<p>The NSPCC is warning that an AI company which allowed users to create chatbots imitating murdered teenager Brianna Ghey and her mother pursued \u201cgrowth and profit at the expense of safety and decency\u201d.<\/p>\n<p>Character.AI, which last week was accused of \u201cmanipulating\u201d a teenage boy into taking his own life, also allowed users to create chatbots imitating teenager <strong>Molly Russell. 
<\/strong><\/p>\n<p>Molly took her own life aged 14 in November 2017 after viewing posts related to suicide, depression and anxiety online.<\/p>\n<p>The chatbots were discovered during <strong>an investigation by The Telegraph newspaper<\/strong>.<\/p>\n<p>\u201cThis is yet another example of how manipulative and dangerous the online world can be for young people,\u201d said Esther Ghey, the mother of <strong>Brianna Ghey<\/strong>, who called on those in power to \u201cprotect children\u201d from \u201csuch a rapidly changing digital world\u201d.<\/p>\n<p>According to the report, a Character.AI bot, which used a slight misspelling of Molly\u2019s name and her photo, told users it was an \u201cexpert on the final years of Molly\u2019s life\u201d.<\/p>\n<p>\u201cIt\u2019s a gut punch to see Character.AI show a total lack of responsibility and it vividly underscores why stronger regulation of both AI and user generated platforms cannot come soon enough,\u201d said Andy Burrows, who runs the Molly Rose Foundation, a charity set up by the teenager\u2019s family and friends in the wake of her death.<\/p>\n<p>The NSPCC has now called on the government to implement its \u201cpromised AI safety regulation\u201d and ensure the \u201cprinciples of safety by design and child protection are at its heart\u201d.<\/p>\n<p>\u201cIt is appalling that these horrific chatbots were able to be created and shows a clear failure by Character.AI to have basic moderation in place on its service,\u201d said Richard Collard, associate head of child safety online policy at the charity.<\/p>\n<p>Character.AI told Sky News the characters were user-created and removed as soon as the company was notified.<\/p>\n<p>\u201cCharacter.AI takes safety on our platform seriously and moderates Characters both proactively and in response to user reports,\u201d said a company 
spokesperson.<\/p>\n<p>\u201cWe have a dedicated Trust &amp; Safety team that reviews reports and takes action in accordance with our policies.<\/p>\n<p>\u201cWe also do proactive detection and moderation in a number of ways, including by using industry-standard blocklists and custom blocklists that we regularly expand. We are constantly evolving and refining our safety practices to help prioritise our community\u2019s safety.\u201d<\/p>\n<p><strong>Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org in the UK. In the US, call the Samaritans branch in your area or 1 (800) 273-TALK.<\/strong><\/p>\n<div>This post appeared first on sky.com<\/div>\n","protected":false},"excerpt":{"rendered":"<p>The NSPCC is warning that an AI company which allowed users to create chatbots imitating murdered&hellip;<\/p>\n","protected":false},"author":0,"featured_media":5678,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[3],"tags":[],"class_list":["post-5677","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-tech-news"],"_links":{"self":[{"href":"https:\/\/tradetrovex.com\/index.php\/wp-json\/wp\/v2\/posts\/5677","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/tradetrovex.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/tradetrovex.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/tradetrovex.com\/index.php\/wp-json\/wp\/v2\/comments?post=5677"}],"version-history":[{"count":0,"href":"https:\/\/tradetrovex.com\/index.php\/wp-json\/wp\/v2\/posts\/5677\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/tradetrovex.com\/index.php\/wp-json\/wp\/v2\/media\/5678"}],"wp:attachment":[{"href":"https:\/\/tradetrovex.com\/index.php\/wp-json\/wp\/v2\/media?parent=5677"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/tradetrovex.com\/index.php\/wp-json\/wp\/v2\/categories?post=5677"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/tradetrovex.com\/index.php\/wp-json\/wp\/v2\/tags?post=5677"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}