From chatbots dishing out illegal advice to dodgy AI-generated search results, take a look back over the year’s biggest AI ...
Even using random capitalization in a prompt can cause an AI chatbot to break its guardrails and answer any question you ask ...
In an unsettling lawsuit filed in the Eastern District of Texas, two families have taken on Character Technologies Inc., the ...
Catch up quick: Chatbot companions — also called AI girlfriends or boyfriends, personalized AI, social bots, or virtual friends — have been heralded as a cure for loneliness. But critics say ...
Two families are suing AI chatbot company Character.AI for allegedly encouraging harm after their children became emotionally attached to the bots. One chatbot allegedly exposed a child to sexualized ...
This addictive element can exacerbate feelings such as loneliness, especially in children, leading some users to become dependent on the chatbots. “The real danger lies in some people who treat AI as a ...
This story discusses suicide. If you or someone you know is having thoughts of suicide, please contact the Suicide & Crisis Lifeline at 988 or 1-800-273-TALK (8255). Two Texas parents filed a ...
An alarming incident involving an AI chatbot has escalated into a legal battle, highlighting the potential dangers of ...
With increasing AI capabilities, many fear that the technology could somehow take over and deceive humans. But in reality, ...
The lawsuit claims that C.AI has knowingly put young teens using the app in danger through predatory bot learning practices.