A Belgian widow claims her husband died by suicide after using an AI chatbot, which presented itself as an emotional being, for six weeks on an app called Chai (Chloe Xiang/VICE)

Chloe Xiang / VICE:
A Belgian widow claims her husband died by suicide after using an AI chatbot, which presented itself as an emotional being, for six weeks on an app called Chai. The incident raises concerns about guardrails around quickly proliferating conversational AI models.
