
I’m a big fan of AI chatbots like Google Gemini and ChatGPT. Unfortunately, they don’t always get things right. An Austrian advocacy group, Noyb, has filed a complaint against OpenAI for wrongly stating that a Norwegian man was a murderer.
When a Norwegian man asked ChatGPT what it knew about him, it reportedly stated that he had been sentenced to 21 years in prison for killing two of his children and attempting to murder a third. ChatGPT also wove accurate details into the fabrication, such as the number of children he has, their genders and the name of his hometown.
Noyb says this has put OpenAI in violation of the European Union’s data privacy and security law, the General Data Protection Regulation (GDPR). The advocacy group’s data protection lawyer, Joakim Söderberg, stated:
“The GDPR is clear. Personal data has to be accurate. And if it’s not, users have the right to have it changed to reflect the truth. Showing ChatGPT users a tiny disclaimer that the chatbot can make mistakes clearly isn’t enough. You can’t just spread false information and in the end add a small disclaimer saying that everything you said may just not be true.”
This isn’t the first time Noyb has complained about OpenAI spreading misinformation. The advocacy group filed a complaint last April over the AI generating incorrect information about a public figure.
AI getting a fact wrong is one thing, but falsely branding someone a murderer is especially egregious.
These types of erroneous statements from AI are what the industry calls hallucinations. Unlike hallucinations experienced by people, AI hallucinations are confidently stated but false outputs, and they stem from how generative AI systems like large language models (LLMs) work: the models predict statistically likely sequences of words rather than retrieving verified facts, so a fluent-sounding answer can be entirely fabricated.
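That dynamic can be illustrated with a toy next-word predictor. This is a deliberate simplification (real LLMs are large neural networks, not word-frequency tables, and the corpus and function names here are invented for illustration), but it shows how a system that only models which words tend to follow which can produce a fluent claim with no regard for whether it is true of the person being asked about:

```python
from collections import Counter, defaultdict

# Toy "training data": sentences the model has seen.
corpus = [
    "the man was sentenced to prison",
    "the man was sentenced to death",
    "the man was a teacher",
]

# Count which word follows each word (a one-word-context language model).
follows = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for a, b in zip(words, words[1:]):
        follows[a][b] += 1

def complete(prompt, max_words=6):
    """Greedily append the most likely next word at each step."""
    words = prompt.split()
    for _ in range(max_words):
        nxt = follows.get(words[-1])
        if not nxt:
            break
        words.append(nxt.most_common(1)[0][0])
    return " ".join(words)

# The model fluently "completes" a claim about whoever the prompt names,
# because it models word co-occurrence, not facts about the world.
print(complete("the man"))
```

Asked about any man at all, this sketch confidently continues with "was sentenced to…" simply because that phrasing dominated its training text, which is the same failure mode, writ small, behind an LLM asserting a crime that never happened.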
Researchers estimate that LLMs hallucinate up to 27 per cent of the time, and that 46 per cent of generated texts contain factual errors.
Image Credit: Shutterstock
Source: Engadget