
When GPT hallucinates: Doctors warn against using AI as it makes up information about cancer

A team of doctors found that AI chatbots such as ChatGPT and Bing AI can give wrong or fabricated information when asked about breast cancer. In the study, roughly one out of every ten queries about breast cancer screening was answered incorrectly by the chatbots, both of which use OpenAI's GPT models.

Making up evidence to support their answers

One worrying finding was that while Bing AI cited some sources, including a few sketchy ones, ChatGPT in some instances "created" fake journal papers to back up its assertions. Moreover, not all of the remaining 88 per cent of responses were fully correct: the three doctors reviewing the information had to flag a considerable number of them as "inaccurate" or even "fictitious."

The incident is another reminder that users and potential customers of AI chatbots should proceed with caution, because these applications still have a propensity to hallucinate, or make things up.

Firstpost
