Don’t believe everything you read: Google AI Overviews offer “very dangerous advice” on food and health
As artificial intelligence becomes increasingly common in everyday life, some of the flaws of these automated systems are becoming clear.


Users searching online for health and nutrition information may be receiving dangerous advice from Google’s artificial intelligence systems, a new report has found.
The Guardian studied the answers given by Google’s in-house AI system, which offers readers brief ‘AI Overviews’ in response to search queries. Because these summaries appear at the top of the page of search results, users are increasingly taking their information from the AI Overview alone.
In theory, the AI results should be drawn from reputable sources across the web. However, experts cited numerous examples where Google had presented incorrect, or even dangerous, advice to users looking for health information.
Anna Jewell, the director of support, research and influencing at Pancreatic Cancer UK, explained: “The Google AI response suggests that people with pancreatic cancer avoid high-fat foods and provides a list of examples. However, if someone followed what the search result told them then they might not take in enough calories, struggle to put on weight, and be unable to tolerate either chemotherapy or potentially life-saving surgery.”

This is just one example of misinformation found in Google’s AI Overviews. The investigation found that queries about the ‘normal range for liver blood tests’ returned a series of figures without proper context and with no consideration of patients’ sex, age or ethnicity.
Athena Lamnisos, the chief executive of the Eve Appeal cancer charity, found that the Google AI Overview incorrectly listed a pap test as a test for vaginal cancer, potentially giving patients a false sense of security. The information also changed between searches, suggesting the algorithm is not capable of returning consistent, evidence-based responses.
Lamnisos explained: “We were also worried by the fact that the AI summary changed when we did the exact same search, coming up with a different response each time that pulled from different sources. That means that people are getting a different answer depending on when they search, and that’s not good enough.”
Similarly, Stephen Buckley, the head of information at mental health charity Mind, warned that summaries for serious conditions and disorders gave “very dangerous advice” that was “incorrect, harmful or could lead people to avoid seeking help.”