On January 2, 2026, The Guardian reported that health experts had identified misleading information in some Google AI Overviews for medical searches. The investigation focused on AI-generated summaries shown in Google Search results in the United Kingdom. Google disputed the reporting and said AI Overviews generally provide accurate medical information.
Investigation and Reported Health Examples
The Guardian published an investigation into AI Overviews for health queries, based on tests of multiple medical search terms. The outlet shared screenshots of AI-generated summaries with health charities, medical experts, and patient information groups for review.
According to The Guardian, reviewers flagged several summaries as misleading or incorrect. The report said the same query sometimes produced different AI Overviews at different times, drawing on different sources. The article stated that users would typically see these summaries above standard web search results.
The Guardian cited feedback from Pancreatic Cancer UK on dietary advice for pancreatic cancer patients. An AI Overview reportedly advised patients to avoid high-fat foods, which the charity said conflicted with established guidance designed to help patients maintain weight during treatment. Anna Jewell, director of support, research and influencing at the charity, called the guidance "completely incorrect" and said following it could jeopardize a patient's ability to receive treatment.
The investigation also included feedback from the mental health charity Mind. Stephen Buckley, head of information at Mind, said some AI Overviews for psychosis and eating disorders offered advice that could discourage people from seeking professional help, describing several summaries as "very dangerous advice" that was "incorrect, harmful or could lead people to avoid seeking help".
Another example involved cancer screening guidance. Eve Appeal chief executive Athena Lamnisos said an AI Overview that listed a Pap test as a test for vaginal cancer gave "completely wrong information". Patient Information Forum director Sophie Randall told The Guardian that the examples showed AI Overviews could surface inaccurate health information at the top of search results.
Google's Response and Product Background
Google disputed both specific examples and broader conclusions in comments to The Guardian. A company spokesperson said many of the shared health examples were "incomplete screenshots". The spokesperson said that, based on what Google could review, the summaries linked to well-known, reputable sources and recommended seeking expert medical advice.
Google told The Guardian that the "vast majority" of AI Overviews are "factual and helpful". The company said AI Overviews' accuracy is "on a par" with other Search features, including featured snippets. Google also said it continuously makes quality improvements and will take action when AI Overviews misinterpret web content or miss context.
AI Overviews began expanding in Google Search in 2024. After widely reported viral errors, including suggestions involving glue on pizza and eating rocks, Google announced changes to how AI Overviews appear. In a May 2024 blog post, Google said it would reduce the scope of queries that trigger AI Overviews and refine underlying systems to improve result quality and reliability.
Subsequent third-party analyses have examined how often AI Overviews appear for medical searches. An Ahrefs study of 146 million search engine results pages found that 44.1% of medical "Your Money or Your Life" (YMYL) queries displayed an AI Overview, more than double the overall rate. Separate research on large language models, including the SourceCheckup framework, found that many medical answers were not fully supported by their cited sources, even when links were provided.
Source Citations
- The Guardian: investigation into Google AI Overviews and health information
- Google Search Blog: update on AI Overviews quality and coverage, May 2024
- Ahrefs: analysis of AI Overview trigger rates for medical queries
- arXiv: SourceCheckup evaluation framework for citation support in medical AI answers






