Do you think you can trust ChatGPT and Gemini to give you the news? Here’s why you might want to think again



  • Nearly half of all AI assistants' responses to news questions contain major errors, a major international study finds.
  • Factual, sourcing, or contextual issues were evident across 14 languages and 18 countries.
  • Gemini fared worst, with double the rate of major problems compared to its competitors.

When you ask an AI assistant about news and current events, you might expect an objective, authoritative response. But according to a large international study led by the BBC and coordinated by the European Broadcasting Union (EBU), almost half of the time those answers are incorrect, misleading, or simply made up (anyone who has dealt with the nonsense headlines written by Apple's AI can relate).

The study examined how ChatGPT, Microsoft Copilot, Google Gemini, and Perplexity handle news queries in 14 languages across 18 countries, analyzing more than 3,000 individual responses. Professional journalists from 22 public media outlets evaluated each response for accuracy, sourcing, and how well it distinguished news from opinion.

The results are grim for anyone who relies on AI for news. The report found that 45% of all responses had a major problem, 31% had serious sourcing problems, and 20% were simply inaccurate. It's not just one or two embarrassing mistakes, like confusing the Prime Minister of Belgium with the leader of a Belgian pop group. The research found deep structural problems in the way these assistants process and deliver news, regardless of language, country, or platform.

News Integrity in AI Assistants: An International PSM Study

(Image credit: BBC/EBU)

In some languages, the assistants openly hallucinated details. In others, they attributed quotes to outlets that had published nothing even remotely similar to what was cited. Context was often lacking, and the assistants sometimes offered simplistic or misleading summaries in place of crucial nuance. At worst, that could change the meaning of an entire news story.


