Google's AI Overviews feature was intended to give tidy summations of search results, researched and written in mere seconds by generative AI. The problem is, it got stuff wrong. How often? It's hard to say, although examples piled up quickly this month, not long after Google started rolling out AI Overviews on a wide-scale basis.
Consider these well-publicized flubs: When asked how to keep cheese on pizza, it suggested adding an eighth of a cup of nontoxic glue, a tip that originated from an 11-year-old comment on Reddit. And in response to a query about how many rocks a person should eat daily, it recommended "at least one small rock per day." That advice hailed from a 2021 story in The Onion. On Thursday evening, Google said it is now scaling back the service on health-related queries, as well as when it deems users are making nonsensical or satirical searches. You also shouldn't see AI Overviews results "for hard news topics, where freshness and factuality are important."

Read more: Glue in Pizza? Eat Rocks? Google's AI Search Is Mocked for Bizarre Answers

In a blog post, Liz Reid, vice president and head of Google Search, acknowledged that "some odd, inaccurate or unhelpful AI Overviews certainly did show up" and said that Google has "made more than a dozen technical improvements to our systems" in the past week and will "keep improving." While we're still learning about what's next for AI Overviews and for generative AI in search more broadly, we do know more abo.