Google has quietly discontinued its “What People Suggest” feature — an AI-powered tool that displayed crowdsourced health tips from everyday internet users directly inside Search results. The move raises important questions about AI reliability, medical accuracy, and the growing tension between innovation and user safety.
What Was the “What People Suggest” Feature?
How It Worked
Google launched “What People Suggest” in early 2025 on US mobile devices. The feature used artificial intelligence to scan online communities — including Reddit, Twitter/X, and Quora — and organize user-generated health perspectives into digestible summaries. These summaries appeared at the top of search results for health-related queries, such as “Why does my leg hurt?” or “How do I manage arthritis pain?”
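Google has not disclosed how the feature worked under the hood, but its description suggests a standard text-clustering pipeline: gather forum comments relevant to a query, group similar comments together, and label each group as a theme. The Python sketch below is a hypothetical approximation of that idea using scikit-learn's TF-IDF and k-means; the sample comments, cluster count, and overall design are illustrative assumptions, not Google's actual system.

```python
# Toy sketch of a “summarize forum perspectives into themes” pipeline.
# Google has not published how “What People Suggest” worked; this is an
# illustrative stand-in using TF-IDF and k-means, not Google’s method.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

# Hypothetical forum comments about managing arthritis pain (made up).
comments = [
    "Stretching every morning helped my knee pain a lot.",
    "Ibuprofen works for me but upsets my stomach.",
    "Swimming twice a week eased my arthritis more than anything.",
    "My doctor recommended physical therapy and it really helped.",
    "Heat pads before bed reduce my joint stiffness.",
    "Low-impact exercise like swimming keeps my joints moving.",
]

# Embed comments as TF-IDF vectors and group them into broad “themes”.
vectors = TfidfVectorizer(stop_words="english").fit_transform(comments)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vectors)

# Print each cluster; a real system would then have a language model
# write a short, readable summary for every theme.
for theme in range(3):
    print(f"Theme {theme}:")
    for comment, label in zip(comments, labels):
        if label == theme:
            print("  -", comment)
```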
Google’s Original Vision
At a New York event in March 2025, Google introduced the feature as a major step forward. Karen DeSalvo, then Google’s chief health officer, stated that people not only seek expert medical advice but also value hearing from others who share similar experiences. The company described it as a way to “organize different perspectives from online discussions into easy-to-understand themes.” Google even claimed the feature showed “the potential of AI to transform health outcomes across the globe.”
That vision, however, did not last long.
Why Did Google Remove It?
Google’s Official Explanation
A Google spokesperson confirmed the removal to The Guardian, calling it part of a “broader simplification” of the search results page. The company maintained that the decision had nothing to do with the feature’s quality or safety. According to Google, the tool simply wasn’t being used frequently enough to justify its place in Search.
The Suspicious Timing
Despite Google’s explanation, the timing is difficult to overlook. The removal comes just months after the company faced intense scrutiny over its AI Overviews, which were found to deliver incorrect or potentially harmful health advice. Notably, Google restricted AI-generated responses for certain sensitive medical queries — such as liver test results — after investigations revealed they were sometimes inaccurate. Furthermore, a November 2025 blog post by a Google search advocate had hinted at phasing out lesser-used features, though it never specifically named “What People Suggest.”
The Bigger Problem: AI and Medical Misinformation
Why Crowdsourced Health Advice Is Risky
The core issue with “What People Suggest” was its source material. Health advice from non-medical professionals carries inherent risks. Age, medical history, underlying conditions, and individual symptoms all affect how appropriate a given recommendation might be. What works for one person could be harmful — or even dangerous — for another.
Additionally, AI models are known to misinterpret context. A sarcastic comment on a forum, for instance, could be mistaken for a genuine health tip. Critics pointed out that placing such summaries prominently at the top of search results gave unverified advice an air of authority it did not deserve.
A Pattern of Retreating From AI Health Features
This is not an isolated incident. Google has stepped back from multiple AI health features in quick succession. The restriction of AI Overviews for medical queries earlier in 2026 was also a direct response to public concern. Together, these moves suggest a broader reckoning with the limits of deploying AI in healthcare contexts at scale.
What Health Experts Said
Criticism From the Medical Community
Health professionals were largely uncomfortable with the feature from the start. They argued that summarizing forum discussions and presenting them alongside authoritative medical results blurred a critical line. Moreover, AI systems often strip away important nuance. A tip that helped one person manage mild arthritis pain might not apply to, and could actively mislead, someone with a more complex condition.
Critics also highlighted the concept of “anecdotal bias.” Online health communities tend to attract users with extreme experiences — either very positive or very negative outcomes. Consequently, AI summaries drawn from these discussions may not reflect average or medically sound outcomes.
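To see why this matters, consider a toy model: if the chance that someone posts about a treatment grows with how extreme their outcome was, the posts available for summarization will over-represent outliers. The short simulation below, in which every distribution and probability is invented for illustration, makes the effect concrete.

```python
# Toy simulation of “anecdotal bias”: people with extreme outcomes
# (very good or very bad) are more likely to post about them, so the
# posts an AI summarizes over-represent outliers. All numbers invented.
import random
import statistics

random.seed(0)

# Hypothetical treatment outcomes across a population (0 = no effect).
population = [random.gauss(0, 1) for _ in range(100_000)]

# Assume the probability of posting rises with how extreme the outcome is.
posts = [x for x in population if random.random() < min(1.0, 0.02 + 0.4 * abs(x))]

def extreme_share(xs):
    """Fraction of outcomes more than two standard deviations from zero."""
    return sum(abs(x) > 2 for x in xs) / len(xs)

print(f"population: mean |outcome| = {statistics.mean(map(abs, population)):.2f}, "
      f"extreme share = {extreme_share(population):.1%}")
print(f"posts:      mean |outcome| = {statistics.mean(map(abs, posts)):.2f}, "
      f"extreme share = {extreme_share(posts):.1%}")
```

Even in this simple setup, the posted outcomes skew heavily toward the extremes, so a summary built from them describes the loudest experiences rather than the typical one.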
What Happens Now for Users
Forum Content Still Exists — Just Not Summarized
It is important to note that the removal of “What People Suggest” does not eliminate community health discussions from Search entirely. Reddit, Quora, and similar platforms still rank prominently in Google Search results. Users can still access this content — they simply need to click through to the source themselves, rather than receiving an AI-generated summary upfront.
This distinction matters. Google is no longer “vouching” for these summaries. Therefore, users who find medical tips in forums must treat them as personal anecdotes — not clinical evidence.
Tips for Safer Health Searches
- Always cross-reference health information with verified medical sources.
- Consult a licensed healthcare provider before acting on online advice.
- Use platforms like Mayo Clinic, WebMD, or government health portals for evidence-based guidance.
- Treat forum posts as personal stories, not prescriptions.
The Future of AI in Health Search
Google’s Next Move
Interestingly, the removal of “What People Suggest” coincides with Google’s annual “The Check Up” healthcare event, a conference where the company showcases new AI health research and partnerships. This signals that Google has not abandoned health AI altogether. Rather, it appears to be recalibrating its approach: moving away from unverified, crowdsourced summaries and toward more structured, expert-backed health information tools.
A Broader Industry Lesson
The rise and fall of “What People Suggest” highlights a tension that runs through all consumer-facing AI: the gap between what is technically possible and what is responsible to deploy at scale. Regulators in the UK, EU, and beyond are watching these voluntary retreats closely as they develop AI healthcare frameworks. For now, the lesson is clear: speed and convenience cannot come at the cost of patient safety.
