Microsoft CEO Satya Nadella and Google Principal Engineer Jaana Dogan have recently addressed growing criticism of artificial intelligence tools, including concerns over "AI slop," changing workplace expectations, and the impact of AI features on web traffic and content publishers.
Satya Nadella's comments on "AI slop" and product design
Nadella published a blog post titled "Looking Ahead to 2026" on his personal site. In the post, he argued that the industry needs to move past debates about low-quality "slop" versus more sophisticated output.
The industry should "get beyond the arguments of slop vs sophistication," he wrote, describing AI systems as "cognitive amplifier tools" that must "prove [their] value in the real world" in 2026.
Nadella framed AI integration into products as "the product design question we need to debate and answer," and said the goal is to reach "a new equilibrium" that reflects how people actually work with these tools. His comments came as AI output quality and reliability remain central issues for businesses deciding how far to integrate generative systems into workflows.
His use of the term "slop" aligns with a broader cultural conversation about AI quality. Merriam-Webster recently selected "slop" as its Word of the Year, citing lookup data and editorial judgment that reflect increased attention to low-effort, low-quality content online.
Jaana Dogan's posts on new technology burnout and Claude Code
Jaana Dogan, a Principal Engineer working on Google's Gemini API, has used X to comment on attitudes toward new technology and to describe her own experience with AI coding tools.
In January 2026, she posted on X about why some people say they oppose new technologies.
"People are only anti new tech when they are burned out from trying new tech," she wrote.
In a separate update, Dogan described using Claude Code for a distributed systems prototype. She said the tool produced an orchestrator design in about an hour from her problem description, and she compared the result with work her team had been doing since the previous year.
Dogan added that in 2023 she believed such capabilities were still "five years away," suggesting her expectations for AI-assisted development have shifted as tools mature. She also noted that her comments were personal observations and did not represent official Google policy or product roadmaps.
Reported data on AI search features and publisher traffic
Recent data has intensified debate over how AI search features affect traffic to publishers and other content creators.
Pew Research Center examined how AI summaries appear in Google Search, using browsing data covering 68,879 real searches. In its May 23, 2025 analysis, users clicked on a traditional search result link in 8% of searches where an AI Overview appeared, compared with 15% of searches without one.
Separately, Columbia Journalism Review reported on how AI Overviews have affected news-related searches, citing aggregated Google Search traffic data from Similarweb. According to that analysis, the share of news-related Google searches that ended without a click through to a news site rose from 56% to 69%.
On the infrastructure side, Cloudflare has estimated crawl-to-referral ratios for several search and AI companies. According to its analysis, Google Search sent about one visit to publisher sites for every fourteen crawler requests. The same post estimated ratios of around 1,700:1 for OpenAI and 73,000:1 for Anthropic, pointing to a far wider gap between content crawled and measurable referral traffic for some AI-focused services.
At the same time, Google's Search documentation continues to reference Experience, Expertise, Authoritativeness, and Trustworthiness (E-E-A-T) as criteria for evaluating high-quality content. The documentation frames these as long-standing expectations that publishers provide clear sourcing, accuracy, and first-hand experience, especially in areas such as health, finance, and legal information, even as AI-generated summaries play a larger role in how information is surfaced.