Google Search Advocate John Mueller has publicly criticized proposals to serve Markdown-only versions of web pages to AI crawlers. In comments on Reddit and Bluesky, he questioned the technical value of the approach for large language model bots.
Key details of Mueller's Markdown comments
Mueller's recent remarks were prompted by a Reddit discussion where a developer described a system for serving Markdown files instead of standard HTML to AI bots. He later referred to the same idea in public posts on Bluesky.
- The Reddit developer said the approach targeted AI crawlers including GPTBot and ClaudeBot.
- They reported early tests showing about a 95% reduction in tokens used per page for those bots.
- Mueller replied in the thread, asking whether such bots can meaningfully recognize Markdown on websites beyond treating it as plain text.
- He questioned how Markdown delivery would handle internal links, navigation, headings, and other site elements normally exposed in HTML.
- On Bluesky, Mueller described the concept of converting pages to Markdown for bots as "a stupid idea," referencing the same proposal.
- His Bluesky post also noted that large language models can read images, using that point to mock aggressive content transformations for AI crawlers.
"Converting pages to markdown is such a stupid idea."
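The setup described in the thread amounts to user-agent-based content negotiation. A minimal, stdlib-only sketch of that idea, assuming the poster keyed on the GPTBot and ClaudeBot user-agent strings mentioned above (the `select_representation` helper and bot list are illustrative, not the poster's actual code):

```python
# Illustrative sketch: pick a page variant based on the User-Agent header.
# The bot tokens come from the crawlers named in the Reddit thread.
AI_BOT_TOKENS = ("GPTBot", "ClaudeBot")

def select_representation(user_agent: str) -> str:
    """Return the content type of the variant to serve for this User-Agent."""
    if any(token in user_agent for token in AI_BOT_TOKENS):
        return "text/markdown"  # lightweight Markdown copy for AI crawlers
    return "text/html"          # full HTML page for everyone else

print(select_representation("Mozilla/5.0 ... GPTBot/1.0"))     # text/markdown
print(select_representation("Mozilla/5.0 (Windows NT 10.0)"))  # text/html
```

Mueller's objection targets the payoff of exactly this branch: even if the routing works, it is unclear that the bots on the Markdown path treat the response as anything more than plain text.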
Background on the Markdown-for-bots proposal
Developers experimenting with bot-specific Markdown delivery aim to cut the amount of data processed by AI crawlers. The Reddit poster argued that Markdown pages help language models ingest more content per request because they include less markup and code than full HTML documents.
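The savings claim rests on how much of an HTML document is markup rather than readable text. A rough, stdlib-only sketch of that kind of comparison, using character counts as a crude proxy for model tokens (the sample page is invented for illustration):

```python
# Rough sketch of the reasoning behind "fewer tokens per page":
# strip the markup and compare sizes. Character counts only
# approximate model tokens.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects only the text nodes of an HTML document."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

html_page = (
    "<html><head><title>Post</title></head>"
    "<body><h1>Post</h1><p>Short body text.</p></body></html>"
)

extractor = TextExtractor()
extractor.feed(html_page)
plain = "".join(extractor.chunks)

savings = 1 - len(plain) / len(html_page)
print(f"markup overhead: {savings:.0%} of the page")
```

On real pages heavy with scripts, styles, and navigation, the markup share is far higher than in this toy example, which is where figures like the reported 95% come from.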
In the same discussion, technical SEO consultant Jono Alderson raised concerns about flattened Markdown copies of pages, arguing that they strip out meaningful structure and context present in standard HTML, including elements that help both search engines and users understand page layout and importance.
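A hypothetical fragment illustrates the kind of loss Alderson describes: semantic HTML makes the role of each element explicit, while a flattened Markdown copy keeps only the text and heading levels.

```html
<!-- Illustrative only: in HTML, element roles are explicit. -->
<nav aria-label="Breadcrumb">
  <a href="/guides/">Guides</a> &rsaquo; <a href="/guides/seo/">SEO</a>
</nav>
<article>
  <h1>Page title</h1>
  <aside>Related links the page treats as secondary.</aside>
</article>

<!-- A flattened Markdown copy of the same content:

     Guides > SEO
     # Page title
     Related links the page treats as secondary.

     Nothing distinguishes navigation from body copy
     or secondary content from the main article. -->
```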
Other Reddit participants also expressed skepticism about rewriting content just for AI crawlers. One commenter noted that there is no public evidence language models reward cheaper-to-parse formats in training, ranking, or citations.
Separately, SE Ranking reported an analysis of 300,000 domains studying the llms.txt convention for LLM crawlers. According to coverage by Search Engine Journal, that study found no clear link between llms.txt usage and LLM citations.
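For context, llms.txt is a proposed convention (described at llmstxt.org) for a Markdown file at a site's root that summarizes the site and links to LLM-friendly resources. A minimal sketch, with placeholder names and URLs:

```markdown
# Example Site

> One-line summary of what the site covers, aimed at LLM crawlers.

## Docs

- [Getting started](https://example.com/start.md): brief description of the page
- [API reference](https://example.com/api.md): brief description of the page
```

The SE Ranking analysis suggests that, so far, publishing such a file has no measurable effect on whether LLMs cite a site.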
In earlier public comments reported by Search Engine Journal, Mueller similarly questioned creating LLM-only Markdown or JSON versions of pages. He instead recommended maintaining clean HTML with supported structured data, rather than separate machine-only files for AI systems.
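A minimal sketch of that recommendation, assuming a schema.org Article in JSON-LD (the values are placeholders): one clean HTML page carries both the human-readable content and the structured data, with no separate machine-only file.

```html
<!doctype html>
<html lang="en">
  <head>
    <title>Example article</title>
    <!-- Supported structured data embedded in the normal HTML page. -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "Example article",
      "datePublished": "2024-01-01"
    }
    </script>
  </head>
  <body>
    <article>
      <h1>Example article</h1>
      <p>Body content readable by users, search engines, and LLM crawlers alike.</p>
    </article>
  </body>
</html>
```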
Taken together, the Reddit thread, Bluesky posts, and earlier remarks highlight an ongoing debate about LLM-focused web formats. Search professionals continue to discuss whether separate machine-facing versions of pages are needed alongside standard HTML.
Source citations
Mueller's comments and the original discussion of Markdown delivery for AI crawlers appear in the Reddit thread and Bluesky posts referenced above.