How Google's Query Fan-Out System Supercharges AI Mode for 1.5B Users

Reviewed by Andrii Daniv
2 min read
Jul 31, 2025

Google has revealed how its generative search products assemble answers in real time, outlining a "query fan-out" technique that splits one question into many sub-queries and merges the results with a large language model. The explanation came from Robby Stein, VP of Product for Search, during an interview published on the Google Developers YouTube channel on 12 July 2025.

How query fan-out works

When a user submits a prompt in AI Mode, the language model generates a set of implicit follow-up questions. Each sub-query is dispatched to Google Search, Maps, Shopping Graph, Finance or other vertical indexes. The service then:

  • Collects the results returned by every source.
  • Clusters related findings by theme.
  • Writes a single answer, citing the original web pages and data APIs.

The same workflow drives Deep Search and certain AI Overview responses. Stein said the approach mirrors a Google patent that groups sub-queries and summarizes them with a language model.
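
To make the workflow concrete, the sketch below shows one way a fan-out pipeline can be wired together in Python: expand a prompt into sub-queries, dispatch them to several vertical sources in parallel, then cluster the results for a final summarization pass. The function names, the stubbed data sources and the naive clustering are illustrative assumptions, not Google's actual implementation.

# Minimal sketch of a query fan-out pipeline. All names and data sources
# below are hypothetical stand-ins, not Google's implementation.
from concurrent.futures import ThreadPoolExecutor
from collections import defaultdict


def generate_sub_queries(prompt: str) -> list[str]:
    # Stand-in for the LLM step that expands one prompt into implicit follow-up questions.
    return [f"{prompt} reviews", f"{prompt} price comparison", f"{prompt} near me"]


def search_vertical(source: str, sub_query: str) -> list[dict]:
    # Stand-in for dispatching a sub-query to one vertical index (Search, Maps, Shopping Graph, Finance).
    return [{"source": source, "query": sub_query, "url": f"https://example.com/{source}"}]


def fan_out(prompt: str, sources: list[str]) -> dict[str, list[dict]]:
    sub_queries = generate_sub_queries(prompt)
    results = []
    # Fan out: each sub-query is sent to every relevant source in parallel.
    with ThreadPoolExecutor() as pool:
        futures = [
            pool.submit(search_vertical, source, q)
            for q in sub_queries
            for source in sources
        ]
        for future in futures:
            results.extend(future.result())
    # Fan in: cluster related findings by theme (here, naively by source).
    clusters = defaultdict(list)
    for result in results:
        clusters[result["source"]].append(result)
    return dict(clusters)


if __name__ == "__main__":
    clusters = fan_out("home safe", ["search", "shopping", "maps"])
    # A final language-model pass would write one answer citing each cluster's pages.
    for theme, items in clusters.items():
        print(theme, [item["url"] for item in items])

Running the sub-queries in parallel is one way to keep latency manageable when a single session fires off dozens or hundreds of them.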

Key numbers

  • AI Mode and related features now reach roughly 1.5 billion monthly users.
  • The Shopping Graph refreshes about two billion product listings every hour.
  • A Deep Search session can fire off dozens, sometimes hundreds, of background queries and may take several minutes to complete.

Use-case examples

Stein shared several scenarios:

  • Travel and dining prompts pull data from Maps, local business listings and reservation partners.
  • Financial comparisons trigger real-time quotes from Google Finance.
  • A Deep Search request about a home safe returned fire ratings, insurance guidance and product links in one consolidated answer.

Background

AI Mode entered public Search Labs in May 2024 as the successor to Search Generative Experience before rolling out to more countries and languages. Deep Search followed in November 2024 as an opt-in tool for in-depth research. Throughout early 2025, Google progressively wove Shopping Graph, Maps data and Finance APIs into AI Mode.

Sources

  • Robby Stein, interview on the Google Developers YouTube channel, 12 July 2025.
  • Google patent "Thematic search with language model summaries" (US20230374612).
Author
Andrii Daniv
Andrii Daniv is the founder and owner of Etavrian, a performance-driven agency specializing in PPC and SEO services for B2B and e‑commerce businesses.