Bloomberg reports that Apple has selected Google’s Gemini to power a major Siri overhaul, with a target release in spring 2026. Apple’s own publications detail a privacy-first AI architecture that keeps personal context on device and runs heavier inference on Private Cloud Compute. This brief separates reported facts from interpretation and flags open questions.
Executive snapshot
- Apple chose Google to build a custom Gemini model for Siri, targeting spring 2026 (iOS 26.4) for release, according to Bloomberg’s Mark Gurman [S1][S3].
- Bloomberg says Anthropic’s Claude was evaluated but would have cost Apple more than $1.5 billion per year, tipping the decision toward Google [S2].
- Reported design: Google’s models handle query planning and summarization on Apple’s Private Cloud Compute, while Apple’s models keep on-device personal data processing local [S1][S2][S5].
- Apple’s active installed base exceeds 2.2 billion devices, indicating large potential reach if the overhaul ships on schedule [S6].
- Branding: Bloomberg reports Apple does not plan to publicly acknowledge Google’s role; marketing would remain Apple-only [S1].
- Implication for marketers: If shipped, more queries may resolve inside Siri and Spotlight on iPhone, likely reducing early-stage web discovery from iOS [S1][S3].
Findings: Google Gemini for Siri overhaul
Bloomberg, citing people familiar with the matter, reports Apple has selected Google to create a custom Gemini-based model that will power a major Siri upgrade, with deployment targeted for iOS 26.4 in spring 2026 [S1][S3]. The work reportedly runs under the internal codename “Glenwood” [S1].
According to Bloomberg, Apple evaluated Anthropic’s Claude for the same role, but Claude’s expected annual cost (more than $1.5 billion) made Google’s proposal more attractive [S2]. The reported architecture splits responsibilities: Google’s models would handle query planning and summarization, while Apple’s own foundation models would process sensitive, on-device context. Cloud inference would run on Apple’s Private Cloud Compute infrastructure rather than Google-operated data centers [S1][S2][S5].
Bloomberg further reports Apple plans to market the upgraded Siri as Apple technology running on Apple servers, without acknowledging Google’s involvement [S1]. Bloomberg has also referenced a separate Apple smart home display device on a similar timeline that could showcase the assistant’s expanded capabilities [S3]. None of these partnership details have been announced by Apple or Google; financial terms beyond “Apple paying Google” remain undisclosed [S1]. These are forward-looking reports and could change.
Method and source notes
What was measured or reported
- Partnership and architecture details for Siri’s next generation; target release timing; relative vendor costs (Google vs. Anthropic). Source type: investigative reporting based on anonymous sources (Bloomberg/Mark Gurman) [S1][S2][S3].
- Apple’s official AI architecture for cloud inference (Private Cloud Compute) and current third-party model integrations (ChatGPT in Siri/Writing Tools) [S4][S5].
- Installed base scale for potential impact context (Apple earnings call) [S6].
Caveats
- Bloomberg’s claims are unconfirmed by Apple or Google and rely on unnamed sources; terms and timing could shift [S1][S2][S3].
- No public documentation yet on data handling specifics for a Gemini integration beyond Apple’s general Private Cloud Compute model [S5].
- The reported cost comparison with Anthropic is an estimate from Bloomberg’s sources, not a disclosed contract figure [S2].
Source IDs
- [S1] Bloomberg (Mark Gurman), Power On newsletter, Nov 2, 2025 - Apple to use custom Gemini for Siri; Apple to avoid acknowledging Google; PCC deployment; internal codename.
- [S2] Bloomberg, Sep 3, 2025 - Apple evaluated Google and Anthropic; Claude projected to cost Apple >$1.5B annually; model role as query planner/summarizer.
- [S3] Bloomberg, Jun 12, 2025 - Targeting spring 2026 for Siri AI overhaul (iOS 26.4) and related hardware timing.
- [S4] Apple Newsroom, “Apple introduces Apple Intelligence,” Jun 10, 2024 - Siri integration with ChatGPT (user-permissioned) and high-level architecture.
- [S5] Apple Security/ML Research, Private Cloud Compute technical overview, Jun 2024 - Design for server-side inference with hardware attestation and privacy protections.
- [S6] Apple Q1 FY2024 earnings call transcript, Feb 2024 - Active installed base exceeds 2.2 billion devices.
Private Cloud Compute and Apple Intelligence details
Apple’s 2024 disclosures outline a two-tier AI system: on-device models handle personal context, while an Apple-operated cloud tier, Private Cloud Compute (PCC), runs heavier inference [S4][S5]. PCC is described as Apple-controlled hardware with secure enclaves, software images that can be independently inspected, and cryptographic attestation to verify the server environment before any data is processed. Apple states that requests are ephemeral and not stored by default [S5].
Apple’s WWDC 2024 materials also show Apple brokering access to third-party models: Siri and Writing Tools can invoke OpenAI’s ChatGPT when the user consents, with clear handoffs and permissions [S4]. Bloomberg’s reporting is directionally consistent with this architecture: Google’s models would handle cloud-side planning and summarization while Apple keeps personal data processing on device and runs cloud inference on PCC rather than sending requests to a Google-operated endpoint [S1][S2][S5]. None of Apple’s public documents, however, mention Google in connection with Siri, and the specific division of responsibilities, routing rules, and any source attribution or link-out behavior for answers remain undisclosed [S1][S5].
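To make the reported division of labor concrete, the following is a minimal, purely illustrative Swift sketch of how a request might be routed under that split. Every type and function name here (OnDeviceContext, CloudPlanningRequest, AssistantRouter, sendToPrivateCloudCompute) is a hypothetical placeholder; neither Apple nor Google has published an API, routing rule, or redaction behavior for this integration [S1][S5].

```swift
import Foundation

// Hypothetical types only: this sketch mirrors the division of labor Bloomberg
// describes [S1][S2][S5]; it is not an Apple or Google API.

/// Personal context that, per Apple's published design, never leaves the device.
struct OnDeviceContext {
    let recentMessages: [String]
    let calendarSummary: String
}

/// A request that would be sent to Private Cloud Compute for heavier inference.
struct CloudPlanningRequest: Codable {
    let sanitizedQuery: String   // user query with personal identifiers stripped on device
}

enum AssistantRouter {
    /// Decide locally whether a query needs cloud-side planning/summarization.
    static func handle(query: String, context: OnDeviceContext) async throws -> String {
        // 1. On-device model resolves anything that depends on personal context.
        if let localAnswer = answerLocally(query: query, context: context) {
            return localAnswer
        }
        // 2. Otherwise, only a sanitized query goes to an attested PCC node;
        //    the redaction and transport steps below are placeholders.
        let request = CloudPlanningRequest(sanitizedQuery: stripPersonalData(from: query))
        return try await sendToPrivateCloudCompute(request)
    }

    private static func answerLocally(query: String, context: OnDeviceContext) -> String? {
        // Placeholder for Apple's on-device foundation model.
        query.lowercased().contains("my next meeting") ? context.calendarSummary : nil
    }

    private static func stripPersonalData(from query: String) -> String {
        // Placeholder redaction step; the real behavior is undocumented.
        query
    }

    private static func sendToPrivateCloudCompute(_ request: CloudPlanningRequest) async throws -> String {
        // Placeholder network call; PCC attestation and transport details are in [S5].
        "cloud-generated plan/summary"
    }
}
```

The design point Bloomberg emphasizes is that the cloud hop would terminate at Apple-attested PCC nodes rather than a Google-operated endpoint; how (or whether) queries are redacted before that hop is not publicly documented [S1][S5].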
Interpretation and implications for marketers
- Likely: More iOS queries will resolve inside Siri and Spotlight as planning and summarization improve, compressing top-of-funnel organic traffic for informational intents from iPhone users once iOS 26.x reaches scale [S1][S3]. Expect fewer landing-page visits from head terms that an assistant can answer concisely.
- Likely: Apple surfaces matter more. Maintain accurate data where Apple already controls the experience: Apple Business Connect for Maps/Siri cards, plus structured entity data that feeds local and knowledge-style responses. For app publishers, implement App Intents and Core Spotlight so Siri/Spotlight can deep-link into in-app answers when appropriate (Apple Developer); a hedged sketch follows the interpretation notes below [S4].
- Tentative: Source attribution and click-through from Siri answers are uncertain. If Apple minimizes third-party branding in responses, publishers could see lower visibility unless answers require a tap-through [S1].
- Tentative: Measurement gaps may widen. Siri/Spotlight referrals are already harder to segment; without explicit attribution mechanics, analytics teams may see more “direct” or untagged traffic from iOS unless Apple exposes identifiers or parameters [S1].
- Speculative: If Apple later introduces new paid placements or promotions within Siri/Spotlight to offset reduced web discovery, paid budgets could shift toward Apple inventory. There is no current evidence of such placements tied to this overhaul [S1][S4].
Interpretation notes: Items labeled Likely are grounded in Apple’s published architecture and Bloomberg’s consistent reporting; Tentative reflects directional but unconfirmed behavior; Speculative flags potential outcomes without direct evidence.
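For the App Intents/Core Spotlight recommendation above, the sketch below shows the general shape of exposing an in-app answer to Siri and Spotlight. AppIntents and CoreSpotlight are Apple’s published frameworks; the specific intent, identifiers, and content (ShowPricingIntent, “Acme Pricing”) are hypothetical placeholders, and nothing here reflects how the reported Gemini-powered Siri would rank or surface such content.

```swift
import AppIntents
import CoreSpotlight
import UniformTypeIdentifiers

// Hypothetical example only: the intent, identifiers, and content below are
// placeholders, not anything Apple has documented for the reported Siri overhaul.

/// An App Intent that lets Siri/Shortcuts open a specific in-app destination.
struct ShowPricingIntent: AppIntent {
    static var title: LocalizedStringResource = "Show Pricing"
    static var openAppWhenRun: Bool = true

    @MainActor
    func perform() async throws -> some IntentResult {
        // App-specific navigation would go here (e.g. deep-link to the pricing screen).
        return .result()
    }
}

/// Index an in-app page with Core Spotlight so Spotlight (and Siri suggestions)
/// can surface and deep-link into it.
func indexPricingPage() {
    let attributes = CSSearchableItemAttributeSet(contentType: .content)
    attributes.title = "Acme Pricing"                        // placeholder content
    attributes.contentDescription = "Plans and pricing for Acme's product tiers."

    let item = CSSearchableItem(
        uniqueIdentifier: "acme.pricing",
        domainIdentifier: "acme.marketing-pages",
        attributeSet: attributes
    )
    CSSearchableIndex.default().indexSearchableItems([item]) { error in
        if let error { print("Spotlight indexing failed: \(error)") }
    }
}
```

Exposing intents and indexing content only makes it eligible to appear; whether the upgraded Siri will prefer in-app answers, cite sources, or pass referral data remains unknown (see Contradictions and gaps).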
Contradictions and gaps
- Confirmation: Neither Apple nor Google has announced a Gemini-Siri deal; legal, technical, and financial terms are unverified [S1][S2][S3].
- Data handling specifics: Apple’s PCC model is documented, but how third-party model prompts and outputs are logged, cached, or trained on within a Gemini integration is not public [S5].
- Attribution and links: It is unknown whether Siri’s upgraded answers will cite sources or pass referral data that marketers can track [S1].
- Vendor scope: Bloomberg reports Google’s role as query planner and summarizer running on Apple servers, but the exact boundaries between Apple and Google models are not documented [S1][S2].
- Timing risk: Bloomberg indicates a spring 2026 target; major Siri rewrites have slipped before, and hardware tie-ins could alter the schedule [S3].
Data appendix (select)
- Target release window: Spring 2026, iOS 26.4 [S3].
- Vendor cost estimate: Anthropic Claude projected at >$1.5B per year to Apple if selected [S2].
- Reported model roles: Google Gemini for query planning and summarization; Apple foundation models for on-device context [S1][S2].
- Deployment environment: Apple Private Cloud Compute for cloud inference [S1][S5].
- Installed base scale: >2.2 billion active Apple devices [S6].