
Native iOS 27 Workloads Can Now Route to Claude and Gemini

Apple's Extensions framework for iOS 27 allows developers to integrate third-party AI models directly into native Siri and Writing Tools workflows.

On May 5, 2026, details emerged regarding Apple’s plan to overhaul its AI architecture in iOS 27, iPadOS 27, and macOS 27. The company is shifting away from OpenAI’s exclusive role in Apple Intelligence to support system-level routing to multiple third-party AI models. For developers building consumer AI applications, this opens a direct path to integrate external models into native Apple interfaces.

The Extensions Framework

The update centers on a new system internally referred to as Extensions. This framework allows the operating system to route generative AI tasks from native features directly to a user-selected third-party model. Users will manage these default selections via a dedicated menu in the Settings app.

Apple is internally testing integrations with Google’s Gemini and Anthropic’s Claude. Reports indicate other providers, including xAI’s Grok, could be supported if they opt in. To participate, third-party AI companies must build Extensions support into their existing App Store applications. Apple plans to launch a specific App Store section highlighting these compatible applications, enforcing a requirement that providers maintain native iOS binaries rather than relying purely on web interfaces.

To distinguish between native system execution and external AI routing, Apple will allow users to assign distinct voices to third-party models. A response generated by Claude or Gemini will sound different from a native Siri response, providing a persistent auditory cue for the active model.

Impact on Native Features

The Extensions routing layer replaces the default model execution pipeline across several core operating system components.

- Siri: Routes world knowledge and complex reasoning to external providers.
- Writing Tools: Delegates text generation, summarization, and editing.
- Image Playground: Leverages external image-generation models for on-device tools.
- Visual Intelligence: Moves to a dedicated Camera app toggle for external vision models.

Infrastructure and Google Partnership

The Extensions architecture coincides with a reported multi-year cloud deal between Apple and Google. Apple will use Gemini-based models and Google cloud technology to power the next generation of Apple Foundation Models. Despite this deep infrastructure integration, the Extensions feature maintains a neutral platform layer: users can explicitly bypass Google's default system role in favor of alternative foundation models.

Apple has scheduled its Worldwide Developers Conference for June 8 to 12, 2026, where the framework will be officially unveiled. The public release of the operating systems and their AI model-picker features is slated for Fall 2026. The shift in AI strategy comes as Apple reached a $250 million legal settlement over claims it had previously misled investors regarding the timeline and capabilities of Siri's legacy machine learning features.

If you build iOS applications that rely on custom AI logic or complex function calling, you must prepare your codebase for the Extensions framework APIs expected at WWDC. Native system routing means users will soon expect to invoke your app’s specific models directly from the iOS text selection menu or Siri prompt, bypassing your proprietary user interface entirely.
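The Extensions APIs are not yet public, so there is nothing concrete to code against today. The nearest available preparation is Apple's existing App Intents framework, which already lets the system invoke app functionality from Siri, Spotlight, and Shortcuts. The sketch below is illustrative only: `SummarizeIntent` is a hypothetical intent name, the placeholder body stands in for a real model call, and the eventual Extensions APIs may look quite different.

```swift
import AppIntents

// Hypothetical example: exposing an app-specific AI action to the system
// via the existing App Intents framework. The iOS 27 Extensions APIs are
// not public yet; this is the closest preparation available today.
struct SummarizeIntent: AppIntent {
    static var title: LocalizedStringResource = "Summarize Text"

    @Parameter(title: "Text")
    var text: String

    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        // Placeholder: in a real app, this would call your model backend.
        let summary = String(text.prefix(100))
        return .result(value: summary)
    }
}
```

Structuring model features as discrete, system-invocable intents like this should make it easier to adopt whatever routing surface Apple ships, since the work of separating model logic from your proprietary UI is already done.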

Get Insanely Good at AI

The book for developers who want to understand how AI actually works. LLMs, prompt engineering, RAG, AI agents, and production systems.
