Apple's New Siri Will Be Powered By Google Gemini
The smarter, more capable version of Siri that Apple is developing will be powered by Google Gemini, reports Bloomberg. Apple will pay Google approximately $1 billion per year for access to a 1.2 trillion parameter artificial intelligence model developed by Google.
For context, parameters are the internal values a model learns during training and uses to interpret and respond to queries. A higher parameter count generally means a more capable model, though training data and architecture also matter. Bloomberg says that Google's model "dwarfs" the parameter count of Apple's current models.
The current cloud-based version of Apple Intelligence uses 150 billion parameters, roughly an eighth of the Gemini model's 1.2 trillion, but there are no specific figures for how the other models Apple is developing measure up.
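As a rough illustration of where parameter counts come from (the layer sizes below are assumptions for the sake of example, not details of any Apple or Google model), even one dense layer's weights run into the tens of millions, and stacking many layers quickly reaches the billions:

```python
# Toy illustration of what "parameters" means: the learned weights a model stores.
# The sizes below are hypothetical, chosen only to show the arithmetic.
layer_width = 4096        # neurons per layer (assumed)
num_layers = 80           # stacked layers (assumed)

# One dense 4096 x 4096 weight matrix plus its bias vector:
per_layer = layer_width * layer_width + layer_width
total = per_layer * num_layers

print(f"{per_layer:,} parameters per layer")      # 16,781,312
print(f"{total:,} parameters in this toy model")  # ~1.34 billion
```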
Apple will use Gemini for summarization and for multi-step task planning and execution, while Apple's own models will continue to handle some Siri features. The AI model that Google is developing for Apple will run on Apple's Private Cloud Compute servers, so Google will not have access to Apple user data.
Gemini uses a Mixture-of-Experts architecture: while it has over a trillion total parameters, only a fraction of them are activated for each query. That design gives the model a large total capacity without a correspondingly large compute cost per query.
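A minimal sketch of how Mixture-of-Experts routing works in principle (this is a toy NumPy example; the expert count, hidden size, and top-k value are assumptions, not Gemini's actual configuration): a router picks a few experts per input, so most of the layer's parameters sit idle on any single query.

```python
# Toy Mixture-of-Experts layer: only the top-k experts run per input.
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 8   # total expert networks in the layer (assumed)
TOP_K = 2         # experts activated per query (assumed)
HIDDEN = 16       # toy hidden dimension

# Each "expert" is a small feed-forward weight matrix.
experts = [rng.standard_normal((HIDDEN, HIDDEN)) for _ in range(NUM_EXPERTS)]
# The router scores every expert for a given input vector.
router = rng.standard_normal((HIDDEN, NUM_EXPERTS))

def moe_layer(x: np.ndarray) -> np.ndarray:
    """Route the input to the top-k experts and combine their outputs."""
    scores = x @ router                      # one score per expert
    top = np.argsort(scores)[-TOP_K:]        # indices of the k best experts
    weights = np.exp(scores[top])
    weights /= weights.sum()                 # softmax over the chosen experts
    # Only TOP_K of NUM_EXPERTS experts execute, so most parameters stay idle.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

out = moe_layer(rng.standard_normal(HIDDEN))
print(out.shape)  # (16,)
```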
Apple weighed using its own AI models for the LLM version of Siri and also tested options from OpenAI and Anthropic, but it ultimately chose Gemini after judging Anthropic's fees too high. Apple already has a search partnership with Google, which pays Apple around $20 billion per year to be the default search engine on Apple devices.
Though Apple will rely on Google's AI for now, it plans to continue working on its own models and to transition to an in-house solution once its LLMs are capable enough. Apple is already working on a 1 trillion parameter cloud-based model that could be ready as soon as 2026, and it is unlikely to publicize its arrangement with Google while it develops in-house models.
Apple was meant to debut an updated version of Siri in iOS 18, but deficiencies required the company to overhaul the underlying Siri architecture and significantly delay the rollout. The smarter Apple Intelligence Siri is expected to be introduced in an iOS 26.4 update that's coming in spring 2026.
Siri will be able to answer more complex queries and complete more complicated tasks in and between apps. It will be closer in function to Claude and ChatGPT, though Apple is not planning a dedicated chatbot app.
This article, "Apple's New Siri Will Be Powered By Google Gemini" first appeared on MacRumors.com