Apple is planning to upgrade Siri twice in the coming year, adding personalization features in iOS 26.4 before turning the personal assistant into a full chatbot in iOS 27. As long as timelines don't change, we'll see the Siri chatbot as soon as June 2026. Here's everything we know so far.
SiriBot
With iOS 27, Apple will change the way that Siri works. Right now, Siri can answer basic questions and complete simple tasks, but you can't engage it in a back-and-forth conversation, get help with multi-step tasks, or ask complicated questions.
Based on the current Siri chatbot rumors, Siri will be able to do all of that and more with the upcoming upgrade, and it will work like competing chatbots.
Apple wasn't initially planning to introduce a full chatbot that users can interact with similarly to Claude or ChatGPT, but chatbots have become too popular for Apple to ignore. Simply adding AI capabilities to apps and features isn't enough for Apple to stay competitive, given how widely people have embraced chatbots for everything from web searches to coding help.
Google has already integrated Gemini into a range of Android devices, and chatbots like ChatGPT have hundreds of millions of weekly active users.
Siri Capabilities
According to Bloomberg's Mark Gurman, Siri's chatbot capabilities will be "embedded deeply" into Apple's products at the system level. Siri won't be an app, but will instead be integrated into iOS, iPadOS, and macOS, as Siri is now.
Siri Activation and Interface
Users will activate Siri in the same way they do today, by speaking the Siri wake word or pressing the side button on a Siri-enabled device. Siri will be able to respond to both voice and text-based requests.
We don't yet know what the new Siri interface will look like. Apple will need to make big changes to the way that Siri looks and feels if it wants to match the functionality offered by companies like OpenAI, Anthropic, and Google.
People are used to opening an app and having a full text interface that includes conversation history, and it's not clear how Apple will provide that if there's no dedicated Siri chatbot app. People will want to be able to access their past conversations and have tools for uploading files and images.
It's possible that activating Siri could lead to an app-like interface that takes over the iPhone, iPad, or Mac's display, but that would be a departure from Siri's current minimalistic design. Apple could alternatively log conversations in a place like the Notes app, or in the clipboard on the Mac.
Gurman says that Siri won't be an app, but that might mean that it won't only be an app. There could be some kind of dedicated chatbot app that people can use, with Siri also able to be activated and used at the system level and in and across apps.
What Siri Chatbot Can Do
It sounds like the Siri chatbot will be able to do everything that current chatbots can do, and more.
- Search the web for information
- Generate images
- Generate content
- Summarize information
- Analyze uploaded files
- Use personal data to complete tasks
- Ingest information from emails, messages, files, and more
- Analyze open windows and on-screen content to take action
- Control device features and settings
- Search for on-device content, replacing Spotlight
Siri will also be integrated into Apple's core apps, including Mail, Messages, Apple TV, Xcode, and Photos. Siri will be able to search for specific images, edit photos, help with coding, make suggestions for TV shows and movies, and send emails.
iOS 26.4 "LLM Siri" vs. Chatbot Siri
In iOS 26.4, Apple plans to introduce a new, updated version of Siri that relies on large language models, or LLMs. Apple has been working on this version of Siri since Apple Intelligence features were added to iOS 18, but it was delayed because Siri's underlying architecture needed an overhaul to run LLMs.
Starting in iOS 26.4, Siri will be able to hold continuous conversations and provide human-like responses to questions, plus Siri will have new personalization features that will let it do more than before. What Siri won't have, though, is full chatbot capabilities. Here's what we're expecting:
Personal Context
With personal context, Siri will be able to keep track of emails, messages, files, photos, and more, learning more about you so it can help you complete tasks and find what you've been sent.
- Show me the files Eric sent me last week.
- Find the email where Eric mentioned ice skating.
- Find the books that Eric recommended to me.
- Where's the recipe that Eric sent me?
- What's my passport number?
Onscreen Awareness
Onscreen awareness will let Siri see what's on your screen and complete actions involving whatever you're looking at. If someone texts you an address, for example, you can tell Siri to add it to their contact card. Or if you're looking at a photo and want to send it to someone, you can ask Siri to do it for you.
Deeper App Integration
Deeper app integration means that Siri will be able to do more in and across apps, performing actions and completing tasks that just aren't possible with the personal assistant right now. We don't have a full picture of what Siri will be capable of, but Apple has provided a few examples of what to expect.
- Moving files from one app to another.
- Editing a photo and then sending it to someone.
- Getting directions home and sharing the ETA with Eric.
- Sending an email you drafted to Eric.
You're not going to have a chat-like interface for back-and-forth conversations with Siri when iOS 26.4 launches, but the personal assistant should be very different from what it is now. Apple software engineering chief Craig Federighi told employees last summer that the Siri revamp was successful. "This has put us in a position to not just deliver what we announced, but to deliver a much bigger upgrade than we envisioned," he said.
Siri Redesign
With all of the new functionality coming to Siri, Apple is planning to make visual design changes. It's not quite clear what that will entail, but for the upcoming tabletop robot that's in the works, Apple has tested an animated version of Siri that looks similar to the Mac's Finder logo.
Apple could start rolling out that new, more personalized design when Siri gets the major iOS 27 revamp.
Memory
Claude, ChatGPT, and Gemini can remember past conversations and interactions, retaining a memory of the user. Apple is said to be discussing how much the Siri chatbot will be able to remember, and it may limit conversational memory to protect user privacy.
Naming
Siri is getting a major overhaul, but Apple will probably continue to refer to it as Siri. It'll just be a much smarter version of the assistant.
Underlying Architecture and Servers
Apple has inked a deal with Google that will see Gemini powering upcoming versions of Siri. Apple plans to use Gemini for the iOS 26.4 updates that it is introducing, and Google's technology will also power the Siri chatbot.
"Apple and Google have entered into a multi-year collaboration under which the next generation of Apple Foundation Models will be based on Google's Gemini models and cloud technology," the two companies said in a statement in January.
The Siri chatbot specifically will rely on a custom AI model developed by the Google Gemini team. Gurman claims that the custom model is comparable to Gemini 3, and that it will be much more powerful than the model behind Apple's upcoming iOS 26.4 features.
Apple and Google are also discussing running the Siri chatbot on Google's servers, powered by Tensor Processing Units, probably because Apple doesn't yet have the infrastructure to handle chatbot queries from billions of active devices per day.
In the future, Apple will be able to transition Siri to a different underlying model, so when the company does have in-house LLMs powerful enough to compete with ChatGPT or Gemini, it can move away from Google. Apple will also potentially be able to offer chatbot capabilities in China by partnering with a Chinese AI company.
China restricts foreign companies from offering AI features in the country.
Platforms
Siri's chatbot functionality will be the key new feature in iOS 27, iPadOS 27, and macOS 27, and Siri's capabilities will be integrated into the iPhone, iPad, and Mac. Siri chatbot features could also come to other platforms like visionOS and tvOS.
Cost
There is no word yet on whether there will be some kind of fee associated with the Siri chatbot. The chatbot won't be able to run entirely on device, and Apple is going to need major cloud processing power. Development and hosting costs aside, Apple is paying Google approximately $1 billion per year just for access to Google's models.
Companies like Google and OpenAI spend billions on infrastructure and compute costs each year, and no AI service is entirely free. Apple will likely need to charge something, but it could do what Google has done with Gemini.
Google offers a free version of Gemini on Pixel smartphones and other Android devices that have integrated AI. The basic version of Gemini is able to answer questions, summarize text, write emails, and control apps and smartphone features.
Android users can pay $20 per month for Gemini Advanced to get access to the more advanced version of Gemini that offers better reasoning, longer context for analyzing bigger documents, and improved coding.
Launch Date
Apple is planning to introduce Siri's chatbot capabilities when it announces iOS 27, iPadOS 27, and macOS 27 at the June Worldwide Developers Conference. If the chatbot features aren't ready to go, Apple will likely hold off on showing the new functionality, given the major mistake it made with iOS 18 and Apple Intelligence.
After the WWDC announcement and several months of beta testing, the Siri chatbot is expected to launch with the new updates in September.
This article, "Apple's Siri Chatbot in iOS 27: Everything We Know," first appeared on MacRumors.com.