How will Siri improve with iOS 27? The arrival of iOS 27 is set to be one of the biggest turning points in the history of Apple's assistant. Siri will evolve from a simple voice command system into a much more conversational artificial intelligence chatbot. With advanced features inherited from generative models like Gemini and backed by Apple Intelligence, this isn't just a minor visual tweak; it's a profound shift in how we interact with the iPhone, iPad, and Mac.
For years we've had the feeling that Siri had stagnated. While ChatGPT, Gemini, and even other assistants were advancing at full speed, Apple's assistant remained limited, with poor responses and little context memory. With iOS 26.4 and, especially, iOS 27, Apple aims to turn this situation around: a new chat-like interface, improved natural language understanding, full system integration, and an AI engine powered by Google technology. Now, Siri truly aspires to compete in the same league as the leading chatbots on the market.
From old assistant to AI chatbot: Siri's big turn
The transformation that Apple is preparing for Siri with iOS 27 goes far beyond a simple aesthetic change. The classic circular animation or the color wave at the bottom of the screen will give way to an experience more like a chat app, where the conversation is maintained, context matters, and responses flow naturally.
According to leaks from Bloomberg and other specialized media outlets, Apple will abandon the traditional floating card interface in favor of a chatbot model similar to what we already find in services like ChatGPT, Gemini, Grok, or Perplexity. The company is aware that Siri's current presentation conveys the feeling of an outdated assistant, unable to compete with new conversational interfaces.
Apple had spent years fine-tuning small details of Siri - some voice adjustments, specific functions, a bit more integration with apps - but without daring to touch the core of the product. The rise of generative artificial intelligence has forced a rethink of everything, and iOS 27 will be the version in which Apple acknowledges that the classic assistant model has become definitively obsolete.

Apple Intelligence, Gemini, and Project "Campos": what's behind the new Siri
Siri's recent history cannot be understood without Apple Intelligence. Apple announced its own AI platform with great fanfare, but in practice it has encountered technical limitations and delays. Meanwhile, competitors like OpenAI and Google have refined very mature generative models, which has pushed Apple to seek external alliances.
This is where the multi-year agreement with Google comes into play. The Cupertino company will use the Gemini model family and the Google cloud as the basis for the new Siri chatbot, adapting them to its ecosystem and privacy requirements. Instead of reinventing the wheel from scratch, Apple will leverage the power and scalability of Gemini to lay the foundation for the next generation of its assistant.
Internally, this new Siri is being developed under the codename "Campos". It won't be a separate app or an isolated experiment: it will completely replace the current version of Siri and will be deeply integrated into iOS 27, iPadOS 27, and macOS 27. The technological foundation will combine Apple's own models with Google's infrastructure and foundational models.
This does not mean that Apple is giving up its identity. The company's obsession with privacy will continue to shape how these models are trained and run, combining local processing on the latest M and A series chips with controlled cloud tasks. Much of the engineering work is focused on adapting Gemini to the Apple ecosystem without compromising its security and data protection principles.
The move also has a clear strategic interpretation. Apple becomes the first major company to integrate Gemini technology into a mass-market consumer product, and it does so just as the race for conversational AI has become the great battleground of the technology industry.
Key dates: From iOS 18 to iOS 27, the long road to the new Siri
To understand how we got to the Siri chatbot in iOS 27, it's worth reviewing the recent roadmap. Apple has been rolling out Apple Intelligence features and Siri improvements in several phases, with changes spread across different versions of iOS:
- iOS 18.1 (October 2024): A revamped Siri interface debuted, along with the ability to remember context in a conversation, though only in US English.
- iOS 18.2 (December 2024): That experience expanded to more English-speaking countries and initial integration with ChatGPT was added.
- iOS 18.4 (April 2025): Apple Intelligence arrived in Spanish and other languages, but without the deep version of the new Siri.
- iOS 26.4 (March 2026, according to leaks): Bridge update with "Deep Siri", much more integrated into the system and able to understand what is happening on screen and in apps.
- iOS 27 (late 2026): Launch of the complete chatbot starring Siri, built on Gemini and with a conversational interface.
WWDC 2026 will be the chosen stage to unveil all the details of iOS 27 and the new assistant. Apple will use the developer conference to explain its AI architecture, privacy limits, and tools for third-party apps. The public rollout is planned for September, accompanying the new generation of iPhones and aligned with iPadOS 27 and macOS 27.
In parallel, Apple will continue to maintain and refine the current version of Siri on devices that do not support Apple Intelligence. This will create an interesting coexistence: some users will enjoy the advanced chatbot, while others will continue with a more limited, classic Siri, simply because their hardware does not meet the minimum requirements.

Compatible devices: Which iPhones, iPads, and Macs will have the new Siri?
The new Siri won't be for everyone. Apple has set fairly strict hardware requirements, focusing on devices with chips powerful enough to run AI models locally and connect to the cloud when needed.
In the case of the iPhone, only the latest models with iOS 26 or later will be able to access the full experience of the new assistant. These include the Pro ranges and the newest generations:
- iPhone 15 Pro
- iPhone 15 Pro Max
- iPhone 16 and 16 Plus
- iPhone 16 Pro and 16 Pro Max
- iPhone 16e
- iPhone 17
- iPhone Air
- iPhone 17 Pro
- iPhone 17 Pro Max
In the iPad ecosystem, compatibility requires iPadOS 26 or later and a relatively modern chip able to run Apple Intelligence with ease:
- iPad mini (7th generation, 2024)
- iPad Air with M1, M2, or M3 chip
- iPad Pro with M1, M2, M4 or M5 chip
On Macs the line is even clearer. Models with Intel processors are excluded from Siri's new features, while devices with Apple Silicon and macOS 26 or later are included:
- iMac with M1, M3, or M4
- MacBook Air with M1, M2, M3 or M4
- MacBook Pro with M1, M1 Pro, M1 Max, M2, M2 Pro, M2 Max, M3, M3 Pro, M3 Max, M4, M4 Pro, M4 Max, or M5
- Mac mini with M1, M2, M2 Pro, M4 or M4 Pro
- Mac Studio with M1 Max, M1 Ultra, M2 Max, M2 Ultra, M4 Max, or M3 Ultra
- Mac Pro with M2 Ultra
Apple also plans to bring this experience to the Apple Watch and future devices. One of the most striking projects is a screenless wearable, with a format similar to an AI-powered AirTag but with built-in microphones, focused on voice interaction with the new Siri and planned, according to leaks, for 2027.
What changes with iOS 26.4 and what will change with iOS 27
Siri's improvements are coming in two big waves. The first is linked to iOS 26.4, which introduces a much "smarter" version of the traditional assistant. The second, with iOS 27, makes the complete leap to the chatbot model.
With iOS 26.4, the old Siri receives a deep update on three fronts: context memory, natural language understanding, and device content awareness. The assistant will stop behaving as if each request were independent and will begin to chain related commands together.
A classic example: you ask it to turn off the living room light and then say "and the kitchen light too". Until now, Siri would get lost on that second command, but the new version will understand the relationship between the two and act accordingly. The same will happen with longer conversations, where you can refer to "what you said before" without having to repeat everything.
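To picture how that kind of chaining works, here is a deliberately tiny Swift sketch. Everything in it is invented for illustration (none of these types are Apple APIs): a hypothetical context object remembers the last resolved command and reuses its action when a follow-up only names a new target.

```swift
import Foundation

// Hypothetical sketch only: HomeCommand and ConversationContext are
// invented types, not part of any real Siri or Apple framework.
struct HomeCommand {
    let action: String   // e.g. "turnOff"
    let target: String   // e.g. "the living room light"
}

struct ConversationContext {
    var lastCommand: HomeCommand?

    // Resolve an utterance, reusing the previous action when the user
    // only names a new target ("and the kitchen light too").
    mutating func resolve(_ utterance: String) -> HomeCommand? {
        let text = utterance.lowercased()
        if text.hasPrefix("turn off ") {
            let command = HomeCommand(action: "turnOff",
                                      target: String(text.dropFirst("turn off ".count)))
            lastCommand = command
            return command
        }
        if text.hasPrefix("and "), text.hasSuffix(" too"), let previous = lastCommand {
            let target = String(text.dropFirst("and ".count).dropLast(" too".count))
            let command = HomeCommand(action: previous.action, target: target)
            lastCommand = command
            return command
        }
        return nil
    }
}

var context = ConversationContext()
_ = context.resolve("turn off the living room light")  // turnOff "the living room light"
_ = context.resolve("and the kitchen light too")       // turnOff "the kitchen light"
```

Without the stored `lastCommand`, the second utterance has no verb at all, which is exactly why today's stateless Siri gets lost on it.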
Furthermore, Siri will begin to pay attention, privately, to what's happening on your device. It will read the screen's contents and take into account received and sent messages, emails, notifications, and other elements, without you having to provide all the clues each time. This will allow it to answer queries like "What time is my friend arriving for lunch on Saturday?" based on an email or message you had forgotten about.
This same logic will be applied to the Photos app, document search, and other areas. You can ask it things like "show me the photos where I'm wearing a red jacket in New York" or have it track down specific files and emails with attachments, without having to remember exact titles.
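To see what kind of matching such a query implies, here is a deliberately simplified sketch. The PhotoRecord type and its fields are made up for illustration; the real feature would rest on on-device scene, people, and location analysis, not on hand-written tags.

```swift
// Hypothetical sketch only: PhotoRecord and its fields are invented.
struct PhotoRecord {
    let labels: Set<String>   // e.g. tags produced by on-device image analysis
    let location: String?     // e.g. a reverse-geocoded place name
}

// Return photos matching every requested label in the requested place.
func search(_ library: [PhotoRecord], labels: Set<String>, location: String) -> [PhotoRecord] {
    library.filter { labels.isSubset(of: $0.labels) && $0.location == location }
}

let library = [
    PhotoRecord(labels: ["red jacket", "person"], location: "New York"),
    PhotoRecord(labels: ["beach"], location: "Miami"),
]
let hits = search(library, labels: ["red jacket"], location: "New York")
print(hits.count)  // 1: only the first photo matches
```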
With iOS 27, this set of capabilities is wrapped in a completely new layer: Siri will become a conversational chatbot with chat history, a mixed voice and text interface, and an AI engine powered by Gemini. It will be able to generate long summaries, write emails and documents, help with code, propose creative ideas, or analyze complex files, while continuing to handle the usual reminders, alarms, and home automation.

Chat-like interface, screen awareness, and system-wide usage
One of the most striking aspects of the new phase will be the interface. The idea is for Siri to stop appearing as an ephemeral layer and instead become a stable space for conversation. Much like a chat thread, you can scroll up to review previous messages, retrieve old replies, or continue previous conversations right where you left off.
This new chat view will coexist with the classic activation. You will still be able to call Siri with the wake word or by pressing and holding the side button on your iPhone. But once the interaction begins, the flow will develop as a conversation in which voice and text blend naturally.
Apple's biggest differentiating factor will be the integration with the device. The new Siri will have what the company calls “screen awareness”: it will know what apps you have open, what document you are reading, what photo you are editing, or what message you are viewing, and it will be able to offer help based on that immediate context.
That will translate into tasks like asking it to summarize a document you have open, generate an appropriate reply to an email you are reading, modify text on the screen, or guide you step by step in a professional app like Xcode, where it could act as a programming assistant.
In addition, Apple plans to open some of these capabilities to third parties through an API. Developers will be able to integrate Siri's chatbot into their applications, so that the assistant understands specific actions within each app and can perform complex tasks that go far beyond "open X application" or "send a message".
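Today, the standard way to expose in-app actions to Siri is Apple's existing App Intents framework, and it is a reasonable guess (though unconfirmed by the leaks) that the iOS 27 chatbot will build on the same mechanism. Here is a minimal sketch in that style; the coffee-ordering action itself is made up.

```swift
import AppIntents

// Minimal App Intents example. The framework is real; whether the new
// Siri chatbot will consume these intents is an assumption, and
// OrderCoffeeIntent is a made-up demonstration action.
struct OrderCoffeeIntent: AppIntent {
    static var title: LocalizedStringResource = "Order Coffee"

    @Parameter(title: "Size")
    var size: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would call its own ordering logic here.
        return .result(dialog: "Ordered a \(size) coffee.")
    }
}
```

An intent like this gives the assistant a typed, discoverable action with parameters, which is precisely what a chatbot needs to go beyond "open X application".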
ChatGPT, Gemini and the role of external models
The new Siri's relationship with other AI models will be somewhat hybrid. By default, Apple Intelligence will use its own models combined with Gemini technology to answer most queries. However, Apple will maintain a direct integration with ChatGPT for certain requests.
When the system detects that Siri cannot solve something with its own models, it will offer the option of using ChatGPT, always informing the user that the request is being handed off to an external service under OpenAI's terms. In that case, the GPT-4o model will be used, with the same usage limitations as on the web or in the official app.
Once the allotted number of GPT-4o requests is used up, the system will temporarily switch to GPT-3.5 and restore access to the advanced model after 24 hours. It is, basically, a convenient integration, not an "unlimited" version or anything different from the standard OpenAI service.
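The reported behavior amounts to a simple quota-with-cooldown policy. Here is a minimal Swift sketch of that logic; the quota size and the exact model identifiers are assumptions made purely for illustration.

```swift
import Foundation

// Hypothetical sketch of the reported quota behavior; quota size and
// model names are assumptions, not Apple's or OpenAI's actual values.
struct ModelFallbackPolicy {
    let quota: Int                           // GPT-4o requests allowed per window
    var used = 0
    var quotaExhaustedAt: Date? = nil
    let cooldown: TimeInterval = 24 * 60 * 60

    mutating func modelForNextRequest(now: Date = Date()) -> String {
        // Restore the advanced model once the 24-hour window has elapsed.
        if let exhausted = quotaExhaustedAt, now.timeIntervalSince(exhausted) >= cooldown {
            used = 0
            quotaExhaustedAt = nil
        }
        if used < quota {
            used += 1
            if used == quota { quotaExhaustedAt = now }
            return "gpt-4o"
        }
        return "gpt-3.5"
    }
}

var policy = ModelFallbackPolicy(quota: 2)
print(policy.modelForNextRequest())  // gpt-4o
print(policy.modelForNextRequest())  // gpt-4o (quota now exhausted)
print(policy.modelForNextRequest())  // gpt-3.5 until 24 hours pass
```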
This integration with ChatGPT already began in iOS 18.2, initially only in English. Spanish and other language support has been rolled out alongside the rest of Apple Intelligence's features, in a phased manner, to fine-tune both the user experience and the privacy guarantees.
On the other hand, the alliance with Google for the use of Gemini has become a key pillar. Apple has had to resort to this collaboration after encountering performance and scalability problems in its own generative assistant prototypes, and it thus seeks to ensure that the new Siri is up to par from day one.
Privacy, reliability, and open doubts about Siri's memory
A more conversational assistant, with more context and greater analytical capacity, also raises delicate questions. Apple has been selling privacy as one of its main differentiators for years, and now it has to prove that it can maintain that commitment while adopting an advanced chatbot model.
Much of the processing will be done locally, taking advantage of the latest M and A chips, but certain complex tasks will still be delegated to the cloud, either to Apple's own infrastructure or to Google's servers for Gemini. The company must clearly explain what data leaves the device, how it is anonymized, and how long it is stored.
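As an illustration of what such a split could look like, here is a minimal Swift sketch. "Private Cloud Compute" is Apple's real name for its AI server infrastructure, but the routing rules, categories, and thresholds below are entirely assumed; Apple has not published how the real dispatcher will decide.

```swift
// Hypothetical routing sketch: every rule and threshold here is invented.
enum ExecutionTarget {
    case onDevice               // local models on recent A- and M-series chips
    case privateCloudCompute    // Apple-run servers
    case geminiCloud            // Google-hosted Gemini models
}

struct RequestProfile {
    let estimatedTokens: Int
    let touchesPersonalData: Bool
}

func route(_ request: RequestProfile) -> ExecutionTarget {
    // Personal requests stay local whenever the on-device model can cope.
    if request.touchesPersonalData && request.estimatedTokens < 2_000 {
        return .onDevice
    }
    // Mid-sized jobs go to Apple-controlled servers.
    if request.estimatedTokens < 20_000 {
        return .privateCloudCompute
    }
    // Only heavy generative work falls through to the external model.
    return .geminiCloud
}

print(route(RequestProfile(estimatedTokens: 500, touchesPersonalData: true)))     // onDevice
print(route(RequestProfile(estimatedTokens: 50_000, touchesPersonalData: false))) // geminiCloud
```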
Another important internal debate centers on conversational memory. While competitors like ChatGPT use history to personalize responses, Apple is considering limiting this memory to reduce the risk of abuse or leaks of sensitive information.
We will likely see configuration options that let us decide to what extent Siri can remember past interactions, whether we want it to forget certain conversations, or whether we prefer it not to keep any long-term history at all. That will be one of the red lines separating Apple from other players in the sector.
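A settings model along those lines could be as simple as the following sketch, in which every type and case name is hypothetical; it only illustrates the idea of user-controlled assistant memory.

```swift
// Hypothetical sketch: all names are invented, not a real Apple API.
enum MemoryRetention {
    case none           // never keep history
    case sessionOnly    // forget when the conversation ends
    case days(Int)      // keep history for a limited number of days
}

struct SiriMemorySettings {
    var retention: MemoryRetention = .days(30)
    var excludedTopics: Set<String> = ["health", "finance"]

    // Decide whether a finished conversation may be stored at all.
    func shouldStore(topic: String) -> Bool {
        guard !excludedTopics.contains(topic) else { return false }
        switch retention {
        case .none, .sessionOnly:
            return false
        case .days:
            return true  // stored now, purged later by a retention job
        }
    }
}

let settings = SiriMemorySettings()
print(settings.shouldStore(topic: "travel plans"))  // true
print(settings.shouldStore(topic: "health"))        // false
```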
Finally, there remains the question of reliability. If the new Siri continues to fail at basic tasks or offers inconsistent answers, the new visual package will be of little use. Apple is aware that user perception is shaped by daily use: by whether the assistant is accurate when asked for something simple, and whether it minimizes the "hallucinations" typical of some generative models.
How it will be used, how much it will cost, and what we can expect from the future
As for its use, Apple doesn't want to complicate anyone's life. Activating Siri will continue to work as before: with the voice command "Siri" or by pressing and holding the side button on the iPhone (or the equivalent on iPad and Mac). What will change is what appears on the screen and how the conversation continues afterward.
At first, you won't need to activate anything special to use the new version of the assistant beyond making sure Apple Intelligence and Siri are enabled in Settings > Apple Intelligence & Siri. Some features will remain in beta and may require joining a waitlist, as has already happened with other Apple AI tools.
Regarding the price, the first phase of the new Siri will be a service included in the system. There will be no need to pay a specific subscription to enjoy the revamped assistant, just as Siri has been free until now within iOS, iPadOS, and macOS.
What will happen with the more advanced chatbot in the medium term is another matter. Some leaks suggest that Apple is not ruling out offering certain premium features via subscription, especially if they involve intensive use of cloud resources or external models like Gemini or ChatGPT. For now, these are just conjectures, and we'll have to wait for official announcements to know for sure.
Looking a little further ahead, Apple's strategy seems clear: Siri will cease to be a simple "voice button" and will become the core of the AI experience across all of the brand's devices, from the iPhone to potential new wearables focused on spoken interaction.
With all this in mind, iOS 27 is shaping up to be the moment when Siri finally makes the leap that users have been demanding for years. The combination of a chatbot interface, deep system integration, Gemini support, and Apple Intelligence enhancements can make it a genuinely useful everyday tool. Provided the company fine-tunes privacy, stability, and the quality of its responses, Siri could finally shed its outdated-assistant label and live up to the expectations of the conversational AI era.