Real differences between traditional Siri and Apple's new Siri AI

  • The new Siri AI relies on Apple Intelligence and models like Gemini to deliver context, memory, and natural-language understanding far beyond what traditional Siri offered.
  • Apple combines local processing on the device with its own auditable servers, prioritizing privacy and allowing the use of external models such as ChatGPT.
  • Advanced Siri and Apple Intelligence features are arriving in phases between iOS 18 and iOS 27 and only on recent devices with Apple Silicon chips.
  • The first version of the new Siri will be free, although Apple could reserve advanced chatbot options or extra features for future subscriptions.

Differences between traditional Siri and Siri AI

Many users with recent iPhones are wondering why, despite having Apple Intelligence enabled and being on the latest betas, Siri still responds with simple Google links and very basic answers. If you just got an iPhone 16 Pro Max, are running iOS 18.x or later, and were expecting some kind of "integrated ChatGPT" on your phone, it's normal to feel somewhat disappointed and unsure of what has actually changed.

The reality is that today two major phases of Siri coexist: the "classic" Siri, limited and focused on simple tasks, and the new Siri based on Apple Intelligence and advanced models like Google's Gemini, which is still being rolled out in stages. Understanding the differences between the two, where the rollout stands, what you can expect in each version of iOS, iPadOS, and macOS, and how to take advantage of Siri on your iPad is key to avoiding frustration and knowing when you'll truly notice the leap in quality. Let's dive in.

Fundamental differences: Traditional Siri vs. Siri AI

Traditional Siri was born as a very limited voice assistant: set an alarm, send a quick message, make a call, open an app, or launch an internet search. Its language comprehension was very literal, with little contextual memory and no real ability to reason about what you were doing on the device. As soon as you deviated from the script, it returned Google results and little else.

The new Siri AI is part of Apple Intelligence, Apple's bet on an artificial intelligence that acts more like a personal co-pilot than a simple voice search engine. In this new stage, Siri relies on advanced language models, including Google's Gemini and, in specific cases, OpenAI's ChatGPT, to better understand what you say, hold contextual conversations, work within your apps, and generate content (texts, images, notes, summaries, and so on).

While traditional Siri almost always functioned as an "intermediary" for simple commands and searches, the AI version is designed to interpret what's happening on the screen, cross-reference it with your personal data (emails, photos, messages, etc.), and deliver a ready-made answer. It will not only locate information but also summarize it, recall it later, or use it for other tasks.

Another big difference is where your data is processed. With classic Siri, the underlying model was basic, and many functions passed through Apple's servers, but without a deep focus on personal context. With Apple Intelligence, most processing happens directly on the device, reserving the cloud for complex tasks and using Apple-designed servers with the company's own chips, subject to external privacy audits.

That said, even though for the end user everything is called "Siri," internally Apple is updating it in two major waves: first, a much smarter, more contextual Siri capable of acting throughout the system (arriving with iOS 26.4), and later, a true conversational chatbot in the style of ChatGPT, which will debut with iOS 27.

Apple Intelligence: the engine that separates before and after


Apple Intelligence is the umbrella under which the new Siri lives. It's not just a "smart mode" for the assistant, but a collection of AI models integrated throughout the operating system: from the keyboard to the Notes app, including Mail, Photos, notifications, and practical automations on the iPhone. Siri is the public face, but the intelligence is spread throughout.

Apple Intelligence's philosophy departs from traditional cloud-based AI, where you send your data to a remote server (as is the case with many chatbots) and trust that the company will use it responsibly. Instead, Apple is betting on "personal intelligence": the model analyzes what happens on your device and has access to the content of your apps, but most of the processing is done locally, without that data ever leaving your iPhone, iPad, or Mac.

In cases where the cloud is needed (for example, to handle highly complex requests or to integrate external models like ChatGPT), Apple uses its own servers with chips designed by the company and subject to independent audits. The promise is that even in the cloud, data is processed with a level of privacy exceeding the industry standard.

This approach has a direct consequence on the experience: Apple Intelligence can "understand" your digital life. If you ask Siri when your mother's flight arrives, the AI will search your emails for the flight number, check the information online, and respond with the exact time without you having to open apps and copy data from one place to another.

Furthermore, Apple Intelligence is not limited to a single application. Unlike many AI systems that reside on a website or in a specific app, it is integrated into multiple apps and functions: text writing, photo management, priority notifications, emoji creation (Genmoji), call transcription, and much more. Siri simply leverages all these capabilities when you ask it for something.

Apple knows that a model running on the device itself will be less powerful than the gigantic cloud models of other companies, so it has opted for a hybrid solution: when Siri AI can't handle something on its own, it will suggest using ChatGPT or other third-party models, always warning you that this data will leave Apple's private environment.

Integration with Google Gemini and OpenAI ChatGPT


One of the major new features of this phase is the integration with Gemini, Google's family of artificial intelligence models. Following the agreement between the two companies, Apple will use these models to enhance the generative and reasoning capabilities of the new Siri, especially in tasks requiring deep understanding and the generation of text or creative content.

With Gemini as the driving force behind the scenes, Siri is no longer limited to providing superficial results and links. It can now answer factual questions in natural language, offer more comprehensive explanations, and, importantly, cite sources. This is a huge leap compared to its previous behavior, where it merely offered a list of suggested websites.

Another capability that comes with this integration is the generation of personalized stories. You can ask Siri to create themed stories, include specific characters (your child, your partner, a friend), or adapt the story to a particular age. It's a feature clearly inspired by generative AI, designed for creative home entertainment.

Siri AI will also incorporate emotional support tools. When users express loneliness, frustration, or discouragement, the assistant can respond in a more empathetic, less robotic tone, engaging in conversations that feel more human. It's not about replacing a professional, but about offering basic companionship and thoughtful responses when the user is having a bad day.

Alongside Gemini, Apple is keeping the door open for ChatGPT. When Siri can't answer a question, or when you explicitly request it, the query can be sent to OpenAI's model. This access uses the most advanced model available to the general user, currently GPT-4o, with the same usage limits as on the web or in the official app.

This means that, once a certain number of GPT-4o requests has been reached, Siri will switch to using GPT-3.5 for the remaining queries, restoring advanced access after 24 hours, just as in the original service. The idea is to provide convenient access to ChatGPT directly from Siri, without reinventing the service.
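
For readers curious about how this kind of fallback behaves, here is a conceptual sketch of a quota-based model switch. It is not Apple's or OpenAI's actual implementation; the daily limit, model identifiers, and 24-hour reset window are illustrative assumptions based on the behavior described above.

```swift
import Foundation

// Conceptual sketch of a quota-based fallback between an advanced and a
// lighter model. The limit, model names, and 24-hour window are assumptions
// used only to illustrate the behavior described in the article.
struct ModelQuota {
    let advancedModel = "gpt-4o"   // preferred model while quota remains
    let fallbackModel = "gpt-3.5"  // used once the daily quota is spent
    let dailyLimit = 25            // hypothetical number of advanced requests
    private(set) var used = 0
    private(set) var windowStart = Date()

    // Returns which model the next request should use, resetting the
    // counter once 24 hours have passed since the window opened.
    mutating func modelForNextRequest(now: Date = Date()) -> String {
        if now.timeIntervalSince(windowStart) >= 24 * 60 * 60 {
            used = 0
            windowStart = now
        }
        guard used < dailyLimit else { return fallbackModel }
        used += 1
        return advancedModel
    }
}
```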

As Google's models continue to improve and Apple refines Apple Intelligence, the need to rely on ChatGPT will likely decrease. In the future, ChatGPT may be reserved for very specific cases or for users who consciously prefer that particular model.

What can the new Siri do today (and what will it do in future versions)

To avoid confusion, it's best to separate the two major "waves" of Siri AI that we know about from leaks and announcements: the deeply integrated Siri that works across the whole system, and the conversational chatbot that will arrive later.

The first major update comes with iOS 26.4 (and its equivalents on iPadOS and macOS). This version brings a Siri capable of remembering the context within a conversation: if you say "turn off the living room light" and then add "and the kitchen light," it will no longer freeze up, unsure of what you mean. The flow of the conversation is no longer a problem.

Furthermore, this revamped Siri understands natural language much better. You won't have to speak like a robot or search for the exact phrase that "activates" the command. You can correct yourself mid-sentence, rephrase what you want, or give chained instructions, and the assistant will be able to keep up with you much more flexibly than the classic Siri.

Another huge change is that Siri will start paying attention to everything you do on your device (privately): what apps you use, what messages you receive and send, what emails you get, what photos you take, and so on. It's not something you see on screen, but it means the assistant can answer questions based on that context, for example, remembering the time a friend told you by email they'd arrive for lunch, even if you never added it to your calendar.

This same approach applies to searches within the system. Thanks to the combination of Siri and the redesigned Photos app, you can ask for very specific things, such as "show me the photos where I'm wearing the red jacket in New York," or locate specific documents in Files, or messages and email attachments, without having to remember which app each thing lives in.

The second major phase will arrive with iOS 27, where Siri will become a full-fledged chatbot in the style of ChatGPT. Here, we're no longer just talking about understanding the immediate context, but about maintaining a conversation history: not an infinite record, but enough memory to remember your preferences and relevant details from previous interactions.

At that stage, Siri will be able to hold longer, more coherent conversations, remembering, for example, that you prefer to fly with certain airlines or at certain times, or that you like a specific type of food, and using that information for future recommendations and planning.

This Siri chatbot will also be powered by Google's models, sharing many capabilities with Gemini, though Apple reserves the right to add its own exclusive features on top of that foundation. For the user, the experience will be quite similar to using an advanced chatbot, but seamlessly integrated into the Apple ecosystem.

Practical Apple Intelligence features that change how you use Siri

Beyond the voice assistant, Apple Intelligence introduces a host of features that are noticeable in everyday life and that Siri draws on when needed. These are some of the most important ones that have already been announced or are being rolled out.

  • Siri is "reborn" from scratch: the assistant has been redesigned to understand natural language, accept on-the-fly corrections, and follow the conversation without getting lost at the slightest lapse. No more repeating the entire sentence every time you change your mind.
  • Siri understands what's on the screen: if you're in Messages and someone sends you an address, you can say "save this address" and it will know which one you mean. Or you can ask it to send "the photos of my mom's house from Saturday" to that person, and it will find and share the right pictures (a minimal sketch of how apps expose such actions to Siri follows this list).
  • Siri typing mode: in addition to speaking to it, you can type to it directly, which is very useful when you don't want to use your voice or you're in a noisy environment. This typing mode will be available at all times.
  • Writing and proofreading: the AI can generate emails, documents, and notes from scratch, or review what you've already written, suggesting changes in tone, grammatical corrections, and restructured sentences, and helping you condense long texts into clear summaries.
  • Creating images with Image Playground: an integrated tool that lets you create images with the help of AI, without needing complicated prompts. You select themes, styles, locations, accessories, and so on, and the system generates the image locally on your device.
  • Genmoji, AI-generated emojis: if the emoji catalog isn't enough for you, you can create custom Genmoji by describing what you want, and Apple Intelligence will shape them to your liking.
  • Smart editing in Photos: the Photos app gains the ability to remove people or objects that get in the way of an image, along with an improved Memories feature that builds more coherent narratives based on what's really happening in your photos.
  • Call transcription and summaries: in iOS 18 and later you can record calls (the other person is notified), transcribe them automatically, and generate summaries that can even be sent straight to Notes.
  • Notifications and priority messages: the AI summarizes the content of long emails and notifications, and detects messages that seem urgent so it can bring them to the forefront, preventing important information from getting lost in the noise.
  • Use of third-party models: when Siri or Apple Intelligence can't help you with something, you can use external models like ChatGPT from within the system itself, even taking advantage of your paid account if you have one.
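
As mentioned in the list above, developers expose in-app actions to Siri through Apple's App Intents framework. Below is a minimal sketch of what such an intent can look like: AppIntent, @Parameter, and perform() are real parts of the framework, while the intent name, dialog text, and ContactsStore helper are hypothetical stand-ins for an app's own code.

```swift
import AppIntents

// Hypothetical stand-in for an app's real persistence layer.
actor ContactsStore {
    static let shared = ContactsStore()
    private var addresses: [String] = []
    func save(address: String) { addresses.append(address) }
}

// Minimal sketch of an action an app could expose to Siri via App Intents.
struct SaveAddressIntent: AppIntent {
    static var title: LocalizedStringResource = "Save Address"

    // The address Siri extracts from the conversation or the screen.
    @Parameter(title: "Address")
    var address: String

    // Invoked when Siri (or a Shortcut) runs the intent.
    func perform() async throws -> some IntentResult & ProvidesDialog {
        await ContactsStore.shared.save(address: address)
        return .result(dialog: "Saved \(address).")
    }
}
```

In the Messages scenario from the list, Siri would resolve the on-screen address into that parameter and run the action inside the app, without you opening it yourself.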

Compatibility: Which devices will have Siri AI and Apple Intelligence

The bad news for many users is that the new Siri won't be available on all devices. Apple has drawn a clear line: sufficient power is needed to run the models on the device itself, and that severely limits the list.

On iPhone, you need one of these models (with iOS 26 or higher) to access the new Siri with Apple Intelligence:

  • iPhone 15 Pro
  • iPhone 15 Pro Max
  • iPhone 16
  • iPhone 16 Plus
  • iPhone 16 Pro
  • iPhone 16 Pro Max
  • iPhone 16e
  • iPhone 17
  • iPhone Air
  • iPhone 17 Pro
  • iPhone 17 Pro Max

In the iPad world, the requirement is iPadOS 26 or later on one of these compatible models:

  • iPad mini (7th generation, 2024)
  • iPad Air with M1, M2, or M3 chip
  • iPad Pro with M1, M2, M4 or M5

On the Mac, computers with Intel processors are left out of the party. Only Macs with Apple Silicon (M1 and later) running macOS 26 or later will be able to enjoy the new Siri and the full features of Apple Intelligence:

  • iMac with M1, M3, or M4
  • MacBook Air with M1, M2, M3 or M4
  • MacBook Pro with M1, M1 Pro, M1 Max, M2, M2 Pro, M2 Max, M3, M3 Pro, M3 Max, M4, M4 Pro, M4 Max, or M5
  • Mac mini with M1, M2, M2 Pro, M4 or M4 Pro
  • Mac Studio with M1 Max, M1 Ultra, M2 Max, M2 Ultra, M4 Max, or M3 Ultra
  • Mac Pro with M2 Ultra

In all cases, Apple Intelligence and the new Siri are still in beta in many countries and languages. Initially launched only in US English, they have gradually expanded to other markets and languages, including Spanish.

Release schedule: why your Siri still seems “dumb”

Part of the current confusion stems from the phased rollout of Apple Intelligence across different iOS versions. Many users have received some Siri improvements, but not the full experience promised at WWDC, which is why it feels like "nothing's happening" when they ask complex questions.

This is the approximate arrival timeline for the new Siri and Apple Intelligence, based on what has already been released and what Apple has confirmed or hinted at:

  • iOS 18.1 (October 2024): arrival of the new Siri interface and the first conversational context recall system, initially only in US English.
  • iOS 18.2 (December 2024): expansion of these functions to other English-speaking countries and first integration with ChatGPT.
  • iOS 18.4 (April 2025): Apple Intelligence becomes available in Spanish and other languages, although still without the new deep Siri integrated throughout the system.
  • iOS 26.4 (March 2026, according to leaks): launch of a Siri capable of "investigating" the entire system, analyzing emails, messages, photos, and more to respond like a true personal assistant.
  • iOS 27 (late 2026): arrival of the Siri chatbot, with conversation history and a fully conversational experience.

If you have a compatible iPhone today and are on an intermediate beta, it's quite likely you're only seeing a partial view of Siri's new brain. That's why it feels like it's running at half speed: it's not that Apple has lied, it's just that its roadmap is spread over several years and the big leap in 2026 is still to come.

How to activate the new Siri and how much it costs


Activating the new Siri shouldn't require any complicated steps. In principle, when updating to compatible versions of iOS, iPadOS, or macOS, Apple Intelligence features will be enabled according to the rollout phase and the country you are in.

Even so, it's highly recommended to check the settings on your device. The general path is Settings > Apple Intelligence & Siri, where you can see which features are available, which external models can be integrated (such as ChatGPT), and whether you are in the beta programs or waiting lists that Apple has opened to test some capabilities early.

As for the price, the first version of the new Siri is free. You won't have to pay an additional subscription to use Apple Intelligence or the revamped assistant: they are considered part of the operating system itself, just as classic Siri has been part of iOS for years.

For the advanced chatbot that will arrive with iOS 27, Apple hasn't ruled out the possibility of subscriptions. For now, it's all speculation, but it's possible that certain "premium" features or expanded capabilities will come at a cost, especially if they involve more cloud computing or integration with paid third-party models.

Meanwhile, accessing ChatGPT from Siri follows OpenAI's standard conditions: you can use GPT-4o up to a daily limit, and once you exceed it, you'll be relegated to GPT-3.5 until the counter resets. If Apple later allows linking ChatGPT Plus accounts or other plans, these limits may be raised for those who already pay OpenAI directly.

Given this whole situation, many users feel that "Siri is still the same," and there's a good explanation for that: the rollout is happening in waves, the hardware requirements are demanding, and not all features arrive at the same time or in the same country. As iOS 26.4 and iOS 27 reach compatible devices, the difference between traditional Siri and Siri AI will be so significant that it will feel like a different assistant behind the same button.
