After years of complaints about misunderstood requests and unhelpful responses, it appears that Apple is about to launch Siri's biggest update in a long time. It wouldn't be a simple facelift, but a profound change in the technology behind the assistant, bringing it closer to the artificial intelligence chatbots that have become popular in recent years.
New Siri: A next-generation AI-powered assistant
Reports from specialized media outlets such as Bloomberg, ZDNET, and 9to5Mac agree that Apple is rebuilding Siri on an architecture of large language models (LLMs), similar to those used by ChatGPT or Google Gemini. This change in foundation would allow for more natural conversations, with less of a feeling of talking to a rigid machine.
According to these sources, the new Siri has been undergoing internal testing for months on iPhone, iPad, and Mac, initially as a standalone application and later integrated into test builds of the system. The goal is to gradually replace the old infrastructure with this new AI engine, capable of better understanding natural language and handling more complex requests.
The idea is for the assistant to leave behind such common situations as misinterpreted voice commands or absurd responses, which have even been the subject of jokes in television series. Apple wants Siri to behave more like modern chatbots: continuous dialogues, the ability to remember recent context, and fewer comprehension errors.
In this new scheme, Siri would not only answer questions but would also act as a true orchestrator of actions across the system and applications, something that until now has been very limited or has worked inconsistently.

Possible release date: all eyes on March
The big question for many users is when they'll be able to try this new Siri. Everything points to the update arriving with iOS 26.4 and its equivalents on iPadOS and macOS, in a window consistent with Apple's release pattern in previous years.
In past cycles, the .4 versions of the system have tended to ship in March, often toward the end of the month. Recent release history illustrates the pattern:
- iOS 18.4 was released on Monday, March 31.
- iOS 17.4 arrived on Tuesday, March 5, slightly ahead of the usual window.
- iOS 16.4 was published on Monday, March 27.
In this context, internal leaks indicate that iOS 26.4 would again be released in the late-March window. iOS 26.3 is already in beta testing, with a release planned for the end of January, which would leave room for the first beta of iOS 26.4 to appear at the end of January or during February.
If that beta were released soon, a deployment somewhat earlier in March wouldn't be unreasonable. However, reports emphasize that the real-world performance of the new Siri in testing will be key: if accuracy, stability, or app compatibility issues are detected, Apple could take a few extra weeks to refine the experience before rolling it out to the general public.
In Europe there is an added factor: compliance with digital and data protection regulations. In the past, the company has already adjusted the release schedule of certain versions, such as iOS 17.4, to comply with deadlines set by the Digital Markets Act. It is therefore possible that the EU launch window will be subject to specific regulatory requirements.
What practical changes will users notice?
Beyond the date, what matters for Spanish and European users is what the new Siri will be able to do that it can't do now. This is where several features that Apple originally presented as part of its AI strategy, and which ended up being delayed, come into play.
The first big change is much deeper integration with applications through so-called App Intents. This technology will allow the assistant to perform very specific actions within apps, without the user having to navigate through menus or touch the screen.
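To illustrate the kind of action an app can expose, here is a minimal sketch using Apple's existing App Intents framework (available since iOS 16); the intent name, parameter, and behavior are hypothetical examples, not anything Apple has announced for the new Siri.

```swift
import AppIntents

// Hypothetical example: an app exposes "mark an order as a favorite"
// so the assistant can trigger it without the user opening the app.
struct MarkOrderAsFavoriteIntent: AppIntent {
    static var title: LocalizedStringResource = "Mark Order as Favorite"

    // A parameter the assistant can fill in from the spoken request.
    @Parameter(title: "Order Number")
    var orderNumber: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // The app's own logic would run here (illustrative placeholder).
        return .result(dialog: "Order \(orderNumber) was marked as a favorite.")
    }
}
```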
A significant improvement is also expected in understanding personal context. Siri would have more structured access to data such as emails, messages, calendar entries, and notes, always under Apple's privacy policies, to better handle requests that have previously been confusing, such as searching for a specific message or locating important information on the device. Part of this improvement would come from tighter integration between Apple Intelligence and the assistant.
Another key pillar will be screen awareness. The assistant will be able to take into account what is visible at that moment and act directly on that content: add an address that appears in an email, save a phone number, summarize an article being read in the browser, or perform actions related to an app the user has open.
Finally, Siri's ability to answer general knowledge questions directly should improve, acting more as a conversational search engine than a mere intermediary that displays links. The idea is that it can provide clear and concise explanations on current events, history, sports, or culture, without requiring users to open a webpage for each search.
An LLM architecture to reduce errors and misunderstandings
Much of the historical frustration with Siri has to do with recurring misunderstandings, repeated commands that fail, and absurd responses. Apple intends to tackle this problem at its root by using large language models, the same family of technologies behind today's leading chatbots.
These models are trained on large volumes of data to interpret human language with more nuance: references, subtle irony, and chained requests. Instead of relying on rigid commands, the assistant will be able to handle instructions phrased more naturally, similar to how we speak in everyday conversation.
Leaks indicate that the new Siri will be better able to maintain the thread of what has been said before. This should reduce the typical "I didn't understand you" when the person refers to something mentioned in a previous sentence or to an element that is on screen.
Furthermore, many situations in which the assistant easily gives up or offers an incorrect answer without acknowledging its own limitations are expected to disappear. The goal is a Siri that is less likely to confuse names, places, or navigation instructions, and better at indicating when a request needs to be rephrased.
However, observers point out that Apple has made similar promises in the past, for example with its Apple Intelligence initiative, which didn't quite live up to expectations. This relaunch of the assistant puts a significant portion of the company's AI credibility at stake.
Where the information will be processed: device, cloud, and privacy
Another key aspect of this update has to do with where the calculations that bring the new Siri to life are performed. The company would opt for a hybrid model in which some requests are resolved on the device itself and others are routed to cloud servers.
Local processing, directly on the iPhone, iPad, or Mac, has the advantage of being faster and more private. For relatively simple tasks or those that only require information already on the device, Siri could respond without sending personal data outside, relying on models optimized to run on Apple chips.
For more complex requests, such as long summaries, advanced analysis, or general knowledge searches, the system would rely on cloud computing. This is where speculation arises about the potential use of third-party technologies, with Google and OpenAI as leading candidates to provide part of the high-level language engine.
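As a purely illustrative sketch of what such hybrid routing could look like in principle (an assumption for clarity, not Apple's actual design), a lightweight classifier might decide between on-device and cloud execution along these lines:

```swift
// Hypothetical sketch of a hybrid routing decision; the types and
// thresholds here are illustrative assumptions, not Apple's implementation.
enum ExecutionTarget {
    case onDevice   // small local model, no data leaves the device
    case cloud      // larger server-side model for complex requests
}

struct AssistantRequest {
    let text: String
    let needsPersonalDataOnly: Bool   // e.g. calendar, local messages
    let estimatedComplexity: Int      // 0-10, from a lightweight classifier
}

func route(_ request: AssistantRequest) -> ExecutionTarget {
    // Prefer local execution when the request is simple or only
    // touches data that already lives on the device.
    if request.needsPersonalDataOnly || request.estimatedComplexity <= 3 {
        return .onDevice
    }
    return .cloud
}
```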
According to these reports, Apple would retain control over the final execution of actions and access to user data, relying on its own models and systems, even if it uses an external LLM to better interpret natural language.
This approach is particularly sensitive in Europe, where the General Data Protection Regulation and new digital services rules impose strong requirements on transparency, data minimization, and user control. The company will have to clearly explain what is processed on the device, what travels to the cloud, how long it is stored, and with what guarantees.

Integration with applications and new ways of using the assistant
One of the most visible transformations in daily life will come with advanced Siri integration into applications. Until now, the assistant has had a very limited role in many apps, often forcing users to resort to touch and menus for tasks that, in theory, could be solved with voice.
With the new version, developers will be able to use App Intents to define very specific actions within their apps that Siri will be able to run directly. This could include anything from editing a specific photo in the Photos app to checking a flight status in an airline app or repeating an order in an online store.
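To give an idea of how such an action might be surfaced to the assistant today, here is a minimal sketch using the existing App Intents framework; the intent, the spoken phrases, and the shopping scenario are hypothetical.

```swift
import AppIntents

// Hypothetical intent: repeat the user's last order in a shopping app.
struct RepeatLastOrderIntent: AppIntent {
    static var title: LocalizedStringResource = "Repeat Last Order"

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // The app's own ordering logic would run here (placeholder).
        return .result(dialog: "Your last order has been placed again.")
    }
}

// Registers spoken phrases so the assistant can trigger the intent directly.
struct ShopShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: RepeatLastOrderIntent(),
            phrases: ["Repeat my last order in \(.applicationName)"],
            shortTitle: "Repeat Order",
            systemImageName: "cart"
        )
    }
}
```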
For all of that to work, app developers will need to adapt their applications to the new assistant APIs. Major services, such as messaging, shopping, banking, mobility, or travel, are expected to move quickly to take advantage of these capabilities, especially in markets like Europe, where the iPhone user base is large.
The combination of App Intents with screen awareness also promises more natural use cases. For example, if a user is viewing an address in an email, they could ask Siri to save it to their contacts; if they have an order information page open, they could request tracking or a return without manually navigating through the app.
In parallel, improved context will enable the assistant to handle queries that have until now been unreliable, such as "find the message where they sent me the account number" or "tell me what time tomorrow's flight leaves based on the confirmation email", always respecting the user's privacy settings and permissions.
Impact on other devices in the Apple ecosystem
The new Siri won't be limited to the iPhone. The company's intention is for its entire ecosystem to benefit from the leap in intelligence, especially products where voice is the primary form of interaction.
Among the devices that could most notice the change are smart speakers and a possible home device with an integrated screen. A product combining elements of the HomePod and the iPad has long been rumored, designed to act as a home control center, a screen for video calls, and a quick access point to information via voice commands.
That type of product has reportedly been put on hold in part because Siri wasn't ready to offer a truly up-to-date experience. With a more powerful assistant, capable of better understanding household context, user profiles, and what is displayed on screen, the idea of such a device for the living room or kitchen is gaining traction.
The Apple Watch is another clear candidate to benefit from the change. Its form factor makes it ideal for brief and frequent interactions with the assistant. Checking messages, managing reminders, logging workouts, or launching complex automations without touching the iPhone could become much smoother if Siri responds better.
AirPods and the brand's other headphones could also benefit from a more reliable assistant for voice requests, from playback controls to navigation instructions, all without needing to take the phone out of a pocket. In all these cases, success will depend on the new architecture delivering on its promises of accuracy and speed.
With everything that has been revealed so far, the update Apple is preparing for March with iOS 26.4 marks a turning point for the future of Siri. If the combination of advanced language models, deeper app integration, and better context management works as expected, the assistant could leave behind much of its accumulated bad reputation and become a truly useful tool for millions of users, both in Spain and Europe and in the rest of the world.