In recent months, a wave of information has emerged about Apple's plans for its future devices, especially AirPods and the Apple Watch. What at first might seem like a simple evolution of these wearables actually points to a deep transformation in the way we interact with everyday technology. Based on leaks and statements from analysts such as Mark Gurman and Ming-Chi Kuo, it appears that the arrival of integrated cameras in these products will be much more than the addition of a traditional photography or video-calling feature. And we could see all of this over the course of 2027.
Apple is betting on cameras with its own chips: "Nevis" and "Glennie"
The technological leap based on artificial intelligence and contextual recognition, in which the cameras of the AirPods and the Apple Watch will play an essential role, is closer than ever. These new features, which could see the light of day in 2027 if the development of the new chips progresses as expected, will focus on processing environmental data and improving the everyday user experience, investing in innovations that go beyond what we usually associate with these small devices.
Apple is finalizing, under the internal names "Nevis" and "Glennie", new chips designed specifically to power cameras in both the Apple Watch and the AirPods. According to various sources and specialized media, these components will allow both devices not only to capture images, but to process and analyze visual information in real time. The goal isn't to let you take traditional photos or make video calls, but rather to have the camera act as an advanced sensor that helps the device interpret and understand the user's surroundings.
The "Nevis" chip would be destined for the Apple Watch, while the "Glennie" chip would be exclusive to AirPods. Everything indicates that the development of these chips is progressing according to schedule, with a possible official presentation as early as 2027. The idea of including cameras in these wearables is not limited to Apple alone, although the Cupertino firm seems to want to go a step further by equipping its products with intelligent contextual detectors that feed the company's artificial intelligence.
Why a camera in AirPods and Apple Watch?
The inclusion of cameras in these devices does not serve the typical purpose of capturing images, as is the case with mobile phones and tablets. In the case of the AirPods, the infrared cameras are expected to serve, among other things, to improve spatial audio, especially in combination with other products such as the Apple Vision Pro. Additionally, Apple is reportedly considering using these sensors to collect useful data for health features, such as measuring body temperature, or for gesture control by interpreting hand movements.
On the Apple Watch, the camera would be located near the screen or the Digital Crown, especially in models like the "Ultra." Its main function would be to power visual recognition systems such as Apple's Visual Intelligence technology, with which the user can obtain relevant information about objects, places, or even items of clothing simply by pointing the device at them. This feature has already been seen in the iPhone 15 Pro and the iPhone 16 line, and its arrival on the watch would mark a decisive step in the integration of AI into wearables.
A connected future: from environmental recognition to personalized assistance
The cameras of these devices are conceived as "eyes" that collect valuable information to feed Apple's artificial intelligence. The intention is clear: both the AirPods and the Apple Watch should be able to scan the environment to offer the user personalized responses and services based on their surroundings, whether through object recognition, location context, or even anticipating health needs and advice.
All the leaks consulted insist that the ability to take pictures or make traditional video calls will not be available, thus ruling out FaceTime and other uses more typical of a smartphone. What's interesting is the utilitarian approach: Apple wants to enhance gesture controls and health-related functions, and to let Apple Intelligence process the data collected by these cameras to offer a more intuitive and precise experience.
A long-term strategy to transform the entire Apple ecosystem
The development of these products does not occur in a vacuum: Apple aims for the "Nevis" and "Glennie" chips to be the cornerstone of a smarter, more cohesive ecosystem, where every device — whether iPhone, Mac, AirPods, or Apple Watch — actively participates in the collection and analysis of contextual and environmental data. The ultimate goal is for artificial intelligence to stop being something abstract and become a concrete aid in the user's daily life.
Current forecasts place the possible launch of these devices in 2027, although the pace of chip development will be decisive. Apple is reportedly trying to accelerate its timeline, but doubts remain about whether it will achieve technical maturity in time.
What seemed like a simple rumor about cameras in small accessories is becoming the prelude to a revolution in the way we interact with our environment, opening the door to a generation of devices that are far smarter and more connected to users' daily lives.