The latest report from Bloomberg’s Mark Gurman says the oft-rumored “AirPods with cameras” are nearing completion. The product has reached a new milestone and is now in the late stages of development, with prototypes that have the “near-final design and capabilities.” This is the point where Apple makes a few final tweaks and adjustments and finalizes the software before entering production. The next stage is Product Validation Testing, in which a limited production run is made and used for internal testing before the product is announced and sold to customers.
Of course, the product hasn’t yet been announced and could still be cancelled or delayed. However, given the stage of development they’re at, they could be on shelves by the end of the year—assuming the new Siri launch goes according to plan. Apple has to be happy with the quality of the new Visual Intelligence features before bringing these AirPods to market.
The new Siri, now expected to be part of the OS 27 updates, is reportedly a central component of these AirPods, because that’s what the cameras are for. According to Gurman’s report, the cameras are located on each earbud stem, which will be slightly longer than those on the current AirPods Pro, and provide low-res information for Siri rather than taking photos or videos for users.
They would allow you to simply look at something and ask Siri about it: looking at a bunch of ingredients and asking what meal you could make (a scenario AI companies seem obsessed with for some reason), using landmarks when giving directions, or adding information from a poster to your calendar. Essentially, it’s the same experience you get invoking Visual Intelligence on your iPhone, without having to pull out your phone, hold it up to something, and take a picture.
These new AirPods will look similar to AirPods Pro 3, but with longer stems to accommodate the cameras and a visible LED that will illuminate when visual data is being uploaded to the cloud. We don’t know what these AirPods will cost or what Apple will call them, but might we suggest AirPods Ultra? Gurman says they have been in development for around four years and are part of a wave of AI-centered products that also includes a pin/pendant and smart glasses. The new AirPods are further along in development than those.