Apple’s smart glasses coming in 2026, aiming to take on Meta’s wearables: Report
Even as the hardware takes shape, Apple is reportedly wrestling with internal concerns about its AI capabilities.
Apple is intensifying its push into AI-powered wearables, fast-tracking development of its own smart glasses to take on Meta’s popular Ray-Ban collaboration. According to a new report from Bloomberg, Apple now aims to debut the glasses by the end of 2026, with large-scale prototyping expected to begin later this year.
Smart Glasses: A New Frontier for Siri and On-Device AI
Developed under the supervision of the Vision Products Group, the same team behind the Apple Vision Pro, the upcoming smart glasses are said to integrate cameras, microphones, and speakers directly into the frame. These components would reportedly enable real-time interaction with the user’s environment through Siri, Apple’s digital assistant.
Core features expected at launch include phone calls, music playback, turn-by-turn navigation, and live language translation, mirroring much of the functionality of Meta’s Ray-Ban smart glasses. One person familiar with the product’s development described Apple’s version as “similar to the Meta product but better made,” hinting at a more premium hardware design.
While Apple has not officially confirmed the product, the report signals a shift in how seriously the company is treating the competitive wearables space, particularly as Meta and Google gain ground with their own AI-powered accessories.
AI Lag May Complicate Apple’s Glasses Ambitions
Despite the hardware advances, Apple is reportedly grappling with internal concerns about its AI capabilities. Unlike Meta and Google, which power their smart glasses with robust in-house models such as Llama and Gemini, Apple still relies on external partners, including Google and OpenAI, for visual intelligence on iPhones.
Currently, Apple’s Visual Intelligence feature is underpinned by Google’s image search and OpenAI’s models for tasks like object recognition and scene analysis. While functional, the arrangement is seen as a stopgap. Apple is widely expected to expand its own foundation models at WWDC 2025 as part of its broader Apple Intelligence initiative.
The question remains whether that rollout will be mature enough to underpin real-time, on-device analysis for smart glasses by the time they hit the market. That may prove to be the key differentiator, or the bottleneck, for Apple’s upcoming wearable.