Apple’s Next Frontier: Smart Glasses Are Coming
AI Wearables, Vision, and the Future of Apple’s Ambient Computing
For years, the tech world has buzzed about Apple’s ambitions in augmented reality, culminating in the launch of the Vision Pro. But Apple’s next major wearable may not be a headset—instead, it could be something subtler and more seamlessly integrated into daily life. New reports reveal that Apple is actively developing a new generation of AI-powered wearables, designed to serve as your iPhone’s eyes and ears, with smart glasses leading the charge.
This isn’t just about another gadget. It’s a strategic step to claim leadership in the emerging personal AI hardware market. While Meta has made early advances with its Ray-Ban glasses, Apple’s multi-product approach—including smart glasses, a discreet AI pendant, and camera-equipped AirPods—signals a future where ambient computing and artificial intelligence are woven seamlessly into everyday experiences. Here’s an insider look at how Apple aims to redefine wearable tech.
The Main Event: Apple’s Smart Glasses
At the heart of Apple’s new wearable strategy is a pair of smart glasses—currently the highest priority in this category. Rather than a scaled-down Vision Pro, these glasses are closer in concept to Meta’s AI wearables, but with Apple’s signature sophistication in both function and form. The aim is to deliver a device that excels as both a camera and an AI assistant, rather than as a full augmented reality display.
These glasses won’t be standalone. Like the Apple Watch, they’ll require an iPhone connection, offloading processing power to the device you already carry. Expected hardware includes:
A high-resolution camera for capturing photos and videos, drawing on Apple’s reputation for top-tier imaging.
Speakers and microphones for calls, media playback, and Siri interactions.
A dedicated environmental camera that provides contextual awareness for the glasses’ AI features.
Apple is investing heavily in AI integration. The company is developing advanced features that go beyond simple photography. For instance, the glasses could recognize and digitize real-world text or create context-aware reminders—imagine your glasses prompting you to buy milk after spotting the empty carton, or giving navigation directions based on recognizable landmarks rather than just street names.
Design is another area where Apple expects to shine. Unlike Meta’s partnership with Ray-Ban, Apple is handling development in-house, using high-end materials—including acrylic elements—to give the frames a premium feel. Multiple styles could emerge as the product matures. Reports indicate that development is moving quickly, with a public release potentially targeted for early 2027.
The AI Pendant: A Discreet Alternative
Apple knows that not everyone will want to wear smart glasses. That’s why it’s also developing an alternative: an AI pendant. Referred to internally as the iPhone’s “eyes and ears,” this device is positioned as a complementary route to the same wearable intelligence, delivered in a different form factor.
Conceptually, it’s similar to the Humane AI Pin, but with an important distinction—it’s designed as an iPhone accessory rather than a standalone device. About the size of an AirTag, the pendant would include an always-on camera and microphone and could be worn clipped to clothing or as a necklace.
The goal for the pendant is to provide a continuous stream of environmental data to the iPhone, powering hands-free AI interactions without needing to take your phone out. However, this project is still in early development and might not make it to market if it doesn’t meet Apple’s standards.
AirPods with a Vision
The third piece in Apple’s wearable strategy involves enhancing AirPods with built-in cameras. While rumors about camera-equipped AirPods have circulated for some time, new reports suggest this vision is moving closer to reality.
Apple’s plan is to integrate low-power infrared (IR) cameras into the earbuds—not for photography, but for basic environmental imaging. This innovation would enable new interaction methods, such as controlling audio or Siri with subtle air gestures. By tracking hand movements relative to the user’s head, AirPods could interpret commands without requiring a touch or voice, adding a new layer of seamless, gesture-based control to the Apple ecosystem.
A Calculated Strike in the AI Hardware War
Together, these three product lines represent a unified, multi-pronged strategy meant to challenge competitors like Meta and establish Apple as the clear leader in mainstream AI hardware. By tying these devices closely to the iPhone, Apple capitalizes on its immense, loyal customer base.
This approach delivers several clear advantages:
Lower Cost: Offloading processing to the iPhone keeps the wearables simpler and more affordable than standalone devices like the Vision Pro or Humane AI Pin.
Longer Battery Life: With less on-device processing, battery demands are reduced, supporting all-day use.
Seamless Integration: These wearables are designed to work effortlessly with the current Apple ecosystem, offering immediate value for iPhone owners.
While Meta’s Ray-Ban collaboration gave it a head start, Apple is playing the long game. By emphasizing high build quality, leading-edge camera technology, and deep AI integration, Apple aims to create indispensable products. The smart glasses, AI pendant, and camera-equipped AirPods won’t exist in isolation—they’ll act as gateways to a future in which Apple’s ecosystem moves beyond your pocket and onto your body, learning and assisting in the background as you go about your day.
The race to define the next dominant computing platform is well underway. Apple has made it clear: it intends to win—not just with a single flagship device, but with an entire family of intelligent wearables, fundamentally transforming how we interact with technology.