Meet people where they already are, with a little more intelligence woven in.
Somewhere in Apple's hardware labs, engineers are trying on glasses — four different versions of them, in rectangular and oval frames, in various sizes and colors — and asking a question the company has been circling for years: what does an Apple wearable actually look like when it doesn't try to do everything at once?
According to reporting by Bloomberg's Mark Gurman, Apple is actively testing four distinct smart glasses prototypes with a target launch window of 2027. The designs span two basic frame shapes — rectangular and oval — and come in multiple sizes and colorways, suggesting the company is still in the phase of narrowing down what the product will feel like to wear before it commits to what it will do.
What it will do, at least in its current conception, is notably restrained. These glasses are not being built around a display. There is no augmented reality overlay, no floating interface hovering at the edge of your vision. Instead, the focus is on AI-driven utility: capturing photos and video, handling phone calls, playing music, and serving as a hands-free interface for an upgraded version of Siri. The pitch, in other words, is closer to a smarter pair of earbuds with a camera than to anything resembling a headset.
That positioning is deliberate, and it carries the shadow of recent history. Apple's Vision Pro — the $3,499 mixed reality headset that launched in early 2024 — was an extraordinary piece of engineering that struggled to find a mass audience. It was heavy, expensive, and asked users to rearrange their relationship with computing in ways most people weren't ready for. The smart glasses project represents a different instinct: meet people where they already are, doing things they already do, just with a little more intelligence woven in.
The competitive context is hard to ignore. Meta's Ray-Ban smart glasses, built in partnership with EssilorLuxottica, have sold in meaningful numbers and demonstrated that consumers will wear AI-enabled eyewear if it looks like regular eyewear and doesn't demand too much of them. Google, which pioneered and then retreated from the category with Google Glass more than a decade ago, is reportedly circling back. The market Apple is entering in 2027 will be more crowded than the one it might have entered in 2023.
Apple's advantage, as ever, is integration. A pair of glasses that talks fluently to your iPhone, your AirPods, your Apple Watch, and a Siri that has been substantially rebuilt around large language model capabilities is a different proposition from a standalone device. The question is whether Apple can make the hardware feel inevitable — the way the original AirPods did — or whether it will land as a capable product in search of a compelling reason to exist.
The 2027 timeline gives the company roughly a year and a half of additional development runway from where it stands now. That's enough time to settle on a frame design, refine the AI features, and figure out the pricing strategy that will determine whether this becomes a mainstream accessory or a niche one. For now, the glasses are still just prototypes on someone's face in a lab — but the fact that Apple is this far along, with this much design variation still in play, suggests the project is real and the commitment is serious.
The smarter bet, watching this unfold, is less on whether Apple ships the glasses and more on whether the AI inside them is good enough to make people reach for them instead of their phones.
Notable Quotes
The glasses are expected to prioritize AI-driven functionality over display-based augmented reality, positioning them as lightweight everyday assistants rather than mixed reality headsets.
— Mark Gurman, Bloomberg (paraphrased)
The Hearth Conversation
Another angle on the story
Why does it matter that Apple is avoiding a display in these glasses?
Because every smart glasses product that led with a display has struggled. The display adds weight, cost, and complexity — and it asks the wearer to trust that the overlay will actually be useful. Skipping it is a way of saying: let's solve the easier problem first.
Is this basically Apple's answer to the Meta Ray-Bans?
In form, yes. In ambition, probably more. Apple will lean hard on Siri and on how the glasses talk to the rest of your devices. Meta's advantage is the Ray-Ban brand. Apple's is the ecosystem.
Four design variants feel like a lot. What does that tell us about where they are in development?
It tells us they haven't committed yet. When a company is testing rectangular versus oval frames in multiple sizes, they're still in the phase of asking what this thing should feel like to own — not just what it should do.
What happened with Vision Pro that Apple is trying to avoid repeating?
Vision Pro asked too much. It was expensive, isolating to wear, and required users to reimagine how they interact with computing. Most people weren't ready. These glasses are designed to slot into existing habits rather than replace them.
Does a 2027 launch give Apple enough time to get this right?
Probably. The underlying AI capabilities are maturing fast, and Apple has been rebuilding Siri from the ground up. The hardware problem — making something light enough and stylish enough that people actually wear it — is the harder constraint, and they're clearly working on it.
What's the thing to watch between now and launch?
Whether the Siri integration is genuinely useful or just a feature list. The glasses live or die on whether the AI earns its place in your day.