In September, Meta pulled the veil back on three new models of smart glasses. One is an update of the company’s hit Ray-Bans. Another applies the same basic tech to a sportier Oakley frame. But the third got all the press—because it has a display.
Meta has been all-in on virtual and augmented reality for years. It acquired Oculus in 2014 for $2 billion and doubled down in 2021 by rebranding itself Meta (after the metaverse). The company has plowed $100 billion into the space over the last decade in a bet that it can build the next great computer interface.
Meta’s Quest VR headsets have sold decently over the years. But as Apple can attest, even with the best available technology, VR and mixed reality devices are still bulky face computers many people resist. Some, including Apple, believe no device will go fully mainstream until it’s been slimmed down to the size of a pair of glasses.
Enter Meta’s Ray-Bans.
The company’s first smart glasses relied on the chunky frames and brand recognition of Ray-Ban’s Wayfarer model. The glasses came stock with dual cameras, microphones, and bone conduction headphones. Crucially, Meta omitted a display, the most anticipated but hardest feature to engineer, and added AI instead (what else?). The glasses were an unexpected hit, and the tech giant sold around 2 million pairs.
On the back of that success, Meta is now expanding the product line. The smart glasses form factor, the company believes, is perfect for AI because it gives a chatbot your point of view: everything you see and hear. You can, in theory, ask AI to suggest dinner ideas from the food you’re looking at, translate a sign in a language you don’t know, or identify a painting or plant and tell you something about it.
But it isn’t only AI Meta’s after.
Smart glasses should have a visual component too, and for the first time, the $799 Meta Ray-Ban Display glasses include a small rectangular display. The device’s transition lenses darken in full daylight, keeping the display crisp and legible. It also comes with a “neural wristband” that senses electrical signals in your wrist muscles, letting you control the display. A pinch, for example, opens or closes it.
“It’s the only device where you can basically let an AI see what you see, hear what you hear, talk to you throughout the day, and then once you get the display, it can just generate a UI in the display for you,” Meta CEO Mark Zuckerberg told The Verge.
The device’s live demo was glitchy, but after tech reporters tried it out for themselves, even some skeptics said the glasses were the most advanced yet. Apart from chatting with AI, other possible applications include hands-free cooking, directions, texting, translation, or even captioning a conversation with someone in a loud room.
The glasses are compelling, but Zuckerberg thinks they could evolve into our primary computing devices. If the tech keeps up the pace, he may be right. But there’s reason to be skeptical. Since face computers took their first steps with Oculus and Google Glass, we’ve learned things get complicated the more up-close-and-personal a device becomes. The stuff we wear, especially on our faces, is a highly individual choice.
Even the small increase in bulk between the standard Ray-Ban Meta glasses and those with a display could put off a significant fraction of the public. And that’s for a device that only mimics the display on your phone, as opposed to the dream of a seamless digital reality laid across your field of view.
Getting that kind of functionality to fit into an ordinary pair of glasses—not to mention in a variety of styles—remains out of reach. Moreover, cameras and microphones help make the tech compelling, but they also represent a major downside. Privacy is an issue for any Meta product.
“The problem isn’t just the privacy of people sharing their data, it’s the privacy of all the people around them that might not even know they’re being filmed,” tech analyst Carolina Milanesi told The New York Times. “And people don’t have trust in Meta.”
Still, maybe we can dial back expectations a notch. New devices don’t have to be iPhone killers to be successful. Smartwatches are popular and make sense for small interactions, like reading texts and recording health data.
Beyond music and calls, wireless earbuds can be used to chat with AI or, as Apple’s new AirPods show, handle real-time translation. And smartphones, still the key orchestrator of all these devices, are hard to beat (even for AI). New offerings needn’t replace some or all of them to be valuable. Even smartphones haven’t replaced laptops, tablets, or desktop computers.
Computer interfaces are an expanding constellation of devices with tradeoffs. While smart glasses may be a new star in that constellation—indeed, Apple has reportedly stopped work on an updated version of the Vision Pro to zero in on smart glasses—there’s room for plenty of other options.
Some may love having an always accessible screen hovering in front of their eyes; others will find it distracting. The perfect device tends to be the one that fits a person’s preferences, needs, and circumstances.