This is Optimizer, a weekly newsletter sent every Friday from Verge senior reviewer Victoria Song that dissects and discusses the latest phones, smartwatches, apps, and other gizmos that swear they’re going to change your life. Optimizer arrives in our subscribers’ inboxes at 10AM ET. Opt in for Optimizer here.
About a month ago, I was walking in Williamsburg when a stylish Brooklyn hipster stopped me to ask about the Oakley Meta HSTN smart glasses I was testing. A few weeks later, I went to dinner with a friend I hadn’t seen in years. It wasn’t until we were walking to the train that I noticed they were wearing a pair of Ray-Ban Meta glasses. I went to a concert, turned my head, and saw someone in my section with the Ray-Bans’ recording light on. The next day, a quick search told me footage had been uploaded to TikTok.
It hit me. Regular people aren’t just curious about smart glasses. They’re actually buying and using them. I’m starting to spot them in the wild. And when Meta takes the stage next week at its annual Connect event, normal people are going to be wondering: what comes next?
No one expected much when the company launched its second-generation Ray-Bans nearly two years ago. (I mean, its first-gen glasses were a bona fide flop, with 90 percent of users abandoning the device.) Now, it’s 2025, and the company has sold 2 million pairs. Its partner company EssilorLuxottica plans to ramp up production to as many as 10 million by the end of 2026. This success is why smart glasses were all over the CES show floor last January. Until now, most people have associated smart glasses with the abject failure that was the original Google Glass Explorer Edition.
You could peg some of Meta’s success to timing. There’s a confluence of things happening right now: the AI craze, the search for AI hardware, and a TikTok era that has normalized filming in public, to name a few. But Meta’s genuine innovation was nailing the execution of a very old idea.
Where Google Glass stood out like a sore thumb — it had extreme Vegeta from Dragon Ball Z vibes — Meta teamed up with Ray-Ban to create stylish yet discreet glasses that don’t look like tech. It lowered the cost of entry from $1,500 to a much more affordable $300. The sound quality is good. The camera quality is, shockingly, good enough to post to Instagram, making the glasses a useful tool for content creators. Adding AI seemed like a dubious move, but I’ve since heard from several low-vision and blind people who say the AI-infused glasses have helped them live more independent lives.
It’s why my inbox is flooded with pitches for Meta knockoffs. Hell, it’s probably why Google has suddenly thrown its hat back into the smart glasses ring after years of pretending the whole glasshole conversation never happened.

But it’s not enough to keep Meta’s momentum going.
Meta might be the best-known smart glasses maker right now, but its rivals will soon chip away at its lead. Looking at the current competition — Google’s Android XR prototypes, XREAL One Pro, Rokid Glasses, and Even Realities G1, to name a few — all of them have some sort of display. And while I get the desire for fewer screens, adding one to glasses allows for infinitely more possibilities.
I have no sense of direction. Many of my walks involve me anxiously looking down at my phone to see if I’m going the right way. It’d be much nicer if I could say “hey, give me directions to the library” into my glasses and have the route pop up in my line of sight. I’d have killed for a heads-up display while giving presentations in high school instead of trying to decipher my chicken scratch on index cards. I’d love it if, when I go on a family vacation to Italy this fall, I could look at a menu and just see the translation pop up without needing to snap a photo, talk to an AI, or open a phone app to view the translation there. Heck, maybe having recipe notes in my line of sight while cooking would mean fewer burnt dinners.
Those are simple suggestions, but they address a ton of the issues I’ve had with audio-only glasses. Part of the appeal is hands-free, phone-free living. Right now, you can either be hands-free or phone-free, but rarely both simultaneously. Given that, I’m betting we’ll see a version of Meta glasses with a display unveiled next week. My colleague Alex Heath reported extensively on the Hypernova prototype, and we’ve also seen corroborating reports from CNBC.
Even so, this smart glasses resurgence isn’t guaranteed. While it’s gaining steam, it’s in a fragile state. We’re more comfortable with people filming everywhere, but that doesn’t mean everyone is on board with the added privacy risks.
While writing this, I kept thinking about my colleague Liz Lopatto’s recent column about how anonymity is dead and we’re all fodder for the content mill because everyone is filming everywhere, all the time. This only becomes more true when you throw smart glasses into the mix. And that’s not even touching the dystopian element, like when a Border Patrol agent sports the Ray-Bans during an immigration raid, an esthetician weirds out a client by wearing the glasses during a Brazilian waxing appointment, or college students turn them into a real-time doxxing tool.
This is not a problem that Meta will solve next week at Connect, and that’s mainly because it can’t. It’s a societal conversation that we’re all going to have to have with each other as these devices become more popular. Meta can provide guidelines on how to use them responsibly (and it does), but we are the ones who’ll write the cultural rules in real time. I hope it’s less violent than the last time this tech entered the mainstream consciousness.
The only case Meta (and every other company jumping into this space) can make is that these devices improve life enough to outweigh the negatives. It’s a tall order. A display won’t win the argument overnight, but it will tilt the odds further in Meta’s favor, and that’s momentum the company can’t afford to lose.