Meta Ray-Bans, 6 Months Later: The Best AI Companion Happens to Be Camera Glasses
They're wonderful vacation companions and memory makers. The AI part is just a bonus that's along for the ride.
Every once in a while, you'll see me talking to my glasses. I'll stare at something briefly and mutter to myself, or I'll tap the sides of my frames and then start chatting with them. It probably looks like behavior from a future world. For me, it's just another day with Meta's second-gen Ray-Ban glasses, which I've had on my face off and on for the last six months or so.
Everyday AR glasses aren't here yet, and VR headsets generally stay at home. Somewhere in between sit smart glasses, most of which don't have displays at all. Meta's Ray-Bans, like a wave of wearables suddenly arriving in 2024, do have AI onboard -- and they can use it to see what you're seeing, in a sense, using cameras. Both Meta's glasses and Humane's AI Pin can take snapshots of the world and then analyze those images with generative AI, answering back in audio.
Meta's newest Ray-Bans, which start at $299, arrived back in October with just camera and audio features. The generative AI functions, accessing the onboard camera, are newer. What I love about Meta's smart glasses is that AI is a whimsical bonus. These are glasses first and cameras/headphones second -- and pretty good ones at that.
That's what kept Meta's glasses in my life. They're practical and, oddly, transformative. I forget I have them on, and then I realize how used to them I've become when I catch myself talking to myself and tapping the side of my normal glasses to snap a photo, suddenly missing the extra features like a phantom limb.
Between the small wearable AI devices suddenly sprouting everywhere and big, advanced mixed reality VR headsets, smart glasses now seem like a happy middle ground: something new, casual and often more useful than a smartwatch. All it took was getting fitted for a prescription and wearing these glasses around all the time, and I found out exactly how immersive they can be. That is, until the battery runs out.
Meta Ray-Bans (Gen 2)
Like
- Normal-looking design
- Great microphones
- Great wearable action camera for vacations
Don't like
- Volume gets washed out in loud environments
- Battery life won't last a full day
- AI features are still limited
As glasses
Meta's second-gen Ray-Bans really, truly look like normal glasses. The arms are thicker plastic than those on my chunky designer everyday glasses, but nobody knows they're tech when I wear these black Wayfarers around. It takes a few minutes for even seasoned tech vets to notice them.
Luckily, they also match my style. Meta offers other frame designs, including a rounder Headliner and a cat-eye Skyler, along with different colors, transparencies and tinted lens options. The coolest part is that they can be outfitted with prescription lenses, too. I got my nearsighted and progressive reading lens needs met, turning the smart glasses into a truly working pair of everyday glasses for me. They're comfy, too.
I do have to remember I'm wearing them, though. While Meta's glasses are splash resistant, they're not meant to get soaked. I don't tend to swim with my glasses on anyway, but I'd also want to avoid wearing these during a serious downpour.
As headphones
These glasses double as Bluetooth headphones, pairing with iOS or Android and syncing with a dedicated Meta View app that also syncs photos (more on that below). The speakers, located right under the arms near my ears, deliver surprisingly solid sound without much audio bleed. My family sitting next to me can't hear much of the music I'm listening to, or the phone calls I'm on. I look like I'm talking to myself.
The audio isn't as good as what a decent pair of earbuds delivers, especially when it comes to bass. And while the speakers hit a respectable volume, they become hard to hear in noisy public areas. There's no noise canceling, either, which I've been spoiled by on earbuds like the AirPods Pro 2.
They're really good for speech, in particular podcasts and phone calls. The glasses have a microphone array that works really well for conversations. I haven't had any complaints about my voice quality while using them.
The multiple microphones also create some spatial audio magic when recording videos with the glasses; playing back video clips later makes environmental audio seem eerily like it's happening all around me.
The right arm of the Ray-Bans has a touchpad with a volume slider and a tap-to-play function. Tapping with two fingers answers phone calls, and tapping and holding can start a personal playlist in either Spotify or Apple Music.
As a camera and memory maker
These glasses can capture photos and one-minute video clips, too, and this is where I've really come to love them as a vacation companion. On two different trips to Walt Disney World and Universal Studios, they've been able to snap photos and capture clips throughout my day -- and while on rides -- in far more spontaneous ways than my phone. I've worn them while sledding down hills. I've worn them while cooking and unboxing gadgets. They're memory makers, and as I wrote in an earlier story, they feel like what advanced mixed reality headsets like the Vision Pro someday aspire to be: an instant everyday companion.
The glasses only shoot in vertical portrait mode, which is great for Instagram or TikTok but not for your home TV. The video quality certainly isn't as good as my iPhone 15 Pro, or many recent phones. But the videos look a lot better than I expected. Good enough to share, even pretty OK in dim light and wonderful at capturing a quick moment. There's some stabilization while walking, too.
Photos are snapped by saying, "Hey Meta, take a photo" (or a video), or by pressing, or pressing and holding, the little shutter button on the right arm. The camera voice commands work even when the glasses are disconnected from a phone.
I find myself taking little memory shots all the time, capturing a quick thing I see. They feel like memories, even more than taking out my phone for a photo or video. The limited one-minute video record time is a bit of a letdown, but it's good enough to capture most quick events on the fly. It's likely done to help conserve storage space and battery life on the glasses, but I'd love a way to override it at times. The glasses can also live stream to Instagram and be used during WhatsApp and Messenger calls. They don't work for FaceTime yet, but maybe that'll be next.
Photos and videos sync to Meta's phone app over a local Wi-Fi connection the glasses create; I try to pull them off once a day, but I haven't run out of storage. They're added to my phone's photo library automatically after enabling that in settings, and they can also be managed in Meta's app.
What about privacy? The camera on these glasses hides in one temple. A white LED comes on when recording, and a shutter sound plays. Both are pretty subtle, though, and I've recorded in lots of places where other people nearby didn't seem to notice. Or maybe they noticed and didn't care; after all, everyone's recording everything on their phones most of the time anyway. Maybe we're just in an always-recording world. Be careful wearing these in private places, or bathrooms. You can turn the glasses off completely via a small switch on the inside of the frame, but no one else will know they're off: there's no physical camera cover.
As a wearable AI companion
Now, let's get to the weird stuff: these glasses can be AI companions, much like Humane's AI Pin and the Rabbit R1. All of these attempts at being AI companions feel early and semi-broken right now, but Meta's implementation feels the most acceptable for everyday wear. A beta mode in the US and Canada adds generative AI responses to whatever you might ask the glasses. "Hey Meta, what is existentialism?" "Hey Meta, what joke can you tell me about the end of the universe?" "Hey Meta, can you make up the plot for a one-act play about the transformation of Scott Stein into an interstellar warrior?" You can go ahead and ask whatever absurd things you might ask AI in any other app in your life and get results that will either be amazing, terrible, bizarre or a mix of all three.
Meta's AI features can be practical, like the smart speaker in your home, or Siri, or Google Assistant. Generative AI opens up deeper, more tangled possibilities. On long walks in my neighborhood, I've had ongoing conversations with my glasses. You have to say "Hey Meta" (or tap the glasses) to trigger them at first, but a new update also lets the glasses keep listening after they give a response, which I've found makes quick replies and follow-up questions feel a lot more natural. It also means the glasses are listening more.
The weirder stuff comes with "multi-modal" AI, which taps into what the glasses see to generate information, translate what text you see or identify objects. This is where the glasses start to truly feel like some future wearable AI extension. It's a companion that can talk to me and see what I see.
If this all sounds like the 2013 movie "Her," you're not wrong. I've had a lot of moments that have made me feel like I have an AI companion that can be my personal, everyday guide. In practice, that companionship has limits and awkwardness.
Waking the camera mode for AI, for example, requires saying, "Hey Meta, look and..." to preface whatever you're asking, which feels like a mouthful. There's also a few-second pause, where you wait for the response while the photo is snapped and the AI processes your request. While this happens, I can hear a little musical riff, a sort of "please wait" tone I've internalized.
Apparently, during these moments, I get what I've called "smart glasses dead eye," where I stare into space, frozen, waiting for the response. It's seriously off-putting with friends, as if I've suddenly entered zombie mode. Maybe this is a preview of what we'll all be like someday, much like the Bluetooth earpiece wearers of 20 years ago who looked like they were talking to themselves.
Meta's AI voice options sound sort of like people, and sort of not. The interactions feel wooden, and I'm often hit with dead ends where the AI suddenly just doesn't want to help anymore, like a more advanced version of what happens on every other smart speaker and assistant. The times have changed, and they also haven't.
The AI's usefulness is also limited by its lack of app hook-ins. I can make calls, send messages and connect to Apple Music. I can ask for general information and I can connect to some Facebook functions. But I can't hook into my iPhone's features the way Siri can, and I can't call an Uber, check on an Amazon delivery or summarize my email. Not yet. Doing this, like with all emerging AI services and gadgets, requires authorizing access… which is a leap of faith anyway, privacy-wise. Neither Apple nor Google has made moves yet toward working deeper AI access into iOS and Android for accessories like Meta's glasses. A new era needs to come, similar to what smartwatches had to navigate to become phone extensions. Meta's glasses still sit on the outside a bit, although they're far more phone-connected than the Humane AI Pin, which doesn't connect at all.
There are awkward moments. During a Passover seder, my glasses started talking to me when a message popped up on my phone. I should have turned them off, but it's a reminder that, much like smartwatches (and phones), we have to learn to manage distractions on a personal device that's always on. It's why I still take these glasses off and go back to my non-smart pair a lot of the time. That, and battery life limits.
Battery life: the other weak link
Meta includes a nice-looking case that doubles as a charger. The glasses snap in, and little metal contact pins on the bridge charge the onboard battery. The case can recharge the glasses eight times, and in everyday use that feels like a ton of juice, similar to how an Apple AirPods charging case lasts over a week of use. The case itself recharges via USB-C, but the glasses need the case to charge.
That's a problem over the course of a day because the battery runs down quickly. If I start the day with them around 7 a.m., I find the battery running out by around noon. The glasses recharge relatively fast -- less than an hour -- but they need to be in their case. That means I need a spare pair of glasses, or I spend the rest of my day with the battery dead. You can still see through them, but none of the smart features will work.
If you're buying Meta's Ray-Bans as non-prescription sunglasses, you could take them off for a bit to charge. For me, that's not an easy option. It keeps me from loving these as a truly all-day device, but they're the closest I've experienced.
Part of a new future
Meta's glasses have shown me something somewhat astonishing at times: by being a companion to my everyday glasses, they achieve something -- even if limited -- that the most advanced VR mixed reality headsets can't. I take Meta's glasses with me wherever I go. The smartest thing Meta did with these glasses is make the AI features just a bonus instead of the star attraction. These are camera and audio glasses first, and they're great at that. Those practical functions are the foothold, the reason for me even wearing them in the first place.
As companies like Meta (and Qualcomm, and possibly Google and Apple, too) strive to make magic AR glasses in the future, AI-enabled smart glasses like these are the necessary first steps. Someday these glasses will have displays, and wristband inputs or hand tracking. For now, they're fun enough that my friends and my family all take notice and agree. You don't need glasses like these, obviously, and they're still not perfect. At the price Meta's charging for them, I'd take a pair of Meta Ray-Bans to play with AI over anything Humane, Rabbit or others are making.
Keep in mind: wearing them makes you part of Meta's ongoing tech experiment, as is the case with all emerging connected tech. But if you see me mumbling to myself as I walk around, you know the reason. Please be kind. You may be like this soon, too.