STORIES

Behind the glasses: 3 design leaders share their vision for wearables at Meta

OP-ED
CONTENT DESIGN
PRODUCT DESIGN
By Joshua To, Michelle P., Tizzy A.
8 min read
April 29, 2025

SUMMARY

The view appears equally bold and bright for Ray-Ban Meta and Orion.

At Meta, we’re building the future of human connection and the technology that makes it possible. To carry out this mission, designers are moving beyond 2D screens into experiences that incorporate AI (artificial intelligence) and AR (augmented reality). Glasses like Ray-Ban Meta and Orion are two examples of this evolution, helping people connect more meaningfully with each other and the world around them.


Weighing only a few more grams than Ray-Ban’s iconic Wayfarers, Ray-Ban Meta glasses enable wearers to use their voice to get fast answers with Meta AI, capture photos, make calls, send messages, listen to music and livestream.


After a decade of research, design and development, the Orion product prototype showed how AI and AR can seamlessly enhance everyday life. From pointing out which chord to play next while learning the piano to offering recipe ideas with step-by-step visual instructions based on what’s in the refrigerator, the AR glasses’ wide display recognizes the objects in front of a wearer and overlays guidance on the physical world to help them accomplish tasks, without the friction of holding a phone or looking at a computer.


To get a glimpse behind the glasses and tap into the minds of the people leading the design of Ray-Ban Meta and Orion, we asked three designers to respond to the question: “What excites you the most about the intersection of wearables design, AI and AR?” Their personal responses reveal the immense creative and connective potential emerging from this intersection — taking cues from architecture and urban planning to making leaps in accessibility.

“We’re lighting the way to a level of interaction far beyond the smartphone.” — Joshua To

As the VP of product design for AR and wearables at Meta, Joshua oversees the design strategy and execution of cutting-edge products that enhance human connection and interaction.

A woman wearing a blue jacket and the Orion AR glasses stands in front of a holographic screen, smiling, talking to and looking towards the woman appearing on the screen.

“Orion brings the value of computing into the here and now, so you can be yourself and interact naturally without having to learn complicated new interfaces.” — Joshua To

Computers have changed rapidly in the last 50 years, but people, not so much. Apart from fashion and technology, it’d be hard to tell the difference between people of today and people from hundreds of years ago. When we merge computing with things that people wear — like watches or glasses — we have to pay close attention to people’s needs and the full range of people’s shapes and sizes. Innovation in wearables comes when we can find ways for computers to understand and support these needs and variances, rather than giving people more technology to have to figure out and keep up with. At Meta, this is our north star when designing wearable devices.


The smartphone is arguably the most successful computing tool people have made to date. It’s used daily by billions and combines incredible computing capabilities with an innately human-centered interface. Like tools we’ve designed and used for millions of years, the smartphone fits easily in the hand and responds rapidly to touch. It’s well-crafted for transforming intent into action. The small, portable form factor means it goes where we go and is available when and where needed.


It’s a mistake to think the smartphone is the pinnacle of computer interfaces. It can’t see what you see or hear what you hear without deliberate effort on your part. It can understand your voice and respond to touch, but it’s unresponsive to basic, everyday signals like pointing, gazing or nodding. And smartphones often hide in pockets or bags instead of helping you stay present and connected.


Orion brings the value of computing into the here and now, so you can be yourself and interact naturally without having to learn complicated new interfaces. Orion’s ability to see what you see and to understand intuitive hand gestures like tapping, swiping and pointing, through a fusion of computer vision (CV) and electromyography (EMG), lights the way to a level of interaction that matches how people actually navigate their world.
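As a purely illustrative sketch (not Meta’s implementation), “fusion” of CV and EMG signals can be pictured as late fusion: each model produces per-gesture confidence scores, and a weighted average decides the winning gesture. The gesture names, weights and threshold below are hypothetical:

```python
# Hypothetical late-fusion sketch: combine per-gesture confidence
# scores from a vision model and an EMG wristband model.
GESTURES = ["tap", "swipe", "point", "none"]

def fuse_gestures(cv_scores, emg_scores, cv_weight=0.4, emg_weight=0.6,
                  threshold=0.5):
    """Average the two models' probabilities per gesture (weighted)
    and return the winner, or "none" if confidence is too low."""
    fused = {
        g: cv_weight * cv_scores.get(g, 0.0) + emg_weight * emg_scores.get(g, 0.0)
        for g in GESTURES
    }
    best = max(fused, key=fused.get)
    return best if fused[best] >= threshold else "none"

# EMG strongly suggests a pinch-like "tap"; vision is less certain.
print(fuse_gestures({"tap": 0.5, "swipe": 0.3}, {"tap": 0.9, "point": 0.1}))
```

Weighting EMG above vision here reflects one plausible design choice — a wrist signal can register a subtle pinch even when the hand is outside the cameras’ view — but real systems would tune such parameters empirically.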


The leaps Orion demonstrates in natural interaction alone are enough to be transformative. But the marriage of Orion’s novel input and output with our foundational work in AI points toward an extraordinary future. We believe the rapidly accelerating capabilities of our Llama models, combined with the capabilities established by Orion, will assist people well beyond the smartphone.

  • How might we enhance people’s memories?

  • How might we help people develop healthier habits and support their goals?

  • How can we evolve and improve the notification experience to help people stay in the know while also staying present?

“It’s time to break free from backlit rectangles.” — Michelle P.

Michelle is the design director of the device design team at Meta, which ensures that the end-to-end experiences across all our wearable devices are of the highest quality from concept to marketing and launch.

The Orion AR glasses take in a table displaying bananas, pineapple, chia seeds, matcha, oats, and cacao, then generate recipe recommendations for the wearer to follow.

“Wearables like Ray-Ban Meta and Orion offer an opportunity to redesign everyday experiences in a way that’s more useful, usable and beautiful.” — Michelle P.

As we push the boundaries of what’s possible with AI and AR wearables, teams at Meta are thinking differently about how we design. We’re most familiar with designing 2D spaces with 2D design tools, and that’s still critical work. But it’s time to extend beyond backlit rectangles, app grids and complex menus that don’t translate to experiences that will live on top of a physical backdrop, especially as people are moving through the world.


For ages, people have honed the skills of creating spaces and moving through them, from the grandeur of the Taj Mahal to a well-designed smartphone application. Just as city planners use rivers and public buildings to guide movement and create vibrancy and identity, we can employ similar techniques in AR to enhance the experience. Imagine user interfaces (UI) that live in harmony with the world, sampling colors, light and textures that play with our physical realities, and that are intentional about where they break that harmony to draw attention, all without overwhelming or disorienting people.


At Meta, our goal is to design devices that deliver value across multiple sensory dimensions. We want to provide functional value by making tasks easier and more efficient, but we also want to create emotional value by crafting experiences that inspire, delight and bring people closer. By considering the biological and sensory aspects of human interaction, we can design interfaces that are not only seen and heard but felt in an entirely different way than what we experience from screens as we typically know them.


Wearables like Ray-Ban Meta and Orion offer an opportunity to redesign everyday experiences in a way that’s more useful, usable and beautiful. By integrating principles from other disciplines responsible for building physical spaces like industrial design, architecture and urban planning, we can create environments that are both functional and harmonious with the spaces they inhabit.

“This work underscores that accessible design is good design.” — Tizzy A.

Tizzy is a content design director and accessibility lead who ensures that wearables are clear, understandable and usable by everyone who wants to try them.

A woman walks viewers through how she uses her Ray-Ban Meta glasses to help her accomplish tasks independently and seamlessly as a blind person, including calling a sighted volunteer to help her turn on her new record player, read an invitation and get ready for a party with the Be My Eyes application.

“Through a partnership with the Be My Eyes application, a blind or low-vision person wearing Ray-Ban Meta glasses can initiate a video call with a sighted volunteer with a simple voice command.” — Tizzy A.

Curb cuts — lowered sections of sidewalks at intersections — were initially designed so that people using wheelchairs could move on and off sidewalks more easily. But curb cuts are actually beneficial for everyone: caregivers with strollers, toddlers on bicycles and people using shopping carts or wearing high heels. We apply this same mindset to our product strategy on the wearables team at Meta. We make design decisions and build features that are more accessible to and benefit everyone, including those with disabilities.


Accessible design is good design, and that design is also beautiful.


Part of what makes Ray-Ban Meta innovative is the assistive technology embedded in the lightweight and fashionable glasses form factor. People who are blind or have low vision organically adopted the product because it allows them to see in a way that they couldn’t before. They can put on the glasses — which look just like regular glasses — and use Meta AI capabilities to say, “Hey, Meta, what do the instructions on this prescription say?” or “Hey, Meta, which airport gate am I at?” Through our partnership with the Be My Eyes application, people can initiate a video call with a sighted volunteer on their glasses with a simple voice command. The volunteer can “see” through the camera of the glasses and provide audible feedback through the glasses’ open-ear speakers.


When AR is added to the equation, we open even more possibilities for accessibility. Imagine what an individual who can’t lift their arms could accomplish while wearing Orion, which uses the person’s eye gaze as a pointer and simple gestures like pinching, detected through the EMG wristband, to select items. Getting work done or attending a virtual doctor appointment would be much more difficult — perhaps impossible — using just a smartphone or controller.


Features like captioning and translation through Ray-Ban Meta and Orion can benefit someone who is deaf, hard of hearing or who struggles with a learning disability. And like curb cuts, these features have a halo effect for everyone: second language speakers, international travelers and even emerging readers. When we design AR and AI wearables with accessibility in mind, it elevates the experience for everyone.

We’re looking forward to seeing what the next year holds at the intersection of wearables design, AI and AR at Meta, and for the evolution of Ray-Ban Meta and Orion. We want to hear from you: As a designer, what excites you most about wearable devices that incorporate AI and/or AR? Share your thoughts with us on Facebook, Instagram or Threads.

Design at Meta is for everyone who touches user experience and design.

Whether you’re a product designer, writer, creative strategist, researcher, project manager, team leader or all-around systems-thinker, there’s something here for you.


Design at Meta gives you a look into the expertise and perspectives of the multidisciplinary teams who are building the future of human connection and the technology that makes it possible.
