Aura—Ambient AI assistant for everyday rituals
A glanceable conversational layer that lives across your devices. Designed the gesture vocabulary, latency-aware motion system, and the quiet feedback loops that make AI feel calm — not chatty.
I'm Vanessa Core — a product designer at Meta shaping AI experiences and the next generation of wearables. I work at the seam where ambient intelligence meets the human body.
At Meta I work across AI and the next generation of wearables. My practice sits between the cinematic and the intimate: building interfaces that feel inevitable, as if they were always there, waiting for you.
I started in industrial design, moved through software, and now spend most of my time prototyping the seam between hardware, gesture, voice, and intelligence. I believe the best technology is the kind you forget you're using.
Each project is a study in restraint — what to add, what to leave out, what to let the system feel for itself.
Designed the on-glass UI grammar for a translucent companion: peripheral typography, off-axis interactions, and a layered information model that respects the world behind the lens.
Explored micro-gesture detection translated into a soft, tactile interaction language. Defined the latency budget, training rituals, and feedback choreography for a wearable that learns you.
A privacy-first companion that captures the small things — voice scribbles, photos, fleeting thoughts — and quietly weaves them into useful context, on-device. No cloud, no anxiety.
From industrial design studios to Meta's Reality Labs, my work has tracked a single thread: how technology leaves the screen and joins the body.
Leading the design of an ambient, multimodal assistant across smartglasses, mobile, and home. Defining the gesture vocabulary, latency-aware motion, and the quiet feedback loops for AI on the body.
Designed on-glass UI grammar, off-axis interaction patterns, and the industrial language for a research wearable program. Shipped a neural input prototype with the brain–computer interface team.
Worked on the soft edges of health — moments where a watch becomes a companion, not a clinician. Led design for two unreleased experiments in passive sensing.
Cut my teeth on consumer hardware, hospitality robotics, and a generation of speculative wearables that never made it past the lab door — but taught me everything.
Designing assistants that earn attention through restraint — anticipating less, observing more, intervening only when it matters.
Buttons, gestures, glances. A wearable is body language. I design the rhythm before the resolution.
On-device by default, and legibly so. Trust isn't a setting — it's the shape of the product.
A product is what you keep removing until it works.
Four tenets I return to. Less a method than a posture — the way I hold a problem before I touch it.
The best products feel discovered, not designed. I work toward the version of a feature that, once seen, seems like the only one that could have existed.
Cleverness is a tax on attention. I design for the quiet competence of a tool that doesn't ask for applause.
Every transition is an explanation. I treat easing curves and timing as part of the product copy.
When the screen disappears, design becomes posture, gesture, and breath. I study the wearer as carefully as the wearable.
I take on a small number of collaborations each year — usually 0→1 product work at the intersection of AI, hardware, and the body. If that sounds like you, write me a paragraph.