Aria Gen 2 glasses take research with Project Aria to the next level.

Learn more in our latest blog post. 

ICCV 2025

Time: Monday, October 20th, 8:00 am - 12:00 pm

Room: 328

Project Aria is the world’s most advanced device for egocentric research with applications in augmented reality, contextual AI, and robotics. Aria is worn like a regular pair of glasses and is packed with sensors that capture egocentric video, audio, inertial data, and more.
Project Aria can produce detailed information about the wearer’s gaze, their precise position and trajectory, their hand pose, and information about the surrounding environment.

In 2022, we hosted our first Project Aria tutorial at CVPR, where we invited attendees to participate in the Project Aria academic program by applying for the Aria Research Kit, which includes Aria glasses and tools for developing open-source datasets and real-time prototypes.

Since its debut, the tutorial has nearly doubled in attendance year over year. Today, over 250 research labs use more than 1,000 Aria devices around the world, producing highly detailed egocentric datasets and developing new methods for topics such as robot imitation learning.

This year, we will continue sharing insights and resources with the research community at the fourth Hands-on Egocentric Research Tutorial with Project Aria. We will share updates on the forthcoming release of Aria Gen 2 glasses, including a live demo, and feature recent Aria research from Meta and from our partners.

We look forward to another outstanding tutorial with the ICCV community.

Agenda

08:00

Keynote


Richard Newcombe - Meta Reality Labs


08:30

Project Aria Overview and Updates


James Fort - Meta Reality Labs


09:00

Aria Gen 2 Live Demo


Aria Kang - Meta Reality Labs


Cheng Peng - Meta Reality Labs


09:20

Robust Conditional Shape Generation from Casual Captures


Yawar Siddiqui - Meta Reality Labs


09:40

Nymeria++


Lingni Ma - Meta Reality Labs



Coffee Break (10:00 - 10:20)


10:20

Thinking Inside the Scene: Toward Common Sense in Robotic Perception


Zuria Bauer - ETH Zurich


10:40

Seeing Inside: Estimating Heart Rate, Emotion, and Personality on Egocentric Vision Systems


Björn Braun - ETH Zurich


11:00

Depth from Aria Gen 2 with NVIDIA FoundationStereo


Bowen Wen - NVIDIA


Michael Goesele - Meta Reality Labs


11:20

Egocentric Video Understanding in Operating Rooms - Challenges & Opportunities


Felix Tristram - TU Munich


11:40

Robot Learning from Egocentric Human Data with Project Aria


Danfei Xu - Georgia Tech


Organizers

Richard Newcombe

Reality Labs Research, Meta

Jitendra Malik

Reality Labs Research, Meta

Mingfei Yan

Reality Labs Research, Meta

Jakob Julian Engel

Reality Labs Research, Meta

James Fort

Reality Labs Research, Meta