Project Aria accelerates AR and AI research

An image showing three researchers working with Project Aria glasses

Project Aria helps researchers explore the future of AR and AI, before devices are mature

Project Aria is a research program from Meta Reality Labs Research that helps researchers understand the technical and non-technical challenges for building AR and AI devices.

In time, AR and AI technologies will empower us to achieve our goals via natural interfaces that understand our physical and digital worlds. By partnering with research institutions around the world, publishing research, and making resources openly available, we believe we can make this future a reality, responsibly.

Learn more about Project Aria Glasses

Innovating through open collaboration

An image showing the output of SceneScript on an Egocentric RGB image

Scene understanding


SceneScript is a method for representing and inferring scene geometry using an auto-regressive structured language model and end-to-end learning.
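To make the idea of a structured scene language concrete, here is a minimal sketch that parses a SceneScript-style command sequence into wall geometry. The command name and parameter set used below are illustrative assumptions for this sketch, not the exact grammar SceneScript's model emits.

```python
import math

def parse_scene(commands: str) -> list[dict]:
    """Parse a SceneScript-style command string into wall segments.

    NOTE: the `make_wall(a_x=..., a_y=..., b_x=..., b_y=..., height=...)`
    vocabulary is an assumed, simplified stand-in for the real grammar.
    """
    walls = []
    for line in commands.strip().splitlines():
        name, _, arg_str = line.partition("(")
        if name.strip() != "make_wall":
            continue  # skip commands this sketch does not model
        args = dict(kv.split("=") for kv in arg_str.rstrip(")").split(","))
        wall = {k: float(v) for k, v in args.items()}
        # Derive a simple geometric property from the two endpoints.
        wall["length"] = math.hypot(wall["b_x"] - wall["a_x"],
                                    wall["b_y"] - wall["a_y"])
        walls.append(wall)
    return walls

scene = """
make_wall(a_x=0.0,a_y=0.0,b_x=4.0,b_y=0.0,height=2.5)
make_wall(a_x=4.0,a_y=0.0,b_x=4.0,b_y=3.0,height=2.5)
"""
walls = parse_scene(scene)
for w in walls:
    print(f"wall: length {w['length']:.1f} m, height {w['height']:.1f} m")
```

The appeal of such a representation is that scene geometry becomes a token sequence, which an auto-regressive language model can learn to generate directly from egocentric observations.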

Learn more about SceneScript
An image showing the output of EgoBlur on an egocentric image

Enabling responsible innovation


EgoBlur is an AI model from Meta that preserves privacy by detecting and blurring personally identifiable information (PII) in images.

The model detects and blurs faces and license plates in images and is intended to be used within both research and consumer applications.
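As a rough sketch of the anonymization step only (not EgoBlur's actual implementation, and without the detector itself), the following pixelates hypothetical detection boxes in a NumPy image array; the `(x0, y0, x1, y1)` box format is an assumption for illustration.

```python
import numpy as np

def redact_regions(image: np.ndarray,
                   boxes: list[tuple[int, int, int, int]],
                   block: int = 8) -> np.ndarray:
    """Pixelate rectangular regions of an HxWx3 image.

    `boxes` stands in for face / license-plate detections that a model
    such as EgoBlur would produce; the box format here is assumed.
    """
    out = image.copy()
    for x0, y0, x1, y1 in boxes:
        region = out[y0:y1, x0:x1]
        h, w = region.shape[:2]
        # Replace each block-sized tile with its mean color,
        # destroying fine detail inside the detected region.
        for y in range(0, h, block):
            for x in range(0, w, block):
                tile = region[y:y + block, x:x + block]
                tile[...] = tile.mean(axis=(0, 1), keepdims=True)
    return out

img = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
redacted = redact_regions(img, [(16, 16, 48, 48)])
```

Pixels outside the detection boxes are left untouched, so the rest of the image remains usable for downstream research tasks.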

Learn more about EgoBlur

Participate in Open Research

Work with Project Aria public datasets to advance the state of the art in machine perception and AI

Learn more about Datasets
Learn more about Challenges
An image showing two egocentric sequences from the Aria Everyday Activities dataset, with trajectory and point cloud visualization

Featured dataset

Aria Everyday Activities

Aria Everyday Activities is a multi-purpose egocentric dataset created using Project Aria.

The dataset gives computer vision researchers access to anonymized Aria sequences captured in a variety of scenarios, such as cooking, playing games, and exercising.

Learn more about Aria Everyday Activities
An image showing a selection of simulated indoor scenes from the Aria Synthetic Environments dataset

Featured dataset

Aria Synthetic Environments

Aria Synthetic Environments is a large-scale, fully simulated dataset of procedurally generated interior scenes.

The dataset sets a new precedent for the scale of indoor environment datasets, surfacing exciting new research opportunities in 3D scene reconstruction, object detection, and tracking.

Learn more about Aria Synthetic Environments
An image showing various annotations from the Aria Digital Twin dataset

Featured dataset

Aria Digital Twin

Aria Digital Twin is a real-world egocentric dataset captured using Aria glasses, with extensive simulated ground truth for devices, objects, and the environment.

The dataset sets a new standard for ‘digital twins’, accelerating research into challenges such as 3D object detection, scene reconstruction, and sim-to-real learning.

Learn more about Aria Digital Twin
A screenshot from the Project Aria research paper.

Read the Project Aria Research Paper

For more information about the Project Aria glasses, read our paper here.

Illustration of Project Aria Glasses, Tools, and Services offered within the Aria Research Kit.

A comprehensive toolkit for partners to harness Project Aria

For the broader research community, Meta offers a kit that includes the Project Aria glasses, tools, and services needed to conduct independent studies and help shape the future of AR.


Subscribe to Project Aria Updates

Stay in the loop with the latest news from Project Aria.

By providing your email, you agree to receive marketing-related electronic communications from Meta, including news, events, updates, and promotional emails related to Project Aria. You may withdraw your consent and unsubscribe at any time, for example by clicking the unsubscribe link included in our emails. For more information about how Meta handles your data, please read our Data Policy.