INTRODUCING

Aria Everyday Activities

A re-release of Aria’s first Pilot Dataset, updated with new tooling and location data, to advance the state of the art in machine perception and AI.

WHAT IS IT?

An updated egocentric dataset created using Project Aria

Aria’s original Pilot Dataset provided computer vision researchers with access to anonymized Aria sequences captured in a variety of scenarios, such as cooking, playing games, and exercising.

In Aria Everyday Activities, we have updated the original dataset to make it easier for researchers to get up to speed using Project Aria’s simplified tools.

We also provide new location data, including semi-dense point clouds and feature observations generated by Project Aria’s Machine Perception Services, helping to unlock new areas of research.

LEARN MORE ABOUT PROJECT ARIA

Sensor Data

  • 1 x 110 degree FOV Rolling Shutter RGB camera
  • 2 x 150 degree FOV Global Shutter mono cameras for SLAM and hand tracking
  • 2 x 80 degree FOV Global Shutter mono cameras for eye-tracking with IR illumination
  • 2 x IMUs (1 kHz and 800 Hz), plus barometer and magnetometer environmental sensors
  • 7 x 48 kHz spatial microphones

Annotation Data (*New to AEA)

  • Per-frame eye tracking
  • Improved 3D trajectories, aligned between multiple users*
  • Semi-dense point cloud*
  • Semi-dense point observations*
  • Online calibration of Project Aria’s sensors*

A visualization of the sensor data from Project Aria glasses, contained within the Aria Everyday Activities dataset.
HOW IS IT ANNOTATED?

New and improved automatic and manual annotations

Aria Everyday Activities harnesses Project Aria’s Machine Perception Services to provide additional context to the spatio-temporal reference frames.

Semi-dense point annotation

New since the first release of the Pilot Dataset, we now provide semi-dense points for each sequence, generated by Project Aria’s Machine Perception Services.
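
For illustration, here is a minimal sketch of loading and filtering one sequence’s semi-dense point cloud with the projectaria_tools MPS readers. The file path is a placeholder, and the reader and field names (read_global_point_cloud, position_world, inverse_distance_std, distance_std) are assumptions based on the Machine Perception Services output schema and may differ between tool versions.

```python
# Minimal sketch: load one sequence's semi-dense point cloud and keep only
# well-constrained points. Path and field names are assumptions about the
# MPS output schema; adjust them to the released files.
from projectaria_tools.core import mps

points = mps.read_global_point_cloud("path/to/sequence/semidense_points.csv.gz")

# Keep points whose depth estimate is well constrained by thresholding the
# per-point inverse-distance and distance standard deviations.
filtered = [
    p for p in points
    if p.inverse_distance_std < 0.005 and p.distance_std < 0.01
]
print(f"Kept {len(filtered)} of {len(points)} points")
# Each kept point's 3D position in the shared world frame is p.position_world.
```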

Multiple point clouds being aligned to the same frame of reference.

Multi-user poses in shared reference frame

As in the original Pilot Dataset, in addition to providing a per-frame trajectory for every recording, sequences captured within the same environment are aligned to the same reference frame, unlocking new research opportunities for collaborative scene understanding.
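
As a sketch of what the shared reference frame enables, the snippet below loads the closed-loop trajectories of two wearers recorded in the same location and measures how far apart their devices are at a given pose. The file paths are placeholders, and the pose field names are assumptions based on the MPS trajectory schema.

```python
# Sketch: two sequences recorded in the same location share one world frame,
# so their closed-loop trajectories can be compared directly.
import numpy as np
from projectaria_tools.core import mps

traj_a = mps.read_closed_loop_trajectory("seq_user_a/closed_loop_trajectory.csv")
traj_b = mps.read_closed_loop_trajectory("seq_user_b/closed_loop_trajectory.csv")

# Take one pose from each wearer; the distance between device positions is
# meaningful only because both poses live in the same world frame.
t_a = np.asarray(traj_a[0].transform_world_device.translation()).ravel()
t_b = np.asarray(traj_b[0].transform_world_device.translation()).ravel()
print("Inter-device distance (m):", np.linalg.norm(t_a - t_b))
```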

A visualization of the sensor calibration from Project Aria glasses.

Online camera calibration

For a high-quality egocentric dataset, it is essential to understand how the cameras perceive the world. Aria Everyday Activities provides full factory calibration as well as per-frame online calibration parameters for every sensor.
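
The sketch below shows one way to read both calibrations with projectaria_tools: the factory calibration embedded in the VRS recording, and the per-frame online calibration produced by Machine Perception Services. The paths, the online-calibration file name, and the record fields are assumptions based on the MPS output layout.

```python
# Sketch: read factory (device) calibration from the VRS file and per-frame
# online calibration from the MPS output. Paths are placeholders.
from projectaria_tools.core import data_provider, mps

provider = data_provider.create_vrs_data_provider("path/to/sequence/recording.vrs")

# Factory calibration: intrinsics and extrinsics for every sensor on the device.
device_calib = provider.get_device_calibration()
rgb_calib = device_calib.get_camera_calib("camera-rgb")
print("RGB focal lengths:", rgb_calib.get_focal_lengths())

# Online calibration: a time-indexed stream of refined camera parameters.
online_calibs = mps.read_online_calibration("path/to/sequence/online_calibration.jsonl")
first = online_calibs[0]
print("Cameras calibrated at the first timestamp:",
      [c.get_label() for c in first.camera_calibs])
```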

Multiple camera trajectories visualized within the same frame of reference.

Multi-device time sync

Sequences captured within the same environment at the same time are precisely time-aligned with sub-millisecond accuracy.
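
A minimal sketch of using that alignment: because concurrent recordings share a timecode clock, frames from two devices can be queried at the same timecode timestamp through projectaria_tools’ TIME_CODE time domain. The file paths and the five-second offset are placeholders.

```python
# Sketch: fetch RGB frames from two concurrently captured recordings at the
# same shared (timecode) timestamp.
from projectaria_tools.core import data_provider
from projectaria_tools.core.sensor_data import TimeDomain, TimeQueryOptions

provider_a = data_provider.create_vrs_data_provider("seq_user_a/recording.vrs")
provider_b = data_provider.create_vrs_data_provider("seq_user_b/recording.vrs")

rgb_a = provider_a.get_stream_id_from_label("camera-rgb")
rgb_b = provider_b.get_stream_id_from_label("camera-rgb")

# Pick a timecode timestamp covered by both recordings (here, five seconds
# into recording A), then query each provider in the TIME_CODE domain so the
# two returned frames were captured at the same wall-clock instant.
t_ns = provider_a.get_first_time_ns(rgb_a, TimeDomain.TIME_CODE) + 5_000_000_000
frame_a = provider_a.get_image_data_by_time_ns(
    rgb_a, t_ns, TimeDomain.TIME_CODE, TimeQueryOptions.CLOSEST)
frame_b = provider_b.get_image_data_by_time_ns(
    rgb_b, t_ns, TimeDomain.TIME_CODE, TimeQueryOptions.CLOSEST)
```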

An image from Project Aria glasses, showing a woman in a house. The woman's face has been blurred to preserve privacy.

Speech-to-text

For sequences where actors speak, we provide speech-to-text annotations, with new tools to query words or sentences at any timestamp. This supports egocentric communications research, such as predicting turn-taking in conversations and multi-speaker transcription.
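
As an illustration of the kind of query the new tools support, the sketch below scans a per-sequence transcript file for the words spoken at a given timestamp. The file name speech.csv and its column names are assumptions about how the transcript is packaged; adjust them to the released schema.

```python
# Sketch: return the transcript entries whose time span contains a query
# timestamp. File name and column names are assumptions.
import csv

def words_at(speech_csv_path: str, query_time_ns: int) -> list[str]:
    """Collect transcript entries that overlap query_time_ns."""
    hits = []
    with open(speech_csv_path, newline="") as f:
        for row in csv.DictReader(f):
            if int(row["start_timestamp_ns"]) <= query_time_ns <= int(row["end_timestamp_ns"]):
                hits.append(row["written"])
    return hits

print(words_at("path/to/sequence/speech.csv", query_time_ns=12_345_678_000))
```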

A visualization of the eye gaze annotation, contained within the Aria Everyday Activities dataset.

Wearer eye-gaze

Using data from Project Aria’s eye-tracking cameras, the Aria Everyday Activities dataset includes an uncalibrated estimate of the wearer’s eye-gaze. This can be used to accelerate research into user-object interactions.
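
For example, here is a minimal sketch of loading the eye-gaze estimates with the projectaria_tools MPS reader and looking up the record nearest to a query timestamp. The file name is a placeholder, and the record fields (tracking_timestamp, yaw, pitch) are assumptions based on the MPS eye-gaze schema.

```python
# Sketch: load the wearer's eye-gaze estimates and find the record closest
# to a query timestamp. File name and field names are assumptions.
from projectaria_tools.core import mps

gazes = mps.read_eyegaze("path/to/sequence/general_eye_gaze.csv")

def nearest_gaze(query_time_ns: int):
    """Return the eye-gaze record whose timestamp is closest to query_time_ns."""
    return min(
        gazes,
        key=lambda g: abs(g.tracking_timestamp.total_seconds() * 1e9 - query_time_ns),
    )

gaze = nearest_gaze(12_345_678_000)
print("Gaze yaw/pitch (rad):", gaze.yaw, gaze.pitch)
```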

PROJECT ARIA TOOLS

New tools to conduct research faster

Specialized tooling allows researchers to more easily load, query, and manipulate AEA data, and integrates with projectaria_tools, an open-source repository for working with Aria data.

New visualization tools allow researchers to easily view two time-aligned sequences at once.

Researchers can get up to speed within minutes with a new Jupyter notebook that guides dataset users through accessing the sensor data, visualizing annotations, and synchronizing timestamps using Timecode.
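
For a flavor of those first steps, here is a minimal sketch of opening one AEA recording with projectaria_tools, listing its sensor streams, and reading an RGB frame. The VRS path is a placeholder, and the snippet assumes the library is installed.

```python
# Sketch: first steps over one AEA recording with projectaria_tools.
from projectaria_tools.core import data_provider

provider = data_provider.create_vrs_data_provider("path/to/sequence/recording.vrs")

# List the sensor streams available in the recording.
for stream_id in provider.get_all_streams():
    print(stream_id, provider.get_label_from_stream_id(stream_id))

# Fetch the first RGB frame as a numpy array, along with its device timestamp.
rgb_stream = provider.get_stream_id_from_label("camera-rgb")
image_data, record = provider.get_image_data_by_index(rgb_stream, 0)
print("RGB frame shape:", image_data.to_numpy_array().shape)
print("Device capture time (ns):", record.capture_timestamp_ns)
```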

VIEW AEA DOCUMENTATION
A screenshot of the visualization tools provided with the Aria Everyday Activities dataset.

Enabling innovation, responsibly

All sequences within the Aria Everyday Activities dataset have been captured using fully consenting actors in controlled environments.

Additionally, faces and license plates have been blurred using human annotation prior to public release.

RESPONSIBLE INNOVATION PRINCIPLES
Sensor data from the Aria Everyday Activities dataset. Faces have been blurred to preserve privacy.

Read the accompanying AEA Research Paper

For more information about the Aria Everyday Activities Dataset, read our paper here.

ARIA EVERYDAY ACTIVITIES RESEARCH PAPER
A screenshot from the Aria Everyday Activities research paper.

Access Aria Everyday Activities and accompanying tools

If you are an AI or ML researcher, access the Aria Everyday Activities dataset and accompanying tools from Meta Reality Labs Research.

By submitting your email and accessing the Aria Everyday Activities dataset, you agree to abide by the dataset license agreement and to receive emails in relation to the dataset.

Subscribe to Project Aria Updates

Stay in the loop with the latest news from Project Aria.

By providing your email, you agree to receive marketing-related electronic communications from Meta, including news, events, updates, and promotional emails related to Project Aria. You may withdraw your consent and unsubscribe from these at any time, for example, by clicking the unsubscribe link included in our emails. For more information about how Meta handles your data, please read our Data Policy.