PROJECT ARIA RESEARCH KIT

Collaboratively supporting the acceleration of AI and ML technology

READ DOCUMENTATION
A researcher from CMU puts on a pair of Project Aria glasses.

Aria Research Kit

For approved research partners, Meta offers a kit that includes Project Aria glasses and SDK, so that researchers can conduct independent studies and help shape the future of AR.

WHAT’S INCLUDED IN THE ARIA RESEARCH KIT?

A powerful suite of tools, services and hardware

The Project Aria program is not just about glasses.


Glasses

Partners are provided with Aria glasses to collect data required for their research.


Client SDK

A powerful tool that provides a direct interface to Project Aria glasses.


Cloud Services

Machine perception services provide researchers with additional annotations and insights derived from collected data.

ARIA RESEARCH KIT CLOUD SERVICES API

Enhanced insights with Machine Perception Services

Approved research partners have access to a variety of cloud-based services provided by Meta, such as 6DoF trajectory and 3D eye-gaze estimation.

These Machine Perception Services enable researchers to harness algorithms and pipelines used internally by Reality Labs Research, allowing partners to focus their energy on what matters most for their research.
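One of the core Machine Perception Services outputs mentioned above is a 6DoF trajectory: a sequence of device poses, each combining a 3D translation with a 3D rotation. The sketch below is purely illustrative of what a 6DoF pose is and how poses chain together; it assumes nothing about the actual MPS output format, and the `Pose6DoF` type and quaternion helpers are invented here for illustration.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    """A 6DoF pose: 3D translation plus rotation as a unit quaternion (w, x, y, z).
    Illustrative only; not the actual MPS trajectory format."""
    t: tuple  # (tx, ty, tz)
    q: tuple  # (w, x, y, z), assumed normalized

def quat_multiply(a, b):
    # Hamilton product of two quaternions.
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def rotate(q, v):
    # Rotate vector v by unit quaternion q: q * (0, v) * conj(q).
    w, x, y, z = q
    qv = (0.0, *v)
    conj = (w, -x, -y, -z)
    _, rx, ry, rz = quat_multiply(quat_multiply(q, qv), conj)
    return (rx, ry, rz)

def compose(world_from_a, a_from_b):
    """Chain two poses: world_from_b = world_from_a * a_from_b."""
    rt = rotate(world_from_a.q, a_from_b.t)
    t = tuple(wt + r for wt, r in zip(world_from_a.t, rt))
    return Pose6DoF(t, quat_multiply(world_from_a.q, a_from_b.q))

# Example: device yawed 90 degrees about z, then stepping one unit
# along its own forward (x) axis lands it at world position (0, 1, 0).
half = math.sqrt(0.5)
world_from_device = Pose6DoF((0.0, 0.0, 0.0), (half, 0.0, 0.0, half))
step = Pose6DoF((1.0, 0.0, 0.0), (1.0, 0.0, 0.0, 0.0))
new_pose = compose(world_from_device, step)
```

Chaining poses like this is how a per-frame trajectory accumulates into a path through a scene, which is the kind of insight these services hand back to researchers.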

READ MACHINE PERCEPTION SERVICES DOCUMENTATION
A screenshot from the visualization tools available to Project Aria partners.
ARIA RESEARCH KIT CLIENT SDK

A versatile toolkit for interfacing with Aria glasses

In addition to the open-source Project Aria Tools which enable researchers to work with Aria data, approved Aria research partners have access to a client SDK to enable real-time interaction between Aria and a secondary device such as a PC or phone.

This SDK exposes device functionality, such as sensor streaming, sequence management, and capture configuration, allowing Aria glasses to be tailored to the needs of any given research project.
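To make the streaming idea above concrete without claiming anything about the real Client SDK's API, here is a minimal sketch of the general subscribe/publish pattern such an interface follows: a client registers callbacks per sensor stream and receives samples as they arrive. The `SensorHub` class and all names in it are invented for illustration and are not part of the Aria SDK.

```python
from collections import defaultdict

class SensorHub:
    """Illustrative stand-in for a device streaming interface.
    NOT the real Aria Client SDK API."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, stream, callback):
        # e.g. stream could be "rgb", "imu", or "eye_gaze".
        self._subscribers[stream].append(callback)

    def publish(self, stream, sample):
        # In a real SDK the device drives this; here we push samples by hand.
        for cb in self._subscribers[stream]:
            cb(sample)

# Collect IMU samples while ignoring other streams.
hub = SensorHub()
imu_samples = []
hub.subscribe("imu", imu_samples.append)
hub.publish("imu", {"t": 0.01, "accel": (0.0, 0.0, 9.81)})
hub.publish("rgb", b"...")  # no IMU callback fires for this stream
```

The point of the pattern is selectivity: a research project that only needs inertial data can subscribe to just that stream, which is what "tailored to the needs of any given research project" amounts to in practice.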

Aria glasses on a desk. The glasses are plugged in via cable.
READ ABOUT OUR RECENT COLLABORATIONS

Driving innovation through partnerships

AR glasses are intended for all-day wear, and a major use case in our daily lives is driving or riding in a moving vehicle. To address this, we have partnered with BMW to explore how this technology could integrate into tomorrow’s vehicles and provide a unique and valuable experience for consumers.

WATCH THE CASE STUDY ON META

To explore accessibility in AR technology and better understand how it can benefit people with varying physical abilities in the future, we started a pilot program in 2020 with Carnegie Mellon University’s NavCog project to build 3D maps of museums and airports. Developed by CMU, NavCog is an audio wayfinding app designed to help people with visual impairments better navigate their surroundings indoors, where GPS signals often don’t reach. CMU has been working on the NavCog project since 2014.

The open-source project has many collaborators around the world. Prior to partnering with Meta, CMU relied on Bluetooth beacons placed around an indoor space to accurately determine the location of a NavCog user within that space. Using the Project Aria device, CMU researchers built a 3D map of the Pittsburgh International Airport and other locations. They could then use that map to train AI localization models running on a mobile phone. This could reduce NavCog’s dependency on the external Bluetooth beacons, inching us closer toward the not-so-distant future where NavCog can be deployed at scale.
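The core idea behind map-based localization, replacing beacons, can be sketched in miniature: given a prebuilt map of landmarks with known positions, a device localizes by matching what it currently observes against those landmarks. The toy example below uses 2D descriptors and nearest-neighbor matching; real visual localization pipelines like the ones CMU trained use learned image features, and every name and value here is invented for illustration.

```python
import math

# Toy prebuilt map: landmark descriptors with known positions.
# Entirely illustrative; real systems use high-dimensional learned features.
landmark_map = [
    {"descriptor": (0.1, 0.9), "position": ("Gate A", 12.0, 3.5)},
    {"descriptor": (0.8, 0.2), "position": ("Baggage claim", 40.0, 7.0)},
]

def localize(query_descriptor):
    """Return the mapped position whose descriptor best matches the query."""
    best = min(landmark_map,
               key=lambda lm: math.dist(lm["descriptor"], query_descriptor))
    return best["position"]

# A query observation close to the "Gate A" landmark localizes there.
place, x, y = localize((0.15, 0.85))
```

Once such a map exists, the phone only needs its own camera observation to estimate where it is, which is what removes the dependency on beacon hardware installed in the building.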

In 2022, FAIR brought together an international consortium of 15 universities to collect and release the world’s largest public dataset of first-person or “egocentric” video of daily-life activities.

Teaching AI to perceive the world as people do requires more than video data.

Select Ego4D consortium universities are expanding their work on egocentric video understanding to include data captured using Project Aria’s sensors, which include stereo cameras, dual inertial measurement units, spatialized microphones, eye tracking cameras, and more. With this effort, we’re excited to bring Project Aria to more researchers around the world, helping the Ego4D project unlock deeper insights into the human experience.

LEARN MORE ABOUT EGO4D

Apply for the Aria Research Kit

If you are a researcher exploring machine perception technologies or their applications, apply for the Aria Research Kit.