A wearable computer in a glasses form-factor
Project Aria glasses utilize groundbreaking technology to help researchers gather information from the user’s perspective, contributing to the advancement of egocentric research in machine perception and augmented reality.
Project Aria is used by Meta and research partners to explore a wide variety of use cases and proofs of concept for research across AR and AI.
We design our products with a privacy-first approach.
For Project Aria, Meta is committed to providing people with visibility into what information is being captured, clearly signaling when capture is occurring, and creating clear processes to protect the information that is recorded. Participants will only record in Meta offices, wearers’ private homes (with consent from all members of the household), or public spaces, and won’t record in private venues without written consent from those venues. When the device is collecting data, it will display a white light to let people know it’s recording. Before any information gathered in a public place is made available to our researchers, it will be automatically scrubbed to blur faces and vehicle license plates.
AR devices and experiences will eventually enable deep connections between people and the things that matter most to them, providing more utility and information while decreasing the time spent looking down at various devices. Our approach to building an AR ecosystem will always put people before opportunity. See our responsible innovation principles for more information about how we’re building products with a privacy-first approach.
For the broader research community, Meta offers a kit that includes the Project Aria glasses, tools, and services needed to conduct independent studies and help shape the future of AR.
Frequently Asked Questions
We want individuals who encounter a Project Aria participant in public to understand that the participant is collecting data. When the device is collecting data, it’ll display a white light to let people know it’s recording.
Data collectors have training to respect the privacy norms and conventions of the country they are recording in.
Communication about where and when data collections occur must comply with Project Aria Research Community Guidelines.
We’ll continue to evaluate our policies and the indicators we provide to help make sure we’re respecting people’s privacy and being transparent about what Project Aria is and what the research device does.
Safeguards in the Research Community Guidelines include: making sure people get appropriate notice that their data may be captured; providing contact details to people who want more information; recording in a private venue or building only if official permission has been granted; and not recording sensitive activities or in sensitive places.
In the UK and European region, research participants will wear a “recording in progress” vest and a badge on a lanyard, and carry a flyer with QR codes and URLs directing anyone who is interested to a website for more information. In all countries, research participants will have a digital badge in the Aria mobile app that includes QR codes and URLs to direct bystanders to more information.
Since the initial launch of Project Aria in September 2020, we have gradually expanded in-public data collection to locations beyond the U.S.
As we announced in September 2021, some Meta employees and contractors have started in-public data capture in a handful of public landmarks in Singapore.
Additionally, in 2021, some employees and contractors based in the UK, EU, Switzerland, Singapore, Canada and Israel captured data in their own homes with agreement from all members of the household.
Meta employees based in England and Ireland began to capture data in public places in those countries, including in Meta offices, where approved.
Last updated: April 2023
Data gathered for Meta in public places will be automatically scrubbed to blur faces and vehicle license plates. In the UK and European region, participants recording in public must wear highly visible vests and badges on lanyards, and carry flyers, all clearly noting that recording is in progress. In all regions, participants will have a digital badge they can show bystanders, directing them to more information about the research project and privacy policies.
All research participants receive training before they are allowed to collect data, covering recording restrictions and the privacy norms of the places where they record.
Prior to uploading captured sequences to Meta services for face and license plate blurring, recorded data is temporarily stored on the Project Aria glasses. The data on the glasses is encrypted so it cannot be accessed by others, and it is handled in compliance with our Project Aria Research Community Standards.
Research participants upload the data to Meta’s secure storage, where it is kept in quarantine for 72 hours before being transferred to separate back-end storage. The only exception to the quarantine period is data captured in fully consented environments, where only Meta employees and contractors who have consented to such capture are present. This quarantine provides extra time during which participants can delete the data before any researcher can use it.
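The quarantine rule above can be sketched as a simple timestamp check. This is an illustrative sketch only; the function name and flags are hypothetical and do not reflect Meta’s actual storage systems:

```python
from datetime import datetime, timedelta, timezone

QUARANTINE = timedelta(hours=72)  # hold period described in the FAQ

def researcher_access_allowed(uploaded_at, now, fully_consented=False, deleted=False):
    """Hypothetical gate mirroring the quarantine rule: data becomes
    available to researchers only after 72 hours, unless it was captured
    in a fully consented environment; a deletion by the participant
    during the window blocks access entirely."""
    if deleted:
        return False          # participant removed the capture
    if fully_consented:
        return True           # the stated exception to the hold
    return now - uploaded_at >= QUARANTINE

t0 = datetime(2023, 11, 1, tzinfo=timezone.utc)
print(researcher_access_allowed(t0, t0 + timedelta(hours=24)))  # False: still in quarantine
print(researcher_access_allowed(t0, t0 + timedelta(hours=80)))  # True: hold has elapsed
```
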
Meta uses a secure ingestion system to upload the data from the Project Aria device to separate, designated back-end storage systems.
Last Updated: November 2023
The device does not use facial recognition technology, and we do not connect information about bystanders captured using the research device’s sensors to any social media accounts. The glasses do not display any information on the inside of the lens, and research participants cannot access the raw data captured by the device.
No. Project Aria is a research project intended to help us understand what hardware and software are needed to build AR glasses.
We are hoping to collect sufficient high-quality egocentric data to help us understand the hardware and software needed to build real, working AR glasses.
In this research phase, all of the data we collect is stored on separate back-end servers. We have controls in place to help ensure that data is only accessed by authorized researchers who require access to fulfill necessary responsibilities. We’re committed to keeping this information secure.
The Project Aria glasses are not a consumer product or a prototype, and they will not be for sale. They won’t display any information on the inside of the lens, and research participants cannot view or listen to the raw data captured by the device. As a research device, the glasses are meant to help us understand the hardware and software needed to build AR glasses.
There are 4,000 Project Aria devices in the world, used by Meta as well as by external research partners.
Meta research participants collect data in public in the USA, UK, Singapore and Ireland. Meta research participants record in their private homes with agreement from all members of the household in the rest of the EU, Switzerland, Canada and Israel.
We collaborate with external research partners in several countries worldwide including the USA, UK, Switzerland, India, Canada, Singapore, Colombia and Japan.
Last updated: September 2023
Meta employees can choose to participate in the program if they are interested. As always, non-participants have the right to ask that recording stop if they are not comfortable, and the ability to ask that the relevant data be deleted. All research participants will be provided with training so they are mindful of recording restrictions.
Industry and academic partners alike are required to abide by our Project Aria Research Community Guidelines. These guidelines are a set of requirements and best practices that mirror Meta's own Project Aria privacy requirements (e.g. ensuring it is clear to bystanders that recording is taking place and blurring personally identifiable information such as faces and license plates). Adherence to these guidelines is necessary, as they ensure that any research done with Project Aria meets its privacy and safety requirements.
To support research partners in blurring data, we’ve made EgoBlur open source. EgoBlur is an AI privacy model designed to detect and blur faces and license plates in color and grayscale images.
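As an illustration of what region blurring amounts to, here is a minimal pure-Python sketch that averages out the pixels inside detector-supplied bounding boxes. The (x, y, w, h) box format and single-channel image are assumptions made for the example; this is not EgoBlur’s actual interface:

```python
def blur_regions(image, boxes):
    """Return a copy of `image` (a list of pixel rows) with each
    (x, y, w, h) box replaced by its average value -- a stand-in for
    the face and license-plate blurring a detector would drive."""
    out = [row[:] for row in image]  # copy so the original is untouched
    for x, y, w, h in boxes:
        # Collect every pixel inside the box and compute its mean.
        pixels = [out[r][c] for r in range(y, y + h) for c in range(x, x + w)]
        avg = sum(pixels) // len(pixels)
        # Overwrite the box with the mean, destroying identifiable detail.
        for r in range(y, y + h):
            for c in range(x, x + w):
                out[r][c] = avg
    return out

frame = [[r * 8 + c for c in range(8)] for r in range(8)]  # toy 8x8 image
scrubbed = blur_regions(frame, [(2, 2, 4, 4)])
```

A real pipeline would take the boxes from a detection model and apply a stronger blur, but the principle is the same: pixels inside detected regions are made unrecoverable before the data is used.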
In addition to abiding by our Community Guidelines, each university partner will be responsible for complying with standards from institutional research ethics committees or review boards.
Our intention is for partners to store and manage this data themselves. However, partners will have the option to use Meta’s Machine Perception Services for generating derived data. Meta won’t use this data unless the partner agrees to let Meta use it. Go to MPS Data Processing in our documentation wiki for more details about how partner data is processed.