Aria Gen 2 Applications Are Now Open
Since its launch in 2020, Project Aria has enabled researchers across the world to investigate how to bring the human perspective to AI. It has supported the development of breakthrough egocentric datasets for benchmarking large language models and vision language models, as well as new paradigms for training robot policies from human data. Nearly 300 academic laboratories in 27 countries have used over 1,000 Aria Gen 1 devices for egocentric research across the fields of contextual AI, human-computer interaction, and robotics. Still more have used the open-source egocentric datasets and models that Meta has developed using Project Aria. In February 2025, we announced Aria Gen 2, bringing a step change in sensing capability, wearability, interactivity, and on-device computing. Today, we’re excited to announce that applications are now open for researchers to receive Aria Gen 2 devices, targeting a broad rollout to qualified applicants in Q2 2026. We provide separate application forms for academic researchers and corporate researchers. Aria Gen 1 applications close today, and we will process the backlog of applications already submitted.
Aria Gen 2 Device Whitepaper and Pilot Dataset
We’re also providing two key resources for the community to more deeply understand Aria Gen 2: the Aria Gen 2 device whitepaper and the Aria Gen 2 Pilot Dataset. The device whitepaper provides a technical deep dive into the capabilities of the device, including its ergonomic hardware design for all-day wearability, details of each of the many diverse sensors on the device, specifics of Meta’s custom co-processor for ultra-low-power on-device machine perception computing, and the functionality of the accompanying SDK. The paper thoroughly compares Aria Gen 1 with Aria Gen 2 for a holistic view of how this new-generation device will transform what the research community is capable of producing.

The Aria Gen 2 Pilot Dataset is a canonical dataset for the Aria Gen 2 device that aims to depict how it can be used in various research applications. It provides a tangible example of the type and quality of data the device produces, which researchers can load, visualize, and analyze. The dataset features a subject wearing Aria Gen 2 glasses, recording her daily activities throughout a staged weekend in controlled environments with other consenting participants also wearing Aria Gen 2 glasses. Machine perception signals are computed on the Aria Gen 2 device itself, demonstrating the quality of the hand tracking, eye tracking, and VIO signals the device can produce. Additional machine perception data, such as semi-dense point cloud maps of the environment, are computed by Machine Perception Services and provided with the dataset. The dataset illustrates how Aria Gen 2 can be extended into various application areas by providing the output of example algorithms run on the Aria Gen 2 data, including some of Meta’s own research algorithms for hand-object interaction detection, speaker diarization, and 3D bounding box detection, as well as NVIDIA’s FoundationStereo for depth estimation. Specialized tools and notebooks are provided alongside the pilot dataset to enable researchers to easily explore what the dataset has to offer.

See More with the Aria Team at the ICCV Conference
If you’re at the International Conference on Computer Vision starting October 19 in Honolulu, you’ll have many opportunities to meet with the Aria team and learn more at workshops, tutorials, and invited talks. We have live demos of Aria Gen 2 at the Meta booth on the main floor, where you can see the device streaming with machine perception in real time. We’re excited to partner with NVIDIA in several ways to show the power of Aria Gen 2 and FoundationStereo together as an egocentric wearable depth sensing system. As mentioned, FoundationStereo has been run on all recordings in the Aria Gen 2 Pilot Dataset, which you can explore today. We’ll give a joint talk with NVIDIA at The Fourth Hands-on Egocentric Research Tutorial on the morning of Monday, October 20, describing the combined value of these technologies. Finally, we’ll have a dedicated live demo of Aria Gen 2 streaming to FoundationStereo at the Meta booth. Please visit the booth to learn more and explore the ways you can use Aria Gen 2 with FoundationStereo for depth in your research.

What to Do Next
We encourage interested researchers to:
- Read the Aria Gen 2 device whitepaper
- Download the Aria Gen 2 Pilot Dataset
- Formulate a clear and detailed vision of the kind of research that you intend to do with Aria Gen 2
- Fill out a high-quality Academic or Corporate application for consideration to become an Aria Gen 2 partner — please do this even if you previously filled out an Interest Form for Aria Gen 2
We look forward to seeing how the community pushes the frontier of human-centered AI with the next generation of Project Aria!
