‘Alo Moves XR’ Brings Pilates, Yoga, and Meditation to Life on Meta Quest
Whether you want to engage your core with Pilates, calm yourself through meditation, or strengthen your mind-body connection with yoga, Alo Moves is ready to welcome you into your wellness era. The movement and mindfulness brand has taken the home fitness world by storm with its award-winning on-demand classes, and now it’s kicking things up a notch with the launch of Alo Moves XR on the Meta Quest platform.
Developed by Magnopus, Alo Moves XR uses cutting-edge volumetric capture to bring your 3D instructor to life in mixed reality, while room mapping and object detection let you practice safely as you get the studio experience from home. Want to perfect your form? Walk up to the life-sized 3D instructor for a closer look from any angle or lean on two “mini-instructors” that you can reposition anywhere in your space.

Take in the sights of destinations including Spain, Norway, and Thailand, or immerse yourself in one of three serene environments for an unparalleled escape.
Alo Moves XR launches with 32 immersive classes led by top instructors like Ashley Galvin, Annie Landa (aka Annie Moves), Bianca Wise, Kirat Randhawa, and Susy Markoe Schieffelin. Members can also tap into new mixed reality yoga and Pilates classes each month, plus weekly meditation and sound bath sessions. Later this year, Alo Moves XR will add new instructors and expand its offerings, including 20-minute yoga sessions, quick toning and sculpting classes, Briohny Smyth’s and Josh Kramer’s yoga fundamentals, evening reset stretching, breathwork, and more.
We sat down with Magnopus Director of Virtual Production AJ Sciutto to talk shop and celebrate today’s launch.
AJ Sciutto: I am LA-born and raised and have lived here my whole life. I have deep connections to the film industry in LA—it’s part of my DNA. My uncle and my great-uncle both worked in film and were nominated for multiple Oscars for sound mixing. I went to sets with my uncle, stood in the corner as a little kid, and watched Arnold Schwarzenegger blow some shit up on Terminator.

AS: I got a chance to see behind the curtain a little bit, which got me interested in technology because there’s always a technological aspect in the film world. And so during college, I always had my own camera in my hands and was super interested in how the film world blended with technology. Once I graduated, I was lucky enough to get a job at Sony Pictures Imageworks for about eight years doing visual effects. It was a nice blend of technology and filmmaking.
My time there gave me a grounding in artistic generation, how to manage people, how art comes to fruition, and what the creative process is—how technology helps support the creative vision. And after eight years, I was looking for a new opportunity to blend those two worlds on a more technology-focused side. I found Magnopus, and they were very much a startup using creativity and tech to do innovative work in the VR world. I jumped over here about eight and a half years ago, and I’ve been here ever since. I helped the company grow from about 25 people when I started to over 250 now.
AS: Virtual Production is this umbrella term for how real-time game engine technology functions within a filmmaking pipeline. That could be a bunch of different things, and we’ve been involved in almost all of them, of course, across the last seven years or so.
When I joined Magnopus as a Producer, my first project was actually a project with Meta and NASA. It was Mission: ISS. That was great for me because it unlocked the engineering side of my brain. I was only ever dealing with artists beforehand, and with Mission: ISS, I got to produce engineers and designers and artists at the same time. And that really resonated with me—how all those things intersect. Shortly thereafter, we got the opportunity to work on The Lion King. We built the virtual production system for The Lion King and shot every single shot of that movie with Jon Favreau, Caleb Deschanel, and Rob Legato on a stage that we built and designed.
That opened up my eyes to the possibilities of how to drive technology forward in an effort to satisfy the needs of the creatives. Most of the time, that’s filmmakers, but nowadays, it’s anyone from company executives to fitness brands to whoever.
At this point, I’ve transitioned from primarily producing to focusing more on biz dev and executive leadership, but I’m still very much ingrained in every project. My role is now mostly to support the creative, production, and engineering teams actually doing all the hard work of building each of our experiences. The teams we have running development are fantastic, handling everything on the project: engineering, design, volumetric capture, integration, QA, overall creative vision, and production. The less I have to do, the more proud I am of the team running point.

AS: We were approached through our connections over at Meta for a meet and greet with the Alo team. They were impressed with some of our recent projects, like our real-time environment build-out for Westworld Season 4. We quickly got into conversations figuring out how a development schema would work for this project. I wanted to make sure that they felt like we were not just a vendor doing the work for them—but that we were a partner working together, that we were going down this road together, were invested in this creatively, and wanted to make the best product with them.
And so very early on we suggested doing what I called the “spike test.” This involved trying three different methodologies for capturing the instructor and figuring out together which one we liked best. We were transparent about the process and put all the pros and cons on the table with them to decide which method worked for the Alo Moves brand and Meta. Which approach pushed the technology forward, but also felt approachable and tangible? I think that kind of trust building through collaboration is what got us started on the journey of working together.
AS: It was actually easier than most would think because at our core, Magnopus is an applied research and development company that’s focused on bridging the gap between physical and digital. There are a lot of physical experiences that are phenomenal, and there are a lot of digital experiences that are phenomenal. But there’s not a ton that marry those two worlds into something that is really joyful.
For 11 years now, we’ve been focused on bridging that gap. The team we put together to develop this experience has a higher percentage of athletes than our normal technology projects do, and it’s that familiarity with fitness that made them so personally invested in the project. In the digital fitness experiences we tested, there was always something lacking: a lack of connection between me and the instructor, and a lack of connection in how I interact with the world around me. A lot of that fed into the design methodology we employed and how we approached this app—how would an athlete approach this? What feels good?
All the lessons we learned from working with mixed reality and virtual reality—such as interaction mechanics, user interfaces, and movement mechanics—laid the foundation for our approach. This groundwork gave us a clear mindset on how to build the experience around what was first and foremost an instructor-led class. Once we recognized that the key was connecting with the digital representation of a performer, actor, or instructor, everything started to fall into place and made a lot more sense. Of course, there was still a lot of work to do, and our amazing development team spent day-in and day-out obsessing over every interaction, every piece of design, and the color path of every pixel.

AS: We worked with volumetric capture, which is a technology we’ve been using on this platform for a while. Metastage is the company we’ve been using to do vol-cap, and we fully believe they are the best in the biz. We’ve had a long relationship with them, using them for earlier Oculus-based experiences like Blade Runner 2049: Memory Lab. So we have knowledge and background in working with that technology and with vol-cap assets specifically: how they integrate with virtual environments, how to light them, how color management needs to work inside of the world, and how to interact with them.
Are you interacting directly with the vol-cap? Are you interacting through a menu system? What does that menu system look like? How does that interaction feel? How do I use my hands? Magnopus made Elixir, the demo experience for Meta’s hand tracking technology, and that gave us a ton of insight and practice in building joyful, intuitive, and engaging products that users just get.
And so using hand interactions and figuring out what feels good with those mechanics, all that institutional knowledge about how to develop those worlds, fed into the production of the Alo experience. Our creative director Jesse worked closely with our engineers Tahnee, Dylan, and Lorena to build an experience that felt comfortable and intuitive, but also aligned with Meta’s design language for hand tracking. I think first and foremost when doing fitness, especially instructor-led fitness, there’s a lot of give and take in copying what that instructor is doing from a movement perspective. You’re always looking at the instructor because you’re basing your movement on them.
But in a headset-based experience or a non-IRL classroom experience, where you don’t have others around you, you lose some of that.
And so to navigate that, we added mini-instructors around the user. While the big instructor is up in front of the class, in front of your field of view, doing the instruction, these mini-instructors help make the experience more comfortable: when you’re in a position, you don’t have to strain your neck in an awkward way to see the big instructor. You can reposition the smaller mini-instructors wherever your head is looking, so you can still base your movement on what they’re doing.
And the great thing about volumetric capture is that it’s a full three-dimensional form of the body. It’s not the perspective of the camera—it’s the full perspective, so you can rotate them around and see the angle of the ankle in line with the knee that’s on the back supporting leg, as opposed to the way the ankle or the wrist or the shoulder is positioned on the front arm. And it’s all there for you to study and increase the effectiveness of your practice. So those were some of the things that we really leaned on to build the experience.
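For a rough sense of the placement logic behind those repositionable mini-instructors, here’s a minimal, engine-agnostic sketch in Python. It’s an illustration under assumed parameters, not Alo Moves XR’s actual code: the function name, the 0.8-meter comfort distance, and the flat-floor assumption are all hypothetical.

```python
# Illustrative sketch only -- not the actual Alo Moves XR implementation.
# Places a "mini-instructor" along the user's horizontal gaze direction,
# a comfortable distance away, resting on the floor plane (y = 0).

from dataclasses import dataclass
import math

@dataclass
class Vec3:
    x: float
    y: float
    z: float

def place_mini_instructor(head_pos: Vec3, gaze_dir: Vec3,
                          distance: float = 0.8) -> Vec3:
    """Return a floor-level anchor point `distance` meters ahead of the user.

    The 0.8 m default is an assumed comfort value, not a tuned product number.
    """
    # Project the gaze onto the horizontal plane so the anchor stays level
    # even when the user is looking up or down mid-pose.
    flat_x, flat_z = gaze_dir.x, gaze_dir.z
    length = math.hypot(flat_x, flat_z)
    if length < 1e-5:
        # Looking straight up or down: fall back to placing it ahead on +z.
        flat_x, flat_z, length = 0.0, 1.0, 1.0
    # Normalize, step out from the head position, and drop to the floor.
    return Vec3(head_pos.x + flat_x / length * distance,
                0.0,
                head_pos.z + flat_z / length * distance)

# Example: user at 1.6 m eye height, glancing down and to the right.
print(place_mini_instructor(Vec3(0.0, 1.6, 0.0), Vec3(0.4, -0.7, 0.9)))
```

In practice an engine would also face the mini-instructor toward the user and snap it to the detected floor height, but the core idea is the same: keep the small reference model in the user’s natural line of sight for the current pose.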
In terms of how this departs from what we’ve done previously: the film world is mostly about take-based capture. This is more similar to the live-to-tape broadcast world, where you’re capturing the performance once and that’s the performance you’re going to use. So making sure that performance is rock solid on the capture date, making sure that everyone’s happy with what’s being said and what’s being instructed—that’s where our partners at Metastage worked closely with Jesse Warfield, our creative director, and Chuy Brambila, our producer, to ensure each capture was technically sound, while the amazing team from Alo made sure it was creatively sound.

AS: That was a no-brainer for us when it came to figuring out which tech to use, mostly because it centers around the safety of doing motion-based fitness instruction. And we wanted to make sure that when you’re in the headset experience, when you’re looking down at your own body, you can see exactly what your body is doing, not just imagining it from a behind-the-screen kind of perspective.
We had to use the mixed reality features of the Quest 3 headset to let the user see their own hands and body as they mimic the instructor in the virtual world, but also to ensure that if the user’s dog or child came through the living room, or if the coffee table got too close, the user could clearly see those things and continue with a safe practice. That’s why we decided to go mixed reality.
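As a rough illustration of that safety idea (a hypothetical sketch, not the shipping logic), an app with room-mapping data could simply check whether any mapped obstacle sits inside an assumed safe radius around the mat and surface a gentle warning:

```python
# Hypothetical sketch of an obstacle check driven by room-mapping data.
# The data model, labels, and 1.5 m radius are assumptions for illustration.

from dataclasses import dataclass

@dataclass
class Obstacle:
    label: str   # e.g. "coffee table", as reported by scene understanding
    x: float     # horizontal offset from the mat center, in meters
    z: float

def obstacles_in_practice_area(obstacles: list[Obstacle],
                               safe_radius: float = 1.5) -> list[str]:
    """Return the labels of obstacles that intrude on the safe radius."""
    return [ob.label for ob in obstacles
            if (ob.x ** 2 + ob.z ** 2) ** 0.5 < safe_radius]

room = [Obstacle("coffee table", 1.1, 0.3), Obstacle("sofa", 2.4, -1.0)]
warnings = obstacles_in_practice_area(room)
if warnings:
    print("Heads up, these are close to your mat:", ", ".join(warnings))
```

With passthrough on, the user can already see the coffee table; a check like this is just one more way an app could nudge them before a pose sends an arm into it.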
AS: Oh, it’s definitely the mini-instructors. The mini-instructors are great. The joy and delight you get when you can pinch and pick up this little photorealistic human and then rotate them around to see the perspective that you need to study to do your movement—you’ve never had that kind of superpower IRL, and every time you see somebody do that in the headset, you see a big smile on their face. It’s one of those a-ha moments that justifies and gives value to doing fitness-based experiences in VR. There’s also the comfort and the privacy aspect of doing it in your own home. Then you add the superpowers of having these mini-instructors and these additional tools you could never have in real life, and I think that moves the needle over to why this is going to be so cool and so impactful.
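The pinch-and-rotate interaction he describes comes down to simple math once hand tracking supplies a pinch state and a hand position. The sketch below is illustrative only, with assumed inputs and naming: while the pinch is held, the mini-instructor’s yaw follows the change in the hand’s angle around it.

```python
# Illustrative-only sketch of a pinch-grab rotation: while the user pinches,
# the mini-instructor's yaw follows the hand's angle around the model.

import math

def yaw_while_pinched(current_yaw_deg: float,
                      prev_hand_xz: tuple[float, float],
                      hand_xz: tuple[float, float],
                      instructor_xz: tuple[float, float]) -> float:
    """Add the frame-to-frame change in the hand's angle to the model's yaw."""
    def angle_to(point):
        return math.degrees(math.atan2(point[0] - instructor_xz[0],
                                       point[1] - instructor_xz[1]))
    delta = angle_to(hand_xz) - angle_to(prev_hand_xz)
    return (current_yaw_deg + delta) % 360.0  # keep the result in [0, 360)

# Example frame: the hand sweeps about 45 degrees around the model,
# so the model turns by the same amount.
print(yaw_while_pinched(0.0, (1.0, 0.0), (0.7, 0.7), (0.0, 0.0)))
```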
AS: We started our first meetings towards the end of last summer, and we started development in earnest in the fall of last year. So it’s been almost a full year in development, which is arguably somewhat fast for developing a full-blown fitness application with user profiles, recommendation engines, class tracking, and all that good stuff.
AS: I think one of my favorite memories was presenting the spike test to Alo and Meta. It took about three or four weeks to do the spike test. We used three different capture methodologies: traditional 2D video capture, 3D video capture, and then volumetric capture. We put them all into a very simple headset experience. We were all familiar with 2D video capture: it was the true-to-life form, pixel-accurate to the performance. But it was one perspective, so it didn’t really give us the tech-forward nature we wanted.
The 3D video added depth, movement, and stronger interpersonal dynamics between the user and the instructor. However, it still lacked the ability to offer more than users could typically experience on platforms like YouTube.
With the vol-cap, when we put Tash Trindall, the Alo Moves General Manager, in the headset and let her walk around the volumetric human for the first time, it was one of those a-ha moments—there was no going back. Once we saw this, it was clear that none of the other options would provide us a unique advantage in making an experience effective and fun. And I think that sentiment was echoed both at Alo and at Meta. Everyone was very excited to see vol-cap used for this unique use case.
AS: We had a lot of success in our most recent virtual production project, which was the TV show Fallout on Prime Video. We’re now actively in development on Season 2. Working with that team again is very exciting for us.
We’re also cautiously exploring the world of AI and figuring out how AI can have an ethical place in the content creation pipeline. We see AI as a tool that could assist in getting to our creative end goal. The narrative realm is shared across both linear and nonlinear media, and AI is going to be a tool that really helps us bridge that gap by making hyper-localized immersive experiences. Instead of making one experience for five million people, we’re exploring how to make five million individual experiences for five million different audience members, localized and customized to each user’s personal interests. That’s what we’re excited about working on next.
AS: You know, I hear people ask, “Why would I do fitness in a VR app?” And my response is that it’s a lot more fun and a lot more comfortable than it may seem from the outside. It’s really personal, and it’s connected, and it’s beautiful, and it’s private. If you haven’t tried it, give it a shot.
But mostly, I want to give credit where credit is due. Our development team spent the last year making a product that we couldn’t be more proud of. Countless internal meetings reviewing scope, UI/UX flow, design language, interactions, color science, streaming technologies, QA, backend development—all while making sure the fantastic team at Alo Moves was aligned with what we were doing. They deserve all the credit and recognition for making what we feel is a really great experience, one we hope all of you enjoy as well.
Creative Director: Jesse Warfield
Producer: Chuy Brambila
Lead Engineer: Tahnee Smith
Engineer: Dylan Markowitz
Engineer: Lorena Rother
Engineer: Garrett Hickey
Engineer: David Mazzocco
Engineer: Rittikorn Tangtrongchit
Engineer: Noah Pinales
Lead UI Designer: Soojin Jun
UI Designer: Teresa Fitzgerald
UI Designer: Jiyoung Han
Technical Art Director: David Dalzell
Art Director: Emerick Tackett
FX Artist: Justin Dykhouse
FX Artist: Daniel Naulin
3D Generalist: Zi Meyer
Experience Designer: Raegan Brown
QA Analyst: Matthew Alcala
QA Analyst: Katie Lafaw
QA Analyst: Yardan Cohen
Release Manager: Peter Geiss
Executive Producer: AJ Sciutto
Executive Producer: Daisy Leak
Creative Executive: Alex Henning
Step inside a new world of wellness with Alo Moves XR today.


