A Look at Our Surface EMG Research Focused on Equity and Accessibility
- Wristband technology that uses muscle signals as a form of input can facilitate more inclusive human-computer interactions (HCI) for people with a wide range of neuromotor abilities.
- We continue to invest in external research and partnerships that prioritize equity and accessibility when designing neuromotor systems at the wrist for better HCI.
- By considering everyone, we can help build a truly inclusive new computing platform that puts people—and their diverse lived experiences—at the center.
Today at Connect, we unveiled Orion, which we believe is the most advanced pair of AR glasses ever made. Orion’s input and interaction system builds on our research into surface electromyography (EMG) and includes a wristband that lets people swipe, click, and scroll through content on their glasses—all without the need for controllers.

We’ve continued that research into surface EMG to facilitate inclusive human-computer interactions (HCI) for people with a wide range of neuromotor abilities. Surface EMG uses external sensors around the wrist to detect electrical muscle signals that control the wrist and hand—and that technology opens up an easy-to-use, convenient, and rich new form of HCI. The surface EMG wristband used in Orion is our latest iteration of this technology. Our goal is to eventually bring this technology to consumers in a wristband form factor for effortless HCI on the go.
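For readers curious about what the underlying signal processing can look like, here is a minimal sketch: raw multichannel surface EMG is bandpass filtered and reduced to a per-channel RMS envelope that downstream gesture models can consume. The 16-channel layout, 2 kHz sampling rate, and filter settings are illustrative assumptions, not details of our production pipeline.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def emg_envelope(raw, fs=2000.0, band=(20.0, 450.0), win_ms=100):
    """Bandpass-filter multichannel surface EMG and compute a per-channel
    RMS envelope over non-overlapping windows.

    raw: array of shape (n_samples, n_channels) -- hypothetical wristband data
    fs: sampling rate in Hz (assumed for illustration)
    """
    sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
    filtered = sosfiltfilt(sos, raw, axis=0)

    win = int(fs * win_ms / 1000)
    n_windows = filtered.shape[0] // win
    trimmed = filtered[: n_windows * win].reshape(n_windows, win, -1)
    return np.sqrt((trimmed ** 2).mean(axis=1))  # shape: (n_windows, n_channels)

# Example: 1 second of simulated 16-channel EMG
rms = emg_envelope(np.random.randn(2000, 16))
print(rms.shape)  # (10, 16)
```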
We’re designing this technology with inclusivity in mind, developing wearable devices and input control algorithms that are performant across behavioral, physiological, and motor abilities. Regardless of the size and shape of your hand or how you move it, HCI with an EMG wristband should simply work—and it may even be able to adapt itself to your unique movements over time.
Compared to traditional physical controllers, surface EMG input is inherently more inclusive, enabling functional control for people with atypical anatomy as well as those with limited range of motion. Unlike camera-based systems that rely on visible movements of the hands and fingers—or handheld joystick-based controllers that need to be pressed and pushed—muscle signals at the wrist can provide control even if you can’t produce large movements or if you have fewer than five fingers on your hand.
We’ve partnered with leading external experts to accelerate research that explores how to design this technology so that it works for a diverse group of people.
Supporting External Research Focused on Equity and Accessibility
As part of our work to develop EMG wristbands at scale for consumers, we support external research labs to expand the accessibility potential of neuromotor interfaces. Today, we’re sharing the latest updates from some of the teams we’ve helped fund with academic gifts. With funding provided by our 2021 Request for Proposals (RFP), these teams have all used their own in-house or commercial surface EMG electrodes and software to advance our understanding of how to design a more inclusive EMG system.
At the University of Utah, Dr. Jacob George’s team develops virtual interactions for everyone, including people with hand paralysis. Their research demonstrated that surface EMG signals at the wrist remain viable for control even when the signal-to-noise ratio is reduced, for example following a stroke. Notably, research participants who were unable to extend their physical fingers were able to move all of the fingers of a virtual hand avatar using EMG. This research shows the expressive potential of surface EMG in virtual environments beyond an individual’s physical capabilities.
At the University of Washington, Drs. Jennifer Mankoff and Momona Yamagami (now at Rice University) designed algorithms using surface EMG and movement sensors that allow people with different neuromotor capabilities to choose how they perform a gesture in order to interact with a computer. Research participants included people with limited motion due to muscular dystrophy, spinal cord injury, and other conditions. Through this project, the participants had full freedom to design their own individualized gestures for typical computer actions, such as “pan,” “rotate,” and “zoom.”
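One simple way to support user-defined gestures like these is to let each person record a few examples of their own gesture and then classify new inputs against per-gesture templates. The sketch below uses a nearest-centroid classifier over EMG feature vectors; it is an illustrative assumption about how such personalization could work, not the published method from the University of Washington team.

```python
import numpy as np

class PersonalGestureClassifier:
    """Nearest-centroid classifier over feature vectors (e.g., EMG RMS
    envelopes) for gestures that each user records themselves.
    Hypothetical illustration, not the study's actual algorithm."""

    def __init__(self):
        self.centroids = {}  # gesture label -> mean feature vector

    def enroll(self, label, examples):
        # examples: (n_examples, n_features) recorded by the user for one gesture
        self.centroids[label] = np.asarray(examples).mean(axis=0)

    def predict(self, features):
        labels = list(self.centroids)
        dists = [np.linalg.norm(features - self.centroids[l]) for l in labels]
        return labels[int(np.argmin(dists))]

clf = PersonalGestureClassifier()
clf.enroll("pan", np.random.rand(5, 16))   # five user-recorded examples per gesture
clf.enroll("zoom", np.random.rand(5, 16))
print(clf.predict(np.random.rand(16)))
```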
As a 2023 follow-on to the RFP, we funded Drs. Lee Miller and Jonathon Schofield at the University of California, Davis, who are developing surface EMG algorithms for people of different ages and with different skin characteristics. Since surface EMG sensors are noninvasive and record muscle activity through the skin’s surface, it’s important to consider how characteristics like skin elasticity, hydration, body-mass index, and even the amount of hair on the arm impact muscle signals. The team published initial approaches to develop algorithms that are robust across individuals with different wrist characteristics and continues to pursue this work.
New Accessibility Collaborations with Meta’s Research Prototypes
In 2023, we supported newly launched external projects using our own hardware prototypes and algorithms. Now that our surface EMG research has matured—even informing product prototypes like Orion—we can share our internally designed systems with select external partners who conduct projects under approvals from their local ethics committees to tackle equity- and accessibility-related use cases. This is an exciting evolution of the accessibility projects listed above.
Our wrist-based surface EMG input hardware prototypes are sensitive enough to detect small neuromotor signals, called motor unit action potentials, that are often still present even after the most serious spinal cord injuries. These signals are too small to generate overt movements, but developing systems that use them for HCI can be truly game-changing for individuals with paralysis—imagine being able to control computers with your muscle signals, even if you can’t physically move your fingers. As part of our efforts to evaluate this technology’s potential to serve a wide range of people, we’re collaborating with Dr. Doug Weber’s team at Carnegie Mellon University to demonstrate how these controls could work even for people with complete hand paralysis.
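As a rough illustration of how faint muscle activity can be picked out of a recording, the sketch below flags samples of a bandpass-filtered EMG channel that exceed a multiple of a robust noise estimate. The thresholds, sampling rate, and single-channel setup are assumptions for illustration; this is not the detection pipeline used in our prototypes or in the Carnegie Mellon study.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def detect_small_activations(raw, fs=2000.0, band=(20.0, 450.0), k=4.0):
    """Flag samples whose bandpass-filtered amplitude exceeds k times a
    resting-baseline noise level -- a simple stand-in for detecting faint
    muscle activity on a single channel (illustrative thresholds)."""
    sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
    filtered = sosfiltfilt(sos, raw)
    # Robust noise estimate from the median absolute deviation
    noise = np.median(np.abs(filtered)) / 0.6745
    return np.abs(filtered) > k * noise

# 2 seconds of simulated baseline noise (in volts) plus one small burst
x = np.random.randn(4000) * 5e-6
x[2000:2040] += 50e-6 * np.sin(np.linspace(0, 6 * np.pi, 40))
events = detect_small_activations(x)
print(events.sum())  # number of above-threshold samples
```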
Our university partners have shown that our wristbands, algorithms, and user experiences can guide people with spinal cord injury to produce small muscle signals and use them to control computer-based activities (including gaming and screen navigation)—even on day one of training. These exciting outcomes show the equitable potential of the tech both for typical consumers and for those with paralysis who are currently limited in how they interact with technology and the world around them. The team continues to expand this research program by recruiting new participants and evaluating how control can improve with additional practice.
Tactile controllers—like a keyboard or touchscreen—can be difficult for people with tremor to control. EMG wristbands can overcome the challenges of holding a controller or maintaining consistent contact with a touchscreen by letting people use their hands in free space to swipe, tap, or pinch as best they can, with the system still able to identify their intent. To do this, our internal surface EMG input research focuses on using machine learning models to translate movements—swipes, taps, and even handwriting—into computer interactions like navigation, selection, and text entry. The algorithms are based on a person’s neuromotor signals, rather than on their physical ability to move. This approach could unlock the potential to make computing more broadly accessible, including for those with hand tremors.
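At a high level, a pipeline like this can be sketched as a classifier over EMG envelope features whose predictions are mapped to interface commands. The gesture set, command mapping, and logistic-regression model below are illustrative assumptions, not our production models.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical gesture set and command mapping (illustrative only)
GESTURES = ["swipe", "tap", "pinch", "rest"]
COMMANDS = {"swipe": "navigate", "tap": "select", "pinch": "zoom", "rest": None}

# Train on labeled envelope-feature windows (features as in the earlier sketch);
# stand-in random data replaces real recordings here.
X_train = np.random.rand(200, 16)              # (n_windows, n_channels)
y_train = np.random.choice(GESTURES, size=200)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

def decode(window_features):
    """Map one window of EMG envelope features to a UI command."""
    gesture = model.predict(window_features.reshape(1, -1))[0]
    return COMMANDS[gesture]

print(decode(np.random.rand(16)))
```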
In collaboration with an external research organization, we’re demonstrating how our input control algorithms can work for those who struggle to use handheld devices due to tremor. Through this research program, we’re developing and validating algorithms that translate these gestures into computer commands even when the hand is shaking—and learning how to make the technology even more robust for different levels of tremor severity.
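One hedged example of how gesture decoding can be made more forgiving of tremor is to smooth the decoder’s output over time, only issuing a command once several consecutive windows agree. The debouncing logic below is an illustrative strategy, not the algorithm being developed and validated in this collaboration.

```python
from collections import deque

class GestureDebouncer:
    """Require k consecutive identical gesture predictions before issuing a
    command, so brief tremor-induced misclassifications don't trigger actions.
    Illustrative smoothing strategy only."""

    def __init__(self, k=3):
        self.k = k
        self.recent = deque(maxlen=k)
        self.last_emitted = None

    def update(self, prediction):
        self.recent.append(prediction)
        if len(self.recent) == self.k and len(set(self.recent)) == 1:
            if prediction != self.last_emitted:
                self.last_emitted = prediction
                return prediction  # stable gesture: emit the command once
        return None  # not yet stable, or already emitted

deb = GestureDebouncer(k=3)
for p in ["swipe", "tap", "swipe", "swipe", "swipe"]:
    print(p, "->", deb.update(p))  # only the final "swipe" emits a command
```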
Across all of these studies, we’re incredibly excited to report promising early outcomes, and we look forward to learning more as we continue to focus on developing a new technology that considers everyone. Stay tuned for future updates as we continue this work.


