Engineers at Northwestern University have unveiled MobilePoser, a groundbreaking system that enables real-time, full-body motion capture using mobile devices such as smartphones, smartwatches, and earbuds. Unlike traditional systems, which require specialized equipment and dedicated rooms, MobilePoser leverages the sensors already built into consumer devices. By combining sensor data with advanced AI and physics-based optimization, the system achieves full-body pose tracking and 3D translation estimation with an error of just 8–10 centimeters.
This innovation not only reduces the cost and accessibility barriers of motion capture but also opens doors for applications in gaming, fitness, and healthcare. For example, the app could help users improve exercise form, assist physicians in mobility analysis, and even enhance indoor navigation. To foster further development, the researchers have released their code and pre-trained models as open-source software, with plans for compatibility with iPhone, AirPods, and Apple Watch.
FAQ:
What is MobilePoser and how does it differ from traditional motion capture systems?
MobilePoser is a system and accompanying app that performs real-time, full-body motion capture using everyday devices such as smartphones, smartwatches, and wireless earbuds. Unlike traditional motion capture solutions, it does not require expensive camera systems, special suits, or dedicated studios; instead, it relies on the built-in sensors of consumer devices.
How accurate is MobilePoser, and how does it achieve this level of precision?
MobilePoser tracks full-body posture and 3D movement with an error of 8–10 centimeters. It achieves this by combining acceleration and orientation data from the inertial measurement units (IMUs) in the phone, watch, and earbuds with artificial intelligence models and physics-based optimization, which filter out sensor noise and produce physically realistic, continuous motion.
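The pipeline described above (noisy IMU streams, a learned model, then physical constraints) can be illustrated with a deliberately simplified sketch. Everything here is a hypothetical stand-in, not MobilePoser's actual implementation: an exponential filter plays the role of the learned denoising, and a velocity clamp plays the role of the physics-based optimization.

```python
# Toy 1-D sketch of an IMU-fusion pipeline: smooth noisy acceleration,
# then integrate it into displacement under a physical speed limit.

def low_pass(samples, alpha=0.2):
    """Exponential moving average to suppress IMU noise.

    Hypothetical stand-in for the denoising a learned model would perform.
    """
    state = samples[0]
    filtered = []
    for s in samples:
        state = alpha * s + (1 - alpha) * state
        filtered.append(state)
    return filtered

def integrate_translation(accel, dt=0.01, max_speed=3.0):
    """Double-integrate 1-D acceleration (m/s^2) into displacement (m).

    Clamping velocity to a plausible human speed is a toy version of the
    physics-based constraints that keep the estimated motion realistic.
    """
    velocity, position, positions = 0.0, 0.0, []
    for a in accel:
        # Limit speed before integrating position.
        velocity = max(-max_speed, min(max_speed, velocity + a * dt))
        position += velocity * dt
        positions.append(position)
    return positions
```

Feeding one second of noisy, roughly constant 1 m/s² acceleration through these two steps yields roughly half a metre of displacement, as basic kinematics predicts; the real system fuses three 3-D sensor streams and estimates full-body pose, not just translation.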
What are the practical use cases of MobilePoser?
MobilePoser can be used in many areas: in gaming, it can enable immersive, full-body control; in fitness, it can help check exercise form and improve posture; in healthcare, it can support doctors and therapists in analyzing mobility, gait, and movement patterns, and it can also serve as a basis for indoor navigation solutions. The researchers have released the code and pre-trained models as open source and plan to support iPhone, AirPods, and Apple Watch as well.
