We introduce a novel motion capture system that reconstructs full-body 3D motion using only sparse pairwise distance (PWD) measurements from body-mounted ultra-wideband (UWB) sensors. Using time-of-flight ranging between wireless nodes, our method eliminates the need for external cameras, enabling robust operation in uncontrolled and outdoor environments. Unlike traditional optical or inertial systems, our approach is shape-invariant and resilient to environmental constraints such as lighting conditions and magnetic interference.
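To make the input representation concrete, the following is a minimal sketch of how noisy, partially observed PWD measurements between body-mounted nodes might be simulated. The sensor layout, noise level, and dropout rate are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Hypothetical body-mounted node positions in meters (illustrative only).
sensors = np.array([
    [0.0, 1.6, 0.0],   # head
    [-0.7, 1.2, 0.0],  # left wrist
    [0.7, 1.2, 0.0],   # right wrist
    [-0.2, 0.0, 0.0],  # left ankle
    [0.2, 0.0, 0.0],   # right ankle
])

def pwd_matrix(points, noise_std=0.05, dropout=0.2, seed=None):
    """Simulate a noisy, partially observed pairwise-distance matrix,
    mimicking UWB time-of-flight ranging: one noisy range per node pair,
    with a fraction of measurements dropped (marked as NaN)."""
    rng = np.random.default_rng(seed)
    n = len(points)
    d = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    noise = np.triu(rng.normal(0.0, noise_std, (n, n)), 1)
    d = d + noise + noise.T            # symmetric: one noise sample per pair
    drop = np.triu(rng.random((n, n)) < dropout, 1)
    drop = drop | drop.T               # drop both directions of a pair
    d[drop] = np.nan                   # missing measurement
    np.fill_diagonal(d, 0.0)
    return d

D = pwd_matrix(sensors, seed=0)
```

A matrix like `D` (with NaNs standing in for corrupted or missing ranges) is the kind of sparse input the system must be robust to.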
At the core of our system is Wild-Poser (WiP for short), a compact, real-time Transformer-based architecture that directly predicts 3D joint positions from noisy or corrupted PWD measurements, which can later be used for joint rotation reconstruction via learned methods. WiP generalizes across subjects of varying morphologies, including non-human species, without requiring individual body measurements or shape fitting. WiP achieves low joint-position error and accurately reconstructs 3D motion for both human and animal subjects in the wild. Our empirical analysis highlights its potential for scalable, low-cost, and general-purpose motion capture in real-world settings.
We evaluate our method on both synthetic and real-world datasets, demonstrating its effectiveness in reconstructing accurate 3D poses from sparse pairwise distance measurements. Our results show that WiP outperforms existing methods in terms of joint position accuracy and robustness to noise and occlusions.
Fig. 8: Our method (blue) produces poses where the end-effectors (darker shades) align more closely with the original sensor target positions (green spheres), compared to UIP (magenta) and PIP (yellow).
Unlike inertial-based methods, our approach is robust to cumulative error, both in time (left) and distance (right). By utilizing distance measurements and incorporating additional reference sensors as anchors, we can estimate the global position of the subject at every timestamp with minimal reliance on previous predictions.
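The global-position estimate from reference anchors can be sketched as a standard multilateration step: given distances to fixed anchors at known positions, linearize the range equations against the first anchor and solve the resulting least-squares system. The anchor layout below is a hypothetical example, not the paper's setup:

```python
import numpy as np

# Hypothetical fixed reference anchors in meters (illustrative only).
anchors = np.array([
    [0.0, 0.0, 0.0],
    [5.0, 0.0, 0.0],
    [0.0, 5.0, 0.0],
    [0.0, 0.0, 3.0],
])

def multilaterate(anchors, ranges):
    """Estimate a 3D position from ranges to known anchors.

    Each range gives |x - a_i|^2 = r_i^2; subtracting the first
    equation cancels |x|^2, leaving a linear system in x:
        2 (a_i - a_0) . x = |a_i|^2 - |a_0|^2 - r_i^2 + r_0^2
    """
    a0, r0 = anchors[0], ranges[0]
    A = 2.0 * (anchors[1:] - a0)
    b = (np.sum(anchors[1:] ** 2, axis=1) - np.sum(a0 ** 2)
         - ranges[1:] ** 2 + r0 ** 2)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Sanity check: recover a known point from exact ranges.
p = np.array([1.0, 2.0, 1.5])
r = np.linalg.norm(anchors - p, axis=1)
est = multilaterate(anchors, r)
```

Because each frame's estimate depends only on the current ranges to the anchors, errors do not accumulate over time the way integrated inertial measurements do.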
If you find our work useful in your research, please consider citing:
@article{abramovich2026mocap,
  title   = {Mocap Anywhere: Towards Pairwise-Distance based Motion Capture in the Wild (for the Wild)},
  author  = {Abramovich, Ofir and Shamir, Ariel and Aristidou, Andreas},
  journal = {arXiv preprint arXiv:2601.19519},
  year    = {2026}
}