NVIDIA has released a new robotics research model called SONIC, short for Supersizing Motion Tracking for Natural Humanoid Whole-Body Control. The system is designed to improve how humanoid robots learn and execute complex physical movements by using large-scale motion tracking as the foundation for control.
SONIC is positioned as a general-purpose controller that enables a single model to handle a wide range of full-body behaviors. Instead of developing separate control systems for walking, balancing, or crouching, the model is trained to reproduce natural human motion patterns captured through motion-capture recordings. According to NVIDIA, the training dataset consists of more than 100 million frames of human movement, representing hundreds of hours of recorded activity.
From Task-Specific Control to Generalist Movement
Conventional humanoid robot control systems are typically built around narrowly defined tasks and rely on manually designed reward functions. These approaches often require extensive tuning and retraining when new behaviors are introduced. SONIC replaces this workflow by treating control as a motion-tracking problem, where the robot continuously learns to follow full-body reference motions.
By using dense supervision from motion-capture data, the model learns coordinated movement across the legs, torso, arms, and head. As a result, a single SONIC policy can generate walking, running, crouching, crawling, balancing, and smooth transitions between these behaviors without separate training processes for each action.
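NVIDIA has not published SONIC's exact objective in this article, but dense supervision from motion capture is commonly implemented as an exponentiated tracking error, as popularized by DeepMimic-style controllers: the policy is rewarded for keeping every joint close to the reference pose at each frame. A minimal sketch of that idea (function name, joint encoding, and `sigma` are illustrative assumptions, not SONIC's actual formulation):

```python
import math

def tracking_reward(ref_pose, actual_pose, sigma=0.5):
    """Exponentiated pose-tracking error, a common motion-tracking
    RL objective. Hypothetical illustration; not SONIC's exact reward.

    ref_pose / actual_pose: per-joint angles in radians.
    Returns a value in (0, 1]; 1.0 means perfect tracking.
    """
    sq_err = sum((r - a) ** 2 for r, a in zip(ref_pose, actual_pose))
    return math.exp(-sq_err / (sigma ** 2))
```

Because every joint contributes to the error at every frame, the supervision signal is dense: the policy gets graded feedback on whole-body coordination rather than a sparse task-completion reward.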
High-Level Input and Interaction
SONIC is designed to accept different forms of high-level input. These include VR-based teleoperation using head and hand tracking, motion extracted from monocular video, and commands generated by language-based or multimodal AI systems. All inputs are translated into a shared internal representation that the humanoid robot can execute.
This design allows operators and higher-level AI models to guide physical movement without directly programming joint-level trajectories. In demonstrations, the system has been used to translate abstract movement instructions into coordinated whole-body motion.
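The article describes several input modalities being funneled into one shared internal representation. One plausible way to picture this (the class and field names below are hypothetical, not NVIDIA's API) is a common command structure holding target positions for tracked body links, which each modality populates as best it can:

```python
from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass
class WholeBodyTarget:
    """Hypothetical shared command: target 3D positions for tracked links.
    Links not supplied by the input modality are left for the policy."""
    link_positions: Dict[str, Tuple[float, float, float]]

def from_vr(head_xyz, left_hand_xyz, right_hand_xyz):
    """VR teleoperation provides only head and hand tracking, so only
    those links are constrained; the controller fills in the rest."""
    return WholeBodyTarget({
        "head": head_xyz,
        "left_hand": left_hand_xyz,
        "right_hand": right_hand_xyz,
    })
```

A video- or language-driven frontend would construct the same `WholeBodyTarget`, which is what lets a single low-level policy serve all of the input sources the article lists.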
Real-Time Control in Simulation
The system includes a kinematic planning layer that enables real-time motion generation and adjustment. In simulation, SONIC-controlled humanoid robots respond to user input and environmental changes while maintaining balance and stability.
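The key property of such a planning layer is that commands can change mid-motion without producing discontinuous targets. A toy stand-in for that behavior (not NVIDIA's planner) is a per-tick update that moves each joint toward its target by at most a fixed step:

```python
def plan_step(current, target, max_step=0.05):
    """Advance each joint toward its target, capped per control tick.
    A toy stand-in for a real-time kinematic planning layer: if the
    target changes mid-motion, the commanded pose stays continuous."""
    out = []
    for c, t in zip(current, target):
        delta = max(-max_step, min(max_step, t - c))
        out.append(c + delta)
    return out
```

Running this at the control rate yields smooth interpolation toward whatever the current high-level target is, which is the kind of reactivity the simulation demonstrations show.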
NVIDIA has primarily showcased the system within its robotics simulation and development platforms. These environments are used to train, test, and validate humanoid behavior before potential deployment on physical hardware, reducing the risks and costs associated with early-stage experimentation.
NVIDIA’s Robotics Strategy
SONIC forms part of NVIDIA’s broader effort to develop foundation models for robotics and so-called physical AI. The company presents the model as a low-level, reactive control layer that can be combined with higher-level systems responsible for perception, reasoning, and task planning.
In this architecture, SONIC focuses on executing movement, while other AI models determine objectives and interpret sensory input. Demonstrations have shown the system working alongside vision-language-action models that generate movement commands from visual scenes or natural language instructions.
The SONIC project has been released as a research initiative, with technical documentation, trained models, and evaluation results made available to the robotics community. Current performance results are mainly based on simulation experiments, and NVIDIA has not yet announced large-scale commercial deployment on physical humanoid robots.
SONIC is now open-source!
Generalist whole-body teleoperation for EVERYONE!
Our team has long been building comprehensive pipelines for whole-body control, kinematic planner, and teleoperation, and they will all be shared.
This will be a continuous update; inference code +…
— Zhengyi “Zen” Luo (@zhengyiluo) February 20, 2026
