UBTECH has completed what it describes as the world’s first collaborative practical training program for humanoid robots at ZEEKR’s 5G Intelligent Factory. The program involved multiple humanoid robots working together across a range of tasks and industrial settings, an advance in Swarm Intelligence that the company says is intended to move humanoid robotics from single-agent autonomy to networked collaboration.
At the center of this advancement is UBTECH’s Walker S1 humanoid robot, which has been deployed as a coordinated cluster to explore Swarm Intelligence behaviors. Supporting this effort, UBTECH has introduced BrainNet, a software framework for humanoid robot collaboration, alongside the Internet of Humanoids (IoH), a control hub designed to manage the system’s software and hardware integration.
The BrainNet framework functions by linking cloud-device collaborative inference nodes and skill nodes, forming a “super brain” and an “intelligent sub-brain” within the Swarm Intelligence system. The super brain, powered by a large reasoning multimodal model, is designed to manage complex industrial tasks through hybrid decision-making. The intelligent sub-brain integrates cross-field perception and multi-robot control, supporting distributed learning to accelerate skill acquisition and transfer among humanoid robots.
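UBTECH has not published implementation details of BrainNet, but the cloud-device split it describes can be illustrated with a minimal sketch: a cloud-side "super brain" decomposes a job into skill invocations and dispatches them to per-robot "sub-brain" nodes that execute locally registered skills. All names here (SuperBrain, SubBrain, Subtask, the skill registry) are illustrative assumptions, not UBTECH APIs.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Subtask:
    skill: str    # name of the skill the sub-brain must run
    params: dict  # skill parameters (target pose, part id, ...)

class SubBrain:
    """On-device node: local perception plus a registry of executable skills."""
    def __init__(self, robot_id: str):
        self.robot_id = robot_id
        self.skills: Dict[str, Callable[[dict], str]] = {}

    def register_skill(self, name: str, fn: Callable[[dict], str]) -> None:
        self.skills[name] = fn

    def execute(self, task: Subtask) -> str:
        if task.skill not in self.skills:
            return f"{self.robot_id}: unknown skill '{task.skill}'"
        return self.skills[task.skill](task.params)

class SuperBrain:
    """Cloud-side node: plans a job and routes subtasks to connected sub-brains."""
    def __init__(self):
        self.fleet: List[SubBrain] = []

    def connect(self, sub_brain: SubBrain) -> None:
        self.fleet.append(sub_brain)

    def plan(self, job: str) -> List[Subtask]:
        # Stand-in for the large reasoning multimodal model: a fixed
        # decomposition keyed on the job name, purely for illustration.
        plans = {
            "sort_bin": [Subtask("locate_part", {"bin": "A3"}),
                         Subtask("pick", {"grip_force": 12.0}),
                         Subtask("place", {"slot": 7})],
        }
        return plans.get(job, [])

    def dispatch(self, job: str) -> List[str]:
        # Round-robin dispatch; a real system would use load- and
        # capability-aware scheduling.
        results = []
        for i, task in enumerate(self.plan(job)):
            robot = self.fleet[i % len(self.fleet)]
            results.append(robot.execute(task))
        return results

if __name__ == "__main__":
    brain = SuperBrain()
    for rid in ("walker_s1_01", "walker_s1_02"):
        sb = SubBrain(rid)
        for skill in ("locate_part", "pick", "place"):
            sb.register_skill(skill, lambda p, s=skill, r=rid: f"{r} ran {s} with {p}")
        brain.connect(sb)
    for line in brain.dispatch("sort_bin"):
        print(line)
```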
To facilitate autonomous multi-robot collaboration, UBTECH has developed a large reasoning multimodal model for humanoid robots. This AI system, currently under development using DeepSeek-R1 deep reasoning technology, is designed to enable robots to process data at scale and apply human-like common-sense reasoning to industrial tasks. The model allows robots to break down, schedule, and coordinate workflows, optimizing multi-robot collaboration in industrial environments.
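The article does not say how workflows are broken down and scheduled internally; a generic way to picture it is as a dependency graph of subtasks handed to a simple list scheduler (Kahn's topological ordering plus greedy robot assignment). The sketch below uses that standard technique as an assumption, with invented task names, rather than describing UBTECH's model.

```python
from collections import deque

def schedule(subtasks, deps, robots):
    """subtasks: {name: duration}; deps: {name: set of prerequisite names}."""
    indegree = {t: len(deps.get(t, set())) for t in subtasks}
    dependents = {t: [] for t in subtasks}
    for t, reqs in deps.items():
        for r in reqs:
            dependents[r].append(t)

    ready = deque(t for t, d in indegree.items() if d == 0)
    robot_free_at = {r: 0.0 for r in robots}  # time each robot becomes idle
    finish = {}                               # subtask -> finish time
    plan = []                                 # (robot, subtask, start, end)

    while ready:
        task = ready.popleft()
        # Earliest start: all prerequisites must have finished.
        earliest = max((finish[p] for p in deps.get(task, set())), default=0.0)
        # Greedy choice: give the task to the robot that frees up first.
        robot = min(robot_free_at, key=robot_free_at.get)
        start = max(earliest, robot_free_at[robot])
        end = start + subtasks[task]
        robot_free_at[robot] = end
        finish[task] = end
        plan.append((robot, task, start, end))
        for nxt in dependents[task]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    return plan

if __name__ == "__main__":
    tasks = {"fetch_film": 2.0, "clean_surface": 1.5, "align_film": 1.0, "press": 0.5}
    deps = {"align_film": {"fetch_film", "clean_surface"}, "press": {"align_film"}}
    for robot, task, start, end in schedule(tasks, deps, ["walker_01", "walker_02"]):
        print(f"{robot}: {task} [{start:.1f}, {end:.1f}]")
```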
The AI model is trained on industrial datasets collected from practical training with Walker S series robots across multiple automotive factories. By integrating multimodal features and Retrieval-Augmented Generation (RAG) technology, the model aims to enhance decision-making accuracy, generalization across different workstations, and scalability for industrial deployment.
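UBTECH does not detail its RAG pipeline; the general pattern, sketched below under assumptions, is to retrieve the most similar records from a store of past practical-training episodes and prepend them to the planning prompt. The episode texts are invented examples, and the bag-of-words cosine similarity stands in for the learned embeddings and vector database a production system would use.

```python
import math
from collections import Counter

# Invented example corpus of past training episodes, for illustration only.
EPISODES = [
    "sorting: vision lost track of small fasteners under glare at the rack",
    "collaborative handling: two robots shared a door panel and rebalanced load mid-carry",
    "film application: peel force spiked when the film edge curled; slowed the end-effector",
]

def _bow(text: str) -> Counter:
    return Counter(text.lower().split())

def _cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, k: int = 2) -> list:
    q = _bow(query)
    ranked = sorted(EPISODES, key=lambda e: _cosine(q, _bow(e)), reverse=True)
    return ranked[:k]

def build_prompt(task: str) -> str:
    # Retrieved episodes are prepended as grounding context for the planner.
    context = "\n".join(f"- {e}" for e in retrieve(task))
    return f"Relevant past episodes:\n{context}\n\nTask: {task}\nPlan the next steps."

if __name__ == "__main__":
    print(build_prompt("apply protective film to a curved hood panel"))
```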
UBTECH’s Practical Training 2.0 program, implemented at the ZEEKR factory, has focused on multi-robot, multi-scenario, and multi-task collaboration. In this environment, dozens of Walker S1 robots have been deployed across production areas, including final assembly, quality inspection, and vehicle assembly stations. The robots have demonstrated capabilities in collaborative sorting, handling, and precision assembly.
For sorting tasks, Walker S1 robots employ vision-based perception and hybrid decision-making systems to track dynamic targets, map environments, and allocate tasks dynamically. In collaborative handling, the robots use a joint planning and control system to manage load distribution, trajectory planning, and adaptive posture control, improving efficiency in transporting heavy workpieces. In precision assembly, the robots apply high-precision sensing and adaptive control for manipulating deformable objects, ensuring accurate alignment and handling of materials such as films.
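The article does not disclose the allocation algorithm behind the dynamic task assignment used in sorting. As a rough illustration only, the sketch below uses a greedy rule in which each robot repeatedly claims the nearest unassigned part on a 2D floor plan; the rule, coordinates, and names are assumptions, not Walker S1 behavior.

```python
import math

def allocate(robots: dict, parts: dict) -> dict:
    """robots/parts: {name: (x, y)}. Returns {robot: [assigned part names]}."""
    assignments = {r: [] for r in robots}
    remaining = dict(parts)
    positions = dict(robots)  # robots are assumed to move to each claimed part
    while remaining:
        for robot in robots:
            if not remaining:
                break
            rx, ry = positions[robot]
            # Nearest unclaimed part by Euclidean distance.
            part = min(remaining,
                       key=lambda p: math.hypot(remaining[p][0] - rx,
                                                remaining[p][1] - ry))
            assignments[robot].append(part)
            positions[robot] = remaining.pop(part)
    return assignments

if __name__ == "__main__":
    robots = {"walker_01": (0.0, 0.0), "walker_02": (5.0, 0.0)}
    parts = {"bracket": (1.0, 2.0), "harness": (6.0, 1.0), "clip": (3.0, 4.0)}
    for robot, assigned in allocate(robots, parts).items():
        print(robot, "->", assigned)
```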
UBTECH is working with several industrial partners, including Dongfeng Liuzhou Motor, Geely Auto, FAW-Volkswagen Qingdao, Audi FAW, BYD, BAIC New Energy, Foxconn, and SF Express. Walker S series robots are currently deployed in more automotive factories than any other humanoid robot, according to UBTECH. With the introduction of Practical Training 2.0, the company aims to accelerate the expansion of multi-robot collaboration and AI model development to support broader industrial adoption.