Realbotix staged a public demonstration at the Consumer Electronics Show (CES) 2026 in Las Vegas in which two humanoid robots held a fully autonomous, unscripted conversation powered by embedded artificial intelligence running entirely on-device. The company said the interaction relied on proprietary models and software operating without cloud connectivity.
According to the company, the demonstration involved two humanoid robots, Aria and David, conversing directly with one another in real time for more than two hours. The exchange spanned multiple languages, including English, Spanish, French, and German, and was designed to illustrate autonomous language generation and response without prewritten scripts or human intervention.
Andrew Kiguel, chief executive officer of Realbotix, said the demonstration was intended to show the robots’ capacity for autonomous interaction beyond human-directed use cases. “Realbotix has specialized in robots for human interaction. In this case, we demonstrated that our robots can interact with each other,” Kiguel said. He added that the interaction was unscripted and multilingual, and that a separate demonstration focused on vision-based interaction with attendees.
In addition to the robot-to-robot conversation, Realbotix presented a separate system involving a third humanoid robot equipped with an integrated vision platform. The company said the vision system, housed within the robot’s eyes, enabled it to recognize individuals, track faces, and respond verbally while interpreting visual and vocal cues during interactions with visitors.
