During the Humanoids Summit in London, Hugging Face and Pollen Robotics publicly discussed the rationale and implications of Hugging Face’s recent acquisition of the French robotics startup. The fireside chat, presented on stage with representatives from both companies, offered insights into how open-source AI and embodied robotics are coming together to build a more accessible and collaborative future for humanoid development.
The discussion featured Remi Cadene, Research Scientist at Hugging Face, and Santiago Pavon, Growth Lead at Pollen Robotics, moderated by Brian Heater of the Association for Advancing Automation. It opened with the origin story behind Hugging Face's robotics push, which began in May 2024 with the launch of its LeRobot initiative. The project aimed to reduce friction between AI and physical systems by developing Python-based middleware, open datasets, and tooling that would enable anyone with AI skills to start building for robots.
From Software to Hardware
Cadene explained that as the project matured, it became clear that real-world deployment would require a hardware platform — not only to demonstrate the technology, but to allow communities to experiment and build upon it. This need brought them to Pollen Robotics, creators of Reachy, a modular and 3D-printable humanoid robot designed to be open-source and easy to work with. Founded in 2016, Pollen introduced the first Reachy in 2020, and launched Reachy 2 in late 2024.
Pollen’s approach to development has always been iterative and community-driven. New components were added over time — an arm, a camera, a head — based on what users needed most. According to Pavon, the robot’s form evolved organically through feedback from researchers, educators, and DIY developers, with the aim of lowering barriers rather than pushing a fully formed commercial product.
Creativity Over Commercial Pressure
This philosophy closely aligned with Hugging Face’s open-source values. Cadene emphasized how naturally the collaboration evolved, combining Hugging Face’s AI and software expertise with Pollen’s hardware knowledge and design. With Pollen based in Bordeaux and Hugging Face in Paris, close physical proximity helped accelerate in-person iteration and prototyping.
For Pollen, the acquisition brought stability and freedom to focus on research and community-building, rather than short-term monetization. While hardware sales continue to provide sustainability, the primary goal remains enabling experimentation and extending the capabilities of the broader robotics community.
Emotion, Empathy, and Transparency
Pavon also shared the thinking behind Reachy’s distinctive aesthetic. In contrast to the often imposing or dystopian appearance of many industrial humanoids, Reachy was intentionally designed to be transparent and approachable. Its expressive eyes, visible mechanisms, and audible motors help users understand what the robot is doing — and even feel sympathy when it struggles. “We want people to root for the robot, not fear it,” he said.
Cadene noted that this is particularly important as non-technical users increasingly interact with AI-driven machines. Where virtual agents like ChatGPT deliver seamless language capabilities, physical robots still face technical limitations. Managing expectations is part of the mission — but so is showing what’s possible when hardware and AI come together.
Language Models in the Loop
The teams have already begun deploying large language models in Reachy’s real-world interactions. A demo at the Summit showcased how an LLM could guide emotional expression during a conversation, selecting appropriate gestures or tones based on the content. In these cases, the LLM acts as a translator between user intent and robot behavior — bringing warmth and relatability to human-robot communication.
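The "LLM as translator" pattern described above can be sketched in a few lines of Python. This is a minimal, hypothetical illustration, not Reachy's actual control code: `query_llm` stands in for a real LLM call (here faked with keyword matching so the sketch runs), and the gesture names are invented.

```python
# Hypothetical sketch: an LLM classifies the emotional tone of a user
# utterance, and a simple mapping turns that label into a robot gesture.
# Gesture names and the query_llm helper are illustrative placeholders.

GESTURE_MAP = {
    "happy": "nod_enthusiastically",
    "sad": "tilt_head_down",
    "curious": "tilt_head_side",
    "neutral": "idle_sway",
}

def query_llm(utterance: str) -> str:
    """Placeholder for a real LLM call returning one emotion label.

    A production system would prompt the model to choose from the
    allowed labels; keyword matching here keeps the sketch runnable.
    """
    text = utterance.lower()
    if any(w in text for w in ("great", "love", "awesome")):
        return "happy"
    if any(w in text for w in ("sorry", "sad", "unfortunately")):
        return "sad"
    if "?" in text:
        return "curious"
    return "neutral"

def select_gesture(utterance: str) -> str:
    """Translate user intent into a gesture, falling back to idle."""
    label = query_llm(utterance)
    return GESTURE_MAP.get(label, GESTURE_MAP["neutral"])

print(select_gesture("I love this demo!"))   # -> nod_enthusiastically
print(select_gesture("What can you do?"))    # -> tilt_head_side
```

The key design point is the narrow interface: the LLM only ever emits a label from a fixed vocabulary, so the robot's motion layer stays deterministic and safe regardless of what the model generates.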
The joint team now includes around 45 core developers, backed by a growing open-source community. Hugging Face's LeRobot framework provides the software infrastructure, while Pollen continues to lead hardware development and design. Together, they are building models for multimodal perception, reinforcement learning, and embodied interaction — including compact datasets and new APIs designed specifically for robot control.
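To make the dataset side concrete: robot-learning datasets of this kind typically pair each observation with the action commanded at that moment, grouped into task episodes. The sketch below is an illustrative data structure in that spirit; the class and field names are invented for this example and are not the actual API of any framework.

```python
# Hypothetical sketch of how a robot-learning dataset pairs observations
# with actions, grouped into episodes. Names are illustrative only.

from dataclasses import dataclass, field

@dataclass
class Step:
    joint_positions: list[float]   # observed joint angles (radians)
    action: list[float]            # commanded target angles (radians)

@dataclass
class Episode:
    task: str                      # natural-language task description
    steps: list[Step] = field(default_factory=list)

    def record(self, obs: list[float], act: list[float]) -> None:
        """Append one observation/action pair to the episode."""
        self.steps.append(Step(obs, act))

ep = Episode(task="pick up the cup")
ep.record([0.0, 0.5], [0.1, 0.5])
ep.record([0.1, 0.5], [0.2, 0.4])
print(len(ep.steps))  # -> 2
```

Collections of such episodes, annotated with task descriptions, are what policy models train on, which is why community data collection (as in the hackathon described below) matters so much.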
Scaling Through Global Collaboration
To accelerate adoption and experimentation, Hugging Face and Pollen are organizing a global hackathon in July. The event will take place in cities across five continents, bringing together students, startups, and researchers to build new use cases for robots like Reachy. The goal is not just to generate new data, but to grow the community, diversify contributors, and enable more hands-on learning with physical AI systems.
Looking ahead, Cadene described a roadmap that includes scaling dataset collection, training more compact and efficient multimodal models, and enhancing interaction through sensory feedback and simulation. Pavon added that future versions of Reachy will likely incorporate more onboard compute power, enabling real-time AI inference directly on the robot.
Building a Shared Future for Embodied AI
Both speakers emphasized that the acquisition is less about traditional product-market fit and more about laying the foundations for a flexible, open, and scalable robotics ecosystem. Instead of targeting narrow industrial use cases, Hugging Face and Pollen are creating tools and platforms that invite collaboration and exploration — whether for research, education, or future commercial applications.
As Cadene concluded: “We’re only two and a half years past the launch of ChatGPT. The pace of progress is staggering. Our job now is to build the tools and the community that can help steer embodied AI in a responsible, open, and global direction.”
