Researchers from the Istituto Italiano di Tecnologia (IIT) in Genoa and Brown University in Providence have found that humans can perceive a humanoid robot’s hand as part of their own body schema when performing tasks together. The study, published in iScience and funded by the European Research Council, examined how collaborative activity influences the way people integrate robots into their perceptual and cognitive systems.
The research team, led by Alessandra Sciutti at IIT in collaboration with Brown University professor Joo-Hyun Song, investigated whether unconscious mechanisms that shape interactions between humans also apply to human-robot interactions. Their work focused on the “near-hand effect,” a phenomenon in which the presence of a hand near an object alters a person’s visual attention because the brain prepares to use the object.
Experiments involved 30 participants performing a joint task with iCub, IIT’s child-sized humanoid robot: the volunteers and the robot used a steel wire to slice a bar of soap together. Afterwards, the researchers administered the Posner cueing task, a test in which participants locate images on a screen while a nearby object can bias their visual attention, to assess whether the robotic hand had been incorporated into participants’ body schema. Reaction times showed that participants responded more quickly to images appearing near the robot’s hand, indicating that their brains treated it like their own hand. The effect occurred only among those who had completed the collaborative task with iCub.
The strength of this integration depended on the robot’s behavior and proximity: broad, fluid, and synchronized gestures reinforced the effect, as did physical closeness between the robot’s hand and the participant’s own. Questionnaires revealed that participants who perceived the robot as competent and likable, or who attributed human-like qualities to it, also showed stronger cognitive integration.
According to the researchers, these findings provide insight into how psychological and perceptual mechanisms influence human-robot interaction. Such knowledge may inform the design of robots intended for rehabilitation, assistive technologies, and virtual reality, where intuitive collaboration and responsiveness to human cues are critical.
Photo: Researchers from the Istituto Italiano di Tecnologia (IIT) in Genoa (Italy) and Brown University in Providence (USA) have discovered that people sense the hand of a humanoid robot as part of their body schema, particularly when carrying out a task together, such as slicing a bar of soap. Credit: IIT-Istituto Italiano di Tecnologia
