MIT Develops Interface to Simplify Robot Training

by Pieter Werner

Engineers at the Massachusetts Institute of Technology have developed a new handheld interface designed to simplify the process of teaching robots new tasks. The device enables users to train robots through three different demonstration methods—teleoperation, kinesthetic guidance, and natural teaching—without requiring programming expertise.

The interface, termed a “versatile demonstration interface” (VDI), attaches to standard collaborative robotic arms and incorporates a camera, motion markers, and force sensors. It records physical movements and pressure data, allowing robots to use this information as training input for various tasks.

Traditionally, learning-from-demonstration (LfD) systems have relied on a single method, limiting adaptability. The VDI supports multiple training modalities in a single tool: users can control the robot remotely, physically guide its movements, or perform tasks themselves while the robot observes and records the process.

The MIT team, led by postdoctoral researcher Mike Hagenow and professor Julie Shah of the Department of Aeronautics and Astronautics, evaluated the device in collaboration with a local manufacturing innovation center. Manufacturing professionals used the tool to train a robot on two tasks common in industrial environments: press-fitting and molding. Participants employed each of the three training modes to complete both tasks.

Initial findings indicated a general preference for the natural teaching method, though users acknowledged that different methods could be advantageous depending on the application. For example, remote operation may be more appropriate for hazardous environments, while kinesthetic teaching could be useful for tasks involving heavy objects.

The research team sees potential for the tool in flexible manufacturing and other sectors where adaptable robot training is needed, including caregiving and domestic settings. According to Hagenow, the next phase of the project will involve refining the interface design based on user feedback and further testing its impact on robot learning outcomes.

The work will be presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) in October and was supported by the MIT Postdoctoral Fellowship Program for Engineering Excellence and the Wallenberg Foundation Postdoctoral Research Fellowship.

Photo credit: Mike Hagenow and Julie Shah