While many of us take the routine act of getting dressed each day for granted, a startling number of individuals rely on assistance for this fundamental task. According to the National Center for Health Statistics, 92% of those residing in nursing facilities or receiving at-home care need help with dressing.
Addressing this widespread need, researchers at Carnegie Mellon University’s Robotics Institute (RI) are harnessing the power of robotics and artificial intelligence to develop an advanced system for robot-assisted dressing.
Current robotic aids are relatively limited, primarily designed to assist with fixed poses and singular garments such as hospital gowns. “Most robot-assisted dressing systems are confined to a limited range of arm poses,” states Yufei Wang, a Ph.D. student at RI, who is actively involved in the project. “Our goal is to build a comprehensive system that can accommodate the diverse range of everyday clothing and adjust to individuals with varied arm movement capabilities.”
AI and Reinforcement Learning
A cornerstone of the RI team’s approach involves leveraging AI’s capabilities to adapt to different body shapes, arm postures, and clothing types. Through reinforcement learning, where the robot receives rewards for successful dressing actions, the team improved the system’s overall performance. The robot was trained in simulation, with special emphasis on diverse clothing materials so that the strategies it learned would transfer reliably to real-world applications.
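In reinforcement-learning terms, "rewards for successful dressing actions" can be expressed as a dense reward that grows with how far the garment has progressed along the arm. The sketch below is a hypothetical illustration of that idea, not the team's actual code; the names `garment_tip`, `hand`, and `shoulder` are assumptions.

```python
import numpy as np

def dressing_reward(garment_tip, hand, shoulder):
    """Hypothetical dense reward: the fraction of the arm (hand -> shoulder)
    that the garment opening has traveled, clipped to [0, 1]."""
    arm_vec = shoulder - hand                     # dressing direction
    arm_len = np.linalg.norm(arm_vec)
    # Project the garment tip's offset from the hand onto the arm axis.
    progress = np.dot(garment_tip - hand, arm_vec / arm_len)
    return float(np.clip(progress / arm_len, 0.0, 1.0))

# Garment opening halfway along a 0.6 m arm yields a reward of 0.5.
r = dressing_reward(np.array([0.3, 0.0, 0.0]),
                    np.array([0.0, 0.0, 0.0]),
                    np.array([0.6, 0.0, 0.0]))
```

A shaped reward like this gives the learner a useful signal at every timestep, rather than only at the end of a successful trial.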
The RI researchers conducted a comprehensive human study with 510 dressing trials across 17 diverse participants, testing the robot’s proficiency with five different garments. The results were promising: on average, the robot dressed 86% of participants’ arm lengths.
The Complex World of Clothing
Designing such a system is not without its challenges. Unlike rigid, predictable objects, clothes are malleable, presenting challenges in perception and interaction. “To accurately predict the movement and state of the clothing, we use a segmented point cloud, which presents the visible parts of the clothes as a point set,” Wang explains.
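A segmented point cloud of this kind can be represented simply as the subset of observed 3D points whose segmentation label marks them as cloth. The snippet below is an assumed illustration of that representation, not the team's pipeline; `CLOTH_LABEL` and the array layout are hypothetical.

```python
import numpy as np

CLOTH_LABEL = 1  # hypothetical label id assigned to garment points

def cloth_point_set(points, labels):
    """Return the visible garment as an (M, 3) point set, given an
    (N, 3) depth-camera cloud and per-point segmentation labels."""
    return points[labels == CLOTH_LABEL]

# Toy cloud: three points, only the middle one labeled as cloth.
pts = np.array([[0.0, 0.0, 1.0],
                [0.1, 0.0, 1.0],
                [0.2, 0.0, 1.0]])
lbl = np.array([0, 1, 0])
cloth = cloth_point_set(pts, lbl)  # (1, 3) array of garment points
```

Working with such a point set lets the perception model reason about the cloth's visible surface without assuming any fixed garment shape.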
Additionally, safety is paramount. Ensuring that the robot interacts gently with human participants, without causing discomfort or potential injury, was critical. The team achieved this by rewarding the robot for gentle behavior during training.
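One common way to encode "rewarding gentle behavior" is to subtract a penalty from the task reward whenever contact force exceeds a comfort threshold. The function below is a hedged sketch under that assumption; the threshold and weighting are illustrative values, not figures from the study.

```python
def safe_reward(progress_reward, contact_force, force_limit=5.0, weight=0.1):
    """Hypothetical shaped reward: dressing progress minus a penalty
    that grows once contact force (in newtons) exceeds a comfort limit."""
    excess = max(0.0, contact_force - force_limit)
    return progress_reward - weight * excess

# Gentle contact leaves the reward untouched; excessive force reduces it.
gentle = safe_reward(0.5, contact_force=2.0)  # 0.5
rough = safe_reward(0.5, contact_force=9.0)   # 0.5 - 0.1 * 4.0 = 0.1
```

Because the penalty directly reduces the return, the learned policy is pushed toward trajectories that keep applied forces low while still making dressing progress.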
There are ambitious plans for the future. The team’s next steps involve expanding the system’s capabilities to tackle more complex dressing tasks, such as donning jackets or T-shirts and performing intricate actions like buttoning or zipping.
With an eye on real-world applications, the researchers aim to conduct observational studies in nursing facilities, delving deeper into the unique needs of residents and refining their robotic system accordingly.