By Meghan Chua
Imagine a future where robots at home are more than just disc-shaped vacuum cleaners – a future where they are autonomous agents that can perform our everyday tasks.
Though we may not always realize it, these tasks demand a physical responsiveness to the environment that comes naturally to humans but remains a core challenge in robotics.
A cross-disciplinary project between the Computer Sciences and Mechanical Engineering departments, funded by a UW2020 award, is stepping up to that challenge and exploring how people can teach robots to perform physically responsive tasks.
“It should be as natural as possible for a human, or the technology won’t be accessible to normal people,” said Guru Subramani, a graduate research assistant on the project.
The project explores the effectiveness of three modes of human-to-robot instruction, with a different research assistant focusing on each. One mode is kinesthetic teaching, or physically guiding the robot through a task. Another, led by PhD student Danny Rakita, is teleoperated teaching, in which a remote controller is used to direct the robot. Pragathi Praveena, a graduate student research assistant in the lab, compares the different robot teaching methods.
Guru, a mechanical engineering PhD student who is also pursuing a minor in computer sciences, focuses on physical demonstration. Using a pair of tongs developed in the lab, Guru can record a physical motion and the forces applied during the task. The robot then replays the task from the recording.
Importantly, Guru said that giving a human operator a pair of tongs to use instead of their hands limits that person’s range of motion to match what a robot could do. One of the challenges in programming robots is giving the programmer a basic understanding of how the robot operates and what its limitations are.
“It’s not simply just replaying the exact same motions that a person did, because a robot’s mechanics – or the way a robot is built – is totally different from how a person is,” he said. “You also have the added complexity where a robot, physically, may not be capable of doing a particular task, and may not be as nimble or flexible enough to do a task.”
Currently, an average person with little to no programming skill can teach a robot only a limited number of tasks. Furthermore, as robots become capable of performing increasingly difficult and complex tasks, the software behind them grows correspondingly complex and hard to program. Guru and the team hope their research will open up new avenues for consumers to teach robots.
“Our solution is to use demonstrations, because physical demonstrations have a lot of information and do not require the person to understand how to program a robot,” Guru said. “They just have to perform the demonstration like how they would show another person.”
The UW2020 initiative supports innovative and groundbreaking research at the University of Wisconsin–Madison with the potential to transform a field of study. UW2020 grants are supported by the Wisconsin Alumni Research Foundation (WARF) with combined funding from the Graduate School and other sources.