GoPro-equipped robot gloves teach robots new tricks
Realizing a future in which humans do less manual and repetitive work and robots do more depends on finding an efficient way of teaching machines to perform such tasks. Ideally, the skills-transfer process would generate rich data and be fast and cheap to carry out, but coming up with a method that ticks all of those boxes has proven difficult – until now. Hitting that sweet spot appears to be a pair of GoPro-equipped robot gloves developed by researchers in the US, which – according to video footage – could provide an easy way of training robots to do all kinds of things.
What’s more, all of the universal manipulation interface know-how has been open-sourced, including the 3D printing instructions for making the handheld robot gloves. As photos reveal, the soft finger design is capable of gripping a raw egg securely without breaking the shell.
To begin the skills transfer process between human and machine, users put on a pair of the robot gloves and carry out the target task multiple times to build a training dataset. The need for repetition is less of a chore than it sounds: the results generalize to similar scenarios – thanks to a so-called diffusion policy that has been shown to outperform existing state-of-the-art robot learning methods – which saves time later on.
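To make that workflow concrete, below is a minimal Python sketch of collecting demonstrations and fitting a policy to them. The DiffusionPolicy class, its methods, and the training parameters are illustrative placeholders under our own assumptions, not the team's open-source code.

```python
# Hypothetical sketch of the collect-then-train workflow (not the UMI codebase).
# Each recorded demonstration is assumed to be a list of (observation, action) pairs.
import random

class DiffusionPolicy:
    """Placeholder for a diffusion-based visuomotor policy."""

    def train_step(self, obs_batch, action_batch):
        # A real implementation would add noise to the actions and train a
        # denoising network conditioned on the observations.
        pass

    def predict(self, obs):
        # A real implementation would iteratively denoise a sampled action sequence.
        return [0.0] * 7  # e.g. a 7-DoF end-effector command

def train(demonstrations, epochs=100, batch_size=64):
    policy = DiffusionPolicy()
    # Flatten all human demonstrations into one pool of training pairs.
    pairs = [pair for demo in demonstrations for pair in demo]
    for _ in range(epochs):
        random.shuffle(pairs)
        for i in range(0, len(pairs), batch_size):
            batch = pairs[i:i + batch_size]
            obs_batch = [p[0] for p in batch]
            action_batch = [p[1] for p in batch]
            policy.train_step(obs_batch, action_batch)
    return policy
```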
Adding to the appeal, those same results can be used by different models of robot – provided that the unit can be fitted with duplicates of the robot gloves. In the demonstrations given by the team, whose members are based at Stanford University, Columbia University, and Toyota Research Institute, robots are taught how to place an espresso cup on a saucer and even wash up dirty plates.
Key to the success of the approach is the use of GoPro cameras – one on each of the robot training gloves and one on each of the grippers in the robot-mounted setup. The cameras have fisheye lenses to capture a wide field of view, gathering plenty of detail from the scene, and built-in inertial measurement units (IMUs) to enable pose tracking.
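For a sense of the preprocessing that such wide-angle footage typically needs, the hedged OpenCV sketch below undistorts a fisheye frame before it is handed to a learning or tracking pipeline. The camera matrix, distortion coefficients, and filename are made-up example values that would normally come from calibration, not numbers from the paper.

```python
# Illustrative fisheye undistortion with OpenCV (example calibration values only).
import cv2
import numpy as np

# Example intrinsics for a wide-angle camera; real values come from calibration.
K = np.array([[300.0, 0.0, 960.0],
              [0.0, 300.0, 540.0],
              [0.0, 0.0, 1.0]])
D = np.array([0.05, -0.01, 0.001, 0.0])  # fisheye distortion coefficients

def undistort_frame(frame):
    h, w = frame.shape[:2]
    # Build rectification maps for this resolution, then remap the frame.
    map1, map2 = cv2.fisheye.initUndistortRectifyMap(
        K, D, np.eye(3), K, (w, h), cv2.CV_16SC2)
    return cv2.remap(frame, map1, map2, interpolation=cv2.INTER_LINEAR)

cap = cv2.VideoCapture("gopro_demo.mp4")  # placeholder filename
ok, frame = cap.read()
if ok:
    rectified = undistort_frame(frame)
cap.release()
```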
The team makes sure that all of the data feeds are latency-matched, which means that robots can carry out two-handed tasks correctly and perform actions such as throwing objects with high precision. Also, there’s a one-off mapping step that uses a visual code to help with simultaneous localization and mapping (SLAM).
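Latency matching can be pictured as aligning timestamped samples from the two cameras onto a common clock before they are paired with actions. The sketch below shows one simple way to do that; the per-stream latency offsets and tolerance are chosen purely for illustration.

```python
# Hedged sketch of aligning two timestamped observation streams (illustrative only).
import numpy as np

def align_streams(ts_a, ts_b, latency_a=0.0, latency_b=0.0, tol=0.02):
    """Pair samples from streams A and B whose latency-corrected timestamps
    fall within `tol` seconds of each other."""
    corrected_a = np.asarray(ts_a) - latency_a
    corrected_b = np.asarray(ts_b) - latency_b
    pairs = []
    for i, t in enumerate(corrected_a):
        j = int(np.argmin(np.abs(corrected_b - t)))
        if abs(corrected_b[j] - t) <= tol:
            pairs.append((i, j))
    return pairs

# e.g. two grippers whose cameras report slightly offset timestamps
left = [0.00, 0.05, 0.10, 0.15]
right = [0.02, 0.07, 0.12, 0.17]
print(align_streams(left, right, latency_b=0.02))
```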
If sufficient numbers of people join in, robots could quickly be taught to do many common industrial tasks using the open-sourced robot gloves – and that knowledge shared. Currently, robots are often taught through teleoperation, which can be a slow process. The wearable teaching grippers, on the other hand, provide a much speedier option and are more instinctive to use.
“By recording all information in a single, standardized MP4 file, UMI’s data can be easily shared over the Internet, allowing geographically distributed data collection from a large pool of nonexpert demonstrators,” writes the group in its paper – ‘Universal Manipulation Interface: In-The-Wild Robot Teaching Without In-The-Wild Robots’ – which is free-to-read on arXiv.
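Because each demonstration is an ordinary MP4, a shared recording can be consumed with off-the-shelf tooling; the snippet below simply iterates over the frames of such a file with OpenCV. The filename is a placeholder, and parsing the IMU telemetry that the GoPro embeds alongside the video would require an additional step not shown here.

```python
# Minimal example of iterating over frames in a shared demonstration MP4.
import cv2

def iter_frames(path):
    cap = cv2.VideoCapture(path)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            yield frame
    finally:
        cap.release()

count = sum(1 for _ in iter_frames("umi_demo_0001.mp4"))  # placeholder filename
print(f"decoded {count} frames")
```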
Timing the robot training process, the researchers found their universal manipulation interface to be around three times faster than teleoperation. The learning framework was also shown to be tolerant of big changes in lighting conditions and other interference.
For example, robots trained using the gloves can continue performing their tasks even if their base is moved or humans perturb the scene in other ways – such as adding more sauce to the dirty plates.
The dishwashing task is noteworthy because it’s what’s termed an ultra-long-horizon task from an automation perspective, with the success of each step dependent on the previous one. Here, the robot needs to perform seven actions in sequence – turn on the faucet, grasp the plate, pick up the sponge, wash and wipe the plate until the ketchup is removed, place the plate, place the sponge, and turn off the faucet.
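One way to see why such a long-horizon task is hard is to model it as a strictly ordered chain of subtasks in which any failure breaks everything downstream. The short sketch below does exactly that, with the step names taken from the article and the per-step success check left as a stub.

```python
# Illustrative model of a long-horizon task as a strictly ordered chain of steps.
DISHWASHING_STEPS = [
    "turn on the faucet",
    "grasp the plate",
    "pick up the sponge",
    "wash and wipe the plate until the ketchup is removed",
    "place the plate",
    "place the sponge",
    "turn off the faucet",
]

def run_task(execute_step):
    """Run steps in order; the whole task fails as soon as one step fails."""
    for i, step in enumerate(DISHWASHING_STEPS):
        if not execute_step(step):
            return False, i  # failure index shows how far the robot got
    return True, len(DISHWASHING_STEPS)

# Stub executor that always succeeds, standing in for the learned policy.
print(run_task(lambda step: True))
```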
“Can we collect robot data without any robots? Introducing Universal Manipulation Interface (UMI) – an open-source $400 system from @Stanford designed to democratize robot data collection. 0 teleop -> autonomously wash dishes (precise), toss (dynamic), and fold clothes (bimanual),” posted Cheng Chi (@chichengcc) on February 16, 2024.
Given the apparent success of the approach, regular dishwashing appliances may face some competition from two-armed robots in the future – and it gets you thinking about other jobs that robots could do around the home.