AI-Powered Robot Arms: Precise Enough to Handle Pringles Chips
These robot arms can gently pick up super delicate stuff using smart learning and their own senses!
A bimanual robot, guided by a cutting-edge artificial intelligence system, can respond to real-time tactile feedback with such precision that it delicately grasps an individual chip without causing any damage.
The team’s groundbreaking progress lies in its use of two robotic arms, a departure from the typical single-arm approach of most tactile robotics projects. Surprisingly, despite the doubling of limbs, training requires only a few hours.
The researchers achieve this by initially training their AI in a simulated environment and subsequently implementing the fully developed Bi-Touch system on their physical robotic arms.
“With our Bi-Touch system, we can easily train AI agents in a virtual world within a couple of hours to achieve bimanual tasks that are tailored towards the touch,” Lin continued. “And more importantly, we can directly apply these agents from the virtual world to the real world without further training.”
The remarkable success of the Bi-Touch system can be attributed to its reliance on a technique known as Deep Reinforcement Learning (Deep-RL). In this approach, robots learn by repeatedly attempting tasks, similar to trial-and-error experimentation.
When the robot accomplishes a task successfully, researchers reward the AI, much like how you would train a pet with positive reinforcement. Over time, the AI learns the most effective steps to achieve its designated objective. In this case, the goal is to use the two robotic limbs, each equipped with a soft pad, to pick up and manipulate objects like a foam brain mold, a plastic apple, or an individual chip.
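The reward-driven trial-and-error loop described above can be sketched in a few lines. This is a hypothetical toy example, not the authors' actual Bi-Touch code: the agent repeatedly tries a grip-force level and is rewarded for lifting a fragile chip without crushing it, with simple tabular Q-learning standing in for the deep neural network the real system uses.

```python
import random

ACTIONS = ["light", "medium", "firm"]   # candidate grip forces (illustrative)
REWARD = {"light": 0.0,                 # too weak: the chip slips
          "medium": 1.0,                # just right: chip lifted intact
          "firm": -1.0}                 # too strong: chip crushed

def train(episodes=2000, epsilon=0.1, alpha=0.5, seed=0):
    """Trial-and-error learning: try actions, reinforce the rewarded ones."""
    rng = random.Random(seed)
    q = {a: 0.0 for a in ACTIONS}       # learned value estimate per action
    for _ in range(episodes):
        # Occasionally explore a random action; otherwise exploit the
        # best-known one.
        if rng.random() < epsilon:
            action = rng.choice(ACTIONS)
        else:
            action = max(q, key=q.get)
        # The environment returns a reward, like the researchers rewarding
        # the AI when the robot succeeds.
        q[action] += alpha * (REWARD[action] - q[action])
    return q

q = train()
best_action = max(q, key=q.get)
```

After a few thousand simulated attempts the agent settles on the "medium" grip, mirroring how the real system converges on gentle, successful manipulation strategies in simulation before being transferred to hardware.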
Notably, the Bi-Touch system doesn’t rely on visual input; instead, it uses only proprioceptive feedback, including factors like force, physical positioning, and self-movement.
The research team envisions a promising future for their Bi-Touch system, with potential applications in industries like fruit-picking, domestic services, and even the possibility of integrating it into artificial limbs to recreate the sense of touch.
What’s particularly noteworthy is the system’s use of “affordable software and hardware” and the upcoming release of its code as open-source.
This approach ensures that teams worldwide can explore, experiment with, and customise the program to suit their specific objectives, fostering innovation and collaboration in the field.