[Feature Story - Robotics] New Skills for Robots: Table Tennis & Clothes-Folding!

by Pei-Wen Wang

Robots with ever more functions are being built on a growing range of technologies. Below are two recent examples.

Clothes-folding in two minutes

SpeedFolding, a two-armed AI robot developed by AUTOLAB at UC Berkeley, folds clothes with the help of machine vision, a BiManual Manipulation Network (BiMaMa-Net), and two robotic arms. Although it is still slower than a human, it already far outperforms other robots at the same task.

Because of the high-dimensional configuration space and complex dynamics involved in handling fabric, folding clothes efficiently and reliably has long been a challenge in robotics. SpeedFolding can currently fold 30 to 40 randomly placed garments per hour, roughly one every two minutes, with a 93% success rate, while other robots manage at most 3 to 6 pieces per hour.

To minimize wrinkles during the process, the research team used two robotic arms, which let the robot lay the garment flat and then fold it along designated fold lines; this is what makes it 5 to 10 times faster than other robots that work with a single arm. SpeedFolding first examines how the garment is lying through its camera, decides which two points of the garment to pick up, and smooths it flat into a position from which folding can begin, as sketched below. The researchers also mention possible future applications of the technology, including in the textile industry and in hospitals.
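As a rough picture of that perception-then-bimanual-manipulation cycle, the Python sketch below shows one way such a loop could be structured. Every class, method, and parameter name here is a hypothetical stand-in chosen for illustration; this is not the actual AUTOLAB or BiMaMa-Net code.

```python
# Illustrative sketch of a SpeedFolding-style bimanual folding loop.
# All names here are hypothetical placeholders, not the real implementation.

class BimanualFoldingRobot:
    def __init__(self, camera, left_arm, right_arm, policy):
        self.camera = camera      # overhead camera observing the workspace
        self.left_arm = left_arm
        self.right_arm = right_arm
        self.policy = policy      # learned model proposing grasp points and folds

    def smooth_and_fold(self, max_attempts=10):
        for _ in range(max_attempts):
            image = self.camera.capture()              # observe how the garment lies
            state = self.policy.estimate_state(image)  # crumpled, flattened, or folded?

            if state == "folded":
                return True

            if state == "crumpled":
                # The policy proposes a pair of grasp points so both arms
                # can pull the garment flat at the same time.
                left_pt, right_pt = self.policy.propose_grasp_pair(image)
                self.left_arm.grasp(left_pt)
                self.right_arm.grasp(right_pt)
                self.left_arm.stretch_and_place()
                self.right_arm.stretch_and_place()
            else:  # already lying flat: execute the designated fold lines
                for fold_line in self.policy.fold_sequence(image):
                    # Both arms move together along the fold line.
                    self.left_arm.fold_along(fold_line)
                    self.right_arm.fold_along(fold_line)

        return False  # gave up after max_attempts
```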


A table tennis robot that sustains a 340-hit rally with humans

i-Sim2Real is a Google AI research and development project that has successfully trained a robot to sustain a 340-hit table tennis rally with a human player. Sim2Real is an approach to building AI models in which machine learning is trained in a virtual or simulated environment; the resulting model then applies the acquired knowledge to the real world, which greatly shortens training time.

However, training cannot rely on the virtual environment alone, because real-time human responses are difficult to simulate completely, so supplementary real-world data is essential. To deal with this challenge, the i-Sim2Real researchers begin training against a simple model of human behavior; even though this model is far from how people actually play, it is sufficient while the robot is still at an early stage of learning. The team then goes back and forth between simulation and the real environment, refining both the human model and the robot with each round, as in the sketch below.
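In rough terms, that alternation might look like the Python sketch below. The callables passed in (train_in_simulation, play_real_rallies, fit_human_model) are hypothetical placeholders for components the article only describes at a high level; this is a sketch of the idea, not Google's actual training code.

```python
# Simplified sketch of an i-Sim2Real-style alternation between simulation
# and the real world. All callables are hypothetical placeholders.

def i_sim2real_training(robot_policy, initial_human_model,
                        train_in_simulation, play_real_rallies,
                        fit_human_model, num_rounds=5):
    # Start from a deliberately simple, hand-made model of human play.
    human_model = initial_human_model

    for _ in range(num_rounds):
        # 1) Cheap, fast training against the current human model in simulation.
        robot_policy = train_in_simulation(robot_policy, human_model)

        # 2) Deploy the improved policy in the real world and record rallies
        #    against real human players.
        real_rally_data = play_real_rallies(robot_policy)

        # 3) Refit the human model on the newly collected real data, so the
        #    next simulated round is closer to how people actually respond.
        human_model = fit_human_model(real_rally_data)

    return robot_policy
```

The key point is that the human model starts out crude and is refitted on each batch of real rallies, so the fidelity of the simulation and the robot's skill improve together.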


Conclusion

Technology and research in robotics keep advancing, which not only makes people's lives more convenient but also brings about many interesting applications.

