Google DeepMind unveils robot that plays table tennis at amateur level
Google has announced a new advance in artificial intelligence: a robot built on DeepMind technology is now capable of playing table tennis at an “amateur level” of skill.
Google researchers report that their table tennis robot won 13 of 29 matches against human players, with its success rate varying with the opponent's skill level, from beginner to advanced.
According to the team, this is the first robot capable of playing a sport with humans at a human level, an important milestone in robot learning and control. The researchers note, however, that it is only a small step in the broader effort to teach robots useful real-world skills.
The DeepMind team chose table tennis because the game combines many demanding elements, from the complex physics of ball movement to the hand-eye coordination needed to hit the ball successfully.
The robot was trained on each specific type of shot separately. These low-level skills were then combined with a higher-level algorithm that selects the right type of shot for each incoming ball.
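The hierarchical setup described above can be sketched roughly as follows. This is a toy illustration, not DeepMind's actual code: the function names, the two example shot types, and the selection rule are all assumptions made for clarity.

```python
def select_shot(ball_x):
    """High-level controller: pick a shot type for the incoming ball.

    Toy rule: balls on the right half of the table (ball_x > 0) get a
    forehand, the rest a backhand. The real system instead scores each
    separately trained skill and picks the most promising one.
    """
    return "forehand_drive" if ball_x > 0 else "backhand_push"

def execute(shot, ball_x):
    """Stand-in for a low-level skill policy: the real robot would
    produce joint commands; here it just reports what it would do."""
    return f"{shot} at x={ball_x:+.2f}"

for x in (0.3, -0.2):
    print(execute(select_shot(x), x))
```

The key design idea is the separation of concerns: each low-level skill only has to master one stroke, while the high-level controller only has to choose among them.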
The robot struggled most with fast shots, which leave the AI less time to react. The researchers are already considering ways to improve the system, including making its play less predictable. The robot also has a built-in ability to learn from an opponent's strategies and weigh their strengths and weaknesses.
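One simple way to picture the opponent-adaptation ability mentioned above is a running tally of how well each shot type has worked against the current opponent. The sketch below is an illustrative stand-in under that assumption, not DeepMind's actual method; the class and its tracking scheme are invented for this example.

```python
from collections import defaultdict

class OpponentModel:
    """Toy opponent model: track per-shot success against one opponent
    and prefer the shot type with the best observed win rate."""

    def __init__(self):
        self.wins = defaultdict(int)
        self.tries = defaultdict(int)

    def record(self, shot, won):
        # Update the tally after each point.
        self.tries[shot] += 1
        if won:
            self.wins[shot] += 1

    def best_shot(self, shots):
        # Untried shots get an optimistic rate of 1.0 so each one is
        # explored at least once before being ruled out.
        def rate(s):
            return self.wins[s] / self.tries[s] if self.tries[s] else 1.0
        return max(shots, key=rate)

model = OpponentModel()
model.record("forehand_drive", won=False)
model.record("forehand_drive", won=False)
model.record("backhand_push", won=True)
print(model.best_shot(["forehand_drive", "backhand_push"]))  # backhand_push
```

In practice the real system would also weigh context such as ball speed and placement, but even this crude tally shows how play can shift toward an opponent's weaknesses over the course of a match.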