Researchers have developed “Clio,” a method that enables dog-like robots to learn to play fetch using AI and computer vision.
Published on October 10 in IEEE Robotics and Automation Letters, Clio allows robots to map scenes in real time and focus on relevant objects based on voice commands.
For instance, a Boston Dynamics Spot robot demonstrated Clio’s ability to identify and retrieve specific items in an office environment, Live Science reported.
This technology could lead to robots that make intuitive, task-oriented decisions and eventually handle more complex tasks, such as fetching objects in a variety of settings.
Written by B.C. Begley
