Kodiak Lends a Hand at Cornell
At Cornell University, the Personal Robotics lab is home to robots named Blue, Polar, Panda, and the latest addition, Kodiak, a PR2 robot from Willow Garage. Because Cornell's mascot is a bear, all of the lab's robots are named after famous members of the family Ursidae.
To date, the team at Cornell has been programming its robots to perform everyday chores: grasping novel objects, unloading items from a dishwasher, putting away new items, organizing a disorganized house, finding and retrieving items on request, and much more.
A summary of these projects is available here.
Now, Ashutosh Saxena, Assistant Professor of Computer Science and director of Cornell's Personal Robotics lab, is putting PR2 to work on similar tasks. For PR2 and its descendants to be truly useful personal robots, they need to learn challenging tasks such as cooking a simple meal and operating household appliances. As the following videos show, PR2 has already taken its first few steps: it can now perceive a cardboard box and plan its actions so as to close the box autonomously.
PR2 also shows off its culinary and cutlery skills.
The degree of difficulty in both of these examples is extremely high, because getting robots to perceive such environments well enough to manipulate objects successfully is challenging. Most robots have been designed to work in highly structured environments such as factories; human environments, by contrast, are inherently unstructured: we are a messy species. In all likelihood, the objects today's personal robots encounter have never been seen or handled by a robot before. Yet humans are thoroughly familiar with, and comfortable in, our unstructured worlds. The Cornell team, with expertise in broad-competence artificial intelligence techniques, is developing new algorithms and software to bridge the gap between a robotics industry that excels in structured environments and human environments that lack such structure.
One example is teaching PR2 to perceive its 3D environment so that it can identify objects, figure out where they belong, and put them away. Many 3D images are stitched together into an overall view of the room, which the algorithm then divides into blocks based on discontinuities in color and shape. The robot has been shown several examples of each kind of object and has learned their common characteristics. For each block it computes the probability of a match with each object in its database and chooses the most likely match. The ROS packages for doing so are available at: http://pr.cs.cornell.edu/codedata.php
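The match-and-label step described above can be sketched in a few lines. This is not the lab's actual code (that lives in the ROS packages linked above); it is a simplified toy in which each block is reduced to a hypothetical three-number feature vector, each object class is summarized by the average of the examples the robot has been shown, and the block is assigned the class whose prototype is most probable under a softmax over distances:

```python
import math

# Hypothetical toy features per block: (mean hue, height above floor, flatness).
# These stand in for the richer color/shape features the real system computes.
TRAINING = {
    "table_top": [(0.10, 0.75, 0.95), (0.12, 0.72, 0.90)],
    "keyboard":  [(0.55, 0.76, 0.85), (0.50, 0.74, 0.80)],
    "wall":      [(0.08, 1.50, 0.99), (0.09, 1.40, 0.97)],
}

def class_means(training):
    """Average the shown examples to get one prototype vector per object class."""
    means = {}
    for label, examples in training.items():
        n = len(examples)
        means[label] = tuple(sum(e[i] for e in examples) / n
                             for i in range(len(examples[0])))
    return means

def classify(block, means):
    """Score each class by (negative) distance to its prototype, turn the
    scores into probabilities with a softmax, and return the best label."""
    scores = {label: -math.dist(block, proto)   # closer prototype => higher score
              for label, proto in means.items()}
    z = sum(math.exp(s) for s in scores.values())
    probs = {label: math.exp(s) / z for label, s in scores.items()}
    return max(probs, key=probs.get), probs

means = class_means(TRAINING)
label, probs = classify((0.52, 0.75, 0.82), means)
print(label)  # this block's features sit nearest the "keyboard" prototype
```

A nearest-prototype classifier is only a stand-in for the learned models the lab uses, but it illustrates the same shape of computation: per-block features in, a probability per known object out, argmax to label.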
To figure out how to perform manipulation, Saxena's lab plans to have the robot learn by observing people carry out daily activities in the home, including manipulating objects. Addressing the problem in this data-driven way will allow robots to perform several tasks autonomously. And because robots can be networked, they will be able to routinely improve their capabilities using Internet data resources, which means the practical experience of one robot can be shared by others. For example, Yun Jiang, a PhD student in the Robot Learning Lab, used Google's 3D Warehouse to have robots automatically learn how objects are used by humans, and then transferred those manipulation skills to personal robots.
The personal robotics team at Cornell consists of 10 students together with expert faculty, including Professor Bart Selman (artificial intelligence and planning), Professor Doug James (computer graphics simulation), Professor Thorsten Joachims (machine learning), and Professor Ashutosh Saxena (robot learning and perception).