The PR2 Cleans Up After Our Hackathons!

While we at Willow Garage tend to be paragons of tidiness, there are times when we do forget to pick up after ourselves.  During our second hackathon this month, we had the robot do the tidying for us: it pushed a wheeled cart around the office, collecting used cups, bowls, and boxes of Spam and carting them to the kitchen.  Because what's a roboticist without Spam?

Cart pushing presented new challenges for our navigation stack.  First, the cart occludes the area immediately in front of the robot from the robot's sensors.  Second, our default planning algorithm works best for approximately circular robots, while the robot and the cart together form a long, thin shape.  The navigation stack has a highly modular, plugin-based architecture, though, so we were able to substitute in the SBPL forward-search-based planner, from the University of Pennsylvania, which works much better in this setting.
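For readers curious what that substitution looks like in practice: because the navigation stack loads its global planner as a plugin, swapping planners is a parameter change rather than a code change. The sketch below shows roughly how move_base could be pointed at the SBPL lattice planner; the plugin string and parameter names are from the publicly released sbpl_lattice_planner package and may differ across ROS versions.

```yaml
# move_base parameter sketch (names may vary by ROS release)
base_global_planner: sbpl_lattice_planner/SBPLLatticePlanner
SBPLLatticePlanner:
  planner_type: ARAPlanner        # anytime search: returns a plan fast, then improves it
  allocated_time: 10.0            # seconds the planner may spend per request
  # Motion primitives describing how the long, thin robot+cart footprint can move:
  primitive_filename: pr2_with_cart.mprim   # illustrative filename
```

The key point is that the lattice planner searches over feasible motion primitives for the combined footprint, rather than assuming a roughly circular robot.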

The robot must also recognize cups and bottles in its environment and decide which ones need to be removed, avoiding any that contain liquids, which could be unsafe for our robot to carry.  We used a human-in-the-loop approach, in which the robot sends an image of the scene to a (possibly remote) human, who draws a box around the next object to grasp.  The robot figures out the 3D position of the corresponding object, then uses our soon-to-be-released grasping pipeline to pick up the object and place it in the cart.
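The box-to-3D step can be sketched with a standard pinhole back-projection: take a robust depth inside the human-drawn box and back-project the box center through the camera model. This is an illustrative sketch, not the actual Willow Garage code; the function names, the dict-based depth image, and the camera intrinsics are all assumptions.

```python
def pixel_to_ray(u, v, fx, fy, cx, cy):
    """Back-project a pixel through a pinhole camera model to a viewing ray."""
    return ((u - cx) / fx, (v - cy) / fy, 1.0)

def box_to_3d_point(box, depth_image, fx, fy, cx, cy):
    """Estimate a 3D grasp point from a human-drawn bounding box.

    box: (u_min, v_min, u_max, v_max) drawn by the operator.
    depth_image: dict-like mapping (u, v) -> depth in meters, or None
                 where the sensor returned nothing.
    """
    u_min, v_min, u_max, v_max = box
    depths = sorted(
        d
        for u in range(u_min, u_max)
        for v in range(v_min, v_max)
        if (d := depth_image.get((u, v))) is not None
    )
    if not depths:
        return None                      # no depth data inside the box
    z = depths[len(depths) // 2]         # median depth: robust to stray returns
    u_c, v_c = (u_min + u_max) / 2, (v_min + v_max) / 2
    rx, ry, rz = pixel_to_ray(u_c, v_c, fx, fy, cx, cy)
    return (rx * z, ry * z, rz * z)      # point along the ray at the median depth
```

Using the median depth rather than a single center pixel keeps the estimate stable when the box edges catch background or sensor dropouts.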

Our robot probably can't apply for a job at a restaurant just yet.  However, we believe that pushing carts, wheelchairs, and other wheeled objects is a very useful capability for personal robots, and we're continuing to work on improving its robustness.


Obscured sensors

I presume you could have a specially designed cart with additional sensors. When the robot grabs the cart, it also interfaces with the cart and gains access to those sensors.

The only modification to the cart is a checkerboard on the handle, which is used to detect the cart's location. The top deck of the cart was also removed to give the PR2's tilting laser a better field of view.

Due to the limited time frame of the hackathon (one week), there was no opportunity to add pluggable sensing capabilities to the cart. The obscured view means the robot has many more blind spots and, as such, can get stuck more easily. There's definitely more to improve before we can consider this a robust capability.
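The checkerboard-to-pose step mentioned above can be sketched as follows, assuming the board's two outer corners have already been detected and transformed into the robot's base frame (x forward, y left). The function and frame conventions are illustrative, not the real PR2 code.

```python
import math

def cart_pose_from_corners(left, right):
    """Estimate the cart handle's 2D pose from two checkerboard corners.

    left, right: (x, y) of the board's outer corners in the robot's base
    frame. Returns (x, y, yaw): the handle midpoint, and the heading of
    the board's outward normal (the direction the cart extends away
    from the robot).
    """
    mx = (left[0] + right[0]) / 2
    my = (left[1] + right[1]) / 2
    # Vector along the board, left -> right.
    bx, by = right[0] - left[0], right[1] - left[1]
    # Rotate that vector +90 degrees to get the outward normal: (-by, bx).
    yaw = math.atan2(bx, -by)
    return (mx, my, yaw)
```

For example, a board held squarely 0.6 m in front of the robot yields a pose directly ahead with zero yaw, which is what the grasp and pushing controllers would need.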

Cart before the horse

Would pulling the cart have worked?

The PR2 could have pulled the cart, but that would require it to move backwards. The PR2 does not have any rear-mounted sensors, though you can rotate the head to face backwards for stereo.