Beer Me, Robot

Around 5 PM on Fridays, many of us here at Willow Garage start thinking that a cold one would taste pretty good.  However, we often have a few loose ends to tie up before the weekend begins in earnest.  In this situation we've often thought about how perfect it would be to have the robot autonomously deliver beer.  The goal of Willow Garage's third summer hackathon was to make this dream a reality.  The idea of the hackathon is to start hacking Monday morning and demo on Friday afternoon, using all of the existing ROS tools and packages.  Sleep is highly optional. 

For this hackathon, our goal was to make beer fetching as robust and user-friendly as possible.  We also wanted the robot to safely transport the beer to any office, which requires navigation with the arms tucked.  For safe beer transport we designed a bar-keeper add-on to the PR2 base -- a four-holed foam block placed behind the robot's base navigation laser.  Three round holes are for stowing beers during navigation, with the fourth hole storing a convenient beer opener that the robot can pick up.  Equipping our standard fridge with a tilted self-stocking rack meant that the robot could service many user requests without human intervention.    

The user experience begins with the Beer Me web application.  In this web app, the user is presented with a menu of ice cold beers and ciders, and a pull-down menu specifying the office for delivery.  Once the user hits the enticing Beer Me button, it's the robot's job to make the magic happen.  The robot navigates to the fridge, identifies the door, and performs a handle detection to determine a precise grasping location to use for opening.  The robot then grasps and pulls open the handle, and positions itself between the door and the fridge to make sure the door doesn't close.
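The fetch sequence above lends itself to a simple state-machine sketch. Everything below -- the step names and the stubbed actions -- is illustrative only, not the actual PR2 code (which is built from existing ROS tools and packages):

```python
# Illustrative sketch of the beer-run sequence described above.
# Step names and stubbed actions are hypothetical, not the real PR2 code.

FETCH_SEQUENCE = [
    "navigate_to_fridge",
    "detect_door",
    "detect_handle",    # find a precise grasping location on the handle
    "grasp_and_open",
    "block_door",       # stand between the door and the fridge
]

def run_sequence(steps, actions):
    """Run each step's action in order; abort and report the step on failure."""
    for step in steps:
        if not actions[step]():
            return ("failed", step)
    return ("succeeded", None)

# Stub actions that always succeed, just to exercise the sequence.
actions = {step: (lambda: True) for step in FETCH_SEQUENCE}
status, failed_at = run_sequence(FETCH_SEQUENCE, actions)
print(status)  # -> succeeded
```

A failed step (say, the handle detection not finding a grasp) would short-circuit the run and report which step broke, which is roughly how the app can tell the user something went wrong.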

The robot uses object recognition to determine which beers are in the rack, and will report back to the app if the user's selection is not available.  Otherwise, it loads the ordered beers into the foam holder, closes the door, and navigates to the indicated office.  The final piece of the puzzle is the handoff.  We wanted to make very sure the robot didn't commit a party foul and drop beers on the floor, so we added face detection to the handoff behavior.  The robot offers a beer and waits until it detects a nearby face.  It then looks at the closest person and releases the gripper when the beer is tugged.  The robot will also offer the bottle opener and wait for it to be returned.  We even got the robot to open beers itself using a standard bottle opener.
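The handoff logic described above -- wait for a nearby face, then release when the beer is tugged -- can be sketched like this. The distance and force thresholds are made-up numbers, not the values used on the PR2:

```python
# Hypothetical handoff loop: offer the beer, wait for a nearby face,
# then release the gripper once a tug is felt. Thresholds are invented.

FACE_DISTANCE_MAX = 1.0   # metres: how close a face must be to count
TUG_FORCE_MIN = 4.0       # newtons of pull before the gripper opens

def handoff(face_distances, tug_forces):
    """Walk paired sensor readings; return the index at which the
    robot releases the beer, or None if it never hands it over."""
    face_seen = False
    for i, (dist, force) in enumerate(zip(face_distances, tug_forces)):
        if dist is not None and dist < FACE_DISTANCE_MAX:
            face_seen = True          # someone is close enough to offer to
        if face_seen and force > TUG_FORCE_MIN:
            return i                  # the beer was tugged: open the gripper
    return None
```

For example, `handoff([None, 0.8, 0.8], [0.0, 1.0, 6.0])` releases at index 2: a face appears at 0.8 m, and only then does a 6 N tug trigger the release. Gating the release on a detected face is what keeps the robot from letting go onto the floor.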

Roboticists get used to hearing: "That's pretty cool, but can it bring me a beer?".  Well now the PR2 can, and it may even open the bottle for you.

RViz beer classification



That was great -- the best beer-fetching robot I have ever seen. The bottle opening was especially cool. I would love to know how much machine learning, motion planning, and computer vision is involved. Does it calculate the arm movement to pick up the bottle from scratch every time? And do you use lasers to model the fridge, or is it mostly CV-based? ++nicolau

We use live perception

We use live perception. We combine data from several sources in order to reliably detect the fridge and bottles, and we identify the bottles via their labels. We also use motion planning for the actual pickup motion -- some of the other motions are pre-recorded.
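As a rough illustration of combining detections from several sources (say, a laser-based shape match and a camera-based label match), here is a hypothetical majority-vote fusion; the function name and voting scheme are assumptions for illustration, not the code running on the robot:

```python
# Illustrative fusion of per-source label detections for one bottle slot.
# The voting scheme and agreement threshold are assumptions.

from collections import Counter

def fuse_detections(*source_labels):
    """Majority vote across label lists reported by each sensor source;
    trust a label only when more than one source agrees on it."""
    votes = Counter(label for labels in source_labels for label in labels)
    if not votes:
        return None
    label, count = votes.most_common(1)[0]
    return label if count >= 2 else None
```

So `fuse_detections(["stout"], ["stout", "lager"])` yields `"stout"` (two sources agree), while a label seen by only one source is rejected -- the point being that cross-checking independent detections is what makes the recognition reliable enough to report stock back to the app.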


Best system for delivering a beer since the wench was invented. Only suggestion: can it be programmed to say "It's 5 o'clock somewhere", to make me feel all right about ordering a beer at 10:30 am just to see it work?


I was just wondering if you could use the web cam on your computer to take a picture of the orderer and then only give the beer to that person using face recognition.


Really good! Great job, guys. But can someone explain why it is so slow? I'm not knocking it in the slightest -- it's a great achievement as it is. I'm just curious about the real issues behind getting it to work at the same pace as a person fetching a beer, to approach real practicality. I noticed the film is sped up by quite a rate; I'm presuming it's a combination of image processing/recognition and the motion limitations of the robot? It's a shame, as robots have always been really slow at doing real-world stuff, save for the simplest of roles, such as welding car pieces together -- essentially just basic motion scripting. Thanks!

Thanks for the question

There are a number of reasons that the robot takes longer than a human to fetch a beer, but the top two are probably perception and planning. With current perception algorithms, detecting the handle on a fridge or recognizing different beers takes a lot of computational power, so the robot spends a fair amount of time sitting still while those algorithms run. Similarly, computing arm plans that grasp a beer while avoiding obstacles and keeping the bottle upright is also computationally intensive. We certainly want robots to be useful in the real world, and we're working hard both to speed up our code and to find applications for which speed isn't a huge factor. For example, if the robot takes five times as long to clear your table after a meal, you might not care as long as you're not doing it and it gets done eventually. Speed isn't everything, but it's definitely a perk... especially when there's beer involved.

Try a GPU instead of CPU

Get a graphics card, then open a memory stream to the GPU and use that for your computations; it's 10x faster than any processor today. I believe it's ATI that provides API access to the GPU stream.

Try a GPU instead of CPU - Addition

It is actually NVIDIA. Their CUDA interface allows running computations in a highly parallelized manner, which can offer much more than a 10x speedup.

Thanks for the suggestion

Thanks for the suggestion. If you check out:

You'll see that there are already many in the ROS community working on GPU acceleration.


Great job, I am looking forward to seeing the PR2 arriving in Freiburg! Cyrill