Willow Garage Blog

January 19, 2010

PR2, by Scott Morse

Last year, we commissioned Pixar/Red Window artist Scott Morse to do a painting of the PR2 Beta. Now that the PR2 Beta is unveiled, we can at long last share it with you. If you wander over to Scott's post on the Alternative Press Expo, you can see paintings he's done of other robots, like "Robot" from Lost in Space and Astro Boy.

Congrats to Scott and the rest of the Pixar artists, who just received a Golden Globe for their work on Up!  

January 15, 2010

Call for Proposals

Today we are unveiling our PR2 Beta robot! We are also announcing that research institutions around the world can apply to receive one at no cost.

For the past couple of years, our researchers, developers and interns have used prototypes of the PR2 to build exciting new capabilities for robotics. The PR2 program kickstarted development of ROS, an open source robotics platform. It also helped drive new capabilities in the OpenCV computer vision library. One year ago, we accomplished our first Milestone: autonomous navigation with the PR2 robot for 2π kilometers. Six months ago, we accomplished our second Milestone: opening doors, plugging in, and 26.2 miles of autonomous navigation. We are now completing our third Milestone, which solidifies the ROS and OpenCV software platforms that form the basis of the PR2 software system.

These past several months, you may have followed along on this site as our production floor was busy assembling the PR2 Beta piece by piece. Now that the PR2 Beta production robots are coming off the line, we're announcing our PR2 Beta Program, which will enable complete, production PR2 robots to leave the doors of Willow Garage for the first time. The PR2 Beta Program will make available approximately ten PR2 Beta robots at no cost.

We invite research institutions to respond to our Call for Proposals, describing the open source and scientific contributions that they can make with a PR2. The Call for Proposals contains additional details on how we will select the top proposals and distribute PR2s.

We believe that the PR2 Beta Program will accelerate robotics research and drive open source robotics development. We're excited to work with research organizations around the world in moving the open source robotics community forward.

You can find out more about the CFP, including downloadable CFP materials, on our PR2 Beta Program: CFP page.


January 14, 2010

PR2 No Covers

Stay tuned...

January 8, 2010

Victoria Groom of Stanford University has been performing teleoperation and autonomy experiments at Willow Garage, looking for ways to improve usability, safety, and effectiveness in human-robot interaction. While some robots are driven by co-located operators (e.g., teleoperated surgical robots), others are teleoperated remotely (e.g., search-and-rescue robots) or are even sent out to act autonomously (e.g., drones). These different types of interaction affect how people think and feel about robots. While remote teleoperation may be appropriate in one instance, the same type of interaction could be ineffective or even dangerous in another.

Victoria just finished running the last of approximately 84 study participants and has disassembled her experimental set-up.

For the last few months, one room at Willow Garage has been dedicated to Victoria's human-robot interaction experiment investigating the effects of autonomy vs. teleoperation, direct vs. mediated view, and real vs. simulated robot presence on the user experience of self-extension into robots. Self-extension refers to the feeling that an object is a part of oneself -- more like a limb or familiar tool than a separate entity.

To accomplish this, she set up a floor-to-ceiling curtain with a monitor-size rectangle cut into it. The curtain separates a small workspace, with desk, chair, and computer, from the robot and prop area. The robot area is outfitted with pictures of such items as a flashlight, tarp, rope, and water. These objects are part of the desert survival task, an exercise that asks participants to imagine that they are stranded in the desert and must select a limited number of survival-critical objects to salvage from their luggage. In Victoria's experiment, participants used the surveillance robot Rovio and were told to select their choices with the robot's assistance.

Half of the participants were told that Rovio autonomously approached the selections they made, while other participants used a small mobile device to drive the robot to their item choices. Volunteers were also put into different viewpoint conditions. While some were able to see the robot directly through the rectangular window in the curtain, others watched the robot on a large monitor with the help of a webcam mounted in the robot area. Additionally, some participants carried out the study with an actual Rovio, while others were shown the entire setup in a simulation format.

This study aims to uncover important insights into how robot autonomy, mediation, and virtuality affect people's behaviors toward and feelings about robots. With this increased understanding, we will be able to design and build robots that can be used more easily and effectively.

January 8, 2010

Box Turtle

In the coming weeks, there will be a flurry of 1.0 stack releases. The first of these are common 1.0, common_msgs 1.0, physics_ode 1.0, sound_drivers 1.0, and visualization_common 1.0.

We are making these releases in advance of our first ROS "Distribution" (Box Turtle), which will package our stable stacks together. Much like a Linux distribution (e.g., Ubuntu's Karmic Koala), the software stacks in the distribution will have a stable set of APIs for developers to build upon. We will release patch updates into the distribution, but otherwise keep these stacks stable. We've heard the community's need for a stable set of libraries to build upon, whether for research lab setups or classroom teaching, and we hope that these well-documented and well-tested distributions will fill that need. We will separately continue developing new features, so that subsequent distributions will have exciting new capabilities to try out.

We're very excited to be nearing the end of our Milestone 3 process. These 1.0 releases represent several months of coding, user testing, and documentation, giving the ROS community a broad set of stable robotics libraries to build upon. We appreciate the many contributions the community has made to these releases, from code, to bug reports, to participation in user tests. These releases also build upon many third-party robotics-related open source libraries.

For each of these releases, you will find "releases" and "change list" links with information on downloading the release and on what has changed:

Links to download/SVN tags

Change Lists

In most cases, the 1.0 releases will only contain minor updates from previous releases.

NOTE: not every stack in the final distribution will attain 1.0 status. We are reserving the 1.0 label for libraries that we believe to be the most mature and stable.

-- your friendly neighborhood ROS team

January 7, 2010

ROS 0.11

ROS 0.11 has been released! ROS 0.11 includes many of the final features and API changes that we wish to have in place for ROS 1.0. Our focus with this release was on consistency, performance, and bug fixes. We are also removing much of the deprecated functionality with this release, so that ROS 1.0 will ship with as little deprecated code as possible.

Some of the major updates in this release include new "wait for message" API routines in roscpp and rospy, bash tab-completion for most command-line tools, improved Python-related startup latency, and a streamlined service API in rospy.
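For example, the new rospy routine lets a node block until a single message arrives on a topic, with no need to set up a long-lived Subscriber and callback by hand. Here is a minimal sketch; the node name, the /chatter topic, and the std_msgs/String type are placeholders for illustration, not details from the release notes:

#!/usr/bin/env python
# Minimal sketch of the new "wait for message" convenience routine in rospy.
# The topic name and message type below are illustrative assumptions.
import rospy
from std_msgs.msg import String

rospy.init_node('wait_for_message_demo')

# Block until one message arrives on /chatter, or raise a ROSException
# if nothing shows up within 10 seconds.
msg = rospy.wait_for_message('/chatter', String, timeout=10.0)
rospy.loginfo('Received: %s', msg.data)

roscpp gained an analogous ros::topic::waitForMessage() call for C++ nodes.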

There were many other changes with this release as we work to improve and finalize the ROS feature set. You can find a complete list in the changelist.

You can download a tarball of the release, though we recommend that you start with the ROS installation instructions instead.

Notes on updating from 0.10 or earlier via SVN: A number of previously installed programs are now versioned scripts. When updating, SVN will raise an error rather than delete an installed program and add the script in its place. You will need to delete the program and run svn up again. The error will read:

svn: Failed to add file 'FILENAME': an unversioned file of the same name already exists

The solution is to run rm FILENAME && svn up again.

 

December 29, 2009

We spotted this video of an office prank that happened here a couple of years ago. It's one of our favorites and features modular CKbots from the ModLab at Penn, a ping pong robot, and a trap door concealed in a ceiling tile. We love modular robots and this shows off why -- need a robot to lift open your trap door in the ceiling so that your ping pong robot has a clear line of fire? No problem! 

When they weren't busy figuring out how to launch a surprise attack on their mentor, interns from the ModLab have also done a lot of fun "real" projects, like breaking eggs, building quick-change end effectors for the PR2, and squeezing juice bottles. If you think these sorts of robot pranks and projects are fun, perhaps you should check out our intern program.

[Thanks BotJunkie]

December 28, 2009

Peter Pastor, a PhD student at USC, spent the past three months developing software that allows the PR2 to learn new motor skills from human demonstration. In particular, the robot learned how to grasp, pour, and place beverage containers after just a single demonstration. Peter focused on tasks like turning a door handle or grasping a cup -- tasks that personal robots like PR2 will perform over and over again. Instead of requiring new trajectory planning each time a common task is encountered, his approach enables the robot to build up a library of movements that can be reused to accomplish these common tasks. For this library to be useful, learned movements must be generalizable to new goal poses. In real life, the robot will never face the exact same situation twice. Therefore, the learned movements must be encoded in such a way that they can be adapted to different start and goal positions.

Peter used Dynamic Movement Primitives (DMPs), which allow the robot to encode movement plans compactly. The parameters of a DMP can be learned efficiently from a single demonstration, allowing a user to teach the PR2 new movements within seconds. The imitation learning set-up thus allows a user to teach discrete movements, such as grasping, placing, and releasing, and then apply these motions to manipulate several objects on a table. This obviates the need to plan a new trajectory every single time a motion is reused. Furthermore, DMPs allow the robot to complete its task even when the goal is changed on the fly.
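To make the encoding concrete, below is a minimal, hypothetical sketch of a one-dimensional discrete DMP in the standard Ijspeert-style formulation that this line of work builds on: a spring-damper system pulls the state toward the goal g, while a learned forcing term -- a weighted sum of Gaussian basis functions gated by a decaying phase variable s -- shapes the trajectory. This is illustrative only, not the code from Peter's packages, and all parameter values are assumptions.

# Minimal one-dimensional discrete DMP sketch (standard Ijspeert-style
# formulation). Illustrative only -- not the actual PR2/ROS package code.
import numpy as np

class DMP:
    def __init__(self, n_basis=20, alpha=25.0, beta=6.25, alpha_s=4.0):
        self.alpha, self.beta, self.alpha_s = alpha, beta, alpha_s
        # Gaussian basis centers spaced along the decaying phase s,
        # with widths chosen by a common heuristic.
        self.centers = np.exp(-alpha_s * np.linspace(0.0, 1.0, n_basis))
        self.widths = n_basis ** 1.5 / self.centers
        self.weights = np.zeros(n_basis)

    def _forcing(self, s):
        # Forcing term: phase-gated, normalized weighted sum of basis functions.
        psi = np.exp(-self.widths * (s - self.centers) ** 2)
        return s * psi.dot(self.weights) / (psi.sum() + 1e-10)

    def learn(self, demo, dt):
        # Fit the basis weights to a single demonstrated trajectory.
        x0, g = demo[0], demo[-1]
        v = np.gradient(demo, dt)
        a = np.gradient(v, dt)
        t = np.arange(len(demo)) * dt
        s = np.exp(-self.alpha_s * t / t[-1])      # canonical phase, 1 -> 0
        # The forcing term that the demonstration implies at each time step.
        f_target = a - self.alpha * (self.beta * (g - demo) - v)
        psi = np.exp(-self.widths * (s[:, None] - self.centers) ** 2)
        X = psi * s[:, None] / (psi.sum(axis=1, keepdims=True) + 1e-10)
        self.weights, _, _, _ = np.linalg.lstsq(X, f_target, rcond=None)
        self.x0, self.g, self.T = x0, g, t[-1]

    def rollout(self, x0=None, g=None, dt=0.01):
        # Reproduce the movement, optionally with a new start and/or goal.
        x0 = self.x0 if x0 is None else x0
        g = self.g if g is None else g
        x, v, s, path = x0, 0.0, 1.0, []
        for _ in range(int(self.T / dt)):
            acc = self.alpha * (self.beta * (g - x) - v) + self._forcing(s)
            v += acc * dt
            x += v * dt
            s -= self.alpha_s * s / self.T * dt    # phase decays toward 0
            path.append(x)
        return np.array(path)

# Example: learn from one demonstration, then replay with a new goal.
# demo = np.sin(np.linspace(0.0, np.pi / 2, 200))   # hypothetical trajectory
# dmp = DMP()
# dmp.learn(demo, dt=0.01)
# adapted = dmp.rollout(g=2.0)   # same movement shape, different goal

Because the forcing term depends on the phase s rather than on time or absolute position, the same learned weights reproduce the movement's shape when the start or goal changes -- which is exactly the generalization property described above.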

You can find out more about the open source code for Peter's work here, and check out his presentation slides below (download PDF). For more about Peter's research with DMPs and learning from demonstration, see "Learning and Generalization of Motor Skills by Learning from Demonstration", ICRA 2009.

December 21, 2009

Integration Test

Part 1: Building a New Lab
Part 2: Stockroom Ready for Raw Materials
Part 3: Gripper Production Underway
Part 4: First Production Grippers Complete
Part 5: Casters, Heads, Arms and More

For the past couple of months, the production floor has seen a collection of modular PR2 Beta sub-assemblies: grippers, casters, and heads. Last week, the first shoulder sub-assemblies came off the line, providing us with all of the sub-assemblies needed to integrate the first PR2 Beta.

A shelf full of casters yielded a mobile base. Cameras, pan-tilt platforms, and tilting lasers became a sensor head. At 6pm last Friday, all of these parts came together, along with a spine assembly, and, for the first time, powered on as one whole unit.

By 9pm, the armless robot was driving around the building. Grippers, forearms, upper arms and shoulders were added over the weekend, and what was once a collection of parts began to look more like a full PR2.

There is much more to do. The robot is now with our software team to test the integrated system and transfer the capabilities of our PR2 Alpha prototypes to the new system. Next, the robot will return to the production team, and they will continue to integrate, refine, and tune.

 - The PR2 Builders

Photos: mobile base, shoulder ready for burn-in, production servers, production batteries.

December 14, 2009

Our online animation study went live and has now run its course! We invited you to weigh in with your perspective on our PR2 animations; you got a sneak preview of these animations while they were under development, and the full study followed. The study is now closed, but you're welcome to try it out anyway.

The above video shows an example of the type of clip you'll find in the full-length study. We're investigating how to make personal robot behaviors more human-readable by identifying important principles that can inform the design of more effective human-robot interactions. To do this, we need your feedback on how you interpret certain robot behaviors. Please follow the link up top, and help us make PR2 more human-readable.

We've submitted the results of this study for peer-reviewed publication so that others can use the lessons learned from this study on different robots, too.