Willow Garage Blog

January 4, 2011

Daniel Hennes from Maastricht University spent his internship at Willow Garage modeling the dynamics of robotic manipulators using statistical machine learning techniques. He also created a visualization utility for ROS/rviz that lets users intuitively visualize a robot's joint motor torques. Please watch the video above for an overview or read the slides below (download PDF) for more technical details. The software is available as open source in the inverse_dynamics and dynamics_markers packages on ROS.org.
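
For readers who want to try something similar, here is a minimal sketch of the general idea behind torque visualization in rviz. It is not the dynamics_markers implementation; the joint frame, topic, and scaling below are illustrative assumptions only.

```python
#!/usr/bin/env python
# Hedged sketch: publish an rviz Marker arrow whose length is proportional to a
# joint's torque estimate. Not the dynamics_markers code; names are examples.
import rospy
from visualization_msgs.msg import Marker

def make_torque_arrow(frame_id, torque, marker_id=0):
    m = Marker()
    m.header.frame_id = frame_id            # e.g. the joint's child link frame
    m.header.stamp = rospy.Time.now()
    m.ns = "joint_torques"
    m.id = marker_id
    m.type = Marker.ARROW
    m.action = Marker.ADD
    m.pose.orientation.w = 1.0              # arrow points along the frame's x-axis
    m.scale.x = max(0.05, abs(torque) * 0.05)  # arrow length scales with torque
    m.scale.y = 0.02                        # arrow width
    m.scale.z = 0.02                        # arrow height
    m.color.r, m.color.g, m.color.b, m.color.a = 1.0, 0.2, 0.2, 1.0
    return m

if __name__ == "__main__":
    rospy.init_node("torque_marker_demo")
    pub = rospy.Publisher("visualization_marker", Marker)  # default rviz Marker topic
    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        # A constant placeholder torque; a real node would read it from the robot.
        pub.publish(make_torque_arrow("r_elbow_flex_link", torque=3.5))
        rate.sleep()
```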

January 3, 2011

138.9km

For the past several weeks, visitors to Willow Garage have witnessed a PR2 robot driving itself around and plugging itself in to recharge -- again and again and again. The goal was to achieve a new record of seven days of continuous operation, which it accomplished on December 15th. We weren't content to stop there, so we let it keep on going. The PR2's record-breaking run came to a conclusion over the holidays: 138.9 km of autonomous navigation over 13 days and 2 hours of operation.

This record wasn't just about being robust -- it was about what robustness enables for the PR2 and ROS as a platform. Robots have to be able to take care of themselves in order to become more useful as personal assistants at home. Robustness is also a boon for developers: instead of writing applications that are limited by the robot's battery life, developers can now write applications that take days, even weeks, to complete.

During the run, there were only two interventions: one to help the robot maneuver around a chair, and another to tell the robot where it was ("re-localization"). In both cases, the robot noticed there was an issue and sent a message for help, and the issue was resolved over the web.
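
As a rough illustration of this "call for help" pattern (this is not Willow Garage's actual tooling, and the topic, status check, and addresses are assumptions), a watchdog node could monitor navigation goal status and send an email, or a message through an email-to-SMS gateway, whenever a goal is aborted:

```python
#!/usr/bin/env python
# Hedged sketch of a stuck-robot watchdog: alert a human when move_base aborts
# a goal. Addresses and thresholds are hypothetical.
import smtplib
from email.mime.text import MIMEText

import rospy
from actionlib_msgs.msg import GoalStatus, GoalStatusArray

NOTIFY_ADDRESS = "oncall@example.com"   # hypothetical on-call address
last_notification = None                # used to rate-limit alerts

def notify(reason):
    msg = MIMEText("The robot reports it is stuck: %s" % reason)
    msg["Subject"] = "Robot needs help"
    msg["From"] = "robot@example.com"
    msg["To"] = NOTIFY_ADDRESS
    smtplib.SMTP("localhost").sendmail(msg["From"], [NOTIFY_ADDRESS], msg.as_string())

def status_cb(status_array):
    global last_notification
    now = rospy.Time.now()
    # Send at most one alert every ten minutes.
    if last_notification is not None and (now - last_notification).to_sec() < 600:
        return
    for status in status_array.status_list:
        if status.status == GoalStatus.ABORTED:
            notify(status.text or "navigation goal aborted")
            last_notification = now
            return

if __name__ == "__main__":
    rospy.init_node("stuck_watchdog")
    rospy.Subscriber("move_base/status", GoalStatusArray, status_cb)
    rospy.spin()
```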

December 28, 2010

Visiting scholar Christian Connette from Fraunhofer IPA has just finished up his projects here at Willow Garage. Christian works on the Care-O-bot 3 robot platform, which shares many ROS libraries with the PR2 robot. While he was here at Willow Garage, he worked on implementing an "elastic band" approach (Quinlan and Khatib) for the ROS navigation stack. You can watch the video above to find out more about this work, or check out the slides below for more technical details (download PDF). The software is available as open source in the eband_local_planner package on ROS.org.
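
To give a flavor of the approach, here is a toy sketch of the elastic band idea from Quinlan and Khatib: a path is treated as a band of points deformed by an internal contraction force and an external repulsion force from obstacles. This is not the eband_local_planner code, which works on the ROS costmap and produces velocity commands; the gains and radii below are arbitrary.

```python
# Toy elastic band relaxation on a 2D path (illustrative only).
import numpy as np

def elastic_band_step(path, obstacles, k_internal=0.5, k_external=1.0,
                      influence_radius=1.0, step=0.1):
    """One relaxation step. path is an (N, 2) array, obstacles is (M, 2)."""
    new_path = path.copy()
    for i in range(1, len(path) - 1):               # endpoints stay fixed
        # Internal force: pull the point toward the midpoint of its neighbors.
        f_int = k_internal * (0.5 * (path[i - 1] + path[i + 1]) - path[i])
        # External force: push away from obstacles inside the influence radius.
        f_ext = np.zeros(2)
        for obs in obstacles:
            diff = path[i] - obs
            dist = np.linalg.norm(diff)
            if 0.0 < dist < influence_radius:
                f_ext += k_external * (influence_radius - dist) * diff / dist
        new_path[i] = path[i] + step * (f_int + f_ext)
    return new_path

# Example: a straight path is bent smoothly around a single obstacle.
path = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [3.0, 0.0], [4.0, 0.0]])
obstacles = np.array([[2.0, 0.2]])
for _ in range(50):
    path = elastic_band_step(path, obstacles)
print(path)
```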

December 20, 2010

Adam Leeper from Stanford has just finished up his projects here at Willow Garage aimed at helping the PR2 grasp objects more reliably, including a new stereo sensor for the PR2 gripper, new teleoperation interfaces, and a grasp adjustment algorithm. You can watch the video for a quick overview. If you're interested in finding out more, please see the pr2_remote_teleop, pr2_gripper_stereo, and pr2_gripper_grasp_adjust packages on ROS.org. You can also read his slides below.

December 19, 2010

100km

Last Wednesday we reached the official goal for our "continuous operation" project: seven days of continuous operation.  PR2 logged 70 km of autonomous navigation during that period and had to plug itself in countless times.  So, we wondered, how much farther can it go?  Last night the PR2 made it to 100 km of autonomous navigation, and it's now at 110 km.  Go PR2 go!

December 15, 2010

Continuous Ops

Our "Continuous Ops" team (Eitan Marder-Eppstein, Wim Meeussen, and Kevin Watts) just completed a new milestone that shatters all of our previous robustness records: 7 days, 70 km (43.5 miles) of continuous operation. The robot had to autonomously navigate around the office and plug itself in whenever it ran low on battery.  The robot was allowed to announce it was "stuck" by sending a text message to the team.  The team was then allowed to use a web page to get the robot out of the stuck situation, which they only had to do twice. As you can see from the photo above, the robot picked up some pieces of flair along the way.

Getting to this level of robustness with the PR2 and ROS took lots of hard work -- over 250 km of debugging Linux kernel panics, building lights that wouldn't stay on, overheating batteries, and overzealous safety lockouts.  In fact, they completed three autonomous "marathons" (26.2 miles each) getting to this point.  All of their improvements will soon be released to the rest of the PR2 and ROS community.  Congrats team!

December 15, 2010

PR2 Beta Recipients

We sent 11 PR2s out the door last May as part of our PR2 Beta Program.  Now that they've been in the hands of researchers around the world for 6 months, the PR2s have started showing off a lot of new tricks.  We heard about their progress last week during the second PR2 Beta Conference Call. The call was a chance for all the Beta Program recipients to show off what they've accomplished since the first call.  Below is a partial list of software that's been released in their public code repositories over the past few months, or you can find more details in the slides and audio from the call.

Bosch

The shared_autonomy stack is a collection of tools for combining human control and automated algorithms, including:

  • augmented_object_detector provides fast and effective object detection, based on a combination of human input and machine vision.
  • safe_teleop_base adds collision avoidance for easier joystick control of the robot.
  • augmented_grasp_planner automatically identifies potential grasps and lets a person select the best one for the situation.

Georgia Tech

  • clutter_segmentation separates the world into "surface" and "clutter" regions, as well as providing a dataset of cluttered scenes from real environments.
  • overhead_grasping provides a simple way to quickly pick up basic objects from above.

KU Leuven

Penn

  • sbpl_lattice_planner does more advanced planning for the PR2 base motion, including planning for rotations.
  • roshask is the start of a new client library for the Haskell language.

USC

  • USC has been working on improved calibration of sensors and studying how robots should position themselves for interaction with people.

Freiburg

  • octomap2 builds efficient 3D models from point-cloud data.
  • The articulation stack builds models of hinges and mechanisms from sensor data, and includes a great set of tutorials.
  • NARF is a set of keypoint detectors and descriptors that have been integrated into pcl.

MIT

  • The object_survey package allows a PR2 to automatically gather 3D models of objects such as chairs by driving around and taking data from all angles.

Stanford

  • Several integrated demos, including recyclerbot.
  • pr2_stanford_wbc adapts the Stanford whole-body-control framework for use on the PR2, which provides a powerful way of specifying tasks for the robot.

UC Berkeley

JSK

  • Lots of updates to euslisp, including support for OSX, 30+ new robot models, and 100+ new object models.
  • Lots of updates to openRAVE, including official PR2 support and demos, and an updated ikfast kinematics engine.
  • elevator_move_base_pr2 helps the PR2 navigate multi-story buildings using elevators.
  • cr_capture and opt_camera drivers for depth and 360-degree cameras.
  • COLLADA integration with euslisp in euscollada, URDF in collada_urdf, and OpenRAVE, as well as robot-specific COLLADA extensions.

TUM

As you can see, the PR2 Beta Sites have produced an incredible amount of new code and moved a great deal of their research forward since they took delivery in May. If the past few months are any indication, we are looking forward to exciting new developments and contributions from the PR2 Beta Sites in 2011 and beyond.

December 14, 2010

When we first made the PR2 commercially available in September, we knew that the day would come when we would be able to proudly announce our first sale. What we didn't realize was that our first announcement would involve multiple customers -- four of them, in fact.

CNRS Laboratory of Analysis and Architecture of Systems (LAAS-CNRS) in Toulouse, France; George Washington University in Washington, DC; Samsung Electronics in Suwon, Korea; and University of Washington in Seattle, WA are now the owners of their very own PR2 robots. At 5 ft and 450 pounds, more or less, PR2 isn't something that fits well under a Christmas tree. Nevertheless, there are four research institutions that are eagerly opening up their newest present.

The official press release is here, but we just couldn't wait to share it with the PR2 community.

December 14, 2010

Freiburg Santa

Season's Greetings from the AIS lab at the University of Freiburg!

December 8, 2010

PrimeSense™ is launching the OpenNI™ organization, an open effort to help foster "Natural Interaction"™ applications. As part of this effort, PrimeSense is releasing open-source drivers for the RGB-D sensor that powers the Kinect™ and other devices, such as PrimeSense's Development Kit 5.0 (PSDK 5.0), and is making the hardware available to the OpenNI developer community! This will unlock full support for their sensor and also provide a commercially supported implementation. They are also releasing an open-source OpenNI API, which provides a common middleware for applications to access RGB-D sensors. Finally, they are releasing Windows and Linux binaries for the NITE skeleton-tracking library, which will enable developers to use OpenNI to create gesture and other natural-interaction applications. We at Willow Garage have been working with PrimeSense to help launch the open-source drivers and are happy to join PrimeSense in leading the OpenNI organization.

PrimeSense's RGB-D sensor is the start of a bright future of mass-market available 3D sensors for robotics and other applications. The OpenNI organization will foster and accelerate the use of 3D perception for human-computer/robot interaction, as well as help future sensors, libraries, and applications remain compatible as these technologies rapidly evolve.

For the past several weeks, we've been working with members of the libfreenect/OpenKinect community to provide open-source drivers, and we have already begun work to quickly integrate PrimeSense's contributions with these efforts. We will be using the full sensor API to provide better data for computer vision libraries, such as access to the factory calibration and image registration. We are also working on wrapping the NITE™ skeleton and handpoint tracking libraries into ROS. Having access to skeleton tracking will bring about "Minority Report" interfaces even faster. The common OpenNI APIs will also help the open-source community easily exchange libraries and applications that build on top of them. We've already seen many great RGB-D hacks -- we can't wait to see what will happen with the full power of the sensor and community unleashed.
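
As a small taste of what working with the sensor in ROS might look like, here is a hedged sketch that subscribes to a depth point cloud and reports the closest point in each frame; the /camera/depth/points topic name is an assumption about the driver's output rather than something confirmed in this post.

```python
#!/usr/bin/env python
# Hedged sketch: find the closest point in each incoming depth point cloud.
import rospy
from sensor_msgs.msg import PointCloud2
from sensor_msgs import point_cloud2

def cloud_cb(cloud):
    closest_sq = None
    # Iterate over valid (x, y, z) points in the cloud.
    for x, y, z in point_cloud2.read_points(cloud, field_names=("x", "y", "z"),
                                            skip_nans=True):
        d = x * x + y * y + z * z
        if closest_sq is None or d < closest_sq:
            closest_sq = d
    if closest_sq is not None:
        rospy.loginfo("closest point: %.2f m", closest_sq ** 0.5)

if __name__ == "__main__":
    rospy.init_node("depth_closest_point")
    rospy.Subscriber("/camera/depth/points", PointCloud2, cloud_cb)
    rospy.spin()
```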

This release was made possible by the many efforts of the open-source community. PrimeSense was originally planning on releasing these open-source drivers later, but the huge open-source Kinect community convinced them to accelerate their efforts and release now. They will be doing a more "formal" release in early 2011, but this initial access should give the developer community many new capabilities to play with over the holidays. As this is an early "alpha" release, we are still integrating the full capabilities and the ROS documentation is still being prepared. Stay tuned for some follow-up posts on how to start using these drivers and NITE with ROS.

PrimeSense's PSDK 5.0 is available separately and has several advantages for robotics: it is powered solely by USB, and the sensor package is smaller and lighter than the Kinect. This simplifies integration and will be important for use in smaller robots like quadrotors. PrimeSense is making a limited number of PrimeSense developer kits available for purchase. Please visit here to sign up to purchase the PSDK 5.0.

You can visit OpenNI.org to find out more about the OpenNI organization and get binary builds of these releases. Developers interested in working with the source code can check out the repositories on GitHub and join the discussion group at groups.google.com/group/openni-dev. To follow the efforts of the ROS community and Kinect, please join the ros-kinect mailing list.