Willow Garage Blog

March 10, 2009

We've tried to make ROS as open as possible, including giving you access to data about which nodes you are running and letting you dynamically inspect the data flowing through topics. The rosviz package is home to a suite of tools we are developing to help you visualize this information, so you can debug your software system and understand your data better. One of the tools we just released with the latest ROS stable Subversion update is rxplot, which graphically plots any numerical data in a ROS topic over time. For example:

rxplot /numbers/num1

plots the num1 field in the /numbers topic. You can also plot multiple topics either together or separately. The documentation goes into greater detail on the various ways you can plot data together.
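For instance, assuming the /numbers topic also publishes a num2 field (a hypothetical example), something like

rxplot /numbers/num1,/numbers/num2

should draw both fields on the same axes, while

rxplot /numbers/num1 /numbers/num2

should draw them in separate plots; see the rxplot documentation for the exact syntax.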

rxgraph


The rosviz package also contains the rxgraph and rosgraph system visualization tools. rxgraph generates a graph like the one above, where ellipses represent nodes and boxes represent topics. This sort of presentation lets you easily study how data flows through your ROS nodes and discover problematic connections. rosgraph is a command-line alternative to rxgraph: instead of displaying a visual graph, it prints connection information to your console.

rosviz only contains tools for visualizing data about ROS. If you're interested in visualization tools more specific to robot applications, you may wish to check out rviz, aka "Robot Visualizer," which is part of our Personal Robots repository.

 

February 10, 2009

ROS 0.4

ROS 0.4 ("Mango Tango") has been released at SourceForge. ROS is the "Robot Operating System" that is the basis of our PR2 robot software development. If you're new to ROS, you can head over to the ROS wiki, where you'll find an overview, installation instructions, tutorials, and more.

Our first milestone was a major test of the robustness of ROS, requiring multiple days of continuous operation of a 2D navigation stack with many software components interacting across multiple computers. We now feel that ROS is ready for an official release to the wider robotics community. Although this is a 0.4 release, we are committed to a stable API as we march toward a 1.0 release. For you early adopters who have been using ROS via Subversion, we appreciate your patience through the changes we have been making these past several weeks and hope that you will find this release a stable platform on which to continue developing.

This is the first of many releases of ROS in the coming months. In addition to future updates, we will be working on releasing parts of our Personal Robots software stack. We hope that you will find these drivers and algorithms useful in building robotics applications and accelerating research. Please feel free to give us feedback on this release on our ros-users mailing list or by filing feature requests (or bugs) in our ticket system.

ROS grew out of work at Stanford University, and contains contributions from community members around the world.  It is released under a BSD license, making it available for both commercial and non-commercial use.

February 7, 2009

OpenCV was started in 1999, so it's only fitting that as it hits its ten-year milestone, it crosses another important milestone: two million downloads. With an average of over a thousand downloads per day, demand for OpenCV only continues to grow as researchers across the globe tackle tough problems in machine perception. We congratulate Gary Bradski, Vadim Pisarevsky, Adrian Kaehler, and the many other contributors who continue to make OpenCV a success.

January 28, 2009

Radu Bogdan Rusu of TU München and Ioan Alexandru Sucan of Rice University have been working hard at Willow Garage the past several weeks as part of our winter internship program. One of their projects was dynamic obstacle avoidance (replanning) with a 7-DOF PR2 prototype arm, using real-time 3D mapping coupled with sampling-based motion planning. Radu's 3D perception pipeline processes the data received from the Hokuyo laser sensor in real time, with continuous map updates every 20-50 ms. This map is sent to Ioan's sampling-based motion planning pipeline, which monitors the execution of the current plan. If the current path is no longer valid, a new one is computed. If no safe path is found, the arm is stopped until a safe path can be computed. Computing a new path usually takes 10-100 ms, depending on the density of the map.
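For readers curious about how such a monitor fits together, here is a minimal sketch of the replan loop in Python; the helper functions (get_latest_map, path_is_valid, plan_path, and so on) are illustrative stand-ins, not the actual code running on the PR2:

    import time

    REPLAN_TIMEOUT = 0.1  # seconds; a replan reportedly takes 10-100 ms

    def monitor_and_replan(get_latest_map, path_is_valid, plan_path,
                           send_to_arm, stop_arm, goal):
        """Check the current plan against the latest 3D map and replan
        whenever a new obstacle invalidates the path."""
        current_path = None
        while True:
            world_map = get_latest_map()  # perception updates the map every 20-50 ms
            if current_path is None or not path_is_valid(current_path, world_map):
                new_path = plan_path(goal, world_map, timeout=REPLAN_TIMEOUT)
                if new_path is not None:
                    current_path = new_path
                    send_to_arm(current_path)  # hand the new trajectory to the arm controller
                else:
                    stop_arm()                 # no safe path: hold the arm until one is found
                    current_path = None
            time.sleep(0.02)                   # poll at roughly the map update rate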

In the video, the PR2's task is to move its 7-DOF arm from left to right, changing its goal every 10 seconds -- it tries to go left for 10 seconds, then it tries to go right for 10 seconds.

January 28, 2009

Gone Skiing

January 20, 2009

Eowyn

PR2 development is proceeding on many fronts at once. Last month, the hardware and software were robust enough to pass Milestone 1 (robust autonomous navigation) and now we are working toward Milestone 2, which will require arms to complete. Milestone 2 will involve even longer-distance navigation than before. The robot will have to recharge itself (plug itself in) to meet these distance goals and will also face new obstacles like closed doors. We now have six arms in house, which will enable us to test our new objectives and continue to improve the hardware design.

Our PR2 Alpha prototypes have come a long way over the past several months. The data we've gathered from the milestones and stress tests have enabled us to improve on nearly every aspect of the design. Now that the PR2 design is converging on its final form, we've been able to shift our focus towards the details of the industrial design. 

Eowyn has been transformed into our "ID prototype," making it our most complete robot yet. In the picture, it is going through a fit test with an initial version of its skins. "Skin" is, of course, a metaphor: Eowyn's skins are molded plastic parts that cover exposed gears for safety and shield exposed electronics so that foreign objects can't fall in and cause problems. In the coming weeks we will improve upon the design of the skins and even play with its paint job.


January 20, 2009

OpenRAVE + PR2

Rosen Diankov of CMU just finished up a winter internship here at Willow Garage. Here's his report on the work he did helping Willow Garage improve its manipulation capabilities:

Manipulation is a key capability for acting on the environment. During my short stay at Willow Garage, we worked towards getting the PR2 robot to autonomously manipulate boxes on a table using the OpenRAVE planning system developed at Carnegie Mellon University (CMU). What makes OpenRAVE worth using is that its grasp and manipulation planning algorithms do not require manual human domain knowledge to complete the task; the planners and algorithms generalize easily across robots without needing to tweak parameters. OpenRAVE has been used to write robust manipulation scripts that have been tuned and tested on multiple robot systems for the past two years.

During the first week at Willow Garage, we were able to quickly identify the features needed for ROS to support the OpenRAVE integration. Because Willow Garage fosters an open environment, these features were quickly discussed, approved, developed, debugged, and completed. Some of the packages include:

  • openrave: an OpenRAVE server running on top of the ROS framework, providing manipulation planning and grasping services
  • rosoct: native Octave/Matlab ROS bindings for scripting
  • laser_camera_calibration: calibration routines for the real-world sensors
  • robot_self_filter: filters for removing robot links from sensor data

In the end, the robot was able to run for long periods of time without any component crashing. One of the advantages of using OpenRAVE in conjunction with ROS is that it is easy to prototype demos and complex systems. The user does not have to take down the entire system when making minor changes to code; in fact, only the one component of interest needs to come down, and the rest of the components are designed to be robust against these changes.

Below is a video showing the beginnings of manipulation on the PR2. The video starts off showing how the grasps are automatically generated and gives a proof of concept in simulation. It then shows the real robot performing the same tasks. In the upper left corner is the output of the sensed world in OpenRAVE. At every grasp attempt, the robot makes a clone of this environment and analyzes it for possible grasps. Environment clones remove the need to worry about objects moving in and out of the scene while the planner executes. Trajectories are sent to the real robot as parts of the plan are completed. Between pauses, the robot updates its internal view of the world with the new robot values and begins the next set of plans from there.
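As a rough, illustrative sketch of the clone-and-plan idea (the object and function names here are hypothetical, not the actual OpenRAVE or ROS interfaces used in the demo):

    def attempt_grasps(live_env, target, candidate_grasps, plan_arm_motion, execute):
        """Try each automatically generated grasp inside a frozen clone of the
        sensed environment, so the planner need not worry about objects moving
        while it is thinking."""
        for grasp in candidate_grasps:
            env_clone = live_env.clone()                 # snapshot of the sensed world
            if not env_clone.grasp_is_feasible(target, grasp):
                continue
            trajectory = plan_arm_motion(env_clone, target, grasp)
            if trajectory is not None:
                execute(trajectory)                      # send the completed plan segment to the real robot
                live_env.update_robot_state()            # resynchronize before the next set of plans
                return True
        return False                                     # no feasible grasp found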

In the near future, we plan to integrate dynamic obstacles tracked by the laser range finder with the manipulation planner. The robot arm should be able to dynamically modify its trajectories as new obstacles are detected with the scanner. This will obviate the need for a motion capture system to track the robot and the table and will allow us to perform this manipulation demo virtually anywhere!

December 24, 2008

Ho Ho Ho

Happy Holidays from all of us at Willow Garage!

Robo-Santa came by to visit us on Friday to wish us good cheer for passing our first milestone. We took the opportunity to ask for some last-minute gifts. JD asked for lots of robot toys to play with on vacation...

... while baby Levi asked for some sparse bundle adjustment code that converges to the global optimum.

We hope that Robo-Santa brings you all the robots and code you're wishing for this holiday season as well.

Robo Santa

December 20, 2008

We cleanly passed our first major milestone this morning, with one of the Alpha PR2 robots (Gandalf) autonomously traversing π kilometers two days in a row. Gandalf had been doing a π-kilometer run each day for the past two weeks, but we wouldn't declare the milestone complete until we had two consecutive clean runs. In the process, we improved the navigation software to better avoid low obstacles (scooters are popular here, and weren't seen by the first version of the software), to move more safely into uncharted territory when stuck, and, of course, we fixed a few bugs.

This milestone is very important: It demonstrates the hardware of the PR2 (except arms), from casters to head, from power system to sensors. It demonstrates the software on the robot from the device drivers to the executive, from the controllers to the planners. It leverages the software infrastructure of ROS, the Open Source software development and testing infrastructure, and a very nice suite of tools that enable the work. Last but not least, it leverages the work of the Open Source robotics community.

We will continue testing the hardware and software, and extending its range of navigation and of things it can do. Our next milestone will require the robot to use its arms, and when we achieve it we will be confident that we have a robust platform to share with the Open Source robotics community - we stand on the shoulders of giants.

We wish you a Wonderful Holiday Season and a Happy and Productive New Year.

December 8, 2008

We've defined a set of milestones to guide us to our goal of making ten PR2 robots, complete with ROS (a suite of open-source software for mobile manipulation), available to universities. The milestones test the basic capabilities of the platform. To reach Milestone 1, the robot must navigate autonomously for π kilometers, two days in a row. This milestone tests a range of capabilities, including basic mechanical robustness, the electrical system, and the software from the controllers up through the 2D navigation stack. Milestone 2 adds arms to the mix, requiring the robot to find power outlets, plug itself in when needed, and open doors along the way. For the second milestone, the robot will increase its time and distance to two full days and at least a marathon (42.195 km). Milestone 3 will have users from outside Willow Garage program the robots. This will test our documentation and ensure that when we do ship ten robots, other users will be able to make progress with them.

Today we hit an important pre-milestone: one of our three Alpha PR2s went 4.5 km with only slight intervention. The previous record had been 1.5 km of continuous autonomous operation. Tomorrow the official test will begin. Passing the milestone will require two days, because our goal is to demonstrate a high level of robustness.