Willow Garage Blog

May 7, 2009

ICRA

If you'll be at the International Conference on Robotics and Automation (ICRA) in Kobe, Japan this year, don't throw away all those papers that come in your registration packet.  Tucked inside will be a miniature cardboard PR2 - a glamour shot of the alpha robot named Eowin.  Willow Garage is a proud sponsor of ICRA, and will have a booth where we'll be showing recent video clips of the robots in action, and talking to researchers about what's next.

We'll also be hosting an OpenCV and ROS users' meeting at the University of Tokyo on May 19th, after the conference.  Stop by our ICRA booth for details.

April 28, 2009

Proxemics
Until recently, our PR2 robots have seen people as mere obstacles in the environment. From a robotics perspective, preventing collisions with "obstacles" is a matter of setting small buffer zones around them to avoid.  Of course, your personal space is quite different from the safe buffer zone around a table.  Invading personal spaces is not only rude, but can also be unsafe as a person might move unpredictably. As mobile manipulators mature into personal robots operating in your environments, they must respect social norms to gain your acceptance and trust.

To move the PR2 toward its goal of becoming a personal robot, we are building people-specific sensory systems, detection algorithms, and interactive behaviors. As a start to our human-robot interaction (HRI) research at Willow Garage, we ran a controlled experiment on the proxemics (i.e., personal space) behaviors between people and PR2s. We explored several situations: (1) people approaching a PR2, (2) people being approached by an autonomously moving PR2, and (3) people being approached by a teleoperated PR2.  From this study, we found influences on personal space that differ substantially from those seen in previous HRI work, including the person's agreeableness and pet ownership experience.  Along with other findings about pet ownership and HRI noted by our friends at Carnegie Mellon's Human-Computer Interaction Institute, these latest findings have motivated our current research on how human-pet relationships can inform human-robot interaction.

With all of this research and development going on, we have lots of human-robot interaction designs to explore and evaluate. Of course, running these studies with only roboticists isn't acceptable when we really want to understand how people who are not necessarily familiar with robotics will interact with personal robots. If you live or work nearby (Menlo Park, CA) and are interested in visiting us to participate in studies with our robots, just let us know <takayama@willowgarage.com>. If you are far away, we might be running some studies online so drop us a line if you'd be interested in trying those out.

 -- Leila Takayama & Caroline Pantofaru

April 23, 2009

Our initial milestone with the PR2 used laser data and AMCL (Adaptive Monte Carlo Localization) to determine the position of the PR2 robot, but we are looking to integrate stereo camera data to calculate the PR2's position more robustly in 3D. Our perception group has been hard at work developing these new capabilities, and you can see some of their recent work in this video. The video shows a test of the visual_odometry package, which uses a Videre stereo camera to track the position of the PR2 as it makes a circuit around the room.

Visual odometry works by first finding good image features to track (green points in the bottom-left window), then matching them from frame to frame (green lines in the bottom-left window). It uses these point tracks to compute a likely pose for each frame, as well as the path traveled (bottom right). As the visual odometry tracks the position of the robot in 3D, it also calculates the horizon line (top-left window).
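
To make the pipeline concrete, here is a minimal sketch of the two-frame track-and-pose step using OpenCV's Python bindings. This illustrates the general technique, not the visual_odometry package's actual code: the function and variable names below are our own, and a single-camera formulation only recovers translation up to scale, whereas the stereo pairs from the Videre camera provide metric scale directly.

    # Illustrative two-frame visual odometry step with OpenCV's Python
    # bindings -- not the visual_odometry package's actual code.
    import cv2

    def relative_pose(prev_gray, cur_gray, K):
        """Estimate camera motion between two grayscale frames.
        K is the 3x3 intrinsic matrix, assumed known from calibration."""
        # 1. Find good image features to track in the previous frame.
        pts_prev = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                                           qualityLevel=0.01, minDistance=7)
        # 2. Match them into the current frame with sparse optical flow.
        pts_cur, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, cur_gray,
                                                         pts_prev, None)
        keep = status.ravel() == 1
        p0, p1 = pts_prev[keep], pts_cur[keep]
        # 3. Recover the relative pose from the surviving point tracks.
        E, inliers = cv2.findEssentialMat(p0, p1, K, method=cv2.RANSAC)
        _, R, t, _mask = cv2.recoverPose(E, p0, p1, K, mask=inliers)
        return R, t  # rotation and (unit-scale) translation

Chaining these relative poses from frame to frame yields the robot's estimated path through the room.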

The visual odometry system was accurate to within 0.125 meters over the 25-meter journey, an error of about 0.5%. In the future we're planning to use visual odometry as a part of mapping and planning systems. We are also working on "place recognition", which will allow the PR2 to recognize where it is when it wakes up, if it's been there before.

April 23, 2009

A key objective for ROS, and the personal robotics program, is to make it easier to get robots to do useful things. Our approach for pursuing this goal is to provide a library of useful primitives that can be assembled quickly to implement new tasks. To this end, we have developed an abstraction of robot action, defined in the robot_actions software package, which encapsulates a specific behavior that may execute for an extended period of time in accomplishing a goal.

A canonical example of such a robot action primitive is the MoveBase action, which was central to Milestone 1. It is commanded with a goal pose to achieve, and may be active in pursuing that goal for many minutes. Internally this action employs a path planner and maintains a world model suitable for navigation. It can handle its own re-planning to recover from a plan failure, as long as the goal is still feasible. If it cannot accomplish the goal, it aborts. If for any reason the action must be terminated prematurely, it can be preempted by a higher level client (we call this an executive). If it achieves its goal, it becomes inactive, indicating success. These features of being goal-directed, modular, durative, and preemptable constitute the main elements of a robot action.
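
To make this life cycle concrete, the following Python sketch shows one way such an action could be structured. The class and method names here are hypothetical, chosen to illustrate the goal-directed, durative, preemptable behavior described above; they are not the robot_actions package's actual interface.

    # Hypothetical sketch of the action life cycle described above; these
    # names are illustrative, not the robot_actions package's real API.
    import threading

    ACTIVE, SUCCEEDED, ABORTED, PREEMPTED = range(4)

    class Action:
        def __init__(self):
            self.state = None
            self._preempt = threading.Event()

        def execute(self, goal):
            """Pursue the goal until success, failure, or preemption."""
            self.state = ACTIVE
            while self.state == ACTIVE:
                if self._preempt.is_set():     # the executive asked us to stop
                    self.state = PREEMPTED
                elif self.goal_reached(goal):
                    self.state = SUCCEEDED     # inactive, indicating success
                elif not self.replan(goal):    # recovery failed: goal infeasible
                    self.state = ABORTED
                else:
                    self.step(goal)            # keep working toward the goal
            return self.state

        def preempt(self):
            """Called by a higher-level client (the executive)."""
            self._preempt.set()

        # A MoveBase-like subclass would implement these using its path
        # planner and navigation world model:
        def goal_reached(self, goal): raise NotImplementedError
        def replan(self, goal): raise NotImplementedError
        def step(self, goal): raise NotImplementedError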

Milestone 2, which is under heavy development as I write, will leverage robot actions extensively. We are building shared actions (e.g. tucking the arms out of the way, switching controllers, moving the base) used across doorway traversal, recharging, and navigation, as well as more specific actions (e.g. grasping a handle, detecting a door) for each area. The task of integrating these actions falls to the executive, which will manipulate actions as primitives in building plans of action automatically. That is a topic for another day.

 -- Conor McGann

April 22, 2009

Before making the final design revisions for our PR2 robot, we wanted to make sure that we had fully tested the capabilities of our PR2 alpha prototypes.  One of the things we needed to verify was whether the robot could easily manipulate small objects and perform basic household tasks. Our first challenge in setting up this test was creating a teleoperation station that could control the robot's many degrees of freedom, from the four caster wheels on the base to the position of the arms to the opening and closing of the grippers.

Last December we created our first rudimentary teleoperation rig, which used our Phasespace motion capture system to move the robot's arm around. As you can see in the video of that first test, that setup required us to sit inside a motion capture cage and didn't give us any control over the base or gripper -- and the robot only had one arm. After testing various setups with special gloves, we came up with a new, simplified rig: we shrunk the motion-capture cage down to a single, portable bar, and we glued the motion-capture LEDs onto two salad tongs, one for each arm. We found the cheap salad tongs to be precise, light, and easy to manipulate. To control the wheels and the grippers, we added a set of flight simulator pedals in front of the operator.

With our new setup, we could now test the robot in various parts of our building.  First, we made sure that the robot could pick up and manipulate various office items, like pencils, pens, binders, and textbooks.  In our building's kitchenette, we made sure the robot could easily interact with cabinets and drawers, as well as common appliances like our refrigerator, microwave, and dishwasher.  We were even able to try some two-handed tasks, like folding a towel and opening a water bottle.

Although the PR2 did quite well with all the functional verification tests, there's still a lot of work to be done before the PR2 can perform these tasks without a human operator.  Hopefully, with help from our collaborators and the open-source community, we can make this a reality.

 --  Vijay Pradeep

April 22, 2009

The PR2 is designed to be a powerful mobile manipulation platform. We at Willow Garage want to apply the PR2 to useful tasks autonomously, over extended periods of time. Planning is central to making this a reality. Motion planning, the primary focus of planning in robotics, plays a key role here, since figuring out how to move a complex mechanism like the PR2 through a dynamic, cluttered 3D space is fundamental to mobile manipulation. Task planning, more the focus in the AI community, is also important: it determines which motions are required in the first place, and when.

Usually task and motion planning are considered separately, by separate communities, using separate approaches. Bhaskara Marthi and Conor McGann (Willow Garage), in conjunction with Max Likhachev (U. Penn.) and Dave Smith (NASA), are hoping to bridge that gap through a novel workshop on "Bridging the Gap Between Task and Motion Planning," co-located with the International Conference on Automated Planning and Scheduling (ICAPS). ICAPS is the premier planning conference in the world, and we hope for a high-quality and engaging workshop attended by members of both the robotics and AI communities. The workshop will be held on September 19th in Thessaloniki, Greece. Submissions are due June 23rd, 2009.

April 8, 2009

A lot of people ask, "How is ROS different from X?" where X is another robotics software platform. For us, ROS was never about creating a platform with the most features, though we knew that the PR2 robot would drive many requirements. Instead, ROS is about creating a platform that would support sharing and collaboration. This may seem like an odd goal for a software framework, especially as it means that we want our code to be useful with or without ROS, but we think that one of the catalysts for robotics will be broad libraries of robotics drivers and algorithms that are freely available and easily integrated into any framework.

We're excited to see that so many public ROS package repositories have already emerged. Here's the list that we know of:

An ecosystem of federated package repositories is as important to ROS as the node is to its distributed system of processes. Just as the ROS node is the unit of the ROS runtime, the ROS package is the unit of code sharing, and the ROS package repository is the unit of collaboration. Each provides the opportunity for independent decisions about development and implementation, but all can be brought together with ROS infrastructure tools.

One of the tools we've written to support these federated repositories is roslocate, which solves the problem of "where is package X?" For example, you can type "svn co `roslocate svn imagesift`" to quickly check out the source code for the "imagesift" package from cmu-ros-pkg. As more ROS repositories emerge, we will continue to refine our tools so that multiple repositories can easily be an integral part of ROS development.

If you have any public repositories of your own that you'd like to share, drop us a note on ros-users and we'll add it to our list.

April 6, 2009

Together with Chad Jenkins (Brown Univ.), Robert Platt (NASA), and Neo Ee Sian (AIST); Brian Gerkey and Kurt Konolige are organizing a workshop: Mobile Manipulation in Human Environments.  The workshop will be held on June 28th at RSS 2009 in Seattle, Washington.  It's the latest installment in a series of manipulation-related meetings that have been held at RSS over the last several years.  Following in that tradition, we're planning for a high-quality and well-attended workshop this year. 

We invite the community to submit papers presenting their latest work on mobile manipulation, especially if it integrates robust sensing and action.  We welcome preliminary results, particularly with compelling videos. Whether or not you submit a paper, you're invited to attend the workshop. 

The submission deadline is 8 May 2009; check the website for details.

See also other Willow-affiliated workshops.

April 6, 2009

Last month, researchers from the JSK Lab at the University of Tokyo visited Willow Garage to explore ROS and PR2. JSK has an impressive track record in robotics going back 30 years to Hirochika Inoue's pioneering work.  For the past 20 years, JSK's efforts using EusLisp have led to breakthroughs in planning, perception, sensor integration and applications.

Professor Masayuki Inaba, Associate Professor Kei Okada, and four graduate students have been working on Toyota's latest Assistant Robot (AR) and Panasonic's Kitchen Assistant Robots (KAR) at IRT, on daily tasks with Kawada's HRP2 humanoid robots, and on the musculoskeletal humanoid Kotaro series.  This team spent 4½ days in Willow Garage's lab to connect their existing EusLisp software system with ROS.  They were able to come up to speed and make the PR2 do new things in only one week.  Arriving at San Francisco International Airport with a basic knowledge of ROS from the online tutorials and an idea of what the PR2 might be capable of, they identified, explored, and integrated ROS packages such as the navigation stack, face detection, and arm controllers with EusLisp's executive control and existing libraries.  You can see the results of their work in the video.

Willow Garage's 3rd Milestone (we're still working on the 2nd one) is to show that people from other labs can take advantage of the PR2/ROS platform.  JSK's demonstration in such a short period of time indicates that we're on the right track.  People familiar with other robot systems only need to learn something about ROS topics, services, and message formats in order to use the entire framework.
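
For readers who haven't worked with ROS, the topic abstraction the JSK team picked up from the tutorials is small. As a rough illustration (the topic name below is ours, not theirs), a complete publisher node in ROS's Python client library looks like this:

    # A minimal ROS publisher node in Python (rospy); the topic name is
    # illustrative. Any subscriber on the same topic receives each message.
    import rospy
    from std_msgs.msg import String

    rospy.init_node('talker')
    pub = rospy.Publisher('chatter', String)
    rate = rospy.Rate(10)              # publish at 10 Hz
    while not rospy.is_shutdown():
        pub.publish(String('hello world'))
        rate.sleep()

Services follow the same pattern with request/response semantics, and message formats are defined in simple .msg text files, which is what makes bridging from another language environment like EusLisp tractable.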

We were impressed by, and appreciative of, the intensive efforts of this enthusiastic team.  The students got only a few hours of sleep, watching the sun rise every day.  The professors enjoyed working closely and debugging with their students throughout the visit.  Willow Garage salutes the JSK Lab team: we all experienced a wonderful week of developing mutual understanding and friendship, making progress in robotics while transcending differences in language and culture.

April 3, 2009

Josh Faust and Rob Wheeler recently got ROS working on the iPod Touch/iPhone and put together a quick demo that uses an iPod Touch as a joystick for the PR2 robot. The iPod Touch and iPhone combine a high-quality display with novel modes of interaction, which makes them appealing for robotics interfaces, and they are a platform for testing cross-compilation of ROS. The most difficult challenge in getting ROS running on the iPod Touch was solving the cross-compilation issues. Once those were figured out, they were able to add about twenty lines of code to the standard iPod Touch accelerometer demo to translate the accelerometer input into commands to drive the PR2.
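
The mapping itself is simple. As an illustration only (the actual demo was written in Objective-C on the device, and the message type and topic name here are assumptions, not the demo's code), the core of such a teleoperation node could look like:

    # Sketch of mapping accelerometer tilt to base drive commands; the
    # message type and topic name are assumptions, not the demo's code.
    import rospy
    from geometry_msgs.msg import Twist

    GAIN = 1.0  # scales tilt (in g's) to velocity

    def on_accel(ax, ay):
        """Called with the device's accelerometer readings."""
        cmd = Twist()
        cmd.linear.x = -GAIN * ay    # tilt forward/back -> drive forward/back
        cmd.angular.z = -GAIN * ax   # tilt left/right   -> turn in place
        pub.publish(cmd)

    rospy.init_node('ipod_teleop')
    pub = rospy.Publisher('cmd_vel', Twist)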

This is still a proof of concept, but we hope in the coming months to make it a stable platform for ROS development. We've put up a ROSPod wiki page so you can track our efforts and contribute your own.