Willow Garage Blog

June 30, 2010

Call for Papers: IROS 2010 Workshop: Defining and Solving Realistic Perception Problems in Personal Robotics

Perception researchers: When a colleague working on grasping, planning, human-robot interaction, or any other aspect of personal robotics asks you which vision algorithms will work for them today, what do you answer? Which specific problems have you defined, constrained, and solved?  How will these solutions enable other robotics research?  This workshop aims to provide a venue for discussing vision algorithms that can be used today by our fellow personal robotics researchers. 

We solicit paper submissions that include an explicit problem statement including constraints, and describe the robot behavior that the problem's solution would enable.  Each solution should be tested and shown to actually solve the problem, and should be made reproducible by others (either via in-depth explanation or by making source code available).  We encourage you to constrain the problem to the point at which you can solve it, but the solution should remain applicable to real-world situations.  Whether or not you submit a paper, you are invited to attend the workshop.

The submission deadline is July 23, 2010. Please see the workshop website for more detailed information.

Important Dates:

  • Submissions Due: July 23, 2010
  • Notification of Acceptance: August 1, 2010
  • Final Papers Due: August 15,  2010
  • Workshop at IROS: October 18, 2010

June 29, 2010



Last week, we completed our fourth milestone when we shipped out the remaining PR2 Beta Program robots to their host institutions.  We created four milestones to track our progress to this important goal. During Milestone 1 we saw our robot autonomously navigate for π kilometers, two days in a row.  Our second milestone brought about door-opening and autonomous plugging in.  With Milestone 3 came ROS 1.0 and improved ROS documentation and usability.

Now, with the accomplishment of another company milestone, we're excited to share the PR2 with other institutions.  All eleven robots have left the building and are en route to their new homes.  Stanford, UC Berkeley, and USC have already received and uncrated their robots, and the remaining PR2s should arrive in the next few weeks.

While it was incredible to see all of the robots together in one place, it's going to be even better to see what they accomplish out in the world. We've still got plenty to keep us happy and hackin'.

PR2 Beta Sites

From left to right: USC removes the top of the crate; Stanford takes their PR2 out of the crate; UC Berkeley gets to work.

June 28, 2010

PR2GRASP: From Perception and Reasoning to Grasping

Household robotics is an area that can potentially change our lives for the better. Most importantly, disabled and elderly people can become less dependent on others if household robots are available to clean, cook, bring clothes and medication, and perform other home chores.  Unfortunately, robust household robotics is also one of the more difficult areas of the field.  At Penn, the PR2 recipients will tackle a number of the challenges facing the personal robotics field.

One such challenge is navigation in dynamic environments. For personal robots to be effective, they must be able to navigate while taking into account the movements of dynamic obstacles such as people.  To this end, the Penn group will use the PR2 to continue their work on motion planning for navigation and on tracking people in cluttered spaces.

Another project on Penn's list is spring-loaded door opening with the PR2.  While the PR2 can already open doors at Willow Garage, the robot is not yet capable of opening spring-loaded doors. In order to function robustly in real environments, a personal robot must be equipped to manipulate a variety of doors.  Because the PR2 has two arms, it may soon be able to keep a spring-loaded door from closing on it while it releases the door handle and travels through the doorway.  This ability to open more challenging doors can make the PR2, and other robots, more useful in our everyday lives.

In addition to navigation and door-opening, the team at Penn aims to modify the PR2 hardware and create support for modular end-effectors.  Penn is one of the few Beta Sites to propose hardware changes, and the team hopes to extend the PR2's capabilities by offering end-effectors that are capable in ways the current grippers are not.

Penn's proposal also includes a variety of other diverse projects, including: visual localization and pose estimation of objects for grasping; transferring natural handheld objects between the PR2, humans, and other robots; learning salient features for better perceptual processing by the PR2; planning and controls for two-arm manipulation; and establishing telepresence through the PR2 with the Pennochio project.

To see what's currently happening on the PR2GRASP project, check out www.pr2grasp.com.

The Team

This project includes the efforts of eight researchers from three different areas of Engineering and Science, specializing in areas of robotics as diverse as vision, planning, controls, machine learning, haptics, and hardware design. These joint efforts are being led by Maxim Likhachev.

The team also includes an excellent group of graduate students and postdocs, including the following: Ben Cohen, Steve Gray, Soonkyum Kim, Mike Phillips, Cody Phillips, Matt Piccoli, Joseph Romano.

Presentation

Below is a video of the Penn team presenting their proposal to the rest of the PR2 Beta Program participants. You can download the slides as PDF.

This article written with assistance from Maxim Likhachev.

June 28, 2010

While we at Willow Garage tend to be paragons of tidiness, there are times when we do forget to pick up after ourselves.  Our second hackathon this month put the robot on cleanup duty: pushing around a wheeled cart and using it to take used cups, bowls, and boxes of Spam to the kitchen.  Because what's a roboticist without Spam?

Cart pushing presented new challenges for our navigation stack.  First, the cart occludes the area immediately in front of the robot from the robot's sensors.  Second, our default planning algorithm works best for approximately circular robots, while the robot and the cart together form a long, thin shape.  The navigation stack has a highly modular, plugin-based architecture, though, so we were able to substitute in the SBPL forward-search-based planner, from the University of Pennsylvania, which works much better in this setting.
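
Concretely, swapping the global planner is a small configuration change to move_base, thanks to its pluginlib-based design. Below is a minimal launch-file sketch; the exact plugin name registered by the sbpl_lattice_planner package is an assumption here, so check that package for the real value.

```xml
<launch>
  <node pkg="move_base" type="move_base" name="move_base" output="screen">
    <!-- Swap the default (roughly circular-robot) global planner for
         SBPL's forward-search lattice planner. The plugin name below
         is an assumption; see sbpl_lattice_planner for the registered
         pluginlib name. -->
    <param name="base_global_planner" value="SBPLLatticePlanner"/>
  </node>
</launch>
```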

The robot must also recognize cups and bottles in its environment and decide which ones should be removed and which contain liquids that could be unsafe for our robot to carry.  We used a human-in-the-loop approach, in which the robot sends an image of the scene to a (possibly remote) human, who draws a box around the next object to grasp.  The robot figures out the 3D position of the corresponding object, then uses our soon-to-be-released grasping pipeline to pick up the object and place it in the cart.
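
Schematically, the human-in-the-loop pipeline looks like the rospy sketch below. All topic names and both helper methods are invented for illustration; they are not the released Willow Garage interfaces.

```python
#!/usr/bin/env python
# Sketch of the human-in-the-loop grasping flow described above.
# Topic names and the two helper methods are illustrative assumptions.
import rospy
from sensor_msgs.msg import Image, RegionOfInterest

class HumanInTheLoopGrasper(object):
    def __init__(self):
        # Forward camera images to a (possibly remote) operator UI.
        self.image_pub = rospy.Publisher('operator/scene_image', Image)
        rospy.Subscriber('camera/image_rect', Image, self.on_image)
        # The operator draws a box around the next object to grasp.
        rospy.Subscriber('operator/selected_box', RegionOfInterest, self.on_box)

    def on_image(self, msg):
        self.image_pub.publish(msg)

    def on_box(self, box):
        # 1) Project the operator's 2D box into depth data for a 3D pose.
        target = self.localize_in_cloud(box)
        # 2) Hand the target to the grasping pipeline; place it in the cart.
        self.grasp_and_place_in_cart(target)

    def localize_in_cloud(self, box):
        # Placeholder: would intersect the box with the point cloud and
        # return a geometry_msgs/PoseStamped for the selected object.
        rospy.loginfo('Localizing object in a %dx%d box', box.width, box.height)
        return None

    def grasp_and_place_in_cart(self, target):
        # Placeholder: would call the grasping pipeline with `target`.
        rospy.loginfo('Grasping selected object and placing it in the cart')

if __name__ == '__main__':
    rospy.init_node('hitl_grasper')
    HumanInTheLoopGrasper()
    rospy.spin()
```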

Our robot probably can't apply for a job at a restaurant just yet.  However, we believe that pushing carts, wheelchairs, and other wheeled objects is a very useful capability for personal robots, and we're continuing to work on improving its robustness.

June 23, 2010

Developing the Personal Robotics Market: Enabling New Applications Through Novel Sensors and Shared Autonomy

Limited access to quality robotics research platforms is one significant obstacle to the development of new robotics applications.  Bosch's PR2 Remote Lab will extend access to the PR2 platform to additional research labs that did not receive their own PR2.  Of course, Bosch also has its own plans for its PR2 beyond sharing it with the rest of the world.  The robotics team at the Bosch Research and Technology Center North America (RTC) will focus on developing solutions for the personal robotics market by making robots safer, more affordable, and more capable.  The Bosch robotics team in Palo Alto collaborates closely with experts from Bosch Corporate Research in Stuttgart, Germany, who bring additional expertise in sensing, signal processing, and manufacturing.

Bosch plans to integrate its latest sensor technology into the PR2 to explore new applications and lower costs.  These sensors include accelerometers, gyros, force sensors, air pressure sensors, and a new near-range sensing system.  These sensors will also be shared with other PR2 Beta Sites.

In the long run, Bosch is focused on developing the robotics market.  That means building affordable robots that work reliably in complex environments – not an easy task.  One approach to this problem that Bosch will explore is shared autonomy: including a human in the loop to help out on those tasks still too difficult for a robot to quickly or reliably complete on its own.  The goal is to combine the strengths of autonomous systems and human operators to improve performance and reduce the cost of commercial systems.

Bosch has been a contributor to ROS since March 2009.  They will extend their 3D reconstruction work, previously mentioned here, to ROS and the PR2.  This will allow the PR2 to build accurate, textured 3D models of its environment.

See the Bosch press release here.

The Team

The Bosch team combines researchers with backgrounds in computer science and in mechanical, electrical, and aerospace engineering with corporate expertise in sensing, perception, signal processing, and manufacturing.  Several team members are also active in the Stanford Racing Team, working on autonomous vehicles.

  • Jan Becker: Principal Investigator
  • Christian Bersch: Planning, Perception
  • Charles DuHadway: Shared Autonomy, Remote Lab
  • Benjamin Pitzer: 3D Reconstruction, Remote Lab
  • Soeren Kammel: 3D Perception, Image Processing
  • Lukas Marti: Sensors, Signal Processing

The team also includes some great interns: Hao Dang (Columbia), Adam Stambler (Rutgers), Michael Styer (Stanford), Joerg Wagner and Sebastian Haug (both Stuttgart). Researchers at other Bosch locations will also contribute to the project.

Presentation

Below is a video of Jan Becker presenting Bosch's proposal to the rest of the PR2 Beta Program participants. You can download the slides as PDF.

Article written with assistance from Charles DuHadway.

June 21, 2010

CRAM: Cognitive Robot Abstract Machine

For the PR2 to perform household chores and everyday manipulation tasks, the robot must possess, among other competencies, the ability to learn, reason, and plan. Even a seemingly simple task, such as picking up a cup from a table, requires complex and informed decision making. To pick up a cup, the robot must decide where to stand, which hand to use, how to reach, where to grasp, how to lift the cup, and so on. The robot's decision-making competencies must be robust enough to take into account not only the task to be performed, but also the situational context in which the task is carried out. For example, were the robot tasked to fill a glass, it would likely grasp the body of the bottle, but were the robot tasked to set the bottle on the floor, a grasp position at the top of the bottle might be more appropriate.
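
To make this context-dependence concrete, here is a toy Python sketch of task-conditioned grasp selection. The task names and grasp regions are invented for illustration; they are not CRAM's actual representation.

```python
# Toy illustration of context-dependent grasp selection (names invented).
def choose_grasp(task, obj):
    """Pick a grasp region on `obj` that suits the current `task`."""
    if task == 'fill_glass':
        # Pouring: grasp the body so the bottle can be tilted freely.
        return (obj, 'body')
    if task == 'place_on_floor':
        # Setting the bottle down low: a top grasp keeps the arm clear.
        return (obj, 'top')
    # Otherwise fall back to a planner that scores all candidate grasps.
    return (obj, 'best_scored')

print(choose_grasp('fill_glass', 'bottle'))      # ('bottle', 'body')
print(choose_grasp('place_on_floor', 'bottle'))  # ('bottle', 'top')
```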

The Technische Universität München (TUM) team develops novel ways of enabling robots to infer the right decisions by considering the relevant aspects of action selection and parametrization.  This means that programmers do not need to specify every task-execution decision in advance.  To this end, the group investigates what they call “cognition-enabled everyday manipulation” or “cognition-enabled perception-action loops”. The research group in Munich interprets cognition as a resource for better performance: the robot acquires models of the consequences of its actions, and uses these models to select the actions that will best accomplish the robot's objectives.

The main scientific goal of TUM's project is to build CRAM (Cognitive Robot Abstract Machine) as a software toolbox for the design, implementation, and deployment of cognition-enabled autonomous robots, such as the PR2, that perform everyday manipulation activities. CRAM equips autonomous robots with lightweight reasoning mechanisms that can infer control decisions, such as those listed above, thereby obviating the need for pre-programmed decisions. In this way, CRAM-programmed autonomous robots become more flexible, reliable, and general-purpose than robots whose control programs lack such reasoning capabilities.

 

The Team

The TU München team is an interdisciplinary group of researchers who are members of the German national Cluster of Excellence COTESYS (Cognition for Technical Systems).  Prof. Michael Beetz, head of the Intelligent Autonomous Systems group and vice coordinator of COTESYS, is leading the overall project and the software development.

The other principal investigators are:

  • Prof. Gordon Cheng:  Humanoid robotics and cognitive systems.
  • Prof. Matthias Kranz: Ubiquitous computing and smart and cognitive objects.
  • Dr. Uwe Haass: General Manager of the Cluster of Excellence COTESYS.

The team includes a group of excellent doctoral students, including the following:

  • Dejan Pangercic: Knowledge-enabled Perception
  • Moritz Tenorth: Knowledge Processing for Mobile Robots and Human Activity Interpretation
  • Nico Blodow: 3D Robotics Perception
  • Mihai Dolha: Human-Robot Interaction and Robot Simulators
  • Dominik Jain: Statistical Relational Learning, Probabilistic Reasoning
  • Ingo Kresse: Robotic Manipulation and Representation of Manipulation Tasks
  • Lars Kunze: Common-sense and Naive Physics Reasoning
  • Alexis Maldonado: Arm/Hand Control for Robotic Manipulation
  • Zoltan-Csaba Marton: 3D Model Fitting, Semantic Labeling of Objects, Arrangements and Environments, Spatio-Temporal Learning
  • Lorenz Mösenlechner: High-Level Planning and Reasoning
  • Federico Ruiz-Ugalde: Connection of Low Level Motor Control with High Level Reasoning and Planning using Object Control
  • Andreas Holzbach: Human Vision
  • Ewald Lutscher: Unified Control Strategies for Robots
  • Marcia Riley: Humanoid Robot Behaviors

Presentation

Below is a video of Dejan Pangercic presenting TUM's proposal to the rest of the PR2 Beta Program participants. You can download the slides as PDF.

Article written with assistance from Dejan Pangercic.

June 17, 2010

STAIR on PR2

The PR2's arrival at Stanford University is a homecoming of sorts.  The PR (Personal Robot) project emerged from Ken Salisbury's lab at Stanford in the form of PR1.  Moreover, what was once the STAIR lab's Switchyard system has morphed into today's ROS.  We're excited to see ROS and the PR2 at Stanford, ready for more tinkering and growth.

The STAIR (STanford AI Robot) project seeks to develop the software needed to put a general-purpose robot in every home.  The team is making the PR2 the next robotic platform for STAIR, and will develop open-source software towards this goal.  The lab will transition the STAIR perception, grasping, and manipulation software to enable the PR2 to carry out three applications: fetching items, inventory taking, and clearing a dining room table.

The first component that the group plans to transition to the PR2 is their software for robotic perception, much of which is already released as open-source, but not yet adapted for the PR2.  They have developed algorithms for embodied perception, which take advantage of the physical presence of the robot to recognize objects more accurately.  The team has also developed algorithms that choose where to foveate (i.e., point a robot head or camera) to efficiently find/recognize objects, and use multiple views of an object to facilitate better recognition.  The STAIR lab proposes to integrate these perception tools with the PR2, and thereby develop an object detector with an unprecedented level of performance.
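
As a simplified illustration of the foveation idea, the sketch below greedily points the camera at the region where a look is expected to be most informative about the target object. This is a toy stand-in under invented assumptions, not the STAIR algorithm.

```python
# Greedy next-view selection: look where the object's presence is most
# uncertain, weighted by how reliable the detector is from that view.
def expected_gain(prior, detector_accuracy):
    # prior * (1 - prior) peaks at 0.5: uncertain regions are the most
    # informative to inspect; confirmed or ruled-out regions are not.
    return detector_accuracy * prior * (1.0 - prior)

def choose_foveation_target(regions):
    """regions: list of (pan, tilt, prior_prob, detector_accuracy)."""
    return max(regions, key=lambda r: expected_gain(r[2], r[3]))

regions = [
    (0.0,  -0.3, 0.9, 0.8),   # tabletop: object almost certainly there
    (0.8,  -0.2, 0.5, 0.7),   # counter: most uncertain, most informative
    (-1.2,  0.0, 0.1, 0.9),   # bookshelf: unlikely location
]
pan, tilt, _, _ = choose_foveation_target(regions)
print('Foveate to pan=%.1f, tilt=%.1f' % (pan, tilt))  # the counter
```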

The second component to be transitioned will enable the PR2 to grasp novel objects.  The group's grasping algorithm explicitly reasons about the placement of multiple fingers on an object, and they will transition this algorithm to the PR2, modifying it to work with the robot’s grippers and sensors.  Additionally, the STAIR team will address the problem of smoothly placing objects on a surface. Finally, the group will use their algorithms to address the problem of grasping novel objects from previously known object classes. Specifically, they will develop class-specific algorithms for grasping tableware and cutlery.

The third component that the group will transition to the PR2 is the ability to use novel doors and elevators.  The STAIR lab will integrate their door opening methods on the PR2, allowing for generalizability to a broader range of doors and door handles than has been possible with their prior hardware platforms.  They've already made quick progress on this.  Within twelve hours of setting up their PR2, the group ported their existing door-opening code. This is impressively fast given how different the STAIR 1 Robot is from the PR2.

The Team

The project is led by Prof. Andrew Ng and will include contributions from Prof. Ken Salisbury and Prof. Oussama Khatib.

The team also includes a group of graduate students.

Presentation

Below is a video of Morgan Quigley and Adam Leeper presenting Stanford's proposal to the rest of the PR2 Beta Program participants. You can download the slides as PDF.

Article written with assistance from Morgan Quigley.

June 15, 2010

With only a small team of developers and a week's worth of development, the PR2 can now play pool! The "Poolshark" team started last Monday and began making shots on Friday. The PR2 won't be hustling you in pool halls anytime soon, but it pocketed five shots on Friday before the team decided it was time to celebrate.

The Poolshark team dealt with numerous technical challenges throughout the week: engineering a special grip and bridge so the PR2 could hold the cue, building a ball detector, localizing the table, creating visualization and input tools, writing a shot selector, and more.

A big thanks goes to Alon Altman for his open-source FastFiz billiards library. FastFiz is a physics and rules engine for billiards that the Poolshark team used to select which shots the PR2 should take. The Poolshark team has released its own code in the billiards stack.
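
Conceptually, shot selection with a physics-and-rules engine is a sample-simulate-score loop over candidate cue strikes. The Python sketch below illustrates the idea; the simulate callback and all parameter names are hypothetical, so see FastFiz and the billiards stack for the real APIs.

```python
# Hypothetical sketch of shot selection with a FastFiz-style engine.
import math
import random

def score_outcome(outcome):
    """Reward pocketed balls; heavily penalize fouls such as scratches."""
    if outcome['scratch']:
        return -100.0
    return 10.0 * outcome['balls_pocketed'] + outcome['position_quality']

def select_shot(table, simulate, n_candidates=500):
    """Sample cue strikes, roll each out in the engine, keep the best."""
    best_shot, best_score = None, float('-inf')
    for _ in range(n_candidates):
        shot = {'angle': random.uniform(0.0, 2.0 * math.pi),  # aim (rad)
                'velocity': random.uniform(0.5, 3.0)}         # cue speed (m/s)
        outcome = simulate(table, shot)  # physics + rules rollout
        score = score_outcome(outcome)
        if score > best_score:
            best_shot, best_score = shot, score
    return best_shot
```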

June is "Hackathon" month, so there are two more one-week hackathons to come: pushing a cart and fetching a drink from a refrigerator. It's one down, two to go!

Previous "hackathon" sprints:

June 14, 2010

We're packing up a PR2 and heading to CVPR in San Francisco.  We'll be at the CVPR 2010: Computer Vision for Human-Robot Interaction Workshop on Monday, from 2-6 pm.  The PR2 will be showing a variety of demos at the poster session on Tuesday night, from 5:20-9:00 pm.  Hope to see you there!  

June 14, 2010

Unified framework for task specification, control and coordination for mobile manipulation

Suppose you want a robot to assist you in carrying an object through a crowded environment. For this to work, the robot must know which item you want to pick up, how to pick it up and with what force, how to avoid both moving and non-moving obstacles, and where to set the object down. Furthermore, it must do all of these tasks while taking your movements into account.

Katholieke Universiteit Leuven has already developed a number of tools that will enable developers to make this a reality with the PR2. Their iTaSC (instantaneous Task Specification using Constraints) and skills framework will enable developers to specify this complex task of carrying an object and avoiding obstacles, all while tracking the human co-operator's movements. iTaSC allows developers to specify constraints for the motion and forces the robot uses, and it can take into account the many uncertainties such a task presents. Carrying an object also requires complex, real-time control over the robot's motors and sensors. KU Leuven's Real-Time Toolkit (RTT) for Orocos will enable developers to control multiple iTaSC motions together.
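
In greatly simplified form, a constraint-based controller in the spirit of iTaSC stacks the task's constraint outputs into one function of the joint positions and resolves joint velocities through its Jacobian; the sketch below omits iTaSC's explicit modeling of geometric uncertainty.

```latex
% Simplified velocity-resolved constraint control (illustrative only).
% y collects constraint outputs such as distances, angles, or contact forces.
\begin{aligned}
  y         &= f(q)         && \text{constraint outputs as a function of joint positions } q \\
  \dot{y}_d &= K (y_d - y)  && \text{servo each constraint toward its desired value } y_d \\
  \dot{q}   &= J^{\#} \dot{y}_d, \quad J = \partial f / \partial q
                            && \text{resolve joint velocities via a weighted pseudoinverse } J^{\#}
\end{aligned}
```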

The PR2 will need to be able to track people and other moving obstacles. For this, a state-of-the-art multi-target tracking and localization algorithm (MTTL) will be added to ROS. The group will also integrate ROS with the OpenRobot simulator, which uses the open-source 3D content creation suite Blender.

KU Leuven's work on these tools will help create reusable, complex tasks for solving many other difficult challenges. The iTaSC framework has applications in robotic surgery, such as moving a surgical tool in precise laparoscopic surgery, as well as enabling robots to manipulate objects together while also taking into account the presence of people in the area.

The Team

The KU Leuven team has both a mechanical and software engineering background. Its primary expertise relevant to the PR2 program is in specification and real-time control of complex tasks for robots with many degrees of freedom and multiple sensors.

Professors Herman Bruyninckx, co-founder of the Orocos open robot control software project, and Joris De Schutter, pioneer of the iTaSC framework, are leading the project. Dr. Ruben Smits, developer of the iTaSC and skills framework and experienced contributor to both Orocos and ROS, is the principal investigator.

The team also consists of post-doctoral researchers Tinne De Laet, Eric Demeester, and Peter Soetens, and PhD students Koen Buys, Wilm Decré, and Dominick Vanthienen, who are all active in the field of robotics at the Department of Mechanical Engineering, and who have already contributed substantially to the open source community.
 
The project furthermore benefits from strong collaborations with Professor Jan Swevers, with Professor Luc De Raedt’s Machine Learning group at the Department of Computer Science, and with Professor Moritz Diehl from KU Leuven’s Center of Excellence in Optimization in Engineering (OPTEC).

Presentation

Below is a video of Koen Buys presenting KU Leuven's proposal to the rest of the PR2 Beta Program participants. You can download the slides as PDF.

Article written with assistance from Eric Demeester.