Willow Garage Blog

August 20, 2012

Leila Takayama

It is with great pride that we inform you that Willow Garage's Dr. Leila Takayama has been named to MIT Technology Review's TR35 list of the world's top young innovators for 2012. The TR35 recognizes the world's top innovators under the age of 35, spanning energy, medicine, computing, communications, nanotechnology, and other emerging fields. Proving that innovation and creativity go hand in hand, this award follows Dr. Takayama's earlier recognition by Fast Company magazine as one of The 100 Most Creative People in Business 2012.

Dr. Takayama has been honored for her work in the field of human-robot interaction, and was selected as a member of the 2012 TR35 by a panel of expert judges and the editorial staff of Technology Review, who evaluated more than 250 nominations. She will join other TR35 honorees in discussing their achievements at the EmTech MIT 2012 conference, taking place at the MIT Media Lab in Cambridge, MA on October 24-26, 2012.

Dr. Takayama is a Research Scientist at Willow Garage with a background in Cognitive Science, Psychology, and Human-Computer Interaction. Her current focus is understanding how people perceive, understand, feel about, and interact with robots. Among other things, she is working on teaching robots some manners.

She serves on the steering committee for the Human-Robot Interaction conference and recently completed editing the inaugural special issue of the Journal of Human-Robot Interaction, which is an open access publication. She is also serving on the World Economic Forum Global Agenda Council on robotics and smart devices, which recently convened in Singapore.

In the words of Jason Pontin, editor-in-chief and publisher of Technology Review, "This year's TR35 recipients are applying technology to some of our generation's greatest challenges, and innovating to improve the way we live and work. We look forward to watching these young technology leaders grow and advance over the coming years."

Please join all of us at Willow Garage in congratulating Dr. Takayama on this prestigious award.

July 27, 2012

"I was lying in bed, watching TV as usual, when I saw a technology special on a mobile robot.  I immediately imagined controlling it as a surrogate for my own body." 

Those are the words of Henry Evans as he describes how our project on assistive robotics, Robots for Humanity, first began in October 2010.  As a result of a brainstem stroke, Henry is mute and quadriplegic.  Following extensive therapy, he regained the ability to move his head and use a finger, enabling him to use a computer.  When Henry saw a television interview with Georgia Tech Professor Charlie Kemp showing research with the Willow Garage PR2, he immediately saw the opportunity for people with severe motor impairments to use mobile manipulators as assistive devices.  Henry is motivated by the possibility of using a robot as a surrogate for his paralyzed body, and he believes thousands of others with severe motor impairments could benefit as well.

Robots for Humanity, a collaboration between Willow Garage, Georgia Tech, Oregon State, and Henry and Jane Evans, was recently featured on CBS Evening News.

Technical details of this work are pending publication, but some of the highlights of the Robots for Humanity initiative are as follows:

  • In October 2011, Henry gave Halloween candy to children through the PR2 robot at a local mall.  As a child would approach the robot and hold out his or her candy bag, Henry would command the robot to pick up a piece of candy from the side table and then place it inside the child's bag.
  • At home, Henry has demonstrated navigating the robot through the house to find and deliver objects to himself, such as a towel from a kitchen drawer and small food items from inside the fridge.
  • One of Henry's very first accomplishments was manipulating the PR2 to scratch his own nose, the first time he had been able to do so in over a decade. Since then, Henry has not only used the PR2 to shave himself, but has also tele-operated a PR2 located at Georgia Tech to shave Prof. Kemp remotely.

The Robots for Humanity team at Willow Garage is led by Matei Ciocarlie and Kaijen Hsiao.  Their approach is to research and develop a diverse suite of open source software tools that blend the capabilities of the user and the robot. This has resulted in what is, to the best of our knowledge, the first example of a mobile manipulation platform operated by a motor-impaired person using only a head-tracker mouse with a single button as an input device, and used for varied, unscripted manipulation tasks in a real home, as well as limited forms of social interaction.

Henry controlling the PR2 & grasping a towel from his kitchen drawer

Putting robots into real homes to help people with disabilities is the long-term vision of our project. By actively involving the users, Henry and Jane Evans, in our participatory design process, we have made tangible progress towards assistive capabilities that are both useful and usable.  We also anticipate that by putting robots into the real homes of people with disabilities early and often, we can better direct our research to overcome the real-world obstacles to the use of mobile manipulators as an effective assistive technology.

Our future challenges include enabling Henry and Jane to use a PR2 in their home for longer durations, and evaluating our methods with other people with motor impairments.

Robots for Humanity is made up of the following individuals and institutions:

Henry and Jane Evans

Willow Garage, Menlo Park, CA:

Matei Ciocarlie, Kaijen Hsiao, Steve Cousins, Andreas Paepcke, Caroline Pantofaru, Leila Takayama

Healthcare Robotics Lab, Georgia Institute of Technology, Atlanta, GA:

Charlie Kemp, Tiffany Chen, Phillip Grice, Kelsey Hawkins, Advait Jain, Chih-Hung King, Hai Nguyen, Sarvagya "Survy" Vaish

Oregon State University, Corvallis, OR:

Bill Smart and Daniel Lazewatsky

More information on the project and associated open-source code can be found here.


July 23, 2012

During his internship at Willow Garage, Daniel Claes from Maastricht University worked on multi-robot collision avoidance. Put simply, he was able to get multiple robots to navigate efficiently while avoiding each other. 

Currently, many navigation algorithms assume that obstacles are static, which results in suboptimal or even unsafe paths in an environment featuring multiple moving robots.  In addition, in a system of multiple robots with similar sensor locations, such as the PR2's base laser, each robot presents very little surface area for the other robots' sensors to pick up, which prevents the robots from observing each other effectively.

Multi-robot collision avoidance is based on the velocity obstacle paradigm and incorporates the motion of other robots, resulting in smoother and more efficient paths. The solution is a reactive and fully distributed approach running on the robots as a local planner.  Localization uncertainty is taken into account by virtually enlarging each robot's footprint according to the covariance matrix of the AMCL particle cloud.

The key idea of this algorithm is that the robots broadcast their positions and select collision-free velocities that continue to adapt as the robots approach their goals. These techniques allow each robot to react to nearby robots and do not require any centralized or coordinated planning. The work is based on the ORCA (optimal reciprocal collision avoidance) formulation developed by van den Berg et al.
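
For the curious, the two ideas above can be illustrated with a short, self-contained Python/NumPy sketch: the robot footprint is inflated according to the localization covariance, and a collision-free velocity close to the preferred, goal-directed velocity is selected using the broadcast positions and velocities of nearby robots. The sampling-based velocity selection below is a simplified stand-in for the ORCA formulation, and every function name and parameter here is an illustrative assumption rather than part of the CALU implementation.

    # Simplified sketch (not the CALU code): inflate the footprint by the
    # localization uncertainty and pick a collision-free velocity that stays
    # as close as possible to the preferred, goal-directed velocity.
    import numpy as np

    def inflated_radius(base_radius, pose_cov_xy, n_sigma=2.0):
        """Enlarge the robot footprint by the position uncertainty.
        pose_cov_xy is the 2x2 x/y block of the localization covariance."""
        sigma_max = np.sqrt(np.max(np.linalg.eigvalsh(pose_cov_xy)))
        return base_radius + n_sigma * sigma_max

    def collision_free(v, my_pos, my_r, neighbors, horizon=3.0, dt=0.1):
        """Forward-simulate a candidate velocity and check clearance against
        neighbors, each given as (position, velocity, inflated_radius)."""
        for t in np.arange(dt, horizon, dt):
            p_me = my_pos + v * t
            for n_pos, n_vel, n_r in neighbors:
                if np.linalg.norm(p_me - (n_pos + n_vel * t)) < my_r + n_r:
                    return False
        return True

    def choose_velocity(pref_v, my_pos, my_r, neighbors, v_max=0.5, samples=200):
        """Sample candidate velocities and keep the collision-free one that is
        closest to the preferred velocity; stop (zero velocity) if none is found."""
        best, best_cost = np.zeros(2), np.inf
        rng = np.random.default_rng(0)
        for _ in range(samples):
            cand = rng.uniform(-v_max, v_max, size=2)
            if np.linalg.norm(cand) > v_max:
                continue
            if collision_free(cand, my_pos, my_r, neighbors):
                cost = np.linalg.norm(cand - pref_v)
                if cost < best_cost:
                    best, best_cost = cand, cost
        return best

Each robot repeats this kind of local selection as it moves, which is what makes the approach reactive and fully distributed.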

This work was done with the TurtleBot and PR2, but it will work with any robot running the ROS navigation stack. This approach enables a wide variety of applications, from warehouse delivery to games.

Willow Garage would also like to congratulate Daniel Claes, Daniel Hennes, Karl Tuyls, and Wim Meeussen on winning the Best Demonstration Award at the Eleventh International Conference on Autonomous Agents and Multiagent Systems (AAMAS 2012, June 4-8, 2012) with "CALU: Collision Avoidance with Localization Uncertainty".

See the demonstration video of their work and read the full paper here.

For more details, see the multi-robot collision avoidance stack.

July 13, 2012

During his visit as a research engineer at Willow Garage, Ken Anderson improved upon motion planning by developing a more fluid and computationally lighter trajectory smoother along with incremental distance field updates. The new trajectory smoother modifies the timing intervals to respect velocity and acceleration bounds. This differs from the existing shortcutting techniques, which move the actual point locations. Incremental distance field updates save computation time by only updating the portions of the distance map that have changed. 
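
To illustrate the idea of adjusting timing rather than moving waypoints, here is a rough Python/NumPy sketch that time-parameterizes a fixed sequence of joint waypoints so that per-joint velocity and acceleration limits hold; the iteration scheme, function name, and limits are assumptions made for illustration, not the actual filter in the arm_navigation stacks.

    # Hypothetical sketch: stretch the time intervals between fixed waypoints
    # until velocity and acceleration bounds are respected. The waypoint
    # positions themselves are never moved.
    import numpy as np

    def retime_trajectory(waypoints, v_max, a_max, dt_min=1e-3):
        """waypoints: (N, J) joint positions; v_max, a_max: per-joint limits.
        Returns N time stamps such that the limits hold between waypoints."""
        waypoints = np.asarray(waypoints, dtype=float)
        n = len(waypoints)
        dt = np.full(n - 1, dt_min)

        # First pass: stretch each interval so average velocities obey v_max.
        for i in range(n - 1):
            dq = np.abs(waypoints[i + 1] - waypoints[i])
            dt[i] = max(dt[i], np.max(dq / v_max))

        # Second pass: stretch neighboring intervals until the implied
        # accelerations obey a_max (slowing down only, never speeding up).
        for _ in range(100):
            changed = False
            v = (waypoints[1:] - waypoints[:-1]) / dt[:, None]
            for i in range(n - 2):
                acc = np.abs(v[i + 1] - v[i]) / (0.5 * (dt[i] + dt[i + 1]))
                scale = np.sqrt(np.max(acc / a_max))
                if scale > 1.0:
                    dt[i] *= scale
                    dt[i + 1] *= scale
                    changed = True
            if not changed:
                break

        return np.concatenate(([0.0], np.cumsum(dt)))

Because only the intervals are stretched, the geometric path produced by the motion planner is preserved; a timing-based smoother can only slow segments down, never cut corners.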

The iterative trajectory filter and incremental distance field updates are significantly faster than previous methods, making motion planning and safe teleoperation more responsive.

Following his internship stint at Willow Garage, Ken is returning to his home base in Ottawa, Canada to continue his research.

For more details, see the arm_navigation, arm_navigation_experimental, and ompl stacks.

June 19, 2012

Clearpath Robotics is offering a grant to academic researchers who use ROS. Equipment grants for robotics research are often hard to come by. To remedy this, Clearpath Robotics started the PartnerBot Grant Program in the hopes of putting robots into the hands of deserving researchers.

PartnerBot

The purpose of this grant is to facilitate research and increase the speed of robotics development by encouraging the publishing of results and contributions to open source initiatives. One of the prerequisites of the PartnerBot Grant is that the research team use ROS and publish their findings; this includes posting any code they write to the ROS wiki and giving back to the community.

The PartnerBot Grant Program provides funding for one Clearpath Robotics Husky A200 and associated customization services, worth a total of $25,000. The Husky A200 is a rugged unmanned research platform that is perfectly suited for all-terrain outdoor use. ROS is fully integrated, and the platform is completely customizable to suit your individual unmanned vehicle needs.

Don’t miss your chance to fund your research! Visit www.clearpathrobotics.com/partnerbot for more information.

June 18, 2012

Open Perception, Inc. Formed to Accelerate Development and Adoption of 2D and 3D Sensory Data Processing

Open Perception Logo

Over the weekend, in overflowing rooms at the Point Cloud Processing workshop and PCL tutorial at CVPR 2012, Willow Garage proudly announced the creation of Open Perception, Inc. (OP), an independent non-profit foundation focused on advancing the development and adoption of open source software for 2D and 3D processing of sensory data. For more details about the announcement, please see the official press release.

Open Perception is founded by a global community of researchers and engineers for the benefit of the industrial and research 3D perception communities. Its most important work to date is the Point Cloud Library (PCL), a large-scale, BSD-licensed open source project for 3D point cloud processing.

The foundation is set up to receive donations and sponsorship from anyone, and will concentrate on paying developers in the community, giving students travel grants and stipends, organizing open source events, and supporting its projects, such as PCL. The more support OP receives, the more it can do and give back to the entire world.

Open Perception is an open organization. We welcome support ranging from a pat on the shoulder to lines of code and monetary donations. Please visit the foundation's Get Involved! page and see how you can contribute today.

June 7, 2012
Passion starts with inspiration.  For many of us who are passionate about robotics, that inspiration came through the works of Ray Bradbury. 
 
We at Willow Garage humbly thank Mr. Bradbury for his contributions to the worlds of science and technology and send our condolences to Mr. Bradbury's friends and family.
 

Ray Bradbury

 

Given the untold influence of Mr. Bradbury's contributions to science, technology, innovation and imagination, there are countless encomiums pouring out from individuals inspired by Mr. Bradbury's life and work.  Read the articles from Wired, The New Yorker, and The Washington Post.  Better yet, pick up a Bradbury novel or story, find yourself a comfortable chair, and find your inspiration in Bradbury's imagination.



Photo by Alan Light
May 30, 2012

Cornell Bear

At Cornell University, the Personal Robotics lab is home to robots named Blue, Polar, Panda and the latest addition, Kodiak. Kodiak is a PR2 robot from Willow Garage. Because Cornell's mascot is a bear, all of their robots are named after famous members of the Ursidae family.

To date, the team at Cornell has been programming its robots to perform everyday chores such as grasping novel objects, unloading items from a dishwasher, placing new items, organizing a disorganized house, finding and retrieving items on request, and much more.

A summary of these projects is available here.

Now, Ashutosh Saxena, Assistant Professor of Computer Science and director of Cornell's Personal Robotics lab, is putting PR2 to work on similar tasks. In order for PR2 and its descendants to be truly useful personal robots, they need to learn how to perform challenging tasks such as cooking a simple meal, operating household appliances, and more. As you can see in the following videos, PR2 is already hard at work and has taken its first few steps: it is now able to perceive cardboard boxes and plan its actions in order to close them autonomously.

PR2 also shows off its culinary and cutlery skills.

The degree of difficulty in both of these examples is extremely high, because getting robots to perceive such environments well enough to perform successful manipulation is challenging. Most robots have been designed to work in highly structured environments such as factories; however, human environments are inherently unstructured; we are a messy species. In all likelihood, the objects encountered by today's personal robots haven't been seen or handled by robots before. At the same time, humans are very familiar with and very comfortable in our unstructured world. The team at Cornell, with expertise in broad-competence artificial intelligence techniques, is developing new algorithms and software that will bridge the gap between a robotics industry that excels in structured environments and human environments that lack such structure.

One example is enabling PR2 to perceive its 3D environment so that it can identify objects, figure out where they belong, and put them away. Many 3D images are stitched together to create an overall view of the room, which the algorithm then divides into blocks based on discontinuities of color and shape. The robot has been shown several examples of each kind of object and learns its common characteristics. For each block, it computes the probability of a match with each object in its database and chooses the most likely match. The ROS packages for doing so are available at: http://pr.cs.cornell.edu/codedata.php
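
To make that matching step concrete, here is a small, hypothetical Python sketch in the spirit of the description above: each segmented block is summarized by a feature vector, compared against per-class statistics learned from labeled examples, and assigned the most probable label. The independent-Gaussian model and the feature representation are assumptions made for illustration; the actual code lives in the Cornell ROS packages linked above.

    # Hypothetical sketch of scoring a segmented block against learned object
    # models and choosing the most likely match.
    import numpy as np

    class ObjectModel:
        """Per-class feature statistics learned from labeled example blocks."""
        def __init__(self, label, examples):
            examples = np.asarray(examples, dtype=float)  # (num_examples, num_features)
            self.label = label
            self.mean = examples.mean(axis=0)
            self.var = examples.var(axis=0) + 1e-6        # guard against zero variance

        def log_likelihood(self, feature):
            # Independent-Gaussian log likelihood of this block's features.
            return -0.5 * np.sum((feature - self.mean) ** 2 / self.var
                                 + np.log(2 * np.pi * self.var))

    def classify_block(feature, models):
        """Return the most likely label and a normalized score per class
        (a posterior under a uniform prior)."""
        feature = np.asarray(feature, dtype=float)
        scores = np.array([m.log_likelihood(feature) for m in models])
        probs = np.exp(scores - scores.max())
        probs /= probs.sum()
        best = models[int(np.argmax(probs))].label
        return best, dict(zip([m.label for m in models], probs))

A block whose color and shape features fit none of the models will still receive a "most likely" label, so in practice a minimum-probability threshold would be applied before the robot acts on the match.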

In order to figure out how to perform manipulation techniques, Saxena's lab plans to have the robot learn by observing people perform daily activities in the home, including the manipulation of objects. Addressing this problem in a data-driven way will allow the robots to autonomously perform several tasks. Since the robots in this world can be networked, they will be able to routinely improve their capabilities using Internet data resources. This means that the practical experiences of one robot can be shared by others. Specifically, Yun Jiang, a PhD student in the Robot Learning Lab, used Google's 3D warehouse in order to have robots automatically learn how the objects are used by humans, and then transfer that manipulation skill to personal robots.

The personal robotics team at Cornell consists of 10 students together with expert faculty, including Professor Bart Selman, artificial intelligence and planning; Professor Doug James, computer graphics simulation; Professor Thorsten Joachims, machine learning; and Professor Ashutosh Saxena, robot learning and perception.

May 14, 2012

The PR2 community recently got together for a workshop hosted by the University of Freiburg, Germany. The workshop was loosely structured around a hackathon to implement pieces of this year's ICRA mobile manipulation challenge - Yesterday's Sushi - using the PR2. But the real goal was to meet and work with new people, exchange code and tips and tricks, and generally have a great time! Teams were formed by mixing people from different schools and different fields, making for an exciting and social week. Thanks to the Universities of Freiburg, Leuven and Munich, the six teams had three PR2s to hack on.

At the end of the week, all of the teams demonstrated novel robot behaviors. Some teams decided to concentrate on picking up dishes from an automatic turntable, others stacked and unstacked dishes, one group performed two-handed manipulation of a tray, and another planned the best order in which to set a table. Along the way a lot of caffeine and schnitzel were consumed, a walk and meal in the beautiful Black Forest were enjoyed, and the robots got to play with real sushi.

A big THANK YOU to Juergen Hess, Wolfram Burgard and all of our hosts at the University of Freiburg for organizing the event!

You can download some of the software here.

Enjoy some of the highlights of the week in the video above and the photos here.

Want to see more? Watch the final demonstrations of the Mobile Manipulation Challenge at ICRA on Wednesday May 16th starting at 2:30 pm! 


May 12, 2012

We're looking forward to seeing you in St. Paul, Minnesota at ICRA 2012 from May 14-18, 2012! If you're interested in what Willow Garage has been up to lately, come check out our booth in the exhibition area, our talks at the conference, and the ICRA Sushi Challenge.

Workshops, Tutorials and Challenges

Tuesday, May 15 and Wednesday, May 16:

Friday, May 18:

Research Papers

Tuesday, May 15:

  • 09:00 "Navigation in Three-Dimensional Cluttered Environments for Mobile Manipulation" (TuA210.4)
  • 11:30 "Real-Time Compression of Point Cloud Streams" (TuB08.5)

Thursday, May 17:

  • 08:45 "FCL: A General Purpose Library for Collision and Proximity Queries" (ThA06.2)  
  • 09:00 "Search-Based Planning for Dual-Arm Manipulation with Upright Orientation Constraints" (ThA04.3)  
  • 10:45 "Exploiting Segmentation for Robust 3D Object Matching" (ThB08.2)
  • 17:00 "3DNet: Large-Scale Object Class Recognition from CAD Models" (ThD210.2)