Willow Garage Blog
Nearly a million people have watched UC Berkeley's PR2 folding towels and sorting socks on YouTube, and it's easy to understand why: having a robot that can do your laundry is a fantasy that's been around since The Jetsons, and while we're not there yet, it's not nearly as far off a future as it was before the PR2 Beta Program. Since those demos, one of the research groups at Berkeley has been working on ways of making the laundry cycle faster, more efficient, and more complete, and for starters, they've taught their PR2 to reliably handle your pants.
The goal of Pieter Abbeel’s group is to teach a robot to solve the laundry problem. That is, to develop a system to enable a robot to go into a home it's never seen before, load and unload a washer and dryer, and then fold the clean clothes and put them away just like you would. The first aspect of this problem that the group tackled was folding, which is one of those things that seems trivial to us but is very difficult for a robot to figure out since clothes are floppy, unpredictable, and often decorated with tasteless and complicated colors and patterns.
Last year, the Berkeley PR2 (unofficially named Brett, for “Berkeley Robot for the Elimination of Tedious Tasks”) showed us that it could pick towels out of a pile of clean laundry one by one and neatly fold and stack them, which was an impressive demo. Somewhat less impressive was the fact that the robot took between 20 and 25 minutes to neatly fold a single towel, which, let's face it, isn't entirely practical. That time has now been cut to under six minutes, with the potential for as little as two minutes per towel if they really crank the robot up.
The way that Berkeley has been able to improve the performance of the folding software so dramatically is by reducing the dependence on a complex vision system and instead relying on gravity and the properties of cloth. The PR2 now just picks up a towel wherever is convenient and then drags it across a folding table, knowing that as it does, the piece of the towel furthest away from the gripping point must necessarily be a corner. By grabbing that corner and repeating the procedure, the robot is able to quickly pick up two opposite corners of the towel. This puts the towel into one of two states, and from there, the PR2 has no trouble folding it. This general approach also works on shirts and pants and whatever else a robot might find in your laundry.
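The corner-finding trick can be sketched in a few lines. This is a toy 2D illustration of the idea only; the towel coordinates and grip point are made up, and it stands in for what is really a perception-and-manipulation loop on the robot:

```python
import math

def farthest_point(points, grip):
    """Return the cloth point farthest from the grip point.

    When cloth hangs or is dragged from a single grip point, the point
    that ends up farthest away is necessarily a corner -- the gravity
    heuristic described above, with no complex vision system required.
    """
    return max(points, key=lambda p: math.dist(p, grip))

# Hypothetical towel, modeled as its four corners plus interior points.
towel = [(0, 0), (0, 1), (2, 0), (2, 1), (1, 0.5), (0.5, 0.2)]

grip = (0.6, 0.4)                         # grab wherever is convenient
corner = farthest_point(towel, grip)      # drag: farthest point is a corner
opposite = farthest_point(towel, corner)  # regrasp there and repeat
print(corner, opposite)                   # two opposite corners of the towel
```

Holding two opposite corners collapses the towel into one of only two possible states, which is why the rest of the fold becomes easy.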
While the actual folding is what we care about most, the trickiest part for the robot is just getting a random piece of crumpled-up laundry into a state where it can tell what kind of clothing it's got, and that's what grad student Arjun Singh has been working on. Specifically, he's taught the PR2 to unfold and identify complex items of clothing like shirts, skirts, and pants. This is an essential capability for the PR2, since it enables the robot to grab a whole bunch of random laundry out of the dryer, uncrumple and identify each piece, reorient its grip, and then fold it properly.
At this stage, the laundry problem as a whole is “almost in principle solved,” as student Stephen Miller explains. “There are tiny little things that keep us from being able to put all the pieces together reliably, but the detection problem and the folding and unfolding, all of that is pretty much a non-issue regardless of what article of clothing you’re looking at.” Other students are working on getting the PR2 familiar with how to work a washing machine, how to unload a dryer, and even how to put clothes on hangers. Plus, all of Berkeley’s experience with deformable objects has led to some clever ROS packages that could also be adapted to (say) teach a robot to make your bed every morning.
Professor Pieter Abbeel is optimistic that by the time the PR2 Beta Program ends in 2012, they’ll be able to do an entire laundry cycle from start to finish. This of course means that at some point early next year, you’ll start seeing a lot of computer science students dragging bags of dirty laundry into the Berkeley robot lab, and going home with stacks of clean clothes, neatly folded and smelling of robot.
Julius Kammerl from Technische Universitaet Muenchen in Munich, Germany spent his internship at Willow Garage working on the Point Cloud Library (PCL). For more technical details, please see the slides below (download pdf).
Robots such as Willow Garage's PR2 employ depth sensors to acquire information about the shape and geometry of their environment. These sensors discretely sample three-dimensional space at high spatial resolution and a high update rate, and therefore generate large point data sets. When these so-called point clouds have to be stored on the robot or transmitted over rate-limited communication channels, efficient algorithms for compressing and communicating them become highly relevant. Further applications for point cloud compression can be found in 3D television and conferencing.
In our work, we compress the point distribution by performing a spatial decomposition based on octree data structures. Furthermore, by correlating the currently sampled sensor data with previously sensed and transmitted point cloud information, temporal redundancy can be detected and removed from the point cloud data stream. Detecting changes between point data sets is key here: by comparing the octree data structures of adjacent point clouds, spatial changes can be extracted and used to successively extend the point cloud at the decoder. In addition, an entropy coder (range/arithmetic coder) removes further redundancy from the signals to be transmitted or stored. For more information, please visit pointclouds.org.
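As a rough illustration of the change-detection idea, here is a flat voxel-set stand-in for the real octree comparison. The coordinates and resolution are made up, and a set of quantized keys substitutes for PCL's actual octree structures:

```python
def voxel_keys(points, resolution):
    """Quantize 3D points into a set of occupied-voxel keys -- a flat
    stand-in for the octree's set of occupied leaf cells."""
    return {tuple(int(c // resolution) for c in p) for p in points}

def changed_voxels(prev_cloud, curr_cloud, resolution=0.05):
    """Voxels occupied in the current frame but not the previous one.

    Only this difference needs to be entropy-coded and transmitted to
    extend the point cloud at the decoder; everything else is temporal
    redundancy that the encoder can drop.
    """
    return voxel_keys(curr_cloud, resolution) - voxel_keys(prev_cloud, resolution)

frame0 = [(0.00, 0.00, 0.00), (0.10, 0.00, 0.00)]
frame1 = [(0.00, 0.00, 0.00), (0.10, 0.00, 0.00), (0.20, 0.10, 0.00)]
print(changed_voxels(frame0, frame1))  # only the newly occupied voxel
```

In the real codec, comparing the two octrees level by level yields this difference hierarchically, and the resulting bit stream is then squeezed further by the range/arithmetic coder.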
We've had a great response to TurtleBot from hobbyists and developers, including a lot of interest from people who want to build their own robots, starting with components they already have. And we're delighted to see similar ROS-based low-cost mobile robots like Bilibot, POLYRO, and PrairieDog. So we decided that we can best support this emerging community by publishing all the information needed to build your own TurtleBot. Following the recently established Open Source Hardware (OSHW) definition, we will make available part numbers, CAD drawings for laser-cut hardware, board layouts and schematics, and all the necessary documentation.
Everything is coming together at turtlebot.com, a portal for all things TurtleBot and a place for the community to exchange ideas, including your own designs. To stay up to date as the site develops, subscribe to the TurtleBot announcement list.
We're currently lining up suppliers of components; if you're interested in selling TurtleBot parts, send email to firstname.lastname@example.org.
Yesterday at Google I/O, developers at Google and Willow Garage announced a new rosjava library that is the first pure-Java implementation of ROS. This new library was developed at Google with the goal of enabling advanced Android apps for robotics.
The library, tools, and hardware that come with Android devices are well-suited for robotics. Smartphones and tablets are sophisticated computation devices with useful sensors and great user-interaction capabilities. Android devices can also be extended with additional sensors and actuators thanks to the Open Accessory and Android @ Home APIs that were announced at Google I/O.
The new rosjava is currently in alpha release mode and is still under active development, so there will be changes to the API moving forward. For early adopters, there are Android tutorials to help you send and receive sensor data to a robot.
This announcement was part of a broader talk on Cloud Robotics, which was given by Ryan Hickman and Damon Kohler of Google, as well as Ken Conley and Brian Gerkey of Willow Garage. This talk discusses the many possibilities of harnessing the cloud for robotics applications, from providing capabilities like object recognition and voice services, to reducing the cost of robotics hardware, to enabling the development of user interfaces in the cloud that connect to robots remotely. With the new rosjava library, ROS developers can now take advantage of the Android platform to connect more easily to cloud services.
Earlier this week, the Willow Garage PR2 was awarded the prestigious ACE Award at EE Times' Annual Creativity in Electronics Awards ceremony held at the Embedded Systems Conference. We are honored to be placed in the good company of the other ACE Award recipients.
Our award was for "Technology in the Service of Society," which recognizes the technologies with the greatest potential to provide the most overall benefit to humankind.
The award stems from an IEEE Spectrum Magazine article announcing its Technology Winners of 2011.
A special thank you to the folks at IEEE who put together such an enjoyable ceremony, and for the recognition of all of our hard work.
We're looking forward to seeing you in Shanghai, China at ICRA 2011 from May 9-13, 2011! If you're interested in checking out what Willow Garage has been up to lately, come check out our research talks and posters.
9:00 Workshop “Long Term Autonomy and Lifelong Learning” (WS-M-13)
9:00 Workshop “Semantic Perception, Mapping and Exploration” (WS-M-8)
Thanks to all of you who came and visited us at RoboGames 2011! We had a lot of fun watching the competitions and debuting TurtleBot. TurtleBot had a lot of fun following people (and robots) around the Exhibit Hall.
This year was a thrill for us as we got to take our first "ROS Family Photo". We appreciate all the ROS hobbyists that came from as far away as Japan and New York to show off their ROS robots. We're pretty sure the photo above is the most robots running ROS that have been pictured together. Many of the robots above should be familiar to readers of the "Robots Using ROS" series.
Also, congratulations to Christie Dudley for winning the TurtleBot raffle. She will be receiving a free TurtleBot kit when they ship later this year.
We've posted a small photo album of ROS robots at RoboGames on Flickr. Be sure to head over to Robots Dreams for lots of great coverage of the event.
PR2 has arrived safe and sound at the Intelligent Systems Research Center at the University of Ulster. As a quick side project for fun, their Cognitive Robotics Group immediately put their PR2 to the test by having it autonomously solve a Rubik's Cube. The PR2 visually inspects the cube to identify all the colors and compute a solution. The code was written by Chris Burbridge and Lorenzo Riano, who also won an award in the ROS 3D contest for their Person Tracking and Reconstruction from a Mobile Base entry.
If you've wondered what it's like to have a PR2 arrive on your doorstep, ISRC has also put together a fun video of PR2's arrival:
Kai Wurm from the University of Freiburg (Germany) recently visited Willow Garage. During his stay, he worked on integrating the 3D mapping library OctoMap into the ROS and PCL frameworks. To provide real-time 3D maps of the workspace of the PR2 robot, the runtime and memory requirements of OctoMap were substantially reduced. To make OctoMap more attractive for mobile manipulation, he also investigated the use of collections of multi-resolution maps to model movable objects at millimeter resolution.
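The multi-resolution idea can be illustrated with a small sketch. The point, resolution, and function below are hypothetical, not OctoMap's actual API: each level up an octree doubles the cell edge length, so a coarser-level voxel key is just a bit-shift of the fine-level key, which is what lets one structure serve both millimeter-scale object models and coarse room-scale queries:

```python
def key_at_depth(point, leaf_res, levels_up):
    """Voxel key for a point at a coarser octree level.

    Each level up the tree doubles the cell edge length, which for
    non-negative integer keys is simply a right-shift of the leaf key.
    """
    # round() guards against floating-point error in the quantization.
    leaf_key = tuple(int(round(c / leaf_res)) for c in point)
    return tuple(k >> levels_up for k in leaf_key)

# A movable object mapped at millimeter resolution (leaf_res = 0.001 m).
p = (0.123, 0.456, 0.789)
print(key_at_depth(p, 0.001, 0))  # finest level, 1 mm cells: (123, 456, 789)
print(key_at_depth(p, 0.001, 4))  # 4 levels up, 16 mm cells: (7, 28, 49)
```

Storing only the levels a task needs is one way the runtime and memory footprint can be kept small enough for a mobile manipulator.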
We are proud to introduce TurtleBot, a unique combination of state-of-the-art technology in a hobby platform. In keeping with Willow Garage's mission to bring personal robotics to the home, we feel that the time is ripe to put a low-cost, personal robot kit with open-source software in the hands of hobbyists and developers.
With TurtleBot, you'll be able to build a robot that can drive around your house, see in 3D, and have enough horsepower to create exciting applications. The main hardware includes:
- iRobot Create: mobile base that has been an effective platform for robotics in education.
- Microsoft Kinect: camera and 3D sensor in one package.
- Asus Eee PC 1215N dual-core Atom notebook: powerful enough to handle the demands of 3D data.
- Low-cost gyro: enhances the TurtleBot's ability to navigate around the home.
If you're tired of wiring and soldering to get your robot up and running, don't worry -- the TurtleBot assembles quickly with just a single screwdriver included in the kit.
The TurtleBot comes with an open-source, ROS-based TurtleBot SDK that lets you get the most of the hardware. The TurtleBot SDK integrates the hardware drivers with developer tools and high-level capabilities like autonomous navigation. You'll be able to develop apps from day one that build on powerful computer vision libraries like OpenCV and PCL. You'll also be able to access the thousands of libraries that the ROS community has built and share your code with the rest of the TurtleBot community.
With TurtleBot, we are adding a new dimension of possibilities to your Kinect hacking: the ability to drive. TurtleBot can explore your house on its own, build 3D pictures, take panoramas, and more. Check out the results of our ROS 3D Contest to see some of the exciting possibilities for the Kinect in robotics.
There are two ways to bring a TurtleBot home. If you already have an iRobot Create and a laptop, you can purchase the TurtleBot Core kit for $499.99. The TurtleBot Core kit includes:
- USB Communications Cable
- TurtleBot Power and Sensor Board
- TurtleBot Hardware
- Microsoft Kinect
- TurtleBot to Kinect Power Cable
- USB Stick TurtleBot Installer
- #10 Torx Allen Key
The TurtleBot Complete kit sells for $1199.99 and includes everything you need to get started:
- TurtleBot Core Kit
- iRobot Create Robot
- 3000 mAh Ni-MH Battery
- Fast Charger
- ASUS Eee PC 1215N
We look forward to seeing what you can make TurtleBot do!