Motion Planning

Our work encompasses multiple research areas. We are investigating live environment construction from 3D point clouds produced by scanning lasers and stereo cameras, modeling the occlusions caused by the robot's own body. Information from this 3D representation is also used to construct a distance field whose gradient provides directions for moving away from obstacles. Our ultimate goal is to fuse semantic information with real-time sensor data to build more compact 3D representations for fast collision checking and motion planning.
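The idea of a distance field with escape gradients can be sketched in a few lines; the brute-force grid computation and function names below are illustrative assumptions, not the actual implementation, which operates on much larger 3D grids.

```python
import numpy as np

def distance_field(occupancy, resolution=1.0):
    """Brute-force Euclidean distance field: for every cell, the distance
    (in meters) to the nearest occupied cell. Illustrative only; real
    systems use fast distance-transform algorithms on 3D voxel grids."""
    obstacles = np.argwhere(occupancy)            # coordinates of occupied cells
    field = np.zeros(occupancy.shape)
    for idx in np.ndindex(occupancy.shape):
        diffs = obstacles - np.array(idx)
        field[idx] = resolution * np.sqrt((diffs ** 2).sum(axis=1)).min()
    return field

def escape_gradient(field, cell):
    """Finite-difference gradient at `cell`; following the positive
    gradient moves away from the nearest obstacle."""
    gy, gx = np.gradient(field)
    return np.array([gy[cell], gx[cell]])

# A 5x5 grid with one occupied cell in the center.
occ = np.zeros((5, 5), dtype=bool)
occ[2, 2] = True
field = distance_field(occ)
print(field[2, 2])   # 0.0: inside the obstacle
print(field[0, 0])   # sqrt(8): the corner is farthest from the obstacle
```

Following the gradient at any free cell points the robot away from the occupied center cell, which is exactly the information a trajectory optimizer or reactive controller needs.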

Another area of focus is developing and implementing fast, novel motion planners. We are currently exploring three kinds of planners: probabilistic planners (through a collaboration with Ioan Sucan and Lydia Kavraki at Rice), anytime search-based planners (in collaboration with Maxim Likhachev and Ben Cohen at Penn), and trajectory optimizers based on the CHOMP algorithm developed at the Intel/CMU Robotics Lab (implemented by Mrinal Kalakrishnan from USC during his summer internship at Willow Garage). A major part of this effort is the creation of a planning infrastructure into which different types of planners can easily be plugged. We also plan to extend this infrastructure to whole-body planning and control of robots with a mobile base and arms, such as the PR2.
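A plug-in planner infrastructure of this kind might look like the following minimal sketch; the interface, registry, and planner names are hypothetical, not the actual API.

```python
from abc import ABC, abstractmethod

class MotionPlanner(ABC):
    """Common interface every pluggable planner implements (hypothetical)."""
    @abstractmethod
    def plan(self, start, goal):
        """Return a list of waypoints from start to goal, or None on failure."""

class StraightLinePlanner(MotionPlanner):
    """Trivial placeholder planner: linearly interpolates to the goal."""
    def plan(self, start, goal, steps=5):
        return [tuple(s + (g - s) * t / steps for s, g in zip(start, goal))
                for t in range(steps + 1)]

# A registry lets sampling-based, search-based, or optimization-based
# planners be swapped in without touching the calling code.
PLANNERS = {"straight_line": StraightLinePlanner}

def request_plan(name, start, goal):
    return PLANNERS[name]().plan(start, goal)

path = request_plan("straight_line", (0.0, 0.0), (1.0, 2.0))
print(path[0], path[-1])   # (0.0, 0.0) (1.0, 2.0)
```

Registering a probabilistic or search-based planner would mean adding one entry to the registry; callers only depend on the abstract interface.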

We aim for safe physical-contact interaction between robots and untrained people. To provide a reusable, extensible system that simultaneously considers the motions of all limbs and allows for compliant behavior, we are collaborating with Oussama Khatib's group at the Stanford Robotics and AI Lab to apply whole-body operational space control to the PR2.
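A much-simplified flavor of compliant task-space control can be sketched as a Jacobian-transpose law; the gains and the toy Jacobian below are made-up values for illustration, not the PR2's kinematics or the actual controller.

```python
import numpy as np

def jacobian_transpose_control(J, x, x_des, kp=10.0):
    """Map a virtual task-space spring force into joint torques: tau = J^T f.
    Commanding forces rather than positions is what makes contact compliant:
    the arm yields when pushed instead of fighting to hold a position."""
    f = kp * (np.asarray(x_des) - np.asarray(x))   # spring toward the task goal
    return J.T @ f

# Toy end-effector Jacobian at some arm configuration (illustrative values).
J = np.array([[0.0, 0.5],
              [1.0, 0.5]])
tau = jacobian_transpose_control(J, x=[0.2, 0.0], x_des=[0.4, 0.0])
print(tau)   # joint torques that pull the end effector toward x_des
```

The full operational space formulation additionally accounts for the arm's inertia and for lower-priority postural tasks in the null space of the primary task, which is what "whole-body" control adds over this two-joint toy.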

We are also interested in planning and control in dynamically changing environments, especially in the presence of people. To this end, we are exploring planners that explicitly account for the motion of dynamic obstacles. We intend to integrate these planners with perceptual models of moving obstacles and with reactive controllers that operate at a more local level and can safely execute desired plans around people and other moving obstacles.
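One basic ingredient of such planners is checking a time-stamped trajectory against a predicted obstacle motion. The constant-velocity prediction and clearance threshold below are simplifying assumptions for the sketch; real perceptual models are richer.

```python
import numpy as np

def collides(traj, obs_pos, obs_vel, clearance=0.5):
    """Check a time-stamped robot trajectory [(t, x, y), ...] against an
    obstacle whose motion is predicted with a constant-velocity model."""
    for t, x, y in traj:
        predicted = np.asarray(obs_pos) + t * np.asarray(obs_vel)
        if np.hypot(x - predicted[0], y - predicted[1]) < clearance:
            return True
    return False

# Robot drives along the x-axis; a person walks across its path.
traj = [(t, t, 0.0) for t in np.linspace(0.0, 2.0, 21)]
print(collides(traj, obs_pos=(1.0, 1.0), obs_vel=(0.0, -1.0)))  # True: paths cross near t = 1
print(collides(traj, obs_pos=(1.0, 5.0), obs_vel=(0.0, -1.0)))  # False: the person stays far away
```

Because the check is indexed by time, the same geometric path can be safe or unsafe depending on its velocity profile, which is precisely why dynamic obstacles need planners that reason in space-time.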

In addition to designing motion planners for fully autonomous behavior, we are exploring mixed autonomy, where a human in the loop makes some decisions while being assisted by the motion planning framework, e.g., in assisted tele-operation of a mobile base. Our system would effectively serve as a guide or assistant, actively preventing unsafe behavior while still allowing the human operator to accomplish his or her desired tasks.
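The guide-or-assistant behavior can be illustrated by a simple velocity governor that scales the operator's command by proximity to the nearest obstacle; the thresholds and linear slowdown are invented for this sketch.

```python
def govern(cmd_vel, dist_to_obstacle, stop_dist=0.25, slow_dist=1.25):
    """Scale an operator velocity command: full speed when clear, linear
    slowdown inside slow_dist, hard stop inside stop_dist."""
    if dist_to_obstacle <= stop_dist:
        scale = 0.0                                # too close: block motion
    elif dist_to_obstacle >= slow_dist:
        scale = 1.0                                # clear: pass command through
    else:
        scale = (dist_to_obstacle - stop_dist) / (slow_dist - stop_dist)
    return tuple(v * scale for v in cmd_vel)

print(govern((1.0, 0.0), 2.0))    # (1.0, 0.0): far from obstacles
print(govern((1.0, 0.0), 0.75))   # (0.5, 0.0): halfway through the slowdown band
print(govern((1.0, 0.0), 0.1))    # (0.0, 0.0): unsafe, motion prevented
```

The operator retains directional control throughout; the governor only limits speed, which is the essence of preventing unsafe behavior without taking over the task.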

Relevant Publications:

Wolfe, J., Marthi, B., and Russell, S., "Combined Task and Motion Planning for Mobile Manipulation," International Conference on Automated Planning and Scheduling, Toronto, Canada, May 2010. Download: ctamp-icaps-2010.pdf (134.6 KB); ctamp-tr.pdf (188.95 KB)
Sucan, I. A., Kalakrishnan, M., and Chitta, S., "Combining Planning Techniques for Manipulation Using Realtime Perception," International Conference on Robotics and Automation, Anchorage, Alaska, 2010. Download: arm_planning.pdf (1.89 MB)
Chitta, S., Cohen, B., and Likhachev, M., "Planning for Autonomous Door Opening with a Mobile Manipulator," International Conference on Robotics and Automation, Anchorage, Alaska, 2010. Download: door_planner.pdf (1.87 MB)
Cohen, B., Chitta, S., and Likhachev, M., "Search-Based Planning for Manipulation with Motion Primitives," International Conference on Robotics and Automation, Anchorage, Alaska, 2010.