Featured Package: tf

TF Visualizer

One of the problems we've tried to simplify in ROS is keeping track of coordinate frames for robot data. The ROS platform is a distributed system and our PR2 robot has many sensors and manipulators in different spatial positions and orientations. We don't want sources of data to have to publish their data under different permutations of transforms, nor do we want subscribers to manually keep track of transforms. Instead we want to publish the data once and have the rest of the system automatically know how to interpret it.

The tf ('transform') package enables ROS nodes to keep track of coordinate frames and transform data between coordinate frames. Like ROS, it works in a distributed manner: a node that knows a transform between frames publishes it to a ROS topic; nodes that are interested use tf to subscribe to the transforms. tf can then efficiently compose the net transform between requested frames. Using linear algebra classes and methods from the open source Bullet Physics library, the tf package works with a variety of data types: points, vectors, point clouds, orientations, and poses. ROS makes sure that the transforms are easily associated with this data by including a special "frame identifier" header field that stays with messages as they are passed around.
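To make that concrete, here is a minimal sketch in C++ using the classic tf API: one node broadcasts a transform between two frames, and a listener asks tf for the composed (net) transform between any two frames it knows about. The frame names ("base_link", "laser", "map") are placeholders for illustration, and exact headers and signatures may vary by ROS release.

```cpp
#include <ros/ros.h>
#include <tf/transform_broadcaster.h>
#include <tf/transform_listener.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "tf_example");
  ros::NodeHandle node;

  tf::TransformBroadcaster broadcaster;  // publishes transforms to the rest of the system
  tf::TransformListener listener;        // buffers transforms published by other nodes

  ros::Rate rate(10.0);
  while (node.ok())
  {
    // Publish a (hypothetical) fixed transform from "base_link" to "laser".
    tf::Transform base_to_laser(tf::Quaternion(0, 0, 0, 1),
                                tf::Vector3(0.1, 0.0, 0.2));
    broadcaster.sendTransform(
        tf::StampedTransform(base_to_laser, ros::Time::now(), "base_link", "laser"));

    // Ask tf for the net transform between two frames; tf composes the chain
    // of published transforms on our behalf.
    tf::StampedTransform map_to_laser;
    try
    {
      listener.lookupTransform("map", "laser", ros::Time(0), map_to_laser);
      ROS_INFO("laser is at x=%.2f in the map frame",
               map_to_laser.getOrigin().x());
    }
    catch (tf::TransformException& ex)
    {
      ROS_WARN("%s", ex.what());  // e.g. the "map" frame may not be published yet
    }
    rate.sleep();
  }
  return 0;
}
```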

We're committed to providing debugging tools that help you understand what's going on in the transform system. One of these tools is rviz, the 3D visualizer we are actively developing (known in earlier releases as ogre_visualizer). It helps you understand the structure of the transform tree by displaying each frame in space with its name and values, and it clarifies chains of transforms by drawing arrows that point to each frame's parent coordinate frame.

We've also designed the tf package to be usable with or without ROS, so you're free to try it out on its own or as part of the ROS ecosystem. You'll find the code at our Personal Robots SourceForge project, and you can find more technical details about the tf package, including the code APIs, on our tf wiki page.

Here's a video of the tf package in action. Optical markers control the position of the arm, and that position is fed to our robot head controller, which transforms it into its own local coordinate frame and follows the motion.
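A hedged sketch of that head-tracking idea in C++: a point stamped with the arm's frame identifier is handed to tf, which transforms it into the head's frame before it goes to a controller. The function, the frame name "head_pan_link", and the controller step are illustrative assumptions, not the actual demo code.

```cpp
#include <ros/ros.h>
#include <tf/transform_listener.h>
#include <geometry_msgs/PointStamped.h>

// A point reported in the arm's frame is transformed into the head's frame
// before being handed to a (hypothetical) head controller.
void trackPoint(const tf::TransformListener& listener,
                const geometry_msgs::PointStamped& arm_point)
{
  geometry_msgs::PointStamped point_in_head_frame;
  try
  {
    // The frame_id header on arm_point tells tf how to interpret the data,
    // so this code never has to know the chain of transforms involved.
    listener.transformPoint("head_pan_link", arm_point, point_in_head_frame);
    ROS_INFO("target in head frame: (%.2f, %.2f, %.2f)",
             point_in_head_frame.point.x,
             point_in_head_frame.point.y,
             point_in_head_frame.point.z);
    // ... command the head controller to point at point_in_head_frame ...
  }
  catch (tf::TransformException& ex)
  {
    ROS_WARN("could not transform point: %s", ex.what());
  }
}
```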

Comments

Robot arm, Vijay and Wim

I am just flabbergasted. I am already planning to get one and donate it to some Bomb Deactivation Unit. What a shame that, as of now, a human has to take the risk of approaching a bomb to deactivate it. Rammohan MD