Help Wanted: PR2 Behavior Design

Our online animation study went live and is now complete! We invited you to weigh in with your perspective on our PR2 animations. You got a sneak preview of these animations while they were under development, and the full study gave you a chance to check out the whole set. Though the study is now closed, you're welcome to try it out anyway.

The above video shows an example of the type of clip you'll find in the full-length study. We're investigating how to make personal robot behaviors more human-readable by identifying important principles that can inform the design of more effective human-robot interactions. To do this, we need your feedback on how you interpret certain robot behaviors. Please follow the link up top, and help us make PR2 more human-readable.

We've submitted the results of this study for peer-reviewed publication so that others can apply its lessons to different robots, too.

Comments

multiple animations for a given task or problem

It seems to me that there could be multiple animations for any given interaction. I hope you pursue this idea as you go forward, rather than settling on one set of animations. This could have the benefit of avoiding repetition, but it also has the potential to be a point of customization, much like ringtones, wallpapers, or application 'skins', bringing some individual 'personality' to particular robots, either by being assigned or self-randomized by the robot itself. It also strikes me that very subtle variations may be enough to convey changes in status, confidence, or determination: how emphatically a robot gestures with a power cord could tell how badly it needs help, multiple shakes or points in a given direction could emphasize importance, etc.

multiple dimensions of animations are a must

I definitely agree with your take on the need for multiple animations for any given robot and situation. This is a starter set, intended as a step toward stimulating more research and interaction design work that draws on animation techniques to inform human-robot interactions. As you note, status, confidence, and determination are likely to be important elements that will influence the way a robot could be animated. We are also very interested in issues of customization and self-extension; here's one study we ran in this space. We're definitely looking to pursue these questions about the dimensions that will influence robot forms and behaviors in more depth. We will do our best to continue sharing our findings along the way, too. Thanks for your thoughtful feedback on this study!

Thanks!

We're really hoping to glean some useful and generalizable lessons from this study to make gestures clearer on robots. We've got other animations like these playing on PR2, but not this particular set.

We're not only interested in getting these behaviors onto PR2, though. We will be submitting these results to an IEEE or ACM conference so that others can learn from this study and use its results, too.

Sorry about the video embedding problems. We are working to resolve this issue now. We've also added contact info and more background about the study to the survey. Thanks so much for your interest and for the feedback!

Pretty obvious

The gestures by the robots shown in the video are pretty clear to me, much clearer than the snippets in the online study. Looks very promising. Can't wait to see these kinds of gestures on real robots.

tech & setup trouble

I'd be happy to take part, but it requires some sort of plugin from Apple (even for .flv and .wmv files, according to the embed tag?!) which I don't have and for which my Firefox can't find a replacement. This appears to be due to the use of an embed tag for the videos, instead of a dedicated player or the HTML5 video tag. Additionally, it'd be cool if there were a contact e-mail and a statement on what kind of data you're collecting and how it's going to be used, right on the first survey page.
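
For example, something along these lines would play natively in modern browsers without any plugin (the file names here are just placeholders, not the study's actual clips):

  <!-- HTML5 video element with multiple source formats as fallbacks;
       the browser picks the first format it can play, no plugin needed.
       File names are hypothetical placeholders. -->
  <video controls width="640">
    <source src="pr2-clip.mp4" type="video/mp4">
    <source src="pr2-clip.webm" type="video/webm">
    Sorry, your browser does not support the HTML5 video tag.
  </video>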

Results

Will the full results of the study be made freely available for others to use on their robots?

Results

Yes, the results are now available in our published paper at the HRI 2011 conference: http://www.willowgarage.com/papers/expressing-thought-improving-robot-re...