Invited talk: Katherine Kuchenbecker

Haptic Intelligence for Robots

Humans use the sense of touch to fluidly manipulate all kinds of objects every day. Surprisingly, most autonomous robots do not take advantage of haptic (touch) sensing and thus struggle to match the manual capabilities of humans. This talk will present several ways we can endow autonomous robots with haptic intelligence, using examples from Professor Kuchenbecker’s research. Most notably, we have found that using haptic data from a variety of sensor types yields the most robust performance, as different sensors excel at detecting different events and object properties. First, we developed a suite of tactile data processing and control approaches that enable a Willow Garage PR2 to pick up and set down unknown objects using only built-in sensors; similar techniques also facilitate physical human-robot interaction. Second, we equipped our PR2 with a pair of SynTouch BioTac biomimetic sensors and used data collection and machine learning to enable it to describe how new objects feel to the touch. Third, we are in the process of collecting a large visuo-haptic dataset of how natural surfaces both look and feel, with the goal of enabling robots to “feel” with their eyes.
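
To make the second thrust concrete, the sketch below shows one plausible shape for such a touch-description pipeline: summarize each BioTac exploration into a small feature vector, then train one binary classifier per haptic adjective. The feature choices, adjective labels, and function names are illustrative assumptions using standard NumPy and scikit-learn; the talk does not specify the actual features or models used.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    def summarize_biotac(pressure, vibration, temperature):
        """Reduce raw BioTac time series from one touch to a feature vector.

        All features here are illustrative guesses, not the talk's pipeline.
        """
        return np.array([
            pressure.mean(),                              # overall contact force level
            pressure.std(),                               # compliance-related variation
            np.abs(np.fft.rfft(vibration))[1:50].mean(),  # texture-induced vibration power
            temperature[-1] - temperature[0],             # heat flow into the object
        ])

    def train_adjective_classifiers(feature_matrix, label_dict):
        """Fit one binary classifier per adjective (e.g. "soft", "rough").

        label_dict maps each adjective to a binary label per touched object.
        """
        return {
            adjective: make_pipeline(StandardScaler(), SVC()).fit(feature_matrix, labels)
            for adjective, labels in label_dict.items()
        }

    def describe(features, classifiers):
        """Return the adjectives whose classifier fires for a new object."""
        x = features.reshape(1, -1)
        return [adj for adj, clf in classifiers.items() if clf.predict(x)[0] == 1]

With classifiers trained on a library of labeled objects, describing a new object reduces to one exploratory touch followed by describe(summarize_biotac(...), classifiers); a per-adjective binary formulation lets the robot report several qualities at once rather than forcing a single category.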