Part of human intelligence comes from using the sense of touch to explore one's own body and learn about the world. Similarly, a humanoid robot can use tactile sensing to learn about itself and its surroundings, and to improve its ability to interact with humans and objects. Indeed, there has been remarkable progress in tactile sensors for robotics. Several robotic platforms and hands are equipped with sensors that measure pressure, proximity and temperature; moreover, novel technologies (soft, small, distributed, stretchable sensors) can be used to cover the entire body of a humanoid. These sensors can allow robots to actively explore objects and extract information that is hidden or difficult to obtain from vision, such as texture, material and weight; however, this requires the development of appropriate learning strategies that incorporate explorative behaviors, signal processing and machine learning.
This workshop aims to bring together experts in the fields of tactile sensor design, tactile data analysis, machine learning and cognitive modeling to share their knowledge, review recent work, and plan future directions. Through a mix of oral presentations, round-table discussions, poster sessions and live demonstrations, it seeks to encourage researchers to pursue these challenges.