This workshop focuses on the development of novel tactile sensors (i.e., the bodyware) and how they can contribute to robot intelligence (i.e., the mindware). Robots need touch to interact safely and effectively with their surroundings (humans and/or objects), to learn about the outside world, and to develop self-awareness. To achieve these goals, the next generation of artificial skin should measure temperature, vibration, proximity, and complete force vectors at multiple contact points; it should also be both soft and robust to facilitate long-term interactions. Still, major challenges remain: the need to analyze and interpret massive amounts of data quickly and accurately, and to combine such sensing information with other cognitive and perceptual processes to achieve real understanding. While advanced computational techniques (e.g., deep learning, Bayesian inference) can critically improve data representations, bio-inspired strategies for multimodal integration, prediction, and reasoning also seem necessary to revolutionize the robotic sense of touch. The goal of this workshop is therefore to discuss whether and how recent advances in tactile technology and data analysis have been accompanied by an increased understanding of the ways in which tactile perception can support robot autonomy and cognition.