Invited talk: Kaiyu Hang

“Dexterous Manipulation — Stable and More”

Dexterous manipulation is desired in many robotic applications, especially in scenarios where a robot is tasked to first grasp an object and then manipulate it in-hand. In this process, visual perception provides information for the grasp planner to generate an initial plan, using either analytic or data-driven approaches. However, due to the uncertainties involved in both perception and control, the initial plan can rarely be executed accurately, which can cause the execution to fail.

In this talk, aimed at addressing the uncertainties in robotic in-hand manipulation, I will first introduce our work on grasp stability prediction based on tactile sensing and learning. Thereafter, building on our work on the Hierarchical Fingertip Space, a unified framework for grasp planning and adaptation, I will discuss how tactile sensing and other proprioceptive information can close the control loop for in-hand manipulation. In particular, I will describe the design of an impedance-controller-based grasping system that can predict grasp stability from tactile feedback, adjust grasping forces, and perform finger gaiting to counteract external disturbances during grasp execution.
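
To make the control idea concrete, the sketch below shows one plausible shape of an impedance-style grasp force regulation step in Python. The stiffness and damping gains, the target force, the force-error gain, and the toy contact response are all illustrative assumptions for this sketch, not the system presented in the talk.

    # Minimal, hypothetical sketch of impedance-style grasp force regulation.
    # All gains, setpoints, and the toy contact model are illustrative assumptions.

    STIFFNESS = 200.0     # N/m, virtual spring between fingertip reference and contact
    DAMPING = 10.0        # N*s/m, virtual damper on the fingertip velocity
    DESIRED_FORCE = 2.0   # N, target normal force at the fingertip
    KP_FORCE = 0.5 / STIFFNESS  # m/N, gain mapping force error to a reference shift


    def impedance_step(x, x_ref, x_dot, measured_force):
        """One control step: compute the impedance force command and shift the
        fingertip reference so the measured contact force tracks the target."""
        commanded_force = STIFFNESS * (x_ref - x) - DAMPING * x_dot
        force_error = DESIRED_FORCE - measured_force   # from tactile feedback
        x_ref_new = x_ref + KP_FORCE * force_error     # push harder or ease off
        return x_ref_new, commanded_force


    if __name__ == "__main__":
        x, x_ref, x_dot = 0.0, 0.001, 0.0
        measured = 1.2  # N, e.g. read from a fingertip tactile sensor
        for _ in range(5):
            x_ref, f_cmd = impedance_step(x, x_ref, x_dot, measured)
            measured += 0.3 * (DESIRED_FORCE - measured)  # toy contact response
            print(f"x_ref={x_ref:.4f} m, commanded force={f_cmd:.2f} N")

In such a loop, the tactile reading closes the loop on contact force: when the measured force falls below the target (e.g. the object starts to slip), the fingertip reference is pushed further into the contact, and it is relaxed when the force exceeds the target.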