Our latest work on human grasping, "Learning the signatures of the human grasp using a scalable tactile glove," was published in Nature. In this paper, we use a scalable tactile glove and deep convolutional neural networks to show that sensors uniformly distributed over the hand can be used to identify individual objects, estimate their weight, and reveal the typical tactile patterns that emerge while grasping objects. Using a low-cost (about US$10) scalable tactile glove sensor array, we recorded a large-scale tactile dataset of 135,000 frames, each covering the full hand, while interacting with 26 different objects. This set of interactions reveals the key correspondences between different regions of a human hand as it manipulates objects. Insights into the tactile signatures of the human grasp—through the lens of an artificial analog of the natural mechanoreceptor network—can thus aid the future design of prosthetics, robot grasping tools, and human-robot interactions.
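To illustrate the basic idea of mapping a full-hand tactile frame to object scores, here is a minimal sketch of a convolutional classifier. This is not the paper's architecture: the 32×32 pressure map, the single hand-rolled convolution layer, and all parameters below are illustrative assumptions.

```python
# Minimal sketch: classify one tactile pressure frame into 26 object classes.
# Assumptions (not from the paper): a 32x32 pressure map, one 3x3 conv layer,
# untrained random parameters. Real systems would train these end to end.
import numpy as np

rng = np.random.default_rng(0)

def conv2d(frame, kernels):
    """Valid-mode 2D convolution: frame (H, W), kernels (K, kh, kw)."""
    K, kh, kw = kernels.shape
    H, W = frame.shape
    out = np.empty((K, H - kh + 1, W - kw + 1))
    for k in range(K):
        for i in range(H - kh + 1):
            for j in range(W - kw + 1):
                out[k, i, j] = np.sum(frame[i:i + kh, j:j + kw] * kernels[k])
    return out

def classify(frame, kernels, weights, bias):
    feat = np.maximum(conv2d(frame, kernels), 0.0)  # ReLU nonlinearity
    pooled = feat.mean(axis=(1, 2))                 # global average pooling
    return pooled @ weights + bias                  # one logit per object

# Purely illustrative inputs and parameters.
frame = rng.random((32, 32))              # one tactile pressure frame
kernels = rng.standard_normal((8, 3, 3))  # 8 learned filters (here random)
weights = rng.standard_normal((8, 26))
bias = np.zeros(26)

logits = classify(frame, kernels, weights, bias)
print(logits.shape)  # (26,) — one score per candidate object
```

In practice, a sequence of such frames recorded during a grasp would be fed to a deeper, trained network; this sketch only shows the frame-to-logits shape of the problem.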
The paper was led by Subramanian Sundaram, who obtained his Ph.D. from our group in 2018 and continued this work after his graduation.