
Can Humans Infer Haptic Surface Properties from Images?




Human children typically experience their surroundings both visually and haptically, providing ample opportunities to learn rich cross-sensory associations. To thrive in human environments and interact with the real world, robots also need to build models of these cross-sensory associations; current advances in machine learning should make it possible to infer models from large amounts of data. We previously built a visuo-haptic sensing device, the Proton Pack, and are using it to collect a large database of matched multimodal data from tool-surface interactions. As a benchmark to compare with machine learning performance, we conducted a human subject study (n = 84) on estimating haptic surface properties (here: hardness, roughness, friction, and warmness) from images. Using a 100-surface subset of our database, we showed images to study participants and collected 5635 ratings of the four haptic properties, which we compared with ratings made by the Proton Pack operator and with physical data recorded using motion, force, and vibration sensors. Preliminary results indicate weak correlation between participant and operator ratings, but potential for matching up certain human ratings (particularly hardness and roughness) with features from the literature.
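The abstract compares participant ratings of each haptic property against the Proton Pack operator's ratings and reports weak correlation. As a purely illustrative sketch of that kind of comparison (not the authors' analysis code), the snippet below computes a Spearman rank correlation between per-surface participant and operator ratings; the 100-surface count comes from the abstract, while the 1-to-7 rating scale and the synthetic rating values are assumptions for demonstration only.

```python
# Hypothetical sketch: rank-correlating crowd-sourced and operator ratings
# of one haptic property (e.g., hardness) across the 100-surface subset.
# The rating scale (1-7) and the data themselves are placeholder assumptions.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_surfaces = 100                                    # 100-surface subset (from the abstract)

operator_hardness = rng.uniform(1, 7, n_surfaces)   # one operator rating per surface
# Participant ratings would normally be averaged per surface before correlating;
# here they are simulated as noisy versions of the operator ratings.
participant_hardness = np.clip(
    operator_hardness + rng.normal(0, 2, n_surfaces), 1, 7
)

rho, p_value = spearmanr(participant_hardness, operator_hardness)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```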

Author(s): Alex Burka and Katherine J. Kuchenbecker
Year: 2018
Month: March

Department(s): Haptic Intelligence
Research Project(s): Feeling With Your Eyes: Visual-Haptic Surface Interaction
Bibtex Type: Miscellaneous (misc)
Paper Type: Work in Progress

Address: San Francisco, USA
How Published: Work-in-progress paper (3 pages) presented at the IEEE Haptics Symposium

BibTeX

@misc{Burka18-HSWIP-Surface,
  title = {Can Humans Infer Haptic Surface Properties from Images?},
  author = {Burka, Alex and Kuchenbecker, Katherine J.},
  howpublished = {Work-in-progress paper (3 pages) presented at the IEEE Haptics Symposium},
  address = {San Francisco, USA},
  month = mar,
  year = {2018},
  doi = {},
  month_numeric = {3}
}