

2020


Utilizing Interviews and Thematic Analysis to Uncover Specifications for a Companion Robot

Burns, R. B., Seifi, H., Lee, H., Kuchenbecker, K. J.

Workshop paper (2 pages) presented at the ICSR Workshop on Enriching HRI Research with Qualitative Methods, Virtual, November 2020 (misc)

Abstract
We will share our experiences designing and conducting structured video-conferencing interviews with autism specialists and utilizing thematic analysis to create qualitative requirements and quantitative specifications for a touch-perceiving robot companion tailored for children with autism. We will also explain how we wrote about our qualitative approaches for a journal setting.

link (url) [BibTex]


Characterization of a Magnetic Levitation Haptic Interface for Realistic Tool-Based Interactions

Lee, H., Tombak, G. I., Park, G., Kuchenbecker, K. J.

Work-in-progress poster presented at EuroHaptics, Leiden, The Netherlands, September 2020 (misc)

Abstract
We introduce our recent study on the characterization of a commercial magnetic levitation haptic interface (MagLev 200, Butterfly Haptics LLC) for realistic high-bandwidth interactions. This device’s haptic rendering scheme can provide strong 6-DoF (force and torque) feedback without friction at all poses in its small workspace. The objective of our study is to enable the device to accurately render realistic multidimensional vibrotactile stimuli measured from a stylus-like tool. Our approach is to characterize the dynamics between the commanded wrench and the resulting translational acceleration across the frequency range of interest. To this end, we first custom-designed and attached a pen-shaped manipulandum (11.5 cm, aluminum) to the top of the MagLev 200’s end-effector for better usability in grasping. An accelerometer (ADXL354, Analog Devices) was rigidly mounted inside the manipulandum. Then, we collected a data set where the input is a 30-second-long force and/or torque signal commanded as a sweep function from 10 to 500 Hz; the output is the corresponding acceleration measurement, which we collected both with and without a user holding the handle. We succeeded at fitting both non-parametric and parametric versions of the transfer functions for both scenarios, with a fitting accuracy of about 95% for the parametric transfer functions. In the future, we plan to find the best method of applying the inverse parametric transfer function to our system. We will then employ that compensation method in a user study to evaluate the realism of different algorithms for reducing the dimensionality of tool-based vibrotactile cues.
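The abstract contains no code, but the non-parametric step it describes — estimating the transfer function between a commanded sweep and the measured acceleration — can be sketched with standard spectral methods. The sketch below is purely illustrative: the sample rate, the stand-in second-order plant, and all signal values are assumptions, not values from the study.

```python
import numpy as np
from scipy import signal

fs = 2000  # Hz, assumed sample rate
t = np.arange(0, 30, 1 / fs)  # 30-second record, as in the study

# Hypothetical input: a commanded sweep from 10 to 500 Hz
u = signal.chirp(t, f0=10, f1=500, t1=t[-1])

# Hypothetical plant: a band-pass resonance standing in for the
# (unknown) wrench-to-acceleration dynamics of the device
b, a = signal.butter(2, [80, 120], btype="bandpass", fs=fs)
y = signal.lfilter(b, a, u)

# Non-parametric transfer function estimate: H(f) = Puy(f) / Puu(f)
f, Puu = signal.welch(u, fs=fs, nperseg=4096)
_, Puy = signal.csd(u, y, fs=fs, nperseg=4096)
H = Puy / Puu

# Evaluate only within the excited band (10-500 Hz)
band = (f >= 10) & (f <= 500)
peak_freq = f[band][np.argmax(np.abs(H[band]))]  # near the 80-120 Hz resonance
```

In the study's setting, u would be the commanded force/torque signal and y the accelerometer measurement; a parametric model could then be fit to H(f) and inverted for compensation, as the authors propose.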

link (url) [BibTex]


Tactile Textiles: An Assortment of Fabric-Based Tactile Sensors for Contact Force and Contact Location

Burns, R. B., Thomas, N., Lee, H., Faulkner, R., Kuchenbecker, K. J.

Hands-on demonstration presented at EuroHaptics, Leiden, The Netherlands, September 2020, Rachael Bevill Burns, Neha Thomas, and Hyosang Lee contributed equally to this publication (misc)

Abstract
Fabric-based tactile sensors are promising for the construction of robotic skin due to their soft and flexible nature. Conductive fabric layers can be used to form piezoresistive structures that are sensitive to contact force and/or contact location. This demonstration showcases three diverse fabric-based tactile sensors we have created. The first detects dynamic tactile events anywhere within a region on a robot’s body. The second design measures the precise location at which a single low-force contact is applied. The third sensor uses electrical resistance tomography to output both the force and location of multiple simultaneous contacts applied across a surface.

Project Page [BibTex]


Estimating Human Handshape by Feeling the Wrist

Forte, M., Young, E. M., Kuchenbecker, K. J.

Work-in-progress poster presented at EuroHaptics, Leiden, The Netherlands, September 2020 (misc)

[BibTex]


Sweat Softens the Outermost Layer of the Human Finger Pad: Evidence from Simulations and Experiments

Nam, S., Kuchenbecker, K. J.

Work-in-progress poster presented at EuroHaptics, Leiden, The Netherlands, September 2020, Award for best poster in 2020 (misc)

Project Page [BibTex]


Intermediate Ridges Amplify Mechanoreceptor Strains in Static and Dynamic Touch

Serhat, G., Kuchenbecker, K. J.

Work-in-progress poster presented at EuroHaptics, Leiden, The Netherlands, September 2020 (misc)

[BibTex]


Seeing Through Touch: Contact-Location Sensing and Tactile Feedback for Prosthetic Hands

Thomas, N., Kuchenbecker, K. J.

Work-in-progress poster presented at EuroHaptics, Leiden, The Netherlands, September 2020 (misc)

Abstract
Locating and picking up an object without vision is a simple task for able-bodied people, due in part to their rich tactile perception capabilities. The same cannot be said for users of standard myoelectric prostheses, who must rely largely on visual cues to successfully interact with the environment. To enable prosthesis users to locate and grasp objects without looking at them, we propose two changes: adding specialized contact-location sensing to the dorsal and palmar aspects of the prosthetic hand’s fingers, and providing the user with tactile feedback of where an object touches the fingers. To evaluate the potential utility of these changes, we developed a simple, sensitive, fabric-based tactile sensor which provides continuous contact location information via a change in voltage of a voltage divider circuit. This sensor was wrapped around the fingers of a commercial prosthetic hand (Ottobock SensorHand Speed). Using an ATI Nano17 force sensor, we characterized the tactile sensor’s response to normal force at distributed contact locations and obtained an average detection threshold of 0.63 +/- 0.26 N. We also confirmed that the voltage-to-location mapping is linear (R squared = 0.99). Sensor signals were adapted to the stationary vibrotactile funneling illusion to provide haptic feedback of contact location. These preliminary results indicate a promising system that imitates a key aspect of the sensory capabilities of the intact hand. Future work includes testing the system in a modified reach-grasp-and-lift study, in which participants must accomplish the task blindfolded.
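As an illustration of the voltage-to-location mapping described above, the sketch below fits a linear calibration by least squares and checks its linearity. All positions and voltages are made-up values for demonstration, not data from the paper.

```python
import numpy as np

# Hypothetical calibration data: voltage-divider output recorded while
# pressing at known positions along the finger (mm from the fingertip)
positions_mm = np.array([0.0, 5.0, 10.0, 15.0, 20.0, 25.0])
voltages_v = np.array([0.52, 1.01, 1.49, 2.03, 2.51, 2.98])  # illustrative

# Least-squares line fit: position = slope * voltage + intercept
slope, intercept = np.polyfit(voltages_v, positions_mm, deg=1)

# Coefficient of determination to verify the mapping is linear
pred = slope * voltages_v + intercept
ss_res = np.sum((positions_mm - pred) ** 2)
ss_tot = np.sum((positions_mm - positions_mm.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

def voltage_to_location(v):
    """Map a measured divider voltage to a contact location in mm."""
    return slope * v + intercept
```

With a calibration like this in place, each new voltage sample maps directly to a contact location, which can then drive location-dependent haptic feedback such as the funneling-illusion rendering mentioned in the abstract.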

[BibTex]


Haptify: a Comprehensive Benchmarking System for Grounded Force-Feedback Haptic Devices

Fazlollahi, F., Kuchenbecker, K. J.

Work-in-progress poster presented at EuroHaptics, Leiden, The Netherlands, September 2020 (misc)

[BibTex]


Vision-based Force Estimation for a da Vinci Instrument Using Deep Neural Networks

Lee, Y., Husin, H. M., Forte, M., Lee, S., Kuchenbecker, K. J.

Extended abstract presented as an Emerging Technology ePoster at the Annual Meeting of the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES), Cleveland, Ohio, USA, August 2020 (misc) Accepted

[BibTex]


A Fabric-Based Sensing System for Recognizing Social Touch

Burns, R. B., Lee, H., Seifi, H., Kuchenbecker, K. J.

Work-in-progress paper (3 pages) presented at the IEEE Haptics Symposium, Washington, DC, USA, March 2020 (misc)

Abstract
We present a fabric-based piezoresistive tactile sensor system designed to detect social touch gestures on a robot. The unique sensor design utilizes three layers of low-conductivity fabric sewn together on alternating edges to form an accordion pattern and secured between two outer high-conductivity layers. This five-layer design demonstrates a greater resistance range and better low-force sensitivity than previous designs that use one layer of low-conductivity fabric with or without a plastic mesh layer. An individual sensor from our system can presently identify six different communication gestures – squeezing, patting, scratching, poking, hand resting without movement, and no touch – with an average accuracy of 90%. A layer of foam can be added beneath the sensor to make a rigid robot more appealing for humans to touch without inhibiting the system’s ability to register social touch gestures.
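The paper does not publish its classifier, but a minimal sketch of how such gestures might be separated from a resistance time series — here distinguishing a steady hand rest from rhythmic patting via simple spectral features — could look like the following. The sampling rate, signal shapes, and feature set are all hypothetical.

```python
import numpy as np

def extract_features(r, fs=100):
    """Simple features from a window of sensor resistance samples:
    mean level, variability, and dominant temporal frequency (Hz)."""
    r = np.asarray(r, dtype=float)
    spectrum = np.abs(np.fft.rfft(r - r.mean()))
    freqs = np.fft.rfftfreq(r.size, d=1 / fs)
    dom = freqs[np.argmax(spectrum)] if spectrum.max() > 0 else 0.0
    return {"mean": r.mean(), "std": r.std(), "dominant_hz": dom}

fs = 100  # Hz, assumed sampling rate
t = np.linspace(0, 2, 2 * fs, endpoint=False)

# Synthetic stand-ins for two gestures on a piezoresistive fabric sensor
rest = np.full(t.size, 10.0)                   # hand resting: steady level
pat = 10.0 + 3.0 * np.sin(2 * np.pi * 2 * t)   # patting: ~2 Hz oscillation

f_rest = extract_features(rest, fs)  # low std, no dominant frequency
f_pat = extract_features(pat, fs)    # high std, dominant frequency near 2 Hz
```

A real pipeline would feed features like these, computed over sliding windows, into a trained classifier to label the six gestures reported in the abstract.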

Project Page [BibTex]


Do Touch Gestures Affect How Electrovibration Feels?

Vardar, Y., Kuchenbecker, K. J.

Hands-on demonstration (1 page) presented at the IEEE Haptics Symposium, Washington, DC, USA, March 2020 (misc)

[BibTex]

2016


Quantifying Therapist Practitioner Roles Using Video-based Analysis: Can We Reliably Model Therapist-Patient Interactions During Task-Oriented Therapy?

Mendonca, R., Johnson, M. J., Laskin, S., Adair, L., Mohan, M.

Abstract in the Archives of Physical Medicine and Rehabilitation, pages: E55-E56, October 2016 (misc)

DOI [BibTex]


Numerical Investigation of Frictional Forces Between a Finger and a Textured Surface During Active Touch

Khojasteh, B., Janko, M., Visell, Y.

Extended abstract presented in form of an oral presentation at the 3rd International Conference on BioTribology (ICoBT), London, England, September 2016 (misc)

Abstract
The biomechanics of the human finger pad has been investigated in relation to motor behaviour and sensory function in the upper limb. While the frictional properties of the finger pad are important for grip and grasp function, recent attention has also been given to the roles played by friction when perceiving a surface via sliding contact. Indeed, the mechanics of sliding contact greatly affect the stimuli felt by a finger scanning a surface. Past research has shed light on the neural mechanisms of haptic texture perception, but their relation to time-resolved frictional contact interactions is unknown. Current biotribological models cannot predict the time-resolved frictional forces felt by a finger as it slides on a rough surface, which constitutes a missing link in understanding the mechanical basis of texture perception. To ameliorate this, we developed a two-dimensional finite element numerical simulation of a human finger pad in sliding contact with a textured surface. Our model captures bulk mechanical properties, including hyperelasticity, dissipation, and tissue heterogeneity, as well as contact dynamics. To validate it, we used a database of measurements that we previously captured with a variety of human fingers and surfaces. By designing the simulations to match the measurements, we evaluated the ability of the FEM model to predict time-resolved sliding frictional forces. We varied surface texture wavelength, sliding speed, and normal force in the experiments. An analysis of the results indicated that both time- and frequency-domain features of the forces produced during finger-surface sliding interactions were reproduced, including many of the phenomena we observed in analyses of real measurements: quasiperiodicity, harmonic distortion and spectral decay in the frequency domain, and their dependence on kinetics and surface properties. The results shed light on the frictional signatures of surface texture during active touch and may inform understanding of the role played by friction in texture discrimination.

[BibTex]


Behavioral Learning and Imitation for Music-Based Robotic Therapy for Children with Autism Spectrum Disorder

Burns, R., Nizambad, S., Park, C. H., Jeon, M., Howard, A.

Workshop paper (5 pages) at the RO-MAN Workshop on Behavior Adaptation, Interaction and Learning for Assistive Robotics, August 2016 (misc)

Abstract
In this full workshop paper, we discuss the positive impacts of robot, music, and imitation therapies on children with autism. We also discuss the use of Laban Motion Analysis (LMA) to identify emotion through movement and posture cues. We present our preliminary studies of the "Five Senses" game that our two robots, Romo the penguin and Darwin Mini, partake in. Using an LMA-focused approach (enabled by our skeletal tracking Kinect algorithm), we find that our participants show increased frequency of movement and speed when the game has a musical accompaniment. Therefore, participants may have increased engagement with our robots and game if music is present. We also begin exploring motion learning for future works.

link (url) [BibTex]


Design and evaluation of a novel mechanical device to improve hemiparetic gait: a case report

Fjeld, K., Hu, S., Kuchenbecker, K. J., Vasudevan, E. V.

Extended abstract presented at the Biomechanics and Neural Control of Movement Conference (BANCOM), 2016, Poster presentation given by Fjeld (misc)

Project Page [BibTex]


One Sensor, Three Displays: A Comparison of Tactile Rendering from a BioTac Sensor

Brown, J. D., Ibrahim, M., Chase, E. D. Z., Pacchierotti, C., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE Haptics Symposium, Philadelphia, Pennsylvania, USA, April 2016 (misc)

[BibTex]


Multisensory robotic therapy to promote natural emotional interaction for children with ASD

Bevill, R., Azzi, P., Spadafora, M., Park, C. H., Jeon, M., Kim, H. J., Lee, J., Raihan, K., Howard, A.

Proceedings of the ACM/IEEE International Conference on Human Robot Interaction (HRI), pages: 571, March 2016 (misc)

Abstract
In this video submission, we introduce two robots, Romo the penguin and Darwin Mini. We have programmed these robots to perform a variety of emotions through facial expressions and body language, respectively. We aim to use these robots with children with autism to demonstrate safe emotional and social responses in various sensory situations.

link (url) DOI [BibTex]


Interactive Robotic Framework for Multi-Sensory Therapy for Children with Autism Spectrum Disorder

Bevill, R., Park, C. H., Kim, H. J., Lee, J., Rennie, A., Jeon, M., Howard, A.

Extended abstract presented at the ACM/IEEE International Conference on Human Robot Interaction (HRI), March 2016 (misc)

Abstract
In this abstract, we present the overarching goal of our interactive robotic framework - to teach emotional and social behavior to children with autism spectrum disorders via multi-sensory therapy. We introduce our robot characters, Romo and Darwin Mini, and the "Five Senses" scenario they will undergo. This sensory game will develop the children's interest, and will model safe and appropriate reactions to typical sensory overload stimuli.

link (url) DOI [BibTex]


Designing Human-Robot Exercise Games for Baxter

Fitter, N. T., Hawkes, D. T., Johnson, M. J., Kuchenbecker, K. J.

Late-breaking results report presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2016 (misc)

Project Page [BibTex]


Design of a Low-Cost Platform for Autonomous Mobile Service Robots

Eaton, E., Mucchiani, C., Mohan, M., Isele, D., Luná, J. M., Clingerman, C.

Workshop paper (7 pages) presented at the 25th International Joint Conference on Artificial Intelligence (IJCAI) Workshop on Autonomous Mobile Service Robots, New York, USA, 2016 (misc)

Abstract
Most current autonomous mobile service robots are either expensive commercial platforms or custom manufactured for research environments, limiting their availability. We present the design for a low-cost service robot based on the widely used TurtleBot 2 platform, with the goal of making service robots affordable and accessible to the research, educational, and hobbyist communities. Our design uses a set of simple and inexpensive modifications to transform the TurtleBot 2 into a 4.5 ft (1.37 m) tall tour-guide or telepresence-style robot, capable of performing a wide variety of indoor service tasks. The resulting platform provides a shoulder-height touchscreen and 3D camera for interaction, an optional low-cost arm for manipulation, enhanced onboard computation, autonomous charging, and up to 6 hours of runtime. The resulting platform can support many of the tasks performed by significantly more expensive service robots. For compatibility with existing software packages, the service robot runs the Robot Operating System (ROS).

link (url) [BibTex]


IMU-Mediated Real-Time Human-Baxter Hand-Clapping Interaction

Fitter, N. T., Huang, Y. E., Mayer, J. P., Kuchenbecker, K. J.

Late-breaking results report presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2016 (misc)

[BibTex]

2015


Haptic Textures for Online Shopping

Culbertson, H., Kuchenbecker, K. J.

Interactive demonstrations in The Retail Collective exhibit, presented at the Dx3 Conference in Toronto, Canada, March 2015 (misc)

[BibTex]

2014


Teaching Forward and Inverse Kinematics of Robotic Manipulators Via MATLAB

Wong, D., Dames, P., Kuchenbecker, K. J.

Presented at the ICRA Workshop on MATLAB/Simulink for Robotics Education and Research, June 2014, Oral presentation given by Dames and Wong (misc)

[BibTex]


Control of a Virtual Robot with Fingertip Contact, Pressure, Vibrotactile, and Grip Force Feedback

Pierce, R. M., Fedalei, E. A., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE Haptics Symposium, Houston, Texas, USA, February 2014 (misc)

[BibTex]


A Modular Tactile Motion Guidance System

Kuchenbecker, K. J., Anon, A. M., Barkin, T., deVillafranca, K., Lo, M.

Hands-on demonstration presented at IEEE Haptics Symposium, Houston, Texas, USA, February 2014 (misc)

[BibTex]


The Penn Haptic Texture Toolkit

Culbertson, H., Delgado, J. J. L., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE Haptics Symposium, Houston, Texas, USA, February 2014 (misc)

[BibTex]

2011


Please do not touch the robot

Romano, J. M., Kuchenbecker, K. J.

Hands-on demonstration presented at the IEEE/RSJ Conference on Intelligent Robots and Systems (IROS), San Francisco, California, September 2011 (misc)

[BibTex]


no image
Body-Grounded Tactile Actuators for Playback of Human Physical Contact

Stanley, A. A., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE World Haptics Conference, Istanbul, Turkey, June 2011 (misc)

[BibTex]