

2017


Synchronicity Trumps Mischief in Rhythmic Human-Robot Social-Physical Interaction

Fitter, N. T., Kuchenbecker, K. J.

In Proceedings of the International Symposium on Robotics Research (ISRR), Puerto Varas, Chile, December 2017 (inproceedings). In press.

Abstract
Hand-clapping games and other forms of rhythmic social-physical interaction might help foster human-robot teamwork, but the design of such interactions has scarcely been explored. We leveraged our prior work to enable the Rethink Robotics Baxter Research Robot to competently play one-handed tempo-matching hand-clapping games with a human user. To understand how such a robot’s capabilities and behaviors affect user perception, we created four versions of this interaction: the hand clapping could be initiated by either the robot or the human, and the non-initiating partner could be either cooperative, yielding synchronous motion, or mischievously uncooperative. Twenty adults tested two clapping tempos in each of these four interaction modes in a random order, rating every trial on standardized scales. The study results showed that having the robot initiate the interaction gave it a more dominant perceived personality. Despite previous results on the intrigue of misbehaving robots, we found that moving synchronously with the robot almost always made the interaction more enjoyable, less mentally taxing, less physically demanding, and lower effort for users than asynchronous interactions caused by robot or human mischief. Taken together, our results indicate that cooperative rhythmic social-physical interaction has the potential to strengthen human-robot partnerships.


[BibTex]



From Monocular SLAM to Autonomous Drone Exploration

von Stumberg, L., Usenko, V., Engel, J., Stueckler, J., Cremers, D.

In European Conference on Mobile Robots (ECMR), September 2017 (inproceedings)


[BibTex]



A Robotic Framework to Overcome Sensory Overload in Children on the Autism Spectrum: A Pilot Study

Javed, H., Burns, R., Jeon, M., Howard, A., Park, C. H.

In Proceedings of the International Conference on Intelligent Robots and Systems (IROS), September 2017 (inproceedings)

Abstract
This paper discusses a novel framework designed to provide sensory stimulation to children with Autism Spectrum Disorder (ASD). The setup consists of multi-sensory stations that stimulate the visual, auditory, olfactory, gustatory, tactile, and vestibular senses, together with a robotic agent that navigates through each station responding to the different stimuli. We hypothesize that the robot’s responses will help children learn acceptable ways to respond to stimuli that might otherwise trigger sensory overload. Preliminary results from a pilot study conducted to examine the effectiveness of such a setup were encouraging and are described briefly in this text.


[BibTex]



An Interactive Robotic System for Promoting Social Engagement

Burns, R., Javed, H., Jeon, M., Howard, A., Park, C. H.

In Proceedings of the International Conference on Intelligent Robots and Systems (IROS), September 2017 (inproceedings)

Abstract
This abstract (and poster) is a condensed version of Burns' Master's thesis and related journal article. It discusses the use of imitation via robotic motion learning to improve human-robot interaction, focusing on preliminary results from a pilot study of 12 subjects. We hypothesized that the robot's use of imitation would increase the user's openness towards engaging with the robot. Post-imitation, experimental subjects displayed a more positive emotional state, showed higher instances of mood contagion towards the robot, and interpreted the robot to have a higher level of autonomy than their control-group counterparts. These results point to an increased user interest in engagement fueled by personalized imitation during interaction.


[BibTex]



Stiffness Perception during Pinching and Dissection with Teleoperated Haptic Forceps

Ng, C., Zareinia, K., Sun, Q., Kuchenbecker, K. J.

In Proceedings of the International Symposium on Robot and Human Interactive Communication (RO-MAN), pages: 456-463, Lisbon, Portugal, August 2017 (inproceedings)


link (url) DOI [BibTex]



Design of a Parallel Continuum Manipulator for 6-DOF Fingertip Haptic Display

Young, E. M., Kuchenbecker, K. J.

In Proceedings of the IEEE World Haptics Conference (WHC), pages: 599-604, Munich, Germany, June 2017, Finalist for best poster paper (inproceedings)

Abstract
Despite rapid advancements in the field of fingertip haptics, rendering tactile cues with six degrees of freedom (6 DOF) remains an elusive challenge. In this paper, we investigate the potential of displaying fingertip haptic sensations with a 6-DOF parallel continuum manipulator (PCM) that mounts to the user's index finger and moves a contact platform around the fingertip. Compared to traditional mechanisms composed of rigid links and discrete joints, PCMs have the potential to be strong, dexterous, and compact, but they are also more complicated to design. We define the design space of 6-DOF parallel continuum manipulators and outline a process for refining such a device for fingertip haptic applications. Following extensive simulation, we obtain 12 designs that meet our specifications, construct a manually actuated prototype of one such design, and evaluate the simulation's ability to accurately predict the prototype's motion. Finally, we demonstrate the range of deliverable fingertip tactile cues, including a normal force into the finger and shear forces tangent to the finger at three extreme points on the boundary of the fingertip.


DOI Project Page [BibTex]



High Magnitude Unidirectional Haptic Force Display Using a Motor/Brake Pair and a Cable

Hu, S., Kuchenbecker, K. J.

In Proceedings of the IEEE World Haptics Conference (WHC), pages: 394-399, Munich, Germany, June 2017 (inproceedings)

Abstract
Clever electromechanical design is required to make the force feedback delivered by a kinesthetic haptic interface both strong and safe. This paper explores a one-dimensional haptic force display that combines a DC motor and a magnetic particle brake on the same shaft. Rather than a rigid linkage, a spooled cable connects the user to the actuators to enable a large workspace, reduce the moving mass, and eliminate the sticky residual force from the brake. This design combines the high torque/power ratio of the brake and the active output capabilities of the motor to provide a wider range of forces than can be achieved with either actuator alone. A prototype of this device was built, its performance was characterized, and it was used to simulate constant force sources and virtual springs and dampers. Compared to the conventional design of using only a motor, the hybrid device can output higher unidirectional forces at the expense of free space feeling less free.


DOI Project Page [BibTex]



A Wrist-Squeezing Force-Feedback System for Robotic Surgery Training

Brown, J. D., Fernandez, J. N., Cohen, S. P., Kuchenbecker, K. J.

In Proceedings of the IEEE World Haptics Conference (WHC), pages: 107-112, Munich, Germany, June 2017 (inproceedings)

Abstract
Over time, surgical trainees learn to compensate for the lack of haptic feedback in commercial robotic minimally invasive surgical systems. Incorporating touch cues into robotic surgery training could potentially shorten this learning process if the benefits of haptic feedback were sustained after it was removed. In this paper, we develop a wrist-squeezing haptic feedback system and evaluate whether it holds the potential to train novice da Vinci users to reduce the force they exert on a bimanual inanimate training task. Subjects were randomly divided into two groups according to a multiple baseline experimental design. Each of the ten participants moved a ring along a curved wire nine times while the haptic feedback was conditionally withheld, provided, and withheld again. The real-time tactile feedback of applied force magnitude significantly reduced the integral of the force produced by the da Vinci tools on the task materials, and this result remained even when the haptic feedback was removed. Overall, our findings suggest that wrist-squeezing force feedback can play an essential role in helping novice trainees learn to minimize the force they exert with a surgical robot.


DOI [BibTex]



Handling Scan-Time Parameters in Haptic Surface Classification

Burka, A., Kuchenbecker, K. J.

In Proceedings of the IEEE World Haptics Conference (WHC), pages: 424-429, Munich, Germany, June 2017 (inproceedings)


DOI Project Page [BibTex]



Proton 2: Increasing the Sensitivity and Portability of a Visuo-haptic Surface Interaction Recorder

Burka, A., Rajvanshi, A., Allen, S., Kuchenbecker, K. J.

In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), pages: 439-445, Singapore, May 2017 (inproceedings)

Abstract
The Portable Robotic Optical/Tactile ObservatioN PACKage (PROTONPACK, or Proton for short) is a new handheld visuo-haptic sensing system that records surface interactions. We previously demonstrated system calibration and a classification task using external motion tracking. This paper details improvements in surface classification performance and removal of the dependence on external motion tracking, necessary before embarking on our goal of gathering a vast surface interaction dataset. Two experiments were performed to refine data collection parameters. After adjusting the placement and filtering of the Proton's high-bandwidth accelerometers, we recorded interactions between two differently sized steel tooling-ball end-effectors (diameters 6.35 and 9.525 mm) and five surfaces. Using features based on normal force, tangential force, end-effector speed, and contact vibration, we trained multi-class SVMs to classify the surfaces using 50 ms chunks of data from each end-effector. Classification accuracies of 84.5% and 91.5%, respectively, were achieved on unseen test data, an improvement over prior results. In parallel, we pursued onboard motion tracking using the Proton's camera and fiducial markers. Motion tracks from the external and onboard trackers agree within 2 mm and 0.01 rad RMS, and classification accuracy decreases only slightly, to 87.7%, when using onboard tracking for the 9.525 mm end-effector. These experiments indicate that the Proton 2 is ready for portable data collection.


DOI Project Page [BibTex]



Robot Therapist for Assisting in At-Home Rehabilitation of Shoulder Surgery Patients

(Recipient of Innovation & Entrepreneurship Prize)

Burns, R., Alborz, M., Chalup, Z., Downen, S., Genuino, K., Nayback, C., Nesbitt, N., Park, C. H.

In 2017 GW Research Days, Department of Biomedical Engineering Posters and Presentations, April 2017 (inproceedings)

Abstract
The number of middle-aged to elderly patients receiving shoulder surgery is increasing. However, statistically, very few of these patients perform the necessary at-home physical therapy regimen they are prescribed post-surgery. This results in longer recovery times and/or incomplete healing. We propose the use of a robotic therapist, with customized training and encouragement regimens, to increase physical therapy adherence and improve the patient’s recovery experience.


link (url) [BibTex]



Motion Learning for Emotional Interaction and Imitation of Children with Autism Spectrum Disorder

(First place tie in category, "Biomedical Engineering, Graduate Research")

Burns, R., Cowin, S.

In 2017 GW Research Days, Department of Biomedical Engineering Posters and Presentations, April 2017 (inproceedings)

Abstract
We aim to use motion learning to teach a robot to imitate people's unique gestures. Our robot, ROBOTIS-OP2, can ultimately use imitation to practice social skills with children with autism. In this abstract, two methods of motion learning were compared: dynamic motion primitives with weighted least squares (DMP with WLS) and dynamic motion primitives with Gaussian mixture regression (DMP with GMR). Movements with sharp turns were most accurately reproduced using DMP with GMR. Additionally, more states are required to accurately recreate more complex gestures.


link (url) [BibTex]



Roughness perception of virtual textures displayed by electrovibration on touch screens

Vardar, Y., Isleyen, A., Saleem, M. K., Basdogan, C.

In 2017 IEEE World Haptics Conference (WHC), pages: 263-268, 2017 (inproceedings)

Abstract
In this study, we investigated human roughness perception of periodic textures on an electrostatic display by conducting psychophysical experiments with 10 subjects. To generate virtual textures, we used low-frequency unipolar pulse waves with different waveforms (sinusoidal, square, saw-tooth, triangle) and spacings. We modulated these waves with a 3 kHz high-frequency sinusoidal carrier signal to minimize perceptual differences due to the electrical filtering of the human finger and to eliminate low-frequency distortions. The subjects were asked to rate 40 different macro textures on a Likert scale of 1-7. We also collected the normal and tangential forces acting on the subjects' fingers during the experiment. The results of our user study showed that subjects perceived the square wave as the roughest while they perceived the other waveforms as equally rough. The perceived roughness followed an inverted U-shaped curve as a function of groove width, but the peak point shifted to the left compared to the results of earlier studies. Moreover, we found that the subjects' roughness perception correlates best with the rate of change of the contact forces rather than with the forces themselves.


vardar_whc2017 DOI [BibTex]



Multi-View Deep Learning for Consistent Semantic Mapping with RGB-D Cameras

Ma, L., Stueckler, J., Kerl, C., Cremers, D.

In IEEE International Conference on Intelligent Robots and Systems (IROS), Vancouver, Canada, 2017 (inproceedings)


[BibTex]



Accurate depth and normal maps from occlusion-aware focal stack symmetry

Strecke, M., Alperovich, A., Goldluecke, B.

In IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2017 (inproceedings)


link (url) [BibTex]



Feeling multiple edges: The tactile perception of short ultrasonic square reductions of the finger-surface friction

Gueorguiev, D., Vezzoli, E., Sednaoui, T., Grisoni, L., Lemaire-Semail, B.

In 2017 IEEE World Haptics Conference (WHC), pages: 125-129, 2017 (inproceedings)


DOI [BibTex]



Semi-Supervised Deep Learning for Monocular Depth Map Prediction

Kuznietsov, Y., Stueckler, J., Leibe, B.

In IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2017 (inproceedings)


[BibTex]



Shadow and Specularity Priors for Intrinsic Light Field Decomposition

Alperovich, A., Johannsen, O., Strecke, M., Goldluecke, B.

In Energy Minimization Methods in Computer Vision and Pattern Recognition (EMMCVPR), 2017 (inproceedings)


link (url) [BibTex]



Keyframe-Based Visual-Inertial Online SLAM with Relocalization

Kasyanov, A., Engelmann, F., Stueckler, J., Leibe, B.

In IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2017 (inproceedings)


[BibTex]



SAMP: Shape and Motion Priors for 4D Vehicle Reconstruction

Engelmann, F., Stueckler, J., Leibe, B.

In IEEE Winter Conference on Applications of Computer Vision (WACV), 2017 (inproceedings)


[BibTex]
