2020


Vision-based Force Estimation for a da Vinci Instrument Using Deep Neural Networks

Lee, Y., Husin, H. M., Forte, M. P., Lee, S., Kuchenbecker, K. J.

Extended abstract presented as an Emerging Technology ePoster at the Annual Meeting of the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES), Cleveland, Ohio, USA, April 2020 (misc) Accepted

[BibTex]


Do touch gestures affect how electrovibration feels?

Vardar, Y., Kuchenbecker, K. J.

Hands-on demonstration (1 page) presented at IEEE Haptics Symposium, March 2020 (misc) Accepted

[BibTex]


Learning to Predict Perceptual Distributions of Haptic Adjectives

Richardson, B. A., Kuchenbecker, K. J.

Frontiers in Neurorobotics, 13(116):1-16, February 2020 (article)

Abstract
When humans touch an object with their fingertips, they can immediately describe its tactile properties using haptic adjectives, such as hardness and roughness; however, human perception is subjective and noisy, with significant variation across individuals and interactions. Recent research has worked to provide robots with similar haptic intelligence but was focused on identifying binary haptic adjectives, ignoring both attribute intensity and perceptual variability. Combining ordinal haptic adjective labels gathered from human subjects for a set of 60 objects with features automatically extracted from raw multi-modal tactile data collected by a robot repeatedly touching the same objects, we designed a machine-learning method that incorporates partial knowledge of the distribution of object labels into training; then, from a single interaction, it predicts a probability distribution over the set of ordinal labels. In addition to analyzing the collected labels (10 basic haptic adjectives) and demonstrating the quality of our method's predictions, we hold out specific features to determine the influence of individual sensor modalities on the predictive performance for each adjective. Our results demonstrate the feasibility of modeling both the intensity and the variation of haptic perception, two crucial yet previously neglected components of human haptic perception.

DOI [BibTex]


TUM Flyers: Vision-Based MAV Navigation for Systematic Inspection of Structures

Usenko, V., Stumberg, L. V., Stückler, J., Cremers, D.

In Bringing Innovative Robotic Technologies from Research Labs to Industrial End-users: The Experience of the European Robotics Challenges, 136, pages: 189-209, Springer International Publishing, 2020 (inbook)

[BibTex]


Electronics, Software and Analysis of a Bioinspired Sensorized Quadrupedal Robot

Petereit, R.

Technische Universität München, 2020 (mastersthesis)

[BibTex]


Visual-Inertial Mapping with Non-Linear Factor Recovery

Usenko, V., Demmel, N., Schubert, D., Stückler, J., Cremers, D.

IEEE Robotics and Automation Letters (RA-L), 5, 2020, accepted for presentation at IEEE International Conference on Robotics and Automation (ICRA) 2020, to appear, arXiv:1904.06504 (article)

Abstract
Cameras and inertial measurement units are complementary sensors for ego-motion estimation and environment mapping. Their combination makes visual-inertial odometry (VIO) systems more accurate and robust. For globally consistent mapping, however, combining visual and inertial information is not straightforward. To estimate the motion and geometry with a set of images large baselines are required. Because of that, most systems operate on keyframes that have large time intervals between each other. Inertial data on the other hand quickly degrades with the duration of the intervals and after several seconds of integration, it typically contains only little useful information. In this paper, we propose to extract relevant information for visual-inertial mapping from visual-inertial odometry using non-linear factor recovery. We reconstruct a set of non-linear factors that make an optimal approximation of the information on the trajectory accumulated by VIO. To obtain a globally consistent map we combine these factors with loop-closing constraints using bundle adjustment. The VIO factors make the roll and pitch angles of the global map observable, and improve the robustness and the accuracy of the mapping. In experiments on a public benchmark, we demonstrate superior performance of our method over the state-of-the-art approaches.

[BibTex]


Trunk pitch oscillations for energy trade-offs in bipedal running birds and robots

Drama, Ö., Badri-Spröwitz, A.

Bioinspiration & Biomimetics, 2020 (article)

Abstract
Bipedal animals have diverse morphologies and advanced locomotion abilities. Terrestrial birds, in particular, display agile, efficient, and robust running motion, in which they exploit the interplay between the body segment masses and moment of inertias. On the other hand, most legged robots are not able to generate such versatile and energy-efficient motion and often disregard trunk movements as a means to enhance their locomotion capabilities. Recent research investigated how trunk motions affect the gait characteristics of humans, but there is a lack of analysis across different bipedal morphologies. To address this issue, we analyze avian running based on a spring-loaded inverted pendulum model with a pronograde (horizontal) trunk. We use a virtual point based control scheme and modify the alignment of the ground reaction forces to assess how our control strategy influences the trunk pitch oscillations and energetics of the locomotion. We derive three potential key strategies to leverage trunk pitch motions that minimize either the energy fluctuations of the center of mass or the work performed by the hip and leg. We suggest how these strategies could be used in legged robotics.

link (url) DOI [BibTex]


DirectShape: Photometric Alignment of Shape Priors for Visual Vehicle Pose and Shape Estimation

Wang, R., Yang, N., Stückler, J., Cremers, D.

In IEEE International Conference on Robotics and Automation (ICRA), 2020, arXiv:1904.10097 (inproceedings) Accepted

[BibTex]

2014


Haptic Robotization of Human Body via Data-Driven Vibrotactile Feedback

Kurihara, Y., Takei, S., Nakai, Y., Hachisu, T., Kuchenbecker, K. J., Kajimoto, H.

Entertainment Computing, 5(4):485-494, December 2014 (article)

[BibTex]


Automatic Skill Evaluation for a Needle Passing Task in Robotic Surgery

Leung, S., Kuchenbecker, K. J.

In Proc. IROS Workshop on the Role of Human Sensorimotor Control in Robotic Surgery, Chicago, Illinois, September 2014, Poster presentation given by Kuchenbecker. Best Poster Award (inproceedings)

[BibTex]


Modeling and Rendering Realistic Textures from Unconstrained Tool-Surface Interactions

Culbertson, H., Unwin, J., Kuchenbecker, K. J.

IEEE Transactions on Haptics, 7(3):381-392, July 2014 (article)

[BibTex]


A Data-driven Approach to Remote Tactile Interaction: From a BioTac Sensor to Any Fingertip Cutaneous Device

Pacchierotti, C., Prattichizzo, D., Kuchenbecker, K. J.

In Haptics: Neuroscience, Devices, Modeling, and Applications, Proc. EuroHaptics, Part I, 8618, pages: 418-424, Lecture Notes in Computer Science, Springer-Verlag, Berlin Heidelberg, June 2014, Poster presentation given by Pacchierotti in Versailles, France (inproceedings)

[BibTex]


Evaluating the BioTac’s Ability to Detect and Characterize Lumps in Simulated Tissue

Hui, J. C. T., Kuchenbecker, K. J.

In Haptics: Neuroscience, Devices, Modeling, and Applications, Proc. EuroHaptics, Part II, 8619, pages: 295-302, Lecture Notes in Computer Science, Springer-Verlag, Berlin Heidelberg, June 2014, Poster presentation given by Hui in Versailles, France (inproceedings)

[BibTex]


Teaching Forward and Inverse Kinematics of Robotic Manipulators Via MATLAB

Wong, D., Dames, P., Kuchenbecker, K. J.

June 2014, Presented at the ICRA Workshop on MATLAB/Simulink for Robotics Education and Research. Oral presentation given by Dames and Wong (misc)

[BibTex]


Analyzing Human High-Fives to Create an Effective High-Fiving Robot

Fitter, N. T., Kuchenbecker, K. J.

In Proc. ACM/IEEE International Conference on Human-Robot Interaction (HRI), pages: 156-157, Bielefeld, Germany, March 2014, Poster presentation given by Fitter (inproceedings)

[BibTex]


Dynamic Modeling and Control of Voice-Coil Actuators for High-Fidelity Display of Haptic Vibrations

McMahan, W., Kuchenbecker, K. J.

In Proc. IEEE Haptics Symposium, pages: 115-122, Houston, Texas, USA, February 2014, Oral presentation given by Kuchenbecker (inproceedings)

[BibTex]


A Wearable Device for Controlling a Robot Gripper With Fingertip Contact, Pressure, Vibrotactile, and Grip Force Feedback

Pierce, R. M., Fedalei, E. A., Kuchenbecker, K. J.

In Proc. IEEE Haptics Symposium, pages: 19-25, Houston, Texas, USA, February 2014, Oral presentation given by Pierce (inproceedings)

[BibTex]


Methods for Robotic Tool-Mediated Haptic Surface Recognition

Romano, J. M., Kuchenbecker, K. J.

In Proc. IEEE Haptics Symposium, pages: 49-56, Houston, Texas, USA, February 2014, Oral presentation given by Kuchenbecker. Finalist for Best Paper Award (inproceedings)

[BibTex]


Control of a Virtual Robot with Fingertip Contact, Pressure, Vibrotactile, and Grip Force Feedback

Pierce, R. M., Fedalei, E. A., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE Haptics Symposium, Houston, Texas, USA, February 2014 (misc)

[BibTex]


One Hundred Data-Driven Haptic Texture Models and Open-Source Methods for Rendering on 3D Objects

Culbertson, H., Delgado, J. J. L., Kuchenbecker, K. J.

In Proc. IEEE Haptics Symposium, pages: 319-325, Houston, Texas, USA, February 2014, Poster presentation given by Culbertson. Finalist for Best Poster Award (inproceedings)

[BibTex]


A Modular Tactile Motion Guidance System

Kuchenbecker, K. J., Anon, A. M., Barkin, T., deVillafranca, K., Lo, M.

Hands-on demonstration presented at IEEE Haptics Symposium, Houston, Texas, USA, February 2014 (misc)

[BibTex]


The Penn Haptic Texture Toolkit

Culbertson, H., Delgado, J. J. L., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE Haptics Symposium, Houston, Texas, USA, February 2014 (misc)

[BibTex]


Roombots: A hardware perspective on 3D self-reconfiguration and locomotion with a homogeneous modular robot

Spröwitz, A., Moeckel, R., Vespignani, M., Bonardi, S., Ijspeert, A. J.

Robotics and Autonomous Systems, 62(7):1016-1033, Elsevier, Amsterdam, 2014 (article)

Abstract
In this work we provide hands-on experience on designing and testing a self-reconfiguring modular robotic system, Roombots (RB), to be used among others for adaptive furniture. In the long term, we envision that RB can be used to create sets of furniture, such as stools, chairs and tables that can move in their environment and that change shape and functionality during the day. In this article, we present the first, incremental results towards that long term vision. We demonstrate locomotion and reconfiguration of single and metamodule RB over 3D surfaces, in a structured environment equipped with embedded connection ports. RB assemblies can move around in non-structured environments, by using rotational or wheel-like locomotion. We show a proof of concept for transferring a Roombots metamodule (two in-series coupled RB modules) from the non-structured environment back into the structured grid, by aligning the RB metamodule in an entrapment mechanism. Finally, we analyze the remaining challenges to master the full Roombots scenario, and discuss the impact on future Roombots hardware.

DOI [BibTex]


Rough Terrain Mapping and Navigation using a Continuously Rotating 2D Laser Scanner

Schadler, M., Stueckler, J., Behnke, S.

Künstliche Intelligenz (KI), 28(2):93-99, Springer, 2014 (article)

link (url) DOI [BibTex]


Adaptive Tool-Use Strategies for Anthropomorphic Service Robots

Stueckler, J., Behnke, S.

In Proc. of the 14th IEEE-RAS International Conference on Humanoid Robots (Humanoids), 2014 (inproceedings)

link (url) [BibTex]


Automatic Generation of Reduced CPG Control Networks for Locomotion of Arbitrary Modular Robot Structures

Bonardi, S., Vespignani, M., Möckel, R., Van den Kieboom, J., Pouya, S., Spröwitz, A., Ijspeert, A.

In Proceedings of Robotics: Science and Systems, University of California, Berkeley, 2014 (inproceedings)

Abstract
The design of efficient locomotion controllers for arbitrary structures of reconfigurable modular robots is challenging because the morphology of the structure can change dynamically during the completion of a task. In this paper, we propose a new method to automatically generate reduced Central Pattern Generator (CPG) networks for locomotion control based on the detection of bio-inspired sub-structures, like body and limbs, and articulation joints inside the robotic structure. We demonstrate how that information, coupled with the potential symmetries in the structure, can be used to speed up the optimization of the gaits and investigate its impact on the solution quality (i.e. the velocity of the robotic structure and the potential internal collisions between robotic modules). We tested our approach on three simulated structures and observed that the reduced network topologies in the first iterations of the optimization process performed significantly better than the fully open ones.

DOI [BibTex]


Dense Real-Time Mapping of Object-Class Semantics from RGB-D Video

Stueckler, J., Waldvogel, B., Schulz, H., Behnke, S.

Journal of Real-Time Image Processing (JRTIP), 10(4):599-609, Springer, 2014 (article)

link (url) DOI [BibTex]


Local Multi-Resolution Surfel Grids for MAV Motion Estimation and 3D Mapping

Droeschel, D., Stueckler, J., Behnke, S.

In Proc. of the 13th International Conference on Intelligent Autonomous Systems (IAS), 2014 (inproceedings)

link (url) [BibTex]


Multi-Resolution Surfel Maps for Efficient Dense 3D Modeling and Tracking

Stueckler, J., Behnke, S.

Journal of Visual Communication and Image Representation (JVCI), 25(1):137-147, 2014 (article)

link (url) DOI [BibTex]


Active Recognition and Manipulation for Mobile Robot Bin Picking

Holz, D., Nieuwenhuisen, M., Droeschel, D., Stueckler, J., Berner, A., Li, J., Klein, R., Behnke, S.

In Gearing Up and Accelerating Cross-fertilization between Academic and Industrial Robotics Research in Europe: Technology Transfer Experiments from the ECHORD Project, pages: 133-153, Springer, 2014 (inbook)

link (url) DOI [BibTex]


Combining the Strengths of Sparse Interest Point and Dense Image Registration for RGB-D Odometry

Stueckler, J., Gutt, A., Behnke, S.

In Proc. of the Joint 45th International Symposium on Robotics (ISR) and 8th German Conference on Robotics (ROBOTIK), 2014 (inproceedings)

link (url) [BibTex]


Cutaneous Feedback of Planar Fingertip Deformation and Vibration on a da Vinci Surgical Robot

Pacchierotti, C., Shirsat, P., Koehn, J. K., Prattichizzo, D., Kuchenbecker, K. J.

In Proc. IROS Workshop on the Role of Human Sensorimotor Control in Robotic Surgery, Chicago, Illinois, 2014, Poster presentation given by Koehn (inproceedings)

[BibTex]


Increasing Flexibility of Mobile Manipulation and Intuitive Human-Robot Interaction in RoboCup@Home

Stueckler, J., Droeschel, D., Gräve, K., Holz, D., Schreiber, M., Topaldou-Kyniazopoulou, A., Schwarz, M., Behnke, S.

In RoboCup 2013, Robot Soccer World Cup XVII, pages: 135-146, Springer, 2014 (inbook)

link (url) DOI [BibTex]


Kinematic primitives for walking and trotting gaits of a quadruped robot with compliant legs

Spröwitz, A. T., Ajallooeian, M., Tuleu, A., Ijspeert, A. J.

Frontiers in Computational Neuroscience, 8(27):1-13, 2014 (article)

Abstract
In this work we research the role of body dynamics in the complexity of kinematic patterns in a quadruped robot with compliant legs. Two gait patterns, lateral sequence walk and trot, along with leg length control patterns of different complexity were implemented in a modular, feed-forward locomotion controller. The controller was tested on a small, quadruped robot with compliant, segmented leg design, and led to self-stable and self-stabilizing robot locomotion. In-air stepping and on-ground locomotion leg kinematics were recorded, and the number and shapes of motion primitives accounting for 95% of the variance of kinematic leg data were extracted. This revealed that kinematic patterns resulting from feed-forward control had a lower complexity (in-air stepping, 2–3 primitives) than kinematic patterns from on-ground locomotion (∼4 primitives), although both experiments applied identical motor patterns. The complexity of on-ground kinematic patterns had increased, through ground contact and mechanical entrainment. The complexity of observed kinematic on-ground data matches those reported from level-ground locomotion data of legged animals. Results indicate that a very low complexity of modular, rhythmic, feed-forward motor control is sufficient for level-ground locomotion in combination with passive compliant legged hardware.

link (url) DOI [BibTex]


Efficient Dense Registration, Segmentation, and Modeling Methods for RGB-D Environment Perception

Stueckler, J.

Faculty of Mathematics and Natural Sciences, University of Bonn, Germany, 2014 (phdthesis)

link (url) [BibTex]


Mobile Teleoperation Interfaces with Adjustable Autonomy for Personal Service Robots

Schwarz, M., Stueckler, J., Behnke, S.

In Proceedings of the 2014 ACM/IEEE International Conference on Human-robot Interaction, pages: 288-289, HRI ’14, ACM, 2014 (inproceedings)

link (url) DOI [BibTex]


Efficient deformable registration of multi-resolution surfel maps for object manipulation skill transfer

Stueckler, J., Behnke, S.

In Proc. of the IEEE International Conference on Robotics and Automation (ICRA), pages: 994-1001, May 2014 (inproceedings)

link (url) DOI [BibTex]


Local multi-resolution representation for 6D motion estimation and mapping with a continuously rotating 3D laser scanner

Droeschel, D., Stueckler, J., Behnke, S.

In Proc. of the IEEE Int. Conf. on Robotics and Automation (ICRA), pages: 5221-5226, May 2014 (inproceedings)

link (url) DOI [BibTex]