

2013


A Practical System For Recording Instrument Interactions During Live Robotic Surgery

McMahan, W., Gomez, E. D., Chen, L., Bark, K., Nappo, J. C., Koch, E. I., Lee, D. I., Dumon, K., Williams, N., Kuchenbecker, K. J.

Journal of Robotic Surgery, 7(4):351-358, 2013 (article)

[BibTex]


Jointonation: Robotization of the Human Body by Vibrotactile Feedback

Kurihara, Y., Hachisu, T., Kuchenbecker, K. J., Kajimoto, H.

Emerging Technologies Demonstration with Talk at ACM SIGGRAPH Asia, Hong Kong, November 2013, Hands-on demonstration given by Kurihara, Takei, and Nakai. Best Demonstration Award as voted by the Program Committee (misc)

[BibTex]


Vision meets Robotics: The KITTI Dataset

Geiger, A., Lenz, P., Stiller, C., Urtasun, R.

International Journal of Robotics Research, 32(11):1231-1237, Sage Publishing, September 2013 (article)

Abstract
We present a novel dataset captured from a VW station wagon for use in mobile robotics and autonomous driving research. In total, we recorded 6 hours of traffic scenarios at 10-100 Hz using a variety of sensor modalities such as high-resolution color and grayscale stereo cameras, a Velodyne 3D laser scanner and a high-precision GPS/IMU inertial navigation system. The scenarios are diverse, capturing real-world traffic situations and range from freeways over rural areas to inner-city scenes with many static and dynamic objects. Our data is calibrated, synchronized and timestamped, and we provide the rectified and raw image sequences. Our dataset also contains object labels in the form of 3D tracklets and we provide online benchmarks for stereo, optical flow, object detection and other tasks. This paper describes our recording platform, the data format and the utilities that we provide.

pdf DOI [BibTex]


Vibrotactile Display: Perception, Technology, and Applications

Choi, S., Kuchenbecker, K. J.

Proceedings of the IEEE, 101(9):2093-2104, September 2013 (article)

[BibTex]


ROS Open-source Audio Recognizer: ROAR Environmental Sound Detection Tools for Robot Programming

Romano, J. M., Brindza, J. P., Kuchenbecker, K. J.

Autonomous Robots, 34(3):207-215, April 2013 (article)

[BibTex]


Data-Driven Modeling and Rendering of Isotropic Textures

Culbertson, H., McDonald, C. G., Goodman, B. E., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE World Haptics Conference, Daejeon, South Korea, April 2013, Best Demonstration Award (by audience vote) (misc)

[BibTex]


Adding Haptics to Robotic Surgery

Kuchenbecker, K. J., Brzezinski, A., Gomez, E. D., Gosselin, M., Hui, J., Koch, E., Koehn, J., McMahan, W., Mahajan, K., Nappo, J., Shah, N.

Learning Center Station at SAGES (Society of American Gastrointestinal and Endoscopic Surgeons) Annual Meeting, Baltimore, Maryland, USA, April 2013 (misc)

[BibTex]


In Vivo Validation of a System for Haptic Feedback of Tool Vibrations in Robotic Surgery

Bark, K., McMahan, W., Remington, A., Gewirtz, J., Wedmid, A., Lee, D. I., Kuchenbecker, K. J.

Surgical Endoscopy, 27(2):656-664, February 2013, dynamic article (paper plus video), available at http://www.springerlink.com/content/417j532708417342/ (article)

[BibTex]


Perception of Springs with Visual and Proprioceptive Motion Cues: Implications for Prosthetics

Gurari, N., Kuchenbecker, K. J., Okamura, A. M.

IEEE Transactions on Human-Machine Systems, 43:102-114, January 2013, video at http://www.youtube.com/watch?v=DBRw87Wk29E (article)

[BibTex]


Towards Dynamic Trot Gait Locomotion: Design, Control, and Experiments with Cheetah-cub, a Compliant Quadruped Robot

Spröwitz, A., Tuleu, A., Vespignani, M., Ajallooeian, M., Badri, E., Ijspeert, A. J.

The International Journal of Robotics Research, 32(8):932-950, Sage Publications, Inc., Cambridge, MA, 2013 (article)

Abstract
We present the design of a novel compliant quadruped robot, called Cheetah-cub, and a series of locomotion experiments with fast trotting gaits. The robot’s leg configuration is based on a spring-loaded, pantograph mechanism with multiple segments. A dedicated open-loop locomotion controller was derived and implemented. Experiments were run in simulation and in hardware on flat terrain and with a step down, demonstrating the robot’s self-stabilizing properties. The robot reached a running trot with short flight phases with a maximum Froude number of FR = 1.30, or 6.9 body lengths per second. Morphological parameters such as the leg design also played a role. By adding distal in-series elasticity, self-stability and maximum robot speed improved. Our robot has several advantages, especially when compared with larger and stiffer quadruped robot designs. (1) It is, to the best of the authors’ knowledge, the fastest of all quadruped robots below 30 kg (in terms of Froude number and body lengths per second). (2) It shows self-stabilizing behavior over a large range of speeds with open-loop control. (3) It is lightweight, compact, and electrically powered. (4) It is cheap, easy to reproduce, robust, and safe to handle. This makes it an excellent tool for research of multi-segment legs in quadruped robots.

Youtube1 Youtube2 Youtube3 Youtube4 Youtube5 DOI Project Page [BibTex]
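The Froude number quoted in the abstract (FR = 1.30) is a standard dimensionless speed for comparing legged locomotion across scales. A minimal sketch using one common definition, Fr = v^2 / (g * l), with purely illustrative values (the speed and leg length below are not taken from the paper):

```python
def froude(speed, leg_length, g=9.81):
    """Dimensionless Froude number Fr = v^2 / (g * l), with v the forward
    speed, l a characteristic leg length, and g gravitational acceleration."""
    return speed ** 2 / (g * leg_length)

# Illustrative values only:
v = 1.4    # forward speed in m/s
l = 0.15   # characteristic leg length in m
print(round(froude(v, l), 2))  # -> 1.33
```

Comparable Froude numbers at different body sizes suggest dynamically similar gaits, which is why the paper reports speed both as a Froude number and in body lengths per second.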


Expectation and Attention in Hierarchical Auditory Prediction

Chennu, S., Noreika, V., Gueorguiev, D., Blenkmann, A., Kochen, S., Ibáñez, A., Owen, A. M., Bekinschtein, T. A.

Journal of Neuroscience, 33(27):11194-11205, Society for Neuroscience, 2013 (article)

Abstract
Hierarchical predictive coding suggests that attention in humans emerges from increased precision in probabilistic inference, whereas expectation biases attention in favor of contextually anticipated stimuli. We test these notions within auditory perception by independently manipulating top-down expectation and attentional precision alongside bottom-up stimulus predictability. Our findings support an integrative interpretation of commonly observed electrophysiological signatures of neurodynamics, namely mismatch negativity (MMN), P300, and contingent negative variation (CNV), as manifestations along successive levels of predictive complexity. Early first-level processing indexed by the MMN was sensitive to stimulus predictability: here, attentional precision enhanced early responses, but explicit top-down expectation diminished it. This pattern was in contrast to later, second-level processing indexed by the P300: although sensitive to the degree of predictability, responses at this level were contingent on attentional engagement and in fact sharpened by top-down expectation. At the highest level, the drift of the CNV was a fine-grained marker of top-down expectation itself. Source reconstruction of high-density EEG, supported by intracranial recordings, implicated temporal and frontal regions differentially active at early and late levels. The cortical generators of the CNV suggested that it might be involved in facilitating the consolidation of context-salient stimuli into conscious perception. These results provide convergent empirical support to promising recent accounts of attention and expectation in predictive coding.

link (url) DOI [BibTex]


Horse-Like Walking, Trotting, and Galloping derived from Kinematic Motion Primitives (kMPs) and their Application to Walk/Trot Transitions in a Compliant Quadruped Robot

Moro, F., Spröwitz, A., Tuleu, A., Vespignani, M., Tsagarakis, N. G., Ijspeert, A. J., Caldwell, D. G.

Biological Cybernetics, 107(3):309-320, 2013 (article)

Abstract
This manuscript proposes a method to directly transfer the features of horse walking, trotting, and galloping to a quadruped robot, with the aim of creating a much more natural (horse-like) locomotion profile. A principal component analysis on horse joint trajectories shows that walk, trot, and gallop can be described by a set of four kinematic Motion Primitives (kMPs). These kMPs are used to generate valid, stable gaits that are tested on a compliant quadruped robot. Tests on the effects of gait frequency scaling indicate a speed-optimal walking frequency around 3.4 Hz and an optimal trotting frequency around 4 Hz. A criterion to synthesize gait transitions is then proposed, and walk/trot transitions are successfully tested on the robot. The performance of the robot when the transitions are scaled in frequency is evaluated by means of roll and pitch angle phase plots.

DOI [BibTex]
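The kMP extraction described in the abstract is, at its core, a principal component analysis of joint-angle trajectories. A self-contained sketch on synthetic data (the signals, dimensions, and noise level are invented for illustration; the paper applies the analysis to recorded horse kinematics):

```python
import numpy as np

# Sketch: recover kinematic motion primitives (kMPs) as the leading
# principal components of joint-angle trajectories. Here a rank-4 signal
# (four periodic basis functions) is mixed into 12 synthetic "joints".
rng = np.random.default_rng(0)
T, J = 200, 12                       # time samples x joints
t = np.linspace(0, 2 * np.pi, T)
base = np.stack([np.sin(t), np.cos(t), np.sin(2 * t), np.cos(2 * t)], axis=1)
mixing = rng.normal(size=(4, J))
trajectories = base @ mixing + 0.01 * rng.normal(size=(T, J))

# PCA via SVD of the mean-centered data matrix
centered = trajectories - trajectories.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
explained = (S ** 2) / np.sum(S ** 2)
print(round(float(np.sum(explained[:4])), 3))  # first four components capture nearly all variance
```

Since the underlying signal is rank four, the first four principal components account for essentially all of the variance, mirroring the paper's finding that four kMPs suffice to describe walk, trot, and gallop.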

2011


Human-Inspired Robotic Grasp Control with Tactile Sensing

Romano, J. M., Hsiao, K., Niemeyer, G., Chitta, S., Kuchenbecker, K. J.

IEEE Transactions on Robotics, 27(6):1067-1079, December 2011 (article)

[BibTex]


Please do not touch the robot

Romano, J. M., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE/RSJ Conference on Intelligent Robots and Systems (IROS), San Francisco, California, September 2011 (misc)

[BibTex]


Tool Contact Acceleration Feedback for Telerobotic Surgery

McMahan, W., Gewirtz, J., Standish, D., Martin, P., Kunkel, J., Lilavois, M., Wedmid, A., Lee, D. I., Kuchenbecker, K. J.

IEEE Transactions on Haptics, 4(3):210-220, July 2011 (article)

[BibTex]


Body-Grounded Tactile Actuators for Playback of Human Physical Contact

Stanley, A. A., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE World Haptics Conference, Istanbul, Turkey, June 2011 (misc)

[BibTex]


VerroTouch: Vibrotactile Feedback for Robotic Minimally Invasive Surgery

McMahan, W., Gewirtz, J., Standish, D., Martin, P., Kunkel, J., Lilavois, M., Wedmid, A., Lee, D. I., Kuchenbecker, K. J.

Journal of Urology, 185(4, Supplement):e373, May 2011, Poster presentation given by McMahan at the Annual Meeting of the American Urological Association in Washington, D.C., USA (article)

[BibTex]

2009


Displaying Realistic Contact Accelerations Via a Dedicated Vibration Actuator

McMahan, W., Kuchenbecker, K. J.

Hands-on demonstration presented at the IEEE World Haptics Conference, Proc. IEEE World Haptics Conference, pp. 613–614, Salt Lake City, Utah, USA, March 2009, Best Demonstration Award (misc)

[BibTex]


The iTorqU 1.0 and 2.0

Winfree, K. N., Gewirtz, J., Mather, T., Fiene, J., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE World Haptics Conference, Salt Lake City, Utah, March 2009 (misc)

[BibTex]


Vibrotactile Feedback System for Intuitive Upper-Limb Rehabilitation

Kapur, P., Premakumar, S., Jax, S. A., Buxbaum, L. J., Dawson, A. M., Kuchenbecker, K. J.

Hands-on demonstration presented at the IEEE World Haptics Conference, Proc. IEEE World Haptics Conference, pp. 621–622, Salt Lake City, Utah, USA, March 2009 (misc)

[BibTex]


The SlipGlove

Romano, J. M., Gray, S. R., Jacobs, N. T., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE World Haptics Conference, Salt Lake City, Utah, March 2009 (misc)

[BibTex]


Real-Time Graphic and Haptic Simulation of Deformable Tissue Puncture

Romano, J. M., Safonova, A., Kuchenbecker, K. J.

Hands-on demonstration presented at Medicine Meets Virtual Reality, Long Beach, California, USA, January 2009 (misc)

[BibTex]

2008


The Touch Thimble

Kuchenbecker, K. J., Ferguson, D., Kutzer, M., Moses, M., Okamura, A. M.

Hands-on demonstration presented at IEEE Haptics Symposium, Reno, Nevada, USA, March 2008 (misc)

[BibTex]


Learning to Move in Modular Robots using Central Pattern Generators and Online Optimization

Spröwitz, A., Moeckel, R., Maye, J., Ijspeert, A. J.

The International Journal of Robotics Research, 27(3-4):423-443, 2008 (article)

Abstract
This article addresses the problem of how modular robotics systems, i.e. systems composed of multiple modules that can be configured into different robotic structures, can learn to locomote. In particular, we tackle the problems of online learning, that is, learning while moving, and the problem of dealing with unknown arbitrary robotic structures. We propose a framework for learning locomotion controllers based on two components: a central pattern generator (CPG) and a gradient-free optimization algorithm referred to as Powell's method. The CPG is implemented as a system of coupled nonlinear oscillators in our YaMoR modular robotic system, with one oscillator per module. The nonlinear oscillators are coupled together across modules using Bluetooth communication to obtain specific gaits, i.e. synchronized patterns of oscillations among modules. Online learning involves running the Powell optimization algorithm in parallel with the CPG model, with the speed of locomotion being the criterion to be optimized. Interesting aspects of the optimization include the fact that it is carried out online, the robots do not require stopping or resetting and it is fast. We present results showing the interesting properties of this framework for a modular robotic system. In particular, our CPG model can readily be implemented in a distributed system, it is computationally cheap, it exhibits limit cycle behavior (temporary perturbations are rapidly forgotten), it produces smooth trajectories even when control parameters are abruptly changed and it is robust against imperfect communication among modules. We also present results of learning to move with three different robot structures. Interesting locomotion modes are obtained after running the optimization for less than 60 minutes.

link (url) DOI [BibTex]
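The CPG described in the abstract couples one oscillator per module so that phase relationships converge to a desired gait pattern and recover quickly from perturbations. A minimal phase-oscillator sketch (the network, gain, and phase biases below are illustrative, not the YaMoR implementation, which uses coupled nonlinear oscillators communicating over Bluetooth):

```python
import math

def step_cpg(phases, freqs, coupling, dt=0.01, k=1.0):
    """One Euler step of a network of phase-coupled oscillators:
    dphi_i/dt = 2*pi*f_i + k * sum_j sin(phi_j - phi_i - psi_ij),
    where coupling maps i -> list of (j, psi_ij) pairs."""
    new = []
    for i, phi in enumerate(phases):
        dphi = 2 * math.pi * freqs[i]
        for j, bias in coupling.get(i, []):
            dphi += k * math.sin(phases[j] - phi - bias)
        new.append(phi + dt * dphi)
    return new

# Two modules locking to a half-cycle phase lag (illustrative topology):
phases = [0.0, 0.1]
freqs = [1.0, 1.0]
coupling = {0: [(1, -math.pi)], 1: [(0, math.pi)]}
for _ in range(5000):
    phases = step_cpg(phases, freqs, coupling, k=4.0)
lag = (phases[1] - phases[0]) % (2 * math.pi)
print(round(lag, 2))  # settles near pi
```

The limit-cycle behavior the abstract highlights is visible here: regardless of the small initial offset, the phase difference converges to the bias encoded in the coupling, which is what makes such controllers robust to temporary perturbations.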

2007


Comparing Visual and Haptic Position Feedback

Kuchenbecker, K. J., Gurari, N., Okamura, A. M.

Hands-on demonstration at IEEE World Haptics Conference, Tsukuba, Japan, March 2007 (misc)

[BibTex]