2013


Determination of an Analysis Procedure for FEM-Based Fatigue Calculations

Serhat, G.

Technical University of Munich, December 2013 (mastersthesis)

[BibTex]


A Practical System For Recording Instrument Interactions During Live Robotic Surgery

McMahan, W., Gomez, E. D., Chen, L., Bark, K., Nappo, J. C., Koch, E. I., Lee, D. I., Dumon, K., Williams, N., Kuchenbecker, K. J.

Journal of Robotic Surgery, 7(4):351-358, 2013 (article)

[BibTex]


Jointonation: Robotization of the Human Body by Vibrotactile Feedback

Kurihara, Y., Hachisu, T., Kuchenbecker, K. J., Kajimoto, H.

Emerging Technologies Demonstration with Talk at ACM SIGGRAPH Asia, Hong Kong, November 2013, Hands-on demonstration given by Kurihara, Takei, and Nakai. Best Demonstration Award as voted by the Program Committee (misc)

[BibTex]


Vision meets Robotics: The KITTI Dataset

Geiger, A., Lenz, P., Stiller, C., Urtasun, R.

International Journal of Robotics Research, 32(11):1231-1237, Sage Publishing, September 2013 (article)

Abstract
We present a novel dataset captured from a VW station wagon for use in mobile robotics and autonomous driving research. In total, we recorded 6 hours of traffic scenarios at 10-100 Hz using a variety of sensor modalities such as high-resolution color and grayscale stereo cameras, a Velodyne 3D laser scanner and a high-precision GPS/IMU inertial navigation system. The scenarios are diverse, capturing real-world traffic situations and range from freeways over rural areas to inner-city scenes with many static and dynamic objects. Our data is calibrated, synchronized and timestamped, and we provide the rectified and raw image sequences. Our dataset also contains object labels in the form of 3D tracklets and we provide online benchmarks for stereo, optical flow, object detection and other tasks. This paper describes our recording platform, the data format and the utilities that we provide.
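The abstract above mentions the raw data format and utilities that accompany the dataset. As a minimal sketch of working with the raw Velodyne scans (assuming the standard KITTI raw-data layout, in which each `.bin` file stores laser points as consecutive float32 quadruples of x, y, z, and reflectance), a scan could be loaded like this:

```python
import numpy as np

def load_velodyne_scan(path):
    """Load one KITTI-style Velodyne scan from a flat float32 binary
    file and return it as an (N, 4) array of x, y, z, reflectance."""
    scan = np.fromfile(path, dtype=np.float32)
    return scan.reshape(-1, 4)
```

This mirrors the layout described in the dataset's development kit; the function name here is illustrative, not part of the authors' released utilities.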


pdf DOI [BibTex]


Vibrotactile Display: Perception, Technology, and Applications

Choi, S., Kuchenbecker, K. J.

Proceedings of the IEEE, 101(9):2093-2104, September 2013 (article)

[BibTex]


ROS Open-source Audio Recognizer: ROAR Environmental Sound Detection Tools for Robot Programming

Romano, J. M., Brindza, J. P., Kuchenbecker, K. J.

Autonomous Robots, 34(3):207-215, April 2013 (article)

[BibTex]


Probabilistic Models for 3D Urban Scene Understanding from Movable Platforms

Geiger, A.

Karlsruhe Institute of Technology, April 2013 (phdthesis)

Abstract
Visual 3D scene understanding is an important component in autonomous driving and robot navigation. Intelligent vehicles, for example, often base their decisions on observations obtained from video cameras as they are cheap and easy to employ. Inner-city intersections represent an interesting but also very challenging scenario in this context: The road layout may be very complex and observations are often noisy or even missing due to heavy occlusions. While highway navigation and autonomous driving on simple and annotated intersections have already been demonstrated successfully, understanding and navigating general inner-city crossings with little prior knowledge remains an unsolved problem. This thesis is a contribution to understanding multi-object traffic scenes from video sequences. All data is provided by a camera system which is mounted on top of the autonomous driving platform AnnieWAY. The proposed probabilistic generative model reasons jointly about the 3D scene layout as well as the 3D location and orientation of objects in the scene. In particular, the scene topology, geometry as well as traffic activities are inferred from short video sequences. The model takes advantage of monocular information in the form of vehicle tracklets, vanishing lines and semantic labels. Additionally, the benefit of stereo features such as 3D scene flow and occupancy grids is investigated. Motivated by the impressive driving capabilities of humans, no further information such as GPS, lidar, radar or map knowledge is required. Experiments conducted on 113 representative intersection sequences show that the developed approach successfully infers the correct layout in a variety of difficult scenarios. To evaluate the importance of each feature cue, experiments with different feature combinations are conducted. Additionally, the proposed method is shown to improve object detection and object orientation estimation performance.


pdf [BibTex]


Data-Driven Modeling and Rendering of Isotropic Textures

Culbertson, H., McDonald, C. G., Goodman, B. E., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE World Haptics Conference, Daejeon, South Korea, April 2013, Best Demonstration Award (by audience vote) (misc)

[BibTex]


Adding Haptics to Robotic Surgery

Kuchenbecker, K. J., Brzezinski, A., Gomez, E. D., Gosselin, M., Hui, J., Koch, E., Koehn, J., McMahan, W., Mahajan, K., Nappo, J., Shah, N.

Learning Center Station at SAGES (Society of American Gastrointestinal and Endoscopic Surgeons) Annual Meeting, Baltimore, Maryland, USA, April 2013 (misc)

[BibTex]


In Vivo Validation of a System for Haptic Feedback of Tool Vibrations in Robotic Surgery

Bark, K., McMahan, W., Remington, A., Gewirtz, J., Wedmid, A., Lee, D. I., Kuchenbecker, K. J.

Surgical Endoscopy, 27(2):656-664, February 2013, dynamic article (paper plus video), available at http://www.springerlink.com/content/417j532708417342/ (article)

[BibTex]


Perception of Springs with Visual and Proprioceptive Motion Cues: Implications for Prosthetics

Gurari, N., Kuchenbecker, K. J., Okamura, A. M.

IEEE Transactions on Human-Machine Systems, 43:102-114, January 2013, video at http://www.youtube.com/watch?v=DBRw87Wk29E&feature=youtu.be (article)

[BibTex]


Expectation and Attention in Hierarchical Auditory Prediction

Chennu, S., Noreika, V., Gueorguiev, D., Blenkmann, A., Kochen, S., Ibáñez, A., Owen, A. M., Bekinschtein, T. A.

Journal of Neuroscience, 33(27):11194-11205, Society for Neuroscience, 2013 (article)

Abstract
Hierarchical predictive coding suggests that attention in humans emerges from increased precision in probabilistic inference, whereas expectation biases attention in favor of contextually anticipated stimuli. We test these notions within auditory perception by independently manipulating top-down expectation and attentional precision alongside bottom-up stimulus predictability. Our findings support an integrative interpretation of commonly observed electrophysiological signatures of neurodynamics, namely mismatch negativity (MMN), P300, and contingent negative variation (CNV), as manifestations along successive levels of predictive complexity. Early first-level processing indexed by the MMN was sensitive to stimulus predictability: here, attentional precision enhanced early responses, but explicit top-down expectation diminished it. This pattern was in contrast to later, second-level processing indexed by the P300: although sensitive to the degree of predictability, responses at this level were contingent on attentional engagement and in fact sharpened by top-down expectation. At the highest level, the drift of the CNV was a fine-grained marker of top-down expectation itself. Source reconstruction of high-density EEG, supported by intracranial recordings, implicated temporal and frontal regions differentially active at early and late levels. The cortical generators of the CNV suggested that it might be involved in facilitating the consolidation of context-salient stimuli into conscious perception. These results provide convergent empirical support to promising recent accounts of attention and expectation in predictive coding.

link (url) DOI [BibTex]

2011


Human-Inspired Robotic Grasp Control with Tactile Sensing

Romano, J. M., Hsiao, K., Niemeyer, G., Chitta, S., Kuchenbecker, K. J.

IEEE Transactions on Robotics, 27(6):1067-1079, December 2011 (article)

[BibTex]


Please do not touch the robot

Romano, J. M., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE/RSJ Conference on Intelligent Robots and Systems (IROS), San Francisco, California, September 2011 (misc)

[BibTex]


Tool Contact Acceleration Feedback for Telerobotic Surgery

McMahan, W., Gewirtz, J., Standish, D., Martin, P., Kunkel, J., Lilavois, M., Wedmid, A., Lee, D. I., Kuchenbecker, K. J.

IEEE Transactions on Haptics, 4(3):210-220, July 2011 (article)

[BibTex]


Body-Grounded Tactile Actuators for Playback of Human Physical Contact

Stanley, A. A., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE World Haptics Conference, Istanbul, Turkey, June 2011 (misc)

[BibTex]


VerroTouch: Vibrotactile Feedback for Robotic Minimally Invasive Surgery

McMahan, W., Gewirtz, J., Standish, D., Martin, P., Kunkel, J., Lilavois, M., Wedmid, A., Lee, D. I., Kuchenbecker, K. J.

Journal of Urology, 185(4, Supplement):e373, May 2011, Poster presentation given by McMahan at the Annual Meeting of the American Urological Association in Washington, D.C., USA (article)

[BibTex]

2009


Displaying Realistic Contact Accelerations Via a Dedicated Vibration Actuator

McMahan, W., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE World Haptics Conference, Salt Lake City, Utah, USA, March 2009, Proc. IEEE World Haptics Conference, pp. 613–614, Best Demonstration Award (misc)

[BibTex]


The iTorqU 1.0 and 2.0

Winfree, K. N., Gewirtz, J., Mather, T., Fiene, J., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE World Haptics Conference, Salt Lake City, Utah, March 2009 (misc)

[BibTex]


Vibrotactile Feedback System for Intuitive Upper-Limb Rehabilitation

Kapur, P., Premakumar, S., Jax, S. A., Buxbaum, L. J., Dawson, A. M., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE World Haptics Conference, Salt Lake City, Utah, USA, March 2009, Proc. IEEE World Haptics Conference, pp. 621–622 (misc)

[BibTex]


The SlipGlove

Romano, J. M., Gray, S. R., Jacobs, N. T., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE World Haptics Conference, Salt Lake City, Utah, March 2009 (misc)

[BibTex]


Real-Time Graphic and Haptic Simulation of Deformable Tissue Puncture

Romano, J. M., Safonova, A., Kuchenbecker, K. J.

Hands-on demonstration presented at Medicine Meets Virtual Reality, Long Beach, California, USA, January 2009 (misc)

[BibTex]

2008


The Touch Thimble

Kuchenbecker, K. J., Ferguson, D., Kutzer, M., Moses, M., Okamura, A. M.

Hands-on demonstration presented at IEEE Haptics Symposium, Reno, Nevada, USA, March 2008 (misc)

[BibTex]

2007


Comparing Visual and Haptic Position Feedback

Kuchenbecker, K. J., Gurari, N., Okamura, A. M.

Hands-on demonstration at IEEE World Haptics Conference, Tsukuba, Japan, March 2007 (misc)

[BibTex]