2015


Reducing Student Anonymity and Increasing Engagement

Kuchenbecker, K. J.

University of Pennsylvania Almanac, 62(18):8, November 2015 (article)

Surgeons and Non-Surgeons Prefer Haptic Feedback of Instrument Vibrations During Robotic Surgery

Koehn, J. K., Kuchenbecker, K. J.

Surgical Endoscopy, 29(10):2970-2983, October 2015 (article)

Displaying Sensed Tactile Cues with a Fingertip Haptic Device

Pacchierotti, C., Prattichizzo, D., Kuchenbecker, K. J.

IEEE Transactions on Haptics, 8(4):384-396, October 2015 (article)

A thin film active-lens with translational control for dynamically programmable optical zoom

Yun, S., Park, S., Park, B., Nam, S., Park, S. K., Kyung, K.

Applied Physics Letters, 107(8):081907, AIP Publishing, August 2015 (article)

Abstract
We demonstrate a thin film active-lens for rapidly and dynamically controllable optical zoom. The active-lens is composed of a convex hemispherical polydimethylsiloxane (PDMS) lens structure working as an aperture and a dielectric elastomer (DE) membrane actuator, which is a combination of a thin DE layer made with PDMS and a compliant electrode pattern using silver-nanowires. The active-lens is capable of dynamically changing focal point of the soft aperture as high as 18.4% through its translational movement in vertical direction responding to electrically induced bulged-up deformation of the DE membrane actuator. Under operation with various sinusoidal voltage signals, the movement responses are fairly consistent with those estimated from numerical simulation. The responses are not only fast, fairly reversible, and highly durable during continuous cyclic operations, but also large enough to impart dynamic focus tunability for optical zoom in microscopic imaging devices with a light-weight and ultra-slim configuration.

Data-Driven Motion Mappings Improve Transparency in Teleoperation

Khurshid, R. P., Kuchenbecker, K. J.

Presence: Teleoperators and Virtual Environments, 24(2):132-154, May 2015 (article)

Robotic Learning of Haptic Adjectives Through Physical Interaction

Chu, V., McMahon, I., Riano, L., McDonald, C. G., He, Q., Perez-Tejada, J. M., Arrigo, M., Darrell, T., Kuchenbecker, K. J.

Robotics and Autonomous Systems, 63(3):279-292, January 2015, Corrigendum published in June 2016 (article)

Effects of Vibrotactile Feedback on Human Motor Learning of Arbitrary Arm Motions

Bark, K., Hyman, E., Tan, F., Cha, E., Jax, S. A., Buxbaum, L. J., Kuchenbecker, K. J.

IEEE Transactions on Neural Systems and Rehabilitation Engineering, 23(1):51-63, January 2015 (article)

2013


A Practical System For Recording Instrument Interactions During Live Robotic Surgery

McMahan, W., Gomez, E. D., Chen, L., Bark, K., Nappo, J. C., Koch, E. I., Lee, D. I., Dumon, K., Williams, N., Kuchenbecker, K. J.

Journal of Robotic Surgery, 7(4):351-358, 2013 (article)

Vibrotactile Display: Perception, Technology, and Applications

Choi, S., Kuchenbecker, K. J.

Proceedings of the IEEE, 101(9):2093-2104, September 2013 (article)

ROS Open-source Audio Recognizer: ROAR Environmental Sound Detection Tools for Robot Programming

Romano, J. M., Brindza, J. P., Kuchenbecker, K. J.

Autonomous Robots, 34(3):207-215, April 2013 (article)

In Vivo Validation of a System for Haptic Feedback of Tool Vibrations in Robotic Surgery

Bark, K., McMahan, W., Remington, A., Gewirtz, J., Wedmid, A., Lee, D. I., Kuchenbecker, K. J.

Surgical Endoscopy, 27(2):656-664, February 2013, dynamic article (paper plus video), available at http://www.springerlink.com/content/417j532708417342/ (article)

Perception of Springs with Visual and Proprioceptive Motion Cues: Implications for Prosthetics

Gurari, N., Kuchenbecker, K. J., Okamura, A. M.

IEEE Transactions on Human-Machine Systems, 43:102-114, January 2013, video at http://www.youtube.com/watch?v=DBRw87Wk29E&feature=youtu.be (article)

Expectation and Attention in Hierarchical Auditory Prediction

Chennu, S., Noreika, V., Gueorguiev, D., Blenkmann, A., Kochen, S., Ibáñez, A., Owen, A. M., Bekinschtein, T. A.

Journal of Neuroscience, 33(27):11194-11205, Society for Neuroscience, 2013 (article)

Abstract
Hierarchical predictive coding suggests that attention in humans emerges from increased precision in probabilistic inference, whereas expectation biases attention in favor of contextually anticipated stimuli. We test these notions within auditory perception by independently manipulating top-down expectation and attentional precision alongside bottom-up stimulus predictability. Our findings support an integrative interpretation of commonly observed electrophysiological signatures of neurodynamics, namely mismatch negativity (MMN), P300, and contingent negative variation (CNV), as manifestations along successive levels of predictive complexity. Early first-level processing indexed by the MMN was sensitive to stimulus predictability: here, attentional precision enhanced early responses, but explicit top-down expectation diminished it. This pattern was in contrast to later, second-level processing indexed by the P300: although sensitive to the degree of predictability, responses at this level were contingent on attentional engagement and in fact sharpened by top-down expectation. At the highest level, the drift of the CNV was a fine-grained marker of top-down expectation itself. Source reconstruction of high-density EEG, supported by intracranial recordings, implicated temporal and frontal regions differentially active at early and late levels. The cortical generators of the CNV suggested that it might be involved in facilitating the consolidation of context-salient stimuli into conscious perception. These results provide convergent empirical support to promising recent accounts of attention and expectation in predictive coding.
