

2019


Interactive Augmented Reality for Robot-Assisted Surgery

Forte, M. P., Kuchenbecker, K. J.

Workshop extended abstract presented as a podium presentation at the IROS Workshop on Legacy Disruptors in Applied Telerobotics, Macau, November 2019 (misc) Accepted


Project Page [BibTex]



High-Fidelity Multiphysics Finite Element Modeling of Finger-Surface Interactions with Tactile Feedback

Serhat, G., Kuchenbecker, K. J.

Work-in-progress paper (2 pages) presented at the IEEE World Haptics Conference (WHC), Tokyo, Japan, July 2019 (misc)

Abstract
In this study, we develop a high-fidelity finite element (FE) analysis framework that enables multiphysics simulation of the human finger in contact with a surface that is providing tactile feedback. We aim to elucidate a variety of physical interactions that can occur at finger-surface interfaces, including contact, friction, vibration, and electrovibration. We also develop novel FE-based methods that will allow prediction of nonconventional features such as real finger-surface contact area and finger stickiness. We envision using the developed computational tools for efficient design and optimization of haptic devices by replacing expensive and lengthy experimental procedures with high-fidelity simulation.


[BibTex]



Fingertip Friction Enhances Perception of Normal Force Changes

Gueorguiev, D., Lambert, J., Thonnard, J., Kuchenbecker, K. J.

Work-in-progress paper (2 pages) presented at the IEEE World Haptics Conference (WHC), Tokyo, Japan, July 2019 (misc)

Abstract
Using a force-controlled robotic platform, we tested the human perception of positive and negative modulations in normal force during passive dynamic touch, which also induced a strong related change in the finger-surface lateral force. In a two-alternative forced-choice task, eleven participants had to detect brief variations in the normal force compared to a constant controlled pre-stimulation force of 1 N and report whether it had increased or decreased. The average 75% just noticeable difference (JND) was found to be around 0.25 N for detecting the peak change and 0.30 N for correctly reporting the increase or the decrease. Interestingly, the friction coefficient of a subject’s fingertip positively correlated with his or her performance at detecting the change and reporting its direction, which suggests that humans may use the lateral force as a sensory cue to perceive variations in the normal force.


[BibTex]



Inflatable Haptic Sensor for the Torso of a Hugging Robot

Block, A. E., Kuchenbecker, K. J.

Work-in-progress paper (2 pages) presented at the IEEE World Haptics Conference (WHC), Tokyo, Japan, July 2019 (misc)

Abstract
During hugs, humans naturally provide and intuit subtle non-verbal cues that signify the strength and duration of an exchanged hug. Personal preferences for this close interaction may vary greatly between people; robots do not currently have the abilities to perceive or understand these preferences. This work-in-progress paper discusses designing, building, and testing a novel inflatable torso that can simultaneously soften a robot and act as a tactile sensor to enable more natural and responsive hugging. Using PVC vinyl, a microphone, and a barometric pressure sensor, we created a small test chamber to demonstrate a proof of concept for the full torso. While contacting the chamber in several ways common in hugs (pat, squeeze, scratch, and rub), we recorded data from the two sensors. The preliminary results suggest that the complementary haptic sensing channels allow us to detect coarse and fine contacts typically experienced during hugs, regardless of user hand placement.


Project Page [BibTex]



Understanding the Pull-off Force of the Human Fingerpad

Nam, S., Kuchenbecker, K. J.

Work-in-progress paper (2 pages) presented at the IEEE World Haptics Conference (WHC), Tokyo, Japan, July 2019 (misc)

Abstract
To understand the adhesive force that occurs when a finger pulls off of a smooth surface, we built an apparatus to measure the fingerpad’s moisture, normal force, and real contact area over time during interactions with a glass plate. We recorded a total of 450 trials (45 interactions by each of ten human subjects), capturing a wide range of values across the aforementioned variables. The experimental results showed that the pull-off force increases with larger finger contact area and faster detachment rate. Additionally, moisture generally increases the contact area of the finger, but too much moisture can restrict the increase in the pull-off force.


[BibTex]



The Haptician and the Alphamonsters

Forte, M. P., L’Orsa, R., Mohan, M., Nam, S., Kuchenbecker, K. J.

Student Innovation Challenge on Implementing Haptics in Virtual Reality Environment presented at the IEEE World Haptics Conference, Tokyo, Japan, July 2019, Maria Paola Forte, Rachael L'Orsa, Mayumi Mohan, and Saekwang Nam contributed equally to this publication (misc)

Abstract
Dysgraphia, a neurological disorder characterized by writing disabilities, affects between 7% and 15% of children. It presents itself in the form of unfinished letters, letter distortion, inconsistent letter size, letter collision, etc. Traditional therapeutic exercises require continuous assistance from teachers or occupational therapists. Autonomous partial or full haptic guidance can produce positive results, but children often become bored with the repetitive nature of such activities. Conversely, virtual rehabilitation with video games represents a new frontier for occupational therapy due to its highly motivational nature. Virtual reality (VR) adds an element of novelty and entertainment to therapy, thus motivating players to perform exercises more regularly. We propose leveraging the HTC VIVE Pro and the EXOS Wrist DK2 to create an immersive spellcasting “exergame” (exercise game) that helps motivate children with dysgraphia to improve writing fluency.


Student Innovation Challenge – Virtual Reality [BibTex]



Implementation of a 6-DOF Parallel Continuum Manipulator for Delivering Fingertip Tactile Cues

Young, E. M., Kuchenbecker, K. J.

IEEE Transactions on Haptics, 12(3):295-306, June 2019 (article)

Abstract
Existing fingertip haptic devices can deliver different subsets of tactile cues in a compact package, but we have not yet seen a wearable six-degree-of-freedom (6-DOF) display. This paper presents the Fuppeteer (short for Fingertip Puppeteer), a device that is capable of controlling the position and orientation of a flat platform, such that any combination of normal and shear force can be delivered at any location on any human fingertip. We build on our previous work of designing a parallel continuum manipulator for fingertip haptics by presenting a motorized version in which six flexible Nitinol wires are actuated via independent roller mechanisms and proportional-derivative controllers. We evaluate the settling time and end-effector vibrations observed during system responses to step inputs. After creating a six-dimensional lookup table and adjusting simulated inputs using measured Jacobians, we show that the device can make contact with all parts of the fingertip with a mean error of 1.42 mm. Finally, we present results from a human-subject study. A total of 24 users discerned 9 evenly distributed contact locations with an average accuracy of 80.5%. Translational and rotational shear cues were identified reasonably well near the center of the fingertip and more poorly around the edges.


DOI [BibTex]


Explorations of Shape-Changing Haptic Interfaces for Blind and Sighted Pedestrian Navigation

Spiers, A., Kuchenbecker, K. J.

Workshop paper (6 pages) presented at the CHI 2019 Workshop on Hacking Blind Navigation, May 2019 (misc) Accepted

Abstract
Since the 1960s, technologists have worked to develop systems that facilitate independent navigation by vision-impaired (VI) pedestrians. These devices vary in terms of conveyed information and feedback modality. Unfortunately, many such prototypes never progress beyond laboratory testing. Conversely, smartphone-based navigation systems for sighted pedestrians have grown in robustness and capabilities, to the point of now being ubiquitous. How can we leverage the success of sighted navigation technology, which is driven by a larger global market, as a way to progress VI navigation systems? We believe one possibility is to make common devices that benefit both VI and sighted individuals, by providing information in a way that does not distract either user from their tasks or environment. To this end, we have developed physical interfaces that eschew visual, audio, or vibratory feedback, instead relying on the natural human ability to perceive the shape of a handheld object.


[BibTex]



Bimanual Wrist-Squeezing Haptic Feedback Changes Speed-Force Tradeoff in Robotic Surgery Training

Cao, E., Machaca, S., Bernard, T., Wolfinger, B., Patterson, Z., Chi, A., Adrales, G. L., Kuchenbecker, K. J., Brown, J. D.

Extended abstract presented as an ePoster at the Annual Meeting of the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES), Baltimore, USA, April 2019 (misc) Accepted


[BibTex]



Interactive Augmented Reality for Robot-Assisted Surgery

Forte, M. P., Kuchenbecker, K. J.

Extended abstract presented as an Emerging Technology ePoster at the Annual Meeting of the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES), Baltimore, Maryland, USA, April 2019 (misc) Accepted


Project Page [BibTex]



A Design Tool for Therapeutic Social-Physical Human-Robot Interactions

Mohan, M., Kuchenbecker, K. J.

Workshop paper (3 pages) presented at the HRI Pioneers Workshop, Daegu, South Korea, March 2019 (misc) Accepted

Abstract
We live in an aging society; social-physical human-robot interaction has the potential to keep older adults healthy by motivating them to exercise. After summarizing prior work, this paper proposes a tool that can be used to design exercise and therapy interactions to be performed by an upper-body humanoid robot. The interaction design tool comprises a teleoperation system that transmits the operator’s arm motions, head motions, and facial expression, along with an interface to monitor and assess the motion of the user interacting with the robot. We plan to use this platform to create dynamic and intuitive exercise interactions.


Project Page [BibTex]



The Perception of Ultrasonic Square Reductions of Friction With Variable Sharpness and Duration

Gueorguiev, D., Vezzoli, E., Sednaoui, T., Grisoni, L., Lemaire-Semail, B.

IEEE Transactions on Haptics, 12(2):179-188, January 2019 (article)

Abstract
The human perception of square ultrasonic modulation of the finger-surface friction was investigated during active tactile exploration by using short frictional cues of varying duration and sharpness. In a first experiment, we asked participants to discriminate the transition time and duration of short square ultrasonic reductions of friction. They proved very sensitive to millisecond differences in these two parameters, with average psychophysical thresholds of 2.3–2.4 ms for both. A second experiment focused on the perception of square friction reductions with variable transition times and durations. We found that for stimulation durations larger than 90 ms, participants often perceived three or four edges when only two stimulations were presented, while they consistently felt two edges for signals shorter than 50 ms. A subsequent analysis of the contact forces induced by these ultrasonic stimulations during slow and fast active exploration showed that two identical consecutive ultrasonic pulses can induce significantly different frictional dynamics, especially during fast motion of the finger. These results confirm the human sensitivity to transient frictional cues and suggest that the human perception of square reductions of friction can depend on their sharpness and duration as well as on the speed of exploration.


DOI [BibTex]



How Does It Feel to Clap Hands with a Robot?

Fitter, N. T., Kuchenbecker, K. J.

International Journal of Social Robotics, 2019 (article) Accepted

Abstract
Future robots may need lighthearted physical interaction capabilities to connect with people in meaningful ways. To begin exploring how users perceive playful human–robot hand-to-hand interaction, we conducted a study with 20 participants. Each user played simple hand-clapping games with the Rethink Robotics Baxter Research Robot during a 1-h-long session involving 24 randomly ordered conditions that varied in facial reactivity, physical reactivity, arm stiffness, and clapping tempo. Survey data and experiment recordings demonstrate that this interaction is viable: all users successfully completed the experiment and mentioned enjoying at least one game without prompting. Hand-clapping tempo was highly salient to users, and human-like robot errors were more widely accepted than mechanical errors. Furthermore, perceptions of Baxter varied in the following statistically significant ways: facial reactivity increased the robot’s perceived pleasantness and energeticness; physical reactivity decreased pleasantness, energeticness, and dominance; higher arm stiffness increased safety and decreased dominance; and faster tempo increased energeticness and increased dominance. These findings can motivate and guide roboticists who want to design social–physical human–robot interactions.


[BibTex]



Toward Expert-Sourcing of a Haptic Device Repository

Seifi, H., Ip, J., Agrawal, A., Kuchenbecker, K. J., MacLean, K. E.

Glasgow, UK, 2019 (misc)

Abstract
Haptipedia is an online taxonomy, database, and visualization that aims to accelerate ideation of new haptic devices and interactions in human-computer interaction, virtual reality, haptics, and robotics. The current version of Haptipedia (105 devices) was created through iterative design, data entry, and evaluation by our team of experts. Next, we aim to greatly increase the number of devices and keep Haptipedia updated by soliciting data entry and verification from haptics experts worldwide.


link (url) [BibTex]


2015


Reducing Student Anonymity and Increasing Engagement

Kuchenbecker, K. J.

University of Pennsylvania Almanac, 62(18):8, November 2015 (article)


[BibTex]



Surgeons and Non-Surgeons Prefer Haptic Feedback of Instrument Vibrations During Robotic Surgery

Koehn, J. K., Kuchenbecker, K. J.

Surgical Endoscopy, 29(10):2970-2983, October 2015 (article)


[BibTex]



Displaying Sensed Tactile Cues with a Fingertip Haptic Device

Pacchierotti, C., Prattichizzo, D., Kuchenbecker, K. J.

IEEE Transactions on Haptics, 8(4):384-396, October 2015 (article)


[BibTex]



A thin film active-lens with translational control for dynamically programmable optical zoom

Yun, S., Park, S., Park, B., Nam, S., Park, S. K., Kyung, K.

Applied Physics Letters, 107(8):081907, AIP Publishing, August 2015 (article)

Abstract
We demonstrate a thin film active-lens for rapidly and dynamically controllable optical zoom. The active-lens is composed of a convex hemispherical polydimethylsiloxane (PDMS) lens structure working as an aperture and a dielectric elastomer (DE) membrane actuator, which is a combination of a thin DE layer made with PDMS and a compliant electrode pattern using silver nanowires. The active-lens is capable of dynamically changing the focal point of the soft aperture by as much as 18.4% through its translational movement in the vertical direction, responding to electrically induced bulged-up deformation of the DE membrane actuator. Under operation with various sinusoidal voltage signals, the movement responses are fairly consistent with those estimated from numerical simulation. The responses are not only fast, fairly reversible, and highly durable during continuous cyclic operations, but also large enough to impart dynamic focus tunability for optical zoom in microscopic imaging devices with a light-weight and ultra-slim configuration.


link (url) DOI [BibTex]



Data-Driven Motion Mappings Improve Transparency in Teleoperation

Khurshid, R. P., Kuchenbecker, K. J.

Presence: Teleoperators and Virtual Environments, 24(2):132-154, May 2015 (article)


[BibTex]



Haptic Textures for Online Shopping

Culbertson, H., Kuchenbecker, K. J.

Interactive demonstrations in The Retail Collective exhibit, presented at the Dx3 Conference in Toronto, Canada, March 2015 (misc)


[BibTex]



Robotic Learning of Haptic Adjectives Through Physical Interaction

Chu, V., McMahon, I., Riano, L., McDonald, C. G., He, Q., Perez-Tejada, J. M., Arrigo, M., Darrell, T., Kuchenbecker, K. J.

Robotics and Autonomous Systems, 63(3):279-292, 2015, Vivian Chu, Ian McMahon, and Lorenzo Riano contributed equally to this publication. Corrigendum published in June 2016 (article)


[BibTex]



Effects of Vibrotactile Feedback on Human Motor Learning of Arbitrary Arm Motions

Bark, K., Hyman, E., Tan, F., Cha, E., Jax, S. A., Buxbaum, L. J., Kuchenbecker, K. J.

IEEE Transactions on Neural Systems and Rehabilitation Engineering, 23(1):51-63, January 2015 (article)


[BibTex]


2013


A Practical System for Recording Instrument Interactions During Live Robotic Surgery

McMahan, W., Gomez, E. D., Chen, L., Bark, K., Nappo, J. C., Koch, E. I., Lee, D. I., Dumon, K., Williams, N., Kuchenbecker, K. J.

Journal of Robotic Surgery, 7(4):351-358, 2013 (article)


[BibTex]



Jointonation: Robotization of the Human Body by Vibrotactile Feedback

Kurihara, Y., Hachisu, T., Kuchenbecker, K. J., Kajimoto, H.

Emerging Technologies Demonstration with Talk at ACM SIGGRAPH Asia, Hong Kong, November 2013, Hands-on demonstration given by Kurihara, Takei, and Nakai. Best Demonstration Award as voted by the Program Committee (misc)


[BibTex]



Vibrotactile Display: Perception, Technology, and Applications

Choi, S., Kuchenbecker, K. J.

Proceedings of the IEEE, 101(9):2093-2104, September 2013 (article)


[BibTex]



ROS Open-source Audio Recognizer: ROAR Environmental Sound Detection Tools for Robot Programming

Romano, J. M., Brindza, J. P., Kuchenbecker, K. J.

Autonomous Robots, 34(3):207-215, April 2013 (article)


[BibTex]



Data-Driven Modeling and Rendering of Isotropic Textures

Culbertson, H., McDonald, C. G., Goodman, B. E., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE World Haptics Conference, Daejeon, South Korea, April 2013, Best Demonstration Award (by audience vote) (misc)


[BibTex]



Adding Haptics to Robotic Surgery

Kuchenbecker, K. J., Brzezinski, A., Gomez, E. D., Gosselin, M., Hui, J., Koch, E., Koehn, J., McMahan, W., Mahajan, K., Nappo, J., Shah, N.

Learning Center Station at SAGES (Society of American Gastrointestinal and Endoscopic Surgeons) Annual Meeting, Baltimore, Maryland, USA, April 2013 (misc)


[BibTex]



In Vivo Validation of a System for Haptic Feedback of Tool Vibrations in Robotic Surgery

Bark, K., McMahan, W., Remington, A., Gewirtz, J., Wedmid, A., Lee, D. I., Kuchenbecker, K. J.

Surgical Endoscopy, 27(2):656-664, February 2013, dynamic article (paper plus video), available at http://www.springerlink.com/content/417j532708417342/ (article)


[BibTex]



Perception of Springs with Visual and Proprioceptive Motion Cues: Implications for Prosthetics

Gurari, N., Kuchenbecker, K. J., Okamura, A. M.

IEEE Transactions on Human-Machine Systems, 43:102-114, January 2013, video available at http://www.youtube.com/watch?v=DBRw87Wk29E&feature=youtu.be (article)


[BibTex]



Expectation and Attention in Hierarchical Auditory Prediction

Chennu, S., Noreika, V., Gueorguiev, D., Blenkmann, A., Kochen, S., Ibáñez, A., Owen, A. M., Bekinschtein, T. A.

Journal of Neuroscience, 33(27):11194-11205, Society for Neuroscience, 2013 (article)

Abstract
Hierarchical predictive coding suggests that attention in humans emerges from increased precision in probabilistic inference, whereas expectation biases attention in favor of contextually anticipated stimuli. We test these notions within auditory perception by independently manipulating top-down expectation and attentional precision alongside bottom-up stimulus predictability. Our findings support an integrative interpretation of commonly observed electrophysiological signatures of neurodynamics, namely mismatch negativity (MMN), P300, and contingent negative variation (CNV), as manifestations along successive levels of predictive complexity. Early first-level processing indexed by the MMN was sensitive to stimulus predictability: here, attentional precision enhanced early responses, but explicit top-down expectation diminished them. This pattern was in contrast to later, second-level processing indexed by the P300: although sensitive to the degree of predictability, responses at this level were contingent on attentional engagement and in fact sharpened by top-down expectation. At the highest level, the drift of the CNV was a fine-grained marker of top-down expectation itself. Source reconstruction of high-density EEG, supported by intracranial recordings, implicated temporal and frontal regions differentially active at early and late levels. The cortical generators of the CNV suggested that it might be involved in facilitating the consolidation of context-salient stimuli into conscious perception. These results provide convergent empirical support to promising recent accounts of attention and expectation in predictive coding.


link (url) DOI [BibTex]
