

2017


Physical and Behavioral Factors Improve Robot Hug Quality

Block, A. E., Kuchenbecker, K. J.

Workshop paper (2 pages) presented at the RO-MAN Workshop on Social Interaction and Multimodal Expression for Socially Intelligent Robots, Lisbon, Portugal, August 2017 (misc)

Abstract
A hug is one of the most basic ways humans can express affection. As hugs are so common, a natural progression of robot development is to have robots one day hug humans as seamlessly as these intimate human-human interactions occur. This project’s purpose is to evaluate human responses to different robot physical characteristics and hugging behaviors. Specifically, we aim to test the hypothesis that a warm, soft, touch-sensitive PR2 humanoid robot can provide humans with satisfying hugs by matching both their hugging pressure and their hugging duration. Thirty participants experienced and evaluated twelve hugs with the robot, divided into three randomly ordered trials that focused on physical robot characteristics and nine randomly ordered trials with varied hug pressure and duration. We found that people prefer soft, warm hugs over hard, cold hugs. Furthermore, users prefer hugs that physically squeeze them and release immediately when they are ready for the hug to end.
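
The preferred release behavior could be prototyped with a simple rule that monitors the user's squeeze pressure; the sketch below is only an illustration of that idea, with hypothetical sensor values and thresholds rather than the authors' implementation.

```python
# Hypothetical sketch of a hug-release rule: keep squeezing while the user squeezes,
# release as soon as their pressure drops, and never exceed a maximum duration.
# Sensor readings and thresholds are illustrative, not from the paper.

def should_release(pressure_history, release_threshold=0.2, max_duration_s=20.0, dt=0.01):
    """Return True once the user's squeeze pressure falls below the threshold
    or the hug has lasted longer than max_duration_s."""
    elapsed = len(pressure_history) * dt
    if elapsed > max_duration_s:
        return True
    # Require a few consecutive low readings to avoid reacting to sensor noise.
    recent = pressure_history[-5:]
    return len(recent) == 5 and all(p < release_threshold for p in recent)

# Example: pressure ramps up, holds, then drops when the user lets go.
readings = [0.0] * 10 + [0.8] * 200 + [0.1] * 10
print(should_release(readings))  # True: the user has released
```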

Project Page [BibTex]


Physically Interactive Exercise Games with a Baxter Robot

Fitter, N. T., Kuchenbecker, K. J.

Hands-on demonstration presented at the IEEE World Haptics Conference (WHC), Munich, Germany, June 2017 (misc)

Project Page [BibTex]


Proton Pack: Visuo-Haptic Surface Data Recording

Burka, A., Kuchenbecker, K. J.

Hands-on demonstration presented at the IEEE World Haptics Conference (WHC), Munich, Germany, June 2017 (misc)

Project Page [BibTex]


Teaching a Robot to Collaborate with a Human Via Haptic Teleoperation

Hu, S., Kuchenbecker, K. J.

Work-in-progress paper (2 pages) presented at the IEEE World Haptics Conference (WHC), Munich, Germany, June 2017 (misc)

Project Page [BibTex]


How Should Robots Hug?

Block, A. E., Kuchenbecker, K. J.

Work-in-progress paper (2 pages) presented at the IEEE World Haptics Conference (WHC), Munich, Germany, June 2017 (misc)

Project Page [BibTex]


An Interactive Augmented-Reality Video Training Platform for the da Vinci Surgical System

Carlson, J., Kuchenbecker, K. J.

Workshop paper (3 pages) presented at the ICRA Workshop on C4 Surgical Robots, Singapore, May 2017 (misc)

Abstract
Teleoperated surgical robots such as the Intuitive da Vinci Surgical System facilitate minimally invasive surgeries, which decrease risk to patients. However, these systems can be difficult to learn, and existing training curricula on surgical simulators do not offer students the realistic experience of a full operation. This paper presents an augmented-reality video training platform for the da Vinci that will allow trainees to rehearse any surgery recorded by an expert. While the trainee operates a da Vinci in free space, they see their own instruments overlaid on the expert video. Tools are identified in the source videos via color segmentation and kernelized correlation filter tracking, and their depth is calculated from the da Vinci’s stereoscopic video feed. The user tries to follow the expert’s movements, and if any of their tools venture too far away, the system provides instantaneous visual feedback and pauses to allow the user to correct their motion. The trainee can also rewind the expert video by bringing either da Vinci tool very close to the camera. This combined and augmented video provides the user with an immersive and interactive training experience.
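
Tool tracking of the kind described (color segmentation to initialize a region, then kernelized correlation filter tracking) can be sketched with OpenCV's KCF tracker; the snippet below is a minimal illustration with a placeholder video path and bounding box, not the platform's actual code.

```python
# Minimal sketch of KCF-based tool tracking with OpenCV (requires opencv-contrib-python).
# The video path and initial bounding box are placeholders, not from the paper.
import cv2

cap = cv2.VideoCapture("expert_surgery.mp4")  # hypothetical expert video
ok, frame = cap.read()
if not ok:
    raise SystemExit("could not read video")

# In the paper the initial tool region comes from color segmentation;
# here we simply hard-code a bounding box (x, y, width, height).
bbox = (300, 200, 80, 80)
tracker = cv2.TrackerKCF_create()
tracker.init(frame, bbox)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    found, bbox = tracker.update(frame)
    if found:
        x, y, w, h = (int(v) for v in bbox)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("tracked tool", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```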

[BibTex]


Chapter 8 - Micro- and nanorobots in Newtonian and biological viscoelastic fluids

Palagi, S., (Walker) Schamel, D., Qiu, T., Fischer, P.

In Microbiorobotics, Chapter 8, pages 133–162, Micro and Nano Technologies, Second edition, Elsevier, Boston, March 2017 (incollection)

Abstract
Swimming microorganisms are a source of inspiration for small scale robots that are intended to operate in fluidic environments including complex biomedical fluids. Nature has devised swimming strategies that are effective at small scales and at low Reynolds number. These include the rotary corkscrew motion that, for instance, propels a flagellated bacterial cell, as well as the asymmetric beat of appendages that sperm cells or ciliated protozoa use to move through fluids. These mechanisms can overcome the reciprocity that governs the hydrodynamics at small scale. The complex molecular structure of biologically important fluids presents an additional challenge for the effective propulsion of microrobots. In this chapter it is shown how physical and chemical approaches are essential in realizing engineered abiotic micro- and nanorobots that can move in biomedically important environments. Interestingly, we also describe a microswimmer that is effective in biological viscoelastic fluids that does not have a natural analogue.
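
To make the low-Reynolds-number regime concrete, a quick back-of-the-envelope calculation for a micrometer-scale swimmer in water (illustrative values, not taken from the chapter) shows that Re is far below 1, the regime in which viscous forces dominate and reciprocal strokes produce no net motion.

```python
# Reynolds number Re = rho * v * L / mu for a micro-swimmer in water.
# The swimmer size and speed below are illustrative values only.
rho = 1000.0    # water density, kg/m^3
mu = 1.0e-3     # water dynamic viscosity, Pa*s
L = 1.0e-6      # characteristic length: ~1 micrometer
v = 10.0e-6     # swimming speed: ~10 micrometers per second

Re = rho * v * L / mu
print(f"Re = {Re:.1e}")  # ~1e-5, i.e. deep in the viscous (Stokes) regime
```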

link (url) DOI [BibTex]


Hand-Clapping Games with a Baxter Robot

Fitter, N. T., Kuchenbecker, K. J.

Hands-on demonstration presented at ACM/IEEE International Conference on Human-Robot Interaction (HRI), Vienna, Austria, March 2017 (misc)

Abstract
Robots that work alongside humans might be more effective if they could forge a strong social bond with their human partners. Hand-clapping games and other forms of rhythmic social-physical interaction may foster human-robot teamwork, but the design of such interactions has scarcely been explored. At the HRI 2017 conference, we will showcase several such interactions taken from our recent work with the Rethink Robotics Baxter Research Robot, including tempo-matching, Simon says, and Pat-a-cake-like games. We believe conference attendees will be both entertained and intrigued by this novel demonstration of social-physical HRI.
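
The tempo-matching behavior can be approximated by detecting clap impacts as peaks in an accelerometer signal and estimating tempo from the intervals between them; the following rough sketch uses synthetic data and assumed parameters, not the Baxter implementation.

```python
# Rough sketch: estimate clapping tempo from accelerometer peaks (synthetic data).
import numpy as np
from scipy.signal import find_peaks

fs = 200.0                                     # assumed sample rate in Hz
t = np.arange(0, 5, 1 / fs)
signal = 0.05 * np.random.randn(t.size)        # background noise
clap_times = np.arange(0.5, 5.0, 0.6)          # a clap every 0.6 s, i.e. 100 BPM
for ct in clap_times:
    signal[int(ct * fs)] += 3.0                # sharp spike at each clap impact

peaks, _ = find_peaks(signal, height=1.0, distance=int(0.2 * fs))
intervals = np.diff(peaks) / fs                # seconds between detected claps
tempo_bpm = 60.0 / np.median(intervals)
print(f"estimated tempo: {tempo_bpm:.0f} BPM")  # ~100 BPM
```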

Project Page [BibTex]


Automatic OSATS Rating of Trainee Skill at a Pediatric Laparoscopic Suturing Task

Oquendo, Y. A., Riddle, E. W., Hiller, D., Blinman, T. A., Kuchenbecker, K. J.

Surgical Endoscopy, 31(Supplement 1):S28, Extended abstract presented as a podium presentation at the Annual Meeting of the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES), Springer, Houston, USA, March 2017 (misc)

Abstract
Introduction: Minimally invasive surgery has revolutionized surgical practice, but challenges remain. Trainees must acquire complex technical skills while minimizing patient risk, and surgeons must maintain their skills for rare procedures. These challenges are magnified in pediatric surgery due to the smaller spaces, finer tissue, and relative dearth of both inanimate and virtual simulators. To build technical expertise, trainees need opportunities for deliberate practice with specific performance feedback, which is typically provided via tedious human grading. This study aimed to validate a novel motion-tracking system and machine learning algorithm for automatically evaluating trainee performance on a pediatric laparoscopic suturing task using a 1–5 OSATS Overall Skill rating. Methods: Subjects (n=14) ranging from medical students to fellows performed one or two trials of an intracorporeal suturing task in a custom pediatric laparoscopy training box (Fig. 1) after watching a video of ideal performance by an expert. The position and orientation of the tools and endoscope were recorded over time using Ascension trakSTAR magnetic motion-tracking sensors, and both instrument grasp angles were recorded over time using flex sensors on the handles. The 27 trials were video-recorded and scored on the OSATS scale by a senior fellow; ratings ranged from 1 to 4. The raw motion data from each trial was processed to calculate over 200 preliminary motion parameters. Regularized least-squares regression (LASSO) was used to identify the most predictive parameters for inclusion in a regression tree. Model performance was evaluated by leave-one-subject-out cross validation, wherein the automatic scores given to each subject’s trials (by a model trained on all other data) are compared to the corresponding human rater scores. Results: The best-performing LASSO algorithm identified 14 predictive parameters for inclusion in the regression tree, including completion time, linear path length, angular path length, angular acceleration, grasp velocity, and grasp acceleration. The final model’s raw output showed a strong positive correlation of 0.87 with the reviewer-generated scores, and rounding the output to the nearest integer yielded a leave-one-subject-out cross-validation accuracy of 77.8%. Results are summarized in the confusion matrix (Table 1). Conclusions: Our novel motion-tracking system and regression model automatically gave previously unseen trials overall skill scores that closely match scores from an expert human rater. With additional data and further development, this system may enable creation of a motion-based training platform for pediatric laparoscopic surgery and could yield insights into the fundamental components of surgical skill.
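
The modeling pipeline described here (LASSO feature selection, a regression tree, and leave-one-subject-out cross-validation) could be sketched in scikit-learn roughly as follows; the feature matrix, scores, and subject labels below are random placeholders, not the study's data.

```python
# Sketch of the described pipeline with scikit-learn: LASSO feature selection,
# a regression tree, and leave-one-subject-out cross-validation.
# X, y, and subject IDs are random placeholders, not the study's data.
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import LeaveOneGroupOut

rng = np.random.default_rng(0)
X = rng.normal(size=(27, 200))                   # 27 trials x 200 motion parameters
y = rng.integers(1, 5, size=27).astype(float)    # OSATS scores in the 1-4 range
subjects = rng.integers(0, 14, size=27)          # which subject produced each trial

scores = []
for train, test in LeaveOneGroupOut().split(X, y, groups=subjects):
    lasso = LassoCV(cv=3).fit(X[train], y[train])
    selected = np.flatnonzero(lasso.coef_)       # most predictive parameters
    if selected.size == 0:
        selected = np.arange(X.shape[1])
    tree = DecisionTreeRegressor(max_depth=3).fit(X[train][:, selected], y[train])
    pred = np.rint(tree.predict(X[test][:, selected]))   # round to nearest integer score
    scores.append(np.mean(pred == y[test]))

print(f"leave-one-subject-out accuracy: {np.mean(scores):.2f}")
```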

[BibTex]


How Much Haptic Surface Data is Enough?

Burka, A., Kuchenbecker, K. J.

Workshop paper (5 pages) presented at the AAAI Spring Symposium on Interactive Multi-Sensory Object Perception for Embodied Agents, Stanford, USA, March 2017 (misc)

Abstract
The Proton Pack is a portable visuo-haptic surface interaction recording device that will be used to collect a vast multimodal dataset, intended for robots to use as part of an approach to understanding the world around them. In order to collect a useful dataset, we want to pick a suitable interaction duration for each surface, noting the tradeoff between data collection resources and completeness of data. One interesting approach frames the data collection process as an online learning problem, building an incremental surface model and using that model to decide when there is enough data. Here we examine how to do such online surface modeling and when to stop collecting data, using kinetic friction as a first domain in which to apply online modeling.
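
One way to realize the online-modeling idea for kinetic friction is to fit the Coulomb relation f ≈ μN with a running least-squares estimate and stop collecting once the estimate stabilizes; the sketch below uses simulated measurements and an arbitrary convergence tolerance, so it illustrates the stopping logic rather than the Proton Pack pipeline.

```python
# Sketch of online kinetic-friction modeling with a stopping rule:
# estimate mu in f = mu * N by a running least-squares fit through the origin
# and stop once the estimate barely changes. Data and tolerance are illustrative.
import numpy as np

rng = np.random.default_rng(1)
true_mu = 0.45
sum_nf = sum_nn = 0.0            # running sums for the least-squares fit
mu_est = prev_mu = 0.0

for k in range(1, 2001):
    N = rng.uniform(0.5, 3.0)                  # sampled normal force (N)
    f = true_mu * N + 0.05 * rng.normal()      # noisy kinetic friction measurement (N)
    sum_nf += N * f
    sum_nn += N * N
    prev_mu, mu_est = mu_est, sum_nf / sum_nn  # mu = sum(N*f) / sum(N^2)
    if k > 20 and abs(mu_est - prev_mu) < 1e-4:
        break                                  # model has stabilized; stop collecting

print(f"stopped after {k} samples, mu estimate = {mu_est:.3f}")
```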

link (url) Project Page [BibTex]


Enhancing Human-Computer Interaction via Electrovibration

Emgin, S. E., Sadia, B., Vardar, Y., Basdogan, C.

Demo in IEEE World Haptics, 2017 (misc)

Abstract
We present a compact tablet that displays electrostatic haptic feedback to the user. We track the user's finger position via an infrared frame and then display haptic feedback through a capacitive touch screen based on that position. To demonstrate the practical utility of the proposed system, the following applications have been developed: (1) an Online Shopping application that allows users to feel the cord density of two different fabrics; (2) an Education application that asks the user to add two numbers by dragging one number onto another to match the sum, where haptic feedback assists the user in selecting the right pair after the first number is chosen; (3) a Gaming/Entertainment application that presents users with a bike-riding experience on three different road textures: smooth, bumpy, and sandy; and (4) a User Interface application in which users are asked to drag two visually identical folders and, while dragging, can differentiate the amount of data in each folder based on haptic resistance.

[BibTex]


Reproduction of textures based on electrovibration

Fiedler, T., Vardar, Y., Strese, M., Steinbach, E., Basdogan, C.

Demo in IEEE World Haptics, 2017 (misc)

Abstract
This demonstration presents an approach to represent textures based on electrovibration. We collect acceleration data that occurs while sliding a tool tip over a real textured surface. The prerecorded data were collected with an ADXL335 accelerometer mounted on a FALCON device moving along the x-axis at a regulated velocity. In attempting to replicate the same acceleration with electrovibration, we encountered two problems. First, the frequency of a single sine wave shifts to twice the input frequency. This effect originates from the electrostatic force between the finger pad and the tactile display, as described by Kaczmarek et al. [1]. Taking the square root of the input signal corrects the effect, as also proposed earlier in [1, 2, 3]. Second, if multiple sine waves are displayed rather than just one, interference occurs, and acceleration signals from real textures may not feel perceptually realistic. We therefore propose to display only the dominant frequencies of a real texture signal. Peak frequencies are determined with respect to the JND of 11 percent reported in earlier literature, and a new sine-wave signal containing the dominant frequencies is created. In the demo, we will let attendees feel the differences between prerecorded and artificially created textures.
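
The processing chain outlined above (find the dominant spectral peaks in the recorded acceleration, resynthesize a sum of sines, and take the square root so that the voltage-squared electrostatic force reproduces the intended frequencies) might look roughly like the NumPy sketch below; all signal parameters are illustrative.

```python
# Illustrative sketch: pick dominant frequencies from a recorded acceleration
# signal, synthesize a sum of sines, and take the square root of a non-negative
# version of it so that the squared-voltage electrostatic force has the intended spectrum.
import numpy as np

fs = 10000.0
t = np.arange(0, 1.0, 1 / fs)
recorded = (np.sin(2 * np.pi * 60 * t) + 0.4 * np.sin(2 * np.pi * 180 * t)
            + 0.05 * np.random.randn(t.size))        # stand-in for measured data

spectrum = np.abs(np.fft.rfft(recorded))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
dominant = freqs[np.argsort(spectrum)[-2:]]          # keep the two largest peaks
print("dominant frequencies:", sorted(dominant))     # ~60 Hz and ~180 Hz

synth = sum(np.sin(2 * np.pi * f * t) for f in dominant)
# Force on the finger scales with voltage squared, so drive with the square root
# of the offset signal to avoid the frequency-doubling effect.
drive = np.sqrt(np.clip(synth - synth.min(), 0, None))
```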

[BibTex]

2016


Quantifying Therapist Practitioner Roles Using Video-based Analysis: Can We Reliably Model Therapist-Patient Interactions During Task-Oriented Therapy?

Mendonca, R., Johnson, M. J., Laskin, S., Adair, L., Mohan, M.

Abstract in the Archives of Physical Medicine and Rehabilitation, pages E55–E56, October 2016 (misc)

DOI [BibTex]


Numerical Investigation of Frictional Forces Between a Finger and a Textured Surface During Active Touch

Khojasteh, B., Janko, M., Visell, Y.

Extended abstract presented as an oral presentation at the 3rd International Conference on BioTribology (ICoBT), London, England, September 2016 (misc)

Abstract
The biomechanics of the human finger pad has been investigated in relation to motor behaviour and sensory function in the upper limb. While the frictional properties of the finger pad are important for grip and grasp function, recent attention has also been given to the roles played by friction when perceiving a surface via sliding contact. Indeed, the mechanics of sliding contact greatly affect stimuli felt by the finger scanning a surface. Past research has shed light on neural mechanisms of haptic texture perception, but the relation with time-resolved frictional contact interactions is unknown. Current biotribological models cannot predict time-resolved frictional forces felt by a finger as it slides on a rough surface. This constitutes a missing link in understanding the mechanical basis of texture perception. To ameliorate this, we developed a two-dimensional finite element numerical simulation of a human finger pad in sliding contact with a textured surface. Our model captures bulk mechanical properties, including hyperelasticity, dissipation, and tissue heterogeneity, as well as contact dynamics. To validate it, we utilized a database of measurements that we previously captured with a variety of human fingers and surfaces. By designing the simulations to match the measurements, we evaluated the ability of the FEM model to predict time-resolved sliding frictional forces. We varied surface texture wavelength, sliding speed, and normal forces in the experiments. An analysis of the results indicated that both time- and frequency-domain features of forces produced during finger-surface sliding interactions were reproduced, including many of the phenomena that we observed in analyses of real measurements: quasiperiodicity, harmonic distortion and spectral decay in the frequency domain, and their dependence on kinetics and surface properties. The results shed light on frictional signatures of surface texture during active touch, and may inform understanding of the role played by friction in texture discrimination.

[BibTex]


Behavioral Learning and Imitation for Music-Based Robotic Therapy for Children with Autism Spectrum Disorder

Burns, R., Nizambad, S., Park, C. H., Jeon, M., Howard, A.

Workshop paper (5 pages) presented at the RO-MAN Workshop on Behavior Adaptation, Interaction and Learning for Assistive Robotics, August 2016 (misc)

Abstract
In this full workshop paper, we discuss the positive impacts of robot, music, and imitation therapies on children with autism. We also discuss the use of Laban Motion Analysis (LMA) to identify emotion through movement and posture cues. We present our preliminary studies of the "Five Senses" game that our two robots, Romo the penguin and Darwin Mini, partake in. Using an LMA-focused approach (enabled by our skeletal tracking Kinect algorithm), we find that our participants show increased frequency of movement and speed when the game has a musical accompaniment. Therefore, participants may have increased engagement with our robots and game if music is present. We also begin exploring motion learning for future works.
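
The movement-speed features mentioned above can be approximated from tracked joint positions; the sketch below computes mean hand speed from a synthetic 3D trajectory at a typical Kinect frame rate and is only illustrative, not the study's analysis code.

```python
# Illustrative sketch: mean joint speed from a tracked 3D position time series
# (e.g., a Kinect hand joint). Positions here are synthetic, not study data.
import numpy as np

fs = 30.0                                     # typical Kinect skeleton frame rate (Hz)
t = np.arange(0, 10, 1 / fs)
# Synthetic hand trajectory: a 0.5 Hz oscillation with 0.2 m amplitude.
hand_xyz = np.stack([0.2 * np.sin(2 * np.pi * 0.5 * t),
                     np.zeros_like(t),
                     np.full_like(t, 1.0)], axis=1)

velocities = np.diff(hand_xyz, axis=0) * fs   # m/s between consecutive frames
speed = np.linalg.norm(velocities, axis=1)
print(f"mean hand speed: {speed.mean():.2f} m/s")
```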

link (url) [BibTex]


Design and evaluation of a novel mechanical device to improve hemiparetic gait: a case report

Fjeld, K., Hu, S., Kuchenbecker, K. J., Vasudevan, E. V.

Extended abstract presented at the Biomechanics and Neural Control of Movement Conference (BANCOM), 2016; poster presentation given by Fjeld (misc)

Project Page [BibTex]


One Sensor, Three Displays: A Comparison of Tactile Rendering from a BioTac Sensor

Brown, J. D., Ibrahim, M., Chase, E. D. Z., Pacchierotti, C., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE Haptics Symposium, Philadelphia, Pennsylvania, USA, April 2016 (misc)

[BibTex]


Multisensory robotic therapy to promote natural emotional interaction for children with ASD

Bevill, R., Azzi, P., Spadafora, M., Park, C. H., Jeon, M., Kim, H. J., Lee, J., Raihan, K., Howard, A.

Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (HRI), page 571, March 2016 (misc)

Abstract
In this video submission, we introduce two robots, Romo the penguin and Darwin Mini. We have programmed these robots to express a variety of emotions through facial expression and body language, respectively. We aim to use these robots with children with autism to demonstrate safe emotional and social responses in various sensory situations.

link (url) DOI [BibTex]


Interactive Robotic Framework for Multi-Sensory Therapy for Children with Autism Spectrum Disorder

Bevill, R., Park, C. H., Kim, H. J., Lee, J., Rennie, A., Jeon, M., Howard, A.

Extended abstract presented at the ACM/IEEE International Conference on Human Robot Interaction (HRI), March 2016 (misc)

Abstract
In this abstract, we present the overarching goal of our interactive robotic framework - to teach emotional and social behavior to children with autism spectrum disorders via multi-sensory therapy. We introduce our robot characters, Romo and Darwin Mini, and the "Five Senses" scenario they will undergo. This sensory game will develop the children's interest, and will model safe and appropriate reactions to typical sensory overload stimuli.

link (url) DOI [BibTex]


Designing Human-Robot Exercise Games for Baxter

Fitter, N. T., Hawkes, D. T., Johnson, M. J., Kuchenbecker, K. J.

Late-breaking results report presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2016 (misc)

Project Page [BibTex]


Design of a Low-Cost Platform for Autonomous Mobile Service Robots

Eaton, E., Mucchiani, C., Mohan, M., Isele, D., Luná, J. M., Clingerman, C.

Workshop paper (7 pages) presented at the 25th International Joint Conference on Artificial Intelligence (IJCAI) Workshop on Autonomous Mobile Service Robots, New York, USA, 2016 (misc)

Abstract
Most current autonomous mobile service robots are either expensive commercial platforms or custom manufactured for research environments, limiting their availability. We present the design for a low-cost service robot based on the widely used TurtleBot 2 platform, with the goal of making service robots affordable and accessible to the research, educational, and hobbyist communities. Our design uses a set of simple and inexpensive modifications to transform the TurtleBot 2 into a 4.5ft (1.37m) tall tour-guide or telepresence-style robot, capable of performing a wide variety of indoor service tasks. The resulting platform provides a shoulder-height touchscreen and 3D camera for interaction, an optional low-cost arm for manipulation, enhanced onboard computation, autonomous charging, and up to 6 hours of runtime. The resulting platform can support many of the tasks performed by significantly more expensive service robots. For compatibility with existing software packages, the service robot runs the Robot Operating System (ROS).
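
Because the platform runs ROS, a minimal node commanding the TurtleBot 2 base follows the usual rospy pattern; the snippet below is a generic velocity-publishing sketch with typical topic and speed values, not code from the paper.

```python
#!/usr/bin/env python
# Generic ROS sketch: publish a slow forward velocity to a TurtleBot 2-style base.
# The topic name and speed are typical defaults, not taken from the paper.
import rospy
from geometry_msgs.msg import Twist

def drive_forward():
    rospy.init_node("simple_drive")
    pub = rospy.Publisher("cmd_vel_mux/input/navi", Twist, queue_size=10)
    rate = rospy.Rate(10)                 # publish at 10 Hz
    cmd = Twist()
    cmd.linear.x = 0.1                    # 0.1 m/s forward
    while not rospy.is_shutdown():
        pub.publish(cmd)
        rate.sleep()

if __name__ == "__main__":
    drive_forward()
```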

link (url) [BibTex]


IMU-Mediated Real-Time Human-Baxter Hand-Clapping Interaction

Fitter, N. T., Huang, Y. E., Mayer, J. P., Kuchenbecker, K. J.

Late-breaking results report presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2016 (misc)

[BibTex]

2015


Haptic Textures for Online Shopping

Culbertson, H., Kuchenbecker, K. J.

Interactive demonstrations in The Retail Collective exhibit, presented at the Dx3 Conference in Toronto, Canada, March 2015 (misc)

[BibTex]


Derivation of phenomenological expressions for transition matrix elements for electron-phonon scattering

Illg, C., Haag, M., Müller, B. Y., Czycholl, G., Fähnle, M.

2015 (misc)


link (url) [BibTex]

2011


Please \sout{do not} touch the robot

Romano, J. M., Kuchenbecker, K. J.

Hands-on demonstration presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), San Francisco, California, September 2011 (misc)

[BibTex]


Body-Grounded Tactile Actuators for Playback of Human Physical Contact

Stanley, A. A., Kuchenbecker, K. J.

Hands-on demonstration presented at IEEE World Haptics Conference, Istanbul, Turkey, June 2011 (misc)

[BibTex]


Projected Newton-type methods in machine learning

Schmidt, M., Kim, D., Sra, S.

In Optimization for Machine Learning, pages: 305-330, MIT Press, Cambridge, MA, USA, 2011 (incollection)

Abstract
We consider projected Newton-type methods for solving large-scale optimization problems arising in machine learning and related fields. We first introduce an algorithmic framework for projected Newton-type methods by reviewing a canonical projected (quasi-)Newton method. This method, while conceptually pleasing, has a high computation cost per iteration. Thus, we discuss two variants that are more scalable, namely, two-metric projection and inexact projection methods. Finally, we show how to apply the Newton-type framework to handle non-smooth objectives. Examples are provided throughout the chapter to illustrate machine learning applications of our framework.
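
As a concrete illustration (not the chapter's code), the sketch below runs a projected scaled-gradient iteration with a diagonal Hessian approximation and an Armijo-style backtracking line search, a simple member of the projected Newton-type family, on a box-constrained quadratic.

```python
# Sketch: projected Newton-type iteration with a diagonal Hessian scaling for
# min 0.5 x'Ax - b'x subject to l <= x <= u. Problem data are arbitrary;
# this is an illustration, not the chapter's code.
import numpy as np

rng = np.random.default_rng(0)
M = rng.normal(size=(5, 5))
A = M @ M.T + np.eye(5)                  # symmetric positive definite Hessian
b = rng.normal(size=5)
lower, upper = -0.5 * np.ones(5), 0.5 * np.ones(5)

f = lambda x: 0.5 * x @ A @ x - b @ x
project = lambda x: np.clip(x, lower, upper)

x = np.zeros(5)
D_inv = 1.0 / np.diag(A)                 # diagonal approximation of the Hessian
for _ in range(100):
    grad = A @ x - b
    step = 1.0
    # Backtracking (Armijo-style) line search along the projected scaled step.
    while True:
        x_trial = project(x - step * D_inv * grad)
        if f(x_trial) <= f(x) - 1e-4 * grad @ (x - x_trial) or step < 1e-10:
            break
        step *= 0.5
    if np.linalg.norm(x_trial - x) < 1e-8:
        break                            # fixed point of the projection: optimal
    x = x_trial

print("solution within box:", np.round(x, 4))
```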

link (url) [BibTex]