

2016


Qualitative User Reactions to a Hand-Clapping Humanoid Robot

Fitter, N. T., Kuchenbecker, K. J.

In Social Robotics: 8th International Conference, ICSR 2016, Kansas City, MO, USA, November 1-3, 2016 Proceedings, 9979, pages: 317-327, Lecture Notes in Artificial Intelligence, Springer International Publishing, November 2016, Oral presentation given by Fitter (inproceedings)


[BibTex]



Designing and Assessing Expressive Open-Source Faces for the Baxter Robot

Fitter, N. T., Kuchenbecker, K. J.

In Social Robotics: 8th International Conference, ICSR 2016, Kansas City, MO, USA, November 1-3, 2016 Proceedings, 9979, pages: 340-350, Lecture Notes in Artificial Intelligence, Springer International Publishing, November 2016, Oral presentation given by Fitter (inproceedings)


[BibTex]



Rhythmic Timing in Playful Human-Robot Social Motor Coordination

Fitter, N. T., Hawkes, D. T., Kuchenbecker, K. J.

In Social Robotics: 8th International Conference, ICSR 2016, Kansas City, MO, USA, November 1-3, 2016 Proceedings, 9979, pages: 296-305, Lecture Notes in Artificial Intelligence, Springer International Publishing, November 2016, Oral presentation given by Fitter (inproceedings)


[BibTex]



Using IMU Data to Demonstrate Hand-Clapping Games to a Robot

Fitter, N. T., Kuchenbecker, K. J.

In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pages: 851-856, October 2016, Interactive presentation given by Fitter (inproceedings)


[BibTex]



ProtonPack: A Visuo-Haptic Data Acquisition System for Robotic Learning of Surface Properties

Burka, A., Hu, S., Helgeson, S., Krishnan, S., Gao, Y., Hendricks, L. A., Darrell, T., Kuchenbecker, K. J.

In Proceedings of the IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI), pages: 58-65, 2016, Oral presentation given by Burka (inproceedings)


Project Page [BibTex]



Equipping the Baxter Robot with Human-Inspired Hand-Clapping Skills

Fitter, N. T., Kuchenbecker, K. J.

In Proceedings of the IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), pages: 105-112, 2016 (inproceedings)


[BibTex]



Reproducing a Laser Pointer Dot on a Secondary Projected Screen

Hu, S., Kuchenbecker, K. J.

In Proceedings of the IEEE International Conference on Advanced Intelligent Mechatronics (AIM), pages: 1645-1650, 2016, Oral presentation given by Hu (inproceedings)


[BibTex]



Patches, Planes and Probabilities: A Non-local Prior for Volumetric 3D Reconstruction

Ulusoy, A. O., Black, M. J., Geiger, A.

In IEEE Conf. on Computer Vision and Pattern Recognition (CVPR), IEEE International Conference on Computer Vision and Pattern Recognition (CVPR), June 2016 (inproceedings)

Abstract
In this paper, we propose a non-local structured prior for volumetric multi-view 3D reconstruction. Towards this goal, we present a novel Markov random field model based on ray potentials in which assumptions about large 3D surface patches such as planarity or Manhattan world constraints can be efficiently encoded as probabilistic priors. We further derive an inference algorithm that reasons jointly about voxels, pixels and image segments, and estimates marginal distributions of appearance, occupancy, depth, normals and planarity. Key to tractable inference is a novel hybrid representation that spans both voxel and pixel space and that integrates non-local information from 2D image segmentations in a principled way. We compare our non-local prior to commonly employed local smoothness assumptions and a variety of state-of-the-art volumetric reconstruction baselines on challenging outdoor scenes with textureless and reflective surfaces. Our experiments indicate that regularizing over larger distances has the potential to resolve ambiguities where local regularizers fail.


YouTube pdf poster suppmat Project Page [BibTex]



Semantic Instance Annotation of Street Scenes by 3D to 2D Label Transfer

Xie, J., Kiefel, M., Sun, M., Geiger, A.

In IEEE Conf. on Computer Vision and Pattern Recognition (CVPR), IEEE International Conference on Computer Vision and Pattern Recognition (CVPR), June 2016 (inproceedings)

Abstract
Semantic annotations are vital for training models for object recognition, semantic segmentation or scene understanding. Unfortunately, pixelwise annotation of images at very large scale is labor-intensive, and little labeled data is available, particularly at the instance level and for street scenes. In this paper, we propose to tackle this problem by lifting the semantic instance labeling task from 2D into 3D. Given reconstructions from stereo or laser data, we annotate static 3D scene elements with rough bounding primitives and develop a probabilistic model which transfers this information into the image domain. We leverage our method to obtain 2D labels for a novel suburban video dataset which we have collected, resulting in 400k semantic and instance image annotations. A comparison of our method to state-of-the-art label transfer baselines reveals that 3D information enables more efficient annotation while at the same time resulting in improved accuracy and time-coherent labels.


pdf suppmat Project Page [BibTex]

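The label-transfer idea in the abstract above rests on projecting annotated 3D geometry into the image plane. A minimal sketch of that projection step, with a toy pinhole camera; the intrinsics, the point, and the label value are invented for illustration and are not from the paper:

```python
import numpy as np

# Toy camera: intrinsics K and an identity pose (camera at the world origin).
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros(3)

def project_labels(points_3d, labels, image_shape):
    """Paint each 3D point's label into an (H, W) label image; -1 = unlabeled."""
    label_img = -np.ones(image_shape, dtype=int)
    cam = (R @ points_3d.T).T + t            # world -> camera coordinates
    uvw = (K @ cam.T).T                      # camera -> homogeneous pixels
    uv = uvw[:, :2] / uvw[:, 2:3]            # perspective division
    for (u, v), lbl in zip(uv.astype(int), labels):
        if 0 <= v < image_shape[0] and 0 <= u < image_shape[1]:
            label_img[v, u] = lbl
    return label_img

# A single labeled point 5 m in front of the camera lands at the image center.
pts = np.array([[0.0, 0.0, 5.0]])
img = project_labels(pts, labels=[7], image_shape=(480, 640))
```

The paper's probabilistic model does far more (it reasons about occlusion and label consistency over time); this sketch only shows the geometric core of moving a 3D annotation into 2D.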


Deep Learning for Tactile Understanding From Visual and Haptic Data

Gao, Y., Hendricks, L. A., Kuchenbecker, K. J., Darrell, T.

In Proceedings of the IEEE International Conference on Robotics and Automation, pages: 536-543, May 2016, Oral presentation given by Gao (inproceedings)


[BibTex]



Robust Tactile Perception of Artificial Tumors Using Pairwise Comparisons of Sensor Array Readings

Hui, J. C. T., Block, A. E., Taylor, C. J., Kuchenbecker, K. J.

In Proceedings of the IEEE Haptics Symposium, pages: 305-312, Philadelphia, Pennsylvania, USA, April 2016, Oral presentation given by Hui (inproceedings)


[BibTex]



Data-Driven Comparison of Four Cutaneous Displays for Pinching Palpation in Robotic Surgery

Brown, J. D., Ibrahim, M., Chase, E. D. Z., Pacchierotti, C., Kuchenbecker, K. J.

In Proceedings of the IEEE Haptics Symposium, pages: 147-154, Philadelphia, Pennsylvania, USA, April 2016, Oral presentation given by Brown (inproceedings)


[BibTex]



Multisensory Robotic Therapy through Motion Capture and Imitation for Children with ASD

Burns, R., Nizambad, S., Park, C. H., Jeon, M., Howard, A.

Proceedings of the American Society of Engineering Education, Mid-Atlantic Section, Spring Conference, April 2016 (conference)

Abstract
It is known that children with autism have difficulty with emotional communication. As the population of children with autism increases, it is crucial that we create effective therapeutic programs that will improve their communication skills. We present an interactive robotic system that delivers emotional and social behaviors for multisensory therapy for children with autism spectrum disorders. Our framework includes emotion-based robotic gestures and facial expressions, as well as tracking and understanding the child's responses through Kinect motion capture.


link (url) [BibTex]



Design and Implementation of a Visuo-Haptic Data Acquisition System for Robotic Learning of Surface Properties

Burka, A., Hu, S., Helgeson, S., Krishnan, S., Gao, Y., Hendricks, L. A., Darrell, T., Kuchenbecker, K. J.

In Proceedings of the IEEE Haptics Symposium, pages: 350-352, April 2016, Work-in-progress paper. Poster presentation given by Burka (inproceedings)


Project Page [BibTex]



Multisensory robotic therapy to promote natural emotional interaction for children with ASD

Burns, R., Azzi, P., Spadafora, M., Park, C. H., Jeon, M., Kim, H. J., Lee, J., Raihan, K., Howard, A.

Proceedings of the Eleventh ACM/IEEE International Conference on Human Robot Interaction (HRI), pages: 571-571, March 2016 (conference)

Abstract
This video submission introduces two robots, Romo the penguin and Darwin Mini. We have programmed these robots to perform a variety of emotions through facial expression and body language, respectively. We aim to use these robots with children with autism to demonstrate safe emotional and social responses in various sensory situations.


link (url) DOI [BibTex]



Interactive Robotic Framework for Multi-Sensory Therapy for Children with Autism Spectrum Disorder

Burns, R., Park, C. H., Kim, H. J., Lee, J., Rennie, A., Jeon, M., Howard, A.

In Proceedings of the Eleventh ACM/IEEE International Conference on Human Robot Interaction (HRI), pages: 421-422, March 2016 (inproceedings)

Abstract
In this abstract, we present the overarching goal of our interactive robotic framework - to teach emotional and social behavior to children with autism spectrum disorders via multi-sensory therapy. We introduce our robot characters, Romo and Darwin Mini, and the "Five Senses" scenario they will undergo. This sensory game will develop the children's interest, and will model safe and appropriate reactions to typical sensory overload stimuli.


link (url) DOI [BibTex]



Deep Discrete Flow

Güney, F., Geiger, A.

Asian Conference on Computer Vision (ACCV), 2016 (conference), accepted


pdf suppmat Project Page [BibTex]



Psychophysical Power Optimization of Friction Modulation for Tactile Interfaces

Sednaoui, T., Vezzoli, E., Gueorguiev, D., Amberg, M., Chappaz, C., Lemaire-Semail, B.

In Haptics: Perception, Devices, Control, and Applications, pages: 354-362, Springer International Publishing, Cham, 2016 (inproceedings)

Abstract
Ultrasonic vibration and electrovibration can modulate the friction between a surface and a sliding finger. The power consumption of these devices is critical to their integration in modern mobile devices such as smartphones. This paper presents a simple control solution that reduces this power consumption by up to 68.8% by taking advantage of human perception limits.


[BibTex]



Effect of Waveform in Haptic Perception of Electrovibration on Touchscreens

Vardar, Y., Güçlü, B., Basdogan, C.

In Haptics: Perception, Devices, Control, and Applications, pages: 190-203, Springer International Publishing, Cham, 2016 (inproceedings)

Abstract
The perceived intensity of electrovibration can be altered by modulating the amplitude, frequency, and waveform of the input voltage signal applied to the conductive layer of a touchscreen. Even though the effect of the first two has been already investigated for sinusoidal signals, we are not aware of any detailed study investigating the effect of the waveform on our haptic perception in the domain of electrovibration. This paper investigates how input voltage waveform affects our haptic perception of electrovibration on touchscreens. We conducted absolute detection experiments using square wave and sinusoidal input signals at seven fundamental frequencies (15, 30, 60, 120, 240, 480 and 1920 Hz). Experimental results depicted the well-known U-shaped tactile sensitivity across frequencies. However, the sensory thresholds were lower for the square wave than the sinusoidal wave at fundamental frequencies less than 60 Hz while they were similar at higher frequencies. Using an equivalent circuit model of a finger-touchscreen system, we show that the sensation difference between the waveforms at low fundamental frequencies can be explained by frequency-dependent electrical properties of human skin and the differential sensitivity of mechanoreceptor channels to individual frequency components in the electrostatic force. As a matter of fact, when the electrostatic force waveforms are analyzed in the frequency domain based on human vibrotactile sensitivity data from the literature [15], the electrovibration stimuli caused by square-wave input signals at all the tested frequencies in this study are found to be detected by the Pacinian psychophysical channel.


vardar_eurohaptics_2016 [BibTex]

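The frequency-domain argument in the abstract above relies on a square wave spreading its energy over odd harmonics (amplitude 4A/(πn) for the n-th harmonic), so mechanoreceptor channels see multiple frequency components at once. A minimal numerical illustration of that decomposition; the sampling rate and fundamental below are chosen arbitrarily and are not the paper's stimuli:

```python
import numpy as np

fs, f0, dur = 48000, 120.0, 1.0          # sample rate (Hz), fundamental (Hz), seconds
t = np.arange(int(fs * dur)) / fs
square = np.sign(np.sin(2 * np.pi * f0 * t))  # unit-amplitude square wave

# Single-sided amplitude spectrum: |rfft| scaled so a sinusoid of amplitude a
# shows up as a peak of height a at its frequency.
spectrum = np.abs(np.fft.rfft(square)) * 2 / len(square)
freqs = np.fft.rfftfreq(len(square), 1 / fs)

def harmonic_amp(n):
    """Spectrum amplitude at the n-th harmonic of f0."""
    return spectrum[np.argmin(np.abs(freqs - n * f0))]

# Fundamental ~ 4/pi, third harmonic ~ 4/(3*pi), even harmonics ~ 0.
```

A sine input of the same fundamental would put all its energy in one bin, which is why the two waveforms can drive differently tuned vibrotactile channels despite sharing a fundamental frequency.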

2005


Perception of Curvature and Object Motion Via Contact Location Feedback

Provancher, W. R., Kuchenbecker, K. J., Niemeyer, G., Cutkosky, M. R.

In Proceedings of the International Symposium on Robotics Research (ISRR), 15, pages: 456-465, Springer Tracts in Advanced Robotics, Springer, Siena, Italy, 2005, Oral presentation given by Provancher in October of 2003 (inproceedings)


[BibTex]



Modeling Induced Master Motion in Force-Reflecting Teleoperation

Kuchenbecker, K. J., Niemeyer, G.

In Proc. IEEE International Conference on Robotics and Automation, pages: 348-353, Barcelona, Spain, April 2005, Oral presentation given by Kuchenbecker (inproceedings)


[BibTex]



Event-Based Haptics and Acceleration Matching: Portraying and Assessing the Realism of Contact

Kuchenbecker, K. J., Fiene, J. P., Niemeyer, G.

In Proc. IEEE World Haptics Conference, pages: 381-387, Pisa, Italy, March 2005, Oral presentation given by Kuchenbecker (inproceedings)


[BibTex]
