Voice and Gesture Control

Aluna Everitt

As robotic systems and autonomous agents become more common in our everyday lives, human-robot interaction is becoming an increasingly important area of research. Drones are one such robotic entity that has grown in popularity for both consumer and industrial applications. Beyond traditional single-modality controller interfaces (e.g. a joystick or on-screen buttons), interaction design has advanced little towards intuitive human-machine interaction and collaboration. Our work explores the potential of multi-modal interaction for improving the usability of human-robot interaction, specifically with drones. We present a prototype system consisting of a head-mounted hand tracker and voice control on a laptop, supporting gesture and voice recognition for controlling a drone. We use off-the-shelf components and software APIs for the implementation. Our system supports both gestures and voice commands to enhance interaction design for human-robot collaboration. We conducted a user study with 15 participants to evaluate the usability of multi-modal interaction; our findings show that people prefer a combination of gestures and voice commands when controlling a drone. We hope our findings and the detailed description of our system implementation will help advance interaction design for human-robot collaboration with drones, and that they can generalize to other robotic agents.
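As a rough illustration of how gesture and voice inputs might be combined into drone commands, the minimal sketch below fuses the two modalities with a simple priority rule. The gesture names, voice phrases, and the priority policy are assumptions for illustration only; the prototype's actual hand tracker and speech-recognition API are abstracted away here.

```python
# Minimal sketch of multi-modal command fusion (hypothetical names and
# mappings; the prototype's tracker and voice API are abstracted away).
from dataclasses import dataclass
from typing import Optional

# Illustrative gesture and voice vocabularies (assumptions, not the paper's).
GESTURE_COMMANDS = {"palm_up": "ascend", "palm_down": "descend",
                    "swipe_left": "yaw_left", "swipe_right": "yaw_right"}
VOICE_COMMANDS = {"take off": "takeoff", "land": "land",
                  "stop": "hover", "come back": "return_home"}

@dataclass
class DroneCommand:
    action: str
    source: str  # "voice" or "gesture"

def fuse_inputs(gesture: Optional[str], utterance: Optional[str]) -> Optional[DroneCommand]:
    """Resolve the latest gesture and voice inputs into one drone command.

    Voice takes priority for discrete, safety-critical actions (e.g. "land"),
    while gestures drive continuous manoeuvres; this priority rule is an
    assumption for illustration, not the paper's exact policy.
    """
    if utterance and utterance.lower() in VOICE_COMMANDS:
        return DroneCommand(VOICE_COMMANDS[utterance.lower()], "voice")
    if gesture and gesture in GESTURE_COMMANDS:
        return DroneCommand(GESTURE_COMMANDS[gesture], "gesture")
    return None

if __name__ == "__main__":
    # A "palm_up" gesture with no spoken command maps to "ascend".
    print(fuse_inputs("palm_up", None))
    # A spoken "land" overrides any concurrent gesture.
    print(fuse_inputs("swipe_left", "land"))
```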


Enabling Multi-Material 3D Printing for Designing and Rapid Prototyping of Deformable and Interactive Wearables


Deformable surfaces with interactive capabilities open up opportunities for new ways of interacting with wearables. Yet current fabrication and prototyping techniques for deformable surfaces that are both flexible and stretchable are still limited by complex structural design and mechanical surface rigidity. We propose a simplified rapid fabrication technique that uses multi-material 3D printing to develop customizable, stretchable surfaces for mobile wearables, with interactive capabilities embedded during the 3D printing process. Our prototype, FlexiWear, is a dynamic surface with embedded electronic components that can adapt to body shape and movement and be applied to contexts such as healthcare and sports wearables. We describe our design and fabrication approach using a commercial desktop 3D printer, the interaction techniques it supports, and possible application scenarios for wearables and deformable mobile interfaces. Our approach aims to support the rapid development and exploration of deformable surfaces that adapt to body shape and movement.
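To illustrate one way electronic components could be embedded mid-print on a commercial desktop printer, the sketch below post-processes sliced G-code to pause the print at a chosen layer so components can be placed before printing resumes. The Cura-style `;LAYER:` markers, the Marlin-style `M0` pause command, and the pause-and-place workflow itself are assumptions for illustration, not necessarily the exact fabrication procedure used for FlexiWear.

```python
# Hypothetical post-processing sketch: insert a pause into sliced G-code at a
# chosen layer so electronic components can be placed before printing resumes.
# Whether FlexiWear uses this exact pause-and-place mechanism is an assumption.
import sys

PAUSE_COMMAND = "M0 ; pause for component placement"  # Marlin-style pause

def insert_pause(lines, target_layer):
    """Return G-code lines with a pause inserted right after the
    ';LAYER:<target_layer>' marker (Cura-style layer comments assumed)."""
    out = []
    for line in lines:
        out.append(line)
        if line.strip() == f";LAYER:{target_layer}":
            out.append(PAUSE_COMMAND + "\n")
    return out

if __name__ == "__main__":
    # Usage: python insert_pause.py input.gcode output.gcode 42
    src, dst, layer = sys.argv[1], sys.argv[2], int(sys.argv[3])
    with open(src) as f:
        gcode = f.readlines()
    with open(dst, "w") as f:
        f.writelines(insert_pause(gcode, layer))
```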


https://www.youtube.com/embed/_MBzv4eRdAs