Gesture-based interaction methods for smart glasses
ERA-NET CHIST-ERA (FWF)
The eGlasses project is focused on the development of an open platform in the form of multisensory electronic glasses, and on the integration and design of new intelligent interaction methods using the eGlasses platform. This is an initial development focused on long-term research and technological innovation in perceptual and super-perceptual (e.g. heart rate, temperature) computing. It is an emerging technology that is also focused on the creation of mobile, perceptual media. Perceptual media refers to multimedia devices with added perceptual user-interface capabilities. These devices integrate human-like perceptual awareness of the environment with the ability to respond appropriately. This can be achieved by automatically perceiving an object's properties and delivering information about the object's status as a result of reasoning operations.

For example, using the eGlasses it will be possible to control a device that is recognized within the field of view, via an interactive menu associated with the identified device. Other examples include presenting a recognized person's name, recognizing people with abnormal physiological parameters, and protecting against possible head injuries.

The platform will use currently available user-interaction methods as well as new methods developed in the framework of this project (e.g. a haptic interface), and will enable further extensions to introduce next-generation user-interaction algorithms. A further goal of the project is to propose and evaluate new, intelligent user interactions that are particularly useful for healthcare professionals and for people with disabilities or at risk of exclusion, and to create and evaluate behavioural models of these mobile users.
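The recognize-then-interact flow described above (identify a device in the field of view, attach a menu to it, let the wearer select an action) could be sketched as follows. This is purely an illustration: every class and function name here is hypothetical and is not part of the actual eGlasses platform API.

```python
# Hypothetical sketch of the eGlasses recognize-then-interact loop.
# None of these names come from the real platform; they only illustrate
# the flow: recognize a device -> build its menu -> select an action.

from dataclasses import dataclass, field


@dataclass
class RecognizedDevice:
    name: str
    actions: list = field(default_factory=list)  # commands the device accepts


def recognize(frame):
    """Stand-in for a vision pipeline that identifies a device in view."""
    # A real implementation would run object recognition on the camera frame.
    return RecognizedDevice(name="thermostat", actions=["temp_up", "temp_down"])


def build_menu(device):
    """Associate an interactive menu with the identified device."""
    return {i + 1: action for i, action in enumerate(device.actions)}


device = recognize(frame=None)
menu = build_menu(device)
# A gesture or haptic input would then select a menu entry, e.g. entry 1:
selected = menu[1]
```

The point of the sketch is only the separation of concerns: recognition produces a device description, and the interactive menu is derived from that description rather than hard-coded.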
Take a look at our work
Over the past years of the eGlasses project, we've done a lot of exciting work. Take a look.
FlexTiles: A Flexible, Stretchable, Formable, Pressure Sensitive, Tactile Input Sensor
P. Parzer, K. Probst, T. Babic, C. Rendl, A. Vogl, A. Olwal and M. Haller, 2016. In Ext. Abstracts CHI 2016: Proceedings of the 34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems, San José, CA, USA, 2016. To be published.
Designing natural user interfaces: From large surfaces to flexible input sensors
M. Haller, 2015. In HSI 2015: 8th International Conference on Human System Interactions, IEEE, pp. 15-16, 25-27 June 2015. (doi: 10.1109/HSI.2015.7170636)
A. Vogl, N. Louveton, R. McCall, M. Billinghurst and M. Haller, 2015. In HSI 2015: 8th International Conference on Human System Interactions, Warsaw, Poland, 2015.
K. Czuszynski, J. Ruminski, J. Wtorek, A. Vogl and M. Haller, 2015. In HSI 2015: 8th International Conference on Human System Interactions, Warsaw, Poland, 2015.
D. Lindlbauer, T. Aoki, R. Walter, Y. Uema, A. Höchtl, M. Haller, M. Inami and J. Müller, 2014. In UIST 2014: 27th ACM User Interface Software and Technology Symposium, Honolulu, Hawaii, USA, 2014.
D. Lindlbauer, T. Aoki, A. Höchtl, Y. Uema, M. Haller, M. Inami and J. Müller, 2014. In ACM SIGGRAPH 2014 Emerging Technologies, Vancouver, Canada, 2014.