Project Start:

January 2014

Run Time:

3 years


Gdansk University of Technology, University of Applied Sciences Upper Austria (Media Interaction Lab), Hochschule Luzern (iHomeLab), University of Lorraine (LCOMS), University of Luxembourg



Interaction Methods for
Smart Glasses

The eGlasses project focuses on the development of an open platform in the form of multisensory electronic glasses, and on the integration and design of new intelligent interaction methods using the eGlasses platform. This is an initial development aimed at long-term research and technological innovation in perceptual and super-perceptual (e.g. heart rate, temperature) computing. It is an emerging technology that also targets the creation of mobile, perceptual media.

Perceptual media refers to multimedia devices with added perceptual user interface capabilities. These devices integrate human-like perceptual awareness of the environment with the ability to respond appropriately. This can be achieved by automatically perceiving an object's properties and delivering information about the object's status as a result of reasoning operations. For example, using the eGlasses it will be possible to control a device recognized within the field of view through an interactive menu associated with the identified device. Other examples include presenting a recognized person's name, recognizing people with abnormal physiological parameters, and protecting against possible head injuries.

The platform will use currently available user-interaction methods as well as new methods developed within this project (e.g. a haptic interface), and will enable further extensions to introduce next-generation user-interaction algorithms. Furthermore, the project aims to propose and evaluate new, intelligent user interactions that are particularly useful for healthcare professionals and for people with disabilities or at risk of exclusion, and to create and evaluate behavioural models of these mobile users.

Take a look at our work

Over the past years of the eGlasses project, we’ve done a bunch of really cool things. Take a look.

  • Understanding the Everyday Use of HWC
FlexTiles: A Flexible, Stretchable, Formable, Pressure Sensitive, Tactile Input Sensor

P. Parzer, K. Probst, T. Babic, C. Rendl, A. Vogl, A. Olwal and M. Haller, 2016. in CHI EA ’16: Proceedings of the 34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems, San José, CA, USA, 2016. To be published.

Designing natural user interfaces: From large surfaces to flexible input sensors

M. Haller, 2015. in HSI15: 8th International Conference on Human System Interactions, IEEE, pp. 15-16, 25-27 June 2015. (doi: 10.1109/HSI.2015.7170636)

Understanding The Everyday Use Of Head-Worn Computers

A. Vogl, N. Louveton, R. McCall, M. Billinghurst and M. Haller, 2015. in HSI15: 8th International Conference on Human System Interactions, Warsaw, Poland, 2015.

Interactions Using Passive Optical Proximity Detector

K. Czuszynski, J. Ruminski, J. Wtorek, A. Vogl and M. Haller, 2015. in HSI15: 8th International Conference on Human System Interactions, Warsaw, Poland, 2015.

Tracs: Transparency-Control For See-Through Displays

D. Lindlbauer, T. Aoki, R. Walter, Y. Uema, A. Höchtl, M. Haller, M. Inami and J. Müller, 2014. in UIST14: 27th ACM User Interface Software and Technology Symposium, Honolulu, Hawaii, USA, 2014.

A Collaborative See-Through Display Supporting On-Demand Privacy

D. Lindlbauer, T. Aoki, A. Höchtl, Y. Uema, M. Haller, M. Inami and J. Müller, 2014. in ACM SIGGRAPH 2014 Emerging Technologies, Vancouver, Canada, 2014.


Department of Interactive Media
University of Applied Sciences Upper Austria
Softwarepark 11, 4232 Hagenberg, Austria / Europe

E-mail: haller[at]
Phone: +43 50804-22127
Fax: +43 50804-22199

Project Partners

Get in touch

Contact us