We present pressure sensors made entirely of textile components that work just like common FSR (Force-Sensing Resistor) sensors. Using an ordinary embroidery machine, our sensors can be rapidly applied to numerous pre-existing fabrics, in arbitrary shapes and sizes. Just like printed FSRs, they are straightforward to use in combination with basic readout electronics. We also developed a custom PCB that can be attached directly onto the augmented fabric. Even the wiring connecting the PCB to the sensors is done with textile materials, by embroidering the respective circuit paths. Several sensors can also be linked to build highly flexible pressure-sensitive configurations.
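To illustrate how such a sensor pairs with basic readout electronics, the following is a minimal sketch, assuming the textile FSR is wired as the lower leg of a simple voltage divider (fixed resistor to Vcc, sensor to GND, ADC at the midpoint). The fixed resistor value, the 10-bit ADC range, and the pressed/unpressed threshold are illustrative assumptions, not values from our design.

```python
R_FIXED = 10_000   # ohms, fixed divider resistor (assumed value)
ADC_MAX = 1023     # 10-bit ADC full scale (assumed)

def fsr_resistance(adc_counts: int) -> float:
    """Resistance of the textile FSR in ohms from a raw ADC reading.

    V_adc = Vcc * R_sensor / (R_FIXED + R_sensor)
    =>  R_sensor = R_FIXED * V_adc / (Vcc - V_adc)
    In ADC counts: R_sensor = R_FIXED * adc / (ADC_MAX - adc)
    """
    if adc_counts <= 0:
        return 0.0               # midpoint at GND: sensor fully compressed
    if adc_counts >= ADC_MAX:
        return float("inf")      # midpoint at Vcc: open circuit, no pressure
    return R_FIXED * adc_counts / (ADC_MAX - adc_counts)

def is_pressed(adc_counts: int, threshold_ohms: float = 50_000) -> bool:
    """Like a printed FSR, resistance falls as applied pressure rises."""
    return fsr_resistance(adc_counts) < threshold_ohms
```

On a microcontroller, `adc_counts` would come from an analog input pin; the conversion itself is plain Ohm's-law arithmetic and stays the same regardless of platform.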
As smart textiles become more present in our lives, investigating and designing textile interfaces has attracted more and more attention. Still, very little research has addressed how to design interactive elements of non-wearable textile interfaces for optimal recognition, perception, and interaction. In this paper, we present initial assumptions for designing such interfaces, derived from working intensively with our industry partners. These assumptions were further explored in interviews with experts from the field and finally tested in a user study. Based on the study, we define five design recommendations for textile interfaces and present several prototypes that demonstrate them in practice.
TextileUX proposes the creation of a pressure-sensitive, textile-based sensing platform to enable computational environments to be embedded seamlessly into our lives. In the 1990s, Mark Weiser and his colleagues at Xerox PARC introduced their vision of Ubiquitous Computing – a world in which “computers disappear into our everyday environments and weave themselves into our daily lives without being noticed”. In this spirit, TextileUX aims to develop unique smart-textile sensing know-how and broaden the knowledge base at the intersection of material, textile, and computer science.
Within Innovation Playground, we present Foxel, a modular smart-furniture concept that allows users to create their own interactive furniture on demand by simply snapping individual building blocks together. The building blocks offer flexibility in how they can be arranged as well as added interactive functionality, which makes them particularly well suited for re-configurable spaces. With Foxel, users can create their own digitally augmentable furniture. Our software framework enables topology tracking, peer-to-peer communication, and thus a context-based combination of the individual building blocks’ I/O functionalities. In this paper, we describe the underlying technology and demonstrate the system’s versatility with a set of real-world examples.
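The topology-tracking idea above can be sketched as an adjacency graph over snapped blocks, whose connected components are the assembled furniture pieces and whose per-block capabilities are combined per piece. This is an illustrative sketch, not the actual Foxel framework: the class name, block identifiers, and capability strings are invented for the example.

```python
from collections import defaultdict

class Topology:
    """Tracks which building blocks are snapped together (assumed API)."""

    def __init__(self):
        self.links = defaultdict(set)   # block id -> directly connected ids
        self.capabilities = {}          # block id -> set of I/O features

    def add_block(self, block_id, capabilities):
        self.capabilities[block_id] = set(capabilities)

    def snap(self, a, b):
        """Register that two blocks were physically snapped together."""
        self.links[a].add(b)
        self.links[b].add(a)

    def piece_of(self, start):
        """All blocks reachable from `start`, i.e. one furniture piece."""
        seen, stack = set(), [start]
        while stack:
            block = stack.pop()
            if block not in seen:
                seen.add(block)
                stack.extend(self.links[block])
        return seen

    def combined_io(self, start):
        """Union of I/O functions offered by one assembled piece."""
        return set().union(*(self.capabilities[b] for b in self.piece_of(start)))
```

For instance, snapping a pressure-sensing seat block to a lamp block yields one piece whose combined I/O set contains both capabilities, which a context layer could then map to behavior (e.g. sit down, lamp turns on).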
In FLASHED, we explore new technology in the field of flexible displays. A major problem of flexible displays is still the lack of interactivity, because none of the market-ready touch solutions are flexible (bendable). Therefore, one major goal of FLASHED is the development of novel, flexible touchscreen solutions, covering aspects such as cost-effectiveness (all-printed), sustainability (material recycling and end-of-life disposal), and energy efficiency (self-sustaining sensor arrays). Moreover, the FLASHED project puts strong emphasis on user-friendly interfaces. The FLASHED device itself consists of a flexible display, a touchscreen layer based on a pressure-sensing touch array (utilizing ferroelectric polymers), and a haptic/acoustic feedback layer based on a relaxor ferroelectric material.
To promote physical activity at the workplace, we suggest a paradigm shift in workplace design towards an integrated activity-promoting environment that supports the adoption of a physically active office workflow. Our design concept involves an ergonomically designed workspace that integrates traditional office furniture with elements such as active seats, height-adjustable standing desks, and whiteboards to form an interconnected workplace environment. Furthermore, the office furniture is smoothly integrated with hardware-related structural elements such as desktop computers, notebooks, tablets, or large-scale interactive surfaces that extend the design with media support. Based on that, the goal of the Active Office is to motivate users to draw full benefit from the provided structure. Consequently, we encourage the adoption of a new way of working “in-motion”, characterized by regular switches between different tasks, workstations, and postures.
Situation awareness is essential to the decision-making process in emergency response and control settings. Situation maps used for planning and decision making are fundamental to correct and complete situation awareness, and they have to fulfill the requirements of various emergency cases. Moreover, operators often need to coordinate rescue squads from different organizational units at varying levels of detail. Based on these requirements, a highly flexible and dynamic situation map that provides both an overview and sufficient detail is needed. However, the analog or static situation maps that are currently prevalent do not support these requirements well. Therefore, a new interactive situation map needs to be established. With new design metaphors (space and furniture design) and interaction techniques (Natural User Interfaces), work efficiency and situation awareness within emergency control centers can be improved.
The eGlasses project focuses on the development of an open platform in the form of multisensory electronic glasses and on the design and integration of new intelligent interaction methods using the eGlasses platform. This is an initial development focused on long-term research and technological innovation in perceptual and super-perceptual (e.g. heart rate, temperature) computing. It is an emerging technology that also targets the creation of mobile, perceptual media. Perceptual media refers to multimedia devices with added perceptual user-interface capabilities: these devices combine human-like perceptual awareness of the environment with the ability to respond appropriately. The platform will use currently available user-interaction methods as well as new methods developed within the project (e.g. a haptic interface), and will allow further extensions to introduce next-generation user-interaction algorithms.
While current systems for controlling industrial robots are very efficient, their programming interfaces are still complicated, time-consuming, and cumbersome. In this paper, we present AHUMARI, a new human-robot interaction method for controlling and programming a robotic arm. With AHUMARI, operators use a multi-modal programming technique that combines an optically tracked 3D stick, speech input, and an Augmented Reality (AR) visualization. We also implemented a prototype simulating status-quo Teach Pendant interfaces, as they are commonly used for programming industrial robots. To validate our interaction design, we conducted a user study to gather both quantitative and qualitative data. In summary, the AHUMARI interaction technique was rated superior across all aspects: it was found to be easier to learn and to provide better performance in terms of task completion time as well as positioning accuracy.
Over the last decade, touch-sensing devices have become more and more important. Most researchers have tried to improve multi-touch by introducing capacitive, resistive, or optical sensing devices. Although most of these already provide multi-touch sensing, it is still often not possible to track input pressure efficiently. Tracking pen and touch separately, in combination with pressure tracking, opens up new possibilities for user interfaces and interaction design: entirely new tools or even new interaction modes become possible through the simultaneous use of pen and touch (e.g. touch input could be used for manipulation or navigation, whereas pen input could be used for accurate annotations). With ecoTouch, we present a novel sensing device based on a pyro- and piezoelectric sensor matrix, screen-printed on a flexible film, that can detect changes in temperature and pressure, respectively.
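Because pyro- and piezoelectric elements respond to *changes* rather than absolute values, a readout loop typically compares successive scans of the matrix. The following is a minimal sketch under that assumption (it is not the ecoTouch firmware); the frame format and threshold are invented for the example.

```python
def detect_events(prev_frame, frame, threshold=0.2):
    """Return (row, col) cells whose reading changed by more than
    `threshold` between two successive scans of the sensor matrix.

    A cell that changed may indicate a new touch (pressure change,
    piezoelectric response) or a warm fingertip arriving (temperature
    change, pyroelectric response)."""
    events = []
    for r, (prev_row, row) in enumerate(zip(prev_frame, frame)):
        for c, (prev_v, v) in enumerate(zip(prev_row, row)):
            if abs(v - prev_v) > threshold:
                events.append((r, c))
    return events
```

A steady press produces no further deltas after the initial event, so a practical driver would additionally integrate the signal or latch cell state; the sketch only shows the change-detection step.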
NiCE (Natural user Interfaces for Collaborative Environments) was one of 14 Austrian Research Studios, starting in October 2008. Over three years, we explored novel large interactive surfaces and new interaction techniques and embedded them into our daily-life environments (e.g. living rooms, conference rooms). We primarily focused on new interaction methods and on building new hardware. The research goal was to design, develop, and evaluate natural user interfaces for collaborative environments that enable everyone, not just experts, to use our interactive systems. Thus, the focus was on creating natural systems and interfaces that increase users’ productivity.