A Multitouch Notice Board Fostering Social Interaction

Abstract: We report on an alternative OCGM (Objects, Containers, Gestures, Manipulations) interface for a bulletin board, where users can pin notes or drawings and share content with others. By exploiting direct and continuous manipulations, as opposed to discrete gestures, to explore containers, the proposed interface supports a more natural and immediate interaction. It also manages multiple simultaneous users, allows the creation of local multimedia content and the connection to social networks, and provides a suitable working environment for cooperative and collaborative tasks on multi-touch setups such as touch tables, interactive walls, or multimedia boards.

Authors: S. A. Iacolina, M. Corrias, O. Pontis, A. Soro, F. Sorrentino, R. Scateni.
A Multitouch Notice Board Fostering Social Interaction.
ACM SIGCHI Italian Chapter (CHItaly 2013), 13:1–13:4.
Trento, Italy, September 2013.
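
The continuous-manipulation idea can be pictured with a small sketch: a container of notes whose opening tracks the drag distance at every frame, instead of toggling open or closed on a discrete gesture. The Python fragment below is only an illustration under assumed names (NoteContainer, on_drag); it is not code from the paper.

    # Minimal sketch (not from the paper): a container of notes whose spread
    # follows the drag distance continuously, so the user can stop, reverse,
    # or settle anywhere in between, rather than firing a discrete "open" gesture.
    class NoteContainer:
        def __init__(self, notes, max_spread=120.0):
            self.notes = notes            # note objects with .x and .y (pixels)
            self.max_spread = max_spread  # spacing at full opening
            self.spread = 0.0             # current opening, updated every frame

        def on_drag(self, drag_distance):
            # Opening is proportional to how far the finger has moved so far.
            self.spread = max(0.0, min(drag_distance, self.max_spread))
            self._layout()

        def _layout(self):
            # Fan the notes out horizontally around the container centre,
            # spaced according to the current spread.
            n = len(self.notes)
            for i, note in enumerate(self.notes):
                note.x = self.spread * (i - (n - 1) / 2.0)
                note.y = 0.0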

Evaluation of User Gestures in Multi-touch Interaction: a Case Study in Pair-programming.

Abstract: Natural User Interfaces are often described as familiar, evocative, intuitive, predictable, and based on common skills. Though unquestionable in principle, such definitions do not provide the designer with effective means to design a natural interface or to evaluate one design choice against another. Two main issues in particular remain open: (i) how do we evaluate a natural interface, and is there a way to measure 'naturalness'; (ii) do natural user interfaces provide a concrete advantage in terms of efficiency with respect to more traditional interface paradigms? In this paper we discuss and compare observations of user behavior in the task of pair programming, performed at a traditional desktop versus a multi-touch table. We show how the adoption of a multi-touch user interface fosters a significant, observable, and measurable increase in nonverbal communication in general and in gestures in particular, which in turn appears related to the users' overall performance in the task of algorithm understanding and debugging.

Authors: A. Soro, S. A. Iacolina, R. Scateni, S. Uras.
Evaluation of User Gestures in Multi-touch Interaction: a Case Study in Pair-programming.
ICMI 2011, 161–168.
Alicante, Spain, November 2011.

MORAVIA: A Video-Annotation System Supporting Gesture Recognition

Abstract: Gestures and gesticulation play an important role in communication, particularly in public speech. We describe here the design, development, and initial evaluation of MORAVIA (MOtion Recognition And VIdeo Annotation): a collaborative web application for (semi)automatic gesture annotation. MORAVIA was conceived as a support for the automatic evaluation of a speech based on its non-verbal components, that is, as independent as possible of the verbal content. We adopt an evaluation model based on gesture-related quality metrics provided by experts in the education and psychology domains. The final goal is to design and implement a system able to detect gestures using a video camera and a depth camera, such as the Microsoft Kinect, to track the position and movements of the speaker. The video-annotation web application then allows collaborative review and analysis of the different video sequences, which is useful both to domain experts, as a research tool, and to end users, for self-evaluation.

Authors: M. Careddu, L. Carrus, A. Soro, S. A. Iacolina, R. Scateni.
MORAVIA: A Video-Annotation System Supporting Gesture Recognition.
ACM SIGCHI Italian Chapter (CHItaly 2011) Adjunct Proceedings.
Alghero, Italy, September 2011.
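
As an illustration of the kind of motion measure such a system might compute, the sketch below derives a per-interval "gesticulation energy" from hand-joint displacements, assuming a skeleton tracker (e.g. the Kinect SDK) already provides 3D joint positions per frame. The function names and the metric itself are illustrative assumptions, not MORAVIA's actual pipeline.

    # Illustrative sketch (not MORAVIA's pipeline): sum of hand-joint speeds
    # over a sequence of skeleton frames, as a rough proxy for gesticulation.
    import math

    def joint_speed(prev, curr, dt):
        """Euclidean speed of one joint between two frames, in metres/second."""
        return math.dist(prev, curr) / dt

    def gesticulation_energy(frames, dt=1 / 30.0, joints=("hand_left", "hand_right")):
        """Accumulate hand-joint speeds over a list of skeleton frames.

        `frames` is a list of dicts mapping joint name -> (x, y, z) in metres.
        A higher value suggests more (or larger) gestures in that interval.
        """
        energy = 0.0
        for prev, curr in zip(frames, frames[1:]):
            for j in joints:
                if j in prev and j in curr:
                    energy += joint_speed(prev[j], curr[j], dt)
        return energy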

Walk, Look and Smell Through

Abstract: Human-computer interaction is typically constrained to the use of sight, hearing, and touch. This paper describes an attempt to get over these limitations. We introduce smell into the interaction, with the aim of conveying information through scents, i.e., giving meaning to odours, and of understanding how people would appreciate such extensions. We discuss the design and implementation of our prototype system, which manages an immersive environment where the user interacts by means of visual, auditory, and olfactory information. We have implemented an odour emitter controlled by a presence sensor: when the system perceives the presence of a user, it activates audio/visual content to encourage interaction, and a specific scent is then diffused into the air to augment the perceived reality of the experience. We discuss technical difficulties and initial empirical observations.

Authors: V. Cozza, G. Fenu, R. Scateni, A. Soro.
Walk, Look and Smell Through.
ACM SIGCHI Italian Chapter (CHItaly 2011) Adjunct Proceedings (poster).
Alghero, Italy, September 2011.
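
The presence-triggered sequence described in the abstract can be pictured as a simple control loop. The hardware wrappers below (sensor, player, emitter) are hypothetical placeholders standing in for the prototype's actual devices, not its real API.

    # Sketch of the interaction loop, with hypothetical device wrappers.
    import time

    class Installation:
        def __init__(self, sensor, player, emitter, scent_delay=3.0):
            self.sensor = sensor            # presence sensor (hypothetical wrapper)
            self.player = player            # audio/visual content playback
            self.emitter = emitter          # odour diffuser controller
            self.scent_delay = scent_delay  # seconds between content start and scent

        def run(self):
            while True:
                if self.sensor.presence_detected():
                    # First attract the visitor with audio/visual content...
                    self.player.play("intro_scene")
                    time.sleep(self.scent_delay)
                    # ...then diffuse a scent to augment the perceived reality.
                    self.emitter.diffuse("forest", duration=5.0)
                time.sleep(0.1)  # polling interval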

Multi-touch and Tangible Interface: Two Different Interaction Modes in the Same System

Abstract: We present a system built around the idea of letting several users interact simultaneously with cheap and robust hardware, using natural gestures supported by an intuitive user interface. A projector and a camera are placed underneath a Plexiglas sheet, framed with an array of infrared LEDs, all set into a wooden table box. This allows multiple users (up to four or five) to move freely around the box and manipulate the objects retro-projected on the screen, using a tangible interface designed to offer a few simple operations: geometric transforms (rotation, translation, and scaling), drawing, erasing, and color selection. All of these are performed either with a custom-built IR LED pen or directly with the fingers. The main purpose of the project is to offer an instrument of tangible interaction to classrooms of naive users (i.e., professors and students from neither technology nor science fields) in a university environment.

Authors: D. Cabiddu, G. Marcias, A. Soro, R. Scateni.
Multi-touch and Tangible Interface: Two Different Interaction Modes in the Same System.
ACM SIGCHI Italian Chapter (CHItaly 2011) Adjunct Proceedings (poster).
Alghero, Italy, September 2011.
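
The geometric transforms mentioned in the abstract are commonly derived from the motion of two touch points; the sketch below (not the authors' code) recovers translation, rotation, and uniform scale from the old and new positions of a finger pair.

    # Standard two-finger manipulation: translation, rotation, and scale
    # computed from the old and new positions of two touch points.
    import math

    def two_finger_transform(p1_old, p2_old, p1_new, p2_new):
        """Return (dx, dy, dtheta, scale) mapping the old touch pair to the new one."""
        # Translation: displacement of the midpoint between the two fingers.
        mx_old = (p1_old[0] + p2_old[0]) / 2.0
        my_old = (p1_old[1] + p2_old[1]) / 2.0
        mx_new = (p1_new[0] + p2_new[0]) / 2.0
        my_new = (p1_new[1] + p2_new[1]) / 2.0
        dx, dy = mx_new - mx_old, my_new - my_old

        # Rotation: change in angle of the segment joining the two fingers.
        ang_old = math.atan2(p2_old[1] - p1_old[1], p2_old[0] - p1_old[0])
        ang_new = math.atan2(p2_new[1] - p1_new[1], p2_new[0] - p1_new[0])
        dtheta = ang_new - ang_old

        # Scale: ratio of the distances between the fingers (guarded against zero).
        scale = math.dist(p1_new, p2_new) / max(math.dist(p1_old, p2_old), 1e-6)

        return dx, dy, dtheta, scale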