A Platform To Augment Web Applications With Multimodal Interactions

Diego Paez and Andrés Rodríguez

LIFIA, Facultad de Informática, UNLP
50 & 120 1st Floor, 1900 La Plata, Argentina
{diego.paez,andres.rodriguez}@lifia.info.unlp.edu.ar
http://lifia.info.unlp.edu.ar/

1 Introduction

Web applications are increasingly popular. Huge numbers of people access applications from anywhere, using new devices that offer the potential for different forms of interaction. Web application designers, developers and engineers face every day the need to support these new interactions in their products. We propose a platform that helps a multimodal development team add new modalities to a web application. These modalities can work standalone or together, generating a multimodal experience. The platform helps to widen the set of supported interactions beyond the usual modalities found on desktop computers and mobile devices, including: keyboard, mouse, optic (pens), haptics (i.e. touch devices, vibro-tactile feedback) and gestures (e.g. Microsoft Kinect or Leap Motion). While this set offers a good range of interaction capabilities, it is extremely rare to find all of these together on a single device, and applications that use all, or even some, of them are scarcer still. With the proposed platform, we can integrate different modalities found on different devices into the context of a web application and distribute the interaction. We have a simple working demo application using two different modalities, haptics and air gestures, which can act as two separate modalities or as a single multimodal interaction.

2 Related Work

Few contemporary works address the main subject of this paper: adding support for multimodal interactions to web applications. However, two interesting solutions can be cited:
– i*Chameleon [4] introduces an MVC framework for developing multimodal applications. It combines the separation-of-responsibilities principle with the different roles involved in the development of a multimodal application. This results in an easy-to-follow workflow and was an inspiration for our work. A key difference from our solution is that i*Chameleon is a framework, whereas we propose a platform (i.e. Plusultra instances can handle multiple web applications, not just multiple clients).
– Multimodal Framework for Mobile Interaction [1] targets mobile devices because they offer a rich set of sensors and actuators and are very popular. We share this vision, but we try to integrate every device capable of accessing the web, not only mobile devices. Another important aspect of this work is that it follows the W3C guidelines [5] [6], but the authors then validate their work by developing a native (Android) application. We took some design characteristics from the architecture proposed by the W3C, but our work is not fully compliant with their guidelines.

3 The Platform

The platform architecture is composed of two separate components: a distributed publisher/subscriber system, named Plusultra, and an endpoint dependency, known as Gyes. They talk to each other using a top-level protocol designed specifically for the platform.

Fig. 1. A quick view of the platform.

Figure 1 shows these components. Circles represent a Plusultra instance, working as a message gateway; surrounding it are the web clients and the modality device(s). They use the Gyes component to connect to the system and exchange modality information.


The following sections provide more information about each part of the platform.

3.1 Communication

The communication component consists of a small Node.js [2] service. It handles messages and, using an in-memory store, can quickly distribute them between modalities and applications. It provides a WebSocket server to which Gyes modules can connect. Every service instance works as a distributed publisher/subscriber system, so it can scale easily; it is event-based (like web applications themselves) and offers a clear separation between modalities and web clients [3].

3.2 Modalities and Applications

The endpoint component, Gyes, can run on the client side of the web application as well as on the modality devices. It is mainly used by the Web Developer, the Device Engineer and possibly the Modality Designer. It is consumed like any other dependency in the development stack and, in general terms, it allows connecting to the Plusultra component, adding new modalities to a web application, propagating data recognized by the modalities and, where possible, acting by synthesizing data.

4 Conclusions & Further Work

The platform introduced here makes it possible to augment web applications with multimodal interactions. In addition, the proposed architecture introduces a novel approach by distributing the fusion and fission engines across the clients. Using this platform, web developers can consume modality data in the context of a web application, and Device Engineers have a way to extend Gyes in order to connect new modalities. Further work will include architectural and implementation details.
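One way client-side fusion could work is to merge events from different modalities that arrive close together in time into a single multimodal event. The time-window policy and all names below are assumptions for illustration only, not the platform's actual fusion engine:

```javascript
// Illustrative client-side fusion: two events from different modalities
// arriving within a time window are fused into one multimodal event.
function makeFuser(windowMs) {
  let pending = null; // last unfused event, waiting for a partner
  return function fuse(event) {
    if (pending &&
        event.t - pending.t <= windowMs &&
        event.modality !== pending.modality) {
      const fused = { modalities: [pending.modality, event.modality], t: event.t };
      pending = null;
      return fused; // one multimodal event
    }
    pending = event;
    return null; // buffer and wait for a partner event
  };
}

const fuse = makeFuser(200);
fuse({ modality: 'haptic', t: 1000 }); // buffered, returns null
const fused = fuse({ modality: 'gesture', t: 1100 }); // within 200 ms: fused
console.log(fused.modalities); // [ 'haptic', 'gesture' ]
```

Running such a fuser in each client, rather than in a central engine, is what distributing fusion across the clients would look like in this sketch.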

References

1. Francesco Cutugno, Vincenza Anna Leano, Roberto Rinaldi, and Gianluca Mignini. Multimodal framework for mobile interaction. In Proceedings of the International Working Conference on Advanced Visual Interfaces, pages 197–203. ACM, 2012.
2. Ryan Dahl. Node.js. 2009. Last access: 2014-08-03.
3. Anne-Marie Kermarrec. The many faces of publish/subscribe. ACM Computing Surveys, 35(2):114–131, 2003.
4. Kenneth W. K. Lo, Will W. W. Tang, Grace Ngai, Alvin T. S. Chan, Hong Va Leong, and Stephen C. F. Chan. i*Chameleon: a platform for developing multimodal application with comprehensive development cycle. In Proceedings of the 28th Annual ACM Symposium on Applied Computing, pages 1103–1108. ACM, 2013.
5. W3C. Multimodal Interaction Framework. 2003. Last access: 2013-07-06.
6. W3C. Multimodal Architecture and Interfaces. 2012. Last access: 2013-07-06.