NoseTapping: What else can you do with your nose?

Ondrej Polacek
Faculty of Electrical Engineering, CTU in Prague, Karlovo nam. 13, 12135 Prague, Czech Republic

Thomas Grill
ICT&S Center, University of Salzburg, Sigmund Haffner G. 18, 5020 Salzburg, Austria

Manfred Tscheligi
ICT&S Center, University of Salzburg, Sigmund Haffner G. 18, 5020 Salzburg, Austria


ABSTRACT
Touch-screen interfaces on smart devices have become ubiquitous in our everyday lives. In specific contextual situations, however, the capacitive touch interfaces used on current mobile devices are not accessible, for example when wearing gloves during a cold winter. Although the market has responded by providing capacitive styluses or touchscreen-compatible gloves, these solutions are not widely accepted or appropriate in such particular situations. Using the nose instead of the fingers is an easy way to overcome this problem. In this paper, we present in-depth results of a user study on nose-based interaction. The study was complemented by an online survey to elaborate the potential and acceptance of the nose-based interaction style. Based on the insights gained in the study, we identify the main challenges of nose-based interaction and contribute to the state of the art of design principles for this interaction style by adding two new design principles and refining one existing principle. In addition, we investigated the emotional effect of nose-based interaction based on the user experiences observed during the user study.

Keywords Design Guidelines, NoseTapping, Touchscreen, Gloves, Winter

Categories and Subject Descriptors H.5.2. [User Interfaces]: Input devices and strategies

General Terms Design; Human Factors; Performance.

1. INTRODUCTION

The pervasion of our everyday lives with touch-screen based computing devices like smartphones and tablets is evident and still growing. The technology, however, hinders people from interacting with their devices in particular contextual situations.


Figure 1: Nose-based interaction in a cold outdoor context

Such situations are usually characterized by parameters of the environment or specific circumstances that prevent users from properly using their fingers to interact with the capacitive touch screen. Capacitive touch displays used today require a conductive material like human skin for operation. Interaction is therefore not possible when people wear common gloves, as these are typically not conductive, which happens especially during cold winter days. The market has already reacted to such situations by providing solutions like capacitive styluses or touch-screen compatible gloves that allow users to interact with their devices while wearing them. However, these solutions are not widely spread yet, and we can observe people removing their gloves, for example, to answer a phone call. Another possibility to interact with a touch-screen enabled device is to use an uncovered body part such as the nose, tongue, chin, etc. Using different interaction modalities and interaction styles implies not only feasibility aspects but also acceptance aspects, which are crucial to the success of a particular interaction style. In this paper, we focus on nose-based interaction, in which people tap or swipe with their nose on a touch screen (see Figure 1). We found a strong need for an input modality which could be used in such contextual situations. We also found that nose-based interaction is suitable for short use cases; however, people are not willing to use such interaction for more complex functionality like, for example, typing a message. In order to investigate the potential of using such body

parts in different contextual situations, we first conducted a user study and an online survey, both aiming at exploring the potentials and problems of nose-based interaction. The results provide us with insights into the potentials and hurdles of interacting with one's nose and allow us to elaborate design guidelines for nose-based interaction.

2. RELATED WORK

As mentioned above, wearing gloves prevents users of devices with capacitive touch screens from interacting with them. However, users can still use uncovered parts of skin such as the nose, cheeks, or tongue. An example is the Tongue Drive System (TDS) developed by Kim et al. [3], which allows a user to control computers, wheelchairs, etc. Such alternative interaction styles are currently used particularly in the area of assistive technologies. With the CheekTouch system, Park et al. [7] investigated the first approach of using the cheek to switch to different interaction modes applicable during a call, when the mobile phone is attached to the cheek. In particular, they use finger-based gestures on the back side of the phone to invoke specific commands. Already in 1991, Henry et al. [2] investigated the interaction potential of different parts of the body and identified several of them as suitable for interacting with different, mostly virtual, objects. Their first design ideas focused on a nose gesture interface device extending the functional capabilities of the nose. For interacting with touch-screen devices, such an extension is not needed, as we rely on the capacitive conductivity of the nose itself.

Using the nose for interacting with a touch-screen device is a novel approach to using such devices in specific contextual situations. Zarek et al. [11] conducted a survey with 13 participants to get a first idea about non-standard touch interaction with mobile devices, followed by an experiment with eight participants that focused on different interactions with the nose. Based on the results, they derived five design guidelines for nose-based interaction. Another approach is the Finger-nose stylus [10]. As shown in Figure 2, it combines a touch stylus with an artificial nose extension, allowing users to interact with their nose while keeping the eyes far enough from the phone to properly see what is on the screen. An existing iPhone application called NoseDial (http://www.nosedial.com) is designed to be used with the nose; it focuses on dialing a number by nose interaction. Further, several sports applications exist, e.g., for counting push-ups: the phone is placed below the user's head and each push-up is registered via the proximity or touch sensor of the device. Around-device interaction [1, 4, 5] uses infrared proximity sensors to track the hand above or beside the device's screen, which is also feasible when wearing gloves. One major issue is that glove styles and sizes vary and influence the scanning accuracy of such approaches.

Figure 2: A design approach for a Finger-nose stylus [10]

Rico and Brewster [8] investigated the social acceptability of several gestures, including nose-based gestures. They found that nose-based gestures draw much attention in comparison to other gestural interaction styles and that people would not prefer nose tapping gestures to interact with their mobile phones. However, they did not put their research in the context of cold weather and touch-screen interaction. Voice input can be used on some phones during the winter; however, speech recognition still has accuracy issues, as well as social acceptability issues when talking to one's phone in public spaces. Zarek et al. [11] conducted an initial user study focusing on basic interaction and design properties of nose-based interaction. Our work focuses on concrete interaction requirements in a specific contextual situation, identified by means of in-depth interviews with real tasks with 14 people and an online survey with 92 respondents.

3. USER STUDY

The aim of the user study was to gather qualitative data from the users regarding non-traditional phone interaction in winter and their opinions and thoughts on nose-based interaction. In order to achieve this, we conducted semi-structured interviews and asked the participants to perform simple tasks in which nose-based interaction was employed. The interviews and tasks provided us with rich qualitative data. Based on this data, we conducted an online survey (see Section 4) which quantified our results.

3.1 Participants

A total of 14 users of phones or tablets with touch screens were recruited for the interviews (6 women, 8 men, mean age = 28, stdev = 3.0) from university students and through social media. Their background was not homogeneous and stemmed from a multitude of fields. As a reward for participation, they received the gloves used in the study, worth 2–5 €. All of them were computer users, 12 of them use a smartphone regularly, and 9 of them were regular tablet users. The screening condition for participation was to have previous experience with touch-screen devices (either phone, tablet, or both).

3.2 Study Setup

Participants were instructed to bring their own devices if possible. This was done for several reasons. First, people preferred to interact with their own devices for hygienic reasons, although the devices were cleaned thoroughly before and after every user test. Second, people should be comfortable using and interacting with the particular device in order to have a common basis for the qualitative results obtained during the interviews. The participants brought iOS devices (8 iPhones, 7 iPads) and Android devices (4 phones, 2 tablet computers). The condition for participation was to regularly use either a smartphone or a tablet. Two participants who did not own a smartphone were provided with a Samsung Galaxy Nexus. For the five participants who did not bring a tablet, we provided either a Samsung Note 10.1 or an iPad. Screen sizes of the phones varied from 2.8" up to 4.65", and tablet screen sizes ranged from 9.7" to 10.1".

A typical session lasted approximately 50 minutes and was organized as follows:

1. Welcoming. The participant was welcomed and the procedure was explained. Further, they were asked to fill out a demographic questionnaire and sign a consent form that allowed us to record the evaluation session.

2. Pre-test interview. The participant was interviewed with a focus on previous experiences with non-traditional interaction and using devices in the winter.

3. Phone interaction. The participant was asked to perform typical tasks with their phone. The tasks included (i) unlocking the phone, (ii) contact list navigation and calling, (iii) composing an SMS, and (iv) taking a picture and browsing a gallery. After finishing the phone tasks, the participant's emotions were assessed by means of the Geneva Emotion Wheel (GEW) [9], which was used to explore the pleasantness and control dimensions of emotions.

4. Tablet interaction. The participant was asked to interact with a tablet computer using her or his nose. The tasks included (i) unlocking the tablet, (ii) running a web browser and entering a web address, and (iii) finding particular information on a web page. Again, the GEW was used to assess the participant's emotions.

5. Post-test interview. A post-test interview was conducted, focusing on task debriefing and clarifying open issues.

During the interaction tasks, the participants were asked to wear gloves and control the phone with their nose. Participants did not use any optimized technology for the nose-based interaction. In order to capture their thoughts and experiences immediately, the participants were instructed to use a think-aloud protocol. The whole session was audio-recorded for further evaluation and reference purposes.

3.3 Results & Discussion

After conducting the study, the collected materials were analyzed. In the following, we discuss our insights based on the analyzed material.

Previous Experience
Almost all participants in the user study had already experienced interaction problems with their touch-screen phone in winter (13 of 14 participants); 9 of them solved this by pulling their gloves off and thus exposing their fingers to uncomfortably cold air. One participant also owns touch-screen gloves; however, he uses them only when it is really freezing outside, as he complains about their accuracy. Further, the device tends to slip out of the hands while holding it with the gloves. Three participants reported that they had already used their nose for picking up a phone call or unlocking their phone. Two of them found using the nose for touch-screen interaction visually demanding, and one remarked that the device gets filthy after a while, especially in winter when having a runny nose. Two participants tried using the tongue to interact with the device, but they remarked that "It was weird" and "The device got dirty". The need for an alternative interaction in winter was nicely demonstrated by two participants: one decided to use fingerless gloves to avoid the problem, and another reported that she asks friends who are around and do not wear gloves to pick up an incoming phone call for her. Regarding the situations in which the phone is used outdoors in cold temperatures, we could rank the different scenarios as shown in Table 1.

Table 1: Contextual scenarios during mobile phone usage in outdoor contexts in winter.

  # of part.   Scenario description
  9            Walking outside
  7            Waiting outside for other people
  5            Waiting for public transport
  4            During sports activities
  1            Walking a dog

We also asked about specific use cases that participants are willing to perform when wearing gloves. The identified use cases are summarized in Table 2. Given the inherent communication functionality of mobile phones, it is not surprising that the participants prioritized voice- and text-based communication with other people, i.e., calling and texting.

Table 2: Identified phone use cases when wearing gloves.

  # of part.   Use cases
  14           Incoming phone call
  13           Outgoing phone call, incoming message
  12           Message sending
  9            E-mail checking
  8            Browsing the web
  7            E-mail composing, playing a sound/video, taking a picture/video
  6            Using pedestrian navigation
  4            Searching for a public transport connection
  3            Other notifications (incoming e-mail, Facebook post, appointment reminder, ...), time checking, browsing a gallery
  2            Checking weather, reading RSS feeds, purchasing an SMS ticket
  1            Gaming, calendar, managing a todo (shopping) list, checking Facebook, reading study materials

Apart from traditional tapping on the screen with the fingers, participants also use other modalities – phone gestures, voice input, and styluses – to interact with their mobile touch-screen device. By phone gestures, we mean tilting the phone in games (3 participants) or shaking the phone to delete a text (3). Voice input was used at least once by eight participants, but most of them do not use it regularly (7) as it did not work accurately enough (6) or they felt embarrassed in public (1). Four participants had already used a stylus to interact with their phone, but they do not use it anymore as it is an extra device they need to carry around. The results presented in this section mostly concern a touch-screen enabled phone. We also asked specifically about previous experience with the tablet; however, we learned that the tablet is used mainly indoors (7). Only two of the participants also use a tablet outdoors, and they always pull their gloves off.

Phone Tasks
After the pre-test interview, participants were asked to perform several tasks with their nose. One participant was not able to finish any task and is therefore considered an outlier; she gave up after a few unsuccessful initial taps, saying "It doesn't work for me". The rest of the participants were able to finish all tasks.

Table 3: Problems identified during the nose-based phone tasks the participants performed during the study.

  # of part.   Identified problems
  8            Vision problems
  5            Accuracy issues
  1            Nose shape issues

As depicted in Table 3, mostly vision problems occurred. Of the 8 participants who experienced vision problems, 4 tried to compensate for them by closing one eye. The second most severe problem, which occurred with one third of the participants, is related to accuracy. These issues are often linked to the visual perception problems caused by the necessary high frequency of focus changes. One participant mentioned that the nose shape might influence this interaction modality ("It is quite easy as I have a pointy nose"). During the test, 12 of the 14 participants wore woolen gloves and 2 wore leather gloves of different thickness. Participants were asked to accomplish the following tasks:

1. Unlocking the phone. Eight participants used a swiping gesture to unlock the phone. The success rate was 75%; one participant even used the chin to unlock the device, and one needed several attempts to make the correct gesture. Two participants had to enter a PIN code after unlocking the phone, and both needed several attempts. Three participants used a pattern lock screen, which requires a rather complicated gesture. Entering the gesture was quite challenging for all of them and soon led to frustration.

2. Searching for a contact and calling. This task was relatively simple and all participants succeeded. One identified problem was that a participant invoked different functionality than expected by pressing a contact for too long. Three participants had problems hanging up the phone call as the screen went black while holding the device vertically. This behavior is implemented in many phones to save battery during a phone call; however, it causes problems during nose-based interaction. As a result, to properly support nose-based interaction, changes to this standard behavior of mobile phones would need to be implemented.

3. Composing a short text message. Typing was the most problematic task, as the keyboard buttons are relatively small and participants produced a lot of errors. Five of the participants tried to enlarge the buttons by switching the display to landscape mode. Two of the participants even used word prediction to simplify text input with the nose. Another two participants reported escalated visual demands during text input. Most participants (13 of 14) used a standard QWERTY keyboard and one used a Multi-tap keyboard (the conventional text entry method on non-touchscreen mobile phones). Using the Multi-tap keyboard turned out to be tricky, as the participant was sometimes not fast enough to tap again before the key timeout was reached, which resulted in typing a wrong letter (a minimal sketch of a configurable multi-tap timeout follows after this list).

4. Taking a picture and browsing a gallery. Taking a picture was easy for most participants (10). However, three of them reported problems with aiming: it was difficult to aim at a target because participants were too close to the screen and were not able to see the target. They had to hold the phone in a fixed position and then move the head to take the picture. Browsing a gallery by swipe gestures was not problematic at all.
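To illustrate the timeout issue observed with the Multi-tap keyboard, the following minimal sketch shows a multi-tap key whose timeout is a parameter rather than a fixed value. The class name MultiTapKey and the timeoutMs parameter are our own illustrative assumptions; this is not the keyboard used by the participant in the study.

```java
// Sketch only: a multi-tap key with a configurable timeout. A longer timeout
// would give slow nose taps time to cycle to the next letter instead of
// committing a wrong one. Names and values are illustrative assumptions.
public class MultiTapKey {
    private final char[] letters;   // e.g. {'a', 'b', 'c'} for the "2" key
    private final long timeoutMs;   // could be lengthened for nose-based input
    private long lastTapTime = -1;
    private int index = -1;

    public MultiTapKey(char[] letters, long timeoutMs) {
        this.letters = letters;
        this.timeoutMs = timeoutMs;
    }

    /** Returns the currently selected letter after a tap at time "now" (ms). */
    public char tap(long now) {
        if (lastTapTime >= 0 && now - lastTapTime <= timeoutMs) {
            index = (index + 1) % letters.length;  // cycle within the timeout
        } else {
            index = 0;                              // timeout elapsed: start over
        }
        lastTapTime = now;
        return letters[index];
    }
}
```

With a generous value such as timeoutMs = 1500 instead of a sub-second default, a slow second nose tap would still advance to the next letter rather than start a new one, which is exactly the failure the participant experienced.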

Tablet Tasks
When using a tablet, two participants explicitly mentioned that the interaction outperforms the interaction with the phone. Three participants remarked that their breath left wet spots on the tablet surface while interacting with their nose. Participants were asked to accomplish the following tasks:

1. Unlocking the tablet. Unlocking was not problematic at all, as only a simple swipe gesture was needed on all tablets. Only one participant needed several attempts to unlock the tablet, as he had problems aiming with the nose.

2. Entering a URL. Typing was simpler than on the phone because of the bigger keys on the virtual QWERTY keyboard; therefore, participants made fewer errors. One participant, who mistyped a letter, even tried to move the cursor and correct it; he found cursor movement very difficult and almost impossible to do. One participant struggled with launching the web browser, as he tapped the icon for too long.

3. Browsing. Participants mostly complained about small links on web pages and mentioned that they were not able to use zoom. However, two participants used a double tap to enlarge the web page. Two participants mentioned problems with scrolling: they could not see what was on the screen while scrolling and thus could not estimate how fast to scroll. They also needed to move the tablet away to look at the screen and then move it back again to continue scrolling.

Debriefing
From the debriefing, we learned that the biggest issue of nose interaction lies in its visual demands. This confirms the findings from the observations, which show that visual demands cause the most severe problems during the interaction with mobile devices, especially smartphones. All the participants mentioned at least one problem related to sight; examples are expressions like "My eyes hurt" (3) or complaints about eye fatigue (4).

Table 4: Average acceptance rate of the nose-based interaction method for the particular use cases, where 1 – yes, I would use it; 3 – neutral; 5 – no, I would not use it.

  Use case                 Phone   Tablet
  Incoming phone call      1.5     N/A
  Incoming message         2.5     N/A
  Device unlocking         2       2
  Time checking            3       3
  Other notifications      4       4.5
  Outgoing phone call      5       N/A
  Message sending          5       N/A
  E-mail checking          5       5
  E-mail composing         5       5
  Navigating               5       5
  Gaming                   5       5
  Playing a sound/video    5       5
  Taking a picture/video   5       5
  Browsing a gallery       5       5
  Browsing the web         5       5

The second important issue was related to accuracy problems (12). Five participants mentioned that accuracy depends on the button size. The poor accuracy during nose-based interaction further influenced the performance (speed), which was found insufficient by six participants. Holding the device while wearing gloves was perceived as uncomfortable by five participants, as the device tends to slip out of the hand. Due to this, participants had to hold the device more firmly, which resulted in the edges being occluded by their fingers and thus made interaction on edges and corners hardly possible.

The interaction with the tablet was perceived as faster (5), more accurate (9), and less visually demanding (7) than with the phone. The size of the target clearly has a large impact on accuracy, not only between a tablet and a phone; the screen size also influences target size and thus accuracy as well. Holding a tablet was also experienced as more comfortable by six participants due to its wider edges.

We also asked participants to compare the nose swipe gesture to nose tapping. Seven participants preferred swiping and only one preferred tapping; they reported nose taps as inaccurate. Three participants had no preference, and three others considered this interaction style to be task dependent. They explained that the swipe gesture used for scrolling (web page browsing, contact searching) is cumbersome, as it is difficult to guess how much to scroll and the visual information is constantly changing, which is hard to follow. On the other hand, swiping when browsing a gallery, unlocking the phone, or answering a call is more accurate and comfortable than tapping.
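The preference for event-based swiping could be supported in an application roughly as follows. This is a minimal Android sketch under our own assumptions (the helper class NoseSwipeListener is hypothetical and not part of the study software): a single coarse horizontal fling anywhere on the screen triggers an action, so no precise target has to be hit with the nose.

```java
// Sketch only: treat any horizontal fling as one event-based action
// (e.g. accept a call, show the next gallery photo). Illustrative assumption,
// not the interaction technique evaluated in the study.
import android.content.Context;
import android.view.GestureDetector;
import android.view.MotionEvent;
import android.view.View;

public class NoseSwipeListener implements View.OnTouchListener {
    private final GestureDetector detector;

    public NoseSwipeListener(Context context, final Runnable onSwipe) {
        detector = new GestureDetector(context, new GestureDetector.SimpleOnGestureListener() {
            @Override
            public boolean onFling(MotionEvent down, MotionEvent up,
                                   float velocityX, float velocityY) {
                // A predominantly horizontal fling triggers the action,
                // regardless of where on the screen it starts or ends.
                if (Math.abs(velocityX) > Math.abs(velocityY)) {
                    onSwipe.run();
                    return true;
                }
                return false;
            }
        });
    }

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        return detector.onTouchEvent(event);
    }
}
```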

3.4 Summary

The performance of the users who do not own a tablet or smartphone was slightly affected by some minor usability issues, resulting in longer task times. However, we did not observe any differences regarding the interaction itself; their comments and problems were identical to those of the device owners. Although we did not measure the learning curve directly in this study, we observed that the participants learned the interaction method quite fast. The main problem at the beginning is aiming: the user has to estimate the position where the nose touches the screen. Most of them mastered this after several seconds, already in the first task.

Due to the accuracy issues of nose-based interaction, we identified a strong need for relatively large buttons. This issue showed up especially with text input, when the participants switched to landscape mode to enlarge the buttons. We also found that screen size has an impact on accuracy: not only are tablets considered more accurate, we also observed a difference between the screen sizes of phones. One major issue was that, due to the visual demands and accuracy problems, the participants dwelled too long on the target area when invoking a function. This was often recognized as a "long tap" instead of the desired "short tap", which resulted in an error (a minimal sketch of how an application could avoid this is given at the end of this subsection).

The size of the screen also influenced the way participants interacted with the phones and tablets. The phone was held in one hand and they moved the device in order to reach different spots on the screen. The tablet, however, was held mostly in both hands and they tended to move the head instead of the device. In the behavior of participants, we found an interesting correspondence to one principle presented in [11] (Minimize the Number of Nose Taps): they constantly tried to minimize the number of nose taps by using prediction when composing a text message or history when entering a web address.
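One way an application could avoid the unintended long taps described above is to ignore tap duration altogether and trigger the action on release. The sketch below is our own illustration (the class ShortTapOnlyListener is hypothetical); it is not part of the study setup.

```java
// Sketch only (our assumption, not the study's implementation): a touch
// listener that never escalates a slow nose tap into a "long tap".
import android.view.MotionEvent;
import android.view.View;

public class ShortTapOnlyListener implements View.OnTouchListener {
    private final Runnable onTap;  // action to invoke on any completed tap

    public ShortTapOnlyListener(Runnable onTap) {
        this.onTap = onTap;
    }

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        // Consuming ACTION_DOWN keeps the view's own onTouchEvent (and thus
        // its built-in long-press detection) from running; the action fires
        // on ACTION_UP no matter how long the nose rested on the screen.
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN:
                return true;
            case MotionEvent.ACTION_UP:
                onTap.run();
                return true;
            default:
                return true;
        }
    }
}
```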

Experienced emotions
Interacting with the nose might be an unconventional way of interacting with a touch-screen device. On the one hand it looks funny, and on the other hand people might have the feeling of being observed by others when doing so. Further, difficulties with visual perception, precision issues, and hygienic concerns may influence which emotions, such as arousal or happiness, are experienced. In order to assess the users' emotions arising from nose-based interaction, we used the Geneva Emotion Wheel (GEW) [9], which allows addressing the pleasantness and control dimensions of emotions. The GEW was handed to the users immediately after they completed the given tasks with the particular device, i.e., the phone and the tablet. Thus we could ensure that the emotions experienced during the interaction with the devices were still present. In order to avoid language-based problems, we used the GEW in the local languages, German and Czech.

Figure 3 depicts the average values collected during the study. The GEW is constructed in a way that it represents four different combinations of control and pleasantness characteristics. Table 5 shows the collected values: pleasant attributes are rated with 3.63 and unpleasant attributes with 4.19 for the phone, and 3.78 and 4.46 respectively for the tablet. This corresponds to a rather neutral but slightly positive emotion. This can also be explained through the interviews, where people had fun interacting with their nose and their device. It is also influenced by the learning phase of nose interaction, during which emotions like happiness evoked through fun persist.


Figure 3: The Geneva Emotion Wheel showing the average values of the phone and tablet task.

Table 5: Emotions as collected by means of the Geneva Emotion Wheel. The values are based on a 5-point Likert scale from 1 (yes) to 5 (no).

                    Pleasant   Unpleasant   Low Control   High Control
  Phone   Median    4          5            4             5
          Average   3.63       4.19         3.60          4.13
          Stdev     1.37       1.23         1.43          1.17
  Tablet  Median    4          5            4             5
          Average   3.78       4.46         3.84          4.32
          Stdev     1.34       1.01         1.36          1.01

For further proof of the fun factor of positive emotions, further studies might be necessary. An indication is that being able to interact in freezing conditions without having to pull off one's gloves might induce some fun during the usage of the smart devices. The second dimension of the GEW refers to the feeling of control over the emotion. The average values of 3.6 for High Control and 4.13 for Low Control situations for the phone, and 3.84 and 4.32 for the tablet, indicate, in analogy to the results for pleasantness, that people feel slightly more in control of their emotions than not. In general, the interaction with the nose affects the users' emotions only slightly. Positive emotions, as depicted on the right side of the GEW, are rated as slightly more pronounced than negative emotions, as depicted on the left side. The participants' emotions were relatively stable and only slightly affected.
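For illustration, the per-quadrant statistics reported in Table 5 could be derived from raw 5-point GEW ratings roughly as follows. The helper class GewQuadrantStats is our own sketch and was not used to produce the published numbers.

```java
// Sketch only: mean, sample standard deviation, and median of a quadrant's
// 5-point GEW ratings (1 = yes, 5 = no). Illustrative, not the study's code.
import java.util.Arrays;

public class GewQuadrantStats {
    public static double mean(int[] ratings) {
        return Arrays.stream(ratings).average().orElse(Double.NaN);
    }

    public static double stdev(int[] ratings) {
        double m = mean(ratings);
        double sq = Arrays.stream(ratings).mapToDouble(r -> (r - m) * (r - m)).sum();
        return Math.sqrt(sq / (ratings.length - 1));  // sample standard deviation
    }

    public static double median(int[] ratings) {
        int[] sorted = ratings.clone();
        Arrays.sort(sorted);
        int n = sorted.length;
        return n % 2 == 1 ? sorted[n / 2] : (sorted[n / 2 - 1] + sorted[n / 2]) / 2.0;
    }
}
```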

4. ONLINE SURVEY

The online survey was filled out by 92 respondents (53 women, 39 men, mean age = 27.8, std. dev. = 7.30), 87 of them were regular phone users and 21 tablet users. They mostly use Android (52 phones, 9 tablets), iOS (23 phones, 11 tablets) and Windows (5 phones, 1 tablet) devices. The respondents

were recruited via an advertisement in a university mailing list and through social media. Their background was diverse; 57 of them were university students. The respondents were instructed to imagine a specific situation while filling out the form: "It is a cold day in winter, the temperature is about -5 °C (23 °F), you are wearing winter clothes including gloves, and you are waiting for a bus at a bus stop."

Table 6: Excerpt of the questions from the online survey. The numbers express the total count per category. In Q1, Q2, Q4, and Q5 respondents picked one category, while in Q3 they could pick more than one.

                                                   Phone       Tablet
  Total number of respondents                      87          21

  Q1: Have you ever experienced problems with using [device] due to wearing gloves?
  Yes                                              71 (82%)    5 (24%)
  No                                               8 (9%)      11 (52%)
  I don't remember                                 8 (9%)      5 (24%)

  Q2: Do you use the tablet outside?
  Yes                                              N/A         2 (10%)
  Sometimes                                        N/A         11 (52%)
  Never                                            N/A         8 (38%)

  Q3: How did you overcome this problem?
  I avoided the interaction                        22 (25%)    12 (57%)
  I took off the glove                             77 (89%)    3 (14%)
  I used the nose or another uncovered body part   9 (10%)     0 (0%)
  I used a stylus                                  4 (5%)      1 (5%)
  I used a touch screen glove                      10 (11%)    1 (5%)
  Other                                            4 (5%)      1 (5%)

  Q4: Do you perceive the inability to interact with the [device] while wearing gloves as a problem?
  Yes                                              33 (38%)    2 (10%)
  Maybe yes                                        26 (30%)    3 (14%)
  I do not know                                    5 (5%)      7 (33%)
  Maybe no                                         11 (12%)    1 (5%)
  No                                               12 (14%)    8 (38%)

  Q5: Can you imagine using your nose for interacting with the [device]?
  No                                               54 (62%)    17 (81%)
  Simple tasks                                     33 (38%)    3 (14%)
  Complicated tasks                                0 (0%)      1 (5%)
  Everything                                       0 (0%)      0 (0%)

The most important excerpt of the survey is shown in Table 6. Respondents stated that they have already experienced problems with using the phone due to wearing gloves (82%, see Table 6, Q1). This differed for the tablet, where only 24% of respondents experienced the same problem, as according to Q2 they interact with the tablet outside only sometimes (52%) or not at all (38%). According to Q3, they mostly solve the problem of being unable to interact by removing the glove (89%). An interesting finding is that 10% of the respondents have already used the nose or another uncovered body part to interact in the winter. The majority of tablet users avoid the interaction outside (57%).

Table 7: Q5 for the phone split according to sex.

                  Males       Females
  No              17 (44%)    40 (75%)
  Simple tasks    22 (56%)    13 (25%)

The numbers regarding problematic interaction in winter approximately correspond to those obtained in the user study. An interesting fact, however, is the high number of people who avoided the interaction (25% of phone users and 57% of tablet users). This clearly suggests that the problem is not yet solved sufficiently. This also relates to Q4 regarding the perception of the inability to interact while wearing gloves: phone users tend to perceive this as a problem, while tablet users do not. Approximately one third of the respondents (38%) are willing to use the nose for interacting with the phone in simple use cases (Q5), while 62% of them would never use it. This finding contradicts Zarek et al. [11], who report a much higher number (60%). This can be explained when the phone responses are split by sex. Table 7 shows that nose interaction is more accepted by males: 56% of the males would use the nose for simple tasks, which is in line with the results of Zarek et al., who used mostly males in their study and did not focus on gender-based differences regarding nose interaction. In the case of tablet users, the number of people willing to use the nose in simple use cases is even lower (14%), and almost none would use it for more complicated tasks or for all tasks.

In the online survey, we also focused on quantifying the use cases for winter interaction. The use cases were partly defined by us and partly derived from the interviews (see Table 2), as described in the previous section. The survey showed to which extent respondents are willing to use the phone or tablet outside even if it is cold and they have to wear gloves. To obtain these values, we asked them to rate the statement "I use [use case] outside even if it is cold and I have to wear gloves" on a 5-point Likert scale for phone and tablet devices. The median results are shown in Table 8. Similarly to the study results, the respondents tend to use the phone in winter conditions for communication and limit the other use cases. Since tablets are intended as consumption devices rather than communicators, their use in winter is even more limited.

5. DISCUSSION

Based on the user study and the online survey, we identified a strong need for an input modality that would enable mobile interaction outside during the cold winter. Both research methods confirmed that participants have already experienced the problem of being unable to interact while wearing gloves. They usually have to take their gloves off, which is unpleasant in cold weather conditions and limits the time spent using the phone. Nose-based interaction can be seen as an alternative to fingertip interaction in particular contextual situations. However, based on the user research we could identify several advantages and disadvantages of such interaction. They are summarized into two main factors that may hinder users from using nose-based interaction:

Table 8: Identified use cases for smartphone interaction when wearing gloves. Participants rated the statement "I use [use case] outside even if it is cold and I have to wear gloves", with 1 – strongly agree, 3 – neutral, 5 – strongly disagree. Median values are shown.

  Use case                     Phone   Tablet
  Incoming phone call          1       N/A
  Incoming message             2       N/A
  Device unlocking             2       3
  Time checking                2       4
  Other notifications          3       4
  Outgoing phone call          2       N/A
  Message sending              3       N/A
  E-mail checking              4       4
  E-mail composing             5       4
  Pedestrian navigation        3       3
  Gaming                       5       5
  Playing music or video       4       4
  Taking a picture or video    2       3
  Browsing a gallery           4       4
  Browsing web                 4       4

– Interaction properties. When interacting with the nose, focusing on the screen is visually demanding. This results in poor tapping accuracy, as the user does not properly see the area that he or she is tapping on. We also observed an increased error rate and a need to invoke undo actions more often, which of course has a negative effect on the speed of interaction. People are then willing to use the nose only for very simple tasks. Reducing visual demands and increasing accuracy is therefore one of the main challenges of nose interaction.

– Social acceptance. Even though approximately one third of the respondents are willing to use nose interaction, the main reason for avoiding the interaction was feeling embarrassed in public. Only a marginal number of people expressed hygiene concerns. This is partly caused by little awareness of the possibility of nose-based interaction, which may change in the near future.

In spite of these disadvantages, people are willing to use the nose for simple use cases like picking up an incoming phone call, checking an incoming message, etc. Using the nose is too cumbersome for more complex functionality (e.g., composing a text message). This is also supported by the results of the Geneva Emotion Wheel, which show a slightly more positive emotional state of the participants regarding nose-based interaction. Qualitative results from the user study, supported by the online survey, clearly indicate that people would prefer not to use nose-based interaction in public, except for simple commands like picking up the phone. We need to point out the restriction that our results are valid for areas with relatively mild winters, as e.g. in Central Europe. The situation would probably be different in places where people experience temperatures below -20 °C (-4 °F), which is an interesting research question that came up during our study.

Zarek et al. [11] proposed, based on a user study with students, five design principles for nose interaction. Based on our study results, we succeeded in contributing to their results.





Enhanced Nose-based Interaction Design Principles

P1. Minimize the Number of Nose Taps
P2. Provide Feedback Without Eye Fatigue
P3. Error Mitigation
P4. Preserve Application UI Layouts
P5. Minimize On-Screen Nose Sliding [enhanced]
P6. Avoid Using Edges [new]
P7. Use Only Simple Taps [new]



Figure 4: Nose-based interaction design principles from Zarek et al. [11], enhanced based on the results of the user study and the online survey.

Figure 4 depicts the resulting list of design principles. In the following paragraphs, we discuss how our study results support Zarek's findings; in addition, we could enhance one principle and add two new design principles based on the analysis performed above.

[P1 – Minimize the Number of Nose Taps] As identified by Zarek, this principle reflects the fact that people are willing to perform only a low number of nose taps. We observed similar behavior in our study, supporting Zarek's finding: participants even tried to save nose taps by using the text-prediction functionality during text input.

[P2 – Provide Feedback Without Eye Fatigue] This principle addresses the fact that nose-based interaction places high visual demands on the user. Our results support this principle, leading to the design guideline of avoiding the necessity to look at the screen during nose-based interaction. Interactions should also be kept simple and short to avoid visual validations of the performed activity and thus additional focus changes, which tire the user's visual sense.

[P3 – Error Mitigation] This principle addresses the decreased accuracy when tapping with the nose. Our results support this principle and explain it by the reduced possibility to relate visual perception to the interaction performed with the nose.

[P4 – Preserve Application UI Layouts] This principle emphasizes the need for an overlay layout for existing applications. Changing application layouts requires the user to continuously validate the location of application elements on the touch screen, resulting in eye fatigue.

[P5 – Minimize On-Screen Nose Sliding] This principle targets continuous interaction (scrolling a web page or a contact list), as users are not able to follow the content. Zarek et al. also claim that on-screen nose sliding should be minimized due to hygienic concerns. However, our interview results show that swiping gestures are highly preferred by the participants for event-based interaction, such as browsing a gallery, unlocking the phone, or picking up a call, as they were considered more accurate than tapping. An important issue is that the sliding gesture needs to be simple; a pattern gesture, such as the pattern lock implemented in Android phones, is too complicated to be used. This is especially true for nose interaction, which is visually demanding and for which proper visual feedback is missing to a certain extent. Also, we did not obtain any result indicating that nose sliding is less hygienic than nose tapping.

[P6 – Avoid Using Edges] During the interaction with gloves, users tend to hold the device more firmly as it might slip out of the hand. This results in occluding the edges

of the device with the user's fingers, making these parts less comfortable for interaction (a minimal sketch of an edge-exclusion margin is given after P7).

[P7 – Use Only Simple Taps] Interaction based on tap length (invoking different functionality on a long tap) should be avoided, as the user cannot control tap duration as precisely with the nose as with the fingers. Designers should also avoid interactions that require multiple nose taps within a short timeout (e.g., a Multi-tap keyboard). Two consecutive co-located taps that are not performed accurately enough may lead to an error.
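To illustrate P6, a touch handler could exclude an edge margin from nose-tap targets. The following sketch and its 10% margin are illustrative assumptions on our part, not values derived from the study.

```java
// Sketch only: ignore nose taps that land in an edge margin likely to be
// occluded by the gloved fingers holding the device (P6). The margin ratio
// is an illustrative assumption.
import android.view.MotionEvent;
import android.view.View;

public final class EdgeFilter {
    private static final float EDGE_MARGIN_RATIO = 0.10f;  // fraction of width/height

    /** Returns true if the touch point is safely away from the screen edges. */
    public static boolean isInsideSafeArea(View view, MotionEvent event) {
        float marginX = view.getWidth() * EDGE_MARGIN_RATIO;
        float marginY = view.getHeight() * EDGE_MARGIN_RATIO;
        float x = event.getX();
        float y = event.getY();
        return x > marginX && x < view.getWidth() - marginX
            && y > marginY && y < view.getHeight() - marginY;
    }
}
```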

6. CONCLUSION & FUTURE WORK

The study described in this paper investigated the current usage of touch-screen devices during a cold winter when users wear gloves. Among other things, we found that more than half of the respondents perceive this as a problem. A quite unconventional way of interacting with capacitive touch screens is interacting with the nose. In our study, we could identify that approximately one third of the respondents are willing to interact with their nose for simple use cases. This reflects a high potential for nose-based interaction, which is, however, limited by high visual demands and poor social acceptability as the main drawbacks of this interaction style. In addition, a gender-based difference in the acceptance of nose-based interaction was identified, providing an area for future in-depth research on the acceptance of this interaction style. Based on our results, we further succeeded in contributing to an existing set of design principles for nose interaction by refining one principle and adding two more.

In future work, we aim to develop applications supporting nose-based interaction. For such applications, the identified criteria like acceptability, visual demands, and accuracy will be regarded as the main constraints for a successful nose-based application. Especially challenging is the design and development of a text entry method suitable for the nose. Based on our research, we could identify similarities between nose-based interaction and the area of accessibility. Future work will therefore also address accessibility, as interacting with the nose may not only be a convenience function but also an enabling function for people with disabilities. Interacting with the nose implies looking at the screen from a very short distance, which impairs our vision as we are not able to focus on the content. Further, nose-based interaction affects accuracy and speed of interaction, which is reminiscent of interaction when fine motor activity is impaired. We may not only learn from accessibility research, but we can also contribute to this field, as future applications enabling nose-based interaction may also

enable access to functions that are currently hardly accessible to people with even mild impairments (induced, e.g., by age). As described in the paper, one of the main challenges is improving the feedback of nose-based interaction. Using the visual modality is quite demanding, and thus future research should investigate other modalities such as tactile (device vibrations) or audio (earcons [6]) feedback; a minimal sketch of such feedback is given at the end of this section. Adding these features may lower the visual demands of nose interaction and make the interaction more comfortable. If we look at interacting with a touch-screen device in winter from a broader perspective, we may further investigate combining input modalities or designing alternative communication methods. A future multimodal combination of nose interaction and tilting, shaking, or waving the phone may improve the user experience. This paper described nose interaction in one context only; however, there are more contexts in which the interaction can be further studied. For example, the fun emotion identified during the study suggests using nose interaction in the context of a game.
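As a rough illustration of such non-visual feedback, the sketch below combines a short vibration pulse with an earcon-like tone when a nose tap is recognized. The class name and the chosen durations are our own assumptions and are not taken from the study.

```java
// Sketch only: tactile plus audio confirmation of a recognized nose tap,
// intended to lower the visual demands of the interaction. Durations and
// the tone type are illustrative assumptions.
import android.content.Context;
import android.media.AudioManager;
import android.media.ToneGenerator;
import android.os.Vibrator;

public class NonVisualFeedback {
    private final Vibrator vibrator;
    private final ToneGenerator tones;

    public NonVisualFeedback(Context context) {
        vibrator = (Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE);
        tones = new ToneGenerator(AudioManager.STREAM_MUSIC, 80 /* volume 0-100 */);
    }

    /** Confirm a recognized nose tap without requiring the user to look at the screen. */
    public void confirmTap() {
        if (vibrator != null && vibrator.hasVibrator()) {
            vibrator.vibrate(40);                            // short tactile pulse
        }
        tones.startTone(ToneGenerator.TONE_PROP_BEEP, 50);   // brief audible earcon
    }
}
```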

Acknowledgments The work described in this paper was supported by the Christian Doppler Laboratory for “Contextual Interfaces”. The financial support by the Austrian Federal Ministry of Economy, Family and Youth and the National Foundation for Research, Technology and Development is gratefully acknowledged. This research has been also partially supported by projects Automatically Generated UIs in Nomadic Applications (SGS10/290/OHK3/3T/13; FIS 10-802900) and Veritas (IST-247765).

7. REFERENCES

[1] A. Butler, S. Izadi, and S. Hodges. SideSight: multi-touch interaction around small devices. In Proc. of UIST '08, pages 201–204. ACM, 2008.
[2] T. R. Henry, S. E. Hudson, A. K. Yeatts, B. A. Myers, and S. Feiner. A nose gesture interface device: extending virtual realities. In Proc. of the 4th Annual ACM Symposium on User Interface Software and Technology (UIST '91), pages 65–68. ACM, 1991.
[3] J. Kim, H. Park, and M. Ghovanloo. Tongue-operated assistive technology with access to common smartphone applications via bluetooth link. In Proc. of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, pages 4054–4057, 2012.
[4] M. Kranz, P. Holleis, and A. Schmidt. DistScroll – a new one-handed interaction device. In Proc. of the 25th IEEE International Conference on Distributed Computing Systems Workshops, pages 499–505, 2005.
[5] S. Kratz and M. Rohs. HoverFlow: expanding the design space of around-device interaction. In Proc. of MobileHCI '09. ACM, 2009.
[6] D. K. McGookin and S. A. Brewster. Understanding concurrent earcons: Applying auditory scene analysis principles to concurrent earcon recognition. ACM Transactions on Applied Perception (TAP), 1(2):130–155, 2004.
[7] Y.-W. Park, C.-Y. Lim, and T.-J. Nam. CheekTouch: an affective interaction technique while speaking on the mobile phone. In CHI '10 Extended Abstracts on Human Factors in Computing Systems, pages 3241–3246. ACM, 2010.
[8] J. Rico and S. Brewster. Usable gestures for mobile interfaces: evaluating social acceptability. In Proc. of CHI '10, pages 887–896. ACM, 2010.
[9] K. Scherer. What are emotions? And how can they be measured? Social Science Information, 2005.
[10] D. Wilcox. Finger-Nose Stylus, 2011. http://variationsonnormal.com/2011/04/28/finger-nose-stylus-for-touchscreens/.
[11] A. Zarek, D. Wigdor, and K. Singh. SNOUT: one-handed use of capacitive touch devices. In Proc. of AVI 2012, pages 140–147. ACM, 2012.