

Author’s version

Dynamic UI Adaptations for One-Handed Use of Large Mobile Touchscreen Devices

Daniel Buschek, Maximilian Hackenschmied, Florian Alt

LMU Munich, Munich, Germany
{daniel.buschek, florian.alt}@ifi.lmu.de, [email protected]

Abstract. We present and evaluate dynamic adaptations for mobile touch GUIs. They mitigate reachability problems that users face when operating large smartphones or “phablets” with a single hand. In particular, we enhance common touch GUI elements with three simple animated location and orientation changes (Roll, Bend, Move). Users can trigger them to move GUI elements within comfortable reach. A lab study (N = 35) with two devices (4.95 in, 5.9 in) shows that these adaptations improve reachability on the larger device. They also reduce device movements required to reach the targets. Participants perceived adapted UIs as faster, less exhausting and more comfortable to use than the baselines. Feedback and video analyses also indicate that participants retained a safer grip on the device through our adaptations. We conclude with design implications for (adaptive) touch GUIs on large devices.

Keywords: UI adaptation, reachability, one-handed use, thumb, touch, mobile device

1 Introduction

Mobile devices today come equipped with increasingly larger touchscreens. Large screens are attractive for displaying photos, videos and websites. They also promise to ease input: following Fitts’ Law [7], large GUI elements (e.g. buttons, text links) are easier targets than smaller ones.
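Fitts’ law predicts movement time from target distance D and width W via the index of difficulty log2(D/W + 1), so a wider target at the same distance is faster to acquire. A minimal sketch of the Shannon formulation (the constants a and b are illustrative placeholders, not values fitted in any study):

```python
import math

def fitts_movement_time(distance, width, a=0.2, b=0.1):
    """Predicted movement time (s) under the Shannon formulation of
    Fitts' law: MT = a + b * log2(D/W + 1).
    a and b are illustrative placeholders, not empirically fitted values."""
    return a + b * math.log2(distance / width + 1)

# A wide button is an easier target than a narrow link at the same distance:
assert fitts_movement_time(60, 12) < fitts_movement_time(60, 4)
```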

However, supporting and using a device with the same hand limits touch interaction to the use of the thumb. Thus, interaction with large screens suffers from the thumb’s limited reach, both with respect to reachable distance and comfortably bendable angles [1]. Certain screen regions, and thus GUI elements, cannot be reached comfortably, or not at all.

Several UI adaptations have been proposed to address these problems. A popular approach implemented on devices on the market today shrinks and/or moves the displayed content to cover only the part of the screen that is easy to reach (e.g. “Reachability” on the iPhone¹, Samsung’s “One-handed operation” feature²). Other concepts change layouts (e.g. for keyboards³) or introduce new widgets (e.g. radial menus⁴).

Fig. 1. In everyday life, a) we often operate mobile devices with one hand. b) On large devices, some screen regions cannot be reached comfortably. Here, the user struggles to touch the blue action bar at the top. c) Rotating the bar to the left screen side renders it easily accessible. This may be triggered, for example, by “flicking”, i.e. tilting the device to the left.

¹ http://appleinsider.com/articles/14/09/09/how-apple-made-the-iphone-6-and-iphone-6-plus-one-handed-use

Unfortunately, resizing the displayed content and input areas diminishes the benefits of a larger screen. Special widgets let users keep the full screen area, but they must be introduced to the user, overlay existing content, or are difficult to integrate well into existing layouts (e.g. a radial menu vs the common box layout of apps and websites).

To address these challenges and improve one-handed use, we investigate dynamic adaptations of common GUI elements (Figure 1). We contribute: 1) four dynamic UI adaptations for three common main elements in mobile touch GUIs, 2) evaluated in a user study with 35 participants in the lab, 3) resulting in insights into one-handed use and relevant design implications for large mobile touchscreen devices.

2 Related Work

The main problem with one-handed touch interaction on large devices is the limited range and flexibility of the human thumb [1]. Large screens may easily exceed the thumb’s reachable area, yet one-handed use is often required in everyday situations, for example when holding onto a rail or carrying other objects [15, 23, 24]. Users may cope by tilting the device to bring the far screen corner closer to the stretched thumb [5, 22]. Besides the industry solutions mentioned in the introduction, HCI research has proposed various approaches to improve one-handed mobile touch interaction:

² http://www.androidcentral.com/how-set-your-galaxy-s5-better-one-handed-use

³ https://support.swiftkey.com/hc/en-us/articles/201457382-How-do-I-change-my-keyboard-layout-with-SwiftKey-Keyboard-for-Android-

⁴ https://play.google.com/store/apps/details?id=jun.ace.piecontrol


TouchShield [12] allows users to bring up a new control widget with their thumb in the screen centre to facilitate a more stable grip. ThumbSpace [16, 19] introduced an easily reachable small proxy area to map touches to their corresponding location on the whole screen. Kim et al. [20] enabled users to 1) pan the screen content to move far targets towards the thumb, or to 2) summon a cursor that multiplies the thumb’s movements to reach further. They used edge-swiping and “fat” touches to trigger these modes. In contrast, Chang et al. [5] triggered similar methods once the device’s tilt indicated reaching for a distant target. While these methods can improve reachability and/or grip stability, many introduce indirect input [5, 12, 16, 19, 20] or require extra panning actions [5, 20], which can slow down target selection.

Roudaut et al. [25] proposed a two-tap selection method – first triggering magnification of an area of interest, then selecting within the magnified display. They further proposed a magnetic target selection “stick” of sizeable length. Magnification might also move some targets within reach, and the “stick” can extend the thumb, but their goal was to improve target selection on small screens, and thus the concepts were not designed and evaluated for improving reachability on large devices.

Other related work designed task-specific widgets for one-handed use, such as a radial contact list [13], an interface for video browsing [14], or an app launcher [18]. While these designs can mitigate problems with reachability and precision, they are limited to their specific use cases and thus not generally applicable across different tasks and applications.

Involving the fingers of the grasping hand on the back of the device [26–28] can also address reachability issues. However, this requires additional touch sensors on the back. Similarly, concepts envisioning a bendable device [8] cannot be realised with current off-the-shelf hardware.

Recent work has predicted front-screen touches before the thumb hits the screen, based on grip changes during the reaching movement, registered with 1) back-of-device touch sensors [21] or 2) device motion sensors [22]. While such methods could also be used to predict touch locations that users could never actually reach, it seems unlikely that users would even try to stretch towards obviously unreachable targets in order to enable such predictions in the first place.

Another line of research adapted the underlying touch areas of on-screen keyboards to typing behaviour and hand posture without visual changes [4, 9, 11, 29]. However, the main concern of these projects was typing precision, not reachability. In contrast, Cheng et al. [6] visually adapted keyboard shape and location to the user’s grasp on a tablet, also to facilitate reachability. Similarly, we follow the idea of adapting the location and shape of GUI elements to better suit the user’s hand posture. Instead of keyboards for two-handed typing on tablets, we address main navigation elements and action buttons for one-handed use of large phones or “phablets”.

In summary, related work 1) examined the thumb’s limited reach [1], 2) motivated the support of one-handed use [15, 23, 24], and 3) proposed solutions which were either applied to the whole interface [5, 16, 19, 20] or introduced new UI elements [5, 13, 14, 18, 20, 25] and hardware [8, 26–28]. In contrast, we investigate how one-handed use of mobile touch devices can be improved by adapting existing main GUI elements. In particular, we are interested in simple and easily understandable changes to UI layout, location and orientation.

Fig. 2. Mockups showing a) the unadapted UI, and the six adaptations evaluated in the pre-study: b) Pull: users can pull down the action bar; it moves back up after using one of its buttons; c) Roll: the action bar rotates around the screen corners into a vertical layout close to the holding hand; d) Bend: the menu items bend to match the thumb’s reachable angles; e) Move: the button/menu is moved to the side of the holding hand; f) Side and g) Pie: redundant action bars can be swiped in from the screen edge. The top row shows the left-hand versions, the bottom row the ones for the right hand (exception: b) Pull is pulled down for both hands). Roll, Bend and Move were selected for the final study.

3 Concept Development

Our design goal is to improve the reachability of existing GUI elements – in contrast to related work that often invented new (task-specific) widgets. We decided to explore adaptations of the main elements of Google’s Material Design⁵ as an example of a popular modern design language for mobile touch GUIs. The following subsections describe our concept development process.

3.1 Brainstorming Session

We conducted a brainstorming session with colleagues and students from an HCI lab to generate ideas. No default GUI was given; rather, the task was to freely come up with ideas for adapting common mobile touch GUI elements to improve reachability in one-handed use. Clustering the ideas revealed design dimensions for adaptations: changing layouts globally (i.e. in the GUI) and locally (i.e. within a menu); changing alignment, orientation, shape, and item arrangement; floating elements; and adding new elements.

⁵ https://www.google.com/design/spec/material-design/, last accessed 23rd Jan. 2017

3.2 Paper Prototyping

The generated ideas were captured as simple GUI sketches on paper, and shown to colleagues and students to gain early feedback. Ideas with moving elements were liked best overall. However, this step also revealed that, for further feedback, we needed to go beyond paper sketches.

3.3 App Prototyping

Hence, we moved from paper to phone. Integrating feedback from the discussions, we created click-through prototypes to be able to demonstrate the refined ideas on an actual device. We used the prototyping software Sketch⁶ and POP⁷. We created 38 interactive mockup screens (like the ones in Figure 2), showing the concepts embedded into a fake email client to give them meaningful application context.

3.4 Pre-study

We reduced the number of ideas to a feasible amount for a small study, based on another round of feedback from colleagues and guided by the previously identified design dimensions. We kept the six concepts shown in Figure 2.

We employed them in a small user study to gather qualitative feedback on our ideas. We recruited four participants (three female, mean age 26, all right-handed). They were compensated with a €5 gift card for an online shop.

Each participant tested all six concepts in random order. Participants were asked to fulfil simple tasks: opening an email, composing a new mail, deleting one. They were encouraged to think aloud during the tasks. After completing each task, they rated the current concept on a five-point Likert scale regarding eight items on aspects such as reachability, ease of use, understandability and distraction. In addition, they shared their thoughts verbally.

3.5 Selection of Concepts

The study revealed problems for: a) Pull – pulling down the action bar merely reduces the vertical distance to targets; horizontally distant targets remain problematic. b) Side – feedback revealed the importance of a clear distinction between content and (adaptive) controls, which was problematic due to the transparency of Side. c) Pie – some participants were confused about redundant GUI elements. This was the case for both Side and Pie, which duplicate the action bar without hiding it. Hence, we decided against duplication. All other concepts received promising ratings and feedback and were thus selected for the main study: Rolling Action Bar (Roll), Moving Action Button/Menu (Move), and Bending Action Menu (Bend).

⁶ https://www.sketchapp.com/, last accessed 22nd Jan. 2017

⁷ https://popapp.in/, last accessed 22nd Jan. 2017

3.6 Final Concepts: Dynamic Adaptive UI

In summary, we propose four adaptations and an example trigger action.

Rolling Action Bar (Roll) An Action Bar is located at the top of most Android GUIs. It features buttons for navigation or main functionality in the current view. Triggering our adaptation rotates the bar around the screen corner (animated), changing it from its default location at the top to a left/right-aligned layout (see Figure 2c). Triggering the adaptation again moves it back to the default location at the top.
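The rotation in Roll can be illustrated geometrically. In screen coordinates (origin at the top-left, y growing downwards), rolling the top bar around the top-left corner maps a button’s offset along the bar to an offset down the edge. A sketch of this mapping (our own illustration, not the study implementation):

```python
def roll_to_left_edge(points):
    """Map button centres from a horizontal bar at the top of the screen
    to a vertical bar along the left edge, as if the bar had rotated
    around the top-left corner. In screen coordinates (origin top-left,
    y growing downwards) this amounts to swapping x and y: a button
    200 px along the top bar ends up 200 px down the left edge.
    Illustrative geometry only, not the animation code from the study."""
    return [(y, x) for (x, y) in points]

# Three buttons centred 28 px below the top edge roll onto the left edge,
# 28 px from the side, preserving their order from the corner:
assert roll_to_left_edge([(40, 28), (120, 28), (200, 28)]) == [
    (28, 40), (28, 120), (28, 200)]
```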

Moving Action Button/Menu (Move) The Floating Action Button is a single button “floating” on top of the view, often in the bottom right corner (see Figure 2a). Our adaptation makes it movable: when triggering the adaptation, the button moves over to the left side of the screen (see Figure 2e). This makes it easier to reach in left-hand use. Triggering the adaptation again moves it back to the right.

There is also a menu version of this button, the Floating Action Menu. Touching this menu button opens a floating menu (see Figure 2e). As with the button, adaptation moves it to the other screen side; triggering again moves it back.

Bending Action Menu (Bend) Instead of moving the Floating Action Menu, this concept adapts the arrangement of its menu items. While the normal version displays menu items in a straight line, triggering our adaptation moves them into a curve to better fit the thumb’s reachable area (see Figure 2d). Repeated adaptation moves them back into a straight line.
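The bent arrangement can be sketched as placing menu items on a circular arc of constant radius around the thumb’s pivot, so each item sits at the same comfortable thumb extension. The pivot position, radius and angle range below are assumptions for illustration, not values from our implementation:

```python
import math

def bend_menu_items(n_items, pivot, radius, start_deg=95, end_deg=165):
    """Place n_items menu item centres along a circular arc of constant
    radius around the thumb's pivot (assumed to lie near the grip at a
    lower screen corner), so every item sits at the same comfortable
    thumb extension. Angles are measured anticlockwise from the positive
    x axis; screen y grows downwards, hence the minus sign.
    Pivot, radius and angle range are illustrative assumptions."""
    px, py = pivot
    step = (end_deg - start_deg) / (n_items - 1)
    positions = []
    for i in range(n_items):
        theta = math.radians(start_deg + i * step)
        positions.append((px + radius * math.cos(theta),
                          py - radius * math.sin(theta)))
    return positions

# Five items bent around a right-hand pivot near the bottom-right corner;
# each ends up exactly one assumed thumb length (300 px) from the pivot:
items = bend_menu_items(5, pivot=(1000, 1700), radius=300)
assert all(abs(math.hypot(x - 1000, y - 1700) - 300) < 1e-9 for x, y in items)
```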

Triggering Adaptations Adaptations could be triggered explicitly or automatically. For our study, we implemented a simple explicit trigger: tilting the device in a short wrist turn (Figure 3b). We decided not to use a touch gesture for the study to keep the trigger clearly distinguishable from the interactions, so that people could easily report feedback on adaptations and trigger separately. Tilting may also go well with the idea of movable elements – users can “flick” the UI elements to new locations. However, this trigger is an example, not part of our contribution.
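A tilt trigger of this kind can be sketched as a threshold on how quickly the device roll angle swings within a short window of sensor readings. The window size and threshold below are illustrative assumptions, not the values used in the study software:

```python
def detect_flick(roll_samples, threshold_deg=25.0, window=5):
    """Detect a short wrist-turn 'flick' in a stream of device roll
    angles (degrees). Fires when roll swings by more than threshold_deg
    within `window` consecutive sensor readings; slow drift (e.g. while
    walking) stays below the threshold. Both parameters are illustrative
    assumptions, not the values used in the study software."""
    for i in range(len(roll_samples) - window):
        chunk = roll_samples[i:i + window + 1]
        if max(chunk) - min(chunk) > threshold_deg:
            return True
    return False

# A quick tilt to the left (negative roll) and back triggers adaptation:
assert detect_flick([0, -5, -18, -32, -30, -12, -2]) is True
# Gradual tilting does not:
assert detect_flick([0, -2, -4, -5, -6, -7, -8]) is False
```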


4 User Study

To evaluate our adaptations, we conducted a repeated measures lab study. The independent variables were device (Nexus 5, HTC One Max), hand (left, right), and concept (four adaptive versions, plus the baseline versions of the three UI elements). We measured completion time, device orientation, and reachability, as well as user opinions on five-point Likert items.

4.1 Participants

We recruited 35 participants, mostly students, via a university mailing list and social media. 17 were female, 6 were left-handed. The average age was 24.7 years (range: 19–47). They received a €10 gift card for an online shop as compensation.

4.2 Apparatus

We used two devices to cover a range of interesting (i.e. “large”) screen sizes, namely an LG Nexus 5 (4.95 in screen, 137.9 × 69.2 × 8.6 mm) and an HTC One Max (5.9 in screen, 164.5 × 82.5 × 10.3 mm). Our study app (Figure 3c and d) showed the tested GUI elements. Menu elements had labelled buttons (e.g. “C1” to “C6”, see Figure 3c). The floating action menus/buttons were located near the bottom right screen corner (see Figure 3d). A video camera captured the study, focusing on device and hands. Figure 4 shows a few selected scenes.

4.3 Procedure

Each participant used both the left and right hand with both devices. For each device and hand, each participant completed seven tasks, namely using the four adaptations, plus the non-adaptive baseline versions of the three GUI elements. The order was varied using a Latin Square design. We explain the procedure by describing its three hierarchical components: tasks, trial sequences, and trials.

Tasks Each task covered one GUI element and adaptation concept. We first explained the tested element and its adaptation, if available. A training trial sequence allowed participants to familiarise themselves with the task. They then completed six trial sequences.

Trial Sequences Each trial sequence consisted of subsequently hitting five targets (for the Floating Action Menu/Button) or six targets (for the Action Bar). For the adaptive versions, adaptation had to be triggered at the start of the trial sequence; the element then stayed in its adapted state until the start of the next trial sequence. This procedure allowed us to analyse how many subsequent interactions were needed to compensate for trigger overhead (i.e. time required to trigger adaptation vs time saved by using the adapted element).


Fig. 3. This figure shows a) roll and pitch measured relative to the device, b) the trigger gesture, and example screens from the study app: c) an Action Bar trial, and d) a running Floating Action Menu trial.
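Device roll and pitch as in Figure 3a can be estimated from the accelerometer’s gravity vector. A sketch of one common estimate, assuming the usual mobile axis convention (x to the right, y towards the top of the screen, z out of the screen); this is not necessarily the exact computation used in the study app:

```python
import math

def roll_pitch_from_gravity(ax, ay, az):
    """Estimate device roll and pitch (degrees) from a gravity reading
    (ax, ay, az). Assumes the common mobile convention: x to the right,
    y towards the top of the screen, z out of the screen; a device lying
    flat and face up reads roughly (0, 0, +g).
    One common estimate, not necessarily the study's exact computation."""
    roll = math.degrees(math.atan2(ax, az))   # rotation about the y axis
    pitch = math.degrees(math.atan2(ay, az))  # rotation about the x axis
    return roll, pitch

# Flat on the table: no roll, no pitch.
assert roll_pitch_from_gravity(0.0, 0.0, 9.81) == (0.0, 0.0)
# Tilted sideways so gravity splits evenly between x and z: 45 deg roll.
assert abs(roll_pitch_from_gravity(9.81, 0.0, 9.81)[0] - 45.0) < 1e-6
```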

Trials To start each trial within a sequence, participants had to touch a “Start” button with their thumb. The trial ended once they had hit either their target or a “Can’t reach” button, the latter indicating that it was not possible for them to hit the target.

Two GUI elements featured multiple targets – the six buttons on the Action Bar, and the five menu options of the Floating Action Menu. In these cases, the app displayed the target label for the current trial in the screen centre (see Figure 3c and d).

Subjective Ratings and Feedback After each task, participants filled in a short questionnaire with five-point Likert items on perceived speed, comfort of reach, safe grip, and exhaustion. At the very end, qualitative feedback was gathered in a short semi-structured interview. Here, we also asked for opinions on explicitly triggered adaptations (i.e. manual trigger, as in the study) and implicit ones (i.e. automatic trigger, e.g. with a hand posture recognition system).

4.4 Limitations

Our lab study provides control and direct observation. However, future work should also evaluate acceptance in a field study over a longer period of time. Moreover, we chose a non-touch trigger (Figure 3b) to keep it clearly separate from the main interactions. Feedback suggests that turning a larger device in this way is not ideal for everyday use. Other triggers could be investigated, for example a touch swipe from the edge, a large touch [20], a long touch, or implicit triggers.


Element        Concept    Time (s)                   Orientation (deg roll)
                          mean           sd          mean              sd
Action Button  Unadapted  0.63           0.65        -11.70            16.28
Action Button  Move       0.40           0.12        -0.71             4.10
Action Menu    Unadapted  1.77 – 2.10    0.88 – 1.35 -23.87 – -17.67   18.81 – 19.71
Action Menu    Move       1.30 – 1.43    0.21 – 0.35 -5.97 – 2.92      3.94 – 5.85
Action Menu    Bend       1.85 – 2.03    0.85 – 1.58 -13.87 – -12.61   11.06 – 12.51
Action Bar     Unadapted  0.72 – 2.35    0.20 – 2.25 -43.86 – -19.75   14.67 – 18.29
Action Bar     Roll       0.58 – 0.77    0.18 – 0.29 -10.94 – -1.93    5.31 – 7.41

Table 1. Results overview. The table shows the observed time and device roll when hitting the given targets on the HTC One Max, for both unadapted and adapted versions (adapted to the left hand). Note that the menu and action bar had multiple targets (i.e. menu items), hence we give ranges covering the target-specific values for these elements. We also measured pitch, but omit it here, since no significant differences were found for any element.

Finally, moving elements is only one way of adapting touch GUIs; our study does not include a comparison to other approaches, which we leave for future work.

5 Results

Results are reported per adaptation in comparison to the corresponding non-adaptive baseline element. For each element, we report on task completion time, device orientation, and reachability. Significance, tested where applicable, is reported at the 0.05 level. Further aspects of our evaluation and report are described as follows:

We first focus on data from the larger device (HTC One Max), and then provide a summarised comparison to the results obtained on the other device (Nexus 5). To evaluate the adaptations, we first analyse the data without the trigger. We then report on the influences of the trigger in a separate section. The elements’ default versions are their right-hand versions for all but the action bar (Roll) and the bending menu (Bend). Hence, if not stated otherwise, we report on the interesting adaptation to the left hand. We removed outliers (e.g. a participant interrupting the study in the middle of a trial).

Table 1 gives an overview, explained in detail in the following subsections.

5.1 Action Button – Fixed vs Move

We compared the fixed Action Button against the adaptive version (Move).

Time: Using the adapted version was significantly faster than the baseline (t(28) = 3.11, p < .01). Note df = 28 in this test, since three people decided never to use the adaptation, and three others could not reach the baseline with their left hand at all.

Device Orientation: We found significantly less roll (Figure 3a) with Move than in the baseline (t(28) = 3.82, p < .01).


Reachability: Three people could not reach the baseline button with their left thumb. Observations and video analyses showed that many people required considerable effort and grip changes. Move could easily be reached by everyone.

5.2 Action Menu – Fixed vs Move or Bend

We compared the baseline Action Menu with two menu adaptations, moving (Move) and bending (Bend). All three menus featured five menu buttons as targets (Figure 3d).

Time: Using the adapted version was significantly faster than the baseline only for moving the menu (t(28) = 4.19, p < .001), but not for bending it.

Device Orientation: We found significantly less roll for both adaptations than with the baseline (Move: t(28) = 4.93, p < .001; Bend: t(29) = 3.72, p < .001).

Reachability: Five people could not open the baseline menu with their left hand. The closed bent menu at the same location could not be reached by four people. This difference is explained by the coping strategies that some people invented but only employed in some cases. No reachability problems occurred for the movable menu.

5.3 Action Bar – Fixed vs Roll

We compared the static baseline Action Bar with the adaptive version (Roll).

Time: Using the adapted version (Roll) was significantly faster than the baseline for both the left hand (t(34) = 6.12, p < .001) and the right hand (t(34) = 7.36, p < .001). We also found positive correlations between the time taken and the reaching distance, that is, the distance from the target to the screen edge of the holding hand (left hand: r = .445; right hand: r = .314). The standard deviation of time also correlated positively with reaching distance (left hand: r = .925; right hand: r = .788), indicating less controlled movements for further reaching.

Device Orientation: The adapted version had significantly less device roll than the baseline for both the left hand (t(34) = 16.28, p < .001) and the right hand (t(34) = 17.23, p < .001).

Reachability: Many people struggled to reach the outer targets of the baseline action bar: nine could not reach “6” and three could not reach “5” with their left hand (for labels see Figure 3c). With the right hand, seven could not reach “1” and two could not reach “2”. Roll enabled everyone to reach all targets.

5.4 Trigger Overhead

Adaptations require time for the trigger, the animation, and users’ reorientation. Repeated use of adapted elements might compensate for this. For our trigger, we observed the following results:

The Bending Action Menu could not compensate for the trigger, taking up to 3.03 s longer than the baseline after the fifth target. The Moving Action Button almost broke even, taking up to 0.43 s longer after the fifth target. The Moving Action Menu compensated for the trigger after three targets, leading to a mean advantage of 1.07 s after five targets. The Rolling Action Bar compensated for the trigger after four interactions, leading to a mean advantage of 1.23 s after six targets. These values relate to our main adaptation scenario (adapting to the left hand). We observed similar yet less pronounced results for right-hand adaptations.

Fig. 4. Coping strategies: a) tilting the device, b) and c) shifting the grip depending on the target location, d) unstable grips, bringing the second hand close to “catch” the device if necessary, and e) further strategies, such as using other fingers or reaching around the device.
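The break-even point behind these numbers can be computed directly: triggering pays off after n targets once n times the per-target saving exceeds the trigger overhead. A sketch with illustrative per-target times (the overhead value matches the ≈1.74 s discussed later; the per-target times are assumptions, not measured means):

```python
import math

def break_even_targets(overhead_s, t_baseline_s, t_adapted_s):
    """Smallest number of target selections n at which triggering an
    adaptation pays off, i.e. n * (t_baseline - t_adapted) >= overhead.
    Returns None if the adapted element is not faster per target."""
    saving = t_baseline_s - t_adapted_s
    if saving <= 0:
        return None
    return math.ceil(overhead_s / saving)

# With a 1.74 s trigger overhead and an assumed 0.5 s saving per target,
# the adaptation starts paying off from the 4th target on:
assert break_even_targets(1.74, 1.2, 0.7) == 4
# An adaptation that is slower per target never compensates:
assert break_even_targets(1.74, 0.7, 1.2) is None
```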

5.5 Comparison of Screen Sizes

We have focused on the results from the larger device so far (HTC One Max, 5.9 in screen), since such large devices are the main target of our adaptations. To further evaluate benefits and limitations, we now compare the results to those obtained on the smaller device (LG Nexus 5, 4.95 in):

We found almost no effect of screen size on time with the adapted elements. However, the baseline elements strongly profited from the smaller screen. In consequence, the speed improvements achieved by using adapted elements almost disappeared on the smaller device. The smaller device required slightly less tilting than the larger one; adapted elements still caused less tilting than non-adaptive ones. Reachability greatly improved on the smaller device. However, one person could still not reach the outermost target of the unadapted baseline Action Bar.

5.6 Reachability Coping Strategies

Our observations and video recordings revealed several coping strategies to deal with far targets: a main strategy was tilting the device, sometimes up to 90 degrees (Figure 4a). Another common strategy was to move the hand along the edge of the device (Figure 4b and c). This requires users to loosen their grip on the device. As a rough estimate from live/video observations, on average they moved approximately 2 cm along the edge to reach targets at the very top (on the HTC One Max). Some even moved the hand to the device’s bottom, to enable their thumbs to better cover the full screen width. This was difficult, and people often brought their second hand closer, ready to catch the device in case of a slip (Figure 4d). Two tried reaching around the device with their fingers (Figure 4e). This required considerable effort and was reported as tiresome. These participants also stated that they probably would have involved their second hand instead in real-life use.


5.7 Subjective User Ratings

After each task, people rated the used element with four Likert items. All adaptive elements were perceived as faster, more comfortable to use, less exhausting to reach, and safer in grip, compared to their baselines for use with the left hand on the larger device. These differences were significant for all concepts apart from Bend (Bonferroni-corrected Wilcoxon signed-rank tests, all p < .05). For the right hand, Roll was also preferred over the baseline in all questions, since it had a right-hand adaptation as well (Figure 2). Ratings for “It is exhausting to reach this element” had a moderate positive linear relationship with roll (r = .511). We also found weak to moderate negative relationships between roll and both comfort (r = -.497) and safe grip (r = -.425). On the smaller device, people did not feel at a disadvantage with static GUIs, yet adapted UIs still had an advantage for comfort.

6 Discussion and Implications

6.1 Can dynamic UI adaptations improve reachability?

On the larger device, 25% of participants could not reach the (unadapted) Action Bar, 15% the Floating Action Menu, and 9% the Floating Action Button. Others struggled to reach some targets. In general, the thumb's functional area depends on the grip [1], which we did not keep fixed to avoid unnatural use. Our adaptations moved elements into the centre (Bend when opened), or closer to the thumb (Move, Roll), thus making them generally easier to reach.

Crucially, our adaptations enabled all participants to reach all targets. The only exception was the bended menu (Bend), which did not move the menu button itself and thus could not improve reachability for an unopened menu. Apart from this exception, our adaptations greatly improved reachability, since they brought all targets into reach for everyone in our study.
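The notion of a grip-dependent functional area can be sketched as a simple geometric reachability test. The annulus model, the anchor position, and all coordinates and radii below are illustrative assumptions, loosely inspired by the functional-area concept in [1], not values from our study:

```python
import math

def within_functional_area(target, anchor, r_min, r_max):
    """Illustrative reachability test: a target counts as reachable if
    its distance from the thumb's grip anchor lies between a minimum
    comfortable bend (r_min) and maximum extension (r_max).
    All coordinates are in millimetres and purely hypothetical."""
    d = math.hypot(target[0] - anchor[0], target[1] - anchor[1])
    return r_min <= d <= r_max

# Hypothetical right-hand grip anchor near the lower-right bezel of a
# 5.9-inch device (screen approx. 73 x 130 mm), thumb reach 25-60 mm.
anchor = (73.0, 120.0)
print(within_functional_area((65.0, 90.0), anchor, 25.0, 60.0))   # nearby target
print(within_functional_area((10.0, 10.0), anchor, 25.0, 60.0))   # top-left corner
```

Under this toy model, the top-left corner falls outside the functional area, matching the reachability problems we observed for top-of-screen targets.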

Feedback, thinking-aloud, and Likert ratings further revealed that the majority of participants also found the adapted GUI elements less exhausting and more comfortable to use.

Reachability was much less of an issue on a 4.95-inch device compared to the 5.9-inch one. In conclusion, we see the main applications of our adaptations in reachability improvements on the very large end of smartphones, as well as on “phablets”.

6.2 Can dynamic UI adaptations improve speed?

The results show that our adaptations improve speed under certain circumstances. Firstly, adapted elements improved speed on the large device, yet this effect was marginal on the smaller one. This indicates that our adaptations are mainly beneficial on devices beyond the five-inch mark. Secondly, the trigger plays an important role. Two of four adaptations could compensate for the ≈1.74 s overhead caused by our example trigger. Moving a single button or changing the shape of a menu could not compensate for the trigger, but relocating whole menus saved time after three to six interactions. We argue that such numbers of interactions without switching hand posture occur in many use cases involving an action bar or menu, such as a browsing session.
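This trigger trade-off is simple break-even arithmetic. The sketch below uses the reported ≈1.74 s trigger overhead; the per-interaction saving is a hypothetical value for illustration, not a measured result:

```python
import math

def break_even_interactions(trigger_overhead_s, saving_per_interaction_s):
    """Number of adapted interactions after which a triggered adaptation
    has paid for its own trigger overhead."""
    return math.ceil(trigger_overhead_s / saving_per_interaction_s)

# 1.74 s overhead; a hypothetical saving of 0.4 s per adapted interaction.
print(break_even_interactions(1.74, 0.4))  # -> 5
```

With per-interaction savings in this range, the break-even point falls at a handful of interactions, consistent with the three to six interactions observed for relocated menus.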

Once adapted, all but one element (Bend) significantly shortened interaction time on the larger device. The exception for Bend is explained by the fact that the bended menu must still first be opened by reaching the menu button, as in the unadapted version.

Participants also subjectively rated these elements as significantly faster than the baselines. Even moving a single important button could be worthwhile with a fast or automatic trigger (e.g. by inferring hand postures from preceding touch behaviour [2,10,29]). Finally, all adaptations were subjectively perceived as faster than the baselines where they mattered most, namely on the larger device when used with the left hand.

6.3 Can dynamic UI adaptations facilitate a safer grip?

Device roll was significantly lower for elements adapted to the left hand, compared to the baselines. Pitch showed a similar yet non-significant tendency. The action bar's adaptation (Roll) also significantly reduced roll for the right hand. Less variance in targeting times suggests that Roll also resulted in more control, especially for the non-dominant hand. Since greater device tilting and movement may increase the risk of letting the device slip, these results suggest that adaptations can facilitate a safer grip.

This conclusion is supported by participants' own perceptions: Regarding their grip on the device, they perceived the adapted elements as significantly safer than the baselines. This was revealed by the Likert ratings and was in line with further verbal feedback. Recorded sensor values and users' Likert ratings were also moderately correlated (r = .4 to .5): Less device tilt was associated with higher ratings on comfort and grip, and lower ratings on exhaustion.

6.4 Do users accept dynamic UI adaptations?

All adaptations received more positive ratings than the baselines. Most people selected the Bending Action Menu as their favourite adaptive UI element. Interestingly, this was the only adaptation that could not improve speed, even without the trigger overhead. However, people's comments suggest that they liked the feeling of “ergonomic optimisation”. This was probably conveyed more visibly by the bended arrangement of the menu items than by location/rotation changes. Moreover, the bended layout matches the more comfortable movement directions, in which the thumb describes an arc instead of bending and extending (see [17]).
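The arc-shaped arrangement can be sketched as placing menu items at equal angular steps along a circle around a thumb pivot. The pivot position, radius, and angles below are illustrative assumptions, not the values used in our prototype:

```python
import math

def bend_layout(pivot, radius, start_deg, step_deg, n_items):
    """Place n_items menu-item centres along an arc around a thumb
    pivot, following the thumb's comfortable arc movement."""
    positions = []
    for i in range(n_items):
        a = math.radians(start_deg + i * step_deg)
        x = pivot[0] + radius * math.cos(a)
        y = pivot[1] - radius * math.sin(a)  # screen y grows downwards
        positions.append((round(x, 1), round(y, 1)))
    return positions

# Hypothetical right-hand pivot near the lower-right corner (mm),
# 45 mm radius, four items fanned out from 100 to 160 degrees.
for p in bend_layout((73.0, 130.0), 45.0, 100, 20, 4):
    print(p)
```

Every item then lies at the same distance from the pivot, so selecting any of them needs only the thumb's arc motion rather than bending and extending.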

In conclusion, all GUI adaptations were well accepted by our participants. When asked about explicit/implicit triggers after the study, some participants said that they liked the explicit influence over the UI, yet the majority stated that they would prefer automatic adaptation, if it worked reliably.

Page 14: Dynamic UI Adaptations for One-Handed Use of Large Mobile … · Author’s version Dynamic UI Adaptations for One-Handed Use of Large Mobile Touchscreen Devices Daniel Buschek, Maximilian

Author’s version

6.5 Design Implications

We expect several observations, insights, and lessons learned from this project to be useful for designing adaptive touch GUIs for large devices beyond this work:

Reconsider default locations: Our baselines showed that users of larger devices can face serious reachability problems, even for UIs following a modern touch GUI design language. On devices beyond five inches and without dynamic adaptations, navigation and action bars would be easier to reach at the bottom of the screen, and action buttons can improve reachability for both hands by being centred horizontally.

Separate controls from content: Our pre-study revealed that it is difficult for users to understand and follow adaptive UI elements that are less clearly separated from the main content (e.g. due to transparency or overlays).

Avoid introducing redundancy: In our concept discussions and pre-study, we found that duplicating UI elements at more easily reachable locations caused confusion due to the redundancy. Participants had to 1) understand that functionality had been duplicated and 2) mentally draw a connection between the new element and the normal one.

Make explicit adaptations “sticky”: We suggest that if adaptations are triggered by the user, they should not revert after a single interaction. First, based on our speed and trigger-overhead analyses, keeping elements in their adapted state for subsequent actions helps to compensate for the time required by the trigger. Second, participants who would prefer an explicit trigger outside of the study indicated that they liked the feeling of control and of changing the UI on their own command.
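The “sticky” behaviour can be sketched as a minimal state machine that adapts on an explicit trigger and only reverts on another explicit trigger. The class and method names are illustrative assumptions, not our prototype's implementation:

```python
class StickyAdaptation:
    """Minimal sketch: an element stays adapted across interactions
    until the user explicitly triggers a revert."""

    def __init__(self):
        self.adapted = False

    def on_trigger(self):
        # An explicit user trigger toggles the adaptation state.
        self.adapted = not self.adapted

    def on_interaction(self):
        # A normal interaction (e.g. a button tap) does NOT revert
        # the adaptation -- this is what makes it "sticky".
        return "adapted" if self.adapted else "baseline"

ui = StickyAdaptation()
ui.on_trigger()             # user performs the trigger gesture
print(ui.on_interaction())  # adapted
print(ui.on_interaction())  # still adapted on the next tap
ui.on_trigger()             # user reverts explicitly
print(ui.on_interaction())  # baseline
```

Keeping the adapted state across interactions is what lets the adaptation amortise its trigger overhead over several actions.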

Adapt view groups: Based on our analyses of trigger overhead against saved time, we suggest relocating multiple buttons at once (e.g. as in our moving menu and rotating action bar). This improves convenience and helps to compensate for trigger overhead, since users are then more likely to find subsequent actions already within reach.

An “ergonomic look” can facilitate user acceptance: Although bending the menu did not improve the quantitative measures, it looked ergonomically fitting for the thumb, and the majority of participants highlighted this adaptation as their favourite.

6.6 Integration of Adaptations into Existing GUIs

Our non-adapted elements are simply the familiar elements from Material Design, not new ones. This makes it easy to integrate them into existing apps: simply replace the old version with an adaptive one; the visuals stay the same. On the other hand, this may make it hard for users to discover the new adaptive functionality. Ideally, the presented adaptations would be integrated into existing mobile interfaces by making them available at the operating-system level. Thus, for example, all action bars would be adaptive. In contrast, integration into single apps might confuse users, since the GUI's behaviour then becomes less consistent across applications.


These issues can be addressed by explaining new adaptations to the user, for example with an overlay upon first use, similar to the commonly used app-introduction screens. More generally, new adaptiveness in any app or at OS level should be revealed and explained to avoid unexpected GUI behaviour from the users' point of view.

Finally, from a practical perspective, developers can realise adaptations like the ones presented here using generalised frameworks that facilitate implementing adaptive GUIs (e.g. see [3]).

7 Conclusion and Future Work

While large mobile devices are attractive for displaying multimedia content, they also introduce reachability problems when users need to operate them with a single hand, as required in many everyday situations. Some screen regions cannot be reached comfortably, or not at all.

We have proposed dynamic adaptations of basic touch GUI elements to mitigate reachability problems for one-handed use of large mobile devices. A lab study (N = 35) showed that adapted elements improve reachability on a large device, and can reduce interaction time and device tilting. Moreover, using the adaptations resulted in higher perceived comfort, less exhaustion, and a safer grip. We further derived lessons learned and implications for the future design of adaptive mobile touch GUIs for large devices.

To take full advantage of dynamic GUI adaptations in practice, the next step is to investigate better triggers, including automatic ones. For example, adaptations could be triggered based on automatically inferred hand postures (e.g. [2,3,10,29]).
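As a deliberately crude illustration of such an automatic trigger, one might infer the gripping hand from the horizontal skew of recent touch positions. This toy heuristic and its thresholds are our own illustrative assumptions, a stand-in for the model-based approaches in [2,10,29]:

```python
def infer_hand(touch_xs, screen_width, margin=0.1):
    """Toy heuristic: one-handed thumb input tends to drift towards the
    gripping hand's side of the screen. Returns 'left', 'right', or
    'unknown'. The margin threshold is an illustrative assumption."""
    if not touch_xs:
        return "unknown"
    mean_x = sum(touch_xs) / len(touch_xs)
    centre = screen_width / 2
    if mean_x < centre - margin * screen_width:
        return "left"
    if mean_x > centre + margin * screen_width:
        return "right"
    return "unknown"

# Touches clustered on the right half (pixel x-coordinates on a
# 1080 px wide screen) suggest a right-hand grip.
print(infer_hand([820, 900, 760, 880], screen_width=1080))  # right
```

A real trigger would combine richer features (touch shape, device motion, grip sensing), but even this sketch shows how an adaptation could be fired without any explicit user action.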

References

1. Bergstrom-Lehtovirta, J., Oulasvirta, A.: Modeling the functional area of the thumb on mobile touchscreen surfaces. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. pp. 1991–2000. CHI ’14, ACM, New York, NY, USA (2014), http://doi.acm.org/10.1145/2556288.2557354

2. Buschek, D., Alt, F.: TouchML: A machine learning toolkit for modelling spatial touch targeting behaviour. In: Proceedings of the 20th International Conference on Intelligent User Interfaces. pp. 110–114. IUI ’15, ACM, New York, NY, USA (2015), http://doi.acm.org/10.1145/2678025.2701381

3. Buschek, D., Alt, F.: ProbUI: Generalising touch target representations to enable declarative gesture definition for probabilistic GUIs. In: Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. pp. 4640–4653. CHI ’17, ACM, New York, NY, USA (2017), http://doi.acm.org/10.1145/3025453.3025502

4. Buschek, D., Schoenleben, O., Oulasvirta, A.: Improving accuracy in back-of-device multitouch typing: A clustering-based approach to keyboard updating. In: Proceedings of the 19th International Conference on Intelligent User Interfaces. pp. 57–66. IUI ’14, ACM, New York, NY, USA (2014), http://doi.acm.org/10.1145/2557500.2557501


5. Chang, Y., L’Yi, S., Koh, K., Seo, J.: Understanding users’ touch behavior on large mobile touch-screens and assisted targeting by tilting gesture. In: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. pp. 1499–1508. CHI ’15, ACM, New York, NY, USA (2015), http://doi.acm.org/10.1145/2702123.2702425

6. Cheng, L.P., Liang, H.S., Wu, C.Y., Chen, M.Y.: iGrasp: Grasp-based adaptive keyboard for mobile devices. In: CHI ’13 Extended Abstracts on Human Factors in Computing Systems. pp. 2791–2792. CHI EA ’13, ACM, New York, NY, USA (2013), http://doi.acm.org/10.1145/2468356.2479514

7. Fitts, P.M.: The Information Capacity of the Human Motor System in Controlling the Amplitude of Movement. Journal of Experimental Psychology 47(6), 381–391 (1954)

8. Girouard, A., Lo, J., Riyadh, M., Daliri, F., Eady, A.K., Pasquero, J.: One-handed bend interactions with deformable smartphones. In: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. pp. 1509–1518. CHI ’15, ACM, New York, NY, USA (2015), http://doi.acm.org/10.1145/2702123.2702513

9. Goel, M., Jansen, A., Mandel, T., Patel, S.N., Wobbrock, J.O.: ContextType: Using hand posture information to improve mobile touch screen text entry. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. pp. 2795–2798. CHI ’13, ACM, New York, NY, USA (2013), http://doi.acm.org/10.1145/2470654.2481386

10. Goel, M., Wobbrock, J., Patel, S.: GripSense: Using built-in sensors to detect hand posture and pressure on commodity mobile phones. In: Proceedings of the 25th Annual ACM Symposium on User Interface Software and Technology. pp. 545–554. UIST ’12, ACM, New York, NY, USA (2012), http://doi.acm.org/10.1145/2380116.2380184

11. Gunawardana, A., Paek, T., Meek, C.: Usability guided key-target resizing for soft keyboards. In: Proceedings of the 15th International Conference on Intelligent User Interfaces. pp. 111–118. IUI ’10, ACM, New York, NY, USA (2010), http://doi.acm.org/10.1145/1719970.1719986

12. Hong, J., Lee, G.: TouchShield: A virtual control for stable grip of a smartphone using the thumb. In: CHI ’13 Extended Abstracts on Human Factors in Computing Systems. pp. 1305–1310. CHI EA ’13, ACM, New York, NY, USA (2013), http://doi.acm.org/10.1145/2468356.2468589

13. Huot, S., Lecolinet, E.: SpiraList: A compact visualization technique for one-handed interaction with large lists on mobile devices. In: Proceedings of the 4th Nordic Conference on Human-Computer Interaction: Changing Roles. pp. 445–448. NordiCHI ’06, ACM, New York, NY, USA (2006), http://doi.acm.org/10.1145/1182475.1182533

14. Hurst, W., Merkle, P.: One-handed mobile video browsing. In: Proceedings of the 1st International Conference on Designing Interactive User Experiences for TV and Video. pp. 169–178. UXTV ’08, ACM, New York, NY, USA (2008), http://doi.acm.org/10.1145/1453805.1453839

15. Karlson, A.K., Bederson, B.B.: Studies in one-handed mobile design: Habit, desire and agility. Tech. rep., Proceedings of the 4th ERCIM Workshop on User Interfaces for All (UI4ALL ’98) (2006)

16. Karlson, A.K., Bederson, B.B.: One-handed touchscreen input for legacy applications. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. pp. 1399–1408. CHI ’08, ACM, New York, NY, USA (2008), http://doi.acm.org/10.1145/1357054.1357274


17. Karlson, A.K., Bederson, B.B., Contreras-Vidal, J.L.: Understanding Single-Handed Mobile Device Interaction. Handbook of Research on User Interface Design and Evaluation for Mobile Technology, pp. 86–101 (2007)

18. Karlson, A.K., Bederson, B.B., SanGiovanni, J.: AppLens and LaunchTile: Two designs for one-handed thumb use on small devices. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. pp. 201–210. CHI ’05, ACM, New York, NY, USA (2005), http://doi.acm.org/10.1145/1054972.1055001

19. Karlson, A., Bederson, B.: ThumbSpace: Generalized one-handed input for touchscreen-based mobile devices. In: Baranauskas, C., Palanque, P., Abascal, J., Barbosa, S.D.J. (eds.) Human-Computer Interaction – INTERACT 2007, Lecture Notes in Computer Science, vol. 4662, pp. 324–338. Springer Berlin Heidelberg (2007), http://dx.doi.org/10.1007/978-3-540-74796-3_30

20. Kim, S., Yu, J., Lee, G.: Interaction techniques for unreachable objects on the touchscreen. In: Proceedings of the 24th Australian Computer-Human Interaction Conference. pp. 295–298. OzCHI ’12, ACM, New York, NY, USA (2012), http://doi.acm.org/10.1145/2414536.2414585

21. Mohd Noor, M.F., Ramsay, A., Hughes, S., Rogers, S., Williamson, J., Murray-Smith, R.: 28 frames later: Predicting screen touches from back-of-device grip changes. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. pp. 2005–2008. CHI ’14, ACM, New York, NY, USA (2014), http://doi.acm.org/10.1145/2556288.2557148

22. Negulescu, M., McGrenere, J.: Grip change as an information side channel for mobile touch interaction. In: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. pp. 1519–1522. CHI ’15, ACM, New York, NY, USA (2015), http://doi.acm.org/10.1145/2702123.2702185

23. Ng, A., Brewster, S.A., Williamson, J.H.: Investigating the effects of encumbrance on one- and two-handed interactions with mobile devices. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. pp. 1981–1990. CHI ’14, ACM, New York, NY, USA (2014), http://doi.acm.org/10.1145/2556288.2557312

24. Oulasvirta, A., Bergstrom-Lehtovirta, J.: Ease of juggling: Studying the effects of manual multitasking. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. pp. 3103–3112. CHI ’11, ACM, New York, NY, USA (2011), http://doi.acm.org/10.1145/1978942.1979402

25. Roudaut, A., Huot, S., Lecolinet, E.: TapTap and MagStick: Improving one-handed target acquisition on small touch-screens. In: Proceedings of the Working Conference on Advanced Visual Interfaces. pp. 146–153. AVI ’08, ACM, New York, NY, USA (2008), http://doi.acm.org/10.1145/1385569.1385594

26. Wigdor, D., Forlines, C., Baudisch, P., Barnwell, J., Shen, C.: Lucid touch: A see-through mobile device. In: Proceedings of the 20th Annual ACM Symposium on User Interface Software and Technology. pp. 269–278. UIST ’07, ACM, New York, NY, USA (2007), http://doi.acm.org/10.1145/1294211.1294259

27. Wolf, K., McGee-Lennon, M., Brewster, S.: A study of on-device gestures. In: Proceedings of the 14th International Conference on Human-Computer Interaction with Mobile Devices and Services Companion. pp. 11–16. MobileHCI ’12, ACM, New York, NY, USA (2012), http://doi.acm.org/10.1145/2371664.2371669

28. Yang, X.D., Mak, E., Irani, P., Bischof, W.F.: Dual-Surface input: Augmenting one-handed interaction with coordinated front and behind-the-screen input. In: Proceedings of the 11th International Conference on Human-Computer Interaction with Mobile Devices and Services. pp. 5:1–5:10. MobileHCI ’09, ACM, New York, NY, USA (2009), http://doi.acm.org/10.1145/1613858.1613865


29. Yin, Y., Ouyang, T.Y., Partridge, K., Zhai, S.: Making touchscreen keyboards adaptive to keys, hand postures, and individuals: A hierarchical spatial backoff model approach. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. pp. 2775–2784. CHI ’13, ACM, New York, NY, USA (2013), http://doi.acm.org/10.1145/2470654.2481384