Experimental analysis of touch-screen gesture designs in mobile environments by Andrew Bragdon, Eugene Nelson, Yang Li, and Ken Hinckley. Published in CHI '11: Proceedings of the 2011 annual conference on Human factors in computing systems.
Author Bios
- Andrew Bragdon is a second year PhD student at Brown University focusing on human-computer interaction.
- Eugene Nelson is a PhD student at Brown University.
- Yang Li is a researcher at Google and earned his PhD from the Chinese Academy of Sciences.
- Ken Hinckley is a Principal Researcher at Microsoft Research and has a PhD from the University of Virginia.
Hypothesis
Bezel-initiated and mark-based gestures can improve user performance with mobile touch screens, in terms of both accuracy and the visual attention required.
Methods
The participants were moderately comfortable with computing. The phones ran Android 2.1, with the built-in hard buttons disabled for the test. Eye gaze data was recorded to determine when eye movements started and stopped, with the display and phone separated sufficiently to identify those movements. To simulate expert usage, icons appeared to reveal the needed gesture or command, and feedback was immediately relayed to the user.
Results
Bezel marks had the lowest mean completion time, though there was no significant difference between the mean times for soft-button marks and hard-button marks. There was also no significant difference between soft-button and bezel paths, but there was a noticeable increase in mean completion time for hard-button paths relative to bezel paths. Bezel marks and soft buttons performed similarly under direct attention, but across the various distraction types bezel marks significantly and consistently outperformed soft buttons.
Contents
The authors examine four factors that play a crucial role in interaction with smartphones: moding technique (hard button, soft button, and bezel-based), gesture type (mark-based and free-form), user's motor activity (sitting and walking), and distraction level of the environment (no distraction, moderate awareness, and attention-saturating task).
The authors were motivated by observations such as these: a physical button that produces a mechanical click lets people press it without looking at the screen to confirm the press, and is less cumbersome as well; similarly, the tactile feedback of touching the bezel confirms to users that they are contacting the bezel without having to look at the screen, and can be used with a single hand.
Discussion
I believe it was an all right paper, but much of this knowledge was already intuitive.