Reference Information
Imaginary Interfaces: Spatial Interaction with Empty Hands and without Visual Feedback
By Sean Gustafson, Daniel Bierwirth, and Patrick Baudisch
Presented at UIST'10, October 3-6, New York, New York, USA
Author Bios
- Sean Gustafson is a PhD student in the Hasso Plattner Institute's Human-Computer Interaction Lab. He focuses on new forms of interacting with computers and holds bachelor's and master's degrees in Computer Science from the University of Manitoba.
- Daniel Bierwirth received a master's degree in IT-Systems Engineering from Hasso-Plattner Institute and now works for Matt Hatting & Company UG and Agentur Richard GbR. He focuses on mobile application development and design thinking.
- Dr. Patrick Baudisch is the head of the Human Computer Interaction group at the Hasso-Plattner Institute. His research focuses on interaction techniques, especially with small and large devices.
Summary
Hypothesis
To what extent can users successfully interact with an imaginary user interface that provides no visual feedback?
Methods
In three user studies, participants (young adults) created simple drawings, edited existing ones, experienced deliberate interruptions, and pointed to locations in space relative to a user-defined origin. An optical tracker detected gestures via markers on the participants' gloves, minimizing system limitations.
Results
Users were partially successful with the system. Simple sketches and single-stroke characters were recognized with a 95% success rate, which was considered a win for the system, and users' accuracy improved with repeated shapes. Multi-segment drawings fared worse, however. The test of users' ability to remember where objects were after a brief interruption from the task was a partial success: accuracy was higher for participants who did not rotate and for those who used a reference point.
Contents
Users worked with objects in a 2D plane. The non-dominant hand, held in an L shape, defined the origin of the plane, and all spatial references were derived from that hand's position; the system requires no devices to be held in-hand. The technology extends work in wearable computing, gestural input, mobile computing, and spatial interaction. The 3D coordinates captured during the tests were converted to 2D from the user's perspective. The first test presented users with a page of simple shapes to draw, using the left hand as the origin and finishing at the user's discretion. Some of the shapes were letters whose connecting strokes had posed problems in prior studies; others were repeated simple shapes and multi-stroke drawings. The multi-stroke gestures were not formally analyzed. This test suggested that users augmented their visuospatial memory throughout the test. The second test examined the longevity of visuospatial memory after interrupting the participants: they rotated 90 degrees after drawing a shape and then indicated a requested corner of that shape. Notably, using the left hand as a reference point increased accuracy even after an interruption. The third test measured pointing accuracy on a coordinate plane, with the left hand defining the x and y unit vectors.
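The coordinate handling described above, in which tracked 3D positions are mapped into a 2D plane anchored at the non-dominant hand, can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the function name and the assumption that the tracker supplies an origin point plus two axis directions (e.g. along the thumb and forefinger of the L shape) are hypothetical.

```python
import numpy as np

def to_plane_coords(point, origin, x_axis, y_axis):
    """Project a tracked 3D point into the 2D plane defined by the
    non-dominant hand. `origin` is the corner of the L shape;
    `x_axis` and `y_axis` are 3D direction vectors for the plane's
    axes (normalized here so their lengths do not matter)."""
    x_hat = x_axis / np.linalg.norm(x_axis)
    y_hat = y_axis / np.linalg.norm(y_axis)
    d = point - origin                      # offset from the hand's origin
    return np.array([d @ x_hat, d @ y_hat])  # components along each axis

# Example: a point 2 units along the thumb and 3 along the forefinger
p = to_plane_coords(np.array([2.0, 3.0, 5.0]),
                    np.array([0.0, 0.0, 0.0]),
                    np.array([1.0, 0.0, 0.0]),
                    np.array([0.0, 1.0, 0.0]))
```

Projecting onto the hand's axes like this also discards the component perpendicular to the plane, which matches the paper's framing of the interaction as 2D drawing from the user's perspective.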
Discussion
This paper does a good job of measuring users' ability to operate a screenless, feedback-free, gesture-based interface. It also shows that such an interface could be useful for simple tasks, and that this particular system outperforms competing approaches. The work is interesting because almost all computing today takes place on a screen, where feedback is easy to see. The paper challenges how computers can be used, and I really appreciated that. However, I am hesitant about the device's usability, given the large set of gestures that may need to be learned and the way gesturing might interfere with whatever else the user is doing.