Title :
Using relative head and hand-target features to predict intention in 3D moving-target selection
Author :
Casallas, Juan Sebastian; Oliver, James H.; Kelly, J. W.; Merienne, Frederic; Garbaya, Samir
Date :
March 29, 2014 - April 2, 2014
Abstract :
Selection of moving targets is a common yet complex task in human-computer interaction (HCI) and virtual reality (VR). Predicting user intention may help address the challenges inherent in interaction techniques for moving-target selection. This article extends previous models by integrating relative head-target and hand-target features to predict the intended moving target. The features are computed in a time window ending at roughly two-thirds of the total target selection time and evaluated using decision trees. With two targets, this model predicts user choice with up to ~72% accuracy on general moving-target selection tasks, and up to ~78% when task-related target properties are also included.
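To make the abstract's idea concrete, the sketch below illustrates one plausible form of "relative head-target and hand-target features" and a simple decision rule over them. The feature definitions (gaze-target angle, hand-target distance) and the classifier are illustrative assumptions, not the paper's actual model, which trains decision trees on features from a time window of the selection movement.

```python
import math

def relative_features(head_pos, head_dir, hand_pos, target_pos):
    """Hypothetical relative features: angle between the head's view
    direction and the head-to-target vector, and hand-target distance.
    These are illustrative stand-ins for the paper's feature set."""
    to_target = [t - h for t, h in zip(target_pos, head_pos)]
    d_target = math.sqrt(sum(c * c for c in to_target)) or 1.0
    d_dir = math.sqrt(sum(c * c for c in head_dir)) or 1.0
    cos_a = sum(a * b for a, b in zip(head_dir, to_target)) / (d_target * d_dir)
    angle = math.acos(max(-1.0, min(1.0, cos_a)))  # radians
    dist = math.sqrt(sum((t - p) ** 2 for t, p in zip(target_pos, hand_pos)))
    return angle, dist

def predict_intended(head_pos, head_dir, hand_pos, targets):
    """Toy decision rule standing in for a trained decision tree:
    choose the target with the smallest gaze angle, breaking ties
    by hand-target distance. Returns the index of the chosen target."""
    feats = [relative_features(head_pos, head_dir, hand_pos, t) for t in targets]
    return min(range(len(targets)), key=lambda i: feats[i])

# Two candidate targets; the head looks straight down +z toward target 0.
choice = predict_intended(
    head_pos=(0.0, 0.0, 0.0),
    head_dir=(0.0, 0.0, 1.0),
    hand_pos=(0.0, -0.3, 0.2),
    targets=[(0.0, 0.0, 5.0), (3.0, 0.0, 4.0)],
)
print(choice)  # target 0 is closer to the gaze direction
```

In the paper these features would be computed over a time window ending at roughly two-thirds of the selection time and fed to a learned decision tree rather than this fixed rule.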
Keywords :
3D moving-target selection; decision trees; human-computer interaction (HCI); virtual reality (VR); relative hand-target features; relative head-target features; task-related target properties; user choice prediction; user intention prediction; predictive models; solid modeling; three-dimensional displays; H.5.2 [Information interfaces and presentation]: User Interfaces - Interaction Styles, Theory and methods; I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism - Virtual Reality; I.5.4 [Pattern Recognition]: Applications
Conference_Titel :
Virtual Reality (VR), 2014 IEEE
Conference_Location :
Minneapolis, MN
DOI :
10.1109/VR.2014.6802050