  • DocumentCode
    56149
  • Title
    Predicting Targets of Human Reaching Motions Using Different Sensing Technologies
  • Author
    Novak, D.; Omlin, Ximena; Leins-Hess, Rebecca; Riener, Robert
  • Author_Institution
    Sensory-Motor Syst. Lab., ETH Zurich, Zurich, Switzerland
  • Volume
    60
  • Issue
    9
  • fYear
    2013
  • fDate
    Sept. 2013
  • Firstpage
    2645
  • Lastpage
    2654
  • Abstract
    Rapid recognition of voluntary motions is crucial in human-computer interaction, but few studies compare the predictive abilities of different sensing technologies. This paper therefore compares the performance of several technologies in predicting the targets of human reaching motions: electroencephalography (EEG), electrooculography, camera-based eye tracking, electromyography (EMG), hand position, and the user's preferences. Supervised machine learning is used to make predictions at different points in time (before and during limb motion) with each individual sensing modality. Different modalities are then combined using an algorithm that takes into account the different times at which the modalities provide useful information. Results show that EEG can make predictions before limb motion onset, but it requires subject-specific training and its performance decreases as the number of possible targets increases. EMG and hand position give high accuracy, but only once the motion has begun. Eye tracking is robust and exhibits high accuracy at the very onset of limb motion. Several advantages of combining different modalities are also shown, including the advantages of combining measurements with contextual data. Finally, recommendations are given for sensing modalities with regard to different criteria and applications. This information could aid human-computer interaction designers in selecting and evaluating appropriate equipment for their applications.
  • Keywords
    biomechanics; cameras; electroencephalography; electromyography; learning (artificial intelligence); man-machine systems; medical signal processing; EEG; EMG; camera-based eye tracking; electroencephalography; electromyography; electrooculography; hand position; human reaching motion; human-computer interaction; limb motion; sensing technology; signal preprocessing; supervised machine learning; user preferences; Accuracy; Electroencephalography; Electromyography; Electrooculography; Feature extraction; Sensors; Tracking; Human–computer interaction; intention detection; machine learning; physiology; sensor fusion; Adult; Artificial Intelligence; Biomedical Engineering; Electrodiagnosis; Female; Humans; Intention; Male; Man-Machine Systems; Movement; Reproducibility of Results; Signal Processing, Computer-Assisted; Task Performance and Analysis; Upper Extremity
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Biomedical Engineering
  • Publisher
    IEEE
  • ISSN
    0018-9294
  • Type
    jour
  • DOI
    10.1109/TBME.2013.2262455
  • Filename
    6515157