DocumentCode
3350852
Title
VIS-Tracker: a wearable vision-inertial self-tracker
Author
Foxlin, Eric; Naimark, Leonid
fYear
2003
fDate
22-26 March 2003
Firstpage
199
Lastpage
206
Abstract
We present a demonstrated and commercially viable self-tracker that uses robust software to fuse data from inertial and vision sensors. Compared to infrastructure-based trackers, self-trackers have the advantage that objects can be tracked over an extremely wide area without the prohibitive cost of an extensive network of sensors or emitters to track them. So far, most AR research has focused on the long-term goal of a purely vision-based tracker that can operate in arbitrary unprepared environments, even outdoors. We instead chose to start with artificial fiducials in order to quickly develop the first self-tracker that is small enough to wear on a belt, low in cost, easy to install and self-calibrate, and low enough in latency to achieve AR registration. We also present a roadmap for how we plan to migrate from artificial fiducials to natural ones. By designing to the requirements of AR, our system can easily handle the less challenging applications of wearable VR systems and robot navigation.
Keywords
augmented reality; image sensors; optical tracking; sensor fusion; wearable computers; AR registration; AR research; VIS-Tracker; artificial fiducials; commercially viable self-tracker; inertial sensors; infrastructure-based trackers; robot navigation; robust software; vision sensors; vision-based tracker; wearable VR systems; wearable vision-inertial self-tracker; Belts; Costs; Delay; Fuses; Navigation; Robots; Robustness; Sensor fusion; Virtual reality; Wearable sensors
fLanguage
English
Publisher
IEEE
Conference_Titel
IEEE Virtual Reality, 2003. Proceedings.
ISSN
1087-8270
Print_ISBN
0-7695-1882-6
Type
conf
DOI
10.1109/VR.2003.1191139
Filename
1191139
Link To Document