DocumentCode :
2372068
Title :
Unsupervised machine learning and cognitive systems in learning user state for context-aware computing
Author :
Smailagic, A.
Author_Institution :
Carnegie Mellon University
fYear :
2004
fDate :
16-18 Dec. 2004
Firstpage :
1
Lastpage :
2
Abstract :
We at Carnegie Mellon University have pioneered context-aware mobile computing and built its first prototypes, including context-aware mobile phones and a context-aware personal communicator. These prototypes use machine learning and cognitive modeling techniques to derive user state and intent from the devices' sensors. Context-aware computing describes the situation where a mobile computer is aware of its user's state and surroundings and modifies its behavior based on this information. We have demonstrated the power of our method to automatically derive a meaningful user context model and performed experimental measurements and evaluation. We have employed unsupervised machine learning techniques to combine real-time data from multiple sensors into a model of behavior that is individualized to the user. We observe that context does not require a descriptive label to be used for adaptivity and contextually sensitive response. This makes our approach toward completely unsupervised machine learning feasible. By unsupervised learning we mean the identification of the user's context without requiring manual annotation of current user states. We use unsupervised machine learning techniques to independently cluster sensor quantities and associate user interactions with these clusters. This discretization enables learning from observations about the user. Each time a user interaction is observed, it is interpreted as a labeled example that can be used to construct a statistical model of context-dependent preferences. Example context-aware parameters include: location, nearby people and devices, calendar and other cyber-sensor information, movement patterns and characteristics, user preferences, interests, and behavior patterns. By mapping observable parameters into cognitive states, the computing system can estimate the form of interaction that minimizes user distraction and the risk of cognitive overload. The capabilities proposed here extend the state of the art significantly, sometimes in a radical fashion, other times more incrementally. Our approach produces enriched observations by combining machine learning, instrumentation in software applications, sensors describing the user state, and task context information. Such diverse sensor fusion (symbolic and signal sensors) for inferring context and state goes well beyond the situation sensing currently practiced, even in experimental settings.
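Illustrative sketch (not from the paper): one way to realize the pipeline the abstract describes, i.e. unsupervised clustering of multi-sensor data followed by associating observed user interactions with the resulting clusters to form a statistical model of context-dependent preferences. The sensor features, the choice of k-means, the number of clusters, and the interaction labels are all assumptions for illustration only.

```python
# Hypothetical sketch: cluster unlabeled sensor readings, then treat each
# observed user interaction as a labeled example for the cluster active at
# that moment, yielding per-cluster preference statistics.
import numpy as np
from collections import Counter, defaultdict
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Simulated real-time sensor stream: each row is one time step with
# features such as location coordinates, motion level, and ambient noise.
sensor_data = rng.normal(size=(500, 4))

# Step 1: cluster the sensor stream without any manual labels.
kmeans = KMeans(n_clusters=5, n_init=10, random_state=0)
cluster_ids = kmeans.fit_predict(sensor_data)

# Simulated user interactions observed at some time steps, e.g. which
# notification mode the user chose ("ring", "vibrate", "silent").
interactions = {t: rng.choice(["ring", "vibrate", "silent"])
                for t in rng.choice(len(sensor_data), size=120, replace=False)}

# Step 2: accumulate per-cluster counts of the user's choices; the cluster
# needs no descriptive name to be useful for adaptation.
prefs = defaultdict(Counter)
for t, action in interactions.items():
    prefs[cluster_ids[t]][action] += 1

# Step 3: predict the most likely preference for a new sensor reading by
# mapping it to its nearest cluster.
def predict_preference(reading):
    cluster = kmeans.predict(reading.reshape(1, -1))[0]
    counts = prefs[cluster]
    return counts.most_common(1)[0][0] if counts else None

print(predict_preference(rng.normal(size=4)))
```

As in the abstract, the clusters here carry no semantic labels; adaptation relies only on the statistics of interactions observed within each cluster.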
Keywords :
Biosensors; Context modeling; Context-aware services; Machine learning; Mobile computing; Prototypes; Sensor fusion; Sensor phenomena and characterization; Very large scale integration; Wearable computers;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Machine Learning and Applications, 2004. Proceedings. 2004 International Conference on
Conference_Location :
Louisville, Kentucky, USA
Print_ISBN :
0-7803-8823-2
Type :
conf
DOI :
10.1109/ICMLA.2004.1383485
Filename :
1383485