Title :
Sparse locally linear and neighbor embedding for nonlinear time series prediction
Author :
Mohamed Waleed Fakhr
Author_Institution :
Coll. of Comput., Cairo, Egypt
Abstract :
This paper proposes a dictionary-based L1-norm sparse coding approach for time series prediction that requires no training phase and minimal parameter tuning, making it suitable for nonstationary and online prediction applications. The prediction is formulated as a basis pursuit L1-norm problem, in which a sparse set of weights is estimated for each test vector. Constrained sparse coding formulations are compared, including sparse locally linear embedding and sparse nearest-neighbor embedding. Sixteen time series datasets are used to evaluate the approach for offline time series prediction, where the training data is fixed. The proposed approach is also compared to bagging trees (BT), least-squares support vector regression (LSSVM), and a regularized autoregressive (AR) model. The proposed sparse coding prediction outperforms the LSSVM tuned with 10-fold cross-validation, and performs significantly better than the regularized AR model and bagging trees. On average, a few thousand sparse coding predictions can be made in the time the LSSVM takes to train, making the proposed technique suitable for online prediction and highly nonstationary data.
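The abstract's core idea, prediction without a training phase by sparse-coding each test vector over a dictionary of past lag windows, can be sketched as follows. This is a minimal illustrative reconstruction, not the paper's exact formulation: the window length, the toy sinusoid data, and the ISTA (iterative soft-thresholding) lasso solver are all assumptions made for the sketch.

```python
import numpy as np

def ista_lasso(D, x, lam=1e-3, n_iter=500):
    """Approximately solve min_w 0.5*||D w - x||^2 + lam*||w||_1
    via iterative soft-thresholding (a standard basis pursuit denoising solver)."""
    L = np.linalg.norm(D, 2) ** 2            # Lipschitz constant of the gradient
    w = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ w - x)             # gradient of the quadratic term
        w = w - grad / L                     # gradient step
        w = np.sign(w) * np.maximum(np.abs(w) - lam / L, 0.0)  # soft threshold
    return w

# Toy series: a sampled sinusoid (an AR(2) process, so the next value is an
# exact linear function of any lag window).
t = np.arange(200)
s = np.sin(0.2 * t)

m = 8                                        # lag-window length (assumed)
train_end = 150
# Dictionary atoms: past lag windows from the "training" segment;
# targets: the value that followed each window.
D = np.stack([s[i:i + m] for i in range(train_end - m)], axis=1)
targets = np.array([s[i + m] for i in range(train_end - m)])

x = s[160:160 + m]                           # test lag window
w = ista_lasso(D, x)                         # sparse weights for this test vector
pred = targets @ w                           # next-value prediction via the same weights
print(float(pred), float(s[168]))
```

Because the dictionary is just the stored training windows, there is nothing to train: each new test vector only requires solving one small L1 problem, which is what makes the method attractive for online and nonstationary settings.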
Keywords :
"Dictionaries","Training","Time series analysis","Encoding","Training data","Predictive models","Prediction algorithms"
Conference_Titel :
2015 Tenth International Conference on Computer Engineering & Systems (ICCES)
DOI :
10.1109/ICCES.2015.7393078