DocumentCode
50050
Title
Parsimonious Extreme Learning Machine Using Recursive Orthogonal Least Squares
Author
Ning Wang ; Meng Joo Er ; Min Han
Author_Institution
Marine Eng. Coll., Dalian Maritime Univ., Dalian, China
Volume
25
Issue
10
fYear
2014
fDate
Oct. 2014
Firstpage
1828
Lastpage
1841
Abstract
Novel constructive and destructive parsimonious extreme learning machines (CP- and DP-ELM) are proposed in this paper. By virtue of the proposed ELMs, parsimonious structure and excellent generalization of multi-input multi-output single hidden-layer feedforward networks (SLFNs) are obtained. The proposed ELMs are developed by an innovative decomposition of the recursive orthogonal least squares procedure into sequential partial orthogonalization (SPO). The salient features of the proposed approaches are as follows: 1) Initial hidden nodes are randomly generated by the ELM methodology and recursively orthogonalized into an upper triangular matrix with a dramatic reduction in matrix size; 2) the constructive SPO in the CP-ELM focuses on the partial matrix with the subcolumn of the selected regressor, including nonzeros, as the first column, while the destructive SPO in the DP-ELM operates on the partial matrix including elements determined by the removed regressor; 3) termination criteria for the CP- and DP-ELM are simplified by the additional residual error reduction method; and 4) the output weights of the SLFN need not be solved in the model selection procedure and are derived from the final upper triangular equation by backward substitution. Both single- and multi-output real-world regression data sets are used to verify the effectiveness and superiority of the CP- and DP-ELM in terms of parsimonious architecture and generalization accuracy. Innovative applications to nonlinear time-series modeling demonstrate superior identification results.
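The final step described in the abstract, recovering the output weights from the final upper triangular equation by backward substitution, can be sketched as follows. This is a minimal illustration of generic backward substitution, not the authors' implementation; the matrix `R` and right-hand side `z` stand in for the upper triangular system produced by the recursive orthogonalization.

```python
# Minimal sketch (not the paper's code): once ROLS/SPO has reduced the
# hidden-layer system to an upper triangular equation R w = z, the SLFN
# output weights w follow by backward substitution.

def backward_substitution(R, z):
    """Solve R w = z for w, where R is upper triangular (list of lists)."""
    n = len(z)
    w = [0.0] * n
    # Work from the last row upward; each row yields one unknown.
    for i in range(n - 1, -1, -1):
        s = sum(R[i][j] * w[j] for j in range(i + 1, n))
        w[i] = (z[i] - s) / R[i][i]
    return w

# Example: R = [[2, 1], [0, 3]], z = [5, 6]  ->  w = [1.5, 2.0]
print(backward_substitution([[2.0, 1.0], [0.0, 3.0]], [5.0, 6.0]))
```

Because the triangular system is solved only once, after model selection has finished, no weight computation is needed inside the selection loop, which is the efficiency the abstract's point 4 highlights.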
Keywords
data analysis; feedforward neural nets; generalisation (artificial intelligence); learning (artificial intelligence); least squares approximations; matrix algebra; recursive functions; regression analysis; time series; CP-ELM; DP-ELM; ELM methodology; SLFN; backward substitution; constructive parsimonious extreme learning machine; destructive SPO; destructive parsimonious extreme learning machine; generalization accuracy; hidden node random generation; matrix size reduction; multi-input multi-output single hidden-layer feedforward networks; nonlinear time-series modeling; parsimonious architecture; parsimonious structure; partial matrix; recursive orthogonal least squares decomposition; recursive orthogonalization; regression data set; regressor; residual error reduction method; sequential partial orthogonalization; termination criteria; upper triangular equation; upper triangular matrix; Mathematical model; Matrix decomposition; Training; Training data; Vectors; Extreme learning machine (ELM); parsimonious model selection; recursive orthogonal least squares (ROLS); sequential partial orthogonalization (SPO); single hidden-layer feedforward network (SLFN)
fLanguage
English
Journal_Title
IEEE Transactions on Neural Networks and Learning Systems
Publisher
IEEE
ISSN
2162-237X
Type
jour
DOI
10.1109/TNNLS.2013.2296048
Filename
6704311
Link To Document