DocumentCode :
3417201
Title :
A new pruning algorithm for Feedforward Neural Networks
Author :
Ai, Fangju
Author_Institution :
Coll. of Math. & Comput. Sci., Hubei Univ., Wuhan, China
fYear :
2011
fDate :
19-21 Oct. 2011
Firstpage :
286
Lastpage :
289
Abstract :
The number of neurons in the hidden layers of a Feedforward Neural Network is closely related to its learning ability and generalization ability. The Iterative Pruning (IP) algorithm spends much time computing the adjustment factors of the remaining weights. An Improved Iterative Pruning (IIP) algorithm is therefore proposed, which adopts a block-division strategy and uses the Generalized Inverse Matrix (GIM) algorithm in place of the Conjugate Gradient Preconditioned Normal Equation (CGPCNE) algorithm for updating the remaining weights. Applied to the hidden layers of Feedforward Neural Networks, the IIP algorithm simplifies their structures to a great extent while preserving a good level of accuracy and generalization ability without retraining after pruning. Simulation results demonstrate the effectiveness and feasibility of the algorithm.
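The core idea described in the abstract, removing a hidden unit and compensating by re-solving for the remaining weights with a generalized (pseudo) inverse instead of an iterative CGPCNE solver, can be illustrated with a minimal sketch. This is not the paper's exact IIP procedure (the block-division strategy is omitted), and the function and variable names below are illustrative assumptions for a single hidden layer.

import numpy as np

def prune_hidden_unit(H, W_out, unit):
    """Remove one hidden unit and adjust the remaining hidden-to-output
    weights so the outputs on the training set are approximately preserved.

    H     : (n_samples, n_hidden) hidden-layer activations on the training set
    W_out : (n_hidden, n_outputs) hidden-to-output weights
    unit  : index of the hidden unit to prune
    Returns the adjusted weight matrix for the remaining hidden units."""
    Y = H @ W_out                        # outputs of the unpruned network
    H_kept = np.delete(H, unit, axis=1)  # activations without the pruned unit
    # Least-squares adjustment of the remaining weights via the
    # Moore-Penrose pseudoinverse (a generalized-inverse solve),
    # standing in for an iterative normal-equation solver.
    W_adj = np.linalg.pinv(H_kept) @ Y
    return W_adj

# Example usage with random data standing in for real activations.
rng = np.random.default_rng(0)
H = rng.standard_normal((200, 10))
W_out = rng.standard_normal((10, 3))
W_new = prune_hidden_unit(H, W_out, unit=4)
print(W_new.shape)  # (9, 3)

Because the adjusted weights come from a direct least-squares solve rather than retraining, the pruned network can retain accuracy without further gradient updates, which is the property the abstract emphasizes.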
Keywords :
conjugate gradient methods; feedforward neural nets; iterative methods; learning (artificial intelligence); matrix algebra; conjugate gradient precondition normal equation algorithm; dividing blocks strategy; feedforward neural networks; generalization ability; generalized inverse matrix algorithm; improved iterative pruning algorithm; learning ability; neurons; Algorithm design and analysis; Biological neural networks; Equations; IP networks; Mathematical model; Neurons; Signal processing algorithms;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Advanced Computational Intelligence (IWACI), 2011 Fourth International Workshop on
Conference_Location :
Wuhan
Print_ISBN :
978-1-61284-374-2
Type :
conf
DOI :
10.1109/IWACI.2011.6160018
Filename :
6160018