Title :
An outer product neural network for extracting principal components from a time series
Author_Institution :
Surveillance Syst. Div., Lockheed Sanders, Hudson, NH, USA
Date :
30 Sep-1 Oct 1991
Abstract :
An outer product neural network architecture has been developed based on subspace concepts. The network is trained by auto-encoding the input exemplars and represents the input signal by k principal components, where k is the number of neurons (processing elements) in the network. The network is essentially a single linear layer whose weight matrix columns orthonormalize during training. The output signal converges to the projection of the input onto the k-principal-component subspace, while the residual signal represents the novelty of the input. An application to extracting sinusoids from a noisy time series is given.
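Illustration (not from the paper): the abstract describes a single linear layer trained by auto-encoding input exemplars so that the weight matrix columns orthonormalize and the output converges to a projection onto a k-principal-component subspace, with the residual acting as a novelty signal. The sketch below illustrates that idea under the assumption of an Oja-type symmetric subspace update, Delta W = eta * (x - W y) y^T with y = W^T x; the exemplar length, learning rate, and sinusoid/noise parameters are illustrative choices, not values taken from the paper.

# Hedged sketch of a subspace (auto-encoding) network for a noisy time series.
# Assumption: an Oja-type symmetric subspace rule; the paper's exact update
# rule and parameters are not reproduced here.
import numpy as np

rng = np.random.default_rng(0)

n = 32          # exemplar (input vector) length -- assumed value
k = 4           # number of neurons = principal components retained
t = np.arange(4000)
series = np.sin(2 * np.pi * 0.05 * t) + 0.5 * np.sin(2 * np.pi * 0.13 * t)
series += 0.3 * rng.standard_normal(t.size)          # additive noise
exemplars = np.lib.stride_tricks.sliding_window_view(series, n)

W = 0.1 * rng.standard_normal((n, k))   # weight matrix; columns -> subspace basis
eta = 1e-3                              # learning rate -- assumed value

for epoch in range(5):
    for x in exemplars:
        y = W.T @ x                     # network output: k component activations
        x_hat = W @ y                   # auto-encoded reconstruction (projection)
        e = x - x_hat                   # residual = "novelty" of the input
        W += eta * np.outer(e, y)       # outer-product weight update

# After training, the columns of W are approximately orthonormal and span
# a k-dimensional principal subspace of the exemplars.
print("Gram matrix W^T W (approximately identity after training):")
print(np.round(W.T @ W, 2))

In this formulation the outer-product update eta * e y^T is exactly the auto-encoding residual times the output, which is why the reconstruction error drives the columns toward the principal subspace while the residual flags components of the input outside that subspace.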
Keywords :
learning (artificial intelligence); neural nets; pattern recognition; time series; auto-encoding; input exemplars; k-principal components; neurons; outer product neural network; processing elements; weight matrix columns; Artificial neural networks; Cost function; Feature extraction; Neural networks; Neurons; Noise reduction; Signal processing; Statistics; Surveillance; Vectors;
Conference_Title :
Neural Networks for Signal Processing: Proceedings of the 1991 IEEE Workshop
Conference_Location :
Princeton, NJ
Print_ISBN :
0-7803-0118-8
DOI :
10.1109/NNSP.1991.239525