Author/Authors :
Nicolas Pican, Frédéric Alexandre
Abstract :
Artificial neural networks (ANNs) are widely used for classification tasks in which both discriminant cues and contextual parameters are provided as ANN inputs. When the input space is too large to allow robust learning within a limited time, a classical solution is to design a set of ANNs, one per context domain. We have proposed a new learning algorithm, the lateral contribution learning algorithm (LCLA), based on the backpropagation learning algorithm, which supports such a solution with reduced learning time and better performance thanks to lateral influences between networks. This attractive but heavy solution has been improved through the orthogonal weight estimator (OWE) technique, an original architectural technique which, under light constraints, merges the set of ANNs into a single ANN whose weights are dynamically estimated, for each example, by other ANNs fed with the context. This architecture yields a very rich and interesting interpretation of the weight landscape. We illustrate this interpretation with two examples: the estimation of a mathematical function and the modelling of a process used in neurocontrol.
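The OWE idea described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: all dimensions, the single shared linear estimator, and the tanh activation are assumptions made for brevity. The point is only to show the architectural principle, namely that the main network's weights are not free parameters but are recomputed per example by estimator networks fed with the context.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not from the paper)
n_in, n_out, n_ctx = 3, 2, 4

# Weight-estimator parameters: a single linear map from the context
# vector to all weights of the main network (a simplification of one
# estimator ANN per main-network weight).
W_est = rng.normal(scale=0.1, size=(n_ctx, n_in * n_out))

def estimated_weights(context):
    """Estimator maps the context to the main net's weight matrix."""
    return (context @ W_est).reshape(n_in, n_out)

def main_net(x, context):
    """Main ANN whose weights are set per-example from the context."""
    W = estimated_weights(context)          # dynamic, example-specific weights
    return np.tanh(x @ W)

x = rng.normal(size=n_in)   # discriminant cues
c = rng.normal(size=n_ctx)  # contextual parameters
y = main_net(x, c)
```

Because the weights vary smoothly with the context, inspecting `estimated_weights` over the context space is what gives the "weight landscape" interpretation mentioned in the abstract.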