Title of article :
Layered neural networks with non-monotonic transfer functions
Author/Authors :
Katsuki Katayama, Yasuo Sakata, Tsuyoshi Horiguchi
Issue Information :
Journal issue, serial year 2002
Pages :
29
From page :
270
To page :
298
Abstract :
We investigate the storage capacity and generalization ability of two types of fully connected layered neural networks with non-monotonic transfer functions; random patterns are embedded into the networks by a Hebbian learning rule. The first is a layered network in which the non-monotonic transfer function of the even layers differs from that of the odd layers. The second is a layered network with intra-layer connections, in which the non-monotonic transfer function for the inter-layer connections differs from that for the intra-layer connections, and the inter-layer and intra-layer neurons are updated alternately. We derive recursion relations for the order parameters of these layered networks by the signal-to-noise-ratio method. We show that the storage capacity and generalization ability of these layered networks are enhanced, in comparison with networks using a conventional monotonic transfer function, when the non-monotonicity of the transfer functions is chosen optimally. We also point out that chaotic behavior appears in the order parameters of the layered networks as the non-monotonicity of the transfer functions increases.
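The abstract describes embedding random patterns with a Hebbian rule and propagating activity through layers whose transfer function is non-monotonic, with the non-monotonicity differing between even and odd layers. The sketch below is a minimal illustration of that setup, not the authors' exact model: the specific non-monotonic function (sign reversal of the output once the local field exceeds a threshold), the thresholds THETA_ODD and THETA_EVEN, and the network sizes are illustrative assumptions only.

```python
import numpy as np

# Illustrative sizes and thresholds (assumptions, not values from the paper):
# N neurons per layer, P stored patterns, LAYERS layers.
N, P, LAYERS = 500, 50, 6
THETA_ODD, THETA_EVEN = 1.5, 0.8
rng = np.random.default_rng(0)

def nonmonotonic(h, theta):
    """A commonly assumed non-monotonic transfer function: the output follows
    sign(h) for small local fields and reverses sign once |h| exceeds theta."""
    return np.where(np.abs(h) < theta, np.sign(h), -np.sign(h))

# Random +/-1 patterns assigned to each layer; the Hebbian coupling J[l]
# connects layer l to layer l+1 and is scaled by 1/N, as in a Hebb rule.
patterns = rng.choice([-1.0, 1.0], size=(LAYERS, P, N))
J = [patterns[l + 1].T @ patterns[l] / N for l in range(LAYERS - 1)]

# Feed a noisy version of the first stored pattern into layer 0, propagate
# forward, and track the overlap m with the pattern embedded in each layer.
state = np.where(rng.random(N) < 0.1, -patterns[0, 0], patterns[0, 0])
for l in range(LAYERS - 1):
    # Even and odd layers use different non-monotonicity thresholds.
    theta = THETA_EVEN if (l + 1) % 2 == 0 else THETA_ODD
    state = nonmonotonic(J[l] @ state, theta)
    m = state @ patterns[l + 1, 0] / N
    print(f"layer {l + 1}: overlap m = {m:.3f}")
```

With a suitable threshold the overlap stays close to 1 as the signal passes through the layers, which is the retrieval behavior whose capacity the paper analyzes via recursion relations for such order parameters.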
Journal title :
Physica A Statistical Mechanics and its Applications
Serial Year :
2002
Record number :
868217