Title of article :
An improved differential evolution algorithm in training and encoding prior knowledge into feedforward networks with application in chemistry
Author/Authors :
Chen, Chong-wei; Chen, De-zhao; Cao, Guang-zhi
Issue Information :
Semiannual publication with consecutive issue numbering, 2002
Abstract :
Prior-knowledge-based feedforward networks have shown superior performance in modeling chemical processes. In this paper, an improved differential evolution (IDEP) algorithm is proposed to encode prior knowledge into networks simultaneously during the training process. With regard to monotonic prior knowledge, the IDEP algorithm employs a flip operation to adjust prior-knowledge-violating networks so that they conform to the monotonicity. In addition, two strategies, a Levenberg–Marquardt descent (LMD) strategy and a random perturbation (RP) strategy, are adopted to speed up the differential evolution (DE) in the algorithm and to prevent it from being trapped in local minima, respectively. To demonstrate the IDEP algorithm's efficiency, we apply it to model two chemical curves under an increasing-monotonicity constraint. For comparison, four network-training algorithms without prior-knowledge constraints, as well as three existing prior-knowledge-based algorithms (which are related and similar to the IDEP algorithm), are employed to solve the same problems. The simulation results show that IDEP's performance is better than that of all the other algorithms. Finally, the IDEP algorithm and its promising prospects are discussed in detail at the end of this paper.
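The abstract outlines the overall scheme: a DE loop trains a feedforward network while a flip operation repairs candidates that violate an increasing-monotonicity constraint. The following Python sketch illustrates that idea only in broad strokes and is not the paper's implementation: the DE/rand/1/bin variant, the tiny one-hidden-layer network, and the flip rule used here (negating the hidden-to-output weights when the modeled curve is mostly decreasing) are illustrative assumptions, and the LMD and RP strategies described in the paper are omitted.

import numpy as np

rng = np.random.default_rng(0)
n_hidden = 5                              # assumed toy architecture: 1 input, 5 hidden units, 1 output
dim = 3 * n_hidden + 1                    # w1 (5) + b1 (5) + w2 (5) + b2 (1)

def net(w, x):
    # Unpack a flat parameter vector and evaluate the network on inputs x.
    w1, b1 = w[:n_hidden], w[n_hidden:2 * n_hidden]
    w2, b2 = w[2 * n_hidden:3 * n_hidden], w[3 * n_hidden]
    h = np.tanh(np.outer(x, w1) + b1)
    return h @ w2 + b2

def flip_if_decreasing(w, x_grid):
    # Hypothetical flip repair: if the fitted curve is mostly decreasing on x_grid,
    # flip the signs of the hidden-to-output weights to restore an increasing trend.
    if np.mean(np.diff(net(w, x_grid))) < 0:
        w = w.copy()
        w[2 * n_hidden:3 * n_hidden] *= -1.0
    return w

def mse(w, x, t):
    return np.mean((net(w, x) - t) ** 2)

# Toy monotonically increasing data standing in for a chemical curve.
x = np.linspace(0.0, 1.0, 40)
t = 1.0 - np.exp(-3.0 * x)

pop_size, F, CR = 30, 0.6, 0.9
pop = rng.normal(size=(pop_size, dim))
cost = np.array([mse(w, x, t) for w in pop])

for gen in range(300):
    for i in range(pop_size):
        a, b, c = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
        mutant = pop[a] + F * (pop[b] - pop[c])        # DE mutation
        cross = rng.random(dim) < CR
        cross[rng.integers(dim)] = True
        trial = np.where(cross, mutant, pop[i])        # binomial crossover
        trial = flip_if_decreasing(trial, x)           # monotonicity repair
        c_trial = mse(trial, x, t)
        if c_trial < cost[i]:                          # greedy selection
            pop[i], cost[i] = trial, c_trial

print("best MSE:", cost.min())

Running the sketch drives the training error down while keeping the repaired candidates consistent with the increasing trend of the toy data; the paper's algorithm additionally accelerates this loop with LMD steps and escapes local minima with random perturbations.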
Keywords :
Feedforward network , Levenberg–Marquardt descent strategy , Improved Differential Evolution , prior knowledge , Flip operation , Random perturbation strategy
Journal title :
Chemometrics and Intelligent Laboratory Systems