Simulated annealing algorithm-based Elman network for dynamic system identification

One of the well-known recurrent neural networks is the Elman network. Recently, it has been used in system identification applications. The network has both feedforward and feedback connections. It can be trained essentially as a feedforward network by means of the basic backpropagation algorithm, but its feedback connections must then be kept constant. Successful training therefore depends on selecting suitable values for these feedback connections, and finding such values manually can be a lengthy trial-and-error process. This paper investigates the use of the simulated annealing (SA) algorithm to obtain the weight values of both the feedforward and feedback connections of Elman networks used for dynamic system identification. The SA algorithm is an efficient random search procedure that can obtain the weight values of the feedforward and feedback connections simultaneously.
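The approach described above can be sketched as follows: all weights of a small Elman network, feedback (context) connections included, are flattened into one vector and tuned by simulated annealing against the output of a reference plant. This is a minimal illustrative sketch, not the paper's implementation; the first-order plant, the network size, and the cooling schedule below are all assumptions chosen for brevity.

```python
import math
import random

# Hypothetical first-order plant used as the identification target
# (illustrative only; not a system from the paper).
def plant(u_seq):
    y, ys = 0.0, []
    for uk in u_seq:
        y = 0.6 * y + 0.2 * uk
        ys.append(y)
    return ys

N_IN, N_HID = 1, 4  # assumed network size

def elman_forward(w, u_seq):
    """Run an Elman network over a sequence. The context (feedback) weights
    are part of w, so SA tunes feedforward and feedback weights together."""
    Wx = [w[i] for i in range(N_HID)]                       # input -> hidden
    off = N_HID * N_IN
    Wc = [[w[off + i * N_HID + j] for j in range(N_HID)]    # context -> hidden
          for i in range(N_HID)]
    off += N_HID * N_HID
    Wo = w[off:off + N_HID]                                 # hidden -> output
    ctx, out = [0.0] * N_HID, []
    for uk in u_seq:
        h = [math.tanh(Wx[i] * uk + sum(Wc[i][j] * ctx[j] for j in range(N_HID)))
             for i in range(N_HID)]
        out.append(sum(Wo[i] * h[i] for i in range(N_HID)))
        ctx = h  # context units copy the hidden state for the next step
    return out

def mse(w, u_seq, y_seq):
    yhat = elman_forward(w, u_seq)
    return sum((a - b) ** 2 for a, b in zip(yhat, y_seq)) / len(y_seq)

def anneal(u_seq, y_seq, n_iter=2000, t0=1.0, alpha=0.999, step=0.1):
    """Simulated annealing over the full weight vector (assumed schedule)."""
    n_w = N_HID * N_IN + N_HID * N_HID + N_HID
    w = [random.uniform(-0.5, 0.5) for _ in range(n_w)]
    e = mse(w, u_seq, y_seq)
    w_best, e_best, t = list(w), e, t0
    for _ in range(n_iter):
        cand = list(w)
        cand[random.randrange(n_w)] += random.uniform(-step, step)
        ec = mse(cand, u_seq, y_seq)
        # Metropolis rule: accept improvements always, and worse moves with
        # probability exp(-dE/T) so the search can escape local minima.
        if ec < e or random.random() < math.exp((e - ec) / t):
            w, e = cand, ec
        if e < e_best:
            w_best, e_best = list(w), e
        t *= alpha  # geometric cooling
    return w_best, e_best

random.seed(1)
u = [random.uniform(-1.0, 1.0) for _ in range(50)]
y = plant(u)
w_sa, err_sa = anneal(u, y)
print("final MSE:", err_sa)
```

Because SA only evaluates the network's error rather than differentiating through it, the feedback weights need no special treatment: they are perturbed and accepted or rejected exactly like the feedforward weights.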
