Big bang--big crunch optimization algorithm with local directional moves

The big bang--big crunch (BB--BC) algorithm has been proposed as a novel optimization method that relies on one of the theories of the evolution of the universe, namely the big bang and big crunch theory. It has been shown that this algorithm is capable of quick convergence, even in long, narrow parabolic-shaped flat valleys or in the presence of several local extrema. In this work, the BB--BC optimization algorithm is hybridized with local search moves between the ``banging'' and ``crunching'' phases of the algorithm. These two original phases of the BB--BC algorithm are preserved, but the representative point (the ``best'' or ``fittest'' point) attained after the crunching phase of each iteration is modified with local directional moves, using the previously attained representative points, in the hope of obtaining a better representative point. The effect of imposing local directional moves is inspected by comparing the results of the original and the enhanced BB--BC algorithms on various benchmark test functions. Moreover, the crunching phase of the algorithm is improved with the addition of a simplex-based local search routine. The results of the new hybridized algorithm are compared with state-of-the-art variants of widely used evolutionary computation methods, namely genetic algorithms, covariance matrix adaptation evolution strategies, and particle swarm optimization. The results on the benchmark test functions show that the BB--BC algorithm enhanced with local directional moves achieves greater accuracy for the same computation time or the same number of function evaluations.
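The two phases described above can be sketched as follows. This is a minimal illustrative implementation: the scattering step with a shrinking spread and the inverse-fitness center-of-mass update follow the standard BB--BC formulation, while the directional move (stepping the new representative point along the direction of recent progress and keeping the trial only if it improves) is an assumed, simplified stand-in for the paper's enhancement, not the authors' exact rule. The step factor `0.5` is likewise an arbitrary choice for illustration.

```python
import numpy as np

def bb_bc(f, dim, bounds, pop=50, iters=200, seed=0):
    """Minimal BB--BC sketch with an illustrative local directional move."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    center = rng.uniform(lo, hi, dim)            # initial representative point
    prev_center = center.copy()
    for k in range(1, iters + 1):
        # Big bang: scatter candidates around the representative point;
        # the spread shrinks as the iteration count k grows.
        spread = (hi - lo) / k
        cands = np.clip(center + spread * rng.standard_normal((pop, dim)), lo, hi)
        fit = np.array([f(x) for x in cands])
        # Big crunch: contract to a center of mass weighted by inverse fitness
        # (minimization, so better points get larger weights).
        w = 1.0 / (fit - fit.min() + 1e-12)
        new_center = (w[:, None] * cands).sum(axis=0) / w.sum()
        # Local directional move (assumed form): step along the direction of
        # recent progress and accept the trial point only if it improves f.
        trial = np.clip(new_center + 0.5 * (new_center - prev_center), lo, hi)
        if f(trial) < f(new_center):
            new_center = trial
        prev_center, center = center, new_center
    return center, f(center)

# Example: minimize the 5-dimensional sphere function (minimum 0 at the origin).
x, fx = bb_bc(lambda v: float(np.sum(v * v)), dim=5, bounds=(-10.0, 10.0))
```

The simplex-based enhancement mentioned above could be plugged in analogously, e.g. by running a few Nelder--Mead steps from `new_center` after each crunch.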

___

  • O.K. Erol, İ. Eksin, “A new optimization method: big bang-big crunch”, Advances in Engineering Software, Vol. 37, pp. 106–111, 2006.
  • T. Kumbasar, E. Yeşil, İ. Eksin, M. Güzelkaya, “Inverse fuzzy model control with online adaptation via big bang-big crunch optimization”, Proceedings of the 3rd International Symposium on Communications, Control, and Signal Processing, pp. 697–702, 2008.
  • T. Kumbasar, İ. Eksin, M. Güzelkaya, E. Yeşil, “Big bang-big crunch optimization method based fuzzy model inversion”, Lecture Notes in Computer Science, Vol. 5317, pp. 732–740, 2008.
  • C.V. Camp, “Design of space trusses using big bang-big crunch optimization”, Journal of Structural Engineering, Vol. 133, pp. 999–1008, 2007.
  • A. Kaveh, S. Talatahari, “Size optimization of space trusses using big bang-big crunch algorithm”, Computers and Structures, Vol. 87, pp. 1129–1140, 2009.
  • H.M. Genç, O.K. Erol, İ. Eksin, “An application and solution to gate assignment problem for Atatürk Airport”, Proceedings of the 6th IFAC International Workshop on Knowledge and Technology Transfer in/to Developing Countries: Automation and Infrastructure, pp. 125–130, 2009.
  • M. Doğan, Y. Istefanopulos, “Optimal nonlinear controller design for flexible robot manipulators with adaptive internal model”, IET Control Theory and Applications, Vol. 1, pp. 770–778, 2007.
  • A. Akyol, Y. Yaslan, O.K. Erol, “A genetic programming classifier design approach for cell images”, Proceedings of the 9th European Conference on Symbolic and Quantitative Approaches to Reasoning with Uncertainty, pp. 878–888, 2007.
  • P. Moscato, “On evolution, search, optimization, genetic algorithms and martial arts: towards memetic algorithms”, Caltech Concurrent Computation Program (report 826), 1989.
  • N. Noman, H. Iba, “Accelerating differential evolution using an adaptive local search”, IEEE Transactions on Evolutionary Computation, Vol. 12, pp. 107–125, 2008.
  • M. Lozano, F. Herrera, N. Krasnogor, D. Molina, “Real-coded memetic algorithms with crossover hill-climbing”, Evolutionary Computation, special issue on memetic algorithms, Vol. 12, pp. 273–302, 2004.
  • Y.S. Ong, A.J. Keane, “Meta-Lamarckian learning in memetic algorithms,” IEEE Transactions on Evolutionary Computation, Vol. 8, pp. 99–110, 2004.
  • Y.S. Ong, M.H. Lim, N. Zhu, K.W. Wong, “Classification of adaptive memetic algorithms: a comparative study”, IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, Vol. 36, pp. 141–152, 2006.
  • N.K. Bambha, S.S. Bhattacharyya, J. Teich, E. Zitzler, “Systematic integration of parameterized local search into evolutionary algorithms,” IEEE Transactions on Evolutionary Computation, Vol. 8, pp. 137–155, 2004.
  • C.W. Ahn, E. Kim, H.T. Kim, D.H. Lim, J. An, “A hybrid multiobjective evolutionary algorithm: striking balance with local search”, Mathematical and Computer Modeling, Vol. 52, pp. 2048–2059, 2010.
  • T.A. El-Mihoub, A.A. Hopgood, L. Nolle, A. Battersby, “Hybrid genetic algorithms: a review”, Engineering Letters, Vol. 13, 2004.
  • P.N. Suganthan, N. Hansen, J.J. Liang, K. Deb, Y.P. Chen, A. Auger, S. Tiwari, “Problem definitions and evaluation criteria for the CEC 2005 special session on real parameter optimization”, Tech. Report, Nanyang Technological University, 2005.
  • N. Hansen, S.D. Müller, P. Koumoutsakos, “Reducing the time complexity of the derandomized evolution strategy with covariance matrix adaptation (CMA–ES)”, Evolutionary Computation, Vol. 11, pp. 1–18, 2003.
  • N. Hansen, “The CMA evolution strategy: a comparing review”, In: J.A. Lozano, P. Larrañaga, I. Inza, E. Bengoetxea (eds.), Towards a New Evolutionary Computation. Advances in Estimation of Distribution Algorithms, New York, Springer, pp. 75–102, 2006.
  • N. Hansen, A.S.P. Niederberger, L. Guzzella, P. Koumoutsakos, “A method for handling uncertainty in evolutionary optimization with an application to feedback control of combustion”, IEEE Transactions on Evolutionary Computation, Vol. 13, pp. 180–197, 2009.
  • N. Hansen, “Covariance matrix adaptation evolutionary strategy”, Available at: http://www.lri.fr/~hansen/.
  • G. Evers, “Matlab particle swarm optimization research toolbox”, Available at: http://www.georgeevers.org/pso_research_toolbox.htm.