A Comprehensive Study of Parameter Analysis for Galactic Swarm Optimization

The galactic swarm optimization (GSO) algorithm is a metaheuristic approach inspired by the motion and behavior of stars and galaxies. It is a framework into which basic metaheuristic search methods can be embedded. The method has a two-phase structure: it performs exploration in the first phase and exploitation in the second phase, and it searches for the best solution in the search space by repeating these two phases a specified number of times. In this study, an analysis was performed on the parameters that determine the exploration-exploitation balance in GSO: the maximum epoch number (EPmax), the number of iterations in the first phase (L1), and the number of iterations in the second phase (L2). Fifteen parameter sets consisting of different values of these three parameters were created, and each parameter set was run 30 times independently. The configurations were evaluated on 26 benchmark functions in 30, 60, and 100 dimensions. Detailed results of the analysis are presented in the study, and the results are also evaluated statistically.
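To make the two-phase structure and the role of EPmax, L1, and L2 concrete, the following is a minimal Python sketch of a GSO-style loop. It assumes particle swarm optimization as the embedded search method at both levels, as in the original GSO; the function names (gso, pso, sphere), the population sizes, and the parameter values shown are illustrative assumptions for this sketch and are not the exact settings analyzed in the study.

import numpy as np

def sphere(x):                          # example benchmark function
    return float(np.sum(x ** 2))

def pso(swarm, vel, pbest, pbest_f, gbest, gbest_f, f, iters, bounds):
    # Standard PSO loop applied to one (sub)swarm for a fixed number of iterations.
    c1 = c2 = 2.0
    for t in range(iters):
        w = 1.0 - t / max(iters, 1)     # linearly decreasing inertia weight
        r1 = np.random.rand(*swarm.shape)
        r2 = np.random.rand(*swarm.shape)
        vel = w * vel + c1 * r1 * (pbest - swarm) + c2 * r2 * (gbest - swarm)
        swarm = np.clip(swarm + vel, bounds[0], bounds[1])
        fit = np.apply_along_axis(f, 1, swarm)
        improved = fit < pbest_f
        pbest[improved], pbest_f[improved] = swarm[improved], fit[improved]
        if pbest_f.min() < gbest_f:
            gbest_f = pbest_f.min()
            gbest = pbest[pbest_f.argmin()].copy()
    return swarm, vel, pbest, pbest_f, gbest, gbest_f

def gso(f, dim=30, n_subswarms=5, subswarm_size=10,
        ep_max=5, l1=50, l2=100, bounds=(-100.0, 100.0)):
    # Two-phase GSO: phase 1 explores with independent subswarms for L1 iterations,
    # phase 2 exploits with a superswarm of subswarm bests for L2 iterations,
    # and the two phases are repeated for EPmax epochs.
    subs = [np.random.uniform(bounds[0], bounds[1], (subswarm_size, dim))
            for _ in range(n_subswarms)]
    vels = [np.zeros_like(s) for s in subs]
    pbests = [s.copy() for s in subs]
    pbest_fs = [np.apply_along_axis(f, 1, s) for s in subs]
    gbests = [p[pf.argmin()].copy() for p, pf in zip(pbests, pbest_fs)]
    gbest_fs = [pf.min() for pf in pbest_fs]
    best_x = gbests[int(np.argmin(gbest_fs))].copy()
    best_f = min(gbest_fs)

    for ep in range(ep_max):
        # Phase 1 (exploration): each subswarm searches independently for L1 iterations.
        for i in range(n_subswarms):
            (subs[i], vels[i], pbests[i], pbest_fs[i],
             gbests[i], gbest_fs[i]) = pso(subs[i], vels[i], pbests[i], pbest_fs[i],
                                           gbests[i], gbest_fs[i], f, l1, bounds)
        # Phase 2 (exploitation): the subswarm bests form a superswarm searched for L2 iterations.
        sup = np.array(gbests)
        sup_vel = np.zeros_like(sup)
        sup_pbest, sup_pbest_f = sup.copy(), np.array(gbest_fs)
        sup_gbest = sup_pbest[sup_pbest_f.argmin()].copy()
        sup_gbest_f = sup_pbest_f.min()
        *_, sup_gbest, sup_gbest_f = pso(sup, sup_vel, sup_pbest, sup_pbest_f,
                                         sup_gbest, sup_gbest_f, f, l2, bounds)
        if sup_gbest_f < best_f:
            best_x, best_f = sup_gbest.copy(), sup_gbest_f
    return best_x, best_f

if __name__ == "__main__":
    x, fx = gso(sphere, dim=30)
    print("best fitness:", fx)

In this sketch, larger L1 values give the subswarms more time to explore the search space, while larger L2 values and more epochs (EPmax) devote more of the total evaluation budget to refining the superswarm, which is exactly the exploration-exploitation trade-off the parameter analysis in the study examines.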
