Research Article

Differential Evolution Algorithm with Incremental Social Learning

Year 2020, Volume: 7 Issue: 100. Yıl Özel Sayı, 133 - 162, 23.03.2020
https://doi.org/10.35193/bseufbd.666626

Abstract

In this study, the differential evolution algorithm (DE), which holds a very strong place among the optimization algorithms in the literature, is improved. The improvement integrates the incremental social learning (ISL) structure, which has previously been applied to other optimization algorithms with positive results, into DE. In this scheme, DE begins the search with a predetermined small number of individuals; new individuals are added to the population at certain stages using different approaches; the addition of individuals stops once a predetermined maximum population size is reached; and the search continues with that population size until the stopping criterion is satisfied. The resulting new variant of DE is called the incremental differential evolution algorithm (IDE). A further aim of the study is to determine the best method of adding individuals within the ISL structure; for this purpose, five different approaches to adding individuals to DE are compared. A set of 13 unimodal and multimodal test functions defined on a 30-dimensional space is solved with DE and with the IDE algorithms developed in this study, and evaluations are made by examining the obtained numerical results, graphics, and statistical analyses.
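The incremental scheme described in the abstract can be sketched roughly as follows. This is a minimal illustration only: the parameter names (`pop_min`, `pop_max`, `growth_period`) and the best-biased initialization of newcomers are assumptions for the sketch, not the paper's actual five addition approaches, which are defined in the full text.

```python
import random

def incremental_de(f, bounds, pop_min=5, pop_max=20, growth_period=10,
                   max_gens=200, F=0.5, CR=0.9, seed=0):
    """DE/rand/1/bin with ISL-style population growth (illustrative sketch).

    Starts with pop_min individuals, adds one newcomer every growth_period
    generations until pop_max is reached, then continues until max_gens.
    """
    rng = random.Random(seed)
    dim = len(bounds)

    def random_individual():
        return [rng.uniform(lo, hi) for lo, hi in bounds]

    pop = [random_individual() for _ in range(pop_min)]
    fit = [f(x) for x in pop]

    for gen in range(max_gens):
        # ISL step: periodically add one individual that "learns" a random
        # fraction of its position from the current best (one possible
        # addition approach; the paper compares five).
        if 0 < gen and gen % growth_period == 0 and len(pop) < pop_max:
            best = pop[min(range(len(pop)), key=fit.__getitem__)]
            fresh = random_individual()
            w = rng.random()
            newcomer = [w * b + (1.0 - w) * x for b, x in zip(best, fresh)]
            pop.append(newcomer)
            fit.append(f(newcomer))

        for i in range(len(pop)):
            # DE/rand/1 mutation: three distinct individuals, none equal to i.
            a, b, c = rng.sample([j for j in range(len(pop)) if j != i], 3)
            jrand = rng.randrange(dim)  # guarantees at least one mutated gene
            trial = [pop[a][j] + F * (pop[b][j] - pop[c][j])
                     if (rng.random() < CR or j == jrand) else pop[i][j]
                     for j in range(dim)]
            # Clamp to the search bounds.
            trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
            ft = f(trial)
            if ft <= fit[i]:  # greedy one-to-one selection
                pop[i], fit[i] = trial, ft

    i = min(range(len(pop)), key=fit.__getitem__)
    return pop[i], fit[i]
```

The stopping criterion here is a fixed generation count for simplicity; the study's actual budget and the exact growth schedule should be taken from the full text.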

References

  • [1] Goldberg, D.E. (1989). Genetic Algorithms in Search, Optimization, and Machine Learning, Addison-Wesley Publishing Company.
  • [2] Kirkpatrick, S. Gelatt, C.D. Vecchi, M.P. (1983). Optimisation by simulated annealing, Science, 220, 671-680.
  • [3] Storn, R. Price, K. (1997). Differential evolution-A simple and efficient heuristic for global optimization over continuous spaces, J. Global Optim. 11, 341-359.
  • [4] Kennedy, J. Eberhart, R. (1995). Particle Swarm Optimization, Proceedings of IEEE International Conference on Neural Networks, 6, 1942-1948.
  • [5] Geem, Z.W. Kim, J.H. Loganathan, G.V. (2001). A new heuristic optimization algorithm: Harmony search, Simulation, 76 (2), 60-68.
  • [6] Karaboğa, D. Baştürk, B. (2007). A powerful and efficient algorithm for numerical function optimization: Artificial bee colony (ABC) algorithm, J. Global Optim. 39 (3), 459-471.
  • [7] Dorigo, M. Di Caro, G. (1999). The Ant Colony Optimization Meta-heuristic, New Ideas in Optimization, McGraw-Hill Ltd., Maidenhead, UK, 11-32.
  • [8] Rashedi, E. Nezamabadi-pour, H. Saryazdi, S. (2009). GSA: A gravitational search algorithm, Inf. Sci. 179 (13), 2232-2248.
  • [9] Das, S. Biswas, A. Dasgupta, S. Abraham, A. (2009). Bacterial foraging optimization algorithm: Theoretical foundations, analysis, and applications, Foundations of Computational Intelligence 3, Global Optimization, 23-55.
  • [10] Kaveh, A. Talatahari, S. (2010). A novel heuristic optimization method: Charged system search, Acta Mech. 213 (3-4), 267-289.
  • [11] Kaveh, A. Mahdavi, V.R. (2014). Colliding bodies optimization: A novel meta‐heuristic method, Comput. Struct. 139, 18-27.
  • [12] Erol, O.K. Eksin, I. (2006). A new optimization method: Big Bang–Big Crunch, Adv. Eng. Software 37 (2), 106-111.
  • [13] Hatamlou, A. (2013). Black hole: A new heuristic optimization approach for data clustering, Inf. Sci. 222, 175-184.
  • [14] Zheng, Y.J. (2015). Water wave optimization: A new nature-inspired metaheuristic, Comput. Oper. Res. 55, 1-11.
  • [15] Mirjalili, S. Lewis, A. (2016). The whale optimization algorithm, Adv. Eng. Software 95, 51-67.
  • [16] Rajabioun, R. (2011). Cuckoo optimization algorithm, Appl. Soft Comput. 11, 5508-5518.
  • [17] Mirjalili, S. (2015). Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm, Knowledge Based Syst. 89, 228-249.
  • [18] Askarzadeh, A. (2016). A novel metaheuristic method for solving constrained engineering optimization problems: Crow search algorithm, Comput. Struct. 169, 1-12.
  • [19] Mirjalili, S. (2016). SCA: A sine cosine algorithm for solving optimization problems, Knowledge Based Syst. 96, 120-133.
  • [20] Kashan, A.H. (2015). A new metaheuristic for optimization: Optics inspired optimization (OIO), Comput. Oper. Res. 55, 99-125.
  • [21] Mirjalili, S. Mirjalili, S.M. Hatamlou, A. (2016). Multi-verse optimizer: A nature -inspired algorithm for global optimization, Neural Comput. Appl. 27, 495-513.
  • [22] Rahmani, R. Yusof, R. (2014). A new simple, fast and efficient algorithm for global optimization over continuous search-space problems: Radial movement optimization, Appl. Math. Comput. 248, 287-300.
  • [23] Mirjalili, S. Mirjalili, S.M. Lewis, A. (2014). Grey wolf optimizer, Adv. Eng. Software 69, 46-61.
  • [24] Cheng, M.Y. Prayogo, D. (2014). Symbiotic organisms search: A new metaheuristic optimization algorithm, Comput. Struct. 139, 98-112.
  • [25] Montes de Oca, M.A. Stützle, T. (2008). Towards incremental social learning in optimization and multiagent systems. In W. Rand et al., editors, ECoMASS Workshop of the Genetic and Evolutionary Computation Conference (GECCO’08), 1939-1944, ACM Press, New York.
  • [26] Montes de Oca, M.A. Stützle, T. Van den Enden, K. Dorigo, M. (2011). Incremental social learning in particle swarms, IEEE Trans. Syst. Man Cybern. Part B Cybern. 41 (2), 368-384.
  • [27] Montes de Oca, M.A. Aydın, D. Stützle, T. (2011). An incremental particle swarm for large-scale optimization problems: An example of tuning-in-the-loop (re)design of optimization algorithms, Soft Comput. 15, 2233-2255.
  • [28] Liao, T. Montes de Oca, M.A. Aydın, D. Stützle, T. Dorigo, M. (2011). An incremental ant colony algorithm with local search for continuous optimization problems, In: Proceeding of Genetic and Evolutionary Computation Conference (GECCO’11), 125-132.
  • [29] Aydın, D. Liao, T. Montes de Oca, M. Stützle, T. (2011). Improving performance via population growth and local search: The case of the artificial bee colony algorithm. Proceedings of Artificial Evolution (EA'11), 131-142.
  • [30] Özyön, S. Aydın, D. (2013). Incremental artificial bee colony with local search to economic dispatch problem with ramp rate limits and prohibited operating zones, Energy Convers. Manage. 65, 397-407.
  • [31] Özyön, S. Yaşar, C. Temurtaş, H. (2018). Incremental gravitational search algorithm for high-dimensional benchmark functions, Neural Comput. Appl. 1-25.
  • [32] Brest, J. Greiner, S. Boskovic, B. Mernik, M. Zumer, V. (2006). Self-adapting control parameters in differential evolution: a comparative study on numerical benchmark problems, IEEE Trans. Evol. Comput. 10 (6), 646-657.
  • [33] https://pablormier.github.io/2017/09/05/a-tutorial-on-differential-evolution-with-python/#
  • [34] García, S. Molina, D. Lozano, M. Herrera, F. (2009). A study on the use of non-parametric tests for analyzing the evolutionary algorithms’ behaviour: a case study on the CEC’2005 Special Session on Real Parameter Optimization, J. Heuristics 15, 617-644.
  • [35] Gibbons, J.D. Chakraborti, S. (2011). Nonparametric Statistical Inference, 5th Ed. Boca Raton, FL: Chapman & Hall/CRC Press, Taylor & Francis Group.
  • [36] Hollander, M. Wolfe, D.A. (1999). Nonparametric Statistical Methods. Hoboken, NJ: John Wiley & Sons, Inc.
  • [37] https://www.mathworks.com/help/stats/signrank.html
  • [38] https://www.mathworks.com/help/stats/ranksum.html
  • [39] https://www.mathworks.com/help/stats/signtest.html

There are 39 citations in total.

Details

Primary Language English
Subjects Engineering
Journal Section Articles
Authors

Serdar Özyön 0000-0002-4469-3908

Publication Date March 23, 2020
Submission Date December 28, 2019
Acceptance Date February 14, 2020
Published in Issue Year 2020 Volume: 7 Issue: 100. Yıl Özel Sayı

Cite

APA Özyön, S. (2020). Differential Evolution Algorithm with Incremental Social Learning. Bilecik Şeyh Edebali Üniversitesi Fen Bilimleri Dergisi, 7(100. Yıl Özel Sayı), 133-162. https://doi.org/10.35193/bseufbd.666626