All evolutionary and swarm-intelligence-based algorithms are probabilistic and require common control parameters such as population size, number of generations, and elite size. Besides these common control parameters, different algorithms require their own algorithm-specific control parameters. For example, GA uses mutation probability, crossover probability, and a selection operator; PSO uses inertia weight and the social and cognitive parameters; ABC uses the numbers of onlooker bees, employed bees, and scout bees, as well as the limit; the HS algorithm uses the harmony memory consideration rate, the pitch adjusting rate, and the number of improvisations. Similarly, other algorithms such as ES, EP, DE, SFL, ACO, FF, CSO, AIA, GSA, BBO, FPA, ALO, IWO, etc. need tuning of their respective algorithm-specific parameters. Proper tuning of these algorithm-specific parameters is crucial to the performance of the above-mentioned algorithms; improper tuning either increases the computational effort or yields a locally optimal solution. Considering this, Rao et al. (2011) introduced the teaching-learning-based optimization (TLBO) algorithm, which does not require any algorithm-specific parameters. The TLBO algorithm requires only common control parameters such as population size and number of generations, and it has gained wide acceptance among optimization researchers.
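Since this submission implements the Jaya algorithm, which likewise needs only the common control parameters, a minimal MATLAB sketch of its single update rule (as described by Rao, 2016) is given below. The objective function (sphere), bounds, and settings are illustrative assumptions, not the submitted code.

```matlab
% Minimal Jaya sketch; only the common control parameters
% (population size and number of iterations) are set.
fobj  = @(x) sum(x.^2);        % illustrative objective (sphere function, assumed)
nVar  = 5;  lb = -10; ub = 10; % problem dimension and bounds (assumed)
nPop  = 20; maxIter = 100;     % common control parameters only

X = lb + (ub - lb) .* rand(nPop, nVar);      % random initial population
f = arrayfun(@(i) fobj(X(i,:)), (1:nPop)');  % evaluate each candidate

for it = 1:maxIter
    [~, bestIdx]  = min(f);                  % current best solution
    [~, worstIdx] = max(f);                  % current worst solution
    for k = 1:nPop
        r1 = rand(1, nVar);  r2 = rand(1, nVar);
        % Jaya update: move toward the best and away from the worst
        Xnew = X(k,:) + r1 .* (X(bestIdx,:)  - abs(X(k,:))) ...
                      - r2 .* (X(worstIdx,:) - abs(X(k,:)));
        Xnew = min(max(Xnew, lb), ub);       % keep within bounds
        fnew = fobj(Xnew);
        if fnew < f(k)                       % greedy acceptance
            X(k,:) = Xnew;  f(k) = fnew;
        end
    end
end
[bestF, ~] = min(f);
fprintf('Best objective found: %g\n', bestF);
```

The update uses no algorithm-specific parameters: each candidate is attracted toward the best solution and repelled from the worst, and a new position is accepted only if it improves the objective.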
Cite As
iraj faraji (2024). Jaya: A simple and new optimization algorithm (https://www.mathworks.com/matlabcentral/fileexchange/74004-jaya-a-simple-and-new-optimization-algorithm), MATLAB Central File Exchange. Retrieved: .
Platform Compatibility: Windows, macOS, Linux
| Version | Published | Release Notes |
| --- | --- | --- |
| 1.0.0 | | |