Genetic Algorithm Best Objective Function Value Increased in Later Generations

I am using GA to run an optimisation problem, and I noticed that the best objective function value increased in the second generation, indicating a worse result, as shown in the screenshot below. Does anyone know whether this is normal?

Accepted Answer

Mrutyunjaya Hiremath
In genetic algorithms (GA), it is common to observe fluctuations or changes in the objective function value (fitness) from one generation to another, especially in the early generations. These fluctuations are normal and can be attributed to the inherent stochastic nature of the GA optimization process. Here are a few reasons why this behavior might occur:
  1. Exploration and Exploitation: In the early generations, the GA is exploring the search space to find potential solutions. This exploration phase can lead to a wide range of objective function values, and the algorithm may not have converged to good solutions yet. As the GA progresses, it starts exploiting the promising regions, focusing on improving solutions, and this can lead to improvements in the objective function value.
  2. Random Initialization: GA starts with a population of randomly generated individuals. Depending on the initial population, the algorithm may initially produce suboptimal solutions. As the GA evolves, better solutions are generated through crossover, mutation, and selection operations.
  3. Diversity in Population: Initially, the population might be diverse with individuals scattered across the search space. This diversity can lead to a mix of good and poor solutions. Over time, the GA converges towards better solutions, reducing diversity.
  4. Genetic Operators: The genetic operators (crossover and mutation) introduce randomness in the evolution process. In some cases, these operators may not immediately improve the solutions in the population, leading to temporary fluctuations in the objective function value.
  5. Population Size: The size of the population can influence the optimization process. Smaller populations may converge faster, but they may also get stuck in local optima. Larger populations may require more generations to converge, but they can explore a wider range of solutions.
  • It is essential to monitor the convergence of the GA and assess the progress over multiple generations. Sometimes, the GA may need more time and generations to reach optimal or near-optimal solutions. If you observe consistent degradation in the objective function value over many generations, it might indicate an issue with the GA parameters, such as the selection method, crossover rate, mutation rate, or population size.
  • To improve the GA's performance, you can try adjusting the GA parameters, increasing the population size, fine-tuning the selection method, or using different crossover and mutation operators. Additionally, you can experiment with different termination conditions to allow the GA to run for more generations and explore the search space further.
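For example, here is a minimal sketch of how these options can be set with optimoptions (this assumes the Global Optimization Toolbox ga solver; the objective function, bounds, and the specific values are placeholders to adapt to your problem, not recommendations):

```matlab
% Minimal sketch (assumes the Global Optimization Toolbox).
% The fitness function is a placeholder; replace it with your own objective.
fitness = @(x) sum(x.^2);   % placeholder objective, minimised by ga
nvars   = 5;                % number of decision variables

opts = optimoptions('ga', ...
    'PopulationSize',      200, ...              % larger population explores more of the space
    'EliteCount',          5, ...                % best individuals copied unchanged to the next generation
    'CrossoverFraction',   0.8, ...              % fraction of non-elite children created by crossover
    'SelectionFcn',        @selectiontournament, ...
    'MutationFcn',         @mutationadaptfeasible, ...
    'MaxGenerations',      300, ...              % allow more generations before stopping
    'MaxStallGenerations', 50, ...
    'PlotFcn',             @gaplotbestf);        % plot best and mean fitness per generation

lb = -5*ones(1, nvars);
ub =  5*ones(1, nvars);

[x, fval] = ga(fitness, nvars, [], [], [], [], lb, ub, [], opts);
```

The gaplotbestf plot shows the best and mean fitness in each generation, which makes it easier to judge whether the run is still exploring the search space or has stalled.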
  1 Comment
Ausama Ahmed on 31 Jan 2024
I am having the same behaviour. But, I am sorry, I can't fully understand the reason for the increase in the best function value.
As I understand it, as the ga searches a wider space, it may or may not find a better solution.
If a better solution is found, the best individual will be kept in the pool for the next generation. If no better solution is found, then the best solution from the previous generation will not be replaced and will be kept for the next generation.
This means that the best f(x) should never increase (especially if EliteCount is not zero).
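For reference, here is a small sketch of how I would log the best value of every generation to check this (it assumes the Global Optimization Toolbox ga; checkBestMonotone and logBestFcn are just helper names I made up, not toolbox functions):

```matlab
function bestHistory = checkBestMonotone()
% checkBestMonotone / logBestFcn are my own helpers, not toolbox functions.
% Requires the Global Optimization Toolbox ga solver.
    fitness = @(x) sum(x.^2);   % placeholder objective; substitute your own
    nvars   = 4;

    history = [];               % filled in by the nested output function below

    opts = optimoptions('ga', ...
        'EliteCount', 5, ...    % the 5 best individuals are carried over unchanged
        'OutputFcn',  @logBestFcn);

    ga(fitness, nvars, [], [], [], [], [], [], [], opts);

    bestHistory = history;
    % If elitism works as I described, diff(bestHistory) should never be positive.
    fprintf('Best value ever got worse: %d\n', any(diff(bestHistory) > 0));

    function [state, options, optchanged] = logBestFcn(options, state, flag)
        optchanged = false;
        if strcmp(flag, 'iter')
            history(end+1) = min(state.Score);  % best (lowest) score in this generation
        end
    end
end
```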
Please clarify if I am missing anything.


