Fitness approximation

Fitness approximation[1] aims to approximate the objective or fitness function in evolutionary optimization by building machine learning models from data collected in numerical simulations or physical experiments. The machine learning models used for fitness approximation are also known as meta-models or surrogates, and evolutionary optimization based on approximate fitness evaluations is also known as surrogate-assisted evolutionary computation.[2] Fitness approximation in evolutionary optimization can be seen as a sub-area of data-driven evolutionary optimization.[3]

Approximate models in function optimization

Motivation

In many real-world optimization problems, including engineering problems, the number of fitness function evaluations needed to obtain a good solution dominates the optimization cost. To obtain efficient optimization algorithms, it is crucial to use the prior information gained during the optimization process. Conceptually, a natural approach to utilizing this information is to build a model of the fitness function that assists in selecting candidate solutions for evaluation. A variety of techniques for constructing such a model (often referred to as a surrogate, metamodel or approximation model) for computationally expensive optimization problems have been considered.

Approaches

Common approaches to constructing approximate models, based on learning and interpolation from known fitness values of a small population, include polynomial models (response surface methodology), artificial neural networks such as multilayer perceptrons and radial basis function networks, support vector machines, and Gaussian process models (kriging).
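As an illustration of these interpolation-based approaches, the following minimal sketch (assuming Python with NumPy and scikit-learn, which the article does not prescribe) fits a Gaussian process (kriging) surrogate to a small set of exactly evaluated solutions and then predicts the fitness of new candidates. The toy function true_fitness stands in for an expensive simulation, and the sample sizes are illustrative only.

    # Minimal sketch: fit a Gaussian-process surrogate (kriging) to a small set of
    # exactly evaluated candidate solutions, then predict fitness for new candidates.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(0)

    def true_fitness(x):
        # Placeholder for an expensive evaluation (e.g. a finite element simulation).
        return np.sum(x**2, axis=1) + 0.5 * np.sin(5.0 * x[:, 0])

    # A small "population" of exactly evaluated solutions.
    X_train = rng.uniform(-1.0, 1.0, size=(20, 2))
    y_train = true_fitness(X_train)

    # Fit the surrogate to the known fitness values.
    surrogate = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
    surrogate.fit(X_train, y_train)

    # Cheap approximate evaluation of new candidates, with an uncertainty estimate.
    X_new = rng.uniform(-1.0, 1.0, size=(5, 2))
    y_pred, y_std = surrogate.predict(X_new, return_std=True)
    print(y_pred, y_std)

The standard deviation returned alongside the prediction is one way such a model can signal where its approximation is unreliable.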

Due to the limited number of training samples and high dimensionality encountered in engineering design optimization, constructing a globally valid approximate model remains difficult. As a result, evolutionary algorithms using such approximate fitness functions may converge to local optima. Therefore, it can be beneficial to selectively use the original fitness function together with the approximate model.
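One way to combine exact and approximate evaluations, roughly in the spirit of individual-based evolution control, is sketched below. It assumes a surrogate exposing scikit-learn-style fit/predict methods (as in the previous sketch); the helper name evolution_control_step, the parameter k and the minimization convention are illustrative assumptions rather than details taken from the article.

    # Minimal sketch of individual-based evolution control: most candidates are
    # screened with the surrogate, and only the most promising few are re-evaluated
    # with the original (expensive) fitness function and added to the training data.
    import numpy as np

    def evolution_control_step(population, surrogate, true_fitness, X_train, y_train, k=3):
        # Cheap screening of the whole offspring population with the surrogate.
        approx_fitness = surrogate.predict(population)

        # Re-evaluate the k most promising candidates (lowest predicted fitness,
        # assuming minimization) with the original fitness function.
        promising = np.argsort(approx_fitness)[:k]
        exact_values = true_fitness(population[promising])

        # Mix exact and approximate evaluations for selection, and enrich the
        # surrogate's training data with the new exact samples.
        fitness = approx_fitness.copy()
        fitness[promising] = exact_values
        X_train = np.vstack([X_train, population[promising]])
        y_train = np.concatenate([y_train, exact_values])
        surrogate.fit(X_train, y_train)
        return fitness, X_train, y_train

Re-evaluating only the most promising candidates keeps the number of expensive evaluations per generation fixed at k, while the newly obtained exact values steadily improve the surrogate.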

Adaptive fuzzy fitness granulation

Adaptive fuzzy fitness granulation (AFFG) is a proposed approach to constructing an approximate model of the fitness function that replaces traditional, computationally expensive large-scale problem analysis (L-SPA), such as repeated finite element simulations or the iterative fitting of a Bayesian network structure.

In adaptive fuzzy fitness granulation, an adaptive pool of solutions, represented by fuzzy granules with exactly computed fitness values, is maintained. If a new individual is sufficiently similar to an existing fuzzy granule, that granule's fitness is reused as an estimate; otherwise, the individual is evaluated exactly and added to the pool as a new fuzzy granule. Both the pool size and each granule's radius of influence are adaptive, growing or shrinking depending on the utility of each granule and the overall population fitness. To reduce the number of exact function evaluations, each granule's radius of influence is initially large and gradually shrinks in later stages of evolution, which enforces more exact fitness evaluations once competition is fierce among increasingly similar, converging solutions. Furthermore, to prevent the pool from growing too large, granules that are not used are gradually eliminated.
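A rough sketch of the granule pool, assuming Python with NumPy, is given below; the Gaussian membership function, the similarity threshold, the radius-shrinking factor and the usage-decay pruning rule are illustrative stand-ins for the adaptive rules of the published method, not its exact formulas.

    # Minimal sketch of the AFFG idea: a pool of fuzzy granules, each storing a
    # centre, an exact fitness value, a radius of influence and a usage score.
    import numpy as np

    class GranulePool:
        def __init__(self, true_fitness, init_radius=1.0, threshold=0.9):
            self.true_fitness = true_fitness   # expensive exact fitness function
            self.granules = []                 # each granule: [centre, fitness, radius, usage]
            self.init_radius = init_radius
            self.threshold = threshold

        def evaluate(self, x):
            x = np.asarray(x, dtype=float)
            # Gaussian similarity (fuzzy membership) of x to each granule centre.
            for g in self.granules:
                centre, fitness, radius, _usage = g
                similarity = np.exp(-np.sum((x - centre) ** 2) / (2.0 * radius ** 2))
                if similarity >= self.threshold:
                    g[3] += 1.0        # the granule was useful: strengthen it
                    return fitness     # reuse the stored exact fitness as an estimate
            # No sufficiently similar granule: evaluate exactly and add a new granule.
            fitness = self.true_fitness(x)
            self.granules.append([x, fitness, self.init_radius, 1.0])
            return fitness

        def adapt(self, shrink=0.95, decay=0.5, keep_threshold=0.1):
            # Called once per generation: shrink radii (forcing more exact
            # evaluations as the population converges), decay usage scores and
            # eliminate granules that are no longer being reused.
            for g in self.granules:
                g[2] *= shrink
                g[3] *= decay
            self.granules = [g for g in self.granules if g[3] >= keep_threshold]

In a surrogate-assisted loop, evaluate would replace direct calls to the exact fitness function for each offspring, with adapt called once per generation so that radii shrink and unused granules are eventually discarded.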

Additionally, AFFG mirrors two features of human cognition: (a) granularity and (b) similarity analysis. This granulation-based fitness approximation scheme has been applied to various engineering optimization problems, including detecting hidden information in a watermarked signal as well as several structural optimization problems.


References

  1. ^ Y. Jin. A comprehensive survey of fitness approximation in evolutionary computation. Soft Computing, 9:3–12, 2005.
  2. ^ Y. Jin. Surrogate-assisted evolutionary computation: Recent advances and future challenges. Swarm and Evolutionary Computation, 1(2):61–70, 2011.
  3. ^ Y. Jin, H. Wang, T. Chugh, D. Guo and K. Miettinen. Data-driven evolutionary optimization: An overview and case studies. IEEE Transactions on Evolutionary Computation, 23(3):442–459, 2019.
  4. ^ L. Manzoni, D.M. Papetti, P. Cazzaniga, S. Spolaor, G. Mauri, D. Besozzi and M.S. Nobile. Surfing on fitness landscapes: A boost on optimization by Fourier surrogate modeling. Entropy, 22:285, 2020.