Metaheuristic

In computer science and mathematical optimization, a metaheuristic is a higher-level procedure or heuristic designed to find, generate, tune, or select a heuristic (partial search algorithm) that may provide a sufficiently good solution to an optimization problem or a machine learning problem, especially with incomplete or imperfect information or limited computation capacity.[1][2] Metaheuristics sample a subset of a solution space that would otherwise be too large to be enumerated or explored completely. They may make relatively few assumptions about the optimization problem being solved and so may be usable for a variety of problems.[3][4][5][6] They are of particular interest when exact or other (approximate) methods are unavailable or impractical, either because the computation takes too long or because, for example, the solutions provided are too imprecise.
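
As an illustration of this sampling idea, the following sketch implements one of the simplest metaheuristics, random-restart hill climbing, on a made-up multimodal objective; the objective function, neighborhood move, and budgets (`restarts`, `steps`) are illustrative assumptions, not part of any standard definition:

```python
import math
import random

def hill_climb_with_restarts(objective, neighbor, random_solution,
                             restarts=20, steps=200):
    """Random-restart hill climbing: a minimal illustrative metaheuristic.

    The objective is treated as a black box; only a tiny, randomly chosen
    subset of the search space is ever evaluated.
    """
    best_x, best_val = None, math.inf
    for _ in range(restarts):
        x = random_solution()
        val = objective(x)
        for _ in range(steps):
            y = neighbor(x)          # propose a nearby candidate
            y_val = objective(y)
            if y_val < val:          # greedy: accept only improvements
                x, val = y, y_val
        if val < best_val:           # remember the best local optimum so far
            best_x, best_val = x, val
    return best_x, best_val

# Illustrative multimodal objective on the reals (an assumption for the demo).
objective = lambda x: x * x + 10.0 * math.sin(3.0 * x)
neighbor = lambda x: x + random.gauss(0.0, 0.5)
random_solution = lambda: random.uniform(-10.0, 10.0)

print(hill_climb_with_restarts(objective, neighbor, random_solution))
```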

Compared to optimization algorithms and iterative methods, metaheuristics do not guarantee that a globally optimal solution will be found on some class of problems.[3] Many metaheuristics implement some form of stochastic optimization, so that the solution found depends on the set of random variables generated.[2] In combinatorial optimization, many problems belong to the class of NP-complete problems and can therefore no longer be solved exactly in acceptable time once the instance size exceeds a relatively modest threshold.[7][8] Metaheuristics then often provide good solutions with less computational effort than approximation methods, iterative methods, or simple heuristics.[3][4] The same applies in the field of continuous or mixed-integer optimization.[4][9][10] As such, metaheuristics are useful approaches for optimization problems.[2] Several books and survey papers have been published on the subject.[2][3][4][11][12] A literature review on metaheuristic optimization[13] suggests that the term metaheuristics was coined by Fred Glover.[14]
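
To make the stochastic aspect concrete, here is a minimal sketch of simulated annealing, a classic metaheuristic, applied to a small random instance of the NP-hard travelling salesman problem; the cooling schedule, the move operator, and all parameter values below are illustrative choices, and repeated runs with different seeds will generally return different tours:

```python
import math
import random

def tour_length(tour, pts):
    """Total length of the closed tour visiting pts in the given order."""
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def simulated_annealing(pts, t0=10.0, cooling=0.999, iters=50_000):
    """Simulated annealing for a small travelling-salesman instance.

    Worse tours are sometimes accepted with probability exp(-delta / T),
    which lets the search escape local optima; because the run is driven
    by random variables, the returned tour varies from run to run.
    """
    tour = list(range(len(pts)))
    random.shuffle(tour)
    length = tour_length(tour, pts)
    t = t0
    for _ in range(iters):
        i, j = sorted(random.sample(range(len(pts)), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]  # 2-opt style reversal
        cand_len = tour_length(cand, pts)
        if cand_len < length or random.random() < math.exp((length - cand_len) / t):
            tour, length = cand, cand_len
        t *= cooling                  # geometric cooling schedule
    return tour, length

random.seed(0)
points = [(random.random(), random.random()) for _ in range(30)]
best_tour, best_len = simulated_annealing(points)
print(round(best_len, 3))
```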

Most literature on metaheuristics is experimental in nature, describing empirical results based on computer experiments with the algorithms. But some formal theoretical results are also available, often on convergence and the possibility of finding the global optimum.[3][15] Also worth mentioning are the no-free-lunch theorems, which state that no metaheuristic can be better than all others on every problem: averaged over all possible problems, all search algorithms perform equally well.
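
The no-free-lunch statement can be checked exhaustively at toy scale: averaged over every possible objective function on a small finite domain, any fixed search order needs the same expected number of evaluations to first hit a global optimum. A minimal sketch, assuming a three-point domain with Boolean objective values and two arbitrarily chosen deterministic search orders:

```python
from itertools import product
from statistics import mean

def evals_to_optimum(f, order):
    """Evaluations a fixed, non-revisiting search needs to reach a global minimum."""
    target = min(f)
    for k, x in enumerate(order, start=1):
        if f[x] == target:
            return k

all_functions = list(product((0, 1), repeat=3))   # every f: {0,1,2} -> {0,1}
order_a = (0, 1, 2)                               # "search left to right"
order_b = (2, 0, 1)                               # a different fixed strategy

avg_a = mean(evals_to_optimum(f, order_a) for f in all_functions)
avg_b = mean(evals_to_optimum(f, order_b) for f in all_functions)
print(avg_a, avg_b)   # identical averages: 1.5 1.5
```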

Especially since the turn of the millennium, many metaheuristic methods have been published with claims of novelty and practical efficacy. While the field also features high-quality research, many of the more recent publications have been of poor quality; flaws include vagueness, lack of conceptual elaboration, poor experiments, and ignorance of previous literature.[16][17]

  1. ^ Balamurugan, R.; Natarajan, A.M.; Premalatha, K. (2015). "Stellar-Mass Black Hole Optimization for Biclustering Microarray Gene Expression Data". Applied Artificial Intelligence. 29 (4): 353–381. doi:10.1080/08839514.2015.1016391.
  2. ^ a b c d Bianchi, Leonora; Dorigo, Marco; Gambardella, Luca Maria; Gutjahr, Walter J. (2009). "A survey on metaheuristics for stochastic combinatorial optimization". Natural Computing. 8 (2): 239–287. doi:10.1007/s11047-008-9098-4.
  3. ^ a b c d e Blum, Christian; Roli, Andrea (2003). "Metaheuristics in combinatorial optimization: Overview and conceptual comparison". ACM Computing Surveys. 35 (3): 268–308. doi:10.1145/937503.937505.
  4. ^ a b c d Glover, Fred; Kochenberger, Gary A., eds. (2003). Handbook of Metaheuristics. International Series in Operations Research & Management Science. Vol. 57. Boston: Kluwer Academic Publishers. ISBN 978-1-4020-7263-5.
  5. ^ [named reference not defined in the source]
  6. ^ [named reference not defined in the source]
  7. ^ Brucker, Peter; Knust, Sigrid (2012). Complex Scheduling. Berlin, Heidelberg: Springer. doi:10.1007/978-3-642-23929-8. ISBN 978-3-642-23928-1.
  8. ^ Papadimitriou, Christos H.; Steiglitz, Kenneth (1998). Combinatorial Optimization: Algorithms and Complexity. Mineola, NY: Dover Publications (corrected, unabridged republication of the 1982 Prentice-Hall edition). ISBN 978-0-486-40258-1.
  9. ^ Gad, Ahmed G. (2022). "Particle Swarm Optimization Algorithm and Its Applications: A Systematic Review". Archives of Computational Methods in Engineering. 29 (5): 2531–2561. doi:10.1007/s11831-021-09694-4. ISSN 1134-3060.
  10. ^ Li, Zhenhua; Lin, Xi; Zhang, Qingfu; Liu, Hailin (2020). "Evolution strategies for continuous optimization: A survey of the state-of-the-art". Swarm and Evolutionary Computation. 56: 100694. doi:10.1016/j.swevo.2020.100694.
  11. ^ Goldberg, David E. (1989). Genetic Algorithms in Search, Optimization and Machine Learning. Reading, MA: Addison-Wesley. ISBN 978-0-201-15767-3.
  12. ^ Talbi, El-Ghazali (2009). Metaheuristics: From Design to Implementation. Hoboken, NJ: Wiley. ISBN 978-0-470-27858-1.
  13. ^ Yang, Xin-She (2011). "Metaheuristic optimization". Scholarpedia. 6 (8): 11472.
  14. ^ Glover, Fred (January 1986). "Future paths for integer programming and links to artificial intelligence" (PDF). Computers and Operations Research. 13 (5): 533–549. doi:10.1016/0305-0548(86)90048-1. ISSN 0305-0548.
  15. ^ Rudolph, Günter (2001). "Self-adaptive mutations may lead to premature convergence". IEEE Transactions on Evolutionary Computation. 5 (4): 410–414. doi:10.1109/4235.942534. hdl:2003/5378.
  16. ^ Sörensen, Kenneth (2015). "Metaheuristics—the metaphor exposed". International Transactions in Operational Research. 22 (1): 3–18. CiteSeerX 10.1.1.470.3422. doi:10.1111/itor.12001. S2CID 14042315.
  17. ^ Brownlee, Alexander; Woodward, John R. "Why we fell out of love with algorithms inspired by nature". The Conversation. Retrieved 2024-08-30.