GECCO '20: Proceedings of the 2020 Genetic and Evolutionary Computation Conference


SESSION: Keynote talks

Biological evolution as a form of learning

  • Leslie Valiant

Living organisms function according to protein circuits. Darwin's theory of evolution suggests that these circuits have evolved through variation guided by natural selection. However, it is currently unknown what variation mechanisms can give rise to protein circuits of the complexity found in biology, within realistic population sizes and realistic numbers of generations.

We suggest that computational learning theory offers the framework for investigating this question of how circuits can come into being via a Darwinian process without a designer. We formulate evolution as a form of learning from examples. The targets of the learning process are the protein expression functions that come closest to the best behavior in the specific environment. The learning process is constrained so that the feedback from the experiences is Darwinian. We formulate a notion of evolvability that distinguishes function classes that are evolvable with polynomially bounded resources from those that are not. The dilemma is that if the function class that describes the expression levels of proteins in terms of each other is too restrictive, then it will not support biology, while if it is too expressive then no evolution algorithm will exist to navigate it. We shall review current work in this area.
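As a brief illustration of the kind of quantity such a learning formulation optimizes, the sketch below states the performance measure used in Valiant's published evolvability framework for Boolean targets; the notation is illustrative, and the talk's exact definitions may differ.

```latex
% Performance of a hypothesis r against an ideal function f, both mapping
% inputs in X to {-1, +1}, under a distribution D over the environment's inputs:
\[
  \operatorname{Perf}_f(r, D) \;=\; \mathbb{E}_{x \sim D}\!\left[ f(x)\, r(x) \right]
\]
% i.e., the correlation between the evolving circuit's behavior and the ideal one.
% A class of functions is then called evolvable if a mutation-and-selection
% process, with population sizes and generation counts bounded by polynomials
% in the problem size and 1/epsilon, is guaranteed to reach
% Perf_f(r, D) >= 1 - epsilon for every target f in the class and every D.
```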

Evolutionary computation's niche for solving multi-criterion optimization problems

  • Kalyanmoy Deb

Evolutionary computation (EC) algorithms involve a careful, collaborative, and iterative update of a population of solutions to reach near a desired target. In a single-objective search and optimization problem, the respective optimal solution is often the single target. In a multi-criterion optimization problem, the target is a set of Pareto-optimal solutions. Although the EC field started with solving single-objective problems, EC researchers soon realized that these algorithms were ideal for finding a well-diversified set of multiple Pareto-optimal solutions simultaneously for multi-criterion optimization problems, thereby carving out a clear niche for EC algorithms compared to their point-based classical counterparts. In this keynote talk, we provide a brief chronology of events in the evolutionary multi-criterion optimization (EMO) field over the past almost three decades, the key challenges it has faced, and the key events and publications that pushed the field forward. We shall also provide a brief account of current activities and a few pertinent future areas of research and application.
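As a concrete reminder of what the Pareto-optimal target set is, the minimal Python sketch below filters a collection of objective vectors down to its non-dominated members, the kind of set an EMO algorithm approximates with a single diverse population. The function names and the minimize-all-objectives convention are illustrative assumptions, not taken from the talk.

```python
def dominates(a, b):
    """Return True if objective vector `a` Pareto-dominates `b`
    (assuming every objective is to be minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated(points):
    """Keep only the points that no other point dominates: a finite stand-in
    for the Pareto-optimal set an EMO algorithm searches for."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# Two conflicting objectives: (1, 4), (2, 2), (4, 1) are mutually non-dominated,
# while (3, 3) is dominated by (2, 2) and is filtered out.
print(non_dominated([(1, 4), (2, 2), (3, 3), (4, 1)]))
```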

Removing randomness from evolutionary algorithms

  • Darrell Whitley

It is natural to think of Evolutionary Algorithms as highly stochastic search methods. This can also make Evolutionary Algorithms, and particularly recombination, quite difficult to analyze. One way to reduce randomness involves the quadratization of functions, which is commonly used by modern optimization methods and also has applications in quantum computing. After a function is made quadratic, random mutation is obsolete and unnecessary; the location of improving moves can be calculated deterministically, on average in O(1) time. Seemingly impossible problems, such as Needle-in-a-Haystack, become trivial to solve in quadratic form. One can also provably tunnel, or jump, between local optima and quasi-local optima in O(n) time using deterministic genetic recombination.
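To make the deterministic improving-move idea concrete, here is a minimal Python sketch, not Whitley's implementation: for a quadratic pseudo-Boolean function f(x) = sum_i a[i]*x[i] + sum_{i<j} b[i,j]*x[i]*x[j] over bit strings, the gain of flipping each bit can be maintained incrementally, so after initialization each move costs time proportional to that bit's number of quadratic interactions rather than a full re-evaluation. All names and the toy coefficients below are assumptions made for illustration.

```python
def make_neighbors(n, b):
    """Adjacency lists for the quadratic terms: b[(i, j)] = weight with i < j."""
    nbrs = [[] for _ in range(n)]
    for (i, j), w in b.items():
        nbrs[i].append((j, w))
        nbrs[j].append((i, w))
    return nbrs

def init_gains(x, a, nbrs):
    """g[i] = f(x with bit i flipped) - f(x), for
    f(x) = sum_i a[i]*x[i] + sum_{i<j} b[i,j]*x[i]*x[j], with x[i] in {0, 1}."""
    return [(1 - 2 * x[i]) * (a[i] + sum(w * x[j] for j, w in nbrs[i]))
            for i in range(len(x))]

def flip(i, x, g, nbrs):
    """Flip bit i and update only the gains of the bits that interact with it,
    so the per-move cost depends on bit i's degree, not on n."""
    s = 1 - 2 * x[i]                      # +1 if the flip is 0 -> 1, else -1
    for j, w in nbrs[i]:
        g[j] += (1 - 2 * x[j]) * w * s
    x[i] = 1 - x[i]
    g[i] = -g[i]

# Toy example: maximize f over 4 bits; coefficients chosen arbitrarily.
a = [1, -2, 3, 0]
b = {(0, 1): 2, (1, 2): -1, (2, 3): 4}
x = [0, 0, 0, 0]
nbrs = make_neighbors(len(x), b)
g = init_gains(x, a, nbrs)
while max(g) > 0:                         # purely deterministic: take the best improving flip
    flip(g.index(max(g)), x, g, nbrs)
print(x)                                  # a local optimum reached without any random mutation
```

The final loop is a plain best-improvement hill climber; it is included only to show that, once the gains are available, search can proceed with no random mutation at all.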

The talk also explores how removing randomness from Evolutionary Algorithms might provide new insights into natural evolution. Finally, a form of evolutionary algorithm is proposed where premature convergence is impossible and the evolutionary potential of the population remains open-ended.