Advances In Genetic Programming

(AiGP-1 Book)

Last updated July 30, 1999

Edited by Kenneth E. Kinnear, Jr.
Published 1994 by
The MIT Press
Hardback. Approx. 525 pages.
MIT Press Order code: KINDH


Table of Contents of AiGP-1 Book

Part I -- Introduction

In this introductory section, the concepts necessary to understand the remainder of the book are explained.

1 A Perspective on the Work in this Book
Kenneth E. Kinnear, Jr.

In Chapter 1, a quick sketch of the overall field of evolutionary computation is offered and genetic programming is defined and placed in context within the field of evolutionary computation. A brief overview of the work contained in the rest of the book is presented, along with some practical advice on applying genetic programming.

2 Introduction to Genetic Programming
John R. Koza

Chapter 2 contains an introduction to the technology of genetic algorithms and genetic programming. The background on genetic programming supplied in this chapter or its equivalent is required to fully comprehend the contributions in the rest of the book. Both of the first two chapters contain extensive and distinct bibliographic references to additional information in the form of books, research papers, email mailing lists, email digests, and ftp sites.
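To give a flavor of the representation Chapter 2 introduces, here is a minimal sketch of genetic programming's core data structure and its subtree-crossover operator, in Python. All names and the tuple-based function set are hypothetical illustrations, not code from the book.

```python
import random

# A program is a nested list whose head is a function symbol and whose
# remaining elements are its arguments; terminals are plain symbols or
# constants. Function set and terminal set below are illustrative only.

def random_tree(depth, funcs, terms):
    """Grow a random program tree up to the given depth."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(terms)
    name, arity = random.choice(funcs)
    return [name] + [random_tree(depth - 1, funcs, terms) for _ in range(arity)]

def subtrees(tree, path=()):
    """Enumerate (path, subtree) pairs, for picking crossover points."""
    yield path, tree
    if isinstance(tree, list):
        for i, child in enumerate(tree[1:], start=1):
            yield from subtrees(child, path + (i,))

def replace(tree, path, new):
    """Return a copy of tree with the subtree at path replaced by new."""
    if not path:
        return new
    copy = list(tree)
    copy[path[0]] = replace(copy[path[0]], path[1:], new)
    return copy

def crossover(a, b):
    """Subtree crossover: swap a random subtree of a with one from b."""
    pa, sa = random.choice(list(subtrees(a)))
    pb, sb = random.choice(list(subtrees(b)))
    return replace(a, pa, sb), replace(b, pb, sa)

funcs = [("+", 2), ("*", 2)]
terms = ["x", 1]
parent1 = random_tree(3, funcs, terms)
parent2 = random_tree(3, funcs, terms)
child1, child2 = crossover(parent1, parent2)
```

Because crossover exchanges whole subtrees, the offspring are always syntactically valid programs, which is the property that makes the tree representation convenient for evolution.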

Part II -- Increasing the Power of Genetic Programming

This section contains contributions whose principal purpose is to increase the power of the genetic programming paradigm.

3 The Evolution of Evolvability in Genetic Programming
Lee Altenberg

The section leads off in Chapter 3 with a theoretical analysis of evolvability, the ability of a population to produce variants fitter than any yet existing. The chapter explores the relationships among the fitness function, the representation, and the genetic operators that produce evolvability, and analyzes how evolvability relates to the observed proliferation of common blocks of code within programs evolved using genetic programming. This theoretical analysis points the way to several intriguing and quite practical ways to improve the power of a genetic programming system.

4 Genetic Programming and Emergent Intelligence
Peter J. Angeline

Chapter 4 argues that evolutionary computations, including genetic programming, belong to a unique class of problem solvers that display emergent intelligence, a method of problem solving that allows task-specific knowledge to emerge from the interaction between a problem solver and a fitness function. It examines what is distinct about evolutionary computations and emergent intelligence, why they represent a novel approach to computational problem solving, and how that difference is manifested in genetic programming.

5 Scalable Learning in Genetic Programming using Automatic Function Definition
John R. Koza

The evolution of scalable learning is the subject of Chapter 5, where three differently sized variations of a particular problem are solved using genetic programming. It is demonstrated that the problem difficulty as perceived by genetic programming grows very rapidly as the scale of the problem is increased. The use of automatically defined functions, a technique which allows subprograms to evolve along with main programs in genetic programming, dramatically reduces the computational effort required to solve the problem and also reduces the rate of increase in difficulty as the scale of the problem is increased.
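To make the structure of an individual with automatically defined functions concrete, here is a hypothetical sketch: a function-defining branch (the ADF) evolves alongside a result-producing branch that may call it. Even-3-parity is used purely as an illustration, and the names and tree encoding are assumptions, not Koza's actual code.

```python
# An individual is a pair: a table of automatically defined functions
# (each a parameter list plus a body tree) and a result-producing
# branch that can call them. All names here are hypothetical.

def evaluate(node, env, adfs):
    """Recursively evaluate an expression tree against variable bindings."""
    if isinstance(node, str):
        return env[node]
    op, *args = node
    vals = [evaluate(a, env, adfs) for a in args]
    if op == "and": return vals[0] and vals[1]
    if op == "or":  return vals[0] or vals[1]
    if op == "not": return not vals[0]
    if op in adfs:  # call an automatically defined function
        params, body = adfs[op]
        return evaluate(body, dict(zip(params, vals)), adfs)
    raise ValueError(op)

# ADF0 computes XOR of its two parameters; the result-producing branch
# reuses it twice, turning odd parity into even-3-parity.
adfs = {"ADF0": (("a", "b"),
                 ("or", ("and", "a", ("not", "b")),
                        ("and", ("not", "a"), "b")))}
result_branch = ("not", ("ADF0", ("ADF0", "d0", "d1"), "d2"))

def parity3(d0, d1, d2):
    return evaluate(result_branch, {"d0": d0, "d1": d1, "d2": d2}, adfs)
```

The point of the structure is visible even in this toy: the evolved subprogram ADF0 is defined once and invoked twice, the kind of code reuse that reduces the effort needed as the problem scales.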

6 Alternatives in Automatic Function Definition: A Comparison of Performance
Kenneth E. Kinnear, Jr.

Chapter 6 begins with a comparison of performance between the automatically defined functions from Chapter 5 and an alternative approach to the evolution of subprograms called modules, defined in Chapter 4. On the problem used for comparison, automatically defined functions show a large performance gain and modules do not. A variety of hypotheses to explain these differences are tested with experiments, yielding the conclusion that a particular form of structural regularity created by automatically defined functions is largely responsible for their increase in performance.

7 The Donut Problem: Scalability, Generalization and Breeding Policies in Genetic Programming
Walter Alden Tackett, Aviram Carmi

A classification problem with scalable difficulty is presented in Chapter 7. This problem is designed to be analogous to real-world problems, where the solution is frequently unknown and an exact solution is often not possible. It is used to test the performance of a genetic programming system at evolving a classification algorithm while the size and degree of representation of the training data set are varied. Also reported are the effects on performance of locality, in which the spatial relationship of members of the population is taken into account as part of the breeding scheme, and of using steady-state genetic programming.

8 Effects of Locality in Individual and Population Evolution
Patrik D'haeseleer, Jason Bluming

Chapter 8 focuses on the effect of taking locality into account in breeding scheme design as well as during fitness evaluation. In this contribution the dynamics of the population as a whole are examined instead of the usual focus on the best individual in order to analyze and understand the effects of introducing locality into a genetic programming system. A variety of population analysis tools are demonstrated which could be applied to a range of similar problems.

9 The Evolution of Mental Models
Astro Teller

Most problems do not have solutions that are simple mappings from the inputs to the correct outputs; some kind of internal state or memory is needed to operate effectively on these problems. Chapter 9 proposes a simple addition to the genetic programming paradigm that seamlessly incorporates the evolution of the effective gathering, storage, and retrieval of arbitrarily complex state information. Experimental results show that the effective production and use of complex memory structures can be evolved, and that functions that do so perform better than those that do not.
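One common way to give evolved programs state, sketched hypothetically below, is to add read and write primitives over an indexed memory array to the function set; whether this matches Chapter 9's exact mechanism is an assumption, and all names are illustrative.

```python
# Hedged sketch: indexed-memory primitives that an evolved program
# could call. Memory persists across calls within one fitness case,
# so a program can accumulate and reuse state. Names are hypothetical.

MEM_SIZE = 8

def make_primitives(memory):
    def mem_read(i):
        return memory[int(i) % MEM_SIZE]     # index wraps, so any evolved
    def mem_write(i, v):                     # integer argument is safe
        memory[int(i) % MEM_SIZE] = v
        return v  # write returns its value so it composes in expressions
    return mem_read, mem_write

memory = [0] * MEM_SIZE
mem_read, mem_write = make_primitives(memory)

# An evolved expression could, for example, keep a running total:
mem_write(0, mem_read(0) + 5)
mem_write(0, mem_read(0) + 7)
```

Because both primitives are total functions (the index wraps and write returns its argument), crossover can place them anywhere in a program tree without producing invalid code.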

10 Evolution of Obstacle Avoidance Behavior: Using Noise to Promote Robust Solutions
Craig W. Reynolds

Chapter 10 reports on investigations into the evolution of reactive control programs for obstacle avoidance in a "robot-like" simulated vehicle for a corridor-following task. Typically, controllers developed for tasks such as these are quite brittle and overly specific, and the central focus of this chapter is the addition of noise to the fitness evaluation process in order to increase the robustness of the evolved solutions. Noise is shown to aid in the evolution of robustness, but it is not without associated costs, which are examined.

11 Pygmies and Civil Servants
Conor Ryan

Premature convergence to a suboptimal solution is sometimes a cause of difficulty in genetic programming systems. Chapter 11 proposes a new selection and breeding scheme, similar to disassortative mating taken directly from population genetics, to reduce premature convergence. This scheme, called the Pygmy algorithm, is shown to allow the use of small populations and strong selective pressure with little chance of losing genetic diversity.

12 Genetic Programming Using a Minimum Description Length Principle
Hitoshi Iba, Hugo de Garis, Taisuke Sato

The minimum description length principle is offered in Chapter 12 as a way to avoid the relentless growth of genetic programming programs. The use of the minimum description length principle in the fitness function of a genetic programming system not only keeps the size of the evolving individuals within bounds, but also increases the capability of the system to evolve general individuals. MDL based fitness is not appropriate for every problem, and criteria are presented for its effective use.
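The idea behind an MDL-based fitness measure can be sketched as a two-part cost: bits to encode the program itself plus bits to encode the cases it gets wrong, so that smaller, more general programs score better. The weighting below is an illustrative assumption, not the formula from Chapter 12.

```python
import math

# Hypothetical MDL-style fitness: lower is better. Model cost grows
# with tree size, penalizing bloat; data cost grows with the number of
# training cases the program misclassifies.

def tree_size(tree):
    """Count the nodes in a nested-list program tree."""
    if not isinstance(tree, (list, tuple)):
        return 1
    return 1 + sum(tree_size(c) for c in tree[1:])

def mdl_fitness(tree, n_wrong, n_cases, bits_per_node=2.0):
    model_cost = bits_per_node * tree_size(tree)
    # Cost of listing which cases the program gets wrong (exception coding).
    data_cost = n_wrong * math.log2(max(n_cases, 2))
    return model_cost + data_cost

small = ["and", "a", "b"]
bloated = ["and", ["or", "a", "a"], ["or", "b", "b"]]
```

With equal accuracy the smaller tree wins, and with equal size the more accurate one wins, which is how the measure both bounds program growth and rewards generality.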

13 Genetic Programming in C++: Implementation Issues
Mike J. Keith, Martin C. Martin

In Chapter 13 the design of a high performance genetic programming system is discussed. A large number of implementation issues are examined; although the focus is on the C++ programming language, the concepts are applicable to any high-performance language. Both execution-time and memory performance are examined, and proposals for the optimization of each are illustrated by experiments. For anyone about to write a genetic programming system, this chapter is a tremendous help.

14 A Compiling Genetic Programming System that Directly Manipulates the Machine Code
Peter Nordin

Most genetic programming systems use a technique where a problem specific language is executed by an interpreter, essentially a virtual machine. This is highly flexible, but creates significant overhead. In a successful attempt to dramatically improve performance, Chapter 14 discusses a system which uses as its representation actual machine code. Each individual is a segment of machine code, and these segments are evolved directly using genetic operators. Fitness evaluation consists of giving control to the machine code which is the individual. The speedups available are very exciting, and the approach deserves serious consideration by anyone with a particularly complex problem.
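The interpreter overhead that Chapter 14 eliminates can be seen in a toy virtual machine: every instruction pays for dispatch through the loop below, a cost that disappears when the individual is native machine code given control directly. The stack machine and its opcodes are hypothetical illustrations, not Nordin's system.

```python
# A toy stack-machine interpreter of the kind many GP systems use for
# fitness evaluation. Each instruction is dispatched by comparing its
# opcode string, which is the per-instruction overhead a compiling,
# machine-code representation avoids. All opcodes are hypothetical.

def run(program, x):
    """Interpret a linear program over a single input x."""
    stack = []
    for op, arg in program:
        if op == "push":
            stack.append(arg)
        elif op == "load":
            stack.append(x)
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack.pop()

# Evaluates x * x + 3:
prog = [("load", None), ("load", None), ("mul", None),
        ("push", 3), ("add", None)]
```

Since fitness evaluation executes every individual on every training case each generation, removing this dispatch layer is where the dramatic speedups reported in the chapter come from.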

Part III -- Innovative Applications of Genetic Programming

In this section, a wide range of applications of genetic programming are presented. Many of these contributions also include powerful and generally applicable techniques for improving the power of genetic programming.

15 Automatic Generation of Programs for Crawling and Walking
Graham Spencer

Chapter 15 leads off the section by evolving a program to enable a six-legged insect to walk in a simulated environment, using a minimum of a priori knowledge about the task of locomotion. In every trial of every experiment, genetic programming evolved programs capable of efficient walking, and often the solutions were better than those of hand-coded programs. Several enhancements to the genetic programming system are also presented with applicability to problems other than evolving walking behavior.

16 Genetic Programming for the Acquisition of Double Auction Market Strategies
Martin Andrews, Richard Prager

The double auction is the mechanism behind the minute-by-minute trading on many futures and commodity exchanges. Since 1990, double auction tournaments have been held by the Santa Fe Institute, where the competitors in the tournaments are strategies embodied in computer programs written by a variety of economists, computer scientists and mathematicians. In Chapter 16, genetic programming is used to evolve strategies superior to many hand-coded strategies from the Santa Fe Institute playoffs.

17 Two Scientific Applications of Genetic Programming: Stack Filters and Non-Linear Equation Fitting to Chaotic Data
Howard Oakley

Chapter 17 looks first at an engineering design problem, that of the design of a filter for removal of noise from experimental data. A filter designed by genetic programming is compared to one designed by heuristic search and another designed by a classical genetic algorithm. The filter designed by genetic programming is shown to be superior. In addition, genetic programming is used to fit empirical equations to a chaotic time-series (the Mackey-Glass equation) and non-linear physiological data. The key role of the fitness function is demonstrated, along with several other conclusions relevant to non-linear time series experiments.

18 The Automatic Generation of Plans for a Mobile Robot via Genetic Programming with Automatically Defined Functions
Simon G. Handley

Planning is examined in Chapter 18, where genetic programming is used to create plans for tasks in a simulated world. An illustrative discussion of the creation of a fitness function is followed by a comparison of plans resulting from classical genetic programming and solutions to the same problem using automatically defined functions. An analysis of several successful individuals from the runs using automatically defined functions illustrates interesting sub-goal generation behavior.

19 Competitively Evolving Decision Trees Against Fixed Training Cases for Natural Language Processing
Eric V. Siegel

Competitive fitness functions can generate superior performance for some problems, but in situations where the training cases for the fitness function are derived from real world data, actual evolution of the fitness cases is not possible. Chapter 19 presents a technique where competitive fitness functions can be used even with a fixed and non-evolvable set of training cases, illustrated by evolving decision trees for word sense disambiguation of natural language.

20 Cracking and Co-Evolving Randomizers
Jan Jannink

Chapter 20 discusses two experiments designed to test the applicability of genetic programming to the analysis and the unconstrained construction of pseudo-random number generators, or randomizers. The first experiment attempts to unravel the structure of a number of generators recommended in the literature by guessing their output given previous output data. The second examines the possibility that co-evolving populations of programs in a competitive environment can produce randomizers which conform to specific criteria without explicitly testing against those criteria as part of the evolutionary process.

21 Optimizing Confidence of Text Classification by Evolution of Symbolic Expressions
Brij Masand

Genetic programming is used to assign confidence measures for another machine learning paradigm, memory based reasoning (a k-nearest neighbor method) in Chapter 21. News stories are given keywords using memory based reasoning, and genetic programming is used to evolve an algorithm designed to distinguish between news stories which have correctly been assigned keywords, and those which have not and must be referred to a human editor for classification. The algorithm evolved by genetic programming is superior to the best hand-coded algorithm for the same task.

22 Evolvable 3D Modeling for Model-Based Object Recognition Systems
Thang Nguyen, Thomas Huang

Chapter 22 presents a system which evolves 3-D models over time, eventually producing novel models that are more desirable than initial models. Simulation results for 3-D jet aircraft model evolution illustrate that this approach to model design and refinement is feasible and effective. This approach is intended for use in automatic object recognition systems, though the model fitness is currently determined by interactive human selection.

23 Automatically Defined Features: The Simultaneous Evolution of 2-Dimensional Feature Detectors and an Algorithm for Using Them
David Andre

The automatic discovery of 2-dimensional features is the application examined in Chapter 23. It presents a hybrid system, with 2-dimensional hit-miss matrices evolved by a genetic algorithm and these matrices then utilized by a program evolved with genetic programming. This system is applied to a subset of the problem of handwriting recognition, specifically digit recognition, and individuals are evolved which can recognize very low resolution digits. Possibilities for expansion into a full-size character recognition system are discussed.

24 Genetic Micro Programming of Neural Networks
Frederic Gruau

Chapter 24 presents a novel approach for the evolution of neural networks using genetic programming. Cellular encoding is presented as a method for encoding families of similarly structured Boolean neural networks, and is shown to be a particularly rich and effective representation mechanism for the evolution of such networks. This result is then generalized in a discussion of the appropriate representation language for a variety of tasks.
