Exploring multi-objective extensions to the NEAT algorithm described by Stanley & Miikkulainen, 2002.

thecodemaiden/MO-NEAT

MO-NEAT

(Multi-Objective NeuroEvolution of Augmenting Topologies)

MO-NEAT is my attempt to adapt the general framework of NEAT, as described in "Evolving Neural Networks Through Augmenting Topologies" (Stanley & Miikkulainen, 2002), to multi-objective optimization.

The most valuable ideas borrowed from the paper are genetic history (innovation numbers) and the alignable genomes it enables: because every gene carries a global innovation number, recombination and inter-individual distance reduce to simple gene-by-gene comparison. Niching is also maintained, though not necessarily as the explicit speciation scheme used by Stanley & Miikkulainen.
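To illustrate how innovation numbers make genomes alignable, here is a minimal sketch of the compatibility distance from the NEAT paper. It is not this repository's actual API; the genome representation (a dict from innovation number to connection weight) and the function name are illustrative assumptions.

```python
def compatibility(g1, g2, c1=1.0, c2=1.0, c3=0.4):
    """NEAT compatibility distance (Stanley & Miikkulainen, 2002).

    Genomes are represented here as dicts mapping a global innovation
    number to a connection weight, so alignment is just a key
    comparison -- no graph matching is needed.
    """
    keys1, keys2 = set(g1), set(g2)
    matching = keys1 & keys2
    cutoff = min(max(keys1), max(keys2))
    # Excess genes lie beyond the other genome's highest innovation
    # number; disjoint genes are the remaining non-matching genes.
    non_matching = keys1 ^ keys2
    excess = sum(1 for k in non_matching if k > cutoff)
    disjoint = len(non_matching) - excess
    # Mean weight difference of the aligned (matching) genes.
    w_bar = (sum(abs(g1[k] - g2[k]) for k in matching) / len(matching)
             if matching else 0.0)
    # N normalizes by genome size (the paper sets N=1 for small genomes).
    n = max(len(g1), len(g2))
    return c1 * excess / n + c2 * disjoint / n + c3 * w_bar
```

The same alignment underlies recombination: matching genes are inherited from either parent, while disjoint and excess genes come from the fitter one.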

The current implementation contains two variants of the NEAT algorithm: MONEAT, which mixes NEAT with NSGA-II (Deb, Pratap, Agarwal, & Meyarivan, 2000), and another based on SPEA2 (Zitzler, Laumanns, & Thiele, 2001). There is also a base class, BaseNEAT, that provides the framework for additional NEAT-based algorithms, whether single- or multi-objective.
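The selection machinery both variants build on is Pareto ranking. As background, here is a sketch of NSGA-II's fast non-dominated sort over objective vectors (minimization); it is a generic illustration of the published algorithm, not code from this repository.

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))


def non_dominated_fronts(points):
    """Partition objective vectors into successive Pareto fronts,
    as in NSGA-II's fast non-dominated sort."""
    n = len(points)
    dominated_by = [[] for _ in range(n)]  # indices each point dominates
    dom_count = [0] * n                    # how many points dominate this one
    for i in range(n):
        for j in range(n):
            if dominates(points[i], points[j]):
                dominated_by[i].append(j)
            elif dominates(points[j], points[i]):
                dom_count[i] += 1
    fronts = []
    current = [i for i in range(n) if dom_count[i] == 0]
    while current:
        fronts.append(current)
        nxt = []
        for i in current:
            for j in dominated_by[i]:
                dom_count[j] -= 1
                if dom_count[j] == 0:  # all its dominators are in earlier fronts
                    nxt.append(j)
        current = nxt
    return fronts
```

SPEA2 ranks differently (each individual's strength is derived from how many solutions it dominates, plus a density term), but it rests on the same dominance relation.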

The algorithms are run on a simple neural network implementation, BasicNN. Any network can be written out in Graphviz dot format for visualization. As a simple test, a network computing XOR can be evolved correctly. As a test of multiple objectives, there is also a task to evolve a network with two outputs, one that identifies multiples of 2 and the other multiples of 3, where the input is a 4-digit binary number.
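To make the two-objective test concrete, the sketch below builds the 16-case dataset and scores a network with one error objective per output, which is the natural shape for Pareto-based selection. BasicNN's real interface is not shown here; the callable-network convention and both function names are assumptions for illustration.

```python
def divisibility_dataset():
    """The two-objective test case: for each 4-bit input (MSB first),
    output 0 should fire for multiples of 2 and output 1 for
    multiples of 3."""
    cases = []
    for n in range(16):
        bits = [(n >> i) & 1 for i in range(3, -1, -1)]
        cases.append((bits, [float(n % 2 == 0), float(n % 3 == 0)]))
    return cases


def objectives(network, cases):
    """Per-output sum of squared errors: one objective per output,
    lower is better. `network` is any callable mapping an input
    bit list to two outputs."""
    errs = [0.0, 0.0]
    for inputs, targets in cases:
        outputs = network(inputs)
        for k in range(2):
            errs[k] += (outputs[k] - targets[k]) ** 2
    return errs
```

Keeping the two errors separate, rather than summing them into one fitness, is what lets the Pareto ranking above reward a network that solves one divisibility test even while it still fails the other.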

My next task is to test performance on time-series input, i.e., with recurrent networks.

Examples and results will be available soon.
