NEAT (NeuroEvolution of Augmenting Topologies) is a method developed by Kenneth O. Stanley for evolving arbitrary neural networks. This project is a pure-Python implementation of NEAT with no dependencies beyond the standard library. It was forked from the excellent project by @MattKallada.
For further information regarding general concepts and theory, please see the publications page of Stanley's current website.
neat-python is licensed under the 3-clause BSD license. It currently supports Python 3.8 through 3.14 and PyPy 3.
- `fitness_criterion = min` now works correctly. Previously, only the termination check honored this setting; best-genome tracking, stagnation detection, elite selection, crossover parent selection, spawn allocation, and statistics reporting all hardcoded "higher is better." All fitness comparisons throughout the library now respect the configured criterion.
- Checkpoints no longer repeat work on restore. Checkpoints are now saved after fitness evaluation (in `post_evaluate`) instead of after reproduction (in `end_generation`). Restoring a checkpoint skips the already-completed evaluation and proceeds directly to reproduction. For experiments with expensive fitness functions this eliminates potentially hours of redundant computation per restore. Checkpoint file N now means "generation N has been evaluated." Old checkpoint files (5-tuple format) are still loadable.
- Reporter output no longer mixes generation boundaries. The species detail table printed by `StdOutReporter` previously appeared in `end_generation` using the post-reproduction population, which belongs to the next generation. It now appears in `post_evaluate` alongside the fitness statistics, so all output under the "Running generation N" banner is consistent.
- Fixed two double-buffer bugs in `CTRNN.advance`. Incorrect buffer swapping could cause state corruption during multi-step CTRNN evaluation.
- Fixed aggregation validation for builtins and callables.
Configurable options to more closely match Stanley & Miikkulainen (2002), with backward-compatible defaults:

- Connection gene matching by innovation number in the distance function (with separate `excess_coefficient`)
- Canonical fitness sharing (`fitness_sharing = canonical`)
- Proportional spawn allocation (`spawn_method = proportional`)
- Interspecies crossover (`interspecies_crossover_prob`)
- Dynamic compatibility threshold adjustment (`compatibility_threshold_adjustment`)
- 75% disable rule fix: replaces (rather than layers on) the inherited enabled value
- Pruning of dangling nodes after deletion mutations
- Node gene distance contribution and enable/disable penalty are now configurable
- Optional GPU-accelerated evaluation for CTRNN and Izhikevich networks via CuPy (`pip install 'neat-python[gpu]'`). Imports are lazy: `import neat` never triggers a GPU dependency.
- CTRNN integration switched from forward Euler to exponential Euler (ETD1) for improved numerical stability.
- 55 new unit tests covering feature gaps (618 total).
- Sphinx 9.x documentation build compatibility fix.
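The new options above are ordinary config-file settings. A hypothetical excerpt is sketched below; the option names come from this release's notes, but the sections they are placed in and the example values are assumptions, so consult the configuration documentation for the authoritative layout and defaults.

```ini
; Hypothetical excerpt. Option names are from the release notes;
; section placement and values shown here are illustrative guesses.
[NEAT]
fitness_criterion = min
fitness_sharing   = canonical

[DefaultSpeciesSet]
compatibility_threshold            = 3.0
compatibility_threshold_adjustment = dynamic

[DefaultReproduction]
spawn_method                = proportional
interspecies_crossover_prob = 0.001

[DefaultGenome]
excess_coefficient = 1.0
```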
The CTRNN (Continuous-Time Recurrent Neural Network) implementation now supports per-node evolvable time constants. In v1.x, all nodes shared a single fixed time constant passed at network creation time. In v2.0, each node carries its own time constant as an evolved gene attribute, allowing the network to operate across multiple timescales simultaneously.
This is a breaking API change: `CTRNN.create(genome, config, time_constant)` is now `CTRNN.create(genome, config)`. Existing feedforward and discrete-time recurrent configurations require no changes.
For details on the change, its motivation, quantitative impact, and migration guide, see CTRNN-CHANGES.pdf.
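To illustrate the exponential Euler (ETD1) integration mentioned in the release notes, here is a minimal, self-contained sketch of the scheme for the linear node equation `tau * dy/dt = -y + drive`, with a distinct time constant per node. This is a generic numerical illustration, not neat-python's actual `CTRNN` code; the function names are hypothetical.

```python
import math

def exp_euler_step(y, drive, tau, dt):
    """One exponential-Euler (ETD1) step for tau * dy/dt = -y + drive,
    holding `drive` constant over the step. For this linear ODE the
    update is exact, so it stays stable for any dt."""
    return [s + (yi - s) * math.exp(-dt / t)
            for yi, s, t in zip(y, drive, tau)]

def forward_euler_step(y, drive, tau, dt):
    """Plain forward-Euler step; it overshoots badly when dt >> tau."""
    return [yi + (dt / t) * (s - yi) for yi, s, t in zip(y, drive, tau)]

# Two nodes with very different (per-node) time constants.
y = [0.0, 0.0]
drive = [1.0, 1.0]
tau = [0.05, 2.0]   # fast node, slow node
dt = 0.2            # larger than the fast node's tau

y_exp = exp_euler_step(y, drive, tau, dt)
y_fwd = forward_euler_step(y, drive, tau, dt)
# Fast node: forward Euler jumps to 0 + 4*(1-0) = 4.0, past the
# equilibrium at 1.0, while exponential Euler stays within [0, 1].
```

The stability difference is why a large step relative to an evolved, possibly very small, per-node time constant no longer blows up the state.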
- Pure Python implementation with no dependencies beyond the standard library
- Supports Python 3.8-3.14 and PyPy 3
- Reproducible evolution: set random seeds for deterministic, repeatable experiments
- Parallel fitness evaluation using multiprocessing
- Network export to JSON format for interoperability
- Comprehensive documentation and examples
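The reproducibility bullet above boils down to seeding; a minimal sketch with the standard library, under the assumption that neat-python draws its randomness from Python's global `random` module:

```python
import random

def noisy_sequence(seed, n=5):
    """Draw n pseudo-random numbers after seeding the global RNG.
    (Stand-in for an evolution run; the helper name is hypothetical.)"""
    random.seed(seed)
    return [random.random() for _ in range(n)]

run_a = noisy_sequence(42)
run_b = noisy_sequence(42)
assert run_a == run_b  # same seed -> identical draws, hence a repeatable run
```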
If you want to try neat-python, please check out the repository, start playing with the examples (`examples/xor` is a good place to start), and then try creating your own experiment.
The documentation is available on Read The Docs.
You can also ask questions via the experimental support agent!
neat-python supports exporting trained networks to a JSON format that is framework-agnostic and human-readable. This allows you to:
- Convert networks to other formats (ONNX, TensorFlow, PyTorch, etc.) using third-party tools (the beginnings of a conversion system can be found in the `examples/export` directory)
- Inspect and debug network structure
- Share networks across platforms and languages
- Archive trained networks independently of neat-python
Example:

```python
import neat
from neat.export import export_network_json

# After training...
winner_net = neat.nn.FeedForwardNetwork.create(winner, config)

# Export to JSON
export_network_json(
    winner_net,
    filepath='my_network.json',
    metadata={'fitness': winner.fitness, 'generation': 42},
)
```

See docs/network-json-format.md for complete format documentation and guidance for creating converters to other frameworks.
If you use this project in a publication, please cite both the software and the original NEAT paper. The listed authors are the originators and/or maintainers of all iterations of the project up to this point. If you have contributed and would like your name added to the citation, please submit an issue.
APA
McIntyre, A., Kallada, M., Miguel, C. G., Feher de Silva, C., & Netto, M. L. neat-python (Version 2.0.1) [Computer software]. https://doi.org/10.5281/zenodo.19024753
Bibtex
@software{McIntyre_neat-python,
author = {McIntyre, Alan and Kallada, Matt and Miguel, Cesar G. and Feher de Silva, Carolina and Netto, Marcio Lobo},
title = {{neat-python}},
version = {2.0.1},
doi = {10.5281/zenodo.19024753},
url = {https://github.com/CodeReclaimers/neat-python}
}
Many thanks to the folks who have cited this repository in their own work.
neat-python is developed and maintained by Alan McIntyre (CodeReclaimers LLC, ORCID: 0000-0002-8071-4219).
Alan McIntyre is an independent consultant with 28+ years of software development experience and an MS in Applied Mathematics. Specializations include computational geometry, CAD reverse engineering, C++ scientific computing, and Python scientific computing.
Available for research consulting and implementation engagements. Full profile: https://codereclaimers.com/consulting Contact: [email protected]