PLoS One. 2024 Jul 15;19(7):e0307084. doi: 10.1371/journal.pone.0307084. eCollection 2024.

Improving the performance of mutation-based evolving artificial neural networks with self-adaptive mutations


Motoaki Hiraga et al. PLoS One, 2024.

Abstract

Neuroevolution is a promising approach for designing artificial neural networks with an evolutionary algorithm. Unlike the currently dominant gradient-based training methods, neuroevolution can evolve the topology and the weights of a neural network simultaneously. In neuroevolution with topological evolution, handling crossover is challenging because of the competing conventions problem. The mutation-based evolving artificial neural network (MBEANN) is an alternative topology-and-weights neuroevolution approach that omits crossover and relies solely on mutations for genetic variation. This study enhances the performance of MBEANN in two ways. First, the mutation step size, which controls the magnitude of parameter perturbations, is adjusted automatically by a self-adaptive mutation mechanism, balancing exploration and exploitation during evolution. Second, the structural mutation probabilities are adjusted automatically according to the network size, preventing excessive expansion of the topology. The proposed methods are compared with conventional neuroevolution algorithms on locomotion tasks from the OpenAI Gym benchmarks. The results demonstrate that the proposed methods with the self-adaptive mutation mechanism achieve better performance. In addition, adjusting the structural mutation probabilities mitigates topological bloat while maintaining performance.
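Self-adaptive mutation of a step size is a standard device in evolution strategies: the step size is itself mutated before it is used to perturb the parameters, so large steps dominate early exploration and shrink as the search converges. The sketch below shows the common log-normal update; the exact rule and the choice of the learning rate tau in the paper may differ (the 1/sqrt(n) heuristic here is an assumption).

```python
import math
import random

def self_adaptive_mutation(weights, sigma, tau=None):
    """Log-normal step-size self-adaptation in the style of evolution
    strategies; a sketch, not the paper's exact update rule."""
    n = max(len(weights), 1)
    if tau is None:
        tau = 1.0 / math.sqrt(n)  # common ES heuristic (assumed)
    # Mutate the step size first, then use it to perturb every weight.
    new_sigma = sigma * math.exp(tau * random.gauss(0.0, 1.0))
    new_weights = [w + new_sigma * random.gauss(0.0, 1.0) for w in weights]
    return new_weights, new_sigma
```

Because the step size is inherited along with the weights, selection implicitly favors individuals whose step size matched the local fitness landscape.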

Conflict of interest statement

The authors have declared that no competing interests exist.

Figures

Fig 1
Fig 1. Example of the genotype-phenotype mapping in MBEANN.
A genome consists of operons, each of which corresponds to a subnetwork within the neural network. In this example, the genome of the neural network consists of three operons, that is, operon0, operon1, and operon2. Note that operon0 includes only nodes from the input and output layers, along with the direct connections between them. When the add-node mutation is applied to operon0, a new operon is created using the added hidden node.
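The operon-based genotype described in the caption could be laid out, for illustration, as follows; the dict layout, field names, and zero initial weight are assumptions made for the sketch, not the paper's encoding.

```python
def make_genome(n_inputs, n_outputs):
    """Illustrative MBEANN-style genome. Operon 0 holds only the
    input/output nodes and the direct connections between them,
    matching the mapping shown in Fig 1; later operons are created
    by add-node mutations."""
    inputs = [f"in{i}" for i in range(n_inputs)]
    outputs = [f"out{i}" for i in range(n_outputs)]
    operon0 = {
        "nodes": set(inputs + outputs),
        # Fully connect inputs to outputs; 0.0 is a placeholder weight.
        "links": [(i, o, 0.0) for i in inputs for o in outputs],
    }
    return {"operons": [operon0]}
```

Grouping a hidden node with the connections attached to it keeps each operon a self-contained subnetwork, which is what makes crossover-free variation tractable.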
Fig 2
Fig 2. Example of the add-node mutation.
In this figure, the connection of link0, which has the weight value of w0, is selected and replaced with node3, link2, and link3. If the selected connection to be replaced belongs to operon0, a new operon is generated with the new node and connections.
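The split described in the caption can be sketched as below, using the same dict-of-operons layout as the Fig 1 example. The figure fixes only the structural replacement; the weight assignment for the two new links (incoming keeps the old weight, outgoing gets 1.0) is an assumption.

```python
import random

def add_node_mutation(genome, new_node_id):
    """Replace one connection with a new hidden node and two links.
    genome is {'operons': [{'nodes': set, 'links': [(src, dst, w)]}]};
    this layout and the new-link weights are illustrative assumptions."""
    op_idx = random.randrange(len(genome["operons"]))
    operon = genome["operons"][op_idx]
    if not operon["links"]:
        return
    old = random.choice(operon["links"])
    operon["links"].remove(old)
    src, dst, w = old
    # Incoming link keeps the old weight w; the outgoing 1.0 is assumed.
    new_links = [(src, new_node_id, w), (new_node_id, dst, 1.0)]
    if op_idx == 0:
        # Splitting a connection of operon 0 spawns a new operon,
        # as described in the caption.
        genome["operons"].append({"nodes": {new_node_id}, "links": new_links})
    else:
        operon["nodes"].add(new_node_id)
        operon["links"].extend(new_links)
```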
Fig 3
Fig 3. Example of the add-connection mutation.
A new connection with the weight value of w5 is created from node1 to node3. The nodes to be connected are selected either from two nodes within the same operon or from one in operon0 and the other in the operon being mutated. The weight value of the new connection is set to zero.
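A minimal sketch of this mutation, again assuming the dict-based operon layout; the endpoint selection and zero weight follow the caption, while the rejection-sampling loop and `max_tries` bound are assumptions for the sketch.

```python
import random

def add_connection_mutation(operon0_nodes, operon, max_tries=1000):
    """Add one zero-weight connection, as in Fig 3. operon is a dict
    {'nodes': set, 'links': [(src, dst, weight)]}; candidates are two
    nodes of the mutated operon, or one of its nodes paired with a
    node from operon 0."""
    pool = sorted(operon["nodes"]) + sorted(operon0_nodes)
    existing = {(s, d) for s, d, _ in operon["links"]}
    for _ in range(max_tries):
        src, dst = random.choice(pool), random.choice(pool)
        # At least one endpoint must lie in the operon being mutated.
        if src not in operon["nodes"] and dst not in operon["nodes"]:
            continue
        if src == dst or (src, dst) in existing:
            continue
        # New connections start at weight zero, per the caption.
        operon["links"].append((src, dst, 0.0))
        return (src, dst)
    return None
```

Starting the new connection at weight zero leaves the network's behavior unchanged at first, so the topology can grow without an immediate fitness penalty.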
Fig 4
Fig 4. Screenshots of (A) HalfCheetah-v4 and (B) Ant-v4 provided in OpenAI Gym using the MuJoCo physics simulator.
Fig 5
Fig 5. Transitions of the fitness value of the best individual in HalfCheetah-v4.
Each line represents the mean of the best fitness values over 15 trials, and the shaded regions around them indicate the standard deviations.
Fig 6
Fig 6. Transitions of the fitness value of the best individual in Ant-v4.
Each line represents the mean of the best fitness values over 15 trials, and the shaded regions around them indicate the standard deviations.
Fig 7
Fig 7. Transitions of the network structure of the best individual in HalfCheetah-v4.
(A) Transitions of the number of nodes in the individual, including 17 input and 6 output nodes. (B) Transitions of the number of connections. Each line represents the mean over 15 trials, and the shaded regions around them indicate the standard deviations.
Fig 8
Fig 8. Transitions of the network structure of the best individual in Ant-v4.
(A) Transitions of the number of nodes in the individual, including 27 input and 8 output nodes. (B) Transitions of the number of connections. Each line represents the mean over 15 trials, and the shaded regions around them indicate the standard deviations.
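The size-dependent structural mutation probabilities mentioned in the abstract, which keep curves like those in Figs 7 and 8 from growing without bound, could take a form such as the following. The inverse-proportional schedule, `base_prob`, and `ref_size` are all assumptions for illustration, not the paper's rule.

```python
def structural_mutation_prob(n_nodes, base_prob=0.1, ref_size=30):
    """Illustrative schedule: the probability of applying a structural
    mutation (add-node / add-connection) shrinks once the network
    exceeds a reference size, discouraging topological bloat."""
    return base_prob * min(1.0, ref_size / max(n_nodes, 1))
```

Any monotonically decreasing function of network size would have the same qualitative effect: small networks keep expanding freely, while large ones mutate mostly through weight perturbations.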
Fig 9
Fig 9. Results of the re-evaluation for 100 trials using the best-evolved individuals.
Fig 10
Fig 10. Transitions of the step size of the best individual in HalfCheetah-v4.
Each line represents the mean over 15 trials, while the shaded regions around them show the standard deviations.
Fig 11
Fig 11. Transitions of the step size of the best individual in Ant-v4.
Each line represents the mean over 15 trials, while the shaded regions around them show the standard deviations.
