Hands-On Neuroevolution with Python
eBook - ePub

Hands-On Neuroevolution with Python

Build high-performing artificial neural network architectures using neuroevolution-based algorithms

Iaroslav Omelianenko

  1. 368 pages
  2. English
  3. ePUB (mobile friendly)
  4. Available on iOS & Android

Book Information

Increase the performance of various neural network architectures using NEAT, HyperNEAT, ES-HyperNEAT, Novelty Search, SAFE, and deep neuroevolution

Key Features

  • Implement neuroevolution algorithms to improve the performance of neural network architectures
  • Understand evolutionary algorithms and neuroevolution methods with real-world examples
  • Learn essential neuroevolution concepts and how they are used in domains including games, robotics, and simulations

Book Description

Neuroevolution is a form of artificial intelligence learning that uses evolutionary algorithms to simplify the process of solving complex tasks in domains such as games, robotics, and the simulation of natural processes. This book will give you comprehensive insights into essential neuroevolution concepts and equip you with the skills you need to apply neuroevolution-based algorithms to solve practical, real-world problems.

You'll start with learning the key neuroevolution concepts and methods by writing code with Python. You'll also get hands-on experience with popular Python libraries and cover examples of classical reinforcement learning, path planning for autonomous agents, and developing agents to autonomously play Atari games. Next, you'll learn to solve common and not-so-common challenges in natural computing using neuroevolution-based algorithms. Later, you'll understand how to apply neuroevolution strategies to existing neural network designs to improve training and inference performance. Finally, you'll gain clear insights into the topology of neural networks and how neuroevolution allows you to develop complex networks, starting with simple ones.

By the end of this book, you will not only have explored existing neuroevolution-based algorithms, but also have the skills you need to apply them in your research and work assignments.

What you will learn

  • Discover the most popular neuroevolution algorithms – NEAT, HyperNEAT, and ES-HyperNEAT
  • Explore how to implement neuroevolution-based algorithms in Python
  • Get up to speed with advanced visualization tools to examine evolved neural network graphs
  • Understand how to examine the results of experiments and analyze algorithm performance
  • Delve into neuroevolution techniques to improve the performance of existing methods
  • Apply deep neuroevolution to develop agents for playing Atari games

Who this book is for

This book is for machine learning practitioners, deep learning researchers, and AI enthusiasts who are looking to implement neuroevolution algorithms from scratch. Working knowledge of the Python programming language and basic knowledge of deep learning and neural networks are mandatory.


Information

Year
2019
ISBN
9781838822002

Section 1: Fundamentals of Evolutionary Computation Algorithms and Neuroevolution Methods

This section introduces the core concepts of evolutionary computation and discusses the particulars of neuroevolution-based algorithms, along with the Python libraries that can be used to implement them. You will become familiar with the fundamentals of neuroevolution methods and get practical recommendations on how to start your experiments. This section also provides a basic introduction to the Anaconda package manager for Python as part of your environment setup.

This section comprises the following chapters:
  • Chapter 1, Overview of Neuroevolution Methods
  • Chapter 2, Python Libraries and Environment Setup

Overview of Neuroevolution Methods

The concept of artificial neural networks (ANNs) was inspired by the structure of the human brain. There was a strong belief that, if we could imitate this intricate structure closely enough, we would be able to create artificial intelligence. We are still on the road to achieving this: although we can implement narrow AI agents, we are still far from creating a general AI agent.
This chapter introduces the concept of ANNs and the two methods we can use to train them (gradient descent with error backpropagation, and neuroevolution) so that they learn to approximate an objective function. However, we will mainly focus on the neuroevolution-based family of algorithms. You will learn about the implementation of an evolutionary process inspired by natural evolution and become familiar with the most popular neuroevolution algorithms: NEAT, HyperNEAT, and ES-HyperNEAT. We will also discuss the optimization methods that we can use to search for final solutions and compare objective-based search with the Novelty Search algorithm. By the end of this chapter, you will have a complete understanding of the internals of neuroevolution algorithms and be ready to apply this knowledge in practice.
In this chapter, we will cover the following topics:
  • Evolutionary algorithms and neuroevolution-based methods
  • NEAT algorithm overview
  • Hypercube-based NEAT
  • Evolvable-Substrate HyperNEAT
  • Novelty Search optimization method

Evolutionary algorithms and neuroevolution-based methods

The term artificial neural network refers to a graph of nodes connected by links, where each link has a particular weight. A neural node defines a kind of threshold operator that allows a signal to pass only after a specific activation function has been applied. This remotely resembles the way neurons in the brain are organized. Typically, the ANN training process consists of selecting appropriate weight values for all the links within the network. With suitable weights, an ANN can approximate any function and can therefore be considered a universal approximator, as established by the Universal Approximation Theorem.
For more information on the proof of the Universal Approximation Theorem, take a look at the following papers:
  • Cybenko, G. (1989). Approximation by Superpositions of a Sigmoidal Function. Mathematics of Control, Signals, and Systems, 2(4), 303–314.
  • Leshno, M., Lin, V. Ya., Pinkus, A., & Schocken, S. (1993). Multilayer Feedforward Networks with a Nonpolynomial Activation Function Can Approximate Any Function. Neural Networks, 6(6), 861–867. doi:10.1016/S0893-6080(05)80131-5 (https://www.sciencedirect.com/science/article/abs/pii/S0893608005801315?via%3Dihub)
  • Hornik, K. (1991). Approximation Capabilities of Multilayer Feedforward Networks. Neural Networks, 4(2), 251–257. doi:10.1016/0893-6080(91)90009-T (https://www.sciencedirect.com/science/article/abs/pii/089360809190009T?via%3Dihub)
  • Hanin, B. (2018). Approximating Continuous Functions by ReLU Nets of Minimal Width. arXiv preprint arXiv:1710.11278. (https://arxiv.org/abs/1710.11278)
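To make the node-and-link view above concrete, here is a minimal sketch of a single forward pass through a tiny 2-2-1 feedforward network. The weight values are chosen by hand purely for illustration; they are not taken from the book.

```python
import math

def sigmoid(x):
    """Sigmoidal activation: squashes the weighted input sum into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, w_hidden, w_output):
    """One forward pass: each weight encodes one link, and a node emits
    its activation of the weighted sum of its incoming signals."""
    hidden = [sigmoid(sum(w * x for w, x in zip(row, inputs)))
              for row in w_hidden]
    return sigmoid(sum(w * h for w, h in zip(w_output, hidden)))

# Hypothetical weights for a 2-input, 2-hidden, 1-output network.
y = forward([1.0, 0.0],
            w_hidden=[[0.5, -0.4], [0.3, 0.8]],
            w_output=[1.2, -0.7])
print(0.0 < y < 1.0)  # the sigmoid output always lies in (0, 1)
```

Training, whether by backpropagation or by neuroevolution, amounts to searching for the weight values that make this mapping approximate the target function.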
Over the past 70 years, many ANN training methods have been proposed. However, the most popular technique, which gained fame in the past decade, was proposed by Geoffrey Hinton. It is based on the backpropagation of prediction error through the network, with various optimization techniques built around gradient descent of the loss function with respect to the connection weights between network nodes. It demonstrates outstanding performance when training deep neural networks for tasks related mainly to pattern recognition. However, despite its inherent power, it has significant drawbacks. One of these is that a vast number of training samples is required to learn anything useful from a specific dataset. Another significant disadvantage is the fixed network architecture, created manually by the experimenter, which results in the inefficient use of computational resources because a significant number of network nodes do not participate in the inference process. Also, backpropagation-based methods have problems with transferring acquired knowledge to other, similar domains.
Alongside backpropagation methods, there are very promising evolutionary algorithms that can address the aforementioned problems. These bio-inspired techniques draw on Darwin's theory of evolution and use natural evolution abstractions to create artificial neural networks. The basic idea behind neuroevolution is to produce ANNs using stochastic, population-based search methods. The evolutionary process makes it possible to evolve optimal neural network architectures that accurately address specific tasks. As a result, compact and energy-efficient networks with moderate computing power requirements can be created. The evolutionary process is executed by applying genetic operators (mutation and crossover) to a population of chromosomes (genetically encoded representations of ANNs/solutions) over many generations. The central belief is that, as in biological systems, subsequent generations will become better suited to withstand the selective pressure expressed by the objective function; that is, they will become better approximators of it.
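The generational loop described above can be sketched as a generic genetic algorithm. This is a minimal illustration only: the function names, selection scheme (truncation), crossover (one-point), and parameter values are my assumptions, not the book's implementation.

```python
import random

def evolve(fitness, genome_len=8, pop_size=20, generations=30,
           mutation_rate=0.1, seed=42):
    """Generic GA sketch: selection, crossover, and mutation applied
    to a population of real-valued chromosomes over many generations.
    `fitness` maps a genome (list of floats) to a score to maximise."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # truncation selection
        children = []
        while len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, genome_len)   # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(genome_len):          # per-gene mutation
                if rng.random() < mutation_rate:
                    child[i] += rng.gauss(0, 0.1)
            children.append(child)
        pop = children
    return max(pop, key=fitness)

# Toy objective: genomes whose values sum close to 4 score highest.
best = evolve(lambda g: -abs(sum(g) - 4.0))
print(round(sum(best), 2))  # should drift toward 4 over the generations
```

Neuroevolution follows the same loop, with chromosomes encoding network weights (and, in NEAT, the topology itself) rather than a plain vector of numbers.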
Next, we will discuss the basic concepts of genetic algorithms; a moderate understanding of them will be needed for the rest of this book.

Genetic operators

Genetic operators are at the very heart of every evolutionary algorithm, and the performance of any neuroevolutionary algorithm depends on them. There are two major genetic operators: mutation and crossover (recombination).
In this chapter, you will learn the basics of genetic algorithms and how they differ from conventional approaches that use error backpropagation-based methods to train ANNs.

Mutation operator

The mutation operator serves the essential role of preserving the genetic diversity of the population during evolution and prevents stalling in local minima when the chromosomes of the organisms in a population become too similar. The mutation operator alters one or more genes in a chromosome, according to a mutation probability defined by the experimenter. By introducing random changes into a solver's chromosome, mutation allows the evolutionary process to explore new areas of the search space of possible solutions and find better and better solutions over the generations.
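As a minimal sketch of this idea, each gene of a real-valued chromosome can be perturbed independently with the experimenter-defined mutation probability. The function name and parameter values here are illustrative assumptions, not the book's code.

```python
import random

def mutate(chromosome, mutation_prob=0.2, sigma=0.5, rng=random):
    """Per-gene mutation: each gene is perturbed by Gaussian noise with
    probability `mutation_prob`; all other genes are copied unchanged."""
    return [g + rng.gauss(0, sigma) if rng.random() < mutation_prob else g
            for g in chromosome]

random.seed(7)
parent = [0.0, 1.0, -0.5, 0.25]
child = mutate(parent)
print(len(child) == len(parent))  # length is preserved; only some genes change
```

Raising `mutation_prob` increases exploration of the search space at the cost of disrupting good solutions more often, which is the diversity/stability trade-off described above.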
The following diagram shows the common types of mutation operators:
Types of mutation operators
The exact type of mutation operator depends on the kind of genetic encoding used by a specific genetic algorithm. Among the various mutation types, we can distinguish the following:
  • Bit inversion: A randomly selected bit is inverted (binary encoding).
  • Order change: Two genes are randomly selected and their positions are swapped (order encoding).
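These two mutation types can be sketched in a few lines; the function names are mine, chosen for illustration, and do not come from the book.

```python
import random

def bit_inversion(genome, rng=random):
    """Flip one randomly chosen bit of a binary-encoded genome."""
    i = rng.randrange(len(genome))
    child = genome[:]
    child[i] ^= 1
    return child

def order_change(genome, rng=random):
    """Swap the positions of two randomly chosen genes (order encoding)."""
    i, j = rng.sample(range(len(genome)), 2)
    child = genome[:]
    child[i], child[j] = child[j], child[i]
    return child

rng = random.Random(3)
print(bit_inversion([0, 1, 1, 0], rng))  # differs from the parent in exactly one bit
print(sorted(order_change([1, 2, 3, 4], rng)) == [1, 2, 3, 4])  # still a permutation
```

Note that order change never alters the multiset of gene values, which is why it suits encodings where a chromosome is a permutation (for example, an ordering of cities in a routing problem).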

Table of Contents

  1. Title Page
  2. Copyright and Credits
  3. Dedication
  4. About Packt
  5. Contributors
  6. Preface
  7. Section 1: Fundamentals of Evolutionary Computation Algorithms and Neuroevolution Methods
  8. Overview of Neuroevolution Methods
  9. Python Libraries and Environment Setup
  10. Section 2: Applying Neuroevolution Methods to Solve Classic Computer Science Problems
  11. Using NEAT for XOR Solver Optimization
  12. Pole-Balancing Experiments
  13. Autonomous Maze Navigation
  14. Novelty Search Optimization Method
  15. Section 3: Advanced Neuroevolution Methods
  16. Hypercube-Based NEAT for Visual Discrimination
  17. ES-HyperNEAT and the Retina Problem
  18. Co-Evolution and the SAFE Method
  19. Deep Neuroevolution
  20. Section 4: Discussion and Concluding Remarks
  21. Best Practices, Tips, and Tricks
  22. Concluding Remarks
  23. Other Books You May Enjoy
Citation Styles for Hands-On Neuroevolution with Python

APA 6 Citation

Omelianenko, I. (2019). Hands-On Neuroevolution with Python (1st ed.). Packt Publishing. Retrieved from https://www.perlego.com/book/1343357/handson-neuroevolution-with-python-build-highperforming-artificial-neural-network-architectures-using-neuroevolutionbased-algorithms-pdf (Original work published 2019)

Chicago Citation

Omelianenko, Iaroslav. (2019) 2019. Hands-On Neuroevolution with Python. 1st ed. Packt Publishing. https://www.perlego.com/book/1343357/handson-neuroevolution-with-python-build-highperforming-artificial-neural-network-architectures-using-neuroevolutionbased-algorithms-pdf.

Harvard Citation

Omelianenko, I. (2019) Hands-On Neuroevolution with Python. 1st edn. Packt Publishing. Available at: https://www.perlego.com/book/1343357/handson-neuroevolution-with-python-build-highperforming-artificial-neural-network-architectures-using-neuroevolutionbased-algorithms-pdf (Accessed: 14 October 2022).

MLA 7 Citation

Omelianenko, Iaroslav. Hands-On Neuroevolution with Python. 1st ed. Packt Publishing, 2019. Web. 14 Oct. 2022.