Neural Networks with Keras Cookbook
eBook - ePub

Neural Networks with Keras Cookbook

Over 70 recipes leveraging deep learning techniques across image, text, audio, and game bots

V Kishore Ayyadevara

  1. 568 pages
  2. English
  3. ePUB (mobile friendly)
  4. Available on iOS and Android

About this book

Implement neural network architectures by building them from scratch for multiple real-world applications.

Key Features

  • Build multiple neural network architectures, such as CNNs, RNNs, and LSTMs, from scratch in Keras
  • Discover tips and tricks for designing robust neural networks to solve real-world problems
  • Go beyond understanding the working details of neural networks and master the art of fine-tuning them

Book Description

This book will take you from the basics of neural networks to advanced implementations of architectures using a recipe-based approach.

We will learn how neural networks work and how various hyperparameters affect a network's accuracy, along with how to leverage neural networks for structured and unstructured data.

Later, we will learn how to classify and detect objects in images. We will also learn to use transfer learning for multiple applications, including a self-driving car application that uses Convolutional Neural Networks.

We will generate images by leveraging GANs and also by performing image encoding. Additionally, we will perform text analysis using word-vector-based techniques. Later, we will use Recurrent Neural Networks and LSTMs to implement chatbot and machine translation systems.

Finally, you will learn about transcribing images and audio, generating captions, and using Deep Q-learning to build an agent that plays the Space Invaders game.

By the end of this book, you will have developed the skills to choose and customize multiple neural network architectures for various deep learning problems you might encounter.

What you will learn

  • Build multiple advanced neural network architectures from scratch
  • Explore transfer learning to perform object detection and classification
  • Build self-driving car applications using instance and semantic segmentation
  • Understand data encoding for image, text and recommender systems
  • Implement text analysis using sequence-to-sequence learning
  • Leverage a combination of CNN and RNN to perform end-to-end learning
  • Build agents to play games using deep Q-learning

Who this book is for

This book targets beginner- and intermediate-level machine learning practitioners and data scientists who have just started their journey with neural networks. It is for those looking for resources to help them navigate the various neural network architectures; you'll build multiple architectures, with accompanying case studies ordered by the complexity of the problem. A basic understanding of Python programming and familiarity with basic machine learning are all you need to get started with this book.


Information

Year: 2019
ISBN: 9781789342109
Edition: 1
Subtopic: Neural Networks

Text Analysis Using Word Vectors

In the previous chapter, we learned about encoding images, as well as encoding users and movies for recommender systems, where similar items have similar vectors. In this chapter, we will discuss how to encode text data.
You will be learning about the following topics:
  • Building a word vector from scratch in Python
  • Building a word vector using skip-gram and CBOW models
  • Performing vector arithmetic using pre-trained word vectors
  • Creating a document vector
  • Building word vectors using fastText
  • Building word vectors using GloVe
  • Building sentiment classification using word vectors

Introduction

In the traditional approach to solving text-related problems, we would one-hot encode the words. However, if the dataset has thousands of unique words, the resulting one-hot-encoded vectors would have thousands of dimensions, which is likely to result in computational issues. Additionally, similar words will not have similar vectors in this scenario. Word2Vec is an approach that helps us achieve similar vectors for similar words.
To understand how Word2Vec is useful, let's explore the following problem.
Let's say we have two input sentences:
Intuitively, we know that enjoy and like are similar words. However, in traditional text mining, when we one-hot encode the words, our output looks as follows:
Notice that one-hot encoding results in each word being assigned a column. The major issue with such one-hot encoding is that the Euclidean distance between I and enjoy is the same as the Euclidean distance between enjoy and like.
However, intuitively, we know that the distance between enjoy and like should be lower than the distance between I and enjoy, as enjoy and like are similar to each other.
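Since the example sentences and the one-hot table appear as figures in the book, here is a minimal sketch that reproduces the point numerically; the five-word vocabulary is an illustrative assumption. Every pair of distinct one-hot vectors is exactly the same Euclidean distance apart, so enjoy is no closer to like than it is to I:

```python
# Minimal sketch of the distance problem with one-hot encoding.
# The five-word vocabulary below is an illustrative assumption.
import numpy as np

vocab = ['i', 'enjoy', 'like', 'playing', 'tt']
one_hot = {word: np.eye(len(vocab))[idx] for idx, word in enumerate(vocab)}

def euclidean(a, b):
    return np.linalg.norm(a - b)

# Both distances are sqrt(2) ~= 1.414, even though "enjoy" and "like"
# are intuitively much more similar than "I" and "enjoy".
print(euclidean(one_hot['i'], one_hot['enjoy']))     # 1.4142...
print(euclidean(one_hot['enjoy'], one_hot['like']))  # 1.4142...
```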

Building a word vector from scratch in Python

The principle on which we'll build a word vector is that related words will have similar words surrounding them.
For example, the words queen and princess will more frequently have similar words (related to a kingdom) around them. In a way, the context (the surrounding words) of these words would be similar.

Getting ready

Our dataset (of two sentences) looks as follows when we take the surrounding words as input and the remaining (middle) word as output:
Notice that we are using the middle word as output and the remaining words as input. A vectorized form of this input and output looks as follows (recall the way in which we converted a sentence into a vector in the Need for encoding in text analysis section of Chapter 9, Encoding Inputs):
Notice that the vectorized form of input in the first row is {0, 1, 1, 1, 0}, as the input word index is {1, 2, 3}, and the output is {1, 0, 0, 0, 0} as the output word's index is {1}.
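The input/output table itself is shown as a figure in the book. The following sketch shows how such context-to-middle-word training pairs can be vectorized; the two sentences are assumed placeholders, so the word ordering and indices may differ from the book's figure:

```python
# Sketch of building (context words -> middle word) training pairs.
# The two sentences are illustrative placeholders; the book's actual
# example sentences appear in a figure and may differ.
import numpy as np

sentences = [['i', 'enjoy', 'playing', 'tt'],
             ['i', 'like', 'playing', 'tt']]

vocab = sorted({w for s in sentences for w in s})      # 5 unique words
word2idx = {w: i for i, w in enumerate(vocab)}

X, y = [], []
for sent in sentences:
    for pos, target in enumerate(sent):
        context = [w for i, w in enumerate(sent) if i != pos]
        x_vec = np.zeros(len(vocab))
        x_vec[[word2idx[w] for w in context]] = 1      # multi-hot input
        y_vec = np.zeros(len(vocab))
        y_vec[word2idx[target]] = 1                    # one-hot output
        X.append(x_vec)
        y.append(y_vec)

X, y = np.array(X), np.array(y)
print(X.shape, y.shape)                                # (8, 5) (8, 5)
```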
In such a scenario, our hidden layer has three neurons associated with it. Our neural network would look as follows:
The dimensions of each layer are as follows:
  • Input layer (1 x 5): Each row is multiplied by five weights.
  • Hidden layer weights (5 x 3): There are five input weights for each of the three neurons in the hidden layer.
  • Output of the hidden layer (1 x 3): This is the matrix multiplication of the input and the hidden layer weights.
  • Weights from the hidden layer to the output layer (3 x 5): The three hidden units are mapped to five output columns (as there are five unique words).
  • Output layer (1 x 5): This is the matrix multiplication of the output of the hidden layer and the weights from the hidden layer to the output layer.
Note that we will not apply an activation on top of the hidden layer while building a word vector.
The output layer's values are not restricted to a specific range. Hence, we pass them through the softmax function so that we arrive at a probability for each word. Furthermore, we minimize the cross-entropy loss to arrive at the optimal...
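Putting the pieces above together, a minimal Keras sketch of this 5 -> 3 -> 5 network might look as follows. The layer sizes and the linear hidden layer follow the description above; the optimizer, the epoch count, and the omission of bias terms are illustrative choices, not prescribed by the text:

```python
# Minimal Keras sketch of the 5 -> 3 -> 5 network described above:
# a 3-neuron hidden layer with no activation (linear) and a 5-way
# softmax output trained with cross-entropy loss.
from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
# 5 x 3 weights: the hidden layer with no activation (the word-vector layer).
model.add(Dense(3, input_dim=5, activation='linear', use_bias=False))
# 3 x 5 weights: maps the hidden units to a probability over the 5 words.
model.add(Dense(5, activation='softmax', use_bias=False))
model.compile(optimizer='adam', loss='categorical_crossentropy')

# X, y as built in the earlier sketch; the epoch count is arbitrary here.
# model.fit(X, y, epochs=500, verbose=0)

# After training, each row of the 5 x 3 weight matrix of the first layer
# is the 3-dimensional vector for one word:
# word_vectors = model.get_weights()[0]
```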

Table of Contents

  1. Title Page
  2. Copyright and Credits
  3. Dedication
  4. About Packt
  5. Contributors
  6. Preface
  7. Building a Feedforward Neural Network
  8. Building a Deep Feedforward Neural Network
  9. Applications of Deep Feedforward Neural Networks
  10. Building a Deep Convolutional Neural Network
  11. Transfer Learning
  12. Detecting and Localizing Objects in Images
  13. Image Analysis Applications in Self-Driving Cars
  14. Image Generation
  15. Encoding Inputs
  16. Text Analysis Using Word Vectors
  17. Building a Recurrent Neural Network
  18. Applications of a Many-to-One Architecture RNN
  19. Sequence-to-Sequence Learning
  20. End-to-End Learning
  21. Audio Analysis
  22. Reinforcement Learning
  23. Other Books You May Enjoy
Citation standards for Neural Networks with Keras Cookbook

APA 6 Citation

Ayyadevara, K. (2019). Neural Networks with Keras Cookbook (1st ed.). Packt Publishing. Retrieved from https://www.perlego.com/book/921351/neural-networks-with-keras-cookbook-over-70-recipes-leveraging-deep-learning-techniques-across-image-text-audio-and-game-bots-pdf (Original work published 2019)

Chicago Citation

Ayyadevara, Kishore. (2019) 2019. Neural Networks with Keras Cookbook. 1st ed. Packt Publishing. https://www.perlego.com/book/921351/neural-networks-with-keras-cookbook-over-70-recipes-leveraging-deep-learning-techniques-across-image-text-audio-and-game-bots-pdf.

Harvard Citation

Ayyadevara, K. (2019) Neural Networks with Keras Cookbook. 1st edn. Packt Publishing. Available at: https://www.perlego.com/book/921351/neural-networks-with-keras-cookbook-over-70-recipes-leveraging-deep-learning-techniques-across-image-text-audio-and-game-bots-pdf (Accessed: 14 October 2022).

MLA 7 Citation

Ayyadevara, Kishore. Neural Networks with Keras Cookbook. 1st ed. Packt Publishing, 2019. Web. 14 Oct. 2022.