Deep Learning with Keras

Antonio Gulli, Sujit Pal

  1. 318 pages
  2. English
  3. ePUB (mobile friendly)
  4. Available on iOS and Android

About This Book

Get to grips with the basics of Keras to implement fast and efficient deep-learning models.

  • Implement various deep-learning algorithms in Keras and see how deep learning can be used in games
  • See how various deep-learning models and practical use cases can be implemented using Keras
  • A practical, hands-on guide with real-world examples to give you a strong foundation in Keras

Who This Book Is For

If you are a data scientist with experience in machine learning or an AI programmer with some exposure to neural networks, you will find this book a useful entry point to deep learning with Keras. A knowledge of Python is required for this book.

What You Will Learn

  • Optimize step-by-step functions on a large neural network using the backpropagation algorithm
  • Fine-tune a neural network to improve the quality of results
  • Use deep learning for image and audio processing
  • Use Recursive Neural Tensor Networks (RNTNs) to outperform standard word embeddings in special cases
  • Identify problems for which Recurrent Neural Network (RNN) solutions are suitable
  • Explore the process required to implement autoencoders
  • Evolve a deep neural network using reinforcement learning

In Detail

This book starts by introducing you to supervised learning algorithms such as simple linear regression, the classical multilayer perceptron, and more sophisticated deep convolutional networks. You will also explore image processing with recognition of handwritten digit images, classification of images into different categories, and advanced object recognition with related image annotations. An example of identifying salient points for face detection is also provided. Next, you will be introduced to recurrent networks, which are optimized for processing sequence data such as text, audio, or time series. Following that, you will learn about unsupervised learning algorithms such as autoencoders and the very popular Generative Adversarial Networks (GANs). You will also explore non-traditional uses of neural networks, such as style transfer. Finally, you will look at reinforcement learning and its application to AI game playing, another popular direction of research and application of neural networks.

Style and Approach

This book is an easy-to-follow guide full of examples and real-world applications to help you gain an in-depth understanding of Keras. It showcases more than twenty working deep neural networks coded in Python using Keras.


Information

Year: 2017
Publisher: Packt Publishing
ISBN: 9781787129030

Additional Deep Learning Models

So far, most of the discussion has focused on models that do classification. These models are trained using object features and their labels to predict labels for hitherto unseen objects. The models have also had a fairly simple architecture: all the ones we have seen so far are linear pipelines modeled with the Keras sequential API.
In this chapter, we will focus on more complex architectures where the pipelines are not necessarily linear. Keras provides the functional API to deal with these sorts of architectures. We will learn how to define our networks using the functional API in this chapter. Note that the functional API can be used to build linear architectures as well.
Regression networks are the simplest extension of classification networks. The two broad subcategories of supervised machine learning are classification and regression. Instead of predicting a category, a regression network predicts a continuous value. You saw an example of a regression network when we discussed stateless versus stateful RNNs. Many regression problems can be solved using classification models with very little modification. In this chapter we will see an example of such a network, used to predict atmospheric benzene.
Yet another class of models deals with learning the structure of the data from unlabeled data. These are called unsupervised (or, more correctly, self-supervised) models. They are similar to classification models, but the labels are available implicitly within the data. We have already seen examples of this kind of model; for example, the CBOW and skip-gram word2vec models are self-supervised. Autoencoders are another example of this type of model. We will learn about autoencoders and describe an example that builds compact vector representations of sentences.
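To make the idea of implicit labels concrete, here is a minimal sketch of an autoencoder, not taken from the book: the layer sizes (784 and 32) are hypothetical, and the point is simply that the network is trained to reproduce its own input, so the input doubles as the target:
 from keras.models import Sequential
from keras.layers.core import Dense

# A minimal autoencoder: the network learns to reconstruct its own input,
# so no external labels are needed
autoencoder = Sequential([
    Dense(32, activation="relu", input_dim=784),   # encoder: compress to 32 dimensions
    Dense(784, activation="sigmoid"),              # decoder: reconstruct the 784-dim input
])
autoencoder.compile(optimizer="adam", loss="binary_crossentropy")
# autoencoder.fit(X_train, X_train, ...)  # the input is also the target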
We will then look at how to compose the networks we have seen so far into larger computation graphs. These graphs are often built to achieve some custom objective that is not achievable by a sequential model alone, and may have multiple inputs and outputs and connections to external components. We will see an example of composing such a network for question answering.
We then take a detour to look at the Keras backend API, and how we can use this API to build custom components to extend Keras' functionality.
Going back to models for unlabeled data, another class of models that don't require labels is generative models. These models are trained using a set of existing objects and attempt to learn the distribution these objects come from. Once the distribution is learned, we can draw samples from it that look like the original training data. We saw an example of this in the previous chapter, where we trained a character RNN model to generate text similar to Alice in Wonderland. Since that idea has already been covered, we won't revisit that particular aspect of generative models here. However, we will look at how we can leverage the idea of a trained network learning the data distribution to create interesting visual effects, using a VGG-16 network pre-trained on ImageNet data.
To summarize, we will learn the following topics in this chapter:
  • The Keras functional API
  • Regression networks
  • Autoencoders for unsupervised learning
  • Composing complex networks with the functional API
  • Customizing Keras
  • Generative networks
Let's get started.

Keras functional API

The Keras functional API defines each layer as a function and provides operators to compose these functions into a larger computational graph. A function is some sort of transformation with a single input and single output. For example, the function y = f(x) defines a function f with input x and output y. Let us consider the simple sequential model from Keras (for more information refer to: https://keras.io/getting-started/sequential-model-guide/):
from keras.models import Sequential
from keras.layers.core import Dense, Activation

model = Sequential([
    Dense(32, input_dim=784),
    Activation("sigmoid"),
    Dense(10),
    Activation("softmax"),
])

model.compile(loss="categorical_crossentropy", optimizer="adam")
As you can see, the sequential model represents the network as a linear pipeline, or list, of layers. We can also represent the network as the composition of the following nested functions. Here x is the input tensor of shape (None, 784) and y is the output tensor of shape (None, 10), where None refers to the as-yet undetermined batch size:

y = f4(f3(f2(f1(x))))

Where:

f1(x) = Dense(32)(x)
f2(x) = Activation("sigmoid")(x)
f3(x) = Dense(10)(x)
f4(x) = Activation("softmax")(x)
The network can be redefined using the Keras functional API as follows. Notice how the predictions variable is a composition of the same functions we defined in equation form previously:
from keras.layers import Input
from keras.layers.core import Dense, Activation
from keras.models import Model

inputs = Input(shape=(784,))

x = Dense(32)(inputs)
x = Activation("sigmoid")(x)
x = Dense(10)(x)
predictions = Activation("softmax")(x)

model = Model(inputs=inputs, outputs=predictions)

model.compile(loss="categorical_crossentropy", optimizer="adam")
Since a model is a composition of layers that are also functions, a model is also a function. Therefore, you can treat a trained model as just another layer by calling it on an appropriately shaped input tensor. Thus, if you have built a model that does something useful like image classification, you can easily extend it to work with a sequence of images using Keras's TimeDistributed wrapper:
from keras.layers import TimeDistributed
sequence_predictions = TimeDistributed(model)(input_sequences)
The functional API can be used to define any network that can be defined using the sequential API. In addition, the following types of network can only be defined using the functional API:
  • Models with multiple inputs and outputs
  • Models composed of multiple submodels
  • Models that use shared layers
Models with multiple inputs and outputs are defined by composing the inputs and outputs separately, as shown in the preceding example, and then passing in an array of input functions and an array of output functions in the input and output parameters of the Model constructor:
 model = Model(inputs=[input1, input2], outputs=[output1, output2]) 
Models with multiple inputs and outputs also generally consist of multiple subnetworks, the results of whose computations are merged into the final result. The merge function provides multiple ways to merge intermediate results such as vector addition, dot product, and concatenation. We will see examples of merging in our question answering example later in this chapter.
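In the meantime, here is a minimal sketch of the pattern, not taken from the book: the input shapes, layer sizes, and loss functions are hypothetical, and the merge is done with the concatenate layer available in Keras 2. Two subnetworks are merged by concatenation, and the merged representation feeds two separate outputs:
 from keras.layers import Input, Dense, concatenate
from keras.models import Model

# Two hypothetical subnetworks over two separate inputs
input1 = Input(shape=(64,))
input2 = Input(shape=(32,))
x1 = Dense(16, activation="relu")(input1)
x2 = Dense(16, activation="relu")(input2)

# Merge the intermediate results by concatenation
merged = concatenate([x1, x2])

# Two outputs computed from the shared merged representation
output1 = Dense(1, activation="sigmoid")(merged)
output2 = Dense(10, activation="softmax")(merged)

model = Model(inputs=[input1, input2], outputs=[output1, output2])
model.compile(optimizer="adam",
              loss=["binary_crossentropy", "categorical_crossentropy"])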
Another good use for the functional API is models that use shared layers. Shared layers are defined once and referenced in each pipeline where their weights need to be shared.
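As an illustration, the following minimal sketch (with hypothetical shapes and sizes, not from the book) defines one Dense layer instance and applies it to two inputs; because the same layer object is called twice, both pipelines share a single set of weights:
 from keras.layers import Input, Dense
from keras.models import Model

# One Dense layer instance; its weights are reused wherever it is called
shared_dense = Dense(16, activation="relu")

left_input = Input(shape=(784,))
right_input = Input(shape=(784,))

# Calling the same layer object on both inputs shares the weights
left_features = shared_dense(left_input)
right_features = shared_dense(right_input)

model = Model(inputs=[left_input, right_input],
              outputs=[left_features, right_features])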
We will use the functional API almost exclusively in this chapter, so you will see quite a few examples of its use. The Keras website has many more usage examples for the functional API.

Regression networks

The two major techniques of supervised learning are classification and regression. In both cases, the model is trained with data to predict known labels. In the case of classification, these labels are discrete values such as genres of text or image categories. In the case of regression, these labels are continuous values, such as stock prices or human intelligence quotients (IQ).
Most of the examples we have seen show deep learning models being used to perform classification. In this section, we will look at how to perform regression using such a model.
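As a preview, a minimal regression network built with the functional API might look like the following sketch. The input dimension, hidden size, and training data are hypothetical, not from the book; the essential points are the single-unit output with a linear (default) activation and the mean squared error loss:
 from keras.layers import Input, Dense
from keras.models import Model

inputs = Input(shape=(8,))                  # hypothetical feature vector
x = Dense(16, activation="relu")(inputs)
outputs = Dense(1)(x)                       # one linear unit for a continuous target

model = Model(inputs=inputs, outputs=outputs)
model.compile(loss="mse", optimizer="adam")
# model.fit(X_train, y_train, ...)          # y_train holds continuous values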
Recall that classification models have a dense layer with a nonlinear activation at the end, the output dimension of which corresponds to the number of classes the model can predict. Thus, an ImageNet image classification model has a dense (1...

Table of Contents

  1. Title Page
  2. Credits
  3. About the Authors
  4. About the Reviewer
  5. www.PacktPub.com
  6. Customer Feedback
  7. Preface
  8. Neural Networks Foundations
  9. Keras Installation and API
  10. Deep Learning with ConvNets
  11. Generative Adversarial Networks and WaveNet
  12. Word Embeddings
  13. Recurrent Neural Network — RNN
  14. Additional Deep Learning Models
  15. AI Game Playing
  16. Conclusion