Hands-On Gradient Boosting with XGBoost and scikit-learn

Perform accessible machine learning and extreme gradient boosting with Python

Corey Wade

  1. 310 pages
  2. English
  3. ePUB (mobile friendly)
  4. Available on iOS and Android

About this book

Get to grips with building robust XGBoost models using Python and scikit-learn for deployment

Key Features

  • Get up and running with machine learning and understand how to boost models with XGBoost in no time
  • Build real-world machine learning pipelines and fine-tune hyperparameters to achieve optimal results
  • Discover tips and tricks and gain innovative insights from XGBoost Kaggle winners

Book Description

XGBoost is an industry-proven, open-source software library that provides a gradient boosting framework for scaling billions of data points quickly and efficiently.

The book introduces machine learning and XGBoost in scikit-learn before building up to the theory behind gradient boosting. You'll cover decision trees and analyze bagging in the machine learning context, learning hyperparameters that extend to XGBoost along the way. You'll build gradient boosting models from scratch and extend gradient boosting to big data while recognizing speed limitations using timers. Details in XGBoost are explored with a focus on speed enhancements and deriving parameters mathematically. With the help of detailed case studies, you'll practice building and fine-tuning XGBoost classifiers and regressors using scikit-learn and the original Python API. You'll leverage XGBoost hyperparameters to improve scores, correct missing values, scale imbalanced datasets, and fine-tune alternative base learners. Finally, you'll apply advanced XGBoost techniques like building non-correlated ensembles, stacking models, and preparing models for industry deployment using sparse matrices, customized transformers, and pipelines.

By the end of the book, you'll be able to build high-performing machine learning models using XGBoost with minimal errors and maximum speed.

What you will learn

  • Build gradient boosting models from scratch
  • Develop XGBoost regressors and classifiers with accuracy and speed
  • Analyze variance and bias in terms of fine-tuning XGBoost hyperparameters
  • Automatically correct missing values and scale imbalanced data
  • Apply alternative base learners like dart, linear models, and XGBoost random forests
  • Customize transformers and pipelines to deploy XGBoost models
  • Build non-correlated ensembles and stack XGBoost models to increase accuracy

Who this book is for

This book is for data science professionals and enthusiasts, data analysts, and developers who want to build fast and accurate machine learning models that scale with big data. Proficiency in Python, along with a basic understanding of linear algebra, will help you to get the most out of this book.


Information

Year
2020
ISBN
9781839213809
Edition
1
Subtopic
Neural Networks

Section 1: Bagging and Boosting

The book opens with an XGBoost model built using scikit-learn defaults, after preprocessing data with pandas and building standard regression and classification models. The practical theory behind XGBoost is then developed by advancing through decision trees (XGBoost base learners), random forests (bagging), and gradient boosting, comparing scores and fine-tuning ensemble and tree-based hyperparameters along the way.
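As a preview of that opening model, a minimal sketch of the scikit-learn API with default hyperparameters might look like the following; the bundled scikit-learn dataset is an illustrative stand-in, not the book's data:

# A minimal sketch: XGBoost with scikit-learn defaults.
# The breast cancer dataset is illustrative, not the book's example.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=2)

model = XGBClassifier()                 # default hyperparameters
model.fit(X_train, y_train)
print(model.score(X_test, y_test))      # accuracy on the held-out test set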
This section comprises the following chapters:
  • Chapter 1, Machine Learning Landscape
  • Chapter 2, Decision Trees in Depth
  • Chapter 3, Bagging with Random Forests
  • Chapter 4, From Gradient Boosting to XGBoost

Chapter 1: Machine Learning Landscape

Welcome to Hands-On Gradient Boosting with XGBoost and scikit-learn, a book that will teach you the foundations, tips, and tricks of XGBoost, the best machine learning algorithm for making predictions from tabular data.
The focus of this book is XGBoost, also known as Extreme Gradient Boosting. The structure, function, and raw power of XGBoost will be fleshed out in increasing detail in each chapter. The chapters unfold to tell an incredible story: the story of XGBoost. By the end of this book, you will be an expert in leveraging XGBoost to make predictions from real data.
In the first chapter, XGBoost is presented in a sneak preview. It makes a guest appearance in the larger context of machine learning regression and classification to set the stage for what's to come.
This chapter focuses on preparing data for machine learning, a process also known as data wrangling. You will learn to use efficient Python code to load data, describe data, handle null values, transform data into numerical columns, split data into training and test sets, build machine learning models, and implement cross-validation, as well as to compare linear regression and logistic regression models with XGBoost.
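A hedged sketch of that end-to-end workflow follows; the CSV filename and the 'target' column are placeholders, not the book's data:

# End-to-end sketch of the chapter's workflow on a hypothetical CSV file.
# 'data.csv' and the 'target' column are placeholders.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from xgboost import XGBRegressor

df = pd.read_csv('data.csv')                   # load data
df = df.fillna(df.median(numeric_only=True))   # handle null values
df = pd.get_dummies(df)                        # transform categories into numbers

X = df.drop('target', axis=1)                  # features
y = df['target']                               # regression target

# Compare a linear baseline with XGBoost using 5-fold cross-validation.
for model in (LinearRegression(), XGBRegressor()):
    scores = cross_val_score(model, X, y, cv=5)
    print(type(model).__name__, scores.mean())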
The concepts and libraries presented in this chapter are used throughout the book.
This chapter consists of the following topics:
  • Previewing XGBoost
  • Wrangling data
  • Predicting regression
  • Predicting classification

Previewing XGBoost

Machine learning gained recognition with the first neural network in the 1940s, followed by the first machine learning checkers champion in the 1950s. After some quiet decades, the field of machine learning took off when Deep Blue famously beat world chess champion Garry Kasparov in the 1990s. With a surge in computational power, the 1990s and early 2000s produced a plethora of academic papers revealing new machine learning algorithms such as random forests and AdaBoost.
The general idea behind boosting is to transform weak learners into strong learners by iteratively improving upon errors. The key idea behind gradient boosting is to use gradient descent to minimize the errors of the residuals. This evolutionary strand, from standard machine learning algorithms to gradient boosting, is the focus of the first four chapters of this book.
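To make the residual-fitting idea concrete, here is a minimal from-scratch sketch; squared-error loss, a fixed learning rate, and shallow scikit-learn trees are illustrative choices, not the book's exact code:

# Minimal gradient boosting sketch: each new tree fits the residuals
# (the negative gradient of squared-error loss) of the running prediction.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def boost(X, y, n_trees=50, lr=0.1, max_depth=2):
    base = y.mean()                      # start from the mean prediction
    pred = np.full(len(y), base)
    trees = []
    for _ in range(n_trees):
        residuals = y - pred             # errors of the current ensemble
        tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, residuals)
        pred = pred + lr * tree.predict(X)   # shrink each correction
        trees.append(tree)
    return base, trees

def boost_predict(base, trees, X, lr=0.1):
    return base + lr * sum(tree.predict(X) for tree in trees)

Each weak learner corrects only a fraction of the remaining error, which is why the ensemble as a whole becomes a strong learner.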
XGBoost is short for Extreme Gradient Boosting. The Extreme part refers to pushing the limits of computation to achieve gains in accuracy and speed. XGBoost's surging popularity is largely due to its unparalleled success in Kaggle competitions. In Kaggle competitions, competitors build machine learning models in attempts to make the best predictions and win lucrative cash prizes. In comparison to other models, XGBoost has been crushing the competition.
Understanding the details of XGBoost requires understanding the landscape of machine learning within the context of gradient boosting. In order to paint a full picture, we start at the beginning, with the basics of machine learning.

What is machine learning?

Machine learning is the ability of computers to learn from data. In 2020, machine learning predicts human behavior, recommends products, identifies faces, outperforms poker professionals, discovers exoplanets, identifies diseases, operates self-driving cars, personalizes the internet, and communicates directly with humans. Machine learning is leading the artificial intelligence revolution and affecting the bottom line of nearly every major corporation.
In practice, machine learning means implementing computer algorithms whose weights are adjusted when new data comes in. Machine learning algorithms learn from datasets to make predictions about species classification, the stock market, company profits, human decisions, subatomic particles, optimal traffic routes, and more.
Machine learning is the best tool at our disposal for transforming big data into accurate, actionable predictions. Machine learning, however, does not occur in a vacuum. Machine learning requires rows and columns of data.

Data wrangling

Data wrangling is a comprehensive term that encompasses the various stages of data preprocessing before machine learning can begin. Data loading, data cleaning, data analysis, and data manipulation are all included within the sphere of data wrangling.
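In pandas, the first wrangling pass usually amounts to a few one-liners; the filename below is a placeholder:

# First-pass data wrangling with pandas; 'data.csv' is a placeholder.
import pandas as pd

df = pd.read_csv('data.csv')   # data loading
print(df.head())               # inspect the first rows
df.info()                      # column types and non-null counts
print(df.isna().sum())         # null values per column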

Table of Contents

  1. Hands-On Gradient Boosting with XGBoost and scikit-learn
  2. Why subscribe?
  3. Preface
  4. Section 1: Bagging and Boosting
  5. Chapter 1: Machine Learning Landscape
  6. Chapter 2: Decision Trees in Depth
  7. Chapter 3: Bagging with Random Forests
  8. Chapter 4: From Gradient Boosting to XGBoost
  9. Section 2: XGBoost
  10. Chapter 5: XGBoost Unveiled
  11. Chapter 6: XGBoost Hyperparameters
  12. Chapter 7: Discovering Exoplanets with XGBoost
  13. Section 3: Advanced XGBoost
  14. Chapter 8: XGBoost Alternative Base Learners
  15. Chapter 9: XGBoost Kaggle Masters
  16. Chapter 10: XGBoost Model Deployment
  17. Other Books You May Enjoy