Hands-On Gradient Boosting with XGBoost and scikit-learn
eBook - ePub

Perform accessible machine learning and extreme gradient boosting with Python

Corey Wade

  1. 310 pages
  2. English
  3. ePUB (mobile friendly)
  4. Available on iOS & Android

Book Information

Get to grips with building robust XGBoost models using Python and scikit-learn for deployment

Key Features

  • Get up and running with machine learning and understand how to boost models with XGBoost in no time
  • Build real-world machine learning pipelines and fine-tune hyperparameters to achieve optimal results
  • Discover tips and tricks and gain innovative insights from XGBoost Kaggle winners

Book Description

XGBoost is an industry-proven, open-source software library that provides a gradient boosting framework for scaling billions of data points quickly and efficiently.

The book introduces machine learning and XGBoost in scikit-learn before building up to the theory behind gradient boosting. You'll cover decision trees and analyze bagging in the machine learning context, learning hyperparameters that extend to XGBoost along the way. You'll build gradient boosting models from scratch and extend gradient boosting to big data while recognizing speed limitations using timers. XGBoost's internals are then explored in detail, with a focus on speed enhancements and the mathematical derivation of its parameters.

With the help of detailed case studies, you'll practice building and fine-tuning XGBoost classifiers and regressors using scikit-learn and the original Python API. You'll leverage XGBoost hyperparameters to improve scores, correct missing values, scale imbalanced datasets, and fine-tune alternative base learners. Finally, you'll apply advanced XGBoost techniques like building non-correlated ensembles, stacking models, and preparing models for industry deployment using sparse matrices, customized transformers, and pipelines.

By the end of the book, you'll be able to build high-performing machine learning models using XGBoost with minimal errors and maximum speed.

What you will learn

  • Build gradient boosting models from scratch
  • Develop XGBoost regressors and classifiers with accuracy and speed
  • Analyze variance and bias in terms of fine-tuning XGBoost hyperparameters
  • Automatically correct missing values and scale imbalanced data
  • Apply alternative base learners like dart, linear models, and XGBoost random forests
  • Customize transformers and pipelines to deploy XGBoost models
  • Build non-correlated ensembles and stack XGBoost models to increase accuracy

Who this book is for

This book is for data science professionals and enthusiasts, data analysts, and developers who want to build fast and accurate machine learning models that scale with big data. Proficiency in Python, along with a basic understanding of linear algebra, will help you to get the most out of this book.

Information

Year
2020
ISBN
9781839213809
Edition
1
Category
Neural Networks

Section 1: Bagging and Boosting

The book opens with an XGBoost model built using scikit-learn defaults, after preprocessing data with pandas and building standard regression and classification models. The practical theory behind XGBoost is then explored by advancing through decision trees (XGBoost's base learners), random forests (bagging), and gradient boosting, comparing scores and fine-tuning ensemble and tree-based hyperparameters along the way.
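For orientation, here is a minimal sketch of that opening step: an XGBoost classifier with scikit-learn defaults, scored by cross-validation. The breast cancer dataset is a stand-in for the book's own examples, chosen only because it ships with scikit-learn.

    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import cross_val_score
    from xgboost import XGBClassifier

    # Any tabular classification dataset works here; this one ships with
    # scikit-learn, so the sketch runs as-is.
    X, y = load_breast_cancer(return_X_y=True)

    # Default hyperparameters throughout; 5-fold cross-validation for scoring.
    model = XGBClassifier()
    scores = cross_val_score(model, X, y, cv=5)
    print(f'Mean accuracy: {scores.mean():.3f}')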
This section comprises the following chapters:
  • Chapter 1, Machine Learning Landscape
  • Chapter 2, Decision Trees in Depth
  • Chapter 3, Bagging with Random Forests
  • Chapter 4, From Gradient Boosting to XGBoost

Chapter 1: Machine Learning Landscape

Welcome to Hands-On Gradient Boosting with XGBoost and scikit-learn, a book that will teach you the foundations, tips, and tricks of XGBoost, the best machine learning algorithm for making predictions from tabular data.
The focus of this book is XGBoost, also known as Extreme Gradient Boosting. The structure, function, and raw power of XGBoost will be fleshed out in increasing detail in each chapter. The chapters unfold to tell an incredible story: the story of XGBoost. By the end of this book, you will be an expert in leveraging XGBoost to make predictions from real data.
In the first chapter, XGBoost is presented in a sneak preview. It makes a guest appearance in the larger context of machine learning regression and classification to set the stage for what's to come.
This chapter focuses on preparing data for machine learning, a process also known as data wrangling. In addition to building machine learning models, you will learn to use efficient Python code to load data, describe data, handle null values, transform data into numerical columns, split data into training and test sets, build machine learning models, and implement cross-validation, as well as to compare linear regression and logistic regression models with XGBoost.
The concepts and libraries presented in this chapter are used throughout the book.
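As a preview of that workflow, the sketch below splits data and compares a linear model against XGBoost by cross-validation. The diabetes dataset and the random seed are illustrative stand-ins, not necessarily the book's choices.

    import numpy as np
    from sklearn.datasets import load_diabetes
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_score, train_test_split
    from xgboost import XGBRegressor

    X, y = load_diabetes(return_X_y=True)

    # Hold out a test set; cross-validate on the training portion only.
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=2)

    for model in (LinearRegression(), XGBRegressor()):
        # scikit-learn reports negative MSE by convention; flip the sign
        # and take the square root to report RMSE.
        mse = -cross_val_score(model, X_train, y_train, cv=5,
                               scoring='neg_mean_squared_error')
        print(type(model).__name__, 'RMSE:', round(np.sqrt(mse).mean(), 2))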
This chapter consists of the following topics:
  • Previewing XGBoost
  • Wrangling data
  • Predicting regression
  • Predicting classification

Previewing XGBoost

Machine learning gained recognition with the first neural network in the 1940s, followed by the first machine learning checkers champion in the 1950s. After some quiet decades, the field of machine learning took off when Deep Blue famously beat world chess champion Garry Kasparov in the 1990s. With a surge in computational power, the 1990s and early 2000s produced a plethora of academic papers revealing new machine learning algorithms such as random forests and AdaBoost.
The general idea behind boosting is to transform weak learners into strong learners by iteratively improving upon errors. The key idea behind gradient boosting is to use gradient descent to minimize errors: each new learner is fit to the residuals left by the learners before it. This evolutionary strand, from standard machine learning algorithms to gradient boosting, is the focus of the first four chapters of this book.
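To make the residual-fitting idea concrete, here is a from-scratch sketch for squared-error loss; the function names and hyperparameter values are illustrative, not taken from the book.

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    def gradient_boost_fit(X, y, n_trees=50, learning_rate=0.1, max_depth=2):
        # Start from a constant prediction: the mean of the targets.
        pred = np.full(len(y), y.mean())
        base, trees = pred[0], []
        for _ in range(n_trees):
            # For squared error, the residuals are the negative gradient.
            residuals = y - pred
            tree = DecisionTreeRegressor(max_depth=max_depth)
            tree.fit(X, residuals)          # fit the next weak learner to the errors
            pred += learning_rate * tree.predict(X)
            trees.append(tree)
        return base, trees

    def gradient_boost_predict(X, base, trees, learning_rate=0.1):
        # Accumulate the shrunken contribution of every tree.
        pred = np.full(len(X), base)
        for tree in trees:
            pred += learning_rate * tree.predict(X)
        return pred

With the same learning rate, tree depth, and number of trees, predictions from this loop should closely track scikit-learn's GradientBoostingRegressor, since squared-error residuals are exactly the negative gradient that each new tree is fit to.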
XGBoost is short for Extreme Gradient Boosting. The Extreme part refers to pushing the limits of computation to achieve gains in accuracy and speed. XGBoost's surging popularity is largely due to its unparalleled success in Kaggle competitions. In Kaggle competitions, competitors build machine learning models in attempts to make the best predictions and win lucrative cash prizes. In comparison to other models, XGBoost has been crushing the competition.
Understanding the details of XGBoost requires understanding the landscape of machine learning within the context of gradient boosting. In order to paint a full picture, we start at the beginning, with the basics of machine learning.

What is machine learning?

Machine learning is the ability of computers to learn from data. In 2020, machine learning predicts human behavior, recommends products, identifies faces, outperforms poker professionals, discovers exoplanets, identifies diseases, operates self-driving cars, personalizes the internet, and communicates directly with humans. Machine learning is leading the artificial intelligence revolution and affecting the bottom line of nearly every major corporation.
In practice, machine learning means implementing computer algorithms whose weights are adjusted when new data comes in. Machine learning algorithms learn from datasets to make predictions about species classification, the stock market, company profits, human decisions, subatomic particles, optimal traffic routes, and more.
Machine learning is the best tool at our disposal for transforming big data into accurate, actionable predictions. Machine learning, however, does not occur in a vacuum. Machine learning requires rows and columns of data.

Data wrangling

Data wrangling is a comprehensive term that encompasses the various stages of data preprocessing before machine learning can begin. Data loading, data cleaning, data analysis, and data manipulation are all included within the sphere of data wrangling.
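As a minimal sketch of those wrangling stages with pandas (the file name and columns are hypothetical placeholders for whatever dataset is at hand):

    import pandas as pd

    df = pd.read_csv('data.csv')       # load data (hypothetical file)
    print(df.describe())               # describe data
    print(df.isna().sum())             # count null values per column

    # Fill numeric nulls with the column median, then one-hot encode any
    # remaining categorical columns so every feature is numerical.
    num_cols = df.select_dtypes('number').columns
    df[num_cols] = df[num_cols].fillna(df[num_cols].median())
    df = pd.get_dummies(df)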

Table of Contents

  1. Hands-On Gradient Boosting with XGBoost and scikit-learn
  2. Why subscribe?
  3. Preface
  4. Section 1: Bagging and Boosting
  5. Chapter 1: Machine Learning Landscape
  6. Chapter 2: Decision Trees in Depth
  7. Chapter 3: Bagging with Random Forests
  8. Chapter 4: From Gradient Boosting to XGBoost
  9. Section 2: XGBoost
  10. Chapter 5: XGBoost Unveiled
  11. Chapter 6: XGBoost Hyperparameters
  12. Chapter 7: Discovering Exoplanets with XGBoost
  13. Section 3: Advanced XGBoost
  14. Chapter 8: XGBoost Alternative Base Learners
  15. Chapter 9: XGBoost Kaggle Masters
  16. Chapter 10: XGBoost Model Deployment
  17. Other Books You May Enjoy