Hands-On Ensemble Learning with R
Table of Contents
Hands-On Ensemble Learning with R
Why subscribe?
PacktPub.com
Contributors
About the author
About the reviewer
Packt is Searching for Authors Like You
Preface
Who this book is for
What this book covers
To get the most out of this book
Download the example code files
Download the color images
Conventions used
Get in touch
Reviews
1. Introduction to Ensemble Techniques
Datasets
Hypothyroid
Waveform
German Credit
Iris
Pima Indians Diabetes
US Crime
Overseas visitors
Primary Biliary Cirrhosis
Multishapes
Board Stiffness
Statistical/machine learning models
Logistic regression model
Logistic regression for hypothyroid classification
Neural networks
Neural network for hypothyroid classification
Naïve Bayes classifier
Naïve Bayes for hypothyroid classification
Decision tree
Decision tree for hypothyroid classification
Support vector machines
SVM for hypothyroid classification
The right model dilemma!
An ensemble purview
Complementary statistical tests
Permutation test
Chi-square and McNemar test
ROC test
Summary
2. Bootstrapping
Technical requirements
The jackknife technique
The jackknife method for mean and variance
Pseudovalues method for survival data
Bootstrap – a statistical method
The standard error of correlation coefficient
The parametric bootstrap
Eigenvalues
Rule of thumb
The boot package
Bootstrap and testing hypotheses
Bootstrapping regression models
Bootstrapping survival models*
Bootstrapping time series models*
Summary
3. Bagging
Technical requirements
Classification trees and pruning
Bagging
k-NN classifier
Analyzing waveform data
k-NN bagging
Summary
4. Random Forests
Technical requirements
Random Forests
Variable importance
Proximity plots
Random Forest nuances
Comparisons with bagging
Missing data imputation
Clustering with Random Forest
Summary
5. The Bare Bones Boosting Algorithms
Technical requirements
The general boosting algorithm
Adaptive boosting
Gradient boosting
Building it from scratch
Squared-error loss function
Using the adabag and gbm packages
Variable importance
Comparing bagging, random forests, and boosting
Summary
6. Boosting Refinements
Technical requirements
Why does boosting work?
The gbm package
Boosting for count data
Boosting for survival data
The xgboost package
The h2o package
Summary
7. The General Ensemble Technique
Technical requirements
Why does ensembling work?
Ensembling by voting
Majority voting
Weighted voting
Ensembling by averaging
Simple averaging
Weighted averaging
Stack ensembling
Summary
8. Ensemble Diagnostics
Technical requirements
What is ensemble diagnostics?
Ensemble diversity
Numeric prediction
Class prediction
Pairwise measure
Disagreement measure
Yule's Q-statistic
Correlation coefficient measure
Cohen's statistic
Double-fault measure
Interrater agreement
Entropy measure
Kohavi-Wolpert measure
Disagreement measure for ensemble
Measurement of interrater agreement
Summary
9. Ensembling Regression Models
Technical requirements
Pre-processing the housing data
Visualization and variable reduction
Variable clustering
Regression models
Linear regression model
Neural networks
Regression tree
Prediction for regression models
Bagging and Random Forests
Boosting regression models
Stacking methods for regression models
Summary
10. Ensembling Survival Models
Core concepts of survival analysis
Nonparametric inference
Regression models – parametric and Cox proportional hazards models
Survival tree
Ensemble survival models
Summary
11. Ensembling Time Series Models
Technical requirements
Time series datasets
AirPassengers
co2
uspop
gas
Car Sales
austres
WWWusage
Time series visualization
Core concepts and metrics
Essential time series models
Naïve forecasting
Seasonal, trend, and loess fitting
Exponential smoothing state space model
Auto-regressive Integrated Moving Average (ARIMA) models
Auto-regressive neural networks
Messing it all up
Bagging and time series
Ensemble time series models
Summary
12. What's Next?
A. Bibliography
References
R package references
Index
Hands-On Ensemble Learning with R
Copyright © 2018 Packt Publishing
All rights reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, without the prior written permission of the publisher, except in the case of brief quotations embedded in critical articles or reviews.
Every effort has been made in the preparation of this book to ensure the accuracy of the information presented. However, the information contained in this book is sold without warranty, either express or implied. Neither the author, nor Packt Publishing, nor its dealers and distributors will be held liable for any damages caused or alleged to have been caused directly or indirectly by this book.
Packt Publishing has endeavored to provide trademark information about all of the companies and products mentioned in this book by the appropriate use of capitals. However, Packt Publishing cannot guarantee the accuracy of this information.
Commissioning Editor: Sunith Shetty
Acquisition Editor: Tushar Gupta
Content Development Editor: Aaryaman Singh
Technical Editor: Dinesh Chaudhary
Copy Editors: Safis Editing
Project Coordinator: Manthan Patel
Proofreader: Safis Editing
Indexer: Mariammal Chettiyar
Graphics: Jisha Chirayil
Production Coordinator: Nilesh Mohite
First published: July 2018
Production reference: 1250718
Published by Packt Publishing Ltd.
Livery Place
35 Livery Street
Birmingham B3 2PB, UK.
ISBN 978-1-78862-414-5
www.packtpub.com
On the personal front, I continue to benefit from the support of my family: my daughter, Pranathi; my wife, Chandrika; and my parents, Lakshmi and Narayanachar. What differs from the acknowledgements in my earlier books is that I am now in Chennai, and they support me from Bengaluru. Granting a writer his private time with writing involves a lot of sacrifice. I also thank my managers, K. Sridharan, Anirban Singha, and Madhu Rao, at Ford Motor Company for their support. Anirban had gone t...