Machine Learning for Beginners (ISBN 9789389845426)

Get familiar with various supervised, unsupervised, and reinforcement learning algorithms. This book covers important…


English · 262 pages · 2020


Table of contents :
Cover Page
Title Page
Copyright Page
Dedication Page
About the Author
About the Reviewer
Acknowledgements
Preface
Errata
Table of Contents
1. An Introduction to Machine Learning
Structure
Objective
Conventional algorithm and machine learning
Types of learning
Supervised machine learning
Unsupervised learning
Working
Data
Train test validation data
Rest of the steps
Applications
Natural Language Processing (NLP)
Weather forecasting
Robot control
Speech recognition
Business Intelligence
History
Conclusion
Exercises
Multiple Choice Questions
Theory
Explore
2. The Beginning: Pre-Processing and Feature Selection
Introduction
Structure
Objective
Dealing with missing values and ‘NaN’
Converting a continuous variable to categorical variable
Feature selection
Chi-Squared test
Pearson correlation
Variance threshold
Conclusion
Exercises
Multiple Choice Questions
Programming/Numerical
Theory
3. Regression
Introduction
Structure
Objective
The line of best fit
Gradient descent method
Implementation
Linear regression using SKLearn
Experiments
Experiment 1: Boston Housing Dataset, Linear Regression, 10-Fold Validation
Experiment 2: Boston Housing Dataset, Linear Regression, train-test split
Finding weights without iteration
Regression using K-nearest neighbors
Conclusion
Exercises
Multiple Choice Questions
Theory
Experiments
4. Classification
Introduction
Structure
Objective
Basics
Classification using K-nearest neighbors
Algorithm
Implementation of K-nearest neighbors
The KNeighborsClassifier in SKLearn
Experiments – K-nearest neighbors
Logistic regression
Logistic regression using SKLearn
Experiments – Logistic regression
Naïve Bayes classifier
The GaussianNB Classifier of SKLearn
Implementation of Gaussian Naïve Bayes
Conclusion
Exercises
Multiple Choice Questions
Theory
Numerical/Programs
5. Neural Network I – The Perceptron
Introduction
Structure
Objective
The brain
The neuron
The McCulloch Pitts model
Limitations of the McCulloch Pitts
The Rosenblatt perceptron model
Algorithm
Activation functions
Unit step
sgn
Sigmoid
Derivative
tan-hyperbolic
Implementation
Learning
Perceptron using sklearn
Experiments
Experiment 1: Classification of Fisher Iris Data
Experiment 2: Classification of Fisher Iris Data, train-test split
Experiment 3: Classification of Breast Cancer Data
Experiment 4: Classification of Breast Cancer Data, 10 Fold Validation
Conclusion
Exercises
Multiple Choice Questions
Theory
Programming/Experiments
6. Neural Network II – The Multi-Layer Perceptron
Introduction
Structure
Objective
History
Introduction to multi-layer perceptrons
Architecture
Backpropagation algorithm
Learning
Implementation
Multilayer perceptron using sklearn
Experiments
Conclusion
Exercises
Multiple Choice Questions
Theory
Practical/Coding
7. Support Vector Machines
Introduction
Structure
Objective
The Maximum Margin Classifier
Maximizing the margins
The non-separable patterns and the cost parameter
The kernel trick
SKLEARN.SVM.SVC
Experiments
Conclusion
Exercises
Multiple Choice Questions
Theory
Experiment
8. Decision Trees
Introduction
Structure
Objective
Basics
Discretization
Coming back
Containing the depth of a tree
Implementation of a decision tree using sklearn
Experiments
Experiment 1 – Iris Dataset, three classes
Experiment 2 – Breast Cancer dataset, two classes
Conclusion
Exercises
Multiple Choice Questions
Theory
Numerical/Programming
9. Clustering
Introduction
Structure
Objective
K-means
Algorithm: K Means
Spectral clustering
Algorithm – Spectral clustering
Hierarchical clustering
Implementation
K-means
Experiment 1
Experiment 2
Experiment 3
Spectral clustering
Experiment 4
Experiment 5
Experiment 6
Agglomerative clustering
Experiment 7
Experiment 8
Experiment 9
DBSCAN
Conclusion
Exercises
Multiple Choice Questions
Theory
Numerical
Programming
10. Feature Extraction
Introduction
Structure
Objective
Fourier Transform
Patches
sklearn.feature_extraction.image.extract_patches_2d
Histogram of oriented gradients
Principal component analysis
Conclusion
Exercises
Multiple Choice Questions
Theory
Programming
Appendix 1. Cheat Sheet – Pandas
Creating a Pandas series
Using a List
Using NumPy Array
Using Dictionary
Indexing
Slicing
Common methods
Boolean index
DataFrame
Creation
Adding a Column in a Data Frame
Deleting column
Addition of Rows
Deletion of Rows
unique
nunique
Iterating a Pandas Data Frame
Appendix 2. Face Classification
Introduction
Data
Conversion to grayscale:
Methods
Feature extraction
Splitting of data
Feature Selection
Forward Feature Selection
Classifier
Observation and Conclusion
Bibliography
General
Nearest Neighbors
Neural Networks
Support Vector Machines
Decision Trees
Clustering
Fourier Transform
Principal Component Analysis
Histogram of Oriented Gradients
