Hyperparameter Optimization for Machine Learning

Learn grid and random search, Bayesian optimization, multi-fidelity optimization, Optuna, Hyperopt, Scikit-Optimize & more.
File size: 4.48 GB
Total length: 9h 56m

Category: IT & Software
Instructor: Soledad Galli
Language: English
Last update: 2/2023
Rating: 4.6/5

What you’ll learn

Hyperparameter tuning and why it matters
Cross-validation and nested cross-validation
Hyperparameter tuning with Grid and Random search
Bayesian Optimization
Tree-Structured Parzen Estimators, Population Based Training and SMAC
Hyperparameter tuning tools, e.g., Hyperopt, Optuna, Scikit-optimize, Keras Tuner and others

Requirements

Python programming, including knowledge of NumPy, Pandas and Scikit-learn
Familiarity with basic machine learning algorithms, e.g., regression, support vector machines and nearest neighbours
Familiarity with decision tree algorithms and Random Forests
Familiarity with gradient boosting machines, e.g., XGBoost and LightGBM
Understanding of machine learning model evaluation metrics
Familiarity with neural networks

Description

Welcome to Hyperparameter Optimization for Machine Learning. In this course, you will learn multiple techniques to select the best hyperparameters and improve the performance of your machine learning models.

If you regularly train machine learning models, as a hobby or for your organization, and want to improve their performance; if you are keen to climb the leaderboard of a data science competition; or if you simply want to learn more about how to tune the hyperparameters of machine learning models, this course will show you how.

We’ll take you step by step through engaging video tutorials and teach you everything you need to know about hyperparameter tuning. Throughout this comprehensive course, we cover almost every available approach to optimize hyperparameters, discussing their rationale, their advantages and shortcomings, the considerations to keep in mind when using each technique, and their implementation in Python.

Specifically, you will learn:

What hyperparameters are and why tuning them matters
The use of cross-validation and nested cross-validation for optimization
Grid search and Random search for hyperparameters
Bayesian Optimization
Tree-structured Parzen estimators
SMAC, Population Based Optimization and other SMBO algorithms
How to implement these techniques with available open-source packages, including Hyperopt, Optuna, Scikit-optimize, Keras Tuner and others

By the end of the course, you will be able to decide which approach you would like to follow and carry it out with available open-source libraries.

This comprehensive machine learning course includes over 50 lectures spanning about 8 hours of video, and ALL topics include hands-on Python code examples which you can use for reference, for practice, and to re-use in your own projects.

So what are you waiting for? Enroll today, learn how to tune the hyperparameters of your models, and build better machine learning models.

Overview

Section 1: Introduction

Lecture 1 Introduction

Lecture 2 Course curriculum

Lecture 3 Course aim and knowledge requirements

Lecture 4 Course material

Lecture 5 Jupyter notebooks

Lecture 6 Presentations

Lecture 7 Datasets

Lecture 8 Set up your computer – required packages

Lecture 9 FAQ

Section 2: Hyperparameter Tuning – Overview

Lecture 10 Parameters and Hyperparameters

Lecture 11 Hyperparameter Optimization
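
To make the distinction concrete before the deep dive, here is a minimal sketch of parameters versus hyperparameters with scikit-learn (illustrative only; the estimator, dataset and values are my assumptions, not the course's notebooks):

from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression

X, y = load_breast_cancer(return_X_y=True)

# Hyperparameters: set by us BEFORE training; this is what gets tuned.
model = LogisticRegression(C=0.5, max_iter=5000)

# Parameters: learned from the data DURING training.
model.fit(X, y)
print(model.coef_, model.intercept_)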

Section 3: Performance metrics

Lecture 12 Performance Metrics – Introduction

Lecture 13 Classification Metrics (Optional)

Lecture 14 Regression Metrics (Optional)

Lecture 15 Scikit-learn metrics

Lecture 16 Creating your own metrics

Lecture 17 Using Scikit-learn metrics
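
A sketch of wrapping a custom metric so it can drive a hyperparameter search, using scikit-learn's make_scorer (the metric choice and dataset here are illustrative assumptions):

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import fbeta_score, make_scorer
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

# Wrap a custom metric (here F2, which weights recall higher) as a scorer.
f2_scorer = make_scorer(fbeta_score, beta=2)

scores = cross_val_score(RandomForestClassifier(n_estimators=50, random_state=0),
                         X, y, scoring=f2_scorer, cv=3)
print(scores.mean())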

Section 4: Cross-Validation

Lecture 18 Cross-Validation

Lecture 19 Bias vs Variance (Optional)

Lecture 20 Cross-Validation schemes

Lecture 21 Estimating the model generalization error with CV – Demo

Lecture 22 Cross-Validation for Hyperparameter Tuning – Demo

Lecture 23 Special Cross-Validation schemes

Lecture 24 Group Cross-Validation – Demo

Lecture 25 Nested Cross-Validation

Lecture 26 Nested Cross-Validation – Demo
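
A minimal sketch of the nested cross-validation pattern this section builds towards, with scikit-learn (the grid and dataset are illustrative assumptions): the inner loop selects hyperparameters, the outer loop estimates generalization error.

from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, KFold, cross_val_score

X, y = load_breast_cancer(return_X_y=True)

# Inner loop: hyperparameter search. Outer loop: generalization estimate.
inner_cv = KFold(n_splits=3, shuffle=True, random_state=0)
outer_cv = KFold(n_splits=3, shuffle=True, random_state=1)

search = GridSearchCV(LogisticRegression(max_iter=5000),
                      param_grid={"C": [0.01, 0.1, 1, 10]},
                      cv=inner_cv)

# Each outer fold re-runs the whole inner search, so the final score
# is not biased by the hyperparameter selection.
nested_scores = cross_val_score(search, X, y, cv=outer_cv)
print(nested_scores.mean())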

Section 5: Basic Search Algorithms

Lecture 27 Basic Search Algorithms – Introduction

Lecture 28 Manual Search

Lecture 29 Grid Search

Lecture 30 Grid Search – Demo

Lecture 31 Grid Search with different hyperparameter spaces

Lecture 32 Random Search

Lecture 33 Random Search with Scikit-learn

Lecture 34 Random Search with Scikit-Optimize

Lecture 35 Random Search with Hyperopt

Lecture 36 More examples
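
For reference, a compact sketch of grid versus random search with scikit-learn (the parameter ranges and dataset are illustrative assumptions, not taken from the course):

from scipy.stats import loguniform
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

# Grid search evaluates every combination in a fixed grid.
grid = GridSearchCV(SVC(), {"C": [0.1, 1, 10], "gamma": [0.01, 0.1]}, cv=3)
grid.fit(X, y)

# Random search samples combinations from distributions instead.
rand = RandomizedSearchCV(SVC(),
                          {"C": loguniform(1e-2, 1e2),
                           "gamma": loguniform(1e-4, 1e-1)},
                          n_iter=20, cv=3, random_state=0)
rand.fit(X, y)
print(grid.best_params_, rand.best_params_)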

Section 6: Bayesian Optimization

Lecture 37 Sequential Search

Lecture 38 Bayesian Optimization

Lecture 39 Bayesian Inference – Introduction

Lecture 40 Joint and Conditional Probabilities

Lecture 41 Bayes Rule

Lecture 42 Sequential Model-Based Optimization

Lecture 43 Gaussian Distribution

Lecture 44 Multivariate Gaussian Distribution

Lecture 45 Gaussian Process

Lecture 46 Kernels

Lecture 47 Acquisition Functions

Lecture 48 Additional Reading Resources

Lecture 49 Scikit-Optimize – 1-Dimension

Lecture 50 Scikit-Optimize – Manual Search

Lecture 51 Scikit-Optimize – Automatic Search

Lecture 52 Scikit-Optimize – Alternative Kernel

Lecture 53 Scikit-Optimize – Neural Networks

Lecture 54 Scikit-Optimize – CNN – Search Analysis
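
A toy sketch of sequential model-based optimization with Scikit-Optimize's gp_minimize (the 1-D objective is a stand-in assumption for a real cross-validated model score):

from skopt import gp_minimize
from skopt.space import Real

# Toy 1-D objective standing in for an expensive cross-validated score.
def objective(params):
    x = params[0]
    return (x - 2.0) ** 2

# A Gaussian process models the objective; an acquisition function
# decides which point is most promising to evaluate next.
result = gp_minimize(objective,
                     dimensions=[Real(-5.0, 5.0, name="x")],
                     n_calls=20, random_state=0)
print(result.x, result.fun)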

Section 7: Other SMBO Algorithms

Lecture 55 Other SMBO Algorithms

Lecture 56 SMAC

Lecture 57 SMAC Demo

Lecture 58 Tree-structured Parzen Estimators – TPE

Lecture 59 TPE Procedure

Lecture 60 TPE hyperparameters

Lecture 61 TPE – why tree-structured?

Lecture 62 TPE with Hyperopt

Lecture 63 Discussion: Bayesian Optimization and Basic Search
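
A minimal sketch of TPE with Hyperopt (the space and toy objective are illustrative assumptions; in practice the objective trains a model and returns a cross-validated loss):

from hyperopt import fmin, tpe, hp, Trials

# The search space: Hyperopt samples candidates from these distributions.
space = {
    "max_depth": hp.quniform("max_depth", 2, 10, 1),
    "learning_rate": hp.loguniform("learning_rate", -5, 0),
}

# Toy loss; in practice, train a model and return a cross-validated loss.
def objective(params):
    return (params["max_depth"] - 5) ** 2 + params["learning_rate"]

trials = Trials()
best = fmin(fn=objective, space=space, algo=tpe.suggest,
            max_evals=30, trials=trials)
print(best)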

Section 8: Multi-fidelity Optimization

Lecture 64 Multi-fidelity Optimization

Lecture 65 Successive Halving

Lecture 66 Hyperband

Lecture 67 BOHB
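
A sketch of successive halving using scikit-learn's experimental HalvingRandomSearchCV (the model and grid are illustrative assumptions; Hyperband and BOHB build on the same budget-allocation idea):

from sklearn.experimental import enable_halving_search_cv  # noqa: F401
from sklearn.model_selection import HalvingRandomSearchCV
from sklearn.ensemble import RandomForestClassifier
from sklearn.datasets import load_breast_cancer

X, y = load_breast_cancer(return_X_y=True)

# Many candidates start with a small budget (here, few training samples);
# only the best third (factor=3) survives to the next, larger round.
search = HalvingRandomSearchCV(
    RandomForestClassifier(random_state=0),
    {"max_depth": [2, 4, 8, None], "min_samples_leaf": [1, 5, 10]},
    factor=3, random_state=0)
search.fit(X, y)
print(search.best_params_)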

Section 9: Scikit-Optimize

Lecture 68 Scikit-Optimize

Lecture 69 Section content

Lecture 70 Hyperparameter Distributions

Lecture 71 Defining the hyperparameter space

Lecture 72 Defining the objective function

Lecture 73 Random search

Lecture 74 Bayesian search with Gaussian processes

Lecture 75 Bayesian search with Random Forests

Lecture 76 Bayesian search with GBMs

Lecture 77 Parallelizing a Bayesian search

Lecture 78 Bayesian search with Scikit-learn wrapper

Lecture 79 Changing the kernel of a Gaussian Process

Lecture 80 Optimizing xgboost

Lecture 81 Optimizing Hyperparameters of a CNN

Lecture 82 Analyzing the CNN search
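
A minimal sketch of Scikit-Optimize's scikit-learn wrapper, BayesSearchCV (the model and ranges are illustrative assumptions, not the course's notebooks):

from skopt import BayesSearchCV
from skopt.space import Integer, Real
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier

X, y = load_breast_cancer(return_X_y=True)

# Drop-in replacement for GridSearchCV that picks each new candidate
# with sequential model-based optimization instead of exhaustively.
search = BayesSearchCV(
    GradientBoostingClassifier(random_state=0),
    {"learning_rate": Real(1e-3, 1.0, prior="log-uniform"),
     "max_depth": Integer(1, 5),
     "n_estimators": Integer(50, 300)},
    n_iter=20, cv=3, random_state=0)
search.fit(X, y)
print(search.best_params_)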

Section 10: Hyperopt

Lecture 83 Hyperopt

Lecture 84 Section content

Lecture 85 Search space configuration and distributions

Lecture 86 Sampling from nested spaces

Lecture 87 Search algorithms

Lecture 88 Evaluating the search

Lecture 89 Optimizing multiple ML models simultaneously

Lecture 90 Optimizing Hyperparameters of a CNN

Lecture 91 References
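
A sketch of a nested (conditional) Hyperopt space of the kind this section covers: hp.choice makes the sampled hyperparameters depend on the chosen branch (the models and values are illustrative assumptions):

from hyperopt import fmin, tpe, hp

# hp.choice builds a tree: which hyperparameters get sampled depends on
# the branch chosen, which is why TPE is called "tree-structured".
space = hp.choice("model", [
    {"type": "rf", "max_depth": hp.quniform("rf_max_depth", 2, 10, 1)},
    {"type": "svm", "C": hp.loguniform("svm_C", -3, 3)},
])

# Toy loss; in practice, build the chosen model and return a CV loss.
def objective(params):
    return 0.5 if params["type"] == "rf" else 0.6

best = fmin(fn=objective, space=space, algo=tpe.suggest, max_evals=20)
print(best)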

Section 11: Optuna

Lecture 92 Optuna

Lecture 93 Optuna main functions

Lecture 94 Section content

Lecture 95 Search algorithms

Lecture 96 Optimizing multiple ML models simultaneously

Lecture 97 Optimizing hyperparameters of a CNN

Lecture 98 Optimizing a CNN – extended

Lecture 99 Evaluating the search with Optuna’s built-in functions

Lecture 100 References

Lecture 101 More examples
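
A minimal sketch of Optuna's define-by-run style (the model names and ranges are illustrative assumptions; a real objective would return a cross-validated score):

import optuna

# Optuna defines the space inside the objective ("define-by-run"),
# so conditional hyperparameters need no special syntax.
def objective(trial):
    model = trial.suggest_categorical("model", ["rf", "gbm"])
    if model == "rf":
        depth = trial.suggest_int("rf_max_depth", 2, 10)
        return (depth - 5) ** 2  # toy loss; use a CV score in practice
    lr = trial.suggest_float("gbm_lr", 1e-4, 1.0, log=True)
    return lr

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=30)
print(study.best_params)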

Section 12: Moving Forward

Lecture 102 Congratulations

Lecture 103 Bonus Lecture

Who this course is for:

Students who want to know more about hyperparameter optimization algorithms
Students who want to understand advanced techniques for hyperparameter optimization
Students who want to learn to use multiple open source libraries for hyperparameter tuning
Students interested in building better performing machine learning models
Students interested in participating in data science competitions
Students seeking to expand their breadth of knowledge on machine learning

Course Information:

Udemy | English | 9h 56m | 4.48 GB
Created by: Soledad Galli
