## The Supervised Machine Learning Bootcamp

### What you’ll learn

Regression and Classification Algorithms

Using Python and scikit-learn to implement supervised machine learning techniques

K-nearest neighbors for both classification and regression

Naïve Bayes

Ridge and Lasso Regression

Decision Trees

Random Forests

Support Vector Machines

Practical case studies for training, testing, evaluating, and improving model performance

Cross-validation for parameter optimization

Learn to use metrics such as precision, recall, and the F1 score, as well as the confusion matrix, to evaluate true model performance

You will dive into the theoretical foundation behind each algorithm with the aid of intuitive explanations of the formulas and mathematical notions involved

### Requirements

The course is open to everyone who wants to learn data science.

You’ll need to install Anaconda and Jupyter Notebook. We will show you how to do that step by step.

### Description

Why should you consider taking the Supervised Machine Learning course? The supervised machine learning algorithms you will learn here are some of the most powerful data science tools for solving regression and classification tasks. These are invaluable skills for anyone who wants to work as a machine learning engineer or data scientist.

Naïve Bayes, K-nearest neighbors, Support Vector Machines, Decision Trees, Random Forests, and Ridge and Lasso Regression: in this course, you will learn the theory behind all six algorithms, then apply your skills to practical case studies tailored to each one of them, using Python's scikit-learn library.

First, we cover Naïve Bayes, a powerful technique based on Bayesian statistics. Its strong point is that it's great at performing tasks in real time. Some of the most common use cases are filtering spam e-mails, flagging inappropriate comments on social media, and performing sentiment analysis. The course includes a practical example of exactly how that works, so stay tuned!

Next up is K-nearest neighbors, one of the most widely used machine learning algorithms. Why is that? Because of its simplicity when using distance-based metrics to make accurate predictions.

We'll follow up with decision tree algorithms, which will serve as the basis for our next topic: random forests. Random forests are powerful ensemble learners, capable of harnessing the power of multiple decision trees to make accurate predictions.

After that, we'll meet Support Vector Machines: classification and regression models capable of utilizing different kernels to solve a wide variety of problems. In the practical part of this section, we'll build a model for classifying mushrooms as either poisonous or edible. Exciting!

Finally, you'll learn about Ridge and Lasso Regression, regularization algorithms that improve the linear regression mechanism by limiting the weight of individual features and preventing overfitting. We'll go over the differences and similarities, as well as the pros and cons, of both regression techniques.

Each section of this course is organized in a uniform way for an optimal learning experience:

- We start with the fundamental theory for each algorithm. To enhance your understanding of the topic, we'll walk you through a theoretical case and introduce the mathematical formulas behind the algorithm.
- Then, we move on to building a model to solve a practical problem, using Python's famous scikit-learn library.
- We analyze the performance of our models with the aid of metrics such as accuracy, precision, recall, and the F1 score.
- We also study techniques such as grid search and cross-validation to improve the model's performance.

To top it all off, we have a range of complementary exercises and quizzes, so that you can enhance your skill set. Not only that, but we also offer comprehensive course materials to guide you through the course, which you can consult at any time.

The lessons have been created in 365's unique teaching style many of you are familiar with. We aim to deliver complex topics in an easy-to-understand way, focusing on practical application and visual learning. With the power of animations, quiz questions, exercises, and well-crafted course notes, the Supervised Machine Learning course will fulfill all your learning needs.

If you want to take your data science skills to the next level and add in-demand tools to your resume, this course is the perfect choice for you. Click 'Buy this course' to continue your data science journey today!
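As a taste of the evaluation step described above, here is a minimal sketch of how those metrics look in scikit-learn. The labels and predictions below are illustrative stand-ins, not course data:

```python
# Minimal sketch of the evaluation workflow described above, using
# scikit-learn's metrics module. y_true and y_pred are illustrative
# stand-ins for a real model's test labels and predictions.
from sklearn.metrics import (accuracy_score, precision_score,
                             recall_score, f1_score, confusion_matrix)

y_true = [0, 1, 1, 0, 1, 0, 1, 1]   # ground-truth class labels
y_pred = [0, 1, 0, 0, 1, 1, 1, 1]   # a hypothetical model's predictions

print("Accuracy: ", accuracy_score(y_true, y_pred))
print("Precision:", precision_score(y_true, y_pred))
print("Recall:   ", recall_score(y_true, y_pred))
print("F1 score: ", f1_score(y_true, y_pred))
print("Confusion matrix:\n", confusion_matrix(y_true, y_pred))
```

Accuracy alone can be misleading on imbalanced data, which is why the course pairs it with precision, recall, and the confusion matrix.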

### Overview

Section 1: Introduction

Lecture 1 Introduction

Section 2: Setting up the Environment

Lecture 2 Installing Anaconda

Lecture 3 Jupyter Dashboard – Part 1

Lecture 4 Jupyter Dashboard – Part 2

Lecture 5 Installing the relevant packages

Section 3: Naïve Bayes

Lecture 6 Motivation

Lecture 7 Bayes’ Thought Experiment

Lecture 8 Bayes’ Thought Experiment: Assignment

Lecture 9 Bayes’ Theorem

Lecture 10 The Ham-or-Spam Example

Lecture 11 The Ham-or-Spam Example: Assignment

Lecture 12 The YouTube Dataset: Creating the Data Frame

Lecture 13 CountVectorizer

Lecture 14 The YouTube Dataset: Preprocessing

Lecture 15 The YouTube Dataset: Preprocessing: Assignment

Lecture 16 The YouTube Dataset: Classification

Lecture 17 The YouTube Dataset: Classification: Assignment

Lecture 18 The YouTube Dataset: Confusion Matrix

Lecture 19 The YouTube Dataset: Accuracy, Precision, Recall, and the F1 score

Lecture 20 The YouTube Dataset: Changing the Priors

Lecture 21 Naïve Bayes: Assignment
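For orientation, here is a minimal sketch of the kind of Naïve Bayes pipeline this section builds, using CountVectorizer and MultinomialNB. The toy comments and labels are illustrative placeholders, not the course's YouTube dataset:

```python
# Illustrative spam-vs-ham sketch: bag-of-words features via
# CountVectorizer, then a multinomial naive Bayes classifier.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

comments = ["check out my channel", "great video, thanks",
            "free subscribers here", "really helpful tutorial"]
labels = [1, 0, 1, 0]  # 1 = spam, 0 = ham (placeholder labels)

vectorizer = CountVectorizer()            # token counts per comment
X = vectorizer.fit_transform(comments)    # sparse document-term matrix

clf = MultinomialNB()                     # multinomial naive Bayes
clf.fit(X, labels)

new = vectorizer.transform(["thanks for the free channel"])
print(clf.predict(new))                   # predicted class for the new comment
```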

Section 4: K-Nearest Neighbors

Lecture 22 Motivation

Lecture 23 Math Prerequisites: Distance Metrics

Lecture 24 Random Dataset: Generating the Dataset

Lecture 25 Random Dataset: Visualizing the Dataset

Lecture 26 Random Dataset: Classification

Lecture 27 Random Dataset: How to Break a Tie

Lecture 28 Random Dataset: Decision Regions

Lecture 29 Random Dataset: Choosing the Best K-value

Lecture 30 Random Dataset: Grid Search

Lecture 31 Random Dataset: Model Performance

Lecture 32 KNeighbors Classifier: Assignment

Lecture 33 Theory with a Practical Example

Lecture 34 KNN vs Linear Regression: A Linear Problem

Lecture 35 KNN vs Linear Regression: A Non-linear Problem

Lecture 36 KNeighbors Regressor: Assignment

Lecture 37 Pros and Cons
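A minimal sketch of the workflow this section covers: generating a random dataset, grid-searching the best k, and scoring the classifier. Here make_classification stands in for the course's generated data, and the parameter values are illustrative:

```python
# KNN sketch: random dataset, cross-validated search for k, test score.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=200, n_features=2,
                           n_informative=2, n_redundant=0,
                           random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Search over odd k values (odd k helps avoid ties in binary problems).
grid = GridSearchCV(KNeighborsClassifier(),
                    {"n_neighbors": [1, 3, 5, 7, 9]}, cv=5)
grid.fit(X_train, y_train)

print("Best k:", grid.best_params_["n_neighbors"])
print("Test accuracy:", grid.score(X_test, y_test))
```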

Section 5: Decision Trees and Random Forests

Lecture 38 What is a Tree in Computer Science?

Lecture 39 The Concept of Decision Trees

Lecture 40 Decision Trees in Machine Learning

Lecture 41 Decision Trees: Pros and Cons

Lecture 42 Practical Example: The Iris Dataset

Lecture 43 Practical Example: Creating a Decision Tree

Lecture 44 Practical Example: Plotting the Tree

Lecture 45 Decision Tree Metrics Intuition: Gini Impurity

Lecture 46 Decision Tree Metrics: Information Gain

Lecture 47 Tree Pruning: Dealing with Overfitting

Lecture 48 Random Forest as Ensemble Learning

Lecture 49 Bootstrapping

Lecture 50 From Bootstrapping to Random Forests

Lecture 51 Random Forest in Code – Glass Dataset

Lecture 52 Census Data and Income – Preprocessing

Lecture 53 Training the Decision Tree

Lecture 54 Training the Random Forest
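A minimal sketch of this section's progression, fitting a single decision tree on the Iris dataset and then a random forest for comparison. The hyperparameter values are illustrative, not the course's:

```python
# Decision tree vs. random forest sketch on the Iris dataset.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

# max_depth limits tree growth, a simple form of pre-pruning.
tree = DecisionTreeClassifier(max_depth=3, random_state=42)
tree.fit(X_train, y_train)
print("Decision tree accuracy:", tree.score(X_test, y_test))

# A random forest averages many trees fit on bootstrapped samples.
forest = RandomForestClassifier(n_estimators=100, random_state=42)
forest.fit(X_train, y_train)
print("Random forest accuracy:", forest.score(X_test, y_test))
```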

Section 6: Support Vector Machines

Lecture 55 Introduction to Support Vector Machines

Lecture 56 Linearly separable classes – hard margin problem

Lecture 57 Non-linearly separable classes – soft margin problem

Lecture 58 Kernels – Intuition

Lecture 59 Intro to the practical case

Lecture 60 Preprocessing the data

Lecture 61 Splitting the data into train and test and rescaling

Lecture 62 Implementing a linear SVM

Lecture 63 Analyzing the results – Confusion Matrix, Precision, and Recall

Lecture 64 Cross-validation

Lecture 65 Choosing the kernels and C values for cross-validation

Lecture 66 Hyperparameter tuning using GridSearchCV

Lecture 67 Support Vector Machines – Assignment
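A minimal sketch of this section's workflow: rescaling, a cross-validated search over kernels and C values, and a test-set score. The breast cancer dataset bundled with scikit-learn stands in here for the course's mushroom dataset, which is not included with the library:

```python
# SVM sketch: rescale features, tune kernel and C with GridSearchCV.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Rescaling matters for SVMs, which are sensitive to feature scale.
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

# Cross-validated search over kernels and the C regularization value.
grid = GridSearchCV(SVC(),
                    {"kernel": ["linear", "rbf"], "C": [0.1, 1, 10]},
                    cv=5)
grid.fit(X_train, y_train)
print("Best parameters:", grid.best_params_)
print("Test accuracy:", grid.score(X_test, y_test))
```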

Section 7: Ridge and Lasso Regression

Lecture 68 Regression Analysis Overview

Lecture 69 Overfitting and Multicollinearity

Lecture 70 Introduction to Regularization

Lecture 71 Ridge Regression Basics

Lecture 72 Ridge Regression Mechanics

Lecture 73 Regularization in More Complicated Scenarios

Lecture 74 Lasso Regression Basics

Lecture 75 Lasso Regression vs Ridge Regression

Lecture 76 The Hitters Dataset: Preprocessing and Preparation

Lecture 77 Exploratory Data Analysis

Lecture 78 Performing Linear Regression

Lecture 79 Cross-validation for Choosing a Tuning Parameter

Lecture 80 Performing Ridge Regression with Cross-validation

Lecture 81 Performing Lasso Regression with Cross-validation

Lecture 82 Comparing the Results

Lecture 83 Replacing the Missing Values in the DataFrame
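A minimal sketch of this section's comparison: ridge and lasso regression with the penalty strength chosen by cross-validation. A synthetic regression dataset stands in for the course's Hitters data, and the alpha grid is illustrative:

```python
# Ridge vs. lasso sketch with cross-validated penalty strength.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LassoCV, RidgeCV
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=200, n_features=20,
                       n_informative=5, noise=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

alphas = np.logspace(-3, 3, 50)          # candidate penalty strengths

ridge = RidgeCV(alphas=alphas).fit(X_train, y_train)
lasso = LassoCV(alphas=alphas, cv=5).fit(X_train, y_train)

print("Ridge alpha:", ridge.alpha_, "R^2:", ridge.score(X_test, y_test))
print("Lasso alpha:", lasso.alpha_, "R^2:", lasso.score(X_test, y_test))
# Lasso can shrink coefficients exactly to zero (feature selection):
print("Lasso zero coefficients:", int(np.sum(lasso.coef_ == 0)))
```

The last line illustrates the key practical difference the section discusses: lasso can zero out coefficients entirely, while ridge only shrinks them.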

### Who this course is for

Aspiring data scientists and machine learning engineers

Data scientists and data analysts looking to up their skill set

Anyone who wants to gain an understanding of the machine learning field and its vast opportunities

#### Course Information:

Udemy | English | 5h 48m | 2.38 GB

Created by: 365 Careers
