## Probability & Statistics: The Foundations of Machine Learning

### What you’ll learn

The essential concepts of statistics and probability

The concepts in the subject most important for Data Science and/or ML

Distributions and their importance

Entropy – the foundation of all Machine Learning

Intro to Bayesian Inference

Applying concepts through code

Exceptional SUPPORT: Questions answered within the day. Try it!

### Requirements

Basic coding knowledge

No maths background needed (beyond basic arithmetic)

A crash course in Python is provided in the course contents

### Description

Everyone wants to excel at machine learning and data science these days, and for good reason: data is the new oil, and everyone should be able to work with it. It is hard to become great in the field, though, because the latest and greatest models seem too complicated. "Seem" complicated, but they are not! With a thorough understanding of probability and statistics, they become much, much easier to work with. And that's not all: probability is useful in almost all areas of computer science (simulation, vision, game development, and AI are only a few of these). A strong foundation in this subject opens up several doors in your career!

That is the objective of this course: to give you the strong foundations needed to excel in all areas of computer science, specifically data science and machine learning. The issue is that most probability and statistics courses are too theory-oriented. They get tangled in the maths without discussing applications, which are always given secondary importance.

In this course, we take a code-oriented approach. We apply all concepts through code. In fact, we skip over the theory that isn't relevant to computer science (though it may be useful to those pursuing the pure sciences). Instead, we focus on the concepts that matter most for data science, machine learning, and other areas of computer science. For instance, many probability courses skip over Bayesian inference. We get to this immensely important concept rather quickly and give it due attention, as it is widely regarded as the future of analysis! This way, you learn the most important concepts in this subject in the shortest time possible, without having to deal with the details of less relevant topics. Once you have developed an intuition for the important material, you can learn the latest and greatest models on your own!
Take a look at the promo for this course (and the contents list below) for the topics you will learn, and watch the preview lectures to get an idea of the interactive style of learning. Remember: the reason you pay for this course is support. I reply within the day; see any of my course reviews for proof of that. So make sure you post any questions you have or any problems you face. I want all my students to finish this course. Let's get through this together.

### Overview

Section 1: Diving in with code

Lecture 1 Code environment setup and Python crash course

Lecture 2 Getting started with code: Feel of data

Lecture 3 Foundations, data types and representing data

Lecture 4 Practical note: one-hot vector encoding
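
As a taste of the code-first approach this lecture promises, one-hot encoding can be sketched in a few lines of plain Python (illustrative only; the lecture's own code may differ and may use a library such as pandas or scikit-learn):

```python
# A minimal sketch of one-hot vector encoding in plain Python.

def one_hot(categories):
    """Map each category in a list to a one-hot vector."""
    labels = sorted(set(categories))          # fixed ordering of distinct labels
    index = {label: i for i, label in enumerate(labels)}
    vectors = []
    for c in categories:
        v = [0] * len(labels)
        v[index[c]] = 1                       # a single 1 marks the category
        vectors.append(v)
    return labels, vectors

labels, vectors = one_hot(["red", "green", "red", "blue"])
print(labels)   # ['blue', 'green', 'red']
print(vectors)  # [[0, 0, 1], [0, 1, 0], [0, 0, 1], [1, 0, 0]]
```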

Lecture 5 Exploring data types in code

Lecture 6 Central tendency, mean, median, mode
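
The three measures of central tendency covered here are all in Python's standard library (a quick sketch; the course may implement them from scratch or via numpy):

```python
# Central tendency with Python's built-in statistics module.
import statistics

data = [2, 3, 3, 5, 7, 10]

mean = statistics.mean(data)      # arithmetic average
median = statistics.median(data)  # middle value of the sorted data
mode = statistics.mode(data)      # most frequent value

print(mean, median, mode)  # 5 4.0 3
```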

Lecture 7 Section Review Tasks

Section 2: Measures of Spread

Lecture 8 Dispersion and spread in data, variance, standard deviation
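
The two measures of spread named in this lecture can be computed by hand in a few lines (an illustrative sketch; the lecture may use numpy instead):

```python
# Dispersion: population variance and standard deviation, by hand.
import math

data = [4, 8, 6, 5, 3, 2, 8, 9, 2, 5]
n = len(data)
mean = sum(data) / n

# Population variance: the average squared deviation from the mean
variance = sum((x - mean) ** 2 for x in data) / n

# Standard deviation: the square root of the variance
std_dev = math.sqrt(variance)

print(round(variance, 2), round(std_dev, 2))  # 5.76 2.4
```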

Lecture 9 Dispersion exploration through code

Lecture 10 Section Review Tasks

Section 3: Applications and Rules for Probability

Lecture 11 Intro to uncertainty, probability intuition

Lecture 12 Simulating coin flips for probability
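
The simulation idea behind this lecture can be sketched as follows: flip a virtual coin many times and let the empirical frequency approximate the probability (illustrative; the lecture's exact code may differ):

```python
# Estimating P(heads) empirically by simulating coin flips.
import random

random.seed(42)  # fixed seed so the run is reproducible

flips = [random.choice(["H", "T"]) for _ in range(10_000)]
p_heads = flips.count("H") / len(flips)

print(p_heads)  # close to 0.5, by the law of large numbers
```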

Lecture 13 Conditional probability – the most important concept in stats

Lecture 14 Applying conditional probability – Bayes rule

Lecture 15 Application of Bayes rule in real world – Spam detection
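
The spam-detection application of Bayes' rule can be shown in miniature. The numbers below are made up purely for illustration; the lecture builds a fuller classifier:

```python
# Bayes' rule applied to spam detection, with illustrative numbers.

p_spam = 0.4             # P(spam): prior probability a message is spam
p_word_given_spam = 0.6  # P("free" appears | spam)
p_word_given_ham = 0.05  # P("free" appears | not spam)

# Law of total probability: overall chance of seeing the word "free"
p_word = p_word_given_spam * p_spam + p_word_given_ham * (1 - p_spam)

# Bayes' rule: P(spam | "free") = P("free" | spam) * P(spam) / P("free")
p_spam_given_word = p_word_given_spam * p_spam / p_word

print(round(p_spam_given_word, 3))  # 0.889
```

Seeing one suspicious word raises the spam probability from the prior of 0.4 to about 0.89.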

Lecture 16 Spam detection – implementation issues

Lecture 17 Section Review Tasks

Section 4: Counting

Lecture 18 Rules for counting (Mostly optional)

Lecture 19 Section Review Tasks

Section 5: Random Variables – Rationale and Applications

Lecture 20 Quantifying events – random variables

Lecture 21 Two random variables – joint probabilities

Lecture 22 Distributions – rationale and importance

Lecture 23 Discrete distributions through code
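
A discrete distribution "through code" can look like this binomial PMF built from first principles (a sketch; the lecture may use scipy.stats instead):

```python
# The binomial PMF from scratch: a discrete distribution in code.
from math import comb

def binomial_pmf(k, n, p):
    """P(exactly k successes in n independent trials, each with probability p)."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

# Probability of exactly 2 heads in 4 fair coin flips
print(binomial_pmf(2, 4, 0.5))  # 0.375
```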

Lecture 24 Continuous distributions – probability densities

Lecture 25 Continuous distributions code

Lecture 26 Case study – sleep analysis, structure and code

Lecture 27 Section Review Tasks

Section 6: Visualization in Intuition Building

Lecture 28 Visualizing joint distributions – the road to ML success

Lecture 29 Dependence and variance of two random variables

Lecture 30 Section Review Tasks

Section 7: Applications to the Real World

Lecture 31 Expected values – decision making through probabilities

Lecture 32 Entropy – The most important application of expected values

Lecture 33 Applying entropy – coding decision trees for machine learning
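
Entropy, the quantity decision trees use to pick splits, is a few lines of code (a sketch of the core idea; the lecture builds full trees on top of it):

```python
# Shannon entropy of a discrete probability distribution.
import math

def entropy(probs):
    """H(p) = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))            # 1.0  (fair coin: maximum uncertainty)
print(entropy([1.0]))                 # 0.0  (certain outcome: no uncertainty)
print(round(entropy([0.9, 0.1]), 3))  # 0.469 (biased coin: low uncertainty)
```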

Lecture 34 Foundations of Bayesian inference

Lecture 35 Bayesian inference code through PyMC3
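
The course implements Bayesian inference with PyMC3; as a dependency-free illustration of the same underlying idea (and not the course's code), here is a grid-approximation sketch for inferring a coin's bias:

```python
# Bayesian inference by grid approximation: infer a coin's bias theta
# after observing 7 heads in 10 flips. PyMC3 automates this kind of
# computation; the sketch below just makes the mechanics visible.

heads, flips = 7, 10

# Discretize theta over a grid with a uniform prior
grid = [i / 100 for i in range(101)]
prior = [1.0] * len(grid)

# Likelihood of the data under each theta: theta^heads * (1-theta)^(flips-heads)
likelihood = [t ** heads * (1 - t) ** (flips - heads) for t in grid]

# Posterior is proportional to prior times likelihood; normalize to sum to 1
unnorm = [p * l for p, l in zip(prior, likelihood)]
total = sum(unnorm)
posterior = [u / total for u in unnorm]

# Posterior mean estimate of theta
theta_hat = sum(t * p for t, p in zip(grid, posterior))
print(round(theta_hat, 3))  # close to 2/3, the mean of the exact Beta(8, 4) posterior
```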

Lecture 36 Section Review Tasks

Section 8: Extra Resources

Lecture 37 Bonus Lecture

### Who this course is for

Beginner ML and data science developers who need a strong foundation

Developers curious about data science and machine learning

People looking to find out why probability is the foundation of all modern machine learning

Developers who want to know how to harness the power of big data

#### Course Information:

Udemy | English | 6h 41m | 2.48 GB

Created by: Dr. Mohammad Nauman
