Apache Kafka Realtime Stream Processing Master Class

Processing Real-time Streams using Apache Kafka and Kafka Streams API – Start as Beginner to Finish as PRO
File Size: 4.33 GB
Total Length: 10h 58m



Prashant Kumar Pandey


Last updated 7/2021




What you’ll learn

Apache Kafka Foundation and Kafka Architecture
Creating Streams using Kafka Producer APIs
Designing, Developing and Testing Real-time Stream Processing Applications using Kafka Streams Library
Kafka Streams Architecture, Streams DSL, Processor API and Exactly Once Processing in Apache Kafka
Auto-generating Java Objects from JSON Schema definition, Serializing, Deserializing and working with JSON messages without Schema Registry.
Auto-generating Java Objects from AVRO Schema definition, Serializing, Deserializing and working with AVRO messages using Confluent Schema Registry.
Unit Testing and Integration Testing your Kafka Streams Application.
Supporting Microservices architecture and implementing Kafka Streams Interactive Query.

Requirements


Programming Knowledge Using Java Programming Language
Familiarity with Java 8 Lambda
A Recent 64-bit Windows/Mac/Linux Machine with 4 GB RAM (8 GB Recommended)


This course does not require any prior knowledge of Apache Kafka. We have taken enough care to explain all the necessary and complex Kafka architecture concepts to help you come up to speed and grasp the content of this course.

About the Course

I created Kafka Streams – Real-time Stream Processing to help you understand stream processing in general and apply that knowledge to Kafka Streams programming. This course is based on my book of the same title, which is already published and available from all major online retailers as an eBook and paperback. My approach to this course is a progressive, common-sense approach to teaching a complex subject: I will help you apply your general ability to perceive, understand, and reason about the concepts as I explain them progressively.

Who should take this Course?

Kafka Streams – Real-time Stream Processing is designed for software engineers willing to develop a stream processing application using the Kafka Streams library. It is also intended for data architects and data engineers who are responsible for designing and building the organization's data-centric infrastructure. A third group is the managers and architects who do not work directly on the Kafka implementation but work with the people who implement Kafka Streams at the ground level.

Kafka Version used in the Course

This course uses the Kafka Streams library available in Apache Kafka 2.x. I have tested all the source code and examples on the Apache Kafka 2.3 open-source distribution. Some examples also make use of the Confluent Community Version of Kafka, which we will use to explain and demonstrate functionality that is only available in the Confluent Platform, such as Schema Registry and Avro Serdes.
Source Code, Development IDE, Build Tool, Logging, and Testing Tools

This course is fully example-driven, and I will be creating many examples in class. The source code files for all the examples are included in your study material. The course makes extensive use of IntelliJ IDEA as the preferred development IDE; however, based on your prior experience, you should be able to work with any other IDE designed for Java application development. We will use Apache Maven as the preferred build tool, though, again, you should be able to use any other build tool designed for Java applications. The course also uses Log4J2 to teach you industry-standard log implementation in your application, and JUnit 5, the latest version of JUnit, for implementing unit test cases.

Examples and Exercises

Working examples and exercises are the most critical tools for converting your knowledge into a skill. I have included many examples in the course, along with objective questions and programming assignments where appropriate. These exercises will help you validate your understanding of the concepts and apply your learning to solve programming problems.


Section 1: Before you Start

Lecture 1 Introduction

Lecture 2 About the Course

Lecture 3 About the Author

Lecture 4 What do you Need for this Course?

Lecture 5 Debugging Problems and Asking Questions

Lecture 6 Frequently Asked Questions

Lecture 7 Source Code and Other Resources

Section 2: Introduction to Real-time Streams

Lecture 8 Emergence of Bigdata – A Quick Recap

Lecture 9 Conception of Event Streams

Lecture 10 Real-time Streaming – Use Cases

Lecture 11 Real-time Streaming Challenges

Lecture 12 Real-time Streaming Design Consideration

Lecture 13 Section Summary

Section 3: Enter the world of Apache Kafka

Lecture 14 What is Apache Kafka?

Lecture 15 Kafka Storage Architecture

Lecture 16 Kafka Cluster Architecture

Lecture 17 Kafka Work Distribution Architecture – Part 1

Lecture 18 Kafka Work Distribution Architecture – Part 2

Lecture 19 Section Summary

Section 4: Creating Real-time Streams

Lecture 20 Streaming into Kafka

Lecture 21 Kafka Producers – Quick Start

Lecture 22 Kafka Producer Internals

Lecture 23 Scaling Kafka Producer

Lecture 24 Advanced Kafka Producers (Exactly Once)

Lecture 25 Advanced Kafka Producer (Implementing Transaction)

Lecture 26 Kafka Producer – Micro Project

Lecture 27 Kafka Producer – Final Note and References

Section 5: Enter the Stream Processing

Lecture 28 Stream Processing in Apache Kafka

Lecture 29 Kafka Consumer – Practical Introduction

Lecture 30 Kafka Consumer – Scalability, Fault tolerance and Missing Features

Lecture 31 Kafka Streams API – Quick Start

Lecture 32 Creating Streams Topology

Lecture 33 Implementing Streams Topology

Lecture 34 Kafka Streams Architecture

Lecture 35 Section Summary and References

Section 6: Foundation for Real Life Implementations

Lecture 36 Introduction to Types and Serialization in Kafka

Lecture 37 JSON Schema to POJO for JSON Serdes

Lecture 38 Creating and Using JSON Serdes

Lecture 39 AVRO Schema to POJO for AVRO Serdes

Lecture 40 Creating and using AVRO schema in Producers

Lecture 41 Creating and using AVRO schema in Kafka Streams

Lecture 42 Section Summary and References

Section 7: States and Stores

Lecture 43 Understanding States and State Stores

Lecture 44 Creating your First State Store

Lecture 45 Caution with States

Lecture 46 State Store Fault Tolerance

Lecture 47 Section Summary and References

Section 8: KTable – An Update Stream

Lecture 48 Introducing KTable

Lecture 49 Creating your First Update Stream – KTable

Lecture 50 Table Caching and Emit Rates

Lecture 51 Introducing GlobalKTable

Section 9: Real-time Aggregates

Lecture 52 Computing Your First Aggregate – Real-time Streaming Word Count

Lecture 53 Streaming Aggregates – Core Concept

Lecture 54 KStream Aggregation using Reduce()

Lecture 55 KStream Aggregation using Aggregate()

Lecture 56 Common Mistakes in Aggregation

Lecture 57 Count on KTable

Lecture 58 KTable Aggregation using Aggregate()

Section 10: Timestamps and Windows

Lecture 59 Timestamps and Timestamp Extractors

Lecture 60 Creating Tumbling Windows

Lecture 61 Stream Time and Grace Period

Lecture 62 Suppressing Intermediate Results

Lecture 63 Creating Hopping Windows

Lecture 64 Creating Session Windows

Section 11: Joining Streams and Tables

Lecture 65 Streaming Joins

Lecture 66 Joining a KStream to another KStream

Lecture 67 Joining a KTable to another KTable

Lecture 68 Joining a KStream to a KTable and GlobalKTable

Lecture 69 Mixing Joins with Aggregates – Computing Top 3

Lecture 70 Mixing Joins with Aggregates – Advert CTR

Section 12: Testing Streams Application

Lecture 71 How to test a Stream Processing Application

Lecture 72 Unit Testing Your Topology

Section 13: Interactive Query and Micro-Service Responses

Lecture 73 Introducing Micro-services Requirement

Lecture 74 Understanding Local Vs Remote State Store

Lecture 75 Implementing Interactive Query Micro-service

Section 14: Appendix

Lecture 76 Setting up Apache Kafka Development Environment

Section 15: Keep Learning

Lecture 77 Final Word

Lecture 78 Bonus Lecture : Get Extra

Who this course is for:

Software Engineers and Architects willing to design and develop a Stream Processing Application using the Kafka Streams Library
Java Programmers aspiring to learn everything necessary to start implementing real-time streaming applications using Apache Kafka

Course Information:

Udemy | English | 10h 58m | 4.33 GB
Created by: Prashant Kumar Pandey
