Kafka for Developers - Data Contracts using Schema Registry

Learn to build Kafka Producer/Consumer applications that use the AVRO data format and Confluent Schema Registry.
Rating: 4.60 (149 reviews)
Platform: Udemy
Language: English
Category: Development Tools
Students: 1,806
Content: 5.5 hours
Last update: Nov 2024
Regular price: $29.99

Why take this course?

🎓 Kafka for Developers - Master Data Contracts with AVRO and Schema Registry

📚 Course Overview: This comprehensive course equips you with the practical skills needed to build robust Kafka Producer/Consumer applications using the AVRO data format and Confluent Schema Registry. If you want to master data handling in Kafka, this hands-on course is a perfect match!

What You'll Learn:

  • Theoretical Foundation: Dive deep into the concepts of data evolution, serialization formats, and enforcing data contracts within a Kafka ecosystem.
  • Hands-On Experience: Get your hands dirty with coding exercises that will solidify your understanding of AVRO and Schema Registry.
  • Data Serialization & Kafka: Explore how AVRO fits into the Kafka architecture and why it's a preferred choice for serialization tasks.
  • Schema Evolution: Learn to handle data evolution gracefully using Schema Registry, ensuring compatibility between applications without disrupting the flow of data.

By the End of the Course:

  • You will have a solid grasp of:
    • Using AVRO as your data serialization format.
    • Evolving data through Schema Registry, using the BACKWARD, FORWARD, FULL, and NONE compatibility strategies (see the sketch below).
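
As a quick illustration of what "backward compatible" means in practice, the sketch below uses the plain Apache Avro API (not necessarily the exact libraries used in the course) to check that a hypothetical CoffeeOrder schema which gains a new field with a default value can still read data written with the older version. All record and field names are illustrative.

```java
import org.apache.avro.Schema;
import org.apache.avro.SchemaCompatibility;
import org.apache.avro.SchemaCompatibility.SchemaCompatibilityType;

public class CompatibilityDemo {
    public static void main(String[] args) {
        // Version 1 of a hypothetical CoffeeOrder schema.
        Schema v1 = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"CoffeeOrder\",\"fields\":["
          + "{\"name\":\"id\",\"type\":\"string\"},"
          + "{\"name\":\"size\",\"type\":\"string\"}]}");

        // Version 2 adds a field WITH a default value. This is a backward-compatible
        // change: a consumer using v2 can still read data produced with v1.
        Schema v2 = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"CoffeeOrder\",\"fields\":["
          + "{\"name\":\"id\",\"type\":\"string\"},"
          + "{\"name\":\"size\",\"type\":\"string\"},"
          + "{\"name\":\"milk\",\"type\":\"string\",\"default\":\"NONE\"}]}");

        // Ask Avro whether a reader using v2 can decode data written with v1.
        SchemaCompatibilityType result = SchemaCompatibility
                .checkReaderWriterCompatibility(v2, v1)
                .getType();
        System.out.println("v2 reads v1 data: " + result); // COMPATIBLE
    }
}
```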

Course Breakdown:

🛠️ Getting Started with Kafka:

  • Introduction to the course and a preview of what's to come.

🚀 Data Contract & Serialization in Kafka:

  • Understand how serialization integrates with Kafka and its pivotal role in the overall architecture.
  • Explore different serialization formats, including AVRO, Protobuf, and Thrift, and how each supports schemas.

📈 Introduction to AVRO - A Data Serialization System:

  • Learn why AVRO is a top choice for Kafka applications.
  • Craft your first AVRO schema and understand its importance.
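
For a rough idea of what a first AVRO schema file can look like, here is a small, illustrative .avsc; the record name, namespace, and fields are assumptions, not taken from the course material:

```json
{
  "type": "record",
  "name": "CoffeeOrder",
  "namespace": "com.example.orders",
  "doc": "One coffee order placed at the counter (illustrative schema).",
  "fields": [
    { "name": "id",        "type": "string" },
    { "name": "drink",     "type": "string" },
    { "name": "size",      "type": { "type": "enum", "name": "Size", "symbols": ["SMALL", "MEDIUM", "LARGE"] } },
    { "name": "quantity",  "type": "int", "default": 1 },
    { "name": "orderedAt", "type": { "type": "long", "logicalType": "timestamp-millis" } }
  ]
}
```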

⚛️ Kafka Setup & Local Demo Using Docker:

  • Get hands-on experience setting up Kafka locally with Docker.
  • Publish and consume messages using the Kafka Console Producer and Consumer.
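
A minimal sketch of that local workflow, assuming a docker-compose.yml that exposes a broker on localhost:9092; the exact images, ports, and topic name are assumptions, and Apache Kafka distributions ship the same console tools with a .sh suffix:

```sh
# Start the containers defined in a local docker-compose.yml
# (assumed to expose a broker on localhost:9092 and Schema Registry on localhost:8081).
docker compose up -d

# Publish a few plain-text messages from the console producer
# (the topic name "coffee-orders" is just an example).
kafka-console-producer --bootstrap-server localhost:9092 --topic coffee-orders

# In another terminal, read them back from the beginning of the topic.
kafka-console-consumer --bootstrap-server localhost:9092 --topic coffee-orders --from-beginning
```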

🎨 Schema Naming Strategies:

  • Understand the different schema (subject) naming strategies and how they determine which record types a topic can carry (a configuration sketch follows below).
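
As background, with the Confluent serializers the naming strategy decides the subject under which a schema is registered. The sketch below shows how a producer typically selects it; the URLs and topic name are assumptions:

```java
import java.util.Properties;

public class SubjectNamingConfig {
    public static void main(String[] args) {
        // Producer-side serializer settings (broker/registry URLs are assumptions).
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("schema.registry.url", "http://localhost:8081");

        // Default: TopicNameStrategy registers the value schema under "<topic>-value",
        // e.g. "coffee-orders-value", so a topic carries a single record type.
        props.put("value.subject.name.strategy",
                "io.confluent.kafka.serializers.subject.TopicNameStrategy");

        // Alternatives when one topic carries several event types:
        //   io.confluent.kafka.serializers.subject.RecordNameStrategy
        //       -> subject is the fully qualified record name
        //   io.confluent.kafka.serializers.subject.TopicRecordNameStrategy
        //       -> subject is "<topic>-<fully qualified record name>"
        props.forEach((key, value) -> System.out.println(key + " = " + value));
    }
}
```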

👩‍💻 Build a Coffee Order Service with Spring Boot & Schema Registry:

  • Code your own Spring Boot Kafka application that leverages AVRO for data exchange and interacts with Schema Registry for seamless data evolution.
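
A minimal sketch of what the producing side of such a service might look like with Spring Kafka; CoffeeOrder stands in for a class generated from the AVRO schema, and the topic name and property values are assumptions:

```java
// application.yml (sketch of the serializer settings, values are assumptions):
//   spring.kafka.producer.key-serializer:   org.apache.kafka.common.serialization.StringSerializer
//   spring.kafka.producer.value-serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
//   spring.kafka.producer.properties.schema.registry.url: http://localhost:8081

import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class CoffeeOrderProducer {

    private static final String TOPIC = "coffee-orders"; // illustrative topic name

    private final KafkaTemplate<String, CoffeeOrder> kafkaTemplate;

    public CoffeeOrderProducer(KafkaTemplate<String, CoffeeOrder> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void send(CoffeeOrder order) {
        // Key by order id so updates to the same order land on the same partition.
        kafkaTemplate.send(TOPIC, order.getId().toString(), order);
    }
}
```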

Key Topics Covered:

  • Understanding an AVRO Record: Delve into the anatomy of an AVRO record as it is published to and consumed from Kafka (a wire-format sketch follows this list).
  • Schema Changes in AVRO: Learn how to manage schema changes, including demonstrations on handling backward and forward compatibility.
  • Data Evolution using Schema Registry: Master the techniques for evolving schemas to adapt to changing business requirements.
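
For context on what actually travels over the wire: with the Confluent serializers, each AVRO-encoded value is framed as a magic byte (0) followed by the 4-byte ID of the writer's schema in Schema Registry, then the Avro binary payload. A small sketch of peeking at that prefix from a raw record value:

```java
import java.nio.ByteBuffer;

public class WireFormatPeek {

    // Confluent wire format for AVRO values: [magic byte 0][4-byte schema id][Avro binary payload].
    public static int schemaIdOf(byte[] rawValue) {
        ByteBuffer buffer = ByteBuffer.wrap(rawValue);
        byte magic = buffer.get();
        if (magic != 0) {
            throw new IllegalArgumentException("Not a Confluent-framed AVRO message");
        }
        return buffer.getInt(); // the id under which the writer's schema is registered
    }
}
```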

Who Should Take This Course? This course is ideal for:

  • Developers who are new to Kafka and looking to understand the role of serialization formats like AVRO.
  • Engineers aiming to master schema evolution with Schema Registry.
  • DevOps professionals interested in expanding their knowledge of data handling within Kafka ecosystems.

Join us on this journey to become an expert in leveraging AVRO and Schema Registry for robust data contract management in Kafka! 🌟
