
Apache Kafka Essentials

Trainer(s): Paul Noorland, duration: 8 hours

Scalable, efficient, fast, reliable, stable: all of these qualities apply to Apache Kafka. No wonder Kafka has gained popularity in such a short time. Do you want to know more about this distributed event streaming platform? Have you just started a project in which you want to use Apache Kafka? This training will help you understand all of Kafka's basic concepts. You will gain experience in setting up a local development environment, and you will practice programming and testing Kafka clients.

Detailed description

We will start at the beginning: no prior knowledge of Apache Kafka is required for this training, although it helps if you already know something about Kafka or have worked with it before. The topics are covered at a rapid pace, because there is a lot to tell. First, we will look at what exactly Kafka is, what its main features are and what you can use it for. Then we will discuss what a Kafka cluster looks like, what Brokers are, the role of ZooKeeper, and how communication between Producers, Consumers and Brokers works. Next, we look at data organization: what topics, partitions and replicas are. We will set up a local Kafka environment and do some exercises with Kafka's Command Line Interface to see how Kafka works.

Then we will discuss Producers and Consumers and their specific properties, and we will start programming Consumers and Producers using the Java API. The next topic is Kafka transactions: how to set up a transactional producer and consumer. For this we will use the Spring Kafka framework. After programming all kinds of producers and consumers, we will write tests for them. We will look at unit testing as well as integration testing with embedded Spring Kafka and Testcontainers.
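To give a first impression of the hands-on part, below is a minimal sketch of a producer and a consumer written against the plain Kafka Java API. The broker address (localhost:9092), topic name (demo-topic) and consumer group id (demo-group) are illustrative assumptions only; in the training you will build and configure your own variants.

// A minimal sketch of a producer and consumer using the plain Kafka Java API.
// Assumes a broker on localhost:9092 and a topic named "demo-topic" (both hypothetical).
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

import java.time.Duration;
import java.util.List;
import java.util.Properties;

public class KafkaQuickStart {

    public static void main(String[] args) {
        // Producer: send a single key/value record to the topic.
        Properties producerProps = new Properties();
        producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
            producer.send(new ProducerRecord<>("demo-topic", "order-1", "created"));
        }

        // Consumer: subscribe to the topic as part of a consumer group and poll for records.
        Properties consumerProps = new Properties();
        consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "demo-group");
        consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
            consumer.subscribe(List.of("demo-topic"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("partition=%d offset=%d key=%s value=%s%n",
                        record.partition(), record.offset(), record.key(), record.value());
            }
        }
    }
}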

The last topic of the day is a Kafka component that you can add to your Kafka cluster: the Schema Registry, developed by Confluent. It is used for versioning message schemas and checking compatibility between the different schema versions.
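As an illustration of that topic, the sketch below shows a producer that serializes values with Apache Avro and talks to a Schema Registry through Confluent's KafkaAvroSerializer. The registry URL, topic name and example schema are assumptions for the sketch, not fixed parts of the exercises.

// A minimal sketch of a producer that serializes records with Apache Avro and
// registers/validates their schema against a Confluent Schema Registry.
// The registry URL (http://localhost:8081), topic and schema are illustrative only.
import io.confluent.kafka.serializers.KafkaAvroSerializer;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class AvroProducerSketch {

    private static final String USER_SCHEMA = """
            {"type":"record","name":"User","fields":[
              {"name":"name","type":"string"},
              {"name":"age","type":"int"}]}""";

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // The Avro serializer looks up or registers the value schema in the Schema Registry.
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class.getName());
        props.put("schema.registry.url", "http://localhost:8081");

        Schema schema = new Schema.Parser().parse(USER_SCHEMA);
        GenericRecord user = new GenericData.Record(schema);
        user.put("name", "Alice");
        user.put("age", 30);

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("users", "user-1", user));
        }
    }
}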

Target audience
The training is intended for Java developers who are interested in working with Apache Kafka and have little or no experience with it yet. This is a hands-on training aimed at developers. A lot of attention is paid to creating and studying code examples of Kafka clients that you can use when working on a project with Apache Kafka.

Learning goals

  • Understand what Kafka is.
  • Learn what you can and cannot do with Kafka.
  • Set up a local development environment to experiment with Kafka.
  • Know what to pay attention to when programming and configuring a Producer and a Consumer.
  • Write useful tests for Kafka Producers and Consumers.

Skills acquired

  • Learn all the basic Kafka concepts, such as Topics, Partitions, Brokers, Replicas, Producers and Consumers
  • Set up your local Kafka cluster
  • Learn how to use the Kafka Command Line Interface (CLI)
  • Create a Producer and Consumer using the Java API
  • Create a Transactional Producer and Consumer using the Spring Kafka API (see the sketch after this list)
  • Serialize and deserialize messages with Apache Avro
  • Set up a Confluent Kafka cluster with Docker
  • Learn how to use the Schema Registry
  • Test your Kafka clients with unit and integration tests
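As a taste of the transactional part, here is a minimal Spring Kafka sketch that sends two records inside one Kafka transaction. It assumes a KafkaTemplate backed by a transactional producer factory (for example via the spring.kafka.producer.transaction-id-prefix property in Spring Boot); the topic names and class name are made up for the example.

// A minimal sketch of sending messages inside a Kafka transaction with Spring Kafka.
// Assumes a KafkaTemplate configured for transactions; topic names are illustrative.
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class OrderPublisher {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public OrderPublisher(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void publishOrder(String orderId) {
        // Both sends are committed or aborted together as one Kafka transaction.
        kafkaTemplate.executeInTransaction(operations -> {
            operations.send("orders", orderId, "created");
            operations.send("order-events", orderId, "audit");
            return null;
        });
    }
}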

Topics

  • What is Apache Kafka?
  • Basic Apache Kafka nomenclature
  • Data Organisation: topics, partitions, replicas
  • Producers
  • Partitioners (a custom partitioner sketch follows this list)
  • Consumers, Consumer groups, Consumer offsets
  • Transactions
  • Serialize and deserialize messages with Apache Avro
  • Confluent Kafka Schema registry
  • Testing Kafka Clients
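As an example of the Partitioners topic, the sketch below shows a custom partitioner that pins one hypothetical key to partition 0 and falls back to Kafka's hash-based partitioning for all other keyed records. The key value and class name are invented for illustration.

// A minimal sketch of a custom partitioner (assumes every record has a non-null key).
import org.apache.kafka.clients.producer.Partitioner;
import org.apache.kafka.common.Cluster;
import org.apache.kafka.common.utils.Utils;

import java.util.Map;

public class VipPartitioner implements Partitioner {

    @Override
    public int partition(String topic, Object key, byte[] keyBytes,
                         Object value, byte[] valueBytes, Cluster cluster) {
        int numPartitions = cluster.partitionsForTopic(topic).size();
        if ("vip-customer".equals(key)) {
            return 0; // reserve partition 0 for the hypothetical VIP key
        }
        // Same hash-based approach Kafka uses for keyed records.
        return Utils.toPositive(Utils.murmur2(keyBytes)) % numPartitions;
    }

    @Override
    public void configure(Map<String, ?> configs) {
        // no configuration needed for this sketch
    }

    @Override
    public void close() {
        // nothing to clean up
    }
}

A producer would opt in to such a partitioner via the partitioner.class producer configuration (ProducerConfig.PARTITIONER_CLASS_CONFIG).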

Training outline

Kafka Basics, part 1 (+/- 3h):

  • Introduction & basic concepts of Kafka.
  • Set up a local Kafka cluster and practice with the Kafka CLI
  • Program a Producer and Consumer and explore partitioning with the Java API

Kafka Basics, part 2 (+/- 1.5h):

  • Kafka Transactions
  • Practice with Spring Kafka
  • Set up a Kafka cluster with Docker

Testing Kafka Clients (+/- 1.5h):

  • Unit tests and mocks
  • Set up integration tests with Spring Embedded Kafka and Testcontainers (see the sketch after this list)
  • Kafka and Spring Cloud Contracts
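To show what an integration test against a real broker can look like, here is a minimal sketch using the Testcontainers Kafka module with JUnit 5. The Docker image tag, topic name and test class are assumptions for the sketch; the training also covers the Spring Embedded Kafka alternative.

// A minimal sketch of an integration test that starts Kafka in a Docker container
// with Testcontainers; the image tag and topic name are illustrative only.
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;
import org.junit.jupiter.api.Test;
import org.testcontainers.containers.KafkaContainer;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;
import org.testcontainers.utility.DockerImageName;

import java.util.Properties;

import static org.junit.jupiter.api.Assertions.assertDoesNotThrow;

@Testcontainers
class KafkaProducerIT {

    @Container
    static KafkaContainer kafka =
            new KafkaContainer(DockerImageName.parse("confluentinc/cp-kafka:7.4.0"));

    @Test
    void sendsRecordToRealBroker() {
        Properties props = new Properties();
        // Bootstrap servers point at the broker running inside the container.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, kafka.getBootstrapServers());
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        assertDoesNotThrow(() -> {
            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("test-topic", "key", "value")).get();
            }
        });
    }
}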

Kafka Messages and Schema Registry (+/- 1.5h):

  • Kafka Serialisation Schemes
  • Confluent Schema Registry
  • Message schema compatibility

Prerequisites

  • Experience with Java programming
  • Experience with Maven, Docker and Spring Boot is helpful but not required

Provided training material
After the training you will have good code examples that you can use when starting an Apache Kafka project. You will also receive possible solutions to the exercises and the slides in PDF format.

About the trainer
When Mireille was a little girl, she wanted to be an inventor, a surgeon or a school teacher. But now that she has been a software developer for several years, she doesn't want to be anything else. She likes to build beautiful applications and solve complex problems. She also likes to understand how things work and to share that knowledge. Mireille has worked with Apache Kafka on several projects and is very enthusiastic about this messaging and event streaming platform. She is looking forward to teaching you more about Kafka.

Practical details

Standard pricing for this training: EUR 695,- ex VAT per attendee.
Please contact us for pricing for tailored content and for in-house group trainings.

Trainings can be given in one of our offices (Utrecht, Amsterdam, Rotterdam, Arnhem, Munich, Düsseldorf, Vienna, Zurich), on site at a client location, or (in some cases) remotely. Training content can be tailored to meet your specific requirements.

Want to enroll or have a question? Contact us via mail at info@openvalue.training, give us a call at +31-85-0606886 or use the form below.