During this hands-on course, you will:
• Write Producers and Consumers to send data to and read data from Kafka
• Integrate Kafka with external systems using Kafka Connect
• Write streaming applications with Kafka Streams & ksqlDB
• Integrate a Kafka client application
Fundamentals of Apache Kafka
• Explain the value of a *Distributed Event Streaming Platform*
• Explain how the “log” abstraction enables a distributed event streaming platform
• Explain the basic concepts of:
– Brokers, Topics, Partitions, and Segments
– Records (a.k.a. Messages, Events)
– Retention Policies
– Producers, Consumers, and Serialization
– Replication
– Kafka Connect
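The "log" abstraction behind these concepts can be sketched in a few lines: an append-only sequence of records, each identified by a sequential offset, with every reader tracking its own position independently. This is an illustration only; the `Log` and `Consumer` classes below are hypothetical, not Kafka API classes.

```python
# Minimal sketch of an append-only log (illustrative, not a Kafka API).

class Log:
    """An append-only sequence of records, each identified by its offset."""
    def __init__(self):
        self._records = []

    def append(self, record):
        offset = len(self._records)   # offsets are assigned sequentially
        self._records.append(record)
        return offset

    def read(self, offset):
        return self._records[offset]

    def __len__(self):
        return len(self._records)


class Consumer:
    """Tracks its own position in the log, independent of other readers."""
    def __init__(self, log):
        self._log = log
        self.position = 0

    def poll(self):
        if self.position < len(self._log):
            record = self._log.read(self.position)
            self.position += 1
            return record
        return None


log = Log()
log.append("event-0")
log.append("event-1")

c1, c2 = Consumer(log), Consumer(log)
print(c1.poll())  # event-0 -- each consumer advances its own offset
print(c1.poll())  # event-1
print(c2.poll())  # event-0 -- a second reader starts from the beginning
```

Because reads never mutate the log, any number of consumers can replay the same records at their own pace, which is what makes retention policies and consumer groups possible.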
Producing Messages to Kafka
• Sketch the high-level architecture of a Kafka producer
• Illustrate key-based partitioning
• Explain the difference between `acks=0`, `acks=1`, and `acks=all`
• Configure `delivery.timeout.ms` to control retry behavior
• Create a custom `producer.properties` file
• Tune throughput and latency using batching
• Create a producer
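A `producer.properties` file tying these settings together might look like the sketch below. The keys are standard Kafka producer configurations; the values are illustrative assumptions, not recommendations, and should be tuned for your workload.

```properties
# producer.properties -- illustrative values only
bootstrap.servers=localhost:9092

# Durability: wait for all in-sync replicas to acknowledge each write
acks=all

# Upper bound on the total time to deliver a record, including retries
delivery.timeout.ms=120000

# Throughput/latency trade-off: send a batch when it reaches 64 KB
# or after waiting up to 20 ms, whichever comes first
batch.size=65536
linger.ms=20

# Serialization for record keys and values
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=org.apache.kafka.common.serialization.StringSerializer
```

Note the interaction: `acks=all` with retries favors durability, while `batch.size` and `linger.ms` trade a small amount of latency for higher throughput.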
Consuming Messages from Kafka
• Illustrate how consumer groups and partitions provide scalability and fault tolerance
• Tune consumers to avoid excessive rebalances
• Explain the difference between “range” and “round robin” partition assignment strategies
• Create a custom `consumer.properties` file
• Use the Consumer API to manage offsets
• Tune fetch requests
• Create a consumer
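The difference between the "range" and "round robin" assignment strategies can be sketched with a simplified single-topic model. The functions below are illustrative only; Kafka's real assignors (`RangeAssignor`, `RoundRobinAssignor`) also handle multiple topics, subscriptions, and rebalance metadata.

```python
# Simplified sketch of two partition assignment strategies (single topic).

def range_assign(consumers, partitions):
    """Each consumer gets a contiguous block of partitions; the first
    (len(partitions) % len(consumers)) consumers get one extra."""
    consumers = sorted(consumers)
    per_consumer, extra = divmod(len(partitions), len(consumers))
    assignment, start = {}, 0
    for i, c in enumerate(consumers):
        count = per_consumer + (1 if i < extra else 0)
        assignment[c] = partitions[start:start + count]
        start += count
    return assignment


def round_robin_assign(consumers, partitions):
    """Partitions are dealt out one at a time, cycling through consumers."""
    consumers = sorted(consumers)
    assignment = {c: [] for c in consumers}
    for i, p in enumerate(partitions):
        assignment[consumers[i % len(consumers)]].append(p)
    return assignment


parts = [0, 1, 2, 3, 4]
print(range_assign(["c1", "c2"], parts))        # {'c1': [0, 1, 2], 'c2': [3, 4]}
print(round_robin_assign(["c1", "c2"], parts))  # {'c1': [0, 2, 4], 'c2': [1, 3]}
```

With an uneven partition count, range assignment gives the first consumers the extra partitions, while round robin spreads partitions as evenly as the deal allows.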