Kafka Streams 101: Data Serialization (2023)

25.2K views
Sep 10, 2021
4:57

► TRY THIS YOURSELF: https://cnfl.io/kafka-streams-101-module-1

Apache Kafka® brokers only work with records in bytes, so data serialization is important. In this video, learn to convert objects and other useful types into bytes that can be sent across the network or put into a state store, and vice versa (deserialization).

► For a COMPLETE IMMERSIVE HANDS-ON EXPERIENCE, go to https://cnfl.io/kafka-streams-101-module-1

--

ABOUT CONFLUENT

Confluent is pioneering a fundamentally new category of data infrastructure focused on data in motion. Confluent's cloud-native offering is the foundational platform for data in motion: the intelligent connective tissue that enables real-time data from multiple sources to stream constantly across the organization. With Confluent, organizations can meet the new business imperative of delivering rich, digital front-end customer experiences and transitioning to sophisticated, real-time, software-driven backend operations. To learn more, please visit www.confluent.io.

#kafka #kafkastreams #streamprocessing #apachekafka #confluent
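The round trip the description mentions (object → bytes for the broker or state store, then bytes → object on the way back) can be sketched in plain Java. This is a minimal, self-contained illustration: the `Serializer`/`Deserializer` interfaces below are simplified local stand-ins for Kafka's real ones in `org.apache.kafka.common.serialization` (which also take a topic argument), and the `Customer` type and its byte layout are invented for this example.

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Simplified local stand-ins for Kafka's Serializer/Deserializer interfaces,
// declared here so the sketch compiles without the Kafka client library.
interface Serializer<T> { byte[] serialize(T data); }
interface Deserializer<T> { T deserialize(byte[] bytes); }

public class SerdeSketch {
    // A made-up value type for illustration.
    record Customer(long id, String name) {}

    // Serializer: pack the record into a length-prefixed byte layout --
    // this byte[] is all a broker or state store ever sees.
    static final Serializer<Customer> customerSerializer = c -> {
        byte[] name = c.name().getBytes(StandardCharsets.UTF_8);
        return ByteBuffer.allocate(Long.BYTES + Integer.BYTES + name.length)
                .putLong(c.id())
                .putInt(name.length)
                .put(name)
                .array();
    };

    // Deserializer: read the same layout back into a Customer object.
    static final Deserializer<Customer> customerDeserializer = bytes -> {
        ByteBuffer buf = ByteBuffer.wrap(bytes);
        long id = buf.getLong();
        byte[] name = new byte[buf.getInt()];
        buf.get(name);
        return new Customer(id, new String(name, StandardCharsets.UTF_8));
    };

    public static void main(String[] args) {
        byte[] wire = customerSerializer.serialize(new Customer(42L, "Jane"));
        System.out.println(customerDeserializer.deserialize(wire));
    }
}
```

In Kafka Streams proper you would bundle such a pair into a `Serde` and pass it when building the topology, so the framework can serialize and deserialize records for you at topic and state-store boundaries.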
