Spark Streaming – Kafka messages in Avro format
This article describes Spark Structured Streaming from Kafka in Avro file format and usage of…
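A minimal sketch of streaming Avro messages from Kafka with Spark Structured Streaming, assuming Spark 3.x with the `spark-avro` package on the classpath; the topic name `avro_topic`, broker address, and the `User` schema are illustrative assumptions.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col
import org.apache.spark.sql.avro.functions.from_avro

object KafkaAvroStream extends App {
  val spark = SparkSession.builder()
    .appName("kafka-avro-stream")
    .master("local[*]")
    .getOrCreate()

  // Avro schema describing the message value (assumed for this example)
  val avroSchema =
    """{"type":"record","name":"User","fields":[
      |{"name":"name","type":"string"},
      |{"name":"age","type":"int"}]}""".stripMargin

  val df = spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "avro_topic")
    .load()

  // Kafka delivers the value column as binary; from_avro decodes it
  val users = df.select(from_avro(col("value"), avroSchema).as("user"))

  users.writeStream
    .format("console")
    .outputMode("append")
    .start()
    .awaitTermination()
}
```

The decoded `user` column is a struct, so individual fields can be selected as `user.name` and `user.age`.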
Spark Streaming with Kafka Example. Using Spark Streaming we can read from a Kafka topic and…
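A short sketch of the streaming read described above, assuming a broker at `localhost:9092` and a topic named `text_topic` (both illustrative):

```scala
import org.apache.spark.sql.SparkSession

object KafkaStreamRead extends App {
  val spark = SparkSession.builder()
    .appName("kafka-stream-read")
    .master("local[*]")
    .getOrCreate()

  // Subscribe to the topic; key and value arrive as binary columns
  val df = spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "text_topic")
    .load()

  // Cast the binary value to a readable string and print to the console
  df.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
    .writeStream
    .format("console")
    .outputMode("append")
    .start()
    .awaitTermination()
}
```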
This article describes Spark Batch Processing using the Kafka Data Source. Unlike Spark Structured Streaming, we may need to run batch jobs that read data from Kafka and write data to a Kafka topic in batch mode. To do this we use read instead of readStream and, similarly, write instead of writeStream on the DataFrame.
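The batch read and write described above can be sketched as follows; the broker address and the topic names `input_topic` and `output_topic` are assumptions for illustration:

```scala
import org.apache.spark.sql.SparkSession

object KafkaBatchExample extends App {
  val spark = SparkSession.builder()
    .appName("kafka-batch")
    .master("local[*]")
    .getOrCreate()

  // Batch read: read instead of readStream
  val df = spark.read
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "input_topic")
    .load()

  // Batch write: write instead of writeStream
  // (the Kafka sink requires a string or binary "value" column)
  df.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
    .write
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("topic", "output_topic")
    .save()
}
```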
Kafka allows us to create our own serializer and deserializer so that we can produce and consume different data types such as JSON, POJO, etc. In this post we will see how to produce and consume a User POJO object. To stream POJO objects, one needs to create custom serializer and deserializer classes.
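A minimal sketch of a custom serializer/deserializer pair for a `User` object, implementing Kafka's `Serializer` and `Deserializer` interfaces; the simple comma-separated encoding is an assumption for illustration (a real application might use JSON or Avro):

```scala
import java.nio.charset.StandardCharsets
import org.apache.kafka.common.serialization.{Deserializer, Serializer}

case class User(name: String, age: Int)

// Serializer: User -> bytes ("name,age" encoding, assumed for this sketch)
class UserSerializer extends Serializer[User] {
  override def serialize(topic: String, data: User): Array[Byte] =
    if (data == null) null
    else s"${data.name},${data.age}".getBytes(StandardCharsets.UTF_8)
}

// Deserializer: bytes -> User, reversing the encoding above
class UserDeserializer extends Deserializer[User] {
  override def deserialize(topic: String, bytes: Array[Byte]): User =
    if (bytes == null) null
    else {
      val Array(name, age) =
        new String(bytes, StandardCharsets.UTF_8).split(",", 2)
      User(name, age.toInt)
    }
}
```

These classes are then passed to the producer and consumer via the `key.serializer` / `value.serializer` and `key.deserializer` / `value.deserializer` properties.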
This article explains how to write a Kafka Producer and Consumer example in Scala. The producer sends messages to Kafka topics in the form of records; a record is a key-value pair along with the topic name, and the consumer receives messages from a topic.
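The producer/consumer flow described above can be sketched with Kafka's Java client from Scala; the broker address, topic name `text_topic`, and group id are illustrative assumptions:

```scala
import java.time.Duration
import java.util.Properties
import scala.jdk.CollectionConverters._
import org.apache.kafka.clients.consumer.KafkaConsumer
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
import org.apache.kafka.common.serialization.{StringDeserializer, StringSerializer}

object ProducerConsumerExample extends App {
  val topic = "text_topic" // assumed topic name

  // Producer: send one key-value record to the topic
  val producerProps = new Properties()
  producerProps.put("bootstrap.servers", "localhost:9092")
  producerProps.put("key.serializer", classOf[StringSerializer].getName)
  producerProps.put("value.serializer", classOf[StringSerializer].getName)
  val producer = new KafkaProducer[String, String](producerProps)
  producer.send(new ProducerRecord[String, String](topic, "key1", "hello kafka"))
  producer.close()

  // Consumer: subscribe to the topic and poll for records
  val consumerProps = new Properties()
  consumerProps.put("bootstrap.servers", "localhost:9092")
  consumerProps.put("group.id", "example-group")
  consumerProps.put("auto.offset.reset", "earliest")
  consumerProps.put("key.deserializer", classOf[StringDeserializer].getName)
  consumerProps.put("value.deserializer", classOf[StringDeserializer].getName)
  val consumer = new KafkaConsumer[String, String](consumerProps)
  consumer.subscribe(java.util.Collections.singletonList(topic))
  consumer.poll(Duration.ofSeconds(5)).asScala.foreach { record =>
    println(s"key=${record.key}, value=${record.value}")
  }
  consumer.close()
}
```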