Applying various transformations to Kafka Streams data

Introduction to Transformations in Kafka Streams

In this section, we will explore various transformations that can be applied to data streams in Kafka Streams. Transformations allow for modifying, filtering, and enriching the data as it flows through the stream processing pipeline, enabling powerful data manipulation and preparation.

Topics covered in this section:

  1. Introduction to transformations in Kafka Streams.
  2. Key-value mapping and transformations.
  3. Filtering and predicate-based transformations.
  4. FlatMap and mapValues transformations.
  5. Enrichment and joining of streams.
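
Before looking at the Kafka Streams API itself, the semantics of the core operations in this list can be sketched in plain Java over an in-memory list standing in for a record stream. This is an illustrative analogy only (the class name and sample values below are invented for the sketch, not part of the Kafka API):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class TransformationSemanticsSketch {

    public static void main(String[] args) {
        // An in-memory stand-in for the values flowing through a KStream
        List<String> values = Arrays.asList("hi", "kafka streams", "stream processing");

        // filter -> flatMap -> map, mirroring the Kafka Streams pipeline shown below
        List<String> transformed = values.stream()
                .filter(v -> v.length() > 5)                  // keep values longer than 5 chars
                .flatMap(v -> Arrays.stream(v.split(" ")))    // split each value into words
                .map(String::toUpperCase)                     // uppercase each word
                .collect(Collectors.toList());

        // Prints [KAFKA, STREAMS, STREAM, PROCESSING]
        System.out.println(transformed);
    }
}
```

The key difference in Kafka Streams is that these operators run continuously over an unbounded stream of records rather than once over a finite collection.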

Code Sample: Applying Transformations in Kafka Streams

Java
import org.apache.kafka.streams.*;
import org.apache.kafka.streams.kstream.*;
import org.apache.kafka.common.serialization.Serdes;
import java.util.Properties;

public class KafkaStreamsTransformationsExample {

    public static void main(String[] args) {
        // Configure Kafka Streams application
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "my-streams-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        // Default serdes so topics can be read and written without per-operator overrides
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        // Create Kafka Streams builder
        StreamsBuilder builder = new StreamsBuilder();

        // Define the processing logic
        KStream<String, String> inputStream = builder.stream("input_topic");
        KStream<String, String> transformedStream = inputStream
                .filter((key, value) -> value.length() > 5)
                .flatMapValues(value -> java.util.Arrays.asList(value.split(" ")))
                .mapValues(value -> value.toUpperCase());

        // Write the transformed data to an output topic
        transformedStream.to("output_topic");

        // Build and start the Kafka Streams application
        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();

        // Gracefully shutdown the application on termination
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}

Reference Link:

  • Kafka Streams documentation on transformations: link

Helpful Video:

  • “Kafka Streams Transformation Operations” by Confluent: link

Advanced Transformations and Joining Streams

In this section, we will dive into advanced transformations and stream joining in Kafka Streams. Advanced transformations enable complex data manipulations and aggregations, while joining allows data from multiple streams to be combined based on record keys.

Topics covered in this section:

  1. Windowed transformations and aggregations.
  2. Table lookups and stream-table joins.
  3. Global state and cross-stream transformations.
  4. Handling out-of-order events in stream joining.
  5. Configuring advanced transformation parameters for optimal results.
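
The windowing idea in topic 1 can be pictured without any Kafka machinery: assign each timestamped event to a fixed-size window by integer-dividing its timestamp by the window length, then count per (key, window). The sketch below is illustrative only (the class name, key, and timestamps are invented for the example):

```java
import java.util.HashMap;
import java.util.Map;

public class WindowedCountSketch {

    static final long WINDOW_MS = 5 * 60 * 1000; // 5-minute tumbling windows

    public static void main(String[] args) {
        // Event timestamps (ms) standing in for timestamped stream records
        long[] timestamps = {0, 60_000, 310_000};
        String key = "sensor-1";

        // Count events per (key, windowStart), as a windowed count() would
        Map<String, Long> counts = new HashMap<>();
        for (long ts : timestamps) {
            long windowStart = (ts / WINDOW_MS) * WINDOW_MS;
            counts.merge(key + "@" + windowStart, 1L, Long::sum);
        }

        // Two windows: start 0 ms holds 2 events, start 300000 ms holds 1 event
        System.out.println(counts);
    }
}
```

Kafka Streams adds what this sketch omits: fault-tolerant window state, and policies for late-arriving (out-of-order) events via a configurable grace period.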

Code Sample: Windowed Aggregation and Stream-Table Join in Kafka Streams

Java
import org.apache.kafka.streams.*;
import org.apache.kafka.streams.kstream.*;
import org.apache.kafka.common.serialization.Serdes;
import java.util.Properties;
import java.time.Duration;

public class KafkaStreamsAdvancedTransformationsExample {

    public static void main(String[] args) {
        // Configure Kafka Streams application
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "my-streams-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        // Default serdes so topics can be read and written without per-operator overrides
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        // Create Kafka Streams builder
        StreamsBuilder builder = new StreamsBuilder();

        // Define the processing logic
        KStream<String, String> inputStream = builder.stream("input_topic");
        KTable<Windowed<String>, Long> windowedTable = inputStream
                .groupByKey()
                .windowedBy(TimeWindows.of(Duration.ofMinutes(5)))
                .count();

        // Table for enrichment lookups, keyed the same way as the input stream
        KTable<String, String> lookupTable = builder.table("lookup_table_topic");

        // Stream-table join on the record key; unmatched stream records are dropped
        KStream<String, String> joinedStream = inputStream
                .join(
                        lookupTable,
                        (streamValue, tableValue) -> streamValue + ", " + tableValue
                );

        // Print the windowed counts (in practice these would go to an output topic)
        windowedTable.toStream()
                .foreach((key, value) -> System.out.println("Window: " + key + ", Count: " + value));

        // Write the joined data to another output topic
        joinedStream.to("output_topic");

        // Build and start the Kafka Streams application
        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();

        // Gracefully shutdown the application on termination
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
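
The stream-table join in the sample can likewise be pictured as a per-record lookup against the table's latest state. A minimal in-memory sketch, with invented class, key, and value names (not Kafka API):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class StreamTableJoinSketch {

    public static void main(String[] args) {
        // Table state: the latest value per key, as a KTable materializes it
        Map<String, String> table = new HashMap<>();
        table.put("user-1", "Alice");
        table.put("user-2", "Bob");

        // Stream records as (key, value) pairs
        String[][] stream = {{"user-1", "login"}, {"user-3", "logout"}, {"user-2", "click"}};

        // Inner join: emit only when the key has a table entry,
        // combining values as the ValueJoiner in the sample does
        List<String> joined = new ArrayList<>();
        for (String[] record : stream) {
            String tableValue = table.get(record[0]);
            if (tableValue != null) { // unmatched keys ("user-3") are dropped
                joined.add(record[1] + ", " + tableValue);
            }
        }
        System.out.println(joined); // [login, Alice, click, Bob]
    }
}
```

The real operator differs in that the table state is continuously updated from its changelog topic, so the same stream key can join against different table values over time.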

Reference Link:

  • Kafka Streams documentation on advanced transformations: link

Helpful Video:

  • “Kafka Streams – Advanced Transformations” by DataCumulus: link

Conclusion:
In this module, we explored various transformations that can be applied to Kafka Streams data. Transformations allow for manipulating, filtering, and enriching the data as it flows through the stream processing pipeline. By leveraging transformations, you can perform complex data manipulations and join streams to derive valuable insights from real-time data.

The provided code samples and reference links enable you to apply transformations in your Kafka Streams applications, including key-value mapping, filtering, flatMap, mapValues, and more. Additionally, advanced transformations such as windowed operations, stream-table joins, and global state transformations empower you to perform advanced analytics and aggregations on streaming data.

By mastering the art of transformations in Kafka Streams, you can build powerful stream processing pipelines that efficiently process, enrich, and derive valuable insights from real-time data. Kafka Streams provides a robust framework for data manipulation, making it an ideal choice for building scalable and fault-tolerant stream processing applications.

About Author
Ozzie Feliciano CTO @ Felpfe Inc.

Ozzie Feliciano is a highly experienced technologist with a remarkable twenty-three years of expertise in the technology industry.
