FS2-Kafka in the real world - Part 2


This is a continuation of a series on how to use FS2-Kafka in the real world. Check out Part 1 to understand what FS2-Kafka is and how it is used in the ideal case.

The FS2-Kafka documentation gives examples of how to use a KafkaProducer to produce data. All of the examples assume that there is an existing stream of data from which records will be pushed to Kafka.
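
For reference, a typical docs-style producer pipeline has roughly this shape. This is a minimal sketch assuming fs2-kafka 3.x and cats-effect 3; the topic name and the in-memory source list are placeholders, not the library's example code.

```scala
import cats.effect.{IO, IOApp}
import fs2.Stream
import fs2.kafka._

object DocsStyleProducer extends IOApp.Simple {
  val settings: ProducerSettings[IO, String, String] =
    ProducerSettings[IO, String, String].withBootstrapServers("localhost:9092")

  def run: IO[Unit] =
    Stream
      .emits(List("a", "b", "c"))                                            // an existing source stream
      .map(value => ProducerRecords.one(ProducerRecord("topic", value, value)))
      .through(KafkaProducer.pipe(settings))                                 // publish each record to Kafka
      .compile
      .drain
}
```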

This raises a question: what if we want to push data to Kafka when there is no source stream? Consider this example.

Take a simple service that deletes an entity from a database. What if we wanted to publish a Kafka event when an organisation is deleted successfully? We could use the native Java KafkaProducer, but what if we wanted to use the FS2 KafkaProducer? We need something to link our service code and the FS2 stream.

We will use a queue for this purpose. This idea is also expressed in the official FS2 documentation: https://fs2.io/#/guide?id=talking-to-the-external-world. Let's create two methods that push data into and pull data out of our queue, as sketched below.
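
Here is a minimal sketch of a class with the two methods, assuming cats-effect 3 and fs2-kafka 3.x. The names KafkaEventPublisher, push and pull are illustrative, not the article's original identifiers.

```scala
import cats.effect.IO
import cats.effect.std.Queue
import fs2.Stream
import fs2.kafka._

final class KafkaEventPublisher(
    source: Stream[IO, String],
    settings: ProducerSettings[IO, String, String],
    topic: String
) {
  // Pulls elements from the external source stream and pushes them into the queue.
  private def push(queue: Queue[IO, String]): Stream[IO, Unit] =
    source.evalMap(queue.offer)

  // Pulls elements from the queue, creates Kafka records and publishes them
  // using the supplied producer.
  private def pull(
      queue: Queue[IO, String],
      producer: KafkaProducer[IO, String, String]
  ): Stream[IO, Unit] =
    Stream
      .fromQueueUnterminated(queue)
      .evalMap { value =>
        producer.produce(ProducerRecords.one(ProducerRecord(topic, value, value))).flatten.void
      }
}
```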

The class is made with one goal: publish data to Kafka from any stream. It does so by internally using a queue. The push method pulls data from the external stream and pushes it into the queue. The pull method pulls elements from the queue, creates Kafka records and pushes them to Kafka using the supplied producer.

The most critical part is linking the two methods, which happens in a run method. A stream containing the KafkaProducer is created, and so is a stream containing the queue. We must use flatMap to access the producer and the actual queue, which also ensures that only one instance of each is created. The queue and producer are passed to the push and pull methods to create the two streams, and the two streams are then run concurrently to ensure that both steps happen.
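
Continuing the sketch, the linking method belongs inside the class above. Using `concurrently` is my choice here, hedged as one reasonable way to start both streams; the original code may have used a different combinator.

```scala
  // Inside KafkaEventPublisher: link the two halves. flatMap (via the
  // for-comprehension) guarantees exactly one producer and one queue,
  // and `concurrently` runs the push and pull streams together.
  def run: Stream[IO, Unit] =
    for {
      producer <- KafkaProducer.stream(settings)           // one producer instance
      queue    <- Stream.eval(Queue.unbounded[IO, String]) // one internal queue
      _        <- push(queue).concurrently(pull(queue, producer))
    } yield ()
```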

Now the question is: how do we use this class to push delete events? We again need a queue to link them.

We create an unbounded queue, although in a real application we would care about memory usage and would probably choose a bounded queue of suitable size. We create an instance of the publisher class using a stream backed by the newly created queue, and then we start its stream. We also pass the newly created queue to the delete method. Here is how the delete method would use the queue.
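
A sketch of the wiring and the delete method, reusing the illustrative KafkaEventPublisher from above; deleteOrganisation, the topic name and the event payload are all assumptions, not the article's original code.

```scala
import cats.effect.{IO, IOApp}
import cats.effect.std.Queue
import fs2.Stream
import fs2.kafka.ProducerSettings

object DeleteService extends IOApp.Simple {
  // Every successful delete offers an event to the shared queue.
  def deleteOrganisation(id: String, queue: Queue[IO, String]): IO[Unit] =
    deleteFromDatabase(id) *> queue.offer(s"organisation-deleted:$id")

  private def deleteFromDatabase(id: String): IO[Unit] =
    IO.unit // placeholder for the real database call

  def run: IO[Unit] =
    for {
      queue    <- Queue.unbounded[IO, String] // a bounded queue is safer in production
      publisher = new KafkaEventPublisher(
                    Stream.fromQueueUnterminated(queue),
                    ProducerSettings[IO, String, String].withBootstrapServers("localhost:9092"),
                    "organisation-events"
                  )
      _ <- publisher.run.compile.drain.start // run the publishing stream in the background
      _ <- deleteOrganisation("org-42", queue)
    } yield ()
}
```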

Every time a delete happens we push an element to the queue. The streams created inside the publisher class ensure that every element pushed to this common queue is

  1. pulled and then pushed to the internal queue inside the run method.
  2. As soon as it is in the internal queue, it is pulled again, a producer record is created, and the record is pushed to Kafka.

Steps 1 and 2 will keep happening forever because we consume both queues with unterminated streams. This ensures that the publishing stream keeps running.

And that is it. We have successfully linked our external delete events to a KafkaProducer that pushes the data into Kafka. Hopefully this helps you integrate FS2-Kafka into more code bases.

Further reading

  1. https://fs2.io/#/guide?id=talking-to-the-external-world
  2. https://underscore.io/blog/posts/2018/03/20/fs2.html
