Event-Driven Application Using Spring Cloud Stream

Anshu Mishra
FAUN — Developer Community 🐾
Jun 24, 2020 · 5 min read



What are Events and Event-Driven Architecture?

An event is a state change in an application. In simple terms, you can think of a button click on a page as an event. The source of an event can be a user, such as a button click, or a software system producing events, such as IoT devices generating real-time sensor data.

Event-driven architecture (EDA) is a software architecture paradigm. Event producers and event consumers are the two major components of EDA. The producer's responsibility is to sense any state change and represent that state as an event message. The producer does not know who consumes the event or what its outcome will be. Events are transmitted via an event channel.

There are two types of EDA models:

  1. Pub/Sub model: When an event occurs, the system puts it as a message on an event stream, and any subscriber listening to that stream consumes the event.
  2. Event streaming: Events are written to a log, and consumers do not subscribe to an event stream; they can read events from any part of the stream at any time.

There are several benefits to the EDA approach: you can develop loosely coupled, real-time, distributed systems. For example, suppose you are developing a security system that detects a face, applies a decision, and then allows a user to enter a restricted area. The camera takes the photo and sends the real-time data to you as an event; you apply the decision and send back another event, in real time, saying yes or no.

Event-driven systems involve complexity as well. Scalability and reliability are two major concerns. You need to write producer/consumer code to produce and consume events, plus specific logic to process each particular event. As a developer, you also need to write code to connect your app to a specific messaging system.

Spring Cloud Stream is a Spring library that helps developers focus on the core app rather than on boilerplate connector code. It unifies many popular messaging platforms behind one easy-to-use API, including RabbitMQ, Apache Kafka, Amazon Kinesis, Google Pub/Sub, Solace PubSub+, Azure Event Hubs, and Apache RocketMQ. Spring Cloud Stream supports WebFlux, multiple inputs and outputs, multiple binders (you can use Kafka and RabbitMQ as the input and output channels, respectively, in a single application), Kafka Streams features like KTable and KStream, and much more.

Check out: https://spring.io/projects/spring-cloud-stream#overview

Build an Event-Driven Application:

Let's build a simple event-driven system using Spring Cloud Stream and Apache Kafka.

Download Apache Kafka: I am using version kafka_2.12-2.5.0.

Kafka ships with ZooKeeper, and ZooKeeper is responsible for managing the cluster state.

Run zookeeper-server-start.sh and supply zookeeper.properties as a command parameter. ZooKeeper will start on the default port 2181. You can change the port in the zookeeper.properties file.
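Assuming you are running the commands from the Kafka installation directory:

```sh
bin/zookeeper-server-start.sh config/zookeeper.properties
```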

Start the Kafka server. It will start on the default port 9092. You can change the port in the server.properties file.
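In a second terminal:

```sh
bin/kafka-server-start.sh config/server.properties
```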

Now create the build.gradle file.
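A minimal sketch of the build file, assuming Spring Boot 2.3 with the Hoxton release train (the exact versions are assumptions; adjust them to your setup):

```groovy
plugins {
    id 'org.springframework.boot' version '2.3.1.RELEASE'
    id 'io.spring.dependency-management' version '1.0.9.RELEASE'
    id 'java'
}

group = 'com.example'
version = '0.0.1-SNAPSHOT'

repositories {
    mavenCentral()
}

dependencies {
    // REST endpoint for the producer
    implementation 'org.springframework.boot:spring-boot-starter-web'
    // Spring Cloud Stream with the Kafka binder
    implementation 'org.springframework.cloud:spring-cloud-starter-stream-kafka'
}

dependencyManagement {
    imports {
        mavenBom 'org.springframework.cloud:spring-cloud-dependencies:Hoxton.SR5'
    }
}
```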

Create a Spring Boot main class:
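For example (the class and package names are placeholders):

```java
package com.example.events;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class EventDrivenApplication {

    public static void main(String[] args) {
        SpringApplication.run(EventDrivenApplication.class, args);
    }
}
```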

We need to define inbound and outbound channels so that the application can communicate with Kafka. When you bind this EventStream, Spring takes care of subscribing to the topic as well as setting up the producer channel. For each channel type, you need to declare a separate method. At runtime, Spring creates a bean that you can inject anywhere to get a channel reference.
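A sketch of such an interface; the channel names event-in and event-out are assumptions and only need to match the bindings in the application.yml shown below:

```java
import org.springframework.cloud.stream.annotation.Input;
import org.springframework.cloud.stream.annotation.Output;
import org.springframework.messaging.MessageChannel;
import org.springframework.messaging.SubscribableChannel;

public interface EventStream {

    String INBOUND = "event-in";
    String OUTBOUND = "event-out";

    // Inbound channel: Spring subscribes this to the Kafka topic
    @Input(INBOUND)
    SubscribableChannel inboundEvents();

    // Outbound channel: used to publish events to the Kafka topic
    @Output(OUTBOUND)
    MessageChannel outboundEvents();
}
```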

Now bind the stream. The @EnableBinding annotation takes care of binding the input and output channels. You don't need any boilerplate code to bind the channels.
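For example (the configuration class name is a placeholder; the annotation can equally sit on the main application class):

```java
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.context.annotation.Configuration;

// Binds the channels declared in EventStream to the configured binder (Kafka)
@Configuration
@EnableBinding(EventStream.class)
public class StreamBindingConfig {
}
```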

Define the brokers and binding details in the application.yml file.
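A sketch matching the channel names above (the broker address and contentType are assumptions):

```yaml
spring:
  cloud:
    stream:
      kafka:
        binder:
          brokers: localhost:9092
      bindings:
        event-in:
          destination: data_stream
          contentType: application/json
        event-out:
          destination: data_stream
          contentType: application/json
```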

Event Producer:

For producing messages, I am using a REST controller that puts a message on the data_stream topic.

Create a payload class:
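A minimal sketch; the field names are assumptions:

```java
public class Event {

    private String id;
    private String message;

    public Event() {
        // default constructor needed for JSON deserialization
    }

    public String getId() { return id; }
    public void setId(String id) { this.id = id; }

    public String getMessage() { return message; }
    public void setMessage(String message) { this.message = message; }
}
```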

Controller class:
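A sketch of the controller; the /events path is an assumption:

```java
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.messaging.support.MessageBuilder;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class EventController {

    private final EventStream eventStream;

    public EventController(EventStream eventStream) {
        this.eventStream = eventStream;
    }

    @PostMapping("/events")
    public ResponseEntity<String> publish(@RequestBody Event event) {
        // Put the payload on the outbound (producer) channel
        eventStream.outboundEvents()
                .send(MessageBuilder.withPayload(event).build());
        return new ResponseEntity<>("Event published", HttpStatus.ACCEPTED);
    }
}
```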

For producing messages, we need a channel to send our payload on. Note that we have injected the EventStream bean, which gives us the message channel object at runtime. Since we are producing a message, we take a reference to the producer (outbound) channel.

Event Consumer:

Here we have written a listener. The @StreamListener annotation automatically registers a listener on the inbound channel so that we can consume any incoming message.
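A sketch of the listener, reusing the inbound channel name from the EventStream interface above:

```java
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.stereotype.Component;

@Component
public class EventConsumer {

    // Registers a listener on the inbound channel bound to the data_stream topic
    @StreamListener(EventStream.INBOUND)
    public void handleEvent(Event event) {
        System.out.println("Received event: " + event.getMessage());
    }
}
```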

Server Logs:

In the server logs, you can see that a subscription channel has been opened on the data_stream topic.

Demo:

Now try to create an event from Postman.
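Equivalently, from the command line (assuming the app runs on the default port 8080 and exposes the /events endpoint from the controller sketch above):

```sh
curl -X POST http://localhost:8080/events \
  -H "Content-Type: application/json" \
  -d '{"id": "1", "message": "hello"}'
```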

In the server logs, you can see that the event has been successfully produced and consumed.

Kafka writes messages to topic partitions. Since we have only one partition, you can visit /tmp/kafka-logs/$topic_partition and see the messages in the .log file.

Conclusion:

Spring Cloud Stream helps you reduce boilerplate code and focus on the core business logic of your application. It has support for all the popular messaging platforms. Write a few Spring beans, define some cluster properties, and you are done with the configuration code. Spring Cloud Stream also supports multi-binder setups (you can use Kafka and RabbitMQ together in the same code) as well as cross-cluster binding, plus vendor-specific messaging platforms like AWS Kinesis, Google Pub/Sub, and Azure Event Hubs.

Thanks :)

If you like this article, please clap and follow me on Medium.

