
A robust system with dynamic adaptability has become increasingly vital in today's ever-evolving technology landscape. Complex, fast-paced environments call for flexible systems that can cope with events as they happen, respond to them in real time, and adjust accordingly. Enter Event-Driven Architecture (EDA), a transformative concept that revolutionizes the way systems operate by focusing on the dynamic flow of events.

Imagine hosting a house party where everything is planned: when to serve drinks, when the dancing starts, when to serve food, and so on. Now imagine a more flexible timeline instead, where certain actions are taken whenever specified triggers, called 'events', occur. For example, drinks are served as soon as a guest enters the house; the guest entering is the trigger (event) for the action of serving a drink. With that picture in mind, let's get technical about Event-Driven Architecture and see how to implement it using Kafka and Node.js.

What is event-driven architecture?

  • In event-driven architecture, systems are designed to respond to events, which are occurrences or changes in state. Events could be user actions, sensor outputs, messages from other systems, etc. These events trigger specific actions or behaviors in the system. It is a software design pattern that emphasizes the production, detection, and consumption of events, and the reaction to them.
  • In this framework, the system's components interact by sharing events, which are lightweight messages conveying important occurrences or changes in the system's state.
  • Event-driven programming is a widely used approach in contemporary software development, providing a way for developers to create efficient and scalable applications.

Key components of event-driven architecture include the following (a short sketch after this list shows how they map to code):

  • Event Sources: These are entities or systems that generate events. Event sources could include user interfaces, sensors, databases, external services, or other parts of the application.
  • Events: An event is a signal that something of interest has happened. It encapsulates information about the occurrence and may include data relevant to the event. Events can be classified into different types based on their origin or purpose.
  • Event Bus or Broker: The event bus is a communication channel that facilitates the exchange of events between different components of the system. It acts as an intermediary, allowing event sources to publish events and event consumers to subscribe to and receive those events.
  • Event Consumers: These are components or services that react to specific events. When an event occurs, the event consumer takes appropriate actions, which could include updating the user interface, processing data, triggering further events, or communicating with other services.
  • Event Handlers: Event handlers are functions or modules responsible for processing and responding to specific types of events. They define the logic that should be executed when a particular event occurs.
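
To see how these roles fit together before introducing Kafka, here is a minimal in-process sketch using Node.js's built-in EventEmitter. The emitter stands in for the event bus, the simulated sensor is the event source, and the registered listener plays the role of the consumer with its handler (names such as temperatureSensor and handleReading are purely illustrative):

const EventEmitter = require('events');

// Event bus: carries events from sources to consumers
const bus = new EventEmitter();

// Event handler: the logic that runs when a 'temperatureReading' event occurs
const handleReading = (reading) => {
  console.log(`Handling reading from device ${reading.deviceID}: ${reading.temperature}°C`);
};

// Event consumer: subscribes the handler to a specific event type
bus.on('temperatureReading', handleReading);

// Event source: publishes an event along with its payload
const temperatureSensor = () => {
  bus.emit('temperatureReading', { deviceID: 1, temperature: 24 });
};

temperatureSensor();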

Benefits of Event-Driven Architecture

  • Loose Coupling: Components in an event-driven system are loosely coupled, meaning they can operate independently without detailed knowledge of each other. This promotes modularity and simplifies maintenance and updates.
  • Scalability: Event-driven architectures are well-suited for scalable applications. New components or services can be added without affecting the existing system, and the system can efficiently handle varying workloads.
  • Flexibility: EDA provides flexibility in designing systems that can adapt to changing requirements and conditions. Components can be easily replaced or modified, and new features can be added by responding to events.
  • Real-time Responsiveness: Event-driven systems are inherently capable of real-time responsiveness since they react to events as they occur, making them suitable for applications requiring quick and interactive responses.

Demo App to Explain Event-Driven Architecture

Prerequisites:

Before we proceed with the implementation, let's verify that we meet the required prerequisites.

Node.js

Ensure that Node.js is installed on your computer. If not, you can obtain it from the official Node.js website. (https://nodejs.org/en)

Why Node.js?

Node.js, built on Chrome's V8 JavaScript engine, is a platform that thrives on asynchronous, event-driven principles. Its non-blocking I/O model makes it especially well suited to building scalable network applications.
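
As a quick illustration of this event-driven, non-blocking model, the snippet below registers a timer callback and keeps executing instead of waiting for it; the callback runs later, when the event loop dispatches the timer event:

// Non-blocking: the callback is queued while the program keeps running
setTimeout(() => {
  console.log('Timer event handled after 1 second');
}, 1000);

console.log('Still running while the timer waits in the background');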

Apache Kafka

Set up an Apache Kafka cluster or use an existing one. Refer to the official documentation (https://kafka.apache.org/documentation/) for guidance on installation and configuration.
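
The demo later in this post assumes a single broker reachable at localhost:9092 and a topic named 'temperatureData'. If your broker is not configured to create topics automatically (the auto.create.topics.enable setting), you can create the topic yourself from the Kafka installation directory, for example:

bin/kafka-topics.sh --create --topic temperatureData --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1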

Why Apache Kafka?

Apache Kafka forms a robust base for building event-driven architectures: it handles real-time data streams durably and at high throughput with low latency, which makes it a natural fit for the event bus in this pattern.


Developing the Demo:

Set Up the Project:

Create a new Node.js project and install the kafkajs client library:

npm init -y

npm install kafkajs

Create Producer (Producer.js):

  • Create a Kafka producer to send events.
  • Let's imagine a scenario where your IoT devices send temperature readings to the Kafka topic 'temperatureData'.
  • The producer will simulate sending temperature readings, and the consumer will process and handle these temperature readings.

const { Kafka, Partitioners } = require('kafkajs');

const kafka = new Kafka({
  clientId: 'iot-producer',
  brokers: ['localhost:9092'],
});

const producer = kafka.producer({
  createPartitioner: Partitioners.LegacyPartitioner,
});

const connectProducer = async () => {
  try {
    await producer.connect();
    console.log('Producer Connected to Kafka!');
  } catch (e) {
    console.error(`Error while connecting to Kafka: ${e}`);
  }
};

const produceTemperatureReading = async () => {
  try {
    const temperature = Math.floor(Math.random() * 50) + 1; // Simulate temperature readings
    const deviceID = Math.floor(Math.random() * 10) + 1; // Simulate different IoT devices

    await producer.send({
      topic: 'temperatureData',
      messages: [{ value: JSON.stringify({ deviceID, temperature }) }],
    });

    console.log(`Temperature reading sent successfully for DeviceID ${deviceID}: ${temperature}°C`);
  } catch (e) {
    console.error(`Error while producing temperature reading: ${e}`);
  }
};

// Connect to Kafka initially
connectProducer();

// Schedule temperature reading production every second
setInterval(async () => {
  await produceTemperatureReading();
}, 1000);

Create Consumer (Consumer.js):

  • Create a Kafka consumer to receive and process events.

const { Kafka } = require('kafkajs');

const kafka = new Kafka({
  clientId: 'iot-consumer',
  brokers: ['localhost:9092'],
});

const consumer = kafka.consumer({ groupId: 'iot-group' });

const processTemperatureReading = async (message) => {
  const { deviceID, temperature } = JSON.parse(message.value.toString());
  console.log(`Received Temperature Reading: DeviceID ${deviceID} - ${temperature}°C`);
};

const consumeTemperatureReadings = async () => {
  await consumer.connect();
  console.log('Consumer Connected to Kafka!');
  await consumer.subscribe({ topic: 'temperatureData', fromBeginning: true });

  await consumer.run({
    eachMessage: async ({ topic, partition, message }) => {
      await processTemperatureReading(message);
    },
  });
};

consumeTemperatureReadings().catch(console.error);
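
Neither script above disconnects from Kafka when you stop it. If you want a cleaner shutdown, kafkajs exposes a disconnect() method on both the producer and the consumer; a minimal sketch for the consumer (the producer version is analogous) could look like this:

// Optional: disconnect from Kafka cleanly when the process receives Ctrl+C
process.on('SIGINT', async () => {
  try {
    await consumer.disconnect();
    console.log('Consumer disconnected from Kafka');
  } finally {
    process.exit(0);
  }
});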

Run the Demo:

Open two terminal windows and run the producer and consumer:

Terminal 1 (Producer):

node Producer.js

Terminal 2 (Consumer):

node Consumer.js

You should see output showing that the producer sends a message every second and the consumer receives and processes it. The device IDs and temperatures will vary because they are generated randomly, but it should look roughly like this:

Output:
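
Terminal 1 (Producer):

Producer Connected to Kafka!
Temperature reading sent successfully for DeviceID 4: 23°C
Temperature reading sent successfully for DeviceID 7: 41°C

Terminal 2 (Consumer):

Consumer Connected to Kafka!
Received Temperature Reading: DeviceID 4 - 23°C
Received Temperature Reading: DeviceID 7 - 41°C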



End Note

This concludes our exploration of Event-Driven Architecture, leveraging the powerful combination of Kafka and Node.js. We looked at what event-driven architecture is, its key components, and its benefits, and then got hands-on experience setting up Kafka and building a demo application to understand an event-driven microservice.

We at Seaflux are your dedicated partners in the ever-evolving landscape of Cloud Computing. Whether you're contemplating a seamless cloud migration, exploring the possibilities of Kubernetes deployment, or harnessing the power of AWS serverless architecture, Seaflux is here to lead the way.

Have specific questions or ambitious projects in mind? Let's discuss! Schedule a meeting with us here, and let Seaflux be your trusted companion in unlocking the potential of cloud innovation. Your journey to a more agile and scalable future starts with us.
Jay Mehta - Director of Engineering
Jimit Raval - Technical Lead
