NodeJS with Kafka: Build Async Programs with ease

Sync and Async Programming

In the modern world, software development comes with many complexities, and there are smart architectures to deal with them. For example, if we build an application and later decide it has to handle thousands or millions of users, we have to make smart decisions. One option is to take a microservices approach.

On the other hand, some services can slow our system down if everything runs synchronously.

Take the example of buying a flight ticket online. The booking succeeds only after the payment goes through; this is synchronous programming. On the other hand, the booking should not depend on the email, i.e. on whether the booking confirmation email was delivered or not.

So, email and booking are not directly related, and booking completion should not wait for the email to be sent. We can therefore run the email step in the background, i.e. asynchronously.
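As a rough Node.js sketch of that difference (saveBooking and sendEmail here are hypothetical placeholders, not code from the example repo):

// Hypothetical helpers for illustration only (not from the example repo).
async function saveBooking(order) {
  return { id: 1, ...order };                    // pretend the booking was stored
}

async function sendEmail(booking) {
  console.log(`email sent for booking ${booking.id}`);
}

// Synchronous flow: the caller waits for the email before the booking "completes".
async function bookTicketSync(order) {
  const booking = await saveBooking(order);
  await sendEmail(booking);
  return booking;
}

// Asynchronous flow: the email runs in the background and the booking completes immediately.
async function bookTicketAsync(order) {
  const booking = await saveBooking(order);
  sendEmail(booking).catch(console.error);       // fire-and-forget
  return booking;
}

bookTicketAsync({ flight: 'DEL-BLR' }).then((b) => console.log('booked', b.id));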

Refer to the images below for clarification:

  • Synchronous Programming
  • Asynchronous Programming using Message Queues

Here, we can see that a message queue has been used: producers submit events and different consumers consume them. Note: this is just a simple example.

Message Queues

Message queues are used for service-to-service communication and give us asynchronous behaviour. They are mainly used in serverless and microservices architectures.
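To make the idea concrete, here is a toy in-memory queue in Node.js. This is only an illustration of how producers and consumers are decoupled, not a real broker (real message queues add persistence, partitioning, and delivery guarantees):

import { EventEmitter } from 'events';

// A toy in-memory "queue" built on an event emitter.
const queue = new EventEmitter();

// Consumer: reacts whenever a message arrives, regardless of who produced it.
queue.on('message', (msg) => console.log('consumed:', msg));

// Producer: hands the message off and moves on without waiting for the consumer.
function produce(msg) {
  setImmediate(() => queue.emit('message', msg));
  console.log('produced:', msg);
}

produce({ category: 'CAT', noise: 'meow' });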

There are various message queues out there in the market:

  • Apache Kafka
  • Azure Scheduler
  • Nastel
  • Apache Qpid
  • RabbitMQ

We will look at Kafka with NodeJS today so that you can easily use it in your projects.

You can refer to this GitHub repo for reference.

Implementation

For reference:

(Diagram: Kafka topics)

Kafka will run inside a Docker container; the producer will add events to the Kafka topic (the queue) and the consumer will consume them.

  • Docker Compose File
version: "3"services:  zookeeper:    image: 'bitnami/zookeeper:latest'    ports:      - '2181:2181'    environment:      - ALLOW_ANONYMOUS_LOGIN=yes  kafka:    image: 'bitnami/kafka:latest'    container_name: 'kafka'    ports:      - '9092:9092'    environment:      - KAFKA_BROKER_ID=1      - KAFKA_LISTENERS=PLAINTEXT://:9092      - KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://127.0.0.1:9092      - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181      - ALLOW_PLAINTEXT_LISTENER=yes    depends_on:      - zookeeper

Here we are running the Kafka image inside a Docker container and have exposed port 9092 to the outside world. ZooKeeper is a dependency that Kafka needs in order to run.
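After starting everything with docker-compose up -d, you can sanity-check that the broker is reachable on localhost:9092. Here is a minimal sketch using node-rdkafka's standard Producer API (separate from the stream-based producer used below):

import Kafka from 'node-rdkafka';

// Minimal connectivity check against the broker exposed by the compose file.
const producer = new Kafka.Producer({ 'metadata.broker.list': 'localhost:9092' });

producer.on('ready', () => {
  console.log('connected to Kafka on localhost:9092');
  producer.disconnect();
});

producer.on('event.error', (err) => {
  console.error('broker not reachable:', err);
});

producer.connect();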

  • eventType.js file
import avro from 'avsc';

export default avro.Type.forSchema({
  type: 'record',
  fields: [
    {
      name: 'category',
      type: { type: 'enum', symbols: ['DOG', 'CAT'] }
    },
    {
      name: 'noise',
      type: 'string',
    }
  ]
});

This is the Avro schema we define for the events that will be submitted to the Kafka topic.
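You can check the schema locally before wiring it up to Kafka. A quick round trip with avsc's toBuffer/fromBuffer (the import path here assumes the script sits next to eventType.js):

import eventType from './eventType.js';

// Encode an event to a compact Avro buffer and decode it back.
const buf = eventType.toBuffer({ category: 'DOG', noise: 'bark' });
console.log(buf.length, 'bytes');

const decoded = eventType.fromBuffer(buf);
console.log(decoded.category, decoded.noise);   // DOG bark

// isValid() lets you reject events that do not match the schema before encoding them.
console.log(eventType.isValid({ category: 'BIRD', noise: 'tweet' }));   // false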

  • Producer index.js
import Kafka from 'node-rdkafka';
import eventType from '../eventType.js';

const stream = Kafka.Producer.createWriteStream({
  'metadata.broker.list': 'localhost:9092'
}, {}, {
  topic: 'test'
});

stream.on('error', (err) => {
  console.error('Error in our kafka stream');
  console.error(err);
});

function queueRandomMessage() {
  const category = getRandomAnimal();
  const noise = getRandomNoise(category);
  const event = { category, noise };
  const success = stream.write(eventType.toBuffer(event));
  if (success) {
    console.log(`message queued (${JSON.stringify(event)})`);
  } else {
    console.log('Too many messages in the queue already..');
  }
}

function getRandomAnimal() {
  const categories = ['CAT', 'DOG'];
  return categories[Math.floor(Math.random() * categories.length)];
}

function getRandomNoise(animal) {
  if (animal === 'CAT') {
    const noises = ['meow', 'purr'];
    return noises[Math.floor(Math.random() * noises.length)];
  } else if (animal === 'DOG') {
    const noises = ['bark', 'woof'];
    return noises[Math.floor(Math.random() * noises.length)];
  } else {
    return 'silence..';
  }
}

setInterval(() => {
  queueRandomMessage();
}, 3000);

Here, we produce a random message every 3000 ms. This runs asynchronously because setInterval() is provided by the Node.js runtime (backed by its native C/C++ layer), so the timer callbacks are scheduled on the event loop without blocking the rest of the program.
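When stream.write() returns false, the producer's internal buffer is full. Since createWriteStream returns a Node.js writable stream, one option (a sketch under that assumption, not code from the repo) is to pause producing until the stream emits 'drain':

import Kafka from 'node-rdkafka';
import eventType from '../eventType.js';

const stream = Kafka.Producer.createWriteStream(
  { 'metadata.broker.list': 'localhost:9092' }, {}, { topic: 'test' }
);

// Back off when the write buffer is full and resume once it has been flushed.
let paused = false;
stream.on('drain', () => { paused = false; });

function writeWithBackpressure(event) {
  if (paused) return false;                       // skip this tick instead of piling up
  const ok = stream.write(eventType.toBuffer(event));
  if (!ok) paused = true;                         // wait for 'drain' before writing again
  return ok;
}

setInterval(() => writeWithBackpressure({ category: 'CAT', noise: 'purr' }), 3000);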

  • Consumer index.js
import Kafka from 'node-rdkafka';
import eventType from '../eventType.js';

var consumer = new Kafka.KafkaConsumer({
  'group.id': 'kafka',
  'metadata.broker.list': 'localhost:9092',
}, {});

consumer.connect();

consumer.on('ready', () => {
  console.log('consumer ready..')
  consumer.subscribe(['test']);
  consumer.consume();
}).on('data', function(data) {
  console.log(`received message: ${eventType.fromBuffer(data.value)}`);
});

The consumer subscribes to the test topic, consumes its messages, and prints each one.
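In a real service you would also want to shut the consumer down cleanly. A small sketch that extends the consumer above (assuming the standard node-rdkafka disconnect flow):

// Graceful shutdown: disconnect the consumer on Ctrl+C instead of killing it mid-poll.
process.on('SIGINT', () => {
  console.log('shutting down consumer..');
  consumer.disconnect();
});

consumer.on('disconnected', () => {
  console.log('consumer disconnected');
  process.exit(0);
});

// Errors from the underlying librdkafka client surface on 'event.error'.
consumer.on('event.error', (err) => console.error('consumer error:', err));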

NOTE: To run the producer and the consumer (a possible package.json setup is sketched after this list):

  • Run npm run start:producer
  • Run npm run start:consumer
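
The exact scripts depend on the repo, but a package.json along these lines would make those commands work (the file paths and the use of "type": "module" for ES imports are assumptions, not taken from the repo):

{
  "type": "module",
  "scripts": {
    "start:producer": "node producer/index.js",
    "start:consumer": "node consumer/index.js"
  }
}

You would also need node-rdkafka and avsc installed (npm install node-rdkafka avsc).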

That's it for today. Do save and comment if you found this valuable. You can build your own projects using this simple example as a reference.


Original Link: https://dev.to/lovepreetsingh/nodejs-with-kafka-build-async-programs-with-ease-11n3
