June 15, 2022 07:49 am GMT

Triggering a Lambda function using SQS to populate DynamoDB

This post is intended for beginners who want to understand how event-driven architectures communicate in AWS. The repository holding the Python scripts we will use is linked at the end of this post.

Log into your AWS console with your IAM credentials, select the region closest to you & follow the steps below:

1. Creating an SQS (Simple Queue Service) queue:

Create a standard SQS queue called "DynamoDBMessages", which will receive the messages sent by the EC2 instance we create in step 5.


There is no need to change the access policy or the configuration; we can keep the defaults.
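If you prefer to script this step rather than click through the console, a minimal boto3 sketch along these lines should work (the region here is an assumption; use the one you selected):

```python
import boto3

# Create the queue with default (standard) settings
sqs = boto3.client("sqs", region_name="us-east-1")  # region is an assumption
resp = sqs.create_queue(QueueName="DynamoDBMessages")
print("Queue URL:", resp["QueueUrl"])
```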


2. Lambda execution role:

We need to grant access to the Lambda service so that the function created in the next step can poll the SQS queue for messages, write those messages to the DynamoDB table & write logs to CloudWatch Logs.
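If you want to create the role programmatically, a boto3 sketch along these lines should work; the role name is a made-up example, and the two managed policies cover SQS polling plus CloudWatch Logs, and DynamoDB writes, respectively:

```python
import json
import boto3

iam = boto3.client("iam")

# Trust policy letting the Lambda service assume this role
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

iam.create_role(
    RoleName="lambda-sqs-dynamodb-role",  # hypothetical name
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)

for policy_arn in [
    "arn:aws:iam::aws:policy/service-role/AWSLambdaSQSQueueExecutionRole",
    "arn:aws:iam::aws:policy/AmazonDynamoDBFullAccess",
]:
    iam.attach_role_policy(RoleName="lambda-sqs-dynamodb-role", PolicyArn=policy_arn)
```

In a real setup, a scoped-down inline policy (e.g. dynamodb:PutItem on the one table) would be safer than full DynamoDB access.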


3. Lambda function:

This function will be triggered whenever messages are placed on the DynamoDBMessages queue created in step 1.

Provide the function name & select Python 3.9 as the runtime, since we will be running a function written in Python.
The architecture can be left at the default x86_64. Under permissions, choose the execution role created in step 2, then click on Create Function.

Once the Function Overview screen opens, click on "Add Trigger", select SQS in the trigger configuration & select the queue we created above.
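Adding the trigger in the console creates an event source mapping under the hood; a boto3 equivalent would look roughly like this (the function name sqs-to-dynamodb is an assumption, substitute your own):

```python
import boto3

sqs = boto3.client("sqs")
lambda_client = boto3.client("lambda")

# Resolve the queue's ARN from its name
queue_url = sqs.get_queue_url(QueueName="DynamoDBMessages")["QueueUrl"]
queue_arn = sqs.get_queue_attributes(
    QueueUrl=queue_url, AttributeNames=["QueueArn"]
)["Attributes"]["QueueArn"]

# Wire the queue to the Lambda function
lambda_client.create_event_source_mapping(
    EventSourceArn=queue_arn,
    FunctionName="sqs-to-dynamodb",  # hypothetical function name
    BatchSize=10,
)
```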


Select the Code tab & replace the contents of lambda_function.py with the code attached (filename = lambda_function.py) in the repository given below.
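The repository's file is the source of truth; as a rough illustration, a handler for this pipeline has the following shape (the table name is from step 4; treating each message body as JSON that already carries the table's key attribute is an assumption):

```python
import json
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("Messages")

def lambda_handler(event, context):
    # Each invocation delivers a batch of SQS records
    for record in event["Records"]:
        item = json.loads(record["body"])  # assumes the producer sends JSON
        table.put_item(Item=item)          # assumes the body includes the key attribute
    return {"statusCode": 200, "processed": len(event["Records"])}
```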


Next, deploy the code by clicking on Deploy.

4. Create a DynamoDB table:

Create a table by the name of Messages.
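A boto3 sketch for creating the table; the partition key name and type here are assumptions, so match them to whatever lambda_function.py actually writes:

```python
import boto3

dynamodb = boto3.client("dynamodb")
dynamodb.create_table(
    TableName="Messages",
    KeySchema=[{"AttributeName": "id", "KeyType": "HASH"}],  # assumed key name
    AttributeDefinitions=[{"AttributeName": "id", "AttributeType": "S"}],
    BillingMode="PAY_PER_REQUEST",  # no capacity planning needed for a demo
)
```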


5. Create an EC2 Instance:

This instance will execute the code that sends messages to SQS. Be sure to select the t2.micro instance type, which is free-tier eligible.


Select the EC2 instance once it is in the running state & click on Connect; alternatively, we can use the AWS CLI. Once you are able to SSH into the instance (this requires the key pair .pem file created at the time of instance creation), you can run the scp command to copy the send_message.py file from your local machine to the instance.
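A typical invocation would be something like the following; the ec2-user login assumes an Amazon Linux AMI, and the placeholders are yours to fill in:

```
scp -i <your-keypair>.pem send_message.py ec2-user@<instance-public-dns>:~/
```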


To run the command below, install the boto3 & faker packages on the EC2 instance, as these are not pre-installed. Next, you might face a "No Credentials Error" thrown by the boto3 package. To overcome this, there are three options:

  1. Run aws configure & provide the access key ID, secret access key, region & output format.
  2. Edit the credentials file in the ~/.aws folder. Create the folder/file if not present & then enter the above access key & region details (see the sketch after this list).
  3. Use the STS service to get temporary credentials for package execution on EC2.

The first two options are not safe, as any other user having access to this EC2 instance can get hold of these credentials.
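For option 2, the shared config files follow a standard INI layout; a sketch with placeholder values:

```
# ~/.aws/credentials
[default]
aws_access_key_id = <your-access-key-id>
aws_secret_access_key = <your-secret-access-key>

# ~/.aws/config
[default]
region = <your-region>
```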

Once the above steps are complete, run the below command to trigger sending of messages to the DynamoDBMessages queue.

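The original screenshot shows the exact command; given the script name and the -q option described below, it is presumably of this form:

```
# one-time setup (the packages mentioned above)
pip3 install boto3 faker

# send messages to the queue named via -q
python3 send_message.py -q DynamoDBMessages
```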

The SQS queue name created should follow the -q option in the above command.

Press Ctrl+C once you see an adequate number of messages being sent.


Check the DynamoDB table "Messages" for the messages that travelled the EC2 -> SQS -> Lambda -> DynamoDB path.
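A quick way to sample the table from anywhere with credentials is a small boto3 scan sketch:

```python
import boto3

table = boto3.resource("dynamodb").Table("Messages")
resp = table.scan(Limit=10)  # fetch a small sample of items
print(resp["Count"], "items returned")
for item in resp["Items"]:
    print(item)
```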

For any errors, one can also check the CloudWatch Logs, under the Lambda function's log group.


GitHub Repository:

https://github.com/neetu-mallan/SQSLambdaDynamoDB/tree/master


Original Link: https://dev.to/neetumallan/triggering-a-lambda-function-using-sqs-to-populate-dynamodb-4n3i
