
Serverless Asynchronous REST APIs on AWS

In a previous article, we explored how to use the API Gateway WebSocket API to provide real-time, asynchronous feedback and responses to its clients. While WebSocket APIs can be an excellent option for building real-time asynchronous APIs between a browser and a server, there are situations where asynchronous REST APIs are a better fit: for example, if you need to ensure compatibility with existing architectures or to integrate with backend systems where setting up a messaging system such as a queue or an event bus is not an option.

This article provides a guide on how to build a Serverless Async Request/Response REST API on AWS.

First, let's design the API

Let's consider an application that lets clients perform long-running tasks through HTTP requests. These tasks might take a long time to complete (e.g. text-to-speech or audio-to-text conversion, report generation, etc.). To keep clients informed about the status of their requests, we'll need to create a separate status endpoint that clients can poll at regular intervals until the task is finished.

As an example, to initiate the task creation, clients POST this request:

POST /task
{
    "someProperty": "someValue",
    "anotherProperty": "anotherValue",
    ...
}

The API should then return an HTTP 202 Accepted response with a body containing a URI that references the task:

HTTP 202 (Accepted)
{
  "task": {
    "href": "/task/<taskId>",
    "id": "<taskId>"
  }
}

Clients can then poll that endpoint to retrieve the status of the task. The API should respond with a status corresponding to the current state of the task, such as inProgress, complete, or error. Additionally, the response payload should reflect the current status of the task.

Here is an example of a request with a response containing the payload of a task that is in progress:

GET /task/<taskId>

HTTP 200 (OK)
{
    "status": "inProgress",
    "taskCreationDate": "2023-04-04T19:23:42",
    "requestData": {...},
    ...
}

For server-to-server scenarios, we'll add a callback/webhook mechanism: when the task completes, the server updates the task state and sends a POST request to a callback URL that the client provides in the initial task creation request.

One solution is to include the callback URL in a header of the request so that the API can use it to send the response payload:

POST /task
Callback: https://foo.example.com/some-callback-resource
{
    "someProperty": "someValue",
    "anotherProperty": "anotherValue",
    ...
}

Important: security and reliability need to be taken into account when using callback URLs:

  • Security: It is important to verify that the callback URLs belong to the client who initiated the requests. We also need to ensure that the callback destination server receives legitimate data from the source server, rather than forged information from malicious actors attempting to spoof the webhook. This can be done by requiring clients to use API keys when making requests and, on the server side, associating these keys with well-known callback URLs that the clients must provide. In addition, HMAC signature verification can be used to authenticate and validate the payloads of API callbacks (a minimal verification sketch follows this list).


  • Failure handling: It's important to account for scenarios where the client may be unavailable or experiencing intermittent faults resulting in errors. To address this, we'll need to implement a retry mechanism that includes a dead-letter queue. This mechanism allows failed deliveries to be stored in the queue and resent at a later time.
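
As an illustration of the HMAC point above, here is a minimal sketch, in TypeScript, of how a client could verify the signature attached to an incoming callback. The x-signature header name and the way the shared secret is obtained are assumptions, not details from this article:

import { createHmac, timingSafeEqual } from "crypto";

// Minimal sketch: verify an HMAC-SHA256 signature attached to a callback request.
// The signature header name and the shared secret source are assumptions.
export const isValidCallbackSignature = (
  rawBody: string,
  receivedSignatureHex: string,
  sharedSecret: string
): boolean => {
  const expectedHex = createHmac("sha256", sharedSecret).update(rawBody).digest("hex");

  const expected = Buffer.from(expectedHex, "hex");
  const received = Buffer.from(receivedSignatureHex, "hex");

  // timingSafeEqual requires equal-length buffers and avoids timing side channels
  return expected.length === received.length && timingSafeEqual(expected, received);
};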

Alright, let's see how to implement this on AWS.

Architecture Overview

In this architecture, my goal was to use AWS integrations wherever possible to minimize the amount of glue code that needs to be written and maintained. This approach not only saves time and effort, but also helps ensure the scalability of the API:

Architecture overview

Breaking this down:

  • We create a direct AWS integration between the API Gateway REST API and the DynamoDB task table. In this table we store the context of the request: the client id, the request payload, the task status, and the task result once available. This table gets updated by the task processing workflow (a sketch of the table definition follows this overview).

  • Since we're using API Gateway as a proxy for the DynamoDB table, we rely on two Lambda authorizers to authenticate API calls. The first authorizer verifies two client request headers, the client's API key and the callback URL, to authorize POST /task requests. The second authorizer is dedicated to the GET /task/<taskId> route.

  • To trigger the task workflow, we rely on the table's DynamoDB stream. We connect the stream to the task state machine by using EventBridge Pipes. This Pipe selects only the newly created tasks.

  • The task is then run and coordinated by a Step Functions workflow. The status of the task is also updated in this workflow via a direct AWS SDK integration.

  • Once the task is complete, its state gets updated and the result payload gets written into the task table. We use another EventBridge Pipe to trigger the sending of the callback to the client from the stream of completed tasks.

Note: In this architecture, instead of using EventBridge API destinations, we'll write a custom Lambda to send the callbacks. I did this to achieve better flexibility and control over how the task result payload is sent to the clients. For example, a client may require multiple registered callback URLs. To accomplish this, we use an EventBus as the target destination for the output Pipe.
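
To make the rest of the walkthrough concrete, here is a minimal sketch of how the task table could be defined with the CDK, assumed to live inside a CDK stack. The attribute names (id, status, clientId, callbackUrl) are assumptions reused in the snippets below; the actual definition is in the linked repository:

import { RemovalPolicy } from "aws-cdk-lib";
import { AttributeType, BillingMode, StreamViewType, Table } from "aws-cdk-lib/aws-dynamodb";

// Minimal sketch of the task table: keyed on the task id, with a stream enabled
// so that EventBridge Pipes can react to inserts and status updates.
const taskTable = new Table(this, "TaskTable", {
  partitionKey: { name: "id", type: AttributeType.STRING },
  billingMode: BillingMode.PAY_PER_REQUEST,
  // NEW_AND_OLD_IMAGES gives the Pipes access to the full item after each change
  stream: StreamViewType.NEW_AND_OLD_IMAGES,
  removalPolicy: RemovalPolicy.DESTROY,
});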

TL;DR

You will find the complete source code along with the deployment pipeline here:
GitHub - ziedbentahar/async-rest-api-on-aws

In this example I will use Node.js, TypeScript and the CDK for IaC.

Let's see the code

1- API Gateway, creating the long-running task

Let's zoom into the POST /task integration with the DynamoDB task table. We associate this route with an authorizer that requires two headers to be present in the request: Authorization and Callback.

Once the request is authorized, this integration maps the request payload to a task entry. We use the API Gateway $context.requestId as the identifier of the task. Additionally, we map $context.authorizer.principalId (in our case the client id) so it can be used down the line. And we extract the callback URL from the request header using $util.escapeJavaScript($input.params('callback')):
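
Below is a sketch of what this request mapping template can look like, expressed as a VTL template passed to the CDK integration. The attribute names, the initial "created" status and the date format are assumptions; the actual template is in the linked repository:

// Sketch of the PutItem request mapping template for the POST /task route.
// taskTable is the DynamoDB table defined earlier; attribute names are assumptions.
const createTaskRequestTemplate = `{
  "TableName": "${taskTable.tableName}",
  "Item": {
    "id": { "S": "$context.requestId" },
    "clientId": { "S": "$context.authorizer.principalId" },
    "callbackUrl": { "S": "$util.escapeJavaScript($input.params('callback'))" },
    "status": { "S": "created" },
    "taskCreationDate": { "S": "$context.requestTime" },
    "requestData": { "S": "$util.escapeJavaScript($input.body)" }
  }
}`;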



When the task is successfully added to the table, we return an HTTP 202 Accepted response with a body containing the task id as well as a reference pointing to the task, built from $context.path/$context.requestId, which translates to the URI /<stage-name>/task/<task-id>.
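
Here is a sketch of how this can be wired together with the CDK, assuming the api, apiGatewayRole and createTaskAuthorizer constructs already exist (the names are illustrative, not the ones from the repository):

import { AwsIntegration, PassthroughBehavior } from "aws-cdk-lib/aws-apigateway";

// Direct DynamoDB PutItem integration for POST /task: the request template above
// writes the task item, and the integration response maps the result to an
// HTTP 202 with a reference to the newly created task.
const createTaskIntegration = new AwsIntegration({
  service: "dynamodb",
  action: "PutItem",
  options: {
    credentialsRole: apiGatewayRole, // role allowed to call PutItem on the task table
    passthroughBehavior: PassthroughBehavior.NEVER,
    requestTemplates: { "application/json": createTaskRequestTemplate },
    integrationResponses: [
      {
        statusCode: "202",
        responseTemplates: {
          "application/json": `{
  "task": {
    "href": "$context.path/$context.requestId",
    "id": "$context.requestId"
  }
}`,
        },
      },
    ],
  },
});

api.root.addResource("task").addMethod("POST", createTaskIntegration, {
  authorizer: createTaskAuthorizer,
  methodResponses: [{ statusCode: "202" }],
});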

You can find the complete CDK definition of the API Gateway here.

2- Create Task authorizer Lambda

As mentioned above, this authorizer validates the API key and checks whether the callback URL is associated with the API key that identifies the client:
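
Here is a minimal sketch of what this request authorizer can look like. The exact signature of validateTokenWithCallbackUrl and the header casing are assumptions; the real implementation is in the repository:

import type {
  APIGatewayAuthorizerResult,
  APIGatewayRequestAuthorizerEvent,
} from "aws-lambda";

// Assumed helper: returns the client associated with this API key and callback URL,
// or null if the pair is not valid.
declare const validateTokenWithCallbackUrl: (
  apiKey: string,
  callbackUrl: string
) => Promise<{ clientId: string } | null>;

export const handler = async (
  event: APIGatewayRequestAuthorizerEvent
): Promise<APIGatewayAuthorizerResult> => {
  const apiKey = event.headers?.Authorization ?? event.headers?.authorization;
  const callbackUrl = event.headers?.Callback ?? event.headers?.callback;

  const client =
    apiKey && callbackUrl
      ? await validateTokenWithCallbackUrl(apiKey, callbackUrl)
      : null;

  return {
    // exposed to the integration as $context.authorizer.principalId
    principalId: client?.clientId ?? "anonymous",
    policyDocument: {
      Version: "2012-10-17",
      Statement: [
        {
          Action: "execute-api:Invoke",
          Effect: client ? "Allow" : "Deny",
          Resource: event.methodArn,
        },
      ],
    },
  };
};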


Here, I do not dive into the implementation of validateTokenWithCallbackUrl; this is something that can be delegated to another service that owns the responsibility of performing the proper checks.

3- Defining the create task EventBridge Pipe

This EventBridge Pipe selects the INSERT events from the DynamoDB stream: the newly created tasks will trigger new state machine executions.
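
A sketch of this Pipe using the CfnPipe L1 construct, assuming pipeRole, taskTable and taskStateMachine are defined elsewhere in the stack:

import { CfnPipe } from "aws-cdk-lib/aws-pipes";

// Sketch of the create-task Pipe: it filters INSERT events from the task table
// stream and starts a state machine execution for each newly created task.
new CfnPipe(this, "CreateTaskPipe", {
  roleArn: pipeRole.roleArn,
  source: taskTable.tableStreamArn!,
  sourceParameters: {
    dynamoDbStreamParameters: {
      startingPosition: "LATEST",
      batchSize: 1,
    },
    filterCriteria: {
      // only newly created tasks trigger the workflow
      filters: [{ pattern: JSON.stringify({ eventName: ["INSERT"] }) }],
    },
  },
  target: taskStateMachine.stateMachineArn,
  targetParameters: {
    stepFunctionStateMachineParameters: {
      // the state machine runs asynchronously, so fire and forget
      invocationType: "FIRE_AND_FORGET",
    },
  },
});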


We'll need to set the invocation type to FIRE_AND_FORGET, as the execution of this state machine is asynchronous.

We'll also need to set these source and target policies:
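
Here is a sketch of what this role and its policies can look like: the Pipes service principal reads the table stream (source) and starts state machine executions (target). Names are illustrative:

import { PolicyStatement, Role, ServicePrincipal } from "aws-cdk-lib/aws-iam";

// Execution role assumed by the EventBridge Pipes service.
const pipeRole = new Role(this, "CreateTaskPipeRole", {
  assumedBy: new ServicePrincipal("pipes.amazonaws.com"),
});

// Source policy: read from the task table's DynamoDB stream.
pipeRole.addToPolicy(
  new PolicyStatement({
    actions: [
      "dynamodb:DescribeStream",
      "dynamodb:GetRecords",
      "dynamodb:GetShardIterator",
      "dynamodb:ListStreams",
    ],
    resources: [taskTable.tableStreamArn!],
  })
);

// Target policy: start executions of the task state machine.
pipeRole.addToPolicy(
  new PolicyStatement({
    actions: ["states:StartExecution"],
    resources: [taskStateMachine.stateMachineArn],
  })
);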


When deployed, this Pipe will look like this on the AWS console:

DynamoDB stream to Step Functions Pipe

4- Defining the handle completed task EventBridge Pipe

We apply the same principle as in the previous section. However, this Pipe selects the completed tasks from the DynamoDB stream, transforms them, and forwards them to the EventBus:
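
A sketch of this Pipe, again with CfnPipe. The filter on the status attribute, the "complete" value and the input template fields are assumptions based on the task attributes used earlier:

import { CfnPipe } from "aws-cdk-lib/aws-pipes";

// Sketch of the handle-completed-task Pipe: it selects tasks whose status was
// updated to "complete" and forwards a trimmed-down payload to the EventBus.
new CfnPipe(this, "CompletedTaskPipe", {
  roleArn: completedTaskPipeRole.roleArn,
  source: taskTable.tableStreamArn!,
  sourceParameters: {
    dynamoDbStreamParameters: { startingPosition: "LATEST", batchSize: 1 },
    filterCriteria: {
      filters: [
        {
          pattern: JSON.stringify({
            eventName: ["MODIFY"],
            dynamodb: { NewImage: { status: { S: ["complete"] } } },
          }),
        },
      ],
    },
  },
  target: callbackEventBus.eventBusArn,
  targetParameters: {
    eventBridgeEventBusParameters: {
      detailType: "task-completed",
      source: "async-task-api",
    },
    // keep only the fields the send-callback lambda needs
    inputTemplate: `{
      "taskId": <$.dynamodb.NewImage.id.S>,
      "clientId": <$.dynamodb.NewImage.clientId.S>,
      "callbackUrl": <$.dynamodb.NewImage.callbackUrl.S>,
      "result": <$.dynamodb.NewImage.result.S>
    }`,
  },
});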


We'll need to attach this role to the Pipe:
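
A sketch of the corresponding role: it mirrors the previous one, except that the target permission is events:PutEvents on the callback EventBus:

import { PolicyStatement, Role, ServicePrincipal } from "aws-cdk-lib/aws-iam";

// Same source permissions as the create-task Pipe role, but the target is the EventBus.
const completedTaskPipeRole = new Role(this, "CompletedTaskPipeRole", {
  assumedBy: new ServicePrincipal("pipes.amazonaws.com"),
});

completedTaskPipeRole.addToPolicy(
  new PolicyStatement({
    actions: [
      "dynamodb:DescribeStream",
      "dynamodb:GetRecords",
      "dynamodb:GetShardIterator",
      "dynamodb:ListStreams",
    ],
    resources: [taskTable.tableStreamArn!],
  })
);

completedTaskPipeRole.addToPolicy(
  new PolicyStatement({
    actions: ["events:PutEvents"],
    resources: [callbackEventBus.eventBusArn],
  })
);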

After deployment, the Pipe resource looks like this in the console:

DynamoDB stream to EventBus Pipe

5- Sending the Callbacks to the clients

To send callbacks, we associate an EventBridge rule with the EventBus. This rule matches all the events on the bus and defines the send-callback Lambda as the target:
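
A sketch of the rule and its Lambda target with the CDK, assuming callbackEventBus and sendCallbackLambda exist. Matching on the source value set by the Pipe is an assumption; the repository's rule may use a broader pattern. The retry and maximum-age settings correspond to the configuration described next:

import { Duration } from "aws-cdk-lib";
import { Rule } from "aws-cdk-lib/aws-events";
import { LambdaFunction } from "aws-cdk-lib/aws-events-targets";
import { Queue } from "aws-cdk-lib/aws-sqs";

// Route completed-task events from the callback EventBus to the send-callback lambda,
// with retries and a dead-letter queue for failed deliveries.
const callbackDlq = new Queue(this, "SendCallbackDlq");

new Rule(this, "SendCallbackRule", {
  eventBus: callbackEventBus,
  // matches the events published by the completed-task Pipe (source value is an assumption)
  eventPattern: { source: ["async-task-api"] },
  targets: [
    new LambdaFunction(sendCallbackLambda, {
      deadLetterQueue: callbackDlq,
      retryAttempts: 10,
      maxEventAge: Duration.hours(24),
    }),
  ],
});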


In some situations, callbacks may fail to be delivered due to temporary unavailability of the client or a network error. We configure EventBridge to retry the operation up to 10 times within a 24-hour time frame, after which the message is directed to a dead-letter queue to prevent message loss:

EventBridge rule configuration

Callback lambda

The callback lambda is a simple function that computes the HMAC of the message to be sent and then sends a POST request to the target callback URL.
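
Here is a minimal sketch of this function. The event detail shape, the X-Signature header name and the getClientSigningKey helper are assumptions for illustration; it also assumes a Node.js 18+ runtime where fetch is available:

import { createHmac } from "crypto";
import type { EventBridgeEvent } from "aws-lambda";

type TaskCompletedDetail = {
  taskId: string;
  clientId: string;
  callbackUrl: string;
  result: string;
};

// Assumed helper: retrieves the per-client signing key (e.g. from a secrets store).
declare const getClientSigningKey: (clientId: string) => Promise<string>;

// The clientId is used to look up the signing key, then an HMAC-SHA256 signature
// of the payload is produced.
const computeHMAC = async (clientId: string, payload: string): Promise<string> => {
  const signingKey = await getClientSigningKey(clientId);
  return createHmac("sha256", signingKey).update(payload).digest("hex");
};

export const handler = async (
  event: EventBridgeEvent<"task-completed", TaskCompletedDetail>
): Promise<void> => {
  const { taskId, clientId, callbackUrl, result } = event.detail;

  const body = JSON.stringify({ taskId, status: "complete", result });

  const response = await fetch(callbackUrl, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-Signature": await computeHMAC(clientId, body),
    },
    body,
  });

  if (!response.ok) {
    // throwing lets EventBridge retry the delivery and eventually route it to the DLQ
    throw new Error(`Callback delivery failed with status ${response.status}`);
  }
};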


In the computeHMAC function, the clientId is used to retrieve the key necessary for generating the HMAC signature of the response payload.

Wrapping up

In this article, we've seen how to create a serverless, asynchronous REST API on AWS. By leveraging API Gateway AWS integrations and EventBridge Pipes, it is possible to build such an API with minimal Lambda glue code. This approach not only simplifies the development process, but also allows greater flexibility and scalability.

You can find the complete source code and the deployment pipeline here:
GitHub - ziedbentahar/async-rest-api-on-aws

Thanks for reading!

Further reading

Event retry policy and using dead-letter queues
Asynchronous Request-Reply pattern - Azure Architecture Center
Amazon DynamoDB stream as a source

