
Create Instance Scheduler on Serverless by using Lambda, DynamoDB, API Gateway, Cognito, S3 and CloudFront

By using an instance scheduler, you can start, stop, or schedule EC2/RDS instances to save costs.

The instance scheduler is open source and based on AWS serverless architecture.

Why do I use serverless?
Because:

  • No servers to manage.
  • Never pay for idle resources; only pay per event.
  • Scale with resilience and flexibility.
  • Fast development.

The App Architecture

[Image: instance scheduler architecture]

In the application architecture:
Frontend
The user requests content from the CloudFront service to get the HTML, CSS, and JavaScript. CloudFront caches the static files from S3.
Backend
Once the frontend is loaded in the user's browser, the user signs up or signs in with the Cognito service, which responds with a JWT token.

The user can then perform HTTP GET/POST/PUT/DELETE requests against the API Gateway with the JWT token. API Gateway checks with the Cognito service that the JWT token is valid, and then invokes the Lambda function with the request.

The Lambda function inspects the event, extracts the user's email from the JWT token, and adds/updates/deletes the user's data in DynamoDB.
Lambda also interacts with the Key Management Service (KMS) to encrypt/decrypt sensitive data stored in DynamoDB.
For email notifications, Lambda uses the Simple Email Service (SES) to send emails to the users.
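
To make the backend flow concrete, here is a minimal, hypothetical Lambda handler sketch (not the repo's actual Zappa app) showing how the user's email can be read from the Cognito claims that API Gateway forwards, and how an item might be written to the instance-scheduler table created later in step 1. The request body fields are made up for illustration.

import json
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("instance-scheduler")

def handler(event, context):
    # With a Cognito User Pool authorizer, API Gateway forwards the verified
    # JWT claims to Lambda inside the request context.
    claims = event["requestContext"]["authorizer"]["claims"]
    email = claims["email"]

    body = json.loads(event.get("body") or "{}")

    # Store the request under the user's email (partition key) and a
    # "created" sort key, matching the table schema used by this app.
    table.put_item(Item={
        "email": email,
        "created": body.get("created", "schedule"),
        "dtime": body.get("dtime", ""),
    })

    return {"statusCode": 200, "body": json.dumps({"saved": True})}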

Database
The data is saved in DynamoDB.
The user's email is the partition key and created is the sort key; the email is always unique, while the sort key value varies (a date, account information, or AWS keys).
The attribute named dtime is also used as a Global Secondary Index (GSI) to get all users for a specific date and time.
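
For illustration, here is a minimal sketch of how this key design can be queried with boto3 (the email address and the dtime format are made up; the app's real item layout may differ):

import boto3
from boto3.dynamodb.conditions import Key

table = boto3.resource("dynamodb").Table("instance-scheduler")

# All items for one user: the email partition key selects the user,
# and the "created" sort key separates the item types stored for them.
user_items = table.query(
    KeyConditionExpression=Key("email").eq("user@example.com")
)["Items"]

# All users due at a given date and time, through the dtime GSI.
due_items = table.query(
    IndexName="dtime-index",
    KeyConditionExpression=Key("dtime").eq("2022-05-21 23:00")
)["Items"]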

Scheduler
CloudWatch is used as a cron job that runs every hour. It invokes the Lambda function, which queries the Global Secondary Index (dtime) of DynamoDB for the specific date and time, determines which EC2/RDS instances need to be started or stopped, and then notifies the user by email or webhook.
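
Continuing the sketch from the Database section, the hourly handler could look roughly like this. The item attributes (instance_id, action), the dtime format, and the sender address are assumptions; the real function in the repo differs.

import datetime
import boto3
from boto3.dynamodb.conditions import Key

table = boto3.resource("dynamodb").Table("instance-scheduler")
ec2 = boto3.client("ec2")
ses = boto3.client("ses")

def handler(event, context):
    # CloudWatch invokes this every hour; fetch everything due right now
    # through the dtime GSI.
    now = datetime.datetime.utcnow().strftime("%Y-%m-%d %H:00")
    due = table.query(
        IndexName="dtime-index",
        KeyConditionExpression=Key("dtime").eq(now),
    )["Items"]

    for item in due:
        instance_id = item["instance_id"]       # hypothetical attribute
        action = item.get("action", "stop")     # hypothetical attribute
        if action == "start":
            ec2.start_instances(InstanceIds=[instance_id])
        else:
            ec2.stop_instances(InstanceIds=[instance_id])

        # Notify the owner by email (the sender must be verified in SES, step 3).
        ses.send_email(
            Source="notifications@example.com",  # hypothetical verified sender
            Destination={"ToAddresses": [item["email"]]},
            Message={
                "Subject": {"Data": "Instance scheduler"},
                "Body": {"Text": {"Data": f"{action} {instance_id} at {now}"}},
            },
        )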

Steps to deploy the App:
You need to have the AWS CLI installed and configured with an AWS access key and secret key.

Clone the app locally:

git clone https://github.com/maradwan/instance-scheduler.git

1- Create a DynamoDB table

aws dynamodb create-table \
    --table-name instance-scheduler \
    --attribute-definitions AttributeName=email,AttributeType=S AttributeName=created,AttributeType=S AttributeName=dtime,AttributeType=S \
    --key-schema AttributeName=email,KeyType=HASH AttributeName=created,KeyType=RANGE \
    --provisioned-throughput ReadCapacityUnits=1,WriteCapacityUnits=1 \
    --global-secondary-indexes "IndexName=dtime-index,\
    KeySchema=[{AttributeName=dtime,KeyType=HASH}],\
    Projection={ProjectionType=ALL},\
    ProvisionedThroughput={ReadCapacityUnits=1,WriteCapacityUnits=1}"

2- Create Cognito service

aws cognito-idp create-user-pool --pool-name instance-scheduler --schema Name=email,Required=true --auto-verified-attributes email --verification-message-template DefaultEmailOption=CONFIRM_WITH_LINK

aws cognito-idp list-user-pools --max-results 20

Get the Id of the instance-scheduler user pool from the output and create the user pool client:

aws cognito-idp create-user-pool-client --client-name my-user-pool-client --user-pool-id eu-west-1_2z7VlI6nq

Take a note of the UserPoolId and ClientId:

{    "UserPoolClient": {        "UserPoolId": "eu-west-1_2z7VlI6nq",        "ClientName": "my-user-pool-client",        "ClientId": "gqmnvsr3e8jmbg6ptaul30v9g",        "LastModifiedDate": 1653128931.722,        "CreationDate": 1653128931.722,        "RefreshTokenValidity": 30,        "AllowedOAuthFlowsUserPoolClient": false    }}

Take a note of the ARN of the Cognito user pool:

aws cognito-idp describe-user-pool --user-pool-id eu-west-1_2z7VlI6nq

Create a custom domain name; it should be a unique name:

aws cognito-idp create-user-pool-domain --domain instance-scheduler --user-pool-id eu-west-1_2z7VlI6nq
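
To sanity-check the pool, a hypothetical boto3 snippet like the one below can sign a user up and fetch an ID token, which is the JWT the frontend later sends to API Gateway. Note that the USER_PASSWORD_AUTH flow is not enabled on a new client by default, so treat this as a sketch of the idea rather than something the commands above already permit; the username and password are placeholders.

import boto3

CLIENT_ID = "gqmnvsr3e8jmbg6ptaul30v9g"  # ClientId from the previous step

cognito = boto3.client("cognito-idp", region_name="eu-west-1")

# Register a user; Cognito sends the verification link configured above.
cognito.sign_up(
    ClientId=CLIENT_ID,
    Username="user@example.com",
    Password="S0me-Str0ng-Passw0rd!",
    UserAttributes=[{"Name": "email", "Value": "user@example.com"}],
)

# After the email is confirmed, exchange the credentials for tokens.
# Requires the ALLOW_USER_PASSWORD_AUTH flow to be enabled on the client.
resp = cognito.initiate_auth(
    ClientId=CLIENT_ID,
    AuthFlow="USER_PASSWORD_AUTH",
    AuthParameters={"USERNAME": "user@example.com", "PASSWORD": "S0me-Str0ng-Passw0rd!"},
)
id_token = resp["AuthenticationResult"]["IdToken"]  # JWT used against API Gateway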

3- Verify your email with Simple Email Service (SES); it will be used for sending notifications

aws ses verify-email-identity --email-address [email protected] --region eu-west-1

You will receive an email from "Amazon Web Services"; you need to click on the link to verify your email.

Check whether your email has been verified:

aws ses list-identities
{    "Identities": [        "[email protected]"    ]}

4- Create the hourly scheduler Lambda, KMS key, and CloudWatch event by using Terraform.
Edit the file scheduler-lambda/scheduler-lambda.tf and change the following:

  • "SOURCE_EMAIL" to your verified email in step 3
  • "REGION_NAME" to the region you are using

After that, apply the following commands and take a note of the KMS key_id:

cd scheduler-lambda
terraform init
terraform apply

5- Create API Gateway and Lambda by using Zappa

Edit the file app/zappa_settings.json and change the following:

  • "provider_arns" to your Cognito ARN in the step 2
  • "USERPOOLID" and "COGNITOR_CLIENTID" to your Cognito UserPoolId and ClientId in step 2
  • "SOURCE_EMAIL" to your verified email in step 3
  • "s3_bucket" bucket named should be unique globally
  • "REGION_NAME" to which region you are using
  • "KMS_KEY_ID" to your KMS id in the step 4

After that, apply the following commands:

virtualenv -p python3 env3
source env3/bin/activate
pip install -r app/requirements.txt
cd app
zappa deploy

Take a note of your API Gateway URL; it looks like this:

Your updated Zappa deployment is live!: https://74628jl936.execute-api.eu-west-1.amazonaws.com/staging

6- Update the KMS key policy to allow the Lambda from step 5 to encrypt/decrypt

Add the following role to scheduler-lambda/kms.tf, under the statements with "Sid" values "Allow use of the key" and "Allow attachment of persistent resources":

["arn:aws:iam::${local.account_id}:role/instance-scheduler-staging-ZappaLambdaExecutionRole", "${aws_iam_role.lambda_role.arn}"]

After that, apply Terraform again:

cd scheduler-lambda
terraform apply
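
Once the role is allowed in the key policy, the app's Lambda can call KMS roughly like this. This is a minimal sketch, assuming KMS_KEY_ID is the key id noted in step 4; the app's actual field names and what exactly it encrypts may differ.

import boto3

kms = boto3.client("kms")
KMS_KEY_ID = "your-kms-key-id"  # from step 4

# Encrypt a sensitive value (e.g. a user's AWS secret key) before storing it in DynamoDB.
ciphertext = kms.encrypt(
    KeyId=KMS_KEY_ID,
    Plaintext=b"example-secret-value",
)["CiphertextBlob"]

# Later, decrypt it when the scheduler needs to act on the user's behalf.
plaintext = kms.decrypt(CiphertextBlob=ciphertext)["Plaintext"]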

7- Create CloudFront and S3
The bucket should have a unique name. Edit cloudfront-s3-sync.sh and change the bucket name and region name. The script creates the S3 bucket and the CloudFront distribution, then syncs the html folder to the bucket.
Edit the file html/js/config.js and change the following:

  • "userPoolId", "userPoolClientId" and "region" to your UserPoolId and ClientId from step 2 and the region you are using
  • "invokeUrl" to your API Gateway URL from step 5. It looks like this:
window._config = {
    cognito: {
        userPoolId: 'eu-west-1_XXX',
        userPoolClientId: 'XXX',
        region: 'eu-west-1'
    },
    api: {
        invokeUrl: 'https://api.example.com'
    }
};

Then run the script

sh cloudfront-s3-sync.sh

Wait about 5 minutes until the CloudFront distribution is deployed, then log in to the URL.

Test the App
Create a new IAM user with programmatic access only and assign the following policy.
Replace REGION_NAME, ACCOUNT_NUMBER and INSTANCE_ID.

{    "Version": "2012-10-17",    "Statement": [        {            "Effect": "Allow",            "Action": [                "ec2:StartInstances",                "ec2:StopInstances"            ],            "Resource": "arn:aws:ec2:REGION_NAME:ACCOUNT_NUMBER:instance/INSTANCE_ID"        }    ]}

For the RDS policy (replace REGION_NAME, ACCOUNT_NUMBER and DB_NAME):

{    "Version": "2012-10-17",    "Statement": [        {            "Sid": "VisualEditor0",            "Effect": "Allow",            "Action": [                "RDS:StopDBInstance",                "RDS:StartDBInstance"            ],            "Resource": [                "arn:aws:rds:REGION_NAME:ACCOUNT_NUMBER:db:DB_NAME"            ]        }    ]}
  • Create a key in the app
    [Image: access-keys]

  • Add the EC2 and RDS instances

[Image: instance-scheduler]
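
Behind the scenes, the scheduler can use the stored access keys of that dedicated IAM user to act on the instances you added. Below is a rough sketch; how the app actually retrieves the decrypted keys, and the instance identifiers, are assumptions for illustration.

import boto3

# Credentials of the dedicated IAM user created above, as stored (encrypted)
# by the app and decrypted via KMS when the schedule fires.
session = boto3.Session(
    aws_access_key_id="AKIA-EXAMPLE",
    aws_secret_access_key="decrypted-secret",
    region_name="eu-west-1",
)

# The limited policies above only allow start/stop on the listed resources.
session.client("ec2").stop_instances(InstanceIds=["i-0123456789abcdef0"])
session.client("rds").stop_db_instance(DBInstanceIdentifier="mydb")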


Original Link: https://dev.to/aws-builders/create-instance-scheduler-on-serverless-by-using-lambda-dynamodb-api-gateway-cognitos3-and-cloudfront-1op8
