Set Up an Event-Based Architecture With AWS
Today we will design a simple system composed of a few AWS services, namely SQS, DynamoDB and Lambda, which lets us receive data from any client and save it to DynamoDB. Here is a short summary.
First, we will push a message to SQS. Then a Lambda function will be triggered and will read the message from SQS. Finally, the Lambda function will save the message to DynamoDB. But before starting the project, let's talk about event source mappings. An event source mapping is an AWS Lambda resource that reads from an event source and invokes a Lambda function. You can use event source mappings to process items from a stream or queue in services that don't invoke Lambda functions directly (more detail).
Create the Lambda and SQS
First we will create an execution role that gives our function permission to access AWS resources. The role is named lambda-sqs-role and it has one managed policy attached, AWSLambdaSQSQueueExecutionRole.
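The role can be created in the IAM console. If you prefer to create it from code, here is a minimal sketch with the Node.js SDK; the role name matches the one used in this post, and the trust policy simply lets Lambda assume the role.

var AWS = require('aws-sdk');
var iam = new AWS.IAM();

// Trust policy that allows the Lambda service to assume this role.
var trustPolicy = {
  Version: '2012-10-17',
  Statement: [{
    Effect: 'Allow',
    Principal: { Service: 'lambda.amazonaws.com' },
    Action: 'sts:AssumeRole'
  }]
};

iam.createRole({
  RoleName: 'lambda-sqs-role',
  AssumeRolePolicyDocument: JSON.stringify(trustPolicy)
}).promise()
  // Attach the managed policy that lets Lambda poll SQS and write logs.
  .then(() => iam.attachRolePolicy({
    RoleName: 'lambda-sqs-role',
    PolicyArn: 'arn:aws:iam::aws:policy/service-role/AWSLambdaSQSQueueExecutionRole'
  }).promise())
  .then(() => console.log('role ready'))
  .catch(console.error);

With the role in place, we can write the Lambda handler that reads messages from SQS.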
// Log the body of every SQS record in the incoming batch.
exports.handler = async function(event, context) {
  event.Records.forEach(record => {
    const { body } = record;
    console.log(body);
  });
  return {};
};
Create a deployment package for the Lambda function.
zip function.zip index.js
Create a Lambda function with the create-function command.
aws lambda create-function --function-name ProcessSQSRecord \
--zip-file fileb://function.zip --handler index.handler --runtime nodejs12.x \
--role arn:aws:iam::123456789012:role/lambda-sqs-role --profile personal
Here is our first function.
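Next we need a queue for the function to read from. We will create it in the Amazon SQS console below, but as a sketch it can also be created from the Node.js SDK; this assumes the FIFO queue name messages.fifo used in the rest of this post.

var AWS = require('aws-sdk');
var sqs = new AWS.SQS({ region: 'eu-west-1' });

var params = {
  QueueName: 'messages.fifo', // FIFO queue names must end with .fifo
  Attributes: {
    FifoQueue: 'true',
    // Let SQS deduplicate based on the message body, so we do not have to
    // provide a MessageDeduplicationId on every send.
    ContentBasedDeduplication: 'true'
  }
};

sqs.createQueue(params).promise()
  .then(data => console.log('queue url:', data.QueueUrl))
  .catch(console.error);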
Go to the Amazon SQS service and create the queue; that's all. Actually, there is one important point at this stage: the event source mapping between the Lambda function and SQS. Let's configure the event source. Just run the following command.
aws lambda create-event-source-mapping --function-name ProcessSQSRecord --batch-size 10 \
--event-source-arn arn:aws:sqs:eu-west-1:228038637300:messages.fifo --profile personal
and you will see the new mapping's configuration (its UUID, state, batch size, and the source and function ARNs) printed as JSON on your terminal.
Check the trigger on the SQS service page.
Or we can list the triggers with the following command:
aws lambda list-event-source-mappings --function-name ProcessSQSRecord \
--event-source-arn arn:aws:sqs:eu-west-1:228038637300:messages.fifo --profile personal
Test This Setup
- In the Amazon SQS console, send messages to the queue (for a FIFO queue you also need to provide a message group ID); a sketch of sending a message from code follows this list. Amazon SQS stores these messages on the queue.
- AWS Lambda polls the queue and when it detects updates, it invokes your Lambda function by passing in the event data it finds in the queue.
- Your function runs and creates logs in Amazon CloudWatch. You can verify the logs reported in the Amazon CloudWatch console.
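If you prefer to send the test message from code instead of the console, here is a minimal Node.js sketch. The queue URL is a placeholder for your own, and the body is JSON so the same message will also work for the DynamoDB step later.

var AWS = require('aws-sdk');
var sqs = new AWS.SQS({ region: 'eu-west-1' });

var params = {
  // Placeholder URL; copy the real one from the SQS console.
  QueueUrl: 'https://sqs.eu-west-1.amazonaws.com/123456789012/messages.fifo',
  MessageBody: JSON.stringify({ user: 'a-user', message: 'hello world' }),
  // Required for FIFO queues. If content-based deduplication is disabled on
  // the queue, also set a MessageDeduplicationId.
  MessageGroupId: 'test'
};

sqs.sendMessage(params).promise()
  .then(data => console.log('sent', data.MessageId))
  .catch(console.error);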
Now we have one FIFO queue that contains our messages and one function that is listening to our queue. Next we will save the messages to DynamoDB.
Write to DynamoDB
- Go to IAM and give DynamoDB permission to our lambda-sqs-role. I attached AmazonDynamoDBFullAccess, but that is not best practice; if this project is important for you, you should spend a little more time scoping the permissions down.
- Create the DynamoDB table with the correct key attributes defined (a creation sketch follows the updated code below).
- Update the code.
var AWS = require('aws-sdk');
AWS.config.update({ region: 'eu-west-1' });
var DynamoDB = new AWS.DynamoDB.DocumentClient();

exports.handler = async function(event, context) {
  for (const record of event.Records) {
    const { body } = record;
    console.log("BODY: ", body);

    const messageObj = JSON.parse(body);
    console.log("body.user", messageObj.user);
    console.log("body.message", messageObj.message);

    // The DocumentClient takes plain JavaScript values, so no attribute
    // type wrappers are needed here.
    const params = {
      TableName: "messages",
      Item: {
        user: messageObj.user,
        message: messageObj.message
      }
    };

    // Await the write so the handler does not return before it finishes.
    await DynamoDB.put(params).promise();
  }
  return {};
};
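The code above assumes a table named messages already exists. As a sketch, it could be created from the Node.js SDK like this; using user as the partition key is an assumption (it keeps only the latest message per user), so pick keys that fit your access pattern.

var AWS = require('aws-sdk');
var dynamodb = new AWS.DynamoDB({ region: 'eu-west-1' });

var params = {
  TableName: 'messages',
  AttributeDefinitions: [{ AttributeName: 'user', AttributeType: 'S' }],
  KeySchema: [{ AttributeName: 'user', KeyType: 'HASH' }],
  BillingMode: 'PAY_PER_REQUEST' // on-demand capacity, nothing to provision
};

dynamodb.createTable(params).promise()
  .then(() => console.log('table created'))
  .catch(console.error);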
Test This Setup Again
- Send a test message (the body must be JSON with user and message fields, since the handler parses it) and check the DynamoDB table.
Some Clients
You can add any client that can send a message to SQS. I will not implement a project for this, but I will add an example; the Node.js snippet from the test step above also works as a client.
// Java client with the AWS SDK. The body is JSON so the Lambda above can
// parse it, and the message group ID is required because the queue is FIFO
// (per-message delays are not supported on FIFO queues).
SendMessageRequest send_msg_request = new SendMessageRequest()
    .withQueueUrl(queueUrl)
    .withMessageBody("{\"user\": \"a-user\", \"message\": \"hello world\"}")
    .withMessageGroupId("clients");
sqs.sendMessage(send_msg_request);