Using AWS S3 to Trigger Lambda, SNS and SQS.

I reviewed SNS and SQS for the AWS Certified Developer - Associate exam. The course I’m taking from A Cloud Guru provides a theoretical lecture, but no lab.

I created my own lab and scenario. I upload a CSV file to an S3 Bucket, which triggers a Lambda Function. The Lambda Function, written in Python, reads the CSV file and sums each line, which consists of comma-separated numbers. The results are published to an SNS topic and distributed to two subscribers: 1) an SQS Queue, and 2) an email subscriber.
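For example, a hypothetical two-line upload like this:

1,2,3
10,20,30

would produce the SNS message "These are the totals: [6, 60]".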

Here is the architecture.

[Architecture diagram: a CSV upload to S3 triggers a Lambda Function, which publishes the totals to an SNS topic; SNS fans the message out to an SQS Queue and an email subscriber.]

Cool things I learned from coding this serverless application:

  • S3 can trigger a Lambda Function (see the configuration sketch after this list).
  • Each Lambda Function has 512MB of /tmp storage.
  • SNS and SQS are quite easy to use :)
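Here is a minimal sketch of wiring up the S3 trigger with boto3 instead of the console; the bucket name and Lambda ARN below are placeholders for whatever you created in your own account.

import boto3

s3 = boto3.client("s3")
lambda_client = boto3.client("lambda")

bucket = "your-csv-bucket"                   # placeholder bucket name
function_arn = "{your lambda function arn}"  # placeholder Lambda ARN

# Allow the S3 bucket to invoke the Lambda Function.
lambda_client.add_permission(
    FunctionName=function_arn,
    StatementId="AllowS3Invoke",
    Action="lambda:InvokeFunction",
    Principal="s3.amazonaws.com",
    SourceArn=f"arn:aws:s3:::{bucket}",
)

# Invoke the Lambda Function whenever a .csv object is created in the bucket.
s3.put_bucket_notification_configuration(
    Bucket=bucket,
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [
            {
                "LambdaFunctionArn": function_arn,
                "Events": ["s3:ObjectCreated:*"],
                "Filter": {
                    "Key": {"FilterRules": [{"Name": "suffix", "Value": ".csv"}]}
                },
            }
        ]
    },
)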

The Python Lambda Function downloads the CSV file from S3 to its /tmp storage. This may not be necessary, but it’s quite easy to do and allows one to open and process the CSV file using native Python functions.

import os
import urllib.parse

import boto3

s3 = boto3.client("s3")
sns = boto3.client("sns")

topic_arn = "{your sns topic}"

def handler(event, context):
    # Pull the bucket name and object key out of the S3 event record.
    bucket_name = event["Records"][0]["s3"]["bucket"]["name"]
    # Object keys in S3 event notifications are URL-encoded, so decode them first.
    key_name = urllib.parse.unquote_plus(event["Records"][0]["s3"]["object"]["key"])

    # Download the object into Lambda's /tmp storage so it can be opened like a
    # local file. basename() guards against keys that contain a prefix ("folder").
    filename = "/tmp/" + os.path.basename(key_name)
    s3.download_file(bucket_name, key_name, filename)

    # Each line of the CSV holds comma-separated numbers; sum every line.
    totals = []
    with open(filename) as f:
        for line in f:
            # int() tolerates surrounding whitespace, including the trailing newline.
            values = [int(x) for x in line.split(",")]
            totals.append(sum(values))

    sns_message = f"These are the totals: {totals}"

    # Publish to the SNS topic, which fans the message out to the SQS Queue
    # and the email subscriber.
    response = sns.publish(TopicArn=topic_arn, Message=sns_message)
    return response
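
To check the SQS side of the fan-out, a short poller like the sketch below works; the queue URL is a placeholder, and note that unless raw message delivery is enabled, SNS wraps the published text in a JSON envelope whose Message field holds it.

import json

import boto3

sqs = boto3.client("sqs")
queue_url = "{your sqs queue url}"  # placeholder queue URL

# Long-poll the queue for messages that SNS fanned out to it.
response = sqs.receive_message(
    QueueUrl=queue_url,
    MaxNumberOfMessages=10,
    WaitTimeSeconds=10,
)

for message in response.get("Messages", []):
    # Without raw message delivery, the SQS body is an SNS JSON envelope.
    envelope = json.loads(message["Body"])
    print(envelope["Message"])  # e.g. These are the totals: [6, 60]

    # Delete the message so it isn't redelivered after the visibility timeout.
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=message["ReceiptHandle"])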