
Triggering AWS Lambda
Introduction

AWS Lambda allows you to run code without provisioning or managing servers. In this guide, we will integrate AWS Lambda with Amazon S3 to trigger a function whenever a new object is uploaded. We will also walk through setting up the necessary infrastructure using Terraform.

Prerequisites

1) Install Required Tools

Ensure you have the following installed on your machine:

Terraform (Download: https://developer.hashicorp.com/terraform/downloads)

terraform -v

Ensure you have Terraform 1.x installed.

AWS CLI (Download: https://aws.amazon.com/cli/)

 aws --version

Ensure you have AWS CLI v2 installed.

Zip Utility (to package the Lambda function)

 zip --version

2) Configure AWS Credentials

You must configure your AWS CLI with credentials having sufficient permissions.

aws configure

Enter the following:

  • AWS Access Key ID: your-access-key
  • AWS Secret Access Key: your-secret-key
  • Default region: us-east-1 (or your preferred AWS region)
  • Output format: json (default)

You can also set credentials manually in ~/.aws/credentials:

[default]
aws_access_key_id = YOUR_ACCESS_KEY
aws_secret_access_key = YOUR_SECRET_KEY

The default region belongs in ~/.aws/config rather than the credentials file:

[default]
region = us-east-1
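
Both files use INI format, with each profile as a section header. A quick sanity check of that layout with Python's configparser (the key values below are placeholders):

```python
import configparser

# A sample ~/.aws/credentials file; [default] is the profile name.
sample = """
[default]
aws_access_key_id = YOUR_ACCESS_KEY
aws_secret_access_key = YOUR_SECRET_KEY
"""

config = configparser.ConfigParser()
config.read_string(sample)

print(config["default"]["aws_access_key_id"])  # YOUR_ACCESS_KEY
```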

3) Ensure IAM Permissions

The AWS IAM user or role running Terraform must have the following permissions:

S3 permissions:

  • s3:CreateBucket
  • s3:PutBucketNotification
  • s3:GetBucketNotification
  • s3:GetObject
  • s3:PutObject

 
Lambda permissions:

  • lambda:CreateFunction
  • lambda:GetFunction
  • lambda:InvokeFunction
  • lambda:AddPermission

 
IAM permissions:

  • iam:CreateRole
  • iam:AttachRolePolicy
  • iam:PassRole


CloudWatch permissions (for logging):

  • logs:CreateLogGroup
  • logs:CreateLogStream
  • logs:PutLogEvents
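
Taken together, the permissions above could be granted as a single IAM policy document; a sketch in Python (Resource is left as "*" for brevity and should be scoped down in practice):

```python
import json

# Hypothetical policy document combining the permissions listed above.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Allow",
         "Action": ["s3:CreateBucket", "s3:PutBucketNotification",
                    "s3:GetBucketNotification", "s3:GetObject", "s3:PutObject"],
         "Resource": "*"},
        {"Effect": "Allow",
         "Action": ["lambda:CreateFunction", "lambda:GetFunction",
                    "lambda:InvokeFunction", "lambda:AddPermission"],
         "Resource": "*"},
        {"Effect": "Allow",
         "Action": ["iam:CreateRole", "iam:AttachRolePolicy", "iam:PassRole"],
         "Resource": "*"},
        {"Effect": "Allow",
         "Action": ["logs:CreateLogGroup", "logs:CreateLogStream",
                    "logs:PutLogEvents"],
         "Resource": "*"},
    ],
}

print(json.dumps(policy, indent=2))
```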


Terraform Steps

  1. Create an S3 bucket.
  2. Create an IAM role and policy to allow Lambda to be triggered by S3.
  3. Create a Lambda function.
  4. Set up S3 event notification to invoke the Lambda function.


How It Works

  • When a file is uploaded to the S3 bucket, AWS S3 triggers the Lambda function.
  • The Lambda function logs the event and can be customized to process the uploaded file.


To modularize the Terraform code, we will break it down into separate modules for
S3, IAM, and Lambda. This improves reusability and maintainability.

Module Structure

  • terraform-s3-lambda-trigger/
    • main.tf
    • variables.tf
    • outputs.tf
    • modules/
      • s3/
        • main.tf
        • variables.tf
        • outputs.tf
      • iam/
        • main.tf
        • variables.tf
        • outputs.tf
      • lambda/
        • main.tf
        • variables.tf
        • outputs.tf
    • lambda_function.py
    • lambda_function.zip

1. modules/s3/main.tf (S3 Bucket & Event Notification)

resource "aws_s3_bucket" "this" {
  bucket = var.bucket_name
}

resource "aws_s3_bucket_notification" "this" {
  bucket = aws_s3_bucket.this.id

  lambda_function {
    lambda_function_arn = var.lambda_arn
    events              = ["s3:ObjectCreated:*"]
  }

  depends_on = [var.lambda_permission]
}

modules/s3/variables.tf

variable "bucket_name" {}
variable "lambda_arn" {}
variable "lambda_permission" {}

modules/s3/outputs.tf

output "bucket_arn" {
  value = aws_s3_bucket.this.arn
}

output "bucket_name" {
  value = aws_s3_bucket.this.id
}

2. modules/iam/main.tf (IAM Role & Policy for Lambda)

resource "aws_iam_role" "lambda_role" {
  name = "lambda-s3-trigger-role"

  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Action = "sts:AssumeRole"
      Effect = "Allow"
      Principal = { Service = "lambda.amazonaws.com" }
    }]
  })
}

resource "aws_iam_policy" "lambda_s3_policy" {
  name        = "lambda-s3-policy"
  description = "Policy for Lambda to access S3"

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Effect   = "Allow"
        Action   = ["s3:GetObject", "s3:PutObject"]
        Resource = ["${var.s3_arn}/*"]
      },
      {
        Effect   = "Allow"
        Action   = ["logs:CreateLogGroup", "logs:CreateLogStream", "logs:PutLogEvents"]
        Resource = "arn:aws:logs:*:*:*"
      }
    ]
  })
}

resource "aws_iam_role_policy_attachment" "attach_policy" {
  role       = aws_iam_role.lambda_role.name
  policy_arn = aws_iam_policy.lambda_s3_policy.arn
}

modules/iam/variables.tf

variable "s3_arn" {}

modules/iam/outputs.tf

output "role_arn" {
  value = aws_iam_role.lambda_role.arn
}

3. modules/lambda/main.tf (Lambda Function)

resource "aws_lambda_function" "this" {
  function_name    = var.function_name
  # Handler format is <file name>.<function name>; the code lives in lambda_function.py.
  handler          = "lambda_function.lambda_handler"
  runtime          = "python3.9"
  timeout          = 30
  role             = var.role_arn
  filename         = "${path.root}/lambda_function.zip"
  source_code_hash = filebase64sha256("${path.root}/lambda_function.zip")
}

resource "aws_lambda_permission" "allow_s3" {
  statement_id  = "AllowS3Invoke"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.this.function_name
  principal     = "s3.amazonaws.com"
  source_arn    = var.s3_arn
}

modules/lambda/variables.tf

variable "function_name" {}
variable "role_arn" {}
variable "s3_arn" {}

modules/lambda/outputs.tf

output "lambda_arn" {
  value = aws_lambda_function.this.arn
}

output "lambda_permission" {
  value = aws_lambda_permission.allow_s3.id
}

4. main.tf (Root Module)

provider "aws" {
  region = "us-east-1"
}

module "iam" {
  source = "./modules/iam"
  s3_arn = module.s3.bucket_arn
}

module "lambda" {
  source       = "./modules/lambda"
  function_name = "s3-event-trigger"
  role_arn      = module.iam.role_arn
  s3_arn        = module.s3.bucket_arn
}

module "s3" {
  source            = "./modules/s3"
  bucket_name       = "my-unique-bucket-12345"
  lambda_arn        = module.lambda.lambda_arn
  lambda_permission = module.lambda.lambda_permission
}

5. variables.tf (Root Module)

variable "aws_region" {
  default = "us-east-1"
}

6. outputs.tf (Root Module)

output "bucket_name" {
  value = module.s3.bucket_name
}

output "lambda_arn" {
  value = module.lambda.lambda_arn
}

7. lambda_function.py (Lambda Code)

import json

def lambda_handler(event, context):
    print("S3 Event:", json.dumps(event, indent=2))
    return {"statusCode": 200, "body": "S3 event processed!"}
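
The handler above only logs the raw event. A common next step is to pull the bucket name and object key out of each record; a sketch (the record layout follows the standard S3 event notification format, and object keys arrive URL-encoded):

```python
from urllib.parse import unquote_plus

def lambda_handler(event, context):
    # Each record in an S3 event notification names the bucket and object key.
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        # S3 URL-encodes object keys in notifications (e.g. spaces become '+').
        key = unquote_plus(record["s3"]["object"]["key"])
        print(f"New object: s3://{bucket}/{key}")
    return {"statusCode": 200, "body": "S3 event processed!"}
```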

Deployment Steps

Create Lambda ZIP File

zip lambda_function.zip lambda_function.py
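
If the zip utility is not available (common on Windows), Python's zipfile module produces an equivalent archive:

```python
import pathlib
import zipfile

# Create a stand-in lambda_function.py if it is missing, then package it --
# equivalent to `zip lambda_function.zip lambda_function.py`.
src = pathlib.Path("lambda_function.py")
if not src.exists():
    src.write_text('def lambda_handler(event, context):\n    return {"statusCode": 200}\n')

with zipfile.ZipFile("lambda_function.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    zf.write(src)

print(zipfile.ZipFile("lambda_function.zip").namelist())  # ['lambda_function.py']
```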

Initialize Terraform

terraform init

Validate the Configuration

terraform validate

Apply Terraform Configuration

terraform apply -auto-approve

Verify Deployment

Once Terraform completes, verify that the S3 bucket was created:

aws s3 ls

Confirm that the Lambda function exists:

aws lambda list-functions

Confirm that the S3 event notification is attached to the bucket:

aws s3api get-bucket-notification-configuration --bucket my-unique-bucket-12345

Test by Uploading a File

Upload a test file to S3:

aws s3 cp test-file.txt s3://my-unique-bucket-12345/

Check the Lambda logs in AWS CloudWatch:

aws logs describe-log-groups
aws logs get-log-events --log-group-name "/aws/lambda/s3-event-trigger"

How to Find the Log Stream Name

If you’re unsure about the log stream name, follow these steps:

Step 1: List Available Log Streams

Run the following command to list all log streams in the CloudWatch log group for your Lambda function:

aws logs describe-log-streams --log-group-name "/aws/lambda/s3-event-trigger"

You’ll get output similar to:

{
    "logStreams": [
        {
            "logStreamName": "2025/04/02/[$LATEST]6be894d260184633a3a3019862eebcd8",
            "creationTime": 1712012345678,
            "firstEventTimestamp": 1712012345678,
            "lastEventTimestamp": 1712015678901
        }
    ]
}

Copy the logStreamName from the output.
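
When several streams exist, the most recent one can be picked programmatically; a sketch against the sample output shape above (stream names and timestamps are made up):

```python
import json

# Sample shape of `aws logs describe-log-streams` output.
raw = """{
  "logStreams": [
    {"logStreamName": "2025/04/01/[$LATEST]aaa111", "lastEventTimestamp": 1711900000000},
    {"logStreamName": "2025/04/02/[$LATEST]bbb222", "lastEventTimestamp": 1712015678901}
  ]
}"""

streams = json.loads(raw)["logStreams"]
# The stream with the newest lastEventTimestamp is the most recently active one.
latest = max(streams, key=lambda s: s.get("lastEventTimestamp", 0))
print(latest["logStreamName"])  # 2025/04/02/[$LATEST]bbb222
```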

Fetch Logs for the Specific Stream

Once you have the correct log stream name, use:

aws logs get-log-events --log-group-name "/aws/lambda/s3-event-trigger" --log-stream-name "2025/04/02/[$LATEST]6be894d260184633a3a3019862eebcd8"

Additional Debugging Steps
  • Check if logs exist: If the list of log streams is empty, your Lambda function might not have run yet. Try uploading a new file to the S3 bucket to trigger it.
  • Verify IAM permissions: Ensure your IAM user has logs:DescribeLogStreams and logs:GetLogEvents permissions.

Challenges

Check Available Log Streams

aws logs describe-log-streams --log-group-name "/aws/lambda/s3-event-trigger"

If Output is Empty

If no log streams are found, it means your Lambda function hasn’t run yet.

Solution: Upload a test file to the S3 bucket to trigger Lambda:

aws s3 cp test-file.txt s3://your-s3-bucket-name/

Then, re-run the command to check log streams.

Verify IAM Permissions

Your Lambda function’s IAM role must have permission to write logs. Check its IAM policy:

Find the IAM Role of Lambda:

aws lambda get-function --function-name s3-event-trigger

The "Role" field in the output contains the role ARN, for example:

"Role": "arn:aws:iam::123456789012:role/your-lambda-role-name"

Verify IAM Role Permissions

Your Lambda function’s IAM role needs permission to write to CloudWatch Logs. In this setup that comes from the custom lambda-s3-policy created by Terraform; AWS managed policies that grant it include:

  • AWSLambdaBasicExecutionRole
  • CloudWatchLogsFullAccess


Run this command to check

aws iam list-attached-role-policies --role-name your-lambda-role-name

If missing, attach the policy

aws iam attach-role-policy --role-name your-lambda-role-name --policy-arn arn:aws:iam::aws:policy/CloudWatchLogsFullAccess


Check If Lambda Executed Successfully

If no logs appear, check whether your Lambda function ran at all. The AWS CLI has no "list-invocations" command; instead, query the CloudWatch Invocations metric:

aws cloudwatch get-metric-statistics \
  --namespace AWS/Lambda --metric-name Invocations \
  --dimensions Name=FunctionName,Value=s3-event-trigger \
  --start-time "$(date -u -d '1 hour ago' +%Y-%m-%dT%H:%M:%SZ)" \
  --end-time "$(date -u +%Y-%m-%dT%H:%M:%SZ)" \
  --period 3600 --statistics Sum

If no datapoints are returned (or the Sum is zero), the function did not execute.

Verify the S3 Event Notification

Run the following command to check if your S3 bucket is correctly set to trigger the Lambda function:

aws s3api get-bucket-notification-configuration --bucket your-s3-bucket-name

Manually Invoke the Lambda Function

If you suspect the S3 event is not triggering Lambda, manually invoke it to confirm it works:

aws lambda invoke --function-name s3-event-trigger output.json
cat output.json

Check the Lambda logs

aws logs describe-log-groups
aws logs tail /aws/lambda/s3-event-trigger --follow

To filter for specific log events:

aws logs filter-log-events --log-group-name "/aws/lambda/s3-event-trigger"

Check the Output File (output.json)

Since you saved the Lambda response to output.json, you can open it:

cat output.json   # on Windows PowerShell: Get-Content output.json

Expected output:

{"statusCode": 200, "body": "S3 event processed!"}

This confirms that the Lambda function executed and processed the event.