
AWS Lambda Word Count Lab (Challenge)


Elevate your serverless chops by building a Python-powered AWS Lambda that counts words in any text file you drop into S3—and notifies you via SNS.


🚀 Architecture Overview

  1. Amazon S3 — Text files land here (trigger).
  2. AWS Lambda — Python 3.x function reads the file, counts words, publishes to SNS.
  3. Amazon SNS — Sends email (and optionally SMS) with the word-count payload.
  4. CloudWatch Logs — Tracks Lambda execution and debug traces.
[ S3 PutObject ] 
       ↓
[ Lambda (boto3) ] ──> [ SNS Publish ]
       ↓
[ CloudWatch Logs ]

1. Create the SNS Topic & Subscription

  1. SNS Console → Create topic
    • Type: Standard
    • Name: WordCountTopic
    • Encryption: (Optional) Enable KMS for at-rest security.
  2. Subscriptions → Create subscription
    • Protocol: Email
    • Endpoint: your email address
  3. Confirm the subscription link in your inbox to activate it. (A scripted boto3 alternative is sketched after this list.)
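
If you prefer to script this step instead of clicking through the console, a minimal boto3 sketch could look like the one below. The topic name and region match the ARN used later in this lab; the email address is a placeholder you would replace with your own.

import boto3

sns = boto3.client('sns', region_name='eu-central-1')

# create_topic is idempotent: it returns the existing topic if the name is already taken.
topic_arn = sns.create_topic(Name='WordCountTopic')['TopicArn']

# Subscribe an email endpoint; SNS mails a confirmation link to this address.
sns.subscribe(
    TopicArn=topic_arn,
    Protocol='email',
    Endpoint='you@example.com'  # placeholder, use your own address
)

print(f"Topic ARN: {topic_arn}")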

2. Build the Lambda Function

a) IAM Role

Use the pre-provisioned LambdaAccessRole, which includes the following managed policies (a boto3 sketch for recreating the role in your own account follows the list):

  • AWSLambdaBasicExecutionRole
  • AmazonS3FullAccess
  • AmazonSNSFullAccess
  • CloudWatchFullAccess
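
The role is already provisioned in the lab environment, but if you ever need to recreate it in your own account, a rough boto3 sketch (role name and policy list taken from above; the trust policy is the standard one for Lambda) might look like this:

import json
import boto3

iam = boto3.client('iam')

# Standard trust policy that lets the Lambda service assume the role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole"
    }]
}

iam.create_role(
    RoleName='LambdaAccessRole',
    AssumeRolePolicyDocument=json.dumps(trust_policy)
)

# Managed policies listed above (broad on purpose for this lab; tighten for production).
policy_arns = [
    'arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole',
    'arn:aws:iam::aws:policy/AmazonS3FullAccess',
    'arn:aws:iam::aws:policy/AmazonSNSFullAccess',
    'arn:aws:iam::aws:policy/CloudWatchFullAccess',
]
for arn in policy_arns:
    iam.attach_role_policy(RoleName='LambdaAccessRole', PolicyArn=arn)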

b) Function Configuration

  • Name: WordCountFunction
  • Runtime: Python 3.11 (boto3 v1.x)
  • Handler: lambda_function.lambda_handler
  • Memory: 128 MB
  • Timeout: 30 seconds

Environment Variables:

SNS_TOPIC_ARN=arn:aws:sns:eu-central-1:123456789012:WordCountTopic
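
The same configuration can also be created programmatically. The sketch below assumes lambda_function.py (shown in the next section) sits in your current directory and that the account ID 123456789012 from the ARN above is replaced with your own:

import io
import zipfile
import boto3

lambda_client = boto3.client('lambda', region_name='eu-central-1')

# Package lambda_function.py into an in-memory zip for upload.
buf = io.BytesIO()
with zipfile.ZipFile(buf, 'w', zipfile.ZIP_DEFLATED) as zf:
    zf.write('lambda_function.py')
buf.seek(0)

lambda_client.create_function(
    FunctionName='WordCountFunction',
    Runtime='python3.11',
    Handler='lambda_function.lambda_handler',
    Role='arn:aws:iam::123456789012:role/LambdaAccessRole',  # placeholder account ID
    Code={'ZipFile': buf.read()},
    MemorySize=128,
    Timeout=30,
    Environment={'Variables': {
        'SNS_TOPIC_ARN': 'arn:aws:sns:eu-central-1:123456789012:WordCountTopic'
    }},
)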

c) Function Code (lambda_function.py)

import os
import json
import logging
import boto3
from urllib.parse import unquote_plus

# Initialize clients
s3  = boto3.client('s3')
sns = boto3.client('sns')

# Config from env
SNS_TOPIC_ARN = os.environ['SNS_TOPIC_ARN']

# Logging config
logger = logging.getLogger()
logger.setLevel(logging.INFO)

def lambda_handler(event, context):
    # 1️⃣ Parse S3 event
    record = event['Records'][0]['s3']
    bucket = record['bucket']['name']
    key    = unquote_plus(record['object']['key'])

    logger.info(f"Received S3 event for bucket={bucket}, key={key}")

    # 2️⃣ Retrieve object
    response = s3.get_object(Bucket=bucket, Key=key)
    body_bytes = response['Body'].read()
    text = body_bytes.decode('utf-8', errors='replace')

    # 3️⃣ Count words
    word_list = text.split()
    count = len(word_list)
    logger.info(f"Word count for {key}: {count}")

    # 4️⃣ Publish to SNS
    subject = "Word Count Result"
    # SNS email bodies are plain text, so keep the message unformatted.
    message = f"The word count in file {key} is {count}."
    sns.publish(
        TopicArn=SNS_TOPIC_ARN,
        Subject=subject,
        Message=message
    )

    # 5️⃣ Return payload
    return {
        'statusCode': 200,
        'body': json.dumps({
            'fileName': key,
            'wordCount': count
        })
    }

Pro tips:

  • Add CloudWatch metric filters on your function's logs to alert on errors or unusually large files (a sketch follows below).
  • Use AWS SAM or CloudFormation for infra-as-code repeatability.
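
As a concrete example of the first tip, a metric filter that counts ERROR lines in the function's log group could be created like this. The log group name follows the standard /aws/lambda/<FunctionName> pattern, and the metric namespace is an arbitrary choice for this lab:

import boto3

logs = boto3.client('logs', region_name='eu-central-1')

# Note: the log group only exists after the function has run at least once.
logs.put_metric_filter(
    logGroupName='/aws/lambda/WordCountFunction',
    filterName='WordCountErrors',
    filterPattern='ERROR',
    metricTransformations=[{
        'metricName': 'WordCountErrorCount',
        'metricNamespace': 'WordCountLab',  # arbitrary namespace for this lab
        'metricValue': '1',
        'defaultValue': 0,
    }],
)

A CloudWatch alarm on WordCountErrorCount can then notify you when the function starts failing.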

3. Configure S3 Trigger

  1. S3 Console → Navigate to your bucket (my-wordcount-bucket).
  2. Properties → Event notifications → Create event notification
    • Name: InvokeWordCountLambda
    • Event types: All object create events (s3:ObjectCreated:*)
    • Suffix filter: .txt
    • Destination: Lambda function → WordCountFunction
  3. Save. Your Lambda will now auto-invoke on every .txt upload (a boto3 sketch of the same trigger configuration follows this list).
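
For repeatability, the same trigger can also be wired up with boto3. This sketch assumes the bucket and function names used in this lab and the placeholder account ID 123456789012:

import boto3

s3 = boto3.client('s3')
lambda_client = boto3.client('lambda', region_name='eu-central-1')

function_arn = 'arn:aws:lambda:eu-central-1:123456789012:function:WordCountFunction'  # placeholder account ID

# Allow the S3 bucket to invoke the function.
lambda_client.add_permission(
    FunctionName='WordCountFunction',
    StatementId='AllowS3Invoke',
    Action='lambda:InvokeFunction',
    Principal='s3.amazonaws.com',
    SourceArn='arn:aws:s3:::my-wordcount-bucket',
)

# Route .txt object-created events to the function.
s3.put_bucket_notification_configuration(
    Bucket='my-wordcount-bucket',
    NotificationConfiguration={
        'LambdaFunctionConfigurations': [{
            'Id': 'InvokeWordCountLambda',
            'LambdaFunctionArn': function_arn,
            'Events': ['s3:ObjectCreated:*'],
            'Filter': {'Key': {'FilterRules': [
                {'Name': 'suffix', 'Value': '.txt'}
            ]}},
        }]
    },
)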

4. Testing & Validation

  1. Upload a .txt file (e.g. example.txt) with ~100 words into your S3 bucket (an upload sketch follows this list).
  2. Check CloudWatch Logs: Verify your Lambda ran without errors and logged the count.
  3. (Optional) Add an SMS subscription to get counts on your mobile.
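
A quick way to generate and upload a roughly 100-word test file (bucket and key names taken from this lab; the filler text is just a placeholder):

import boto3

s3 = boto3.client('s3')

# 100 words of filler text -- any .txt content works.
text = ' '.join(['word'] * 100)

s3.put_object(
    Bucket='my-wordcount-bucket',
    Key='example.txt',
    Body=text.encode('utf-8'),
    ContentType='text/plain',
)

Within a few seconds the upload should invoke WordCountFunction; check CloudWatch Logs and your inbox.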

Inspect SNS Email: Ensure you receive an email with:

Subject: Word Count Result
Body: The word count in file example.txt is 100.
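
If no email arrives, a common cause is an unconfirmed subscription. A quick boto3 check (topic ARN as configured earlier, with the placeholder account ID):

import boto3

sns = boto3.client('sns', region_name='eu-central-1')

subs = sns.list_subscriptions_by_topic(
    TopicArn='arn:aws:sns:eu-central-1:123456789012:WordCountTopic'
)
for sub in subs['Subscriptions']:
    # A SubscriptionArn of 'PendingConfirmation' means the link was never clicked.
    print(sub['Protocol'], sub['Endpoint'], sub['SubscriptionArn'])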

5. Deliverables

  • Email Forward: Send one of the SNS notifications to your instructor.
  • Screenshots:
    • Lambda code & environment variables.
    • S3 event notification config.

🎯 Conclusion

In this lab, you provisioned an SNS topic and subscription to receive notifications. You built a Python-based AWS Lambda function (using the provided IAM role) that reads uploaded S3 text files, counts the words, and publishes the result to SNS. You configured an S3 event trigger so any .txt upload automatically invokes your function. You tested the pipeline end-to-end by uploading a sample file and verifying the email notification. Finally, you prepared deliverables by forwarding an SNS message and capturing screenshots of your Lambda and S3 configurations.


That's it: your serverless pipeline is live! 🥳 Good job!