Step 3: Connect Trigger and Function
A trigger is the link between your storage bucket and your function. When a file is uploaded, the trigger automatically invokes your function.
What is a Trigger?
A trigger is an automatic connection:
- Event source: Storage bucket
- Event type: File uploaded, deleted, etc.
- Target: Your serverless function
Think of it like this: “When X happens in the bucket, automatically run Y function.”
How Triggers Work
Here’s the flow:
1. A file is uploaded to the bucket
2. The storage service emits an object-created event
3. The event notification configuration matches the event to your trigger
4. The trigger invokes your function and passes along the event details
Event Types
You can trigger on different events:
Object Created
- File uploaded
- File copied
- Most common for notifications
Object Deleted
- File removed
- Useful for cleanup workflows
Object Updated
- File modified
- Metadata changed
For this tutorial, we’ll use Object Created.
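Each provider uses its own name for this event: AWS calls it s3:ObjectCreated:* (used in the configuration below) and GCP’s Cloud Storage trigger calls it Finalize/Create (google.storage.object.finalize). As an optional check, and assuming you’re on 1st-gen Cloud Functions, the gcloud CLI can list the event types it accepts:
# List event providers and types available to 1st-gen Cloud Functions
gcloud functions event-types list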
Configure Trigger: AWS
Using AWS Console
- Go to S3 → Your bucket
- Click Properties tab
- Scroll to Event notifications
- Click Create event notification
- Configure:
  - Event name: notify-on-upload
  - Event types: Select All object create events
  - Destination: Lambda function
  - Function: Select your function
- Click Save changes
Using AWS CLI
# First, create the Lambda function (we'll do this in the next step)
# Then add permission for S3 to invoke it
# Add S3 invoke permission to Lambda
aws lambda add-permission \
  --function-name notify-on-upload \
  --principal s3.amazonaws.com \
  --statement-id s3-trigger \
  --action "lambda:InvokeFunction" \
  --source-arn arn:aws:s3:::my-notifications-bucket-2025
# Create S3 event notification configuration
aws s3api put-bucket-notification-configuration \
  --bucket my-notifications-bucket-2025 \
  --notification-configuration '{
    "LambdaFunctionConfigurations": [{
      "Id": "notify-on-upload",
      "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789:function:notify-on-upload",
      "Events": ["s3:ObjectCreated:*"]
    }]
  }'
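You can read the configuration back to confirm the notification is attached to the bucket (this uses the bucket name from earlier in the tutorial):
# Read back the bucket's event notification configuration
aws s3api get-bucket-notification-configuration \
  --bucket my-notifications-bucket-2025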
Configure Trigger: Azure
Using Azure Portal
1. Go to your Function App
2. Click "Functions" in left menu
3. Click "+ Create" or select existing function
4. Choose "Azure Blob Storage trigger"
5. Configure:
- Name: notifyOnUpload
- Path: uploads/{name}
- Storage account connection: Select your storage account
- Access rights: Blob
6. Click "Create"
The trigger is automatically configured when you create the function with a Blob Storage trigger template.
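Because the trigger lives in the function’s own configuration rather than as a separate resource, you can verify it from the command line once the function is deployed. A minimal sketch, assuming placeholder names my-resource-group and my-function-app for your own resource group and Function App:
# Show the function, including its blob trigger binding
az functionapp function show \
  --resource-group my-resource-group \
  --name my-function-app \
  --function-name notifyOnUpload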
Configure Trigger: GCP
Using GCP Console
1. Go to Cloud Functions
2. Click "Create Function"
3. Configure:
- Name: notify-on-upload
- Region: Same as your bucket
- Trigger type: Cloud Storage
- Event type: Finalize/Create
- Bucket: Select your bucket
4. Click "Save" then "Next"
5. Write your function code
6. Click "Deploy"
The trigger is configured as part of function creation when you select the Cloud Storage trigger.
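If you deploy from the command line instead of the console, a roughly equivalent 1st-gen deployment is sketched below; the runtime is an assumption, and --trigger-bucket wires the function to the bucket's object-create (finalize) event:
# Deploy the function with a Cloud Storage trigger on the bucket
gcloud functions deploy notify-on-upload \
  --runtime nodejs20 \
  --entry-point notifyOnUpload \
  --trigger-bucket my-notifications-bucket-2025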
Filter Events
You can filter which files trigger the function:
By file extension:
- Only .pdf files
- Only .jpg or .png images
- Exclude certain types
By path prefix:
- Only files in the documents/ folder
- Only files in the incoming/ folder
- Ignore files in archive/
AWS example (S3 notification configuration with suffix and prefix filters):
{
  "LambdaFunctionConfigurations": [{
    "Id": "notify-pdf-only",
    "LambdaFunctionArn": "arn:aws:lambda:...",
    "Events": ["s3:ObjectCreated:*"],
    "Filter": {
      "Key": {
        "FilterRules": [
          {"Name": "suffix", "Value": ".pdf"},
          {"Name": "prefix", "Value": "documents/"}
        ]
      }
    }
  }]
}
Azure example (function.json):
{
  "bindings": [{
    "name": "blobTrigger",
    "type": "blobTrigger",
    "direction": "in",
    "path": "uploads/{name}.pdf",
    "connection": "AzureWebJobsStorage"
  }]
}
This binding only triggers for .pdf files in the uploads/ folder.
GCP example:
// In the function code, filter by file extension
exports.notifyOnUpload = (file, context) => {
  // Only process PDF files
  if (!file.name.endsWith('.pdf')) {
    console.log('Skipping non-PDF file:', file.name);
    return;
  }
  // Only process files in the documents/ folder
  if (!file.name.startsWith('documents/')) {
    console.log('Skipping file outside documents/ folder');
    return;
  }
  // Process the file
  // ... your code here
};
Common Pitfalls
Pitfall 1: Wrong event type
// ❌ Wrong: This fires on ALL events
Events: ["s3:*"]
// ✅ Correct: Only on creation
Events: ["s3:ObjectCreated:*"]
Pitfall 2: Trigger in different region
- Function must be in same region as bucket (usually)
- Or configure cross-region permissions
- Check your provider’s requirements (a quick AWS region check is sketched below)
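On AWS, a quick way to compare the two regions (using the bucket and function names from this tutorial):
# Bucket region (an empty LocationConstraint means us-east-1)
aws s3api get-bucket-location --bucket my-notifications-bucket-2025
# The function's region is embedded in its ARN
aws lambda get-function-configuration \
  --function-name notify-on-upload \
  --query FunctionArn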
Pitfall 3: Trigger not enabled
- Verify trigger is active in console
- Check function permissions
- Review function logs for errors
Pitfall 4: Missing permissions
- Function needs permission to be invoked by storage
- Storage needs permission to invoke function
- Check IAM roles and policies (see the sketch below)
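On AWS, the permission granted with aws lambda add-permission earlier ends up in the function’s resource-based policy; you can inspect it like this:
# Show the function's resource-based policy
# (it should contain the s3-trigger statement added earlier)
aws lambda get-policy --function-name notify-on-upload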
Verify Trigger Configuration
Test that your trigger works:
- Upload a test file to your bucket
- Check function logs - you should see a function execution
- Verify the event was received - the function should log the event
AWS:
# Upload test file
echo "Test content" > test.txt
aws s3 cp test.txt s3://my-notifications-bucket-2025/
# Check Lambda logs
aws logs tail /aws/lambda/notify-on-upload --follow
# Or check in the CloudWatch Logs console
Azure:
# Upload test file
echo "Test content" > test.txt
az storage blob upload \
  --account-name mynotifications2025 \
  --container-name uploads \
  --name test.txt \
  --file test.txt \
  --auth-mode login
# Check function logs in Azure Portal
# Go to Function App → Functions → notifyOnUpload → Monitor
GCP:
# Upload test file
echo "Test content" > test.txt
gsutil cp test.txt gs://my-notifications-bucket-2025/
# Check function logs
gcloud functions logs read notify-on-upload --limit 10
# Or check in Cloud Console → Cloud Functions → Logs
What’s Next?
Now that the trigger is connected, let’s add email sending functionality. In the next page, we’ll integrate an email service to send notifications.