Optimizing AWS Costs Using Boto3, AWS Lambda, and AWS CloudWatch
Simple is the answer to complex!!
As cloud usage grows, so does the importance of managing and optimizing cloud costs. In this blog post, I'll walk you through a project I recently completed, which focuses on cost optimization using AWS services. By leveraging Boto3, AWS Lambda, and AWS CloudWatch, I was able to automate the process of identifying and addressing cost inefficiencies within our AWS environment. This project drew on the Python knowledge I had picked up, along with a few basic AWS tools that made my life much easier.
Motivation
The motivation behind this project was straightforward: reduce AWS spending without compromising performance or availability. I noticed that some resources were underutilized, leading to unnecessary costs. This project aimed to automate the identification of these resources and take appropriate actions to optimize costs.
Tools and Services Overview
Boto3: The AWS SDK for Python, which allows you to interact with AWS services using Python code.
AWS Lambda: A serverless compute service that lets you run code without provisioning or managing servers.
AWS CloudWatch: A monitoring service for AWS cloud resources and applications, which provides data and actionable insights.
Project Architecture
The architecture of the solution is simple yet effective. Here's a high-level overview:
CloudWatch Alarms: Monitor costs and billing estimates, and trigger an EBS volume check Lambda function when needed.
Lambda Functions: Triggered by CloudWatch alarms to delete EBS snapshots whose source volumes no longer exist or are not attached to any running instance.
Boto3: Used within the Lambda functions to interact with AWS services programmatically, so the Python code can be written in the IDE of your choice or edited remotely.
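Before getting into the AWS wiring, the cleanup rule above can be factored into a pure function: given the snapshot list, a map of volume attachments, and the set of running-instance IDs, return the snapshots that are safe to delete. The function name and data shapes here are illustrative sketches, not part of the deployed Lambda:

```python
def snapshots_to_delete(snapshots, volume_attachments, active_instance_ids):
    """Return IDs of snapshots whose source volume is missing or not
    attached to any running instance.

    snapshots: list of dicts like {'SnapshotId': ..., 'VolumeId': ...}
    volume_attachments: dict mapping volume ID -> list of attached instance IDs
                        (a volume absent from the map is treated as deleted)
    active_instance_ids: set of running-instance IDs
    """
    deletable = []
    for snap in snapshots:
        volume_id = snap.get('VolumeId')
        attached = volume_attachments.get(volume_id, []) if volume_id else []
        # Delete when no attachment belongs to a currently running instance
        if not any(i in active_instance_ids for i in attached):
            deletable.append(snap['SnapshotId'])
    return deletable
```

Keeping the decision logic separate from the boto3 calls makes it easy to unit-test the rule locally without touching a real AWS account.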
Implementation Details
Setting Up AWS Lambda
First, I created Lambda functions to handle different optimization tasks.
```python
import boto3

def lambda_handler(event, context):
    ec2 = boto3.client('ec2')

    # Fetch all snapshots owned by this account
    response = ec2.describe_snapshots(OwnerIds=['self'])

    # Collect the IDs of all currently running instances
    instances_response = ec2.describe_instances(
        Filters=[{'Name': 'instance-state-name', 'Values': ['running']}]
    )
    active_instance_ids = set()
    for reservation in instances_response['Reservations']:
        for instance in reservation['Instances']:
            active_instance_ids.add(instance['InstanceId'])

    for snapshot in response['Snapshots']:
        snapshot_id = snapshot['SnapshotId']
        volume_id = snapshot.get('VolumeId')

        if not volume_id:
            # The snapshot has no source volume at all
            ec2.delete_snapshot(SnapshotId=snapshot_id)
            print(f"Deleted EBS snapshot {snapshot_id} as it was not attached to any volume.")
        else:
            try:
                volume_response = ec2.describe_volumes(VolumeIds=[volume_id])
                attachments = volume_response['Volumes'][0]['Attachments']
                # Keep the snapshot only if its volume is attached to a running instance
                if not any(a['InstanceId'] in active_instance_ids for a in attachments):
                    ec2.delete_snapshot(SnapshotId=snapshot_id)
                    print(f"Deleted EBS snapshot {snapshot_id} as it was taken from a volume not attached to any running instance.")
            except ec2.exceptions.ClientError as e:
                if e.response['Error']['Code'] == 'InvalidVolume.NotFound':
                    # The source volume no longer exists
                    ec2.delete_snapshot(SnapshotId=snapshot_id)
                    print(f"Deleted EBS snapshot {snapshot_id} as its associated volume was not found.")
                else:
                    raise
```
Policy Implementation
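Following the principle of least privilege, the Lambda's execution role needs only the four EC2 calls the function makes, plus basic CloudWatch Logs access. The policy below is a sketch; note that the EC2 `Describe*` actions do not support resource-level scoping, so `"Resource": "*"` is required there, and you may want to further restrict `ec2:DeleteSnapshot` with conditions or tags in your own account:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "ec2:DescribeSnapshots",
        "ec2:DescribeInstances",
        "ec2:DescribeVolumes",
        "ec2:DeleteSnapshot"
      ],
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": "arn:aws:logs:*:*:*"
    }
  ]
}
```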
Using AWS CloudWatch
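On the CloudWatch side, a scheduled rule with a cron expression can invoke the Lambda on a fixed cadence. The sketch below uses the CloudWatch Events (EventBridge) API via boto3; the rule name, target ID, and 02:00 UTC schedule are illustrative assumptions, not values from the actual deployment:

```python
def schedule_expression(minute, hour):
    """Build an AWS cron expression for a daily run at the given UTC time.
    AWS cron uses six fields: minute hour day-of-month month day-of-week year,
    and one of day-of-month / day-of-week must be '?'."""
    return f"cron({minute} {hour} * * ? *)"

def schedule_cleanup(lambda_arn, rule_name="ebs-snapshot-cleanup"):
    # boto3 is only needed when actually creating the rule in AWS
    import boto3
    events = boto3.client('events')
    events.put_rule(
        Name=rule_name,
        ScheduleExpression=schedule_expression(0, 2),  # daily at 02:00 UTC
        State='ENABLED',
    )
    events.put_targets(
        Rule=rule_name,
        Targets=[{'Id': 'cleanup-lambda', 'Arn': lambda_arn}],
    )
```

For the rule to actually invoke the function, the Lambda also needs a resource-based permission allowing `events.amazonaws.com` to call it (for example via `lambda.add_permission`), which I've omitted here for brevity.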
Challenges and Solutions
One of the challenges I faced was ensuring the accuracy of the CloudWatch metrics. Initially, I noticed some discrepancies in the data, which I resolved by refining the metric filters and aggregation periods. I also used cron-based expressions to make scheduling more precise and efficient. Another challenge was managing IAM permissions for the Lambda functions. By following the principle of least privilege, I ensured that each function had only the permissions necessary to perform its tasks.
Conclusion
This project demonstrated the power of automation in managing AWS costs. By leveraging Boto3, AWS Lambda, and AWS CloudWatch, we were able to create an efficient and scalable solution for cost optimization. I hope this blog post provides useful insights and inspiration for your own cost optimization efforts.
For more awesome and simple-to-understand info about DevOps technologies, consider following me on LinkedIn. Want to know more about me? Follow me on Instagram!