10 Essential Python Scripts to Automate AWS Tasks
As businesses expand, their cloud management needs grow as well. Amazon Web Services (AWS) offers powerful cloud computing tools, but managing daily tasks manually can be both time-consuming and error-prone. Python automation can simplify these tasks, allowing you to manage AWS resources efficiently and securely.
1. Understanding Python for AWS Automation
Python is a versatile programming language well suited to cloud automation thanks to its simplicity and extensive AWS SDK support. Boto3, the AWS SDK for Python, provides programmatic access to AWS services and underpins every script below. Before you begin, make sure you have:
Basic Python knowledge.
Boto3 installed (pip install boto3) and the AWS CLI configured (aws configure).
Appropriate IAM permissions for each script.
2. Automate EC2 Instance Start/Stop
Use Case: A company running development and testing environments on EC2 instances wants to minimize costs by stopping instances during non-business hours.
Purpose: Reduce costs by automatically starting and stopping EC2 instances during off-hours.
Script:
import boto3
ec2 = boto3.client('ec2')

def manage_ec2(action, instance_ids):
    # Dispatch to the matching EC2 API call
    if action == 'start':
        ec2.start_instances(InstanceIds=instance_ids)
    else:
        ec2.stop_instances(InstanceIds=instance_ids)
    print(f"Requested {action} for instances: {instance_ids}")

manage_ec2('stop', ['i-1234567890abcdef0'])
Setup: Define instance IDs and action (start or stop). Schedule using AWS Lambda and CloudWatch Events.
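That schedule pairs well with a small Lambda wrapper. A minimal handler sketch, assuming the CloudWatch Events (EventBridge) rule passes a custom event such as {"action": "stop", "instance_ids": ["i-1234567890abcdef0"]} (that event shape is this sketch's assumption, not an AWS default):

```python
import json

def parse_event(event):
    # Validate the scheduled event before touching any instances
    action = event.get('action')
    if action not in ('start', 'stop'):
        raise ValueError(f"Unsupported action: {action!r}")
    return action, list(event.get('instance_ids', []))

def lambda_handler(event, context):
    action, instance_ids = parse_event(event)
    import boto3  # imported lazily so the validation logic runs without AWS access
    ec2 = boto3.client('ec2')
    if action == 'start':
        ec2.start_instances(InstanceIds=instance_ids)
    else:
        ec2.stop_instances(InstanceIds=instance_ids)
    return {'statusCode': 200, 'body': json.dumps({'action': action, 'instances': instance_ids})}
```

Point one EventBridge rule at this handler with a "start" event for mornings and another with a "stop" event for evenings.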
3. Automated S3 Bucket Backup and Sync
Use Case: A media company needs to ensure that their content stored in S3 is backed up to another bucket to prevent data loss.
Purpose: Sync data between two S3 buckets for backup.
Script:
import boto3
s3 = boto3.resource('s3')

def sync_s3_buckets(source_bucket, destination_bucket):
    # Copy every object in the source bucket to the destination bucket
    for obj in s3.Bucket(source_bucket).objects.all():
        copy_source = {'Bucket': source_bucket, 'Key': obj.key}
        s3.Object(destination_bucket, obj.key).copy(copy_source)
    print(f"Data synced from {source_bucket} to {destination_bucket}")

sync_s3_buckets('source-bucket-name', 'destination-bucket-name')
Setup: Replace bucket names and schedule with Lambda or a local cron job.
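For large buckets, copying every object on each run gets wasteful. A sketch of an incremental variant that copies only keys missing from the destination (the set logic is separated out so it can be exercised without AWS):

```python
def keys_to_copy(source_keys, destination_keys):
    # Only keys that do not yet exist in the destination need copying
    return sorted(set(source_keys) - set(destination_keys))

def incremental_sync(source_bucket, destination_bucket):
    import boto3  # imported lazily so keys_to_copy is usable without AWS access
    s3 = boto3.resource('s3')
    source_keys = {obj.key for obj in s3.Bucket(source_bucket).objects.all()}
    dest_keys = {obj.key for obj in s3.Bucket(destination_bucket).objects.all()}
    for key in keys_to_copy(source_keys, dest_keys):
        s3.Object(destination_bucket, key).copy({'Bucket': source_bucket, 'Key': key})
```

Note this only detects new keys; catching modified objects under the same key would require comparing ETags or LastModified timestamps.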
4. DynamoDB Data Export to S3
Use Case: An analytics team requires regular exports of user data from DynamoDB to perform data analysis.
Purpose: Export data from DynamoDB tables to S3 for analytics or backup.
Script:
import json
import boto3
dynamodb = boto3.resource('dynamodb')
s3 = boto3.client('s3')

def export_dynamodb_to_s3(table_name, bucket_name, file_name):
    table = dynamodb.Table(table_name)
    # scan() returns at most 1 MB per call, so page through LastEvaluatedKey
    response = table.scan()
    items = response['Items']
    while 'LastEvaluatedKey' in response:
        response = table.scan(ExclusiveStartKey=response['LastEvaluatedKey'])
        items.extend(response['Items'])
    # default=str handles DynamoDB's Decimal values, which json cannot serialize natively
    s3.put_object(Bucket=bucket_name, Key=file_name, Body=json.dumps(items, default=str))
    print(f"Data exported to s3://{bucket_name}/{file_name}")

export_dynamodb_to_s3('my-table', 'my-bucket', 'backup.json')
Setup: Define table and bucket names, and run periodically.
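One caveat: DynamoDB returns numeric attributes as decimal.Decimal, which string-based serialization flattens into text. If the analytics side needs real JSON numbers, a small stdlib converter sketch:

```python
from decimal import Decimal

def plain(value):
    # Recursively convert DynamoDB Decimal values into int or float
    if isinstance(value, Decimal):
        return int(value) if value % 1 == 0 else float(value)
    if isinstance(value, list):
        return [plain(v) for v in value]
    if isinstance(value, dict):
        return {k: plain(v) for k, v in value.items()}
    return value
```

Run the scanned items through plain() before serializing them with json.dumps.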
5. RDS Instance Snapshot Backup
Use Case: A financial institution needs to ensure database snapshots are taken regularly for disaster recovery purposes.
Purpose: Take regular snapshots of RDS instances to ensure data safety.
Script:
import boto3
rds = boto3.client('rds')

def create_rds_snapshot(db_instance_identifier, snapshot_id):
    rds.create_db_snapshot(
        DBSnapshotIdentifier=snapshot_id,
        DBInstanceIdentifier=db_instance_identifier
    )
    print(f"Snapshot {snapshot_id} created for {db_instance_identifier}")

create_rds_snapshot('my-db-instance', 'my-db-snapshot')
Setup: Replace db_instance_identifier and snapshot_id, then schedule through Lambda.
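Snapshot identifiers must be unique, so a scheduled job cannot reuse a fixed snapshot_id. One approach is to derive it from the current UTC time (the naming format here is an assumption, not an AWS requirement):

```python
from datetime import datetime, timezone

def timestamped_snapshot_id(db_instance_identifier):
    # RDS snapshot identifiers permit letters, digits, and hyphens
    stamp = datetime.now(timezone.utc).strftime('%Y-%m-%d-%H-%M')
    return f"{db_instance_identifier}-{stamp}"
```

For example, create_rds_snapshot('my-db-instance', timestamped_snapshot_id('my-db-instance')) produces a fresh identifier on every scheduled run.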
6. Automated Lambda Deployment
Use Case: A development team frequently updates their serverless application code and needs an efficient deployment process.
Purpose: Automatically deploy updated Lambda functions.
Script:
import boto3
lambda_client = boto3.client('lambda')

def deploy_lambda(function_name, zip_file_path):
    with open(zip_file_path, 'rb') as f:
        lambda_client.update_function_code(FunctionName=function_name, ZipFile=f.read())
    print(f"Lambda {function_name} updated successfully")

deploy_lambda('my-function', '/path/to/your/lambda.zip')
Setup: Package the Lambda code as a .zip file and update function_name.
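Building the .zip can be scripted as well; a stdlib sketch that packages a single handler file (the file names are placeholders):

```python
import zipfile

def package_lambda(source_file, zip_path, arcname='lambda_function.py'):
    # Lambda expects the handler module at the archive root, hence arcname
    with zipfile.ZipFile(zip_path, 'w', zipfile.ZIP_DEFLATED) as zf:
        zf.write(source_file, arcname=arcname)
    return zip_path
```

For functions with third-party dependencies, zip the whole build directory (with dependencies installed alongside the handler) instead of a single file.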
7. AWS Cost and Usage Reporting
Use Case: A finance team needs to monitor AWS spending to prevent budget overruns and optimize resource usage.
Purpose: Automate the generation of AWS cost and usage reports.
Script:
import boto3
from datetime import datetime, timedelta
client = boto3.client('ce')

def get_cost_and_usage(start_date, end_date):
    response = client.get_cost_and_usage(
        TimePeriod={'Start': start_date, 'End': end_date},
        Granularity='DAILY',
        Metrics=['UnblendedCost'],
        GroupBy=[{'Type': 'DIMENSION', 'Key': 'SERVICE'}]
    )
    for result_by_time in response['ResultsByTime']:
        print(f"Date: {result_by_time['TimePeriod']['Start']}")
        for group in result_by_time['Groups']:
            print(f"Service: {group['Keys'][0]}, Cost: {group['Metrics']['UnblendedCost']['Amount']}")

end_date = datetime.today().strftime('%Y-%m-%d')
start_date = (datetime.today() - timedelta(days=7)).strftime('%Y-%m-%d')
get_cost_and_usage(start_date, end_date)
Setup: Enable AWS Cost Explorer and schedule the script execution using a cron job or AWS Lambda.
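The per-day breakdown printed above can also be rolled up into one total per service; a small sketch that operates on the same get_cost_and_usage response shape:

```python
from collections import defaultdict

def total_by_service(response):
    # Sum the UnblendedCost amounts across all days for each service
    totals = defaultdict(float)
    for result_by_time in response['ResultsByTime']:
        for group in result_by_time['Groups']:
            service = group['Keys'][0]
            totals[service] += float(group['Metrics']['UnblendedCost']['Amount'])
    return dict(totals)
```

Feed it the response captured inside get_cost_and_usage to see, for example, a week's EC2 spend as a single number.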
8. CloudWatch Alert Setup
Use Case: An operations team wants to be notified when CPU usage of critical EC2 instances exceeds a certain threshold.
Purpose: Set up CloudWatch alerts for AWS resources.
Script:
import boto3
cloudwatch = boto3.client('cloudwatch')

def create_cloudwatch_alarm(instance_id):
    cloudwatch.put_metric_alarm(
        AlarmName=f'CPU_Utilization_{instance_id}',
        MetricName='CPUUtilization',
        Namespace='AWS/EC2',
        Statistic='Average',
        Period=300,
        EvaluationPeriods=1,
        Threshold=70.0,
        ComparisonOperator='GreaterThanThreshold',
        AlarmActions=['<SNS_TOPIC_ARN>'],
        Dimensions=[{'Name': 'InstanceId', 'Value': instance_id}]
    )
    print(f"CloudWatch alarm created for {instance_id}")

create_cloudwatch_alarm('i-1234567890abcdef0')
Setup: Replace <SNS_TOPIC_ARN> and instance_id.
9. Automated IAM User and Role Management
Use Case: An IT department needs to automate the creation of IAM users for new employees to streamline onboarding.
Purpose: Automate management of IAM users, such as creating or deleting users.
Script:
import boto3
iam = boto3.client('iam')

def create_iam_user(username):
    iam.create_user(UserName=username)
    print(f"IAM user {username} created")

create_iam_user('new-user')
Setup: Define IAM username in the script.
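Since IAM rejects malformed names, it can help to validate before calling create_user. A sketch based on the documented IAM naming rules (letters, digits, and + = , . @ _ -, up to 64 characters):

```python
import re

IAM_USERNAME_RE = re.compile(r'[A-Za-z0-9+=,.@_-]{1,64}')

def is_valid_iam_username(username):
    # Reject names IAM would refuse, before making the API call
    return bool(IAM_USERNAME_RE.fullmatch(username))
```

An onboarding script can run each new employee's name through this check and only call iam.create_user when it passes.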
10. Scheduled AMI Backup for EC2 Instances
Use Case: A cloud administrator wants to ensure EC2 instances are backed up as AMIs on a regular schedule to prevent data loss.
Purpose: Create AMIs for EC2 instances on a schedule.
Script:
import boto3
ec2 = boto3.client('ec2')

def create_ami(instance_id, ami_name):
    # NoReboot=True avoids downtime but may capture an inconsistent filesystem
    ec2.create_image(InstanceId=instance_id, Name=ami_name, NoReboot=True)
    print(f"AMI {ami_name} created for {instance_id}")

create_ami('i-1234567890abcdef0', 'my-ami-backup')
Setup: Schedule with CloudWatch Events and Lambda.
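Scheduled AMI creation accumulates images over time, so many teams pair it with a retention check. A sketch of the age test (the 30-day window is an assumption; deregistering is a separate ec2.deregister_image call):

```python
from datetime import datetime, timedelta, timezone

def is_expired(creation_date, retention_days=30):
    # EC2 reports CreationDate as an ISO 8601 string, e.g. '2024-01-15T10:30:00.000Z'
    created = datetime.fromisoformat(creation_date.replace('Z', '+00:00'))
    return datetime.now(timezone.utc) - created > timedelta(days=retention_days)
```

Loop over ec2.describe_images(Owners=['self'])['Images'] and deregister those where is_expired(image['CreationDate']) is true.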
Conclusion
Using Python scripts for AWS task automation can drastically simplify daily operations. By automating these tasks, you can save time, reduce costs, and focus on more critical aspects of cloud management. Implement these scripts to streamline your workflows and improve overall AWS efficiency.