
Posts

List all AWS IAM roles with their last used date

The following Python script will help you list all the AWS IAM roles along with their last used date. If a role has never been used, it will show 'Never used' instead of a date. You will require Python 3.8 or above to run the script. I prefer to use Tabulate to format the output into a table; you can also format the output as HTML or even convert it into a CSV file. Let's start the script to list all IAM roles and their last used dates.

import boto3
import time
from tabulate import tabulate

Once you've imported the boto3, time and tabulate modules, let's set up the AWS session using the AWS config profile and region name.

session = boto3.Session(profile_name=profile, region_name=region)
iam_client = session.client('iam')

# use a paginator if you have a long list of IAM roles
paginator = iam_client.get_paginator('list_roles')
iterator = paginator.paginate()

The following lines will help to set up the header row of the table in the output. …
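The excerpt above cuts off before the table is built. Here is a minimal sketch of where it is heading, assuming the same profile/region setup as the excerpt; note that list_roles does not populate the last-used timestamp, so this sketch calls get_role for each role, which is my own choice rather than something taken from the original post.

# Minimal sketch, not the original post's code
import boto3
from tabulate import tabulate

profile = "default"   # assumed values
region = "us-east-1"

session = boto3.Session(profile_name=profile, region_name=region)
iam_client = session.client('iam')

rows = []
paginator = iam_client.get_paginator('list_roles')
for page in paginator.paginate():
    for role in page['Roles']:
        # get_role is used here because list_roles does not return RoleLastUsed
        detail = iam_client.get_role(RoleName=role['RoleName'])['Role']
        last_used = detail.get('RoleLastUsed', {}).get('LastUsedDate')
        rows.append([role['RoleName'],
                     last_used.strftime('%Y-%m-%d') if last_used else 'Never used'])

print(tabulate(rows, headers=['Role name', 'Last used'], tablefmt='grid'))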
Recent posts

List the AWS Lambda function information

In this blog, I will share the Python script that will help you to list the AWS Lambda function details such as Function name, Function ARN, Run time, Description and Version etc... I will be using Tabulate module to format my output in to table format. If you are new to Tabulate, please visit my previous blog where I have provided details of Tabulate module. The following Python code is written in to Python 3.8 version.  import boto3 from tabulate import tabulate   After importing the Boto3 and Tabulate module, let's setup our AWS session details by providing region and profile name.  profile = "default" region = "us-east-1" session = boto3 . Session ( profile_name = profile , region_name = region ) lambda_client = session . client ( 'lambda' ) response = lambda_client .list_functions() # Get the list of functions lambda_list = response [ 'Functions' ] The following code will help you to setup a Header row in the table  # Setup heade
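As a rough sketch of how the table could be built from lambda_list (continuing the excerpt above, which already imports tabulate), assuming header labels of my own choosing rather than the post's exact wording:

# Minimal sketch, continuing the excerpt above
headers = ['Function name', 'ARN', 'Runtime', 'Description', 'Version']
rows = []
for fn in lambda_list:
    rows.append([fn['FunctionName'],
                 fn['FunctionArn'],
                 fn.get('Runtime', '-'),       # Runtime is absent for container-image functions
                 fn.get('Description', ''),
                 fn['Version']])

print(tabulate(rows, headers=headers, tablefmt='grid'))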

List all AWS support cases with their details

In this blog, I will show you how to write a Python script that lists all the support cases raised from your AWS account. This script can be very useful when you want a report showing which AWS support cases are open and which are resolved. Let's begin by importing the Boto3 and Tabulate modules. Tabulate will help us present the output in table format; you can install it by running the pip install tabulate command.

import boto3
from tabulate import tabulate

Once you've imported the required modules, we can set up the connection to AWS by giving the default profile and region name. At this stage, the Support API can only be used in the "us-east-1" region, so we will be using the same region in our code too.

# setup profile and region
profile = "default"
region = "us-east-1"

Now, here is the actual code where we will gather the information about all the cases, convert it into table format …
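The excerpt stops before the gathering step. Here is a minimal sketch of one way to fetch and tabulate the cases, mirroring the profile/region setup above; the column choices are my own assumption, and the Support API itself requires a Business or Enterprise support plan.

# Minimal sketch, not the original post's code
import boto3
from tabulate import tabulate

profile = "default"
region = "us-east-1"

session = boto3.Session(profile_name=profile, region_name=region)
support_client = session.client('support')

# Include resolved cases so the report covers both open and closed tickets
response = support_client.describe_cases(includeResolvedCases=True)

rows = []
for case in response['cases']:
    rows.append([case['displayId'], case['subject'],
                 case['status'], case['timeCreated']])

print(tabulate(rows, headers=['Case ID', 'Subject', 'Status', 'Created'], tablefmt='grid'))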

List AWS EC2 instance details with Name tag value

In this blog, we will look at how to run a Python script that gets AWS EC2 details such as the Name tag value of the instance, the instance ID, platform, instance type, state and private IPv4 address. You will require Python 3.8 or above to run the script successfully.

Setting up the Boto3 session and client:

import boto3

# setup profile and region
profile = "default"
region = "us-east-1"
session = boto3.Session(profile_name=profile, region_name=region)
ec2_client = session.client('ec2')

# Gather information about EC2
response = ec2_client.describe_instances()
ec2_list = response['Reservations']

# This will help to retrieve Tags information
ec2_name = session.resource('ec2')
instances = ec2_name.instances.all()

Now, let's print the header row.

# print header
print('Name, Instance ID, Platform, Instance Type, State, Private IPv4')

The following code will run and collect the information about each EC2 instance. …
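The collection loop is cut off in the excerpt. As a minimal sketch continuing from ec2_list above, this reads the Name tag directly from the describe_instances response rather than from the EC2 resource object, which is a simplification of my own:

# Minimal sketch, continuing the excerpt above
for reservation in ec2_list:
    for instance in reservation['Instances']:
        # Pull the Name tag value, if one exists
        name = ''
        for tag in instance.get('Tags', []):
            if tag['Key'] == 'Name':
                name = tag['Value']
        print(name,
              instance['InstanceId'],
              instance.get('Platform', 'linux'),   # 'Platform' is only present for Windows instances
              instance['InstanceType'],
              instance['State']['Name'],
              instance.get('PrivateIpAddress', '-'),
              sep=', ')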

Creating your own Speedtest utility in Python

It is easy to create your own speedtest utility in Python. You will require the speedtest-cli module to build it; to install it, run the "pip install speedtest-cli" command. Once speedtest-cli is installed, import it into your Python file.

import speedtest

result = speedtest.Speedtest()

The following code will help you find the best Speedtest server for your location.

best_server = result.get_best_server()
print(f"Checking speedtest using server located in {best_server['country']}")

Now, run the speed test for the download, upload and ping results.

# Run the download speed test
print("Running download speedtest...")
dw_result = result.download()

# Run the upload speed test
print("Running upload speedtest...")
up_result = result.upload()

# Read the ping result
ping_result = result.results.ping

Using the code below, you can convert the results into Mbps and truncate the values to 2 decimal places. …
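As a minimal sketch of that conversion step (continuing from dw_result, up_result and ping_result above): download() and upload() return bits per second, so dividing by 1,000,000 gives Mbps; the format spec below rounds to 2 decimal places, whereas the original post may truncate instead.

# Minimal sketch, continuing the excerpt above
print(f"Download speed: {dw_result / 1_000_000:.2f} Mbps")
print(f"Upload speed:   {up_result / 1_000_000:.2f} Mbps")
print(f"Ping:           {ping_result:.2f} ms")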

Python - Getting AWS account ID

Here is a Python script that helps you find out the AWS account ID.

# The script helps you to find out the AWS account number
import boto3

# Configure the AWS access key and secret access key to access AWS resources
access_key_id = "test-key"
secret_access_key_id = "secret-key"

sts = boto3.client(
    "sts",
    aws_access_key_id=access_key_id,
    aws_secret_access_key=secret_access_key_id,
)
account_id = sts.get_caller_identity()["Account"]

# Print account number
print(account_id)

# The following code returns the AWS account number when IAM role based permissions are used
account_id = boto3.client("sts").get_caller_identity()["Account"]

# Print account number
print(account_id)

Hope you find it useful.

Disclaimer: www.TechieTalks.co.uk does not conceal the possibility of error and shortcomings due to human or technical factors. www.TechieTalks.co.uk does not bear responsibility upon any loss or da…
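A small additional sketch, not from the original post: the same call works with a named profile from your AWS config file instead of hard-coded keys ("default" is an assumed profile name), and get_caller_identity() also returns the caller's ARN and user ID.

# Minimal sketch using a named profile (assumed to exist in ~/.aws/config)
import boto3

session = boto3.Session(profile_name="default")
identity = session.client("sts").get_caller_identity()

print(identity["Account"])   # account number
print(identity["Arn"])       # caller ARN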

Python script to list all AWS CloudWatch logs with their retention time

The following Python script will help you export a list of all the CloudWatch log groups in an AWS account with their retention time and size. The script also saves the output to a CSV file.

#!/usr/bin/env python3.8
# This script will save the list of CloudWatch log group names, their retention time
# and size (in MB) to a CSV file
import boto3
import time
import csv

# Setting up AWS profile and region
profile = "default"
region = "us-east-1"
session = boto3.Session(profile_name=profile, region_name=region)

# Getting the current date and time
date_time = time.strftime("%d%m%Y_%H%M")

# Setting up the file name
fname = "CloudWatch-Logs_"
ftime = date_time
fext = ".csv"
filename = fname + ftime + fext

# Open the file to write the output
with open(filename, 'w') as f:
    print('Profile, Region, Log Group Name, Retention Time (in days), Size (in MB)', file=f)

cw_logs = …
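The excerpt is cut off at the point where the logs client is created. Here is a minimal sketch of how the log group details could be collected and appended to the same CSV file, continuing from session, filename, profile and region above; the loop structure and variable names are my own assumptions, not the original post's code.

# Minimal sketch, continuing the excerpt above
cw_logs = session.client('logs')

with open(filename, 'a') as f:
    paginator = cw_logs.get_paginator('describe_log_groups')
    for page in paginator.paginate():
        for group in page['logGroups']:
            # Log groups with no retention policy never expire
            retention = group.get('retentionInDays', 'Never expire')
            size_mb = round(group.get('storedBytes', 0) / (1024 * 1024), 2)
            print(f"{profile}, {region}, {group['logGroupName']}, {retention}, {size_mb}",
                  file=f)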