
In this tutorial we are going to see how to write a simple Python script that generates temporary credentials from AWS STS. There are two examples below showing how to use those temporary credentials: one lists the objects inside an S3 bucket, and the other launches a new EC2 instance.

This is by no means an exhaustive list of what you can do with temporary credentials, but the idea is to show the power of STS and the role-based access features available in AWS, and how we can connect to various AWS services using those credentials.

This is going to be a straightforward method without any complexity. As you know, I (mistonline.in) love a simple, basic scripting approach that gets the job done. We believe this helps our users build their own ideas around various use cases.

Create a Role

In order to assume a role we have to first create a role. Take a look at this document on how to create a role in AWS IAM.

Note: While creating the role I selected AWS Account as the trusted entity and left everything else at the default.
In the permissions section I attached a policy that allows me to:

  • List my S3 bucket
  • Launch EC2 Instance

My policy blueprint is shown below. Replace mybucket (a dummy name) with the bucket you would like to list.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": "ec2:RunInstances",
            "Resource": "*"
        },
        {
            "Sid": "VisualEditor1",
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::mybucket/*",
                "arn:aws:s3:::mybucket"
            ]
        }
    ]
}

Note: These policies are for educational purposes only. Make sure you fine-tune access for your users before these policies go to production. Please don’t blindly grant <service_name>:* in the Action statement unless it is required.
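If you prefer to create the role and attach the policy from Python instead of the console, here is a minimal sketch using boto3's IAM client. The role name and inline policy name below are placeholders I made up for illustration; replace <account> with your own account ID.

import json
import boto3

iam = boto3.client("iam")

# Trust policy: allow principals in your own AWS account to assume this role.
# Replace <account> with your 12-digit account ID.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::<account>:root"},
            "Action": "sts:AssumeRole"
        }
    ]
}

# The permissions policy shown above (EC2 RunInstances + S3 access to mybucket).
permissions_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Allow", "Action": "ec2:RunInstances", "Resource": "*"},
        {
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": ["arn:aws:s3:::mybucket/*", "arn:aws:s3:::mybucket"]
        }
    ]
}

# Create the role with the trust policy, then attach the permissions inline.
iam.create_role(RoleName="my-temp-role",
                AssumeRolePolicyDocument=json.dumps(trust_policy))
iam.put_role_policy(RoleName="my-temp-role",
                    PolicyName="s3-list-ec2-run",
                    PolicyDocument=json.dumps(permissions_policy))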

Get temporary credentials

So now we have the role ready with the policy (shown above) attached. The next step is to write the code that gets temporary credentials from the AWS Security Token Service (STS).

Since we are using Python here, the boto3 module needs to be installed. Take a look at this documentation to install boto3. At the time of writing this tutorial I am using Python 3.9.

Once done with the boto3 installation, use the code below to get the temporary credentials. It prints the access key, secret key and session token that are generated, along with their expiry time.

import boto3

# ARN of the role to assume.
arn = "arn:aws:iam::<account>:role/<role_name>"
session_name = "my-temp-session-2022"
client = boto3.client("sts")
# assume_role takes the role's ARN and a session name of your choice.
response = client.assume_role(RoleArn=arn, RoleSessionName=session_name)
# The temporary credentials are returned under the "Credentials" key.
temp_credentials = response["Credentials"]
expire = temp_credentials["Expiration"]
print("Credential Expiry Time : " + expire.isoformat())
print("Access Key : " + temp_credentials["AccessKeyId"])
print("Secret Key: " + temp_credentials["SecretAccessKey"])
print("Session Token: " + temp_credentials["SessionToken"])

Replace the ARN above with the ARN of the role you created. You can give any name for the session.

Awesome, we now have temporary credentials to connect to the specific services allowed for that role in the account (ARN).
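If you plan to call several services, a convenient alternative is to wrap the temporary credentials in a boto3 Session once and create clients or resources from it, instead of passing the three values to every client. A minimal sketch, assuming the temp_credentials variable from the code above (the rest of this tutorial passes the keys explicitly, which works the same way):

from boto3.session import Session

# Build a session backed by the temporary credentials.
temp_session = Session(
    aws_access_key_id=temp_credentials["AccessKeyId"],
    aws_secret_access_key=temp_credentials["SecretAccessKey"],
    aws_session_token=temp_credentials["SessionToken"],
    region_name="us-east-1"
)

# Clients and resources created from this session reuse those credentials.
s3 = temp_session.resource("s3")
ec2 = temp_session.client("ec2")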

List S3 bucket

We now have the temporary credentials and can use them according to the policy defined in the role. The role includes a permission block to list my bucket, so in the example code we use the same temporary credentials to list an S3 bucket.

Once you have the STS credentials handy, add the code block below to list the objects inside the bucket mybucket. Replace mybucket with a bucket you own.

s3_resource = boto3.resource(
    "s3",
    aws_access_key_id=temp_credentials["AccessKeyId"],
    aws_secret_access_key=temp_credentials["SecretAccessKey"],
    aws_session_token=temp_credentials["SessionToken"],
    region_name="us-east-1"
)
my_bucket = s3_resource.Bucket('mybucket')
print(f"Listing buckets for the assumed role's account:")
for bucket in my_bucket.objects.all():
    print(bucket.key)

As you can see above, I am passing in the access key, secret key and session token extracted from the STS response:

    aws_access_key_id=temp_credentials["AccessKeyId"],
    aws_secret_access_key=temp_credentials["SecretAccessKey"],
    aws_session_token=temp_credentials["SessionToken"],
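The resource iterator above pages through large buckets for you. For completeness, here is the equivalent with the low-level S3 client and a paginator, a sketch assuming the same temp_credentials and bucket name:

s3_client = boto3.client(
    "s3",
    aws_access_key_id=temp_credentials["AccessKeyId"],
    aws_secret_access_key=temp_credentials["SecretAccessKey"],
    aws_session_token=temp_credentials["SessionToken"],
    region_name="us-east-1"
)

# Paginate through all objects in the bucket (1000 keys per page by default).
paginator = s3_client.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket="mybucket"):
    for obj in page.get("Contents", []):
        print(obj["Key"])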

Launch EC2 instance

Using the same credentials we can also launch an EC2 instance. I am not going to explain each and every line of the code, as those details are outside the scope of this tutorial.

# Create an EC2 client in us-east-1 using the temporary credentials.
ec2 = boto3.client("ec2",
                   region_name="us-east-1",
                   aws_access_key_id=temp_credentials["AccessKeyId"],
                   aws_secret_access_key=temp_credentials["SecretAccessKey"],
                   aws_session_token=temp_credentials["SessionToken"])

# Launch a single t2.micro instance from the given AMI.
conn_me = ec2.run_instances(InstanceType="t2.micro",
                            MaxCount=1,
                            MinCount=1,
                            ImageId="ami-xxx")
print(conn_me)

In the above code we connect to the us-east-1 region using the temporary credentials we got from here. After that we launch a t2.micro instance using the AMI ami-xxx (replace it with an available public or private AMI ID). Change MaxCount to increase the number of instances launched. More details on ec2.run_instances can be found here.
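If you want the script to confirm that the instance actually started, a small follow-up sketch (assuming the conn_me response and ec2 client above) is to pull the instance ID out of the response and use the built-in waiter:

# Extract the ID of the instance we just launched.
instance_id = conn_me["Instances"][0]["InstanceId"]
print("Launched instance: " + instance_id)

# Block until EC2 reports the instance is running.
waiter = ec2.get_waiter("instance_running")
waiter.wait(InstanceIds=[instance_id])
print("Instance " + instance_id + " is now running.")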
