Within this post, we will cover:
- How to automate copying or syncing data/objects from one bucket to another.
- How we can use an EC2 instance to copy data from one bucket to another.
- How to leverage an AWS IAM role and the AWS S3 CLI to accomplish the requirement.
- An AWS CloudFormation script to create the IAM role and inline policy.
Let's look at our lab setup; you can adapt it to your own requirement by replacing the variables.
- We already have an EC2 instance in region ap-south-1 (Mumbai).
- Since S3 is region independent, we will not be highlighting its region here.
- We have two different buckets, each containing one file, within the same AWS account:
- Bucket 1 name : cyberkeeda-bucket-a --> demo-file-A.txt
- Bucket 2 name : cyberkeeda-bucket-b --> demo-file-B.txt
- We will copy data from cyberkeeda-bucket-a to cyberkeeda-bucket-b by running AWS CLI commands from our EC2 instance.
- The above task can be done using the AWS CLI from any host, but the major difference is that on other hosts one needs to store credentials by running the aws configure command.
- We will bypass the aws configure command by assigning an instance profile IAM role (a quick verification of this is sketched right after this list).
- We will create an IAM role with an inline policy.
- We will use a CloudFormation script to create the required role.
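As a minimal sketch of what that bypass looks like in practice: once the role we create below is attached to the instance, the CLI picks up temporary credentials from the instance metadata service automatically, so no access keys are stored on the host. The commands below are only illustrative checks, not part of the setup.

# Confirm which identity the CLI is using (should show the assumed role, with no aws configure needed)
aws sts get-caller-identity

# Inspect the instance profile credentials via the metadata service (IMDSv1 shown; IMDSv2 requires a session token)
curl http://169.254.169.254/latest/meta-data/iam/security-credentials/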
A few things we must know about IAM roles before proceeding further:
- IAM Role : An IAM role is a set of permissions created to make AWS service requests, i.e. requests to services such as S3, EC2, Lambda, etc.
- IAM roles are not attached to any user or group; they are assumed by other AWS services (EC2, Lambda) or by applications.
- Policy : A policy is a set of permissions allowed/denied to a role, user, or group.
- Managed Policy : A policy created with reusability in mind; you create it once and can map it to multiple users/services/roles.
- Inline Policy : A policy created for a one-to-one mapping between the policy and a single entity (see the CLI sketch after this list).
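To make the managed vs. inline distinction concrete, here is a small sketch using the AWS CLI; the role name MyDemoRole, the policy name MyInlinePolicy, and the file policy.json are hypothetical placeholders, and AmazonS3ReadOnlyAccess is just one example of an AWS managed policy.

# Inline policy: embedded directly in the role (one-to-one), which is what our CloudFormation template below does
aws iam put-role-policy --role-name MyDemoRole --policy-name MyInlinePolicy --policy-document file://policy.json

# Managed policy: a standalone, reusable policy attached to the role by its ARN
aws iam attach-role-policy --role-name MyDemoRole --policy-arn arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess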
CloudFormation Script to create IAM Role and Inline Policy.
AWSTemplateFormatVersion: 2010-09-09
Description: >-
  CFN Script to create role and inline policy for ec2 instance.
  Will be used further to transfer data from Source bucket to Destination bucket.
  Author - Jackuna ( https://github.com/Jackuna )

Parameters:
  RoleName:
    Type: String
    Description: Provide Role Name that will be assumed by EC2. [a-z][a-z0-9]*
  InlinePolicyName:
    Type: String
    Description: Provide Inline Policy name, it will be attached to the above created role. [a-z][a-z0-9]*
  SourceBucketName:
    Type: String
    Description: Provide Source Bucket name [a-z][a-z0-9]*
  DestinationBucketName:
    Type: String
    Description: Provide Destination Bucket name [a-z][a-z0-9]*

Resources:
  RootRole:
    Type: 'AWS::IAM::Role'
    Properties:
      RoleName: !Sub "${RoleName}"
      AssumeRolePolicyDocument:
        Version: 2012-10-17
        Statement:
          - Effect: Allow
            Principal:
              Service: ["ec2.amazonaws.com"]
            Action: ['sts:AssumeRole']
      Policies:
        - PolicyName: !Sub ${InlinePolicyName}
          PolicyDocument:
            Version: '2012-10-17'
            Statement:
              - Effect: Allow
                Action:
                  - s3:ListBucket
                  - s3:PutObject
                  - s3:GetObject
                Resource:
                  - !Sub arn:aws:s3:::${SourceBucketName}/*
                  - !Sub arn:aws:s3:::${SourceBucketName}
              - Effect: Allow
                Action:
                  - s3:ListBucket
                  - s3:PutObject
                  - s3:GetObject
                Resource:
                  - !Sub arn:aws:s3:::${DestinationBucketName}/*
                  - !Sub arn:aws:s3:::${DestinationBucketName}
  RootInstanceProfile:
    Type: 'AWS::IAM::InstanceProfile'
    DependsOn:
      - RootRole
    Properties:
      Path: /
      InstanceProfileName: !Sub "${RoleName}"
      Roles:
        - !Ref RoleName

Outputs:
  RoleDetails:
    Description: Role Name
    Value: !Ref RootRole
  PolicyDetails:
    Description: Inline Policy Name
    Value: !Ref InlinePolicyName
Steps to use the above CloudFormation script (a CLI alternative is sketched after these steps):
- Copy the above content, save it into a file, and name it iam_policy_role.yaml
- Go to AWS Console --> Services --> CloudFormation --> Create Stack
- Choose the options Template is ready and Upload a template file, upload your saved template iam_policy_role.yaml --> Next
- The next page will ask you for the required parameters as input; we will fill them in as per our lab setup and requirement.
- Stack Name : Name of the stack ( could be anything )
- Source Bucket name : Name of the bucket from which we want to copy data ( cyberkeeda-bucket-a in our lab )
- Destination Bucket name : Name of the bucket to which we want to copy data from our source bucket ( cyberkeeda-bucket-b in our lab )
- Role Name : Name of your IAM role ( could be anything )
- Inline Policy : Name of your policy, which will allow the list, get, and put object permissions on the buckets ( could be anything )
- Click Next --> Click Next again, then tick the checkbox to acknowledge IAM resource creation --> Create Stack.
- Once the stack status shows CREATE_COMPLETE, click on the Outputs tab and verify the names of your created resources.
- Now go to the IAM console and search for the above created role.
- Once verified, we can go to our EC2 instance, where we will attach the above created role to grant access to the S3 buckets.
- AWS Console → EC2 → Search instance → yourInstanceName → Right Click → Instance Settings → Attach/Replace IAM Role → Choose the above created IAM role (s3_copy_data_between_buckets_role) → Apply
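If you prefer the CLI over the console, a minimal sketch of the same deployment is below; the stack name, the inline policy name, and the instance ID are assumed placeholders, while the role and bucket values follow our lab setup. The --capabilities flag is required because the template creates named IAM resources.

# Create the stack from the saved template (assumed stack name: s3-copy-role-stack)
aws cloudformation create-stack \
  --stack-name s3-copy-role-stack \
  --template-body file://iam_policy_role.yaml \
  --parameters ParameterKey=RoleName,ParameterValue=s3_copy_data_between_buckets_role \
               ParameterKey=InlinePolicyName,ParameterValue=s3_copy_data_between_buckets_policy \
               ParameterKey=SourceBucketName,ParameterValue=cyberkeeda-bucket-a \
               ParameterKey=DestinationBucketName,ParameterValue=cyberkeeda-bucket-b \
  --capabilities CAPABILITY_NAMED_IAM

# Attach the resulting instance profile to the instance (replace i-0123456789abcdef0 with your instance ID)
aws ec2 associate-iam-instance-profile \
  --instance-id i-0123456789abcdef0 \
  --iam-instance-profile Name=s3_copy_data_between_buckets_role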
Now we are ready to test, verify, and further automate it using a cron job (a sample cron entry is shown at the end of this post).
- Log in to your EC2 instance.
- Run the below commands to verify you have proper access to both S3 buckets.
List content within bucket.
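For example, using our lab bucket names:

aws s3 ls s3://cyberkeeda-bucket-a/
aws s3 ls s3://cyberkeeda-bucket-b/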
Copy file/content from one bucket to another.
- Now we will try to copy the file named demo-file-A.txt from bucket cyberkeeda-bucket-a to cyberkeeda-bucket-b.
aws s3 cp s3://SOURCE-BUCKET-NAME/FILE-NAME s3://DESTINATION-BUCKET-NAME/FILE-NAME
aws s3 cp s3://cyberkeeda-bucket-a/demo-file-A.txt s3://cyberkeeda-bucket-b/demo-file-A.txt
Sync all files/content from one bucket to another.
aws s3 sync s3://SOURCE-BUCKET-NAME/ s3://DESTINATION-BUCKET-NAME/
aws s3 sync s3://cyberkeeda-bucket-a/ s3://cyberkeeda-bucket-b/
Sync all files/content from one bucket to another with the ACL set to bucket-owner-full-control.
aws s3 sync --acl bucket-owner-full-control s3://cyberkeeda-bucket-a/ s3://cyberkeeda-bucket-b/
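To automate this further with cron, as mentioned above, here is a minimal sketch; the hourly schedule, the log file path, and the aws binary path are assumptions (check yours with: which aws).

# Edit the crontab on the EC2 instance with: crontab -e
# Example entry: sync the buckets every hour and append output to a log file
0 * * * * /usr/bin/aws s3 sync --acl bucket-owner-full-control s3://cyberkeeda-bucket-a/ s3://cyberkeeda-bucket-b/ >> /var/log/s3-bucket-sync.log 2>&1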
That's it for this post; we will cover how to do the same across AWS accounts in the next post.
Feel free to comment if you face any issue implementing it.
Thank you.