
AWS IAM Policy to Allow All Operations except IAM

 





The policy template below can be used to grant a user, or attach to a role, the following set of permissions.

  • Allow all services.
  • Allow all resources.
  • Allow all actions on every resource.
  • Deny all IAM operations and actions.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": "*",
            "Resource": "*"
        },
        {
            "Sid": "VisualEditor1",
            "Effect": "Deny",
            "Action": "iam:*",
            "Resource": "*"
        }
    ]
}
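Note that the explicit Deny on iam:* always overrides the blanket Allow, which is what keeps IAM off-limits. To try it out, the JSON above can be saved to a file and attached as an inline policy from the AWS CLI; a minimal sketch below, where the role name demo-admin-role and the file name are assumptions, not part of the original template:

# Attach the policy above as an inline policy on an existing role
# (role name, policy name and file name are placeholders)
aws iam put-role-policy \
  --role-name demo-admin-role \
  --policy-name AllowAllExceptIAM \
  --policy-document file://allow-all-except-iam.json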


I have spent some time putting this small template together; I hope it finds you via Google and saves you some time.



AWS Managed Policy to Restrict an IAM User to Accessing AWS Resources from a Specific IP Address.

 




AWS Managed Policy

Within this blog post, we will cover how an IAM Managed Policy can be used to create an IAM user boundary that limits a user to the below operations.

  • AWS S3 limited access [Get, Put, List]
  • S3 access from only a single IP address.
Syntax Template

AWSTemplateFormatVersion: 2010-09-09
Description: CFN to create ManagedPolicy

Resources:
  IBDSReconUserBoundaryPolicy:
    Type: AWS::IAM::ManagedPolicy
    Properties:
      Description: A ManagedPolicy meant to restrict user based upon ingress IP.
      ManagedPolicyName: my_s3_user_boundary
      Path: /
      Users:
      - my_s3_user
      PolicyDocument:
        Version: '2012-10-17'
        Statement:
        - Effect: Allow
          Action:
          - s3:ListBucket
          - s3:GetBucketLocation
          Resource: arn:aws:s3:::my-randon-s3-bucket
        - Effect: Allow
          Action:
          - s3:PutObject
          - s3:PutObjectAcl
          Resource: arn:aws:s3:::my-randon-s3-bucket/bucketfiles/*
          Condition:
            IpAddressIfExists:
              aws:SourceIp: 123.345.657.12
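
One way to roll this template out is with the AWS CLI; a minimal sketch, assuming the template is saved as user_boundary_policy.yaml (the file and stack names are assumptions, and the user my_s3_user must already exist for the policy attachment to succeed):

# Deploy the managed-policy template; CAPABILITY_NAMED_IAM is required because the policy has an explicit name
aws cloudformation deploy \
  --template-file user_boundary_policy.yaml \
  --stack-name s3-user-boundary-policy \
  --capabilities CAPABILITY_NAMED_IAM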


AWS S3 - Copy data from one bucket to another without storing credentials anywhere.


Within this post, we will cover.

  • How to automate copying or syncing data/objects from one bucket to another.
  • How we can use an EC2 instance to copy data from one bucket to another.
  • How we can leverage the power of an AWS IAM role and the AWS S3 CLI to accomplish our requirement.
  • An AWS CloudFormation script to create the IAM role and inline policy.


So let's go through our lab setup; you can map it to your own requirement by replacing the variables.

  • We already have an EC2 instance within zone ap-south-1 ( Mumbai ).
  • Since S3 is region independent, we will not be highlighting it here.
  • We have two different buckets, each containing one file, within the same AWS account:
    • Bucket 1 name : cyberkeeda-bucket-a --> demo-file-A.txt
    • Bucket 2 name : cyberkeeda-bucket-b --> demo-file-B.txt
  • We will copy data from cyberkeeda-bucket-a to cyberkeeda-bucket-b by running AWS CLI commands from our EC2 instance.
  • The above task can be done using AWS CLI commands from any host, but the major difference is that one then needs to store credentials by running the aws configure command.
  • We will bypass the aws configure command by assigning an instance profile IAM role.
  • We will create an IAM role with an inline policy.
  • We will use a CloudFormation script to create the required role.

A few things we must know about IAM roles before proceeding further:

  • IAM Role : An IAM role is a set of permissions created to make AWS service requests; a service request here means a request made to services such as S3, EC2, Lambda, and so on.
  • IAM roles are not attached to any user or group; they are assumed by other AWS services ( EC2, Lambda ) or applications.
  • Policy : A policy can be defined as a set of permissions allowed/denied to a role, user, or group.
  • Managed Policy : A policy created with reusability in mind; it is created once and can be mapped to multiple users/services/roles.
  • Inline Policy : A policy created for a one-to-one mapping between a policy and an entity.
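
This is why no credentials need to be stored on the instance: once the role is attached as an instance profile, the AWS CLI automatically picks up temporary credentials from the instance metadata service. A minimal sketch to see them, assuming the role created later in this post is named s3_copy_data_between_buckets_role (IMDSv1-style requests shown for brevity):

# Run from inside the EC2 instance: list the role attached via the instance profile
curl http://169.254.169.254/latest/meta-data/iam/security-credentials/
# Fetch the temporary AccessKeyId, SecretAccessKey and Token the AWS CLI uses behind the scenes
curl http://169.254.169.254/latest/meta-data/iam/security-credentials/s3_copy_data_between_buckets_role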

CloudFormation Script to create IAM Role and Inline Policy.


AWSTemplateFormatVersion: 2010-09-09
Description: >-
  CFN Script to create role and inline policy for ec2 instance.
  Will be used further to transfer data from Source bucket to Destination bucket.
  Author - Jackuna ( https://github.com/Jackuna)

Parameters:
  RoleName:
    Type: String
    Description: Provide Role Name that will be assumed by EC2. [a-z][a-z0-9]*
  InlinePolicyName:
    Type: String
    Description: Provide Inline Policy name, it will be attached to the above created role. [a-z][a-z0-9]*
  SourceBucketName:
    Type: String
    Description: Provide Source Bucket name [a-z][a-z0-9]*
  DestinationBucketName:
    Type: String
    Description: Provide Destination Bucket name [a-z][a-z0-9]*

Resources:
  RootRole:
    Type: 'AWS::IAM::Role'
    Properties:
      RoleName: !Sub "${RoleName}"
      AssumeRolePolicyDocument:
        Version: 2012-10-17
        Statement:
          - Effect: Allow
            Principal:
              Service: ["ec2.amazonaws.com"]
            Action: ['sts:AssumeRole']
      Policies:
        - PolicyName: !Sub ${InlinePolicyName}
          PolicyDocument:
            Version: '2012-10-17'
            Statement:
              - Effect: Allow
                Action:
                - s3:ListBucket
                - s3:PutObject
                - s3:GetObject
                Resource:
                - !Sub arn:aws:s3:::${SourceBucketName}/*
                - !Sub arn:aws:s3:::${SourceBucketName}
              - Effect: Allow
                Action:
                - s3:ListBucket
                - s3:PutObject
                - s3:GetObject
                Resource:
                - !Sub arn:aws:s3:::${DestinationBucketName}/*
                - !Sub arn:aws:s3:::${DestinationBucketName}
  RootInstanceProfile:
    Type: 'AWS::IAM::InstanceProfile'
    DependsOn:
      - RootRole
    Properties:
      Path: /
      InstanceProfileName: !Sub "${RoleName}"
      Roles:
      - !Ref RoleName

Outputs:
  RoleDetails:
    Description: Role Name
    Value: !Ref RootRole
  PolicyDetails:
    Description: Inline Policy Name
    Value: !Ref InlinePolicyName


Steps to use the above CloudFormation script:
  • Copy the above content, save it into a file, and name it iam_policy_role.yaml
  • Go to AWS Console --> Services --> CloudFormation --> Create Stack
  • Choose the options Template is ready and Upload a template file, upload your saved template iam_policy_role.yaml --> Next

  • The next page will ask for the required parameters as input; we will fill them in as per our lab setup and requirement.
    • Stack Name : Name of the stack ( could be anything )
    • Source Bucket name : Name of the bucket from which we want to copy data.
    • Destination Bucket name : Name of the bucket into which we want to copy data from our source bucket.
    • Role Name : Name of your IAM role ( could be anything )
    • Inline Policy : Name of your policy, which will allow the list, get, and put object permissions on the buckets ( could be anything )

  • Click Next --> click Next again, tick the check box to acknowledge --> then Create Stack.
  • The next screen shows the CloudFormation stack creation progress; wait and use the refresh button until the stack status says it is completed.

  • Once the stack status is CREATE_COMPLETE, click on the Outputs tab and verify the names of your created resources.
  • Now switch to the IAM console and search for the above created role.
  • Once verified, we can go to our EC2 instance, where we will attach the above created role to give it access to the S3 buckets.
  • AWS Console → EC2 → Search instance → yourInstanceName → Right Click → Instance Settings → Attach/Replace IAM Role → choose the above created IAM role (s3_copy_data_between_buckets_role) --> Apply
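
If you prefer the CLI over the console, the same stack and role attachment can be done from a terminal; a hedged sketch below, assuming the template is saved as iam_policy_role.yaml and using the lab's bucket and role names (the stack name, policy name and instance id are placeholders):

# Create the stack from the CLI; CAPABILITY_NAMED_IAM is required because the template names IAM resources
aws cloudformation create-stack \
  --stack-name s3-copy-role-stack \
  --template-body file://iam_policy_role.yaml \
  --parameters ParameterKey=RoleName,ParameterValue=s3_copy_data_between_buckets_role \
               ParameterKey=InlinePolicyName,ParameterValue=s3_copy_data_between_buckets_policy \
               ParameterKey=SourceBucketName,ParameterValue=cyberkeeda-bucket-a \
               ParameterKey=DestinationBucketName,ParameterValue=cyberkeeda-bucket-b \
  --capabilities CAPABILITY_NAMED_IAM

# Attach the resulting instance profile to the EC2 instance (instance id is a placeholder)
aws ec2 associate-iam-instance-profile \
  --instance-id i-0123456789abcdef0 \
  --iam-instance-profile Name=s3_copy_data_between_buckets_role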


Now we are ready to test, verify, and further automate it using a cron job.
  • Login to your EC2 instance.
  • Run the below commands to verify you have proper access to both S3 buckets.
List the contents of each bucket.

 aws s3 ls s3://cyberkeeda-bucket-a/
aws s3 ls s3://cyberkeeda-bucket-b/


You can see from the output of the above commands the files present in each bucket.

Copy file/content from one bucket to another.

  • Now we will try to copy the file named demo-file-A.txt from bucket cyberkeeda-bucket-a to cyberkeeda-bucket-b


 aws s3 cp s3://SOURCE-BUCKET-NAME/FILE-NAME s3://DESTINATION-BUCKET-NAME/FILE-NAME
aws s3 cp s3://cyberkeeda-bucket-a/demo-file-A.txt  s3://cyberkeeda-bucket-b/demo-file-A.txt
Sync all file/content from one bucket to another.

 aws s3 sync s3://SOURCE-BUCKET-NAME/ s3://DESTINATION-BUCKET-NAME/
aws s3 sync s3://cyberkeeda-bucket-a/  s3://cyberkeeda-bucket-b/
Sync all file/content from one bucket to another with ACL as bucket owner.

 aws s3 sync --acl bucket-owner-full-control s3://cyberkeeda-bucket-a/  s3://cyberkeeda-bucket-b/
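
To automate the sync, a cron entry on the EC2 instance is enough, since the instance profile supplies the credentials. A minimal sketch; the schedule, aws binary path, and log file location are assumptions:

# Hypothetical crontab entry: sync the two lab buckets every hour and append the output to a log file
0 * * * * /usr/bin/aws s3 sync --acl bucket-owner-full-control s3://cyberkeeda-bucket-a/ s3://cyberkeeda-bucket-b/ >> /var/log/s3-bucket-sync.log 2>&1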

That's it for this post; we will cover how to do the same across AWS accounts in the next post.
Feel free to comment if you face any issues implementing it.


