
AWS CloudFormation : AWS Specific Parameter Types



AWS CloudFormation Parameters.


The Parameters section within a CloudFormation template is the way to gather inputs from the user, which can then be referenced in other sections of the entire CloudFormation script.

Parameter Type
Parameter Type plays a very important role: it enables CloudFormation to validate inputs early in the stack creation process. In other words, CloudFormation can validate your input against the declared type before stack creation even begins.

Here are the valid parameter types supported by AWS CloudFormation:

Type : Details
String : A literal string.
Number : An integer or float.
List<Number> : An array of integers or floats.
CommaDelimitedList : An array of literal strings separated by commas.
List<AWS::EC2::SecurityGroup::Id> : An array of security group IDs.
AWS::EC2::KeyPair::KeyName : An Amazon EC2 key pair name.
AWS::EC2::SecurityGroup::Id : A security group ID.
AWS::EC2::Subnet::Id : A subnet ID.
AWS::EC2::VPC::Id : A VPC ID.
List<AWS::EC2::VPC::Id> : An array of VPC IDs.
List<AWS::EC2::Subnet::Id> : An array of subnet IDs.

Parameters section example.



Parameters:
  EnvironmentName:
    Description: Select the environment.
    Type: String
    Default: dev
    AllowedValues:
      - dev
      - prd
  EC2InstanceType:
    Type: String
    Default: t2.micro
    AllowedValues:
      - t2.micro
      - t2.small
  EC2KeyName:
    Description: Select your Key name from the list.
    Type: AWS::EC2::KeyPair::KeyName
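
Once declared, these parameters can be referenced from other sections, most commonly via !Ref or inside a !Sub string. Below is a minimal sketch, assuming a hypothetical EC2 instance resource; the AMI ID is a placeholder.

Resources:
  DemoEC2Instance:
    Type: AWS::EC2::Instance
    Properties:
      ImageId: ami-0abcdef1234567890  # placeholder, use a valid AMI for your region
      InstanceType: !Ref EC2InstanceType
      KeyName: !Ref EC2KeyName
      Tags:
        - Key: Environment
          Value: !Ref EnvironmentName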





AWS CloudWatch : How to create a CloudWatch Rule and schedule a Lambda Function using it.



AWS CloudWatch


Amazon CloudWatch is a monitoring and management service that provides data and actionable insights for AWS, hybrid, and on-premises applications and infrastructure resources. With CloudWatch, you can collect and access all your performance and operational data in the form of logs and metrics from a single platform.


CloudWatch Rule : Create rules to invoke Targets based on Events happening in your AWS environment.

So this tutorial is just a walkthrough of:
  • How to create a CloudWatch rule using the AWS Console.
  • How to create a CloudWatch rule that invokes a Lambda function using the rule scheduler and a cron expression.
  • What constant keys are and how they are used when a Lambda function is the target.

What we are trying to achieve here :
  • We have a Lambda function named "Check_File"; its job is to check for the existence of a file.
  • The file name is dynamic : Format --> YYMMDDdemo-file-A.txt
  • Thus this part of the file name, demo-file-A.txt, is treated as the file_suffix.
  • So we will use constant keys to send the required values, which arrive as the EVENT within our Lambda function.
  • Our dynamic file gets uploaded every day between 11:30 and 12:00, so we will trigger the Lambda function via a CloudWatch rule with a custom cron expression to check for the file's existence.
Lab Setup details

  • AWS Lambda Function name : Check_File (a sketch of such a function follows below)
  • AWS Lambda Function Task : Checks the existence of a file uploaded daily between 11:30 and 12:00.
  • Cron Timing : Every five minutes between 11:30 and 12:00 GMT.
  • Constant Keys : { "bucket_sub_directory": "reports/", "file_suffix": "demo-file-A.txt", "bucket_name": "cyberkeeda-bucket-a", "fileType": "daily"}
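
For reference, below is a minimal sketch of what such a Check_File function could look like, written as an inline CloudFormation Lambda resource. The runtime, role, and existence-check logic here are assumptions for illustration, not the original function's code; the point to note is that the rule's constant keys arrive verbatim as the event object.

CheckFileFunction:
  Type: AWS::Lambda::Function
  Properties:
    FunctionName: Check_File
    Runtime: python3.9
    Handler: index.lambda_handler
    Role: !GetAtt LambdaIAMRole.Arn  # hypothetical execution role defined elsewhere
    Code:
      ZipFile: |
        import boto3

        def lambda_handler(event, context):
            # Constant keys configured on the CloudWatch rule arrive as the event
            s3 = boto3.client('s3')
            resp = s3.list_objects_v2(Bucket=event['bucket_name'],
                                      Prefix=event['bucket_sub_directory'])
            found = any(obj['Key'].endswith(event['file_suffix'])
                        for obj in resp.get('Contents', []))
            print('File with suffix %s found: %s' % (event['file_suffix'], found))
            return {'file_found': found}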

Steps to create a CloudWatch rule via the AWS Console.

  • AWS Console --> CloudWatch --> Rules --> Create Rule
  • Enter the required details:
    • Event Source :
      • Schedule --> Custom Cron --> 30,35,40,45,50,55,59 11 ? * * *
    • Targets :
      • Target --> Lambda Function --> Check_File
      • Constant Keys --> { "bucket_sub_directory": "reports/", "file_suffix": "demo-file-A.txt", "bucket_name": "cyberkeeda-bucket-a", "fileType": "daily"}
  • Configure rule details :
    • Name : CloudWatch rule name.
    • Description : Describe it, then click on Create Rule.
Note : Replace the above inputs with your own.


We are done with the creation of the CloudWatch rule via the console.

Verify the CloudWatch logs to confirm the scheduled Lambda executed.


Cloudwatch Rule via CloudFormation Script.

Use the below AWS CFN template to create the above CloudWatch rule via AWS CloudFormation.


AWSTemplateFormatVersion: 2010-09-09
Description: CFN Script to create a CloudWatch Rule with cron that triggers an existing Lambda Function.
Parameters:
  LambdaFunctionNameARNParam:
    Type: String
    Description: Provide Lambda Function ARN Parameter. [a-z][a-z0-9]*
  CloudwatchRuleNameParm:
    Type: String
    Description: Provide Cloudwatch Rule name. [a-z][a-z0-9]*

Resources:
  ScheduledRule:
    Type: AWS::Events::Rule
    Properties:
      Description: !Sub "Scheduled Rule created to invoke Lambda Function: ${LambdaFunctionNameARNParam}"
      Name: !Sub ${CloudwatchRuleNameParm}
      ScheduleExpression: cron(0/5 11-12 ? * * *)
      State: "DISABLED"  # switch to ENABLED once you are ready to activate the schedule
      Targets:
      - Id: Lambda1
        Arn: !Sub ${LambdaFunctionNameARNParam}
        Input: '{ "bucket_sub_directory": "reports/",
                  "file_suffix": "demo-file-A.txt",
                  "bucket_name": "cyberkeeda-bucket-a",
                  "fileType": "random"
                }'

  PermissionForEventsToInvokeLambda:
    Type: AWS::Lambda::Permission
    Properties:
      FunctionName: !Sub ${LambdaFunctionNameARNParam}
      Action: "lambda:InvokeFunction"
      Principal: "events.amazonaws.com"
      SourceArn: !GetAtt ScheduledRule.Arn
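
To deploy this template from the command line, a sketch like the below could be used; the stack name, template file name, and Lambda ARN are placeholders to replace with your own.

aws cloudformation create-stack \
  --stack-name check-file-rule \
  --template-body file://cloudwatch-rule.yaml \
  --parameters ParameterKey=LambdaFunctionNameARNParam,ParameterValue=arn:aws:lambda:ap-south-1:123456789012:function:Check_File \
               ParameterKey=CloudwatchRuleNameParm,ParameterValue=check-file-rule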




AWS SNS : How to create SNS Topic and add email Subscribers to it.



AWS SNS


Amazon Simple Notification Service (SNS) is a highly available, durable, secure, fully managed pub/sub messaging service that enables you to decouple microservices, distributed systems, and serverless applications. 
Additionally, SNS can be used to trigger notifications to end users via mobile push, SMS, and email.
Source : Official


Within this post, we will cover:

  • How to create an SNS topic and attach email subscribers to it via the AWS Console.
  • How to create an SNS topic and attach email subscribers to it via an AWS CloudFormation script.

We will start with the use cases:

  • If you want to trigger email notifications coupled to AWS services like an AWS Lambda function.
  • If you want to fan out an AWS Lambda function via SNS.

Steps:

Create an SNS Topic using the AWS Console.
  • AWS Console --> SNS --> Create Topic --> Fill in the details and Create Topic.
    • Topic Name : identifier of the topic; could be anything.
    • Display Name : This one is important: when the notification email lands in your inbox, this is the sender name shown for the SNS email, so choose one that hints at the notification content.
Once created we will get a success notification.




Attach email subscribers to the SNS topic and verify the subscription.
  • Under the above created SNS topic, scroll down and click on Create Subscription.
  • Fill in the below required details and create the subscription.
    • Topic ARN --> selected by default.
    • Protocol --> Email.
    • Endpoint --> email address of the recipient; if you want notifications in your own inbox, put your email ID.
  • Verify Subscription --> you will get an "AWS Notification - Subscription Confirmation" email.
  • Click on the link in that email to confirm the subscription.
  • The SNS topic ARN can be used later to send email notifications or fan out Lambda.
Note : Every time you want to send notifications to a new email address, repeat the above process to add a subscriber under the same SNS topic.





Those are the steps to create an SNS topic and add subscribers via the console; below is the AWS CloudFormation script to create an SNS topic and add email subscribers to it.

AWS CloudFormation Script.


AWSTemplateFormatVersion: 2010-09-09
Description: CFN to create an SNS topic and add an email Subscriber to it.
Parameters:

  SNSTopicName:
    Type: String
    Description: Provide SNS Topic Name
  SNSTopicDisplayName:
    Type: String
    Description: Provide SNS Topic's Display Name, will be used as a name for SNS emails.
  SubscriptionEmail:
    Type: String
    Description: Provide DL or email for your SNS Topic.

Resources:
  SNSTopic:
    Type: AWS::SNS::Topic
    Properties:
      TopicName: !Sub ${SNSTopicName}
      DisplayName: !Sub ${SNSTopicDisplayName}
      Subscription:
      - Endpoint: !Sub ${SubscriptionEmail}
        Protocol: email
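
If the topic ARN will be consumed elsewhere, for example to fan out a Lambda function, an Outputs section can expose it. A small sketch; note that for an SNS topic, !Ref returns the topic ARN:

Outputs:
  SNSTopicARN:
    Description: ARN of the created SNS topic.
    Value: !Ref SNSTopic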

If we want to add more email subscribers, we can append additional Endpoint/Protocol pairs to the Subscription list.

Below is a hard-coded resource for an SNS topic with two email subscribers:


Resources:
  SNSTopic:
    Type: AWS::SNS::Topic
    Properties:
      TopicName: mySNSTopic
      DisplayName: SNS Admin
      Subscription:
      - Endpoint: admin@cyberkeeda.com
        Protocol: email
      - Endpoint: blogger@cyberkeeda.com
        Protocol: email
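
Alternatively, a subscriber can be attached to an existing topic without touching the template, for example with the AWS CLI; the topic ARN below is a placeholder:

aws sns subscribe --topic-arn arn:aws:sns:ap-south-1:123456789012:mySNSTopic \
  --protocol email --notification-endpoint blogger@cyberkeeda.com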


    AWS CloudFormation Script to Create Lambda Role with Inline Policy for S3 Operations.



Within this blog we have a requirement to copy data from one bucket to another using a Lambda function; to accomplish the task, Lambda needs an IAM role that allows it to act on other AWS services.

So we will use a CloudFormation script to create the below AWS resources.

• An IAM role for the Lambda service.
• An inline policy attached to the above created role, with the below access:
  • Access to two individual buckets.
  • Access to CloudWatch to perform basic log operations.

In case you are looking to use it, replace the below enlisted values with your own.
• Bucket 1 name : mydemodests1
• Bucket 2 name : mydemodests2
• IAM Role name : LambaRoleforS3operation
• Inline Policy name : LambaRoleforS3operation-InlinePolicy

AWSTemplateFormatVersion: 2010-09-09
Description: Lambda role creation for S3 Operation.

Resources:
  LambdaIAMRole:
    # Lambda IAM Role assumed by the Lambda service
    Type: 'AWS::IAM::Role'
    Properties:
      RoleName: LambaRoleforS3operation
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Sid: AllowLambdaServiceToAssumeRole
            Effect: Allow
            Principal:
              Service:
                - lambda.amazonaws.com
            Action:
              - sts:AssumeRole
      Path: /service-role/
      Policies:
        - PolicyName: "LambaRoleforS3operation-InlinePolicy"
          PolicyDocument: {
            "Version": "2012-10-17",
            "Statement": [
                {
                    "Effect": "Allow",
                    "Action": [
                        "logs:CreateLogGroup",
                        "logs:CreateLogStream",
                        "logs:PutLogEvents"
                    ],
                    "Resource": "arn:aws:logs:*:*:*"
                },
                {
                    "Effect": "Allow",
                    "Action": [
                        "s3:*"
                    ],
                    "Resource": [
                        "arn:aws:s3:::mydemodests1/*"
                    ]
                },
                {
                    "Effect": "Allow",
                    "Action": [
                        "s3:*"
                    ],
                    "Resource": [
                        "arn:aws:s3:::mydemodests2/*"
                    ]
                }
            ]
          }
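
The Lambda function that performs the copy is not part of this template. For illustration only, a hypothetical function wired to the above role could look like the sketch below; it would sit under the same Resources section as LambdaIAMRole, and the function name, runtime, and the event's "key" field are assumptions.

  CopyFunction:
    Type: AWS::Lambda::Function
    Properties:
      FunctionName: s3-copy-between-buckets  # hypothetical name
      Runtime: python3.9
      Handler: index.lambda_handler
      Role: !GetAtt LambdaIAMRole.Arn
      Code:
        ZipFile: |
          import boto3

          def lambda_handler(event, context):
              # Copy a single object between the two buckets from the inline policy
              s3 = boto3.client('s3')
              s3.copy_object(Bucket='mydemodests2',
                             Key=event['key'],
                             CopySource={'Bucket': 'mydemodests1',
                                         'Key': event['key']})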


    AWS S3 - Cross accounts copy data from one bucket to another.

Within this post, we will cover:

• How to allow data copy between AWS cross-account S3 buckets.
• How data from a bucket in one AWS account can be copied to an S3 bucket in another AWS account.

The setup is exactly similar to our last blog post : Link

We have two different buckets and two files under those buckets, within different AWS accounts.
• Bucket 1 name : cyberkeeda-bucket-account-a --> demo-file-A.txt
• Bucket 2 name : cyberkeeda-bucket-account-b --> demo-file-B.txt


We will start by creating a bucket on Account B and modifying a few things to allow our source bucket's account owner access to our destination bucket.

We will assume we already have a bucket on Account B with all public access to the bucket denied, so we need to modify/add the below changes within the destination bucket's Permissions tab.

All modifications below are done on our destination account, Account B.
• Modify Public Access Rights : S3 --> choose your destination bucket --> Permissions tab --> click on Block Public Access --> Edit.
  • Uncheck : Block Public Access
  • Check : Block public access to buckets and objects granted through new access control lists (ACLs)
  • Check : Block public access to buckets and objects granted through any access control lists (ACLs)
  • Check : Block public access to buckets and objects granted through new public bucket or access point policies
  • Uncheck : Block public and cross-account access to buckets and objects through any public bucket or access point policies
• In the above manner we block every public access except AWS cross-account access.
• Add a bucket policy to allow read/write access to Account A:
  • S3 --> choose your destination bucket --> Permissions tab --> click on Bucket Policy --> add the below lines.
  • Replace the AWS account number with your source bucket owner's account number; here the source account is the Account-A number.
  • And the bucket with the destination bucket name; here our destination bucket name is cyberkeeda-bucket-account-b.
  • Update the source account number and destination bucket name variables and save it.
    {
        "Version": "2012-10-17",
        "Id": "Policy1586529665189",
        "Statement": [
            {
                "Sid": "SidtoAllowCrossAccountAccess",
                "Effect": "Allow",
                "Principal": {
                    "AWS": "arn:aws:iam::387789623977:root"
                },
                "Action": "s3:*",
                "Resource": [
                    "arn:aws:s3:::cyberkeeda-bucket-account-b",
                    "arn:aws:s3:::cyberkeeda-bucket-account-b/*"
                ]
            }
        ]
    }

We are done with all required changes on the destination bucket in Account B; now let's move on and do the needful in Account A.

All the below changes are made at Account A ( the source account ).

Link for the CloudFormation script : Link
Use the above CloudFormation script to create the instance-based IAM role, replacing the destination bucket with the bucket name from Account B.

• Stack Name : name of the stack ( could be anything )
• Source Bucket name : name of the bucket we want to copy data from; here the Account A bucket name (cyberkeeda-bucket-account-a)
• Destination Bucket name : name of the bucket we want to copy data to; here the Account B bucket name (cyberkeeda-bucket-account-b)
• Role Name : name of your IAM role ( could be anything )
• Inline Policy : name of your policy, which will allow list, get, and put object permissions on the buckets ( could be anything )
• Once the stack is created, follow the same process to attach the IAM role to the instance; after that we can use AWS CLI commands such as ls, cp, and sync.

Note
1. This is really important to share: whenever we copy any data/object from a source S3 bucket to a destination bucket cross-account, use sync with --acl bucket-owner-full-control.
2. This is mandatory; otherwise you can copy, but the destination bucket owner will be unable to view/download any uploaded file/object from the source account. (An optional bucket-policy guard that enforces this ACL is sketched below.)
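
As an optional guard, and an assumption on our part rather than part of the original setup, the destination account can enforce this ACL by denying any PutObject request that lacks it. A statement like the below could be appended to the destination bucket policy:

{
    "Sid": "DenyPutWithoutOwnerFullControl",
    "Effect": "Deny",
    "Principal": "*",
    "Action": "s3:PutObject",
    "Resource": "arn:aws:s3:::cyberkeeda-bucket-account-b/*",
    "Condition": {
        "StringNotEquals": {
            "s3:x-amz-acl": "bucket-owner-full-control"
        }
    }
}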

Now use the below AWS CLI command to sync all files/content from one bucket to another with the ACL set so the bucket owner has full control.

 aws s3 sync --acl bucket-owner-full-control s3://cyberkeeda-bucket-account-a/  s3://cyberkeeda-bucket-account-b/

You can see a stream of data copying as STDOUT after the command is executed.




    AWS S3 - Copy data from one bucket to another without storing credentials anywhere.


Within this post, we will cover:

• How to automate copying or syncing data/objects from one bucket to another.
• How we can use an EC2 instance to copy data from one bucket to another.
• How we will leverage the power of an AWS IAM role and the AWS S3 CLI to accomplish our requirement.
• An AWS CloudFormation script to create the IAM role and inline policy.


So let's go over our lab setup; you can map your own requirement onto it by replacing the variables.

• We already have an EC2 instance within zone ap-south-1 ( Mumbai ).
• Since S3 is region independent, we will not be highlighting regions here.
• We have two different buckets and two files under those buckets within the same AWS account:
  • Bucket 1 name : cyberkeeda-bucket-a --> demo-file-A.txt
  • Bucket 2 name : cyberkeeda-bucket-b --> demo-file-B.txt
• We will copy data from cyberkeeda-bucket-a to cyberkeeda-bucket-b by running AWS CLI commands from our EC2 instance.
• The above task can be done using AWS CLI commands from any host, but the major difference is that one would need to store credentials there while running the aws configure command.
• We will bypass the aws configure command by assigning an instance-profile IAM role.
• We will create an IAM role with an inline policy.
• We will use a CloudFormation script to create the required role.

A few things we must know about IAM roles before proceeding further:

• IAM Role : an IAM role is a set of permissions created to initiate various AWS service requests; by AWS service request we mean requests to services like S3, EC2, Lambda, and so on.
• IAM roles are not attached to any user or group; they are assumed by other AWS services ( EC2, Lambda ) and applications.
• Policy : a policy can be defined as a set of permissions allowed/denied to a role, user, or group.
• Managed Policy : a policy created with reusability in mind; create it once and map it to multiple users/services/roles.
• Inline Policy : a policy created for a one-to-one mapping between policy and entity.

CloudFormation Script to create IAM Role and Inline Policy.


AWSTemplateFormatVersion: 2010-09-09
Description: >-
  CFN Script to create role and inline policy for ec2 instance.
  Will be used further to transfer data from Source bucket to Destination bucket.
  Author - Jackuna ( https://github.com/Jackuna )

Parameters:
  RoleName:
    Type: String
    Description: Provide Role Name that will be assumed by EC2. [a-z][a-z0-9]*
  InlinePolicyName:
    Type: String
    Description: Provide Inline Policy name, it will be attached to the above created role. [a-z][a-z0-9]*
  SourceBucketName:
    Type: String
    Description: Provide Source Bucket name [a-z][a-z0-9]*
  DestinationBucketName:
    Type: String
    Description: Provide Destination Bucket name [a-z][a-z0-9]*

Resources:
  RootRole:
    Type: 'AWS::IAM::Role'
    Properties:
      RoleName: !Sub "${RoleName}"
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Effect: Allow
            Principal:
              Service: ["ec2.amazonaws.com"]
            Action: ['sts:AssumeRole']
      Policies:
        - PolicyName: !Sub ${InlinePolicyName}
          PolicyDocument:
            Version: '2012-10-17'
            Statement:
              - Effect: Allow
                Action:
                - s3:ListBucket
                - s3:PutObject
                - s3:GetObject
                Resource:
                - !Sub arn:aws:s3:::${SourceBucketName}/*
                - !Sub arn:aws:s3:::${SourceBucketName}
              - Effect: Allow
                Action:
                - s3:ListBucket
                - s3:PutObject
                - s3:GetObject
                Resource:
                - !Sub arn:aws:s3:::${DestinationBucketName}/*
                - !Sub arn:aws:s3:::${DestinationBucketName}

  RootInstanceProfile:
    Type: 'AWS::IAM::InstanceProfile'
    DependsOn:
      - RootRole
    Properties:
      Path: /
      InstanceProfileName: !Sub "${RoleName}"
      Roles:
      - !Ref RoleName

Outputs:
  RoleDetails:
    Description: Role Name
    Value: !Ref RootRole
  PolicyDetails:
    Description: Inline Policy Name
    Value: !Ref InlinePolicyName


Steps to use the above CloudFormation script:
• Copy the above content, save it into a file, and name it iam_policy_role.yaml
• Go to AWS Console --> Services --> CloudFormation --> Create Stack
• Choose the options Template is ready and Upload a template file, upload your saved template iam_policy_role.yaml --> Next

• The next page will ask you for the required parameters as input; we will fill them in as per our lab setup and requirement.
  • Stack Name : name of the stack ( could be anything )
  • Source Bucket name : name of the bucket we want to copy data from.
  • Destination Bucket name : name of the bucket we want to copy data to from our source bucket.
  • Role Name : name of your IAM role ( could be anything )
  • Inline Policy : name of your policy, which will allow list, get, and put object permissions on the buckets ( could be anything )

• Click Next --> again click Next, then tick the check box to agree --> then Create Stack.
• The next screen will show the CloudFormation stack creation window; we can see the progress of our stack creation. Wait and use the refresh button until the stack status says it's completed.

• Once the stack status is completed, click on the Outputs tab and verify the names of your created resources.
• Now head over to the IAM console and search for the above created role.
• Once verified, we can go to our EC2 instance, where we will attach the above created role to give access to the S3 buckets.
• AWS Console → EC2 → search instance → yourInstanceName → right click → Instance Settings → Attach/Replace IAM Role → choose the above created IAM role (s3_copy_data_between_buckets_role) --> Apply. A CLI alternative is sketched below.
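
The same attachment can also be done from the command line; a sketch, where the instance ID is a placeholder:

aws ec2 associate-iam-instance-profile \
  --instance-id i-0123456789abcdef0 \
  --iam-instance-profile Name=s3_copy_data_between_buckets_role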


Now we are ready to test, verify, and later automate it using a cron job (a sample crontab entry appears at the end of this post).
• Log in to your EC2 instance.
• Run the below commands to verify you have proper access to both S3 buckets.
List content within the buckets.

     aws s3 ls s3://cyberkeeda-bucket-a/
    
    aws s3 ls s3://cyberkeeda-bucket-b/
    


You can see that the output of the above commands lists the files in each bucket.

Copy a file/content from one bucket to another.

• Now we will try to copy the file named demo-file-A.txt from bucket cyberkeeda-bucket-a to cyberkeeda-bucket-b.


     aws s3 cp s3://SOURCE-BUCKET-NAME/FILE-NAME s3://DESTINATION-BUCKET-NAME/FILE-NAME
    
    aws s3 cp s3://cyberkeeda-bucket-a/demo-file-A.txt  s3://cyberkeeda-bucket-b/demo-file-A.txt
Sync all files/content from one bucket to another.

     aws s3 sync s3://SOURCE-BUCKET-NAME/ s3://DESTINATION-BUCKET-NAME/
    
    aws s3 sync s3://cyberkeeda-bucket-a/  s3://cyberkeeda-bucket-b/
Sync all files/content from one bucket to another with the ACL granting the bucket owner full control.

     aws s3 sync --acl bucket-owner-full-control s3://cyberkeeda-bucket-a/  s3://cyberkeeda-bucket-b/
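
Finally, to automate the sync mentioned earlier, a sample crontab entry on the EC2 instance could look like the below; the schedule and log path are assumptions to adjust to your needs.

# Run the sync daily at 01:00
0 1 * * * aws s3 sync s3://cyberkeeda-bucket-a/ s3://cyberkeeda-bucket-b/ >> /var/log/s3-bucket-sync.log 2>&1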

That's it for this post; we will cover how to do the same cross-account in the next post.
Feel free to comment if you face any issue implementing it.


