
AWS Lambda Function to check existence of file under S3 bucket and Notify via Email



File Check Automation on AWS using Lambda, CloudWatch, and SNS.


Within this post, we will cover.

  • How we can check the existence of a file under an AWS S3 bucket using Python in an AWS Lambda function
  • How to use AWS Simple Notification Service (SNS) to notify about the file existence status from within Lambda
  • How we can automate the Lambda function to check file existence using a CloudWatch rule and a custom crontab
  • How we can implement the entire file check monitoring solution using an AWS CloudFormation template.

We will start with the use case:

  • You have a scheduled event that drops a specific file daily/hourly into an S3 bucket and you want to check its existence.
  • You have multiple file checks daily; with only one Lambda function, by leveraging the power of CloudWatch rule constant keys and a custom cron, we will accomplish all of them.
  • You want multiple file checks for different files within different buckets.
  • You want a success or failure notification for file existence.

We will use Python as the language within the Lambda function to accomplish the above requirements, and here is the process we will follow sequentially.
  1. Create an SNS topic and add subscribers to it (a minimal boto3 sketch follows this list).
  2. Create the Lambda function.
  3. Configure test events within the AWS Lambda function.
  4. Verify the working of the Lambda function by modifying the test event values.
  5. Create a CloudWatch rule to automate the file check Lambda function.
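For step 1, the SNS topic and an email subscriber can also be created with a few lines of boto3 instead of the console; a minimal sketch, where the topic name, region and email address are placeholders:

import boto3

sns = boto3.client('sns', region_name='ap-south-1')

# Create (or fetch, if it already exists) the topic and note its ARN.
topic = sns.create_topic(Name='mySNSTopic')

# Add an email subscriber; the recipient must confirm via the mail AWS sends.
sns.subscribe(
    TopicArn=topic['TopicArn'],
    Protocol='email',
    Endpoint='you@example.com'
)

print(topic['TopicArn'])   # use this value as SNS_TOPIC_ARN in the Lambda code below
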
Lab Setup details :
  • S3 Bucket name : cyberkeeda-bucket-a
  • S3 Bucket name with directory name : cyberkeeda-bucket-a/reports/
  • File Name and Format 
    • File Type 1 : Static file : demo-file-A.txt
    • File Type 2 : Dynamic file : YYYYMMDDdemo-file-A.txt (20200530demo-file-A.txt)
Steps:
  • Create Lambda Function.
    • Follow the standard procedure to create an AWS Lambda function : How to create Lambda Function from AWS Console with example.
    • Add the below Python code. --> Github Link
    • In case we don't want email notifications for SUCCESS/INFO conditions, comment out the trigger_email() call in the success path.
    • Replace the below variables with your own.
      • SNS_TOPIC_ARN = 'arn:aws:sns:ap-south-1:387684573977:mySNSTopic'

from boto3 import resource, client
import botocore
from datetime import datetime

SNS_TOPIC_ARN = 'arn:aws:sns:ap-south-1:387650023977:mySNSTopic'

def lambda_handler(event, context):

    def trigger_email(email_subject, email_message):
        # Publish the notification to the SNS topic; subscribers receive it as email.
        sns = client('sns')
        sns.publish(
            TopicArn=SNS_TOPIC_ARN,
            Subject=email_subject,
            Message=email_message
        )

    def initialize_objects_and_variables():
        global SOURCE_BUCKET_NAME
        global FILE_NAME
        global FILE_NAME_WITH_DIRECTORY
        global dt

        dt = datetime.now()
        FILE_PREFIX_DATE = dt.strftime('%Y%m%d')
        FILE_PREFIX_DIRECTORY = event["bucket_sub_directory"]
        FILE_SUFFIX = event["file_suffix"]
        SOURCE_BUCKET_NAME = event["bucket_name"]
        FILE_TYPE = event['fileType']

        if FILE_PREFIX_DIRECTORY == 'False':
            # File sits directly under the bucket root.
            if FILE_TYPE == 'daily':
                FILE_NAME = FILE_PREFIX_DATE + FILE_SUFFIX
                FILE_NAME_WITH_DIRECTORY = FILE_NAME
            else:
                FILE_NAME = FILE_SUFFIX
                FILE_NAME_WITH_DIRECTORY = FILE_NAME
        else:
            # File sits under a sub directory (prefix) of the bucket.
            if FILE_TYPE == 'daily':
                FILE_NAME = FILE_PREFIX_DATE + FILE_SUFFIX
                FILE_NAME_WITH_DIRECTORY = FILE_PREFIX_DIRECTORY + FILE_NAME
            else:
                FILE_NAME = FILE_SUFFIX
                FILE_NAME_WITH_DIRECTORY = FILE_PREFIX_DIRECTORY + FILE_NAME

    def check_file_existence():

        s3 = resource('s3')
        try:
            # load() issues a HEAD request and raises ClientError if the object is missing.
            s3.Object(SOURCE_BUCKET_NAME, FILE_NAME_WITH_DIRECTORY).load()
            print("[SUCCESS]", dt, "File Exists with name as", FILE_NAME)
            email_subject = "[INFO] Daily Report File found in report Folder"
            email_message = "Today's file name : {} \n Bucket Name : {} \n Lambda Function Name : {}".format(FILE_NAME, SOURCE_BUCKET_NAME, context.function_name)
            trigger_email(email_subject, email_message)

        except botocore.exceptions.ClientError as errorStdOut:
            # Error codes are strings (e.g. '404'); anything from '401' upwards is treated as "object not found".
            if errorStdOut.response['Error']['Code'] >= "401":
                print("[ERROR]", dt, "File does not exist. :", FILE_NAME)
                email_subject = "[ERROR] Daily Report File not found in report Folder"
                email_message = "Expected file name : {} \n Bucket Name : {} \n Lambda Function Name : {}".format(FILE_NAME, SOURCE_BUCKET_NAME, context.function_name)
                trigger_email(email_subject, email_message)

            else:
                print("[ERROR]", dt, "Something went wrong")
                email_subject = "[ERROR] Lambda Error"
                email_message = "Something went wrong, please check lambda logs.\n Expected file name : {} \n Bucket Name : {}\n Lambda Function Name : {}".format(FILE_NAME, SOURCE_BUCKET_NAME, context.function_name)
                trigger_email(email_subject, email_message)

    initialize_objects_and_variables()
    check_file_existence()
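
The handler can also be exercised outside Lambda for a quick sanity check; a small sketch, assuming the code above is saved as lambda_function.py (the module name is an assumption) and AWS credentials plus the SNS topic are available locally:

from types import SimpleNamespace

from lambda_function import lambda_handler   # module name is an assumption

test_event = {
    "bucket_sub_directory": "reports/",
    "file_suffix": "demo-file-A.txt",
    "bucket_name": "cyberkeeda-bucket-a",
    "fileType": "daily"
}

# Only context.function_name is read by the handler, so a simple stand-in is enough.
lambda_handler(test_event, SimpleNamespace(function_name="local-test"))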

    • Configure Lambda function test events.
    The above Lambda function can be used for the following use cases :
    • It can check the existence of a file directly under an S3 bucket, and also of a file located under any sub directory of the bucket.
    Note : replace bucket_name and file_suffix as per your setup and verify its working status.
      • To check the existence of a file directly under a bucket, manually use the below JSON under configure test events.
        • We have file demo-file-A.txt located at cyberkeeda-bucket-a/
        • {
            "bucket_sub_directory": "False",
            "file_suffix": "demo-file-A.txt",
            "bucket_name": "cyberkeeda-bucket-a",
            "fileType": "random"
          }
      • To check the existence of a file under a sub directory located within a bucket, manually use the below JSON under configure test events.
        • We have file demo-file-A.txt located at cyberkeeda-bucket-a/reports/
        • {
            "bucket_sub_directory": "reports/",
            "file_suffix": "demo-file-A.txt",
            "bucket_name": "cyberkeeda-bucket-a",
            "fileType": "random"
          }
    • It can also be used to check the existence of a dynamic (date-prefixed) file under an S3 bucket, including files located under sub directories.
    • Create Cloud Watch rule to automate the file check Lambda.
    We have our dynamic file with format YYYYMMDDdemo-file-A.txt, where the file prefix is today's date, so for today the name of the file will be 20200530demo-file-A.txt.

    Our Lambda function Python script is written in a way to validate such files.
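
    The expected name is simply today's date (YYYYMMDD) prepended to the file suffix, for example:

    from datetime import datetime

    file_suffix = "demo-file-A.txt"
    expected_file = datetime.now().strftime('%Y%m%d') + file_suffix
    print(expected_file)   # e.g. 20200530demo-file-A.txt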

    Please comment in case you face any issue or need support using the above scripts.


    AWS Cloudwatch : How to create CloudWatch Rule and Schedule Lambda Function Using it.



    AWS CloudWatch


    Amazon CloudWatch is a monitoring and management service that provides data and actionable insights for AWS, hybrid, and on-premises applications and infrastructure resources. With CloudWatch, you can collect and access all your performance and operational data in the form of logs and metrics from a single platform.


    CloudWatch Rule : Create rules to invoke Targets based on Events happening in your AWS environment.

    So this tutorial is just a walk-through of:
    • How to create a CloudWatch rule using the AWS Console
    • How to create a CloudWatch rule and invoke a Lambda function using the rule scheduler and a cron expression.
    • What constant keys are and what their use is when we use a Lambda function as the target.

    What we are trying to achieve here : 
    • We have a Lambda function named "Check_File" and its job is to check the existence of a file.
    • The file is dynamic : Format --> YYYYMMDDdemo-file-A.txt
    • Thus this part of the file name, demo-file-A.txt, is treated as the file_suffix.
    • So we will be using constant keys to send the required values, which will be used as the EVENT within our Lambda function.
    • Our dynamic file gets uploaded every day between 11:30 and 12:00, thus we will trigger the Lambda function via a CloudWatch rule with a custom cron to check the file's existence.
    Lab Setup details

    • AWS Lambda Function name : Check_File
    • AWS Lambda Function Task : Checks the existence of a file uploaded daily between 11:30 and 12:00
    • Cron Timing : Every five minutes between 11:30 and 12:00 GMT
    • Constant Keys : { "bucket_sub_directory": "reports/", "file_suffix": "demo-file-A.txt", "bucket_name": "cyberkeeda-bucket-a", "fileType": "daily"}

    Steps to create Cloudwatch Rule via AWS Console.

    • AWS Console --> Cloudwatch --> Rules --> Create Rule
    • Enter the required details.
      • Event Source : 
        • Schedule --> Custom Cron --> 30,35,40,45,50,55,59 11 ? * * *
        •  Target --> Lambda Function --> Check_File
        • Constant Keys --> { "bucket_sub_directory": "reports/", "file_suffix": "demo-file-A.txt", "bucket_name": "cyberkeeda-bucket-a", "fileType": "daily"}
    • Configure Rule details :
      • Name : CloudWatch rule name
      • Description : Describe the rule, then click on Create Rule.
    Note : Replace the above Inputs with your own.


        We are done with the creation of the CloudWatch rule via the AWS Console.

    Verify the CloudWatch logs to confirm that the scheduled Lambda executed.
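
    Besides the console (and the CloudFormation template below), the same rule can be wired up with boto3; a minimal sketch, where the rule name, account number and Lambda ARN are placeholders:

    import json
    import boto3

    events = boto3.client('events', region_name='ap-south-1')
    lambda_client = boto3.client('lambda', region_name='ap-south-1')

    LAMBDA_ARN = 'arn:aws:lambda:ap-south-1:123456789012:function:Check_File'   # placeholder ARN

    # Schedule: every five minutes between 11:30 and 11:59 GMT (same cron as above).
    rule = events.put_rule(
        Name='Check_File_Rule',
        ScheduleExpression='cron(30,35,40,45,50,55,59 11 ? * * *)',
        State='ENABLED'
    )

    # Constant keys: the JSON input handed to the Lambda target as its event.
    events.put_targets(
        Rule='Check_File_Rule',
        Targets=[{
            'Id': 'Lambda1',
            'Arn': LAMBDA_ARN,
            'Input': json.dumps({
                "bucket_sub_directory": "reports/",
                "file_suffix": "demo-file-A.txt",
                "bucket_name": "cyberkeeda-bucket-a",
                "fileType": "daily"
            })
        }]
    )

    # Allow CloudWatch Events to invoke the function.
    lambda_client.add_permission(
        FunctionName='Check_File',
        StatementId='AllowCloudWatchRuleInvoke',
        Action='lambda:InvokeFunction',
        Principal='events.amazonaws.com',
        SourceArn=rule['RuleArn']
    )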


    Cloudwatch Rule via CloudFormation Script.

    Use the below AWS CFN template to create the above CloudWatch rule via AWS CloudFormation.


    AWSTemplateFormatVersion: 2010-09-09
    Description: CFN Script to create a CloudWatch Rule with cron to trigger a Lambda Function.
    Parameters:
      LambdaFunctionNameARNParam:
        Type: String
        Description: Provide Lambda Function ARN Parameter. [a-z][a-z0-9]*
      CloudwatchRuleNameParm:
        Type: String
        Description: Provide Cloudwatch Rule name. [a-z][a-z0-9]*

    Resources:
      ScheduledRule:
        Type: AWS::Events::Rule
        Properties:
          Description: !Sub "Scheduled Rule created to invoke Lambda Function: ${LambdaFunctionNameARNParam}"
          Name: !Sub ${CloudwatchRuleNameParm}
          ScheduleExpression: cron(0/5 11-12 ? * * *)
          State: "DISABLED"
          Targets:
          - Id: Lambda1
            Arn: !Sub ${LambdaFunctionNameARNParam}
            Input: '{ "bucket_sub_directory": "reports/",
                      "file_suffix": "demo-file-A.txt",
                      "bucket_name": "cyberkeeda-bucket-a",
                      "fileType": "random"
                    }'

      PermissionForEventsToInvokeLambda:
        Type: AWS::Lambda::Permission
        Properties:
          FunctionName: !Sub ${LambdaFunctionNameARNParam}
          Action: "lambda:InvokeFunction"
          Principal: "events.amazonaws.com"
          SourceArn: !GetAtt ScheduledRule.Arn




    AWS CloudFormation Script to Create Lambda Role with Inline Policy for S3 Operations.



    Within this blog we have a requirement to copy data from one bucket to another bucket using a Lambda function; in order to accomplish the task, Lambda needs an additional role to perform actions on other AWS services.

    So we will use a CloudFormation script to create the below AWS resources.

    • An IAM role for the Lambda service.
    • The above created role has an inline policy attached with the below access:
      • Access to two individual buckets.
      • Access to CloudWatch to perform basic log operations.

    In case you are looking to use it, replace the below enlisted values with your own.
    • Bucket 1 name : mydemodests1
    • Bucket 2 name : mydemodests2
    • IAM Role name : LambaRoleforS3operation
    • Inline Policy name : LambaRoleforS3operation-InlinePolicy

    AWSTemplateFormatVersion: 2010-09-09
    Description: Lambda role creation for S3 Operation.

    Resources:
      LambdaIAMRole:
        Type: 'AWS::IAM::Role'
        Properties:
          Description: "Lambda IAM Role"
          RoleName: LambaRoleforS3operation
          AssumeRolePolicyDocument:
            Version: '2012-10-17'
            Statement:
              - Sid: AllowLambdaServiceToAssumeRole
                Effect: Allow
                Principal:
                  Service:
                    - lambda.amazonaws.com
                Action:
                  - sts:AssumeRole
          Path: /service-role/
          Policies:
            - PolicyName: "LambaRoleforS3operation-InlinePolicy"
              PolicyDocument: {
                  "Version": "2012-10-17",
                  "Statement": [
                      {
                          "Effect": "Allow",
                          "Action": [
                              "logs:CreateLogGroup",
                              "logs:CreateLogStream",
                              "logs:PutLogEvents"
                          ],
                          "Resource": "arn:aws:logs:*:*:*"
                      },
                      {
                          "Effect": "Allow",
                          "Action": [
                              "s3:*"
                          ],
                          "Resource": [
                              "arn:aws:s3:::mydemodests1/*"
                          ]
                      },
                      {
                          "Effect": "Allow",
                          "Action": [
                              "s3:*"
                          ],
                          "Resource": [
                              "arn:aws:s3:::mydemodests2/*"
                          ]
                      }
                  ]
                }
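
    To deploy this template without the console, the CloudFormation API can be used; a sketch, assuming the YAML above is saved as lambda-s3-role.yaml (the file and stack names are assumptions; CAPABILITY_NAMED_IAM is required because the stack creates a named role):

    import boto3

    cfn = boto3.client('cloudformation', region_name='ap-south-1')

    with open('lambda-s3-role.yaml') as template_file:   # file name is an assumption
        template_body = template_file.read()

    cfn.create_stack(
        StackName='lambda-s3-role-stack',                # placeholder stack name
        TemplateBody=template_body,
        Capabilities=['CAPABILITY_NAMED_IAM']            # needed for the named IAM role
    )

    # Block until creation finishes (or fails).
    cfn.get_waiter('stack_create_complete').wait(StackName='lambda-s3-role-stack')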


    AWS S3 - Cross accounts copy data from one bucket to another.

    Within this post, we will cover.

    • How to allow data copy between AWS cross-account S3 buckets.
    • Data from a bucket existing in one account can be copied to an S3 bucket lying in another AWS account.

    Setup is exactly similar to our last blog post : Link 

    We have two different buckets and two files under those buckets within different AWS accounts.
    • Bucket 1 name : cyberkeeda-bucket-account-a --> demo-file-A.txt
    • Bucket 2 name : cyberkeeda-bucket-account-b --> demo-file-B.txt


    We will start by creating a bucket on Account B and modifying a few things to allow our source bucket account owner to access our destination bucket.

    We will assume we already have a bucket on Account B with all public access to the bucket denied, so we need to modify/add the below changes within the destination bucket's Permissions tab.

    All the modifications below are done at our destination account - B.
    • Modify Public Access Rights : S3 --> choose your destination bucket --> Permissions tab --> Click on Block Public Access --> Edit.
      • Uncheck : Block Public Access
      • Check : Block public access to buckets and objects granted through new access control lists (ACLs)
      • Check : Block public access to buckets and objects granted through any access control lists (ACLs)
      • Check : Block public access to buckets and objects granted through new public bucket or access point policies
      • Uncheck : Block public and cross-account access to buckets and objects through any public bucket or access point policies
    • In the above manner we block every public access except AWS cross-account access.
    • Add a bucket policy to allow read and write access to Account A:
      • S3 --> choose your destination bucket --> Permissions tab --> Click on Bucket Policy --> Add the below lines.
      • Replace the AWS account number with your source bucket owner's account number; here our source account is the Account A number.
      • And the bucket with the destination bucket name; here our destination bucket name is cyberkeeda-bucket-account-b.
      • Update the variables Source Account number and Destination bucket name and save it.
    {
        "Version": "2012-10-17",
        "Id": "Policy1586529665189",
        "Statement": [
            {
                "Sid": "SidtoAllowCrossAccountAccess",
                "Effect": "Allow",
                "Principal": {
                    "AWS": "arn:aws:iam::387789623977:root"
                },
                "Action": "s3:*",
                "Resource": [
                    "arn:aws:s3:::cyberkeeda-bucket-account-b",
                    "arn:aws:s3:::cyberkeeda-bucket-account-b/*"
                ]
            }
        ]
    }
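
    The same policy can also be applied from code (run with credentials for Account B); a minimal boto3 sketch:

    import json
    import boto3

    bucket_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "SidtoAllowCrossAccountAccess",
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::387789623977:root"},   # source Account A number
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::cyberkeeda-bucket-account-b",
                "arn:aws:s3:::cyberkeeda-bucket-account-b/*"
            ]
        }]
    }

    s3 = boto3.client('s3')
    s3.put_bucket_policy(
        Bucket='cyberkeeda-bucket-account-b',
        Policy=json.dumps(bucket_policy)
    )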

    We are done with all required changes on the destination bucket in Account B; now let's move on and do the needful at Account A.

    All below changes are made at Account -A ( Source Account )

    Link for Cloudformation script : Link
    Use the above CloudFormation script to create an instance-based IAM role, and replace the destination bucket with the bucket name from Account B.

    • Stack Name : Name of the stack ( Could be anything )
    • Source Bucket name : Name of the bucket we want to copy data from, the Account A bucket name (cyberkeeda-bucket-account-a)
    • Destination Bucket name : Name of the bucket we want to copy data to, the Account B bucket name (cyberkeeda-bucket-account-b)
    • Role Name : Name of your IAM role ( Could be anything )
    • Inline Policy : Name of your policy, which will allow the list, get, and put object permissions to the buckets ( Could be anything )
    • Once the stack is created, follow the same process to attach the IAM role to the instance; after that we can use AWS CLI commands such as ls, cp, and sync.

    Note
    1. This is really important to share: whenever we copy any data/object from a source S3 bucket to a destination bucket across accounts, use sync with --acl bucket-owner-full-control.
    2. This is mandatory, else you can copy, but the destination bucket owner will be unable to view/download any uploaded file/object from the source account.

    Now use the below AWS CLI command to sync all files/content from one bucket to another with the ACL set to bucket-owner-full-control.

     aws s3 sync --acl bucket-owner-full-control s3://cyberkeeda-bucket-account-a/  s3://cyberkeeda-bucket-account-b/

    You can see a stream of data being copied as STDOUT after the command is executed.
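
    For a single object, the equivalent copy from boto3 looks like the sketch below; the ACL again makes the destination account owner of the copied object:

    import boto3

    s3 = boto3.client('s3')

    s3.copy_object(
        Bucket='cyberkeeda-bucket-account-b',
        Key='demo-file-A.txt',
        CopySource={'Bucket': 'cyberkeeda-bucket-account-a', 'Key': 'demo-file-A.txt'},
        ACL='bucket-owner-full-control'   # destination account gets full control
    )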




    AWS S3 - Copy data from one bucket to another without storing credentials anywhere.


    Within this post, we will cover.

    • How to automate copying or syncing data/objects from one bucket to another.
    • How we can use an EC2 instance to copy data from one bucket to another.
    • We will leverage the power of AWS IAM Role and AWS S3 CLI to accomplish our requirement.
    • AWS CloudFormation script to create IAM role and Inline Policy.


    So let's go over our lab setup; similarly, you can map your own requirement by replacing the variables.

    • We already have an EC2 instance within the ap-south-1 ( Mumbai ) region.
    • Since S3 is region independent, we will not be highlighting it here.
    • We have two different buckets and two files under those buckets within the same AWS account:
      • Bucket 1 name : cyberkeeda-bucket-a --> demo-file-A.txt
      • Bucket 2 name : cyberkeeda-bucket-b --> demo-file-B.txt
    • We will copy data from cyberkeeda-bucket-a to cyberkeeda-bucket-b by running AWS CLI commands from our EC2 instance.
    • The above task can be done using AWS CLI commands from any host, but the major difference is that one needs to store credentials while running the aws configure command.
    • We will bypass the aws configure command by assigning an instance profile IAM role.
    • We will create an IAM Role with Inline policy.
    • We will use Cloudformation Script to create the required role.

    A few things we must know about IAM roles before proceeding further:

    • IAM Role : An IAM role is a set of permissions created to make various AWS service requests; when we say AWS service request, that means requests made to services like S3, EC2, Lambda, etc.
    • IAM roles are not attached to any user or group; they are assumed by other AWS services ( EC2, Lambda ) and applications.
    • Policy : A policy can be defined as a set of permissions allowed/denied to a role, user, or group.
    • Managed Policy : A policy created with reusability in mind; create it once and map it to multiple users/services/roles.
    • Inline Policy : A policy created for a one-to-one mapping between policy and entity.

    CloudFormation Script to create IAM Role and Inline Policy.


    AWSTemplateFormatVersion: 2010-09-09
    Description: >-
      CFN Script to create role and inline policy for ec2 instance.
      Will be used further to transfer data from Source bucket to Destination bucket.
      Author - Jackuna ( https://github.com/Jackuna )

    Parameters:
      RoleName:
        Type: String
        Description: Provide Role Name that will be assumed by EC2. [a-z][a-z0-9]*
      InlinePolicyName:
        Type: String
        Description: Provide Inline Policy name, it will be attached with the above created role. [a-z][a-z0-9]*
      SourceBucketName:
        Type: String
        Description: Provide Source Bucket name [a-z][a-z0-9]*
      DestinationBucketName:
        Type: String
        Description: Provide Destination Bucket name [a-z][a-z0-9]*

    Resources:
      RootRole:
        Type: 'AWS::IAM::Role'
        Properties:
          RoleName: !Sub "${RoleName}"
          AssumeRolePolicyDocument:
            Version: 2012-10-17
            Statement:
              - Effect: Allow
                Principal:
                  Service: ["ec2.amazonaws.com"]
                Action: ['sts:AssumeRole']
          Policies:
            - PolicyName: !Sub ${InlinePolicyName}
              PolicyDocument:
                Version: '2012-10-17'
                Statement:
                  - Effect: Allow
                    Action:
                    - s3:ListBucket
                    - s3:PutObject
                    - s3:GetObject
                    Resource:
                    - !Sub arn:aws:s3:::${SourceBucketName}/*
                    - !Sub arn:aws:s3:::${SourceBucketName}
                  - Effect: Allow
                    Action:
                    - s3:ListBucket
                    - s3:PutObject
                    - s3:GetObject
                    Resource:
                    - !Sub arn:aws:s3:::${DestinationBucketName}/*
                    - !Sub arn:aws:s3:::${DestinationBucketName}
      RootInstanceProfile:
        Type: 'AWS::IAM::InstanceProfile'
        DependsOn:
          - RootRole
        Properties:
          Path: /
          InstanceProfileName: !Sub "${RoleName}"
          Roles:
          - !Ref RoleName

    Outputs:
      RoleDetails:
        Description: Role Name
        Value: !Ref RootRole
      PolicyDetails:
        Description: Inline Policy Name
        Value: !Ref InlinePolicyName


    Steps to use the above CloudFormation script:
    • Copy the above content and save it into a file and name it as iam_policy_role.yaml
    • Go to AWS Console --> Services --> Cloudformation --> Create Stack
    • Choose the options : Template is ready and Upload a template file, then upload your saved template iam_policy_role.yaml --> Next

    • The next page will ask you for the required parameters as input; we will fill it in as per our lab setup and requirement.
      • Stack Name : Name of the stack ( Could be anything )
      • Source Bucket name : Name of the bucket we want to copy data from, our source bucket.
      • Destination Bucket name : Name of the bucket we want to copy data to from our source bucket.
      • Role Name : Name of your IAM role ( Could be anything )
      • Inline Policy : Name of your policy, which will allow the list, get, and put object permissions to the buckets ( Could be anything )

    • Click Next --> Again click Next, then click the check box to agree --> Then create the stack.
    • The next screen will show the CloudFormation stack creation progress; wait and use the refresh button till the stack status says it's completed.

    • Once the stack status shows CREATE_COMPLETE, click on the Outputs tab and verify the names of your created resources.
    • Now move over to the IAM console and search for the above created role.
    • Once verified, we can go to our EC2 instance, where we will attach the above created role to give it access to the S3 buckets.
    • AWS Console → EC2 → Search instance → yourInstanceName → Right Click → Instance Settings → Attach/Replace IAM Role → Choose the above created IAM role (s3_copy_data_between_buckets_role) --> Apply
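
    The same attachment can be done with boto3 instead of the console; a sketch, where the instance ID is a placeholder and the profile name matches the RoleName parameter used in the stack (the template creates the instance profile with that name):

    import boto3

    ec2 = boto3.client('ec2', region_name='ap-south-1')

    ec2.associate_iam_instance_profile(
        IamInstanceProfile={'Name': 's3_copy_data_between_buckets_role'},   # = RoleName parameter
        InstanceId='i-0123456789abcdef0'                                    # placeholder instance ID
    )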


    Now we are ready to test and verify, and to automate it further using a cron job.
    • Log in to your EC2 instance.
    • Run the below commands to verify you have proper access to both S3 buckets.
    List content within bucket.

     aws s3 ls s3://cyberkeeda-bucket-a/
    
    aws s3 ls s3://cyberkeeda-bucket-b/
    


    You can see that the output of the above commands lists the files from the two different buckets.

    Copy file/content from one bucket to another.

    • Now we will try to copy the file named demo-file-A.txt from bucket cyberkeeda-bucket-a to cyberkeeda-bucket-b


     aws s3 cp s3://SOURCE-BUCKET-NAME/FILE-NAME s3://DESTINATION-BUCKET-NAME/FILE-NAME
    
    aws s3 cp s3://cyberkeeda-bucket-a/demo-file-A.txt  s3://cyberkeeda-bucket-b/demo-file-A.txt
    Sync all file/content from one bucket to another.

     aws s3 sync s3://SOURCE-BUCKET-NAME/ s3://DESTINATION-BUCKET-NAME/
    
    aws s3 sync s3://cyberkeeda-bucket-a/  s3://cyberkeeda-bucket-b/
    Sync all file/content from one bucket to another with ACL as bucket owner.

     aws s3 sync --acl bucket-owner-full-control s3://cyberkeeda-bucket-a/  s3://cyberkeeda-bucket-b/
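
    If you prefer Python over the CLI on the instance (the instance profile supplies the credentials, so nothing is stored), a rough sketch of the same copy loop:

    import boto3

    s3 = boto3.client('s3')
    source_bucket = 'cyberkeeda-bucket-a'
    destination_bucket = 'cyberkeeda-bucket-b'

    # Walk every object in the source bucket and copy it across.
    paginator = s3.get_paginator('list_objects_v2')
    for page in paginator.paginate(Bucket=source_bucket):
        for obj in page.get('Contents', []):
            s3.copy_object(
                Bucket=destination_bucket,
                Key=obj['Key'],
                CopySource={'Bucket': source_bucket, 'Key': obj['Key']}
            )
            print('Copied', obj['Key'])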

    That's it for this post; we will cover how to do the same across accounts in the next post.
    Feel free to comment, if you face any issue implementing it.


