
AWS CloudFormation template to create an ECS Task definition.

 



CloudFormation template that will create the below resources:

  • IAM role for ECS Task execution
  • ECS Task definition


Template

AWSTemplateFormatVersion: 2010-09-09
Description: |
              ECS Task is responsible for fetching files from an SFTP location.
              1. IAM Role to be used by the ECS task and the CloudWatch Event rule.
              2. ECS Task definition with container environment variables; please note the credential needs to be created first within Parameter Store.

Parameters:
  ProductName:
    Description: Parent Product name.
    Type: String
    Default: cyberkeeda
  ProjectName:
    Description: Project Name
    Type: String
    Default: cyberkeeda-report
  Environment:
    Description: The equivalent CN name of the environment being worked on
    Type: String
    AllowedValues:
      - dev
      - uat
      - qa
  Region:
    Description: Ck Region specific parameter
    Type: String
    AllowedValues:
      - mum
      - hyd
  ECSTaskDefARN:
    Description: ARN for the ECS Task definition
    Type: String
  SFTPHostFQDN:
    Description: Remote SFTP Host FQDN.
    Type: String
    Default: 123.111.11.1
  SFTPHostPort:
    Description: Remote SFTP Host Port.
    Type: String
    Default: 22
  SFTPUserName:
    Description: Remote SFTP Host username.
    Type: String
    Default: sftpadmin
  SFTPPasswordParameterStoreName:
    Description: Remote SFTP Host Parameter store name.
    Type: String
    Default: sftppass
  ContainerImageUrlwithTag:
    Description: Container Image URL with tag.
    Type: String
    Default: docker.io/jackuna/sftpnew
  ECSClusterARN:
    Description: ECS Cluster ARN to schedule Task 
    Type: String
    Default: arn:aws:ecs:ap-south-1:895678824142:cluster/sftp

Metadata:
  AWS::CloudFormation::Interface:
    ParameterGroups:
      - 
        Label:
          default: CK Project Details
        Parameters:
          - ProductName
          - ProjectName
          - Environment
          - Region
      - 
        Label:
          default: Remote SFTP Server details used as Container Environment Variables.
        Parameters:
          - SFTPHostFQDN
          - SFTPHostPort
          - SFTPUserName
          - SFTPPasswordParameterStoreName
      
Resources:
  ExecutionRole:
    Type: AWS::IAM::Role
    Properties:
      RoleName: !Sub "${ProductName}-${Region}-${Environment}-${ProjectName}-role"
      AssumeRolePolicyDocument:
        Statement:
          - Effect: Allow
            Principal:
              Service: [ 'ecs-tasks.amazonaws.com', 'events.amazonaws.com' ]
            Action: sts:AssumeRole
      ManagedPolicyArns:
        - arn:aws:iam::aws:policy/service-role/AmazonECSTaskExecutionRolePolicy
      Policies:
      - PolicyName: !Sub "${ProductName}-${Region}-${Environment}-${ProjectName}-role-inlinePolicy"
        PolicyDocument: 
            Version: 2012-10-17
            Statement:
              - Effect: Allow
                Action:
                - ssm:GetParameters
                Resource:
                - !Sub "arn:aws:ssm:${AWS::Region}:${AWS::AccountId}:parameter/${Environment}.sftp-password" 
              - Effect: Allow
                Action:
                - ecs:RunTask
                Resource:
                - !Sub "${ECSTaskDefARN}:*"
              - Effect: Allow
                Action: iam:PassRole
                Resource:
                - "*"
                Condition:
                  StringLike:
                    iam:PassedToService: ecs-tasks.amazonaws.com
  TaskDefinition:
    Type: AWS::ECS::TaskDefinition
    Properties:
      Family: !Sub "${ProductName}-${Region}-${Environment}-${ProjectName}-ecs-task"
      Memory: 128
      NetworkMode: bridge 
      ExecutionRoleArn: !GetAtt ExecutionRole.Arn
      TaskRoleArn: !GetAtt ExecutionRole.Arn
      ContainerDefinitions:
        - Name: !Sub "${ProductName}-${Region}-${Environment}-${ProjectName}-container"
          Image: !Ref ContainerImageUrlwithTag
          Memory: 128
          Cpu: 0
          MountPoints: 
            - 
              SourceVolume: "ecs-logs"
              ContainerPath: "/var/log/ecs"
          Command: 
            - python
            - sftp_python.py
          WorkingDirectory: "/usr/local/aws-swa"
          Secrets:
            - 
              Name: SFTP_PASSWORD
              ValueFrom: !Sub "${Environment}.sftp-password"
          Environment: 
            - 
              Name: APPLICATION_LOGS
              Value: !Sub  "/var/log/ecs/${ProductName}-${Region}-${Environment}-${ProjectName}-ecs-task.logs"
            - 
              Name: SFTP_HOST
              Value: !Ref SFTPHostFQDN
            - 
              Name: SFTP_PORT
              Value: !Ref SFTPHostPort
            - 
              Name: SFTP_USERNAME
              Value: !Ref SFTPUserName

      RequiresCompatibilities:
        - EC2
      Volumes: 
        - 
          Host: 
            SourcePath: "/var/log/ecs"
          Name: "ecs-logs"

Let me know if you have any questions in the comment box.


AWS Lambda Function to check the existence of a file under an S3 Bucket and notify via Email



File Check Automation on AWS using Lambda, CloudWatch, and SNS.


Within this post, we will cover:

  • How we can check the existence of a file under an AWS S3 bucket using Python as an AWS Lambda function
  • How to use AWS Simple Notification Service (SNS) to notify file existence status from within Lambda
  • How we can automate the Lambda function to check file existence using a CloudWatch rule and a custom crontab
  • How we can implement the entire file check monitoring solution using an AWS CloudFormation template.

We will start with the use cases:

  • You have a scheduled event that drops a specific file daily/hourly to an S3 bucket and want to check its existence.
  • You have multiple file checks daily; by leveraging the power of the CloudWatch rule's constant keys and a custom cron, we can accomplish all of them with a single Lambda function.
  • You want to check multiple files within different buckets.
  • You want a success or failure notification for the file's existence.

We will use Python as the language within the Lambda function to accomplish the above requirements, and here is the process we will follow sequentially:
  1. Create an SNS topic and add subscribers to it (see the boto3 sketch after this list).
  2. Create the Lambda function.
  3. Configure test events within the AWS Lambda function.
  4. Verify the working of the Lambda function by modifying the test event values.
  5. Create a CloudWatch rule to automate the file check Lambda function.
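
Step 1 can be done from the console; as a minimal boto3 sketch (the topic name matches the ARN used later in the post, the email address is a placeholder):

from boto3 import client

sns = client('sns', region_name='ap-south-1')

# Create the topic; returns the existing topic if it already exists.
topic = sns.create_topic(Name='mySNSTopic')

# Subscribe an email endpoint; AWS sends a confirmation mail that must be accepted.
sns.subscribe(
    TopicArn=topic['TopicArn'],
    Protocol='email',
    Endpoint='someone@example.com'  # placeholder address
)
print('Topic ARN:', topic['TopicArn'])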
Lab Setup details :
  • S3 Bucket name : cyberkeeda-bucket-a
  • S3 Bucket name with directory name : cyberkeeda-bucket-a/reports/
  • File Name and Format
    • File Type 1 : Static file : demo-file-A.txt
    • File Type 2 : Dynamic file : YYYYMMDDdemo-file-A.txt (20200530demo-file-A.txt)
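
To stage these lab fixtures, a short boto3 sketch (assumes the bucket already exists and you have write access to it):

from boto3 import client

s3 = client('s3', region_name='ap-south-1')

# Upload the static demo file at the bucket root and under the reports/ prefix.
for key in ('demo-file-A.txt', 'reports/demo-file-A.txt'):
    s3.put_object(Bucket='cyberkeeda-bucket-a', Key=key, Body=b'demo content')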
Steps:
  • Create the Lambda function.
    • Follow the standard procedure to create an AWS Lambda function : How to create Lambda Function from AWS Console with example.
    • Add the below Python code. --> Github Link
    • In case you don't want email notifications for SUCCESS/INFO conditions, comment out the calls to the function named trigger_email().
    • Replace the below variable with your own:
      • SNS_TOPIC_ARN = 'arn:aws:sns:ap-south-1:387684573977:mySNSTopic'

from boto3 import resource, client
from botocore.exceptions import ClientError
from datetime import datetime

SNS_TOPIC_ARN = 'arn:aws:sns:ap-south-1:387650023977:mySNSTopic'

def lambda_handler(event, context):

    def trigger_email(email_subject, email_message):
        # Publish the notification to the SNS topic.
        sns = client('sns')
        sns.publish(
            TopicArn=SNS_TOPIC_ARN,
            Subject=email_subject,
            Message=email_message
        )

    def initialize_objects_and_variables():
        global SOURCE_BUCKET_NAME
        global FILE_NAME
        global FILE_NAME_WITH_DIRECTORY
        global dt

        dt = datetime.now()
        FILE_PREFIX_DATE = dt.strftime('%Y%m%d')  # e.g. 20200530
        FILE_PREFIX_DIRECTORY = event["bucket_sub_directory"]
        FILE_SUFFIX = event["file_suffix"]
        SOURCE_BUCKET_NAME = event["bucket_name"]
        FILE_TYPE = event['fileType']

        # 'daily' files carry today's date as a prefix; any other fileType is a static name.
        if FILE_PREFIX_DIRECTORY == 'False':
            if FILE_TYPE == 'daily':
                FILE_NAME = FILE_PREFIX_DATE + FILE_SUFFIX
                FILE_NAME_WITH_DIRECTORY = FILE_NAME
            else:
                FILE_NAME = FILE_SUFFIX
                FILE_NAME_WITH_DIRECTORY = FILE_NAME
        else:
            if FILE_TYPE == 'daily':
                FILE_NAME = FILE_PREFIX_DATE + FILE_SUFFIX
                FILE_NAME_WITH_DIRECTORY = FILE_PREFIX_DIRECTORY + FILE_NAME
            else:
                FILE_NAME = FILE_SUFFIX
                FILE_NAME_WITH_DIRECTORY = FILE_PREFIX_DIRECTORY + FILE_NAME

    def check_file_existence():

        s3 = resource('s3')
        try:
            # HEAD the object; raises ClientError if it is missing or inaccessible.
            s3.Object(SOURCE_BUCKET_NAME, FILE_NAME_WITH_DIRECTORY).load()
            print("[SUCCESS]", dt, "File Exists with name as", FILE_NAME)
            email_subject = "[INFO] Daily Report File found in report Folder"
            email_message = "Today's file name : {} \n Bucket Name : {} \n Lambda Function Name : {}".format(FILE_NAME, SOURCE_BUCKET_NAME, context.function_name)
            trigger_email(email_subject, email_message)

        except ClientError as errorStdOut:

            # 404 (or 403 when the role lacks s3:ListBucket) means the object was not found.
            if errorStdOut.response['Error']['Code'] in ("403", "404"):
                print("[ERROR]", dt, "File does not exist. :", FILE_NAME)
                email_subject = "[ERROR] Daily Report File not found in report Folder"
                email_message = "Expected file name : {} \n Bucket Name : {} \n Lambda Function Name : {}".format(FILE_NAME, SOURCE_BUCKET_NAME, context.function_name)
                trigger_email(email_subject, email_message)

            else:
                print("[ERROR]", dt, "Something went wrong")
                email_subject = "[ERROR] Lambda Error"
                email_message = "Something went wrong, please check lambda logs.\n Expected file name : {} \n Bucket Name : {}\n Lambda Function Name : {}".format(FILE_NAME, SOURCE_BUCKET_NAME, context.function_name)
                trigger_email(email_subject, email_message)

    initialize_objects_and_variables()
    check_file_existence()

    • Configure Lambda function test events.
    The above Lambda function can be used for the following use cases:
    • It can check the existence of a file under an S3 bucket, including files located under sub-directories of the bucket.
    Note : Replace bucket_name and file_suffix as per your setup and verify the function works.
      • To check the existence of a file directly under a bucket, use the below JSON under configure test events.
        • We have file demo-file-A.txt located at cyberkeeda-bucket-a/
        • {
            "bucket_sub_directory": "False",
            "file_suffix": "demo-file-A.txt",
            "bucket_name": "cyberkeeda-bucket-a",
            "fileType": "random"
          }
      • To check the existence of a file under a sub-directory located within a bucket, use the below JSON under configure test events.
        • We have file demo-file-A.txt located at cyberkeeda-bucket-a/reports/
        • {
            "bucket_sub_directory": "reports/",
            "file_suffix": "demo-file-A.txt",
            "bucket_name": "cyberkeeda-bucket-a",
            "fileType": "random"
          }
    • It can also check the existence of a dynamic file under an S3 bucket, even under sub-directories; the same payloads can be sent programmatically, as in the sketch below.
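
    The same test events can also be sent outside the console; a minimal boto3 invocation sketch, assuming the function is named Check_File (the name used later in this series):

    import json
    from boto3 import client

    lambda_client = client('lambda', region_name='ap-south-1')

    # Same payload as the console test event above.
    payload = {
        "bucket_sub_directory": "reports/",
        "file_suffix": "demo-file-A.txt",
        "bucket_name": "cyberkeeda-bucket-a",
        "fileType": "random"
    }

    # Synchronous invocation; the function reports results via SNS and its logs.
    response = lambda_client.invoke(
        FunctionName='Check_File',
        InvocationType='RequestResponse',
        Payload=json.dumps(payload)
    )
    print(response['StatusCode'])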
    • Create a CloudWatch rule to automate the file check Lambda.
    We have our dynamic file with format YYYYMMDDdemo-file-A.txt, where the file prefix is today's date, so today's file will be named 20200530demo-file-A.txt.

    Our Lambda function's Python script is written in a way to validate such a file.
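
    This mirrors the prefix logic inside the function, which prepends today's date (strftime('%Y%m%d')) to the file suffix:

    from datetime import datetime

    # Build today's expected file name the same way the Lambda function does.
    file_suffix = 'demo-file-A.txt'
    expected_name = datetime.now().strftime('%Y%m%d') + file_suffix
    print(expected_name)  # e.g. 20200530demo-file-A.txt on 30 May 2020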

    Please comment in case you face any issue or require support using the above scripts.


    AWS CloudWatch : How to create a CloudWatch Rule and schedule a Lambda Function using it.



    AWS CloudWatch


    Amazon CloudWatch is a monitoring and management service that provides data and actionable insights for AWS, hybrid, and on-premises applications and infrastructure resources. With CloudWatch, you can collect and access all your performance and operational data in the form of logs and metrics from a single platform.


    CloudWatch Rule : Create rules to invoke targets based on events happening in your AWS environment.

    So this tutorial is just a walkthrough of:
    • How to create a CloudWatch rule using the AWS Console
    • How to create a CloudWatch rule and invoke a Lambda function using the rule scheduler and a cron expression
    • What Constant Keys are and how they are used when a Lambda function is the target

    What we are trying to achieve here :
    • We have a Lambda function named "Check_File", and its job is to check the existence of a file.
    • The file is dynamic : Format --> YYYYMMDDdemo-file-A.txt
    • Thus this part of the file name, demo-file-A.txt, is treated as the file_suffix.
    • So we will be using Constant Keys to send the required values, which will be consumed as the EVENT within our Lambda function.
    • Our dynamic file gets uploaded every day between 11:30 and 12:00, so we will trigger the Lambda function via a CloudWatch rule with a custom cron to check the file's existence.
    Lab Setup details

    • AWS Lambda Function name : Check_File
    • AWS Lambda Function Task : Checks the existence of a file uploaded daily between 11:30 and 12:00
    • Cron Timing : Every five minutes between 11:30 and 12:00 GMT
    • Constant Keys : { "bucket_sub_directory": "reports/", "file_suffix": "demo-file-A.txt", "bucket_name": "cyberkeeda-bucket-a", "fileType": "daily"}

    Steps to create a CloudWatch Rule via the AWS Console.

    • AWS Console --> CloudWatch --> Rules --> Create Rule
    • Enter the required details.
      • Event Source :
        • Schedule --> Custom Cron --> 30,35,40,45,50,55,59 11 ? * * *
        • Target --> Lambda Function --> Check_File
        • Constant Keys --> { "bucket_sub_directory": "reports/", "file_suffix": "demo-file-A.txt", "bucket_name": "cyberkeeda-bucket-a", "fileType": "daily"}
    • Configure Rule details :
      • Name : CloudWatch Rule name
      • Description : Describe it, and click on Create Rule.
    Note : Replace the above inputs with your own.


        We are done with the creation of the CloudWatch Rule via the Admin Console.

    Verify the CloudWatch logs to confirm that the scheduled Lambda executed.
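
    Alternatively, the same rule can be created programmatically. A minimal boto3 sketch of the console steps above (the account ID in the ARN and the rule name are placeholders):

    import json
    from boto3 import client

    events = client('events', region_name='ap-south-1')
    lambda_client = client('lambda', region_name='ap-south-1')

    # Placeholder ARN; substitute your Check_File function's ARN.
    function_arn = 'arn:aws:lambda:ap-south-1:123456789012:function:Check_File'

    # Same custom cron as entered in the console walkthrough.
    rule = events.put_rule(
        Name='check-file-rule',  # placeholder rule name
        ScheduleExpression='cron(30,35,40,45,50,55,59 11 ? * * *)',
        State='ENABLED'
    )

    # Attach the Lambda target with the Constant Keys JSON as its input.
    events.put_targets(
        Rule='check-file-rule',
        Targets=[{
            'Id': 'Lambda1',
            'Arn': function_arn,
            'Input': json.dumps({
                "bucket_sub_directory": "reports/",
                "file_suffix": "demo-file-A.txt",
                "bucket_name": "cyberkeeda-bucket-a",
                "fileType": "daily"
            })
        }]
    )

    # Allow CloudWatch Events to invoke the function.
    lambda_client.add_permission(
        FunctionName='Check_File',
        StatementId='cw-rule-invoke',
        Action='lambda:InvokeFunction',
        Principal='events.amazonaws.com',
        SourceArn=rule['RuleArn']
    )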


    CloudWatch Rule via CloudFormation script.

    Use the below AWS CFN template to create the above CloudWatch rule via AWS CloudFormation.


    AWSTemplateFormatVersion: 2010-09-09
    Description: CFN Script to create Lambda Function and a CloudWatch Rule with cron to trigger it.
    Parameters:
      LambdaFunctionNameARNParam:
        Type: String
        Description: Provide Lambda Function ARN Parameter. [a-z][a-z0-9]*
      CloudwatchRuleNameParm:
        Type: String
        Description: Provide Cloudwatch Rule name. [a-z][a-z0-9]*

    Resources:
      ScheduledRule:
        Type: AWS::Events::Rule
        Properties:
          Description: !Sub "Scheduled Rule created to invoke Lambda Function: ${LambdaFunctionNameARNParam}"
          Name: !Sub ${CloudwatchRuleNameParm}
          ScheduleExpression: cron(0/5 11-12 ? * * *)
          State: "DISABLED"
          Targets:
            - Id: Lambda1
              Arn: !Sub ${LambdaFunctionNameARNParam}
              Input: '{ "bucket_sub_directory": "reports/",
                        "file_suffix": "demo-file-A.txt",
                        "bucket_name": "cyberkeeda-bucket-a",
                        "fileType": "random"
                      }'

      PermissionForEventsToInvokeLambda:
        Type: AWS::Lambda::Permission
        Properties:
          FunctionName: !Sub ${LambdaFunctionNameARNParam}
          Action: "lambda:InvokeFunction"
          Principal: "events.amazonaws.com"
          SourceArn: !GetAtt ScheduledRule.Arn
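
    Note that the rule above is created with State: "DISABLED"; once you are happy with the configuration, enable it from the console or with a one-liner boto3 sketch (the rule name is a placeholder for whatever you passed as CloudwatchRuleNameParm):

    from boto3 import client

    events = client('events', region_name='ap-south-1')
    # Enable the scheduled rule created by the CFN template above.
    events.enable_rule(Name='my-check-file-rule')  # placeholder rule name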


