
Jenkins Pipeline for Continuous Integration of AWS Lambda Function with GitHub repository

 


AWS Lambda is awesome, and trust me, if you are working on AWS, sooner or later you will have to deal with it.

Because it is a managed PaaS-style service, we can't ignore how Lambda code is deployed and tested, which more or less happens through the Lambda Function console.

Of course there are ways to write, test, and deploy code directly from an IDE, but keep in mind you still need an Access Key and Secret Access Key.

So what about the code base? How do we track the code changes made in the Lambda Function itself?

In this post, we will cover the challenges Lambda brings to CI/CD and one proposed solution that addresses part of them.

Let's look at some of the challenges and their probable solutions.

  • Lambda Deployment : We can use Terraform or CloudFormation for this, so what's the challenge?
    • CloudFormation :
      • We can use the inline method and put our Lambda code under the ZipFile code block, but what about third-party modules like pandas? Those can't be used in the inline code block of CloudFormation.
      • One can still package those third-party modules and the code together, but then one needs to upload the package to an S3 bucket and think of a way to handle changes before using it (see the packaging sketch after this list).
  • Lambda Function Code base :
    • We still need snapshots of our Lambda function code to track daily changes and for later use in a deployment pipeline.
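
For reference, a minimal packaging sketch in Python (boto3): it zips a source directory together with any vendored third-party modules and uploads the artifact to S3, so a CloudFormation Code block can reference S3Bucket/S3Key instead of an inline ZipFile. The bucket, paths, and key below are placeholder assumptions.

import os
import zipfile

import boto3


def package_and_upload(src_dir, zip_name, bucket, key):
    # Bundle source files (and vendored modules) into a single zip artifact.
    with zipfile.ZipFile(zip_name, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _, files in os.walk(src_dir):
            for name in files:
                path = os.path.join(root, name)
                zf.write(path, os.path.relpath(path, src_dir))
    # Upload the artifact; a CloudFormation Lambda resource can then point
    # its Code block at this bucket/key.
    boto3.client("s3").upload_file(zip_name, bucket, key)


# Placeholder values -- replace with your own bucket and paths.
package_and_upload("src", "lambda_package.zip", "my-artifact-bucket", "lambda/lambda_package.zip")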

There are more challenges with Lambda, but in this blog post we will cover the basics of CI/CD: replicating our Lambda Function code from the console to a GitHub repository.

  • The moment we talk about CI/CD, the pipeline usually pulls source code from git and then uses it for further stages such as checkout, build, release, deploy, and test.
  • Here the case is somewhat different: due to the PaaS nature of Lambda, we have to test the code's functionality in the Lambda console first; only then is it pushed to a repository to preserve the source code.
  • Yes, AWS SAM is another option for testing Lambda function code in a local environment, but not when the Lambda is hosted in a VPC and communicates with other services.

Below is my proposed solution to achieve this.



Prerequisites.
  • IAM Access Key and Secret, or an IAM Role attached to the EC2 instance from which the Jenkins job is triggered.
  • GitHub Personal Access Token
Here is the flow...
  1. I assume the developer initially tests his/her code's functionality in the Lambda console. Once the developer is happy with his/her Lambda Function code, we move to the next step.
  2. The SysAdmin/Developer can check his/her code in directly from the Lambda Function to the GitHub repository using a Jenkins job.
  3. The Jenkins job has a scripted pipeline attached to it, which goes through the stages below.
    • Stage : Check out code to the appropriate branch.
    • Stage : Build a Docker image for Ansible from the Dockerfile.
    • Stage : Run an Ansible container from the Docker image created above and run the ansible-playbook command to execute the Ansible role and its tasks.
      1. Task 1 -
        • Download the Lambda code from the Lambda console using a Python script based on the boto3 module.
        • Unzip the downloaded code into a specific directory so changes can be tracked file by file; changes inside a zip file can't be tracked.
      2. Task 2 -
        • Clone the existing repository from git and replace the existing Lambda source code with the newer code downloaded in the step above.
        • Git add, commit, and push it to the git repository.

Here is the lab setup.

Our Lambda Function in the console is named "ck-uat-Lambda-Authorizer".


And its code looks something like the below in the console.


GitHub repository where I want to publish my code.

Repo Snapshot.


Directory Layout for the same...



Our final intention is to dump our Lambda function code under the src directory, that is, lambda_folder/src.

So, following the flow stated earlier in the post, here is the code.

Jenkins Scripted Pipeline code.

Note: Do mask any additional secrets so they don't appear in plain text.
def gituser = env.GIT_USERNAME
def gituserpass = env.GIT_PASSWORD
def ACCESS_KEY = env.AWS_ACCESS_KEY
def KEY_ID = env.AWS_SECRET_ACCESS_KEY
def DEBUG_MODE = env.LOG_TYPE

node('master'){

  try {

    stage('Git Checkout'){
      checkout scm
      sh "git checkout lambda_deployer"
    }

    stage('build'){
      sh "ls -ltr"
      echo "Building docker image via dockerfile..."
      sh "docker build -t ansible:2.10-$BUILD_ID ."
    }

    stage('deploy'){
      echo "Infrastructure deployment started...."
      // 'var' takes the variable *name* (a string), not its value; mask the
      // AWS secret key too so it never shows up in the console log.
      wrap([$class: "MaskPasswordsBuildWrapper",
            varPasswordPairs: [[password: gituserpass, var: 'gituserpass'],
                               [password: KEY_ID, var: 'KEY_ID']]]) {
        sh "docker run --rm \
            -e gituser=$gituser \
            -e gituserpass=$gituserpass \
            -e AWS_ACCESS_KEY_ID=$ACCESS_KEY \
            -e AWS_SECRET_ACCESS_KEY=$KEY_ID \
            -e AWS_DEFAULT_REGION='ap-south-1' \
            ansible:2.10-$BUILD_ID ansible-playbook -$DEBUG_MODE --extra-vars 'env=dev1 git_username=${gituser} token=${gituserpass}' lambda_folder/root_lambda_project.yml"
      }
    }
  }
  catch (e){
    echo "Error occurred - " + e.toString()
    throw e
  }
  finally {
    deleteDir()
    sh 'docker rmi -f ansible:2.10-$BUILD_ID && echo "ansible:2.10-$BUILD_ID local image deleted."'
  }
}

The build pipeline should look something like the below in the Jenkins console.

[Screenshot: Jenkins pipeline stage view]

One of the Jenkins stages, Build, builds the Docker image from the Dockerfile; here is the Dockerfile source code.

FROM python:3.7
RUN python3 -m pip install ansible==2.10 boto3 awscli

RUN rm -rf /usr/local/ansible/

COPY lambda_folder /usr/local/ansible/lambda_folder

WORKDIR /usr/local/ansible/

CMD ["ansible-playbook", "--version"]

Once the Docker image is created, the next step is to run a Docker container from the Ansible image created above.

Here is the Ansible role and its respective tasks.

Ansible Root Playbook YAML -- root_lambda_project.yml

---
- hosts: localhost
  connection: local
  gather_facts: False

  roles:
   - role

Ansible Variable file under roles -- lambda_folder/role/vars/dev1/main.yml

---
region: us-east-1
function_name: ck-uat-LambdaAuthorizer
git_repo_name: aws-swa
git_repo_branch: lambda_deployer

Python script, which is called by one of the Ansible tasks to download the Lambda Function code.

Note : It's an edited version of existing code from Stack Overflow.
"""
    Script to download individual Lambda Function and dump code in specified directory
"""
import os
import sys
from urllib.request import urlopen
import zipfile
from io import BytesIO

import boto3


def get_lambda_functions_code_url(fn_name):
    """Return a list with the code metadata (incl. presigned URL) for fn_name."""
    client = boto3.client("lambda")
    functions_code_url = []
    fn_code = client.get_function(FunctionName=fn_name)["Code"]
    fn_code["FunctionName"] = fn_name
    functions_code_url.append(fn_code)
    return functions_code_url


def download_lambda_function_code(fn_name, fn_code_link, dir_path):
    """Download the zip from the presigned URL and extract it under dir_path/fn_name."""
    function_path = os.path.join(dir_path, fn_name)
    if not os.path.exists(function_path):
        os.mkdir(function_path)
    with urlopen(fn_code_link) as lambda_extract:
        with zipfile.ZipFile(BytesIO(lambda_extract.read())) as zfile:
            zfile.extractall(function_path)


if __name__ == "__main__":
    # Usage: download_lambda.py <destination_dir> <function_name>
    inp = sys.argv[1:]
    if inp and os.path.exists(inp[0]):
        dest = os.path.abspath(inp[0])
        print("Destination folder {}".format(dest))
        fc = get_lambda_functions_code_url(sys.argv[2])
        for f in fc:
            print("Downloading Lambda function {}".format(f["FunctionName"]))
            download_lambda_function_code(f["FunctionName"], f["Location"], dest)
    else:
        print("Destination folder doesn't exist")
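
Outside Ansible, the script can be exercised directly for a quick sanity check; a sketch, assuming AWS credentials are exported and the script is importable as download_lambda (matching role/files/download_lambda.py):

# Hypothetical one-off run; the function name and destination mirror the lab setup.
from download_lambda import (get_lambda_functions_code_url,
                             download_lambda_function_code)

for fn in get_lambda_functions_code_url("ck-uat-LambdaAuthorizer"):
    download_lambda_function_code(fn["FunctionName"], fn["Location"], "src")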


Ansible Task 1 : lambda_folder/role/tasks/download_lambda_code.yml

---

- name: Read Variables
  include_vars:
    file: "role/vars/{{ env }}/main.yml"

- name: Download Lambda Function using Python script..
  command:
    argv:
      - python3 
      - role/files/download_lambda.py 
      - src
      - "{{ function_name }}"

Ansible Task 2 : lambda_folder/role/tasks/update_repository.yml

---
- name: Git clone source repository..
  command:
    argv:
      - git 
      - clone 
      - https://{{ git_username }}:{{ token }}@github.com/Jackuna/{{ git_repo_name }}.git 
      - -b 
      - "{{ git_repo_branch }}"

- name: Copy Lambda function source code into repo..
  command: >
    cp -r src {{ git_repo_name }}/lambda_folder

- name: Git add recent changes..
  command: >
    git add --all lambda_folder/src
  args:
    chdir: "{{ git_repo_name }}"

- name: Git Config username..
  command: >
    git config user.name {{ git_username }}
  args:
    chdir: "{{ git_repo_name }}"

- name: Git Config email..
  command: >
    git config user.email {{ git_username }}@cyberkeeda.com 
  args:
    chdir: "{{ git_repo_name }}"  
- name: Git commit recent changes..
  command: >
    git commit -m "Updated Latest code"
  args:
    chdir: "{{ git_repo_name }}"

- name: Git push recent changes..
  command:
    argv:
      - git 
      - push 
      - https://{{ git_username }}:{{ token }}@github.com/Jackuna/{{ git_repo_name }}.git 
      - -u 
      - "{{ git_repo_branch }}"
  args:
    chdir: "{{ git_repo_name }}"
  register: git_push_output  

That's all you need. In case of hurdles or issues, do comment!

AWS Lambda Function to check existence of file under S3 bucket and Notify via Email



File Check Automation on AWS using Lambda, CloudWatch, and SNS.


Within this post, we will cover:

  • How we can check for the existence of a file in an AWS S3 bucket using Python in an AWS Lambda function
  • How to use AWS Simple Notification Service (SNS) to notify about file existence status from within Lambda
  • How we can automate the file-check Lambda function using a CloudWatch rule and a custom crontab
  • How we can implement the entire file-check monitoring solution with an AWS CloudFormation template.

We will start with the use cases:

  • You have a scheduled event that drops a specific file into an S3 bucket daily/hourly and you want to check its existence status.
  • You have multiple file checks daily: by leveraging the power of CloudWatch rule constant keys and custom cron expressions, we can accomplish them all with a single Lambda function.
  • You want file checks for different files in different buckets.
  • You want a success or failure notification for file existence.

We will use Python as the language within the Lambda function to accomplish the above requirements, and here is the process we will follow sequentially (a minimal sketch for step 1 follows the list).
  1. Create an SNS topic and add subscribers to it.
  2. Create the Lambda function.
  3. Configure test events within the AWS Lambda function.
  4. Verify the Lambda function works by modifying the test event values.
  5. Create a CloudWatch rule to automate the file-check Lambda function.
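
Step 1 can be done from the console; for reference, a minimal boto3 sketch, where the topic name matches the lab setup and the email address is a placeholder:

import boto3

sns = boto3.client("sns", region_name="ap-south-1")

# create_topic is idempotent: it returns the existing topic if one already exists.
topic = sns.create_topic(Name="mySNSTopic")

# An email subscription stays "PendingConfirmation" until the recipient
# confirms via the link AWS sends out.
sns.subscribe(
    TopicArn=topic["TopicArn"],
    Protocol="email",
    Endpoint="you@example.com",  # placeholder subscriber
)
print("Topic ARN:", topic["TopicArn"])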
Lab Setup details :
  • S3 Bucket name : cyberkeeda-bucket-a
  • S3 Bucket name with directory name : cyberkeeda-bucket-a/reports/
  • File Name and Format
    • File Type 1 : Static file : demo-file-A.txt
    • File Type 2 : Dynamic file : YYYYMMDDdemo-file-A.txt (20200530demo-file-A.txt), with the date prefix computed as in the snippet below.
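
A one-liner shows how the dynamic name is derived (this mirrors the FILE_PREFIX_DATE logic in the handler further down):

from datetime import datetime

# e.g. '20200530demo-file-A.txt' when run on 30 May 2020
print(datetime.now().strftime('%Y%m%d') + 'demo-file-A.txt')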
Steps:
  • Create Lambda Function.
    • Follow the standard procedure to create AWS lambda function : How to create Lambda Function from AWS Console with example.
    • Add the below Python code. --> Github Link
    • In case we don't want email notifications for SUCCESS/INFO conditions, comment out the calls to the function trigger_email().
    • Replace the below variables with your own.
      • SNS_TOPIC_ARN = 'arn:aws:sns:ap-south-1:387684573977:mySNSTopic'

from boto3 import resource, client
import botocore
from datetime import datetime

SNS_TOPIC_ARN = 'arn:aws:sns:ap-south-1:387650023977:mySNSTopic'

def lambda_handler(event, context):

    def trigger_email(email_subject, email_message):

        sns = client('sns')
        sns.publish(
            TopicArn=SNS_TOPIC_ARN,
            Subject=email_subject,
            Message=email_message
        )

    def initialize_objects_and_variables():
        global SOURCE_BUCKET_NAME
        global FILE_NAME
        global FILE_NAME_WITH_DIRECTORY
        global dt

        dt = datetime.now()
        FILE_PREFIX_DATE = dt.strftime('%Y%m%d')
        FILE_PREFIX_DIRECTORY = event["bucket_sub_directory"]
        FILE_SUFFIX = event["file_suffix"]
        SOURCE_BUCKET_NAME = event["bucket_name"]
        FILE_TYPE = event['fileType']

        if FILE_PREFIX_DIRECTORY == 'False':
            if FILE_TYPE == 'daily':
                FILE_NAME = FILE_PREFIX_DATE + FILE_SUFFIX
                FILE_NAME_WITH_DIRECTORY = FILE_NAME
            else:
                FILE_NAME = FILE_SUFFIX
                FILE_NAME_WITH_DIRECTORY = FILE_NAME
        else:
            if FILE_TYPE == 'daily':
                FILE_NAME = FILE_PREFIX_DATE + FILE_SUFFIX
                FILE_NAME_WITH_DIRECTORY = FILE_PREFIX_DIRECTORY + FILE_NAME
            else:
                FILE_NAME = FILE_SUFFIX
                FILE_NAME_WITH_DIRECTORY = FILE_PREFIX_DIRECTORY + FILE_NAME

    def check_file_existence():

        s3 = resource('s3')
        try:
            # load() issues a HEAD request; it raises ClientError if the key is absent.
            s3.Object(SOURCE_BUCKET_NAME, FILE_NAME_WITH_DIRECTORY).load()
            print("[SUCCESS]", dt, "File Exists with name as", FILE_NAME)
            email_subject = "[INFO] Daily Report File found in report Folder"
            email_message = "Today's file name : {} \n Bucket Name : {} \n Lambda Function Name : {}".format(FILE_NAME, SOURCE_BUCKET_NAME, context.function_name)
            trigger_email(email_subject, email_message)

        except botocore.exceptions.ClientError as errorStdOut:

            if errorStdOut.response['Error']['Code'] == "404":
                print("[ERROR]", dt, "File does not exist. :", FILE_NAME)
                email_subject = "[ERROR] Daily Report File not found in report Folder"
                email_message = "Expected file name : {} \n Bucket Name : {} \n Lambda Function Name : {}".format(FILE_NAME, SOURCE_BUCKET_NAME, context.function_name)
                trigger_email(email_subject, email_message)

            else:
                print("[ERROR]", dt, "Something went wrong")
                email_subject = "[ERROR] Lambda Error"
                email_message = "Something went wrong, please check lambda logs.\n Expected file name : {} \n Bucket Name : {}\n Lambda Function Name : {}".format(FILE_NAME, SOURCE_BUCKET_NAME, context.function_name)
                trigger_email(email_subject, email_message)

    initialize_objects_and_variables()
    check_file_existence()
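
To try the handler outside AWS, a minimal local harness can stand in for the Lambda runtime; a sketch, assuming valid AWS credentials are available and the bucket exists:

from types import SimpleNamespace

# Hypothetical local invocation; the event mirrors the test events shown below.
event = {
    "bucket_sub_directory": "reports/",
    "file_suffix": "demo-file-A.txt",
    "bucket_name": "cyberkeeda-bucket-a",
    "fileType": "random",
}
# The handler only reads context.function_name, so a simple namespace suffices.
fake_context = SimpleNamespace(function_name="local-test")
lambda_handler(event, fake_context)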

    • Configure Lambda function test events.
    The above Lambda function can be used for the following use cases:
    • It can check for the existence of a file in an S3 bucket, including files located under sub-directories of any S3 bucket.
    Note : Replace bucket_name and file_suffix as per your setup and verify it works.
      • To check for a file located directly under a bucket, use the below JSON under configure test events.
        • We have file demo-file-A.txt located at cyberkeeda-bucket-a/
        • {
            "bucket_sub_directory": "False",
            "file_suffix": "demo-file-A.txt",
            "bucket_name": "cyberkeeda-bucket-a",
            "fileType": "random"
          }
      • To check for a file located under a sub-directory of a bucket, use the below JSON under configure test events.
        • We have file demo-file-A.txt located at cyberkeeda-bucket-a/reports/
        • {
            "bucket_sub_directory": "reports/",
            "file_suffix": "demo-file-A.txt",
            "bucket_name": "cyberkeeda-bucket-a",
            "fileType": "random"
          }
    • It can also check for the existence of a dynamic (date-prefixed) file in an S3 bucket or its sub-directories.
    • Create a CloudWatch rule to automate the file-check Lambda.
    We have our dynamic file with the format YYYYMMDDdemo-file-A.txt, where the file prefix is today's date; so today's file will be named 20200530demo-file-A.txt.

    Our Lambda function's Python script is written to validate such a file.
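
    The rule can be created from the console; for reference, a minimal boto3 sketch that schedules the check and passes the test-event JSON as the rule's constant input (the rule name, Lambda ARN, and cron expression below are placeholder assumptions):

    import json
    import boto3

    REGION = "ap-south-1"
    LAMBDA_ARN = "arn:aws:lambda:ap-south-1:123456789012:function:file-check"  # placeholder

    events = boto3.client("events", region_name=REGION)

    # Run every day at 10:30 UTC; adjust the cron to your file's arrival time.
    rule = events.put_rule(
        Name="daily-file-check",
        ScheduleExpression="cron(30 10 * * ? *)",
    )

    # The constant Input is handed to the Lambda as its event.
    events.put_targets(
        Rule="daily-file-check",
        Targets=[{
            "Id": "file-check-lambda",
            "Arn": LAMBDA_ARN,
            "Input": json.dumps({
                "bucket_sub_directory": "reports/",
                "file_suffix": "demo-file-A.txt",
                "bucket_name": "cyberkeeda-bucket-a",
                "fileType": "daily",
            }),
        }],
    )

    # CloudWatch Events also needs permission to invoke the function.
    boto3.client("lambda", region_name=REGION).add_permission(
        FunctionName="file-check",  # placeholder function name
        StatementId="allow-cloudwatch-rule",
        Action="lambda:InvokeFunction",
        Principal="events.amazonaws.com",
        SourceArn=rule["RuleArn"],
    )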

    Please comment in case you face any issue or need support using the above scripts.


    AWS CloudFormation Script to Create Lambda Role with Inline Policy for S3 Operations.



    Within this blog, we have a requirement to copy data from one bucket to another using a Lambda function; to accomplish this, Lambda needs an additional IAM role in order to act on other AWS services.

    So we will use a CloudFormation script to create the below AWS resources.

    • IAM Role for the Lambda service.
    • The above role has an inline policy attached with the below access:
      • Access to two individual buckets.
      • Access to CloudWatch to perform basic log operations.

    If you are looking to use it, replace the below values with your own.
    • Bucket 1 name : mydemodests1
    • Bucket 2 name : mydemodests2
    • IAM Role name : LambaRoleforS3operation
    • Inline Policy name : LambaRoleforS3operation-InlinePolicy

    AWSTemplateFormatVersion: 2010-09-09
    Description: Lambda role creation for S3 Operation.

    Resources:
      LambdaIAMRole:
        Type: 'AWS::IAM::Role'
        Description: "Lambda IAM Role"
        Properties:
          RoleName: LambaRoleforS3operation
          AssumeRolePolicyDocument:
            Version: '2012-10-17'
            Statement:
              - Sid: AllowLambdaServiceToAssumeRole
                Effect: Allow
                Principal:
                  Service:
                    - lambda.amazonaws.com
                Action:
                  - sts:AssumeRole
          Path: /service-role/
          Policies:
            - PolicyName: "LambaRoleforS3operation-InlinePolicy"
              PolicyDocument: {
                "Version": "2012-10-17",
                "Statement": [
                    {
                        "Effect": "Allow",
                        "Action": [
                            "logs:CreateLogGroup",
                            "logs:CreateLogStream",
                            "logs:PutLogEvents"
                        ],
                        "Resource": "arn:aws:logs:*:*:*"
                    },
                    {
                        "Effect": "Allow",
                        "Action": [
                            "s3:*"
                        ],
                        "Resource": [
                            "arn:aws:s3:::mydemodests1/*"
                        ]
                    },
                    {
                        "Effect": "Allow",
                        "Action": [
                            "s3:*"
                        ],
                        "Resource": [
                            "arn:aws:s3:::mydemodests2/*"
                        ]
                    }
                ]
              }
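
    The template can be deployed from the console or CLI; for reference, a boto3 sketch, noting that CAPABILITY_NAMED_IAM is required because the template sets RoleName (the stack and file names below are placeholders):

    import boto3

    cfn = boto3.client("cloudformation", region_name="ap-south-1")

    # Placeholder file name holding the template above.
    with open("lambda_role.yml") as f:
        template_body = f.read()

    cfn.create_stack(
        StackName="lambda-s3-role",
        TemplateBody=template_body,
        # Required whenever a template creates IAM resources with custom names.
        Capabilities=["CAPABILITY_NAMED_IAM"],
    )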


    Creating a Random Password Generator using an AWS Lambda Function


    AWS Lambda


    AWS Lambda is a serverless computing platform served by Amazon Web Services. It is a PaaS-style computing service that runs code in response to events and automatically manages the computing resources required by that code.



    So this tutorial is just a walk-through of:
    • How to run your Python code from AWS Lambda
    • How to make use of AWS Lambda to create a tiny Python application
    • A Python-based random strong password generator hosted on AWS Lambda
    • How to create an AWS Lambda function for Python

    Let's begin.
    In order to run Python code from a Lambda function, the first thing we do is create a handler within the Lambda function, which triggers the inner Python function using event and context.

    A basic Lambda function handler looks like the code below.

    def handler_name(event, context): 
        ...
        return some_value

    Now let's focus on the above code.

    • handler_name : type (string). It can be named anything, but the name matters later when filling in the Handler field.
    • event : type (dictionary). It's the way we feed variables to our inner Python code/program.
    • context : in simple terms, it is used for logging and debugging purposes.
    The above is just theory; I'm quite sure that once I show you how to run Python code using the above parameters, you will put them to further use yourself. A minimal sketch follows.
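
    For instance, a minimal sketch of a handler that reads a key from the event and logs via context (the names here are illustrative, not part of this tutorial's final code):

    def my_handler(event, context):
        # 'event' carries the key-value pairs defined in the test event.
        name = event.get("name", "world")
        # 'context' exposes runtime details, handy for logging.
        print("Running inside", context.function_name)
        return "Hello, {}!".format(name)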

    Let's toggle down to the Lambda service from the AWS console and create a function.

    Services --> Compute --> Lambda --> Functions --> Create Function

    [Screenshot: AWS console navigation to Lambda and the Create function page]

    Select Author from scratch, proceed to fill in the basic information for your new Lambda function, and then click on Create function.

    Function name : Enter a name that describes the purpose of your function; it can be anything.
    Here I will be creating a Lambda function to generate random passwords, hence we will name it "Password_Generator".

    Run time : Choose your desired language; we will use Python 3.7 here.

    Permissions : A Lambda function needs an appropriate IAM role to run; for now, choose the option "Create a new role with basic Lambda permissions".

    Permissions --> Choose or create execution role --> Create a new role with basic lambda permissions --> Create Function

    Once the function is created, you will be redirected to a page like the one below.

    [Screenshot: Lambda function overview page]

    Scroll down to the "Function code" section and paste our Python code within the Lambda handler function.

    from random import choice

    def lambda_handler(event, context):
        # Characters to pick from while building each password.
        keyboard_char = 'abcdefghijklmnopqrstuvwxyz1234567890ABCDEFGHIJKLMNOPQRSTUVWXYZ!@#$%^&*_-'

        def Gen_Strong_Pass(pass_len, pass_no):

            cast_pass_len = int(pass_len)
            cast_pass_no = int(pass_no)

            for i in range(cast_pass_no):
                if i == 0:
                    print('Here are your', cast_pass_no, 'random', cast_pass_len, 'digit passwords ! \n')

                hard_pass = ''
                for j in range(cast_pass_len):
                    hard_pass += choice(keyboard_char)
                print(hard_pass + '\n')

        Gen_Strong_Pass(event['pass_len'], event['pass_no'])
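
    Locally, the same handler can be exercised with a plain dictionary standing in for the test event; a quick sketch (the values match the "Generate8digit3password" event created below):

    # Hypothetical local run; this handler never touches context, so None suffices.
    event = {"pass_len": 8, "pass_no": 3}
    lambda_handler(event, None)  # prints three 8-character passwords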
    

    The topmost function name "lambda_handler" can be replaced by anything, but the same name must then be entered in the Handler field as the text after "lambda_function.",
    so it looks like lambda_function.lambda_handler

    [Screenshot: Handler field in the Lambda console]

    For example, if we want to rename our function lambda_handler to my_lambda_handler,

    the Handler in the top box must be changed to lambda_function.my_lambda_handler.

    We are ready; now let's scroll up to "Configure test events" and add our event. What is an event? A way to define variables using a Python dictionary; more on this is written above.

    [Screenshot: Configure test event dialog]

    Now let's replace the event name and the key-value pairs with our own inputs.

    Event name : We will name it "Generate8digit3password" and paste the required key-value pairs.


    You might be wondering why this key-value pair is needed and where it is used!

    Events defined within test events can be accessed within the Lambda function as
    event['name_of_your_key']; we defined two keys, pass_len and pass_no.
    Look at our code above: the red boxes in the screenshot show where the event values are read.

    [Screenshot: Lambda code with event['pass_len'] and event['pass_no'] highlighted]

    Once the test event is created, it will be visible under the test events drop-down.

    [Screenshot: saved test event in the drop-down]

    Now, we are ready!

    Go back to your function "Password_Generator", choose your recently saved test event,
    and click on Test.

    Once completed, expand the logs to see the code output, as below.

    [Screenshot: execution log output with the generated passwords]

    Conclusion : We have created a simple password generator using Python and deployed it to the AWS PaaS service named Lambda.
    We got a runtime infrastructure without any setup.

    Tip : You can save more test events, such as 9-digit or 16-digit password generators, and use them as per your requirements.

