AWS Lambda – Additional Example

So far, we have seen AWS Lambda working with other AWS services. Based on that knowledge, let us create a simple user registration form and post its data to AWS Lambda through API Gateway. AWS Lambda will read the data from the event, i.e. the API Gateway trigger, and add those details to a DynamoDB table.

Example

Let us consider an example and perform the following functionalities on it −

Create DynamoDB table
Create form for user registration
Create AWS Lambda and API Gateway to send an OTP message to a phone using the AWS SNS service
Create AWS Lambda and API Gateway to POST form data and insert it in a DynamoDB table
Create AWS Lambda and API Gateway to read data from the DynamoDB table
Final working of the user registration form

Create DynamoDB Table

The data entered will be stored in a DynamoDB table. We will use API Gateway to pass the entered data to AWS Lambda, and AWS Lambda will then add the details to DynamoDB. You can use the following details to create the DynamoDB table in the AWS console. First, go to AWS Services and click DynamoDB. Click Table to create the table as shown below −

You can use the ARN to create a policy for the DynamoDB table to be used with AWS Lambda. Go to IAM and select Policies. Click Create policy and choose DynamoDB as the service as shown below −

Click All DynamoDB actions as shown above. Choose resource and enter the ARN for the table as shown below −

Now, click Add as shown below. If you click the Review policy button at the end of the screen, you can see the following window −

Enter the name of the policy and click the Create policy button at the end of the page. Now, we need to create the role to be used with Lambda. We need permissions for DynamoDB, API Gateway and Lambda. Go to AWS Services and select IAM. Select Roles from the left side and add the required roles. Enter the role name and click Create role. The role created is roleforlambdaexample.
Create Form for User Registration

Here is the display of the user registration form used to enter data and to read data from the DynamoDB table.

Create AWS Lambda and API Gateway to Send OTP Message to Phone using SNS Service

If you look at the user registration form, there is a Validate Phone button. The user is supposed to enter a phone number and click this button to validate the phone number. For this purpose −

When a user clicks this button, the API Gateway POST method which contains the phone details is called, and internally AWS Lambda is triggered.

Then, AWS Lambda sends an OTP to the phone number entered, using the AWS SNS service.

The user receives the OTP and has to enter this OTP number. The textbox to enter the OTP appears once the phone number is entered and the Validate Phone button is clicked.

The OTP sent by AWS Lambda and the OTP entered by the user have to match to allow the user to submit the user registration form.

A simple block diagram that explains the working of phone validation is shown here −

The AWS Lambda function created is as shown here −

The corresponding AWS Lambda code is as given below −

const aws = require("aws-sdk");
const sns = new aws.SNS({ region: "us-east-1" });
exports.handler = function(event, context, callback) {
   let phoneno = event.mphone;
   // Generate a random six-digit OTP
   let otp = Math.floor(100000 + Math.random() * 900000);
   let snsmessage = "Your otp is : " + otp;
   sns.publish({
      Message: snsmessage,
      PhoneNumber: "+91" + phoneno
   }, function(err, data) {
      if (err) {
         console.log(err);
         callback(err, null);
      } else {
         console.log(data);
         callback(null, otp);
      }
   });
};

Note that we are using the SNS service to send the OTP code. This code is used to validate the mobile number entered by the user in the user registration form.

The API Gateway created for the above phone validation is as follows −

The Lambda function given is phonevalidationexample. We are taking the mobile phone details here to be used inside AWS Lambda. Then, AWS Lambda will send the OTP code to the given mobile number.
Create AWS Lambda and API Gateway to POST Form Data and Insert in DynamoDB Table

For the user registration form, all fields are mandatory. An AJAX call is made wherein the data entered in the form is posted to the API Gateway URL.

A simple block diagram which explains the working of the Submit button is shown here −

Once the form is filled, the Submit button will call the API Gateway, which will trigger AWS Lambda. AWS Lambda will get the details of the form from the event, i.e. the API Gateway trigger, and the data will be inserted in the DynamoDB table.

Let us understand the creation of the API Gateway and AWS Lambda. First, go to AWS Services and click Lambda. The Lambda function created is as shown here −

Now, to create an API Gateway, go to AWS Services and select API Gateway. Click the Create API button shown below. Enter the API name and click the Create API button to add the API. Now, an API called registeruser is created. Select the API and click the Actions dropdown to create a resource. Click Create Resource.

Now, let us add the POST method. For this, click the resource created on the left side, and from the Actions dropdown select Create Method. This will display a dropdown as shown below −

Select the POST method and add the Lambda function that we created above. Click the Save button to add the method. To send the form details to the Lambda function lambdaexample, we need to add the Integration Request as shown below −

To post the form details, you will have to click Integration Request. It will display the details below. Click Body Mapping Templates to add the form fields to be posted. Next, click Add mapping template and enter the content type. Here, we have added application/json as the content type. Click it, and here you need to enter the fields in JSON.
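The chunk above does not show the insert Lambda itself, so here is a minimal sketch of what it could look like. The table name registeruser and the field names cust_id, name, address and mphone are assumptions based on the form described, not the original code; the DynamoDB DocumentClient is injected as a parameter so the mapping can be exercised without AWS credentials.

```javascript
// Sketch of the Lambda that inserts posted form data into DynamoDB.
// Assumed names (not from the original code): table "registeruser",
// fields cust_id, name, address, mphone.

// Build the DocumentClient put() parameters from the API Gateway event.
function buildPutParams(event) {
   return {
      TableName: "registeruser",
      Item: {
         cust_id: event.cust_id,
         name: event.name,
         address: event.address,
         mphone: event.mphone
      }
   };
}

// The DynamoDB client is injected so the handler can be exercised with
// a stub; in Lambda you would pass new aws.DynamoDB.DocumentClient().
function makeHandler(docClient) {
   return function (event, context, callback) {
      docClient.put(buildPutParams(event), function (err) {
         if (err) return callback(err);
         callback(null, "record inserted");
      });
   };
}
```

Splitting the parameter mapping from the AWS call keeps the Lambda body trivial and lets the mapping be unit-tested locally.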
Using AWS Lambda@Edge with CloudFront

Lambda@Edge is an addition to the AWS Lambda compute service which is used to customize the content that CloudFront delivers.

The block diagram which shows the working of AWS Lambda with CloudFront is shown below −

There are four ways in which AWS Lambda can be triggered −

Viewer Request − the end user makes the request, called a Viewer Request, to CloudFront
Origin Request − CloudFront forwards the request to the origin
Origin Response − CloudFront receives the response from the origin
Viewer Response − CloudFront sends the response to the viewer

We can use Lambda@Edge for the following purposes −

To change the headers at request and response time.
To add cookie details to the headers.
To carry out A/B testing based on the request and response.
To redirect the URL to another site, based on the header details.
To fetch the user-agent from the headers and find out the details of the browser, OS, etc.

Requisites

To start working with CloudFront and Lambda@Edge, we need the following −

Create an S3 storage bucket with file details
Create a role which will allow permission to work with CloudFront and Lambda@Edge
Create a CloudFront distribution
Create a Lambda function
Add the Lambda function details to CloudFront
Check the CloudFront URL in a browser

We will work on an example with CloudFront and Lambda@Edge, wherein we will host a page and change the response when the request is detected as coming from a desktop or a mobile device.

Create S3 Storage Bucket with File Details

Log in to the AWS console, create a bucket in S3 and add the .html file which you want to display.

Click S3 and Create bucket as shown below −

Now, click the Create bucket button and add the details of the bucket as shown below −

Click the Create button and upload the .html file in it.

Create Role

Go to the AWS console and click IAM. Now, click Roles -> Create role button as shown −

Choose the permissions for S3, Lambda and CloudFront.
It is a good practice to create a policy giving permission to only the required function and storage, by using the ARN details. In the example discussed below, we are showing the Full Access permission. The policies for the role named role for cloudfront are added as shown above. Click Create role.

All the policies required for Lambda@Edge and CloudFront are as shown above. There is an additional step to be done here: in the case of CloudFront, the URL will be available across regions, so a trust relationship is needed between the services we are using.

Now, for the role created, click the Trust relationships tab as shown −

Click Edit Trust Relationship as shown below −

It displays a policy document. We need to add the other services we are planning to use in Principal -> Service. The final trust relationship policy document is as shown below −

Click the Update Trust Policy button to save the changes.

Create CloudFront Distribution

Go to the CloudFront service as shown below −

Click the CloudFront service and click Create Distribution −

There are three groups of settings: Origin Settings, Behaviour Settings and Distribution Settings. Let us look into these settings one by one −

Origin Settings

The various parameters of Origin Settings are explained below −

Origin Domain Name − This is the name of the S3 bucket where we have stored the html files. We can also store images, if any, in the S3 bucket by creating folders of our choice.

Origin Path − Here you need to enter the name of the folder where the files are stored. At present, we do not have this folder, so we will keep it blank for now.

Origin ID − It gets populated when the origin domain name is selected. You can change the id as per your choice.

Restrict Bucket Access − In this, we will choose the option Yes. Here we need security for the S3 bucket so that no one has direct access to the S3 bucket. For this option some more fields are populated, like Origin Access Identity, Comment and Grant Read Permission on Bucket.
Origin Access Identity − We have used the Create a New Identity option. You can also choose an existing identity. This creates a new identity which is used by CloudFront to read the details from the S3 bucket.

Grant Read Permission on Bucket − For this, choose the option Yes.

Origin Custom Headers − We will keep the headers blank here, as we do not need the details right now.

Next, let us discuss and fill up the Behaviour Settings for the CloudFront distribution −

Now, select the protocol (HTTPS or HTTP) and the caching option. Note that the default caching is 86400 seconds, i.e. 24 hrs. You can change this value as per the requirement. Click Object Caching (customize option) to change the caching. You can use smooth streaming in case there are any videos on your page. Here, we are keeping the default option available. Once the Lambda function is created, its details will be added.

The details for the distribution settings are shown below −

The various parameters of the distribution settings are explained below −

Price Class − It has details like the origin of user traffic. Note that here we have selected the default one − Use All Edge Locations.

AWS WAF Web ACL − This is for web application firewall selection. Here, it has the option None. We would first need to create the firewall in AWS. It provides security to the site.

Alternate Domain Names − Here you can specify the domain name if you have one.

SSL Certificate − This has all the details to be selected for the SSL certificate. We will keep the defaults.

Default Root Object − Here we will specify the filename which we have uploaded in S3. For this, we need the content from the .html to be displayed by default.

For the rest, we will keep the default settings. Click the Create Distribution button to create the distribution.
Using Lambda Function with CloudTrail

AWS CloudTrail is a service available with Amazon which helps to log all the activities done inside the AWS console. It logs all the API calls and stores the history, which can be used later for debugging purposes.

Note that we cannot trigger Lambda from CloudTrail directly. Instead, CloudTrail stores all the history in the form of logs in an S3 bucket, and we can trigger AWS Lambda from S3. AWS Lambda will then get triggered whenever logs are added to the S3 bucket.

Requisites

Before you start to work with AWS CloudTrail, S3 and AWS Lambda, you need to perform the following −

Create an S3 bucket to store CloudTrail logs
Create an SNS service
Create a trail in CloudTrail and assign the S3 bucket and SNS service
Create an IAM role with permissions
Create an AWS Lambda function
AWS Lambda configuration

Example

Let us consider an example which shows the working of AWS CloudTrail, S3 and AWS Lambda. Here, we will create a bucket in S3 which will store all the logs for any interaction done in the AWS console. Let us create an SNS topic and publish it. For this action, the logs will be entered as a file in S3. AWS Lambda will get triggered, and it will send mail using the Amazon SES service.

The block diagram explaining this process is as shown below −

Create S3 Bucket to Store CloudTrail Logs

Go to the AWS console and click the S3 service. Click Create bucket and enter the name of the bucket in which you want to store the CloudTrail logs as shown −

Observe that here we have created an S3 bucket cloudtraillogsaws for storing the logs.

Create SNS Service

Go to the AWS console and click Simple Notification Service. Select Topics from the left side and click the Create new topic button. We have created a topic called displaytrail to publish a topic. Its details will get stored in the S3 bucket that was created above.
Create a Trail in CloudTrail and Assign the S3 Bucket and SNS Service

Go to the AWS console and click the CloudTrail service from Management Tools as shown −

Click Trails from the left side as shown below −

Click the Create Trail button. Enter the trail name, and for Apply trail to all regions choose Yes, so that the logs will be recorded for all regions. For Read/Write events, choose All.

Add the S3 bucket and SNS topic details as shown below. You can create a new one here or add an existing one.

Note that there are options available to encrypt log files, enable log file validation, send an SNS notification for every log file delivery, etc. I have used the default values here. You can allow file encryption, and it will ask for an encryption key. Click the Create Trail button once the details are added.

Create IAM Role with Permissions

Go to the AWS console and select IAM. Create a role with permissions for S3, Lambda, CloudTrail and SES for sending email. The role created is as shown below −

Create AWS Lambda Function

Go to AWS Services and click the Lambda service. Add the function name, select the runtime as nodejs, and select the role created for the Lambda function. Following is the Lambda function created.

AWS Lambda Configuration

Next, we need to add S3 as the trigger for the AWS Lambda function created.
Add the S3 bucket details to add the trigger, and add the following AWS Lambda code −

const aws = require("aws-sdk");
var ses = new aws.SES({ region: "us-east-1" });
exports.handler = function(event, context, callback) {
   console.log("AWS lambda and S3 trigger");
   console.log(event);
   // Read the bucket name and the log file key from the S3 event
   const s3message = "Bucket Name:" + event.Records[0].s3.bucket.name +
      "\nLog details:" + event.Records[0].s3.object.key;
   console.log(s3message);
   var eParams = {
      Destination: {
         ToAddresses: ["[email protected]"]
      },
      Message: {
         Body: {
            Text: {
               Data: s3message
            }
         },
         Subject: {
            Data: "cloudtrail logs"
         }
      },
      Source: "[email protected]"
   };
   ses.sendEmail(eParams, function(err, data) {
      if (err) {
         console.log(err);
      } else {
         console.log("===EMAIL SENT===");
         console.log(data);
         context.succeed(event);
         callback(null, "email is sent");
      }
   });
};

Note that we are taking the S3 bucket and log details from the event and sending mail using the SES service as shown above.

Whenever any activity takes place in the AWS console, the logs will be sent to the S3 bucket, and at the same time AWS Lambda will get triggered and the mail will be sent to the email id mentioned in the code. Note that you can process the logs as per your needs in AWS Lambda.
Using Lambda Function with Amazon DynamoDB

DynamoDB can trigger AWS Lambda when data in its tables is added, updated or deleted. In this chapter, we will work on a simple example that adds items to a DynamoDB table, with AWS Lambda reading the data and sending mail with the data added.

Requisites

To use Amazon DynamoDB and AWS Lambda, we need to follow the steps shown below −

Create a table in DynamoDB with a primary key
Create a role which will have permission to work with DynamoDB and AWS Lambda
Create a function in AWS Lambda
AWS Lambda trigger to send mail
Add data in DynamoDB

Let us discuss each of these steps in detail.

Example

We are going to work on the following example, which shows the basic interaction between DynamoDB and AWS Lambda. This example will help you understand the following operations −

Creating a table called customer in DynamoDB and entering data in that table.

Triggering the AWS Lambda function once the data is entered, and sending mail using the Amazon SES service.

The basic block diagram that explains the flow of the example is shown below −

Create Table in DynamoDB with Primary Key

Log in to the AWS console. Go to AWS Services and select DynamoDB as shown below. Select DynamoDB.

DynamoDB shows the options as shown below −

Now, click Create table to create the table as shown. We have named the table customer, with the primary key for that table as cust_id. Click the Create button to add the table to DynamoDB.

The table created is as shown below −

We can add items to the table created as follows −

Click Items and click the Create item button as shown −

Creating Role with Permissions to Work with DynamoDB and AWS Lambda

To create the role, go to AWS Services and click IAM.

Let us create a policy to be used only for the DynamoDB table created earlier −

Now, choose a Service. Observe that the service we have selected is DynamoDB. For Actions, we have taken all DynamoDB actions, i.e. access to list, read and write.
For Resources, we will select the table resource type. When you click it, you can see a screen as follows −

Now, select table and click Add ARN as shown. We will get the ARN details from the customer table created, as shown below −

Enter the ARN details here −

Click the Add button to save the changes. Once done, click Review policy. Enter the name of the policy, a description, etc. as shown below −

Click Create policy to save it. Add the policy to the role to be created. Select Role from the left side and enter the details. Observe that the policies added are newpolicyfordynamdb, awslambdafullaccess, cloudwatchfullaccess and amazonsesfullaccess. Add the role; we will use it while creating the AWS Lambda function.

Create Function in AWS Lambda

Thus, we have created a Lambda function called newlambdafordynamodb as shown.

Now, let us add a DynamoDB trigger to the AWS Lambda function created. The runtime we shall use is Node.js.

You can find the following details in the DynamoDB trigger that are to be configured for AWS Lambda −

Now, simply click Add to add the trigger to AWS Lambda.

AWS Lambda Trigger to Send Mail

AWS Lambda will get triggered when data is inserted into the DynamoDB table. The event parameter will have the DynamoDB data inserted. The function will read the data from the event and send an email.

Sending an Email

To send an email, you need to follow the steps given below −

Step 1

Go to AWS Services and select SES (Simple Email Service). Validate the email address to which we need to send an email, as shown −

Step 2

Click the button Verify a New Email Address to add the email address.

Step 3

Enter an email address to verify it. The email address will receive an activation mail from Amazon which needs to be clicked. Once the activation is done, the email id is verified and can be used with AWS services.
Step 4

The AWS Lambda code which reads data from the event and sends email is given below −

var aws = require("aws-sdk");
var ses = new aws.SES({ region: "us-east-1" });
exports.handler = function(event, context, callback) {
   console.log(event);
   // The DynamoDB stream record carries the inserted row in NewImage
   let tabledetails = JSON.parse(JSON.stringify(event.Records[0].dynamodb));
   console.log(tabledetails.NewImage.address.S);
   let customerid = tabledetails.NewImage.cust_id.S;
   let name = tabledetails.NewImage.name.S;
   let address = tabledetails.NewImage.address.S;
   var eParams = {
      Destination: {
         ToAddresses: ["[email protected]"]
      },
      Message: {
         Body: {
            Text: {
               Data: "The data added is as follows:\n CustomerId:" + customerid +
                  "\n Name:" + name + "\n Address:" + address
            }
         },
         Subject: {
            Data: "Data Inserted in Dynamodb table customer"
         }
      },
      Source: "[email protected]"
   };
   console.log("===SENDING EMAIL===");
   ses.sendEmail(eParams, function(err, data) {
      if (err) {
         console.log(err);
      } else {
         console.log("===EMAIL SENT===");
         console.log(data);
         context.succeed(event);
         callback(null, "email is sent");
      }
   });
};

Now, save the Lambda function. Next, add data in the DynamoDB table.

Add Data in DynamoDB

Use the following sequence to add data in DynamoDB.

Step 1

Go to the table customer created in DynamoDB.

Step 2

Click Create item.

Step 3

Click the Save button and check the email id provided in AWS Lambda to see if the mail has been sent by AWS Lambda.
Using Lambda Function with Scheduled Events

Scheduled events happen at regular intervals, based on a rule set. Scheduled events are used to execute a Lambda function after an interval which is defined in the CloudWatch service. They are best suited for running cron jobs with AWS Lambda. This chapter will explain, with a simple example, how to send mail every 5 minutes using scheduled events and AWS Lambda.

Requisites

The requirements for using a Lambda function with scheduled events are as follows −

Verify the email id using AWS SES
Create a role to use AWS SES, CloudWatch and AWS Lambda
Create a Lambda function to send email
Add a rule for scheduled events from AWS CloudWatch

Example

The example that we are going to consider will add a CloudWatch event to the AWS Lambda function. CloudWatch will trigger AWS Lambda based on the time pattern attached to it. In the example below we have used 5 minutes as the trigger: every 5 minutes, AWS Lambda will be triggered, and it will send mail whenever triggered.

The basic block diagram for the same is shown below −

Verify Email ID using AWS SES

Log in to AWS and go to the AWS SES service as shown below −

Now, click Simple Email Service as shown −

Click Email Addresses on the left side as shown −

It displays a button Verify a New Email Address. Click it. Enter the email address you want to verify and click the Verify This Email Address button. You will receive mail from AWS on that email id with the subject: Amazon Web Services – Email Address Verification Request in region US East (N. Virginia).

Click the link given in the mail to verify the email address. Once verified, it will display the email id as follows −

Create Role to use AWS SES, CloudWatch and AWS Lambda

You can also create a role which gives permission to use the services. For this, go to IAM and select Role. Add the required policies and create the role. Observe that the role created here is events with lambda.
Create Lambda Function to Send Email

Follow the usual steps to create a Lambda function using the runtime nodejs. Now, add a trigger to Lambda as shown −

Add the details to the CloudWatch Events trigger as shown below −

Note that the event will be triggered every 5 minutes as per the rule created.

The Lambda code for sending an email is given below −

var aws = require("aws-sdk");
var ses = new aws.SES({ region: "us-east-1" });
exports.handler = function(event, context, callback) {
   var eParams = {
      Destination: {
         ToAddresses: ["[email protected]"]
      },
      Message: {
         Body: {
            Text: {
               Data: "this mail comes from aws lambda event scheduling"
            }
         },
         Subject: {
            Data: "Event scheduling from aws lambda"
         }
      },
      Source: "[email protected]"
   };
   console.log("===SENDING EMAIL===");
   ses.sendEmail(eParams, function(err, data) {
      if (err) {
         console.log(err);
      } else {
         console.log("===EMAIL SENT===");
         console.log(data);
         context.succeed(event);
         callback(null, "email is sent");
      }
   });
};

First, we need the AWS SES service. You can add this using the code shown as follows −

var aws = require("aws-sdk");
var ses = new aws.SES({ region: "us-east-1" });

To send mail from nodejs, we have created the eParams object, which has details like the source mail, the to mail id and the body with the message, as follows −

var eParams = {
   Destination: {
      ToAddresses: ["[email protected]"]
   },
   Message: {
      Body: {
         Text: {
            Data: "this mail comes from aws lambda event scheduling"
         }
      },
      Subject: {
         Data: "Event scheduling from aws lambda"
      }
   },
   Source: "[email protected]"
};

The Lambda code to send the email is as follows −

ses.sendEmail(eParams, function(err, data) {
   if (err) {
      console.log(err);
   } else {
      console.log("===EMAIL SENT===");
      console.log(data);
      context.succeed(event);
      callback(null, "email is sent");
   }
});

Now, let us save this Lambda function and check the email id for mails.
The screenshot shown below shows that the mail is sent from AWS Lambda every 5 minutes.
Using Lambda Function with Amazon Kinesis

The AWS Kinesis service is used to capture and store real-time tracking data coming from website clicks, logs, and social media feeds. We can trigger AWS Lambda to perform additional processing on these logs.

Requisites

The basic requirements to get started with Kinesis and AWS Lambda are as shown −

Create a role with the required permissions
Create a data stream in Kinesis
Create the AWS Lambda function
Add code to AWS Lambda
Add data to the Kinesis data stream

Example

Let us work on an example wherein we will trigger AWS Lambda to process the data stream from Kinesis and send mail with the data received.

A simple block diagram explaining the process is shown below −

Create Role with Required Permissions

Go to the AWS console and create a role.

Create Data Stream in Kinesis

Go to the AWS console and create a data stream in Kinesis. There are 4 options as shown. We will work with Create data stream in this example.

Click Create data stream. Enter the name in the Kinesis stream name field given below. Enter the number of shards for the data stream. The details of shards are as shown below −

Enter the name and click the Create Kinesis stream button at the bottom. Note that it takes a certain time for the stream to become active.

Create AWS Lambda Function

Go to the AWS console and click Lambda. Create the AWS Lambda function as shown −

Click the Create function button at the end of the screen. Add Kinesis as the trigger to AWS Lambda. Add the configuration details to the Kinesis trigger −

Add the trigger, and now add code to AWS Lambda.

Adding Code to AWS Lambda

For this purpose, we will use nodejs as the runtime. We will send mail once AWS Lambda is triggered with the Kinesis data stream.
const aws = require("aws-sdk");
var ses = new aws.SES({ region: "us-east-1" });
exports.handler = function(event, context, callback) {
   let payload = "";
   event.Records.forEach(function(record) {
      // Kinesis data is base64 encoded, so decode it here
      payload = Buffer.from(record.kinesis.data, "base64").toString("ascii");
      console.log("Decoded payload:", payload);
   });
   var eParams = {
      Destination: {
         ToAddresses: ["[email protected]"]
      },
      Message: {
         Body: {
            Text: {
               Data: payload
            }
         },
         Subject: {
            Data: "Kinesis data stream"
         }
      },
      Source: "[email protected]"
   };
   ses.sendEmail(eParams, function(err, data) {
      if (err) {
         console.log(err);
      } else {
         console.log("===EMAIL SENT===");
         console.log(data);
         context.succeed(event);
         callback(null, "email is sent");
      }
   });
};

The event parameter has the data entered in the Kinesis data stream. The above AWS Lambda code will get activated once data is entered in the Kinesis data stream.

Add Data to Kinesis Data Stream

Here we will use the AWS CLI to add data to the Kinesis data stream as shown below. For this purpose, we can use the following command −

aws kinesis put-record --stream-name kinesisdemo --data "hello world" --partition-key "789675"

Then, AWS Lambda is activated and the mail is sent.
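The base64 decoding step in the handler above can be isolated into a small helper and checked locally without touching AWS; the helper name below is an assumption for illustration.

```javascript
// Kinesis delivers record payloads base64 encoded; this helper isolates
// the decoding step from the handler above so it can be tested locally.
function decodeKinesisRecord(record) {
   return Buffer.from(record.kinesis.data, "base64").toString("ascii");
}
```

For example, after running the put-record command above, each record's kinesis.data field would decode back to the string "hello world".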
Using Lambda Function with Amazon S3

The Amazon S3 service is used for file storage, where you can upload or remove files. We can trigger AWS Lambda on S3 when there are any file uploads in S3 buckets. AWS Lambda has a handler function which acts as the starting point for the AWS Lambda function. The handler has the details of the events. In this chapter, let us see how to use AWS S3 to trigger the AWS Lambda function when we upload files in an S3 bucket.

Steps for Using AWS Lambda Function with Amazon S3

To start using AWS Lambda with Amazon S3, we need the following −

Create an S3 bucket
Create a role which has permission to work with S3 and Lambda
Create a Lambda function and add S3 as the trigger

Example

Let us see these steps with the help of an example which shows the basic interaction between Amazon S3 and AWS Lambda.

The user will upload a file to the Amazon S3 bucket.

Once the file is uploaded, it will trigger the AWS Lambda function in the background, which will display an output in the form of a console message saying that the file is uploaded.

The user will be able to see the message in the CloudWatch logs once the file is uploaded.

The block diagram that explains the flow of the example is shown here −

Creating S3 Bucket

Let us start by creating an S3 bucket in the AWS console using the steps given below −

Step 1

Go to Amazon Services and click S3 in the Storage section, as highlighted in the image given below −

Step 2

Click S3 storage and Create bucket, which will store the uploaded files.

Step 3

Once you click the Create bucket button, you can see a screen as follows −

Step 4

Enter the bucket name, select the region, and click the Create button at the bottom left side. Thus, we have created a bucket with the name workingwithlambdaands3.

Step 5

Now, click the bucket name and it will ask you to upload files as shown below −

Thus, we are done with bucket creation in S3.
Create Role that Works with S3 and Lambda

To create a role that works with S3 and Lambda, please follow the steps given below −

Step 1

Go to AWS Services and select IAM as shown below −

Step 2

Now, click IAM -> Roles as shown below −

Step 3

Now, click Create role and choose the service that will use this role. Select Lambda and click the Permission button.

Step 4

Add the permissions from below and click Review.

Step 5

Observe that we have chosen the following permissions −

Observe that the policies we have selected are AmazonS3FullAccess, AWSLambdaFullAccess and CloudWatchFullAccess.

Step 6

Now, enter the role name and role description, and click the Create role button at the bottom. Thus, our role named lambdawiths3service is created.

Create Lambda Function and Add S3 Trigger

In this section, let us see how to create a Lambda function and add an S3 trigger to it. For this purpose, you will have to follow the steps given below −

Step 1

Go to AWS Services and select Lambda as shown below −

Step 2

Click Lambda and follow the process for adding the name. Choose the runtime, role, etc. and create the function. The Lambda function that we have created is shown in the screenshot below −

Step 3

Now let us add the S3 trigger.

Step 4

Choose the trigger from above and add the details as shown below −

Step 5

Select the bucket created from the bucket dropdown. The event type has the following details −

Select Object Created (All), as we need the AWS Lambda trigger when a file is uploaded, removed, etc.

Step 6

You can add prefix and file-pattern filters, which are used to filter the files added; for example, to trigger Lambda only for .jpg images. Let us keep it blank for now, as we need to trigger Lambda for all files uploaded. Click the Add button to add the trigger.

Step 7

You can find the trigger display for the Lambda function as shown below −

Let us add the details for the AWS Lambda function. Here, we will use the online editor to add our code and use nodejs as the runtime environment.
Step 8 − To trigger S3 with AWS Lambda, we will have to use the S3 event in the code as shown below −

```javascript
exports.handler = function(event, context, callback) {
   console.log("Incoming Event: ", event);
   const bucket = event.Records[0].s3.bucket.name;
   const filename = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, " "));
   const message = `File is uploaded in - ${bucket} -> ${filename}`;
   console.log(message);
   callback(null, message);
};
```

Note that the event parameter has the details of the S3 event. We have logged the bucket name and the file name, which will appear in the logs when you upload an image to the S3 bucket.
Step 9 − Now, let us save the changes and test the Lambda function with an S3 upload. The following are the code details added in AWS Lambda −
Step 10 − Now, let us add the role, memory and timeout.
Step 11 − Now, save the Lambda function. Open S3 from Amazon services and open the bucket we created earlier, namely workingwithlambdaands3. Upload the image in it as shown below −
Step 12 − Click the Upload button to add files as shown −
Step 13 − Click Add files to add files. You can also drag and drop the files. Now, click the Upload button. Thus, we have uploaded one image to our S3 bucket.
Step 14 − To see the trigger details, go to AWS services and select CloudWatch. Open the logs for the Lambda function created above. The output you can observe in CloudWatch is as shown −
The AWS Lambda function gets triggered when a file is uploaded in S3.
AWS Lambda – Quick Guide
AWS Lambda – Overview
AWS Lambda is a service which performs serverless computing, that is, computing without managing any server. The code is executed in response to events in AWS services such as adding/removing files in an S3 bucket, updating Amazon DynamoDB tables, HTTP requests from Amazon API Gateway, etc.
To get working with AWS Lambda, we just have to push the code to the AWS Lambda service. All other tasks and resources such as infrastructure, operating system, server maintenance, code monitoring, logs and security are taken care of by AWS.
AWS Lambda supports languages such as Java, NodeJS, Python, C#, Go, Ruby and PowerShell. Note that AWS Lambda triggers work only with AWS services.

What is AWS Lambda?
The definition of AWS Lambda as given by its official documentation is as follows −
AWS Lambda is a compute service that lets you run code without provisioning or managing servers. AWS Lambda executes your code only when needed and scales automatically, from a few requests per day to thousands per second. You pay only for the compute time you consume; there is no charge when your code is not running.

How AWS Lambda Works?
The block diagram that explains the working of AWS Lambda in five easy steps is shown below −
Step 1 − Upload the AWS Lambda code in any of the languages AWS Lambda supports, such as NodeJS, Java, Python, C# and Go.
Step 2 − These are a few AWS services on which AWS Lambda can be triggered.
Step 3 − AWS Lambda has the uploaded code and the details of the event on which the trigger has occurred, for example, an event from Amazon S3, Amazon API Gateway, DynamoDB, Amazon SNS, Amazon Kinesis, CloudFront, Amazon SES, CloudTrail, a mobile app, etc.
Step 4 − AWS Lambda code executes only when triggered by AWS services, under scenarios such as −
A user uploads files to an S3 bucket
An HTTP GET/POST endpoint URL is hit
Data is added/updated/deleted in DynamoDB tables
Push notification
Data streams collection
Hosting of a website
Email sending
Mobile app, etc.
Step 5 − Remember that AWS charges only when the AWS Lambda code executes, and not otherwise.

Advantages of using AWS Lambda
AWS Lambda offers multiple benefits when you are working with it. This section discusses them in detail −
Ease of working with code
AWS Lambda gives you the infrastructure to upload your code. It takes care of maintaining the code and triggers the code whenever the required event happens. It allows you to choose the memory and the timeout required for the code. AWS Lambda can also execute parallel requests as per the event triggers.
Log Provision
AWS Lambda gives the details of the number of times the code was executed, the time taken for execution, the memory consumed, etc. AWS CloudWatch collects all the logs, which helps in understanding the execution flow and in debugging the code.
Billing based on Usage
AWS Lambda billing is based on memory allocated, requests made and execution duration, with the duration rounded up to the nearest 100ms. For example, if you allow your AWS Lambda code up to 500ms to execute but it finishes in just 200ms, AWS bills you only for the time taken, that is 200ms of execution instead of 500ms. AWS always charges for the execution time used; you need not pay if the function is not executed.
Multi Language Support
AWS Lambda supports popular languages such as Node.js, Python, Java, C# and Go. These are widely used languages and any developer will find it easy to write code for AWS Lambda.
Ease of code authoring and deploying
There are many options available in Lambda for authoring and deploying code.
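The rounding described above can be sketched as a small calculation. Note that the per-GB-second price below is an illustrative placeholder, not a quoted AWS rate; check the current pricing page for real figures:

```javascript
// Round execution time up to the next 100ms billing increment, then
// convert allocated memory x billed time into GB-seconds and a cost.
const PRICE_PER_GB_SECOND = 0.0000166667; // placeholder rate for illustration only

function billedMilliseconds(durationMs) {
   return Math.ceil(durationMs / 100) * 100;
}

function costOfInvocation(memoryMb, durationMs) {
   const gbSeconds = (memoryMb / 1024) * (billedMilliseconds(durationMs) / 1000);
   return gbSeconds * PRICE_PER_GB_SECOND;
}

console.log(billedMilliseconds(200)); // 200 (already a multiple of 100ms)
console.log(billedMilliseconds(230)); // 300 (rounded up to the next increment)
```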
For writing your code, you can use the AWS online editor, Visual Studio IDE or Eclipse IDE. It also has support for the serverless framework, which makes writing and deploying AWS Lambda code easy. Besides the AWS console, there is the AWS CLI to create and deploy code.
Other features
You can use AWS Lambda for free by getting a login to the AWS free tier. It gives you the service for free for 1 year. Take a look at the free services offered by the AWS free tier.

Disadvantages of using AWS Lambda
In spite of many advantages, AWS Lambda has the following disadvantages −
It is not suitable for small projects.
You need to carefully analyze your code and decide the memory and timeout. In case your function needs more time than what is allocated, it will get terminated as per the timeout specified on it and the code will not be fully executed.
Since AWS Lambda relies completely on AWS for the infrastructure, you cannot install any additional software if your code demands it.

Events that Trigger AWS Lambda
The events that can trigger AWS Lambda are as follows −
Entry into an S3 object
Insertion, update and deletion of data in a DynamoDB table
Push notifications from SNS
GET/POST calls to API Gateway
Header modification at viewer or origin request/response in CloudFront
Log entries in an AWS Kinesis data stream
Log history in CloudTrail

Use Cases of AWS Lambda
AWS Lambda is a compute service mainly used to run background processes. It can be triggered when used with other AWS services. The list of AWS services where we can use AWS Lambda is given below −
S3 Object and AWS Lambda
Amazon S3 passes the event details to AWS Lambda when there is any file upload in S3. The details of the file upload, deletion or move are passed to AWS Lambda. The code in AWS Lambda can take the necessary steps when it receives the event details, for example, creating a thumbnail of an image inserted into S3.
DynamoDB and AWS Lambda
DynamoDB can trigger AWS Lambda when data is added, updated or deleted in the table.
Creating and Deploying using Serverless Framework
AWS Lambda functions can be created and deployed using the serverless framework. It allows you to create AWS Lambda triggers and also deploy them by creating the required roles. The serverless framework allows you to handle big projects in an easier way: the events and resources required are written in one place, and just a few commands deploy the full functionality on the AWS console. In this chapter, you will learn in detail how to get started with the AWS serverless framework.

Install Serverless Framework using npm install
To begin with, you need to first install nodejs. You can check for nodejs as follows −
You will have to use the following command to install serverless using the npm package −

npm install -g serverless

Once npm is done, execute the serverless command, which shows the list of commands to be used to create and deploy an AWS Lambda function. Observe the screenshots given below −
You can also use sls instead of serverless; sls is the shorthand command for serverless.
In case you need help on the command sls, you can use the following command −

sls create --help

For creating a serverless framework project, you have to follow the steps given below −
Step 1 − To start using the serverless framework, we need to add the credentials. For this, create the user first in the AWS console as follows −
Step 2 − Click the Next:Permissions button to add permissions. You will have to attach the existing policies or Administrator Access to this user.
Step 3 − Click Create user to add the user. It will display the access key and secret key which we need to configure the serverless framework −

Configure AWS Serverless Framework
Let us see how to configure the AWS serverless framework. You can use the following command for this purpose −

sls config credentials --provider aws --key accesskey --secret secretkey

Note that the credentials entered, that is the access key and secret key, are stored in the file ~/.aws/credentials.
First, create a folder where you want your project files to be stored. Next, we will start the work in the aws-serverless folder.

Create AWS Lambda using Serverless Framework
Now, let us create a Lambda function with the serverless framework using the steps given below −
Step 1 − Following are the details for the serverless create command −
Step 2 − Now, we need to assign a template, which can be one of the following − aws-nodejs, aws-nodejs-typescript, aws-nodejs-ecma-script, aws-python, aws-python3, aws-groovy-gradle, etc.
Step 3 − We shall make use of the aws-nodejs template to create our first project using the serverless framework. The command for the same purpose is as shown here −

sls create --template aws-nodejs

Note that this command creates a boilerplate for the template aws-nodejs.
Step 4 − Now, open the folder created in an IDE. Here we are using Visual Studio Code and the folder structure is as follows −
Step 5 − There are 2 files created: handler.js and serverless.yml. The basic AWS Lambda function details are shown in handler.js as follows −

```javascript
'use strict';

module.exports.hello = (event, context, callback) => {
   const response = {
      statusCode: 200,
      body: JSON.stringify({
         message: 'Go Serverless v1.0! Your function executed successfully!',
         input: event,
      }),
   };
   callback(null, response);

   // Use this code if you don't use the http event with the LAMBDA-PROXY integration
   // callback(null, { message: 'Go Serverless v1.0! Your function executed successfully!', event });
};
```

The file serverless.yml has the configuration details of the serverless framework. It starts with a block of introductory comments −

```yaml
# Welcome to Serverless!
#
# This file is the main config file for your service.
# It's very minimal at this point and uses default values.
# You can always add more config options for more control.
# We've included some commented out config examples here.
# Just uncomment any of them to get that config option.
#
# For full config options, check the docs:
#    docs.serverless.com
#
# Happy Coding!
```
```yaml
service: aws-nodejs # NOTE: update this with your service name

# You can pin your service to only deploy with a specific Serverless version
# Check out our docs for more details
# frameworkVersion: "=X.X.X"

provider:
  name: aws
  runtime: nodejs6.10

# you can overwrite defaults here
#  stage: dev
#  region: us-east-1

# you can add statements to the Lambda function's IAM Role here
#  iamRoleStatements:
#    - Effect: "Allow"
#      Action:
#        - "s3:ListBucket"
#      Resource: { "Fn::Join" : ["", ["arn:aws:s3:::", { "Ref" : "ServerlessDeploymentBucket" } ] ] }
#    - Effect: "Allow"
#      Action:
#        - "s3:PutObject"
#      Resource:
#        Fn::Join:
#          - ""
#          - - "arn:aws:s3:::"
#            - "Ref" : "ServerlessDeploymentBucket"
#            - "/*"

# you can define service wide environment variables here
#  environment:
#    variable1: value1

# you can add packaging information here
#package:
#  include:
#    - include-me.js
#    - include-me-dir/**
#  exclude:
#    - exclude-me.js
#    - exclude-me-dir/**

functions:
  hello:
    handler: handler.hello

#    The following are a few example events you can configure
#    NOTE: Please make sure to change your handler code to work with those events
#    Check the event documentation for details
#    events:
#      - http:
#          path: users/create
#          method: get
#      - s3: ${env:BUCKET}
#      - schedule: rate(10 minutes)
#      - sns: greeter-topic
#      - stream: arn:aws:dynamodb:region:XXXXXX:table/foo/stream/1970-01-01T00:00:00.000
#      - alexaSkill: amzn1.ask.skill.xx-xx-xx-xx
#      - alexaSmartHome: amzn1.ask.skill.xx-xx-xx-xx
#      - iot:
#          sql: "SELECT * FROM 'some_topic'"
#      - cloudwatchEvent:
#          event:
#            source:
#              - "aws.ec2"
#            detail-type:
#              - "EC2 Instance State-change Notification"
#            detail:
#              state:
#                - pending
#      - cloudwatchLog: '/aws/lambda/hello'
#      - cognitoUserPool:
#          pool: MyUserPool
#          trigger: PreSignUp

#    Define function environment variables here
#    environment:
#      variable2: value2

# you can add CloudFormation resource templates here
#resources:
#  Resources:
#    NewResource:
#      Type: AWS::S3::Bucket
#      Properties:
#        BucketName: my-new-bucket
#  Outputs:
#    NewOutput:
#      Description: "Description for the output"
#      Value: "Some output value"
```

Now, we need to add changes in the serverless.yml file as per our requirements.
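For instance, to expose the hello function through an HTTP GET endpoint on API Gateway, one common change is to uncomment the http event under the function. The following is a sketch based on the commented examples in the boilerplate; the path users/create is taken from those examples and can be anything you choose:

```yaml
functions:
  hello:
    handler: handler.hello
    events:
      - http:
          path: users/create
          method: get
```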