Setting up Elastic Load Balancing (ELB) Application Load Balancers using LocalStack, deployed via the Serverless framework

Set up Elastic Load Balancing (ELB) Application Load Balancers with Node.js Lambda functions as targets, so that requests are forwarded to the target group for your Lambda function using LocalStack. Learn how to use the Serverless framework to set up and deploy your infrastructure locally on LocalStack with the serverless-localstack plugin.

AWS Elastic Load Balancing (ELB) distributes incoming application traffic across multiple targets, such as EC2 instances, containers, IP addresses, and Lambda functions. A load balancer accepts incoming traffic and distributes it across multiple targets in one or more Availability Zones. ELB scales your load balancer as traffic changes over time, so it can adapt to the needs of your application and to the majority of workloads running on AWS infrastructure.

ELB supports Application Load Balancers, Network Load Balancers, and Classic Load Balancers. The Application Load Balancer (ALB) operates at the application layer of the OSI model and load-balances HTTP and HTTPS traffic at the request level, which makes it well suited to web applications. Using ALB, you can register Lambda functions as targets: a listener rule forwards matching requests to the target group for your Lambda function, which is then invoked to process the request.
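When an ALB invokes a Lambda target, it passes the request details to the function as a JSON event. The object below is a minimal sketch of that payload; the ARN, path, and header values are illustrative, not taken from a real deployment:

```javascript
// Minimal sketch of the event an ALB passes to a Lambda target.
// The targetGroupArn, path, and header values here are illustrative.
const sampleAlbEvent = {
  requestContext: {
    elb: {
      targetGroupArn:
        'arn:aws:elasticloadbalancing:us-east-1:000000000000:targetgroup/example/abc123',
    },
  },
  httpMethod: 'GET',
  path: '/hello1',
  queryStringParameters: {},
  headers: { host: 'lb-test-1.elb.localhost.localstack.cloud' },
  body: '',
  isBase64Encoded: false,
};

// A handler can branch on these fields, for example:
const route = `${sampleAlbEvent.httpMethod} ${sampleAlbEvent.path}`;
console.log(route); // GET /hello1
```

The handlers we write later in this tutorial receive an event of this shape and return a response object the load balancer translates back into an HTTP response.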

LocalStack Pro supports creating ELB Application Load Balancers and configuring Lambda functions as targets. In this tutorial, we will set up an ELB Application Load Balancer with Node.js Lambda functions as targets using the Serverless framework and the serverless-localstack plugin, and set up ELB endpoints to forward requests to the target group for our Lambda functions.

Prerequisites

If you don’t have LocalStack Pro, you can sign up for a free trial.

Set up a Serverless project

The Serverless framework is an open-source framework for building, packaging, and deploying serverless applications across multiple cloud providers and platforms. With it, you can set up your serverless development environment, define your applications as functions and events, and deploy your entire infrastructure to the cloud with a single command. Install the Serverless framework via npm:

$ npm install -g serverless

After installation, you can verify the installation by running the following command:

$ serverless --version

Framework Core: 3.24.1
Plugin: 6.2.2
SDK: 4.3.2

Let us now go ahead and create a new Serverless project using the serverless command:

$ serverless create --template aws-nodejs --path serverless-elb

We are using the aws-nodejs template to create our Serverless project. This template consists of a simple Node.js Lambda function that returns a message when invoked. The template also creates a serverless.yml file containing the project’s configuration.

The serverless.yml file contains the configuration for the project, such as the name of the service, the provider, the functions, and example events that trigger the functions. If you need to set up your project using a different template, you can refer to the Serverless templates documentation. We can now go ahead and configure our Serverless project to use LocalStack.

Configure Serverless project to use LocalStack

To configure our Serverless project to use LocalStack, we need to install the serverless-localstack plugin. Before that, let us initialize the project and install some dependencies:

$ npm init -y
$ npm install -D serverless serverless-localstack serverless-deployment-bucket

The serverless-localstack plugin allows your Serverless project to redirect AWS API calls to LocalStack. The serverless-deployment-bucket plugin creates a deployment bucket in LocalStack to store the deployment artifacts, and cleans up after deployments so that old deployment buckets don’t linger around. We can now set up the plugins by adding the following properties to our serverless.yml file:

...
plugins:
  - serverless-deployment-bucket
  - serverless-localstack

custom:
  localstack:
    stages:
      - local

This sets up Serverless to use the LocalStack plugin, but only for the local stage. To ensure that our Serverless project deploys only to LocalStack, and not to the real AWS Cloud, we need to pass the --stage flag to the serverless deploy command and set it to local.

You can also add serverless deploy --stage local as a deploy script in your package.json, so that you can deploy to your local infrastructure with a single command. The package.json file should look like this:

{
  "name": "serverless-elb",
  "version": "1.0.0",
  "description": "",
  "main": "handler.js",
  "scripts": {
    "deploy": "sls deploy --stage local"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "devDependencies": {
    "serverless": "^3.25.0",
    "serverless-deployment-bucket": "^1.6.0",
    "serverless-localstack": "^1.0.1"
  }
}

You can give this setup a test run by deploying your Serverless project and checking the Lambda function it creates:

$ npm run deploy
$ awslocal lambda list-functions

{
    "Functions": [
        {
            "FunctionName": "serverless-elb-local-hello",
            "FunctionArn": "arn:aws:lambda:us-east-1:000000000000:function:serverless-elb-local-hello",
            "Runtime": "nodejs12.x",
            "Role": "arn:aws:iam::000000000000:role/serverless-elb-local-us-east-1-lambdaRole",
            "Handler": "handler.hello",
            ...
        }
    ]
}

We can now create our target Lambda functions on the Node.js runtime and configure ELB Application Load Balancers via our serverless.yml configuration.

Create Lambda functions & ELB Application Load Balancers

Let us now create two Lambda functions, named hello1 and hello2, which run on the Node.js 12.x runtime. Open handler.js and replace the existing code with the following:

'use strict';

module.exports.hello1 = async (event) => {
  console.log(event);
  return {
    "isBase64Encoded": false,
    "statusCode": 200,
    "statusDescription": "200 OK",
    "headers": {
        "Content-Type": "text/plain"
    },
    "body": "Hello 1"
  };
};

module.exports.hello2 = async (event) => {
  console.log(event);
  return {
    "isBase64Encoded": false,
    "statusCode": 200,
    "statusDescription": "200 OK",
    "headers": {
        "Content-Type": "text/plain"
    },
    "body": "Hello 2"
  };
};

In the above code, we have created the hello1 and hello2 Lambda functions, each returning a response that includes the Base64 encoding status, status code, status description, headers, and body. If you wish to include binary content in the response body, set the isBase64Encoded property to true; the load balancer then decodes the Base64-encoded content before sending it in the body of the HTTP response.
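For instance, a handler returning a small binary payload would Base64-encode the body and set the flag accordingly. The sketch below is illustrative (the payload bytes and Content-Type are placeholders; any Buffer works the same way):

```javascript
// Sketch of a Lambda handler returning binary content through an ALB.
// The payload bytes below are illustrative (the PNG magic number).
const binaryPayload = Buffer.from([0x89, 0x50, 0x4e, 0x47]);

const handler = async (event) => ({
  isBase64Encoded: true, // tells the load balancer to decode the body
  statusCode: 200,
  statusDescription: '200 OK',
  headers: { 'Content-Type': 'application/octet-stream' },
  body: binaryPayload.toString('base64'),
});

handler({}).then((res) => {
  // Decoding the body recovers the original bytes.
  const decoded = Buffer.from(res.body, 'base64');
  console.log(decoded.equals(binaryPayload)); // true
});
```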

Let us now move to the serverless.yml file and specify our deployment bucket and the functions we want to deploy. Our initial configuration in the serverless.yml file should look like this:

service: serverless-elb

provider:
  name: aws
  runtime: nodejs12.x
  deploymentBucket:
    name: testbucket

functions:
  hello1:
    handler: handler.hello1
    events:
    - alb:
        listenerArn: !Ref HTTPListener
        priority: 1
        conditions:
          path: /hello1
  hello2:
    handler: handler.hello2
    events:
    - alb:
        listenerArn: !Ref HTTPListener
        priority: 2
        conditions:
          path: /hello2

plugins:
  - serverless-deployment-bucket
  - serverless-localstack

custom:
  localstack:
    stages:
      - local
...

In the above configuration, we define our Lambda functions hello1 and hello2 and attach each one to an HTTP listener rule that forwards matching requests to the function's target group. We have also specified a deploymentBucket property, which names the bucket that stores the deployment artifacts.
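Conceptually, the listener evaluates its rules in ascending priority order and forwards the request to the first rule whose conditions match, falling back to the default action otherwise. The sketch below mirrors the two alb events above with exact path matching (real ALB path conditions also support wildcards; the rule objects and function are illustrative):

```javascript
// Sketch of ALB listener rule matching: rules are evaluated in
// ascending priority order; the first matching path condition wins.
const rules = [
  { priority: 1, path: '/hello1', target: 'hello1' },
  { priority: 2, path: '/hello2', target: 'hello2' },
];

function matchRule(requestPath) {
  const ordered = [...rules].sort((a, b) => a.priority - b.priority);
  const match = ordered.find((rule) => rule.path === requestPath);
  // Requests matching no rule fall through to the listener's default action.
  return match ? match.target : 'default-action';
}

console.log(matchRule('/hello1')); // hello1
console.log(matchRule('/other'));  // default-action
```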

Let us now create a VPC, a subnet, an Application Load Balancer, and an HTTP listener on the load balancer, whose rules forward the traffic to the target groups. We can do this by adding the following resources to our serverless.yml file:

...
resources:
  Resources:
    LoadBalancer:
      Type: AWS::ElasticLoadBalancingV2::LoadBalancer
      Properties:
        Name: lb-test-1
        Subnets:
          - !Ref Subnet
    HTTPListener:
      Type: AWS::ElasticLoadBalancingV2::Listener
      Properties:
        DefaultActions:
          - Type: redirect
            RedirectConfig:
              Protocol: HTTPS
              Port: 443
              Host: "#{host}"
        LoadBalancerArn: !Ref LoadBalancer
        Protocol: HTTP
    Subnet:
      Type: AWS::EC2::Subnet
      Properties:
        VpcId: !Ref VPC
        CidrBlock: 12.2.1.0/24
        AvailabilityZone: !Select
          - 0
          - Fn::GetAZs: !Ref "AWS::Region"
    VPC:
      Type: AWS::EC2::VPC
      Properties:
        EnableDnsSupport: "true"
        EnableDnsHostnames: "true"
        CidrBlock: 12.2.1.0/24

With this, we have now completed our Serverless project’s configuration. We can now create our local AWS infrastructure on LocalStack and deploy our Application Load Balancers with two Lambda functions as targets.

Creating the infrastructure on LocalStack

Now that the initial setup is done, we can give LocalStack’s AWS emulation a run on our local machine. Let’s start LocalStack:

$ LOCALSTACK_API_KEY=<your-api-key> localstack start -d

We can now deploy our Serverless project and check the created resources in LocalStack. We can do this by running the following command:

$ npm run deploy

> serverless-elb@1.0.0 deploy
> sls deploy --stage local

Using serverless-localstack

Deploying test-elb-load-balancing to stage local (us-east-1)
Creating deployment bucket 'testbucket'...
Using deployment bucket 'testbucket'
Skipping template validation: Unsupported in Localstack

✔ Service deployed to stack test-elb-load-balancing-local (15s)

functions:
  hello1: test-elb-load-balancing-local-hello1 (157 kB)
  hello2: test-elb-load-balancing-local-hello2 (157 kB)

You can check the Lambda functions and the Application Load Balancer created in LocalStack by running the following commands:

$ awslocal lambda list-functions

{
    "Functions": [
        {
            "FunctionName": "test-elb-load-balancing-local-hello1",
            "FunctionArn": "arn:aws:lambda:us-east-1:000000000000:function:test-elb-load-balancing-local-hello1",
            "Runtime": "nodejs12.x",
            "Role": "arn:aws:iam::000000000000:role/test-elb-load-balancing-local-us-east-1-lambdaRole",
            "Handler": "handler.hello1",
            ...
        },
        {
            "FunctionName": "test-elb-load-balancing-local-hello2",
            "FunctionArn": "arn:aws:lambda:us-east-1:000000000000:function:test-elb-load-balancing-local-hello2",
            "Runtime": "nodejs12.x",
            "Role": "arn:aws:iam::000000000000:role/test-elb-load-balancing-local-us-east-1-lambdaRole",
            "Handler": "handler.hello2",
            ...
        }
    ]
}

$ awslocal elbv2 describe-load-balancers
{
    "LoadBalancers": [
        {
            "LoadBalancerArn": "arn:aws:elasticloadbalancing:us-east-1:000000000000:loadbalancer/app/lb-test-1/<ID>",
            "DNSName": "lb-test-1.elb.localhost.localstack.cloud",
            "CanonicalHostedZoneId": "<ID>",
            "CreatedTime": "<TIMESTAMP>",
            "LoadBalancerName": "lb-test-1",
            "Scheme": "None",
            ...
        }
    ]
}

The ALB endpoints for the two Lambda functions we created are accessible at http://lb-test-1.elb.localhost.localstack.cloud:4566/hello1 and http://lb-test-1.elb.localhost.localstack.cloud:4566/hello2. We can test this by running the following commands:

$ curl http://lb-test-1.elb.localhost.localstack.cloud:4566/hello1
Hello 1
$ curl http://lb-test-1.elb.localhost.localstack.cloud:4566/hello2
Hello 2

Conclusion

This tutorial showed how to create an Application Load Balancer with two Lambda functions as targets using LocalStack. We have also seen how to create, configure, and deploy a Serverless project with LocalStack serving as an emulated local AWS environment, allowing you to develop and test your cloud and serverless applications locally.

LocalStack also offers integrations with other popular tools such as Terraform, Pulumi, Serverless Application Model (SAM), and more. You can find more information about LocalStack integrations in our Integration documentation. You can find the code for this tutorial (including a Makefile to execute it step-by-step) in our LocalStack Pro samples on GitHub.


Last modified December 1, 2022: LocalStack Beta Docs (#337) (28576f89)