Hot Reloading
Hot reloading (formerly known as hot swapping) continuously applies code changes to Lambda functions without manual redeployment.
Quickly iterating over Lambda function code can be quite cumbersome, as you need to deploy your function on every change. LocalStack enables fast feedback cycles during development by automatically reloading your function code. Pro users can also hot-reload Lambda layers.
Note
The magic S3 bucket name changed from __local__ to hot-reload in LocalStack 2.0. Please change your deployment configuration accordingly, because the old value is an invalid bucket name. The configuration option BUCKET_MARKER_LOCAL is still supported.
More information about the new Lambda provider is available under Lambda providers.
Covered Topics
- Hot Reloading Behavior
- Application Configuration Examples
- Deployment Configuration Examples
- Useful Links
Hot Reloading Behavior
Delay in code change detection: It can take up to 700ms to detect code changes. In the meantime, invocations still execute the former code. Hot reloading triggers after 500ms without changes, and it can take up to an additional 200ms until the reloaded code is in effect.
Runtime restart after each code change: The runtime inside the container is restarted after every code change. During the runtime restart, any initialization code outside your handler function is re-executed. The container itself is not restarted. Therefore, filesystem changes persist between code changes for invocations dispatched to the same container, as illustrated by the sketch at the end of this section.
File sharing permissions with Docker Desktop on macOS: If using Docker Desktop on macOS, you might need to allow file sharing for your target folders. macOS may prompt you to grant Docker access to your target folders.
Layer limit with hot reloading for layers: When hot reloading is active for a Lambda layer (Pro), the function can use at most one layer.
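To make the restart behavior concrete, here is a small illustrative Python handler (our own sketch, not taken from the AWS examples): the module-level timestamp resets on every code change, while the file written to /tmp keeps growing as long as the same container serves invocations.
import time

# Module-level code runs again after every code change, because the runtime
# inside the container is restarted on each hot reload.
RUNTIME_STARTED_AT = time.time()

def handler(event, context):
    # /tmp belongs to the container, which is NOT restarted on code changes, so
    # this file persists across reloads while the same container is reused.
    with open("/tmp/invocations.log", "a") as log:
        log.write(f"{time.time()}\n")
    with open("/tmp/invocations.log") as log:
        seen = len(log.readlines())
    return {"runtime_started_at": RUNTIME_STARTED_AT, "invocations_in_this_container": seen}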
Application Configuration Examples
Hot reloading for JVM Lambdas
Since the lifetime of Lambda containers is usually limited, regular hot code reloading techniques are not applicable here. In our implementation, we watch for filesystem changes under the project folder, build a FatJar, unzip it, and mount it into the Lambda Docker container.
We assume you already have:
- watchman
- a configured JVM project capable of building FatJars using your preferred build tool
First, create a watchman wrapper by using one of our examples.
Don’t forget to adjust permissions:
$ chmod +x bin/watchman.sh
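If you prefer to write the wrapper yourself, a minimal sketch could look like the following. It assumes watchman-make (which ships with watchman) is on your PATH: the script watches the folder passed as the first argument and re-runs the command passed as the second argument on every change.
#!/bin/bash
# bin/watchman.sh -- minimal sketch of a watchman wrapper (assumes watchman-make is installed)
# Usage: bin/watchman.sh <folder-to-watch> "<command-to-run>"
set -euo pipefail

WATCH_DIR="$1"
BUILD_COMMAND="$2"

# Re-run the build command whenever a file below WATCH_DIR changes.
watchman-make -p "${WATCH_DIR}/**/*" --run "${BUILD_COMMAND}"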
Now configure your build tool to unzip the FatJar into a folder, which will then be mounted into LocalStack. We are using the Gradle build tool to unpack the FatJar into the build/hot folder:
// We assume you are using something like `Shadow` plugin that comes with `shadowJar` task
task buildHot(type: Copy) {
    from zipTree("${project.buildDir}/libs/${project.name}-all.jar")
    into "${project.buildDir}/hot"
}
buildHot.dependsOn shadowJar
Now run the following command to start watching your project in a hot-reloading mode:
$ bin/watchman.sh src "./gradlew buildHot"
Please note that you still need to configure your deployment tool to use local code mounting. Read the Deployment Configuration Examples for more information.
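Alternatively, if you are not using a deployment framework, you can point a function directly at the unpacked FatJar folder with awslocal, analogous to the Python and TypeScript examples below. This is only a sketch: the handler class is borrowed from the Terraform example further down, so adjust the handler, runtime, and function name to your project.
$ awslocal lambda create-function --function-name my-cool-jvm-function \
    --code S3Bucket="hot-reload",S3Key="$(pwd)/build/hot" \
    --handler org.localstack.sampleproject.api.LambdaApi \
    --runtime java11 \
    --role arn:aws:iam::000000000000:role/lambda-role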
Hot reloading for Python Lambdas
We will show you how you can do this with a simple example function, taken directly from the AWS Lambda developer guide.
You can check out that code, or use your own Lambda functions to follow along. To use the example, run:
$ cd /tmp
$ git clone git@github.com:awsdocs/aws-doc-sdk-examples.git
Creating the Lambda Function
To create the Lambda function, you just need to take care of two things:
- Deploy via an S3 Bucket. You need to use the magic variable hot-reload as the bucket.
- Set the S3 key to the path of the directory your Lambda function resides in. The handler is then referenced by the filename of your Lambda code and the function in that code that needs to be invoked.
So, using the AWS example, this would be:
$ awslocal lambda create-function --function-name my-cool-local-function \
--code S3Bucket="hot-reload",S3Key="/tmp/aws-doc-sdk-examples/python/example_code/lambda/boto_client_examples" \
--handler lambda_handler_basic.lambda_handler \
--runtime python3.8 \
--role arn:aws:iam::000000000000:role/lambda-role
You can also check out some of our Deployment Configuration Examples.
We can also quickly make sure that it works by invoking it with a simple payload:
$ awslocal lambda invoke --function-name my-cool-local-function --payload '{"action": "square", "number": 3}' output.txt
The invocation itself returns:
{
    "StatusCode": 200,
    "LogResult": "",
    "ExecutedVersion": "$LATEST"
}
and output.txt
contains:
{"result":9}
Changing things up
Now that we have everything up and running, the fun begins. Because the function is now mounted as a file in the executing container, any change we save to the file is reflected instantly.
For example, we can now make a minor change to the API and replace the response in line 41 with the following:
response = {'math_result': result}
Without redeploying or updating the function, repeating the previous request now returns:
{"math_result":9}
Cool!
Usage with Virtualenv
For virtualenv-driven projects, all dependencies should be made available to the Python interpreter at runtime. There are different ways to achieve that, including:
- expanding the Python module search path in your Lambda handler
- creating a watchman script to copy the libraries
Expanding the module search path in your Lambda handler
The easiest approach is to expand the module search path (sys.path) and add the site-packages folder inside the virtualenv.
We can add the following two lines of code at the top of the Lambda handler script:
import sys, glob
sys.path.insert(0, glob.glob(".venv/lib/python*/site-packages")[0])
...
import some_lib_from_virtualenv # import your own modules here
This way you can easily import modules from your virtualenv, without having to change the file system layout.
Note: As an alternative to modifying sys.path, you could also set the PYTHONPATH environment variable when creating your Lambda function, to add the additional path.
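For example, the create-function call from above could be extended with such a variable. This is only a sketch: it assumes your handler file and .venv folder live in the directory you mount ($(pwd)), and the site-packages path depends on your Python version and virtualenv layout.
$ awslocal lambda create-function --function-name my-cool-local-function \
    --code S3Bucket="hot-reload",S3Key="$(pwd)" \
    --handler lambda_handler_basic.lambda_handler \
    --runtime python3.8 \
    --environment 'Variables={PYTHONPATH=.venv/lib/python3.8/site-packages}' \
    --role arn:aws:iam::000000000000:role/lambda-role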
Using a watchman script to copy libraries
Another alternative is to implement a watchman script that prepares a dedicated folder for hot code reloading. In our example, we use the build/hot folder as the mounting point for our Lambdas.
First, create a watchman wrapper by using one of our examples.
After that, you can use the following Makefile
snippet, or implement another shell script to prepare the codebase for hot reloading:
# VENV_DIR: adjust to the location of your virtualenv
VENV_DIR ?= .venv
BUILD_FOLDER ?= build
PROJECT_MODULE_NAME = my_project_module

build-hot:
	rm -rf $(BUILD_FOLDER)/hot && mkdir -p $(BUILD_FOLDER)/hot
	cp -r $(VENV_DIR)/lib/python$(shell python --version | grep -oE '[0-9]+\.[0-9]+')/site-packages/* $(BUILD_FOLDER)/hot/
	cp -r $(PROJECT_MODULE_NAME) $(BUILD_FOLDER)/hot/$(PROJECT_MODULE_NAME)
	cp *.toml $(BUILD_FOLDER)/hot

watch:
	bin/watchman.sh $(PROJECT_MODULE_NAME) "make build-hot"

.PHONY: build-hot watch
To run the example above, run make watch. The script copies the project module PROJECT_MODULE_NAME along with all its dependencies into the build/hot folder, which is then mounted into LocalStack's Lambda container.
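To wire that folder up to a function, you can again use the hot-reload bucket, for example with awslocal. The module and handler names below are placeholders for your project:
$ awslocal lambda create-function --function-name my-project-function \
    --code S3Bucket="hot-reload",S3Key="$(pwd)/build/hot" \
    --handler my_project_module.handler.lambda_handler \
    --runtime python3.8 \
    --role arn:aws:iam::000000000000:role/lambda-role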
Hot reloading for TypeScript Lambdas
You can hot-reload your TypeScript Lambda functions. We will walk through a simple example that creates a Hello World! Lambda function using TypeScript.
Setting up the Lambda function
Create a new Node.js project with npm
or an alternative package manager:
$ npm init -y
Install the @types/aws-lambda and esbuild packages in your Node.js project:
$ npm install -D @types/aws-lambda esbuild
Create a new file named index.ts
. Add the following code to the new file:
import { Context, APIGatewayProxyResult, APIGatewayEvent } from 'aws-lambda';

export const handler = async (event: APIGatewayEvent, context: Context): Promise<APIGatewayProxyResult> => {
  console.log(`Event: ${JSON.stringify(event, null, 2)}`);
  console.log(`Context: ${JSON.stringify(context, null, 2)}`);
  return {
    statusCode: 200,
    body: JSON.stringify({
      message: 'Hello World!',
    }),
  };
};
Add a build script to your package.json
file:
"scripts": {
"build": "esbuild index.ts --bundle --minify --sourcemap --platform=node --target=es2020 --outfile=dist/index.js --watch"
},
The build script will use esbuild
to bundle and minify the TypeScript code into a single JavaScript file, which will be placed in the dist
folder.
You can now run the build script to create the dist/index.js
file:
$ npm run build
Creating the Lambda Function
To create the Lambda function, you need to take care of two things:
- Deploy via an S3 Bucket. You need to use the magic variable hot-reload as the bucket.
- Set the S3 key to the path of the directory your Lambda function resides in. The handler is then referenced by the filename of your Lambda code and the function in that code that needs to be invoked.
Create the Lambda Function using the awslocal
CLI:
$ awslocal lambda create-function \
    --function-name hello-world \
    --runtime "nodejs16.x" \
    --role arn:aws:iam::123456789012:role/lambda-ex \
    --code S3Bucket="hot-reload",S3Key="$(pwd)/dist" \
    --handler index.handler
You can quickly make sure that it works by invoking it with a simple payload:
$ awslocal lambda invoke \
--function-name hello-world \
--payload '{"action": "test"}' output.txt
The invocation itself returns:
{
    "StatusCode": 200,
    "ExecutedVersion": "$LATEST"
}
The output.txt
file contains the following:
{"statusCode":200,"body":"{\"message\":\"Hello World!\"}"}
Changing the Lambda Function
The Lambda function is now mounted as a file in the executing container, hence any change we save to the file is picked up instantly.
Change the Hello World! message to Hello LocalStack! and run npm run build. Trigger the Lambda once again. You will see the following in the output.txt file:
{"statusCode":200,"body":"{\"message\":\"Hello LocalStack!\"}"}
Deployment Configuration Examples
Serverless Framework Configuration
Enable local code mounting:
custom:
  localstack:
    ...
    lambda:
      mountCode: true

# or if you need to enable code mounting only for specific stages
custom:
  stages:
    local:
      mountCode: true
    testing:
      mountCode: false
  localstack:
    stages:
      - local
      - testing
    lambda:
      mountCode: ${self:custom.stages.${opt:stage}.mountCode}
Pass the LAMBDA_MOUNT_CWD environment variable with the path to the built code directory (in our case, the folder with the unzipped FatJar):
$ LAMBDA_MOUNT_CWD=$(pwd)/build/hot serverless deploy --stage local
AWS Cloud Development Kit (CDK) Configuration
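A minimal stack could look like the following sketch. It assumes CDK v2 (aws-cdk-lib), reuses the handler and function name from the Terraform example below, and reads the local code path from the LAMBDA_MOUNT_CWD environment variable; the stack and construct names are illustrative.
import * as cdk from 'aws-cdk-lib';
import * as lambda from 'aws-cdk-lib/aws-lambda';
import * as s3 from 'aws-cdk-lib/aws-s3';
import { Construct } from 'constructs';

export class HotReloadSampleStack extends cdk.Stack {
  constructor(scope: Construct, id: string, props?: cdk.StackProps) {
    super(scope, id, props);

    // The magic "hot-reload" bucket tells LocalStack to mount the S3 key as a local path.
    const hotReloadBucket = s3.Bucket.fromBucketName(this, 'HotReloadBucket', 'hot-reload');

    new lambda.Function(this, 'ExampleFunctionOne', {
      functionName: 'ExampleFunctionOne',
      // Local path to the unpacked FatJar, passed in via LAMBDA_MOUNT_CWD.
      code: lambda.Code.fromBucket(hotReloadBucket, process.env.LAMBDA_MOUNT_CWD!),
      handler: 'org.localstack.sampleproject.api.LambdaApi',
      runtime: lambda.Runtime.JAVA_11,
      timeout: cdk.Duration.seconds(30),
    });
  }
}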
Then bootstrap and deploy the stack with the following shell commands:
$ export STAGE=local LAMBDA_MOUNT_CWD=$(pwd)/build/hot
$ cdklocal bootstrap aws://000000000000/$AWS_REGION && \
    cdklocal deploy
Terraform Configuration
variable "STAGE" {
  type    = string
  default = "local"
}

variable "AWS_REGION" {
  type    = string
  default = "us-east-1"
}

variable "JAR_PATH" {
  type    = string
  default = "build/libs/localstack-sampleproject-all.jar"
}

variable "LAMBDA_MOUNT_CWD" {
  type = string
}

provider "aws" {
  access_key                  = "test_access_key"
  secret_key                  = "test_secret_key"
  region                      = var.AWS_REGION
  s3_force_path_style         = true
  skip_credentials_validation = true
  skip_metadata_api_check     = true
  skip_requesting_account_id  = true

  endpoints {
    apigateway       = var.STAGE == "local" ? "http://localhost:4566" : null
    cloudformation   = var.STAGE == "local" ? "http://localhost:4566" : null
    cloudwatch       = var.STAGE == "local" ? "http://localhost:4566" : null
    cloudwatchevents = var.STAGE == "local" ? "http://localhost:4566" : null
    iam              = var.STAGE == "local" ? "http://localhost:4566" : null
    lambda           = var.STAGE == "local" ? "http://localhost:4566" : null
    s3               = var.STAGE == "local" ? "http://localhost:4566" : null
  }
}

resource "aws_iam_role" "lambda-execution-role" {
  name = "lambda-execution-role"

  assume_role_policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": "sts:AssumeRole",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      },
      "Effect": "Allow",
      "Sid": ""
    }
  ]
}
EOF
}

resource "aws_lambda_function" "exampleFunctionOne" {
  s3_bucket        = var.STAGE == "local" ? "hot-reload" : null
  s3_key           = var.STAGE == "local" ? var.LAMBDA_MOUNT_CWD : null
  filename         = var.STAGE == "local" ? null : var.JAR_PATH
  function_name    = "ExampleFunctionOne"
  role             = aws_iam_role.lambda-execution-role.arn
  handler          = "org.localstack.sampleproject.api.LambdaApi"
  runtime          = "java11"
  timeout          = 30
  source_code_hash = filebase64sha256(var.JAR_PATH)

  environment {
    variables = {
      FUNCTION_NAME = "functionOne"
    }
  }
}
$ terraform init && \
terraform apply -var "STAGE=local" -var "LAMBDA_MOUNT_CWD=$(pwd)/build/hot"