CodePipeline
Introduction
CodePipeline is a continuous integration/continuous delivery (CI/CD) service offered by AWS. It can be used to create automated pipelines that handle the build, test, and deployment of software.
LocalStack comes with a bespoke execution engine that can be used to create, manage, and execute pipelines. It supports a variety of actions that integrate with S3, CodeBuild, CodeConnections, and more. The available operations can be found on the API coverage page.
Getting started
In this guide, we will create a simple pipeline that fetches an object from an S3 bucket and uploads it to a different S3 bucket.
It is intended for users who are new to CodePipeline and have a basic knowledge of the AWS CLI and the awslocal wrapper.
Start LocalStack using your preferred method.
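For example, with the LocalStack CLI:
localstack start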
Create prerequisite buckets
Begin by creating the S3 buckets that will serve as the source and target.
awslocal s3 mb s3://source-bucket
awslocal s3 mb s3://target-bucket
Note that CodePipeline requires source S3 buckets to have versioning enabled. This can be done using the S3 PutBucketVersioning operation.
awslocal s3api put-bucket-versioning \
  --bucket source-bucket \
  --versioning-configuration Status=Enabled
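To confirm that versioning is enabled, you can query the bucket's versioning configuration:
awslocal s3api get-bucket-versioning --bucket source-bucket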
Now create a placeholder file that will flow through the pipeline and upload it to the source bucket.
echo "Hello LocalStack!" > fileawslocal s3 cp file s3://source-bucket
Pipelines also require an artifact store: an S3 bucket that is used as intermediate storage between stages.
awslocal s3 mb s3://artifact-store-bucket
Configure IAM
Depending on the specifics of the declaration, CodePipeline pipelines need access to other AWS services. In this case, we want our pipeline to retrieve files from and upload files to S3. This requires a properly configured IAM role that the pipeline can assume.
Save the following trust policy to a file named role.json:
{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Principal": { "Service": "codepipeline.amazonaws.com" }, "Action": "sts:AssumeRole" } ]}
Create the role with the following command and make note of the role ARN:
awslocal iam create-role \
  --role-name role \
  --assume-role-policy-document file://role.json | jq .Role.Arn
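With LocalStack's default account ID, the output will be:
"arn:aws:iam::000000000000:role/role"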
Now add a permissions policy to this role that permits read and write access to S3. Save the following policy as policy.json:
{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "s3:*" ], "Resource": "*" } ]}
The permissions in this example policy are relatively broad. On production systems, consider using a more tightly scoped policy for better security.
awslocal iam put-role-policy \
  --role-name role \
  --policy-name policy \
  --policy-document file://policy.json
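You can verify that the policy is attached with:
awslocal iam get-role-policy --role-name role --policy-name policy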
Create pipeline
Now we can turn our attention to the pipeline declaration.
A pipeline declaration is used to define the structure of actions and stages to be performed. The following pipeline defines two stages with one action each. There is a source action which retrieves a file from an S3 bucket and marks it as the output. The output is placed in the intermediate bucket until it is picked up by the action in the second stage. This is a deploy action which uploads the file to the target bucket.
Pay special attention to roleArn and artifactStore.location, as well as S3Bucket, S3ObjectKey, and BucketName. These correspond to the resources we created earlier. Save the following declaration as declaration.json:
{ "name": "pipeline", "executionMode": "SUPERSEDED", "pipelineType": "V1", "roleArn": "arn:aws:iam::000000000000:role/role", "artifactStore": { "type": "S3", "location": "artifact-store-bucket" }, "version": 1, "stages": [ { "name": "stage1", "actions": [ { "name": "action1", "actionTypeId": { "category": "Source", "owner": "AWS", "provider": "S3", "version": "1" }, "runOrder": 1, "configuration": { "S3Bucket": "source-bucket", "S3ObjectKey": "file", "PollForSourceChanges": "false" }, "outputArtifacts": [ { "name": "intermediate-file" } ], "inputArtifacts": [] } ] }, { "name": "stage2", "actions": [ { "name": "action1", "actionTypeId": { "category": "Deploy", "owner": "AWS", "provider": "S3", "version": "1" }, "runOrder": 1, "configuration": { "BucketName": "target-bucket", "Extract": "false", "ObjectKey": "output-file" }, "inputArtifacts": [ { "name": "intermediate-file" } ], "outputArtifacts": [] } ] } ]}
Create the pipeline using the following command:
awslocal codepipeline create-pipeline --pipeline file://./declaration.json
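You can confirm that the pipeline was created by retrieving its declaration:
awslocal codepipeline get-pipeline --name pipeline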
Verify pipeline execution
A ‘pipeline execution’ is an instance of a pipeline in a running or finished state.
The CreatePipeline operation we ran earlier started a pipeline execution. This can be confirmed using:
awslocal codepipeline list-pipeline-executions --pipeline-name pipeline
{ "pipelineExecutionSummaries": [ { "pipelineExecutionId": "37e8eb2e-0ed9-447a-a016-8dbbd796bfe7", "status": "Succeeded", "startTime": 1745486647.138571, "lastUpdateTime": 1745486648.290341, "trigger": { "triggerType": "CreatePipeline" }, "executionMode": "SUPERSEDED" } ]}
Note that the trigger.triggerType field specifies what initiated the pipeline execution. Currently, LocalStack implements only two triggers: CreatePipeline and StartPipelineExecution.
The above pipeline execution was successful. This means that we can retrieve the output-file object from the target-bucket S3 bucket.
awslocal s3 cp s3://target-bucket/output-file output-file
To verify that it is the same file as the original input:
cat output-file
The output will be:
Hello LocalStack!
Examine action executions
Using the ListActionExecutions operation, you can retrieve detailed information about each action execution, such as its inputs and outputs. This is useful when debugging the pipeline.
awslocal codepipeline list-action-executions --pipeline-name pipeline
{ "actionExecutionDetails": [ { "pipelineExecutionId": "37e8eb2e-0ed9-447a-a016-8dbbd796bfe7", "actionExecutionId": "e38716df-645e-43ce-9597-104735c7f92c", "pipelineVersion": 1, "stageName": "stage2", "actionName": "action1", "startTime": 1745486647.269867, "lastUpdateTime": 1745486647.289813, "status": "Succeeded", "input": { "actionTypeId": { "category": "Deploy", "owner": "AWS", "provider": "S3", "version": "1" }, "configuration": { "BucketName": "target-bucket", "Extract": "false", "ObjectKey": "output-file" }, "resolvedConfiguration": { "BucketName": "target-bucket", "Extract": "false", "ObjectKey": "output-file" }, "region": "eu-central-1", "inputArtifacts": [ { "name": "intermediate-file", "s3location": { "bucket": "artifact-store-bucket", "key": "pipeline/intermediate-file/01410aa4.zip" } } ] }, "output": { "outputArtifacts": [], "executionResult": { "externalExecutionId": "bcff0781", "externalExecutionSummary": "Deployment Succeeded" }, "outputVariables": {} } }, { "pipelineExecutionId": "37e8eb2e-0ed9-447a-a016-8dbbd796bfe7", "actionExecutionId": "ae99095a-1d43-46ee-8a48-c72b6d60021e", "pipelineVersion": 1, "stageName": "stage1", "actionName": "action1", ...
Pipelines
The operations CreatePipeline, GetPipeline, UpdatePipeline, ListPipelines, and DeletePipeline are used to manage pipeline declarations.
LocalStack supports emulation for V1 pipelines. V2 pipelines are only created as mocks.
Pipeline executions can be managed with the StartPipelineExecution, StopPipelineExecution, GetPipelineExecution, and ListPipelineExecutions operations.
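For example, a new execution can be started manually as follows:
awslocal codepipeline start-pipeline-execution --name pipeline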
When stopping pipeline executions with StopPipelineExecution, the ‘stop and abandon’ method is not supported; setting the abandon flag will have no impact. This is because LocalStack uses threads as the underlying mechanism to simulate pipelines, and threads cannot be cleanly preempted.
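For illustration, the following stops a running execution; the --abandon flag is accepted but, as noted above, has no effect in LocalStack (the execution ID is a placeholder):
awslocal codepipeline stop-pipeline-execution \
  --pipeline-name pipeline \
  --pipeline-execution-id <execution-id> \
  --abandon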
Action executions can be inspected using the ListActionExecutions operation.
Tagging pipelines
Pipeline resources can be tagged using the TagResource, ListTagsForResource, and UntagResource operations.
Tag the pipeline with the following command:
awslocal codepipeline tag-resource \
  --resource-arn arn:aws:codepipeline:eu-central-1:000000000000:pipeline \
  --tags key=purpose,value=tutorial
List the tags with the following command:
awslocal codepipeline list-tags-for-resource \
  --resource-arn arn:aws:codepipeline:eu-central-1:000000000000:pipeline
{ "tags": [ { "key": "purpose", "value": "tutorial" } ]}
Untag the pipeline with the following command:
awslocal codepipeline untag-resource \
  --resource-arn arn:aws:codepipeline:eu-central-1:000000000000:pipeline \
  --tag-keys purpose
Variables
CodePipeline on LocalStack supports variables, which allow dynamic configuration of pipeline actions.
Actions produce output variables which can be referenced in the configuration of subsequent actions. Note that an action's output variables are available to downstream actions only when the action defines a namespace.
CodePipeline’s variable placeholder syntax is as follows:
#{namespace.variable}
As with AWS, LocalStack only makes the codepipeline.PipelineExecutionId variable available by default in a pipeline.
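As an illustrative sketch, this default variable can be interpolated into an action's configuration, for example to give each deployed object a unique key (the ObjectKey value here is hypothetical):
"configuration": {
  "BucketName": "target-bucket",
  "Extract": "false",
  "ObjectKey": "output-file-#{codepipeline.PipelineExecutionId}"
}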
Actions
You can use runOrder to control whether actions within a stage run in parallel or sequentially.
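For example, in the following sketch (action bodies elided, names are placeholders), the two actions with runOrder 1 start in parallel, while the action with runOrder 2 runs only after both have completed:
"actions": [
  { "name": "build-a", "runOrder": 1, ... },
  { "name": "build-b", "runOrder": 1, ... },
  { "name": "deploy", "runOrder": 2, ... }
]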
The supported actions in LocalStack CodePipeline are listed below. Using an unsupported action will make the pipeline fail. If you would like support for more actions, please raise a feature request.
CloudFormation Deploy
The CloudFormation Deploy action executes a CloudFormation stack. It supports the following modes: CREATE_UPDATE, CHANGE_SET_REPLACE, and CHANGE_SET_EXECUTE.
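A minimal sketch of such an action, assuming a hypothetical stack name and a template file inside the input artifact; the configuration keys follow the AWS CloudFormation action reference:
{
  "name": "deploy-stack",
  "actionTypeId": {
    "category": "Deploy",
    "owner": "AWS",
    "provider": "CloudFormation",
    "version": "1"
  },
  "configuration": {
    "ActionMode": "CREATE_UPDATE",
    "StackName": "my-stack",
    "TemplatePath": "intermediate-file::template.yaml",
    "RoleArn": "arn:aws:iam::000000000000:role/role"
  },
  "inputArtifacts": [
    { "name": "intermediate-file" }
  ]
}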
CodeBuild Build and Test
The CodeBuild Build and Test action can be used to start a CodeBuild container and run the given buildspec.
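A minimal sketch, assuming a CodeBuild project named my-project already exists; ProjectName is the action's required configuration key:
{
  "name": "build",
  "actionTypeId": {
    "category": "Build",
    "owner": "AWS",
    "provider": "CodeBuild",
    "version": "1"
  },
  "configuration": {
    "ProjectName": "my-project"
  },
  "inputArtifacts": [
    { "name": "intermediate-file" }
  ],
  "outputArtifacts": [
    { "name": "build-output" }
  ]
}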
CodeConnections Source
The CodeConnections Source action is used to specify a VCS repo as the input to the pipeline.
LocalStack supports integration only with GitHub at this time.
Please set the environment configuration option CODEPIPELINE_GH_TOKEN with a GitHub Personal Access Token to be able to fetch private repositories.
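A minimal sketch, assuming an existing connection and a hypothetical GitHub repository; the keys follow the AWS CodeStarSourceConnection action reference:
{
  "name": "source",
  "actionTypeId": {
    "category": "Source",
    "owner": "AWS",
    "provider": "CodeStarSourceConnection",
    "version": "1"
  },
  "configuration": {
    "ConnectionArn": "arn:aws:codeconnections:eu-central-1:000000000000:connection/<connection-id>",
    "FullRepositoryId": "my-org/my-repo",
    "BranchName": "main"
  },
  "outputArtifacts": [
    { "name": "source-output" }
  ]
}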
ECR Source
The ECR Source action is used to specify an Elastic Container Registry image as a source artifact.
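A minimal sketch, assuming a hypothetical ECR repository named my-repo; per the AWS action reference, RepositoryName is required and ImageTag defaults to latest:
{
  "name": "source",
  "actionTypeId": {
    "category": "Source",
    "owner": "AWS",
    "provider": "ECR",
    "version": "1"
  },
  "configuration": {
    "RepositoryName": "my-repo",
    "ImageTag": "latest"
  },
  "outputArtifacts": [
    { "name": "image-details" }
  ]
}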
ECS CodeDeploy Blue/Green
The ECS CodeDeploy Blue/Green action is used to deploy a containerized application using a blue/green deployment.
LocalStack does not accurately emulate a blue/green deployment due to limitations in ELB and ECS. It will only update the running ECS service with a new task definition and wait for the service to be stable.
ECS Deploy
The ECS Deploy action creates a revision of a task definition based on an already deployed ECS service.
Lambda Invoke
The Lambda Invoke action is used to execute a Lambda function in a pipeline.
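A minimal sketch, assuming a hypothetical Lambda function named my-function; FunctionName is required and UserParameters is an optional string passed to the function:
{
  "name": "invoke",
  "actionTypeId": {
    "category": "Invoke",
    "owner": "AWS",
    "provider": "Lambda",
    "version": "1"
  },
  "configuration": {
    "FunctionName": "my-function",
    "UserParameters": "{\"key\": \"value\"}"
  },
  "inputArtifacts": [],
  "outputArtifacts": []
}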
Manual Approval
The Manual Approval action can be included in the pipeline declaration, but it will only function as a no-op.
S3 Deploy
The S3 Deploy action is used to upload artifacts to a given S3 bucket as the output of the pipeline.
S3 Source
The S3 Source action is used to specify an S3 bucket object as input to the pipeline.
Limitations
- Emulation for V2 pipeline types is not supported. They will be created as mocks only.
- Rollbacks and stage retries are not available.
- Custom actions and associated operations (AcknowledgeJob, GetJobDetails, PollForJobs, etc.) are not supported.
- Triggers are not implemented. Pipelines are executed only when CreatePipeline and StartPipelineExecution are invoked.
- Execution mode behaviours are not implemented. Parallel pipeline executions will not lead to stage locks and waits.
- Stage transition controls are not implemented.
- Manual approval action and PutApprovalResult operations are not available.
API Coverage
Section titled “API Coverage”Operation ▲ | Implemented | Image |
---|