Bootstrapping AWS account with CloudFormation and CodePipeline
Saturday, 09 June 2018, 08:45
Yes, I love CloudFormation when working with AWS. With all its limitations and quirks, it's a really simple and extensible tool that lets you fully manage your resources in an automated, Infrastructure-as-Code approach. If you create a project from scratch, you probably want to automate as much as possible - and if you decide on CloudFormation, you probably wonder how much of your AWS cloud you can provision with it. The thing is: you can manage everything - with just a single command it's possible to bootstrap your account in a fully automated manner.
Chicken and egg problem
So at the beginning we face a classic problem - what comes first? And there are not just two contenders; to automate the process you need:
- an automation pipeline - but where do you define it and store the artifacts?
- a CloudFormation stack to provision the resources - but how do you automate it when nothing is in place yet?
- S3 storage for the template files - but how do you provision an S3 bucket before the stack exists?
Concept
There is a way to untangle it all. Let's go through these points step by step:
- AWS provides an automation pipeline service - CodePipeline - which can itself be managed with CloudFormation (it keeps artifacts in an S3 bucket, which you can define in the same template as the pipeline);
- CloudFormation can be automated with CodePipeline - here starts the twisted part: CloudFormation can define the CodePipeline pipeline, and that very pipeline can handle updates of the stack that defines it;
- the S3 bucket is in fact not required up front - it's mainly needed so external tools can upload data to the AWS cloud before triggering CloudFormation; internal AWS tools like CodePipeline or the AWS CLI can pass the template body directly (its size is then limited to 51,200 bytes, but that should be plenty - see the sketch after this list).

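To make the third point concrete, here is a quick sketch of the two invocation styles - the direct body upload versus an S3-hosted template (the bucket in the URL is purely hypothetical):
# direct upload - no S3 bucket involved, body limited to 51,200 bytes
aws cloudformation validate-template --template-body "file://bootstrap.yaml"
# S3-hosted - the route you need for larger or external templates
aws cloudformation validate-template \
    --template-url "https://s3.amazonaws.com/templates-bucket/bootstrap.yaml"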
Now the question is - do we start with CodePipeline or with CloudFormation? And there is a very precise answer to it - CloudFormation. The reason is simple: CloudFormation can't adopt existing resources, it needs to create them from the beginning; CodePipeline, on the other hand, doesn't care - its CREATE_UPDATE action mode detects whether the stack already exists. It is also easier this way - creating a CloudFormation stack just requires pointing at a template file, while creating the pipeline by hand would require much more work.
Last issue to address: aren't we creating an infinite loop here? No - when CloudFormation updates CodePipeline, it doesn't trigger the pipeline (note the RestartExecutionOnUpdate: false property in the template below). The pipeline is only triggered by source changes, not configuration changes.
GitHub integration
Just as a side note: I will use GitHub as my source repository, which means I need to set up an authentication token; but CodePipeline fully supports it, so it's practically zero-effort.
Bootstrapping
OK, the theory looks fine - now it's time to get our hands dirty. We need to model the pipeline from the concept above in a CloudFormation template (let's assume we keep it in a bootstrap.yaml file):
Parameters:
    GitHubOwner:
        Description: "GitHub user or organization name."
        Type: "String"
        Default: "yourGitHubLogin"
    GitHubRepo:
        Description: "GitHub repository name."
        Type: "String"
        Default: "yourGitHubRepository"
    GitHubOAuthToken:
        Description: "AWS CodePipeline OAuth access token to GitHub."
        Type: "String"
        Default: "yourGitHubOAuthToken"
Resources:
    PipelineArtifactsBucket:
        Type: "AWS::S3::Bucket"
        Properties:
            # S3 bucket names are global - pick a unique one
            BucketName: "my-code-pipeline-artifacts"
            AccessControl: "Private"
    # this is the role for CodePipeline - it's used to execute pipeline stages
    PipelineRole:
        Type: "AWS::IAM::Role"
        Properties:
            RoleName: "my-codepipeline"
            AssumeRolePolicyDocument:
                Statement:
                    -
                        Action:
                            - "sts:AssumeRole"
                        Effect: "Allow"
                        Principal:
                            Service:
                                - "codepipeline.amazonaws.com"
            Policies:
                -
                    PolicyName: "AllowS3PipelinesStorage"
                    PolicyDocument:
                        Version: "2012-10-17"
                        Statement:
                            -
                                Action:
                                    - "s3:GetObject"
                                    - "s3:PutObject"
                                Effect: "Allow"
                                Resource:
                                    - !Sub "arn:aws:s3:::${PipelineArtifactsBucket}"
                                    - !Sub "arn:aws:s3:::${PipelineArtifactsBucket}/*"
                -
                    PolicyName: "AllowRunningCloudFormation"
                    PolicyDocument:
                        Version: "2012-10-17"
                        Statement:
                            -
                                Action:
                                    - "cloudformation:CreateStack"
                                    - "cloudformation:DescribeStacks"
                                    - "cloudformation:UpdateStack"
                                Effect: "Allow"
                                Resource:
                                    # stack ARNs carry a unique ID suffix, hence the wildcard
                                    - !Sub "arn:aws:cloudformation:${AWS::Region}:${AWS::AccountId}:stack/${AWS::StackName}/*"
                            -
                                Action:
                                    - "iam:PassRole"
                                Effect: "Allow"
                                Resource:
                                    - !GetAtt "DeployRole.Arn"
    # this is the role for CloudFormation - it will be used to perform stack operations
    DeployRole:
        Type: "AWS::IAM::Role"
        Properties:
            RoleName: "my-bootstrap-deploy"
            AssumeRolePolicyDocument:
                Statement:
                    -
                        Action:
                            - "sts:AssumeRole"
                        Effect: "Allow"
                        Principal:
                            Service:
                                - "cloudformation.amazonaws.com"
            # just to make this example compact I give full rights, feel free to limit this role
            ManagedPolicyArns:
                - "arn:aws:iam::aws:policy/AdministratorAccess"
    # just a minimal pipeline to apply stack changes
    BootstrapPipeline:
        Type: "AWS::CodePipeline::Pipeline"
        Properties:
            Name: "my-bootstrap"
            RoleArn: !GetAtt "PipelineRole.Arn"
            ArtifactStore:
                Type: "S3"
                Location: !Ref "PipelineArtifactsBucket"
            # don't re-run the pipeline when CloudFormation updates its definition
            RestartExecutionOnUpdate: false
            Stages:
                -
                    Name: "Source"
                    Actions:
                        -
                            Name: "Checkout"
                            ActionTypeId:
                                Category: "Source"
                                Owner: "ThirdParty"
                                Provider: "GitHub"
                                Version: "1"
                            Configuration:
                                Owner: !Ref "GitHubOwner"
                                Repo: !Ref "GitHubRepo"
                                # change it to fit your needs
                                Branch: "master"
                                OAuthToken: !Ref "GitHubOAuthToken"
                            OutputArtifacts:
                                -
                                    Name: "checkout"
                -
                    Name: "Infrastructure"
                    Actions:
                        -
                            Name: "Bootstrap"
                            ActionTypeId:
                                Category: "Deploy"
                                Owner: "AWS"
                                Provider: "CloudFormation"
                                Version: "1"
                            Configuration:
                                ActionMode: "CREATE_UPDATE"
                                StackName: !Ref "AWS::StackName"
                                RoleArn: !GetAtt "DeployRole.Arn"
                                Capabilities: "CAPABILITY_NAMED_IAM"
                                TemplatePath: "checkout::bootstrap.yaml"
                            InputArtifacts:
                                -
                                    Name: "checkout"
Everything is ready, let's go - just execute this one command and your account will be completely provisioned with CloudFormation, with the CodePipeline automation in place:
aws cloudformation create-stack \
    --stack-name "bootstrap-stack" \
    --template-body "file://bootstrap.yaml" \
    --capabilities CAPABILITY_NAMED_IAM
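If you'd rather not keep the token as a template default, the same command accepts explicit parameter overrides, and you can block until the bootstrap finishes - a small sketch using standard CLI options:
aws cloudformation create-stack \
    --stack-name "bootstrap-stack" \
    --template-body "file://bootstrap.yaml" \
    --capabilities CAPABILITY_NAMED_IAM \
    --parameters ParameterKey=GitHubOAuthToken,ParameterValue=yourGitHubOAuthToken
# wait for CREATE_COMPLETE (the command fails if the stack rolls back)
aws cloudformation wait stack-create-complete --stack-name "bootstrap-stack"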
Multiple stacks

Fine, but what can I do from here? Virtually everything. From now on, every commit to the repository (to the specified branch) will trigger an update of the stack, and you can start adding more resources. A good idea is to keep the bootstrap stack as minimal as possible - instead, add more stacks to your pipeline. You can even integrate these stacks by passing values between them using cross-stack references (you could also do that through the pipeline, but this time I want to present the solution with cross-stack references).
The good thing about cross-stack references is that they are just there - you can use them in any number of stacks. Of course it all comes at some cost: remember that cross-stack references are a contract, and once a value is imported anywhere in another stack, CloudFormation will not allow you to change it. If you want to build a longer deployment chain, e.g. another pipeline for application deployment with multiple stages, cross-stack references are the only way. To avoid future conflicts, it's good to keep them versioned from the beginning. In this example I try to keep it all simple, but you could rename ArtifactsBucketName into ArtifactsBucketName:v1.
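A sketch of what the versioned export could look like (the consumer side would then use !ImportValue "ArtifactsBucketName:v1"):
Outputs:
    ArtifactsBucketName:
        Value: !Ref "PipelineArtifactsBucket"
        Export:
            # the suffix leaves room for a parallel "ArtifactsBucketName:v2" later
            Name: "ArtifactsBucketName:v1"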
Of course this is a minimal example - small templates that fit within the direct-upload size restriction of 51,200 bytes. If you want to use external or larger templates, you will need to upload them to S3 first. For that you can use a simple CodeBuild project that just runs aws s3 sync local-templates/ s3://templates-bucket/ before executing the stack update.
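A rough sketch of the buildspec for such a CodeBuild project (version 0.2 syntax; the bucket name is hypothetical, as in the command above):
version: 0.2
phases:
    build:
        commands:
            # publish templates so CloudFormation can reach them via an S3 URL
            - "aws s3 sync local-templates/ s3://templates-bucket/"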
For a minimal example, put your further resources into an application.yaml file (see below). We now have two stacks, and the application stack requires a value exported by the bootstrap stack first - so we use the RunOrder parameter to enforce the proper chain of execution. An alternative is to define a completely separate stage. The effect would be similar, and it's more about the flow you want to represent than a technical requirement. But adding more stages can be tricky: in the future you may have more stacks with more dependencies, and re-defining pipeline stages after every stack refactoring would require a lot of modifications. If you keep stacks that belong to the same stage together, changing RunOrder is enough.
Resources:
    AppRole:
        Type: "AWS::IAM::Role"
        Properties:
            # …
    DeployRewritesLambda:
        Type: "AWS::Lambda::Function"
        Properties:
            FunctionName: "my-app-lambda"
            Runtime: "java8"
            Code:
                S3Bucket: !ImportValue "ArtifactsBucketName"
                S3Key: "key/to/your/app.zip"
            Handler: "index.handler"
            MemorySize: 256
            Description: "Your app."
            Timeout: 300
            Role: !GetAtt "AppRole.Arn"
Just a note - this is a dummy example; you still need to build and upload your application package somehow (you can use CodeBuild for that). I just wanted to present the CloudFormation side.
Now, having your application "manifest" created, we can add it to our pipeline (we also need to define the ArtifactsBucketName export):
Parameters:
    # …
Resources:
    PipelineArtifactsBucket:
        Type: "AWS::S3::Bucket"
        Properties:
            BucketName: "my-code-pipeline-artifacts"
            AccessControl: "Private"
    PipelineRole:
        Type: "AWS::IAM::Role"
        Properties:
            # …
            Policies:
                # …
                -
                    PolicyName: "AllowRunningCloudFormation"
                    PolicyDocument:
                        Version: "2012-10-17"
                        Statement:
                            -
                                Action:
                                    - "cloudformation:CreateStack"
                                    - "cloudformation:DescribeStacks"
                                    - "cloudformation:UpdateStack"
                                Effect: "Allow"
                                Resource:
                                    # stack ARNs carry a unique ID suffix, hence the wildcards
                                    - !Sub "arn:aws:cloudformation:${AWS::Region}:${AWS::AccountId}:stack/${AWS::StackName}/*"
                                    - !Sub "arn:aws:cloudformation:${AWS::Region}:${AWS::AccountId}:stack/application/*"
                            -
                                Action:
                                    - "iam:PassRole"
                                Effect: "Allow"
                                Resource:
                                    - !GetAtt "DeployRole.Arn"
    # …
    BootstrapPipeline:
        Type: "AWS::CodePipeline::Pipeline"
        Properties:
            # …
            Stages:
                -
                    Name: "Source"
                    # …
                -
                    Name: "Infrastructure"
                    Actions:
                        -
                            Name: "Bootstrap"
                            ActionTypeId:
                                Category: "Deploy"
                                Owner: "AWS"
                                Provider: "CloudFormation"
                                Version: "1"
                            Configuration:
                                ActionMode: "CREATE_UPDATE"
                                StackName: !Ref "AWS::StackName"
                                RoleArn: !GetAtt "DeployRole.Arn"
                                Capabilities: "CAPABILITY_NAMED_IAM"
                                TemplatePath: "checkout::bootstrap.yaml"
                            InputArtifacts:
                                -
                                    Name: "checkout"
                            # note that we add `RunOrder` -
                            # as `application.yaml` imports a value from this stack, it needs to wait
                            RunOrder: 1
                        -
                            Name: "Application"
                            ActionTypeId:
                                Category: "Deploy"
                                Owner: "AWS"
                                Provider: "CloudFormation"
                                Version: "1"
                            Configuration:
                                ActionMode: "CREATE_UPDATE"
                                StackName: "application"
                                RoleArn: !GetAtt "DeployRole.Arn"
                                Capabilities: "CAPABILITY_NAMED_IAM"
                                TemplatePath: "checkout::application.yaml"
                            InputArtifacts:
                                -
                                    Name: "checkout"
                            RunOrder: 2
Outputs:
    ArtifactsBucketName:
        Value: !Ref "PipelineArtifactsBucket"
        Export:
            Name: "ArtifactsBucketName"
Tags: AWS, Cloud, CodePipeline, Continuous Delivery, CloudFormation