Rafał Wrzeszcz - Wrzasq.pl

Continuous Delivery with CodePipeline, CodeBuild and Maven

Saturday, 02 December 2017, 20:47

New job, new city, new blog post. My adventure at HRS has finished. Time for the next step, but the next step starts next Monday, so I have some time to spend with my family and update my blog a little. Recently I was mainly involved in managing the infrastructure of various projects on AWS and picked CloudFormation as the tool to manage all of the resources. I discovered a lot of quirks and sometimes undocumented behaviors, but mostly I just built simple stuff from simple pieces that work great with minimal effort. I want to share some of that knowledge here, especially since AWS documentation is often fragmented, unclear or missing - which doesn't change the fact that it still provides everything you need to build what you want, managing most of it for you.

So first I want to describe my simple CI/CD setup using CodeBuild and CodePipeline. First of all - why? Cloud/SaaS CI services like Travis are great: all you need, with zero maintenance cost, managed for you, usually free for open-source projects. But private/internal projects often have different requirements: they need access to private repositories, integration with corporate tools, and access control. If your company uses the AWS cloud anyway, CodeBuild and CodePipeline are great choices, as they integrate with IAM, are manageable with CloudFormation, and of course are completely managed and very simple.

CloudFormationCeption

First let's describe the process itself - the entire automation pipeline consists of just four "active" elements:

automation CloudFormation stack
responsible for provisioning all of the other resources, bootstrapping the pipeline etc.;
CodePipeline pipeline
responsible for defining automation flow;
CodeBuild project
defines build tasks;
deployment CloudFormation stack(s)
a single application deployment stage, but a pipeline can deploy multiple stages.

The process is completely controlled by the root CloudFormation stack (which can easily be automated with a shell script if needed): it defines the S3 bucket for storing the artifacts; the CodeBuild project for executing build tasks; and the CodePipeline pipeline that provisions the workflow, which at some step triggers another CloudFormation stack deployment - this time the application stack (we also need IAM roles for most of that). To handle all of this we need just a handful of files, so everything is fully automated, managed by source code and can easily be kept in one repository:

infrastructure/init.sh
root stack deployment automation;
infrastructure/pipeline.yaml
this file will contain template for root stack;
infrastructure/stage.yaml
this file will contain template for single application stage stack;
infrastructure/config-production.json
stack configuration for production deployment;
infrastructure/config-staging.json
stack configuration for staging deployment, to demonstrate the capability of multiple deployment environments;
buildspec.yml
CodeBuild project tasks;
src/main/resources/build-info.json
Maven-specific location for some dynamic stuff (read further).

Start with CloudFormation

I highly recommend using CloudFormation if you want to start managing your AWS resources. It's simple (which doesn't mean easy!); it offers some nice features, like cross-stack references; it detects some mistakes upfront; but most of all - it allows you to track your resources. When you start your AWS adventure you will spawn resources of all kinds like crazy. As some of them turn out unneeded, or some approaches turn out to be doomed, you can easily bloat your account with strange phantom resources that even you won't recognize after a month. Of course there are other tools for managing them, but CloudFormation is right there, from AWS, for free. One more reason is that you can easily move out of it any time (just update your stack with DeletionPolicy: Retain for all of the resources and afterwards delete the stack), but you can't move in (or even back), as it can't import existing resources - so you can do it from the beginning, or not at all.
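To illustrate the way out: setting DeletionPolicy on a resource keeps it alive after stack deletion (a sketch with a hypothetical bucket - the name is just an example):

```yaml
Resources:
    ImportantBucket:
        Type: "AWS::S3::Bucket"
        # with this policy the bucket survives deletion of the stack
        DeletionPolicy: "Retain"
        Properties:
            BucketName: "my-project-data"
```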

First element is our bootstrap script, that will create a stack for pipeline. In fact it only needs to do two things - upload template file to S3 and execute CloudFormation:

aws s3 cp "pipeline.yaml" "s3://${S3_BUCKET}/pipeline.yaml"
aws cloudformation "${OPERATION}" \
    --stack-name "my-project-automation" \
    --template-url "https://s3.${AWS_REGION}.amazonaws.com/${S3_BUCKET}/pipeline.yaml" \
    --capabilities CAPABILITY_NAMED_IAM \
    # put your parameters here

The only tricky part here is ${OPERATION}, as CloudFormation doesn't provide any action like create-or-update-stack (it can only either create or update). It's a simple problem to solve, and I don't want to go too deep in this post, so you can figure that one out on your own.
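One possible approach (a sketch - the stack name here is just an example) is to probe the stack with describe-stacks and pick the operation based on the result:

```shell
#!/bin/sh
# CloudFormation has no create-or-update action, so check whether the stack
# already exists and choose the CLI sub-command accordingly
pick_operation() {
    if aws cloudformation describe-stacks --stack-name "$1" > /dev/null 2>&1; then
        echo "update-stack"
    else
        echo "create-stack"
    fi
}

OPERATION=$(pick_operation "my-project-automation")
```

The result can then be passed straight to the aws cloudformation "${OPERATION}" call shown above.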

Now let's see what will we keep in infrastructure/pipeline.yaml template.

CodePipeline setup

Let's start with the pipeline. A pipeline consists of stages and actions, executed on the artifacts passed over from one action to the next. Our simple pipeline will fetch the source code from a GitHub repository, execute tests automated with CodeBuild, deploy artifacts and then execute a CloudFormation stack. We will define five actions:

  1. source code checkout;
  2. build execution;
  3. staging environment deployment;
  4. manual approval trigger;
  5. production environment deployment.

The template starts with input parameters and the first stage:

Parameters:
    GitHubOwner:
        Description: "GitHub user or organization name."
        Type: "String"
    GitHubRepo:
        Description: "GitHub repository name."
        Type: "String"
    GitHubOAuthToken:
        Description: "AWS CodePipeline OAuth access token to GitHub."
        Type: "String"
    ProjectKey:
        Description: "DNS-friendly project key."
        Type: "String"

Resources:
    # bucket for storing action artifacts

    PipelinesBucket:
        Type: "AWS::S3::Bucket"
        Properties:
            BucketName: !Sub "${ProjectKey}-pipelines"
            AccessControl: "Private"

    # IAM role with which CodePipeline will execute all actions (can be overridden for particular actions if needed)

    PipelineRole:
        Type: "AWS::IAM::Role"
        Properties:
            RoleName: !Sub "${ProjectKey}-codepipeline"
            AssumeRolePolicyDocument:
                Statement:
                    -
                        Action:
                            - "sts:AssumeRole"
                        Effect: "Allow"
                        Principal:
                            Service: "codepipeline.amazonaws.com"
            Policies:
                -
                    PolicyName: "AllowS3PipelinesStorage"
                    PolicyDocument:
                        Version: "2012-10-17"
                        Statement:
                            -
                                Action:
                                    - "s3:GetObject"
                                    - "s3:PutObject"
                                Effect: "Allow"
                                Resource:
                                    - !Sub "arn:aws:s3:::${PipelinesBucket}"
                                    - !Sub "arn:aws:s3:::${PipelinesBucket}/*"

    # pipeline definition itself

    DeployPipeline:
        Type: "AWS::CodePipeline::Pipeline"
        Properties:
            Name: !Ref "ProjectKey"
            RoleArn: !GetAtt "PipelineRole.Arn"
            ArtifactStore:
                # here we define where all of the artifacts will be stored
                Type: "S3"
                Location: !Ref "PipelinesBucket"
            Stages:
                -
                    Name: "Source"
                    Actions:
                        -
                            Name: "Checkout"
                            ActionTypeId:
                                Category: "Source"
                                Owner: "ThirdParty"
                                Provider: "GitHub"
                                Version: "1"
                            Configuration:
                                Owner: !Ref "GitHubOwner"
                                Repo: !Ref "GitHubRepo"
                                Branch: "master"
                                OAuthToken: !Ref "GitHubOAuthToken"
                            OutputArtifacts:
                                -
                                    Name: "checkout"
                # here we will add more stages and actions

So far rather simple - we will add more steps soon. But let's stay here for a moment - there are already some aspects that need to be described.

  • So far CodePipeline doesn't support webhooks for GitHub integration. It polls the specified branch periodically to check for updates. It's a slightly inconvenient setup, but it checks very frequently, so pipeline runs still start smoothly.
  • Another thing is that it needs an OAuth token for accessing the repo. The process is described in the AWS documentation.
  • An artifact is not a single file - it's a ZIP archive with the state of the project. The name is just an identifier; CodePipeline will generate a unique filename for it (in fact, it's stored in S3 as ${BucketName}/${PipelineName}/${ArtifactName}/${RandomId}.zip).

CodeBuild setup

Time for running some tasks in our project - of course with CodeBuild. For that we need to:

  • define a CodeBuild project buildspec (tasks);
  • define a CodeBuild project by CloudFormation (and IAM role for it);
  • give CodePipeline access to build project (update IAM role).

Let's start with the build specification. In my case I use Maven, but CodeBuild is totally agnostic about your toolset: it will just execute the commands you specify, within the Docker image you specify. As Maven builds are usually encapsulated within pom.xml, I simply need to invoke the build (I guess it's the same for most tools) - the build specification is just an integration layer:

version: "0.2"

phases:
    build:
        commands:
            - "mvn test"

    post_build:
        commands:
            - "mvn deploy"

artifacts:
    files:
        - "**/*"

CodeBuild documentation contains a detailed specification of the buildspec file. The important thing here is the artifacts definition - it specifies which files will be included in the pipeline state transferred by CodePipeline to the next pipeline phase. You can specify only the files you need; in my case I prefer to keep all files (**/*) and let further pipeline steps decide what they need. An important note: without the artifacts clause no files at all would be transferred.
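If you do want to narrow the transfer down, the files list accepts multiple glob patterns (a sketch - the paths are assumptions that depend on your project layout):

```yaml
artifacts:
    files:
        # only the built package and the deployment templates
        - "target/*.jar"
        - "infrastructure/**/*"
```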

Now, as we have build tasks, we can integrate them into pipeline template:

Resources:
    /* … */

    # IAM role for CodeBuild that will allow it to access build pipeline artifacts and log to CloudWatch

    BuildRole:
        Type: "AWS::IAM::Role"
        Properties:
            RoleName: !Sub "${ProjectKey}-codebuild"
            AssumeRolePolicyDocument:
                Statement:
                    -
                        Action:
                            - "sts:AssumeRole"
                        Effect: "Allow"
                        Principal:
                            Service: "codebuild.amazonaws.com"
            Policies:
                -
                    PolicyName: "AllowLoggingToCloudWatchLogs"
                    PolicyDocument:
                        Version: "2012-10-17"
                        Statement:
                            -
                                Action:
                                    - "logs:CreateLogGroup"
                                    - "logs:CreateLogStream"
                                    - "logs:PutLogEvents"
                                Effect: "Allow"
                                Resource:
                                    - !Sub "arn:aws:logs:${AWS::Region}:${AWS::AccountId}:log-group:/aws/codebuild/${ProjectKey}:*"
                -
                    PolicyName: "AllowS3PipelinesStorage"
                    PolicyDocument:
                        Version: "2012-10-17"
                        Statement:
                            -
                                Action:
                                    - "s3:GetObject"
                                    - "s3:PutObject"
                                Effect: "Allow"
                                Resource:
                                    - !Sub "arn:aws:s3:::${PipelinesBucket}"
                                    - !Sub "arn:aws:s3:::${PipelinesBucket}/*"

    # build project - specifies environment (input, output, Docker image, role) in which build tasks will be executed

    BuildProject:
        Type: "AWS::CodeBuild::Project"
        Properties:
            Name: !Ref "ProjectKey"
            ServiceRole: !Ref "BuildRole"
            Environment:
                Type: "LINUX_CONTAINER"
                ComputeType: "BUILD_GENERAL1_SMALL"
                Image: "maven:3.5.2-jdk-8-slim"
            Artifacts:
                Type: "CODEPIPELINE"
            Source:
                Type: "CODEPIPELINE"

    # updated pipeline role that grants CodePipeline access to execute CodeBuild project

    PipelineRole:
        Type: "AWS::IAM::Role"
        Properties:
            RoleName: !Sub "${ProjectKey}-codepipeline"
            /* … AssumeRolePolicyDocument stays as it was */
            Policies:
                /* … */
                -
                    PolicyName: "AllowRunningCodeBuild"
                    PolicyDocument:
                        Version: "2012-10-17"
                        Statement:
                            -
                                Action:
                                    - "codebuild:BatchGetBuilds"
                                    - "codebuild:StartBuild"
                                Effect: "Allow"
                                Resource:
                                    - !GetAtt "BuildProject.Arn"

    # updated pipeline with CodeBuild step

    DeployPipeline:
        Type: "AWS::CodePipeline::Pipeline"
        Properties:
            /* … */
            Stages:
                /* … Source action with `checkout` artifact */
                -
                    Name: "Build"
                    Actions:
                        -
                            Name: "Build"
                            ActionTypeId:
                                Category: "Build"
                                Owner: "AWS"
                                Provider: "CodeBuild"
                                Version: "1"
                            Configuration:
                                ProjectName: !Ref "BuildProject"
                            InputArtifacts:
                                -
                                    Name: "checkout"
                            OutputArtifacts:
                                -
                                    Name: "build"

With this setup, after fetching sources from GitHub, the pipeline will pass the project state as the checkout artifact to the next stage - in our case the build project. CodeBuild will create the environment specified in the template file, load the checkout artifact there, execute the build tasks, and output the resulting project state back to S3 as an artifact named build. In our setup, the CodeBuild project defines three important things:

  • that input state is controlled by CodePipeline, so it doesn't need to access any additional source definition;
  • that output state will also be grabbed by CodePipeline, so it doesn't need to care about storage;
  • it defines the build environment to use a BUILD_GENERAL1_SMALL machine and run all commands in the maven:3.5.2-jdk-8-slim Docker image (additionally you can also define environment variables which will be available to all build tasks).
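Such environment variables would go into the Environment section of the project definition (a sketch - MAVEN_OPTS and its value are just an illustration):

```yaml
    BuildProject:
        Type: "AWS::CodeBuild::Project"
        Properties:
            Name: !Ref "ProjectKey"
            ServiceRole: !Ref "BuildRole"
            Environment:
                Type: "LINUX_CONTAINER"
                ComputeType: "BUILD_GENERAL1_SMALL"
                Image: "maven:3.5.2-jdk-8-slim"
                # available in all commands executed during the build
                EnvironmentVariables:
                    -
                        Name: "MAVEN_OPTS"
                        Value: "-Xmx512m"
            Artifacts:
                Type: "CODEPIPELINE"
            Source:
                Type: "CODEPIPELINE"
```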

CloudFormation integration setup

The last thing to add is the CloudFormation deployment of our infrastructure/stage.yaml (put there whatever is needed to deploy your application). We will add three more steps to our pipeline:

  1. deploying application to staging environment;
  2. manual approval for checking if everything is fine on staging;
  3. at last, deploying to production environment.

We will use the same template for both environments (and in the future you should be able to deploy to as many environments as you like). To use the same template for multiple environments you need to somehow distinguish the stacks. For that, CodePipeline makes it possible to specify, apart from the template file, also a stack configuration file.

First, the stage stack files. The stage.yaml contains whatever you need for your app, so I will just focus on passing parameters to it - the resources are up to you; say it's just:

Parameters:
    StageName:
        Description: "Deploy stage."
        Type: "String"

Resources:
    /* … */

Now, create two stage configuration files:

  • config-staging.json:
    {
        "Parameters": {
            "StageName": "staging"
        },
        "Tags": {
            "product:id": "my-project",
            "product:stage": "staging"
        }
    }
  • config-production.json:
    {
        "Parameters": {
            "StageName": "production"
        },
        "Tags": {
            "product:id": "my-project",
            "product:stage": "production"
        }
    }

We can now add the missing pieces to our root CloudFormation template:

  • create IAM role for CloudFormation actions;
  • grant access to CloudFormation to CodePipeline IAM role;
  • define deployment actions in the pipeline.
Resources:
    /* … */

    # we could grant full administrative access, but as we make it fully automated with a CD pipeline
    # we may want to keep control over what our updates do - so I prefer granular permissions

    DeployRole:
        Type: "AWS::IAM::Role"
        Properties:
            RoleName: !Sub "${ProjectKey}-deploy"
            AssumeRolePolicyDocument:
                Statement:
                    -
                        Action:
                            - "sts:AssumeRole"
                        Effect: "Allow"
                        Principal:
                            Service: "cloudformation.amazonaws.com"
            ManagedPolicyArns:
                # for some actions you may want to use ready-to-use policies
                - "arn:aws:iam::aws:policy/AmazonVPCFullAccess"
            Policies:
                # this is almost always needed
                -
                    PolicyName: "AllowDescribeAccountAttributes"
                    PolicyDocument:
                        Version: "2012-10-17"
                        Statement:
                            -
                                Action:
                                    - "ec2:DescribeAccountAttributes"
                                Effect: "Allow"
                                Resource:
                                    - "*"
                /* … */
                # put whatever else you need - you can even restrict access based on known ARN formats, like:
                -
                    PolicyName: "AllowManagingStageDynamoDb"
                    PolicyDocument:
                        Version: "2012-10-17"
                        Statement:
                            -
                                Action:
                                    - "dynamodb:CreateTable"
                                    - "dynamodb:DeleteTable"
                                    - "dynamodb:DescribeTable"
                                    - "dynamodb:UpdateTable"
                                Effect: "Allow"
                                Resource:
                                    - !Sub "arn:aws:dynamodb:${AWS::Region}:${AWS::AccountId}:table/${ProjectKey}-*"

    # the new policy allows our pipeline to execute CloudFormation stack actions and to pass the role
    # that we defined for the CloudFormation service

    PipelineRole:
        Type: "AWS::IAM::Role"
        Properties:
            RoleName: !Sub "${ProjectKey}-codepipeline"
            /* … AssumeRolePolicyDocument stays as it was */
            Policies:
                /* … */
                -
                    PolicyName: "AllowRunningCloudFormation"
                    PolicyDocument:
                        Version: "2012-10-17"
                        Statement:
                            -
                                Action:
                                    - "cloudformation:CreateStack"
                                    - "cloudformation:DescribeStacks"
                                    - "cloudformation:UpdateStack"
                                Effect: "Allow"
                                Resource:
                                    - !Sub "arn:aws:cloudformation:${AWS::Region}:${AWS::AccountId}:stack/${ProjectKey}-*"
                            -
                                Action:
                                    - "iam:PassRole"
                                Effect: "Allow"
                                Resource:
                                    - !GetAtt "DeployRole.Arn"

    DeployPipeline:
        Type: "AWS::CodePipeline::Pipeline"
        Properties:
            /* … */
            Stages:
                /* … Source action, then build action with `build` artifact */
                -
                    Name: "Staging"
                    Actions:
                        -
                            Name: "Staging"
                            ActionTypeId:
                                Category: "Deploy"
                                Owner: "AWS"
                                Provider: "CloudFormation"
                                Version: "1"
                            Configuration:
                                ActionMode: "CREATE_UPDATE"
                                StackName: !Sub "${ProjectKey}-staging"
                                RoleArn: !GetAtt "DeployRole.Arn"
                                TemplatePath: "build::infrastructure/stage.yaml"
                                TemplateConfiguration: "build::infrastructure/config-staging.json"
                            InputArtifacts:
                                -
                                    Name: "build"
                -
                    Name: "Production"
                    Actions:
                        # manual approval for production
                        -
                            Name: "Approval"
                            ActionTypeId:
                                Category: "Approval"
                                Owner: "AWS"
                                Provider: "Manual"
                                Version: "1"
                            RunOrder: 1
                        -
                            Name: "Production"
                            ActionTypeId:
                                Category: "Deploy"
                                Owner: "AWS"
                                Provider: "CloudFormation"
                                Version: "1"
                            Configuration:
                                ActionMode: "CREATE_UPDATE"
                                StackName: !Sub "${ProjectKey}-production"
                                RoleArn: !GetAtt "DeployRole.Arn"
                                TemplatePath: "build::infrastructure/stage.yaml"
                                TemplateConfiguration: "build::infrastructure/config-production.json"
                            InputArtifacts:
                                -
                                    Name: "build"
                            RunOrder: 2

Ok, as our pipeline and template are growing - what is happening here? First of all, notice the separation of IAM roles: the pipeline is only allowed to execute actions on CloudFormation stacks - all of the particular resource management permissions are attached to the dedicated deploy role.

The very nice thing is the CREATE_UPDATE action mode - CodePipeline solves the problem, which I mentioned at the beginning, that the AWS CLI for CloudFormation can only either create or update stacks. This mode detects whether a stack with the given name exists and, if so, updates it; otherwise it creates a new one.

You might also notice the RunOrder parameter. As we know, a pipeline consists of stages and each stage consists of actions - stages are sequential, while actions within a stage are (by default) executed in parallel. But we want to hold the production deployment until it gets (manually) approved. That's what the order is for - it puts actions in sequence: actions with a higher number wait for those with lower ones.

Last, but not least, we have the template path and configuration - we use the same template everywhere, so the stacks are completely mirrored; however, we use different configuration files, which pass different parameter values, allowing the stacks to manage different resources and group them (for example by putting StageName as a prefix in resource names). Paths to both template and configuration files are of the form ${ArtifactName}::${PathInTheArtifact} (we operate on the output artifact of the build step - this is why I prefer to output the entire project as an artifact in CodeBuild).

Dynamic parameters

Seems like everything is done, but I bet you are missing one part - how to pass dynamic values to the stack? Let's say the current version, to build package URLs (for example Lambda source locations)? No worries, it's all feasible. To see how, start by adding a version parameter to the stage template:

Parameters:
    StageName:
        Description: "Deploy stage."
        Type: "String"
    ProductVersion:
        Description: "Deployed version."
        Type: "String"

Resources:
    /* … */

CodePipeline provides a way to read such parameters from artifacts and pass them to CloudFormation. It can read from JSON key-value files. The AWS documentation describes them as outputs of other stacks, but it can be literally any JSON file in the artifact:

Resources:
    /* … */

    DeployPipeline:
        Type: "AWS::CodePipeline::Pipeline"
        Properties:
            /* … */
            Stages:
                /* … */
                -
                    Name: "Staging"
                    Actions:
                        -
                            Name: "Staging"
                            ActionTypeId:
                                Category: "Deploy"
                                Owner: "AWS"
                                Provider: "CloudFormation"
                                Version: "1"
                            Configuration:
                                ActionMode: "CREATE_UPDATE"
                                StackName: !Sub "${ProjectKey}-staging"
                                RoleArn: !GetAtt "DeployRole.Arn"
                                Capabilities: "CAPABILITY_NAMED_IAM"
                                TemplatePath: "build::infrastructure/stage.yaml"
                                TemplateConfiguration: "build::infrastructure/config-staging.json"
                                ParameterOverrides: |
                                    {
                                        "ProductVersion": {
                                            "Fn::GetParam": [
                                                "build",
                                                "target/classes/build-info.json",
                                                "version"
                                            ]
                                        }
                                    }
                            InputArtifacts:
                                -
                                    Name: "build"
                # use same overrides in production deployment

The whole magic is done by the ParameterOverrides property and the Fn::GetParam function. ParameterOverrides is a string containing JSON (no matter if you use the YAML or JSON template format - it is just a string containing a JSON document) with a key-value mapping of parameter values. Fn::GetParam, on the other hand, reads a value from a JSON file within an artifact. It takes three arguments:

  1. artifact name;
  2. path of JSON file within the artifact;
  3. key in the JSON file under which the value is stored.

Now the only thing left is to make sure your build step generates the needed JSON file, making it available for the next phases. It's probably a very simple task in most build setups. Here I will describe just…

…the Maven way

With Maven the easiest way that came to my mind was to simply define a resource that will be processed during the build - just put a file named src/main/resources/build-info.json (the processed version will be placed in target/classes/build-info.json):

{
    "version": "${project.version}"
}

Don't forget to enable filtering on this resource, as it is needed to interpolate the variables.
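In pom.xml, filtering is enabled per-resource (a minimal sketch - if your build section already defines resources, just add the flag there):

```xml
<build>
    <resources>
        <resource>
            <directory>src/main/resources</directory>
            <!-- interpolates placeholders like ${project.version} -->
            <filtering>true</filtering>
        </resource>
    </resources>
</build>
```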

If you have a multi-module Maven project, you may want to put it into separate sub-module to not bloat produced packages.

Multiple branches

So we have a pipeline for the master branch. What about others, like running just the tests and test environment deployments from the develop branch (assuming a GitFlow-like workflow)? The answer is simple - just define more pipelines. Keep each CodePipeline pipeline a simple flow - don't try to branch it too much (it's possible) to handle all of the cases. If you have different processes for your branches - use different pipelines:

Resources:
    # only CI build on develop
    DevelopPipeline:
        Type: "AWS::CodePipeline::Pipeline"
        Properties:
            Name: "my-project-develop"
            RoleArn: !GetAtt "DevelopPipelineRole.Arn"
            ArtifactStore:
                Type: "S3"
                Location: "my-pipelines"
            Stages:
                -
                    Name: "Source"
                    Actions:
                        -
                            Name: "Checkout"
                            ActionTypeId:
                                Category: "Source"
                                Owner: "ThirdParty"
                                Provider: "GitHub"
                                Version: "1"
                            Configuration:
                                Owner: !Ref "GitHubOwner"
                                Repo: !Ref "GitHubRepo"
                                Branch: "develop"
                                OAuthToken: !Ref "GitHubOAuthToken"
                            OutputArtifacts:
                                -
                                    Name: "develop"
                -
                    Name: "Build"
                    Actions:
                        -
                            Name: "Build"
                            ActionTypeId:
                                Category: "Build"
                                Owner: "AWS"
                                Provider: "CodeBuild"
                                Version: "1"
                            Configuration:
                                ProjectName: !Ref "BuildProject"
                            InputArtifacts:
                                -
                                    Name: "develop"
                # what's next? test environment deploy?

    # master pipeline with deploy
    MasterPipeline:
        Type: "AWS::CodePipeline::Pipeline"
        Properties:
            Name: "my-project-master"
            RoleArn: !GetAtt "MasterPipelineRole.Arn"
            ArtifactStore:
                Type: "S3"
                Location: "my-pipelines"
            Stages:
                -
                    Name: "Source"
                    Actions:
                        -
                            Name: "Checkout"
                            ActionTypeId:
                                Category: "Source"
                                Owner: "ThirdParty"
                                Provider: "GitHub"
                                Version: "1"
                            Configuration:
                                Owner: !Ref "GitHubOwner"
                                Repo: !Ref "GitHubRepo"
                                Branch: "master"
                                OAuthToken: !Ref "GitHubOAuthToken"
                            OutputArtifacts:
                                -
                                    Name: "master"
                -
                    Name: "Build"
                    Actions:
                        -
                            Name: "Build"
                            ActionTypeId:
                                Category: "Build"
                                Owner: "AWS"
                                Provider: "CodeBuild"
                                Version: "1"
                            Configuration:
                                ProjectName: !Ref "BuildProject"
                            InputArtifacts:
                                -
                                    Name: "master"
                            OutputArtifacts:
                                -
                                    Name: "build"
                # staging deploy? manual approval?
                -
                    Name: "Deploy"
                    Actions:
                        -
                            Name: "Deploy"
                            ActionTypeId:
                                Category: "Deploy"
                                Owner: "AWS"
                                Provider: "CloudFormation"
                                Version: "1"
                            Configuration:
                                ActionMode: "CREATE_UPDATE"
                                StackName: !Ref "ProjectName"
                                RoleArn: !GetAtt "DeployRole.Arn"
                                TemplatePath: "build::infrastructure/stage.yaml"
                            InputArtifacts:
                                -
                                    Name: "build"

This is exactly the AWS approach - KISS: don't complicate a single flow, just define a separate one.
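The placeholder comments in the template above hint at possible extensions. For instance, a manual approval gate before the Deploy stage is just another action - a sketch (the stage and action names are illustrative):

```yaml
                # extra stage inserted between "Build" and "Deploy"
                -
                    Name: "Approve"
                    Actions:
                        -
                            Name: "ApproveDeploy"
                            ActionTypeId:
                                Category: "Approval"
                                Owner: "AWS"
                                Provider: "Manual"
                                Version: "1"
```

The pipeline pauses at this action until someone with the appropriate IAM permissions approves or rejects it in the console.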

Multiple CodeBuild setups

The last piece that we might need to make variable is the CodeBuild project configuration - for example, you might want to run different commands in different pipelines or stages (let's say mvn test in DevelopPipeline and mvn deploy in MasterPipeline). Again, just like with defining multiple pipelines for different flows, the AWS solution to this problem is very simple - if you want to run a different command, define a different project. You can override the build specification location from the default buildspec.yml to a different file, or even provide it inline in the CloudFormation template:

Resources:
    MasterBuildProject:
        Type: "AWS::CodeBuild::Project"
        Properties:
            Name: "my-project-master"
            ServiceRole: !Ref "BuildRole"
            Environment:
                Type: "LINUX_CONTAINER"
                ComputeType: "BUILD_GENERAL1_SMALL"
                Image: "maven:3.5.2-jdk-8-slim"
            Artifacts:
                Type: "CODEPIPELINE"
            Source:
                Type: "CODEPIPELINE"
                BuildSpec: |
                    version: "0.2"

                    phases:
                        build:
                            commands:
                                - "mvn deploy"

    DevelopBuildProject:
        Type: "AWS::CodeBuild::Project"
        Properties:
            Name: "my-project-develop"
            ServiceRole: !Ref "BuildRole"
            Environment:
                Type: "LINUX_CONTAINER"
                ComputeType: "BUILD_GENERAL1_SMALL"
                Image: "maven:3.5.2-jdk-8-slim"
            Artifacts:
                Type: "CODEPIPELINE"
            Source:
                Type: "CODEPIPELINE"
                BuildSpec: |
                    version: "0.2"

                    phases:
                        build:
                            commands:
                                - "mvn test"

    DevelopPipeline:
        Type: "AWS::CodePipeline::Pipeline"
        Properties:
            Name: "my-project-develop"
            RoleArn: !GetAtt "DevelopPipelineRole.Arn"
            ArtifactStore:
                Type: "S3"
                Location: "my-pipelines"
            Stages:
                -
                    Name: "Source"
                    Actions:
                        -
                            Name: "Checkout"
                            ActionTypeId:
                                Category: "Source"
                                Owner: "ThirdParty"
                                Provider: "GitHub"
                                Version: "1"
                            Configuration:
                                Owner: !Ref "GitHubOwner"
                                Repo: !Ref "GitHubRepo"
                                Branch: "develop"
                                OAuthToken: !Ref "GitHubOAuthToken"
                            OutputArtifacts:
                                -
                                    Name: "develop"
                -
                    Name: "Build"
                    Actions:
                        -
                            Name: "Build"
                            ActionTypeId:
                                Category: "Build"
                                Owner: "AWS"
                                Provider: "CodeBuild"
                                Version: "1"
                            Configuration:
                                ProjectName: !Ref "DevelopBuildProject"
                            InputArtifacts:
                                -
                                    Name: "develop"

    MasterPipeline:
        Type: "AWS::CodePipeline::Pipeline"
        Properties:
            Name: "my-project-master"
            RoleArn: !GetAtt "MasterPipelineRole.Arn"
            ArtifactStore:
                Type: "S3"
                Location: "my-pipelines"
            Stages:
                -
                    Name: "Source"
                    Actions:
                        -
                            Name: "Checkout"
                            ActionTypeId:
                                Category: "Source"
                                Owner: "ThirdParty"
                                Provider: "GitHub"
                                Version: "1"
                            Configuration:
                                Owner: !Ref "GitHubOwner"
                                Repo: !Ref "GitHubRepo"
                                Branch: "master"
                                OAuthToken: !Ref "GitHubOAuthToken"
                            OutputArtifacts:
                                -
                                    Name: "master"
                -
                    Name: "Build"
                    Actions:
                        -
                            Name: "Build"
                            ActionTypeId:
                                Category: "Build"
                                Owner: "AWS"
                                Provider: "CodeBuild"
                                Version: "1"
                            Configuration:
                                ProjectName: !Ref "MasterBuildProject"
                            InputArtifacts:
                                -
                                    Name: "master"

Note that even though the embedded build specification is a valid YAML document, you need to embed it as a string - hence the | literal block scalar; otherwise CloudFormation would try to interpret its keys as part of the template.
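Alternatively, instead of inlining the document, the BuildSpec property can point to a file inside the source artifact - a sketch, with an illustrative path:

```yaml
            Source:
                Type: "CODEPIPELINE"
                # path is resolved relative to the root of the source artifact;
                # the filename here is just an example
                BuildSpec: "ci/buildspec-develop.yml"
```

This keeps the build commands versioned alongside the code, while still letting each pipeline pick a different specification file.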

Bye bye

I have more scenarios to describe soon, but I need to rest now!

Tags: CloudFormation, JSON, Maven, AWS, Cloud, Continuous Delivery, CodeBuild, CodePipeline