Introducing CI/CD with AWS CodePipeline

Reading Time: 4 minutes

CodePipeline is one of AWS’s developer tools and acts as the glue and orchestrator for other AWS services. It uses CodeCommit or another external source control provider to pull in source code and trigger a pipeline execution. CodeBuild can run code, build software, and do almost anything else you need in your pipeline. At any point in your pipeline, you can trigger Lambda functions to run, expanding the possibilities for what you can do with the pipeline. Numerous deployment services work with CodePipeline, such as CodeDeploy, Elastic Beanstalk, ECS, and Fargate. The many integrations that CodePipeline provides ensure that you can handle almost any scenario you need. CodePipeline costs only $1 per active pipeline per month, excluding any charges for CodeBuild processing time or storage in S3.

CodePipeline Structure

Each individual instance of CodePipeline is itself called a pipeline. A pipeline contains all of its configuration options, between 2 and 10 stages, and, inside those stages, 1 to 50 actions. A pipeline carries a kind of state between actions and stages by using input and output artifacts. These artifacts are just files, whether that’s your input source code, configuration files, or a built application binary, and they are stored in an S3 bucket. Artifacts are the main way your actions interact with each other in CodePipeline.

Any input or output artifacts are defined in the action itself and are distinguished by a unique name. If an action needs an artifact that was output by a previous action, you only need to reference it by name as an input artifact, and CodePipeline will retrieve it from the pipeline’s S3 bucket.
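To make this concrete, here is a minimal sketch in Python of how one action’s output artifact becomes another’s input. The dictionaries mirror the shape the CodePipeline API expects, trimmed down to just the artifact wiring; the action and artifact names are made up for illustration.

```python
# Two pipeline actions, reduced to their artifact wiring.
# "SourceOutput" is an arbitrary name we chose; CodePipeline matches
# input artifacts to earlier outputs purely by this name.
source_action = {
    "name": "PullSource",
    "outputArtifacts": [{"name": "SourceOutput"}],
}

build_action = {
    "name": "BuildApp",
    # Referencing "SourceOutput" here tells CodePipeline to fetch the
    # file the source action stored in the pipeline's S3 bucket.
    "inputArtifacts": [{"name": "SourceOutput"}],
    "outputArtifacts": [{"name": "BuildOutput"}],
}

# The link between the two actions is simply the matching name.
assert (build_action["inputArtifacts"][0]["name"]
        == source_action["outputArtifacts"][0]["name"])
```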

Configuration Options for a Pipeline

Each pipeline needs a name to identify it. These names only need to be unique within the same region of your account. A pipeline also needs a service role, which gives it access to things like Lambda, S3, CodeCommit, and the many other resources that CodePipeline interacts with. An S3 bucket will need to be created or designated to store the input/output artifacts used and created by the pipeline. You can designate your own KMS encryption key for the artifacts in that bucket, or use the default AWS-managed encryption key. Besides these configuration options, the only other thing a pipeline contains is its stages.
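As a sketch, the options above map onto the top-level structure that the CodePipeline API (for example, boto3’s create_pipeline) accepts. All names, account IDs, and ARNs below are placeholders, not real resources.

```python
# Top-level pipeline configuration (all identifiers are placeholders).
pipeline_config = {
    # Must be unique per region in your account.
    "name": "my-app-pipeline",
    # Service role granting access to S3, CodeCommit, Lambda, etc.
    "roleArn": "arn:aws:iam::123456789012:role/MyPipelineServiceRole",
    "artifactStore": {
        "type": "S3",
        # Bucket where input/output artifacts are stored.
        "location": "my-pipeline-artifact-bucket",
        # Optional: your own KMS key. Omit this block to fall back to
        # the default AWS-managed key.
        "encryptionKey": {
            "id": "arn:aws:kms:us-east-1:123456789012:key/1234abcd-12ab-34cd-56ef-1234567890ab",
            "type": "KMS",
        },
    },
    "stages": [],  # a real pipeline needs at least two stages here
}
```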

Stages in a Pipeline

A stage is a collection of actions. You can use stages to organize and isolate different types of actions according to whatever criteria you want. One of the most basic examples is a source stage, which contains an action that pulls code from source control and outputs a source code artifact, and a build stage, which contains a few actions that take that source code artifact and run build commands on it. Stages are yours to use to structure your pipeline however you want. The only hard requirement is that the first stage must contain one or more source actions and nothing else. Each stage holds at least one action, with a maximum of 50 actions allowed. There are a few different action types, and each one performs a specific kind of function.
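The structural rules above can be sketched as a small validation function. This is purely illustrative; the real checks are enforced by the CodePipeline service itself, and the stage layout below is a minimal made-up example.

```python
def validate_stages(stages):
    """Check the structural rules: at least two stages, 1-50 actions
    per stage, and a first stage made up only of source actions."""
    if len(stages) < 2:
        raise ValueError("a pipeline needs at least two stages")
    for stage in stages:
        if not 1 <= len(stage["actions"]) <= 50:
            raise ValueError(f"stage {stage['name']!r} must have 1-50 actions")
    if any(a["category"] != "Source" for a in stages[0]["actions"]):
        raise ValueError("the first stage may only contain source actions")

# A minimal source stage + build stage layout (categories only).
stages = [
    {"name": "Source", "actions": [{"name": "PullSource", "category": "Source"}]},
    {"name": "Build", "actions": [{"name": "BuildApp", "category": "Build"}]},
]
validate_stages(stages)  # passes: the layout satisfies all three rules
```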

Action Types in CodePipeline

Source Action Type

This action type can only appear in the first stage of a pipeline and pulls in source code from one of the supported source providers, such as CodeCommit, S3, or GitHub.

Build Action Type

Build actions run a job on one of the available build providers. This is where you would build or compile the code for your application. If needed, you can also have some arbitrary code run in a CodeBuild job, for example, if you want to run some commands using the AWS CLI.
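Putting these two action types together, a source action and a build action might look like the following sketch. The repository, branch, and CodeBuild project names are placeholders; the field names follow the CodePipeline action structure.

```python
# A CodeCommit source action: pulls a branch and emits a source artifact.
source_action = {
    "name": "PullSource",
    "actionTypeId": {"category": "Source", "owner": "AWS",
                     "provider": "CodeCommit", "version": "1"},
    "configuration": {"RepositoryName": "my-repo", "BranchName": "main"},
    "outputArtifacts": [{"name": "SourceOutput"}],
}

# A CodeBuild build action: runs the named CodeBuild project against the
# source artifact. The project's buildspec can run arbitrary commands,
# including AWS CLI calls.
build_action = {
    "name": "BuildApp",
    "actionTypeId": {"category": "Build", "owner": "AWS",
                     "provider": "CodeBuild", "version": "1"},
    "configuration": {"ProjectName": "my-build-project"},
    "inputArtifacts": [{"name": "SourceOutput"}],
    "outputArtifacts": [{"name": "BuildOutput"}],
}
```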

Test Action Type

An important part of automated deployment is automated testing, and the test action type is how you implement it in your pipeline. There are many different providers for the test action type, including various third parties, and you can also simply use CodeBuild to run your tests.
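For example, a test action backed by CodeBuild is just a build-style action with the Test category; a sketch (the project name is a placeholder):

```python
# A test action that reuses CodeBuild as the provider. The CodeBuild
# project's buildspec would run the test suite, and a failing test
# fails the action (and stops the pipeline).
test_action = {
    "name": "RunUnitTests",
    "actionTypeId": {"category": "Test", "owner": "AWS",
                     "provider": "CodeBuild", "version": "1"},
    "configuration": {"ProjectName": "my-test-project"},
    "inputArtifacts": [{"name": "SourceOutput"}],
}
```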

Deploy Action Type

To deploy your code, you can use the deploy action type. There are many different providers for deploying your application, including ECS, Elastic Beanstalk, and CodeDeploy.
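As one sketch, an ECS deploy action looks like this. The cluster, service, and artifact names are placeholders; `imagedefinitions.json` is the file the ECS provider reads to learn which container images to deploy.

```python
# Deploys the images listed in imagedefinitions.json (typically
# produced by the build stage) to an existing ECS service.
deploy_action = {
    "name": "DeployToECS",
    "actionTypeId": {"category": "Deploy", "owner": "AWS",
                     "provider": "ECS", "version": "1"},
    "configuration": {
        "ClusterName": "my-cluster",
        "ServiceName": "my-service",
        "FileName": "imagedefinitions.json",
    },
    "inputArtifacts": [{"name": "BuildOutput"}],
}
```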

There are two more action types you may find useful. The first is the approval action type, which allows you to insert a manual approval into your pipeline. This can be really helpful when you’ve got a pipeline that deploys to production, and you want someone to do a quick manual smoke test before the new version actually gets deployed. The second is the invoke action type, which triggers a Lambda function, letting you run custom logic at any point in the pipeline, as mentioned earlier.
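A manual approval action might be sketched like this. The SNS topic ARN is a placeholder, and both configuration keys are optional.

```python
# Pauses the pipeline until someone approves or rejects it in the console.
approval_action = {
    "name": "ManualApproval",
    "actionTypeId": {"category": "Approval", "owner": "AWS",
                     "provider": "Manual", "version": "1"},
    "configuration": {
        # Optional SNS topic to notify reviewers that the pipeline is waiting.
        "NotificationArn": "arn:aws:sns:us-east-1:123456789012:pipeline-approvals",
        # Optional message shown to the approver in the console.
        "CustomData": "Please smoke-test staging before approving.",
    },
}
```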


Hey readers, I hope you liked this blog and learned something new. To get a detailed insight into this AWS service, visit the official documentation here. In case of any doubts, feel free to contact:

Written by 

Vidushi Bansal is a Software Consultant [DevOps] at Knoldus Inc. She is passionate about learning and exploring new technologies.