Setting up build dependencies is usually a major part of each build job. Thankfully, Atlassian’s Bitbucket Pipelines, the new CI platform that integrates into Bitbucket, supports custom Docker images.

To configure the build pipeline, you create a bitbucket-pipelines.yml in the root of the repository. This one uses our custom image (built below) and triggers builds whenever a release-* tag is pushed.

image: tonymet/tonym.us:latest
pipelines:
  tags:
    release-*:
      - step:
          script:
            - make sync_down_images
            - make s3_upload

That first line is the magic part: you can run ANY public Docker image from Docker Hub (and private ones as well with further setup).
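For reference, a private image can be used by expanding the image key into a block with credentials supplied from repository variables. This is just a sketch; the account, image name, and variable names below are placeholders:

image:
  name: youraccount/private-build-image:latest
  username: $DOCKERHUB_USERNAME
  password: $DOCKERHUB_PASSWORD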

Building a Static Blog Using the Build Pipeline

In this case we’ll use Pelican, a great static site generator. For builds & deploys, we just need Pelican, a few supporting Python packages, and the AWS CLI to deploy to S3.
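A minimal requirements.txt sketch might look like the following; the exact packages and version pins depend on what your site actually uses:

pelican
Markdown
awscli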

Using this Dockerfile, we create our custom build image. Pre-installing dependencies saves significant time during builds, and on most CI platforms build minutes are literally money.

# small Python base image
FROM python:2.7-alpine
RUN mkdir /build
COPY requirements.txt /build
WORKDIR /build
# pre-install make plus the Python dependencies so the pipeline doesn't have to
RUN apk add --no-cache make && \
  pip install --no-cache-dir -r requirements.txt
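With the Dockerfile in place, build the image and push it to Docker Hub so Pipelines can pull it (log in with docker login first, and swap in your own account and image name):

docker build -t tonymet/tonym.us:latest .
docker push tonymet/tonym.us:latest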

Activating Build Pipeline

Assuming your repo is on Bitbucket, you just need to flip a switch in the repository settings to turn on Pipelines. While you’re there, add the environment variables needed for your AWS credentials.
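The AWS CLI reads its credentials from standard environment variables, so repository variables along these lines are enough (the key values are placeholders and the region is an assumption; mark the secret key as a secured variable):

AWS_ACCESS_KEY_ID=<your access key>
AWS_SECRET_ACCESS_KEY=<your secret key>
AWS_DEFAULT_REGION=us-east-1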

Defining Our Build

Now that the pipeline is activated, our build just needs to do three things (sketched as Makefile targets below):

  1. Fetch large assets into the source repo
  2. Compile the blog
  3. Push to S3

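The step names in the pipeline map to Makefile targets. The targets below are a rough sketch, assuming large images live in a separate S3 bucket and the generated site is synced to the bucket serving the blog; the bucket names, paths, and Pelican settings file are placeholders:

# pull large binary assets (photos, etc.) that are kept out of git
sync_down_images:
	aws s3 sync s3://my-blog-assets/images content/images

# build the site with Pelican, then push the output to S3
s3_upload:
	pelican content -o output -s publishconf.py
	aws s3 sync output/ s3://my-blog-bucket --delete
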
Let’s look at the bitbucket-pipelines.yml more closely.

# use our custom Docker image
image: tonymet/tonym.us:latest
pipelines:
  tags:
    # only trigger the build for release tags
    release-*:
      - step:
          script:
            # fetch large assets
            - make sync_down_images
            # build & deploy
            - make s3_upload

Debugging & Troubleshooting

The two most typical pain points in CI environments are dependencies and access control. Thankfully, you can test both of these easily by running the Docker image locally first:

docker run -v $(pwd):/build -it tonymet/tonym.us:latest /bin/sh

Once inside the container, you can trigger each build step one by one to simulate what runs in the pipeline. Make sure the entire build works locally before pushing the code up. Each pipeline triggered costs you money 💵 and precious time ⏱
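For example, export throwaway AWS credentials and walk through the same targets the pipeline runs (the values here are placeholders):

export AWS_ACCESS_KEY_ID=<your access key>
export AWS_SECRET_ACCESS_KEY=<your secret key>
make sync_down_images
make s3_upload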