Mainlining on PipeLines

06 Apr 2019


We’ve recently updated the London Apple Admins website to use the static site generator Jekyll… Well, I say we - Graham Gilbert did the hard work, and the rest of us sat back and suggested silly colour schemes.

This site is hosted in a similar way… but it’s so rare that I actually post anything, when I do it’s just a question of building the site with jekyll build and then using s3_website push to push it live.
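In other words, the whole manual “deployment” - run from the site’s directory - is just:

jekyll build
s3_website push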

Graham on the other hand, has a whole CI/CD pipeline to publish web pages. Which is very clever… and made my very manual efforts seem, er, manual. As I’d just been talking about using Azure Pipelines to do stuff… it felt like this was a perfect opportunity to create a pipeline to do something similar.

I didn’t want to spend too much time working on it - it’s so rare I post at all, do I really want to spend ages building some magic blog posting automation?

At a basic level - all I need to do is:

  • Install Ruby
  • Install Jekyll
  • Install s3_website
  • Download my site from GitLab
  • Build the site on the build agent
  • Run s3_website to push the site live

And that’s pretty much what I’ve done:

pool:
  name: Hosted Ubuntu 1604

steps:
- script: |
    sudo apt-get install ruby ruby-dev make gcc nodejs
    sudo gem install jekyll --no-rdoc --no-ri
    sudo gem install redcarpet
    sudo gem install s3_website
  displayName: 'Install Ruby'

- task: DownloadSecureFile@1
  displayName: 'Download secure file (s3 website config)'
  inputs:
    secureFile: '0dbba248-4b56-4ca5-a223-50c5ef90a160'

- script: 'jekyll build'
  displayName: 'Build a website'

- script: 's3_website push --config-dir $(Agent.TempDirectory)'
  displayName: 'Actually deploy the site'

The only vaguely clever (?) bit is that s3_website is configured via an s3_website.yml file. This contains AWS credentials, the details of the S3 bucket my site is hosted in, and the CloudFront distribution ID my site is using.
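For reference, a minimal s3_website.yml looks something like this (the values below are placeholders, obviously - the real file never goes anywhere near the repo):

s3_id: AKIAEXAMPLEEXAMPLE
s3_secret: examplesecretexamplesecretexamplesecret
s3_bucket: blog.example.com
cloudfront_distribution_id: EXAMPLE1234567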

s3_website then does all the hard work for me - syncing content, invalidating CloudFront objects, and lots of other clever things I could add to my pipeline. But remember - keep things simple. This is evolutionary: adding a little bit of automation to my current manual workflow.

Azure DevOps supports the concept of secure files - so I can upload my s3_website.yml file (which is in my .gitignore file - the last thing I want to do is commit AWS credentials!).

In my pipeline I can download my config file to a temporary location - and then point s3_website at it when pushing the site live:

s3_website push --config-dir $(Agent.TempDirectory)


And this is the moment of truth. When I commit this post to master, the pipeline should take over and publish it.

So - fingers crossed…

(edit - first time didn’t work… turns out I’d not switched on the CI trigger in Azure. This is also the first time I’ve used Azure with GitLab - which shouldn’t make a difference, but who knows…?)
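(For completeness: for repos hosted in Azure Repos or GitHub, the CI trigger can also be declared in the pipeline YAML itself with a block like the one below - I’m not certain the same works for an external GitLab repo, so the UI setting it is.)

trigger:
- master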
