Using Bitbucket Pipelines to Publish to AWS S3
- Tags: #hugo #bitbucket #aws-s3
- Categories: technology
- Reading time: 3 minutes
I have been wanting to set up continuous integration and deployment (CI/CD) for my blog ever since Bitbucket introduced Pipelines, but I never really got around to it until now. Previously, I would generate the whole blog and push it to my AWS S3 bucket manually. There are quite a number of steps to go through just to publish, so having this finally automated every time I push my changes to the repo is a welcome reprieve.
Here are the details on how I managed to push my Hugo blog to AWS S3.
Get AWS S3 credentials
This assumes you already have an Amazon Web Services (AWS) account. Go into the AWS Management Console and open Identity and Access Management (IAM). If you don't have a user other than root, create one; you don't want to use the root account to push stuff to your S3 bucket. When you create your user, download the user credentials. That's where you'll find the AWS Access Key ID and the AWS Secret Access Key, and you'll need both of them for the next step. Don't forget to give your user read and write access to S3.
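If you would rather do this from the command line, the same setup looks roughly like the sketch below using the AWS CLI. The user name blog-deployer is a placeholder, and the managed AmazonS3FullAccess policy is the blunt option; a custom policy scoped to just your bucket is tighter.

```sh
# Create a dedicated deployment user (the name is a placeholder)
aws iam create-user --user-name blog-deployer

# Give it read/write access to S3; scoping a custom policy to one bucket is safer
aws iam attach-user-policy \
  --user-name blog-deployer \
  --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess

# Print the Access Key ID / Secret Access Key pair you'll need in the next step
aws iam create-access-key --user-name blog-deployer
```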
Configure Bitbucket
This is what my pipeline configuration looks like.
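Roughly, assuming a plain Debian image and Bitbucket's atlassian/aws-s3-deploy pipe, it is the sketch below. The Hugo and pipe versions are placeholders, and the $AWS_* and $S3_BUCKET values are repository variables set in the Bitbucket settings, so no secrets live in the YAML itself. The line references that follow line up with this sketch.

```yaml
image: debian:stretch
pipelines:
  branches:
    master:
      - step:
          script:
            - export HUGO_VERSION=0.59.1   # placeholder version
            - apt-get update
            - apt-get install -y wget ca-certificates
            - wget https://github.com/gohugoio/hugo/releases/download/v${HUGO_VERSION}/hugo_${HUGO_VERSION}_Linux-64bit.deb
            - dpkg -i hugo_${HUGO_VERSION}_Linux-64bit.deb
            - rm hugo_${HUGO_VERSION}_Linux-64bit.deb
            - hugo version                 # sanity check: is Hugo available?
            - cd src
            - hugo                         # builds the site into src/public
            - pipe: atlassian/aws-s3-deploy:0.4.4
              variables:
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
                S3_BUCKET: $S3_BUCKET
                LOCAL_PATH: './public'
                DELETE_FLAG: 'true'
```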
Lines 7-13: This is where I download and install the Hugo version I'm using. These are the first few lines in my Dockerfile setup as well, so it was just a matter of copying them over to the YAML file. Line 13 just lets me check that the environment has access to Hugo.
Lines 14-15: This is where I generate my blog. My repo currently looks like this:
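Only the top level matters here; inside src it's a standard Hugo site, so the exact file names below are approximate.

```
.
├── Dockerfile
└── src
    ├── config.toml
    ├── archetypes
    ├── content
    ├── layouts
    ├── static
    └── themes
```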
It turns out that in the pipeline, I get logged into the root of my repo. Before running Hugo to generate my files, I have to be inside src. Running hugo then generates the blog into a public folder.
Lines 16-23: This is where the magic happens. Bitbucket has several pipeline templates, called pipes, that let you access certain AWS products, and one of them happens to be S3. I enter my AWS Access Key ID and AWS Secret Access Key as well as the other required values. The LOCAL_PATH is relative to where I already am inside the repo, so I entered ./public as the value here. I set the DELETE_FLAG to true so that anything in the bucket that is no longer in my public folder gets deleted when the upload runs.
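Under the hood, that pipe is wrapping the AWS CLI, so the upload step ends up roughly equivalent to the one-liner below (the bucket name is a placeholder).

```sh
# Sync src/public to the bucket; --delete removes remote files that no longer exist locally
aws s3 sync ./public s3://example-bucket --delete
```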
That’s it!
Now every time I update the master branch of my repo, the pipeline runs and my blog updates all on its own.
This site no longer uses Disqus or any other comment platform.
If you would like to reach out regarding this post, whether to add something helpful or just to say "thank you", you can email me with the following information.
To: me [at] thegeekettespeaketh [dot] com
Subject: comment on 2019-11-03-1818