Automating Static Site Deploys to S3

June 20, 2017

Amazon's AWS S3 (Simple Storage Service) is amazing. With just a few simple steps it's easy to deploy an SSL-secured website. This blog is, at the time of this writing, a Hugo static site hosted on S3, and I regularly update the site when I publish new content.

Over time the upload process has become a burden:

  • First I compile the production version of my website
  • Then I log in to my AWS account and navigate the S3 bucket list to get to my 'benhofferber.com' bucket
  • Next I open the upload modal and drag my files in to update the site
    • Since I have many pages that are generated I just pull the whole directory in (~2.5 MB)
  • Finally I just need to wait a little bit for the upload to go through

It's not a terribly long process, but it does take time out of my day whenever I update my website. Eventually I decided to figure out a way to automate it with a simple Node script or something similar. Fortunately, I did my research first, and something like this already exists! I'd like to introduce you to s3-upload.

s3-upload will run in any directory you configure and attempt to upload files that match your upload patterns. It only uploads files that have changed (created/deleted/updated), so you don't have to push the entire directory every time. This is great because Amazon charges per upload request, so sending fewer files saves money. In the case of s3-upload it also saves a bit of time.
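
I haven't dug into how s3-upload does this under the hood, but the basic idea of skipping unchanged files is easy to sketch in Node. The snippet below is only an illustration, not s3-upload's actual code: it compares a local file's MD5 hash against the object's ETag in S3 (via the AWS SDK) and only uploads when they differ. The bucket, key, and file path are placeholders.

// Illustrative sketch only, not s3-upload's real implementation.
// Assumes the aws-sdk package is installed and AWS credentials are configured.
const fs = require("fs");
const crypto = require("crypto");
const AWS = require("aws-sdk");

const s3 = new AWS.S3();

function uploadIfChanged(bucket, key, localPath, callback) {
  const body = fs.readFileSync(localPath);
  const localHash = crypto.createHash("md5").update(body).digest("hex");

  // For simple (non-multipart) uploads, S3's ETag is the MD5 of the object,
  // so comparing hashes tells us whether the file actually changed.
  s3.headObject({ Bucket: bucket, Key: key }, (err, data) => {
    const remoteHash = err ? null : data.ETag.replace(/"/g, "");
    if (remoteHash === localHash) {
      return callback(null, "skipped"); // unchanged, so no upload request needed
    }
    s3.putObject({ Bucket: bucket, Key: key, Body: body }, (putErr) => {
      callback(putErr, putErr ? null : "uploaded");
    });
  });
}

The real library also handles the glob patterns and walks the whole directory; this just shows why only pushing changed files keeps the request count (and the wait) down.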

I've followed the steps in the s3-upload README to get this set up. The steps were easy to follow, but I did have issues uploading my Hugo blog.

Running hugo in a terminal will compile a Hugo site into a /public folder. This is problematic because configuring a directory for s3-upload means putting your settings right into the directory that you want to upload, and in this case the /public folder gets cleaned out whenever Hugo runs.

If you're an experienced Hugo user, your first thought will be to plop the files in the /static folder. After all, that folder gets copied directly into the root of /public when Hugo runs. That's exactly the right approach for this situation! It sure took me a while to get there, so pat yourself on the back if you figured it out!
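
To make that concrete, here's roughly how the pieces end up arranged. The config file name below is a placeholder (use whatever name the s3-upload README asks for); the important part is that anything in /static gets copied into the root of /public on every build, so the upload settings survive Hugo cleaning that folder:

blog/
  static/
    upload-config.js        <- s3-upload settings (placeholder name)
    aws-credentials.json    <- AWS keys, gitignored
  public/                   <- regenerated by `hugo`; gets copies of both files in its root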

The configuration files contain sensitive AWS credentials, so I need to be careful with how I upload to S3. This means my patterns need to be specific, and I must make sure to gitignore my configuration files so they don't leak out to my repository.
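
For example, a line like this in .gitignore keeps the credentials file out of the repository (a bare filename matches the copy in /static as well as the one Hugo drops into /public):

# never commit AWS keys (Hugo copies this file into /public at build time)
aws-credentials.json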

Hereā€™s what I ended up using for my configuration patterns:

// This file lives in Hugo's /static folder so it ends up in the root of /public
module.exports = {
  // AWS keys live in a separate, gitignored file
  credentials: "aws-credentials.json",
  // the S3 bucket the site gets deployed to
  bucketName: "BLOG-BUCKET",
  // only files matching these globs are uploaded
  patterns: [
    "*.html",
    "*.xml",
    "*.jpg",
    "*.jpeg",
    "*.PNG",
    "*/**/*", // get me any nested directories recursively
  ],
}
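
To round it out, the build and the upload can be chained in a package.json script. I'm assuming here that installing s3-upload gives you a CLI command of the same name and that it runs from inside the configured directory; check the README for the exact invocation:

{
  "scripts": {
    "deploy": "hugo && cd public && s3-upload"
  }
}

Then npm run deploy rebuilds the site and pushes only what changed.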

With these two things set up, getting new posts up has been way easier! I'll definitely be defaulting to this process from now on!

Shoutout to James Talmage for the great library! It would be interesting to see whether Hugo's team would want to incorporate the s3-upload process into their own workflow… Just a thought :)