Setting up a static site in S3

zvekovius

2018/11/28

Categories: website howto Tags: aws s3 cloudfront website self-host write-up

A Static Website

I had a great conversation with a friend from work about doing some devblog stuff. Till then, I hadn’t heard of generating static web pages, especially not with some content generator. I was intrigued to say the least. Here we are, writing a devblog and utilizing some cool stuff. The only problems I have with this plan: The phrase devblog is not my favorite, and I’m not great at writing. Sorry for any readers that may or may not exist. I will be calling things write-ups instead.

The Plan

The hope is to use the following (each of which gets its own section below):

- Hugo to generate the static pages
- Amazon S3 to host them
- Amazon CloudFront for HTTPS
- Amazon Route 53 for DNS
- AWS CodeBuild to build and deploy

Implementing Things

Hugo

I run Ubuntu 16.04 (Xenial) on my box. Your mileage may vary on the below if you aren’t.

Hugo has a good quickstart; I definitely suggest hitting up their website. Here’s my chronicle of the quick start journey:

First, one has to get hugo installed. For Ubuntu, this was a breeze. At first, I did what any sane person would do, and hit aptitude… I guess I was wrong? I thought I was on the path to success.

zvekovius@fastbrick:~/veknet$ apt search hugo

hugo/xenial 0.15+git20160206.203.ed23711-1 amd64
Fast and flexible Static Site Generator written in Go

Cool, I like go, I heard hugo is the bomb. Installed. Little did I know, I was setting myself up for failure.

Got it installed, created a directory, followed the quick start… and error. Hmmpph. A quick search of the google said “Here’s the fix, but that version be ancient. People use the snap these days.”

I haven’t heard too much about snap on Ubuntu. From what I had heard, though, I felt like apt was fine for me. I guess snap is some future thing (that I’m still not sure I want to be a part of).

With defeat fully accepted:

zvekovius@fastbrick:~/veknet$ apt remove hugo

Now… To snap?

zvekovius@fastbrick:~/veknet$ snap install hugo

This is where I find out snap binaries are not in your PATH by default. Alright, edit ~/.bashrc to add /snap/bin to PATH.
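For reference, the PATH fix amounts to one line (this sketch assumes a bash shell; snap puts its binaries under /snap/bin on Ubuntu):

```shell
# Make snap-installed binaries visible in the current shell.
export PATH="$PATH:/snap/bin"

# To make it permanent, append the same line to ~/.bashrc:
#   echo 'export PATH="$PATH:/snap/bin"' >> ~/.bashrc
```

After that, `hugo version` should resolve without a full path.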

Now I’m in business. Look into a few articles about the yaml required, then markdown. Now I’m making a website that doesn’t look ugly with minimal effort. Thanks to those who endure website development and publish themes for free!

Finding a theme

I’m not a web dev. I’m also not a designer. Thankfully, there are folks on the Internet that are both and share their work. Woo hoo! I’m using the hugo-classic theme by goodroot. Getting this theme installed is pretty simple. I also made a few modifications so that the nav links were proper, and some color changes.
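Installing the theme boils down to dropping it into `themes/` and pointing the config at it. A rough sketch, assuming you’re in the site root created by `hugo new site` and that the theme still lives at github.com/goodroot/hugo-classic (check the theme’s README for its actual instructions):

```shell
# Fetch the theme into themes/ (shallow clone; a git submodule also works).
git clone --depth 1 https://github.com/goodroot/hugo-classic.git themes/hugo-classic

# Tell Hugo to use it (config.toml lives in the site root).
echo 'theme = "hugo-classic"' >> config.toml
```

Nav link and color tweaks then happen by overriding the theme’s files in your own `layouts/` and `static/` directories rather than editing the theme directly.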

Figuring out the Amazon

AWS is flexible, but because of that it isn’t always simple. Thankfully, this portion doesn’t seem too painful. I was going to use Google DNS and manage things for free, since DNS is included with Google Domains. However, I ran into an issue: I couldn’t create an alias A-record pointing my veknet.net domain at the S3 bucket URL. Oh well. I had to point Google Domains at the Amazon Route 53 nameservers and let Amazon handle my DNS.

Setting up the S3 buckets

This was pretty painless. Since none of this is automated yet, it was all manual for the first run. The premise is simple: create two buckets (I did three, one for logging). One bucket is your domain name (like veknet.net); the other is your www. domain name for the redirect (www.veknet.net -> veknet.net). They have a pretty simple write-up on their Tutorial page here

This gets you HTTP only. I’m not a fan of that; I like HTTPS. That requires CloudFront, which introduces its own set of hurdles.

New adventure!

I was thinking this would be hard, but I just hit up the AWS tutorial on setting up CloudFront in front of S3. It’ll take a bit for the distribution to be created, meaning you won’t be able to point your Route 53 records at it right away and have it work.

I made the initial mistake of backing the CloudFront distribution with my S3 bucket’s REST API endpoint (the default in the drop-down) instead of the website endpoint. Instead of web pages, I was getting XML when hitting the site. After I changed the origin to the website URL and invalidated the cache (you get 1,000 free invalidations a month), things were looking good.
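The invalidation can also be done from the CLI. A sketch, assuming a configured AWS CLI; `EXAMPLE_DIST_ID` is a placeholder for your distribution’s ID, which you can find in the CloudFront console or via `aws cloudfront list-distributions`:

```shell
# Placeholder -- substitute your real CloudFront distribution ID.
DIST_ID=EXAMPLE_DIST_ID

# Invalidate every cached path so the next request hits the origin.
aws cloudfront create-invalidation --distribution-id "$DIST_ID" --paths "/*"
```

Note that `/*` counts as one invalidation path against the free monthly quota, regardless of how many objects it covers.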

That was easy. I’m starting to feel bad for not calling AWS easy, they’ve really come a long way in making their manual UI a bit easier to navigate and deal with.

Setting up build, upload, and cache invalidation

Most of this is manual at the moment, but here’s the breakdown.

Setting up the CodeBuild:

Here’s my buildspec.yml (super simple):

version: 0.2

phases:
  install:
    commands:
      - curl -Ls https://github.com/gohugoio/hugo/releases/download/v0.52/hugo_0.52_Linux-64bit.tar.gz -o /tmp/hugo.tar.gz
      - tar xf /tmp/hugo.tar.gz -C /tmp
      - mv /tmp/hugo /usr/local/bin
  build:
    commands:
      - hugo
  post_build:
    commands:
      - aws s3 sync --delete public s3://veknet.net --cache-control max-age=3600

Downsides to this build

There isn’t going to be a huge cost for this, mostly because it will probably never see high traffic. However, CloudFront isn’t free, requests to S3 aren’t free, and builds won’t always be free (for my purposes they will be). This project is projected to cost cents a month, so I can probably live with that. The alternative is hosting a web server in my homelab with dynamic DNS and a Let’s Encrypt cert auto-renewed by a cron job.

I’m not using GitHub to host the code. I didn’t actually investigate that path fully, but one AWS article suggested CodeBuild could use GitHub webhooks to build per commit on the branch of my choice. There isn’t a one-click option in the console for that, though. My original plan was to use Azure DevOps, as I had used it for another project and it was silly simple to set things up with GitHub. However, I wanted to give the whole AWS ecosystem an honest chance.

What’s next?

Automate all of the clicks. It took loads of clicking around to get to the point where I could semi-automagically do things. I could get crazy and set up a CloudFormation template, but since the work is already done and doesn’t need to be repeated, there’s probably no sense in that. The next logical step will probably be to kick off the build, deployment, and cache invalidation via a script I run after a commit. Then, if I’m feeling super dangerous, I’ll look into automating that with another AWS service. Rumblings said CodePipeline, but that didn’t seem right at first glance.
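The script version of those steps could look something like this. It’s a hypothetical sketch, not something from this setup: the bucket name comes from the buildspec above, `EXAMPLE_DIST_ID` is a placeholder, and it assumes `hugo` and a configured AWS CLI are on PATH.

```shell
#!/bin/sh
# deploy.sh -- hypothetical one-shot deploy: build, upload, invalidate.
BUCKET=veknet.net
DIST_ID=EXAMPLE_DIST_ID   # placeholder -- substitute your distribution ID

# Build the site into public/.
hugo

# Mirror public/ into the bucket, deleting anything no longer generated.
aws s3 sync --delete public "s3://$BUCKET" --cache-control max-age=3600

# Drop the CloudFront cache so the new pages are served immediately.
aws cloudfront create-invalidation --distribution-id "$DIST_ID" --paths "/*"
```

Running this after each commit gets most of the automation benefit without touching another AWS service.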

The End

It was fun, and I learned a lot about AWS. If you run into questions, feel free to hit me up at zvekovius(at)veknet.net. I left a lot of details out of this write-up, mostly because I fumbled my way through a lot of it and loads of other content already exists for this setup. I linked where I found external resources that should get people running. If there is demand for in-depth things, I can probably spend the time to make it thorough.