Migrating from WordPress to Hugo Part 4: Securing the Site with SSL

I originally drafted this in June 2018. Following on from Sam McGeown’s recent migration to Hugo, I thought I’d finally publish this in case it’s useful for anyone rather than sitting on it until I complete the process!

Why SSL?

It’s only a blog, so why SSL? It’s only going to be static content, so why bother?

In this article I’ll deal with those questions and go through the process of requesting an SSL certificate using AWS Certificate Manager.

Let’s Go Secure!

If you recall, our finishing point is going to be a collection of static HTML files served out by AWS. There’s nothing particularly risky about serving up or requesting such static files; it’s how the internet started out, after all. What’s different now, though, is people’s perception of risk and privacy, and how that’s reflected in the technology we use.

Google, for example, rank SSL sites slightly higher in their search results than non-SSL sites, and have been doing so since 2014. Some modern browsers have started flagging warnings about non-SSL sites, and this will likely become more obvious over time. Users are becoming more picky and aware as a result, or perhaps they’re driving the changes to an extent. Either way, SSL is here to stay and it’s worth setting it up, especially if it’s free!

AWS offer public SSL certificates for free. Let’s go and set one up!

We could do this either via the AWS console or using the CLI. At this time I haven’t worked out how to complete the whole process via the CLI, but I’m going to start there. (Note: for CloudFront, I think that the certificate has to be in the us-east-1 region regardless of where your other resources are.)

The command below requests a new SSL certificate with a subject name of “mpoore.uk” and alternative names of “www.mpoore.uk”, “michaelpoore.com” and “www.michaelpoore.com”. Choosing DNS as the validation method means we’ll have to prove that we own the domains by creating certain DNS entries.
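It should look something like this:

    # Request the certificate in us-east-1, as needed for use with CloudFront
    aws acm request-certificate \
        --domain-name mpoore.uk \
        --subject-alternative-names www.mpoore.uk michaelpoore.com www.michaelpoore.com \
        --validation-method DNS \
        --region us-east-1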

What you get back is a reference to the certificate: its ARN. I’ll need that later.
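The response is a small JSON document along these lines (the account ID and certificate ID shown here are placeholders):

    {
        "CertificateArn": "arn:aws:acm:us-east-1:123456789012:certificate/<certificate-id>"
    }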

Looking at the AWS Console though, you’ll see that the certificate is not yet issued and must be validated.

To validate each of the domains in the certificate, you need to create some DNS CNAME records. Luckily, for mpoore.uk there’s a button for that. For michaelpoore.com, though, I had to create them manually as the DNS for that domain is still with 1&1 for the time being.
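If you need the CNAME names and values to create them elsewhere, you can pull them out of the certificate with the CLI; a sketch, assuming you substitute the ARN saved from the request step:

    # List the validation CNAME record (name, type and value) for each domain
    aws acm describe-certificate \
        --certificate-arn <certificate-arn> \
        --region us-east-1 \
        --query 'Certificate.DomainValidationOptions[].ResourceRecord'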

Once they’re all done, the validation will eventually complete and the certificate will be issued. Just save the certificate ARN value from earlier as it’ll be needed later.

Sadly, this is as far as I got in the process before other things (life, eh) got in the way. I will be back to revisit and complete the process though.

Migrating from WordPress to Hugo Part 3: Hosted Zones in Route53

I originally drafted this in June 2018. Following on from Sam McGeown’s recent migration to Hugo, I thought I’d finally publish this in case it’s useful for anyone rather than sitting on it until I complete the process!

It’s Always DNS

When things go wrong in the IT world, DNS misconfiguration often sits at the root of the problem. It’s important to get DNS right, not only for the site to function correctly, but also because some of the subsequent steps depend on it.

As part of my migration of this blog to Hugo, I’m placing one of the two domains I’ll be using under the control of AWS Route53 (Amazon’s DNS service). I’ll move the other one in time as well.

Creating a Hosted Zone

I tend to use separate providers for domain registration and hosting, as it’s easier to move sites around when you can just update the domain’s nameserver (NS) records to point at the new provider rather than having to transfer the domain as well. Practically all of my domains (I host a couple of sites for local community interests too) are registered through FastHosts.

AWS cater for this sort of arrangement in Route53. From the Route53 dashboard, all I had to do was select “Hosted zones” from the menu and then click the “Create Hosted Zone” button.

All you need to enter is the domain name; the type can be left at its default value (“Public Hosted Zone”).

The zone is created for you, and the console helpfully tells you which nameservers need to be set:
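If you’d rather stay on the command line, a sketch along these lines should create the zone and return those nameservers (the caller reference just needs to be a string that’s unique to this request):

    # Create the zone and print the nameservers to delegate the domain to
    aws route53 create-hosted-zone \
        --name mpoore.uk \
        --caller-reference "mpoore-uk-$(date +%s)" \
        --query 'DelegationSet.NameServers'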

All I then had to do was apply those nameservers to the domain in FastHosts:

Once the dust settles, DNS requests for mpoore.uk will go to AWS for resolution. That’s important because I want to set my site up with an SSL certificate (Amazon will give you them for free) and the validation process requires DNS.

So let’s do that next…

Migrating from WordPress to Hugo Part 2: Basic Tooling

I originally drafted this in June 2018. Following on from Sam McGeown’s recent migration to Hugo, I thought I’d finally publish this in case it’s useful for anyone rather than sitting on it until I complete the process!

Summary of Tools Used

These are the tools I’ll be using during my migration of my WordPress blog to Hugo (in AWS):

  • GitHub
  • SourceTree (git client)
  • Homebrew
  • Hugo
  • AWSCLI
  • Sublime (text editor)
  • AWS S3
  • AWS CloudFront
  • AWS Certificate Manager
  • AWS Route53
  • FileZilla

Building a Toolkit

I’m a Mac user. I have been for a number of years and I don’t plan to switch anytime soon. Most of the tools that I’ll be using either have Windows / Linux versions or there are similar tools available for those OSs. I’ll try not to go into too much OSX-specific detail about any of them, and, if you’re following this process, you might have to adapt to whatever tooling works best for you.

A good number of tools listed above are web-based or cross-platform so shouldn’t present a big problem for anyone. I will be using the command line when I can, hence the inclusion of AWSCLI.

Probably the most OSX-specific tool in that list is Homebrew (aka “Brew”). It’s a package manager for OSX and I’ll be using it to install Hugo and AWSCLI on my laptop. If you’re a Windows user, try Chocolatey instead. If you’re a Linux user, use whatever package manager comes with your distro.

Naturally, the use of AWS services means that you need an AWS account of your own. I’m going to assume that you have one and have got it to a point where you can consume the services above.

Installing AWSCLI and Hugo

Let’s assume that we’ve got Brew installed (it’s easy, the instructions are right there on the homepage). Installing AWSCLI and Hugo is straightforward too!

First, AWSCLI. Just type the following into a terminal window:
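    # Install the AWS command line tools using Homebrew
    brew install awscli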

Once installed, you’ll need to execute the following command to configure AWSCLI with your Access Key ID and Secret Access Key:
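    # Prompts for your Access Key ID, Secret Access Key, default region and output format
    aws configure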

Now let’s do Hugo. Can you guess the command? (I still managed to mistype it!)
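    # Install the Hugo static site generator
    brew install hugo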

Start Your Engines

So, we know how our journey will start and what we expect to find when we get there. We’ve just packed the car. Let’s get going!

Migrating from WordPress to Hugo Part 1: Overview

I originally drafted this in June 2018. Following on from Sam McGeown’s recent migration to Hugo, I thought I’d finally publish this in case it’s useful for anyone rather than sitting on it until I complete the process!

What, When and Why?

I’ve been blogging using WordPress for about 10 years. In recent months I’ve seen several respected bloggers make the move to Hugo and it has inspired me to do the same. You might ask “Why?”, and I have a few reasons:

  • For starters, I want to improve my skills and knowledge in certain areas of cloud technology. The LAMP (Linux / Apache / MySQL / PHP) stack that WordPress sits on isn’t exactly revolutionary.
  • Next, I want to simplify the site itself and reduce the chances of it being hacked.
  • Finally, the most important reason, because I can!

This series of posts will document my journey.

The Starting Point

WordPress is a great solution, don’t get me wrong about that. I’ve been using it since 2008 to host my blog through its various iterations. During that time WordPress has evolved into quite a mature solution, with a rich ecosystem of theme developers and plugins. It just works, and you don’t have to have ninja skills to get your ideas shared with the world. However, every time I log in to my self-hosted WordPress installation, there’s a raft of updates waiting for me to apply. From time to time incompatibilities come up and you have to swap out one plugin for another. Also, as the database grows it can become more of a challenge to back up the site or migrate it to a new hosting provider – something I do from time to time to keep the cost of running it down.

As it stands, the starting point looks something like this when it comes to retrieving content from michaelpoore.com (not trying to teach anyone to suck eggs here, I just fancied drawing a diagram – it also helps compare with the finishing point below):

  1. Web browser requests a page from michaelpoore.com and a DNS query is triggered that results in the nameservers for the domain being queried.
  2. The nameservers for michaelpoore.com (hosted by 1&1) are queried for the website IP address.
  3. A connection is made to the 1&1 CDN (Content Delivery Network) for the requested page. That page may be served directly by the CDN or the backend Apache server may have to provide the content.
  4. Assuming at least part of the content is not cached by the CDN, the Apache webserver receives the request and various PHP scripts are executed to render the page content. Combined with other elements such as images and javascript, the content is returned back to the requesting web browser.
  5. The aforementioned PHP scripts will make numerous queries to the MySQL database.

Now, unless you’re adding lots of dynamic content (which I’m not), and unless the CDN is caching significant portions of the returned content (which I don’t know), there’s a lot going on each time a page is requested. Also, each plugin I add, and the WordPress installation in general, just represents a greater attack surface. I’m not so arrogant as to believe that anyone would want to hack my blog, but you never know.

Of course, I could migrate from a self-hosted solution to a hosted WordPress site and take away some of the issues that I have, such as applying updates to WordPress and the infrastructure (PHP and MySQL, which I have to update via the 1&1 control panel from time to time). I’m typically all for using such solutions, but it seems too easy 🙂

The Finishing Point

Hugo isn’t a webserver at all; it’s a static site generator. It creates a structure of flat HTML files that can be hosted somewhere. As there’s no dynamic content, the pages are very easy to cache. In terms of my finishing point, much of the process looks the same as above:

(One key difference is that I’m introducing another domain name into the mix. This is partly to help with the migration process, but also because I’ll end up redirecting one of them to the other and I wanted a domain name that matched my Twitter handle.)

  1. Web browser requests a page from michaelpoore.com (or mpoore.uk) and a DNS query is triggered that results in the nameservers for the domain being queried.
  2. The nameservers for michaelpoore.com (hosted by AWS Route53) are queried for the website IP address.
  3. A connection is made to the AWS CloudFront CDN (I could also use CloudFlare) for the requested page. That page will likely be served directly by the CDN.
  4. Assuming that the page content cache has expired or perhaps has never been created, the HTML file will be served directly from the AWS S3 bucket.

That should be so much quicker. But how do the HTML files get into S3 in the first place? That’s where Hugo comes in, along with a few more pieces that I’ll cover in a later post.

The Journey

So now that I’ve mapped out the starting point and the finishing point, we’ve got the makings of a journey. Let’s get started!


Amazon Simple Workflow (SWF)

Yesterday, the retail and cloud behemoth Amazon announced a new service for developers called Amazon Simple Workflow (SWF).

Now, I don’t often write about public cloud offerings (in fact I may never have done so before), and I wouldn’t consider myself a developer in the traditional sense, but I thought that this was noteworthy. Normally I write about virtualisation infrastructures and technologies, but what’s interesting here is that SWF is clearly targeted at enabling complex and/or large applications to run in Amazon’s cloud.

Amazon’s own CTO, Werner Vogels, describes SWF as:

an orchestration service for building scalable distributed applications

Some of the possible applications of SWF that Amazon mention are:

  • Automating business processes for finance or insurance applications
  • Building sophisticated data analytics applications
  • Managing cloud infrastructure services

But there are bound to be others. Read Werner’s blog post, which fleshes out SWF’s aims and purpose a little more. For now, I’m going to sit back and listen to what the AWS (Amazon Web Services) people are up to.