UPDATE: The instructions below have been revised to reflect the current deployment process, which uses Hugo, Bitbucket and Firebase.

Sometimes, blogging is the easiest thing in the world: set up an account on Medium or WordPress, or whatever, and in 10 minutes you're off, publishing a new post about your pet peeve of the day. However, such simplicity doesn't cut it for a dyed-in-the-wool DIYer like me: I'd like control over the entire flow - from how my posts look, to my publishing workflow, to the underlying infrastructure that hosts the blog. I want not only to write and publish, but also for the platform and presentation to be truly mine. Why, you ask? Just for the heck of it.

There is one additional motivation: as a technology professional, while my posts are a showcase of my knowledge and technological prowess, the blog itself is a meta-level demonstration of the same abilities - a way of walking the walk, rather than just talking the talk. Hence, I'd like to make the blog truly mine - reflecting my specific interests and idiosyncrasies.

When I started writing posts for this blog, I briefly considered focusing on the content first, before diving deeper into the blogging platform itself. However, as I started jotting down ideas for posts, I found myself drawn more and more to the convenience of Markdown, which led me to wonder: why go through this whole rigmarole of copy/pasting my content over to these rich editing platforms and doing the requisite formatting? Sure, Markdown plugins do exist for these platforms, but is it worth the effort of then doing the process in reverse when I inevitably decide to switch to self-hosting?

So, I set about searching for a low-effort, high-customizability engine for running a blog. I had a few criteria:

  • I didn’t want to pay a lot for hosting; the closer to zero cost, the better, but it had to be reasonably secure.
  • I wanted something that would generate a compact yet beautiful site that didn’t measure in the MBs per page.
  • I didn't want to have to sit and take care of formatting my content. I wanted a default, sane set of templates applied to all of my content, written in Markdown.
  • I wanted my blog to be a unidirectional medium; I have no intentions of engaging with an audience (it may change, but let's cross that bridge when we get there).
  • I wanted it to be extremely fast, with every optimization possible applied to reduce load times.
  • I wanted the flexibility to modify the appearance and behavior of my blog without too much of a hassle.
  • I wanted to learn a few new things in the process, so that I could blog about it.

I considered multiple options: Django, a custom WordPress instance, plain simple text files and other approaches, on a variety of hosting platforms. Out of all of those, the combination of Hugo + Bitbucket + Firebase stood out for the following reasons:

Static-ness and Customizability

Hugo is a static site generator, i.e. it generates HTML, images, JS and CSS. All content is written in Markdown and rendered sanely. That's it. No fancy schmancy server-side logic here. A large part of the content is static HTML - no late-loading content that's a nightmare to make visible to a search engine - so my searchability goes up as well. Since the generated site is static, there's no need for me to deal with inbound communication, i.e. comments. Spam problems don't exist when there's no opportunity for random bots to spam you. Hugo provides a simple, sane theming system; as long as I can compile my new Frobble Widget down to HTML, JS and CSS, I can add whatever Go code I need to do so.

Simplified Publishing Workflow

Bitbucket provides me with a convenient private repository to store my posts, and a CD pipeline (Bitbucket Pipelines) to automatically generate and deploy my static site to prod anytime I choose to cut a release branch.

Cheap, Secure and Fast: Pick Three

Finally, Firebase Hosting is a great, cheap option that deploys to Google-owned CDNs. The free tier - 1 GB of storage and 10 GB/month of traffic - is far more than I expect my blog to need, in either size or traffic (assuming I don't get DDoSed off this planet).

My Blogging Workflow

With these tools set up, my blogging workflow has become the following: I write my content in VS Code and save it to a Google Drive folder that I keep synced to my Mac. On my tablet, I use Working Copy to sync my Bitbucket repo locally, and Textastic to edit. (I used to use StackEdit for this purpose, but found it too annoying to use regularly; besides, I needed to perform a git push/pull regularly anyway to make sure the device I was writing on had the latest version of my drafts.) Occasionally, I go to the corresponding folder on a machine synchronized with Google Drive and check the changes in to Bitbucket. When I'm ready to publish a post, I go to Bitbucket and branch off a new release version. This triggers Bitbucket Pipelines to build my site and publish it to Firebase Hosting.

I like that there are multiple levels of redundancy built in: the Google Drive contents are backed up on my local machine, and checking in ensures that the changes are versioned in Bitbucket. Finally, explicitly cutting versions means that I always have a last-known-good version of my site to roll back to.

The rest of this post is a tutorial on how to set up a blog using this winning combination. Note that it is accurate as of the date of publication; at the pace at which cloud services keep evolving, it may become out of date over time. I'll do my best to keep it up to date as I adjust my setup, but I cannot guarantee complete accuracy.

Setting Up Google Drive for Desktop

This part is fairly straightforward: install Google Drive for Desktop, and make sure that all of your activities (e.g. the git clone step below) happen within the Google Drive directory. This ensures that the repo you create in the next step is backed up to Google Drive in addition to Bitbucket, which is especially useful if you want to start work on one machine and continue on another without losing your state. Also make sure to disable automatic conversion of known file types in Drive Settings (look for "Convert uploads to Google Docs editor format" and turn it off); this prevents the Markdown files in the content/ directory below from being converted to Google Docs automatically (this bit me the first time I uploaded my blog repo to Google Drive, unfortunately).

Setting Up Bitbucket

No modern software development setup is complete without a VCS, and my VCS of choice for this setup is Bitbucket - not only because it provides Git hosting, but also because it comes integrated with Bitbucket Pipelines, which makes the job of publishing my site as easy as writing a single configuration file.

To set up Git on Bitbucket, follow the instructions here. This will create an empty repository with no files or, if you so choose, a README.md file.

Once the remote repository is set up, you'll want to clone it locally (instructions here) so that you can initialize your Hugo setup within it, then make changes, commit, and push modifications up to the remote repo. Before cloning from Bitbucket, though, you'll need to set up git-credential-helper, which ensures that you have the credentials required to clone the repository. The instructions below are for OS X; similar / equivalent instructions should apply for Windows & Linux:

brew install git-credential-helper

Git will use the credential helper to initiate an OAuth flow, which will allow it to connect to Bitbucket with the appropriate credentials to clone your repo. Now you can clone using the usual command (replace <username> and <reponame> with your Bitbucket username, and the name of the repo you wish to clone):

git clone https://<username>@bitbucket.org/<username>/<reponame>.git

Setting Up Hugo

Installation

Installing Hugo for your OS is as simple as following the instructions on the Hugo website.

Hugo provides a CLI called hugo that takes care of most tasks. Initialize the site by running the following command in the directory where you want your blog structure to live (the --force flag lets Hugo scaffold into a directory that already contains files, such as the repo you just cloned):

hugo new site . --force

This will create a few subdirectories; the ones of interest are content/, themes/, and later on, public/. We’ll cover each of these subdirectories in more detail in the following subsections.
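
For reference, the scaffold looks roughly like this (the exact contents vary by Hugo version; newer releases name the configuration file hugo.toml, though config.toml is still recognized):

archetypes/
content/
data/
layouts/
static/
themes/
config.toml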

You now need to commit all the files and directories that hugo new site created for you.

git add -A
git commit -am "Initial commit"

This commits the files/directories you have in the current directory to the local repo you’ve just created.

However, this step only committed your changes in the local repository. You still need to push changes committed to your local repo to your remote one. You can do so by executing the following command:

git push

This will use the Bitbucket credentials previously created during the clone step to push your local changes to the remote. Visit the Bitbucket website and check your repository to make sure that your freshly generated blog structure is all there.

Hugo Basics

Hugo, as mentioned before, generates static sites by compiling Markdown to HTML. In the generated directory structure, content/ contains all of your content. I like to keep my posts in a subdirectory of content/ called posts/. To create a new post, use the hugo CLI tool:

hugo new posts/this-is-a-post.md

Hugo automatically initializes the post with some metadata, known as front matter. The best part about Hugo is that you can add more metadata to the post to support your specific needs; since the front matter sits at the top of the file between delimiters (--- for YAML or +++ for TOML), Hugo strips it out during rendering and it never appears in the generated page.
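
For illustration, the front matter generated for a new post looks something like the following; the exact fields and delimiters depend on your Hugo version and archetype, and the values here are placeholders:

---
title: "This Is a Post"
date: 2024-01-01T10:00:00-07:00
draft: true
---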

The central configuration file that drives much of how your site ends up looking is config.toml. I didn't have much to do here beyond setting the theme and the site name, since I'm planning to go with the defaults for now.
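
As a rough sketch, a minimal config.toml along these lines is enough to get started; the baseURL, title and theme values are placeholders for your own:

baseURL = "https://example.com/"
languageCode = "en-us"
title = "My Blog"
theme = "my-theme"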

Hugo provides a quick-and-dirty server that serves up rendered content so you can see what your site will look like. It supports hot reloading, so you can see edits to your pages as they happen. Start the hugo server by running:

hugo server

which will start the server at http://localhost:1313. To stop the server, just press Ctrl-C.
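
One thing worth knowing: posts created with hugo new are marked as drafts by default, so they won't show up in the preview unless you ask for drafts explicitly:

hugo server -D # -D (i.e. --buildDrafts) includes draft content in the local preview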

Hugo Themes

Themes are one of Hugo's most compelling features; the Hugo website lists hundreds of themes you can use as a starting point for your blog. You can have multiple themes available under the themes/ folder and switch from one to another by simply changing config.toml. There are, however, several ways to get themes into the themes/ folder in the first place.

One option is to git clone the theme you want from the GitHub repo listed on the Hugo website. While this is an easy approach, it also leads to some inconveniences: what if the theme owner updates the theme to add new features? You'll need to update it yourself. Moreover, the entire codebase of the theme's repo will need to be checked into your repository.

Instead, I went down the route of setting up git submodules. Using git submodules is easy: just go to the location where you want the theme to be cloned (say the themes directory), and run the following command:

git submodule add https://github.com/example/theme.git

This does a few things: it adds a file called .gitmodules at your repository root and drops the contents of the theme repository into your current directory; but if you run git status, you'll find that only the newly created directory is recorded as added, rather than the contents of the entire repository.
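
For example, after adding a theme from within the themes/ directory, the .gitmodules file at the repository root will look roughly like this (the repository URL and theme name are placeholders):

[submodule "themes/theme"]
    path = themes/theme
    url = https://github.com/example/theme.git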

Git understands that this is a remote repository you are depending upon, and as such, it should be tracked separately; it tracks this directory using a special mode so that when you eventually run git push origin master to push your local changes to your central repository, you aren't pushing the contents of the dependency as well. This makes things much more convenient: you can stay up to date on all submodules with a single command - git submodule update - and you aren't using up space in your central repo to store your dependencies' code.

One last thing to note: when you clone your repo on a new machine, you'll need to run git submodule init followed by git submodule update in order to clone the dependencies as well; without these steps, your submodules won't be present on the new machine. As you'll see later, this also applies in the continuous deployment pipeline.
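
Concretely, bringing the blog up on a fresh machine looks roughly like this (substitute your own username and repo name):

git clone https://<username>@bitbucket.org/<username>/<reponame>.git
cd <reponame>
git submodule init
git submodule update
# Alternatively, clone and fetch the submodules in one shot:
# git clone --recurse-submodules https://<username>@bitbucket.org/<username>/<reponame>.git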

After getting your theme cloned locally, enabling it is a simple config change in config.toml - just set the theme = entry to the name of the theme's directory under themes/.

Setting Up Firebase Hosting

Firebase Hosting allows you to store static content in a storage bucket and serve it from Google's CDN. It is quite simple to set up: follow step 1 of the "Add Firebase to your App" guide to create your Firebase project, and then follow the steps in the Quickstart Guide here to deploy your site.
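
In rough outline, the Quickstart boils down to installing the Firebase CLI and initializing Hosting at the root of your repo; the commands below are a sketch of that flow (when prompted for the public directory, point it at Hugo's public/ output directory):

npm install -g firebase-tools  # Install the Firebase CLI
firebase login                 # Authenticate with your Google account
firebase init hosting          # Select your Firebase project; set the public directory to "public"
hugo                           # Generate the static site into public/
firebase deploy --only hosting # Push the generated site to Firebase Hosting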

If you have a custom domain, you can also set up your registrar’s DNS provider to point to your Firebase app. Follow the instructions here.

Bitbucket Pipelines for Deployment

I now have a way to write my posts and a place to publish the static site; how do I connect the two with the least amount of work? The answer is Bitbucket Pipelines. The moment I push a release branch to my repo, the Bitbucket Pipelines infrastructure kicks in to auto-generate my static site and push it to Firebase Hosting. Fortunately, Atlassian provides a helper pipe to deploy directly to Firebase Hosting, so the remaining overhead is setting up a service account and configuring the pipeline to run whenever a release branch is created. These are the overall steps:

  1. Create a service account in the Google Cloud project associated with the Firebase app, grant it the appropriate roles, and create and download a JSON key for it.
  2. Configure the Bitbucket deployment environment to include the base64-encoded key and the project ID, so that the Atlassian Firebase deploy pipe has the information it needs to push to Firebase.
  3. Create a Bitbucket Pipelines run that triggers whenever a new release branch is created, and executes the script defined in bitbucket-pipelines.yml.
  4. Update the auto-generated bitbucket-pipelines.yml to contain the script listed below.

This should allow you to automate the process of generating the Hugo site, and pushing it out to Firebase hosting.

Google Cloud Service Account & API Key Creation

This set of instructions is based on learnings from here; thanks to Jan Kir for doing the hard work of figuring this out.

Open the Firebase Console and select your project. In the left side menu, click on the settings icon and open the Project Settings. In the tab bar, open the Service Accounts view. Then click on Manage service account permissions in the top right corner. This opens a new tab with the Service Account settings in the Google Cloud area, so we’re leaving Firebase here.

In the Google Cloud Service Accounts settings, click the button to create a new service account in the top menu. Enter a name (and optionally a description) for the new account and press "Create and Continue".

Now comes the important part: you need to add roles to the service account that grant the permissions needed to deploy your site to Firebase Hosting. Depending on which APIs you use, this might vary. In particular, you need the following roles (this list has been modified from the linked article):

  • Service Account User
  • Firebase Authentication Viewer
  • Firebase App Hosting Service Agent
  • Firebase Hosting Admin

If you still see permission errors later on, you can add an extra role with the needed permission and try again.

DISCLAIMER: These roles probably give more permissions to your service account than are needed to host your blog on Firebase. For security reasons, it is generally recommended to use as few permissions as possible. I tried to find the roles with the fewest extra permissions that still worked for me, but this can most likely be improved.

Skip the third step (granting users access to this service account) and simply click "Done" to finish. You also need to enable IAM for your project.
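
If you prefer the command line, a roughly equivalent setup with the gcloud CLI looks like the following; the account name bitbucket-deployer is illustrative, <project-id> is your Google Cloud project ID, and you should adjust the role list to match the roles above:

gcloud iam service-accounts create bitbucket-deployer --project <project-id>
gcloud projects add-iam-policy-binding <project-id> \
  --member "serviceAccount:bitbucket-deployer@<project-id>.iam.gserviceaccount.com" \
  --role "roles/firebasehosting.admin"
gcloud projects add-iam-policy-binding <project-id> \
  --member "serviceAccount:bitbucket-deployer@<project-id>.iam.gserviceaccount.com" \
  --role "roles/iam.serviceAccountUser"

Either way, you still need to create and download a JSON key for the account, which is covered next.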

Back in the list of service accounts, click on the account you just created. In the account's settings, select the Keys view, click "Add Key" and then "Create New Key". Make sure the key type is JSON, then click "Create". A JSON key file is automatically downloaded to your computer; this is the file we need to use in the CI. Generate a base64-encoded version of the JSON key as follows (replace <key-file.json> with the name of your key file):

cat <key-file.json> | base64 -w 0 > key_file.base64
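
Note that the -w 0 flag (which disables line wrapping) is specific to GNU coreutils. On macOS, base64 doesn't support -w, but it also doesn't wrap its output by default, so the following should be equivalent (it's worth double-checking that the output is a single line):

cat <key-file.json> | base64 > key_file.base64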

You will use the contents of this key_file.base64 in the next step, where we will set up the deployment environment secrets. Keep the JSON file and the base64-encoded version safe; they provide direct access to modify your Firebase project, so you'll need to be extremely careful about where you leave them lying around.

Bitbucket Deployment Environment Configuration

Bitbucket sets up a default "Production" deployment environment, which is sufficient for my purpose here. I updated Bitbucket's settings (Repository Settings > Deployments) to add environment variables that Bitbucket Pipelines will use to push to Firebase. These variables can be marked as secured, so the secrets never have to appear unencrypted in your code or in Bitbucket's configuration UI. Define the following two variables, with the corresponding values (make sure to enable the "Secured" checkbox); these are the names referenced in the pipeline script below:

  • FIREBASE_KEY_FILE: the contents of the base64-encoded key file.
  • FIREBASE_PROJECT: the Firebase (not the Google Cloud) project ID. This should be available in your Firebase console.

Bitbucket Pipelines Configuration

Follow the steps in this Getting Started guide. Choose the "Other" option - this allows you to use a custom Docker container that can run your own code, which we'll need for the Hugo site-generation steps.

Note that Atlassian limits free Bitbucket Pipelines plans to 50 build minutes/month, so you should make sure you aren't pushing too often. That's the reason I've used a 'release/*' branch pattern: it allows me to explicitly decide when I want to push out a new release.

Update your bitbucket-pipelines.yml

Copy/paste (with appropriate modifications) this bitbucket-pipelines.yml script (I've added comments to explain the steps):

image: atlassian/default-image:5

pipelines:
  branches:
    release/*: # This ensures that any branch matching 'release/*' will be built and deployed.
      - step:
          name: "Download, install & run hugo 0.150.0, then deploy to Firebase"
          deployment: production
          script:
            - git submodule init # We use submodules for themes, so we need to ensure they are properly initialized.
            - git submodule update --recursive --remote # Fetch the submodules so hugo knows where to find them
            - wget https://github.com/gohugoio/hugo/releases/download/v0.150.0/hugo_extended_0.150.0_linux-amd64.deb # Or the latest version of Hugo you've tested your code with
            - dpkg -i hugo_extended_0.150.0_linux-amd64.deb # Install hugo
            - hugo # Generate the static site
            - pipe: atlassian/firebase-deploy:5.1.1 # Hand it off to the firebase deployer.
              variables: # You got this from the previous section.
                KEY_FILE: $FIREBASE_KEY_FILE
                PROJECT_ID: $FIREBASE_PROJECT

This file should be checked in at the root of the repository and must be named bitbucket-pipelines.yml. Bitbucket will automatically detect it and execute it any time a new release branch is cut (or updated).
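
Cutting a release is then just a matter of creating and pushing a branch that matches the release/* pattern; the branch name below is only an example, and your default branch may be main rather than master:

git checkout master
git pull
git checkout -b release/2024-06-01
git push -u origin release/2024-06-01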