Categories
Deployment DevOps Docker Optimization

Migrating this blog to self-hosted again

4 years ago I migrated this blog to Hostinger from a self-hosted docker instance. With the 48 month plan ending in 4 days, I went back to self-hosted once again.

Why?

Mainly cost 💸. The renewal price is now more than 230% of what I paid last time: $44.16 vs $104.62 (after discounts) for 48 months of hosting. For something that I barely use and that gets barely any traffic, there’s little to no incentive for me to pay ~$2.18 USD/month for this blog.

                 Current    Renewed
48 months (USD)  $44.16     $104.62
Monthly (USD)    $0.92      $2.18

“Well surely self-hosted can’t be free right?”

You’re right, it isn’t “free” per se, but because I have a home lab server running anyway, I might as well use the spare capacity to host the blog. (again, the home lab is something I should write about, hopefully next week)

It took me about an hour to fully migrate over; it was a smooth process with a tiny bit of pain (self-inflicted carelessness).

The home lab is a mini PC with a measly Intel N100 CPU, 16GB of RAM, and a 500GB SSD. I’m shocked by how many services it can host comfortably; it has completely changed my view on what’s possible with these small machines.

This blog is hosted on Docker, as expected. But it’s a Docker container inside an Ubuntu VM, inside a Proxmox host. The idle stats are pretty decent, consuming about 1GB of RAM.

NAME            CPU %     MEM USAGE / LIMIT     MEM %     NET I/O           BLOCK I/O      
wordpress       0.01%     355.5MiB / 7.752GiB   4.48%     4.12GB / 288MB    75.8MB / 2.25GB
wordpress-db    0.71%     533.6MiB / 7.752GiB   6.72%     78.8MB / 3.6GB    3.45MB / 1.87GB
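
For reference, a minimal docker-compose sketch of what those two containers boil down to; the image tags, volume paths and credentials below are placeholders rather than my exact config.

services:
  wordpress:
    image: wordpress:6
    restart: unless-stopped
    depends_on:
      - wordpress-db
    environment:
      WORDPRESS_DB_HOST: wordpress-db
      WORDPRESS_DB_NAME: wordpress
      WORDPRESS_DB_USER: wordpress      # placeholder
      WORDPRESS_DB_PASSWORD: changeme   # placeholder
    volumes:
      - ./wp-content:/var/www/html/wp-content

  wordpress-db:
    image: mariadb:10.11
    restart: unless-stopped
    environment:
      MARIADB_DATABASE: wordpress
      MARIADB_USER: wordpress           # placeholder
      MARIADB_PASSWORD: changeme        # placeholder
      MARIADB_RANDOM_ROOT_PASSWORD: "1" # don't reuse the app password for root
    volumes:
      - ./db-data:/var/lib/mysql

Everything else (caching, TLS) sits in front of this pair.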

To make this work, we employ the usual caching strategies and pre-load the pages to make sure they are already cached on the server and ready to go.

  • WordPress: Some kind of caching plugin, e.g. WP-Optimize + Jetpack
  • CDN: Cloudflare

While setting this up, I also found out that there is Redis object caching for WordPress, but it seems to be useful only if the site does a lot of database reads. My gut feel says mine doesn’t, so I’m omitting Redis until the day this setup can’t keep up anymore.
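
If the database ever does become the bottleneck, bolting Redis on later is cheap. A rough sketch of the extra service, assuming the Redis Object Cache plugin on the WordPress side (the memory cap is a placeholder):

services:
  redis:
    image: redis:7-alpine
    restart: unless-stopped
    # cap the cache so it can't crowd out WordPress and MariaDB
    command: redis-server --maxmemory 64mb --maxmemory-policy allkeys-lru

The plugin would then be pointed at it with something like define('WP_REDIS_HOST', 'redis'); in wp-config.php.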

All things considered, pretty good performance!

Lighthouse report from Chrome: 99 performance score

Of course, I’ve no idea how this’ll perform under load, but given that there’s barely any dynamic content on this site, it’s unlikely that this setup will buckle under any typical load.

To expose this blog over the internet, I use Cloudflare Tunnel, which saves me the usual hassle of securing the connection to my server with origin certificates.

illustration from: https://blog.cloudflare.com/getting-cloudflare-tunnels-to-connect-to-the-cloudflare-network-with-quic
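
In compose terms it’s just one more container on the same network as WordPress. A sketch, with the tunnel token (created in the Zero Trust dashboard) left as a placeholder:

services:
  cloudflared:
    image: cloudflare/cloudflared:latest
    restart: unless-stopped
    command: tunnel --no-autoupdate run
    environment:
      TUNNEL_TOKEN: ${TUNNEL_TOKEN}   # placeholder; keep the real token out of the compose file

The tunnel’s public hostname is then pointed at http://wordpress:80 in the dashboard, so nothing on the VM is exposed to the internet directly.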

It’s secure and easy to set up; I’d recommend it to anyone who wants to host public services. There is one major caveat: Cloudflare can see all traffic between your origin server and their network, so you have to trust Cloudflare. Honestly, it’s kind of inevitable that you have to place your trust in someone or something. Given their track record of transparency when there’s downtime or when shit hits the fan, they’ve earned my trust.

While I was aiming for zero downtime, there was unfortunately about 10 minutes of it.

The importance of uptime monitoring, which I’m planning to self-host in the near future.

I had the new site up and running, so the switch was a simple DNS cutover. Unfortunately, I forgot to take DNS propagation time into account, and clients that got the old IP ended up not being able to reach the site. To be honest, I still don’t understand why it failed, because they should have kept seeing the old site and seamlessly switched over once the new DNS records kicked in. Let me know in the comments if you have any ideas!

Summary 📖

Thanks to the beauty of virtualisation, I’ve saved myself $104.62 USD over 4 years. If this mini PC server lasts anywhere near as long as that, it will have paid for itself plus interest (especially counting the other services it’s hosting).

Now, on to figuring out an automated backup solution…

Categories
Deployment Docker

Automated Torrenting Configuration using Docker

I was sick of downloading my shows manually; it actually takes up quite a bit of time, especially if you add it up over the years. Before I had my server set up, I was running Deluge with the YaRSS2 plugin, which worked wonderfully well as long as my computer was turned on (kind of a power hog).

http://dev.deluge-torrent.org/wiki/Plugins/YaRSS2

But since I have a low-power server now, I can let it run 24/7 without worries. Here’s my experience with it.

Diagram of current setup
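
In Docker terms the torrenting piece is a single service. A rough sketch using the linuxserver.io image; the IDs, timezone and paths are placeholders:

services:
  deluge:
    image: linuxserver/deluge:latest
    restart: unless-stopped
    environment:
      PUID: "1000"    # placeholder user/group IDs
      PGID: "1000"
      TZ: Etc/UTC     # placeholder timezone
    volumes:
      - ./deluge-config:/config    # YaRSS2 feeds and plugin state live in here
      - ./downloads:/downloads
    ports:
      - "8112:8112"                # web UI

With restart: unless-stopped, the RSS feeds keep getting checked even after the server reboots.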

Categories
Deployment Docker

WordPress Multisite

So… wow, I finally managed to get it all up and running. The amount of effort was way more than I would’ve liked, but at least it’s done now. There are a ton of things I’d like to write about, especially the troubleshooting steps I took, so that it’ll be easier to migrate this configuration in the future.

First of all, I tried on my own to get the subdomain routing working with jwilder/nginx-proxy, along with MariaDB and the official WordPress image.

https://github.com/jwilder/nginx-proxy
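
The idea behind jwilder/nginx-proxy is that it watches the Docker socket and generates an nginx vhost for every container that sets a VIRTUAL_HOST variable. Roughly what I was aiming for, as a sketch (database details omitted):

services:
  nginx-proxy:
    image: jwilder/nginx-proxy
    restart: unless-stopped
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - /var/run/docker.sock:/tmp/docker.sock:ro   # lets it discover other containers

  wordpress:
    image: wordpress:latest
    restart: unless-stopped
    environment:
      VIRTUAL_HOST: blog.lordofgeeks.com   # nginx-proxy routes requests for this hostname here
      WORDPRESS_DB_HOST: mariadb           # the MariaDB container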

I will write more about the proxy as well as the Let’s Encrypt SSL containers in another post.

Unfortunately, for whatever ungodly reason I wasn’t able to get it up and running. So after some Google-fu, I came across this article that helped me greatly.

https://cianallner.com/ultimate-wordpress-docker-setup-guide/

I ended up not using the docker-compose method because I was trying to troubleshoot why I wasn’t able to obtain an SSL certificate from Let’s Encrypt. Bad news: SSL still isn’t working yet, and while I was debugging it I hit the rate limit on the number of certificates I could request in an hour/day/week. Hopefully once that’s sorted out this site will have a proper SSL certificate.
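
For the record, the Let’s Encrypt piece is the companion container that pairs with nginx-proxy. A sketch of how it slots in (the shared volume layout here is an assumption, and nginx-proxy needs the same three volumes mounted):

services:
  letsencrypt-companion:
    image: jrcs/letsencrypt-nginx-proxy-companion
    restart: unless-stopped
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro
      # shared with nginx-proxy so certificates land where nginx expects them
      - certs:/etc/nginx/certs
      - vhost:/etc/nginx/vhost.d
      - html:/usr/share/nginx/html

volumes:
  certs:
  vhost:
  html:

The WordPress container additionally gets LETSENCRYPT_HOST and LETSENCRYPT_EMAIL set so the companion knows which certificate to request. While debugging, it’s also worth pointing the companion at Let’s Encrypt’s staging environment (check its docs for the exact setting) instead of hitting the production rate limits over and over.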

WordPress Multisite

http://www.wpbeginner.com/wp-tutorials/how-to-install-and-setup-wordpress-multisite-network/

I wanted to have the ability to host multiple WordPress sites, for my own testing/development as well as for my freelance work. Instead of running a separate new WordPress installation every time I need a new site, multi-site allows me to run multiple sites off a single installation and manage them through a centralized zone.

There are two ways of running this.

  1. sub-domain
  2. sub-directory (chosen)

The reasons for choosing sub-directory were pretty straightforward for me.

  1. There is no need for pretty URLs, e.g. xyz.lordofgeeks.com, for the sites I’m hosting
  2. Let’s Encrypt doesn’t offer wildcard certificates where 1 certificate can cover all sub-domains under *.lordofgeeks.com
  3. It makes sense that all of the sites belong to blog.lordofgeeks.com/[name-of-site]

For point 2, Let’s Encrypt will start offering wildcard certificates from 2018 onwards. So all my effort in setting this up will be for nought, but it’s still a good learning experience.
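
For reference, here’s roughly how the sub-directory multisite constants end up looking with the official WordPress image. Newer versions of the image accept a WORDPRESS_CONFIG_EXTRA variable; with an older image the same defines go straight into wp-config.php:

services:
  wordpress:
    image: wordpress:latest
    environment:
      # sub-directory multisite, enabled after running the network setup wizard
      WORDPRESS_CONFIG_EXTRA: |
        define('WP_ALLOW_MULTISITE', true);
        define('MULTISITE', true);
        define('SUBDOMAIN_INSTALL', false);
        define('DOMAIN_CURRENT_SITE', 'blog.lordofgeeks.com');
        define('PATH_CURRENT_SITE', '/');
        define('SITE_ID_CURRENT_SITE', 1);
        define('BLOG_ID_CURRENT_SITE', 1);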

Everything went fine until I added a new site, blog.lordofgeeks.com/dev/, and tried to upload a file that’s >1 megabyte.