Backup for WordPress

Backups are essential to any system, especially for data that can't easily be downloaded again, like a blog. Even though I should really employ a system-wide backup for my server, I'm still working out the most cost-effective and efficient way of making that happen.

In the meantime, I decided to back up on a per-application basis while figuring out the best solution. Enter BackWPup.

BackWPup – WordPress Backup Plugin

Based on my research, it's by far the best "free" plugin you can use to manage most of your backup needs. It's "free" because there is a pro option, just like most of the other "free" backup plugins out there. BackWPup is great because it's not crippled like the rest.

The free version allows you to do manual or scheduled full-site backups to multiple destinations, such as:

  • FTP
  • Email
  • Local Folder
  • Dropbox
  • Azure
  • SugarSync
  • Rackspace
  • S3

There are also options for which DB tables to back up and which folders to exclude. It's simply a solid plugin. The pro version unlocks more backup targets like AWS Glacier and GDrive, plus more advanced options if you really need them. The free version is good enough for me for the time being.

This is also great because it supports WordPress Multisite configurations, which is what I have running. I can run it as the super admin and back up all the sites at once, saving me a heck of a lot of time.

As for the backup itself, I went with the FTP option since I'm a cheapskate, and having a NAS is pointless if I don't make full use of it. I created a job, ran it, and voila! It appeared on my NAS like magic.

Zip backup of WordPress

I opened the archive and, sure enough, it contained the entire WordPress installation. The whole backup took only 72 seconds, and I'm really pleased with the performance.

— Update @ 11:15pm —

Since I took the AWS architecting course a couple of months ago, I felt it would be a waste not to experiment with S3 storage. Even within this short span, AWS has updated their interface again; thankfully, it actually made things easier to configure, and they now recommend good security defaults.

After tinkering with IAM users and groups and attaching a policy to the group, I finally got a dedicated access key for the plugin, granting it API access to my S3 buckets.
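
For reference, a policy along these lines grants what the plugin needs (a minimal sketch, not necessarily my exact policy; the bucket name is a placeholder):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::my-backup-bucket"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::my-backup-bucket/*"
    }
  ]
}

So I re-ran the backup…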

AWS: Encrypted backup as easy as that

— End update —

Which raises the question: how do I restore from it?

The Caveat

This plugin currently doesn't support restore functions. What?!

Yup, there's currently no way to restore natively from the plugin. A restore feature is in beta testing, so maybe it'll arrive in the future, but I wouldn't put my money on it.

Restoring from the backup is more of a server-level operation than an application-level one. The backup preserves the entire WordPress installation folder, so you simply FTP in and copy the whole backup across. Then connect to the SQL server and import the whole damn database dump.
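
In shell terms, a restore would look roughly like this (a sketch only; the paths, user, and database name are placeholders, not taken from my actual setup):

# 1. Copy the extracted backup back into the web root
scp -r backup/wordpress/* user@server:/var/www/html/

# 2. Import the SQL dump that BackWPup bundles into the archive
mysql -u wp_user -p wordpress_db < backup/wordpress.sql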

I definitely have to test this out one day to see if I can restore the entire installation onto another machine. What's worse than not backing up is realizing that your backups don't work.

Landing Page

Even though there are exams and whatnot, a geek can still eke out a little time here and there for passion projects. Blogger used to host all of my blog posts, but ever since switching over to WordPress, I have a "free" domain waiting to be used.

www.lordofgeeks.com

I finally created a portfolio site for myself, even though I'm not quite sure what to put on it yet. In fact, most of it is still a work in progress, but I figured it makes more sense to have this than a page that doesn't work at all.

Since there are only 3 pages, I wrote it entirely in HTML, hoping to convert it to some form of static NodeJS page in the future. Still, I've employed all the new techniques I've picked up recently, as well as some concepts I've learnt in school.

CSS Grid

One of the latest additions to CSS3 is CSS Grid, which allows you to be really flexible with your layout. If you haven't heard of it, you definitely need to check it out, because it cuts out a lot of hacky code and reduces the need for templating systems.

A Complete Guide to Grid

https://developer.mozilla.org/en-US/docs/Web/CSS/CSS_Grid_Layout

Those are two links to get you started, but I also recommend watching some YouTube videos to speed up the initial understanding. I built the portfolio page using this new syntax, along with flexbox for some responsive goodness.
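
To give a flavour of it, here's a minimal sketch of a two-column Grid layout (the class names are made up for illustration; this isn't the actual portfolio stylesheet):

.page {
  display: grid;
  grid-template-columns: 1fr 3fr;
  grid-template-areas:
    "header header"
    "sidebar content";
  gap: 1rem;
}

.page > header { grid-area: header; }  /* spans both columns */
.page > nav    { grid-area: sidebar; }
.page > main   { grid-area: content; }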

ES6 Syntax

The "new" Javascript syntax. It's probably not that new at this point, but I'd never used it until now. The greeting that changes dynamically every 4 seconds is built purely in Javascript, without any jQuery functions, to make it more efficient.

In the search for efficiency, I used a generator function that yields a value only when it's called. The built-in setInterval() drives the calls, and a timeout sets the animation timing. The animation itself is done with CSS transitions, and since only opacity is being animated, everything stays extremely streamlined.
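
Here's a minimal sketch of that pattern (the greetings, element ID, and timings are placeholders, not my actual code):

// Assumes an element like <span id="greeting"> styled with
// transition: opacity 0.5s;
const greetings = ['Hello', 'Bonjour', 'Hola', 'Kia ora'];

// Generator that cycles through the greetings, yielding one per call
function* cycle(items) {
  let i = 0;
  while (true) {
    yield items[i];
    i = (i + 1) % items.length;
  }
}

const next = cycle(greetings);
const el = document.querySelector('#greeting');

setInterval(() => {
  el.style.opacity = 0;                  // CSS transition fades out
  setTimeout(() => {
    el.textContent = next.next().value;  // swap text while invisible
    el.style.opacity = 1;                // fade the new greeting in
  }, 500);                               // matches transition duration
}, 4000);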

“MVC” Template (in progress)

MVC comes in quotation marks because it really isn't MVC; it just kind of looks like it (for starters, there's no model).

(I couldn't avoid jQuery after all.) Using jQuery's load() function, I can split the main content into different views, then dynamically load them into the page on demand.

This lets me reduce each page to essentially this one block of code.

snippet from body
custom jquery code to load views

The custom Javascript (copied from Stack Overflow) loads the views in automatically without me having to specify them for every single page.
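
A rough reconstruction of the idea looks like this (the markup, attribute name, and view file names are illustrative, not the actual snippet):

<!-- snippet from the body: each placeholder declares its view -->
<div class="view" data-view="views/about.html"></div>

<script>
// Load every declared view fragment into its container
$(function () {
  $('.view').each(function () {
    $(this).load($(this).data('view'));
  });
});
</script>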

However, for some reason I can't pin down, this is a finicky solution: it refuses to load the page correctly from time to time and behaves differently from browser to browser. This erratic behaviour led me to create another page to beta-test it before I push it out officially.

Writing this was really troublesome because I couldn't get the syntax highlighting to work, possibly because of the mobile optimization as well as the Grammarly plugin I use to help with my writing. Hopefully, I can get it working next time, because screenshotting is a major pain in the butt.

Optimizing WordPress

With the entire WordPress multisite installation running off a really tiny server, I have to use pretty much every tip and trick I know to keep performance at an acceptable level (while staying secure).

These are the 4 techniques I used to optimize my WordPress installation.

  1. CDN (Cloudflare)
  2. Caching (WP Super Cache)
  3. Jetpack
  4. AMP

If there's anything I missed, please let me know and I'll test it out.

1. CDN (Cloudflare)

Cloudflare

I wrote briefly about this in the previous post, Cloudflare CDN, but I'll go slightly deeper here. There is a WordPress-specific plugin that lets you load optimized defaults.

In the Speed tab, I enabled auto-minify for Javascript, CSS, and HTML. CF caches those static files and attempts to minify them when serving them to clients. I also enabled AMP (Accelerated Mobile Pages) in the settings*.

* You have to ensure that your site has an AMP-compatible mobile version. I've done this through the plugin in point 4.

In the Cache tab, I left it on Standard, as that works well with WordPress. I also set the browser cache expiration to 4 hours, which seems like a decent compromise on how quickly end users see updates, since I don't write posts that quickly anyway.

There are more settings available only in the paid tiers, like the ability to optimize and cache images on CF, which would definitely help improve loading times. But since I'm cheap, I decided to use WordPress.com's Jetpack plugin instead, which gives me a free CDN for loading images. See point 3.

2. Caching (WP Supercache)

WP Super Cache

Caching at the WordPress application level works on the concept of output caching, if I'm not mistaken. The site serves cached copies of dynamic content instead of having the server process every PHP request from scratch. Combined with the caching a CDN provides, this creates a two-layer cache that drastically reduces the amount of CPU time my server spends on each request.

I simply have to enable it to enjoy most of the benefits this plugin offers. There are more tweaks available through the expert options, but I haven't spent the time on those yet.

3. Jetpack

Jetpack by WordPress.com

This plugin comes from the official WordPress.com; it ties your self-hosted site into their services, granting free site analytics, SEO, ads, and an image CDN, to name a few.

In this case, I use it simply to speed up the loading of images, and for the faster, nicer-looking image carousel. There are even more features if you go for the paid version, but the free one works well enough for me for now.

4. AMP (Accelerated Mobile Pages)

AMP for WP – Accelerated Mobile Pages

This plugin auto-magically transforms your normal WordPress pages and posts into AMP format so that they load a lot quicker on mobile devices. It can auto-redirect mobile users to the AMP version of the site, and it also integrates with Facebook's Instant Articles, something I intend to make use of in the future.

https://www.ampproject.org

AMP is an open-source project meant to accelerate load times on mobile devices. From the limited testing I've done, it gives a noticeable speed-up when I load the site from my phone.


That's all for now. I probably won't write an article about securing WordPress, as I'm just following guides online for the moment and I'm not yet sure what the best way of doing things is. Suffice to say I've done the minimum to secure the site; let's hope it's enough.

Cloudflare CDN

In my attempts to get a valid SSL certificate for this site, I ended up cheating a little and letting Cloudflare do the securing for me instead.

Getting it set up was pretty straightforward, though I ran into some issues since I wasn't familiar with Cloudflare's infrastructure. I managed to set up Full SSL encryption, as shown in the diagram below.

First, I pointed my DNS NS records to Cloudflare, generated the keypair on Cloudflare, imported it into my server, and updated the Nginx config file to point to those keys. Everything automagically became secured with TLS, just like that. I made a few more optimizations to minify JS/CSS/HTML, as well as enforcing HTTPS for all of my sub-domains. Worked like a freaking charm.
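
On the Nginx side, it boils down to something like this (a minimal sketch; the certificate paths and upstream port are placeholders for my actual config):

server {
    listen 443 ssl;
    server_name blog.lordofgeeks.com;

    # Cloudflare origin certificate and key, copied onto the server
    ssl_certificate     /etc/nginx/ssl/cloudflare-origin.pem;
    ssl_certificate_key /etc/nginx/ssl/cloudflare-origin.key;

    location / {
        proxy_pass http://127.0.0.1:8080;  # the WordPress container
    }
}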

SSL was my main concern when I decided to use Cloudflare, but even on the free tier there is basic protection against DDoS attacks, and my content gets cached closer to visitors. That provides a noticeable boost in performance; it also provides a good boost in security, helping my tiny server stay available, just in case.

In the midst of working on this, I ended up optimizing the site at the same time; it should feel a lot more responsive now. In the next post, I'll write about the tweaks I made to make WordPress run a lot faster.

Journey to security

I'm transitioning into the security industry. Since it's an area I don't have much experience in, this will be a brand-new journey, and I'll pretty much have to learn everything from the ground up.

Hence, I’ve decided to start a new category for Security and Learning to document all the things that I’m learning along the way.

I'll also attempt to develop some mini-applications to test all the concepts that I've learnt. Let's see how this goes.

Estimated deadline: 31st December 2017. Let’s do this.

Automated Torrenting Configuration using Docker

I was sick of downloading my shows manually; it actually takes up quite a bit of time, especially if you add it up over the years. Before I had my server set up, I was running Deluge with the YaRSS2 plugin, which works wonderfully well as long as my computer stays turned on (kind of a power hog).

http://dev.deluge-torrent.org/wiki/Plugins/YaRSS2

But since I have a low-power server now, I can let it run 24/7 without worries. Here’s my experience with it.

Diagram of current setup

Before I get to the commands, here’s a quick breakdown of what’s happening.

Transmission is the torrenting client, plain and simple. I connect to it through the WebUI.

Flexget is the daemon that queries the various RSS feeds set up in its config.yml file. It works like a rule-based regex search over the feed entries and can detect quality settings from file names. It also tracks which files have been downloaded, rejected, or accepted but not yet downloaded. Upon matching a file, it sends a download job to Transmission.
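
A minimal config.yml for this kind of setup might look like the sketch below (the feed URL and show name are placeholders; since my Flexget container has Transmission's watch folder mounted at /torrents, I've shown the watch-folder approach, though Flexget also has a transmission plugin that talks to the API directly):

tasks:
  tv-shows:
    rss: https://example.com/tv/rss    # placeholder feed URL
    series:
      - Some Show:                     # placeholder show name
          quality: 720p
    # drop matched .torrent files into Transmission's watch folder
    download: /torrents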

Using the kukki/docker-flexget image, here’s how I ran my Flexget container.

docker create --name flexget \
-v /home/log/configs/flexget:/root/.flexget \
-v /home/log/configs/transmission/watch:/torrents \
--restart=on-failure \
kukki/docker-flexget daemon start --autoreload-config

Transmission's Docker settings are a little more complicated, as it's the public-facing container that the nginx-proxy routes to. I'm using the linuxserver/transmission image for this container.

docker create --name=transmission \
-v /home/log/configs/transmission:/config \
-v /home/log/data/torrents:/downloads \
-v /home/log/configs/transmission/watch:/watch \
-e VIRTUAL_HOST=[your-domain-here] \
-e TZ=GMT+8 \
-e PGID=1000 -e PUID=1000 \
-p 9091 -p 51413 \
-p 51413/udp \
-e VIRTUAL_PORT=9091 \
--restart=on-failure \
linuxserver/transmission

As you may have noticed in the diagram, the server stores the downloaded torrents only as a cache. There are a couple of reasons for this setup. Firstly, I upgraded the server to an SSD, so space is at a premium now. But I already have a CIFS mount of my NAS, so why not write directly to it?

From my experience running that configuration, downloading straight onto the NAS, over the past year, it tends to screw up after a couple of months. Something about the buffer/network pipeline just doesn't like staying connected for that long while having data streamed to it on a regular basis. iSCSI would've worked perfectly, but unfortunately my NAS isn't that high-end.

I'm experimenting with this new configuration, and hopefully it stays stable all the way. It works like this:

  1. The download is first written to the server's local storage
  2. Every hour, a cron job checks whether any downloads have finished
  3. If there are new files, it moves them to the NAS

My assumption is that a single move command is less taxing on the NAS than a constant stream of writes that may or may not arrive intact.
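
A sketch of that hourly job (the paths are placeholders; mine differ):

# crontab entry: run the sweep at the top of every hour
0 * * * * /home/log/scripts/move-to-nas.sh

#!/bin/sh
# move-to-nas.sh: move completed downloads onto the CIFS-mounted NAS
SRC=/home/log/data/torrents/complete   # placeholder: completed downloads
DST=/mnt/nas/torrents                  # placeholder: NAS mount point

# Only touch items that haven't changed for 5+ minutes, so we don't
# grab something Transmission is still writing to.
find "$SRC" -mindepth 1 -maxdepth 1 -mmin +5 -exec mv -n {} "$DST"/ \;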

Why not use rsync? Because I fucked something up and the result is the same either way.

Hopefully, this will be useful for anybody reading it. Or myself in the future.

WordPress Multisite

So… wow, I finally managed to get it all up and running. The amount of effort was way more than I would've liked, but at least it's done now. There's a ton of things I'd like to write about, especially the troubleshooting steps I took, so that it'll be easier to migrate this configuration in the future.

First of all, I tried on my own to get the subdomain routing working with jwilder/nginx-proxy, along with MariaDB and the official WordPress image.

https://github.com/jwilder/nginx-proxy

I will write more about the proxy, as well as the Let's Encrypt SSL containers, in another post.

Unfortunately, for whatever ungodly reason, I wasn't able to get it up and running. So after some Google-fu, I came across this article, which helped me greatly.

Ultimate WordPress Docker setup guide

I ended up not using the docker-compose method because I was trying to troubleshoot why I couldn't obtain an SSL certificate from Let's Encrypt. Bad news: SSL still isn't working, and while I was debugging it I hit the rate limit on the number of certificates I can request per hour/day/week. Hopefully, once that's sorted out, this site will have a proper SSL certificate.

WordPress Multisite

http://www.wpbeginner.com/wp-tutorials/how-to-install-and-setup-wordpress-multisite-network/


I wanted the ability to host multiple WordPress sites, both for my own testing/development and for my freelance work. Instead of running a separate WordPress installation every time I need a new site, multisite lets me run multiple sites off a single installation and manage them from a centralized dashboard.

There are two ways of running this.

  1. sub-domain
  2. sub-directory (chosen)

The reasons for choosing sub-directory were pretty easy for me:

  1. There's no need for pretty URLs (e.g. xyz.lordofgeeks.com) for the sites I'm hosting
  2. Let's Encrypt doesn't offer wildcard certificates, where one certificate covers all sub-domains under *.lordofgeeks.com
  3. It makes sense for all of the sites to live under blog.lordofgeeks.com/[name-of-site]

On point 2: from 2018 onwards, Let's Encrypt will offer wildcard certificates. So all my effort setting this up will be for nought, but it's still a good learning experience.

Everything went fine until I added a new site, blog.lordofgeeks.com/dev/, and tried to upload a file larger than 1 megabyte.

Getting large file uploads to work

After spending hours searching all over the internet, these are the settings I had to change to make it work.

Change the max upload file size as a super admin and the change takes effect across all the sites. This is probably not ideal if I were running a business, but it'll do for my personal use.

In the vhost.d folder that is linked to the Nginx proxy, I had to create a file named after the site's FQDN, blog.lordofgeeks.com:

server_tokens off;
client_max_body_size 100m;
root /var/www/html;
# Necessary for Let's Encrypt
location ^~ /\.well-known {
    allow all;
    alias /var/www/html/.well-known;
    default_type "text/plain";
    try_files $uri =404;
}

The client_max_body_size 100m; directive is the magic sauce that makes bigger file uploads work. server_tokens off strips unnecessary information about the server from response headers.

I also had to create an uploads.ini file that is linked into the WordPress container (as shown in the Ultimate Guide):

file_uploads = On
memory_limit = 64M
upload_max_filesize = 64M
post_max_size = 64M
max_execution_time = 1800

With these configurations in place (assuming nothing else cocks up), I was finally able to upload large files without the weird http-error that gives absolutely no information about what's going wrong.

Getting SSL to work

Update: 29/11/17

I finally got SSL working with WordPress, but I ended up cheating and using Cloudflare instead of rolling my own Let's Encrypt configuration. Well, it works in my favour, since I'm planning to add a CDN to this site anyway.

Here’s the post in more detail: Cloudflare CDN

Docker configuration

Since I'm running all of this on Docker, these are my exact steps for getting the WordPress part up and running:

docker create --name blog.lordofgeeks.com_mariadb_data \
-v /var/lib/mysql mariadb

docker create --name blog.lordofgeeks.com_wordpress_data \
-v /var/www/html wordpress

# create the database container first, since the blog container links to it
docker create --name mariadb \
--restart on-failure \
-e MYSQL_ROOT_PASSWORD=[your password here] \
-e MYSQL_DATABASE=[database name] \
--volumes-from blog.lordofgeeks.com_mariadb_data \
mariadb

docker create --name blog \
--restart on-failure \
--link mariadb:mysql \
-e "VIRTUAL_HOST=blog.lordofgeeks.com" \
-e "LETSENCRYPT_HOST=blog.lordofgeeks.com" \
-e "[email protected]" \
-p 443 -p 80 \
--volumes-from blog.lordofgeeks.com_wordpress_data \
-v /home/log/sites/blog.lordofgeeks.com/uploads.ini:/usr/local/etc/php/conf.d/uploads.ini \
wordpress

That's about it; the rest of the time was spent securing the site and making sure I don't get taken down by some kind of trivial hack. After a good day of work, I finally got my own WordPress network up and running.

If any of my friends want to make use of this to host a profile page of sorts, let me know and I can get you hooked up pretty easily. The only caveat is that the link will have to exist in this format:

blog.lordofgeeks.com/[your-site-name]

But I think that's pretty alright for something that's totally free for you to post and edit, right?