tl;dr: I set up self-hosted Plausible Analytics on this site. You can see it here.

This site runs on an instance of Ghost I self-host on DigitalOcean. [1]


  1. That's my referral link. It gets you $100 in credit to be used in 60 days and me $25 if you go on to spend $25 ↩︎

After setting it up and publishing my first post, I immediately wanted a way to count the 3 or 4 visitors I knew for sure would be visiting the site. Ghost has lots of easy ways to integrate analytics packages, and my go-to solution for this has always been Google Analytics because a) it's free and b) it's free.

I may not be doing anything nefarious with the user data harvested from an analytics cookie, but I understand I'm selling out my visitors to some extent by installing the GA tracker. In the past, I'd just have lived with that compromise. Even today, I don't know of a more effective way to track ad conversion rates than analytics tracking cookies, so for now they have to stay for user-acquisition tracking on work projects. But for personal projects I've been wavering for a while.

With the advent of Big Sur, Safari now comes with a new Privacy Report that shows a list of known trackers it has blocked from tracking you on a particular site. This is exactly the kind of nudge I was waiting for. Frankly, I want users to see a clean bill of health if they come to my site and click that link.

It's as simple as that – pure signalling. But the great thing for the privacy-loving users of the internet is that signalling is really powerful, and I'm not the only one swept up in it. Over the past year or so there have been numerous private-by-default services launched with the specific aim of delivering site stats without invading user privacy.

For my purposes, gathering stats instead of analytics requires a tool that a) is free to get started; b) doesn't use cookies; c) looks decent; and d) is fairly easy to install. I went through a bunch of these today and it turns out that for a) you pretty much need to self-host and for d) I wanted it to be client-side javascript rather than server log parsing. [1]


  1. Really the client-side javascript is only because it's more familiar to me and offers future portability, neither of which are good reasons so YMMV ↩︎

The solution I ended up with was to add the self-hosted version of Plausible Analytics to my existing Ghost install on DigitalOcean. You can see the public stats page for this site here. It's a great solution for me because I don't want to increase my total running costs for this site to more than $5 a month until my visitor numbers justify it.

But I'll be honest with you... it was pretty finicky to get set up, so you should probably just pay them an additional $6 a month for the hosted version if you in any way value your own time. If they had a free tier for 500 pageviews a month, I bet they'd have monetised me eventually. But I respect their bootstrapping philosophy, and that sort of freemium tier could potentially end up being an albatross around their necks, so they probably shouldn't add it!

Boring technical bits

I suspect someone may end up on this site because they are also trying to self-host Plausible on a pre-configured Ghost Droplet running Ubuntu 18.04. If that's not exactly what you are doing... look away now.

If you are still here, I'd like to pay it forward by trying to help you avoid the pain I had getting this set up. The very abridged version of how I got this working is:

Set up docker-compose on the droplet instance

I followed these instructions to get docker-compose working on the droplet instance. I use Docker infrequently and have to re-learn the syntax every time, but these instructions worked perfectly.
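In case that link rots, the gist of those instructions is just dropping the release binary into /usr/local/bin. A rough sketch is below; the version number is an example from around the time of writing, so check the releases page for the current one before copying anything:

```shell
# Build the release URL for this machine (example version pin -- check
# github.com/docker/compose/releases for the current release):
COMPOSE_VERSION="1.27.4"
COMPOSE_URL="https://github.com/docker/compose/releases/download/${COMPOSE_VERSION}/docker-compose-$(uname -s)-$(uname -m)"
echo "$COMPOSE_URL"

# Then, as root, download it, make it executable, and verify:
#   sudo curl -L "$COMPOSE_URL" -o /usr/local/bin/docker-compose
#   sudo chmod +x /usr/local/bin/docker-compose
#   docker-compose --version
```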

Set up the self-hosting Plausible docker image

Plausible have a set of self-hosting instructions here. They are pretty sparse but mostly correct. The small snafus I hit were, firstly, that the tar command was missing the z flag when you download the archive:

curl -L https://github.com/plausible/hosting/archive/master.tar.gz | tar -zx

Secondly, it implied you should go ahead and start the server before going through the optional extras:

docker-compose up --detach

When I did this, I found later on that I ran out of memory trying to add in the GeoLite2 config. You're better off not starting the server early, and instead waiting until you issue the combined command later in the process:

docker-compose -f docker-compose.yml -f geoip/docker-compose.geoip.yml up --detach

Lastly, I went through a lot of pain getting the correct settings for the Nginx reverse proxy. Plausible runs inside its Docker container on port 8000, and I wanted to run it on the same machine as my Ghost instance. I played around a lot trying to get it to run on the /analytics path on mattfarrugia.com but in the end just put it on a subdomain: initially analytics.mattfarrugia.com, before seeing the light in terms of signalling and moving it to stats.mattfarrugia.com 🤓

The most important bits of plausible-conf.env were getting the BASE_URL and PORT correct. When these were wrong I was seeing the analytics javascript specify stats.mattfarrugia.com:8000 as the endpoint. Anyway, for the avoidance of doubt, this is what worked:

BASE_URL=https://stats.mattfarrugia.com
PORT=8000
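The other value plausible-conf.env needs is a SECRET_KEY_BASE. Plausible's self-hosting docs suggest generating a long random value for it, with something along these lines (the exact variable name could change between versions, so double-check their docs):

```shell
# Generate a long random secret to paste into plausible-conf.env as
# SECRET_KEY_BASE (tr strips openssl's line wrapping):
openssl rand -base64 64 | tr -d '\n'; echo
```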

Set up nginx and DNS config for the stats subdomain

I set up sites for both HTTPS and HTTP in /etc/nginx/sites-available and symlinked them into sites-enabled. The config was just copied from the one that had been set up for Ghost, with the proxy_pass target port set to 8000:

server {
    listen 443 ssl http2;
    listen [::]:443 ssl http2;

    server_name stats.mattfarrugia.com;
    root /var/www/ghost/system/nginx-root; # Used for acme.sh SSL verification (https://acme.sh)
    ssl_certificate /etc/letsencrypt/live/mattfarrugia.com-0001/fullchain.pem; # managed by Certbot
    ssl_certificate_key /etc/letsencrypt/live/mattfarrugia.com-0001/privkey.pem; # managed by Certbot
    include /etc/nginx/snippets/ssl-params.conf;

    location / {
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header Host $http_host;
        proxy_pass http://127.0.0.1:8000;
    }

    location ~ /.well-known {
        allow all;
    }

    client_max_body_size 50m;
}

Yes, there is stuff in there that is probably wrong but at this point I didn't care. I also quickly went to Hover [1] and set up an A record for the stats subdomain of mattfarrugia.com. One of the things I really love about Hover is that fiddling with DNS gets reflected really quickly. I typically see updates within 60 seconds.


  1. That's my referral link. It gets you $2 off your first purchase of a domain and in return I get $2 ↩︎
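In zone-file terms, the record I added amounts to this (the IP here is a documentation example; use your droplet's public IP, and the TTL is whatever your registrar defaults to):

```
stats.mattfarrugia.com.   300   IN   A   203.0.113.10   ; example IP -- use your droplet's
```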

Set up an SSL certificate from Let's Encrypt

The last step was to set up a free SSL certificate from Let's Encrypt. I already had a cert from when I set up the Ghost site but it's not a wildcard, so I needed a different one for the stats subdomain. I ended up following these instructions. I must have done this differently last time because, while most things were already in place, I didn't have the certbot Nginx plugin installed. After adding that prerequisite it worked fine to re-generate both certificates:

sudo certbot --nginx -d mattfarrugia.com -d stats.mattfarrugia.com

Finally I just reloaded the nginx service and everything worked perfectly: [1]

sudo systemctl reload nginx


  1. Not really obviously. I wasted hours arseing around with Nginx. But of course, that won't be the case for you... ↩︎