A five-node Raspberry Pi cluster running Einstein@Home and various other applications. It has been running 24/7 in my garage in one form or another since October 2015, amounting to over 30,000 computing hours as of September 2019. Live statistics are uploaded to this website every 10 minutes, including temperature, CPU and memory usage.
I am a freelance technical writer for the cloud computing company DigitalOcean, where I write guides and tutorials about Linux infrastructure and web technologies. I have covered topics including DNS infrastructure-as-code tools and email security. The payments for my work are kindly matched by DigitalOcean as a charitable donation to the Electronic Frontier Foundation.
My personal security blog and website - the one you're reading right now. I started the site in 2011, but shifted its focus to IT security and infrastructure in 2016. It's hosted on a pair of geo-diverse Ubuntu 18.04 cloud servers (London and New York City) using Apache, and a Cloudflare DNS load balancer in active failover mode maintains 100% uptime, even when one of the servers is rebooted. The entire server configuration and build process is automated and version controlled using Ansible and Git, so I can design and test changes in a safe environment before deploying them to production.
An anonymisation tool for web server log files which removes sensitive information such as IP addresses and user agents, while retaining the uniqueness of individual log entries. The tool utilises a Bloom filter to determine whether a particular IP address has been seen before, without ever permanently storing the address itself. A customisable level of k-anonymity ensures that the filter cannot be brute-forced. The resulting log files are therefore still useful for high-level web statistics such as unique visitor counts, without storing the potentially sensitive information that they usually contain.
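The "seen before, without storing it" trick is the core of the Bloom filter approach. As a minimal sketch (not the tool's actual implementation; the bit-array size, hash count and example IP are illustrative assumptions):

```python
import hashlib

class BloomFilter:
    """Probabilistic set membership: no false negatives, tunable false-positive rate."""

    def __init__(self, size_bits=1 << 20, num_hashes=5):
        self.size = size_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(size_bits // 8)  # all positions start unset

    def _positions(self, item):
        # Derive several independent bit positions from salted SHA-256 digests.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item):
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

seen = BloomFilter()
ip = "203.0.113.7"               # documentation-range example address
first_visit = ip not in seen     # True: never seen before
seen.add(ip)                     # only bit positions are stored, never the IP
repeat_visit = ip in seen        # True: counted as a returning visitor
```

Once an address has been added, only a handful of set bits remain, so unique-visitor counts still work while the original IP cannot be read back out of the filter.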
A continuous integration job configuration for automatically testing for violations of your website's Content Security Policy. A copy of your PHP website with your desired CSP header is set up in the CI system and crawled using Headless Chrome Crawler, resulting in violation reports being sent to a local or remote reporting endpoint. The reports can then be reviewed to determine whether your site is compliant with its own security policy, helping to ensure that content will not be accidentally blocked.
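The violation reports the crawler triggers arrive at the reporting endpoint as JSON in the standard `csp-report` format. A minimal sketch of extracting the fields worth reviewing (the URLs are hypothetical; the field names follow the CSP `report-uri` report format):

```python
import json

# Hypothetical violation report, as the browser would POST it to the
# reporting endpoint during a crawl of the CI copy of the site.
raw_report = json.dumps({
    "csp-report": {
        "document-uri": "https://staging.example.com/page.php",
        "violated-directive": "script-src",
        "blocked-uri": "https://cdn.example.net/app.js",
    }
})

def summarise(report_json):
    """Reduce a csp-report payload to the fields needed for review."""
    body = json.loads(report_json)["csp-report"]
    return body["violated-directive"], body["blocked-uri"], body["document-uri"]

directive, blocked, page = summarise(raw_report)
# An empty set of summaries after a full crawl suggests the site
# complies with its own policy; any entry pinpoints what would be blocked.
```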
A Bash script to automatically download and perform GPG and/or hash integrity verifications on a list of pre-programmed software packages and files, including Ubuntu Linux installation ISOs and various Windows software. This mitigates the risk of human error in a sensitive process, and saves time by automating most of it.
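The hash side of the verification boils down to comparing a locally computed digest against the vendor-published one. A small Python sketch of that step (the demo file and its checksum are illustrative; a real entry would pair each ISO with its published SHA256SUMS value, ideally itself GPG-verified):

```python
import hashlib

def verify_sha256(path, expected_hex):
    """Hash the file in chunks and compare against the published checksum."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):  # 64 KiB at a time
            h.update(chunk)
    return h.hexdigest() == expected_hex.lower()

# Throwaway demo file standing in for a downloaded ISO.
with open("demo.bin", "wb") as f:
    f.write(b"hello")

ok = verify_sha256(
    "demo.bin",
    "2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824",
)
# ok is True here; any mismatch would indicate corruption or tampering.
```

Chunked reading keeps memory use constant, which matters when the files being verified are multi-gigabyte ISOs.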
A domain name expiry monitoring and reporting tool designed to produce weekly email reports on your domain names with upcoming renewals, including those due in the next 7, 28 and 90 days. A list of domain names to monitor is specified in a configuration file, and the tool queries the relevant WHOIS servers to check each expiry date. This data is stored and used to produce the weekly email report.
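The reporting step is essentially grouping domains by how soon they expire. A sketch of that bucketing, under the assumption that expiry dates have already been parsed from the WHOIS data (the domains and dates are made up for illustration):

```python
from datetime import date

def renewal_buckets(expiries, today):
    """Group domains into 7-, 28- and 90-day renewal windows.

    `expiries` maps domain name -> expiry date parsed from WHOIS data.
    """
    buckets = {7: [], 28: [], 90: []}
    for domain, expiry in sorted(expiries.items()):
        days_left = (expiry - today).days
        for window in (7, 28, 90):
            if 0 <= days_left <= window:
                buckets[window].append(domain)
                break  # report each domain in its tightest window only
    return buckets

today = date(2019, 9, 1)  # fixed date so the example is reproducible
report = renewal_buckets({
    "example.com": date(2019, 9, 5),    # 4 days away  -> 7-day bucket
    "example.org": date(2019, 9, 20),   # 19 days away -> 28-day bucket
    "example.net": date(2019, 11, 15),  # 75 days away -> 90-day bucket
}, today)
```

Reporting each domain only in its tightest window keeps the weekly email short: a domain four days from expiry appears under "7 days" rather than in all three lists.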
A Google Chrome extension which allows you to set a whitelist of trusted websites. Hyperlinks to these websites in Google search results, Reddit and Hacker News will then be highlighted in green. This is an anti-typosquatting and anti-phishing tool designed to remove the need to carefully check links before clicking on them, and to provide assurance that the link you're clicking isn't a lookalike or phishing website.
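The safety of this approach rests on matching link hostnames strictly, since a naive substring check would let a lookalike domain slip through. A Python sketch of the matching logic (the extension itself is JavaScript; the whitelist entries here are hypothetical):

```python
from urllib.parse import urlsplit

TRUSTED = {"github.com", "wikipedia.org"}  # example whitelist

def is_trusted(url):
    """True only for an exact whitelisted host or a genuine subdomain of one.

    Substring checks are deliberately avoided: 'github.com.evil.example'
    must not match 'github.com'.
    """
    host = urlsplit(url).hostname or ""
    return any(host == d or host.endswith("." + d) for d in TRUSTED)

is_trusted("https://github.com/user/repo")       # True  -> highlight green
is_trusted("https://en.wikipedia.org/wiki/DNS")  # True  -> real subdomain
is_trusted("https://github.com.evil.example/")   # False -> lookalike prefix
is_trusted("https://githud.com/")                # False -> typosquat
```

Anchoring the comparison at the registered domain with a leading dot is what distinguishes a genuine subdomain from a hostname that merely starts with a trusted name.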