So previously I've documented the setup of the Debian-Administration website, and now that I'm going to retire it I'm planning how that will work.
There are currently 12 servers powering the site:
- web1
- web2
- web3
- web4
- These perform the obvious role, serving content over HTTPS.
- public
- This is a HAProxy host which routes traffic to one of the four back-ends.
- database
- This stores the site-content.
- events
- There was a simple UDP-based protocol which sent notices here, from various parts of the code.
- e.g. "Failed login for bob from 1.2.3.4".
- mailer
- Sends out emails. ("You have a new reply", "You forgot your password..", etc)
- redis
- This stores session-data, and short-term cached content.
- backup
- This contains backups of each host, via Obnam.
- beta
- A test-install of the codebase
- planet
- The blog-aggregation site
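The events protocol isn't described beyond "simple UDP", but a fire-and-forget notifier of that shape might look like the following sketch. The host, port, and plain-text payload here are assumptions for illustration, not the site's actual wire format:

```python
import socket

def send_event(message, host="127.0.0.1", port=4433):
    """Send a one-line notice to the events host over UDP.

    Fire-and-forget: no reply is expected, and delivery is not
    guaranteed, which is fine for informational notices.
    (host/port are invented defaults, not the real ones.)
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.sendto(message.encode("utf-8"), (host, port))
    finally:
        sock.close()

# e.g. the kind of notice mentioned in the post:
send_event("Failed login for bob from 1.2.3.4")
```

Because it is UDP, the sender never blocks on the events host being up, which is what makes it so easy to rip out: callers can simply stop sending.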
I've made a bunch of commits recently to drop the event-sending, since no more dynamic actions will be possible, so events can be retired immediately. redis will go when I turn off logins, as there will be no need for sessions/cookies. beta is only used for development, so I'll kill that too. Once logins are gone, and anonymous content is disabled, there will be no need to send out emails, so mailer can be shut down.
That leaves a handful of hosts:
- database
- I'll export the database and kill this host.
- I will install mariadb on each web-node, and each host will be configured to talk to localhost only
- I don't need to worry about the four databases receiving diverging content, as updates will be disabled.
- backup
- This will be retired, as nowadays Bytemark provide cloud-backups.
- planet
- This will become orphaned, so I think I'll just move the content to the web-nodes.
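Restricting each web-node's MariaDB instance to localhost only needs a one-line setting; on Debian the stock packaging already ships it in a drop-in file. A sketch, using the standard Debian path:

```ini
# /etc/mysql/mariadb.conf.d/50-server.cnf  (Debian's stock location)
[mysqld]
# Listen on the loopback interface only; each web-node
# talks to its own local copy of the database.
bind-address = 127.0.0.1
```

With updates disabled the four copies are effectively read-only snapshots, so there's no replication to configure.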
All in all I think we'll just have five hosts left: public to do the routing, and web1-web4 to do the serving.
I think that's sane for the moment. I'm still pondering whether to export the site to static HTML; there's a lot of appeal, as the load would drop a lot, but equally I have a hell of a lot of mod_rewrite redirections in place, and reworking all of them would be a pain. I suspect this is something that will be done in the future, maybe next year.
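For what it's worth, the rules that would need reworking are of this general shape; the patterns and paths below are invented for illustration, since the real ones aren't shown in the post. Under a static export each would have to become a plain Redirect or a pre-generated file:

```apache
RewriteEngine on
# Hypothetical: map a legacy article URL onto the current scheme.
RewriteRule ^/article/([0-9]+)/?$ /articles/$1.html [R=301,L]
# Hypothetical: send an old feed path to the current feed location.
RewriteRule ^/headlines.rdf$ /atom.xml [R=301,L]
```

Simple one-to-one rules like these translate mechanically to `Redirect permanent` lines, but anything that rewrites into a dynamic script has no static equivalent and needs rethinking per-rule.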
Before shutdown it's worth saving every URL to the Wayback Machine hosted at the Internet Archive. Their API is very easy to use; just send one GET request per URL (and sleep a second or two in between each request).
curl https://web.archive.org/save/$your_url