
Entries tagged adsl

The pain of a new IP address

13 October 2010 21:50

Tonight I was having some connectivity issues, so after much diagnostic time and pain I decided to reboot my router. The moment my home router came back my (external) IP address had changed, and suddenly I found I could no longer log in to my main site.

Happily, however, I have serial console access, so I updated things such that my new IP address was included in the hosts.allow file. [*]

The next step was to push that change round my other boxes, and happily I have my own tool, slaughter, which allows me to make such global changes in a client-pulled fashion. Sixty minutes later cron did its magic and I was back.
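
(The pull itself is nothing fancier than a cron entry along these lines; the schedule is illustrative and any server/prefix options are deliberately omitted:)

# /etc/cron.d/slaughter - illustrative only; real options omitted.
0 * * * * root slaughter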

This reminds me that I've let the slaughter tool stagnate. Mostly that's because I only use it to cover my three remote boxes and my desktop, and although I received one bug report (+fix!) I've never heard of anybody else using it.

I continue to use and like CFEngine at work. Puppet & Chef have been well argued against elsewhere, and I've still to investigate Bcfg2 + FAI.

Mostly I'm happy with slaughter. My policies are simple, readable, and intuitive. Know Perl? Learn the "CopyFile" primitive and you're done. For example:
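
(The following is only a sketch; "CopyFile" is the real primitive, but the argument names and the $fqdn variable are written from memory, so treat them as illustrative rather than gospel.)

#
#  Copy the shared hosts.allow.trusted into place.
#  Argument names here are illustrative - check the documentation.
#  ($fqdn is assumed to be available to policies.)
#
if ( $fqdn =~ /steve\.org\.uk$/ )
{
    CopyFile( Source => "files/hosts.allow.trusted",
              Dest   => "/etc/hosts.allow.trusted",
              Owner  => "root",
              Group  => "root",
              Mode   => "0644" );
}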

By contrast the notion of state machines, functional operations, and similar seems over-engineered in other tools. Perhaps that's my bug, perhaps that's just the way things are, but the rants linked to above make sense to me and I find myself agreeing 100%.

Anyway; slaughter? What I want to do is rework it such that all policies are served via rsync and not via HTTP. Other changes, such as the addition of new primitives, don't actually seem necessary. But serving content via rsync just seems like the right way to go. (The main benefit is recursive copies of files become trivial.)
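
(As a sketch of the client side, the pull would be no more than something like this - the host and module names are invented for illustration:)

rsync -az rsync://policies.example.com/slaughter/ /var/cache/slaughter/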

I'd also add the ability to mandate GPG-signatures on policies, but that's possible even now. The only step backwards I can see is that currently I can serve content over SSL; that should be fixable too, even if only via stunnel.
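
(A client-side stunnel stanza along these lines would do it, tunnelling the local rsync connection out over SSL; the names and ports are invented:)

[slaughter]
client  = yes
accept  = 127.0.0.1:873
connect = policies.example.com:1873

The client would then simply rsync from rsync://127.0.0.1/slaughter/ instead.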


[*]

My /etc/hosts.allow file contains this:

ALL: 127.0.0.1
ALL: /etc/hosts.allow.trusted
ALL: /etc/hosts.allow.trusted.apache

Then hosts.allow.trusted contains:

# www.steve.org.uk
80.68.85.46

# www.debian-administration.org
80.68.80.176

# my home.
82.41.x.x

I've never seen anybody describe something similar, though to be fair the syntax is documented. To me it just seems clean to keep the trusted IPs in a single place.

To conclude, hosts.allow.trusted.apache is owned by root:www-data, and can be updated via a simple CGI script - which allows me to add a single IP address on the fly for the next 60 minutes. Neat.
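
(For the curious, the script needn't be any more complicated than the following sketch - this isn't the real thing, and a production version would obviously validate the address properly:)

#!/usr/bin/perl -w
#
#  Sketch only: append the caller's address to the apache-writable
#  trust file, and queue an 'at' job to strip it again an hour later.
#
use strict;

my $ip   = $ENV{'REMOTE_ADDR'} || exit;
my $file = "/etc/hosts.allow.trusted.apache";

#  Record the caller's IP.
open( my $fh, ">>", $file ) or die "Failed to append: $!";
print $fh "$ip\n";
close($fh);

#  Schedule the removal in 60 minutes.  (The dots in the address
#  really ought to be escaped before being handed to sed.)
open( my $at, "|-", "at now + 60 minutes" ) or die "Failed to run at: $!";
print $at "sed -i '/^$ip\$/d' $file\n";
close($at);

print "Content-Type: text/plain\n\n";
print "Added $ip for the next hour.\n";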

ObQuote: Tony is a little boy that lives in my mouth. - The Shining
