Entries posted in December 2007
2 December 2007 21:50
My migration to mercurial is now complete. I've completed the last step, which was the migration of all the $foo-commits mailing lists.
It is at times like this that I remember two things:
- All mailing list software sucks. (here we go again.)
- All the mailing-list-archive-makers suck.
Right now I'm using ecartis for my mailing lists, primarily because it isn't mailman. (Long story).
For making archives of mbox files I'm using hypermail.
The latter I intend to change as soon as I can find something else to use. It isn't attractive, but it is in the Debian archive and seems reliable.
Since I was changing things around already I've centralised the list-archive making, and started using the haschanged tool to avoid rebuilding the archives if there are no new posts. That works nicely.
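Sketched in shell the idea is roughly this - cache a checksum per mbox and only rebuild when it moves. (The function and cache-file names here are mine; haschanged's real interface may well differ.)

```shell
#!/bin/sh
# Sketch of "only rebuild when the mbox changed".  Caches an md5 per
# mbox; the real haschanged tool's interface may differ from this.
rebuild_needed() {
    mbox=$1
    cache=".stamp.$(basename "$mbox")"
    new=$(md5sum "$mbox" | cut -d' ' -f1)
    old=$(cat "$cache" 2>/dev/null)
    if [ "$new" = "$old" ]; then
        return 1              # unchanged - skip hypermail entirely
    fi
    echo "$new" > "$cache"    # remember for next run
    return 0                  # changed - caller rebuilds the archive
}
```

With that in place the cron job is just "rebuild_needed lists/foo.mbox && hypermail -m lists/foo.mbox ...".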
Tags: cvs, ecartis, hypermail, lists, mailman, mercurial, migration
2 December 2007 21:50
In the next week I intend to drop the search engine which archives content posted to Planet Debian.
It appears to have very little use, except for myself, and I'm significantly better at bookmarking posts of interest these days.
If you'd like to run your own copy the code is available and pretty trivial to reimplement regardless. There are only two parts:
- Poll and archive content from the planet RSS feed - taking care of duplicates.
- Scanning for /robots.txt upon the source-host, to avoid archiving content which should be "private".
Once you've done that you'll have a database populated with blog entries, and you just need to write a little search script.
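The duplicate-handling half of step one can be sketched like so; it works on a feed already fetched to disk, and a crude grep/sed stands in for a real RSS parser:

```shell
#!/bin/sh
# Record each <link> from an RSS feed, skipping ones seen before.
# "Archiving" here is just remembering the URL - the real thing
# stored the entry body too, and checked /robots.txt on the source
# host before doing so.
archive_new_items() {
    feed=$1 seen=$2
    touch "$seen"
    grep -o '<link>[^<]*</link>' "$feed" |
        sed -e 's/^<link>//' -e 's|</link>$||' |
        while read -r url; do
            grep -qxF "$url" "$seen" && continue   # duplicate
            echo "$url" >> "$seen"
            echo "new: $url"
        done
}
```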
ObRandom: In the time it has been running it has archived 15,464 posts!
Tags: discontinuation, planet-debian, planet-search, random
4 December 2007 21:50
If you're interested in working upon your CV/Resume, as Otavio Salvador was recently, then I'd highly recommend the xml-resume-library.
It allows you to write your address, previous jobs, and skills as XML then generate PDF, HTML, and plain text format documents via a simple Makefile.
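The Makefile involved needs little more than a couple of xsltproc rules. The stylesheet paths below are a guess at where the Debian package installs them, so treat the whole thing as illustrative:

```make
# Illustrative only: the stylesheet locations are assumptions -
# check where xml-resume-library puts them on your system.
XSLBASE = /usr/share/xml/resume/xsl

all: resume.html resume.txt

resume.html: resume.xml
	xsltproc -o $@ $(XSLBASE)/output/us-html.xsl $<

resume.txt: resume.xml
	xsltproc -o $@ $(XSLBASE)/output/us-text.xsl $<
```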
It won't help with clueless agencies that mandate the use of Microsoft Word Documents for submission, so they can butcher your submission and "earn" their fee(s), but otherwise it rocks.
Tags: cv, links, not-job-hunting-mr.boss, random, resume, tools, xml
5 December 2007 21:50
After mentioning the xml-resume-library package I was reminded that the English translation has been out of date for over a year.
With permission from the maintainer I've made a new upload which fixes this, and a couple of other bugs.
On a different topic it seems that many Debian-related websites are having their designs tweaked.
I'm not redesigning mine, but I'd love other people to have a go.
Tags: cv, debian-administration, links, not-job-hunting-mr.boss, resume, tools, xml
6 December 2007 21:50
After recently intending to drop the Planet Debian search, and receiving complaints that it was/is still useful, it looks like there is a good solution.
The code will be made live and official upon the planet debian in the near future.
The DSA team promptly installed the SQLite3 package for me, and I've ported the code to work with it. Once Apache is updated to allow me to execute CGI scripts it'll be moved over, and I'll export the current data to the new database.
In other news I'm going to file an ITP bug against asql as I find myself using it more and more...
Tags: asql, planet-debian, planet-search, random, todo
11 December 2007 21:50
So I run a blog. You might have read bits of it in passing, probably more due to syndication than actual desire.
Initially I started off using Wordpress, but that was later purged once I bit the bullet and decided I'd rather be blogless than install PHP upon any server I cared about.
To replace my blog I looked for different solutions and eventually decided to try something built upon Ruby (on Rails). That led to me trying Typo. After a little bit of pain I decided this wasn't a great solution.
So in the hunt for something new I moved to Mephisto, but over the past few months I've noticed that my virtual machine has been running low on memory - and the culprit has always been the Mongrel instance which was running the blog.
So once more into the breach.
This time I had a new plan. I'd actually use a piece of blog software which I wrote for a couple more sites, a static system which outputs RSS & HTML files - with no dynamic-fu at all. That system is the chronicle blog compiler.
In performing the migration I accidentally flooded a couple of Planet-Planet installations. It took me a moment to work out why: my RSS feeds didn't include valid "pubDate" fields for the entries.
So now I've made a new release of chronicle which supports valid RSS feeds, as tested against the feed validator and all is well. Well almost.
Users no longer have access to post comments, and whilst I've successfully migrated 700+ entries through four different blogging engines I've managed to lose most (if not all) comments along the way at different points in time, which is a real shame.
In the future I'll setup a simple CGI system to write comments to text files, and then allow them to be rsync'd and merged into the entries. I just don't have the concentration to manage that right now.
In the meantime hello to Planet Sysadmin - never heard of you before, but thanks for carrying my feed. I'm surprised I qualify, but I guess at the very least I'm spreading interesting lyrics...
(Remind me to post the mod_rewrite rules I'm using now; I've kept all links at the same location through successive blog migrations which means I have numerous mod_rewrite rules which are probably obsolete, but I just don't want to touch them because cool URIs don't change....)
Tags: chronicle, mephisto, wordpress
13 December 2007 21:50
After a lot of hacking I've now got chronicle displaying comments upon entries.
Since my blog is compiled on my home desktop machine and comment submission happens upon a remote machine the process involves a bit of a hack:
- Publish Blog
The blog is compiled and uploaded to the live location using rsync.
- Wait For Comments
Once the blog is live there are embedded forms which may be used to receive comments.
The CGI script which is the target of the forms will then write each comment out to a text file, located outside the HTTP-root.
- Sync Them
Prior to rebuilding the blog the next time I update I rsync the comments directory to my local machine - such that the comments posted are included in the output.
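The CGI script itself needs to do very little. A sketch of the idea - the spool path and response text are mine, not the real comment.cgi's, and all parsing/sanitising of the form fields is omitted:

```shell
#!/bin/sh
# Sketch of a comment-receiving CGI: take the raw POST body and
# drop it into a spool directory outside the HTTP-root, for later
# collection.  The real script does rather more validation.
receive_comment() {
    spool=$1
    mkdir -p "$spool"
    out="$spool/comment.$(date +%s).$$"
    head -c "${CONTENT_LENGTH:-0}" > "$out"    # read the form body
    printf 'Content-type: text/plain\r\n\r\n'
    echo "Thanks - your comment has been queued for moderation."
}
```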
Thus my local tree contains the entries alongside a comments/ directory.
Here I have a Makefile to automate the import of the comments from the live site to the local comments/ directory, rebuild, and finally upload.
All this means that I can rebuild a blog created by a combination of plain text post files and plain text comment files.
It also means that there is a fair lag between comment submission and publication - though I guess there is nothing stopping me from auto-rebuilding and syncing every hour or two via cron...
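For what it's worth the Makefile is shaped roughly like this - the hostname, paths, and chronicle flags are placeholders rather than gospel:

```make
# Placeholder host and paths; the chronicle flags are from memory
# and may not match the real options.
HOST = blog.example.org

upload: build
	rsync -az output/ $(HOST):/var/www/blog/

build: comments
	chronicle --input=. --output=output/

comments:
	rsync -az $(HOST):/home/blog/comments/ comments/
```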
I'll make a new release with this comment.cgi script and changes once the package hits Debian unstable...
Tags: chronicle, comments, meta
16 December 2007 21:50
Lars Wirzenius recently released, and packaged for Debian, a simple script to make release tarballs. He calls it Unperish.
It makes me wonder how many other people use that kind of system?
Off the top of my head the only similar thing I can recall using is Brad Fitzpatrick's ShipIt - another modular/plugin-based system (Perl rather than Python this time.)
For my needs I tend to just write a Makefile which has a "dist" target, and then I have a simple script called "release". This runs:
- make dist / make release.
- creates a gpg signature of the release.
- scp's the resulting files to a remote host.
All this is configurable via a per-project .release file.
The configuration files are very simple, the script itself is almost trivial but being able to sit in a random project directory and have a new tarball on my webserver just by typing "release" is enormously useful.
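A sketch of that release script, with a DRYRUN switch bolted on so the commands can be previewed; the variable names in .release are invented for the example:

```shell
#!/bin/sh
# Sketch of a per-project "release" helper.  The .release file is
# plain shell assignments (NAME, VERSION, and DEST are names
# invented here); DRYRUN=1 echoes the commands instead of running
# them.
release() {
    . ./.release
    tarball="$NAME-$VERSION.tar.gz"
    run() { if [ -n "$DRYRUN" ]; then echo "$@"; else "$@"; fi; }
    run make dist
    run gpg --armor --detach-sign "$tarball"
    run scp "$tarball" "$tarball.asc" "$DEST"
}
```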
There are times when I think I should make it a mini-project of its own, with the ability to auto-build Debian packages, etc. Other times I just think .. well, it's a hell of a lot better than my previous ad-hoc solution.
At the very least I think I will make the cosmetic change of updating the script to run "make test" if there is a test/ or t/ directory inside the generated tarball.
In real news - tomorrow I leave for a two week holiday with my partner's parents. Yesterday I got back from a night spent with her in York. The Bytemark staff night out. Lots of fun. Over too soon, but lots of fun.
Tags: brad, bytemark, christmas, lars, randomness, release, software, travel
24 December 2007 21:50
Well I've been in Devon for the past week, and will remain here for another week. During that time I've averaged about ten minutes of online time a day!
So far things are going well, but it will be weird spending time with another family over Christmas.
Still I've been vaguely productive. I've released a new version of chronicle - which has the CGI support for leaving comments upon my entries.
(TODO: Reinstate some previous comments for Martin Krafft)
One thing I'd not anticipated was the amount of spam I'd start receiving. The peak so far was 40+ comments a day, all with random URLs. I deleted them manually for the first day - but now I've written a shell script or two to classify comments via the spamfilter I use for handling email.
I'm not 100% sure I should have done that. I suspect that over time I will find better results if I actually have a distinct "blog spam" and "blog ham" corpus - rather than risk confusion over "email" and "blog" texts to classify. Still I shall wait and see.
The only thing that I think I want to do now is add a bit more fine control over the comment process, so the choice is not just between comments being globally on, or globally off. A nice middle-ground where I could say "Comments enabled for entries posted within the past two weeks or so".
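That middle ground is only a date comparison. A sketch, assuming GNU date's -d option is available for parsing the posting date:

```shell
#!/bin/sh
# Sketch: comments stay open only for entries younger than a
# cut-off, expressed in days.  Relies on GNU date's -d option.
comments_open() {
    posted=$1 window=${2:-14}    # window defaults to two weeks
    age=$(( ($(date +%s) - $(date -d "$posted" +%s)) / 86400 ))
    [ "$age" -le "$window" ]
}
```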
Anyway, that's my quota for today. Now I have to walk the family dog ..
Tags: christmas, chronicle, comments
28 December 2007 21:50
Tomorrow, all being well, I should receive a new desktop computer. I ordered it upon Boxing Day so the turnaround is pretty good.
Right now I have two desktop machines:
This machine is occasionally used by my partner, but primarily exists to remotely backup my various servers. It runs rsnapshot to do a full rsync of four machines every six hours.
This is my day-to-day desktop machine. It is going to be replaced, and this will be rotated into the backup machine. The machine vain will sit in a closet for a couple of months then be donated to somebody local.
As with all the machines I've bought recently the new one is coming from Novatech - (another company whose website has a surprising hostname.)
The interesting parts of the spec are:
- AMD X2 4200 processor
- 2 x 500Gb SATA drives
- 4Gb memory
The biggest difference to me will be the significantly increased disk space which means that I'll be able to remove the external 120Gb USB hard drive I currently house my MP3 collection upon. By retiring vain and moving my current desktop machine to be the "random guest machine" and backup host I'll also have a lot more space for storing backups upon. (Full database backups every six hours; I'm paranoid...)
Having 4Gb of memory will also mean that I can run more than 2 or 3 Xen guests at the same time.
My plan is to set up at least the following guests:
The desktop machines will run GDM / X11 and will be available for logins via xdmcp - so that I can have a fully separate working environment which is pristine and easily regenerated. Giving 5 xen instances 256Mb of memory each will leave me with close to 3Gb of memory for the host and that is three times what I have now!
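Each of those guests is just a small Xen configuration file. Illustrative values only - the kernel path, volume group, and bridge names will differ on any given box:

```
# /etc/xen/etch32.desktop.xen.cfg - values are illustrative.
kernel  = '/boot/vmlinuz-2.6.18-4-xen-686'
ramdisk = '/boot/initrd.img-2.6.18-4-xen-686'
memory  = 256
name    = 'etch32.desktop.xen'
vif     = [ 'bridge=xenbr0' ]
disk    = [ 'phy:vg0/etch32-disk,sda1,w', 'phy:vg0/etch32-swap,sda2,w' ]
root    = '/dev/sda1 ro'
```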
All in all I'm looking forward to the new machine enormously - a bargain at ~£290.
Tags: computers:mine, computers:vain, xdmcp
29 December 2007 21:50
As I wrote yesterday I've recently ordered, and now received a new desktop machine. For completeness I've now finished juggling machines around and installed etch twice - so now I have two working desktop machines:
- gold.my.flat (unstable)
This is my personal machine, fitted with 4Gb of RAM.
It is currently running 6 xen instances of 256Mb each leaving me with more memory free than I have ever had upon my desktop machine - so the upgrade was definitely worthwhile!
(The previous machine had bad sockets and couldn't take more than 1Gb of RAM - hence the replacement rather than mere upgrade.)
- vain.my.flat (etch)
This, reinstalled, machine is now running only backuppc and those programs that Meg/random guests wish to run.
In terms of networking I've now split things up into three well-defined ranges. Knowing how things develop I don't expect this to last, but right now I'm pleased with:
- These two desktops, a printer, and any wired laptops that are turned on.
All in all I'm very happy with the work of the day even if I did have to rsync stuff all over the place, juggle power cables and install Etch upon both the new machine, and the re-purposed backup machine. (Since that was previously my machine and was running sid.)
The following links were helpful.
All being well tomorrow will be spent tidying the flat, drinking with my erstwhile cat-sitter, and getting ready for work on Monday.
30 December 2007 21:50
Whilst I'm very pleased with my new segmented network setup, and the new machine, I'm extremely annoyed that I cannot get a couple of graphical Xen desktop guests up and running.
The initial idea was that I would setup a 64-bit installation of Etch and then communicate with it via VNC - xen-tools will do the necessary magic if you create your guest with "--role=gdm". Unfortunately it doesn't work.
When vncserver attempts to start upon an AMD64 host it dies with a segfault - meaning that I cannot create a scratch desktop environment to play with.
All of this works perfectly with a 32-bit guest, and that actually is pretty neat. It lets me create a fully virtualised, restorable, environment for working with flash/java/etc.
The bug was filed over three years ago as #276948, but there doesn't appear to be a solution.
Also, only on the amd64 guest, I'm seeing errors when I try to start X which mention things like "no such file or directory /dev/tty0". I've no idea what's going on there - though it could be a vt (virtual terminal) thing?
The upshot of all this is that I currently have fewer guests than I was expecting:
skx@gold:~/blog/data$ xm list
Name                          ID  Mem(MiB)  VCPUs  State   Time(s)
Domain-0                       0      3114      2  r-----   1180.6
cfmaster.services.xen          1       256      1  -b----      1.0
etch32.desktop.xen             2       256      1  -b----      1.4
etch32.security-build.xen      3       128      1  -b----      1.4
etch64.security-build.xen      4       128      1  -b----      1.4
sarge32.security-build.xen     5       128      1  -b----      1.0
Tags: lazyweb, woe is me, xen, xen-tools