
 

Entries tagged chronicle

Welcome to where time stands still

21 December 2006 21:50

I can be childish too: PHP must die

I've been actively migrating services away from PHP for a while now, even at the cost of having to use less mature applications which aren't as pretty. I'm definitely at the point now where I simply don't trust it and the applications which (ab)use it.

Tonight's job is to look for a non-PHP blogging application which can import my WordPress content. I even wrote 75% of one myself once upon a time, but I lacked the time to finish it off.

Sure, there are good PHP coders out there; I'm just sick of the rest.

Somebody should write an interpreted templating system similar to PHP, but with no bugs. Although maybe that's the problem: giving people the ability to mix presentation inline with code was a fundamentally bad idea to start with, since it led to non-programmers producing things that were broken.

I do keep meaning to look at Rails, but I'm usually turned off by the problems of mixing Debian packages with the rubyforge gems - I like to keep things from one source. (e.g. 99% of my perl modules are Debian packages, not CPAN packages.)

| No comments

 

Dio has rocked

13 August 2007 21:50

Inspired by Joey's wiki compiler I've been toying with a blog compiler.

Very similar idea - you give it a directory of text files, and it creates a static blog complete with tagging support, RSS feeds, and all that good stuff.

Feel free to have a look - probably the demo is the most interesting bit.

The only obvious downside is that people cannot easily leave comments... However that might be a plus for some people, especially those that don't want to touch MySQL / PHP / etc.

| No comments

 

The temple walls are made of flesh

4 September 2007 21:50

This week has consisted of fighting registrars and doing a bit of hacking on xen-shell, xen-tools, and the chronicle blog compiler.

CJ has done some good work trying to get the code modularised, and I expect between the pair of us we can make things neater and better generally.

I've also fixed a couple of bugs relating to the hard-wiring of device names (/dev/sda, /dev/tty1, etc). These names are obsolete in newer versions of Xen, which want to use /dev/xvc0 and /dev/xvd[a-z] instead.

There's nothing else happening at the moment; I'm just having a lot of fun laughing at our new kitten sliding around on our polished wooden floor!

Chronicle seems to be getting pretty popular, which is ironic because it was a quick hack to allow me to post blog entries on a couple of hosted sites - which I've not yet done. Oops.

In other news I'm loving the Nintendo DS at the moment. Megan brought me one back from America on her recent trip, and I don't think a day has passed where one or both of us hasn't played for at least 30 minutes.

I'm annoyed that Sim City DS only allows us to play with one city - right now it is her turn, and I'll have to wait until she's finished with her creation before I can have a go - because otherwise I'll wipe her city out.. :(

| No comments

 

I roll and I tumble practically all night long

10 October 2007 21:50

Today mostly consisted of a new release of the chronicle blog compiler. Interestingly this received several random mails. I wonder what caused that all of a sudden?

The release of the compiler is timely, as it reminds me I've still not managed to find a decent gallery compiler. Although the thought of writing my own, rightly, fills me with depression.

I've been interested in reading more about both Git and SELinux upon Planet Debian recently (the SELinux posts mostly being Russell Coker's). I've switched several small projects over to Git, but I've not yet listed them publicly. First of all I would like to see if there is a version of trac that I can install which supports git repositories. I guess that's a job to research tomorrow.

I wonder if I would confuse people by hosting Git projects upon cvsrepository.org? ;)

| No comments

 

I put a spell on you

11 October 2007 21:50

Felipe Sateler kindly made a Debian package for the chronicle blog compiler, so you can now get it from my apt-get repository.

He suggested it be uploaded to Debian sid; I'm happy to do so if there is any interest. Otherwise I'll keep placing releases there when they occur.

(To be honest I don't anticipate any major development unless there are bugs, or people would like to contribute themes ..)

| No comments

 

It's been seven hours and fifteen days

26 October 2007 21:50

I made a new release of the Chronicle blog compiler the other day, which seems to be getting a surprising number of downloads from my apt repository.

The apt repository will be updated shortly to drop support for Sarge, since in practice I've not uploaded new things there for a while.

In other news I made some new code for the Debian Administration website! The site now has the notion of a "read-only" state. This state forbids new articles from being posted, new votes being cast, and new comments being posted.

The read-only state is mostly designed for emergencies, and for admin work upon the host system (such as when I'm tweaking the newly installed search engine).

In more coding news I've been updating the xen-shell a little recently, so it will shortly have the ability to checksum the filesystems of Xen guests - and later validate them. This isn't a great security feature because it assumes you trust dom0 - and more importantly to checksum files your guest must be shut down.

However as a small feature I believe the suggestion was an interesting one.

Finally I've been thinking about system exploitation via temporary file abuse. There are a couple of cases that are common:

  • Creation of an arbitrary (writeable) file upon a host.
  • Creation of an arbitrary (non-writable) file upon a host.
  • Truncation of an existing file upon a host.

Exploiting the first to go from user to root access is trivial. But how would you exploit the last two?

Denial of Service attacks are trivial via the creation/truncation of /etc/nologin, /etc/shadow, or even /boot/grub/menu.lst! But gaining privileges? I can't quite see how.
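To make the Denial of Service case concrete, the classic symlink trick looks something like this (all names here are invented for illustration):

# A root-owned daemon predictably creates/truncates /tmp/daemon.1234
# on startup.  Point that name somewhere painful beforehand:
ln -s /etc/nologin /tmp/daemon.1234

# When the daemon next opens "its" temporary file the open() call
# follows the symlink, /etc/nologin springs into existence, and
# non-root logins are disabled.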

Comments welcome!

| No comments

 

No, I don't want your number

23 November 2007 21:50

I'm still in the middle of a quandary with regards to revision control.

90% of my open code is hosted via CVS at a central site.

I wish to migrate away from CVS in the very near future, and having ummed and ahhed for a while I've picked mercurial as my system of choice. There is extensive documentation, and it does everything I believe I need.

The close runner-up was git, but on balance I've decided to choose mercurial as it wins in a few respects.

Now the plan. I have two options:

  • Leave each project in one central site.
  • Migrate project $foo to its own location.

e.g. my xen-tools could be hosted at mercurial.xen-tools.org, and my blog compiler could live at mercurial.steve.org.uk.

Alternatively I could just leave the one site in place, ignoring the fact that the domain name is now inappropriate.

The problem? I can't decide which approach to go for. Both have plusses and minuses.

Suggestions or rationales welcome - but no holy wars on why any particular revision control system is best...

I guess ultimately it matters little, and short of mass-editing links it's 50/50.

| No comments

 

Your trumpet's blowing for far too long,

11 December 2007 21:50

So I run a blog. You might have read bits of it in passing, probably more due to syndication than actual desire.

Initially I started off using Wordpress, but that was later purged once I bit the bullet and decided I'd rather be blogless than install PHP upon any server I cared about.

To replace my blog I looked for different solutions and eventually decided to try something built upon Ruby (on Rails). That led to me trying Typo. After a little bit of pain I decided this wasn't a great solution.

So in the hunt for something new I moved to Mephisto, but over the past few months I've noticed that my virtual machine has been running low on memory - and the culprit has always been the Mongrel instance which was running the blog.

So once more into the breach.

This time I had a new plan. I'd actually use a piece of blog software which I wrote for a couple more sites, a static system which outputs RSS & HTML files - with no dynamic-fu at all. That system is the chronicle blog compiler.

In performing the migration I accidentally flooded a couple of Planet-Planet installations. It took me a moment to work out why: my RSS feeds didn't include valid "pubDate" fields for the entries...

So now I've made a new release of chronicle which supports valid RSS feeds, as tested against the feed validator and all is well. Well almost.

Users no longer have access to post comments, and whilst I've successfully migrated 700+ entries through four different blogging engines I've managed to lose most (all) comments along the way at different points in time, which is a real shame.

In the future I'll setup a simple CGI system to write comments to text files, and then allow them to be rsync'd and merged into the entries. I just don't have the concentration to manage that right now.

In the meantime hello to Planet Sysadmin - never heard of you before, but thanks for carrying my feed. I'm surprised I qualify, but I guess at the very least I'm spreading interesting lyrics...

(Remind me to post the mod_rewrite rules I'm using now; I've kept all links at the same location through successive blog migrations which means I have numerous mod_rewrite rules which are probably obsolete, but I just don't want to touch them because cool URIs don't change....)
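For illustration, the rules are all of this general shape - a made-up example, not one of my actual rules:

RewriteEngine on
# An old Wordpress-era permalink, redirected permanently to the
# static page the blog compiler now generates for the same entry:
RewriteRule ^archives/2006/12/php-must-die/?$ /php_must_die.html [R=301,L]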

| No comments

 

Children Of The Damned

13 December 2007 21:50

After a lot of hacking I've now got chronicle displaying comments upon entries.

Since my blog is compiled on my home desktop machine and comment submission happens upon a remote machine the process involves a bit of a hack:

Publish Blog

The blog is compiled and uploaded to the live location using rsync.

Wait For Comments

Once the blog is live there are embedded forms which may be used to receive comments.

The CGI script which is the target of the forms will then write each comment out to a text file, located outside the HTTP-root.

Sync Them

Prior to rebuilding the blog the next time I update, I rsync the comments directory to my local machine - such that the comments posted are included in the output.

Thus my local tree looks something like this:

~/blog/
|-- comments/
|-- data/
|-- output/
|-- Makefile
`-- chroniclerc

Here I have a Makefile to automate the import of the comments from the live site to the local comments/ directory, rebuild, and finally upload.
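The Makefile's job boils down to these steps - a sketch, with the remote host, paths, and exact chronicle invocation as placeholders for whatever your setup uses:

#!/bin/sh
LIVE=blog.example.org:blog       # placeholder remote host/path

# 1. pull down any comments submitted since the last build
rsync -az "$LIVE/comments/" comments/

# 2. rebuild the static output, comments included
chronicle                        # flags/config as appropriate

# 3. push the generated pages to the live site
rsync -az output/ "$LIVE/htdocs/"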

All this means that I can rebuild a blog created by a combination of plain text post files and plain text comment files.

It also means that there is a fair lag between comment submission and publication - though I guess there is nothing stopping me from auto-rebuilding and syncing every hour or two via cron...

I'll make a new release with this comment.cgi script and changes once the package hits Debian unstable...

| 4 comments

 

I saw momma kissing santa claus

24 December 2007 21:50

Well I've been in Devon for the past week, and will remain here for another week. During that time I've averaged about ten minutes of online time a day!

So far things are going well, but it will be weird spending time with another family over Christmas.

Still I've been vaguely productive. I've released a new version of chronicle - which has the CGI support for leaving comments upon my entries.

(TODO: Reinstate some previous comments for Martin Krafft)

One thing I'd not anticipated was the amount of spam I'd start receiving. The peak so far was 40+ comments a day, all with random URLs. I deleted them manually for the first day - but now I've written a shell script or two to classify comments via the spamfilter I use for handling email.

I'm not 100% sure I should have done that. I suspect that over time I will find better results if I actually have a distinct "blog spam" and "blog ham" corpus - rather than risk confusion over "email" and "blog" texts to classify. Still I shall wait and see.

The only thing that I think I want to do now is add a bit more fine control over the comment process, so the choice is not just between comments being globally on, or globally off. A nice middle-ground where I could say "Comments enabled for entries posted within the past two weeks or so".

Anyway that's my quota for today. Now I have to walk the family dog...

| No comments

 

I still got the blues for you

11 January 2008 21:50

Been a week since I posted. I've not done much, though I did complete most of the migration of my planet-searching code to gluck.debian.org.

This is now logging to a local SQLite database, and available online.

I've updated the blog software so that I can restrict comments to posts made within the past N days - which has helped with spam.

My other comment-spam system is the use of the crm114 mail filter. I have a separate database now for comments (distinct from that I use for email), and after training it on previous comments all is good.

Other than being a little busy over the past week life is good. Especially when I got to tell a recruitment agent that I didn't consider London to be within "Edinburgh & Surrounding Region". Muppets.

The biggest downside of the week was "discovering" a security problem in Java, which had been reported in 2003 and is still unfixed. Grr. (CVE-2003-1156 for those playing along at home).

Here's the code:

#!/bin/sh
#
#  Grep for potentially unsafe /tmp usage in shared libraries
#


tmp=$(mktemp) || exit 1      # race-safe, unlike a predictable /tmp/$$

find /lib /usr/lib -name '*.so' -type f -print > "$tmp"

while read -r lib; do
    out=$(strings "$lib" | grep '/tmp')
    if [ -n "$out" ]; then
        echo "$lib"
        echo "$out"
    fi
done < "$tmp"

rm -f "$tmp"

| 4 comments

 

Never made it as a wise man

16 January 2008 21:50

Recently I bought a new desktop PC and rotated my existing machines around, leaving me with one spare. Last night I donated that PC to a local friend (co-incidentally the same woman who had been our cat-sitter when I went to stay at Megan's parents for Christmas).

I said "Its got Debian on it, that funny operating system that you used when you were looking after our kitten, remember?".

She said "Linux seems to be getting popular these days, I might as well try it out".

But seriously, she used it. It had GNOME, it had firefox, and she seemed happy. It isn't rocket science these days to point a technicalish user at a Debian Desktop and expect them to know what to do with it. The only minor complication was the lack of flash/java since it was etch/amd64 - but that's probably a plus given the amount of flash adverts!

In other news I made a new release of asql last night. This now copes with both Apache's common and combined logformats by default. (Yet another tool of mine which is now being used at work, which motivated the change.)

I also started pushing commits to xen-tools last night, so there'll be a new release soon. Mostly I'm bored with that code though; it doesn't need significant updates to work any longer. All the interesting things have been done, so it's probably only a matter of time until I drift off. (To the extent that I've unsubscribed myself from the xen-users & xen-devel mailing lists.)

Virtualisation is increasingly a commodity these days. If xen dies people probably won't care, there are enough other projects out there doing similar things. Being tied to one particular platform will probably come to be regarded as a mistake.

(Talking of which I should try out this lguest thing; I just find it hard to be motivated...)

| 4 comments

 

Don't you just hate loose ends?

21 March 2008 21:50

Today I spent a while fixing some more segfault bugs. I guess that this work qualifies as either fixing RC bugs, or potential security bugs.

Anyway I did an NMU of libpam-tmpdir a while back to fix all but one of the open bugs against it.

I provided a patch for #461625 yelp: segfault while loading info documentation, which fixes the symptoms of bad info-parsing, and avoids the segfault.

I also looked into the #466771 busybox cpio: double free or corruption during cpio extraction of hardlinks - but it turns out that was already fixed in Sid.

Finally I found a segfault bug open against ftp:

To reproduce this bug run:

skx@gold:~$ ftp ftp.debian.org
220 saens.debian.org FTP server (vsftpd)
Name (ftp.debian.org:skx): anonymous
331 Please specify the password
Password: [email protected]
ftp> cd debian/doc
250 Directory successfully changed.
ftp> get dedication-2.2.cn.txt dedication-2.2.de.txt dedication-2.2.es.txt ..
local: dedication-2.2.de.txt remote: dedication-2.2.cn.txt
Segmentation fault

You need to repeat the arguments about 50 times. But keep adding more and more copies of the three files to the line until you get the crash.

It isn't interesting as a security issue as it is client-side only; but as a trivially reproducible issue it becomes fun to solve.

Let's build it with debugging information, and run it again. Here is what we see:

Core was generated by `./ftp/ftp ftp.debian.org'.
Program terminated with signal 11, Segmentation fault.
#0  0x00002b85ad77f1cf in fwrite () from /lib/libc.so.6
(gdb) up
#1  0x0000000000408c3e in command (fmt=0x40dd15 "TYPE %s") at ftp.c:366
366		fputs("\r\n", cout);
(gdb) up
#2  0x0000000000402c3e in changetype (newtype=3, show=<value optimized out>)
    at cmds.c:348
348			comret = command("TYPE %s", p->t_mode);
(gdb) up
#3  0x000000000040a569 in recvrequest (cmd=<value optimized out>,
    local=0x623d10 "dedication-2.2.de.txt",
    remote=0x6238d4 "dedication-2.2.cn.txt", lmode=0x40e310 "w",
    printnames=<value optimized out>) at ftp.c:935
935			changetype(type, 0);

OK so things look trashed, and not in the middle of a copy/sprintf/similar. i.e. there is no obvious problem.

Let's take a step back from things. We know that the crash occurs when we send a long command line. Looking over the code we see the function main.c:makeargv(). This attempts to split the input command line string into an array of tokens.

Interestingly we see this:

char **
makeargv(int *pargc, char **parg)
{
	static char *rargv[20];
	int rargc = 0;
	char **argp;

I wonder what happens if we set the 20 to 2048? Good guess. The crash no longer occurs. (Though I'm sure it would if you entered enough tokens...)

So we know that the crash is relating to the number of space-separated tokens upon the command line. If we increase the limit we'll be fine. But of course we want to fix it properly. There are two ways forward:

  • Abort handling a line if there are >15 "space" characters on the line.
  • Recode the makeargv function to work properly.

I did eventually submit a patch to the bug report which uses dynamic memory allocation, and should always work. Job done.
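The shape of that fix is roughly as follows - a sketch of the approach rather than the actual submitted patch, with slurpstring() standing in for however the original code pulls the next token from the line:

#include <stdlib.h>

char **
makeargv(int *pargc, char **parg)
{
    size_t size = 20;               /* initial capacity */
    char **rargv = malloc(size * sizeof(char *));
    int rargc = 0;
    char *sp;

    if (rargv == NULL)
        return NULL;

    while ((sp = slurpstring()) != NULL) {
        if ((size_t)rargc + 2 > size) {     /* room for token + NULL */
            size *= 2;
            char **tmp = realloc(rargv, size * sizeof(char *));
            if (tmp == NULL) {
                free(rargv);
                return NULL;
            }
            rargv = tmp;
        }
        rargv[rargc++] = sp;
    }

    rargv[rargc] = NULL;
    *pargc = rargc;
    (void)parg;     /* the real function also returns a rest-of-line pointer */
    return rargv;
}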

I mailed the maintainer of FTP and said unless I heard differently I'd NMU and cleanup the package in a week.

All being well this entry will be nicely truncated in the RSS feeds as support for the <cut> tag was the main new feature in my previous upload of chronicle - the blog compiler I use/wrote/maintain.

ObQuote: Razor Blade Smile

| No comments

 

It's just a puzzle box!

3 April 2008 21:50

Chronicle

I've made a new release of the chronicle blog compiler, primarily to allow the "Subject:" header to be used for new blog subjects.

(That allows new entries to be automatically posted via email, with an appropriate procmail setup. I'll add one as an example shortly.)
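In the meantime the recipe would be of roughly this shape (the subject tag and script name are invented for illustration):

  # ~/.procmailrc - hand suitably-tagged mail to a script which
  # writes a new entry into chronicle's data directory.
  :0
  * ^Subject:.*\[blog\]
  | $HOME/bin/mail-to-blog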

RSS Utility

Whilst on the subject of RSS creation (huh?) I've written a tiny utility which will create an RSS feed from a list of text files. It will also create an index.html file to match.

To see why this is useful you could view my recent changelog.

I think there is a need for a small tool to read files and create feeds from them - like mod_index_rss does, but without messing with Apache.

If there is any interest I'd be happy to release the code; as-is it doesn't use a template...

Anonymous Hosting?

Online privacy is important. Mostly when this is discussed it is in the context of client-side anonymity.

Looking at it from the other side, though, how do you host a website anonymously?

You could register the domain via a proxy, or with bogus details. But if you host the site yourself the IP address may be traced to the hosting provider, and that may be used to trace back to you.

So, the alternatives? Well you could use a hosted site such as livejournal / wordpress / googlepages / etc. But pretty surely they'll be able to trace content back to you - and if you don't host it there's a high chance they'll just pull it if you talk about "bad things". (I guess you could use TOR for uploading / your connections there.)

So, going back to the question. How can you host something, easily accessible to the world, without risk of your identity/association being discovered?


I'm, obviously, ignoring FreeNet. Two reasons for that:

  • It's slow, has no search-engine goodness, and is unproven.
  • It requires an atypical client. Aunt Milly won't be able to surf Freenet...

I almost think the best way forward would be to write a site which was a proxy for a file-sharing protocol, then link people to items that way. Relying on the swarm to host the files..

The downside is that you'd have to have a convincing argument for when RIAA comes calling, suggesting that you're sharing their stuff too. If it wasn't a general purpose proxy then the deniability is gone, and if it is you're at risk of general copyright infringement claims.

Hard problem. Shame.

ObQuote: HellRaiser

| 6 comments

 

So helpless against what is coming.

14 April 2008 13:04

I've made a new release of the chronicle blog compiler.

There are a couple of minor changes to the variables exported to the theme templates, as contributed by MJ Ray, and a new spooling system.

This works in a simple fashion, and allows you to queue up posts. For example, if you write a new entry containing the pseudo-header "Publish: 20th April 2008" and you have a crontab entry containing this:

  chronicle-spooler \
    --spool-dir=~/blog/spool/  \
    --live-dir=~/blog/data/  \
    --post-move='cd ~/blog && make upload'

It works as expected. When you call this on the 20th April the file will be moved from ~/blog/spool into ~/blog/data, and your blog will be rebuilt & uploaded.
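A spooled entry is just an ordinary post plus that extra pseudo-header - something like this (subject and body invented):

  Subject: A post from the future
  Publish: 20th April 2008

  The body of the entry, written exactly as normal.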

The implementation was different than the original suggestion, but is nice and clean, and can be used to do other things with a little bit of effort.

Anyway if you see this entry the spooling system is working!

ObQuote: 30 Days of Night.

| No comments

 

That wasn't true. Made it up. Shouldn't have done that. Sorry.

18 April 2008 21:50

Chronicle

My blog compiler received a bit of love recently, primarily because MJ Ray wanted to use it.

As mentioned before I've added a simple spooling system, and the mercurial repository now contains a simple RSS importer.

Debian Work

In other news I've been working on various Debian packages; here is a brief summary:

bash-completion

After seeing an RFH bug I closed a few bash-completion bugs, and submitted patches for a couple more.

I was intending to do more, but I'm still waiting for the package code to be uploaded to the alioth project.

javascript work

I've updated the jquery package I uploaded to follow the new "Javascript standard" - in quotes only because it is both minimal and new.

Once the alioth project has been configured I'll upload my sources.

Apache2

I've agreed to work on a couple of SSL-related bugs in the Apache 2.x package(s) - time disappeared but I hope to get that done this weekend.

Initially that was because I was hoping I could trade a little love for getting a minor patch applied to mod_vhost_alias - instead I've now copied that module into libapache2-mod-vhost-bytemark and we'll maintain our own external module.

Hardware

I've been loaned a Nokia 770 which is very nice. Having used it with vim, ssh & etc I think that I'd rather have a device with a real keyboard.

The Nokia 810 looks pretty ideal for me. I'm going to be asking around to see if I can get a donated/loaned device to play with for a while before I take the plunge and pay for one of my own.

I've got a couple more things on the go at the moment, but mostly being outdoors is more interesting to me than the alternative. Hence the downturn in writing and releasing security advisories.

I'll pick things up more fully over the coming weeks I'm sure.

ObQuote: Shaun of the Dead

| No comments

 

Please don't let them be as boring as Brian's friends

3 May 2008 21:50

I made an emergency release of the chronicle blog compiler yesterday, after noticing that it was truncating titles containing periods.

That was a bit of a mea culpa moment, but I guess mistakes happen.

The new release is in perfect shape for Lenny, and now includes two new scripts installed into the examples/ directory - the second of which detects duplicate entries.

Running that script against my own blog I discovered several duplicates. I guess my film quotes - having only a limited source collection to work from - could also include duplicates, so I've updated my Makefile to only build and rsync my blog if none are detected.

(In many ways that films site is the precursor to this blog; it uses a collection of text files, one per film, and generates a cross-linked HTML output of film entries. Sadly it is out of date, because entering titles is a real pain..)

Chronicle Comments

I'm pleased with the comment process now, though: the CGI comment-submission script simply archives each submitted comment into a "comments/" directory on the webserver.

There a cron-job passes each one through a Bayesian filter and moves the file(s) to either "comments/good/", "comments/bad/", or "comments/unsure/".
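The cron-job amounts to little more than this (with classify standing in for the real filter invocation, and paths invented):

#!/bin/sh
# File each pending comment according to the filter's verdict.
cd ~/comments || exit 1

for c in *; do
    [ -f "$c" ] || continue
    case $(classify < "$c") in
        good) mv "$c" good/   ;;
        bad)  mv "$c" bad/    ;;
        *)    mv "$c" unsure/ ;;
    esac
done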

When I come to rebuild the blog I rsync the "comments/good" directory to my local machine, rebuild and then rsync the output back to my remote webserver.

(On a single machine this would be a much simpler process!)

I've imported my blog source into a mercurial repository, so the client-side is consistent. I have a bad habit of making new postings from wherever I happen to be, and having a central repository will make that less prone to disaster.

Just running "make steve" against the Makefile is sufficient to rebuild everything and sync it to my live system.

ObQuote: Kalifornia

| No comments

 

You get the dwarf. I get the girl.

21 May 2008 21:50

Recently I have mostly been "behind". I've caught up a little on what I wanted to do though over the past couple of days, so I won't feel too bad.

I've:

  • made a new release of the chronicle blog compiler, after receiving more great feedback from MJ Ray.
  • un-stalled the Planet Debian.
  • updated the weblogs hosted by Debian Administration, after help and suggestions from Daniel Kahn Gillmor.
  • stripped, cleaned, and tested a new steam engine, nearly dying in the process.
  • discovered a beautiful XSS attack against a popular social networking site, then exploited it en masse to collect hundreds of username/password pairs - all because the site admins said "Prove it" when I reported the hole. Decisions, decisions - what to do with the list...
  • released a couple of woefully late DSAs.
  • started learning British Sign Language.

Anyway I've been bad and not writing much recently on the Debian Administration site, partly because I'm just sick of the trolling comments that have been building up, and partly due to a general lack of time. I know I should ignore them, and I guess by mentioning them here I've kinda already lost, but I find it hard to care when random folk are being snippy.

Still, I've remembered that some people are just great to hear from. I know if I see mail from XX they will offer an incisive, valid criticism or a fully tested and working patch. Sometimes both at the same time.

In conclusion I need my pending holiday in the worst way; and I must find time to write another letter...

ObQuote: Dungeons & Dragons

| 2 comments

 

I think I'll take this back

24 July 2008 21:50

KVM Utility

Gunnar Wolf made an interesting post about KVM today which is timely.

He points to a simple shell script for managing running instances of KVM which was a big improvement on mine - and so is worth a look if you're doing that stuff yourself.

Once I find time I will document my reasons for changing from Xen to KVM, but barring a few irritations I'm liking it a lot.

Chronicle Theme Update

I made a new release of the chronicle blog compiler yesterday, mostly to update one of the themes.

That was for purely selfish reasons as I've taken the time to update the antispam protection site I'm maintaining. There have been some nice changes to make it scale more and now it is time for me to make it look prettier.

(A common theme - I'm very bad at doing website design.)

So now the site blog matches the real site.

ObQuote: Resident Evil

| No comments

 

If you don't learn to behave yourself - there won't be a tonight

2 September 2008 21:50

Yesterday I made a new release of the chronicle blog compiler. This fixes a bug in the handling of comments.

Previously comments were sorted badly when they crossed a month boundary. Now they are always sorted first to last - which makes reading entries with multiple comments more natural.

Other than that I've been readying for the launch of a new MX machine for my mail filtering service. The process went pretty smoothly, and so I'm happy.

Still have that paranoid feeling that something will break, but at the very least I'll hear about it quickly thanks to the SMS-alerts!

ObMovie: Brief Encounter

| No comments

 

I gotta motor if I wanna be ready for that party tonight.

22 February 2009 21:50

Since I already shared it elsewhere here is my KVM-launcher, and the mercurial repository it lives in.

I'll add my kvm-shell program later - the tools I've written so far are mostly standalone, rather than a package.

This is almost a content-free post, but I can pretend it isn't because I'm testing a new theme on my blog. The theme is included in the new release of my chronicle blog compiler which was released yesterday.

ObFilm: Heathers

| No comments

 

Poppa's got a brand new bang.

6 October 2009 21:50

Recently I posted a brief tool for managing "dotfile collections". This tool was the rationalisation of a couple of ad-hoc scripts I already used, and was a quick hack written in nasty bash.

I've updated my tool so that it is coded in slightly less nasty Perl. You can find the dotfile-manager repository online now.

This tool works well with my dotfile repository, and the matching, but non-public dotfiles-private repository.

I suspect that this post might flood a couple of feed aggregators, because I've recently updated my chronicle blog compiler with a new release. This release has updated all the supplied themes/templates such that they validate strictly, and as part of that I had to edit some of my prior blog entries to remove bogus HTML markup. (Usually simple things such as failing to escape & characters correctly, or using "[p][/P]" due to sloppy shift-driving.)

I should probably update the way I post entries, and use markdown or textile instead of manually writing HTML inside Emacs, but the habit has been here for too long. Even back when I used wordpress I wrote my entries in HTML...

Finally one other change in the most recent chronicle release is that the "mail-scanning.com theme" has been removed, as the service itself is no longer available. But all is not lost.

ObFilm: Blade II

| No comments

 

There's no such thing as a wrong war

13 October 2009 21:50

Once upon a time I wrote a blog compiler, a simple tool that would read in a bunch of text files and output a blog. This blog would contain little hierarchies for tags, historical archives, etc. It would also have a number of RSS feeds too.

Every now and again somebody will compare it to ikiwiki and I'll ignore that comparison entirely, because the two tools do different things in completely different fashions.

But I was interested to see Joey talk about performance tweaks recently as I have a blog which has about 900 pages, and which takes just over 2 minutes to build from start to finish. (Not this one!)

I've been pondering performance for a while as I know my current approach is not suited to high speed. Currently the compiler reads in every entry and builds a giant data structure in memory which is walked in different fashions to generate and output pages.

The speed issue comes about because storing the data structure entirely in memory is insane, and because sometimes a single entry will be read from disk multiple times.

I've made some changes over the past few evenings such that a single blog entry will be read no more than once from disk (and perhaps zero times if Memcached is in use :) but that doesn't solve the problem of the memory usage.

So last night I made a quick hack - using my introduction to SQLite as inspiration I wrote a minimal reimplementation of chronicle which does things differently:

  • Creates a temporary SQLite database with tables: posts, tags, comments.
  • Reads every blog entry and inserts it into the database.
  • Uses the database to output pages.
  • Deletes the database.
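In shell terms the new pipeline amounts to something like this (the schema here is invented for illustration; the real tables hold whatever the entries need):

#!/bin/sh
db=$(mktemp) || exit 1

# 1. a temporary database with the three tables
sqlite3 "$db" 'CREATE TABLE posts(id INTEGER PRIMARY KEY, title TEXT, date TEXT, body TEXT);
               CREATE TABLE tags(post INTEGER, name TEXT);
               CREATE TABLE comments(post INTEGER, body TEXT);'

# 2. one INSERT per blog entry (parsing omitted here)

# 3. walk the database, not RAM, to emit each page - e.g. the archive:
sqlite3 "$db" 'SELECT date, title FROM posts ORDER BY date DESC;'

# 4. throw it all away
rm -f "$db"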

This is a significantly faster approach than the previous one - with a "make steve" job taking only 18 seconds, down from just over 2 minutes 5 seconds.

("make steve" uses rsync to pull in comments on entries, rebuilds the blog, then uses rsync to push the generated output into its live location.)

ObFilm: If...

| 6 comments

 

What the hell are you laughing at?

24 January 2010 21:50

Slaughter

I received my first patch to slaughter today, which made me happy.

(I've made a new release including it, and updated the list of primitives to actually document the file-deletion facilities which previously I'd omitted to avoid encouraging mass-breakage.)

Signing Binaries

Andrew Pollock mentions that the days of elfsign might be numbered.

This is a shame because I've always liked the idea of signing binaries. Once upon a time, in the 2.4.x days, I wrote a kernel patch which would refuse to execute non-signed binaries. (This was mostly a waste of time; since it denied the execution of shell scripts. Which meant that the system init scripts mostly failed. My solution in the end was to only modprobe my module once the system was up and running, and hope for the best ...)

Right now, having performed only a quick search, I don't see anything like that at the moment.

  • elfsign will let you store a binary's MD5 hash.
  • bsign will let you sign a binary with a GPG key.

But where is the kernel patch to only execute such hashed/signed binaries, preventing the execution of random shell scripts and potentially trojaned binaries?

Without that I think signing binaries is a crazyish thing to do. Sure you can test that a file hasn't been modified, but even without those tools you can do the same thing via md5sums.

(ObRandom: Clearly if you mass-modify all your binaries the system md5sums database will be trashed.)

Perl UTF

I've received a bug report against chronicle, my blog compiler.

It seems that some versions of perl fail to run this:

     #
     #  Run the command, reading stdout.
     #
    open( FILTER, "$cmd|;utf8" ) or
       die "Failed to run filter: $!";

Removing the ;utf8 filter allows things to work, but will trash any UTF-8 characters from the output - so that's a nasty solution.

I'm not sure what the sane solution is here, so I'm going to sit on it for a few days and continue to write test scripts.
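For what it's worth, the first thing I plan to test is dropping the ;utf8 suffix and applying the layer via binmode instead - a sketch, not a verified fix:

     #
     #  Run the command, reading stdout as UTF-8.
     #
    open( my $filter, '-|', $cmd ) or
       die "Failed to run filter: $!";
    binmode( $filter, ':encoding(UTF-8)' );

That keeps the UTF-8 decoding, while avoiding the mode-string syntax which some perl versions apparently reject.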

ObSubject: 300

| 6 comments

 

Are you sure you don't mind me going without you?

17 March 2010 21:50

Recently I received a small flurry of patches to my blog compiler, from Chris Frey. These patches significantly speedup rebuilding a static blog when using Danga's memcached.

The speedup is sufficiently fast that my prior SQLite based approach is no longer required - and (re)building my blog now takes on the order of 5 seconds.

On the topic of other people's blogs I've been enjoying David Watson's recent photo challenge. I was almost tempted to join in, but I'm not sure I could manage one every day - although I can pretend I recently carried out my first real photoshoot.

I'm still taking pictures of "things/places" but I'm starting to enjoy "people" more. With a bit of luck I'll get some more people to pose in the near future, even if I have to rely upon posting to gumtree for local bodies!

ObFilm: Love Actually

| 2 comments

 

Different cultures call us by different names

18 April 2010 21:50

This past week has had a couple of minor software releases:

chronicle

I made a new release which improves support for foreign languages, so dates can be internationalised, etc.

The online demos now include one with French month names.

slaughter

The perl-based sysadmin tool had a minor update earlier today, after it was pointed out that I didn't correctly cope with file content checks.

I'm still pretty pleased with the way this works out, even if it is intentionally simple.

milli

This is a simple bug-record-thingy which I was playing with recently, and I've now started using to record bugs in other projects.

I'll pretend it's a fancy distributed-bug-tracker, but actually it isn't. It's nothing more than a bunch of text-files associated with a project, which have sufficiently random names that collisions are unlikely and which thus become semi-distributed-friendly.

Today I'll be learning to love Javascript a little more. I want to use the galleriffic image gallery - but it doesn't make thumbnails automatically - which galleria does.

I need to either come up with my own which looks like galleriffic, or port the thumbnail bits over.

(I'm currently using a slightly modified version of galleriffic for my people-shots.)

ObFilm: Hancock

| 6 comments

 

Everybody loves me, and I intend to keep it that way.

3 June 2010 21:50

Would it be wrong to make a new blog entry just to test the blogging platform? If so I'm wrong.

To pretend that isn't the case I'll briefly document some recent events:

Coding

I've been making changes to the blogging software, chronicle, to support RSS feeds on comments, and to allow outgoing Pings on new entries.

I've also been flirting with an OpenID-backed image upload and sharing site, provisionally named picshare. Why? Partly for fun, partly because I like comments, and partly to flirt with RESTful design and API construction.

Photography

I've provisionally agreed to be the photographer for a special event at a local hair-dressers. This will either be fun, or all end badly.

No money, but free food and the chance to experiment more. So all good.

There. Almost like a real entry, right?

ObSubject: Cruel Intentions

| No comments

 

Happy birthday to me

10 March 2012 21:50

Recently I accidentally flooded Planet Debian with my blog feed. This was an accident caused by some of my older blog entries not having valid "Date:" headers. (I use chronicle which parses simple text files to build a blog, and if there is no Date: header present in entries it uses the CTIME of the file(s).)

So why did my CTIMEs get lost? Short version: I had a drive failure and a PSU failure which led to me rebuilding a few things and cloning a fresh copy of my blog to ~/hg/blog/.

My host is now once again OK, but during the pain the on-board sound started to die: horribly crackly and sounding bad. I figure the PSU might have caused some collateral damage, but so far that's the only sign I see.

I disabled the on-board sound and ordered a cheap USB sound device which now provides me with perfect sound under the Squeeze release of Debian GNU/Linux.

In the past I've ranted about GNU/Linux sound. So I think it is only fair to say this time things worked perfectly - I plugged in the device, it was visible in the output of dmesg, and /proc/asound/cards and suddenly everything just worked. Playing music (mpd + sonata) worked immediately, and when I decided to try playing a movie with xine just for fun sound was mixed appropriately - such that I could hear both "song" + "movie" at the same time. Woo.

(I'm not sure if I should try using pulse-audio, or similar black magic. Right now I've just got ALSA running.)

Anyway as part of the re-deployment of my desktop I generated and pass-phrased a new SSH key, and then deployed that with my slaughter tool. My various websites all run under their own UID on my remote host, and a reverse-proxy redirects connections. So for example I have a Unix "s-stolen" user for the site stolen-souls.com, an s-tasteful user for the site tasteful.xxx, etc. (Right now I cannot remember why I gave each "webserver user" an "s-" prefix, but it made sense at the time!)

Anyway once I'd fixed up SSH keys I went on a spree of tidying up and built a bunch of meta-packages to make it a little more straightforward to re-deploy hosts in the future. I'm quite pleased with the way those turned out to be useful.

Finally I decided to do something radical. I installed the bluetile window manager, which allows you to toggle between "tiling" and "normal" modes. This is my first foray into tiling window managers, but it seems to be going well. I've got the hang of resizing via the keyboard and tweaked a couple of virtual desktops so I can work well both at home and on my work machine. (I suspect I will eventually migrate to awesome, or similar, this is very much a deliberate "ease myself into it" step.)

ObQuote: "Being Swedish, the walk from the bathroom to her room didn't need to be a modest one. " - Cashback.

| 4 comments

 

If this goes well I have a new blog engine

17 September 2014 21:50

Assuming this post shows up then I'll have successfully migrated from Chronicle to a temporary replacement.

Chronicle is awesome, and despite a lack of activity recently it is not dead. (No activity because it continued to do everything I needed for my blog.)

Unfortunately though there is a problem with chronicle: it suffers from a bit of a performance problem which has gradually become more and more vexing as the number of entries I have has grown.

When chronicle runs it:

  • reads each post into a complex data-structure.
  • walks this structure multiple times.
  • finally outputs a whole bunch of posts.

In the general case you rebuild a blog because you've made an entry, or received a new comment. There is some code which tries to use memcached for caching, but in general chronicle just isn't fast, and it is certainly memory-bound if you have a couple of thousand entries.

Currently my test data-set contains 2000 entries and to rebuild that from a clean start takes around 4 minutes, which is pretty horrific.

So what is the alternative? What if you could parse each post once, add it to an SQLite database, and then use that for writing your output pages? Instead of the complex data-structure in-RAM and the need to parse a zillion files you'd have a standard/simple SQL structure you could use to build a tag-cloud, an archive, & etc. If you store the contents of the parsed blog, along with the mtime of the source file, you can update it if the entry is changed in the future - as I sometimes make typos which I only spot once I've run make steve on my blog sources.
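That freshness test is conceptually no more than this (a sketch; the table layout and helper name are invented):

# rebuild a post $f only if the source file is newer than the stored copy
stored=$(sqlite3 blog.db "SELECT mtime FROM posts WHERE file='$f';")
actual=$(stat -c %Y "$f")
if [ -z "$stored" ] || [ "$actual" -gt "$stored" ]; then
    reparse_and_update "$f"    # hypothetical helper
fi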

Not surprisingly the newer code is significantly faster if you have 2000+ posts. If you've imported the posts into SQLite the most recent entries are updated in 3 seconds. If you're starting cold, parsing each entry, inserting it into SQLite, and then generating the blog from scratch the build time is still less than 10 seconds.

The downside is that I've removed features, obviously nothing that I use myself. Most notably the calendar view is gone, as is the ability to use date-based URLs. Less seriously there is only a single theme, which is what is used upon this site.

In conclusion: what I wrote last night is a stepping stone between the current chronicle and chronicle2, which will appear in due course.

PS. This entry was written in markdown, just because I wanted to be sure it worked.

| 9 comments

 

Kraków was nice

4 October 2014 21:50

We returned safely from Kraków, despite a somewhat turbulent flight home.

There were many pictures taken, but thus far I've only posted a random night-time shot. Perhaps more will appear in the future.

In other news I've just made a new release of the chronicle blog compiler, so 5.0.7 should shortly appear on CPAN.

The release contains a bunch of minor fixes, and some new facilities relating to templates.

It seems likely that in the future there will be the ability to create "static pages" along with the blog-entries, tag-clouds & etc. The suggestion was raised on the github issue tracker and as a proof of concept I hacked up a solution which works entirely via the chronicle plugin-system, proving that the new development work wasn't a waste of time - especially when combined with the significant speedups in the new codebase.

(ObRandom: Mailed the Debian package-maintainer to see if there was interest in changing. Also mailed a couple of people I know who are using the old code to see if they had comments on the new code, or had any compatibility issues. No replies from either, yet. *shrugs*)

| No comments

 

Apologies for the blog-churn.

19 February 2017 21:50

I've been tweaking my blog a little over the past few days, getting ready for a new release of the chronicle blog compiler (github).

During the course of that I rewrote all the posts to have 100% lower-case file-paths. Redirection-pages have been auto-generated for each page which was previously mixed-case, but unfortunately that will have meant that the RSS feed updated unnecessarily.

That triggered a lot of spamming, as the URLs would have shown up as being new/unread/distinct.
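For the record, the rename itself was nothing cleverer than a loop of this shape (a sketch, not the exact commands I ran) plus generating a redirect page for every old mixed-case path:

for f in data/*; do
    lc=$(echo "$f" | tr '[:upper:]' '[:lower:]')
    [ "$f" = "$lc" ] || mv "$f" "$lc"
done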

| 3 comments

 

A blog overhaul

8 October 2019 18:00

When this post becomes public I'll have successfully redeployed my blog!

My blog originally started in 2005 as a Wordpress installation, at some point I used Mephisto, and then I wrote my own solution.

My project was pretty cool; I'd parse a directory of text-files, one file for each post, and insert them into an SQLite database. From there I'd initiate a series of plugins, each one to generate something specific:

  • One plugin would output an archive page.
  • Another would generate a tag cloud.
  • Yet another would generate the actual search-results for a particular month/year, or tag-name.

All in all the solution was flexible and it wasn't too slow because finding posts via the SQLite database was pretty good.

Anyway I've come to realize that this freedom and architecture was overkill. I don't need to do fancy presentation, and I don't need a loosely-coupled set of plugins.

So now I have a simpler solution which uses my existing template, uses my existing posts - with only a few cleanups - and generates the site from scratch, including all the comments, in less than 2 seconds.

After running make clean a complete rebuild via make upload (which deploys the generated site to the remote host via rsync) takes 6 seconds.

I've lost the ability to be flexible in some areas, but I've gained all the speed. The old project took somewhere between 20-60 seconds to build, depending on what had changed.

In terms of simplifying my life I've dropped the remote installation of a site-search, which means I can now host this as a static site with only a single handler to receive any post-comments. (I was 50/50 on keeping comments. I didn't want to lose those I'd already received, and I do often find valuable and interesting contributions from readers, but being 100% static had its appeal too. I guess they stay for the next few years!)

| 5 comments