It is funny the way things work out when you're looking for help.
Recently I was working on a Ruby + FUSE based filesystem and as part of the development I added simple diagnostic output via trivial code such as this:
@debug && puts("called foo(#{param});")
That was adequate for minimal interactive use, but not so good for real use. In production I started outputting messages to a dedicated logfile, but in practice I became overwhelmed by thousands of lines of output describing everything ever applied to the filesystem.
I figured the natural solution was a ring buffer. (Everybody knows what a ring buffer is, right?) It could keep the last 500 messages, with newer debug information simply replacing older entries. That would be just enough to be useful if I had a problem, but not so overwhelming that it would get ignored.
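For illustration, here is a minimal sketch of the idea in Ruby (the class and method names are mine, not from any existing library): keep only the most recent N messages, discarding the oldest as new ones arrive.

```ruby
# A tiny in-process ring buffer for log messages.  Once `size` entries
# have been stored, each new message pushes out the oldest one.
class RingLog
  def initialize(size = 500)
    @size    = size
    @entries = []
  end

  # Append a message, dropping the oldest entry if we're over capacity.
  def log(msg)
    @entries << msg
    @entries.shift if @entries.length > @size
  end

  # Return a copy of the current contents, oldest first.
  def to_a
    @entries.dup
  end
end
```

So after logging 600 messages into a 500-slot buffer, only the last 500 remain, which is exactly the "recent history only" behaviour I wanted.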
In Perl I found a nice ring-buffer library, but for Ruby nothing. Locking a region of shared memory via shmget and shmat and keeping an array of a few hundred strings would be simple, but it seems odd that I have to code this myself.
I started searching around and stumbled upon the unrelated IPC::DirQueue Perl module. Not useful for my ring-buffer logging problem, but beautifully useful nonetheless.
There is no package for Debian but that was easily created:
dh-make-perl --build --cpan IPC::DirQueue
Already I have a million and one uses for it - not least to solve my problem of maintaining a centralised quarantine for all the spam mail rejected by N MX machines. (Which currently uses a combination of rsync and lockfiles.)
This is the reason why sites like the Perl Advent Calendar are useful - they showcase a useful module every day or two, introducing you to things that you can use in the future.
Of course, keeping a sustainable site like that up and running is hard, which is why sites like debaday struggle to attract contributors, for example.
Anyway, random happiness.
ObFilm: Lord of the rings: Two Towers
Tags: fuse, perl, random, ruby
Something like
find . -type f -size +100M -a \! -name \*z -print0 | xargs -0 -P 4 -n 1 gzip
will keep 4 gzips busy as long as there are files to be found.
This can also be handy for things like recursive greps, where you are IO-bound by metadata lookups on lots of small files:
find . -type f -print0 | xargs -0 -P 10 -n 10 grep pattern