
 

Entries tagged hacks

If you read the TV Guide, you don't need a TV

19 April 2008 21:50

So I've written a quick hack. A client-side filter/utility program for working against IMAP servers.

Consider it a general-purpose system similar to Procmail, but applied after your remote machine has already done the sorting.

Here's a flavour:


<GMail>
  username somebody.like.me
  password yeah.right
</GMail>

<Folders>
  <livejournal>
        unread exec /usr/local/bin/notify "Livejournal Comment"
        mark read
  </livejournal>

  <inbox>
        mark read
  </inbox>

</Folders>

What does that do? It first of all logs into GMail with the given username and password, then selects two folders:

=livejournal/

For each unread message in the folder it runs the specified command with STDIN being the message body.

Then it marks each new message as "read".

=inbox/

This simple rule just marks all messages as read.
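The livejournal rule above is easy to sketch with Python's imaplib. This is a guess at the behaviour described, not the hack's actual code; the folder name and command are taken from the sample config, and the commented-out GMail login is untested:

```python
import imaplib
import subprocess

def process_unread(imap, folder, command):
    """For each unread message in `folder`, run `command` with the
    message body on STDIN, then mark the message as read."""
    imap.select(folder)
    status, data = imap.search(None, "UNSEEN")
    if status != "OK":
        return 0
    count = 0
    for num in data[0].split():
        status, parts = imap.fetch(num, "(RFC822)")
        if status != "OK":
            continue
        # Feed the raw message body to the command on STDIN.
        subprocess.run(command, input=parts[0][1], shell=True)
        # Equivalent of "mark read": set the \Seen flag.
        imap.store(num, "+FLAGS", "\\Seen")
        count += 1
    return count

# Hypothetical usage against GMail (untested sketch):
# imap = imaplib.IMAP4_SSL("imap.gmail.com")
# imap.login("somebody.like.me", "yeah.right")
# process_unread(imap, "livejournal",
#                '/usr/local/bin/notify "Livejournal Comment"')
```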

Why? Well, I have a bunch of folders on a bunch of GMail accounts and I don't pay attention to them - but some specific mails should result in an SMS being sent to me ... so I need to do something clever.

I'm sure with a bit of effort this could be made IMAP-server independent, and could have a more flexible matching system. The simplicity right now comes about primarily because I don't want to parse a config file.

Anyway, suggestions for potential features are welcome. It does what I need as-is, even if it isn't pretty.

ObQuote: Lost Boys

| 2 comments

 

On the other side of the screen, it all looks so easy

20 April 2008 21:50

I've updated the IMAP utility that I mentioned previously, which has now been given the name sift. It now accepts, and processes, a much simpler configuration file format, keeping state as it goes.

Here's my updated sample file:

username: blah.bah
password: pas.word

#
#  Comments are fine.
#
folder:livejournal status:new subject:temp mark:read exec:~/bin/notify
folder:foo status:new mark:read
folder:bar status:old exec:/usr/local/bin/record delete

Each line consists of a set of tokens, split by whitespace, which is "executed" in order.

So the first rule selects the folder "livejournal" and finds messages which are "new"; each message containing "temp" in the subject is then marked as read, and the program "notify" is executed once for each match.

Essentially we keep a list of "current" messages as we process each line; that list is refined as we move through the line. (When a folder is opened all messages are selected by default.)

As a simple example to delete all the messages contained in a folder we'd use this:

folder:foo delete

To refine that to only delete messages from "fred" we'd say:

folder:foo from:fred delete

(If there were no matches the "delete" action wouldn't occur.)

Consider each line of input a collection of filters each operating on the previous result. Simple to understand, simple to extend with more operations, and simple for me to code!
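That filter-chain model can be sketched in a few lines of Python. Messages here are plain dicts and only a handful of illustrative tokens are handled - this is my reading of the semantics described above, not sift's actual code:

```python
def run_rule(line, folders):
    """Process one rule line: each token either refines the 'current'
    message list or acts on it. Opening a folder selects everything."""
    current = []
    for token in line.split():
        key, _, value = token.partition(":")
        if key == "folder":
            current = list(folders.get(value, []))     # all messages selected
        elif key == "status":
            current = [m for m in current if m["status"] == value]
        elif key == "subject":
            current = [m for m in current if value in m["subject"]]
        elif key == "from":
            current = [m for m in current if value in m["from"]]
        elif key == "mark" and value == "read":
            for m in current:
                m["status"] = "old"
        elif key == "delete":
            for m in current:
                m["deleted"] = True
        # exec:, move:, etc. would slot in here the same way.
    return current
```

With this shape, `folder:foo from:fred delete` falls out naturally: if nothing matches "fred", the delete loop simply runs over an empty list.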

TODO: Add a "move:xxx" to move a message to folder "xxx", and a bit more polish, then release.

ObQuote: Tron.

| No comments

 

I want reliable people, people who aren't going to be carried away

21 April 2008 21:50

OK I'm done with this now, the sift utility has been released.

I think there is a large overlap with imapfilter; but I win because I can write simple rules, rather than actual code, to perform jobs.

 

In other news I flew my kite today, and I still like eating Pies: Thank God reading Debian Planet isn't mandatory.

ObQuote: The Godfather

| No comments

 

You'd better get yourself a garlic T-shirt, buddy, or it's your funeral

6 March 2009 21:50

There are times when I hate xkcd. Mostly these are:

1. When you're reading a discussion on /. and you just know a particular image will be posted.

2. When you spend hours searching for a specific comic that you're certain exists.

The latter is what bit me tonight - I'm certain there exists a cartoon which has a plot of:

Woman says hi.

Guy says hi.

Woman looks confused.

Guy realises she was talking to her phone, not him.

Cannot find the image for the life of me - the only phone-related image I could find was tones.

I thought I might get lucky if I knocked up a quick hack to search the alt-text on all the images, but sadly not.
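For flavour, here is a hedged sketch of that kind of alt-text search. The fetching half assumes xkcd's per-comic JSON endpoint (info.0.json), which exists today but may postdate this post; the search half works on whatever metadata you've collected:

```python
import json
import urllib.request

def fetch_comic(num):
    """Fetch one comic's metadata (including 'alt') from xkcd's
    JSON interface. Network access required; untested sketch."""
    url = "https://xkcd.com/%d/info.0.json" % num
    with urllib.request.urlopen(url) as f:
        return json.load(f)

def search_alt(comics, needle):
    """Return the numbers of comics whose alt-text contains `needle`
    (case-insensitive)."""
    needle = needle.lower()
    return [c["num"] for c in comics if needle in c["alt"].lower()]
```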

Still, it was a fun project. To be uber-useful we'd need to persuade people to input the text of each cartoon, along with its number.

Given that there are only 550ish cartoons published thus far, creating a database would take one person a day, or a group of people a couple of hours.

Tempting .. very tempting ..

ObFilm: The Lost Boys. Yay!

| 7 comments

 

A lot of people drink mineral water

4 May 2009 21:50

dmonitor now has a webpage.

I've been running it for a night now, watching alerts come and go via manual firewall rules, so I'm pretty confident it works reliably and in a way that avoids transient failures. The only obvious failure case is if each monitoring node loses its links to all the others. (The solution there is to have a sufficiently large number of them! Hence the reason the configuration is file/directory based. rsync for the win!)

Still, I will leave it there for now. The only things missing are better instructions, and more service-checking plugins.

Now to go paint some more ... adding new wall decorations has inspired me a little!

ObFilm: Heathers

| No comments

 

But now that I have you in my custody, I may do with you what I please.

27 December 2009 21:50

I sketched out a quick prototype of a Kernel ChangeLog viewer:

Choose the kernel on the left, select the changelog summary at the top and the text is shown in the bottom pane.

I spend a fair amount of time reading kernel changelogs, and something like this (but with nice filtering and searching) would be useful. The only major problems I see are:

  • "Recent" changelog entries have one format, older ones have another.
  • You need to download a lot of changelog files locally for it to be useful.

Anyway if you follow kernels you might like the idea, if not the implementation. I look forward to seeing your improved version. (Doesn't free software rock? ;)

ObSubject: Aeon Flux

| 4 comments

 

I updated my redis-based filesystem

2 March 2011 21:50

In July last year I made a brief post about a simple filesystem I'd put together which used Redis for the storage.

At that time I thought it was a cute hack, and didn't spend too much time with it. But recently I found a use for it so I cleaned it up, synced up the C client for Redis which I used and generally started to care again.

If it is useful you can now find it online:

The basic idea is the same as it was before, except I did eventually move to an INODE-like system. Each file/directory entry receives a unique identifier (integer) - and then I store the meta-data in keys derived from that identifier.

This means for a file I might have keys, and values, like this:

Key             Value
INODE:1:NAME    The name of the file (e.g. "passwd").
INODE:1:SIZE    The size of the file (e.g. "1661").
INODE:1:GID     The group ID of the file's owner (e.g. "0").
INODE:1:UID     The user ID of the file's owner (e.g. "0").
INODE:1:MODE    The mode of the file (e.g. "0755").

To store the entries of each directory I use a Redis "SET", which allows me to easily iterate over all the entries in each directory.
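The key scheme is simple enough to sketch. This uses a toy in-memory stand-in for the four Redis commands involved (SET/GET/SADD/SMEMBERS) so it runs without a server, and the "DIR:" key prefix for directory sets is my invention, not necessarily what the filesystem uses:

```python
class FakeRedis:
    """Minimal in-memory stand-in for the Redis commands used below."""
    def __init__(self):
        self.kv = {}
        self.sets = {}
    def set(self, key, value):
        self.kv[key] = value
    def get(self, key):
        return self.kv.get(key)
    def sadd(self, key, member):
        self.sets.setdefault(key, set()).add(member)
    def smembers(self, key):
        return self.sets.get(key, set())

def create_file(r, inode, name, size, uid=0, gid=0, mode="0644", parent="/"):
    """Store a file's meta-data under INODE:<n>:* keys, and register the
    inode in its parent directory's SET so the directory can be listed."""
    r.set("INODE:%d:NAME" % inode, name)
    r.set("INODE:%d:SIZE" % inode, str(size))
    r.set("INODE:%d:UID" % inode, str(uid))
    r.set("INODE:%d:GID" % inode, str(gid))
    r.set("INODE:%d:MODE" % inode, mode)
    r.sadd("DIR:" + parent, inode)
```

Listing a directory is then just SMEMBERS on the directory key, followed by a GET of INODE:&lt;n&gt;:NAME for each member.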

ObQuote: "They fuck up, they get beat. We fuck up, they give us pensions. " - The Wire

| 3 comments

 

moreutils makes a lot of sense to me

15 April 2012 21:50

I've installed moreutils on several hosts now, and each time I find a new use for one of those tools I'm very happy.

Unfortunately I suspect that too many of us continue to hoard our little shell script archives, continuing to re-invent wheels with slightly different colours, shapes, and sizes.

I suspect I've just re-invented the wheel again, writing nrecent, because there seems to be no "simple" way of doing the job of keeping the N most recent files in a directory.

I guess the standard approach would be to use "ls", or "find", to find all files in a directory, reverse-sorted by mtime, then use "tail -n +$((N+1))" to skip the N you want to keep, removing the rest.

Anyway nrecent - keep the most recent N files in a directory:

nrecent --keep 20 /tmp

Written as naively/hurriedly as possible, as it solved, and continues to solve, a real need.
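The core logic is small enough to sketch in Python. This just mirrors the behaviour described above (the real nrecent is its own script, and this sketch ignores subdirectories and does no locking):

```python
import os

def nrecent(directory, keep):
    """Remove all but the `keep` most recently modified files in
    `directory`."""
    entries = [os.path.join(directory, f) for f in os.listdir(directory)]
    files = [p for p in entries if os.path.isfile(p)]
    files.sort(key=os.path.getmtime, reverse=True)   # newest first
    for path in files[keep:]:                        # everything past N goes
        os.remove(path)
```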

ObQuote: "You're like the drummer from REO Speedwagon. Nobody knows who you are. " - Employee of the month

| 3 comments