
 

Entries posted in February 2008

Shout it out

3 February 2008 21:50

Well I've had a busy weekend, but I'm sober now.

I made a new release of xen-tools, which has a couple of minor bugfixes and not much else. I also released a new Debian update of the xen-shell, which fixes a couple of bashisms.

Finally I've managed to sign up two new users to my anti-spam proxy. Hopefully they're very happy.

In real news I painted about 1 square meter of my flat, (we're now into week three of painting a single room...), and replaced five light bulbs:

My eyes! The goggles do nothing!

Now I need to install a request tracking system (otrs2) and catch up on some significantly overdue RT status updates.

I'm getting hopeless again.

Maybe I should just give it all up and become a plumber. Plumbing is easy: Water goes downhill. The rest is just regulations and a willingness to get dirty...

| No comments

 

Lot 666 then: a chandelier in pieces

6 February 2008 21:50

This week I have been primarily looking over ticketing systems.

So far I've looked at the following:

roundup

Initially this was the package that I liked best. It is small, self-contained, and easy both to install and manage.

The significant downside was that it expected all users to create an account before submitting a ticket. If they didn't, and they mailed [email protected], they'd receive a reply which read "Unknown user: [email protected]".

A shame, because it was reasonably featureful.

otrs2

This was a little fiddly to get going with, but actually surprisingly good.

Had I not experimented with roundup I'd have said it was the best I'd looked at.

request-tracker

Big. Heavy. Intense.

It looks ugly, but provides a bamillion (metric) configuration options and can be used to do almost anything you'd like.

If it didn't suck up so many resources I'd have had no qualms about using it. Even the mail-gateway was nice and simple to get working.

I'm going to take a look around some more and see if there are any non-packaged systems to look at. Pointers welcome.

All I want is for incoming emails to appear in some sort of queue where they may be assigned to an owner. If the original submitter can view the ticket online, great; but it's no big deal if not.

I'd expect that replies sent to the ticket would be routed back to the submitter, and that replies to those would similarly update the ticket. I guess that's standard though?

| No comments

 

I wish I could tie you up in chains

10 February 2008 21:50

Today I've been mostly unwell. Although I have managed to write some minor new code, and watch a little bit of Doctor Who on DVD.

Recently several people have been ranting about Ruby on Rails. I like it, but I wouldn't use it for personal development in a hurry. Deployment is fiddly, and upgrades are annoying.

But one thing that I utterly condemn Rails for is helping to spread bad paging throughout the online world.

So, what is "bad paging" and why is it important? Well, cool URLs don't change, right? "Bad paging" is any user-interface which presents you with a limited, non-bookmarkable view onto a changing list of items.

Consider the following "list". Assume it represents your view of a collection of items numbering 100+. You may only view ten items at a time; clicking "next", or "previous", to navigate your viewport:

1.  first item
2.  second item
..
10. tenth item

[see next: /start/1] [see prev]

What's wrong with this picture? It is subtle, but this list is broken. The issue is that when the list grows new items are prepended to the front, yet the navigation is linked to the starting page number.

If that description wasn't clear consider what happens if you want to bookmark the page containing item 11. How can you?

Right now it is at /start/1. If ten new items are added to the head of the list then it will instead become /start/2 - as the items that are currently numbered 1-10 will be shifted forward to become items 11-20, and they will be on page /start/1 instead.

The solution is simple enough once you consider what you want to happen:

  • Either append new items to the end of the list,
    • such that /start/1 always gives items 11-20;
  • or number the links in reverse order.

So why does nobody do that? (As a counter-example look at my website: rather than the 'Show previous' link pointing you at the ever-changing /start/1, it instead links you to, for example, /start/569.)
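
To make that concrete, here's a toy sketch of the stable scheme; it isn't code from this site, and the numbers are invented, but it shows why counting pages from the oldest entry keeps every URL fixed:

#!/bin/sh
# Pages are numbered from the *oldest* entry, so the contents of
# /start/1 never change; new entries only ever create a new,
# higher-numbered page at the front.
total=573        # pretend total number of entries
per_page=10
newest_page=$(( (total + per_page - 1) / per_page ))
echo "the front page currently shows the contents of /start/${newest_page}"
echo "'show previous' from there links to /start/$(( newest_page - 1 ))"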

| 2 comments

 

A good cockerel always points north

11 February 2008 21:50

I spent a while yesterday thinking over the software projects that I'm currently interested in. It is a reasonably short list.

At the time I just looked over the packages that I've got installed and the number of bugs. I'm a little disappointed to see that the bugfixes that I applied to GNU screen have been mostly ignored.

Still, I have Thursday and Friday off work this week, and will probably spend them releasing the pending advisories I've got in my queue, and then fixing N bugs in a single package.

The alternative is to build a quick GPG-based mailing list manager.

I'd like a simple system which allowed users to subscribe, and only accepted GPG-signed mails. The subscriber could choose to receive their messages either signed (as-is) by the submitter or encrypted to them.

So to join you'd do something like this:

subscribe [email protected] [encrypted]
--BEGIN PUBLIC KEY--
...
--END PUBLIC KEY--

There is the risk, with a large enough number of users, that a list could DOS the host if it had to encrypt each message to each subscriber. But if submissions were validated as being signed by a user with a known key the risk should be minimal, unless there is a lot of traffic.

The cases are simple:

  • foo-subscribe => Add the user to the list, assuming valid key data is found.
  • foo-unsubscribe => Do the reverse.
  • foo:
    • If the message is signed, accept it and either mail it to each recipient as-is, or encrypt it on a per-recipient basis (sketched below).
    • If the message is not signed, or is signed by a non-subscriber, drop it.
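
Something along these lines, fed the raw message on stdin by the MTA, would cover that last case. It's only a sketch under assumptions: the keyring path and the distribute-to-subscribers helper don't exist, and a real version would need to cope with MIME-signed mail too.

#!/bin/sh
# Hypothetical "foo:" delivery handler; the keyring path and the
# distribute-to-subscribers helper are placeholders.
KEYRING=/var/lib/gpg-list/subscribers.gpg
MSG=$(cat)                 # the incoming mail, piped in by the MTA

# Verify the (clearsigned) message against the subscriber keyring only.
if printf '%s\n' "$MSG" | \
   gpg --no-default-keyring --keyring "$KEYRING" --verify >/dev/null 2>&1
then
    # Signed by a known subscriber: pass it on for delivery, either
    # as-is or encrypted per-recipient, according to their preference.
    printf '%s\n' "$MSG" | /usr/local/bin/distribute-to-subscribers
else
    # Unsigned, or signed by an unknown key: drop it silently.
    exit 0
fi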

There are some random hacks out there for this, including a mailman patch (did I mention how much I detest mailman yet today?) but nothing recent.

| 1 comment

 

Listen to me when I'm telling you

14 February 2008 21:50

So today I'm a little bit lazy and I've got the day off work. As my previous plan suggested I wanted to spend at least some of the day tackling semi-random bugs. Earlier I picked a victim: less.

less rocks, and I use it daily. I even wrote an introduction to less once upon a time.

So let's take a look at two bugs from the long-neglected pile. These two issues are basically the same.

They seem like simple ones to fix, with the same root cause. Here's an example if you want to play along at home:

 cp /dev/null testing
 gzip testing
 zless testing.gz

What do you see? I see this:

"testing.gz" may be a binary file.  See it anyway?

When I select "y" I see the raw binary of the compressed file.

So, we can reproduce it. Now to see why it happens. /bin/zless comes from the gzip package and is a simple shell script:

#!/bin/sh
# snipped a lot of text
LESSOPEN="|gzip -cdfq -- %s"; export LESSOPEN
exec less "$@"

So what happens if we run that?

$ LESSOPEN="|gzip -cdfq -- ~/testing.gz" /usr/bin/less ~/testing.gz
"/home/skx/testing.gz" may be a binary file.  See it anyway?

i.e. it fails in the same way. Interestingly this works just fine:

gzip -cdfq -- ~/testing.gz | less

So we've learnt something interesting and useful. We've learnt that when LESSOPEN is involved we get the bug. Which suggests we should "apt-get source less" and then "rgrep LESSOPEN ~/less-*/".

Doing so reveals the following function in filename.c:

	public char *
open_altfile(filename, pf, pfd)
	char *filename;
	int *pf;
	void **pfd;
{

/* code to test whether $LESSOPEN is set, and attempt to run the
   command if it is */

		/*
		 * Read one char to see if the pipe will produce any data.
		 * If it does, push the char back on the pipe.
		 */
		f = fileno(fd);
		SET_BINARY(f);

		if (read(f, &c, 1) != 1)
		{
			/*
			 * Pipe is empty.  This means there is no alt file.
			 */
			pclose(fd);
			return (NULL);
		}
		ch_ungetchar(c);
		*pfd = (void *) fd;
		*pf = f;
		return (save("-"));

That might not be digestible, but basically less runs the command specified in $LESSOPEN. If it can read a single character of output from that command it replaces the file it was going to read with the output of the command instead!

(i.e. Here less failed to read a single character, because the decompressed output of our gzipped file was zero bytes long! So instead it reverted to showing the binary gzipped file.)

So we have a solution: if we want this to work we merely remove the "read a single character" test. I can't think of a circumstance in which that would do the wrong thing, so I've submitted a patch to do that.
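
If you want to check the idea at home, one rough way to do so (assuming you already have the build-dependencies installed; the exact package version is whatever apt-get fetched for you) is:

cd ~/less-*/
# remove the single-character read() test from open_altfile() in
# filename.c, then rebuild and reinstall the package:
debuild -us -uc
sudo dpkg -i ../less_*.deb
zless ~/testing.gz    # should now show an empty file, not raw gzip data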

Bug(s) fixed.

Incidentally, if you like this kind of "debugging by example" post, or hate it, do let me know, so I'll know whether to take notes next time or not.

| 22 comments

 

I got the poison

20 February 2008 21:50

I've two video-related queries, which I'd be grateful if people could help me out with:

Mass Video Uploading

Is there any tool, or service, which will allow me to upload a random movie to multiple video-download sites? Specifically I'm curious to learn whether there is a facility to transcode a given input file as necessary and then upload it to YouTube, Google Video, and other sites as a one-step operation.

Mass Video Searching

Related to that, is there a service which will allow me to search for videos with given titles/tags/keywords across multiple video-hosting networks?

Regarding the searching I see that YouTube has support for "OpenSearch", but Google's video hosting has neither that nor a sitemap.xml file: Irony Is ...

| 6 comments

 

No, no, no, no.

20 February 2008 21:50

I'm going to admit up front here that I'm pushing my luck, and that I anticipate the chances of success are minimal. But that aside... A lot of people read my entries via syndication, and I'm optimistic that somebody here in the UK will have a copy of the following three books which they could send me:

  • Flash Gordon vol 3: Crisis on Citadel II
  • Flash Gordon vol 5: Citadels under attack
  • Flash Gordon vol 6: Citadels on Earth

(All three are cheap paperback pulp fiction novels from the 1980s written by Alex Raymond.)

If you have a copy of any of those three books, and are willing to part with them, then I'd love to hear from you. Either as a comment or via email.

I'm certainly expecting to pay for them, up to around £5 for each volume.

Backstory: I read the first when I was 10-12, then mostly forgot about it.

A while back I remembered enjoying it and bought volumes 1, 2, 3, & 4 from an online store. I got screwed and volume 3 hasn't arrived, but possibly that will be rectified soon.

Here in the UK the last two volumes are either extremely rare or extremely in demand. Typically they seem to sell for £15-30 - I'm frustrated to not have the conclusion, but not desperate to spend so much money upon them, (been there, done that).

So if anybody has some or all of these books and can bear to part with them please do let me know.

</luck:pushing>

| No comments

 

I guess I'll go on home, it's late

24 February 2008 21:50

Friday: La traviata.

Saturday: The Remorseful Day.

Sunday: Pie & Chips.

Could life get better, without more pie?

| No comments

 

I'm wearing my heart like a crown

27 February 2008 21:50

For the past couple of days I've been working on some "easy hosting" setup for Debian. This is a continuation of my shell-script based solution, but intended to be much more dynamic.

The system makes it simple for us to deploy a Debian Etch installation, and to allow users to create virtualhosts easily. (And by easily I mean by creating directories, and very little else.)

So, for example, to create a new website simply point your domain example.com at the IP address of your machine. Then run:

mkdir -p /srv/example.com/cgi-bin
mkdir -p /srv/example.com/htdocs
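
Behind the scenes something like the following mod_vhost_alias snippet maps the requested hostname straight onto those directories. It's a hypothetical sketch rather than the exact configuration used here:

# %0 is the full hostname from the request.
UseCanonicalName Off
VirtualDocumentRoot /srv/%0/htdocs
VirtualScriptAlias  /srv/%0/cgi-bin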

If you then want to allow FTP access to upload you may run:

echo "mysecretpass" > /srv/example.com/ftp-password

This will give you FTP access with the username "example.com" and the password "mysecretpass". You'll be chrooted into the /srv/example.com/ directory.

All of this is trivial, via Apache's mod_vhost_alias and a simple script to split logfiles and generate per-domain statistics with webalizer. The only thing that I really needed to do was come up with a simple shell script & cron entry to build up an FTP password file for pure-ftpd (sketched below).
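
For what it's worth, a first cut of that cron job might look something like this. The pure-pw invocation details and the uid the virtual users map to are assumptions, so check pure-pw(8) before trusting it:

#!/bin/sh
# Run from cron: for every /srv/<domain>/ containing an ftp-password
# file, create a pure-ftpd virtual user named after the domain,
# chrooted into that directory.
for dir in /srv/*/; do
    domain=$(basename "$dir")
    passfile="${dir}ftp-password"
    [ -f "$passfile" ] || continue
    pass=$(cat "$passfile")
    # pure-pw reads the new password twice from stdin; -m rebuilds the
    # puredb immediately. (An existing login would want "pure-pw passwd"
    # instead of "useradd".)
    printf '%s\n%s\n' "$pass" "$pass" | \
        pure-pw useradd "$domain" -u www-data -d "$dir" -m
done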

So here's where it gets interesting. The next job, obviously, is to handle mail for the domains. Under Debian it should be a matter of creating an appropriate /etc/exim4/exim4.conf - and ignoring the rest of the setup.

I'm getting some help with that, because despite knowing almost too much about SMTP these days I'm still a little hazy on Exim4 configuration.

I'm watching the recent Debian configuration-packages system with interest, because although right now I'm not touching any configuration files, I'm sure that it is only a matter of time.


In other news I cut prices, and am seeing a lot of interest in my mail-scanning.

Finally my .emacs file has been tweaked a lot over the previous few days. Far too much fun. (support files.)

| 2 comments