
 

Entries posted in November 2007

thinking everything's gonna be as sweet as pie

1 November 2007 21:50

I'm in a position where I need to rebuild a Linux kernel for a number of distributions and architectures. Currently the distributions are:

  • Debian Etch
  • Ubuntu Dapper
  • Ubuntu Edgy
  • Ubuntu Feisty
  • Ubuntu Gutsy

(For each distribution I need a collection of packages for both i386 and amd64.)

I've written a couple of scripts to automate the process - first of all running "make menuconfig" within a debootstrap-derived chroot of each arch & distribution pair. Then later using those stored .config files to actually produce the packages via make-kpkg.
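The build loop itself can be sketched like this (every path, the chroot layout, and the make-kpkg invocation are assumptions about my setup rather than the actual scripts; RUN=echo keeps it a dry run):

```shell
#!/bin/sh
# Rough sketch of the distribution/architecture build loop described
# above; the chroot paths, config locations, and targets are all
# hypothetical.  RUN=echo prints the plan instead of executing it.
RUN=${RUN:-echo}

build_all() {
    for dist in etch dapper edgy feisty gutsy; do
        for arch in i386 amd64; do
            chroot=/srv/chroots/$dist-$arch
            config=/srv/configs/$dist-$arch.config
            # Drop the saved .config into the chroot, then build:
            $RUN cp "$config" "$chroot/usr/src/linux/.config"
            $RUN chroot "$chroot" make-kpkg --arch "$arch" \
                --initrd kernel_image kernel_headers
        done
    done
}

build_all
```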

This process, as you could imagine, takes several hours to complete. Then there's the testing ...

I'm sure there must be other people with this kind of need, but I was surprised to find nothing in my search attempts.

ObRandom: I'm tempted to switch from song-lyrics to film names as post titles. Undecided as yet. I guess it doesn't really matter, just gives me a small amount of amusement. Even now.

| No comments

 

When the day is through

4 November 2007 21:50

The webpages for the Debian Security Audit Project have been outdated for quite some time, largely because they contained two static pages comprised of large lists which were annoying to update:

  • The list of security advisories we've been responsible for.
  • The list of security-sensitive bug reports we'd made which didn't require a security advisory. (ie. packages in sid)

Last week, with the help of Rhonda, and a couple of other people, I removed the static lists and replaced them with simple data files, and a perl script to convert those data files into HTML content.

Now the advisories which have been released are ordered by date, and broken down into years. For example 2002, 2003, and so on. We also have a list of all the people who've been credited with at least one advisory.
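The conversion needs nothing clever; as a rough sketch, using an entirely hypothetical data-file format rather than the real one:

```shell
# Hypothetical advisory data format: "id|date|package" per line;
# render() emits one HTML list per year, newest year first.
render() {
    data=$1
    for year in $(cut -d'|' -f2 "$data" | cut -d- -f1 | sort -ru); do
        echo "<h2>$year</h2>"
        echo "<ul>"
        awk -F'|' -v y="$year" \
            '$2 ~ "^" y { print "  <li>" $1 " - " $3 "</li>" }' "$data"
        echo "</ul>"
    done
}

# Tiny made-up data file to demonstrate:
data=$(mktemp)
cat > "$data" <<'EOF'
DSA-1234|2007-01-10|foo
DSA-1200|2006-11-02|bar
EOF
render "$data"
```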

There are some outstanding advisories still to be included, but otherwise I'm much happier with the process (and feel only guilt at the breakage of the translations).

There isn't much actual auditing happening at the moment, with only four advisories released in 2007 compared to many more at the peak. But I guess that is a separate problem, and one that I can do less about - short of finding more time to look at source code.

| No comments

 

There's trouble blowing like a hurricane

6 November 2007 21:50

xen-tools has just got a new command:

xen-resize-guest --hostname=foo.my.flat --increase=5Gb

That will take the existing guest foo.my.flat and magically resize the main disk image to be 5Gb larger. This works for LVM & loopback images, but not yet for users of EVMS. Whilst doing this job manually isn't terribly difficult, it can be troublesome to perform all the steps in order without screwing up. Hence the new command.
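For the loopback case the manual steps being automated are roughly these (a sketch of the idea, not the tool's actual code; the paths and sizes are made up):

```shell
# Sketch of a manual loopback-image resize.  grow() only extends the
# sparse image file; the commented filesystem steps would follow on a
# real ext3 image.
grow() {
    img=$1
    extra_mb=$2
    # Extend the file by seeking past its current end (sparse, so
    # this is instant and uses no real disk space until written):
    dd if=/dev/zero of="$img" bs=1M count=0 \
        seek=$(( $(stat -c %s "$img") / 1048576 + extra_mb )) 2>/dev/null
    # Then, on a real filesystem image, check and grow it:
    #   e2fsck -f "$img"
    #   resize2fs "$img"
}

img=$(mktemp)
dd if=/dev/zero of="$img" bs=1M count=1 2>/dev/null   # pretend 1Mb guest disk
grow "$img" 5                                         # make it 5Mb larger
```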

In other news I've managed to fix my broken greylisting implementation - so all the mails which were previously being queued upon klecker.debian.org should now be delivered/bounced.

I believe the blame here was 50/50 me and exim's back-off behaviour, but I'll know better in the future.

Remember me when you come to choose your next anti-spam service. It copes beautifully with a sustained delivery rate of 300-700 messages a minute when queues suddenly restart delivery ;)

TODO: Catchup on mail. Implement message tagging for mutt to better keep track of items which are pending/claimed by me. Until we get RT.

| No comments

 

Since you've been gone

10 November 2007 21:50

Confessor - Terry Goodkind's last novel in the Sword of Truth series.

Brilliant.

Exceptionally Brilliant.

Well worth waiting for, and the annoyance of 'Chainfire' itself which seemed to go nowhere despite its length.

| No comments

 

At first I was afraid

12 November 2007 21:50

Bluetooth

I've just ordered a bluetooth adapter with the intention that my desktop machine will automatically screensaver & lock when I leave my desk.

This looks trivial. As does setting my gaim status automatically.

The challenge is to send a message to my work chatroom, via jabber, when the same thing happens. I think this should be straightforward via XMPP, but we shall see.

I guess run-parts invoked upon /etc/presence/present & /etc/presence/absent or similar.
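A first sketch of that watcher might look like this (the device address, poll interval, and the idea of checking `hcitool name` output are all assumptions, not working code):

```shell
# Hypothetical proximity watcher: poll for the phone, and on any
# state change run the matching run-parts directory.
transition() {
    # $1 = previous state, $2 = new state; print the run-parts
    # directory this change should trigger, or nothing at all.
    [ "$1" = "$2" ] && return
    echo "/etc/presence/$2"
}

watch() {
    addr="00:11:22:33:44:55"   # hypothetical phone address
    prev=absent
    while true; do
        # "hcitool name" prints the device name only when in range:
        if [ -n "$(hcitool name "$addr" 2>/dev/null)" ]; then
            cur=present
        else
            cur=absent
        fi
        dir=$(transition "$prev" "$cur")
        [ -n "$dir" ] && run-parts "$dir"
        prev=$cur
        sleep 30
    done
}
```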

Guitar Amps

I have a "rare" (read limited edition) Marshall Jubilee Mini-Stack I'm trying to sell.

From 198x.

But I have no idea what the current price should be .. My google-fu is weak this week, or I'd know.

The biggest issue is that I've found listings which just say "Call for price". Or expired Ebay auctions. I guess there can't be too many around so the price should be "high". But at the same time there can't be a huge market for them, so the price should be "low"...

Complicating that is the fact that the buyer would have to be extremely local - as shipping isn't a realistic option. The cab + 2xspeaker combo is >1m high.

Now what are the chances that somebody reading this would know how much one "should" cost..?

ho hum.

Still life is good, and that is the main thing.

| No comments

 

I love this hive employee

13 November 2007 21:50

Russell Coker wants something to save and restore file permissions en masse.

That exists already:

apt-get install acl

Once installed you can dump the filesystem permissions of, for example, /etc/ recursively with this:

getfacl -R /etc > orig.perms

Want to see what is different? First change something:

steve@steve:~$ sudo chmod 0 /etc/motd

Now see what would be restored:

setfacl --test -R --restore=./orig.perms /etc | grep -v "\*,\*"
etc/motd : u::rw-,g::r--,o::r--,*

Finally, let's make it do the restoration:

steve:/# setfacl -R --restore=./orig.perms /etc

Job done.

| No comments

 

It brings on many changes, and I can take or leave it as I please

15 November 2007 21:50

On Tuesday I released a new version of rinse which now supports Fedora Core 8.

On Wednesday I rebuilt xen-unstable several times, and reported a vaguely security-relevant issue against the Exaile music player. I flagged that as important, but I'm not really sure how important it should be. True, it works. True, it requires DNS takeover, or similar, to become a practical attack, but .. serious or not?

Today I'm wondering about "hiding" messages in debian/changelog files. Each changelog entry includes the time & date of the new revision. I tend to pick the last two digits of the timestamp pretty much at random. (ie. the hours and minutes are always correct, but the seconds are a random value).

Given two digits which may be manipulated in the range 0-59 I'm sure a few small messages could be inserted into a package. But the effort would be high. (Hmmm timezone offset too?)
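As a toy illustration, each seconds field can carry one lowercase letter if you map a-z onto 0-25, which sits comfortably inside the 0-59 range:

```shell
# Toy sketch: hide lowercase letters in fake "seconds" values.
# encode prints one two-digit value per letter; decode reverses it.
# Only a-z fits this mapping - anything else would need a bigger
# alphabet (and would still have to stay below 60).
encode() {
    msg=$1
    i=0
    while [ $i -lt ${#msg} ]; do
        c=$(printf '%s' "$msg" | cut -c$((i + 1)))
        # "'c" asks printf for the character's numeric code:
        printf '%02d ' $(( $(printf '%d' "'$c") - 97 ))
        i=$((i + 1))
    done
    echo
}

decode() {
    for s in "$@"; do
        # Strip a leading zero so "08" isn't read as octal:
        printf "\\$(printf '%03o' $(( ${s#0} + 97 )))"
    done
    echo
}
```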

And that concludes today's entry.

| No comments

 

You're not going to end up like your mum and dad

18 November 2007 21:50

I've been working on updating my online film list since Thursday evening.

I have some code which will convert static data-files containing film entries into a browsable HTML site.

The next job is to actually go through all our DVDs and make sure the lists are correct.

I've updated all our TV shows, and I've made an initial pass at making sure all our films are present but it'll take me a few more days to ensure the lists are completely correct.

In the past I used to browse my list of films via my mobile phone to make sure I didn't buy duplicate films (more than once in the past I had managed to do that!) These days I don't seem to need to, but it is nice for organizing and it appeals to my love of lists..

I'm not sure which is worse, me doing it or Megan taking one look and saying "That's so cool!".

| No comments

 

Another town I've left behind

20 November 2007 21:50

Hyperlinks[0] are fun.

 

No, I don't want your number

23 November 2007 21:50

I'm still in the middle of a quandary with regards to revision control.

90% of my open code is hosted via CVS at a central site.

I wish to migrate away from CVS in the very near future, and having ummed and ahhed for a while I've picked mercurial as my system of choice. There is extensive documentation, and it does everything I believe I need.

The close runner-up was git, but on balance I've decided to choose mercurial as it wins in a few respects.

Now the plan. I have two options:

  • Leave each project in one central site.
  • Migrate project $foo to its own location.

e.g. My xen-tools could be hosted at mercurial.xen-tools.org, my blog compiler could live at mercurial.steve.org.uk.

Alternatively I could just leave the one site in place, ignoring the fact that the domain name is now inappropriate.

The problem? I can't decide which approach to go for. Both have plusses and minuses.

Suggestions or rationales welcome - but no holy wars on why any particular revision control system is best...

I guess ultimately it matters little, and short of mass-editing links it's 50/50.

| No comments

 

All the time anywhere

24 November 2007 21:50

Thanks for the comments/mails on my previous post.

I've now started the migration properly.

All code currently held upon cvsrepository.org will be moved to http://repository.steve.org.uk/ - a nice system-agnostic name.

So far I've only set up the pages for two projects (my dotfiles and ~/bin) but I'm happy with the process, and the naming scheme appears sane to me now.

Using sub-domains I need only install one CGI, which is a big win, and using a simple set of templates + mod_rewrite all projects get the same look and feel.
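For the curious, the shape of the vhost is something like this - every name and path below is a guess at how such a setup could look, not the real configuration:

```apache
# Hypothetical wildcard vhost: every project sub-domain is served by
# a single hgwebdir.cgi, keyed off the requested hostname.
<VirtualHost *:80>
    ServerName repository.steve.org.uk
    ServerAlias *.repository.steve.org.uk
    ScriptAlias /hg /usr/lib/cgi-bin/hgwebdir.cgi

    RewriteEngine on
    # Map foo.repository.steve.org.uk/... onto /hg/foo/...
    RewriteCond %{HTTP_HOST} ^([^.]+)\.repository\.steve\.org\.uk$
    RewriteRule ^/(.*)$ /hg/%1/$1 [PT]
</VirtualHost>
```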

I've still got to tweak the stylesheet a little, but otherwise I'm happy with things. I can host both public and private repositories, and that all works magically.

Of course the huge win is no longer needing to run a pserver for CVS! All future checkouts will be via HTTP, and the rare few people with commit access will be able to do so over SSH.

Once things have been migrated I'll cause the cvsrepository names to redirect to the new locations, and keep that up until the domain expires as a transition period. (I think +1 year :))

The other job for the weekend is releasing a fixed security update for Samba, to fix the regressions.... That is in hand, but the buildds are being slow. If they're not all done I'll release it anyway today/tomorrow. I'm annoyed that I handled the timing so badly, but I hope that people don't hate me for it.

| No comments

 

Open your eyes, look up to the skies and see

25 November 2007 21:50

If you have a (public) revision controlled ~/bin/, or bash/shell scripts I'd love to see them. Feel free to post links to your repositories as comments.

I'm certain there are some great tools and utilities out there which I could be using. Right now the only external thing I'm using is Martin Krafft's pub script. I don't use it often, but it is very neat and handy when I do want it. (Something that I'd never have considered writing myself, which suggests there are many more gems I'm missing!)

In other news my migration to mercurial is going extremely well, with only minimal downtime. Downtime for services really only arises because I have several websites which are powered entirely by a CVS checkout of remote repositories, so the process looks a little like this:

  • Convert CVS repository to hg.
  • Archive "live" CVS checkout from the server.
  • Move the local CVS checkout somewhere temporary.
  • Checkout from the new mercurial repository.
  • Fix any broken symlinks.
  • Do a recursive diff to make sure there are no unexpected changes.
  • Remove the previously archived local CVS checkout.
  • Done!
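Scripted, the list above comes out roughly as follows - a sketch assuming mercurial's bundled convert extension is enabled, with every path hypothetical (RUN=echo, the default here, just prints the plan):

```shell
# Sketch of one project's CVS -> hg migration, following the steps
# above.  RUN=echo (the default) prints the commands rather than
# executing them; unset RUN to run for real.
RUN=${RUN:-echo}

migrate() {
    project=$1
    # Convert the CVS repository to mercurial:
    $RUN hg convert /var/lib/cvs/"$project" /srv/hg/"$project"
    # Move the live CVS checkout aside, and check out from hg:
    $RUN mv /srv/www/"$project" /srv/www/"$project".cvs-old
    $RUN hg clone /srv/hg/"$project" /srv/www/"$project"
    # Recursive diff to catch any unexpected changes:
    $RUN diff -r --exclude=CVS --exclude=.hg \
        /srv/www/"$project".cvs-old /srv/www/"$project"
    # Finally remove the archived checkout:
    $RUN rm -rf /srv/www/"$project".cvs-old
}

migrate xen-tools   # hypothetical project name
```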

| No comments

 

You're making me live

26 November 2007 21:50

Is there an existing system which will allow me to query Apache logfiles via an SQL string? (Without importing into a database first).

I've found the perl library SQL::YASP - but that has a couple of omissions which mean it isn't ideal for my task:

  • It doesn't understand DISTINCT
  • It doesn't understand COUNT
  • It doesn't understand SUM

Still, it did allow me to write a simple shell which works nicely for simple cases:

SQL>LOAD /home/skx/hg/engaging/logs/access.log;
SQL>select path,size from requests where size > 10000;
path size 
/css/default.css 13813 
/js/prototype.js 71261 
/js/effects.js 37872 
/js/dragdrop.js 30645 
/js/controls.js 28980 
/js/slider.js 10403 
/view/messages 15447 
/view/messages 15447 
/recent/messages 25378 

It does mandate the use of a "WHERE" clause, but that was easily fixed with "WHERE 1=1". If I could just have support for count I could do near realtime interesting things...

Then again maybe I should just log directly and not worry about it. I certainly don't want to create my own SQL engine .. it just seems that Perl doesn't have a suitable library already made, which is a bit of a shocker!
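In the meantime the classic pipeline idioms cover the missing aggregates; for example, assuming Apache's common log format with the request path in field 7:

```shell
# Fake three-line logfile so the pipelines below have input; the
# layout (and the field numbers) assume Apache's common log format.
log=$(mktemp)
cat > "$log" <<'EOF'
1.2.3.4 - - [26/Nov/2007:21:50:00 +0000] "GET /css/default.css HTTP/1.1" 200 13813
1.2.3.4 - - [26/Nov/2007:21:50:01 +0000] "GET /js/prototype.js HTTP/1.1" 200 71261
5.6.7.8 - - [26/Nov/2007:21:50:02 +0000] "GET /css/default.css HTTP/1.1" 200 13813
EOF

# SELECT path, COUNT(*) FROM requests GROUP BY path ORDER BY 2 DESC:
awk '{ print $7 }' "$log" | sort | uniq -c | sort -rn

# SELECT COUNT(DISTINCT path) FROM requests:
awk '{ print $7 }' "$log" | sort -u | wc -l
```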

| No comments

 

Find me somebody to love

27 November 2007 21:50

Debian Admin

It looks like the fine DebianAdmin.com site is at it again:

Word-for-word copyright infringement by the user "admin". See the other cisco articles for more infringements.

(OK I'm still bitter their site is pimped all over the place, including wiki.debian.org, and their name is confusingly similar to my sites...)

Apache SQL

After yesterday's frustrations with SQL::YASP I moved to parsing logfiles into a temporary SQLite database.

I now have a tool into which you may load an arbitrary number of Apache logfiles, and then query them via SQL. It looks something like this:

asql> load /var/log/apache2/acc*
Creating tables
Loading: /var/log/apache2/access.log
Loading: /var/log/apache2/access.log.1

asql>   SELECT referer,COUNT(referer) AS number from logs GROUP BY referer ORDER BY number DESC,referer
- 4807
http://localhost/stats/ 2
http://foo.ocm/stats/ 2

asql>

Very useful :) A Debian package is available if you're interested in testing / using it, as is a mercurial repository.

The package now stands at 0.4 and is essentially done. It has tab completion on filenames and does enough for me. It might be nice to allow it to auto-read certain files on startup, or persist the database but I'll not bother unless people ask for it.

| No comments

 

We are the champions my friend

30 November 2007 21:50

My tool to query apache logfiles via SQL seems surprisingly popular.

Just as a recap the process goes like this:

  • Start the shell.
  • A temporary SQLite database is created.
  • You load any number of apache logfiles into it.
  • Then queries may be executed against those records until you exit.
  • The temporary database is dropped.

Now it is possible to save and load the SQLite database, so that you don't need to reparse the apache logs each time; that gives a nice speed increase for non-changing files.

By tonight I'll have aliases working for queries so you can bookmark them:

alias refers SELECT distinct(referer) FROM logs

Then in the future the 'refers' command will be available and will run the named query. Neat.

Now that I'm comfortable with SQL queries it just seems so natural, easy, and right to query logfiles this way. I guess that makes me strange.

| No comments