Entries posted in November 2009

I work alone like you. We always work alone.

Friday, 27 November 2009

A couple of days ago I was lamenting the state of webstats, although I was a little vague about my purpose. Specifically I wanted to find out about the screen resolutions and user-agents viewing a couple of sites.

To get screen resolutions you really need to inject javascript into your pages, which is icky. Still, it's a small price to pay, and chances are most people won't notice.

Of course there are drawbacks:

  • Javascript dependency: if visitors don't use/enable javascript you see nothing.
  • You cannot capture everything: e.g. the HTTP status code isn't available.

To solve this problem completely you therefore need to have access to both your apache logs and your javascript-captured information. Probably.

As a proof of concept I've injected the following javascript into most pages of three sites. This code:

  • Finds the screen resolution.
  • Finds the HTTP referer.
  • Finds the current page's title.
  • Then submits that data to a server-side collection script, via a one-by-one pixel IMG.
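The snippet itself isn't reproduced here; a minimal sketch of the idea might look like this (the /tracking.gif endpoint and the parameter names are invented for illustration):

```javascript
// Build the tracking URL from the values the browser exposes.
// Factored into a function so the URL construction can be exercised
// outside a browser; a real page would pass the global screen and
// document objects.
function buildTrackingUrl(endpoint, scr, doc) {
  var params = [
    "x=" + encodeURIComponent(scr.width),
    "y=" + encodeURIComponent(scr.height),
    "referer=" + encodeURIComponent(doc.referrer),
    "title=" + encodeURIComponent(doc.title)
  ];
  return endpoint + "?" + params.join(";");
}

// In the page: requesting a 1x1 image makes the browser send the
// data to the server-side collection script.
function track(endpoint) {
  var img = new Image(1, 1);
  img.src = buildTrackingUrl(endpoint, window.screen, window.document);
}
```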

The script that receives the data writes it out to a small per-domain SQLite database, which I can then use to generate prettyness. However I suck at being pretty, in most ways, so I've only got something functional.

All of this is dynamic and most of the data is anchored to "today", as that's proof of concept enough. Were piwik not written in vile PHP I'd use that - I don't see anything similar out there written in Perl.

The big decision is now "Keep it dynamic" vs. "Output static pages". (vs. call off the experiment now I know that I'm safe to assume "big resolutions").

(Naming software is hard. Recent stuff I've done has had an skx prefix, primarily for google-juice. e.g. I randomly notice that if you search for my personal site on Google's UK engine I come top. Cool.)

ObFilm: The Bourne Identity

| 2 comments.

 

This is the part where you tell me what matters is on the inside

Tuesday, 24 November 2009

Some technology evolves very quickly, for example the following things are used by probably 80% of readers of this page:

  • A web browser.
  • A mail client.
  • A webserver.

But other technology is stuck in the past, and only sees lacklustre updates and innovations (not that innovation is mandatory, or automatically a good thing).

Right now I'm looking at my webserver logs, trying to see who is viewing my sites, where they came from, and what their favourite pie is.

In the free world we have the choice of awstats, webalizer, and visitors (possibly more that I'm unaware of). In the commercial world everybody and their dog uses Google's analytics.

On the face of it a web analysis package is trivial:

  • Read in some access.log files.
  • Process into some internal database representation.
  • Generate static/dynamic HTML output from your intermediate form, optionally including graphs, images, and pie-charts.
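The first two steps are mostly a parsing exercise. Here's a sketch of reading one line of Apache's "combined" log format (the field names are my own choice):

```javascript
// Parse a single line of Apache "combined" log format into an object.
// Returns null if the line doesn't match.
function parseLogLine(line) {
  var re = /^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3}) (\S+) "([^"]*)" "([^"]*)"$/;
  var m = re.exec(line);
  if (!m) return null;
  return {
    host:    m[1],                                  // remote host / IP
    date:    m[2],                                  // raw timestamp string
    method:  m[3],                                  // GET, POST, ...
    path:    m[4],                                  // requested URL
    status:  parseInt(m[5], 10),                    // HTTP status code
    size:    m[6] === "-" ? 0 : parseInt(m[6], 10), // bytes sent
    referer: m[7],
    agent:   m[8]
  };
}
```

From there the "internal database representation" is just a matter of accumulating these records into whatever aggregate counts you want to report on.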

If you add javascript-fu to each of your pages you can track page titles, exit links, screen resolutions, and other data too. (Though I guess that's a separate problem; trying to merge that data in with the data you have in your access log without making nasty links like "GET /trackin.gif?x_res=800;y_res=600". Anyway I guess with cookies you could correlate reasonably carefully.)

In conclusion why are my web statistics so dull, boring, and less educational than I desire?

I'd be tempted to experiment, but I suspect this is a problem which has subtle issues I'm overlooking and requires an artistic slant to make pretty.

(ObLink: asql is my semi-solution to logfile analysis.)

ObFilm: Bound

| 7 comments.

 

I am not stupid, you know. They cannot make things like that yet.

Friday, 20 November 2009

I've really enjoyed reading some of Matthew Garrett's entries about legacy PC hardware features - specifically the cute hack involving A20 and the keyboard controller. Reading things like that really takes me back.

I remember when the z80 was cutting edge, and I discovered you could switch to a whole new set of shadow registers via "exx" and "ex af,af'". I remember using undocumented opcodes, and even now I can assemble and disassemble simple z80 machine code. (Don't get me started on the speedlock protectors and their fiendish use of the R register; that'll stick in your mind if you cracked it. I did.)

I remember being introduced to the PC, seeing subdirectories appear in MS-DOS 2.0, network redirectors appearing in DOS 3.x, and support for big hard drives appearing in MS-DOS 4.0.

I remember the controversy over the AARD code in the betas of Windows 3, and the attention given to it in the book "Undocumented DOS", which mostly focussed upon the "list of lists". (At that time I'd have been running GEM on an IBM XT with Hercules (monochrome) graphics.)

I remember learning that a .COM file was a flat image, limited to 64k, which loaded at offset 100h to accommodate the PSP (program segment prefix) for compatibility with CP/M (something I've never seen, never used, and know nothing about. I just know you could use the file control blocks to get simple wildcard handling for your programs "for free").

I wrote simple viral code which exploited this layout, appending new code to the end of the binary and replacing the first three bytes with a jump to the freshly added code. Later that code could restore the original three bytes and jump back to 100h. (I even got the pun in the name of 40Hex magazine.)

I recall when COM files started to go out of fashion and you had to deal with relocation and segment fixups. When MZ was an important figure.

I even remember further back, to when switching to protected mode was a hell of triple-faults, switching back from protected mode to real mode was "impossible", and the delight to be had with the discovery and exploitation of "unreal mode".

All these things I remember .. and yet .. they're meaningless to most now, merely old history. Adults these days might have grown up and reached age 20 having always had Windows 95 - never having seen, used, or enjoyed MS-DOS.

How far have we come since then? In some ways not far at all. In other ways the rise of technology has been exponential.

Back then when I was doing things I'd not have dreamed I could update a webpage from a mobile phone whilst trapped upon a stalled train.

There are now more mobile phones than people in the UK. In some ways that's frightening - people miss the clear separation between home & work, for example - but in other ways .. liberation.

I have no predictions for the future, but it's amazing how far we've come just in my lifetime; and largely without people noticing.

The industrial revolution? Did that happen with people mostly not noticing? Or was there more conscious awareness? Food for thought.

ObFilm: Terminator

| 1 comment.

 

And if you fail gym, you'll never get into college.

Saturday, 14 November 2009

Today I rebooted my desktop for the first time in a few months. This did not go well. Probably as a result of this issue with lvm/dmsetup/cryptsetup conflicting, my system didn't boot, and the error message was unhelpful.

The error shown just after grub2 had started to load the system was:

Cannot find LVM volume group gold-vol

The actual cause was that I was missing the mdadm package. D'oh. My desktop has 2x500Gb drives set up as:

 sda1 + sdb1 = md1  = /boot [1Gb]
 sda2 + sdb2 = md0  = LVM storage [460Gb]

(It's only as I write this that I'm surprised that md1 + md0 are opposite to the fashion I'd have expected them to be. I guess I just created them in the "wrong" order at install time. Oops.)

So without mdadm the LVM volume group on /dev/md0 couldn't be found, and that in turn meant my root filesystem couldn't be accessed at /dev/gold-vol/root.

Fixing this was a real pain. Because the system is the PXE network host on my LAN I couldn't boot it that way, and the machine has no CD-ROM drive connected.

My solution was to download and install System Rescue CD, which I placed upon a USB stick. This worked beautifully once I realised I had to boot with rescue64 to get a 64-bit kernel capable of letting me run chroot.

Oddly enough I had problems booting from USB. If I powered down my system and hit the "on" switch the system just ignored the USB stick. I noticed that my USB mouse and card reader didn't show any power lights at all - not until after grub had failed to boot the system.

So the process of booting from USB was eventually determined to be:

  • Poweroff system.
  • Power on system - wait for grub to fail to boot kernel.
  • At this point the USB mouse and card reader would be initialised in some fashion and would show their LED lights.
  • Press Ctrl-alt-delete - at which point the BIOS would allow the USB booting to occur.

Very very odd. I guess it's a question of what does the "USB enabling". I'd previously assumed the BIOS would do this setup - but looking over at another system I notice that the USB mouse doesn't "come alive" until mid-way through the Linux boot process, even though I know that BIOS has options for "Enabling USB mouse & keyboard". Maybe I'm missing something obvious ..?

In conclusion .. I restarted GDM for the first time in weeks and rebooted, and this was a bad idea.

ObFilm: Never Been Kissed

| 3 comments.

 

You're not even trying

Wednesday, 11 November 2009

I've spent the past 24 hours playing with my new phone, and so far have enjoyed it.

I've put together very very rough documentation on what I've done so far which is just:

  • Installing a terminal.
  • Installing an ssh client.
  • Uploading wallpaper, ring tones, and similar.

So far I've not had any major problems, and my biggest issue is the lack of TAB, arrows, and ctrl keys on the keyboard.

I've put together a local package for the novacom/precom USB connection utility, and will share it if there's any interest. Happily it "just works" and was trivial to compile, so there's probably not really any point in uploading it as such.

ObFilm: Dark Angel (TV Series)

| No comments

 

What would you have me do, Stephen?

Friday, 6 November 2009

Things which should exist, but don't yet:

Transparent SQL Cache

Imagine a proxy listening on 127.0.0.1:3306, receiving SQL Queries.

Any query that was "SELECT .." could return the result from a local cache for the appropriate table. Any query of the form "UPDATE" or "INSERT" would flush all caches for the table.

Should be near-trivial: hash the incoming query & parameters via SHA1 to get a unique key, then store/lookup results in Memcached.

Would it be useful? I think so, but of course it depends on the application and the effort involved.

"Global status"

A single site that would rebroadcast a posted (short) status message to facebook/twitter/your chat client/etc/etc.

Hard part would be receiving comments from the sites it re-served to.

Scraping statuses from facebook is hard, not sure about twitter.

This concludes my Friday wishlist.

ObFilm: Master and Commander

| 10 comments.

 
