Entries tagged sqlite

There's no such thing as a wrong war

13 October 2009 21:50

Once upon a time I wrote a blog compiler, a simple tool that would read in a bunch of text files and output a blog. This blog would contain little hierarchies for tags, historical archives, etc. It would also generate a number of RSS feeds.

Every now and again somebody will compare it to ikiwiki and I'll ignore that comparison entirely, because the two tools do different things in completely different fashions.

But I was interested to see Joey talk about performance tweaks recently as I have a blog which has about 900 pages, and which takes just over 2 minutes to build from start to finish. (Not this one!)

I've been pondering performance for a while as I know my current approach is not suited to high speed. Currently the compiler reads in every entry and builds a giant data structure in memory which is walked in different fashions to generate and output pages.

The speed issue comes about because storing the data structure entirely in memory is insane, and because sometimes a single entry will be read from disk multiple times.

I've made some changes over the past few evenings such that a single blog entry will be read no more than once from disk (and perhaps zero times if Memcached is in use :) but that doesn't solve the problem of the memory usage.

So last night I made a quick hack - using my introduction to SQLite as inspiration I wrote a minimal reimplementation of chronicle which does things differently (a rough sketch follows the list):

  • Creates a temporary SQLite database with tables: posts, tags, comments.
  • Reads every blog entry and inserts it into the database.
  • Uses the database to output pages.
  • Deletes the database.
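
Here's a rough sketch of that flow, assuming DBI and DBD::SQLite are available; the schema, the data/*.txt layout, and the title/date-line entry format are all invented for illustration, not chronicle's real behaviour:

use strict;
use warnings;
use DBI;
use File::Temp qw( tempdir );

# Create a throwaway database which is deleted when we're done.
my $dir = tempdir( CLEANUP => 1 );
my $dbh = DBI->connect( "dbi:SQLite:dbname=$dir/blog.db", "", "",
                        { RaiseError => 1 } );

$dbh->do("CREATE TABLE posts ( id INTEGER PRIMARY KEY, title TEXT, date TEXT, body TEXT )");
$dbh->do("CREATE TABLE tags ( post_id INTEGER, name TEXT )");
$dbh->do("CREATE TABLE comments ( post_id INTEGER, author TEXT, body TEXT )");

# Read each entry from disk exactly once, inserting as we go.
my $ins = $dbh->prepare("INSERT INTO posts ( title, date, body ) VALUES ( ?, ?, ? )");
foreach my $file ( glob("data/*.txt") )
{
    open( my $fh, "<", $file ) or die "Failed to open $file: $!";
    chomp( my $title = <$fh> );               # first line: title
    chomp( my $date  = <$fh> );               # second line: date
    my $body = do { local $/; <$fh> };        # the rest: the body
    close($fh);
    $ins->execute( $title, $date, $body );
}

# Output pages by querying - e.g. every post carrying a given tag -
# rather than by walking a giant in-memory structure.
my $posts = $dbh->selectall_arrayref(
    "SELECT p.title, p.body FROM posts p JOIN tags t ON t.post_id = p.id WHERE t.name = ?",
    undef, "sqlite" );

# ... feed $posts into the templating code here ...

$dbh->disconnect();    # tempdir( CLEANUP => 1 ) removes the database for us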

This is a significantly faster approach than the previous one - with a "make steve" job taking only 18 seconds, down from just over 2 minutes 5 seconds.

("make steve" uses rsync to pull in comments on entries, rebuilds the blog, then uses rsync to push the generated output into its live location.)

ObFilm: If...


A simple Perl alternative to storing data in Redis

16 December 2016 21:50

I continue to be a big user of Perl, and for many of my sites I avoid the use of MySQL which means that I largely store data in flat files, SQLite databases, or in memory via Redis.

One of my servers was recently struggling with RAM, and the surprising cause was "too much data" in Redis. (Surprising because I hadn't been paying attention to how heavily it was being used, and also because ASCII text compresses pretty well.)

Read/Write speed isn't a real concern, so I figured I'd move the data into an SQLite database, but that would require rewriting the application.

The Redis client library for Perl is pretty awesome, and simple usage looks like this:

use Redis;

# Connect to localhost.
my $r = Redis->new();

# Simple storage.
$r->set( "key", "value" );

# Work with sets.
$r->sadd( "fruits", "orange" );
$r->sadd( "fruits", "apple" );
$r->sadd( "fruits", "blueberry" );
$r->sadd( "fruits", "banannanananananarama" );

# Show the set-count.
print "There are " . $r->scard( "fruits" ) . " known fruits\n";

# Pick a random one.
print "Here is a random one " . $r->srandmember( "fruits" ) . "\n";

I figured, if I ignored the Lua support and the other more complex operations, creating a compatible API implementation wouldn't be too hard. So rather than porting my application to use SQLite directly I could just use a different client library.
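
The shape of such a wrapper is easy to sketch. The following is a minimal illustration, assuming DBI and DBD::SQLite; the table layout, the default database path, and the package name are all invented here, and the real Redis::SQLite differs in its details:

package Redis::SQLite::Sketch;

use strict;
use warnings;
use DBI;

sub new
{
    my ( $class, %args ) = @_;

    # Assumed default location; purely illustrative.
    my $path = $args{path} || "$ENV{HOME}/.redis-sqlite.db";

    my $dbh = DBI->connect( "dbi:SQLite:dbname=$path", "", "",
                            { RaiseError => 1 } );

    # One table for plain key/value pairs, one for set-members.
    $dbh->do("CREATE TABLE IF NOT EXISTS string ( key TEXT PRIMARY KEY, val TEXT )");
    $dbh->do("CREATE TABLE IF NOT EXISTS sets ( key TEXT, val TEXT, UNIQUE( key, val ) )");

    return bless { dbh => $dbh }, $class;
}

# GET/SET map directly onto the key/value table.
sub set
{
    my ( $self, $key, $val ) = @_;
    $self->{dbh}->do( "INSERT OR REPLACE INTO string ( key, val ) VALUES ( ?, ? )",
                      undef, $key, $val );
}

sub get
{
    my ( $self, $key ) = @_;
    my ($val) = $self->{dbh}->selectrow_array(
        "SELECT val FROM string WHERE key = ?", undef, $key );
    return $val;
}

# SADD ignores duplicate members thanks to the UNIQUE constraint.
sub sadd
{
    my ( $self, $key, $val ) = @_;
    $self->{dbh}->do( "INSERT OR IGNORE INTO sets ( key, val ) VALUES ( ?, ? )",
                      undef, $key, $val );
}

# SCARD is just a COUNT.
sub scard
{
    my ( $self, $key ) = @_;
    my ($count) = $self->{dbh}->selectrow_array(
        "SELECT COUNT(val) FROM sets WHERE key = ?", undef, $key );
    return $count;
}

# SRANDMEMBER lets SQLite do the random selection.
sub srandmember
{
    my ( $self, $key ) = @_;
    my ($val) = $self->{dbh}->selectrow_array(
        "SELECT val FROM sets WHERE key = ? ORDER BY RANDOM() LIMIT 1",
        undef, $key );
    return $val;
}

1;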

In short I changed this:

use Redis;
my $r = Redis->new();

To this:

use Redis::SQLite;
my $r = Redis::SQLite->new();

And everything continues to work. I've implemented all the set-related functions except one, and a random smattering of the other simple operations.

The appropriate test-cases from the Redis client library (i.e. with all references to things I didn't implement removed) pass, and my own new tests give me further confidence.

It's obviously not a hard job, but it was a quick solution to a real problem and might be useful to others.

My image hosting site and my markdown sharing site both now use this wrapper and seem to be performing well - but with more free RAM.

No doubt I'll add more of the simple primitives as time goes on, but so far I've done enough to be useful.
