
 

Entries posted in February 2014

disqus on the cheap?

2 February 2014 21:50

Last night I was up again; it's really hard to sleep when you have a bad cold.

I decided to do something fun, and allow my tweaking guide to accept comments.

Like many of my sites, this one is 100% static and generated by templer, so comments are "hard".

I've seen a few people try to rewrite disqus as a general-purpose solution, and I like that idea, because I don't trust that particular service.

I wasn't so ambitious though, I just hacked up a quick sinatra server:

  • "GET /comments/ID"
    • Retrieves the comments on the specified identifier as a JSON array of comment-hashes.
  • "POST /comments/ID"
    • Appends the submitted comment to a redis set (see the sketch below).
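
That's small enough that a couple of hedged curl calls show the whole interface; the hostname and form-field names below are invented for illustration, not taken from the real code:

# Fetch the existing comments for a page, as a JSON array of hashes:
curl http://comments.example.com/comments/tweaking-guide

# Submit a new comment; --data makes this a POST, and the field names are guesses:
curl --data "author=Steve" --data "body=Nice guide!" \
     http://comments.example.com/comments/tweaking-guide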

My jQuery/JavaScript is nasty, but the thing seems to work pretty well. The page loads and comments are populated, and new ones are persisted as expected.

I can see the appeal of putting all this magic in one JavaScript file. You include that and get both the existing comments and the form to add new ones - my approach is to hardwire the submission/display in my generated site.

Perhaps something for the future.

In conclusion, if people wish, they can now leave feedback on most of the pages :)

| 9 comments

 

External Comments, updated

6 February 2014 21:50

The simple external-comments code is now complete enough for me to stop poking it on a daily basis:

  • Although the comments are styled minimally you can override that with CSS.
  • Although the default "Add your reply" form is ugly you can replace it with your own.
    • The reply-form may go above or below the comments.
  • If you add an email field then your comments will include a gravatar link.
  • Comments are assumed to be in markdown now.
  • The comments may be retrieved in newest-first or oldest-first order.
  • There's now a simple anti-spam plugin system present.

All in all I'm pretty happy with the way it works, and with the server code. The client-side JavaScript is less good, but I'm probably done poking that too.

In an ideal world the client-side code would be a jQuery plugin, but I've not worked out how to make a static method (the JSONP callback) a member of a jQuery plugin-object. Without that I have to pass the options around to too many places, rather than making them a member of "this".

Meh, pull requests welcome for adding new storage back-ends (redis and sqlite are supported by default), and similarly for cleanups.


| No comments

 

Sad times

10 February 2014 21:50

There are times when I'm very proud of the Debian project, the developers, the contributors, the bug-reporters, even the users.

There are times when I'm less impressed.

These days I guess I'm not qualified to comment, being an ex-developer, but I still am disappointed.

Part of me wants to rejoin the project, to see if I can help. The other part is thinking there are other choices, maybe I should look at them.

Conflict is bad.

Being conflicted is worse.

| 6 comments

 

Secure your rsync shares, please.

13 February 2014 21:50

Recently I started doing an internet-wide scan for rsync servers, thinking it might be fun to write a toy search-engine/indexer.

Even the basics such as searching against the names of exported shares would be interesting, I thought.

Today I abandoned that after exploring some of the results (created with zmap), because there's just too much private data out there, wide open.

IP redacted for obvious reasons:

shelob ~ $ rsync  rsync://xx.xx.xx.xx/
ginevra        	Ginevra backup
krsna          	Alberto Laptop Backup
franziska      	Franz Laptop Backup
genoveffa      	Franz Laptop Backup 2

Some nice shares there. Let's see if they're as open as they appear to be:

shelob ~ $ rsync  rsync://xx.xx.xx.xx/ginevra/home/
drwxrwsr-x        4096 2013/10/30 13:42:29 .
drwxr-sr-x        4096 2009/02/03 10:32:27 abl
drwxr-s---       12288 2014/02/12 20:05:22 alberto
drwxr-xr-x        4096 2011/12/13 17:12:46 alessandra
drwxr-sr-x       20480 2014/02/12 22:55:01 backup
drwxr-xr-x        4096 2008/10/03 14:51:29 bertacci
..

Yup. Backups of /home, /etc/, and more.

I found numerous examples of this, along with a significant number of hosts that exported "www" + "sql" as a pair, and a large number of hosts that just exported "squid/". I assume they must come from some cPanel-like system, because I can't understand why thousands of people would export the same shares with the same comments otherwise.

I still would like to run the indexer, but with so much easy content to steal, well I think the liability would kill me.

I considered not posting this, but I suspect "bad people" already know...
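
If you must run a public rsync daemon the fix is cheap: make the module authenticated, read-only where possible, and restricted by source address. A minimal rsyncd.conf sketch, with the module name, user, and network entirely invented:

# /etc/rsyncd.conf - hypothetical locked-down module
# (the secrets file holds "user:password" lines and must not be world-readable)
[ginevra]
    path = /srv/backups/ginevra
    read only = yes
    auth users = backup
    secrets file = /etc/rsyncd.secrets
    hosts allow = 192.168.0.0/24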

| 13 comments

 

Pastebin site with markdown support

16 February 2014 21:50

Today I set up a new website:

Something I want, something I'll use, and something that might be useful to others?

| 4 comments

 

My pastebin will now run under docker.

17 February 2014 21:50

I've updated my markdown-pastebin site to be a little cleaner, and to avoid spidering issues.

Previously every piece of uploaded text received an incrementing integer to describe it - which meant it was trivially easy for others to see how many pieces of text had been uploaded, and to spider all past uploads (unless the user deleted them).

Now each fresh paste receives a random UUID to describe it, and this means spidering is no longer feasible.

I've also posted the source code to GitHub so folk can report bugs, fork, etc:

That source code now includes a Dockerfile which allows you to quickly and easily build your own container running this wonderful service, and launch it without worrying about trashing your server ;)
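
A sketch of building and running it, with the image name and port mapping guessed rather than taken from the repository, would be roughly:

# build the image from the checked-out repository
docker build -t markdown-pastebin .
# run it detached; the host:container port mapping here is a guess
docker run -d -p 80:80 markdown-pastebin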

Anyway other than the user-interface overhaul it is still as functional, or not, as it used to be!

| No comments

 

Changing my stack ..

22 February 2014 21:50

For the past few years I've hosted all my websites in a "special" way:

  • Each website runs under its own UID.
  • Each website runs a local thttpd / webserver.
  • Each server binds to localhost, on a high-port.
    • My recipe is that the port of the webserver for user "foo" is "$(id -u foo)".
  • On the front-end I have a proxy to route connections to the appropriate back-end, based on the Host header.

The webserver I chose initially was thttpd, which gained points because it was small, auditable, and simple to launch. Something like this was my recipe:

#!/bin/sh
exec thttpd -D -C /srv/steve.org.uk/thttpd.conf
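
The port in that thttpd.conf follows the UID rule above; spelled out as command-line flags instead, with the username and paths invented for the sake of the sketch, the launch looks roughly like this:

#!/bin/sh
# Serve user "foo" on localhost only, using the account's UID as the port.
exec thttpd -D -h 127.0.0.1 -p "$(id -u foo)" -u foo -d /srv/foo.example.com/htdocs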

Unfortunately thttpd suffers from a few omissions, most notably that it supports neither "Keep-Alive" nor "Compression" (i.e. gzip/deflate), so it would always be slower than I wanted.

On the plus side it was simple to use, supported CGI scripts, and served me well once I'd patched it to support X-Forwarded-For for IPv6 connections.

Recently I set up a server optimization site and was a little disappointed that the site itself scored poorly on Google's page-speed test. So I removed thttpd for that site, replacing it with nginx. The end result was that the site scored 98/100 on Google's page-speed test. Progress. Unfortunately I couldn't do that globally, because nginx doesn't support old-school plain CGI scripts.

So last night I removed both nginx and thttpd, and now every site on my box is hosted using lighttpd.

There weren't too many differences in the setup, though I had to add some rules to enable caching for *.css files, etc., and some of my code needed updating.
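
The caching side is the kind of rule lighttpd's mod_expire makes possible; a rough sketch, not necessarily what I used, with an arbitrary file-pattern and lifetime:

# cache static assets - the seven-day lifetime is only an example
server.modules += ( "mod_expire" )
$HTTP["url"] =~ "\.(css|js|png|jpg)$" {
    expire.url = ( "" => "access plus 7 days" )
}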

Beyond that, today I've set up a dedicated docker host, which allows me to easily spin up containers. Currently I've got graphite monitoring for my random hosts, and a WordPress guest for plugin development/testing.

Now to go back to reading Off to be the wizard .. - not as good as Rick Cook's wizardry series (which got less good as time went on, but started off strongly), but still entertaining.

| 2 comments

 

Two minor toys ..

23 February 2014 21:50

Two minor things:

graphite_send

A simple shell-script to submit metrics to a graphite server, extensible via local plugins, but covers the obvious metrics by default.

Metrics are submitted via simple calls to netcat.
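
The protocol carbon speaks on its plaintext port is just "metric value timestamp", one line at a time, so a hand-rolled submission (the metric name and hostname here are invented) is a one-liner:

# carbon's plaintext listener defaults to port 2003
echo "servers.shelob.load.1min 0.25 $(date +%s)" | nc -q1 graphite.example.com 2003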

Trivial, but much more lightweight than collectd and similar.

HTML::Emoji

A Perl module for converting HTML like "<p>:smile:</p>" into something graphical.

This was written for my markdown sharing site, but is pretty fun.

The konami-code page demonstrates usage.

(This parses the HTML so it won't transform attributes, ids, or anything that isn't in the "text" part of any HTML input.)

The graphite sending script is perhaps the most useful, but at the same time it feels too small to be a package of its own. I'm tempted to bundle it up into my sysadmin-util collection, but I can't quite decide if it belongs there either.

| 2 comments

 

What do you pay for, and what would you pay for?

25 February 2014 21:50

There are times when I consider launching my own company again, most often when it is late at night and the ineptitude of so many other companies gets me too worked up. Then I sit back and think about the details and write it off.

I've worked for myself in the past a couple of times, and each time it was both more fun and more difficult than expected. Getting a couple of clients is usually easy, getting ten more is common, but getting "many" is hard and getting "lots" is something I've never done - lots of users for free sites, though, along with the associated support burden!

So the thought dies away once I sit down and work out the net profit I'd need to live. My expenses are low, so let us pretend I can easily live on £1000 a month. So the "company" has to make more than that, to cover costs, but perhaps not much more.

Pretend you were offering DNS hosting: you'd probably be able to implement that easily on, say, 10 virtual machines, costing £150 a month. Imagine clients pay £5 a month for an unlimited number of domains; that means you need (£1000 + £150) / £5 = 230 clients. Not impossible, but also not easy.

Pretend instead you're offering backup space, and the numbers get bigger because disk is expensive. Again getting some users would be easy, but getting lots would be hard because your competition is Dropbox, SkyDrive, etc, etc.

Once you start thinking of "ideas" they come easily, but the hard part is being realistic about what people would pay for. As always the idea is the easy part; the execution is the hard part. Realistically, if I were desperate to work for myself at short notice I'd do the obvious thing: buy a pair of ladders and a bucket, and clean windows. Low overheads, reasonable demand, and I'd be both "fit" and "outdoors".

When it comes to paying for online services, off the top of my head I personally pay for maybe two things, both of them niche (although profitable for their providers, I'm sure), and I know many people who live on the internet but pay for nothing.

For example I'm a VIP member of an online modeling community, which in theory allows me a higher chance of persuading interesting people to pose for me.

In practice the turnover on those sites is immense. Lots of cute boys and girls constantly hear "You're so pretty, you should be a model", which is true in perhaps 1% of cases, and the net result is you have a few hard-working people who do good things day in, day out, and many flighty teenagers who'll pose for two or three people and then never do it again, because they realise it is neither glamorous nor easy money.

Two things I've semi-seriously considered recently were hosted "status pages" and hosted "domain parking", but both have many competitors, and for both I can see that a) some people would pay, but b) not very many.

I suspect there is no universal "I'd pay for this" online service which is both competition-free and genuinely trivial to set up, but I'd be curious to see what people are missing, and even more curious to see what people do pay for.

| 5 comments

 

Some direction, some distraction

27 February 2014 21:50

It seems that several people replied to the effect that they would pay people to take care of applying security updates, or even configuring ad-hoc things such as wikis, graphite, and MySQL.

Not enough people to rely upon, but perhaps there is scope for remote work being done in exchange for folding-money. (Of course some of those who replied are in foreign countries, which makes receiving payment an annoyance; that's a separate problem though.)

Food for thought.

In the meantime I've settled into lighttpd, which I recently migrated to.

One interesting thing is that you can set your own "Server Name" directive:

# Set server name/version
server.tag = "lighttpd/(steve)"

This value is used by mod_dirlisting, so for example if you examine a directory which doesn't contain an index.html file you see the server-name. Cute.

Well cute unless, or until, somebody sets:

# Set server name/version
server.tag = "<script>alert(3)</script>"

That does indeed show JavaScript to all your visitors. It's not a security problem in itself, as you need to be root on the remote server to set it, and if you're root on the remote server you could just modify the actual HTML pages being served to include your JavaScript. That said, it's a little icky.

The following patch avoids the issue:

--- mod_dirlisting.c.org	2014-02-26 00:14:43.296373275 +0000
+++ mod_dirlisting.c	2014-02-26 00:16:28.332371547 +0000
@@ -618,7 +618,7 @@
 		} else if (buffer_is_empty(con->conf.server_tag)) {
 			buffer_append_string_len(out, CONST_STR_LEN(PACKAGE_DESC));
 		} else {
-			buffer_append_string_buffer(out, con->conf.server_tag);
+                        buffer_append_string_encoded(out, CONST_BUF_LEN(con->conf.server_tag), ENCODING_HTML);
 		}

 		buffer_append_string_len(out, CONST_STR_LEN(

| 2 comments