
Entries tagged planet-debian

That I can't show you how

30 July 2007 21:50

Russell Coker has recently started posting random tech-tips and recipes in his blog:

To improve things in this regard I plan to increase the number of posts I write with solutions to random technical problems that I encounter with the aim of providing a resource for google searches and to randomly inform people who read my blog.

This is nice to see on Planet Debian - although I hope we continue to see the personal entries.

For anybody else who is considering posting things like this I would be delighted if you'd copy them to the Debian Administration website. There have been numerous times when I've been just about to write something on a topic, seen it posted elsewhere and figured I shouldn't do so:

  • Because it would be duplication.
  • Because it would look like plagiarism.

(Notable examples off the top of my head: Introduction to OpenVZ, Introduction to GIT, several Xen pieces.)

I don't get many submissions, which I'm getting resigned to, but it is easy and people really, really are grateful for new posts.

In other news, linuxlinks.com are a bunch of spammers and will be reported as such. I utterly fail to care that they've added "my software" to their list; if I cared I'd join their site and agree to receive emails from them.

| No comments

 

Let the bells ring out for Christmas

2 December 2007 21:50

In the next week I intend to drop the search engine which archives content posted to Planet Debian.

It appears to have very little use, except for myself, and I'm significantly better at bookmarking posts of interest these days.

If you'd like to run your own copy the code is available, and it is pretty trivial to reimplement regardless. There are only two parts:

  • Polling and archiving content from the planet RSS feed, taking care of duplicates.
  • Checking /robots.txt on the source host, to avoid archiving content which should be "private".

Once you've done that you'll have a database populated with blog entries, and you just need to write a little search script.
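
A rough sketch of those two parts might look something like this in Python - assuming the third-party feedparser module, and with the feed URL, database name and user-agent purely illustrative rather than the original code:

import sqlite3
import urllib.robotparser
from urllib.parse import urlparse

import feedparser

PLANET_FEED = "http://planet.debian.org/rss20.xml"   # assumed feed URL
ARCHIVE_DB  = "planet-archive.db"

def allowed_by_robots(url, agent="planet-archiver"):
    """Honour the source host's /robots.txt before archiving an entry."""
    parts = urlparse(url)
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("%s://%s/robots.txt" % (parts.scheme, parts.netloc))
    try:
        rp.read()
    except OSError:
        return True          # no robots.txt reachable; treat as allowed
    return rp.can_fetch(agent, url)

def archive_once():
    db = sqlite3.connect(ARCHIVE_DB)
    db.execute("""CREATE TABLE IF NOT EXISTS posts
                  (link TEXT PRIMARY KEY, title TEXT, body TEXT, date TEXT)""")
    for entry in feedparser.parse(PLANET_FEED).entries:
        link = entry.get("link", "")
        if not link or not allowed_by_robots(link):
            continue
        # "INSERT OR IGNORE" plus the PRIMARY KEY takes care of duplicates.
        db.execute("INSERT OR IGNORE INTO posts VALUES (?, ?, ?, ?)",
                   (link, entry.get("title", ""),
                    entry.get("summary", ""), entry.get("published", "")))
    db.commit()

if __name__ == "__main__":
    archive_once()

Run something like that from cron and the "little search script" is then just a SELECT with a LIKE clause over the posts table.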

ObRandom: In the time it has been running it has archived 15,464 posts!

| No comments

 

You can't hide the knives

6 December 2007 21:50

After recently intending to drop the Planet Debian search, and receiving complaints that it is still useful, it looks like there is a good solution.

The code will be made live and official upon Planet Debian in the near future.

The DSA team promptly installed the SQLite3 package for me, and I've ported the code to work with it. Once Apache is updated to allow me to execute CGI scripts it'll be moved over, and I'll export the current data to the new database.

In other news I'm going to file an ITP bug against asql as I find myself using it more and more...

| No comments

 

I still got the blues for you

11 January 2008 21:50

Been a week since I posted. I've not done much, though I did complete most of the migration of my planet-searching code to gluck.debian.org.

This is now logging to a local SQLite database, and available online.

I've updated the blog software so that I can restrict comments to posts made within the past N days - which has helped with spam.
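
The check itself is trivial; a minimal sketch of the idea in Python (not the actual blog software, and the value of N is illustrative):

from datetime import datetime, timedelta

MAX_COMMENT_AGE_DAYS = 30   # the "N" above; illustrative value

def comments_open(post_date, now=None):
    """Return True if the post is recent enough to still accept comments."""
    now = now or datetime.utcnow()
    return (now - post_date) <= timedelta(days=MAX_COMMENT_AGE_DAYS)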

My other comment-spam system is the use of the crm114 mail filter. I have a separate database now for comments (distinct from the one I use for email), and after training it on previous comments all is good.

Other than being a little busy over the past week life is good. Especially when I got to tell a recruitment agent that I didn't consider London to be within "Edinburgh & Surrounding Region". Muppets.

The biggest downside of the week was "discovering" a security problem in Java, which had been reported in 2003 and is still unfixed. Grr. (CVE-2003-1156 for those playing along at home).

Here's the code:

#!/bin/sh
#
#  Grep for potentially unsafe /tmp usage in shared libraries
#

find /lib /usr/lib -name '*.so' -type f -print | while read -r lib
do
    # Report any library whose strings mention /tmp.
    out=$(strings "$lib" | grep '/tmp')
    if [ -n "$out" ]; then
        echo "$lib"
        echo "$out"
    fi
done

| 4 comments

 

Do you have monkeys in Scotland?

13 March 2008 21:50

I've uploaded a new make package to fix the memory corruption bug which I recently tracked down, with kind permission from the maintainer.

I've also been working on a Planet Debian filtering/exclusion system. I've put together a (working) online demo, and I think I could probably inject it via greasemonkey without too many problems. (I'm a little reluctant to install that add-on, because I suspect the security implications are severe).

Still, it was a nice hack, and it actually reminds me that I like JavaScript these days. The demo will probably disappear in a week or two, but otherwise it works as expected - just a couple of GUI issues to solve.

(As with the Planet Debian search this isn't tied to our install, and could work on any PlanetPlanet installation.)

Maybe it isn't the friendliest of ideas, but I think it is a good one regardless.

ObQuote: Last King of Scotland

| 1 comment

 

I promise that I will not kill anyone

15 March 2008 21:50

Planet Debian now has Javascript to facilitate the persistent hiding of particular feeds.

The code is still in flux, but appears to work.

Comments welcome, especially from people better at Javascript than myself!

The way it works was chosen to minimise the changes required:

  • Each entry is wrapped within a new <div>.
  • The <div> has an ID and a CLASS attribute added.

At load-time the function hideHosts() is called. If a comma-separated cookie called "excludes" is present, it is split and iterated over.

For each domain name in the list the current document is searched for elements with a class prefixed by that hostname; if any match, and they have an ID defined (regardless of what that might be), they are hidden.

And that's it.

The changes to the template were minimal; each index entry already has a "link" attribute, so I just had to add this:

 <div class="<!-- tmpl_var name='link' escape='html' -->"
         id="<!-- tmpl_var name='link' escape='html' -->" >
 ....
 </div>

Because the ID and CLASS attributes are URLs there is a little mangling, and some inefficiency. But I didn't want to change the core of PlanetPlanet to define a "hostname" attribute for each feed member...

In an ideal world I'd add "class='feed, $link'" and then iterate over that at load-time to attach handlers and an ID appropriately. But that is a little scary. If it works in IE, great. I've tested Firefox & Epiphany.

ObQuote: Terminator II

Update: Moved the JavaScript into an external file, and removed the image toggling; it seemed to fail and I'm not sure why. I will investigate.

| 2 comments

 

Hack the planet!

22 September 2009 21:50

Recently I was viewing Planet Debian and there was an entry present which was horribly mangled - although the original post seemed to be fine.

It seemed obvious to me that some of the filtering which the planet software had applied to the original entry had caused it to become broken, malformed, or otherwise corrupted. That made me wonder what attacks could be performed against the planet aggregator software used on Planet Debian.

Originally Planet Debian was produced using the planet software.

This was later replaced with the actively developed planet-venus software instead.

(The planet package has now been removed from Debian unstable.)

Planet, and the Venus project which forked from it, do a great job of scrutinising their input and removing malicious content. So my only hope was to stumble across something they had missed. Eventually I discovered that the (different) filtering applied by the two feed aggregators missed the same malicious input - an image with a src attribute containing javascript, like this:

<img src="javascript:alert(1)">

When that markup is viewed by some browsers it will result in the execution of javascript. In short it is a valid XSS attack which the aggregating software didn't remove, protect against, or filter correctly.

In fairness it seems most of the browsers I tested didn't actually alert when viewing that code - but as a notable exception Opera does.

I placed a demo online to test different browsers:

If your browser executes the code there, and it isn't Opera, then please do let me know!

The XSS testing of planets

Rather than produce a lot of malicious input feeds I constructed and verified my attack entirely offline.

How? Well the planet distribution includes a small test suite, which saved me a great deal of time, and later allowed me to verify my fix. Test suites are good things.

The testing framework allows you to run tiny snippets of code such as this:

# ensure onblur is removed:
HTML( "<img src=\"foo.png\" onblur=\"alert(1);\" />",
      "<img src=\"foo.png\" />" );

Here we give two parameters to the HTML function: one is the input string, and the other is the expected output string. If the sanitization doesn't produce the string given as the expected result an error is raised. (The test above is clearly designed to ensure that the onblur attribute and its value are removed.)

This was how I verified initially that the SRC attribute wasn't checked for malicious content and removed as I expected it to be.
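
For illustration, here is a self-contained sketch of that style of check in Python. The sanitize() function is a deliberately tiny stand-in, not the real planet/venus sanitizer; only the input/expected-output shape of the test matters:

import re

def sanitize(markup):
    """Toy stand-in for the real sanitizer: strip inline event handlers
    and javascript: image sources."""
    markup = re.sub(r'\son\w+="[^"]*"', '', markup)
    markup = re.sub(r'src="javascript:[^"]*"', 'src=""', markup)
    return markup

def HTML(given, expected):
    """Raise if sanitizing 'given' does not produce 'expected'."""
    got = sanitize(given)
    if got != expected:
        raise AssertionError("got %r, expected %r" % (got, expected))

# ensure onblur is removed:
HTML('<img src="foo.png" onblur="alert(1);" />',
     '<img src="foo.png" />')

# the input the aggregators originally missed:
HTML('<img src="javascript:alert(1)">',
     '<img src="">')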

Later I verified this by editing my blog's RSS feed to include a malicious, but harmless, extra section. This was then shown on the Planet Debian output site for about 12 hours.

During the twelve-hour window in which the exploit was "live" I received numerous hits. Here are a couple of log entries (IP + referer + user-agent):

xx.xx.106.146 "http://planet.debian.org/" "Opera/9.80
xx.xx.74.192  "http://planet.debian.org/" "Opera/9.80
xx.xx.82.143  "http://planet.debian.org/" "Opera/9.80
xx.xx.64.150  "http://planet.debian.org/" "Opera/9.80
xx.xx.20.18   "http://planet.debian.net/" "Opera/9.63
xx.xx.42.61   "-"                         "gnome-vfs/2.16.3
..

The Opera hits were to be expected from my previous browser testing, but I'm still not sure why there were hits from User-Agents identifying themselves as gnome-vfs/n.n.n. Enlightenment would be rewarding.

In conclusion the incomplete escaping of input by Planet/Venus was allocated the identifier CVE-2009-2937, and will be fixed by a point release.

There are a lot of planets out there - even I have one: Pluto - so we'll hope Opera is a rare exception.

(Pluto isn't a planet? I guess that's why I call my planet a special planet ;)

ObFilm: Hackers.

| 6 comments