
I set up Apache with mod_wsgi in a default installation, with a very simple test application, test.wsgi, that looks like this:

def application(environ, start_response):
    status = '200 OK'
    # WSGI response bodies must be bytestrings (native str under
    # Python 2, bytes under Python 3), so encode the output:
    output = ('Path: %s' % environ['PATH_INFO']).encode('utf-8')
    response_headers = [('Content-type', 'text/plain'),
                        ('Content-Length', str(len(output)))]
    start_response(status, response_headers)
    return [output]

Apache itself is basically configured like this:

WSGIScriptAlias / /home/user/wsgi_test/test.wsgi

Now if I access URLs on that server, I get the following output:

  • GET /test/../test/./test => Path: /test/test
  • GET /test/%2E%2E/abc => Path: /abc
  • GET /test%2fabc => 404 Not Found (due to AllowEncodedSlashes Off, I suppose)

Thus it looks to me that Apache resolves/pre-processes these URLs before passing them to the application, which is nice because it can prevent a lot of local file inclusion and directory traversal attacks where the file path is not part of the query string.
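The collapsing I observed can be mimicked with Python's posixpath.normpath, which merges "." and ".." segments the same way (this only mirrors the observed behaviour; it is not Apache's actual merging code):

```python
import posixpath

# "." segments and "dir/.." pairs are collapsed, matching the
# outputs observed above:
print(posixpath.normpath('/test/../test/./test'))  # -> /test/test
print(posixpath.normpath('/test/../abc'))          # -> /abc
```

Note that normpath works on the already-decoded path, just like PATH_INFO in WSGI, which is why /test/%2E%2E/abc behaves the same as /test/../abc.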

Can one rely on this behaviour or is there some nasty way to trick Apache into actually passing a URL like /test/foo/../bar to the application? Would it be an Apache bug if that was possible?

Niklas B.

1 Answer


The security guideline "Defense in Depth" suggests that you should configure Apache to preprocess paths, but you should also protect against dangerous paths yourself.

Since you probably know a lot more about what the valid paths should look like, you should be able to come up with a much more restrictive test that is also easier to get right than anything Apache can do in general.
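As a concrete illustration of such an application-side check, here is a minimal sketch that refuses any request path escaping a fixed base directory; the directory name and helper are hypothetical, and the check assumes the path arrives already URL-decoded (as PATH_INFO does under WSGI):

```python
import posixpath

ALLOWED_ROOT = '/srv/app/files'  # hypothetical base directory

def safe_resolve(path_info):
    """Map a request path onto ALLOWED_ROOT, rejecting anything that
    would escape it -- even if a front-end server already normalized
    the URL for us."""
    # Normalize first, so "a/../../etc/passwd" collapses before the check.
    candidate = posixpath.normpath(
        posixpath.join(ALLOWED_ROOT, path_info.lstrip('/')))
    if candidate != ALLOWED_ROOT and not candidate.startswith(ALLOWED_ROOT + '/'):
        raise ValueError('path escapes the allowed root: %r' % path_info)
    return candidate
```

For example, safe_resolve('/docs/readme.txt') returns a path under the root, while safe_resolve('/../etc/passwd') raises, regardless of what the web server did beforehand.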

Ben Voigt
  • Can you give details on how much you can expect from Apache here? Especially on how this path preprocessing could possibly be circumvented? – Niklas B. Feb 20 '12 at 00:22
  • @Niklas: It's not defense in depth if you are relying on Apache's preprocessing step at all. Assume the remote attacker has complete control over the string you receive. – Ben Voigt Feb 20 '12 at 00:22
  • I don't want to question the fact that multiple layers of defense are better than a single one (rather to the contrary). But as a penetration tester, I am also interested in how mistakes like trusting Apache to properly sanitize URLs could be exploited. – Niklas B. Feb 20 '12 at 00:25
  • @Niklas: Are you testing apache, or the custom code? – Ben Voigt Feb 20 '12 at 00:27
  • The custom code. I take it that it would take an Apache bug to make a situation like this exploitable, where a script uses a file path retrieved from the URL. But I am not completely sure. – Niklas B. Feb 20 '12 at 00:29
  • @Niklas: If you're testing the custom code, inject your test cases into the custom code. – Ben Voigt Feb 20 '12 at 00:30
  • Unfortunately, that's usually not the way it works. Nobody is interested in being told "this is in theory unsafe, although in practice you are lucky because your code will always sit behind a web server that does input sanitization for you". But thanks for all the information :) – Niklas B. Feb 20 '12 at 00:32
  • @Niklas: You usually can construct a strong argument around the potential for unknown bugs, regressions in future versions, and the desire for flexibility to use some other web server that doesn't perform the same processing. – Ben Voigt Feb 20 '12 at 00:42