A quick 404 project update

Published: 2011-08-05. Last Updated: 2011-08-05 15:49:45 UTC
by Johannes Ullrich (Version: 1)
5 comment(s)

We have been collecting data for about a week now, and I think it is time to give everybody a quick update on the project. Thanks for all the submissions so far. We do have some initial results, just not enough to automate the reports quite yet. But there are now clients for Perl, Python, and ASP! (thanks to the contributors)

Some of the most common scans target:

  • WordPress. We do have a good number of reports coming in for wp-login.php. 
  • phpMyAdmin (/phpmyadmin/scripts/setup.php)
  • MediaWiki/Wiki (but these hits only come from a few submitters, so they may not be statistically significant yet)
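
If you want a quick look at whether your own server is seeing these scans, you can count hits against the paths above in your access log. A minimal sketch in Python — the log path and the Apache combined log format are assumptions; adjust both for your setup:

```python
# Count hits against commonly scanned paths in an Apache-style access log.
# SCAN_TARGETS mirrors the paths reported above; extend it as needed.
import re
from collections import Counter

SCAN_TARGETS = ["wp-login.php", "/phpmyadmin/scripts/setup.php", "/wiki/"]

def count_scan_hits(log_lines):
    """Return a Counter of scan-target hits found in the given log lines."""
    hits = Counter()
    for line in log_lines:
        # Pull the request path out of a combined-format line: "GET /path HTTP/1.1"
        m = re.search(r'"(?:GET|POST|HEAD) (\S+)', line)
        if not m:
            continue
        path = m.group(1)
        for target in SCAN_TARGETS:
            if target in path:
                hits[target] += 1
    return hits
```

You would typically feed it an open log file, e.g. `count_scan_hits(open("/var/log/apache2/access.log"))`, and print the `most_common()` entries.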

And some frequently requested files that are likely not an attack:

  • robots.txt - search engines will look for it. You should have the file to control well-behaved search engines. Just don't use it to list secret / restricted pages ;-)
  • apple-touch-icon files (there are a number of different ones for different resolutions). This is just like a "favicon", but used by Apple's iOS devices. With them being more and more popular, you may want to set one up.
  • crossdomain.xml - this file is used by Flash and Silverlight to communicate your cross-domain policies. We have talked about the file before. It is a good idea to have an empty one that restricts access (this is the default for up-to-date Flash players)
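
For reference, a restrictive crossdomain.xml that denies all cross-domain access looks like this (following Adobe's documented policy-file format):

```xml
<?xml version="1.0"?>
<!DOCTYPE cross-domain-policy SYSTEM
  "http://www.adobe.com/xml/dtds/cross-domain-policy.dtd">
<cross-domain-policy>
  <!-- No allow-access-from elements: deny all cross-domain requests -->
  <site-control permitted-cross-domain-policies="none"/>
</cross-domain-policy>
```

Serve it from the root of your site (/crossdomain.xml) so scanners and Flash players find the restrictive policy instead of a 404.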

Please keep the reports coming, and please install the "client code" on your error page if you haven't yet. Once you have installed it, you can verify that your submissions are working by logging in and going to the 404 report page.
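
To give a feel for what such a client does, here is a sketch in the spirit of the PHP/Perl clients: the error page assembles the details of the failed request into a report URL and fetches it server-side. The endpoint and parameter names below are placeholders for illustration only; use the official client code for real submissions.

```python
# Sketch of a 404-report client. The endpoint and parameter names are
# assumptions, not the project's real API.
import urllib.parse

REPORT_URL = "https://example.org/404report"  # placeholder, not the real endpoint

def build_report(api_key, requested_url, client_ip, user_agent):
    """Assemble the query string a 404 error page would submit."""
    params = {
        "key": api_key,          # your per-user submission key
        "url": requested_url,    # the path that produced the 404
        "ip": client_ip,         # the requesting client's address
        "ua": user_agent,        # the requesting client's User-Agent
    }
    return REPORT_URL + "?" + urllib.parse.urlencode(params)
```

On a real error page you would fetch the resulting URL server-side (e.g. with urllib.request.urlopen), so the visitor never sees the submission.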

------
Johannes B. Ullrich, Ph.D.
SANS Technology Institute
Twitter

Keywords: 404 project

Comments

Where is the Perl version? I can only find the PHP one. Sorry, no PHP here, thanks.
I had recently uninstalled MediaWiki before installing the 404 monitoring code, so it may just be search engines crawling pages that used to be there, on my site at least. Hope I don't screw up your data too much.
Is there a way to submit other errors also?
E.g. I have set up my webserver to be accessible through use of a "VirtualHost name" only, and log all access to the default unnamed host.
And lately I see a lot of weird accesses like these:
[Fri Aug 05 03:20:24 2011] [error] [client 188.54.65.180] Invalid URI in request \xba\v:`aX\x06\xd8J8\x99\xff\x1b\xf5\x81
[Fri Aug 05 04:31:29 2011] [error] [client 98.228.48.241] Invalid URI in request \xc2BrD\x1d}5\x10\xd6\xbf?\xec)\xf2D\x9b\xae\x80\x17@\xe8pt\x1bp]F\xbd\xfc\xcd\x97\xba\x14b\xe4\r\xd8\x86B\xf9\xaa\x93\x9a\xcbos\xcb\x16M\xe9
[Fri Aug 05 12:17:01 2011] [error] [client 80.61.152.143] Invalid URI in request \x16\xc6B\xf1\x80\xac\xd85\xc6\x8f\xb7!\xb4?\xd7\xc1T\xb8\x9c\r\xef\xc8\xb2\x03
@TriMoon there is a way you can 'manually' input data. The best way to submit is to try to get all 404's to go through a version of the reporter that fits with your infrastructure. If that isn't possible then maybe someone can write a manual submitter. I could write one up in Python pretty quickly _IF_ the 404 project thinks it's a good idea.
@Ashcrow; I'm not sure what you mean by inputting data manually...
I'm already using the provided PHP script to submit 404's; I just wanted to know about the other error codes because of those accesses I posted, which didn't show up until recently.
They showed up some time before I installed the PHP script though, but they are new in my eyes :)
