Analyzing HTTP Packet Captures

Published: 2011-03-16
Last Updated: 2011-03-16 15:12:36 UTC
by Johannes Ullrich (Version: 1)

There are plenty of tools to extract files that are transmitted via HTTP: for example, Jim Clausing's brilliant perl script [1], or Wireshark's "Export Objects" feature, among many others (chaosreader, xplico, NetworkMiner ...).

However, I am sometimes faced with a different problem: you have a network capture of a set of HTTP requests, and you are trying to "replay" them in all of their beauty, including all headers and POST data if present.

There are two parts to this challenge:

- extracting the HTTP requests from the packet capture
- sending the extracted requests to a web server

"tcpreplay" may appear like the right tool, but it will just blindly replay the traffic, and the web server will not actually establish a connection.

"wireshark" can be used to extract the data using the tcp stream reassembly feature, but this can't easily be scripted. "tshark" does not have a simple feature to just extract the http requests. You can only extract individual headers easily or the URLs.

Probably the easiest way to parse the packet capture and extract the requests is the perl module "Sniffer::HTTP". This module will not only reassemble the TCP streams, it will also extract the HTTP requests:

#!/usr/bin/perl

use Sniffer::HTTP;

my $VERBOSE = 0;

# Fire a callback for every complete HTTP request found in the
# reassembled TCP streams, and print it verbatim.
my $sniffer = Sniffer::HTTP->new(
  callbacks => {
      request => sub { my ($req, $conn) = @_; print $req->as_string, "\n" if $req },
  }
);

$sniffer->run_file("/tmp/tcp80");   # read packets from the capture file

This will read packets from the file "/tmp/tcp80" and print the HTTP requests. The output could now be piped to netcat, or sent directly from perl.
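For the pure-perl route, a minimal sketch along these lines should work (the request file name, target host, and port are placeholders to adjust; it assumes the server closes the connection when it is done responding):

#!/usr/bin/perl
use strict;
use warnings;
use IO::Socket::INET;

# Slurp one extracted request (placeholder file name).
my $request = do {
    local $/;
    open( my $fh, '<', '/tmp/request.txt' ) or die "open: $!";
    <$fh>;
};

# Connect to the target web server (placeholder host/port).
my $sock = IO::Socket::INET->new(
    PeerAddr => '127.0.0.1',
    PeerPort => 80,
    Proto    => 'tcp',
) or die "connect: $!";

print $sock $request;   # replay the request verbatim
print while <$sock>;    # dump the server's response
close($sock);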

 

[1] http://handlers.sans.org/jclausing/extract-http.pl

------
Johannes B. Ullrich, Ph.D.
SANS Technology Institute

Keywords: http perl tshark