[unisog] A new way to look for exploits

H. Morrow Long morrow.long at yale.edu
Thu Mar 14 16:46:50 GMT 2002


John -  The initiating IP address for the connections coming into your web server
	should still be logged in the appropriate web server logfiles.

	The 'Host: ' HTTP 1.1 header which the web browser client sends
	to the web server is supposed to contain the IP address (or, more
	commonly, the DNS name) of the server the browser expects to contact
	and talk to -- it is NOT the client's IP address but rather the
	expected server's address or name.

	The second case is obviously a vulnerability probe (probably being run
	against each web server found in Ohio U.'s entire Class B), and the
	scanning software (run by either a person or a worm) is 'funning' you
	by setting the Host: header to one of your own 132.235.x.x addresses.

	The HTTP 1.1 version of the HTTP protocol uses the client's Host: header
	to allow one web server computer to 'host' multiple virtual web servers
	without having to dedicate (and 'waste') a public IP address for each
	one -- IP addresses are a scarce, valuable resource for some ISPs and
	web hosting companies.
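
	As a rough sketch (assuming an Apache-style configuration; the
	directives are real Apache ones but the site names and paths here are
	made up), name-based virtual hosting looks something like this -- the
	sites share one IP address and are told apart only by the Host: header
	the client sends:

	NameVirtualHost *:80

	<VirtualHost *:80>
	    ServerName   www.abc.com
	    DocumentRoot /www/abc
	</VirtualHost>

	<VirtualHost *:80>
	    ServerName   www.nbc.com
	    DocumentRoot /www/nbc
	</VirtualHost>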

	Before HTTP 1.1, companies running multiple web server instances on
	the same machine had to configure the machine with multiple IP
	addresses (aliasing them onto one physical Ethernet NIC, for example)
	just so that they could serve URLs for different customers
	(such as http://www.abc.com/, http://www.cbs.com/, http://www.nbc.com/)
	off of the same server.  The web server (e.g. Netscape/Apache/IIS) would
	know which site was being requested by the IP address the socket
	connection came in on.
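
	For contrast, the older IP-based approach bound each site to its own
	address.  An Apache-style sketch (the 192.0.2.x addresses and paths
	are made up) would look roughly like:

	<VirtualHost 192.0.2.1>
	    ServerName   www.abc.com
	    DocumentRoot /www/abc
	</VirtualHost>

	<VirtualHost 192.0.2.2>
	    ServerName   www.cbs.com
	    DocumentRoot /www/cbs
	</VirtualHost>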

	In HTTP 1.1 the client sends the server the name (or IP address) of
	the server it wants to reach as the 'Host:' header among the
	client-side request headers:

	Host: www.abc.com

	or 

	Host: www.nbc.com

	or

	Host: www.cbs.com

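	Put together, a full HTTP 1.1 request to one of those virtual servers
	might look like the following (the path is just illustrative).  Note
	that the client's own IP address never appears in the headers at all
	-- the server only learns it from the TCP connection itself, and that
	is the address which ends up in the access log:

	GET /index.html HTTP/1.1
	Host: www.cbs.com
	Connection: close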

	Often, if you use telnet to connect to port 80 on a newer web server
	hosting multiple virtual server instances and type 'GET / HTTP/1.0' or
	'HEAD / HTTP/1.0' (followed by hitting return twice), you'll get back
	the default site or a message that the web server can't tell which
	particular virtual server instance you really want to talk to.
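
	For instance, a session along these lines (the host name is made up
	and the exact response will vary from server to server; in this sketch
	the default virtual server simply answers):

	$ telnet www.some-host.example 80
	Trying 192.0.2.10...
	Connected to www.some-host.example.
	Escape character is '^]'.
	HEAD / HTTP/1.0

	HTTP/1.1 200 OK
	Date: Thu, 14 Mar 2002 16:00:00 GMT
	Server: Apache/1.3.23
	Connection: close
	Content-Type: text/html

	Connection closed by foreign host.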

- H. Morrow Long
  University Information Security Officer
  Yale Univ., ITS, Dir. InfoSec Office
  


"John E. Tysko" wrote:
> 
>  Recently, I noticed several of our machines being scanned by
> alexa.com, presumably for the web archive services provided by
> archive.org. Several requests for interesting web pages came
> to my attention, and one of the latest from yesterday looked
> like this: 2 connections, the first, a polite:
> 
>    GET /robots.txt HTTP/1.0
>    Connection: close
>    Host: 132.235.16.144
>    User-Agent: ia_archiver
>    From: crawler at alexa.com
> 
> and the second, an interesting:
> 
>    GET /cgi-bin/..%c0%af..%c0%af..%c0%af..%c0%af..%c0%af../winnt/system32/cmd.exe HTTP/1.0
>    Connection: close
>    Host: 132.235.x.x
>    User-Agent: ia_archiver
>    From: crawler at alexa.com
> 
> It would seem there is a new way to probe for security holes
> without giving away your IP.
> 
> Is this unique to our machines, or has anyone else seen this?
> 
> John
> 
>   John Tysko
>   Systems Administrator
>   Electrical Engineering and Computer Science
>   Ohio University, Athens Oh 45701

