[unisog] HTTP Session Reconstruction and Monitoring

Donal Lynch donal at yorku.ca
Sat Dec 11 01:01:58 GMT 2004


If you can control the settings on the public access terminals, set
them up to use a proxy-cache server (like squid), and then use the
squid logs to track use.  Combine that with something like argus
(which can probably do everything you want on its own), and that
should give you everything you need.  I'm pretty sure that squid
will also do end-user/client authentication if you wanted to force
your 'public' users to authenticate before they could surf the web.
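As a rough illustration of mining the squid logs, here is a minimal
Python 3 sketch that tallies which hosts each client IP visited,
assuming squid's default native access.log layout (timestamp, elapsed,
client, action/code, size, method, URL, ...); field positions and file
paths would need checking against your own squid configuration.

```python
from collections import Counter
from urllib.parse import urlsplit


def parse_squid_line(line):
    """Parse one line of squid's native access.log format:
    timestamp elapsed client action/code size method URL user hierarchy type
    Returns None for lines that don't fit the expected shape."""
    fields = line.split()
    if len(fields) < 7:
        return None
    return {
        "timestamp": float(fields[0]),  # Unix time with milliseconds
        "client": fields[2],            # client IP of the terminal
        "status": fields[3],            # e.g. TCP_MISS/200
        "url": fields[6],               # requested URL
    }


def sites_per_client(lines):
    """Tally requested hostnames per client IP across log lines."""
    tally = {}
    for line in lines:
        rec = parse_squid_line(line)
        if rec is None:
            continue
        host = urlsplit(rec["url"]).hostname or rec["url"]
        tally.setdefault(rec["client"], Counter())[host] += 1
    return tally
```

Feeding this the lines of /var/log/squid/access.log (a path that varies
by install) would give a per-terminal picture of surfing activity.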



Donal Lynch
Asst. Manager, CNS Network Operations, York University
email: donal at yorku.ca   voice: 416.736.2100 x20282

On Fri, 10 Dec 2004, Jacob Roberts wrote:

> We would like to improve our ability to monitor inappropriate web
> surfing activity on our public access workstations.
> We have 3 basic requirements. The system must:
> 1. Handle our large amount of traffic
> 2. Reconstruct HTTP sessions (e.g. an analyst can retrieve a view of the
> visited web site based on captured packet data)
> 3. Configure rules for specific traffic matching.
> Does anyone know of any enterprise-level applications that can do these
> things?  We'd prefer an open-source solution.
> We are currently testing Steel-Cloud by Computer Associates.  It meets
> reqs 2 and 3 but fails to meet our needs for req 1.
> Thanks,
> Jake Roberts
> Brigham Young University
> _______________________________________________
> unisog mailing list
> unisog at lists.sans.org
> http://www.dshield.org/mailman/listinfo/unisog
