[unisog] SP*M Detection Methods & Processes

Bill Martin BMARTIN at luc.edu
Mon Sep 25 19:56:34 GMT 2006


To everyone that has replied, thank you... it is always comforting to
know that we are at least keeping pace with other universities :-).

Now, it appears that, of those who responded, we are all doing pretty
much the same thing in terms of process, although the architecture and
utilities might vary slightly:
	Free as possible (for some)
	Open Source (for most)
	Reject above a moderately high level
	Tag anything within the "average spam range" and pass
	Virus scanning is typical (open source or commercial, some
using both)
	Most designs are of the "inline" variety
	
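The reject / tag-and-pass / pass-clean process above boils down to two
score thresholds. A minimal sketch, assuming SpamAssassin-style scores
with a hypothetical reject level of 10.0 and the common tag level of
5.0 (both numbers are assumptions, not anyone's actual config):

```python
# Hypothetical thresholds; real sites tune these to taste.
REJECT_THRESHOLD = 10.0   # "moderately high level" -> reject outright
TAG_THRESHOLD = 5.0       # "average spam range" -> tag and pass along

def classify(score):
    """Map a per-message spam score to one of the three outcomes."""
    if score >= REJECT_THRESHOLD:
        return "reject"
    if score >= TAG_THRESHOLD:
        return "tag-and-pass"
    return "pass-clean"

# Example batch of made-up message scores:
scores = [0.3, 4.9, 5.2, 12.7]
print([classify(s) for s in scores])
```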
One last bit on this: given the similarities in tools, design, etc.,
I would expect that the statistics (rejected, tagged-and-passed, passed
clean) would be roughly similar in percentage terms.  Given the recent
escalation in "complaints" of "excessive spam getting through", I can
only speculate that others are seeing similar results?  If not, I'm
going to guess that your particular institution may have provided some
type of FAQ, HowTo, or more formal training on preventing SPAM or
proper use of e-mail, Listserv, e-mail tags or SIGs (or security in
general).

About a year ago, we were seeing roughly 60-65% being tagged at 5.0,
*after* blocking a number of hosts/blocks for sending mail to unknown
recipients.  The mail that does pass used to form a well-balanced bell
curve, evenly distributed between 0 and 10, but as of late it looks
more like Igor from Frankenstein (the bulge is between 0 and 5).  These
were quick ad hoc stats and nothing formal, so mileage may vary.  Of
these, less than 5% is actual unsolicited SPAM.  So, what was published
on Educause back in '03 is pretty apparent.
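A quick ad hoc distribution like the one above can be eyeballed with a
few lines; the scores below are made up for illustration (real numbers
would come from your mail logs), but a pile-up in the 0-5 bins is the
lopsided "Igor" shape described:

```python
from collections import Counter

# Hypothetical per-message spam scores for mail that passed (0-10 range).
scores = [0.4, 1.2, 1.8, 2.1, 2.6, 3.0, 3.3, 4.1, 4.7, 6.2, 8.9]

# Bucket into whole-point bins and print a crude text histogram.
bins = Counter(min(int(s), 10) for s in scores)
for b in range(11):
    print(f"{b:2d} | {'#' * bins.get(b, 0)}")
```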

Once again, thank you everyone for your input.

I'll throw some stats out after I collect them again from last week.
