From: Rene Rivera (grafik.list_at_[hidden])
Date: 2005-09-08 09:31:49
Michael Goldshteyn wrote:
> I agree with most of what you said. However, putting at least mild spam
> prevention in place, including searching message content for multiple links
> outside of the Wiki, will at least help control automated spamming.
AFAIK Jeff is already doing content filtering, in addition to the IP
blacklisting.
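For reference, link-count content filtering of the kind described above
can be quite simple; here is a minimal sketch in Python (the threshold,
host name, and function name are hypothetical, not the Wiki's actual
filter):

    import re

    # Hypothetical values -- the Wiki's real threshold and host are not known to me.
    MAX_EXTERNAL_LINKS = 3
    WIKI_HOST = "wiki.example.org"

    URL_RE = re.compile(r"https?://([^/\s]+)", re.IGNORECASE)

    def looks_like_link_spam(text: str) -> bool:
        """Flag an edit whose body links to too many hosts outside the Wiki."""
        external = [h for h in URL_RE.findall(text)
                    if not h.lower().endswith(WIKI_HOST)]
        return len(external) > MAX_EXTERNAL_LINKS

Anything flagged this way could be held for review rather than rejected
outright, which keeps false positives cheap.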
> Also, some human spammers
> may just realize it isn't worth the effort and move on to another wiki site.
Possibly, but they are a determined bunch, since they are getting paid to do
this, and hence have considerable incentive to work around barriers.
An additional filter to consider... As many spammers use automated
tools, it might also be worth filtering on a white/black list of user
agent IDs: for example, only allowing edits when the UAID matches
"Mozilla/*Gecko/*Firefox/*", etc., and rejecting obvious spammers such
as "EmailSiphon*", etc. (a rough sketch of such a check follows the
links below). UAIDs at:
http://www.psychedelix.com/agents.html
List of User-Agents (Spiders, Robots, Browser)
http://www.zytrax.com/tech/web/browser_ids.htm
Browser ID Strings (a.k.a. User Agent ID)
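A rough sketch of such a UAID check, using shell-style wildcards in
Python (the list entries beyond the two patterns mentioned above are
only illustrative, not a vetted set):

    from fnmatch import fnmatch

    # Example patterns only; a real deployment would maintain much fuller lists.
    WHITELIST = ["Mozilla/*Gecko/*Firefox/*", "Mozilla/*MSIE*", "Opera/*"]
    BLACKLIST = ["EmailSiphon*", "EmailCollector*", "*WebZIP*"]

    def edit_allowed(user_agent: str) -> bool:
        """Reject blacklisted agents outright; otherwise require a whitelist match."""
        if any(fnmatch(user_agent, pat) for pat in BLACKLIST):
            return False
        return any(fnmatch(user_agent, pat) for pat in WHITELIST)

Of course the User-Agent header is trivially forged, so this only raises
the bar for the laziest automated tools.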
--
-- Grafik - Don't Assume Anything
-- Redshift Software, Inc. - http://redshift-software.com
-- rrivera/acm.org - grafik/redshift-software.com
-- 102708583/icq - grafikrobot/aim - Grafik/jabber.org