From: Michael Goldshteyn (mgoldshteyn_at_[hidden])
Date: 2005-09-08 08:51:39
"Jeff Garland" <jeff_at_[hidden]> wrote in message
> In general my goal has been to put as few human usability barriers in the
> way of using the wiki while being able to shut out spammers after the first
> incident and quickly recover all spammed pages. So far it's been working
> well. In a more typical week we get about 3 spammers changing 1-2 pages per
> incident. These look like they are done by hand and hence CAPTCHA would do
> nothing to stop these. I've also resisted calls for registration as I'm
> convinced that people smart enough to run bots to spam from 50 different
> addresses will simply register to work around that barrier. And, one of the
> bots was also smart enough to meter its pace to work around 'throttling'
> traps. So I don't put it past some of these guys to find a way around that
> too. In the end, the critical thing is the backup -- no matter how bad the
> spam, things can be restored easily...
I agree with most of what you said. However, putting in place even mild spam
prevention, such as searching message content for multiple links pointing
outside the wiki, would at least control automated spamming. Also, some human
spammers may realize it isn't worth the effort and move on to another wiki site.
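To make the idea concrete, here is a minimal sketch (in Python, purely for illustration) of the kind of check I mean: count links in submitted content whose host differs from the wiki's own, and flag the edit when there are too many. The host name, threshold, and function names here are all assumptions, not anything the wiki actually runs.

```python
import re
from urllib.parse import urlparse

# Assumed wiki host for the example; not the real Boost wiki address.
WIKI_HOST = "wiki.example.org"

# Rough URL matcher; good enough for a first-pass spam heuristic.
URL_RE = re.compile(r"https?://[^\s\"'>)]+")

def looks_like_link_spam(content: str, max_external: int = 2) -> bool:
    """Return True if `content` carries more than `max_external`
    links whose host is not the wiki's own."""
    external = [u for u in URL_RE.findall(content)
                if urlparse(u).hostname != WIKI_HOST]
    return len(external) > max_external
```

A check like this wouldn't stop a determined human, but it would cheaply catch the classic bot payload of a page stuffed with off-site links.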
Boost list run by bdawes at acm.org, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk