From: Michael Goldshteyn (mgoldshteyn_at_[hidden])
Date: 2005-09-08 08:51:39
"Jeff Garland" <jeff_at_[hidden]> wrote in message
news:20050908014237.M29002_at_crystalclearsoftware.com...
> In general my goal has been to put as few human usability barriers in the way
> of using the wiki while being able to shut out spammers after the first
> incident and quickly recover all spammed pages. So far it's been working
> well. In a more typical week we get about 3 spammers changing 1-2 pages per
> incident. These look like they are done by hand, and hence CAPTCHA would do
> nothing to stop them. I've also resisted calls for registration, as I'm fully
> convinced that people smart enough to run bots to spam from 50 different IP
> addresses will simply register to work around that barrier. And one of the
> bots was also smart enough to meter its pace to work around 'throttling'
> traps. So I don't put it past some of these guys to find a way around CAPTCHA
> too. In the end, the critical thing is the backup -- no matter how bad the
> spam, things can be restored easily...
>
> Jeff
I agree with most of what you said. However, putting at least mild spam
prevention in place, such as scanning message content for multiple links that
point outside the Wiki, would go a long way toward controlling automated
spamming. Also, some human spammers may simply decide it isn't worth the
effort and move on to another wiki site. A rough sketch of the kind of filter
I have in mind is below.
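For illustration only, something along these lines could be run on each
submitted edit. The host name, link threshold, and function names are
placeholders I made up for this example, not code from any existing wiki:

    import re
    from urllib.parse import urlparse

    # Placeholder settings: the wiki's own host and the allowed number of
    # off-site links would be configured per installation.
    WIKI_HOST = "www.example-wiki.org"
    MAX_EXTERNAL_LINKS = 2

    URL_RE = re.compile(r"https?://[^\s<>\"']+", re.IGNORECASE)

    def count_external_links(text):
        """Count links in the submitted page text that point outside the wiki."""
        external = 0
        for url in URL_RE.findall(text):
            host = urlparse(url).netloc.lower()
            if host and host != WIKI_HOST:
                external += 1
        return external

    def looks_like_link_spam(text):
        """Flag an edit that adds more off-site links than the threshold allows."""
        return count_external_links(text) > MAX_EXTERNAL_LINKS

    if __name__ == "__main__":
        sample = ("Buy at http://pills.example.com or http://casino.example.net "
                  "and http://more-spam.example.org")
        print(looks_like_link_spam(sample))  # True: three off-site links

Bots that paste in dozens of links would be stopped outright, while ordinary
edits with one or two legitimate references pass through untouched.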
Mike