Place your favourite anti-spam measures here, organised by wiki engine.
- Manual:Combating spam — at MediaWiki.org
- The admins at This Might Be A Wiki have a bot crawl the wiki constantly, searching for spam and automatically blocking offenders. I'm not sure how they got it to do that, but it seems to have a low rate of false positives. —User:Sean Fennel@ 15:02, 27 Mar 2006 (EST)
- Spam blacklist
- WikiIndex is being overwhelmed by spam bots. Please install the Asirra extension to stop this.
- First measure
Spammers regularly targeted Old Liberapedia (now no longer a wiki), which was running out-of-date software. Sysops deleted articles that were entirely spam and blocked the spammers; ordinary users replaced spam with delete templates and alerted sysops, and spam in otherwise good articles was reverted. After about two months, spammers decided that targeting Liberapedia wasn't worthwhile, and spam slowed to a trickle. New Liberapedia (currently online) runs better software and should be less vulnerable. The lesson: if humans guard their wiki systematically, it helps even with bad software. Vandals, by contrast, were a regular problem on Old Liberapedia and are less easily deterred.
Spamming the spammers
As WikiIndex has grown, spammers have started spamming this wiki, and it's becoming annoying. Proxima Centauri has started replacing spam with the message below. Afterwards, the page is protected so the spammer can't do anything about this message, which they don't want potential customers to read. See Opportunityreview.
WikiIndex users are free to copy this for their own wiki. If you are not an administrator you can ask one of the admins on your wiki to protect the page.
- I find this to be a very interesting strategy. Not sure how I feel about it yet. What is the thought around using a template or category to be able to keep track of these pages? Best, MarkDilley
- adding Category:SpamFighting to the pages I see this tactic used on.
Who is unreliable?
Among other things, firms that 'spam' advertising onto other websites are exploiting those sites for free advertising. Before buying anything from any 'spamming' website, you should ask yourself, "Will this business try to exploit me the way its advertisement exploits the site where it appears?"
- (This page is protected. Please put suggested modifications onto the talk page.)
Specific spam was also targeted. For example, when a spammer advertised garden ornaments, the page was modified to show that such products are considered Kitsch. A link to the Wikipedia article on Kitsch gave readers the chance to look at a beautiful example of a Kitschy garden gnome. See User talk:Susy Lunardi.
Interested in the anti-spam measures in development from WikiSym2005
- Sure am! Any interesting developments??? John 12:44, 27 October 2006 (EDT)
- I've come to the conclusion that even good editors occasionally accidentally make bad edits, such as page-blanking good pages. What I really want is a spam filter that can tell the difference between "good edits" vs "bad edits". Alas, that seems far more difficult than distinguishing between "editors who know a password" and "editors who don't know any password". --DavidCary 12:42, 16 March 2009 (EDT)
- "If only there were evil people somewhere insidiously committing evil deeds, and it were necessary only to separate them from the rest of us and destroy them. But the line dividing good and evil cuts through the heart of every human being. And who is willing to destroy a piece of his own heart?" -- Aleksandr Solzhenitsyn, 'The Gulag Archipelago' (1973). --DavidCary 23:06, 18 March 2009 (EDT)
WikiIndex is being overwhelmed by spam bots
- I don't think we are being "overwhelmed" per se; spam pages tend to be deleted by me or other admins fairly quickly. That being said, Asirra is the most interesting anti-spam extension I've ever seen. It won't make us immune (see Referata, which has this extension installed and is still fighting spambots), but it might reduce spam. Elassint 27 June 2012
- QuestyCaptcha is working well too; I use it on the Semantic Stargate Wiki. I see many spam attempts in my configuration space, but none on the wiki. --LIMAFOX76 04:07, 27 June 2012 (PDT)
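- For readers wanting to try QuestyCaptcha, here is an illustrative LocalSettings.php fragment in the 1.x require_once style of that era. The question and answer are placeholders; the whole point of QuestyCaptcha is to write questions specific to your own wiki's audience, since generic questions get scraped and answered by bots.

```php
// LocalSettings.php — illustrative sketch, not a tested configuration
require_once "$IP/extensions/ConfirmEdit/ConfirmEdit.php";
require_once "$IP/extensions/ConfirmEdit/QuestyCaptcha.php";
$wgCaptchaClass = 'QuestyCaptcha';

// Placeholder question — replace with ones only your readers would know
$wgCaptchaQuestions[] = array(
    'question' => 'What is the name of this wiki?',
    'answer'   => 'WikiIndex',
);

// Trigger the CAPTCHA on account creation and on edits that add URLs
$wgCaptchaTriggers['createaccount'] = true;
$wgCaptchaTriggers['addurl']        = true;
```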
- WikiIndex is nowhere near being "overwhelmed" by spam bots; it's – unfortunately – standard spambot activity. Unlike some wikis out there, WikiIndex has active admins and therefore the best the spambots can do is submit a spam page which stays up for a couple hours until it's deleted by an admin.
- That being said, WikiIndex is once again running an outdated, unsupported release of MediaWiki, with no anti-spam extensions other than ConfirmEdit and SpamBlacklist. Installing the SimpleAntiSpam extension, and preferably customizing the MediaWiki:Simpleantispam-label message (I have a feeling that spambots know what text to expect and therefore can skip this trap easily), would most likely help a bit...but in time an all-in-one solution, like Phalanx, should be deployed. Phalanx is an anti-spam extension that combines the features of Extension:SpamRegex, Extension:RegexBlock, Extension:TitleBlacklist, and Extension:SpamBlacklist into one, and even adds some additional features. It needs to be generalized a bit to work on more standard MediaWikis, as it was written with a wiki farm setup in mind.
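- For context, SimpleAntiSpam is a honeypot: a minimal, illustrative 1.x-style install might look like the fragment below (path per the usual extension layout; not a tested configuration).

```php
// LocalSettings.php — illustrative 1.x-style installation
require_once "$IP/extensions/SimpleAntiSpam/SimpleAntiSpam.php";
// The extension adds a form field that is hidden from human users;
// naive bots fill in every field, and any submission with the hidden
// field populated is rejected. Customizing the on-wiki
// MediaWiki:Simpleantispam-label message, as suggested above, changes
// the visible label so bots can't key on the default text.
```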
- For now, QuestyCaptcha and/or Asirra would be good solutions; though experience shows that Asirra is not as easy to use as one would think; I had users clicking on all images and wondering why the CAPTCHA fails...
- There's also the AbuseFilter extension, which allows privileged users (administrators) to define certain rules, and if an edit matches one of these rules, it could be tagged as suspicious, or you could even prevent the submission of the edit and block the user submitting it. It's enabled on all Wikimedia Foundation wikis and plenty of other wikis (though again, I'm not sure if a 1.17-compatible version exists and/or how stable & feature-rich such a version would be). --Jack Phoenix 08:53, 27 June 2012 (PDT)
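- To make the AbuseFilter suggestion concrete: filters are written in the extension's own rule language, not PHP. The fragment below is an illustrative (untested) rule that would match mainspace edits by low-edit-count accounts adding external links; the threshold of 10 edits is an arbitrary example, and variable names follow the AbuseFilter rule-format documentation of that era.

```
user_editcount < 10 &
article_namespace == 0 &
added_lines rlike "https?://"
```

A matching edit could then be tagged for review, warned, or disallowed outright, depending on how aggressive you want the filter to be.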
Asirra is a damn good anti-spambot measure, although it can be a pain to get set up properly. You have to watch out for a common problem in which it doesn't let users through even when they get the CAPTCHA right. Leucosticte (talk) 18:15, 12 September 2012 (PDT)