Talk:NSwiki (second generation)

Stats artificially inflated on NSwiki (second generation)
Hi, it has come to my attention that a majority of the “articles” on NSwiki (second generation), possibly up to 114,424, are actually bot-created (see and  as evidence). How should we approach this? --Minoa (talk) 19:23, 10 September 2016 (PDT)

I do not think it is of interest to us. We do not judge the quality of pages, we just count them. For them it might make sense to pre-create pages by bot. This way the pages would formally be correct, containing the standard templates etc. Users would not have to bother finding out which templates to use and how to apply them. Other wikis import pages from Wikipedia (to rescue them from deletion). We do not judge how pages are created. We can be happy if we get an entry about a wiki at all. Most wikis out there are not listed in WikiIndex. To go deep into a wiki and make investigations and judgements would cost enormous energy, which in turn could not be used for our actual work: having a page for every wiki out there.

We could (if we cared to be efficient) pre-create a page for every wiki out there on Wikia and double our own page count. But nobody here is able to write such a bot :-(  Manorainjan  04:52, 11 September 2016 (PDT)