The 'Internet Archive' is an expansive repository of {{tag|archived}} screen scrapes from all types of websites, including wiki sites. Wikis frequently go [[:Category:Dead|down (dead)]] when [[owner]]s die, are otherwise incapacitated, or are unable to pay the server and/or hosting bills; or when they [[:Category:GoalAbandoned|give up]] because they find they are unable to cultivate a vibrant community, or are not equal to the task of coping with [[spam]], technical bug fixes, software updates, or other problems. The Internet Archive then becomes one of the only ways to get the content, since most wiki owners do not share the wiki content database with the general public.
However, the Wayback Machine at the Internet Archive does not always make the raw [[wikitext]] available for those who may want to import it into their own wikis. This is not an issue for [[:Category:Wikia|Wikia]], which is apparently happy to host an ever-increasing number of abandoned [[wp:Ghost town|ghost wikis]]. Many wikis [[mw:Manual:Robots.txt#With short URLs|set robot policies]] that prevent some or all of their content from being archived; for example, the [[English Wikipedia]] excludes deletion debates from being archived.[https://En.Wikipedia.org/robots.txt] Also, sometimes when a site goes down, the new domain owner sets a robot policy that prevents archives of the old content from being viewed.{{Fact}}
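Such a policy is expressed in the site's <code>robots.txt</code> file. Below is a minimal sketch, assuming a MediaWiki site with short URLs under <code>/wiki/</code>; the specific paths are illustrative, and <code>ia_archiver</code> is the user-agent string the Internet Archive's crawler has historically honored:

<pre>
# Hypothetical robots.txt: keep the Internet Archive's crawler
# away from the entire site
User-agent: ia_archiver
Disallow: /

# Or, more selectively, allow archiving in general but exclude
# deletion debates, in the style of the English Wikipedia
User-agent: *
Disallow: /wiki/Wikipedia:Articles_for_deletion/
</pre>

Note that a rule like this only takes effect for future crawls; a new domain owner adding a restrictive <code>robots.txt</code> is what can render previously captured snapshots of the old content unviewable.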
;''See also''