

Wikipedia would be a shambles without bots - paulsilver
http://www.bbc.co.uk/news/magazine-18892510

======
bane
_The bots do make mistakes, however, if they encounter a new circumstance
their programming cannot account for. ClueBot NG, the anti-vandalism bot, has
a small rate of false positives - edits it mistakes for vandalism, but which
are in fact legitimate._

 _Since Wikipedia closely tracks edits, however, mistakes can be repaired
almost as quickly as they happened, administrators say._

I think fairly consistent commentary over the years demonstrates that this is
patently false. Deletionists are capricious and arbitrary, and reversals are
bureaucratic, complex, and lengthy (if they are possible at all). It
disappoints me to think about how much legitimate human knowledge has been
wiped from WP, its authors discouraged, and WP made poorer for it.

~~~
user49598
_It disappoints me to think about how much legitimate human knowledge has been
wiped from WP_

Then it must absolutely enthrall you to think about how much human knowledge
is made available by Wikipedia. You can be just as persistent as a
"deletionist"; there's nothing stopping you. It doesn't mean the system is
broken just because some people are passionate about it.

~~~
user49598
Did you downvote my comment because you don't agree with it or because you
think it's non-constructive?

~~~
danielweber
I didn't downvote you, but "if you don't like someone else's edit, edit it
yourself" comes down to who can afford to spend more of their life babysitting
Wikipedia.

I thought that line of reasoning was settled years ago. Yeah, the people who
are more willing to waste their lives on Wikipedia policy pages will win. Now,
what are the bad effects of that?

~~~
user49598
Talk pages. If you have a problem with an article, bring it up on the talk
page. I don't see how else you could expect a free, user-edited, online
encyclopedia to function.

For what it is, Wikipedia is incredibly effective. If you don't care about
Wikipedia policy, how can you care about the exact content of Wikipedia?

~~~
ars
It used to be I would check the color of the talk tab, and if there was text
there I'd go read it.

Then someone had the idea of putting a template with the "importance" of the
page in the talk page. So now _every_ page has a talk page with text, and I
never check them.

It used to be I would ask questions in the talk page and get an answer within
hours, now I'm lucky to get an answer that year! (Not exaggerating.)

The amount of content lost when vandals change text to garbage, and someone
later removes the garbage without reverting the original edit, is staggering.
I think it's time to auto-lock virtually all old pages and require a second
opinion on every edit.

~~~
user49598
Vandalism is a bummer, but it's certainly not a reason to lose faith in
Wikipedia. "Staggering" loss of content is an enormous exaggeration,
especially since the articles are versioned and the versions are easily
comparable.

~~~
ars
Unless you go searching in the history you'll never even know about all the
missing content.

And yes, it is staggering, I am not exaggerating given that I've personally
restored quite a number of pages - some of them have sat completely gutted of
content for a year.

------
espinchi
Just some complementary information, extracted from
<http://en.wikipedia.org/wiki/Wikipedia:Bots> (very interesting for those who
didn't know much about the role of bots on Wikipedia).

Over 60 million edit operations have been performed by bots on the English
Wikipedia.

Some bot examples: _Yobot_ , which categorizes individuals by birth date,
profession, and other criteria; _SineBot_ , which signs comments left on talk
pages; _MiszaBot_ , which archives talk pages; and _Xqbot_ , which fixes
double redirects.

Some bots, like _RussBot_ (<http://en.wikipedia.org/wiki/User:RussBot>), have
a big red _Emergency bot shutoff_ button.
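The shutoff button is typically just a designated wiki page that the bot
re-reads before editing: if anyone changes the page away from its expected
text, the bot halts. A minimal sketch of that pattern (the sentinel text and
control flow here are illustrative assumptions, not RussBot's actual
implementation):

```python
# Hypothetical "emergency shutoff" pattern: the bot polls a control page
# before each edit and stops if the page no longer reads "enable".
ENABLED_SENTINEL = "enable"  # assumed marker text on the shutoff page

def shutoff_triggered(shutoff_page_text: str) -> bool:
    """Return True if the bot should halt its edit run."""
    return shutoff_page_text.strip().lower() != ENABLED_SENTINEL

def run_edits(edits, get_shutoff_text):
    """Apply edits one by one, checking the shutoff page before each.

    `get_shutoff_text` is a callable that fetches the current text of the
    shutoff page (e.g. via the MediaWiki API); stubbed out here.
    """
    applied = []
    for edit in edits:
        if shutoff_triggered(get_shutoff_text()):
            break  # someone pressed the big red button
        applied.append(edit)
    return applied
```

The point of the design is that any editor can stop a runaway bot just by
editing one page, without needing shell access to the machine the bot runs on.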

------
gaius
_one called rambot created about 30,000 articles - at a rate of thousands per
day - on individual towns in the US_

Umm, no, someone wrote a script and ran it. This article massively overstates
what a "bot" is.

~~~
Sharlin
That's not really a meaningful distinction, especially in an article meant for
a popular audience. Besides:

 _A bot (derived from 'robot') is an automated or semi-automated tool that
carries out repetitive and mundane tasks in order to maintain the 4,009,598
articles of the English Wikipedia._

\-- <http://en.wikipedia.org/wiki/Wikipedia:Bots>

_Internet bots, also known as web robots, WWW robots or simply bots, are
software applications that run automated tasks over the Internet. Typically,
bots perform tasks that are both simple and structurally repetitive, at a much
higher rate than would be possible for a human alone._

\-- <http://en.wikipedia.org/wiki/Internet_bot>

~~~
_delirium
I agree they're all bot-ish, but in the context of Wikipedia, I think it's
useful to distinguish. Possibly not for the audience of the original article,
but they do pretty different things.

Heuristic reactive bots are an interesting component of the human/machine
hybrid "Wikipedia immune system" that keeps most encyclopedia articles non-
vandalized most of the time, despite that seeming implausible at first (the
most surprising thing about Wikipedia, if you've ever run any other wiki, is
that it doesn't get totally full of spam and garbage within hours).

Others are more like content-import scripts that run once; Rambot falls into
that category. And still others are closer to external implementations of
functionality that MediaWiki is missing internally. For example, MediaWiki
lacks a "rename category" function, so there are helper bots that "rename" a
category by mass-removing every article from the old category and mass-adding
them to the new one.
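That mass-edit workflow amounts to rewriting each article's wikitext so its
category link points at the new name. A rough sketch, assuming only the basic
`[[Category:Name]]` and `[[Category:Name|sortkey]]` syntax (real helper bots,
e.g. those built on pywikibot, handle more edge cases such as case-insensitive
first letters and space/underscore equivalence):

```python
import re

def move_category(wikitext: str, old: str, new: str) -> str:
    """Rewrite [[Category:old]] links to [[Category:new]], keeping sortkeys.

    Illustrative only: operates on one page's wikitext; a bot would fetch,
    rewrite, and save every page in the old category.
    """
    pattern = re.compile(
        r"\[\[Category:" + re.escape(old) + r"(\|[^\]]*)?\]\]"
    )
    # group(1) is the optional "|sortkey" part, preserved verbatim.
    return pattern.sub(
        lambda m: "[[Category:" + new + (m.group(1) or "") + "]]",
        wikitext,
    )
```

Run over every member of the old category, this produces exactly the
mass-remove/mass-add effect described above, one edit per article.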

------
pavel_lishin
I wish they'd cited some examples of the mistakes. In general, this article
doesn't offer much that the submission title didn't cover.

~~~
TazeTSchnitzel
Yeah, I expected it to be slightly more in-depth, perhaps picking out a few
particular bots and profiling them.

