Kids these days: the quality of new Wikipedia editors over time (wikimedia.org)
117 points by vgnet on Mar 28, 2012 | 51 comments



It's not even a new user issue. I've been an editor for going on 5 years, and I still get templated and reverted on inconsequential edits.

Case in point: I recently edited this picture: http://en.wikipedia.org/wiki/File:Turing_Machine_in_Golly.pn...

It's a screenshot of the 6 octillionth generation of a Turing Machine built in Conway's Game of Life, using the GPL'd program Golly. The previous image contained Windows GUI elements (title bar, etc) which are non-free and cannot be used on WP when a free alternative exists. So I cropped it and uploaded the edited image.

Within minutes, I had received a template on my (seldom-visited) commons talk page informing me that I had uploaded a file without specifying its license, and that it would be deleted. Take a look at the "Permission" field on that image: "See LICENSE.TXT distributed with Golly for GPLv2 license"

Despite this (incredibly clear) assertion that the image was GPL'd, I received a warning that it would be deleted. Why? Because I hadn't included the "This image covered by the GPL" template, which a) I didn't know existed, b) wasn't mentioned on the upload page, and c) is a wordier version of what I wrote in the license field.

As an experienced editor, I'm used to these stupid quibbles and time-wasting fights. I'll still contribute, although they are a large part of why I don't contribute more. As a new editor seeing this, however? I would have told them to fuck off, got banned for incivility, and never gone back.

There's an attitude among the regulars that Wikipedia is a treasured resource that must be defended against innumerable vandals, trolls, and spammers by a select cadre of noble volunteers. To an extent, they're right. But when you have such badges as "The Defender of the Wiki Barnstar" [1] being held up as the height of achievement for veteran editors, it engenders a culture that is exclusionary, if not actively hostile, towards new editors.

[1] http://en.wikipedia.org/wiki/File:WikiDefender_Barnstar.png


Are you sure showing title bars isn't allowed? Almost all screenshots on Wikipedia include them, even for open source software. e.g.:

https://en.wikipedia.org/wiki/File:Firefox11.png

https://en.wikipedia.org/wiki/File:Winscp_screenshot.png


Here's the full deal:

If you store your screenshot on English Wikipedia, non-free imagery is allowed when there's a rationale, which in this case would be "fair use". That's the decision of that community. I think you could store that screenshot as is. http://en.wikipedia.org/wiki/Category:Fair_use_screenshots

If you use Wikimedia Commons (a common media backend for all the Wikipedias and other projects), there are no exceptions; it's all free, all the time. Fair use is NOT allowed. The guidelines are here: http://commons.wikimedia.org/wiki/Commons:Screenshots

On Commons, in theory, title bars and other OS windowing widgetry are okay if and only if it's a fully open source operating system.

In practice, most Commons volunteers are not insane, and don't care. I don't know of any case where Microsoft has sued someone for a screenshot that included a title bar, where there was no intent to deceive people or otherwise misuse their work.

But, Wikipedia and Wikimedia Commons tend to attract and enable a kind of person that values extreme consistency and rule-following over all other considerations. Someone who makes it their mission to eliminate all screenshots with titlebars probably won't be reined in.


I've had images that included them deleted before, which is why I modified the Golly screenshot. To be perfectly honest, I'm not sure what's allowed, and I doubt many people who are not active Commons contributors do, either.


Some editors went through and deleted a massive number of useful Mac/Windows screenshots several years ago, but I don't think there's ever been an actual policy stating these screenshots were a legal issue.

Also, the GPL only covers code, not program output (under normal circumstances), so I don't see how it applies to screenshots anyway.


>the GPL only covers code, not program output (under normal circumstances), so I don't see how it applies to screenshots anyway.

It's an interesting question. The drawing of the window is, I would argue, indistinguishable from the code. It's like saying the HTML, CSS, and images on a site are GPL'd, but screenshots aren't. The rest of the program output is merely algorithmic, and thus not subject to copyright.


This discussion is wandering off into la-la land. The GPL isn't designed to cover things like websites or bitmapped images. There are other 'free' content-oriented licenses which should be used instead.


It doesn't matter, the application of the "rules" (read: guidelines) is so arbitrary and capricious as to be entirely meaningless and unpredictable.


Did you use the default upload form or the upload wizard? The wizard fixes many of the "missing license" issues by prompting you to select a license - but I forget if the GPL is included there.

I always forget those tags - which do need to exist to explain the license (I'll give them that) - and there is no obvious resource listing all the relevant templates.

No one cares when I raise those issues though; anyone able to make "decisions" was a newbie far too long ago to recognise the problem.


I wrote most of UploadWizard. It defaults to Creative Commons licenses, usually.

There is a way to use the GPL but you need to specify that you want a custom license, and enter it using wikitext. In this case the community's standard practice is to use the incantation "{{Free screenshot|GPL}}".
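
For the Golly case above, that works out to a file description along these lines - a minimal sketch in Python that just assembles the text (the {{Information}} fields and section headers are the usual Commons boilerplate; the values are paraphrased from this thread, not copied from the actual file page):

    # Sketch of the description text a (re)upload would need so the licence
    # patrollers are satisfied; field values are illustrative only.
    license_tag = "{{Free screenshot|GPL}}"

    description = "\n".join([
        "== Summary ==",
        "{{Information",
        "|Description = Turing machine built in Conway's Game of Life,"
        " screenshot taken in Golly (window chrome cropped out)",
        "|Source      = Own screenshot of the GPL'd program Golly",
        "|Author      = (uploader)",
        "|Permission  = See LICENSE.TXT distributed with Golly for GPLv2 license",
        "}}",
        "",
        "== Licensing ==",
        license_tag,  # the machine-readable tag; prose in Permission isn't enough
    ])

    print(description)

The point is just that the licence has to show up as a template the tooling recognises; a free-text note in the Permission field doesn't count.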

Unfortunately even UploadWizard doesn't help the problem here, because it doesn't help with image edits and what Wiki culture calls a "reupload". (This was a deliberate choice.)

For what it's worth, I care about the issues you raise. The real solutions require a total rethink about how Wikipedia keeps track of image metadata, though.

The community has different priorities at the moment, largely because they have inadequate tools to police the wikis, and this is why they get a bit trigger-happy on the revert button and have written a lot of bots to complain about less-than-ideal edits. Most of them value encyclopedia quality far more than abstract niceties like "openness". I see it as a matter of not having the right tools to do both, though.


Thanks for your hard work. I'll be sure to look into UploadWizard the next time I need to upload a picture.

To be honest, even after 5 years of editing I had to google how to upload it. Wikipedia editing in general is rather obfuscated, but I find the Commons tasks that much more so due to my unfamiliarity with them.


I'm just sorry that two years after I was hired, we still have basically the same problems, as evidenced by your comments. It's partially the nature of MediaWiki to be impervious to usability enhancement, but we also made some poor choices in how we attacked the problem. I have to take some of the blame myself there.


Ha! Yes, sorry I probably shouldn't have used such a broad brush.

Some people do care deeply.

I'm just frustrated with Commons at the moment for other reasons, and took the opportunity for a childish jibe.

(good work on the UploadWizard :D)


I'm glad that they're finally coming around to the realization that Wikipedia has become increasingly closed to new contributions, and that they've stopped touting the (patently absurd) hypothesis that new users just don't "get it." (The fact that they'd even think, let alone think first, to blame the users is just a giant head-scratcher).

As a simple UX experiment, I would ask new users this: try to contribute substantively to any article on Wikipedia. Just try it. Make a good-faith, high-quality edit to a page, and see how long the edit is allowed to stand. More likely than not, the contribution will be automatically reverted, within milliseconds, by a bot. If it's not, it'll be hand-reverted by a hardcore Wikipedia editor -- part of the statistically small, but disproportionately powerful cadre of self-appointed content cops, who seem to see their jobs as being bulwarks against change. In its zeal for the trappings of due process -- attributions, anti-"vandalism" policework, source checks, guidelines, and so forth -- this clique has lost sight of the net effect it's had on the site, which is to calcify and close off the free exchange of information that was so crucial to Wikipedia's early growth.

IMO, Wikipedia has faced a fundamental challenge in recent years: namely, that content-quality efforts have threatened new content volume. I don't envy this strategic predicament, being forced -- quite literally -- to choose between quantity and quality. It's not an easy balance to strike, and, given the circumstances, Wikipedia's historic track record is quite admirable. Recently, however, the balance has tipped too far in the direction of quality-policing. And now it's starting to undermine the core tenets of the project. I remain optimistic that Wikipedia (and/or the Wikimedia Foundation) can right the ship. But it'll have to mean a substantial uprooting of some bad seeds that have been allowed to take hold for years now.


I've done it. Without a doubt there is a high barrier to entry, but with anything you have to start small. Find articles that aren't fleshed out already. Find things that are entirely incomplete. Dive into those articles first, those are the ones that need attention, not Barack Obama's page. I started getting involved with wikipedia about a year ago. If you start small and create a history of trust, older users are more likely to trust and approve your edits.


I agree that you should start small. My contributions have been minor alterations to articles on obscure or esoteric topics. Like fixing a spelling error in the article for "A Pup Named Scooby-Doo." (Although that article is maybe more active than you'd think it would be.)


I'd argue something even worse than what you are positing. The folks who've stuck around and made contributing to WP nigh impossible are also the same ones who make strong statements about turning WP into what's essentially just a digital copy of a traditional encyclopedia, thus artificially constraining it in both breadth and depth even though the storage of the material (bits) is almost free, while its print cousins were constrained largely by cost and physical size.

WP is free of both of those constraints, so why are there legions of self-appointed bureaucrats trying to impose those constraints?


Because the articles have to be concise to be readable? For example, I just stumbled into the article "Tinnitus" (http://en.wikipedia.org/wiki/Tinnitus), which contains a "list of notable individuals with tinnitus" (given that tinnitus is common, this is analogous to a list of people who like the colour beige). It makes the (medical) article a bit longer, but crucially it fills the references section with totally irrelevant links.


What would argue against making it its own article? Instead of being deleted, these parts can just be split out - and storage is only getting more plentiful by the year.


A "good-faith, high-quality edit to a page" has no reason to be reverted automatically by bots or hardcore editors. If the content is encyclopedic and not controversial, it will be "wikified" by contributors that know more the syntax.


See other comments on this thread about fair-use image bots, which have caused considerable anguish on Wikipedia. They run fast, and template many people. Fair use is important, and there are many images without a compliant licence, but most people have no idea what counts as fair use.

EG: I have some toys. I take a nice photograph of those toys. Can I release that under a completely free licence and upload it to wiki? There's a famous statue in my home town. I take a nice photo of it, and release it under a free licence. Can I upload that to wiki? What about a building? etc etc etc.


For what it's worth, this is just restating some research that's been ongoing since at least 2008. I think the final nails in that coffin were established in the "Summer of Research" in 2011. There's no doubt that the decline in editorship is due to endogenous, not exogenous factors.

The WMF are not "finally" coming around to this realization; they are trying to convince the rest of the community to make it a higher priority, and to publicize the efforts they're making to the press.


> (The fact that they'd even think, let alone think first, to blame the users is just a giant head-scratcher)

Logically, I can see this. However, when you survey the world, you see a whole lot of blame directed at users. (With certain tech/business subcultures as notable exceptions.) I'll note that when our brains evolved, there was no software nor was there UX, and the only suitable targets for blame were people and critters. I'd bet this is a human cognitive bias.


Largely agreed, but if you'll permit me to get cute, I'd suggest that there's always been UX. Whether in city design, the evolution of tool usage, invention, product design, etc., UX is at the very heart of why we, as humans, always seek to improve upon our lot.

The difference with software is that, for the first time in our history, we're able to measure, isolate, quantify, and control the elements of UX better than we've ever been able to. (It helps that software is often experienced in isolation from its environment, so UI can be more closely correlated with UX than it is for other domains). UX was a fuzzy, ethereal, probably subconscious concept that only recently became a serious discipline. But it's always been important. And our brains have evolved to conceptualize it, albeit intangibly until now.


> Largely agreed, but if you'll permit me to get cute, I'd suggest that there's always been UX. Whether in city design, the evolution of tool usage, invention, product design, etc.

Permission denied. :) Everything but tool usage is in the realm of cultural evolution. For the simplest tools, most all of the error is user error. There's simply not so much functionality in a stick that isn't mostly dependent on user actuation.

> our brains have evolved to conceptualize it, albeit intangibly until now.

That's my point. UX flaws that are independent of user error have been largely intangible until now.


Define "cultural evolution," though. I'm not sure what you mean there. Not trying to be difficult, because I rather enjoy this conversation. Just unclear about the distinction you're drawing.

I'd argue that UX design -- even if it didn't have that exact name -- has been a distinct discipline long before software. Just ask anyone in the food service industries, the retail industry (department stores were basically innovations in UX in retail; so was IKEA), the casino gaming industry, the amusement park industry, and so forth.

Casinos, in particular, are fascinating UX case studies. The person who first thought of modern casino layout, comping free drinks at table games, oxygenating the gambling floor, removing clocks from the walls, comping rooms and other amenities for big spenders and regulars, which games to place adjacent to which others, etc., was a UX designer in spirit if not in title. And those decisions were pretty rigorously tested and quantified. These things may not meet the technical definition of UX as we commonly speak of it on HN, but they certainly hold with the spirit of the discipline Don Norman would later come to articulate as "UX."


Define "cultural evolution," though.

As for a definition, I'm talking about the evolution of individual behaviors as transmitted through culture. If one somehow rendered the entire human race sterile, but we continued to propagate ourselves for the next 2000 years through cloning, you'd still have "cultural evolution."

> I'd argue that UX design -- even if it didn't have that exact name -- has been a distinct discipline long before software.

Again, I don't disagree. That you bring this up indicates to me you've missed my point.

> food service industries, the retail industry..., the casino gaming industry, the amusement park industry...

All of these postdate most of the evolution of the human brain's structure and capabilities. It's somewhat true that there were "UX errors" before the stone age. I say "somewhat" because it's really hard to delineate these as entities without a certain degree of technology. When all you have are sticks, what is the error of the designer and what is the error of the user? Maybe the user's just "holding it wrong?"

It seems to me that we're likely to assign blame to sentient and animate entities. And even if the stick wasn't "whittled correctly" according to Thag, maybe it's just fine to Ookla? It's just hard to talk about "UX errors" as quantifiable entities until we get standardized production and large sample sizes.


If I understand your original point correctly, it's this: it is human nature to attribute most things to user error, ergo, my assertion that Wikipedia's blaming its users was a "head-scratcher" was off the mark. Our evolved inclination, which predates the discipline of even considering UX as a tangible -- and, more important, a controllable -- concept, is first to start with the hypothesis that the user is in error. (And, furthermore, that such a hypothesis is not necessarily unjustified by historical frequency).

I actually agree with you here, but I think this point and mine are not so much at odds, as they are orthogonal. My point is that, human nature or not, Wikipedia came about in the modern era. Even if our cognitive bias/inclination is toward blaming the user, we have tools and analytic frameworks at our disposal which exist precisely to allow a necessary check against our brains' heuristics. Those checks should have been run by the Wikimedia elite. While I'll admit that "head-scratcher" is an unfair description, rendered mostly for rhetorical effect, I believe my point still stands. We have modern tools at our disposal, precisely because we are now -- uniquely, in our history -- aware of our brains' strengths and weaknesses in pattern recognition and situational assessment.

If this is not an accurate summation of your position, then I'll freely admit that I'm missing your point.


No, that's it. Strangely enough, we were basically agreeing the whole time.


> Make a good-faith, high-quality edit to a page, and see how long the edit is allowed to stand.

I guess I haven't found this, but I admittedly don't try to edit [[George W. Bush]] or articles like that. The common case for me is crickets: I create an article on, say, a 19th-century German author who had a de.wikipedia article but no en.wikipedia one yet. Short article, maybe 2 paragraphs, one reference. The usual outcome is that I hear from nobody about the article ever again. No complaints, no praise, no reverts, no improvements, no anything at all. Exceptions: 1) it might get tagged as an "orphan" if nothing links in; and 2) some people may fiddle with the window dressing, adding/modifying categories and infoboxes and whatnot. Once in a long while someone will come in and expand it greatly, and that's usually positive.

My recent hobby has been articles about archaeological sites, and that has the same crickets-type feeling. So I'm wondering what new users are editing that they get any comment at all, much less angry ones! In my experience picking up a book and creating short new articles that reference the book is more than enough following-the-policies to avoid complaints about your articles, assuming it's some kind of legit reference book.

If anything, there are a lot of edits that should be challenged that aren't, with the bar for getting spam into Wikipedia not really that high. I know of at least one university that actually has a paid staff member creating total PR-puff-piece articles about that university's professors, and few of them get challenged, despite the fact that they read pretty much like a PR person wrote them, and are "cited" mainly to the subject's own articles and university press releases. Some academics edit Wikipedia solely to insert references to their own papers in articles. There are also rumors that publishing companies have people inserting references to the publisher's recent books, as a form of advertising rather than because they in good faith think it'll improve the article. Heck, check out the External Links section on popular vacation destinations; around 20-30% of them are full of travel spam, which isn't even that subtle, and yet manages to get in. Maybe all those false-negatives (not reverted) are worth not having more false positives, but it's a hard problem overall in both directions imo.

Really general articles, like [[global warming]], do have more fundamental problems, but I think some of them are just unavoidable. With anything controversial, there will be 50 different opinions about how the article should go, and the only way to reach a compromise is to discuss it and try to figure out how to best organize the material, split some material out to subsidiary articles, word controversial points delicately, etc. If someone who didn't read that discussion comes in and makes an edit, it's likely to be problematic in one way or another. I don't think that's even Wikipedia-specific; that's how things work when it comes to controversial subjects when people are writing academic review papers, subject-matter-specific encyclopedias, Linux kernel contributions, etc. Wikipedia's article on global warming is at least probably less bureaucratic to contribute to than the IPCC report is. ;-)

[edit: The above doesn't mean I don't think that there are problems with Wikipedia, which I do, but I think they're not quite as "it's a completely horrible community" as is often suggested. Some of the things, like reaching consensus on a controversial subject, are just inherently hard; other things that are broken for no good reason should indeed be fixed, others depend strongly on what kinds of articles you create, etc. Overall Wikipedia's newbie-friendliness actually seems pretty good relative to other collaborative projects I've been involved in, though.]


> Really general articles, like [[global warming]], do have more fundamental problems

As a wikipedia user (edited only typos & punctuation), I don't see these problems. I think http://wikipedia.org/wiki/Global_warming is very informative and reasonable, more so than any other page I can find immediately with google. The talk page is also very good.

Can you be more specific about the problems you think it has? I have trouble seeing wikipedia as anything other than a mind-bogglingly amazing 'shit i woke up in the future' thing :-)


When it first started you could have written an article on cars or some other easy topic that a lot of people know about. Now the topics that are left are insane. (While there is still some easy stuff left, there isn't nearly as much.)

Because of the nicheness of the topics, it will get increasingly difficult to write on them.


Not really; the niche topics are the easiest to work on for the most part, because you're unlikely to run into anyone else.

The hard topics are:

- traditional conflict areas (Religion, Politics, Race)

- broad-level topics (I tried to work on Computer once, bad mistake...)

- Current events

There is an awful lot of low-hanging fruit ready to be plucked; but people aren't really interested in it.


It's good to see real data to address this question, rather than the neverending stream of basically anecdotal information about the problem that we've been slinging around. It's especially interesting to see the percent of not just "good faith" (flawed but well-intended) edits that are reverted but also the percent of "golden" (actually contributing) edits that are reverted; this sort of hostile drive-by is discouraging even for experienced editors.

One thing I see in the graphs is that the "survival rate" of editors who made a good-faith first edit was already in relatively steep decline by 2005, but the same rate for editors whose first edit was golden remained on a high plateau through 2006 and then just categorically dropped off a cliff. What happened then?


In 2006, bots were introduced to do various clean up tasks automatically. There are currently over 800 registered bots on Wikipedia. They are generally made to enforce the rules. But if you need 800 different bots controlled by 800 different people to enforce the rules, that means you have a lot of obscure rules.

For instance, there is a list of domain names that wikipedia often sees as spam. So if an innocent user goes to edit a page and wants to add a new piece of information referencing a domain on that list, a bot will come along and mark it as spam (and revert it) even though it's a good-faith edit.
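
To make that concrete, the core of such a bot boils down to something like this - a self-contained sketch, not any real bot's code, with an invented blacklist:

    import re
    from urllib.parse import urlparse

    # Hypothetical blacklist; the real one is a long, community-maintained list.
    SPAM_DOMAINS = {"example-pills.com", "cheap-watches.example"}

    URL_RE = re.compile(r'https?://[^\s\]|<>"]+')

    def added_links(old_wikitext, new_wikitext):
        """External links present in the new revision but not in the old one."""
        return set(URL_RE.findall(new_wikitext)) - set(URL_RE.findall(old_wikitext))

    def should_revert(old_wikitext, new_wikitext):
        """Flag the edit if any newly added link points at a blacklisted domain."""
        for url in added_links(old_wikitext, new_wikitext):
            host = urlparse(url).hostname or ""
            if any(host == d or host.endswith("." + d) for d in SPAM_DOMAINS):
                return True, url
        return False, None

    # The failure mode described above: a good-faith citation to a listed domain
    # looks exactly the same, to the bot, as link spam.
    old = "Some article text."
    new = "Some article text, now with a source.[http://example-pills.com/study]"
    print(should_revert(old, new))  # -> (True, 'http://example-pills.com/study')

A domain match is all the bot sees; it has no way to tell a careful citation from spam.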

It can be frustrating and confusing for new users. The bot is often quite terse and rude, and so there is no incentive to return to wikipedia to make edits. 9/10 new edits are rejected, so no wonder they can't keep new editors.


I agree that it's nice to see hard data on this but

"the neverending stream of basically anecdotal information"

should be the first clue that this is basically what everybody except the foundation seems to already know. This is not even remotely a new issue, as their data and the numerous complaints, articles, comments, etc. show, but it's taken them more than half a decade to even bother to notice it.

Here's from 2010:

http://ostatic.com/blog/is-wikipedias-deletionism-out-of-con...

2009

http://schott.blogs.nytimes.com/2009/11/26/deletionists/

http://blogs.telegraph.co.uk/technology/shanerichmond/100002...

2008

http://news.slashdot.org/story/08/09/21/2342226/debating-del...

2007

http://www.roughtype.com/archives/2007/08/rise_of_the_wik.ph...

and on and on and on

Here's foundation's response during all this

http://news.bbc.co.uk/2/hi/technology/8382477.stm

It's only in the last 2 years that the foundation has even bothered to acknowledge there's a problem, and only after it became so bad that it couldn't be ignored.


Twinkle? Which I think was introduced at the end of 2006 or the beginning of 2007.

The Counter Vandalism Unit was created in late 2005, so maybe it got traction during that time and over-zealous members templated the newbies?


"What this means is that while just as many productive contributors enter the project today as in 2006, they are entering an environment that is increasingly challenging, critical, and/or hostile to their work. These latter findings have also been confirmed through previous research."

This confirms to me what I argued on HN last year: http://news.ycombinator.com/item?id=3272926 and http://news.ycombinator.com/item?id=3273204


NPR's Talk of the Nation recently had a segment devoted to the bureaucratic run-around that can happen on Wikipedia.

http://www.npr.org/2012/02/22/147261659/gauging-the-reliabil...


This is a great example of my complaints elsewhere in this thread, thanks for the link.

So listening to this show, there's an expert guest, a couple of call-ins (including another expert guest), and even a caller who talked about how important the crowdsourcing is, who all more or less say the same thing I'm saying: trying to contribute to wikipedia is a pain because of overzealous editors' arbitrary and capricious application of largely nonsensical rules. The foundation representatives? Regurgitation of the guidelines and non-answers. Then lots of lip service to the importance of crowdsourcing an encyclopedia, without acknowledging that the crowd is turning into a very small gang, and "let's make new contributors feel more welcome" without any particular ideas for how to do it.


This ignores the fact that the standard for what is considered a good edit has risen dramatically over time. Most of the featured articles from the early years of the site no longer even meet that standard. There are 3,500 featured articles, and almost 1,000 that have become unfeatured as standards have risen.


No, they used the same method to judge edits over the entire sample space.

> To test the hypothesis ... we randomly sampled the first edits of newcomers to the English Wikipedia from the earliest days of the project to the present. With the help of some experienced Wikipedians, we hand-categorized the edits of 2,100 new users according to a four point quality scale


That only shows that the edits have been roughly the same quality over time; what I'm saying is that edits of the same quality are now judged much more harshly than they used to be. For example, six or seven years ago, if you wrote something that was true but unsourced, or the citation was formatted incorrectly, the text would generally be left up and a higher-level editor would get around to hunting down the proper citation eventually. But today these kinds of edits get instantly reverted, even though they create value. The issue is that because so much of Wikipedia is already properly cited, adding new text that is correct but not properly cited temporarily lowers the overall quality of the encyclopedia, even though it would ultimately create value if left up and fixed, so essentially what we have now is one of those game theory failures.

This really isn't that hard to figure out and all they'd have to do is slightly tweak the design to fix it, so I don't really see why they're having so much trouble on this one.


Is it possible that the opportunities to make quality edits have decreased as wikipedia has matured?

What I have in mind is the possibility that topics that haven't been well written out are like low-hanging fruit and are easy to positively contribute to. As wikipedia has expanded, that low hanging fruit has disappeared to some extent, and thus new editors have fewer opportunities to actually provide quality edits.

This explanation does not place blame on either newbies or pre-existing users who are "content cops". Instead, it proposes that the difficulty of providing a "golden" edit increases as we move forward in time. If that's a problem (debatable, since it means wikipedia as a product is better than it was previously) then it strikes me as a difficult one to solve.


Those were basically my thoughts before reading this article. However, while it might be that it's more difficult to create a good edit, that still shouldn't lead to more of the good edits that are made being reverted, as the data shows is happening.


Like a lot of people here, I have sporadically read Wikipedia for years. I came on board as a Wikipedia editor in May 2010 after meeting the project's co-founder and his family in person the year before. I've since seen some of those immediate family members on another lengthy occasion. My children regularly interact with that family in an online education community. Through a web of mutual friendships, I thought I had some sense of what the community norms would be like on Wikipedia before I started. Moreover, I began editing only after reading the several published books about Wikipedia available at that time (as disclosed on my Wikipedia user page), and came on board as someone who has actually had both academic journal editing positions and paid journalism editing positions before Wikipedia even existed.

Even at that, I get a lot of well sourced edits reverted by ideologically motivated drive-by I.P. editors as part of an ongoing process of edit-warring on articles that I happen to know sources for.

http://en.wikipedia.org/wiki/Wikipedia:Arbitration/Requests/...

For some controversial topics, no amount of good-faith editing by editors who actually know how to look up reliable sources and how to have civil discussions of controversial issues can overcome the flood of point-of-view pushers (both I.P. editors and registered editors who are sock puppets or meat puppets of previously banned editors) who want to drag down the project to below the level of a partisan blog. There simply isn't any incentive in today's atmosphere on Wikipedia for readers who actually know what encyclopedias look like and who have actually engaged in careful research on controversial topics to devote any of their time and effort to Wikipedia.

My number of edits per month has plummeted, and mostly I wikignome to clean up copyediting mistakes on miscellaneous articles written by young people or foreign nationals who didn't write grammatical English in the last revision of the article. The way to increase participation by productive, knowledgeable, literate editors is to drive away the ideologues and enforce some reasonable behavioral norms on article talk pages and user talk pages. I see no sign of that happening over at Wikipedia, and until I do, I will heartily support anyone's effort to build a competing resource, either limited to a specialized topic or a direct attempt to build a better quality general online encyclopedia.

I think the "Lamest Edit Wars" page in project space sums up much of what is amiss about Wikipedia.

http://en.wikipedia.org/wiki/Wikipedia:Lamest_edit_wars


I was surprised to see that “Assume good faith” was a principle of Wikipedia - it doesn't match my experience with the community at all.


There is a set of knowledge in the world. It is vast, but much of it has already been contributed to wikipedia. The knowledge of the world continues to grow, but I don't think at the rate that it is contributed to wikipedia.

Contributors are really mining the collective knowledge of the world, and so it is running out. So, similar to mining a nonrenewable resource, the cost of extraction goes up with time: http://desmond.imageshack.us/Himg521/scaled.php?server=521&#...

With the ease of making significant contributions dropping off, you would expect to see a drop-off in new users (they really don't get rewarded enough for their efforts). Back in the day, you could contribute something meaningful that you know to wikipedia. Most likely now, someone has already put it in.


Thanks wikimedia for once again providing a detailed analysis of something that's been absurdly obvious for the last 5 or 6 years - "However, the rate of rejection of all good-faith new editors’ first contributions has been rising steadily, and, accordingly, retention rates have fallen."

And so the conclusion? "At the Foundation level, this includes major software changes like the creation of a visual editor to lower the technical barrier to entry..."

Which completely doesn't address the human issue of entrenched editors who reject perfectly good edits by new users and delete perfectly good content. Nothing about that problem is addressed by making the editing GUI simpler.

Useless.


What a nice bit of research and summary presentation.


Oh, the good old days. I went and inserted some Irish Evil into Wikipedia in commemoration.



