"My most surprising discovery: the overwhelming importance in business of an unseen force that we might call 'the institutional imperative.' In business school, I was given no hint of the imperative's existence and I did not intuitively understand it when I entered the business world. I thought then that decent, intelligent, and experienced managers would automatically make rational business decisions. But I learned over time that isn't so. Instead, rationality frequently wilts when the institutional imperative comes into play.
For example: (1) As if governed by Newton's First Law of Motion, an institution will resist any change in its current direction; (2) Just as work expands to fill available time, corporate projects or acquisitions will materialize to soak up available funds; (3) Any business craving of the leader, however foolish, will be quickly supported by detailed rate-of-return and strategic studies prepared by his troops; and (4) The behavior of peer companies, whether they are expanding, acquiring, setting executive compensation or whatever, will be mindlessly imitated.
Institutional dynamics, not venality or stupidity, set businesses on these courses, which are too often misguided. After making some expensive mistakes because I ignored the power of the imperative, I have tried to organize and manage Berkshire in ways that minimize its influence. Furthermore, Charlie and I have attempted to concentrate our investments in companies that appear alert to the problem."
"As companies get larger and more complex, there’s a tendency to manage to proxies. This comes in many shapes and sizes, and it’s dangerous, subtle, and very Day 2.
A common example is process as proxy. Good process serves you so you can serve customers. But if you’re not watchful, the process can become the thing. This can happen very easily in large organizations. The process becomes the proxy for the result you want. You stop looking at outcomes and just make sure you’re doing the process right. Gulp. It’s not that rare to hear a junior leader defend a bad outcome with something like, “Well, we followed the process.” A more experienced leader will use it as an opportunity to investigate and improve the process. The process is not the thing. It’s always worth asking, do we own the process or does the process own us? In a Day 2 company, you might find it’s the second."
It's not immediately clear to me, and as such makes an otherwise good-looking argument feel less "solid".
"Amazon.com passed many milestones in 1997: by year-end, we had served more than 1.5 million customers, yielding 838% revenue growth to $147.8 million, and extended our market leadership despite aggressive competitive entry.
But this is Day 1 for the Internet and, if we execute well, for Amazon.com."
Bezos has attached the 1997 letter to every letter to the shareholders he has written since. There is even a building in Seattle named Day 1.
 https://www.amazon.com/p/feature/z6o9g6sysxur57t (bottom half of the page)
You'll notice for example that even my initial comment pointed the guy to the answer (i.e.: it's in the third paragraph of the link).
Now, the two requirements aren't opposed; someone could have made a comment like this: I like Bezos's notion of a Day 2 company, how it differs from Day 1, etc.
Basically, I like people to show that they aren't asking someone else to do the work for them (or, if they are, at least they made some attempt).
That's at least as reasonable a position as the one you just expressed IMO.
"In particular, their poor handling of software development has been well known for many years. The answer to the WMF's problems with software development has been well known for decades and is extensively documented in books such as The Mythical Man-Month and Peopleware: Productive Projects and Teams, yet I have never seen any evidence that the WMF has been following standard software engineering principles that were well-known when Mythical Man-Month was first published in 1975. If they had, we would be seeing things like requirements documents and schedules with measurable milestones. This failure is almost certainly a systemic problem directly caused by top management, not by the developers doing the actual work."
"This is not to imply that decades-old software development methods are somehow superior to modern ones, but rather that the WMF is violating basic principles that are common to both. Nothing about Agile or SCRUM means that the developers do not have to talk to end users, create requirements, or meet milestones. In fact, modern software development methods require more communication and interaction with the final end users. Take as an example the way Visual Editor was developed. There are many pages of documentation on the WMF servers and mailing lists, but no evidence that any developer had any serious discussions with the actual editors of Wikipedia who would be using the software. Instead, the role of "customer" was played by paid WMF staffers who thought that they knew what Wikipedia editors need better than the editors themselves do. Then they threw the result over the wall, and the community of Wikipedia editors largely rejected it. Or Knowledge Engine, which was developed in secret before being cancelled when word got out about what the WMF was planning. Another example: The MediaWiki edit toolbar ended up being used by a whopping 0.03% of active editors."
Sure, such a company would have no chance of succeeding in the stock market, but what if it never plans on IPOing in the first place?
Buffett's core complaint is about companies not making enough profits due to their profligate spending on other things. A profit cap would not have the impact you're hoping for here. :)
> such that all net revenue in excess of some fixed amount is forced by the corporate charter to be turned into either employee compensation or stockholder dividends.
By definition, only profits can be paid out as dividends, so again, a profit cap would prevent the thing you're trying to boost.
Er, sorry, I rather meant that profits after dividends would be capped. You can do anything you like with the money—other than keep it in the corporate coffers (or in commercial paper or anything else that's still liquid assets on the balance sheet.)
But yes, you're still right, it wouldn't have the correct effect.
How about a sliding window cap on non-compensation spending, together with a sliding window cap on headcount? So, in a year with record 10x net revenue, you would be allowed to 10x your salaries/bonuses/dividends, but you wouldn't be allowed to multiply your capital costs, or "new" labor costs, by more than, say, 1.3x. (And keep the fixed profit-after-dividend cap, because otherwise the corporation would just hold all its money in the bank until the sliding window grew enough to let it spend it.)
Also, presumably you're talking about new laws in the US here? What stops businesses from just moving their HQ overseas to places with different laws they like better?
That seems likely to cause the exact _opposite_ effect of what you intend. Apple is sitting on a pretty big cash pile right now. If the only legal options they had were to invest it or return it as dividends, the original article's premise is that they would find ways to invest it (buying factories in China, bringing app development in-house, etc) rather than risk the shrinking of the bureaucracy.
The problem seems doubly bad with Wikipedia and other non-profits. The problem isn't that they have too much profit. It's that they are growing their expenses to match income (rather than capping expenses at some "rational" level and trying to maintain enough income above it). In theory, in for-profit companies shareholders will start to complain if expenses increase too high -- it's not clear who would make the same complaint at Wikipedia (other than the original article).
Trying to come up with piecemeal regulatory "solutions" like this is like trying to invent a perpetual motion machine by coming up with increasingly more complicated contraptions. There are fundamental, local interactions that you have to contend with that basically rule out accomplishing what you want.
Everything from the growth of the welfare state, to populist revolutions to put socialists into power, can be blamed on how common this misconception is.
The problem with applying top-down planning to economy is not that it doesn't work; it's that it optimizes for the goals of the few doing the planning, leaving large swaths of people struggling with the awful conditions created by not taking their goals into account in the plan.
In other words as long as you don't try to plan their activity. What you're describing is how more free market oriented economies work: with a central authority that has the power to force everyone to comply with its plans, but that deliberately chooses to mostly not exercise this power, in order to maintain a free market.
Central economic planning is better than the alternative (anarchy) only when addressing externalities that market forces do not address. So for instance providing a system of law, a national defense and basic scientific research. Even with all of the inefficiencies of top down planning, it still remains the only way to optimize resource allocation for the production of public goods.
> In other words as long as you don't try to plan their activity.
That's not what I said, and the two ideas are not equivalent. Agents may be allowed to adapt locally only within certain limits, giving an appearance of free will, while still staying within planned parameters and thus producing a controlled outcome. The system then converges to a state that complies with the centrally stated goals. The key here is that individual agents are only allowed some local improvements, but they're prevented from any improvement to themselves that would damage the global optimization goals. See local search algorithms, for example variable neighborhood search.
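To make the constrained-local-search analogy concrete, here is a toy sketch (all names and numbers are hypothetical, not drawn from any real planning system or from variable neighborhood search specifically): an agent hill-climbs on its own utility, but the "planner" vetoes any move outside a fixed interval, so the agent converges to the planner's boundary rather than to its own unconstrained optimum.

```python
import random

def constrained_local_search(utility, lower, upper, x0, steps=10_000, step_size=0.1):
    """Hill-climb on the agent's own utility, rejecting any move that
    leaves the centrally imposed interval [lower, upper]."""
    x = x0
    rng = random.Random(42)
    for _ in range(steps):
        candidate = x + rng.uniform(-step_size, step_size)
        # The "planner" vetoes moves outside the permitted neighborhood.
        if not (lower <= candidate <= upper):
            continue
        # The agent keeps only local improvements to its own goal.
        if utility(candidate) > utility(x):
            x = candidate
    return x

# An agent that privately wants x as large as possible still ends up
# pinned at the planner's ceiling of 1.0, not at its own optimum.
best = constrained_local_search(utility=lambda x: x, lower=0.0, upper=1.0, x0=0.5)
print(round(best, 2))
```

The agent has "free will" at every step, yet the outcome is fully determined by the bounds the planner chose.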
The central authority sets very strong limits on what can and cannot be done, and enforces them strictly. It also sets up economic incentives, goals, rewards and punishments over the "free actors", like convincing workers that they need to program their lives around continuous learning and adaptation, jumping jobs every X years, and discouraging people from relying on the existing public education, healthcare and public retirement plans, which were the goals of the previous central plan, now deprecated. Anyone willingly failing to comply with the new program finds themselves quickly pushed out of the system (but it was "their own fault", of course, for not "behaving rationally").
That's how the current Western world economic forces operate, and it's quite different from "mostly not exercising its power"; if that's a free market, it looks a lot like "forcing everyone to comply with its plans".
I haven't seen any evidence that this is an effective way of organizing an economic system, at least relative to the free market.
Modern western economies have grown increasingly stagnant as the level of central planning has increased since 1960. Contrary to your claim that "public education, healthcare and public retirement plans" have steadily been deprecated, the raw statistics show that government social spending in a major Western economy, the US, increased, on average, 4.8 percent per year between 1972 and 2011, after adjusting for inflation. This represents a massive shift to more central economic planning. And this shift has been associated with reduced rates of improvement in all metrics of economic development.
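As a quick sanity check on the compounding implied by that figure (a back-of-envelope sketch; the 4.8% rate and the 1972-2011 window come from the comment above and are not independently verified here):

```python
# 4.8% real growth per year, compounded over the 39 years from 1972 to 2011,
# multiplies inflation-adjusted spending roughly sixfold.
multiple = 1.048 ** (2011 - 1972)
print(round(multiple, 1))  # ~6.2x
```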
That's not really it either. If you have a business which is generating high profits, it will have a high market cap. You can reduce it by transferring the corporation's liquid assets to shareholders, but if you've transferred the liquid assets and you're still above the limit, transferring non-liquid assets to shareholders is clearly the wrong direction.
What you really want is for separate businesses to be separate companies. Conglomerates are inefficient. Don't expand into new markets, just give the shareholders their money. The shareholders can invest it in the new markets if they want to.
The main impediment to this is the tax code, because if you give the shareholders the money it gets taxed -- twice -- but if you spend it internally or leave it inside the corporation in an offshore subsidiary, that doesn't happen.
Wikipedia gets to the same place via a different route. A non-profit doesn't have shareholders to pay dividends to, so instead of paying dividends being discouraged by the tax code, it just isn't possible, and you get the same results. The website generates more revenue than it actually needs to operate and the rest of the money has to go somewhere, so it goes somewhere inefficient as determined by internal politics.
It's interesting that you say that on a subthread discussing something Warren Buffett said. Berkshire Hathaway became an efficient conglomerate by acquiring and operating companies that eschewed the very principle we're discussing.
Dividends are nice, but dividends would stay small if the company itself stays small (because it's constrained from growing.)
(I'm not equipped to evaluate whether or not that point is true, especially in the case of any specific company. Just trying to point out that the argument you're making here kind of begs the question.)
Look at utilities and regional banks, which are usually priced based on return on assets or operating margins.
Growth oriented companies have advantages and disadvantages. Microsoft is a great example -- they have a business that's a monopoly cash cow, but they also need to hit high growth rates to prosper. They squandered a decade trying to both grow and milk the cow, and are now growing again, while breaking and eventually losing parts of the legacy business.
That is basically what you are doing when you leave money on the table. Opportunity costs are still costs.
For about 7 years, I was collecting an effective 10% dividend while getting significant capital appreciation as well.
Likewise, as part of my diversified retirement portfolio, I have a portion in boring dividend paying stocks that generate moderate returns and don't get punished as severely during bad market conditions.
Is there some known principle that governs this, something that could be named along the lines of Parkinson's Law or the Peter Principle?
Anecdotally, I noticed that even with 5 people you usually have someone starting to act as an ad hoc part-time manager, and this role quickly becomes too much for a part-time responsibility as the team grows.
you don't need to create a middle management structure from the get go, just have a chain of command.
Any project with more than one team already has a middle management structure of sorts - you've got the CEO / CTO, then the team leaders (scrum masters, or just whoever's the loudest). There's your middle management layer already.
Elinor Ostrom received a Nobel Prize for researching how societies develop structures to manage commons sustainably. Unfortunately, her conclusion was that there is no default solution.
I theorize that it is related to size/scale, which disassociates causes and effects across time and groups.
My solution is to keep companies as small as possible.
There are also plenty of counterexamples in corporations and governments throughout history where the size of the organization has not affected its ability to achieve its goals or compete with smaller organizations. Therefore, I think the issue and solution lie elsewhere: somewhere between accountability, culture, and trust.
That's not to say I disagree with you completely: large corporations can achieve things small ones cannot (especially as you move out of software) but I've seen plenty of talk around accountability, culture and trust, and very few results.
Sure there's plenty of talk around accountability and proper management without much substance and I don't claim I have any specific solutions in mind to the problem. I think keeping organizations small (i.e. tribalism) is one hack that may help, but comes with different considerations as I mentioned.
I ask because it's a large work, but I'd read it if it was the latter.
> "In this work, Kropotkin points out what he considers to be the defects of the economic systems of feudalism and capitalism, and how he believes they thrive on and maintain poverty and scarcity, as symbol for richness and in spite of being in a time of abundance thanks to technology, while promoting privilege. He goes on to propose a more decentralised economic system based on mutual aid and voluntary cooperation, asserting that the tendencies for this kind of organisation already exist, both in evolution and in human society. He also talks about details of revolution and expropriation in order not to end in a reactionary way."
Usually long-term initiatives take a lot of domain knowledge and a specific set of connections. When something new becomes more important, the people running them risk losing their personal position, since they may not be the best person in terms of knowledge and connections to carry out the new direction. Instead, they try to use their accumulated power to keep their direction supported, even when it's not in the interest of the business, for as long as they can.
To address this, perhaps there's merit in rewarding timely exits and somehow punishing people that have dragged things out. However the most that companies can do is usually just fire someone. And at that point they have probably sucked the company dry of the value they can personally extract.
It would be awfully surprising if virtually all companies had the same problems due to smart people suddenly behaving irrationally in the same way. It's much less surprising that smart people behave rationally in their own interests, but that those interests regularly fail to align with overall company interests.
And I think the pattern you describe is even mirrored on an institutional level. People are quick to mock Kodak for ignoring the rise of digital cameras, but they tend to undervalue the amount of risk that pivot would involve. If you're a world leader in product X, and that gets replaced by product Y, pivoting destroys the value of your expertise and puts you at risk. It's unpopular but perhaps sensible to settle on winding down profitably (the institutional version of a timely exit) instead of pivoting.
The basic takeaway is that these behaviours and forces are natural, intrinsic and inevitable; that a wise systems-manager will seek to carefully manage them rather than oppose or decry them.
For example, there is - in my opinion - a low chance that the Gates Foundation will expend its resources within 20 years of the death of the last of the two.
They are likely to have a total equivalent to (in present dollars) perhaps $250-$300 billion to get rid of in the next 40-50 years. I'd be skeptical they can get rid of it that fast in their model. They're eroding that mass of capital (presently near $190B) so slowly that by the time Gates is 80, they'll probably still be dealing with $200 billion in today's dollars.
A succession of massive, but formally time-limited foundations, administered in lockstep by dynasties of custodians would add a very interesting touch to any future scenario of fiscal neo-feudalism.
"We face another obstacle: In a finite world, high growth rates must self-destruct. If the base from which the growth is taking place is tiny, this law may not operate for a time. But when the base balloons, the party ends: A high growth rate eventually forges its own anchor."
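Buffett's point is just compound arithmetic; a toy calculation (all figures below are round illustrative assumptions, not data from the letter) shows how quickly a high growth rate from a large base collides with the size of the whole economy:

```python
# A hypothetical $200B company growing 20%/yr inside a world economy of
# roughly $80T growing 3%/yr: how long until the company IS the economy?
revenue = 200e9
world_gdp = 80e12
growth, gdp_growth = 0.20, 0.03

years = 0
while revenue < world_gdp:
    revenue *= 1 + growth
    world_gdp *= 1 + gdp_growth
    years += 1
print(years)  # ~40 years under these toy assumptions
```

Well before that point the growth rate has to collapse - the "anchor" the quote describes.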
Everyone is conservative about what he knows best.
Any organization not explicitly right-wing sooner or later becomes left-wing.
The simplest way to explain the behavior of any bureaucratic organization is to assume that it is controlled by a cabal of its enemies.
If you transform Conquest's rules from politics to running (ruining?) Wikipedia, the transformed rules are:
"We can change nothing not even our exponential growth spiral, not our policies, nothing"
"LOL We're not doing the fiscal conservatism thing. I like how the current top discussion is about popularity and the need for a circular firing squad, not something financial. A direct quote of an attempt to avoid working on the issues "I'm reminded of the inflammatory, low-rent campaign of Donald Trump." Yeah buddy that'll fix it, that'll fix it real good."
"We're headed off the financial cliff now get out of the way I'm going gas pedal to the floor as you can see in the exponential graphs. The problem is we're a CRUD app and that's cutting edge CS just like quantum computing so naturally there's no possibility of criticism there. After all, the Egyptians didn't have flush toilets and they built a pyramid, so any criticism of the toilet in my bathroom is either making fun of the entire Egyptian culture or pretending the pyramids were not a logistical challenge."
There is some humor in that the world of paper encyclopedia publishing ran on mostly capitalist operational principles for decades, centuries. It turns out that running an online encyclopedia off donations and extreme hand waving is powerful enough to destroy an industry on its way to its inevitable collapse. Maybe someday in the future we'll have encyclopedias again, but the era after wikipedia and before the next encyclopedia will be a bit of a dark age. That's too bad.
Also, what scenario are you talking about when you say "after wikipedia"? There are plenty of copies of it on the internet, so the data won't suddenly vanish, and wikis don't suddenly stop existing if the Wikimedia Foundation implodes.
But overall this oped is misplaced. Running the leanest possible operation shouldn't be Wikipedia's focus at this stage in its lifecycle, it's improving the quality of its content.
Back in 2005 Wikipedia had 438k articles and the focus was expanding the reach of its content to cover all topics; today the article count is 5.4 million, and it's quality that matters more. You can't improve quality based on crowd-sourcing alone (see: Yelp, Reddit, etc.), and the bigger it's gotten, the more of a target it has become for disinformation activists.
This attitude on budgets over value strikes me as a classic engineer's POV. The OP is nostalgic about a time when the site was run by a single guy in his basement, but could 1 guy handle the assault of an army of political zealots or Russian hackers? DDoS attacks? Fundraising? Wikipedia is arguably one of the most coveted truth sources the world over, protecting and improving its content is more important than an optimal cost-to-profit ratio.
If the OP has specifics, by all means, share them, but this kind of generalized fearmongering about budgets isn't spectacularly useful, IMHO.
That's not what I got at all. And that's why the article is interesting.
Wikipedia's funding is what's growing unsustainably. It's higher funding that's pushing the costs higher. And that's what makes it interesting (and only a little click-baity.)
It seems, having taken people's money for a charity, you have a moral obligation to spend the money on the charity, whether it needs it or not. And as a manager of said charity, it's very easy to believe (or to convince yourself) it needs the money. Or otherwise why were we making plaintive pleas for money?
(And that happens in a world of good intentions. When fundraisers become cynical, you end up with the US political outrage machine, which operates simply to raise money rather than to effect political change....)
From the OP: "After we burn through our reserves, it seems likely that the next step for the WMF will be going into debt to support continued runaway spending, followed by bankruptcy."
If it was just about wasting donations, they'd never go into debt. It's costs, and specifically costs-to-income ratio he seems perturbed about.
The point isn't that the donations are wasted, its that in spending them, you create an organism that needs to be fed.
So if donations don't continue to grow to match or at least keep pace, they could start running a deficit to eat away those reserves in no time. And once a non-profit organization starts running at a deficit, some contributors will question their contributions and they may shrink accordingly.
Those reserves could disappear in just a few years. Unless there's a change, two years should show the direction, and in another couple of years the course will be set one way or another.
And your evidence for this is...?
I've worked in the nonprofit space for 10 years, and nonprofit leaders, like their for-profit counterparts, cut costs when they're facing a deficit.
If costs go up at WMF they'll either raise revenue or cut somewhere, like virtually any other mature business.
That's the problem. A non-profit is not a "mature business" because it's not a business at all. In fact, in most non-profits any effort to "run it like a business" will be met with opposition. Once again, for evidence look at any government agency, budget, etc.
But even if it were a "mature business", the idea that any organization knows when and how to cut costs after it's past its prime is dubious at best. For evidence, look at any company bankruptcy. Their costs outpaced their revenue and they didn't react fast enough or in the right ways.
And that's assuming there were "right ways" to react.
As far as cutting costs and institutional maturity: sure, you can look at any company bankruptcy. You can also look at the millions and millions of non-bankrupt companies, and realize that you're overgeneralizing.
Or as the top commenter put it, "the institutional imperative" - https://news.ycombinator.com/item?id=14287430 - as described by Warren Buffett.
Is this some kind of Freudian slip?
That's backwards though. It's not unusual for a non-profit to run a deficit. A donor will question a non-profit that's running a surplus - why am I giving you money you don't need.
Once invoices or paychecks are delayed, people stop and question why the leadership is "making so much money" and the ROI on galas and events. And once a few donors see the deficits, the questions get harder and money slows down, making the next round of invoices a little harder.
But there's no debt. The whole argument is pure conjecture based on imagination. Using it as if it were a fact that proves something makes little sense.
I believe the author's thesis is that by the time "it's not currently a problem" is no longer an argument that makes sense, it will also no longer be possible to effectively correct the WMF's course in a way that will solve it.
I'm not sure I have any idea how to effectively determine if the author is correct about that, but certainly I don't think "it's not currently a problem" actually contradicts anything he's saying.
The point is that costs will continue to rise (or not fall) after the funding inevitably falls (or fails to rise enough).
The table in the article suggests it's growing sustainably, as assets are increasing and revenues exceed expenses. The whole unsustainability hypothesis seems to be based on one metaphor "if it's growing, then it's cancer, ergo it is deadly, ergo it has to be stopped".
The 1,250x cost growth seems like the heart of the article - it's a claim that the growth is disproportionate to, and unmotivated by, actual value. Since WMF is funded as a charity, not a business, the revenues are 'sustainable' based on what people want to give. So if revenues exceed expenses, that says people like WMF and will give when asked. It doesn't say anything about whether the expenditures are justified.
I'm not sure I accept the thesis of the article, but I think it deserves deeper consideration. The bankruptcy line seems overwrought, but there's no intrinsic reason that high (charitable) revenues prove high expenditures justified.
This claim is largely unsubstantiated, as "number of pages in Wikipedia" is not a good measure of the value of the whole project, especially when you take it naively as the only parameter that must numerically match expense figures in linear proportionality. I think such a measure is nonsensical.
> It doesn't say anything about whether the expenditures are justified.
How do you define "justified", then? There's an extensive process for financing and planning projects. The article author seems to either be completely unfamiliar with this process, or to ignore it altogether, focusing only on one measure, which is the number of pages. I do not see this as a good argument. Of course, the concern "do we adequately spend the money" is valid - but this concern is known, and constantly on the radar, without alarmist articles. The contribution of this particular article seems to be null - the valid concerns it raises are already known and accounted for, and the new ones are not valid.
It is clearly a poetic metaphor - no one clicking on the article seriously believed that Wikipedia has a literal biological cancer (the "cancer" metaphor is hardly a new one; if any criticism can be made here, it would be that it is verging on cliche). Indeed, the entire article is structured around this metaphor - the title is hardly false advertising.
I disagree strongly with this new obsession that every article title and headline must be written as pure "Man Bites Dog" factual summary, particularly for opinion pieces like this. Surely there is room for some attempt at poetic flair.
(A hypothetical example of a real "clickbait" style headline for this article might be "Google Will Buy Wikipedia").
You forgot the obligatory "this" in the title. Better would be: This Is Why Google Will Buy Wikipedia
Please note that https://en.wikipedia.org/wiki/User:Guy_Macon/Wikipedia_has_C... is the original essay, and the version in the Signpost has been modified at the request of the Signpost editors.
The thing is that there doesn't really seem to be any REAL alternatives challenging Wikipedia in an honest, high effort way. As a wiki/knowledge fan myself, I've gone to different sites with different takes such as Quora (which is fantastic, can't really say it's a real competitor though), Genius.com (which is comparable only in a very narrow sense for songs/texts and nothing else), and Everipedia (which is the closest thing to a real competitor with all Wikipedia content imported, but is tiny in comparison to the last 2 sites above - Alexa 6k US vs Alexa top 100 for the other 2).
I would say out of everything I've found Everipedia comes closest in a valiant effort and I frequently contribute to it here and there, but at the same time, Wikipedia is just too dominant to see any real necessity to change how it is doing anything, whether that is for good or for bad. And my personal opinion is that maybe that is how it should stay too, given the size and scale that Wikipedia operates and its general continued success across most of its fronts. One thing is for sure: the world is definitely better with Wikipedia continuing onward even if "it has cancer."
So I enabled Google Analytics to see where they were coming from. The source was one of our sub-pages on redlining, which was the 5th external link listed in Wikipedia on that topic.
IMHO if Wikipedia loses that, it costs dearly not just to Wikipedia but to society.
So long as there's a contingency plan for donations falling, whereby all the nonessential stuff can be dropped if necessary, there's no need to worry.
tldr; prepare for the worst, expect the best.
It was a very different time, and the whole thing was run much more like a typical open source project.
I think the whole project has gone in completely the wrong direction since then. Wikipedia itself is still awesome, but what's not awesome is that the typical reader / contributor experience is pretty much the same as it was in 2005.
Moreover, because of the growing number of employees & need for revenue, the foundation's main goal is still to host a centrally maintained site that must get your pageviews & donations directly.
The goal of Wikipedia should be to spread the content as far & wide as possible, the way OpenStreetMap operates is a much better model. Success should be measured as a function of how likely any given person is to see factually accurate content sourced from Wikipedia, and it shouldn't matter if they're viewing it on some third party site.
Instead it's still run as one massive monolithic website, and it's still hard to get any sort of machine readable data out of it. This IMO should have been the main focus of Wikimedia's development efforts.
The Wikimedia universe is way bigger than one site. There's Wikidata, Commons, Wikisource, Wiktionary, Wikivoyage, Wikibooks and so on. And there are a lot of language versions too - English is not the only way to store knowledge, you know.
> The goal of Wikipedia should be to spread the content as far & wide as possible
That requires a) creating the content and b) presenting it in a form consumable by users. Creating tools for this is far from trivial, especially if you want it to be consumable and not just an unpalatable infodump accessible only to the most determined.
> Instead it's still run as one massive monolithic website
This is not accurate. A lot has changed since 2004. It's not one monolithic website; it's a constellation of dozens, if not hundreds, of communities. They use common technical infrastructure (datacenters, operations, etc.) and common software (Mediawiki, plus a myriad of extensions for specific projects), but they are separate sites and separate communities, united by the common goal of making knowledge available to everyone.
> it's still hard to get any sort of machine readable data out of it
Please check out Wikidata. This is literally the goal of that project. You may also be interested in the "structured Commons" and "structured Wiktionary" projects, both in active development as we speak.
> This IMO should have been the main focus of Wikimedia's development efforts.
It is - one of the focuses; for a project of this size, there are always several directions. BTW, right now Wikimedia is in the process of formulating a movement strategy, with active community participation. You are welcome to contribute: https://meta.wikimedia.org/wiki/Strategy/Wikimedia_movement/...
Disclosure: working for WMF, not speaking for anybody but myself.
>> That requires a) creating the content and b) presenting it in a form consumable by users. Creating tools for this is far from trivial, especially if you want it to be consumable and not just an unpalatable infodump accessible only to the most determined.
Yes, and as emphasized in the article, WMF has done a terrible job at building better tools. For crying out loud, we are still typing in by hand the complete bibliographic information for each cited reference.
Your other comments are similar. The fact that "WMF is trying", or has a named task force whose formal mission addresses a complaint, is not enough to justify years of high spending.
I respectfully disagree. I think WMF has done a pretty good job. Could it be better? Of course, everything could. Is it "terrible"? Not even close.
> For crying out loud, we are still typing in by hand the complete bibliographic information for each cited reference.
In any case "it misses my pet feature" and "the whole multi-year effort is terrible" are not exactly the same thing.
> is not enough to justify years of high spending.
I think the work that has been done and is being done justifies it. All this work is publicly documented. If you think it's too much and you have ideas on how to do it better, you're welcome to comment. I can't comment on your value judgements - you may think some projects are more valuable and left undone, and you are entitled to that view. There's a process which gets some things done and leaves some things out, and by its nature not everybody will be satisfied. I only want to correct the completely factually false claims in the op-ed, and I believe I have done so. If I can help with more information, you are welcome to ask. As for value judgements, I think we'll have to agree to disagree here.
It's clear from context that this is just an example. The issues with the Wikipedia editing UI are legion and described in many other places.
> If you think it's too much and you have ideas on how to do it better, you're welcome to comment
Clean house. Put the people who built Zotero in charge.
Any existing UI can be analyzed to find a legion of issues; no UI is ever perfect, especially over time and changing requirements. Wikipedia's UI is certainly not perfect, and much work remains to be done (and is being done), but I would stop well short of calling the work that was already done "terrible".
> Clean house. Put the people who built Zotero in charge.
Err, I am having a hard time making sense of this advice - why exactly should the people who built a piece of reference management software be running the Wikimedia Foundation?
You already declared you weren't going to debate me on this point, so I don't know why you're bringing it up again, especially since you're not saying anything substantive.
> why exactly people who built a reference management software must be running Wikimedia Foundation?
Because they are a philanthropically funded non-profit who builds great academic/research software on a small budget while responding rapidly to user feedback.
If your objections center around the fact that WMF does a lot more than develop Wikipedia software, then you are missing the whole point of this thread: that WMF's primary contribution is Wikipedia, and almost everything else is secondary. So long as it's being funded by private citizens because of the value they get from Wikipedia, then this should be the focus. Yes, that means the people running Wikipedia conferences and local meetups will have less power.
WMF does a lot more than develop one piece of software to manage citations, yes. Nothing wrong with the software; I'm sure the people who made it are awesome. But it's like discovering that the US federal government didn't fix a faulty light on your street and proposing that the electrician who did should therefore be President of the USA. Nothing wrong with the electrician or with fixing the light, and maybe he'd even make a great President, but that in no way follows from his ability to fix the light. They're just completely unrelated things.
> that WMF's primary contribution is Wikipedia, and almost everything else is secondary
Not so for some time. Also, Wikipedia as a project is way bigger than just software.
> So long as it's being funded by private citizens because of the value they get from Wikipedia, then this should be the focus.
It is. I mean the value, and improving it (again, if we correct from "Wikipedia" to "wiki sites to gather and disseminate knowledge", which are more than just Wikipedia). But opinions on how to improve that value may not be limited to "improve this one particular feature".
> Yes, that means the people running Wikipedia conferences and local meetups will have less power.
Than who? And why? There are processes that decide which directions are prioritized and which are not. Right now two of them are happening as we speak - board elections and strategy consultation. Any decision that happens leaves somebody unsatisfied, because it's not possible to satisfy everyone. That doesn't mean everything is terrible, sorry.
> Not so for some time. Also, Wikipedia as a project is way bigger than just software.
well my donation certainly is aimed that way and the insistent nagscreen certainly made me think "yeah I don't want this resource to go away"
and that is Wikipedia. I occasionally use some of the other wikimedia projects, but they should be secondary, it's definitely specifically that one great body of knowledge that got me to donate.
wiktionary is the project I use second most. if they were to beg for donations or else it may go away, I'd be like "eh"
it's Wikipedia only that got me "no wait this is super important, take my money" every year.
I was really confused by this comment until I realized you thought I was suggesting Zotero run things because they power Citoid. In fact, as any of the people I eat lunch with can tell you, I have been singing the praises of Zotero for years.
The fact that Citoid is very flawed but the part of it that actually works is made by Zotero was merely delicious coincidence.
Your remaining comments then do a better job than I could possibly hope of illustrating exactly the pathological attitude that afflicts non-profits. The whole point of my criticism, which is partially shared by the OP and many others in this thread, is that the proper focus of WMF is determined by the people who donate their money and, especially, their time. (That's a normative claim.) The fact that you responded to these points by saying "No, actually, we at the WMF have expanded well beyond such trifling concerns as the base functionality of Wikipedia" perfectly captures this destructive mindset.
> if we correct from Wikipedia to "Wiki sites to gather and disseminate knowledge", which are more than just Wikipedia
Incorrect. For instance, I use and love Wikivoyage, but I do not pretend that the millions of people who donate to Wikipedia intend to subsidize it! Yes, if Wikivoyage ends up better off through Wikipedia-financed improvements to the general Wikimedia software, all the better. But my friends should not be made to feel like Wikipedia will shut down if they don't donate yearly, just so WMF can hold more conferences.
> But opinion on how to improve that value may not only be "improve this one particular feature".
Again, for the second time, the comment on the antiquated citation process was an illustrative example. I have resisted diving into the millions of issues with Wikipedia's software.
>> Yes, that means the people running Wikipedia conferences and local meetups will have less power.
> Than who?
Well, not "less power than other people", but "less power than they had before", i.e., fewer resources and less influence. (WMF can simply get smaller, so that no one gets more power.) But for clarity, I'm happy to suggest that more institutional power within WMF should be given to technical people, to (say) Zotero staff or other people from software non-profits with a better track record, and to anyone who internalizes the idea that the non-profit exists as a servant to the people who donate time and money.
> There are processes that decide which directions are prioritized and which are not.
Oh thank goodness! There are processes! Just like there are processes for new Wikipedians to dispute the deletion of content they write.
I guess so long as a nation is nominally democratic we don't ever have to worry about it being badly run. And if anyone complains, we can just say they should vote and be satisfied. After all, we can't make everyone happy, so if people are unhappy there's no reason to worry about it!
> Not so for some time.
Oh yeah? To me, as Johnny Q. Public, definitely so, now as always.
Could well be that you're doing Crom-knows-what too, nowadays -- but who cares? Why should we?
> Also, Wikipedia as a project is way bigger than just software.
Yup, you're right there: It's all about the knowledge, the actual _content_ of the on-line encyclopedia.
Which worked perfectly fine with the software of ca. 2004, so why waste millions and millions for, AFAICS, hardly any benefit at all compared to that?
Because it's not 2004 anymore. What worked perfectly in 2004 (which, btw, it didn't - people complained back then no less than they do now) doesn't work that perfectly now. Ten years is a long time on the Internet, and the project has grown since then.
I have to agree. I keep seeing pleas for donations on Wikipedia when I browse it, but now that I'm reading that they're spending most of that money on other bullshit besides Wikipedia itself, that means I no longer feel any need or duty to donate. I don't use all that other stuff, nor do I care about it, I only care about Wikipedia itself. Surely I'm not the only person who feels this way; anyone reading this article is going to see all the largesse that WMF is spending on, and many are going to question these donation pleas, which likely means donations are going to fall.
Remember Napster, back in the day? It was able to be shut down because it had an SPOF: Napster-the-corporation owned and maintained all the "supernodes" that formed the backbone of the network.
Or consider the Great Firewall of China. If the Great Firewall can block your site/content entirely with a single rule, you have an SPOF.
The answer to such problems isn't simple sharding-by-content-type into "communities" like you're talking about; this is still centralized, in the sense of "centralized allocation."
Instead, to answer such problems, you need true distribution. This can take the form of protocols allowing Wiki articles to be accessed and edited in a peer-to-peer fashion with no focal point that can be blocked; this can take the form of Wikipedia "apps" that are offline-first, such that you can "bring Wikipedia with you" to places where state actors would rather you don't have it; this can take the form of preloaded "Wikipedia mirror in a box" appliances (plus a syncing logistics solution, ala AWS Snowball) which can be used by local libraries in countries with little internet access to allow people there access to Wikipedia.
In fact, one of the long-term projects at WMF is making sure the infrastructure is resistant to single-point-of-failure problems - up to a whole data center going down. We are pretty close to it (not sure if 100%, but if not, close to it). Of course, if you consider the existence of WMF itself to be a point of failure, that's another question; by that logic the existence of Wikipedia can be treated as a single point of failure too. Anybody is welcome to create a new Wikipedia, but that's certainly not a point of criticism towards WMF.
> It was able to be shut down because it had an SPOF: Napster-the-corporation owned and maintained all the "supernodes"
WMF does not own the content or the code, both are in open access and extensively mirrored. WMF does own the hardware - I don't think there's a way to do anything about it, unless somebody wants to donate a data center :)
> If the Great Firewall can block your site/content entirely with a single rule, you have an SPOF.
True, though there are ways around it, as currently witnessed with Turkey blocking Wikipedia. See e.g. https://github.com/ipfs/distributed-wikipedia-mirror for ways around it.
> Instead, to answer such problems, you need true distribution.
I am skeptical about the possibility of making a community work using "true distribution". Even though we have good means to distribute hardware and software, be it production or development code, we still do not have any way to make a community work without gathering points. I won't say it is impossible. I'd say I have yet to see anybody do it.
But if somebody wants to try, all power to them. You can read more about Wikimedia discussions on the topic here: https://strategy.wikimedia.org/wiki/Proposal:Distributed_Wik...
> such that you can "bring Wikipedia with you"
That already exists, there are several offline wikipedia setups and projects: https://strategy.wikimedia.org/wiki/Offline/List_of_Offline_...
> this can take the form of preloaded "Wikipedia mirror in a box" appliances
We are pretty close to this - you can install a working Mediawiki setup very quickly (Vagrant, or I think there are some other containers too; I use Vagrant), and the dumps are there. It won't be a 100% copy of the real site, since there are some complex operational structures that ensure caching, high availability, etc., which are kind of hard to put into a box - they are public (mostly as Puppet recipes) but implementing them is not an out-of-the-box experience. But you can make a simple mirror with relatively low effort (probably several hours, excluding the time to actually load the data - that depends on how beefy your hardware is :)
Most of this, btw, is made possible by the work of WMF Engineers :)
That doesn't come close at all, from the perspective of a librarian who wants a "copy of Wikipedia" for their library, no? It assumes a ton of IT knowledge, just from the point where you need to combine software with hardware with database dumps.
The average library staff who'd want to set this up in some African village would be less on the side of the knowledge spectrum of "knows what to do with a VM image", and more toward the side of "can plug in and go through the configuration wizard for a NAS/router/streaming box."
Once I can tell such a person to buy some little box with a 4TB hard disk inside it, that you plug in, go to the URL printed on the top, and there Wikipedia is—and then it can keep itself up to date, with a combination of "large patches that get mailed on USB sticks that you plug in, wait, and then drop back into the mail", and critical quick updates to text content for WikiNews et al that it can manage to do using a 20kbps line that's only on for two hours per day—then you'll have something.
I don't think "critical updates" are really that necessary. Swapping SD cards a couple of times a year would solve most of it. I think it's pretty incredible (and useful) to be able to have access to all that information for such a low cost, even if it's a few months (or even years) out of date.
Depends what you mean by copy. If it's just a static data source, any offline project would do it. If it has to update, it's trickier, but some offline projects do that too. If you want to run a full clone of the Web's fifth most popular website, yes, it requires some effort. Sorry, no magic here :)
> "can plug in and go through the configuration wizard for a NAS/router/streaming box."
There are boxes that are integrated with one or another of the offline projects. There's also Wikipedia Zero - which, in a world where mobile coverage is becoming more and more widespread even in poor regions, may be an even better alternative.
Sadly it seems the opposite is true: whole parts of Wikipedia are infested by cancer (aka corrupt/out-of-their-mind admins acting in their own world/turf and interest). Have a closer look at certain languages like de.wikipedia.org, where more new and old articles get deleted than content can grow (source: various German news media, incl. Heise.de, reported about it). And why is Wikia a thing? And why is it from the Wikipedia founder - does he have a dual interest!? And now he is starting a news offspring as well, something like the Wikipedia front page and WikiNews, just under his own company. And on the other side, Wikipedia banned trivia sections to make the Wikia offspring even possible (happened 10 years ago, but you probably remember it; yet Wikipedia deleted/buried the trivia section history). Why even delete non-corporate-spam articles? Why are fictional manga creatures all over Wikipedia but info about movie characters all deleted? Many Wikipedia admins seem to be deletionists who care only about their turf; they care about "their own" articles and revert changes to them just for their own sake. Look at the WikiData project: why is it implemented by a German Wikipedia org that has little to do with the international Wikimedia Foundation? It's not a sister project, they do their own fundraiser, and media news reported not-so-nice things over the years.
Look at the OpenStreetMap project: it works a lot better. Maybe the Wikipedia project should be transferred over to, or forked by, the OpenStreetMap project. And delete all admin rights and start over with the in some ways toxic community that scares away most normal people, who don't want to engage in childish turf wars and see their contributions deleted and cut down for no reason but admin power play.
Doesn't that qualify?
I work on a Wikitext parser. So do many other people, in different ways. Wikitext syntax is horrible, and it mixes content and presentation indiscriminately (for example, it contains most of HTML as a sub-syntax).
The problem is basically unsolvable, as the result of parsing a Wiki page is defined only by a complete implementation of MediaWiki (with all its recursively-evaluated template pages, PHP code, and Lua code), but if you run that whole stack what you get in the end is HTML -- just the presentation, not the content you presumably wanted.
So people solve various pieces of the problem instead, creating approximate parsers that oversimplify various situations to meet their needs.
One of these solutions is DBPedia, but if you use DBPedia you have to watch out for the parts that are false or misleading due to parse errors.
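To make the "approximate parser" idea concrete, here is a deliberately naive, illustrative sketch (all regexes and the sample text are mine, not from any real parser). It handles the common cases and, exactly as described above, silently gets nested templates and exotic markup wrong:

```python
import re

def strip_wikitext(text):
    """Very rough wikitext-to-plaintext conversion. This is the
    'approximate parser' trade-off: simple rules that oversimplify.
    Nested templates, tables, and HTML sub-syntax are NOT handled."""
    # Remove innermost (non-nested) templates like {{refn|...}}
    text = re.sub(r"\{\{[^{}]*\}\}", "", text)
    # Collapse [[target|label]] and [[target]] links to their visible text
    text = re.sub(r"\[\[(?:[^|\]]*\|)?([^\]]*)\]\]", r"\1", text)
    # Drop bold/italic quote runs
    text = re.sub(r"'{2,}", "", text)
    return text

sample = "'''Redlining''' is a {{refn|see also}} practice in [[United States|American]] cities."
print(strip_wikitext(sample))
```

Running this on real articles quickly shows the failure modes: a template nested inside another template survives one pass, and anything relying on template *expansion* (infobox values, conversions) is simply lost, which is why DBPedia-style extractions contain parse errors.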
avar: "The goal of Wikipedia should be to spread the content as far & wide as possible, the way OpenStreetMap operates is a better model."
I am confused.
Doesn't OSM data come encapsulated in XML or some binary format?
As for dispersion of content, I could have sworn I have seen Wikipedia content on non-Wikipedia websites. Is there some restriction that prohibits this?
I have seen Wikipedia data offered in DNS TXT records as well.
For anyone who has not worked with the Wikipedia data dumps extensively before, trust us that it is not easily machine-readable and that even solutions like DBPedia / Wikidata are not yet suitable for many purposes.
The Wiki markup is extremely complicated and being user created, it is also inconsistent and error prone. I believe the MediaWiki parser itself is something like a single 5000 line PHP function! All of the alternate parsers I've tried are not perfect. There is a ton of information encoded in the semi-structured markup, but it's still not easy to turn that into actual structured data. That's where the problem lies.
It's not. I'm on mobile, so it's not easy to link, but the PHP version of the parser is nothing like a single function. There is also a Node.js version of the parser under active development, with the goal of replacing the PHP parser.
Would there be some particular structure that everyone would agree on?
Alternatively, what is the desired structure you want?
Because the current format is so messy, I just focus on what I believe is most important: titles and externallinks. IMO, often the most interesting content in an article is lifted from content found via the external links. I also would like to capture the talk pages. Maybe just the contributing usernames and IP addresses.
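As a rough illustration of what pulling titles and text out of the pages-articles XML dump looks like, here is a minimal streaming sketch using only the standard library. The inline SAMPLE stands in for the real multi-gigabyte dump; real dump files also declare an XML namespace, which this sketch handles by matching on local tag names:

```python
import io
import xml.etree.ElementTree as ET

# Tiny stand-in for the multi-gigabyte pages-articles XML dump
# (real dumps also carry xmlns declarations on the root element).
SAMPLE = b"""<mediawiki>
  <page><title>Redlining</title>
    <revision><text>'''Redlining''' is a practice...</text></revision>
  </page>
  <page><title>Zotero</title>
    <revision><text>'''Zotero''' is reference software...</text></revision>
  </page>
</mediawiki>"""

def iter_pages(stream):
    """Stream (title, wikitext) pairs without loading the whole dump."""
    for _, elem in ET.iterparse(stream, events=("end",)):
        if elem.tag.rsplit("}", 1)[-1] == "page":
            title = text = None
            for node in elem.iter():
                local = node.tag.rsplit("}", 1)[-1]
                if local == "title":
                    title = node.text
                elif local == "text":
                    text = node.text
            yield title, text
            elem.clear()  # drop the page's children to bound memory use

for title, _ in iter_pages(io.BytesIO(SAMPLE)):
    print(title)
```

This gets you titles and raw wikitext cheaply; the hard part, as the comments above note, is everything after that - the text field is still unparsed markup.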
Opinions or explanations that have no supporting reference are inexpensive. One can always find these for free on the web. No problem recruiting "contributors" for that sort of "content".
Back to the question: I am curious what structure you envision would be best for Wikipedia data? Assume hypothetically that a "perfect" parser has been written for you to do the transformation.
* The definitions from each Wiktionary entry.
* The links between those definitions, whether they are explicit templated links in a Translation or Etymology section, or vaguer links such as words in double-brackets in the definition. (These links carry a lot of information, and they're why I started my own parser instead of using DBnary.)
* The relations being conveyed by those links. (Synonyms? Hypernyms? Part of the definition of the word?)
* The links should clarify the language of the word they are linking to. (This takes some heuristics and some guessing so far, because Wiktionary pages define every word in any language that uses the same string of letters, and often the link target doesn't specify the language.)
* The languages involved should be identified by BCP 47 language codes, not by their names, because names are ambiguous. (Every Wiktionary but the English one is good at this.)
There are probably analogous relations to be extracted from Wikipedia, but it seems like an even bigger task whose fruit is higher-hanging.
Don't get me wrong: Wiktionary is an amazing, world-changing source of multilingual knowledge. Wiktionary plus Games With A Purpose are most of the reason why ConceptNet works so well and is mopping the floor with word2vec. And that's why I'm so desperate to get at what the knowledge is.
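The double-bracket link extraction described in the list above can be sketched roughly as follows. The ENTRY text and the heading-based language heuristic are simplified illustrations of mine, not the real pipeline; real Wiktionary pages need many more heuristics, especially for guessing the language of a link target:

```python
import re

# Simplified stand-in for a Wiktionary entry with two language sections.
ENTRY = """==English==
===Noun===
# A [[dog]]; a domesticated [[canine#English|canine]].

==German==
===Noun===
# [[hound]]
"""

def definition_links(entry):
    """Extract [[...]] link targets per language section.
    Level-2 headings (==Language==) are the only language cue here."""
    links, lang = [], None
    for line in entry.splitlines():
        m = re.match(r"==([^=]+)==$", line)
        if m:
            lang = m.group(1).strip()
            continue
        # Capture link targets, stopping at '|' (label) or '#' (anchor).
        for target in re.findall(r"\[\[([^\]|#]+)", line):
            links.append((lang, target))
    return links

print(definition_links(ENTRY))
```

Even this toy version shows the ambiguity problem: the extracted targets like "dog" carry no language of their own, so a real extractor has to guess from anchors, templates, or context which language's entry the link means.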
The burden of repurposing falls on you, and Wikipedia makes the exact same data that they have at their disposal available to you. To expect it in a more structured format that is usable by you and your project, but that goes beyond what Wikipedia needs in order to function, is asking a bit much, I think.
They make the dumps available, they make the parser they use available, what more could you reasonably ask for that does not exceed the intended use case for Wikipedia?
AFAICS, any work they do that would make your life easier while increasing the burden on Wikipedia contributors would be too much.
But since you are already so far along with this and you have your parser, what you could do is to re-release your own intermediary format dumps that would make the lives of other researchers easier.
But this could be easier. What I hate about Wikimedia's format is templates. They are not very human-editable (try editing a template sometime; unless you're an absolute pro, you will break thousands of articles and be kindly asked never to do that again) and not very computer-parseable. They're just the first thing someone thought of that worked with MediaWiki's existing feature set and put the right thing on the screen.
Replacing templates with something better -- which would require a project-wide engineering effort -- could make things more accessible to everyone.
FWIW, I do make the intermediate results of my parser downloadable, although to consider them "released" would require documenting them. For example: 
It's a nice agreed-upon vocabulary for linked data. But you still need the data that the vocabulary refers to. The information you can get without ever leaving the Wikidata representation is still too sparse.
What if you just rendered the content into HTML, "screen scraped" the text, and then converted it into a more useful format (Markdown, JSON, etc.)? Is that plausible?
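A minimal sketch of that screen-scraping idea, using only the standard library's html.parser. The HTML snippet here stands in for a rendered page you would have fetched separately; real article HTML needs far more cleanup (infoboxes, navboxes, reference markers) before the text is useful:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text from rendered HTML, skipping script/style."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth and data.strip():
            self.parts.append(data.strip())

# Stand-in for a fetched, rendered article fragment.
html = '<p><b>Redlining</b> is a practice.</p><script>var x=1;</script>'
p = TextExtractor()
p.feed(html)
print(" ".join(p.parts))  # prints: Redlining is a practice.
```

This is exactly the trade-off discussed in the thread, though: you get clean presentation text, but the semi-structured information (which template produced which value) is already gone by the time the HTML exists.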
"2017-05-07 23:24:34: enwiki (ID 13918) 103 pages (0.0|1.2/sec all|curr), 921000 revs (14.4|11.4/sec all|curr), 100.0%|100.0% prefetched (all|curr), ETA 2019-01-23 15:17:29 [max 779130995]"
There are real logistical challenges in making these dumps and making them _useful_. For all Wikimedia's spending, they have not invested sufficiently in this area.
Years back Wikipedia released HTML dumps of the entire site, which was closer to providing the actual content of Wikipedia as structured data, but that was discontinued.
EDIT: Turns out someone is already working on it: https://lunyr.com/
Not sure how far along it is, but it looks interesting.
Also what's the difference between WikiData and DBPedia?
Wikidata is a Wikimedia project with the aim of creating a structured knowledge base. It is mostly filled and curated by humans: https://www.wikidata.org
DBPedia is a knowledge base whose content is extracted from Wikipedia (mostly from the infoboxes). It is a project run by researchers: http://dbpedia.org
Year        Revenue        Expenses       Net Assets     Expense Ratio (1 - margin)
2003/2004   $80,129        $23,463        $56,666        29%
2004/2005   $379,088       $177,670       $268,084       47%
2005/2006   $1,508,039     $791,907       $1,004,216     53%
2006/2007   $2,734,909     $2,077,843     $1,658,282     76%
2007/2008   $5,032,981     $3,540,724     $5,178,168     70%
2008/2009   $8,658,006     $5,617,236     $8,231,767     65%
2009/2010   $17,979,312    $10,266,793    $14,542,731    57%
2010/2011   $24,785,092    $17,889,794    $24,192,144    72%
2011/2012   $38,479,665    $29,260,652    $34,929,058    76%
2012/2013   $48,635,408    $35,704,796    $45,189,124    73%
2013/2014   $52,465,287    $45,900,745    $53,475,021    87%
2014/2015   $75,797,223    $52,596,782    $77,820,298    69%
2015/2016   $81,862,724    $65,947,465    $91,782,795    81%
Expense category                 2015/2016      2014/2015
Salaries and wages               31,713,961     26,049,224
Awards and grants                11,354,612     4,522,689
Internet hosting                 2,069,572      1,997,521
In-kind service expenses         1,065,523      235,570
Donations processing expenses    3,604,682      2,484,765
Professional service expenses    6,033,172      7,645,105
Other operating expenses         4,777,203      4,449,764
Travel and conferences           2,296,592      2,289,489
Depreciation and amortization    2,720,835      2,656,103
Special event expense            311,313        266,552
Total expenses                   65,947,465     52,596,782
That's not the essay's concern; the essay's concern is that expenses grow much faster than the site's load.
(1) Bad experiences for new and established contributors mean less motivated contributors. This is due to factors such as too much bureaucracy, too many subjective guidelines, too much content being deleted (exclusionism), and an overwhelming mess of projects and policies.
(2) Not enough focus. By starting many projects the foundation has muddied its mission and public identity. In addition, it has broad and potentially mutually conflicting goals such as educating the public about various issues, educating the public about how to work with others to contribute to projects, asking the public for money, agitating governments and corporations for policy change and support, monitoring public behavior looking for evidence of wrongdoing, and engaging with education. Why not leave education to the educationalists, politics to the politicians, spying to the government and motivated contributors and fundraising to donors?
(3) Non-free academic media is hurting the project. Given that only a small number of editors have true access to major academic databases, it is often hard for contributors to balance an article equally and fairly.
Having said that, I still have tremendous respect for the project. Comparing its costs to those of the prior systems, which necessarily incorporated manual preparation, editing, production, and distribution of printed matter by 'experts', the opportunity cost of access alone justifies the full expenditure. It's not a lot of money in global terms.
This has nothing to do with the financials of the foundation and is completely a community issue.
> Not enough focus.
This is a valid point, but I think you're being too scorched-earth with it, like saying that Wikipedia shouldn't do any political outreach at all. If its millions of viewers hadn't seen the SOPA blackout, would it have been as successful? If it didn't fight for freedom of panorama and other copyright issues, would it be able to exist in the same form as now? Your suggestion is like telling Japan to go back to isolationism: sure, it might work if you're self-sustaining, but it's no way to run a global project.
> Non-free academic media is hurting the project.
If you are part of a university, you likely have access to such media. Many public libraries also have such access. Lastly, there's the Wikipedia Library.  I'm not sure what you want Wikipedia to do here past what it's already doing.
> This has nothing to do with the financials of the foundation and is completely a community issue.
I do not contribute financially to wikipedia, despite being very interested in doing so, because of this issue.
I am sick and tired of seeing large amounts of properly formatted, well formed articles, written in good faith, deleted by the little hitlers protecting their precious wikipedia turf.
This "community issue" costs wikipedia several thousand of my dollars per year. I wonder how many other people decline to support them financially due to these "community issues" ?
I don't contribute to Wikipedia anymore, because the editorial policies don't agree with my views on how an internet encyclopedia should be run.
I don't agree. Choices in how the foundation spends its money can amplify or diminish these concerns.
E.g., WMF has spent extensively on trying to bring in a wider pool of borderline editors, rather than investing as much in infrastructure to soften the learning curve and to retain and boost the participation of middle-tier editors - which exacerbates the us-vs-them siege mentality... and the overuse of blunt tools to stem a rising tide of low-quality edits comes at the cost of a poor experience for new contributors.
An example I'd cite for this is the extreme investment in "visual editing" - which only barely manages not to mangle pages when it's used - over things like syntax highlighting.
Not all the blame in these areas falls on the WMF, for sure. As an example, enwiki community factionalization around deletion blocked the deployment of revision flagging (basically, support for having 'release' versions of articles so that non-contributors are not constantly subjected to the very latest unreviewed revision of an article), which would have allowed radically less aggressive edit patrolling.
From there you can get more specific about what "operate" means (e.g., would layoffs occur before scaling back hosting costs?).
Has there been any such analysis?
The question, as I understand it, is: "if no other money comes in, how long could Wikipedia operate?"
If you assume (and I do) that the Wikimedia foundation (WMF) would keep on the spending path they are on, it would take a year plus or minus a few months to go completely broke. If they were to immediately respond with massive spending cuts they could last a lot longer.
The reason I don't believe that the WMF will react to a revenue decrease with spending cuts is because they really, really, believe that everything they are doing and everything they have planned is absolutely essential. Plus, it is human nature to say "this is temporary. The revenue will go back to increasing next year", all the while greatly expanding the fundraising appeals.
Employee costs have grown 300-fold. How many employees are there? What do they work on?
It's clear that salaries and awards and grants are driving the increase in cost. Maybe this is damning evidence of a decadent culture, as the author of this op-ed clearly presumes, but I doubt it. I would expect that Wikipedia's employees have been working very hard for a long time to keep the site running, and that they've cultivated expertise in governing the site in a way that avoids controversy and maintains credibility. I'd rather Wikipedia spend to retain long-tenured experts who have paid their dues than be an underpaid-college-graduate mill like so many non-profits. It seems that they've done that, and that they waited until the organization was financially stable to do so.
But this is a red herring, because outright fraud is relatively rare. Rather, the much bigger issue is a terribly managed organization spending resources ineffectively. Non-profits shouldn't be judged on overhead or executive salary (who cares?); they should be judged on what they accomplish for the amount they spend. And the WMF does terribly on this metric.
"spending resources ineffectively"
Often staff are taken on to fill vacancies without much regard for skill level. The marginal value of extra employees declines and can dip into the negative. This is the sort of ineffective spending that is invisible to all but their closest colleagues -- who have too little political capital to do anything about it.
This part of the critique seems a little off, doesn't it? I don't know the state of WMF engineering (it may well have problems, and a complete lack of documentation or planning is not a good sign), but the particular artifacts mentioned here (requirements documents, schedules with milestones) are more from the pre-Agile, waterfall school of thought. Can anyone familiar with WMF engineering comment?
The author of the op-ed is a devoted editor but seems almost totally ignorant of how development is conducted. The Foundation has been doing transparent quarterly/yearly roadmap planning alongside its annual plan/budget cycle (all of which is shared publicly). On a shorter timeframe, they are pretty serious about Agile/Scrum. You can see on https://wikimediafoundation.org/wiki/Staff_and_contractors that today they even have a team of half a dozen full-time Scrum masters. If what he thinks is missing is serious, detailed planning, he's sorely wrong.
The platform (MediaWiki) is still a FOSS community so you can find project requirements docs/roadmaps all over mediawiki.org, all the bugs on Phabricator, follow along on mailing lists, and even see commits on their Gerrit instance.
Agile isn't my cup of tea personally, and I could grok criticisms of the organization's software output (ignoring the fact that it's buried under 10+ years of technical debt...), but it takes minimal digging to find all their plans and timelines. I would venture that the author chose not to dig into this because he, like a lot of entrenched old-school editors, viscerally hates some of the past attempts to make MediaWiki a modern collaboration platform, such as building a WYSIWYG editor and a threaded discussion system to replace wiki talk pages.
I am very familiar with Agile and Scrum, and I have seen the advantages over older paradigms such as waterfall. That being said, there are certain basic principles that the old methods and the new methods have in common. One such principle is the basic idea of having some sort of contact with the people who will be using the finished software and understanding their needs. The WMF does not do that. Instead, they build something in secret, throw it over the wall, and watch as the Wikipedia editors reject it as the steaming pile of crap that it is. They have done this again and again. Visual Editor. Flow. Mobile App. Knowledge Engine. All failures. All built without any input from the people who would be using them.
Now I KNOW that the developers are not stupid or ignorant, and I have checked as best I can and it appears that every one of them was able to create high quality software that meets the customer's needs when they were working other places. That leaves me with management as the probable culprit. And I don't think that the problem is product managers like the author of the post above this one. I think the blame is at the very top.
I would advise anyone who really wishes to understand these issues to at least read the pages linked to on my [ https://en.wikipedia.org/wiki/User:Guy_Macon/Wikipedia_has_C... ] page, especially [ http://mollywhite.net/wikimedia-timeline/ ]
Finally, if it really "takes minimal digging to find all their plans and timelines", I would like to see this demonstrated by providing links to the plans and timelines for the Knowledge Engine.
TL;DR: the largest chunk of the budget goes to the two departments that do engineering/design/PM/data science.
On your last point ("for all we know most of their staff time is off building stuff like Wiktionary") it's actually a big gripe in the smaller communities that probably 90% of the time and attention goes to Wikipedia.
Neither is true. There is documentation and planning. As with pretty much every software project I've seen over my career, the documentation could use some TLC (and unlike many other pieces of software, anybody can actually help with it), but it's hardly a "complete lack". There's a lot of documentation, though some areas are covered less than others.
MediaWiki is a big piece of software and a long-term, organically grown project, and if you have any experience with those, you know what that means. This is known, and regular effort is made to improve it.
Same for planning: not exactly ideal, but "complete lack" is very, very far from the truth. Moreover, unlike many other organizations, all the plans and all the internal workboards are public (excluding security issues and sensitive information), so you can check for yourself.
> Can anyone familiar with WMF engineering comment?
Yes, I am familiar with it by virtue of being part of it (though I am not speaking for anyone but myself, off the clock, in a completely personal capacity :) and I say this claim is completely false. Moreover, it is so obviously false and so easily disproven by public documents that I wonder how one could publish it in public media without bothering to do minimal due diligence. I mean, we all panic about "fake news" and such; shouldn't that make us at least try to check our claims with an easy search, or a question on mailing lists with dozens of people who could point out where the appropriate documents are? The author of the article seemingly believes it is unnecessary. I do respect his long-time contribution to Wikipedia (much more sizeable than mine), but that still does not entitle him to his own facts.
Looks like he disagrees with some of the projects Wikimedia took on, like making the user experience friendlier with Visual Editor and mobile support (both IMO excellent projects), but everybody is entitled to their own take on this. It is fine. What's not fine is claiming that not agreeing with him is equivalent to having no direction at all, wasting money, and being cancer. That's way too far and completely untrue.
Waterfall methodologies are deeply un-hip today, of course, but when they first coalesced they were a big improvement over what came before them, which was essentially nothing: an absence of any formal project management methodologies, with people cobbling together projects from bits and pieces of expertise learned in other disciplines.
(Note that I have no idea how WMF's software engineering practices work, so I have no idea if this assertion is accurate or not. I'm just trying to clarify what I think Macon is arguing here.)
Again, the developers and their managers know this. I am convinced that they have been told in no uncertain terms that they will be fired if they interact with the Wikipedia community.
(I am the author of the op-ed. A better version is at [ https://en.wikipedia.org/wiki/User:Guy_Macon/Wikipedia_has_C... ].)
The ultimate straw man indeed :)
No, this is false.
> and the term itself actually originates in a paper describing why it's broken.
So does the term "capitalism". Like capitalism, though, the waterfall method was actually in wide use both before (the first paper describing its use came about 20 years before the critical one in which the term seems to have been first used) and after (it's been mandated by many institutions, particularly in government, even after that critical paper) being named in criticism.
> No one has ever advocated for the "waterfall" approach.
Actually, a number of large organizations, particularly governments, to this day mandate processes for software development projects, particularly large ones, that embody essentially the key features of the waterfall method, most critically doing full analysis across the whole scope before beginning development (often, in government, before getting approval for funding to open up contracting for the actual development work). A lot of the contractors involved advertise that they use agile methods, but it often ends up being a kind of Scrum-within-waterfall monstrosity that manages to preserve the worst features of both.
Point me to someone espousing the benefits of the waterfall approach, please.
> So does the term "capitalism". Like capitalism, though, the waterfall method was actually in wide use both before (the first paper describing its use came about 20 years before the critical one in which the term seems to have been first used) and after (it's been mandated by many institutions, particularly in government, even after that critical paper) being named in criticism.
I'm not saying that no one has ever tried building software this way. But the term and "methodology" are literally the collection of broken processes associated with early development.
> Actually, a number of large organizations, particularly governments, to this day mandate processes for software development projects, particularly large ones, that embody essentially the key features of the waterfall method, most critically doing full analysis across the whole scope before beginning development (often, in government, before getting approval for funding to open up contracting for the actual development work).
How do you go to tender without reasonably complete requirements?
The problem here isn't the development methodology, it's the fact that going to tender for development basically forces you into this position. Governments seem to be moving to in house development to solve this problem, but I'd hardly say that the original requirements gathering was the result of anyone advocating for the waterfall approach.
I wish. I have actually worked under someone seriously putting it forward. (First IT job, I had no idea how bogus this was.)
And I think there's a big problem with the idea of paying writers. The moment you start paying, your writers are no longer just people who want to help and share their knowledge out of goodness; they (or some portion of them, which you can't distinguish) also become people who want money. And that's hard to manage: people can start bickering over who got how much, etc. You might end up with thousands of people willing to write anything just to get some money, and some good writers leaving because they don't want to compete with money-grabbers.
Okay, but why isn't programming held to the same standard? Why do they have, what, four Scrum coaches on payroll? Can't they rely on Scrum coaches who want to help and share their knowledge out of goodness, too?
Even if they didn't pay writers (which might create its own problems), they could do far more to create and publicise a user-friendly scheme to provide volunteers with access to paywalled sources, digitised books etc. They could spend some of those tens of millions on that. But there too they rely on begging and volunteer labour:
It speaks volumes that even this rudimentary project, the Wikipedia Library, was originally initiated by a volunteer - for ten years, nobody at WMF seems to have had the brains to think of it. Supporting the broad mass of volunteers (rather than a few snouts in the trough) has never been near the top of the WMF list of priorities. I suspect the volunteers are largely regarded as amazingly convenient and useful idiots.
The lion's share of WMF money has always been spent on software engineering, much of it done by former volunteers who sucked up enough to first land a WMF job and then prove their engineering incompetence (as in the VisualEditor debacle):
The only thing I want from Wikipedia is Wikipedia: no awards, grants, and other nonsense to launder money to friends who don't want to work. Cut those 300 employees down to the 30 actually doing work on Wikipedia, and maybe one day I will contribute again, either by editing or by donating money. But feeding a bunch of parasites doesn't seem like a healthy long-term solution.
If anything, I got from this article that Wikipedia has kept costs well below revenue growth, which is normally the sign of a healthy organization.
Images are dramatically larger, both in the raw size out of cameras and the resolutions people are willing to put on pages (including higher-DPI screens).
Video's likely a lot more prevalent now, too.
I dunno what they're paying for bandwidth, but AWS S3 has barely dropped per-GB bandwidth costs since its release in 2006.
That shouldn't be a big problem. Screen resolutions have not increased significantly in years (most desktops and laptops are still stuck with 1920x1080), so there's no reason for larger images at all. It should be trivial to write the software so that it auto-scales the image down to an appropriate size for web display on modern screens (which is small, since most Wikipedia images are pretty small within the text, just like any reference work), while also providing a clickable link to allow seeing the image at full resolution. Very few visitors are going to view any image in an article at full resolution, let alone all the images in the article, so the only thing growing should be the image sizes of those optional full-size images. Even there, the software can compress the raw images down to well-compressed JPEGs; it's not Wikipedia's job to store unedited, low or no-compression raw images; an 80% quality JPEG is sufficient.
Same goes for videos; that stuff can be compressed, scaled, recoded to more efficient codecs, etc.
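As a back-of-the-envelope illustration of the point above, here is a minimal sketch in Python. The 1920px display ceiling and the ~10:1 JPEG compression ratio are illustrative assumptions, not measured Wikipedia numbers, and a real thumbnailing pipeline would of course use an imaging library rather than arithmetic:

```python
# A rough sketch of the downscaling idea described above. The 1920px
# ceiling and the ~10:1 JPEG compression ratio are assumptions for
# illustration, not Wikipedia's actual settings.

def scaled_size(width: int, height: int, max_dim: int = 1920) -> tuple[int, int]:
    """Fit (width, height) inside a max_dim box, preserving aspect ratio.

    Never upscales: images already small enough are left alone.
    """
    scale = min(1.0, max_dim / max(width, height))
    return round(width * scale), round(height * scale)

def approx_jpeg_bytes(width: int, height: int, ratio: float = 10.0) -> int:
    """Estimate JPEG size from raw 24-bit pixel data at an assumed compression ratio."""
    return int(width * height * 3 / ratio)

# A 24-megapixel camera image served at screen size:
w, h = scaled_size(6000, 4000)
print(w, h)  # 1920 1280
savings = approx_jpeg_bytes(6000, 4000) / approx_jpeg_bytes(w, h)
print(round(savings))  # ~10x fewer bytes for the in-article version
```

Even under these crude assumptions, serving a screen-sized derivative instead of the full-resolution original cuts the bytes transferred by roughly an order of magnitude, which is the crux of the argument that image growth need not drive bandwidth cost growth.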
> So, we're doing around 1.4 billion page views monthly. So, it's really gotten to be a huge thing. And everything is managed by the volunteers and the total monthly cost for our bandwidth is about US$5,000, and that's essentially our main cost. We could actually do without the employee … We actually hired Brion because he was working part-time for two years and full-time at Wikipedia so we actually hired him so he could get a life and go to the movies sometimes.
According to the WMF, Wikipedia (in all language editions) now receives 16 billion page views per month. The WMF spends roughly US$2 million a year on Internet hosting and employs some 300 staff. The modern Wikipedia hosts 11–12 times as many pages as it did in 2005, but the WMF is spending 33 times as much on hosting, has about 300 times as many employees, and is spending 1,250 times as much overall. WMF's spending has gone up by 85% over the past three years.
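For what it's worth, the multiples in the comparison above can be sanity-checked from the figures quoted in this thread; annualizing the 2005 figure of ~$5,000/month from the Wales quote is my own assumption:

```python
# Sanity-checking the growth multiples from the figures quoted in this
# thread (2005: ~1.4B page views/month, ~$5k/month bandwidth;
# 2016: ~16B page views/month, ~$2M/year hosting).
views_2005 = 1.4e9           # page views per month, 2005
views_2016 = 16e9            # page views per month, 2016
hosting_2005 = 5_000 * 12    # ~$60k/year, annualized from the Wales quote
hosting_2016 = 2_000_000     # ~$2M/year per the WMF

print(round(views_2016 / views_2005, 1))   # 11.4 -> traffic grew ~11x
print(round(hosting_2016 / hosting_2005))  # 33 -> hosting spend grew ~33x
```

So traffic grew about 11x while hosting spend grew about 33x, consistent with the figures above; it is the ~300x headcount and ~1,250x overall spend that dwarf both.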
Should you? I mean, my day job is making it so you expressly don't. If your costs are scaling even linearly I would say you're doing something wrong. The point of scaling is to reduce costs--economies of scale are why you scale. And a user-editable encyclopedia and PHP application are not really good arguments for diseconomies.
Without knowing more about the financials, or how resources are allocated, this is all conjecture. But a website that serves 16 billion page views a month is going to cost a lot of money to run. They could be spending their money very poorly, I don't know, but I also don't know whether what they are spending is an appropriate amount.
Point to the diseconomies of scale and we can talk about them, but everybody else has figured out how to leverage economies of scale when building out a large technical system.
Not only does it seem costs haven't been reduced, their rate of increase has exceeded the growth in pages served, quite substantially. That's not the whole story I'm sure, but as a rough estimate that doesn't seem sustainable or healthy... or necessary simply in terms of general hosting cost declines over that period.
But that won't be a live project. That would be a fossil that would slowly wither and die as it becomes less and less relevant and more and more inadequate to the needs of current users. In 2005, the iPhone didn't exist; now everybody has a smartphone. Should we somehow account for that, or just ignore it? How about knowledge graphs, linked data, and all the AI developments -- should all Wikipedia knowledge still be text-only, ignoring the whole Linked Data universe? How about supporting thousands of existing languages -- should we just dump them in their own domains, or should we help them with automatic translation, article templating, language-sensitive search, and so on? How about creating richer media like maps, diagrams, graphs, video, and audio content -- should we help with this, or should we be content with just inserting links to outside content? And that's only a minuscule part of the questions we can ask about things that have changed and developed in the last decade.
The point here is that the Wiki universe is a big, complex, active, live project (or set of projects) with very many facets, and reducing it to technical maintenance of one webserver site, even one that gets tons of traffic, is not a good idea. The goal of the movement, as I understand it, is not "make sure en.wikipedia.org does not crash"; it is "make the sum of all human knowledge available to everyone". It's a big mission, and it requires people to achieve it.
Twitter doesn't need 3,500 employees, Facebook doesn't need 17,000, and Google doesn't need 72,000. They could all fire 50-90% of their workforce and the product wouldn't be materially affected for the vast majority of users. The reason they have this many is because they can have this many. Their success has fed the cancerous growth.
You get this situation whenever a company isn't operated by a shareholder who gets dividends.
When the person who runs the company knows that every dollar they don't spend is money in their pocket, people start to actually care about expenses and focus only on what is important.
1) One of the biggest moats for tech companies is talent, and it's necessary to have GOOD employees available to test new fields and grow quickly (when Android came along, Google suddenly needed a lot more engineers than it had; having them available saved the time/challenge/cost of hiring good-quality engineers). Also, the numbers you posted are overall employment numbers, not engineering ones (you need bigger HR, sales, support, etc. as you grow). I totally agree that there is a point of diminishing marginal returns and they could be bloated; I just don't know where that point is, and I don't think you do either, so don't discount the number just because it's large.
2) Dividends are given if the company doesn't think it can reinvest the money more profitably. Berkshire Hathaway notoriously doesn't pay dividends; that doesn't mean Buffett & Munger are personally pocketing all the wealth. Also, dividend returns are taxed, which is why shareholders are often okay with having the company reinvest the money or hold it as retained earnings: it's smarter that way.
In countries with Dividend Imputation (https://en.wikipedia.org/wiki/Dividend_imputation) such as the one I live in, companies actually distribute the money they earn.
In the USA, wasting money on bloat is effectively incentivised by the tax structure because you can spend money that is already inside the company with an effective discount vs distributing it so that it can be invested elsewhere.
Owners should be withdrawing the profits of their companies. The current situation leads to bloat and stagnation.
Though I still don't agree with withdrawing profits - I think it's smart to reinvest profits into the business (just look at Amazon).
It's fief-building, to a large extent.
The key difference is this: they earned the money, and they can do with it whatever their shareholders allow. Wikimedia, on the other hand, gets its money from the annual beg-fest, aka the "give us money or bad things may happen to humanity's last remaining encyclopedia" racket, and is only bound by whatever liberal interpretation it applies to its statutes. The problem I see is that with each increase in avoidable expenses, they increase the risk of donations one day not keeping up with rising expenses, threatening the very existence of Wikipedia as we know it. By growing expenses, they are essentially endangering what they are supposed to protect.
The trick with any great startup is how to get out of the "aggressive growth without revenue" stage. Some (very few) keep the growth but aggressively grow revenue and become self-sustaining. Others dial down the growth at a certain stage, revenue increases to match, and they're self-sustaining.
The point is you have to become sustainable. Most startups flame out badly. You cannot change overnight from "growth like cancer" to "stable and flat". You need some way to switch, and it takes time.
In the early days, when almost everything was volunteer-run, hosting costs were indeed Wikipedia's main expense (as explained in that old quote from Jimmy Wales). These days, hosting amounts to about 2% of expenses, and most of the rest goes to staff costs (incl. about two dozen people in fundraising alone). Meanwhile, most of the value - the actual content - is contributed by unpaid volunteers. The paid staff have absolutely nothing to do with it.
Nobody minded working for free when there was just enough money to cover hosting costs. Now, however, there is an influx of $100 million in donations a year, and none of that benefits the average contributor, the people actually writing Wikipedia. That grates a little, much like in the monkey fairness experiment (Google it if you're unfamiliar with it).
A charity is based on how much people are willing to donate, which can change VERY quickly. Imagine what would happen if the public perception of the people behind Wikipedia were to dramatically change and the next fundraising campaign only brought in 20% of last year's?
Additionally, if people start to believe that the finances are squandered or not spent as expected, there could be a movement NOT to donate. This happened with the Red Cross after 9/11, when people felt they were duped by the fact that their donations went to a consolidated fund (often used for overseas activities) rather than exclusively to 9/11 victims.
Charities also tend to need to keep a few years of funding in the bank to deal with a change in markets, as a charity typically can't use debt to get through rough years (economic downturns/recessions). Exponential growth makes this nearly impossible unless you're running extremely lean.
While I think the article is a little sensationalist (and maybe it needed to be to reach certain people), it raises some sound concerns.
This is eye-opening, to say the least.
This has been a trap for charitable organizations for as long as they have existed: churches, clubs, museums, etc. The second trap is embezzlement, which happens all the time because, for some reason, donation-funded organizations often neglect strong financial controls.
The article author is correct that unless corrected, this situation will kill the Wikimedia Foundation.
Well, that's a little unfair. When you measure from 1, virtually any number will look like an absurd multiple. Calling it 300x makes it sound ridiculous, but for a company with $80MM in revenue, 300 employees doesn't seem unreasonable.
To me, it's a no-brainer that the WMF of today needs more than one employee. Whether it needs 300, I don't know, but that doesn't sound far enough off for me to quibble with them over it.
300 feels like too much but 80 could be a logical number.
Yes there is.
2005: > So, we're doing around 1.4 billion page views monthly.
2016: > According to the WMF, Wikipedia (in all language editions) now receives 16 billion page views per month.
For most startups, success means getting bought out by a larger company. Wikipedia by contrast has always put the highest priority on maintaining its own independence, which means that for them a buyout would be a profound failure.
You're right that the costs of running wikipedia are still quite low considering the incredible scope, popularity and importance of the site. "Cancer" seems over the top and needlessly combative. But, it definitely doesn't hurt having someone bring up the fact that costs grew by 6x in 6 years. As he says, more years of this could put wikipedia in a precarious position.
It seems the OP cares about wikipedia and is genuinely worried. These conversations need to be had (also at fast growing startups) and having them in the open is part of the wikipedia way. Maybe he's wrong but it doesn't seem nonsensical or disingenuous to me. It seems genuine and rational.