Wikimedia Foundation spending (wikipedia.org)
1054 points by apsec112 167 days ago | 406 comments



Also known as "the institutional imperative." Quoting Warren Buffett's 1989 letter to shareholders:[1]

"My most surprising discovery: the overwhelming importance in business of an unseen force that we might call 'the institutional imperative.' In business school, I was given no hint of the imperative's existence and I did not intuitively understand it when I entered the business world. I thought then that decent, intelligent, and experienced managers would automatically make rational business decisions. But I learned over time that isn't so. Instead, rationality frequently wilts when the institutional imperative comes into play.

For example: (1) As if governed by Newton's First Law of Motion, an institution will resist any change in its current direction; (2) Just as work expands to fill available time, corporate projects or acquisitions will materialize to soak up available funds; (3) Any business craving of the leader, however foolish, will be quickly supported by detailed rate-of-return and strategic studies prepared by his troops; and (4) The behavior of peer companies, whether they are expanding, acquiring, setting executive compensation or whatever, will be mindlessly imitated.

Institutional dynamics, not venality or stupidity, set businesses on these courses, which are too often misguided. After making some expensive mistakes because I ignored the power of the imperative, I have tried to organize and manage Berkshire in ways that minimize its influence. Furthermore, Charlie and I have attempted to concentrate our investments in companies that appear alert to the problem."

[1] http://www.berkshirehathaway.com/letters/1989.html


Jeff Bezos talked about something very similar in his most recent letter to shareholders[0]:

"As companies get larger and more complex, there’s a tendency to manage to proxies. This comes in many shapes and sizes, and it’s dangerous, subtle, and very Day 2.

A common example is process as proxy. Good process serves you so you can serve customers. But if you’re not watchful, the process can become the thing. This can happen very easily in large organizations. The process becomes the proxy for the result you want. You stop looking at outcomes and just make sure you’re doing the process right. Gulp. It’s not that rare to hear a junior leader defend a bad outcome with something like, “Well, we followed the process.” A more experienced leader will use it as an opportunity to investigate and improve the process. The process is not the thing. It’s always worth asking, do we own the process or does the process own us? In a Day 2 company, you might find it’s the second."

[0] https://www.amazon.com/p/feature/z6o9g6sysxur57t


It's probably off-topic and I'd hate to derail the discussion, but could you outline what you mean by a "Day 2 company"?

It's not immediately clear to me, and as such makes an otherwise good-looking argument feel less "solid".


He's referring to earlier comments in the same letter where he talks about answering a question from someone about what Day 2 at Amazon looks like. At Amazon, one of the company mantras is "It's still Day 1." That's a reference to his 1997 letter to the shareholders[0] in which he said:

"Amazon.com passed many milestones in 1997: by year-end, we had served more than 1.5 million customers, yielding 838% revenue growth to $147.8 million, and extended our market leadership despite aggressive competitive entry.

But this is Day 1 for the Internet and, if we execute well, for Amazon.com."

Bezos has attached the 1997 letter to every letter to the shareholders he has written since. There is even a building in Seattle named Day 1[1].

[0] https://www.amazon.com/p/feature/z6o9g6sysxur57t (bottom half of the page)

[1] https://en.wikipedia.org/wiki/Day_1_(building)


In a nutshell, it's Bezos' metaphor for a company that is losing the behaviours it exhibited on 'day 1', such as innovation, pragmatism and risk-taking, which made it successful, drove massive growth, and got it to 'day 2' in the first place.


Well, seeing as the nomenclature is not the parent commenter's but Bezos's, and seeing as it's literally explained in the third paragraph of the link said commenter gave, I think there's little to explain.


He had some nerve asking a question, instead of reading the entire linked document!


There's a big difference between not reading everything related to a topic and not even doing the most basic cursory search. Have you never heard of 'how to ask questions the smart way'? Notice how I'm not linking to anything? Do you think you'll have to read the whole Internet to know what I'm talking about?


You're right. At the same time I think some people want to post in a forum or talk to someone. Some people want conversation. I think that's ok.


I get annoyed when people ask something of others without having made a very basic effort themselves.

You'll notice for example that even my initial comment pointed the guy to the answer (i.e.: it's in the third paragraph of the link).

Now, the two requirements aren't in opposition; someone could have made a comment like this: "I like Bezos's notion of a Day 2 company, how it differs from Day 1, etc."

Basically, I like people to show that they aren't asking someone else to do the work for them (or, if they are, at least they made some attempt).

That's at least as reasonable a position as the one you just expressed IMO.


Please don't rage and humiliate. Not cool. Not everyone is able to read HN regularly. Have some empathy.


See: Agile


From [ https://en.wikipedia.org/wiki/User:Guy_Macon/Wikipedia_has_C... ]:

"In particular, their poor handling of software development has been well known for many years. The answer to the WMF's problems with software development has been well known for decades and is extensively documented in books such as The Mythical Man-Month and Peopleware: Productive Projects and Teams, yet I have never seen any evidence that the WMF has been following standard software engineering principles that were well-known when Mythical Man-Month was first published in 1975. If they had, we would be seeing things like requirements documents and schedules with measurable milestones. This failure is almost certainly a systemic problem directly caused by top management, not by the developers doing the actual work."

"This is not to imply that decades-old software development methods are somehow superior to modern ones, but rather that the WMF is violating basic principles that are common to both. Nothing about Agile or SCRUM means that the developers do not have to talk to end users, create requirements, or meet milestones. In fact, modern software development methods require more communication and interaction with the final end users. Take as an example the way Visual Editor was developed. There are many pages of documentation on the WMF servers and mailing lists, but no evidence that any developer had any serious discussions with the actual editors of Wikipedia who would be using the software. Instead, the role of "customer" was played by paid WMF staffers who thought that they knew what Wikipedia editors need better than the editors themselves do. Then they threw the result over the wall, and the community of Wikipedia editors largely rejected it. Or Knowledge Engine, which was developed in secret before being cancelled when word got out about what the WMF was planning. Another example: The MediaWiki edit toolbar ended up being used by a whopping 0.03% of active editors."


This sounds related to Pournelle's Iron Law of Bureaucracy:

https://en.wikipedia.org/wiki/Jerry_Pournelle#Iron_Law_of_Bu...


I love it that we turn to Wikipedia to understand its own weaknesses.


You'd think the organization behind what is probably the biggest online source of information would've learned a thing or two from its own content.


Opportunist in nature, Resource Control System in nurture.


I wonder how much of this could be counteracted simply by capping profits, such that all net revenue in excess of some fixed amount is forced by the corporate charter to be turned into either employee compensation or stockholder dividends.

Sure, such a company would have no chance of succeeding in the stock market, but what if it never plans on IPOing in the first place?


> I wonder how much of this could be counteracted simply by capping profits

Buffett's core complaint is about companies not making enough profits due to their profligate spending on other things. A profit cap would not have the impact you're hoping for here. :)

> such that all net revenue in excess of some fixed amount is forced by the corporate charter to be turned into either employee compensation or stockholder dividends.

By definition, only profits can be paid out as dividends, so again, a profit cap would prevent the thing you're trying to boost.


> By definition, only profits can be paid out as dividends, so again, a profit cap would prevent the thing you're trying to boost.

Er, sorry, I rather meant that profits after dividends would be capped. You can do anything you like with the money—other than keep it in the corporate coffers (or in commercial paper or anything else that's still liquid assets on the balance sheet.)

But yes, you're still right, it wouldn't have the correct effect.

How about a sliding window cap on non-compensation spending, together with a sliding window cap on headcount? So, in a year with record 10x net revenue, you would be allowed to 10x your salaries/bonuses/dividends, but you wouldn't be allowed to multiply your capital costs, or "new" labor costs, by more than, say, 1.3x. (And keep the fixed profit-after-dividend cap, because otherwise the corporation would just hold all its money in the bank until the sliding window grew enough to let it spend it.)
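That sliding-window rule could be sketched in a few lines of Python. Everything here is illustrative: the function name, the three-year window, and the 1.3x multiplier are assumptions standing in for the hypothetical charter rule, not anything that exists in practice.

```python
def allowed_spending(history, current_net_revenue, multiplier=1.3, window=3):
    """Sketch of the proposed sliding-window cap on non-compensation spending.

    history: list of (capital_spend, new_labor_spend) tuples for past years.
    Returns the maximum capital + new-labor spend permitted this year.
    current_net_revenue is deliberately unused: the whole point of the cap
    is that it binds no matter how fast revenue grew.
    """
    recent = history[-window:]
    baseline = max(cap + labor for cap, labor in recent)
    return baseline * multiplier

# Example: net revenue 10x'd to 180, but capital/new-labor spending may
# only grow 1.3x over the best year in the window (12 + 6 = 18).
past = [(10.0, 5.0), (11.0, 5.5), (12.0, 6.0)]
print(round(allowed_spending(past, current_net_revenue=180.0), 2))  # → 23.4
```

Salaries, bonuses, and dividends would scale with revenue under the proposal; only the spending checked by this function would be windowed.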


I mean, you took a simple rule, got an objection, and made it a lot more complex. That's usually a sign that the underlying issue is not all that simple, and a simple rule will not remedy it.


With all the thousands of corporations that get created and destroyed all the time, it seems like parent's proposed rule could probably work for one or two of them? If not, maybe we could come up with something more substantial than, "this could be hard!"


That guy gave solid advice. OP is welcome to implement OP's rule, but adding epicycles isn't going to get meaningful responses. When you start adding epicycles, it's worth reinvestigating whether your original theory has a meaningful basis.


Profits don't generally have a simple/strict meaning in business terms, AFAIK. Maybe you mean EBITDA (earnings before interest, taxes, depreciation and amortization)? I think even that may not be what you mean / wouldn't achieve your goals, even if it's theoretically unambiguous, because of what is invested in R&D/acquisitions/etc.

Also, presumably you're talking about new laws in the US here? What stops businesses from just moving their HQ overseas to places with different laws they like better?


All the ways organisations waste money that Buffett describes would be categorised as expenses or investments, so they occur before profit is even calculated. Setting arbitrary goals independently of the cost structure of the business or its competitive environment is like setting a flight plan for a plane using just a map and then flying it blind without instruments or a weather report.


Profits are revenue minus expenses. In a company ruled by the model Buffett describes, "[c]orporate projects or acquisitions will materialize to soak up available funds" means that expensive boondoggles shall arise to stop any pesky excess profits from appearing in the first place.


> I wonder how much of this could be counteracted simply by capping profits, such that all net revenue in excess of some fixed amount is forced by the corporate charter to be turned into either employee compensation or stockholder dividends.

That seems likely to cause the exact _opposite_ effect of what you intend. Apple is sitting on a pretty big cash pile right now. If the only legal options they had were to invest it or return it as dividends, the original article's premise is that they would find ways to invest it (buying factories in China, bringing app development in-house, etc) rather than risk the shrinking of the bureaucracy.

The problem seems doubly bad with Wikipedia and other non-profits. The problem isn't that they have too much profit. It's that they are growing their expenses to match income (rather than capping expenses at some "rational" level and trying to maintain enough income above it). In theory, in for-profit companies shareholders will start to complain if expenses increase too high -- it's not clear who would make the same complaint at Wikipedia (other than the original article).


Wikipedia has effectively done this with regard to employee compensation/additional hiring.


When you interfere with price communication and incentivization mechanisms, you always break things.

Trying to come up with piecemeal regulatory "solutions" like this is like trying to invent a perpetual motion machine by coming up with increasingly more complicated contraptions. There are fundamental, local interactions that you have to contend with that basically rule out accomplishing what you want.


Your comment brings me joy. Society would benefit from inventing better ways of communicating the properties of large social/economic systems, so that this misconception - that top down economic planning, through cookie cutter mandates, can improve social welfare - becomes less common.

Everything from the growth of the welfare state, to populist revolutions to put socialists into power, can be blamed on how common this misconception is.


Top-down optimized planning of complex systems works exceedingly well, as long as you allow for small actors to freely re-accommodate to their local environment conditions. I know it because my company does this for a living.

The problem with applying top-down planning to economy is not that it doesn't work; it's that it optimizes for the goals of the few doing the planning, leaving large swaths of people struggling with the awful conditions created by not taking their goals into account in the plan.


>as long as you allow for small actors to freely re-accommodate to their local environment conditions.

In other words as long as you don't try to plan their activity. What you're describing is how more free market oriented economies work: with a central authority that has the power to force everyone to comply with its plans, but that deliberately chooses to mostly not exercise this power, in order to maintain a free market.

Central economic planning is better than the alternative (anarchy) only when addressing externalities that market forces do not address. So for instance providing a system of law, a national defense and basic scientific research. Even with all of the inefficiencies of top down planning, it still remains the only way to optimize resource allocation for the production of public goods.


>> as long as you allow for small actors to freely re-accommodate to their local environment conditions.

> In other words as long as you don't try to plan their activity.

That's not what I said, and the two ideas are not equivalent. Agents may be allowed to accommodate locally only within certain limits, giving an appearance of free will, but still being within planned parameters and thus producing a controlled outcome. The system then will converge to a state that complies with the centrally stated goals. The key here is that individual agents are only allowed some local improvements, but they're prevented from any improvement to themselves that would damage the global optimization goals. See local search [1], for example Variable neighborhood search [2].

The central authority sets very strong limits on what can and cannot be done, and enforces them strictly. It also sets up economic incentives, goals, rewards and punishments over the "free actors", like convincing workers that they need to program their lives around continuous learning and adaptation, jumping jobs every X years, and discouraging people from relying on the existing public education, healthcare and public retirement plans, which were the goals of the previous central plan, now deprecated. Anyone "willingly" failing to comply with the new program finds themselves quickly pushed out of the system (but it was "their own fault", of course, for not "behaving rationally").

That's how the current Western world economic forces operate, and it's quite different from "mostly not exercising its power"; if that's a free market, it looks a lot like "forcing everyone to comply with its plans".

[1] https://en.wikipedia.org/wiki/Local_search_(optimization)

[2] https://en.wikipedia.org/wiki/Variable_neighborhood_search
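The "local improvement within planned limits" idea can be illustrated with a toy hill-climbing sketch (this is not taken from the linked articles; the setup and all names are invented for illustration): each agent greedily improves its own score, but a central `allowed` predicate vetoes any move outside the planned parameters, so the system converges to the planner's ceiling.

```python
import random

def constrained_local_search(score, start, neighbors, allowed, steps=1000, seed=0):
    """Greedy local search: the agent takes any improving move it finds,
    but only moves that the central 'allowed' predicate permits."""
    rng = random.Random(seed)
    x = start
    for _ in range(steps):
        candidate = rng.choice(neighbors(x))
        if allowed(candidate) and score(candidate) > score(x):
            x = candidate
    return x

# Toy setup: the agent wants x as large as possible (its local goal);
# the planner only permits states at or below a ceiling (the global limit).
best = constrained_local_search(
    score=lambda x: x,
    start=0,
    neighbors=lambda x: [x - 1, x + 1],
    allowed=lambda x: x <= 10,   # the planner's hard limit
)
print(best)  # the agent climbs freely, but stops at the ceiling: 10
```

The agent never sees the plan directly; it just finds that some improvements are mysteriously unavailable, which is the appearance of free will the comment above describes.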


>Agents may be allowed to accommodate locally only within certain limits, giving an appearance of free will, but still being within planned parameters and thus producing a controlled outcome.

I haven't seen any evidence that this is an effective way of organizing an economic system, at least relative to the free market.

Modern western economies have grown increasingly stagnant as the level of central planning has increased since 1960. Contrary to your claim that "public education, healthcare and public retirement plans" have steadily been deprecated, the raw statistics show that government social spending in a major Western economy, the US, increased, on average, 4.8 percent per year between 1972 and 2011, after adjusting for inflation [1]. This represents a massive shift to more central economic planning. And this shift has been associated with reduced rates of improvement in all metrics of economic development.

[1] http://fivethirtyeight.com/features/what-is-driving-growth-i...


To achieve what you're describing you should cap market capitalization. No point in trying to grow past a point if you'll have to just give away all that growth in the form of dividends.


> To achieve what you're describing you should cap market capitalization.

That's not really it either. If you have a business which is generating high profits, it will have a high market cap. You can reduce it by transferring the corporation's liquid assets to shareholders, but if you've transferred the liquid assets and you're still above the limit, transferring non-liquid assets to shareholders is clearly the wrong direction.

What you really want is for separate businesses to be separate companies. Conglomerates are inefficient. Don't expand into new markets, just give the shareholders their money. The shareholders can invest it in the new markets if they want to.

The main impediment to this is the tax code, because if you give the shareholders the money it gets taxed -- twice -- but if you spend it internally or leave it inside the corporation in an offshore subsidiary, that doesn't happen.

Wikipedia gets to the same place via a different route. A non-profit doesn't have shareholders to pay dividends to, so instead of paying dividends being discouraged by the tax code, it just isn't possible, and you get the same results. The website generates more revenue than it actually needs to operate and the rest of the money has to go somewhere, so it goes somewhere inefficient as determined by internal politics.


> conglomerates are inefficient

It's interesting that you say that on a subthread discussing something Warren Buffett said. Berkshire Hathaway became an efficient conglomerate by acquiring and operating companies that eschewed the very principle we're discussing.


He has addressed that in the past. They are able to work around some of the problems by maintaining a hands off approach to their subsidiaries. There is also an important trust relationship that encourages the subsidiaries to give up "excess" profits to the parent.


Why would a company with mandatory dividends have no chance of succeeding?


Relative to other public companies, it basically would have zero money to re-invest into growth. So there'd be no expectation of returns on investment—no reason to buy the stock.

Dividends are nice, but dividends would stay small if the company itself stays small (because it's constrained from growing.)


That is the whole idea here, though. That Wikipedia's money, which is being "reinvested in growth" is not having an impact on the mission. Stopping businesses from reinvesting in growth -- at least, stopping them doing it the traditional way -- is the point that Buffett is trying to make.

(I'm not equipped to evaluate whether or not that point is true, especially in the case of any specific company. Just trying to point out that the argument you're making here kind of begs the question.)


Not all companies are growth oriented.


My point was that all publicly-traded companies are. Speculation on corporate growth's geometric effects on corporate long-term profitability is what the stock market is.


Not at all.

Look at utilities and regional banks, which are usually priced based on return on assets or operating margins.

Growth oriented companies have advantages and disadvantages. Microsoft is a great example -- they have a business that's a monopoly cash cow, but they also need to hit high growth rates to prosper. They squandered a decade trying to both grow and milk the cow, and are now growing again, while breaking and eventually losing parts of the legacy business.


Sure. If you want to take your pile of cash, and set it on fire, a non-growth company is free to do that.

That is basically what you are doing when you leave money on the table. Opportunity costs are still costs.


I think you've spent too much time focusing on tech companies. Many companies have a growth ceiling (utilities, etc) that investors are happy with as long as it pays a dividend.


During the recession, I parked a significant amount of money in a variety of organizations that operate pipelines for oil, gas, CO2 and other products. Their sole purpose in being is to schlep stuff around and pay dividends.

For about 7 years, I was collecting an effective 10% dividend while getting significant capital appreciation as well.

Likewise, as part of my diversified retirement portfolio, I have a portion in boring dividend paying stocks that generate moderate returns and don't get punished as severely during bad market conditions.


Agreed. It's for this reason that we have to be as free as possible to create new institutions. All institutions, no matter how well-intentioned and self-aware their founders, succumb to these forces. The only way to prevent perverse misallocation of effort and resources is to allow for a steady turnover of institutions.


Wow, this really hits the nail on the head with regard to the small company I work for. We expanded rapidly from 30-40 to 120 people in two years. The higher-ups have failed to address some core issues (some refused on principle, such as an aversion to stronger organizational structure out of not wanting to seem "corporate") while still pursuing pet projects and spending freely (of money and others' time). Communication is also a massive issue: half the company (within the current structure) knows what's going on; the other half have to either do their own investigative work or wait. That would be fine with me at a much larger company, but at 120 or so it seems pretty silly and hypocritical given their "anti-corporate" ethos. It will be interesting when/if we hit Dunbar's number.


I've worked at 3 startups. Every single one of them has (or had) the problems you describe.

Is there some known principle that governs this, something that could be named along the lines of Parkinson's Law or the Peter Principle?


I don't remember the source, but I read about there being several inflection points in team size where communication overhead changes suddenly. IIRC the first one is 8: beyond this you start needing to have 'managers' (their title may be different), i.e. people responsible for organization and team communication rather than directly making the product/sales/etc.

Anecdotally, I noticed that even with 5 people you usually have someone starting to act as an ad-hoc part-time manager, and this role quickly becomes too much for a part-time responsibility as the team grows.


After a certain size, not every person can be answerable to the big boss; you need an actual structure.

You don't need to create a middle management structure from the get-go, just have a chain of command.


Lots of startups / young companies are run by people that never went through formal business schooling - they don't have the history lessons, beyond those of other successful startups, that explain why 'traditional' management (with multiple layers etc) is the way it is.

Any project with more than one team already has a middle management structure of sorts - you've got the CEO / CTO, then the team leaders (scrum masters, or just whoever's the loudest). There's your middle management layer already.


That's a really interesting quote; I hadn't heard of it before today. Do you have any resources that talk about how to organise to combat this?


Not the op, but I think another way of thinking about the problem is related to Tragedy of the Commons[0]. In general, it's an issue that arises out of the different goals of the various actors, which can be orthogonal or counter to the goals of the group. One way of thinking about it intuitively is to imagine multiple painters all trying to create their own artwork on the same canvas, leading to a very messy result. The solution is for the actors to have aligned goals and execute as a team. Achieving that seems to be an interesting and unsolved problem.

https://en.m.wikipedia.org/wiki/Tragedy_of_the_commons


It's not an unsolved problem.

https://en.m.wikipedia.org/wiki/Elinor_Ostrom

Elinor Ostrom received a Nobel Prize for her research into how societies have developed structures to manage commons sustainably. Unfortunately, her conclusion was that there is no default solution.


I have seen this problem first hand in a startup.

I theorize that it is related to size/scale, which disassociates causes and effects across time and groups.

My solution is to keep companies as small as possible.


I've been thinking about this problem for a long time, but have come to realize that keeping groups small has its own major shortcoming. Namely, it limits the ability of the group to achieve greater things. The larger the group, the more influence it has and generally the more resources it has too.

There are also plenty of counter examples in corporations and governments throughout history where the size of the organization has not affected its ability to achieve its goals or compete with smaller organizations. Therefore, I think the issue and solution lies elsewhere. Somewhere between accountability, culture and trust.


Given the force-multiplying factors of technology, more and more can be accomplished by fewer and fewer people. IBM > Google > Facebook > Instagram

That's not to say I disagree with you completely: large corporations can achieve things small ones cannot (especially as you move out of software) but I've seen plenty of talk around accountability, culture and trust, and very few results.


I agree that more may be accomplished with less in certain circumstances aided by technology. One could go as far as to say one day management and governance may be taken over by machines (fun thought experiment: ponder the ramifications for democracy when machines are capable of making better decisions than humans). I think we're a ways away from that still, but the future is exciting.

Sure there's plenty of talk around accountability and proper management without much substance and I don't claim I have any specific solutions in mind to the problem. I think keeping organizations small (i.e. tribalism) is one hack that may help, but comes with different considerations as I mentioned.



Is this a response suggesting that anarchy is the only way to defeat institutional inertia? Or a link to something that actually addresses the subject constructively and realistically?

I ask because it's a large work, but I'd read it if it was the latter.


I was also curious; here's what Wikipedia says about Kropotkin's "The Conquest of Bread":

> "In this work, Kropotkin points out what he considers to be the defects of the economic systems of feudalism and capitalism, and how he believes they thrive on and maintain poverty and scarcity, as symbol for richness and in spite of being in a time of abundance thanks to technology, while promoting privilege. He goes on to propose a more decentralised economic system based on mutual aid and voluntary cooperation, asserting that the tendencies for this kind of organisation already exist, both in evolution and in human society. He also talks about details of revolution and expropriation in order not to end in a reactionary way."

https://en.wikipedia.org/wiki/The_Conquest_of_Bread


Of course there are many ways to solve any problem, but GP asked for a resource, and I gave one. Anarchy has been used realistically, and the book was written as a realistic approach to organizing people.


I think this problem derives from the nature and incentives of people. What happens is that certain managers (and founders) have derived their success from pushing certain directions that have added a lot of value.

Usually long-term initiatives take a lot of domain knowledge and a specific set of connections. When something new becomes more important, they risk losing their personal position, since they may not be the best person in terms of knowledge and connections to carry out the new direction. Instead they try to use their accumulated power to keep their direction supported, even when it's not in the interest of the business, for as long as they can.

To address this, perhaps there's merit in rewarding timely exits and somehow punishing people that have dragged things out. However the most that companies can do is usually just fire someone. And at that point they have probably sucked the company dry of the value they can personally extract.


The only line in Buffett's quote I disagreed with was "rationality frequently wilts when the institutional imperative comes into play", and this is why.

It would be awfully surprising if virtually all companies had the same problems due to smart people suddenly behaving irrationally in the same way. It's much less surprising that smart people behave rationally in their own interests, but that those interests regularly fail to align with overall company interests.

And I think the pattern you describe is even mirrored on an institutional level. People are quick to mock Kodak for ignoring the rise of digital cameras, but they tend to undervalue the amount of risk that pivot would involve. If you're a world leader in product X, and that gets replaced by product Y, pivoting destroys the value of your expertise and puts you at risk. It's unpopular but perhaps sensible to settle on winding down profitably (the institutional version of a timely exit) instead of pivoting.


This will be very familiar to readers of The Systems Bible [1], which expounds in great depth about the above problems and potential mitigation strategies.

The basic takeaway is that these behaviours and forces are natural, intrinsic and inevitable; that a wise systems-manager will seek to carefully manage them rather than oppose or decry them.

[1] https://en.wikipedia.org/wiki/Systemantics


One solution to this problem is to impose a time limit. For instance, the Bill and Melinda Gates Foundation (founded 1994; projected to spend down within twenty years after the death of the survivor of Bill and Melinda).

Why: http://roadmap.rockpa.org/setting-a-time-horizon/

People that did: http://cspcs.sanford.duke.edu/time-limited-philanthropy/time...


Even in that case it can become very difficult to follow.

For example, there is - in my opinion - a low chance that the Gates Foundation will expend its resources within 20 years of the death of the last of the two.

They are likely to have a total equivalent to perhaps $250-$300 billion (in present dollars) to give away over the next 40-50 years. I'm skeptical they can disburse it that fast under their model. They're eroding that mass of capital (presently near $190 billion) so slowly that by the time Gates is 80, they'll probably still be sitting on $200 billion in today's dollars.
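To put a rough number on that intuition, here's a toy projection. Every figure in it (starting capital, real return, spend rate) is an illustrative assumption, not actual Gates Foundation data:

```python
# Toy endowment projection. All numbers are illustrative assumptions:
# $190B starting capital, a 5% real annual return, $6B/year real spending.
def project(endowment, annual_return, annual_spend, years):
    """Return the endowment after `years` of compounding returns and spending."""
    for _ in range(years):
        endowment = endowment * (1 + annual_return) - annual_spend
    return endowment

# Spending only $6B/yr against a 5% return, the principal *grows*:
# the endowment shrinks only once annual spending exceeds annual returns
# (here, once spending tops 5% of $190B, i.e. about $9.5B/yr).
final = project(190e9, 0.05, 6e9, 25)
```

Under these assumed numbers, after 25 years the principal ends up larger than it started; actually drawing it down would require spending meaningfully more than the return generates each year.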


Can a foundation create a foundation? That could "solve" the problem, from the management perspective.

A succession of massive, but formally time-limited foundations, administered in lockstep by dynasties of custodians would add a very interesting touch to any future scenario of fiscal neo-feudalism.


Thanks for the link. He says something else very relevant to OP's point, too:

"We face another obstacle: In a finite world, high growth rates must self-destruct. If the base from which the growth is taking place is tiny, this law may not operate for a time. But when the base balloons, the party ends: A high growth rate eventually forges its own anchor."


This is from 1989, about 27 years ago. Is there any academic research on this since then?


28 years ago. I wonder how much of this still holds true for Berkshire Hathaway.


NASA might offer a more salient comparison.


A better, more original source of pithy organizational rules would be Robert Conquest's three laws of politics:

Everyone is conservative about what he knows best.

Any organization not explicitly right-wing sooner or later becomes left-wing.

The simplest way to explain the behavior of any bureaucratic organization is to assume that it is controlled by a cabal of its enemies.

If you transform Conquest's rules from politics to running (ruining?) Wikipedia, the transformed rules are:

"We can change nothing not even our exponential growth spiral, not our policies, nothing"

"LOL We're not doing the fiscal conservatism thing. I like how the current top discussion is about popularity and the need for a circular firing squad, not something financial. A direct quote of an attempt to avoid working on the issues "I'm reminded of the inflammatory, low-rent campaign of Donald Trump." Yeah buddy that'll fix it, that'll fix it real good."

"We're headed off the financial cliff now get out of the way I'm going gas pedal to the floor as you can see in the exponential graphs. The problem is we're a CRUD app and that's cutting edge CS just like quantum computing so naturally there's no possibility of criticism there. After all, the Egyptians didn't have flush toilets and they built a pyramid, so any criticism of the toilet in my bathroom is either making fun of the entire Egyptian culture or pretending the pyramids were not a logistical challenge."

There is some humor in that the world of paper encyclopedia publishing ran on mostly capitalist operational principles for decades, centuries. It turns out that running an online encyclopedia off donations and extreme hand waving is powerful enough to destroy an industry on its way to its inevitable collapse. Maybe someday in the future we'll have encyclopedias again, but the era after wikipedia and before the next encyclopedia will be a bit of a dark age. That's too bad.


I honestly cannot make any sense of your "translations" of these "laws" (which sound more like assertions to me).

Also, what scenario are you talking about when you say "after wikipedia"? There's plenty of copies of it on the internet, so the data won't suddenly vanish, and wikis don't suddenly stop existing if the Wikimedia Foundation implodes.


Pretty sure this article could have been called "Wikipedia's Costs Growing Unsustainably" instead of the clickbait headline.

But overall this op-ed is misplaced. Running the leanest possible operation shouldn't be Wikipedia's focus at this stage in its lifecycle; improving the quality of its content should be.

Back in 2005 Wikipedia had 438k articles and the focus was expanding the reach of its content to cover all topics; today the article count is 5.4 million and it's quality that matters more. You can't improve quality based on crowd-sourcing alone (see: Yelp, Reddit, etc.), and the bigger it's gotten, the more of a target it's become for disinformation activists.

This focus on budgets over value strikes me as a classic engineer's POV. The OP is nostalgic for a time when the site was run by a single guy in his basement, but could one guy handle the assault of an army of political zealots or Russian hackers? DDoS attacks? Fundraising? Wikipedia is arguably one of the most coveted truth sources the world over; protecting and improving its content is more important than an optimal cost-to-profit ratio.

If the OP has specifics, by all means, share them, but this kind of generalized fearmongering about budgets isn't spectacularly useful, IMHO.


> Pretty sure this article could have been called "Wikipedia's Costs Growing Unsustainably" instead of the clickbait headline.

That's not what I got at all. And that's why the article is interesting.

Wikipedia's funding is what's growing unsustainably. It's higher funding that's pushing the costs higher. And that's what makes it interesting (and only a little click-baity.)

It seems, having taken people's money for a charity, you have a moral obligation to spend the money on the charity, whether it needs it or not. And as a manager of said charity, it's very easy to believe (or to convince yourself) it needs the money. Or otherwise why were we making plaintive pleas for money?

(And that happens in a world of good intentions. When fundraisers become cynical, you end up with the US political outrage machine, which operates simply to raise money rather than to effect political change....)


Huh?

From the OP: "After we burn through our reserves, it seems likely that the next step for the WMF will be going into debt to support continued runaway spending, followed by bankruptcy."

If it was just about wasting donations, they'd never go into debt. It's costs, and specifically the cost-to-income ratio, that he seems perturbed about.


From the table in the OP, reserves have gone up every year.

The point isn't that the donations are wasted; it's that in spending them, you create an organism that needs to be fed.


In most non-profit organizations, costs never go down. Once a budget is set for programs, staffing, etc., those line items don't just go away; in fact, they often continue to grow without bound. See any government budget ever for evidence.

So if donations don't continue to grow to match or at least keep pace, they could start running a deficit to eat away those reserves in no time. And once a non-profit organization starts running at a deficit, some contributors will question their contributions and they may shrink accordingly.

Those reserves could disappear in just a few years. Unless there's a change, two years should show the direction, and within another couple of years the course will be set one way or the other.


> In most non-profit organizations, costs never go down.

And your evidence for this is...?

I've worked in the nonprofit space for 10 years, and leaders, like those at for-profit companies, cut costs when they're facing a deficit.

If costs go up at WMF they'll either raise revenue or cut somewhere, like virtually any other mature business.


> like virtually any other mature business

That's the problem. A non-profit is not a "mature business" because it's not a business at all. In fact, in most non-profits any effort to "run it like a business" will be met with opposition. Once again, for evidence look at any government agency, budget, etc.

But even if it was a "mature business", the idea that any organization knows when and how to cut costs after it's past its prime is dubious at best. For evidence, look at any company bankruptcy: their costs outpaced their revenue and they didn't react fast enough or in the right ways.

And that's assuming there were "right ways" to react.


It's a Non Profit Corporation - it is a business, but a business whose job is not to return value to shareholders, but to server the public good.

As far as cutting costs and institutional maturity: sure, you can look at any company bankruptcy. You can also look at the millions and millions of non-bankrupt companies, and realize that you're overgeneralizing.


Once again, I'm talking about the time after a company starts to lose, and the time it takes them to "correct", if there is a "correct" path.

Or as the top commenter put it, "the institutional imperative" - https://news.ycombinator.com/item?id=14287430 - as described by Warren Buffett.


>but a business who's job is not to return value to shareholders, but to server the public good.

Is this some kind of Freudian slip?


No, it was a typo.


> And once a non-profit organization starts running at a deficit, some contributors will question their contributions and they may shrink accordingly.

That's backwards though. It's not unusual for a non-profit to run a deficit. A donor will question a non-profit that's running a surplus - why am I giving you money you don't need.


How many people who give to charities look at their balance sheets beforehand?


Most don't as long as everyone is getting paid, the galas and events continue, and outward appearances are fine.

Once invoices or paychecks are delayed, people stop and question why the leadership is "making so much money" and the ROI on galas and events. And once a few donors see the deficits, the questions get harder and the money slows down, making the next round of invoices a little harder to cover.


Apparently not in the case of WMF.


> If it was just about wasting donations, they'd never go into debt.

But there's no debt. The whole argument is pure conjecture based on imagination. Using it as if it were a fact that proves something makes little sense.


The author imagines that the funding will not grow forever. That's almost a certainty, as the article describes carefully.


While it's a good warning to the WMF, it's not currently a problem. With longevity, they are in line for some large endowments. I think those giving over a certain amount (?) should have a vote on the various directions of the org, like shareholders, but voting for the common good that supports the mission statement instead of for shareholder value.

Edit: Punctuation.


> While it's a good warning to WMF, it's not currently a problem.

I believe the author's thesis is that by the time "it's not currently a problem" is no longer an argument that makes sense, it will also no longer be possible to effectively correct the WMF's course in a way that will solve it.

I'm not sure I have any idea how to effectively determine if the author is correct about that, but certainly I don't think "it's not currently a problem" actually contradicts anything he's saying.


Except for the correction that it is the author's unsupported assertion, not an actually argued thesis, I think you summarized it nicely.


Funding will not grow forever, but neither will expenses. There's no danger it would consume the whole world's GDP and require us to take out an intergalactic loan from the Arcturian Galactic Bank. https://www.xkcd.com/605/


> If it was just about wasting money people give you, you'd never go into debt.

The point is that costs will continue to rise (or not fall) after the funding inevitably falls (or fails to rise enough).


He's expecting a future where the growth in fundraising declines, but the costs they have committed to are unable to decline at the same rate.


Thank you for this rebuttal. Wikipedia is no longer the small platform it was. It's international, has to expand to developing markets, has multiple sister projects like Wikidata, and tools like VisualEditor that require developers. Sure, you could probably rein in costs if you don't believe in any sort of expansion, software improvement, or outreach programs. Lastly, a lot of the claims in the op-ed are simply unfounded, like the statements that the Foundation isn't transparent enough, that its developers are idiots, or that Wikipedia isn't sustainable with its reserves. This just seems like him being overzealous with his consulting experience "rescuing engineering projects that have gone seriously wrong", just as every week a designer will "fix Wikipedia's design".


> could have been called "Wikipedia's Costs Growing Unsustainably"

The table in the article suggests it's growing sustainably, as assets are increasing and revenues exceed expenses. The whole unsustainability hypothesis seems to be based on one metaphor "if it's growing, then it's cancer, ergo it is deadly, ergo it has to be stopped".


I think this undersells the article.

The 1,250x cost growth seems like the heart of the article - it's a claim that the growth is disproportionate to, and unmotivated by, actual value. Since WMF is funded as a charity, not a business, the revenues are 'sustainable' based on what people want to give. So if revenues exceed expenses, that says people like WMF and will give when asked. It doesn't say anything about whether the expenditures are justified.

I'm not sure I accept the thesis of the article, but I think it deserves deeper consideration. The bankruptcy line seems overwrought, but there's no intrinsic reason that high (charitable) revenues prove high expenditures justified.


> it's a claim that the growth is disproportionate to, and unmotivated by, actual value

This claim is largely unsubstantiated, as "number of pages in Wikipedia" is not a good measure of value of the whole project, especially when you take it naively as the only parameter that must numerically match expense figures in linear proportionality. I think such measure is nonsensical.

> It doesn't say anything about whether the expenditures are justified.

How do you define "justified", then? There's an extensive process for financing and planning projects. The article's author seems either to be completely unfamiliar with this process, or to ignore it altogether, focusing only on one measure, the number of pages. I do not see this as a good argument. Of course, the concern "do we adequately spend the money" is valid - but this concern is known, and constantly on the radar, without alarmist articles. The contribution of this particular article seems to be nil - the valid concerns it raises are already known and accounted for, and the new ones are not valid.


I agree with the thrust of your comment, but the bit about "clickbait" is misplaced.

It is clearly a poetic metaphor - no-one clicking on the article seriously believed that Wikipedia has a literal biological cancer (the "cancer" metaphor is hardly a new one; if any criticism can be made here, it's that it verges on cliche). Indeed the entire article is structured around this metaphor - the title is hardly false advertising.

I disagree strongly with this new obsession that every article title and headline must be written as pure "Man Bites Dog" factual summary, particularly for opinion pieces like this. Surely there is room for some attempt at poetic flair.

(A hypothetical example of a real "clickbait" style headline for this article might be "Google Will Buy Wikipedia").


> A hypothetical example of a real "clickbait" style headline for this article [...]

You forgot the obligatory "this" in the title. Better would be: This Is Why Google Will Buy Wikipedia


Were you aware that the vast majority of both the WMF board of directors and the WMF top management have close ties to Google? Or that a large chunk of those millions that the WMF has been spending went to creating a secret search engine? https://motherboard.vice.com/en_us/article/wikipedias-secret...

Please note that https://en.wikipedia.org/wiki/User:Guy_Macon/Wikipedia_has_C... is the original essay, and the version in the Signpost has been modified at the request of the Signpost editors.


I agree with your post. The OP is a bit fearmongering; however, there are some general issues that are definitely true of Wikipedia, but because it seems to be doing well overall (in terms of traffic and ubiquity), nothing will change even if "it has cancer." If what the OP describes as "cancer" is how a top-5 website in the world operates and stays there, then so be it, I say.

The thing is that there don't really seem to be any REAL alternatives challenging Wikipedia in an honest, high-effort way. As a wiki/knowledge fan myself, I've gone to different sites with different takes, such as Quora (which is fantastic, though I can't really say it's a real competitor), Genius.com (which is comparable only in a very narrow sense, for songs/texts and nothing else), and Everipedia (which is the closest thing to a real competitor, with all Wikipedia content imported, but tiny in comparison to the last 2 sites above - Alexa 6k US vs Alexa top 100 for the other 2).

I would say out of everything I've found Everipedia comes closest in a valiant effort and I frequently contribute to it here and there, but at the same time, Wikipedia is just too dominant to see any real necessity to change how it is doing anything, whether that is for good or for bad. And my personal opinion is that maybe that is how it should stay too, given the size and scale that Wikipedia operates and its general continued success across most of its fronts. One thing is for sure: the world is definitely better with Wikipedia continuing onward even if "it has cancer."


I used to manage the IT needs of a cartography lab at a uni. One day I was curious how much web traffic the old website got, so I checked the IIS logs and saw we were getting over 200k hits a year. I was like, what the?

So I enabled Google Analytics to see where they were coming from. The source: one of our sub-pages on redlining was the 5th external link listed on Wikipedia for that topic.


This. It is worth it for people to figure out how to preserve truth and verifiability. Perhaps community guidelines are the state of the art, but how can software help prevent them from being gamed?

IMHO if Wikipedia loses that, it costs dearly not just to Wikipedia but to society.


I agree this is misplaced. There are so many things Wikipedia can try to fix now, and so many things they could invest in for the future. Wikidata for example. Lots of these things will fail and appear to be a waste of money of course, but that's called R&D. Growing businesses usually borrow money after all while Wikipedia is firmly in credit.

So long as there's a contingency plan for donations falling, whereby all the nonessential stuff can be dropped if necessary, there's no need to worry.

tldr; prepare for the worst, expect the best.


I was very actively involved in MediaWiki development & Wikimedia ops (less so though) in 2004-2006 back when IIRC there were just 1-4 paid Wikimedia employees.

It was a very different time, and the whole thing was run much more like a typical open source project.

I think the whole project has gone in completely the wrong direction since then. Wikipedia itself is still awesome, but what's not awesome is that the typical reader / contributor experience is pretty much the same as it was in 2005.

Moreover, because of the growing number of employees & need for revenue the foundation's main goal is still to host a centrally maintained site that must get your pageviews & donations directly.

The goal of Wikipedia should be to spread the content as far & wide as possible, the way OpenStreetMap operates is a much better model. Success should be measured as a function of how likely any given person is to see factually accurate content sourced from Wikipedia, and it shouldn't matter if they're viewing it on some third party site.

Instead it's still run as one massive monolithic website, and it's still hard to get any sort of machine readable data out of it. This IMO should have been the main focus of Wikimedia's development efforts.


> the foundation's main goal is still to host a centrally maintained site

The Wikimedia universe is way bigger than one site. There's Wikidata, Commons, Wikisource, Wiktionary, Wikivoyage, Wikibooks and so on. And there are a lot of language versions too - English is not the only way to store knowledge, you know.

> The goal of Wikipedia should be to spread the content as far & wide as possible

That requires a) creating the content and b) presenting the content in a form consumable by users. Creating tools for this is far from trivial, especially if you want it to be consumable and not just an unpalatable infodump accessible only to the most determined.

> Instead it's still run as one massive monolithic website

This is not accurate. A lot has changed since 2004. It's not one monolithic website; it's a constellation of dozens, if not hundreds, of communities. They use common technical infrastructure (datacenters, operations, etc.) and common software (MediaWiki, plus a myriad of extensions for specific projects), but they are separate sites and separate communities, united by the common goal of making knowledge available to everyone.

> it's still hard to get any sort of machine readable data out of it

Please check out Wikidata. This is literally the goal of the project. You can also be interested in "structured Commons" and "structured Wiktionary" projects, both in active development as we speak.

> This IMO should have been the main focus of Wikimedia's development efforts.

It is. One of the focuses - for a project of this size, there are always several directions. BTW, right now Wikimedia is in the process of formulating a movement strategy, with active community participation. You are welcome to contribute: https://meta.wikimedia.org/wiki/Strategy/Wikimedia_movement/...

Disclosure: working for WMF, not speaking for anybody but myself.


> The goal of Wikipedia should be to spread the content as far & wide as possible

>> The requires a) creating the content and b) presenting the content in the form consumable by the users. Creating tools for this is far from trivial, especially if you want it to be consumable and not just unpalatable infodump accessible only to the most determined.

Yes, and as emphasized in the article, WMF has done a terrible job at building better tools. For crying out loud, we are still typing in by hand the complete bibliographic information for each cited reference.

Your other comments are similar. The fact that "the WMF is trying", or has a named task force whose formal mission includes a complaint, is not enough to justify years of high spending.


> Yes, and as emphasized in the article, WMF has done a terrible job at building better tools.

I respectfully disagree. I think the WMF has done a pretty good job. Could it be better? Of course; everything could. Is it "terrible"? Not even close.

> For crying out loud, we are still typing in by hand the complete bibliographic information for each cited reference.

https://www.mediawiki.org/wiki/Citoid ? In any case "it misses my pet feature" and "the whole multi-year effort is terrible" are not exactly the same thing.

> is not enough justify years of high spending.

I think the work that has been done and is being done justifies it. All this work is publicly documented. If you think it's too much and you have ideas on how to do it better, you're welcome to comment. I cannot comment on your value judgements - you may think some projects are more valuable and should have been done; you are entitled to that view. There's a process which gets some things done and leaves some things out, and by its nature not everybody will be satisfied. I only want to correct completely factually false claims in the op-ed, and I believe I have done so. If I can help with more information, you are welcome to ask. As for value judgements, I think we'll have to agree to disagree here.


> In any case "it misses my pet feature" and "the whole multi-year effort is terrible" are not exactly the same thing.

It's clear from context that this is just an example. The issues with the Wikipedia editing UI are legion and described in many other places.

> You think it's too much and you have the ideas how to do it better - you're welcome to comment

Clean house. Put the people who built Zotero in charge.


> The issues with the Wikipedia editing UI are legion

Any existing UI can be analyzed to find a legion of issues; no UI is ever perfect, especially over time and with changing requirements. Wikipedia's UI is certainly not perfect, and much work remains to be done (and is being done), but I would stop well short of calling the work that was already done "terrible".

> Clean house. Put the people who built Zotero in charge.

Err, I am having a hard time making sense of this advice - why exactly must the people who built a piece of reference-management software be running the Wikimedia Foundation?


> I would stop very far from calling the work that was already done "terrible".

You already declared you weren't going to debate me on this point, so I don't know why you're bringing it up again, especially since you're not saying anything substantive.

> why exactly people who built a reference management software must be running Wikimedia Foundation?

Because they are a philanthropically funded non-profit who build great academic/research software on a small budget while responding rapidly to user feedback.

If your objections center around the fact that WMF does a lot more than develop Wikipedia software, then you are missing the whole point of this thread: that WMF's primary contribution is Wikipedia, and almost everything else is secondary. So long as it's being funded by private citizens because of the value they get from Wikipedia, then this should be the focus. Yes, that means the people running Wikipedia conferences and local meetups will have less power.


> If your objections center around the fact that WMF does a lot more than develop Wikipedia software

WMF does a lot more than develop one piece of software to manage citations, yes. Nothing wrong with the software; I'm sure the people who made it are awesome. But it's like discovering the US federal government didn't fix a faulty light on your street, and proposing that the electrician who did should thus be President of the USA. Nothing wrong with the electrician or fixing the light, and maybe he'd even be a great President, but that in no way follows from his ability to fix the light. They're just completely unrelated things.

> that WMF's primary contribution is Wikipedia, and almost everything else is secondary

Not so for some time. Also, Wikipedia as a project is way bigger than just software.

> So long as it's being funded by private citizens because of the value they get from Wikipedia, then this should be the focus.

It is. I mean the value and improving it (again, if we correct from "Wikipedia" to "wiki sites to gather and disseminate knowledge", which are more than just Wikipedia). But opinions on how to improve that value may amount to more than "improve this one particular feature".

> Yes, that means the people running Wikipedia conferences and local meetups will have less power.

Than who? And why? There are processes that decide which directions are prioritized and which are not. Right now two of them are happening as we speak - board elections and strategy consultation. Any decision that happens leaves somebody unsatisfied, because it's not possible to satisfy everyone. That doesn't mean everything is terrible, sorry.


> > that WMF's primary contribution is Wikipedia, and almost everything else is secondary

> Not so for some time. Also, Wikipedia as a project is way bigger than just software.

well my donation certainly is aimed that way and the insistent nagscreen certainly made me think "yeah I don't want this resource to go away"

and that is Wikipedia. I occasionally use some of the other wikimedia projects, but they should be secondary, it's definitely specifically that one great body of knowledge that got me to donate.

wiktionary is the project I use second most. if they were to beg for donations or else it may go away, I'd be like "eh"

it's Wikipedia only that got me "no wait this is super important, take my money" every year.


> But it's like discovering US federal government didn't solve a problem with a faulty light on your street...

I was really confused by this comment until I realized you thought I was suggesting Zotero run things because they power Citoid. In fact, as any of the people I eat lunch with can tell you, I have been singing the praises of Zotero for years.

http://blog.jessriedel.com/2014/11/12/zotero-is-great-tex-sh...

The fact that Citoid is very flawed, while the part of it that actually works is made by Zotero, was merely a delicious coincidence.

Your remaining comments then do a better job than I could possibly hope of illustrating exactly the pathological attitude that afflicts non-profits. The whole point of my criticism, which is partially shared by the OP and many others in this thread, is that the proper focus of the WMF is determined by the people who donate their money and, especially, their time. (That's a normative claim.) The fact that you responded to these points by saying "No, actually, we at the WMF have expanded well beyond such trifling concerns as the base functionality of Wikipedia" perfectly captures this destructive mindset.

> if we correct from Wikipedia to "Wiki sites to gather and disseminate knowledge", which are more that just Wikipedia

Incorrect. For instance, I use and love Wikivoyage, but I do not pretend that the millions of people who donate to Wikipedia intend to subsidize it! Yes, if Wikivoyage ends up better off through Wikipedia-financed improvements to the general Wikimedia software, all the better. But my friends should not be made to feel like Wikipedia will shut down if they don't donate yearly, just so the WMF can hold more conferences.

> But opinion on how to improve that value may not only be "improve this one particular feature".

Again, for the second time, the comment on the antiquated citation process was an illustrative example. I have resisted diving into the millions of issues with Wikipedia's software.

>> Yes, that means the people running Wikipedia conferences and local meetups will have less power.

> Than who?

Well, not "less power than other people", but "less power than they had before", i.e., fewer resources and less influence. (The WMF can simply get smaller, so that no one gets more power.) But for clarity, I'm happy to suggest that more institutional power within the WMF should be given to technical people, to (say) Zotero staff or other people from software non-profits with a better track record, and to anyone who internalizes the idea that the non-profit exists as a servant to the people who donate time and money.

> There are processes that decide which directions are prioritized and which are not.

Oh thank goodness! There are processes! Just like there are processes for new Wikipedians to dispute the deletion of content they write.

I guess so long as a nation is nominally democratic, we don't ever have to worry about it being badly run. And if anyone complains, we can just say they should vote and be satisfied. After all, we can't make everyone happy, so if people are unhappy there's no reason to worry about it!


>> that WMF's primary contribution is Wikipedia, and almost everything else is secondary

> Not so for some time.

Oh yeah? To me, as Johnny Q. Public, definitely so, now as always.

Could well be that you're doing Crom-knows-what too, nowadays -- but who cares? Why should we?

> Also, Wikipedia as a project is way bigger than just software.

Yup, you're right there: It's all about the knowledge, the actual _content_ of the on-line encyclopedia.

Which worked perfectly fine with the software of ca. 2004, so why waste millions and millions to, AFAICS, hardly any benefit at all compared to that?


> Which worked perfectly fine with the software of ca. 2004, so why waste millions

Because it's not 2004 anymore. What worked perfectly in 2004 (which btw it didn't, people complained back then no less than they do now), doesn't work that perfectly now. 10 years is a long time on the Internet, and the project has grown since then.


>the whole point of this thread: that WMF's primary contribution is Wikipedia, and almost everything else is secondary. So long as it's being funded by private citizens because of the value they get from Wikipedia, then this should be the focus.

I have to agree. I keep seeing pleas for donations on Wikipedia when I browse it, but now that I'm reading that they're spending most of that money on other bullshit besides Wikipedia itself, that means I no longer feel any need or duty to donate. I don't use all that other stuff, nor do I care about it, I only care about Wikipedia itself. Surely I'm not the only person who feels this way; anyone reading this article is going to see all the largesse that WMF is spending on, and many are going to question these donation pleas, which likely means donations are going to fall.


"One massive monolithic website" is, I think, meant to be read as referring to the WMF sites being a thing that have a shared telecommunications Single Point of Failure—a "choke point" where a given piece of information can only get from a given WMF site, to a user, by travelling through WMF-managed Internet infrastructure.

Remember Napster, back in the day? It was able to be shut down because it had an SPOF: Napster-the-corporation owned and maintained all the "supernodes" that formed the backbone of the network.

Or consider the Great Firewall of China. If the Great Firewall can block your site/content entirely with a single rule, you have an SPOF.

The answer to such problems isn't simple sharding-by-content-type into "communities" like you're talking about; this is still centralized, in the sense of "centralized allocation."

Instead, to answer such problems, you need true distribution. This can take the form of protocols allowing Wiki articles to be accessed and edited in a peer-to-peer fashion with no focal point that can be blocked; this can take the form of Wikipedia "apps" that are offline-first, such that you can "bring Wikipedia with you" to places where state actors would rather you don't have it; this can take the form of preloaded "Wikipedia mirror in a box" appliances (plus a syncing logistics solution, ala AWS Snowball) which can be used by local libraries in countries with little internet access to allow people there access to Wikipedia.


> WMF sites being a thing that have a shared telecommunications Single Point of Failure

In fact, one of the long-term projects in WMF is making sure the infrastructure is resistant to single-point-of-failure problems - up to a whole data center going down. We are pretty close to it (not sure if we're at 100%, but if not, close to it). Of course, if you consider the existence of WMF itself to be a point of failure, that's another question; by that logic the existence of Wikipedia can be treated as a single point of failure too. Anybody is welcome to create a new Wikipedia, but that's certainly not a point of criticism towards WMF.

> It was able to be shut down because it had an SPOF: Napster-the-corporation owned and maintained all the "supernodes"

WMF does not own the content or the code; both are openly licensed and extensively mirrored. WMF does own the hardware - I don't think there's a way to do anything about that, unless somebody wants to donate a data center :)

> If the Great Firewall can block your site/content entirely with a single rule, you have an SPOF.

True, though there are ways around it. We are currently witnessing this with Turkey blocking Wikipedia. See e.g. https://github.com/ipfs/distributed-wikipedia-mirror for ways around it.

> Instead, to answer such problems, you need true distribution.

I am skeptical about the possibility of making community work using "true distribution". Even though we have good means to distribute hardware and software, be it production or development code, we still do not have any ways to make a community without having gathering points. I won't say it is impossible. I'd say I have yet to see anybody having done it. But if somebody wants to try, all power to them. You can read more about Wikimedia discussions on the topic here: https://strategy.wikimedia.org/wiki/Proposal:Distributed_Wik...

> such that you can "bring Wikipedia with you"

That already exists, there are several offline wikipedia setups and projects: https://strategy.wikimedia.org/wiki/Offline/List_of_Offline_...

> this can take the form of preloaded "Wikipedia mirror in a box" appliances

We are pretty close to this - you can install a working MediaWiki setup very quickly (Vagrant, or I think there are some other containers too; I use Vagrant), and the dumps are there. It won't be a 100% copy of the live site, since there are some complex operational structures that ensure caching, high availability, etc. which are kinda hard to put into a box - they are public (mostly as Puppet recipes) but implementing them is not an out-of-the-box experience. But you can make a simple mirror with relatively low effort (probably several hours, excluding time to actually load the data - that depends on how beefy your hardware is :)

Most of this, btw, is made possible by the work of WMF Engineers :)


> We are pretty close to this ... [things you'd expect ops staff to do]

That doesn't come close at all, from the perspective of a librarian who wants a "copy of Wikipedia" for their library, no? It assumes a ton of IT knowledge, just from the point where you need to combine software with hardware with database dumps.

The average library staffer who'd want to set this up in some African village would sit less toward the "knows what to do with a VM image" end of the knowledge spectrum, and more toward the "can plug in and go through the configuration wizard for a NAS/router/streaming box" end.

Once I can tell such a person to buy some little box with a 4TB hard disk inside it, that you plug in, go to the URL printed on the top, and there Wikipedia is—and then it can keep itself up to date, with a combination of "large patches that get mailed on USB sticks that you plug in, wait, and then drop back into the mail", and critical quick updates to text content for WikiNews et al that it can manage to do using a 20kbps line that's only on for two hours per day—then you'll have something.


Have you tried Kiwix? For less than $100, you can install the full Wikipedia (with reduced-size graphics) on a cheap Android tablet with a 64GB card. The first-time installation is a little clumsy, but the experience once it's local is solid: http://www.kiwix.org/downloads/.

I don't think "critical updates" are really that necessary. Swapping SD cards a couple times a year would solve most of it. I think it's pretty incredible (and useful) to be able to have access to all that information for such a low cost, even if it's a few months (or even years) out of date.


> That doesn't come close at all, from the perspective of a librarian who wants a "copy of Wikipedia" for their library, no?

Depends what you mean by copy. If it's just a static data source, any offline project would do it. If it has to update, it's trickier, but some offline projects do that too. If you want to run a full clone of the Web's fifth most popular website, yes, it requires some effort. Sorry, no magic here :)

> "can plug in and go through the configuration wizard for a NAS/router/streaming box."

There are boxes that are integrated with one or another of the offline projects. There's also Wikipedia Zero - which, in a world where mobile coverage is becoming more and more widespread even in poor regions, may be an even better alternative.


Completely with you. The goal of Wikipedia should be to spread the content and allow more new content.

Sadly it seems the opposite is true: whole parts of Wikipedia are infested by cancer (aka corrupt, out-of-their-mind admins who are acting in their own world/turf and interest). Have a closer look at certain languages like de.wikipedia.org, where more new and old articles get deleted than the content can grow (source: various German news media incl. Heise.de reported about it).

And why is Wikia a thing? And why is it from the Wikipedia founder - does he have a double interest!? And now he is starting a news offspring as well, something like the Wikipedia frontpage and WikiNews, just under his own company. And on the other side, Wikipedia banned trivia sections to make the Wikia offspring even possible (happened 10 years ago, but you probably remember it; yet Wikipedia deleted/buried the trivia section history).

Why even delete non-corporate-spam articles? Why are fictional manga creatures all over Wikipedia, but info about movie characters all deleted? Many Wikipedia admins seem to be deletionists who care only about their turf; they care about "their own" articles and revert changes to them just for their own sake.

Look at the WikiData project. Why is it implemented by a German Wikipedia org that has little to do with the international Wikimedia Foundation? It's not a sister project, they do their own fundraiser, and media news has reported not-so-nice things about them over the years.

Look at the OpenStreetMap project; it works a lot better. Maybe the Wikipedia project should be transferred over to, or forked by, the OpenStreetMap project. And delete all admin rights and start over with the in-some-ways toxic community that scares away most normal people, who don't want to engage in childish turf wars and see their contributions deleted and cut down for no reason but admin power play.


> and it's still hard to get any sort of machine readable data out of it.

Huh? https://dumps.wikimedia.org/

Doesn't that qualify?


That gives you Wikitext encapsulated in XML. How do you get at the content of the Wikitext?

I work on a Wikitext parser [1]. So do many other people, in different ways. Wikitext syntax is horrible and it mixes content and presentation indiscriminately (for example, it contains most of HTML as a sub-syntax).

The problem is basically unsolvable, as the result of parsing a Wiki page is defined only by a complete implementation of MediaWiki (with all its recursively-evaluated template pages, PHP code, and Lua code), but if you run that whole stack what you get in the end is HTML -- just the presentation, not the content you presumably wanted.

So people solve various pieces of the problem instead, creating approximate parsers that oversimplify various situations to meet their needs.

One of these solutions is DBPedia [2], but if you use DBPedia you have to watch out for the parts that are false or misleading due to parse errors.

[1] https://github.com/LuminosoInsight/wikiparsec

[2] http://wiki.dbpedia.org/
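As a concrete taste of the "approximate parser" approach described above, here is a deliberately naive sketch (plain Python regexes, nothing like a full parser; the function name is my own, not from wikiparsec) that handles a few common Wikitext constructs and silently gets everything else wrong:

```python
import re

def strip_wikitext(text: str) -> str:
    """Very rough plain-text approximation of a Wikitext fragment.

    A real parser must handle recursively expanded templates, tables,
    parser functions, and embedded HTML; this sketch covers only a few
    common constructs.
    """
    # Drop {{templates}} (non-nested only -- real templates recurse).
    text = re.sub(r"\{\{[^{}]*\}\}", "", text)
    # Drop <ref>...</ref> footnotes.
    text = re.sub(r"<ref[^>]*>.*?</ref>", "", text, flags=re.DOTALL)
    # [[target|label]] -> label, [[target]] -> target.
    text = re.sub(r"\[\[(?:[^|\]]*\|)?([^\]]*)\]\]", r"\1", text)
    # '''bold''' and ''italic'' quote markers.
    text = re.sub(r"'{2,}", "", text)
    return text.strip()

sample = "'''Paris''' is the [[France|French]] capital.{{citation needed}}"
print(strip_wikitext(sample))  # Paris is the French capital.
```

Nested templates, tables, and parser functions defeat this immediately, which is exactly the point being made above.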


"That gives you Wikitext encapsulated in XML."

avar: "The goal of Wikipedia should be to spread the content as far & wide as possible, the way OpenStreetMap operates is a better model."

I am confused.

Doesn't OSM data come encapsulated in XML or some binary format?

As for dispersion of content, I could have sworn I have seen Wikipedia content on non-Wikipedia websites. Is there some restriction that prohibits this?

I have seen Wikipedia data offered in DNS TXT records as well.


For each article there is some metadata, but the entire text of an article is just a blob inside one XML element.

For anyone who has not worked with the Wikipedia data dumps extensively before, trust us that it is not easily machine-readable and that even solutions like DBPedia / Wikidata are not yet suitable for many purposes.
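To make the "blob inside one XML element" point concrete, here is a sketch of streaming pages out of dump-style XML with Python's stdlib. The inline sample is a toy stand-in for a real dump file, with the versioned export namespace omitted for brevity:

```python
import io
import xml.etree.ElementTree as ET

# Toy stand-in for a dump; real dumps run to many gigabytes and wrap
# everything in a namespace like http://www.mediawiki.org/xml/export-0.10/
dump = io.BytesIO(b"""<mediawiki>
  <page>
    <title>Example</title>
    <revision>
      <text>'''Example''' is a [[word]].</text>
    </revision>
  </page>
</mediawiki>""")

pages = []
# iterparse streams the file, so a full dump can be processed one
# <page> subtree at a time without loading it all into memory.
for event, elem in ET.iterparse(dump, events=("end",)):
    if elem.tag == "page":
        # The article text comes back as one opaque Wikitext string.
        pages.append((elem.findtext("title"), elem.findtext("revision/text")))
        elem.clear()  # free the subtree we just processed

print(pages)
```

Getting this far is easy; the hard part, as noted above, is what to do with the raw Wikitext string once you have it.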


As someone who contributes to many knowledge projects, including Wikipedia and Wikidata frequently, I'm curious about what you mean that Wikidata is not yet suitable for any purposes. Am I wasting my time contributing to it? I thought that it was helping a lot of machines understand data. Can you please explain further?


Please reread, for many purposes! I love Wikipedia.

The Wiki markup is extremely complicated and being user created, it is also inconsistent and error prone. I believe the MediaWiki parser itself is something like a single 5000 line PHP function! All of the alternate parsers I've tried are not perfect. There is a ton of information encoded in the semi-structured markup, but it's still not easy to turn that into actual structured data. That's where the problem lies.


> I believe the MediaWiki parser itself is something like a single 5000 line PHP function!

It's not. I'm on mobile so it's not easy to link, but the PHP version of the parser is nothing like a single function. There is also a Node.js version of the parser under active development, with the goal of replacing the PHP parser.


Thanks, I had heard that somewhere but stand corrected.


"... into actual structured data."

Would there be some particular structure that everyone would agree on?

Alternatively, what is the desired structure you want?

Because the current format is so messy, I just focus on what I believe is most important: titles and externallinks. IMO, often the most interesting content in an article is lifted from content found via the external links. I also would like to capture the talk pages. Maybe just the contributing usernames and IP addresses.

Opinions or explanations that have no supporting reference are inexpensive. One can always get these for free on the web. No problem recruiting "contributors" for that sort of "content".

Back to the question: I am curious what structure would you envision would be best for Wikipedia data? Assume hypothetically that a "perfect" parser has been written for you to do the transformation.


The structure I need for my particular project (ConceptNet) is:

* The definitions from each Wiktionary entry.

* The links between those definitions, whether they are explicit templated links in a Translation or Etymology section, or vaguer links such as words in double-brackets in the definition. (These links carry a lot of information, and they're why I started my own parser instead of using DBnary.)

* The relations being conveyed by those links. (Synonyms? Hypernyms? Part of the definition of the word?)

* The links should clarify the language of the word they are linking to. (This takes some heuristics and some guessing so far, because Wiktionary pages define every word in any language that uses the same string of letters, and often the link target doesn't specify the language.)

* The languages involved should be identified by BCP 47 language codes, not by their names, because names are ambiguous. (Every Wiktionary but the English one is good at this.)

There are probably analogous relations to be extracted from Wikipedia, but it seems like an even bigger task whose fruit is higher-hanging.

Don't get me wrong: Wiktionary is an amazing, world-changing source of multilingual knowledge. Wiktionary plus Games With A Purpose are most of the reason why ConceptNet works so well and is mopping the floor with word2vec. And that's why I'm so desperate to get at what the knowledge is.
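For what it's worth, the wishlist above could be captured in a record shape roughly like this (a sketch; the field names are my own invention, not ConceptNet's actual schema):

```python
from dataclasses import dataclass

@dataclass
class DefinitionLink:
    """One extracted link between Wiktionary definitions.

    Languages are BCP 47 codes ("en", "fr", ...) rather than names,
    since names like "Punjabi" are ambiguous.
    """
    source_word: str
    source_lang: str        # BCP 47 code of the defining entry
    target_word: str
    target_lang: str        # may be heuristically guessed (see above)
    relation: str           # e.g. "Synonym", "TranslationOf", "RelatedTo"
    guessed_lang: bool = False  # True when the link omitted the language

# A translation-table link from the French entry "chat" to English "cat".
link = DefinitionLink("chat", "fr", "cat", "en", "TranslationOf")
print(link)
```

The `guessed_lang` flag is one way to carry the "takes some heuristics and some guessing" caveat through to downstream consumers.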


I don't think you are using this in the way it was meant to be used. Wikipedia is a user edited, human centered project. Humans are error prone and that's something that you are going to have to live with if you want to re-purpose the data.

The burden of repurposing falls on you, and Wikipedia makes the exact same data that they have at their disposal available to you. To expect it in a more structured format, one that is usable by you and your project but goes beyond what Wikipedia needs in order to function, is asking for a bit much, I think.

They make the dumps available, they make the parser they use available, what more could you reasonably ask for that does not exceed the intended use case for Wikipedia?

Afaics any work they do that increases the burden on Wikipedia contributors that would make your life easier would be too much.

But since you are already so far along with this and you have your parser, what you could do is to re-release your own intermediary format dumps that would make the lives of other researchers easier.


Yeah, I understand that. I'm re-purposing the data and it's my job to decide how that works.

But this could be easier. What I hate about Wikimedia's format is templates. They are not very human-editable (try editing a template sometime; unless you're an absolute pro, you will break thousands of articles and be kindly asked never to do that again) and not very computer-parseable. They're just the first thing someone thought of that worked with MediaWiki's existing feature set and put the right thing on the screen.

Replacing templates with something better -- which would require a project-wide engineering effort -- could make things more accessible to everyone.

FWIW, I do make the intermediate results of my parser downloadable, although to consider them "released" would require documenting them. For example: [1]

[1] https://s3.amazonaws.com/conceptnet/precomputed-data/2016/wi...


Agreed, editing anything more complex than simple text, i.e. a table or a note, is a chore. And I'm an advanced user!


The GP said Wikidata isn't suitable for many purposes, different from any.

It's a nice agreed-upon vocabulary for linked data. But you still need the data that the vocabulary refers to. The information you can get without ever leaving the Wikidata representation is still too sparse.


He's saying that Wikipedia doesn't give you clean, usable data, it gives you data with weird markup everywhere.


Thanks for working on that! Didn't know it was so bad. The following is a possibly stupid idea, but I'd like to hear your thoughts:

What if you just render the content into HTML and then "screen scraped" the text, and then convert into a more useful format (MarkDown, JSON, etc). Is that plausible?
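A minimal version of that idea, using only Python's stdlib HTMLParser on an inline stand-in for rendered article HTML (a real page's markup is far messier):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping <script>/<style> -- the bare
    minimum of the "render, then scrape" approach."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self._skip_depth = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth and data.strip():
            self.chunks.append(data.strip())

# Inline stand-in for rendered output of a Wikipedia page.
html = "<p><b>Paris</b> is a city.</p><script>var x = 1;</script>"
parser = TextExtractor()
parser.feed(html)
print(" ".join(parser.chunks))  # Paris is a city.
```

This recovers readable text, but (as the reply notes) any UI or markup change upstream can silently break the extraction, and all the semi-structured information in templates is flattened away.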


That would allow a basic UI change on Wikipedia to break your code. Sometimes it is necessary, but not usually the best option in my experience, and it's pretty annoying to do.


Wikipedia used to do HTML dumps but stopped a long time ago, unfortunately.


You can get what amounts to an HTML dump (which is then indexed and compressed in a single huge archive) from Kiwix. Although they do them basically twice a year or so.


You should have a look at [1] that outputs an HTML rendering of pages with a lot of metadata.

[1] https://en.wikipedia.org/api/rest_v1/#!/Page_content/get_pag...


You could download the search indexes, also on the dumps site, that has the text content among other things.


Go click through those links. Most of them are hardly maintained. E.g. last static HTML dump was in 2008. Current enwiki raw data dump is in progress and reads:

"2017-05-07 23:24:34: enwiki (ID 13918) 103 pages (0.0|1.2/sec all|curr), 921000 revs (14.4|11.4/sec all|curr), 100.0%|100.0% prefetched (all|curr), ETA 2019-01-23 15:17:29 [max 779130995]"

There are real logistical challenges in making these dumps and making them _useful_. For all Wikimedia's spending, they have not invested sufficiently in this area.


No, not at all.

Years back Wikipedia released HTML dumps of the entire site, which was closer to providing the actual content of Wikipedia as structured data, but that was discontinued.


Random thought. Why can't something like Wikipedia be run distributed through a blockchain? Edits are just transactions that are broadcast over the network. I imagine the total cost of that to individual contributors of nodes would be less than the millions they're paying right now.

EDIT: Turns out someone is already working on it: https://lunyr.com/


Most of Wikipedia's money goes to pay for things besides hosting. Centralized hosting also happens to be more efficient than a decentralized version.


Saw this the other day. Someone working on making Wikipedia on MaidSafe

https://safenetforum.org/t/safe-drive-wikipedia-on-safe-tech...

https://github.com/loureirorg/wikirofs

Not sure how far along it is, but it looks interesting.


Don't they have MediaWiki APIs and dumps? Look at other wiki sites. Also there is Kiwix and various offline apps. Have you seen them?

Thoughts?

Also what's the difference between WikiData and DBPedia?


> Also what's the difference between WikiData and DBPedia?

Wikidata is a Wikimedia project with the aim of creating a structured knowledge base. It is mostly filled and curated by humans: https://www.wikidata.org

DBPedia is a knowledge base whose content is extracted from Wikipedia (mostly from the infoboxes). It is a project run by researchers: http://dbpedia.org


How would you quantify the success of spreading data as far and as wide as possible?


Reproducing the table from the article with one extra column, the ratio of expenses to revenue for clarity, it looks like they're still operating with a very comfortable margin. Yes, the 19% margin is tighter than a 50% margin 12 years ago, and their existence depends on donations now more than ever ($23,463/yr is sustainable by a single engineer's salary, $65,947,465/yr is...not), but Wikipedia and other Wikimedia projects also serve a much wider audience and broader purpose. This isn't scary in and of itself, especially if they've got cash reserves to give them time to tighten the belt later on if it becomes a problem and someone in a leadership position is monitoring their finances to act if their burn rate gets too high... I've seen plenty of nonprofits with tighter margins survive and succeed.

  Year      Revenue      Expenses      Net Assets    Expense Ratio (1-margin)
  2003/2004     $80,129       $23,463       $56,666   29%
  2004/2005    $379,088      $177,670      $268,084   47%
  2005/2006  $1,508,039      $791,907    $1,004,216   53%
  2006/2007  $2,734,909    $2,077,843    $1,658,282   76%
  2007/2008  $5,032,981    $3,540,724    $5,178,168   70%
  2008/2009  $8,658,006    $5,617,236    $8,231,767   65%
  2009/2010 $17,979,312   $10,266,793   $14,542,731   57%
  2010/2011 $24,785,092   $17,889,794   $24,192,144   72%
  2011/2012 $38,479,665   $29,260,652   $34,929,058   76%
  2012/2013 $48,635,408   $35,704,796   $45,189,124   73%
  2013/2014 $52,465,287   $45,900,745   $53,475,021   87%
  2014/2015 $75,797,223   $52,596,782   $77,820,298   69%
  2015/2016 $81,862,724   $65,947,465   $91,782,795   81%
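The extra column is easy to recheck from the Revenue and Expenses figures; e.g. in Python, spot-checking three of the rows above:

```python
# Recompute the "Expense Ratio" column from Revenue and Expenses
# (whole-dollar figures as quoted in the table above).
rows = [
    ("2003/2004", 80_129, 23_463),
    ("2004/2005", 379_088, 177_670),
    ("2015/2016", 81_862_724, 65_947_465),
]
for year, revenue, expenses in rows:
    ratio = expenses / revenue
    print(f"{year}: {ratio:.0%} spent, {1 - ratio:.0%} margin")
# 2003/2004: 29% spent, 71% margin
# 2004/2005: 47% spent, 53% margin
# 2015/2016: 81% spent, 19% margin
```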
How sure are we that these numbers are accurate, anyhow?


It seems the op-ed followed Wikimedia's Financial Statements [1]

Expenses (2016/2015)

                                 2016       2015
  Salaries and wages             31,713,961 26,049,224
  Awards and grants              11,354,612 4,522,689
  Internet hosting               2,069,572  1,997,521
  In-kind service expenses       1,065,523  235,570
  Donations processing expenses  3,604,682  2,484,765
  Professional service expenses  6,033,172  7,645,105
  Other operating expenses       4,777,203  4,449,764
  Travel and conferences         2,296,592  2,289,489
  Depreciation and amortization  2,720,835  2,656,103
  Special event expenses         311,313    266,552
  
  Total expenses                 65,947,465 52,596,782
--

Sources: [1]: https://upload.wikimedia.org/wikipedia/foundation/4/43/Wikim...


11 million in awards and grants seems like something that you can cut easily in bad times. Also, they are still generating more income than expenses, and the margin is big enough to adapt if there is a sudden decrease.


> it looks like they're still operating with a very comfortable margin.

That's not the essay's concern, the essay's concern is that expenses grow much faster than the site's load.


I thought the essay's concern was that funding is growing too fast, allowing expenses to expand to fill the gap in a way that is unsustainable (because eventually the funding growth must come to an end).


The biggest line item in the expenses is salaries. I wonder how much of the growing expenses is linked to growing developer salaries.


Administrator since 2003 here. I have contributed to Wikipedia in various languages, Wikimedia Commons, Wikibooks, Wiktionary, Wikisource, etc. Three core points, particularly on Wikipedia:

(1) Bad experiences for new and established contributors mean less motivated contributors. This is due to factors such as too much bureaucracy, too many subjective guidelines, too much content being deleted (exclusionism), and an overwhelming mess of projects and policies.

(2) Not enough focus. By starting many projects the foundation has muddied its mission and public identity. In addition, it has broad and potentially mutually conflicting goals such as educating the public about various issues, educating the public about how to work with others to contribute to projects, asking the public for money, agitating governments and corporations for policy change and support, monitoring public behavior looking for evidence of wrongdoing, and engaging with education. Why not leave education to the educationalists, politics to the politicians, spying to the government and motivated contributors and fundraising to donors?

(3) Non-free academic media is hurting the project. Given that only a small number of editors have true access to major academic databases, it is often hard for contributors to equally and fairly balance an article.

Having said that, I still have tremendous respect for the project and comparing its costs to those of the prior systems necessarily incorporating manual preparation, editing, production and distribution of printed matter by 'experts', the opportunity costs for access alone justify the full expenditure. It's not a lot of money in global terms.


> Bad experiences for new and established contributors mean less motivated contributors.

This has nothing to do with the financials of the foundation and is completely a community issue.

> Not enough focus.

This is a valid point, but I think you're being too scorched-earth with it, like saying that Wikipedia shouldn't do any political outreach at all. If its millions of viewers hadn't seen the SOPA blackout, would it have been as successful? If it didn't fight for freedom of panorama and other copyright issues, would it be able to exist in the same form as now? Your suggestion is like telling Japan to go back to isolationism. Sure, it might work if you're self-sustaining, but it's no way to run a global project.

> Non-free academic media is hurting the project.

If you are part of a university, you likely have access to such media. Many public libraries also have such access. Lastly, there's the Wikipedia Library. [0] I'm not sure what you want Wikipedia to do here past what it's already doing.

[0]: https://en.wikipedia.org/wiki/Wikipedia:The_Wikipedia_Librar...


>> Bad experiences for new and established contributors mean less motivated contributors.

> This has nothing to do with the financials of the foundation and is completely a community issue.

I do not contribute financially to wikipedia, despite being very interested in doing so, because of this issue.

I am sick and tired of seeing large amounts of properly formatted, well formed articles, written in good faith, deleted by the little hitlers protecting their precious wikipedia turf.

This "community issue" costs wikipedia several thousand of my dollars per year. I wonder how many other people decline to support them financially due to these "community issues" ?


I don't normally post "me too" on HN, but I feel I should here.

I don't contribute to Wikipedia anymore, because the editorial policies don't agree with my views on how an internet encyclopedia should be run.


You do know that each Wikipedia is community-run, right? You'd rather the foundation take away the autonomy of each wiki and force its own standards? That goes completely against what the project was started for and would quickly alienate the userbase. If you don't believe me, just look at the backlash behind the media viewer (search for "superprotect"), the visual editor, and flagged revisions. You cannot dictate rules to a community-established project; each wiki has its own culture and precedents. Sure, the foundation could go all dictator, but that would be the fastest way to cause a fork and destroy the goodwill of volunteers. If you don't like the community, either whine like you are doing now, or go in there and contribute. You're being no different than someone complaining that their local beach is full of trash while doing nothing to help out, or - if your username has any relevance - someone who wants a feature/bug-fix in an open-source project but won't make any pull requests.


> This has nothing to do with the financials of the foundation and is completely a community issue.

I don't agree. Choices in how the foundation spends its money can amplify or diminish these concerns.

E.g. WMF has spent extensively to try to bring in a wider pool of borderline editors, rather than investing as much in infrastructure to soften the learning curve and retain and boost the participation of middle-tier editors. This exacerbates an us-vs-them siege mentality... and the overuse of blunt tools to stem a rising tide of low-quality edits comes at the cost of a poor experience for new contributors.

An example I'd cite for this is the extreme investment in "visual editing" -- which only barely manages not to mangle pages when it's used -- over things like syntax highlighting.

Not all the blame in these areas falls on the WMF, for sure. As an example, enwiki community factionalization around deletion blocked the deployment of revision flagging (basically support for having 'release' versions of articles, so that non-contributors are not constantly subjected to the very latest unreviewed revision of an article), which would have allowed radically less aggressive edit patrolling.


I am the author of this op-ed, which I will prove by posting a comment on my Wikipedia talk page [ https://en.wikipedia.org/wiki/User_talk:Guy_Macon#Hacker_New... ] before saving this. I am open to any questions, criticism, or discussion. BTW, as I noted in the op-ed. At the request of the editors of The Signpost, the version linked to at the top of this thread has fewer citations and less information in the graphic. The original version is at [ https://en.wikipedia.org/wiki/User:Guy_Macon/Wikipedia_has_C... ]


The key financial metric I'd look at is "months to cash out". Basically someone reasonable needs to decide "if no other money comes in to this organization, how long should it need to operate?"

From there you can get more specific on what "operate" means (i.e. will layoffs occur before scaling back hosting costs).

Has there been any such analysis?


(I am the author of the op-ed. A better version is at [ https://en.wikipedia.org/wiki/User:Guy_Macon/Wikipedia_has_C... ].)

The question as I understand it is: "if no other money comes in, how long could Wikipedia operate?"

If you assume (and I do) that the Wikimedia foundation (WMF) would keep on the spending path they are on, it would take a year plus or minus a few months to go completely broke. If they were to immediately respond with massive spending cuts they could last a lot longer.

The reason I don't believe that the WMF will react to a revenue decrease with spending cuts is because they really, really, believe that everything they are doing and everything they have planned is absolutely essential. Plus, it is human nature to say "this is temporary. The revenue will go back to increasing next year", all the while greatly expanding the fundraising appeals.


Have you compared the growth of costs with similar sized organizations?


What was 81 million dollars spent on in 2016?

Employee costs have grown something like 300x. How many employees are there? What do they work on?


His argument seems to boil down to "growth must be cancer" and "Wikipedia/WMF shouldn't have expanded its scope", with the conclusion "this must fail". But don't most organizations do that? Are non-profits not allowed to? Otherwise I would also like some more specific criticism about how money is wasted.


He was pointing out that back then they had 1 employee, and now the foundation has 300-ish employees.


Wikimedia publishes independently-audited financial statements. Here's the latest one. https://upload.wikimedia.org/wikipedia/foundation/4/43/Wikim...

It's clear that salaries and awards and grants are driving the increase in cost. Maybe this is damning evidence of a decadent culture, which the author of this op-ed clearly presumes, but I doubt it. I would expect that Wikipedia's employees have been working very hard for a long time to keep the site running and they've cultivated expertise in governing the site in a way that avoids controversy and maintains credibility. I'd rather Wikipedia spend to retain long-tenured experts who have paid their dues than be an underpaid-college-graduate-mill like so many non-profits are. It seems that they've done that, and they've waited until the organization was financially stable to do so.


When I say "I want to know where Wikimedia is spending its money", I don't mean "is it on people or on bandwidth or on equipment?"; I mean "is it on Wikipedia or Wiktionary? how much money did they burn on the finally-launched WYSIWYG editor that their own research shows is barely used and solves the wrong problem? how little time is being spent figuring out how to handle a world with decreasingly reliable second-party sources, given their adamant refusal to allow reliance on first-party material? do they have any resources at all dedicated to dealing with deletionism?".

I do not care if the people there are being paid a million dollars a year: I want to know their time is being used in ways that make sense, and as far as I can tell almost none of their resources are being spent on anything that seems to actually matter. If they explained "actually, we added an automated model for verifying the value of an edit that our metrics have shown decreased the amount of time moderators have to spend watching the site while having minimal effects on new user retention, a project which used twenty engineers for five years to build", I'd at least shrug and go "huh, OK"; but as of right now I am not seeing it.

It isn't that they overpay their staff; it is that they fundamentally don't have anything useful to do with staff, yet seem to keep growing their staff and allocating them towards dumb things while telling everyone that if they stop donating to this cancerous staff growth the site will go offline -- a situation for which I simply can't attach enough modifiers to the word "lie" to express the level of active deception at play.


Exactly. There's this folk belief that the main risk with non-profits is that they will pay themselves above-market salaries or otherwise embezzle money in outright fraud. And when people criticize the non-profit for inefficiency, they often defend themselves by saying "Look! The salaries for our legions of workers are market rate and we have all these noble sounding projects."

But this is a red herring, because outright fraud is relatively rare. Rather, the much bigger issue is a terribly managed organization spending resources ineffectively. Non-profits shouldn't be judged on overhead or executive salary (who cares?), they should be judged on what they accomplish for the amount they spend. And WMF does terribly on this metric.


Great comments.

"spending resources ineffectively"

Often staff are taken on to fill vacancies without much regard for skill levels. The marginal value of extra employees falls and can dip into the negative. This is the sort of ineffective spending which is invisible to all but their closest colleagues -- who have too little political capital to do anything about it.


Where I feel a lot of the animosity from the Wikipedia community stems from is that the people who have "cultivated expertise in governing" are actually Wikipedia volunteers, not WMF employees.


I'd add: Let's not get caught in the trap of looking at good-paying jobs as a problem. Wikipedia employees shouldn't be expected to work for next-to-nothing and to make great sacrifices for the rest of us, which is what many open source leaders and contributors must do (a bad thing). Why shouldn't we pay them well?


That's a red herring; see jessriedel's comment above.


> ...I have never seen any evidence that the WMF has been following standard software engineering principles that were well-known when Mythical Man-Month was first published in 1975. If they had, we would be seeing things like requirements documents and schedules with measurable milestones.

This part of the critique seems a little off, doesn't it? I don't know the state of WMF engineering, it very well may have problems, and a complete lack of documentation or planning is not a good sign, but the particular artifacts (requirements documents, schedules with milestones) mentioned here are more from the pre-Agile waterfall school of thought. Can anyone familiar with WMF engineering comment?


[Former product manager at the Wikimedia Foundation and longtime Wikipedia editor/admin here.]

The author of the op-ed is a devoted editor but seems almost totally ignorant of how development is conducted. The Foundation has been doing transparent quarterly/yearly roadmap planning alongside its annual plan / budget cycle (all of which is shared publicly). On a shorter timeframe, they are pretty serious about Agile/Scrum. You can see on https://wikimediafoundation.org/wiki/Staff_and_contractors that today they even have a team of half a dozen full-time Scrum masters. If what he thinks is missing is serious, detailed planning, he's sorely wrong.

The platform (MediaWiki) is still a FOSS community so you can find project requirements docs/roadmaps all over mediawiki.org, all the bugs on Phabricator, follow along on mailing lists, and even see commits on their Gerrit instance.

Agile isn't my cup of tea personally and I could grok criticisms of the organization's software output (ignoring the fact that they're buried under 10+ years of technical debt...), but it takes minimal digging to find all their plans and timelines. I would venture that the author chose not to dig into this because he, like a lot of entrenched old school editors, viscerally hates some of the past attempts to make MediaWiki a modern collaboration platform, such as building a WYSIWYG editor and a threaded discussion system to replace wiki talk pages.


(I am the author of the op-ed. A better version is at [ https://en.wikipedia.org/wiki/User:Guy_Macon/Wikipedia_has_C... ].)

I am very familiar with Agile and Scrum, and I have seen the advantages over older paradigms such as waterfall. That being said, there are certain basic principles that the old methods and the new methods have in common. One such principle is the basic idea of having some sort of contact with the people who will be using the finished software and understanding their needs. The WMF does not do that. Instead, they build something in secret, throw it over the wall, and watch as the Wikipedia editors reject it as the steaming pile of crap that it is. They have done this again and again. Visual Editor. Flow. Mobile App. Knowledge Engine. All failures. All built without any input from the people who would be using them.

Now I KNOW that the developers are not stupid or ignorant, and I have checked as best I can and it appears that every one of them was able to create high quality software that meets the customer's needs when they were working other places. That leaves me with management as the probable culprit. And I don't think that the problem is product managers like the author of the post above this one. I think the blame is at the very top.

I would advise anyone who really wishes to understand these issues to at least read the pages linked to on my [ https://en.wikipedia.org/wiki/User:Guy_Macon/Wikipedia_has_C... ] page, especially [ http://mollywhite.net/wikimedia-timeline/ ]

Finally, if it really "takes minimal digging to find all their plans and timelines", I would like to see this demonstrated by providing links to the plans and timelines for the Knowledge Engine.


Where can one find out what percentage of Wikimedia's developer staff resources (as opposed to open source contributors) are being allocated towards what projects? They are spending $31m this year on staff: what percentage of that is being spent on what kinds of staff (e.g. software engineer vs. community manager), and what percentage of their developer staff is being used to build these aforementioned projects? If that number is extremely low then you can just discount that issue, but if that number is enormous then more questions have to be asked (which would of course involve looking at the success metrics on those projects and what validation was done on them while they were designed). As it stands, Wikimedia is constantly asking for more money using the threat that Wikipedia will collapse, when for all we know most of their staff time is off building stuff like Wiktionary.


Their Annual Plan with spending breakdown is published every year on wikimediafoundation.org. The draft for the upcoming year is https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Annual_...

TL;DR: the largest chunk of the budget goes to the two departments that do engineering/design/PM/data science.

On your last point ("for all we know most of their staff time is off building stuff like Wiktionary") it's actually a big gripe in the smaller communities that probably 90% of the time and attention goes to Wikipedia.


Ok, so from this I see that $20m/year is going towards staff for "product" and "technology"; but there is no breakdown on what that is being spent towards. The point I was making is that if we knew the percentage of effort going towards engineering and multiplied it by the percentage of time being allocated towards some targeted projects and that value was low, then it would not be relevant... but we only have the first number, and that number is high enough that we have to be concerned about the second number. Spending $20m for a year of engineering effort is a ridiculously large number for a website that fundamentally does as little as Wikipedia does... what was shown for it, and what percentage of that can be allocated towards each outcome?


An Agile approach doesn't mean that there are no requirements documents and no milestones. You're still supposed to write requirements in some form (e.g. user stories and test cases) and plan a few months ahead (while being ready to correct your course based on user feedback after every sprint).


> and a complete lack of documentation or planning is not a good sign

Neither is it true. There is documentation and planning. As with pretty much every software project I've seen over my career, the documentation could use some TLC (and unlike many other pieces of software, anybody can actually help with it[1]), but "complete lack" is not an accurate description. There's a lot of documentation, though some areas are covered less than others - MediaWiki is a big piece of software and a long-term, organically grown project, and if you have any experience with those, you know what that means. It is a known issue and regular effort is made to improve it.

Same for planning - not exactly ideal, but "complete lack" is very, very far from the truth - moreover, unlike many other organizations, all the plans and all the internal workboards are public[2] (excluding security issues and sensitive information), so you can check for yourself.

> Can anyone familiar with WMF engineering comment?

Yes, I am familiar with it by virtue of being part of it (still not speaking for anyone but myself, off-the-clock, in a completely personal capacity :) and I say this claim is completely false. Moreover, it is so obviously false and so easily disproven by public documents[3] that I wonder how one could publish it in public media without bothering to do minimal due diligence. I mean, we all panic about "fake news" and stuff - shouldn't that make us at least minimally try to check our claims with an easy search, or a question on mailing lists with dozens of people who could point out where the appropriate documents are? The author of the article seemingly believes it is unnecessary. I do respect his long-time contribution to Wikipedia (much more sizeable than mine) but that still does not entitle him to his own facts.

Looks like he disagrees with some of the projects Wikimedia took on - like making user experience more friendly with Visual Editor and mobile support, both IMO excellent projects, but everybody is entitled to their own take on this. It is fine. What's not fine is claiming that not agreeing with him is equivalent to not having direction at all and wasting money and being cancer. That's way too far and completely untrue.

[1] https://www.mediawiki.org/wiki/MediaWiki_Documentation_Day_2... [2] https://phabricator.wikimedia.org/ [3] https://www.mediawiki.org/wiki/Wikimedia_Engineering/2016-17...


That's the point; he's arguing that WMF engineering practices are so disorganized that not only don't they qualify as agile, they don't even qualify as waterfall (which predates agile by several decades).

Waterfall methodologies are deeply un-hip today, of course, but when they first coalesced they were a big improvement over what came before them, which was essentially nothing: an absence of any formal project management methodologies, with people cobbling together projects from bits and pieces of expertise learned in other disciplines.

(Note that I have no idea how WMF's software engineering practices work, so I have no idea if this assertion is accurate or not. I'm just trying to clarify what I think Macon is arguing here.)


You are correct. I am arguing that WMF engineering practices are so disorganized that not only don't they qualify as agile, they don't even qualify as waterfall (which predates agile by several decades). In particular, I am part of the community of Wikipedia editors. Nobody asked us what Flow or Knowledge Engine should look like. That's part of Waterfall AND Agile. Yes, a dedicated person can go to the developers' separate wiki and mailing lists, but on Wikipedia itself there is zero evidence that the principles I see at [ http://agilemanifesto.org/principles.html ] are in play.

Again, the developers and their managers know this. I am convinced that they have been told in no uncertain terms that they will be fired if they interact with the Wikipedia community.

(I am the author of the op-ed. A better version is at [ https://en.wikipedia.org/wiki/User:Guy_Macon/Wikipedia_has_C... ].)


"Waterfall methodology" is the ultimate straw man.


It's hilarious. It has only existed as a thing to criticize, and the term itself actually originates in a paper describing why it's broken. No one has ever advocated for the "waterfall" approach.

The ultimate straw man indeed :)


> It has only existed as a thing to critcize

No, this is false.

> and the term itself actually originates in a paper describing why it's broken.

So does the term "capitalism". Like capitalism, though, the waterfall method was a thing actually in wide use both before (the first paper describing its use was about 20 years earlier than the critical one in which the term seems to have first been used) and after (it's been mandated by many institutions, particularly in government, even after that critical paper) being named in criticism.

> No one has ever advocated for the "waterfall" approach.

Actually, a number of large organizations, particularly governments, to this day mandate processes for software development projects, particularly large projects, that embody essentially the key features of the waterfall method, most critically that of doing full analysis across the whole scope before beginning development (often, in government, before getting approval for funding to open up contracting for the actual development work). A lot of the contractors involved advertise that they use agile methods, but it often ends up being a kind of Scrum-within-waterfall monstrosity that manages to preserve the worst features of both.


> No, this is false.

Point me to someone espousing the benefits of the waterfall approach, please.

> So does the term "capitalism". Like capitalism, though, the waterfall method was a thing actually in wide use both before (the first paper describing it's use was about 20 years earlier than the critical one in which the term seems to have been first used) and after (it's been mandated by many institutions, particularly in government, even after that critical paper) being names in criticism.

I'm not saying that no one has ever tried building software this way. But the term and "methodology" are literally the collection of broken processes associated with early development.

> Actually, a number of large organizations, particularly governments, to this day mandate processes for software development projects, particularly large projects, that embody essentially the key features of the waterfall method, most critically that of doing full analysis across the whole scope before beginning development (often, in government, before getting approval for funding to open up contracting for the actual development work.)

How do you go to tender without reasonably complete requirements?

The problem here isn't the development methodology, it's the fact that going to tender for development basically forces you into this position. Governments seem to be moving to in house development to solve this problem, but I'd hardly say that the original requirements gathering was the result of anyone advocating for the waterfall approach.


> No one has ever advocated for the "waterfall" approach.

I wish. I have actually worked under someone seriously putting it forward. (First IT job, I had no idea how bogus this was.)


It just... really bothers me that Wikipedia has grown into this massive thing, with $60 million in cash reserves and $31 million in salaries a year... and the people who aren't getting paid are the ones actually writing an encyclopedia. For that kind of money, you'd think they could actually pay people to write an encyclopedia, like Britannica used to. Now Britannica is circling the drain, Wikipedia is raking in money, and instead of paying the writers, there's this whole bureaucracy slurping up the cash and not giving it to the people doing the actual work. I hate all this digital sharecropping. I hate all these businesses based on paying millions of amateurs nothing or next to nothing for large volumes of low quality labor, making it up on volume, and paying a handful of people large sums of money to "administer" it. You'd think for that kind of money you could pay some writers.


$60 million in spending, but that's for a top-5 site in the world, serving knowledge to billions of people; it's a phenomenon of modern society. Is that really so much? Of course they could probably spend less, maybe $20 million, if they were perfectly effective. But who is perfectly effective all the time? Sure, it's necessary to keep an eye on effectiveness and try to improve it, to increase transparency, to evaluate what all those people are doing, etc. But I wouldn't say the situation is awful or anything. The original article uses a lot of imagination and projection.

And I think there's a big problem with the idea of paying writers. The moment you start paying, your writers are no longer just people who want to help and share their knowledge out of goodness; they (or some part of them, which you can't distinguish) also become people who want money. And it's hard to manage: people can start bickering over who got how much, etc. You might end up with thousands of people willing to write anything just to get some money, and some good writers leaving because they don't want to compete with money grabbers.


> You might end up with thousands of people willing to write anything just to get some money and some good writers leaving because they don't want to compete with money grabbers.

Okay, but why isn't programming held to the same standard? Why do they have, what, four scrum coaches on payroll? Can't they rely on scrum coaches who want to help and share their knowledge because of goodness, too?


I think they are, to a degree - MediaWiki is open source and devs can contribute. I guess scrum coaches or other kinds of management are harder to use this way; you still need a core team working full-time, but I don't know what they are doing.


Exactly.

Even if they didn't pay writers (which might create its own problems), they could do far more to create and publicise a user-friendly scheme to provide volunteers with access to paywalled sources, digitised books etc. They could spend some of those tens of millions on that. But there too they rely on begging and volunteer labour:

https://meta.wikimedia.org/wiki/The_Wikipedia_Library

It speaks volumes that even this rudimentary project, the Wikipedia Library, was originally initiated by a volunteer - for ten years, nobody at WMF seems to have had the brains to think of it. Supporting the broad mass of volunteers (rather than a few snouts in the trough) has never been near the top of the WMF list of priorities. I suspect the volunteers are largely regarded as amazingly convenient and useful idiots.

The lion's share of WMF money has always been spent on software engineering, much of it done by former volunteers who sucked up enough to first land a WMF job and then prove their engineering incompetence (as in the VisualEditor debacle):

http://wikipediocracy.com/2014/09/21/wikipedia-keeping-it-fr... https://www.theregister.co.uk/2013/09/25/wikipedia_peasants_...


This is the reason I haven't contributed in years: there's a bunch of parasites leeching off the work of other people, somewhat resembling a charity organization where only a minimum of the donated money actually goes to those in need.

The only thing I want from Wikipedia is Wikipedia: no awards, grants, and other nonsense to launder money to friends who don't want to work. Cut those 300 employees down to the 30 actually doing work on Wikipedia, and maybe one day I'll contribute again, either by editing or by donating money. Feeding a bunch of parasites doesn't seem like a healthy long-term solution.


It's the digital version of college football except nobody ever makes it to the pros.


This op-ed is nonsensical. According to the author, every successful startup in history is "cancer". Wikipedia's costs have grown because its usage has grown exponentially (comparing costs to economy-wide inflation is particularly baffling).

If anything, I got from this article that Wikipedia has kept costs well below revenue growth, which is normally the sign of a healthy organization.


If hosting cost per hit has increased, something is wrong. Computer costs have gone down since 2005.

That's worth looking into. Wikipedia hasn't gone down the Web 2.0 Javascript/CSS rathole, where every page loads megabytes of vaguely useful junk. What's the problem?


And thank heavens for that. Imagine Medium-style resource bloat every time you want to read about how Rasputin died.


Improved high availability and geographic dispersion. They're running in multiple datacenters across the globe now.


The OP is complaining about personnel and fundraising costs rather than actual hosting costs.


> If hosting cost per hit has increased, something is wrong. Computer costs have gone down since 2005.

Images are dramatically larger, both in the raw size out of cameras and the resolutions people are willing to put on pages (including higher-DPI screens).

Video's likely a lot more prevalent now, too.

I dunno what they're paying for bandwidth, but AWS S3 has barely dropped per-GB bandwidth costs since its release in 2006.


>Images are dramatically larger, both in the raw size out of cameras and the resolutions people are willing to put on pages (including higher-DPI screens).

That shouldn't be a big problem. Screen resolutions have not increased significantly in years (most desktops and laptops are still stuck with 1920x1080), so there's no reason for larger images at all. It should be trivial to write the software so that it auto-scales the image down to an appropriate size for web display on modern screens (which is small, since most Wikipedia images are pretty small within the text, just like any reference work), while also providing a clickable link to allow seeing the image at full resolution. Very few visitors are going to view any image in an article at full resolution, let alone all the images in the article, so the only thing growing should be the image sizes of those optional full-size images. Even there, the software can compress the raw images down to well-compressed JPEGs; it's not Wikipedia's job to store unedited, low or no-compression raw images; an 80% quality JPEG is sufficient.

Same goes for videos; that stuff can be compressed, scaled, recoded to more efficient codecs, etc.
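The auto-scaling the parent comment describes is essentially aspect-ratio math plus re-encoding. A minimal sketch of the sizing logic (a hypothetical helper, not MediaWiki's actual thumbnailing code):

```python
def fit_within(width, height, max_width=1920, max_height=1080):
    """Scale (width, height) down to fit a bounding box, preserving aspect ratio.

    Returns the original dimensions unchanged if they already fit,
    so images are never upscaled.
    """
    scale = min(max_width / width, max_height / height, 1.0)
    return round(width * scale), round(height * scale)

# A 24-megapixel camera image (6000x4000) targeted at a 1920x1080 screen:
print(fit_within(6000, 4000))  # -> (1620, 1080)
```

In practice the server would then re-encode the scaled copy (e.g. a JPEG at ~80% quality, as the comment suggests) and serve the full-resolution original only behind a click-through link, so the bulk of traffic is the small derivative.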


In 2005, Wikipedia co-founder and Wikimedia Foundation founder Jimmy Wales told a TED audience:

> So, we're doing around 1.4 billion page views monthly. So, it's really gotten to be a huge thing. And everything is managed by the volunteers and the total monthly cost for our bandwidth is about US$5,000, and that's essentially our main cost. We could actually do without the employee … We actually hired Brion because he was working part-time for two years and full-time at Wikipedia so we actually hired him so he could get a life and go to the movies sometimes.

According to the WMF, Wikipedia (in all language editions) now receives 16 billion page views per month. The WMF spends roughly US$2 million a year on Internet hosting and employs some 300 staff. The modern Wikipedia serves 11–12 times as many page views as it did in 2005, but the WMF is spending 33 times as much on hosting, has about 300 times as many employees, and is spending 1,250 times as much overall. WMF's spending has gone up by 85% over the past three years.
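The multiples above follow from simple arithmetic on the quoted figures (a back-of-the-envelope check using the rounded numbers in the comment):

```python
# Figures quoted in the comment (approximate).
views_2005 = 1.4e9         # page views per month, 2005
views_now = 16e9           # page views per month, now
hosting_2005 = 5_000 * 12  # ~$5k/month bandwidth in 2005, annualized
hosting_now = 2_000_000    # ~$2M/year on Internet hosting now

print(round(views_now / views_2005, 1))   # -> 11.4 (traffic multiple)
print(round(hosting_now / hosting_2005))  # -> 33 (hosting-spend multiple)
```

So hosting spend has grown roughly three times faster than traffic, which is the gap the comment is pointing at.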


No real reason to think that the costs scale linearly. Maybe you need 300 employees to maintain the site at that scale?


> Maybe you need 300 employees to maintain the site at that scale?

Should you? I mean, my day job is making it so you expressly don't. If your costs are scaling even linearly I would say you're doing something wrong. The point of scaling is to reduce costs--economies of scale are why you scale. And a user-editable encyclopedia and PHP application are not really good arguments for diseconomies.


But you can't reach perfect efficiency. Scaling will of course reduce costs in general, but it's not a given. If I had an app that had 5 users, my costs would increase if my user number grew to 5,000,000,000.

Without knowing more information about the financials or how resources are allocated, this is all conjecture. But a website that serves 17 billion page views/month is going to cost a lot of money to run. They could be spending their money very poorly, idk, but I also don't know whether or not what they are spending is an appropriate amount.


We aren't talking about "perfect efficiency". We're talking about not blowing up your costs. A website that serves seventeen billion pageviews per month of mostly cacheable and edge-serviced data is, while certainly a technical challenge, a very surmountable one. And a lot of the harder parts are blunted through Wikipedia's situation. Search, for example, is a difficult problem in those situations--but I'd bet money that most of those searches are coming from Google, which mitigates a large chunk of the demands on in-house search that a different kind of website might see. (I've used Wikipedia's internal search once this year, according to my browser history.)

Point to the diseconomies of scale and we can talk about them, but everybody else has figured out how to leverage economies of scale when building out a large technical system.


> Scaling will of course reduce costs in general, but it's not a given.

Not only does it seem costs haven't been reduced, their rate of increase has exceeded the growth in pages served, quite substantially. That's not the whole story I'm sure, but as a rough estimate that doesn't seem sustainable or healthy... or necessary simply in terms of general hosting cost declines over that period.


To just maintain the site - I mean, making sure every HTTP request (or a reasonable fraction of them) is answered, Apache does not crash, logs are rotated, and backups are performed - no, you don't need 300 people. If you froze Wikipedia in 2005 and never wanted to improve anything, never opened a new project, never made a local chapter or an editathon, never supported a new language, etc., you could probably do with 25-30 people.

But that won't be a live project. That would be a fossil that would slowly wither and die, as it becomes less and less relevant and more and more inadequate to the needs of the current user. In 2005, the iPhone didn't exist; now everybody has a smartphone. Should we somehow account for that or just ignore it? How about knowledge graphs and linked data and all the AI developments - should all Wikipedia knowledge still be text-only, ignoring the whole Linked Data universe? How about supporting thousands of existing languages - should we just dump them in their own domains, or should we help them with automatic translations, article templating, language-sensitive searches and so on? How about creating richer media like maps, diagrams, graphs, video and audio content - should we help with this, or should we be content with just inserting links to outside content? And that's only a minuscule part of the questions we can ask about things that have changed and developed in the last decade.

The point here is that the Wiki universe is a big and complex, live, active project (or set of projects), with very many facets, and reducing it to the technical maintenance of one webserver site - even one that gets tons of traffic - is not a good idea. The goal of the movement, as I understand it, is not "make sure en.wikipedia.org does not crash"; it is "make the sum of all human knowledge available to everyone". It's a big mission, and it requires people to achieve it.


I'd expect sublinear growth for the kind of hosting Wikipedia is doing. They're mostly serving cached text documents and images.


To be honest, most successful startups are indeed quite cancerous.

Twitter doesn't need 3,500 employees, Facebook doesn't need 17,000 and Google doesn't need 72,000. They could all fire 50%-90% of their workforce and the product wouldn't be materially affected for the vast majority of users. The reason they have this many is because they can have this many. Their success has given food to the cancerous growth.

You get this situation whenever a company isn't operated by a shareholder who gets dividends.

When the person who runs the company knows that every dollar they don't spend is money in their pocket, people start to actually care about expenses and focus only on what is important.


I strongly disagree with this on two points

1) One of the biggest moats for tech companies is talent, and it's necessary to have GOOD employees available to test new fields and grow quickly (when Android came along, Google suddenly needed a lot more engineers; having them already on hand saved the time/challenge/cost of hiring good-quality engineers). Also, the numbers you posted are overall employment numbers, not engineering ones (you need bigger HR, sales, support, etc. as you grow). I totally agree that there is a point of diminishing marginal returns and they could be bloated; I just don't know where that point is, and I don't think you do either, so don't discount the number just because it's large.

2) Dividends are given when the company doesn't think it can reinvest the money more profitably. Berkshire Hathaway notoriously doesn't give dividends - that doesn't mean Buffett & Munger are personally pocketing all the wealth. Also, dividend returns are taxed, which is why shareholders are often okay with having the company reinvest the money or keep it as retained earnings; it's smarter that way.


With regard to dividends, the tax situation is what is causing the bloated and shitty behavior in the first place.

In countries with Dividend Imputation (https://en.wikipedia.org/wiki/Dividend_imputation) such as the one I live in, companies actually distribute the money they earn.

In the USA, wasting money on bloat is effectively incentivised by the tax structure because you can spend money that is already inside the company with an effective discount vs distributing it so that it can be invested elsewhere.

Owners should be withdrawing the profits of their companies. The current situation leads to bloat and stagnation.


Never heard of dividend imputation - thanks for sharing!

Though I still don't agree with withdrawing profits - I think it's smart to reinvest profits into the business (just look at Amazon).


I agree that a lot of tech companies have a much larger staff than they need (especially the mid sized ones) and I agree that Google in particular seems to suffer from severe mismanagement, but this is kind of the best case when you see a company with high enough barriers to entry that they collect rents: they spend on R&D and other "wasteful" stuff. Part of the problem with the thought experiment of the perfect free market (infinite producers/consumers) is that margins are so razor thin that no one can do this kind of stuff, which can be enormously beneficial when well coordinated (see: postwar Japan) so you do want some of it to happen.


There isn't any basis to your statement that Facebook could make as much short-term and long-term profit if they fired half their staff. Instagram, for instance, is much bigger than it was a couple of years ago, and those employees have built things that have driven growth for the product.


You don't even have to go as far as the big tech companies. You see it even in smaller ones. Something that's stuck with me is when, in the span of about a week or so, I had multiple people at different startups answer a question "hey, how's it going?" with "oh, really great, we [note: not founders] just closed a round and we're hiring X more people [note: at companies where the product is stable and well-understood]".

It's fief-building, to a large extent.


I agree with your assessment of Google, Facebook et al not needing almost all of their manpower. They hire because they can and because they (and, by proxy, their shareholders) sincerely believe that this manpower can create nonnegative marginal improvements to an already very successful "machine" and because they fear that going lean will make it impossible to adapt to changing markets.

The key difference is this: they earned the money and they can do with it whatever their shareholders allow them to. Wikimedia, on the other hand, gets the money from the annual beg-fest, aka the "give us money or bad things may happen to humanity's last remaining encyclopedia" racket, and they are only bound by whatever liberal interpretation they apply to their statutes. The problem I see with this is that with each increase in avoidable expenses they increase the risk of donations one day not keeping up with rising expenses, threatening the very existence of Wikipedia as we know it. By growing expenses, they are essentially endangering what they are supposed to protect.


But he's right: every successful startup in history is cancer. Deliberately so: they grow huge very, very quickly. It's steroids.

The trick with any great startup is how to get out of the "aggressive growth without revenue" stage. Some (very few) keep the growth but aggressively grow revenue and become self-sustaining. Others dial down the growth at a certain stage, revenue increases to match, and they're self-sustaining.

The point is you have to become sustainable. Most startups flame out badly. You cannot change overnight from "growth like cancer" to "stable and flat". You need some way to switch, and it takes time.


But, according to the table, they have more revenue than expenses and revenue is growing just as fast if not faster than expenses.


Yes, and that revenue comes from begging banners that more often than not create the - entirely false - impression that if people don't donate, Wikipedia will blink out of existence.

In the early days, when almost everything was volunteer-run, hosting costs were indeed Wikipedia's main expense (as explained in that old quote from Jimmy Wales). These days, hosting amounts to about 2% of expenses, and most of the rest goes to staff costs (incl. about two dozen people in fundraising alone). Meanwhile, most of the value - the actual content - is contributed by unpaid volunteers. The paid staff have absolutely nothing to do with it.

Nobody minded working for free when there was just enough money to cover hosting costs. Now, however, there is an influx of $100 million in donations a year, and none of that benefits the average contributor, the people actually writing Wikipedia. That grates a little, much like in the monkey fairness experiment (Google it if you're unfamiliar with it).


I think the article was questioning if this is suitable for a Charity. A business should be able to estimate its growth and sales, and can re-invent itself if a path isn't going to be profitable. The growth in a business should also be growing revenue (or it should be creating the foundation for later revenue).

A charity is based on how much people are willing to donate, which can change VERY quickly. Imagine what would happen if the public perception of the people behind Wikipedia were to dramatically change and the next fundraising campaign only brought in 20% of last year's total?

Additionally, if people start to believe that the finances are squandered or not spent as expected, there could be a movement to NOT donate. This happened with the Red Cross after 9/11, where people felt they were duped by the fact that their donations went to a consolidated fund (often to fund overseas activities) rather than exclusively to 9/11 victims.

Charities also tend to need to keep a few years of funding in the bank to deal with a change in markets, as a charity typically can't use debt to get through rough years (economic downturns / recessions). Exponential growth makes this nearly impossible unless you're running extremely lean.

I think the article is a little sensationalist (and maybe it needed to be to reach certain people), but it raises some sound concerns.


If you read the article, you'll notice it says that the expenses are growing faster than the traffic, and faster than revenue. This is the exact opposite of what you would normally expect (due to economies of scale).


Succinct and correct. I have been a regular donor to Wikipedia for years now, because I use the service heavily, support its mission, and like the access I have to contribute to its vast body of knowledge. When the fundraising solicitations come around, I've typically been too busy to look into the financials and have always assumed they operated at near cost and had declining revenue. Now, perhaps Wikipedia has some unfortunate but real problems around editorializing, and they have to pay a growing army to keep the worst of the forces of troll-dom at bay with conscientious hired curatorial help. Otherwise, I see no other reason why they should have 300X the employees.

This is eye-opening, to say the least.


And this is exactly the issue. Fundraising got to be "too" successful, which is to say at some point WMF did a fundraising round and they had more money than they needed, and rather than give that money back they spent it. And the next year, whatever it was they spent it on (could have been salaries or perks or whatever) seemed like "of course we need that thing, we did it last year", and so they targeted the higher amount, but they overshot again, and then spent more.

This has been a trap for charitable organizations for as long as they have existed. Churches, clubs, museums, etc. The second trap is embezzlement, which happens all the time because, as donation-funded organizations, they for some reason often neglect strong financial controls.

The article author is correct that unless corrected, this situation will kill the Wikimedia Foundation.


> I see no other reason why they should have 300X the employees.

Well, that's a little unfair. When you measure from 1, virtually any number will look like an absurd multiple. Calling it 300x makes it sound ridiculous but for a company with $80MM in revenue, 300 employees doesn't seem to be unreasonable.


Fair enough on the relative change, but what new core ongoing function does Wikipedia have since 2005 to merit the raw numbers?


No idea. I'm just commenting on the way the data is presented. 300 doesn't seem absurd to me. How many they actually need, though, I don't know.


Wikipedia staff don't edit content, fix pages, warn trolls, or ban vandals. That's all volunteer labor.


Supporting many more languages and new projects like news and structured data should require more individual contributors and a corresponding management structure. Then of course you need HR to manage them, a fundraising organization that gets money from all over the world, and yet more people to handle the finances that have now gotten complicated.

To me, it's a no-brainer that the WMF of today needs more than one employee. Whether it needs 300, I don't know, but that doesn't sound far enough off for me to quibble with them over it.


They may need more than one, and I'm sure there are fixed or near-fixed labor costs pertaining to HR, legal, accounting, etc. But what business function besides content curation - which another commenter above claims is entirely volunteer - does WMF need that approaches even a linear growth rate to remotely justify the jump in orders of magnitude, let alone 300X?



Wikipedia hires PHP programmers as well. Last year I saw a posting on Stack Overflow. Think of the devops work required, project management, handling media, ever-increasing storage needs, the managers, etc.

300 feels like too much, but 80 could be a logical number.


The table of spending vs. revenue suggests revenue is growing just as fast as expenses. There's nothing here about the trends in the amount of traffic over time. And even if there were, the cost of hosting and developing a system vs. the amount of traffic it can handle is not a linear relationship.


> There's nothing here about the trends of the amount of traffic over time.

Yes there is.

2005: > So, we're doing around 1.4 billion page views monthly.

2016: > According to the WMF, Wikipedia (in all language editions) now receives 16 billion page views per month.
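
Those two figures imply traffic compounding at roughly 25% per year over 2005-2016. A quick sketch, using only the numbers quoted above:

```python
# Implied compound annual growth rate (CAGR) of monthly page views,
# from the 2005 and 2016 figures quoted above.
views_2005 = 1.4e9   # monthly page views, 2005
views_2016 = 16e9    # monthly page views, 2016
years = 2016 - 2005

cagr = (views_2016 / views_2005) ** (1 / years) - 1
print(f"traffic CAGR: {cagr:.1%}")  # roughly 25% per year
```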


My mistake. What about revenue, though? It looks to be growing as fast as expenses if not faster


The version of the article on the signpost lacks the table of figures, if you go to the version on his user page, it paints a clearer picture [0]. idorosen has posted a table on the root thread which adds a column for the expense ratio (1 minus the margin) and it's getting much higher much quicker [1].

[0]: https://en.wikipedia.org/wiki/User:Guy_Macon/Wikipedia_has_C...

[1]: https://news.ycombinator.com/item?id=14287495
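
For anyone who wants to eyeball the trend themselves, the expense ratio is just expenses divided by revenue (i.e. 1 minus the operating margin). A minimal sketch - the figures below are illustrative placeholders, not the WMF's actual numbers:

```python
# Expense ratio = expenses / revenue = 1 - operating margin.
# Illustrative placeholder figures, NOT the WMF's actual numbers.
data = [
    (2012, 38.5, 29.3),  # (year, revenue $M, expenses $M)
    (2014, 52.5, 45.9),
    (2016, 81.9, 65.9),
]

for year, revenue, expenses in data:
    ratio = expenses / revenue
    print(f"{year}: expense ratio {ratio:.0%}, operating margin {1 - ratio:.0%}")
```

A rising expense ratio is the worrying signal here: it means ongoing commitments are eating an ever larger share of each donated dollar.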


idorosen is pretty clear these numbers aren't that scary, though.


Well, try to draw a trend line through the operating margin and tell me it's not that scary. I think that it's pretty scary if your permanent, ongoing financial commitments are growing to match your revenues. This way, you have no cash to react to intermittent expenses or R&D. And let's remember that Wikipedia is already doing more donation drives than ever.


Wikipedia is an explicitly not-for-profit enterprise, and thus something completely different from the standard startup.

For most startups, success means getting bought out by a larger company. Wikipedia by contrast has always put the highest priority on maintaining its own independence, which means that for them a buyout would be a profound failure.


I'd say that's a little strong.

You're right that the costs of running wikipedia are still quite low considering the incredible scope, popularity and importance of the site. "Cancer" seems over the top and needlessly combative. But, it definitely doesn't hurt having someone bring up the fact that costs grew by 6x in 6 years. As he says, more years of this could put wikipedia in a precarious position.

It seems the OP cares about wikipedia and is genuinely worried. These conversations need to be had (also at fast growing startups) and having them in the open is part of the wikipedia way. Maybe he's wrong but it doesn't seem nonsensical or disingenuous to me. It seems genuine and rational.
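
To put "costs grew by 6x in 6 years" in perspective, that corresponds to expenses compounding at roughly 35% per year - a rate which, if sustained, multiplies costs by 6x again every six years. A quick sketch:

```python
# Annual growth rate implied by a 6x increase in expenses over 6 years,
# and where that rate leads if sustained for another 6 years.
growth_multiple = 6
years = 6

annual_rate = growth_multiple ** (1 / years) - 1
print(f"implied annual growth: {annual_rate:.1%}")         # roughly 35%

projected = (1 + annual_rate) ** 12                        # 12 years at that rate
print(f"12-year multiple at that rate: {projected:.0f}x")  # 36x
```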


What you say is correct. He's just saying something different, not the opposite: that a money-hungry, fast-growing business is not the model an open-content wiki should have as its foundation. The value to the user would probably be the same if it were a 3-person inc with $150k/year revenue.


My first thought was, "there's no way Wikipedia still only needs $5000 per month for bandwidth"


Well, most startups fail. (For various reasons)
