I once had a strategy professor describe Google's business model in much the same way: "Google tries to make every other business around it free or irrelevant." This has a few distinct effects:
- By reducing the cost of other links in the value chain, there is more money available to spend on the links where you actually generate revenue. This shifts profits along the value chain toward that link. One example is dramatically reducing the cost of phones and internet access, thus allowing customers to spend more time and money online, which generates revenue and profits for Google
- By making the other links free or irrelevant, you reduce the odds that a competitor in those links will strengthen its position and extract more profits from the rest of the value chain. One example is using Android to prevent a monopoly on the smartphone side. If Apple had a monopoly or near-monopoly, it would be able to extract larger economic profits from the other links in the value chain, including Google
A desert of profitability shifts consumers to you, and keeps competitors away.
E.g., Chrome is free, and Google pushes for web technologies only because its cash cow (web advertising) can keep generating profits as long as there are more web users, especially ones whose behavior can be easily tracked in their browser.
Yeah, people should stop idealizing and worshiping companies. Apple is only pursuing "privacy" because it tried and failed miserably at user monetization, but it won't think twice about handing over all your data to the government (as it did in China) if that's the best business option.
They're all companies, they're here to make a profit.
That's one reason why I find it easier to trust companies than to trust governments. I know exactly what companies want, so I can predict how they'll act.
Companies are that "friend" who will always get money from your wallet, whenever they have a chance. Governments are those friends who swear they are on your side and want to help and protect you, but will sneak their fingers into your wallet whenever you're not looking.
By this logic, the biggest force in the universe preventing micropayments, by tying media to the tawdry, corrupting mechanism of advertising, is Google.
Cities are supposed to impoverish the regions directly around them in this way, as well.
No, it is users being willing to pay. The vast majority of users are far more ok with being shown ads than paying the equivalent of the cost of the ad to have it go away.
What if the ad were not in the picture at all, and you had low friction micropayments? There are YouTube channels which eschew advertisements and get most of their money from Patreon instead. There are other channels that have been demonetized against their will and have gone this route as well. It's workable, and yet, Patreon is far from the lowest friction it could achieve.
If there was some party able to make micropayments work, and able to make them not ever work, it would be Google.
Again, people don't want to pay for content. Between one website asking you to pay $0.10 to view an article, and another offering it for free but showing ads, people will pick the free one.
And Google has been trying:
But it is really hard to convince people to pay, when they can get it for free.
I could absolutely imagine people being willing to pay one thousandth of a cent to read an article.
Oh, no, way more than that. Take a look at CPMs and ad costs. If your claim were true and an article had 5 ads, every impression would be worth 1/5,000 of a cent, and the CPM would be 0.2 cents.
Actual CPMs are a few dollars per thousand impressions, up to $10+ in developed markets. Which means that in the US, an article with, say, 5 ads would probably require you to pay 5 cents to view.
Take a look at the articles of ad-paid publications: you're more likely to see 10+ ads per page.
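The arithmetic in that exchange is easy to sanity-check. A minimal sketch, using the assumed numbers from the comment above (a $10 CPM and 5 ads per article; both are illustrative, not measured figures):

```python
# Back-of-envelope check of the CPM arithmetic discussed above.
# Assumed inputs: a $10 CPM (cost per 1,000 ad impressions) in a
# developed market, and 5 ads shown per article view.
cpm_dollars = 10.0
ads_per_article = 5

revenue_per_impression = cpm_dollars / 1000                      # $0.01 per ad shown
revenue_per_article = revenue_per_impression * ads_per_article   # $0.05 per article

print(f"${revenue_per_article:.2f} per article")  # $0.05 per article
```

So an ad-free micropayment would need to be on the order of cents, not thousandths of a cent, to match what advertisers already pay.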
I'm sure you could have used the same conceit with cable TV to prove that something like YouTube would never work.
It doesn't matter; you'd still need to generate the same level of revenue as the ad-supported business, or even more, to justify the switch.
If you didn't, content creators would just stay in the ad-supported business.
> I'm sure you could have used the same conceit with cable TV to prove that something like YouTube would never work.
That doesn't apply here. We're talking about providing the same content with a different monetization strategy. YouTube's success wasn't about the monetization strategy, but about a different type of content.
Not quite the same thing. It's one thing to be prodded by ads into paying to make them go away. But there's a lot of content for which advertising doesn't work at all, so there's no ad-removal incentive to sell. Google doesn't want to facilitate that content for some reason. It wants to squash it.
Isn't it providing more tools for easy media subscription? I used it to subscribe to the NYT pretty quickly not too long ago.
But hey, if you really believe that there's a significant business opportunity there, go do it!
You're not getting to the real problem: if a user can see an article with ads for free, or pay 10 cents, that user will most likely opt for the free version.
And you can't compete with free.
if a user can see an article with ads for free, or pay 10 cents, that user will most likely opt for the free version.
There is quite a bit of content for which ads don't work, for which people are only too happy to pay. Reduce the friction for that, and the market would increase.
YouTube just slaughtered a bunch of WeedTube channels. Then, there's also GunTube. That content definitely has an audience. That audience definitely is willing to pay! Why isn't someone making something with less friction than Patreon?
Sure, 1 user. Maybe even a million users. Well, the solution for that exists: subscriptions.
But nobody will build a system for such a small number of users.
> That content definitely has an audience. That audience definitely is willing to pay!
Sure, niche markets always exist. HBO makes money out of subscriptions, The NYT too. If you can find a niche audience that's willing to pay for your content, that makes it a viable business model for you.
But that doesn't scale across the billions of users on the internet. Particularly when you include the 1.5 billion who are not in rich countries.
I think it's still a bit too high in friction. I think there's yet another model possible.
A lot of those niches aren't advertiser friendly. Subscription kind of works already. There must be something with lower friction yet.
I don't see why micropayments couldn't work on adjusted scales. We already have different prices for things depending on the market. Also, it's a problem that somewhat fixes itself, automatically. A website relevant to low-income rural people outside the 1st world won't be able to charge as much as a 1st world website, but it might well be able to charge enough to greatly benefit the operator anyways.
Oh, sure, there could be, but I wouldn't invest my money in trying to prove that.
What number of users do you think would be willing to pay how much for all the content they access for free today?
Are you willing to pay 10 cents for every single ad-supported website you visit?
But for content with deep value for the user, helping them advance their life (improving skills, health, finances), and that is hard to copy, users do pay: for example, for books, courses, bundles of articles, and access to some communities.
But why aren't we seeing micropayments for deep-value content at the article level? Because legally "copying" a single article offers no barrier to competition.
But maybe articles of deep value with a lot of valuable graphical content will both raise the barrier to competition high enough and bring enough value to the user that they will be willing to pay enough.
But credit cards suck for the kind of small payments the web warrants.
Also, what is the chance that we would pay and get shown ads anyways?
For me it is not ads, full stop, that are the problem, but distracting/disturbing ads unrelated to what I am reading at the time.
This doesn't make any sense. Cities exist as trading hubs: where people trade labor, services and goods.
Why are you bringing cities into this anyway? Makes no sense at all...
This is also true. However, the centralization causes various forms of impoverishment -- which you might also describe as "specialization." Cities develop zones around them that lack many services, but concentrate others. Think of the places where there are lots of warehouses. If you are in a city to be connected to some kind of a hub, the fact is that not all of that city's area is going to be equally well connected.
Think of what happens to towns near large cities. This has been studied in social geography since at least the '70s.
> That still doesn't have anything to do with the discussion.
Cities often act like businesses. In the U.S. cities even incorporate, just like businesses do. The "shareholders" of a city are usually its property owners who vote for city leaders who implement supply-side policies that raise property values and rents in the city, such as zoning restrictions, and demand-side policies like concentrating services, as the previous commenter mentioned. If the city is the leading commercial city in a country, it may even influence central government immigration policies, which can further increase demand for property, a.k.a. the city's "shares".
It makes a lot of sense and has everything to do with the discussion.
Not exactly a desert. More like zones of coalescence, like areas cleared out inside a nebula that's a stellar nursery. You're not going to find every single kind of service in those suburbs. You're not going to find certain kinds of businesses. Anything that benefits by being well connected to a hub is going to be drawn into that hub, which means that other areas of the city may have less of it.
By "impoverishment," I don't mean "becoming poor" in an absolute wealth sense. I mean that certain things are sucked out of certain areas to other areas.
And make a ton of money while doing it! Farms closer to cities are richer than farms further away from cities.
Just because you can specify a business model, doesn't mean it's automatically viable, or moral, to execute in any context.
Where’d you hear that?
I read it in an Indian social geography paper while writing an undergraduate Geography "101" paper. (The course number was actually 1)
The irony is that it was the Master of the Channel himself who came up with the falsehood that "content is king": Viacom's Sumner Redstone. Maybe he believes it himself, but that doesn't make it the truth.
If you read the case, many of the videos Viacom was asking Google to take down were uploaded by Viacom themselves. So everyone was scrambling to get more eyeballs, which ultimately led to commoditization.
Also referring to the parent post, “desert” is not the right analogy when you create surplus around you to keep out competition. I think the knowledge industry is more like a religion, where the information is free but there are still strong structures that support it.
In case of religion, these structures are supported by supportive communal relationships. In technology, by massive network effects.
Maybe initially, before the iPhone. But the iPhone was the Dreadnought of phones: everything that came before it became instantly irrelevant (look up HMS Dreadnought).
> Microsoft would be tempted to use smartphone dominance to steer search traffic to Bing rather than Google.
Just as Apple would as well, unless Google pays, which it does.
yeah, good point. i think it helps us understand the Android ecosystem, its SDK, its user base, etc.
it doesn't seem like Google, a search and advertising company, had a natural reason to enter the mobile phone OS market. and they didn't set out to create an "insanely great" mobile OS experience, either for the users or for the developers.
instead, the strategy was apparently to just shoehorn a camera OS into the mobile phone market, offer it for free, grab as many users as possible, and thereby disrupt/restrain Apple.
and it has worked quite well for Google. but Android has subjected users and developers to a pretty bumpy ride. a lot of people went along for that ride because the upfront costs appeared to be lower than Apple/iOS. i don't know about the longer term costs though.
Essentially, you look at all the links in a value chain and at how competitive each one of those markets is. The links with the least competition will capture most of the excess profits in the chain, and the links with the most competition tend to be commoditized and earn zero economic profits (which are different from accounting profits).
Taking Google's online ads value chain, for example:
Chipset & component makers > device makers > OS makers > browser makers > ISP/carriers > online platforms > content creators > advertisers > ad platform > users
(You could rearrange this in a few different, but still valid, ways)
What would happen if there were a single device maker? Or a single OS maker? Those would be able to command monopoly pricing, capturing most of the profits in the value chain.
Now look at those links: in how many of them does Google now operate in a strong way?
- Device makers (Pixel)
- OS makers (Android, ChromeOS)
- Browser makers (Chrome)
- ISP/carriers (Fi, Fiber)
- Online platforms (YT, Blogger, G+, sites, etc.)
- Content creators (indirectly, sponsoring)
- Ad platform (Adsense & Doubleclick)
Now look at where it makes money: essentially just the ad platform.
All those other businesses exist to protect the revenue-generating business.
one of those links i personally find interesting is "content creators."
on the top end, it appears that a handful of the apps in the Google Play Store make the lion's share of the revenue (e.g. Facebook, Google itself, and some really strong game companies).
at the same time, i've heard estimates that 50% of the independent app creators earn less than $500 per month. and it's getting worse.
in short, the android app market was quickly populated by a huge number of independent app creators (who presumably thought it would be a good, durable, new line of business). but, for the most part, these app creators became a low-income-neighborhood/swamp around the top app creators.
but Google still (indirectly) benefits from this arrangement because it offers the appearance of a free, open, lucrative, land of opportunity -- a healthy marketplace.
Building an app isn't any different from starting a company. If you're a self-employed app developer, there's a strong chance that your app will just be yet another random app with no key differentiation.
Apps are just as competitive as other markets, and often even more as the barriers to entry to creating an app go down. This is the same with content: when creating content became almost free (hosting is free, you have a camera in your phone already, basic video editing is free) that lowered the barriers to entry, dramatically increasing competition and therefore reducing the available profits.
So, by reducing the cost of creating content, Google made its store and the ad business more profitable.
Now on some of the "monopoly" arguments being thrown around: people usually define monopoly in two ways:
1) Causes harm to consumers
2) No harm to consumers (usually benefits) but drives other companies out of business
If you're proposing (2), you're arguing that the government should be protecting less efficient companies.
Now that's a really bad idea.
If you do not believe that competition is linear, then it cannot be true by any stretch of the imagination that a moat is "less efficient" than a desert. A moat/desert/<insert metaphor> is only one of many tactics employed to win battles.
IOW, it is not a strategy as you seem to be implying with this reply.
Nope, a moat protects a profit center by creating barriers to entry in the form of competitive advantages.
A desert kills profit centers that could be used to attack your own profit center. Competitors can't leverage neighbor markets as an entry point, they can only attack your core business directly.
You can have both, but a desert ends up being more efficient for the simple fact that it reduces the total amount of profit to be had in the market, draining competitors of resources.
The desert of profitability is the Russian winter of business models (with all the "Russian winter" caveats, so let's not go down the route of discussing if it was actually the winter or German lack of fuel, or Napoleon's whatever): you don't need strong moats when enemies die before reaching your walls.
Great! You are essentially proving my point that they are tactics that can only be effective in tandem.
A desert of profitability tactic is only feasible if you are already awash in profits.
In other words, a moat must necessarily exist to justify the expense of sabotaging your competitors' defenses, otherwise, if they prove to be resilient (Intel+AGP vs PCI) or launch a surprise counterattack, you are toast.
And neither of them guarantees anything. Markets change, and you might be sitting in a drying oasis, behind a massive moat, surrounded by a desert of profitability.
And both moats and deserts have their drawbacks: moats defend against attacks but make you less mobile, and they can easily expire (or be bypassed) with technological change, just as stone walls became irrelevant with gunpowder, and Vauban-style fortifications became irrelevant with mobile warfare and aviation.
Deserts restrict your movements, since you can't try to move to adjacent markets to increase your profits, making it harder to handle declines in your core market.
Terms like "market", "nearby market", "niche", etc. are all abstractions, the same way a map is an abstraction of an actual territory. They help us reason clearly, but they should not be used as a substitute for the real thing, à la "the map is not the territory".
A "desert of profitability" is essentially a battle tactic -- a tactic where you delay the use of your moat (fortress, trench etc) by igniting trouble elsewhere, usually in neighboring lands that enemies must traverse before the war reaches your own land.
IOW, it is a tactic used in conjunction with other tactics, it is not a strategy; all it does is buy you time. Eventually, the technology landscape will shift and such economic moats, no matter how deep, will lose their relevance.
Ah, a light-bulb goes on when wondering why Google spends on CDN and DNS services. (The additional internet telemetry-streams are nice-to-have but probably not the actual point, economically speaking.)
Also these big companies have so many competing interests - how do these empires manage internal conflicts of interest?
In the case of Apple vs Android, I'd say Apple is doing pretty well, and is arguably doing substantially better than Google, in spite of not having followed this strategy themselves.
The difference is that most of Apple's complements are B2B suppliers and so are rather invisible to the general public, while many of Google's complements were formerly consumer-visible markets. So it's much more obvious when Google turns the web browser or ISP market into a commodity than when Apple turns the electronic supplier marketplace into a commodity.
If anything, Microsoft would have won over OEMs and been where Android is.
Or maybe a different company, like Amazon if it hadn't failed with the Fire Phone. Perhaps that entity could have been more forgiving, since they're not as aggressive with the same set of productivity services. But in general Google has lost that moat for the rest of the mobile era. The best they can hope for is a partnership.
Actually, now imagine a homegrown alternate-Android competitor based in China. Google would be cut out of that too.
When I sold my 3-yr old iPhone, which cost around $700 new, I got well over $200 for it. The net cost was under $500. Android phones might be cheaper, but you can't really sell them when you're done with them because there's always newer Android phones that are also inexpensive.
The same effect happens with Apple's desktops and laptops...by not offering a bargain basement tier, they foster a thriving resale market that's reliable enough that you can factor it into the price when buying a new device. Apple is essentially charging for both the primary sale and the secondary sale up front. Once you account for that, a lot of the difference in price between them and their competition goes away.
At $200, the 1-2 generations behind used iPhone is very competitive with the slower, cheaper Android phones that you can buy new.
The phones you're comparing the iPhone to are second-tier phones. But, as I said in my original comment, second-tier iPhones are resale, not new.
For instance, my son has one of these and it's a perfectly capable device.
The accelerated EOL impinges on this, though.
But why would they? They already capture 80%+ of the profit in the global cell phone market.
And, actually, it does have a tiered presence: iPad, iPhone, iPod, Shuffle. I'm not sure how many of those remain extant.
The goal would be staving off encroachment from below.
The average selling price of an Android phone is $202 (https://www.statista.com/statistics/309472/global-average-se...). That's still $150 less than the cheapest iPhone.
The average selling price of a tablet is still much less than the lowest cost iPad.
Apple doesn't sell iPods anymore, except for the iPod-in-name-only iPod Touch.
Apple only has about 12% world wide market share in smart phones.
There's also the used-device market, as noted by another reader. I'd considered making that point as well, though uncharacteristically for me I decided to focus on a single thread.
They follow a different strategy: vertically-integrated walled garden.
It means free stuff or low prices for consumers. This is another consequence of the "invisible hand" which is sometimes used to justify capitalistic competition.
A commodity market working well is also a desert of profitability.
The opposite of a desert of profitability is economic rent.
Or perhaps an Amazon ;)
I don’t know if the analogy makes any sense, I just couldn’t resist
The "free stuff" is an illusion, a monopolistic sleight of hand. Consumers are subsidizing "free stuff" in one market by paying excessive rent in the neighbouring market. In the end, consumers pay more, not less.
> This is another consequence of the "invisible hand" which is sometimes used to justify capitalistic competition.
These deserts are not a feature of competition, they are an attempt to avoid competition by shrinking the pool of viable competitors.
> A commodity market working well is also a desert of profitability.
No, it's the opposite of that. A commodity market working well is what you get when monopolistic tricks such as "deserts of profitability" have failed, and you are forced to compete the old-fashioned way.
> The opposite of a desert of profitability is economic rent.
Again, the exact opposite is true. Deserts of profitability exist to protect economic rent. Show me any "desert of profitability" and I will show you the economic rent that is subsidizing it.
It just doesn't make any sense.
After all, consumers benefited enormously from the phone system coming into existence. It only cost everyone else their ability to do anything with telephony that AT&T didn't like.
(Yes, government monopoly vs. private action. It doesn't change anything in terms of the presence or absence of actors in the space.)
You're mixing two distinct periods of time. When there was the expansion of telephone systems, there was no consumer harm, but when AT&T stopped others from entering the market by using its monopolistic power to prevent competition, there was harm to consumers.
Those two periods are over half a century apart.
Similarly, when Standard Oil invested in horizontal integration, it was great for consumers: the reduction in cost and standardization of plants and materials (oil, metals, etc.) created a boom.
But when it used its monopoly to prevent competition and extract rents, there was consumer harm.
I think I already explained my position pretty clearly, which is that consumers benefit from the free stuff less than they suffer from the monopoly subsidizing it.
I'm happy to defend my position but you need to give me an argument of your own first, besides "it just doesn't make any sense".
Well, you need to prove that to start with: consumer harm.
Your example actually shows the opposite and that's because conventional wisdom is often wrong.
Conventional wisdom states that there is a direct correlation between market-share dominance and excess profits, which is why antitrust laws target monopolies to protect consumers from price gouging. But this characterization is not always accurate, as Apple has proven by not pursuing a market-share-focused strategy.
Apple enjoys 87% of smartphone profits from a mere 18% of all shipments.
They were saying that if Apple had a monopoly on smartphones, they could extract more value from adjacent markets (google search, etc)
If you think about it, an "adjacent market" is no more an actual thing than a map is the territory. It is merely a mental aid; it is not somewhere you or I can schedule a visit to.
At the end of the day, a hypothetical Apple monopoly in the "smartphone market" would mean consumers would have less money to spend on other things like google search (to use your example), apps & games, Netflix subscriptions, etc all of which exist in "adjacent markets".
My gripe is that the use of jargon here is very misleading, as it has led you and others to conclude that there is a dichotomy between a market and its adjacent(s), where none exists in the real world where all of this matters.
They already do charge for making Google the default search engine, so it's not a stretch to imagine that this payment would be larger and that other similar payments would exist. I'm not sure why you think this is hypothetical when it already happens.
1. Before the iPhone, Nokia enjoyed a large market share on mobile phones based on Series 40, Series 60, Maemo, etc. Essentially they had several OSes. There was also BlackBerry which had several editions of one OS but was huge as a "smartphone" for checking email.
2. Before the iPhone went on sale, Steve Jobs's target was 1% of all phone shipments, per the original 2007 keynote:
957 million mobile phones sold in 2006. Goal: 1% market share = 10 million iPhones in 2008.
3. Prior to the public unveiling of the iPhone, Eric Schmidt, who was a board member at Apple, frequently recused himself from board meetings focused on the iPhone due to the conflict of interest with Google's own mobile phone efforts. IOW, there was no monopoly threat to quash, because both companies were working on phones almost simultaneously.
Monopolies sometimes get overused as a metaphor as in this case where the gp talked about them in the context of excess profits. My point being you don't really need to control market share to enjoy the bulk of an industry's profits as shown by Apple. Heck, even the terms market share, industry etc are proxies for determining (abuse of) market power. Their overuse/overreliance as a measure of power can be misleading when trying to reason about the effects of competition.
of course, they're also trying to make a contribution to the larger ecosystem, because they feel they've benefited from the contributions of others.
"Technology changes, economic laws do not"
Among other things, Hal Varian went on to work at Google as an economist.
They gave an example of how Intel used its AGP standard as part of the commoditize-your-complement tactic, except they characterized it as maximizing the value of your technology (pp. 197-198):
"In choosing between openness and control, remember that your ultimate goal is to maximize the value of your technology, not your control over it."
So not the same means of making complements unprofitable as the tech examples, but the same end -- the hospitals and some hospital owned outpatient clinics drive all the profits, everything else is like a loss leader that prevents competition from springing up around the hospital
Since new hospitals cannot be constructed without proving a "need", the certificate-of-need system grants monopoly privileges to already existing hospitals. [At least one lawmaker has] argued that the true motivation behind certificate-of-need legislation is that "large hospitals are... trying to make money by eliminating competition" under the pretext of using monopoly profits to provide better patient care.
More at https://en.wikipedia.org/wiki/Certificate_of_need
In reality, they are bringing all providers under their umbrella to control patients, i.e. market share, and to drive patients to the site of care that is most profitable to them, under the constraints of minimizing malpractice risk and quality fines. ACOs are in many cases just a way to increase patient volume at a discount to payers, and have nothing to do with risk sharing.
Hospitals are an incredibly politically powerful entity. The AHA (the hospital industry's lobbying group) spends about as much as PhRMA (the pharma lobbying group), but hospitals have massive grassroots political support because they are huge employers, whereas the public hates pharma.
The complement of a web browser isn't the web servers; it's the OS, and vice versa. So Netscape tried to commoditize the OS, and MS commoditized and destroyed the browser market, and then nearly destroyed themselves with the resulting antitrust case.
Also, Netscape was initially free in the early days, but it had become a paid product by the time IE came out free. The plan was not "free browser". (Read Ben Horowitz's book...)
In Netscape's case, their strategy shifted from selling web browsers to consumers (in the hope of commoditizing the OS) to selling web servers to businesses, after Microsoft made it impossible for them to compete with IE, which was offered for free.
With 9 years of hindsight, how did it turn out? I'm honestly not sure, as I haven't followed the state of video codecs. It seems like a lot of video is viewed on Apple devices, which only have hardware support for H.264?
What are people trying to commoditize with Kubernetes? But even more importantly: how does this help them? (Since in my head, the "what" being commoditized is the same thing the big players contributing to Kubernetes make their profits with.)
They want to make it easier to move off of the market leader, AWS. If a customer's programs are written to AWS APIs, it becomes incredibly expensive to even try a new cloud provider.
So basically every other cloud provider besides AWS wants standard interfaces. They can all rent more computing resources if it's easier for customers to move off of AWS. That's why Microsoft supports Kubernetes. And that's why everyone got on board with Docker too -- it's (supposedly) a standard application format, or at least a cloud-vendor-neutral one.
It's exactly analogous to Windows vs. Unix. Windows locked users into its proprietary APIs, with exactly one implementation, and made tremendous profits. With POSIX, you can port applications from Solaris to IBM Unix to Linux, etc.
Kubernetes was called "POSIX for the cloud" as far as I remember.
It's related to the "commoditize your complements" model, but I don't think that's exactly the right way to describe it. Maybe it's more like "commoditizing your competitor's product", to erode their profits.
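The lock-in argument above can be sketched in code. This is a hedged illustration with entirely hypothetical names (`ObjectStore`, `VendorAStore`, `backup` are invented for this sketch, not any real SDK): when application code targets a neutral interface, in the spirit of POSIX or the Kubernetes API, switching providers means swapping one adapter class rather than rewriting the application.

```python
# Hypothetical sketch of why standard interfaces lower switching costs.
# None of these names correspond to a real cloud SDK.
from typing import Protocol


class ObjectStore(Protocol):
    """A vendor-neutral interface, analogous to POSIX or the k8s API."""
    def put(self, key: str, data: bytes) -> None: ...
    def get(self, key: str) -> bytes: ...


class VendorAStore:
    """One provider's implementation. Moving to Vendor B means
    replacing this class only; backup() below is untouched."""
    def __init__(self) -> None:
        self._blobs: dict[str, bytes] = {}

    def put(self, key: str, data: bytes) -> None:
        self._blobs[key] = data

    def get(self, key: str) -> bytes:
        return self._blobs[key]


def backup(store: ObjectStore, key: str, data: bytes) -> bytes:
    # Application code depends only on the interface, not the vendor.
    store.put(key, data)
    return store.get(key)


print(backup(VendorAStore(), "report", b"q3 numbers"))
```

Code written directly against one vendor's proprietary API is the opposite case: every call site has to change to migrate, which is exactly the switching cost the non-AWS providers want to eliminate.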
In theory I don't see a problem with AWS owning 95% of the cloud market share and Google still making billions of dollars on ads (with free complements like search, apps, Android, and the browser).
I think AWS is more threatening to Microsoft, because Microsoft sells enterprise products like SQL Server and so forth. Cloud services are the complement to a lot of enterprise products that MS sells.
Google isn't nearly as big an enterprise business as Microsoft is, and the customers for AWS/Cloud are typically enterprises and not "consumers".
When you shop at Amazon, you don't buy shipping from whoever's cheapest; Amazon provides the shipping via a complex logistics system interfacing with external partners.
Any company that has a cash cow and fails to find a new avenue for growth can use that cash to commoditize its competitors' products or services; this happens in every industry. As Jeff Bezos, possibly the king of commoditization, put it: "Your margin is my opportunity."
Like when Peter Thiel started talking about his novel strategy of explicitly trying not to compete with others and aiming for a monopoly. You can find that explicitly described in complete detail in any ordinary marketing textbook, in the chapter titled Market Segmentation, Targeting, and Positioning. But for months people couldn't stop talking about how Thiel was a genius for this new idea.
I get the whole concept of XKCD's lucky 10,000. This is different. This would be like taking credit for inventing Mentos-and-Coke bombs just because someone hasn't heard of them before.
"Commoditizing your complements" is about making the razor blades interchangeable, so that multiple manufacturers compete to sell razor blades at low margins while you make all of the money selling razors.
Apple commoditized apps and the operating system by giving them away for free, as did Google. Microsoft commoditized PC manufacturers, etc.
No, it would be the opposite. The product always was the blades...they were never an accessory to the handle. They entered the market because before disposable razor blades, people had to sharpen and maintain their own razor blades. They designed a razor handle merely as an instrument to hold a pre-sharpened disposable razor blade. The only value-added product they ever had was blades, they never gave a rip about the handle.
And if your product is the blades, then in the "commoditize your complement" variation you would make the blades, and you would outsource or open source the razor handles and encourage competition between razor handle makers. Much like how Ford sells cars, but gives blueprints and design specs to multiple parts manufacturers to ensure that parts are available for all the cars they sell.
> Apple commoditized apps and the operating system by giving them away for free, as did Google. Microsoft commoditized PC manufacturers, etc.
Giving things away for free isn't commoditizing the complement; that's merely a loss-leader strategy. Getting third parties to develop apps and give them away for free or at extremely low prices would be the example you're looking for.
While I was researching before I replied, I found out the whole razor-and-razor-blade analogy was an urban myth. In fact, both the razors and the blades were expensive at first, until the patents expired (https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1676444).
But after thinking about the rest, you're right.
OpenXR seems to be moving forward. This will hopefully solve this VR lock-in mess.
Instead we’re trying to make each other trivial.
Which is about as good a definition of "technological progress" as you are going to find. I for one am happy at how economically trivial housing, transport and food production have become over the last thousand years.
But there are lots of verticals near each other, so there can be many people at one level working with commodities at another level, who likewise believe that the people above/below them are the ones generating a commodity complement.
What we end up with instead is a global economy of people generating value and building things that make lives better, which I think is awesome! Generating value is the most legitimate reason to have people pay you - you don't have to extract rent from a monopoly position to be a viable business.
Example headlines FTA: IBM Spends Millions to Develop Open Source Software; Netscape Open Sources Their Web Browser; Transmeta Hires Linus, Pays Him To Hack on Linux