Demonopolizing the Internet with Interoperability (pluralistic.net)
291 points by samizdis 29 days ago | 160 comments



I disagree that forced interoperability will somehow make the emergence of "winner-take-all"-style tech giants less likely, and I think the last ~25 years of Internet history prove that.

I mean, in the late 90s everyone was talking about how the Internet would "democratize information", because anyone could become a content publisher from their garage. The story then was about how "the power" was concentrated in huge media companies, and the Internet would change that.

But when the barrier to entry is tiny, and the barrier to switching is just as low, any competitor that is even a tad better than the other guys will vacuum up all the business. It's actually this more open framework that leads to higher concentrations of wealth and power, not the other way around.


I'd say that it didn't really happen that way. What happened was that clients would be interoperable initially; the other clients would get neglected as the dominant client vacuumed up half of the business; then, when the dominant client reached a certain size, it would close. The neglected clients wouldn't be able to pick up the users resentful of that closing, because of their fewer features, their more difficult (and less opinionated) UIs, and the loss of half of the user network. Attrition happens among the holdouts, they switch to the dominant client, and development that was once slow on the other clients stops dead. Then the dominant client starts asking for your firstborn and gets contracts with the CIA.

I'd submit that it's the closing that's the problem, not the openness. Openness tends to support a power law distribution of clients. One or two will dominate, but there will be a dozen that are significant.


The problem is that the open protocols initially used by the dominant client to achieve its dominance later became closed, when the client "seamlessly" forced its users onto a closed protocol. OSS licenses solve the open-code issue to varying degrees, but they have largely fallen flat on the issue of open protocols. So if co-opting IP law (aka OSS licenses) is the wrong approach, what is the correct tool to address this problem? Would other legal concepts like contracts help? Something else?


100% agree. I also observed that, some time ago, many platforms would use interoperability as a growth tool (e.g. Facebook with 3rd-party apps/games, Google with 3rd-party cookies, Apple with iOS APIs) and then, when they reached critical mass, they would close down and become walled gardens.

Watching all the preparations for the Open Banking regulations here in Brazil (still not fully in place): the concept is sound, as the data belongs to the user, not the banks/platforms, so banks/platforms should enable this data to flow freely (at the user's command) to other platforms.

I think forced interoperability might have a good effect on that world, and it also seems to foster innovation by lowering barriers to entry, but I don't really know the effects on investment and VC money when you eliminate an effective competitive barrier.


Open wouldn't hurt, but I'd submit that price, specifically "free", is the bigger problem. It doesn't really matter which platform is dominant when they're all advertisement driven.

If our data became worthless, because of say, regulation on its use and a right to be forgotten, then we'd see more than just "one business model under different names".


It is difficult to stop the "winner-takes-all" economic dynamics from playing out, but interoperability does allow small competitors to emerge in niche markets and, through technological innovation, bite off substantial markets. (AMD uses interoperability to compete with Intel.)

And why not both, interoperability and antitrust?


Completely agree. I'm in favor of interoperability for its own benefits (it's not without issues but thank God I can mostly depend on USB-C on all my devices these days), but just pointing out that interoperability as a counter to wealth/power concentration is insufficient.


It's less about wealth/power concentration and more about the ability to upset that wealth/power concentration.


Yes, for example if Myspace, Facebook, Google+, and all the open alternatives had all been interoperable, it would have been less likely for one of them to become a monopoly on the friends/wall/timeline segment.


> just pointing out that interoperability as a counter to wealth/power concentration is insufficient

Interoperability moves the world towards less concentration of wealth and power. So it's a step in the right direction.


"Better" is a multi-dimensional measure and people have very different preferences for what constitutes "better". The source of the winner take all dynamic is network effects. Because of those, everyone agrees that the "better" platform is the one that everyone is on, which inevitably causes everyone to use that same platform, even if they would have different preferences but for the number of people on it.


Not quite. Yes, network effects are important, but if that were the only important thing, then Facebook would be the only social media site. The opposing dynamic is that it's not hard to be on multiple sites at the same time. So if there's a site that meets my definition of "better" and has critical mass among the people I'd like to connect with, it can compete. Each new generation gets its own social network because it has no interest in connecting with the squares on Facebook.


> Each new generation gets its own social network

So instead of accelerating progress we get one monopoly every 20 years (and some people splitting their attention, and friendship groups, and exposure to surveillance, between multiple incompatible sites).

That's not as reassuring as you might think it is.


I wouldn't call it reassuring, but it is the pattern so far. What gives me hope is that we don't have a single unassailable monopoly. Instead, we get a chance, every five years or so, to choose something better. We just might figure out how to deal with this.


"Winner-take-all" giants are not the problem. The problem is when something is strictly better (price, features, UX, performance or security) but can't win due to properties that are not inherent to the solution like network effects.


What is your claim? I don't think that the dynamics of tech giants are being properly accounted for in your example. Walled gardens are properly anticompetitive. You can't just go and iterate on facebook and expect to "suck up all of the business".


> I mean, in the late 90s everyone was talking about how the Internet would "democratize information", because anyone could become a content publisher from their garage.

Well, to be fair, this did happen. People today are better informed than ever. It worked so well that classical media was (is?) dying. But what people forgot to mention was that money would still drive society, that people still cannot know everything and will make mistakes, and that people will still manipulate others for whatever reason.

The world has become better, but it still remains flawed. After all, nothing will ever be perfect.


> and indeed the barrier to switching is so low

the cost of switching is not low though, precisely for lack of interoperability

if a user moves from one platform to another, the user has to start from scratch, because all of their contacts, social interactions and content are locked behind walled gardens


> I mean, in the late 90s everyone was talking about how the Internet would "democratize information", because anyone could become a content publisher from their garage

Well, it did democratize disinformation.


Health care software in the US is soon being forced to be interoperable. The 21st Century Cures Act requires it (https://en.m.wikipedia.org/wiki/21st_Century_Cures_Act). One interesting and, to technologists, disappointing aspect of the regulation is the complete lack of a standard by which to interoperate. There is no prescribed data format for any type of health information. Health software companies are only required to provide a hyperlink to a web page that describes the data format.

This is a step in the right direction, but it certainly doesn't enable anything that looks like the developments we have seen around the internet due to its open protocols. Health care will be "interoperable" without any of the compatibility or interfaces TFA wants. We need regulators who understand the technology and hold a much higher standard for interoperability if we are to demonopolize the internet.


That's not true as a generalization. The newly mandated HL7 FHIR standards are a huge step forward in interop [1], and we've seen varying but progressively improving levels of support from all leading EHR vendors. The immediate deadline mandates that patient information be made available via FHIR, with more data segments to follow.

Prior to this each vendor had a custom API, and getting integrations working was an enormous effort. There are a bunch of companies who offer a standardized API around various EHRs, such as Redox https://www.redoxengine.com/. Now most of them have started supporting FHIR, as a way to ingest and expose data. FHIR isn't comprehensive yet, but it'll get there at its own pace.

1: https://www.cms.gov/Regulations-and-Guidance/Guidance/Intero...
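
For readers who haven't touched FHIR: a read is plain REST over HTTPS, identical in shape across conforming servers. A minimal sketch in Python (the base URL and patient ID are hypothetical, and a real server would also demand SMART-on-FHIR/OAuth authorization):

    import requests

    FHIR_BASE = "https://fhir.example-ehr.com/r4"  # hypothetical server

    # Read one Patient resource. The [base]/Patient/[id] path shape and
    # the application/fhir+json media type come from the FHIR spec.
    resp = requests.get(
        f"{FHIR_BASE}/Patient/123",
        headers={"Accept": "application/fhir+json"},
    )
    resp.raise_for_status()
    patient = resp.json()
    print(patient["resourceType"])  # "Patient"
    print(patient.get("name"))      # standardized HumanName structures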


I confess I don't know much about CMS. What is in scope? Or, what is available in FHIR? The Cures Act includes all PHI plus anything that could be used to make medical decisions.


It's quite comprehensive. The things you're talking about are covered under FHIR L4 and L5 (https://hl7.org/fhir/).


We’re much better off letting the market decide on the best standard. Imagine if XML had become the only legal standard for data transfer. Or SAML the only one for authentication. While one single standard is preferable, regulators should give the market time to evaluate which standard is battle proven.


I agree requiring XML could have been a disaster. Still, the law seems to need to require more if we are to meaningfully achieve interoperability. What's stopping companies from coming up with pathological data formats, or generally making the data available but not easily available? Could the law specify some necessary characteristics of the data format? What if there is already a popular standard (http://fhir.org)?


If the standard is open (so that anyone can read the documentation and write a fully capable parser), it really doesn't matter what the standard is exactly.


I'm skeptical that will do anything. Healthcare tried to be interoperable with the HL7 protocol. Which worked out so well that software companies sell "hubs" to translate between the different vendor flavors of HL7.


Plaid for healthcare startup?


Redox + Health Gorilla both do this


I strongly believe platforms should be forced to allow interoperability and it should be illegal to prevent or frustrate access from other clients or services.

The idea that someone hosting a product on the internet should be able to control how I access my data or services is utter nonsense and it’s amazing that we’ve allowed it to become the norm.

This should include interoperability that allows “unbundling” such as using a site/app’s messaging feature alone with a different client or service and replacing the platform’s feed curation algorithms with your own or third party algos.
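
To sketch what that unbundling could look like, supposing a platform exposed its raw feed over a documented API (every name below is invented for illustration):

    import requests

    # Hypothetical endpoint and token; the point is only that ranking
    # happens client-side, under the user's control.
    resp = requests.get(
        "https://api.example-platform.com/v1/feed",
        headers={"Authorization": "Bearer <user-token>"},
    )
    posts = resp.json()["posts"]

    # The replacement "algorithm": strictly chronological, with no
    # engagement weighting and no sponsored insertions.
    for post in sorted(posts, key=lambda p: p["created_at"], reverse=True):
        print(post["author"], "-", post["text"][:80])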

If they can’t make money under these conditions, tough. They either need to start charging for the core product instead of extracting value in hidden ways, improve their own money making services so people don’t go elsewhere, or die.


> If they can’t make money under these conditions, tough. They either need to start charging for the core product instead of extracting value in hidden ways, improve their own money making services so people don’t go elsewhere, or die

Agreed. It is possible to have FOSS software that is not gratis. Ever heard of a business model called: 'Making something of value and charging for it'?


You can't call your license FOSS if the code isn't entirely gratis to run and distribute. It's the first requirement defined by the OSI.

It's also the only requirement that I disagree with


> You can't call your license FOSS if the code isn't entirely gratis to run and distribute.

you're introducing some confusion here: first, OSI doesn't define FOSS, just their subset of OSS;

and someone can offer to sell you, and you can buy and then resell, FOSS code (both Free (GPL according to FSF et al) and Open Source (BSD, MIT, according to OSI et al)); you are simply not required to pay extra ex post for reselling.

from OSI webpage https://opensource.org/osd

"1. Free Redistribution

The license shall not restrict any party from selling or giving away the software as a component of an aggregate software distribution containing programs from several different sources. The license shall not require a royalty or other fee for such sale."


Forgive me, but I don't understand the distinction here. I tried to build software that cost money to run, and I had a horde of people telling me it's not open source but rather "source available", because the license specified that a 20% fee needed to be paid.

I don't know of any license that is widely accepted as FOSS which also requires that the developer get paid when the software is used.


Tell the horde OSS != FOSS, and to get off their entitlement horse.

By opening your source you’re already in the minority contributing to common good, openly sharing your understanding of how to solve problems with code (aka IP). Making IP free is a big deal — you’re making it free to learn, understand, and advance shared knowledge. You don’t owe anyone making it free to run a business on.

To head off an AWS vs. Elastic situation, where someone else offers your code as a paid service w/o compensating you and w/o releasing their in-house patches (https://www.theregister.com/2021/01/22/aws_elastic_fork/), consider AGPLv3: https://choosealicense.com/licenses/agpl-3.0/

See top comment on that here on HN about 8 months ago: https://news.ycombinator.com/item?id=25834523

---

TL;DR: “How to charge for OSS”: https://www.mikeperham.com/2015/11/23/how-to-charge-for-your...


This is a distinction without a difference.


There is actually a difference.

The person who wants a change can pay you, or anyone else, to implement it. Then everyone gets it, but you still get paid for it.

Which is actually a sustainable business model. One corporation pays you $5000 for change A, another pays you $3500 for change B, an individual pays you $100 for small change C, you make $8600 this month and the whole world gets A, B and C.

In theory you might now have corporation A waiting for someone else to pay for the change instead of paying for it themselves, but if the change to them is worth $10,000/month and waiting for somebody else to do it causes them to have to wait five years, how does the math work out for them on that?


The other nice thing about OSS is that I, as the user of the software, can look across the whole labor market for a developer to add a feature I want to the OSS package, without having to rely on the original OSS developer(s). This way I can save lots of money!


Wouldn't this be a direct antithesis to any expectation of privacy or data security people have of their social media hosts? If I run an instance called mycoolfacebook.example and get thousands of people to sign up, what's stopping me from just passively saving all 'friends only' posts that pour in from people with friends @facebook.com? Do we need e2ee mastodon now, or do we just hope laws take into account malicious observations?


I think that expectation of privacy is mostly misplaced and the hosting providers are the least part of the concern: anyone who’s “friends” with more than a handful of people on social media should be treating all the posts on that platform as potentially public: there’s no way to prevent one of your connections from screenshotting and/or otherwise broadcasting your “private” posts.

In this way, the older unauthenticated model of the internet was better: by not creating an illusion of privacy around your website (think c2.com or Wikipedia), it does not encourage you to rely on that illusion for safety.


Great point. It’s a solvable problem, at least using code audits (since it’s client side)


This is a solvable problem. I think something along the lines of SSB is needed to square the circle. More progress needs to be made on the notifications and efficiency stories. Maybe there is regulation that would put pressure in that direction.

I do think that making data into enough of a liability (as opposed to profitable) will eventually break apart data silos.


This doesn't seem like a new problem. If mycoolfacebook.example is a new frontend for Facebook, it must only talk to Facebook's backend, this is reasonably easy to verify. If it has its own backend, we'll have the same concerns we already have about Facebook.


> I strongly believe platforms should be forced to allow interoperability

At what level of API with what level of SLA?

> and it should be illegal to prevent or frustrate access from other clients or services.

Many existing APIs that are intended to allow access are extremely frustrating and poorly designed and implemented. Obviously this is true for products and services as well.

It seems very hard to imagine how you could mandate good quality design and implementation of APIs.


You're making this too complicated.

All that's required is that breaking changes to the API the vendor's first party client uses to access the service be documented and announced e.g. two years in advance.

It doesn't matter how poor the vendor's documentation is. It doesn't even need to exist. As long as the first party client can be reverse engineered and the fruits of that work don't get wiped out every month by purposeful undocumented adversarial modifications.

And then a vendor has a simple way to avoid running afoul of the rule -- keep a stable API. You can still change it, if you have to, but then the documentation of the change has to satisfy the lawyers, and more importantly you only get to do it once every two years, because you have to provide that much advance notice.

And it doesn't apply to adding new features, only breaking existing ones.
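
There's even existing plumbing for announcing this in a machine-readable way: the Sunset HTTP header (RFC 8594). A minimal sketch of a vendor emitting it, with Flask standing in for the real service and the dates purely illustrative:

    from flask import Flask

    app = Flask(__name__)

    @app.route("/api/v1/messages")
    def legacy_messages():
        # Old behavior stays intact until the announced sunset date.
        return {"messages": []}

    @app.after_request
    def announce_sunset(response):
        # RFC 8594: this endpoint stops working at the given HTTP-date,
        # two years out, so third-party clients can detect it in advance.
        response.headers["Sunset"] = "Sat, 01 Jan 2028 00:00:00 GMT"
        response.headers["Link"] = '<https://example.com/api-changes>; rel="sunset"'
        return response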


> You're making this too complicated.

You’re pretending this is simpler than it is.

> you only get to do it once every two years, because you have to provide that much advance notice.

> And it doesn't apply to adding new features, only breaking existing ones.

What about changing existing features, or removing them?

Can that only be done every two years?


How is it complicated?

If an API existed yesterday, and it does the same thing it did yesterday, you're fine. If you don't like how it works, add a new one and use that. You just can't take the old one out, or change how it works, without providing significant advance notice.


> significant advance notice

Are you changing your mind about it being 2 years?

What if you want to make a change to the system that isn’t compatible with maintaining the old api?

How about if the old api can’t scale as the user base grows?

This is a clearly unworkable proposal.


> What if you want to make a change to the system that isn’t compatible with maintaining the old api?

> How about if the old api can’t scale as the user base grows?

Tough luck?

What if Amtrak wanted to make trains twice as wide?

What if EDF Energy would rather supply 160V AC power?

What if it'd be easier if there were a few orders of magnitude more IP addresses?

Once you build something, especially if it becomes part of the infrastructure, you have to support it, often for a longer time than you'd like. When you get to a certain scale and level of societal/economic importance, the support required of you as a company should be enforced by society.

Scale brings many benefits but also should come with certain expectations and commitments. We are very bad at making tech companies play their part in this.


>> How about if the old api can’t scale as the user base grows?

> Tough luck?

Tough luck for whom? It seems like nobody wins in that scenario. What do you think would happen?

> Once you build something, especially if it becomes part of the infrastructure…

This is a circular argument. The proposal is that all services be forced to be treated as infrastructure, even if they aren’t. As I say, this is clearly unworkable.

No thanks. It’s fairly obvious that the current crop of giants will not retain their power forever.

I’d rather not have them declared infrastructure and become a permanent fixture.

AC voltage and Railway gauges were standardized in the 1880s.

The idea that Facebook should be established by law as infrastructure for the next 130 years is a level of dystopian thinking I hadn’t previously considered.


> Tough luck for whom? It seems like nobody wins in that scenario. What do you think would happen?

They would immediately announce that the old API is deprecated and will be removed in two years and then have high server bills for two years.

Also, you can provide the new API in parallel and publish good documentation for it so that other implementations use the new one immediately instead of waiting the two years and then only some small minority of clients will continue using the old one.

> I’d rather not have them declared infrastructure and become a permanent fixture.

Requiring Facebook et al to have a stable API would cause them to be less permanent, because it makes switching easier.

Right now you can create an alternative messaging app all you like but nobody will use it unless other people are using it. It has to be a lot better to dethrone Facebook.

If anyone could make one that could still be used to talk to people on Facebook, it would only have to be a little bit better for people to switch to it. And once you have a messaging app that can use twelve different services, the ones people were only using because of the network effect lose that advantage and die out.

Also, how are two years and "permanent" in any way equivalent? It's two years to give interoperable implementations enough time to be switched to the new API before the old one is discontinued, nothing more. The problem right now is that the first party service will discontinue the existing API and roll out a new client using the new API on the same day, intentionally breaking all alternative implementations until they scramble to reverse engineer and implement the new one, and then do that repeatedly on purpose until all other implementations are dead.


> Tough luck for whom? It seems like nobody wins in that scenario. What do you think would happen?

> They would immediately announce that the old API is deprecated and will be removed in two years and then have high server bills for two years.

What if they are a startup who can’t afford that?

More importantly, what if they can’t realistically operate both simultaneously because there is an impedance mismatch between the implementations?

> Also, you can provide the new API in parallel and publish good documentation for it so that other implementations use the new one immediately instead of waiting the two years and then only some small minority of clients will continue using the old one.

What if the service is operated by a startup, and the clients are incumbents like Facebook, or competitors, who don't have any incentive to switch to the new API immediately or even quickly?

> Also, how are two years and "permanent" in any way equivalent?

Because it’s far easier for a company with billions in revenue to afford the costs of keeping many parallel APIs running, and crippling for a startup which needs to iterate fast. Forcing this expense on services would be the best thing to happen to the incumbents.

> It's two years to give interoperable implementations enough time to be switched to the new API before the old one is discontinued, nothing more.

You might like it to be that, but that’s not what it is. You seem to think this would hurt Facebook, when really it just creates a giant impediment to newcomers that Facebook never had to deal with as it was growing.

> The problem right now is that the first party service will discontinue the existing API and roll out a new client using the new API on the same day

Yes, which is exactly how an evolving new service must operate in order to be competitive.

> Right now you can create an alternative messaging app all you like but nobody will use it unless other people are using it. It has to be a lot better to dethrone Facebook.

We deserve a lot better than Facebook. There is no point in dethroning it with something that is only slightly better.


> At what level of API with what level of SLA?

It could be dependent on the size of the company/userbase. E.g., Facebook (>100M users) should implement the full API, while a small company inventing new social software (<100k users) should only have to implement a minimum API.


> need to start charging for the core product

I think you've misunderstood what their core product is. They charge good money to their customers - companies placing ads.


Seems like there are roughly four revenue streams. Some companies dip into all of them!

* Sell you a physical product

* Sell you a service

* Sell information about you: your conversations, your clicks, your friends, etc

* Sell you ads

Smart TVs are an example of all four at once.


What about digital products? I guess you left that out for a reason; what would that be?


A variant of #1 or, usually, #2: most of the time when you "buy" a digital good, you're really renting it out under extremely limiting terms.


I get your point, and it's valid, especially for membership sites. But some stuff is different: ebooks, audio files.


It's still correct regarding ebooks and audio files: you're not paying for the files but for the right to keep a copy of the file and use it personally.

If the files are not protected by DRM, then there's no technical limitation on copying or redistributing the file, but according to your license agreement you're not permitted to do so.

In practice, no one is probably going to come after you for copying your music files or ebooks across your devices or sharing it with friends, but you don't own the file. Try mass distributing it or reselling it long enough and you'll attract someone's attention.


Which ebooks and audio files? Two mainstream providers - Amazon (Kindle) and Spotify - both rent out access rather than selling actual ebooks.


I bought a lot of such files on various web pages. I'm not sure what your question is. Good point re Kindle/Spotify.


I think you misunderstood the point.

If it becomes possible for users to access the product without also driving ad revenue, they will soon discover that the service they offer to users is just as much a part of the core product you describe as the ad platform. Without eyeballs, ads are worthless.

Then there are two choices:

1. make people pay for that service directly (i.e. acknowledge as a product, separate from the ad business); or

2. make the part of the product with ads good enough and palatable enough that users keep visiting.


100% on interoperability.

It's not amazing that it's become the norm, because it's always been the norm that people control other people in ways that seem barbaric and counter-productive in hindsight.

There are a few creative spirits who think through what would be best for as many people, for as long, as possible. Then there's everyone else, who wants to play whatever the game already is and win.

You just can't explain to people who only care about winning that logic, decency, and solidarity are the fundamental pillars everything else they enjoy relies upon. They want what they want; their world is simple and cruel, like the animal kingdom.


Sadly I agree. Business is about optimising for local maxima — making yourself money, whether or not it makes us all poorer.

Profit and wealth are important motivators, but democratic society should balance this by looking after the global optimisation problem, even when that sometimes constrains individual or corporate profit seeking.

Interoperability, right to repair, etc. are exactly the kinds of globally beneficial, [possibly] locally detrimental trade-offs we should be pushing for.


i dont look at the boon of interoperability as being that of the creative few. i view interoperability as leaving the on ramp open to anyone, of permissionlessness that lets everyone have a chance to respin, remake, reconsider, ongoingly. we dont just think through really good solutions... we adapt & coadapt & readapt. discovery is continual & progressive & inclusive & shifting.

the social arguments dont seem necessary.


> The idea that someone hosting a product on the internet should be able to control how I access my data or services is utter nonsense and it’s amazing that we’ve allowed it to become the norm.

Nobody can control your data unless you give it to them. What do you want to do that doesn’t have an open alternative?


Yup: mandatory interoperability, transfer of data in and out, and a requirement that all directions be equally easy (and the same for transactions: no single-click signup followed by three hours on hold across five phone calls to unsubscribe).


The lesson of the internet is that lower friction always wins. If people had charged for everything on the internet from its inception, it would have died.

I'd rather have good than perfect, I think the internet is pretty good at the moment.


I see it more as capture of unwitting content producers. It's the same Faustian deal medieval landlords offered their serfs: the social media companies own the real estate and the tools for improving it, and they allow their users to live and work there for free so long as they sign over everything they produce to their lords.


That's not even remotely true. If you're big enough on social media, you can get your own advertising deals directly with advertisers and cut out the platform itself. If you're small enough that you can't do that, then your content isn't worth much on an individual basis anyway.


So... exactly what they said in the parent post? You have no choice and no value unless the landlords bestow it upon you.


Will anyone pay me for this comment? I'm guessing not. Will anyone pay for yours? Also probably not. Because in both cases they take minutes or less to write. 99.999% of the content on most social media is equivalent to this; what is it worth?

The internet's "landlords" didn't decide these comments have no monetary value; we did.


>The internet's "landlords" didn't decide these comments have no monetary value; we did.

I disagree. The value of my "creative" (I'm using that in a very broad sense) output is real and belongs to me.

While that may not be translatable to a pay day, not everything is a commodity to be bought and sold.

There are a variety of issues which created the current (dysfunctional, IMHO) landscape, none of which have anything to do with monetization.

Firstly, there's the huge barrier to entry that comes with the prevalence of asymmetric internet links. If I have (multi)gigabit symmetric network links, I can host as well as consume.

Secondly, there's no broad-based mechanism for individual control of creative output. PGP or a similar mechanism would be great for that. But instead, we have centralized platforms (see my first point) that dictate how and to whom data is shared.

With symmetric network links and strong cryptographic access controls, barriers to an individual having control of their creative output are significantly reduced.

Some folks will want to monetize that, others will not, with a mix of both being the norm.

But claiming that there's no "value" in something because you can't assign it a monetary equivalent seems a pretty narrow view of value, especially with respect to social interactions with friends and family.


It's worth everything - without these small dribs and drabs there's no social media at all.


No, viewing the posts of the person with 50k+ followers is worth something, the dribs and drabs are just the cost of business.


If you took a social network and split it into one network with everyone who has more than 50k followers and one network with everyone who has less than that, everyone would use the second one, because it would be the one with all their friends and family on it.

And then all the pop stars would move to that one because they're inherently the ones chasing the users, whereas dad doesn't want to install another app on his phone which means mom can't stop using that one and neither can you.


If you don't assign value to posts written by those without 50k followers, why are you reading these comments?


With the right micropayments architecture an upvote could easily be 0.1 or 0.01 cents. Or even $1. None of it has to be visible to the user either, just like mobile data bills aren't visible to the user
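
A sketch of the metering this implies (all prices invented): charges accrue invisibly per action and settle as one monthly total, the way mobile data does.

    # Illustrative sketch: per-action micropayments the user never sees
    # item by item, only as a monthly total.
    PRICE_PER_UPVOTE = 0.001  # dollars, invented for illustration

    ledger = []

    def upvote(post_id):
        ledger.append(("upvote", post_id, PRICE_PER_UPVOTE))

    for pid in (101, 102, 103):
        upvote(pid)

    monthly_bill = sum(amount for _, _, amount in ledger)
    print(f"monthly bill: ${monthly_bill:.3f}")  # monthly bill: $0.003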


The internet wouldn't have died, because there always were and always will be actors without a profit motive.


> I think the internet is pretty good at the moment

Yeah, with uBlock Origin installed it's actually bearable.


Yeah, some corners of the Web are bearable, assuming uBlock origin or equivalent is installed. Most mainstream stuff is frankly braindead, uBO or not.

Still much better than, as Pink Floyd would put it, "thirteen channels of shit on the TV to choose from".


Thanks for saying that. I strongly agree.


What happens if platform A wants to interoperate with platform B and requires information from person C's account about person D (because they are friends)? Person D has likely not accepted any kind of privacy policy of platform A or consented to the exchange of data about them.

This would very quickly turn into a massive GDPR headache.


If data is required to perform a service you've requested (lawful basis = 'contract') then further consent is not needed under GDPR.

If by "requires information about" you mean "wishes to track/advertise to" then yes, that sounds like a headache. This is by design, and welcome.


Big tech should be making just that: big tech. They should not be touching our data. Let them produce hardware and software independently, like in the old days of the internet. That way, they empower companies by providing the modules they need rather than the monolithic products that work against the interest of both consumers and smaller companies.


Hardware and software is not the game at big tech. The game is profiting off data and comm channels ownership.

Hardware and software is the game for "hobbyists" like Purism and Pine64 now.


Google made a custom video encoder for youtube which further entrenches their monopoly. https://www.datacenterknowledge.com/google-alphabet/google-s...


I know I'm alone in this, but I trust Google with my data more than the government, more than my family, and more than myself. If they don't handle it well, they lose billions of dollars of value.

On the other hand, I don't trust Facebook worth sh##. I would love to have a decentralized alternative (blog culture & the FOAF dream were nice while they lasted)


Not handling your data well will not cost Google billions of dollars in value.

Dealing with mega-corps that treat you poorly always reminds me of this exchange from HHGTTG:

Builder: Do you have any idea how much damage this bulldozer would suffer if I just let it roll straight over you?

Arthur: No. How much?

Builder: None at all.


You don't think that if Google has a data accident (e.g., my Gmail data gone or leaked) that their stock price would fall? Of course it would.


If your individual data was leaked? Um, you'd open a support case and likely get some canned response... before then needing to go to a media outlet or try to gain some traction on social media... and even then the impact would be minimal. Maybe if you were a famous person, or the data leak occurred en masse, it'd be a different story? I'd be curious whether this sort of thing has happened.


I think this would be a major news story: I'm pretty sure it has never happened.

(Disclosure: I work for Google, speaking only for myself)


If they lost everyone's? Of course.

Yours? No. You can join the choir of people complaining that Google has locked them out of their accounts.


Just like Facebook brand value plummeted after Cambridge Analytica. I.e., not at all.


Yeah, exactly. If that happened to Google, they would lose value in a way that Facebook wouldn't. That's why I trust Google.

I'm sure millions of others at least implicitly share my opinion. And, I'd argue, the more explicit this opinion becomes, the more real the value and the greater the risk to Google for being a poor data steward.


> Yeah, exactly. If that happened to Google, they would lose value in a way that Facebook wouldn't.

Why do you think that it would be different for Google?


Because they offer a very different service than Facebook.


There is nothing Alphabet could do to lose money. Nothing.

They have already completely screwed people many times over: deleting all of their Drive data, locking them out of 10-year-old accounts over false alarms about some "bad content" or something.

One does not simply sue a company like this. It is larger and more wealthy than a nation state and unaccountable to all.

The Butlerian Jihad seems more plausible every year.


Aside: Cambridge Analytica was a scandal of interoperability. People gave CA access which it then abused to collect data about others. In the kind of highly interoperable world that is being proposed here, this is not something that you could prevent.


It doesn't really matter whether you trust Google or the government more with your data, because either way the government gets it.

* https://en.m.wikipedia.org/wiki/PRISM_(surveillance_program)


PRISM is old news; Google encrypts data between its datacenters now


The government can still get it with a valid warrant, though.

(Disclosure: I work for Google, speaking only for myself)


And with a gag order to boot!


This is already happening in banking in Europe: Google "PSD2" and "Open Banking". Financial institutions are required to provide third-party providers API access to customer data (if the customer has provided his/her consent to do so).
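
For a flavor of what that access looks like, here is a sketch loosely modeled on the UK Open Banking account-information API (the host and token are hypothetical, and real calls also involve strong customer authentication and a consent flow):

    import requests

    # Hypothetical AISP (account information service provider) call,
    # made only after the customer has granted consent.
    resp = requests.get(
        "https://api.example-bank.eu/open-banking/v3.1/aisp/accounts",
        headers={
            "Authorization": "Bearer <token-from-consent-flow>",
            "Accept": "application/json",
        },
    )
    for account in resp.json().get("Data", {}).get("Account", []):
        print(account.get("AccountId"), account.get("Currency"))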


Browsers are cited as an example of interoperable tech in the article. While it may be true that *anyone* can write their own browser, we see that Chrome has an inordinate amount of power. Even though it's literally based on an open-source engine that people can fork (and have forked) to build competing browsers, there isn't a wildly competitive browser market.

Maybe just enforcing interoperability won't cut it.


It's a valid concern. There are so many issues with the current architecture that there is likely no silver bullet.

The ultimate objective is to align the interests of users with the interests of service providers (abolish the user-as-a-product business model). Interoperability may be used as a fulcrum to force some price discovery around services, or to allow building new business models that add value for users; who knows... Anything but the current dystopia.


i'm 1% concerned, 99% still thrilled & delighted. the interoperability here is amazing. and there's still a lot of room for growth, especially if we start focusing on websites that support interoperation, and encourage it.


Browser competition doesn't matter much for interoperability, because they don't restrict what sites you can use. It's only a risk if browsers start banning sites like the app stores do.

And open source Chromium and Firefox are bulwarks against that, with active fork ecosystems.


It goes the other way now, Google blocks non-Chrome and select others: https://news.ycombinator.com/item?id=25155451


> It's only a risk if browsers start banning sites like the app stores do.

What if browsers start banning apps from their own app stores, though?

https://hub.packtpub.com/mozilla-and-google-chrome-refuse-to...


You can say about Ethereum what you want, but the payable keyword in Solidity blew my mind.

Having a globally available, standardized, decentralized and transparent way to pay for all API calls baked into a programming language is a pretty awesome feature.
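
To make that concrete, a sketch of the calling side using web3.py (the provider URL, contract address, and subscribe() function are all invented; a real transaction also needs a funded sending account managed by the node, or a signed raw transaction):

    from web3 import Web3

    w3 = Web3(Web3.HTTPProvider("https://rpc.example-node.io"))  # hypothetical

    # Minimal ABI fragment for an invented payable function.
    abi = [{"name": "subscribe", "type": "function",
            "stateMutability": "payable", "inputs": [], "outputs": []}]
    contract = w3.eth.contract(
        address="0x0000000000000000000000000000000000000000",  # placeholder
        abi=abi,
    )

    # This is what the payable keyword unlocks: the call itself
    # carries the payment.
    tx_hash = contract.functions.subscribe().transact(
        {"value": Web3.to_wei(0.001, "ether")}
    )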


It's a nice idea, but what can I pay for now? Does it rely on a centralized notion of identity?


Right now Ethereum smart contracts don't have access to real world data from outside the Ethereum virtual machine. This is why most contracts are financial in nature.

There are projects attempting to fix that but so far nothing has materialized yet.


I believe Civic is working on a sort of identity verification blockchain integration but I haven't heard much come out of the space as of yet. In fact, I think it might dissuade people who don't want their identity linked to their crypto transactions.


What about Chainlink and TheGraph?


Too bad the US government is worthless. Say what you will about other countries and the EU, but at least they assert themselves sometimes.


Maybe if they asserted themselves as much as the EU does, the tech situation in the US would be the same as the one in the EU.


Would the EU assert itself so firmly against a European Facebook? Maybe multiple governments fighting for their slice of the internet tax pie is just slowing big tech's regulatory capture.


Maybe, maybe not. Is the EU's situation a result of their regulation?


Probably.


Based on what? Any evidence?


Yes. I assume that Europeans are as smart and entrepreneurial as Americans; therefore the kinds of companies they build are likely a result of the regulatory environment.


There are big European software companies, such as Avast, but they are mostly now owned by Americans.

There aren't big European social networks, but that might be caused by the fact that Europe is linguistically and culturally very fragmented. Which does not matter as much if your product is an antivirus, but it does matter when you build a product whose value is in the sheer mass of mutually communicating users. The Anglosphere is HUGE.


Reductionist to the point of nonsense of course. It could be any of many factors.


Such as?


Such as legacy effects.


It's funny how interoperability has to be reintroduced as a new concept now, when back in the day it was part of the philosophy of the internet itself. I heard stories about a CS professor who had to teach incoming students what files and folders were. The interoperability thing is like that. 30 years into the Eternal September, we're learning that nothing we were enculturated in computing-wise can be taken for granted and we must re-teach it all to our successors.


Interoperability is kryptonite to business. It kills whatever one's moat is, the way a microneedle will kill a cell by puncturing its membrane. Most companies avoid it at all costs, except for using it as a weapon to hurt competition. Of note is how upstarts embrace interoperability while it gives them an edge over incumbents - and then abandon it as soon as they establish themselves as a major player (see e.g. Slack, which built their userbase on this trick).

Early Internet was interoperable because of low commercial interest. Now it's centralized because it's a big market. Interoperability got replaced by contracts.


Interoperability needs to be brought all the back to the protocol. Force ISPs to give people static, public IP addresses. Then the tech industry has to build local-first, P2P apps, and what was once a dominant social network is now just another interface competing with features people actually want.

What is broken about the internet is that we don't have our own addresses, like in real life. (Cue security red-scare and condescending technical types who think it's too complicated for the user.)


I think it would cause a lot of problems if ISPs were forced to give everyone a static IPv4 address. However, I agree that they should give people an IPv6 address. Furthermore, they should make any dynamic mapping accessible via DNS by default (with an opt-out process for people who don't want it). Ideally, IPv6 address assignment should be done without NAT as well, since NAT is a much bigger hurdle to overcome than dynamic IP addresses.


> static, public IP address

That’s basically a super-cookie.


Everything would be a different paradigm than what we have now. Security, auth, sharing: we can only imagine how things would shift if we turned the current model on its head. Super-cookie? The ISP is a private VPN provider now.

Cookies are a good example. That browsers give up a cookie at all—who consented to this specification? GDPR could have changed browser specs so that cookies were truly opt-in; instead it regulates company behavior, which is weak.

Fundamentally, our choices are being made for us at the protocol level, and everything we have as a result is emergent, and so people argue about regulating the emergent properties.


> ISP is a private VPN proivder now.

Reality check: ISPs sell your information. They’re pretty much the last people you should trust.

> who consented to this specification?

Third party cookies (the “tracking” kind) were a bug. The original specification did not include them.


> Reality check: ISPs sell your information. They’re pretty much the last people you should trust.

Exactly why we should regulate IP address and protocols, so that every company that handles them is beholden to the same conditions for preserving our privacy. Instead we play whack-a-mole regulating individual company behavior, while they continue to control the protocols and addresses and everything on top of those layers.


Another problem is that Big Tech owns the relatively corrupt US government.


Yes, but only authoritarianism can make big business kneel. And that has its own problems.


> Yes but only authoritarianism can make big business kneel

I'd say this only applies within a capitalist mode of production.


If Big Tech is too powerful, and the US government is too corrupt, then combining them together in a planned economy is unlikely to improve matters.

If, on the other hand, you're instead suggesting an alternative to the capitalist mode of production that doesn't involve state-run monopolies, then I think you need to draw the rest of the owl before we can judge whether your system will end up with authoritarianism or not.


Basically, the only way this really works is if we define standard APIs for all forms of cloud services, from infrastructure all the way up to social media and beyond. In prior eras, protocols were custom designs on top of IP. That's not going to fly today, and it also won't let us move fast enough. Most API definitions at the likes of Google are now written in protobuf; another alternative is OpenAPI. Either would suffice, but I think protobuf is less verbose and potentially easier to evolve.

It also requires open standards for defining those services and potentially open implementations. See github.com/micro/services as an example.

Ultimately it's going to take a long time and significant coordination between multiple players for it to happen. I do wish we just had an open set of services anyone could run and contribute to. Then we could either go through the pains of hosting ourselves or paying someone to do it for us.


> is if we define standard APIs for all forms of cloud services

Nope. All it requires is for each company to provide unrestricted access to the APIs of their services. Plenty of interested parties out there to complete the plumbing


Yep, the good old adapter pattern.


I agree with Doctorow's goals here, but the framing at the start of the piece doesn't make any sense. He claims we need to fix big tech abuses, then gives disinformation and copyright infringement as his examples of issues to fix.

Those aren't big tech abuses; they are two systemic side effects of the internet itself... of a technology that disintermediates gatekeepers from peer-to-peer communication. If anything, big tech serves as the one gatekeeper that has any chance at all of addressing those issues. Empowering communities and individuals to escape monopoly platforms decentralizes disinformation and copyright infringement and increases the severity of those problems.

I think there are good reasons to decentralize the current mega platforms we have, but addressing disinformation or copyright management aren't them.


The longer piece on CACM touches on this: https://cacm.acm.org/magazines/2021/10/255710-competitive-co...


I came here to say just this. The pivot at "rather than fixing tech companies, we can fix the internet" makes absolutely no sense.


I don't have much to add here. I will share that one thing I cherish about my doctor is his refusal to go to digital records; everything is paper. He has long been an advocate for repealing medical record retention laws. Keeping it on paper at least adds something of a speed bump.

The hope would be that the system splinters: those who want a more data-oriented system can have it; those who don't can remain in the past.

Other than that, Herbert L Fred's "The Tyranny of Technology" and that of his term 'technological tenesmus' comes to mind; https://doi.org/10.1080/21548331.1997.11443437


Interoperability is arguably a hit to innovation. You start with just text messages; one company wants to add images, so either (A) they have to push for a standard, which takes years, or (B) they add a non-standard extension. Then someone wants to add video clips, audio clips. Okay, you say, older clients will skip those. But then someone wants to add threading, and suddenly the entire format needs to change (see A and B above).

Or you just let each developer go as fast as they want, adding new features and/or selecting the ones they want (e.g. Apple adding Memoji, which works by sending only parameters that reference very specific, copyright-protected assets).


> Okay you say, older clients will skip those.

Yes, that is what I say.

> But then someone wants to add threading, suddenly the entire format needs to change

Does it? If older clients don't want to implement threading, they can just receive the message bodies in the order they were posted. The old API should continue to work regardless of the extra metadata which one provider might be storing about it.

Ideally, the host with the new feature would also add an "In response to #123456789:" line at the top of each message, so that users of the older clients can construct the threads in their mind (assuming that these older clients have been listing the IDs of the messages), but I don't think anyone's arguing that a service is breaking interoperability by failing to add such an affordance.
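
A sketch of that graceful degradation, using an invented message shape; each client reads the fields it understands and ignores the rest:

    # Invented format: "thread_id" is new metadata that an older
    # client has never heard of.
    messages = [
        {"id": 1, "body": "original post", "ts": 100},
        {"id": 2, "body": "a reply", "ts": 110, "thread_id": 1},
        {"id": 3, "body": "something else", "ts": 120},
    ]

    def render_old_client(msgs):
        # Pre-threading client: message bodies in posting order.
        for m in sorted(msgs, key=lambda m: m["ts"]):
            print(m["body"])

    def render_new_client(msgs):
        # Threading-aware client: indent replies under their parents.
        for m in sorted(msgs, key=lambda m: m["ts"]):
            prefix = "    " if "thread_id" in m else ""
            print(prefix + m["body"])

    render_old_client(messages)  # works unchanged despite the new field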


I had this thought that it might be possible to define some of the social media interfaces in terms of Cap'n Proto services. Then it wouldn't matter where the service was hosted. Especially for things like LinkedIn, the major draw is that CV data is made available in a convenient and canonical form.


See also CloudEvents[0].

This post fails to mention the incentives of incumbents who have seen the innovator risk filtering pipelines constricting. Inviting more people to play feeds their futures.

[0] https://cloudevents.io


When he says "Let's Fix the Internet, Not the Tech Giants", in my mind I translate that to: "capitalism doesn't care directly about human well being. Well, that's not cool, but I believe it's better to try to provide an alternative flow / dynamic / space for competition than trying to stop the immoral practices of these powerful beasts directly. Don't fight the problem face-to-face, try to make it obsolete".

Well. One should be shocked: government and laws are exactly what should protect human well-being directly and decisively when other things fail (or in prevention), but now it turns out capitalism is too powerful so we can't do that? We have to ignore morality for a while, and start to leverage government and laws just to create a side pathway that might eventually lead to the possibility to compete against the big beasts of capitalism in their own (or slightly shifted) terrain? Hope we adapt better than they do?

I'm not even saying this is stupid. It might actually be the most pragmatic way forward. I tend to take a similar view when looking for solutions... but when we reached this point, we should realize that the problem is not the tech giants, interoperability, Java's error model, APIs, EFFs, js typecasting nor the internet. If having to resort to this kind of strategies doesn't make it clear to us that we are playing the wrong game, we are doomed at a more fundamental level: money sits at the top of the power pyramid, and we have no effective mechanism to balance human well-being against it (which doesn't make patches useless, but maybe we should start prefacing appropriately or writing angry ò.ó comments about it at some point).


> money sits at the top of the power pyramid, and we have no effective mechanism to balance human well-being against it

That's a succinct analysis of the root cause of many problems today, and it helpfully points towards some patches that would address the problems directly. Firstly, the link between money and political power could be severed by having publicly funded elections and limits on political advertising (which might require a narrowly-targeted constitutional amendment[0] in the US).

Secondly, the fact that money creates power is not a problem for human well-being if everyone has a similar amount of money (assuming that having power ensures well-being). Progressive taxation already exists in much of the world, but some countries go a step further with a wealth tax. I would advocate for a wealth tax under which millionaires pay the same percentage of their net worth in tax each year as the median taxpayer does (the wealth tax itself would not apply at the median).

[0] https://campaignfinancereform.org/explanation


The internet is already widely interoperable. But some efforts of late make it less interoperable, and less able to route around problems in the network.

The obsession with HTTPS has to end. Not because of privacy, but because of interoperability.

DNS over HTTPS is not interoperable with the whole universe of existing DNS products. But what's worse, it's locking people into centralized platforms. With traditional DNS, you can move to any network at all and automatically pick up a new local, fast, customized DNS caching resolver designed for the network you're on. If you're on DNS over HTTPS, you've always got the same provider, which does not scale to every network. The solution, people will tell you, is just to disable DoH. Until we no longer can, because everything expects to use it.
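
To see the lock-in concretely: classic DNS asks whatever resolver the local network hands out via DHCP, while a DoH client hardcodes one provider's URL. A sketch against Cloudflare's public JSON DoH endpoint (a real service; error handling omitted):

    import requests

    # The resolver is baked into the application, so changing networks
    # no longer changes resolvers.
    DOH_URL = "https://cloudflare-dns.com/dns-query"

    resp = requests.get(
        DOH_URL,
        params={"name": "example.com", "type": "A"},
        headers={"Accept": "application/dns-json"},
    )
    for answer in resp.json().get("Answer", []):
        print(answer["name"], answer["data"])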

The obsession with HTTPS has also led to the apologists decrying any technical solution that doesn't use TLS 1.3 and HTTPS, because middleboxes!!!! And because literally everyone is reluctant to design new protocols that can be extended as successfully as HTTPS. If it doesn't work over HTTPS, it's not part of the modern internet. This not only severely restricts how you can design technical solutions today, it's stupid: we have this transport protocol with 65,000 port numbers, but we'll only ever use one of them (443), because a redesigned stack is just unfathomable.

Every modern network service today needs many things. Routing metadata, dynamic host/service lookup, federated authentication+authorization, encryption, geo-localized load balancing, error correction, session management, etc. If we build things like these into lower levels of the stack, and build primitives for them into the operating system, then all applications can gain their benefits, and we won't need to rely on convoluted hacks to provide it all.

We can't keep on for the next 100 years with the shitty protocols and shitty solutions we have today. We have to start thinking about brand new designs, and how we will upgrade systems to use them. Otherwise, every solution we come up with will just become more and more convoluted and ridiculous, as we build more and more on top of antiquated systems designs from 40 years ago.

Phone lines were pretty cool. We were able to extend them to transfer data, from 1400 baud to 1.5 megabits. We could technically do up to 50+ megabits, but it wouldn't scale. So we built new solutions. They were expensive, but we needed them in order to grow. Well, I think it's time for tcp/ip and its related protocols to be replaced as well. Not immediately, but it's time for us to start building the replacement.

That new replacement can take everything into account in a variety of new stacks. Federation of data, access, services; new kinds of encryption and privacy mechanisms, new trust models. New routing and service models to make the "last mile" less complicated and more flexible. And more responsive to network partition, including the ability to detect them early, to make applications more responsive.

We can do literally anything we want, people! We can start building the future today! But we have to choose to do it!


It pains me to see even technically literate people referring to the web as the "internet". They are NOT the same.


What is the distinction?


> The internet is a global network of billions of servers, computers, and other hardware devices. Each device can connect with any other device as long as both are connected to the internet using a valid IP address. The internet makes the information sharing system known as the web possible.

> The web, which is short for World Wide Web, is one of the ways information is shared on the internet (others include email, File Transfer Protocol (FTP), and instant messaging services). The web is composed of billions of connected digital documents that are viewed in a web browser, such as Chrome, Safari, Microsoft Edge, Firefox, and others.

https://www.lifewire.com/difference-between-the-internet-and...


Interesting, I didn’t know that!


The Internet is the current incarnation of the Network, this time as a digital packet switched network (its predecessors being the Public Switched Telephone Network, and the Universal Postal Union which enables letters to be sent around the world). The Network is a tremendously powerful technology which enables civilisation on a much larger scale.

The Web, or World Wide Web is an application of the Internet made by (Sir) Tim Berners-Lee about thirty years ago to deliver hypermedia over the Internet.

The Web is to the Internet as "premium rate" chat lines were to Signalling System Seven.


I work on a project, Esteroids, whose goal is creating a democratic Internet. I also wrote about it in my previous project, Almonit (currently discontinued).

https://almonit.club/blog/2021-01-08/self-governing_internet...


Tangential: What a beautiful website!


How does this solve the content moderation problem? We're allowed to upload 18+ content on Twitter, but a nursing parent posting a nipple on Facebook is grounds for account deletion. Or, in another case, what if I block a user on Facebook but they come through via Twitter? Does interop need to include verified user identity to prevent abuse?



