Who Will Control the Software That Powers the Internet? (a16z.com)
116 points by simonpure on Jan 23, 2021 | 76 comments



It's unclear to me whether the person who wrote "who controls the Internet" has ever interacted with core internet infrastructure at the router, BGP, major transit and transport service provider ISP level.

I find it very hard to take seriously prognostication from persons who've never been responsible for even a small ASN.

Persons I do take seriously when they prognosticate on the future of the Internet:

The individuals responsible for some of the most important RFCs of the past 20 years.

The people who have the equivalent of 'enable' on core routers of gargantuan ISPs, and on much smaller ISPs doing very special things in the world like last-mile 1Gbps symmetric FTTH.

Persons recognized within the communities around RIPE, ARIN, APNIC, AFRINIC, etc., and the various special-interest regional network operator groups, as having a high degree of network engineering acumen.


There was a recent post from an a16z insider that basically made it clear that they are nothing more than a media house with VC capabilities. That is probably why the article doesn't make much sense to you.


I just had a good chuckle looking at the top google result for a16z, the tagline of their own website is:

"Andreessen Horowitz | Software Is Eating the World"

I don't disagree with that as a statement, but it's such a funny premise to put as the MAIN THING for your big serious VC website.


That quote is what made a16z. It's the cornerstone of their image; it seems they are riding that quote into the ground.

If you are interested in having some more laughs, this is the full story: https://www.newcomer.co/p/the-unauthorized-story-of-andreess...


Behind a paywall but here’s the original: https://www.wsj.com/articles/SB10001424053111903480904576512...

Here’s a non-paywalled version: https://genius.com/Marc-andreessen-why-software-is-eating-th...

If you’re questioning the venue of that second link, as I recall, a16z was one of the earlier backers of Genius.


Link?


That was a hit piece. Another salvo in the mainstream media's ongoing war on Silicon Valley.


Is there any group at all that doesn't think that the "mainstream media" boogeyman is at war with them?


Do you know of another group that took ad revenue away from the media the way high-tech companies did?


Isn't it possible to keep tracing things back to a more base layer beyond what you've described, though? It's kind of arbitrary to draw the line in any particular place.

Like I could write "it's unclear to me if the person who wrote 'interacted with core internet infrastructure at the router, BGP, major transit and transport service provider ISP level' has ever interacted with electrical grid management, power plant operations, or coal or natural gas pipelines and logistics.

I find it very hard to take seriously prognostication from persons who've never been responsible for the electrical input of a nationstate."

But then someone else could just respond to that with a similar message about how those who control the power grid are hard to take seriously when they've never grown their own food or managed the food and water needs of the humans who run the power grid.


Yes, but "The Internet" is an actual, specific thing composed of physical stuff at OSI layer 1 and various other things at layers 2-7 of the model.

If a person whose only experience with it is at the software application layer starts opining about what internet infrastructure should look like beneath the layers they know, I would try to determine what experience, if any, they have with actually running an ISP.

To follow your analogy, I don't agree with the comparison, because it would be more like criticizing an article titled "Who Controls the Power Grid" in which the author opines on some cryptocurrency-based solution for something related to the grid, but has no experience actually running the grid, as an electrical engineer at the BPA or TVA might have.


No, TCP/IP and the ability to exchange routes between different networks is what makes the Internet. All of the value is in these interconnected networks and being able to send traffic to arbitrary endpoints regardless of your provider.

There is no “base layer” beneath TCP/IP. That’s the beauty of the Internet. The underlying ISPs connecting together will have different lower level protocols, different internal routing protocols, different hardware, different software, different power sources, different currencies, different spoken languages, etc.

AWS/Google/Cloudflare could cease to exist tomorrow and the Internet would continue on just fine. There would be a ton of websites down for US/European residents, but the Internet would work nonetheless.
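
To make that concrete, here's a minimal sketch (Python; the hostname is just illustrative) of the only layer an application ever touches. Nothing below the socket is visible, whatever media the packets actually cross:

    import socket

    # The application's entire view of "the Internet": an address and
    # a port. Whether the path underneath is DWDM, microwave or DSL
    # never shows up at this layer.
    with socket.create_connection(("news.ycombinator.com", 443), timeout=5) as s:
        print("connected to", s.getpeername())  # an (ip, port) tuple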


> There is no “base layer” beneath TCP/IP.

Actually, there usually are. Physical transport layers run their own protocols and the physical media themselves (assuming we're not talking over-the-air) also count as layers in the stack. And while these have been mostly commoditized, whoever controls them does have a lot of power.


> Actually, there usually are. Physical transport layers run their own protocols and the physical media themselves

Whoosh, you missed the point. The “base layer” of the Internet is the one that allows cross-connectivity of the networks. All of the layers beneath tcp/ip are interchangeable and people will still be able to get on the Internet.

> And while these have been mostly commoditized, whoever controls them does have a lot of power.

They were immediately commoditized by design when we standardized on IP. IP over MPLS vs IP over ATM vs IP over frame relay looks no different to the end user. In one minute I can roam from a network based on DWDM to one based on microwave backhaul, and I can still connect to Hacker News and pull down comments over TCP/IP.

When people want “Internet access”, the vast majority don’t give two shits about the medium, the link layer protocols, or anything like that. They want IP connectivity that has decent speeds. That’s the “base layer” that everything on the Internet is built on. Anything below that is transparently (and frequently) replaced because it’s an irrelevant implementation detail stripped off at the boundary of each network.


> There is no “base layer” beneath TCP/IP.

The people who run the DWDM systems in my region might disagree with that statement :)

Not a network protocol layer but there's plenty of transport gear that exists at layers 1/2 in the OSI model and is opaque to people who are looking at things from the perspective of router-to-router (or metro ethernet L2 transport) adjacencies.


If someone can get by without your technology and access the Internet, it’s not the base layer of the Internet.

DWDM and whatever other technologies are important but simultaneously irrelevant when discussing the base layer of the Internet. Everything online is built on TCP/IP, so Twitter doesn't need to do anything if Comcast switches from DWDM to carrier pigeons as backhaul.


I think he's talking about a layer up.

He has a bit of a point.

In practical terms, if you want to purchase something on the internet or get an email address, you'll interact with the companies/platforms he alludes to.

Thankfully email is open enough that you have a lot of choices, up to running your own server. But purchasing something is all tied up.


> But purchasing something is all tied up.

Not entirely:

* https://en.wikipedia.org/wiki/WooCommerce

* https://en.wikipedia.org/wiki/Comparison_of_shopping_cart_so...

What's really tied up is payments: Visa and Mastercard specifically.


These are the technicians who control IP.

For the US this can really be whittled down further: the people who go to IETF and NANOG meetings and work for AT&T and Verizon, because the vast majority of Americans are dependent on pipes or cell phone coverage from those two companies.


When I worked at one of the biggest ISPs in the world, we got extremely despondent when HTTPS and other end-to-end encryption became the standard, because we were then unable to understand what traffic was on the wires. For ISPs and network providers, everything is now just bits, and it's all commodity. The service layer is now implemented by clients.


I 'grew up' on networking in the late '90s, when next to nothing was encrypted on the wire. The amount of understanding I was able to develop about how protocols work just by sniffing the packets was immense. I see folks coming out of school these days, and while they far surpass my skills in software development, they don't really have any intuition about how the protocols work at the network level, and as a result tend to flounder when things break.

However, if I hadn't stowed away all of that info 20+ years ago I think I would be in largely the same position. Obviously encrypted network protocols are essential for security and privacy, but it kind of sucks for folks trying to figure out how things work.
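
For anyone trying to build that intuition today, a rough sketch of the exercise (assuming scapy is installed and you have capture privileges; expect most payloads to be TLS ciphertext now):

    # Requires scapy (pip install scapy) and root/capture privileges.
    from scapy.all import sniff

    # Print a one-line summary of ten packets. The Ethernet/IP/TCP
    # headers are still readable; the payloads are mostly TLS
    # ciphertext these days, unlike the plaintext of the late '90s.
    sniff(filter="tcp port 443", count=10, prn=lambda pkt: pkt.summary())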


Another big change was the move to switched networks. Coming from a shared lan that felt like being blind.


Great point! Then you'd span ports just to get the same visibility you used to have but somehow it felt sneaky haha.


That's a good outcome. I would be very scared if the electrical utility company started looking at what I'm using the electrons for and started disallowing certain types of uses for their electrons.


> I would be very scared if the electrical utility company started looking at what I'm using the electrons for and started disallowing certain types of uses for their electrons.

The energy companies already do disallow certain types of use for "their electrons" (check your contract, seriously - domestic users are forbidden from certain uses), and with the advent of accurate monitoring thanks to 'smart meters' they are able to enforce that much more easily.


What uses are you talking about?

As far as I understand it suppliers are generally more than happy for you to use more of their electrons and don't particularly care what you use them for.

One thing that I do think will change over the next few years, helped by smart meters, is that we'll see more time-of-use pricing, and that with renewables on the rise (and their associated intermittency), at some point we'll see the traditional night-rate-cheaper-than-day-rate equation flip.

By being able to measure consumption at higher resolution, it's possible to offer pricing that is reflective of the "true cost" of the energy, e.g. the agile tariffs we see emerging from companies like Octopus in the UK.
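
A toy illustration of the arithmetic (all prices and usage figures here are made up, not a real tariff):

    # Half-hourly billing vs a flat rate. All numbers are illustrative.
    prices_p_per_kwh = [25, 25, 35, 35, 12, 12]         # six half-hour slots
    usage_kwh        = [0.2, 0.2, 0.1, 0.1, 1.5, 1.5]   # EV charge shifted into the cheap slots

    agile_cost = sum(p * u for p, u in zip(prices_p_per_kwh, usage_kwh))
    flat_cost = 28 * sum(usage_kwh)  # a flat 28p/kWh tariff for comparison

    print(f"agile: {agile_cost:.1f}p  flat: {flat_cost:.1f}p")  # agile: 53.0p  flat: 100.8p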

Personally I think this is good for consumers, and good for the environment as it allows for incentivising consumption when the energy is greener/cheaper.

There is certainly an element of concern over privacy (eg detecting activity in a home), however I would posit that internet usage patterns probably already provide the same data and of course with far worse privacy implications.


> What uses are you talking about?

> As far as I understand it suppliers are generally more than happy for you to use more of their electrons and don't particularly care what you use them for.

In the UK the law differentiates "domestic customers", who use services for domestic purposes (this part is important: commercial use is excluded!), from "non-domestic" customers, which includes all others. Domestic customers get to have a fuel price cap and a lower Climate Change Levy thanks to that (though caveats apply, as it does not include "green deals" and fixed terms), but "non-domestic" users do not.

Of course you are right: the suppliers would indeed be more than happy to supply you as much of their electrons as you could obtain - at business rates, not domestic, ideally at one where caps don't apply.

Take this little innocuous line from the British Gas (Centrica) standard rate T&Cs:

    Supply and supplying: The gas or electricity (or both) we provide, or what we do to provide them. The supply is for you to use entirely or mainly for domestic purposes

There is a reason for it - a significant enough jump in energy use (say, if you want to charge an EV or put up a multi-GPU bitcoin miner) does trigger an investigation. EV users get to select a preferential rate [1], suspected commercial users are asked to move to business rates (or show why that shouldn't be the case), and any detected pot growers get referred to the appropriate police force. The energy company does care - at least so you can pay it more.

> 'Smart meters'

I agree that the move to more accurate/higher-resolution measurement would make it possible to offer pricing that is more reflective of "true cost" - Economy 7 is well known to skew incentives, especially as more and more energy is used outside of traditional "day" hours (and, with more and more solar power in the mix, the overgeneration during day hours) - but I would not be so optimistic that consumers are going to be the primary beneficiaries of the change when the dust settles; they never are.

As for privacy, well, there is one substantial additional difference: your bandwidth and internet usage patterns can generally only be monitored by your ISP or the internet services you visit, whereas a SMETS2 meter will create a nice new 2.4 GHz network (and 868 MHz, if/when it is available), so sooner or later it will be readable by a local attacker, if it isn't already. This is a device that is supposed to last as long as 20 years (though SMETS1 proved this might be a bit optimistic...).

[0] https://www.britishgas.co.uk/GetAQuote/tariff-information/ta... [1] https://www.britishgas.co.uk/GetAQuote/tariff-information/ta...


Thanks for the detailed response, there's a lot to unpack there and a few things I wasn't aware of.

> Domestic customers get to have a fuel price cap

Interesting, I didn't realize that commercial usage wasn't subject to the price cap; I generally know very little about commercial supply.

> EV users get to select a preferential rate

Yes, most suppliers are now offering time-of-use tariffs that are better suited to electric vehicle owners; these typically come with a super-low off-peak rate and then a higher rate the rest of the day.

For someone who can charge a vehicle during that off-peak window, they should come out ahead, but I'd be very surprised if British Gas got upset about you charging your vehicle on their standard tariffs.

> any detected pot growers get referred to the appropriate police force.

I thought that kind of investigation was generally instigated by the police rather than the other way around, but not 100% sure.

> SMETS2 meter will create a nice new 2.4 GHz network (and 868 MHz

I presume you're meaning the HAN that the meter uses to update the in home display. As far as I'm aware this is securely encrypted. Whether it'll stand the test of time who knows (let's face it WEP and friends certainly didn't), but the same logic could be applied to the WiFi network your router is creating.

Difference being that the data going over the HAN is likely to be a constant stream of values meaning it doesn't leak much information unless you can decrypt it. Someone able to passively observe that local network can almost certainly observe your WiFi transmissions, which will be much more variable - little to no activity when you aren't home, and lots of activity when you're streaming a movie.

Also, yes, SMETS1 was a bit of a cock-up, but they aren't obsolete, just dormant and in the process of joining the DCC - https://www.smartdcc.co.uk/smart-future/enrolment-and-adopti...


> I presume you're meaning the HAN that the meter uses to update the in home display. As far as I'm aware this is securely encrypted. Whether it'll stand the test of time who knows (let's face it WEP and friends certainly didn't), but the same logic could be applied to the WiFi network your router is creating.

Yup, I do mean the HAN - and the way CADs, including the IHD, connect to it. It may be based on ZigBee, but it modifies the protocol to add its own authorisation and encryption protocols. They may have a sound basis as of today, but given the precedent (GSM/UMTS encryption, WEP, WPA, but also the Infineon cryptographic coprocessor failures), the custom modifications to ZigBee, and a couple of other deviations from usual practices, I would consider it almost a certainty that within the lifetime of the device the protocol will be broken or bypassed. The question is how and when, not if.

Let me put it this way: I would be pleasantly surprised if it is otherwise, as that would be the exception, not the rule.

> Difference being that the data going over the HAN is likely to be a constant stream of values meaning it doesn't leak much information unless you can decrypt it. Someone able to passively observe that local network can almost certainly observe your WiFi transmissions, which will be much more variable - little to no activity when you aren't home, and lots of activity when you're streaming a movie.

Most of the equipment I have is wired (optical or copper), including supposedly mobile devices ([0]), and the ones that aren't are not a good indicator of presence (roughly constant data stream). It's a different story if one were to bring an IMSI catcher, of course - but there is only so much you can do and still participate in the modern world.

As a side note: I admit I may be biased, as the meter location would preclude any IHD from being in range of the HAN (5 reinforced concrete slabs away for electric, worse for gas), has problems with GSM connectivity for the same reasons, and has dubious physical security against anyone who possesses a high-tech tool known as a "screwdriver" or "knife". The fallback procedure that UK suppliers have for this situation is "read the meter like a dumb meter" [sic!] until the supposed "alternative HAN solution" is in place. I know it puts me in the 3.5% minority of households, but still it rankles.

[0] This is normally where I go on a rant about flagship Pixel USB-C powered phones being unable to charge and use a network adapter at the same time due to an incomplete implementation on the phone side - one or the other. Older micro-USB models could do both as long as USB-OTG was supported. iOS devices have no such issue, but they have other ones.


Yes - I think it's generally agreed that this is the case, especially as we move to effectively infinite bandwidth for home/office use (practically no one uses more than 100 Mbps for more than 60s a day). Unfortunately in computing we have two more layers of utility: the ones that own the clients (Microsoft, Apple) and the ones that own the services (Facebook, Twitter, Microsoft, Apple). We really need to get this fixed.


Interesting criteria, as I had 'enable' on AS701, among several others, for a few years, and I understand where the article is coming from. I do think the smart bet would be on divergence before change. The analogy I think fits: the blockchain community today is like the NetBSD community growing without Linux ever having happened. There is still room for a more accessible, culture- and purpose-specific, Linux-like framework for the next overlay "networking" stack to emerge.


A lot of times when people say “the internet”, they mean “the web”


This reminds me of a quote I heard recently:

“If the critics aren’t getting their ass kicked in the arena, they’re not worth paying attention to”


He’s talking about the software layer, not the protocol layer.


You know Andreessen wrote the first browser, right?


I find this article naively optimistic (which may be unsurprising given how much Andreessen Horowitz is invested in the crypto space).

I don't see how anybody can delude themselves into thinking that blockchain is about giving control back to the users, or that it is somehow democratic. Just look at the distribution of the supply for any token. You'll have the top 1% of wallets controlling > 80% of the supply, devs keeping half the supply to themselves. Even airdrops to users, like Uniswap did, don't prevent someone from buying up most of the supply. Crypto and governance tokens give you plutocracy, not democracy.
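
That kind of claim is easy to check for any token with a public ledger; a sketch of the arithmetic on a made-up snapshot (the balances below are fabricated, not real chain data):

    # What share of supply do the top 1% of wallets hold?
    # Balances are fabricated for illustration, not real chain data.
    balances = sorted([1_000_000, 500_000] + [100] * 198, reverse=True)

    top_n = max(1, len(balances) // 100)          # top 1% of 200 wallets -> 2
    share = sum(balances[:top_n]) / sum(balances)
    print(f"top 1% hold {share:.0%} of supply")   # ~99% in this toy data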

And that's not even mentioning the issue around access to the code (see Blockstream)


Power law distributions are inevitable and necessary in organic scale-free networks. Any change to a scale-free system that would make it no longer obey a power law would also render it useless.

Democracy (a uniform distribution of some resource) only makes sense in narrow contexts. In any financial system, a uniform resource distribution is highly unstable and will stabilize to a power-law distribution in the absence of massive deadweight-loss-inducing distortions. This is completely independent of whether or not some shitcoin uses a scammy premine or some other initial allocation strategy.
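
If you want to see that stabilization in action, a quick simulation sketch (the Pareto shape parameter is arbitrary, picked only because it roughly gives the classic 80/20 split):

    import random

    # Draw 10,000 "wallet balances" from a Pareto distribution and see
    # how concentrated the top 1% ends up - no premine required.
    random.seed(1)
    balances = sorted((random.paretovariate(1.16) for _ in range(10_000)), reverse=True)
    share = sum(balances[:100]) / sum(balances)
    print(f"top 1% hold {share:.0%}")  # typically around half with this shape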


I'm curious what you all think about Nano? No ICO, no fees, no mining.

https://nano.org/


Much of the groundwork for the internet and open software happened in the 1970s and 1980s thanks to influxes of government funding, including at many universities and research organizations.

I advocate for government funding of public-good internet software, because the societal benefit can be so high, and the only alternative so far is monopoly-equivalent companies such as FAANG. In the United States, and in many other countries, there's plenty of precedent for government funding of public libraries, public broadcasting, public transportation, etc.


Yeah, plus those folks actually had problems to solve and things to invent. These days 99.99% of VC projects and the stuff coming out of most tech giants is just throwing obscene amounts of money at creating new problems that no one knew they had, hoping some of it will stick and fund the next round of waste.


You mention FAANG, but don't forget Bell Labs, which was also supported by a giant monopoly at the time - so that supports your theory too.


> Much of the groundwork for the internet and open software happened in the 1970s and 1980s thanks to influxes of government funding, including at many universities and research organizations.

Sure, that’s one part of it. Then it was industry that came in and made it all useful.

A significant portion of open source contributions to Linux come from people working in industry. Most of the advancements to make routing scale, etc. were driven by the likes of Cisco, AT&T, etc.


Nice article, I just posted the link on my Mastodon [1] account, which seems appropriate.

I understand that HN people who work for Internet platform companies like Facebook and Google push back on using open protocols and open distributed systems - never blame people for doing things in their self interest.

For the rest of us, for social, tech, and political reasons, I hope that we can push back at least a little against abuse by the platform companies - that is in our self-interest.

[1] https://mastodon.social/web/accounts/1150176


>> I understand that HN people who work for Internet platform companies like Facebook and Google push back on using open protocols and open distributed systems - never blame people for doing things in their self interest.

Strongly disagree. Live and let live is fine so long as we don't screw each other over. In those cases they are being actively hostile to people's freedom. When they push, push back as hard as you can. Push them down the stairs if that's what it takes to make them see their self-interest is actively hostile to others.

(No, I'm not advocating violence, just taking the term "pushback" to the next metaphorical level. Ugh)


Fight like hell.


> That’s why the pendulum is swinging back to an internet governed by open, community-controlled services. This has only recently become possible, thanks to technologies arising from the blockchain and cryptocurrencies.

I absolutely believe that the pendulum does need to swing back towards community-controlled services, but it's quite a leap to say that we can only do that now, and only with a specific set of technologies that this VC firm happens to be heavily invested in.

What about good old fashioned trust? Producing and following open standards based on systems of collective decision making is how we built the web in the first place. We don't have to have zero trust networks with monetary incentives to rebuild the open web. We just need aligned individuals building towards a common vision of what the web should be.

I think our failure to do this well in the last decade has really been a failure of vision where former leaders in this space have either been too dogmatic to really represent our collective interests or all too willing to adopt antithetical business practices that compromise that vision in the first place.

We need new leaders to represent open innovation and to build alignment towards an internet that can really be a tool for disintermediation. Novel technology won't save the web. Focusing on money will only exacerbate the problems in the long-run. A culture of open innovation is the only thing that will help us realize the abundance of the digital economy. To build that, we need communities that we can all trust will remain aligned to our collective interests.


We desperately need a decent desktop operating system. After that basic need is satisfied, fixing the problems in the upper layers will be easier in comparison.

To develop any kind of software, you need to do so in a desktop operating system, so it's important for it to be reliable and usable.

I think the only viable way to achieve this is to create a Linux-based OS. Not another Linux distribution, but something consistent and well thought out that solves the problems no one in the Linux space wants to solve right now.


I totally agree with you, but let's not forget that we are going through a phase where the open source model is suffering from a lack of incentives.

We need to figure out a good, sustainable economic model or else it might start falling apart.

I really think that society, governments, and civil entities should step up with some formula, because, I mean, this is the 21st century's infrastructure: open source software is now the "roads" that everything passes through.

Governments all around the world worry about their physical infrastructure, so why not start caring about the virtual infrastructure we all rely on - the one a big chunk of the economy, culture and society in general now depends on?

Unfortunately capitalism on its own is failing to present a good solution, unless you think the guy maintaining (idk) some important crypto code while having to rely on ads to survive is a sign that everything is working as it should.

I would love it if we had some sort of "trickle-down tax", where we use dependency resolution and some algorithm to rate the importance of a particular dependency, and make the software and services that use it and get paid pass some of the profit along to the open source projects underneath (here I think a blockchain solution might be of some help, as things could be totally automated).


Ubuntu will do just fine.


What problems are those?


1. We need a clear separation between system programs and application programs, so that a user can install the latest version of Python without it ever colliding with the outdated version of Python that the OS uses internally. The user shouldn't even need to be aware of the existence of the other Python. (There's a sketch of today's userland workaround after this list.)

2. The default way to install third-party application software shouldn't depend on repositories or servers. It should be based on installing files, just like on Windows. The OS should define a stable format for the OS to install, update and uninstall these files. Additionally, a permission system like Android's could be a good idea.

3. We need a desktop environment that's visually appealing and has a good UX.

4. We need centralized, thorough and official user documentation about the OS, so that the user doesn't need to browse random forums and wikis in order to learn how to use some piece of the OS. Note that this documentation is about the OS, not about third party software. And this documentation should be distinct from the developer's documentation.

5. Consistency. The user shouldn't need to learn 10 different config file formats or inconsistent command flags across the system in order to configure it.
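
On point 1, the closest thing we have today is per-application isolation in userland; a minimal sketch with Python's own standard library (the environment name is arbitrary), even though it isn't the OS-level separation I'm asking for:

    import venv

    # Give the application its own interpreter environment and
    # site-packages, leaving the Python the OS uses internally untouched.
    venv.create("myapp-env", with_pip=True)
    # Then run the app with: myapp-env/bin/python (Scripts\python.exe on Windows)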


> so that a user can install the latest version of Python without it ever colliding with the outdated version of Python that the OS uses internally

Consider checking out NixOS. Although this is already possible to do in normal distros.

> The OS should define a stable format for the OS to install, update and uninstall these files

Like .deb and .rpm?

> 3. We need a desktop environment that's visually appealing and has a good UX.

Such as KDE and the rest?

> We need centralized, thorough and official user documentation about the OS

Distros have their own documentation already.

> And this documentation should be distinct from the developer's documentation

Why?

> The user shouldn't need to learn 10 different config file formats

You will love NixOS then.


> > The OS should define a stable format for the OS to install, update and uninstall these files

> Like .deb and .rpm?

It's ironic that the common complaint about these two formats (especially .deb [0]!) is that they are too stable and rigid...

[0] https://lwn.net/Articles/789449/


> > 3. We need a desktop environment that's visually appealing and has a good UX.

> Such as KDE and the rest?

Just personal taste, but I don't find KDE visually appealing, and not only the default theme ("Breeze"). Unfortunately, I can't get used to the usability issues on GNOME.

IMO, the sweet spot of visually appealing and good UX is... Windows 10 (sigh), followed by a close second in Cinnamon (and I'm also a fan of IceWM).


Sounds like you just want... Windows? But an open source version?


We already have ReactOS for that.


> We need a clear separation between system programs and application programs ... The default way to install third-party application software shouldn't depend on repositories or servers. It should be based on installing files, just like on Windows.

So the end goal is for a system where users download random binaries off the net, which never get any security patches and leave dozens of copies of vulnerable dependencies lying around?

> Additionally, a permission system like Android's could be a good idea.

Even with a permission system like Android's, how are you going to stop malicious applications from exfiltrating all the files in your home directory, or recording all your keystrokes, or spying on your clipboard, or putting up a fake password prompt, or mining bitcoin with your GPU, or just escaping whatever sandbox you think you're running them in?

If those are the risks I have to put up with in order to run a bleeding-edge Python interpreter on my system, then I think I'll pass. I guess that makes me an old_unixer.


> So the end goal is for a system where users download random binaries off the net

They download the binaries they trust off the net.

You're insinuating that some users are too stupid to know who to trust and go around downloading random binaries. If there are such users, then I guess my OS is not geared towards them, but rather to software developers, creative professionals, etc. who know what they're doing and need to get their application software directly from the developer.

People who don't know who to trust or want their OS developers to keep repositories for them can keep using Apple devices with their walled gardens or repository-centric Linux distros.

> which never get any security patches and leave dozens of copies of vulnerable dependencies lying around?

Do Windows applications never get any security patches?

And yes, sandboxing is not an infallible security mechanism, but it could help, and it also has other practical advantages not related to security.
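
For what it's worth, the usual mitigation when you do get software straight from a developer is to verify a published checksum or signature; a minimal sketch of the checksum half (the filename and digest are placeholders):

    import hashlib

    # Verify a downloaded installer against the digest the developer
    # publishes before running it. Filename and digest are placeholders.
    expected = "..."  # SHA-256 hex digest from the developer's site
    with open("some-installer.bin", "rb") as f:
        actual = hashlib.sha256(f.read()).hexdigest()
    assert actual == expected, "checksum mismatch - do not run this file"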


> 3. We need a desktop environment that's visually appealing and has a good UX.

Is this where the Gnome vs KDE battle starts? Gnome looks great and is silky with Wayland but KDE does let you twiddle configs...

You'll never get alignment on what this means across a big enough group to be meaningful, but I'm happy to pay towards something that does have a focus.


> 2. The default way to install third-party application software shouldn't depend on repositories or servers. It should be based on installing files, just like on Windows.

Huh? This is already true.


So... Windows 10 I feel fits those criteria?


Not having a keyboard-like pop-up app that can work inside all other apps that's half speech-to-text processor, half emoji picker, like you have on mobile except having it on desktop.

And a way to run all Windows software flawlessly.

And making KDE the default DE because it's the best one.

And Wayland feature parity.

And another packaging format that's like Flatpak and AppImage, except the best of both combined into one, that everyone uses from now on instead of any of the others.

And something about how good Macbook trackpads are.


- "As long as technology has a global reach, someone will have the world in the palm of his hand. If not Bob Page, then Everett, Dowd..." -Tracer Tong from Deus Ex (2000)



Surprising to me that they would repost it on their blog so much later.


“There has been a lot of talk in the past few years about blockchains, which are heavily hyped but poorly understood. Blockchains are networks of physical computers that work together in concert to form a single virtual computer.”

I mean, how could anyone possibly be confused, when you yourself are actively fudging the meaning in the very next sentence? The rest of the paragraph is worse. Embarrassingly bad, but par for the course from a16z.


It's cringeworthy.


Always funny to see these seemingly thought-provoking articles from VCs and startup people, as if people haven't already been talking about these same ideas for years.

Congrats, a12345z, you finally realized it.


This is an interesting idea, but can people help me by posting examples of the type of crypto-contract-underpinned service that a16z is talking about?


Cloudflare.


This deserves more visibility. Cloudflare is one of the parties with immense power.


Answer: The user.


Whoever controls the Internet accumulates enormous power. This is evident in things such as the enormous salaries of software engineers, social media's influence on public opinion, and Trump being able to stir his devotees to attack the US Capitol. I like the optimism of this piece - decentralized tech can revolutionize the Internet's cloud-based economy - but I would caution against falling into the same hubris that technologists felt in the early 2000s about Big Tech. FAANG does not want to give up its immense power over the Internet, and there will always be somebody looking for ways to co-opt it if we move to a decentralized Internet economy.


What are the benefits for us?


Are investors usually this clueless about how things work?


Not all of them. But quite a few will invest in areas where they have no expertise. They usually have the smarts not to start writing about those subjects as though they suddenly do have that expertise.



