
Cities spent millions on gigabit networks. No one is sure what they're good for - jseliger
http://www.vox.com/new-money/2016/10/17/13230500/gigabit-networks-chattanooga-google
======
snewk
Chattanoogan here.

Our fiber network has been a huge asset for the local economy.

It helped give birth to our own little tech startup ecosystem.

It made way for the build out of city wide wifi.

It's run by our local power company, EPB, which remains committed to net
neutrality and refuses to enact data caps.

It's forced ATT/Comcast to offer gigabit+ services to remain relevant.

I could go on and on.

Yes, it's true that no 'killer gigabit app' has emerged, but give it time.
IoT, decentralized services, etc. are all still in their infancy. It's too
early to dismiss the internet of the future as a boondoggle because 'it
doesn't make money' yet.

~~~
msandford
Seems like you've listed a bunch of things that together might well be
considered a killer app. Sure, it's not an app you put on your phone and
print money with just because it has gigabit access. But quantitative
differences eventually become qualitative ones.

------
martinald
I've had gigabit internet (symmetrical) in my apartment in London for the
past ~3 years.

Some things I've found:

Even 802.11ac WiFi is really lacking. Despite a link speed of 900 Mbit, it
really struggles to push more than 200 Mbit/sec. On 802.11n it was more like
90 Mbit/sec.

Torrents are often IO (or even CPU) bound at high speed. I use a Transcend
drive on my MBP for extra space, and that seems to max out at 100 Mbit/sec
(yes, bits) with the very random read/write pattern of BitTorrent.

You end up not noticing internet faster than 100/100. I have 0.3ms pings to
the local Cloudflare CDN, so latency is not a problem. You do notice slower
than that, though, especially low-upload connections: you get so used to git
push taking no time at all that you forget how slow it can feel on a slow
upstream.

While it's nice to have gigabit internet, I would rather have 100/100 with
low latency (e.g. G.fast) than 300/30 with variable latency (e.g. DOCSIS 3).

The highest non-4K bitrate I've seen (I don't have a 4K TV) is Netflix at
around 8 Mbit/sec, which looks great.

I honestly do think BT in the UK is making the right call in choosing a
faster/larger deployment of FTTC and G.fast over FTTH. Gigabit really isn't
that useful, and I can't see this changing for the next few years.

------
zokier
Fiber in the ground is a long-term investment. The old copper might have been
there for decades, and it is pretty much a dead end as far as the tech goes.
Sure, gigabit now seems overkill for the average consumer, but how about in
ten years' time? How about twenty?

Of course, those decades-old copper networks do not just magically maintain
themselves in perfect order. I believe that in many fiber deployments the
local network was already due for some sort of major operation, due to
general flakiness, low speed, or poor availability. Fixing that is expensive
no matter how you do it, and it really wouldn't make sense to lay down new
copper these days if you can avoid it.

There is one major consumer application for gigabit-scale internet: self-
hosting. Gigabit internet is one key factor in enabling people to move away
from massive clouds to personal services. Combine this with common NAS boxes,
devices whose popularity is already being fueled by light SSD-powered
laptops, and it is not that far-fetched to see something like Sandstorm
really take off.

~~~
wmf
I think the point of the article is questioning when to make that investment.
If you're Verizon and Sandy destroyed your copper, sure, go ahead and replace
it with fiber. If you're a cable company with a network that currently works
fine, fiber mania is a waste of money today. There's some benefit in delaying
since the routers needed to light that fiber will be somewhat cheaper in the
future.

Also, Sandstorm runs much better in a datacenter than in your home.

------
t0mbstone
The problem with people not knowing how to utilize gigabit networks is that
it's sort of a "cart before the horse" type of problem.

Until enough people actually HAVE symmetrical (both up and down) gigabit
speeds, you will be limited by the transmission speed of whichever provider
you are downloading from.

A lot of web servers are sitting in racks in data centers, and you will only
ever be able to download a file from one at the speed that server is able to
send. Assuming the web server has a single gigabit connection and ten other
people are downloading from it at the same time, you might only be getting
50-100 megabits, tops.
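The arithmetic behind that estimate is worth making explicit. A rough sketch
(function name and numbers are mine, for illustration; real servers also lose
throughput to TCP overhead and unequal flow rates):

```python
# Rough per-client throughput when N clients share one server uplink.
# Ignores protocol overhead and congestion (simplifying assumption).

def per_client_mbps(server_uplink_mbps, n_clients, client_downlink_mbps):
    # Each client gets at most a fair share of the server's uplink,
    # capped by its own downlink speed.
    fair_share = server_uplink_mbps / n_clients
    return min(fair_share, client_downlink_mbps)

# One gigabit server uplink, 10 simultaneous downloaders, gigabit clients:
print(per_client_mbps(1000, 10, 1000))  # 100.0 -- the "50-100 megabits tops"
```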

Where gigabit networks for consumers would REALLY start to shine is when
thousands of people are on gigabit, and are all doing peer-to-peer
transactions, or hosting their own servers.

Imagine if there were 100,000 people with decent computers (at least a
terabyte of drive space apiece), with every computer connected via fully bi-
directional gigabit speed connections.

Now imagine if you were to build a fully encrypted peer to peer file storage
cloud network on top of that, with each client having their own key (which
only they had access to). Encrypted data would be stored in pieces across the
cloud, with redundancy and parity correction stored across nodes.

If you built something like that properly, you would have a giant, disaster
resistant cloud drive, where you could store tons of private data and access
it from _anywhere_ at gigabit speeds. If you had a large file that was striped
in redundancy across a thousand nodes, you could even achieve terabit speeds
by downloading the pieces in parallel and assembling them on your end.
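The striping-with-redundancy idea above can be sketched in a few lines. This
is a toy illustration with made-up function names, using simple XOR parity so
any one lost node can be rebuilt; a real system would use proper erasure
codes (e.g. Reed-Solomon) and encrypt the data before striping it:

```python
# Toy sketch: stripe a blob across n nodes plus one XOR parity chunk,
# then reconstruct a lost chunk from the survivors.
from functools import reduce

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def stripe(data: bytes, n_nodes: int):
    # Pad so the data splits evenly, then cut into n_nodes equal chunks.
    chunk = -(-len(data) // n_nodes)  # ceiling division
    padded = data.ljust(chunk * n_nodes, b"\0")
    chunks = [padded[i * chunk:(i + 1) * chunk] for i in range(n_nodes)]
    # Parity chunk: XOR of all data chunks, stored on an extra node.
    parity = reduce(xor_bytes, chunks)
    return chunks, parity

def recover(chunks, parity, lost_index):
    # XOR the parity with every surviving chunk to rebuild the lost one.
    survivors = [c for i, c in enumerate(chunks) if i != lost_index]
    return reduce(xor_bytes, survivors, parity)

chunks, parity = stripe(b"private data striped across the swarm", 4)
assert recover(chunks, parity, 2) == chunks[2]  # node 2 went offline
```

Downloading the surviving chunks from different peers in parallel is what
would give the aggregate speeds described above.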

You could even tie a pricing model into it (perhaps powered by a blockchain
methodology), where you had to pay monthly to store your data in the peer to
peer cloud, and people who contributed storage and machine resources would
receive payments in return for their contribution.

You could even take it a step FURTHER and add a cloud processing layer on top
of it all, where people could dedicate a couple of their CPU cores for usage
by the cloud (when their machine wasn't in use), and they would be
compensated.

If you had that, you would have the ability to not only store data in the
cloud, but also process the data (and even do things like run distributed web
servers).

The possibilities are endless!

But it will never happen until it makes sense, and for it to make sense, there
needs to be a critical mass of people who actually HAVE symmetrical gigabit
connections.

~~~
wmf
Between Europe, Japan, and South Korea, doesn't that critical mass already
exist? Where are the apps?

------
upofadown
Gigabit is backward compatible with any slower speed; the unused bandwidth is
not somehow wasted. The current copper network is going to have to be
replaced anyway. It doesn't make a lot of sense to run fibre past buildings
just to connect more and more VDSL2 and DOCSIS 3.1 boxes sitting on the
boulevard in a desperate attempt to squeeze another decade of life out of
shorter and shorter lengths of existing copper.

------
TonyCoffman
The killer app is fiber reliability as compared to copper and coax where line
errors are the norm, particularly every time it rains and water gets into a
compromised circuit.

The speed is just a bonus.

~~~
astrodust
The speed isn't just a bonus, it means you can _do things_ that weren't
possible before, or were painfully slow.

Just as dial-up held us back and the benefits of broadband weren't realized
until we finally perfected things like streaming audio and video, the impact
of gigabit networks will take time, and it may come in an unexpected form.

The more immediate effects will be that distributed computing becomes no big
deal, and accessing your files from the cloud or from a home device while
remote becomes frictionless. Instead of necessarily lugging around a laptop
you might travel lighter, confident that you can get access to what you need
anyway.

It also makes apps like Dropcam possible where you stream endless hours of
video to a remote server on the off chance you might need it. The cost of
maintaining multiple streams becomes so low you don't even worry about it. No
longer do you need to fret over the equivalent of popping a breaker when
trying to watch Netflix as well.

The funny thing about these applications is they don't seem like a big deal
when you have them, but when you suddenly lose them it's a huge problem. This
is much the same way we take electricity and running water for granted, never
thinking much of it, but when it cuts out we're in trouble.

------
acchow
The empty promises of fibre-to-the-premises feel a bit like the internet
analog of "trickle down".

------
norea-armozel
I think the killer app for symmetrical gigabit internet service will be the
ability to host your own apps (from social networking to video streaming),
either on the router or on another appliance. It's just hard to see how that
will ever become common practice until someone builds appliances that play
nicely with others beyond their own manufacturer, because the natural
inclination of any corporation making such appliances is to lock them in. So
it'll be interesting to see if that pattern keeps up or softens.

------
Apreche
The main problem with gigabit is that the person on the other end of the line
doesn't have gigabit. If someone with a 10 Mbps upload sends me a file, it's
still coming in at 10 Mbps, even if I've got a terabit connection.
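Put another way, end-to-end throughput is bounded by the slowest hop on the
path, so a fat downlink only helps up to that bound (a sketch with invented
numbers):

```python
# End-to-end throughput is the minimum capacity along the path (in Mbps).
# Hypothetical path: sender's uplink -> backbone -> my downlink.
path_mbps = [10, 100_000, 1_000_000]  # their 10 up, core, my "terabit"
print(min(path_mbps))  # 10 -- the sender's upload is the bottleneck
```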

Likewise, you can't build an application that requires such tremendous
bandwidth unless you know there are enough customers out there who have it. A
few small towns here and there aren't going to cut it.

Light up the wires in New York City, and then you'll see some action.

~~~
zokier
> Likewise, you can't build an application that requires such tremendous
> bandwidth unless you know there are enough customers out there who have it.
> A few small towns here and there aren't going to cut it.

From the article

> about 14 percent of Americans have access to gigabit speeds today.

That is quite a bit more than "a few small towns here and there", and frankly
nearly 50M people (just in the US; globally much more) should be more than
enough to build a user base.

~~~
GauntletWizard
Those numbers are probably vastly inflated (the FCC defines access only by
ZIP code, and for many years giving just a few homes in a ZIP code access to
broadband was enough to count the whole code, gaming the numbers). And
"access" is not the same as "having" for many people, especially as services
that take advantage of gigabit have not yet become available: a
chicken-and-egg problem.

There's an interesting conundrum: I want my parents to have instant access to
the photos and videos I produce, and for a while it looked like fiber
networks were going to be the way. I'd host a home file server, and they'd be
able to stream whatever they wanted from there. But my home internet speed
has actually gone down over the years: I was on a 100 Mbit connection, and
now the plan is "50/50" but rarely achieves more than 10 up.

The cloud has become the distribution network instead, and it makes a lot of
sense in many ways: it also works as a handy backup mechanism, and I trust
"cloud" services to keep good copies of my data more than I trust the second
hard drive I used to keep around.

------
stuaxo
I don't know; if you are going to lay the cable, surely gigabit makes more
sense than, say, 100 Mbit.

~~~
wmf
Yes, and this is why telcos are interested in fiber but cable companies are
not. DSL is limited to ~48 Mbps, yet in many cities the standard cable plan
is 100 Mbps and 300 Mbps plans are available.

------
rootedbox
Almost sounds like this story I heard once about this guy selling monorails to
cities.

