
Fed up with slow and pricey Internet, cities start demanding gigabit fiber - RougeFemme
http://arstechnica.com/business/2013/11/fed-up-with-slow-and-pricey-internet-cities-start-demanding-gigabit-fiber/
======
Amadou
It never hurts to revisit this article:

The $200 Billion Rip-Off: Our broadband future was stolen

[http://www.pbs.org/cringely/pulpit/2007/pulpit_20070810_0026...](http://www.pbs.org/cringely/pulpit/2007/pulpit_20070810_002683.html)

------
AdamFernandez
I'm all for the market in most instances, but I really hope Internet access
starts to be treated like a utility in the United States. It is approaching
the necessity of electricity a century ago (or water, gas, roadways, etc.).
The market had zero incentive to invest so heavily in these utilities for
such a drawn-out return before the government financed the basic
infrastructure. I know the analogy isn't perfect for a few reasons, but
imagine where we would be if all these other necessities were privatized.

~~~
praxeologist
You are off your rocker if you think this is a market problem. Where I live
now and where I lived previously (downtown Baltimore) there is some sort of
regulation that keeps Verizon/FIOS from being able to provide service so the
ONLY option is Comcast besides some shit 3g/Clear service nobody should use.

Just because the internet _is important_ doesn't mean the same laws of
economics don't apply to it and it will somehow be better provided by
government.

~~~
groby_b
"some sort of regulation" is entirely devoid of information. If there indeed
is a regulatory hurdle, what _precisely_ is it? Nobody that I can find has
ever pointed out any regulatory hurdles for Baltimore FIOS, so if you make
that claim, it would be nice to actually back it up.

~~~
praxeologist
[http://progressivemaryland.org/public/documents/2010/utility...](http://progressivemaryland.org/public/documents/2010/utilityreg/2010-1fios-whereisbmore-form.pdf)

>In the state of Maryland, Verizon currently offers FiOS service in Anne
Arundel, Baltimore, Charles, Harford, Howard, Montgomery and Prince George’s
counties. Thus, the only major jurisdiction in the state that is not yet wired
for FiOS is Baltimore City. Verizon recently told members of the City Council
that they have no plans to apply for a franchise agreement with the Baltimore
City Office of Cable and Communications.

I'm not sure on all the ins and outs of it. Feel free to dig more yourself,
but basically Verizon needs some sort of permit to lay cable, and the city
council or this office has set a price for it that makes Verizon see the
service as unprofitable to provide there, unlike in most other areas.

~~~
groby_b
I have no issue believing that there are occasionally issues with permits, and
that they might sometimes prevent neighborhoods from getting a certain kind of
service.

But given that Verizon is not rolling out any more FIOS nation-wide[1], I
really have trouble believing it's a regulatory issue in this case. And they
slowed down way before 2010[2]. Verizon gave up on FIOS, plain and simple.
Everywhere.

[1] [http://www.dslreports.com/shownews/Verizon-Again-Confirms-Fi...](http://www.dslreports.com/shownews/Verizon-Again-Confirms-FiOS-Expansion-is-Over-118949)

[2] [http://www.huffingtonpost.com/bruce-kushnick/the-great-veriz...](http://www.huffingtonpost.com/bruce-kushnick/the-great-verizon-fios-ripoff_b_1529287.html)

------
Amadou
_" The old model was 'everybody has to have service,' which is where cable and
telephone came from," Smith said. "This is a model that says, 'we can be
patient while demand builds.' We'd like to see some of our most disadvantaged
served, but we're not starting out with 'everybody must get service
immediately.'"_

My understanding is that Chattanooga's gigabit service was rolled out to
everyone within the service area as quickly as funding allowed, and that they
are the only city in the US with such complete gigabit coverage.

I've read that they are now selling consulting services to other communities
looking to do gigabit rollouts. I'd like to know what their take is on that
philosophy since it seems to be the opposite of what they did and they seem to
be significantly exceeding revenue targets.

[http://www.chattanoogan.com/2013/9/17/259342/EPB-Increasing-...](http://www.chattanoogan.com/2013/9/17/259342/EPB-Increasing-Fiber-Optic-Speeds.aspx)

~~~
crymer11
From what I recall, EPB initially planned for the fiber network to be used for
their smart grid, which encompasses all of their service area, and then after
that was built, they realized they had the capacity to run an ISP. They had
the benefit of much of the infrastructure needed to deploy the fiber network
already in place and not having to deal with access rights.

------
ihsw
Maybe the municipalities should start asking Google why they were rejected.
I'm extremely skeptical that cities are opening up to competitive fiber
offerings -- a lot of it is political wrangling that most aren't even willing
to deal with.

Cut the red tape and they will come. Fuck the exclusivity wheeling and
dealing.

~~~
subway
My understanding of the Louisville RFI is that it is to be an open access
network. The prospect of a local fiber plant providing last mile connectivity
between various ISPs and customers excites me greatly.

------
blisterpeanuts
Once gigabit is in perhaps 25% of homes and businesses, everyone else will
want it, too, and a tipping point will have been reached.

I'm guessing it will take that large a market for the really cool apps to
become practical and well known: remote desktops and thin clients that serve
up high def TV/movies and real-time high quality video telephone (finally).
This will be when television and computers finally converge and we will stop
thinking of them as separate and different products.

I have to admire Google. They not only envisioned this, but (as usual) very
smartly went about implementing a demo. The "fiberhood" approach makes sense;
it's the right way to roll out the service (or any non-essential service,
really). Make money on neighborhood #1, and that pays for the rollout to
neighborhood #2, and so forth.

The whole "either everyone gets it or no one gets it" approach goes out the
window, and suddenly, every town wants it. Certainly it will have to be
provided to "under-served" areas but the important thing is for there to be a
solid business model, preferably with competing providers, else all the
problems of a utility-quasi-monopoly come into play: indifferent customer
service and support, uneven service quality, etc.

I can't wait until this comes to my town, just outside of Boston, but for
various reasons I suspect Mass. will be among the last places to get it.
Anyway--congratulations to those who have it, and just _don't rub it in_!!

------
beachstartup
here are some data points:

we're located in santa monica / west LA, in the middle of town, and until
last year our building had the following options for bandwidth:

* adsl ... $40/month for 3/.25 megabit

* EoC (sdsl) ... $1500/month for 15/15 megabit

* Bonded T1 / T3 ... insanely expensive.

Time warner business class moved in with fiber to the building and now we
have:

* cable modem over coax muxed into fiber switch @ roughly $300/month for consumer-level speeds (30/5 megabit). which is what we use right now.

* Fiber to fiber (ethernet switch on customer premises):

10/10 megabit starting @ $1300/month (worse than EoC above)

40/40 megabit $2400/month

1gbps/1gbps for $14,000/month (yes, $14 THOUSAND/month).

we pay $400/month at the datacenter for each of our 1gbit/1gbit links w/ a 100
megabit commitment.

~~~
aaron42net
If you are in Santa Monica City proper, Santa Monica has a not-well-known
fiber network: [http://gismap.santa-
monica.org/GISMaps/pdf/smFiberOpticsZoni...](http://gismap.santa-
monica.org/GISMaps/pdf/smFiberOpticsZoning.pdf)

TrueCar uses the city's fiber network to connect its buildings with dark
fiber, get a gigabit cross-connect to its colo in downtown LA, and get 500
megabits of nLayer bandwidth to the offices. The service has been fantastic
and is cheaper by far than anything else we found.

See
[http://www.smgov.net/departments/isd/smcitynet.aspx](http://www.smgov.net/departments/isd/smcitynet.aspx)
for contact info.

~~~
beachstartup
unfortunately, we are literally on the other side of the city limit
(centinela).

------
Hydraulix989
When I was in Seattle, I had CondoInternet fiber which offered 100 mbps for
$60/mo and gigabit for $120/mo.

Then I moved to Santa Clara, and my options are Comcast (currently paying
$50/mo for 50 mbps and actually receiving 10 mbps) or 24 mbps AT&T Uverse for
$60/mo.

Why does Silicon Valley, of all places, have such slow and anachronistic
Internet options?

------
breckinloggins
As an Aggie and former Bryan / College Station resident (of 15 years), it's
fantastic to see them taking a lead on this.

BCS has always been pretty good for high speed internet (I had 2Mbps cable in
1999), but the price disparity between residential and business is insane, and
speed increases have stalled somewhat in recent years. Suddenlink is pretty
good, but that area has been the target of rather frequent cable company
acquisitions and mergers (TCA -> Cox -> Suddenlink just while I was there), so
it's not a foregone conclusion that things will stay "ok".

I love living in SF, but I have to admit I kind of like the idea that my
"little Texas town" will probably get gigabit and a sensible fiber
infrastructure long before SF or most parts of Silicon Valley will.

------
snarfy
The electric grid was built on subsidized money and is regulated as a natural
monopoly. A large part of that grid is made up of utility poles.

If you want to be a cable provider or ISP, the easiest way is to put your
lines on those utility poles. They are already carrying power lines to
people's homes any way. Run some cables and you can avoid the extremely
expensive proposition of building secondary infrastructure to support more
cables running to houses.

When the local municipalities set up agreements with the cable providers, a
part of that agreement states nobody else can be on the utility pole. This is
why there is a monopoly.

If cities are really fed up with pricey internet, they need to re-evaluate
these agreements they've made with all the incumbents.

------
VladRussian2
having had the same DSL connection for the past several years, I've seen the
price gradually rise from $26 to $40 per month. I'd say something is really
wrong with a technology sector if it has such negative price/performance
dynamics.

------
jsmeaton
And in Australia we were getting gigabit (rollout was happening), then a new
government came to power and said we don't need it, and that copper is fine.

[http://howfastisthenbn.com.au/](http://howfastisthenbn.com.au/)

------
wmeredith
I would love me some gigabit. I'm in Independence, MO, just east of Kansas
City, MO. I just switched to Comcast (puke) for $50/month for 50mbps. I was
using ATT Uverse where I was paying $60/month for 16mbps (and actually getting
2mbps, hence the switch). Meanwhile, 10 miles West in KC they're paying Google
$120 for gigabit + cable. I wish I could get in on that.

------
azatris
Is it really all about the speed race? Why don't I ever hear about quality of
service, smaller latencies, etc? Is it because it is less marketable? I am
sure ISPs can ultimately do something about it.

If I am naïve, then please guide me.

------
salient
I hope this won't turn into something like:

"We offer Gigabit fiber*. Come and get it!"

Footnote: Speeds may get no higher than 200 Mbps most of the time.

If these investments are going to last for another decade, they need to offer
at least 1 Gbps to everyone.

------
massysett
Genuine question: why the obsession with gigabit? What applications demand it?

~~~
MaulingMonkey
Game development.

Setting up a new workspace generally is a "go out to lunch and it'll hopefully
be mostly done with the perforce checkout by the time you're back" affair on
our internal LAN, nevermind if you wanted to work from home. Downloading the
built assets through our internal tools takes a similar length of time, so you
hopefully can wait until the end of the day to kick that off as you head out.

Various SDKs are delivered by download - my last download was 15GB,
compressed. These often require frequent updates, which we save to local
shares as a simple matter of expediency when shared widely.

Delivering complete, self-contained milestones in specific formats causes
12-50GB .zips to be uploaded via FTP for hours... with the semi-frequent
resort to sneakernet due to quota limitations, service issues, or just plain
not having hours between when development for a build was stopped and when the
build was needed someplace for a demo or whatever else.

I have an archive of some of those deliveries on my local machine. Even with a
good 30%+ of my archives missing due to lack of drive space, the archives for
the single project I have archives for total over a terabyte. They're useful
for reproducing bugs exactly on occasion. Between all the other projects,
other employees, and monstrous shared resources like complete Perforce
history, build machines and other servers... a complete backup would probably
be somewhere around the 0.1 _peta_ bytes mark... or close to a full petabit of
data. And we're considered only a "small" business, under 50 employees!

Fortunately for us, backups are fairly static, and the nightly rsync to
offsite storage, which brings all our services to a grinding halt, is
_usually_ done by the time work starts the next day. But it's sufficiently
slow that those are our secondary backups, with primaries and restoration
being done through sneakernet as the rule, not the exception, despite the
"beefy" pipe. A second
internet backup would push backup times into our workday, and if we're ever
forced to do a major restoration through the internet, it's going to be
extremely painful... days of waiting with employees unable to do their jobs.
Expensive!
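
The "days of waiting" above is easy to sanity-check with back-of-envelope arithmetic. A minimal sketch in Python (the ~0.1 PB figure comes from the comment; the 80% usable-throughput factor is an assumption for illustration):

```python
# Rough transfer-time estimate: restoring ~0.1 PB over the internet.
SIZE_BITS = 0.1e15 * 8  # 0.1 petabytes expressed in bits

def transfer_days(link_mbps: float, efficiency: float = 0.8) -> float:
    """Days needed to move SIZE_BITS at the given link speed,
    assuming only ~80% of the nominal rate is usable."""
    usable_bps = link_mbps * 1e6 * efficiency
    return SIZE_BITS / usable_bps / 86400  # 86400 seconds per day

for mbps in (100, 1000, 10000):
    print(f"{mbps:>6} Mbit/s: {transfer_days(mbps):7.1f} days")
```

Even a dedicated gigabit link needs more than ten days for a restore of that size, which squares with the comment's point that a full internet restoration would idle employees for days.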

~~~
chubot
Wow, I'm surprised game development is still like that. I haven't done game
development in almost 9 years, but the "Perforce saturating the LAN" problem
sounds very familiar!

So I'm actually working on a (binary) software distribution system, and I
think it would work very well for this use case. This problem can be solved
with software -- it doesn't require an upgrade to fiber.

The (straightforward) idea is to use content-addressed storage and
differential compression, pretty much like Git does. Then you can actually
sync data from your neighbor's machine, in a BitTorrent-ish fashion (possibly
getting tiny amounts of metadata from a central server for consistency).
Perforce definitely has problems with spurious lock contention, and just a few
team members syncing from the same machine can easily clog its pipes if you're
not careful.
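
The content-addressed idea described above can be sketched as a toy in Python (chunk size, the dict-backed stores, and the sync logic are all illustrative placeholders, not any real system's API):

```python
import hashlib

# Toy content-addressed store: data is split into chunks, each chunk is
# keyed by its SHA-256 digest, and a "sync" transfers only chunks the
# receiving store doesn't already have -- the Git/BitTorrent-ish scheme.
CHUNK = 4  # tiny chunk size so the example is easy to follow

def chunk_ids(data: bytes, store: dict) -> list:
    """Store each chunk under its digest; return the ordered digest list."""
    ids = []
    for i in range(0, len(data), CHUNK):
        piece = data[i:i + CHUNK]
        digest = hashlib.sha256(piece).hexdigest()
        store[digest] = piece  # idempotent: identical chunks dedupe
        ids.append(digest)
    return ids

def sync(ids: list, src: dict, dst: dict) -> int:
    """Copy only missing chunks from src to dst; return bytes transferred."""
    sent = 0
    for digest in ids:
        if digest not in dst:
            dst[digest] = src[digest]
            sent += len(src[digest])
    return sent

server, client = {}, {}
v1 = chunk_ids(b"AAAABBBBCCCC", server)
sync(v1, server, client)                 # first sync moves all 12 bytes
v2 = chunk_ids(b"AAAABBBBDDDD", server)  # one chunk changed
delta = sync(v2, server, client)         # only the new chunk moves
print(delta)  # 4
```

The same lookup-by-digest works whether the source is a central server or a neighbor's machine, which is what makes the peer-to-peer variant a small extension rather than a redesign.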

I'm not up to date, but I'm pretty sure you can do better than Perforce's
delta compression. You can probably do it with a single generic algorithm, but
one way to really improve would be to use file-type specific compression, e.g.
bsdiff or Chrome's Courgette for executables, and use other heuristics for
game data, raw audio, video, etc. You actually want to avoid single-file
compression in the repository so you can take advantage of the differential
compression.

I can understand that the dynamics of game development teams means that this
will never get written in house (although perhaps in the years since I've
left, people started placing more emphasis on tools).

But for all the entrepreneurs out there, I wouldn't dismiss the market of
selling tools to game developers. Perforce is a tiny company that I know made
an absolute killing in that market, and I'm somewhat surprised that after a
decade they're still the state of the art.

~~~
MaulingMonkey
> Wow I'm surprised game development is still like that.

Some of the problems are fundamental. We're forced to use certain compressed
and signed packaging schemes which aren't sanely delta-compressible. We
distribute in self-contained blobs because not everyone has older versions
lying around, or will not have a worthwhile pipe when they get to where they
need to install things and will further sneakernet on their end. Even if we
did assume sane deltas are possible, simple information density puts a lower
bound on patch size based on how much content changed, and even that
theoretical lower bound can frequently be too much and take too long for
relatively "beefy" pipes. Heck, even with fiber, people will still be saving
these things out to USB drives for good reason.

> Then you can actually sync data from your neighbor's machine, in a
> BitTorrent-ish fashion (possibly getting tiny amounts of metadata from a
> central server for consistency). Perforce definitely has problems with
> spurious lock contention, and just a few team members syncing from the same
> machine can easily clog its pipes if you're not careful.

That helps if the bottleneck is the server. Our perforce machines are beefy
enough that our bottlenecks are frequently the client and the LAN pipe,
neither of which will be helped by BitTorrent style networking. Work from home
would be 100% pipe bottlenecked, as the work pipe is able to saturate it quite
easily. FTP is still bottlenecked by either the developer's or the publisher's
pipe, and wouldn't be helped by p2p either.

> I'm not up to date, but I'm pretty sure you can do better than Perforce's
> delta compression.

There's always room for incremental improvements... but not enough to beat out
fiber.

> You actually want to avoid single-file compression in the repository so you
> can take advantage of the differential compression.

Not always possible due to format requirements. Windows 8 .appx packages, for
example, are basically self-contained signed .zips containing an application
and all its resources. Want to provide three builds? You just compressed and
signed three different entire copies of all your built resources. For bonus
points, those were zipped _again_. Insanity! But software cannot magically
solve the social problems that let such designs reach the marketplace.

And since things break unless you use the exact "correct" certificate for
signing, any script that generated the .appx from one set of resources would
require distributing the actual private key, which is a non-starter...

------
HeyLaughingBoy
Gigabit! I only dream of that. I'm happy that my ISP claims I can get 30 Mbps
here 10 miles outside town.

------
ffrryuu
Much like basic wage, gigabit fiber is a universal human right.

