This move might be good for Google, but I find it somewhat alarming as a consumer. Those who control or curate content (Google, NBC/Comcast, Facebook) should not control infrastructure. This was less of a concern when Google was acting as a conduit of information, but as Google moves to treat personal (if aggregate) data as property and profits from user-generated content on YouTube and Google+, the lines become significantly blurred.
There may be nothing wrong intrinsically with controlling content and infrastructure, but it seems to be bad for consumers generally, as exhibited by Comcast[0]. And while a high-speed competitor to local cable monopolies is exciting, I worry about trading one dictator for another just because there's some competition in the transition period from the old guard to the new.
Is there some safeguard that Kansas City and the other fiber recipients have arranged to prevent Google from having the power to exert cable-esque control? Not rhetorical; genuinely curious.
EDIT: I'd like to make it clear that I'm not anti-Google: I don't think they're evil, abuse consumer data, or are actively trying to become the next Comcast. I am, however, expressing concern over the position of power they will find themselves in if Fiber takes off. So far I've seen a lot of comments suggesting Google is not evil, with which I agree, but I haven't seen any indicating that there are adequate checks in place to prevent the (relatively easy) abuse of power.
> This move might be good for Google, but I find it somewhat alarming as a consumer. Those who control or curate content (Google, NBC/Comcast, Facebook) should not control infrastructure.
I don't know about you, but I'll take Google over Verizon, AT&T, Comcast, etc. any day of the week, regardless of their conflicts of interest. Google has a proven track record of doing things in my favor.
> Google has a proven track record of doing things in my favor.
The current, publicly acknowledged level of intercourse between Google and the U.S. Federal Government is already worrisome. Perhaps your data has been a part of those exchanges, perhaps it hasn't. You and I don't know.
Now, imagine that Google begins relaying traffic unrelated to their services on behalf of internet users. It isn't hard to fathom that Google will come under the same pressures to give the U.S. government access to user traffic as have Verizon, AT&T, Comcast, and others.
That could just be the tip of the iceberg, though. How long before the company caves and starts feeding analysis of user internet traffic into the search engine's ranking signals? Would it still be doing you a favor?
> The current, publicly acknowledged level of intercourse between Google and the U.S. Federal Government is already worrisome. Perhaps your data has been a part of those exchanges, perhaps it hasn't. You and I don't know.
We've known for a fact that companies like AT&T have been working closely with the NSA for years. So it seems like Google is the better choice here.
Yes, Google hands over some of your data to the government. But as far as I am aware, Google tells you when requests have been made for your email account, and when search results have been censored. If your account is accessed by anyone but you, Google will notify you that you have been hacked.
Do I believe Google tells the whole truth? I'm not 100% convinced, but I can't know that. What I do know is that even if they tell 5% of the truth about government interaction with your data, that's several times more information than AT&T, Comcast, or Verizon give you. Most of the time your data is simply handed over on request. At least Google attempts/pretends to put up a fight.
Is it worrisome that I would accept 5% of the truth? Yes. But that goes to show how shady everyone else is. While there's no excuse for immoral cooperation, I would still direct more blame at the government that makes the requests in the first place. Always assume your security is lost the minute your data touches a public pipe unless you take deliberate steps to secure it.
Does that mean they CANNOT notify you, or is it up to the discretion of the entity handing over the data? If the latter, it still makes a strong case that a notification system would be used for such things, given that one exists for similar circumstances.
If the government issues an NSL (National Security Letter), you are prohibited from disclosing even its existence. NSLs are issued solely at the government's discretion and are not subject to judicial oversight.
> Google has a proven track record of doing things in my favor.
Thus far.
The point still stands, as a principle – that sort of combination of function/power and interests has, historically, not been one that has resulted in good things for consumers, in the long run.
I don't dispute that. But the other options are the companies I listed, which also have conflicts of interest and have repeatedly abused their control of telecommunications to favor the other branches of their businesses. So I'd rather go with Google, which doesn't have a history of abuse.
> Google has a proven track record of doing things in my favor.
Don't forget all the horror stories of trying to sort out customer support issues. Given the range of issues that supplying internet to houses can have, this would discourage me a lot.
It's fine today, but what about when the management changes? And a management change at Google could easily occur during our lifetime. Heck, look at Apple and Microsoft: Steve Jobs -> Tim Cook and Bill Gates -> Steve Ballmer.
If you believe that Google's management and board is willing to consider the public interest in day-to-day business decisions, there's no reason you can't believe they will consider it in a management succession plan.
Wasn't AT&T, before the divestiture, its day's equivalent of an owner of infrastructure and a controller, if not owner, of "content"? You got good, reliable service. What was more reliable than the phone?
You got good, reliable service until anything went wrong or you tried to change anything or you tried to get a less nosey line to use a modem on or you wanted to use a different style of phone.
Is this kind of like how you get good, reliable service from Google until "anything goes wrong", like your Adsense account is erroneously suspended, your site is erroneously blacklisted from their search results, or your Gmail account is erroneously disabled and/or deleted? There is story after story after story of what happens when your interactions with Google require human attention, and it's not pretty.
When there is no competition then companies can afford to be lazy or ignore customer complaints. AdSense has no real competition, so Google just doesn't care if they have one less customer.
In the local access area, fiber from Google would increase competition, so would be a good thing.
While your concerns are legitimate - content should not control infrastructure; look at cellular carriers for the case study - there are also wins that are worth thinking about.
Not to be a Pollyanna; Google needs to be watched. So we all keep each other in check. The following disruptive moves benefit Google _and_ your average Joe:
1. Reduce bandwidth costs. Google pays to send data and we pay to receive it. This is just your classic elimination of the middleman, in this case the anti-competitive backbone and last-mile carriers. In theory the first to benefit from this are the startups with large bandwidth bills today.
2. Reduce delays associated with new tech. Google wants new tech like a larger TCP Initial Congestion Window [1], IPv6, SPDY / HTTP 2.0, or Strict Transport Security [2] to happen faster. As long as your average Joe has a fast enough pipe, these are a win-win. The lag in adoption comes from slow-moving risk-averse players, such as incumbent backbone providers.
3. Reduce mobile bandwidth costs. Google is already heavily invested in mobile broadband clients. In especially congested areas like New York, anything to reduce mobile broadband costs benefits Android first: the top tier of customers who don't feel the costs are probably using iOS.
There are more win-win situations but those are my top 3.
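On point 2, the payoff from a larger TCP initial congestion window is fewer round trips for small responses. Here's a toy slow-start model; the response size and MSS are illustrative assumptions, not measurements:

```python
import math

mss = 1460                    # bytes per TCP segment (typical Ethernet MSS)
response = 14_000             # bytes: a smallish web response (hypothetical size)
segments = math.ceil(response / mss)   # 10 segments

rtts_needed = {}
for initcwnd in (4, 10):      # old default vs. the larger window Google advocated
    window, sent, rtts = initcwnd, initcwnd, 1
    while sent < segments:    # classic slow start: window doubles each round trip
        window *= 2
        sent += window
        rtts += 1
    rtts_needed[initcwnd] = rtts

print(rtts_needed)            # {4: 2, 10: 1}
```

With a window of 10 segments, the whole response fits in the first flight, saving a full round trip on every such fetch.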
I don't get why anyone thinks any company should control infrastructure. We don't have private electric lines or corporate-funded private roads. We don't expect train tracks to be privately laid. Our entire phone and cable infrastructure was built by private companies (mainly AT&T) on the back of tremendous government subsidies: effectively public funding for private enterprise.
There is absolutely no reason that any local government in the US worth its weight in salt couldn't start a fiber to the home initiative where they take a tax hike for a few years to pay off laying fiber to everyone in the district. They could contract with neighbor regions, or with private long line services like Level3 to provide internet access, and then nobody gets ripped off when they get a telecom monopoly in their area, because it is a publicly owned utility.
It is getting to the point where internet access is so ubiquitous and essential that we need to see the rise of public fiber and wifi to enable competition in an area where, for the last 30 years, there has been only collusion and monopoly abuse. I consider myself a social libertarian, but history has proven time and time again that you cannot effectively privatize infrastructure and expect long-term benefits to society.
Most of the U.S.'s railway system was laid by private companies. Similarly, private companies laid much of the electrical grid. They are regulated utilities today but were once unregulated private companies.
Even when governments build infrastructure, they typically contract the work out to private companies. I'll let you decide whether government contractors and bureaucracies are preferable to a builder with skin in the game.
More importantly, if someone else wants to build it, as long as they follow rational rules (that can be imposed ex post facto) it is irrational for the taxpayer to clamour to pay for it.
"There is absolutely no reason that any local government in the US worth its weight in salt couldn't start a fiber to the home initiative where they take a tax hike for a few years to pay off laying fiber to everyone in the district."
We. Can. Not. Afford. It.
In case you haven't been paying attention:
Unemployment is high, so new taxes are tougher to sell than usual, and income tax revenue is declining.
Home values have just taken a big hit, so property tax revenue is declining.
The housing market is having a difficult time clearing in many areas; i.e. prices are still too high and there's a surplus of homes, which sit abandoned. This stretches local services (fires, copper theft, copper theft of e.g. gas lines which causes fires, homeless, drug activity, etc.).
Expensive pensions, healthcare, etc. cause costs for government employees to keep going up.
In short, state and local governments are overextended and are feeling a need to cut services, not expand them. The federal government is too, to some extent, but they can essentially print money.
Of course, these are generalizations; they obviously won't apply to every locality.
Laying fiber would create jobs all over the country, not only by employing companies to lay the fiber, but by creating work for the companies that manufacture the fiber. More people working means more tax revenue and it means they have money to spend elsewhere, which in turn creates additional jobs in local regions - once again increasing tax revenues.
What the US cannot afford is to continue to cut federal, state, and local jobs when there is already not enough demand in the private sector to provide anything close to full employment.
In my "Can't. Afford. It," comment [1] I wasn't referring to the trickle-down effects, job stimulus, or long-term return on investment. Those are certainly worthy issues to discuss, but in comment [1] I was limiting myself to saying that, regardless of the merits of the proposal, it might be difficult for many localities to come up with the financial resources to implement it.
> What the US cannot afford is to continue to cut federal, state, and local jobs when there is already not enough demand in the private sector to provide anything close to full employment.
Whether creating jobs should be a goal of government, or merely a side effect of implementing policies, depends on your politics. (It's as much a question of "big-government" vs. "small-government" as "left" vs. "right".)
One of Google's stated requirements for the fiber project was the ability to impact business. They specifically wanted a location where there was an industrial market without access to reliable high-speed Internet access.
Fiber is a benefit to businesses. When my city installed a fiber line with the assistance of a large regional credit union headquartered here, we also saw a couple data centers and co-location facilities move in. The lines paid for themselves in a more than reasonable time frame.
In reality, the big thing holding back municipal Internet access or the installation of high speed lines is not the lack of money in the government (the government doesn't need to pay a dime). It's that the government is willingly allowing the existing local telco to maintain their monopoly with bans on the installation of new cable. Make it cheap and easy to drop new lines in and the market will expand to fill the desire.
Studies have shown that fiber to a house results in a net increase of about $10K in the house's value. Since typical property taxes are about 1% of the value, any local government that does this is looking at about $100/year extra income per house, ad infinitum.
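The arithmetic, plus a payback estimate; note the per-home build cost here is a purely hypothetical figure for illustration, not from any study:

```python
home_value_bump = 10_000      # claimed increase in home value from fiber
property_tax_rate = 0.01      # typical ~1% of assessed value per year
extra_tax_per_home = home_value_bump * property_tax_rate   # $100/year, as above

build_cost_per_home = 1_500   # hypothetical cost to connect one home
payback_years = build_cost_per_home / extra_tax_per_home

print(extra_tax_per_home, payback_years)   # 100.0 15.0
```

So even on property taxes alone, and ignoring any wider economic effects, the build would eventually pay for itself; the question is whether the locality can carry the debt that long.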
You can use bonds to pay for fiber, because fiber is infrastructure (like water, electricity, roads).
Here's how I would like to see it: the City provides fiber infrastructure, and charges a base fee for it. This gets your traffic to some termination point(s). If you want access to the Internet, you pay a third party to carry your traffic to/from the Internet past those termination points. It'd be an open market, so carriers would try to undercut each other.
I doubt studies really show any such thing. They probably showed that when fiber is rare in a residential area, it may add $10K to a house value. When it becomes common, it won't.
Not necessarily. Laying fiber might end up being revenue-positive without a tax increase, depending on the time frame and its effect on stimulating the economy.
The assumption behind laying any infrastructure is that it will be a net positive. You wouldn't consider it otherwise. If everywhere in the US with a population density over around 50 persons per square mile got end-to-end fiber via local taxes (and to head off the obvious objection: if a region can't afford it, it doesn't have to do it; many localities in the US are prospering just fine while some are destitute, so it's a case-by-case situation), the benefits to society would be massive.
Same thing with maglev trains along the East and West Coast corridors. They'd cost a ton, but within 50 years they would usher in a new era of mobility and an order-of-magnitude reduction in the cost of moving goods quickly.
Some might debate whether what you say is true but it doesn't really matter because it would be political suicide, I think, in the near future to try to push this agenda anywhere outside of the Valley.
While it would of course be preferable to have true structural separation in service providers, having a competitor whose interests are not aligned with most content publishers' is a pretty good consolation prize.
While Google has products that overlap with legacy content publishers, that doesn't appear to have made Google enthusiastic about keeping bandwidth prices up, and content under tight control.
You are completely right, but it's just one piece of the puzzle.
The whole market is becoming vertical. There is a war going on over who gets to extort content providers by controlling access to consumers.
The hardware vendors can control access and demand a slice (Apple, Amazon). The operating system vendors can control access and demand a slice (Microsoft, Google). The connection providers can control access and demand a slice (phone/cable companies).
Here in Holland, we now have net neutrality. But maybe that's not enough. Maybe we need app/media store neutrality as well, by requiring OS vendors to allow side-loading. Maybe we need OS neutrality as well, by requiring HW vendors to offer their machines "rooted".
For now, Google is pro net neutrality and has not abused its position. Maybe this is just a defensive step. Nevertheless, Google, like any other ISP, should be forced to treat packets neutrally. But that really should be a law, not a contract.
> Maybe we need app/media store neutrality as well, by requiring OS vendors to allow side-loading. Maybe we need OS neutrality as well, by requiring HW vendors to offer their machines "rooted".
Maybe? There's no question. Manufacturers are already abusing these holes.
Google's corporate motto is "Don't be Evil" and the United States has the 16th fastest broadband speeds in the world, and we invented the damn thing. Comcast, I agree, is a terrible monopolistic crack-baby of the capitalist society but Google is genuinely improving things.
What good is a corporate motto if the shareholders can't sue to enforce it? Some good, sure. I always liked it. But it is about as binding as it is well-defined.
Except Google has been acting pretty 'evil' recently.
The disgraceful FRAND abuses, the privacy abuses, the anti-competitive expansion of Google Search: hence the multiple investigations by the ITC, the EU, and the US government.
So it is far better to assume NOBODY should be trusted.
Except again: In Google vs AT&T, Comcast et al, I'll take Google any day of the week. Their "evil" is a bunch of shit that nobody really cares about, whereas the others actively screw me, running fiber optic cables straight into my apartment and then making sure that even the priciest data plan I can buy is rate limited to the equivalent of mid-2000s DSL.
When has Google, in and of itself, abused FRAND? The only actions I know of on FRAND patents are things grandfathered in from Motorola; are you blaming those on Google? I think it's far too early to judge Google's behavior with respect to patents. We have to see what they do in their own right.
For example, all of the H.264 patents except Motorola's are licensed for pennies, which is something like 0.25% of the cost of an Xbox. But Motorola wants ~2% for their few H.264 patents. A complex product infringes hundreds of patents; if each one has a 2% royalty then the total royalties exceed the price of the product.
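The stacking problem is simple arithmetic; the product price and licensor count here are hypothetical, chosen only to illustrate the argument above:

```python
product_price = 300.0         # hypothetical complex product, e.g. a game console
royalty_rate = 0.02           # 2% of product price per licensor, the figure at issue
licensors = 100               # a complex product can implicate this many patent holders

# If every standards-essential patent holder demanded the same 2% rate:
total_royalties = product_price * royalty_rate * licensors
print(total_royalties)        # 600.0 -- double the product's own price
```

That's why per-patent percentage royalties can't generalize: the rate only looks reasonable if almost nobody else asks for it.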
That's certainly an argument that could be had but Google's position on it is not in the document you cited. It specifies that the maximum royalty Google will ever ask for a standards essential patent is 2%. Whether Google approves of asking for 2% in the case you are referring to is not clear to me.
At this point the lawsuit is way past the phase of negotiating licensing fees and I doubt very much that it is possible to rewind it back to there even if Google wanted to (which I don't see why they would; they know that Apple is intent on banning their devices - Apple's goal is nothing less than the complete destruction of Android as a product, not negotiating reasonable licensing fees).
Google has owned Motorola Mobility for months now and is in every respect in full control of the company. They have had plenty of opportunities to "do the right thing" but have chosen not to.
So to be clear you're talking about the inertia regarding the Motorola suit, not any specific action taken by Google executives. So they've been "evil" for two months. I call shenanigans here: you're spinning. No one sane would indict a giant corporate entity on this kind of evidence.
The acquisition closed on May 22nd [1]. While 'months' is technically true, it barely squeaks over the bar, and really not that long when you consider that they might have other priorities, like not losing money.
Thanks for your consistent concern trolling about Google anyway.
The simple answer is: don't use services you don't trust. If Google provides cheaper fiber access in your area and you'd rather trust Verizon or Comcast, then use those providers.
Meh, people said the same thing about Chrome but I know of almost no one who thinks that has been bad either for the market or any individual segment of the market.
If previous versions of Adblock are still the base code for the new versions, it doesn't block the ad. It merely hides it. The ad is still loaded, you just don't see it. This limitation is/was because Chrome didn't allow extensions to have the access they would need to actually keep an ad from loading.
I haven't used Chrome with AdBlock in maybe a year or two, so this might be out of date. But that's what it used to be like, and that's where the perception is coming from.
I wish this meme would die. There's a kernel of truth, that initially Chrome didn't support blocking network requests, but that's because of the tradeoff of extension compatibility vs power that Chrome made. Firefox addons (as originally designed at least, I'm not up on their latest efforts) have always been incredibly powerful, but that power comes from calling internal browser APIs that might change from version to version.
Chrome extension apis have to be explicitly designed and developed, and backcompat is taken care of, so your extensions don't all break every six weeks when Chrome updates.
Sounds like a load of FUD to me. DOM elements are DOM elements, aren't they? How would Chrome's JS interpreter be able to discriminate ad content from anything else?
If you want to investigate how it works, open the extensions tab and enable developer mode. You can open 'background.html' for Adblock and check out what it does.
> Those who control or curate content (Google, NBC/Comcast, Facebook) should not control infrastructure.
Why not?
> while a high-speed competitor to local cable monopolies is exciting,
It's not just exciting, it's everything. Comcast, TW, et al. get away with being massively valuable enterprises even though they provide the absolute worst service on earth. Any kind of competition for them can only be a good thing, even if it's from another company that controls both content and infrastructure.
Because they will favor transmission of their own content over competing content. See Net Neutrality. This being said, Comcast is already doing this and I trust Google more than the big 3 internet providers.
I am more concerned about current internet providers blocking technology for the sake of their vested interest in old ways of doing business. Unlimited, unrestrained internet access is so important that it is one of the basic needs. This morning I heard that AT&T requires you to buy additional plans for FaceTime. After data caps from all the vendors, this is the height of absurdity.
> This move might be good for Google, but I find it somewhat alarming as a consumer. Those who control or curate content (Google, NBC/Comcast, Facebook) should not control infrastructure.
I agree. I hope this spurs competition, more than anything else.
Consider a service competing with YouTube; will it have the same streaming performance when delivered over Google's fiber as YouTube does? It might even be accidental (say, Google colocates its services with its fiber infrastructure), but it could easily put competitors at a big disadvantage if they can't get the same performance YouTube has.
> say, Google colocates its services with its fiber infrastructure
This is actually the defense Comcast uses for their IPTV services. They don't have to pay anything to move that data over public lines, so they don't "charge" extra for it (by "charge", I mean counting it against the caps they recently removed). On the other hand, Netflix et al. do go across public lines that Comcast has to pay for, and Comcast does "charge" for that because they themselves are being charged for it.
Local CDNs are troublesome when it comes to the difference between WAN and last-mile prices for content providers/deliverers.
I would tend to agree. Google has been extremely monopolistic lately. The only reason for Google+'s and Google Chat's relative success is that they were shoved down the throats of Gmail users. Google's mission has moved from organizing the world's information to controlling it. I really liked the former, and I'm pretty scared of the latter.
I want fast fiber. I also want to avoid another evil monopoly from taking over the computing landscape. I miss the old, trustworthy, don't-be-evil Google.
We have Google Fiber at home - Stanford faculty homes have had it for the past year, like Kansas City.
It's pretty cool. My record so far is to consume 400Mbps, using 4 computers downloading from about 10 places, all wired through gigabit switches.
In practice, though, it doesn't make much difference compared to a 30Mbps cable modem for most consumers:
most streaming video is < 10 Mbps;
large file downloads are generally limited by a server (or somewhere else in the network?), so it's hard to exceed 30-40Mbps download speed;
web browsing feels about the same, because it's limited by the round trips of DNS and HTTP requests, not by bandwidth (SPDY will help here?);
many consumer-grade NAT boxes (linksys and friends) are capable of only 100-200 Mbps
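The browsing point comes down to round trips, not bandwidth. A toy model (every number here - resource counts, sizes, RTT - is a made-up but plausible assumption) shows why a 33x bandwidth jump barely moves page-load time:

```python
# Hypothetical page: 40 resources of 20 KB each, fetched over 6 parallel
# connections, with 2 round trips per resource (setup + request), 40 ms RTT.
resources, size_bytes = 40, 20_000
parallel, rtts_each, rtt_s = 6, 2, 0.04
page_bits = resources * size_bytes * 8

totals = {}
for mbps in (30, 1000):
    transfer_s = page_bits / (mbps * 1e6)              # time on the wire
    waiting_s = (resources / parallel) * rtts_each * rtt_s  # time waiting on RTTs
    totals[mbps] = transfer_s + waiting_s
    print(f"{mbps:>4} Mbps: {totals[mbps] * 1000:.0f} ms page load")
```

In this sketch the gigabit link loads the page only ~1.4x faster, because the round-trip time is fixed regardless of pipe width.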
The one place it's made a big difference so far is uploads. For example, backup to a cloud backup service (backblaze) often goes 50Mbps or higher. But I did have to try several backup services because some were limited on the server side to a few Mbps.
Running services from home could be a use case too, but then you get into reliability of power/etc., and the fact that so far you can't get a static IP address through Google Fiber.
So for now Google Fiber is mostly a fast cable modem from a "don't be evil" provider. I think the real disruption will come with new services that don't really exist yet. What kind of new things can be built if there's enough of an audience?
I have the Asus RT-N66U, which is rated on that chart for 730+ Mbps (of course, QoS will bring that down). Even the AirPort Extreme is rated for 400+ Mbps, but not QoS to speak of.
That's a great question! Downloading a 1GB file, I peaked at 65 Mbps download rate just now. It ramped up slowly over a period of a few minutes. I was connected to 58 of 60 peers to get that rate.
I'm finding spideroak superior to backblaze - it actually backs up my entire system rather than skipping /Library & /opt and allows me to backup external drives without needing to have them constantly plugged in. I think the pay-for-what-you-use model allows them to be a lot more flexible.
Backblaze has been a rock star for me for over a year. It used to be very, very slow (my first backup took the better part of a week to get the initial image in place), but that was okay because their constant background backups only needed around 1 Mbit to be effective (and, more importantly to me, invisible; Time Machine had a bad habit of ramping up my CPU).
But recently they must have made modifications to their system, because I routinely see backups in excess of 15 Mbit/s. Highly recommended.
With an internet connection like that, CrashPlan might be more interesting. The fact that you can back up to a friend's computer would be nifty to test.
There is not much difference in perception to a normal customer between 50 Mbps and 1 Gbps: the former manages two 1080p YouTube streams perfectly well, with quite some headroom for extra misc traffic. Torrenters will always crave more bandwidth, but even there it's going to be tough to fill 1 Gbps downstream.
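A quick check on that claim, assuming a rough ~8 Mbps bitrate for a 1080p stream (an assumption; actual YouTube bitrates vary):

```python
stream_mbps = 8               # assumed bitrate of one 1080p stream
for link_mbps in (50, 1000):
    streams = link_mbps // stream_mbps   # how many streams fit on the link
    print(f"{link_mbps:>4} Mbps link: ~{streams} simultaneous 1080p streams")
```

Six simultaneous streams versus 125: far more than any household watches at once either way, which is exactly why the downstream difference is imperceptible.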
But all of that is missing the point. The internet is not just a better TV with cats - it has true full duplex communication!
It's hard to believe because the monopolies in control of the last mile will generally offer you tons of downstream with little to no upstream. In some cases, the upstream is merely enough to send TCP ACKs while you use all of your downstream. It's a natural move for these big old companies, because upstream traffic from customers is more expensive for them and they are still stuck in the "media consumption machine" mindset.
Google, of course, realizes that 1Mb of upstream is bad news for Google Hangouts and terrible for uploading 1080p video to YouTube. And all kind of distributed systems benefit tremendously from matching upload and download bandwidth.
There's no good reason why I shouldn't be able to participate in lifelike HD video conferences from here in Austin with my team back in CA, along with our other remote members.
But, of course, the problem is that while all of us easily have enough bandwidth to pull down streams of high-quality HD video, the video streams we're trying to pull down are choked into low quality by the sender's lack of upstream bandwidth.
For cable systems there have historically been some good technical reasons why upstream speeds are much lower. DOCSIS 3 solves most of those problems. If there is demand for it, cable upload speeds can jump to 10-15 Mbit/sec in the near future without much trouble.
These speeds are already available. I've had 50Mbps/15Mbps from Comcast for over a year, for about $100/month -- and I actually get those speeds so long as the server with the content is willing. Dual-stack support is soon as well, so I'll have an IPv6 address.
In the next week or so, my downstream will be increasing to 105Mbps at no additional charge[1]. In some markets (mine included) Comcast will be offering 305Mbps downstream service (granted at $300/mo), which is 90% of the bandwidth the DOCSIS 3.0 specification would allow over 8 channels.
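That "90% of the spec" figure roughly checks out against the per-channel rate (42.88 Mbps is the standard 256-QAM downstream rate for a 6 MHz North American channel):

```python
per_channel_mbps = 42.88      # 256-QAM over one 6 MHz downstream channel
channels = 8                  # DOCSIS 3.0 bonding group in question
ceiling_mbps = per_channel_mbps * channels   # ~343 Mbps raw ceiling

offered_mbps = 305
print(f"{ceiling_mbps:.2f} Mbps ceiling; {offered_mbps} Mbps is "
      f"{offered_mbps / ceiling_mbps:.0%} of it")
```

The raw rate includes protocol overhead, so the usable fraction is a bit lower still; either way, 305 Mbps is close to what 8 bonded channels can carry.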
Granted, on the opposite end we have Verizon now trying to slowly kill their DSL service and replace it with LTE. [2]
From the ISP perspective it's actually getting easier to accommodate these faster speeds. Very few customers can actually saturate a 100 Mbit/sec connection for more than a few minutes. I suspect the usage on a 305 Mbit/sec package would be nearly identical to that of the 50 or 100 Mbit package for most customers. Once you get past the 15-20 Mbit/sec mark, usage stays about the same as you scale speeds upward. The old model of oversubscription works again. There's a lot of gloom and doom about US broadband, but the reality is we're actually in pretty good shape now that most cable operators have deployed DOCSIS 3 equipment.
Does that include cable? I pay $105 a month for 75/35 plus cable with an HD DVR from Verizon FiOS, which might make Comcast the better deal after that upgrade.
Or, if it's just internet, then 150/75 is $99 and 300/65 Mbps is $200/month from FiOS, which would put Comcast a little behind but still a reasonable alternative. That's a lot better than the last time I compared them.
With the most basic of TV service, taxes, fees, etc. my total bill comes to about $116.81/mo. If I punted the TV service, my savings would be about $4/mo. I don't bother with TV service because to get the one channel I want, the required pre-reqs and equipment would double my bill.
Regarding Verizon, I'd love to have FiOS, but they won't deploy in Boston without major tax breaks, which Boston refuses to offer. FiOS is great if it's available, but it's not in most markets. There aren't many other options in the continental US where you can get 100Mbps service for ~$100/mo if FiOS is unavailable (and for most of the US, that's the case).
Going to the 404 page reveals a menu which hints at what is being announced tomorrow (well, it says that there's going to be some plural number of cities, not much else): http://fiber.google.com/savethedate/404
The main menu has links to "About", "How to get it", "Plans & Pricing", "Cities", "Help" and a button "Pre-Register"
Also, since that page doesn't seem to indicate the time, the Google fiber blog (http://googlefiberblog.blogspot.com/) says it's at 11AM CDT. Another impressive stat: they've apparently laid down over a hundred miles of fiber.
...until Google's algos shut down your account; now you lose Gmail, Docs, Drive, etc., etc., etc., and fiber.
OK, I am kidding to some extent. Maybe not. I'd sure like to see them move in a direction that assures users that all services will not be cut off without recourse for unknown algo violations.
And that part scares me. That is a whole lot of my personal info to provide to a company that is really good at processing it.
Of course, I'm probably being stupid about that. My network data was probably sold a long time ago. And they have also probably already correlated it with my google account.
Well, that's true of you and me, but if you care about other people who don't know any better, it could be a real issue. There are millions of FB users who don't understand the implications of all their oversharing and will not be happy when/if it becomes a problem for them. But I think it's too early to judge anything about Google Fiber yet, at least for me, without many more details.
Thanks for the context, the page itself was quite devoid of information.
I would love to see how this service compares to the roll out of fibre we are seeing in Australia at the moment. There is a lot of resistance from the opposition as the government is largely funding our new network.
In particular, many commentators claim that rolling out wireless technology is more financially viable than fibre to the home. In Australia we do have a sparse consumer distribution, so the context is a little different; however, if Google's model is viable (as you would assume), it might make a strong case study for our situation.
> In Australia we do have a sparse consumer distribution
Nonsense, Australia is one of the most urbanised countries on Earth. We are less dispersed than the USA.
> There is a lot of resistance from the opposition as the government is largely funding our new network.
The government is right that FTTP is the most future-proof technology choice. But the way the project is being run is shameful -- they're carefully hiding the costs and debts behind "commercial-in-confidence" for a company wholly owned by the government.
And the dealing with Telstra, and creation of a new monopoly ... there's lots not to like.
> Nonsense, Australia is one of the most urbanised countries on Earth. We are less dispersed than the USA.
That is a good point. To be fair, I am not fully aware of what our distribution looks like.
Perhaps I should clarify the point I was trying to make.
Whilst it is true that a large proportion of our population reside in urban environments, one of the aims of the fibre roll out here is to reach almost all of the population with fibre to the home.
For those not in urban environments (which is still significant), the geographic spacing is quite large, and I would imagine in that context we are sparsely distributed. Is running fibre to the home in that context a viable option? Hard to say, and perhaps even the Google roll out will not give us much more information or real-world example than we currently have.
> ... the way the project is being run is shameful
The thing I hope most to see out of the google roll out is how they handle it. I don't know of many data points regarding nation-wide roll out of fibre technology, or any similar infrastructure, in recent times, so the potential to 'see it done right' is an exciting one.
Australia is a bit odd. It is among the lower ones in terms of average density but among the higher in median density. We're very clumped; 89 percent of the population is in urban areas. Compare with densely populated Netherlands, with only 61 percent in urban areas.
> In particular, many commentators claim that rolling out wireless technology is more financially viable than fibre to the home.
I don't think anyone claims that wireless isn't cheaper than fibre-to-the-home (FTTH). Some commentators claim it offers similar performance to fibre.
(They are wrong: at scale, with a large number of geographically concentrated users, wireless doesn't offer anything near fibre performance. Comparing best-case wireless performance - where there is a single client on an uncongested network - against the cheapest available, bandwidth-capped fibre plans is clearly an invalid comparison.)
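A back-of-the-envelope sketch of why that comparison is invalid. The cell capacity figure below is a made-up round number, not a spec value:

```python
# Illustrative only: how per-user wireless throughput falls as a cell fills up.
# The 100 Mbps aggregate cell capacity is an assumed round number.

def per_user_mbps(cell_capacity_mbps, active_users):
    """Wireless cell capacity is shared; fibre strands are not."""
    return cell_capacity_mbps / max(active_users, 1)

# A single client sees the whole cell; a busy neighbourhood does not.
for users in (1, 10, 100, 500):
    print(f"{users:>3} active users -> {per_user_mbps(100, users):.2f} Mbps each")
```

The single-client benchmark is the first row; a realistic loaded cell is the last two.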
Well, governments fund roads, schools, etc, and they don't expect to make financial returns on those.
However, the national broadband network is funded through government loans, but expects to make a profit on the returns from it.
So there is quite a difference in terms of the Government budget. The former are a cost to the budget, with ongoing expense and no expectation of any monetary returns. The latter is forecast to take nothing out of the government budget.
I am absolutely ecstatic that Google is attempting to build the next generation wired network.
Why does this matter? Well owning the pipe is always a good move especially in the face of net neutrality. But there are all sorts of other tie ins. For one, payments.
Our current payment infrastructure is based around private leased lines. If you really wanted to take on the payments industry you have to start with the infrastructure. Otherwise you are always at the mercy of a credit card processor just shutting you off. When you own the network where the transactions actually flow it is a different story.
> Our current payment infrastructure is based around private leased lines. If you really wanted to take on the payments industry you have to start with the infrastructure. Otherwise you are always at the mercy of a credit card processor just shutting you off.
Umm..
Credit card processors have other, more obvious ways of shutting you off than taking away a leased line.
The leased lines used for credit card processing are such a minor factor that you can pretty much ignore it. Yes, it is a weird way to do it, and no, if you were building a replacement you wouldn't use leased lines, but that isn't what is holding back payment technologies.
> If you really wanted to take on the payments industry you have to start with the infrastructure. Otherwise you are always at the mercy of a credit card processor just shutting you off.
Today Google Wallet is at the mercy of credit cards, but in the future... people will pay their Google Fiber bill with credit cards. I don't see where you're going.
Not sure how owning the pipe would help them there. Google's search domination is (arguably) a result of them having a superior product, with advertising being completely coupled to that.
Honestly, the more network is out there, the better.
There is probably an upper limit on the number of networks any given city can support, so it's going to be natural for some combination of content sources to be partnered with fiber distribution networks to revenue share.
It's enormously costly to deliver "wireline" fiber to the home service. Most of this is labor cost, but you also can't discount the impact of property taxes and upkeep on infrastructure that is supposed to weather the elements and wildlife (including human) for upwards of 30 years.
Fiber is a great solution, but wireless is still the most economical way to deliver access for the "last mile". If we ever find a way to provide low-cost, high-bandwidth wireless service within a one-mile radius that doesn't make the NIMBY types have an aneurysm, then fiber-to-the-home will seem as quaint as an individual copper pair to every residence.
disclaimer: In 1997-1998 I worked at a municipally owned city utility that was able to deliver 10mbps symmetric Internet access to a development of homes in FL. Bellsouth subsequently had laws passed to prohibit political subdivisions from engaging in the provisioning of telecommunication services.
Fiber is fundamentally no different than wireless. It's just at a high enough frequency the light is visible. That means you get much higher bandwidth. There's just no way around that.
If you could rig up visible-light laser communications, you wouldn't need the fiber. Fiber's just a convenient container.
Whilst the information carrier, photons, is identical in both cases, the medium is extremely different.
In the case of fibre we have a highly controlled, flexible medium with extremely good quality of service and very little attenuation. This comes at the price of requiring relatively expensive infrastructure between network nodes.
Wireless technology on the other hand travels through air as a medium. You remove the end-to-end infrastructure requirement, but the trade off is much higher attenuation and lower quality of service. Additionally, you require some amount of 'direct line of sight' depending on frequency. Lower (and hence slower) frequencies are able to refract around mountains and even off the outer atmosphere, providing quite good coverage, but visible light has no such capability. Repeater satellites or LOS base stations would be required to effectively use such a frequency.
This is why HAM radios are great for talking to people kilometres away, but can't be used for effective wireless internet, and why hotspots don't cover more than a few hundred metres at most.
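For anyone curious about the frequency/attenuation relationship mentioned above, the standard first-order model is free-space path loss. A quick sketch (real-world losses through walls, foliage, and weather are worse than this):

```python
import math

# Free-space path loss: FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44
# This is the textbook best-case model; obstructions only add to it.
def fspl_db(distance_km, freq_mhz):
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Higher frequency -> higher loss over the same distance.
for f in (100, 2400, 5800):  # MHz: VHF broadcast, 2.4 GHz Wi-Fi, 5.8 GHz
    print(f"{f:>5} MHz over 1 km: {fspl_db(1, f):.1f} dB")
```

This is part of why low-frequency signals carry so much farther than Wi-Fi for the same transmit power, even before diffraction around terrain enters the picture.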
Fiber is just a convenient waveguide for certain electromagnetic frequencies. Bad for others.
That said, your understanding is a bit "off."
Amateur Radio (HAM) has microwave frequencies available for use that would be VERY effective for point-to-point wireless internet. We even have frequencies that are very close to the existing bands and there would be no problem running LTE over those frequencies. The problems with amateur radio are regulatory, not technical.
Generally speaking, the higher the frequency, the higher the attenuation. Of course, the higher frequencies are better suited for the kinds of modulation rates necessary for high bitrate communications. So there's that.
But all of this is besides the point. No technology exists in a vacuum and must compete against others for particular applications and for particular cost models. That's why telecommunications engineers have no hair and bad tempers. This stuff is hard.
For "lower" bitrate communications, wireless is the hands-down winner. The cost to deploy a network for a given level of service to a range of end-stations is just so much easier and cheaper that lots of other nations skipped over building a wireline network and just went straight to wireless. Mobility of the stations is almost a secondary concern.
Once you get to the point where your communications are less intermittent and higher bitrate, it's really a question of how far you need to go and how much/how often you need to communicate.
Lots of LTE stations are connected to a central office (router) via fiber (gigabit ethernet mostly). Lots more are connected via point to point microwave. Backhaul of bandwidth is something you dont hear talked about a lot but it's a big part of why it can be hard to get LTE/4G/WiMax into a particular location. Economics are a big part of that equation.
So, I wish the Google Fiber guys well and I hope they succeed, but the telecommunications market is so broad and complicated that it would be foolish to look at one company's efforts to shake up a single market and equate that with a sea-change in how the broadband market works.
edit: I met Milo Medin (who is now running Google Fiber) and he offered me a chance to work at @Home way back when. In retrospect, I probably should have pursued it.
Thanks for the in-depth reply. The example of HAM radio was a poor one, I was associating it with Low Frequency [1] radio when it in fact covers a much larger spectrum.
I agree that telecommunications is a hard subject, and even more so that different technologies are useful for different purposes. That is why I think the original comment was off:
> If you could rig up visible-light laser communications, you wouldn't need the fiber. Fiber's just a convenient container.
Something we haven't really talked about, and I'm not sure how relevant it is, is the saturation of the wireless spectrum. Point to Point wireless is obviously a different beast, but for broadcast wireless is there a saturation point where we can't safely send more data over the airwaves?
I like that fibre is by construction point to point, not restricted by line of sight, capable of very high bit-rates with relatively low energy (correct me here if this is wrong), and excellent quality of service.
Wireless has a lot less infrastructure, particularly when used for the last mile, and is definitely extremely convenient in many situations. I don't think it is the solution for general purpose network infrastructure, though that may change in the future (and is clearly the choice for some countries already).
I wonder what the replacement for fibre will be in 20-30 years time - is wireless the only frontier at the moment?
Well, at microwave frequencies (think anything at 10Ghz and up, really) you're already talking about directional antennas. The RF engineers will call anything above 1Ghz microwave, but that's really splitting hairs.
Fiber is a waveguide and not much different in that respect from coaxial cable (or for that matter twisted pair). What matters is what kinds of frequencies the waveguide will accommodate and what kind of attenuation those frequencies will experience. This matters for lots of important reasons.
As an example, take RG-6 coaxial cable (the kind you probably have in your home):
At 100Mhz (VHF) the attenuation is about 2.0 dB/100ft.
At 700Mhz (UHF) the attenuation is about 6.0 dB/100ft.
Now, take a look at Corning SMF-28 singlemode fiber:
At 1310nm (228.8 THz) the attenuation is 0.35dB/kilometer.
At 1550nm (193.4 THz) the attenuation is 0.22dB/kilometer.
So, fiber is an exceptional waveguide at very high frequencies, which makes it uniquely suitable for high-bandwidth communications. Compared to all of the other waveguides, it's the most durable, most compact, and most future-proof solution.
Except when you can't use it.
So, what's left? Wireless, probably. Well, there is free-space optical, which uses the same 1550nm frequency but has the problem of aligning the transmitter and receiver to have a completely free line-of-sight. Then there is the fact that attenuation through free space isn't the same as within a fiber. Attenuation at 1550nm ranges from 0.2dB/km (clear) to 100dB/km (foggy). So, the atmosphere is a shitty waveguide for optical frequencies.
But there is still a chance with wireless, since we aren't restricted to the propagation characteristics of optical frequencies or the directionality. With advanced modulation techniques (CDMA/OFDM), it's actually pretty easy to fit many more bits/hertz than you could with fiber transmission systems. It's just that that sort of heroics is not needed for fiber systems since it's easy to get extra bandwidth with another fiber.
Now, when you are talking about "last-mile" types of solutions, it really depends on what the alternatives are and what the "load-factor" is for that method. There is a big difference between phone calls and streaming audio for example when it comes to wireless systems. All of those things need to be taken into account.
IMHO, the end-game is probably going to be coax to a smaller set of homes using whatever flavor of DOCSIS is available at that moment. Most CableCo networks are mostly fiber anyhow. Once you get to a certain "cluster size" the choice of fiber to the home versus fiber to the local node and then coax to the house becomes an engineering question. For lower bandwidth or other services, you'll see wireless broadband displace some take-up of coax-based services. The only reason I think this will happen is because the coax is already deployed (and so it will be used). If we didn't have almost universal penetration of coax into the average home, there would be no question that most broadband would be wireless to the home.
The FTTH systems now being deployed have lots more in common with cable systems than is commonly understood. Your fiber doesn't actually go ALL the way to the central office. Your fiber is split at some point in the neighborhood (from 4-64 homes) and then is carried to the central office.
Honest question: what sort of support can you expect when something goes wrong? Fiber attracts backhoes (buried routes) and bored hunters with shotguns (aerial ones). It's a fact of life. Who's going to do customer service, Google, or some other agency? What's their track record like?
(As the old gag goes, this is why you should carry a short piece with you, in case you are stranded on a desert island. Bury it and when the backhoe shows up, get a ride back to land with the driver).
This is a great question. I love Google for the most part, but their customer support is absolutely and utterly terrible. And by design it would seem. No doubt backed up by data mining and A/B testing telling them that offering good customer support is terrible for their bottom line?
For something like this I sincerely hope they find and partner with a great customer support provider.
Knowing google, they're just going to run 8 different routes to any given place and leave cut cable in the ground, then replace the whole network in 10 years or so.
So I'm all for getting better broadband in this country, and good on Google for trying to make it happen. But allow me to place devil's advocate for a minute. This is like AOL 2.0 right? Isn't it a really bad idea to have one of the largest sites on the internet also be your ISP?
There is something I call the paradox of Google. The only real content Google makes that I can think of is their front-page doodles. They are an entirely service-focused company. For the most part, Google products fall into two categories: services which serve advertisements, or services which feed into category one. Their entire strategy is to use content created by other people (websites in search, videos on YouTube, etc.) as the reason to show the content people pay them for you to see (ads).
The paradox, therefore, lies in this: in order to grow and thrive as a company, Google must keep the internet free. If the web is monopolized by a small number of large corporations, the need for Google to facilitate access to a large number of small websites diminishes. YouTube is their platform most resilient to this notion, and even it requires companies to remain small enough that using YouTube's service still makes sense. This is also why I think their products are so good: every last bit of quality in a product allows that much larger a company to still find it more useful to use Google as its gateway to the web.
So all in all, having a major website be an isp may be dangerous, but powers which guide Google guide it to be an open portal to a free web, instead of being the one stop content shop. In fact, I think this is possibly their biggest driver for doing fiber: to ensure the later doesn't happen.
This was true until Google started making phones and tablets and opened the Google Play store and Google Apps for business and ...
Actually wait.... Google has been making [and selling] content for a while. Their main business might still be search advertising, but that doesn't mean that rerouting the pipes won't benefit them and absolutely destroy others.
What about a phone is content? Content is something you spend time watching, or playing - nothing google sells is content, it's all service. All of their content is derived from other people, and they enable content creation on a grand scale. With the exception of a few google blogs and videos, it's all a way to get where you're going, not a thing in itself.
The day there's a google studios, or google game production, or google music labels, they'll be creating content.
A phone is not a service, it's a good and a platform for serving more goods/content in the form of the Google Play store.
What about YouTube or Maps or Apps? YouTube produces its own content, as does Google Maps.
Again, Google today is not Google c. 2002. They produce massive amounts of content and goods, and being able to control the full distribution pipeline for said content and goods could end up being a significant anti-trust issue.
meh, last-mile buried fiber is a 'natural monopoly' - personally, I think that fiber to the home should be a municipal service, like power (and like fiber to the business is in Santa Clara). I mean, the majority of the value there is the public right of way; private companies simply can't do it without subsidies (in the form of rights of way granted by the municipality)
But, see, that only gets you from your house to the CO. I think that going from the CO to the internet could be an active and healthy free market.
In theory open access is a great model, but we've seen some problems in practice. DSL had a lot of buck-passing where your DSL was down but the telco and the ISP would both blame each other. Also, it's possible that all the competing ISPs could end up offering the same prices and evil TOSes. The easier it is to switch ISPs the easier it is to get adverse selection problems, like if you offer policies that are friendly to heavy users then soon you will have all heavy users which may invalidate your business model.
> DSL had a lot of buck-passing where your DSL was down but the telco and the ISP would both blame each other.
This is a huge problem, yes.
It can be mitigated in general by having very clear lines of responsibility, and easy tests. Set the system up so you have one fiber strand from the house to the CO; at that point, well, if the light meters say it's okay, it's the fault of the ISP (or the end user; in any case, it's not the fault of the owner of the fiber).
But yeah, every time you have a service that involves more than one person, you have that problem. Even when one company owns the whole thing, you often have people trying to point the finger at other departments; but yeah, this is worse when it's two different companies.
> Also, it's possible that all the competing ISPs could end up offering the same prices and evil TOSes.
Possible, but unlikely. The capital to start an ISP at that level (e.g. selling connectivity within a datacenter) is way smaller than the capital required to, say, buy a house. (I mean, marketing that sort of thing is damn difficult, but re-selling bandwidth within a datacenter/co is not a capital intensive thing.)
That's why I think that from the CO to the internet should be a free market; the barriers to entry are so low (except, maybe, marketing, but hell, I'll go knock on doors.) that if the existing players are doing something that irritates customers, there is a lot of incentive for new people to enter the market.
>The easier it is to switch ISPs the easier it is to get adverse selection problems, like if you offer policies that are friendly to heavy users then soon you will have all heavy users which may invalidate your business model.
As a Kansas City resident, this is going to be incredibly interesting. We have a pretty decent tech scene here. I'm just wondering how this will impact it, if it will at all.
What does Google hope to accomplish by making fiber available?
They want to enable entirely new applications.
For example, online video was an application enabled by the widespread availability of broadband Internet. Before broadband, downloading videos was possible, but streaming was not. Simply put, the (average) rate at which you download frames of video has to be greater than the rate at which frames are displayed for streaming to work.
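That streaming condition is simple enough to sketch. The bitrates below are illustrative, not from any real service:

```python
# The streaming condition stated above: sustained download rate must
# meet or exceed the video's bitrate, or playback stalls.
# All bitrates here are illustrative assumptions.

def can_stream(link_mbps, video_bitrate_mbps):
    return link_mbps >= video_bitrate_mbps

def buffer_wait_seconds(link_mbps, video_bitrate_mbps, duration_s):
    """If the link is too slow, how long you must pre-buffer to play
    the whole video without stalling."""
    if can_stream(link_mbps, video_bitrate_mbps):
        return 0.0
    deficit_mbit = (video_bitrate_mbps - link_mbps) * duration_s
    return deficit_mbit / link_mbps

# A ~5 Mbps HD stream on a 1.5 Mbps DSL line: a long pre-buffer wait.
print(f"{buffer_wait_seconds(1.5, 5.0, 600):.0f} s of buffering for a 10-min video")
```

On broadband the wait drops to zero, which is exactly the qualitative shift: "download then watch" becomes "watch now".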
The most interesting changes were not quantitative, but qualitative.
Google -- and most HN readers -- probably believe that higher broadband speeds are an inevitability, although the process has been going much slower in the US than most of us would like. And new ways of using the Internet will be enabled as speeds get faster. And, if it offers fiber, Google will be at the forefront of that wave, which will help Google by:
(1) Accelerating the change, pushing those new markets to be created sooner than they would have been created without Google Fiber
(2) Putting it in a good position to capture the new markets -- i.e. if Application X is eating a lot of bandwidth on Google Fiber, that might be an early signal that the Application X space is a growth market and Google should find a way to get involved.
Just got a fibre plan 20mbps for $30 US per month, no contract, they also give the router and modem. Although had to pay $70 or so upfront for the no contract option.
They also have a 1000mbps plan. First-world countries are really dragging their feet on fibre networks.
1gbps is not as impressive 2 years later... 100mbps broadband is already becoming affordable here in Brazil; it's a given that 1gbps will be available at large scale in 2-3 years.
Until December last year the fastest I could get was 3mbps. I was then able to double to just under 7mbps. I'm also forced to pay for landline service I don't want or use as part of that giving me a total bill of $65/ month ($5/mo less than when I had 3mbps plus phone line).
I live in a city of 60,000 residents 20 km from Silicon Valley.
Westside. Comcast isn't an option because I run my own servers, need a static ip, and don't need their silly games. And note that Comcast speed is an "up to" and shared thing. A friend is with them in Palo Alto and gets numbers like yours during the day, and then in the evening it drops to around a Mbit or two as the entire neighbourhood watches Netflix.
I'm using Cruzio. The speed bump was because they partnered with Sonic and also use ADSL2+ instead of only the old crusty ADSL. Cruzio also will not pull any of the stunts Comcast et al. do ("traffic shaping" to favour their own products, quotas, etc.).
Man, was I pissed when Austin lost out to Kansas City for the Google Fiber for Communities pilot project. The lack of FIOS in a city like Austin is painful. Austin is "AT&T territory" and AT&T's UVerse service isn't even worth talking about.
I sure hope this announcement is, "Fiber is coming, and Austin's in the first wave!"
I have the 50/5 plan from Time Warner as well, and it's definitely best option around. But 5Mb upstream is still pretty chintzy.
Go up to Dallas and you can get 50/25, 75/35, 150/65, and now they're rolling out 300/65. It gets expensive on the extreme end of that, but I would sure love even 50/25. Major, major difference in upstream bandwidth.
No, it's not. TWC's TV/Phone/50Mbps Internet plan costs me $170 a month. (I think I have HBO in there too.)
Now incidentally, a firmware update bricked my cable box and I haven't had the desire to talk to anyone at Time Warner about it, but $170 is a fine price for 50Mbps Internet. (I don't use the phone either. Again, not wanting to talk to anyone in a call center, this is the best I could do.)
Their DNS does lie about NXDOMAIN responses, however. If you try to resolve a nonexistent domain name, it returns NOERROR and an A record pointing to their servers. I run my own DNS server fed from the root servers, so I don't really care, but I thought I'd mention it. The technician that did the install was also not happy that I didn't have a Windows or Mac computer for him to insert a shady-looking CD in. "What kind of router do you have." "It's an OpenBSD box." "..."
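A quick stdlib-only way to check whether your resolver does this (the probe-name scheme is just my own sketch, and example.com is a reserved domain whose random subdomains shouldn't exist): resolve a random name and see if you get an answer anyway.

```python
import random
import socket
import string

# Sketch of an NXDOMAIN-hijack check. A random subdomain of the
# reserved example.com should never resolve; if it does, the resolver
# is substituting its own A record instead of returning NXDOMAIN.
def random_probe_name(length=16):
    label = "".join(random.choices(string.ascii_lowercase, k=length))
    return f"{label}.example.com"

def resolver_lies():
    try:
        socket.gethostbyname(random_probe_name())
        return True   # got an answer for a name that shouldn't exist
    except socket.gaierror:
        return False  # honest NXDOMAIN / resolution failure

print("probe name:", random_probe_name())
```

Running `resolver_lies()` behind a hijacking resolver returns True; against an honest one, False.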
(I had Speakeasy before and kind of miss them. 24 hour phone support, and they were always extremely knowledgeable. Someone I talked to once also used OpenBSD at home.)
1 Gbps upload and download speed. Wonderful. Unless you can give me a static IP address and let me serve whatever I want from that IP, I'm not falling out of my chair over this.
If you're just looking to run a server, a VPS service like Linode or bulk hosting like S3 will be more effective and less expensive than residential hosting.
I work for a GIS services company, and I know one of the major issues keeping us from going full cloud is that we deal with really large data-sets. If we had access to a large fiber pipe, we could easily dump all our servers. I would imagine any engineering, graphics, or video production office runs across the same problem. I see this as completing the cloud story and relieving businesses of having to run their own data centers.
As Google gets larger and larger, it becomes slower and less nimble.
But given the other competitors in this market, I'm glad Google is making their move. The US is falling behind other countries when it comes to broadband access, and this will open up so many new business opportunities in the US.
However, as others said I'll be curious about neutrality of Google when it comes to content. Will they block vimeo in favor of youtube?
If Google doesn't do it who will? Who else has the vision? As long as they keep not doing evil everything is cool. If they do start doing evil we are screwed but we will have high bandwidth to get on the Internet and read about how messed up we are.
What happens when a large company that Google deems a competitor tries to buy bulk use of the fiber? I could see Facebook lobbying for government regulation of Google's fiber in an attempt to secure competitive pricing.
This is pretty amazing. It's a transformative move if Google can pull it off. Google might be best served by being the wholesale pipe provider rather than customer facing ISP.
How does Google fiber compare to other countries around the world like S. Korea and Japan where their bandwidth is much higher. We're #1? USA?! USA?! err Kansas City?!
I expect that Google already has plenty of ways to consume an entire gigabit link; eg, Google Play music/movies/tv, Google Drive, etc. I'm sure they'd be more than happy for all of your data to be stored in the cloud, and have all of your access streamed to you in realtime over that gigabit link.
Come to think of it, I would be more than happy for all of my data to be held securely in the cloud so that I would never need to worry about backups or syncing my data between multiple machines. Assuming reasonable privacy and security practices, of course.
That's a good thing. Instead of relying on third-party providers with loads of bandwidth for cloud storage, you and your family create your own private cloud, with direct peer-to-peer transfers across your gigabit links. The reliability would be as good as the size of your family.
In my case it is definitely one of the limiting factors. Try editing your family's latest HD home movies over the 2mbit upload speed that comes with most cable plans.
Good software support is another hurdle; I believe AeroFS is one of the companies working on this, though since I'm fully comfortable with sftp et al., I never signed up for their beta to find out.
It usually is. Designers tend to say, "Gee, I've got an entire 1Gbps link! I don't need to think about efficiently using bandwidth ever again!". This is fine if the pipe never fills up, but if it does (like I am sure it will) you pay the price.
This is a cycle that has repeated for decades with every type of computing resource, and the end result is usually, for a 10x improvement in hardware capacity, you get a correlated but much smaller improvement in performance.
> It usually is. Designers tend to say, "Gee, I've got an entire 1Gbps link! I don't need to think about efficiently using bandwidth ever again!".
This is true, but is only a problem if everything else is equal - which they aren't.
I'm not defending badly performing graphics heavy websites which have no reason for existence except to display 24bit versions of things that could be done in CSS.
BUT I am looking forward to the ability to transmit multiple streams of 1080p (and higher) video while my children play games with rich, 3D video assets and my electricity supply optimization company monitors the temperature of every cubic centimetre of air to determine if the air conditioner's fans need to be turned up.
Yes, that will chew up a lot of bandwidth - and I'll love every single bit travelling over that beautiful, beautiful, fibre...
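Even that household adds up to only a fraction of a gigabit link. A back-of-envelope budget (all bitrates are rough assumptions: roughly 8 Mbit/s per 1080p stream, a few Mbit/s for games and telemetry):

```python
# Rough household bandwidth budget (all figures assumed for illustration).
streams_mbps = {
    "3x 1080p video": 3 * 8,
    "2x online gaming": 2 * 1,
    "home telemetry": 5,
}
total = sum(streams_mbps.values())
for name, mbps in streams_mbps.items():
    print(f"{name:>16}: {mbps} Mbit/s")
print(f"{'total':>16}: {total} Mbit/s of 1000 available")
```

Even this busy household uses only a few percent of the link, which is why per-stream efficiency matters less once the pipe is this wide.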
I think bandwidth will continue to be a problem, no matter how big the pipe gets.
The domain of bandwidth restricted problems may shift, but it will almost certainly still be an issue. Historically we seem to be really good at using whatever bandwidth is available.
The main thing that might prove this sentiment wrong is the other choke points in the system: it's no use pumping data down the line if our hard drives can't store it quickly enough, or we can't render it as fast as we get it.
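The disk choke point is easy to quantify: a saturated 1 Gbit/s link delivers 125 MB/s, which is right at the edge of what a single spinning disk can sustain (the ~100 MB/s sequential write figure below is a rough assumption for a consumer HDD):

```python
LINK_MBPS = 1000        # gigabit link, in megabits per second
HDD_WRITE_MB_S = 100    # assumed sequential write speed, MB/s

link_mb_s = LINK_MBPS / 8  # megabits -> megabytes per second
print(f"link delivers {link_mb_s:.0f} MB/s; disk sustains {HDD_WRITE_MB_S} MB/s")
print("bottleneck:", "disk" if HDD_WRITE_MB_S < link_mb_s else "link")
```

Under these assumptions the hard drive, not the fiber, is the first thing to saturate.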
>[...] other choke points in the system: it's no use pumping data down the line if our hard drives can't store it quickly enough, or we can't render it as fast as we get it.
Or have the resolution to display it.
Displays have been lagging capture technology for a while, and I don't think it's entirely delivery constrained.
Exactly. Also understand that the 802.11g standard for wireless networking supports a maximum bandwidth of 54 Megabits per second (Mbps), so unless you're wired, you can't really see the difference.
802.11g is quite old, however. The current revision, 802.11n, supports rates up to 450Mbps. My laptop with a 2x2 antenna configuration (and a pricey access point) gets pretty close to its theoretical 300Mbps.
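The end-to-end rate is set by the slowest hop in the path, which is why the fiber is invisible from an 802.11g client. A quick comparison (link rates are nominal maxima; real-world throughput is lower):

```python
# Effective rate is the minimum over every hop in the path (nominal Mbit/s).
FIBER = 1000
LAST_HOP = {"802.11g": 54, "802.11n (3x3)": 450, "wired gigabit": 1000}

for hop, rate in LAST_HOP.items():
    effective = min(FIBER, rate)
    print(f"{hop:>14}: {effective} Mbit/s ({effective / FIBER:.0%} of the fiber)")
```

On 802.11g you see at most about 5% of what the fiber can deliver; even 3x3 802.11n leaves more than half of it on the table.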
Incidentally, once you have a fiber optic connection between two points, you can do a lot better than 1Gbps; a single strand can carry many independent wavelengths.
[0]http://scrawford.net/blog/comcastnbcu-will-raise-costs-for-c...