
Wi-Fi “as free as air” – the totally false story that refuses to die
http://arstechnica.com/tech-policy/2013/02/wi-fi-as-free-as-air-the-totally-false-story-that-refuses-to-die/
======
anigbrowl
There are three things to learn from this story:

1. The advertising-supported media model is broken.

2. People on the internet are distressingly credulous.

3\. There is a _huge_ pent-up demand in the US for a next-generation
communications network at reasonable cost. Somehow we can do long distance
phone service for a low monthly flat rate, and have done for ~20 years, but
you can't get moderate speed consumer internet service in a major metro for
less than about $65/month.

~~~
pseut
I read some of the other comments and don't understand your point 3. Do you
expect huge demand to drive the price down? I'd tend to expect the opposite.

~~~
trentmb
I dunno, bandwidth doesn't seem to be finite in the same sense that oil or
gold are.

Not to mention that bandwidth doesn't 'roll-over.'

~~~
pekk
Bandwidth isn't a non-renewable resource but supply won't increase without
incentive. Why isn't there sufficient incentive?

~~~
gbhn
Most would-be providers of bandwidth to consumers are guarding legacy premium
bandwidth sales. The phone company would like to sell you differentiated phone
bandwidth at 10x market rates. The cable company would like to sell you
differentiated TV bandwidth at 10x market rates.

One approach would be to unbundle internet from premium bandwidth services
offered by the same company over the same channel. Suppose the US was
organized with a tier of companies competing to offer you cheap, reliable
bandwidth, and a tier of companies competing to offer you premium services
riding atop that bandwidth. Other countries have a model similar to this, and
they generally have much better bandwidth services. In the US, we're tied to
legacy investments in carrier infrastructure that make moving to such a system
very difficult.

Personally, I think we should just pay them off and start fresh.
Unfortunately, that ends up basically looking like nationalization, which in
the US is extremely difficult politically.

~~~
crusso
_Personally, I think we should just pay them off and start fresh.
Unfortunately, that ends up basically looking like nationalization, which in
the US is extremely difficult politically._

Actually if the Government broke up the monopolies that have a stranglehold on
connection points and wires to the homes -- competition would sort everything
out without a bunch of needless Government bureaucracy creating huge
unintended consequences.

------
femto
"Cognitive" or "White-space" radio, the subject of the actual FCC notice, is
real though. In simplistic terms, it means making fixed "FCC style" spectrum
management largely obsolete, by pairing each radio transmitter with a
receiver. The receiver is used to monitor the surrounding radio environment,
and the system makes a determination, in real-time, of what (if anything) it
can transmit whilst avoiding interference with other users of the spectrum.

Cognitive radio offers the potential for large increases in data capacity
since the radio spectrum is currently underutilised. I was at a conference a
few years ago, where a presenter had done the measurements, and the
utilisation of the radio spectrum (averaged across time and frequency) was
about 5%. It turns out that most licensees rarely, if ever, transmit.

There's currently a regulatory push to allow others to use a chunk of
spectrum, as long as they don't interfere with the licensee, meaning the
licensee gets guaranteed access, rather than exclusive access. It should be
interesting, since it will open the (cheaper?) alternative of buying smarter
radios instead of buying spectrum.
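The sense-then-decide loop described above can be sketched in a few lines. Everything here (the channel numbers, noise floor, detection margin, and the `measure_power_dbm` callback) is a hypothetical stand-in for real, calibrated RF hardware:

```python
# Toy sketch of a sense-before-transmit ("listen before talk") policy.
# All thresholds are made-up illustration values, not real RF parameters.

NOISE_FLOOR_DBM = -100.0   # assumed receiver noise floor
DETECT_MARGIN_DB = 6.0     # energy this far above the floor = "occupied"

def pick_free_channel(channels, measure_power_dbm):
    """Return the first channel whose sensed energy sits near the noise
    floor, or None if every channel appears occupied."""
    for ch in channels:
        if measure_power_dbm(ch) < NOISE_FLOOR_DBM + DETECT_MARGIN_DB:
            return ch
    return None

# Example with a fake sensor: channel 52 is busy, 56 looks idle.
fake_readings = {52: -62.0, 56: -99.5, 60: -90.0}
ch = pick_free_channel([52, 56, 60], fake_readings.get)
# 52 reads -62 dBm (busy); 56 reads -99.5 dBm (below the -94 dBm
# threshold), so the radio would transmit on 56.
```

In a real system the measurement would be repeated continuously, which is exactly where the hidden-node and vacate-quickly problems discussed in the replies come in.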

~~~
qlkzy
Note that sensing-based whitespace radio has a number of issues. The biggest
one is the hidden node problem - you can interfere with receivers without
being able to hear the transmitter. There is also the issue that the licensed
users of the band need to be very interference-resilient for the first few ms
of their transmissions, until all of the sensing-radio users realise that they
aren't allowed to be there anymore (and then the sensing radios need to have a
consistent way to keep the link up).

The first problem can be mitigated by making the sensing receiver
significantly more powerful than the transmitter, although you still have
issues with specific terrain layouts (e.g. something big and RF-attenuating
between you and the transmitter). The second problem makes it very difficult
to interoperate with _existing_ licensees (who assume exclusive use of the
spectrum); it may be possible to make it work better with new licensees.

There is another interesting approach to cognitive radio, which is to use a
centralised database of spectrum allocation, and then use RF propagation
models & surveys to find gaps in space & frequency where you can let people
transmit - for example, there are big gaps between terrestrial TV transmitter
zones, to avoid the multi-kW transmitters interfering with each other, where
you can easily fit "big WiFi"-style transmitters transmitting at a few watts.

Obviously, there are a number of issues here (need GPS or an accurate location
on installation, need internet access for the DB, need to restrict movement,
inefficient without really good RF models, weather affecting RF propagation)
which make this less of a panacea than sensing-based cognitive radio, but it
is relatively simple and robust to implement. In particular, a lot of the
issues disappear if you use it for "big WiFi" applications - access points
already have a fixed location, internet access, and mains power.
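A minimal sketch of that database-lookup idea, assuming an invented transmitter table and a flat protection radius (real whitespace databases use detailed propagation models and terrain surveys, as noted above, not a fixed circle):

```python
import math

# Hypothetical table of licensed TV transmitters: (lat, lon, channel).
# These coordinates and the protection radius are invented for illustration.
LICENSED = [
    (47.60, -122.33, 31),   # made-up transmitter near Seattle
    (45.52, -122.68, 31),   # made-up transmitter near Portland
]
PROTECTION_KM = 150.0       # crude stand-in for a propagation contour

def km_between(a, b):
    """Great-circle distance via the haversine formula (Earth R = 6371 km)."""
    lat1, lon1, lat2, lon2 = map(math.radians, (a[0], a[1], b[0], b[1]))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def channel_is_free(channel, here):
    """A channel is usable only if every licensed user of it lies
    outside the protection radius of our location."""
    return all(km_between(here, (lat, lon)) > PROTECTION_KM
               for lat, lon, ch in LICENSED if ch == channel)

# Olympia, WA sits well inside 150 km of the Seattle entry, so channel 31
# is denied there; an unlisted channel would be granted.
```

This also shows why the approach needs an accurate location and DB connectivity: the grant decision is entirely a function of where you claim to be.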

It also opens the possibility of a much more dynamic marketplace for spectrum
- if all spectrum users are checking with a centralised DB (at least in a
particular band), then it becomes much easier to handle short-term/local
licenses - for example, providing massive short-term additional cell/wifi
service to big events and festivals.

tl;dr Cognitive radio is a very interesting and promising development, but
it's _much_ harder than initial intuition suggests.

------
dylangs1030
It's pretty clear that two things need to be changed about how articles are
written and propagated on the internet.

First, only primary sources should be cited. If an article is written by a
news source, it stands to reason that it is _not suitable_ for citation by
another news source. They are both at the same level of empiricism; one cannot
be an adequate citation for the other.

And second, the exchange of data has not changed drastically in the days since
the story was first reported, so this isn't an internet speed issue. It's an
issue of people being less pliable to the truth once they've hit upon a
wonderful fallacy.

What that translates to is that news articles need to be less sensationalized,
more factual and have a much more rigorous set of criteria for what's true and
what's not.

I'm not a journalist, but I wouldn't be surprised if journalists deliberately
look for what can be sensationalized in a potential story, to the point of
_unintentionally_ fabricating material - it is not inconceivable that a
reporter simply skimmed some material about the FCC and then this story hit
him like a "Eureka!" moment. I doubt it was fully intentional, it was just
being caught up in the prospect of a great story.

~~~
macchina
And when a primary source can be provided, it ought to be linked, e.g.,
articles based on a study, "new report," or court document. I feel like I am
constantly hunting these things down for myself.

------
Aloha
So while many of the posts seem to be focusing on how to pay for it, there is
another issue: it's not technically feasible. In urban areas at 700 MHz you
would need a very wide block of spectrum, several hundred megahertz, plus
backhaul, to provision something that could offer 50 Mbit service to everyone.
You need this because of frequency reuse issues (700 MHz carries too far; there
isn't enough atmospheric attenuation). In rural areas, you can't pay for the
cost of the infrastructure on the subscriber density; rural areas need a much
lower frequency to make an affordable site density.
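The "carries too far" point drops out of the standard free-space path loss formula, where loss grows with the log of frequency; a quick back-of-the-envelope comparison (distance and frequencies chosen purely for illustration):

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss: 20*log10(d_km) + 20*log10(f_MHz) + 32.44 dB."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# At 1 km, a 700 MHz signal suffers roughly 10.7 dB less path loss than
# 2.4 GHz Wi-Fi -- i.e. it arrives about 12x stronger, so it keeps
# interfering with neighboring cells long after a 2.4 GHz signal would
# have faded, which is what forces the large reuse distances.
delta_db = fspl_db(1, 2400) - fspl_db(1, 700)
```

Lowering transmit power (as the reply below suggests) shifts both curves down equally, so it helps shrink cells but doesn't change the relative reach of the lower band.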

For this to be possible, we need much denser modulation than we currently
have, and a wide block of spectrum. I really believe that for urban areas the
future is mesh, à la Ricochet (even using some of the same methods), which
means cells that extend over a 1-3 block radius and extremely fast/wide
backhaul. The backhaul needs to be automatic and self-healing; wireline
backhaul at anything more than every third to fifth site breaks the cost
quotient. High sites won't work in any event. You could deploy something in
rural areas looking more like the cell phone network - but again, to enable
reuse, you need a really wide block of spectrum.

In either case, be it mesh or cell-site-like things, these have to be
engineered networks; you can't just throw it up and expect it to work. Also,
the chance of great financial success as a commercial entity is low - see
Ricochet and Clearwire.

~~~
surrealize
> 700 carries too far, not enough atmospheric attenuation

I'm no RF expert, but couldn't you decrease power to increase cell density?

~~~
Aloha
To an extent, yes.

That said, for universal wireless anything, you need a considerable block of
spectrum. Consider that between T-Mobile/Sprint/Clearwire/AT&T/Verizon in the
Seattle market, they hold something like 400+ MHz of spectrum. To provide this
kind of service, you would need a similar (though not as large) block of
spectrum. If you just looked at LTE or WiMax, to roll out a nationwide
footprint, you would need something like 250 MHz of spectrum, and that
includes backhaul pairs; this is to provide enough carriers per site that
most people could get a 'similar to cable internet' experience.

There isn't 250 MHz of spectrum (other than the huge swaths the government has
held for itself) in the bands that are useful for both rural and urban
environments.

------
patrickaljord
I've read more stories denying this was true than the reverse. Actually, I
hadn't heard about that rumor before reading all the blogs denying it since
yesterday. Am I the only one here?

------
ars
These types of rumors that can't be squashed are caused by intense desire for
the thing.

People apparently want this so badly that they'll believe anything.

------
MBCook
I love the fact that Current TV was trying to tape a segment at least a day
after their own website had written a piece debunking the story.

Now that's journalism.

------
speeder
This reminded me of that post here on HN last week about how a guy invented
a false lead about the new Xbox, and news sites were running with it without
even sending an e-mail back to him asking for more information.

Even worse, some sites tried to just copy and paste from other sites and
change a bit to look original, and ended up reporting wrong (invented) facts.

------
crusso
What the Washington Post lacks in understanding the facts it makes up for with
a lack of journalistic integrity.

No surprise here.

Notice that the "old media" in the article are the ones that refuse to correct
incorrect reporting. At this point, the only people continuing to subscribe to
their publications can't be driven off with a stick. Their revenue will
finally go dry when their subscribers succumb to old age.

~~~
saraid216
> What the Washington Post lacks in understanding the facts it makes up for
> with a lack of journalistic integrity.

It took me about three passes to understand this sentence. A comma would have
helped.

~~~
crusso
Yeah, but reading it again, I kind of like the way "the facts it makes up with
a lack of journalistic integrity" reads.

------
josh2600
Does anyone think that having Wi-Fi that's "as free as air" would be a bad
thing? If the government was actually doing this, wouldn't everyone react
positively?

Excluding lobbying from telcos, is there any reason that the FCC hasn't
already done this?

~~~
SoftwareMaven
It wouldn't be "Free as air", it would be buried in your taxes along with
twice the bureaucracy necessary to run it. Judging by other government IT
projects, especially at the fed level, it would also be more concerned with
filling out forms in triplicate than actually getting anybody connected to the
Internet.

At the municipal level, though, things seem to run more smoothly and the
amount of tax waste seems to be less. Keep the Feds out other than to provide
the bandwidth, IMO.

~~~
sbmassey
In addition to the inefficiency, putting a large proportion of the
communications infrastructure directly under Federal control would seem to
make political interference way too easy as well. Assuming they could get the
thing up and running as promised, of course.

