
Wi-Fi Alliance introduces Wi-Fi 6 - okket
https://www.wi-fi.org/news-events/newsroom/wi-fi-alliance-introduces-wi-fi-6
======
mrb
Really good whitepaper on WiFi 6 (802.11ax):
[https://www.cisco.com/c/dam/en/us/products/collateral/wirele...](https://www.cisco.com/c/dam/en/us/products/collateral/wireless/white-paper-c11-740788.pdf) Also: [http://www.ni.com/white-paper/53150/en/](http://www.ni.com/white-paper/53150/en/)

Main differences with 802.11ac:

- denser subcarrier (or "tone") spacing: subcarriers are spaced 78.125 kHz
apart instead of 312.5 kHz, so for example an 80 MHz channel now carries 980
data subcarriers, up from 234. That's a 4.19× improvement (4× higher density
doesn't translate into exactly a 4× improvement because 11ac had a few more
"pilot" and "null" subcarriers not used for data).

- a new 1024-QAM mode (encodes 10 bits per symbol, up from 8 bits with
256-QAM). That's a 1.25× improvement.

- the downside is that the symbol duration had to be increased from 3.2 µs to
12.8 µs, and the guard interval from 0.4 µs to 0.8 µs, so each symbol takes
13.6 µs instead of 3.6 µs. That's a 3.78× reduction in per-symbol rate.

So the maximum data rate for a single stream on an 80 MHz channel increased by
4.19×1.25/3.78 = 1.39× between WiFi 5 and WiFi 6 (you can confirm this with
the published max data rates: 802.11ac = 433 Mbit/s, 802.11ax = 600 Mbit/s;
and 600/433 = 1.39).
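The arithmetic above can be checked end-to-end with a small sketch (the 5/6
coding rate at the top MCS and the subcarrier counts are the figures quoted
above):

```python
# Max single-stream PHY rate on an 80 MHz channel, WiFi 5 vs WiFi 6.
# rate = data_subcarriers * bits_per_subcarrier * coding_rate / symbol_time
# Both standards use a 5/6 coding rate at their highest MCS.

def phy_rate_mbps(subcarriers, bits, coding, symbol_us, guard_us):
    """Bits per µs equals Mbit/s, so no unit conversion is needed."""
    return subcarriers * bits * coding / (symbol_us + guard_us)

wifi5 = phy_rate_mbps(234, 8, 5 / 6, 3.2, 0.4)    # 256-QAM, 3.6 µs symbols
wifi6 = phy_rate_mbps(980, 10, 5 / 6, 12.8, 0.8)  # 1024-QAM, 13.6 µs symbols

print(f"WiFi 5: {wifi5:.1f} Mbit/s")    # ≈ 433.3
print(f"WiFi 6: {wifi6:.1f} Mbit/s")    # ≈ 600.5
print(f"gain:   {wifi6 / wifi5:.2f}x")  # ≈ 1.39
```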

But the best feature of WiFi 6 is that different subcarriers can be
concurrently used by different users, which they dubbed OFDMA
([https://en.wikipedia.org/wiki/Orthogonal_frequency-division_...](https://en.wikipedia.org/wiki/Orthogonal_frequency-division_multiple_access)). This means that during the same 12.8 µs timeslot,
even on a small channel like 20 MHz, you can have 9 concurrent users, each
assigned 26 subcarriers and each transmitting 26 different symbols (a total of
234 symbols transmitted concurrently). With WiFi 5, by contrast, all
subcarriers of the 20 MHz channel have to be used by the same user.
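The OFDMA numbers work out as follows (a sketch using the figures from the
comment above; 26 tones is the smallest resource-unit allocation in
802.11ax):

```python
# OFDMA on a 20 MHz WiFi 6 channel: the smallest resource unit (RU) is
# 26 tones, and nine of them fit in a 20 MHz channel, so nine users can
# each send their own symbols in the same 12.8 µs timeslot.

RU_TONES = 26   # tones per smallest resource unit
MAX_USERS = 9   # 26-tone RUs that fit in 20 MHz

concurrent_symbols = RU_TONES * MAX_USERS
print(concurrent_symbols)  # 234 symbols carried per timeslot
```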

~~~
ricardobeat
Due to the increased efficiency, the speed increase is supposed to be much
higher. At CES they showed up to 11 Gbit/s:
[https://www.zdnet.com/article/d-link-asus-tout-802-11ax-wi-f...](https://www.zdnet.com/article/d-link-asus-tout-802-11ax-wi-fi-routers-but-youll-have-to-wait-until-later-in-2018/)

~~~
zamadatix
That's just marketing; AC would have been 7 Gbit/s if measured in the same
ludicrous 8x8 160 MHz setup (and though AC technically only specifies up to
4x4, it doesn't matter: clients are almost exclusively 2x2, even high-end
laptops). Even in a full greenfield ax deployment it'll be a rare sight to see
your PHY rate pop above 1 Gbit/s half duplex as a client.

Like the parent comment said, the actual perceived speed gains in real-world
situations are going to come from OFDMA allowing concurrent use at slower
speeds.

~~~
rsynnott
MacBook Pros have had 3x3 for a while now (except for the non-touchbar 13”
post-2016, which is really more of a MacBook Air internally anyway).

~~~
zamadatix
Yep, it's about the only client I ever see running 3x3. They use a specially
customized version of the 2014 BCM943602 to do it. I don't think I've ever
seen a 4x4 client pop into our logs that wasn't an AP-to-AP bridge, but I know
there are a handful of adapters made which support it.

Interestingly, Intel gave up on making 3x3 cards a couple of years ago but
will sell you a 160 MHz 2x2 ¯\\_(ツ)_/¯

------
kevin_b_er
Wi-Fi Alliance is naming the not-yet-complete IEEE 802.11ax standard as Wi-Fi
6, with ac being 5, and n being 4. One can guess then that a is supposed to be
1, b as 2, and g as 3, but this isn't mentioned anywhere I can see.

As far as I can tell, all of these numbers are new. At least this naming is a
good deal clearer than HDMI's confusing version and feature mix or USB's Speed
names.

What isn't clear is how much control the Wi-Fi Alliance has over the tech
industry and how their branding is used, but it looks like they might be able
to compel a lot of companies to adopt this new naming. They've got standards
for logos on things like your computer or phone, so we'll see if these start
getting adopted by major manufacturers.

~~~
colanderman
Hm, I would expect 802.11-1997 to be 1. Is that 0? Or does it share a number
with b, because they're the same technology? Or is it 1, and a and g share a
number because they're the same technology? Or do a and b share a number
because they were released simultaneously?

(This is, of course, a Very Important Question.)

~~~
WorldMaker
I think 802.11-1997 works as 0 in this case. The implication from the Wi-Fi
Alliance seems to be that they are counting consumer hardware generations, and
there doesn't seem to have been much consumer hardware that was directly
802.11-1997 (it took b to get consumer adoption). Which is why the ordering
I've heard that makes the most sense is 1 (b), 2 (a), 3 (g), because those
were roughly the consumer hardware waves, with the brief (a) hardware wave the
fun out-of-order one.

~~~
Dylan16807
a and b were specced at the same time though, so it's really weird to look at
a+b+g as anything other than two generations.

~~~
WorldMaker
That's certainly a fair viewpoint (and reflected together in the spec version
802.11-1999), but the interesting thing about this rebrand is the Wi-Fi
Alliance taking a step back, realizing that the individual specs don't matter
to consumers and as a consumer brand trying to break away from just using the
spec names/versions.

From that perspective A, B, and G were all sold as three separate things to
consumers, so it makes sense to count all three.

------
joemi
FINALLY. Confusing version numbering always annoys me, and wifi's "n", "ac",
"g", etc version names (combined with the easy to get wrong "802.11") have
been one of the biggest offenders for a while. When a version number is a
marketable metric, it should not be confusing.

Other notable offenders that jump to mind: the Xbox range (Xbox, Xbox 360,
Xbox One, Xbox One X), and the iPhone SE and XR (since they break the pattern
all the rest fit in).

~~~
reaperducer
Maybe there's someone with a sense of humor there and Wifi 9 will be skipped
as a dig at both Microsoft and Apple.

Windows 7, Windows 8, Windows 10

iPhone 7, iPhone 8, iPhone X

Wifi 6, Wifi 7, Wifi 8, Wifi Ten

~~~
piotrkubisa
Well, I think they have already skipped 802.11ad, which initially looked like
the next version [0]. Personally, I know of only one router available to the
casual consumer, made by Netgear [1].

[0]:
[https://en.wikipedia.org/wiki/IEEE_802.11#Standards_and_amen...](https://en.wikipedia.org/wiki/IEEE_802.11#Standards_and_amendments)

[1]:
[http://www.za.netgear.com/landings/ad7200/default.aspx](http://www.za.netgear.com/landings/ad7200/default.aspx)

------
johnzim
Good decision on their part. As laughable as some technical branding sometimes
seems, it usually serves a decent purpose in helping people upgrade
infrastructure.

------
dev_dull
From Wikipedia:

> _Though the nominal data rate is just 37% higher than IEEE 802.11ac, the new
> amendment is expected to achieve a 4× increase to user throughput due to
> more efficient spectrum utilization._

> _IEEE 802.11ax is due to be publicly released sometime in 2019.[2] Devices
> were presented at CES 2018 that showed a top speed of 11 Gbit/s.[3]_

Sounds good, but with limited practical value while home broadband speeds stay
so limited. For businesses this could make a big difference (for example,
editing HD video over WiFi).

~~~
jessriedel
I suspect that for most WiFi users, the overwhelming need is more reliable and
faster establishment of connections. I waste vastly more time dealing with
flaky connections, reconnection after wake from sleep, and network priority
issues than I do waiting in the rare cases where WiFi is limiting the data
transfer rate (rather than the wired connection).

The timescales involved in making WiFi connections should be milliseconds or
less, not human-noticeable (on the order of 10 seconds).

~~~
ericd
I've found Macs to be vastly better at this than my Linux/Windows laptops, for
some reason, to the point where I almost never wait for them to connect. Might
just be some aggressive workarounds they've implemented, though.

~~~
reaperducer
IME, Macs connect faster when the signal is strong. But on a weaker link, it
takes just as long as my Windows and Linux machines.

~~~
ericd
True, this is mostly at home. In cafes and hotels it can definitely take just
as long.

------
ocdtrekkie
This is just such a sensible change, it's amazing it took everyone this long
to come up with it. Figuring out whether something is b, g, n, ac, etc. and
then trying to remember which one comes next (ax) is just not going to filter
down to the regular consumer like... ever.

I can definitely see perks both in "hey, that's the newer network, it's got a
higher version number", and "oh, I guess I do need to buy a new Wi-Fi device,
it only supports version 4", etc.

~~~
chrisfinazzo
At the same time, _some_ version numbers are meaningless (because Chrome), so
I'm ambivalent about the change. Still, this should make it easier for mere
mortals to understand.

~~~
wlesieutre
I don't know about "meaningless" as much as "not something normal people need
to care about."

------
mtgx
802.11ax is 802.11n's true successor and I hope it will be adopted quickly by
everyone. 802.11ac for the 5GHz band was a faux successor, in my opinion, as
you couldn't use it in the same scenarios. It came with some higher
performance, but with major compromises in reach for a typical home.

Wi-Fi 6/802.11ax should last for a while, so I hope the Wi-Fi Alliance starts
focusing on an actual long-range standard that's more of a competitor to LTE
but that works in the unlicensed spectrum and for distances of 1km or longer.
Then it needs to incentivize smartphone makers or smartphone modem makers to
adopt it so that everyone will have it.

This would remove the biggest obstacle towards having a real meshnet.

~~~
garmaine
WiMax?

~~~
DiabloD3
WiMax is an LTE alternative designed by an IEEE 802 committee. It is designed
for last mile delivery and also meets the requirements to be considered a 4G
and 5G protocol.

Although it was technically superior to LTE, phone companies (outside of
Sprint, in the US at least) chose not to deploy it because it was not based on
existing protocols (GSM and CDMA). That isn't an actual legitimate reason,
though, as "true" LTE requires VoLTE deployment, which phone companies refused
to do until forced to; VoLTE-enabled handsets perform a lot better than their
pre-VoLTE or VoLTE-disabled siblings purely because they don't need to waste
valuable spectrum on 3G connections, allowing that band to be reassigned to a
4G radio on the tower.

On top of the VoLTE debacle, companies invested in the GSM and CDMA monopoly
tried to claim WiMax did not perform well at long distances, the same
distances that LTE does not work well with today (found frequently in rural
areas in the US, or similarly in areas heavily shadowed by hills or tall
buildings); however, the adoption of 600/700 MHz to fill in those gaps (plus
the forced adoption of VoLTE to improve spectrum usage) has proven that to be
false.

LTE networks in areas that are partly covered by existing bands, with the gaps
filled in by 600/700 MHz, have finally caught up to WiMax in real-world
testing.

Interestingly, Asia has adopted WiMax heavily but may be switching to LTE in
the future for 5G deployment, due to these continued misconceptions, even
though WiMax beat LTE-A to the commercial gigabit deployment milestone.
Africa's few networks that were WiMAX are switching (or have already switched)
to LTE (driving up the cost, and lowering the reliability, of their networks).

The one place WiMax survived in the US was fixed broadband links (as this is
what WiMax was originally designed for, until it merged with Korea's WiBro
spec more than a decade ago), but even that seems to finally be getting
replaced in favor of LTE-A's fixed profiles.

WiMax will probably beat LTE to working gigabit deployments, although you'll
need to live in Asia for this to be relevant to you.

WiMax is not related to WiFi; although the underlying technologies of the
standards are rapidly converging, they are designed for different purposes.

~~~
kalleboo
WiMAX may have had technical potential in its standards (I am not equipped to
evaluate that), but all the actual deployments of it were just worse than
existing HSDPA networks. 10-20 Mbit/s WiMAX in the US got falsely branded by
Sprint as "4G", which caused companies like T-Mobile and AT&T to also brand
their faster HSDPA networks as "4G", which caused massive confusion in the
market (iPhones in the US will still show HSDPA networks as "4G"! It's kind of
insane).

I live in Asia and actually have a WiMAX plan still active (since it's
grandfathered in on the only still remaining "really really unlimited" plan)
but it always sucked in performance even when I got it, and since then they've
refarmed half the spectrum. They sell "WiMAX 2" now, but that's just branding
- it's just LTE.

~~~
DiabloD3
4G's sticking point is speed. Specifically, the ITU-R IMT-Advanced proposal
requires several things, most of which are easy to meet today. What wasn't
easy was 100 Mbit/s peak speeds for mobile users and gigabit peak speeds for
fixed users.

WiMAX in a lot of markets is not 4G, but neither is LTE in a lot of markets.
In other words, a lot of markets do not, and seemingly never will, have 4G as
defined by IMT-Advanced. I live in a market that is LTE, is sold to me as 4G,
and will never meet the 4G requirements.

WiMAX was developed with proposals like IMT-Advanced in mind: the ability to
have MIMO, all-IP packet switching (non-VoLTE LTE networks can never qualify
to be called 4G networks due to this, btw), 20 MHz and higher channel widths,
spectral efficiency above a certain level (which put a lower limit on how big
your modem's DSP has to be due to the coding techniques), forwards and
backwards compatibility of future specs, and smooth handover between
heterogeneous technology (i.e., tower to home femtocell and back). WiMAX's
original specification (802.16e-2005, which was based on the original 802.16
spec from 2001) met the IMT-Advanced requirements.

LTE _was not_ developed from day one to do this, and did not really meet the
requirements until LTE-Advanced. The original LTE specification (3GPP release
8, 2008) fell short of the speed requirement; LTE-A was defined in release 10
(2011), LTE-A Pro in release 13/14 (2016/2017), and additions to LTE-A Pro for
5G (which do not yet meet 5G requirements) were added in release 15 (2018).

What makes this all interesting is that WiMax could do fixed modems 10 years
before LTE-A added them, was "true 4G" (as ITU-R defines it now, after
everyone rushed to muddy the definition with HSPA+ and whatnot) 3 years before
LTE was, and is currently the only protocol that has hope of deploying gigabit
to fixed users (via 802.16m-2011/802.16-2012, aka WiMax 2 or WiMax-Advanced).

Also, they're trying to sell LTE as 5G. I've already seen ads saying
600/700 MHz support is 5G (it is not, although it is welcome), just like how
they tried to sell highest-order HSPA+ (2x2 MIMO with dual cell and the widest
channels) as 4G (which would more correctly be described as 3.5G, just as the
newest-spec LTE would be best described as 4.5G).

------
ksec
WiFi 6, or 802.11ax, has been in development for some time and has faced lots
of difficulties and controversies.

First, it was discovered that all the major companies were working behind
closed doors in a group called Densi-Fi, trying to fast-track the spec, or
more like neglecting all the issues around it to push forward the time to
market. This was discovered by some other IEEE members, and the problem has
since been "resolved". The Densi-Fi section in the 802.11ax Wikipedia article
has since been deleted, despite many attempts to bring it back. The 802.11ax
committees continue to push forward, and (correct me if I am wrong) are there
any other IEEE specs that failed to pass in all of their drafts? Drafts 1.0,
2.0 and 3.0 of 802.11ax all failed to pass the vote, with Draft 4.0 being
pushed through and forcibly passed while all the comments (~2300 of them)
remained the same and unresolved from Draft 3.0. Much like 802.11ac, there
will be a Wave 1 and a Wave 2. Wave 1 does not include uplink MU-MIMO,
80+80 MHz channels, and some other things I can't remember off the top of my
head.

I am not sure what to make of this, because it reads to me as a giant mess,
and I don't want to be the guinea pig for this new spec.

------
yholio
This time, maybe they'll hire some cryptographers too, instead of letting the
network engineers design the security features. For crying out loud, they took
their sweet 20 years to actually create a protocol with forward secrecy and
resistance to offline attacks - WPA3.

------
sargun
It’s kind of a bummer to see 802.11ad being ignored. I use 802.11ad for game
streaming in my house, and it’s great.

It’s also great for the “access point per room” story, if there were a
mechanism to have cheaper APs and do handoff.

~~~
bjoli
I suspect the numbering will only be used for more or less backwards
compatible standards. I doubt they will break compatibility with N anytime
soon.

------
Al-Khwarizmi
I have quite a long apartment with reception problems in some rooms, and was
planning to buy a mesh system (like the TP-Link Deco M9 Plus or similar) to
replace my aging range extender (which is stuck on 802.11n now that I have an
802.11ac router).

I'm more of a software guy, don't know that much about hardware and
networking, so can any expert on these things give an opinion as to whether
this is a bad moment to buy it? Would it be better to wait for 802.11ax/6 to
arrive? When can we expect to find hardware (such as mesh kits) reliably
supporting the new standard?

~~~
rayiner
802.11ax (or WiFi 6) is going to primarily benefit you at 5 GHz, which works
best where you have line-of-sight between the device and the AP. If you're
looking into range extenders, you probably don't have line of sight, and it's
not worth it.

Mesh WiFi sucks.[1] It takes WiFi unreliability and latency and compounds them
by adding hops. Just bite the bullet and put in multiple APs, all connected
to a router via Ethernet.

[1] Mesh [any sort of wireless] sucks.

~~~
fyfy18
> Mesh Wifi sucks.

As an alternative that doesn’t suck, you want multiple hardwired access
points. Obviously if you are renting that is tricky. Ubiquiti UAP-PROs are
pretty good for home use, and a couple, one at each end of your apartment,
would be faster (and probably cheaper) than a prosumer mesh setup.

------
femto
Cunning. Why have 5G, when you can have Wi-Fi 6?

I'd posit that this renaming is part of the ongoing competition between Wi-Fi
and LTE, prompted by the advent of 5G.

------
ezoe
I like the new numbering system, which lets me quickly deduce the generation
of a standard. No more a/b/g/n/a-something.

------
schen57
Can someone explain what this means for consumers? Who benefits the most from
this new release?

~~~
azinman2
Ordinary people who go to buy a device can know what “version” a router or
laptop/phone supports. It makes it much clearer, when comparing two routers,
which supports the newer technology, rather than having to know the difference
between .ac and .n.

------
shmerl
Looks like penetration and range decrease with each new WiFi generation.

~~~
kalleboo
Not really. There's only the 2.4 GHz and 5 GHz split.

And the problem with "long range" is that everyone who doesn't live in a
detached suburban house now has massive noise on the 2.4 GHz band from their
20 neighbors so that everyone gets terrible speeds. The weaker
penetration/range of 5 GHz solves that by right-sizing everyone's network.

edit: researching the updates in 802.11ax, it looks like Wi-Fi 6 will bring
the improvements from 802.11ac to 2.4 GHz as well, so it will actually improve
performance for people who need range as well. It also has guard interval
improvements for outdoor environments.

~~~
kbirkeland
The larger problem with the 2.4 GHz ISM band is the lack of non-overlapping
channels, along with only one station being able to transmit at a time per
channel (pre-802.11ax). There are only three non-overlapping 20 MHz channels
in the 2.4 GHz unlicensed spectrum, whereas the 5 GHz spectrum has about 21
(although some require active radar avoidance).
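The three-channel figure follows from the geometry of the band. A quick
sketch (the greedy pick below lands on 1/5/9; the conventional plan uses
1/6/11 with a little extra margin, but either way only three fit):

```python
# 2.4 GHz channels are numbered 5 MHz apart but each is ~20 MHz wide,
# so two channels overlap unless their numbers differ by at least
# 20 / 5 = 4. Greedily picking from the US channel set (1-11):

SPACING_MHZ, WIDTH_MHZ = 5, 20
MIN_GAP = WIDTH_MHZ // SPACING_MHZ  # minimum channel-number separation

picked = []
for ch in range(1, 12):
    if not picked or ch - picked[-1] >= MIN_GAP:
        picked.append(ch)

print(picked)  # [1, 5, 9] -- only three non-overlapping channels fit
```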

~~~
forapurpose
> The larger problem with the 2.4 GHz ISM band is ... only one station being
> able to transmit at a time per channel

Is there any Wifi tech for which that isn't true, sans beam-shaping and other
tech that effectively puts stations on different physical networks? How do 5
GHz technologies handle multiple simultaneous broadcasts on the same channel?
How will 802.11ax do it? Signal processing?

My impression, based on only a little research, is that it's impossible with
any tech. Even cell providers need CDMA, TDMA, etc. Maybe that understanding
is out of date?

~~~
kalleboo
Maybe this?

[https://en.wikipedia.org/wiki/IEEE_802.11ax#Technical_improv...](https://en.wikipedia.org/wiki/IEEE_802.11ax#Technical_improvements)

> Spatial frequency reuse

> Coloring enables devices to differentiate transmissions in their own network
> from transmissions in neighboring networks.

> Adaptive Power and Sensitivity Thresholds allows dynamically adjusting
> transmit power and signal detection threshold to increase spatial reuse.

> Without spatial reuse capabilities devices refuse transmitting concurrently
> to transmissions ongoing in other, neighboring networks. With coloring, a
> wireless transmission is marked at its very beginning helping surrounding
> devices to decide if a simultaneous use of the wireless medium is
> permissible or not. A station is allowed to consider the wireless medium as
> idle and start a new transmission even if the detected signal level from a
> neighboring network exceeds legacy signal detection threshold, provided that
> the transmit power for the new transmission is appropriately decreased.
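The decision rule in that quote can be sketched roughly like this (the
threshold values are typical illustrative numbers, not the spec's exact
parameters, and `may_transmit` is a made-up name):

```python
# Rough sketch of 802.11ax spatial reuse ("BSS coloring"): a station
# defers to frames from its own network at the legacy threshold, but may
# transmit over a frame from a differently-colored network if the signal
# is below a relaxed threshold (provided it also reduces its own transmit
# power, which is not modeled here).

LEGACY_CCA_DBM = -82   # typical legacy "medium busy" threshold
OBSS_PD_MAX_DBM = -62  # relaxed threshold for other-color frames

def may_transmit(detected_dbm, same_color):
    if same_color:
        return detected_dbm < LEGACY_CCA_DBM  # always defer to own BSS
    return detected_dbm < OBSS_PD_MAX_DBM     # other BSS: relaxed rule

print(may_transmit(-70, same_color=True))   # False: defer to own network
print(may_transmit(-70, same_color=False))  # True: transmit at lower power
```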

~~~
forapurpose
That makes good sense to me, but it's still basically keeping stations on
separate networks - it's managing the networks' transmission power carefully
so they can run in closer proximity.

I still don't know of a solution for concurrent broadcasts on the same
network, as the GGP comment seemed to imply exists.

------
riffic
This is so dumb, no one's even using Wi-Fi 1 through 5 yet.

~~~
craftyguy
> no one's even using Wi-Fi 1 through 5 yet.

Yea they are.

