Proponents of spectrum auctions should be ashamed of themselves.
If you pop up a spectrum analyzer anywhere there are people, you'll see loads of traffic jammed into the "free-for-all" ISM bands, you'll see AM & FM radio, and you'll see precious little else. If you have a really good spectrum analyzer, you'll be able to spot the occasional cellular traffic, but by and large the spectrum that has been auctioned off to corporate owners goes almost entirely to waste.
50 years ago when fixed-frequency narrowband receivers were all we had, this made sense. It was a necessary compromise. Nobody needed bandwidth but everybody needed a dedicated frequency to accommodate their primitive hardware. Today it's the opposite, and the relative usage of ISM bands vs everything else proves it. The radio-frequency situation makes the real-estate markets with block after block of unoccupied apartments look downright efficient by comparison. We need to fix this.
The reason the ISM bands look so "hot" or busy on a spectrum analyzer is exactly because they're a "free for all." It's a classic tragedy of the commons: everyone keeps "shouting" at max volume to be heard and there's no coordination between operators and users of different services.
Here's an example: in a typical apartment block, each resident runs their WiFi access point at full power. In any one apartment, there's a ton of interference from neighboring units. If everyone coordinated and reduced their WiFi power to just what was necessary to cover their unit, it would result in less interference and higher net data rates for everyone. But no one is incentivized to take action unilaterally.
LTE and 5G networks are very carefully planned to limit inter-cell interference and to manage handover zones between towers. This central planning maximizes the efficiency of the network. It allows your cell phone to connect to a tower 10km away if you're in a rural area, and results in higher spectral efficiency for the overall network by increasing signal-to-noise ratios for all users.
With a more sensitive spectrum analyzer, you'll see that the cellular bands are actually extremely busy. In fact, macro networks are almost always at capacity during peak usage. However, all the devices are "speaking" at the lowest volume necessary to be heard.
> "If everyone coordinated and reduced their WiFi power to just what was necessary to cover their unit, it would result in less interference and higher net data rates for everyone. But no one is incentivized to do take action unilaterally."
This is something that needs to be fixed in future WiFi specifications and enforced via device certification. You're lucky if a typical WiFi base station operator understands how to set channels, let alone adjust TX power levels.
People expect to be able to plug in a base station and have it "just work", so coordinating with other nearby stations to optimize spectrum usage, minimize TX power, etc is something that needs to be automatic.
The current situation is particularly disappointing when you realize that a lot of WiFi devices have been capable of adjusting TX power on a per-packet basis (per frame, really) at least as far back as 802.11n hardware. So your router should never be shouting at your laptop that's in the same room, even if it would use full transmit power to reach a device in the attic/basement/yard.
This made me realize how long it's been since I actually bought a router. The last time I was shopping around, getting 802.11n significantly narrowed the options.
I'm impressed. The only good router I have left from that era has several Ethernet ports burned out, and the rest have too little storage and RAM to accommodate a recent OS with features like IPv6 support. And those are purchases from an era where 802.11n support was a given but dual-band 802.11n support was still reserved for higher-end products.
... and to make matters even worse the solutions being marketed for fixing a lot of these problems are... MORE ACCESS POINTS!
Every time a friend tells me about getting a mesh system to cover dead zones in their place I cringe at the thought of more 2.4 GHz blasting everywhere. 99% of the time the problem can be solved by better placing their current access point, whether that means moving their ISP provided router/AP, or installing a standalone AP and disabling whatever they're currently using for WiFi.
I don't really know the answer to fixing these issues, but it seems to me it's not technical; it's better information about placing the devices. End users just want it to work but don't have the time to learn how best to do that. ISP installers are not incentivized to care either: install as cheaply and quickly as possible and get out.
Mesh definitely isn't the answer though. It's the anti-answer.
But it's part of the answer, isn't it? As I understand it, the answer in general is more, smaller access points, each with less TX power. My wish would be that those mesh extensions wouldn't then rebroadcast the signal at max 2.4 GHz power, but then you'd be pilloried by ignorant YouTube reviewers who rank solely on "how many bars at max distance".
In most cases I feel your best position is in the ceiling at the center point of the house. I did that myself but at significant cost - fishing cat5 thru walls and ceilings and repositioning the incoming wires to the basement. 99.9% of users will never take that approach.
That could be true. However, it's better to use wired backhaul than more WiFi for the backhaul. Commercial locations do many access points with smaller broadcast area but use copper to bridge everything together.
You're right about the location aspect though. 99% of people will not be bothered. How do we change that? The ISPs could be incentivized to do higher quality install work perhaps? I'm not sure what the answer is.
The eruption in popularity of pro-sumer installs using gear like Ubiquiti gives me some hope that slowly people will see the way. I've installed UAPs at 5 homes now over the last 6-7 years. Zero support calls.
I gave some thought years ago to "leasing" a ubnt setup to wean people off renting modems from their isp. Couldn't make the math work tho.
I have Ubiquiti equipment and wired backhaul too. But even among my tech nerd buddies it's hard to convince people that it's worth spending the time and money to run cable and multiple APs.
For their part, ISP-provided equipment has improved dramatically. They must make a killing on the rental fees. Plus, they have a vested interest in reducing WiFi service calls as well.
Wired is an option if you own a house. For everyone else, Ethernet is a no-go unless you're OK routing it outside the walls. There's Ethernet over power lines, but whether that works depends on your electrical installation.
Back in my last apartment we did powerline Ethernet and it worked OK so long as the PCs weren't on the same electrical circuit as the Ethernet adapters. I think the huge power draw from the gaming PCs was causing interference with the adapters.
Ah, interference is an issue too. I was actually thinking of wiring that makes half of the house completely separate. (Seen it in practice, not sure how that works on the wiring level)
That split comes from US residential electricity being 240V split-phase AC. Adjacent circuit breakers are on different phases that each provide 120V, and double-wide circuit breakers allow appliances like dryers and ovens to use the full 240V difference between phases. If your house (like mine) puts all the lighting circuits on one phase and all the 120V outlets on the other phase, then you'll probably have relatively little trouble with powerline networking going beyond a single circuit breaker. But if you're trying to get powerline networking to work between outlets on different phases, you'll have more trouble (but I've seen it suggested that it might work better while the dryer is running and providing more coupling between phases).
Side note: the transformer on our street once had one of its phases fail. All the outlets in our house dropped to some useless voltage like ~30V AC, but the lights worked fine. Next door, the lights went out but the outlets worked fine. The light in the clothes dryer came on at half brightness. This was very puzzling, not least because we were all used to issues with that transformer knocking out power entirely when it was hit by another teenage driver. Most of us didn't understand until then how a transformer could fail halfway.
After a windstorm my outlets weren't working but the lights were. Took me a while to figure out what was going on, but one of the hot wires had come loose at the weatherhead. The other hot and neutral were still connected. PG&E came out to fix it within an hour.
Before: One flaky wifi connection between AP and client.
After: Two wifi connections, both of which must work. Using a combination of operating conditions, AP and client that have proved unreliable in the past.
Now with no visible status for the middle link, and a second DHCP server on the repeater, handing out conflicting addresses when the middle link is down.
The typical home setup guide tells you to disable DHCP on your repeater and delegate to the original box. I was told to do that when I set up something similar due to living in a U-shaped apartment with no nice central position for the router.
Naturally DHCP is handled by the original network when the original network can be seen, yes!
But if the original network is unavailable - perhaps the SSID has changed, for example - you need to be able to get to the repeater's configuration page somehow. And the user can't google their problems or download any configuration tools while their connection is down, so you'd better make it simple. So many devices like this will be transparent under normal operation, but will start their own DHCP server if there's a network problem.
But mesh means a few plugged-in small devices. Your solution means calling your ISP to have the fiber come to a different place in your house and/or buying another router, having to pull cables (= drilling, cables, ...), which is not allowed in some cases (when renting), pulling power cords (if there are no outlets at the optimal position), and having a blinking box in the middle of your dining room.
> 99% of the time the problem can be solved by better placing their current access point, whether that means moving their ISP provided router/AP, or installing a standalone AP and disabling whatever they're currently using for WiFi.
This argument is specious and unrealistic. 99% of the time you cannot move the current access point. Everyone tries that first, and it doesn't work. You're stuck with the fiber AP in a corner of your home, and you're not going to place your WiFi AP as your dinner table's centerpiece. And reinstalling APs is far more expensive than adding a repeater.
I mean, are you aware that a WiFi repeater nowadays goes for less than $20? You order it, get it immediately delivered to your home, plug it in, press a button. Bam. Problem solved. No recabling, no technician, no appointment, no wasted afternoon. Nothing. And you don't have to go through the WiFi seance session whenever you move your laptop. Just plug in the repeater and you're done.
I agree. Cabling homes is not as easy as it seems once you're settled in. Also, not every home is built the same way, so you may need to drill through a lot of concrete walls.
Repeaters are no use with thick walls (I have quite a few). A powerline + access point combo takes a lot of space and is not cheap. Rewiring a home is not always possible. Also, not all APs have power settings, so you can't optimize power the way you want (again, thick walls). So it's a hard problem.
Throwing "just re-wire it, eh?" into the middle is not helping.
> People expect to be able to plug in a base station and have it "just work", so coordinating with other nearby stations to optimize spectrum usage, minimize TX power, etc is something that needs to be automatic.
... isn't it? My internet box is from something like 2014 and is able to choose the least-used WiFi channel automatically.
Power usage is based on the worst-case attenuation scenario; the last thing ISPs want is calls asking why WiFi doesn't work well in the bedroom when the router is on the other side of the apartment/house. Thus transmitting at max allowable power is the best solution.
> Thus transmitting at max allowable power is the best solution.
Transmitting at the max allowable power is the selfish and dumb solution. It isn't the best solution, in a world with multiple overlapping WiFi networks. It isn't even a Nash equilibrium.
Why do you think it isn't a Nash equilibrium? Any actor unilaterally choosing to transmit at lower power is going to experience worse signal quality, even if it's only marginally worse in some cases.
> Any actor unilaterally choosing to transmit at lower power is going to experience worse signal quality, even if it's only marginally worse in some cases.
Yes, your received signal to noise ratio for that transmission will be somewhat decreased, and that might (or might not) hurt data throughput meaningfully. But it also means that your transmissions are causing less interference for your neighbor(s), and as a result your neighbor's equipment will not have to send as many retransmissions that potentially interfere with your own communications. It's a second-order effect, but also one that's proportional to how many neighbors you have in range that are trying to use the same slice of spectrum. So it's entirely possible that by unilaterally shouting less, you cause a significant reduction in wasted airtime that benefits both sides (though probably benefiting your neighbors more than you, depending on where your priorities lie between throughput and latency). It's not always a zero-sum game.
In general (at least for WiFi), it's better to worry about airtime as the scarce resource rather than focusing on bytes per second. For fixed-rate wired networking, transmit time and bytes delivered are exactly proportional, but a variable-rate shared wireless medium decouples the two.
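To put toy numbers on that, here's a quick sketch (rates are illustrative MCS-style values; real 802.11 airtime also includes preambles, ACKs, and contention overhead, which make slow links even worse):

```python
# Airtime needed to move the same 1 MB payload at different PHY rates.
payload_bits = 8 * 1_000_000

for rate_mbps in (6, 54, 300, 866):
    airtime_ms = payload_bits / (rate_mbps * 1e6) * 1e3
    print(f"{rate_mbps:4d} Mb/s -> channel busy for {airtime_ms:7.1f} ms")
```

A client stuck at a low rate occupies the channel for orders of magnitude longer to move the same bytes, and that airtime is unavailable to everyone else in range.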
While you are right, this is still a Nash equilibrium. Nash equilibria aren't supposed to be "good"; they just are equilibria (i.e., any actor is worse off by deviating).
I don't think you quite understood what I was saying: that decreasing your transmit power can actually lead to better overall network performance for you (and your neighbors) without requiring any corresponding change in strategy from your neighbors. Therefore it is not always an equilibrium (this may depend on the severity of local congestion).
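To make that concrete, here's a toy two-neighbor game. The payoff numbers are entirely made up, but they encode the retransmission effect described above, and they're enough to show that "everyone at max power" isn't automatically an equilibrium:

```python
# Toy game: (my_power, neighbor_power) -> my goodput in Mb/s.
# MADE-UP numbers: at (hi, hi) both lose heavily to retransmissions;
# dropping to lo costs a little link margin but frees a lot of airtime.
payoff = {
    ("hi", "hi"): 40, ("hi", "lo"): 80,
    ("lo", "hi"): 55, ("lo", "lo"): 90,
}

def best_response(neighbor_power):
    return max(("hi", "lo"), key=lambda mine: payoff[(mine, neighbor_power)])

# (hi, hi) is an equilibrium only if "hi" is the best response to "hi".
print(best_response("hi"))  # -> lo: unilateral deviation pays off
print(best_response("lo"))  # -> lo: (lo, lo) is the stable point here
```

Whether the real numbers look like these depends on local congestion, which is exactly the point: it's not an equilibrium in all conditions.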
Only if every ISP cooperates and there isn't a single bad actor (e.g. individuals) boosting their own transmit power for their own good.
Also, TX power is not all about interference: there's attenuation, and a very good reason to transmit at max power for the best wall penetration/range. Even in a suburban/rural setting, with more and more devices connected remotely, you need the range. It's not in a manufacturer's best interest to decrease the range of their routers. Only a small fraction of users will look at setting the TX power / investing in a better antenna / setting up a mesh network. The first reflex will always be "buy a new router".
So once again, max legal transmit power is the best overall bet.
> Only if every ISP cooperates and there isn't a single bad actor (e.g. individuals) boosting their own transmit power for their own good.
Could you please try to re-write that sentence in a way that gives me some indication that you actually read and understood my last two comments? Because it seems like you're simply repeating an assertion without addressing any of the points I brought up, and I'm not sure whether my explanations were inadequate or just unwanted.
The points you brought up are irrelevant to the "effective range" argument. Range requires TX power, so maximum legal range is inherently the same as maximum legal TX power.
For frequency use, this already happens with automatic channel selection between neighbours, and DFS for vacating a regulated frequency as soon as a radar signature is detected. WiFi 6 adds more efficiency at a higher layer with BSS coloring to mitigate crowded channels.
But as you said, most consumer wireless access points are still shouting over each other by default and it needs to be in the specification because there's zero incentive for a manufacturer to "worsen" their RF performance.
It's a pointless consumer market IMO, most people don't need a Wireless LAN. Mobile carriers already provide Internet access to thousands of devices with just a handful of radios over several city blocks, but pricing means many homes end up standing up their own emitters for exactly the same purpose. Now we see bickering like this over what should be a far better coordinated EM spectrum.
Worth noting: most business/enterprise gear, which typically gets installed professionally, usually defaults to adjusting TX power automatically, because more often than not it benefits the RF environment.
people very often need to set up their own WAP because cellular providers sell metered connections on the order of GB/month, whereas comcast (the most prevalent ISP in america) meters at TB/month. in addition, not every device currently has the radio / components necessary to receive cellular data
also there's the limitation that cellular providers often don't allow tethering (or more generally data access outside of the device the sim card is slotted in)
What I was getting at but forgot to include in my comment. Unfortunately it’s still a distant dream of mine that all carriers, fixed or mobile, would simply provide a dumb pipe billed based on max throughput and nothing more.
But if your apartment is large and you need coverage in your bathroom at the other side of it, you need more power... this can't be automated, and manufacturers prefer higher range over support calls from users and having to help them raise the power manually (especially when compared to an old AP, which had that range).
What about 802.11h? In addition to defining DFS, it also provides for Transmit Power Control (TPC). I was under the impression that it's an FCC requirement for 5 GHz 802.11a WiFi devices.
Here, it appears to be working as expected. I see messages like "Limiting TX power to 27 (30 - 3) dBm as advertised" in the kernel logs of my linux devices.
Is TPC not well supported?
edit: Whoops, I think TPC isn't really that interesting, and I'm confusing it with DTPC, which is Cisco-proprietary.
Ummm... All WiFi devices adjust bitrate dynamically...
For a given amount of data to be transmitted, adjusting bitrate up and freeing the channel quicker is equivalent to using a lower power for more time. Either way the disadvantages to other spectrum users are equal.
> For a given amount of data to be transmitted, adjusting bitrate up and freeing the channel quicker is equivalent to using a lower power for more time.
Shannon says that it is an exact equivalence, provided transmit power is >> thermal noise (it is)... If in real-world devices that isn't the case, that's a shortcoming of the devices or protocol.
Well, obviously in the real world, devices do not have unbounded transmit rates. But you can get pretty damn close to the transmitter. The Shannon limit isn't the only relevant limit, and is no reason to dismiss out of hand an entire class of optimizations that do help in the real world.
(Also, even if your equipment is capable of hitting the Shannon limit, I'm not convinced that transmitting at the highest power level is going to be optimal for multi-party communications with geographically distributed APs and stations. The information channel between my neighbor's AP and his laptop is neither identical to nor completely separate from the information channel between my AP and my laptop.)
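As a back-of-envelope check (all numbers made up, and assuming ideal Shannon-capacity devices), the equivalence doesn't even hold on paper once you count total energy radiated for a fixed payload:

```python
import math

# "Louder but shorter" vs "quieter but longer" under C = B*log2(1 + P/N).
# Illustrative: 80 MHz channel, arbitrary noise power, 1 MB payload.
B = 80e6    # bandwidth, Hz
N = 1e-10   # noise power at the receiver, W
D = 8e6     # payload, bits

for P in (1e-9, 1e-8, 1e-7, 1e-6):   # received power, W
    C = B * math.log2(1 + P / N)      # achievable bits/s
    T = D / C                         # seconds on the air
    print(f"P={P:.0e} W  T={T*1e3:6.2f} ms  energy radiated={P*T:.1e} J")

# Capacity grows only with log(P), so the airtime saved never pays back
# the extra power: here total energy on the air rises ~5-8x per 10x in P.
```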
> The reason the ISM bands look so "hot" or busy on a spectrum analyzer is exactly because they're a "free for all." It's a classic tragedy of the commons: everyone keeps "shouting" at max volume to be heard and there's no coordination between operators and users of different services.
This would be a lot more believable if the unlicensed spectrum weren't already under low regulatory limits on power levels.
You can always come up with some kind of system that will extract a little more efficiency out of things, but the reason most things use maximum power is that "maximum power" is already low enough that you can barely provide decent coverage from an access point on one side of your apartment to a device on the other.
The reason the unlicensed spectrum is busy is that it's busy. Because there isn't enough of it.
And the reason licensed spectrum isn't as busy is that incumbents buy more spectrum than they strictly need because it's a mechanism of excluding competitors.
It's really hard to believe the theory that the WiFi bands are poorly regulated and all the routers are shouting over each other and filling the spectrum, when your router can barely reach through 2-3 rooms and 1 floor, with no nearby stations to blame for interference. The issue is obviously the transmit power, which is already set so low (while perhaps at the maximum allowed setting) that the reach is only a couple of rooms. How much weaker could it even get? To only reach within one room? What would be the point of WiFi then?
The radius at which your WiFi AP causes interference with other networks is larger than the useful radius at which your client devices can maintain a decently fast connection speed.
And for many APs with poor management of airtime, having a client device on the fringe of your coverage area can cause your router and that client device to both waste a disproportionate amount of airtime shouting at full TX power and a low data rate, maximizing the amount of interference you're causing to anyone else trying to make use of that channel.
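A rough sketch of that mismatch, using a simple log-distance path-loss model (the thresholds and exponents are illustrative assumptions):

```python
# How much farther does a WiFi transmission interfere than it works?
#   -65 dBm: decent signal for a fast MCS (assumption)
#   -82 dBm: typical 802.11 preamble-detect / carrier-sense level
tx_eirp_dbm = 20   # transmit EIRP
pl_1m_db = 40      # approx. path loss at 1 m for 2.4 GHz

def radius_m(threshold_dbm, n):
    # Solve: tx_eirp - pl_1m - 10*n*log10(d) = threshold, for d.
    return 10 ** ((tx_eirp_dbm - pl_1m_db - threshold_dbm) / (10 * n))

for n in (2.0, 3.0):   # free space vs. cluttered indoor
    useful, nuisance = radius_m(-65, n), radius_m(-82, n)
    print(f"n={n}: useful ~{useful:.0f} m, interferes out to ~{nuisance:.0f} m"
          f" ({nuisance / useful:.1f}x the radius)")
```

The interference footprint is several times the useful radius, so many times the area, which is why your AP bothers neighbors it can't usefully serve.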
Yes, but what is your proposed alternative? It is still the lowest possible amount of interference in terms of TX power. If you decreased it, the WiFi would not even be useful (if it only worked within a single room)...
Your claim that WiFi "can barely reach through 2-3 rooms and 1 floor, with no nearby stations to blame for interference" is simply not believable, when it's easily demonstrated that WiFi can cover a whole house and a good sized yard in a truly quiet RF environment. Without interference, covering a single apartment is no trouble at all unless the building materials for interior walls were maliciously chosen.
I think you're simply misidentifying the cause of your connectivity issues, and blaming low transmit power when your real issue is that you have far more competing traffic on the 2.4GHz band than you think, and thus a shortage of quiet airtime. And lowering transmit power of everything (preferably done automatically and adaptively on a per-device basis) is the solution to the latter problem, while using higher transmit powers on everything can only make the latter problem much worse.
There is an odd public misconception that "wireless" means a full connection through two walls of rebar-filled concrete...
If you think that you can have pleasant, modern, net surfing experience without an AP in each room, then you can't say you know much about radio communications.
WiMAX delivered a pleasant, modern experience without an AP in each room when that was operating at 2.5 GHz in the US; one tower would provide the coverage you describe for two thirds of a mile in any direction.
Sadly, this 250+ MHz chunk of spectrum is now held by T-Mobile, which uses only 40% to 60% of it for LTE and 5G coverage. There is plenty of spectrum left over to double the size of the 2.4 GHz ISM band used for WiFi, but the FCC has let Sprint/T-Mobile misuse this 2.5/2.6 GHz spectrum that was licensed for educational use only (explicitly for educational institutions to use for educational purposes, like inter-building wireless links for campus TV stations).
Surfing the internet is a pretty low bar for WiFi networks, and has been easily passed for 10+ years.
The fact is that wireless signals do penetrate walls, and putting an access point in every room would probably do more harm than good in terms of increasing interference.
> Here's an example: in a typical apartment block, each resident runs their WiFi access point at full power. In any one apartment, there's a ton of interference from neighboring units. If everyone coordinated and reduced their WiFi power to just what was necessary to cover their unit, it would result in less interference and higher net data rates for everyone. But no one is incentivized to take action unilaterally.
There are a lot of assumptions here. First of all, none of my wireless access points, except a single range extender, has transmit power settings. And in that case, the setting is ganged for both the 2.4 GHz and 5 GHz radios. So I can't take this action unilaterally even if I want to.
Second, wall thicknesses and home geometries are very different across the world. Both my and my family's houses have very thick walls and tall storage units which effectively double the wall thickness so, with a relatively small house, I can't have reliable wireless everywhere at home even with two access points.
I could increase coverage with more, less powerful access points but it needs recabling in a small, packed house which is not feasible. If I want to use range extenders with wireless repeating, trying to reduce wireless power becomes moot. This house runs on ADSL and the phone lines are at very strange places so both houses are effectively legacy systems with very small wiggle room.
FWIW, I don't think OP was necessarily saying that is a problem that needs a solution, so much as it's an explanation for the relative noise seen by GP. That said, it would be interesting to see coordination between access points in neighboring networks.
This is pretty much true, but with a catch - wifi is contention based. If two access points are close enough to hear each other's transmissions and they are operating on the same channel they will share that channel's bandwidth. Wifi was designed this way for exactly this reason, and it's probably why wifi still kind-of works even in large apartment buildings.
> This central planning maximizes the efficiency of the network.
On one hand I see unused bandwidth*time, on the other I see applications that want to use more bandwidth and can't because of the obscene monetary cost. So no, the central planning clearly does not maximize efficiency.
It does maximize the efficiency with which the telecom oligopoly removes the money from our wallets, though. I'll grant it that.
> If everyone coordinated and reduced their WiFi power to just what was necessary to cover their unit, it would result in less interference and higher net data rates for everyone
Our apartment is short and long, and the fiber comes in at one end. So of course we put the WiFi AP at the end of the apartment where the fiber comes in, and it needs to be at full power to reach the other end. Of course, now the person on the other side of the wall will be blasted with signal.
A planned network from a provider would instead have 2-3 APs spread out through the middle of the apartment at a low power. But that costs 2-3 times more, plus now you have to drill through the walls to wire them (which we can't do as we're renting).
There's a lot more difference between the usage of the ISM band and the licensed bands than "if everyone just lowered their power a bit it would be great". Since ISM band usage can't be planned to the same extent, it should have MORE spectrum than the planned bands, not less.
Yes! Compare a highway to a subway at rush hour. The road will be full of cars while the track will be empty most of the time, but the track will be carrying many more people per minute than the road due to more efficient allocation.
This analogy provides the argument for packet aggregation and maybe for centralized packet scheduling (as on cell networks), but I don't see how it works for anything related to transmit power.
You can't blame customers either; consider that the average Joe doesn't have any time to think about all of that and only knows the bare basics of plug-and-play.
Furthermore, the IEEE spec itself has no standardised protocol to automatically adjust signal power between devices, which is an oversight on their side.
An alternative solution is to line the floors and roof of your apartment block with RF-blocking material so only your WiFi exists within it. Causes problems for phones not on the WiFi network, I guess, but maybe it's worth it.
Eh it's debatable but I don't think this is entirely accurate.
WiFi is very good at maintaining maximum transmission rates. High power is needed when doing stuff like 64-QAM over the air.
If wifi was louder than it needed to be, you wouldn't have to be mere feet away from the router to get maximum speed.
WiFi bands are saturated because there's so little spectrum for a widely used service. Everyone has multiple WiFi transmitters in their house, yet there's only enough spectrum for three non-overlapping channels.
Imagine there was enough spectrum for 30-40 wifi channels. There would be hardly any overlap and everybody would have better range. Maybe 500-1000 feet line of sight.
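For reference, the arithmetic behind "three non-overlapping channels" (US channel plan; the width assumption is noted in the comments):

```python
# 2.4 GHz channel centers are only 5 MHz apart, but each transmission
# is ~20 MHz wide, so neighboring networks need ~4 channel numbers of
# spacing. US channels 1-11; centers in MHz.
centers = {ch: 2407 + 5 * ch for ch in range(1, 12)}
width = 20  # MHz (the older 22 MHz DSSS mask gives the classic 1/6/11)

picked = []
for ch in sorted(centers):
    if all(abs(centers[ch] - centers[p]) >= width for p in picked):
        picked.append(ch)

print(picked)  # -> [1, 5, 9]: three usable channels either way
```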
>If you have a really good spectrum analyzer, you'll be able to spot the occasional cellular traffic, but by and large the spectrum that has been auctioned off to corporate owners goes almost entirely to waste.
Just because you don't see any usage in your area (rural/suburbs?) doesn't mean it's unused. They could be heavily used in major metropolitan centers, for instance.
> Just because you don't see any usage in your area (rural/suburbs?) doesn't mean it's unused.
This is absolutely true. Some devices only transmit occasionally. Not everyone is blasting their frequency 24/7.
It reminds me of when I worked for a company that had a bunch of super-cool Switched 56 lines linking various radio stations in our network. We'd use them for moving audio around, doing interviews, sports remotes, etc...
But we also had a problem with them constantly failing. I was told by our engineer that this was because whenever another company called up Bell Atlantic to get one installed, some tech would listen in to the pairs (presumably with a scope), and when he found one that was silent, that was the one he used for the new company. Often, it was one of ours, so when we would try to use the link the next morning, it wouldn't work.
It still "belonged" to us, but because we weren't using it 24/7, someone thought it should be reallocated. Not cool.
If he can't see it where he is then by definition it isn't being used in that location. Use on some other part of the planet isn't relevant because these frequencies don't propagate far.
OP's impression seems to be that because it's unused, it's being "wasted", and is therefore bad. However, that doesn't factor in that most of the usage occurs in urban areas, where most of the country's population lives. The consequence of the spectrum auctions/allocations is that someone living in rural Wyoming can't use the 1 GHz to 7 GHz band (numbers made up) for himself, even though nobody else is using it. Is that a shame? I suppose, but only if he's a radio enthusiast. For the vast majority of the population, this unused spectrum doesn't impact their lives. They can still use their 802.11ac 5 GHz routers and get 500 Mb/s transfer speeds in their homes.
When considering whether spectrum is used or fallow, we really need to consider location as part of the function - if a given block of spectrum is only used in large cities (say the mid-2 GHz TDD band) for capacity in urban environments, that spectrum is fallow for perhaps 99.9% of a country's landmass, and maybe 70% of its population (depending on its urban density).
With some of the work coming through the pipeline, though, in future it will be feasible to deploy your own small cell, either independent of an existing operator, or in partnership with a co-op or other facilitator who provides the mobile core capabilities, and run a mini network in your local area. This is already happening in the UK in various commercial trial projects.
If he could use it, tech that is useful to him would be developed, allowing him greater connectivity at lower costs. Say he owns a ranch and wants fully autonomous ranching hardware that benefits from connectivity, for example...
My impression is that internet speeds are lower in the rural setting because it's more expensive to lay cable over longer distances. My nomination for unused bands is fast rural internet.
Cost to deploy fiber is going to be lower in rural settings due to fewer obstacles and easier right of way. However, the reason connectivity is more costly to provide is the lower density of customers.
It’s true. However there’s a question of what to do about this. You can’t sell a wireless device to someone in Ohio and expect it not to be used in New York City. And certain emergency frequencies might be rarely utilized but very important when used.
That said, I’d like to see more allocation of spectrum for consumer use. It’s very frustrating that cellular companies get more and more spectrum but we can’t feasibly deploy public LTE for example.
Perhaps if the wireless device has GPS and network connectivity we could give the government permission to enable and disable extra frequencies depending on location. Though of course this scheme invites suppression of dissent.
The UK now has a "use it or share it" regime for operator spectrum.
I can go and get prime operator spectrum if it isn't in use in my local area, and use it at certain agreed power levels. That license is just for one location.
There's also dedicated "held back for sharing" spectrum in the N77 5G band (3.8 to 4.2 GHz), which is more widely usable without detailed coordination (but you still need a low cost license)
With modern spectrum access systems (see CBRS for example), geolocation of the radio is used to determine via a coordination server what spectrum can be used in a given location. Whether that's at risk of suppressing dissent or not is another discussion however - most radios in cellular networks will have some kind of backhaul and coordination management plane for interference avoidance and management - the threat model in cellular technology is a bit different still, and the communications are there to enable spectrum requests to be made in real-time. Spectrum coordination wouldn't need to be done by government - there's conceivable ways to do sensing and avoidance, but at the risk of tragedy of the commons scenarios.
I believe any enforcement Ofcom do these days is reactive, and in response to reports from licensed operators.
A lot of the new "shared" access licenses do not give guarantees of interference protection though, so I imagine Ofcom enforcement would be on a best-effort basis there.
The rule of thumb I use is that the utilisation of the spectrum is 5%. This is based on a conference that I went to (20 years ago granted) where an Australian group had actually measured the utilisation of the spectrum. This was in the context of "whitespace" radio, where they were investigating the opportunistic use of licensed but unused spectrum. Their answer was that from a technical perspective it's worth doing as 95% of the spectrum is available for opportunistic use, the main barriers being non-technical.
Personally I can't see a problem with a "free for all". As space-time (MIMO) coding comes to dominate, WiFi access points will be increasingly focusing their transmissions on specific spatial regions leading to less interference on average, even if the devices are sharing a frequency.
It's in the transmitter's self-interest to maximise channel capacity by focusing as much power as possible on the receiver of interest and not wasting power by spraying it around in an omni-directional manner. This self-interest will avoid a "tragedy of the commons", provided the rules say that everyone has the same maximum power, forcing people to use their radiated power wisely.
>It's in the transmitter's self-interest to maximise channel capacity by focusing as much power as possible on the receiver of interest and not wasting power by spraying it around in an omni-directional manner. This self-interest will avoid a "tragedy of the commons", provided the rules say that everyone has the same maximum power, forcing people to use their radiated power wisely.
That breaks down when you factor in that beamforming devices are expensive, so not all devices will opt to use it. The devices that don't will still be blaring at full power, causing interference for others.
But if they are limited to the same radiated power (an important caveat placed on the "free for all", in the form of a class license), the MIMO system will be able to get more power focused on its receiver, and within that spatial region overpower the omnidirectional interferer. The omnidirectional transmitter will lose more by not being MIMO than the MIMO system will lose from the interference.
MIMO isn't expensive these days with the availability of integrated chip sets, and the cost is only dropping. It will get to the point where apart from broadcasters no one runs a non-MIMO system because the cost of not being MIMO (loss of channel capacity) will exceed the cost of the MIMO hardware.
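The arithmetic behind the focusing claim, as a toy sketch assuming ideal coherent combining under a fixed total conducted power (real EIRP rules can cap part of this gain):

```python
import math

# Same TOTAL power P split across N elements, phased so the fields add
# coherently at the target: amplitude is N * sqrt(P/N), so power at the
# target scales as N * P -- a 10*log10(N) dB array gain over one omni.
P_total = 1.0
for n in (1, 2, 4, 8):
    field_at_target = n * math.sqrt(P_total / n)
    gain_db = 10 * math.log10(field_at_target ** 2 / P_total)
    print(f"{n} antennas: +{gain_db:.1f} dB toward the intended receiver")
```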
>The omnidirectional transmitter will lose more by not being MIMO than the MIMO system will lose from the interference.
The problem is that you're not competing against another laptop/router that wants to maximize throughput, you're competing against an IoT device (e.g. smart fridge, surveillance camera, low-end tablet, etc.) that doesn't give a shit about throughput and will happily tolerate a 50 Mb/s (or slower) connection, while taking up the same timeslice/radiative power as every other device.
Granted, there will be some impact on the MIMO from the IoT device, but it will be lopsided, as the MIMO can null out the IoT transmission. It can be looked at in a couple of ways: the IoT device is having to spread its power more thinly than the MIMO, so the IoT tx will have less influence on the MIMO receiver than the MIMO tx will; or the MIMO's spatial coding gives a degree of isolation which benefits both the IoT and MIMO devices.
There will be a problem if there are a large number of nearby IoT devices up against the MIMO, as their smallish contributions will sum up to something big. In that case the MIMO would need to improve its isolation from the IoT devices by increasing its number of antennas or the quality of its processing.
As a side issue (doesn't really negate your argument), excessive transmission power is a cost for battery-powered devices, so at least some of these IoT devices will be motivated to use their transmitted power more wisely.
To raise a new front in the argument :-), there is a case to be made that the ability of a device to cause interference isn't actually dominated by the power of its transmission, but by the information content of its transmission. Low information content means the signal is easier to predict and a smart enough receiver can predict the signal and cancel it out, leaving only the desired signal.
Some parts of the spectrum are only suitable for certain uses. Cell phone providers prefer frequencies in the hundreds-of-MHz range because they penetrate buildings well. Higher or lower frequencies wouldn't be suitable for this purpose.
In addition, a lot of Tx and Rx equipment moves around. A quiet part of the spectrum in a rural area might be crowded in a city, for example, and poorly designed equipment could be subject to harmful interference.
This looks like a money-making scheme from big telecom companies to me. They have to compete with faster, cheaper, more reliable hardwired connections, and can only benefit from hobbling WiFi.
WiFi 6E is necessary and potentially revolutionary for indoor spaces. If you’ve ever lived in a large apartment building, then you have probably seen the huge number of neighbor WiFi routers with varying signal strengths interfering with your WiFi connection. I’ve found WiFi to be completely unusable for gaming for that exact reason, and have to use a MoCA adapter instead.
WiFi 6 helps, but most devices on 5 GHz are still not on WiFi 6, and won't be for a long time. Any non-WiFi 6 device on the same channel will interfere with the more advanced devices.
WiFi 6E not only introduces a new 6 GHz band where every device is using WiFi 6, but it also has significantly more channels, allowing for more routers in a limited area without interference.
On top of that, it has higher speed and lower latency than even WiFi 6 on 5 GHz. I've seen some claims that it might be used for wireless VR headsets, though that remains to be seen.
This is a technology apartment dwellers don’t even know they’ve been dreaming of, and it should not be delayed.
Here's likely the thinking process for carriers right now:
1. how do we make our service as good as possible so we can provide the most value to customers (and make the most money)
2. when you get home your phone switches from 5G to the wifi network -- which often, in many people's houses, sucks
3. we will provide you a "combo / bundle" offering.. we will bring fiber to your house and install a small repeater which will provide "5G" at home -- which will basically just be higher frequency bands of 5G service -- higher frequency means it does not go through walls / trees / barriers outside as well, so those bands are harder to use out there anyway -- better to use something like 1 GHz outdoors if possible
4. Man, that 6 GHz band looks mighty tempting for us to use.. I wish we could have that be part of our service.. rather than people switching to wifi, they should just keep using their wireless network all the time at home / on the go etc..
5. we will "manage" that spectrum better than a bunch of cowboys with their random 6 GHz routers they buy on aliexpress
While all of that may be true, the important thing here is that many people have limited options (both price-wise and options-wise) for cellular providers, and similarly for fiber or cable internet. So selling the spectrum to the cellular carriers is basically saying that is who gets to use it, and people are even more dependent upon cellular data because existing 2.4 GHz and 5 GHz bands don't work well enough for people. The end goal for cellular carriers is that you use them all day, every day. I think most people would judge that having 6 GHz be an open band that people can use in multiple ways is pro-consumer. But I personally would say, if the carriers want to use a small portion of that for the same goal where they "manage" it, then why not give them some too and we can see who manages it better. But the idea that the cellular carriers end up getting all of it seems like a disaster for consumers. That is a ton of spectrum, and the cellular guys would happily charge you tons of money for it vs. you just buying a router and hooking up to a cheap fiber connection.
Another good way of looking at the problem is that you need to test latency under load, not latency on an otherwise-idle network. Ping tests on an idle network only tell you the best-case latency. If you try doing that a while couple of devices are running video streams (especially if it's two-way from videoconferencing rather than Netflix), you get a very different picture of your network's performance.
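If you want to try this yourself, here's a minimal sketch; the host and download URL are placeholders, and TCP connect time stands in for ICMP ping so it runs without raw-socket privileges:

```python
import socket, threading, time, urllib.request

HOST = "example.com"                               # placeholder target
BULK_URL = "http://speedtest.tele2.net/100MB.zip"  # placeholder bulk download

def rtt_ms(host, port=443, n=10):
    """Median TCP connect time, a crude RTT proxy."""
    samples = []
    for _ in range(n):
        t0 = time.monotonic()
        with socket.create_connection((host, port), timeout=5):
            samples.append((time.monotonic() - t0) * 1000)
        time.sleep(0.2)
    return sorted(samples)[len(samples) // 2]

print(f"idle:       {rtt_ms(HOST):6.1f} ms")

stop = False
def saturate():
    # Pull a big file to fill the downlink (and its buffers).
    with urllib.request.urlopen(BULK_URL) as resp:
        while not stop and resp.read(1 << 16):
            pass

threading.Thread(target=saturate, daemon=True).start()
time.sleep(2)  # let the transfer ramp up

print(f"under load: {rtt_ms(HOST):6.1f} ms")
stop = True
```

On a network with bufferbloat or heavy WiFi contention, the second number can be an order of magnitude worse than the first.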
There's always a use for less latency... HFT (high-frequency traders) are in the nanoseconds. There are probably other areas where lower latency will always matter.
HFT CAN and will use wireless if it is faster than fibre (refractive index of air vs glass). Albeit not WiFi, but they do use ham radio to front-run trades between cities and continents. There have been a few articles posted to HN over the years....
Can't agree more... 5G adoption hasn't been as fast as anticipated, and an easy fix to that problem is to give customers no choice but to upgrade to 5G by using regulatory mechanisms to prevent the roll out of potentially competitive upgrades.
Even if function wise they are different radios, consumers don't really know the difference - phone providers just need something new to pitch the next generation of handsets and wifi 6 could compete with 5G for being the 'it' upgrade on a generation of handsets.
"As long as a number goes up, it must be better...right?"
Seems like a weird takeaway to me. As far as the consumer is concerned wifi and cellular serve the same purpose. The speed argument isn't even really that relevant when people can already play Overwatch and other latency sensitive games on a mobile hotspot, the shortcoming there isn't speed it's simply data use. I'd much rather just have a mobile network I can connect to anywhere and let the tech crowd pay extra for features and politics only they care about.
Taking a position on the politics isn't leaving the politics to the tech people. Whoever says 'it doesn't matter' first loses, because that means it shouldn't matter to them and so the other party can get their way.
5G already gives the cell users everything they need. In the places where this matters, the tools are already there. The 6GHz band would not actually help anything, for them, especially given that it's an indoor frequency and would need some kind of repeater to be useful. But for WiFi it does solve an immediate, important need.
Maybe 5G is much more advanced than I understand, but I don’t think those networks would be able to take on the added load of most people switching away from wired internet anytime soon.
But I do know that 5G in my apartment does not match my fiber internet connection in terms of bandwidth or stable latency yet. But maybe it’d be good enough? That’s an interesting question.
I’ve never tried it, so it’s hard to say how it’d hold up. I do have bad memories of previous cable internet plans that were really fast during the day, only to slow down a lot at nights when everybody was streaming.
My current fiber internet is the only connection I’ve been 100% satisfied with, so it’s not tempting to seriously consider any alternatives.
This should be an easy choice. Award huge amounts of indoor spectrum to a consortium that requires a SIM card to use any of it? What are they thinking?
There are indoor 5G radio units for use in malls and large buildings (see Ericsson's Radio Dot products for example [1]). That's probably what they were thinking.
As I understand it (and please, correct me where I'm wrong, internet), as the frequency goes up, the range and ability to penetrate walls go down. Each generation uses more spectrum across more ranges.
So 5G is neat because it's got low frequencies with long-range towers in rural areas, but also high frequency short-range 'towers' in urban areas. This means instead of way too many people trying to share a limited spectrum over a huge area (IE: low frequency in the city), you can have short-range high frequency 'towers' every few hundred meters in a city.
But now they're coming for our wifi routers, demanding that spectrum too. And that's kind of bullshit.
Well to be fair, how many malls have you been in that actually have decent WiFi? The infrastructure already exists in the enterprise to provide it, but usually they cheap out at some point so it ends up being useless, and after going through the hoops to register and realise you are getting 500kbps, you switch to mobile data.
If this ends up with large indoor areas (malls, airports, etc) having decent connectivity I'm all for it.
An Ethernet cable plugs into the unseen side. Very similar to Wi-Fi but costs much more, is faster, does not require any user effort to connect, and is limited to just one telco's customers.
Why the hell are we not just using beam forming with WiFi? Why are we restricting the construction of computer networks to some few privileged companies?
>Why the hell are we not just using beam forming with WiFi?
AFAIK the router I have from years ago is advertised to have "beam forming". Whether/how well it works is another story.
>Why are we restricting the construction of computer networks to some few privileged companies?
The theory is that building out networks in a planned/organized fashion would produce better results than ad-hoc deployment. See sibling comment: https://news.ycombinator.com/item?id=25652088
Organization does not necessarily mean monopolization. Open standards/protocols with several operators is not only possible, but desirable. An ecosystem without diversity is death.
There's more to wireless networks than just radio technology.
Contemporary cellular service is made possible with low interference licensed spectrum ($0.XX per MHz per million residents), real estate (towers) $$$$/mo each, fiber backhaul ($$$$/mo each), and an army of workers. Telcos each spend billions of USD per year on CAPEX.
Spectrum auctions are better than the previous allocation method of licensing channels for specific uses. If companies have to pay money they will efficiently deploy it, unlike incumbent users that refuse to digitize, simulcast on extra channels, or even leave it idle.
Spectrum auctions and allocation are both worse than unlicensed use.
Having specific users for high power stuff kind of makes sense but requiring another (specific) company to be involved to build short range wireless networks in your own building is stupid.
I'm eagerly awaiting an AP that supports 6 GHz and am hoping for a few years of relatively interference-free WiFi in an urban setting (like 5 GHz was for the first year or two).
I have been asking and waiting for a response on WiFi 6E, because right now the AX210 doesn't have 6 GHz turned on, and its WiFi Certified document [1] does not have 6 GHz listed. The WiFi Alliance also does not specifically mention 6 GHz support as being a requirement of WiFi 6E certification. I am under the impression that for Intel, a software update will come once 6 GHz has been truly opened up in the US. But this would be a first: a product adding support for new spectrum after its FCC testing and approval.
The reason I asked and went through this trouble is that I saw a few vendors, namely Xiaomi, release a WiFi 6E router with no support for 6 GHz.
[1] Google: WFA101047; it's a PDF file with no direct link.
This is interesting. Indeed, the certification report you found does not mention 6 GHz. But according to Intel's AX210 whitepaper [1], "The product supports dual-stream Wi-Fi in the 2.4GHz, 5GHz and 6GHz bands". So I certainly hope it supports 6 GHz now. I'll be very disappointed if it doesn't.
Is the Xiaomi router for sale with 6 GHz enabled? I can't seem to find it on their web site. I saw a few listings on AliExpress for a 2021 AX6000, but they did not mention 6 GHz.
No, it is not, since there is no 6 GHz spectrum timetable in China. The official webpage for the AX6000 is here [1] (in Chinese). And after I made multiple enquiries to the WiFi Alliance and Xiaomi, they have somehow removed all mention of "WiFi 6E" from their pages, instead referring to it as WiFi 6 "enhanced". You can still see some sites [2] and [3] referring to it as WiFi 6E. But I think within the last 48 hours they have been actively removing all WiFi 6E references.
Not sure if it has anything to do with my enquiry; considering no one on the internet actually cared to question and ask, I would like to take some credit for it :D
I would expect 6 GHz to work as advertised from Intel with a software update. It is Intel, after all, and they have at least listed 6 GHz as being officially supported.
Although not ideal, 5 GHz has significant wall reflection/reduced penetration. If you're in a dense, urban, noisy environment, you should still be able to achieve a decent MCS on 5 GHz in your home.
6 GHz will add more spectrum with these characteristics, which will hopefully alleviate future performance concerns pretty thoroughly.
The problem I'm facing is that my friend is a streamer, and she can only reliably stream with a wired connection. When trying to use WiFi, she has frequent breakups of her stream. Using https://packetlosstest.com/, we see that frequently many packets are delayed several hundred ms.
The only thing which has seemed to help is reducing the 5 GHz channel width and forcing the AP to use one of the less-occupied 5 GHz channels. This is what has made me think that interference is the problem. A channel scan shows well over 100 APs.
EDIT: Are there AP features or tuning which will influence how well, and/or how quickly, it can react to interference?
I've been hoping that the 6 GHz band will improve things for her when 6 GHz APs become available. I was going to build her an AP using an Intel AX210; however, I realized that the AX210 does not support AP mode, so that plan won't work.
Have you considered bringing an AP closer to the client? Especially on 5GHz a good line-of-sight / line-of-reflection is very helpful, and some extra 10-20dB of signal should compensate for a lot of interference.
OFDM MU-MIMO AX20000+, $1000 routers with 20 antennas work better, but in an inconvenient location, when the noise floor is high, they have no chance of competing with a decent 2-antenna ac (or even n) AP with good signal propagation (same room vs. behind a wall somewhere).
And it would be helpful if you can agree with your neighbors to use narrow channels (less competition and unintended interference), decrease radiation power (less distortion => better signal) and if supported, increase minimum bitrate (faster communication => less jamming). Also, lower transmission power and faster SSID broadcasts help induce roaming, for when you have multiple APs.
Building a performant wifi network is a serious pain. Building material, channel congestion, network load, AP placement, other sources of RFI, etc all cause issues.
Your best bet is look for APs that support 802.11r or v (also check that clients support it though most modern cards do) which allows seamless roaming between APs then replace the existing AP with 2 or 3 APs. If you want a cheaper solution you can try for a DIY site survey to try to find the best placement and channel for the existing AP.
There are client priority features in wifi, but your device may not support them. If her antenna has line of sight to the BS then it's not likely to be interference but something else, like traffic congestion.
I'm not sure what hardware/chipset you're using, but if it's consumer grade, that might be a big part of the problem. Competently executed prosumer/soho like the amplifi/unifi line or google wifi may make a huge difference.
However, ethernet may just be the best option, especially for streaming.
With wifi, you need to make accommodations as well. It's not just Magic Internet Waves. I've had no problem video conferencing on wifi, but I make sure to have reasonable line of sight to my AP. I also chose a model with a simplistic and streamlined design so that it didn't look awful placed prominently in my space.
With that many neighbors I am not sure if there is much that can be done. You can take steps on your network, but getting everyone else to cooperate is a different story. Even if they are willing, how many would be able to manage their channel allocations and transmit power?
The problem with 5 GHz is that some 60% of the spectrum is locked behind 1- or 10-minute DFS checks for ill-conceived radar coexistence, so all the APs usually bunch into the few channels that don't require that.
I just suck it up and take the DFS hit, which seems to be about +30 seconds of additional router reboot time. The interference-free 80 MHz channels are absolutely worth it!
If my router could remember the DFS state between reboots it would be even better, though.
It cuts both ways - when 5GHz was new, there were lots of client devices that didn't support the DFS channels at all (even if as stations they don't need to do DFS).
So still today AP vendors are cautious to use them because of the potential for support calls from people with older mobile devices that "don't show the AP at all".
2.4 GHz channels are another of the many, many regrettable paper-cut things with WiFi. I have a 2.4-only gadget that the manufacturer ships with the narrowest regulatory settings so they can sell one SKU, and it can't use channel 12, which is available almost everywhere but the US.
The "licensed crowd" that advocate for auctioning/licensing the band has lost in US already. However, they are advocating for licensing part of the band in other regions, e.g. Europe/Africa.
Specifically, they are advocating for licensing above 6425 MHz.
I bet a delay of ten years in fully using the 6 GHz band would be considered a big win for the 5G advocates. They will have crippled WiFi's performance for 10 years, giving them more breathing space to show that 5G is the future.
Most of the comments here are looking at this the wrong way.
The big problem here is that the spectrum, any spectrum, is being SOLD.
Spectrum use should come at a cost based on a percentage of revenue or profit, paid back to the people and used for common programs, not a one-time fee whereby the private company then owns it for all time.
Don't worry. Even when it's sold it's never really sold. If someone with more money comes along they get your spectrum. Just look at the incumbent satellite uplink operators who had half of C band taken away from them and resold (to 5G telcos).
As for spectrum usage, I can only speak for my region, but I have been monitoring all of 70-1000 MHz at ~10 second intervals and storing the data since 2014. For periods of that time I've monitored 700-2700 MHz as well. The cell operators are not letting their bands sit idle as far as I can tell. It's nothing like what the top-level comment thread implies re: ISM vs. cell bands. If anything it's the reverse.
But that doesn't mean they should get this 6-7 GHz of new ISM spectrum the FCC returned to the original owners: the people. They really, really shouldn't. And they shouldn't be allowed to operate unlicensed LTE in the ISM bands either.
The FCC also designated this 6 GHz spectrum for indoor use only, below a certain transmission limit, unlike the 5 GHz DFS channels, which you have to stop using if there's other traffic.
It's a good case of dual-use of spectrum, though more spectrum should be made public either way.
http://superkuh.com/radio/ . In that directory you'll find most of the recorded radio history as well as the bash script and depends to do it yourself (http://superkuh.com/radio/deps/). The top of the bash script has a readme in the comments.
For 70-1000 MHz it's just a homemade discone antenna hooked to an rtl-sdr dongle running rtl_power. I have written a little bash script that automates calling heatmap.py to render the 30k-pixel-wide image, imgcnv to slice it into image tiles, and generate a PanoJS HTML page to display the tiles.
Spectrum reminds me very much of diamonds. It is an artificial scarcity, except with a government-controlled cartel rather than a private one. Most of the RF spectrum is reserved for the "DoD", which they never use.