You should see the 2.4 GHz noise floor on a spectrum analyzer in a densely populated, 5-floor, wood-framed condo/apartment building in a major urban area. A prototypical example would be somewhere in South Lake Union, Seattle.
Since 2.4 GHz goes through wood fairly effectively to moderate distances, it's a total half-duplex CSMA hell...
One possible mitigation, if you have older 2.4 GHz-only devices, is to run your own wifi on a 20 MHz channel, sacrificing throughput for better SNR. As the channel gets narrower, less noise falls inside it, so you cut through the noise floor a little better. And of course use your own choice of the cleanest 5.x GHz channel for everything you care about.
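To put rough numbers on the narrow-channel trick: the ideal thermal noise floor scales with bandwidth, so halving a 40 MHz channel to 20 MHz buys about 3 dB of SNR, all else being equal. A quick sketch (idealized physics only; real noise floors in a dense building are much higher than thermal):

```python
import math

def noise_floor_dbm(bandwidth_hz: float) -> float:
    """Ideal thermal noise floor: -174 dBm/Hz plus 10*log10(bandwidth in Hz)."""
    return -174.0 + 10.0 * math.log10(bandwidth_hz)

for mhz in (40, 20):
    # 40 MHz -> about -98.0 dBm, 20 MHz -> about -101.0 dBm
    print(f"{mhz} MHz channel: noise floor ~ {noise_floor_dbm(mhz * 1e6):.1f} dBm")
```

Each halving of channel width gains roughly 3 dB, which the radio can spend on a higher-order modulation or simply on a more reliable link.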
Why are your first two links just about buildings looking the same? I was thinking they'd spectrum visualizations per your lead-in which would be really fascinating to see...
Probably not the best links; they're meant as references to a construction style of putting five wood-framed floors on top of a concrete first level/foundation, called a "five over one". Then, when the wood walls are divided up into 450 to 750 sq ft apartments, that's a LOT of individual 2.4 GHz APs all packed into a dense, noisy space, with only wood between them.
I've got the opposite problem, a big old house with dense old pine shiplap full of nails covered with sheetrock. My wifi can't cover the house. Any recommendations for good router/APs with large range or maybe repeaters?
Probably better off with a mesh/repeater setup; a single super-powerful router will still be sketchy at the edges, since there the client device's radio and signal strength also begin to play a role.
I’m angry because I just had a TP-Link (ax20) router go bad on me, so I’d just advise you to avoid that brand but do your research - maybe I just got bad luck :)
I do this in my own house w/powerline networking as the backhaul. Cable Modem is located at the main TV and connected to an Ethernet switch with local devices and powerline node. Three more power line nodes power two additional APs and an ethernet switch for wired connections in my home office.
It's not the most elegant solution, but it works and allows me to have Inet/WiFi access in and around my house.
Others have already suggested that your best bet is access points connected by copper.
Ethernet is the most obvious choice, but it might be a lot of hassle to run Ethernet through the walls or attic or basement and add outlets everywhere you need them.
If you don't mind visible cables, running along the top of walls can work. Here's part of the Ethernet cable that is connecting opposite ends of my house and also speaker cable for one of my rear channels [1]. Ugly, but since I look down a lot more than I look up when I'm indoors, I'm fine with it.
Powerline networking has also been mentioned.
One more alternative to consider is MoCA [2], which is an option if your house is wired for cable TV. The easiest way to think of MoCA is that it is like powerline networking except instead of running Ethernet over your power lines, it runs Ethernet over your cable TV lines. A typical MoCA adapter has two coax ports and one Ethernet port.
You hook up one MoCA adapter at your cable modem between the modem and the wall using the two coax ports on the adapter, and hook up the adapter's Ethernet port to an Ethernet port on the modem. (If you don't have cable internet, same instructions except you only need one of the coax ports on the adapter, and the Ethernet port gets hooked up to your DSL or fiber or Starlink or whatever modem).
Then anywhere else in the house that you want Ethernet and have a coax outlet you hook up another MoCA adapter to the outlet via one of the coax ports on the adapter. The MoCA adapters do their magic, and that Ethernet port is logically on the same Ethernet as your cable modem. You can use the other coax port on that adapter to hook up another coax device, like a cable TV box.
I want to love Ubiquiti and put their gear all over my house; I already have some long-range WiFi antennas from Ubiquiti, but my APs are all Linksys right now. However, the recent fiasco and management's apparent failure to even try to fix the issues at Ubiquiti has me starting to look for other gear instead... Worth keeping in mind.
Yeah the gear and software are very good but the management response to basically try to sweep the whole thing under the rug has really soured me on the company as a whole. You might as well just hand foreign intelligence the keys to your network.
I chuckle when Americans say that their "5 over 1" buildings "all look the same". This is nowhere near the level of sameness that I'm accustomed to see here in Eastern Europe. To my eye, these buildings look very distinct from each other.
And make sure a low-voltage electrician installs it. Half my ethernet runs are daisy-chained. Some of it is stapled to the studs of the wall. Of the 5 drops in my condo, only 3 lead back to the IDF. Fixing it is going to be damn near impossible.
They didn't even do that. They double-punched the RJ45 jacks. They used proper Cat5e everywhere (cables, jacks, and punch blocks), but the wiring was done badly enough that I can't really use it.
I recommend just running conduit to everything during your next renovation, that way you could easily upgrade down the road. I wish conduit was used for all of these coax lines in my house.
20 years ago, I heard people online saying they were running their cat5e cable in conduits for ease of upgrading. Reasonable enough, after the 10-megabit to 100-megabit to 1-gigabit upgrades of the preceding decade.
Not once, since then, have I heard of any of them pulling new cables.
there is a pretty good chance that 20-30 meters of cat5e utp in a house will test successfully for 2.5GBaseT today, not that even one percent of consumers will have such a switch.
Yeah, I wish. I wanted to do this in a townhouse where we have RJ-45 ports in every room, but for some reason they aren't actually internally connected to each other. Then I found the walls are completely filled with insulation, and running any new wiring requires pulling some of it out first and putting it back in afterward; otherwise the nice noise barrier I currently enjoy from the walls shared with neighbors suddenly goes away. This means instead of making one hole upstairs, dropping a weighted cable, and making a second hole downstairs, I'd need to tear out entire sections of wall.
They make small fiberglass sections that are threaded. You can assemble them together (even in tiny spaces) and push them through wall cavities until you get to your destination. Then you use the rod to pull the new wire (or a piece of string) backwards. I've done this to pull through walls with insulation quite often. I've even used them to fish through all sort of crazy places. It's fast and easy once you get the hang of it.
I thought it was. My house was built in 2003 and has CAT-5 to every room with RJ45 jacks on the walls. When I sold my previous house to a builder, he planned to open the walls and run CAT5 also, because "everyone expects that these days."
I guess newer houses might be expecting everything to be WiFi now, though.
In 2021, if you’re running network in the walls, run fiber to switches with SFP adapters. We maxed out twisted pair copper years ago (you can squeeze more over shorter runs, but that hits diminishing returns quickly) and home internet speeds are going above 1gbps. If you install in-wall cat6 it’ll probably be obsolete by the end of the decade.
> If you install in-wall cat6 it’ll probably be obsolete by the end of the decade.
No. Most devices will still work just fine on Cat 5e for another 10-20 years, running at 1gbps. Cat 6a, running at 10gbps, will be fine for residential for another 25-30 years.
Putting fiber in the walls is an expensive overkill. If you are worried about future proofing, just install conduit with cat 6a.
Imagine calling a country where the majority of homes are nowhere near 1 Gbps a first-world country in 2021...
P.S. US citizens need to acknowledge that the reason they don't have such a luxury is a lack of competition in the ISP space. Make that issue nonpartisan, discuss it with everyone who cares, and find out why other countries are in a better state: What are the structural differences? Are there laws preventing competition? Etc.
> If you install in-wall cat6 it’ll probably be obsolete by the end of the decade.
Would it? "Regular" Cat5e has been around for 20 years or so now, and even at "just" 100mbps to 1 gigabit, would be a ridiculously huge upgrade for any home I've ever been inside of, and is still reliably faster than any WiFi device invented thus far.
Unless you have some really esoteric requirements, most homes could just run CAT6a to every room without thinking about it, and sleep soundly knowing they'll reliably get 2 to 10 gigabit (depending on distance) into each room for the next 30 years or so.
Cat6a supports 10gbps. That should be good enough for the foreseeable future. Fiber is significantly more difficult to pull through the wall since it is more fragile.
Fiber is not significantly harder than Cat6 and the turns can be quite a bit tighter. But terminating can be very expensive; the best price I have for a commercial fiber tech is $100/end.
Yes, if you ignore the difficulty of terminating fiber, the more expensive switches, the fact no motherboard, laptop or TV accepts fiber, and the lack of PoE there's really no reason not to use fiber.
If you're doing a basic project where it makes economic sense and are willing to teach yourself, it takes a $950 fusion splicer and about $300-400 of hand tools/supplies to terminate single mode these days. It's not super hard to learn how to do as an amateur if you can watch some youtube videos and look at reference documents. This particular model which can be found for 900-950 from China is popular for FTTH last mile work:
It's not something I'd use to splice a very important cable carrying long haul DWDM circuits, but more than good enough for its purpose.
As to whether a house needs fiber to each room? I'm not really sure, at the loop lengths involved, recent cat6 cable has a high chance of working successfully at 2.5GBaseT and 5GBaseT speeds, even if it doesn't qualify and test successfully for 10GBaseT. If you have a really high end 802.11ax 4x4 dual band AP with 2.5 or 5GBaseT interface on it and the switch to support it, in real world use it's unlikely you'll ever get much beyond 1000BaseT speeds to it with real wifi traffic.
The turns in fiber can be tighter than in Cat 6, really? Granted my experience pulling any kind of cable is most of two decades stale by this point, and my one experience with installing fiber considerably older still - I'm still surprised to hear fiber could be easier to pull.
Thanks for linking the video. That is wild to me - if I'd tried that staple gun trick with Cat 6 on a worksite, my boss would've kicked my ass all the way back to the office, and rightly so. I'd have never dreamed of seeing fiber that could hold up to it!
At a certain point (generally 6 Gbps), you're getting a network that is faster than internal SATA, so unless you're using pure NVMe storage, your network is faster than your filesystem, which means your network isn't limiting you. That happens well before we max out the speeds that can be accomplished with twisted-pair copper. Even with pure NVMe, PCIe as far as I know still maxes out under 100 Gbps, and no actual storage devices support read or write at full PCIe speed anyway. I mean, not consumer grade at least.
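The crossover is easy to check back-of-the-envelope. SATA III signals at 6 Gb/s, but 8b/10b coding leaves roughly 600 MB/s of payload; comparing that against raw Ethernet line rates (protocol overhead ignored on both sides) shows where the network stops being the bottleneck:

```python
# Line-rate comparison in MB/s; protocol overhead deliberately ignored.
SATA3_MBPS = 6_000 * 0.8 / 8  # 8b/10b coding -> ~600 MB/s effective payload

links_mbytes = {
    "1GbE": 1_000 / 8,      # 125 MB/s
    "2.5GbE": 2_500 / 8,    # 312.5 MB/s
    "10GbE": 10_000 / 8,    # 1250 MB/s
}

for name, rate in links_mbytes.items():
    verdict = "faster" if rate > SATA3_MBPS else "slower"
    print(f"{name}: {rate:.0f} MB/s ({verdict} than SATA III's {SATA3_MBPS:.0f} MB/s)")
```

So 1GbE and 2.5GbE sit below a single SATA SSD, while 10GbE comfortably outruns it, which is the point at which only NVMe keeps up.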
Fiber can solidly hit 10x the price of installation over CAT6a/7, between the more expensive cabling, ethernet conversion on the room terminals (ok, maybe you have one computer with a PCI-E fiber adapter? nothing else does), and the networking switch in a closet/basement (price a switch with more than 4 SFP+/fiber channels. they approach five figures. so, you'll probably have to convert back to ethernet at the source as well).
And the benefit is tenuous. CAT6a/7 can hit 10Gbps, as long as the run length isn't insane. Even the 11th gen Intel NUCs ship with 2.5Gbps ethernet LAN ports, on-board; outfitting your endpoint devices to breach 1Gbps is far cheaper, especially considering most won't ever breach 1Gbps due to hardware limitations (PS5/Xbox? Ikea Tradfri Gateway?).
Even in the "local network upload/download" case; you've got a server, and you want 40Gbps to that server. Building a file server capable of sustained 40Gbps transfer rates is... insane. It's not easy, nor cheap. It requires multiple PCI-E attached NVMe drives in RAID-0, on the latest-gen TR/EPYC platform (for their PCI-E lane count; maybe Xeon is good enough nowadays as well). In 2021, this is still in the realm of "something Linus Tech Tips does as a showcase, with all the parts donated by advertisers, and it still sucks to get going because Linux itself becomes the bottleneck". Remember: a Samsung 980 Pro NVMe Gen4 ($200/TB) can sustain somewhere around 6Gbps read; you'd need 6-8 of them, in a single RAID-0. And, realistically, you'd want 12-16 of them in RAID 0+1. A server capable of this is easily in the mid-five-figures.
(and, fun fact, even after you build a server capable of this; Windows Explorer literally cannot transfer files that fast. you have to use a third party program.)
If you're a millionaire outfitting your mansion, then sure, maybe fiber makes sense (due to both upfront cost and length of the cable runs, where sustaining 10Gbps on CAT6a/7 is more tenuous). But I think the assertion that Cat6a/7 will be "obsolete" by 2030 is pretty crazy. Yes, technology will continue to get cheaper and more accessible, and I do think we'll see more fiber providers in tier 1 and 2 metro US areas offer wider 2Gbps and 5Gbps connections, but CAT6a/7 is perfectly capable of saturating those. Just ask yourself: do you really predict that the PlayStation 6, maybe 2028, will have a duplex fiber port on the back instead of ethernet? It's 2021, and Microsoft's Xbox download servers can't even deliver game data at gigabit speeds; they rarely breach 250-500Mbps.
Given the niche that fiber lives in, even taking the position that "it's just dual-channel light, one up, one down, nothing can travel faster than light, it's the perfect future-proof tech" is tenuous. Who's to say that, in the next twenty years, a consumer standard for fiber is developed which runs quadplex (2 up, 2 down)? Or simplex (because it's "good enough")? Or the connectors are totally different (which would be the easiest to switch, because it may not need new cable runs. Maybe).
Oh, also: PoE! PoE is freakin fantastic for prosumer setups. and only available on copper. You can run copper to areas around your house where you want security cameras or other smart devices, and not have to worry about also running power.
I agree with pretty much 100% of what you've said there - but fiber doesn't need to have the mystery of being really expensive... Not for houses, but for commercial use, if you spend the money one time to buy a fusion splicer and good tool kit, some basic consumables, two strand singlemode is actually 1/3rd the price per meter of cat6. Due to it being so cheap to manufacture and the cost of copper being high right now. Done correctly you have a guaranteed hassle free upgrade path as far as 100GbE and 400GbE on the same fiber, patch cables, patch panel, etc.
But for residential use, one of the primary reasons to run an ethernet connection to different places in the house is an 802.11ac/ax (or whatever next-generation) AP, so fiber doesn't really solve the problem because you still need electrical power for the AP. Obviously one cable and 802.3af/at/bt PoE is a better idea than running fiber and powering each AP off AC wherever it's mounted. Aside from the fact that APs, except for very, very expensive enterprise ones, don't come with SFP/SFP+ ports, and are generally designed around the concept of being powered from the switch they're connected to anyway.
One of the reasons why I really strongly agree with your points is that in a residential environment it's going to be very, very difficult to move enough throughput through an 802.11ac/ax AP to get anywhere near stressing the speeds of a 2.5 or 5GBaseT connection in the future. I'd be fairly confident in saying that a house wired today with cat6a at sub-50 meter lengths, that tests OK for 5GBaseT, will probably be good for the next 25-30 years.
The big thing for me is, it's easy to say "oh, fiber is future proofing". Alright, can't argue with that; just as it's impossible to predict the future and say fiber is the correct choice, it's also impossible for me to say that it isn't. But I strongly suspect it won't be necessary in our lifetimes.
The primary reason I suspect this is on both ends of the internet delivery spectrum:
First; I think the broad resource allocation focus over the next 10-15 years in the US will be getting "the bottom 80%" up to 100Mbps+ speeds; not getting the "top 20%" beyond 1Gbps. Many of the traditional ground-line companies who would be doing this work (Comcast, Spectrum, etc) are going to be experiencing pressure from emerging wireless technology that can meet these speeds, with beyond-adequate latency, at a fair price, and require far less infrastructure work (Verizon/AT&T/T-Mobile 5G, Starlink) (Starlink is a wild one; you're competing against the gravity well of the planet at that point; what can any of these companies who are "good at digging holes in the ground" do?).
Sitting in my new apartment here, I have AT&T home internet. Averages ~50/10 @ 25ms. I was told on the phone it would be 200 down. "Well, the lines in this building are so long, very old, we ran some tests and we can sell you the 100 plan, but you probably wont get those speeds reliably, you'd be better off on the cheaper 50 plan". Ok, fine. Let me run a speedtest on my phone here, Verizon 5G, 125/50 @ 10ms. The cell companies can just put up a tower, cover hundreds of people with really freakin' good internet, sell it as home internet, what are the cable companies supposed to do against that? Spend thousands of dollars re-tooling the wires in this old building to get "just as good" internet to six people, half of whom won't pay for it?
And the key thing there: these emerging wireless internet technologies won't breach gigabit for decades. It's difficult enough getting them to gigabit.
Part of the reason they won't is on the other end; we're hitting the point, very quickly, where Bill Gates' old misattributed "640k should be enough" quote is becoming true; just not 640k, more like "4K video". Would having 5Gbps internet, instead of 1Gbps (which I had just a few days ago), actually fundamentally change how I interface with content online? Not even close. Even 100Mbps doesn't; there's a point where internet just hits "yup, that's good enough". Cool, I can download Warzone in an hour instead of four; it's the same thing at the end of the day.
An argument could be made that continuing to push internet forward will open up more innovation in content delivery; whether that's game streaming, 8K video, actually decent quality 4K video, whatever. I think this is tenuous as well, because a big bottleneck for many content providers is networking costs on their end. So much money has been (rightly!) dumped into making our (mostly privatized) nationwide internet backbone "resilient" that it's gotten very expensive to egress data from most hosting providers (big cloud certainly, but even small cloud and colo providers). A high-quality 4K video stream can saturate a 100Mbps line; as an end user, that sounds great, I've got a 100Mbps line! But as a service provider, you multiply that 100Mbps by XXX,XXX users, and the numbers start looking really scary. That situation will not improve in the next 1-3 decades; the focus right now is on algorithms to get the same quality in lower bandwidth, not just pushing more bandwidth.
Plus, applications like Game Streaming are both bandwidth intensive and latency intensive. So, double-edged sword, and one that the emerging wireless home internet technologies won't solve well. Having whole-home 40Gbps fiber or a 5Gbps uplink won't help you with Stadia.
Point being, I think arguably for the rest of our lifetimes, the internet as a whole is going to enter a holding pattern while we catch everyone else up with acceptable speeds, improve the width of the backbone (not just the "depth" e.g. fallovers and resiliency), which includes 10-100xing edge distribution, and improve underlying algorithms to reduce the size of content while maintaining quality. All of this will be prioritized above widespread 10Gbps to the home.
I did my undergrad in Seattle and had this issue. I eventually made 10 or so unprotected wifi printers spit out the "how to turn off wifi printing" page of their manuals. Definitely cleared up a bit of 2.4 GHz room.
in practical use it's really not necessary, since a lot of those apartments are small enough in square footage to be well covered by one $80 5.x GHz capable 2x2 or 3x3 802.11ac AP. You can still get 400-700Mbps speeds through a 5 GHz channel, such as around 5150MHz, on any dual band equipment. Although 5 GHz goes through wood framed walls/floors a tiny bit, it doesn't go very far, to the point where you might see your neighbor's 5 GHz AP at a signal of -87.
Funnily enough, I seem to get 5 GHz interference from nearby buildings through glass. I’m very much looking forward to 6 GHz. I ran some 60 GHz radios for a bit, and that was great, as it was interference-free, and a drywall layer would stop it.
Unfortunately, the hardware failed, and I didn’t see mass adoption of 60 GHz elsewhere. I hope that tech like local optical or 60 GHz becomes viable.
Yep that is definitely a thing, and you will see more of it coming in through glass in the 5725 to 5850 band, since APs can run at higher power in those frequencies. One possible mitigation is to choose the cleanest possible channel for your in home use in the bands where FCC-allowed equipment can only transmit at moderate powers (specifically disallowed for outdoor WISP 5 GHz equipment too), in the area of 5100 to 5300.
> I hope that tech like local optical or 60Ghz becomes viable
Yeah, I was really fond of the prototypes a while back that used visible light; those would completely end contention in an apartment complex or office:)
One of my neighbors has a 5 GHz access point that comes into my house stronger than the one in my own home. My only theory is that they have a high-powered AP with a directional antenna pointed in my direction. As far as I can tell, they also run 3 other APs on other channels. Basically they use almost all the available spectrum in the 2.4 GHz and 5 GHz ranges.
Interesting reading. The results, and the issues with housing, remind one of the post war building boom in Central Europe, Communist enforced 'blocks'. We would visit from Canada and marvel how everything was the same. Obviously a very efficient use of planning and construction. Almost assembly line. Actually impressive how many people they had to house, quickly.
The amount of design work that went into the community was more impressive. Years later, I could appreciate how everything was coordinated to have nearby playgrounds - so that you could watch your kids from your apartment. Schools, grocery stores, medical clinics, were all nearby in a logical efficient pattern. ( Communist system though, meant there was little to eat, or buy, or drugs, in those facilities.)
The wiring (and plumbing) was all prebuilt into the concrete walls and floors, ready to be connected upon assembly. The ideas and designs were great. Of course, the actual performance and construction were prone to the usual Communist-era woes of graft, theft, and shortages.
To get back to the topic, phone wires were installed, but phones were not available for another 20 years. Imagine if Ethernet wiring was installed that way?
Fun fact: wooden doors propagate fire less than metal doors. That's because metal doors just heat up and set what's behind the door on fire, while wooden doors consume the heat. The opposite side of a wooden door stays cooler than a metal door would.
Not sure why this is downvoted. For nearly a century, large multi-family wood structures were banned in most countries due to fire risk. These huge apartment buildings are all 2x4s, not timber framing. They're incredibly flammable.
The only thing keeping them safe is fire sprinklers. There's been many infernos that burned the things to the ground when sprinklers failed.
OK - I didn't want to be 'that guy', but here I go: DevRant is garbage! I like the concept, but the execution is a marketing black hole.
First of all: the linked page doesn't scroll on an iPad - they are trying to be clever with scrolling and it actually breaks scrolling.
Secondly: they take pushing their app to an insane level: none of the links works, they all point to 'Get the app'. This fact alone has just gained the site a place in my DNS blackhole.
> Secondly: they take pushing their app to an insane level: none of the links works, they all point to 'Get the app'.
I agree. Its "use our app" push is similar to Twitter's or Instagram's, but those platforms at least offer a lot of functionality in their apps.
Devrant on the other hand, seems to be quite simple in functionality. Why would I need to download an app for that?
Just use the Web Notifications API! Maybe also act like some obnoxious "captchas" and actually check whether you got the permissions to nag you later or give you some flair reward.
Hi all, the original poster of the devrant thread here (made an account just for you guys).
I've read some of your comments and wanted to address some things.
I've seen people tell me to get a 5GHz capable AP and a dongle.
But this shouldn't be needed.
I already own a hAP AC3 (MikroTik), but due to the way our house is built (it's pretty much all solid brick between each room), 5GHz doesn't go that far, so I'd need to buy an AP for pretty much every room.
When sitting behind my desk (about 2 m from my AP), everything is fine and dandy, but when I leave the room, 5GHz basically dies.
Next, I saw people recommend "working it out with my neighbours" by having them turn down the power or use cables to hook up all the APs (I even offered to pay for all the cabling myself), but they are not willing to do so.
And just saying "Would you stop using all this smarthome junk so we can have a decently working wifi" also isn't really gonna cut it for them.
And finally, I saw a comment or two to just jam the living shit out of it, but well... I'd rather not get my own ass into legal trouble for this.
> I've seen people tell me to get a 5GHz capable AP and a dongle. But this shouldn't be needed.
You can't have your cake and eat it too: 2.4 GHz propagates through walls, so you share the spectrum with your neighbors; 5 GHz is much more easily blocked by walls, but that also means you have the (much larger) spectrum to yourself.
> 5GHz doesn't go that far, so I'd need to buy an AP for pretty much every room.
Well, for every room that you need a stable connection in. If you can see your neighbor's signals that well through (presumably) multiple walls, I suspect that you'll also be able to cover at least one extra room per 5 GHz AP so that calls don't drop while you're moving between rooms.
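Some intuition for why 5 GHz dies so much faster: even before wall attenuation, free-space path loss grows with frequency. A rough sketch using the standard FSPL formula (real walls add several more dB per wall at 5 GHz on top of this; the channel frequencies below are just illustrative picks):

```python
import math

def fspl_db(distance_m: float, freq_mhz: float) -> float:
    """Free-space path loss in dB, with distance in meters and frequency in MHz."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55

d = 10.0  # same distance for both bands
# 5 GHz channel 36 (~5180 MHz) vs 2.4 GHz channel 6 (~2437 MHz)
delta = fspl_db(d, 5180) - fspl_db(d, 2437)
print(f"5 GHz loses ~{delta:.1f} dB more than 2.4 GHz from path loss alone")
```

Roughly 6.5 dB worse at any given distance, before masonry or brick gets involved, which is why solid-brick rooms become per-room AP territory.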
> I've seen people tell me to get a 5GHz capable AP and a dongle. But this shouldn't be needed.
And yet, it sure sounds like it is! If your neighbor won't change the situation, and your outcomes are entirely dependent on them or you doing so, then you're left with taking an action that you normally wouldn't have to.
Or, depending on your needs, perhaps you can use Powerline adapters to some rooms.
I was thinking: take a larger canvas picture and put aluminum foil or window screen on the back. If you hang it in the right spot, it might block out a large portion of your neighbor's wifi signal.
Well actually... if someone starts shitting around with de-auth frames, they will probably find out in no time (I'm the only one around this block who is known to have any tech-related knowledge).
Kind curious; you have obviously talked to the neighbor. What's their attitude? Do they just not care about the problems that are causing you and others?
They generally do not care, no, and often try to brush it off with "as long as it's legal, I'm fine with it"... and I'd have no ground to stand on there, since it's technically legal here...
Consider moving to 5GHz and installing a few metal reflectors/surfaces to direct the signal that comes through doorways and other openings around the house. If your ceilings are plaster, you might be able to get up in the roof space and set up a bunch of reflectors to direct the energy through the ceilings and over the tops of the walls. The total cost could be a roll of aluminum foil, as used in cooking.
MikroTik has great support for mesh networks and managing multiple APs, if you want to go that route (and for combining 2.4 GHz with 5 GHz).
Another route might be using ethernet over power lines, and then putting 5 GHz APs in the rooms that need them.
But essentially, everywhere I need to do things that require bandwidth, I just pull gigabit ethernet cable. Sure, it's not as convenient, and pulling cables is a pain, but once you do it, it just works.
You could have a shared mesh network, each of you (you, neighbor) with your own VLAN. You both win: great coverage and bandwidth for both, and your APs wouldn't fight (and it's cheaper too, since you need fewer APs overall).
If talking is out, then you are only left with suboptimal options. Depending on where you live, mobile 5G/4G might also be an option.
That's not how 802.11 medium access works: If two access points are on the same network, fairness should be identical in the "one AP, two VLANs" and the "two APs/SSIDs" cases.
On the other hand, if the issue is due to super slow legacy devices that just hog airtime, a single modern AP also won't help.
APs don't "fight" (or more accurately: they always fight in a fair and standardized way, even in a single AP network). Medium access works the same between stations and APs on a different SSID (on the same frequency) as it does on a single AP/SSID.
It's a bit different on distinct but overlapping frequencies, since detecting a busy medium has to happen on the physical rather than the MAC layer, but if both APs are on the same frequency (and supporting the same maximum speed), the number of APs and SSIDs is mostly irrelevant.
This might change a bit when thinking about MU-MIMO, but I doubt that 2.4 GHz only IoT devices support that.
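The "slow legacy devices hog airtime" point (the classic 802.11 rate anomaly) is easy to see in a toy model where stations take turns sending equal-sized frames regardless of their PHY rate. The numbers below are illustrative, and contention/backoff overhead is ignored entirely:

```python
FRAME_BITS = 12_000  # one 1500-byte frame

def per_station_mbps(rates_mbps):
    """Per-station throughput when each station sends one frame per round.
    A slow sender's frames occupy the air far longer, dragging everyone down."""
    round_secs = sum(FRAME_BITS / (r * 1e6) for r in rates_mbps)
    return FRAME_BITS / 1e6 / round_secs

print(f"{per_station_mbps([150]):.1f} Mbps")     # fast station alone: 150.0
print(f"{per_station_mbps([150, 6]):.1f} Mbps")  # sharing with a 6 Mbps IoT device
```

Adding one 6 Mbps device drops the 150 Mbps station to under 6 Mbps in this model, because per-frame fairness is airtime-unfair; and that happens whether the slow device sits on your SSID or your neighbor's.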
Personally, I use a combination of wireless and physical cables. Not necessarily the most elegant solution, but it works.
Wireless has been great for a while, but I found it does not scale the way I would like it to, especially given the explosion of wireless devices out there.
Many years ago I lived in an apartment complex mostly populated by Comcast employees, with those horrid Comcast modems that broadcast two 2.4 GHz APs apiece (yours and the "xfinitywifi" hotspot).
My laptop picked up about 140 APs from the couch. I could copy files from my NAS over Wi-Fi at a blistering 6 KB/s! Sometimes bursting to 25 KB/s.
That's kind of surprising, actually, I'd expect a bunch of techy types to be running their own routers or at least to have manually configured their router. At least, assuming everyone in the building has Comcast as an ISP, I'd expect the number of xfinitywifi APs to be inversely related to the number of Comcast employees who live there.
I'm a techie type and have one of those Comcast modems. I also have a decent Ubiquiti setup that rides behind it, but I'm kind of shoehorned into using their provided modem, or alternatively paying for a modem myself and then paying another $35/month for "unlimited data" (a really crappy Comcast policy, but that's beside the point).
Long story short, this modem will reset itself and require me not only to become aware that it's once again broadcasting the "xfinitywifi" hotspot, but also to redownload the xfinity app, configure it with my information, and try to remember where they buried the setting to disable this "feature". After disabling, it somehow takes up to 24 hours to actually happen...
I have over 80 IoT devices and three APs. I also live in the middle of nowhere. That being said, I like how all the commenters just go straight to de-auth attacks. The proper solution is to get up, knock on the door, and talk to them about their signals; ask if they can clear one channel for you. Work with them. Or, you know, just buy a router made in the last five years that supports decent 5 GHz range and mount it up high.
Not everyone can use 5 GHz - if you have one of those 50s/60s/70s houses that were built with plaster over chicken wire (rather than lath & plaster or drywall), then at 5 GHz you're basically living in a bunch of Faraday cages.
(At a company I used to work at, we once had to abandon a 5 GHz product when we discovered that an appreciable number of customers would take it home and not be able to make it work reliably - it would have been a support nightmare.)
Not all IoT products support 5 GHz - some of mine say they don't (Nest Protect) and some say they do but don't work (Nest thermostat, Netatmo). The only way I could get a stable IoT network in my apartment was to set up a 2.4 GHz guest network and put all of them on it.
I live in a relatively modern house (<15 years old) and the 5 GHz signal doesn't make it up the stairs. 5 GHz is 'same room' wifi, and I have problems with it. It's the same with mobile 5G data: it needs many more towers/transmitters to be effective.
I've hit 'forget' on my home 5 GHz network because it's just unstable upstairs, and my phone isn't smart enough to pick the network with the stronger signal for some reason.
depends on whether it's used for internal walls, chicken wire as part of plaster on external walls alone would help keep the neighbor's interference down
Older chicken wire for internal walls (ie pre-drywall) is the problem at 5GHz (and I'm sure to a lesser extent at 2.4GHz)
I would love it if those down-voting this comment would explain why. Is this comment untrue? How? Does not metal siding end up grounded, however weakly, by rain downspouts anchored to the side of the house and contacting the ground? Does not a relatively contiguous grounded metal sheet around the exterior of a building attenuate 2.4 and 5 GHz?
I'm probably romanticizing and projecting, but I imagine that scenario going something like this:
* lrvick knocks on guilty neighbor's door.
* guilty party's non-technical brother answers
Brother: Yeah?
lrvick: Hey, I notice you guys have an antenna on the east side of your roof pointed at my house, is that yours?
Brother: Huh? No, that's my sister's.
lrvick: Is she around? It's messing with my Netflix and I'd like to talk to her about it.
Brother: Oh, um, yeah. hold on...
* Brother leans away from door into house and shouts inside, "Hey sis! Neighbor at the door for you!"
* lrvick waits.
* Guilty sister walks up to door
Sister: Can I help you?
lrvick: Is that your antenna on the east side of your roof?
Sister: Yes.
lrvick: So, the past month, someone's been messing with my wifi with deauth attacks.
Sister: *mildly indignant* I'm not doing that.
lrvick: Well, the past week I spread a bunch of sensors around my property and a couple of the neighbors' and logged the 2.4 GHz spectrum for the area. I correlated everything and triangulated the attacks to that antenna of yours, which just so happens to be aimed at my house. Here's the logs and spectrum heat maps if you'd like to see.
* lrvick offers Sister the printouts he brought with him.
* Sister takes them, looks them over, and starts to get that embarrassed flush in the cheeks and ears characteristic of someone who knows they're busted.
Sister: I mean --
lrvick: I'm actually not even mad. It's kind of illegal, but it's also ballsy and I respect that. But I'm not the kind of person to go snitching to the FCC first thing. You do know if we're competing for spectrum, you can come talk to me, and we can coordinate a plan over a couple beers, right?
* Three months later lrvick and Sister are co-presenting a talk at the local hackerspace on counterattacks to deauth attacks.
I can't imagine it's the high number of (IoT) clients that is the problem - it is the high power that the APs use that causes the issue.
WiFi goes two ways - the AP talks to the client, and the client has to talk back. Clients don't have the high transmitting power that an AP has, and the AP doesn't actually need this high power either. After all, if you use a bullhorn to reach your backyard but the guy over there can't talk back, there is still no communication.
The solution would be to set the APs to a lower transmitting power: enough to cover the house, but not the street.
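A quick link-budget sketch of that bullhorn effect, using free-space path loss and made-up but plausible figures (a 20 dBm AP, a 10 dBm IoT client, 30 m apart; real walls add more loss on top):

```python
import math

# Free-space path loss in dB at d meters and f MHz:
#   FSPL = 20*log10(d) + 20*log10(f) - 27.55
def fspl_db(d_m, f_mhz=2437.0):  # 2437 MHz = channel 6 center
    return 20 * math.log10(d_m) + 20 * math.log10(f_mhz) - 27.55

def rx_dbm(tx_dbm, d_m):
    return tx_dbm - fspl_db(d_m)

ap_heard_at_client = rx_dbm(20, 30)  # AP shouting at 20 dBm
client_heard_at_ap = rx_dbm(10, 30)  # client answering at 10 dBm
# The uplink arrives 10 dB weaker: cranking AP power past what the
# clients can match buys range in one direction only.
```

Path loss is symmetric, so the excess AP power does nothing for the uplink; turning the AP down to roughly match client power mostly costs you interference radius, not usable coverage.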
That would be the ideal solution, but they'd have to adjust tx power on all of their devices. Or at least a significant number of them. Which might prove problematic, since most of them don't allow manual adjustment.
Every light bulb in my home is on Wi-Fi and has its own IP address. And then several outlets have their own IP address. Door locks. Robots. My Christmas tree had an IP address when it was up. My air purifier. My air quality monitor. Security cameras. Lots of things. Seems pretty normal.
I think it's safe to assume that 80+ wireless devices puts you in the top few percentiles. So while not unheard of, not "normal" exactly. And even less common in dense apartment environments where having more than 3-4 rooms is unusual.
10 window/door sensors, 5 cameras, 10 radiator thermostats, 20 lightbulbs and a few other things now that you've started, and you're already at 50 while "only" doing the "basic" smart home.
I've heard of at least a handful of people that have done this and more this year alone.
Yeah, it obviously does happen from time to time, but is it normal? My anecdotal data point: 3 devices, all isolated on their own AP with no way to talk to each other or the rest of the world. 50 is a lot. 120 is just insane. :)
I struggle to figure out how to reach 120 devices, but I can easily see 60-70 if you want a smart home. 3-4 lights in each room, a thermostat, air quality sensor, smoke alert, window sensor, door sensor and a camera. 10 sensors in one room. It only seems like a lot because it's not mainstream yet.
Yeah, that might just be me. What you are describing sounds completely alien to me. I still struggle with comprehending why people insist on bringing the basic stuff online -- my security cameras should not require Internet access -- but I guess there's a reason I live in an old house on the countryside, and not in the middle of a city. :D
Cameras in each room? It's a privacy nightmare and I frankly don't trust my equipment, my network, nor myself to set that up safely. Especially if the cameras I own are any indication of how the rest work (hard-wired default passwords, security flaws, upload of FTP credentials to some Chinese business' servers, constant attempts at leaking data through DNS lookups when regular traffic is blocked, etc).
Thermostats and humidity sensors make sense if each room has its own heaters/air conditioners, though.
Shitty design. Doorbells should never need batteries. Even a tiny, tiny solar power panel and a method of power storage would be able to keep it going indefinitely.
Our door bell harvests the kinetic energy from being pressed and uses that to generate the trigger signal for the base-station. Sadly it fails if the push isn't firm enough, it appears not to have any sort of battery.
I could imagine securely setting up cameras through a house.
Of course, the first thing I would make sure is that they don't connect to the internet or any wireless network. So I can't imagine the ones the GP is talking about being anything other than a privacy nightmare.
By the way, what happened to data over electricity wires? It's the obvious way to set up all the home automation stuff.
There are quite a few cameras that record to a device on the internal network - put them on a VLAN that has no outside connection and you have it all. That is quite an advanced setup though and it's good practice to save the recording at another location as well.
LAN over electricity wires is a proper pain in the ass. It's not like you can plug a "router" into one outlet and have network in all the others. It only reaches the circuit group that particular outlet is on, so it won't cover the house unless your wiring is laid out in a particular way, which existing houses aren't, since there's no established standard for it.
Well, what is stopping people from creating switches that go into the distribution panel? They could even fit as circuit breakers, so no change at all would be necessary.
One would need a large diversity of products to make it work, which puts it out of reach of any small manufacturer. But why the large home automation sellers insist on their "you must add this huge panel, pass all those wires around, and can't retrofit any new device once the layout is settled" technologies is a mystery to me.
Hmmm. My two bed 120 year old house has 9 rads and 28 bulbs. It's not what I, as someone in the UK so used to smaller houses than the US, would call, _large_.
I've got to ask, although I'm afraid I already know the answer: how do you do patch management with that many devices? I've come to the point where I'm running about 12 VMs, and patch management has become a bother, so I'm curious how people deal with this stuff when their light bulbs are tiny little servers.
Having said that, if I had the money, I'd totally automate everything in my house as well. It's just too much fun not to.
80 IoT devices seems like a little more than normal, but I wonder how many devices the average person has without really thinking of them as IoT devices.
Some of the IoT devices you list I'd actively avoid, but things like robot vacuums are really useful.
The kitchen appliances are a little weird. People keep claiming that it's impossible to avoid, but I have yet to see any of them in stores. Not even a gimmicky smart fridge is available.
Hey, it is available. At 20,000 DKK ($3,235) and in that double width, I don't see many buyers, regardless of whether it's IoT or not.
Interestingly, at one of the largest Danish chains it's listed under "American fridges", and most of those DO have wifi. If you just pick "Fridges", then none of them have wifi.
A lot of modern designs have flat controls with no tactile cues that renders them unusable for the blind. Being able to connect with an app using VoiceOver makes the device accessible.
Even non iot kitchen tech is ridiculous. I tried to buy an oven with as few features as possible. I foolishly assumed that two knobs meant “mode” and “temp”. Nope. It has loads of features which you access by clicking and holding and twiddling one while opposite twiddling the other. Smeg.
They have backup batteries and form their own mesh network which works independently of the main WiFi.
If the central unit disappears, the detectors still all go off if one is triggered. (Central unit offers app interface and optional 4G fallback network connection.)
I graduated from Software Engineering in 2003, and recently worked for Canada's biggest Telco for 4 years.
I didn't even know half of the IoT devices on your list existed!
I love your love of automation. One question: is it harder to, say, set up one controller with an IP address and WiFi to control a set of dumb light bulbs, rather than multiple IP-enabled bulbs?
IP-enabled bulbs (or rather, 2.4 GHz-controlled bulbs, by whatever protocol) have a bit more nuance to them. A controller could at best hope to control a dimmer hooked up to the light bulbs. I've set up my own apartment with the IKEA LED bulbs, which is great for more individualised lighting (I use the colored LEDs to provide a dim red light in the evening to simulate sunset, more closely aligned with my actual sleep schedule).
There are non-IP full-color LED bulbs like this Zigbee version[1] which have the same flexibility as the WiFi models. Color and brightness can both be configured dynamically through the controller in response to arbitrary user-defined rules.
My IKEA models are not Wifi either and have a lot of flexibility (and the gateway is offline-first too). I can understand the demand for IP, since it probably simplifies a lot of things about the software stack (and licensing).
There is a wireless LAN in my space, all devices that need connectivity should just use that LAN.
More hubs are more headaches. You need to not lose their power adapters when moving; you need to find a place to put them that is within range of all your devices (which may not be as easy as with Wi-Fi, since they may use other bands that don't propagate as well or have FCC restrictions, and while I already have a Wi-Fi mesh network everywhere, you can't easily mesh arbitrary protocols on arbitrary frequencies); you need to set them up using, typically, a shitty spyware phone app; and you need to keep their firmware updated in addition to device firmware. And you also need a hub for every brand of appliance you buy. A Philips Hue hub will not work with some other brand of lights. Whereas you can mix and match hub-less lights, and you don't need a pile of hubs on a shelf somewhere.
I imagine that powerline networking for lightbulbs would be ideal. Note the serial numbers, screw the bulbs in, plug controller into an outlet somewhere, assign bulbs to groups/rooms by serial#.
But knowing the state of IoT they'd probably take 3-5 seconds after turning on to negotiate connection since the power had been cut, and they'll default to blinding white at full brightness during that time.
That's the thing: Zigbee isn't 802.11/wifi based. It IS a radio (802.15.4 underneath), and it has really great range for low-bandwidth purposes like turning a light on or off. It runs at 2.4 GHz worldwide, plus sub-GHz bands (784/868/915 MHz, depending on country).
I don't feel the need to have a single "smart" light bulb or outlet. If I'm turning a light on or off, it's because I'm leaving or entering the room, in which case I'm walking by the light switch anyways.
What I got from this article was "Person using WiFi complains about another person using WiFi".
I agree with your sentiment. It requires communication. De-authing is not going to solve the problem; it will just make enemies if they work out what the OP is doing.
The poster on devrant has joined the discussion and it seems like this is exactly what has happened: they spoke to their neighbours, and their neighbours have refused to make any changes.
It does seem like getting some 5GHz hardware would be a relatively straightforward solution but the poster is unwilling to do that. My take is if you want good quality home networking then you need to be willing to pay the price to get that within the constraints of where you live: sometimes that means spending more money than you might like. Such is life.
To me, the problem seems to be that those devices use too many channels, continuously?
I recently tried checking some TPMS sensors that I bought with rtl_433 and was amazed by how many sensors can get along just fine in my 10-story apartment building, but those messages are really short and only transmitted once every couple of minutes.
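A back-of-the-envelope duty-cycle estimate shows why those sensors coexist so well (the per-message airtime and reporting interval here are assumptions for illustration, not measurements):

```python
# Short-burst 433 MHz sensors (TPMS-style): tiny messages, long gaps.
msg_airtime_s = 0.010   # assume ~10 ms on air per message
interval_s = 120.0      # assume one report every two minutes
sensors = 200           # a whole apartment building's worth

busy_fraction = sensors * msg_airtime_s / interval_s
# ~0.017: even 200 sensors leave the channel idle ~98% of the time
```

Compare that with Wi-Fi IoT devices, which beacon, poll, and keep associations alive constantly; the duty cycle, not the device count, is what matters.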
What does that gain exactly? The 5 GHz band is so crowded around my apartment that most devices can't connect at all. Is there some fundamental reason it should work better?
There are many more WiFi channels in the 5GHz range and the signal drops off faster, so you should have both fewer APs "visible" and more room for them to spread.
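The "drops off faster" half of that is mostly the frequency term in the path-loss formula; a quick free-space check (interior walls then add several extra dB each at 5 GHz):

```python
import math

def fspl_db(d_m, f_mhz):
    # Free-space path loss in dB: 20*log10(d) + 20*log10(f) - 27.55
    return 20 * math.log10(d_m) + 20 * math.log10(f_mhz) - 27.55

# The 5 GHz penalty over 2.4 GHz is a constant offset at any distance:
penalty_near = fspl_db(10, 5500) - fspl_db(10, 2437)    # ~7 dB
penalty_far = fspl_db(100, 5500) - fspl_db(100, 2437)   # same ~7 dB
```

On top of that constant ~7 dB plus wall losses, 5 GHz has roughly 25 non-overlapping 20 MHz channels in the US versus 3 at 2.4 GHz, so neighbouring APs both fade sooner and have far more room to spread out.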
The mad scientist in me says to use a microwave magnetron pointing at the neighbor's house to fry all their 2.4 GHz stuff.. Could that work in principle?
Not quite the same, but Serbian forces used microwave ovens with the door interlocks defeated as cheap decoys for anti-radiation missiles in the Kosovo war.
On the other hand, somebody so uninformed on the fundamentals of half-duplex wifi that they would choose to use mesh network extenders everywhere all on the same channel, is a person that I would rate as highly unlikely to listen to a logically reasoned technical explanation.
I'm not saying "they literally need to know the FCC regs" - but totally uninformed on how to not step on yourself in a half duplex wifi medium in the same channel with all your equipment, yeah.
Please explain to me when a hypothetical person who decides to install a bunch of IoT things would ever refer to a document that explains how to do this? To me this sounds like a classic case of "because of a twist of fate and my hobbies/job/past, I happened to do a deep dive on wifi fundamentals at some point, and since I now find it easy and obvious, I can't fathom why people with different hobbies/jobs/pasts don't know the same things I find obvious!"
Somebody who has literally 140 discrete wifi based IOT devices yet has put them all on the same channel and is further using half duplex mesh repeaters also on the same single channel, is probably the sort of technology early adopter/enthusiast who thinks they know what they are doing, with a high degree of confidence, but actually doesn't.
I have to admit I am biased by doing some microwave band rf engineering stuff professionally, so my perspective on it is perhaps a bit more critical than others. You could say I've seen how the sausage is made...
Maybe. Or maybe it's someone that paid for a complete remodel and some contractor sold them on making every light and outlet and appliance "smart" along with some more we can't even think of, because that's all extra money to them, and then they throw whatever else they can find at it to make it work because really, they haven't ever put so many things on one network before.
I've heard crazy stories about home installers and contractors that didn't really know what they were doing. Everyone starts somewhere, and it's not always with our own home nor with a mentor that actually knows what they are doing.
Or, it's just some person that did it for their own home but went way too far and when they finally got it working (sort of) decided they really didn't want to touch it again for a while, because it was a pain.
Truthfully, either of those sound kinda likely to me.
I don't think the IoT devices will care that their download speed is somewhat reduced and that they'd have to wait a few ms to get access. These devices hardly transmit or receive any data. Likely a lot of switches, wall outlets and some sensors that you'd want a reading from every minute.
The number of clients should not be an issue, the half-duplex shouldn't be as well...
I'm definitely at least that uninformed. I'd like to think I'd listen to a logically reasoned technical explanation. I really don't understand how wifi works and I know this. Sometimes I wish I did, but not quite enough to do anything about it.
Wi-Fi 6E would solve this for you. 1,200 MHz of bandwidth is now available at 6 GHz. Wi-Fi 6E routers exist now. Here[1] are some devices that are Wi-Fi 6E capable.
It's an expensive, possibly inconvenient solution, but from the perspective of RF spectrum it has a high probability of success; your 2.4 GHz neighbor is unlikely to appear at 6 GHz for some time, and even then the reduced 'range' of 6 GHz will work in your favor.
We're currently running into the same problem with 6E as we had with 5 GHz.
All of those dinky cheap shit little IoT devices DON'T USE IT. They're all on 2.4 GHz.
5 GHz promised more spectrum and more usability and mostly gave that to us. Not quite as good range, but it's great... IF we replaced everything we had that used 2.4 GHz.
We're still waiting for legacy devices to die... and for new devices to stop using 2.4 GHz before that can happen.
Generally, the 2.4 GHz problem affects everyone and everything. 11 channels, only 3 usable without overlap, and that's before you consider other people's routers AND other unlicensed equipment on the same spectrum that isn't even wifi... I have had issues with my smart switches not responding at certain times of the day because of the interference... and what I described before still remains: 5 GHz doesn't have as nice a range, and I'm not exactly up for the idea of chucking out what I have just to buy it again for 5 GHz or 6E.
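The "11 channels, only 3 usable" arithmetic falls straight out of the 2.4 GHz channel plan; a sketch assuming the classic 22 MHz 802.11b channel width:

```python
# 2.4 GHz channel centers: channel n sits at 2407 + 5*n MHz (US: n = 1..11)
centers = {n: 2407 + 5 * n for n in range(1, 12)}

def overlaps(a, b, width_mhz=22):  # classic 802.11b channel width
    return abs(centers[a] - centers[b]) < width_mhz

# Greedily collect mutually non-overlapping channels, lowest first:
clear = []
for ch in range(1, 12):
    if all(not overlaps(ch, c) for c in clear):
        clear.append(ch)

# clear == [1, 6, 11]: only three channels fit without stepping on
# each other, which is why everyone's routers pile onto those three.
```

Channels are only 5 MHz apart but each transmission is ~20-22 MHz wide, so adjacent channels bleed into each other; 1/6/11 is the only full non-overlapping set in the US plan.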
There is literally no benefit, if you are this person, to acknowledging it's you. From their perspective you might be a well-informed tech person or just crazy, but either way you don't want to be dealing with them every time anything goes wrong with their wifi.
And it's pretty obvious the person ranting doesn't feel this problem rises to the level of "spend $50 to get a 5 GHz router and dongle".
I think he mentions trying 5 GHz somewhere in the thread, but his house's internal construction interferes with the signal. It is a little odd how long it takes for someone to suggest that, though.
Something like a TP-Link AC-1200 plus a $10 AC dongle should give you pretty good speeds for around $50 total.
I'm not skeptical of people being able to get a 5 GHz router setup for that price, but I am skeptical that everyone can make 5 GHz work. There are still devices that are 2.4 GHz only and some apartments where 5 GHz doesn't cut through the walls correctly.
The neighbour definitely has problems with his wifi as well, so fixing the issue would be beneficial for everyone.
Step 1 would be to use APs that are connected with ethernet instead of Mesh-type APs. That would greatly reduce the amount of data that needs to be transmitted (with mesh, your signal needs to be repeated multiple times. With ethernet it's just transmitted once)
Step 2 would be adjusting power output of each AP. If the neighbour has multiple APs, they probably don't each need to run on full power.
Step 3 would be looking for devices that transmit a lot of data. I assume most of the 120 devices don't cause a lot of traffic. Maybe there's just a few that cause 90% of the traffic. Maybe the most traffic intensive devices (eg. camera) could be replaced with a wired connection.
I'm assuming the neighbour doesn't know anything about how to optimize Wifi networks, but OP sounds knowledgable, so maybe they could work together to fix the issue.
It can be a Win-Win situation: OP gets a usable 2.4GHz channel, Neighbour also gets better connectivity.
I've had similar problems in my neighbourhood. I honestly can't find out who it is... I kinda know the direction, and have tried talking to each of the units, and everyone denies it's them.
First - that's definitely super frustrating and I'm sorry you're dealing with it.
It's possible that the person who's causing the problem doesn't know they're the problem. Like, if the underlying cause of this problem is "not super techno-savvy" then they might not realize what they're doing :)
Yeah, I don't think I'd know if I were doing this. I really don't think I am, but default settings and shitty proprietary IoT devices make me unsure...
If he does this, and the neighbor decides not to do anything, OP can’t run de-auth attacks later on to free up the network, or he will be immediately suspect.
That might work, but also, it's possible 120 devices is simply too many for one person to be using in an apartment complex. Wireless spectrum is a shared resource, it's not okay to hog it.
If not 120, imagine someone with 200 or 1000 devices. At some point you've eaten more than your quota of spectrum and you owe it to your neighbors to stop hoarding IOT devices.
Edit: heh, not believing in scarce resources is peak Orange Site.
In an ideal world that might be true, however, in our world currently there's no such thing as "your quota of spectrum"; wireless spectrum is a shared resource but people are allowed to hog it. FCC does not prescribe that the spectrum usage should be shared equally or equitably; within the allowed limits and without intentional harm (e.g. deauthers) you're each allowed to try and use as much as you can. The person running 120 or 200 or 1000 devices is entitled to assert that they don't, in fact, owe anything to their neighbours - on the other hand, they would be entitled to protection and FCC intervention if the neighbours attempt to deauth their devices as suggested in other comments. You're welcome to try and hog the shared resource yourself, but preventing others from hogging a lot of the shared resource is illegal.
Sure, there are scarce resources; and what we're seeing is the expected results of the tragedy of commons for a shared resource.
Depending on where you live, this could be something you could report to your local radio spectrum regulator.
Usually there are regulations that state your radio devices can't interfere with someone else's. It doesn't matter if they are off-the-shelf type approved devices, you still have to operate them responsibly.
I live in Ireland, and there have been a few cases where the local regulator, ComReg, has intervened when someone's wireless networks and devices affected others.
I doubt this would get anywhere. Usage of the spectrum as designed is not "interfering", and fairness at the physical layer/medium access level works at the per-device, not the per-household or per-person level.
There isn't even a requirement for "802.11-like fairness" on 2.4 GHz, which makes legacy analog wireless headphones that blast the entire spectrum without any listen-before-send policy as legitimate to use as a state of the art 802.11ax device.
I wonder if a dedicated "Wi-Fi only" chunk of spectrum would make sense, but it's not like such rules couldn't be bent by having multiple access points and spreading traffic over these as well.
I live in a densely populated area. I see 250 distinct SSIDs belonging to about as many APs. 2.4GHz is basically unusable. People try to increase power of their emitters (I see some weird devices that emit way more power than they should legally be able to), but this is only making the problem worse.
Our flat is top of the building, we can assume middle is even worse.
We use 5GHz for our WiFi needs which does not penetrate through walls as readily which means we only see 3 other APs.
Why doesn't anyone make a variant of drywall that has copper mesh embedded in it? You could theoretically make it thin enough to cut with almost the same level of difficulty.
You would only want the shielded variant on your exterior walls & upper floor ceilings. All you would need to do is make sure you clearly label/color the shielded vs non-shielded drywall appropriately so the construction crew doesn't get em mixed up.
I never really understood why IoT devices don't communicate via the power cable; at a guess it's because the hardware needed is too bulky or more expensive than wifi.
In Europe we have internet providers who 'helpfully' send you boxes you plug into your wall sockets and allow you to use your electrical infra as network cables. Lovely concept, you don't have to run your own network cables.
Of course, proper network cables are twisted in pairs and shielded for a reason: so no signal gets in, but also next to none comes out. All the copper in your house essentially acts as one big antenna if you use these powerline communication boxes. You would not believe what the spectrum looks like next to one of these houses. Ask any ham operator; they have a deeply rooted hatred of these things.
Thank goodness they seem to be slowly dying out, since powerline communications doesn't seem to be able to keep up with increasing broadband speeds. As it turns out, you need proper network cables to get halfway decent speeds.
Just doing quick mental accounting of my devices: leak sensor, a few door and window sensors, ceiling mounted monoxide and fire detectors, a motion detector… none of that I'd really want to run the mains to. And seriously, if I had to run cable to those I'd rather much have 2-wire Ethernet with PoE.
The fridge, washer and dryer have integrated IoT but there'd be little point running just these over power lines.
Duh! To run a peer to peer decentralised, compression algorithm Which beats the Weissman score records and which can double up as P2P file hosting solution using PiperNet.
Surely the solution with someone with that much kit is to turn the TX power wayyy down. I guess it depends how close your neighbours are too, and what kind of building they have.
I actually tried coordinating an effort like this with my neighbors when I lived in an apartment complex. That effort very quickly died, for most neighbors not knowing much about their wifi other than that the box with blinking lights made the Internet go, and not a single stock firmware I encountered exposing a TX power setting anywhere in the configuration interface.
Hopefully someone with that much kit would be the kind of enthusiast who'd be running firmware that allows you to turn TX power below 100%, but it could just as likely be someone who really likes being able to set their light bulb color from their iPhone.
I put up an AP at the other end of my apartment from where my router sits, chose non-overlapping 5GHz channels with no interference from neighbors, tuned the transmit power so the signal level from each sits at around -60dBm where they overlap, and got everything 2.4GHz-only either wired with ethernet or replaced.
Now I just wish my neighbors would do the same, but I am not going to be the one to suggest it. If I fix up their networking, I will forever be on the hook if anything goes wrong or they face intermittent outages. Been there, done that, didn't even get a t-shirt. Unpaid network janitor is not something I need to add to my resume.
Perhaps one could make a living as a network tech for home users, but I feel like people either don't want to pay for it, or don't care because it's "good enough".
Exactly! That's what it was supposed to be. A wifi-neighborly approach: have your local coverage at the minimum needed power and with lower pollution. Except that all wifi devices come factory-configured at an increasingly high 'medium' power setting.
I had fun with just about the worst-case scenario - 20 m away is a glass condo tower right in front of my windows (the list of loud networks is endless), and 100 m away is a cell tower. The RF interference from the cell tower is so bad across all bands that I could hear it even in radio speakers. Yet I still get steady browsing speeds over my wifi (loud as it is), just not practical NAS speeds, so wire does it.
On the contrary, turn 2.4 way up, and only use one channel on one AP, not mesh. Then 3 homes can co-exist.
By contrast, for 5 GHz, turn it to minimum and use one AP per room with wired (Ethernet or MOCA not Powerline) backhaul, and ideally APs with smart DFS.
Yeah, this is what I do. I have 3 WiFi APs, and obviously they are on different channels (though one of these is 5 GHz only). But I have the power set to minimum. This actually helps proper roaming around the house.
My apartment had so many issues/complaints that they just had the local ISP install access points in each apartment. It's one network, but it has VLAN segmentation and a per-apartment PSK. This way the routers are much smarter at time sharing, power levels for 2.4 GHz are very low, and you can roam anywhere in the complex.
Of course I run my own network and do those steps myself (they were nice enough to give me a port with a public IP). Ethernet is still king for anything critical, but WiFi seems great despite there still being 35+ SSIDs around.
TBH, most SSIDs you see have so little traffic 90% of the time that I can't imagine they decrease the SNR that much (e.g. printer setup networks...). My guess is their neighbor in this case has a lot of traffic besides the IoT. Number of devices and heavy usage just tend to correlate, for obvious reasons.
Zero traffic networks do put a bunch of congestion load on the network, because the various Auth/advert frames are transmitted at 1Mbps with very legacy modulation and framing. That ends up using a lot of channel time.
One of the first things everyone should do on a new network setup is disable 802.11b; this raises the minimum modulation rate for beacons from 1 Mbps to 6 Mbps, which helps a little.
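On a Linux-based AP this is a couple of lines of hostapd configuration - a sketch, assuming you run hostapd and per its documented hostapd.conf options (rates are in units of 100 kbps):

```
# hostapd.conf fragment (sketch): OFDM-only 2.4 GHz, no 802.11b rates.
# 60 = 6 Mbps, so beacons/management frames go out at 6 Mbps minimum.
hw_mode=g
supported_rates=60 90 120 180 240 360 480 540
basic_rates=60 120 240
```

Consumer routers usually bury the same thing under a setting like "802.11g/n only" or "disable legacy rates".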
I wish Zigbee didn’t pollute the WiFi frequency space either. Looks like Z-Wave is my only option, despite it being an inferior, closed protocol. It looks like the home automation community prefers Z-Wave for critical, high availability devices for this reason.
I heart z-wave. After having worked for a company that was heavily gaming the IEEE standards process I have lost faith in many 'design by committee' protocols. I use Z-Wave at home and it works great. Low power and reliable.
Oh and HDMI as well. I spent at least a year of my life working out interoperability issues for a major DVR mfg. If only HDMI certification had been controlled by a single company...
Not all standards are made like UTF-8, i.e. commissioned last minute by a committee, written by two guys over a weekend, and adopted by the committee the week after. Then used by everyone for decades with little to no change.
Z-Wave does not require AES-128, it can't even handle multiple messages that well, and it gets congested very, very easily. The more Z-Wave secure devices you have, the more easily your network gets congested, and god help you if every device also reports its power usage. If you want to remove a device from the network to move it to another network, you often have to go through an explicit de-join process rather than just starting a new join with something else.
It seems like you know this but just to be explicit; not a damn thing. MQTT rides over TCP, this is like suggesting TLS to fix ethernet broadcast storms.
I do think the suggestion of switching to MQTT nicely illustrates the problem though; even relatively technically inclined people frequently have absolutely no idea how their iot devices work.
ZigBee isn't proprietary; it's an open standard on top of 802.15.4. MQTT is for command-and-response dispatch; communication with the devices still takes place over a network like, e.g., Z-Wave, ZigBee, or WiFi.
The problem is that 2.4 GHz is noisy as fuck, regardless of who's making the noise or how. Z-Wave is only better because it uses 900 MHz.
For more than 10 years now I've been able to see over 50 APs from my room in my 12-floor building, and that was even before IoT got so popular. Wifi has never been reliable for much more than browsing or chatting.
I've ruminated over 2.4 GHz spectrum contention for a long time now, and, while I claim nothing more than ignorance on the physical side of things, it's not obvious to me why we have moved to bonding ever larger blocks of that frequency allocation together when, in my experience, contention has been an issue for 15 years. As far as I can tell, most applications of ambient connectivity are fairly low throughput: push notifications, MQTT, and whatnot. The 'original' 5 MHz channels should be more than adequate.
From a theoretical perspective, using double the bandwidth and half the transmission time causes the same congestion, as long as everyone uses the same strategy.
That's not quite true with CSMA-CA. The minimum medium acquisition time is effectively a constant, no matter the bandwidth. Wider channels mean you've got the same duration of dead time across more channel bandwidth for each transmission. In fact it's worse than that - wider channels mean more senders compete during each medium acquisition, so collisions are more likely, further reducing capacity. The use of A-MPDUs in 802.11n (and later) goes some way toward mitigating this, but our measurements show it doesn't do a great job in practice with many workloads. In the end, if things are really congested, you'll get more total capacity with narrow channels, at the expense of not being able to burst at high bitrates when things are quiet.
Well, if you want to argue from a theoretical point of view, the fundamental constant determining the CSMA-CA slot length is the speed of light within reception range, which is effectively constant, and the requirement to demodulate the carrier signal to perform carrier sense, which also has a fundamental time constant related to carrier frequency. Of course DIFS in 802.11 is much larger than that fundamental limit, because carrier sense needs to be implemented in real low-power hardware.
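The "constant dead time" argument is easy to sanity-check with a toy airtime model (my own back-of-envelope numbers, not measurements): per-transmission overhead (DIFS, average backoff, preamble) is roughly fixed in time, while the payload airtime shrinks as the channel gets faster/wider.

```python
# Toy CSMA-CA airtime model: fraction of channel time spent on payload
# vs fixed per-transmission overhead. Numbers are rough approximations.
DIFS_US = 34            # DIFS, microseconds (approx, 802.11n 2.4 GHz)
AVG_BACKOFF_US = 67.5   # mean of 0..15 slots * 9 us each
PREAMBLE_US = 40        # PHY preamble + header, rough

PAYLOAD_BITS = 12_000   # one 1500-byte frame

def efficiency(phy_mbps):
    """Fraction of airtime that carries payload at a given PHY rate."""
    tx_us = PAYLOAD_BITS / phy_mbps
    overhead_us = DIFS_US + AVG_BACKOFF_US + PREAMBLE_US
    return tx_us / (tx_us + overhead_us)

# ~72 Mbps single-stream on 20 MHz vs ~150 Mbps on 40 MHz (rough figures)
print(f"20 MHz: {efficiency(72):.0%} of airtime carries data")
print(f"40 MHz: {efficiency(150):.0%} of airtime carries data")
```

The wider channel moves more bits per second when it wins the medium, but a larger fraction of its airtime is dead time, which is the effect described above (and this model doesn't even count the extra collisions).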
I live in an old, three-story brick house from the 1930s. It's small, 100 m² (1075 sq. ft.). The 5 GHz from my Airport Express can barely reach the kitchen. My solution is to put additional APs in the house, but with low transmit power - I use 25 mW personally. But then again, the chances that the neighbor will agree to this are slim to none.
I am willing to cede the entire 2.4GHz spectrum to IoT devices. We can use 5GHz and 6GHz for full-fat internet devices (TV, laptop, phone). This is what I've done at my home.
A single AP is useless with 5GHz. You should look into mesh systems, and try to get one with a wired backhaul for even better coverage. I splurged a bit on my mesh, but I'm glad I did. I have never experienced Wifi coverage so good in my life.
It will be interesting to see if the next waves of wifi products solve this. For Wifi 6 and HaLow there's a new feature called 'target wake time' that will let these devices sleep (and not pollute the spectrum) for longer.
Wifi 6 also brings OFDMA, which lets stations use much less of the channel at a time (instead of a 20 MHz+ chunk they can use just 2 MHz while other stations use the rest). 2.4 GHz being stuck with old Wifi 4 (or worse) devices hasn't helped the situation.
Turning down the transmit power of the AP might help in this situation.
I use the 2.4GHz band only for legacy/unimportant devices so I've turned its transmit power to the lowest setting available. All of the legacy devices are in the same room with the AP so this is not a problem in my case.
Since 2.4GHz penetrates through walls quite well this has the added benefit of limiting the radius of the coverage and to some extent improving security. But this is not a perfect solution (it's more like an unintended side-effect).
The 5GHz band has lower penetration through walls so leaving it on highest transmit power should be fine.
It definitely helps if you have space/property. If you live at relatively close density (apartment, zero-lot-line homes) it very well may not.
I have some RTSP cameras. To keep them from being dumb and randomly roaming between APs (overloading the antennas and causing breakups in the stream), I set up a different SSID on each AP for them, so I can control which AP each camera uses and prevent bad roaming decisions.
This also means I have an SSID on each non overlapping 2.4 GHz channel.
However I live on a couple acres. If I’m interfering you are trespassing.
I probably would take a different approach in a higher-density area, such as getting an AP with 2 dual-port, dual-polarity antennas dedicated to 2.4 GHz and sticking them all on the same AP dedicated to the task, also assuming I would need to cover fewer doors and RSSI wouldn't be an issue.
As someone with an irresponsible number of IoT devices, I try to hardwire everything to avoid being 'that asshole' on the block. I have my wifi decluttered down to just my phone, my work laptop (when I bring it home), my boyfriend's phone, and two Bluetooth wireless headsets for our two computers.
I have 3 music receivers (for Android casting), 7 TVs, a weather station, coffee maker, fridge, Google Home controlled lights (40ish), and other stuff all hardwired. A lot of it I had to modify or build the controllers myself to get off wifi. The lights all run off PoE.
Reading the comments under the article made me wonder when I'll become the bad neighbour and suffer deauth attacks without knowing it. I have two APs and 3 SSIDs: one 2.4 GHz for my 15 IoT devices, a 5 GHz on the same network, and a separate 5 GHz for "serious" devices that I use only for work. However, I live in an apartment building and I have both the most devices and the most SSIDs. I've pondered before whether I'm causing inconvenience to someone and how to solve it if complaints ever arise.
If you want to be a good neighbor, disable support for "legacy" rates in your access points. The periodic beacon frames are transmitted at the slowest supported rate so that the least capable client devices can receive them. For 802.11b, that means something like 1Mbps. At such a low rate, the beacon frames consume a non-trivial amount of bandwidth. Each access point you see could be consuming over 3% of the channel. Turning off 802.11b/g support (if you can) will significantly reduce the cost of each access point broadcasting. If you're really nice you could decrease the frequency of beacons, but that's not really worth it due to buggy clients sometimes not coping well.
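The "over 3% per AP" figure is easy to ballpark (my own rough numbers, assuming a ~300-byte beacon every 102.4 ms, the default beacon interval, with the long 802.11b preamble):

```python
# Rough beacon airtime: frame bits at the lowest basic rate, plus
# PHY preamble, repeated every beacon interval.
BEACON_BYTES = 300          # typical-ish beacon frame size, assumed
BEACON_INTERVAL_S = 0.1024  # default 100 TU beacon interval
PREAMBLE_US = 192           # long 802.11b PLCP preamble

def channel_share(rate_mbps):
    """Fraction of the channel one AP's beacons consume at a given rate."""
    airtime_s = (BEACON_BYTES * 8) / (rate_mbps * 1e6) + PREAMBLE_US / 1e6
    return airtime_s / BEACON_INTERVAL_S

print(f"beacons at 1 Mbps: {channel_share(1):.1%} of the channel per AP")
print(f"beacons at 6 Mbps: {channel_share(6):.2%} of the channel per AP")
```

At 1 Mbps each AP eats roughly 2.5% of the channel doing nothing but beaconing; 30 visible APs and the channel is mostly gone. At 6 Mbps the per-AP cost drops several-fold, which is why disabling legacy rates matters.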
The channels around my house are so busy that in order to make wifi reliable, I have three access points, each one on its own non-overlapping 20 MHz 2.4 GHz channel. Two of them serve 80 MHz channels at 5.8 GHz as well. I have them all set to the same SSID, so that my devices just choose whatever they think has the best signal strength. I also ran Cat 6 to my TV and desktop so they don't need to go over wifi. I disabled 802.11b/g on all but one of them to keep down the overhead from all the beacon frames.
This is why the Comcast SSIDs everywhere are so annoying. They make the spectrum tangibly worse just by existing. Same for wifi printers that broadcast their own AP. Something like 30 APs sending beacon frames is enough to completely trash a 20 MHz channel at 2.4 GHz.
Perhaps in scenarios like this, people will need to be responsible for putting up a basic Faraday fence around their property in order to avoid polluting the neighbourhood with their radiation.
I wonder how effective ICF construction is in blocking signals (both incoming and outgoing)? With the high price of lumber right now, maybe more people will opt for ICF which might help with this problem if enough people adopt it.
I love ICF, but concrete alone doesn't block radio very well at all. Best to go with steel studs -- it's found in all modern commercial buildings where I live. Use steel joists between floors and a metal roof and you've got yourself a building full of tenants who have great wifi, but terrible 4g cellular.
All of us fighting over tiny slivers of bandwidth while telecom monopolies gobble up giant chunks of spectrum. The amount of value generated by the tiny unlicensed spectrum dwarfs the value that "5G" could possibly ever generate for society. Our public resources have been stolen, again.
It's not exactly on-topic, but why does all IoT stuff use 2.4Ghz? Basically every router has 5Ghz nowadays, even the cheap ISP ones, and spectrum pollution is an obvious problem. Is there any reason to prefer 2.4Ghz or to avoid supporting both?
ESP8266. Good news: everyone can have a cheap Wi-Fi chip to put in their project. Bad news: everyone can have a cheap Wi-Fi chip to put in their project.
2) 2.4 GHz has channels 1 - 11 globally, with 12 and 13 in some countries.
5 GHz only has channels 36 - 48 globally. Depending on the country of distribution it can then support other channels too, but which channels are legal varies significantly. To support those, you need to know which regulatory area you are distributing to. So, in addition to the HW being more expensive, you need a different SKU for each regulatory area, meaning production runs of different hardware - more cost added. Limiting yourself to channels 36 - 48 just to claim 5 GHz compatibility will cause you support pain and public-image damage when people complain that your device does not really work with 5 GHz.
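For reference, the channel-to-frequency mapping itself is simple; the regulatory mess is about which of those centers you may transmit on, not the math:

```python
# Wi-Fi channel center frequencies. 2.4 GHz channels sit 5 MHz apart
# (which is why 20 MHz-wide channels overlap); channel 14 is the oddball.
def center_mhz(band_ghz, ch):
    if band_ghz == 2.4:
        return 2484 if ch == 14 else 2407 + 5 * ch
    return 5000 + 5 * ch  # 5 GHz band

print(center_mhz(2.4, 1))   # 2412
print(center_mhz(2.4, 6))   # 2437
print(center_mhz(5, 36))    # 5180
```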
IoT devices (except something like NVR cameras) use very little bandwidth and little airtime, so why would you need 5 GHz?
2.4 GHz is better for range, every AP/client supports the same frequencies worldwide, it's cheap, and so on. I suspect there are few reasons to implement both 5 GHz and 2.4 GHz on IoT devices; supporting the full 5 GHz band would just increase cost.
Deliberately jamming the channel is an FCC violation. The FCC is very effective at coming down like a ton of bricks on people who mess with the spectrum.
The FCC isn't coming out to investigate your neighbor's suspected deauthentication flood report.
And in some imaginary reality where they are: there's almost no scenario where they're going to successfully fox-hunt the source of random 2.4 GHz deauth frames.
Flood the entire 2.4 GHz spectrum above EIRP limits, constantly? Maybe visitors will come.
The intent of my original question, hamfisted as it was, was if the neighbour is informed of the negative effects their IoT swarm causes other users, do they then incur a legal duty to change their set-up / behaviour such that they would no longer do so?
Intent matters. Devices designed to have negative effects are prohibited; deploying an IoT device working as intended within FCC limits is allowed, despite having negative effects on others; deploying the exact same IoT device with the exact same negative effects with the intent to disrupt someone else's service is prohibited.
This particular IoT setup doesn't seem to be causing any prohibited interference: all the devices seem to be working as designed within the allowed limits, and they can all share equally (i.e. your device and your neighbour's 120 devices each get 1/121st of the usage), so the neighbour being informed that you'd like to use more of the spectrum does not change anything.
And just run ethernet to USB adapter cables to every place you use your phone? The author says they wire what they can, but they want WiFi for the laptop and phone.
The phone can use the cellular network when not docked and usually both laptop and phone are only used/docked in 1-3 different places (desk, bed, some other place).
All Tuya-based devices are 2.4 GHz only. Considering the scale of that ecosystem, it's unsurprising this has occurred. I have around 40 devices in my home already...
This guy seems overly angry and judgemental about something he could just talk to his neighbor about.
Also seems like some minor changes could avoid the issue entirely. I live in a large building amongst other large buildings and depending on which way the wind is blowing I can see over 100 wifi access points, not to mention devices.
https://archive.curbed.com/2018/12/4/18125536/real-estate-mo...
https://crosscut.com/2015/04/the-new-seattle-where-everythin...
a "one plus five" looks like this, structurally: https://1.bp.blogspot.com/-DZ2BLyticQE/VRMjELb1DYI/AAAAAAAAX...