I'd also set the APs on the tables, rather than trying to elevate them, so that the bodies would limit propagation.
Whenever someone asked, which happened at least a couple times each conference, I explained it like this: Imagine you have 10 groups in the ballroom all trying to have discussions. Turning the power up is like giving each group a bullhorn: the whole room becomes a noisy mess. Lower power is like each group gathering close and talking at normal volume.
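To put rough numbers on the analogy (free-space only, so the -80 dBm sensitivity and the power levels here are just illustrative assumptions; walls and bodies shrink everything much further):

```python
import math

def fspl_db(distance_m, freq_mhz=2400.0):
    """Free-space path loss in dB for a distance in meters and a
    frequency in MHz (standard Friis-derived formula)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55

def max_range_m(tx_dbm, sensitivity_dbm=-80.0, freq_mhz=2400.0):
    """Largest free-space distance at which the received signal stays
    above the receiver sensitivity. Ignores walls, bodies, antenna
    gain, and fading, so real indoor ranges are far smaller."""
    allowed_loss = tx_dbm - sensitivity_dbm
    exponent = (allowed_loss + 27.55 - 20 * math.log10(freq_mhz)) / 20
    return 10 ** exponent

# Dropping transmit power by 6 dB halves the free-space radius:
bullhorn = max_range_m(20)      # 20 dBm, the "bullhorn"
indoor_voice = max_range_m(14)  # 14 dBm, "normal volume"
```

Every 6 dB you turn the power down roughly halves the cell radius, which is exactly why many small, quiet cells beat a few loud ones in a ballroom.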
It worked well, mostly because the venue-provided wireless networks at the time were all trying the "few, big APs" approach.
More details are in a series of blog posts I did, starting with: https://www.tummy.com/articles/pycon2007-network/
In fact, small-cell networks might be how networks are deployed in the future. The website below has some interesting research on this topic.
One unintended side effect of this is that tracking becomes incredibly precise.
Well, to be fair wifi tracking was already fairly precise but soon cellular tracking will have similar granularity:
5G's biggest advance is short-range, high-speed data transmission at high frequencies.
Network operators are going to end up deploying tons of tiny cell transmitters throughout cities.
The biggest catch is getting the backbone in place to service all those tiny towers.
I’ve since isolated my equipment from my ISP’s with a pair of fiber adapters. I’m also on AT&T fiber now instead of coax (but the ONT is outside and Ethernet runs from it to the AT&T RG inside).
Ethernet ports seem to be really sensitive (HDMI too) and it’s possible a current is being induced in them from some other path. I’ll find out this summer. (I live in NC which has the second highest number of lightning strikes in the US after FL.)
Edit: just realized you wrote lighting rod. You actually would need a lightning protection system which is multiple rods properly installed and grounded. It isn’t a DIY thing to install and I’ve been told is not cheap. Also probably doesn’t help with nearby strikes, only keeps your house from burning down from a direct strike.
I’ve also installed a couple type 2 surge protectors.
This is since 2004. And we get a lot of lightning storms here in the summer. I grew up in S Florida and I don't recall getting lightning storms as frequently or as violently as here.
(edit: just noticed "ethernet over power" mentioned in another comment here. Same thing)
I have cat5e throughout the house, but in a ring topology, as it was wired for phones only. Right now I just have two points connected end to end. The next step is to think of a clever way to use the 2 unused pairs to support another 100BaseT link and create a fake hub-and-spoke topology. This G.hn standard might be a good way to get even more performance out of this setup.
The cable line in my livingroom connects to a modem. The modem feeds into a wifi router so that I get a good signal throughout most of the house. One of the ports in the router's on-board switch feeds into a wall outlet, which connects to a master switch in the basement. I could never go back to pure wifi.
What I find strange is that, like the OP said, the original network was set up for phone outlets... but the house was only a couple years old. Are RJ11 phone networks really still a bigger selling point than RJ45 switched networks? With the high adoption of cellphones and the increasing demand for internet-connected devices, it seems like RJ45 should be standard.
Where a cable needs to make a turn to follow a corner, I screw in a cup hook. For support along a straight section of wall I nail a wire nail partly in and hang the cable on the protruding part.
Staples leave small holes when removed which are easy to patch. You can also use many of them to get nice straight and tight runs. You can put a bit of white (or whatever color your walls are) paint on them to help blend in.
It's not ideal aesthetically, but nice straight lines and right angles go a long way to making it look better. The next step up would be to run some channel to hide the wires, but I don't find that necessary.
I'm out the cost of materials (maybe $200?) and the price of having an electrician add an outlet to my "MDF"/hall closet ($175).
If you put your modem at the same jack where the first tee in the line is, you should be able to get at least two connections, but for more than that you may need a switch at each tee.
Modern WiFi is getting better, but for some things you just can’t beat good, old-fashioned wires.
I'd prefer to run Ethernet, but one portion of my house, the part that has most of the devices, is really hard to wire to my standards. I should probably just find a pro to do that run, or bite the bullet and either do some drywall work or put up some crown molding to get that run.
I'm afraid I won't know where to stop though. :-) If I'm gonna do drywall work in the spare bedroom, I might as well pull it ALL down. If I do that, I might as well run some more power circuits, seal the HVAC ducting. I might as well remodel that closet. Which might mean taking space from the closet for the basement bathroom and laundry. etc...
I have an illness. :-)
I also made sure to use a proper MoCA cable splitter, as well as installed terminators on the unconnected end points.
Anyway: This is a fascinating idea. I had looked a year or two ago and the options didn't seem as good. What adapter are you using? I'm seeing mostly an Actiontec for around $90 each. I need to check my head end, but this might just be plug-and-play to reach 2 of my hardest to cable locations. I started thinking about just running some fiber along the CATV runs (along the outside of the house), but that'd be more of a pain. Especially getting the terminated ends through the wall. I'd want to run fiber since it'd be exposed to possible lightning, but I guess the CATV cable is similarly exposed. Hell, maybe I should just run some shielded cat6...
Anyway, I might just try this MoCA 2.0, thanks for that pointer.
I don't recommend Actiontec, I had a fairly expensive adapter go bad on me after about a year. I replaced the other ones with Motorola MM1000, which are much more reliable in my experience, and I found for $60.
Interior finish work is expensive, and doing it over to put in insulation is even more so. And to add insult to injury, the low R value of walls is a data point used by siding installers to try to talk you out of ripping the old stuff off before installing new. Each layer has an R value, dontchaknow...
If you have any ham radio neighbors ethernet over power is basically a jammer.
That's like saying light is harmful because looking directly at the sun or a high-powered laser will blind you.
Focused beams are outside the scope of this question because we're dealing with cell transmitters.
I do not recommend Ethernet over Power any more.
For what it's worth, you could turn on the built-in encryption within most Powerline adapters. (Usually a button labeled "Security" or similar). Then you can leak anything you want, and still be reasonably safe.
Wi-fi has this exact same problem, with basically the same solution.
MoCA (Multimedia over Coax Alliance)
MoCA 2.0 allows theoretical speeds of up to 500 Mbps (1 Gbps if bonded); real-world speeds are typically 1/2 to 2/3 of that.
Many houses have cable outlets.
But no, I hadn't heard of it. (I realize the impedance and signaling is obviously different.)
You have little to lose by trying, but there’s no guarantee it will work at all, even less work well.
I think you do best to have just two stations in a powerline network, because the contention and the overhead of contention control are much less. (Compare that to gigabit ethernet, which is full duplex and switched, so there is little or no contention.)
You can connect a "distant" powerline NIC to an ethernet switch and plug multiple devices into that if they are close together, that works better than plugging in multiple powerline NICs.
Unfortunately, new houses have arc-fault circuit breakers that detect the broadband noise caused by arcing. These are good for catching fires early, but will trip if you use powerline NICs.
Google wifi pucks work much better, in spite of the cinder block walls.
Anyways, I use a set of powerline adapters with a second wifi router at the other end because the walls in my house are very thick (the studs are covered in shiplap on both sides). The throughput is fast enough to stream video on the other end, but I haven't put it to any measured test.
In the summer, I often take the whole setup to one of the outdoor plugins, so that we have solid wifi on the deck. So it can work quite well. Whether it will in any given circumstance is uncertain I suppose.
(These weren't label-advertised speeds, these were the negotiated links)
I used ethernet over power myself for several years when I rented a house, and it worked well for me. Recently, a friend was having wifi trouble (old house, thick plaster faraday cage walls). I suggested ethernet over power, and it worked great with one exception:
They have Sonos speakers, and Sonos just would not work with it. They apparently are very sensitive to latency. The ping times were roughly 50ms (from memory) between the 2 ethernet over power adapters. I guess Sonos takes synchronization quite seriously.
Powerline networks (Powerline AV and g.hn) send ethernet through the electrical wiring.
Inside my house, on 1/3 of an acre (so not up against neighbors), the strongest 2.4GHz signals I see are from 2-3 HP printers in houses around me. This includes my own 2 Ubiquiti APs that are inside my house.
I even noticed that I have a device screaming on 5GHz .. an Nvidia Shield. AFAICT, I cannot turn that off, even if I disable wifi.
I hate wifi direct.
Also, none of the other devices that share the same room as the dimmer can use anything on the 2.4GHz spectrum. It's almost impressive how effective a jammer it is.
If it's really as bad as you say it is, you need to report it to the FCC (or other local governing body)
Anyways, having hundreds of these in an office led to some interesting WiFi scans.
* Dynamically adjust the power level to match the furthest client
* Boost the power level on an interval to check if there are clients further away waiting to connect
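Those two behaviors could be sketched roughly like this (hypothetical logic, not any vendor's actual algorithm; the power levels and RSSI floor are made-up defaults):

```python
def target_tx_power(client_rssis_dbm, floor_dbm=-67, min_tx=8, max_tx=20):
    """Lowest AP transmit power (dBm) that keeps the weakest (i.e.
    furthest) client at or above a target RSSI floor. Assumes the
    RSSIs were measured while transmitting at max_tx and that RSSI
    tracks tx power 1:1 -- both simplifications."""
    if not client_rssis_dbm:
        return min_tx  # nobody connected: idle at minimum
    headroom = min(client_rssis_dbm) - floor_dbm  # dB we can give up
    return max(min_tx, min(max_tx, max_tx - headroom))

def tx_power_this_tick(client_rssis_dbm, t_seconds, probe_period=60, max_tx=20):
    """Periodically pop back to full power so clients just outside the
    shrunken cell get a chance to be heard and associate."""
    if t_seconds % probe_period == 0:
        return max_tx  # brief full-power probe
    return target_tx_power(client_rssis_dbm, max_tx=max_tx)
```

For example, with clients at -55 and -40 dBm and a -67 floor, the AP can shed 12 dB; on the probe interval it briefly returns to full power anyway.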
Additionally, what do mesh Wi-Fi networks do when clients are holding onto a connection? Are they smart enough to know that another node has a stronger signal to the client and trigger a disconnect from the client's current node so that it can associate with the stronger node?
As whatshisface alludes to, a wireless router acts as both an access point and a router.
I'd guess this is the most common setup for residences; it requires next to no technical skill other than plugging in the cables. It should definitely count as a router here, no?
Not that I lack the technical skills to do otherwise, but my ISP's router works remarkably well (dual-band 802.11ac, IPv6, no packet loss, 4 ethernet ports, ...), so it's all I need.
And if I have a weird issue, that's a single device that I need to reboot (and I can even do it remotely).
To be honest, it's all most residences need, even if you're a technical person. Half of common home devices (xbox, printers, laptops, etc.) have wifi, so for most people 4 ethernet ports is all you need (if any). It's almost zero setup other than changing the passwords, and it works.
The only reason why it wouldn't work for you is if you have many wired devices, or want to get into the networking stuff.
Even I don't bother with ethernet. There's no point; with <10 Mbps Internet speeds, too slow is too slow. The only thing plugged into the router is a second router for the detached garage/shop. Occasionally, there's need to transfer large files between two computers (games which can take multiple days to download), but temporarily stringing an ethernet cable directly between them does the trick.
There are many standards to assist in roaming, such as 802.11k, v, and r, and particularly 802.11s for mesh nodes. In the end, the strategy used in Wi-Fi is that the end station is the one that decides when to roam and why. Again, it comes down to the client knowing more about what it could connect to, and when it makes sense to roam, than the AP could. The AP can just send hints that it'd like the client off of it for other clients' benefit.
Many Enterprise systems come with the ability to tune power the same way they can automatically tune channels. Usually they tune against other APs rather than clients since AP positions and emissions are much more static and controlled by the same system (normally).
I assume these specs include mandated behavior in both the APs and the clients.
All of the access points broadcast a certain SSID. When a client tries to connect, the APs coordinate with each other to choose which one will reply to that particular client. That is, the client thinks it is connected to one AP and doesn't know anything special is happening.
If the system wants to move your client to another AP it just disconnects you from the first AP and when you try to reconnect the second AP will reply to you.
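The coordination might look something like this toy model in Python (a sketch for illustration, not any controller's real API):

```python
class ZeroHandoffController:
    """Toy model of 'zero handoff': every AP advertises the same
    SSID/BSSID, and a central controller picks which physical AP
    answers each client, so the client never sees a reassociation.
    (Hypothetical API, for illustration only.)"""

    def __init__(self):
        self.assignment = {}  # client MAC -> chosen AP id

    def should_reply(self, ap_id, client_mac, rssi_by_ap):
        """Called on every client transmission heard by ap_id.
        rssi_by_ap maps AP id -> RSSI at which that AP heard the
        client; only the AP that hears the client best replies."""
        best_ap = max(rssi_by_ap, key=rssi_by_ap.get)
        self.assignment[client_mac] = best_ap
        return ap_id == best_ap
```

When the client walks across the building, the "best" AP silently changes and a different radio starts answering, with no disconnect visible to the client.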
In a case like that, assignment is driven as much by "having a clear channel" as by "having a better connection on the channel". If you had a choice between two channels, one of which was shared and slightly "better" and another you could have to yourself, you are better off having one to yourself. (That way you aren't having to wait for other clients to stop sending or receiving, dealing with interference, etc.)
A corollary to that is that if you have both 5GHz and 2.4GHz support on an access point, you do best distributing clients between both bands, even if people think 5GHz is better, or that 2.4GHz performs better in real life.
I am amazed that instead of all the silly gimmicks that APs have been marketed with, nobody has come out with one that has a lot of radios working on different channels and just behaves like a large number of APs. Practically I think this would work way better than channel aggregation.
Edit: meant “zero handoff” per replies below.
Getting good roaming is theoretically not that difficult. Lower transmit power so that the device's RSSI, in the locations where it's expected to transition to another AP (or off WiFi), is lower than the device's roam scanning threshold (-70 for iOS). Set the minRSSI on APs to something sensible and enable strict mode so that more troublesome devices aren't able to cling to an AP at -85.
In practice, tweaking those knobs and figuring out placement and channel planning start to get tough beyond 3 APs.
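A rough sketch of those two rules (the -70 iOS scan threshold is from the comment above; the hysteresis and minRSSI defaults are illustrative assumptions, not any vendor's values):

```python
def client_will_roam(current_rssi, candidate_rssi,
                     scan_threshold=-70, hysteresis_db=8):
    """Typical client-side behavior: only scan for alternatives once
    the current RSSI drops below the roam scanning threshold, then
    only move if the candidate is meaningfully stronger (hysteresis,
    in dB, prevents ping-ponging between two similar APs)."""
    if current_rssi >= scan_threshold:
        return False  # happy where it is; doesn't even scan
    return candidate_rssi >= current_rssi + hysteresis_db

def ap_drops_client(client_rssi, min_rssi=-75, strict=True):
    """minRSSI + strict mode: the AP kicks clients below the floor
    instead of letting them cling on at -85."""
    return strict and client_rssi < min_rssi
```

The interplay is the hard part: the power levels have to be set so the client crosses its own scan threshold roughly where the next AP's signal is already strong enough to win the hysteresis check.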
Most other devices will jump to another AP broadcasting on the same SSID if the signal is a lot stronger. It’s not nearly as much of an issue as it used to be, but people expect WiFi to Just Work (tm) so it’s better to let the OS’ network stack manage it.
I'm only mildly bitter at the useless roaming behaviour of most clients. Turning transmit power down doesn't really help either. Some clients will roam better but other clients at the edge of the coverage now have no coverage.
So it may not be perfect, but it can’t be that bad.
Knowing that, you need something that shows you details about wifi connections, so you can view the BSSID (MAC address of the AP) you're connected to:
Android, iOS: Install the "Network Info II" app
Linux: use the iwconfig command
Windows: Use the command "netsh wlan show interfaces"
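On Linux, for instance, you could scrape the BSSID out of that output with something like this (the regex assumes the common `iwconfig` "Access Point:" layout, which can vary by driver; the sample text is illustrative):

```python
import re

def bssid_from_iwconfig(output):
    """Extract the connected AP's BSSID (its MAC address) from
    `iwconfig` output on Linux. Returns None when not associated."""
    match = re.search(r"Access Point: ([0-9A-Fa-f:]{17})", output)
    return match.group(1).lower() if match else None

# Illustrative sample of what `iwconfig` prints when associated:
sample_output = (
    'wlan0     IEEE 802.11  ESSID:"home"\n'
    '          Mode:Managed  Frequency:5.18 GHz  '
    'Access Point: AA:BB:CC:DD:EE:FF'
)
```

Watching that value change (or not change) as you walk around is an easy way to see whether your client is actually roaming between APs.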
It does matter if the antenna is located higher up: that lets it pick up weaker signals from clients. A better antenna is also probably more sensitive.
I think what the author meant to express is that the situation is symmetrical, even if one of the antennas is "better" (e.g. larger, or located higher up). So the author probably meant: "The bidirectional connection is symmetrical. It doesn't matter (in terms of symmetry) if the AP has a better antenna or is located higher up".
So, even if the antenna on one side is "better" in some way, it affects both directions the same, and as such it still makes sense to use the same transmit power from both ends.
You can also have better analog RF components with more space (filters, LNAs, PAs, etc).
It's not necessarily true that the "advantage" of the AP is symmetric as well. The tx power is limited by FCC part 15 so the difference in tx power between AP and client is probably less than the difference in Rx sensitivity between the AP and client.
Even if the uplink and downlink are symmetric, the traffic profiles probably are not - there's probably a lot more data going to the client.
1) Hear them better
2) Yell at them louder
(While holding all else equal like transmit power, line loss in the coax...)
With wifi, there is usually little you can do since you probably want a spherical radiation pattern, which prevents you from getting any significant gain over a perfect isotropic radiator, but that's a different story.
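A toy link budget makes the reciprocity concrete (all numbers here are made up for illustration):

```python
def received_dbm(tx_dbm, tx_antenna_dbi, rx_antenna_dbi, path_loss_db):
    """Simplified link budget. Antenna gain is reciprocal: it adds
    the same number of dB whether that antenna is transmitting or
    receiving, so a better antenna on one end improves both
    directions equally."""
    return tx_dbm + tx_antenna_dbi + rx_antenna_dbi - path_loss_db

# AP has the better (6 dBi) antenna, client a 2 dBi one; same path,
# same tx power on both ends.
downlink = received_dbm(15, 6, 2, 80)  # AP -> client
uplink   = received_dbm(15, 2, 6, 80)  # client -> AP
```

The two directions come out identical, which is the point: the "better" antenna helps the AP both hear and be heard, so matching transmit power on both ends still makes sense.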
When I tried to coordinate better channel assignment and transmit power with my neighbors in my apartment complex, the effort very quickly died for the reasons above. Most neighbors didn't know or care about the settings on the magic box that made their Netflix or Xbox work, and even if they did, the config options simply weren't there.
I wish there were more I could do to change this situation. As it is, all I can do is vote with my wallet and buy less horrible brands like Ubiquiti, where advanced options like channel assignment and Tx power are available to me if I want them.
* Check that all your APs are broadcasting the same name. No devices I know of will naturally roam to an AP with a different name.
* Check that all your APs are broadcasting identical security features (e.g. WPA1, WPA2, WPA1 or WPA2 .. as well as the encryption methods like TKIP, AES/CCMP, etc). Many devices will check these too, and roam only to matching APs.
If you want a small coverage area, it would also not be a bad idea to use the 5GHz band if your router supports it: much less crowded, shorter range by default, and higher speeds.
As the author said in the comment section: "I usually suggest turning 2.4 GHz off altogether so none of your users will ever connect to it by accident."
As for APs sharing channels, this is great in concept, but our (3 unit) building became so overcrowded in the 2.4 GHz spectrum that devices couldn't connect, which caused people to add more routers and repeaters to "boost" the signal, making it even worse. If you can't connect, you can't share.
But automatic channel selection is not very accurate, in my personal observation as well as that of a few others on the internet. Wish it were better, though. I feel that there are numerous problems in automating channel selection:
1. We have to make sure this happens without any downtime, otherwise there is poor QoS. If the router has to reboot to choose a channel, you're out of internet for about 5-10 minutes. That can be very frustrating if it happens often.
2. As if that were not enough, you have to draw extra power to scan nearby access points and constantly react to changing signals. (This would also defeat the point of using low-power signals.)
In my opinion it is better to tune the routers manually than to set it on automatic channel selection.
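For what it's worth, the manual tuning people do can be approximated by a very simple least-congested pick (a sketch; real scanners also weigh signal strength and adjacent-channel overlap, which this ignores):

```python
from collections import Counter

def pick_channel(observed_bss, candidates=(1, 6, 11)):
    """Pick the least-occupied non-overlapping 2.4 GHz channel from
    a scan. observed_bss is a list of (ssid, channel) pairs; ties go
    to the lowest-numbered channel. Only co-channel BSSes are
    counted -- a deliberate simplification."""
    counts = Counter(channel for _, channel in observed_bss)
    return min(candidates, key=lambda ch: (counts[ch], ch))
```

So if a scan shows two networks on channel 1 and one on channel 6, this picks 11 - which is roughly what a person eyeballing a WiFi analyzer does by hand.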
> the odds that your [...] is _very_ low
Yes, unfortunately that is true. There is little we can do about it. But there is one redeeming factor. As per my understanding, interference depends not only on the base access point but also on the client. In other words, the channels will deteriorate only if the communication link - that is, the pairing of a client with its access point - interferes with the signals of another access point.
At low ranges you can create mini line-of-sight networks that will happily coexist with any number of signals - for example, wifi on a laptop working alongside a bluetooth mouse.
[Router] --------- [laptop]
It would be super useful for someone like me where there are dozens of different APs in an apartment building all competing with each other. Then I could kindly go to my neighbor and ask/assist them in turning theirs down.
It's great for finding unused channels for your AP.
I've heard if you install a giant ham radio tower they're not legally allowed to do anything about it. Any little bit of sticking it to those organizations is a good thing.
But this is a very good reminder for people who set up wireless networks at events, schools, ...
My school had huge, thick concrete walls that basically killed any Wi-Fi signal. But for some reason, instead of having more cheap APs distributed more evenly, the guys handling it insisted on putting 1-2 big expensive APs in each building, which meant that the Wi-Fi was unusable in most of the classrooms. Not really clever when most of your classes require you to be connected to the internet.
In a sense, it's lose-lose, but you can choose the higher ground...
All that said - if you have multiple APs, then you absolutely want to tune the power levels, or you're only making yourself a loser!
It sticks out like a sore thumb to me because my brain reads it as "it is", which does not fit.
One I've witnessed in my lifetime is "different than" vs. "different to". I was taught the first one. The latter sounds wrong to me, which is why I notice I'm hearing it more and more. (Note I have phrased this subjectively; I'm not saying it is wrong. I'm generally a descriptive grammarian. But it does sound wrong to me.) It may be a dialect thing, in which case my dialect seems to be shifting. Languages change.
I remember thinking at the time that I live to be an average 70, then at the rate of change I could see in books from various centuries, I should be able to witness some changes myself in my lifetime. And I mean changes in "real", core language, not merely the rapid churn of local dialects and slang. (Which the Internet has greatly accelerated. I suspect in general it pulls us all together towards a more "core" English, but it also means an incredible proliferation of local slang communities.)
Now that I'm 40, I can say that I'm definitely starting to notice it. It's not a fast process on a day-by-day basis, and it's hard to notice the small differences at first, or assume they're dialect differences (and in some cases they are, after all). But the change is definitely happening. I would be unsurprised within my lifetime that there's only "it's" and it has two distinct meanings, and that it will be officially recognized by dictionaries as such.
The "hacker" style of quoting  also seems to be generally accepted. I've on several occasions done something like 'Have you ever said "It's not a tumor!"?' and I'm yet to get jumped by a grammar nazi for the two punctuation marks like that, too. I suspect that'll never quite become the official style (a bit on the complicated side), but it doesn't seem to bother people much.
I've always found it odd that in a place like HN, most are pedantic in their use of every language but English.
Edit: I mistyped my comment as "after". The replies are justified.
Try asking people what "half three" means.
Also "Less" vs. "Fewer".
Then one day, a couple years ago, it started grating on my nerves. I have no idea what caused me to suddenly start caring! I just do now.
I guess as I get older, less things get past me?
Turning power down, on the other hand, probably will lose you nothing, just like he said.
2. The correct answer in the Prisoner's Dilemma is to cooperate, in my opinion.
Garrett Hardin was a (neo)Malthusian:
>Hardin blamed the welfare state for allowing the tragedy of the commons; where the state provides for children and supports over-breeding as a fundamental human right, Malthusian catastrophe is inevitable. Hardin stated in his analysis of the tragedy of the commons that "Freedom in a commons brings ruin to all." Environmental historians Joachim Radkau, Alfred Thomas Grove and Oliver Rackham criticized Hardin "as an American with no notion at all how Commons actually work".
>In addition, Hardin's pessimistic outlook was subsequently contradicted by Elinor Ostrom's later work on the success of co-operative structures like the management of common land, for which she shared the 2009 Nobel Memorial Prize in Economic Sciences with Oliver E. Williamson. In contrast to Hardin, they stated that neither commons nor "Allmende", in the generic or classical meaning, are bound to fail; to the contrary, "the wealth of the commons" has gained renewed interest in the scientific community. Hardin's work was also criticized as historically inaccurate in failing to account for the demographic transition, and for failing to distinguish between common property and open access resources.
>Despite the criticisms, the theory has nonetheless been influential.
In this modern economic context, commons is taken to mean any shared and unregulated resource such as atmosphere, oceans, rivers, fish stocks, roads and highways, or even an office refrigerator.
...or WiFi spectrum.
We went from having spotty Internet to it being really solid. I think we have some neighbors that have devices that interfere with 2.4G (cordless phone? Microwave? Baby monitor?)
2.4 is unusable in an apartment building because the range is so great. I live in a 3 unit so I am happy with my 5 Ghz but in a very dense building I could see problems.
but first I'd like to identify the mode of harm. How would this low-energy, non-ionising radiation present a biological impact?
The theories I've heard (induced currents in the dna base stack? magnetic stimulation of cryptochrome that modulates intracellular reactive oxygen species?) describe very rare, unlikely and low (negligible?) impact.
I am open to the theory of hypersensitivity, however I've never found anyone who suffers it.
These studies indicate a mechanism of EMF stimulation of voltage-gated Ca2+ channels.
Personally, I've never noticed issues with RF-EMF until I tested my sleep using the oura ring. WiFi didn't seem to bother my sleep but shutting off the power to my bedroom at the breaker improved my sleep tremendously. Even if their effects on sleep are small, the negative, non-linear health impact of sleep loss is significant...to me.
Maybe in 20 years we'll view EMFs akin to cigarettes, maybe not. I'm not wearing a tinfoil hat just yet, but it has changed the way I interact with these non-native frequencies. It seems prudent to take simple precautions like avoiding microwave use, utilizing airplane mode, and turning off/unplugging appliances, lights and WiFi when not in use.