5G Is Likely to Put Weather Forecasting at Risk (hackaday.com)
668 points by szczys on April 16, 2019 | 471 comments



While 5G will be a great boon, especially the beam-forming satellite version, another unintended consequence besides weather remote sensing is nuking the extremely important 24 GHz range (K band) for radio astronomy. There are a few narrow protected windows for absolutely critical spectral lines, but the truth is that nature doesn't play by the spectrum allocation rules, and there are hundreds if not thousands of lines that are observed routinely outside of the protected bands. The band is also remarkably free and clear of radio frequency interference (RFI), in part because industry has chosen other frequencies not attenuated by atmospheric water vapor. This isn't to say we should halt global human progress to save a local river bait fish, but the threat to forecasting is only one of the serious consequences major spectrum reallocation can have. This is especially true for passive use in the sciences, which has a weaker lobby than the private sector.


> While 5G Will be a great boon

This is something I've been having trouble with.

Lately I've become more aware of the secondary effects of 5G (on weather forecasting, on the radio spectrum, possibly on bees) and it's got me wondering why we need it for telecom. I just don't see the value added. I can already communicate with anyone in the world, access any information, and find my way anywhere with 4G. A significantly higher rate of data transfer just doesn't seem to add any new functionality to my phone. Can anyone give me a good rationale for 5G? Entertainment doesn't count.

I'll grant right off the bat that it'll have some fantastic industrial applications; my issue is with personal telecom. It just feels like a new planned-obsolescence vector.


I think it's driven by network utilization on the carrier side and battery life on the device manufacturer side. Your monthly 1GB (let's say) usage cap at EDGE speeds translates to around 5 hours of constant connection. At 4G speeds (40Mbps typical for me) that's just 3.41 minutes. If we can get that to 10X for 5G, that's just 20 seconds.

Over the last decade we've gone from needing 5 hours of airtime per subscriber (for that same gigabyte) to less than 4 minutes, and potentially down to 20 seconds. Similarly, latency went from 400ms ping to 5ms.
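For concreteness, here's that arithmetic as a minimal Python sketch (the ~450 kbps EDGE rate is my assumption, chosen so the "around 5 hours" figure comes out; the 4G and 5G rates are the figures above):

    # Time on air to transfer a 1GB monthly cap at each generation's speed.
    CAP_MBIT = 1024 * 8  # 1 GB cap, taking 1 GB = 1024 MB

    for name, mbps in [("EDGE", 0.45), ("4G", 40.0), ("5G (10x 4G)", 400.0)]:
        seconds = CAP_MBIT / mbps
        print(f"{name:12s} {seconds:8.1f} s  ({seconds / 60:.2f} min)")

    # EDGE          18204.4 s  (303.41 min)  -> ~5 hours
    # 4G              204.8 s  (3.41 min)
    # 5G (10x 4G)      20.5 s  (0.34 min)    -> ~20 seconds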

This improves network performance, since the tower is more likely to be available when you need it, and faster too, even if the average throughput and backhaul link remains unchanged. It also improves battery life because the battery is primarily draining during active comm sessions. Get them in faster, serve them faster and back to sleep faster.

Think of it in terms of Apple's MacBook. The finite resource here is more complex as it's both thermal and power. The CPU tops out at an average of 1.3GHz but it has an instantaneous turbo-boost to 3.2GHz. You could say that 1.3GHz is pretty fast and gets the job done; if it represents the same TDP, why bother going faster? The answer is it improves your experience (lower latency on bursty workloads). It also improves your battery life because it's more efficient to spend 1/3 the time at 3.2GHz and 2/3 the time sleeping than the whole time at 1.3GHz. I think Apple described it as "racing to get back to sleep."
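A toy energy model of that trade-off (all wattages below are invented for illustration; only the frequencies come from the comment above):

    # Race-to-sleep: finish the work fast and sleep, or run slow the whole time?
    P_PLATFORM = 3.0   # screen/RAM/etc., burned whenever awake (assumed)
    P_SLEEP = 0.5      # deep-sleep power (assumed)
    P_CPU_SLOW = 2.0   # CPU at 1.3GHz (assumed)
    P_CPU_FAST = 8.0   # CPU at 3.2GHz; superlinear in frequency (assumed)

    T = 1.0                  # time the slow CPU needs for the workload
    t_fast = T * 1.3 / 3.2   # same work at 3.2GHz, roughly 1/3 the time

    e_slow = (P_PLATFORM + P_CPU_SLOW) * T
    e_race = (P_PLATFORM + P_CPU_FAST) * t_fast + P_SLEEP * (T - t_fast)
    print(f"run slow: {e_slow:.2f} J   race-to-sleep: {e_race:.2f} J")
    # run slow: 5.00 J   race-to-sleep: 4.77 J

Racing wins in this model because the platform overhead, not just the CPU, stops burning once the machine is back asleep.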

Obviously this breaks down with sustained workloads, though I'd argue the same is true of 5G.


Okay sure, but what OP is asking is what you can _do_ with that extra throughput.

You're making the classic IT-guy failure of talking about the technical feature rather than the user outcome (I have had to train myself out of this in recent years so I'm not having a go at you).


I'm confused, he's talking about the user outcome - improved battery life while consuming content. The technical reason is that the modem spends less time active for the same amount of bandwidth.


The cell radio on my phone contributes less than 10% of total battery consumption; most goes to the screen and CPU. So there's limited improvement from 5G there. But for IoT devices, where comms use most of the battery, this could be significant. However, it's debatable whether spending billions of dollars on a regional 5G network is an optimal solution for a more durable IoT infrastructure. In my understanding, there should be a more economical way to build an efficient IoT infrastructure, since most such devices don't need that much data speed.


I agree with your point on power consumption and agree that probably doesn't warrant the investment.

I don't work on cellular infrastructure, and don't even work on IoT anymore, but I could imagine that as far as IoT infrastructure is concerned, it might not be that any specific device needs a large amount of throughput to provide its services. Rather, the aggregate of a potentially very large number of devices needing access to data services could test the throughput of the collective infrastructure. Redesigning that infrastructure for more throughput could be key, not to a future with some magical use case for specific types of devices, but to a future with a much larger number of simple networked devices working in concert to make the world more efficient in the aggregate.

That said, is the timing right? I don't know. It does kind of seem like it is driven more by politics than technical necessity.


Given that most people charge their phone once a day and don't actually need more battery efficiency, and that battery efficiency is the only benefit, is there then no actual benefit from 5G?


Unless one day you could charge your phone once every two days? Or once a week! Like back in the day of the 3310, which had 22 hours of talk time and 31 days of standby. Or you could make the phone smaller/lighter/faster with the extra efficiency. I'm having trouble believing someone here on HN is arguing against improving technology.

I sometimes forget to plug my phone in, maybe after traveling, maybe after drinking haha. I'm already super glad my phone will make it through a second day. Back in the iPhone 3G days? Not so much.


Smaller/lighter/faster phones and battery banks are already available, without 5G. Why do you need 5G specifically? Its value has to exceed the value it destroys. Not all tech developments are net improvements.


Because they'll be even smaller/lighter/faster. That's what efficiency gets you. 5G is more efficient. There's really nothing else to say.


So is it worth rolling out globally a very expensive new network which will cause negative externalities for that?


That depends on whether the expected benefits greatly exceed the costs of the new network and can offset the supposed negative externalities.

   expected benefits >> costs + supposed negative externalities


You could just as easily have switched those words and that would make just as much sense.

   supposed benefits << costs + expected negative externalities


And that is why everyone is asking the question, rather than blindly accepting 5G as a desirable thing.


This all seems kind of funny to me because these are almost the exact same arguments on why power lines are bad. They kill trees, they kill bugs, they cause kids to be hyper, they cause kids to be depressed, they cause cancer, they misalign the resonant frequency of your brain. A hundred years later I still hear the same arguments, but not once has any of it been borne out.

Can I get an argument against the 5G standard that doesn't amount to, "This may cause harm through an unknowable secondary mechanism despite roughly 100 years of evidence that it is safe."


Infrastructure is intrusive and imposes problems on the poorest people and wildlife who are forced to live near it, I'm not sure why you find that funny. The difference is people need power, not 5g.

If you don't need it then why would you do it? There's no need for an argument against wasting money on useless infrastructure. E.g. why not build only 4-lane highways instead of side streets? Because it's unnecessary.

The burden is on you to justify the expense and I don't see any clear justifications as to how this is in anyone's best interest outside of the telcos and <5% of users.


I don't think I need to justify a private company spending private funds but since you insist,

According to Pew Research, 95% of Americans own cellphones, and 75% of the world population does. When you include other mobile networked devices, that number goes up, if only a little. Any improvement to the wireless telecom infrastructure will benefit the vast majority of the world's population. Even people who insist on using 4G will see major benefits due to less congestion as others migrate over to 5G systems. And guess what: the 5G standard includes keeping the 4G system intact and running, so this isn't planned obsolescence but purely an expansion of capability!

Historically, nearly all infrastructure development has yielded significant positive returns, not just for the capital holders (telcos) but for the users as well. While we can certainly extend additional 4G coverage, 5G has a significant number of technical improvements that allow for a better backhaul network and better network management. Once the initial development costs are paid, there is no reason not to use 5G access standards over 4G.

Most of the 5G mmWave standards are backhaul for the foreseeable future. These enable low-cost, low-latency deployments to areas which are currently underserved because of the costs of deployment. This is important because it enables technologies like tele-robotics. It would allow rural areas to invest in a surgical robot at their local hospital and have an expert surgeon operate from a more prominent location, potentially saving lives.

There are standards for out-of-channel spectral masks to prevent problems like (including specifically) the one postulated in the article. There are exposure standards for safety that the military has been researching for decades as part of its radar work in the mmWave band. I've yet to see any concerns that were not specious, and certainly not from anyone familiar with the technology. Most of the concerns are just repeated talking points from dozens of other older technologies that never showed any substance, with a helping of, "It's different THIS time." So, you're right, it's not funny, it's sad to see people on Hacker News arguing that 2010 was the ideal level of technology and we shouldn't bother trying to develop more. Hell, it's literally the same argument Comcast makes on why broadband standards shouldn't be raised. "What people have is good enough and they don't want more so you can't justify the costs to improve!"


You're right about the money argument. However, everyone is affected by this infrastructure, so the rollout should require some justification regardless of whose money it is.

The 95% of people who own cell phones don't necessarily need internet faster than 4G on their phones and would be happy with 4G working reliably, which it doesn't.

>It enables technologies like tele-robotics. It would allow rural areas to invest in a surgical robot.

That sounds good and all, but shouldn't we just build out the fiber network so that we have actually reliable infrastructure instead of janky bullshit running on a proprietary cellular modem? What level of reliability should a surgical robot have? I'd say nine 9's for me to use it. Can a cell signal transmit perfectly 99.9999999% of the time with those low latency numbers? I don't think so, since it relies on atmospheric conditions and no interference to function.

Ironically, 5G will further reduce incentive for the clearly useful technology of fiber to be deployed at scale.


Everyone buying new phones and changing the infrastructure for marginal gains sounds super efficient.


That will happen anyways.


That's a technical feature. The user outcome is fuzzy because you don't know really how weather prediction interference and increased RFI will play out for users.


Those are externalities unrelated to the phone or experience with it. I'm confident if 5G is widely deployed and it causes the stated problems we'll find a new way to predict the weather.


Do you study atmospheric analysis? Weather prediction is not great currently and the weather is getting more volatile.

Why should someone be allowed to implement a design at scale without testing and solving the downstream problems first? That's not how engineering works.


Okay I have 10% extra battery life - what can I do with that I’m not already doing?

Extra battery life is a feature, not an outcome.


The outcome is your phone lasts longer. And your experience with it improves as the network is more available when you need it.


My phone lasts longer and that lets me _____

Nobody is wandering around wanting their phone to last longer purely for that sake.


Lets you use your phone for longer between charges. You're acting like battery life isn't a user feature, but it is: companies advertise heavily on it, and it can often be a differentiator between midrange and premium phones.

You're asking for a user story, but you only have to look at the converse to see that battery life enables everything else.

Suppose you shipped a phone with 99% less battery life than its competitors. You could, at trivial expense, increase your 1% to 10%; your engineers propose how to do so.

You reply, "My phone lasts longer and that lets me _____".

To which your engineers reply, "Use your phone?"


The fact that the almost exact same conversations happened with the advent of broadband, mobile data, 3G, 3G+, 4G should give you a hint.


Perhaps you could just come out and say it?


We did. Many times. Each time there's a generational shift in wireless technology some people come out and say it's good enough, we're done here, nothing new to make, why do you bother with all this effort. Then we do. And life improves, our experience with technology improves, and we get used to it. Then the cycle repeats itself.

Grab an iPhone 3G and use it for a few days. That's what going back from 5G to 4G will be 5 years from now. Just as it is going back from 4G to 3G today. You've just gotten used to how good things are.


I see you're getting frustrated with me because I'm not willing to accept that because it happened in the past, it will always happen in the future.

I just set my phone to 3G, where a speed test gives me 15Mbps, about 1/10th what I typically get on 4G.

Everything seems to work the same way it did on 4G. Actually I'm surprised by that because I thought 4G had been a bigger improvement.


Demand for wireless broadband services isn’t static, it’s rapidly growing. It’s not just that 5G will let us do things 4G can’t. It’s that it will let us do the same things but a lot more of it, before the networks become too congested and demand outstrips supply.


You are the first person to actually nail it - well done!

That's a saleable outcome: more and more people are using the spectrum, so if we don't upgrade to a more efficient use of it then the outcomes you currently enjoy (video chat, gaming, virtual desktops, whatever it is you do and value) will stop working because of the traffic.


> a lot more of it

avidity I say


Like what?


The same thing you do now but more efficient.


Define efficient? Nothing I currently do now reaches the limits of even half of my current connection.


From a power and network utilization perspective. It feels like you're not really trying.


So I'm using the network more efficiently... how does that improve my life?


I kinda agree. 4G feels "fast enough" to me, just like PCs became pretty much "fast enough" about a decade ago.

I'd rather see more work put into usability than in even higher speeds.


The funny thing with the internet is that the content really doesn't take up that much space/speed. It's the ads and trackers. So that's why the industry is lobbying for this.

Also, we still haven't even hit theoretical 3G speeds. But reallocating spectrum is a quick, easy way to get there without doing the work.


> the content really doesn't take up that much space/speed

video streaming


> Entertainment doesn't count.


Not all video streaming is for entertainment. Case in point: security cameras.


On the other hand, discouraging wireless security cameras is a benefit because they have poor security properties. Wireless exposes the cameras to attack by anyone within wireless range rather than requiring the attacker to have physical access, and also allows an attacker to disable the cameras with a wireless jammer.


They do indeed have poor security properties, but sometimes it's the only viable option. For example, you might need to set up a camera in some outdoor spot that's too far for a wired connection to be practical (Ethernet only runs so far, and I ain't aware of very many cameras that use fiber), in which case a wireless connection and a solar panel might very well be the way you'd have to go.

Also, wired connections (especially outdoor ones) have a tendency to be vulnerable to things like wire cutters or fiber-seeking backhoes. I'd imagine a competent security system implementer would find some way to physically secure the cable as best as possible, but given that a wireless jammer is a much more sophisticated attack strategy than, say, some snips or an "accidental" strike by some piece of equipment, going wireless might be a viable tradeoff.


> For example, you might need to set up a camera in some outdoor spot that's too far for a wired connection to be practical (Ethernet only runs so far, and I ain't aware of very many cameras that use fiber), in which case a wireless connection and a solar panel might very well be the way you'd have to go.

Cameras that use fiber exist, but you could also just use an ethernet camera and a fiber to ethernet converter.

> Also, wired connections (especially outdoor ones) have a tendency to be vulnerable to things like wire cutters or fiber-seeking backhoes.

I once encountered a survivalist who would always carry a length of fiber optic cable with him, that way if he was ever hopelessly lost in the wilderness he needed only to bury the fiber and a backhoe would be along promptly to dig it up.

One solution in those cases is to use directional wireless, which is harder to jam, but then you're back to not needing 5G.

If it's important enough you can also attach a storage device directly to the camera so that if there is a temporary network interruption the data isn't lost.

Of course, you also have the trouble that the cameras themselves tend to be vulnerable to things like rocks. Securing something which is out in the open is hard.

> given that a wireless jammer is a much more sophisticated attack strategy than, say, some snips or an "accidental" strike by some piece of equipment

Jamming wireless is not really that sophisticated. It's both easy and inexpensive to do. The main impediment is that the legal penalties can be rather severe, but criminals are not well known for their fastidious adherence to the law.


As anyone who's ever played one of the recent editions of Shadowrun well knows.

For those who haven't, Shadowrun is a fantasy/cyberpunk RPG, and since the most recent two editions, everything is wireless. Everything. Cameras, locks, guns, you name it. Which makes it a lot of fun for a hacker to brick an opponent's gun in the middle of combat. Or, of course, use a security camera's wireless connection to get into the larger system behind it.

I hope real world security will be more sensible than that, but signs are not encouraging.


Are cybernetics wireless too in Shadowrun? If so, it would mean a hacker could also directly attack cybered enemies, like shutting down cybernetic eyes to blind someone, force a leg to trip its owner or an arm to punch its owner.


A few are explicitly wireless because they need to connect wirelessly to other things, like an implanted comlink, cyberdeck or rigger system.

But I believe the rule or gentleman's agreement in Shadowrun is that something you paid Essence for (you pay Essence for cyberware and bioware) is part of you and cannot be hacked. Though I believe there have been adventures where for plot reasons it was possible. Shadowrun is not entirely consistent in that regard, I'm afraid.


It’s a pretty flexible game system. A good GM would allow it I think, but with a high difficulty rating for hacking it.


I feel like they answered their own question: entertainment does count.


It's also every useless TCP connection, because stupid site providers choose to load stuff from Google instead of from the page they already have a connection open to. That easily adds 3 RTTs, maybe even an additional 4 if you use TLS.


It's pretty expensive though. I pay $10/GB (marginal) for Google Fi. That doesn't really scale to streaming HD video. I imagine a roomier higher-frequency spectrum would help get contention down and lower costs.
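A quick back-of-envelope on why (the ~5 Mbps HD bitrate is my assumption; the price is the marginal rate above):

    # Cost of streaming HD video on per-gigabyte pricing.
    PRICE_PER_GB = 10.0   # $/GB, marginal (per the comment above)
    HD_MBPS = 5.0         # assumed HD video bitrate

    gb_per_hour = HD_MBPS * 3600 / 8 / 1000  # Mbit/s -> GB per hour
    print(f"{gb_per_hour:.2f} GB/hour -> ${gb_per_hour * PRICE_PER_GB:.2f}/hour")
    # 2.25 GB/hour -> $22.50 per hour of HD video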


Google Fi is the exception, not the rule. You can get "unlimited data" LTE for $60+ with other carriers. (e.g. T-Mobile One is $70 for a single line)

Google Fi is for people who mostly use wi-fi. (I use Google Fi)


"Unlimited" plans just hope that not too many people use a lot of data. But if you're really one of the people who doesn't use much data, you might as well use a plan which charges per gigabyte, and then you're exposed to the real per-gigabyte cost. Market forces therefore keep the cost of unlimited plans roughly in line, and the per-gigabyte cost is still relevant even for unlimited plans.


https://www.t-mobile.com/offers/mydatausage

T-Mobile is “unlimited data” until 50GB, then they limit your speed.


50GB is pretty generous. Not truly unlimited, but I could definitely live with that.


It's one blu-ray worth of data. Mobile data has really lowered people's expectations about internet connectivity a lot, relative to the improvement curve that we got used to last decade with wired connectivity.


The improvement curve for my mobile data is much steeper than my home broadband's. Only recently, and by installing an ugly-looking microwave dish on my home, has my wired broadband exceeded my T-Mobile LTE speeds. For a long time on DSL, my phone battery would die trying to upload stuff to the cloud because DSL upload speeds are so bad. I'm a heavy mobile user and I've never hit my 50GB limit; it's high enough that as long as I use WiFi when it's available I don't have to think about data caps, which only exist to reduce congestion. And the data caps keep going up every year. My home broadband has a soft cap of 1.5TB, which has grown much more slowly than my mobile soft cap. I expect 5G to finally bust the "unlimited data for everything but tethering" problem, and at that point I can ditch the microwave dish altogether.


Yes, but this is only because home broadband has essentially stalled. If you look at what's happening in areas where there is a market for wired network bandwidth, computers are connected at around 100G speeds. Yeah, server environments have traditionally always had better networking, but you can chop off an order of magnitude or two to account for this, and still the comparison shows how stalled consumer wired is.


Project Fi is a little more expensive but can be used as an unlimited plan at $80/mo for a single line, as they stop charging for data after 6GB.


It is a political problem. I'm currently using my unlimited 4G plan, which costs about 15 euros; fast.com gives 49Mb/s and I live literally in the middle of nowhere.


We need to advance all parameters of our technology stack. Changes are incremental and take time; it'll be many years before 5G is widely available.

5G, as I've understood it, is also more about latency.


Parkinson's Law. Increase in 5G bandwidth will be filled with next-gen media.

Cable boxes will be replaced by subsidized free smartphones in exchange for "always on" subscriptions. No more home WiFi for low-end market.

Personally, the industrial 5G IoT applications are far more interesting.


>Parkinson's Law. Increase in 5G bandwidth will be filled with next-gen media.

Perhaps Jevons paradox might be more apt in this case, as huge investments in 5G infrastructure indicate that the 4G/LTE efficiency/usage ratio has peaked.

Nevertheless, any nascent demand for next-gen media in the next few years can be served via 5G NR, Wi-Fi 6 and related standards like Wi-Fi HaLow, Vantage etc.[1] and Bluetooth 5, with synergistic and overlapping features, until 'real' 5G establishes itself sometime in the next decade.[2][3]

https://en.wikipedia.org/wiki/Jevons_paradox

https://www.computerweekly.com/microscope/opinion/A-modern-t...

[1] https://www.wi-fi.org/discover-wi-fi/wi-fi-certified-6

[2] https://www.networkworld.com/article/3342158/cisco-exec-deta...

[3] https://www.wsj.com/articles/from-wi-fi-to-bluetooth-to-5g-a...


I don't think consumers are expecting more efficiency from 5G; therefore no paradox when actual consumption exceeds expectations.

Insert any VHS vs Betamax or AC vs DC historical lesson here. The best tech does not always win.

If carriers, smartphone makers, and chipset manufacturers all agree on 5G, then 5G will "win".


My comment was from the perspective of the stakeholders, i.e. governments/academia/spectrum holders, various steering groups/SIGs/committees, policy makers, environmentalists, OEMs, chip designers/fabs, banks, telcos et al., who are not only aware of the challenges ahead but have to be fully invested, and for whom efficiency matters. [1]

They could adopt the default position suggested by you and do nothing; it will certainly be favourable and indeed profitable for all the incumbents in the short term, but they would only be postponing the inevitable. A new generation of mobile standard is a well-trodden path, from the advent of the 1st generation of wireless telecommunications at the beginning of the 1980s and every decade since. It is without a doubt paved with riches, and there will be winners and losers; however, the biggest driver has been innovation, not just the 'win', and to suggest that 5G might fail is pure fantasy.

It is a non-sequitur to compare stand-alone video formats with a constellation of technological advancements, encompassing a multitude of disciplines, which have had a profound impact on us.

[1] https://www.youtube.com/watch?v=nljwtkdHAYw


We're already in a state of doing nothing. The progressive view is pushing for 5G upgrades and breaking the cable box model.


While cable boxes could be replaced by subsidized smartphones, my question is "Why?". A wired connection will always be less capricious than a wireless one [citation needed], so if your TV is in a fixed position there's no benefit. At best, you get a little bit of flexibility within your domicile, but wouldn't that problem be better solved with better wiring? And who moves their TV frequently anyway?

Also, before anybody comments something about 5G helping out in broadband deserts, I'd argue that point is invalidated by the reduced penetration depth due to the much higher frequency of 5G. Let's just invest in rural broadband instead.


A smartphone can probably deliver better-personalized ads since it can be tracked to an individual person. No benefit to the consumer, really.


We, as a society, might benefit more, but, like socialized healthcare, that's not how the market works. The bandwidth to support renting out smartphones-as-hotspots for home wifi connections will make Verizon Wireless and AT&T quite happy (at Comcast's expense), so the question is less "why" and more "who, and for how much?"


> Personally, the industrial 5G IoT applications are far more interesting.

IoT applications are typically very low-bandwidth. The main issue is cost and power usage.


That's one of the big use cases of 5G though: having a huge number of low-bandwidth subscribers.


We've got existing low power, low bandwidth, long range systems. What do you think 5G can improve here?


How many devices can these systems handle?


Depends what you want to do, how you want to communicate, do you broadcast / mesh, uni/bi directional, etc. It's a "how long is a piece of string" situation.

From cheap, slow, long range deployments like LoRa (100 devices sending tens of bytes) to satellite broadcast (any number of synchronised devices), and a few things in between.


> next-gen media

What does this mean? Something beyond video? VR?


If the bandwidth is available then people will find a way to fill it. Streaming games, higher res video, remote terminals replacing low end PC hardware, who knows?


Good point, though I'll note that streaming games, high-res video, and remote terminals are already available (at least via high-speed ethernet connections), so the "next generation" would be higher quality or wider availability rather than genuinely new forms of media.

My instinct is that there are diminishing returns past a certain point. We're certainly not there yet, but once cellular networks allow you to stream high-definition VR content and upload data at the same rate, it seems like there's nothing more that additional bandwidth could add.

I see it as a philosophical issue... bandwidth is for the transmission of information, and there's only so much information that a human being can receive and provide at a given moment. At some point you're running up against the maximum bandwidth of the human user.


> once cellular networks allow you to stream high-definition VR content and upload data at the same rate, it seems like there's nothing more that additional bandwidth could add.

Well, there is this famous quote from 1943, "I think there is a world market for maybe five computers," and it turned out to be very wrong.

"High-definition VR content" isn't the same as "indistinguishable from reality". When movies were first introduced to theaters, with small frames per second rates, without color and without any sound on the medium itself, people were quite stunned and e.g. took cover when a the movie showed a train approaching at high speeds. Nowadays it seems primitive to us from a technological standpoint.

There is a trend of some people not accepting lossy audio encoding. Maybe one day video will see the same trend and people will want lossless video, in full 360-degree VR, with intensity resolution beyond perceptual limits and angular resolution constantly high enough for your eyes to foveate any area of the screen and see no pixels. That's quite a huge amount of data to transmit. Add in buffering so that you can seek, etc.
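For a sense of scale, a minimal back-of-envelope for that "see no pixels anywhere" case (every number here is an assumption: ~60 pixels/degree to match foveal acuity, stereo, 10-bit RGB, 90 fps):

    # Uncompressed data rate for full-sphere VR at foveal resolution.
    PPD = 60                  # pixels per degree (assumed foveal acuity)
    WIDTH = 360 * PPD         # full sphere horizontally
    HEIGHT = 180 * PPD        # and vertically
    EYES = 2                  # stereo
    BITS_PER_PIXEL = 30       # 10-bit RGB (assumed)
    FPS = 90                  # common VR refresh rate (assumed)

    bps = WIDTH * HEIGHT * EYES * BITS_PER_PIXEL * FPS
    print(f"{bps / 1e12:.2f} Tbit/s uncompressed")
    # 1.26 Tbit/s -- even lossless compression only buys a small constant factor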

As for genuinely new forms of media, I could think of some: full-body experiences with feeling of touch, smell, etc., either live or recorded, possibly produced professionally in a studio, or just you sharing your last vacation to Venice.

Taking it further: uploading your consciousness to a body which is a large distance away, making physical travel of humans mostly obsolete: Maybe one day we can represent the brains of human individuals as data and send it with light speed around the earth and throughout space.


So here's an example of "next gen" media streaming in action. I was recently at a MotoGP race. MotoGP has a streaming media app that allows you to stream cams from your favorite racers on your mobile device in very high quality. This would be really cool to use for the many hours of racing throughout the day at the track, but while it's technically possible with current technology, the cost is prohibitive. With significantly higher throughput per tower, the cost of data should be significantly reduced, making it really cheap for a crowd of people at the race track to have a few 1080p streams each.


On-demand streaming of identical content to hundreds of users at the same time is not an efficient use of bandwidth.

It's like Netflix versus cable television: you can push the equivalent of hundreds of 1080p streams through broadcast cable television, but attempting to push on-demand IP packets to an equivalent number of subscribers would bog down horrifically if they all attempted to stream even a single show each (never mind that cable tuners can tune multiple shows at once).
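A rough sketch of that gap, with assumed numbers (a 6 MHz QAM256 cable channel carries about 38.8 Mbps; the ~150 usable channels and ~6 Mbps per 1080p stream are my guesses):

    # Broadcast cable vs on-demand unicast on the same plant capacity.
    CHANNELS = 150            # usable 6 MHz channels (assumed)
    MBPS_PER_CHANNEL = 38.8   # QAM256 payload per channel
    MBPS_PER_STREAM = 6.0     # 1080p stream bitrate (assumed)

    plant_mbps = CHANNELS * MBPS_PER_CHANNEL
    broadcast_streams = plant_mbps / MBPS_PER_STREAM
    print(f"broadcast: ~{broadcast_streams:.0f} distinct 1080p streams, "
          f"any number of viewers each")

    viewers = 2000
    print(f"unicast: {viewers} viewers need {viewers * MBPS_PER_STREAM:.0f} Mbps "
          f"vs {plant_mbps:.0f} Mbps available")
    # broadcast: ~970 distinct streams; unicast: 12000 Mbps demanded vs 5820 available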

What you need there is something much more akin to broadcast television - either a digital OTA video broadcast (good ol' digital television), or a microcell using multicast to broadcast a stream to any interested party.

(of course your phone probably doesn't have a DTV tuner, but when an RTL-SDR dongle is like $20, you should probably be asking why your phone isn't integrating that functionality. These days they don't even have FM tuners on phones anymore... despite the fact that in virtually all cases those are already built into the cellular chipset. IP-based singlecast is not a good paradigm for a lot of the use-cases that people come up with; it's just that it's the most profitable one for carriers, so it's the only one they'll support.)


In this instance it's not all identical content though. In this example, each bike has three cameras, along with a dozen or so camera angles around the track. Users can pick and choose among those different views and combine a hybrid view of their own personal choosing.

Of course, this could also be accomplished with DTV tuners, but there's a much higher probability of users having a 5G chipset in their phone than having a DTV tuner capable of tuning multiple channels, plus an antenna.


So essentially you want an app that lets you pick a couple of multicast groups to add yourself to, and then displays them in this "hybrid view of their choosing".

It's the same as what the app is currently doing, just with multicast groups instead of singlecast. And by doing so, you slash the network load: instead of carrying N x M separate copies (N users, M streams each on average), the network carries one copy of each distinct stream.
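On the receiver side, joining a multicast group is standard socket fare; here's a minimal Python sketch (the group address and port are hypothetical):

    import socket
    import struct

    GROUP = "239.1.1.1"  # hypothetical group carrying one camera feed
    PORT = 5004

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))

    # Join the group: the network, not the server, now delivers one shared
    # copy of the stream to every subscribed receiver.
    mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

    while True:
        packet, addr = sock.recvfrom(65535)
        # hand the video payload to a decoder here

Switching views is then just dropping one group membership and adding another.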

Don't get me wrong, I understand that this probably isn't currently implemented, but that's the kind of thing we should be looking at, before we decide to screw up weather forecasting and radioastronomy so that you can see your NAAAYYSSSCARR.

Even 5G is going to get eaten up under certain types of load, so it makes much more sense to look at ways to reduce traffic, the easiest of which is broadcasting rather than singlecast.


"...before we decide to screw up weather forecasting and radioastronomy so that you can see your NAAAYYSSSCARR."

This isn't the place for cheap comments like that.

Even if you have a point ideologically speaking we live in the real world where consumer money talks louder than forum comments. 5G is coming whether we like it or not because of people wanting to stream data for entertainment.


I guess that we can revisit that if weather forecasting does become substantially less reliable. And with global climate change, maybe we'll need reliable weather forecasting. But me, I'll be dead before it gets too bad, dog willing.


When hot dogs at sports stadiums cost $10 a pop, why do you think 1080p streams would be cheap?


It is dependent on supply/demand and business model. You could have asked the same thing when Google/Yahoo came into play as search engines: why do you think the knowledge of millions would be widely available and relatively cheap or "free"?


The stadium has a monopoly on hot dogs and event streaming, so they can gouge you to their heart's content.


So we up the bandwidth, something like Elon's Neuralink.


For how long? Is there a saturation and pull-back at some point?

Personally, I used my home gigabit extensively the first few months... but a year later I frequently find myself still tethered to my cellphone's 4G plan. It doesn't make a difference except once in a blue moon when I want to download something big.


More like AR: if you can run a computationally complex AR environment in the cloud and stream it in realtime to thin-client devices, it reduces the requirements for portable glasses etc.


Pretty bold predictions, I'd say. People still buy cable even with the rise of a half dozen streaming platforms, and as long as you have to buy internet from a cable company, people will still be cowed into buying cable. Comcast (or anyone else; they all play the same exact game) can just make internet-only packages egregiously expensive unless you also buy a cable subscription, because many customers don't have another telecom option, and lawmakers are on the telecoms' side.

5G is also going to need better coverage than anything ever made before if it's to replace in-home wifi. As it stands, I drop calls on the data connection when I walk into different rooms of my house. 4G drops to 3G or even EDGE all the time going in and out of or between buildings. That being said, I'm waiting to see what route telecoms pounce on to force us into 5G use. Will they degrade other data connections or go the planned-obsolescence route? Either way, they are going to get their return; this isn't done out of technical altruism.


I would adopt a 5G to Ethernet/WiFi router just for the ability to drop Comcast. I would pay a premium to drop Comcast, but I don't have any other high-speed options. They have been so incredibly horrible. What I really want is some means, any means, to have a reliable high-speed connection that doesn't require 20+ hours on the phone and multiple service calls to admit that the connection from their box to my house is faulty. Being able to take my "home" connection with me on the road is just a side-benefit.

Edit: that said, I would probably also be pretty happy with a 4G plan that I could use for this purpose.


4G LTE is already able to deliver well over 100 MBit/s of internet connectivity, if deployed sufficiently.


Less prediction and more observation of what is already taking shape in Asia.

Anti zero-rating laws prevent broader adoption in U.S. (and possibly for good reason. Can't have 2-3 pay-for-play gatekeeping apps to the Internet)


>Personally, the industrial 5G IoT applications are far more interesting.

Which are?


That's the crazy thing. Almost nobody asked for 5G. And certainly not our health. They're basically just stuffing this down our throats in the name of money.

There's an international petition [0] and one in Switzerland [1] that I know of - please sign if you agree.

[0] https://www.5gspaceappeal.org/

[1] https://www.change.org/p/p%C3%A9tition-contre-la-5g-et-ses-d...


I agree. I have yet to see any solid argument for 5G being some sort of game-changer for the end user.


I thought it was just me with the "who cares about 5G" attitude. Just give me better 4G coverage. Give me another tower in my area so my cell service isn't out while they work on the only one that serves me. Give me coverage that works at my neighbor's house too. I have poor coverage, but 700 feet away he doesn't have service.


5G won't even get you better coverage, that's more a question of which frequencies are used. Some European countries are freeing up the 700 to 800MHz spectrum which can help, but that's not really a 5G issue I believe.

The high frequency ranges suggested for 5G seem useless; they'll barely be able to penetrate windows.

Increased coverage would be far more helpful than faster speed in most countries.


I'm partially with you there. I think it's partly that IoT will require higher bandwidth, as there will be so many more devices. However, when I visit my mum in Wales it's really difficult to get a signal at all; even in the nearest town it can be annoyingly patchy. And there are many rural places with the same problems. Instead of spending millions boosting the speeds of the digital haves, why not spend to get a reliable service to the digital have-nots first?


I don't understand the IoT craze. My light switches work just fine, and if I really wanted to operate them remotely, I'd just get a clapper. A clapper is a lot more convenient than pulling out my smartphone and poking at the screen. I don't want my Roomba sending iRobot information about my home to sell to data brokers. I don't want my refrigerator telling my insurance company how much cheese I eat.

But suppose I wasn't a neoluddite and I was easily excited by digital salt shakers. As it stands, all of these products could be using wifi, which would still give me, the consumer, the ability to firewall them. That would let me access them on my LAN without letting them talk on the internet. The idea of putting 5G radios in them instead of wifi radios seems to be to deprive me, the consumer, of the opportunity to firewall them.


You are looking at IoT the same way some folks looked at the internet and the smartphone at their inception. Demographics will change and so will people's perceptions. It is possible that IoT will help refrigerators be less energy intensive depending on the produce inside, that heaters in cars and homes will turn on only when a family member is arriving home, that food will be heated before you arrive home. The list of possibilities goes on and on. Individuals and companies will have to decide how much they will trade for the IoT benefits they get.


When I come home, I can have food heated up for me already. It's called a slow cooker. I don't need a 5G radio in my slow cooker for that to work. And what a 5G radio can't do is get the meatballs out of my freezer and into that slow cooker. Similarly, thermostats on timers are older than cell phones.

If you want an automatically adjusting fridge, that can and should be implemented without the cell radio too. Put RFID chips in food packaging that request a particular temperature; the fridge then sets itself to the lowest temperature requested. A fridge that instead broadcasts its contents to some corporation that does not have your best interests at heart is an abomination, but I do not doubt that whichever corporation starts selling them will try to mask this by framing their spy devices as ecologically friendly, to make it all seem morally unassailable.


Agreed. Honestly, I don't understand the 5G hype. Moreover, if a device has an embedded transceiver, I don't control the network it's connected to. I literally lose control over the device, short of shutting it off or damaging the transceiver.


I already have my laptop ensuring all my lights and a space heater are off when I leave (detected by my phone's bluetooth), just in case I forgot. And this is with extremely simple plug-in outlets.


You're thinking of IoT as only personal home electronics. IoT might encompass a lot more, such as more connected fleet vehicles, infrastructure pipelines (water, gas, sewage, etc), traffic systems, and potentially a lot more.


Infrastructure can be wired, and there are only ~260 million registered road vehicles in America (most of which are not fleet vehicles.)


Infrastructure can be wired, but it is also extremely costly. Just ask Google the costs of trenching up city streets to lay new cable. True, new infrastructure could be built to handle this new cabling requirement, but then we're at the same place we're at now with such tasks being prohibitively expensive for anything currently deployed.


More available bandwidth ultimately means higher data caps at lower (or the same) cost.


That result would go completely against the well-established behavior of the cell companies, at least in the US.


Not really... while prices do not generally go down[1], $$/GB does tend to trend down over time. The important thing to the business ultimately is revenue/investment, not revenue/GB.

[1] I have to also say generally, as some of the MVNOs do tend to lower data prices over time. There are some bargains to be had there, just with lower network prioritization.


Ah, I see. You and I are just valuing things differently. Personally, I don't care about cost/MB nearly as much as I care about the "Amount Due" line of my bill.


Yeah, I definitely see that perspective. But I also see the trendline of our usage steadily increasing, and the older pricing models would get amazingly expensive for heavier users.

As it is, there are some cheaper options as well (eg, consumer cellular or other MVNOs).


re: " I can already communicate with anyone in the world, "

Summing up some of the other comments: 5G isn't about (personal) communication per se. No more "accessing the cloud". With 5G we'll be wrapped in the cloud. Etc.

I'm not saying it's imperative we go there. Only that that's where we're headed. For better or worse. Like it or not.


Damn. Maybe the past few years have made me a pessimist, but these days the absolute last thing I want is to be ‘wrapped in the cloud’ in any way, shape, or form.


  Talking to myself
  Crying out loud
  Only I can hear me, I'm
  Stuck inside a cloud


seriously, more and more I find myself wanting to minimize how much stuff I have “in the cloud”, especially parts of the cloud controlled by for-profit corporations.


I'm not saying we won't ever need faster speed, but right now, 4G LTE when working correctly is plenty fast. And we need higher or no data caps far more urgently than faster.

Last thing I need is a data connection so fast it can burn through my data cap in 5 seconds.


And for that, more bandwidth is needed...


>I'm not saying it's imperative we go there. Only that that's where we're headed. For better of worse. Like it or not.

You're doing that thing where people say it isn't something, only to reveal a few sentences later that you mean the opposite. "It's not imperative but it's certainly going to happen."


No. Really. I wasn't :) I simply didn't want my analysis to be misinterpreted as an endorsement. Without that disclaimer / clarification, those reading will default to confirmation bias and have me casting a vote, so to speak, in favor of 5G.

Just the facts Jack :)


I love this accidental "does free will exist?" discussion.

pretendscholar is asserting that because it will happen, it must happen. You're arguing that it need not despite the fact that it will.


Well I'm just quoting op and the incoherence of the first part as it relates to the second.


They don't read as incoherent to me. Imperative doesn't mean "will happen" it means "must happen" (to achieve some end).

So the quoted comment to me says "it doesn't need to happen (not imperative) but it's what likely will happen (where it's going)". No internal inconsistency.


What does that even mean?


Nothing. It's politician's bullshit bingo regurgitated so many times through the media that now people on HN start to echo it.

We need 5G for self driving cars

We need 5G for industry automation

We need 5G for IoT (because that makes so much sense, lol)

We need 5G for ...


I just saw a recent article touting the benefits of 5G for smart traffic signals.... but it's not clear why those same traffic signals couldn't use 4G, or even 2G, it's not like traffic counts need a lot of data bandwidth.

Or, you know, it's not like traffic signals move around on their own, so they could just be hardwired.


There are already traffic signals out there that use radio signals to communicate with each other. They are used to manage the traffic around road construction sites. E.g.: http://www.fabema.de/en/products/traffic-light-systems/wirel...

So 5G is not needed for that purpose at all.


It is my understanding that while traffic signals are almost always hardwired to mains supply (a somewhat surprising observation is that traffic cameras often are not, but use Pb battery packs that are recharged during the night from street lighting power), they very often use radio interfaces for communication, be it something PACTOR-like, point-to-point 802.11 on unusual frequency bands (i.e. sub-GHz) or straight 802.11.

The sub-GHz 802.11 seems to be a uniquely American thing. And maybe just pushing anything over RF is a somewhat uniquely Czech thing (caused by the fact that the Czech spectrum allocation has a somewhat unique additional 10GHz ISM band), but in Prague there is a giant nest of smallish microwave antennas on every tenth street light pole (the extreme is probably Malostranske namesti, where there is a 10GHz link between two traffic signals that spans about 70m).


With regard to applying ultra-high-bandwidth networks to devices which don't need a lot of bandwidth: the point of moving to a higher-bandwidth network is to allow these devices to quickly report back what they want to say and then get off the air faster. The faster they can report their small heartbeat of data, the more devices you can support. Let's assume this traffic light sends a snapshot of the intersection at a frame a second, each frame being 100KB. So 100KB/s would be the normal load, and there's no real need for it to have a dedicated 100Mbit link. However, if you've got a few thousand devices wanting to share that link space, you're going to need more bandwidth available to the whole. It's also good for the general responsiveness of the network for that 100KB transfer to happen in less time, as only a single device can talk on the network at once.

It's the same thing with home WiFi connectivity: 54Mbit would theoretically be plenty of bandwidth for a movie stream and a few phones/tablets browsing social media, but if you're using 802.11g you'll probably have issues with your video stream as those phones and tablets hog the airwaves while they load a batch of images. It takes time servicing multiple devices in the same channel, time that takes away from the max effective bandwidth available.
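To make the airtime point concrete, a small sketch using the numbers above (the device count and link rates are my picks for illustration):

    # Shared-channel airtime: each device sends a 100KB snapshot per second,
    # and only one device can transmit at a time.
    FRAME_BITS = 100 * 1000 * 8  # 100KB snapshot
    DEVICES = 2000               # "a few thousand devices" (assumed)

    for name, bps in [("100 Mbit shared link", 100e6), ("1 Gbit shared link", 1e9)]:
        airtime_per_burst = FRAME_BITS / bps   # seconds on air per snapshot
        total = airtime_per_burst * DEVICES    # airtime demanded per 1 s window
        print(f"{name}: {total:.1f} s of airtime needed per second")

    # 100 Mbit: 16.0 s needed per second -> hopelessly oversubscribed
    # 1 Gbit:    1.6 s needed per second -> still oversubscribed
    # Raising the link rate shrinks each burst, which is exactly what lets
    # one channel carry more low-rate devices.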

Also, while it's true one could just wire up all of these fixed devices, the wiring costs would massively drive up the cost of implementation. Trenching city streets to lay new cables costs a lot of money; just ask Google.


You're talking about 802.11g and 100Mbit links, while I was talking about cellular data -- two completely different things.

A cell tower has more than a single channel and a city with a thousand traffic lights is going to have many cells, so using existing cellular generations is not going to run into problems with collisions from 1000 traffic signals trying to send data at once.


At the basic level these two things are pretty similar. It's the same concept, just a difference in scale. The tower can only slice time so much before clients have to wait longer and longer to transmit. The faster a client can send its burst of data and get off the air, the more clients you can have connected per tower. I've personally experienced plenty of circumstances where even though signal strength is fine, there just isn't enough bandwidth for reliable cellular service.


I understand that if a single cell tower is overloaded, then its clients will have issues talking to that single cell tower.

My point is that a city with a thousand traffic lights will be covered by hundreds if not thousands of cells, so the issue of traffic lights competing for bandwidth from a single source is nonexistent.

And this problem doesn't change for 2G, 3G, 4G, or even 5G. Perhaps 5G can scale better but if you had a single 5G tower covering an entire city, it too would run out of capacity.


Or even use LoRa for meshing which is 50kbps over 10km. As you said, there's only so much data a traffic signal needs.


A 4G network tower can only handle so many devices connected to it; 5G also solves that problem. Speed is great, but the big change is that the towers can handle many more devices (which is why it's touted as the reason IoT/self-driving cars/etc. can now work). If you've ever been to a large conference or festival, you can see what happens when too many devices try to use the network. It just loses all reliability and speed.

So the shorter distance the 5G signals travel also means more towers with more connections. My point is, don't focus on the speed as the huge innovation; it's the number of devices possible. Although once it's available, you will see apps come out that you may not have expected, that take advantage of the higher speeds to do cool things.

But yes, we need to move in the direction of 5G or something similar to allow the tech you listed to take off.


5G will be so pervasive that it will be abnormal for a device or appliance to not be online. 5G will be the aether, and devices will expect to be able to access it, whether it's tied to your personal connection/identity or not.


Except entertainment does count according to customer use metrics.


Does anyone feel 4G is lacking in any way? I don't think I'd notice if my phone's network was any faster.


The biggest thing preventing me from using 4G more has nothing to do with speed or coverage. It's the caps and pricing on bandwidth consumption. I doubt 5G is going to change that.


5G will increase capacity, especially in urban areas, enabling cheaper unlimited plans.


Looking only at capacity, it will definitely be cheaper to provide unlimited plans, but even if that were the only factor, there isn't much of an incentive for carriers to change how they charge for bandwidth.

In all likelihood prices will go up as 5G requires a massive infrastructure deployment of very expensive hardware, spectrum licensing, and they'll have to pay to transmit and receive that data to the internet. All of those additional costs will be passed on to customers.


Even if it weren't expensive to deploy, carrier customers pay for value provided. 5G will be in some way better, so more expensive. In the same way, we used to pay insane roaming rates until the EU ruled they couldn't stay; suddenly carriers advertise how cheap roaming is with them and how they're the first ones to offer it in the country...


How? The investment required is bigger than for 4G.


The real issue is coverage, not highest-available speed. It doesn't matter if Verizon puts up a tower that lets you download 10GB per second if you live in an area that Verizon doesn't care about servicing.


It doesn't matter if Verizon lets you download 10G a second if the average 10G/month data plan costs hundreds


but how/why would 5G increase coverage?


Faster autoplay ads to "serve you better" so to speak, no?


Of course... and they'll be better targeted to your precise location for tracking purposes too!


Let's not get too hyperbolic - 4G isn't perfect. I do regularly have issues getting any data connection in my office building in Chicago.

...that said, from the other comments, it sounds like 5G would make this issue worse...


I think it's more about power usage. Shorter bursts of more data vs having the radio on for longer.


Circuses, but maybe not so much bread?


People can already stream music and Netflix (or whatever) and be happy with it. You don't need a 4K revolution (or lossless music) on your tablet. If you cared that much about quality, you'd watch $whatever on an actual monitor (which, given our technology, is EITHER easily mobile OR better in 4K). As such, "people entertain themselves" doesn't justify 5G, just better 4G coverage.


But entertainment does count for me and billions of others like me.


At least unexpected weather gives you something entertaining to talk about and post pictures of on social media.


The confusing thing for me (not exactly well versed in networks) is that currently all 5G base stations seem to have ranges more akin to modern WiFi endpoints than LTE, with regard to how densely deployments will need to be placed. Am I missing something?


This is what gets me, too. LTE is great because it's mobile data - the range to a phone tower seems to be about the same as 2g/3g and the data rate is awesome.

5G is more like roaming wifi, and the data rate drops rapidly to the point where (iirc) if you're more than a couple of hundred meters from a tower it's no better than LTE. Why would I want this?


I wonder if, like the 3G and 4G rollouts in most countries, the first batch of deployments will use the higher frequencies for the new technology, keeping the lower frequencies for the legacy technology, so everyone retains access to the existing coverage while some people get some coverage on the new tech. Then, as user adoption of the new tech reaches a certain point, the lower frequencies get reallocated to the new technology, once the risk of reducing coverage for users of the old tech is no longer a significant issue. I've seen this occur with 2G to 3G (3G started with deployments only at 2100 MHz but is now mainly deployed at 900 MHz, replacing the previous 2G services in that band) and 3G to 4G (4G was initially mostly in 1800 MHz, but many deployments now are in 700 MHz refarmed from TV and 900 MHz refarmed from 2G/3G).


Why does entertainment not count? Entertainment is the basis of our entire civilisation. If everyone were content then no-one would be striving to create better movies, cook better food, play sports, create higher definition TVs, increase processor power, communicate and travel across the world.

If we were all happy to sit on our own in silence, eating the same local crop day in, day out, then very little would be developed.


To me the greatest boon is to rural areas. I am incredibly excited about the opportunity 5G opens up to those who currently live with marginal access to the Internet because the monopolistic world of ISPs has prevented fast, consistent internet outside of major cities and population centers. The opportunity for education and services to everyone is the major promise of 5G. Now I just sit here and hope that actually comes to pass.


I was under the impression that the high frequency/short wavelength of 5G would practically limit it mostly to cities.


Exactly. I predict many areas will never see 5G. You need more transmitters to cover the same distance as 4G.


Many rural areas don’t even have 4G yet. These areas are even less likely to get 5G anytime soon, IMHO.


Think for a moment about how big an impact 4G had, and 3G before it. As others mentioned above, there's a law that says any new computing resource will eventually get filled; there are plenty of areas where this is worthwhile.

The biggest, first off, would likely be the end of wifi as we know it: an increasing number of IoT devices for the home/workplace/elsewhere would be built without the need for complicated wifi setup, and all syncing between devices would likely be done over a cellular connection. "Cable" would likely go away completely, with companies like Verizon realizing they can deliver all TV over their 5G network. The constraints relating to the number of devices on a wifi router would fade into the background, etc.

Another key piece is that there will be far more nodes in a 5G deployment, so you could do better triangulation for positioning. It will have lower latency, so even tighter real-time applications can be done over the internet. This list can go on and on.


> the end of wifi as we know it

Can you explain what specifically about 5G makes this possible? I don't know anything about it, but if the logistics and pricing for it are anything like 4G, I have no interest in paying my telecom an extra $5/mo per device when I can throw them on my own network for free.


Because potentially "your network" goes away. You have no home-based "line".

I'm skeptical, very, but the appeal is obvious if it works and the pricing isn't obscene.


The pricing will be higher than 4G. The data caps will stay similar. Carriers are not charities and don't operate at cost - if they can charge us for extra value, they will.


s/for extra value//


Why don't I have a home-based line? Are routers illegal now or something? Are the receivers locked down and proprietary? Is someone taking my computer away?


What, are wireless providers gonna allow these new IoT devices on their network for free? Are they suddenly gonna stop making us pay through the nose for bandwidth on our phones? Is tethering not gonna exist, or not be an extra $20 per month charge?

I don't see any of these things happening with 5G.


How big of an impact did 4G really have? I had Google Maps and text messaging on 2G. Now I can mindlessly scroll through high res feeds full of memes too?


I remember mapping being really slow on 2G. 3G improved things but peak time would see issues. 4G finally saw mapping work well 99% of the time for me since the speeds were always good enough.


Modern economics is a religion of growth for its own sake?


That might be a simple no, but let's look at what growth is: an increase in consumption. And there's still a long way to go when it comes to the economy providing the basic stuff for a decent life.

Sure, if all that extra growth does not translate into increased well being for the median, or if it can't lift those out of poverty who need those efficiency gains that led to "growth", then, no, of course that growth is useless.

However: growth is amoral, and most economists are trying to understand the economy, not normatively influence it. (Though there are a lot of op-eds and blogs by economists.) And how the economy works, how that growth happens, is not up to the "modern economy"; it's up to people. Politics.

Modern economics is the religion of Cassandraism, anyone who understands what's going on is unable to persuade others. Look at how the Net Neutrality debate went down the drain, because somehow libertarian/dumb free market advocates can't comprehend that zero-touch no-regs complex systems have a tendency to end up in a pathological state. Or look at how people still can't believe that a bail-out was better for the economy short term, and how trying to do central planning without honest signals likely won't work.

And this doesn't mean that it's easy to figure out these things, or that there's a 100% idiot proof way to get magical silver-bullet solutions, far from it. But it seems pretty straightforward that blindly ignoring those who study these things all their lives and just repeating a mantra - be it socialism or libertarianism - is not exactly helpful.


I am not saying the benefits outweigh the costs, but I think there are huge inroads to be made for high-bandwidth communication tools like many-to-many video conferencing, which is tantalizingly close to being fast enough but still a pain in most situations I've encountered. Disclaimer: I am thoroughly ignorant about 4G and 5G, so I don't know if 5G would help with the typical issues that come up in video conferencing.


It's painful even on a 1 Gbit/s dedicated fiber connection. Most video conferencing tools just suck.


The main interesting part isn’t about making phones faster (although that’s a fantastic side effect, since now it’ll be feasible to work off a cellular connection instead of having to find a good cable link for fast internet); it’s about everything else. Houses don’t necessarily need wired-to-the-home fiber; instead, a 5G antenna on the side could easily get you gigabit.


> Entertainment doesn't count.

But it's what, like, 90% of users use their devices for! :-/

Sorry about that, but you (and I) are a minority.


Just because most users use their devices for entertainment doesn't mean that entertainment is a valid reason to need more bandwidth. Most users couldn't give a good gosh dang whether the resolution is higher on their 6" phone. It makes literally (literally) no difference.

Nice emoticon, btw. It definitely didn't make me completely disregard your opinion.


I think this is a great point, and that the main boon to 5G is the sprouting of lots of localized towers. It seems like a big giveaway to big telcos to the detriment of cable providers with bigger physical plants and capital needs.

End of the day, as shitty as Spectrum and Comcast are, they are a better devil than the AT&T corporate offspring.


Maintaining the existing level of service as more people use their phones more heavily at greater population densities.

As I understand it, 5G is more about network scalability than about improving end-user experience (though of course end-user experience falls off a cliff if the network reaches saturation).


It's so that we can make cable companies obsolete without running fiber everywhere. 4G networks just don't have the capacity to support use for TV streaming - the hope is that 5G will.


We will have to run fiber to all those 5G hotspots. 5G is very short range, it does not extend access to anywhere that doesn't already have good access, it just makes it possible to run devices at higher speeds without being tethered to a wall.


Yup! It means fiber to the neighborhood rather than fiber to the home. The proposal is that fiber to the neighborhood will be considerably cheaper than fiber to the home.


I don't believe cable companies will become obsolete anytime soon, lol. There's a tendency to rely on cloud stuff more and more: we stream video instead of downloading, 4K, 8K, we store entire collections on Google Drive and such. And even if you think 5G will solve all problems with bandwidth - well, I won't argue, but there are plenty of applications (like gaming) that require low latency, which 5G will not solve.


Note that I'm not necessarily arguing that this line of thinking (5g will obsolete cable companies) is correct, only that that's why mobile operators want it so badly - they want you to be able to drop Comcast and give them your money instead.

That being said there's absolutely no reason why 5G (or even 4G or low Earth orbit Sat) couldn't compete with cable companies on latency. In many places 4G is already lower latency than existing cable networks.

Edit to add: 4G also already offers fast enough connections for 4K and in many places 8K streaming. It's the total capacity that isn't there - they have to put data caps on so people don't do all their streaming on their mobile connection. 5G is more about removing data caps than providing faster speeds.


Why do we need it? To send you ads quicker, of course


Be sure to research (and laugh at) energy-rated (low-E) windows and 5G's range.


I agree.


[flagged]


Is it? I wasn't aware of this so I did some googling and found this study:

https://link.springer.com/article/10.1007/s13592-011-0016-x

This doesn't seem like pseudoscience to me, but I don't really have context here. Is this considered to be a poor study or a disreputable journal?

Edit: Apparently it's been cited 61 times since publication: https://scholar.google.com/scholar?cites=270710047801738518

Edit 2: Okay, there's definitely skepticism: https://www.huffpost.com/entry/cell-phones-kill-bees-study_n...

I'm not sure if I'd call this "pseudoscience" as much as "speculative and controversial."


Here's a study that looked at microwave absorption in insects at different frequencies (picture):

https://media.springernature.com/lw900/springer-static/image...

"All insects showed a general increase in absorbed RF power at and above 6 GHz, in comparison to the absorbed RF power below 6 GHz. Our simulations showed that a shift of 10% of the incident power density to frequencies above 6 GHz would lead to an increase in absorbed power between 3–370%."

https://www.nature.com/articles/s41598-018-22271-3

It doesn't seem like pseudoscience to me either. In fact, it looks to me like more real science needs to be done on this subject, to figure out whether there is a real danger or not.


prev. https://news.ycombinator.com/item?id=2537839

(I recall a much longer discussion, but could not find it)


That's a completely fair point. I'd read a little on it, nothing conclusive, and as md224 pointed out, there's been some back-and-forth, and the science isn't settled. I figured that including "possibly" in my OP covered my bases. I recognize that this might be seen as manipulative though. Could someone suggest an alternate phrasing?

Regardless of the veracity of the bee thing, though, I think the salient point is that having lots of EM radiation in the environment will have non-obvious effects, so we should seek a rationale FOR the technology, rather than needing a reason to reject it.

(thanks to adsfqwop and avip for providing context below.)


> possibly on bees

Citation needed, this seems like FUD.


Have a look at section 2.4 of this document:

https://www.pathophysiologyjournal.com/article/S0928-4680(09...

Some excerpts:

"Panagopoulos et al. [99] exposed fruit flies (D. melanogaster) to radiation from a mobile phone (900 MHz) during the 2–5 first days of adulthood. The reproductive capacity of the species reduced by 50–60% in modulated radiation conditions."

"The authors concluded that radio frequencies, specifically GSM, are highly bioactive and provoke significant changes in physiological functions of living organisms."

The paper references over 100 studies in total on the subject of biological RF effects.


Hey jobson, fair point. Please read my reply to exabrial below. There's a couple other commenters with salient links there, too.


As an amateur astronomer, I would greatly prefer preserving radio astronomy to allowing folks to have faster facebook crap streams on mobile devices.


Area Man Prefers His Own Interests Over Interests of Others


That doesn't make his interest invalid. In fact, it's the basis of democracy - we add up the number of people with interest A vs. interest B and do the one with the most votes.

That's important. Our own interests are important. It's equally important that we discuss why our interests are important enough to disallow others from pursuing theirs. And then persuade others to our side. So don't dismiss someone for having an opinion in contrast to anyone else, argue the opinion itself.


Are you referring to the astronomer or the 5G user? ;)


Obvious solution: we designate one half of the planet as reserved for astronomy. The earth rotates, providing an opportunity to see everything. Then the people who want to look up can move to that half of the world, and the people who want to look down can move to their half.


Sounds like a great SF plot. Please do send me a copy of the book!


All things considered, I'm glad we don't live under an amateur astronomer dictatorship.


Radio astronomy has never had a better moment to push for its interest than now - just fire up the PR machine, and say: "Seen that black hole pic? Want more things like that? Better pics? Well, then let us keep our spectrum!"


There have got to be other solutions for radio astronomy. Let's look at options like putting telescope relays at lagrange points, or in orbit of other planets where there is no spectrum interference.

The black hole pic was cool, but you know what would make it even cooler and higher-res? Using the same technique with an even wider (interplanetary) array of telescopes.


> Let's look at options like putting telescope relays at lagrange points

That's precisely what Millimetron/Spektr-M aims to do, but it still needs the ground part. You can't put the 100m Green Bank Telescope into orbit.


I'm all 100% into that. Wish there was a way to help make this happen.


Sure, but what is important for me or you individually is not how we make decisions on policy affecting everyone collectively


No, we look at what makes the rich people richer, whilst giving occasional appeasements and not going too far so as to ensure there are enough "slaves" to do the work.

If it were about collectivism, copyright would last about 5 years and no one would get paid more than a few times the median.


I'll rephrase my comment.

> Sure, but what is important for me or you individually is not how we should make decisions on policy affecting everyone collectively


>> save a local river bait fish

A few or more of those and we have a real loss in biodiversity. Maybe your "local river" can sustain that for a bit - but overall, they are all important.


Would it be possible to send up a space based telescope in those frequencies?

Maybe the economic benefit of 5G would be enough to justify a one time cost of a telescope launch, especially if US, EU, and Russia all pitch in.


What does 5g have to do with satellites? It’s a marketing push around some scheduled cellular equipment upgrades.


They're probably not related, but I was thinking of the thing that SpaceX and another company are planning for satellite internet. I've heard that it would use the same centimeter wavelengths, probably for bandwidth reasons among others. I assumed that they were likely the same thing if they were both operating in the same frequency range.


If 5G is going to impact radio astronomy then the governments that license the spectrum should fund alternatives. Some simple space-based telescopes orbiting out beyond the 5G bubble would be expensive but not terribly difficult (radio telescopes, not the JWST). Put a couple out beyond the moon and the next image of a black hole won't be so blurry.


Radio telescopes are way bigger and heavier (large dishes required). Even more importantly, usually many radio telescopes are used together for interferometric observations. They require exquisite calibration of the distance between the telescopes, which can be pretty difficult in space.


Actually not necessarily. Many light, cheap and small antennas can also work.

See p.8 and 10 for those used in the current state of the art telescope at 160MHz https://www.aanda.org/articles/aa/pdf/2013/08/aa20873-12.pdf And https://cdr.skatelescope.org/#photos?lfaa for those used in the SKA, one of the largest future radio telescopes that is still under construction.

Getting the distances between the telescopes is actually easier in space. Both distance measurement and data transmission could be done with lasers when there is a clear line of sight. Moreover, once an orbit is established, the laws of Kepler are 'followed' and predicting the telescopes' mutual distances is something we can do extremely well. On Earth with very long baselines it is much trickier, and things that need to be taken into consideration include cable-length differences due to temperature changes, tides, and continental drift. (Continental drift is actually measured with radio telescopes: in the reverse problem, when the locations of a set of sources on the sky are known to high precision, one can establish at what speed the distances between the telescopes are changing.)


You might want to look into what the NRO is already putting in space (supposedly) [1]. If the reports are to be believed, 100m dishes are already there, just pointing at the earth instead of into deep space. To your second point, it seems that relative positioning is mostly a solved problem at least for satellites in GSO (that's a core part of how GPS works). My personal opinion is that space-based radio astronomy is mostly a problem of cost, not of available tech.

[1] https://en.wikipedia.org/wiki/Orion_(satellite)


Not a solved problem, GPS is too inaccurate for interferometry. The problem is you want to do correlations between radio signals; consider a 1 GHz signal, which has a 30 centimeter wavelength. Error compounds as you integrate longer and longer signals. To do anything decent, you need to know absolute position within millimeters, and it can't be drifting too much.

Also, the positioning error isn't the whole story; there's also timing drift. Rubidium GPS-disciplined oscillators actually drift too much instantaneously to be useful...
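
To give a feel for the scale, here's a back-of-envelope sketch (Python; the 1 GHz frequency and the error sizes are illustrative assumptions, not any real correlator's error budget):

    import math

    # Phase error introduced by uncertainty in an interferometer
    # baseline: a path-length error of d at wavelength w is
    # 360 * d / w degrees of phase.
    C = 299_792_458.0  # speed of light, m/s

    def phase_error_deg(freq_hz, position_error_m):
        wavelength = C / freq_hz  # ~0.3 m at 1 GHz
        return 360.0 * position_error_m / wavelength

    # Even 1 cm of error is ~12 degrees of phase at 1 GHz;
    # GPS-grade errors (meters) are thousands of degrees.
    for err_m in (0.001, 0.01, 1.0):
        print(f"{err_m} m error -> {phase_error_deg(1e9, err_m):.0f} deg at 1 GHz")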


Laser interferometry will pinpoint the distance to far less than a millimeter.


Military spacecraft are almost always ahead of scientific ones, and a 100m foldable dish might be an engineering marvel, but it's a bit different from a 100m telescope. An actual telescope needs to guarantee that your data are correct within certain tolerance, and not just good enough to get a good SNR from another spacecraft.

Besides, as another poster noted, VLBAs need extremely precise positioning which is just another rabbit hole. Spektr-R orbit determination was really tricky for its baseline of 300000+ km.


I've always thought on the moon would be more interesting than "beyond" the moon. A ground-based dish on the far side of the moon, isolated from a large majority of the noise we're pumping into space by the body of the moon itself.


But it wouldn't collect the type of data that will be lost to the 5G spectrum. We can't change the microwave emission frequency of water, but we can change our technology to not use a frequency close enough to interfere with it.


Large dishes and large networks of large dishes are economical on the ground in ways that are utterly impractical in space.


Yes and no. Those large dishes on Earth require massive support structures. In the zero gravity of space, large dishes can float virtually unsupported. There are spy sats with extremely large reflectors. The biggest geostationary ones can be seen from Earth, meaning they have reflectors on the order of 100m+ in diameter. Turn one of these around and we would have a very capable radio telescope, much as a redesign of another spysat program gave us Hubble.

https://en.wikipedia.org/wiki/USA-202

https://en.wikipedia.org/wiki/Orion_(satellite)


For the low, low price of 2 gigabucks each.

For an idea of scale, the entire VLA was built for ~80 megabucks. The entire Atacama Large Millimeter Array was built for 1.4 gigabucks.


Gigabucks and megabucks is how I’ll be talking of money from now on.


Totally worth it, even if just to add momentum to building space infrastructure and to increase the world's competence in space technologies.

Things have changed quite a bit over the past 20 years. Launches are getting much cheaper now, and satellites are getting smaller. I imagine costs could be brought down, especially if the scope of the mission were limited to, e.g., a simple constellation for radio interferometry.


More like: shouldn't the government have prevented 5G from being licensed in this spectrum in the first place?


We have already done a few low-Earth-orbit radio dishes for VLBI [0]. IIRC correlating the data from a single on-orbit dish was difficult because it was moving way faster, over larger distance, than the other elements of the array. Still learning how.

[0]: https://tinyurl.com/google-search-vsop


More info on impact of satellite constellations on radio astronomy: https://news.ycombinator.com/item?id=19712625


In addition to the good points raised nearby: The data rate from radio telescopes is enormous.


Just put a 5G tower on the radio tele... nevermind.


Data rates are high for the particular types of astronomy being done today because they are the cheaper capability. Adding new electronics to existing telescopes is less expensive than building a larger dish. But any astronomer would trade those data rates for the greater sensitivity/resolution and decreased noise of a massive space-based dish.


The stream is also in one direction.

I suspect an optical (laser) emitter could transmit either directly to Earth or to a retransmission satellite at GSO that already has a beefy radio downlink, and also serves as a regular comm sat.


autocorr, do you know if the Jansky VLA WIDAR correlator will be able to deal with 5G?

A primary design goal was terrestrial signal rejection; we would get nuked by ABQ Traffic Control radar etc.

I haven't kept up. I suppose I should find out.


Has anyone done a deep dive on 5G health concerns? E.g., 240-some scientists and 40 doctors signed a letter of discouragement (or something) claiming research indicates 5G interacts with human biology in poorly understood ways: https://ehtrust.org/key-issues/cell-phoneswireless/5g-networ...


Unfortunately, as a society I think we're going to have to "pee on the electric fence" for ourselves to find out.

Despite the fact that this spectrum has never been used for any widespread purpose, we're rolling it out and the burden is not on the implementers to prove that it is safe. It's basically on researchers to both prove, publicize, and convince society as a whole that 5G has health impacts.

I am not going to go all conspiracy-theory and say that the research is being suppressed, but certainly funding for this research is not going to be a priority for the US government, as they've been thoroughly bought and paid for. Most research into the health effects of non-ionizing radiation is not funded by the US government, so draw your own conclusions from that.


It's all nonsense. The higher frequencies are non-ionizing and in common use today. See all of those microwave antennas on buildings and towers? A lot of them transmit around there.

Your neighbor can blast 24 ghz right at your house with a free licence:

https://en.wikipedia.org/wiki/1.2-centimeter_band

The reason microwaves (like those from a microwave oven) are dangerous is their power levels. Like, they'd cook you if they leaked out. Not because of ionizing radiation.


Why is it that you can just dismiss the hesitancy of more than a hundred researchers with PhD's in a wide range of relevant disciplines who have made careers out of studying this kind of thing by simply posting a comment saying "it's all nonsense"?

https://emfscientist.org/index.php/emf-scientist-appeal

ctrl-f for "PhD" shows 126 results on the page.

Electromagnetic radiation is used as a signaling medium for countless lifeforms on earth, including the cells and microbes in our own bodies.

Imagine if some alien creature decided they wanted to flash relatively bright blue light across the entire surface of the planet, day and night, at frequencies that are known to induce epileptic seizures. It'd affect your quality of life and mine. For some people, it'd be devastating.

Just because we can't see it, doesn't mean it's not there.

I fail to see how a blanket dismissal of people's concerns, however uninformed many of them may be, counts as "rational"!


I can find 100 crackpot scientists as easily as I can find 100 crackpot doctors or mathematicians.

Instead of appeals to authority, why can't we work with the facts and evidence that we have already (those overwhelmingly disproving the health hazards of 5G)?

"Peoples concerns" equate to the desire to halt all human progress out of some bizarre mix of paranoia and Neo-Luddism.


"why can't we work with the facts and evidence that we have already (those overwhelmingly disproving the health hazards of 5G)?"

Yeah well, I'd like to see that overwhelming evidence please.

But since I am not a biologist or a physicist, I could not really examine it in detail. So I would have to trust the authoritative power of the experts who say it is all good.

Then I compare their arguments to those of the experts who say otherwise.


That's an ad hominem attack.

It's an inconvenient possibility (not going to call it truth) that our technology might be hurting us. We're addicted to technology the same way smokers are addicted to nicotine.

Smoking was at one time considered healthy and promoted by doctors, and now where are we?


Technology is a religion to this forum, so good luck convincing others that electromagnetic radiation might have an impact on human biology that is not well understood.

If it doesn't cook you like microwaves or cause radiation sickness 3 hours after exposure, it is immediately declared universally safe.

Of course that's negligent at best. Looks like we will live through in vivo testing for those questions, I suppose.


Yep, totally get it. That's why I'm using a throwaway, I'm a heretic to the highest, on the level of an anti-vaxxer.


>ctrl-f for "PhD" shows 126 results on the page.

I took a random one for the country I live in: According to wikipedia, this person who got his PhD in 1973 and has since retired has done research in 'quantum philosophy and spirituality'.

The second one, also retired, is a biologist who researched subterranean mammals. He and his team got an Ig Nobel Prize in 2014 for their finding that dogs align themselves to earth's magnetic field when pooping.

I'm not sure I want to continue.


> that dogs align themselves to earth's magnetic field when pooping.

Well, actually, that seems like relevant information for this topic, because if it is true it means that dogs can sense magnetic fields. And as we know, RF waves are composed of both an electric and a magnetic component.

So this guy might actually know something that the rest of us do not know. If dogs can sense magnetic fields, it means electromagnetic radiation could have an unknown impact on their sense of direction. Sounds important to me.

Here's another guy that seems to know what he is talking about.

Prof. Olle Johansson, Ph.D., Dept. of Neuroscience, Karolinska Institute, Sweden

Seems like someone you might want to listen to, especially since he has participated in several studies on this subject:

https://www.cellphonetaskforce.org/the-work-of-olle-johansso...

You can also find his talks on the subject on Youtube, which I suggest people watch. They are very easy to listen to, and provides a good introduction to these topics.

Then we have this guy:

Dr. Paul Héroux, Ph.D., Director, Occupational Health Program, McGill University; InvitroPlus Labs, Royal Victoria Hospital, McGill University, Canada

He was commissioned by the electrical power companies to invent the RF dosimeter in 1991, which many now use to gauge exposure levels.

https://www.ncbi.nlm.nih.gov/pubmed/1930308

I'd certainly want to know what he has to say. Let's look at some of the rest:

https://www.ncbi.nlm.nih.gov/pubmed/?term=H%C3%A9roux%20P%5B...

Several publications relating EMF exposure. I think we want to know what he has to say also.

So perhaps you should not have given up so easily, and instead continued to look a little further. We want to look for the people who have expertise in this field, and listen to what they have to say.

To finish off, I'd like to know where the scientists are who have studied this subject as extensively as someone like Héroux or Johansson, and who are willing to sign their names to a pledge of microwave safety for all of humanity. I haven't seen such a list.

"The 5G recommendation for global irradiation. Go ahead, we take responsibility for full irradiation." In fact there is no such recommendation, as both insurance companies and even the cellular manufacturers are increasingly distancing themselves from liability.

That's what I call a clue.


>ctrl-f for "PhD" shows 126 results on the page.

https://en.wikipedia.org/wiki/Argument_from_authority

Show me one species that uses GHz signalling.


> Like, they'd cook you if they leaked out.

Nope, they wouldn't. The standing wave would break down as soon as it left the enclosure. It would still be a potentially harmful amount of radiation, but not like that.

And no, 24 GHz is not in common use. It's used for special applications, mostly highly directional tower-to-tower links (I have one sitting on my roof for a high-bandwidth HAM radio link). With 5G, it will be ubiquitous in an unprecedented way. We've already seen plenty of interference with 5 GHz.

Of course, there's a lot of fearmongering whenever RF radiation is involved, but there are genuine safety concerns.

Source: HAM radio operator


In school, we used to learn about the RF service technicians returning with stories of dead birds and other such phantasmagoria in and around the sweet spots of the feedhorn, antenna, transmission line, transmitter or other such sensitive areas of high power microwave operations.

The professor also admonished us that such technicians must always be infinitely certain that the transmitter is not operational at the time of service.


> Professor also admonished us that such technicians must always be infinitely certain that the transmitter is not operational at the time of service.

It's a good thing your professor is not a tower technician. No self respecting technician climbs up a tower without knowing his potential exposure.

There is a reason both the US FCC and EU ICNIRP have guidelines for human exposure, and if you are a tower technician you wear a personal RF densitometer to ensure you are not exposed above these levels.

https://en.wikipedia.org/wiki/Personal_RF_safety_monitor

"Electromagnetic field densitometers, as used in the cellular phone industry, are referred as "personal RF safety monitors", personal protection monitors (PPM) or RF exposimeters.[1] They form part of the personal protective equipment worn by a person working in areas exposed to radio spectrum radiation."

And even with a densitometer, when working on high-powered live equipment, you also wear a protective suit:

https://upload.wikimedia.org/wikipedia/commons/c/c7/Nardaler...

I hope you forward this information to your professor. Industrial RF radiation is not something you want to play around with.


I saw a video recently where a guy took a densitometer to a street with 5G in my country, and let's just say that there were plenty of sweet spots where it was well over 500 uW/m2. I'm no expert or anything, but it seems like allowing 4 different vendors to put up 5G infrastructure all over the place might not be such a great idea.


I lived in a place where they had many co-located LTE base stations and power density about 1/4 mi away from these exceeded 75 mW/m^2. Now, I get it that a typical wifi base station transmits up to 500mW, but that's a point source and the inverse square law works to your advantage - the total dose in any one direction is minimal.

Not so when it's a huge multi-tier tower - that's your whole body facing the tower getting that dose.
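
To put rough numbers on the inverse square law, here's a minimal sketch (Python), assuming an ideal isotropic point source in free space. Real antennas have gain, and the 3 kW tower figure below is a made-up illustration, so treat the outputs as order-of-magnitude only.

    import math

    # Free-space power density of an isotropic source:
    # S = P / (4 * pi * r^2).
    def density_mw_per_m2(tx_watts, distance_m):
        return 1000.0 * tx_watts / (4.0 * math.pi * distance_m ** 2)

    # A 500 mW point source (wifi AP) dilutes quickly...
    for r in (1, 5, 20):
        print(f"500 mW @ {r:>2} m: {density_mw_per_m2(0.5, r):.3f} mW/m^2")

    # ...while kilowatts of effective radiated power (hypothetical
    # figure) still register hundreds of meters away.
    print(f"3 kW @ 400 m: {density_mw_per_m2(3000, 400):.3f} mW/m^2")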

I'm a software engineer, a technologist, by trade and I was in complete denial about this until I started getting serious eczema that only went away when I stopped using wireless gadgets and avoided staying too long in areas with cellular base stations.

I denied and denied and denied but repeated experiments on myself only revealed how my body reacts to this stuff. My ideal power density is less than 1mW/m^2 to not break out. And there seems to be a relationship to what LTE bands the tower is transmitting on - outside the US I seem to do better. It could be the frequency, or could be the modulation.


Hmm, your comment makes it sound as if I had said that my professor recommended reckless abandon, such that we should climb the tower in swimwear during a heavy rain after cranking the transmitter to maximum?


Yes it's true. Unfortunately I cannot edit my comment any more. I do apologize, as it seems I actually misinterpreted your comment.

You can erase all references to your professor and swimwear tower climbing from the text. Hats off to him! And again, sorry for the mistake.


No worries, I reckoned that a misunderstanding must have occurred!

Still, I couldn't resist a vision not unlike the penultimate and electrifying golf scene with Bill Murray & the bishop in Caddyshack.


I believe these "sweet spots" are technically referred to as "modes". Think of them as "balls" of RF located in a specific area.


>Nope, they wouldn't. The standing wave would break down as soon as it left the enclosure. It would still be a potentially harmful amount of radiation, but not like that.

Yeah that's a good point

>And no, 24 GHz is not in common use. It's used for special applications, mostly highly directional tower-to-tower links (I have one sitting on my roof for a high-bandwidth HAM radio link).

That's what I mean by common use. Omnidirectional antennas aren't in common use.


Non-ionizing radiation does have potential biological effects on Voltage-Gated Calcium Channels (VGCCs) which our cells have plenty of:

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3780531/


Not when immersed in a giant bag of salt water.


> See all of those microwave antennas on buildings and towers? A lot of them transmit around there.

But microwave links are P2P and directional, using narrow beams at a certain height so that humans (or other objects) don't get in the way. That will not be the case with 5G.


Microwave is a broad term. WiFi in your house is in the microwave band. Microwave CAN be directional and focused, but doesn't have to be.


Some of those items on the list are sending up red flags to me. The weapon use is purely because it heats skin; if you apply a million times less power for a million times as long, the danger level from heat is zero. Since it can't penetrate skin, it does make sense to classify head-skin exposure the same as foot-skin exposure.

The thing about sweat glands is interesting. And sadly I have no idea how to evaluate the quality of the studies linked.


The "good news" is that if it was serious, we would very likely already know. Therefore via some Bayesian inference we can claim that it's relatively harmless, but obviously worth keeping an eye on.

The bad news is that there are and will be entrenched interests that will likely try to "work around" any health concern, and maybe, potentially we will hear that the existence of 5G is worth it. (For example the tech advantage helps with healthcare more than the radiation harms us.)



I was wondering what exactly 5G will bring us. For the most part, all the tasks I need to do on a phone (pocket computer / communicator) can be done even with 3G (video, streaming music, and any website's loading time are more than acceptable at 3G speeds).

The only thing I can think of is that 5G will allow for more overall network bandwidth, so the data caps on "unlimited" plans wouldn't be needed. But compared to how we use our phones today, what new things will we be able to do with 5G that we can't do with current 4G/LTE?


With data caps I don't see much use for it. I still think it's hilarious that carriers advertise download speeds that would blow through most people's monthly data allowance in at most a couple of minutes.
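
The arithmetic is easy to check - a quick sketch (Python; the caps and speeds are made-up round numbers, not any particular plan):

    # Time for an advertised peak speed to burn through a monthly cap.
    def seconds_to_cap(cap_gb, speed_mbps):
        return cap_gb * 8000 / speed_mbps  # 1 GB ~= 8000 megabits

    for cap_gb, speed_mbps in [(10, 40), (10, 1000), (50, 1000)]:
        t = seconds_to_cap(cap_gb, speed_mbps)
        print(f"{cap_gb} GB cap at {speed_mbps} Mbps: {t:,.0f} s (~{t/60:.1f} min)")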


It would blow through your allowance if you downloaded something at that speed, constantly. Maybe other people understand it differently but I’ve always understood those ads as like ads for a top speed on a car. You’ll never go that fast normally, but you can if you need to.

People will continue downloading < 1 MB per request; it will just be much faster.


Except if you hit the top speed of your car you can continue driving around at the normal speed the whole rest of the time. You don't have to drive around at half the speed you want for a month.


That's just it, though. If 99% of your mobile internet usage consists of <1MB requests it doesn't matter if you're on 3G,4G,or 5G.


4G at least in general has lower latency than 3G, but I don't know if that's the protocol or just deployment.


You'd still get kicked for high ping even if you sat right under the 4G tower.


In our rural area with poor 4g (Verizon) signal, the latency is still usually better than our CenturyLink adsl. Something like 40ms vs 60ms.


Speak for yourself. With my grandfathered unlimited data plan from Verizon, I basically stream YouTube all day long.


It won't be long before companies are making extra-high bandwidth auto-playing ads to eat your allowance, not to mention the improved location tracking due to smaller cells.


I think this is a great point. With fast internet, my fear is that websites will load more and more crapola, eating our bandwidth, so basically we will indeed be buying 3 minutes of internet per month.


You can be sure that content creators will expand their media density to fill the available capacity.


>But compared to how we use our phones today, what new things will we be able to do with 5G that we can't do with current 4G/LTE?

5G has the potential to provide throughput and latency that is comparable to a fixed broadband connection. In reasonably competitive markets (i.e. not the US), that's A Big Deal. Latency is a particularly acute issue in many applications; 4G generally adds about 50ms in the best case scenario, but 5G can easily provide sub-millisecond latency. Imagine a near-future where it simply doesn't matter whether you're on WiFi or cellular, because they both provide the same experience.


You say that it provides sub-millisecond latency.

(Keep in mind that I know nothing of nothing.)

Is the sub-millisecond from repeater to repeater, or from origin to a final destination that's [n] repeaters away? Won't each repeater introduce its own latency?

What causes 4G's >50ms response time?


One thing is that 4G is generally centrally broken out to the internet.

So let's say you're in Scotland, the packet would enter the mobile operator's network, then go to London, where they've bridged the network to the internet.

So there's an extra there and back again.


But why should that change with 5G? Couldn't operators already now route traffic to the internet in more locations?


5G isn't just the radio network. It's a rearchitecture of the core network as well.

For example you can have a 5G core over LTE.


But the article is only about the radio network. 5G over LTE wouldn't bring the problems discussed in the article.


The same applies with roaming too, so in that case there is even more latency.


5G non-standalone can provide this due to the new core network. But we are still at the mercy of the speed of light: a round trip of 1 millisecond translates to about 150 km (95 miles) of distance between you and your resource. And that is without any signal processing at the receiving end or in the air-to-fiber interface.
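
The arithmetic behind that 150 km figure, for anyone who wants to check it (Python; the fiber refractive index of ~1.5 is a typical assumption):

    # Light-speed bound on round-trip latency: a 1 ms RTT covers
    # the path twice, and light in fiber is ~1.5x slower than in vacuum.
    C_VACUUM_KM_S = 299_792.458
    C_FIBER_KM_S = C_VACUUM_KM_S / 1.5

    rtt_s = 0.001  # 1 millisecond round trip
    print(f"max one-way distance (vacuum/air): {C_VACUUM_KM_S * rtt_s / 2:.0f} km")
    print(f"max one-way distance (fiber):      {C_FIBER_KM_S * rtt_s / 2:.0f} km")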


But the sub millisecond latency is only for local communication, isn't it? Which is most likely not available for most applications that phones use.


From my current location, it's a 13ms round-trip to the nearest Google server via a good fixed broadband connection; over 4G, that would be ~63ms because of the latency overhead of the cellular connection. That's a substantial difference for a lot of real-time applications.


I do not deny that 4G has worse latency, but 5G will not provide sub millisecond round trip latency for regular internet based applications. Sub millisecond latencies are due to local communication which can replace DSRC/802.11p in car2car communication and similar applications.

Your 4G results also seem quite bad, my personal test right now was more around 18ms broadband vs 31ms LTE. Google scholar results (https://scholar.google.de/scholar?hl=de&as_sdt=0%2C5&q=LTE+l...) I looked into seemed to suggest LTE is capable of ~20ms latency results, while (http://wirelessone.news/10-r/1007-lte-latency-today-9-ms-dow...) suggests that LTE-A (3gpp release 13) is capable of 14ms latency.


4G offers enough speed to replace my home internet connection, however aggregate bandwidth at the cell towers prevented this.

I assume with 5G we're just about at the point where you could ditch the home internet connection and go 100% cellular.


The average home connection in Canada is something around 15-20Mbps, IIRC. Your average 4G is already faster than that. I get similar speeds on average on my phone compared to my home internet.

The two reasons I still have dedicated home internet at this point are data caps and latency (gaming). Otherwise, my phone usually gets comparable - or better! - speeds than my home connection.


My home connection has a data cap now (fuck you comcast) so the first 5g provider to sell a real unlimited plan will get my money.


T-Mobile is piloting just that: https://www.t-mobile.com/news/home-internet-pilot

And I hate Comcast. The $50/mo to unlock true unlimited internet was absurd.

I actually cancelled my Comcast last month and went to CenturyLink. The overall speed is slower, but the lack of a data cap and paying 1/3 of the price have been worth it.

It feels good to no longer be giving them any money, especially after their billing would routinely "forget" discounts that were supposed to be on my account. I got offered free Starz for 3 months and $30 off my bill for the coming year, which just happened to only be there for a month... So after calling them to have the issue fixed once, and then seeing it happen again the very next month, I finally cut ties.


T-Mobile de-prioritizes this product, though: "During congestion, Home Internet customers may notice speeds lower than other customers due to data prioritization"

At least where I live, T-Mobile towers are at least occasionally congested. It's not too uncommon for data to be slow or even non-functional with good signal levels (on my smartphone; I don't know about this home product).


You are so fortunate to have options for your ISP! Where I live (in a good-sized city), my options consist solely of Comcast.


Yeah, it really is a blessing. I'm sorry to hear they've got a stranglehold on you. Their customer service is really terrible, and they can get away with it because they know you're stuck.

I am in Minneapolis and Century Link has managed to become the primary secondary option to Comcast. Some neighborhoods are lucky and have US Internet's Fiber as a third option.

On top of that, CenturyLink has been pushing hard to expand their fiber. I am relegated to only a 40 Mbps connection on the copper in my neighborhood right now, but the technician who came out to do the install said that (unofficially) they're hoping to have CenturyLink fiber as an option within the next year.

$45/mo for 40 Mbps. It will be $65/mo for fiber. Still a fraction of Comcast's insane rates.


Pretty much all home connections have caps too up here, unfortunately. I have to pay CAD$15/month extra to have unlimited transfer; otherwise it's capped at 400GB IIRC, and overage charges are outrageously expensive.


Mine is 1TB/month, but $50/month to get unlimited. The sooner internet gets treated like a utility, the better.


And I believe it's $10 per 50GB once you go over the 1TB/month.

Microsoft's OneDrive did a constant upload/download one week, refreshing itself constantly and costing me $200. Ugh.


You realize that existing utilities don't provide unlimited usage for a flat fee, right?


I honestly don't understand the fear of usage-based billing. It's the only sane way to charge for this kind of thing. The marginal price charged for a "unit" of data is just too high: the real "cost" of delivering a terabyte of Netflix or YouTube content is tiny compared to the price ISPs are able to charge.


Well, if the cost is that low, I would think that would be an argument for just making it a flat rate.

Historically, I think the fear among people like those here was more or less two-fold.

1.) The original top 1% of downloaders were mostly those downloading lots of "Linux distributions" (and of course many other types of media) over torrents. I'm sure quite a few of those 1%-ers were on sites like this and preferred to effectively be subsidized by the other 99%.

2.) Whether or not it was just a rationalization for #1, many made the argument that bandwidth caps would constrain innovative services because if you turn a meter on, people are going to use bandwidth-intensive services a lot less and the Internet will be less generally useful as a result.

The dynamics today are probably a lot different. I imagine the profile of a big home bandwidth consumer now is a family that watches a lot of Netflix and YouTube.


And then, a couple of years down the road, your ISP will silently begin to throttle your connection once you reach 1TB per month. :(


But the 4G towers have less bandwidth than the coax network, so if everyone used 4G for home internet, speeds would be worse.

With 5G and the recent increase in cell tower coverage, we are almost there.


The 5.8 GHz band has 500 MHz of total bandwidth. Many cell phone companies already have more than that much bandwidth; they just have to deploy more small cells.

The only result of 5G would be the end of public wifi, which typically has 5-10 Mbps speeds.


> The only result of 5G would be the end of public wifi which typically has 5 - 10 mbps speeds.

Not if they charge anything close to what they charge now.


It's not cell phone service, but I get my home internet via 5G from Starry (https://starry.com). Speeds are between 200-300 Mbps (symmetric upload/download) for $50/mo. It's really been fantastic in the six months or so that I've had it. They've got their own towers and repeaters on buildings here in Boston. Couldn't be happier with them and with being able to ditch Comcast.


What is the latency like? Could I play CS:GO or another game where latency and jitter matters?


They claim, and I've heard from a friend who games, that the latency is lower than it was from Comcast. I don't know what it is and I'm not home at the moment to test it. I haven't played in a long time so can't speak from experience.


Out of curiosity, do they give you a publicly accessible IP? Or is the connection NAT-ed?


Yes, they do give you a public IP, but they don't offer static IPs.


Wow, I’m paying $130 for 100/5. Damn cable companies.


However, due to limited range, they'll have to put up 5G stations every couple of houses, and they'll have to get wired internet to each station... I wonder if you'll be able to buy your own station, or if you'll have to use the crappy utility-provided power-wasting box? To get adequate service to the station, they'll need some kind of high-bandwidth link pretty much right to your street, perhaps coax or fiber...


> so the data caps on "unlimited" plans wouldn't be needed

They're not even needed now. It's all about money.


Right, I doubt it really costs Verizon $15 to provide me an extra gigabyte..


It’s somewhat needed. $15 a GB is just there to raise revenue.

But a truly unlimited 4G hotspot service would crawl to a stop during busy periods.

Some sort of rationing is needed.


Seems like this is a pain-point that was self-inflicted by the companies themselves; selling service to many people without the infrastructure to back it up.

That said, I'm not suggesting companies deploy 1:1 infrastructure in lock-step with sales. Power/Water companies plan for peak demand periods. What prevents ISPs from being able to do the same?


It seems to work just fine here in Finland, it is a very common alternative to wired home connections (seems to be about 50-50 split: https://www.ficom.fi/ict-ala/tilastot/kotitalouksien-laajaka...).

No data caps, so you can use terabytes of data without throttling if you want to.


Most people don't come close to hitting their cap unless they accidentally update their 200MB apps on the data connection.


> can be done even with 3G (video, streaming music, and any website's loading time is more than acceptable at 3G speeds).

I think you forgot how slow 3G was/is. It's around 512 kbps. You can stream music on that, but streaming video is a straight nope (480p on YouTube wouldn't even be happy with 3G), and even just browsing the web will be a frustratingly slow experience.

4G LTE definitely has the bandwidth to do all those things at reasonable quality/throughput, particularly for the resolution & form factor of the device in question, though.
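
A rough sanity check (Python; the throughputs and bitrates are ballpark assumptions, not measurements):

    # Which tasks fit in which generation's typical throughput?
    throughput_mbps = {"3G (UMTS)": 0.384, "3G (HSPA+)": 10.0, "4G LTE": 40.0}
    tasks_mbps = {"music stream": 0.32, "480p video": 1.5, "1080p video": 5.0}

    for gen, tput in throughput_mbps.items():
        fits = [task for task, need in tasks_mbps.items() if need <= tput]
        print(f"{gen}: {', '.join(fits) if fits else 'none of these'}")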


"3G" also encompasses HSPA which supports real-world speeds in the 10s of megabits.

The original 3G UMTS (384kbps) was indeed dreadfully slow by today's standards.


HSPA+ will hit real-world speeds in the 10s of megabits, but HSPA won't. Both are technically in the "3G" spec, though, you're right about that. But they were marketed/branded as 4G almost immediately.


> But they were marketed/branded as 4G almost immediately

That was just a US thing though, right? Where I lived it was branded "Turbo 3G".

Isn't LTE already being branded as 5G in the US as well?


Have a decent connection where there’s a large gathering of people.


Very much so. Sure, the headline speeds of 5G are nice (and indeed, with the right 4G deployments and equipment you can certainly get some phenomenal speeds today), but like all major network upgrades, the primary benefit for typical users is aggregate capacity. That seemingly pointless headline speed doesn't quite happen when you've got 100s or 1000s of people in an area; what it does result in, though, is a usable speed for all of those people instead of an unusable one.
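
A toy model of why aggregate capacity, not headline speed, is what users actually feel (Python; the sector capacities are assumed round numbers, not figures for any real deployment):

    # A sector's total throughput shared among its active users.
    def per_user_mbps(sector_mbps, active_users):
        return sector_mbps / active_users

    for label, cap in [("4G sector", 150.0), ("5G mmWave sector", 5000.0)]:
        for users in (10, 500):
            print(f"{label}, {users:>3} users: {per_user_mbps(cap, users):.1f} Mbps each")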


You don't need 5G to get that; you can get usable speeds for over 100k people in a roughly 200m x 100m area with wifi and 4G, as OSU has done (1, 2).

1 https://www.bizjournals.com/columbus/news/2013/08/27/ohio-st...

2 https://www.dispatch.com/news/20180405/ohio-state-to-spend-n...


With the same backbone infrastructure? Wouldn't you need more cell towers for this? From my experience, areas that constantly have large crowds often work well with 4G; it's just the occasional bursts that don't work.



As near as I can tell, the only benefit that consumers will see from 5G is that it will resolve a real problem with the cell system right now: capacity.

In very congested areas, the total supportable capacity is already being completely utilized. This is why people in very congested areas experience call drops or problems with network availability. 5G would go far to resolve this.

However, that's the only consumer-level benefit that seems realistic to me, and it would only be noticed by the people in dense areas. All the other stuff, such as increased speeds, etc., appears to be nonsense.


Nothing, really. The bottleneck for cell data throughput (and latency) is backhaul. Improving backhaul isn’t sexy like changing the “4” to a “5”. Improving backhaul is very expensive (and slow), and counter to one’s business goals - telecoms in the US want just the right amount of congestion to justify network upgrade subsidies, even if that means they do not adhere to ITU-T requirements.

The best “feature” of 5G is that it’s a great opportunity to institute rate hikes.


> The best “feature” of 5G is that it’s a great opportunity to institute rate hikes.

I can see it now: "$100 additional per month! But you can cancel your cable/DSL now!"


The most common suggestions I see are mobile applications that benefit from the lower latency afforded by 5G - stuff like VR/AR, gaming, self-driving cars.

Aside from that the real reason money is being poured into 5G appears to be to replace the last mile / home internet.


> the real reason money is being poured into 5G appears to be to replace the last mile / home internet.

In order to be a realistic alternative for internet service, the cost would have to come WAY down. Using the cell system for your primary internet feed right now is prohibitively expensive.

I have a hard time believing that the cell companies will actually lower their pricing for 5G. It seems more likely that they'll do the exact opposite.


I'm in Japan which has a similarly uncompetitive mobile space to the US, and they've been offering unlimited 4G home internet here for a couple years now.

Their solution is to IMEI-lock the plans to these mains-powered wifi routers the size of an old AirPort Extreme, and QoS them to a lower priority than their pay-per-GB mobile customers. That way people can't abuse them to replace the (still-expensive) mobile plans, and the lower network priority means they're just taking up unused network capacity anyway.


You should listen to Shahriar Shahramian on The Amp Hour Podcast: https://theamphour.com/430-shahriar-discusses-5g/

There's a ton of interesting stuff, like how 5G is for a lot more than cellphones. I've always heard that 5G will have phased arrays, and I wondered how that would work -- he talks about the possibility that the array is on the roof of a car and your devices connect back to that. Lots of issues, like antenna size and power consumption, stand in the way of those blazing-fast speeds.


I'm thinking it's probably an ingenious marketing move intended to force the people to install equipment from a certain country despite concerns, in order to keep up appearances that investments are being made in the technology.

From a consumer perspective, LTE is great in its current form. I've never seen any actual person complain about network speeds in a region where LTE is properly deployed, and I don't know of any application that struggles on this infrastructure.


You're making the assumption that 5G and future advanced wireless systems are exclusive to phones. In the future this will not be the case at all.


Human (individual) customers won't reap the huge benefit, IMO. This is a big step up for commercial sites, though, where cellular modems serve as fail-over for fiber/commodity service. These customers typically have unlimited data plans, and are more than happy to pay any outrageous data fees that keep their network operating.


Demand for bandwidth has historically expanded to match the bandwidth available. Folks will find some use for it.


Multiple 8K 90Hz VR streams for the whole family to play Roblox 2025, all running off a personal hotspot.


I want to see DOOM running on just the network stack, but with raytracing because.


Ajit Pai seems to think it's about: Rural Connectivity. Fiber Infrastructure. National Competitiveness: https://youtu.be/jKbAdEVOaDY?t=406 (Ajit starts around the 8 minute mark)


Do his words actually carry any information? My understanding is that he is one of the least credible, least respected humans on the face of the planet.


Ajit Pai has exactly zero credibility.


He's one of those people the internet is right to hate


I think I've read something about significant improvements to latency. Don't know if true or not.

Latency-sensitive applications over 4G/LTE, like SSH or games feel sort of sluggish.


One use case could be high resolution, low latency video streaming for remote control of autonomous vehicles, e.g. trucks.


Driving a semi-truck remotely? What could possibly go wrong!


You seem to be ignoring speed of light and time to react.


5G attempts to market a compelling reason to upgrade a phone and stave off the slowing phone replacement cycle?


> I was wondering, what exactly 5G will bring us.

WISP internet service to the home.


WISP is very useful in rural areas. It's less compelling as an alternative to wired services in the dense urban environments where we'll see 5G deployments. You're going to have to run all that fiber, anyway.


That all depends on how rural it is. Rural enough that a tower can cover enough houses to make it worth it, or so rural that you have almost as many towers as houses?

I spent a decade on various WISPs, even had one of their towers on my house (free connection + cash = very yes). It was a city with a mix of single-family homes on small lots and townhomes/condos. Cable doesn't exist there, so WISP was the way to go. It was pretty good until Netflix got popular; it got bad in the evenings when everyone would sit down and turn on a show.

Now I'm in a very rural place, lots of hills, thick trees. Luckily there is a telephone/internet co-op who put in fiber. I think they realized with the distances they have to cover that WISP and copper were out. I've never seen fiber to such remote locations.

We only have one cell tower that covers us now, so I won't hold my breath for 5G.


My only choice for internet in a reasonably urbanized area is Comcast. No one has run fiber to my apartment building. If someone wants to run fiber and set up nearby 5G cells, and can provide internet service that is competitive with Comcast, I don't see why I wouldn't switch.


Again, looking for the advantage -- in this case, I can see WISP being useful for areas that are underserved by the local cable monopoly, and in other areas it provides a third competitor (cable/dsl/wisp).

I wonder how 5G home internet will compare to the up and coming LEO satellite constellations (StarLink for example).


You still need to lay fiber for 5G base stations


And if you're going to bring the fiber most of the way, you might as well terminate it at the premises.


That depends heavily on the location.

In many suburban developments, the power/communication lines are all buried underground. It would probably be much easier to install a couple of microcells around the neighborhood than to dig up everyone's lawn.

I was lucky enough to live in a neighborhood where Verizon initially installed FiOS. It was a massive undertaking. They had to trench every street in my development. I was young at the time, but I remember thinking how massive the effort was considering not every house would even sign up for it.


I’ll confirm, as a crew is dragging/trenching fiber as we speak (I’m looking out the back window watching after ordering them pizza) at a family member’s home in a suburban community, to be terminated on the side of each home (fiber to the premises).

They ran aerial fiber to the neighborhood, and then ran the fiber under each street, with the final run being placed in the backyard easement between house rows (with a Ditch Witch doing the heavy lifting).

Family member is going from AT&T DSL for $40/month with 3Mb down/1Mb up to $50/month for 200Mb down/75Mb up (with a two year price guarantee and no data caps). It’s quite the deal.


That's a little weird. I lived in an area that had fiber installed, and it was pretty nonintrusive. The fiber is laid under the sidewalk and the last bit from the sidewalk to the house is blasted in. You have a small hole dug by the house and another by the sidewalk; no trenches needed.

At no point does a fiber installation require digging up an entire road or even your lawn.


In my old neighborhood, I saw Comcast fishing cables through the street drainage sewers. Not sure if there was a separate conduit accessed via those grates, or if it actually just lay along the bottom of the sewer pipes.


This isn't necessarily the case. Bringing that cable an extra ~300m and providing the ONT hardware for each house is quite expensive for the small number of customers that might benefit in a rural area. Terminating at fast wireless and providing radios is much more cost effective from the telco's perspective.


From the telco's perspective, yes. From an end user who is getting fiber from a coop or muni fiber provider, it's about $300-$500, which might be considered reasonable if your ongoing operating costs (and therefore, costs to the end user) are lower.

Disclaimer: I have contributed to a muni fiber project, and am familiar with trenching/aerial mounting and premises termination costs.


The problem is that the new spectrum is not appropriate for that purpose for the exact same reason it interferes with weather forecasting equipment.


> Does this also mean that 5G will suck, when it’s raining? [from a comment below the article]

If 5G uses almost the same frequency where microwaves detect water vapour (around 24 GHz), won't the weather have a great impact on it?

Also, I always thought that such small waves would have problems with obstacles, with good signal just when your phone is in line-of-sight with antennas.


That's all correct; the higher frequency suffers from worse object penetration. One solution I've heard is that 5G would likely involve neighborhood or even building repeaters.

IMO 5G is massively overhyped. My iPhone 7+ isn't limited by 4G LTE, it's limited by Verizon deciding to only allow it 10mbps down (with great signal). 5G won't matter one bit if the current bottleneck isn't 4G LTE in the first place.


> My iPhone 7+ isn't limited by 4G LTE, it's limited by Verizon deciding to only allow it 10mbps down (with great signal)

This is in fact a limitation of 4G.

A single device is fine, but there is only so much bandwidth available. When you get lots of devices in the same area, something has to be limited.


That seems to be an American thing. Most countries I've lived in easily get 100Mbps.


Yeah, in especially good conditions you can get over 800Mbps on a Galaxy S10 over here (e.g. https://bbs.io-tech.fi/threads/nettiliittymaesi-taemaenhetki...), and over 100Mbps is normal.


Smaller cells allow for fewer devices per tower, do they not?


Unfortunately, the density of people who don't want towers "too close to me" means we need bigger towers farther apart.


Smaller cells require lower signal power to prevent nearby cells from interfering.


Cell capacity is a shared resource. The reason (or part of it) that Verizon limits your speed is to let others use it too. 5G brings smaller cells and more capacity in each cell, which means less need to throttle.


Not just cell capacity, but phone capacity in general.

One of my biggest memories of 9/11 as a school-boy was when all phone-lines were so congested that no one could call their parents. We all kept trying, but it was "busy" tones all the way through.

That's the only time I remember land-lines all being too congested to make a call, but it was part of the event that a lot of us talked about for days later. Some of the kids were trying to call their parents to see if they were still alive, because some of them worked at the Pentagon. (Their parents were fine, but they too couldn't contact their kids because all the phone-networks were too congested. I presume everyone was trying to call each other at that time).

It's the only time in my memory where I got a busy signal just from picking up a phone. You hadn't even dialed yet; it was too busy to even give a dial tone.

For the kids out there: texting wasn't common yet, and pagers weren't really worth the trouble. Pagers worked like text messaging today, except you didn't have a screen... so they were hard to use and weren't used very much. You called a pager using a land-line, and then dialed "44 33 555 555 666" for "HELLO". People generally still just did phone calls for simple messages like "I'm okay".


They'd have to upgrade the backhaul too though to take advantage of 5G capacity.


Backhaul generally isn't the bottleneck - you have fiber to the towers, and you can bump the capacity of that with comparably cheap hardware upgrades - however, the available "air bandwidth" per cell is pretty much a hard limit, and to go beyond that you need to build more cells which either requires more frequencies or more towers, which both are very expensive and time consuming.

Perhaps it's different in USA, though, the geography and population density differences mean that the backhaul problems are different than in Europe; but here the bottleneck is on the radio side.


>> They'd have to upgrade the backhaul too though to take advantage of 5G capacity.

THIS!!! Even if more sites are added, each with their own 10/100Gb fiber run, it's still running to the main circuit for a region. That circuit isn't changing (in the near term); this will just consume more of it. Regions without colossal backbone access that are already bandwidth constrained will remain bandwidth constrained.


What, pray tell, is the “main circuit for a region?”

Network bandwidth is still going up and up in speed. 100G was very expensive and hard to do only a few years back, now it’s becoming standard. DWDM systems continue to evolve.


I’m sure they would. For all the faults of telecoms companies, they generally don’t like spending a massive amount of money on upgrades that do nothing.


Being able to charge customers more for their service would absolutely count as "doing something".


It does something: it makes your mobile say 5G at the top. For some people that's worth paying more money for.


But that can be (and has been!) done with a phone software update, which is much cheaper.


Depends on how expensive the settlement will be. :)


They love to when customers pay for it! ;)

They spend massive amounts of money on marketing, and that doesn't benefit the product at all.


> building repeaters

so... sorta like a WiFi access point?


Yeah, my thought exactly. People like to talk about how 5G will eliminate the need for wired internet because there will be 5G stations on every block, even on every building! I don't know if they simply assume it will be wirelessly repeated or they just haven't really thought about it too hard.


The main value of 5G as opposed to WiFi is that it comes with an entire system for roaming over different networks.

WiFi has no built-in system for authentication across different operators, no system for accounting usage, no system for ensuring trust.

This is what 3GPP brings. Their systems are hopelessly convoluted, and have some issues. However, it works.


> "building repeaters"

In other words glorified wifi, except the AP is no longer owned or controlled by me.


Is Verizon really that bad? I usually get around 100mbps on TMobile LTE.


That's with their rate limiting to fast.com (Netflix).

Speedtest.net (with Verizon hosted servers) was at a very consistent 50mbps/4.5mbps which would make sense for my unlimited plan.

Please note that speedtest.net is REALLY not a good representation of real-world network usage, and I don't just mean things like how they throttle Netflix. They get to host the server in-network and apply full traffic shaping to get those juicy benchmark numbers.


Well, seems like exactly the test you would want if you are testing the 4G network or your phone.

Lack of competition and sensible operators isn't 4G's fault...


> If 5G uses almost the same frequency where microwaves detect water vapour (around 24 GHz), won't the weather have a great impact on it?

There's a general misunderstanding about the technology that leads people down this road of thought.

5G is broken up into two frequency ranges, FR1 and FR2. FR1 is everything below 6GHz and encompasses the same spectrum as traditional cellular technologies. FR2 is everything over 24GHz, and that's the bit everyone is confused about.

FR1 is like traditional cellular and will be slapped on cell towers to provide broad coverage over a wide area with performance characteristics similar to what we have today with LTE. It's not very exciting but it's 5G and this is what everyone is currently rolling out.

FR2 is meant to be absorbed, otherwise you'd have a big problem. Unlike FR1, which limits you to 100MHz of bandwidth per channel, FR2 mandates that channel bandwidth be between 50-400MHz. So at a minimum, an FR2 channel will have half the maximum allowable bandwidth of FR1. If FR2 propagated more than a very short distance, the airwaves would be quickly saturated by a small number of users.

FR2 is intended to be deployed in very dense areas like indoors. You'd be able to deploy many cell sites without worrying about overlap or signal propagation because everything from walls to moisture in the air will absorb the signals.

It might also be possible to slap an FR2 cell site on top of every lamp post going down a street.
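
A quick way to see the scale of the problem is free-space path loss, which grows with the square of frequency. A minimal Python sketch (the 3.5GHz/28GHz carriers and 200m distance are just illustrative, and this ignores the extra wall/foliage/water-vapor absorption that makes mmWave even worse):

    import math

    def fspl_db(distance_m, freq_hz):
        """Free-space path loss: 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c) dB."""
        c = 299_792_458.0  # speed of light, m/s
        return (20 * math.log10(distance_m) + 20 * math.log10(freq_hz)
                + 20 * math.log10(4 * math.pi / c))

    for label, f in [("FR1 (3.5 GHz)", 3.5e9), ("FR2 (28 GHz)", 28e9)]:
        print(f"{label}: {fspl_db(200, f):.1f} dB at 200 m")

    # FR2 loses ~18 dB more (20*log10(28/3.5)) than FR1 over the same
    # distance, before counting absorption -- hence the tiny cells.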


Wouldn't there be a great deal of contention with an FR2 on each lamp post; how would they avoid that? Are these cell sites meshing internally to create a backhaul, or is the idea that each lamp post is wired?


Why? The sites' backhaul could be wired or wireless; it doesn't matter much, as the backhaul is independent of the interface transceiver. The site's interface would be attenuated for the short ranges. We can do this today with WiFi; everyone always thinks more power is better, but sometimes less is more.

Samsung's 28GHz solution was rated at something like 1500ft at maximum power. So one per city block might be more realistic than one per lamp post, but it will ultimately come down to capacity. The mmWave portion of 5G is designed for high-density deployments.


Ideally a bit of both.

The problem comes from omnidirectional (non-directional) radiation, not from fixed point-to-point links. So, with the omnis the best idea is to limit their TX power and spread the noise via ultra-wide-band multiplexing (using random orthogonal coding [which is CDMA, or lately frequencies (subcarriers/subbands), which is OFDMA] minimizes the interference). With the p2p links, you can go wild: use relatively high power, high frequency, narrow band, high symbol/baud modulation (like QAM256, used in 802.11ac wifi).

But of course cables rule, because they are really great at containing EM radiation. And lamps already have cables to power the lamp, and the micro-nano-pico-cell thingie will need power too, so you can just do Power over Ethernet and be done with it, then concentrate the copper wires and switch to fiber.

The p2p mesh backhaul module would probably be pricey, but sometimes getting your cable to a switch is equally problematic/costly, so a bit of both probably makes sense from a cost perspective too.
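
To put rough numbers on that narrow-band high-order modulation tradeoff, here's a sketch of ideal raw throughput for a few QAM orders over an assumed 100 Msym/s channel (illustrative only; real links lose a lot to coding, guard intervals, and SNR limits):

    import math

    symbol_rate = 100e6  # assumed symbol rate, illustrative

    for qam in (4, 16, 64, 256):
        bits_per_symbol = math.log2(qam)  # QAM256 -> 8 bits/symbol
        raw = symbol_rate * bits_per_symbol
        print(f"QAM{qam}: {bits_per_symbol:.0f} bits/symbol -> {raw / 1e9:.1f} Gbit/s raw")

    # Higher-order QAM packs more bits per symbol but needs a much cleaner
    # (higher SNR) link -- fine for a fixed p2p dish, hard for low-power omnis.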



I think the main point of 5G keeps getting missed when people ask about cell phones and their broadband speed vs capacity etc. The only reason telcos are going to put in 5G is for IoT coverage: low-powered trickle data from billions of devices.

Stuff for your personal cellular use would never come close to covering the costs involved. And 4G will still be used for many years to come for that.


The focus today for 5G is fixed wireless access and smartphones, not IoT.

5G covers 3 variants: 1) massive broadband (with mmWave in particular); 2) URLLC - Ultra Reliable and Low Latency Communications; 3) massive IoT. The current focus is on (1). Lots of talk about (2), but nothing much concrete yet. (3) is moving along, but it's actually based on... LTE.

Strictly speaking, 5G is a set of requirements defined by the ITU, not a technology. The actual technologies are developed by another organization, 3GPP. And there are 2 technologies to cover 5G: 1) NR, or "New Radio" - this is what most people mean by 5G, and it's for massive broadband and URLLC; 2) LTE (release 15 and later) for massive IoT!

Yes, the IoT variant of LTE (LTE-M and NB-IoT) will be the 5G implementations for a while. Eventually there will be new NR versions for IoT, but nobody is in a hurry there. LTE-M and NB-IoT evolutions will be just fine for a long time, as far as massive IoT is concerned.

When you hear about 5G deployment today, it really means "NR" and not LTE-based IoT. The concern for smartphones is really capacity during peak time in the busiest cells.

I work in the field BTW, as you may have guessed ;)


I work in the mesh network IoT space, primarily Wi-SUN. Since we view NB-IoT and LTE-M as competition, I encourage you cellular guys to really take your time :) we're happy for the head start while we keep deploying our networks in the millions.


Isn't this fantastic? It is the measure of any great buzzword that it means something entirely different to every single person.

I personally think the entire mmWave part of 5G is at least another 10 years away. Like, that's just going nowhere. Massive bandwidth? Great, what for? WiFi will be mmWave before we have an application for that in cellular.

Presumably what is actually being deployed is the incremental LTE revision on the same old bands, with some extra bands thrown in that were yoinked from other applications.


To be honest it's been well over a year since I've been in the telco space, and I was coming at it from a different angle (software).

My main exposure to the 5G area was the parts around the move to Network Function Virtualization. I wasn't deep enough in to know the ins and outs as you clearly do; I was extrapolating from what I knew.

Here's my reasoning, feel free to poke holes.

Background: The hardware boxes used for network functions, supplied by Nokia, Siemens and the like, were very expensive.

Telco profit margins have dropped massively as people make fewer regular calls and SMS, and use WhatsApp and Viber (over-the-top services) instead.

The market is saturated.

A) Making data "better" makes these over-the-top services better. If my data is more stable, I will more likely make a call using WhatsApp, and the telco loses the money from voice. Why would they do this?

B) 5G requires completely new infrastructure, from aerials to the packet-switched core. This costs a lot of money; again, why would they do this?

Reasoning: ETSI defined the whole NFV thing, and how to do it. You need a certified platform, a cloud orchestrator, a whole load of systems to do NFV. The move to NFV is very similar to how x86 went from bare metal in the data center to hypervisors. It is an absolutely massive change. The telcos have never virtualised their machines, and have only started doing it now (in the last 2 years).

This is why the telcos took on/over OpenStack as a concern. I think they thought they could run OpenStack clouds and run their VNFs (this is what telcos call their VMs) for way less money than the hardware boxes.

Of course, reality came home, and rolling your own is hard. And the NFV providers still charged a fortune, and so did the system integrators.

So again, we are at a point that this costs a load of money for the telco. Why do it?

Benefit one is that they can now roll out new functions in a relatively short amount of time (magic of virtualisation and all that). And by short I mean I have seen a national VoLTE roll out in a few weeks.

And I thought: that's great, but me, as a consumer, am I going to pay extra for VoLTE? Nah. 4G is good enough for me. And considering the cost to the telcos... something doesn't add up. As far as I can see, they simply can't add on enough consumer fluff to pay for what is a massive investment.

The only answer I can see, the only untapped market, is IoT. My one example is a car company that made a deal with a big US telco to carry all of their car-originating data, and I think that is where the telcos will make their money.

But again, I am coming from the software side. 5G to me isn't about radio or any of that; it is about telcos moving their compute to the edge, and virtualising their entire infrastructure to allow them to roll out new services in weeks instead of years. And I am at least a year out of date ;)


No, most IoT devices in most applications are dramatically underpowered for 5G.


5G doesn't work well inside buildings without additional hardware. I doubt people will buy that for their house just so their toaster can display adverts.


I’m somewhat confused. I’ll admit that I’m not very familiar with super high frequency radio, but isn’t the difference at least 200 MHz, approximately 10 times larger than the entire FM radio spectrum? Doesn’t out-of-band emission stop being a problem at that much separation? Or should we look at it relative to the base frequency?

edit: For what it's worth, I found this paragraph from the FCC last year: https://www.federalregister.gov/d/2018-14806/p-20 It sounds like they're saying "we don't know if this will be a problem yet, but be prepared to limit emissions in the 23.6-24GHz range because we might require it at some point".

Also, paragraph 9 of the same document has the actual band limits (with a special requirement) if anybody is interested:

> The 24 GHz band consists of two band segments: The lower segment, from 24.25-24.45 GHz, and the upper segment, from 24.75-25.25 GHz

> any mobile or transportable equipment capable of operating in any portion of the 24 GHz band must be capable of operating at all frequencies within the 24 GHz band, in both band segments


"At the high end of the electromagnetic spectrum, signals travel over a band of 10 million trillion Hz (that is, 10^22Hz). This end of the spectrum has phenomenal bandwidth, but it has its own set of problems. The wave forms are so miniscule that they're highly distorted by any type of interference, particularly environmental interference such as precipitation. Furthermore, higher-frequency wave forms such as x-rays, gamma rays, and cosmic rays are not very good to human physiology and therefore aren't available for us to use for communication at this point."

Not the best analysis, but I'm at work. Basically, as with any other metric, as the frequency increases the space between peaks and valleys decreases, and it becomes harder to determine/separate one signal from another: 30Hz to 230Hz is much easier to tell the difference than 15kHz to 15.2kHz if you want to listen to audio tones. Once you get to microwaves this of course becomes much more difficult.

http://www.informit.com/articles/article.aspx?p=24687&seqNum...


To me, this analysis sounds more like it's about interaction with physical objects (like how 2.4GHz can pass through walls more easily than 5GHz). It doesn't seem to be about interference with other radio emissions?

> 30Hz to 230Hz is much easier to tell the difference than 15kHz to 15.2kHz if you want to listen to audio tones

That's definitely true for humans, but I don't think it necessarily applies to radio signals in general (though it does make sense at first glance).


While bandwidth is typically measured in absolute Hz difference due to the spread directly relating to the amount of information-carrying capacity, when you are actually generating signals many things scale with carrier frequency.

The way most of these signals are being generated is through an "information signal" (usually referred to as a modulator) being created within your device, then moved up in frequency by combining it with a "carrier signal". This separation allows the information signal's properties (such as bandwidth, modulation scheme, bitrate/Hz, etc...) to be more-or-less independent from the physical characteristics of propagation, which will be more strongly related to the carrier signal's properties (e.g. water absorption, reflection/transmission off of/through materials, etc...).

However, no process is perfect, and although we would like to generate perfect signals, we can't. Distortion appears in multiple parts of this pipeline, both in the generation of the low-frequency modulator and the high-frequency carrier. Distortion typically comes from "linear" components not being perfectly linear and thereby generating harmonics of the signal passing through them. In the case of the modulator, this splatters signal energy up and down the spectrum on the order of the bandwidth of the modulator, but in the case of the carrier it does so on the order of the carrier frequency. This is all a matter of degree and depending on the application may not be that big a deal, but it definitely must be addressed for high-density communications equipment like cell networks.

All transmitters have filters at multiple stages of their signal processing chains, both lowpass filters that filter the modulator before it's mixed with the carrier signal to boost it up in frequency, as well as bandpass filters that ensure the output stays within the bounds it's meant to be; but these filters only do so much and they can be expensive to create (as they can have rather fine physical tolerances) so everybody is always just playing the "do the best job we can for the least money" game.

Luckily, most transmitters are also connected to antennae that provide a convenient filtering on the output (they only resonate at the frequencies they transmit at) which helps for specialized systems that only operate at a single transmit frequency, but for something like 5G is less helpful, due to the many channels it supports, causing the antenna to necessarily support a wide range of frequencies.
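
You can see this "splatter" in a toy simulation: pass a band-limited signal through a mildly nonlinear amplifier model and power grows outside the original band. A minimal numpy sketch (the cubic coefficient and 50 MHz bandwidth are arbitrary assumptions):

    import numpy as np

    rng = np.random.default_rng(0)
    fs, n = 1e9, 1 << 16                      # toy baseband model

    # Band-limited "modulator": noise lowpass-filtered to ~50 MHz
    spec = np.fft.rfft(rng.standard_normal(n))
    freqs = np.fft.rfftfreq(n, 1 / fs)
    spec[freqs > 50e6] = 0
    x = np.fft.irfft(spec, n)
    x /= np.max(np.abs(x))

    # Mildly nonlinear amplifier: y = x + 0.2*x^3 (the odd-order term
    # creates adjacent-channel "spectral regrowth")
    y = x + 0.2 * x**3

    def band_power(sig, lo, hi):
        p = np.abs(np.fft.rfft(sig))**2
        return p[(freqs >= lo) & (freqs < hi)].sum()

    for name, sig in [("linear", x), ("nonlinear", y)]:
        leak = band_power(sig, 50e6, 150e6) / band_power(sig, 0, 50e6)
        print(f"{name}: out-of-band / in-band power = {leak:.2e}")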


It should definitely be looked at relative to the base frequency.


Rejecting on the grounds of no technical basis? I'd like to see more on that. I would hope when NASA raises a flag with the FCC it's taken with sincerity.


There are a lot of things one would hope the FCC would do that have not been happening under Ajit Pai. He's an industry shill, plain and simple.


Human fallacy: we all want it faster and more.

But for 99% of the cases, why would we need it?

IoT doesn't need 5G, it needs LiRa.

Streaming applications, I can stream with 4G.

50 ms latency with 4G, so what. Except for competitive multiplayer gaming perhaps, I don't see the issue. But I think they want everything wired ;)

Industrial applications, outside of IoT? Give me a valid example that needs countrywide coverage.

I hardly notice a difference between 4G and my WiFi. Increase coverage for 4G before implementing 5G.

FYI: 4G offers maximum real-world download speeds of up to 60Mbps. Currently, that is more than enough.


You still need more capacity: as the population grows, more devices will compete for the same 4G bandwidth, and unlike optical fibre you cannot increase the bandwidth by adding new wire.


Well, the FCC could coordinate a move to a new standard that uses smaller cells on the same 4G frequencies; after all, big parts of 5G are about handling increased density.


5G needs way more capacity/stations than 4G


> Industrial applications, outside of IoT? Give me a valid example that needs countrywide coverage.

Police CAD systems, streaming video from body cameras, oil well monitoring.

Just because you can't imagine an application of 5G for your consumer needs doesn't mean it isn't needed.


- Police CAD systems

Why would a CAD system need > 60 Mbps? Why would it even need more than 30 kbps? I doubt it would need more than an MQTT channel and an app (see the back-of-the-envelope sketch at the end of this comment).

- streaming video from body cameras

Stream low quality over 4G and store the video in HD, syncing at secure access points (the devices need to charge; sync there). Nothing more is required, since body cams are used as evidence. Also, live-streamed body cams already exist over 4G (1920x1080, 30FPS).

- oil well monitoring

IoT... Jeez...

> Just because you can't imagine an application of 5G for your consumer needs doesn't mean it isn't needed.

Like I said, you want more and faster. I already mentioned some applications that don't require it.

Just like your examples don't require it.

Again, human fallacy and you are providing excellent examples. Thanks.
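
To put a number on the CAD point above, a back-of-the-envelope sketch of a hypothetical position/status update (field names, sizes and rates are all assumptions):

    import json

    # Hypothetical CAD telemetry payload: unit id, position, status, timestamp
    update = {"unit": "A-113", "lat": 40.7128, "lon": -74.0060,
              "status": "en_route", "ts": 1555459200}

    payload_bytes = len(json.dumps(update).encode())
    bits_per_s = payload_bytes * 8 * 1  # one update per second is already generous

    print(f"{payload_bytes} bytes/update -> {bits_per_s / 1000:.2f} kbps")
    # Orders of magnitude below even a throttled 4G connection.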


Oil well monitoring can work with the crappiest of GSM (a simple SMS every few minutes), no?


What is LiRa?


LoRa, sorry


Pilot here - they used the example of a hurricane; however, I think it would have a daily impact on thousands of flights (general aviation and commercial), which all rely on accurate weather forecasting. Weather is no joke in aviation, even if you're flying a 747.


> ... a letter from NASA Administrator Jim Bridenstine and Secretary of Commerce Wilbur Ross requesting that it be delayed. FCC Chairman Ajit Pai rejected the request ...

Ajit Pai strikes again!


I also do not really see the benefits of 5G.

I would be much happier to have reliable 4G, or at least 3G, first.

I suppose quite some support comes from people who have connection issues with weak 4G and assume 5G will solve them.

But since 5G will apparently consume 3G towers and has much less range, quite the opposite could happen: even less connectivity for people not in the city.


Slightly OT or meta. I keep bumping up against these nutty conspiracy theories about 5G being dangerous in various forums. Has anyone done a study of the effects of certain frequencies and energy levels on the human body that I can use to refute these fools? Also, what is the canonical source on 5g spectrum and power levels?


https://arxiv.org/ftp/arxiv/papers/1503/1503.05944.pdf this basically found that 90% of the energy is absorbed on/at/around the skin.

if neurons were directly exposed they would heat up and their firing rate would be altered significantly [look at fig2 / D ] ( https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4233276/ )

this study did a direct exposure testing on a human's arm (and on rats and monkeys) at 94GHz for 3s at 1W/cm^2: https://apps.dtic.mil/dtic/tr/fulltext/u2/a628296.pdf

this did 24h low power 1mW/cm^2 exposure of human eye like cells and detected no significant difference in micronucleus expression, DNA strand breaks, and heat stress protein expression: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4997488/

this used 42GHz on rats for 30min/day for 3 days, and tried to look for tumors/cancer and found no significant difference: https://www.rrjournal.org/doi/abs/10.1667/RR3121

So it seems there's no immediate, known adverse effect (other than the heating, and maybe some interference with the electro-biochemistry of cells, particularly neurons).

And luckily people usually don't use phones near their heads anymore (when doing large data transfers), so we don't really have to worry about concentrated absorption in the head from being at best a centimeter from the emitter.

> Also, what is the canonical source on 5g spectrum and power levels?

Maximum permissible exposure is defined by the FCC as a power density of 10W/m^2 (for frequencies between 6GHz and 100GHz).
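
For a feel of what that limit means, far-field power density falls off as S = EIRP / (4*pi*r^2). A minimal sketch with an assumed, purely illustrative 200W-EIRP small cell (not any specific 5G product):

    import math

    def power_density(eirp_w, r_m):
        """Far-field power density (W/m^2) of an isotropic-equivalent radiator."""
        return eirp_w / (4 * math.pi * r_m**2)

    MPE = 10.0  # W/m^2, the FCC limit cited above

    for r in (1, 5, 20):
        s = power_density(200, r)  # assumed 200 W EIRP, illustrative only
        print(f"{r:>2} m: {s:7.3f} W/m^2 ({'over' if s > MPE else 'under'} the MPE)")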


While most of the promise of 5G feels like overhype, this article has a bit of a flat-earth feel to it. For example, we've been trying to convert obsolete UHF frequencies to usable bandwidth for years, only to find all sorts of reasons why they can't be killed off. If 5G really threatened weather forecasting and radio astronomy, I would think there would at minimum be some sort of initiative to address it prior to a cutover - and yet this article doesn't seem to acknowledge such a thing exists. So I'm left to conclude the article may be nothing more than a hypothetical what-if that won't have much impact in the real world. This is just my gut feel...


Can someone help me understand this better? What is "very close" to 23.8-GHz frequencies? I don't know which bands 5G operates on, but it seems [1] that the closest they get, at least in the US, is ~27 GHz. If the FCC is auctioning 3000 licenses for the 24 GHz space, is that the space that can potentially interfere more? Can 5G operate on just any frequencies, then?

[1]: https://www.cablefree.net/wirelesstechnology/4glte/5g-freque...


The carrier signal of a new technology could potentially use any frequency that isn't already in use. A lot of the frequency spectrum is already accounted for and generally parts of the spectrum that are important to scientific research are internationally protected (Such as the resonant frequency of hydrogen which is used extensively in radio astronomy).

That being said, it is technically more difficult to go "up" in frequency. Lower frequencies have less bandwidth, but are easier to generate, go through physical objects better, and travel longer distances.

This fight to me looks like the industry wants to use a cheaper frequency that meets their minimum technical goals (in terms of development costs, not necessarily licensing costs), science and other applications be damned.


Europe, or at least the EU, has settled on 24GHz; that's pretty close to 23.8GHz.


So how hard is it to limit the 5G signal to bandwidths that don't interfere with weather forecasting, and how hard is it to detect and enforce laws against such bandwidth spillover?


> detect and enforce laws against such bandwidth spillover?

This phenomenon is called adjacent-channel interference and violations can and would be enforced by the FCC. The challenge isn't detection or enforcement, it's the challenge of the different government agencies to balance the people's needs properly.

FCC wants to auction off the spectrum to benefit telcos and their customers. NOAA wants to protect people with accurate predictions of hazardous weather. Your question presumes that the weather prediction function is more valuable, but the government may not reach that same conclusion.


Easy to detect, hard to overcome cellular network lobbyists.


This seems like a failure on the FCC's part. If it was going to be a problem these frequencies should never have been licensed out to cell companies.


Stupid question: I was under the impression that one of the limits of 5G was that it's a short-distance signal, easily blocked by a wall or any obstacle. Is it really going to create interference all the way to space? I thought satellites measured the temperature of the top of the atmosphere, not of stuff on the ground.


They actually measure a lot of parameters, not only temperature, and in all the layers of the atmosphere. And they're _very_ sensitive. For some forecast needs (short-term forecasts, storms, etc.), the conditions below the atmospheric boundary layer (https://en.wikipedia.org/wiki/Planetary_boundary_layer) are what matter most, so microwave noise near the ground is definitely an issue.


Typical signal transmission uses a signal-to-noise margin of something like 20-40 dB ( https://documentation.meraki.com/MR/WiFi_Basics_and_Best_Pra... ) for high-speed data transmission, but if you just want to get a big fat one or zero across, you don't need that much. And antennas are really good at picking up resonant EM radiation, even if it's not the "full signal".

But these are very sensitive weather sensors, and they already work by detecting a trend and then detecting a big blip over that. (So rain currently looks like an interesting blip in the noise.) It might be possible to map cities with a differing trend, but it would further complicate the models. And currently over land there's not much noise, because humans don't use this part of the EM spectrum - mostly because it's not great at long range, since it attenuates very fast due precisely to water vapor in the air. So it was "easy" to exploit this for getting weather data, because it was reasonable to assume close-to-constant natural emissions. (Probably only a simple daily and seasonal trend, though it might already be necessary to handle differences between woodlands and urban areas.)

(See also, why it's hard to do the same over water: https://www.researchgate.net/publication/252663726_The_Effec... )
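
A toy illustration of the "blip over a trend" problem: a weak signal that stands out clearly over a quiet natural background becomes statistically invisible once a transmitter-like noise term of larger variance is added. All numbers here are made up:

    import numpy as np

    rng = np.random.default_rng(1)
    n, blip = 10_000, 0.5                    # weak "water vapor" blip, arbitrary units

    quiet = rng.normal(0.0, 0.1, n)          # stable natural emission
    print(f"quiet band: blip at {blip / quiet.std():.1f} sigma (detectable)")

    noisy = quiet + rng.normal(0.0, 2.0, n)  # plus varying transmitter noise
    print(f"noisy band: blip at {blip / noisy.std():.1f} sigma (lost in the floor)")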


Why do we actually need 5G? Wired with last-meter wifi is a lot better option IMO for high-bandwidth transfer.


Any assurances that this won't seriously disturb the earth's ecology and human health, or do we no longer bother with that when manipulating the whole planet?


Given the whole thing with Global Warming, I think this question already has an answer.


They are putting up "DAS" nodes all over my city. Nearly every block. Wonder if these will be hooked up with UHF 5G and what the ramifications may be.


An equally important but often overlooked concern is the risk it will pose to our health, given 5G relies on much higher-energy radio signals than 4G.


Wouldn't it be possible to reuse the 3G frequencies with an updated technology in order to obtain higher bandwidths?


It is. Denmark is allowing the phone companies to use their existing 2G/3G and 4G frequencies for 5G. The 700MHz spectrum is also being opened up for use with 5G.

I'm not sure if it will result in dramatically more bandwidth though.


If it is possible, and those frequencies are worth using for this purpose, maybe they should give free 4G/5G-phones to anyone still on 3G to be able to do it as soon as possible.


Seems striking, especially for topics not touched on here, like radio astronomy.


A network of well-calibrated surface and marine weather stations and atmospheric probes is probably enough to produce reliable and precise weather forecasts in today's age of ML.


You can't calibrate away the RF noise introduced by transmitters that change frequently in time and space.

That's not to suggest that you cannot find a way to discriminate between the water vapor and the 5G transmissions, but you can't just take a sample on a low-humidity day and subtract that from new samples. If the metrics are below the new noise floor, merely throwing machine learning at the problem will not solve it.


I don't mean taking RF data into account at all. I mean just collecting temperature, air pressure, humidity and wind speed data from many points in time and space.


Panasonic was developing a weather model incorporating data from airplanes that was supposed to be really good. But I think most models are based on physics rather than ML.


tl;dr

We get water vapour readings (which feed into our weather forecasting models) from the ~24GHz range, which 5G is going to blast away.


That's interesting - I'm working on using cell signal strength as an indicator for live weather features! There's a simple relationship between signal strength and clearness of weather: light rain causes a slight distortion, heavier rain a heavier distortion, etc.

I'm betting that when all is said and done, the cell phones will help the weather forecast more than hurt it - but this may take some years if the cell companies are too greedy about it.

I'm working on detecting weather using all kinds of phone sensors like barometers and cameras in All Clear if you're interested: https://play.google.com/store/apps/details?id=com.allclearwe...

and the open source sensor package: https://github.com/JacobSheehy/AllClearSensorLibrary
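
For anyone curious about the rain/attenuation relationship itself: the standard model is a power law, with specific attenuation gamma = k * R^alpha in dB/km, where k and alpha depend on frequency and polarization (ITU-R P.838 tabulates them). A minimal sketch with rough assumed coefficients, in the ballpark of ~20GHz links rather than official values:

    def rain_attenuation_db(rain_rate_mm_h, path_km, k=0.1, alpha=1.0):
        """Power-law rain attenuation: gamma = k * R**alpha (dB/km).
        k and alpha here are rough assumptions, not official P.838 numbers."""
        return k * rain_rate_mm_h**alpha * path_km

    for rate, label in [(1, "light rain"), (10, "moderate"), (50, "downpour")]:
        print(f"{label:>10} ({rate:>2} mm/h): "
              f"{rain_attenuation_db(rate, 2.0):5.1f} dB over a 2 km path")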


But that approach, at best, only tells you about the weather near the ground. That can be useful, but is woefully incomplete. You also need to be able to see weather high in the atmosphere.


And radar and satellite data fill some of that need. We generally have a much better idea of what's happening in the middle atmosphere than close to the ground.


Well, signal strength yes - it does have those limits. But pressure data from the barometers can describe some higher altitude conditions (if you can map pressure at the surface over a large geography with decent density, you can infer metrics about the higher atmosphere).

And I do agree that other instruments are required for a complete picture of the weather, not just phones! But a remarkable amount can be done from phones. GPS attenuation for example, can also be read from some Android phones and can inform about high-altitude conditions.
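
As a toy example of the pressure-mapping step: before comparing readings across a geography, you'd reduce each phone's station pressure to an equivalent sea-level value. A sketch using the standard isothermal barometric formula (the temperature and readings below are assumed):

    import math

    def sea_level_pressure(station_hpa, elevation_m, temp_k=288.15):
        """Isothermal barometric reduction: p0 = p * exp(g*M*h / (R*T))."""
        g, M, R = 9.80665, 0.0289644, 8.31446
        return station_hpa * math.exp(g * M * elevation_m / (R * temp_k))

    # A phone barometer reading 975 hPa at 300 m elevation
    print(f"{sea_level_pressure(975.0, 300.0):.1f} hPa reduced to sea level")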


But surely you miss the anomalies that might exist high in the atmosphere and which are actually responsible for the violent meteorological events (such as temperature inversions).


Definitely phones can't replace all weather satellites and such - not yet anyway! But forecasting many severe weather events that actively harm property and life, such as severe thunderstorms, can be aided by smartphone sensors. For example, barometers in phones can detect the initiation of rapid convection that may lead to severe thunderstorms. In many places this rapid convection may start where there are no weather stations, and cannot (yet?) be detected by satellite.


I see. It's definitely complementary, indeed.


> But for many severe weather events that actively harm property and life,

Fuck planes amirite?


No? I am not saying those weather events don't exist - I am saying that there is a very large class of weather events that happen with high frequency, and affect life on the ground significantly, and those include severe thunderstorms that could have early detection through smartphones.

Planes and aviation could have major assistance in planning and adjusting to weather events if a phone network were actively in use as well.

Last second info re: waiting for planes to take off due to weather delays is one of my primary frustrations. It hurts a lot of people and companies to have last-second changes to forecasts and I would like to reduce that pain point by providing further advance notice of likelihood of severe weather in an area. This can be done (partially) with phone sensors.


And these speculations lead you to be confident that these techniques will end _better_ than current satellite-based techniques?

Truly we live in the best of all possible worlds.


I think you misunderstand me - I do not intend to express any confidence that phones will end up better than satellites if used exclusively! I am saying that working in tandem, all technology to measure and study the atmosphere should be used to better understand the weather.

Phones together with satellites and radars and weather stations will produce a better weather forecast experience than we have today, without using phones.

Thanks for the Candide reference; that is one of my favourite books, but I don't think it really applies to my optimism about weather forecasting technology. We currently live in the worst of worlds when it comes to the stability of plans for improving our weather infrastructure (at least in the US).


> Phones together with satellites and radars and weather stations will produce a better weather forecast experience than we have today, without using phones.

That will surely depend on how much the satellites' sensing ability is degraded, no? What makes you so sure, other than a commitment to optimism?


Is there a map of cell strength detectors (actual devices, not a predictive map)?

I think it's really interesting that water is literally the reason why life exists on this planet. Basically, water has shielded life from radiation from space (in addition to the magnetosphere, etc.), and life was able to evolve in that protective layer of water.

So I find the physics of radiation propagating through water, such as you describe, and how it relates to the potential health concerns of humans creating a new, fairly powerful and ubiquitous source of radiation (5G), to be something we really need to pay attention to.


How does it convert signal strength to clearness of weather? I assume you have a map of baseline signal strength in clear weather. And how accurate is it compared to a weather radar?


I should defer to the experts here: https://ieeexplore.ieee.org/document/8070024

The part I'm working on currently is a privacy-aware implementation of recording useful signal strength from user's phones, that could be used for this.


Ajit Pai makes me so angry.


[flagged]


Hurricane path prediction seems truly a "killer app", or do you think it's not?


Predicting whether it will rain or not today would be a killer app, and I'm still waiting.


Agreed. It turns out storm system movements are easier to predict than local precipitation. Aggregates of a system are easier to predict than localized/point values.

https://www.precisionag.com/systems-management/data/betting-... - this is about how effective and useful forecasts are (they are useful, because we have calibrated forecasts, so we can use them for objective decision support in agriculture)

https://www.telegraph.co.uk/gardening/problem-solving/ken-th... - this is about how relatively dry places (like the US or UK) usually get a lot of misprediction, even if the accuracy is 90%. (But it's not; it's more like 80-85%.)


As if mainstreaming weaponized radiation poisoning weren't enough, let's unemploy weather-girls in the same swing. It's like Rod Serling took the brown acid.



