The "if only we had more detailed maps, self-driving would work" is a fake argument. Once you have enough info to know where the fixed trouble spots are, more data will not help you. Trouble comes from unexpected moving objects.
(2004 (not 2005) DARPA Grand Challenge, where everyone did badly. CMU team tried to do it by manual pre-planning. DARPA gave out the course as a set of waypoints on a CD 2 hours before the start. CMU had a big trailer full of people at workstations to plan out the exact path in those two hours, using high resolution aerial photos. But the USMC Colonel in charge of the event foiled their scheme. Just before the event, a few of his Marines went out in the dark and put up some obstacles the vehicle would have to go around. And, sure enough, the CMU vehicle plowed right into a sheet metal fence and got stuck.)
What we're likely to see is bandwidth that changes drastically as you move. You'll get great bandwidth with line of sight to a nearby base station, and then it will drop off drastically as you get further away. That's inherent in using 26 GHz for high bandwidth.
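A rough way to see why (just the free-space path-loss formula, ignoring rain, foliage and walls; the numbers are illustrative, not measurements): going up a decade in carrier frequency costs about 20 dB at any given distance, before any obstruction losses.

    import math

    def fspl_db(distance_m, freq_hz):
        # Free-space path loss in dB: 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)
        c = 3e8  # speed of light, m/s
        return (20 * math.log10(distance_m)
                + 20 * math.log10(freq_hz)
                + 20 * math.log10(4 * math.pi / c))

    for freq in (2.6e9, 26e9):          # typical LTE band vs a 26 GHz mmWave band
        for dist in (50, 200, 1000):    # metres from the base station
            print(f"{freq / 1e9:4.1f} GHz at {dist:4d} m: {fspl_db(dist, freq):5.1f} dB")

26 GHz loses roughly 20 dB more than 2.6 GHz at every distance, and unlike the lower bands it doesn't diffract around obstacles well, hence the line-of-sight behaviour described above.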
The likely benefit is that it becomes possible to provide enough short-range bandwidth for many people to get video-rate bandwidth in crowded areas. So people can watch the game on their phone while in the stadium.
I'm not a GPS expert by any means, but dang.
It's also to enable autonomous vehicles to constantly stream data and video feeds back to humans who can take over in real time if the car doesn't know what to do.
Furthermore, you can't expect a remote operator to suddenly take over with no context of preceding events and immediately control the vehicle in a safe manner. It takes a little time for anyone to understand what's actually going on.
If level 4+ autonomous vehicles are ever going to work then at a minimum they need to be able to operate safely with zero network connectivity.
In practice that means being able to stop safely and wait until either:
a) the situation returns to within operating spec, or
b) remote control comes through.
The remote control aspect would be severely limited and intended only to get the vehicle back to a situation within its operating spec. This trades the very difficult problems of 'needing to immediately control the vehicle in a safe manner' and 'what happens if remote control is lost' for the still-difficult problems of 'how do I detect I'm out of operating spec' and 'how do I park safely when I'm out of spec, especially if I've just lost remote control'.
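A minimal sketch of that fallback logic, with hypothetical state names and checks (not from any real AV stack), just to show where those two hard problems end up living:

    from enum import Enum, auto

    class Mode(Enum):
        AUTONOMOUS = auto()     # within operating spec, no help needed
        REMOTE_ASSIST = auto()  # out of spec, remote operator nudging it back toward spec
        SAFE_STOP = auto()      # out of spec and no connectivity: park as safely as possible

    def next_mode(within_spec, remote_link_up):
        # The hard problems from the comment above live inside these two booleans:
        # detecting "out of operating spec" and executing a safe stop without a link.
        if within_spec:
            return Mode.AUTONOMOUS
        if remote_link_up:
            return Mode.REMOTE_ASSIST
        return Mode.SAFE_STOP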
I frequently drive on rural roads where there are blind curves and no real shoulder (and not much cellular coverage either). A human driver can usually steer a disabled car mostly off the road by going down into a ditch or up an embankment. But it's still not a safe place to be. And autonomous vehicle software is nowhere near being able to handle those maneuvers.
It helps with municipal adoption.
If this is required for autonomous vehicles, then I'll pass.
Given enough time, they will probably stop requiring external help, but it's very unlikely that the first autonomous cars will run without it.
Could anyone enlighten me as to what the use cases are? Because it still seems dubious.
Other, fuzzier ones are, uh, crunching live video or audio captured by the device via beefy compute to... display overlays? Search similar products? Live translation? Calculate evasive maneuvers? A lot of these are limited by interface concerns that introduce human or other 'latency' anyway, making the connectivity latency between the base station where the 'edge' is and the device itself less of an issue.
I have a Samsung TV which, if it is not in "game mode", introduces much more latency than my DSL internet.
I couldn't win at Titanfall at all with the TV in normal mode; in game mode I was able to progress halfway up the leaderboard.
So far as I can tell, "Edge" is for people who think AWS isn't expensive enough and they'd rather pay AT&T or American Tower prices.
When it's a pure video feed you get the impact of all the latency all the time, and that means the problem has to be solved with brute force reduction of bottlenecks everywhere.
So how much latency is enough for responsiveness? This is pretty easy to derive from the common target framerates: for a 30 Hz display you need 33.3 ms; for 60 Hz, 16.6 ms; for 120 Hz, 8.3 ms. Since perceptual latency is known to keep improving up through 144 Hz, it's reasonable to say we should be looking for only 6-7 ms at most, while most broadband connections still achieve pings in the 20-100 ms range, depending on the game and the specific connection. Even on the very best connections, then, we can assume at best a "30 Hz transparent" video stream, which is fine for casual experiences but severely impacts competitive play when tested. In many current popular titles this manifests in mechanics that are both latency-sensitive and require a server round trip, e.g. building in Fortnite.
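The frame-budget arithmetic above is just 1000 ms divided by the refresh rate; a quick sanity check:

    # Per-frame time budget: if the round trip to the server plus encode/decode
    # doesn't fit inside this, the stream is at least a frame behind.
    for hz in (30, 60, 120, 144):
        print(f"{hz:3d} Hz -> {1000 / hz:4.1f} ms per frame")
    # 30 Hz -> 33.3 ms, 60 Hz -> 16.7 ms, 120 Hz -> 8.3 ms, 144 Hz -> 6.9 ms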
I have between 40 ms and 200 ms on 4G. Average around 80 ms, often with peaks above 300 ms.
That seems implausible to me. We already have the equivalent of supercomputers in our pockets, and yet the trend is to move computing away from the end user, onto centralized servers.
Really though, the waves aren't faster than light. The speed of light in different media can depend on frequency, but I don't think that's the problem here, since the refractive index of air is close to 1; congestion is more likely.
Congestion is a typical problem, though. "Mobile Edge Compute", what about "capsule networks"?
Perhaps those people should check how the first navigation device worked (FYI: without GPS, back in 1981) as inspiration to see if there are other ways.
The self-driving car problem can be divided into several chunks, and the 2005 Grand Challenge intentionally tested some of those chunks and ignored others - a divide-and-conquer approach to problem-solving any developer will be familiar with.
The subsequent 2007 urban challenge added extra chunks, like other moving vehicles, high-level route planning and replanning around blocked roads, manoeuvres like parking and three-point turns, and suchlike.
I already see speeds like 200 Mbit/s on 4G in good conditions, sometimes more; 50-100 Mbit/s in normal conditions. Yeah, your mileage may vary. Those 1.2 Gbit/s 4G cellular modems do deliver. I'd be disappointed if 5G didn't significantly improve on that in the real world.
> 5G is promised to have much better latency than 4G - perhaps 20-30ms in the real world, down from 50-60ms for LTE (4G). It’s not clear how visible this will be to users.
What? HSDPA "3.5G" was about 50-60 ms. 4G is mostly something like 12-20 ms, when I've measured the latency. 5G hopefully at least halves that.
Anyways, I do acknowledge 4G performance is very regional. The above just reflects my experience and what I've been measuring.
"low latency" for 5G is related to a variant called URLLC, for Ultra-reliable and low-latency. 5G is an umbrella for 3 different variants:
1) eMBB: enhanced mobile broadband with higher speed, mostly with mmWave. This is associated with the 3GPP "NR" (New Radio) cellular standard;
2) massive IoT: this will actually be based on LTE Cat-M and NB-IoT for a long while, with small improvements in Release 15. It can be confusing, but the "Gs" are performance requirements defined by the ITU, not a specific technology; a tech that meets the 5G requirements can then claim (honestly, without marketing hype) to be 5G. It turns out that for massive IoT, LTE already meets the 5G ITU specs. Later on there will be a new tech based on NR, but nobody is in a hurry there, as LTE IoT is fine;
3) URLLC, the optimized for low-latency variant based on NR.
URLLC is the least clear of the three, really. The vision is to use NR for industrial control applications, and maybe VR/AR for consumers too. There are low-level optimizations down to the radio framing layer to reduce latency. Lots of fuzzy visions, but I'm not sure anyone has a clear business case. The work on this at 3GPP (the organization defining the LTE and NR specs) is progressing slowly. In particular, there's always a cost to reduced latency, so it won't come for free. Private use in factories could make sense, but consumer use is less clear.
There's a tendency to confuse "regular 4G/5G" with URLLC, but the latter is a different beast and not solid yet.
For what most people use their phones for, extra bandwidth isn't going to make much difference, which is good because the individual is unlikely to see it: more devices are coming online as the bandwidth improves, and your own demands increase too (usually outside your control: sites getting heavier, using higher-quality video, ...).
The latency improvements will be more noticeable, I expect, especially in busy cells, because latency-centric problems tend to balloon much more than bandwidth-centric ones as the air gets more crowded. As the available throughput nears saturation, both bandwidth and latency suffer, but for most uses latency is going to cause the most problems: for many applications you can wait for the bulk of a block of data to come in, as long as the first bits of it arrive quickly (in a web-browsing example, you can start reading before the rest arrives, unless the page is badly designed and doesn't render until everything is transferred).
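A back-of-the-envelope model of why that is (toy numbers; it ignores TCP slow start, parallel connections and caching): for web-style traffic the protocol round trips dominate, so halving the RTT buys you far more than doubling the bandwidth.

    def first_render_ms(rtt_ms, round_trips, first_chunk_kb, bandwidth_mbps):
        # Rough time until the first useful bytes arrive: a few protocol round
        # trips (DNS/TCP/TLS/request) plus the transfer time of the first chunk.
        transfer_ms = first_chunk_kb * 8 / (bandwidth_mbps * 1000) * 1000
        return rtt_ms * round_trips + transfer_ms

    print(first_render_ms(50, 4, 100, 50))   # ~216 ms baseline
    print(first_render_ms(50, 4, 100, 100))  # ~208 ms: double the bandwidth
    print(first_render_ms(25, 4, 100, 50))   # ~116 ms: halve the latency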
Away from "normal" Joe Public mobile network use (web pages, games, some video, maybe maps) the next big thing is workers using VPNs, and again here latency improvements are going to be more beneficial to most people than a bandwidth boost.
Despite this, the bandwidth improvement is the figure most shouted about. The reason is that it is a number people understand (or at least think they understand), because it is how these things are usually touted. It is not dissimilar to the back end of the GHz wars with CPU releases: at a certain point the extra clock ticks were meaningless because of many other factors, but chips kept getting sold that way because that is what the buying public "understood", and marketing didn't want to bamboozle them with other details (cache design, multi-core performance, ...).
If I’m downtown speeds are usually better.
One of the main issues here is that they want to make a lot of money from 5G; that's the main reason why you're always required to have a base station involved.
thing is: you are a subscriber, not a 5G procurer.
Telcos are concerned with how many people like you they can support at once with how much power and opex and capex.
If 5G tech provides n times the cell throughput for 1/n the capex and/or opex, and you (as a subscriber) see no benefit vs LTE, that will still please the telcos plenty. However, 5G may make a big impact on your experience as well.
I don't know how you measured the latency; but remember that these numbers can be very deceptive due to network technology pretending to do things that it isn't really doing. That's fine when you run a speed test, but not if you are being treated at the side of the road.
I've had one (Cat 18, 1200/200 Mbps) in my daily driver for almost a year. The best download speed I've seen under ideal conditions has been a bit under 400 Mbps.
With wireless data connections, you're sharing the medium (and therefore the bandwidth) with all other users of the same frequencies. This is the reason you can have absolutely terrible performance in a densely populated city despite having a maximum-strength 4G signal. Having the 20+ GHz channels enables operators to install a large number of micro-cells in these areas. Even if they don't penetrate walls, moving everyone outdoors onto these cells frees up the lower-frequency cells for indoor users, substantially reducing contention.
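To put rough, purely illustrative numbers on the contention point:

    # Shared-medium arithmetic: everyone attached to the same cell splits its capacity.
    cell_capacity_mbps = 300            # hypothetical macro-cell downlink capacity
    for active_users in (3, 30, 300):   # quiet suburb vs packed city block
        print(f"{active_users:3d} active users -> ~{cell_capacity_mbps / active_users:6.1f} Mbps each")
    # Densely packed micro-cells split the same crowd across many more cells,
    # which is where most of the per-user gain comes from.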
Another improvement over 4G that hasn't been mentioned by anyone is, supposedly, that it will significantly reduce latency and interruptions from moving between cells. This is especially apparent when travelling at speed (e.g. on a train).
The effect to consumers of both of these is that we'd get much more consistent performance.
I live in Bucharest, Romania, a city of 2 million people, but it's very densely populated and the 4G data plans are cheap, on prepay or contract, so everybody with a phone has 4G bandwidth to use (not necessarily for watching Netflix, but enough for music or facebook).
And we get good 4G coverage both indoors and outdoors. Right now I'm at my office, indoors. As I'm writing this I just did a speed test on my connection and I'm getting 105 Mbps download, 10 Mbps upload, 17 ms ping. And this is the norm, I know because I like going to coffee shops and use my mobile connection for real work.
Here's a screenshot: https://www.dropbox.com/s/pw86k6tgd9rc5fd/Screenshot%202019-...
I haven't seen such good coverage and performance in the several European or US cities I've traveled to. E.g. in my experience the Internet in Germany or France is really poor.
But I don't believe that it is because of the 4G technology.
Every other application, especially the cool ones like AR/VR which do seem plausible for people dropping a lot of cash, will be constrained by the enormous markup on data well before it hits the limits of LTE. Fewer milliseconds on handoff could be nice but it’s hard to think of an application which can’t buffer but is going to fit within a few GB per month.
I don't give a crap about faster mobile data, I've got a 5GB plan and faster data just means I could chew through it faster. All of the data heavy applications I can think of I use almost exclusively on wifi where it's available.
I think the real sell for 5G is its capacity to carry thousands of concurrent connections, allowing for everything to be connected: a true IoT solution where anything you can think of can connect without congesting towers.
The potential benefit I see, though, is that you potentially no longer have to pay an ISP like Comcast. You would only pay one bill, for 5G, covering both your computer and phone. Aside from that, it's a difficult marketing sell.
I think telcos will encounter increased resistance to putting up millions of microcells, and the cost will end up being way higher than they anticipate. So coverage in the mm bands will end up being extremely spotty (way worse than 4G now), and may end up being a near-bust.
That's my .02 anyway. But I live in a suburban-ish place where 4G coverage is mostly a cruel joke, despite being just across the Golden Gate Bridge from 4G-ground-zero SF.
It certainly seems like much less of an improvement than 2G->3G or 3G->4G.
However, latency is a different story. Many (most?) can "feel" improved latency as better responsiveness, like web page load speed. There's still a lot of "ping-pong" traffic going on, where latency improvements do make a difference.
I wish there were more efforts behind IOT support, by way of lower-frequency bandwidth and low-energy radios, maybe even restricted to a simple messaging protocol. That would open up so many new applications!
Compare it to cable modems, with their still lowish 5-10 ms latency. Pages load visibly slower. But you don't mind it, or even notice, if you haven't previously gotten used to the faster option.
So I think I'll always take all of the latency improvements available. The difference can be substantial even when the starting point is already "pretty good".
The UX around free public WiFi is so terrible that I really never want to use WiFi again outside of my own home.
Fatter pipes, as stated in the article. There are still lots of people on 3G networks, and we expect that once all of them move to LTE, the total amount of data will again double. 5G is designed around capacity: the higher the total capacity, the more data they can sell you. In reality you will likely be paying the same prices but getting much more data in your plan. Surely this is a good enough reason for them to spend money, right?
Something tells me the bloated ticks on this dog's underbelly will grow in proportion to the fatness of this new pipe.
Thanks to this, LTE IMS voice is separated from data, like on 3G, even if it's done differently.
Of course, this doesn't apply to OTT voice: that is always in the Internet PDN and in your data cap.
To get back to 3G, it'll slowly die. Anything you can do on 3G you can do better on 4G. So operators tend to use their 3G infra to the max, but they won't upgrade it. It'll be replaced by 4G over time, until they eventually pull the 3G plug.
3G can have a temporary advantage for (1). But as the 3G infra ages, it will always tend to be replaced by cheaper / more efficient LTE infra as the existing 3G bands can always be refarmed for LTE. And LTE will make a better use of the available bandwidth. There are large differences between operators on the speed of this process, but the trend is universal.
(2) is why 3G was impaired vs 2G: 2G was deployed in low bands (900 MHz for 2G is very common), while 3G was very often only available in mid-bands (1700+ MHz). The higher the frequency, the more challenging the coverage. So 3G coverage lagged 2G for a long time, and is still lagging in places (Europe for example).
The situation is different with LTE, due to the digital TV dividend. This opened low bands for LTE everywhere (700 to 800 MHz, depending on region). Those bands are even better than the 2G bands. They're not yet fully deployed everywhere (Europe is lagging), but they eventually will be. And when that's done you can kill 2G and 3G.
For (3), of course, new tech brings improvements, so 4G is better than 3G and 2G.
When you combine all of this, LTE will replace all previous technologies everywhere, in time. How long it takes will however vary a lot depending on the region. Europe invested a lot in 3G, and telcos there will milk that infra as long as possible. And because 3G coverage in Europe is not as good as 2G (see above, bands), 2G will also tend to stay alongside 3G. Cheap second-hand 2G/3G infra will remain for a long time in developing countries. But in the end 2G and 3G will go away; it's just a question of time.
On the other hand, 5G will take a lot longer to replace 4G. The big change is mmWave, but it's for small hotspots. You can have 5G in low bands too, but then the gain vs 4G is small so I expect operators to take their time to replace their 4G infra. We'll have 4G for coverage + 5G for dense areas only for a long time IMHO.
Around here, LTE coverage absolutely kills 3G coverage since LTE is deployed on the old analog TV bands (700/800 MHz) which has great penetration into buildings etc. My operator doesn't even activate 3G on new iPhones - they're effectively LTE-only now.
Further still, the new "5G cellular" standard can include use of the 5 GHz frequency band.
Actually, in writing this, I'm realizing I myself may have some understanding incorrect. So please correct me if you know better.
In fact: I don't want faster speeds. Give me much, much slower speeds so that instagram doesn't autoplay video and vampire all of my mobile data plan.
Can anyone comment on whether this might have any effect on us? It makes me a bit anxious to see the huge list of wifi networks available where I am -- are we absolutely certain that all those signals don't have some sort of health impact?
Excuse me if I still feel skeptical about mmWaves..
Both from a physics standpoint, as well as from a biological standpoint, they are not the same thing.
Both are biologically active, but to one of them we have adapted (vitamin D production and photosynthesis, for example); to the other we have not.
From another perspective: the FCC limit for microwave radiation is 10^18 times higher than natural background. That's a difference of a quintillion times.
Background levels (green) compared to current exposure levels [chart referenced but not reproduced here].
Therefore you can't really compare visible light exposure with microwave exposure. One has existed on the planet for billions of years, the other is completely artificial and man-made, and unlike visible light it penetrates tissues through the whole body, and there is no adaptation.
The only time I notice a difference is when I'm downloading a big file, which in these days of streaming everything, I rarely do now.
But for normal web browsing, my browser rendering speed seems to be more of a bottleneck than my internet connection. And on my phone pretty much all I do is web browsing (either through a browser or an app).
I find that sort of appealing as a unification.
Wireless is already a 'good enough' wired replacement for areas where running fibre is uneconomical, though.
Cars being able to get new information within milliseconds about things that happened far away from them, but on their course, will be useful.
AR outside the home could be incredibly useful, if provided with low latency and high bandwidth over 5G.
Verizon is being a bit deceptive in using the "5G" designation for this. It reminds me of when US telecoms started calling their non-4G systems "4G" for marketing purposes.
And furthermore, with people generally not needing more data and not willing to pay more, why would wireless carriers bother?
Just recently some journalists covered the biological effects of 5+ GHz radio on humans, assuming that there are real risks compared to the sub-2 GHz bands.
Can you elaborate on this?
Tagesspiegel is usually a good newspaper, but this reasoning is very light on facts. They should stick with uncovering lobbying.
I don't see that anywhere. All the data has shown average data usage per user is still increasing YoY. And as a new, younger population comes of age, replacing an older generation that is less tech-savvy, data usage will only continue to grow for the foreseeable future.
We certainly do not want to pay more, but I have never heard of anyone who doesn't want more data, assuming they are not on an unlimited plan.
You have been paying for a service agreement on top of the infrastructure, and as long as you continue to pay under the agreed conditions, Huawei, Ericsson and Nokia will continue the hardware and software investment as well as the network tuning.
So if you are paying $100 per month, and $30 of that has been going to Nokia / Huawei / Ericsson, then as long as you pay the same, there will be money for future upgrades.
And one of the reasons you see carriers in many countries consolidating down to three is that the nature of the carrier business requires a certain minimum amount of revenue to be sustainable.
Note: of course this is an oversimplified description, but it basically shows you don't need to pay a lot more to get 5G.
Why do you need a specific killer use case for more bandwidth and lower latency? That helps almost all applications.
(Yes, and presumably they'll increase the cap, but even a 100x increase would give you 1 full day of usage per month.)
'LTE generation 4.3' just doesn't have the same ring to it.
But that's where the economics get fuzzy too. Building out all these base stations will be an enormous cost. Mainstream consumers may tolerate modest price increases for connectivity, but far fewer will bear significantly higher prices, or spring for significantly better plans. Such a market segmentation would also dampen the consumer excitement for use-cases as well.
To get around this, corporations who want to ensure connectivity for their application will push to become MVNOs and offer captive access to the corresponding product, so that the end-user doesn't have to pay the cost directly. This works best when they control the hardware too. Vertical ecosystems will proliferate, where the experience can only be consumed using the corresponding hardware.
It's not hard to imagine the likes -- and competitors -- of an always-connected successor to the Nintendo Switch, streaming games from a nearby server farm using a captive MVNO, or one of the many Amazon or Google's decidedly non-gaming, 'smart hub' devices that ensure their own connectivity without the need to put them on your Wifi. This has serious implications for privacy and business models too: it will be commonplace for devices to be connected to the home base by default in a way that's difficult to thwart, but correspondingly license and authorization servers will always be reachable, so DRM-enforced subscription business models can continue to thrive.
But the issue is, you can already do the entire latter part -- the always-connected home hub, or the always-on DRM captive media player with LTE or lower, but it's not yet done. Why? Because people willingly join them to their Wifi for free. 5G will have to fit into the holes left by existing alternatives, and do it at a price point or cost structure that makes sense.
I expressed my view before that most of the hype surrounding 5G is the industry's own buzz -- likely to get investors excited -- which is then amplified by tech journalism, whether intentionally or unwittingly. It remains very much to be seen how far its deployment lives up to the big expectations.