This is all because there are edge cases for which your brain can make use of significantly more information than it can directly identify. Just because a pixel has a smaller arc than your retina can distinguish doesn't mean that its contribution to the image isn't recognized. "Retina" screens are still far below the level at which there would stop being advantages to increasing resolution. (Yes, there would be huge disadvantages too, and I'm not saying we need higher resolutions.)
What are the odds that there are edge cases where the extra bandwidth of 5G would actually matter? Seems pretty close to 100% to me. Some of those edge cases might be as easy to find as moving your mouse pointer rapidly. Maybe going to a sporting event?
Ever notice how almost all home security networks support cellular connections these days, even the lowest priced options that cost $10/month? That's 2G and 3G at work. The marginal cost to the security company is probably $0. Likewise with the Kindle--so cheap that you only pay a single upfront cost premium, largely for the hardware.
Cringely mentions the desire to supplant POTS networks, but that doesn't do justice to the market potential. Home security networks switching from POTS to cellular was an incremental improvement. The growth will come from all the various devices and services that previously weren't economically viable given installation and recurring connectivity costs. Basically, IoT, except that without context IoT is a meaningless phrase.
For example, I noticed yesterday that the arcade at the movie theater switched to electronic tokens. The card dispenser had a cellular antenna, thus I assume the whole thing was completely remotely managed and probably just took a few hours to install without having to provision any local services (like a POTS line) except electricity. Vending machines, etc, are obvious [existing] growth markets, but who knows where it will go.
Anybody know who's the leading provider in this space? I've always assumed Sprint with T-Mobile in second place.
I used to work at a firm that made/distributed/maintained gaming machines, and that's exactly correct. It came about in a roundabout way, though.
Some jurisdictions mandate active management and reporting from 'fixed-odds betting terminals' and historically that was done with POTS and overnight modem dialing. That moved across to 3G because the on-site complexity was drastically lower, but it was a slow move due to the carrier and new hardware costs.
Now those costs are barely anything, and the back-end hardware is already in-place for FOBT. So it makes a lot of sense to move all the other terminals over to this system even if it's only for maintenance reporting/alerting, and not odds/configuration management.
The electricity metering industry tells a nearly identical story. Now every meter might as well be a smart meter, because everything needed to achieve that has to be present anyway, it's just a case of rolling out the end customer devices.
We already have this. It's called NB-IoT (Narrowband IoT), and the carriers have been dragging their feet horribly.
It's a chicken-and-egg problem. Without cheap, high-volume, low-power chipsets, user devices don't exist. Without lots of user devices, the carriers feel no pull to roll it out. Without rollout, nobody feels the need to jump in and create a cheap, high-volume, low-power chipset.
EDIT: Press release from the original announcement, https://www.t-mobile.com/news/narrowband-iot
Currently, a 60 Hz screen isn't perfect, and pixel overdrive is used to reduce motion blur. In effect, more 'unique intermediate frames' are being displayed as the system tries to physically approximate the discrete input frames. The actual blur on fast-moving content on a modern screen is huge...
Ignoring latency for a moment: if a controller has the previous and next frame, plus motion vectors and knowledge of the precise pixel response and decay curves for the monitor, it should be possible to continuously adjust the current/voltage(?) to every pixel to better approximate what a much higher-refresh-rate, low-motion-blur panel would look like on high-fps content. It can't remove blur, but it can make motion continuous rather than a stair-step. A kind of smart motion interpolation + overdrive implementation.
The assumption is that the higher the fps, the more linear the motion is from frame to frame. 30-to-60 fps interpolation should show more noticeable abnormalities than 120-to-1000 Hz.
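A minimal sketch of the warping half of that idea, in Python/NumPy with made-up frame data (the function name and the naive forward-warp are my illustration, not any shipping controller's algorithm):

```python
import numpy as np

def interpolate_frame(prev_frame, motion_vectors, t):
    """Forward-warp prev_frame by fraction t (0..1) of the per-pixel
    motion vectors to approximate an intermediate frame.
    prev_frame: (H, W) image; motion_vectors: (H, W, 2) as (dy, dx)."""
    h, w = prev_frame.shape
    out = np.zeros_like(prev_frame)
    ys, xs = np.mgrid[0:h, 0:w]
    # Naive nearest-pixel forward projection; a real controller would also
    # fold in the panel's measured pixel response/decay curves when picking
    # drive levels, which this sketch ignores entirely.
    ty = np.clip(np.round(ys + motion_vectors[..., 0] * t).astype(int), 0, h - 1)
    tx = np.clip(np.round(xs + motion_vectors[..., 1] * t).astype(int), 0, w - 1)
    out[ty, tx] = prev_frame
    return out

# Toy demo: one bright pixel moving 2 px right per 120 Hz frame interval.
# A 120 Hz -> 960 Hz upsample would emit 7 warped frames per interval.
frame = np.zeros((4, 4))
frame[1, 1] = 1.0
vectors = np.zeros((4, 4, 2))
vectors[..., 1] = 2.0
intermediates = [interpolate_frame(frame, vectors, k / 8) for k in range(1, 8)]
```

Note this handles neither occlusion nor colliding projections (later writes simply win), which is exactly where visible interpolation artifacts tend to come from.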
I doubt one could make it work for computer graphics (since motion vectors aren't easily accessible), but when displaying video the results could be good. That said, users seem to dislike content with interpolated frames, and perhaps your proposed tech would get a similar response.
My hunch is that motion interpolation works better the higher the source fps. E.g. a high-action scene at 30 fps needs a lot more information filled in, and has likely already been mastered with the target frame rate in mind (camera shutter speed etc.).
But content at 120 Hz or more will have very little motion between frames, and that motion will be far more linear (e.g. a finger might move 1 cm between frames but doesn't bend, so fewer motion artefacts).
So 120 Hz content upscaled to 960 Hz+ would be much closer to native 960 Hz content, and preferable to displaying it at the original 120 Hz. A logical implementation would be VR headsets with a built-in DSP.
Here's my thought: connect a laser to a high-frequency switch, similar to what you'd find in a PWM controller. Then rig the laser to move so that the beam traces a small circle fairly rapidly. Set up in a dark room and play with the switching speed until you can no longer distinguish individual spots/blurs as it traces the pattern on a wall. Based on my experience with a 144 Hz monitor, I'm betting that if it was moving fast enough, I'd still be able to tell the difference at 500 Hz. Possibly a lot higher. I'd be fascinated to see empirical results, especially if they proved me wrong.
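Some back-of-the-envelope numbers for that rig (every value here is assumed purely for illustration, not measured):

```python
import math

# Assumed setup: the beam traces a circle of 1 m circumference on the wall,
# sweeping 50 revolutions per second, viewed from 2 m away.
circumference_m = 1.0
revs_per_sec = 50.0
viewing_dist_m = 2.0
sweep_speed = circumference_m * revs_per_sec   # 50 m/s along the wall

for pulse_hz in (144, 500, 2000):
    spacing_mm = sweep_speed / pulse_hz * 1000  # gap between adjacent dots
    # Angular size of that gap in arcminutes (~1 arcmin is typical acuity):
    arcmin = math.degrees(math.atan2(spacing_mm / 1000, viewing_dist_m)) * 60
    print(f"{pulse_hz:>5} Hz -> dots {spacing_mm:.0f} mm apart (~{arcmin:.0f} arcmin)")
```

Under these assumptions, even a 2 kHz pulse rate leaves dots tens of arcminutes apart, well above visual acuity, which is consistent with the bet that 500 Hz would still be distinguishable.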
Galvanometers are the normal way to do that, but they are quite expensive and not a normal part of the hobby electronics repertoire.
Cringely is the stereotype of the pundit who makes a lot of really confident predictions that are just absurdly, comically wrong, over and over and over. But he's fun to read, so he gets read. Same as the analysts that you see on the news.
However, when I make my own investments, I do something more thorough than watching a guy make quick-fire guesses.
The real issue is that 5G in the 24–86 GHz region has a range of about 1500 m, unlike LTE (and 5G in the LTE bands up to 6 GHz), which has a much longer range. So think of 5G more as a small-cell mesh technology -- I doubt you'll see it outside urban centers except for marketing -- your "5G" phone will continue to work, but only at LTE speeds and bandwidth.
So yes, you can already watch netflix on your phone, but not every technological step forward is about your individual entertainment :)
For reference, just look at what Ericsson is doing: https://www.ericsson.com/en/5g/what-is-5g
Or they’re just talking about what their audience wants to hear.
Without actually making a story up. Usually.
The carriers are just going to push a software update to change the indicator for LTE-A to 5G, I think AT&T already did that...
So for this part, you do not have to worry about 5G being short range or easily blocked. Compared to LTE you will still get benefits like flexible numerology (ie. lower latency) and full duplex (ie. increased bandwidth).
The name 5G is more of a marketing term rather than an accurate technical description, just like the term "retina display". Different vendors can market the same technology very differently. For example, HSPA was marketed as 3.5G, 3G+, Turbo 3G, 3.7G, 3.9G and even 4G. It did not help that HSPA evolved a bit during its lifetime. I would not be surprised if operators arbitrary decide to call whatever they are using 5G.
If 90 percent of the phones in a cell can be hoisted up to the 80 GHz spectrum, then the remaining devices can each have a much larger share of the 800 MHz/1.8 GHz spectrum.
As the leaves blow about in the wind, devices will dynamically shift bands every few milliseconds.
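A toy version of that offload arithmetic (the device count and bandwidth figure are assumed, not from any spec):

```python
# Assume 100 devices share 20 MHz of low-band spectrum. If 90 of them can
# be hoisted onto mmWave, the low-band share of each remaining device
# grows tenfold.
low_band_mhz = 20.0
devices = 100
offloaded = 90

share_before = low_band_mhz / devices               # 0.2 MHz per device
share_after = low_band_mhz / (devices - offloaded)  # 2.0 MHz per device
print(f"before: {share_before} MHz/device, after: {share_after} MHz/device")
```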
People have been saying this for years, but I'm not yet convinced; I don't hear too many people giving up wired for 4G. The overall download limits must surely be tighter?
What 5G might solve is cell congestion. There are already places in central Edinburgh where I can have full "bars" and no bandwidth, because it's too heavily contended with the other thousand people in the same cell now that everyone has a phone that's polling all the time.
Just for reference, a streamed point cloud runs about 100 MB/s. You can compress "frames" with the totally unproven Google Draco library, but there's nothing quite like MPEG for the format. I guess it's hard to even call it a format, since there's no real standard at present.
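That figure is easy to sanity-check. The point count, bytes per point, and frame rate below are all my guesses at a plausible capture, not a spec:

```python
# Assumed: 250k points/frame, 15 bytes/point (3 x 4-byte floats + RGB), 30 fps.
points_per_frame = 250_000
bytes_per_point = 15
fps = 30

raw_mb_per_sec = points_per_frame * bytes_per_point * fps / 1e6
print(f"raw: {raw_mb_per_sec:.1f} MB/s")  # lands right around that 100 MB/s mark
```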
2G -> 3G -> 4G -> 5G
VHS -> DVD -> Bluray -> UHD
While the jump is significant between each level, the apparent improvements are diminishing. On LTE I can watch Netflix, live tv, Facetime, use the internet as I would at home on Wifi.
Using the internet on the go (mobile) won't see that much of an apparent improvement (unless you are needing high bandwidth connectivity), but I could definitely see a case if this was used to push gigabit level wireless speeds in hard to install/dense residential locations.
Kinda like how 3G meant iMessage, Skype and Hangouts rendered SMS obsolete, so carriers stopped charging per message - I think 5G will mean more people will stream other video services (thus skirting the controversy over zero-rating Netflix and YouTube to the detriment of their competitors). Hopefully this will also mean that people in underserved rural areas stuck on atrociously slow DSL can get decent service without having to compromise with a high-latency satellite connection or a low-transfer-limit LTE hotspot.
That in turn means apps can be faster because they're running on a cluster of powerful machines with a big GPU.
Developers will look back at the old times when apps had to be unloaded to save memory, and then try to recover state when reopened.
As for use cases that 5G will enable, industry groups are hopeful about nearly every scenario that involves sending and receiving data. But technical fitness and business models are different questions entirely. And if we're talking about the kinds of hardware Apple currently makes, 5G is to be understood solely in terms of the wireless carriers a customer can subscribe to, not some intricate use case for machine-to-machine communication.
Unlike more invested parties and component makers like Qualcomm, Huawei, Nokia, or Samsung, their customer base isn't industry at large, but people who want a tablet-like device to carry with them. They can punt this to the next generation of phones or the one after that; they can wait for the network to be built, for the huge number of access points to be deployed, and perhaps even for their feud with Qualcomm to wind down. The state of play (solely in terms of mobile data and its utility for a smartphone) isn't going to be running circles around them for a while.
But the important question to ask here is why that speed difference matters for mobile phone users? It doesn’t. 5G is no killer app.
Stopped reading at this point.
1) Based on quotes from various providers, one big benefit of 5G is bandwidth. I don't personally know the details (beyond less time spent downloading something freeing up bandwidth for other devices sooner), but that would seem to be a big deal for the network saturation issues I hear about (T-Mobile is great in my area and I've never experienced throttling or the like, but I've heard stories).
2) What other killer feature can a phone have at this point? I’ll give credit to phones that flip out into tablet like displays, but I think just about any truly compelling innovation is pretty done in these devices.
3) Anecdote time. I live in a neighborhood where we spent 5 years fighting a provider that maintained an exclusive easement over TV/Internet. This provider’s speed was 10Mbps on a great day and their peak hours were, per their support staff, “between 4pm and 1am” where you were lucky to get Facebook to load (and people sharing pictures/video which seems to be the big use case was a painful experience to be on the receiving end of). Finally, we got out of it and while waiting for Spectrum to lay their infrastructure, I was using T-Mobile’s International Plus add-on as a hotspot for desktop, laptop and TVs and it was fantastic (relative to what we had). Mobile LTE/5G could very easily serve those in similar situations and if infrastructure is in place, rural areas.
I'm of the mind that I won't buy a 5G device until the infrastructure is there, so I'm okay with what I read up to a point, but it seemed like a case of making a myriad of excuses after a while, and I tired of it quickly.
Now this is interesting. I wonder if this means the rural and the poor will have to give up on free television.
 Ad supported.
Edit: Not trying to imply only the rural and poor enjoy free tv. Cord cutters do too. My concern is that they won't be able to afford the new alternative or it just won't be available.
There's a lot to work out before Cringely's "utopian" 5G network to rule them all can actually exist.
But the previous post is 100% correct in saying that “video over unicast on the internet” is massively inefficient compared to current broadcast techniques.
People want Netflix-style video on demand though. Outside live news & sports that will kill broadcast TV.
Sending cartoons as raster rather than vector data is also horribly inefficient, yet every TV network does it.
Sending music as MP3 is horribly inefficient compared to the original notes in midi form, yet everyone does it.
Taking a photograph of your electric meter rather than writing down the number is millions of times more wasteful of space, yet people do it.
If you've ever played an instrument that isn't a bad synthesizer, you'll know why everyone does this.
The whole idea behind 5G is that it will allow the wireless carriers to totally eat the lunches of wireline telephone, cable and Internet service providers while also supplanting broadcast TV.
well, most of the wireline operators have a _huge_ presence in the wireless world as well, think at&t, vzw, sprint, vodafone etc. of the world.
afaik, 5g has primarily been about better spectrum utilization. the core network has not really changed significantly since the 3g -> 4g transition (where it, i.e. the core, became an all-ip network). sure, you keep hearing about things like control-plane/user-plane separation (cups) etc., but that is just a marginal improvement over what has already been deployed/running...
Also, a 5G network with lots of restrictions won't see much usage. I'd rather have a good unlimited 3G connection than 5 GB at 4G speeds, and the same goes for 5G. If we can't have unlimited (or close to unlimited) data, then new usage can't really develop.
Edit: typos.
Isn't that his point? Others complain that Apple doesn't have a 5G iPhone ready. Part of Cringely's point is that it doesn't make sense to penalize Apple for the lack of 5G support when there's no infrastructure available.
And we are "soon" moving to 6G anyhow, maybe I'll skip 5G :) https://www.oulu.fi/6gflagship/
Latency? That sounds nifty for something like video conferencing. Alas, the bandwidth caps are there anyway.
Faster speed just means a random website can burn through your Verizon cap that much faster. "Unlimited" has been and remains a willful lie and deception by wireless companies; the fine print always says the opposite.
First, 5G is the future, it may not be important at first but it will eventually change. Is a bit of future proofing too much to ask of a high end smartphone?
Second, 5G is not just about maximum bandwidth, it also gives you access to less saturated frequencies.
The lack of 5G is a very big deal, especially when smartphones vendors struggle to find even minor differentiating factors.
On the technical side, I think 5G has a serious problem of being blocked by just about anything at those frequencies. I'm sure the RF wizards will get it sorted out, but I imagine early adopters are going to suffer with garbage signal indoors.
So really, 5G is a somewhat abstract standard from the ITU-R that specifies requirements, and the industry then delivers a suite of technologies that are deployed together and marketed as 5G. In the case of 4G, the same was true at first, but then non-conforming LTE was cobbled together as a series of progressive enhancements of existing tech by the industry and marketed as "4G", co-opting the term in consumers' minds and making the prior formal definition irrelevant.
With the hype and anticipation over 5G rapidly intensifying, there's a chance the same might happen again, as networks are eager to brand any progressive enhancement as a true differentiating factor. It's enabled by the term "5G" having cachet with consumers, yet its technical specs are relatively obscure so average customers lack meaningful information and recourse to challenge the marketing.
Besides, most of the people I know who talk about future proof phones, tend to be the same people who upgrade every year or two.
Innovation, and bringing it to market, has been sorely lacking from Apple of late.
- made fingerprint scanning fast and easy to use (and then removed it)
- pioneered facial recognition that is robust against being fooled by images and masks, and also robust to the user's face slowly changing (e.g. growing a beard)
- created their own SoC that allows for solid security, excellent power management, and fast graphics
- made security a feature
- pioneered augmented reality with ARKit
- pioneered watch-based EKG
- pioneered watch-based glucose measurements (apparently the research didn't work out, though)
- researching self-driving cars (rumor has it)
- pioneered fake bokeh with a phone camera
- maintains net profit margins of 40% (!! even Coca-Cola is only around 30%; margins of this sort are not easy to maintain for multiple years)
You could argue that Apple is not the first to do many of those, but it wasn't the first to make a phone-computer, nor was it the first to make watch-computer, yet that counted as innovating. So I don't think it would be fair to say that security, SoC, AR, self-driving cars (allegedly) is not innovating. It's just that the innovations aren't category-defining like they used to be. But category-defining innovations aren't a dime a dozen. And we don't know if Steve Jobs could have continued creating category-defining products, either.
Apple quietly but incessantly evolves their products so that today's major new features become tomorrow's minor, expected ones. We quickly forget how amazing something is because we rapidly move from the "ooh ahh" phase to the "meh" phase thanks to the relentless, predictable pace of evolution. Every year, something replaces the new and shiny thing from last year, and the world somehow manages to forget how far we've come in short, measurable periods of time: a year, two years, five years, a decade…
Meanwhile, Apple brings out the iPhone in 2007 and the iPad in 2010, and it's somehow expected that they'll do similar revolutionary shake-ups every few years because that, apparently, is their modus operandi. Plenty of digital ink has been spilled already to say that no, this is not the way Apple does things. The iPhone, the iPad, and to a lesser extent (but with no less validity) the Apple Watch have vastly changed the technological landscape to be sure, but it's the slow, quiet evolutions on top of the noisy revolutions that truly show what innovation can do.
Innovation isn't just inventing new product categories. Innovation can be taking something that already exists and making it better. It can be making something more accessible, more friendly, from beyond the grasp of the technological elite and into the hands of mere mortals. It's making things smaller, faster, and more convenient. It's evolving what we already know into what we'll soon come to expect.
Since Steve died, we've had massive improvements to the way macOS and iOS communicate to get jobs done more quickly and efficiently, with less hassle when switching devices in a modern mobile lifestyle. We've seen tremendous changes in the way people think about health and fitness (and the fundamental means of communication) with Apple Watch. We've seen how Apple aggregates content from multiple providers to deliver a more seamless experience, from browsing TV shows and movies on Apple TV and iOS to delivering the news in a fun, friendly way. Most recently, we've seen how professionals can use the iPad, leveraging the simplicity of iOS with powerful custom processors, without having to deal with the complexity of a desktop OS or the battery drain of Intel's silicon. And we've seen foolproof facial recognition become not only widely available but, once the iPhone 7 and 8 stop selling, the standard (one that plenty of competitors try, and fail, to emulate). There's a lot more I could write but so little time.
If I sound like an advertisement, it's because I'm trying to impress upon you that I haven't even begun to list the fields in which Apple has innovated since Steve died, but the accomplishments so far have been enormous enough that if tomorrow the world suddenly reverted to what was available at the end of 2011, much of the world would sorely feel it.
And to add to this, it can be said that what Apple does today, the rest of the industry does tomorrow. All of those tiny evolutions seem to be mirrored in Android, Windows, Linux, and other systems taking their cues from iOS, macOS, watchOS, and tvOS. Apple does the same thing, as any company does, taking what's successful but adding their own spin on it to adopt, to evolve, to innovate.
As Phil Schiller put it, "Can't innovate anymore, my ass!".
I think this was it: https://www.saferemr.com/2017/09/5G-moratorium12.html
When people hear that 5G plans call for 300-meter cell spacing, they panic, thinking the power levels will increase. In reality they don't, because transmitter power is reduced commensurately.
A lot of people (even scientists) do not understand this basic concept. Current spacing and power levels in an urban area are roughly 2 km at 40 W (typical, but it varies) for low and mid band (1–3 GHz). If you reduce the spacing to 300 m and the power to 5 W, you really have not changed the energy density. But what you accomplish is more capacity, by enabling more re-use of the frequency spectrum.
The numbers are different for different bands due to different propagation properties, but that is the basic concept.
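A crude illustration of that re-use gain. The square-grid geometry and the city area are my assumptions; the spacing numbers are the ones quoted above:

```python
# Each cell re-uses the same spectrum, so capacity scales roughly with
# cell count over a fixed area. Crude square-grid model over an assumed
# 100 km^2 city.
area_km2 = 100.0

def cell_count(spacing_km):
    return area_km2 / spacing_km ** 2

old_cells = cell_count(2.0)   # 2 km spacing  -> 25 cells
new_cells = cell_count(0.3)   # 300 m spacing -> ~1111 cells
print(f"re-use capacity multiplier: ~{new_cells / old_cells:.0f}x")
```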
The concerns seemed legit; I wonder why the industry is either not paying attention to them or at least not responding to them.
Of course, the SaferEMR folks dismiss it and all of the safety testing and health risk studies that have been done. Their official stance is that we haven't demonstrated that there couldn't be some unknown mechanism by which EMR could negatively impact human health so we should curtail all wireless technology until we have proven that there is no possible down side. This is of course impossible.