Which is a vast majority of the US population. I agree that it is important to get broadband access for everyone, but asking companies to hold back services and features until that happens makes zero sense.
But even if you're working from somewhere else, remote desktop doesn't need "high speed" anyway. It sends intercepted OS-level draw commands rather than bitmaps, so as long as you're not doing Photoshop it's closer to terminal bandwidth than YouTube bandwidth.
For the vast majority of business apps over remote desktop, whether your internet is high speed or not is a complete non-issue.
And the basic latency (e.g. for typing text or clicking a menu) is the same as terminal latency too.
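A back-of-envelope sketch of why draw commands are so much lighter than pixels (the figures below are illustrative assumptions, not measurements of any real RDP session):

```python
# Compare redrawing a small 200x20 px menu region as a raw 24-bit
# bitmap vs. as a handful of OS-level draw commands (fill-rect,
# glyph runs, etc.). Sizes are rough illustrative assumptions.
bitmap_bytes = 200 * 20 * 3      # raw RGB pixels for the region
draw_cmd_bytes = 10 * 32         # ~10 commands at ~32 bytes each (assumed)

ratio = bitmap_bytes / draw_cmd_bytes
print(f"bitmap: {bitmap_bytes} B, draw commands: {draw_cmd_bytes} B "
      f"(~{ratio:.0f}x smaller)")
```

Even with generous per-command overhead, the command stream is orders of magnitude smaller than shipping pixels, which is why ordinary business-app UI feels closer to a terminal session than to video.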
You're still right -- the performance is good, and with normal UI controls you will typically not notice even a very poor connection. But this is because UI changes are an "eventually consistent" deal: although the Windows Terminal Services implementation does send bitmaps, it coalesces drawing operations and sends only the resulting bitmap. Contrast that with scrolling text on a terminal, which may not update in a manner that lets you react to it.
Source: somewhere on the FreeRDP mailing list. Sorry :(
If "terminal latency" is the high-water mark, it doesn't bring pleasant thoughts to mind. I find typing over ssh irritating. Not awful, but a clear downgrade from a local terminal.
In Akamai’s 2017 report, Maryland’s percentage of connections above 10 Mbps (easily enough for Remote Desktop) was 79%, just below South Korea and above Switzerland. Virginia and the District itself were around 75%, the same as Switzerland.
There's plenty of rural on septic systems and well water just an hour's drive out of DC.
This is for companies with enough employees who are building on the Microsoft stack and the Office suite. God knows we'd rather they deal with their updates themselves. And besides, the way things are going, virtualization is becoming cheaper than buying hardware...
Aside from that, I don't see many benefits. The company still needs to provide each employee with a device, plus training, so I don't see many savings on that side. Companies going for cheap devices will still go for cheap devices, while those looking for higher-end devices won't settle for cheap alternatives. Besides, SSDs and CPUs are now very cheap, so you can get very good configurations for little money. On the other hand, batteries are still expensive, and virtualization might drain them more than using the local CPU, depending on the use case.
If the added value is the easier configuration and maintenance for IT (on the software side, as devices still need maintenance), then ChromeOS seems like a much better approach. Instead of delegating the whole (virtualized) OS to a third party, the customer only transfers the management of the services. And to be fair, if security is the main concern here, ChromeOS still seems better positioned, both because data is in the cloud and because of how locked down the OS is.
At the end of the day, WVD seems like a way to also sell Azure services on top of Windows. So I wouldn't be surprised if companies evaluating this option see an increase in costs.
At this point I wonder if it's really worth it, especially considering that if for any reason the company can't keep paying Microsoft for the service, or is cut off from it (e.g. a non-US company hit by sanctions), the whole company is basically back to paper in a matter of minutes, while non-virtualized solutions offer a much bigger safety net.
You mention that CPU, SSD, and memory just keep getting cheaper, and you're right. You can get a pretty amazing laptop for not much more than an iPad Pro, and it will handle nearly any workload. The problem is that now you have to carry that, and all of the batteries and accessories, around. It's so much easier for me to bring my 11" iPad Pro with a keyboard case and remote into a desktop environment for those workflows that require either a high-end machine or Windows. I imagine a Surface would be even better.
Personally, I just don’t want to carry a massive laptop to get compute if I can access it remotely just fine. So that’s what I do. I have multiple VMs running Windows or Ubuntu on a high-end Ryzen machine with multiple Nvidia GPUs. That’s significantly more power than I could ever hope to bring with me on the road. For me the only major setback is screen real estate. I'm hoping one day some of these HMDs can fix that.
The whole concept would, of course, enrage certain sections of the populace including most of HN's readership, but I can see it happening.
My experience with a Steam Link, hardwired on local Ethernet, makes me skeptical that any of these cloud gaming platforms will really be practical.
I have to imagine that at that point at least some of the latency is the cost of encoding the video.
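As a rough sanity check on that intuition, here's a hypothetical glass-to-glass latency budget for a streaming setup. Every figure is an assumed, illustrative number (no specific service or hardware is being measured):

```python
# Hypothetical per-frame latency budget for game streaming,
# in milliseconds. All values are illustrative assumptions.
budget_ms = {
    "capture": 8,           # grab the rendered frame
    "encode": 10,           # hardware video encode
    "network_one_way": 5,   # wired LAN hop (much higher over the internet)
    "decode": 5,            # client-side video decode
    "display": 8,           # vsync / display pipeline
}

total = sum(budget_ms.values())
print(f"added glass-to-glass latency: ~{total} ms "
      f"on top of the game's own input lag")
```

Even on a local wire, where the network term is nearly free, the encode/decode/display stages alone add a few frames' worth of delay, which is consistent with streaming feeling sluggish despite a fast connection.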
The video is easy to fix with a fast network or Apex PCoIP accelerator cards, although those are being replaced by expensive server-side GPUs these days. We have some Google Earth users and a few others with GPU tasks. A couple of Apex cards in a couple of the servers took care of that.
The only thing that completely sucks is USB storage devices. They are painfully slow. Most people accept it because they are rarely used in our environment and everything else is faster than any other system they’ve had.
Why not? Everything old becomes new again at some point.