Windows 10 was the giant step in that direction: a single, always-updating, maintenance-mode OS that MS will need to support. Arguably they won't be entirely there until Win 7 dies out, but once they are, maintaining Windows is a much simpler affair than in years past.
None of this means Windows itself is dying - it only means the OS has matured enough for present needs, and any future needs will be addressed as and when they arise. Instead of inventing the future, they will adapt to it on their own schedule.
This allows Microsoft to focus on things that truly matter for the future without killing Windows or spending a lot of resources on advancing/supporting it. It's not hard to imagine Windows stays a very dominant client OS for a long time to come - even if the overall PC market share continues to decline, because none of the alternatives are there and nobody is going to invest the resources to get them there.
I will also add that Satya's success lies in speeding up this strategy that started late under Ballmer and also in executing so well on it. Considering the scale of what MS is doing - Office ports to iOS, Android, Windows 10 releases, Azure, ton of other Enterprise stuff (Exchange, O365, Development tools, cross platform stuff like .NET Core, VS Code etc) - the change of course and the success they are having with it all - you can't argue it's not a phenomenal achievement.
* In 2018, the most popular resolution for users of Firefox is still 1366x768. And only 1920x1080 is making any headway against the dominance of 1366x768. As much as I am surrounded by the culture of multiple 4K+ displays, apparently this is not at all commonplace. 4K doesn't even get listed, presumably lumped into the "Other" category.
* In 2018, the most popular memory amount for users of Firefox is still 4GB, trailed by the also disappointingly small 8GB. In my world, I consider 16GB a bare minimum, but at least with memory I haven't been deluding myself—I know many people economize on memory. Still, I would have thought 8GB was the most common memory size by now.
* A surprising number of people still have Flash installed.
Most real-world tasks don't need multiple 4K displays, including low-level and systems development. Most people read stuff, and as a developer and system administrator, if the text looks good, the resolution is enough.
> In 2018, the most popular memory amount for users of Firefox is still 4GB, trailed by the also disappointingly small 8GB. In my world, I consider 16GB a bare minimum, but at least with memory I haven't been deluding myself—I know many people economize on memory. Still, I would have thought 8GB was the most common memory size by now.
8 GB is more than enough for most people. My family's Win10 desktop is happy with 4GB, my office desktop is cozy with 8GB. My own desktop has 16GB of RAM, but it runs many, albeit small, virtual machines. "The hardware is cheap, let's waste it" mentality doesn't help anyone and it's wrong. I've written some state of the art algorithms which use 1.5MB of RAM and make the CPU scream for cooling (I develop high performance computing software as a side-academic gig), so like every resource, RAM should be used sparingly.
Edit: I've no comment for flash. I think it's a forgotten remnant of old systems.
The other footnote should be that display prices have been crashing recently. You can get an IPS 24" FHD monitor for like $90, and a QHD version at 27" for about $150. Those would have been twice as expensive a few years ago.
That being said, all those 768p screens are on crappy plastic laptops with really slow hard drives. That, I guess, is what we ended up with because Intel took what should have just been the natural evolution of notebooks - small SoCs driving a high-PPI display in a metal frame - and made it into a premium brand-name product, with huge margins on chips that cost peanuts to manufacture, because they didn't have any real competition in the space for a very long time (and even now, their monopolistic behavior lets them keep AMD out of every major brand's premium notebooks anyway).
You're right, however not everyone has the desk space to accommodate a 27" panel. I can barely fit a 24" on my desk, and 1440p monitors start at 25".
> The other footnote should be that display prices have been crashing recently.
When I was in the US at the end of 2014, one of my friends said the same thing about flash drives when I pointed out a $44 PNY 128GB flash drive. Unfortunately, other parts of the world don't work the same way. Because the EUR and other currencies aren't fixed against the US$, prices in most parts of the world fluctuate, if they don't outright increase. So no, in some parts of the world technology unfortunately doesn't get cheaper as it matures.
Addendum: BTW, you are right that 768p screens are generally found in entry-level laptops or netbooks. These devices were the most feasible ones when they first came out. Now they are bottom-end cash cows that are virtually free to build.
You can get 24" 4k (2160p) 180ppi 60Hz for $300 e.g. LG 24UD58.
Anything higher resolution-wise requires a much larger display to be readable at 100% scaling. I'm adamantly against using scaling.
Perhaps my experience would have been better on a desktop, but this was for work, where I have a Surface Pro: going from a docked to an undocked state (or vice versa) with monitors that didn't match the Surface's scaling resulted in graphical issues that could only be resolved by logging out and back in again.
Also still come across apps that don't know how to scale so that can be really frustrating.
I have to say you have it exactly backwards!
Gaming on 4K is extremely expensive and still basically impossible at refresh rates higher than 60 Hz. In fact, you’ll be lucky to get even that much. 1440p/144Hz is a much better and more realistic target for even the most enthusiastic gamers.
Also, a most welcome recent trend has been games shipping with novel temporal antialiasing techniques, completely redefining how games can look at lower resolutions.
Temporal artifacts have always been the bane of 1080p, forcing higher resolutions either directly or indirectly (as supersampling). Once you take that out of the equation, the benefit of native 4K is much more modest.
4K movies are nice, but as with games, it’s more of a linear progression. I doubt most people could even tell the difference in a blind test.
Full-range HDR is, in my opinion, a much better investment if you want to improve your TV viewing experience (and lately gaming as well) in a noticeable way.
I don’t know much about graphic design, but I doubt 4K is all that essential. Everyone has been using 96 dpi displays to create content for very high density mediums for a long time. Even the most craptastic inkjet printer is 300+ dpi. All you need is a zoom function. Color reproduction is, I think, much more important than resolution.
Where HiDPI displays really shine is actually in the most mundane: font rendering.
For anyone that works in the medium of text, programmers, writers, publishers, etc., a 4K display will be a considerable and noticeable quality-of-life improvement.
Even the most Unix-neckbeardy terminal dwellers will appreciate the simply amazing improvement in visual fidelity and clarity of text on screen.
> I tested out a 4k 28" and unless I had it at 200% scaling couldn't use it for longer periods of time.
That’s what you are supposed to do! :)
It’s only HiDPI at 200% scaling. Otherwise it’s just 2160p, or whatever the implicit resolution is for some partial scaling value.
For 4K at 100% scaling you’d need something like 45" screen at minimum, but that’s not actually practical once you consider the optimal viewing distance for such a screen, especially with a 16:9 ratio.
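For a sanity check on that 45" figure, here's the back-of-the-envelope arithmetic (my own sketch, assuming the usual ~96 dpi baseline for 100% scaling):

```python
import math

# A 3840x2160 panel has ~4406 pixels along its diagonal. To land at
# ~96 dpi (Windows' 100% scaling baseline), divide by the target dpi.
w_px, h_px = 3840, 2160
target_dpi = 96

diag_px = math.hypot(w_px, h_px)   # diagonal length in pixels, ~4406
diag_in = diag_px / target_dpi     # required diagonal in inches

print(round(diag_in, 1))  # → 45.9
```

So roughly a 46" panel, before even considering viewing distance.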
> I get more screen estate and still readable text with the 25" 1440p.
A 4K display should only provide extra space indirectly. With text on the screen looking so much sharper and more readable, it might be possible to comfortably read smaller font sizes, compared to equivalent 96 dpi display.
If you need extra space as well, then that's what 5K is for.
Though for things like technical drawings or detailed maps you can actually use all the extra 6 million pixels to show more information on the screen.
A single-pixel–width hairline is still thick enough to be clearly visible on a HiDPI display.
> but now you have to render 2.4x (Vs a 2560x1440 monitor) the amount of information for what I think is fairly little gain.
Yes, that’s an issue with things like games. However you can still display 1080p content on a 4K screen, and it looks just as good, and often even better.
Most graphics software will also work with 1080p bitmaps just fine. Vector graphics necessitates a little extra work, but for a very good payoff.
Overall though, for things like programming or web browsing, it shouldn’t matter. I have a netbook with a cheap Atom SoC (Apollo Lake) and it can handle 3K without breaking a sweat. That much more capable GPU on your Surface Pro should easily handle even multiple 4K displays.
Pushing some extra pixels is not a big deal, if all you’re doing is running a desktop compositor with simple effects.
> going from a docked to undocked state (or vice versa) with monitors that didn't match the same scaling as the surface resulted in graphical issues that can only be resolved by logging out then in again.
Yeah that must suck. Still, it’s only a software bug, and you mustn’t let it keep you from evaluating HiDPI on its merits.
> Also still come across apps that don't know how to scale so that can be really frustrating.
That’s life on bleeding edge ;)
Sure, it’s annoying, but the situation is a lot better than it used to be. Even Linux is doing fine, at least if you stick to recent releases. Some distros like to ship outdated software for some reason :/
Still, in my opinion, the quality-of-life improvements of a HiDPI display very much outweigh the occasional inconvenience. Though obviously, YMMV.
 Assuming you’re viewing at optimal distance.
 With the notable exception of 96dpi native pixel art.
4K has exactly 4 times as many pixels as 1080p, so it shouldn’t be an issue in theory. Nearest-neighbor will give you exactly what you want.
However, in practice you need to force scaling in software; otherwise graphics drivers, and most monitors' postprocessing, tend to default to bicubic scaling. That said, pixel art is not computationally expensive, so it's mostly just an inconvenience.
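The integer case really is trivial. A minimal illustration of nearest-neighbor upscaling (my own toy code, not any particular scaler's), where every source pixel becomes an n×n block:

```python
# Nearest-neighbor integer upscale: each pixel value is repeated n times
# horizontally, and each row is repeated n times vertically. At n=2 a
# 1920x1080 image maps exactly onto a 3840x2160 grid with zero blur.
def upscale(image, n=2):
    out = []
    for row in image:
        wide = [px for px in row for _ in range(n)]
        out.extend([list(wide) for _ in range(n)])
    return out

art = [[1, 0],
       [0, 1]]
print(upscale(art))  # → [[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 1, 1], [0, 0, 1, 1]]
```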
 You can use advanced scaling algorithms to upscale 1080p to 4K and it usually looks great. E.g. MPV with opengl-hq profile or MadVR on Windows. For that you’ll need something a notch over integrated graphics though, e.g. RX 560, GTX 1050 and on mobile Ryzen 2500U or equivalent.
I'm with you on accurately calibrated monitors though! God most of them suck out of the box.
I’m not being sarcastic. The last time there was a thread like that on HN a bunch of people figured out they need glasses.
I have a 4K @ 24in monitor (180ppi) and a 267 ppi netbook and when I switch between them the 4K starts looking like a blurry mess!
It's fair advice, but some eyesight issues cannot be solved with glasses, if they can be solved at all.
Also worth noting that with TVs the distance matters a lot. With monitors, laptops, and gadgets it is relatively stable.
Don't get me wrong, I can see a difference, but not nearly as night and day, especially when it comes at the cost of other features (eg refresh rate... which isn't the end of the world for coding so if it's the only thing you do on the monitor it could be worse... otherwise ouch my eyes.)
What scaling is that? 200% scaling is what you should have, 4K is exactly 4x as many pixels as FullHD. If someone is using lower scaling then they are trading sharpness for virtual space.
Both displays are around 200 pixels per inch, plus or minus. It's great not having to see the pixels, so much more pleasant and easy on the eyes.
Also the combination of a portrait display with the landscape display is really nice. I can read an entire PDF page without scrolling.
The original Windows "standard display" was assumed to be around 96 DPI. That's the monitor that 100% scaling (i.e. no scaling) is intended for. Round the 96 up to 100 and we can say that in rough terms, the percentage scaling should be in the neighborhood of the monitor's DPI.
So monitors in the 200 DPI range are best at around 200% scaling.
A 28" 4K UHD has 157 DPI, so I wouldn't want to try it at 100% scaling - ouch. It ought to be running in the 150-175% scaling range.
The idea with a high-DPI monitor isn't to make everything smaller on the screen, it's to make everything sharper and more detailed. When you double the DPI and scale appropriately, you get four times the number of pixels for everything you put on the screen.
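That rule of thumb fits in a few lines - a rough sketch (the step list mirrors Windows' usual scaling presets; the snap-to-nearest logic is my own simplification):

```python
# Scaling percentage ~ dpi / 96 * 100, snapped to Windows' usual presets.
STEPS = [100, 125, 150, 175, 200, 225, 250]

def suggested_scaling(dpi, baseline=96):
    raw = dpi / baseline * 100
    return min(STEPS, key=lambda s: abs(s - raw))

print(suggested_scaling(157))  # 28" 4K UHD: raw ~164 → 175
print(suggested_scaling(185))  # 24" 4K UHD: raw ~193 → 200
```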
That’s not how it works. Lower dpi does not somehow give you more real estate!
You should still be running with ~200% scaling because you are viewing it at a greater distance.
Optimal viewing distance, assuming 16:9 ratio, is 120 cm vs 140 cm for 24" vs 28", respectively. Accounting for the difference gets you ~155 ppd with both monitors, maintaining 25.0° horizontal viewing angle.
The closer your viewing distance, the more ppi you need for the same density. That 28" is not inferior to the 24" when you account for distance, despite the lower ppi, because the greater viewing distance makes the pixels look smaller, which makes the perceived image denser.
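If you want to check those numbers, the geometry fits in a few lines (my own approximation, assuming a flat 16:9 panel viewed head-on and centered):

```python
import math

# Pixels per degree: horizontal pixel count divided by the horizontal
# viewing angle the screen subtends at the given distance.
def pixels_per_degree(diag_in, h_px, distance_cm):
    width_cm = diag_in * 2.54 * 16 / math.hypot(16, 9)  # 16:9 panel width
    fov_deg = 2 * math.degrees(math.atan(width_cm / 2 / distance_cm))
    return h_px / fov_deg

print(round(pixels_per_degree(24, 3840, 120)))  # → 154
print(round(pixels_per_degree(28, 3840, 140)))  # → 154
```

Both land at ~154 ppd and the same ~25° horizontal angle, which is the point: at its greater distance the 28" looks just as dense as the 24" up close.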
Also, apps that don't scale properly are a pain haha.
You need a higher PPI to make anti-aliasing work on screen (finally looking nearly as nice as print).
Maybe as prices come down and more transistors end up in basic chipsets, we'll see two- and three-headed 4K display setups become common.
I wouldn't trade my three 1080P 24" panels for 4K panels, unless the goal was to sell the 4K panels and rebuy the 1080P 24" panels. I don't do a lot of gaming anymore, but they're hardly a horrible experience in that regard either.
I think at this point in the market, I'm going to stick with 16:9 native resolutions. If I do any swaps, I'll probably try out dual 27" 4K panels (also 16:9, great for 1080P content), with one mounted below the other. That'll be pretty nice in time as 4K becomes better supported.
Personally, I've just upgraded from 1680x1050 to 1920x1080. My old monitor was 10 years old, and was performing relatively well. I bought a new one, because it started to show its age (mostly backlight age).
I don't remember anyone having any problems running the latest chrome or Firefox. I think people even managed to run things like SolidWorks or Matlab perfectly fine on those machines.
It's been about 8 years since I last had a laptop without at least 1080p I think.
Because the latter is what most people buy for, as far as I can tell.
That said, I suspect most people over the age of 35 or so can't read text at "normal" size on a 1080p 12inch screen, so if they want a 12 inch screen they have to either scale up their text or drop the resolution. And I believe the reported resolution in the Firefox telemetry is the one the OS thinks it's using, not the one the hardware has.
I develop for an organization that serves people of mid-to-lower economic strata. 10% of our users are still on Windows XP.
We get caught up in the tech sector echo chamber and don't realize that upgrade cycles for millions of people are a lot slower than we are used to.
I think it's not surprising to see what Firefox is reporting. A lot of people go to Firefox seeking something that runs faster or better than what came with their Windows box years ago. Most of the people we serve on low-end computers use Chrome, not IE/Edge.
A friend of mine is thinking about buying a new laptop. We were looking together at what's available in the sub 400 Euro range, basically everything is 4 GB and 1366x768. Add some 200/300 Euro and 8 GB and 1080p start to appear in the lists. Guess what people buy most? The cheapest stuff.
By the way, the new laptop must be Windows, because anything different would be too difficult to cope with after 30 years of training. He's replacing the old one (only 2 years old) because it's getting too slow. Probably a malware party, but disinfecting that machine is a hopeless effort. That one and the previous ones were periodically cleaned and/or reinstalled, but there is no way to change those ingrained clicking habits. No adblocker, because my friend wants to see ads, and they could actually be useful in his business. I told him that he'll be back with a slow PC soon and that he's wasting his money, and I've been wasting my time.
My daily machine at home is still a 2011 Macbook Air with 4GB of RAM, which is admittedly not enough. My work machine was a 2012 Retina MBP with 8GB which was PLENTY for all of my everyday needs except when running VMs. To this day, 8GB is enough for 'most' uses, and my work machines only have 16 to get me over that "8gb-is-not-quite enough" hump. But I've got retina displays everywhere, and a 5k iMac at home. No clue how much memory it has, to be honest, but 32 is insane for not just the average user but even most power users unless they have really specific needs.
why would you need that much ram? are you running 3+ VMs all the time?
This is the weirdest part for me, as I've never met a person who'd like to see ads. Installing an adblocker usually seems to be a revelation for people.
Possibly a good candidate for just doing a reformat?
That being said, my parents are perfectly happy using Google chrome on 2gb ram with a core 2 duo. And that's not with only a few tabs.
It may come as a shock to you but there exists a world outside of California/London/random-tech-hub
£670 is much more than people I know want to spend on a new machine. None of them have ever mentioned screen resolution when i've asked what they want in a laptop.
To me, the hardware report shows technology is quite stagnant on display resolution and memory. 1366x768 is a resolution from 2002 or thereabouts. 4GB was common in desktop computers in 2005. ~15 years have passed and consumers in the $750 price range are still hamstrung with 1366x768 and 4GB of memory? I would expect better from a $750 phone, let alone a computer.
I think we're actually talking about computers that are very old and have never been upgraded or new computers in the $200 to $400 range.
I think there's a huge disconnect between what you think is "common" and what people actually have. The Steam hardware survey from 2009 says only 10% of Steam (gaming) users had 4GB of RAM or more. I'd estimate that in 2005 that figure was less than 5%. Saying 4GB was common in 2005 would be like saying 64GB is "common" today.
In 2002, the most common resolutions were 4:3, not 16:9 - partly because notebooks were not yet mainstream.
> 4GB [RAM] was common in desktop computers in 2005.
That's the time when notebooks became mainstream iirc. I purchased a gaming notebook in 2006, it had 1 GB RAM. I also bought a midrange notebook in 2009 which also came with 1 GB. (I later upgraded to 2 GB aftermarket.)
Sure. I didn't really intend to get into fine-grained technical details, but yes, back then 4:3 was the predominant aspect ratio. In fact, some of the 17" LCD displays from the time were that awkward 5:4 resolution of 1280x1024.
But putting specifics aside, the broader point is that computing technology—at least with respect to the most popular display resolution and amount of memory—is stagnant. In this thread there are people arguing for technology asceticism and making the point that 4GB and 1366x768 are good enough. And maybe they are for some people. I am surprised there are so many who are satisfied with this particular flavor of "good enough." (Though I would like to see how well that correlates with mobile device technology asceticism.)
I want the hardware industry to do better, and I thought things had started improving a few years ago when 4K displays emerged and the shackles of "High Definition" (1920x1080) were cast aside. But this data shows adoption of 4K displays for computing is so low it's not even measured as such. That disappointed me, but it's unsurprising. What surprised me was realizing 1920x1080 isn't the top dog I thought it was—in fact, the situation is worse than I thought.
And to my mind, all of this feeds into the subject at hand: an operating system vendor getting stuck. The embrace of "good enough" is suffocating.
If anything it's worse now because the RAM is soldered in and they're using BGA ICs which makes it impossible for most people to upgrade.
There's quite a lot of these, e.g. HP Stream; the Windows license is much cheaper for these systems because they have such low specs.
So yes, the innards of your 2017 MBA are very old. It has Broadwell generation (circa 2014) CPU/GPU, and one optimized for power draw at that (i.e. a slower one).
I've used for years a CoreDuo system+4GB with Firefox then PaleMoon playing videos in FullHD on 23 and 27 inches monitors, and never got even close to hit the swap partition unless I was running virtual machines or heavy games. Now I'm on a i5 Debian+XFCE with 8gigs, one Waterfox (privacy oriented Firefox derivative, thanks to the HN user who pointed me at it) instance with 4 tabs opened and a shell with 2 tabs; here's free output:
             total        used        free      shared  buff/cache   available
Mem:       8037636     1194044     4182928      187332     2660664     6555172
Swap:      8249340           0     8249340

             total        used        free      shared  buff/cache   available
Mem:       8037636     1453204     3761776      280824     2822656     6202520
Swap:      8249340           0     8249340
 ~$ uname -svo
Linux #1 SMP PREEMPT RT Debian 4.9.30-2+deb9u5 (2017-09-19) GNU/Linux
(a bit older RT kernel needed for low latency audio)
Of note, 8GB is most popular with 1920x1080 resolution.
My main computer has 8GB RAM. I don't know why I would need more than that. Right now I have firefox with 15 tabs, qt creator and 3 terminals open and I'm only using 2.5 GB of RAM. Unless I suddenly wanted to get into heavy video editing I don't think I'll feel the need to upgrade soon.
I think a good SSD and GPU are much more important than having a lot of RAM if you want to improve general user experience.
It seems that most people who make websites nowadays think similarly. I don't get why they are making websites for themselves, rather than for their users.
Because most people making websites don't care about basic usability. Even though they have stats that show their users use small laptops, they'll still stick a fixed header that takes 1/5 of the screen real estate on the page because it's the latest bootstrap/material-design fad. Google does this a lot, which is ironic given their pushing of "AMP" on mobile - they can't even give regular people a good experience on their laptop.
The laptop I'm typing this on is exactly that: A Lenovo Ideapad from 2010 with 4GB and 1366x768 resolution, running Bunsenlabs Linux and the latest Waterfox (Firefox fork with telemetry removed). It's blazing fast even without a SSD; the browser starts in the blink of an eye and it takes nearly 70 tabs open to start slowing down.
Yes, the resolution can be limiting for some websites, but most are designed well enough to work fine on it. In fact, on text-heavy sites like HN I find myself browsing at 120% magnification for less eye strain. I couldn't imagine what 1080p or higher on a 15" screen would feel like, but it would likely be painful. It's actually more comfortable to read on than my iPad mini for that reason as well, and of course typing is a dream compared to any other laptop I've used, to say nothing of klunky touchscreen keyboards.
Granted, it would be slower with a more bloated Linux distro and certainly was with Windows 7, but when it comes to the OS I prefer minimalist and stable to kitchen sink and bleeding edge.
Moreover, even apps that do support HiDPI often look like crap at 125% scaling -- the default setting on Windows 10 for most 1920x1080 laptops these days. And if you go up to 150%, you only get 1280x720 effective pixels. That's even less information density than 1366x768. Most legacy Windows apps expect to be able to use at least 768 vertical pixels.
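The effective-resolution figures above are just physical pixels divided by the scale factor - a quick sketch:

```python
# Logical (effective) resolution after display scaling.
def effective(w_px, h_px, scale_pct):
    factor = scale_pct / 100
    return round(w_px / factor), round(h_px / factor)

print(effective(1920, 1080, 125))  # → (1536, 864)
print(effective(1920, 1080, 150))  # → (1280, 720)
```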
The search turned out to be more difficult than I expected, because most high-end PC laptops with good build quality had already transitioned to 1920x1080 by early 2016. I finally found the perfect model for my father and told him that he should expect to use it for a long, long time since it's the last laptop that will ever support his apps. Fortunately, it's a well-built Broadwell i5 with a number of upgradable components, so it should last well into the early 2020s.
Most just don't have a choice. Plenty of laptop models top out at 8GB, with 16GB either confined to larger form-factor machines or so outrageously expensive that no one in their right mind would bother.
I mean, look at the current line of macbook pros, you simply can't get 16GB without going up to the 15-inch model.
Websites become a problem on slow internet, which is surprisingly common in companies. This is because they are often located in cheap rent areas with little infrastructure.
The screen issue does bug me though. Split screen isn't just a haxxor feature anymore - all basic Windows users can leverage it to put two applications side by side - and on a 720p screen there really isn't enough space to do this practically. I think 1920x1080, or at least 1600x900, should be the standard for 2018.
It's included in the university fees and there's no way of opting out of it, so I had to take it. Not only do they overcharge us for it, finding a seller is hard for it too, so I decided to wipe it clean and use it as my daily driver.
Works great on my iPhone X and iMac Pro? Perfect!
Could the chart be reporting CSS pixels rather than hardware pixels? Macs are only 7% of total Firefox users but the majority of Mac hardware has a "Retina" display with a higher resolution than these.
BTW, 2560x1440 is listed at about ~1%, but it's not visible because the line and text are black on a black background.
EDIT: https://www.asus.com/us/ROG-Republic-Of-Gamers/ROG-SWIFT-PG3... 100Hz is awesome, even for just browsing the web.
Great monitor though. Both for gaming and for general usage.
In 2018, my computer only has 2GB RAM and runs more than fast enough for what I need on a 32-bit processor. I don't understand why some people always want bigger hardware numbers, rather than better software.
> * A surprising number of people still have Flash installed.
I do too, I don't see how it's abnormal.
I can barely conceive of how narrow-minded and myopic your technological experience must be that you feel only the latest and "greatest" is viable. As someone who browses the Web regularly, I spend a lot of time on places like HN, YouTube, etc. My screen resolution is 1440x900 - one step above the 1366x768. As I said, 2GB RAM, etc. This is why I've argued, constantly, against the nonsense of Chrome and "modern" Firefox. That's why users of old computers are going to Pale Moon if they know about it, or back to Internet Explorer, or just not upgrading to the newest Firefox.
Your technological experience is very much a high-class experience, but you have to understand that there are users all over the world and in all economic positions. All this push toward expecting the best and most modern only works to force out the common person - in America it's the poor person in the ghetto who still is well above the position of people in other, even less prosperous nations. Most people, if they are lucky, have a used <$200 laptop which is often a decade or so old. Library computers are often almost as old - I know two libraries which are still running their .5GB 32-bit Windows NT2000 computers as patron-access machines because they are in poor areas and simply cannot afford to upgrade.
People who have the view that everyone should have base access to the latest, greatest, and most wonderful hardware upset me greatly. I have one comment about it that seems constructive. If you think it's so important that they should have this, you should provide it. If you can't give every person who has an old reliable computer (that they can repair) the newest bleeding-edge tech (and give them service when it inevitably breaks), then you need to understand that they have valid concerns too - and help provide better software and help others like you to understand what reality looks like for most of the world.
Malware authors rub their hands gleefully when they see sentiments like this being spread online by people like you.
Don't do any of those suggestions. Disable multiprocess mode in the Firefox config if you find it eats too much memory, stick with the ESR if you're scared of upgrading and use Edge over IE.
In any case you must be at least airgapping it (right?), so I guess it's academic.
Edit: you might be onto something here. Running an old, outdated browser on an old, unpatched 32 bit windows xp box might trigger some anti analysis defenses on whatever malware is sure to come your way. Security through being too obviously insecure!
But I know three families in low-income situations for whom that sort of PC is the only one they have, because they got it from either an office or a school when the place was discarding computers. So no, it's not being air gapped. It's the computer the kids look up homework info on, it's the one the parents use to search for jobs, and more. They can't afford to upgrade, and they're afraid that even upgrading the software would make the whole thing come apart at the seams.
Unless you just need your monitor for gaming. New-gen temporal antialiasing mostly makes up for the lack of pixels.
SSDs have an order of magnitude faster random access performance (both read & write) and you can measure that. NVMe drives additionally increase parallelism by a significant margin. Typical workloads are very sensitive to this metric. At least at cold boot you should see significant improvement in performance.
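You can get a feel for the pattern difference with a crude sketch like this (a real benchmark would use something like fio with direct I/O; this small file will mostly hit the OS page cache, so treat the timings as illustrative, not as drive measurements):

```python
import os
import random
import tempfile
import time

BLOCK, BLOCKS = 4096, 2048  # an 8 MiB file of 4 KiB blocks

# Write a scratch file of random bytes to read back.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(os.urandom(BLOCK * BLOCKS))
    path = f.name

def read_blocks(offsets):
    """Read one 4 KiB block at each offset, returning elapsed seconds."""
    start = time.perf_counter()
    with open(path, "rb") as f:
        for off in offsets:
            f.seek(off)
            f.read(BLOCK)
    return time.perf_counter() - start

offsets = [i * BLOCK for i in range(BLOCKS)]
seq = read_blocks(offsets)                         # in-order access
rnd = read_blocks(random.sample(offsets, BLOCKS))  # shuffled access

print(f"sequential: {seq:.4f}s  random: {rnd:.4f}s")
os.unlink(path)
```

On a spinning disk with caches cold, the gap between the two numbers is dramatic; on an SSD it mostly disappears, which is exactly the point above.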
Font rendering, likewise, you can just see. See the full page for more examples.
If you have good eyesight, you will see a difference. I could see how 4K to “retina” (>260 PPI) might not be worth paying extra to some (though I personally want the highest PPI I can afford), but 96 PPI to ~185 PPI is a world of difference. It goes from blurry to sharp.
That’s not all—because LoDPI text rendering often involves subpixel rendering techniques, HiDPI also brings significant improvement in contrast.
For crying out loud, some people even get headaches from the blurry subpixel rendering! 
Maybe that’s arguing semantics, but to me a waste of money is something like a $150 HDMI cable… Or 64GB of RAM you’ll never use. Things that you can brag about but that don’t actually improve your life in any way at all.
However, the $100 extra I’ve spent on a 4K vs FullHD monitor was, I’m certain, one of the best purchasing decisions in my life.
My 4K display makes spending time productively so much more pleasant, and so much less tiring, both on eyes and mind. And it just looks gorgeous.
The first time experience of watching BBC’s Planet Earth in 4K on this thing was mind-blowing. It must look absolutely incredible on an actual big screen with full range of HDR.
I also have a very cheap Chinese netbook with a 3K display (260 ppi), and I have, on occasion, used it to read books etc. I would have never even imagined doing any extended reading, of any kind, on my old laptop. The font would need to be huge to make that experience at all bearable. At least that’s an option for something like ePub or HTML, but a lot of things are available only as a PDF. On that little netbook it’s no bother to have multiple columns of tiny text on the screen, all at once, just like in a real book.
I wear glasses and with the old LowDPI devices I was always unsure if I need a new prescription or if it’s just the crappy display.
As for SSD I can strongly recommend getting one in addition to your HDD. For my workstation, I’m running a bcache setup (in writeback) with a 60GB old SATA3 SSD + 1TB HDD RAID1 and I’ve been extremely pleased with that so far.
I can only tell I’m not running an actual 1TB SSD right after I do a major upgrade. It takes at least one cold boot to promote files in the boot path to cache. Every time that happens it reminds me just how much faster SSDs are for certain things.
Nevertheless, you almost get to have your cake and eat it too. Bcache (or dm-cache, a fine choice as well) will intelligently sync all the randomly accessed and most used data with the SSD and stream anything bigger (like movies etc.) directly to/from the HDD.
In writeback mode it will also use the SSD to buffer random writes, until they can be written sequentially, all-at-once, to the backing device.
It makes both your SSD and HDD so much more valuable. Strongly recommended :)
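For anyone curious, a setup like that comes down to a few commands. A sketch only: the device names are assumptions, the UUID placeholder must be filled in from `make-bcache`'s output, and `make-bcache` destroys existing data on the devices it formats.

```shell
# Assumed devices: /dev/md0 = 1TB HDD RAID1 (backing), /dev/sdc = SSD (cache).
# WARNING: make-bcache reformats the devices it is given.
make-bcache -B /dev/md0    # format the backing device (exposes /dev/bcache0)
make-bcache -C /dev/sdc    # format the cache device; note the Set UUID it prints
echo <cset-uuid> > /sys/block/bcache0/bcache/attach      # attach the cache set
echo writeback   > /sys/block/bcache0/bcache/cache_mode  # enable write buffering
```

After that, you put your filesystem on `/dev/bcache0` and the caching is transparent.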
It’s something that could be solved, though. An explicit interface to promote certain extents would do the trick. So far I think only bcachefs offers something like that.
That’s why I have an adamant rule about never trying hardcore drugs or displays capable of refresh rates higher than 60 Hz.
Subjective. Depends entirely on the settings you are using and the resulting frame rate at that resolution.
However, it looks like Microsoft is isolating the online services from the client platform in order to de-prioritise Windows, which inherently makes Windows itself more replaceable. That creates a huge potential gap in the overall tech market, particularly once Windows 7 reaches end-of-support and a lot of people are going to be wondering about what they can do if they don't want Windows 10. (After all, if they did want Windows 10, why are so many people still not using it?)
There obviously aren't that many businesses that would be in a position to exploit such a vulnerability and start building not just a viable alternative client OS but a viable alternative ecosystem around it, but there are some. Apple could be a threat from one direction. Traditional big tech companies like IBM could be a threat from another. With the foundations available with modern OSS stacks, an unexpected alternative from (or sponsored by) the likes of Red Hat might be less likely but not totally out of the question either, or even a curveball from the likes of Intel.
So, what if one or more of them really did produce a compelling client platform and moved the whole industry back in the direction of local software and in-house operations? If the current focus on Cloud everything wanes, but Microsoft has voluntarily given up its long-standing dominance of the more traditional parts of the industry by neglecting Windows as a platform in its own right, where does that leave their online services five or ten years from now?
Maybe the likes of Satya Nadella are just much better connected in the big boardrooms than a mere geek like me, and they're confident that the emphasis on Cloud everything isn't going anywhere any time soon. But now we've been doing that for a while and the gloss has worn off a bit with all the real world problems of reliability, security, leaks, costs, and so on, I can't help wondering whether Nadella is mostly playing to Cloud-obsessed big investors and at some point those investors will prove to have been throwing good money after bad because they didn't know any better.
I've been thinking about this too but I don't see who would stand to gain from it.
If Wine got some serious commercial backing it could be leveraged to run all "legacy" Windows apps on another platform (Linux). Desktop Linux is relatively far along but with solid commercial backing it could become a serious contender. Or something more exotic...
The only question is: cui bono? Who would have a reason to go that route?
Someone like Valve, who has a massive investment in Windows software in areas that Microsoft has specifically targeted for "embracing."
What Microsoft doesn't seem to understand is that when they screw with Windows and deprioritize the needs of its users, they're also screwing with the rest of us who live, work, and play in their ecosystem. Gabe Newell, being an old-school 'softie himself, understands that very well... and he's one of the few interested parties with the funding to do something about it.
That is the part that baffles me. Visual Studio Code is a fantastic advert for the Microsoft brand: every day it shows every developer who uses it that MS can produce genuinely great software. It makes Azure more credible, as well as providing on-ramps to services. If people have a bad experience with Windows, why would they be inclined to get other services from MS?
Ironically, we decided not to try out VS Code specifically because of privacy/security concerns about Microsoft that have been heightened since Windows 10. We checked Microsoft's written policies to see what sort of telemetry data was being uploaded. We were unable to find any detailed information about it, and the general wording appeared to allow for uploading more-or-less anything, up to and including our own code.
Put `"telemetry.enableTelemetry": false` into your vscode settings and you are fine wrt telemetry.
What concerns me more is that the .rpm build is consistently later than the other builds. Usually that means the user stares at an update notification for 2 out of every 4 weeks, one that can't be acted on because the build isn't ready yet.
It brings back the memories of MSIE and WMP that used to be available for Solaris and HP-UX, always slightly later... until they weren't and the users were left in the dark.
... for the moment, as far as you know.
The problem is the trust thing, not the telemetry thing.
Yes, exactly, and trust is hard to earn but easily lost. Given the corporate philosophy of Nadella's Microsoft, and the fact that Nadella was made CEO in the first place when it was pretty clear what style of leader he was going to be, I don't see Microsoft ever regaining the kind of trust we used to place in it without fundamental changes that start at the highest levels. And sadly, as long as investors continue to buy into that vision, those changes are highly unlikely.
It's not often I feel so strongly about a business, but Microsoft's actions in recent years almost feel like a personal betrayal after being their customer for so long and trusting them as a keystone of the tech industry. In this case, I really do regard them as "the enemy" now, and I really do hope that someone else will drive a wedge through a gap created by their customer-hostile strategy with painful and expensive consequences for their management and investors.
The claimed benefits of the Cloud have always been as much hype as substance, and a lot of the "sales pitch" in the early days has now been debunked. We're not going to run short of stories about major Cloud services having significant downtime, poor support, data leaks and so on any time soon.
Likewise SaaS is a great pitch if you're a software developer, and converting capex to opex makes it attractive in some contexts as a customer as well. However, I get the feeling that patience is starting to wear a bit thin in some quarters now, with customers looking at the overall costs they're paying, how many improvements they're really seeing in return, and also how many unwanted changes are being pushed out.
Depending on where you stand on those issues, combined with the general dumbing down associated with many online apps compared to traditional desktop software, I can imagine a variety of situations where businesses might be looking at keeping more IT in-house as we go round the cycle again. For personal use, I suspect there's less of an incentive because many of these users are quite happy with just using their tech for basic communications and information retrieval, but for those who do want to do more creative things or who enjoy more demanding applications like games, again there's surely a big enough market for a serious desktop OS with an emphasis on good management of local native applications and not just being an expensive thin client for online services.
So who potentially benefits from an industry shift? I guess the simple answer is anyone who is interested in developing software for the markets I just described.
I wish I could agree with you. However, the QA layoffs at Microsoft were in 2014; if something like that was going to happen, it would have happened by now.
I have championed it when it came out, and despite being in a hard position to pick one up I managed to do so.
I deeply deeply regret it.
If anything I have gone from being a massive supporter to actively doing what I can to spite the brand.
The product is an expensive laptop with critical issues that render it inoperable.
Then when you need to try and talk to MSFT help, they are about as capable as toddlers. Politely unaware and unhelpful.
The Surface line went from having a great rating to having that rating revoked because the devices are unrepairable.
I want to love my device, I am using it right now, but there are serious issues with it, and this is a device which has been placed in the same context as Apple products and their legendary support reputation.
The fact that they dropped the mobile market (where windows could not fit as very different hardware was involved in the beginning) doesn't mean they will miss the next shot too.
They will have an entire ecosystem up and running, with Windows 10-level flexibility to be tailored to whatever comes. Of course it would be crazy for them to drop it now.
Every single incarnation of windows mobile / phone was well tailored for the mobile hardware of the day.
What sunk windows phone was not that it was slow, it was that it was late to go touch-first and failed to differentiate enough to make up for it. That and tightly controlling the vendors while up against android made the quality of windows phone itself irrelevant.
This is such an utterly clueless explanation of why Windows Phone failed that it’s kind of stunning. Until, of course, you remember the culture-induced myopia I described yesterday: Myerson still has the Ballmer-esque presumption that Microsoft controlled its own destiny and could have leveraged its assets (like Office) to win the smartphone market, ignoring that by virtue of being late Windows Phone was a product competing against ecosystems, which meant no consumer demand, which meant no developers, topped off by the arrogance to dictate to OEMs and carriers what they could and could not do to the phone, destroying any chance at leveraging distribution to get critical mass…
Interestingly, though, Myerson’s ridiculous assertion in a roundabout way shows how you change culture…In this case, Nadella effectively shunted Windows to its own division with all of the company’s other non-strategic assets, leaving Myerson and team to come to yesterday’s decision on their own. Remember, Nadella opposed the Nokia acquisition, but instead of simply dropping the axe on day one, thus wasting precious political capital, he let Windows give it their best shot and come to that conclusion on their own.
One wonders what would have happened if they just sort of let Windows be Windows. I.e. if they had continued iterating on core stuff but left UI and general philosophy resembling Win7's trajectory and not tried to force people into WinRT/UWP/store.
Even though Win10 attempted corrective action it always struck me that it was still accepting the fundamental thesis of Win8. They still kept with the Orwellian redefinition of phrases like "Windows app" to mean very recent, immature and unproven technology. They were still talking about ARM devices that don't let you do a straightforward recompile of old code. They were still pushing the Store as the only means to distribute software, where even Apple has not succeeded in changing people's habits on the desktop.
I hope that with a weaker organization, people keeping the lights on for Windows will know to not shake things up too severely and ease up on pushing some of these silly ideas. I somehow doubt it. I've been hoping for that for... 7 years?
With Windows 8 they tried to do too much too quickly and not well, however their motivations and intention were spot on in my opinion. Security has always been a pain point for Windows and it's only been getting worse, the Store introduced a sandbox like that found on iOS and Android. ARM, iOS, and Android showed that modern hardware and software can produce power efficiencies that put x86 and Windows to shame. UWP introduced hardware independence and a modern development framework that would put Windows Apps on a level playing field with iOS and Android.
The problem was that Microsoft thought that merely by virtue of their existence, developers would adopt UWP/Store despite the path forward being a huge pain in the ass. We see how that played out. With Windows 10 we've seen a course correction and a change in strategy. You can have traditional Applications sandboxed in the Store, and you can have UWP Apps outside the Store. Microsoft bought Xamarin and is folding it into UWP, the intent is to not just have hardware independence with UWP but also OS independence.
That’s a bold claim presented without evidence.
IFF your user has enabled side-loading.
Will keep my Win7 for as long as hardware and software keep running. Don't expect much more from an OS, honestly.
Android and iOS have demonstrably shown that most users don't value freedom as long as the OS works for their needs. Most users are security nightmares as IT Admins and relatives supporting family members can attest to. The Store is Microsoft's attempt to introduce that stability and safety to the wider Windows audience.
That's not to say that there isn't a significant subset of power users who want the ability to do what they will with their machine however they are the exception.
Microsoft is struggling with how to cater to both markets simultaneously.
The first is designed to be casual and apps grew out of a situation where there previously was nothing, while desktop OS's have had decades of external software being installed regularly. Introducing something new, especially when opening up so much potential, is far easier than trying to change existing habits of both consumers and vendors on an existing landscape.
The iPad is a good case showing how the same model doesn’t succeed without top class apps.
Here's hoping they'll rethink that at some point and follow Apple's approach of having just one desktop OS instead of nickel-and-diming the masses for features such as BitLocker.
But they just can’t bear to actually get it all right, so they still try to stiff people to switch off ‘S’ mode.
- There will be fewer SKUs
- Updates won't require restarting the OS
Presumably, given that the opposite has happened, in that case, "targets" would not have been met by "Q4" and important team members would have to be "let go".
As you point out, this is not new. I would pin the start of this trajectory on the introduction of Windows Activation with XP. It's all variations on the theme of restricting the user's freedom. It hasn't been your software since 2001. It's theirs.
With the possible exception of my non-Retina iPad mini I’d rather watch Netflix on any of my mobile devices. It never even occurred to me to look for it on my laptop, because the form factor is so ridiculous for movie watching.
Now, I have no idea where to even start! C#? C++? .NET? Win32? MFC? WPF? XAML? WinForms? UWP? What a rat’s nest! How did MS let the development ecosystem get so muddied? I know I can go online and research all of these, learn the trade-offs, and hope I bet on the right horse, but then again I could also choose to simply leave this crazy platform alone and work on more features on the platforms that are more comfortable.
Out (but not gone/forgotten): Win32 (hand-coded C, C++ MFC, C++ ATL and related alphabet soup [aka back in the day the choice wasn't easy, either], VB6, .NET WinForms, etc), WPF/Silverlight (a .NET only XAML platform)
In: UWP (C++ with XAML, C#/.NET with XAML, JS with HTML/CSS)
Just three UI platforms to consider (Win32, WPF, UWP), two of which are mostly deprecated and not recommended if you are building a fresh application or porting one to a new UI. The remaining platform (UWP) gives you language choice (C++, C# or other .NET languages, JS), but not UI design framework/markup language choice (XAML, unless JS and then HTML).
If you are doing a fresh port of a C++ application from another platform, the choice seems pretty simple to target the UWP and use XAML to define your UI.
In the meantime, if you, in a corporate environment, must support Windows 7 until the bitter end in 2020, you can use WPF as a bootstrap step towards UWP. The XAML is similar between them and a lot of the coding paradigms are the same (MVVM patterns), and if you architect well (.NET Standard 2.0 libraries) you can share 100% of your business logic between a WPF application and a UWP application.
Xamarin.Forms also has an open source WPF renderer if you'd prefer to further go the cross-platform route. (Xamarin.Forms supports UWP, iOS, Android, macOS.)
I agree that Microsoft should have had the .NET Standard, XAML Standard, and WPF Xamarin.Forms development stories in place sooner to better leverage developers that need to support older Windows, but many of those pieces are in place now if that is what corporate developers were waiting for.
A fun result of the whole mess is that after a new Windows 10 install, the first thing I do now is open Store to install Ubuntu WSL.
The old Windows desktop might have had a half dozen things prompting you to elevate and update. Flash player, Adobe reader, Java runtime, all these things wanted your undivided attention to update, not to mention were polling the network and using resources to do that. So maybe you could see a single piece of infrastructure allowing those third parties to hook in less intrusively.
But that's not what they asked for. They would prefer you trash your app and write for the thing they came up with this year, to be thrown away when they do the rewrite the next year.
That hasn't been true for a while. Classic Win32 desktop apps have been in the Store for nearly a year and a half. It's now my preferred way to install things like Office and Paint.NET. Even Photoshop is in the Store.
(Even iTunes was promised at one point, and I'm still very hopeful for that, because Apple's Windows Updater is still one of the worst updaters on Windows, where bad updaters was once an art form.)
Selling only WinRT software made all their efforts worthless.
Forcing developers to provide Win32 and WinRT versions of software alienated software developers.
Some of the calculation was correct. WinRT was not.
They don't have to sell all existing legacy Win32 apps in the store, but they should at least allow some Win32 apps.
About malicious software: I think it would be legally very dangerous for a malicious developer to expose their personal and financial information to MS and expect nothing to happen when caught.
And all it would take is the developer to claim that their machine had a virus unbeknownst to them that somehow made its way into their binary.
Dunno, Steam is in a position where they don't really have to care about the Windows brand but Microsoft does, so they have a bit more freedom than Microsoft in that.
Wasn't old windows desktop technology the one proven to be bad? I mean I can say this because I like windows, but people have made fun of windows bugs and viruses for three decades now..
Granted, the Windows Store still sucks ass years after its introduction, but building a new ecosystem with unified and modern tooling was probably a necessary step for them to make.
I sometimes wonder if maybe they should have done a hard fork between 7 and 8 and continue to develop 7 for commercial users and 8 would be the consumer version.
In the commercial branch, they could undo some of the moves they made years ago (like putting GDI in the kernel rather than user space) to eventually get a more stable, secure OS.
The thing with "people making fun", though, is that a lot of it was a by-product of the culture war between the Linux and Windows worlds: between radical ideological open source and closed source (where Microsoft engaged in a lot of ugly tactics), and between techies who wanted tech focused on their needs and a commercial company that cared significantly less about the needs of admins, tinkerers and such.
Meaning, the jokes were much more political and much less technical. They were often the result of people hating Microsoft (recall their ugly tactics).
A lot of that stuff could have been (and is now being) fixed incrementally. Lots of that tech is available to Windows desktop applications without needing to switch to UWP. Some of it is UWP-only (except for a backstage pass for Edge, which tells you everything you need to know), just to push developers to the Store.
I feel like MS had mostly quashed these problems with XP - it was certainly not bug-free, but after a few years was a solid enough platform many used it all the way to (and probably past) its EoL in 2014.
Windows for Pen Computing was released in 1992.
For the record, I had a MP120 and absolutely loved that machine.
One of the points the article makes is that Ballmer's MS would do things like not release Office for iOS because that would potentially eat into Windows market share. That will no longer happen. MS sees it's more important to get o365 everywhere they can.
From my perspective, switching small businesses over to Exchange Online and other o365 services - The future is bright for MS in that space. These SMBs we've switched over have gone from using a version of Office they purchased forever ago and an old Exchange license to paying 50-100 a month for Office 365.
Mac OS X, Linux+desktop (any of several), iOS, and Android are all operating systems that can do everything you need of an OS, except MAYBE run Windows programs.
Which is to say Windows as a separate company probably cannot make it long term.
The only computer I own at home (other than ipad/iphone) is a Chromebook. But I lease a cheap VPS that lets me do "computer stuff" otherwise.
(Then there's scientific software -- Stata and Matlab and that stuff. But ten years ago Matlab, like gagh, was best served in higher-end minicomputers, and now everything is Python and can be run on a cheap VPS or on AWS.)
Besides the sheer gobsmacking convenience, there's also the fact that it's way more fun for bean-counters to have higher OPEX in exchange for lower CAPEX. It really kicks the llama's ass.
Can you list some? That sort of software is my favorite genre and I'm always interested to hear about different ones.
That is surely overstating it, at least for definitions of "long" meaning <= 20 years. If you assume that Windows is more or less "done" their engineering expenses could be low, and revenue significant, merely serving the corporate and various niche (gaming, etc) markets which are still heavily bound to Windows. Their market share is declining in some niches (such as CAD) and maybe even overall (to OS X), but negative revenue and market share growth does not necessarily mean quick extinction. Some dinosaurs take decades to bleed out.
I'd even buy the Windows 12 Pro 2021 version when the time came for fundamental internals shifting like wider WSL access.
I would. An update to Windows 7 that worked well with modern hardware and had modern security updates but otherwise ignored almost everything Microsoft have done in more recent versions would be a significant improvement on anything currently available for those of us who value stability and security over bells and whistles.
Windows is a perfect example of no focus at all. Maybe this makes sense for a true mainstream OS.
To be honest, I wonder if Azure, Exchange online or Office 365 can really be their future. Why not use AWS, Scaleway or a local service. It's cheaper and better.
amazing how many people here are missing the point. this is not about Windows vs MacOS. Client OS has become a commodity with prices approaching 0 (MacOS no longer charges for upgrades, neither does iOS or Android) and users care less about how they get to Facebook.
Think about the server/mobile space. If MS tried to hang on to their dominance simply via desktop lock-in (because it worked in the past), that would be their Kodak moment. All Nadella is doing here is emphasizing efforts on staying competitive in cloud/o365, which is where he thinks they can continue to grow and bring in revenues from enterprise clients. Fewer and fewer companies want to lease/run their own Windows/database servers, and with technologies like containers a lot of companies are now going with open source solutions.
No, established enterprises (both private and government) are, in the real world, important customers for cloud offerings like Azure, GCP, and AWS.
My Boss is smart and he'd wonder what the business plan would be should say 5 years down the line you want to move away from the cloud for some reason.
You threw away decades of accumulated knowledge about the company and its systems to save some money upfront.
He thinks on a longer scope than most bosses I've known though (largely because his family own the group so he doesn't have to answer to shareholders).
I'd be wary about how the real world actually deals with that, but there is no lack of large companies jumping on short term costs cuts without looking at the risks.
Really? People incentivised to look out for your best interests vs. a managed organization incentivised to dissolve your contract and gain on average. I can't imagine anything that could create more different risks.
What people seem to do all the time is look at an SLA and believe the real world must behave the way things are on paper, because there's a signature there. People do this even after being recently burned by the same partner. Some people have the same trust in in-house staff, but the usual bias in high management is the exact opposite.
When we're talking about an entire ecosystem going away, the new one has to have almost 100% adoption. Windows desktop is going to exist for a really long time, especially when everyone else is giving up on desktops.
And yes, companies are entirely moving their systems and data. Salesforce runs their sales team. Oracle and SAP run their ERP and finance. And now AWS/Azure runs their IT infrastructure. They use email from Google or Microsoft, along with hundreds of other SaaS products used day to day by employees. New startups are even more invested and start using other vendors for every non-core activity from day one. This is what a strong global business ecosystem enables and is a massive advantage for everyone.
When it comes to "someone else's datacenter", I think you'd be surprised at just how few companies even own a datacenter, let alone their own racks. Even the major clouds lease their own space from actual datacenter builders and operators.
You are incredibly unbelievably wrong here. I work for a vendor known for incredibly high prices and have knowledge of basically the entire infrastructure of every one of my clients, and nearly all of them have a massive cloud strategy that's in-place right now. I even have clients shipping their security and compliance logs to the cloud, or sending them from cloud to cloud.
Actually the more medium-sized businesses are the ones most hesitant to move to the cloud, because of the lack of cloud expertise on their teams. But even then... it's still a huge strategy. They're all consolidating physical datacenters and using Amazon or Azure or Google.
Also, I don't want to get off topic, but how does the SIEM space fit into the infrastructure stack you see in the future? In security, what about the firewall vendors trying to become the central hub for managing the new security needs brought about by the cloud (CASB, endpoint, virtual firewalls, etc.)? What about SaaS-only identity & access management solutions like Okta? I hear they are a game changer, but is a cloud-only solution really the best-positioned seller for something like this, especially considering the hybrid structure of most large orgs today (on-prem & off-prem/public cloud)?
When it comes to next-gen security appliances, I see them enhancing the SIEM, but not replacing it. Too many regulations require centralized log management, and organizations depend on a central alerting and monitoring platform. I do see these security platforms making the SIEM cheaper and dumber, though. Whether it’s storage-based pricing like Splunk or event-rate pricing like QRadar, licensing costs a lot. Which means it costs a lot to feed a bunch of dumb logs into a system that makes information out of dumb logs. As long as you already have a Palo Alto, you might as well feed the IDPS logs into your SIEM and forget the rest: it’s fewer logs, cheaper licensing, and less hardware needed on the SIEM. You already have the Palo doing the intelligence.
With regards to hybrid clouds that are popular today, you see a lot of SIEMs go to the cloud. You can ship logs cloud-to-cloud, which saves bandwidth backhauling it to the enterprise. It is a challenge for security logging to get data from there though. Oftentimes it costs extra to send logs, or can’t be done at all. Amazon wants you to use Cloudwatch. Microsoft wants you to use Security Center. It’s a pain to centralize it all. That’s going to have to change.
To your last point: sounds like it should change, but not that it necessarily will. I can’t fathom the cloud providers would allow an independent third party come in and take this business from them, despite how much easier it would be for the customer.
I thought splunk was usage based pricing, not storage based? But yes, I have heard similar things on how quickly it can get expensive, sometimes without any warning when they get a bill 10x what they expected...
Thanks for your time.
And guess what, the CTO and CFO and COO will very much care about licensing cost of each CPU running windows or sql server, vs FREE on Linux.
This is a small and shrinking corner of the market. Valuable. But still a corner.
They haven't really up until now, have they? MS licenses are not a big part of the IT budget compared to the people needed to run an in-house data center (typically staffed 24x7).
Disclaimer: I work at SAP, but not in the same division. And not in sales either, so take my word with a grain of salt. ;)
I guess the 5 years I spent at a massive tech company shutting down our datacenters didn't happen and all the recruiters offering me huge amounts of money to move their stuff into aws aren't real.
This is very myopic. Established businesses are huge customers (both current and potential) for cloud providers.
Established businesses do this all the time.
If you mean as in a desktop PC then no, but smartphones are becoming more and more important and Apple is not strong outside the US. Around here almost everyone has an Android phone as a work phone and I see more Chromebooks by the day.
I also want to point out that I believe saying they are abandoning the ecosystem is missing the point. They aren't. They just don't let Windows dictate what the rest of the business does.
There are still a lot of tech heads and gamers that depend on Windows and will continue to buy into the platform. I even know hard core Linux devs who've switched back to Win10 and just work off of the new Linux Subsystem and an X11 server.
With Android being a clusterfuck of vendor specific bloatware and not having a single concept of a clean install, Microsoft should focus on keeping a clean Windows install equivalent to a fresh Mac install.
Everyone hates when Microsoft changes something core and foundational about Windows and thus there has been little change in certain areas of the OS for the last two decades. That build up of cruft has become untenable and Microsoft attempted to address major pain points like the Control Panel with Windows 8, then 8.1, and still in 10. The problem is that rewriting something like the Control Panel would consume 100% of developer resources for an entire OS Dev cycle. Can you imagine if Microsoft announced Windows 11 and the only difference between it and 10 was a new control panel?
Windows 10 is the perfect opportunity to start fixing cruft. It's already controversial but not universally hated, they got the majority of people to adopt it through free upgrades, and they've introduced the semi-annual upgrade cycle that allows them to introduce foundational changes gradually. With each release of Windows 10 they introduce a bunch of new crappy apps that are completely forgettable AND they silently improve a core feature.
And with every shift towards ads or trying to force a mobile-first UI on a desktop/laptop, they alienate users off of it. I switched to OS X as soon as I saw the writing on the wall with Win8 (and Wine on OS X got stable enough to play GTA San Andreas, tbh).
Also, I wonder if it would be possible to get Windows 7 userland to run on a Win10 (and with it, WSL) kernel. Now that's something I'd switch back to Windows for.
You mean like Apple asking me to use iCloud every single time I unlock my Mac? Or every time I power on my test iOS devices? Or do you mean how I have to open the App Store to apply MacOS updates and click past the ads?
Windows 10 adds some shortcuts to the start menu during the semi-annual updates; they're annoying, but once they're deleted they don't come back. They also have an ad in Explorer for OneDrive that likewise goes away permanently when dismissed.
If I had to choose between MacOS/iOS Ads, and Windows 10 Ads, I'd choose Windows but ideally they'd both stop it.
> trying to force a mobile-first UI on a desktop/laptop
Windows 8.1 and 10 don't require UWP apps to be full screen. And actually the Settings App in Windows 10 is fantastic because it responsively adapts and is usable whatever size you make it.
MacOS on the other hand won't allow you to resize most settings windows and they're fixed to a size that's comparable with a 90s era Macintosh despite MacOS only working on machines with high resolution displays.
> I switched to OS X ... with Win8
So you've never used Windows 10 first hand then and you're just spouting things you've read to validate your decision?
wtf? I don't use iCloud and don't get this nag. I only get a single question when I reimage my test computers or wipe a test device, that's it.
> Windows 8.1 and 10 don't require UWP apps to be full screen.
The start menu is enough for me to permanently turn me away. Or this stupid "tile" interface. What's that crap even for?
> So you've never used Windows 10 first hand then and you're just spouting things you've read to validate your decision?
No, it's these things that I always encounter when having to fix other people's computers. Plus I do not really like that Windows 10 has un-opt-out-able telemetry (yes, you can turn it off by hacks, but every random "update" will turn it back on again).
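For what it's worth, the main switch those "hacks" flip is a documented Group Policy registry value rather than an undocumented tweak. A minimal sketch of a .reg file (illustrative only: the lowest level is only honored on Enterprise/Education SKUs, Home and Pro fall back to Basic, and as noted above feature updates have been reported to reset related settings):

```reg
Windows Registry Editor Version 5.00

; Documented policy value controlling the Windows 10 diagnostic data level.
; 0 = Security (honored only on Enterprise/Education SKUs), 1 = Basic,
; 2 = Enhanced, 3 = Full. On Home/Pro a value of 0 is treated as Basic.
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\DataCollection]
"AllowTelemetry"=dword:00000000
```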
Are you signed into it though? I'm not and it nags me to sign in.
> The start menu is enough for me to permanently turn me away.
Which one? 8, 8.1 and 10 all have very different start menus.
> Or this stupid "tile" interface. What's that crap even for?
In Windows 10 it's not unlike Dashboard in MacOS.
That's not an 'ad' - there is a ton of OS-level functionality that's tied to your iCloud ID and account, you can ignore it if you want, but it's not an 'ad' if they want you to use it...
No, I don't use it in any form.
> Which one? 8, 8.1 and 10 all have very different start menus.
And they're all not the Win95-to-Win-7-era one. I need a working menu with nothing else, especially not tiles or ads.
> In Windows 10 it's not unlike Dashboard in MacOS.
Which Apple doesn't force one to use, thank God. Programs folder in the Dock, that's it (although, I admit, I'd prefer a list/menu...).
I don't get asked to use iCloud every time I unlock a Mac. Additionally, I don't recall a Mac ever displaying ads in their file explorer asking me to upgrade my SkyDrive storage. Nor do I ever recall Mac OS ever installing third party bloatware apps and games and making them front and center in the Start menu. I also don't recall ever having to proactively prevent the mass collection of "telemetry" data when installing Mac OS.
To be fair, it will ask you once at profile creation if you agree to sending telemetry. To the best of my knowledge though, Apple respects the choice and does not reset it quietly like Windows does.
Let's have a look at the "Get Going Fast" screen
You'll see at the very bottom a Customize Settings button that is nearly invisible. It's located on the left side, shaded in a color that is different from the other buttons and the text size is substantially smaller than the standard button text size.
The majority of users will have missed this and just clicked on the Use Express Settings button.
Let's have a look at what is turned on by default:
Personalization
1. Personalize your speech, inking input and typing by sending contacts and calendar details, along with other associated input data to Microsoft.
2. Send typing and input data to Microsoft to improve the recognition and suggestion platform.
3. Let apps use your advertising ID for experience across apps.
4. Let Skype (if installed) help you connect with friends in your address book and verify your mobile number. SMS and data charges may apply.
Location
1. Turn on Find My Device and let Windows and apps request your location, including location history, and send Microsoft and trusted partners location data to improve location services.
Connectivity and error reporting
1. Automatically connect to suggested open hotspots. Not all networks are secure.
2. Automatically connect to networks shared by your contacts.
3. Automatically connect to hotspots temporarily to see if paid Wi-Fi services are available.
4. Send full error and diagnostic information to Microsoft.
Browser, protection, and update
1. Use SmartScreen online services to help protect against malicious content and downloads in sites loaded by Windows browsers and Store apps.
2. Use page prediction to improve reading, speed up browsing, and make your overall experience better in Windows browsers. Your browsing data will be sent to Microsoft.
3. Get updates from and send updates to other PCs on the Internet to speed up app and Windows update downloads.
Also ads on the lock screen, ads in the start menu ("app suggestions"), and (Edge) ads from the toolbar. Just off the top of my head, maybe I missed some.
And some folks get the really big money for "driving business initiative", "spearheading new projects" and stuff like that.
What's the use case for a "hard core Linux dev" and WSL? I've tried it. It failed miserably for Rails development when it was released. I logged bugs. I went back to what I was doing. Maybe it's gotten good now?
Speaking of Windows OEM bloatware clusterfucks, here's your typical HP OEM Windows clusterfuck:
The Windows we all know, both desktop and server side, is not dead. But the economics on the server side are diminishing (on a 20-30 year outlook), and Microsoft won't be around if they don't do something about it now.
IMHO it's a very smart strategy; people just didn't understand what Ben was saying because he didn't articulate it correctly (a problem I do find with his content, though less often than here).
Most definitely not. There are a ton of Microsoft Certified Somethings that will refuse to administer Linux servers unless their life depends on it.
Source: I work at SAP's internal cloud unit. We have to support Windows VMs (begrudgingly, if I may say), even for payloads that could run on Linux.
Can you elaborate on this assertion? As far as stability goes, Macs are top notch. I'd say Windows machines are prevalent in the business context because of successful MS strategies (in terms of marketing, not ethics) in the 1995-2005 era, and it has kind of stuck around.
Windows has provisions for performing all sorts of tasks across a fleet of desktops. I can automate deployment of a Windows desktop to the point where all you have to do is boot it up and join it to the domain (which makes it easier to hire someone to do it as well).
What's more is that these settings will work across updates. I've gotten really sick of the screwy changes Apple makes to macOS with little warning. Windows goes to great lengths to ensure backwards compatibility between major version upgrades. (For example, did you know VB6 apps still run just fine?) Also, I can get fine grained control of updates to Windows.
But maybe most importantly, I can virtualize Windows and create a virtual desktop environment. Now I can hand out toasters to users and they can just log in to their box from anywhere.
Windows is simply much better at making desktop units fungible.
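To make the "boot it up and join it to the domain" step above concrete: deployment tooling typically drives Windows Setup with an answer file, so even the domain join can be automated. A sketch of the relevant unattend.xml fragment, with the component attributes trimmed for brevity and the domain, service account, and OU names all placeholders:

```xml
<!-- specialize pass: join the freshly imaged machine to the domain -->
<component name="Microsoft-Windows-UnattendedJoin">
  <Identification>
    <Credentials>
      <Domain>corp.example.com</Domain>      <!-- placeholder domain -->
      <Username>deploy-join</Username>       <!-- placeholder service account -->
      <Password>********</Password>
    </Credentials>
    <JoinDomain>corp.example.com</JoinDomain>
    <!-- drop the machine account into a specific OU so group policy applies -->
    <MachineObjectOU>OU=Workstations,DC=corp,DC=example,DC=com</MachineObjectOU>
  </Identification>
</component>
```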
That said, everywhere I've worked the past 8 years, across 3 jobs, was virtually Mac-only. But yes, the valley is an outlier.
From the medium to enterprise range, maintenance and stability issues are often one and the same. Breaking automated maintenance and/or deployment leads to instability.
Keep in mind I don't mean "kernel panic" levels of instability or even "programs crashing" levels of instability. The smallest of changes can cause a mountain of work in large environments. Simply changing the icon of an often used program can cause a deluge of tickets.
This level of stability and control is something that macOS doesn't even come close to matching.
My MBP and its periodic gray screen of death begs to differ. Apple really needs to shore up its QA.
Sticking with something just because it's a large majority is a great way to fail when that something fades into the background.
I suspect it's a mere marketing strategy to jack up their stock price and gain a "cool factor". MS will go wherever the market/profits leads them.
Their cloud is commercially successful, a rarity for new MS products actually, and that's where they currently see future growth. But if Windows stays a nice cash cow, they'll quietly milk it.
"Productivity" applications still need a real GUI, something the Web has yet to get right cross-browser: CSS/JS/DOM is a mess. I believe we need a WYSIWYG standard to get GUIs right. The server side can still scale the window & widgets per device size under WYSIWYG, so it's a myth that WYSIWYG is anti-mobile. The client-side auto-flow approach sucks for efficient UIs.
They aren't abandoning the desktop Windows ecosystem; they've just demoted Windows from the single desktop-and-server org that the rest of the firm orbits to something that serves the areas where growth is actually expected.