The End of Windows (stratechery.com)
583 points by samsolomon 10 months ago | 597 comments



It isn't really the end of Windows - it's the end of Windows' dominance of Microsoft mindshare.

Windows 10 was the giant step in that direction: a single, always-updating, maintenance-mode OS that MS needs to support. Arguably they won't be entirely there until Win 7 dies out, but once they are, maintaining Windows is a much simpler affair than in years past.

None of this means Windows itself is dying - it only means the OS has matured enough for present needs, and any future needs will be addressed as and when they arise. Instead of inventing the future, they will adapt to it on their own schedule.

This allows Microsoft to focus on things that truly matter for the future without killing Windows or spending a lot of resources on advancing/supporting it. It's not hard to imagine Windows stays a very dominant client OS for a long time to come - even if the overall PC market share continues to decline, because none of the alternatives are there and nobody is going to invest the resources to get them there.

I will also add that Satya's success lies in speeding up this strategy, which started late under Ballmer, and in executing so well on it. Considering the scale of what MS is doing - Office ports to iOS and Android, Windows 10 releases, Azure, and a ton of other enterprise work (Exchange, O365, development tools, cross-platform efforts like .NET Core, VS Code, etc.) - the change of course, and the success they are having with it all, means you can't argue it isn't a phenomenal achievement.


You can see a nice graph in Mozilla's Firefox hardware telemetry showing (Firefox) users steadily upgrading from Windows 7 to 10:

https://hardware.metrics.mozilla.com/#goto-os-and-architectu...


That hardware report is surprising to me in many ways.

* In 2018, the most popular resolution for users of Firefox is still 1366x768. And only 1920x1080 is making any headway against the dominance of 1366x768. As much as I am surrounded by the culture of multiple 4K+ displays, apparently this is not at all commonplace. 4K doesn't even get listed, presumably lumped into the "Other" category.

* In 2018, the most popular memory amount for users of Firefox is still 4GB, trailed by the also disappointingly small 8GB. In my world, I consider 16GB a bare minimum, but at least with memory I haven't been deluding myself—I know many people economize on memory. Still, I would have thought 8GB was the most common memory size by now.

* A surprising number of people still have Flash installed.

I can barely conceive of the poor experience of running a modern web browser on 4 GB of memory at 1366x768. Can you imagine the user experience of today's over-JavaScripted web pages on such underpowered hardware: not simply slow because of their grotesquely large transmission payload, but also because they cause your computer to start swapping memory pages to disk?


> In 2018, the most popular resolution for users of Firefox is still 1366x768. And only 1920x1080 is making any headway against the dominance of 1366x768. As much as I am surrounded by the culture of multiple 4K+ displays, apparently this is not at all commonplace. 4K doesn't even get listed, presumably lumped into the "Other" category.

Most tasks in the real world don't need multiple 4K displays, including low-level and systems development. Most people read stuff, and as a developer and system administrator, when the text looks good, the resolution is enough.

> In 2018, the most popular memory amount for users of Firefox is still 4GB, trailed by the also disappointingly small 8GB. In my world, I consider 16GB a bare minimum, but at least with memory I haven't been deluding myself—I know many people economize on memory. Still, I would have thought 8GB was the most common memory size by now.

8 GB is more than enough for most people. My family's Win10 desktop is happy with 4GB, my office desktop is cozy with 8GB. My own desktop has 16GB of RAM, but it runs many, albeit small, virtual machines. The "hardware is cheap, let's waste it" mentality doesn't help anyone, and it's wrong. I've written some state-of-the-art algorithms which use 1.5MB of RAM and make the CPU scream for cooling (I develop high-performance computing software as a side academic gig), so like every resource, RAM should be used sparingly.

Edit: I've no comment on Flash. I think it's a forgotten remnant of old systems.


As someone who reads text all day, going up to a 27" 150 PPI monitor was huge. There is a tangible improvement in how that text looks, especially on an accurately colored monitor.

The other footnote should be that display prices have been crashing recently. You can get an IPS 24" FHD monitor for like $90, and a QHD version at 27" for about $150. Those would have been twice as expensive a few years ago.

That being said, all those 768p screens are crappy plastic laptops with really slow hard drives. That, I guess, is what we end up with because Intel took what should have been the natural evolution of notebooks - small SoCs driving a high-PPI display in a metal frame - and made it into a premium brand-name product with huge margins on chips that cost peanuts to manufacture, since they didn't have any real competition in the space for a very long time (and even then, their monopolistic behavior lets them keep AMD out of all the major brands' premium notebooks anyway).


> As someone who reads text all day, going up to a 27" 150 PPI monitor was huge.

You're right, however not everyone has the desktop space to accommodate a 27" panel. I can barely fit a 24" on my desk, and 1440p monitors start at 25".

> The other footnote should be that display prices have been crashing recently.

When I was in the US at the end of 2014, one of my friends said the same thing about flash drives when I pointed out a $44 PNY 128GB flash drive. Unfortunately, other parts of the world don't work the same way. The EUR and other currencies aren't pegged to the US$, and in most parts of the world prices fluctuate, if they don't outright increase. So no, in some parts of the world technology unfortunately doesn't get cheaper as it matures.

Addendum: BTW, you are right that 768p screens are generally found in entry-level laptops or netbooks. These devices were the most affordable ones when they first came out. Now they are bottom-end cash cows which are virtually free to build.


Lenovo still sells 768p screens in their high end X-series laptops including the very recent X280.


Which IMO is criminal; if the laptop starts at >$700 they should "splurge" on the nicer display.


I could swear 1080p was the default on my X270 that I bought a few months ago, but it looks like it's a $150 upgrade now -- on an already expensive, nearly $1200 base model. I paid just over $800 for my X270, including the 1080p screen and upgraded memory.


1440p is a resolution for gaming, not work.

You can get 24" 4k (2160p) 180ppi 60Hz for $300 e.g. LG 24UD58.


Just purchased two 1440p monitors and I love doing work on them. 25" and the PPI is just perfect for me.

Anything higher resolution-wise requires a much larger display to be readable at 100% scaling. I'm adamantly against using scaling.


A 4K screen with 150% scaling is liquid smooth and other-worldly


Don’t you find it a little small at only 150%? What size/model?


> I'm adamantly against using scaling.

Why?


I guess I just don't see the point unless you're gaming, watching 4K content, or doing graphic design. I tested out a 4K 28" and unless I had it at 200% scaling I couldn't use it for longer periods of time. Sure, it looks a little smoother, but now you have to render ~2.25x (vs. a 2560x1440 monitor) the amount of information for what I think is fairly little gain. I get more screen real estate and still readable text with the 25" 1440p.

Perhaps my experience would have been better on a desktop, but this was at work, where I have a Surface Pro; going from a docked to an undocked state (or vice versa) with monitors that didn't match the Surface's scaling resulted in graphical issues that could only be resolved by logging out and back in again.

I also still come across apps that don't know how to scale, which can be really frustrating.


> I guess I just don't see the point unless you're gaming, watching 4K content, or doing graphic design.

I have to say you have it exactly backwards!

Gaming on 4K is extremely expensive and still basically impossible at refresh rates higher than 60 Hz. In fact, you’ll be lucky to get even that much. 1440p/144Hz is a much better and more realistic target for even the most enthusiastic gamers.

Also, a most welcome recent trend has been for games to ship with novel temporal antialiasing techniques, completely redefining how games can look at lower resolutions.

Temporal artifacts have always been the bane of 1080p, forcing higher resolutions either directly, or indirectly as supersampling. Once you take that out of the equation, the benefit of native 4K is much more modest.

4K movies are nice, but as with games, it’s more of a linear progression. I doubt most people could even tell the difference in a blind test.

Full-range HDR is, in my opinion, a much better investment if you want to improve your TV viewing experience (and lately gaming as well) in a noticeable way.

I don’t know much about graphic design, but I doubt 4K is all that essential. Everyone has been using 96 dpi displays to create content for very high density mediums for a long time. Even the most craptastic ink printer is 300 dpi+. All you need is a zoom function. Color reproduction is, I think, much more important than resolution.

Where HiDPI displays really shine is actually in the most mundane: font rendering.

For anyone that works in the medium of text, programmers, writers, publishers, etc., a 4K display will be a considerable and noticeable quality-of-life improvement.

Even the most Unix-neckbeardy terminal dwellers will appreciate the simply amazing improvement in visual fidelity and clarity of text on screen[1].

> I tested out a 4K 28" and unless I had it at 200% scaling I couldn't use it for longer periods of time.

That’s what you are supposed to do! :)

It’s only HiDPI at 200% scaling. Otherwise it’s just 2160p, or whatever the implicit resolution is for some partial scaling value.

For 4K at 100% scaling you’d need something like 45" screen at minimum, but that’s not actually practical once you consider the optimal viewing distance for such a screen, especially with a 16:9 ratio.

> I get more screen real estate and still readable text with the 25" 1440p.

A 4K display should only provide extra space indirectly. With text on the screen looking so much sharper and more readable, it might be possible to comfortably read smaller font sizes, compared to equivalent 96 dpi display.

If you need extra space as well, then that's what 5K is for.

Though for things like technical drawings or detailed maps you can actually use all the extra 6 million pixels to show more information on the screen.

A single-pixel–width hairline is still thick enough to be clearly visible on a HiDPI display[2].

> but now you have to render ~2.25x (vs. a 2560x1440 monitor) the amount of information for what I think is fairly little gain.

Yes, that’s an issue with things like games. However you can still display 1080p content on a 4K screen, and it looks just as good[3], and often even better[4].

Most graphics software will also work with 1080p bitmaps just fine. Vector graphics necessitates a little extra work, but for a very good payoff.

Overall though, for things like programming or web browsing, it shouldn’t matter. I have a netbook with a cheap Atom SoC (Apollo Lake) and it can handle 3K without breaking a sweat. That much more capable[5] GPU on your Surface Pro should easily handle even multiple 4K displays.

Pushing some extra pixels is not a big deal, if all you’re doing is running a desktop compositor with simple effects.

> going from a docked to an undocked state (or vice versa) with monitors that didn't match the Surface's scaling resulted in graphical issues that could only be resolved by logging out and back in again.

Yeah that must suck. Still, it’s only a software bug, and you mustn’t let it keep you from evaluating HiDPI on its merits.

> I also still come across apps that don't know how to scale, which can be really frustrating.

That’s life on bleeding edge ;)

Sure, it’s annoying, but the situation is a lot better than it used to be. Even Linux is doing fine, at least if you stick to recent releases. Some distros like to ship outdated software for some reason :/

Still, in my opinion, the quality-of-life improvements of a HiDPI display very much outweigh the occasional inconvenience. Though obviously, YMMV.

[1] https://www.eizo.be/eizo-pixeldichte-im-zeitalter-von-4k-e5....

[2] Assuming you’re viewing at optimal distance.

[3] With the notable exception of 96dpi native pixel art.

4K has exactly 4 times as many pixels as 1080p, so it shouldn’t be an issue in theory. Nearest-neighbor will give you exactly what you want.

However, in practice you need to force scaling in software, since graphics drivers, and most monitors' postprocessing, tend to default to bicubic scaling. That said, pixel art is not computationally expensive, so it's mostly just an inconvenience.

[4] You can use advanced scaling algorithms to upscale 1080p to 4K and it usually looks great. E.g. MPV with opengl-hq profile or MadVR on Windows. For that you’ll need something a notch over integrated graphics though, e.g. RX 560, GTX 1050 and on mobile Ryzen 2500U or equivalent.

[5] http://gpu.userbenchmark.com/Compare/Intel-HD-4000-Mobile-12...


2560x1440 on a 27 inch at a reasonable distance is pretty darn close to optimal IMO, so 4k, to me, is for 34" monitors (but 27 inch I feel is optimal on 60-75 inch desks, which is what I usually work with, so 4k rarely matters).

I'm with you on accurately calibrated monitors though! God most of them suck out of the box.


If you don’t see any (spectacular!) difference between 4K & 1440p you need to have your eyesight checked.

I’m not being sarcastic. The last time there was a thread like that on HN a bunch of people figured out they need glasses.

I have a 4K @ 24in monitor (180ppi) and a 267 ppi netbook and when I switch between them the 4K starts looking like a blurry mess!


> The last time there was a thread like that on HN a bunch of people figured out they need glasses.

It's fair advice, but some eyesight issues cannot be solved with glasses... if they can be solved at all.

Also worth noting that with TVs the distance matters a lot. With monitors, laptops, and gadgets it is relatively stable.


For TV/Multimedia HDR makes much more difference than 4K in my experience.


No, my eyesight, both far and near, was quite a bit better than average as of last week (I was just there getting it checked). We have a lot of 4K and 5K displays at work, and most people who say they can tell a (significant) difference when we compare (the topic comes up a lot) usually either are on >27 inch, have scaling higher than expected, or just fail to see it when we really test it out. Your mileage may vary :)

Don't get me wrong, I can see a difference, but not nearly as night and day, especially when it comes at the cost of other features (eg refresh rate... which isn't the end of the world for coding so if it's the only thing you do on the monitor it could be worse... otherwise ouch my eyes.)


> have scaling higher than what's expected

What scaling is that? 200% scaling is what you should have, 4K is exactly 4x as many pixels as FullHD. If someone is using lower scaling then they are trading sharpness for virtual space.


I never get comments like these, as if the text just gets smaller as the resolution increases, rather than what actually happens (the text gets crisper). 5120x2880 is so nice because you can’t see the pixels and words almost look like they are on paper.


Seconded. 2560x1440 on a 27" panel is only 109 pixels per inch. I use a ThinkPad with that same resolution on a 14" display, with a 24" 4K UHD next to it in portrait mode.

Both displays are around 200 pixels per inch, plus or minus. It's great not having to see the pixels, so much more pleasant and easy on the eyes.

Also the combination of a portrait display with the landscape display is really nice. I can read an entire PDF page without scrolling.


I agree that having higher PPI is great, but are you using scaling to make text larger? I was barely able to use a 28" 4K at 100%; I can't imagine doing that at 24".


Yes, I should have mentioned that I'm using Windows 10 with 225% scaling on both the 4K UHD 24" display (187 DPI) and the WQHD 14" (210 DPI). Some people like a bit less scaling, some more, but in general you want a scaling factor that roughly matches your display's pixels per inch.

The original Windows "standard display" was assumed to be around 96 DPI. That's the monitor that 100% scaling (i.e. no scaling) is intended for. Round the 96 up to 100 and we can say that in rough terms, the percentage scaling should be in the neighborhood of the monitor's DPI.

So monitors in the 200 DPI range are best at around 200% scaling.

A 28" 4K UHD has 157 DPI, so I wouldn't want to try it at 100% scaling - ouch. It ought to be running in the 150-175% scaling range.

The idea with a high-DPI monitor isn't to make everything smaller on the screen, it's to make everything sharper and more detailed. When you double the DPI and scale appropriately, you get four times the number of pixels for everything you put on the screen.
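
For anyone who wants to check the arithmetic, that rule of thumb is easy to script. A rough sketch (the panel sizes and resolutions below are just the example values mentioned in this thread, and real panels are often a fraction of an inch off their nominal diagonal, which is why quoted DPI figures vary a little):

  ~$ awk 'BEGIN {
      # nominal diagonals and resolutions for a 24" UHD, a 28" UHD and a 14" WQHD panel
      split("24 28 14", diag); split("3840 3840 2560", w); split("2160 2160 1440", h)
      for (i = 1; i <= 3; i++) {
        ppi = sqrt(w[i]*w[i] + h[i]*h[i]) / diag[i]
        # Windows assumes ~96 DPI at 100%, so the suggested scaling is roughly ppi/96
        printf "%d\" %dx%d: %.0f PPI -> ~%.0f%% scaling\n", diag[i], w[i], h[i], ppi, 100*ppi/96
      }
    }'
  24" 3840x2160: 184 PPI -> ~191% scaling
  28" 3840x2160: 157 PPI -> ~164% scaling
  14" 2560x1440: 210 PPI -> ~219% scaling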


> A 28" 4K UHD has 157 dpi, so I wouldn't want to try it at 100% scaling - ouch. It ought to be running in the 150-175% scaling range.

That’s not how it works. Lower dpi does not somehow give you more real estate!

You should still be running with ~200% scaling because you are viewing it at a greater distance.

Optimal viewing distance, assuming 16:9 ratio, is 120 cm vs 140 cm for 24" vs 28", respectively[1]. Accounting for the difference gets you ~155 ppd with both monitors[2][3], maintaining 25.0° horizontal viewing angle.

The closer your viewing distance, the more PPI you need for the same density. That 28" is not inferior to the 24" when you account for distance, despite the lower PPI, because the greater viewing distance makes the pixels look smaller, creating a denser image.

[1] https://en.wikipedia.org/wiki/Display_size

[2] http://phrogz.net/tmp/ScreenDens2In.html#find:density,pxW:38...

[3] http://phrogz.net/tmp/ScreenDens2In.html#find:density,pxW:38...
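
To make the distance adjustment concrete, here's a rough sketch of that pixels-per-degree calculation, using the simple 2·atan(half-width / distance) viewing-angle model and the sizes and distances quoted above (it lands within a pixel or two per degree of the ~155 figure):

  ~$ awk 'BEGIN {
      pi = atan2(0, -1)
      # 24" and 28" 16:9 UHD panels at the suggested 120 cm and 140 cm viewing distances
      split("24 28", diag); split("120 140", dist)
      for (i = 1; i <= 2; i++) {
        width = diag[i] * 2.54 * 16 / sqrt(16*16 + 9*9)        # horizontal width in cm
        angle = 2 * atan2(width / 2, dist[i]) * 180 / pi       # horizontal viewing angle in degrees
        printf "%d\" at %d cm: %.1f deg horizontal, ~%.0f px/deg\n", diag[i], dist[i], angle, 3840 / angle
      }
    }'
  24" at 120 cm: 25.0 deg horizontal, ~154 px/deg
  28" at 140 cm: 25.0 deg horizontal, ~154 px/deg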


I guess the problem is that I value the amount of information I can fit on the screen over the quality of that information.

Also, apps that don't scale properly are a pain haha.


Scaling is usually on by default in most modern operating systems.


Bitmap text can look clear as crystal on a very low pixel density display.

You need a higher PPI to make anti-aliasing work on screen (finally looking nearly as nice as print).


Last year I had a 45” 4K screen with 150% scaled UI and was able to develop in VS code with two code windows open side by side, all with the crispiest text I’d ever seen. It’s the dream.


I would __much__ rather have TWO 1080p or 1920x1200 displays than a single 4K display. For me, the quantity of visible 'glass' matters most. It's surprisingly difficult to drive double 4K monitors.

Maybe as the price comes down and more transistors end up in basic chipsets we'll see 2 and 3 heads of 4K displays become common.


Agreed. I've been using a Commodore or PC desktop since ~1986. Been through lots of hardware iterations, seen and pondered many different configurations. I found the 24" 1080P display is the best and the more, the better[0]. And no larger than 24" either, that's crucial. I've downsized from larger panels to 24".

I wouldn't trade my three 1080P 24" panels for 4K panels, unless the goal was to sell the 4K panels and rebuy the 1080P 24" panels. I don't do a lot of gaming anymore, but they're hardly a horrible experience in that regard either.

[0]https://pcpartpicker.com/b/dFmqqs


I almost agree, except that I definitely prefer 24" 1920x1200 over 1920x1080.


So I wasn't able to get this out of my mind, went to order 3 new 1200p panels today, and actually backed out at the last moment after further consideration. I think it would drive me nuts to effectively have a 21.6" panel for 16:9-optimized content, which is everything, and that would bother me more than the additional 10% of workspace would help - especially considering I have 3 panels, so there's plenty of workspace already.

I think at this point in the market, I'm going to stick with 16:9 native resolutions. If I do any swaps, I'll probably try out dual 27" 4K panels (also 16:9, great for 1080P content), with one mounted below the other. That'll be pretty nice in time as 4K becomes better supported.


Agreed. I couldn't find (modern, thin bezel + Displayport) 1920x1200 panels when I was shopping last time. I do see a few on the market right now, when one of my 3 goes out, I'll sell off the other two and order three 1920x1200 panels for sure.


Why do you prefer a 24 inch display to a larger one such as a 27 inch?


The dot pitch is too big, at least at 1080P and it's just not as sharp as I prefer. I had a couple 27" 1080P panels at one point and got rid of them. 1440P@27" is good but I've just been happiest overall with my three 1080P panels, the main factor is fitting three 27" panels on most desks (& mine is rather large) is harder than 24s. Also, less neck movement to focus on a panel on the periphery. Some trial and error to reach this point but as long as I run three panels, I'll never go beyond 24".


It's not about needing 4K, it's about the prices. While it's normal in the USA to get a couple of these monitors, for me the expense is impossible. And I really would use a 4K monitor. It's a big world, and most of it is not the USA.


Actually, it's not always the price. Serious developers and seasoned computer enthusiasts don't change rigs every couple of years. If one's system is performing well enough, and the user is used to it, a system upgrade can be deferred until some technology or computing resource becomes necessary. When something breaks, the broken part is replaced and, in most cases, upgraded.

Personally, I've just upgraded from 1680x1050 to 1920x1080. My old monitor was 10 years old, and was performing relatively well. I bought a new one, because it started to show its age (mostly backlight age).


I think your experience may be an outlier. Having been at a university for the past 4 years, I observed that 90% of the laptops used by students were 1366x768, and the majority had 4GB, followed by 8GB.

I don't remember anyone having any problems running the latest chrome or Firefox. I think people even managed to run things like SolidWorks or Matlab perfectly fine on those machines.


None of the other things surprise me, but that screen resolution does a bit.

It's been about 8 years since I last had a laptop without at least 1080p I think.


Do you buy your laptop for screen real estate or for "small and not too heavy"?

Because the latter is what most people buy for, as far as I can tell.


Those things aren't necessarily mutually exclusive anymore. Something like a Dell 7270 will cost about £275 on eBay. 12 inch screen and 1080p touchscreen. Some question the wisdom of a small screen like that with scaling and whatnot but I've always found it fine.


That's fair. Though I wonder how many people buy computers on ebay as opposed to dell.com or amazon.com or an actual brick-and-mortar store.

That said, I suspect most people over the age of 35 or so can't read text at "normal" size on a 1080p 12inch screen, so if they want a 12 inch screen they have to either scale up their text or drop the resolution. And I believe the reported resolution in the Firefox telemetry is the one the OS thinks it's using, not the one the hardware has.


1366x768 just will not die. They're usually cancerous TN panels with 20 degree viewing angles too.


For work, the laptop stays at home or you put it in a rolling case and take it into the office once a day. College, it's in a backpack and carried to class 5 times daily.


It doesn't surprise me too much.

I develop for an organization that serves people of mid-to-lower economic strata. 10% of our users are still on Windows XP.

We get caught up in the tech sector echo chamber and don't realize that upgrade cycles for millions of people are a lot slower than we are used to.

I think that it's not surprising to see what Firefox is reporting. A lot of people go to Firefox seeking something that runs faster or better than what came with their Windows box years ago. Most of the people we serve on low-end computers are Chrome, not IE/Edge.


Personally I won't touch anything with less than 32 GB and 1080p, but people that use a PC like a tablet with a keyboard and a larger screen have different needs. They are the vast majority of buyers.

A friend of mine is thinking about buying a new laptop. We were looking together at what's available in the sub 400 Euro range, basically everything is 4 GB and 1366x768. Add some 200/300 Euro and 8 GB and 1080p start to appear in the lists. Guess what people buy most? The cheapest stuff.

By the way, the new laptop must be Windows, because anything different would be too difficult to cope with after 30 years of training. He's replacing the old one (only 2 years old) because it's getting too slow. Probably a malware party, but disinfecting that machine is a hopeless effort. That one and the previous ones were periodically cleaned and/or reinstalled, but there is no way to change those ingrained clicking habits. No adblocker, because my friend wants to see ads and they could actually be useful in his business. I told him that he'll be back with a slow PC soon and that he's wasting his money, and I've been wasting my time.


Those numbers seem out of whack to me. 1080p is pretty bad in this day and age, and 32gb is massive overkill for virtually everyone.

My daily machine at home is still a 2011 Macbook Air with 4GB of RAM, which is admittedly not enough. My work machine was a 2012 Retina MBP with 8GB which was PLENTY for all of my everyday needs except when running VMs. To this day, 8GB is enough for 'most' uses, and my work machines only have 16 to get me over that "8gb-is-not-quite enough" hump. But I've got retina displays everywhere, and a 5k iMac at home. No clue how much memory it has, to be honest, but 32 is insane for not just the average user but even most power users unless they have really specific needs.


In Mac world 1080p is bad. In low cost Windows laptops 768p is normal. It's one of the reasons why they are cheap. One random laptop under 300 USD:

https://www.amazon.com/Performance-HP-Quad-Core-Processor-Gr...


>I won't touch anything with less than 32 GB

why would you need that much ram? are you running 3+ VMs all the time?


I run an IMac with 32gb because I’m a SharePoint developer. A 2012 VM running AD, DNS, SQL Server, Visual Studio, and a SharePoint farm all running takes up 16gb easily. The other 16gb goes to VS Code, XCode, Slack, multiple browsers, Outlook, Docker at times...etc.. Then I gotta shut it down and boot up the SharePoint 2010 VM sometimes which gets 12gb a little less I suppose.


I had 16 GB, got close to the limit working on a project with many containers, upgraded to 32. I leave open projects for 4/5 customers now which is very handy, no need to close and open. That alone is worth the cost of the extra GBs.


I run 64GB in a mac pro, with about 1/2 of it used as a cache for a zfs volume. With it taking half of my ram along with my other daily apps (safari, sublime text, etc) I’m only left with ~10GB free.


I had to move from 16 to 32 gb when I did hadoop development and needed to run a vm hadoop system locally. It all depends on your workload.


> No adblocker because my friend wants to see ads and they could actually be useful in his business.

This is the weirdest part for me, as I've never met a person who'd like to see ads. Installing an adblocker usually seems to be a revelation for people.


Preaching to the converted I know, but: A laptop that's in the shops today is going to be hardly faster than one from two years ago. In the 90s it would have been but the pace has slowed down a lot since then.

Possibly a good candidate for just doing a reformat?


You seem to assume a large portion of users are software engineers or tech people. You may consider a 16GB ‘bare minimum’, but not everyone needs that - nor can even afford that (or a laptop more than, say, new or used for $750).


My brand new thinkpad with 16gb ram and a 1080p ips display cost only £670. These specs aren't expensive any more.

That being said, my parents are perfectly happy using Google chrome on 2gb ram with a core 2 duo. And that's not with only a few tabs.


Not a single person in my office has a laptop that costs over ~400 GBP.

It may come as a shock to you but there exists a world outside of California/London/random-tech-hub


You can't walk into/click onto a high-street store and buy that though, and my assumption is that's where most people buy laptops.

£670 is much more than people I know want to spend on a new machine. None of them have ever mentioned screen resolution when i've asked what they want in a laptop.


£670 is not "only".


The average person won't spend that much on a mobile device.


> (or a laptop more than, say, new or used for $750).

To me, the hardware report shows technology is quite stagnant on display resolution and memory. 1366x768 is a resolution from 2002 or thereabouts. 4GB was common in desktop computers in 2005. ~15 years have passed and consumers in the $750 price range are still hamstrung with 1366x768 and 4GB of memory? I would expect better from a $750 phone, let alone a computer.

I think we're actually talking about computers that are very old and have never been upgraded or new computers in the $200 to $400 range.


>To me, the hardware report shows technology is quite stagnant on display resolution and memory. 1366x768 is a resolution from 2002 or thereabouts. 4GB was common in desktop computers in 2005

I think there's a huge disconnect between what you think is "common" and what people actually have. The Steam hardware survey from 2009[1] says only 10% of Steam (gaming) users had 4GB of RAM or more. I'd estimate that in 2005 that figure was less than 5%. Saying 4GB was common in 2005 would be like saying 64GB is "common" today.

[1]https://web.archive.org/web/20090415225520/http://store.stea...


> 1366x768 is a resolution from 2002 or thereabouts.

In 2002, the most common resolutions were 4:3, not 16:9, and not only because notebooks were not mainstream yet.

> 4GB [RAM] was common in desktop computers in 2005.

That's the time when notebooks became mainstream iirc. I purchased a gaming notebook in 2006, it had 1 GB RAM. I also bought a midrange notebook in 2009 which also came with 1 GB. (I later upgraded to 2 GB aftermarket.)


> In 2002, the most common resolutions were 4:3, not 16:9, not only because notebooks were not mainstream yet.

Sure. I didn't really intend to get into fine-grained technical details, but yes, back then 4:3 was the predominant aspect ratio. In fact, some of the 17" LCD displays from the time were that awkward 5:4 resolution of 1280x1024.

But putting specifics aside, the broader point is that computing technology—at least with respect to the most popular display resolution and amount of memory—is stagnant. In this thread there are people arguing for technology asceticism and making the point that 4GB and 1366x768 are good enough. And maybe they are for some people. I am surprised there are so many who are satisfied with this particular flavor of "good enough." (Though I would like to see how well that correlates with mobile device technology asceticism.)

I want the hardware industry to do better, and I thought things had started improving a few years ago when 4K displays emerged and the shackles of "High Definition" (1920x1080) were cast aside. But this data shows adoption of 4K displays for computing is so low it's not even measured as such. That disappointed me, but it's unsurprising. What surprised me was realizing 1920x1080 isn't the top dog I thought it was—in fact, the situation is worse than I thought.

And to my mind, all of this feeds into the subject at hand: an operating system vendor getting stuck. The embrace of "good enough" is suffocating.


Notebooks were plenty mainstream in 2002 (they'd been accessible to the average middle-class household or small/medium business since the mid/late 90's). It's just 16:9 that wasn't mainstream at the time, at least for PCs (laptops included).


Many of these statistics may be from overseas? I know in Brazil for instance, computers are levied with crushing taxes that mean even new computers in stores are already obsolete. Add in high rent seeking from middle men and low salaries and most people just make do with something ancient.


> 4GB was common in desktop computers in 2005. ~15 years have passed and consumers in the $750 price range are still hamstrung with 1366x768 and 4GB of memory?

If anything it's worse now because the RAM is soldered in and they're using BGA ICs which makes it impossible for most people to upgrade.


> new computers in the $200 to $400 range

There's quite a lot of these, e.g. HP Stream; the Windows license is much cheaper for these systems because they have such low specs.


My 1-year-old MacBook Air runs 1440x900 with 8 GB and it cost >$1,000 IIRC. So neither "very old" nor in the $200-$400 range.


MacBooks Air, Macs Mini and Mac Pro during last few years were ancient as soon as manufactured. Apple is being criticized for not updating these model lines.

So yes, the innards of your 2017 MBA are very old. It has Broadwell generation (circa 2014) CPU/GPU, and one optimized for power draw at that (i.e. a slower one).


"I can barely conceive of the poor experience of running a modern web browser on 4 GB of memory at 1366x768."

For years I used a Core Duo system with 4GB, running Firefox and then Pale Moon, playing Full HD videos on 23" and 27" monitors, and never got even close to hitting the swap partition unless I was running virtual machines or heavy games. Now I'm on an i5 running Debian+XFCE with 8 gigs, one Waterfox instance (a privacy-oriented Firefox derivative, thanks to the HN user who pointed me at it) with 4 tabs open, and a shell with 2 tabs; here's the output of free:

  ~$ free
                total        used        free      shared  buff/cache   available
  Mem:        8037636     1194044     4182928      187332     2660664     6555172
  Swap:       8249340           0     8249340
And this one while watching a 1080p Youtube video at fullscreen.

  ~$ free
                total        used        free      shared  buff/cache   available
  Mem:        8037636     1453204     3761776      280824     2822656     6202520
  Swap:       8249340           0     8249340
The operating system probably makes a difference [1], but anyway Firefox runs perfectly on 4 gigs or less. That RAM amount is plenty if well used.

  [1] ~$ uname -svo
  Linux #1 SMP PREEMPT RT Debian 4.9.30-2+deb9u5 (2017-09-19) GNU/Linux
  (a bit older RT kernel needed for low latency audio)


My main home PC is a laptop matching those specs. I frequently use it for development and web browsing. You can comfortably fit two terminals in columns at that resolution. The RAM is mainly an issue when browsing, but killing all the ads and defaulting to no JavaScript helps. A surprising number of websites still only use JavaScript for "progressive improvements", i.e. they'll use it just for pop-ups, hijacking scrolling, randomly moving menu bars in and out of view, implementing some nonsensical partial loading of the main content, etc., but will happily show you the full text without a hassle, with a consistent user interface, once you turn it off. For a while I used Dillo, but that became less viable as an option over time. That said, I do waste a lot less time on shitty websites with this setup.


It is worth contrasting with the Steam survey which covers several million gamers (of varying degrees of dedication) -- http://store.steampowered.com/hwsurvey

Of note, 8GB is most popular with 1920x1080 resolution.


Simplified Chinese being the most common language (by far at 52%) threw me for a loop, though it probably shouldn't in this day and age. (For context: English - 23%, Russian - 7.7%, Spanish - 2.8%).


That's probably due to the influx of Chinese players coming in to play PUBG or whatever in the last few months. A more "representative" sample can be found in the months leading up to the spike: https://web.archive.org/web/20170606030609/http://store.stea....


Would probably be higher if memory prices weren’t so damn high. Same goes with video resolution. I bet we’ll see a quick bump once video card prices start to drop.


Every time we've said "when video card prices start to drop", we've seen about 3 months of drop, and then the companies come out with some new thing that increases the prices yet again.


>disappointingly small 8GB. In my world, I consider 16GB a bare minimum

My main computer has 8GB RAM. I don't know why I would need more than that. Right now I have firefox with 15 tabs, qt creator and 3 terminals open and I'm only using 2.5 GB of RAM. Unless I suddenly wanted to get into heavy video editing I don't think I'll feel the need to upgrade soon.


The more RAM you have, the more your OS will use. My 16GB laptop is currently using 8-9GB and I don't have much open. What's the point of RAM if you don't use it? If I get low on free RAM, the OS will discard some unneeded stuff, but otherwise it properly uses it as a cache.


Almost all operations I command are executed instantly by my computer, so I see no need to use more RAM.

I think a good SSD and GPU are much more important than having a lot of RAM if you want to improve general user experience.


I think if more developers had to live on 768 line displays, slow CPUs and 4GB RAM, we would have much better performing software.


> I can barely conceive of the poor experience of running a modern web browser on 4 GB of memory at 1366x768

It seems that most people who make websites nowadays think similarly. I don't get why they are making websites for themselves, rather than for their users.


> It seems that most people who make websites nowadays think similarly. I don't get why they are making websites for themselves, rather than for their users.

Because most people making websites don't care about basic usability. Even though they have stats showing their users are on small laptops, they'll still stick a fixed header that takes 1/5 of the screen real estate on the page because it's the latest bootstrap/material-design fad. Google does this a lot, which is ironic given they're pushing "AMP" on mobile; they can't even give regular people a good experience on their laptops.


> I can barely conceive of the poor experience of running a modern web browser on 4 GB of memory at 1366x768.

The laptop I'm typing this on is exactly that: A Lenovo Ideapad from 2010 with 4GB and 1366x768 resolution, running Bunsenlabs Linux and the latest Waterfox (Firefox fork with telemetry removed). It's blazing fast even without a SSD; the browser starts in the blink of an eye and it takes nearly 70 tabs open to start slowing down.

Yes, the resolution can be limiting for some websites, but most are designed well enough to work fine on it. In fact, on text-heavy sites like HN I find myself browsing at 120% magnification for less eye strain. I couldn't imagine what 1080p or higher on a 15" screen would feel like, but it would likely be painful. It's actually more comfortable to read on than my iPad mini for that reason as well, and of course typing is a dream compared to any other laptop I've used, to say nothing of klunky touchscreen keyboards.

Granted, it would be slower with a more bloated Linux distro and certainly was with Windows 7, but when it comes to the OS I prefer minimalist and stable to kitchen sink and bleeding edge.


When I bought my father a new laptop a couple of years ago, I specifically looked for a model with 1366x768 resolution because a lot of legacy apps he depends on for his work have terrible HiDPI support. "Upgrading" to 1920x1080 with HiDPI turned off (100% scaling) would have made everything too small for his eyes.

Moreover, even apps that do support HiDPI often look like crap at 125% scaling -- the default setting on Windows 10 for most 1920x1080 laptops these days. And if you go up to 150%, you only get 1280x720 effective pixels. That's even less information density than 1366x768. Most legacy Windows apps expect to be able to use at least 768 vertical pixels.
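
The effective-resolution arithmetic there is just the panel's physical resolution divided by the scale factor; a quick sanity check:

  ~$ awk 'BEGIN {
      for (s = 1.00; s <= 1.50; s += 0.25)
        printf "1920x1080 at %.0f%% -> %.0fx%.0f effective\n", s*100, 1920/s, 1080/s
    }'
  1920x1080 at 100% -> 1920x1080 effective
  1920x1080 at 125% -> 1536x864 effective
  1920x1080 at 150% -> 1280x720 effective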

The search turned out to be more difficult than I expected, because most high-end PC laptops with good build quality had already transitioned to 1920x1080 by early 2016. I finally found the perfect model for my father and told him that he should expect to use it for a long, long time since it's the last laptop that will ever support his apps. Fortunately, it's a well-built Broadwell i5 with a number of upgradable components, so it should last well into the early 2020s.


> I know many people economize on memory

Most just don't have a choice. Plenty of laptop models top out at 8GB, with 16GB either being confined to larger form-factor machines, or being so outrageously expensive that no one in their right mind would bother.

I mean, look at the current line of macbook pros, you simply can't get 16GB without going up to the 15-inch model.


There's always been a strong impedance mismatch between users and developers. Calling a website "Apple first" isn't a compliment.

Websites become a problem on slow internet, which is surprisingly common in companies. This is because they are often located in cheap rent areas with little infrastructure.


This surprises me too, but I also must recognize my own inherent bias: the places I have worked at for the past five years all have "enterprise" laptops which are loaded down with all kinds of agents, clients, and background services that put a silly load on even an idle system. A measly 4GB of RAM would not be enough to run MS Office alongside IE.

The screen issue does bug me though. Not just a haxxor feature anymore, the split-screen capability to put two applications side by side is something all basic Windows users can leverage - and on a 720p screen, there really isn't enough space to do this practically. I think 1920x1080, or at least 1600x900, should be the standard for 2018.


I have a 1366x768 laptop with 4 GB of RAM..

It's included in the university fees and there's no way of opting out, so I had to take it. Not only do they overcharge us for it, it's also hard to find a buyer for it, so I decided to wipe it clean and use it as my daily driver.


Ouch, which university is that? I've heard of universities negotiating deals to get student discounts with certain vendors, and others requiring students (normally in engineering programs) have laptops meeting certain requirements, but this is the first time I hear of students having to purchase underpowered laptops.


> Can you imagine the user experience of today's over-JavaScripted web pages on such underpowered hardware: not simply slow because of their grotesquely large transmission payload, but also because they cause your computer to start swapping memory pages to disk?

Yes. The laptop I keep by the TV has 4GB of RAM, a 1366x768 screen, and a quad-core Celeron, that I think is some descendant of one of the Atom families. I tend to keep 10 or fewer tabs open, avoid sites that use really heavy Javascript (well, I do that anyhow...), and usually don't have much else besides a Bash window running. It's got a spinning-rust drive, and swapping would be very obvious. With my usage patterns, it doesn't swap. Performance-wise, it's a fair bit nicer than my phone, at least.


If you look at Chromebooks, most are 4GB, often 2GB. These are the mainstream new laptops that people will expect to run for 5 years+.


I think it's 'expected to run for <5 years', since the device has 5 years of software support. I find that to be a bit of a bummer, but it's still better than Android's 3.


Yeah, but that link is a Firefox hardware report. How many Firefox users are on Chromebooks?


I find it extremely sad that us developers really have no idea what our sites are used on.

Works great on my iPhone X and iMac Pro? Perfect!


> 4K doesn't even get listed, presumably lumped into the "Other" category.

Could the chart be reporting CSS pixels rather than hardware pixels? Macs are only 7% of total Firefox users but the majority of Mac hardware has a "Retina" display with a higher resolution than these.

BTW, 2560x1440 is listed at around 1%, but it's not visible because the line and text are black on a black background.


I've been using 2560x1080 for almost a year now and will never go back. 21:9 is the way to go.


What size is that? I run a 34" 3440x1440 21:9 and it's nearly a bit too wide.

EDIT: https://www.asus.com/us/ROG-Republic-Of-Gamers/ROG-SWIFT-PG3... 100Hz is awesome, even for just browsing the web.


I have the same monitor and it is the first time I've started to run Windows apps more in actual windows than full screen/full screen-like. Too many sites waste a bunch of horizontal space on it, so makes sense to shrink the window.

Great monitor though. Both for gaming and for general usage.


It's 29". I've had many monitors, both bigger and smaller, and this is by far the best I've had for gaming and everything else.

Exact model:

http://www.lg.com/us/monitors/lg-29UM68-P-ultrawide-monitor


Sounds like an April Fools joke/Onion article: BREAKING: Bay Area developer discovers that California is surrounded by the rest of the world.


> * In 2018, the most popular memory amount for users of Firefox is still 4GB, trailed by the also disappointingly small 8GB. In my world, I consider 16GB a bare minimum, but at least with memory I haven't been deluding myself—I know many people economize on memory. Still, I would have thought 8GB was the most common memory size by now.

In 2018, my computer only has 2GB RAM and runs more than fast enough for what I need on a 32-bit processor. I don't understand why some people always want bigger hardware numbers, rather than better software.

> * A surprising number of people still have Flash installed.

I do too, I don't see how it's abnormal.

> I can barely conceive of the poor experience of running a modern web browser on 4 GB of memory at 1366x768. Can you imagine the user experience of today's modern over-JavaScripted web pages on such underpowered hardware: not simply slow because of their grotesquely large transmission payload but also because they caused your computer to start swapping memory pages to disk?

I can barely conceive of how narrow-minded and myopic your technological experience must be that you feel only the latest and "greatest" is viable. As someone who browses the Web regularly, I spend a lot of time on places like HN, YouTube, etc. My screen resolution is 1440x900 - one step above the 1366x768. As I said, 2GB RAM, etc. This is why I've argued, constantly, against the nonsense of Chrome and "modern" Firefox. That's why users of old computers are going to Pale Moon if they know about it, or back to Internet Explorer, or just not upgrading to the newest Firefox.

Your technological experience is very much a high-class experience, but you have to understand that there are users all over the world and in all economic positions. All this push toward expecting the best and most modern only works to force out the common person - in America it's the poor person in the ghetto, who is still well above the position of people in other, even less prosperous nations. Most people, if they are lucky, have a used <$200 laptop which is often a decade or so old. Library computers are often almost as old - I know two libraries which are still running their 0.5GB, 32-bit Windows 2000 computers as patron-access machines because they are in poor areas and simply cannot afford to upgrade.

People who have the view that everyone should have base access to the latest, greatest, and most wonderful hardware upset me greatly. I have one comment about it that seems constructive. If you think it's so important that they should have this, you should provide it. If you can't give every person who has an old reliable computer (that they can repair) the newest bleeding-edge tech (and give them service when it inevitably breaks), then you need to understand that they have valid concerns too - and help provide better software and help others like you to understand what reality looks like for most of the world.


> That's why users of old computers are going to Pale Moon if they know about it, or back to Internet Explorer, or just not upgrading to the newest Firefox.

Malware authors rub their hands gleefully when they see sentiments like this being spread online by people like you.

Don't follow any of those suggestions. Disable multiprocess mode in the Firefox config if you find it eats too much memory, stick with the ESR if you're scared of upgrading, and use Edge over IE.
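
For reference, at the time the usual way to "disable multiprocess mode" was a pair of about:config prefs. The preference names below are from memory for roughly 2018-era Firefox (newer releases ignore the e10s opt-out) and the profile path is a placeholder, so treat this as a sketch and double-check in about:config:

  # append these prefs to user.js in the Firefox profile directory (placeholder path), then restart the browser
  ~$ cat >> ~/.mozilla/firefox/xxxxxxxx.default/user.js <<'EOF'
  user_pref("browser.tabs.remote.autostart", false);  // disable multiprocess (e10s) entirely
  user_pref("dom.ipc.processCount", 1);               // or just cap the number of content processes
  EOF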


Please show me a download for Edge on a 32-bit WinXP PC.


If you are running XP (32 bit at that!) then your choice of browser is the least of your security issues.

In any case you must be at least airgapping it (right?), so I guess it's academic.

Edit: you might be onto something here. Running an old, outdated browser on an old, unpatched 32 bit windows xp box might trigger some anti analysis defenses on whatever malware is sure to come your way. Security through being too obviously insecure!


This isn't specifically my situation.

But I know three families in low-income situations for whom that sort of PC is the only one they have, because they got it from either an office or a school when the place was discarding computers. So no, it's not being air gapped. It's the computer the kids look up homework info on, it's the one the parents use to search for jobs, and more. They can't afford to upgrade, and they're afraid that even upgrading the software would make the whole thing come apart at the seams.


There is no real need for 4K resolution; that's why it's not so popular.


People that think they don’t need 4K either haven’t seen a 4K display in person or have poor eyesight. It’s a leap like upgrading from HDD to SDD, in the display world.

Unless you just need your monitor for gaming. New-gen temporal antialiasing mostly makes up for the lack of pixels.


I have a system with a 7200 rpm HDD and a system with an SSD. Not much difference. It was a waste of money, just like a 4K display.


I don’t believe you. There must be something pathological with your SSD or something else in your system.

SSDs have an order of magnitude faster random access performance (both read & write) and you can measure that. NVMe drives additionally increase parallelism by a significant margin. Typical workloads are very sensitive to this metric. At least at cold boot you should see significant improvement in performance.
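
One way to actually measure that, sketched with fio (the parameters are illustrative and the test file path is a placeholder; run it once against a file on the SSD and once against one on the HDD):

  # 4k random reads with direct I/O for 30 seconds
  ~$ fio --name=randread --filename=/path/to/testfile --size=1G \
         --rw=randread --bs=4k --iodepth=32 --ioengine=libaio \
         --direct=1 --runtime=30 --time_based

Compare the reported IOPS between the two runs; small random reads are where the gap shows up most clearly.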

Font rendering, likewise, you can just see[1]. See the full page[2] for more examples.

If you have good eyesight, you will see a difference. I could see how 4K to “retina” (>260 PPI) might not be worth paying extra to some (though I personally want the highest PPI I can afford), but 96 PPI to ~185 PPI is a world of difference. It goes from blurry to sharp.

That’s not all—because LoDPI text rendering often involves subpixel rendering techniques, HiDPI also brings significant improvement in contrast.

For crying out loud, some people even get headaches from the blurry subpixel rendering! [3]

[1] https://www.eizo.be/eizo-pixeldichte-im-zeitalter-von-4k-e5....

[2] https://www.eizo.be/knowledge/monitor-expertise/understandin...

[3] https://www.w3.org/WAI/RD/2012/text-customization/p7.html


The HDD is fast enough for my needs, just like Full HD resolution. Of course there may be a difference, but only in very specific cases. So, just a waste of money, as I said.


Alright, you might not appreciate it, but it’s not a waste of money.

Maybe that’s arguing semantics, but to me a waste of money is something like a $150 HDMI cable… Or 64GB of RAM you’ll never use. Things that you can brag about but that don’t actually improve your life in any way at all.

However, the $100 extra I’ve spent on a 4K vs FullHD monitor was, I’m certain, one of the best purchasing decisions in my life.

My 4K display[1] makes spending time productively so much more pleasant, and so much less tiring, both on eyes and mind. And it just looks gorgeous.

The first time experience of watching BBC’s Planet Earth in 4K on this thing was mind-blowing. It must look absolutely incredible on an actual big screen with full range of HDR.

I also have a very cheap Chinese netbook with a 3K display (260 ppi), and I have, on occasion, used it to read books etc. I would have never even imagined doing any extended reading, of any kind, on my old laptop. The font would need to be huge to make that experience at all bearable. At least that’s an option for something like ePub or HTML, but a lot of things are available only as a PDF. On that little netbook it’s no bother to have multiple columns of tiny text on the screen, all at once, just like in a real book.

I wear glasses and with the old LowDPI devices I was always unsure if I need a new prescription or if it’s just the crappy display.

As for SSD I can strongly recommend getting one in addition to your HDD. For my workstation, I’m running a bcache setup (in writeback) with a 60GB old SATA3 SSD + 1TB HDD RAID1 and I’ve been extremely pleased with that so far.

I can only tell I’m not running an actual 1TB SSD right after I do a major upgrade. It takes at least one cold boot to promote files in the boot path to cache[2]. Every time that happens it reminds me just how much faster SSDs are for certain things.

Nevertheless, you almost get to have your cake and eat it too. Bcache (or dm-cache, a fine choice as well) will intelligently sync all the randomly accessed and most used data with the SSD and stream anything bigger (like movies etc.) directly to/from the HDD.

In writeback mode it will also use the SSD to buffer random writes, until they can be written sequentially, all-at-once, to the backing device.

It makes both your SSD and HDD so much more valuable. Strongly recommended :)

[1] http://www.lg.com/us/monitors/lg-24UD58-B-4k-uhd-led-monitor

[2] It’s something that could be solved, though. An explicit interface to promote certain extents would do the trick. So far I think only bcacheFS offers something like that.
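
For anyone curious what that looks like in practice, a minimal sketch of this kind of bcache setup follows. Device names are placeholders, make-bcache destroys existing data on the devices, and a dm-cache/lvmcache setup looks different; treat it as an outline rather than the exact commands used above:

  # SSD as cache device, HDD (or md RAID1) as backing device -- this wipes both!
  ~# make-bcache -C /dev/sdb -B /dev/md0
  # the combined device shows up as /dev/bcache0; create a filesystem on it
  ~# mkfs.ext4 /dev/bcache0
  # bcache defaults to writethrough; switch to writeback as described above
  ~# echo writeback > /sys/block/bcache0/bcache/cache_mode

Writeback gives the biggest win for random writes, with the usual caveat that dirty data lives only on the SSD until it is flushed to the backing device.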


Moving from a 7200 rpm HDD to an SSD would have made a big difference. Your boot time and program load times would have been cut in half or better, and a lot of inconvenient actions like file search would have become easy enough to do frequently. I can only guess that you do almost everything via the web on the newer machine, to not notice much difference.


That's not a need though. My computer needs a CPU and by extension so do I, but at the end of the day, a 4k display is a nice to have.


Sure, but once you use one for even 15 minutes, it pretty much becomes a need ;)

That’s why I have an adamant rule about never trying hardcore drugs or displays capable of refresh rates higher than 60 Hz.


I went back from a macbook pro 16gb to a macbook with 8gb. I do iOS, web, hardware and TV development. I have not had a problem. I can have xcode, eclipse, firefox, safari, mail bbedit and a few other programs running fine.


How is 16GB bare minimum when it's the max you can get on most laptops?


4k is overrated. 27" 1440p is good enough, plus current GPUs are able to push 100+ FPS with such resolution (they fail miserably with 4k resolution).


> (they fail miserably with 4k resolution)

I use an external GTX 1070 over Thunderbolt 3 on a 2017 Dell XPS 15, and have 2 2160p monitors. 4K gaming is no problem at the refresh rates that 4K offers (usually 60Hz)


> 4K gaming is no problem at the refresh rates that 4K offers (usually 60Hz)

Subjective. Depends entirely on the settings you are using and the resulting frame rate at that resolution.


Most corporations can't afford to upgrade hardware and most likely run very old hardware with Windows XP or, if they're lucky, Windows 7.


640k ought to be enough for everyone


I will hope this post is ironic and move on.


Satya is having enormous financial success in chasing much needed markets, but the jury's still out on ditching MS's strengths in large software development in favor of a no-support permanent beta culture at a moment when the latter is showing its ugly side.


I kind of expect that the "let consumers be the beta testers and fire all our QA" approach is going to majorly bite them in the ass at some point and lose a lot of goodwill.


The thing I don't fully understand about Nadella's strategy is what Windows is supposed to be any more. Clearly if all these whizzy subscription-based online services are going to be useful, people need a device to access them. For simple communications and information consumption, the modern generation of devices like smartphones and tablets works just fine, but for anything more involved or with a significant creative element you still need something more like a PC.

However, it looks like Microsoft is isolating the online services from the client platform in order to de-prioritise Windows, which inherently makes Windows itself more replaceable. That creates a huge potential gap in the overall tech market, particularly once Windows 7 reaches end-of-support and a lot of people are going to be wondering about what they can do if they don't want Windows 10. (After all, if they did want Windows 10, why are so many people still not using it?)

There obviously aren't that many businesses that would be in a position to exploit such a vulnerability and start building not just a viable alternative client OS but a viable alternative ecosystem around it, but there are some. Apple could be a threat from one direction. Traditional big tech companies like IBM could be a threat from another. With the foundations available with modern OSS stacks, an unexpected alternative from (or sponsored by) the likes of Red Hat might be less likely but not totally out of the question either, or even a curveball from the likes of Intel.

So, what if one or more of them really did produce a compelling client platform and moved the whole industry back in the direction of local software and in-house operations? If the current focus on Cloud everything wanes, but Microsoft has voluntarily given up its long-standing dominance of the more traditional parts of the industry by neglecting Windows as a platform in its own right, where does that leave their online services five or ten years from now?

Maybe the likes of Satya Nadella are just much better connected in the big boardrooms than a mere geek like me, and they're confident that the emphasis on Cloud everything isn't going anywhere any time soon. But now we've been doing that for a while and the gloss has worn off a bit with all the real world problems of reliability, security, leaks, costs, and so on, I can't help wondering whether Nadella is mostly playing to Cloud-obsessed big investors and at some point those investors will prove to have been throwing good money after bad because they didn't know any better.


> So, what if one or more of them really did produce a compelling client platform and moved the whole industry back in the direction of local software and in-house operations?

I've been thinking about this too but I don't see who would stand to gain from it.

If Wine got some serious commercial backing it could be leveraged to run all "legacy" Windows apps on another platform (Linux). Desktop Linux is relatively far along but with solid commercial backing it could become a serious contender. Or something more exotic...

The only question is: cui bono? Who would have a reason to go that route?


The only question is: cui bono? Who would have a reason to go that route?

Someone like Valve, who has a massive investment in Windows software in areas that Microsoft has specifically targeted for "embracing."

What Microsoft doesn't seem to understand is that when they screw with Windows and deprioritize the needs of its users, they're also screwing with the rest of us who live, work, and play in their ecosystem. Gabe Newell, being an old-school 'softie himself, understands that very well... and he's one of the few interested parties with the funding to do something about it.


"What Microsoft doesn't seem to understand is that when they screw with Windows and deprioritize the needs of its users, they're also screwing with the rest of us who live, work, and play in their ecosystem."

That is the part that baffles me. Visual Studio Code is a fantastic advert for the Microsoft brand: every day it shows every developer who uses it that MS can produce genuinely great software. It makes Azure more credible, as well as providing on-ramps to services. If people have a bad experience with Windows, why would they be inclined to get other services from MS?


Visual Studio Code is a fantastic advert for the Microsoft brand ... If people have a bad experience with Windows, why would they be inclined to get other services from MS?

Ironically, we decided not to try out VS Code specifically because of privacy/security concerns about Microsoft that have been heightened since Windows 10. We checked Microsoft's written policies to see what sort of telemetry data was being uploaded. We were unable to find any detailed information about it, and the general wording appeared to allow for uploading more-or-less anything, up to and including our own code.


Put:

"telemetry.enableCrashReporter": false, "telemetry.enableTelemetry": false,

into vscode settings and you are fine wrt telemetry.

What concerns me more is that the .rpm build is consistently later than the other builds. It usually means the user is staring at an update notification for 2 out of 4 weeks every month, one that can't be acted on because the build isn't ready yet.

It brings back memories of MSIE and WMP, which used to be available for Solaris and HP-UX, always slightly later... until they weren't and the users were left in the dark.


Put: "telemetry.enableCrashReporter": false, "telemetry.enableTelemetry": false into vscode settings and you are fine wrt telemetry.

... for the moment, as far as you know.

The problem is the trust thing, not the telemetry thing.


The problem is the trust thing, not the telemetry thing.

Yes, exactly, and trust is hard to earn but easily lost. Given the corporate philosophy of Nadella's Microsoft, and the fact that Nadella was made CEO in the first place when it was pretty clear what style of leader he was going to be, I don't see Microsoft ever regaining the kind of trust we used to place in it without fundamental changes that start at the highest levels. And sadly, as long as investors continue to buy into that vision, those changes are highly unlikely.

It's not often I feel so strongly about a business, but Microsoft's actions in recent years almost feel like a personal betrayal after being their customer for so long and trusting them as a keystone of the tech industry. In this case, I really do regard them as "the enemy" now, and I really do hope that someone else will drive a wedge through a gap created by their customer-hostile strategy, with painful and expensive consequences for their management and investors.


It's difficult to predict who might see what potential advantages in shifting the market back in that direction.

The claimed benefits of the Cloud have always been as much hype as substance, and a lot of the "sales pitch" in the early days has now been debunked. We're not going to run short of stories about major Cloud services having significant downtime, poor support, data leaks and so on any time soon.

Likewise SaaS is a great pitch if you're a software developer, and converting capex to opex makes it attractive in some contexts as a customer as well. However, I get the feeling that patience is starting to wear a bit thin in some quarters now, with customers looking at the overall costs they're paying, how many improvements they're really seeing in return, and also how many unwanted changes are being pushed out.

Depending on where you stand on those issues, combined with the general dumbing down associated with many online apps compared to traditional desktop software, I can imagine a variety of situations where businesses might be looking at keeping more IT in-house as we go round the cycle again. For personal use, I suspect there's less of an incentive because many of these users are quite happy with just using their tech for basic communications and information retrieval, but for those who do want to do more creative things or who enjoy more demanding applications like games, again there's surely a big enough market for a serious desktop OS with an emphasis on good management of local native applications and not just being an expensive thin client for online services.

So who potentially benefits from an industry shift? I guess the simple answer is anyone who is interested in developing software for the markets I just described.


> I kind of expect that the "let consumers be the beta testers and fire all our QA" approach is going to majorly bite them in the ass at some point and lose a lot of goodwill.

I wish I could agree with you. However, the QA layoffs at Microsoft were in 2014; if something like that was going to happen, it would have happened by now.


I am beyond sour on the Surface line -

I championed it when it came out, and despite being in a hard position to pick one up, I managed to do so.

I deeply deeply regret it.

If anything I have gone from being a massive supporter to actively doing what I can to spite the brand.

The product is an expensive laptop with critical issues that render it inoperable.

Then when you try to talk to MSFT support, they are about as capable as toddlers: politely unaware and unhelpful.

The Surface line went from having a great rating to having that rating revoked, since it is unrepairable.

I want to love my device, and I am using it right now, but there are serious issues with it, and this is a device that has been placed in the same category as Apple products and their legendary support reputation.


They also will never drop Windows, because it's probably going to be an extremely useful platform when the next generation of devices comes along; think of AR.

The fact that they dropped the mobile market (where Windows couldn't fit, as very different hardware was involved at the beginning) doesn't mean they will miss the next shot too.

They will have an entire ecosystem up and running, and Windows 10 is flexible enough to be tailored to whatever comes. Of course it would be crazy for them to drop it now.


The fact that they dropped the mobile market (where Windows couldn't fit, as very different hardware was involved at the beginning) doesn't mean they will miss the next shot too.

Every single incarnation of windows mobile / phone was well tailored for the mobile hardware of the day.

What sunk windows phone was not that it was slow, it was that it was late to go touch-first and failed to differentiate enough to make up for it. That and tightly controlling the vendors while up against android made the quality of windows phone itself irrelevant.


You seem to be suffering from the same delusional thoughts that affected Myerson.

This is such an utterly clueless explanation of why Windows Phone failed that it’s kind of stunning. Until, of course, you remember the culture-induced myopia I described yesterday: Myerson still has the Ballmer-esque presumption that Microsoft controlled its own destiny and could have leveraged its assets (like Office) to win the smartphone market, ignoring that by virtue of being late Windows Phone was a product competing against ecosystems, which meant no consumer demand, which meant no developers, topped off by the arrogance to dictate to OEMs and carriers what they could and could not do to the phone, destroying any chance at leveraging distribution to get critical mass…

Interestingly, though, Myerson’s ridiculous assertion in a roundabout way shows how you change culture…In this case, Nadella effectively shunted Windows to its own division with all of the company’s other non-strategic assets, leaving Myerson and team to come to yesterday’s decision on their own. Remember, Nadella opposed the Nokia acquisition, but instead of simply dropping the axe on day one, thus wasting precious political capital, he let Windows give it their best shot and come to that conclusion on their own.


Well, that's a very myopic view of what happened to windows mobile. Windows mobile and blackberry owned the mobile enterprise market for years. Then the iPhone came around and despite not having any ecosystem at all (or even native app capability) immediately started displacing both of them rapidly. Microsoft and RIM both gambled for a long time that ecosystem would keep windows mobile and BB OS in place, and only realized that it wouldn't once iPhone and Android had taken most of their market away. Microsoft then concluded from this that ecosystem didn't matter as much as people thought it did, and that if their offering was good enough the ecosystem would materialize by itself. Windows phone was never good enough compared to iOS and Android for that to happen, so it failed. But, the underlying theory was not crazy given what had just happened in the enterprise mobile market.


I still can't wrap my head around everything that was wrongheaded about Windows 8, and I was working there for most of its dev cycle.

One wonders what would have happened if they just sort of let Windows be Windows. I.e. if they had continued iterating on core stuff but left UI and general philosophy resembling Win7's trajectory and not tried to force people into WinRT/UWP/store.

Even though Win10 attempted corrective action it always struck me that it was still accepting the fundamental thesis of Win8. They still kept with the Orwellian redefinition of phrases like "Windows app" to mean very recent, immature and unproven technology. They were still talking about ARM devices that don't let you do a straightforward recompile of old code. They were still pushing the Store as the only means to distribute software, where even Apple has not succeeded in changing people's habits on the desktop.

I hope that with a weaker organization, people keeping the lights on for Windows will know to not shake things up too severely and ease up on pushing some of these silly ideas. I somehow doubt it. I've been hoping for that for... 7 years?


> One wonders what would have happened if they just sort of let Windows be Windows. I.e. if they had continued iterating on core stuff but left UI and general philosophy resembling Win7's trajectory and not tried to force people into WinRT/UWP/store.

With Windows 8 they tried to do too much, too quickly, and not well; however, their motivations and intentions were spot on, in my opinion. Security has always been a pain point for Windows, and it has only been getting worse; the Store introduced a sandbox like that found on iOS and Android. ARM, iOS, and Android showed that modern hardware and software can produce power efficiencies that put x86 and Windows to shame. UWP introduced hardware independence and a modern development framework that would put Windows apps on a level playing field with iOS and Android.

The problem was that Microsoft thought that merely by virtue of their existence, developers would adopt UWP/Store despite the path forward being a huge pain in the ass. We see how that played out. With Windows 10 we've seen a course correction and a change in strategy. You can have traditional Applications sandboxed in the Store, and you can have UWP Apps outside the Store. Microsoft bought Xamarin and is folding it into UWP, the intent is to not just have hardware independence with UWP but also OS independence.
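
To make that "OS independence" idea concrete, here is a rough Xamarin.Forms sketch of the kind of single page definition that can be rendered with native controls on UWP, iOS, and Android; the class name and strings are invented for illustration, not taken from any Microsoft sample:

    // Hypothetical cross-platform page; the same C# class is rendered
    // natively on UWP, iOS, and Android by Xamarin.Forms.
    using Xamarin.Forms;

    public class GreetingPage : ContentPage
    {
        public GreetingPage()
        {
            Content = new StackLayout
            {
                Padding = 20,
                Children =
                {
                    new Label { Text = "One page class, three platforms" },
                    new Button { Text = "OK" }
                }
            };
        }
    }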


Is there any metric that doesn't show the Windows Store to be an abysmal failure? Have any Windows Store apps gone viral? Despite the enormous installed user base, there seems to be a definite lack of excitement around Windows apps (other than games) compared to the Android and iOS app stores. And for games, Steam feels like it's more dominant.


No, but recently I started using the Windows Store version of the Arduino IDE. It's been a lot less trouble to set up and deal with across devices. If more apps moved to the Store, I would consider using them over managing updates across all of my devices.


Has security been getting worse on windows?

That’s a bold claim presented without evidence.


> and you can have UWP Apps outside the Store

IFF your user has enabled side-loading.


that's... not what most users want. Even as a dev myself, I understand that most people just want freedom and no hassle: an OS that stays in the background, not one that confines its own users to a narrow, approved alley.

I will keep my Win7 for as long as the hardware and software keep running. I don't expect much more from an OS, honestly.


> that's... not what most users want. Even as a dev myself, I understand that most people just want freedom and no hassle: an OS that stays in the background, not one that confines its own users to a narrow, approved alley.

Android and iOS have demonstrably shown that most users don't value freedom as long as the OS works for their needs. Most users are security nightmares as IT Admins and relatives supporting family members can attest to. The Store is Microsoft's attempt to introduce that stability and safety to the wider Windows audience.

That's not to say that there isn't a significant subset of power users who want the ability to do what they will with their machine however they are the exception.

Microsoft is struggling with how to cater to both markets simultaneously.


There's a big difference between phones and tablets and productivity machines like desktops and notebooks.

The former are designed to be casual, and their apps grew out of a situation where there was previously nothing, while desktop OSes have had decades of external software being installed regularly. Introducing something new, especially when it opens up so much potential, is far easier than trying to change the existing habits of both consumers and vendors on an established landscape.


And still, the Mac OS app store failed as well.


IMO iOS’s success comes more from the quality of the applications than the security model per se.

The iPad is a good case showing how the same model doesn’t succeed without top class apps.


I second this.


I feel like they started gathering metrics and then had no idea how to interpret them, and the result was Windows 8. "The Start menu only gets clicked on a few times per day. That must mean no one uses it, so let's remove it". Are you kidding me? It's like they really have no clue there. Some of these things are just so blindingly obvious you are left with no words...


Yep. If this was a car, they’d have said: “clearly the brake pedal is needed but let’s get rid of the door handle, people only touch that occasionally”.


It would have helped if they hadn't done everything they could possibly imagine to fragment Windows as much as possible. Six SKUs of Windows desktop, then on top of those two entire desktop UI environments in desktop and Metro, then Windows on ARM, then shifting from one new development stack to another, then another. Then mandating store apps. This was foot-shooting on full automatic, clip after clip. I switched to Mac desktops in 2007, before Vista, so I watched all of this with stunned bemusement and to be honest have little idea how it affected users and devs reliant on the platform, except from reading the occasional article by Paul Thurrott.


> Six SKUs of Windows desktop

Here's hoping they'll rethink that at some point and follow Apple's approach of having just one desktop OS instead of nickel-and-diming the masses for features such as BitLocker.


On the other hand, compare Windows desktop market share with that of macOS. When you're tiny, limited choice makes sense. When you're huge, product differentiation makes sense. When you're a premium product, you can charge whatever you want. Windows appeals to people who spend $0 on an OS and also people who can spend $300 on an OS. If they cut that down to one product, they'd be losing money.


This is true, but fragmenting your platform is an existential threat. Nickel and diming customers just isn’t worth it. The problem is Microsoft thought they were so powerful and had such strong lock in that it didn’t matter. They were wrong, hence the radically simplified SKUs for 10.

But they just can't bear to actually get it all right, so they still try to stiff people to switch off 'S' mode.


There are two constant promises with each new Windows:

- There will be fewer SKUs

- Updates won't require restarting the OS

Well ...


>One wonders what would have happened if they just sort of let Windows be Windows.

Presumably, given that the opposite has happened, in that case, "targets" would not have been met by "Q4" and important team members would have to be "let go".

As you point out, this is not new. I would pin the start of this trajectory on the introduction of Windows Activation with XP. It's all variations on the theme of restricting the user's freedom. It hasn't been your software since 2001. It's theirs.


It felt like somebody at a very high level decided to have a unified experience for both tablets and desktops. Then the people who said this was impossible to do (well) in the given timeframe were ignored. Windows 8 was what they managed to scrape together before running out of time. Delay was not an option, as the tablet hardware launch depended on the OS.


One thing I love about Windows 10, which may sound silly: the Netflix app. I can download anything downloadable on Netflix. I can't do this basic thing on my MacBook Air, sadly, because they have no desktop app. It's a small thing that makes me enjoy Windows 10.


I honestly don't know the answer to this, but is that a Microsoft/Apple problem, or is it a Netflix problem? Apple has an app store for macOS, and there are tons of unofficial Netflix apps in there. Netflix has an app for iOS. So why hasn't Netflix made an app for macOS?


I don’t know either, but I wouldn’t be surprised to discover Netflix thinks people with Apple laptops are likely to have an iPad or iPhone as well.

With the possible exception of my non-Retina iPad mini I’d rather watch Netflix on any of my mobile devices. It never even occurred to me to look for it on my laptop, because the form factor is so ridiculous for movie watching.


Very possible. I've watched movies off my desktop for years, and my network isn't always reliable, so being able to pre-download shows on my Windows desktop is perfect. It returns me to my experience pre-Netflix/Hulu. I wish the streaming services weren't getting so fragmented; they're not helping themselves with their greed, they're only drawing people back toward piracy all over again. I'm speaking, of course, of The CW. I'm not paying to stream only one network just to watch Arrow and co., dang it!


As a developer, the Windows target is baffling. I briefly considered porting a few apps I wrote over to Windows just to re-learn the platform again. Might be fun! Back in the day, the choice was easy: MFC/Win32

Now, I have no idea where to even start! C#? C++? .NET? Win32? MFC? WPF? XAML? WinForms? UWP? What a rat's nest! How did MS let the development ecosystem get so muddied? I know I can go online and research all of these, learn the trade-offs, and hope I bet on the right horse, but then again I could also choose to simply leave this crazy platform alone and work on more features on the platforms that are more comfortable.


Here's how simple I think it is:

Out (but not gone/forgotten): Win32 (hand-coded C, C++ MFC, C++ ATL and related alphabet soup [aka back in the day the choice wasn't easy, either], VB6, .NET WinForms, etc), WPF/Silverlight (a .NET only XAML platform)

In: UWP (C++ with XAML, C#/.NET with XAML, JS with HTML/CSS)

Just three UI platforms to consider (Win32, WPF, UWP), two of which are mostly deprecated and not recommended if you are building a fresh application or porting one to a new UI. The remaining platform (UWP) gives you language choice (C++, C# or other .NET languages, JS), but not UI design framework/markup language choice (XAML, unless JS and then HTML).

If you are doing a fresh port of a C++ application from another platform, the choice seems pretty simple: target UWP and use XAML to define your UI.
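
For a rough sense of what that platform looks like, here is a minimal, code-only C# sketch of a UWP entry point. It's a toy assuming a standard UWP project with the Windows 10 SDK referenced; a real port would define the UI in XAML as suggested above, and the names below are placeholders:

    // Minimal code-only UWP app: Main starts the Application,
    // OnLaunched puts a trivial UI into the current window.
    using Windows.ApplicationModel.Activation;
    using Windows.UI.Xaml;
    using Windows.UI.Xaml.Controls;

    sealed class App : Application
    {
        protected override void OnLaunched(LaunchActivatedEventArgs e)
        {
            Window.Current.Content = new TextBlock { Text = "Hello from UWP" };
            Window.Current.Activate();
        }

        static void Main()
        {
            Application.Start(_ => new App());
        }
    }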


Except that with Win7's dominance, your UWP app isn't going to run on most of your clients' machines for many years. In a corporate environment that sort of kills it from the outset. I think Microsoft made a big mistake by not making UWP Win7-compatible.


Windows 7 doesn't have "many years" left. Windows 7 has less than two years left in its support lifetime. (2020 is when extended support ends.)

In the meantime, if you, in a corporate environment, must support Windows 7 until the bitter end in 2020, you can use WPF as a bootstrap step towards UWP. The XAML is similar between them and a lot of the coding paradigms are the same (MVVM patterns), and if you architect well (.NET Standard 2.0 libraries) you can share 100% of your business logic between a WPF application and a UWP application.

Xamarin.Forms also has an open source WPF renderer if you'd prefer to further go the cross-platform route. (Xamarin.Forms supports UWP, iOS, Android, macOS.)

I agree that Microsoft should have had the .NET Standard, XAML Standard, and WPF Xamarin.Forms development stories in place sooner to better support developers who need to target older Windows, but many of those pieces are in place now, if that is what corporate developers were waiting for.
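
As a small, contrived example of the kind of code that shares cleanly: a plain INotifyPropertyChanged view model compiled into a .NET Standard 2.0 class library can be referenced unchanged by both a WPF and a UWP front end and bound from either one's XAML. The class and property names here are invented for illustration:

    // Lives in a .NET Standard 2.0 library; no WPF- or UWP-specific references.
    using System.ComponentModel;
    using System.Runtime.CompilerServices;

    public class CustomerViewModel : INotifyPropertyChanged
    {
        private string _name = "";

        // Both WPF and UWP XAML can bind to this, e.g. Text="{Binding Name}".
        public string Name
        {
            get => _name;
            set { _name = value; OnPropertyChanged(); }
        }

        public event PropertyChangedEventHandler PropertyChanged;

        private void OnPropertyChanged([CallerMemberName] string propertyName = null)
            => PropertyChanged?.Invoke(this, new PropertyChangedEventArgs(propertyName));
    }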


I'm not developing anything for the desktop right now, but when I think about it, the conclusion I come to is to give up on MS technologies and go with Qt. It's much more stable, and you get Linux (almost) for free.


Yea, this is where I ended up. If I’m going to port something to oddball non-Unix-based or otherwise less-important platforms, like Windows, Qt makes the most sense.


This reminds me of "How Microsoft Lost the API War": https://www.joelonsoftware.com/2004/06/13/how-microsoft-lost...


You forgot Universal Web Apps in your list, which, like UWP, is ironically among the least universal parts of the Windows ecosystem.


You start at Electron. Having used most of the GUI frameworks listed above, making a GUI app in HTML/JavaScript is faaaar easier.


They bet the farm on “store”. Why? Because they must have done the math and realized that the enterprise and “creative pro” desktop with classic expensive desktop apps is a dead end. Was that calculation correct? Who knows. But I have to think they did that analysis very carefully.


It was wrong. None of those apps have been replaced. I use them every day and switched to Windows 3 years ago. I was stubborn and should have done it sooner. Maybe MS saw Apple abandoning the pro hardware market and was trying to play along “because Apple”.

A fun result of the whole mess is that after a new Windows 10 install, the first thing I do now is open Store to install Ubuntu WSL.


I think they saw Apple’s success with the iOS App Store, saw Apple trying to do the same on desktop and tried to leap ahead of them by forcing the pace. Of course when app stores on desktop, for both Apple and Microsoft, proved much more problematic to implement, ramming it through like that proved highly damaging.


It's funny because even if you put yourself in the mind of 2009, you can imagine a much more modest store feature doing good things for end users.

The old Windows desktop might have had a half dozen things prompting you to elevate and update. Flash player, Adobe reader, Java runtime, all these things wanted your undivided attention to update, not to mention were polling the network and using resources to do that. So maybe you could see a single piece of infrastructure allowing those third parties to hook in less intrusively.

But that's not what they asked for. They would prefer you trash your app and write for the thing they came up with this year, to be thrown away when they do the rewrite the next year.


> They would prefer you trash your app and write for the thing they came up with this year, to be thrown away when they do the rewrite the next year.

That hasn't been true for over a year now. Classic Win32 desktop apps have been in the Store for nearly a year and a half. It's my preferred way now to install things like Office and Paint.NET. Even Photoshop is in the Store now.

(Even iTunes was promised at one point, and I'm still very hopeful for that, because Apple's Windows updater is still one of the worst updaters on Windows, where bad updaters were once an art form.)


I think it's more likely that there was a team, or more likely several teams, looking to make a name for themselves, and they concluded that forcing everyone in the Windows desktop monopoly to use their stuff was the way to get that work noticed. I am sure that a bunch of the people responsible were praised as top performers and rewarded for their victory. When the original people are cycled out, there is already inertia, and it doesn't really occur to anyone that the previous blunder needs to be rolled back.


Yes, but the solution would be to do something like Steam, and sell normal Windows software in the store.

Selling only WinRT software made all their efforts worthless.

Forcing developers to provide Win32 and WinRT versions of software alienated software developers.

Some of the calculation was correct. WinRT was not.


While I agree to a point, I do think the "wild west" nature of a lot of Win32 apps kind of prevented it, as they couldn't manage the security. If there were some way to wrap the Win32 apps in their own individual sandboxes without hampering function, then I could see it working. But with everything else they have to get going, I guess it is too tough a nut to crack right now (if it could be cracked at all).


The obvious counterpoint to that is Steam again.

They don't have to sell all existing legacy Win32 apps in the store, but they should at least allow some Win32 apps.

About malicious software: I think it would be legally very dangerous for a malicious developer to expose their personal and financial information to MS and expect nothing to happen when caught.


True, but I think (though I'm not certain) Microsoft's aim is security via sandboxing, so without thorough testing of the various Win32 apps, I don't see how it gets done. And I can just imagine the blog post from a Win32 developer when their Windows Store release breaks because it doesn't have the same access privileges as the proper version. Then users demand added security, but with no net impact on user experience/expectations, and all that.

And all it would take is the developer to claim that their machine had a virus unbeknownst to them that somehow made its way into their binary.

Dunno, Steam is in a position where they don't really have to care about the Windows brand but Microsoft does, so they have a bit more freedom than Microsoft in that.


>immature and unproven technology

Wasn't the old Windows desktop technology the one proven to be bad? I can say this because I like Windows, but people have made fun of Windows bugs and viruses for three decades now.

Granted, the Windows Store still sucks ass years after its introduction, but building a new ecosystem with unified, modern tooling was probably a necessary step for them to take.


There was nothing wrong with their core desktop technology. Vista at least allowed them to solve the security issues, and with Seven they had a really great, solid desktop OS. The problem was actively undermining all that with Metro and RT. What's popular and effective about 10 is all the crud they stripped off it, much more than anything they added, just as with 7.


> with seven they had a really great, solid desktop OS. The problem was actively undermining all that with Metro and RT.

I sometimes wonder if maybe they should have done a hard fork between 7 and 8, continuing to develop 7 for commercial users while 8 became the consumer version.

In the commercial branch, they could undo some of the moves they made years ago (like putting GDI in the kernel rather than user space) to eventually get a more stable, secure OS.


I think they recently moved tons of kernel-space code back into userspace (in recent versions of 10), probably because the quantity of discovered security vulns was starting to become really ridiculous.


The security issues aren't "solved" as long as the expectation is that apps run with full user privileges; that's why the AppContainer/UWP model was and is needed.


Define "proven to be bad". What is the metric there? And note also who considers it "proven bad". It was winning, and it was also the better UI at the time. It did indeed have bugs, but not really that many of them affected most users, who were fully comfortable restarting and moving on. The viruses were an issue, though.

The thing with "people making fun", though, is that a lot of it was a by-product of the culture war between the Linux and Windows worlds: between radical, ideological open source and closed source, where Microsoft engaged in a lot of ugly tactics; between techies who wanted tech focused on their needs and a commercial company that cared significantly less about the needs of admins, tinkerers and the like.

Meaning, the jokes were much more political and much less technical. They were often the result of people hating Microsoft (recall those ugly tactics).


>Wasn't old windows desktop technology the one proven to be bad? I mean I can say this because I like windows, but people have made fun of windows bugs and viruses for three decades now.

A lot of that stuff could have been (and is now being) fixed incrementally. Lots of that tech is available to Windows desktop applications without needing to switch to UWP. Some of it is UWP-only (except for a backstage pass for Edge, which tells you everything you need to know), purely to push developers to the Store.


> Wasn't old windows desktop technology the one proven to be bad? I mean I can say this because I like windows, but people have made fun of windows bugs and viruses for three decades now..

I feel like MS had mostly quashed these problems with XP - it was certainly not bug-free, but after a few years was a solid enough platform many used it all the way to (and probably past) its EoL in 2014.


or if Windows 8 had set one UI as the default for tablets and the other as the default for desktops. The OS was faster and had better load times. It was a far cry from the failure of Windows Vista. It was just the Start menu that really killed it for most people.


Remember at that time Windows 8 was the MS solution for tablets and MS was getting killed in the tablet space as iOS took off. Windows 8 wasn't officially out until a year after the first iPad. They had to get it out asap.


Windows 8 may have been a response to iOS popularity, but don't forget that Microsoft had been pushing tablets for a long time by then. They had a huge head start over Apple. If you want to include Windows for Pen Computing, then it was an 18 year lead that they lost overnight.


Apple released their own hardware pen-based handheld computer in 1993.

https://en.wikipedia.org/wiki/Apple_Newton

Windows for Pen Computing was released in 1992.


Yes, but Newton's OS had nothing to do with MacOS, being a completely independent system with its own API and even programming language (Newtonscript). Windows for Pen Computing was Windows.


Good point. I guess I would respond with the difference being that Microsoft developed the technology continually during that time whereas something like the Newton was a relatively short-lived product.

For the record, I had a MP120 and absolutely loved that machine.


To be fair though, it’s not that a desktop app store couldn’t work, it’s that Apple’s implementation and set of restrictions are infuriating.


They took risks with Windows 8 and I give them credit for that, but yeah, I'm glad they changed course because it wasn't working.


What strikes me is that their latest bet seems to be universal web apps. I have nothing against UWAs in themselves, but I very much doubt that the mass of Java/VB/C#/C++ developers who write the bulk of Windows apps will suddenly move to JavaScript. And if you've already ported your app to the web, why bother with UWA?


Agreed. If I wanted to be a web/javascript developer, I would already be doing JavaScript. But I don't want to.


This seems like a ridiculous strategy. Who would abandon an ecosystem that they own over 80% of? If they're spending too much money on it, stop adding features and harden what they already have. It's increasingly clear that macOS isn't business-level stable, and Linux isn't polished... No one has an Android device on their desk at work. Maybe if they stopped trying to take over the world they would realize that they already own a huge part of it. If they don't care, they should sell it to someone who does.


They're not abandoning Windows or the ecosystem. This is just the end of it as the tip of the spear/driving force for MS strategy. Nadella is (rightly, imo) focusing more on o365 and Azure.

One of the points the article makes is that Ballmer's MS would do things like not release Office for iOS because that would potentially eat into Windows marketshare. That no longer will happen. MS sees its more important to get o365 everywhere they can.

From my perspective, having switched small businesses over to Exchange Online and other o365 services, the future is bright for MS in that space. The SMBs we've switched over have gone from using a version of Office they purchased forever ago and an old Exchange license to paying 50-100 a month for Office 365.


I agree that getting Office to run on Android and iOS is a bright future, but that doesn't mean you put Windows in the corner with "some other stuff". Maybe there are too many people on it and they've gotten too comfortable? Spin it out into a company with half the staff and let it soak up its own sun.


The problem with Windows is that operating systems are a commodity. You shouldn't care about the OS. You care about opening your documents on your desktop and phone - you don't even care what program opens the documents.

Mac OSX, Linux+desktop (any of several), IOS, and Android are all operating systems that can do everything you need of an OS, except MAYBE run windows programs.

Which is to say Windows as a separate company probably cannot make it long term.


If that were true, Linux on the desktop would have already won and this decision wouldn't be controversial. I think what's preventing this is that making a productive desktop operating system is more difficult than we think.


The desktop matters less and less.

The only computer I own at home (other than ipad/iphone) is a Chromebook. But I lease a cheap VPS that lets me do "computer stuff" otherwise.


Not at all; if you want to consume a lot and create a little then your thesis is right, but the desktop is king for creators. There is a slew of very mature Windows-only software out there serving many fields of endeavour for which there are no competitors in either FOSS or other OSs.


Eh. Creator software comes in two tiers. The dilettante tier (e.g. lower-end Photoshop) is actively ditching the desktop altogether, while the higher end (think of Color, the motion picture colorizing system) either runs on Macs or on machines so dedicated that they might run Wintel but are not "desktops" anymore.

(Then there's scientific software -- Stata and Matlab and that stuff. But ten years ago Matlab, like gagh, was best served on higher-end minicomputers, and now everything is Python and can be run on a cheap VPS or on AWS.)

Besides the sheer gobsmacking convenience, there's also the fact that it's way more fun for bean-counters to have higher OPEX in exchange for lower CAPEX. It really kicks the llama's ass.


> There are a slew of very mature Windows only software out there serving many fields of endeavour for which there are no competitors in either FOSS or other OSs.

Can you list some? That sort of software is my favorite genre and I'm always interested to hear about different ones.


Apologies for the delay in replying. I use MS Project and an add-on called SSI Tools daily, both essential for CPM based project management. Also MS Outlook plus Clear Context for PIM and communications functionality. There's also MindManager which is a highly functional mind mapper plus a couple of plug ins for it.


No. Linux on the desktop has its own specific list of pros and cons. Windows can be better by enough, even if inertia is its only advantage.


Isn't that my argument?


You seem to be arguing that is enough to make windows not a commodity.


> Windows as a separate company probably cannot make it long term.

That is surely overstating it, at least for definitions of "long" meaning <= 20 years. If you assume that Windows is more or less "done" their engineering expenses could be low, and revenue significant, merely serving the corporate and various niche (gaming, etc) markets which are still heavily bound to Windows. Their market share is declining in some niches (such as CAD) and maybe even overall (to OS X), but negative revenue and market share growth does not necessarily mean quick extinction. Some dinosaurs take decades to bleed out.


Are you kidding? I'd be ecstatic to go purchase a new boxed copy of Windows 11 Pro 2018, so long as it was basically Windows 7 shell with Windows 10 kernel and security updates, no UWP/Cortana/store/telemetry addons, and settings that stay set.

I'd even buy the Windows 12 Pro 2021 version when the time came for fundamental internals shifting like wider WSL access.


You might but no one else would. Most people use the OS that came with their computer until it dies.


You might but no one else would.

I would. An update to Windows 7 that worked well with modern hardware and had modern security updates but otherwise ignored almost everything Microsoft have done in more recent versions would be a significant improvement on anything currently available for those of us who value stability and security over bells and whistles.


I would and have. My last PC was built with Vista. I bought 7 when it came out and then 8. I'm far from the usual user though. My point was that regular people will not.


Not saying that you are wrong, but Office for iOS was in development long before Ballmer left, so "never" isn't as true as "not soon enough."


> Nadella is (rightly, imo) focusing more on o365 and Azure.

Windows is a perfect example of no focus at all. Maybe this makes sense for a true mainstream OS.

To be honest, I wonder if Azure, Exchange Online or Office 365 can really be their future. Why not use AWS, Scaleway or a local service? It's cheaper and better.


> macOS isn't business-level stable

Amazing how many people here are missing the point. This is not about Windows vs. macOS. The client OS has become a commodity with prices approaching zero (macOS no longer charges for upgrades, and neither do iOS or Android), and users care less and less about how they get to Facebook.

Think about the server/mobile space. If MS tried to hang on to their dominance simply via desktop lock-in (because it worked in the past), that would be their Kodak moment. All Nadella is doing here is emphasizing efforts to stay competitive in cloud/o365, which is where he thinks they can continue to grow and bring in revenue from enterprise clients. Fewer and fewer companies want to lease/run their own Windows/database servers, and with technologies like containers a lot of companies are now going with open source solutions.


Established businesses aren't going to put their data in someone else's datacenter. Small to medium businesses perhaps, but all it takes is one downturn that tightens the purse strings, or one critical event that reveals their absolute dependence on a third party. SAAS solutions are fair weather, and while we've had a lot of good weather I don't expect them to be used for critical business functions in the long term.


> Established businesses aren't going to put their data in someone else's datacenter.

No, established enterprises (both private and government) are, in the real world, important customers for cloud offerings like Azure, GCP, and AWS.


During an AWS demonstration at work, the Amazon guy specifically mentioned that AWS is a more lucrative option for large companies than small ones, simply because it's an easy one-time way of culling large teams in big companies who were adding very little to nothing in value in return.


Hmm, not convinced about that.

My boss is smart, and he'd wonder what the business plan would be should, say, 5 years down the line, you want to move away from the cloud for some reason.

You've thrown away decades of accumulated knowledge about the company and its systems to save some money upfront.

He thinks on a longer scope than most bosses I've known though (largely because his family own the group so he doesn't have to answer to shareholders).


But if your business is to produce, say, cars, why would you want to maintain this knowledge in-house? You certainly want to understand your systems, but do you really want to deal with physical datacentres, updating the underlying OS, replacing drives, etc.? You kind of want it outsourced to specialists.


Yeah, that ought to be a great sales line.

I'd be wary about how the real world actually deals with that, but there is no lack of large companies jumping on short-term cost cuts without looking at the risks.


From the point of view of upper management in a large enterprise, paying a third party (with whom they have an SLA and someone to sue) for infrastructure that may go down, versus paying their own staff to build and maintain infrastructure that may go down, often either isn't a difference in risk at all or, if anything, the former seems to be the better-managed risk. It's the same reason they often prefer to buy support contracts even for OSS rather than supporting it internally. Part of the analysis that probably gets overlooked from the outside is that employees aren't just a predictable cost; they are also a risk.


> isn't a difference in risk

Really? People incentivised to look after your best interests vs. a managed organization incentivised to do no more than your contract requires while gaining on average. I can't imagine anything that could create more different risks.

What people seem to do all the time is look at an SLA and believe the real world must behave the way things are on paper, because there's a signature there. People do this even after being recently burned by the same partner. Some people have the same trust in in-house staff, but the usual bias in upper management is the exact opposite.


With the developed world being increasingly dependent on AWS, Azure and Google Cloud, the next big fuck up is going to be interesting. We have seen multiple datacentres from the same cloud provider going offline simultaneously in the past. And it is safe to assume that hostile foreign state actors took notice of this critical dependency. Whether it is because of malice or mistakes, I can’t help thinking that this massive concentration is like a snowball increasing in size while it’s going down the slope.


Indeed. If/When another major conflict comes to the developed world, you better believe major cloud DCs will be pretty high up on the cyber and (if it comes to it) kinetic target lists. Never have so few facilities been relied upon for so much by so many.


It seems like my comment is being evaluated without proper context. Sure, some companies are moving some of their data into someone else's datacenter. Is that 50% of companies? Are they moving their core business data? Like when there's a network outage, they can't do business? When there's a partial data loss, they close their doors?

When we're talking about an entire ecosystem going away, the new one has to have almost 100% adoption. Windows desktop is going to exist for a really long time, especially when everyone else is giving up on desktops.


Those 2 statements are completely different issues.

And yes, companies are entirely moving their systems and data. Salesforce runs their sales team. Oracle and SAP run their ERP and finance. And now AWS/Azure runs their IT infrastructure. They use email from Google or Microsoft, along with hundreds of other SaaS products used day to day by employees. New startups are even more invested and start using other vendors for every non-core activity from day one. This is what a strong global business ecosystem enables and is a massive advantage for everyone.


You have entirely missed the last 2 decades of this happening and at a rapidly accelerating pace. I'm not sure what you think "business critical functions" are but every company outsources at least some important components that could easily put them out of business if something goes wrong, whether that's cloud servers or payroll or manufacturing supply chains, etc. It's called risk analysis and yield management and is well understood by every successful team.

When it comes to "someone else's datacenter", I think you'd be surprised at just how few companies even own a datacenter, let alone their own racks. Even the major clouds lease their own space from actual datacenter builders and operators.


>Established businesses aren't going to put their data in someone else's datacenter

You are incredibly, unbelievably wrong here. I work for a vendor known for incredibly high prices and have knowledge of basically the entire infrastructure of every one of my clients, and nearly all of them have a massive cloud strategy that's in place right now. I even have clients shipping their security and compliance logs to the cloud, or sending them from cloud to cloud.

Actually, the medium-sized businesses are the ones most hesitant to move to the cloud, because of the lack of cloud expertise on their teams. But even then... it's still a huge part of the strategy. They're all consolidating physical datacenters and using Amazon or Azure or Google.


On the topic of cloud strategies and infrastructure, what are your thoughts on the Mulesoft acquisition? Does this foray into infrastructure software really make it that much easier for CRM to sell the rest of their SaaS stuff? Obviously a lot of CRM and potential CRM customers have hybrid models today, does this really help them dive two feet in with Salesforce for their pending 'digital transformation'?

Also, I don't want to get off topic, but how does the SIEM space fit into the infrastructure stack you see in the future? In security, what about the firewall vendors trying to become the central hub for managing the new security needs brought about by the cloud (CASB, endpoint, virtual firewalls, etc.)? What about SaaS-only identity & access management solutions like Okta? I hear they are a game changer, but is a cloud-only solution really the best-positioned seller for something like this, especially considering the hybrid structure of most large orgs today (on-prem & off-prem/public cloud)?


I can’t talk too much about Salesforce and Mulesoft since I’ve never used them and don’t have many clients asking me about them.

When it comes to next-gen security appliances, I see them enhancing the SIEM, but not replacing it. Too many regulations require centralized log management, and organizations depend on a central alerting and monitoring platform. I do see these security platforms making the SIEM cheaper and dumber, though. Whether it's storage-based pricing like Splunk or event-rate pricing like QRadar, licensing costs a lot. Which means it costs a lot to feed a bunch of dumb logs into a system that makes information out of dumb logs. As long as you already have a Palo Alto, you might as well feed the IDPS logs into your SIEM and forget the rest: it's fewer logs, cheaper licensing, and less hardware needed on the SIEM. You already have the Palo doing the intelligence.

With regards to hybrid clouds that are popular today, you see a lot of SIEMs go to the cloud. You can ship logs cloud-to-cloud, which saves bandwidth backhauling it to the enterprise. It is a challenge for security logging to get data from there though. Oftentimes it costs extra to send logs, or can’t be done at all. Amazon wants you to use Cloudwatch. Microsoft wants you to use Security Center. It’s a pain to centralize it all. That’s going to have to change.


Thanks. Interesting....

To your last point: it sounds like it should change, but not that it necessarily will. I can't fathom the cloud providers allowing an independent third party to come in and take this business from them, despite how much easier it would be for the customer.

I thought Splunk was usage-based pricing, not storage-based? But yes, I have heard similar things about how quickly it can get expensive, sometimes without any warning, when they get a bill 10x what they expected...

Thanks for your time.


I suppose time will tell... Many companies will never do this.


they can still take advantage of existing cloud technologies running on their own servers.

And guess what, the CTO and CFO and COO will very much care about licensing cost of each CPU running windows or sql server, vs FREE on Linux.


You're thinking about your own app servers. No one cares about the cost of Windows when you're running someone else's proprietary desktop software on it.


> No one cares about the cost of Windows when you're running someone else's proprietary desktop software on it

This is a small and shrinking corner of the market. Valuable. But still a corner.


It may be shrinking, but it isn't small.


> the CTO and CFO and COO will very much care about licensing cost of each CPU running windows or sql server, vs FREE on Linux

They haven't really up until now, have they? MS licenses are not a big part of the IT budget compared to the people needed to run an in-house data center (typically staffed 24x7).


This is where Microsoft is going with Azure. Compared to other platforms, setting up a local instance of Azure is pretty painless and allows you to spin up resources locally that are identical to the Cloud.


Which is why Microsoft has on-prem and hybrid offerings for most of their products. Actually this is one place where they’re way ahead of AWS and GCE.


Same with SAP. They have a cloud offering (Hana Enterprise Cloud) which can connect to existing on-premise SAP deployments. That way, they can up-sell existing customers into the cloud and then slowly migrate them to even more subscriptions.

Disclaimer: I work at SAP, but not in the same division. And not in sales either, so take my word with a grain of salt. ;)


> Established businesses aren't going to put their data in someone else's datacenter.

I guess the 5 years I spent at a massive tech company shutting down our datacenters didn't happen and all the recruiters offering me huge amounts of money to move their stuff into aws aren't real.


> Established businesses aren't going to put their data in someone else's datacenter

[CITATION NEEDED]

This is very myopic. Established businesses are huge customers (both current customers and potential customers) for cloud providers.


"Established businesses aren't going to put their data in someone else's datacenter."

Established businesses do this all the time.


>No one has an Android device on their desk at work

If you mean as in a desktop PC, then no, but smartphones are becoming more and more important, and Apple is not strong outside the US. Around here almost everyone has an Android phone as a work phone, and I see more Chromebooks by the day.

I also want to point out that I believe saying they are abandoning the ecosystem is missing the point. They aren't. They just don't let Windows dictate what the rest of the business does.


I agree. I wish Microsoft would focus on Windows 11 and not think of Win10 as the last Windows, thereby flooding it with adware nonsense.

There are still a lot of tech heads and gamers that depend on Windows and will continue to buy into the platform. I even know hard core Linux devs who've switched back to Win10 and just work off of the new Linux Subsystem and an X11 server.

With Android being a clusterfuck of vendor specific bloatware and not having a single concept of a clean install, Microsoft should focus on keeping a clean Windows install equivalent to a fresh Mac install.


I agree with what you're saying and I think they need to keep Windows 10 around at least for a little while longer.

Everyone hates when Microsoft changes something core and foundational about Windows and thus there has been little change in certain areas of the OS for the last two decades. That build up of cruft has become untenable and Microsoft attempted to address major pain points like the Control Panel with Windows 8, then 8.1, and still in 10. The problem is that rewriting something like the Control Panel would consume 100% of developer resources for an entire OS Dev cycle. Can you imagine if Microsoft announced Windows 11 and the only difference between it and 10 was a new control panel?

Windows 10 is the perfect opportunity to start fixing cruft. It's already controversial but not universally hated, they got the majority of people to adopt it through free upgrades, and they've introduced the semi-annual upgrade cycle that allows them to introduce foundational changes gradually. With each release of Windows 10 they introduce a bunch of new crappy apps that are completely forgettable AND they improve a core feature silently.


> With each release of Windows 10 they introduce a bunch of new crappy apps that are completely forgettable AND they improve a core feature silently.

And with every shift towards ads or trying to force a mobile-first UI on a desktop/laptop, they alienate more users. I switched to OS X as soon as I saw the writing on the wall with Win8 (and Wine on OS X got stable enough to play GTA San Andreas, tbh).

Also, I wonder if it would be possible to get Windows 7 userland to run on a Win10 (and with it, WSL) kernel. Now that's something I'd switch back to Windows for.


> And with every shift towards ads

You mean like Apple asking me to use iCloud every single time I unlock my Mac? Or every time I power on my test iOS devices? Or do you mean how I have to open the App Store to apply macOS updates and click past the ads?

Windows 10 adds some shortcuts to the start menu during the bi-annual updates; they're annoying, but once they're deleted they don't come back. They also have an ad in Explorer for OneDrive that also goes away permanently when dismissed.

If I had to choose between MacOS/iOS Ads, and Windows 10 Ads, I'd choose Windows but ideally they'd both stop it.

> trying to force a mobile-first UI on a desktop/laptop

Windows 8.1 and 10 don't require UWP apps to be full screen. And actually the Settings App in Windows 10 is fantastic because it responsively adapts and is usable whatever size you make it.

MacOS, on the other hand, won't allow you to resize most settings windows, and they're fixed to a size comparable with a 90s-era Macintosh display, despite macOS only working on machines with high-resolution displays.

> I switched to OS X ... with Win8

So you've never used Windows 10 first hand then and you're just spouting things you've read to validate your decision?


> You mean like Apple asking me to use iCloud every single time I unlock my Mac? Or every time I power on my test iOS devices?

wtf? I don't use iCloud and don't get this nag. I only get a single question when I reimage my test computers or wipe a test device, that's it.

> Windows 8.1 and 10 don't require UWP apps to be full screen.

The start menu is enough to permanently turn me away. Or this stupid "tile" interface. What's that crap even for?

> So you've never used Windows 10 first hand then and you're just spouting things you've read to validate your decision?

No, it's these things that I always encounter when having to fix other people's computers. Plus I do not really like that Windows 10 has un-opt-out-able telemetry (yes, you can turn it off by hacks, but every random "update" will turn it back on again).
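
For what it's worth, the "hacks" most people mean boil down to the documented group-policy registry value. Here's a minimal Python sketch, assuming Windows, an elevated interpreter, and the documented DataCollection policy key, with no guarantee a later feature update won't flip it back:

    # Minimal sketch: inspect/set the AllowTelemetry policy value via winreg.
    # 0 ("Security") is only honored on Enterprise/Education SKUs; Home/Pro
    # fall back to Basic. Requires an elevated (administrator) Python.
    import winreg

    KEY_PATH = r"SOFTWARE\Policies\Microsoft\Windows\DataCollection"

    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                            winreg.KEY_READ | winreg.KEY_SET_VALUE) as key:
        try:
            current, _ = winreg.QueryValueEx(key, "AllowTelemetry")
            print("AllowTelemetry is currently", current)
        except FileNotFoundError:
            print("AllowTelemetry not set; the OS default applies")
        # Dial telemetry down to the minimum the SKU allows.
        winreg.SetValueEx(key, "AllowTelemetry", 0, winreg.REG_DWORD, 0)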


> I don't use iCloud and don't get this nag.

Are you signed into it though? I'm not and it nags me to sign in.

> The start menu is enough to permanently turn me away.

Which one? 8, 8.1 and 10 all have very different start menus.

> Or this stupid "tile" interface. What's that crap even for?

In Windows 10 it's not unlike Dashboard in MacOS.


> Are you signed into it though? I'm not and it nags me to sign in.

That's not an 'ad' - there is a ton of OS-level functionality that's tied to your iCloud ID and account, you can ignore it if you want, but it's not an 'ad' if they want you to use it...


> Are you signed into it though? I'm not and it nags me to sign in.

No, I don't use it in any form.

> Which one? 8, 8.1 and 10 all have very different start menus.

And none of them is the Win95-to-Win7-era one. I need a working menu with nothing else, especially not tiles or ads.

> In Windows 10 it's not unlike Dashboard in MacOS.

Which Apple doesn't force one to use, thank God. Programs folder in the Dock, that's it (although, I admit, I'd prefer a list/menu...).


>You mean like Apple asking me to use iCloud every single time I unlock my Mac? Or every time I power on my test iOS devices? Or do you mean how I have to open the App Store to apply macOS updates and click past the ads?

I don't get asked to use iCloud every time I unlock a Mac. Additionally, I don't recall a Mac ever displaying ads in its file explorer asking me to upgrade my SkyDrive storage. Nor do I recall Mac OS ever installing third-party bloatware apps and games and making them front and center in the Start menu. I also don't recall ever having to proactively prevent the mass collection of "telemetry" data when installing Mac OS.


> I also don't recall ever having to proactively prevent the mass collection of "telemetry" data when installing Mac OS.

To be fair, it will ask you once at profile creation if you agree to sending telemetry. To the best of my knowledge though, Apple respects the choice and does not reset it quietly like Windows does.


Windows 10 privacy controls are intentionally designed to be skipped by the average consumer.

Let's have a look at the "Get Going Fast" screen

https://media.askvg.com/articles/images5/Customize_Privacy_S...

You'll see at the very bottom a Customize Settings button that is nearly invisible. It's located on the left side, shaded in a color that is different from the other buttons, and its text is substantially smaller than the standard button text.

The majority of users will have missed this and just clicked on the Use Express Settings button.

Let's have a look at what is turned on by default:

Personalization

1. Personalize your speech, inking input and typing by sending contacts and calendar details, along with other associated input data, to Microsoft.

2. Send typing and input data to Microsoft to improve the recognition and suggestion platform.

3. Let apps use your advertising ID for experiences across apps.

4. Let Skype (if installed) help you connect with friends in your address book and verify your mobile number. SMS and data charges may apply.

Location

1. Turn on Find My Device and let Windows and apps request your location, including location history, and send Microsoft and trusted partners location data to improve location services.

Connectivity and error reporting

1. Automatically connect to suggested open hotspots. Not all networks are secure.

2. Automatically connect to networks shared by your contacts.

3. Automatically connect to hotspots temporarily to see if paid Wi-Fi services are available.

4. Send full error and diagnostic information to Microsoft.

Browser, protection, and update

1. Use SmartScreen online services to help protect against malicious content and downloads in sites loaded by Windows browsers and Store apps.

2. Use page prediction to improve reading, speed up browsing, and make your overall experience better in Windows browsers. Your browsing data will be sent to Microsoft.

3. Get updates from and send updates to other PCs on the Internet to speed up app and Windows update downloads.


>Windows 10 adds some shortcuts to the start menu during the bi-annual updates; they're annoying, but once they're deleted they don't come back. They also have an ad in Explorer for OneDrive that also goes away permanently when dismissed.

Also ads on the lock screen, ads in the start menu ("app suggestions"), and (Edge) ads from the toolbar. Just off the top of my head; maybe I missed some.


Win 8.1 does require UWP apps to be fullscreen. Win 10 does not.


> With each release of Windows 10 they introduce a bunch of new crappy apps that are completely forgettable

And some folks get the really big money for "driving business initiatives", "spearheading new projects" and stuff like that.


> hard core Linux devs who've switched back to Win10 and just work off of the new Linux Subsystem and an X11 server.

What's the use case for a "hard core Linux dev" and WSL? I've tried it. It failed miserably for Rails development when it was released. I logged bugs. I went back to what I was doing. Maybe it's gotten good now?


How is Android relevant? It's not a PC OS, and doesn't seem like a real Windows alternative in the same breath as MacOS or Linux.


Project Treble should help with your last point.


>With Android being a clusterfuck of vendor specific bloatware and not having a single concept of a clean install, Microsoft should focus on keeping a clean Windows install equivalent to a fresh Mac install.

Speaking of Windows OEM bloatware clusterfucks, here's your typical HP OEM Windows clusterfuck:

https://www.youtube.com/watch?v=B1GyAI5d-Yw&feature=youtu.be...


The author didn't characterize his position accurately. He meant "Windows Server", not "Windows Desktop". Similar to how IBM sold off its laptop business to Lenovo, the strategy is basically saying: "We're a business services company, there's no growth left in the end-user ecosystem anymore because all of that growth is in mobile, which we suck at, so let's just double down on where we are growing: cloud services and office productivity services".

The Windows we all know, both desktop and server side, is not dead. The economics on the server side are diminishing (on a 20-30 year outlook), and Microsoft won't be around if they don't do something about it now.

IMHO it's a very smart strategy; people just didn't understand what Ben was saying because he didn't articulate it correctly (which I do sometimes find with his content, though it's not often the case).


If that's true, yes, Windows Server has been irrelevant for some time. It wouldn't explain the remapping of Office, though.


> Windows Server has been irrelevant for some time.

Most definitely not. There are a ton of Microsoft Certified Somethings who will refuse to administer Linux servers unless their lives depend on it.

Source: I work at SAP's internal cloud unit. We have to support Windows VMs (begrudgingly, if I may say), even for payloads that could run on Linux.


Inflexible workers will be separated from their jobs eventually; it's just a matter of time.


What makes you think that? Those inflexible workers are also those who make the purchasing decisions. A shop full of Windows admins will not switch to Linux unless forced to (and vice versa).


And many will eventually be outsourced. In fact, systems administration as a profession is on the downswing due to the cloud. This site itself is largely focused on "disrupting" inefficiency with automation.


Not only due to the cloud; even if you have a local bunch of immutable OS instances like CoreOS or RHEL Atomic running containers, you don't need a big staff babysitting server OSes.


Yes, I conflated cloud and containers in my head when writing that comment.


Office is Office 365 rebranded - which just means the lock-in is a monthly subscription instead of a license. The lock-in is further exploited because email is the cornerstone of any business productivity suite, and it'll be hosted on Azure now (which is a huge lock-in to their ecosystem).


> macOS isn't business-level stable.

Can you elaborate on this assertion? As far as stability goes, Macs are top notch. I'd say in the business context Windows machines are prevalent because of successful MS strategies (in terms of marketing, not ethics) in the 1995-2005 era, and it kind of stuck around.


The enterprise tooling for macOS is woefully insufficient.

Windows has provisions for performing all sorts of tasks across a fleet of desktops. I can automate deployment of a Windows desktop to the point where all you have to do is boot it up and join it to the domain (which makes it easier to hire someone to do it as well).

What's more, these settings will work across updates. I've gotten really sick of the screwy changes Apple makes to macOS with little warning. Windows goes to great lengths to ensure backwards compatibility between major version upgrades. (For example, did you know VB6 apps still run just fine?) Also, I can get fine-grained control of updates to Windows.

But maybe most importantly, I can virtualize Windows and create a virtual desktop environment. Now I can hand out toasters to users and they can just log in to their box from anywhere.

Windows is simply much better at making desktop units fungible.
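
To make the fine-grained update control concrete: those fleet-wide policies ultimately land as ordinary registry values pushed by Group Policy, so they're easy to audit on any box. A rough sketch, assuming Python on Windows and the long-standing WindowsUpdate\AU policy key (the option table below is a simplification):

    # Rough sketch: read the Windows Update behavior a GPO has pushed to a machine.
    import winreg

    AU_PATH = r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU"
    AU_OPTIONS = {  # simplified meanings of the classic AUOptions values
        2: "notify before download",
        3: "auto download, notify before install",
        4: "auto download and schedule the install",
        5: "local admin chooses the setting",
    }

    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, AU_PATH) as key:
            def read(name, default=None):
                # Return the policy value if present, otherwise the default.
                try:
                    value, _ = winreg.QueryValueEx(key, name)
                    return value
                except FileNotFoundError:
                    return default

            if read("NoAutoUpdate", 0) == 1:
                print("Automatic updates disabled by policy")
            else:
                options = read("AUOptions")
                print("Update policy:", AU_OPTIONS.get(options, "not specified"))
    except FileNotFoundError:
        print("No update policy key; Windows Update defaults apply")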


I was going to reply to the original thread with [citation needed], but thankfully someone already did it. The fleetwide maintenance is the issue, not stability.

That said, everywhere I've worked the past 8 years, across 3 jobs, was virtually Mac-only. But yes, the valley is an outlier.


> The fleetwide maintenance is the issue, not stability.

From the mid-range to the enterprise, maintenance and stability issues are often one and the same. Breaking automated maintenance and/or deployment leads to instability.

Keep in mind I don't mean "kernel panic" levels of instability or even "programs crashing" levels of instability. The smallest of changes can cause a mountain of work in large environments. Simply changing the icon of an often used program can cause a deluge of tickets.

This level of stability and control is something that macOS doesn't even come close to offering.


Agreed. To get anything close to this with Apple you need to go to third party tools such as Jamf.


I can agree on the stability of the tooling, since Office products have become the de facto standard of "enterprising" and MS doesn't really sweat to keep the OSX tooling pristine. The original statement sounds like it's about the stability of the operating system itself, which I'm sure is good enough to support personal and business computing needs.


Business is full of "just good enough" solutions, and loves control. Apple has shown that breaking backwards compatibility doesn't bother them, they're not going to build IT control features to the same level, and ... maybe macOS is going to be sacrificed at the altar of iOS? We're not sure these days. Meanwhile, custom Windows software from forever ago works just fine today, and that's good for business.


It's possible they mean stability in terms of length of support. macOS as an operating system is more "user-stable", if you will, than Windows, i.e. less crashing. But Windows is that way because of the massive legacy interoperability in that codebase. Windows didn't just work because of marketing; it worked because when Microsoft supports something they do it for enterprise lengths of time, and with the proper enterprise tooling ecosystem that Windows has. It's ironic that what probably makes it attractive to enterprises (some of this is assumption) also probably makes it difficult for them to deliver an OS as solid to the user as macOS (High Sierra excluded).


They literally broke enterprise provisioning between the last beta and the final release of High Sierra, pretty much preventing updates of huge fleets of enterprise MacBooks. That's not something business customers appreciate. And that's only the tip of the iceberg, ignoring things like my MacBook deciding to fail to boot after a High Sierra security update.


I always thought one of Apple's strengths was that they ignored the enterprise market; Windows has always been compromised by the need to support both home users and corporations.


> As far as stability goes, macs are top notch.

My MBP and its periodic gray screen of death beg to differ. Apple really needs to shore up its QA.


This article is way too long to make a simple point - which is that Windows is not being abandoned but is no longer the flagship. Cloud infrastructure and services are leading the way, so Azure and AI are now at the forefront.

Sticking with something just because it's a large majority is a great way to fail when that something fades into the background.


Re: seems like a ridiculous strategy. Who would abandon an ecosystem that they own over 80% of

I suspect it's a mere marketing strategy to jack up their stock price and gain a "cool factor". MS will go wherever the market/profits lead them.

Their cloud is commercially successful, a rarity for new MS products actually, and that's where they currently see future growth. But if Windows stays a nice cash cow, they'll quietly milk it.

"Productivity" applications still need a real GUI, something the Web has yet to do right cross-browser: CSS/JS/DOM is a mess. I believe we need a WYSIWYG standard to get GUI's right. The server side can still scale the window & widgets per device size under WYSIWYG, so it's a myth that WYSIWYG is anti-mobile. The client-side auto-flow approach sucks for efficient UI's.


> Who would abandon an ecosystem that they own over 80% of?

They aren't abandoning the desktop Windows ecosystem; they've just demoted Windows from the single desktop-and-server org that the rest of the firm orbits to something that serves the areas where growth is actually expected.
