What Really Happened with Vista (hackernoon.com)
304 points by mrpippy 169 days ago | 126 comments



From where I sat in Windows Security (I wrote much of the BitLocker UI), here are a couple of anecdotes:

I remember one of the base team architects laughing and bragging about how and why they killed managed code and banned it from the OS — the technical reasons were valid, but the attitude was toxic. I think most or all of the generation of architects that had come from DEC have now moved on, but they also took tribal knowledge with them. I wonder how much they were encouraged to find other things to do because of this type of attitude (which not all of them had, btw).

Moving on

The Avalon team appeared to be drinking their own Kool-Aid. They had charts on the wall showing their performance goals and progress. All the performance measurements on those charts were annotated as "estimated". I remember an internal alpha whose execution time grew polynomially with the number of elements in the logical tree. It felt like the design was good, but the correspondence of the implementation to the design was missing.

When Avalon was booted, we were told to write UI in a broken framework called DirectUI, which had a convoluted history, an arrogant maintainer, and weirdly mismatched documentation. Much was documented that wasn't implemented; the standard answer to missing features was "add it yourself", but if I pushed a code change to shell code (where DirectUI lived in the codebase) it would typically cause a forward integration from winsec to base to be rejected.

Good times


> had a convoluted history, an arrogant maintainer, and weirdly mismatched documentation. Much was documented that wasn't implemented; the standard answer to missing features was "add it yourself"

Sadly, this is how some open-source projects are run as well.


Worse, I've seen open-source project maintainers tell mere users (as in, non-programmers just using the software) to 'add it yourself'. OK, the maintainers can't really know a user's background, but still, such an attitude raises some questions.


I also have this problem with the "where's your pull request" trend lately. Yes, I use open source project X. When I detect a bug, I'll definitely try to isolate the cause and reduce it to a minimum, and then log a GitHub issue with clear instructions on how to reproduce it. As a developer, I feel that this is massively helpful; if only the bug reports I get in my professional life were like this, instead of "yeah, there was an error - I have no idea how".

But no, I'm not going to put in 30+ hours to download, build and try to figure out where the bug is in the actual code. There are people who are in a better position to do that. If everyone who has a car problem had to submit a diagram to the factory on how they would change the car's design, we'd all be spending a lot of time doing stuff we're not good at.


If someone sends a "Please add X, KTHXBYE", then I can see why a simple "Add it yourself" might be the response.

I offer here a response that you can apply as a mental sed stage when reading mail from people who work for free and have little time:

Add it yourself -> Thanks for your suggestion. I will make a note of it but it is unlikely that I have the time and resources to implement this. You are more than welcome to submit a well tested patch.


To be fair, most OSS maintainers are doing the work for free. If they don't see the priority in the feature you want, it's not a bad thing to ask you to do it yourself.


From following multiple blogs and articles, that was the opinion I had formed regarding Longhorn's failure: a sabotage caused by the internal politics between Microsoft teams, especially WinDev and DevTools.

Now with .NET Native, C++/CX and UWP, one could say the design of Longhorn has been vindicated.

The COM-based infrastructure could already have been a thing if there had been a desire to collaborate during Longhorn's development.


The summary of the managed-code-in-the-OS debacle is probably:

Rarely do you get something for free, and assuming that this time is different is more likely hubris.

And when you do get something for "free", it's more about making trade-offs that no one notices (JIT) or doing incredibly clever things in a layer abstracted enough that people won't have to concern themselves with what you did (language and compiler design).


You are forgetting a very important factor: politics trumps technical achievements.

As a side note, all modern mobile OSes have languages with runtimes, whether AOT- or JIT-compiled, but they all required architects with the guts to push them through.


OSes with managed runtimes and OSes that are built on managed runtimes are two different things (viz. the spelunking Android had to do in order to get rendering performance down to acceptable latencies).


Objective-C also has a runtime.


We're talking about two different things. I'm talking about OS components being written in managed vs unmanaged code. You're talking (I believe) about whether or not an OS has a first-class managed runtime for apps.

The article author's assertion is that blindly using managed code in performance-critical components is a stupid idea, and I agree.

From the article, WinFS (the filesystem), Avalon (the UI compositor and renderer), and Windows Communication Foundation (networking) were all rewritten in the then-new/unstable C# for Vista.

Given the fate of those projects, I'd say history bears it out as being a stupid idea.


NeXT device drivers were written in Objective-C; check the Driver Kit API.

The projects were killed for political reasons, not technical ones. Especially after what was being done with Midori, also killed by management in spite of the technical achievements.


Objective-C is a superset of C though. If you avoid the (dynamic dispatch) message passing, and if you construct/destruct objects manually, you're getting the same performance as C. It's trivial to move the performance-critical bits into the pure-C subset of Objective-C.

You can't really do this with a managed (VM) language.
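To make that cost concrete, here's a rough analogy in plain C rather than Objective-C (the Counter struct and function names are made up purely for illustration). An Objective-C message send goes through a dynamic lookup at least as indirect as the function-pointer call below; the "pure C subset" is essentially the direct call:

  #include <stdio.h>

  /* Illustration only: a "method" dispatched through a function pointer,
     versus calling the same function directly. Objective-C's message send
     is a more elaborate lookup, but the principle is the same. */
  typedef struct Counter Counter;
  struct Counter {
      int value;
      void (*increment)(Counter *self);  /* dynamically dispatched "method" */
  };

  static void counter_increment(Counter *self) { self->value++; }

  int main(void)
  {
      Counter c = { 0, counter_increment };

      /* Dynamic dispatch: an extra pointer load per call. */
      for (int i = 0; i < 1000000; i++)
          c.increment(&c);

      /* Direct call: what dropping to the pure-C subset buys you. */
      for (int i = 0; i < 1000000; i++)
          counter_increment(&c);

      printf("%d\n", c.value);
      return 0;
  }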


> check the Driver Kit API.


I remember when managed code was killed. We could never get hangs and quality issues to go away. There was an amazing long write-up on the fundamental issues. They were unfixable, so it had to go.

Everyone trying to drive to a stable release was happy to see it go, because otherwise our job was impossible.


Can you talk more about that era? I'm really interested.


People rarely talk about what went well in Vista.

There was an effort to componentize Windows. This was extremely hard to do; while the kernel itself had a very clean architecture, the layers on top were less so.

"Longhorn Basics" this was a list of fundamental stuff that every team had to either cover or explain why they didn't (Accesibility, I18N, security, etc.) this was a great way of making sure that these cross cutting concerns where handled consistently across an enormous team

Testing - it may be hard to see this from the outside, but the testing was off the hook. Hardcore static and dynamic analysis ran against everything.

Edit: also BitLocker!


> testing

If you look at Vista as a hardware compatibility beta for 7, I think it was a resounding success. They pushed out an operating system with a new, incompatible driver model, had to wait for driver manufacturers to catch up (giving the impression that Vista was the problem), and then by the time every driver was rewritten, 7 came off as a godsend. 7 wouldn't be 7 without the "disaster" before it. It also let them use home users as a giant testing base for enterprise customers.

And don't forget UAC, which broke a ton of code that expected admin rights.


UAC was a huge deal for me. I had to write UI that needed privilege in an unprivileged process.


> And don't forget UAC, which broke a ton of code that expected admin rights

What do you mean?


Prior to Vista, a lot of code was written on the assumption that the process would be run with administrator privileges. While this wasn't the original security model for Windows NT, 16-bit Windows and Windows 95 et al. didn't have a security model as such, so code that had been written for those operating systems was particularly problematic.

It was definitely the case that a large number of Windows users, possibly a majority, would have their user account be a member of the Administrators group.

UAC changed the way processes were started: they run at a lower privilege level by default, or a permissions dialog pops up if higher permissions are needed (based on a manifest associated with the process).

In parallel, the code samples on MSDN that showed how to detect whether the current process had admin permissions were subtly incorrect, so programs that had been written to do the right thing depending on privilege level would fail unexpectedly on Vista.
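For what it's worth, here is a minimal sketch of the Vista-and-later way to ask whether the current process token is elevated. This is not the MSDN sample the parent refers to, just an illustration; the helper name IsProcessElevated is made up.

  #include <windows.h>
  #include <stdio.h>

  /* Illustrative helper (name is made up): queries TokenElevation, which
     only exists on Vista and later. Pre-Vista code often checked for
     membership in the Administrators group instead, which UAC's filtered
     tokens broke in subtle ways. */
  static BOOL IsProcessElevated(void)
  {
      HANDLE token = NULL;
      TOKEN_ELEVATION elevation;
      DWORD size = sizeof(elevation);
      BOOL elevated = FALSE;

      if (OpenProcessToken(GetCurrentProcess(), TOKEN_QUERY, &token)) {
          if (GetTokenInformation(token, TokenElevation, &elevation,
                                  sizeof(elevation), &size)) {
              elevated = (elevation.TokenIsElevated != 0);
          }
          CloseHandle(token);
      }
      return elevated;
  }

  int main(void)
  {
      printf("Elevated: %s\n", IsProcessElevated() ? "yes" : "no");
      return 0;
  }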


Very interesting! I feel like all the OS components are now written in DirectUI too. Did it turn out to be the way to go, or did it happen because it was shoved down everyone's throat?


AIUI, DUI was the standard toolkit for first-party Windows UI from XP through 8.0 (this includes the "Metro" shell components in that release, like the Start screen). Starting with 8.1, new components have been written using the same XAML toolkit that's promoted for third parties, but there's still lots of older DUI stuff in the OS, like File Explorer and the taskbar.


I left the Windows division shortly before Vista shipped, so I couldn't say. But from native code, the choices were to do things directly on the Win32 API or to use DirectUI, which, for all its faults, was much easier to use and produced much more modern UI/UX.

There were also the obvious advantages of everyone using the same technology. At the time, the look and feel of a Windows release was a closely guarded secret until fairly near the release, so the team that owned that experience needed to be able to apply the theme to everything fairly quickly.


Ahh, DUI. I had forgotten about that.


This is an article about why Longhorn failed. If you want to know why Vista failed, here are two big reasons:

1. In Vista, there was a completely new Antivirus framework that AV vendors had to use in order to support Vista. Good idea in theory, but because it was brand new (and because vendors had to write brand new code to support it), it had a fuckton of problems, mostly contributing to an enormous performance degradation compared to XP. Like, horrendous. And since every laptop you could buy in the store had 3rd party AV because that's how OEMs make $$, whoops every laptop is absolute garbage out of the box.

2. Windows Vista completely rewrote the USB stack, an extremely complicated, timing-sensitive component (like, add a printf, break a device kind of sensitive), and about 80% of the way through, the lead architect who planned the rewrite left the company. Meanwhile, there were a _ton_ of devices which weren't actually correct wrt the USB spec, but just happened to work with the timings that WinXP, purely by accident, would exhibit. A friend of mine made a lot of money in bonuses by slogging through bug after bug in the USB stack trying to make it right for Vista SP1.

It's funny, because Vista shipped a ton of great features that then got attributed to Win7 - it really set the foundation for the modern Windows OS until Win10 came out.

So, if you learn anything from the story of Windows Vista, it's this: performance is not a feature, but it sure can make all of your other features not matter at all.


> It's funny, because Vista shipped a ton of great features that then got attributed to Win7

I have heard that some people at Microsoft consider Vista a kind of public Beta for Windows 7. Maybe not intentionally, but that is the way things worked out on the road from XP to 7.

And Windows 7 is, as far as Windows systems go, very nice. Very stable, decent performance on not-too-old hardware, I like the UI very much.


I have Windows 7 on a maxed-out 4 GB IBM laptop for personal use. I don't see a reason to go anywhere else, as long as I can. Although XP was also fine.


That must be due to Vista being so late; all the OEMs had barely any lead time to get on board with it, whereas by the time 7 rolled around it was old news.


I like it a lot. I declined the free upgrade offer. Why take the risk?


And a whole new graphics driver architecture - with Windows 7 getting the praise there too. But it was Vista with the teething trouble, and the third parties that didn't have their support together.

The headline was "NVIDIA Responsible for Nearly 30% of Vista Crashes in 2007", but the perception was that Vista was terrible.


Indeed. Unfortunately, no one cares how immaculate your suit is if you forgot to wear pants.


Or in other words, don't buy a bespoke suit jacket and vest, then pick up pants from Walmart right before your wedding.


> the lead architect who planned the rewrite left the company.

This alone explains a lot.


That explains what exactly? You should never allow a culture where one employee departure can endanger the whole project.


I think that in reality most companies (up to a certain size) or projects have people that are pivotal, enablers or highly influential. Not only charismatic leaders but also technical wizards, etc.

They might not be irreplaceable, but they might be hard to replace, and if they leave at the wrong moment, it could jeopardise everything.

I guess having a management team made up of several Steve Jobs might arguably feel safer for the shareholders, but it's probably hard to find them and make it work.

There are risks to manage; for instance, treat them nicely so whole teams won't quit in a rage. Treat the unicorns even nicer.


Tell that to Apple. :) At a certain level, very good people make a big difference day to day with their leadership, vision, political prowess, technical ability, etc. Losing them DOES do damage that no plan B can cover.


> You should never allow a culture where one employee departure can endanger the whole project.

And yet, in practice, that's how the majority of shops are run.


It suggests that Microsoft had such a culture, at least at the time.


Any reference I could read on this matter?


Who was this?


There was also the sudo-like UAC thing. I totally think it was the right thing to implement, and it improved security for power users a lot, but it pissed everyone off by being annoying as hell (and it taught people to "approve" everything, which is bad).

It became much less of a pain in the ass with Windows 7's implementation, but it was a step in the right direction.


It was also hated by admins, since the transition to the new interfaces for automation was half-assed and broke a ton of client scripts.


MS always seems to have had a schizophrenic relationship with API-exposing automation.

It's like they have "Make it easy to manage" as a priority, then don't talk to admins with non-MS experience, and then build features in a bizarre way.


They also introduced a GPU-accelerated desktop, IIRC.


Microsoft's incredible journey with .NET.


From the outside world Vista just didn't look compelling.

Yet another Microsoft OS with a fresh set of incompatible hardware drivers.

There was no clear vision about what a modern OS should be. It felt bloated and clunky. Vista felt like it was trying to be everything to everyone and not really being anything much special to anyone.

By contrast, OS X seemed to know what it was, and provided a cohesive, integrated whole in which things just worked. Most importantly, Apple somehow managed to generally provide just one way of doing things. For example, configuring networking on OS X happens in Network Preferences. On Windows, to this day, configuration seems to be all over the place, a bit here, a bit there, and many different visual and UI styles for the configuration. It's made even worse by add-in configuration software from hardware vendors. Even now with Windows 10, I dread having to resolve ANY sort of network issue - where to start? Which rabbit hole will the required setting be down?

Windows had - and still has - problems that have never been fully solved, and they make the platform so unappealing. Shutdown of the Windows OS could never be relied on to actually happen without hanging for some reason. Hibernate mode when closing the laptop lid was equally unreliable. Constant application crashes. Multitasking that just wasn't - all the time the OS would become unresponsive due to the behaviour of some application. Which leads to the next major issue: a seeming inability for Windows to actually force-kill processes when instructed - that rarely worked. Even something as trivial as deleting a job from a printer queue - basically forget it.

After Vista, Windows 7 seemed to clean things up a bit before they got even worse, with Windows 8 just being an absolutely definitive statement that Microsoft did not know what Windows was or where to take it - I mean for goodness sake - removing access to control panel? Crazy.

The latest versions of Windows are better but no-one has had the courage to do what Windows really needs which is to get rid of the registry - that chewing-gum-in-the-hair-mixed-with-spaghetti central configuration settings black hole.


None of the usability problems you mention have to do with the registry, so I fail to see how getting rid of it would materially improve the Windows experience.

The core problem with Windows 10 for me is and remains the crappy driver situation. Drivers on Windows are far less robust than on OS X, and OS and/or driver updates create many more problems. This isn't tied to any particular Windows version, but W10 suffers from it more because updates happen with much less control over the when and where, which means drivers break at more awkward moments.

An example from my own workplace is that a lot of my coworkers have had their laptop keyboard break after a W7 update. Windows reboots to update, and ctrl-alt-del no longer works, not on the built-in keyboard and not on an external one, so you're locked out. The solution is connecting via remote desktop and logging in once, after which Windows Update picks back up and fixes the problem, but that's not always a convenient thing to do. I bet that Microsoft doesn't even see that in their metrics, because to them it's just a remote desktop login, so they probably think those updates don't have any problems (after all, the update gets installed and people seem to go back about their business).


To be fair, drivers are a much more difficult problem for Microsoft because they don't control the hardware like Apple does. There are way more devices out there for Windows machines, and an insane number of combinations that have to work together without ever having been tested together.


That still doesn't explain why an external keyboard breaks. Keyboards are not exotic hardware, and they all work just fine when connected up to a mac.


If Windows Update was able to fix the problem after being reinitialized, then your original description sounds more like a Windows Update hiccup rather than an inherent driver issue.


Could be a permissions problem.

These days you can authorize Windows to log you in upon reboot so that Update can finish up.


Random thought: I wonder if Microsoft has ever considered adding a built-in test suite to consumer Windows to flog drivers.

I could see the benefits of being able to throw up a "Warning: {Device name} is not behaving properly and may cause system instability" prompt. As opposed to just screwing up behavior at user interaction time without a clear "That piece of hardware you bought / company you bought it from is a POS" message.


Isn't this basically what WHQL does, without the blaming? And that's enforced on 64-bit systems unless you manually turn it off. AFAIK part of the WHQL process is Microsoft installing the driver on thousands of different hardware configurations and seeing if it misbehaves.


Why the hell change how drivers work with every release in the first place?


You are mixing up Windows with GNU/Linux.

The driver ABI has only changed a few times, rather than with every point release.


I'm not contrasting Windows with anything. The fact is that with every Windows Whatever I must go to sites and download new drivers, experiencing a lot of frustration on the way, or buy a new device, only to make a new generation of Windows kernel developers happy. "A few times" is too much.


I didn't say that the problems are because of the registry. I just hate the registry because it's a really bad, unusable way to configure the system; it turns into a mess and tightly binds everything together when it should all be loosely coupled. The INI file was a much better solution for configuring things.


INI files are awful: no fine-grained control over permissions, spread all over the place, and every one looks different. I'll take the registry any day.


Actually, INI files are usually in a predictable place. It is registry entries that are spread all over the registry and are not cleaned up when an application is uninstalled.
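For anyone who never wrote against either, here is a small side-by-side sketch of the two styles being argued about. The file settings.ini and the Software\ExampleApp key are made-up names for illustration; RegGetValue requires Vista or later.

  #include <windows.h>
  #include <stdio.h>

  int main(void)
  {
      char iniValue[128];
      char regValue[128];
      DWORD regSize = sizeof(regValue);

      /* INI style: a plain, visible file sitting next to the application. */
      GetPrivateProfileStringA("General", "Language", "en",
                               iniValue, sizeof(iniValue), ".\\settings.ini");

      /* Registry style: a per-user value under HKCU, with its own ACLs. */
      if (RegGetValueA(HKEY_CURRENT_USER, "Software\\ExampleApp", "Language",
                       RRF_RT_REG_SZ, NULL, regValue, &regSize) != ERROR_SUCCESS) {
          lstrcpyA(regValue, "en");  /* fall back to a default */
      }

      printf("INI: %s, Registry: %s\n", iniValue, regValue);
      return 0;
  }

The trade-off the thread is describing shows up even here: the INI call reads a file you can see and diff, while the registry call buys typed values and per-key permissions at the cost of hidden, system-wide state.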


> On Windows, to this day, configuration seems to be all over the place, a bit here, a bit there, and many different visual and UI styles for the configuration.

With mmc and the various snap-ins, I get the feeling the code has not been touched in at least a decade.

Regarding the Registry, I have a funny anecdote: during the development of Windows 7, Microsoft tried to be very open and communicative to rebuild some of the trust that Vista had shattered. So there was this video of an interview with a few Windows developers about all the cool and interesting things they worked on. Halfway through the interview, the guy behind the camera asks, "So what do you guys think about the registry?" There was a long, awkward silence; the developers started to grin and giggle. Eventually, one of them replied, "Let's just say it has turned out to be more widely used than its creators had anticipated".

(In other words, yes, the registry sucks like an industrial vacuum cleaner!)


It goes back earlier than that. Here's a bit from a presentation that Tony Williams (considered the "co-creator of COM") gave once about the Registry: http://i.imgur.com/52YiK2Y.gifv

Essentially, the Registry was a stop-gap to store configuration information about COM in Windows 3.1 until Cairo came along (which it never did). Then the Windows 95 guys came around and decided to use it as the replacement for .ini files. And the rest is history.


You know how the saying goes: Nothing's more permanent than a temporary solution.


The oddest thing I have found about the Registry is that it has its own ACLs, independent of the file system ACLs.

Thinking about it, it strikes me that the registry is similar to the Office file formats. Effectively file systems rolled up into files.


> ... no-one has had the courage to do what Windows really needs which is to get rid of the registry - that chewing-gum-in-the-hair-mixed-with-spaghetti central configuration settings black hole.

This! I couldn't say it better myself.


It's like systemd for Windows.


Or more accurately, gconf. Thankfully deprecated now.


> After Vista, Windows 7 seemed to clean things up a bit before they got even worse, with Windows 8 just being an absolutely definitive statement that Microsoft did not know what Windows was or where to take it - I mean for goodness sake - removing access to control panel? Crazy.

> The latest versions of Windows are better but no-one has had the courage to do what Windows really needs which is to get rid of the registry - that chewing-gum-in-the-hair-mixed-with-spaghetti central configuration settings black hole.

The registry is exhibit A in the case for Windows being essentially a legacy OS at this point. Microsoft no longer seems to have the incentives, or, after the Vista disaster, the political will to make big architectural changes to fix basic problems of Windows: the configuration mess, the gaggle of different poor package management systems that PS PackageManagement is papering over, the muddle of logging, etc. Windows 10 also looks like Microsoft accepting a strategy of slow incremental improvements, or no improvements at all.


"Windows 10 also looks like Microsoft accepting a strategy of slow incremental improvements"

Good.


I don't know - perhaps it's selective memory, but OS X was a trainwreck too in its early releases. If you consider Vista to be the .0 release of Windows 7, then it makes more sense. Most of what was good in Win7 was in Vista, but it was overshadowed by bugs and performance issues.


Jim Allchin, who led the Windows client team, famously wrote an email to Gates and Ballmer during the development of Longhorn saying that Microsoft had lost its way and that if he didn't work for Microsoft he would run OS X himself.

http://www.macworld.com/article/1054364/allchin.html

Trainwreck seems a bit of an overstatement when discussing that time period.


I believe Windows 8 hit the same problem as Vista: plans too ambitious for the available time frame, so engineers did not have time to get things right. For example, the Modern UI was actually a pretty neat idea, but of course the version that shipped with Windows 8 was completely useless. They should have waited until the screen splitting was working and figured out a way to easily split the screen between traditional and modern apps.

What's the main issue with the registry, or what would be a better alternative? I think this kind of database-oriented way of storing settings, with transaction support, is actually a pretty good approach.


In both cases the story seems like it can be summed up as "spec'd rewrites of major subsystems, and worry about implementations of how people actually use them later".

E.g., all the basic Windows functionality missing in Win 8, where the underlying system existed but no GUI had been written to expose it.

And how Windows Explorer search is still a cluster$&#@ that's regressed in functionality since XP (available operands to specify search params, scope, speed, etc).


Another highly annoying example of good ideas just not being finished properly is trying to use multiple virtual desktops on multiple monitors. For some reason, switching virtual desktops switches to the new desktop on ALL physical monitors. Just why...

The devs have also STILL not figured out how to give a simple and easy way to move windows from one monitor to another from the alt-tab menu.


> removing access to control panel? Crazy.

Actually the control panel is present even in Windows 10.

You just have to dig it out of the Start view/menu.


Or just type "control" into the start bar. No digging required.


The first laptop I could afford was a used Toshiba Satellite that had Vista on it. A well-off college student was having problems with it BSODing constantly, so she was selling it as "broken, maybe you can use it for spare parts?" for about $200. When I looked at the BSOD error and saw that it looked like a driver issue, I told her it was probably fixable; she said she didn't care, she had already bought a MacBook to replace it. So I got it for $200. I immediately rolled it back to Windows XP and it has worked perfectly fine ever since.


I always had the impression that the biggest gripe with Vista was the incessant UAC dialogs asking you to approve every little thing (I think Apple even poked fun at it in one of their 'Hi, I'm a Mac' ads). And that this problem was actually a consequence of the (sensible) change from having all programs run with full privileges to limiting them and asking users for escalation only when needed - but since most existing programs were not prepared for this, they caused UAC dialogs left and right, where they could and should have been avoided. Understandably, it took a while for all the 3rd-party devs to catch up and change their code to fit within this more restrictive security model. But the way I see it, this is much like pulling off a band-aid that's been on too long - it'll hurt no matter what, so best just get it over with. I imagine that if Windows 7 had been released instead of Vista and at the same time (and AFAIK it has pretty much exactly the same security model), it would have borne the brunt of all that negativity stemming from all those UAC dialogs. In other words, Vista had to take that bullet in order for Windows 7 to go on the scene later as the hero.


It reeked of CYA to me- if something bad happened, well, you clicked OK in the dialog. Shouldn'ta done that. Except you couldn't do anything at the time without those dialogs constantly coming up.


A clue that could tell the non-tech-savvy audience that something was off with Vista was its laughable "wow" advertisement campaign: invariably depicting a starry-eyed person looking at an unseen and undescribed thing off-screen and expressing "wow". No product features. No aesthetics. Basically just an industrial ad with a soundtrack. The only thing that was crystal clear was the subtext: "we have no idea why we're selling this thing, so we'll just say wow".


I never understood the criticism of Vista... during that time I switched from Linux to Vista. It was quite slick and fast. Afterwards, when I moved to Win7, it was not that much different at all.


Later versions of Vista weren't dramatically different from 7. But the initial versions were hideous, and once you've had a failed launch, people's perceptions tend to get stuck.


Isn't this the same for all versions of Windows? I've never switched to the next version until at least the first SP was released.


The way I understood it (not having touched Vista very much), the problem was that performance was kind of unpredictable. It could run very well on modest hardware, but it could also crawl along like a very lazy snail on top-shelf machines.

I think SP1 fixed that significantly. In my current job, I had to install Vista twice, and I clearly remember that after installing SP1, performance became much better.


Yeah, Vista was mostly hit or miss. For some people it wasn't a step down in usability and stability from XP, but for a good chunk it absolutely was. Drivers not being published, drivers crashing, slow, bloated. In no way better than XP, just "newer".

My own experience was basically this:

  * 95 a - nearly unusable with the wrong kind of hardware (e.g. SCSI)
  * 95 b - pretty solid
  * 95 c - too few differences to b
  * 98 - kinda meh, but usable
  * 98 SE - finally fixed most of the 98 quirks
  * 2000 - first time I felt Windows could be called "very stable"
  * XP - pretty decent as well
  * Vista - horrible mess on ~10 machines, worked ok on ~2
  * 7 - second time I am actually ok with using it (just still greatly prefer Linux for work as a developer)


I noticed you skipped ME... :-)


Ha, you got me there. I was actually thinking, "Am I missing something? Why does a good one (2000) follow after another good one (98 SE)?"


Even Microsoft skips ME nowadays.


When I first started using GNU/Linux as my desktop system, I kept a Windows installation around for watching DVDs - back then, CPUs were not powerful enough for that, so I had bought a PCI card that did the decoding in hardware, but drivers were available only for Windows (IIRC), so I had to boot into Windows to watch DVDs.

In retrospect it is rather funny, but I did use Windows ME for that, and it never gave me the slightest trouble. (Of course, one might say, I hardly used it. True enough.)


For me the standout versions were 95 OSR2, which had tiny system requirements but was comparable to 98 in ability, and NT4, which was the smoothest modern multitasking computing experience I've ever used. I was sad when hardware and software upgrades forced me to W2K, because it seemed like a big step back.


Third-party applications/drivers not being ready and higher resource usage than XP.


What I thought was weird about Windows Vista was the Contacts folder. It had its own contacts application which didn't store the data hidden away, but directly as vCards in this folder. The application got removed in Windows 7, like many other tools, but the folder is still there. I'm sure it is not the only one which hasn't really been used since then.


The power-hungry Vista forced millions to upgrade to newer devices with at least 1 GB of RAM and faster processors.


...and a small handful to move to Ubuntu. Vista was the last Windows I used. Plenty of others went to OS X. The public sector stayed on XP.


And a beefy GPU.

Best I recall, Vista was the first Windows that used the GPU to accelerate the UI. And the prime example they had was the frosted-glass effect of the title bars.


If I look at Vista and compare it to Apple's iterations through OS X, I see that Apple spread out the macro changes while MSFT tried to cram them all into one huge release.

Given the results, it seems that Apple had much better leadership/management and maturity in their team.

Apple's approach, while not perfect, clearly worked much better. I often wonder what it would be like if MSFT had taken that approach. I think the world would have been much better off for the last 15 years.


I was one of the lucky ones with Vista in that by the time I'd upgraded to it, it was pretty stable.

Forgettable and boring, yes, but plenty fast and stable on an old HP box that served as my workstation.


If one talks about failed Windows versions, my favorite is still Windows ME...


My favorite is Bob...everyone wants to forget about Bob.


> In fact, the OS innovations in iOS were what made it so clear in retrospect how wrong-headed the overall world view driving this work was.

What were those OS innovations in iOS?


Initially, the strict limits it placed on its own software and third-party apps. The first iPhone had pretty meagre hardware but still managed smooth 60fps scrolling, one-to-one interaction with your touches, and hitting the home button immediately stopped whatever you were doing and took you home.

Those things didn't come around by accident. It's because the OS was ruthless in cutting off applications from hardware resources, because its compositor was designed for basically moving a textured quad around the screen with very little latency on a mobile GPU, and because its main animation system ran in a high priority thread way above your applications.

All these little things are what made it feel "magical" to use the first iPhone, to have your finger perfectly tracked when pinching and panning on a mobile computer in the palm of your hand. It can't be overstated how important it is for a human to get an instant visual response to their touch, and how highly iOS prioritised this exact thing.
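Purely as an illustration of that last point, and not how iOS actually does it, here is a generic POSIX sketch of running an animation tick on a thread with elevated real-time scheduling priority, so it keeps redrawing even while ordinary application threads are busy:

  #include <pthread.h>
  #include <sched.h>
  #include <stdio.h>
  #include <unistd.h>

  /* Generic illustration only: the animation loop asks for SCHED_FIFO
     priority (usually needs privileges), so it preempts normal threads. */
  static void *animation_loop(void *arg)
  {
      (void)arg;
      for (int frame = 0; frame < 5; frame++) {
          /* A real compositor would advance animations and draw here. */
          printf("frame %d\n", frame);
          usleep(16000);  /* roughly a 60 Hz tick */
      }
      return NULL;
  }

  int main(void)
  {
      pthread_attr_t attr;
      struct sched_param param;
      pthread_t tid;

      param.sched_priority = 40;  /* arbitrary "higher than apps" value */

      pthread_attr_init(&attr);
      pthread_attr_setinheritsched(&attr, PTHREAD_EXPLICIT_SCHED);
      pthread_attr_setschedpolicy(&attr, SCHED_FIFO);
      pthread_attr_setschedparam(&attr, &param);

      /* Fall back to default scheduling if real-time priority is refused. */
      if (pthread_create(&tid, &attr, animation_loop, NULL) != 0)
          pthread_create(&tid, NULL, animation_loop, NULL);

      pthread_join(tid, NULL);
      pthread_attr_destroy(&attr);
      return 0;
  }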


Symbian had them first.


Not sure I understand your comment.

Symbian had 60fps one-to-one correspondence with your touch on a multi-touch surface in a handheld computer? It had a high priority animation thread that continued to run even if your application crashed? I was never a Symbian developer, but I don't recall any pre-iPhone Symbian devices performing in this way.

The examples I gave were all about driving the physics-based movement of an OpenGL based UI compositor on a very rudimentary mobile GPU. It's putting your finger on the screen and getting an instant response from the underlying visuals, and having that response tracked perfectly across the surface of the screen.

The innovations in iOS (e.g., resuming an app by showing a screenshot of its last use while it came to the foreground) were all in favour of increased perception of responsiveness and actual increased responsiveness, not just about maintaining battery efficiency.


The N95 was the first mobile phone with a hardware GPU; until then, OpenGL ES was software-emulated.

The Series 90 was the very first touch-driven device with a hardware GPU, one year before iOS was released to the world.


I still fail to understand your point. I'm trying to educate myself on the N95 but videos of its interaction bear no resemblance to the first iPhone OS.

Putting a touch-interface OS on a hardware GPU is not the same as the specific developments I am talking about. The graphics compositor (the CoreAnimation / CALayer architecture on iOS), physics-driven animation, and one-to-one correspondence with your touch at 60fps were what made the platform unique on its release.

Edit: this is the best video I can find of a 2007 Nokia device with touch interaction: https://youtu.be/YrS1PGj27VI?t=1m4s — the interaction is incredibly high latency and not at all one-to-one like an iPhone. Absolutely none of the innovations I mentioned are present on the Nokia (or if they are, they are not being used to facilitate human interaction with the device in this video).


I also mentioned Series 90, aka Nokia 7710.

Although I do concede that if it weren't for the iPhone's success, OpenGL ES would probably never have taken off, as Symbian was one of the very few mobile OSes that had some flagship devices with hardware support for it.


Thank you for the specific device (Nokia 7710). I've looked at some videos and I can explain more clearly what I mean.

In the following video https://youtu.be/CUUnnf_gX1s?t=46s you'll see that when the button is tapped, the screen "tears" when the device re-renders its display. I doubt this OS was using the GPU to render and composite its components to the display. It seems more like it is software-rendering the UI and then passing the entire buffer to the GPU for display.

Notice also that the scrolling on the device is line-by-line rather than per-pixel. It's done to minimise screen redraws but it is exactly the opposite of what made the first iPhone so pleasant to use. The only one-to-one interaction in the Nokia 7710 video is the little sliders being moved by the stylus, but they too flicker when redrawing.

My point wasn't really about whether a phone supported OpenGL. I was pointing out the software design of the graphical compositor used to render and animate the UI, and how every small decision was made in favour of prioritising an immediate reaction to human input.

The only reason I mentioned the iPhone's meagre mobile GPU was to emphasise how iOS was able to produce such low-latency and smooth performance from very low powered hardware.


> Notice also that the scrolling on the device is line-by-line rather than per-pixel. It's done to minimise screen redraws but it is exactly the opposite of what made the first iPhone so pleasant to use.

Yep—smooth scrolling was a famous obsession of Steve Jobs that was emphasized in just about every Apple product where it was feasible, the idea borrowed from one of the famed PARC demos. Little things like this go a long way in building a sense of symbiosis between user and device.


It was the first OS (that I ever used) where things just worked out of the box, and tasks that should be straightforward (like installing and uninstalling software) were straightforward and dead simple.


I mean, for me that was BSD. On the other hand, my concept of a straightforward install is messing around with config files and running a command.


Sandboxing of all user apps being enforced.


J2ME did it a lot earlier.


When the iPhone first shipped, it looked like a feature phone with a large screen. And what sold it was the Apple name and the iTunes integration. Effectively, for most who bought it, it was an iPod with phone capability first and foremost.


I think the Safari web browser was a very big deal at the time; it wasn't just a music player.


I'd argue that Safari was one of the biggest things that sold me on the platform, before the app store.

At the time I had a Windows Mobile 6 phone, and while there were a lot of great things about it, Safari's browser experience - even on iOS 2 - was revolutionary.

It absolutely blew the alternatives out of the park.


The browser on flagship Symbian devices was pretty good.


Never mind that Opera had existed in mobile form (nope, not Mini, full Opera Presto) for quite some time.


Well, the rendering I guess was OK, but with T9 and arrow keys actual use was horrendous.


Smooth multitouch apps on a small, limited device?


TL;DR - Microsoft's old guard saw that C++, HTML, and hierarchical file systems were stupid. They were right. They thought of some excellent solutions but were 1) ignoring what MS knows today about open source etc., 2) wrong on the cost and difficulty by a factor of 10, and 3) surprised by swift technology changes.

If the "new" Microsoft had materialized sooner and (e.g) successfully shipped .NET core in 2005 before betting on WinFS/Avalon then perhaps in 2017 we wouldn't be taping our backends together with javascript.


WinFS/Avalon

For those who may not know, Avalon was the code name (or perhaps just the previous name) of Windows Presentation Foundation, aka WPF - https://en.wikipedia.org/wiki/Windows_Presentation_Foundatio... - which was eventually formally released.

WinFS released a beta and a "beta refresh", but the project was cancelled before it was formally released.


this is a really misleading summary of the post.


OK, I started writing a TL;DR and then wrote an opinion instead of a summary - granted (too late to remove the TL;DR).


Better JavaScript than .NET.


Didn't they demo it with Flash...




