
Windows 7 to break previous API compatibility - r7000
http://thebetaguy.com/exclusives/?postid=1029344029&title=microsoft-windows-7-exclusive
======
jcromartie
What is this nonsense about the number of files slowing the system down?
Surely seeking to a bunch of little libraries is slower than loading one file
initially, but does it matter in the long run? Once you're up and running
there should be practically no noticeable difference, right?
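For what it's worth, the open/seek overhead in question is easy to sketch with a toy micro-benchmark (my own illustration, not from the article; the file counts and names are made up): open 500 tiny "library" files versus one consolidated file of the same total size.

```python
# Toy benchmark (hypothetical layout): many small reads vs. one big read.
import os
import tempfile
import time

COUNT, SIZE = 500, 256  # 500 "libraries" of 256 bytes each

with tempfile.TemporaryDirectory() as root:
    # Lay out 500 tiny files plus one consolidated file of the same total size.
    for i in range(COUNT):
        with open(os.path.join(root, f"lib{i}.bin"), "wb") as f:
            f.write(b"\0" * SIZE)
    big = os.path.join(root, "all.bin")
    with open(big, "wb") as f:
        f.write(b"\0" * SIZE * COUNT)

    t0 = time.perf_counter()
    for i in range(COUNT):  # one open() per "library"
        with open(os.path.join(root, f"lib{i}.bin"), "rb") as f:
            f.read()
    t_many = time.perf_counter() - t0

    t0 = time.perf_counter()
    with open(big, "rb") as f:  # a single open() and read()
        f.read()
    t_one = time.perf_counter() - t0

print(f"{COUNT} small files: {t_many:.4f}s   one big file: {t_one:.4f}s")
```

On a warm cache the gap is almost entirely per-file open and lookup overhead, which is the point above: it costs you at load time, not once everything is running.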

I can't imagine that the number of libraries is what's slowing Vista down...
rather, it's just what those libraries are actually doing.

~~~
acrylicist
I've seen people who have never heard of "folders" save 50-60,000 items in one
folder. NTFS (if they keep it) slows to a crawl every time that folder's
accessed, because I don't believe it's storing the entries in a tree.

The other possibility is all of the memory mapping of hundreds of little
libraries into the system plays havoc with code locality and L1, L2 caches in
microprocessors. Since Vista is trying to be "more secure", there are probably
more monitors and validation between these address spaces than there were
before.

As for Vista and "what those libraries are actually doing": maybe they just
turned a bunch of interns loose on the APIs embedded in those libraries, and
the interns did the "almost simplest thing" that could possibly work, without
thought to performance. "Get it working and ship! Optimize it later."

I find Vista a nuisance: users complain that nothing looks the same, and the
"trivial" modifications to the user interface derail the "learned-by-rote"
user.

Microsoft would do the world a gigantic favor if they just conceded and made
the world "Microsoft Linux." Apple already did the right thing by adopting a
"UNIX" core to base their systems on, putting all the money into UI and
feature improvements. Microsoft is spending too much time overworking their
engineers who work on continual bomb squad responses to exploding deck chairs
to focus on usability for users.

------
hunterjrj
This all sounds great, but the author didn't cite a source for any piece of
information contained in the article.

------
dsplaisted
I call BS. The trouble Microsoft has had with Vista would be nothing compared
to the trouble it would have if ALL applications had to be recompiled to work
with Windows 7. The article cites no sources and doesn't offer any
substantiation at all for its claims.

~~~
johns
The article didn't say that everything _had_ to be recompiled, only that apps
could be. Without recompiling, apps can run in a virtualized, compatible
environment (like Classic on OS X). It didn't cite sources, but MS has alluded
several times to this being the plan.

~~~
wanorris
As common as virtualization is becoming, that could be a good solution as
long as they can keep a ceiling on the memory cost -- true virtualization can
be really expensive, memory-wise.

On the other hand, if they're simply talking about a conventional
compatibility box, this is less of a worry -- Windows has been doing
variations of this for years for backwards compatibility -- since long before
OS X.

This article skips past all the technical details, but the smartest solution
might be to make the new API purely managed (i.e. .Net), and run all unmanaged
apps in a compatibility box. This is great for stability, and it ends the days
where you have to dig down into C or C++ and mess around with window handles
to use certain OS features that aren't available otherwise.
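The shim idea can be sketched in a few lines of Python (every name here is invented for illustration; this is the pattern, not anyone's actual plan): legacy apps keep calling the old entry points, and a compatibility layer forwards them to the new API.

```python
# Toy compatibility-shim sketch (all names hypothetical): old entry points
# survive as thin wrappers over a replacement API.

class NewWindowAPI:
    """Stand-in for the hypothetical new 'managed' API."""
    def __init__(self):
        self._windows = {}
        self._next_id = 1

    def create_window(self, title):
        handle = self._next_id
        self._next_id += 1
        self._windows[handle] = title
        return handle

    def window_title(self, handle):
        return self._windows[handle]

class LegacyShim:
    """Old-style entry points, emulated on top of the new API."""
    def __init__(self, new_api):
        self._api = new_api

    def CreateWindowEx(self, title):  # old name kept so old callers still link
        return self._api.create_window(title)

    def GetWindowText(self, hwnd):
        return self._api.window_title(hwnd)

api = NewWindowAPI()
shim = LegacyShim(api)
hwnd = shim.CreateWindowEx("Legacy App")
print(shim.GetWindowText(hwnd))  # prints "Legacy App"
```

The design point is that unmodified binaries only ever see the shim's surface, so the new API underneath is free to change.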

------
chaostheory
I have high hopes, but then again I had high hopes for Longhorn and Xbox 360
too...

Given MS's history of not giving high priority to reliability and quality,
past behavior is typically a good predictor of the future.

