
Windows 7, Windows 8, Windows 9 - nreece
http://arstechnica.com/microsoft/news/2009/10/week-in-microsoft-windows-7-windows-8-windows-9.ars
======
daeken
128-bit addressing is great and all, but it's completely useless (for the near
future) unless we have finer-grained pagetables. Give me the ability to do
very, very fine-grained pages (let me bring it down to 16 bytes if I want to)
and suddenly a lot of new things open up. For a long time, I've been thinking
about representing 3d data spatially in memory using virtual memory mapping,
but the granularity of existing systems makes it extremely inefficient.
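
One way to sketch the idea (my own illustration; daeken doesn't specify a layout): a Morton/Z-order curve interleaves coordinate bits so that spatially adjacent voxels get numerically close addresses, and with hypothetical 16-byte pages only the occupied cells would ever need backing memory.

```python
def morton3(x: int, y: int, z: int) -> int:
    """Interleave the bits of three 10-bit coordinates into one
    30-bit Z-order index, so nearby 3D points map to nearby
    addresses."""
    code = 0
    for i in range(10):
        code |= ((x >> i) & 1) << (3 * i)      # x -> bits 0, 3, 6, ...
        code |= ((y >> i) & 1) << (3 * i + 1)  # y -> bits 1, 4, 7, ...
        code |= ((z >> i) & 1) << (3 * i + 2)  # z -> bits 2, 5, 8, ...
    return code

# With 16-byte pages, each voxel record could live at address
# morton3(x, y, z) * 16; untouched regions of the volume would
# simply stay unmapped.
print(morton3(3, 5, 7))  # -> 431
```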

------
vinutheraj
I personally liked this one - _Symantec sponsored a study by Dennis Technology
Lab which concluded that Norton Antivirus 2009 is superior to Microsoft
Security Essentials in on-demand scans as well as real-time protection._

------
InclinedPlane
This whole "128-bit windows 8" story has to be the dumbest piece of
"reporting" I've seen in a while.

Firstly, who is even working on 128-bit processors? Secondly, what is the
source for this news? At best the hint of a rumor. Thirdly, what industry need
do 128-bit processors satisfy?

All around it points to a lack of critical thinking and skepticism, if not
straight up naivete, gullibility, and ignorance bordering on incompetence on
the part of the "reporters".

This line strikes me as amusing: "Consequently, the company [MS] is also
forming relationships with major partners, including Intel, AMD, HP, and IBM."
Newsflash: MS already has close relationships with each of those companies.
There are many "business partners" from each of those companies who
effectively work on the Microsoft campus.

At best this "revelation" is about the Itanium processor (which uses 128-bit
instructions, though it retains a 64-bit word length and address space), in
which case it is not only boring but badly mischaracterized.

~~~
timdorr
Reminds me of when consoles went from 8-bit to 16-bit to 32-bit to 64-bit and
then sort of stopped (the Dreamcast was marketed as 128-bit for a bit).
Sounds like the author of this article is a bit nostalgic for his former
console days.

------
zyb09
Can anyone explain to me why we need a 128-bit OS anytime soon? I thought
the problem with 32-bit was mostly about memory address space, and 64-bit
solved that issue for good.

~~~
wyday
Because processor technologies aren't designed the weekend before they're
shipped. AMD's x86-64 technology (aka 64-bit) was publicly announced in 1999
(see: <http://en.wikipedia.org/wiki/64-bit>). It likely was in planning stages
years earlier.

When was the first time you _needed_ x64? (i.e. when did you run into the 4GB
RAM wall?) This year, maybe. Last year at the earliest. That's more than a
ten-year gap between when the technology was conceived and when it was needed.

~~~
oomkiller
Well, according to Wikipedia, 32-bit architectures started to appear in the
60s. We hit the 32-bit memory wall in some applications around 1990 at the
earliest. That's roughly 30 years between the first 32-bit archs and their
"topping out." Then you need to remember that this is an exponential change in
address space, not a linear one. 2^64 is MUCH MUCH bigger than 2^32; you can't
just think of it as "twice as much." In the 60s, they thought 4 billion bytes
would be far more than they would ever need, but 32-bit's headroom was only
about 1000 times the biggest memories of the day (assuming 4MB; IBM's
System/360 could address 8MB, which would make the headroom only 500 times as
much memory). 64-bit tops out at about 17.2 billion GB, and if we estimate
that today's common RAM size is 2GB, the headroom for 64-bit is about 8.6
billion times what most computers commonly have today. That means the headroom
for 64-bit is 8.6 MILLION times the headroom of 32-bit. For some reason I
think it will take us much longer to fill up 64-bit than it did to use up
32-bit.
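
The back-of-the-envelope numbers above check out (using GiB, 2^30 bytes, which is what the "17.2 billion GB" figure corresponds to); a quick sanity check:

```python
total_gib = 2**64 // 2**30           # 17,179,869,184 GiB -- the "17.2 billion GB"
headroom_64 = 2**64 // (2 * 2**30)   # vs. a 2GB machine: ~8.6 billion x
headroom_32 = 2**32 // (4 * 2**20)   # vs. a 4MB machine: 1024, i.e. ~1000x
ratio = headroom_64 // headroom_32   # 2**23 = 8,388,608 -- roughly the
                                     # "8.6 MILLION times" figure (the gap is
                                     # just the 1024-vs-1000 rounding)
```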

~~~
mechanical_fish
_this is an exponential change in address space, not a linear one_

To be fair, Moore's Law describes an exponential change in _usable_ address
space that we're still living through.

The numbers are kind of spooky. The stupidest statement of Moore's Law is
"everything doubles in size, at constant cost, every 18 months." If we take
that statement literally, and also assume that hitting the "32-bit memory
wall" became a widespread phenomenon around, say, 2009 [1], a backwards
extrapolation suggests that the RAM industry should have begun in 1961 with a
single byte. This is _frighteningly close_ to agreeing with actual history.
[2]

The implication is that another 32 bits only buys us another 48 years. After
that we'll need a 128-bit address space in order to properly index our
personal catalogues of _everything ever_. [3]

Or, perhaps, at some point we will just get tired of making RAM.
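
The extrapolation is just counting doublings (a minimal sketch of the arithmetic, taking the "every 18 months" figure literally):

```python
# Going from 1 byte to 2**32 bytes is 32 doublings; at 18 months
# (1.5 years) per doubling, that's 48 years per 32 address bits.
doublings = 32
years_per_doubling = 1.5
span = doublings * years_per_doubling   # 48.0
birth_of_ram = 2009 - span              # 1961.0 -- the backwards extrapolation
wall_64 = 2009 + span                   # 2057.0 -- when the next 32 bits run out
```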

\---

[1] This is when _I_ first noticed it. Yeah, people with millions of dollars
hit this wall in 1990. Must have been nice to be them.

You will note a certain degree of sleight of hand in my numerology here. ;)

[2] Intel's first memory product was a 64-bit unit in 1969. In the fantasy
modpunk universe where Moore's Law was operating during the Sixties, that
would imply a starting point of a single bit in 1960.

If we reject my narcissistic assertion that _nobody important_ cared about the
32-bit memory wall until this year, and instead accept oomkiller's assertion
that the 32-bit memory wall first appeared on the radar around 1990, that
would push back the birthday of the first bit to around 1941. I give you
Wikipedia:

 _The [Zuse] Z3 (1941) was the first working machine featuring binary
arithmetic, including floating point arithmetic and a measure of
programmability. In 1998 the Z3 was proved to be Turing complete, therefore
being the world's first operational computer._

Cue the _Twilight Zone_ theme.

[3] We probably won't need that 256-bit address space in 100 years, since
there are estimated to be only 2^266 atoms in the observable universe.

~~~
likpok
Exponential growth is unsustainable. In a universe where the infinite does not
exist (either big or small), we are strictly bounded. At some point Moore's
law will show its true colors (a logistic curve) and taper off.
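
A minimal sketch (the cap and growth rate are made up) of why a logistic curve can masquerade as exponential growth for a long time before tapering:

```python
import math

def logistic(t: float, cap: float = 1e6, rate: float = 0.5) -> float:
    """Logistic curve starting near 1: indistinguishable from
    exp(rate * t) early on, then saturating at cap."""
    return cap / (1 + (cap - 1) * math.exp(-rate * t))

# Early on it tracks the pure exponential closely...
early = logistic(10)   # ~148.4, vs math.exp(5) ~148.4
# ...but eventually it flattens against the carrying capacity.
late = logistic(40)    # pinned just under the 1e6 cap, while
                       # exp(20) would be ~4.9e8
```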

