
Intel's Haswell Architecture Analyzed: Building a New PC and a New Intel - reitzensteinm
http://www.anandtech.com/show/6355/intels-haswell-architecture
======
bornhuetter
I really hope that the threat of Apple switching to ARM causes Intel to push
Linux as an alternative to OS X.

I'm not holding my breath, but I can only hope.

~~~
mdasen
Linux is unlikely to be an alternative to OS X (for most users). The key is in
looking at why most users decide to buy an OS X machine. Many people on here
probably bought one because it provides a decent _nix-style OS with a normal
command-line interface and the ability to run the things they use for
development in much the same way as their Linux deployments. However, that
isn't the majority of OS X users.

The majority of OS X users probably bought it because it felt more like a
consumer electronic device than their Windows computer did. Apple's
integration between the hardware and software means that things like trackpad
drivers, sleep mode, ambient light sensors, etc. all just work nicely and
smoothly. Many PC manufacturers treat a feature as a boolean: the machine
either has it or it doesn't. For example, a friend of mine has
a Sony laptop with a backlit keyboard - just like my MacBook. Except that, in
practice, it isn't like my MacBook. Either the light sensor is faulty or it
doesn't use one, but it doesn't change the backlight brightness automatically.
When you manually change the brightness, sometimes a box on the screen comes
up showing you raising and lowering it, and sometimes it doesn't. The same can
be said for the WiFi helpers that companies seem to install to manage the
WiFi: they feel cheap and poorly executed. So, for many users, buying a
MacBook is more like buying a consumer electronic device: it feels like a
cohesive, deliberately manufactured product, whereas a Windows machine feels
like an OS sitting on top of hardware, with the pieces supporting
hardware-specific functionality created hastily and poorly.

If Intel pushes Linux as an alternative, it could help Linux along for those
wishing they had a Windows machine with a _nix development environment (which
is what I think you mean when you say "alternative to OS X"). It's unlikely to
create
something that most OS X users would want since mainstream Linux would mean
the same hastily made pieces to do hardware-specific things. Just look at
Android. It's great, but what does everyone complain about? The ways in which
manufacturers customize it to make themselves seem different from everyone
else.

--

In terms of the likelihood that Intel will do this, I think it's highly
unlikely. You talk about Apple (potentially) switching to ARM and Intel losing
that. However, if Intel pours money into Linux to make it a better OS than it
is today, that isn't helping Intel's case. Linux can transition to ARM a lot
more easily than OS X can. Most Linux software is open source, meaning that
even if
the original maintainer is gone, someone else can recompile the software. For
OS X and Windows, it would be a lot harder for them to transition processor
architectures since they would need developers to do that work or provide a
translation layer like Apple did moving to Intel.

Making Linux the best OS out there would only make it easier for people to
leave Intel processors. With OS X and Windows, Intel has a certain amount of
lock-in. It's growing smaller, but there's a lot more lock-in than with Linux.

Ultimately, Intel has to compete to make a processor that people want.

~~~
bryanlarsen
I think to most people, Android feels more like a consumer electronics device
than even OS X. Yes, Samsung's Android is different from HTC's Android, but only
geeks care about that.

Intel is putting a lot of effort into making sure that Android runs well on
Intel. As a side effect, stock Linux will run well, too, since the vanilla and
Android Linux kernels are converging.

------
mtgx
Anand is too optimistic about Apple adopting Intel chips in iPads or iPhones
in the future. Personally, I don't think that's ever going to happen, and I
don't think it makes any sense either. Apple is trying to become an ever more
integrated company, and apparently making their own chips is becoming very
important to them. Recent rumors point to the same conclusion, although we
probably won't see it happen until 2014:

http://www.techradar.com/news/computing/apple/apple-reportedly-wants-to-ditch-intel-chips-in-macs-hire-its-own-soc-designer-1101772

From what I hear, Haswell will be significantly more expensive than Ivy
Bridge, too, so to me it looks like Intel is becoming less competitive with
ARM, not
more. But I think having full control over their chips matters more than
anything else to Apple, and I think they will move there at least gradually.
Apple's A6 is a significant step in that direction, and I don't see them
moving back from it.

~~~
bryanlarsen
Anand is probably talking more about Windows 8 tablets. We're already getting
Ivy Bridge based tablets, so it's obvious that we're going to get Haswell
tablets. The article just shows how compelling those tablets are going to be,
as long as you don't mind paying a few extra dollars.

I suspect that this also means that we're going to get Haswell based Android
tablets, which along with initiatives like Ubuntu for Android might be really
compelling to a certain subset of HN readers.

------
Symmetry
Good article. I've always wondered how the bypass network on Intel chips
works, though. Is it between the ports or directly between the execution
units? I also can't imagine it being fully bypassed in either case, so I
wonder what it looks like.

------
theevocater
Intel is taking the necessary steps to stay relevant as smaller, lower-power
devices become the norm. Ultrabooks (and their silly netbook cousins of
the past) are merely stepping stones on the path to more embedded, power
sipping devices everywhere in our lives.

I'm curious what this means for Nvidia, which is now essentially competing on
both big fronts. On one end, Nvidia has to stay relevant with its video cards.
Things like Tesla are going to be big business, but how do you compete with
Intel's on-chip offering and keep power demands low enough? On the other side
of the wall you have Tegra. Intel is a much easier target here, as Tegra is
ARM-based and can already be found in Android phones and tablets. And as the
article indicates, Haswell is still not equipped to truly compete with Tegra
at the low end of the power spectrum.

Also, and it hurts to mention it, what about AMD? Bulldozer was at best a
lukewarm release and Piledriver still isn't out. On the other hand, AMD's
graphics offerings still compete well with Nvidia's. Big-box retailers don't
offer much in the way of AMD, but ATI still sees significant usage. Dell, for
example, only offers Opterons in its servers, but offers plenty of AMD
graphics cards in its consumer lines. But AMD has the same problem as Nvidia:
this side of the business is being disrupted by things like Ultrabooks and
tablets, so it's hard to see how this will last. For AMD to compete with
Intel, it would seem they need to focus on SoC designs.

Lastly, one thing I found very interesting was the various winks and nods to
the fact that Intel creates a lot of interesting features that are more
expensive, but only 'one' manufacturer ever uses them without the need for the
stick. That manufacturer is clearly Apple.

People often like to parrot the meme that Macs are simply more expensive PCs,
but it seems clear here that other manufacturers require the stick to get them
to make improvements to their boards.

> Intel gave one example where an embedded controller on a motherboard was
> using 30 - 50mW of power. Through some simple firmware changes Intel was
> able to drop this particular controller's power consumption down to 5mW.
> It's not rocket science, but this is Intel's way of doing some of the work
> that its OEM partners should have been doing for the past decade. Apple has
> done some of this on its own (which is why OS X based notebooks still enjoy
> tangibly longer idle battery life than their Windows counterparts), but
> Intel will be offering this to many of its key OEM partners and in a
> significant way.

You don't drop power consumption by an order of magnitude that easily unless
the original chip was seriously poorly designed.

------
batgaijin
I still can't help but think that the name is a little jab at the added
hardware-level STM support...
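(The "hardware-level STM support" being referenced is Haswell's TSX
extensions. For readers who haven't run into transactional memory, its core
read-set/write-set semantics can be sketched in software. This is a toy sketch
only: the `TVar`/`atomically` names are borrowed from Haskell's STM library,
and none of it reflects how the hardware is actually built, since TSX tracks
the read and write sets in the cache hierarchy instead.)

```python
import threading

class TVar:
    """A transactional variable: a value plus a version counter."""
    def __init__(self, value):
        self.value = value
        self.version = 0
        self.lock = threading.Lock()

class _Abort(Exception):
    """Signals that a conflicting update was detected; retry the transaction."""

def atomically(fn):
    """Run fn(read, write) optimistically: buffer writes, record the version
    of everything read, and commit only if nothing changed underneath us."""
    while True:
        read_log = {}    # TVar -> version observed at first read
        write_log = {}   # TVar -> buffered new value

        def read(tv):
            if tv in write_log:              # read-your-own-writes
                return write_log[tv]
            read_log.setdefault(tv, tv.version)
            return tv.value

        def write(tv, val):
            write_log[tv] = val

        result = fn(read, write)
        # Commit: lock every touched TVar (in a fixed order, to avoid
        # deadlock), validate the read set, then publish the writes.
        touched = sorted(set(read_log) | set(write_log), key=id)
        for tv in touched:
            tv.lock.acquire()
        try:
            if any(tv.version != ver for tv, ver in read_log.items()):
                raise _Abort                 # someone committed under us
            for tv, val in write_log.items():
                tv.value = val
                tv.version += 1
            return result
        except _Abort:
            continue                         # conflict: rerun fn from scratch
        finally:
            for tv in touched:
                tv.lock.release()

# Usage: move 30 units between two accounts; either both writes land or neither.
a, b = TVar(100), TVar(0)
atomically(lambda read, write: (write(a, read(a) - 30),
                                write(b, read(b) + 30)))
```

The point of the example is the shape of the guarantee, not the mechanism: the
transfer either commits in full or reruns, with no intermediate state visible.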

