

The end of Wintel - ab9
http://economist.com/node/16693547

======
lolipop1
The author seems to overgeneralize the "market" here. The effects described on
the markets are/will be much more subtle than the article claims. IT has
always been multi-faceted. It's not like MS, Intel, Oracle, IBM or Cisco could
have stopped doing marketing and still be where they are.

~~~
jacquesm
There's a good joke about that, it goes like this:

A man is seated next to Bill Gates in the business class of an airplane. "Mr.
Gates," he asks, "how is it that with your name and brand recognition, your
tremendous turnover, and the locked-in nature of your customers, you still
spend money on advertising?"

Gates replies: "Do you like the speed at which this aircraft is travelling?
How about we turn the engines off?"

~~~
InclinedPlane
I'm not overly convinced. A big reason why most major corporations advertise
so heavily is simply that it's something big companies do; it can be tough to
buck a trend, especially when billions are on the line.

But aside from that, advertising is a business expense that is fully tax
deductible. For many big businesses the net cost of advertising is zero.

~~~
jacquesm
If something is tax-deductible that does not mean that the net cost of doing
it is zero.

For example:

We have a turnover of $100 and a tax-deductible cost of $40, so our gross
profit before taxes is $60; if we pay 50% tax, the net is $30. If we had not
made the $40 tax-deductible expense, the net would have been $50.

So the net cost of something that is tax deductible is not zero.
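The arithmetic above can be written out directly — a minimal Python sketch of the example, using only the figures from the comment (the function name is mine):

```python
def net_profit(turnover, deductible_expense, tax_rate):
    """After-tax profit, assuming the expense is fully tax deductible."""
    pretax_profit = turnover - deductible_expense
    return pretax_profit * (1 - tax_rate)

with_expense = net_profit(100, 40, 0.50)     # (100 - 40) * 0.5 = $30 net
without_expense = net_profit(100, 0, 0.50)   # 100 * 0.5 = $50 net

# The "fully deductible" expense still cost us $20 after tax -- not zero.
net_cost = without_expense - with_expense
```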

~~~
Retric
Let's say Cisco spends $1B on advertising and ends up with an extra $800M in
sales and $900M in profit after taxes. Was that a good idea?

PS: The actual numbers may look like this. By selling more units your price
per unit usually drops. But sometimes it's better to sell fewer things at a
higher margin. For a company like Cisco, maintaining a specific brand image is
worth a lot. They may sell cheap home networking equipment, but they don't use
the Cisco brand on it.

~~~
jacquesm
Obviously not, but the argument was made that because advertising is tax
deductible, it is essentially free. Nobody talked about the effective result
of the advertising on the bottom line because of the extra revenue it might
bring, or about how bad it would be to advertise ineffectively.

~~~
Retric
_Obviously not_ -- err, my point was that $1B in pre-tax money is generally
worth significantly less than $900M in after-tax money.

Clearly it's not free, but due to the tax break advertising can have a lower
ROI and still be a better investment.
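Put another way, the effective after-tax cost of a fully deductible expense is expense × (1 − tax rate). A hypothetical sketch of Retric's example — the 35% corporate rate and the function name are my assumptions, not figures from the thread:

```python
def after_tax_cost(expense, tax_rate):
    """Effective cost of a fully tax-deductible expense."""
    return expense * (1 - tax_rate)

ad_spend = 1_000_000_000                         # $1B pre-tax, as in the example
effective_cost = after_tax_cost(ad_spend, 0.35)  # roughly $650M (assumed 35% rate)
extra_profit_after_tax = 900_000_000             # $900M after tax, per the example

# Nominally $900M < $1B, but after the tax break the spend still comes out ahead.
assert extra_profit_after_tax > effective_cost
```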

------
Raphael
AMD designed the 64-bit architecture.

~~~
Cabal
AMD designed _a_ 64-bit architecture. There were many 64-bit architectures
around long before x86-64 in 2000.

~~~
melling
It is probably worth noting that AMD brought 64-bit to the PC world. Intel was
telling customers they didn't need it.

AMD actually took a small bite out of Intel and Intel quickly changed its
mind.

Breaking the Wintel monopoly will spur a lot of innovation. A third choice
like Chrome or Android on tablets or notebooks might do the trick.

~~~
InclinedPlane
Intel was telling everyone they needed 64-bit (more accurately, post-32-bit)
technology long before AMD created x86-64. It was called IA-64, EPIC, or
Itanium. Intel made a huge gamble, betting that the next generation of CPUs
would be founded on a radically different VLIW-type architecture.

But Intel goofed: they didn't factor in how long it would take to fully
develop the new architecture (something that takes multiple iterations). The
first generation of Itanium was hugely expensive, didn't run x86 code well,
and couldn't easily masquerade as a plain-Jane x86 machine. (Not to mention
the problem of having to modify code so that it could be re-compiled for
IA-64.)

Meanwhile, AMD looked at the same problem and came to a different conclusion.
Rather than a revolutionary new 64-bit architecture, they thought an
evolutionary architecture was preferable: simply fix up a few of the most
glaring problems of x86 (such as the dearth of registers), widen the registers
and addressing to 64 bits, and call it good. They probably figured that since
everything under the hood is RISC/ILP-ish and hidden behind micro-code
translation anyway, the instruction set isn't super important as long as it's
semi-reasonable. They concentrated on maximizing performance for unmodified
code and on providing good benefits for using actual 64-bit-specific code
(such as more registers, well-engineered 64-bit memory addressing, etc.)

Itanium floundered while AMD64 became hugely popular very quickly. AMD64
processors ran x86 code faster than x86-only processors, so there was no
reason not to buy them, even if you never ran any 64-bit code. But once you
owned the hardware, upgrading to 64-bit was just a matter of an OS update
sometime in the future, allowing the x64 transition to be far more incremental
and far less of a leap of faith than it would otherwise have been.

Intel quickly realized their mistake and tried to implement a version of
x86-64 using the NetBurst (Pentium 4) architecture. This led to some really
sad machines and a very awkward time for Intel, as they were hit with multiple
setbacks at the same time. Eventually they succeeded with a very substantial
redesign of their hardware architecture (the Core series), which paid huge
dividends in its 2nd iteration as it proved to be far superior even to AMD's
offerings.

It's not at all that Intel thought the 32-bit world would last forever; it's
just that they screwed up royally in figuring out where to go from there.

Note that finally, many, many years too late, Itanium is actually a decent
platform now.

~~~
e40
From a [native compiler] developer's point of view, Intel's fatal flaw was
that the architecture is nearly impossible to write a compiler/assembler for.
That's what Intel told my company (which makes native compilers). They said we
wouldn't be able to make a high-performing compiler because the instruction
scheduling, etc. were too complex and not documented well enough for us to do
it.

A port of our compiler, which normally takes a couple of months, was estimated
at 1-2 years for IA-64, for a high-performing result.

This is where they failed.

Oh, and the AMD64 port (AMD64 is what it was called when we started) took 2
months, and that was fully optimized.

~~~
InclinedPlane
Yeah, I only mentioned that in passing, but it is a serious issue. All the
more pressing because a lot of the promise of IA-64 hinged on smart compilers.
It was a dramatic change of course for CPU architectures, since the previous
trend had been to make the hardware smart enough to figure out how to take
advantage of ILP on the fly, even with unmodified code (thus the superscalar
architecture of the P5, the out-of-order and branch-prediction strategies of
the P6, etc.)

------
known
Does Intel make chips for cell phones?

~~~
azim
There are rumors circulating that Intel is currently in talks to acquire
Infineon's wireless business. [http://www.eweek.com/c/a/Mobile-and-
Wireless/Intel-Infineon-...](http://www.eweek.com/c/a/Mobile-and-
Wireless/Intel-Infineon-Negotiate-Over-Wireless-Business-Reports-456279/)

~~~
nimrody
Infineon _does not_ produce application processors for the mobile market --
except perhaps based on ARM technology. They do make baseband and RF chips
that are used in cell phones.

Intel definitely wants to get into the mobile/cellular business and has made
a similar attempt before (search for DSP/C -- now owned by Marvell).

In fact, Intel was even making ARM-based application processors in the past
(called XScale and derived from DEC's StrongARM design). Eventually they sold
the entire division.

They would love to have a version of Atom powering mobile phones. Right now
ARM is the de-facto standard.

