I'd love to see AMD at a 22nm process alongside Intel and then compare them. They've managed to stay close even two process generations behind (the chip that just came out today is at 32nm).
If AMD hadn't been there in 2004, forcing Intel to (effectively) drop the Itanic and do what customers actually wanted - the AMD64 platform - we'd either be much farther back now in terms of x86, or much farther along in terms of ARM (or both).
AMD's guts in doing AMD64 helped us all.
They showed the path to better integration and performance (on-chip memory controller, significantly better interconnect, better multicore integration) as well.
It's just sad how much they lost their way since the Athlon 64, and how Intel's Core just curb-stomped them.
While it is true that the P4 was going nowhere, it was the 64-bit market that forced Intel to reconsider their road map. If it weren't for AMD, 32-bit might have sped ahead to Core, or stayed at the P4, but we'd be nowhere close to where we are today.
The small form factor gaming/HTPC market could be a good one for AMD, but they haven't been able to get any other companies to build them in volume. Usually you get the eeeBox or something underpowered using an E-350. Heck, HP is sticking those in full desktop cases, which is absurd.
All the innovation left in the PC market seems to be in tablets and ultrabooks, not on the desktop at all. AMD hasn't pushed hard on either form factor, and it's hurt them.
Also, why hasn't AMD done what Nvidia did and become an ARM chipmaker? Tegra has sold well enough for Nvidia, and an ARM desktop box could be quite competitive in the next few years for the average user.
Their best move would probably be to license the ARM platform and use their chip designers to make custom chips (like Apple and Qualcomm have), but I don't think this is realistic. Also, I'm not sure whether any of their GPU technology is low-power enough to be useful in ARM designs.
When they were supposed to do something about the mobile market, they said they would "wait and see". That was even after Nvidia made the right move and created Tegra. Nvidia was clearly a more visionary company than AMD, and Nvidia will survive because of it. It might even out-survive Intel because of their move to ARM. AMD won't. They'll be crushed by both Intel and the ARM chip makers.
tl;dr Nobody with any sense would take the CEO job at AMD.
Having a second source for strategic parts was said to be a requirement for US government/military procurement. My guess is that if AMD is really on the ropes, they'll get a DoE contract for a new supercomputer or three to hold them over for a while longer.
I guess a question would be, would Intel be found guilty of illegally causing the demise of AMD?
I don't think so. However it's worth mentioning that Intel certainly acted in anti-competitive ways to severely damage AMD's ability to gain marketshare. The suit was eventually settled though, with Intel paying AMD a huge sum.
And it isn't even really their fault that they're building new fab plants a year ahead of anyone else. It would seem wrong to me to force Intel to sell spare capacity from their own plants on the open market if they don't want to, but that is the major reason they always dominate the PC space. (The other reason is that most software is for x86 and they license the architecture, but I don't really buy that anymore - hardware virtualization has come a long way, and I can pretty effectively emulate x86 on ARM under qemu with binary translation and the hardware instructions that support it, which every major architecture now has.)
The only issue is that software needs to be optimized for ARM processors, but if the savings are there, that will happen pretty quickly.
So I'd actually be worried for Intel as well. Not just AMD.
Does anyone have ANY numbers that back this up? I've heard this refrain many, many times, but I've never seen any hard numbers to prove it. On a processor-per-processor basis, sure, an ARM SoC beats an Intel Xeon. But flop for flop, or Dhrystone for Dhrystone, x86s destroy ARM processors. In a virtualized world, where the number of physical systems doesn't have to match the number of servers, x86 still appears to hold the lead.
It has to do with the evolutionary heritage of both systems. Most x86 systems are still sold to individuals or small businesses that will plug them into the wall and forget about power dissipation. A typical x86 desktop machine will draw between 300 and 500 watts. ARM evolved for the cell phone and tablet market, and typical power consumption for one of those systems is under 5 watts.
With a beefy GPU, perhaps. The new i3-3220T is only 35W max TDP, flat out - a typical (i.e. non-gamer) rig is looking more like 150W max, and a lot less at idle (20W should be possible). Not 5W, but nowhere near 500W.
You also need to consider that Intel is at least a fab generation ahead of ARM chips. I have a Tegra 3 Transformer tablet, and that chip was fabbed at 40nm, two generations behind Intel's 22nm, and it has a TDP of 15 watts. Of course under load it would run higher than that, but so would an Intel chip.
And if you think your typical beige-box PC can handle a power supply that is specc'ed for 150W-- go ahead and put one of those in there. I DARE you.
The ubiquitous small form factor PCs like the Optiplex 780 (http://www.dell.com/downloads/global/corporate/environ/compl...) use a 235W (max) power supply, which will be overspecced to trade off failure rates for manufacturing cost. Those machines actually draw less than 150W flat out. And they're everywhere. A certain large e-tailer with an emphasis on frugality used to use them as developer desktops(!).
Who knows what's in a typical consumer beige box, but it isn't pulling 500W continuously, unless they're playing, say, Skyrim 24/7 with a big graphics card - in which case of course one would specify the correct (safe) component for the design. I'd argue that they're not typical by that point; most people won't spend £300 on a graphics card (I do).
* You point to a 235W power supply as an example of the bare-minimum PC power supply-- not too far from my 300W round number.
* You point out that a power supply rated for X isn't drawing X continuously-- a true statement, but it's responding to an argument nobody made. You have to pick a power supply rated for your max load-- everyone knows that, or should. It still doesn't change the fact that both max load and average load for x86 are orders of magnitude greater than for most ARM devices.
My ageing desktop with an E8400 and a Radeon 4850 draws 270 watts at full load, including the display - number straight from the UPS.
It has nothing to do with evolutionary heritage and everything to do with you repeating ancient myths that haven't been remotely close to the truth for a decade.
Another example of "evolutionary heritage" is the fact that x86 chips require a northbridge and/or southbridge, whereas with ARM chips, everything is integrated on the chip. This was one reason why Atom-based designs often weren't that low-power-- the CPU itself might be low-power, but the glue logic was thirsty. There is evidence that Intel is trying to change this and put everything on one chip.
I'm not trying to say that x86 will never succeed in mobile. I don't have a crystal ball. I'm just saying that the burden of proof is on Intel to prove that it can be cost and performance-competitive in that space. And I am not the only skeptic-- Apple and Microsoft use ARM for most of their mobile offerings.
Which is ironic since Trinity could drive a retina-like display without a discrete GPU on the side, yet I could only find a handful of Trinity laptops with optional 1080p displays and two weren't available stateside. The only performance unit I could find was made by MSI.
There's nothing like the Zenbook or the ENVY 15 available, so in the end the problem isn't a compromise on CPU power alone but on nearly everything else too. So you have to choose: either you get a good laptop or an AMD laptop, and that's not fair.
I guess AMD should start working more closely with OEMs to make sure its APUs show up not just in bargain-bin units but in at least some mid-to-high-end units with good features and build quality.
That, or do what MS did with the Surface and make their own high-end laptops and tablets.
That isn't even about monopolistic business practices, decisions, or market forces. You are comparing two companies operating on effectively different planes of existence. Intel owns the instruction set, has the most advanced silicon fabs in the world (and still makes their chips in house) and spends more on R&D than AMD even makes. And all Intel does is make CPUs.
Meanwhile, AMD bought ATI and took a tremendous gamble on APUs. They are just starting to mature their APU line with Trinity in the last few weeks, and are still reeling from integrating two large companies like that. They had to sell off their own fabs and couldn't even make their most recent generation of GPUs at Global Foundries because GF isn't keeping up anymore. On that front, the 7000 series graphics cards (from my objective viewpoint) basically crushed Nvidia for the first time in a while. They were first to market, as a result didn't have major shortages, and cut prices at the appropriate times to keep their products competitive. It took Nvidia almost half a year to get their GPU line out after AMD's, and their competitively priced chips are almost exclusively OpenGL/graphics devices: they're beaten in GPGPU workloads by the old 500 series and easily by the 7000 series, because Nvidia went with many limited-pipeline cores over the more generic cores that made the 500 and 7000 series better at arbitrary GPU compute tasks.
So they are doing really well in graphics, and their APUs are really good graphics chips too. The only flaw in AMD right now is that they are floundering on the CPU front as badly as Nvidia did with their graphics line. Their CPUs eat power, they are effectively 1.5 generations of fab tech behind, and the Bulldozer architecture is weak in floating-point and serial operations.
That doesn't ruin a company. Hopefully next year is the year they really start moving forward, because I really think AMD is the company to finally merge GPU and CPU components into some kind of register/pipeline/ALU soup that could really revolutionize the industry (imagine SIMD instruction extensions to x86-64 that behave like OpenCL parallel operations, with the normal processor cores working on register ranges and vectors like a GPU, rather than just having a discrete GPU and CPU on one die).
Even barring that kind of pipe dream, Steamroller is shaping up to be sound. It finally gets a die shrink AMD desperately needs to stay competitive, if only to 28nm, and finally puts GCN into their APU graphics instead of the 6000 series era VLIW architecture.
They can't really stand up and fight Intel head on anymore, because Intel got on the ball again, and their cpus are crushing AMD in a lot of use cases, especially power usage. But AMD still has significantly better graphics, and are leveraging it, and they are finally getting over the ATI growing pains, so I'd wager they are still in the game, if only barely. They have a lot of potential still.
Footnote: I really think the market is a big reason AMD is falling behind. The Ultrabook campaign is stealing wealthy PC buyers from them, and that is where chip makers get the majority of their profits (look at the high-end mobile i7 chips selling for a thousand bucks). Desktop sales are abysmal outside OEM systems and businesses, and Intel wins at getting business contracts by size alone; they just have more reach. Desktop enthusiasts can bank on AMD being a cost-effective platform, but the wow factor lies in Intel chips, even at a premium, so Intel steals that market too. AMD doesn't even do well in the cheap HTPC market because their chips burn so much power. They are at a crossroads where all their target markets are either becoming obsolete or losing ground, and not because they have bad products, but because perceptions of them and their influence keep getting worse.
Right now, AMD is really strong in the mid range. Mid-range laptops with a Trinity APU are really good and extremely cost-effective (I had a friend buy an A8-based Toshiba because it was $500 cheaper than a comparable Intel machine that could run League of Legends). Piledriver is good enough in the desktop space to recommend one of the 4- or 6-core variants to friends looking for a budget PC gaming experience, because with a proper overclock they are pretty much more than enough for anything major. But AMD has (from my experience) a bad image right now as a dying company and a maker of budget goods, even when their GPUs kick butt and their desktop CPUs can (at least according to the recent Phoronix Piledriver FX benchmark) hold their ground against even Intel's best Ivy Bridge offerings in some cases, at almost half the price.
TLDR: I guess after graduating college I had withdrawal on writing essays. This is a really long wall of text, holy bacon.
Also, doesn't Intel make GPUs too?
They are saying Haswell will be an improvement, but AMD has the architectural cohesion to pair discrete and integrated cards in their Hybrid CrossFireX, and they have a decade's worth of GPU experience from the ATI acquisition, so they are better positioned to exploit heterogeneous cores. It's the same way Nvidia's Tegra is a gaming/video powerhouse in the mobile world because the GPU is so strong.
Also, x86-64 was/is a specification extension by AMD, not an instruction set, so I don't think Intel licenses it. They even spent a few years calling them AMD64 and Intel 64 even though they were written to the same spec. AMD still licenses x86 from Intel, though. It is like how SSE and other instruction set extensions are not cross-licensed between the two. https://en.wikipedia.org/wiki/X64#History_of_Intel_64
Intel makes SSDs too. Intel made chipsets and wifi chips for Centrino in the past. I think Intel has made every major internal PC component, just not all at the same time. Now they're going up against ARM chips and fighting to put x86 into mobile.
Likewise, nVidia has made nForce chipsets and other major PC components. And doesn't the Tegra also do general-purpose processing? Either way, they're becoming entrenched in mobile devices more firmly than Intel.
AMD is better positioned to come up with a unified product, but after 10 years of fanboying for both brands, I'm still not convinced that their merger made any sense. A unified product could have been made from a technical partnership, without having to merge the companies. I haven't seen a compelling product since the merger. I haven't even seen a roadmap for a mobile device component.
The AMD/ATI merger seemed only to happen because both companies were based in Canada. At this point, it only makes sense for AMD to swallow RIM, another Canadian company -- at least that way ATI would be a mobile device manufacturer.
But I'm talking more about the 77xx vs the 66x lines, which are the most mainstream discrete cards in the series, where the 77xx came out in February and the 66x cards came out in August. So about 6 months.
For all the hype around the 7970 vs the 680, in the end the vast majority of their OEM card sales will be with less expensive hardware, which is why I think AMD "won" even if the 600 cards give better FPS at lower power usage in games. They basically controlled the market for mid-range cards for almost half the year, and judging by the price cuts they've been making, they have been making a really sizable profit off sales until Nvidia brings out a competitor.
I just want to mention I'm not an AMD fanboy - I have an i7 920 and a GTX 285 right now. My "last" build was around 2006 and was an Athlon X2 with an X1950.
Also I've spent most of my evening reading reviews of the new AMD CPU they released and it is looking good for budget enthusiasts.
I want AMD to continue to compete and push but one bad architecture takes a while to overcome. Bulldozer++ needs to deliver.
What nobody ever seems to consider is that AMD's revenue is less than what Intel spends on R&D. The fact that they can even compete at the scale they do with Intel is a testament to their success. Intel was always way too big for AMD to compete against directly, and after the Athlon's success they tried to move into the big leagues and fight Intel one on one, and lost just due to raw funding (I'd argue, at least). It's why they had to sell off Global Foundries after buying ATI, and such.
There was an announcement that AMD will make ARM-based chips in the near future. That niche might have some more air supply than the one they're in currently. However, they'll have a lot of competitors in the ARM space, so who knows.
Like Bulldozer, it was taking a risk that didn't pan out as hoped. Thank goodness CPU companies are still taking interesting risks.
I also wonder what role AMD proper (as opposed to GF) will have in some kind of theoretical future world where ARM provides all the chip designs. Wasn't architecture sort of their main thing previously? I know a lot of companies add their own little things on to the base ARM designs, but it still seems like AMD will have to scale back their design team considerably in such a scenario.