Intel slashes prices in face of AMD competition (anandtech.com)
202 points by taspeotis 14 days ago | 159 comments



Finally. They were basically asking double the price for some CPUs until there was strong competition. This is a really good example of why competition matters for consumers. I hope a similar situation happens for GPUs too. Right now NVIDIA asks for way more than the cards are worth.

Even if it may still take some years, I have really high hopes for chiplet GPUs.

Would the GPU have its own RAM or simply share address space with a CPU?

Its own RAM. The memory bandwidth of my GPU (616 GB/s) is about 23 times higher than that of my system memory (26.5 GB/s). And I have applications that utilize 90% of that GPU memory bandwidth.
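
(For context, a rough way to get the host-memory side of that comparison is a large timed memcpy; the GPU side would be an analogous device-to-device copy through the vendor's runtime. A minimal, single-threaded sketch, which typically understates the platform's real peak that tools like STREAM measure more carefully:)

```cpp
// membw.cpp - crude single-threaded host memory bandwidth estimate via memcpy.
// Treat the result as a rough lower bound on what the memory controller delivers.
#include <chrono>
#include <cstddef>
#include <cstdio>
#include <cstring>
#include <vector>

int main() {
    const std::size_t bytes = std::size_t(256) << 20;  // 256 MiB per buffer
    std::vector<char> src(bytes, 1), dst(bytes, 0);

    const int reps = 20;
    auto t0 = std::chrono::steady_clock::now();
    for (int i = 0; i < reps; ++i)
        std::memcpy(dst.data(), src.data(), bytes);
    auto t1 = std::chrono::steady_clock::now();

    double secs = std::chrono::duration<double>(t1 - t0).count();
    // memcpy reads and writes every byte, so count 2x the buffer size per pass.
    double gbps = 2.0 * reps * bytes / secs / 1e9;
    std::printf("approx. host memory bandwidth: %.1f GB/s\n", gbps);
    return 0;
}
```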

Thanks! I've written code for discrete GPUs but not integrated, and so I'm unsure what the advantages would be. Would there be lower latency for transfers, then, if not higher throughput?

For real 3D performance, it would need dedicated GDDR. Regular RAM won't cut it for a GPU at the high end. If all they want is mid-tier, DDR4 would work.

DDR4 isn't even adequate for mid-tier. NVIDIA's lowest end desktop GPU (GT1030) comes in DDR4 and GDDR5 variants and the DDR4 version has roughly half the performance.

I believe a chiplet GPU is meant to be two or three chiplets on one graphics card, not an APU design.

I guess so, but a lot of buyers aren't even considering AMD. They're just using it as a stick to beat Intel with for a better deal.

Which buyers exactly? Enthusiasts seem to be switching overwhelmingly to AMD: e.g. Mindfactory having AMD at 81% now. [0] Anecdotally, there seems to be a lot of resentment towards Intel for their shenanigans, I don't personally know anyone who's even considering Intel for their own builds.

It's a different matter for laptops where consumers have much less say on the topic, of course.

[0] https://m.imgur.com/gallery/9edvLSn


>It's a different matter for laptops where consumers have much less say on the topic, of course.

The rumor is that the Microsoft Surface Laptop announced today will be AMD-based, which will be the beginning of the end for Intel as other OEMs fall in line.


I wonder if Intel will resort to dirty tricks like last time, when the Athlon 64 came out and offered better performance than the equivalent Intel CPUs.

Now that a large segment has switched to AMD for performance, they're also going to see the light about not getting screwed by the CPU socket changing on Intel's platforms.

This is a big deal IMO.


I think he means relevant buyers, like Google, Apple, etc.: people who order their CPUs by the cubic meter.

Google's planning on using semi-custom designs for their new gaming platform. AWS is already ordering and using tons of server chips and Google doesn't seem too far behind.

Google is also looking at AMD for its servers, as AMD is beating Intel so badly in all cost metrics with the current generation that using Intel is wasting money.

Google has been "looking at" AMD for decades, including rolling out an enormous fleet of AMD machines spanning several generations. They have a big platforms group and plenty of engineers to dedicate to trying to make the TCO pencil for any platform, even POWER and ARM. The existence of a new AMD part that could actually work is not a new development in the relationship between Google and Intel.

Even if AMD is used just as a leveraging stick against Intel, that's a win for customers.

There's the old story of talking to an Oracle salesman while drinking from an IBM coffee mug, to signal that there are other viable options and get a better deal.


The real problem is that AMD doesn't invest enough in the software.

Nvidia's Linux support absolutely puts AMD to shame.

The biggest lesson in hardware is that "it's the software support, stupid". Too many hardware manufacturers think they are selling hardware alone. They are selling hardware AND the software that makes it work.

A computer is useless without software support. Witness the vast array of essentially useless ARM devices out there because none of them have usable software drivers/Linux support.


Nvidia's Linux support? AMD actually has open source drivers these days, just like Intel. Nvidia just gives you a binary module, which means that in order to make use of that hardware you have to taint your kernel and aren't free to update it if the in-kernel ABI changes. It's basically on par with the vendors of all those "useless ARM devices out there", none of them with "usable Linux support".
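
(For the curious: the kernel exposes its taint state in /proc/sys/kernel/tainted. A minimal sketch that decodes the flags relevant to out-of-tree binary drivers, assuming the standard taint bit assignments:)

```cpp
// taint_check.cpp - read /proc/sys/kernel/tainted and report the flags most
// relevant to binary drivers. Bit positions follow the kernel's documented
// taint flags (P = proprietary module, O = out-of-tree, E = unsigned);
// treat this as an illustrative sketch.
#include <cstdint>
#include <fstream>
#include <iostream>

int main() {
    std::ifstream f("/proc/sys/kernel/tainted");
    if (!f) {
        std::cerr << "could not read /proc/sys/kernel/tainted\n";
        return 1;
    }
    std::uint64_t taint = 0;
    f >> taint;

    auto report = [&](int bit, const char* what) {
        std::cout << what << ": " << ((taint >> bit) & 1 ? "yes" : "no") << '\n';
    };
    report(0,  "proprietary module loaded (P)");
    report(12, "out-of-tree module loaded (O)");
    report(13, "unsigned module loaded (E)");
    std::cout << "raw taint value: " << taint << '\n';
    return 0;
}
```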

AMD is finally approaching parity in the past year or two, but Nvidia is still ahead in terms of day-to-day usability. It sucks that they're closed, but if you just want to install a driver and have your system work, Nvidia is ahead.

Look at this mess: https://wiki.archlinux.org/index.php/Xorg#AMD Do you want AMDGPU, AMDGPU PRO, ATI, or Catalyst? How do you choose? Let's assume you want open source and recent, so probably AMDGPU. OK that's relatively straightforward, but now let's say you want Vulkan support. That's over here: https://wiki.archlinux.org/index.php/Vulkan#Installation and golly wouldn't you know it there are three choices, vulkan-radeon, amdvlk, and vulkan-amdgpu-pro. How do those relate to the radeon and amdgpu drivers mentioned earlier? Arrrrrrrrrrgh.
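
(Side note: if you do end up with more than one of those Vulkan packages installed, one way to see which driver the loader actually picked is to enumerate the devices it exposes; RADV, AMDVLK, and the PRO driver report distinguishable device strings. A minimal sketch, not tied to any particular distro:)

```cpp
// which_vulkan.cpp - list the Vulkan devices the loader exposes, a quick way
// to see which ICD ended up handling your GPU. Build with: g++ which_vulkan.cpp -lvulkan
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkInstanceCreateInfo ci{};
    ci.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    VkInstance instance;
    if (vkCreateInstance(&ci, nullptr, &instance) != VK_SUCCESS) {
        std::fprintf(stderr, "no Vulkan ICD found\n");
        return 1;
    }
    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, nullptr);
    std::vector<VkPhysicalDevice> devices(count);
    vkEnumeratePhysicalDevices(instance, &count, devices.data());
    for (VkPhysicalDevice dev : devices) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(dev, &props);
        // RADV, for example, typically includes "RADV" in the device name.
        std::printf("%s (driver version %u, Vulkan %u.%u)\n",
                    props.deviceName, props.driverVersion,
                    VK_VERSION_MAJOR(props.apiVersion),
                    VK_VERSION_MINOR(props.apiVersion));
    }
    vkDestroyInstance(instance, nullptr);
    return 0;
}
```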

Meanwhile on nvidia, you install "nvidia" and you're done.

I've been working in gaming on Linux professionally for ten years. If I were buying a new GPU today, I would pick nvidia.


I think you might be blinded by prior experiences (yes, that sounds illogical). The choice you have with AMD now is not bad for users: they get a good, free, working driver with their distro out of the box. And people like you, who work in gaming under Linux and might use a more custom distro, can figure out which one is the best driver in your specific situation (it's always AMDGPU anyway).

I'd never buy an Nvidia card for Linux now. I wouldn't have done so three years ago either, but now? That'd be a huge step backwards.


It's possible. The last time I bought an AMD GPU for my personal use was in 2009 or 2010. But I really do work with this stuff every day. In our QA lab, the AMD machines have far more problems than the Nvidia machines do. Valve's VR software works much less consistently; installing modern versions of the drivers on Ubuntu is a huge pain, since you have to use third-party repos maintained by some random guy; and missing features and driver bugs are much more common on AMD (though not unheard of on Nvidia either).

AMD is definitely getting better, and being open source is an enormous point in its favor, but it's still just less usable day-to-day for an end user. I hope this changes.


Unfortunately very few Linux people even see a point in actually using their GPU. As long as it can composite terminal emulators, they think everything is great. I've seen such reasoning in discussions about sway, where people are encouraged to throw out their $500+ GPU in favor of the on-die Intel GPU...

To be clear, I'm a gamer. The games I play would not run with Intel onboard gpus. And I'm very happy with the performance the AMD driver delivers.

> Look at this mess: https://wiki.archlinux.org/index.php/Xorg#AMD Do you want AMDGPU, AMDGPU PRO, ATI, or Catalyst? How do you choose?

You don't. Just use what your distribution picks for you. That's one step fewer than

> you install "nvidia"

> If I were buying a new GPU today, I would pick nvidia.

I guess you don't mind the problem Nvidia has with Wayland.


> I guess you don't mind the problem Nvidia has with Wayland.

From my perspective of someone who dislikes Wayland, that is a feature of Nvidia's drivers :-P.


> but if you just want to install a driver and have your system work, Nvidia is ahead.

But if you want your driver to not break horribly whenever there's an Xorg or kernel update, nvidia is behind

Or if you want your driver to just work 'out of the box' without needing to go out and install a binary driver nvidia is behind

Or if you want your driver to support newer technologies like DRI3 and Wayland, nvidia is behind


Also their OpenGL performance is abysmal. On Windows it's not the biggest deal since everything is DX, but on Linux?

...Vulkan? Proton? In terms of 3D acceleration libraries and support, it's never been better on Linux. I actually just switched to Manjaro full-time because everything I wanted to play now works on Linux, too.

Mesa drivers are fine. The one with horrid performance is their proprietary AMDGPU-PRO, but there's no reason to use it (especially since Mesa now supports the GL compatibility profile).

Mesa drivers are far from fine; they got rid of the braindead "core only" schism just last year (IIRC), and that was thanks to pressure from AMD to get games working. Even then, a lot of Mesa's codepaths are downright awful.

Nvidia has had OpenGL as a major focus for decades (it was even the first API to get new features introduced in their hardware via extensions, though nowadays they also seem to do the same with Vulkan), and their OpenGL implementation's quality shows.


Yes, Mesa is not perfect. (A particularly annoying thing is that it is possible to lock up the entire GPU with invalid memory accesses in a shader; infinite loops don't always recover cleanly and lock up the GPU too; and these bugs are reachable through WebGL, etc.) But I prefer an occasional crash in badly behaved software to the abysmal performance of their proprietary implementation (and it crashes too, usually on software using a legacy pre-GL3 context).

And as to the compatibility context, IMO OpenGL shouldn't have defined it at all. It is rather weird to have rendering code using both the legacy fixed pipeline and modern shaders. From my experience it just causes problems everywhere, and I have encountered unexpectedly bad performance and glitches on Nvidia drivers too (though maybe less often than on Intel/AMD).


This was an issue with the previous proprietary drivers, but the modern open source ones just use Mesa like everything else sane.

My experience with NV on Linux is out of date, from 2003-ish (?) when the binary driver was new, to 2010, but I remember a few drawbacks I couldn't overcome and happily made the switch to ATI/AMD:

1) Closed driver

2) Weird, incompatible multi-monitor support called TwinView, where windows would maximize to stretch across all your screens

3) Abusive on the PCI subsystem, so that my sound hardware would experience underruns, both spurious and reproducible (such as during workspace switches)

4) Lockups

5) Corruption/lockups switching between FB terminals and the X terminal

I switched to ATI and the open source driver and all these problems went away.


> which means in order to make use of that hardware you have to taint your kernel and aren't free to update it if the in-kernel ABI changes.

I have difficulty placing all the blame on Nvidia here, since it is the kernel developers' long-held disdain for stable driver ABIs that Nvidia feels forces their hand. There are those who argue otherwise, but that argument is basically "all software should be open source". Even if we go with the more reasonable "all drivers should be open source", that still introduces problems because of bonkers patent law nonsense.


Nvidia can release an open-source kernel driver. They have just figured out that inconveniencing their customers is cheaper than paying developers.

AMD doesn’t support 8K monitors under Linux. I agree, NVIDIA has far better usability for most users. The fact that it’s a binary blob means nothing to anyone who isn’t an OSS activist.

Once again we see the unfortunate trend of downvotes used to censor unpopular opinions.

> aren't free to update it if the in-kernel ABI changes.

How often is that actually an issue? Last time I ran into that was when I was still running Debian Sid and NVIDIA was far from the only thing that ended up breaking.


I'm sorry but NVidia Linux support is really bad. I administer a few dozen machines and am really looking forward to tossing the NVidia cards into the dumpster where they belong.

NVidia Linux support has always been bad, we just didn't realize it because AMD's used to be worse. Luckily this has changed over the past few years and now we can spec in AMD or perhaps Intel's rumored Xe graphics card.


> I'm sorry but NVidia Linux support is really bad. I administer a few dozen machines and am really looking forward to tossing the NVidia cards into the dumpster where they belong.

I assume you're joking, but if not please don't junk them! I'm sure lots of people on HN would gladly pay for (at least) shipping.


Yes, it was hyperbole. The "dumpster" is an area in the office where employees can grab surplus hardware and supplies -- it'll be picked clean pretty quick if video cards show up there...

>Nvidia's Linux support absolutely puts AMD to shame.

Huh? AMD "just works". NVIDIA requires a dance of (Fedora) DKMS kernel modules and binary blobs. What am I missing? I buy AMD just because it's been frictionless to get working, I don't mind if theres a small performance penalty I pay for it.


This is repeated frequently, but I was never able to get my AMD 560 to work on Linux at 4k with MST (or turn down the fan speed). With Nvidia I installed the binary blob and 4k over MST “just worked”. I tried the amdgpu and Pro versions of the drivers.

Neither company is perfect with Linux support. As with most things, there are pros and cons to both sides. But for anyone else reading here, just because you pick AMD does not mean everything is magically trouble free on Linux.


Does MST even work for 4k@60 over DP? 4k@60 requires so much bandwidth that it uses both MST channels (it is internally handled as two monitors); if you want to use both channels separately, you are stuck with 4k@30 at most.

Yes, MST absolutely can do 4k at 60hz. This is what the first Dell 32" 4k used.

My rMBP has been able to do this since February, 2014. I was trying to get my RX 560 in mid 2018 to do this and it just was not possible via any means with Ubuntu 18.04. I ended up using an Nvidia 1040 with the binary drivers without issue. The nouveau driver for the 1040 was able to do it with a newer kernel than Ubuntu 18.04 had (I was using Arch for that test).


My rMBP is not capable of doing that; not with Dell P2715Q. It works without MST, or with lower resolutions.

Which rMBP do you have? I have the late 2013 with the dedicated Nvidia card.

13" early 2015, Intel Iris 6100.

To get two 4k@60 outputs, I have to run two cables (it has two TB2 ports). With daisy-chaining, I got mirroring at best (I didn't check refresh rate at the time, though).


I've never tried MST on AMD so I cannot comment. But of course it goes without saying that by picking NVIDIA it won't be magical and trouble free for everyone.

AMD's Linux support is terrific now. Nvidia is much more of a hassle and is also impeding progress by insisting on their own window-system/driver interface for Wayland.

> AMD's Linux support ist terrific now.

I think you have a typo, which unfortunately is half way between "is" and "isn't". Or your sentence is partially German.


Fixed. I'm German and this typo happens occasionally.

> Nvidia's Linux support absolutely puts AMD to shame

Is this meant seriously? Is it not the exact opposite?


I'm pretty sure neither manufacturer is really that worried about their Linux support. And, as far as my experience has gone with them both on Linux, they have both had their good years and their bad.

They are absolutely worried about Linux support. All high end computer graphics in large companies are done using Linux.

This is exactly true, especially on the CUDA and deep learning front. Basically, the only option is NVIDIA, as the competition has an almost nonexistent software framework (maybe OpenCL).

CUDA is a vendor lock-in scheme. Use OpenCL or Vulkan instead (yes, Vulkan includes support for compute, not just graphics!). AMD supports both, in addition to tools like HIP to help you port legacy CUDA code.
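
(To make the porting claim concrete, here is a minimal HIP sketch of a vector add. HIP's runtime API deliberately mirrors CUDA's, with hipMalloc/hipMemcpy/hipLaunchKernelGGL standing in for their cuda* counterparts, and the hipify tools do most of that renaming mechanically. This illustrates the shape of the API, not any particular codebase:)

```cpp
// hip_vecadd.cpp - minimal HIP vector add, showing how closely the HIP
// runtime API mirrors CUDA's. Build with hipcc. Error checking omitted;
// this is a sketch, not production code.
#include <hip/hip_runtime.h>
#include <cstdio>
#include <vector>

__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // same indexing as CUDA
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n);

    float *da, *db, *dc;
    hipMalloc(reinterpret_cast<void**>(&da), n * sizeof(float));
    hipMalloc(reinterpret_cast<void**>(&db), n * sizeof(float));
    hipMalloc(reinterpret_cast<void**>(&dc), n * sizeof(float));
    hipMemcpy(da, a.data(), n * sizeof(float), hipMemcpyHostToDevice);
    hipMemcpy(db, b.data(), n * sizeof(float), hipMemcpyHostToDevice);

    // hipLaunchKernelGGL is the portable spelling of CUDA's <<<grid, block>>>.
    hipLaunchKernelGGL(vecAdd, dim3((n + 255) / 256), dim3(256), 0, 0,
                       da, db, dc, n);

    hipMemcpy(c.data(), dc, n * sizeof(float), hipMemcpyDeviceToHost);
    std::printf("c[0] = %f (expect 3.0)\n", c[0]);

    hipFree(da); hipFree(db); hipFree(dc);
    return 0;
}
```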

CUDA is the result of being the first mover in an emerging field over a decade ago. It's a lock-in scheme the way Excel is a lock-in scheme for Microsoft: a product that provides value, which is used to generate revenue.

Using Vulkan(?) or OpenCL for deep learning is fine if your goal is helping get out the word that it’s at least possible to do so.

It's not a great idea if you simply want to get your work done with a minimal number of issues and with up-to-date support for all the prominent packages.

Both are valid objectives.

HIP won’t help you one bit if you use one of the many CUDA libraries.


"The prominent packages" for deep learning are all open source. If they don't support OpenCL or Vulkan yet, you can work on it and contribute the feature upstream. It's a matter of getting the ball rolling to start with, nothing more than that.

cuDNN isn’t open source.

> If they don't support OpenCL or Vulkan yet, you can work on it and contribute the feature upstream. It's a matter of getting the ball rolling to start with, nothing more than that.

And that's exactly what I mean by the choice that you have to make.

You could do that, and win brownie points with some crowd, but let’s not pretend that doing so wouldn’t have an impact on your work as an AI researcher.

"Nothing more than that" is more than a little bit disingenuous when you take into account the actual work that's involved, don't you think?

It’s something a company like AMD could (and does?) sponsor by throwing paid employees at it. Or big companies like Alibaba who buy enough HW to trade off HW cost vs R&D spending.

But if your goal is to be a deep learning specialist (it’s where the money is today), spending time on these kind of infrastructure issues is a complete waste.


>This is a really good example why competition matters for the consumers.

Imagine how much quicker fair pricing would have arrived if there were standards and regulations, like those that go into every other aspect of its design.


I think this is a poor example for the pro-regulation argument; it's a more or less classical market where competition drives innovation, but the cost of entry is very high.

The only applicable regulation I can think of might be "dual sourcing".


I can't quite interpret what you are saying; so I'll say what I hope you are saying:

If the x86 and x86_64 instruction sets became an open standard so that anyone could implement them; that would be great for consumers. If Intel is in a position to halve prices in the face of competition there are obviously not enough players in the market.


At that price point I'd rather spend a little more on AMD and get 24 cores. Because let's be honest: if you've got the money to build a REALLY high-end PC, why are you penny-pinching on the CPU? $200-300 isn't a big leap when the motherboard costs more than that difference.

Gotta hand it to this place, it's one of the few places I visit where 200-300 dollars is considered penny pinching. The other is the country club.

Given the amount of time people on HN spend using their computers, it isn't that surprising. Most people probably spend much less time commuting, but that amount of money is considered reasonable for vehicle maintenance expenses that probably recur more often than people buy a new computer.

I put my last computer together about 5+ years ago (i7-4790K, 32GB RAM) and did a mid-cycle upgrade a couple of years ago (GTX 1080, 1TB NVMe, large 4K display).

The new computer is an X570 motherboard, 64GB, RX 5700 XT, using an R5 3600 as a placeholder while I wait on a 3950X; I only pulled the trigger early because my last system was acting up. I don't have the specific costs, but I have spent around $1500 so far and will probably lose ~$50 on the 3600 when I upgrade and sell it off. It will probably be 4-5 years until I upgrade again, except I may drop in a second and/or third NVMe drive or bump the RAM to 128GB.

The difficulties I've had this time around have been around aesthetics: since I didn't have any experience with aRGB, it took a while to figure out what to get, and even then none of the aRGB control software supports Linux. The 5700 just got swapped in, and I've had trouble with it and will have to sort it out over the weekend; I may wind up on Windows if it's too much of a pain.

I've mostly been working on my work-issued laptop, where I've had separate issues with the battery/charging etc the past month. In the end though, spending a couple grand on a computer every 3-5 years isn't so bad if you're actually doing work on it. I will say, I am surprised how much faster this system actually feels over my old one. Hopefully I can get it stabilized with the new video card soon.


When you have $1000 to drop on a CPU I'm quite sure you can also afford a country club membership ;-).

Can you though? I played piano gigs at a country club where the starting family membership cost was $100k per year, which is the lowest tier of membership, the plebeian level.

Computers at this level are durable goods and ostensibly help you take an income, so it's not too strange.

Only if you need this much computing power. The typical webdev can probably get away with a quad core AMD/Intel cpu for <$200. Very few people actually need 12+ cores.

Haven't you visited a website outside HN the last decade? I doubt most webdevs use anything less than an 8core and 64GB of RAM these days :-P

Considering I'm using Docker to run databases and various services in order to do "webdev", I wouldn't underestimate how useful the extra resources are. I recently upgraded and still feel the strain in some instances.

(r5-3600, waiting for 3950x to drop, upgraded sooner because old system was acting up).


As a web dev running multiple applications at the same time, with memory hogs like Chrome and Slack, on a dual-core work machine, having more cores does help.

It has come to the point that I will have to ask my boss to replace my machine, since it's so sluggish.


I know companies who are handing out new MacBook Pros and Airs with 8GB of memory. Now that's the falsest economy I've ever seen.

very few really need more than 1 core...

JS devs usually need quite a few threads/processes. The editor/IDE is 1 or more processes. Compiling is a separate, multi-threaded process. The webserver is another process. Tests are several more processes (which matters because 30+ minutes on a fast desktop isn't unheard of when running a complete test suite). Parsing the newest compile is multi-threaded in the browser. A separate core for the main website thread is especially important because opening the dev tools de-optimizes all the things resulting in significantly slower performance (a big thing with web apps). If service workers or webworkers are used, then there are even more processes involved.

Going from my dual-core Pixelbook to my Ryzen desktop is a huge change in the development experience.


It could be argued that some "developers" should be restricted to pencil and paper, but it's a different issue. Even hipsters deserve to multitask at their computer.

you can multitask on a 1-core CPU... because the CPU is much faster than you at switching tasks... that's how it used to be.



Computers are quite durable these days. You can easily run a useful, general-purpose OS on a computer from 10 years ago and perhaps more, even one with baseline specs. If that's not enough to be considered a 'durable good', I don't know what is.

Yep.

They were quite durable 20 years ago too. It's just that back then clock speed and performance were increasing very rapidly, and that had a huge effect on the usability of your machine (especially if you were a gamer). You would likely upgrade and retire a working machine for the sake of a drastic performance boost every 3-ish years.

Nowadays you could happily be using a 5+ year old i5 for full time development and moderate gaming with no issues at all. I'm personally at 5+ years now and I don't see myself upgrading for quite some time because everything I do still feels very snappy.

But compare that to something like a Pentium I at 75 MHz (1994) vs. a Pentium III at 1 GHz (1999), and the performance difference was crazy. Over 10x the clock rate, but it's not just a "paper" boost. They are so different in performance that you can't even compare them. They are in leagues of their own.


Tell that to my 9 year old iMac with the dead GPU that I can't replace, won't start up, and is going straight into the trash. Never again, Apple.

Well, he did say "computers," not "appliances." ;-)

Hey! Don't impugn the serviceability of appliances. When my toaster stopped staying down and heating on one side, I simply opened it up and crimped the broken connection.

Yes, my iMac has been flaking out for a while and only does Netflix now. Terrible design, especially for such an expensive machine.

If you can, take it back to Apple - they will recycle it.

I built a PC seven years ago (i7-3770K). It's lasted through multiple apartments and multiple jobs. I do all my dev work and gaming on it, and it's even able to run Slack. How much longer does something need to last before it's considered durable?

I have a food processor that my parents bought nearly fifty years ago. They had to replace the motor once, and then I replaced the wiring on the motor coil four years ago, but it's still ticking. Stainless steel, no plastic components.

(The buttons used to be plastic. Now they're wood.)

That's durable. It also does a much better job than any modern food processor, possibly because the motor is four times the size of what you'd get now.


The rate at which technology advances doesn't change whether a good is considered a durable good or not. That term has a specific meaning in economics.

There are 50-year-old cars still in service today. I wouldn't drive one, because I want to take advantage of the tech advancements of the past 50 years. But that doesn't mean cars aren't durable goods.

And even though computing is barely 50 years old, there are 50-year-old computers in service today. I wouldn't use one, because I want to take advantage of the tech advancements of the past 50 years. But that doesn't mean computers aren't durable goods.

This applies even if a 50-year-old car, which predates bluetooth, doesn't have built-in bluetooth. And it applies even if a 50-year-old computer, which predates HTTP, can't load websites.


I wouldn't consider that durable, just repairable. After all, that's like saying an old classic car is durable because it exists and only had its motor replaced once.


I'm still running an AMD CPU I bought around 2019 (I think?) in a tower that is 10 years older than that. The PSU is a year old because I had to replace it. The video card is 2 years old. All that to say, I run my hardware for a long time.

If I was building new, I'd go AMD, too.

As things stand, I already have a 8-core X299 system, and the fact that I can drop an 18-core CPU into it after a BIOS update is compelling. But it seems unlikely that there is a large enough audience in this situation to juice Intel's profits or prevent them from losing more market share to AMD.


Thanks, but I'd still rather buy AMD since they actually offer competition. I fear that if I buy Intel now, I'm basically saying to them that they can pull whatever shenanigans they like and they're still untouchable.

This incompatible socket thing is driving me crazy. I "just" bought a new motherboard and now I can't use any current products.

I've never even considered that a problem: my time between upgrades is so long that the last motherboard had DDR3, and the one before that DDR2. If you're a once-a-year upgrader I could see it, but given the slow progression of CPU performance over the last half decade (AMD's rapid jump to parity notwithstanding), I can't see a real reason to upgrade that often.

With a motherboard bought with a Ryzen 1200 quad-core you could upgrade to a Ryzen 3900 12-core. In the same time frame you would have had two or three Intel sockets and chipsets.

And that's why I described my use case - as each chipset pretty much maps to an Intel generation, two or three different Intel generations doesn't provide a compelling reason to upgrade. Again, Ryzen is different, but that's largely because they've been fighting to gain parity with Intel.

It feels arbitrary.

You're missing out on a host of vulnerabilities.

At the risk of missing sarcasm: a large reason all these vulnerabilities in Intel are found is their market share. Same as when OS X was considered "unhackable": it just wasn't worth the effort. AMD is likely just as bad, but it's not uncovered (yet).

Supporting the underdog is a good idea nonetheless, but not for this reason per se.


Or it is the other way around. Intel cut corners to be able to make the fastest processors, and that is how they got their market share. They have been called out on it in the last two years, and updates have made their processors slower. Also, AMD improved massively with Zen.

More like this: Intel has cut a lot of corners for a long time in their x86 processors, mostly to get better performance, which ended up with all these vulnerabilities we keep hearing about every couple of months. They're just now suffering the consequences of all the shady things they've done to try and establish a monopoly in the CPU market.

A more generous interpretation: Intel made decisions that favored actual, measurable performance today at the expense of theoretically known vulnerabilities that might be exploited in some hypothetical future. They gave the market what the market demanded at a cost they tolerated.

And even now, after said theoretical vulnerabilities have been reified, there is very little cause to be concerned about the vulnerabilities under discussion unless you host code for other people as a business model (or use such a service). Otherwise your biggest concern is a web browser that already has a whole host of actual and theoretical vulnerabilities of its own.


> They gave the market what the market demanded at a cost they tolerated.

This suggests a level of informed consent that I don't think existed. It implies that "the market" (who?) knew of, understood, and agreed to the risks.

And anyway, "the market" does a poor job of representing some of its stakeholders, notably the disorganized group known as users, and immediate competitive advantage may be the only metric driving decision-making.


Intel made decisions that favored the actual indicators people were looking at with a hidden cost on things people weren't aware of. They got the market exactly what the market demanded while deceiving that market by applying known bad practices. There is a huge difference in impact from saving money by using lead-based paint, but it's the same kind of decision.

> unless you host code for other people as a business model

Or are a victim of one of those JavaScript-based exploits when visiting a random site.


Any security researcher or engineer worth their salary is already planning for those theoretical vulnerabilities. Ideas like these are hashed out at the development meeting. Not fixed after the cart has left the barn.

And yet people use Linux instead of OpenBSD, because it turns out security isn't always the most important consideration.

Maybe, but maybe not. The fact that AMD has implemented a similar instruction set but is not vulnerable to many of the existing speculative attacks that plague Intel suggests that they could just have made more secure design decisions, whether it was intentional or not.

You're comparing apples to oranges. Operating system architectures can vary a lot. CPU architecture between Intel and AMD can't vary much since they both target x86. Every vulnerability found with Intel CPUs can and has been tested for against AMD CPUs which is something you simply cannot do with operating systems, and AMD CPUs have been found not to have the same vulnerabilities.

> CPU architecture between Intel and AMD can't vary much since they both target x86.

Completely and utterly false. The exploits being found exist at a point much lower than the x86 instruction set. ARM processors were also hit with speculative execution exploits.


You're focusing on something that doesn't really matter in the context of what he was saying, and ignoring the bigger truth which is that security researchers DID look at AMD processors and they DID find that they do not suffer from the vulnerabilities that plague Intel.

All the vulnerabilities. Speculative execution vulnerabilities were found in AMD, just not the meltdown vulnerability.

And his point still stands. While Intel has/had Meltdown, which was pretty bad, that doesn't mean that a Meltdown-like bug doesn't exist in AMD's hardware.

It's like finding a bug in the OpenJDK that isn't in Zulu and then declaring that "Zulu is more secure than the OpenJDK!"

Or an even closer example "Intel doesn't have the TLB bug[0]! Intel makes better CPUs!"

AMD isn't guaranteed free of all exploits. It is only guaranteed to not suffer specifically from meltdown.

[0] https://www.anandtech.com/show/2477/2


> All the vulnerabilities. Speculative execution vulnerabilities were found in AMD, just not the meltdown vulnerability.

Just Not Meltdown. Or SPOILER. Or Fallout. Or RIDL. Or ZombieLoad.

You're really underselling the difference between the two. AMD's processors have only shown vulnerability to the issues that are inherent to the nature of speculative execution (Spectre).

Intel, on the other hand, has suffered from no less than 5 or 6 separate disclosures of vulnerabilities in various places in their microarchitecture where they cut corners on process separation in order to gain speed. None of these exploits have been pulled off against AMD, despite many of the papers' authors explicitly trying to.
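
(Incidentally, on Linux you can see which of these the kernel thinks your CPU is affected by, and which mitigations are active, under /sys/devices/system/cpu/vulnerabilities/. A minimal sketch that just dumps those entries; which files exist depends on your kernel version:)

```cpp
// vuln_report.cpp - print the kernel's reported status for each known CPU
// vulnerability (Meltdown, Spectre variants, MDS, etc.). Requires C++17.
#include <filesystem>
#include <fstream>
#include <iostream>
#include <string>

int main() {
    const std::filesystem::path dir = "/sys/devices/system/cpu/vulnerabilities";
    if (!std::filesystem::exists(dir)) {
        std::cerr << dir << " not present on this kernel\n";
        return 1;
    }
    for (const auto& entry : std::filesystem::directory_iterator(dir)) {
        std::ifstream f(entry.path());
        std::string status;
        std::getline(f, status);  // e.g. "Mitigation: PTI" or "Not affected"
        std::cout << entry.path().filename().string() << ": " << status << '\n';
    }
    return 0;
}
```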


You're doing exactly what you wrongly accuse me of...

The original rebuttal was that AMD gets fewer researcher eyeballs because there are more Intel devices in the wild, so it's expected that more vulnerabilities are found in Intel.

This was fully correct, yet the GP and you apparently are fixated on the idea that AMD didn't have certain vulnerabilities that were originally found _by studying Intel CPUs_. Yet, no big surprise, most don't apply to AMD in that case... because they were found by studying Intel CPUs first.


>CPU architecture between Intel and AMD can't vary much since they both target x86

Yes they can. Instruction sets are just that, a bunch of op codes and "things" for a processor to do. How a processor actually performs them is open to implementation.


Right, it is like declaring that "Java bytecode is fixed, all JVMs are the same!"

Implementation matters a lot. Heck, it is WHY there are performance differences between AMD and Intel CPUs.


Intel's offering still looks bad compared to AMD.

Before the price cuts it looked pathetic, so I guess it's an improvement.


Why is it with every AMD sales-related submission I see, the level of discourse in the comments is barely one step above name calling?

My pet theory is that there's just a ton of casual investors who have also become emotionally invested in AMD...


They called Intel's old price "pathetic." That's hardly a low level of discourse, and not really tantamount to name calling.

It isn't a particularly insightful comment, I'll acknowledge that, but this response seems like it lowers the quality of the thread more than the comment it is replying to (and arguably breaks the site's rules by suggesting the other poster has ulterior motives for posting what they posted).

> Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith.

And:

> Please don't make insinuations about astroturfing. It degrades discussion and is usually mistaken. If you're worried, email us and we'll look at the data.


> Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith.

Yet you assume my comment is referring to one comment and unfairly attacking it?

This entire page is uncharacteristically rife with misinformed comments and people oddly invested in shutting down anyone with anything positive to say about Intel.

And:

Please don't put words in my mouth.

I didn't accuse AMD of astroturfing. If anything, I literally implied the opposite: an organic reason why there's a movement to emphasize "AMD good, Intel bad".

My point is it's to the detriment of having real discussions about either when they come up on HN.


Or it could be, as people have said many times before and the actual subject of this article confirms, that people were sick of Intel using its near monopoly in the past to price-gouge its customers and slow-walk tech improvements.

Hmm, they're definitely worried about Threadripper 3.

2019: the year consumers win.


Still bleeding profusely on the server side of things. AMD still has something like a 4x advantage on performance, and considerable energy savings to boot.

Much more significant, IMO, are the rumours of heavy discounting on Intel server chips and the wide variety of AMD laptops exhibited at Computex 2019. AMD doesn't have significant market share in servers or laptops yet, but those are signs that it may be coming...

How much of Intel’s demise is due to their layoffs from 4 years ago? Did they cut too deep and mess up the secret sauce?

I'm no Intel fanboy, but I think it is too early to speak of Intel's demise. They are still making a lot of money. I mean, AMD only has single digit penetration into the datacentre market, for example, and Intel is still compelling on mobile.

And I suspect the real answer is they set their targets too ambitiously with their new node. Their 10nm was simply too big a jump with too many unproven technologies, with too many unexpected obstacles.

But I wouldn't count them out of the game just yet. They are an aggressively competitive company. They'll be back.


Considering purchasing cycles are over the course of years that's not surprising. My work issued laptop is just over a year old, and I wouldn't expect a new one for another 3-4 years. Not to mention purchasing contracts that are negotiated once or twice a decade that may specify Intel.

AMD is expected to double their penetration in the server space by the end of the coming year, which is huge. I think they'll also make inroads into the laptop space. Being the preferred APU vendor for next-gen consoles won't hurt their sales through developer channels either.

I would be surprised if they haven't captured about 1/3 of the overall x86 market by the end of 2021, which is where Intel will probably be where they want to be technology wise, until then it's AMDs game.


Just look at what happened to amd around 2010. Intel can punch back. Hard.

Intel could punch hard around 2010. We'll see what they'll be able to do now.

Improving the price (not the quality) of their high-end product range from vastly inferior to slightly inferior is, at best, an attempt to remain relevant; new and better products are needed to recover actual market share, and their timeline remains doubtful.


They weren't bleeding on the portable side of things in 2010. Apple, Microsoft, and Google (TPU) all have some sort of custom chips now which aren't x86.

The CPU bugs go back much further, probably to the late '00s as the beginning of the end. Short-sighted speed cheating got them into this mess.

Now it's an interesting battle. I want to build a new workstation, probably in 2020. I was going to go the Threadripper route, but now I have a choice, and that's awesome.

It's only a choice if you're willing to settle for:

- a long list of security vulnerabilities which, when fully patched and with recommended features (including Hyper-Threading) disabled, results in a huge loss in performance (~25% in some reports)

- lower power efficiency due to still using 14nm process

- platform uncertainty, i.e. will upgrading mean buying a new motherboard?


AFAIK Intel patches many of those security issues in new CPUs, so the performance loss won't be as severe. I don't care about power efficiency, as electricity is cheap enough for me, and I'm buying a computer for many years, so I'm unlikely to benefit from upgrading the CPU within 7 years anyway.

Basically for me it's about:

* Security issues. They are real for Intel and performance loss is real.

* Inter-core communication latency. I don't particularly like AMD's chiplet design, where some cores take a long time to talk to other cores. Intel seems like a safer play here, but I have not researched their HEDT CPUs; maybe it's the same for Intel.

* Balance of frequency/cores. While more cores is better, I think that after 12 cores I don't really need more. And single-thread performance is always important.

* Other issues. I don't like that AMD had multiple issues with its Zen CPUs, especially on Linux: crashes, the RNG bug, overall Linux compatibility. I feel like Intel takes Linux compatibility much more seriously, and Intel overall seems like a safer bet for a stable system.


I think a lot of the issues you've brought up regarding AMD CPUs are no longer relevant with Zen 3.

Inter-core communication latency has been much improved with the larger caches and architecture improvements in Zen 3. Single-threaded performance has always been within a stone's throw of Intel CPUs with Zen and Zen 2, and now the gap isn't even worth talking about with Zen 3, since you get more cores for less money.

I'm not sure about the Linux story, but this is the first I've heard of issues with Zen on Linux.


AMD's entire chiplet architecture is aimed squarely at cheaper server chips. All of their biggest wins have been with cloud providers like Google and Amazon. If their target was the consumer market, they would have launched 7nm mobile chips first. Given that the overwhelming majority of servers run Linux, I doubt they neglected to focus on that OS.

Zen 1 didn't have an especially large list of issues for being a ground-up new architecture. Launching to enthusiast desktops so you can revise things for the server market seems like a great strategy for them overall.


TR4 will likely support PCIe 4.0. It seems to make sense to me to pick Threadripper because of this. Ignoring costs, the future-proofing of having tomorrow's I/O interconnect is probably worth a few bucks for a work machine.

I can't imagine a scenario where PCIe 3.0 is not enough for me. Even for storage it only matters for linear I/O, and I don't care about that much. Random I/O won't saturate the PCIe interface; it's not fast enough on any disk.
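
(Rough numbers to back that up, using assumed, illustrative figures: roughly 985 MB/s of payload per PCIe 3.0 lane, and a hypothetical drive doing 500k random 4 KiB reads per second:)

```cpp
// pcie_headroom.cpp - back-of-the-envelope check of how far 4 KiB random I/O
// gets toward a PCIe 3.0 x4 link. All figures are illustrative assumptions,
// not measurements of any particular drive.
#include <cstdio>

int main() {
    const double lane_payload_mb_s = 985.0;          // approx. PCIe 3.0 payload per lane
    const double link_mb_s = 4 * lane_payload_mb_s;  // typical x4 NVMe link

    const double iops = 500'000.0;                   // hypothetical random-read IOPS
    const double block_kib = 4.0;                    // 4 KiB blocks
    const double random_mb_s = iops * block_kib / 1024.0;

    std::printf("PCIe 3.0 x4 ceiling : ~%.0f MB/s\n", link_mb_s);
    std::printf("random 4 KiB reads  : ~%.0f MB/s (%.0f%% of the link)\n",
                random_mb_s, 100.0 * random_mb_s / link_mb_s);
    return 0;
}
```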

It might not be useful now but in terms of future proofing a system it might become more of a deciding factor. The ability to make use of drives, graphics cards, network interfaces, or anything that can connect via PCIe for a few years to come might be beneficial.

PCIe 4 is only really worth it for enhanced storage drives... and they do come with some heat and related performance penalties if the case airflow is insufficient.

Depending on your workloads, it may well be worth it for you though.


The question is how much of this is pure marketing, given Intel's supply problems, and how much of it is a real product. Will Intel actually be able to meet demand for these parts in the short term? It's an easy marketing win if you promise half-price parts but customers en masse can't actually buy them.

Given how hard it is to actually buy an R9 3900X anywhere close to list price, I'm not sure it really matters either way at the moment.

The Ryzen 9 3900X (12C/24T) and 3950X (16C/32T), both at 105W, will be Intel HEDT's coffin.

I think the next Threadripper release will be what really kicks Intel's HEDT line. The new pricing structure seems competitive against the R9 series. I've already bought AMD and don't plan on switching for 4-5 years, though (a 3600 while waiting for the 3950X to drop).

Is there a website that maps popular intel chips / chipsets / builds to AMD competitors?

Also, what's the deal with sockets? What sets of intel chips are mutually compatible and incompatible with others, and how does AMD compare?


The deal with the sockets is that the best CPUs stop being compatible with your motherboard only once every 4 or 5 years, and you'll only be denied an upgrade in a decade. In a quick search I got a compatibility enumeration:

http://www.buildcomputers.net/amd-cpu-socket.html


Yay, competition. Gotta love it.

Yes, in the midst of all the complexities of the various monopolies, oligopolies, and inefficiencies in the modern marketplace, it's refreshing to see the system work as promised. Just like in the Athlon days, even people who would never in a million years use AMD's products, or even know of their existence, benefit from them.

I am getting served ads that are redirecting my browser 10 times and showing something like those scammy Amazon gift card ads. On mobile Safari.

Yeah it has gotten worse. You should run an adblocker on your iOS device.

To add, I recommend Adguard, 1Blocker X, Wipr, or Firefox Focus. Out of those, I believe only 1Blocker X has a cost, although AdGuard has a subscription for DNS privacy and security/custom filters.

I personally use 1Blocker X.


Thanks. Looks like I lost my ad blocker when I got a new phone. Ad Block Plus was what I used.

Is there a Brave release for iOS? I've been pretty happy with it on Android.

Brave browser is on iOS, but I don’t believe its adblocker extends to Safari.

Not being able to set a default browser on iOS tends to make you use Safari more than any other browser you have installed, since tapping a URL in another app will open Safari unless that app supports setting a default browser, which not many do, and even fewer support opening Brave. It has to be implemented on a per-app basis since they're really just using the x-callback-url scheme; most tend to only support Safari, Chrome, or Firefox.


Was just curious... I don't use/own any iDevices, so no personal experience in that space. For Android, it's worked out pretty well for me.

Intel's CPU TDP is 165W, and we have global warming. AMD's TDPs are 65W and 105W.

We have gone from 200W power supplies to 1kW power supplies in PCs.



