
Intel slashes prices in face of AMD competition - taspeotis
https://www.anandtech.com/show/14925/intel-cascade-lakex-for-hedt-18-cores-for-under-1000
======
s5ma6n
Finally. They were basically asking double the price for some CPUs until there
was strong competition. This is a really good example of why competition
matters for consumers. I hope a similar situation happens for GPUs too. Right
now NVIDIA asks for way more than the cards are worth.

~~~
Jonnax
I guess so but for a lot of buyers they're not even considering AMD. They're
just using it as a stick to beat Intel with for a better deal.

~~~
andrewstuart
The real problem is that AMD doesn't invest enough in the software.

Nvidia's Linux support absolutely puts AMD to shame.

The biggest lesson in hardware is that "it's the software support, stupid".
Too many hardware manufacturers think they are selling hardware alone. They
are selling hardware AND the software that makes it work.

A computer is useless without software support. Witness the vast array of
essentially useless ARM devices out there because none of them have usable
software drivers/Linux support.

~~~
zozbot234
Nvidia's Linux support? AMD actually has open source drivers these days, just
like Intel. Nvidia just gives you a binary module, which means that in order to
make use of the hardware you have to taint your kernel, and you aren't free to
update it if the in-kernel ABI changes. It's basically on par with the vendors
of all those "useless ARM devices out there", none of them with "usable Linux
support".
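
To make the taint point concrete, here is a minimal sketch (assuming a typical
Linux system; the helper name is mine): the kernel exposes its taint state as a
bitmask, and bit 0 is set once a proprietary module like nvidia.ko loads.

```shell
# Decode the proprietary-module bit (bit 0) of a kernel taint bitmask.
# On a live system the current mask comes from /proc/sys/kernel/tainted:
#   proprietary_taint "$(cat /proc/sys/kernel/tainted)"
proprietary_taint() {
    if [ $(( $1 & 1 )) -ne 0 ]; then
        echo tainted
    else
        echo clean
    fi
}

proprietary_taint 0     # prints "clean"
proprietary_taint 513   # prints "tainted" (bits 0 and 9 set)
```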

~~~
coldpie
AMD is finally approaching parity in the past year or two, but Nvidia is still
ahead in terms of day-to-day usability. It sucks that they're closed, but if
you just want to install a driver and have your system work, Nvidia is ahead.

Look at this mess:
[https://wiki.archlinux.org/index.php/Xorg#AMD](https://wiki.archlinux.org/index.php/Xorg#AMD)
Do you want AMDGPU, AMDGPU PRO, ATI, or Catalyst? How do you choose? Let's
assume you want open source and recent, so probably AMDGPU. OK that's
relatively straightforward, but now let's say you want Vulkan support. That's
over here:
[https://wiki.archlinux.org/index.php/Vulkan#Installation](https://wiki.archlinux.org/index.php/Vulkan#Installation)
and golly wouldn't you know it there are three choices, vulkan-radeon, amdvlk,
and vulkan-amdgpu-pro. How do those relate to the radeon and amdgpu drivers
mentioned earlier? Arrrrrrrrrrgh.

Meanwhile on nvidia, you install "nvidia" and you're done.
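
For comparison, a sketch of what the two install paths boil down to on Arch
(package names as found in the Arch repos; not a complete setup guide):

```shell
# AMD, open source stack: the amdgpu kernel driver ships with the kernel;
# userspace OpenGL comes from mesa, Vulkan from vulkan-radeon (RADV).
pacman -S mesa vulkan-radeon

# Nvidia, proprietary stack: one package pulls in the kernel module and
# the matching userspace GL/Vulkan libraries.
pacman -S nvidia
```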

I've been working in gaming on Linux professionally for ten years. If I were
buying a new GPU today, I would pick nvidia.

~~~
selectodude
Also their OpenGL performance is abysmal. On Windows it's not the biggest deal
since everything is DX, but on Linux?

~~~
garaetjjte
Mesa drivers are fine. The one with horrid performance is their proprietary
AMDGPU-PRO, but there's no reason to use it (especially since Mesa now supports
the GL compatibility profile).

~~~
Crinus
Mesa drivers are far from fine; they only got rid of the "core only" braindead
schism last year (IIRC), and that was thanks to pressure from AMD to get games
working. Even then a lot of Mesa's codepaths are downright awful.

Nvidia has had OpenGL as a major focus for decades (it was even the first API
to expose new hardware features via extensions - though nowadays they seem to
do the same with Vulkan) and their OpenGL implementation's quality shows.

~~~
garaetjjte
Yes, Mesa is not perfect. (A particularly annoying thing is that it's possible
to lock up the entire GPU with invalid memory accesses in a shader, infinite
loops don't always recover cleanly and can lock up the GPU too, and these bugs
are reachable through WebGL, etc.) But I'd rather have an occasional crash in
badly behaved software than the abysmal performance of their proprietary
implementation (which crashes too, usually on software using a legacy pre-GL3
context).

As for the compatibility context, IMO OpenGL shouldn't have defined it at all.
It is rather weird to have rendering code using both the legacy fixed pipeline
and modern shaders. In my experience it just causes problems everywhere, and
I've encountered unexpectedly bad performance and glitches on Nvidia drivers
too (though maybe less often than on Intel/AMD).

------
MisterTea
At that price point I'd rather spend a little more on AMD and get 24 cores.
Because let's be honest, if you've got the money to build a _REALLY_ high end
PC, why are you penny pinching on the CPU? $200-300 isn't a big leap when the
motherboard alone costs more than that difference.

~~~
tyri_kai_psomi
Gotta hand it to this place, it's one of the few places I visit where 200-300
dollars is considered penny pinching. The other is the country club.

~~~
snagglegaggle
Computers at this level are durable goods and ostensibly help you earn an
income, so it's not too strange.

~~~
gruez
Only if you need this much computing power. The typical webdev can probably
get away with a quad core AMD/Intel cpu for <$200. Very few people actually
need 12+ cores.

~~~
OrgNet
very few really need more than 1 core...

~~~
HelloNurse
It could be argued that some "developers" should be restricted to pencil and
paper, but it's a different issue. Even hipsters deserve to multitask at their
computer.

~~~
OrgNet
you can multitask on a 1-core CPU... because the CPU is much faster than you
at switching tasks... that's how it used to be.

------
llampx
Thanks, but I'd still rather buy AMD since they actually offer competition. I
fear that if I buy Intel now, I'm basically saying to them that they can pull
whatever shenanigans they like and they're still untouchable.

~~~
sixothree
This incompatible socket thing is driving me crazy. I "just" bought a new
motherboard and now I can't use any current products.

~~~
SketchySeaBeast
I've never even considered that a problem - my time between upgrades is so
long that my last motherboard had DDR3, and the one before that DDR2. If
you're a once-a-year upgrader I could see it, but given the slow progression
of CPU performance over the last half decade (AMD's rapid jump to parity
notwithstanding), I can't see a real reason to upgrade that often.

~~~
llampx
With a motherboard bought with a Ryzen 1200 quad-core you could upgrade to a
Ryzen 3900 12-core. In the same time frame you would have had two or three
Intel sockets and chipsets.

~~~
SketchySeaBeast
And that's why I described my use case - as each chipset pretty much maps to
an Intel generation, two or three different Intel generations doesn't provide
a compelling reason to upgrade. Again, Ryzen is different, but that's largely
because they've been fighting to gain parity with Intel.

------
PedroBatista
Intel's offering still looks bad compared to AMD.

Before the price cuts it looked pathetic, so I guess it's an improvement.

~~~
BoorishBears
Why is it with every AMD sales-related submission I see, the level of
discourse in the comments is barely one step above name calling?

My pet theory is that there's just a ton of casual investors who have also
become emotionally invested in AMD...

~~~
Someone1234
They called Intel's old price "pathetic." That's hardly a low level of
discourse, and not really tantamount to name calling.

It isn't a particularly insightful comment, I'll acknowledge that, but this
response seems like it lowers the quality of the thread more than the comment
it is replying to (and arguably breaks the site's rules by suggesting the other
poster has ulterior motives for posting what they posted).

> Please respond to the strongest plausible interpretation of what someone
> says, not a weaker one that's easier to criticize. Assume good faith.

And:

> Please don't make insinuations about astroturfing. It degrades discussion
> and is usually mistaken. If you're worried, email us and we'll look at the
> data.

~~~
BoorishBears
> Please respond to the strongest plausible interpretation of what someone
> says, not a weaker one that's easier to criticize. Assume good faith.

Yet you assume my comment is referring to one comment and unfairly attacking
it?

This entire page is uncharacteristically rife with misinformed comments and
people oddly invested in shutting down anyone with anything positive to say
about Intel.

And:

Please don't put words in my mouth.

I didn't accuse AMD of astroturfing. If anything I literally implied the
opposite, an organic reason why there's a movement to emphasize "AMD good,
Intel bad".

My point is it's to the detriment of having real discussions about either when
they come up on HN.

------
philliphaydon
Hmm, they're definitely worried about Threadripper 3.

2019 - year consumers win.

------
Beltiras
Intel is still bleeding profusely on the server side of things. AMD still has
up to a 4x advantage on performance, with considerable energy savings to boot.

------
bryanlarsen
Much more significant IMO are the rumours of heavy discounting on Intel server
chips and the wide variety of AMD laptops exhibited at Computex 2019. AMD
doesn't have significant market share in servers or laptops yet, but those are
signs that it may be coming...

------
jmpman
How much of Intel’s demise is due to their layoffs from 4 years ago? Did they
cut too deep and mess up the secret sauce?

~~~
wbhart
I'm no Intel fanboy, but I think it is too early to speak of Intel's demise.
They are still making a lot of money. I mean, AMD only has single digit
penetration into the datacentre market, for example, and Intel is still
compelling on mobile.

And I suspect the real answer is that they set their targets too ambitiously
with their new node. Their 10nm was simply too big a jump, with too many
unproven technologies and too many unexpected obstacles.

But I wouldn't count them out of the game just yet. They are an aggressively
competitive company. They'll be back.

~~~
chapium
Just look at what happened to amd around 2010. Intel can punch back. Hard.

~~~
HelloNurse
Intel could punch hard around 2010. We'll see what they'll be able to do now.

Improving the price (not the quality) of their high-end product range from
vastly inferior to slightly inferior is, at best, an attempt to remain
relevant; new and better products are needed to recover actual market share,
and their timeline remains doubtful.

------
vbezhenar
Now it's an interesting battle. I want to build a new workstation, probably in
2020. I was going to go the Threadripper route, but now I have a choice and
that's awesome.

~~~
gravypod
TR4 will likely support PCIe 4.0, and it seems to make sense to me to pick
Threadripper because of this. Ignoring cost, the future-proofing of having
tomorrow's I/O interconnect is probably worth a few bucks for a work machine.

~~~
vbezhenar
I can't imagine a scenario where PCIe 3.0 is not enough for me. Even for
storage it only matters for linear I/O, and I don't care about that much.
Random I/O won't saturate the PCIe interface; no disk is fast enough for that.
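
Back-of-the-envelope numbers support this (my arithmetic, not from the thread):
usable PCIe bandwidth per direction is lanes × transfer rate × encoding
efficiency, and PCIe 3.0 x4 already tops out near 4 GB/s.

```shell
# Usable bandwidth per direction, in GB/s:
#   lanes * GT/s * encoding_efficiency / 8 bits_per_byte
# PCIe 3.0 runs at 8 GT/s with 128b/130b encoding; PCIe 4.0 doubles the rate.
pcie_gbps() {
    awk -v l="$1" -v gt="$2" 'BEGIN { printf "%.1f\n", l * gt * 128 / 130 / 8 }'
}

pcie_gbps 4 8      # x4  gen3: 3.9 GB/s ceiling, the limit for a 2019 NVMe SSD
pcie_gbps 16 8     # x16 gen3: 15.8 GB/s
pcie_gbps 16 16    # x16 gen4: 31.5 GB/s
```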

~~~
gravypod
It might not be useful now, but in terms of future-proofing a system it might
become more of a deciding factor. The ability to make use of drives, graphics
cards, network interfaces, or anything else that connects via PCIe for a few
years to come might be beneficial.

------
mbfg
The question is how much of this is pure marketing, given Intel's supply
problems, and how much is a real product. Will Intel actually be able to match
demand for these products in the short term? It's an easy marketing win if you
promise half-price parts but customers en masse can't actually buy them.

~~~
tracker1
Given how hard it is to actually buy an R9 3900X anywhere close to list price,
I'm not sure it really matters either way at the moment.

------
noipv4
Ryzen 9 [3900X 12C/24T and 3950X 16C/32T], both @ 105W, will be Intel HEDT's
coffin.

~~~
tracker1
I think the next Threadripper release will be what really kicks Intel's HEDT
line. The new pricing structure seems competitive against the R9 series. I've
already bought AMD and don't plan on switching for 4-5 years though (a 3600
while waiting for the 3950X to drop).

------
sukilot
Is there a website that maps popular intel chips / chipsets / builds to AMD
competitors?

Also, what's the deal with sockets? What sets of intel chips are mutually
compatible and incompatible with others, and how does AMD compare?

~~~
marcosdumay
The deal with the sockets is that the best CPUs stop being compatible with
your motherboard only once every 4 or 5 years, so you'll only be denied an
upgrade about once a decade. A quick search turned up a compatibility list:

[http://www.buildcomputers.net/amd-cpu-socket.html](http://www.buildcomputers.net/amd-cpu-socket.html)

------
self_awareness
Yay, competition. Gotta love it.

~~~
zantana
Yes, in the midst of all the complexities of the various monopolies,
oligopolies, and inefficiencies in the modern marketplace, it's refreshing to
see the system work as promised. Just like in the Athlon days, even people who
would never in a million years use AMD's products, or even know of their
existence, benefit from them.

------
wil421
I am getting served ads that are redirecting my browser 10 times and showing
something like those scammy Amazon gift card ads. On mobile Safari.

~~~
Synaesthesia
Yeah it has gotten worse. You should run an adblocker on your iOS device.

~~~
nvrspyx
To add, I recommend Adguard, 1Blocker X, Wipr, or Firefox Focus. Out of those,
I believe only 1Blocker X has a cost, although AdGuard has a subscription for
DNS privacy and security/custom filters.

I personally use 1Blocker X.

~~~
tracker1
Is there a Brave release for iOS? I've been pretty happy with it on Android.

~~~
nvrspyx
Brave browser is on iOS, but I don’t believe its adblocker extends to Safari.

Not being able to set a default browser on iOS tends to make you use Safari
more than another browser you have installed since tapping a URL in another
app will open Safari, unless that app supports setting a default browser,
which not many do and even fewer support opening Brave. It has to be
implemented on a per-app basis since they're really just using the
x-callback-url scheme. Most tend to only support setting Safari, Chrome, or
Firefox.

~~~
tracker1
Was just curious... I don't use/own any iDevices, so no personal experience in
that space. For Android, it's worked out pretty well for me.

------
acd
Intel's CPU TDP is 165W, and we have global warming. AMD's TDPs are 65W and
105W.

We have gone from 200W power supplies to 1kW power supplies in PCs.

