It's a different matter for laptops where consumers have much less say on the topic, of course.
The rumor is that Microsoft's Surface Laptop announced today will be AMD-based, which would be the beginning of the end for Intel as other OEMs fall in line.
This is a big deal IMO.
There's the old story of talking to an Oracle salesman while drinking from an IBM coffee mug - to signal that there are other viable options and get a better deal.
Nvidia's Linux support absolutely puts AMD to shame.
The biggest lesson in hardware is that "it's the software support, stupid". Too many hardware manufacturers think they are selling hardware alone. They are selling hardware AND the software that makes it work.
A computer is useless without software support. Witness the vast array of essentially useless ARM devices out there because none of them have usable software drivers/Linux support.
Look at this mess: https://wiki.archlinux.org/index.php/Xorg#AMD Do you want AMDGPU, AMDGPU PRO, ATI, or Catalyst? How do you choose? Let's assume you want open source and recent, so probably AMDGPU. OK that's relatively straightforward, but now let's say you want Vulkan support. That's over here: https://wiki.archlinux.org/index.php/Vulkan#Installation and golly wouldn't you know it there are three choices, vulkan-radeon, amdvlk, and vulkan-amdgpu-pro. How do those relate to the radeon and amdgpu drivers mentioned earlier? Arrrrrrrrrrgh.
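To make the confusion concrete: all of those Vulkan packages ship ICDs (driver manifests) that the Vulkan loader picks up independently of the Xorg/Mesa driver choice, and an application can see several of them for the same GPU at once. Here's a rough sketch - hypothetical, untested, and assuming a Vulkan 1.1+/1.2 loader and headers - of a C program that lists each physical device together with the driver exposing it:

    /* list_vk_drivers.c - enumerate Vulkan devices and the ICD behind each.
     * Sketch only; assumes a Vulkan 1.2-era loader and headers. */
    #include <stdio.h>
    #include <vulkan/vulkan.h>

    int main(void) {
        VkApplicationInfo app = {
            .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
            .apiVersion = VK_API_VERSION_1_2,
        };
        VkInstanceCreateInfo ici = {
            .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
            .pApplicationInfo = &app,
        };
        VkInstance instance;
        if (vkCreateInstance(&ici, NULL, &instance) != VK_SUCCESS) {
            fprintf(stderr, "no usable Vulkan ICD found\n");
            return 1;
        }

        uint32_t count = 0;
        vkEnumeratePhysicalDevices(instance, &count, NULL);
        VkPhysicalDevice devices[16];
        if (count > 16) count = 16;
        vkEnumeratePhysicalDevices(instance, &count, devices);

        for (uint32_t i = 0; i < count; i++) {
            /* VkPhysicalDeviceDriverProperties reports which driver (RADV,
             * AMDVLK, the PRO stack, ...) is actually providing the device. */
            VkPhysicalDeviceDriverProperties drv = {
                .sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_DRIVER_PROPERTIES,
            };
            VkPhysicalDeviceProperties2 props = {
                .sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_PROPERTIES_2,
                .pNext = &drv,
            };
            vkGetPhysicalDeviceProperties2(devices[i], &props);
            printf("%s: %s (%s)\n", props.properties.deviceName,
                   drv.driverName, drv.driverInfo);
        }

        vkDestroyInstance(instance, NULL);
        return 0;
    }

Build with something like "gcc list_vk_drivers.c -lvulkan". On a box with both vulkan-radeon and amdvlk installed you'll typically see the same card listed twice, once per driver - which is exactly the "how do these relate" problem.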
Meanwhile on nvidia, you install "nvidia" and you're done.
I've been working in gaming on Linux professionally for ten years. If I were buying a new GPU today, I would pick nvidia.
I'd never buy an Nvidia card for Linux now. I wouldn't have done so three years ago, but now? That'd be a huge step backwards.
AMD is definitely getting better, and being open source is an enormous point in its favor, but it's still just less usable day-to-day for an end user. I hope this changes.
You don't. Just use what your distribution chooses for you. That's one less step than
> you install "nvidia"
> If I were buying a new GPU today, I would pick nvidia.
I guess you don't mind the problem Nvidia has with Wayland.
From my perspective of someone who dislikes Wayland, that is a feature of Nvidia's drivers :-P.
But if you want your driver to not break horribly whenever there's an Xorg or kernel update, nvidia is behind
Or if you want your driver to just work 'out of the box', without needing to go out and install a binary driver, nvidia is behind
Or if you want your driver to support newer technologies like DRI3 and Wayland, nvidia is behind
Nvidia had OpenGL as a major focus for decades (it was even the first API through which new hardware features were exposed, via extensions - though nowadays they seem to do the same with Vulkan) and their OpenGL implementation's quality shows.
And as for the compatibility context, IMO OpenGL shouldn't have defined it at all. It is rather weird to have rendering code using both the legacy fixed pipeline and modern shaders. From my experience it just causes problems everywhere, and I have encountered unexpectedly bad performance and glitches on Nvidia drivers too (though maybe less often than on Intel/AMD).
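For anyone unsure what a "compatibility context" is in practice: it's something the application asks for at context-creation time. A minimal sketch with GLFW (hypothetical version numbers, not tied to any particular driver) - swap COMPAT for CORE below and the legacy glBegin/glMatrixMode-style entry points simply stop being available to the app:

    /* compat_context.c - request an OpenGL compatibility-profile context.
     * Sketch only; a core profile would reject all fixed-pipeline calls. */
    #include <GLFW/glfw3.h>
    #include <stdio.h>

    int main(void) {
        if (!glfwInit()) return 1;

        glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
        glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
        /* GLFW_OPENGL_CORE_PROFILE: modern-only.
         * GLFW_OPENGL_COMPAT_PROFILE: fixed pipeline and shaders coexist,
         * which is the mix described above. */
        glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_COMPAT_PROFILE);

        GLFWwindow *win = glfwCreateWindow(640, 480, "compat", NULL, NULL);
        if (!win) { fprintf(stderr, "context creation failed\n"); return 1; }
        glfwMakeContextCurrent(win);

        printf("Got GL version: %s\n", (const char *)glGetString(GL_VERSION));

        glfwDestroyWindow(win);
        glfwTerminate();
        return 0;
    }

Build with something like "gcc compat_context.c -lglfw -lGL". Under a compatibility profile the driver has to keep the whole fixed-function state machine working alongside shaders, which is where (in my experience) a lot of the per-driver quirks come from.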
1) Closed driver
2) Weird, incompatible multi-monitor support called TwinView, where windows would maximize to stretch across all your screens
3) Abusive on the PCI subsystem, so that my sound hardware would experience underruns, both spurious and reproducible events such as during workspace switches
4) Corruption/lockups when switching between FB terminals and the X session
I switched to ATI and the open source driver and all these problems went away.
I have difficulty placing all the blame on Nvidia here, since it is the kernel developers' long-held disdain for stable driver ABIs that Nvidia feels forces their hand. There are those who argue otherwise, but that argument is basically "all software should be open source". Even if we go with the more reasonable "all drivers should be open source", that still introduces problems because of bonkers patent law nonsense.
How often is that actually an issue? Last time I ran into that was when I was still running Debian Sid and NVIDIA was far from the only thing that ended up breaking.
NVidia Linux support has always been bad, we just didn't realize it because AMD's used to be worse. Luckily this has changed over the past few years and now we can spec in AMD or perhaps Intel's rumored Xe graphics card.
I assume you're joking, but if not please don't junk them! I'm sure lots of people on HN would gladly pay for (at least) shipping.
Huh? AMD "just works". NVIDIA requires a dance of (Fedora) DKMS kernel modules and binary blobs. What am I missing? I buy AMD just because it's been frictionless to get working, and I don't mind if there's a small performance penalty I pay for it.
Neither company is perfect with Linux support. As with most things, there are pros and cons to both sides. But for anyone else reading here, just because you pick AMD does not mean everything is magically trouble free on Linux.
My rMBP has been able to do this since February, 2014. I was trying to get my RX 560 in mid 2018 to do this and it just was not possible via any means with Ubuntu 18.04. I ended up using an Nvidia 1040 with the binary drivers without issue. The nouveau driver for the 1040 was able to do it with a newer kernel than Ubuntu 18.04 had (I was using Arch for that test).
To get two 4k@60 outputs, I have to run two cables (it has two TB2 ports). With daisy-chaining, I got mirroring at best (I didn't check refresh rate at the time, though).
I think you have a typo, which unfortunately is half way between "is" and "isn't". Or your sentence is partially German.
Is this meant seriously? Is it not the exact opposite?
Using Vulkan(?) or OpenCL for deep learning is fine if your goal is helping get the word out that it’s at least possible to do so.
It’s not a great idea if you simply want to get your work done with a minimal amount of issues and with up to date support of all the prominent packages.
Both are valid objectives.
HIP won’t help you one bit if you use one of the many CUDA libraries.
> If they don't support OpenCL or Vulkan yet, you can work on it and contribute the feature upstream. It's a matter of getting the ball rolling to start with, nothing more than that.
And that’s exactly what I mean by the choice that you have to make.
You could do that, and win brownie points with some crowd, but let’s not pretend that doing so wouldn’t have an impact on your work as an AI researcher.
“Nothing more than that” is more than a little bit disingenuous when you take into account the actual work that’s involved, don’t you think?
It’s something a company like AMD could (and does?) sponsor by throwing paid employees at it. Or big companies like Alibaba who buy enough HW to trade off HW cost vs R&D spending.
But if your goal is to be a deep learning specialist (it’s where the money is today), spending time on these kinds of infrastructure issues is a complete waste.
Imagine how much quicker fair pricing would arrive if there were standards and regulations, like those that go into every other aspect of its design.
The only applicable regulation I can think of might be "dual sourcing".
If the x86 and x86_64 instruction sets became an open standard so that anyone could implement them, that would be great for consumers. If Intel is in a position to halve prices in the face of competition, there are obviously not enough players in the market.
New computer is an X570 motherboard, 64GB RAM, and an RX 5700 XT, with an R5 3600 as a placeholder while I wait for a 3950X; I only pulled the trigger early since my last system was acting up. I don't have the specific costs, but I've spent around $1500 so far and will probably lose ~$50 on the 3600 when I upgrade and sell it off. It will probably be 4-5 years until I upgrade again, except I may drop a second and/or third NVMe drive in or bump the RAM to 128GB.
The difficulties I've had this time around have been around aesthetics: since I didn't have any experience with aRGB it took a while to figure out what to get, and even then none of the aRGB software supports Linux. The 5700 just got swapped in, and I've had trouble with it and will have to sort that out over the weekend; I may wind up on Windows if it's too much of a pain.
I've mostly been working on my work-issued laptop, where I've had separate issues with the battery/charging etc the past month. In the end though, spending a couple grand on a computer every 3-5 years isn't so bad if you're actually doing work on it. I will say, I am surprised how much faster this system actually feels over my old one. Hopefully I can get it stabilized with the new video card soon.
(r5-3600, waiting for 3950x to drop, upgraded sooner because old system was acting up).
It got to the point that I had to ask my boss to replace my machine, it was so sluggish.
Going from my dual-core Pixelbook to my Ryzen desktop is a huge change in the development experience.
They were quite durable 20 years ago too. It's just that back then clock speed and performance were increasing very rapidly, and that had a huge effect on the usability of your machine (especially if you were a gamer). You would likely upgrade and retire a working machine for the sake of a drastic performance boost every 3ish years.
Nowadays you could happily be using a 5+ year old i5 for full time development and moderate gaming with no issues at all. I'm personally at 5+ years now and I don't see myself upgrading for quite some time because everything I do still feels very snappy.
But compare that to something like a Pentium at 75MHz (1994) vs a Pentium III at 1GHz (2000) and the performance difference was crazy. Over 10x the clock rate, but it's not just a "paper" boost. They are so different in performance that you can't even compare them. They are in leagues of their own.
(The buttons used to be plastic. Now they're wood.)
That's durable. It also does a much better job than any modern food processor, possibly because the motor is four times the size of what you'd get now.
There are 50-year-old cars still in service today. I wouldn't drive one, because I want to take advantage of the tech advancements of the past 50 years. But that doesn't mean cars aren't durable goods.
And even though computing is barely 50 years old, there are 50-year-old computers in service today. I wouldn't use one, because I want to take advantage of the tech advancements of the past 50 years. But that doesn't mean computers aren't durable goods.
This applies even if a 50-year-old car, which predates Bluetooth, doesn't have built-in Bluetooth. And it applies even if a 50-year-old computer, which predates HTTP, can't load websites.
As things stand, I already have an 8-core X299 system, and the fact that I can drop an 18-core CPU into it after a BIOS update is compelling. But it seems unlikely that there is a large enough audience in this situation to juice Intel's profits or prevent them from losing more market share to AMD.
Supporting the underdog is a good idea nonetheless, but not for this reason per se.
And even now, after said theoretical vulnerabilities have been reified, there is very little cause to be concerned about the vulnerabilities under discussion unless you host code for other people as a business model (or use such a service). Otherwise your biggest concern is a web browser that already has a whole host of actual and theoretical vulnerabilities of its own.
This suggests a level of informed consent that I don't think existed. It implies that "the market" (who?) knew of, understood, and agreed to the risks.
And anyway, "the market" does a poor job of representing some of its stakeholders, notably the disorganized group known as users, and immediate competitive advantage may be the only metric driving decision-making.
> unless you host code for other people as a business model
Completely and utterly false. The exploits being found exist at a point much lower than the x86 instruction set. ARM processors were also hit with speculative execution exploits.
And his point still stands: while Intel has/had Meltdown, which was pretty bad, that doesn't mean a Meltdown-like bug doesn't exist in AMD's hardware.
It's like finding a bug in the OpenJDK that isn't in Zulu and then declaring that "Zulu is more secure than the OpenJDK!"
Or an even closer example "Intel doesn't have the TLB bug! Intel makes better CPUs!"
AMD isn't guaranteed free of all exploits. It is only guaranteed to not suffer specifically from meltdown.
Just Not Meltdown. Or SPOILER. Or Fallout. Or RIDL. Or ZombieLoad.
You're really underselling the difference between the two. AMD's processors have only shown vulnerability to the issues that are inherent to the nature of speculative execution (Spectre).
Intel, on the other hand, has suffered from no less than 5 or 6 separate disclosures of vulnerabilities from various places in their microarchitecture where they cut corners on process separation in order to gain speed. None of these exploits have been pulled off against AMD, despite many of the papers' authors explicitly trying to.
The original rebuttal was AMD gets less researcher eyeballs because there are more Intel devices in the wild so it's expected that Intel has more vulnerabilities found.
This was fully correct, yet you and the GP apparently are fixated on the idea that AMD didn't have certain vulnerabilities that were originally found _by studying Intel CPUs_. No big surprise that most don't apply to AMD in that case... because they were found by studying Intel CPUs first.
Yes they can. Instruction sets are just that, a bunch of op codes and "things" for a processor to do. How a processor actually performs them is open to implementation.
Implementation matters a lot. Heck, it is WHY there are performance differences between AMD and Intel CPUs.
Before the price cuts it looked pathetic, so I guess it's an improvement.
My pet theory is that there's just a ton of casual investors who have also become emotionally invested in AMD...
It isn't a particularly insightful comment, I'll acknowledge that, but this response seems like it lowers the quality of the thread more than the comment it is replying to (and arguably breaks the site's rules by suggesting the other poster has ulterior motives for posting what they posted).
> Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith.
> Please don't make insinuations about astroturfing. It degrades discussion and is usually mistaken. If you're worried, email us and we'll look at the data.
Yet you assume my comment is referring to one comment and unfairly attacking it?
This entire page is uncharacteristically rife with misinformed comments and with people oddly invested in shutting down anyone who has anything positive to say about Intel.
Please don't put words in my mouth.
I didn't accuse AMD of astroturfing. If anything, I literally implied the opposite: an organic reason why there's a movement to emphasize "AMD good, Intel bad".
My point is it's to the detriment of having real discussions about either when they come up on HN.
2019 - the year consumers win.
And I suspect the real answer is that they set their targets too ambitiously with their new node. Their 10nm was simply too big a jump, with too many unproven technologies and too many unexpected obstacles.
But I wouldn't count them out of the game just yet. They are an aggressively competitive company. They'll be back.
AMD is expected to double their penetration in the server space by the end of the coming year, which is huge. I think they'll make inroads into the laptop space as well. Being the preferred APU supplier for the next-gen consoles won't hurt their sales to developer channels either.
I would be surprised if they haven't captured about 1/3 of the overall x86 market by the end of 2021, which is probably when Intel will be back where they want to be technology-wise; until then it's AMD's game.
Improving the price (not the quality) of their high-end product range from vastly inferior to slightly inferior is, at best, an attempt to remain relevant; new and better products are needed to recover actual market share, and their timeline remains doubtful.
- a long list of security vulnerabilities whose mitigations (full patching plus disabling features like HyperThreading, as recommended) result in a huge performance loss (~25% in some reports)
- lower power efficiency due to still using the 14nm process
- platform uncertainty, i.e. will upgrading mean buying a new motherboard?
Basically for me it's about:
* Security issues. They are real for Intel, and the performance loss from the mitigations is real.
* Inter-core communication latency. I don't particularly like AMD's chiplet design, where some cores take a long time to talk to other cores (a rough sketch of how you'd measure that is below, after this list). Intel seems like a safer play here, but I haven't researched their HEDT CPUs; maybe it's the same for Intel.
* Balance of frequency/cores. While more cores are better, I don't think I really need more than 12. And single-thread performance is always important.
* Other issues. I don't like that AMD had multiple issues with its Zen CPUs, especially on Linux: crashes, the RNG bug, overall Linux compatibility. I feel like Intel takes Linux compatibility much more seriously, and overall Intel seems like the safer bet for a stable system.
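The "cores take a long time to talk to each other" worry is at least measurable. Below is a rough, hypothetical sketch (Linux-only, made-up core numbers and iteration count, not a real benchmark) of the usual ping-pong measurement: two threads pinned to chosen cores bounce a value through a shared atomic, and the average round-trip time approximates the core-to-core latency. On a chiplet CPU you'd compare a pair of cores on the same CCX against a pair on different CCXs.

    /* core_pingpong.c - rough core-to-core latency sketch (Linux only).
     * Build with: gcc -O2 -pthread core_pingpong.c */
    #define _GNU_SOURCE
    #include <pthread.h>
    #include <sched.h>
    #include <stdatomic.h>
    #include <stdio.h>
    #include <time.h>

    #define ITERATIONS 1000000

    static _Atomic int token = 0;

    static void pin_to_core(int core) {
        cpu_set_t set;
        CPU_ZERO(&set);
        CPU_SET(core, &set);
        pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
    }

    /* Responder: waits for each odd value and bumps it to the next even one. */
    static void *responder(void *arg) {
        pin_to_core(*(int *)arg);
        for (int i = 0; i < ITERATIONS; i++) {
            while (atomic_load_explicit(&token, memory_order_acquire) != 2 * i + 1)
                ;
            atomic_store_explicit(&token, 2 * i + 2, memory_order_release);
        }
        return NULL;
    }

    int main(void) {
        int initiator_core = 0, responder_core = 1;  /* pick the pair to test */
        pin_to_core(initiator_core);

        pthread_t t;
        pthread_create(&t, NULL, responder, &responder_core);

        struct timespec start, end;
        clock_gettime(CLOCK_MONOTONIC, &start);
        for (int i = 0; i < ITERATIONS; i++) {
            atomic_store_explicit(&token, 2 * i + 1, memory_order_release);
            while (atomic_load_explicit(&token, memory_order_acquire) != 2 * i + 2)
                ;
        }
        clock_gettime(CLOCK_MONOTONIC, &end);
        pthread_join(t, NULL);

        double ns = (end.tv_sec - start.tv_sec) * 1e9 + (end.tv_nsec - start.tv_nsec);
        printf("cores %d<->%d: ~%.0f ns per round trip\n",
               initiator_core, responder_core, ns / ITERATIONS);
        return 0;
    }

The absolute numbers are rough, but the same-CCX vs cross-CCX difference is the thing the chiplet concern is actually about.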
Inter-core communication latency has been much improved with the larger caches and architecture improvements in Zen 3. Single threaded performance has always been within a stone's throw of Intel CPUs with Zen and Zen 2 and now the gap isn't even worth talking about with Zen 3 since you get more cores for less money.
I'm not sure about the Linux story, but this is the first I've heard of issues with Zen on Linux.
Zen 1 didn't have an especially large list of issues for being a ground-up new architecture. Launching to enthusiast desktops so you can revise things for the server market seems like a great strategy for them overall.
Depending on your workloads, it may well be worth it for you though.
Also, what's the deal with sockets? What sets of intel chips are mutually compatible and incompatible with others, and how does AMD compare?
I personally use 1Blocker X.
Not being able to set a default browser on iOS tends to make you use Safari more than any other browser you have installed, since tapping a URL in another app will open Safari - unless that app supports choosing a browser, which not many do, and even fewer support opening Brave. It has to be implemented on a per-app basis, since they’re really just using the x-callback-url scheme. Most tend to only support Safari, Chrome, or Firefox.
We have gone from 200W power supplies to 1kW power supplies in PCs.