Intel Starts Publishing Open-Source Linux Driver Code for Discrete GPUs (phoronix.com)
209 points by symisc_devel 29 days ago | 43 comments



Finally. Options!! Back in the day we had 3Dfx (went under and sold to nVidia), PowerVR (they're still around; they're probably the GPU chip in your cellphone), the Intel i740, and the AMD Radeon. nVidia broke into the market with the Riva/Vanta/TNT cards. Hell, even Trident had some Direct3D/OpenGL chips. For the past several years we've been down to just the big two (although Intel integrated has held up, even for some medium gaming and indie titles).

5+ was probably too many for most game devs to support, but 3 would be nice and solid. It's also nice that we'll have another player besides AMD that actually gives a shit about Linux/OSS drivers. nVidia might finally be forced to open up and contribute to Nouveau and bring it up to par with amdgpu. The future could be a team green, a team red, and a team blue. Intel has killed discrete graphics projects before, though, so we'll have to wait and see.


>AMD Radeon

Or do you mean the ATI Rage 3D? There were also, as mentioned, S3, Matrox, and 3Dlabs (sold to Creative, the maker of the Sound Blaster sound card). I miss 3Dfx (I still have a few Voodoo graphics cards somewhere); Glide was way ahead of its time. John Carmack was fighting for OpenGL.

The biggest barrier to entry for GPUs isn't the hardware, but the insane amount of software (driver) optimisation required for a GPU to be even remotely competitive with its rivals.


Don't forget Rendition and their Verite chips.

They had an absolutely insane card that had both PCI and AGP connectors.


Hasn't Vulkan made that barrier much lower? Is it not yet feasible to just ship a Vulkan driver and a set of standard OpenGL/DX generic implementations on top of it just for backwards compatibility?


So far Vulkan has mostly been taking off on GNU/Linux.

On Windows, store apps only use DirectX, including Win32 packaged ones.

On Apple you have MoltenVK, as Apple only cares about Metal.

Sony has their own APIs.

Nintendo does support Vulkan, but is more keen on its own NVN, alongside Unreal and Unity support.

On Android it is an optional API, and of the vendors that do support it, not everyone cares to update their drivers.

So Vulkan is not yet taking over the 3D world the way Khronos might advertise it.


Matrox and S3 were also competitive in the consumer space before convergence on the big two.


I forgot about those. I had an AGP S3 ViRGE at some point. Matrox is still around, but they mostly make high-end cards for driving 6-12 displays (for things like airports, trade shows, massive digital billboards, etc.)


I've never forgiven them for failing to deliver real OpenGL support for the G400. And, boy, can I hold a grudge. Just ask Microsoft.


I had a G200 and recall having random lines scattered over the screen in games courtesy of the OpenGL-via-Direct3D wrapper.


Hmm, we had more than 2, if you consider how big Android and iOS are (in numbers of clients). Not sure what kind of GPU something like the Librem 5 is going to use.

For desktop Linux FOSS drivers, sure, you're right. What choices do we have here besides low-end Intel and AMD (who can't seem to compete with Nvidia for market share)? I mean, I put my money on AMD for GPU and CPU whenever I can. Ryzen and Vega give good price/performance... but the option isn't always available (Vega was out of stock or expensive during the cryptocurrency hype, and for, say, a laptop it's difficult to go AMD).

Now, I get that Intel is a small player in the GPU market and that competition is good here, but Intel is huge in the CPU market. I feel like AMD is the underdog; it faces a gigantic bully at #1 in both the CPU and GPU fields: Intel and Nvidia. I don't like either of those companies...


iPhone used to have PowerVR before Apple decided to make their own GPU for iPhone 8. PowerVR also can be found on some Intel Atom devices. Most Android phones nowadays use either Adreno from Qualcomm or Mali from ARM.


Making them practically unusable for open-source OSes. Even under Windows they performed horribly. Good riddance.


From an open-source perspective, there is hope: a preliminary reverse-engineered Mali driver was recently merged into Mesa. The kernel bits aren't wired in yet, but progress has been pretty good recently. I wouldn't be so quick to say "good riddance", as a lot of very useful ARM boards like those from Odroid have a Mali GPU.

https://www.phoronix.com/scan.php?page=news_item&px=Panfrost...


I think this is a misunderstanding. I don't think Mali is available in a lot of Windows devices; the parent was presumably talking about PowerVR.


There isn't a single iGPU used in Android phones that has mainline drivers. Pretty much all of them are practically unusable for open-source OSes. The most notable exception is Adreno, which has a more mature reverse-engineered driver; now that their patents have expired, Qualcomm has contributed a tiny bit too. But other than that, the rest still suck.


The Atoms with PowerVR were also put into netbooks, i.e. into a class of hardware where you would want to install a Linux distribution.


Correct, but that's still a rather small market compared to e.g. Chromebooks, which can use anything from ARM or Qualcomm.


Some MediaTek SoCs still use PowerVR.


Don't forget 3Dlabs.


Any number would be fine if we actually had proper standards. Why are graphics so special that we can't have an x86 of graphics? I think all of this is a problem of klepto-capitalism. Also, you need 4 players to have real competition.


If you haven't noticed, there are plenty of standards kicking around in graphics, and there isn't a need for any more. AMD, Nvidia, and Intel do a pretty good job supporting all of them - you can write a DX12 (or OpenGL, or Vulkan) program and run it on any vendor's hardware.

What problem do you think a standard ISA would solve? Nobody distributes native binaries for GPUs, only shaders (whether compiled to an intermediate representation or not). From my perspective, all a standard ISA would cause is less implementation flexibility in the hardware, since the hardware would now have to deal with compatibility instead of letting the vendor-specific compiler included with the driver deal with it.
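To illustrate the intermediate-representation point: with Vulkan, for instance, applications ship shaders as SPIR-V modules, and each vendor's driver lowers them to its native ISA at load time. A minimal Python sketch (the 8-byte header here is fabricated for illustration) that recognises the SPIR-V module header:

```python
import struct

SPIRV_MAGIC = 0x07230203  # first word of every SPIR-V module

def spirv_version(blob: bytes):
    """Return (major, minor) of a SPIR-V binary, or None if it isn't SPIR-V."""
    if len(blob) < 8:
        return None
    magic, version = struct.unpack_from("<II", blob, 0)
    if magic != SPIRV_MAGIC:
        return None
    # Version word layout: 0x00MMmm00 (major in bits 16-23, minor in bits 8-15)
    return (version >> 16) & 0xFF, (version >> 8) & 0xFF

# Fabricated two-word header claiming SPIR-V 1.3:
header = struct.pack("<II", SPIRV_MAGIC, (1 << 16) | (3 << 8))
print(spirv_version(header))  # → (1, 3)
```

The driver never sees your GLSL/HLSL source or a foreign GPU's machine code, only this IR, which is what leaves vendors free to change their hardware ISA between generations.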


Can't wait to pair my Threadripper with an Intel GPU :D Now, nVidia, please release an x64 processor so we have a complete mix on the market.


One advantage of Intel drivers is their support for virtual GPUs, which neither AMD nor Nvidia supports on consumer graphics cards.
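For context, Intel's mechanism here is GVT-g, exposed through the kernel's mediated-device (mdev) framework. A rough configuration sketch on a supported iGPU - the PCI address and vGPU type name are examples and vary by hardware; this assumes i915 with GVT-g enabled and the kvmgt module loaded:

```shell
# Example paths; check your own hardware before running.
GVT=/sys/bus/pci/devices/0000:00:02.0/mdev_supported_types
ls "$GVT"                                    # lists available vGPU types, e.g. i915-GVTg_V5_8
UUID=$(uuidgen)
echo "$UUID" > "$GVT/i915-GVTg_V5_8/create"  # instantiate a vGPU
# Hand the vGPU to a guest via VFIO:
#   qemu-system-x86_64 ... -device vfio-pci,sysfsdev=/sys/bus/mdev/devices/$UUID
```

Each guest then sees what looks like its own Intel GPU, multiplexed onto the physical one by the host driver.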


I'd rather they fixed their GPU drivers by open sourcing them, or AMD theirs.


AMD's Linux stack is not only completely open source, most of it is in-kernel and maintained by AMD devs.


Try to use OpenCL and you will need blobs. It's a major pain to use Blender (requires OpenCL if you want to finish render today); not long ago their blob crashed on new versions of libdrm.


The blobs are probably the most stable option right now, but Clover exists (and kinda sucks), and the new cool option is the fully open ROCm stack, which does include OpenCL.

> requires OpenCL if you want to finish render today

honestly, modern CPUs aren't that far behind in Cycles rendering


I'm using ROCm OpenCL right now, which I believe is the approved option, or will be. Personally I've not had any issues with it - I can certainly use Blender with it, although the performance right now is so-so (I only have a cheap APU, though, so that's hardly surprising).

I don't believe I have any blobs, other than the firmware one, which is true regardless - can you tell me what blobs you mean?



I can confirm I don't have that installed. I'm using this: https://github.com/RadeonOpenCompute/ROCm-OpenCL-Runtime

Edit: actually Blender performance is decent, but the kernels take ~10 minutes to compile.


That's only for new stuff, right? Older GPUs are still stuck with radeon or wonky proprietary drivers?


AMD's GCN architecture is about 7 years old and fully supported. Open source support for the previous TeraScale architecture is reportedly fairly good too, though no Vulkan.


Pity that to use those I would need to replace my laptop, meanwhile the DirectX 11 driver still works.


I thought amdgpu only officially supported GCN 1.2 and later, which is more like 4.5 years old?


My understanding is that the best AMD Vulkan implementation is still the independently developed one, as AMD took forever to open source their own.


No, it requires nonfree firmware to be loaded by the driver.


There were rumours of Nvidia having an x86 variant of their Denver core but killing it because of licensing. I wonder how true that is.

https://semiaccurate.com/2011/08/05/what-is-project-denver-b...


I wonder if Nvidia could even compete with a high-end ARM CPU. Windows runs on ARM these days. Server software runs on ARM. So it would "just" be a matter of getting the performance right.


Intel's Gen10 integrated graphics presumably had a bunch of effort put into it, and has been mothballed for years because Intel never could get the CannonLake CPUs out the door. That's got to sting a bit for its designers.


I can’t wait for some eDRAM to act as my L4 cache and finally start addressing the von Neumann bottleneck.


Get yourself a 5 year old laptop ;) https://en.wikichip.org/wiki/intel/crystal_well


Are there security benefits to "fake local memory"? I imagine a sandboxing arrangement where a rogue WebGL app can't hack your entire OS.

Several applications: (a) OpenCL servers - just fill an extra slot; (b) eGPU over Thunderbolt; (c) a drop-in GPU for RISC-V boards (sans ARM's Mali).


As far as I can tell, you are describing problems already solved by virtual memory.



