Tried Nvidia’s GTX 1080 – still no external GPU on a Pi (jeffgeerling.com)
127 points by geerlingguy 34 days ago | hide | past | favorite | 60 comments

The Raspberry Pi does not support PCIe I/O BAR space. That's not a hard requirement to run a modern GPU as far as I know, but drivers do kind of assume it's available. In particular, VBIOS stuff often needs it. The message from the Radeon driver is not a hard error, but presumably other things break as a result.
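For anyone poking at this on real hardware: you can see which BARs a card exposes as I/O-port vs. memory space without writing any kernel code, by parsing the sysfs `resource` file for the device. A minimal sketch (the flag constants match the kernel's `IORESOURCE_IO`/`IORESOURCE_MEM` definitions; the sample values are made up for illustration):

```python
# Classify a PCI device's BARs by parsing /sys/bus/pci/devices/<addr>/resource,
# which lists one "start end flags" line per BAR.
IORESOURCE_IO = 0x00000100   # kernel flag: port I/O BAR
IORESOURCE_MEM = 0x00000200  # kernel flag: memory-mapped BAR

def classify_bars(resource_text):
    """Return a list of (index, kind, size) for populated BARs."""
    bars = []
    for i, line in enumerate(resource_text.splitlines()):
        start, end, flags = (int(x, 16) for x in line.split())
        if flags == 0:
            continue  # unpopulated BAR
        if flags & IORESOURCE_IO:
            kind = "io"
        elif flags & IORESOURCE_MEM:
            kind = "mem"
        else:
            kind = "other"
        bars.append((i, kind, end - start + 1))
    return bars

# Made-up example: one 256 MB memory BAR and one 256-byte I/O BAR.
sample = (
    "0x00000000f0000000 0x00000000ffffffff 0x0000000000040200\n"
    "0x0000000000001000 0x00000000000010ff 0x0000000000040101\n"
    "0x0000000000000000 0x0000000000000000 0x0000000000000000\n"
)
print(classify_bars(sample))
```

On a Pi, a card's I/O BAR would show up unassigned (start/end of 0), since the host bridge has no I/O space to allocate from, which is what trips up drivers that assume it exists.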

The Broadcom SoCs used on Pis still suck, and although this can probably be made to work, someone familiar with this codebase needs to care enough to fix the drivers to make it happen. It's probably quite doable for the open source drivers, but I wouldn't expect Nvidia's blob to work any time soon unless they take the time to fix it.

Not all PCIe root ports are equal. Some are, shall we say, less than amenable to devices more complicated than a USB controller.

Looking through the discussion here, there is some rather concerning stuff about DMA mappings (e.g. 64-bit accesses not being supported). It's possible the Pi 4 PCIe implementation is actually too broken to support these GPUs, though I haven't read through the whole thread in detail. It certainly looks pretty messy.


Thanks for taking a glance. There's a groundswell of people commenting "get marcan to take a look". And I would ask, but you're already doing some really awesome work over on the M1 side and I've learned enough to fill up two of my brains already from that.

At this point I kinda have to wave the white flag on the driver issues because I'm not deep enough on the kernel / PCI spec side to be productive figuring out the holes on the Pi's 'messy' bus :)

I spent a night looking through the radeon driver and register documentation, and it seems to fall back to sane defaults when it can't access I/O space; it just prints a nastygram. All of the expected I/O space is mirrored in the MMIO space, and the VBIOS code will happily use that as a fallback path.

One sort of cheap way to get a sense of what's actually going on (at least way cheaper than a PCIe protocol analyzer) might be to get your hands on one of the USB3380 cards (the poor man's PCIe protocol analyzer for those who aren't that comfortable with FPGAs). Those are really neat soft PCIe devices that bridge PCIe transactions over USB3 to another host computer and can do things like emulate other devices or send arbitrary TLPs. There are some hacker con talks about using them to attack systems via their DMA TLPs as a read/write primitive. You could use one either to bridge to a card on the host computer and tap the PCIe bus, or to send your own TLPs and verify that things like 64-bit transactions are actually a problem and not a red herring. You can do the same thing with an FPGA pretty easily, but a USB3380 might be a lot closer to the skill set you're comfortable with.

> But probably not AAA gaming, since gaming on Linux is a tough proposition even in the best of circumstances.

Just in case that's unknown to you or to other readers: it's really not true. Have a look at https://www.protondb.com/ - Linux is excellent for gaming, even AAA gaming. The Pi's processor would just be too weak either way, and not being x86 would prevent the games from running directly. Maybe that incompatibility, and the lack of proprietary games compiled for ARM, is what you meant? That would need something like https://github.com/ptitSeb/box86, which actually seems to work way better than expected.

Eh... the problem is that even with Proton it's extra work, and sometimes not pleasant, trying to keep games running in Linux. I had a nice Kubuntu Focus 2 laptop with a fast Nvidia RTX mobile chip and all the bells and whistles, and I could only get 2 of my 5 favorite Steam games to run through Proton, and one of those required some janky tweaks to run correctly.

When they did run they ran great (testament to the hardware and the decent driver support), but that's what I mean when it's 'tough'. Not impossible but it's not like Windows, where you just click buy, wait for the game to download, and start playing.

With 1 out of 5 you were extremely unlucky. For the most part Proton works exactly like described: you click play and the game runs. In the other cases, where fixes have to be applied, it's usually just a matter of setting a simple start parameter directly in Steam.
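For anyone who hasn't seen those start parameters: they're set per game under Properties → Launch Options in Steam, usually as environment variables prepended to `%command%`. Two commonly suggested Proton toggles (illustrative; which variable helps depends on the game and Proton version):

```
PROTON_USE_WINED3D=1 %command%   # render via WineD3D/OpenGL instead of DXVK
PROTON_NO_ESYNC=1 %command%      # disable esync if a game misbehaves
```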

You also should consider that on Windows in practice there are many issues, ranging from driver and OS updates critically breaking performance (which happened just now again), to middleware that stops working, to many older games that run perfectly fine on Linux not running at all on Windows anymore. It always depends on your game selection of course, but I, for example, removed my dual-boot Windows installation to game exclusively on Linux a few years ago, because dealing with all the breakage and incompatibility on that platform got way too annoying.

It's actually interesting in that tech written to help games run on Linux can also help older games run on Windows! As an example, the best way to run The Sims 2 on Windows 10 with a modern GPU is to force the game to render via DXVK :)

Right, that is interesting. It points in the same direction: the compatibility I praise above is, from a certain perspective, somewhat of an accident. Wine was supposed to run cutting-edge software. It does, but it also still runs the old software, giving it a capability modern Windows is missing.

I've been trying ProtonDB for a few months now with the 5 games I play most: Cyberpunk 2077, PUBG, Doom: Eternal, Disco Elysium, and Apex Legends with friends.

Of those 5 games, 3 do not work at all, and Doom: Eternal keeps crashing every 15 minutes on the two latest builds.

I am sorry, but "you click on play and the game does run" is definitely not true, unless you are playing famous old games (mainly supported by Valve) like Portal, CS, ...

I'm curious whether you use Nvidia or AMD, because for me (an R9 390 owner), Doom Eternal not only runs flawlessly on Linux, its performance is better out of the box than on Windows. As for stuff like PUBG and Apex Legends, that's a crapshoot thanks to anti-cheat software frequently being hostile to Linux.

Thankfully, there is work being done, and at least the situation with BattlEye has improved in recent years.

Not at all. I had issues running CS:GO, where you can't set a stretched resolution or play 4:3. The performance was way worse compared to Windows.

I think the game itself was broken for a couple of days on GNOME because of some library, and if you wanted to play you had to switch to KDE.

Even if you want to game on Linux, you have to install a ton of libraries, hope it doesn't break your system, and then jump through hoops all day. And if something does break, you have to spend time debugging or reinstalling the system instead of playing.

> Eh... the problem is even with Proton it's extra work and sometimes not pleasant try to keep games running natively in Linux.

I flipped one switch in settings and restarted Steam. I install Windows-only games in Linux just like I do in Windows within Steam.

How is that extra work?

The vast majority of the games now work in Proton without any significant performance differences. I actually have an inverse anecdote to yours: Two games I recently tried to run in Windows 10 wouldn't run, but ran flawlessly in Proton/Steam/Linux despite being newer Windows only games.

So your daily driver is a Windows PC already?

For development + video editing + gaming, are you gonna make a Windows vs Mac comparison video soon?

Linking some comments from your Pi vs Mac thread over here:

1. https://news.ycombinator.com/item?id=23503762

2. https://news.ycombinator.com/item?id=23503481

That's the nature of all desktop computing on Linux. Either way, Linux is very much capable of AAA gaming, though the setup can be a little rougher on the Linux side of things. That being said, Steam's Proton has made it so that 90% of my games just run out of the box.

The problem with Proton is that some things cannot be bypassed. I recently downloaded a multiplayer game that uses Xigncode (anti-cheat software), and it simply doesn't work with Proton. I know the game works flawlessly because I played a version of it from a different publisher that didn't use anti-cheat software (it's not a competitive game).

Anticheat will never work on anything other than the official OS for which it is designed. A cornerstone of anticheat is preventing the user from running non-approved software in the same space as the game. Any attempt to emulate an OS or fiddle with memory will be treated as possible cheating.

On a higher level, anticheat is about locking down user behavior, confining it to acceptable corners of the machine. It is there to enforce the content creator's wish. It is just the latest incarnation of DRM. I expect that many in control of linux are quietly happy to hear that it doesn't work properly on linux systems.

I can give you a counterexample, Battlefield 3, which uses Punkbuster Anti-Cheat, works fully and I played it a lot with friends in multiplayer.

Right. And random drug-testing is infringing on the athlete's right to bodily autonomy. Why would anyone care about the competitive integrity of a video game?

Can you name the games?

I don't think you can call gaming on Linux excellent if it still involves WINE / DXVK. Until Linux gets first party support, it will never be excellent. More like 'good enough'

That's exactly why it's excellent.

If the goals of future game development are pro-consumer and developer-centric, like platform agnosticism (i.e., preventing artificial platform monopolies like GFWL) and simpler development, then we're doing way better than good enough.

We've learned over the past decade that publishing Linux native games isn't as feasible or valuable because supporting them smothers smaller and independent developers in distro-specific bugs, robbing them of time and resources they could be putting towards developing content, spending time with family, or planning their next projects. There's no reason to publish natively as long as you can develop with Proton/SteamPlay in mind, and instead of every indie dev needing to fumble through learning Linux support, it shifts the burden to specialized developers (like Valve) who can more effectively contribute to Proton instead.

Linux-native support is almost an anti-criterion for me today, because I'd much rather troubleshoot a known-working build with all the features enabled (e.g., multiplayer works with other "Windows" versions) than a build that lags behind in updates and may never be more than a perpetual WIP. Proton, and tools like DXVK and VKD3D that can be ported anywhere, exist to circumvent that, and that's pretty amazing when you think about it.

DXVK is just DirectX for Linux. It's just a library, the same way libc is a library. Why would I care about what libraries my games use?

The problem isn't the compatibility, it's the reliability. You can't expect every game to work under Proton. It's always a gamble.

If there was a guarantee that every game works then it would be indistinguishable from excellent first party support. The reason why it's only good enough is that you have to sacrifice the titles that don't work.

A major problem with that is that you practically need an emulation layer to run old Linux games. Try running the Sid Meier's Alpha Centauri Linux demo... chances are it won't work, because distributions have moved on from where they were.

On the other hand, the Windows demo probably works, because Wine still understands both old Windows APIs and current Linux APIs.

30% of my Steam Library runs on Ubuntu 20.10 out of the box. I was shocked at the advancements made on Linux in the last 2 years. It really was desperate before that but there are decent options now at least.

Wine has gotten incredibly good over the past few years.

I just checked the ProtonDB website for a few games I enjoy these days, and it's far from excellent. I'll keep it the other way around: Windows with WSL2.

I love that this always comes up. No, gaming on Linux will never be as easy as on Windows.

It works about 90% of the time, but in that 10% there are probably a couple of games you're just not going to be able to get working. If you're serious about gaming, there's no reason not to just buy a Windows license. You can dual-boot.

It's not like 100% of the games someone wants to play will work on Windows either. Back when GTA 4 was relatively new I got it as a present. It took months until the Games for Windows Live DRM permitted me to run it, and even then only offline. Another example, from a while ago, but it's when I realized the compatibility you get from Linux: I tried to play the original Anno 1602 and of course booted up Windows, and it was just impossible at the time to get it to run. That annoyed me enough to finally try it on Linux, with regular Wine, where it just worked.

Sure, you can always dual-boot, but it's honestly a hassle after a while: Windows updating every time you boot it (when that's seldom enough), Grub breaking occasionally, and, when you are used to Linux, modern Windows being just excruciatingly user-hostile. When 90% of your Steam library works anyway, it's just not worth it.

A common refrain on the internet is "but can it play crysis?"

Windows 10 can't (at least not without a crack) because the DRM doesn't work anymore.

I dualboot without issue.

You can also run Linux inside of a VM on Windows.

Or you can even have a dedicated gaming PC and a cheap Linux PC. Windows Subsystem for Linux can give you the best of both worlds. I don't love Windows, honestly I think OSX is the best OS, but I can't run it on anything but a Mac.

>When 90% of your Steam library works anyway

Epic, EA, and Ubisoft don't even have clients for Linux.

There is a client for Epic, Heroic (https://github.com/Heroic-Games-Launcher/HeroicGamesLauncher). EA and Ubisoft gave up on their walled gardens and sell their games via Steam now. There is still the EA Origin DRM client involved there somewhere and that can cause problems, but it can also work, so worth a try if the game is worth it.

I stand corrected. I still think Windows is worth keeping around, but I love to see alternatives.

Just buy a Jetson Nano 2GB; they are cheaper than a Raspberry Pi 2GB now anyway (at least in Sweden, where the Raspberry Pi is sold out and Arrow has free shipping on the Jetson). And that little 10W machine is half a Nintendo Switch: it can play my MMO engine with 300 characters on screen without world or physics (probably 100 players when the game is complete).


If you follow his channel, he just wants to push the Pi to its limits, and part of that is documenting all the PCIe devices that work on the Pi. The Nano doesn't have PCIe.

Jetson Nano has PCIe, exposed through the M.2 slot. Far more bandwidth than an RPi too.

The suggested 2GB Nano doesn't have the M.2 slot, instead it has 802.11ac wireless.

If I remember well, it's just a regular M.2 Wi-Fi card included inside.

Looks like the redesigned carrier board for 2GB is missing M.2 Key E connector

I'm more interested in trying to get my hands on a SolidRun Honeycomb board, or heck, if the stars aligned, an Ampere server... I figure it might give a good reference for a better/full-featured ARM PCI Express implementation (especially the Ampere as it implements the latest ARM arch).

That goes against some of my "get things done the cheapest way possible" philosophy, but it would be nice to be able to test things on hardware where it is known to _likely_ work.

I'm actually stacking Raspberry Pi 2/4 boards as servers with 1TB SanDisk cards (slow, but with enough distribution, fast enough)...

But right now I'm investing in Mini-ITX Atom 8/16-core servers (25/32W) with passively cooled cases (Streacom) and 8x64GB SATA SSDs from 2011 (50nm flash that lasts 100,000 writes per bit, at 2W per drive = 18W total)!

So the Raspberries are an option if power becomes unstable or expensive; I'm going to have lead-acid backup and Gb/s fiber in the apartment/summer house for 100% read-uptime redundancy!

Just wanted to say thank you for the content you put out. I was watching your stream yesterday and found it very interesting despite my lack of knowledge when it comes to Linux kernel stuff. Hoping to learn more as time moves forward so I can start to grasp more of the content in the videos of this topic. Looking forward to the next one, Jeff!

"10% chance of releasing the magic smoke" made me laugh out loud.

Off topic-ish, but maybe someone here would know... I have a 2013 macbook pro. It's still great for coding but recently I got into Blender, and now I understand why you might want an external GPU. Is that a possibility for such an old laptop? I couldn't find any definitive information on how to go about buying one, and setting it up.

I looked into getting an eGPU for a Macbook in 2013. Thunderbolt 2 has the bandwidth to make it possible, but it's not ideal and will cost performance.

More critically, at least in 2013, the docks you needed cost thousands of dollars, at which point it made more sense to just get a separate desktop PC. I don't know if the situation has changed since then; there are more eGPUs out there now, but it's all for Thunderbolt 3. Thunderbolt 2 is more niche and thus may be more expensive.

It's possible to use at least some TB3 eGPUs with TB2 and adapters, it's not officially supported by Apple on macOS though. They are still insanely expensive in my opinion for what's basically a case, small PSU and PCI-E to TB3. The cheapest ones that can take most cards are around $300 and usually have some downside. I considered it for my Mac Mini but in the end it just made more sense to sell the Mini and buy a desktop PC with space for a real GPU, more storage, faster processor, etc. I mean my case, power supply and motherboard was less than the price of a Razer Core X...

Thunderbolt 2, if supported natively by the hardware and using an external monitor, might actually be enough in practice. I used to game on rMBP 2015 with a GTX 1080 in a TB2-connected eGPU and it didn't even saturate the link, so when I upgraded to an iMac with TB3 the performance was pretty much identical.

The rule of thumb is: use an external monitor and the highest resolution possible. Sending control commands one way consumes much less bandwidth if you don't need to send whole images back down the link, and the high resolution and quality are there to make the GPU the bottleneck, not the TB2 link.
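The back-of-envelope math behind that rule of thumb: frames rendered for an internal display have to travel back over the Thunderbolt link, eating bandwidth the GPU workload can't use. A rough sketch, assuming uncompressed 32-bit RGBA frames (real setups vary):

```python
# Bandwidth cost of shipping rendered frames back over the TB link
# to an internal display, vs. TB2's nominal 20 Gbit/s.
def frame_traffic_gbps(width, height, fps, bytes_per_pixel=4):
    """Gbit/s needed to send uncompressed frames back down the link."""
    return width * height * bytes_per_pixel * fps * 8 / 1e9

TB2_GBPS = 20  # nominal; usable PCIe throughput is lower in practice

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    gbps = frame_traffic_gbps(w, h, 60)
    print(f"{name}@60: {gbps:.1f} Gbit/s back down the link "
          f"({100 * gbps / TB2_GBPS:.0f}% of nominal TB2)")
```

With an external monitor plugged straight into the eGPU, those frames never cross the link at all, which is why that setup fares so much better.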

The other issue is drivers. Nvidia is definitely out since the drivers have not been updated in many years.

I found egpu.io to be a fairly reliable source. Check out their build section to see if anyone has successfully paired an eGPU with your Mac model.

I was able to get a Thunderbolt 2 Mac to work with a Razer Core + Vega 56 combo, but YMMV. I'm away from the computer so I don't remember the exact year of my MBP; circa 2015 maybe?

Do you intend to use the GPU for rendering with Cycles? This isn't possible with current versions of Blender on macOS as Nvidia has stopped development of the CUDA toolkit for macOS and Apple has deprecated the OpenCL compiler. Since Cycles doesn't yet support Metal you cannot use GPU rendering on macOS. It is planned though and there are improvements being made with the Cycles X project. It is possible to use other render engines that support Metal though.

I used to do a lot of Blender work on a 2013 macbook pro (up until last year or so). It's fine for modeling, but rendering is rather slow as the macbook thermal throttles when all its cores are being used.

I opened this article right after working on a Rust program on my Jetson Nano.

Jetson Nanos are like $50 on Newegg. They are very cool; the HN community should be all over this. I have used Raspberry Pis for robotics, but now I am seeing if I can switch, and it seems to work. A Jetson Nano has (I believe) 128 CUDA cores. If you want more, get a TX1 or a bigger board.

Seriously, trying to add an Nvidia card to a Pi is a dumb choice IMO. Show the Nano some love!

The Nano has 128 Maxwell cores, which is the GeForce 900 generation. The Xavier NX, for 4x the cost, has 384 Volta cores. Dhewm3 on the Nano pulls 34 FPS at 1280x720. On the Xavier NX it runs at 88 FPS. On the Pi 4 it runs at 7.5 FPS. On a Pi 3 at 640x480 it takes several seconds per frame, mostly due to a shortage of memory though.

CPU speed, in ascending order: Pi 3, Nano, Pi 4, Xavier NX. Memory bandwidth on the Jetsons is much higher than on the Pis.

Right, but the Nano is cheap, and you can get the TX1 or TX2 if you need more power. There is no world where that is not a cheaper and more versatile solution than hooking a Pi up to something like a 1080 graphics card.

And plus... really, I want to get people using these things so we can share code :)

I've been playing around with an RPi 4 and have been getting interested in graphics programming on Wayland. Unfortunately, efforts to get Wayland running on the RPi have stalled due to the difficulty of working with the Broadcom chip. So I'm pretty much left looking for another SBC if I want to experiment.

Huh? I've used KWin-Wayland on Manjaro ARM on an RPi4, it Just Worked. And why wouldn't it — V3D is a normal DRM+Mesa thing.

How many more orders of magnitude processing power than the Pi does the 1080 have?

TLDR: it didn’t work, no real clue why, more updates to come.

Anyone have experience with ~modern Nvidia eGPUs for macOS CUDA dev (ARM / pre-ARM)? Currently traveling with 2 systems just because of this, which is nuts...

Seems odd that even RISC-V has external GPU support but ARM boards don't.

The ISA of the CPU cores doesn't matter. The PCIe controller does. And Broadcom being Broadcom, they have a weird one with interesting bugs like 64-bit I/O not even working.

On something like a MACCHIATObin or Honeycomb LX2K, AMD GPUs just work, even with pre-boot display in UEFI (thanks to EDK2 optionally including QEMU for this, it can just emulate x86 for the display driver it reads from the card). Well, the MCbin has a fun quirk with device enumeration, but actual operation of the device is perfect.

^ This. I'm working around the edges of the SoC in the Pi, the BCM2711, which has some implementation quirks.
