MXM cards unfortunately seem to be incredibly niche and vendor-locked with BIOS whitelists, but swappable GPUs on laptops would be amazing for e-waste reduction and upgradability. It's a shame the Framework laptops don't support them; it seems right up their alley.
The problem is that it adds quite a bit of thickness to the device. The Framework 16 solution, where the module slides into the back, is better, but we'll see how well upgrades are supported.
Surprisingly, the older iMacs are very upgradeable because of their socketed CPUs and MXM GPUs. I recently received a free 2010 iMac and was able to upgrade the dual-core i3 to a quad-core i7, which also raised the RAM limit to 32GB. I also grabbed an $8 FirePro M4000 MXM GPU from a Dell laptop and flashed it for macOS compatibility.
This old 2010 iMac now has a quad-core CPU, 32GB of RAM, and a Metal-capable GPU running the current version of macOS (Sonoma). It's running great, and I can run essentially every app on the machine without ever hitting swap, unlike my 8GB MacBook Air.
I have thought about blogging a bit, but there are already many people blogging who are much more knowledgeable than me. :)
This iMac is the 21" model, they have a smaller PSU and cooling, so the most powerful chip you can install is the i7-860s (82W CPU). I paid $25 from aliexpress.
If you have the 27" iMac you can install the i7-860 (without s) that is a 95W CPU and can boost to higher clock speeds. This is also a more common chip so likely cheaper than the i7-860s.
Both chips should boost the default max of 16GB RAM to 32GB RAM.
I did the same with a dumpster find in 2020. I saw the guy hauling it over and dropping it off, so I asked and got it: a 27" 2011 iMac with maxed-out specs as ordered (Radeon 9800, i7, 16GB RAM, SSD + 2TB HDD), in almost mint condition.
It had, of course, two issues:
- the backlight was dead on one half of the screen
- the infamous Radeon BGA soldering issue: intermittent glitches and OS freezes once it had warmed up
I resoldered the backlight connector (it had apparently broken off the PCB under mechanical stress) and swapped in a used $20 Nvidia Quadro that I had flashed with Nick's hacked ROM from the above thread.
I didn't want to do any Hackintosh mods to the OS, so I went with (IIRC) Mavericks and sold it on the classifieds for $430 after no one in my circle wanted it. I was also able to sell off the MXM Radeon 9800/2GB for $30; I had placed the ad mostly out of curiosity about its resale value in such a broken state. :)
The money was just by-catch; for me, the real joy always comes from resurrecting broken things.
Thanks, haha, and yes, I also installed an SSD. Unfortunately this iMac only has SATA II, which limits the speed a bit, but it's still much faster than an HDD.
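For a rough sense of why SATA II is the bottleneck, here's a back-of-the-envelope sketch using the nominal line rates (real drives land below these ceilings, and HDDs of that era sustained maybe 100-150 MB/s):

    # Nominal SATA throughput ceilings: line rate x 8b/10b encoding
    # efficiency, converted from bits to bytes. Real-world numbers are lower.
    def sata_ceiling_mb_s(line_rate_gbps: float) -> float:
        return line_rate_gbps * 1e9 * (8 / 10) / 8 / 1e6

    print(sata_ceiling_mb_s(3.0))  # SATA II  -> 300.0 MB/s
    print(sata_ceiling_mb_s(6.0))  # SATA III -> 600.0 MB/s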
Apple's desktops of that era were quite upgradeable, but those old MacBook Airs were the beginning of what Apple devices have become today.
Maybe, but it doesn't matter, because they're not using it directly. The card was AGP/PCI while MXM on the laptop is PCIe, so they had to create an adapter that includes an AGP/PCI-to-PCIe bridge. In this case, they shoved this graphics card, adapter and all, into a Dell Precision M4800 (released in 2013) because they needed to run Windows XP. They also created another PCB as a PCIe x1 adapter to run it on desktops.
The card was probably intended to work with a specific model of gaming or workstation laptop from 2000, but it's doubtful any made it past the prototype stage. Maybe Dell or Acer sold one or two models with it.
I think you're confused: this is a from-scratch new design, not an old prototype. There's no adapter other than the desktop mount; the VSA-100 is mounted directly on the board that plugs into the laptop.
We had to develop a new interface for swappable GPUs to make the graphics card and mainboard co-planar. MXM requires the graphics card to be stacked above the mainboard, which results in really thick laptops. This is part of why MXM is largely dead for consumer usage and lives on mostly for industrial and embedded applications.
Unrelated to MXM, but I've struggled with the same problem you described (swappable GPUs on laptops). The solution I've settled on for now is to get a laptop with a PCIe Gen4 x4 NVMe slot, install an OCuLink adapter in it, and connect my full-size RTX 3090 to that.
No performance difference as far as I can see (OCuLink over PCIe Gen4 x4 should give ~64 Gbps, i.e. roughly 8 GB/s of bandwidth), and it's still super portable. And I can finally upgrade the CPU/display and the GPU independently.
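For anyone checking those numbers, here's the back-of-the-envelope math (assuming PCIe 4.0's 16 GT/s per lane and 128b/130b encoding; packet/protocol overhead will shave a bit more off in practice):

    # Back-of-the-envelope PCIe 4.0 x4 bandwidth: 16 GT/s per lane with
    # 128b/130b line encoding; ignores packet/protocol overhead.
    GT_PER_LANE = 16e9
    ENCODING = 128 / 130
    LANES = 4

    raw_gbps = GT_PER_LANE * LANES / 1e9                    # ~64 Gbps on the wire
    usable_gb_s = GT_PER_LANE * ENCODING * LANES / 8 / 1e9  # ~7.9 GB/s usable

    print(f"raw: {raw_gbps:.0f} Gbps, usable: {usable_gb_s:.2f} GB/s")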
IDK, we have all of the unused space behind the screen of a laptop. Why are there no attachable modules that take advantage of that?
It would be ideal to have a PCIe Thunderbolt connector that lets you attach a GPU that maglocks to the back of the screen while in use, and any manufacturer could make their own variations. Thunderbolt can deliver up to 100W of power. It wouldn't be on desktop scales, but it would be enough to make me happy.
In my mind this is ~4 oz. of extra weight; I feel like sufficiently strong hinges and just enough weight in the base would handle that little extra, though.
If anyone knows of a guide or layman-accessible reference covering more about them (compatibility, BIOS hacks, part-sourcing best practices), I'd love to hear about it.
This is the kind of high-quality niche post that makes HN for me. It's a really incredible hobby project that someone has also taken the time to share details about.
It seems enthusiasts are quite interested in 3dfx GPUs for some reason --- lots of NOS parts with detailed documentation available? There was this not too long ago: https://news.ycombinator.com/item?id=32960140
Maybe you're too young, but in the late 90s 3dfx was so hot. They weren't the first GPU maker, but they were the first to break out and make it big in the PC space. The difference between software rendering and hardware-accelerated Voodoo graphics was nuts. It was mind-blowing. Once one of your friends had a Voodoo card, software rendering was never enough. You had to get one too.
Between the release of the Voodoo 3 and the Voodoo 4/5, NVIDIA ate 3dfx's lunch, but not (at first) by having better tech. They just built momentum by consistently releasing new cards every six months like clockwork, whereas the Voodoo 4 and Voodoo 5 were feature-oriented releases stuck in development hell. 3dfx was still the sexy hotrod brand, although NVIDIA's market share kept growing. Then the GeForce 2 came along and was clearly better than any 3dfx offering, while people were tired of waiting for a new Voodoo card. The rest is history.
I remember cutting grass for neighbors during the summer when I was 11 to save up for a 3dfx Voodoo2 8MB card to put into my parents' computer (a Packard Bell with a 200MHz Pentium). The driving force behind this obsession was a foldout ad in a PC gaming magazine with a screenshot of the game Unreal. I remember my dad (who was very non-technical at the time) trying to talk me into one of the other cards at CompUSA (a PowerVR, if I recall) while I explained why I wanted the 3dfx (I'm sure I was regurgitating whatever I'd read on HardOCP).
We got home, figured out how to install it and the drivers, and booted up Unreal. An amazing moment in my life, and probably one of the main reasons I ended up a software developer. Tweaking settings, learning commands in the 'in-game terminal', understanding basic networking to help pick servers, tinkering with the level editor: PC gaming was an amazing introduction to how computers work, with a really motivating example. I feel like my kids completely miss that because they just play games on an iPad, which completely insulates them from all of it.
I agree, except about Nvidia not having better tech. The Riva TNT was not on 3dfx's level, but the GeForce was groundbreaking. 3dfx managed to match it with the Voodoo 4/5, though those didn't get great press after all the delays, and then the GeForce 2 came out and 3dfx was toast. At the time, a big angle was also that 3dfx made its own cards while Nvidia just did the chips for dozens of manufacturers, so Nvidia had the advantage there. But that's a bit of survivorship bias IMHO, because e.g. Apple also does everything in-house today and vertical integration is touted as their advantage. In the end, either your tech works and has a market fit or it doesn't.
Pretty crazy how Nvidia grew from that to a top 5 company in the world in 25 years.
Got to agree here; in fact, what I remember is that the Riva TNT2 was already the smart pick versus what 3dfx had out at the same time, even though Voodoo was the cooler brand and had all that earlier goodwill.
I had a Voodoo 3 3000; a friend had a TNT2. If a game supported Glide and/or had a MiniGL driver available, it blew the TNT2 out of the water. For Direct3D games, however, the reverse was true.
I was so pissed off after managing to buy a Voodoo, having to go back to the shop and swap it for a Riva TNT because, for some odd reason, it wouldn't work on my PCI bus.
And since, back then, any hardware-accelerated solution that games supported was millions of miles ahead of software rendering... that was enough.
I was a gamer in the 90s (chatted with id over IRC while they were developing Doom, loved Quake), but I didn't get a 3D card until the Voodoo 3 came out, and I was lucky enough to be able to afford a Voodoo 3 3000 when it launched. I installed it, fired up Quake 2, and, exactly as you say, my mind was blown. Colored lighting, moving lights, smooth shading, etc. It was incredible. I've never experienced such a visceral reaction to a big leap in computing power before or since. I'll always remember that afternoon seeing what it could do. I was mesmerized.
I'll never forget the first time I booted up GLQuake. The mid/late 90s were a really special time in gaming, getting to watch 3D games really get off the ground.
For people of a certain age, 3dfx is iconic. As a teenager, you read a bunch of reviews about how great it was. This was before widespread broadband, and there wasn't much PC gaming content on TV, so you didn't really know what it looked like. After months (or years) of saving, you finally bought one, put it in your PC, started up Quake 2, and it was f**ing amazing.
The GeForce came along soon enough and made it obsolete, but it never had the same mystique.
> Not to be confused with today's "Serial Link Interface"
Scalable Link Interface.
And it's not available today either. The last cards that supported it were the 1000 series. The 2000 and 3000 series used NVLink, but that has disappeared from the 4000 series, too.
The 3dfx jump was almost like going from games badly rendered on the GBA (or maybe the first Nintendo DS games) to the quality of the PSP.
Or, for a better example, compare V-Rally on the GBA with Ridge Racer on the PSP.
Deus Ex and Unreal could run on a 3dfx card. The jump from software-rendered Quake (or really any software-rendered Unreal game) to a hardware-rendered one was pretty noticeable.
I have a Dell Precision M4800 and would like to upgrade the GPU to an Intel Arc A370M, which is available in MXM format. Does anybody know how to add this card to the BIOS whitelist?
I'm interested in that PCB: how many layers, the minimum trace width, and whether it uses blind/buried vias. I can't seem to find any information on it. Looks like a nice layout.
Amazing. Were the 3dfx chips openly documented enough for this kind of thing to be possible from original information sources, or have these chips been reverse-engineered to the point where a project like this is possible? I ask because with today's hardware, it's almost laughable to even expect a pinout of a GPU to be publicly disclosed.
> Were the 3dfx chips just openly documented enough
All chips have always been documented well enough, since that's the only way for board partners and system integrators to know how to build cards with them and write drivers for them, as was common back then. Everything had to be well documented, the same way we have datasheets today with all the low-level registers, plus application notes with PCB layouts and code samples.
The question is always how sticky the NDAs are when you try to get your hands on them, but since 3dfx went bust, the leaks are no longer a problem.
Yes, of course; hence my word 'openly'. I mean, at some point in the 90s designs started to get locked behind NDAs just to get the docs. At the time of original manufacture (99/00?), if you were a company making cards, I assume the documentation for the Voodoo 4 was locked behind an OEM agreement?
And then at some point that information leaked out onto the web? I'm curious about when and how that went down.
I fear that so many things now will never get to that point of seeing public light.
Cool to see a project messing with them, though!