Hacker News

Interesting, but it's not fully a "system", I think; it lacks a GPU to be complete.


You don't need a GPU for a system; a serial port is sufficient for a terminal and for putting code on it.


A server may not need a GPU; in fact, most servers don't.


Yes, true. Sorry, I explained it badly: what I wanted to say is that it's a very interesting system, but to be a full, compact SoC in the modern sense, the only thing it's missing is a GPU.


I understand that by GPU you meant VGA, but either way a lot of SBCs come without a VGA/GPU. It's an x86/PC quirk that everything comes with a VGA-compatible framebuffer.


I suspect it’s your use of ‘GPU’ that’s causing the confusion. I would expect a GPU to have some form of 3D acceleration or compute capability and I don’t think that’s really applicable to the EGA / VGA video chips of the 386 era. I could be incorrect though, I’m not sure there is a settled definition of GPU.


The chip that would read RAM and generate a video signal was called a CRTC or RAMDAC back in the day - though RAMDAC, standing for RAM Digital to Analog Converter, only really applies to VGA which is analog. Some game consoles had an equivalent chip (Brooktree/Conexant) that was called an encoder.

This is separate from anything that blits textures, rasterizes triangles to video RAM, performs parallelized math operations on big vector lists, runs shader code, or does MPEG/etc. decoding. All of that stuff is what constitutes a GPU.

I'm not sure what the modern term for the DAC is, given that everything is digital now.


I still see the CRTC or RAMDAC terms used for modern scan out engines, even if the terms are a bit anachronistic.


Yes, my mistake. Nowadays we also call "GPU" the graphics-processing cores inside a complete SoC, but that's not really a GPU, only a "graphics core" (or cores).

What I wanted to say is that "in order to output something directly to a display like a contemporary SoC, this 386 system lacks only a graphics unit". Maybe that's a more correct way to put it.


The 3dfx Voodoo1 card _might_ be classifiable as the first GPU, I'd say. While SGI had display hardware with more processing before that, it wasn't (afaik) in a form that could be moved to a different system.


I think the term you are looking for is "video controller"?


That's my wording, not theirs. It seemed close enough to be acceptable at the time. (Technically it is not even an SO-DIMM but a module in the form factor of an SO-DIMM.)


Depends on the application. Plenty of SoCs in IoT/WiFi access point applications with no video out.


Few 386 machines had GPUs



