As a nerd of a certain age I’ve been deeply attracted to retro computing, even going so far as to place bids on 30-year-old Atari STs on eBay, since that was the computer I had in the late ‘80s/early ‘90s. But recently, through modern emulators, I’ve dabbled in these retro systems again and come to the conclusion that they’re just bad. They’re fine for enjoying certain games and reliving, or just getting a reminder of, my younger years. Yet the home computers of that era lack so much: no MMU, no FPU, no IP networking, and very limited CPU cycles and memory.
Even switching from the home computers of the era to workstation-class systems, it’s pure frustration at the limits on what can be done. Don’t get me wrong, for a lot of my computing needs I could probably get along just fine with a NeXT cube or a late-model Mac II. But on the other hand, you’ll pay hundreds or thousands of dollars for those systems, while I can go on Amazon, spend ~$200 on a mouse, keyboard, cheap monitor, and a Raspberry Pi Zero W, and have a more capable system in every way. For just a little more money I can buy a tiny PC that can emulate all the computers of my youth nearly perfectly.
It always stuck with me that, when announcing the Virtual Console on the Wii U, Satoru Iwata put it perfectly: "It can be fun to relive your childhood but just for a little while".
It is fun to look back, but we tend to forgive a lot of things that kind of sucked. It's like how some people who live through hurricanes will, a decade later, talk about how fun it was coming together, even though it was just horrible. Memories can be deceiving.
"When you wear rose tinted glasses, red flags just look like flags" - Bojack Horseman
Definitely. I always felt just a tinge bad about how awful I was at the original Super Mario Bros. and the other NES and SNES games. After playing them on the classic consoles as an adult, I realized they were just unforgiving, and I let my youthful self off the hook. I’ve done much better on the newer Mario games. SMB3, though, really does shine.
The difficulty helped hide the fact that you were paying $50 in 1980s money for a game that took approximately an hour to beat. If you could beat it. But I suspect the more common gameplay loop was the one I and my friends encountered: you put the game in, played for 20-35 minutes, got a Game Over. If you were really motivated maybe you tried again and got to Stage 4 instead of Stage 3, but at some point said, screw it, turned the game off, and played something else.
These days every game is made with the intention that it can and should be beaten, which is part of why there's so much needless padding in games to stretch them out beyond the 40-hour mark and make the player feel like they're getting "value" for their money.
Much of this design also feels like a hangover from the arcade days where you wanted players to die cheap deaths so they plugged more quarters into the machine.
I think all of those games were offered as arcade boxes. Plus you have to figure the audience then was in that sort of mindset. Very good point. My 4-5 year old self can relax at last.
That Iwata quote kind of rubs me the wrong way, especially with Nintendo and Sony releasing extremely flawed emulators in the past. On top of games designed for CRT TVs often not looking as good on modern screens, we have emulators introducing low FPS and jerkiness (Sony PSone Classic) or making games look worse by rendering certain scenes without fog (Switch Online N64). With commercial emulation of early 3D consoles especially, you often just aren't getting an accurate representation of what the games were actually like.
(Although credit where credit is due, Nintendo's recent Game Boy Advance releases ship with one of the most accurate GBA emulators around.)
What's funny is they both had much better emulators in the past. The PS3/PSP/Vita PS1 emulator is pretty good, ditto the Wii's N64 emulator. The Wii U's was a step back, as was the Switch's, although they've worked on it.
For what it's worth, as a born and raised Floridian, hurricanes are always fun before (laughing as people panic buy, watching the track, wondering if it will turn at the last moment or unexpectedly linger on top of you), during (admiring the wind/rain, wondering if we'll have to run the generator this year), and after (marvelling at the damage, having a good excuse to run the chainsaw, wondering when we can stop running the generator); and that's not even considering how much fun road trips can be when you have to evacuate.
Death is sad, but it comes to us all one way or the other. Death by hurricane can be mostly avoided by leaving for significant storms and direct hits, but I know several folks who have had very close calls (e.g., tree fell through the roof and clobbered the bed while they were eating dinner) even in light, non-hurricane storms.
Losing the house sucks, but that's why we carry insurance. Any Florida house worth its salt (an unfortunately dwindling percentage due to significant development in recent years) is CBC and isn't likely to take significant damage unless you lose the roof or take a direct hit from a tree or tornado.
Not to refute your (accurate) statement, just to offer my perspective.
It could just be a stage of denial, as in the five stages of grief.
Or it could go deeper: the world around us is always changing through events we cannot determine, events that affect us totally (like, say, death), and we have several possible paths.
As the subject, we can:
- actively deal with it
- actively not deal with it
- inactively deal with it
- inactively not deal with it
Or, ignoring ourselves as the subject:
- let others deal with it
- …
Many religions and life philosophies are built around this, well, around death.
To run software, emulation is great. To develop software, emulation is great. Anything that deals directly with hardware, well, emulation isn't going to cut it.
Then there are the other reasons, ones that have nothing to do with the hardware itself. Personally, I use vintage computers as an escape from modern computers and as a reminder of what the technology could have been. I'm not talking about them being slow, unreliable, or downright difficult to manage. I'm thinking more along the lines of being simple enough to understand and a lot more personal. A lot of what we call progress is a mixed bag. Yes, the performance of modern computers is amazing, and it comes at the cost of complexity; on the other hand, a lot of that complexity is not necessary for day-to-day use. Yes, having access to a literal world of information at our fingertips is amazing; on the other hand, our computers are also at the whim of the outside world. In other words, there are a lot of tradeoffs.
For some things, emulation is the perfect solution. For other things, it simply won't work.
nearly all of the old stuff just sucked. there's a reason we threw it all away (for the most part) and have turned to emulation.
the new low power SoC systems with actual connectivity are much more interesting... probably why they can barely keep the stuff in stock. as a bonus, they can run emulation just fine. old, and new, in one package.
The 16:9 craze drives me mad: it's a jack of all trades, master of none, a compromise every which way. Maybe nice for watching TV shows, not really good for watching movies, not tall enough for a single screen, not wide enough for a 50:50 left/right split (each side is too narrow). Putting two side by side makes you face the bezels and constantly tilt your head either left or right when you focus. Three side by side is way too wide, which is probably why some folks run three but have the two outermost ones vertical. Which brings me to 9:16, which is way too thin AND tall.
A triple head 4:3 setup was terrible for movies, but reaaaally nice for a battlestation. Maybe 5:4 would have been quite something but I never experienced that.
I do like the ultrawide things though: 2.4:1 feels like two 4:3 displays, only continuous, with no bezel in the middle, and 3.5:1 is technically about two 16:9 but feels like a triple-head 4:3 setup. They're especially comfortable when curved, but damn, none are HiDPI, which kind of ruins it. And of course that's kind of a no-go for laptops (or maybe not, but I bet any manufacturer attempting it would somehow botch the execution).
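A quick back-of-the-envelope check of those ratios (a throwaway Python snippet, nothing assumed beyond the numbers mentioned above):

```python
# How do ultrawide aspect ratios compare to multi-monitor setups?
from fractions import Fraction

setups = {
    "two 4:3 side by side":   2 * Fraction(4, 3),
    "2.4:1 ultrawide":        Fraction(12, 5),
    "two 16:9 side by side":  2 * Fraction(16, 9),
    "3.5:1 ultrawide":        Fraction(7, 2),
    "three 4:3 side by side": 3 * Fraction(4, 3),
}

for name, ratio in setups.items():
    print(f"{name:24s} -> {float(ratio):.2f}:1")
```

So 2.4:1 is actually a touch narrower than two 4:3 panels (2.67:1), and 3.5:1 sits just under two 16:9 (3.56:1).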
I actually do use three 24" 16:9 screens, side by side. Though small by anyone else's standards, 24" is sort of the perfect size for the three monitor setup. It doesn't make me strain my neck _too_ much, but also isn't so gargantuan that I can't fit three on my 5-foot desk.
I split each monitor down the middle, treating it essentially as a viewport for two windows. Optionally I sometimes fill one screen with a single window, but I never overlap screens. On my home computer running Linux (Cinnamon) this functionality is built in; on my work laptop running macOS, I use Rectangle to accomplish the same thing.
I did have three 19" 4:3 CRTs, way back when - beautiful Sun Trinitrons. They took up the entire desk, but I could conveniently stack equipment on top of them (speakers, a clock, an FM radio, etc), because they were so large. But my workflow for those was different - each monitor _was_ a single window, as splitting them resulted in windows that were too narrow to be useful.
There's an easy fix for the sharp edges on laptops: don't buy MacBooks.
Nearly any desktop has user-replaceable parts, and so do many laptops. Laptops have to be compact and lightweight, so they are very tight internally. An Amiga wasn't too easily portable, to say nothing of an Amstrad.
Keyboard quality differed drastically: an Amiga or a Yamaha MSX had nice keys, but good luck typing on a ZX Spectrum.
The repairability depends on the laptop. Framework is the obvious choice if you want a lot of control over the device hardware, but I was pleasantly surprised at how many user-serviceable parts were in my HP Aero 13. Battery, m.2 storage, RAM, wifi card, input devices, screen, speakers, webcam, etc.
In my experience, a big enough 16:9 display also works as a 4:3 in practice – a 55” 4K OLED TV is great for this. It’s like a 2880 × 2160 4:3 display… with three little 960x720 4:3 displays next to it, heh. Window managers can enforce the split. It’s nice!
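The numbers work out neatly on a 4K panel: carving out the widest full-height 4:3 region leaves a side column that happens to split into exactly three 4:3 panes. A tiny sketch of that partition (no window-manager API assumed, just the arithmetic):

```python
# Partitioning a 4K (3840x2160) panel into one big 4:3 area plus a side column.
width, height = 3840, 2160

main_w = height * 4 // 3   # 2880: widest 4:3 region at full height
side_w = width - main_w    # 960:  leftover column
pane_h = side_w * 3 // 4   # 720:  height of a 4:3 pane in that column
panes  = height // pane_h  # 3:    how many such panes stack in the column

print(f"main 4:3 region: {main_w} x {height}")
print(f"side column: {panes} panes of {side_w} x {pane_h}")
```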
I used a single large TV as my monitor for a couple of years. At first I used a 50” 4K LCD TV. Some aspects of this were very nice. As you describe, being able to split the screen up into multiple areas, or use a portion of the screen as a monitor with a different aspect ratio, was great. The downside with the LCD was that the backlight on the areas of the display not in active use was very distracting. It was also nice having the flexibility for games and for watching TV and movies.
I briefly tried my 65” LG OLED TV as my monitor. It didn’t have the backlight problem at all but the auto dimming of white screen areas was too distracting to be usable.
I’ve now gone to 2x 32” 4K monitors. It’s okay but either too wide if both monitors are horizontal or one is too tall if I turn one vertical.
I’d like to revisit using a large OLED again when their use as monitors has matured a little. With a little software tweaking for window management, it was almost perfect.
i use a sony 48" oled and yeah, the only annoying thing is the auto dimming. otherwise it's fine. i may buy an oled actual-monitor (they're starting to be released now...) once this one dies, but my previous sony (non-oled) led tv monitor lasted 6 years so it could be a while.
I feel the same way about computers of the 70s, 80s, and early 90s. I owned several of them, used them a lot (mostly programming for fun), and much of the experience just sucked. I do not understand the appeal of retrocomputing. But I'm happy that there are a lot of options for people who like that sort of thing.
"I do not understand the appeal of retrocomputing."
But at least one knew what was going on. If one didn't then one could probe the hardware with a logic analyzer and an oscilloscope. Try that today and see how far you get.
Mind you, the 8088 was a dog of a processor, too little too late and far tooooo slow (nobbling the 8086 was a dumb retrograde idea).
Yeah, using an 8088 in this thing seems dumb. My first thought when seeing this was: put a 386 or 486 with VGA in this and I would buy one. This would be instantly useful for retrogaming. 8088 with CGA, not so much (for me at least).
My thoughts exactly, especially the 486. It was a good all-rounder, worked well (thoroughly debugged, unlike the Pentium), and was fast enough to be useful even today for some work.
Also, I'd heard accounts that some influential third parties (companies developing stuff for the military etc.) were still using it long past its use-by date because it was the last of the x86 line whose internals they could fully understand. IBM was second-sourcing it too.
Incidentally, I wonder where those 8088s come from, same for the 8087. Seems someone still makes them.
__
Edit: Just occurred to me that the main/intended use for this computer is as a training aid. The ready access to the chips, and the fact that they're socketed for removal, would allow an ICE (In-Circuit Emulator) bond-out chip/board to be inserted. Thus there's no need for it to run any faster.
Thanks. Presumably for use embedded in their hardware products. A genuine 8087 must mean Intel still makes them or there's a lot of new old stock around.
Depends what you're looking for. I have fond memories of programming an 8088 in high school with BASICA and Turbo Pascal. I could definitely see getting this to play around with. A 486 could do more but it wouldn't take me back to my roots in quite the same way.
Some were replaced by cheaper, easy to mass produce but somewhat inferior products.
Others were replaced for some insane reasons: manufacturers pushing for thinner, lighter devices to increase their profits or differentiate amongst themselves. I would gladly use bulkier devices for better specs and repairability.
But the point of such devices is emphatically not daily computing. It's like saying you can go and buy a more evenly colored, stronger, cheaper, and less fiddly brick instead of a Rubik's Cube.
Equally, to me the point of devices like this, or fantasy consoles, or old limited machines is in being a puzzle, providing an intellectually stimulating pastime. Spending time on them is more fun and likely more useful in various senses than activities like solving Sudoku or watching TV.
I dunno about the ST, but my Amiga 1200 very much has an FPU and IP networking. I would be very surprised if that wasn’t something you could add to an ST as well.
The Amiga 1200 is in the ballpark of the Atari TT/Falcon, and yes, STinG for IP, browsers, etc. were available. But the ST was the same year as the Amiga 1000, which also had just an MC68000 without an MMU; both the ex-Commodore engineers (who designed the Atari ST) and the ex-Atari engineers (who designed the Lorraine at Hi-Toro, which then became the Amiga) each had their own external MMU.
Early software was designed to feel as fast as possible, even resorting to hacks like temporarily turning off DRAM refresh to write text on the screen as quickly as possible. Though hardware was very limited, when you pressed a key you generally got a response right away.
Fast-forward to 2023, and everything runs on top of a non-realtime multitasking OS, causing periodic little glitches as background stuff hogs resources. The graphics stack is too complicated to guarantee a consistent 60 frames per second. And the web is incrementally loaded from shared servers, often with delays of many seconds. Animations introduce intentional lag.
Modern hardware could of course be amazing at being fast, but nobody puts the effort into software designed for that goal. Plus, the obsession with device thinness and fanless designs introduces thermal throttling that no software can overcome.
> Modern hardware could of course be amazing at being fast, but nobody puts the effort into software designed for that goal.
A lot of effort has been thrown that way; it's just that their definition of "go fast" is "having lots of throughput", not the things you are looking for: goodput, low latency, low jitter.
We got faster mainframes instead of faster minicomputers - computers and network systems that are optimized for doing batch jobs.
We can submit a whole bunch of blocks and the graphics processing unit can display accelerated smooth video for us. Or we can push a whole neural network to a tensor processing unit and have it do inference in very few operations, after the model is loaded. But both of those operations while having smooth output have horrible startup latency.
I think it's very naive to call what the devices have today a "single computer" when in fact, for a long while now, they've been several interconnected computer components joined with lots of buffering.
> They’re fine for enjoying certain games and reliving, or just getting a reminder of, my younger years. Yet the home computers of that era lack so much: no MMU, no FPU, no IP networking, and very limited CPU cycles and memory.
People just had more patience back then.
45 minutes to LHARC my text file? Sure, I'll talk to my wife.
Four hours to download an ILBM? OK. I'll read a book.
Today we're so used to living at the business end of a digital firehose that if something takes too many clock cycles we walk away. Some people can't even sit in a chair in their own homes without having music on.
That 95% of the retro computing scene seems devoted to games tells me that people just crave a quick hit of digital content and then move on.
But what is it that you want to do with these retro computers besides playing Wolf3D?
As you mentioned, even the most basic RPi can run the pants off the Atari ST machines, so it's hardly a fair use of the computer if you want to do modern tasks.
This is why I refuse to do my retro computing on anything without a floating-point coprocessor. As such, the 486 DX2/66 MHz is ideal. Not too fast, but crazy powerful compared to rubbish 8086 machines.
A "real hardware" approach to retrocomputing like displayed here appeals to me because it much better captures parts of the experience lost to emulation. Things like insanely low latency thanks to there not being a fractal spiral of hardware and software abstractions sitting between everything. It's much more simple, and you can feel that.
That said I'd also like to see some modern advancements brought in… for example a reproduction of a mid-90s 68k or PPC Mac using components manufactured on a much smaller node would be incredible. It wouldn't even have to be anything cutting edge like 3/5nm — even the now-ancient 14nm or 30nm would be amazing compared to say the 350nm node that the PowerPC 603ev was manufactured on.
The latency issue is very real. I work in Linux, Mac, and Windows on fairly fast computers, and programming in VSCode is so annoying because of the latency.
On my 486 in the Borland Turbo C++ IDE, I press a key and I see it instantly.
Of course, if you never tried it, you wouldn't notice the difference and would think I have some sort of OCD. But it is really noticeable. I don't know if it's 100 ms or 200 ms, but there is a difference.
Get a high refresh rate (<1ms latency) LCD/LED monitor. This is by far the most important. It took a while to catch up to CRT latency, and it's still sold as a niche product, but we are there.
If you are in Linux, you may want to turn off your display compositor (KDE/XFCE), or use a lightweight xorg window manager like fluxbox or i3. This isn't necessary when using a fast GPU, but even then you may dislike animations.
Use a text editor with a real GUI toolkit (not electron or similar) like emacs or gvim/nvim-qt. Unfortunately, emacs can struggle rendering syntax highlighting, but outside that it is lean and snappy. Hopefully the recent progress with tree-sitter will resolve that.
I have met very few people who care about latency the way I do. I think most of it comes from my nostalgia for DOS-era computing. The most satisfying hardware purchase I have ever made is a good-looking 2K IPS/VA 120/144 Hz+ FreeSync display. Millions of hours of my life experience were noticeably improved.
I'm looking forward to the day that display tech, GPUs, and connection standards have gotten to the point that high PPI (@2x UI scaling) high refresh rate microLED displays are practical. It's going to be so nice to have text with as much contrast, crispness, and accurate letterforms as print with as little latency as is possible on a modern system.
What they're getting at is that the latency is caused by overhead from the OS and USB, increasing it across the board regardless of editor — each keystroke has more layers to pass through compared to a machine from the 70s, 80s, or 90s, plus USB by the nature of how it works unavoidably adds latency that's variable depending on CPU load, since USB is CPU-dependent. There's also more layers in the display stack, increasing the amount of time it takes for what's actually displayed on screen to match machine state.
A Core 2 Duo era laptop specifically might not suffer as much as its desktop counterpart though, because laptops from that era often used PS/2 for their internal keyboard+trackpad connection and thus don't suffer USB latency increases unless the laptop's motherboard was doing something weird like implementing PS/2 via an onboard adapter attached to the USB bus.
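If you want to put rough numbers on part of that stack, one crude trick is to compare the kernel's timestamp on each key event with the time your process actually receives it. A minimal sketch using the python-evdev library (the device path below is a placeholder; this only measures kernel-to-userspace delivery and its jitter, not USB polling or anything in the display stack):

```python
# Rough estimate of kernel-to-userspace delivery delay for key presses.
# Requires the python-evdev package (pip install evdev) and read access to /dev/input.
# /dev/input/event3 is a placeholder -- find your keyboard with `python -m evdev.evtest`.
import time
from evdev import InputDevice, ecodes

dev = InputDevice("/dev/input/event3")  # hypothetical keyboard device node

for event in dev.read_loop():
    if event.type == ecodes.EV_KEY and event.value == 1:  # key-down events only
        delay_ms = (time.time() - event.timestamp()) * 1000
        print(f"key {event.code}: delivered {delay_ms:.2f} ms after the kernel timestamp")
```

Run it while the machine is busy and the numbers (and their jitter) go up, which is exactly the "more layers, more variability" effect described above.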
One of the nicest screens I have ever used was on a monochrome EGA laptop.
Sure, the pixels were huge and slow, but the contrast ratio was really good without it being too bright. It’s hard to explain why it was so good. I suspect it was a bit transflective.
The problem with those old monochrome/greyscale LCDs is that they were horrendous for ghosting.
I imagine if you just wanted an old laptop for WordStar or something, they’d be amazing due to their clarity; but I remember trying to play Commander Keen on them and getting a headache.
Thanks for the flashback! I got my first own computer from my grandfather. It was a 486 laptop with a built-in trackball and floppy drive. The display was as you described. Movement on the screen was just a blurry mess.
I'm also bummed out that these sorts of things all include OPL2/OPL3 AdLib-style clones when I'd much rather have Sound Blaster support with the digital audio channel.
One thing I'd love to see is a Pentium III class machine with a lot of cache and ram with a pcie SSD in pocket size. I feel like those machines were held back by hard drives of their day.
I really hate Liliputing. After browsing, I somehow have several tiny laptops that I don't need but truly love like the Magic Ben MAG1 and the Pomera DM30.
And Jetpens. I hate Jetpens too. I can't believe how many...
fyi these kinds of pen cases have been popular in asia and asian communities worldwide for decades, usually for carrying mechanical pencils and their accessories (extra lead, erasers, protractors, rulers, mini notebooks, sticky pads, whatever)
western equivalent would be trapper keepers with their pencil pouches.
You can also probably get little pencil boxes at an art store or some place like that.
Would recommend. A little plastic box that keeps me from having to fish around in my backpack, and possibly get poked, when I want my pencil is great, and they are pretty cheap.
Speaking of fishing, this is slightly adjacent, but fishing tackle boxes come in a wide variety of sizes, are built to be modified, and are easily open to additional modification since the plastic is soft. And they are cheap.
UK: Try an image search for 'Helix Oxford Maths Set'
Compasses, dividers, protractor, setsquares (that we don't use these days) and a 15cm ruler. Pencils, sharpener, rubber. Usually a stencil for lettering and a leaflet with geometry definitions.
Available from supermarkets, stationers and cash and carrys right now because it is GCSE Maths exam time.
Altoids tins are pretty strong, small enough to be handy and cheap. This makes them the ideal container for a lot of small electronics projects, but a pi doesn't quite fit. I see it as a perfect small electronics container which you can pull out and use on the go.
Back in high school I had a Zen Micro MP3 player that fit snugly inside an Altoids tin; I drilled a hole for the headphone jack and felt pretty stylish.
As the owner of an obscure floppy disk subsystem that requires an 8-bit ISA card and was originally spec'ed to run on XT-class hardware, this is tempting.
In particular, 640 kB really is enough, as the bundled software runs in real mode, doesn't support extended or expanded memory, and predates tricks like loading DOS in high memory by a number of years (not that they'd work on an XT in any case).
The disk read/write code and some of the simpler filesystem modules run on the ISA card itself, essentially an 8085-based SBC with a rather flexible (no pun intended) floppy controller. Here, 64 kB is necessarily enough for everything.
The fun part is that, assuming an Intel MDS 80/ISIS-compatible toolchain, the card can easily be coerced to run arbitrary code. And, while working Intel "blue" hardware is thin on the ground, I have personal experience with at least one working emulator (MAME) able to run ISIS.
I found them a couple of days ago, and the 386 was already sold out. I was hoping to eventually pick up one of the 8088s; but it looks like they'll disappear now too.
I ordered a HAND 386 before it was removed from sale. The order hasn’t shipped yet and I’m trying not to get my hopes up too much that it will actually ship. I was going to use it as a portable testbench for my RP2040-based ISA card emulation projects.
It's not straight from Intel, it's from a company called Rochester Electronics that buys EOL chip IP from a bunch of different companies and then makes their products available for legacy devices. They also make Motorola 6809s and 68000s.
DOS emulation is not a very battery-efficient way to play '88-98 games on the go, so projects like these seem to cater to that specific need, letting people run DOS natively. The Toshiba Libretto was a nice, small machine, but finding one in good condition is super hard. Toshiba Portégé was another, but after Pentium-II models, they took away the OPL3 card in favor of something more Windows-friendly (AC97), which doesn't have good DOS drivers. Now there's SBEMU (https://github.com/crazii/SBEMU), which can emulate SB/SBPRO/SB16 on top of newer PCI-based sound cards, including AC97 ones, solving that problem. Now it's possible to have sound on a Pentium-II, III, M and Atom machines running DOS, like a Sony Vaio P.
> DOS emulation is not a very battery-efficient way to play '88-98 games on the go
They don't mention the battery capacity or the specific 8086 variant used, but modern ones tend to average 2W of consumption; there are zero power-saving modes; and the guy even says that when using the 8087 the mini-laptop must "be plugged in all the time".
So I rather doubt real hardware is the most efficient way, especially when, e.g., with virtualization you can easily get 20+ hours of DOS on a subnotebook.
20+ hours playing games with audio support on a VM? I’d love to see that kind of performance on, say, the GPD Win 4, but it sounds exaggerated. Where could I find this?
Have you tested SBEMU? I wonder how well or hassle-free it works in reality. I'm interested in building a SvarDOS [1] machine for some tasks, including some low-end audio processing, probably with some custom "program".
If the code is GPL and you are distributing the code in object form, then it’s okay to remove copyright notices as long as the work carried “prominent notices stating that it is released under [the GPL]”. GPL v3 section 5. [0]. Basically, that section says that if you’re distributing modified code, you’re not required to “keep intact all notices”. Id.
I bought one on the gray market in 1996, and I think I managed to double my money in about 6 months by reselling it within the US. It was supremely handy to have around, but with only one PCMCIA slot you had to choose between wired networking and a CD-ROM drive.
I’m convinced that just like how humans are attracted to little kittens and puppies, some technophiles easily feel the same attraction to small, harmless looking little computers running tiny software on tiny processors. I feel like buying this even though I have zero need for it.
I had a similar device in my hands 15 years ago in Thailand, with the company making it trying to find a market when OLPC was getting all the press. It was a netbook with a low power 386 compatible CPU and just enough circuitry and support to drive the keyboard, screen, some RAM and the SD card serving as the hard drive, and USB. It booted DOS, and in theory could have run Linux. It was powered by 6 AA batteries. They might have been able to produce them in bulk for $100 USD at the time, but I suspect it would have been more like $200. Didn't go anywhere as far as I know.
So close. But as others mentioned, why not something like VGA, or at least EGA, and a 286? You could still run the '80s stuff, but you could then also do Space Quest V and some of the later LucasArts games... which, embarrassingly, would make it a must-have for me even though I own 27 copies of them on GOG and Steam and in bundles and on CD and wherever :)
Curious. Why do we need an actual 8088? Aren't all x86 processors backwards compatible with 8088/8086? Also last time I checked VGA cards supported legacy modes such as CGA, and EGA (this may not be the case anymore).
Any modern Intel laptop should be able to run the same software, without emulation.
In theory, no, but if you're using an AMD Ryzen CPU, there are some nasty bugs related to VME instructions (which AMD appears to have no intention of fixing) that break some DOS-based applications, including Windows 3 and 9x, and probably the other DOS-based Windows versions as well, but I never tried them. NT-based Windows works, though early versions do not perform as well as they do on Intel (my R7 1700 and 5700X run Windows 2000 VMs worse than my old i5-6600... I wish I was joking).
There are subtle differences with newer CPUs, such as: the 8086 does not mask shift counts with 31, while the 80186 and newer do. Not sure if it matters much for compatibility, just some trivia.
Source: the 80186 hardware reference manual, appendix A, "Differences between the 80186 family and 8086/8088".
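For anyone curious what that difference looks like in practice, here's a toy simulation of `SHL AX, CL` under both behaviours (not real 8086 code, just 16-bit arithmetic in Python mirroring what that appendix describes):

```python
# 8086 vs 80186+: what SHL AX, CL does when CL exceeds the register width.
def shl_8086(value, count):
    # 8086/8088: the full 8-bit count is used, one shift per count,
    # so a large count just shifts everything out (and burns cycles).
    for _ in range(count):
        value = (value << 1) & 0xFFFF
    return value

def shl_80186(value, count):
    # 80186 and newer: the count is ANDed with 0x1F before shifting.
    return (value << (count & 0x1F)) & 0xFFFF

ax, cl = 0x1234, 36
print(f"8086:   SHL 0x{ax:04X}, {cl} -> 0x{shl_8086(ax, cl):04X}")    # 0x0000
print(f"80186+: SHL 0x{ax:04X}, {cl} -> 0x{shl_80186(ax, cl):04X}")   # 36 & 31 = 4 -> 0x2340
```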
You should be able to switch encodings/codepages in DOS; 8-bit encodings cover many languages, and includes "fancy punctuation". Even the default cp437 covers quite a few of the languages in Latin script.
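As a quick way to poke at how much a given codepage actually covers, Python happens to ship the same IBM codepage tables, so you can test a sample string locally (cp437/cp850/cp852 below are just example codepages, and the sample text is mine):

```python
# Which accented characters survive a round trip through classic DOS codepages?
samples = "café niño über François ¿Señorita?"

for codepage in ("cp437", "cp850", "cp852"):
    try:
        encoded = samples.encode(codepage)
        print(f"{codepage}: ok, {len(encoded)} bytes")
    except UnicodeEncodeError as err:
        print(f"{codepage}: cannot encode {samples[err.start]!r}")
```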
I read your comment and thought that can't be a unit of 1, it must be a typo and a batch of 1000.
Then I clicked the link.
But that can't be. Don't they appear in all kinds of things now, from washing machines to fire alarms? So I checked Alibaba, which has 8088s from $0.3 - $2. [1]
I'm confused why the military version is three orders of magnitude more expensive.
9 out of 10 "new old stock" chips on sites like Alibaba are harvested from ewaste. The "recycling" process generally involves workers holding old circuit boards over open fires to melt the solder, and then banging the boards against the ground to knock the chips out. After that, the recyclers sand off the old chip markings, cover the tops of the chips with a tar-like material, and laser on new chip markings (this makes sure that when you buy 100 chips from a seller, they all have the same markings and date code). Since they're applying new markings, they can also take the liberty of making the chips into more expensive versions, such as by increasing the rated clock speed or by labeling consumer versions of chips as milspec or radiation hardened versions. This may be fine if you just want an old chip for a hobby project or something, but for repairing military equipment you want chips with a verifiable chain of custody.
I doubt there's actual 8088s being used in consumer products; it's mostly military (and someone else mentioned avionics in another comment here) applications that require exact qualified parts, and can't be replaced with something else. Alibaba's are probably recycled or NOS.
The fact that it appears to use an actual 8088 (and 8087) is somewhat surprising, considering that the whole PC/XT that this model claims to be compatible with could probably be implemented on a single COB like what happened to the NES.
Also amusing is that the keyboard has a Windows key.
Not true. Millions of dollars were spent on 8086-compatible software, Lotus 1-2-3 for example, and many programs bought new were 8086-compatible until Windows finally took over.
This is really cool but I would have targeted a 286 or 386 with EGA or VGA, not 8088 with CGA. The display used can do a lot more than that anyway and the 386 actually came in a smaller package.
I had an Ambra 486sx (IBM line) laptop that was so small and great. Seemed so ahead of its time, a netbook before it became a category. Google "Ambra laptop" shows a picture of an SN8660C which looks like what I had. Played Doom just fine on it. Even had the external dock that took PC/XT/AT expansion cards.
Just now I thought 8MB is still enough to be useful; then I caught myself: MB, not GB.
Man, I would kill for an out-of-the-box machine that's a tiny bit faster, with a bit more RAM, and has full Voodoo 2 support.
I know PCem can do a pretty decent job with Voodoo these days, but it would be nice to have something that Just Works so I could play Messiah and Sinistar Unleashed in their full glory again.
I think it's not really the same sort of thing. It's got a 4MHz CPU, for starters. I'm sure it's good for the exact cases you're mentioning, but the Raspberry Pi, even the W, is much more powerful and versatile.
Why so small? 24cm width, wtf? Why not a full-sized laptop? The screen will be basically invisible, the size of a typical smartphone screen.
Also 16:9 screen ratio will definitely work poorly with almost all nontrivial DOS apps.
Very little is going to prevent this thing from being very slow.
It’s an 8088 running at 4.77 MHz, on a not-really-great system architecture. Head to head, the original PC was slower than competing Z80s at 4 MHz.
The truly singular advancement the 8088 brought was more precious, precious memory. The PC itself was a different phenomenon. What it lacked in performance it made up for in many other ways.
For most applications, particularly early on, the 8087 did not bring a lot to the table to improve performance. GW-BASIC did not recognize it, for example. It wasn’t going to make WordStar any faster. And it was an expensive add-on.
Numeric coprocessors were always pretty niche until they came bundled with the 486 and 68040; even then, the 486SX and 68LC040 continued to sell, and did not have bundled FPUs.
It looks like it from the picture, but I don't know why it has one, because it would not be meaningful on that kind of computer even if you do have Windows 3.0, I think.
There were 8088 variants rated for 8+ MHz, motherboards supporting them had independent oscillators for the CPU and the ISA bus and you could switch speeds with software or a keyboard shortcut. Maybe also a button?
That would be interesting but MS-DOS? Absolutely not.
I'm having much more fun tinkering in Linux on a 150 EUR Chinese laptop with a fairly anemic CPU.
If there's an itch for retro computing I'll just use an emulator.
The only thing I'd probably enjoy using old tech for would be stochastic super-optimization of programs i.e. to find the very best combination of machine code that executes a certain algorithm maximally fast or with minimum memory.
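For the curious, the simplest version of that idea fits in a small script: pick a toy instruction set, generate candidate programs at random, keep one that matches a reference function on test inputs, and then shrink it. A rough sketch of that flavour of search (nothing like a real superoptimizer such as STOKE; it only tests candidates rather than proving them equivalent):

```python
# Toy stochastic superoptimizer: random search for a short straight-line program
# over a tiny register machine that matches a reference function on test inputs.
import random

OPS = {
    "add": lambda a, b: (a + b) & 0xFFFF,
    "sub": lambda a, b: (a - b) & 0xFFFF,
    "xor": lambda a, b: a ^ b,
    "and": lambda a, b: a & b,
}
NUM_REGS = 4  # r0 holds the input x; r1..r3 start at 0; the result is read from r0

def run(program, x):
    regs = [x, 0, 0, 0]
    for op, dst, a, b in program:
        regs[dst] = OPS[op](regs[a], regs[b])
    return regs[0]

def random_insn():
    reg = lambda: random.randrange(NUM_REGS)
    return (random.choice(list(OPS)), reg(), reg(), reg())

def matches(program, target, tests):
    return all(run(program, x) == target(x) for x in tests)

def superoptimize(target, max_len=4, samples=50_000):
    tests = [random.randrange(1 << 16) for _ in range(8)]
    for _ in range(samples):
        candidate = [random_insn() for _ in range(max_len)]
        if not matches(candidate, target, tests):
            continue
        # Found a match -- greedily drop instructions while it still matches.
        i = 0
        while i < len(candidate):
            shorter = candidate[:i] + candidate[i + 1:]
            if matches(shorter, target, tests):
                candidate = shorter
            else:
                i += 1
        return candidate
    return None

if __name__ == "__main__":
    # Reference function: multiply by 3 (mod 2^16). One 2-instruction answer is
    # add r1, r0, r0 ; add r0, r0, r1 -- the search usually finds something equivalent.
    program = superoptimize(lambda x: (3 * x) & 0xFFFF)
    print(program if program else "nothing found; raise `samples` and retry")
```

A real superoptimizer would search actual machine code, use a cost model for speed or size, and verify candidates with a solver instead of a handful of tests, but the stochastic core is the same: propose, evaluate, keep the best.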
> Even switching from the home computers of the era to workstation-class systems, it’s pure frustration at the limits on what can be done. Don’t get me wrong, for a lot of my computing needs I could probably get along just fine with a NeXT cube or a late-model Mac II. But on the other hand, you’ll pay hundreds or thousands of dollars for those systems, while I can go on Amazon, spend ~$200 on a mouse, keyboard, cheap monitor, and a Raspberry Pi Zero W, and have a more capable system in every way. For just a little more money I can buy a tiny PC that can emulate all the computers of my youth nearly perfectly.
Thank you for attending my Ted talk.