Yeah, but back in the day you didn't run the raw square wave signal into your headphones. Even if you used headphones, the beep would go to the speaker, either the one in the chassis or the one in the monitor.
An entire OS, which can still perform most of the tasks we ask of computers today, runs fast in a virtual machine on a device designed with a power budget in mind. Yet many apps that do basically just one of those tasks, and run natively, run slower and take up more disk space and memory. That's progress.
I've always felt like this effect is just perception. LibreOffice Writer starts in a couple of seconds on my current laptop, for example. I remember when Word took a few multiples of that in the past.
And browsers now feel slow if they don't respond perfectly to touch input on my tablet. We used to tolerate much slower response times.
A lot of this is down to UX, or rather expectation management. If I click on something and nothing at all happens for 0.8 seconds until the UI updates with the task done, the app feels incredibly slow and laggy. If I click on something and a progress indicator opens immediately, updating a real progress bar every 0.2 seconds until the task is done 10 seconds later, the app feels fast and snappy.
The second app took more than ten times longer, but it never left me doubting whether it had even noticed my click, or how long I would have to wait. That makes a world of difference.
90s-era software was written with the expectation that the computer is slow, and having progress indicators on screen or in the status bar for a lot of tasks was common (and if that was too much work, at least the cursor was updated instantly to show that work was being done). Today most software is written with the expectation that everything works instantly, and if it doesn't, the user experience falls apart quickly.
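The cursor trick was about as cheap as feedback gets. Something along these lines (a Win32-flavoured sketch; the work function is just a made-up placeholder, not any real app's code) was enough to tell the user "I heard you":

```c
/* Sketch of the instant "busy cursor" feedback pattern.
   do_the_work() is a stand-in for whatever the click actually triggers. */
#include <windows.h>

static void do_the_work(void) { Sleep(800); }   /* placeholder for the real task */

void on_click(void)
{
    HCURSOR old = SetCursor(LoadCursor(NULL, IDC_WAIT)); /* hourglass goes up immediately */
    do_the_work();
    SetCursor(old);                                      /* restore the normal cursor */
}
```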
One addition to this (excellent) post: it was much more common for applications to do all the work in a single thread, so it often became important to be aware of things like chunking up a big workload so that the progress bar could actually render. Doing so also improved application responsiveness, by allocating time to drain the message pump.
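Roughly, the old pattern looked like this (a Win32-flavoured sketch; the progress-bar handle and the per-chunk "work" are stand-ins, not any real application's code):

```c
/* Single-threaded long job, chunked so the UI stays alive:
   update the progress bar and drain the message queue between chunks. */
#include <windows.h>
#include <commctrl.h>

/* hProgress is assumed to be an existing progress-bar control,
   created elsewhere with CreateWindowEx(PROGRESS_CLASS, ...). */
void run_long_job(HWND hProgress, int total_chunks)
{
    MSG msg;
    SendMessage(hProgress, PBM_SETRANGE32, 0, total_chunks);

    for (int i = 0; i < total_chunks; i++) {
        Sleep(50);                                 /* stand-in for one slice of the real work */
        SendMessage(hProgress, PBM_SETPOS, i + 1, 0);

        /* Drain pending messages so the window keeps painting and
           responding to input between chunks. */
        while (PeekMessage(&msg, NULL, 0, 0, PM_REMOVE)) {
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }
    }
}
```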
While the UI is now more regularly decoupled from the application, the awareness that the user needs to be kept updated seems to have somewhat fallen by the wayside, and so a big ol' spinner, or a largely fictitious progress bar that isn't actually tethered to any measure of work, shows up and you get situations like "well, the progress bar is at 130%, but I sure don't feel like I'm at 130%...".
This isn't the tools, it's a poor use of them, and it lends a lot of ammunition to folks who want to back-in-my-day while ignoring that a lot of other parts of the stuff we had back in our day kinda...sucked. Some of it (not everything!) just happened to be good at this one thing!
Is there a way to make this go full screen? It'd be so much fun to turn my entire macbook screen into an IBM green character screen, and immerse myself in BASICA programming like the old days.
I realize aspect ratio could be a problem and I'm willing to put up with some black on either side.
I keep seeing "PCjr Machines". That's actually close to what this is; using the console is bringing back memories of my first programs in BASICA! I poked around a bit, but this doesn't support an actual PCjr, does it (with colors & polyphonic sound)?
I can't get most of the disk images to load. It claims the images are too big. Or do I need to use a different DOS with HD disk support? Bit of a chicken-and-egg problem, since the more recent DOS images are on big disks, and usually more than one.
I'm guessing (didn't try) it could be a slightly patched version that does actually halt the CPU when it has nothing to do, since DOS is well known for using up all the host CPU when run in a VM, even when idle, which led to people writing utilities like this:
The "Running the CPU at 100 percent for too long can damage your computer." message is probably legal ass-covering, since any system that can't 100% CPU usage 24/7 is defective. Indeed, back when DOS was common, that's what it did to the CPUs of the time, as did other applications which used a polling loop instead of waiting for interrupts.
That's normal for a full-featured OS, not for a single-tasking OS like MS-DOS. Additionally, the x86 HLT instruction was quite obscure before the 80486 (IIRC).
Ah... that sounds logical, yes. But not for DOS, where the CPU was not waiting but instead just burning cycles.
Try running DOS in any VM and you'll be shocked by the CPU usage. A constant 100% until you install a little program called DOSIDLE that properly runs a HLT instruction when the CPU is waiting.
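For the curious, here's a very rough sketch of how such an idle helper can work, assuming Borland/Turbo C's dos.h TSR helpers (getvect/setvect/keep) and the DOS INT 28h idle hook. It illustrates the technique; it is not DOSIDLE's actual implementation:

```c
/* Minimal TSR sketch: hook INT 28h (the DOS "idle" interrupt, called while
   DOS waits for input) and execute HLT so the CPU, or the host under a VM,
   sleeps until the next hardware interrupt. Turbo C style, inline asm. */
#include <dos.h>

static void interrupt (*old_int28)();

void interrupt idle_hook(void)
{
    asm sti;        /* make sure interrupts can wake us back up */
    asm hlt;        /* sleep until the next timer/keyboard/etc. interrupt */
    old_int28();    /* chain to the previous INT 28h handler */
}

int main(void)
{
    old_int28 = getvect(0x28);
    setvect(0x28, idle_hook);
    keep(0, 256);   /* stay resident; the size calculation is simplified for this sketch */
    return 0;
}
```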
Wow. I can't believe the amount of hours I spent on computers back in the day, even though they were so incapable and boring compared to today. It's as if I've never liked people.
I think the reason they did not do that is that it's a JavaScript project and they didn't want to rewrite it in C or another language, since there are already lots of emulators in C. But it could be faster if it were compiled to wasm.