This is such a great demonstration of both how fast JS engines have become in the browser, and how much less efficient native software has become overall. Running on top of an x86 emulator in JavaScript, it's still faster to pop open a file browser and click around than it is on some modern smartphones.
Most modern smartphones run Android, which means they have a bastardized Java VM between your "native code" and actual metal... not unlike having a JS VM.
It's not that "native software" has become less efficient, but rather that it is slowly disappearing.
As of Android 5.0, the Android Runtime (ART) compiles Dex bytecode ahead of time to native code. That probably still isn't as efficient as C++ or the like, because there's still a garbage collector, and Java methods are virtual by default.
That's because the Win98 code is that much simpler. It has less to do with speed and more to do with complexity.
Your phone is doing all sorts of weird shit behind your back as you poke about. Stuff it wouldn't be doing if app makers actually respected their users...
On that note, why does that crap take so long on smartphones anyway?
One tweak that made my G2 feel much more responsive is simply turning down the UI animation length in Dev Settings.
That's not to say that app developers don't love their non-native webview apps crammed to the brim with ads, though...
UI animations have a lot to do with it, but I'm more referring to UI jank over simple stuff like 'enumerating a list of sharing targets' or 'swiping to the next page of the launchboard' when the phone is otherwise idle. Or, my favorites: the multi-second pause after unlocking my phone during a phone/Skype call, or the actual eternity it takes just to answer said fucking Skype call; half the time the caller has already hung up by the time my phone finally responds...
That's why I switched to Windows Phone. It's a whole new experience. I can actually answer my phone without the obligatory wait for it to catch up before I can swipe to answer.
Had a client who later wanted me to add all sorts of nasty things to an Android app template that I had made for him: locking out primary navigation buttons, partially replacing the home app, spying on the user, and the capability to send out very expensive SMS texts and auto-dial long-distance phone calls. Unfortunately I was unavailable to do the add-ons he wanted, despite him asking repeatedly over the next 18 months. Never trust an Android app...
I think a big part of it is simply loading in resources and the like. Smartphones try to keep a _lot_ of apps in memory, so there's a lot of switching around.
Another thing is that Windows would put certain native widgets (like file selection) on a higher priority than other program code. Android, at least, tries to put as much stuff in userspace as possible, so you might be experiencing the reality of everything running at the same priority.
I miss the shitty machinery of the win9x days[1]. The weird part is that I was deep into CGI, and compositors made so much sense to allow for all kinds of graphical operations; yet... I miss the refreshed icons, the blinking cursor... as if I was one-to-one with that simple machine.
That's very subjective; I'm in a minimalist, passéist phase.
> I regularly scan (ansi|gnu|color|)Forth webpages.
I occasionally do too :P
> Thinking Forth is on my mental shelf for so long. ML and Lisps keep delaying reading it.
I know the feeling... but I'm still at the "Lisp looks like parenthetical line noise, and what even *is* ML?" stage, so I haven't yet tackled those.
Forth, to me, seems to bring out the "mechanicalness" of the computer, in a weird sort of way. Of course it's just another programming language, but the philosophy and mentality behind it seem to lean in that direction. I like it for that, and for its minimalism. :D
I love Lisp and ML to bits; it's more that there's a whole cosmos to learn from there (type systems, macros, logic resolution, you name it).
I kinda understand the mechanicalness of Forth, if you mean that there are only a few principles and that even 'syntax' is built on them. The kind of less-is-more that frees your mind.
I suspect smartphones have crippled IO and memory subsystems. Marketing emphasizes "Hexa Core 128-bit Samsung silicon from the future" while the rest is subpar, leading to weird performance. JS engines are beautiful these days, especially given the ad-hoc nature of JS, but a JVM should beat them hands down (based on blogs claiming to reach within 1-3x of C perf).
You might think it's a great demonstration if you try it in a desktop browser.
Meanwhile, on actual mobile hardware, the performance is so atrocious it's completely unusable.
No knock on the implementors, of course. But the idea that this demo offers some deep insight into the performance of native smartphone software is just inaccurate.
I didn't mean to compare native mobile apps to web apps (though I see now how it could be read that way). I'm talking about native software in general. I chose smartphones for the comparison because they're the most notorious for being slow, even as their hardware performance approaches what PCs were like just a few years ago. Although even some desktop apps manage to be as janky and dog slow as almost anything from the Win98 era, e.g. iTunes until a couple of years ago. By contrast, the cheapest Third World market smartphone would run Win98 apps blindingly fast.
In general there's an arms race between hardware getting faster and native developers getting more and more lazy about efficiency. On the desktop, the hardware is finally winning--it's just too damn fast. Hopefully that will happen with smartphones too.
> Although even some desktop apps manage to be as janky and dog slow as almost anything from the Win98 era, e.g. iTunes until a couple of years ago. By contrast, the cheapest Third World market smartphone would run Win98 apps blindingly fast.
It is easy to take potshots at e.g. iTunes, but the real reason software "got worse" is not developers getting lazy. What happened is that expectations for CPU-intensive features rose (memory protection, ASLR, NX, encryption, low-latency audio, high-efficiency codecs, ClearType) while willingness to pay vanished. In Win98 times, you bought your music player from the developer (remember Winamp?) whereas now it comes free with your OS, which is itself probably free. So of course it is all half-assed now, but it's not "lazy" to spend resources on software someone will pay for, and not on software nobody will pay for.
You could absolutely reverse this, but it has nothing to do with native or web technology, and everything to do with changing consumer attitudes about choosing software.
> On the desktop, the hardware is finally winning--it's just too damn fast. Hopefully that will happen with smartphones too.
From 1995 until today, power consumption in desktop processors grew about 5-15x, depending on how exactly you measure. 5-15x more power on mobile devices is simply not an option, unless we have a "new physics" kind of breakthrough in both battery and thermal technology.
Willingness to pay didn't vanish, it's just that how we pay has changed. No, we don't pay for iTunes, but iTunes is the entry point to the iTunes store, so there's plenty of revenue coming in through that. And while iTunes and OSX are free, the hardware that OSX runs on makes up for that by being more expensive.
It's getting better (particularly on high-end phones), but single-threaded performance is still nowhere near circa-2013 desktop x86_64 (3.5 GHz+ Haswell). And it'll probably never catch up, due to the thermal and battery life constraints.
"Never" is a very dangerous word to use considering that just 70 years ago, the ENIAC was created and that about 20 years ago, we were using Pentiums with 150-200MHz. The fact that we've come such a long way in such a short time means we probably have quite a ways to go, too.
This is actually really useful to me. I've been working on-and-off re-implementing Comic Chat using HTML5. It's nice having a working copy again to reference. The paper is great but it omits some details.
Where does one start with such a thing? I mean, where do I read about how to emulate a CPU (or at least about how a CPU works, so I could figure out how to emulate it myself), how PS/2 works, or how to boot an image from the BIOS?
If the reference manuals are overwhelming, I'd recommend starting with Code: The Hidden Language of Computer Hardware and Software by Charles Petzold. It covers how computers work for a general audience, from logic gates and boolean algebra up to assembly, opcodes on the Intel 8080 processor, and how operating systems work. It was one of the most consequential books I've read, and I'd recommend it to both professional programmers and anyone else who is intellectually curious.
Learning about CPU internals was a huge part of your first year in CS? Where did you go to school?
At my school and every other school I'm remotely familiar with, the knowledge needed to emulate a CPU would be covered in upper division computer engineering courses, and not covered by CS undergrads at all, certainly not in their first year.
Bologna, Italy (from the school of Science, not Engineering). At the time (16+ years ago, under prof. Renzo Davoli), the introductory "Computer Architecture" course [1] was almost entirely dedicated to CPU architecture, including building basic ones (IIRC).
This kind of course (computer architecture) is typical of the cross CS/EE requirements and can usually be taken during any year of an undergraduate degree in CS. Prerequisites are minimal if any.
I taught that class once. It does have some prerequisites, but not a ton. I think it could be taken as early as second year, first semester at that particular school.
Back in 2007, my first semester in a technical CS school had plenty of assembly, binary, MMU emulation, OSI / TCP/IP, boolean math and so on.
I know this is mostly gone at this point, but it shouldn't be. The fundamentals are essential. I often meet people with CS degrees who are totally clueless about how a computer works.
"How can you do C, it is so old? By now, we must have invented faster languages. Computers changed so much recently". Sure... Binary is now expressed with emoji.
Those aren't the fundamentals of CS. The fundamentals of CS are data structures and algorithms, as the Good Book says. Hardware is important, practically, but don't let practicality blind you to the actual core of the field: Practicality has a way of becoming obsolete, and an actual education is for life.
Meh. It's like saying that the fundamentals of film-making are writing and acting -- except writers and actors are nothing without an actual camera, and if you don't know how to place the camera and how to cut film produced by such camera, you will never get a film done. Cameras change, but you will never have a camera that miraculously materializes in all the right places to take exactly the shots you imagined. In the same way, some things will never change on the hardware side: you will always have inputs, outputs, memory, storage, human interfaces, power management, booting and so on.
The difference is that when people talk about hardware, they're talking about the equivalent of film, not abstract "this is central to being a camera which works with visible-spectrum light" concepts.
> you will always have [...]
> memory, storage
First, making a distinction between memory and storage is not an "at all times, in all places" kind of thing. It's more "this is what we do now, in the past, on some systems, it was different, and it may be different in the future" kind of thing. Single-level storage is not currently popular, but it was in the past and may yet come back. It already has, in some limited contexts.
Second, we've already seen home system storage go from paper tape to magnetic tape to magnetic disk to metallic magnetic disk to solid-state NAND flash or close equivalents. Each has vastly different performance characteristics and differs in every detail.
> human interfaces
I'm sure there are some iron-clad universals in HID. I don't know which of those translate from CLIs to touch-screens to gestural interfaces to speech recognition to pupil tracking to...
> power management, booting
Two things which have changed quite a bit even in the lifetime of "vaguely IBM PC-derived" desktop computers, and even more so if you widen your scope up and down the power curve to include handheld systems and, you know, Real Computers What Do Real Work.
> we've already seen home system storage go from [...]
Yes, but the point is that there will always be a requirement to manage and persist the data you are working on somehow, and how you go about it dramatically impacts (or should impact) the choices you make at the more abstract level of data structures. It is a fundamental concept that you will be forced to consider one way or the other. You can have the fastest algo in the world crunching huge amounts of data, but if you then take an inordinate amount of time to store and retrieve results, it's as bad as having blazing-fast storage and crappy algos.
> I'm sure there are some iron-clad universals in HID
I agree that it's traditionally considered a subclass of I/O, but I think in recent years we've seen that it's much more important than previously understood. Good software with a mediocre UI is ignored, while mediocre software with a good UI can change the world. This is one of the few real discoveries in our field since the '80s.
>> power management, booting
> Two things which have changed quite a bit
... but are still there in some shape or form, and will forever be there. They are changing the world because people put effort and thought into them as fundamental parts of computing experiences, not one-offs that can be simply ignored as "constant time".
Those would also be the fundamentals of theatre, though, and film as a medium is essentially defined by its divergence from theatre. The fundamentals of film are cinematography and editing, but even this is theoretical compared to the craft of film-making. Just as CS includes the behaviors of Von Neumann machines with tapes, which is abstract compared to the science of e.g. processor die doping.
It was a technical school with very practical teachers. The lack of "theorists" probably helped a lot. Most of them were either from the video game industry or were 80s embedded developers. In Quebec (Canada), a technical school teacher is unlikely to have a PhD. Technical schools and universities are two different levels. You can take technical school (CEGEP) as a pre-university degree (or you can stop there and get the equivalent of an associate degree). Taking the CS track instead of science, you "lose" a year, but it's much more fun. Too bad they replaced the systems programming courses with web ones a couple of years ago.
All that to say that sometimes, having real industry veterans as teachers really influences the teaching point of view.
Don't get me wrong, I certainly think that functional programming has its place in the world, but as a first year uni student all fired up about finally learning "real" programming after years of teaching myself (back before the internet laid everything out on a platter), I was not impressed.
I don't think functional paradigms can really be appreciated by 1st/2nd year undergrads. At that age you are fundamentally impatient to make your mark in a practical sense, your approach will be instinctively imperative. You have to hit the wall (scaling / parallelism / thread management / complexity etc) before you start to really appreciate the upsides of functional paradigms.
Unfortunately, a lot of professors are actually terrible educators (after all, they did not get there by teaching but by researching) and think the learning process is as linear as house-building: "place bricks here and there so that your next row will be this way and that way". They also think people should enjoy programming for programming's sake, whereas a lot of people are motivated by a creative process driven by outcomes.
Not necessarily (based on being an assistant in lab sessions for first year students learning Haskell).
But what it did do was put everyone on the same level, including the arrogant students who "already knew how to code" and hadn't listened to (or attended) the lectures.
I think they chose a functional language to start with good habits for thinking about what to implement, not how to implement it. If you don't know what the problem is, you should work on that, rather than bashing out some Java...
I got a solid understanding of CPU operations in my undergraduate computer architectures class, in my CS degree. Of course, it was probably a third year class. First year sounds a little unusual.
We covered it in the first quarter of the CS program at The Evergreen State College. After a few weeks of being introduced to digital logic, each student had to draw (with an application called Logisim) a simulated, simple-as-possible Von Neumann machine up from logic gates and wires. Then we had to write short math programs for them directly into the RAM. Fortunately, Logisim lets you save components as something like functions or macros, so it wasn't too repetitive. This project demo video by someone who took the same course shows what the result looks like:
It was challenging, but it was awesome (and finding that video to illustrate my comment is a blast of nostalgia). It wasn't any harder than most other CS or other sciences courses. And after digital logic, the other CS topics aren't really prerequisite or especially helpful in learning how simple processors work. I really appreciated getting straight to the foundations of how computers work and building up from there.
It wasn't like emulating x86 in JavaScript, but it was CPU internals. Up until I read your comment, I just assumed this was standard CS stuff.
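For the curious, the kind of gate-level bootstrapping the course starts from can be sketched in a few lines of JS. This is just an illustration (not course material): everything is derived from a single NAND primitive, the same bottom-up construction the Logisim project uses.

    // Derive the basic gates from NAND alone, then compose a half adder.
    const nand = (a, b) => (a && b) ? 0 : 1;
    const not  = (a)    => nand(a, a);
    const and  = (a, b) => not(nand(a, b));
    const or   = (a, b) => nand(not(a), not(b));
    const xor  = (a, b) => and(or(a, b), nand(a, b));

    // A half adder: the sum and carry of two bits, the first step toward an ALU.
    function halfAdder(a, b) {
      return { sum: xor(a, b), carry: and(a, b) };
    }

    console.log(halfAdder(1, 1)); // { sum: 0, carry: 1 }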
I was equally shocked at the state of CS education at University of New South Wales here in Australia. They don't seem to cover many fundamentals (like CPU, algorithms and data structures, operating systems) compared to what I am used to in Europe. Either not at all in the undergrad curriculum or only very late.
I studied math at university, and did CS as a minor. They made me take data structure and algorithm classes for both.
The mathematician's version was half as long but covered the material in more depth: i.e., they proved every result. The CS version was dumbed down and full of fluff. (And even those CS people did operating systems and compilers as undergrads.)
First year, second semester we learned binary logic, system bus, how a CPU works, etc. I think it was seen as the foundation so you actually understand how a computer works.
Me too, but there's a huge gap between the often-simplified CPUs they teach you on at uni and programming for the quirks in x86, or any of the buses/peripherals/interfaces.
I didn't learn much more than assembly 101. I doubt I could emulate much real hardware. Certainly not a PC, of all things... A Game Boy looks more accessible...
I've not really enjoyed my time in higher-ed (and I have little to show for it), but sometimes it's really the only place where unfashionable but fundamental topics are covered in depth. My comment was meant as a pointer (i.e. "check out lecture recordings etc") rather than a quip.
0xffff2 - I can't reply directly to you for some reason, but I went to UC Berkeley many years ago, and the third undergrad course, CS 61C, laid the basic groundwork for beginning to understand CPUs. Here's the syllabus from the most recent semester: http://www-inst.eecs.berkeley.edu/~cs61c/sp16/
My university didn't touch on hardware until the second half of second year, and that was only one paper (strictly speaking, a computer engineering paper).
Personally, I had some courses on computer architectures, and those covered the theory of how CPUs are constructed, how they run programs, etc. I decided to write an NES emulator. So:
- Find Wikipedia articles, and learn that it used a variant of the MOS Technology 6502, which was used in a lot of computers in the 80s.
- Find some digitized assembly programming manuals from the time (I think the one I used was distributed with the Commodore 64, and ended up having several typos introduced by OCR).
- Write a tool to recognize, decode, and print out an operation when you feed it a little data
- You basically need to set up a loop of fetching instructions, interpreting them, then doing what they say. An actual CPU runs in a similar loop, and it generally doesn't stop until power is removed. (See the sketch below.)
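A minimal sketch of that loop, in JS, for a 6502-style CPU (all the names here are hypothetical, not from any real emulator):

    // Fetch/decode/execute: the core loop of any interpreting emulator.
    const cpu = { pc: 0x0000, a: 0, x: 0, running: true };
    const memory = new Uint8Array(0x10000); // 64 KiB address space

    // One handler per opcode; a real 6502 core has ~150 of these.
    const handlers = {
      0xA9: () => { cpu.a = memory[cpu.pc++]; },   // LDA immediate
      0xE8: () => { cpu.x = (cpu.x + 1) & 0xFF; }, // INX
      0x00: () => { cpu.running = false; },        // BRK (simplified to "halt")
    };

    function step() {
      const opcode = memory[cpu.pc++];  // fetch
      const handler = handlers[opcode]; // decode
      if (!handler) throw new Error("unknown opcode " + opcode.toString(16));
      handler();                        // execute
    }

    while (cpu.running) step(); // real hardware loops until power-off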
I think that after the classes I took, I read a lot of what other emulator writers said. This article is a basic look at the structure of an emulator, the theory behind them, and some different designs: http://fms.komkon.org/EMUL8/HOWTO.html
One of my friends at uni wrote a Z80 emulator/assembler in PDP-10 assembler as a hobby project - and he was studying chemistry, not CS...
I don't understand why a basic understanding of CPU architectures isn't a CS fundamental everywhere.
Even if you have no interest in emulating a CPU or an OS, you really do need to know what registers are, how caches work, what interrupts do, and how basic IO happens.
At the very least it's a practical demonstration of one particular kind of VM, and - if you want to - you can generalise from that to VMs of your own design.
For web apps, not understanding these things can get expensive. Cycles, even cloud cycles, aren't free, and if you take zero interest in optimisation and efficiency you're literally throwing money away.
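To make the cache point concrete, here's a toy benchmark (illustrative only). Both functions do identical arithmetic; only the access pattern differs, and the strided version is typically several times slower:

    const N = 2048;
    const grid = new Float64Array(N * N); // one big row-major matrix

    function sumRowMajor() {
      let sum = 0;
      for (let row = 0; row < N; row++)
        for (let col = 0; col < N; col++)
          sum += grid[row * N + col]; // sequential, cache-friendly access
      return sum;
    }

    function sumColumnMajor() {
      let sum = 0;
      for (let col = 0; col < N; col++)
        for (let row = 0; row < N; row++)
          sum += grid[row * N + col]; // strided, cache-hostile access
      return sum;
    }

    console.time("row-major");    sumRowMajor();    console.timeEnd("row-major");
    console.time("column-major"); sumColumnMajor(); console.timeEnd("column-major");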
Agreed. It's important to understand how the hardware works, at least at the theoretical level. If you don't understand what the machine is doing, it's hard to say that you really understand how your program works.
Do you worry that Microsoft might threaten you for the Windows 98 image? I don't think they make a cent from it anymore, but big companies and common sense aren't common bedmates.
This is the most impressive part! Well done! So in theory, we can run anything that a real PC can run, albeit a Win98-spec PC. Any takers to get the next generation in the Windows series running on it? WinXP, I think.
PS: Just came across the HN discussion around said x86 emulator.
Great work on your emulator. I am also looking forward to using WebAssembly for my emulator when it's available. However, I don't expect much speed improvement. I guess you can speed up your emulator significantly by putting in some '|0' no-ops to prevent deoptimizations into double precision.
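For context, '|0' is the asm.js-style int32 annotation: a bitwise OR with zero truncates a number to a 32-bit integer, which lets the JIT keep it in an integer register instead of deoptimizing to double precision. A tiny example:

    function addOffset(base, offset) {
      base = base | 0;     // coerce arguments to int32 on entry
      offset = offset | 0;
      return (base + offset) | 0; // the sum may overflow int32; "|0" wraps it
    }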
It's stuff like this that makes me wish I had unlimited side project time. I think this could be fun to work on but my assembly/C level programming skills would need some serious ramp-up. Great work though!
A note: the mouse position and acceleration don't seem to be mapped correctly. The emulated cursor often ends up far from my local desktop mouse position. Maybe because the screen resolution is different?
I'm having the same issue. It makes it difficult to do anything, because I would need to move my actual mouse pointer past the edge of my monitor to get to any of the icons or start menu with the virtual mouse.
My first laptop was a 10 year old hand me down Windows 95 laptop (literally, as old as me) that I used just to play Space Cadet Pinball. I can't believe they took it out of Windows 98; what a great game haha
Are there license issues around this project? I doubt Microsoft would care to put the resources into enforcing Windows 98 licenses, but I am just curious about the legality.
Essentially you are being given a copy of a VM that has a copy of Windows pre-installed. It's unlikely (but possible) that Microsoft has granted a license for this. I imagine this is technically not legal, but as you say, unlikely to matter much.
Microsoft does not seem to bother with older software, because they don't support or upgrade it anymore.
Microsoft only goes against piracy of software they still support and upgrade.
They don't bother with the old software because it isn't worth it to sue or send a C&D letter; they don't earn income from it anymore, so they lose nothing if someone pirates software they no longer sell.
In fact, they gave away the source code to an early MS-DOS and MS Word under their own open-source license.
It is good PR for them if someone emulates their old software in the web browser and gives them free publicity.
That is brilliant. Little things like being able to change the screen resolution and waste time on Solitaire - what more could one want!!! Works amazingly.
Also, it is quite a historical artefact; things like Active Desktop were truly cool back in the day. Plus, the simplicity of the Win98 UI is a joy to return to.
This is awe-inspiring. I don't know what to say -- except that, if it was faster, I'd try to run Windows 98 in the browser of Windows 98 in the browser.
The network interface doesn't seem to be configured – when I try to open the browser it wants me to install a modem. Is the VM just not configured for that, or do I need to do something? Would love to see what today's web looks like on '98 IE :)
It's possible; Windows 98 has drivers for the NE2000 network card we're emulating (https://github.com/copy/v86/blob/master/src/ne2k.js). However, the emulation currently isn't accurate enough, and Windows 98 refuses to use it.
This allows you to connect to CORS-enabled sites without using the WebSocket proxy. It talks HTTP over the serial port.
I want to add SNI support to tlstunnel so that I can tunnel to google.com by navigating to https://google.com.mydomain/ and having the tunnel connect to http://google.com, reading the bottom-level domain names from the SNI.
Using this with browser-http-proxy, it would be possible to tunnel to HTTP sites at the request level (making it easier to scale) and without relying on tun/tap on the server. It could also serve as a fallback for non-CORS-enabled hosts.
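A rough sketch of what the JS side of that serial bridge might look like (the serialPort hooks and the hardcoded target are assumptions, not v86's actual API): guest bytes are buffered until a full request head arrives, then replayed with fetch(), which only works against CORS-enabled targets.

    let buffer = "";

    serialPort.onByte = (byte) => { // assumed emulator hook
      buffer += String.fromCharCode(byte);
      if (!buffer.includes("\r\n\r\n")) return; // wait for end of headers

      const [requestLine] = buffer.split("\r\n");
      const [method, path] = requestLine.split(" ");
      buffer = "";

      // Target host hardcoded for the sketch; a real bridge would parse Host:.
      fetch("https://example.com" + path, { method })
        .then((res) => res.text())
        .then((body) => {
          const response = "HTTP/1.0 200 OK\r\n\r\n" + body;
          for (const ch of response) serialPort.write(ch.charCodeAt(0)); // assumed hook
        });
    };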
What a coincidence; I was just thinking about how nice it would be if I could start a minimal JavaScript Linux with toybox and run spacemacs on it (< 64M) with a small mounted filesystem somewhere.
Win 3.11 in the browser would be nice too; there is a whole ecosystem of apps. Windows 95b would be slower, but still faster and smaller than Windows 98.
That uses Em-DOSBox, which is more limited than v86 (which the OP uses) in some respects. I think v86 is loading hard disk sectors on-demand, which is pretty amazing compared to Em-DOSBox which has to download a massive disk image. On the other hand, Em-DOSBox has sound support!
Windows before 30 years of updates made under the need to maintain backwards compatibility feels so cohesive. I mean, that control panel is just so clean.
Also, I'd be interested in seeing the original Win 95 (before OSR2). It was way, way faster than Win 98.
The control panel in newer versions could still be that clean if Microsoft hadn't tried to reinvent it three times (XP, Vista, 8).
Of course, the 9x control panel is, in a way, a reinvention of the 3.x control panel. It's just a very clean one, because the 3.x control panel was a window full of icons, and the 9x control panel is an Explorer window full of icons.
"My Briefcase" could have been Dropbox, 10 years earlier, if only 1) it could work decently, which it never did, and 2) had an option to sync to MS servers. Bandwidth was an issue at home, back then, but for offices it would have been fine.
Microsoft spun its wheels a lot with file-syncing in the mid-2000s. Aside from the stuff coming from Office (Groove, etc.) and Windows (folder-shadowing, BITS, etc.), there was a slew of inter-related products in the consumer space, starting with the acquisition of FolderShare. By the time Dropbox launched, it had become Mesh.com, with PC-to-PC syncing and online storage, but thanks to internal competition and constant rebranding/repositioning, it slowly became deprecated and was subsumed by SkyDrive/OneDrive.
I remember how amazing it was when email accounts went from a 20 MB max size to something like 500 MB. It blew my mind and gave rise to many services that capitalized on this huge amount of space (mostly peer-to-mail, and an application that turned Gmail accounts into a Dropbox kind of thing).
This is really impressive, and also brings back waves of nostalgia as I used to use Windows 98 (and 95 and 3.11 before that) as my daily driver.
I like how the cloud icons at the side of Explorer, when not in classic mode, are still drawn with a white background even if you change the colour scheme.
On a real machine, looking at this old system, I realise how little we have moved on in UI terms, or in the basic needs and requirements of a computer. You could do 99% of what you need to do on an old computer, apart from the duff website rendering, horrible security, and power usage.
Uhh, it doesn't? Maybe it runs way better than the Android or iOS emulators, but you'd have to have a pretty messed-up install of recent Android to make it run as slow as this.
The Lock mouse button helped with my first problem, which was that the emulated cursor seemed to be scaling its position based on the size of my browser window instead of the size of the emulated window. Locking the mouse fixed the position issue, but I'm still unable to click on anything.
Now if only we had a version that could connect to the internet. Right now it tells me that it can't find a modem and that I should call MSN Technical Support! Cute.
Respect! This and Fabrice Bellard's emulator have me in awe.
http://www.bellard.org/jslinux/
It's even more painful when I know that technically I should be able to do the same, but have not yet done it ...
But is the A20 gate being properly emulated? :) I know a verification engineer who left AMD because he was tired of dealing with the ancient testbench which checked the ancient hardware.
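For what it's worth, the core of A20 emulation is tiny: when the gate is disabled, bit 20 of every physical address is forced to zero, reproducing the 8086's 1 MiB wraparound. A guess at the shape of it (not v86's actual code):

    let a20Enabled = false;      // the gate starts disabled, as at power-on
    const A20_MASK = ~(1 << 20); // clears bit 20

    function physicalAddress(addr) {
      return a20Enabled ? addr : (addr & A20_MASK) >>> 0;
    }

    function readByte(memory, addr) {
      return memory[physicalAddress(addr)];
    }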
The browser should run in the hypervisor and provide a real hardware virtual machine. Provide both ARM and x86, one is emulated and one is real.
Somewhat infamously, GRUB failed on Intel Macs (in 2006?) because it tried to use the A20 gate, and Apple refused to emulate it. It's fixed now, though.
Very old (as in pre-386, or even pre-286) DOS software, written for the PC and only the PC, might depend on it. If I remember correctly, archive.org has a whole section devoted to ancient DOS software, so you might find something in there.
Windows ME was my most disliked OS (too buggy, even after service packs). I think the level of complexity finally hit the wall, given the 9x series' lack of good memory/file protections.
My favorite 9x OS is 98 SE (which was a stand-alone release, not a service pack). But 98 "first edition" did add substantial features to 95: IE 4, multi-monitor support, sfc was added (which was useful because 9x got corrupted a LOT), ACPI support, better plug and play, and better hardware support overall.
98 SE just added USB support out of the box (which was a big deal for those of us trying to use USB mice), IE 5, WebDAV, Windows Explorer improvements, ICS, an improved WMP, and all of 98 "first edition"'s hotfixes and updates. You could more or less turn 98 "first edition" into 98 SE yourself, but 98 SE was a nice thing to "just install" and have everything work.
My 1990s/early-2000s OSes looked like this: 3.1, 95, 98, 98 SE, ME, back to 98 SE, and then 2000, XP SP1, and beyond. I skipped pre-SP1 XP, as it was a pretty shoddy release compared to 2000 at the time, and I ran away from ME screaming.
I ran all of those Windows versions as well, but my favorite remains NT4. I stuck with that for a long time, until the lack of drivers forced an upgrade. It was incredibly fast (smooth on a 66 MHz 486), rock-solid (uptimes of weeks with daily use), and light-weight (32 MB of RAM was doable; 128 MB was multi-tasking heaven). It's nice that MS is trimming down Windows lately, but I keep thinking that it's a bit like seeing someone who has let themselves go get back on a diet. If they hadn't put on the weight in the first place, they would never have needed to lose it.
DOS 6 + Win 3.11, OS/2 2 and 3, and Win98 SE were the best; XP SP3 as well. Win ME had one good thing going for it: window transparency and alpha blending! For that reason I used it with an HMD, a back-strapped laptop, and a webcam.
So I pulled this up on my Chromebook - rather interesting to have Chrome running Windows rather than the other way around. Of course, the big leap will come when it gets Win XP up and running; then folks will have an answer for their legacy systems...
Install Hacker Keyboard from fdroid (or the Play Store...). There's an option for a persistent notification that you can click, which will enable the keyboard wherever.
Open the Run panel and try to go to "C:\con\con". Instant BSOD. (This was an old trick I used to do as a bratty kid when people left their computers unlocked. Nice to see that it still works!)
The mouse cursor doesn't work... my system's real cursor is still onscreen, the emulator is doing a very poor job of tracking my movements, and it's not intercepting clicks or anything like that either.
None of the toy 'Show HN' emulators that have popped up have had a properly working mouse cursor...
Did you try the lock cursor button? A lot of times in Windows VMs I have to go disable mouse acceleration, but it doesn't seem this version of Windows 98 has that option.