That full screen start view was one hell of an abomination....
It's interesting to see how strongly change-averse most people are when it comes to those things.
Ok, a search-focused menu is fine, but what I can't understand is how it is possible that in 2020, on a new computer, the search in the start menu can still lag??? What is this? There are menus in some Linux desktop managers that haven't been lagging for decades, and the newest Windows reliably takes 0.5s-3s to show any results, upwards of 5-10 seconds to show every result... What are people paying for? Apparently it doesn't even properly separate the drawing code from the search/IO/sync code, because while it lags away I can clearly feel the lag even in how the text cursor moves and responds... Reliably! Wtf, this is the world's most used OS? (Sorry, rant over.)
There are definitely major problems in Windows, be it incentive structure, or some wrong people at key positions in the UX department, or legacy, or whatever it is, but after experiencing a good desktop manager it is obvious that Windows is not getting better in this regard... It's not about tablet or no tablet, it is just worse, in many ways.
Perhaps in a vacuum it might work. But this is the real world, where hypotheses are just that.
> A similar thing happened with Vista, people were overwhelmed with the wildly different UI
No. People were apparently overwhelmed by system instability and resource hogging (mainly hard drive grinding). I don't remember people complaining about the UI's usability. Though there were UI complaints, they were mostly echoes of the same complaints leveled at the glossy "Teletubby" XP theme.
> It's interesting to see how strongly change-averse most people are when it comes to those things.
Interesting? It's human nature. We develop habits and routines which take time to memorize and get right. It's work which we personally invested. I'm sure you have routines that if changed by an external force without choice would be upsetting to you.
Memory usage used to be a major complaint (possibly the biggest) even if it was made clear repeatedly that the OS was simply keeping more stuff in RAM instead of dumping it to disk, specifically to improve performance. A mechanism that has stuck around to this day. That memory isn't marked that obviously in the Task Manager now, and memory is no longer such a luxury, so people don't complain anymore. But on my 16GB machine I have 4.1GB in use clearly marked on the graph, and another 7.5GB cached that is not at all made to jump out at people. People want the added goodies and expect absolutely no impact on anything else.
When XP was launched we heard the same grumbles. XP was bloated, had higher resource usage than 98/2000, was less stable than 2000, not compatible with a lot of hardware, weird GUI. By SP3 people were loving it, and by the time Win 7 arrived nobody wanted to let go of XP. Win 7 was bloated, had higher resource usage than XP, was less stable than XP, not compatible with a lot of hardware, weird GUI. By SP2 people were loving it, and by the time Win 10 came along nobody wanted to let go of Win 7. And no, it's not an issue of OS quality going down. Like you said, people just get used to stuff and can't take change, and when you combine it with the lack of understanding you get all kinds of complaints.
Reminds me of an anecdote about a certain car made for the low end market, targeting a segment of owners of 20+ year old clunkers. Everyone would buy it and complain that the fuel consumption was huge. Strangely enough this was a modern engine, certainly more efficient than the old ones it was replacing. The problem? The fancy computer was showing instantaneous fuel consumption. When accelerating? 25 liters/100Km. Outrageous! The company just hid the instantaneous counter and left only the very reasonable average. Problem solved. Then there were the "I can't feel the road with this power steering" complaints which worked themselves out, although to this day there are people who swear the old cars were better (they were most definitely not).
Between lack of knowledge, nostalgia goggles, unreasonable expectations ("all of it, for free"), and a few more things these popular opinions of tech of the past aren't all that useful. It says a lot about the commercial success of a product, not its actual qualities.
In 2009 when Windows 7 was launched, Vista's market share (all desktop OSes) reached the all time peak of 18%. At the same time the (then) 8 year old Windows XP had 72%.
The NT series has always distinguished between cached/locked memory, so I don't see that as an explanation.
Also, consider that, on equivalent hardware, XP actually booted _faster_ than 2k, which led many people (including those who hated the interface) to actually use it.
So this is not simply explainable by the "people just can't take change" argument.
As for XP most people upgraded from 98. The resource usage difference was undeniable and so was the driver incompatibility which made most devices not work properly or at all initially. I did not have the experience of XP booting faster than 2000 even on the same hardware but SP1 fixed a lot of issues, maybe also this. For the first couple of years every forum, IRC channel, BBS, or DC hub I read was full of complaints about either performance and compatibility, or stability depending on what people were upgrading from.
> So this is not simply explainable by the "people just can't take change" argument.
Not only. But it's one big part of the explanation. People are usually skeptical about change and compare around transition time so are inherently biased. They're comparing a stable, fine tuned product with the fresh, rough edged one. The Vista name was dropped because it was already toxic. But Windows 7 is more or less Vista SP3. If Vista didn't flop so hard from the start it would have had the same evolution as XP: launch grumbles grumbles grumbles > SP1 grumbles grumbles > SP2 gru... hey, this is pretty ok > SP3 noice!.
Even Win 10, with all its issues, is a far better product today than it was 5 years ago.
Well, that's because they fixed most of the issues in SP1 and SP2.
Also is there a specific reason you skipped certain versions such as Windows ME, Windows Vista etc? Maybe it's not only "people just can't take change" and some products are legitimately bad?
The biggest fault of Windows Vista in my opinion was that Microsoft made one too many compromises. There were machines that came with Windows Vista pre-installed that should never have gotten Windows Vista. My roommate in college had a Compaq machine on which the screen completely blanked for over three minutes at a time as it tried to display the User Account Control overlay. Of course, over three minutes later the screen would turn on as if nothing had gone wrong at all. iirc it was something about the processor/integrated graphics being too weak for the Windows Display Driver Model.
I think Microsoft is making a similar mistake today by allowing OEMs to ship Windows 10 on new machines with anything less than a SATA SSD (mechanical hard disks or eMMC).
 https://en.wikipedia.org/wiki/Windows_Display_Driver_Model#:.... Off-topic but if the url looks weird it is because it is a Chrome only thing: more at https://wicg.github.io/scroll-to-text-fragment/
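In case you've never seen one, the fragment syntax looks something like this (an illustrative example I made up, not the exact link above):

    https://en.wikipedia.org/wiki/Windows_Display_Driver_Model#:~:text=display%20driver%20model

A supporting browser scrolls to and highlights the first occurrence of the text= value; browsers that don't support it just treat it as an ordinary fragment that matches nothing and ignore it.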
People complained about memory usage even though it was just Windows precaching. Something they don't complain about now.
The issue with Vista was drivers. And then when 7 came out everyone was like oh yay, everything works. Even though it was using the drivers from Vista.
The majority of the Vista complaints are silly. It was never as bad as everyone made it out to be. Just the same old cool fad of hating on MS.
Change is only good if it improves things. Win8 and later were a massive usability regression for keyboard+mouse/trackpad users. Windows 10 rolled some changes back, but those that weren't rolled back are still an inconsistent mess (and with no signs of improvement, that's the actually worrying part). There's no rational way to call the Windows 10 start menu or system settings an improvement over Windows 7 all the way back to Windows 95. And both the start menu and the control panel area weren't all that great to begin with in Win95+, but somehow Windows 8 and 10 still managed to make them worse.
PS: The main reason why the Win8 start screen and Win10 start menu are bad seems to be that they assume that a Windows application is just the executable, like on macOS. But Windows applications usually have more files associated with them, mostly other tools, readme files, uninstallers. Visual Studio is the best example, it's nearly impossible to find the tools associated with Visual Studio in Windows 10 without installing a proper start menu replacement. And Visual Studio is a Microsoft product. Go figure.
Windows 8.1 made some improvements, so that users will probably be able to find their way from Win32 to Metro. Getting back is still a pain. It retained the fundamentally bad paradigm.
Windows 10 was the proper fix: abolish the ecosystem split, as a disastrously failed experiment, and return to a unified experience. You still have the UWP apps (though their design is finally somewhat more in line with the rest of the OS), but they are properly windowed like everything else now.
In 8 the tiling window manager and non-tiling window manager were separate worlds you switched between virtual desktop style. In 8.1 they made the entire non-tiling window manager desktop a "proper" tile. (In 10 they killed the tiling window manager.)
As a tiling window manager fan, it's easy to sometimes wish they'd gone down the other path in some alternate-universe Windows 10 and "promoted" more Win32 applications to proper tiles in the tiling window manager, rather than "demote" all the tile-capable apps to the chaos of the traditional non-tiling window manager.
I switched to Windows 10 purely for the Surface Book hardware a few years ago. Before that, I was a contented Arch Linux + i3 user. (And my next laptop will probably be back to Arch Linux, with Sway instead of i3 because Wayland > X.)
The split made, and still makes, no sense at all to me as a tiling window manager fan. It maintained two separate worlds that you could not easily switch between, which is fairly antithetical to a tiling window manager. And functionality like, y’know, tiling, was non-existent. (And tiling does exist just a little bit on the other side, with window snapping.)
It was a window-management disaster, wholly unmitigated.
Early on most apps, because of Windows 8's attempted focus on Tablets/Phones typically only supported full screen, half screen, and phone widths. Windows 8.1 added a lot more responsive screen sizes to more of the apps, adding a lot more variety/capability to the ways you could tile apps. The classic Win32 desktop even lived "inside" one of these tiles in Windows 8.1, so you could tile apps on both sides around it.
At various points in Windows 8.1 I'd have a half dozen apps (including the Win32 desktop as a single "app") tiled across two 16:9 monitors. It was actually a very nice dual monitor desktop experience and a good use of all that horizontal space, which I will continue to point out to people that didn't believe the tiling window manager made as much sense for desktop users as it did for tablet users. (I joke that I wish I could turn on Windows 10's Tablet Mode on dual monitor systems still, even as anemic and nearly dead as the surviving tile manager is.)
(Keyboard included and especially. Do you know the keyboard shortcuts to move/resize/arrange windows and when was the last time you tried to use them? Part of my love for tiling window managers came from daily use of a laptop with a bad trackpad and wanting to automate more things with the keyboard.)
This was broken for 5 frigging years. The general consensus was that tablet mode was more problems than it was worth. They fixed it in some recent release but it is still buggy as hell (windows will not return to full height and instead get partially moved off screen for some reason).
Today, GNOME Shell is a default DE for major distributions, and while people have their preferences, it's more popular than the 2.32 fork, Cinnamon, which I think is good evidence of change-averseness.
I had my differences with the GNOME Foundation's handling of things, but I have few complaints about the convenience of their DE today.
Lots of people internally knew it was a mistake in real time before release, but it was strongly backed by exec Steven Sinofsky, iirc.
So the homepage doesn't even load.
So the hapless admin who needs to download a file off the web (with no other browsers installed, and no firewall) would open IE and boom, exploit!
Is this what you're referring to?
More importantly, people who suffer from those interfaces are not in a position of power to switch operating systems. Forcing them to eat what MS feeds them seems like a good way to experiment. If the experiment fails, you don't have to hurry to repair it.
The default homepage for it should be msdn or bing or whatever, but msn makes zero sense.
Linux got this standard arbitrary menu categories thing but in the end people use menus to launch a program so I believe a simple A-Z list with a search box should be enough.
Throwing app store stuff in there is not okay though (as in the current windows situation and the previous ubuntu affiliate links in the dash bar).
(You can imagine what my reaction was when edgy auto-started after an update to tell me the over-engineered Firefox downloader was now "better" than ever. I had to kill the process in order to not see the mandatory new features tour. It did it again after the recent update to make sure I hate it as much as possible.)
Take for example Fluent Design, which by itself is a good design, but after three years it is still not consistently applied in Windows, let alone Microsoft's first-party apps. Meanwhile Apple has redesigned all of the UI of the OS and all first-party apps in one year. That is what following through looks like. It is embarrassing for Microsoft how much better Apple is at rolling out design changes.
I actually really liked that.
Edit: took me way too many brain-cycles to work out the emphasis in this comment.
This (broken search) is the single biggest complaint I have with Windows 10. Everything else I can overlook.
Inside the Sound Settings I see almost all of the controls I expect to see from the Sound Control Panel, though they are broken into multiple pages in the way that the old Control Panel was broken into tabs. A lot of the Sound Control Panel is the Devices list, and there are multiple links to the "Manage Sound Devices" page that I see in Sound Settings. The link to the app-specific volume mixer is under Advanced Settings and labeled about as you would expect: "App volume and device preferences".
If that weren't enough, the second item down under "Related Settings" on the Sound Settings page is "Sound Control Panel", which gets you back to the one you hoped for if you searched and pulled up the wrong one. (I don't think that is a Dev Mode thing; it's always there.)
(Bonus: the best app-specific volume mixer in Windows 10 is the Audio widget in the modern multi-widget Xbox Game Bar, which you pull up with Win+G or the Xbox "Home" button on Xbox controllers. If you've never used the Xbox Game Bar, or only remember a flat single bar without a variety of widget windows, trying the new Xbox Game Bar is a good idea. Even for people who don't necessarily use Windows for gaming, some of the widgets like Audio and Performance are generally useful and as fast as or faster to access than their Task Manager or Settings equivalents. Of course, it's likely to be disabled by group policy on Enterprise systems; it's unfortunate when power-user tools are gaming focused/branded.)
Search in Windows 10 is garbage. I install Everything on every PC I use for home or work and it does wonders.
For me, without the OmniBox Windows would be unusable by now. I consider Windows 2000 to be peak Windows UX. It was all downhill from there.
If you're on 2004, it's HKEY_CURRENT_USER\SOFTWARE\Policies\Microsoft\Windows, DisableSearchBoxSuggestions = DWORD(1)
If you're on 1909, it's HKEY_CURRENT_USER\SOFTWARE\Microsoft\Windows\CurrentVersion\Search
CortanaConsent = DWORD(0), and
BingSearchEnabled = DWORD(0)
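For what it's worth, here's a small Python sketch (untested, using the key paths exactly as quoted above) that writes those values with the standard winreg module; sign out or restart Explorer afterwards for it to take effect:

    import winreg

    def set_dword(path, name, value):
        # Create the key under HKCU if needed and write a REG_DWORD value.
        with winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, path, 0,
                                winreg.KEY_SET_VALUE) as key:
            winreg.SetValueEx(key, name, 0, winreg.REG_DWORD, value)

    # 1909, as quoted above:
    search = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Search"
    set_dword(search, "CortanaConsent", 0)
    set_dword(search, "BingSearchEnabled", 0)

    # 2004, as quoted above:
    set_dword(r"SOFTWARE\Policies\Microsoft\Windows",
              "DisableSearchBoxSuggestions", 1)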
A guy of Bellard's caliber does not need flying unicorns on a screen to attract people.
It's unfortunate that Microsoft didn't work toward making it an open-source standard for documentation. We'd have avoided every other documentation system having its own format, style, etc... Plus you get the whole documentation in a single file, without worrying about broken stuff, missing images, broken links, etc...
MS tried to completely shut down and end the open internet based on open standards. Their Blackbird project attempted to create a completely closed alternative, MSN (The Microsoft Network), with closed publishing, search, and software. Instead of Wikipedia, we would check out things from Microsoft Encarta.
It's funny you bring that up. I just noticed yesterday that MS Word now has a Wikipedia button: https://imgur.com/a/zbcLZGj Things change!
I'm a huge chm fan, but I for one don't want out-of-date Electron apps to open every time I look up something. And then they would still go online to find up-to-date stuff anyway.
Is it possible to make a program like the sheep.exe nowadays? It is awesome.
A similar effect seems to happen even with e.g. Canonical and Ubuntu. Gnome2 had an awesome optimal UI and 90% of Linux users I know prefer that kind of no-BS UI but "management" wanted that Unity and Gnome3 nonsense just to make it look like they were making progress when they were really going leaps and bounds backwards in usability.
Microsoft's backward compatibility is still pretty impressive.
However, there is now Open Shell Menu which was created to replace it: https://open-shell.github.io/Open-Shell-Menu/
When I want to do something, I hit the launcher key-combination of whatever OS I'm in (the Win/Super key in Win8/10 or GNOME; ⌘Space in macOS) and then type the first few characters of the name of what I want. If I know the search database is up-to-date (i.e. I haven't just installed the program), I then just immediately hit Enter before the dialog even shows results. The program opens.
I feel like a Start Menu makes a lot of sense... for people for whom using a mouse is somehow a lot easier than using a keyboard. People who can move their fingers easily but whose wrists+forearms are immobile, for example. Or people operating the GUI through a thin-client app on a candybar-style cellphone. Or people using Windows on their TV with one of those Wiimote-like "air mouse" pointing devices, where switching to the keyboard has the friction of flipping the thing over, taking away your ability to move the mouse while you type. Or, for that matter, people using a game controller, ala Steam's Big Picture mode.
But in the regular case, where you have a keyboard and mouse in front of you, and are in a relaxed default position with one hand on one and one on the other? I can't see the benefit—for just launching programs.
Now, for discovering programs... I'd never know about the "accessory" programs that came with an installation if I couldn't browse. I still recall the first time I installed Sim City 2000 and realized it came with a separate program called SCURK (a level+resource editor for the game.)
But it's been a long time since I've installed a program that came with accessory programs that way. These days—probably because of the influence of app-stores—every feature you get from an installation of "an app" is always accessed through the main binary of the app. And if it can't be, then they just don't bother to distribute it. (See also: macOS and the paucity of uninstallers for programs that install LaunchDaemons. You usually have to google for a script!)
The main reason I like the start menu is that I sometimes forget the name of applications I seldom use, and I forget what apps I have installed.
One thing I forgot to whine about is the configuration panel. In the old one you had everything in front of you instantly. The new one is a joke, and I never seem to remember which features you can change in the "mobile friendly" blue one versus the kind-of-a-folder one.
I am on windows all day long for analysis/development, anything I actually use is just pinned to the taskbar (like osx).
And when I do have to use the start menu to find something newly installed... I'm not sure how you really think it's that different from Win98? The left part is the same thing, it's just sorted alphabetically by program names etc., and in theory now you just start typing to search instead of looking.
Overall, calling Win98 more user friendly than Win10 sounds ludicrous, unless you are someone who never actually uses Windows...
It had the user experience of Windows 98 but was also a lot more stable.
I tried Windows 2000 and really wanted to make it my daily driver, but I kept getting blue screens when I played games. I spent hours looking for a solution and never figured it out. Went back to 98SE until the Windows XP release.
I recently set my desktop to the Windows 2000 default blue background color just for a bit of nostalgia. As far as I can tell, it's the only bit of Win 2000 UI that it's still possible to achieve.
It was a neat concept. I used to use that to sync stuff stored on my school's student network drive with a floppy (and later flash drive) with my home PC. They definitely didn't describe it well unless you went digging for why it exists.
Stuff like Hover! - getting that Weezer music video with the Happy Days set on the Win 95 media edition or whatever...such cool little things that displayed a sense of ‘fun’ about very business-centric software.
But Windows had skeuomorphisms; even nowadays the screen with everything minimised is called "the desk[ ]top".
But I guess briefcases and recycle bins made things relatable...
I still have a bunch of floppy disks with briefcases in them. Alas, the feature has been removed from recent versions of Windows.
Funny that because I grew up in the 80s and for me the golden age was 10 to 15 years earlier and I saw Windows 98 as the decline.
There was so much variety and experimentation in the 80s, and so much excitement too when GUIs first started appearing on home computers. There was also much more diversity in the computing landscape with different hardware architectures and operating systems. In fact back then DOS machines were some of the least interesting hardware and early Windows (pre 3.x) was just terrible compared to what Acorn, Atari, Amiga and Apple were doing. Then came the mid-90s and everything had converged into x86 running Windows. I remember at the time feeling rather let down by just how boring and crummy desktop computing had become considering all the interesting things that preceded it. Things picked up again once I discovered BeOS and Linux -- I guess even in the 90s I didn't like Microsoft Windows and to be honest little has changed over the years.
That's just my opinion though. The "golden age" is a very subjective thing that I suspect is largely driven by the age of the observer.
Back when I was learning computers in elementary school, we were taught to use Yahoo! as our search engine, and most of our 'computer' assignments for the day involved drawing butterflies in Microsoft Paint or seeing how many words we could type per minute.
I think I phrased my initial comment incorrectly - I definitely didn't mean it was the golden age of personal computing, but rather the golden age of the internet.
Certainly, some things have improved. There's been more standardization and a proliferation of knowledge about the 'right' way to design a website (from both a tech and marketing perspective). Back then, there was an excitement and a feeling of freedom I got when browsing the web. Part of it I suppose could just be that I was young and not yet so cynical, but it felt like the wild west to me back then.
Every site looked different, many were ugly and gaudy as hell with bright colors, flashing animations like the Vegas strip, and regularly broken links, but despite this, there was something charming and endearing about it that just isn't present today.
Browsing the internet today is a very contained and sanitized experience. Everything is wrapped in plastic and padded with nerf, and though you may go to different sites, there's a feeling of sameness to it all. Every time I click on something, I'm making someone money. The data I generate while browsing is sold off to the highest bidder. You can assume, like a digital panopticon, that the man in the watchtower (NSA, other government alphabet agencies) could be watching you at any given time.
Maybe not all of this is true or framed accurately - browsing the web has definitely become easier and more streamlined and for the average user, that's a good thing. But something's been lost for me - the wild west of the internet has become a guided tour.
For me the biggest change is the shift from content to consumption. 90s web was generally content first. Today's web is all about lower quality content and force-feeding consumers until they're addicted to it like crack (some massive generalisations there)
For many years I looked forward to GNU/Linux settling into one stack and offering a multimedia experience like the Amiga, Atari and Acorn, but that was in vain in the kingdom of CLI and forking projects.
Funny you should comment about screensavers, the first thing I did when I loaded the Win98 emulation these comments refer to was the display settings then clicked 'Preview' on the pipes screensaver. I believe there was also one that looked a little like Wolf32?
Back then, I wasn't so much interested in the computer itself as I was in what I could do with it - as a four year old, that meant hunting and killing pixelated T-Rexes.
I wish a kernel engineer could give a good answer to that question.
One side of the answer could be that the software that is bigger, but honestly that doesn't explain everything.
I wish somebody could confirm Wirth's law is real and that there are valid examples of it.
Curiously, major Linux distributions have also gotten significantly slower compared to early 2000s versions.
I wonder, for a thought experiment, what if companies stopped development on software when it reaches certain stage of maturity, say Windows 2000, providing only necessary security updates or optional visual changes?
New software is really not adding much to the table after some point of completion, since software companies seem to mostly add pet features and user-hostile fads, be it the start menu, complete GUI changes, 'enterprise' admin lockout, or ads and tracking.
E.g. Facebook's thousands of developers seem to add a net of anti-features to their site. On Netflix you can't even disable autoplay.
It is the same really for software moving to remote mainframes. The companies would rather hide and bury the old desktop versions deep.
EDIT: and in the case of Windows 10, adverts/trialware and their associated animations.
....how? That's handled on the GPU, and by GPU standards it's a nothing task.
There was an interesting conference video exploring this, about eighteen months ago. It was part of a project to make git faster on windows. I can't find it, maybe someone else will remember.
I've seen it working on a couple of older, non-SSD machines. But "working" is perhaps a strong term.
The experience of using the actual operating system isn't that far away from Windows 98, which ran on a few megabytes of RAM.
Definitely can feel the nostalgia.
I remember the first time I decided to let Windows 95 run overnight; the next morning, moving the mouse would send the hard drive head flying like crazy... You know the old "SHRrrrrt Shrrrt..."
It had such a memory leak overnight that moving the mouse was causing the swap to kick in non-stop!
DOS TSR programs started before Windows were still running. I had one that popped up a calculator in DOS text mode, and if you pushed its hotkey in Win95, it switched Win95 back to text mode, paused Win95, did its thing, then popped back into Win95.
Only the very basics of protected mode and virtual memory were there, and a well-behaving program had a reasonable chance of staying in its own sandbox. But only because it wanted to. Seen from the CPU, you could argue EMM386 was more the actual OS than Win95/Win98.
None of this is meant to be negative. It was a solid step up from windows 3.x, and yet quite usable with 4MB RAM of which the first 1MB wanted a very different treatment.
The quality of drivers could be a serious problem in the Windows 95 era. If it was a driver shipped with Windows, everything seemed to work fine. If the driver was provided by the vendor, the reliability was so inconsistent that I would try to find a compatible driver that shipped with Windows even if it meant missing out on some features.
Was this the pinnacle of UI design, or is nostalgia clouding our judgment?
Sort of like how desktop icons have text with a blur-extruded drop-shadow. It fades out at the point where text is no longer shown.
I think it's the latter issue that drove it to its doom.
Example tasks would be, "run a program," "find a file on disk," "save your work," "copy and paste between applications," "close a window."
Examples that would often fail include updating configuration settings or manipulating disks. These are important tasks, but not core tasks.
Restricting Linux to the mass market distros, my limited experience is they've been reasonably consistent. They can probably use Win-R to run a program. These distros mostly use the ZXCV clipboard keys, and adopt other Windows-like conventions like Ctrl-S to save and clicking an X to close a window.
Some Linux distros could fail some reasonable tasks because of poor design; e.g. I recall Ubuntu's tiny hitbox for resizing windows being very annoying.
On OS X / macOS, Apple has changed things up with features like autosave, which fundamentally changed how some applications behave. But according to my test, I'd just tell the person, "hit command S," and while it will create a new revision instead of saving the file, this doesn't do a thing we don't want.
I think the biggest lack of core consistency is in Electron apps and other non-native UI creeping in. I don't think I'd pin that on OS designers, though.
Between architecture changes (PowerPC to x86 to x86-64 and next up ARM), and compiler changes (I doubt Objective-C code written for OS X 1.0 will compile on the latest XCode), breaking API changes, security changes, and even deprecating standards (can't use latest OpenGL, you gotta use Metal)... not really.
Also, Matrix never had a sequel.
For the ones who haven't lived through the Windows 95-98 era: https://coderanch.com/t/131585/engineering/Folder-con-Window...
Sadly this install doesn't include QBasic which was also still available on these DOS based Windows versions.
My 14-year-old self would have never believed me.
That's why I made a screenshot.
Malware is quite a good study subject for this question. There's a lot of malware that won't run if it detects it's in a virtual machine, to keep researchers from testing it inside one.
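To give an idea of how crude those checks usually are, here's a minimal Python sketch (my own illustration, Linux-only, not taken from any particular sample) of the kind of heuristic such malware relies on -- looking for vendor strings and CPU flags that hypervisors expose to the guest:

    def looks_like_a_vm():
        markers = ("virtualbox", "vmware", "qemu", "kvm", "xen", "bochs")
        # On a VM, the DMI product name usually names the hypervisor.
        try:
            with open("/sys/class/dmi/id/product_name") as f:
                if any(m in f.read().lower() for m in markers):
                    return True
        except OSError:
            pass
        # Most hypervisors also set the "hypervisor" CPU flag.
        try:
            with open("/proc/cpuinfo") as f:
                return "hypervisor" in f.read()
        except OSError:
            return False

    if looks_like_a_vm():
        print("probably running under a hypervisor")

Which is also why the question below about LLE emulation is interesting: if the emulated hardware reports real vendor strings and flags, checks like this come back clean.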
Or, are there modes for emulators like qemu/bochs, where they'll run entirely with "real" (LLE-emulated) hardware?
(dunno if these would build/run on win98 though)
Works in Chromium though.
I've got to say, the icon for .txt files brought on some very specific hit of nostalgia.
But a lawsuit? Why would Microsoft care enough to launch a lawsuit?
1. Open Start Menu
2. Click "Run"
3. Type: con\con and press enter
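The Run dialog is just the convenient trigger, by the way; the bug was in the Win9x device-name path parser, so any program that merely tried to open the doubled device path took the machine down with it. A hypothetical one-liner (strictly for a Win9x box, obviously):

    # On Windows 95/98, simply resolving this path blue-screened the OS.
    open(r"C:\con\con")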