Even though I've used a Mac daily for the past decade or so, I still miss the task bar and window-oriented GUI of Windows. I still get frustrated on OSX when I minimize a window and have to hunt around for it. I wouldn't switch back because of the underlying crap that is the Windows OS and file system, but I still miss the interface.
Edit: Found this fantastic PDF "Chicago Reviewers Guide" which goes over all the new stuff in Win95. So much stuff I had forgotten - TrueType fonts, Plug and Play, registry settings, right-click properties, long file names... Basically everything that makes Windows what it is today.
(On the flip side, Windows' select-a-directory dialogue of the same vintage is such an utter piece of garbage that I can't imagine there being any overlap of designers between the two dialogues.)
I hadn't realized that KDE was copied from Windows 95. I'm surprised no one here has mentioned NeXTSTEP. Here's a demo by Steve Jobs from 1992: https://www.youtube.com/watch?v=gveTy4EmNyk
- Click the FILE tab
- Wait two full seconds while it replaces my screen with something else
- Click Save As
- Click Computer (because Microsoft wants you to do everything in the cloud)
- Click Browse
- Finally proceed to saving your document like you would be able to do immediately on any other system
A habit that I've developed from my earliest computer classes in elementary school is to save the file in the location you want as soon as it's named, so ever after Ctrl+S saves it with no hassle.
I agree it's very ridiculous.
 'Backstage UI' is the bit that looks like this: https://kiatplayground.files.wordpress.com/2015/06/save-to-o...
I would extend your argument to Windows Explorer in general.
Sometimes I feel like the developers don't want stuff to work similar to Windows. Because that would be admitting that Microsoft actually did something right.
It's not the same vintage. It's a carry-over from Win 3.0.
Windows 3.0 didn't have any standard system dialogs; these were introduced in 3.1, but that is not the main point. In Windows 3.1 the directory selection dialog was exactly the same as the file open/save dialog (Figure 8 in the article) but didn't let you choose files (and there are still places in Windows where the standard open dialog is used for choosing a directory).
What is typically used as the "standard directory selection dialog" actually isn't even a documented standard WinAPI dialog, but was originally an internal dialog of the Explorer shell (and it would not surprise me if it was not present in the original Windows 95 RTM but introduced in some slightly later version). Also, it does not select directories (i.e., a pathname as a string), but shell folders (i.e., an ITEMIDLIST).
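For the curious, that shell folder picker is reachable through SHBrowseForFolder, which hands back a PIDL rather than a path string. A minimal sketch, error handling omitted:

    #include <windows.h>
    #include <shlobj.h>
    #include <stdio.h>

    /* Ask the shell for a folder. The result is an ITEMIDLIST (a shell
       object), not a path, so it has to be converted separately. */
    int main(void)
    {
        CoInitialize(NULL);

        BROWSEINFOA bi = {0};
        bi.lpszTitle = "Pick a folder";

        PIDLIST_ABSOLUTE pidl = SHBrowseForFolderA(&bi);
        if (pidl) {
            char path[MAX_PATH];
            /* Fails for virtual folders (Control Panel and friends)
               that have no filesystem path at all. */
            if (SHGetPathFromIDListA(pidl, path))
                printf("%s\n", path);
            CoTaskMemFree(pidl);
        }

        CoUninitialize();
        return 0;
    }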
For example: I literally don't know how to save a file using Gnome's dialogue in the general case. Say the dialogue opens at /foo, and I navigate to /foo/bar using the dialogue, but then go back to /foo ("bar wasn't the right place after all"). I can't save the file there any more: "bar" will be selected, and clicking "Save" while a directory is selected will not save, but navigate instead. Now I'm in "bar" again. I go back to /foo and try to click something else, say a file. This changes the to-be-saved-file's name to the selection.
After I asked someone who uses Gnome as their main desktop they told me "that's easy: you just have to ctrl+click on the selected directory to de-select it, then you can save in the current directory".
That, my dear readers, is indeed unusable garbage.
But, I only realized it now because almost none of the apps I regularly use make use of Gnome's default file open/save dialog. Linux's extreme inconsistency has some benefits ;)
Oh, and talking about Gnome, the only reason anyone uses it is that after configuring a few basic things, you can completely ignore it, just run your apps, forget that there's an actual OS with a GUI somewhere beneath them. Heck, if even the applications running on it ignore Gnome and its "standard widgets", its biggest strength is that it can be easily ignored!
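(For reference, the dialogue in question is GTK's stock save chooser; the behaviour described above lives inside the widget itself, not in any particular app. A minimal GTK 3 invocation, as a sketch:)

    #include <gtk/gtk.h>

    /* Open GTK's stock save dialog; the selection quirks discussed
       above are the widget's own behaviour, shared by every app. */
    int main(int argc, char *argv[])
    {
        gtk_init(&argc, &argv);

        GtkWidget *dlg = gtk_file_chooser_dialog_new(
            "Save File", NULL, GTK_FILE_CHOOSER_ACTION_SAVE,
            "_Cancel", GTK_RESPONSE_CANCEL,
            "_Save",   GTK_RESPONSE_ACCEPT, NULL);

        gtk_file_chooser_set_do_overwrite_confirmation(GTK_FILE_CHOOSER(dlg), TRUE);
        gtk_file_chooser_set_current_name(GTK_FILE_CHOOSER(dlg), "untitled.txt");

        if (gtk_dialog_run(GTK_DIALOG(dlg)) == GTK_RESPONSE_ACCEPT) {
            char *filename = gtk_file_chooser_get_filename(GTK_FILE_CHOOSER(dlg));
            g_print("would save to: %s\n", filename);
            g_free(filename);
        }
        gtk_widget_destroy(dlg);
        return 0;
    }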
That's one of the most absurd and glaring usability bugs I've ever heard.
That is the most annoying part to me, and Windows does this as well. Wanting to overwrite a file is such a rare thing for me that I find it so irritating that if I accidentally click on a file name instead of a folder, I suddenly lose the file name that I wanted to save as. Usually I just cancel and start again. So stupid.
Apple have a tradition of being really bad at this. Many of the (slightly more) advanced features are completely hidden behind undiscoverable key combinations or buried in obscure corners of the UI. The slide-to-reveal pattern on iOS (now mostly fixed with augmentations) is a good example. Middle-clicking the titlebar in Finder to reveal the directory parents is another.
It's not like the Mac way saves screen real estate or anything. It's like the way Firefox lets me close a tab by middle-clicking on it. Most users don't know it's possible, but it doesn't actively harm them to have the feature there.
I really like the fact that the Windows save dialog is basically Windows Explorer, and I really miss those features when an application uses the older file save API (which gives a more Windows 95 / 3.11 interface).
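(The split, if I have it right, is in comdlg32's GetSaveFileName: modern callers get the Explorer-style dialog by default, while apps that pass an old-style hook or template without the OFN_EXPLORER flag fall back to the 3.x look. A rough sketch of the Explorer-style call:)

    #include <windows.h>
    #include <commdlg.h>

    /* Explorer-style save dialog via the classic comdlg32 API.
       Supplying a hook or template without OFN_EXPLORER is what
       drops an app back to the old Windows 3.x-style dialog. */
    int main(void)
    {
        char path[MAX_PATH] = "untitled.txt";

        OPENFILENAMEA ofn = {0};
        ofn.lStructSize = sizeof(ofn);
        ofn.lpstrFile   = path;
        ofn.nMaxFile    = sizeof(path);
        ofn.lpstrFilter = "Text files\0*.txt\0All files\0*.*\0";
        ofn.Flags       = OFN_EXPLORER | OFN_OVERWRITEPROMPT;

        if (GetSaveFileNameA(&ofn))
            MessageBoxA(NULL, path, "Would save to", MB_OK);
        return 0;
    }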
RiscOS had it right - the Open dialog didn't exist - you would open the folder and double-click the file. And the Save dialog didn't exist - the document would reduce to an icon which you would then place into the correct folder. (You can sort of do this on the various versions of macOS by dragging the icon from the title bar but it's inconsistent)
What I was interested in back then was the idea that there was a direct correspondence between the folder window and the data structure on the hard disk, and that to the user these concepts should be indistinguishable. One part of the illusion is that a folder always appears in the same place with the same window size, to give the sense that the folder is a tangible thing with permanence.
It's got a loooonnggg way to go (I never realized how many small but vital details are in a file explorer application), but it's strangely exciting to see it draw itself on modern macOS, especially on a retina screen!
And yes, the spatial UX! I'm still working through getting all of that implemented (I just completed persisting window locations/positioning days ago). I have recently been reading through some of John Siracusa's writings from the turn of the Mac OS X era, and that's been hugely insightful and helpful. The level of detail (both of the original Finder and his writings) is impressive and surprising.
Very cool UX concepts!
IIRC, you even used to be able to just swap the System folder around and have like a completely new/different install of Mac OS.
This article from 2003 lists the problems with the OSX Finder. Since it was written, it's only got worse.
I love being able to drag a file up one level in the tree without cutting and navigating to the destination to paste, or having 2 windows open to almost the exact same location.
While Miller columns certainly support that, so does the tree view in the left panel in the default Windows explorer layout.
I've been using bitCommander, which is a Windows file manager that supports (among other things) Miller columns.
I think RISC OS (?) did this. Open documents in applications had icons representing them which you could drag to the file manager to save. (And perhaps to other applications to open?) Mac OS also has (or had?) this to some extent – many document-based applications show an icon in the title bar, which is for dragging and dropping the document in question.
Even Windows has an example of catering to this way of working, in Explorer, where the folder icon in the location bar represents the current folder and can be dragged and dropped. They even have some custom behavior to prevent the window from being raised when you drag from it, so that it works more like classic Mac OS and lets you drag to an overlapping window.
BeOS did this, too; I got the impression that Windows was inspired by that, both for BeFS (NTFS) and the desktop filesystem UX.
More likely that they were both inspired by a common source.
Never mind that MS have been working on a fully database driven FS (WinFS?) for ages.
How many exploits were just a matter of tricking an extra-fancy OS dialog into popping open something it’s not supposed to, escalating permissions alongside it?
The door swings in both directions, in other words :)
This can't be blamed on UX people. It's an ancient difference between Unix and VMS and not easy to fix.
What really is a joke is a left pane that combines favorites, libraries (die, die, DIE) and disk trees. It was never usable, except for favorites.
No web browser, though. Internet Explorer 1.0 shipped with the optional Plus! pack.
I could have sworn that was all in the Plus! package too, but 20+ years of time has eroded my memory on that...
Well, I agree that there was no comparison with System 8, but not in the sense you mean. I think that the Mac back then was head-and-shoulders a better system than Windows. It might still be, but they're both so painful to use now that it's very difficult to pick a winner.
The Macintosh system was very understandable, very clean. Extensions were an easy-to-understand way to extend one's system, and easy to disable too. The window system itself was better-thought-out and less confusing than Windows's was. The Finder was much more straightforward than the Windows equivalent (was it called the File Explorer back then?). The way that the Mac associated programmes to files (with a creator code & a file type code) was much better than the extension-based naming of Windows. The way that the Mac used its files' resource fork was great.
Programming a Mac back then was very clean & straightforward. I don't think there's anything today as nice, except maybe Cocoa, maybe. Certainly not the Windows 95 API!
Extensions could easily bring down the entire system because there was no memory protection. Full OS crashes (what modern macOS calls kernel panics) were a daily occurrence for the typical Mac-using professional who ran complex software.
The window system was often difficult to understand because apps tended to use a plethora of little panel windows that could overlap even from different apps. Windows preferred large windows that contained the entire app UI, and users typically maximized them. The Windows 95 Task Bar was much better for actually keeping track of your tasks than whatever the MacOS 8 thing was.
File extensions were always a hack, but one that Apple adopted too for Mac OS X. The days of Mac's file-specific associations were numbered when the Internet happened, because Unix servers wouldn't keep track of that metadata, so you needed file extensions anyway.
Besides, the file-specific associations were often super annoying because they were created by the editor app even for exported files. You saved a JPEG file from Photoshop, and it forever insisted on launching the full Photoshop when you double-clicked on it, instead of your preferred lightweight image viewer. This would happen even when you copied the file to someone else because the association was in the file metadata.
Windows NT 4 and its next version Windows 2000 were just heads and shoulders above MacOS 8 and 9 in terms of performance, stability and usability.
(And programming in Mac OS 8... Ugh. No memory protection, no multitasking, APIs originally designed in Pascal.)
Yes, they were quite unstable. I didn't say that they were stable; I said that they were easy to understand and easy to disable, which they were: each extension had a distinct icon displayed at system boot; disabling one was as easy as dragging it to another folder; disabling all was a matter of holding down Shift as you booted.
> Windows preferred large windows that contained the entire app UI, and users typically maximized them.
As a Mac user at the time, I much preferred the multi-window mode: it meant that I could customise my desktop as I liked. The Windows single-window mode was terrible, as it meant that I couldn't layer windows properly.
> You saved a JPEG file from Photoshop, and it forever insisted on launching the full Photoshop when you double-clicked on it, instead of your preferred lightweight image viewer.
We considered that a plus at the time: it meant that different files of the same type could be opened up by different apps by default. One could always, IIRC, Save As if one wanted to change the file type — or use ResEdit.
> Windows NT 4 and its next version Windows 2000 were just heads and shoulders above MacOS 8 and 9 in terms of performance, stability and usability.
Stability, probably. Performance, maybe. But usability? Never! That was back when Apple cared about UX.
Windows 9x and maybe OS/2 were far and away the dominant OSes on actual workstations. Macs and even aging Amigas at some creative shops, some more Unix workstations at places where most people could recognize and identify the purpose of (if not actually use) a slide rule. But what I essentially never saw on any desktops was Windows NT. Lack of driver support and inability to run many business applications kept it out of that space.
But NT was the first Windows OS which supported multi-proc, correct?
NT4 was the first OS on which I could have my dual 266 Intel procs run Softimage, IIRC.
NT4, an Adaptec 2940U2W, and mirrored Seagate Cheetah drives.
You had to space them out because a stack of two cheetahs would cook the upper one to death ...
I actually recall that Cheetah problem. :-)
Mac was so set on being different that they eschewed UX tropes that were natural... and had to spend ridiculous amounts of resources trying to convince people that their way was the right way, but clearly it was not.
This, IMO, is where the “fanboi” concept evolved.
Who cares - in the long term - where the UX and UI elements came from? The point is to make machines immediately accessible to humans' creative desires, not to mold workflows to a corporate ego...
So, Apple figured out how to develop products that extend the desires of the user, but they are still struggling with the requirement from Jobs to be "different" - and Ive's perception of a common user is skewed toward "Ive has stated that this is how it should be done" type design, which I find completely ironic given that the whole "think different, so long as it's exactly how I am designing you to think" campaign is a hypocrisy that goes up to 11.
The original Mac was released in 1984 and was produced through 2000, sixteen years. OSX was released in 2000 and is now in its eighteenth year. Whilst each system has seen some evolution over time, the general metaphors and interfaces have remained consistent.
Apple have realised and internalised a core concept of GUIs: change is bad. There is a far higher cost to changing interfaces than can be gained through efficiency, and the retraining and unlearning costs are exceedingly high relative to benefits.
This is a message apparently lost on Microsoft and most of the leading Linux desktops.
Mind: I write this as someone who, whilst using a Mac presently, doesn't much care for the interface. My preferred desktop remains WindowMaker (itself based on Aqua's predecessor, NeXTSTEP), which has the key advantage of having changed hardly at all in the 20+ years that I've been using it. It's also configurable in ways I find useful, and I schlep around a configuration directory to new systems as needed.
That and a terminal window.
* MS-DOS/Windows 3.1-like
* Windows 95-like
* Windows 8-like
Windows 8 basically was born and died in a couple of years, to be replaced with Windows 10, which is very much the same basic set of metaphors as Windows 95. There's a start menu, a little dock of pinned icons next to it, a taskbar, a clock and some little icons for what are basically background processes. There's a maximize and minimize and close button on windows. There's a File/Edit/Whatever set of menus. You can right click for context items, many of which are consistent with Windows 95 20+ years ago. That basic system is now something like 21 years old.
You can maybe argue that Microsoft forgot your message five or so years ago, but they obviously rediscovered it.
Windows 3, 95, NT, and 2K each saw significant changes in where and how major system functionality was presented.
During the same period I was using numerous Unix and Linux platforms (and still do). Those have largely seen far less substantive change at the shell and system level, with a few notable exceptions.
I'm not discussing Linux GUIs, which have been all over the goddamned map. I've used twm, fvwm, fvwm2, VUE, CDE, WindowMaker (my preferred option), GNOME and KDE through multiple generations, Enlightenment, various of the 'boxes (black, open, flux, ...), ion, xfce4, ... And those are the ones I've trialed to some significant extent. I've at least fired up and looked at virtually all the options mentioned on the XWinMan page: http://www.xwinman.org/
There are several fairly central components which have changed fairly markedly. The shift from telnet to ssh, multiple iterations of firewalling, various scripting languages of preference (bash, perl, python, an oddment of others), mailers (sendmail, qmail, anything reasonably sane, mostly exim and/or postfix now), and of course, the whole init replacement clusterfuck.
But the notional concepts of files, filesystem, shell, utilities, pipes, etc., have remained consistent, and even across several utility / server replacements (particularly ssh and mailers), command-level compatibility has been preserved to a remarkable extent with previous options (e.g., rsh and sendmail syntax).
If Microsoft can only be relied on for five-year stints of "having learnt this lesson" then they have not learnt this lesson.
This includes a hell of a lot of systems configuration tools. The corrective force on failure to adhere to this norm is strong.
Hell, you can today configure Windows 10 to behave much like 8.1. The one thing I see some people miss in the 8.1 to 10 transition is the charm bar, in particular that it gave easy access to printing and such.
While Apple may retain the UI across time, they are more than willing to change APIs etc on a whim.
MS on the other hand may change the UI (though outside of 8.x, the core layout and behavior has remained much the same, and even 8.x could to a large degree behave like the older UI) but they bend over backwards to maintain APIs across time.
Microsoft was always more vendor / ISV / VAR oriented, and stable APIs matter there.
Being able to get a new computer but install from the same software library (I can hear the _sec people getting hissy already) as was used on the old one makes people more likely to pick the same "platform" over time.
The counter is that Apple caters to a smaller software development community, though several of the tools also see extensive use and support (particularly Photoshop). But there's a heck of a lot of fundamental functionality on Apple's platforms that you can get without relying on third-party software, or at least, third-party proprietary software. Given the dynamics of proprietary software markets, particularly toward adware, nagware, and malware, this seems a possibly positive development.
(I've made much the same observation in recent years about the Android marketplace, which I see as a growing cesspit, and of the Windows application space, particularly at the peak of its crapware / spyware / adware period in the decade of the 2000s.)
Linux solves the software compatibility problem by allowing for recompiling of software for which the source is freely available, for the most part. This isn't a perfect solution, and there are complex systems which tend to not be particularly forward-compatible. One possible argument is that such complex systems are themselves inherently problematic and ought perhaps be avoided. You may not agree with the argument, but I'd expect you'd admit to its existence.
Microsoft was addressing a different space, and one in which there was a massive focus on desktop-distributed client software, much of it aimed at very specific business applications. This is a major application area for computers, though it's also one that's shifted significantly toward client-server Web-based solutions (or app-based, now). Which presents its own set of features and limitations.
And again, all this is what I was hinting at earlier with noting that you'd presented a very interesting point. I'll be thinking about this for a while.
That's Macintosh all over. "You're holding it wrong" is not a new failing of Apple, nor is it exclusively post-Jobs.
Jobs was just better at convincing people that, yes, they were holding it wrong.
I agree with you that keeping the file metadata in a separate fork is far superior to keeping the file metadata in a three character extension, but sadly the world zagged on this one when Apple zigged. I especially like that you could have two files of the same type associated with different applications, so if you created a text file in your IDE it would open back up in that IDE instead of launching the word processor. And you could change it (somewhat clunkily) at will.
Win95 did have some advantages though. The Start Menu was a better organization system than Apple's Application folder for example. Macs of that era were also slow and badly overpriced.
These were not actually that different. The original Start Menu was just a menuized view of a folder hierarchy (which mostly contained shortcuts but could contain any document).
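(You can still see this today: the menu is materialized from two ordinary directory trees, one per-user and one shared. A sketch that just prints their locations, using the old CSIDL constants:)

    #include <windows.h>
    #include <shlobj.h>
    #include <stdio.h>

    /* The Start Menu is (still) just two folder trees on disk:
       one per-user, one for all users. */
    int main(void)
    {
        char peruser[MAX_PATH], shared[MAX_PATH];

        SHGetFolderPathA(NULL, CSIDL_STARTMENU, NULL, 0, peruser);
        SHGetFolderPathA(NULL, CSIDL_COMMON_STARTMENU, NULL, 0, shared);

        printf("per-user:  %s\n", peruser);
        printf("all users: %s\n", shared);
        return 0;
    }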
Or if you're going to Very Deliberately Ignore the Other OSes and go do things your own way. VDIing like that seems to be a very Apple trait.
My point is, I'm not sure how well the resource fork model could have ever survived prolonged and sustained contact with the Internet and modern pervasive networking.
Since we're talking about Win95, the 3-character limit doesn't apply (long filenames were a major new feature of this OS, after all). .jpeg and .html were relatively common at the time, for example, and worked fine.
I find the extension system still kludgy, but arguing that it was worse in part because of the limited pool is incorrect starting from Win95.
The metadata that Macs kept in the resource fork went way beyond file type and creator too. It included things like the file's icon, creation/modification information (so it would survive a trip over the Internet!), loads of stuff for applications (menus, graphics, sounds, etc...), formatting for plain text documents (so they fall back to plain text on unsupported systems), and so much more.
Fun fact: NTFS supports the concept of a resource fork on files, but almost nothing in Windows uses it. I think I've seen more malware hiding stuff in there than legitimate uses in the wild. Worse, even in the obvious case of loading a Mac file on a Windows machine, it usually fails and falls back to creating the clunky separate directory instead.
One thing I loved about old MacOS apps is opening them up in ResEdit and seeing so much of how the thing was built.
Sort of. It would be more accurate to say that each resource was identified by a 4-letter type code and a 16-bit ID. Each resource could also have a name, but that was less frequently used (and didn't have to be present, let alone unique).
More importantly, resource forks didn't exist in isolation. They were loaded into a chain of active resource files -- for instance, while working with a Hypercard stack, the resource chain would include the active stack, the Home stack, the Hypercard application, and the System suitcase. A stack could use resources (like icons or sounds) from any of those sources.
Also, that Windows feature you mention is called ADS - alternate data stream.
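You can poke at ADS with nothing but CreateFile, since a stream is addressed as filename:streamname. A quick sketch, assuming an NTFS volume:

    #include <windows.h>
    #include <stdio.h>

    /* Write and read back an alternate data stream. The stream
       travels with the file on NTFS but is silently dropped by
       FAT copies, most archivers, uploads, etc. */
    int main(void)
    {
        DWORD n;
        HANDLE h = CreateFileA("demo.txt:comment", GENERIC_WRITE, 0, NULL,
                               CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
        WriteFile(h, "hidden note", 11, &n, NULL);
        CloseHandle(h);

        char buf[64] = {0};
        h = CreateFileA("demo.txt:comment", GENERIC_READ, 0, NULL,
                        OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
        ReadFile(h, buf, sizeof(buf) - 1, &n, NULL);
        CloseHandle(h);

        printf("stream contents: %s\n", buf);  /* dir shows only demo.txt */
        return 0;
    }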
The filename extension is the bare minimum of metadata for a file and not easy to extend.
Even then Unix systems will skip even that minimal metadata and force you to messily search for magic numbers at the start of the file and make a guess.
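This is essentially what the Unix file(1) utility does, just with an enormous table of magic numbers. A toy version of the guessing game:

    #include <stdio.h>
    #include <string.h>

    /* Toy file(1): sniff the first bytes of a file and guess its
       type. The real tool carries thousands of these patterns. */
    static const char *guess_type(const unsigned char *b, size_t n)
    {
        if (n >= 8 && !memcmp(b, "\x89PNG\r\n\x1a\n", 8)) return "PNG image";
        if (n >= 3 && !memcmp(b, "\xff\xd8\xff", 3))      return "JPEG image";
        if (n >= 4 && !memcmp(b, "%PDF", 4))              return "PDF document";
        if (n >= 4 && !memcmp(b, "\x7f" "ELF", 4))        return "ELF binary";
        return "unknown (and that's the problem)";
    }

    int main(int argc, char **argv)
    {
        if (argc < 2) return 1;
        FILE *f = fopen(argv[1], "rb");
        if (!f) return 1;

        unsigned char buf[8];
        size_t n = fread(buf, 1, sizeof(buf), f);
        fclose(f);

        puts(guess_type(buf, n));
        return 0;
    }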
I completely disagree.
First of all, Windows 95 had preemptive multitasking (the Amiga was the only computer that had this at the time), while Mac OS was single-tasking with terrible scheduling, and it would be years before Mac OS gained preemptive multitasking because of terrible architecture choices that made this extremely challenging.
From a GUI standpoint, Windows 95 was a total revolution and made Mac OS look completely antiquated: rendering, scrolling speed, font types, menu items and dialogs, etc...
The Amiga was exotic hardware, like the BeBox. If you want to include those, you might as well include SGI and Sun hardware. Preemptive multitasking was common on non-PC hardware.
We're talking about consumer facing computers, and at the time, the market was pretty much only Windows, Mac, OS/2, Amiga and Atari.
I miss window shades so much.
You are in a tiny minority if you think Mac window and file management are any good. File extensions are the most pragmatic way of dealing with associations and that just about sums up the difference between Mac and Windows: Microsoft made Windows to be practical whereas Apple has always focused on style above all else.
Windows user here. Honest curiosity: does anyone know why the minimize / maximize works on the Mac the way it does? I mean, what's the rationale to design it like this?
I rarely, if ever, use Minimize on the Mac. Minimize comes from Windows (and other windowing systems) where the window minimizes to an icon or button on the task bar.
"Maximize" also comes from Windows (and other windowing systems.) As others have noted, in newer macos (which I don't use,) I think it oddly makes the window go full screen. Full screen is a recent macos feature -- I once asked about making my Application go full screen and Apple developer support said that going full screen did not follow their Human Interface guidelines. Something seems to have changed at Apple since I asked about that decades ago.
There was no Maximize on the Mac; it was called "Zoom." The idea is that the window has two sizes and you zoom between them: one is the size the user has resized the window to (often with much difficulty), and the other is an ideal compact size ("optimally fit content") without hiding anything, hopefully where scroll bars do not appear -- a UI feature that is/was rarely, if ever, done very well by applications other than the Finder.
By the way, Command-Tab to switch tasks was once an add-on from Microsoft for Mac OS. Go Microsoft! (Mac fanboy here :)
The MacOS scheme of doing this leads to a sort of organic, emergent window layout - I always end up with windows staggered to display relevant bits. With Windows (and with most X window managers), things always end up either strictly tiled or stacked.
I'm very used to the Mac way and prefer it, but that could just be the result of long use. It doesn't waste space (Windows apps always seem to have lots of dead space to me, and make me work to be able to see parts of other windows) and forces me to switch windows far less. But it is a fairly subtle thing.
My WM of choice is now Xfwm, which lets me rearrange windows easily without gaps, make windows fill available space (vertically and horizontally separately), or tile them by dragging to corners.
On OSX, things like Spectacle and Magnet sort of help.
For me, the goal is not to maximize available space to the frontmost windows; it is to maximize the use of the monitor to display what's currently relevant. It allows me to do things like keep a Finder window open with just a bit peeking through to drag things to, keep an eye on a few lines of a terminal tail -f, see the mailbox pane and room pane in Mail/Slack so I can see if anything new happened I should respond to, etc. all while working on whatever I'm working on.
With the Windows-ish "maximize the window", that is replaced, almost invariably, by useless window background.
Again, I expect this is mostly what one is accustomed to, and the Macish approach is more idiosyncratic to the user. Works for me.
Also, you can maximize a window for a moment, look at it, and then unmaximize it back to its previous size.
I use xfwm4, and turn all of the window snapping off -- unlike Mac and Windows we can customize. ;)
You will pry the Command-key from my cold dead hands:
/usr/bin/setxkbmap -option ctrl:swap_lalt_lctl  # swap Left Alt and Left Ctrl, so the key next to the space bar acts as Ctrl, like Command on a Mac
Also, if you hit the up arrow in the command-tab switcher, you can use the arrow keys to select a minimized window and hit enter to restore it.
Tiling window manager: https://github.com/ianyh/Amethyst
Programmable window manager: https://github.com/kasper/phoenix
Either that or I've gotten very much better at doing this.
I absolutely agree with your guess as to why this is.
- In the Maximized state the window is set to the maximal available size (so you are not wasting any part of the screen) while you are still provided with fast and easy access to relevant OS UI elements.
- In full screen you explicitly tell the system that you don't want to be distracted by OS UI elements (typically in situations when you know that you won't need them for an extended period of time, or if you REALLY need every single pixel of the screen).
I came from ~18 years of Windows & Linux usage, and everyone always told me macOS is THE OS with the best usability.
But I can't confirm this.
Minimization of windows is shitty, and maximization even more so.
When I maximize, often just the height is changed; when I go back to "normal", the height and the width are both changed, so I always have to adjust the width manually.
When I minimize a few windows, it's impossible to get back the right one without luck.
and if you have many windows open (in win 10 at least) you don't have to press alt+tab 10 times in a row to choose your desired program, you can hold alt+tab and use arrows.
Some days I still really miss dwm, but having Photoshop, Ableton, and several other things Just Work™ makes it worth it.
System Preferences > Mission Control > Automatically rearrange Spaces based on most recent use
Perhaps it's a setting or was the behavior in an older version of macOS?
Motherfucking maddening as hell.
And there is no proper UI element for checking if you have anything open in a different workspace without opening the switcher.
I mean, it's perfectly usable once I got the basic gestures down and installed the app that let me independently set the trackpad and mouse wheel scroll directions, but I will go so far as to say that both xfce and MATE are objectively better at window management.
Partly I think their hands were tied by the too late to change decision for the always there contextual top menu bar. Or this is just 25 years of using win95 clones talking and I'm set in my ways.
You wouldn't judge i3 or other completely different approaches after only a few minutes.
What I like the most is that you are just a super+tab away from basically everything, so focus on a single thing feels natural and right. There is no way to lose anything either; it's all there. Always.
The last time I tried it, I got frustrated when working with a lot of pdf sources - hitting the super key just presented me with a myriad of white rectangles where open windows would frequently rearrange requiring a slow manual search to find the file I was looking for. This can be less of a problem with a taskbar, as the filename is the main identifier, and being 1 dimensional it is easier to scan and preserves its position better.
Another frustration is trying to view two apps together on the screen at the same time, if one of the apps itself contains multiple windows. I'm not at my desk to try this, but let's say you have multiple Chrome windows open, all with their own tabs, and you want to view your current Chrome window overlaid on a window from another app. To do this you have to manually minimise all of your other Chrome windows one by one so they will all move out of the way, to allow you to switch between apps and view them both at the same time.
Four fingers up, expose.
Drag the windows you need up top into a new desktop.
Command (or control? Or option? Or a combination?) Plus the right arrow to switch desktops.
Try all the combinations until I get to the right desktop or throw the damn thing out a window.
Somehow it is able to open up the device manager and THEN give the window in the background focus when I switch to my editor and back to Xcode with cmd+tab.
So I have to move the Xcode window down to grab that background thing which seems to be part of Xcode.
There was no task bar or dock or anything else really to minimise to. IIRC there were addons for 7.1 that added "window shading": a button on the window title bar that reduces the window to just the title bar.
The closest thing to a maximise button in classic Mac OS was more like a size-to-fit button: the application gave the window manager a hint which was the appropriate size for the document displayed, be it a file folder, a word processor document or whatever. Having a single window fill the entire screen wasn't as common as it was on Windows.
None of this was particularly strange to me back then.
Apparently a standard feature later on: https://en.wikipedia.org/wiki/WindowShade
System 7 was WAY earlier than Windows 95; it was 1991, a little more than 4 years earlier (like an eternity):
"It was introduced on May 13, 1991 ..."
And even 7.1 was almost exactly 3 years earlier:
"In August 1992, the 7.1 update was released."
"It was released on August 24, 1995"
Not even Windows 3.1 had been released at the time System 7 came out; its competitor on the MS side was DOS (version 5.00):
or some of the various semi-graphical third party shells for DOS.
Sure it did, namely Windows 1.0, 2.10/2.11 (actually Windows/286 and Windows/386), but very few people used them.
Windows 3.0 was the first one to have some diffusion, but it had very limited capabilities, and its adoption was slow because of the increased PC specifications it required, and in any case not comparable with the later wide adoption of 3.1.
Yup! For the longest time I liked to work with a half-width browser window to match my half-width editor & word processor windows. It drove me crazy how many websites set their body text to some fraction of the window width, which looked good with a fullscreen window but terrible with a halfscreen one.
Eventually I just gave up. The whole point of the web was device-independent information transfer, but somehow we allowed device-dependence to sneak in.
You can play around with it on archive.org. https://archive.org/details/mac_MacOS_7.0.1_compilation
IMO there's a whole generation of people who did their early computing on MS Windows (including myself) and so internalised that that is how GUIs are "supposed to work". When moving to something else later in life there's a feeling that it is "wrong", but the other system's (e.g. macOS's) way of doing things is also correct, just a divergent evolution from MS Windows. Research, open-mindedness and experimentation are necessary when using something different.
It wasn't designed to 'make this window fit the screen'. It was designed to 'make this window fit this A4 document'.
Maximize on Mac was NOT designed for the window to fill the whole screen, but rather to resize the window to optimally display its contents. E.g. maximizing a Preview window with a PDF document changes the width of the window to the PDF page width. (Good tip for getting along with your Mac: stop trying to maximize everything.)
In a recent macOS version, Apple changed the green "maximize" button to full-screen, which is very different from maximize. Now double-clicking on most window chromes will execute the old maximize behavior.
The problem with this approach is I definitely do not need someone else making the decision of what is "optimal" for me.
I've been using macOS for about as long as I used to use Windows now, and at this point, macOS seems to have largely abandoned the concept, which is great. Applications get either a full screen in their own isolated context, or option-click for a full-screen in a regular windowed context. The options now are a very windows-95/98 like: "either go completely full screen, or resize to whatever you like," which gives me full control of what I find optimal for any given application.
Macs used drag and drop for file management between windows representing separate locations on disk. Windows users tended to select files and choose cut or copy then navigate to the second location and paste.
The same held true for moving content between documents in an application or moving content between applications. Mac users preferred to use drag and drop, while Windows users relied on copy and paste.
The problem with keeping every window maximized is that you're giving up system wide drag and drop as the primary user interaction method.
Personally, I find drag-and-drop handy sometimes, but it's very constraining. You have to go through non-standard motions to complete any move that is more than trivial, always holding down the primary mouse button and thereby losing your primary way of interacting with the interface. In other words, sure, if you have a clear view of your destination, then yeah, drag and drop is fine, but in all other instances, it becomes clunky.
Cut/paste is incredibly quick and doesn't sacrifice usability of your interface or input methods between the two ends of the transaction. Windows seemed to balance this out well, where you could drag and drop most of the time, but you could also ALWAYS cut/paste. I despise that I can't cut/paste in Finder. Which is why I use PathFinder instead.
The danger cut/paste DOES pose is that it fundamentally unlinks the start of the transaction from the end. In between, you can do literally anything, which may mean losing track of what's in your paste. Still, I'd call this a fair trade-off, specifically because it is non-destructive for files. You won't lose a file to a forgotten paste. It just stays put.
On Windows, Ditto, and on Linux, CopyQ (among others, and there has to be something like that for Mac) solve this problem by giving you a preview of what's in the clipboard as well as the history of copies you made.
Then they are shocked later when they paste those files into some random location and can't figure out where they went.
Dragging and dropping does not have that issue. Users find it much easier to learn.
Open the source window. Open the destination window. Drag.
And that's the crux. The whole feature (in its original incarnation) rested on the false assumption that there's a singular "optimal" state at any given time.
The Apple Human Interface Guidelines have been a state-of-the-art reference for good UI for a long time, but the part about the zoom button always baffled me, as it directly contradicted several Core Principles laid out in Part I of the book.
The reason this is done is of course that most applications don't have content that fill the entire screen, so maximizing, in most cases, is meaningless - and hinders the usability of the system.
It makes more sense to leave some space over for other apps than have a big empty area on both sides of the screen.
Interestingly, I noticed that it's only Windows-switchers who complain about this. People who've used Mac for a long time don't give this any thought.
It's a matter of where you place responsibility. It's like saying, "most websites aren't responsive, so naturally it makes sense to restrict the size of your browser window and leave space for other apps." But most would laugh at this and say it's the responsibility of the website/webapp to build a responsive layout. Why should we hold desktop applications to a different standard?
I completely agree that it is probably almost entirely Windows-switchers who complain about it. I'd, obviously, self-aggrandizingly suggest it's because we've tasted something better. People don't complain about the taste of food they've never tasted ;)
I think that's a misrepresentation. It's not a static size. It's not an artificial limit. If the document that's open has content to fill the entire screen, the window will fill the entire screen.
The macOS interface is based around floating and overlapping windows. If you put a window over the whole screen then it could be as well maximized. This gets a bit hairy on smaller screens but really shines on huge monitors. In general macOS is more optimized around having one big screen rather than a multi-monitor setup.
But seriously, minimising windows is a reflex learnt from Windows. In Windows you often need to minimise one thing to find something else. Especially the desktop. On a Mac you can usually find something more quickly in the dock. In Windows you’re far more likely to have an app maximised by default, and minimising is a natural way to switch tasks. On a Mac, minimising is not a natural way of task switching.
I use ShiftIt, a neat little open source tool that helps me manage window sizes and positions (including minimizing and maximizing): https://github.com/fikovnik/ShiftIt
I've recommended it to pretty much every Mac user I've met.
I'm actually willing to bet good money that it's the other way around and it's you who is in the minority.
I think that Apple thought that many regular users would switch to full screen apps on the Mac, combined with Launchpad (it's just like an iPad/iPhone). But virtually all non-tech-savvy Mac users that I know do not use Launchpad, nor fullscreen apps.
I think the problem with Launchpad, as with Spotlight search, is that they are not very discoverable on the Mac. Having search in an application menu (like recent Windows versions and some Linux desktops) is far more discoverable.
I guess people don't use fullscreen apps because they equate desktops/laptops to the 'WIMP' interface paradigm.
If I received a penny every time I saw even experienced Mac users launch applications by clicking on a Dock icon or by navigating to the Applications folder in Finder, rather than using Spotlight, I would be rich.
(Yes, yes. It is a lame joke. But who uses only one Window at a time? What is the point of that? Though I remember when I used OS X you could swipe left (or right) to switch back to all other apps. So I think that was good enough.)
In the olden days, Mac users would tend to have lots of overlapping windows. Dragging anywhere on the window edges would move the window, so it was easy to arrange them as you wanted - almost like shuffling bits of paper on your desktop (strangely enough) and if you wanted to move something out of the way, you could fold it up (window shade). There was no need for maximising or minimising and "zooming" just meant "make this as large as makes sense for this particular document", not "expand to take up all the space on my desktop"
As OSX/macOS has developed, all that document/desktop-style behaviour has been lost.
You're talking about the "fullscreen" feature which I always found very weird. For example, ever forget your video was "fullscreen"ed as you try to alt-tab to it only to realize it's a 4-finger swipe to pull it back up. Making the user have to differentiate will always be bizarre to me.
Eh? They still do, as of MacOS 10.13.
I recently sold several shrink-wrapped original copies of Windows 95 on eBay, the OS on 32 3.5" floppy disks, as an original piece of computer history.
Win 95 was monumental and great. Aside from Outlook and Excel, the greatest product MS ever made.
What are the use cases that bother you?
Really, the only place case-insensitive filenames make sense is when you are searching. It makes no sense for any other reason.
I also like it when an IDE detects and provides a warning when I have two variables with the same letters but different capitalization.
Different case is often a bug.
I can think of a lot of reasons to abandon Mac OS X, but this seems like a really odd one to me.
You lost me. HFS+ is arguably one of the worst file systems around (yes I know that Apple finally switched to APFS but that was fairly recent).
I'm really curious what you actually liked about Mac's file system vs. Windows?
NTFS/Windows actually has all of that stuff you want, too. NTFS's permission system, for example, is extremely feature-full and integrates nicely with the user system (ACL support by default rather than an add-on, for example). The octal user-group-all permission you're probably used to is pretty crude by comparison.
It's more likely just you're unfamiliar with it rather than it's actually missing anything.
But if you want, you can just pretend C:\ is equivalent to / and mount all your other drives at C:\mnt\whatever; that's completely doable (with a GUI to configure it if you want, even).
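(The GUI route is Disk Management; programmatically it's SetVolumeMountPoint, if memory serves. A sketch, where the volume GUID is a made-up placeholder -- list real ones with mountvol or FindFirstVolume first:)

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /* Placeholder GUID -- substitute one reported by mountvol. */
        const wchar_t *volume =
            L"\\\\?\\Volume{00000000-0000-0000-0000-000000000000}\\";

        /* The target directory must already exist and be empty;
           trailing backslashes are required on both arguments. */
        if (SetVolumeMountPointW(L"C:\\mnt\\data\\", volume))
            wprintf(L"mounted under C:\\mnt\\data\\\n");
        else
            wprintf(L"failed: %lu\n", GetLastError());
        return 0;
    }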
I’m still annoyed that APFS doesn’t have file level checksums.
OT but have you tried Witch as a task switcher? It switches between windows, which made my life SO much easier
OSX took the NeXTSTEP/OpenStep interface and dumbed it down to great detriment, then added new things back (spaces, zooming) which were logically inferior but required less 'thinking' about how one works and were more 'shiny looking' to potential customers...
IMHO hands down the best mouse-oriented window management paradigm to exist to date is the NeXTSTEP/OpenStep style, over and above Windows and OSX, though I will admit Windows has improved things with their sort of hybrid 'classic' Windows + 'Mac-ish' updates, and some of the newer UI things (e.g. window thumbnails) haven't made it into the current flagship of that lineage, which is the open source WindowMaker...
Since the 'official' lineages are dead, I am hoping the WindowMaker people continue to innovate and move this paradigm forward, as they have been doing for the last N years...
I remember these times well. It was considered a huge break. People were whining about how stupid the Start menu was compared to just seeing your apps in front of you all the time :-D I love that we eventually came full circle to a Windows 3.1 Program Manager-esque approach with iOS nowadays!
I don't remember anyone using this setup ever though. Maybe this was too obscure and technical for the users complaining.
That would not have been my first pick for an example of Win95's failures. Perhaps the daily system crashes.
I remember when unified USB support came out in Win95 2.5, it was big damn news at the time.
To me Windows is bloatware. But I also make the OSX dock as small as possible and autohidden. I launch everything through Spotlight, though, as I abhor unnecessary point and click (synonymous with hitting the Windows key and typing a couple letters of the application to launch).
I remember Windows 95 as a complete disaster of crashes, data loss, failing installations, incompatible applications, missing drivers and countless other problems which were only fixed with the release of Windows 98 (maybe even SE), which was much much better. I have memories of people sticking with DOS and 3.x, only having 95 as a nondefault boot option in case they wanted to watch the Buddy Holly video or launch the new Encarta cd-rom.
Learn to use the dock? The taskbar on Win 95 was evolutionary rather than revolutionary, and the Dock from NeXT was one of the influences. Which is the same dock we have today in macOS.
You can still have personal preference, of course. But if you have trouble using macOS to find minimized windows, that's because you haven't learned to use it not because it's not possible.
After I'm calm, I ask her why, trying to understand it from her perspective. Every time I do this, I'm always surprised, because she gives valid points, and I end up cursing the developer :D
So, whenever I design UI/UX for an app, I ask my mom to test.
Rant: In my opinion, there should be an option in Mac/Windows to disable file drag and drop. Every time I check her computer, I always find dislocated files simply because she accidentally dragged them.
I have an informal rule that I will try to get someone at my job that has never seen or used the application to be the one to test out new features or UI changes. Generally just asking when they have some time, handing them a phone or laptop, and asking them to do a task in the app (with a small amount of background about the task if needed).
There has never been a case where this hasn't monumentally improved the application. Questions like "what do I do here?", "how do I get it to start?", and "did it work?" were extremely common for quite a while before we managed to get the UI in a good state. You just don't see the implicit assumptions you make at so many places.
Sadly it's hard to "formalize" something like this (at least in my experience), because the benefits seem to be greatly reduced if the person testing has seen or used the application before, and I found it works best the "further away" someone is from software development.
We have plenty of analytics and user testing, but they tend to miss the case of users that are unable or unwilling to learn the application, and end up perpetually confused.
It's that there isn't really a way to formally gather and have people that haven't really ever used the application before, but know enough about the problem domain to be useful in testing. Not to mention that it's a non-renewable resource, once I've done it with someone for a specific part of the app, they are "burned" (at least for a few months). (that last sentence came across really... shitty sounding, but I can't figure out how to reword it to get the point across without sounding like I'm treating people like old computer parts, so I'll just add this disclaimer...)
Any kind of "formal" process ends up just looking like QA, and they end up making the same kinds of assumptions that the developers and designers do, since they work with and know the application just as well if not better than them.
Once you've learned something, you can no longer remember what it's like to not know it. Not fully, anyway, and certainly not without deliberate effort. It takes quite a lot of mental effort to stop knowing something, and approach a task with the mindset of someone who's never known it.
Good QA people should be able to do that, but it must be hard even for them.
She also reminds me that she used to work at Digital (DEC), on the cutting edge of tech, in the 80's, but left her job to raise me- so I'm the reason she's so far behind, technically!
That would save SO MUCH HASSLE. Great idea.
You don't like the metaphor of a folder, but you'll accept the metaphor of what goes into a folder?
I too prefer the term "directory". Apart from all UNIX tools using this terminology, it also more accurately reflects what it actually is - not a physical container for information, but a list of references to it - an association of names to addresses. A paper file cannot be in more than one folder, but it can easily appear in more than one directory. On a filesystem that allows hard links, this is a more useful conceptualization.
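A tiny illustration of why "directory" fits: with a hard link, the same file genuinely appears in two directories at once, something the paper-folder metaphor can't express. A sketch using the POSIX link() call (the archive/ directory is assumed to already exist):

    #include <stdio.h>
    #include <unistd.h>

    /* One file, two directory entries: both names map to the same
       inode, so neither directory "contains" the data itself. */
    int main(void)
    {
        FILE *f = fopen("notes.txt", "w");
        if (!f) return 1;
        fputs("hello\n", f);
        fclose(f);

        /* A second name for the same file, in another directory. */
        if (link("notes.txt", "archive/notes.txt") != 0)
            perror("link");
        return 0;
    }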
Yes, but "directory" doesn't describe the way the UI appears to behave in a way that's more intuitive to the average non-technical end user than "folder." A "directory" could also describe a map or a phone book for most people.
That rather begs the question though, doesn't it? To a nontechnical user, the ship has sailed - everyone calls them "folders", the icons are ubiquitously pictures of file folders, and it would be terribly confusing to try and change it now. That doesn't mean it was the right choice though. I could imagine a world where the icons were little phone books, and the concept of a hard link didn't require a half-hour explanation.
But in a filing cabinet system you have different cabinets, different drawers, different hanging files and various lower level arrangements - non-hanging folders (or drawer dividers), paper clips, staples.
Perhaps they never saw a filing cabinet? Or never put something in a box, then in another box?
When they put food in a cupboard do they empty it out of the current container first?
"Your house is a drive, every room is a 'folder', every container in every room is a 'folder', every container inside another is a 'folder', every object is a 'file'."
I've come across it, just assumed that no-one explained the metaphor to them.
If it was this simple, there would be no people who struggle to understand recursion. And I know of quite a few.
Then there are your Personal folders... which are in fact collections of other folders themselves, but appear transparently to be folders - yet somewhere on the file system they are actually different folders. But without the benefit of breadcrumbs to help you keep your place in the structure, where did that file actually go?
Don't get me started on shoehorning OneDrive (or was it OneDrive for Business?) into the mix.. where'd I save that file again?
That's why my desktop is so cluttered!
That's a specific failing of Windows, not something inherent to the folder metaphor, though.
It's one of those absurd things that happen when nobody pays any attention to coherence. That particular thing has driven me BATS for years.
I'm encouraged, though, that my Win10 machine here doesn't seem to continue this foolishness. WinExp used to always show the Desktop as the "root" of everything, but it doesn't now. The quick access pane shows This PC as the root, with quick links below it for Desktop, Documents, etc. It's almost like MSFT is learning.
"Directory" I don't understand at all. The only thing I can think of is the felt letter boards at the entrance to office buildings that say "DR SMITH - OFFICE 555". Not only are those a more obscure metaphor, but I've never seen them nested even one level deep. I don't have any idea how I'm supposed to use that metaphor. "Create directory": am I creating a new building? Is my building going to have two directories in the lobby? How is one 'inside' the other?
I was completely lost on Unix until it was pointed out to me that "directory" was just its weird name for folder.
Directory is a UNIX thing.
On AS/400 (now IBM i) there are no directories or folders, rather catalogs.
Similarly other systems also had other denominations for group of files.
However, even UNIX later adopted the folder designation for its GUI variations.
There are plenty of other examples of folder use, so it isn't something that Microsoft just decided to invent for Windows 95.
It's highly relevant to the discussion because it illustrates just how deeply ingrained these basic UI expectations are. UI (re-)designers take note, in some cases even decades of forced retraining won't make everybody accept gratuitous change.
However, in the Windows shell a "folder" is not necessarily a "directory" - it's just something that may contain something. Control Panel, for example.
A directory is a type of folder.
However directory was also a borrowed term (and to be honest folder kinda makes more sense)
And of course, backups weren't properly tested and didn't work.
Further what is the chance that anyone who can't drag and drop would be able to find the setting to turn off drag and drop?
Especially older people have problems with their motor skills, so instead of clicking on a file they often perform a drag & drop (they can't hold the mouse steady enough, so the system recognizes movement, and at the same time their "click" is too slow, so the OS recognizes it as holding the button down).
Right click is evil, keyboard shortcuts are evil, drag and drop never occurs to them. Mom usually just reads her emails (and attachments open in the browser nowadays), so it's not a problem, but father saves files.
If he finds something important then he saves it multiple times, so in the file list view there will be multiple similar items, that's how he knows it's important (visually it's very distinctive). When he has to copy a file somewhere else he usually fires up his CAD software, opens the file and then saves it elsewhere.
I just gave up "educating" them (after countless attempts). It's pointless. I remove file duplicates every half a year, copy stuff to a USB key when he asks me (I visit them once a month) and avoid the machines like the plague, unless they explicitly ask me to do something (I installed Ubuntu for mom, which works wonderfully for her, one less problem to worry about).
If you're using RDP and copying files from the host computer to the remote computer, it's natural to cut/copy the file on the host and paste it on the remote. Drag/Drop isn't as natural because you have to mess around with the RDP window size and scroll position to be able to see Explorer on both machines.
The file copy occurs very slowly and requires a lot of memory. Copy/paste uses the clipboard, so the entire file gets read into memory before it can be written to the remote filesystem. It goes slowly because there is apparently an unoptimized loop copying bytes.
Instead, on the remote system you should open two Explorer windows, one for the destination and one for your local filesystem, which RDP adds for you. Then you can drag & drop the file you want to copy. This skips the clipboard and also seems to use a much better optimized byte-copying loop. The speed difference is very noticeable.
Pro-tip: if you're copying lots of files, zip them first and copy the zipped file. That can reduce a multi-hour copy operation into a couple of minutes, even if the compression ratio is awful. Windows is really, really bad at multi-file copy operations.
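A rough sketch of that zip-first trick in Python (the paths and folder names are hypothetical; \\tsclient\C is the share that RDP's drive redirection exposes for the client's C: drive):

    # Bundle many small files into one archive, then copy the single
    # archive across the RDP-redirected drive; one big sequential copy
    # avoids Windows' per-file overhead even if compression gains little.
    import shutil

    src_dir = r"C:\data\project"                       # hypothetical source folder
    archive = shutil.make_archive(r"C:\Temp\project", "zip", src_dir)
    shutil.copy(archive, r"\\tsclient\C\Transfer")     # hypothetical destination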
1. Cut a file.
2. Paste it somewhere.
3. Hit control Z.
4. Cry because your file is gone.
But it seems I cannot reproduce that in the current build of Windows anymore. So I think they finally fixed it.
And indeed, most of the "gosh I broke the computer, all my files are deleted" calls from my grandma are caused by accidental drag-and-drop.
Fortunately you can do 99% of file management in Windows with the keyboard, either through cut and paste or the Explorer menus.
Are you a Mac user? Apple disabled Cut in Finder ages ago, basically for aesthetic reasons. It's a constant irritation with my Mac that I have to manually change the folder view just to move a file up one level; on Windows you can do it in less than a second with Ctrl-X, Backspace, Ctrl-V.
1. Copy the source file with Cmd-C
2. Browse to the destination
3. Move with Cmd-Opt-V
It's a UI design decision to avoid Cut in the filesystem. Cut would behave differently from Cut in every other app -- namely, the file is not deleted immediately after Cut. It seems better to avoid calling this "Cut" than to have it behave inconsistently from other apps.
Most people I've seen use Ctrl-X/Ctrl-V, or right-click-Cut/right-click-Paste. It has the advantage that you don't need both the source and destination visible at the same time.
For example, you can begin dragging a file, then alt-tab to another window or tab or even open a new program or whatever using the keyboard, and then finally release the file into the destination window.
It's not obvious that this would work, but it is convenient if you find yourself wanting to drag something with the mouse for some reason.
It also gets in the way when trying to use Windows via a touch screen.
I still have a soft spot for vanilla HTML. When I can hit view source and read the whole article (minus images) without much difficulty, I'm glad.
For its time it was a great design that was intuitive to understand, relatively lightweight and did not get in my way. About the only changes I think improved things notably were the search field in the Start menu and Aero Snap.
So, it's also an ad delivery machine.
Cortana won hands down by actually telling a joke (it was about dirty laundry, but it was ambiguous enough that I think Cortana actually "got it").
OTOH, I feel really weird literally talking to a computer, and I don't think I will ever overcome that.
I only use Windows at work, and my work computer is sufficiently slow that I have not had a problem with that. Maybe it is the version, too, my work laptop runs Windows 7 - and will hopefully continue to do so until Microsoft stops supporting it; as far as Windows goes, Microsoft really nailed it with Windows 7.
(I have to admit, though, I have only very little experience with Windows 10; I really disliked Windows 8/8.1 and the corresponding server versions because Microsoft butchered the UI.)
After 12 years of macOS I recently got a Windows 10 machine. There's plenty of Windows 10 bling on top of the OS, but you don't have to dig deep before you encounter the embarrassing remnants of very early versions of Windows. Running the latest version of 10, it still feels very unfinished, which I hope Microsoft intends to do something about.
I don't have too high hopes though, considering it was released two and a half years ago.
But I think the biggest thing I miss is the start menu being just a view of a folder hierarchy. The Windows 10 Start Menu is tied into the app store and uses some kind of database that can easily get corrupted and cause it to stop working seemingly at random. Sometimes performing the right voodoo magic can fix it, but usually it means an in-place upgrade (aka reinstall). Go on, search Google for "Windows 10 start button doesn't work". Fun reads.
It's an example of how complexity meant to make things easier often just makes them rigid and unfixable when shit inevitably goes wrong. The original start menu was so stupidly simple that it was almost impossible for anything to go wrong in the first place, and if it did, it was easy to reason about because it was simple. I miss that kind of design in my software.
* It is flush bottom right with the screen, making it a true "mile button". In 95, there was a tiny non-clickable border around it, so if you just slammed your mouse into the corner, it would not work; someone who worked on Mac OS described MSFT as "narrowly snatching defeat from the jaws of victory" compared to the mac's menu bar, which was perfectly flush to the top of the screen and thus a "mile high bar". I think this was actually done with XP?
* After clicking it, you can immediately type the name of the program you want to run, instead of having to click the "Run" entry. I switched off of Gnome onto Windows after learning that this aspect of my workflow wouldn't need to change. I think this was actually done with Vista?
* Subfolders are opened by clicking instead of hovering. Nothing makes me more frustrated than accidentally mousing over the wrong part of the menu and closing a sub-sub-folder. [hyperbole]Whoever invented mouse-over menus should be shot.[/hyperbole]
XP has a hack where if you click the bottom row of pixels on the screen, your mouse cursor is magically moved up enough pixels so you hit the buttons instead of the dead area.
Why click anything at all if you're going to be typing anyway? Pressing Win + R has worked since the beginning, taking you directly to the "Run" dialog.
> mousing over the wrong part of the menu and closing a sub-sub-folder
The classic start menu (since Windows 98, or Windows 95 with IE 4) allows you to easily rearrange the entries by drag and drop. If you organize it such that it doesn't have any sub-sub-folders, it works much better. :)
On the other hand, if you just let things stay where the installers put them, it's pretty terrible. Each application takes something like 5 clicks: Start -> Applications -> SomeVendor -> SomeApplication -> SomeApplication. I guess this is the way most people had it, so that's why Microsoft gave up on structure and focused on search.
Even fewer keystrokes, and less typing with Win 10: hit the Win key, start typing Word (or settings, or mouse, or whatever), hit enter. The Start menu pops up and is searched/filtered as you type. Compare this with the previous option (which still continues to work as well): Win key + R, type winword, hit enter. Not much in it with this example, but it's a neat way to access Start menu items rapidly.
For reasons I haven't yet explored, it occasionally fails to find an item, which is more puzzling than annoying. An item can be right there, and the keystroke search fails to find it. I miss the simplicity others have mentioned of the 95 Start menu simply being nested folders.
Might depend on what particular parts of the OS you come across. I use Win10 on my work notebook* and, besides the long startup times**, I haven't noticed anything "unfinished". In fact, Windows very subjectively feels more polished than macOS, which I used before (until December).
Then again, I don't use much of it: I only run Firefox, Slack, Outlook, and VirtualBox where the actual work is going on in a Linux VM.
* Probably not the "latest" version though. I get to use whatever our corporate update server hands out.
** Compared to my private Linux machines. macOS also took several minutes from turning on the machine to everything having fully settled in.
I am a Linux user myself; configuration inconsistency gives me nightmares.
Yes. And another really mind-boggling example is the Control Panel. There are actually two of them (or more, depending on how you count). We have both the old Control Panel and the new Settings user interfaces for configuring various settings in the operating system. Some configuration options appear in both Control Panel and Settings, and some are only available in one of the two. That's a UX fk up if you ask me.
Imagine that situation in macOS, there being two System Preferences apps with completely different looks. They would both have some commonalities, but many options would only show up in one of the apps. Would. Not. Happen.
However it is still a lot cleaner and more consistent than win 10. Agreed.
The OS/2 Workplace Shell did things quite differently:
* Applications minimized to a special folder, which was located on the desktop.
* No start button or task bar (they were added in OS/2 4.0).
* Shredder on the desktop (it did not offer restoring files, unlike the Mac Trash or Windows Recycle Bin).
* Hierarchical folders on the desktop that could contain either shortcuts or files.
* Shortcuts couldn't get "broken" as long as you did all of your file management through the Workplace Shell.
* Folders and file types could be subclassed in various ways to change their behavior and appearance. Simple changes didn't require programming.
* You could mark a folder as a project, and all the programs and files associated with the folder would open/close/hide along with the folder.
At the time, I felt that the Workplace Shell was immensely superior to the Windows 95 desktop. But it probably was quite a bit less friendly to new users.
Even though I jumped on the Mac OS X bandwagon from the very first moment in 2001 and was happy to leave the Windows world behind, the fact remains that for a few years time in the mid 90s, Microsoft showed a strong ability to design GUIs that were easy to use, relatively consistent, and flexible enough to suit a large array of first and third-party application designs. It's a shame that, IMHO, Windows XP took things in a highly negative direction after that, and Microsoft never fully recovered. With the possible exception of Windows 7, every OS release since XP has been a mishmash of competing ideas and confusing discrepancies, and macOS has continually outpaced Windows in usability.
I still hold out hope that there's a solid future for Windows when it comes to UX/UI design, if only because I want macOS to have real competition on that front.
Apple later pioneered the notion of making products so radically simplistic, they sacrificed functionality on the pretense that power users wouldn't be bothered by the absence of features. There was almost no customisation. You got what you bought and liked it. Finder, the worst file manager I've ever used as a developer, endlessly gets in your way in the name of not letting grandma mess up her system.
Microsoft later badly copied Apple, and the result is the horrid and unusable "Metro" style applications.
After seeing some old demos of Xerox Star, I'd question that.
Looking at Windows 95 user interface today, it becomes evident how iconic that UI was. No fluff, a pure joy to use.
As far as UI/UX is concerned:
Windows peaked with Windows 2000.
MacOS with OS9. (Why didn't they just throw the classic GUI on top of Darwin?)
At least with *nix you have choices and can go with one of the several variants of Gnome 2 (Xfce, et al).
Win7 was still internally consistent (as opposed to the newer tablet / mobile / PC / washing-machine UIs of later versions) and provided few additional UI enhancements over 2k.
The most internally consistent design was that of NT 3.1 - it was a true classic in many respects. As far as general usefulness, performance, and versatility nothing can compare with Windows 10 (except Linux, of course).
Starting with Win8, it's basically random whether a given setting lives in the "classic style" dialogs or the new fancy "settings" dialogs. You can have all the performance in the world, but it's wasted if your users spend most of their time just looking for the right place to do something.
For example, the useless user settings in the control panel and the more useful old version in "control userpasswords2".
In Windows 10 you have two different kinds of UI and settings that can be changed in one but can't be changed in the other and vice versa.
I spent a lot of time with Windows 2000 back in the day.
Windows 2000 was the first OS I bought retail -- my new PC came with Windows ME and it was so terrible I ran out and actually paid for an OS ;)
(I've been so GNOME-averse I appear to have entirely missed that.)
Honestly, this is probably the best I can provide:
As a user I would say that Recycle Bin is a misleading name, because it has nothing to do with recycling a file or folder. However, it sounds more positive than the weird Wastebasket.
Meanwhile classic Mac OS already had a Trash. Simple, clear and short.
I wonder why Windows could not simply name it Trash? Could it be that they tried to stay away from copying as much as possible?
On the contrary, I'd say recycling is a much more fitting name for what happens when a file is deleted: The existing disk space is reclaimed to be reused for something else.
Apple probably also has a pretty solid design process.
Suddenly, faced with hyper-spy mega-corps, the dumb simplicity of the evil-yet-cute Windows 95 is desirable. Like the lesser of two evils, or the evil you know.
Any day now, a post will come up extolling the illumined joys of mainframe COBOL programming.
There were many, many things wrong with Windows 95 (and its successors). But the design of the Shell was solid.
(Also, considering how hardware requirements have skyrocketed, it might seem remarkable to some that it could run on a computer with 8 MiB of RAM and a CPU that makes today's low-end mobile phones look like supercomputers and still feel snappy.)
I know people used to joke about using the "Start" button to shutdown the computer, but I never understood what is so funny about that.
If you think of shutdown as a "stop" command to the computer, you see the contradiction: to stop your computer, press start.
1) Windows 95 is a good UI design, better than even CDE. Heck, Win3.1 and CDE shared a lot of stuff.
2) ICEWM was golden back in 2001. Guess what it mimicked.
In any case, to bring it back: it was good enough. Nowadays, I often hope for a minimal Windows 10 that stays out of the way enough to approach Windows 95.
Your 8 MB was the "recommended" requirement for 95, with 4 MB being the minimum. That wasn't pretty: anyone who used 95 with only four megabytes of RAM and a small, slow hard drive learned that the minimum requirements were well below comfortable, probably set that low to more closely match what was actually a typical home computer at the time, some 386/486 with 4 MB RAM and a small hard drive. Where I'm from, Windows 95 pretty much meant getting a new PC for the average consumer.
So you go home with your newly bought copy of Windows 95 to your 386 with 4 MB that you bought 1-2 years ago, perform the minimum installation to your 100 MB hard drive and find out that it's super slow and constantly using virtual memory making the loud disk sound like a Geiger counter throughout the session. You compare it to Windows 3.11, DOS, whatever you had before and have a pretty solid basis for complaining about its resource usage.
Or worse, you read about Windows 95 and decide to finally sell your increasingly irrelevant Amiga 1200/3000/whatever now that you can also have preemptive multitasking on a PC, buy a cheap PC matching the Win 95 requirements with the money, and install. Only to learn that it's 100x slower than Workbench, BSODs nearly as often as the Amiga threw Guru Meditations, and uses megabytes of RAM instead of kilobytes.
Or you have a Macintosh, couldn't care less about how exactly multitasking is achieved, and wow, PC seems like a nice option now that that too has a nice, user friendly GUI. And it works on cheap, affordable hardware! So you buy the cheapest, most affordable hardware that'll support Windows 95...
Hmm, I wonder if that should be reworded to “and then we saw RISC OS and it had a task bar design that we really liked”. I can’t believe that they wouldn’t have known about it.
Compared to Firefox: https://www.openhub.net/p/firefox/analyses/latest/languages_...
12'000 people create Uber's app.
I always thought of iterative design and development as becoming popular starting around 2001, and usability studies only becoming popular around that time too.
> the design documented in the spec was suddenly out of date. The team faced a major decision: spend weeks changing the spec to reflect the new ideas and lose valuable time for iterating or stop updating the spec and let the prototypes and code serve as a “living” spec.
> After some debate, the team decided to take the latter approach. While this change made it somewhat more difficult for outside groups to keep track of what we were doing, it allowed us to iterate at top speed. The change also had an unexpected effect: it brought the whole team closer together because much of the spec existed in conversations and on white boards in people’s offices. Many “hallway” conversations ensued and continued for the duration of the project.
Extremely easy to use, extend and navigate; at the time its only missing piece was a built-in "file explorer", but there were also many 3rd-party options by then (Directory Opus being my personal favorite).
Looking at this retrospective, I can see how it could be possible, especially if they started design in 1992. NeXT had been winning praise for their UI for years by that point, and Microsoft were consulting with Susan Kare (an Apple alum and NeXT employee).
If you compare Nextstep 3.3 and Windows 95 or NT, you can see startling similarities in title bar size and format (to the pixel), window borders, 'rectangularity', tabbed elements and more. "Great artists steal" and all that..
This is a lesson the Android team re-discovered decades later, resulting in Android dropping the "Menu" button. Apple still hasn't gotten the memo yet (3D Touch). The biggest usability negatives with context-menus are poor discoverability and inconsistency in different contexts.
An easy win would be to use h-entry categories for the tags in the document.
If you're writing stuff on the web, please consider adding some microformats to make your stuff more accessible and archivable.
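For the unfamiliar, a minimal sketch of what that could look like (h-entry, p-name and p-category are the standard microformats2 class names; the title, tag names and URLs here are placeholders):

    <article class="h-entry">
      <h1 class="p-name">Some Article Title</h1>
      <!-- each tag link becomes a machine-readable category -->
      <a class="p-category" href="/tags/windows95">windows95</a>
      <a class="p-category" href="/tags/usability">usability</a>
    </article>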
I would love to see a similar Windows 8 document, because I have no idea how they convinced themselves that non-tablet users would like it. Things like charms and even the freaking start button were hard to discover.
// Run this in the browser console to reset the page's font weight to normal:
document.body.style.fontWeight = "normal";
The original Start menu was in a corner of the screen but without the crucial zero pixels of separation from the physical corner, turning what could have been a massive target into a tiny one. (Fixed in XP though.)
Menus were sluggish as hell to open, and they lacked hysteresis to make diagonal traversal a lot easier. Also, there was no consideration for how to handle a task that might take awhile, such as locating the names and icons of dozens of items; the menu would not show anything, you’d just wait. Sadly they were experts at efficiently making menus go away so one accidental mouse movement and you start all over.
The ordering of frame buttons, “minimize, maximize, close”, on Windows does not clearly separate the most-dangerous action from the least-dangerous action, nor are the actions ordered by similarity. Instead, a very common action on Windows (“Maximize”) is right next to the most dangerous and polar opposite action (“Blow This Away Forever”), with zero pixels of separation. On the Mac, the order is “close, minimize, maximize”: if you mis-hit Minimize on a Mac while moving toward Close, the window will still go away (more or less what you wanted) instead of becoming gigantic and still visible (polar opposite). Also, on a Mac there is significant pixel space between the distinct options so it is harder to mis-click.
The ordering of dialog buttons, such as “OK, Cancel”, meant that it was not possible for memory to take over. In some dialogs “OK” was in the position that Cancel would be in, in others it wasn’t. Also, Windows tended to have very generic names (Yes, No, Cancel), probably because entire APIs for opening messages had only those options; this required reading every word of a long-winded message to understand the options, rather than just clicking something obvious like Save.
Windows 95+ tend to add hierarchy in lots of places that don’t benefit at all from hierarchy. I hate having to remember some obscure vendor’s name so I can find “Unnecessary Company Name, Inc. >> Unnecessary Product Suite Name >> App Name” in a menu for example, when “App Name” in a flat list is the only sensible option. (Fortunately, Search was a reasonable way to avoid this. Until it became slow and couldn’t actually find things that clearly exist.)
For users who can't grasp that minimized windows become icons on their desktop, or that directories can themselves contain directories, having everything switch to a new workspace will look like either "I just broke it" or "it just deleted/erased/destroyed everything I was working on" (which of the two depends on whether they blame themselves or the machine).
Sadly, the result is that those of us advanced users who do understand workspaces and can make productive use of them were left without them until Win10.
And don't forget MDI, best shown by the Windows 3.1 Program Manager: a MDI program would have several inner windows within it, and each of these inner windows could be resized, moved, minimized or maximized, all within the boundaries of the main program window.
The Norton Desktop was an alternative shell that provided, even on Windows 3.1/3.11, roughly the same UI as the Mac OS of the time, similar in some respects to the later Windows 9x one.
PC Tools for Windows was another alternative shell. Finicky and prone to crashes, but better UI design than the Norton Desktop.
Windows 95 obsoleted them both.
I think of it as a poor man's alternative to workspaces.