Designing Windows 95’s User Interface (socket3.wordpress.com)
568 points by LaSombra 10 months ago | 394 comments



It's hard to remember, but even though Windows 3.11 was extremely dominant at the time, it was by no means assured that Windows 95 would be the success that it was. The very first version missed wildly in some big ways (MSN was a folder integrated into the desktop, for example, and no TCP/IP support [*Edit: yes there was - I misremembered.]), but the core, underlying redesign of the GUI was so profoundly good it propelled Microsoft into a new level of ubiquity. Compare it to other GUIs at the time, like CDE, IBM's Presentation Manager, or even Mac OS 8, and there's no comparison. Windows 95 solidified Microsoft's dominance, but it could just as easily have eroded it had they dropped the ball.

Even though I've used a Mac daily for the past decade or so, I still miss the taskbar and window-oriented GUI of Windows. I still get frustrated on OSX when I minimize a window and have to hunt around for it. I wouldn't switch back because of the underlying crap that is the Windows OS and file system, but I still miss the interface.

Edit: Found this fantastic PDF "Chicago Reviewers Guide" which goes over all the new stuff in Win95. So much stuff I had forgotten - TrueType fonts, Plug and Play, registry settings, right-click properties, long file names... Basically everything that makes Windows what it is today.

http://tech-insider.org/windows/research/acrobat/940601.pdf


The one thing I don't understand that Mac has never adopted is being able to use open and save dialogues as mini file explorers (to move stuff around and rename, specifically). Having to switch to Finder to move or rename a file that has the same name as the file I'm trying to save is ridiculous. Of course I never need to do this anymore since I only work on text files under revision control, but it still seems odd to me that it was never introduced. I really just miss Windows Explorer A LOT since moving to Mac. I don't hate macOS, but Finder is a bit of a joke.


To be fair, Windows' file open/save dialogues are so far ahead of everything else that the competition seems like unusable garbage to me. I'm glad KDE/Qt chose to emulate these very closely on Linux. I wouldn't want a desktop where my only choice is Gnome's take on this.

(On the flip side, Windows' select-a-directory dialogue of the same vintage is such an utter piece of garbage that I can't imagine there being any overlap of designers between the two dialogues.)


@blattimwind: "Windows' file open/save dialogues are so far ahead of everything else that the competition seems like unusable garbage to me. I'm glad KDE/Qt chose to emulate these very closely on Linux"

I hadn't realized that KDE was copied from Windows 95. I'm surprised no one here has mentioned NeXTSTEP. Here's a demo by Steve Jobs from 1992: https://www.youtube.com/watch?v=gveTy4EmNyk


Looking at this, NeXTSTEP seems ridiculously ahead of its time.


It’s awesome that his goal has always been to allow “mere mortals” to be able to use computers. I can’t believe it’s from 1992. Why is it still so hard to build a database powered app in 2018?


I forgot how good Steve Jobs was at demoing.


Had he grown up anywhere else, with any other friends, he would have been one hell of a car salesman...


Windows (Word) workflow:

- Click the FILE tab

- Wait two full seconds while it replaces my screen with something else

- Click Save As

- Click Computer (because Microsoft wants you to do everything in the cloud)

- Click Browse

- Finally proceed to saving your document like you would be able to do immediately on any other system


To remedy this: File > Options > Save > Save to Computer by Default (yes, agreed that this is ridiculous).

A habit that I've developed from earliest computer classes in elementary school is to save the file in the location you want as soon as it's named, so ever after Ctrl+S saves it with no hassle.


Word 2016 also has "Don't show backstage while opening or saving files" in the same options dialog, which basically hides the Backstage UI[1] unless you specifically invoke it from the File menu, and shows a plain old Open/Save dialog instead.

I agree it's very ridiculous.

[1] 'Backstage UI' is the bit that looks like this: https://kiatplayground.files.wordpress.com/2015/06/save-to-o...


Press F12 for Save As in Word and Excel.


Reasons like this are why I still use Office 2010, and I will probably cling to it as long as I possibly can.


Absolutely true. I use a MATE desktop too; I can't even rename files in the save dialog. And I don't want to look it up either, because it is so inconsistent all the time...

I would extend your argument to Windows Explorer in general.


Gnome's save dialog allows you to rename or delete files. Maybe MATE should take some inspiration from the newer version of their desktop environment...


I don't really care whose fault it is, or which DE allows this. Such basic stuff MUST work.

Sometimes I feel like the developers don't want stuff to work similar to Windows. Because that would be admitting that Microsoft actually did something right.


> Windows' select-a-directory dialogue of the same vintage

It's not the same vintage. It's a carry over from Win 3.0.


It is not.

Windows 3.0 didn't have any standard system dialogs; these were introduced in 3.1, but that is not the main point. In Windows 3.1 the directory selection dialog was exactly the same as the file open/save dialog (Figure 8 in the article) but didn't let you choose files (and there are still places in Windows where the standard open dialog is used for choosing a directory).

What is typically used as the "standard directory selection dialog" actually isn't even a documented standard WinAPI dialog, but was originally an internal dialog of the Explorer shell (and it would not surprise me if it was not present in the original Windows 95 RTM but introduced in some slightly later version). Also, it does not select directories (i.e. a pathname as a string), but shell folders (i.e. an ITEMIDLIST).


Because that is extremely non-intuitive. It's an "Open" and "Save" dialog. That is what it should do. Joe Public is not going to know it does anything else, yet it does.


Actually, you completely missed my point. The Windows dialogue is superior not only because it allows power users to do what they need, but also because it's actually usable for regular users.

For example: I literally can't know how I can save a file using Gnome's dialogue in the general case, e.g. if the dialogue opens at /foo, and I navigate to /foo/bar using the dialogue, but then go back to /foo ("bar wasn't the right place after all"), I can't save the file there any more. "bar" will be selected. Clicking "Save" while a directory is selected will not save, but navigate instead. Now I'm in "bar" again. I go back to /foo and try to click something else, say a file. This changes the to-be-saved-file's name to the selection.

After I asked someone who uses Gnome as their main desktop they told me "that's easy: you just have to ctrl+click on the selected directory to de-select it, then you can save in the current directory".

That, my dear readers, is indeed unusable garbage.


As a long-time Gnome user, I've only realized how bad it is after reading this comment ;) (No irony here, I agree it's truly horrible now.)

But, I only realized it now because almost none of the apps I regularly use make use of Gnome's default file open/save dialog. Linux's extreme inconsistency has some benefits ;)

Oh, and talking about Gnome, the only reason anyone uses it is that after configuring a few basic things, you can completely ignore it, just run your apps, and forget that there's an actual OS with a GUI somewhere beneath them. Heck, if even the applications running on it ignore Gnome and its "standard widgets", its biggest strength is that it can be easily ignored!


And watch them take that as kudos...


your irony stack is too deep for hn, it might overflow soon


> Clicking "Save" while a directory is selected will not save, but navigate instead.

That's one of the most absurd and glaring usability bugs I've ever heard of.


> I go back to /foo and try to click something else, say a file. This changes the to-be-saved-file's name to the selection.

That is the most annoying part to me. Windows does this as well. Wanting to overwrite a file is such a rare thing for me that I find it irritating that if I accidentally click on a file name instead of a folder, I suddenly lose the file name that I wanted to save as. Usually I just cancel and start again. So stupid.


Whereas I would find it irritating if it didn't have that behaviour. Sure it's not the most common operation, but it's still quite common to want to overwrite an existing file.


This is the excuse that's always used, but I rarely see it play out in practice. Many people don't notice advanced features that are present, that's true. But since they don't notice them, it doesn't matter. The people who do notice can easily discover advanced features by exploring the interface.

Apple have a tradition of being really bad at this. Many of the (slightly more) advanced features are completely hidden behind undiscoverable key combinations or very hidden features. The slide-to-reveal pattern on iOS (now mostly fixed with augmentations) is a good example. Middle clicking the titlebar in Finder to reveal the directory parents is another.


In Windows's save dialog, right-clicking on a file gives me the same list it does in Explorer. In Mac's save dialog, right-clicking on a file does nothing.

It's not like the Mac way saves screen real estate or anything. It's like the way Firefox lets me close a tab by middle-clicking on it. Most users don't know it's possible, but it doesn't actively harm them to have the feature there.


I have similar problems with the Finder. Right-click a file on the Desktop and you have the option to compress a file. Open the Finder to the desktop folder, that option isn't present -- it's grayed out on the Finder file menu as well.

I really like the fact that the Windows save dialog is basically Windows Explorer, and I really miss those features when an application uses the older file save API (which gives a more Windows 95 / 3.11 interface).


"Joe Public" is a strawman people who make crap software prop up as the reason their software has to be crap.


It beats using "your mother" or "your grandmother" as the term for a stupid person.


I am going to print this and put it on my wall.


OSX Finder is horrible. Classic MacOS Finder was far better - but still had open/save dialogs.

RiscOS had it right - the Open dialog didn't exist - you would open the folder and double-click the file. And the Save dialog didn't exist - the document would reduce to an icon which you would then place into the correct folder. (You can sort of do this on the various versions of macOS by dragging the icon from the title bar but it's inconsistent)


You could always replace your usage of the modern macOS Finder with my poor and very incomplete clone of the classic Mac Finder. :)

https://classicmacfinder.com


A few years ago, I actually searched around for an implementation of the classic finder, so this is very amusing for me to see actually implemented!

What I was interested in back then was the idea that there was a direct correspondence between the folder window and the data structure on the hard disk, and that to the user these concepts should be indistinguishable. One part of the illusion is that a folder always appears in the same place with the same window size, to give the sense that the folder is a tangible thing with permanence.


Glad you're enjoying it!

It's got a loooonnggg way to go (I never realized how many small but vital details are in a file explorer application), but it's strangely exciting to see it draw itself on modern macOS, especially on a retina screen!

And yes, the spatial UX! I'm still working through getting all of that implemented (I just completed persisting window locations/positioning days ago). I have recently been reading through some of John Siracusa's turn-of-the-Mac-OS-X-era writings and that's been hugely insightful and helpful. The level of detail (both of the original Finder and his writings) is impressive and surprising.

Very cool UX concepts!


I also used to love the fact that the equivalent of kernel extensions (totally forgotten the name) could be disabled by moving them to a different folder. The file system IS the computer.


Agreed! I think that this behavior may have been one of the truly novel principles of classic Mac OS.

IIRC, you even used to be able to just swap the System folder around and have like a completely new/different install of Mac OS.


Amazing


What do you find horrible? I prefer it by a wide margin over Windows Explorer.


I prefer it over Windows Explorer. I prefer the Classic Mac Finder to both.

This article from 2003 lists the problems with the OSX Finder. Since it was written, it's only got worse.

https://arstechnica.com/gadgets/2003/04/finder/


Coming from the other direction, it kills me when I'm using Windows and have a save dialog open, then drag the file I want to open into it from Explorer. On Windows it moves the file to a possibly new location, while on OSX the save dialog selects the file you dropped on it (without moving it).


I have the opposite experience WRT Finder vs Explorer. I use a Mac as my primary machine, but have a PC for games. Trying to use a file browser without Miller columns[0] drives me crazy.

I love being able to drag a file up one level in the tree without cutting and navigating to the destination to paste, or having 2 windows open to almost the exact same location.

[0]https://en.wikipedia.org/wiki/Miller_columns


> I love being able to drag a file up one level in the tree without cutting and navigating to the destination to paste, or having 2 windows open to almost the exact same location.

While Miller columns certainly support that, so does the tree view in the left panel in the default Windows explorer layout.


Honestly I didn't think of tree view before seeing this (and neither did the friends I asked who are more regular Windows users than I am). I'm gonna give it a shot.

I've been using bitCommander, which is a Windows file manager that supports (among other things) Miller columns.


Compared to the tree view, Miller columns use more horizontal space, but have less vertical overflow. Which is superior probably depends a lot on your usage patterns, but both will support “drag to (grand, etc.) parent of current folder” fairly simply.


Another good approach might be to do it the other way around: Get rid of the pseudo-file-manager dialogs entirely and let applications integrate with the regular file manager.

I think RISC OS (?) did this. Open documents in applications had icons representing them which you could drag to the file manager to save. (And perhaps to other applications to open?) Mac OS also has (or had?) this to some extent – many document-based applications show an icon in the title bar, which is for dragging and dropping the document in question.

Even Windows has an example of catering to this way of working, in Explorer, where the folder icon in the location bar represents the current folder and can be dragged and dropped. They even have some custom behavior to prevent the window from being raised when you drag from it, so that it works more like classic Mac OS and lets you drag to an overlapping window.


Windows Explorer can be extended; the example I remember is browsing a folder of email message files, with the metadata attributes displayed in list view and a functional content pane for viewing the message itself.

BeOS did this, too; I got the impression that Windows was inspired by that, both for BeFS (NTFS) and the desktop filesystem UX.


NTFS was introduced a few years before BeOS.

More likely that they were both inspired by a common source.

Never mind that MS have been working on a fully database-driven FS (WinFS?) for ages.


You can use most open/save dialogues to do this - a right-click is all it takes in most places. But I definitely agree with your sentiment. Finder feels like a badly-designed toy. The latest things they've added - tabs and labels iirc - require more steps to do the same tasks in the new way versus the old way.


While a fair point, it’s important that feature-rich OS components be designed correctly (including security) and this wasn’t the case with Windows.

How many exploits were just a matter of tricking an extra-fancy OS dialog into popping open something it’s not supposed to, escalating permissions alongside it?


On OS X, open/save panels run in a different process via ViewBridge, at least they did last time I checked.


I would normally agree with you here, except that I've accidentally renamed files on Windows so many times, only to lose them because something was in focus that I didn't intend to be. I use a combination of the mouse and keyboard when browsing, and the fact that I can accidentally click on a file (which renames my current file to that file) and save, or hit Enter thinking I'm saving when it actually starts renaming the file, is utter garbage in an Open or Save dialog box. I feel like the only things I should be able to do are open a file or save a file, not rename other files. Neither is a good solution, so it comes down to preference, but I definitely see the reason for it, because it happens to me on Windows all the time.


From someone traveling in the opposite direction at the moment: the Windows folder selection dialog is the crustiest open/save relic of them all, and "open files are locked even to administrators" is a UX catastrophe. Lesser pains: it hurts to be missing consistent "jump to enclosing folder" semantics everywhere (Command-click on Mac; you only get an equivalent context menu 30% of the time) and to not be able to snap open/save dialogs to a particular file or folder with drag and drop.

The door swings in both directions, in other words :)


> "open files are locked even to administrators" is a UX catastrophe.

This can't be blamed on UX people. It's an ancient difference between Unix and VMS and not easy to fix.


Windows Explorer, on the other hand, still has no "expand folder" functionality in the right pane. I have to go back and forth to sort garbage trees, opening multiple windows or temporarily pinning all the folders into favorites. Given that their names may intersect, the problem gets worse. In Finder you just expand the interesting folders and DnD until it's done.

What really is a joke is a left pane that combines favorites, libraries (die, die, DIE) and disk trees. It was never usable, except for favorites.


I believe this has been possible for some time on macOS.


Haha, damn, ok, it's been so long since I've had to do this that I didn't realize it is now possible (I used to want this all the time, and it was the bane of my existence in the late '90s and early 2000s when using a Mac). You still can't seem to move files into sibling directories, though, only to directory links in the sidebar. Anyway, I LEARNED SOMETHING TODAY (about macOS dialogues and about testing things before I open my mouth).

++edited to clarify language.++


There absolutely was TCP/IP support in Windows 95 from the very beginning. It was not installed by default but it was trivial to add via the Network applet in Control Panel. SLIP/PPP was also supported and you had basic utilities like Telnet and FTP included so you could connect to the Internet right out of the box.

No web browser, though. Internet Explorer 1.0 shipped with the optional Plus! pack.


Since a large part of my work involved building networks of pre-DOCSIS cable modems at the time, I can tell you that the first Windows to support TCP/IP was Windows 3.11 (Windows for Workgroups). This was carried into Windows 95, and by Windows 98 there was even Internet Explorer.


But in Windows 3.11 the TCP stack was not part of the installation or even on the installation media; it had to be installed separately, together with Win32s. In Windows 95, IIRC, everything you needed for TCP/IP was on the installation media.


...and it had preemptively multitasked 32-bit drivers; we built a Mac (AppleTalk) server product on top of that with pretty good performance. 3.11 was a huge step forward.


For some reason we used the Trumpet stack. Could it be that the OS stack didn't work very well at first?


Probably inertia. Windows 3.0 and 3.1 didn't ship with a TCP/IP stack. You had to either have something like Trumpet or install Internet Explorer 2.1 (which, AFAIK, was a separate purchase) to get Winsock. Or you could try to use DOS drivers. Even Windows for Workgroups 3.1 only shipped with NetBEUI and IPX/SPX. It wasn't until Windows for Workgroups 3.11 that the OS shipped with TCP/IP.


> There absolutely was TCP/IP support in Windows 95 from the very beginning.

I could have sworn that was all in the Plus! package too, but 20+ years of time has eroded my memory on that...


The browser was in the Plus! package.


Yes, you're right. I misremembered it.


> Compare it to other GUIs at the time, like CDE, IBM's Presentation Manager, or even Mac OS 8 and there's no comparison.

Well, I agree that there was no comparison with System 8, but not in the sense you mean. I think that the Mac back then was head-and-shoulders a better system than Windows. It might still be, but they're both so painful to use now that it's very difficult to pick a winner.

The Macintosh system was very understandable, very clean. Extensions were an easy-to-understand way to extend one's system, and easy-to-disable too. The window system itself was better-thought-out and less-confusing than Windows's was. The Finder was much more straightforward than the Windows equivalent (was it called the File Explorer back then?). The way that the Mac associated programmes to files (with an application code & a file code) was much better than the extension-based naming of Windows. The way that the Mac used its files' resource fork was great.

Programming a Mac back then was very clean & straightforward. I don't think there's anything today as nice, except maybe Cocoa, maybe. Certainly not the Windows 95 API!


I was using Macs back in the late '90s, and none of the things you say ring true.

Extensions could easily bring down the entire system because there was no memory protection. Full OS crashes (what modern macOS calls kernel panics) were a daily occurrence for the typical Mac-using professional who ran complex software.

The window system was often difficult to understand because apps tended to use a plethora of little panel windows that could overlap even from different apps. Windows preferred large windows that contained the entire app UI, and users typically maximized them. The Windows 95 Task Bar was much better for actually keeping track of your tasks than whatever the MacOS 8 thing was.

File extensions were always a hack, but one that Apple adopted too for Mac OS X. The days of Mac's file-specific associations were numbered when the Internet happened, because Unix servers wouldn't keep track of that metadata, so you needed file extensions anyway.

Besides, the file-specific associations were often super annoying because they were created by the editor app even for exported files. You saved a JPEG file from Photoshop, and it forever insisted on launching the full Photoshop when you double-clicked on it, instead of your preferred lightweight image viewer. This would happen even when you copied the file to someone else because the association was in the file metadata.

Windows NT 4 and its next version Windows 2000 were just heads and shoulders above MacOS 8 and 9 in terms of performance, stability and usability.

(And programming in Mac OS 8... Ugh. No memory protection, no multitasking, APIs originally designed in Pascal.)


> Extensions could easily bring down the entire system because there was no memory protection.

Yes, they were quite unstable. I didn't say that they were stable; I said that they were easy-to-understand and easy-to-disable, which they were: each extension had a distinct icon displayed at system boot; disabling one was as easy as dragging it to another folder; disabling all was a matter of, IIRC, holding down Command as you booted.

> Windows preferred large windows that contained the entire app UI, and users typically maximized them.

As a Mac user at the time, I much preferred the multi-window mode: it meant that I could customise my desktop as I liked. The Windows single-window mode was terrible, as it meant that I couldn't layer windows properly.

> You saved a JPEG file from Photoshop, and it forever insisted on launching the full Photoshop when you double-clicked on it, instead of your preferred lightweight image viewer.

We considered that a plus at the time: it meant that different files of the same type could be opened up by different apps by default. One could always, IIRC, Save As if one wanted to change the file type — or use ResEdit.

> Windows NT 4 and its next version Windows 2000 were just heads and shoulders above MacOS 8 and 9 in terms of performance, stability and usability.

Stability, probably. Performance, maybe. But usability? Never! That was back when Apple cared about UX.


We were talking about 95, which is a completely different universe from NT (though current versions of NT have a GUI that's based on 95).


Windows NT 4.0 had the same GUI as Windows 95 and was released in 1996. That's the version of Windows that sealed the deal for professional applications.


The late '90s I remember had a mishmash of NT, NetWare, and some (or maybe several) variant(s) of Unix on the bits that lived in the room with all the air conditioners.

Windows 9x and maybe OS/2 were far and away the dominant OSes on actual workstations. Macs and even aging Amigas at some creative shops, some more Unix workstations at places where most people could recognize and identify the purpose of (if not actually use) a slide rule. But what I essentially never saw on any desktops was Windows NT. Lack of driver support and inability to run many business applications kept it out of that space.


I suppose it depends a lot on geography and business. I remember Windows NT/2000 rapidly displacing Macs in the creative fields, and being widely used as developer workstations.


Developers especially moved from Win95 to NT in hordes during the late '90s.


I’m trying to suss up some old memories...

But NT was the first Windows OS which supported multiple processors, correct?

NT4 was the first OS on which I could have my dual 266 Intel procs run Softimage, IIRC.


"I’m trying to suss up some old memories..."

NT4, an Adaptec 2940U2W, and mirrored Seagate Cheetah drives.

You had to space them out because a stack of two Cheetahs would cook the upper one to death...


hahaha

I actually recall that Cheetah problem. :-)


The funny thing I have always thought about Mac vs. Windows is that the Mac was so set on doing it “their way” (think different) that they ignored UX that really was intuitive and “just worked”.

The Mac was so set on being different that they eschewed UX tropes that were natural... and had to spend ridiculous amounts of resources trying to convince people that their way was the right way, but clearly it was not.

This, IMO, is where the “fanboi” concept evolved.

Brainwashing.

Who cares - in the long term - where the UX and UI elements came from; the point is to make machines immediately accessible to humans' creative desires, not to mold workflows to a corporate ego...

So, Apple figured out how to develop products that were managed through the extension of the user's desires, but they are still struggling with the requirement from Jobs to be “different”. And Ive's perception of a common user is skewed toward “Ive has stated that this is how it should be done” design, which I find completely ironic, given that the whole “think different, so long as it's exactly how I am designing you to think” campaign is hypocrisy that goes up to 11.


What Apple have done with both the first and second generations of the Macintosh -- System 1 and OSX -- is pick a basic GUI metaphor and stick with it for at least fifteen years.

The original Mac was released in 1984 and was produced through 2000, sixteen years. OSX was released in 2000 and is now in its eighteenth year. Whilst each system has seen some evolution over time, the general metaphors and interfaces have remained consistent.

Apple have realised and internalised a core concept of GUIs: change is bad. There is a far higher cost to changing interfaces than can be gained through efficiency, and the retraining and unlearning costs are exceedingly high relative to benefits.

This is a message apparently lost on Microsoft and most of the leading Linux desktops.

Mind: I write this as someone who whilst using a Mac presently doesn't much care for the interface. My preferred desktop remains WindowMaker (itself based on Aqua's predecessor, NextStep), which has a key advantage of having changed almost not at all in the 20+ years that I've been using it. It's also configurable in ways I find useful, and I schlep around a configuration directory to new systems as needed.

That and a terminal window.


Microsoft hasn't been too different in that regard. In terms of interfaces with any real userbase to speak of, the only real Microsoft UI systems I can come up with could be characterized as

* MS-DOS/Windows 3.1-like

* Windows 95-like

* Windows 8-like

Windows 8 basically was born and died in a couple of years, to be replaced with Windows 10, which is very much the same basic set of metaphors as Windows 95. There's a start menu, a little dock of pinned icons next to it, a taskbar, a clock and some little icons for what are basically background processes. There's a maximize and minimize and close button on windows. There's a File/Edit/Whatever set of menus. You can right click for context items, many of which are consistent with Windows 95 20+ years ago. That basic system is now something like 21 years old.

You can maybe argue that Microsoft forgot your message five or so years ago, but they obviously rediscovered it.


I'm largely familiar with the DOS -> Win2K period, and have made little use of Microsoft operating systems since.

Windows 3, 95, NT, and 2K each saw significant changes in where and how major system functionality was presented.

During the same period I was using numerous Unix and Linux platforms (and still do). Those have largely seen far less substantive change at the shell and system level, with a few notable exceptions.

I'm not discussing Linux GUIs, which have been all over the goddamned map. I've used twm, fvwm, fvwm2, VUE, CDE, WindowMaker (my preferred option), GNOME and KDE through multiple generations, Enlightenment, various of the 'boxes (black, open, flux, ...), ion, xfce4, ... And those are the ones I've trialed to some significant extent. I've at least fired up and looked at virtually all the options mentioned on the XWinMan page: http://www.xwinman.org/

There are several fairly central components which have changed fairly markedly. The shift from telnet to ssh, multiple iterations of firewalling, various scripting languages of preference (bash, perl, python, an oddment of others), mailers (sendmail, qmail, anything reasonably sane, mostly exim and/or postfix now), and of course, the whole init replacement clusterfuck.

But the notional concepts of files, filesystems, shells, utilities, pipes, etc., have remained consistent, and even across several utility/server replacements (particularly ssh and mailers), command-level compatibility has been preserved to a remarkable extent with previous options (e.g., rsh and sendmail syntax).
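That command-level stability is easy to demonstrate. Here's a trivial sketch of my own (not any specific system from above): a pipeline of standard POSIX utilities that would have run identically on a 1990s Unix and runs unchanged on a current Linux or BSD box:

```shell
# Find the most frequent line in a stream: sort groups duplicates,
# uniq -c counts each run, sort -rn ranks by count, head keeps the top.
printf 'b\na\nb\n' | sort | uniq -c | sort -rn | head -n 1
```

The same pipe-and-filter composition works with any tool that reads stdin and writes stdout, which is a large part of why the interface has survived for decades.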

If Microsoft can only be relied on for five-year stints of "having learnt this lesson" then they have not learnt this lesson.


I fear though that command-level compatibility is under attack these days, as fewer and fewer people see shell scripting as something positive (never mind trying to do more and more via dbus rather than pipes and such).


Tools that fail to provide a stable scripting interface tend to be ditched early and hard by sysadmins.

This includes a hell of a lot of systems configuration tools. The corrective force on failure to adhere to this norm is strong.


And already with 8.1 the "start screen" could behave like a very large start menu.

Hell, you can today configure Windows 10 to behave much like 8.1. The one thing I see some people miss in the 8.1 to 10 transition is the charm bar, in particular that it gave easy access to printing and such.


Seem to me that Apple and MS focus on different kinds of change.

While Apple may retain the UI across time, they are more than willing to change APIs etc on a whim.

MS on the other hand may change the UI (though outside of 8.x, the core layout and behavior has remained much the same, and even 8.x could to a large degree behave like the older UI) but they bend over backwards to maintain APIs across time.


Very interesting point, and one that may play in subtle ways to each platform's audience and scale.

Microsoft was always more vendor / ISV / VAR oriented, and stable APIs matter there.


I get the feel that stable APIs are undervalued as a user retention element.

Being able to get a new computer but install from the same software library (I can hear the _sec people getting hissy already) as was used on the old one makes people more likely to pick the same "platform" over time.


Right, I see that point and would have acknowledged it more explicitly had I time earlier. That's the interesting part of this.

The counter is that Apple caters to a smaller software development community, though several of the tools also see extensive use and support (particularly Photoshop). But there's a heck of a lot of fundamental functionality on Apple's platforms that you can get without relying on third-party software, or at least, third-party proprietary software. Given the dynamics of proprietary software markets, particularly toward adware, nagware, and malware, this seems a possibly positive development.

(I've made much the same observation in recent years about the Android marketplace, which I see as a growing cesspit, and of the Windows application space, particularly at the peak of its crapware / spyware / adware period in the decade of the 2000s.)

Linux solves the software compatibility problem by allowing for recompiling of software for which the source is freely available, for the most part. This isn't a perfect solution, and there are complex systems which tend to not be particularly forward-compatible. One possible argument is that such complex systems are themselves inherently problematic and ought perhaps be avoided. You may not agree with the argument, but I'd expect you'd admit to its existence.

Microsoft was addressing a different space, and one in which there was a massive focus on desktop-distributed client software, much of it aimed at very specific business applications. This is a major application area for computers, though it's also one that's shifted significantly toward client-server Web-based solutions (or app-based, now). Which presents its own set of features and limitations.

And again, all this is what I was hinting at earlier with noting that you'd presented a very interesting point. I'll be thinking about this for a while.


> “think different, so long as it’s exactly how I am designing you to think”

That's Macintosh all over. "You're holding it wrong" is not a new failing of Apple, nor is it exclusively post-Jobs.

Jobs was just better at convincing people that, yes, they were holding it wrong.


Windows 95 was no paragon of stability when you started loading it up with tray apps and running big nasty applications or trying to use the damn printer with its piece of shit driver. We are spoiled these days by how stable our computers are.

I agree with you that keeping the file metadata in a separate fork is far superior to keeping the file metadata in a three character extension, but sadly the world zagged on this one when Apple zigged. I especially like that you could have two files of the same type associated with different applications, so if you created a text file in your IDE it would open back up in that IDE instead of launching the word processor. And you could change it (somewhat clunkily) at will.

Win95 did have some advantages though. The Start Menu was a better organization system than Apple's Application folder for example. Macs of that era were also slow and badly overpriced.


> The Start Menu was a better organization system than Apple's Application folder for example.

These were not actually that different. The original Start Menu was just a menuized view of a folder hierarchy (which mostly contained shortcuts but could contain any document).


The resource fork idea is great if everyone agrees on it and everyone preserves them, even across OSes.

Or if you're going to Very Deliberately Ignore the Other OSes and go do things your own way. VDIing like that seems to be a very Apple trait.

My point is, I'm not sure how well the resource fork model could have ever survived prolonged and sustained contact with the Internet and modern pervasive networking.


> I agree with you that keeping the file metadata in a separate fork is far superior to keeping the file metadata in a three character extension

Since we're talking about Win95, the three-character limit doesn't apply (long filenames were a major new feature of this OS, after all). .jpeg and .html were relatively common at the time, for example, and worked fine.

I still find the extension system kludgy, but arguing that it was worse in part because of the limited pool is incorrect starting from Win95.


Yet for the same compatibility reasons most people stuck with .jpg and .htm, at least in the Windows world. Even now it's very unusual to see a .jpeg extension on a filename.

The metadata that Macs kept in the resource fork went way beyond file type and creator, too. It included things like the file's icon, creation/modification information (so it would survive a trip over the Internet!), loads of stuff for applications (menus, graphics, sounds, etc...), formatting for plain text documents (so they fall back to plain text on unsupported systems), and so much more.

Fun fact: NTFS supports the concept of a resource fork on files, but almost nothing in Windows uses it. I think I've seen more malware hiding stuff in there than legitimate uses in the wild. Worse, even in the obvious case of loading a Mac file on a Windows machine, it usually fails and falls back to creating the clunky separate directory instead.


NTFS does not have resource forks in the Mac OS sense, nor extended attributes in the Unix sense. Instead it allows a file to have multiple named contents that are accessible through the same file I/O API (in essence the file can behave like a simplified directory). There is no distinction between data and metadata stored this way. In the late 90s MS even intended to stop using the OLE compound storage file format (i.e., what the Office 97/2000 formats are built on) on NTFS drives and instead write the objects into separate streams (reportedly it was not implemented because Windows would then have had to transparently reconstruct the compound storage whenever you copied such a file to a non-NTFS drive or uploaded it to the internet). Today, apart from malware hiding, the only major use multiple streams have is the "this file was downloaded from the internet, are you sure you want to open it?" prompt, which stores the internet-ness of a file in a secondary stream.
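That "internet-ness" stream is worth a concrete look. A minimal sketch follows: on Windows the stream is opened as `<name>:Zone.Identifier`, but here we only parse the stream's well-known INI-style payload, so the example runs anywhere; the zone numbers are from the standard URL security zone model (3 = Internet).

```python
# Sketch: interpreting the Mark-of-the-Web payload that browsers write
# into the Zone.Identifier alternate data stream on NTFS.
ZONE_NAMES = {0: "Local machine", 1: "Intranet", 2: "Trusted",
              3: "Internet", 4: "Restricted"}

def parse_zone_identifier(stream_text):
    """Return the zone name from a Zone.Identifier payload, or None."""
    in_section = False
    for line in stream_text.splitlines():
        line = line.strip()
        if line.lower() == "[zonetransfer]":
            in_section = True
        elif line.startswith("[") and line.endswith("]"):
            in_section = False          # some other INI section began
        elif in_section and line.lower().startswith("zoneid="):
            return ZONE_NAMES.get(int(line.split("=", 1)[1]))
    return None

# A typical payload found on a freshly downloaded file:
payload = "[ZoneTransfer]\r\nZoneId=3\r\n"
print(parse_zone_identifier(payload))  # -> Internet
```

On an actual NTFS volume you would read `open("report.pdf:Zone.Identifier")` to get that payload; on any other filesystem the stream simply does not exist, which is exactly the fragility the thread is describing.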


Conceptually the MacOS Resource Fork is basically a directory where all of your filenames have to be exactly 4 characters long. The only difference is that each "file" might be a stack of "files". So you might have a CODE resource that has multiple CODE segments in it.

One thing I loved about old MacOS apps is opening them up in ResEdit and seeing so much of how the thing was built.


> Conceptually the MacOS Resource Fork is basically a directory where all of your filenames have to be exactly 4 characters long. The only difference is that each "file" might be a stack of "files". So you might have a CODE resource that has multiple CODE segments in it.

Sort of. It would be more accurate to say that each filename was a 4-letter type code and a 16-bit ID. Each resource could also have a name, but that was less frequently used (and didn't have to be present, let alone unique).

More importantly, resource forks didn't exist in isolation. They were loaded into a chain of active resource files -- for instance, while working with a Hypercard stack, the resource chain would include the active stack, the Home stack, the Hypercard application, and the System suitcase. A stack could use resources (like icons or sounds) from any of those sources.
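That chain lookup can be sketched as a list of maps searched front to back, keyed by (4-character type code, 16-bit ID). This is a conceptual model only; the sample resources and chain members below are illustrative, not real resource data.

```python
# Sketch of the classic Mac OS resource chain: each open resource file is
# modeled as a dict keyed by (4-char type code, 16-bit ID), and a lookup
# walks the chain from the most recently opened file back to the System.
def get_resource(chain, res_type, res_id):
    for res_file in chain:              # front of the chain wins
        if (res_type, res_id) in res_file:
            return res_file[(res_type, res_id)]
    return None                         # classic API returned NULL on a miss

stack  = {("ICON", 128): b"stack icon"}
home   = {("snd ", 1): b"boing"}
system = {("ICON", 128): b"system icon", ("CODE", 0): b"jump table"}
chain  = [stack, home, system]          # e.g. stack -> Home -> System

print(get_resource(chain, "ICON", 128))  # the stack's copy shadows the System's
print(get_resource(chain, "snd ", 1))    # found further down the chain
```

This shadowing is what let a HyperCard stack override an icon or sound while still falling through to the application's or System's copy when it had none of its own.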


You don't see .jpeg because people are used to .jpg. There's nothing incompatible with long names that we care about.

Also, that Windows feature you mention is called ADS - alternate data stream.


Yes, but you also never see extensions like:

.jpeg.photoshop.ro.creat127339292....

The filename extension is the bare minimum of metadata for a file and not easy to extend.

Even then Unix systems will skip even that minimal metadata and force you to messily search for magic numbers at the start of the file and make a guess.
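That magic-number guessing looks roughly like this - a minimal sketch with a handful of well-known published signatures (real tools like file(1) consult a far larger database):

```python
# Minimal content sniffing by magic number, the Unix-style alternative to
# trusting the filename extension. Each signature below is the documented
# leading-byte sequence for its format.
MAGIC = [
    (b"\xff\xd8\xff", "jpeg"),
    (b"\x89PNG\r\n\x1a\n", "png"),
    (b"GIF87a", "gif"),
    (b"GIF89a", "gif"),
    (b"%PDF-", "pdf"),
]

def sniff(data):
    """Guess a file type from its leading bytes, or None if unknown."""
    for signature, kind in MAGIC:
        if data.startswith(signature):
            return kind
    return None

print(sniff(b"\x89PNG\r\n\x1a\n" + b"\x00" * 8))  # -> png
print(sniff(b"hello world"))                       # -> None
```

Note the "make a guess" part: anything that doesn't start with a known signature is simply unknown, which is exactly the messiness the comment complains about.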


> I think that the Mac back then was head-and-shoulders a better system than Windows.

I completely disagree.

First of all, Windows 95 had preemptive multitasking (the Amiga was the only computer that had this at the time), Mac OS was single-tasking and used terrible scheduling, and it would be years before Mac OS gained preemptive multitasking because of terrible architecture choices that made this extremely challenging.

From a GUI standpoint, Windows 95 was a total revolution and made Mac OS look completely antiquated: rendering, scrolling speed, font types, menu items and dialogs, etc...


From 1991, Linux had preemptive multitasking.

The Amiga was exotic hardware, like the BeBox. If you want to include those, you might as well include SGI and Sun hardware. Preemptive multitasking was common on non-PC hardware.


Well yes and Minix had preemptive multitasking a decade before.

We're talking about consumer facing computers, and at the time, the market was pretty much only Windows, Mac, OS/2, Amiga and Atari.


> The window system itself was better-thought-out and less-confusing than Windows's was

I miss window shades so much.


There used to be some good haxies that would let you do this on OSX, not sure what their current state of support/brokenness is.


Yup, I remember using them. WindowBlindsX if I recall. It worked alright, but it was never as snappy as shades were.


Cocoa is pretty painful compared to web UI alternatives like Electron. I feel stupid for mastering it considering there’s no jobs there and the Mac App Store is probably impossible to make any money off of anymore.


Windows has always been head-and-shoulders above the Mac and that's why it won the 90s desktop wars and that's why just about every business on earth uses Windows and not Macs.

You are in a tiny minority if you think Mac window and file management are any good. File extensions are the most pragmatic way of dealing with associations and that just about sums up the difference between Mac and Windows: Microsoft made Windows to be practical whereas Apple has always focused on style above all else.


> I still get frustrated on OSX when I minimize a window and have to hunt around for it

Windows user here. Honest curiosity: does anyone know why the minimize / maximize works on the Mac the way it does? I mean, what's the rationale to design it like this?


Different ways of working, mostly stuck in our own ways of doing things.

I rarely, if ever, use Minimize on the Mac. Minimize comes from Windows (and other windowing systems) where the window minimizes to an icon or button on the task bar.

"Maximize" also comes from Windows (and other windowing systems). As others have noted, in newer macOS (which I don't use), I think it oddly makes the window go full screen. Full screen is a recent macOS feature -- I once asked about making my application go full screen and Apple developer support said that going full screen did not follow their Human Interface Guidelines. Something seems to have changed at Apple since I asked about that decades ago.

There was no Maximize on the Mac, it is called "Zoom." The idea is that the window has two sizes and you zoom between the two sizes: One size is the size the user has resized the window to (often with much difficulty) and the other size is an ideal compact size ("optimally fit content") without hiding anything and hopefully where scroll bars do not appear -- it is a UI feature that is/was rarely, if ever, done very well by applications other than the Finder.

By the way, Command-Tab to switch tasks was once an add-on from Microsoft for Mac OS. Go Microsoft! (Mac fanboy here :)


All correct. In daily use, I almost never use those buttons - I end up manually sizing/placing windows.

The MacOS scheme of doing this leads to a sort of organic, emergent window layout - I always end up with windows staggered to display relevant bits. With Windows (and with most X window managers), things always end up either strictly tiled or stacked.

I'm very used to the Mac way and prefer it, but that could just be the result of long use. It doesn't waste space (Windows apps always seem to have lots of dead space to me and make me work to be able to see parts of other windows) and forces me to switch windows far less. But it is a fairly subtle thing.


The amount of tedious manual work required to resize and arrange several windows on MacOS (from 6 through the latest OSX) bothered me the whole time I had the misfortune of having to use it. It does not even have something like edge snapping.

My WM of choice is now Xfwm, which lets me rearrange windows easily without gaps, make windows fill available space (vertically and horizontally separately), or tile them by dragging them to corners.

On OSX, things like Spectacle and Magnet sort of help.


The lack of window and edge snapping are exactly the reason I prefer MacOS behavior. If it did that, partly revealing lower windows would be far more tedious/impossible.

For me, the goal is not to maximize available space to the frontmost windows; it is to maximize the use of the monitor to display what's currently relevant. It allows me to do things like keep a Finder window open with just a bit peeking through to drag things to, keep an eye on a few lines of a terminal tail -f, see the mailbox pane and room pane in Mail/Slack so I can see if anything new happened I should respond to, etc. all while working on whatever I'm working on.

With the Windows-ish "maximize the window", that is replaced, almost invariably, by useless window background.

Again, I expect this is mostly what one is accustomed to, and the Macish approach is more idiosyncratic to the user. Works for me.


Edge snapping does not prevent the behavior you describe; sometimes it even makes it easier to do. You can still rearrange and resize the windows manually as you wish. But when you want two edges to fit snugly, it's easy to do.

Also, you can maximize a window for a moment, look at it, and then unmaximize it back to its previous size.


> Different ways of working, mostly stuck in our own ways of doing things.

I use xfwm4, and turn all of the window snapping off -- unlike Mac and Windows we can customize. ;)

You will pry the Command-key from my cold dead hands:

  /usr/bin/setxkbmap -option ctrl:swap_lalt_lctl
For now, the one pixel resize box is remedied with Ctrl+right mouse button to resize.


A trick is that if you hold option while resizing, it resizes the opposite edge at the same time.

Also, if you hit the up arrow in the command-tab switcher, you can use the arrow keys to select a minimized window and hit enter to restore it.


Here are two window managers for macOS I use:

Tiling window manager: https://github.com/ianyh/Amethyst

Programmable window manager: https://github.com/kasper/phoenix


As of at least 10.13 (High Sierra), there does appear to be a very modest edge-resistance when placing windows next to each other.

Either that or I've gotten very much better at doing this.


Same in 10.12.


Interesting. Now that you've made me think about it, I've realized that I use Windows more like a single-threaded operating system. I only ever flip between applications (ignoring my second monitor). I never, ever, combine multiple applications on the same monitor - this is crazy because this was the Windows 95 promise. Not that it's a bad thing, I'm used to the workflow and love it.

I absolutely agree with your guess as to why this is.


maximized window ≠ full screen

in Maximized state the window is set to maximal available size (so you are not wasting any part of the screen) while you are still provided with fast and easy access to relevant OS UI elements

in full screen you explicitly tell the system that you don't want to be distracted by OS UI elements (typically in a situation where you know that you won't need them for an extended period of time, or if you REALLY need every single pixel of the screen)


I feel your pain. I was using BetterTouchTool to remap the default behavior of that green +, but eventually decided it was silly to use an add-on for something I should be able to change via `defaults`, so I just trained myself to hit Option when I wanted to maximize.


It's also possible to double click anywhere on the top bar of a window to "maximize."


I have no idea...

I came from ~18 years of Windows & Linux usage, and everyone always told me macOS is THE OS with the best usability.

But I can't confirm this.

Minimization of windows is shitty and maximization even more so.

When I maximize, often just the height is changed; when I go back to "normal", the height and the width are changed, so I always have to adjust the width manually.

When I minimize a few windows, it's impossible to get back the right one without luck.


I know what you're talking about. Since I've switched back to Linux I can't imagine working without workspaces - each one dedicated to a specific task/app - and I always keep a specific order, e.g. workspace 1: browser, 2nd: code editor, 3rd: terminal, 4th: file explorers, etc. I'm so used to it that I automatically use shortcuts to access them immediately - switching between minimized windows using alt+tab is a nightmare.


Windows 10 actually supports this feature, still can't get used to it though... win+tab -> switch between virtual desktops, or ctrl+win+left/right arrows.

and if you have many windows open (in win 10 at least) you don't have to press alt+tab 10 times in a row to choose your desired program, you can hold alt+tab and use arrows.


Yep. Curiously, though, you cannot share running programs or windows across "desktops" in Windows, although you can share them across Spaces in macOS.


Yes you can, it was added in the Creators update. Hit win+tab and right-click the window, you can choose to show that single window across all desktops or all windows from the app across all desktops.


You can accomplish (mostly) the same thing with Spaces on Mac. Granted, I don't believe you can really script any of it. All manual, but it still works pretty well for me.

Some days I still really miss dwm, but having Photoshop, Ableton, and several other things Just Work™ makes it worth it.


The problem with spaces is that any time you cmd-tab between spaces, those spaces will be repositioned relative to each other to be adjacent. This also happens whenever a window opens a dialogue that forces you to switch to it. This means that you can't reliably keep spaces in a strict order.


You can disable this behavior in

System Preferences > Mission Control > Automatically rearrange Spaces based on most recent use


I have never seen either of those things happen. I even just tested the first one.

Perhaps it's a setting or was the behavior in an older version of macOS?


If I alt-tab between apps, or open a new window in an app from a particular space, I'm warped to wherever OSX very wrongly thinks I ought to be.

Motherfucking maddening as hell.


Yes, this is something I do on Linux too, and it's great, particularly when you're doing something that gets quite messy with lots of windows open - when you're working with lots of files, or the program (like GIMP) opens up several windows. If you need to do something else it's so nice to just leave it all and move on to a nice clean workspace without having to minimize everything.


You do have multiple workspaces on Windows 10... Though if you mean "automatically assign application X to workspace Y", which many Linux WMs are able to do, you're out of luck.


I used to use VirtuaWin[1] on Windows, which adds virtual desktops to it. It's possible it doesn't work in the most recent Windows (although, given Win compatibility, it could just as well work), but until Windows 7 (when I stopped using Windows) it was a life-saver. I used Enlightenment (E16) on some of my computers back then and after working with multiple desktops I just couldn't live without them. I mostly use 3x3 layout, with the main application I work with at the center, and other applications to the sides. Works great for me!

[1] https://en.wikipedia.org/wiki/VirtuaWin


Sadly the workspaces are very much mouse oriented (want to switch between workspaces? That's win+ctrl+number!).

And there is no proper UI element for checking if you have anything open in a different workspace without opening the switcher.


You can switch with Ctrl+Win+Left/Right, too.


While true, it does not alleviate the finger yoga...


Not too different from most Linux DEs' Ctrl+Alt+Left/Right.


At least any sane DE allows me to define them to be less gymnastic.


After using a mac for ~6 months now, I finally understand what happened to gnome3/unity, clearly designed by mac users.

I mean it's perfectly usable once I got the basic gestures down and installed the app that lets me independently set the trackpad and mouse wheel scroll directions, but I will go so far as to say that both xfce and MATE are objectively better at window management.

Partly I think their hands were tied by the too late to change decision for the always there contextual top menu bar. Or this is just 25 years of using win95 clones talking and I'm set in my ways.


I wouldn't say Gnome is by or for Mac users. In fact, in many ways it is way more similar to Metro than to OSX. Gnome 3 just broke the common Windows 95 workflow, as many others did before.


Ya, tbh I never tried using it for more than 10 minutes.


Personally I love it, but it took me way more than 10 minutes to realize why :) IMO it is highly underestimated, mostly because you need to change your workflow, which takes time, but when you do it feels super productive.

You wouldn't judge i3 or other completely different approaches after only a few minutes.


I feel absolutely the same about Gnome 3. One of the biggest things, I think, is that it puts workspaces absolutely in your face, so using them is a much more natural part of the workflow than in Gnome 2. It's also more keyboard-friendly than Gnome 2 (though it still could use some work in this area). Despite being rather large (gnome-shell on Wayland is typically the second-biggest RAM user on my laptop), it feels minimalistic, and is almost always fast, and stays out of the way of whatever I'm working on.


> it feels minimalistic, and is almost always fast, and stays out of the way of whatever I'm working on.

absolutely :)

What I like the most is that you are just a super key tab away from basically everything, so focusing on a single thing feels natural and right. There is no way to lose anything either; it's all there. Always.


I'm going to have to give it another go then, though I do love my MATE desktop.

The last time I tried it, I got frustrated when working with a lot of PDF sources - hitting the super key just presented me with a myriad of white rectangles for the open windows, which would frequently rearrange, requiring a slow manual search to find the file I was looking for. This can be less of a problem with a taskbar, as the filename is the main identifier, and being one-dimensional it is easier to scan and preserves its position better.


What's strange on a mac is somehow having the ability to completely lose windows.

Another frustration is trying to view two apps together on the screen at the same time, if one of the apps itself contains multiple windows. I'm not at my desk to try this, but let's say you have multiple Chrome windows open, all with their own tabs, and you want to view your current Chrome window overlaid on a window from another app. To do this you have to manually minimise all of your other Chrome windows one by one so they will all move out of the way, to allow you to switch between apps and view them both at the same time.


Here's my workflow for this situation.

Four fingers up, expose. Drag the windows you need up top into a new desktop. Command (or control? Or option? Or a combination?) plus the right arrow to switch desktops. Try all the combinations until I get to the right desktop or throw the damn thing out a window.


Always have this issue with Xcode.

Somehow it is able to open up the device manager and THEN give the window in the background focus when I switch to my editor and back to Xcode with cmd+tab.

So I have to move the Xcode window down to grab that background thing, which seems to be part of Xcode.


I use ShiftIt (https://github.com/fikovnik/ShiftIt), and ctrl+alt+cmd+M maximizes the window just like you expect.


Classic Mac OS (System 7 was contemporary to Windows 95) didn't really have any concept of minimisation or even maximization as such.

There was no task bar or dock or anything else really to minimise to. IIRC there were addons for 7.1 that added "window shading": a button on the window title bar that reduces the window to just the title bar.[1]

The closest thing to a maximise button in classic Mac OS was more like a size-to-fit button: the application gave the window manager a hint as to the appropriate size for the document displayed, be it a file folder, a word processor document or whatever. Having a single window fill the entire screen wasn't as common as it was on Windows.

None of this was particularly strange to me back then.

[1] https://en.wikipedia.org/wiki/WindowShade apparently a standard feature later on


>Classic Mac OS (System 7 was contemporary to Windows 95)...

System 7 was WAY earlier than Windows 95; it was 1991, a little more than 4 years earlier (like an eternity):

https://en.wikipedia.org/wiki/System_7

"It was introduced on May 13, 1991 ..."

And even System 7.1 was almost exactly 3 years earlier:

"In August 1992, the 7.1 update was released."

https://en.wikipedia.org/wiki/Windows_95

"It was released on August 24, 1995"

Not even Windows 3.1 had been released at the time System 7 came out; its competitor on the MS side was DOS (version 5.00):

https://en.wikipedia.org/wiki/Timeline_of_DOS_operating_syst...

https://en.wikipedia.org/wiki/Windows_3.1x

or some of the various semi-graphical third party shells for DOS.


Windows did exist before 3.1


>Windows did exist before 3.1

Sure it did, namely Windows 1.0, 2.10/2.11 (actually Windows/286 and Windows/386), but very few people used them.

Windows 3.0 was the first one to have some diffusion, but it had very limited capabilities, and its adoption was slow because of the increased PC specifications it required; in any case it was not comparable with the later wide adoption of 3.1.


> Having a single window fill the entire screen wasn't as common as it was on Windows.

Yup! For the longest time I liked to work with a half-width browser window to match my half-width editor & word processor windows. It drove me crazy the number of websites which set their body text to some fraction of the window width, which looked good with a fullscreen window but terrible with a halfscreen one.

Eventually I just gave up. The whole point of the web was device-independent information transfer, but somehow we allowed device-dependence to sneak in.


MacOS System 7 did have a function like minimize for Apps (not individual windows). In the finder menu on the upper right, you could "hide" or "show" a program and use the same menu to switch apps, similar to the task bar in Win95.

You can play around with it on archive.org. https://archive.org/details/mac_MacOS_7.0.1_compilation


macOS has Hide & HideOthers in addition to Minimize. I go weeks at a time without using minimize because of those.

IMO there's a whole generation of people who did their early computing on MS Windows (including myself) and so internalised that that is how GUIs are "supposed to work". When moving to something else later in life there's a feeling that it is "wrong", but its way of doing things (e.g. macOS's) is also correct and is just a divergent evolution from MS Windows. Research, open-mindedness and experimentation are necessary when using something different.


And this hidden setting makes working with Hide/Hide Others much more intuitive:

http://osxdaily.com/2010/06/22/make-hidden-application-icons...


Doesn't seem to work on High Sierra.


I'm on High Sierra and it works, maybe just needs a `killall Dock` to have an effect.


Thanks for that one.


I'm new to macOS, so thank you - I need these tips. The differences are really annoying, but I'm sure there are more hidden things that are useful. I suspect I am not alone.


The old maximise system was simply designed to make the window as large as it needed to be to optimally fit content.

It wasn't designed to 'make this window fit the screen'. It was designed to 'make this window fit this A4 document'.


Thanks, this is the first answer to this question that sort of makes sense from a Windows user point of view. I mean I still prefer the "Windows way", but this is at least viable reasoning for the "Mac way".


Depending on how you've configured your Dock, Minimize is pretty simple. Either the window just moves to the Dock, or it minimizes into the application icon (then you can right-click on the Dock icon to view a list of the windows that are minimized, or click to open the last minimized window)

Maximize on Mac was NOT designed for the window to fill the whole screen, but rather to resize the window to optimally display its contents. E.g. maximizing a Preview window with a PDF document changes the width of the window to the PDF page width. (Good tip for getting along with your Mac: stop trying to maximize everything.)

In a recent macOS version, Apple changed the green "maximize" button to full-screen, which is very different from maximize. Now double-clicking on most window chromes will execute the old maximize behavior.


> Maximize on Mac was NOT designed for the window to fill the whole screen, but rather to resize the window to optimally display its contents.

The problem with this approach is I definitely do not need someone else making the decision of what is "optimal" for me. I've been using macOS for about as long as I used to use Windows now, and at this point, macOS seems to have largely abandoned the concept, which is great. Applications get either a full screen in their own isolated context, or option-click for a full-screen in a regular windowed context. The options now are a very windows-95/98 like: "either go completely full screen, or resize to whatever you like," which gives me full control of what I find optimal for any given application.


The difference is that Windows never really embraced universal drag and drop and the Mac did.

Macs used drag and drop for file management between windows representing separate locations on disk. Windows users tended to select files and choose cut or copy then navigate to the second location and paste.

The same held true for moving content between documents in an application or moving content between applications. Mac users preferred to use drag and drop, while Windows users relied on copy and paste.

The problem with keeping every window maximized is that you're giving up system wide drag and drop as the primary user interaction method.


That's a great distinction I hadn't thought of before, and it definitely makes sense - if you're focusing on drag-and-drop, you want as many windows visible somewhere on the screen as possible to maximize possible destinations.

Personally, I find drag-and-drop handy sometimes, but it's very constraining. You have to go through non-standard motions to complete any move that is more than trivial, always holding down the primary mouse button and thereby losing your primary way of interacting with the interface. In other words, sure, if you have a clear view of your destination, then drag and drop is fine, but in all other instances it becomes clunky.

Cut/paste is incredibly quick and doesn't sacrifice usability of your interface or input methods between the two ends of the transaction. Windows seemed to balance this out well, where you could drag and drop most of the time, but you could also ALWAYS cut/paste. I despise that I can't cut/paste in finder. Which is why I use PathFinder instead.

The danger cut/paste DOES pose is that it fundamentally unlinks the start of the transaction from the end. In between, you can do literally anything, which may mean losing track of what's in your paste. Still, I'd call this a fair trade-off, specifically because it is non-destructive for files. You won't lose a file you forget to paste. It just stays put.


> In between, you can do literally anything, which may mean losing track of what's in your paste.

On Windows, Ditto, and on Linux, CopyQ (among others, and there has to be something like that for Mac) solve this problem by giving you a preview of what's in the clipboard as well as a history of the copies you made.


I've seen users cut files from one location and forget what they are doing before they manage to find the destination they intended to move those files to.

Then they are shocked later when they paste those files into some random location and can't figure out where they went.

Dragging and dropping does not have that issue. Users find it much easier to learn.

Open the source window. Open the destination window. Drag.


>The problem with this approach is I definitely do not need someone else making the decision of what is "optimal" for me.

And that's the crux. The whole feature (in its original incarnation) rested on the false assumption that there's a singular "optimal" state at any given time.

The Apple Human Interface Guidelines have been a state-of-the-art reference for good UI for a long time, but the part about the zoom button always baffled me, as it directly contradicted several Core Principles laid out in Part I of the book.


I get it, it's your computer, etc., etc.

The reason this is done is of course that most applications don't have content that fill the entire screen, so maximizing, in most cases, is meaningless - and hinders the usability of the system.

It makes more sense to leave some space over for other apps than have a big empty area on both sides of the screen.

Interestingly, I noticed that it's only Windows-switchers who complain about this. People who've used Mac for a long time don't give this any thought.


> most applications don't have content that fill the entire screen, so maximizing, in most cases, is meaningless

It's a matter of where you place responsibility. It's like saying, "most websites aren't responsive, so naturally it makes sense to restrict the size of your browser window and leave space for other apps." But most would laugh at this and say it's the responsibility of the website/webapp to build a responsive layout. Why should we hold desktop applications to a different standard?

I completely agree that it is probably almost entirely windows-switchers who complain about it. I'd, obviously, self-aggrandizingly suggest it's because we've tasted something better. People don't complain about the taste of food they've never tasted ;)


>It's like saying, "most websites aren't responsive, so naturally it makes sense to restrict the size of your browser window and leave space for other apps."

I think that's a misrepresentation. It's not a static size. It's not an artificial limit. If the document that's open has content to fill the entire screen, the window will fill the entire screen.


For minimize I can't say. But the maximize works the way it works because (and this is according to the platform ideology, not some general truth) you are not supposed to maximize windows in the Windows sense of the word.

The macOS interface is based around floating and overlapping windows. If you put a window over the whole screen then it could be as well maximized. This gets a bit hairy on smaller screens but really shines on huge monitors. In general macOS is more optimized around having one big screen rather than a multi-monitor setup.


What was the big screen they optimized for in 1984?


I do not know how the interface looked back then (I was not even born), but I can imagine that at that time a lot in interface design had yet to be discovered.


You’re holding it wrong.

But seriously, minimising windows is a reflex learnt from Windows. In Windows you often need to minimise one thing to find something else. Especially the desktop. On a Mac you can usually find something more quickly in the dock. In Windows you’re far more likely to have an app maximised by default, and minimising is a natural way to switch tasks. On a Mac, minimising is not a natural way of task switching.


It drove me nuts when they updated the maximize button behavior to full screen.

I use ShiftIt, a neat little Open Source tool that help me manage windows sizes and positions (including minimizing and maximizing): https://github.com/fikovnik/ShiftIt

I've recommended it to pretty much every Mac user I've met.


Haven't used ShiftIt, but I've been using Spectacle to accomplish the same thing. They seem pretty similar, overall. Works pretty well and I've had no issues.


I used Spectacle for years, then was forced to switch to ShiftIt at a new employer. ShiftIt's default hotkey combos didn't conflict with other apps the way Spectacle's did.


Alternatively, use opt-click to just invoke the old-style maximise. No add-on needed.


Double-clicking on empty window chrome also works.


No idea... And a few versions ago Apple changed the maximize button to go full screen, which made it useless for the 95% of us who don't use one app at a time. Minimized windows also used to show up at the bottom of the screen when using "Expose", but then Apple changed the name to "Mission Control" and removed them. Just one more example of them slowly but surely driving its Mac customers away.


> which made it useless for the 95% of us who don't use one app at a time

I'm actually willing to bet good money that it's the other way around and it's you who is in the minority.


It depends on the definition of 'at the same time', but I think that many people in a work environment at the very least use an e-mail client and a productivity application at the same time. Also throw in a calendar for good measure.

I think that Apple thought that many regular users would switch to full screen apps on the Mac, combined with Launchpad (it's just like an iPad/iPhone). But virtually all non-tech-savvy Mac users that I know do not use Launchpad, nor fullscreen apps.

I think the problem with Launchpad as with Spotlight search [1] is that they are not very discoverable on the Mac. Having search in an application menu (like recent Windows versions and some Linux desktops) is far more discoverable.

I guess people don't use fullscreen apps because they equate desktops/laptops to the 'WIMP' interface paradigm.

[1] If I received a penny every time I see even experienced Mac users launch applications by clicking on a Dock icon or by navigating to the Applications folder in Finder, rather than using Spotlight, I would be rich.


Only after reading your comment did it occur to me that I should set up applications in the Launchpad the same way I have them organized on my iPhone (or like Win 3.1's "Program Groups" or Start Menu folders) and stop using the Dock to hold the applications I use most often. Maybe that would help keep track of which applications I have open, and where my minimized apps keep going off to!


You sure you are not confusing it with iOS? That's where you always have strictly only one app in fullscreen!

(Yes, yes. It is a lame joke. But who uses only one Window at a time? What is the point of that? Though I remember when I used OS X you could swipe left (or right) to switch back to all other apps. So I think that was good enough.)


I use several windows at a time (around 5 on average), but they are all maximized. I just alt-tab and switch workspaces?


That's a very un-Classic-Mac-like way of using windows.

In the olden days, Mac users would tend to have lots of overlapping windows. Dragging anywhere on the window edges would move the window, so it was easy to arrange them as you wanted - almost like shuffling bits of paper on your desktop (strangely enough) and if you wanted to move something out of the way, you could fold it up (window shade). There was no need for maximising or minimising and "zooming" just meant "make this as large as makes sense for this particular document", not "expand to take up all the space on my desktop"

As OSX/macOS has developed, all that document/desktop-style behaviour has been lost.


But when it is full screen then you can't alt tab, right? You would have to use the shortcut to switch workspaces. Maybe I just remember wrong.


They're talking about a window sized to the maximum space, as if you clicked and dragged its edges out as far as they can go, or just used a tool like Divvy.

You're talking about the "fullscreen" feature which I always found very weird. For example, ever forget your video was "fullscreen"ed as you try to alt-tab to it only to realize it's a 4-finger swipe to pull it back up. Making the user have to differentiate will always be bizarre to me.


It may be a new behavior (I'm not sure), but you can Alt+Tab to and from full screen apps in High Sierra.


Ah thanks. Then my iOS analogy didn't work.


Speak for yourself. Spaces in macOS is one of the best window management features I've encountered. Of course, it's been common in Linux window managers, but since I use a Mac for work, it's a godsend that Spaces and multiple desktops are implemented. I'm a developer, and being able to organize my windows and split them (fullscreen isn't limited to one app per screen anymore) is so freaking useful that I feel lost whenever I try to use Windows again.


Hold option/alt as you click the green maximize button to get the old fill-available-space behaviour.


> Minimized windows also used to show up at the bottom of the screen when using "Expose", but then Apple changed the name to "Mission Control" and removed them.

Eh? They still do, as of MacOS 10.13.


use alt-click on the green button.


Isn't the maximize button a full-screen mode now?


You may not recall, but when Win95 came out there were lines blocks long waiting to buy it, much like when the iPhone came out.

I recently sold several shrink-wrapped original copies of Windows 95 on eBay (the OS on 32 3.5” floppy disks) as an original piece of computer history.

Win 95 was monumental and great. Aside from Outlook and Excel, the greatest product MS ever made.


I'm predominantly a Linux user, and I switched away from Mac OSX specifically because of the underlying file system. I'll admit that it was annoying finding windows sometimes, but the case-preserving, case-insensitive file system made no sense.


Case-preserving, case-insensitive is good for less sophisticated users: it prevents accidentally misplacing or duplicating files through capitalization, and through this constraint it simplifies media listing and sorting (no worrying about jpg vs JPG).

What are the use cases that bother you?


When a file refuses to rename because the OS thinks it is the same name. When you have multiple files that differ in case, but they overwrite each other because the OS thinks they are the same.

Really, the only place case-insensitive filenames makes sense is when you are searching. It makes no sense for any other reason.
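For anyone curious how this plays out in practice, here's a minimal Python sketch (not tied to any OS-specific API) that probes whether the filesystem backing a directory is case-insensitive: it creates a temp file, then checks whether the same name with swapped capitalization resolves to it. The probe-file prefix is just an arbitrary choice for the demo.

```python
import os
import tempfile

def is_case_insensitive(path="."):
    """Probe whether the filesystem at `path` treats names
    case-insensitively by looking up a file under swapped case."""
    fd, probe = tempfile.mkstemp(prefix="CaseProbe", dir=path)
    os.close(fd)
    try:
        head, tail = os.path.split(probe)
        swapped = os.path.join(head, tail.swapcase())
        # On a case-insensitive but case-preserving filesystem
        # (default HFS+/APFS, NTFS) the swapped-case name still
        # resolves; on ext4 and friends it doesn't.
        return os.path.exists(swapped)
    finally:
        os.remove(probe)

print(is_case_insensitive())  # True on default macOS, False on typical Linux
```

This is also roughly the situation that causes the "refuses to rename" complaint above: on a case-insensitive volume, `Readme.txt` and `readme.txt` are the same directory entry.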


Files don't overwrite each other unless the operation is forced. All of the OSs I've used will prompt you before overwriting.

I also like it when an IDE detects and provides a warning when I have two variables with the same letters but different capitalization.

Different case is often a bug.


Files definitely do overwrite each other without prompting in my terminal. And it is straight bullshit that MacOS doesn't care.


Why not just use it with a case-sensitive file system? Did it break some application you were relying on?


I wondered the same thing, but then acknowledged that it adds a first step of reformatting the disk and re-installing.


Which switching to Linux entails as well.

I can think of a lot of reasons to abandon Mac OS X, but this seems like a really odd one to me.


> I wouldn't switch back because of the underlying crap that is the Windows [...] file system

You lost me. HFS+ is arguably one of the worst file systems around (yes I know that Apple finally switched to APFS but that was fairly recent).

I'm really curious what you actually liked about Mac's file system vs. Windows?


Hmmm... You're right of course. I didn't describe that well. I meant Unix file names and way of working with files vs. Windows. In other words, forward slashes, symlinks, mounts, sane permissions, etc. I hate dealing with drive letters, etc. What's a good name for this?


Oh, the UX of the file space layout.

NTFS/Windows actually has all of that stuff you want, too. NTFS's permission system, for example, is extremely feature-full and integrates nicely with the user system (ACL support by default rather than an add-on, for example). The octal user-group-all permission you're probably used to is pretty crude by comparison.

It's more likely just you're unfamiliar with it rather than it's actually missing anything.

But if you want you can just pretend C:\ is equivalent to / and mount all your other drives at C:\mnt\whatever, that's completely doable (with a GUI to configure it if you want, even)
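To make the comparison concrete, here's a hedged sketch of the "crude" side of that comparison: the classic POSIX octal user-group-all model, demonstrated with Python's standard `os`/`stat` modules. (NTFS ACLs have no direct equivalent in the stdlib; this just shows how little the three octal digits can express.)

```python
import os
import stat
import tempfile

# The classic POSIX model packs all permissions into three octal
# digits (user / group / others), each a sum of read(4), write(2),
# execute(1). That's the entire vocabulary: no per-user grants, no
# inheritance, no allow/deny entries of the kind ACL systems add.
fd, path = tempfile.mkstemp()
os.close(fd)

os.chmod(path, 0o640)  # rw- for owner, r-- for group, --- for others
mode = stat.S_IMODE(os.stat(path).st_mode)
print(oct(mode))  # 0o640

assert mode == 0o640
os.remove(path)
```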


Agreed. NTFS is one of the few things I missed.

I’m still annoyed that APFS doesn’t have file level checksums.


>Even though I've used a Mac daily for the past decade or so, I still miss the task bar, and window-oriented GUI of Windows. I still get frustrated on OSX when I minimize a window and have to hunt around for it. I wouldn't switch back because of the underlying crap that is the Windows OS and file system, but I still miss the interface.

OT but have you tried Witch[1] as a task switcher? It switches between windows, which made my life SO much easier

[1] https://manytricks.com/witch


I find Contexts (https://contexts.co/) a better application to do exactly that


Wow, looks like a very polished Witch and then some. The addition of search looks interesting. Any area or use case where Witch is better? Is it as snappy? I like that Witch is instant.


I remember using Witch, but a very long time ago. I remember stopping using it, though the reason escapes me. I've been using Contexts for what feels like a year, and I keep using it. It's that good.


Looks neat, I'll take a look - thanks


Not sure why you’d have to hunt for minimized windows because there is a separate area on the dock reserved for them, and the animation clearly shows the window moving towards it.


> I still get frustrated on OSX when I minimize a window and have to hunt around for it.

OSX took the NextStep/OpenStep interface and dumbed it down to great detriment, then added new things back (spaces, zooming) which were inferior logically but required less 'thinking' of how one works and more 'shiny looking' to potential customers..

IMHO hands down the best mouse-oriented window management paradigm to exist to date is the NextStep/OpenStep style over and above windows and osx, though I will admit windows has improved things with their sort of hybrid 'classic' windows+'macish' updates, and some of the newer ui things (e.g. window thumbnails) haven't made it into the current flagship of that lineage which is the open source WindowMaker..

Since the 'official' lineages are dead, am hoping the WindowMaker people continue to innovate/move this paradigm forward as they have been doing subsequently for the last N years...


I am glad you mentioned right click. I remember being really pleased with the orthogonality aspect of being able to right click any object.


it was by no means assured that Windows 95 would be the success that it was

I remember these times well. It was considered a huge break. People were whining about how stupid the Start menu was compared to just seeing your apps in front of you all the time :-D I love that we eventually came full circle to a Windows 3.1 Program Manager-esque approach with iOS nowadays!


Note that progman.exe was still shipped with the OS (it disappeared for good with Vista), and you could even set it as the default shell at boot.

I don't remember anyone using this setup ever though. Maybe this was too obscure and technical for the users complaining.


I still remember the Windows 95 hype. It was on the level of the excitement for the first iPhone releases.


> The very first version missed wildly in some big ways (MSN was a folder integrated into the desktop, for example

That would not have been my first pick for an example of Win95's failures. Perhaps the daily system crashes.


And Win95b with opengl, proper win32 support, and a tcp-ip stack was much better compared to the addon for win31.

I remember when unified USB support came out in Win95 2.5, it was big damn news at the time.


To each their own.

To me Windows is bloatware. But I also make the OSX dock as small as possible and auto-hidden. I launch everything through Spotlight, though, as I abhor unnecessary point-and-click (synonymous with hitting the Windows key and typing a couple letters of the application to launch).


You’re using macOS wrong ;)

I remember Windows 95 as a complete disaster of crashes, data loss, failing installations, incompatible applications, missing drivers and countless other problems which were only fixed with the release of Windows 98 (maybe even SE), which was much much better. I have memories of people sticking with DOS and 3.x, only having 95 as a nondefault boot option in case they wanted to watch the Buddy Holly video or launch the new Encarta cd-rom.


Most of the above are talking about the UI which is arguably separate from the underlying OS. You can like a UI even if it's a UI to a system that crashes a lot :)


> still get frustrated on OSX when I minimize a window and have to hunt around for it.

Learn to use the dock? The taskbar in Win 95 was evolutionary rather than revolutionary, and the Dock from NeXT was one of its influences. It is the same Dock we have today in macOS.

You can still have personal preference, of course. But if you have trouble using macOS to find minimized windows, that's because you haven't learned to use it not because it's not possible.


"not understanding how folders could exist inside of other folders" -- My mom is 70 years old now, and I easily get frustrated whenever she's stuck with seemingly simple tasks with her computer. I usually scold her and yell at her, "This is so obvious, how come you don't know?" -- I always regret doing that afterwards.

After I'm calm, I ask her why, trying to understand it from her perspective. Every time I do this, I'm always surprised, because she gives valid points, and I end up cursing the developer :D

So, whenever I design UI/UX for an app, I ask my mom to test.

Rant: In my opinion, there should be an option in Mac/Windows to disable file drag and drop. Every time I check her computer, I find dislocated files simply because she accidentally dragged them.


>So, whenever I design UI/UX for an app, I ask my mom to test.

I have an informal rule that I will try to get someone at my job that has never seen or used the application to be the one to test out new features or UI changes. Generally just asking when they have some time, handing them a phone or laptop, and asking them to do a task in the app (with a small amount of background about the task if needed).

There has never been a case where this hasn't monumentally improved the application. Questions like "what do I do here?", "how do I get it to start?", and "did it work?" were extremely common for quite a while before we managed to get the UI in a good state. You just don't see the implicit assumptions you make at so many places.

Sadly it's hard to "formalize" something like this (at least in my experience), because the benefits seem to be greatly reduced if the person testing has seen or used the application before, and I found it works best the "further away" someone is from software development.


Don't forget about repeated use. Do your users come back, or do they use the app only once?


This isn't the only thing we do for UI/UX stuff, it's just one that I've found can really help, and most developers should be able to do it in some fashion.

We have plenty of analytics and user testing, but they tend to miss the case of users that are unable or unwilling to learn the application, and end up perpetually confused.


Because you can't "formalize" humbleness. In fact, a lot of the tech culture, today, is anti-humble.


I don't think it has anything to do with "humbleness".

It's that there isn't really a way to formally gather and have people that haven't really ever used the application before, but know enough about the problem domain to be useful in testing. Not to mention that it's a non-renewable resource, once I've done it with someone for a specific part of the app, they are "burned" (at least for a few months). (that last sentence came across really... shitty sounding, but I can't figure out how to reword it to get the point across without sounding like I'm treating people like old computer parts, so I'll just add this disclaimer...)

Any kind of "formal" process ends up just looking like QA, and they end up making the same kinds of assumptions that the developers and designers do since they work and know the application just as good if not better than them.


It isn't humility, it's learning.

Once you've learned something, you can no longer remember what it's like to not know it. Not fully, anyway, and certainly not without deliberate effort. It takes quite a lot of mental effort to stop knowing something, and approach a task with the mindset of someone who's never known it.

Good QA people should be able to do that, but it must be hard even for them.


I used to get frustrated explaining what I thought were simple computer tasks to my mother, but as I've gotten slightly older I am getting just as stuck as she was with some basic computer tasks! I totally get it now.

She also reminds me that she used to work at Digital (DEC), on the cutting edge of tech, in the 80's, but left her job to raise me- so I'm the reason she's so far behind, technically!


There's a world of a difference between VAX and today's modern operating systems. Twenty years in the future, I predict user interaction will largely be predicated on artificial intelligence. I can certainly see an older version of myself, frustrated at trying to figure out what sentences will trick the machine into doing what I want it to do rather than simply speaking to it.


We are pretty close to that already, in some areas. Google Search is a good example; autocorrect on the Android keyboard is another.


I like that "she's so far behind technically" can be interpreted two ways here.


>In my opinion, there should be an option in Mac/Windows to disable file drag and drop.

That would save SO MUCH HASSLE. Great idea.


OT but related to Drag&Drop: my IntelliJ at work is set up with the "drag files with ALT pressed" setting, mostly due to my "lazy" clicking that ended up moving classes across packages and disrupting my flow. I would love to see that option in any file manager as well. Also, I hate the "drag selected text" feature with a passion.


The term "folder" already annoys me. It's a (filesystem) directory...


> The term "folder" already annoys me. It's a (filesystem) directory...

You don't like the metaphor of a folder, but you'll accept the metaphor of what goes into a folder?


The two metaphors are not as tightly coupled as you suggest. Even pre-computing, a "file" meant a collection of information and a "filing system" was a way of organizing it - that didn't necessarily involve paper folders. In fact, card files, from which we ultimately derive our computer term, were more usually stored in boxes.

I too prefer the term "directory". Apart from all UNIX tools using this terminology, it also more accurately reflects what it actually is - not a physical container for information, but a list of references to it - an association of names to addresses. A paper file cannot be in more than one folder, but it can easily appear in more than one directory. On a filesystem that allows hard links, this is a more useful conceptualization.
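The "directory as a list of name-to-inode references" view is easy to demonstrate on any filesystem that supports hard links. A short Python sketch (the file and directory names are made up for the demo) creates one file that appears in two directories at once:

```python
import os
import tempfile

# A directory is a mapping from names to underlying files (inodes),
# so one file can be listed in two directories via a hard link.
root = tempfile.mkdtemp()
dir_a = os.path.join(root, "a")
dir_b = os.path.join(root, "b")
os.makedirs(dir_a)
os.makedirs(dir_b)

original = os.path.join(dir_a, "report.txt")
with open(original, "w") as f:
    f.write("hello")

linked = os.path.join(dir_b, "report.txt")
os.link(original, linked)  # a second directory entry, same inode

# Both names refer to the same underlying file.
assert os.stat(original).st_ino == os.stat(linked).st_ino
assert os.stat(original).st_nlink == 2

# Writing through one name is visible through the other.
with open(linked, "a") as f:
    f.write(" world")
print(open(original).read())  # "hello world"
```

Under the "physical container" folder metaphor, a file living in two folders simultaneously is nonsense; under the directory metaphor it's just two entries pointing at the same thing.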


>Apart from all UNIX tools using this terminology, it also more accurately reflects what it actually is - not a physical container for information, but a list of references to it - an association of names to addresses.

Yes, but "directory" doesn't describe the way the UI appears to behave in a way that's more intuitive to the average non-technical end user than "folder." A "directory" could also describe a map or a phone book for most people.


>"directory" doesn't describe the way the UI appears to behave

That rather begs the question though, doesn't it? To a nontechnical user, the ship has sailed - everyone calls them "folders", the icons are ubiquitously pictures of file folders, and it would be terribly confusing to try and change it now. That doesn't mean it was the right choice though. I could imagine a world where the icons were little phone books, and the concept of a hard link didn't require a half-hour explanation.


Check the comment four levels above. Apparently it is not at all intuitive that folders can be inside other folders.


I don't understand that, the folder was picked for the filing cabinet metaphor.

But in a filing cabinet system you have different cabinets, different drawers, different hanging files and various lower level arrangements - non-hanging folders (or drawer dividers), paper clips, staples.

Perhaps they never saw a filing cabinet? Or never put something in a box, then in another box?

When they put food in a cupboard do they empty it out of the current container first?

<<Your house is a drive, every room is a "folder", every container in every room is a "folder", every container inside another is a "folder", every object is a "file".>>

I've come across it, just assumed that no-one explained the metaphor to them.


> I've come across it, just assumed that no-one explained the metaphor to them.

If it was this simple, there would be no people who struggle to understand recursion. And I know of quite a few.


The places where the metaphor gets stretched can be frustrating, especially in Windows. You can put Folders on the Desktop, but there's also a "Computer" icon on the desktop which allows you to browse into other folders.. including the desktop?

Then there's your Personal folders.. which are in fact collections of other folders themselves, but appear transparently to be folders- but somewhere on the file system they are actually different folders. But without the benefit of breadcrumbs to help you keep your place in the structure, where did that file actually go?

Don't get me started on shoehorning OneDrive (or was it OneDrive for Business?) into the mix.. where'd I save that file again?

That's why my desktop is so cluttered!


>You can put Folders on the Desktop, but there's also a "Computer" icon on the desktop which allows you to browse into other folders.. including the desktop?

That's a specific failing of Windows, not something inherent to the folder metaphor, though.

It's one of those absurd things that happen when nobody pays any attention to coherence. That particular thing has driven me BATS for years.

I'm encouraged, though, that my Win10 machine here doesn't seem to continue this foolishness. WinExp used to always show the Desktop as the "root" of everything, but it doesn't now. The quick access pane shows This PC as the root, with quick links below it for Desktop, Documents, etc. It's almost like MSFT is learning.


Well, yes. Windows' implementation of the metaphor is my complaint. In its current form, they may have cleaned up the "desktop" but also made it more confusing to tell what is a special folder/collection/Library. The way it appears now, there's a "Library" called Documents, which includes a folder called Documents but may include others as well. Where am I at a given moment, not sure.


"Folder" makes perfect sense to me. At my first job, we used a paper-based filing system, and we'd sometimes put folders inside of other folders. It's perfectly legit. You wouldn't nest them 5 levels deep, but that's simply a physical limitation: you wouldn't put 100 files in one folder, either.

"Directory" I don't understand at all. The only thing I can think of is the felt letter boards at the entrance to office buildings that say "DR SMITH - OFFICE 555". Not only are those a more obscure metaphor, but I've never seen them nested even one level deep. I don't have any idea how I'm supposed to use that metaphor. "Create directory": am I creating a new building? Is my building going to have two directories in the lobby? How is one 'inside' the other?

I was completely lost on Unix until it was pointed out to me that "directory" was just its weird name for folder.


The term folder was already common in other systems, before being adopted on Windows 95.

Directory is a UNIX thing.

On AS/400 (now IBM i) there are no directories or folders, rather catalogs.

Similarly other systems also had other denominations for group of files.


Directory was a DOS thing too; at the Win95 command prompt you'd use dir.


Yep, a concept that MS-DOS inherited from UNIX, when they added directory support on MS-DOS 2.0.

However, even UNIX later adopted the folder designation for its GUI variations.

http://toastytech.com/guis/unixpcorganize.jpg

http://toastytech.com/guis/unixpcerrors.jpg

http://toastytech.com/guis/sv411fileview2.png

https://docs.oracle.com/cd/E19504-01/802-5817/6i9i42q3l/inde...

There are plenty of other examples of folder use, so it isn't something that Microsoft just decided to invent for Windows 95.


I agree with the downvotes in so far as it is silly to still get upset about this (instead of silently admitting defeat), but I am 100% guilty of that folly myself.

It's highly relevant to the discussion because it illustrates just how deeply ingrained these basic UI expectations are. UI (re-)designers take note, in some cases even decades of forced retraining won't make everybody accept gratuitous change.


Me too!

However, in the Windows shell a "folder" is not necessarily a "directory" - it's just something that may contain something. Control Panel, for example.

A directory is a type of folder.


And indeed I remember that as one of the more confusing things when I got my first Win95 PC as a kid. I had used an Amiga and some DOS before, and I knew what a directory was, but now all of a sudden there's something called a folder that seems sort of similar?! To add to the confusion, the computer manuals written in my mother tongue used both the word 'folder' and its literal translation in parallel. It took me a while to understand that it all meant the same thing.


I suppose it's because folder sounds "less tech", maybe.

However, directory was also a borrowed term (and to be honest, folder kinda makes more sense).


I like that folder icons on most GUIs are still that light brown manila colour, even though plain brown folders went out of style with leg warmers and mullets.


I’ve seen workplaces in a major meltdown when someone accidentally dragged a folder somewhere and no one could find it.

And of course, backups weren't properly tested and didn't work.


How do you manage files in Windows at all without drag and drop? Do you just never reorganize anything?

Further, what are the chances that anyone who can't drag and drop would be able to find the setting to turn drag and drop off?


OP didn't want to remove the functionality, just to provide a way to disable it for users who often perform drag & drop operations accidentally.

Older people especially have problems with their motor skills, so instead of clicking on a file they often perform a drag & drop: they can't hold the mouse steady enough, so the system registers movement, and at the same time their "click" is too slow, so the OS recognizes it as holding the button down.


Cut & Paste for when you really need to reorganize? But usually it would probably be enough to just list the most recent files somewhere and provide full text search for everything else.


Seeing how my elderly parents use the computer: you don't manage files. At all.

Right click is evil, keyboard shortcuts are evil, drag and drop never occurs to them. Mom usually just reads her emails (and attachments open in the browser nowadays), so it's not a problem, but my father saves files.

If he finds something important then he saves it multiple times, so in the file list view there will be multiple similar items; that's how he knows it's important (visually it's very distinctive). When he has to copy a file somewhere else, he usually fires up his CAD software, opens the file and then saves it elsewhere.

I just gave up "educating" them (after countless attempts). It's pointless. I remove file duplicates every half a year, copy stuff to a USB key when he asks me (I visit them once a month) and avoid the machines like the plague, unless they explicitly ask me to do something (I installed Ubuntu for Mom, which works wonderfully for her, one less problem to worry about).


There's something different about cut & paste vs drag & drop, though it won't impact most people.

If you're using RDP and copying files from the host computer to the remote computer, it's natural to cut/copy the file on the host and paste it on the remote. Drag/Drop isn't as natural because you have to mess around with the RDP window size and scroll position to be able to see Explorer on both machines.

The file copy occurs very slowly, and requires a lot of memory. Copy/Paste uses the clipboard, so the entire file gets read into memory before it can be written to the remote filesystem. It goes slowly because there is apparently an un-optimized loop copying bytes.

Instead, on the remote system you should open two Explorer windows, one for the destination and one for your local filesystem, which RDP adds for you. Then you can drag & drop the file you want to copy. This skips the clipboard and also seems to use a much better optimized byte-copying loop. The speed difference is very noticeable.

Pro-tip: if you're copying lots of files, zip them first and copy the zipped file. That can reduce a multi-hour copy operation into a couple of minutes, even if the compression ratio is awful. Windows is really, really bad at multi-file copy operations.
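The zip trick above can be sketched in Python (a stand-in here for Explorer's built-in zip support; the directory, file names, and file count are invented for illustration):

```python
# Sketch: bundle many small files into a single zip before transferring.
# One archive means one large sequential transfer instead of hundreds of
# per-file round trips, which is where multi-file copies lose their time.
import os
import shutil
import tempfile

src = tempfile.mkdtemp()
for i in range(100):  # simulate a folder full of small files
    with open(os.path.join(src, f"file_{i}.txt"), "w") as f:
        f.write("some data\n")

# make_archive(base_name, format, root_dir) returns the archive's path.
archive = shutil.make_archive(
    os.path.join(tempfile.mkdtemp(), "bundle"), "zip", src
)
print(os.path.exists(archive))  # True
```

You'd then copy the single bundle.zip across and extract it on the other side; even with a poor compression ratio, the per-file overhead saved usually dominates.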


Cut & Paste is (was?) dangerous btw. Windows could lose your files:

1. Cut a file.

2. Paste it somewhere.

3. Hit control Z.

4. Cry because your file is gone.

https://answers.microsoft.com/en-us/windows/forum/windows8_1...

But it seems I cannot reproduce that in the current build of Windows anymore. So I think they finally fixed it.


Yup, I ran into this issue enough as a kid that I stopped using cut/paste/undo in explorer. If I needed to copy or move files around, I’d use the command line—which also had the effect of making files copy much faster (seconds instead of minutes) for some reason.


Ribbons, context menus, keyboard shortcuts. For elderly users, I usually recommend the context menu because it is most universally available.

And indeed, most of the "gosh I broke the computer, all my files are deleted" calls from my grandma are caused by accidental drag-and-drop.


I'd like an interface without DnD too, not because I trigger it accidentally but because it's very bad for my RSI. And it's a nightmare on a trackpad.

Fortunately you can do 99% of file management in Windows with the keyboard, either through cut and paste or the Explorer menus.


The Mouse Keys accessibility feature lets you activate drag and drop without holding anything down.


> How do you manage files in windows at all without drag and drop? Do you just never reorganize anything?

Ctrl-X, Ctrl-V.

Are you a Mac user? Apple disabled Cut in Finder ages ago, basically for aesthetic reasons. It's a constant irritation with my Mac that I have to manually change the folder view just to move a file up one level; on Windows you can do it in less than a second with Ctrl-X, Backspace, Ctrl-V.


Apple did not "disable" Cut.

1. Copy the source file with Cmd-C

2. Browse to the destination

3. Move with Cmd-Opt-V

It's a UI design decision to avoid Cut in the filesystem. Cut would behave differently from Cut in every other app -- namely, the file is not deleted immediately after Cut. It seems better to avoid calling this "Cut" than to have it behave inconsistently from other apps.


There are Finder extensions like XtraFinder that add cut & paste to it (and a lot of other goodies).


> How do you manage files in windows at all without drag and drop?

Most people I've seen use Ctrl-X/Ctrl-V, or right-click-Cut/right-click-Paste. It has the advantage that you don't need both the source and destination visible at the same time.


You can actually drag a file from one window to the other without having both showing (such as between two maximized windows). While dragging a file, you can still manipulate windows around using the keyboard, like with alt-tab.

For example, you can begin dragging a file, then alt-tab to another window or tab or even open a new program or whatever using the keyboard, and then finally release the file into the destination window.

It's not obvious that this would work, but it is convenient if you find yourself wanting to drag something with the mouse for some reason.


You can also drag the file over the taskbar and whatever program you hover over will come to the foreground


I use the mighty Total Commander all the time :)


Right-click cut and paste, or keyboard shortcuts?


Drag and drop is one of those things that looks oh-so-fancy in demos, but is horribly imprecise to use as a daily action.

It also gets in the way when trying to use Windows via a touch screen.
