Compare this to modern video players, especially on the web, and it's not that bad. No disappearing buttons. No panels sliding in from all four sides of the screen on mouse hover. This QuickTime thing doesn't appear out of nowhere and start autoplaying videos when you launch some program or open a document. It doesn't move to a screen corner when you press the "close" button.
I've been using computers since the mid 1990s, and after getting better from about 1995 to 2005, software has taken a turn for the worse. I'm sorry, but most of it's just garbage these days. The new Gmail site lags on my Thinkpad with an i7-8650U (turbo boost to 4.2 GHz). When docked to the left-hand side of my screen (using WIN-LEFT), the layers of various toolbars mean the actual message text doesn't start until halfway down the screen. The content area extends past the right edge of the window, and on top of that, it won't let you scroll that pane sideways to read the rest of the content. The reply button gets shoved off screen too, and the forward menu item is buried under some cryptic icon with three dots. And this is with the "compact" view and the left sidebar hidden!
In short, Gmail 2018 does a fraction of what Outlook '98 did two decades ago, and does it while pissing away processing power, memory, and screen space. This sucks. I want off this ride.
(While I'm ranting: what's up with Dropbox these days? I've been looking for an app to manage a library of PDFs with notes alongside them. I decided to try Dropbox & Dropbox Paper. The web app burns 20% CPU just sitting there displaying a PDF. It can't render fast enough to keep up with scrolling. I don't remember PDFs being onerous back in the late 1990s, but web apps seem to universally struggle with them even on modern hardware. After trying a bunch of other crap-tacular web apps, I stumbled upon Citavi, a traditional Windows .NET app. I was so happy I almost cried. It displays PDFs! It sits at 0% CPU usage when idle! It has a menu and toolbars, and you can dock and undock the content area to fit everything on screen! It's a wholly unremarkable app that's remarkable only because software has gotten so incredibly awful in the last decade.)
This was also an age of people getting on computers for the first time. The brushed aluminum feel, the drawers, the static buttons, etc. were all there to make it feel like a physical media device, since the target users had been consuming media on devices like that through the 90's.
The nostalgic web designer / graphic designer from those years in me is really fond of, and misses, the old conventions of web interface and application design.
I don't know if anyone / everyone remembers the phong Photoshop tutorials (oh my god, they are still there! http://archive.phong.com/tutorials/ ), but that type of interface design had my heart back then.
It was all about creating almost sci-fi-like installation design. Wires, vents, metal surfaces, rust, plastic, gel, glass, reflections, scanlines, wireframing, grids and 3D grids, and other things like that. Think Starcraft's HUD design and the Terran installation-type maps that floated in space. ( https://media-curse.cursecdn.com/attachments/21/437/b34694f4... <~ this is actually the perfect representation in my mind of what I wanted to make, in terms of the graphical interface of websites, in the mid-to-late 90s and probably early 2000s, before everything became about "clean, flat, simple, elegant".) And I'm not so sure I like things that way.
In fact, I think I'm going to make it an express feature of all my future side projects that the design portion will be dedicated to creating mid-to-late-90's-style interface / graphical designs based on phong's tutorials and other things I loved back then.
Yeah, to my surprise there's a lot that I actually miss about that era.
I don't really miss the graphical excess of trying to make one's site look like a Starcraft map, but even when that graphical excess existed it was sort of a thin veneer atop good old raw plain fast HTML. As opposed to today's sites, which download and JIT megabytes of Javascript before they're ready to respond to a single click, and have often invented their own bizarre UI patterns.
It's odd how, right after that era, the design stuff we're getting nostalgic about was derided more often than not. Is that something you remember as well? It wasn't called out as terrible and nasty the way, say, Geocities or MySpace-type pages were, but I know there was a drastic shift, and unlike most trends it hasn't seemed to loop back, even on the internet of all places, where things seem to cycle back into popularity faster than something like fashion. Maybe that's just my personal perception, but it feels at least somewhat correct.
> it was sort of a thin veneer atop good old raw plain fast HTML
Yes, I loved that too! It felt like you actually had a handle on everything your website was doing and was composed of. There wasn't anything involved in it that you just installed and assumed would work. Each decision and technology was something you knew inside and out and employed deliberately.
Things these days are crazy, eh? Maybe it's a bigger combination of that part of it plus the fact that it looked the way that I described that creates the fuzzy feelings? Well, if I ever get around to doing what I said above I'll have some sort of answer, haha.
I do think that mid-2000s "web 2.0" era that came after the era of shiny pipes and stuff deserves a mention.
The "web 2.0" era was pretty good. There was a doubling down on semantic HTML, sharing data via RSS, etc. The first cross-browser JS frameworks began to pop up, which let everybody write JS once and have it work mostly everywhere.
During that era there was definitely a lot of derision of that late "web 1.0" era with the superduper photoshop-image-slice-heavy rendered pipes and stuff UIs. At its worst a lot of it fundamentally broke basic web functionality like bookmarking and such. The derision was probably justified, since it actually sort of sucked when the whole web looked and worked like that.
HOWEVER, like you I am nostalgic for some of that excess! I like it better than a lot of today's web!
Indeed, skeuomorphism was solving an actual real-world problem: giving people analogies to physical objects. It's been in vogue to hate it for so long that we forget the design challenges our predecessors faced. There are a few strikingly bad choices here (mostly the drawer), but the buttons are all omnipresent and look familiar even if digital.
If you stuck the modern QuickTime interface in the 90s, I imagine the disappearing buttons and positional play interface would be daunting. We've evolved as our literacy has evolved.
What, you mean to say it's not a brilliant design idea to make everything flat and make clickable and non-clickable areas indistinguishable? Shocking! /s
Or mobile Android YouTube. "Video starts dim with an overlay. Pressing the X in the corner to hide the overlay may or may not skip your video to the end." Sometimes the X just disappears and you can't close the overlay.
This is one of my favorite user interface design articles that I recommend every chance I get, and it should be required reading in every design class.
Apple's long romance with skeuomorphism peaked with Bob Bishop's APPLE-VISION, and went downhill from there.
>[... detailed step by step instructions to demonstrate a terrible usability failure that I wrote up in a bug report that was brushed off and ignored ...]
>This single facet of VLC's terrible UI deserves to be front and center in the User Interface Hall of Shame [2] -- it's even worse than Apple's infamous schizophrenically skeuomorphic QuickTime 4.0 player [3], from 1999! The latest version of VLC in 2017 is still much worse than the shameful QuickTime player was 18 years ago!
At least QuickTime 4.0 serves as a useful ruler by which we can measure the terribleness of other video players.
I wouldn't say VLC has the best user interface in the world, but in no way is it as bad as the QuickTime/WinAmp [1] era UI we went through, where the UI looked cool but it wasn't obvious which part of it was actually functional and which was decoration. Not only that, but it was completely abstract compared to the OS's UI. VLC's interface is relatively honest in its appearance, in my opinion.
I think modern UI is becoming overly simplistic - it's no longer obvious what is meant to be interacted with [2]. Humans usually have a good grasp of depth, and of things pretending to have depth.
I imagine OSX will eventually go the way of iOS and Android with a flat UI, where it's not obvious at all how you're supposed to interact with it. Apple products have been getting harder to use for a long time, with features such as the insane four/five finger scroll on an iPad to change apps, close windows, etc [3].
And don't even get me started with the Apple Newton [4].
To put QuickTime and WinAmp in the same sentence? QuickTime (on windows) is one of the worst applications I've ever had the misfortune to use (it was pretty much a requirement to view many .mov files). You couldn't even play fullscreen without paying.
Winamp on the other hand, I can't name a single application that does what it does better today. Also, barely any media player today looks native either.
>Winamp on the other hand, I can't name a single application that does what it does better today.
I'm talking UI, not functionality. Functionality wise - I am just as in love with the old WinAmp. And regarding UI, it was awesome that people could re-skin it!
>Also, barely any media player today looks native either.
Of course, but some do a better job than others. I think VLC appears more native anyway (it uses native menus and dialogs).
Do you really call VLC's "interlayed" "VLC ZOOM HIDE" user interface I described in detail in my bug report (that was brushed off and still a bug years later) a "native" user interface? It's absolutely not. Go back and read what I wrote, because you're totally missing the point. Both the UI and the functionality are terribly broken.
Do you actually believe there is any possible justification for VLC NOT to use a native user interface? Instead, it draws the user interface INTO the video at video resolution, rotating it with the video but not rotating the mouse events, so it is absolutely UNUSABLE under rotation or reflection. It's extremely hard to use even unrotated, because they re-invented ugly, non-standard sliders in a stupid way that flouts Fitts' Law for no good reason, with giant jaggy pixels, a terrible blocky font, and short upper-case labels so it's not too enormous for low-resolution videos. How is that better than drawing a native user interface at full screen resolution, with a decent full-resolution anti-aliased font, in an overlay that is actually usable at any video resolution, scale, rotation, or reflection?
Please stare into these screen snapshots for a few minutes and reflect on what I said, and then go run the latest version of VLC, and prove to yourself that it's still just as bad as I described before replying, because it is.
While I'd agree that VLC is fairly bad usability-wise, concentrating on this single feature is wrong, I think. It's an obscure feature buried two or three layers deep in an effects panel that most users never see or use.
VLC has a lot of features that are often rather tangential to a media player, and it can sometimes act as a frontend for ffmpeg (especially the stream/convert tools). The fact that basic features, power-user features, and plain gimmicky crap (like the puzzle mode) are lumped together in one application without organization, rhyme, reason, or separation almost automatically makes for an atrocious UI. That some features were added by someone who simply felt the need for them, or needed an exercise in programming, doesn't help. VLC is a kitchen sink of vaguely media-related stuff, and the UI does little to hide this.
One point I'm split on is single-key hotkeys. For years it was trivial to crash VLC simply by starting to type (something that happened to me regularly when I forgot that my IM application didn't have the focus, but rather VLC). If it didn't crash, you now had the video horizontally mirrored, in greyscale, with subtitles in Ancient Sumerian, and video, audio, and subtitles each a second out of sync with one another. Good luck figuring out how to revert that. Having hotkeys only be F keys and Ctrl+something at least has the benefit that you can't mess things up accidentally, at the expense of needing chords to interact with the application via keyboard. (I'd say Windows Media Player's Ctrl+P for Play/Pause, with Space only working intermittently depending on which button had the focus, was just as bad, but at least you couldn't crash it by typing into the wrong window by accident. The same goes for YouTube requiring K for Play/Pause, with Space only working intermittently depending on where the focus is ...)
To their credit, WinAmp and QuickTime (or any other known video player) never rotated the zoom widget with the video rotation, the lower end of which was so small it was unusable at any scale, and the entirety of which was unusable at any non-normal rotation or reflection.
That's right: the mouse target is the outlined region whose width (and thus usability) diminishes to zero towards one end. It's the most Fitts' Law Hostile user interface I've ever seen, AND it rotates sideways and upside down with the video, AND the mouse tracking doesn't work at all in 3 out of 4 rotations (or reflection), only when it's non-rotated and non-reflected: they didn't bother to transform the mouse event coordinates, just the video-resolution user interface drawing!
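The bug being described comes down to a missing inverse transform: the UI drawing is rotated along with the video, but the incoming mouse coordinates are never mapped back into the UI's un-rotated space before hit-testing. A minimal sketch of what correct hit-testing looks like, assuming a simple rotation about the video center (this is an illustration, not VLC's actual code):

```python
import math

def mouse_to_ui_coords(mx, my, cx, cy, angle_deg):
    """Map a mouse event from screen space back into the UI's un-rotated
    coordinate space by applying the INVERSE of the video rotation.
    (mx, my): mouse position; (cx, cy): rotation center;
    angle_deg: how far the video (and the UI drawn into it) was rotated."""
    a = math.radians(-angle_deg)  # inverse rotation
    dx, dy = mx - cx, my - cy
    ux = cx + dx * math.cos(a) - dy * math.sin(a)
    uy = cy + dx * math.sin(a) + dy * math.cos(a)
    return ux, uy

def hit_test(mx, my, widget_rect, cx, cy, angle_deg):
    """A widget laid out in un-rotated UI space keeps receiving clicks
    correctly at any rotation, because we transform the point, not the rect."""
    x, y, w, h = widget_rect
    ux, uy = mouse_to_ui_coords(mx, my, cx, cy, angle_deg)
    return x <= ux <= x + w and y <= uy <= y + h
```

Skipping that one coordinate transform is exactly how you end up with controls that draw rotated but only respond to clicks in their original, un-rotated positions.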
Don't even get me started about "interlaying" the user interface INTO the video, so it looks jaggy with unreadable text, instead of "overlaying" the user interface at full screen resolution. Just look at those blocky letters "VLC ZOOM HIDE", and the ferocious jaggies on the diminishing zoom slider curve!
It's not even worth fixing any of those particular bugs in the zoom interface, because the entire concept of "interlaying" the user interface into the video is fatally flawed.
All that code needs to be printed out and burned, and VLC's entire user interface needs to be completely rearchitected from scratch by designers who know and respect Fitts' Law, so it works consistently across all platforms at all resolutions and rotations (and doesn't suck, while they're at it).
But that's not what the VLC developers want to acknowledge or work on. They're more interested in supporting the latest fashionable Anime format.
>New 6.1 downmixer to 5.1 and Stereo from MKV/Flac 6.1. Correct YUV->RGB color matrix in the OpenGL shaders. Improved MKV support for seeking, and resiliancy. Editions support in MKV. Better subtitles and metadata support from MKV. Various ASS subtitles improvements.
I'm not sure why you take such issue with the fact that they added new features "FOR ANIME FANS". Not every foss project is going to focus its resources where you think they should be focused. You might as well say "FOR MOBILE" and print out that entire section because it's equally not focused towards the desktop ui improvements you think are necessary.
EDIT: Typo in necessary
Not just added, but headlined in their release announcement. Yet there was no "FOR USABILITY" section, and more than three years later, VLC 3.0.4 STILL has the same constellations of bugs and egregious usability issues.
I would also hazard a guess that the mobile user base is larger than the anime user base, and mobile users are just as concerned with usability as desktop users.
Are you seriously suggesting that there are people who believe that VLC needs to focus on anime more than usability?
Is it just you who feels this way, or can you point me towards some evidence of that, please? Because if you read the VLC bug reports and discussion group, and even this thread, you can see that there are a lot of people who agree with me that VLC needs to focus more on usability. Where are all these anime fans demanding VLC focus less on usability and more on supporting their anime fetish?
>Are you seriously suggesting that there are people who believe that VLC needs to focus on anime more than usability?
Either the anime audience is better at filing bugs and chasing them up, or the developers favoured those tasks for some reason. Perhaps the UI guy is really slow, or just missing entirely.
>Where are all these anime fans demanding VLC focus less on usability and more on supporting their anime fetish?
There aren't any. It's clearly not a trade-off on developer time. For an open source project, people work on what interests them the most.
Specifically in regards to "anime fetish" - who actually cares? What difference does it make if it is an Anime crowd, people who like watching 8k raw footage, people who want to watch 3D films or whatever. What does the specific reason for a video format being worked on bring to the table, other than the fact you clearly don't like it?
I don't dislike a video format. I dislike terrible usability, and arrogant, cliquish developers who refuse to even acknowledge there is a problem: who not only fail to focus on it, but actually make ridiculous, flippant, insincere arguments that it's not a problem, in their consistent efforts to brush off the repeated valid bug reports they get from many different users, and who steadfastly refuse to do anything about them, every single time.
And it's not a matter of anime fans being better at filing bugs. As I said, I filed some very detailed bugs, with step-by-step instructions to prove there was a problem, and they were flippantly dismissed and ignored.
At least Apple had the decency to acknowledge the universally held opinion that the QuickTime 4.0 player SUCKED, and they fixed it.
Developers are not interchangeable, especially on open-source projects. The group of people who contribute their time to work on core multimedia features (like audio mixing or OpenGL shaders) has almost no overlap with the group of people who will work on UI/UX.
Are you sure about that? On my Mac, the full-screen overlay shows both skip and scan arrows. The windowed version normally shows just the scan arrows. You can select an option to reveal the skip arrows when windowed. In none of the scenarios do the skip/scan arrows ever change function.
I never understood why that became a dirty word in user interface design.
There's nothing wrong with wanting a calendar to look like a calendar. If someone comes up with a different paradigm that is better, then by all means do it, and let people decide if it's good.
But ever since skeuomorphism became an anti-buzz word, interfaces have been designed to be different just to be different, not to be better.
I don't mind the skeuomorphic look. However, it almost always went hand in hand with skeuomorphic behavior, which I detest. From the article:
> Unfortunately, the apologists fail to recognize that there are two likely consequences of this approach: (1) the user is unable to transfer his or her existing knowledge of computer interaction, and (2) the software becomes needlessly subject to the limitations of the physical device.
I remember the bad old days where the Mac calendar would only let you flip between pages, because that's how real calendars worked. All your contacts were grouped by first letter, because Rolodex. It was a death by a million cuts where nothing worked as well as it could have because it had to exactly emulate the crummy real world thing it looked like. No thanks - that's the aspect of skeuomorphism that gave it such a horrible reputation.
If you want to make my calendar look like it has leather trim, fine. So be it. Just don't make it act like an old-fashioned desk calendar instead of the special purpose database it really is.
While I fully agree with the critique of VLC usability, we have to admit that we get what we pay for with [probably] the majority of open source software.
What's worse, I don't think there is a reasonable solution to this problem. The popularity of free software is just too alluring to users, and asking for monetary compensation instantly reduces the user base.
Example: Linux and OpenSSL. Large corporations benefit from running open source software. Do they compensate the authors?
...or I could just use mpv, which does what I want better than VLC ever has.
Funny thing is, in the anime/tokusatsu fansubbing community, VLC has a really, really bad reputation because it does everything just a little bit wrong. Many of the more respectable fansubbing groups have big warnings in their FAQs saying "don't use VLC", typically in bold and all caps (the usual recommendation is to use CCCP on Windows and mpv on Linux and Mac). Their support for chapters in MKV containers has been hideously broken for a long time, and they have an NIH syndrome where they implement their own versions of things that are provided by open-source libraries that everyone else uses, which causes all kinds of slight incompatibilities.
I invested the time in submitting a detailed bug report as a litmus test to determine if it was worth spending more of my time installing the source code, learning how it works, and trying to fix the problem myself.
Because I'm a user interface programmer who could actually do that work, I also have a lot of other pressing things to do with my time, and wouldn't want to waste it if there was no chance of the developers accepting my changes or even acknowledging that there was a problem.
If they brush off my bug report, I thought, then they probably won't accept my changes.
But that's exactly what they did, which proved to me it wasn't worth my time.
The developers also ignored and argued against many other valid bug reports and complaints on the discussion board. Their attitude came through quite clearly and consistently, and my experience wasn't unusual.
If you disagree and think it is worth the time, then by all means get started: download the source code, read it to understand how it works, submit bug reports on any problems you find, get up to speed so you're qualified to make changes in the proper way that the developers will accept, fix the bug I described, test it on all platforms, and submit a pull request.
The user is expected to make linear movements to operate a rotary control. This is the reason that most properly-designed applications utilize linear slide controls for similar functions.
I hate this UI paradigm. There are a bunch of graphic design apps on iOS that do this (e.g. Affinity Designer) where you can change something like line width. The icon shows a circle filling up as you change the property (it’s rotary so it’s the circle “ring” filling up, not the disk). Except it expects you to move your finger linearly to change the property, not in a rotary fashion so it’s very confusing. Also you can move your finger linearly up and down OR left and right. These movements are not exclusive of one another so you can move your finger in a rotary pattern and then wonder why the line width changes seemingly at random.
This is fairly common for music creation apps, which have a lot of knobs, and it makes sense because a knob doesn't work as well with a mouse as dragging up and down does. It's not immediately obvious, though.
Not if you work with both physical and virtual hardware.
Looking at a physical control board, you can see at a glance where a group of knobs are positioned. Being able to then glance at a screen and get similar information instantly is important for workflow.
For most DAWs the primary control surface is usually a physical MIDI device that has real physical rotary encoders, which you would map to the virtual rotaries on the pretty interface. So it's more intuitive to have both be a rotary.
To make it worse, some rotaries actually ARE implemented as rotaries, so even if you’ve been trained on a linear rotary you still may be confused by the interface when it’s implemented “correctly.”
You also have actual rotaries that are just buggy where the rotary spins 0-100% and instead of stopping at 100% it will instantly jump back down to zero when you complete the full 360°.
I feel like a rotary should “adjust” a value up or down rather than representing absolute values.
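For what it's worth, the relative, clamped behavior described above is straightforward to implement: the on-screen control looks rotary, but the input is a linear drag delta that nudges the value up or down, and the value is clamped at the end stops rather than wrapped (which also avoids the 360° jump-back-to-zero bug mentioned earlier). A hypothetical sketch, not any particular app's code:

```python
class Knob:
    """A rotary-looking control driven by linear (vertical) drag.
    The on-screen ring fills from 0..100%, but the input is a relative
    drag: pixels of vertical movement, not an absolute angle."""

    def __init__(self, value=0.0, pixels_for_full_range=200):
        self.value = value  # normalized 0.0 .. 1.0
        self.sensitivity = 1.0 / pixels_for_full_range

    def drag(self, dy_pixels):
        """Dragging up (negative dy in screen coordinates) increases the
        value. Clamp instead of wrapping, so hitting an end stop stays at
        the end stop rather than snapping back around to zero."""
        self.value = min(1.0, max(0.0, self.value - dy_pixels * self.sensitivity))
        return self.value
```

The relative mapping means the knob never "jumps" to wherever the pointer happens to be, and the clamp means a full drag past 100% simply pins the value there.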
Nearly two decades later, have we made any progress? iTunes honestly confounds me. And it seems like the Spotify UI is deliberately trying to prevent me from finding the music I want to listen to.
iTunes reached a peak around 8 versions ago. Every change since seems to have degraded something, hidden something else, and made something take 4 more clicks than it used to. Each release gets rid of something decent. The desktop app seems like a case study in poor UX design. Current iOS thinks album art on the lock screen should need a magnifying glass, and gives the title such a small space that every track I have scrolls. It's shit. It didn't used to be. It never reached great.
Over on Android, Google Play Music can't stream reliably unless you have 5 bars of signal, but it doesn't even offer any buffer settings. In the real world it's simply shit and unusable, unless you like a random-length silence at least every 5 minutes. Other apps figured out buffering a decade ago.
iTunes' days are numbered, I think. I'm honestly surprised it's still alive; the introduction of iOS/UIKit-based apps to the Mac in Mojave seemed like the perfect time to EOL it in favour of separate Music, Videos, Podcasts, etc. apps, as it is on iOS.
They’ve been slowly stripping things out of iTunes into their own applications for years, and no recent Apple services use the iTunes branding (Apple Music, for example). I give it a couple of years before it’s replaced and left in the “Other” folder in /Applications
I hope not. The iOS Music app is woefully unfeatureful compared to iTunes. They may be stripping cruft from iTunes, but it can still do a million things that iOS cannot.
The iOS music app has gone downhill just as much as desktop iTunes. Open it and the screen is filled with options I have never and will never use, except "Playlists". Then you open that and it only fits 4 playlists on the screen at a time, so you have to scroll very far to find the one you want.
Google's music app has issues too. It works OK in the browser, but on the phone, and especially with Android Auto, it's so minimalist that it's painful to use. In the car there's basically no way to play an album, which is the one thing I actually want to do. Instead I'm greeted with 4 menu items I never use.
It's sad because I know there are far better interfaces, but the reality of streaming means your media is locked behind an inferior product.
I found a similar issue watching NHL games on nhl.tv and eventually found a script online to download the raw files and build an mkv out of them - no more buffering, can work around region restrictions with a vpn, and play it anywhere I want (like offline on a plane)... but it just seems so unnecessary to have to go through all that.
I find the Google Music app for Android to have a reasonable design... not great, but usable. What I can't deal with is that, just like Amazon, it never seems to have occurred to them that you might have a large music collection, and it's insanely slow at times, sometimes taking tens of seconds to respond to a tap.
You're right about the downside of streaming. I used to use Amazon until they killed their Cloud Music capability earlier this year, and I switched over to Google. Both services provide a lot of value, but in both cases my number 1 request would be to somehow allow third-party apps to access the cloud because the built-in apps are pretty awful.
The “For you,” “Browse,” and “Radio” tabs in music.app are unnecessary. They are just ads for apple music. I also don’t like how much the music app advertises music I don’t own. I just want to listen to my music, not accidentally tap an album I don’t own and be taken to a store page to purchase it.
In iTunes (which is what this thread is actually about), you can always uncheck the box "Show Apple Music Features." There is a similar slider control, "Show Apple Music," that can be turned off in the iOS Music Settings.
(Also, given that many millions of us actually use Apple Music, describing these tabs as "just ads" is incorrect. They're tabs that do things. I'd agree that those tabs should be hidden by default if you're not an Apple Music subscriber, though.)
Thank you! I agree completely. It's like there's some flow that it's optimized for, but I have no idea what that flow is. I want to find music, then play music. Half the time when I search, it doesn't find anything, even though I know it's there. The other half the time I search, it shows me an overwhelming number of possibilities, each slightly different, with no way to tell which was the one I intended. Then there's the "card" that pops up with what's playing, but which won't come up when I need it to, and always comes up when I'm trying to do something else. The buttons to do something with the music, like add it, remove it, etc. are never where I think they're going to be, so I end up playing point and click games like some bad adventure from the 90s. "Does this control do it? Nope. How 'bout this one? Nope. This one? That's it! I've unlocked a basic feature that should have been obvious!"
The only thing that I miss from the Apple ecosystem is Apple Music. I like to listen to music by albums, and Apple Music does that wonderfully. Spotify seems to be tailored to consume music on a track by track basis.
Not too barebones though. It has multiple playlists in tabs, fast type in search, good support for tags, decent seek bar and volume control. Good menu.
Sensible file dialog instead of custom silliness.
Easy handling of whole directories.
It can handle thousands of songs in a playlist and supports a lot of audio formats, perhaps the most of any audio player out there. You even have a queue if you wish.
"We" are no more than a rhetorical device! I think low-quality UX is only really important when the content is tied to the presentation platform, blocking competition - like iTunes when DRM was compulsory, and Spotify today. By design of the content providers, I'm out of touch with competing player platforms.
I do remember despising "skinz" in the late 90s, for taking away my control of the color scheme and all the other reasons this article describes.
While that went out of style, little did I know it was going to get worse in other ways. Now the OS itself and recent browsers go out of their way to prevent the use of dark themes by hard-coding colors everywhere.
We work in a darkened studio (color accurate) environment and are negatively affected by your fucking enforced white backgrounds! It makes me even angrier when I remember having better control in Windows 3.1 in the early 90s.
I use MATE now but the gnome3 and firefox-aping-chrome disease still shows through the cracks. Very recent products have halfheartedly added a bit of dark theme here and there but still don't seem to get that what we had twenty years ago was just right.
I wish Firefox would just make native clients instead of all this other rubbish they're trying to invent; that alone would make it a better browser. These days it at least tries to use the native theme, but they royally screwed that up: you have to either turn this off, or turn off your dark desktop theme, to use 90% of web forms out there.
Bonus points if they could actually improve reader mode and enforce contrasting colors.
Is there some general overview of these "Blunder Years" of UX design? When Steve Jobs came back to Apple there were numerous examples of this, e.g. the volume control in QuickTime player being shown here, which was horribly annoying to use since it used an on-screen design that only made sense for a physical device.
They also had a note taking / desktop book program at some point that emulated the look of a physical book to the detriment of its UI.
Steve Jobs, for whatever reason, really loved skeuomorphism, and in many cases personally ordered those designs, disregarding the objections of designers like Jony Ive.
Because he was trying to make it easy for non-technical people to use a new piece of technology.
It's the same reason that Microsoft included Minesweeper and Solitaire on Windows 3.1 — to teach people how to double-click, and to click-and-drag.
Today, Silicon Valley just assumes that every single person in the world already knows how computers work. Tell that to the people I work with who spend an hour a week showing poor recent immigrants in their classes how to install AA batteries in devices.
Unfortunately, more often than not physical metaphors in UI design only serve to confuse. There's a reason nobody wanted to use the home metaphor of Microsoft Bob.
As brutal as that UI was by today's standards, I would still consider it far from the worst of the UIs of the 90s. The behaviour of the controls was objectively bad, but for that brief era they were still rather impressive. Media players always seemed to take the cake for terrible controls and skins. I love 'em.
For the record, the horribleness of QT 4.0's UI was a topic of conversation the moment it came out. The attention paid to UI quality by the industry has ebbed and flowed at times, but there was never a standard under which QT 4.0 was ok.
It would have been bad for any company to come up with that design at the time, but it was especially tragic coming from Apple, because it scuttled their sterling and meticulously earned reputation of purposefully designing user interfaces to be easily usable and learnable, and of going the distance to adapt to quirky human behavior, physical and physiological limitations, etc, and then documenting those designs and the rationales behind them, so other people can understand and apply them, and then actually applying those rules themselves. (But also unfortunately patenting some of those designs.)
QuickTime 4.0 marked the end of an era of great user interface design at Apple.
>The original 1987 version of the Apple Human Interface Guidelines can be checked out from the Internet Archive, and should be required reading for serious user interface designers, the same way that serious art students should contemplate the Mona Lisa, and serious music students should listen to Mozart. Even though it's quite dated, it's a piece of classic historic literature that explicitly explains the important details of the design and the rationale behind it, in a way that modern UI guidelines just gloss over because so much is taken for granted and not under the control of the intended audience (macOS app designers using off-the-shelf menus -vs- people rolling their own menus in HTML, who do need to know about those issues).
Unfortunately it has been replaced by “flat” minimalism and other horrible stuff straight out of art school. Application user interfaces are now designed by graphic artists and not HCI engineers.
The saddest thing is that the "Classic" skin, which emulated the Windows 2000 look, and is light-years better than anything Microsoft has done since, disappeared after Windows 7. Windows is now so "advanced" that it can't even emulate how it looked 20 years ago.
I use WindowBlinds to sort of emulate the Windows 2000 look, but it still has so many of those user-hostile flat UI features which inevitably reduce your display to a mess of indistinguishable pale rectangles.
This is a long-standing problem for the whole software industry that is largely unsolved: real separation between the user interface and other logic. In order to stay on yesterday’s (better) UI, you need to stay on yesterday’s software version, foregoing critical security and performance improvements, or rely on nasty third party hacks to bring the old UI back to the new software.
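A toy sketch of that separation, in Python (all names here are illustrative, not from any real product): the "model" layer knows nothing about how it is displayed, so in principle a vendor could ship security and logic updates while leaving the old presentation layer selectable.

```python
# Minimal model/view separation sketch. The model holds program
# logic only; each "view" is a pure function over the model, so
# views can be swapped (or kept) independently of logic updates.

class MailModel:
    def __init__(self):
        self.messages = []

    def add(self, subject):
        self.messages.append(subject)

    def unread_count(self):
        return len(self.messages)

# Two interchangeable views over the same model.
def classic_view(model):
    # dense, one message per line
    return "\n".join(f"[ ] {s}" for s in model.messages)

def modern_view(model):
    # a different presentation of identical data
    return " | ".join(model.messages)

m = MailModel()
m.add("Hello")
m.add("Re: Hello")
print(classic_view(m))  # old UI
print(modern_view(m))   # new UI, same underlying logic
```

In practice most applications never draw this boundary cleanly, which is why "keep the old UI" usually means "keep the old everything."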
I know a lot of people who deliberately do not update their software unless forced to, because designers/developers cannot resist the urge to redo the UI every year, pointlessly moving things around and changing everything.
This, imho, is the biggest reason people don't update their software. They've caught on to the fact that the term "updates" is misleading, and that it would be better to call them an "anti-feature delivery mechanism".
This is why we need two separate channels for updates: One for security, and the other for features.
But, unfortunately, it's quite unlikely: it assumes extra thought must go into architecture and implementation. If you look around at the current state of the software industry, the rapid increase of entropy in software systems, and the market's push for cheaper, or completely free, software, you'll find no pressure to deliver anything other than an "anti-feature delivery mechanism".
Plus, making a perfect product effectively puts you out of business. But ship a product that needs support, customization, training, consulting, periodic updates, and maintenance, and you're a rich man!
To me, QT 4.0 was the beginning of the end of good UI design at Apple - when the design stopped being done by HCI engineers and started being done by graphic designers instead.
Unfortunately there are no other major players doing any better.
Can we talk about the Chrome video and music player regression? It's a horrible decrease in user experience while gaining only very, very, very marginal "visual" wins.
Reminds me of the massive block that appears when you change the volume on OSX.
Perfect position to cover subtitles of the full-screen movie you're watching. On more than one occasion, I've had to rewind the video to catch a critical subtitled word because I made the mistake of making a minor volume adjustment at the wrong time.
You can turn off the block entirely from the terminal, but then you lose the indication of where you are in the volume continuum.
YouTube goes out of its way to replace iOS's fullscreen media viewer with a terrible one. Maybe they felt like they had to make it look like the desktop UI or they needed the quality / settings buttons, but the scrubber is just bad.
And they took away the +/- 15 seconds buttons, which are much much better for skipping around long videos than the scrubber is.
Mystery meat navigation! On Android, I accidentally found that tapping multiple times quickly on the sides of a video does this, and the more times you tap, the further back it goes.
Mystery meat navigation is probably the worst paradigm in all of UI. Even stupid but distinguishable icons are better.
Ha, good to know. I've discovered that the pinch-outward-to-fullscreen gesture is disabled on the YouTube video player (you get a "Full screen is unavailable. Learn More." notice).
But if you:
1) go to shitty-fullscreen
2) hit picture-in-picture
3) go back to fullscreen
4) close fullscreen back to normal view
5) pinch-outward to zoom
It will put you into the native iOS fullscreen player. So I think I'll keep doing that rather than learn the mystery meat navigation.
Dear Googlers reading this: you may be tempted to go out of your way to fuck it up further and remove this workaround. Pretty please don't? Thanks.
Discovering this is a big gamble, because you don't accidentally double-tap those regions: a single tap could mean going forward or back a full video in some playlist.
Which is a shame, because desktop YouTube is one of the best video player UIs that I have ever used, and I'm still searching for a desktop video player that is as usable.
The mobile version, though, is unconscionably awful. I almost always have to guess at what the controls do, and I avoid opening the app.
Are you joking? Desktop YouTube is one of the worst UIs I've had to deal with. Controls constantly popping up over the video, annotations on by default, autoplay to the next video on by default. If you're in full screen mode and switch to another app, then come back, the video is just ... gone. It's not on the page, but it's not full-screen anymore, either. Buttons and options that don't show up consistently on every video. No way to fix the aspect ratio of videos created by morons. Crap everywhere on the page. I seriously find it hard to believe that anyone would consider YouTube to have a good video player interface, let alone the best.
> YouTube goes out of its way to replace iOS's fullscreen media viewer with a terrible one. Maybe they felt like they had to make it look like the desktop UI or they needed the quality / settings buttons
I think the reason they do this is because the standard media player allows for picture in picture and background playback.
You can actually use those in Safari on an iPad now, so I don’t think that’s it. Unless a content blocker I’ve installed has fixed something they tried to break.
The YouTube app’s custom video player also doesn’t have audio delay compensation for AirPods, unlike the standard iOS AVKit player. I’ve been avoiding the official app because of it.
A key feature of the original Macintosh operating system was an orthogonal interface for copying, pasting, dragging and dropping arbitrary data, whether plain or styled text or graphics, into (more or less) arbitrary locations, which was far from obvious in consumer systems of the time.
And when QuickTime came about, video was no exception. You'd select a length of video from the timeline, copy it and either paste it into a different video, or into a presentation or other kind of document. It's ridiculous to call the ability to use the OS the way it was intended a "pro feature" that you should have to pay extra for.
Ah, yes. The "worst of both worlds" design where you had to spend several minutes poking at every little piece of chrome on the UI to figure out whether or not it did something. And then about 5 years later came the equally idiotic trend of making all desktop UIs look like web pages. Dark ages of UX, indeed.
The interface was definitely confusing and it was way too easy to click on the wrong thing. And when you had a mouse that still had a mouseball that was easily clogged, clicking on the wrong thing was extra annoying.
About the article itself: I miss the days when web design was that clean and simple. Just the information, no fluff or ads.
To me, it's not newer vs. older. Things like flat design and skeuomorphism can work just fine if they follow the principles of human interface design. The problem I see is that we seem to be drifting away from those principles in a lot of ways.
Major OSes seem to be worse at having consistent interfaces. Trying to make desktop interfaces work like the constrained screens of phones and tablets by hiding scrollbars, etc., is a step back. Useful animations are a step forward, but useless and slow animations are a step back.
However, web interfaces have gotten better since the death of Flash. They're not perfect, or maybe even good, but they're much better than previously.
So it's a mixed bag. I prefer modern web design to that of the past, but mostly prefer older OS design to the present.