So I'm not sure what there is to get upset about. I don't see any big functional changes in the OS aside from reworking the notification center UI a little, and I like this new version more. It's already been confirmed that they aren't locking down apps to the App Store or anything like that.
It seems like people think that because they're taking design cues from iOS they intend to lock it down like iOS and they're transferring that fear into dislike of the new UI itself.
Apple knows that a large percentage of the people who buy its computers are software engineers, and if they ever do fully lock their computers down in that way they'd be ceding that market share entirely, along with catching quite a bit of heat for it in the press. Maybe eventually that calculus is worth it to them, but I don't think that's what this is a step towards. I think they just want users who are fully plugged into their ecosystem to feel like it's a cohesive experience across platforms, and this achieves that.
Apple was the company that dropped the headphone jack from their phone, shipped a laptop with only USB-C ports, etc. They don't shy away from making big controversial decisions if they feel like it's good for them.
If Apple was going to do this to macOS, they'd have already done it. They wouldn't be spending years tiptoeing towards it like people seem to think.
Instead, they've rolled back some really unpopular changes like the crappy keyboard in their laptops, and they just shipped a new Mac Pro that is clearly designed for professionals. This chip transition will result in Apple's laptops having better battery life, specs and thermals than any other laptops available on the market. It won't even be close.
Additionally, they're actively trying to bring a huge swath of developers who currently only develop for their mobile devices into developing for the Mac. If they were trying to turn Macs into something for college students to watch Netflix on they wouldn't be doing any of this.
It's insane to me that people can see a UI refresh as a sign of doom while ignoring that Apple has actually been investing in the platform quite a bit more than they were a few years ago.
"We love our developers and pride ourselves on individual expression and transparent feedback."
-- Corporation 101
Imagine: you're in charge of Apple's new keyboard. You spend a bunch of time and money making the thing as thin as possible, because that's the overall design trajectory.
Of course, you test this thing internally, and it's okay. It takes a little while to get used to, but isn't awful to type on, and a few months of stress testing seem to indicate that this thing will work in the real world with an acceptable defect rate.
Besides, your management chain loves this thing, because It's Thinner(tm).
And then, you ship it, and discover that this thing breaks way more often in the real world.
But, because Apple, you're a company that never publicly admits to doing anything wrong (because lawsuits), which takes customer feedback with an overly-critical eye (philistines must be using it wrong), and which is more likely to spend its entire capitalization on cat treats than it is to interrupt an ongoing hardware design cycle.
It's not just management -- it's the company culture.
On the one hand, this enables them to do bold things and endure incredible criticism before being proven right. The iPod, no optical disks, bundled GPUs, AirPods, etc.
On the other hand, this enables them to do bold things and endure incredible criticism before being shown horribly, horribly wrong. The snowflake keyboard, the touchbar, the iPhone antenna, etc.
I also suspect this is why Apple's hardware design is just so overall excellent. They identify perfection, and relentlessly pursue it. Sometimes Apple gets it wrong, and it shows, but when they get it right, they get it really right.
When the iPad first came out: "OMG, you can't print! How insane." Probably forgetting other things.
Apple does a ton of things wrong and arguably forces some changes before a lot of the world is ready for them. But they've never prioritized supporting legacy and, in fact, arguably have had a pretty consistent strategy of eliminating legacy at least one or two beats early relative to other companies. Certainly I've found some changes too aggressive. OTOH, there are usually ways to adapt that I've found no more than mildly annoying at first.
I very much don't recall anyone complaining about printer support. The primary jab at launch was "it's just a big iPhone". Which was in fact true—it just turned out that "a big iPhone" was pretty damn good, especially when actual iPhones had 3.5" screens.
I'd say most of these sorts of complaints are basically legitimate. When the iPhone came out, it didn't have copy and paste, and that was in fact extremely annoying!
The thing for me with the original iPad was coming to the realization that it was a (very nice) luxury device that was mostly useful for me (at the time) when traveling. And, when I gave one to my dad, it was a revelation relative to computers that he never could really use.
I had mine replaced twice, but the third revision (which according to Apple is an improved version) has been reliable (except for the key labels wearing off) though I still greatly prefer the 2012 design.
My biggest issue now is the trackpad and its incredibly broken palm rejection that causes the cursor to jump all over the screen. This isn't a settings issue (settings do not help at all) and it was never a problem on older Apple laptops or on the iPad, which has a huge touchscreen with flawless palm rejection. Unfortunately (for me and for Apple), wearing an Apple watch seems to make the broken palm rejection and random cursor jumping even worse.
The keyboards had a higher fault rate and many other issues and Apple has corrected its mistake, albeit 1-2 years too late from a consumer perspective. That's all there is to the story and the only semi-relevant competition to MBPs was STILL just ThinkPads and XPS, even with the main input device being significantly worse than previous iterations.
If you'd used the 2016 keyboard, you wouldn't say that.
Regarding the 2017 keyboards, I'm not even talking about the fault rate. They had no escape key and the key travel was too small so it was like typing on a touch screen.
No, no it's not really.
It's all about preference and tolerance. I bought my Mum an 'old' Apple USB keyboard (i.e. similar to the pre-2017 laptop keys) when her PC's existing KB broke, as I liked mine very much. Being a life-long touch-typer, she just couldn't get on with it, because the travel was too little for her. She's now very happy with a KB with mechanical (Cherry) key switches.
In contrast, I prefer the older Apple KBs, but can get on perfectly well on my 2017 MacBook Pro's KB; and it's not like a touch-screen at all.
Then maybe I only used broken ones with no perceptible key travel whatsoever.
I agree that when it does work, there is _some_ feedback from keypresses, but seriously, if they can fake the clickability of the touchpad via haptics, why couldn't they do the same for the keyboard? Out of all devices that I develop on, the macbook is the most frustrating, mostly because of the keyboard.
Still, I don't think these issues made the KBs true garbage. True garbage would be nearly any non-MacBook touchpad, or, often enough, display hinges. Those are far worse than a slightly less convenient keyboard, which can always be supplemented with an external Bluetooth keyboard as well. Can't really do that with hinges.
And sad about the change in sheets and alerts... I’ll probably get over that.
I figure it must be something with my machine but haven’t really delved too deeply and couldn’t find much when I looked.
Was hoping for a bit more out of Music. It’s decent, and certainly better than what iTunes had become. But sometimes I’m left wanting.
Although having my Pops be able to share a playlist with me over iMessage was a stellar experience in terms of how seamless it was, and it synced from my phone to my computer.
I think the problem is that they're indicating state by toggling between your chosen accent color and black. If your accent color is, say, pink, the difference is pretty obvious, but if your accent color is relatively dark, like blue or purple, it's harder to see. It seems like they could solve this by toggling between the accent color and gray, or "lighting up" a solid roundrect of the accent color around the control when it's on.
I'm excited to try it all out, but also unsure how comfortable I'm gonna find it. The large white expanses seem glaringly over-bright to me, at first glance.
If it is bright in light mode, it might be nice and dark on dark mode. We’ll find out once people start running the preview
But that's off-topic to some extent; I scanned through this page looking for some good news for developers. Topping my list is support for SVG assets. NOPE. WTF? Android has had this how long? Apple's "@2x", "@3X" bitmaps were amateur-hour to begin with, but a decade later? No vector support?
And still no user-definable color schemes. Windows (and Unix GUIs, for that matter) had them for decades, while Apple forced its late-'80s inverse scheme on every user that whole time. The hard-coded "dark" theme is a half-baked, belated admission that staring into a glaring light bulb with black text on its surface all day may not be the best way to work.
This would also have been a great time to abandon the single menu bar, but nope; that poor decision continues to plague Mac users. Really, this should have been canned in the transition to OS X, but now there's even less of an excuse.
I'm sure there will be some good changes, but from this list I'm not encouraged. Getting rid of the distinction between toolbar and app content... WHY? The "flat UI" experiment was a failure. Let's continue to take tasteful steps back toward GUIs that communicate in universal visual terms whenever possible.
From the Xcode 12 Beta Release Notes:
> Added support for Scalable Vector Graphic (SVG) image assets. These preserve their vector representation with deployment targets of macOS 10.15 or later, iOS 13 or later, and iPadOS 13 or later. (18389814)
> Android has had this how long? Apple's "@2x", "@3X" bitmaps were amateur-hour to begin with, but a decade later? No vector support?
PDF assets have been supported on all Apple platforms for many years.
The announcements today represent years of work by thousands of engineers and designers. It’s impossible to cover everything in a 2 hour keynote, let alone one geared towards the media/general public. Even the 2 hour “State of the Union” that’s geared towards developers barely scratches the surface, let alone a single webpage.
Not that I don't think they should support it (they should), but this is almost certainly due to perfectionism and bias, not amateurishness. Apple has an institutional focus on good iconography, and good iconography almost always has different representations for different sizes. The icons scale within a certain size range, then switch to a representation more appropriate for the size. I would be shocked if this is ever completely eliminated from Apple's DNA.
If you are using a precise mouse or trackpad, a menu bar at the top (or edge) of the screen is much easier to hit than a menu bar anywhere else on the screen, because it has "infinite" extent. This has been known for decades, and is why macOS places the menu bar at the top of the screen and the dock along an edge.
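For anyone who wants the usual formalization (my addition, not the parent's wording): Fitts's law says pointing time grows roughly as T = a + b * log2(1 + D/W), where D is the distance to the target and W is the target's width along the direction of travel. A menu bar pinned to the screen edge has effectively unbounded W, because you can overshoot as hard as you like and the cursor still stops on it, so the log term collapses and the edge-pinned bar becomes one of the fastest things on screen to hit.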
However, Apple has made remarkable progress rethinking pointer design on the iPad, as seen in this excellent video, "Build for the iPadOS Pointer":
That statement seems perhaps too harsh to both the HN crowd and Apple's core fan base. I spend more time on HN than I care to admit, and I can assure you that I do not like the XFCE desktop experience at all; my desktop is running Debian 10 with Cinnamon on it, but UX is not something that I'd ever list as one of the top 5 features of this setup. Similarly, I've been using Apple laptops for about 14 years now and do consider myself part of the "core fan base"; for the record, I hate the butterfly keyboard, despise the touchbar with a passion, and am utterly disgusted by the decision to remove MagSafe chargers.
To quote the great Dr Tobias Funke: there are dozens of us.
StumpWM, i3 and dwm are also commonly brought up contenders as peak desktop Linux.
If it doesn't apply to you, it doesn't apply to you.
In my opinion the height of the desktop experience was Mac OS X 10.6 Snow Leopard. I still miss workspaces, and TotalSpaces2 is a valiant effort, but it's less likely to work with each new update.
Simple minimalism is certainly preferable to whatever macOS can be described as by many, sure. I don’t see a problem with that.
MATE and LXQt are two other comparably simple desktop environments that I like.
I get comments from my bewildered MacOS-using friends when they see that my $400 Linux machine is faster than anything they've ever tried.
You new-fangled San Berdoo OSXI users get off my lawn.
.... Yeah, that's pretty much true, and for very good reason. Every time an interface changes, it forces people to stop using the system and figure out the new interface. As such, while it is possible to make changes that are beneficial enough to justify the cost, UI changes are very expensive and you'd better have a very good reason to justify breaking everyone's workflow. And by that standard, yes; XFCE is nearly perfect. It's been on the 4.x series since 2003, and going by screenshots I strongly suspect that even a user from XFCE 3.x would find the very latest version to be a drop-in replacement. And it does have decent quality, which combined with its general "get out of the way and let me work" vibe and 20 years of continuity makes it easily one of the best options available.
(You can hold down Option and use the green button as its old "Zoom" function, but I think there should be a system preference to reverse that so it's zoom by default and you hold down Option to go into full screen.)
Optimizing for where my primary focus of attention is, i.e. what I'm actually looking at, matters more to me than shifting design languages. But I know others will have different views on this, and it comes down to what people are actually happy with.
I think it's more that (visual) UI Design has the same cyclical property of most other "visual design" like clothing fashion, interior design,... Things look fresh when they're new, after a few years people get bored of the look, and what was once considered modern now looks dated.
Especially for a company like Apple that has built part of its reputation on having the best UI, they can't afford to look like they're behind the competition. Calling it "change for the sake of change" is partly correct if we were purely talking about products, but it ignores the fact that Apple is as much a marketing company as it is a product company.
Aside from that, I think there's also a strong functional argument to be made. The iPhone and iPad essentially share the same OS, so it makes sense for those to be consistent. As the iPad has matured, it's approaching desktop levels of productivity. So now we're in a situation where the iPad is becoming like a MacBook without the keyboard. With so much overlap in functionality, it also makes more sense for those two experiences to converge rather than diverge. So we're hearing complaints about the "iOSification of the MacBook", but they ignore the missing link: iOS = iPadOS = macOS.
Both have a classic theme.
There are those of us who will have to choose a new DE because Big Sur apparently won't run on our perfectly capable pre-2013 MBPs, and have to choose the least clunky. In that group, Xfce is indeed the bee's knees.
I'm more likely to prefer XFCE to Gnome, for example.
That being said, I also use and like desktop Apple products, so there's that.
People are taking the UI / ARM change as Apple taking away more control, but macOS is THE development platform of the entire Apple ecosystem.
So yeah, the ecosystem may be more tightly knit, but they sure as hell will not lock down macOS like they do iOS; they have so many Mac users who are developers, and Apple is absolutely aware of that.
People complain about features in Catalina and other recent versions of macOS 'locking things down', but it's incredibly easy to bypass the stuff that makes a difference using the terminal.
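To give a couple of concrete examples of what I mean (standard macOS commands; the app path is just a placeholder):

% xattr -d com.apple.quarantine /Applications/SomeApp.app   # clear the quarantine flag that triggers Gatekeeper
% sudo spctl --master-disable                                # bring back the "Anywhere" option in Security & Privacy
% csrutil disable                                            # from Recovery only: turn off SIP entirely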
In a world where every company wants to track and harvest your data, we've somehow got to the point where people bash Apple for actually taking privacy and security seriously.
Maybe I’m just paranoid, but I personally decided to stop buying their computers because of it.
At what point do we stop saying "he was so prescient, look we're almost there," and just admit things haven't turned out as badly as people feared? DRM has turned out be more of a nuisance than an apocalypse, and general purpose computing is at least as available today as it was in 2011.
In some ways it is even easier; for example anyone with an email address can get access to a cloud-hosted computer for free.
Finding parties that would benefit is a trivial exercise: one has only to look towards App Store revenue, Intel's idea of selling "upgrades" to your CPU via software unlocks, or Verizon's ability to rent you access to your GPS for $10 a month.
Others have been pouring water on a fire that would otherwise have already filled your kitchen with smoke by now, and you have merely failed to attend to this fact.
Could you explain some of these incremental lock downs? I honestly can't think of any that are genuine restrictions, but that might just be because I don't encounter them with what I do.
That alone has stopped me moving my Macs to the latest OS after testing.
Why not? It's his system, he should be able to do whatever he likes. On most Linux I can rm -rf / all day long and nobody cares. What's the problem with creating root level directories?
> On most Linux I can rm -rf / all day long and nobody cares.
So let me turn this around: Why do you think a system that doesn't make it difficult to completely destroy it with one command is a bad idea? "It's my system, I should be able to do whatever I like" is an answer, to be sure.
(N.B.: in fact, "rm -rf /" won't work without "--no-preserve-root" in most Linux distributions, AFAIK, and I presume sudo.)
The decision on which bits of the system I can destroy should be mine, not theirs. That's the issue. At the moment, I am getting pushed out of my machine, bit by bit. Every binary I run is getting reported to Apple, etc.
It is also possible to destroy the system by dropping your MacBook from a very great height. Should it be padded to cope with this eventuality?
This does not seem very charitable.
It is fine if the system warns you about doing potentially dangerous operations (which --no-preserve-root amounts to) but it must not prevent you from ignoring that warning.
It was a habit on macOS since /Developer used to be where Xcode lived until they moved it to /Applications.
It also makes it easy to look in the /DeveloperLibs directory and see if I've got all libraries I need on this machine (I have different development machines under different OSes).
Removing ability to create directories under the root seems draconian for no good reason, other than the iOS-ification of my machine. Not Apple's machine - my machine.
I don't understand why people would ask "why would you want to do that?" because everybody used to defend the ability to only resize windows from the bottom right corner under macOS until a few versions ago, when it became obvious that resizing a window from any edge was a good idea and always had been. But people oddly defended it for decades before that.
What if they now remove the ability to change your desktop wallpaper? Would we all say "but why would you want to change your wallpaper?"? We seem to accept the gradual lock-down of our own machines. Apple users (and I am one) seem to be the most defensive of the company's behaviours and it doesn't make sense - we just have to accept it.
If Microsoft removed the ability to create a directory under C:\, what would the result be? Would we be asking "but why would you want to create a directory under C:\?"? I think not. It'd be the apocalypse.
There are certainly many decisions Apple makes that I don't like and I am always keeping my eye on Linux in case I really feel the need to pull the rip cord. But for me, this is just not a hill of sufficient size to die on. "Apple now mounts the system partition as read-only" just doesn't feel like a slippery slope that leads inexorably to... well, I'm not really sure where people imagine it's going. "This is something I used to be able to do and now I can't and soon that means I won't be able to open a shell!" I mean, maybe, but... probably not. If I had to choose whether "Macs in five years will only be able to install programs from the Mac App Store" or "iPads in five years will be able to install programs from places other than the iOS App Store" is more likely, I'd honestly go with the latter.
> I don't understand why people would ask "why would you want to do that?"
I guess I don't understand why asking the question is controversial; I'm not trying to lead anyone into a rhetorical trap that forces them to admit only axe murderers would ever use this functionality. :) It's simply that in what I'm pretty sure is over a quarter-century of Unix use on my part I don't recall ever putting anything in the root directory of any of my systems except by mistake.
As for "defending the ability to only resize windows from the bottom right corner until a few versions ago," your memory of this is different from mine, I suppose. I recall a lot of Mac users thinking that Apple's one-corner-only approach was a bad leftover from the pre-OS X days that needed to change!
It would get rid of shitty Mac application documentation that, rather than installing itself in correct locations, has troubleshooting steps like:
> Just chown your entire `/` to your current user and `chmod +w *`...
Until atomic installations become the norm, like SilverBlue, it's probably a good step.
The second is that I am able to create root folders: I just have to remount / as read-write before creating one, and after that I can mount any data folder to `/foo`. I'm not entirely clear why not having `/` as rw by default is such an issue, but you are certainly not prevented from doing it if you want to.
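A rough sketch of what that looks like on Catalina (device name and /foo are placeholders; details vary by release, and SIP may need to be relaxed; on Big Sur's sealed system volume the supported route is /etc/synthetic.conf instead):

% sudo mount -uw /                        # remount the system volume read-write for this boot
% sudo mkdir /foo
% sudo mount -t apfs /dev/diskXsY /foo    # mount whatever data volume you want there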
Why would Apple want to restrict access to certain memory locations?
Finder hides all of /etc and /bin and /usr for the "protection" of normal users, just like the default view for Windows Explorer under C:\Windows\System. But you can still change the contents on Windows, or see the directories in Terminal on macOS.
Should they hide those directories when you use Terminal too? After all, /etc is important. Your reasoning would also stand up there - perhaps whatever files you were attempting to modify under /etc wasn't actually that important.
When does the "you cannot easily modify files on your own machine" stop?
Would you also accept the inability to change your wallpaper with such glee? Perhaps the wallpaper you wanted wasn't actually that important.
...it's...the root of your entire system? Are you really asking me why `rm -rf /` is a meme, or are you just being contrarian?
> Finder hides all of /etc and /bin and /usr for the "protection" of normal users
And you can change Finder's preferences to show those folders, like any actual macOS user that's qualified to be making changes to those folders should know. If you lack both the curiosity and the ability to look up how to enable that option, then you frankly have no business modifying them yourself. If whatever you wanted to copy or edit there wasn't important enough for you to take ten seconds to look up how to do it and then follow through with the instructions, then why on earth would you be complaining about it to anybody?
I can also do rm -rf ~/Documents or rm -rf /Applications or rm -rf /Library which has a similar devastating effect to my life/machine - should we block that too?
I can also fling a hammer into the screen and break the screen - should we put extra thick glass on the screen? I guess I have no business using a MacBook if it is possible to break the screen.
I can also pour liquid into the keyboard - should the keyboards be waterproof? I guess I am not qualified to use a Macbook and have no business using one if I spill a drink.
Using the keyboard is dangerous too because someone can type dangerous commands. Let's stop people using it. Only those with sufficient "curiosity" and those "qualified" should use it, right? How do I get "qualified"?
I am saying that, as it is my machine, I get to decide which parts are dangerous. But I no longer can, because someone on the other side of the world has decided that I cannot create a directory under /.
I do not ever modify /etc using Finder. Should I only be allowed to edit files under there using Finder or is Terminal permitted?
I guess if we follow this "lock down" and "only those qualified" route that you encourage, we'll soon see permission prompts to do anything anyway.
There have been examples of (un)installer scripts removing substantial parts of your system, Steam for example. Linux in this case (https://github.com/ValveSoftware/steam-for-linux/issues/3671), but it's not generally unthinkable on MacOS. It's generally good if the system puts up barriers to prevent such occurrences.
You're entirely welcome to disable these protections, but MacOS is designed to protect Joe User who may have substantially less knowledge and who has no general need to mess around in /etc
It's like banning knives because there's been a few stabbings. It removes the general utility due to misuse by a few.
> synthetic.conf describes virtual symbolic links and empty directories to be created at the root mount point.
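For anyone who hasn't bumped into it, a minimal sketch (the name "foo" is just an example; exact behavior is per synthetic.conf(5), and entries are created at boot, so a reboot is needed):

% cat /etc/synthetic.conf
foo
% sudo reboot

A bare name creates an empty directory at the root; a name followed by a tab and a target path creates a symlink instead, which is the usual way to point a root-level name at a real folder on the data volume.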
It's almost as if they've been actively disinvesting in OS X.
At least we now understand why.
Macs are no longer machines fit for professional development. It's a damn shame, but it's true in my mind.
For a long time, I and other people I know chose MBP for professional web-development / video editing / music production because they were provided with a slick GUI on top of a solid operating system.
It's fairly apparent to most people who read tech news, that the quality of the operating system isn't what it was. For instance, think back to the 'no password necessary for root user' debacle a few years ago.
I don't think it strains credulity to ask why this type of problem was allowed to occur in Apple's 'professional' line.
If you're still with me, perhaps you'd also be able to follow my line of reasoning when I surmise that the reason for this degradation of support may have been that they were planning on dropping their support for OS X... with this convergence event being its replacement.
In reality, it's your own sassy comment that's devoid of content. Hope you really did wait! :)
Would you buy a house without any locks just because it's convenient? As long as you have the key, it's paranoid not to buy Apple because of some restrictions, since any PC also has a lot of these (Secure Boot, DRM, etc.).
Not my experience. It took a couple weeks of noodling to get parts of my environment moved over, and some of my automation I just moved to a linux box because I got sick of it.
> we've somehow got to the point where people bash apple for actually taking privacy and security seriously
On the planet I use a Mac on, the complaints are about a set of non-manageable, opaque and somewhat buggy security changes that arbitrarily changed semantics on big chunks of the filesystem and broke a bunch of things in hard-to-figure-out ways.
Add in that there is no sensible mechanism for cleaning up any of these magic permissions paths, and I don't even know what permissions are active on the machine I'm typing on.
I've loved OS X for being a decent gui on a real Unix. It had a decent run, but my next laptop will be a Linux box.
Lately, I've been doing most of my dev work on a Windows 10 machine using WSL. It's come a long way and is very usable now.
Having said that: I agree, WSL is pretty amazing.
Try running dtrace without having to configure a bunch of settings and restarting your Mac. Then you have to change the settings back and restart again to go back to the way it was.
It's a pain in the hole, and for basically no reason.
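(For what it's worth, the partial workaround I know of, which still means rebooting into Recovery and so doesn't really refute the point, is to keep most of SIP enabled but carve out dtrace:

% csrutil enable --without dtrace

That's the syntax as I remember it; it may vary by macOS release.)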
I still don't believe Apple on the privacy front. Yes, they are better than competitors. Anyway, I would like freedom and privacy and don't see them as contradictory. And you won't get more privacy if a manufacturer locks down environments.
We had OSes without any privacy intrusion and without any form of lockdown. MS drops the ball constantly, agreed, but that isn't a bar anyone should be aiming for.
Freedom and privacy are at odds. If I give any app the freedom to access my messages or call list, does it increase or decrease privacy?
I have an app(lication) that has access to my messages. It doesn't do anything with them aside from showing them to me on request. It doesn't affect my privacy at all because doing so is far beyond its scope. If I had shitty apps, I would agree, but until now those seem mostly restricted to shitty app stores. But if they cannot be trusted, they cannot be trusted to see my messages at all.
> People are drawing, creating music, making movies, editing photos on their mobile OSes
True, they are not completely useless, just mostly. I can do that on other OSes with far more freedom. Artists generally don't like restraints too much, and I do actually paint digitally. I wouldn't know how to connect that to a mobile device, never mind the atrocious software selection.
The cameras on mobile devices are nice though. I don't think the common mobile OS makes handling image data easy at all. Even most normal users want images on their desktop as soon as possible.
It is sort of like scrolling through Dribbble app mockups. The layouts look amazing as a mockup, but when you actually go to make or use the app it isn't as practical as you want.
On a desktop, with a mouse, I have near-perfect precision. I don't need overly-padded icons and windows because I have the precision of a mouse. The extra padding just wastes screen real-estate and makes me move the mouse more to get between tools or options in the menu.
I would much rather have less padding, which might not look as beautiful, and use the extra space for the content of my application as opposed to the toolbars.
You do, but not everyone does. Consider the accessibility needs of people with motor-degenerative diseases like Parkinson's or Huntington's. Won't they be better-served by larger click-targets?
In such cases, it makes a lot of sense to just give the change to everybody, so that you can design the UX once, with the change in place.
See also: modern TV shows and films that make sure to not put anything important where closed-captions would appear—they're designing once, assuming captioning.
Complete BS --- it absolutely slows down those who don't need elephantine UI elements to use a computer.
It's a bit like making the UI into a https://en.wikipedia.org/wiki/Voronoi_diagram — any area on the UI chrome becomes "associated" with the nearest interactive element.
Like I said, I don't see how that hinders anyone.
Edit: It's also quite clear from the comparisons that the number of elements visible on-screen is decreasing because each element is getting larger and the padding between elements is increasing.
Edit2: Also, the idea that this UI is more accessible is contradicted by the removal of various visual elements that help separate and differentiate UI elements and the loss of contrast in the UI in general. Visual impairments are much more common than Parkinson's and none of these decisions help with that.
At the cost of shrinking everyone's effective screen size.
Although 930k individuals in the US suffer from Parkinson's, given the economic cost (medication, loss of income, etc.), which averages about $24k per individual, it seems likely that few of them are enjoying Mac laptops, even compared to the general population, which is ten times more likely to run Windows.
Indeed, this recent article is replete with references to "cheap" and "budget", not "retina" and "style".
I would note that many people with disabilities use iOS devices. The layer-stack of iOS APIs has been top-to-bottom engineered with accessibility considerations in mind, rather than it being worked in after-the-fact; so accessibility features on iOS are a lot more “universal” than ones on other OSes. iPhones are basically “the” phone for the blind, for one thing.
Not saying that rebuts your other points. Just pointing out that when you have a disability, “cost” becomes just a matter of how long you need to save up for it; while “has the particular benefit” is the difference between the device being at all useful for you, vs. not. There’s no point in buying something that’s half the price if you can’t use it.
I suspect a large percentage of software engineers buy their computers, but I'm skeptical at the assertion that a large percentage of people who buy their computers are software engineers.
Or have things changed since I left the Apple ecosystem? I'm assuming you don't need to be a registered developer to use things like "make" or download IntelliJ IDEA?
If you're doing web dev you don't need to be a registered Apple developer.
The US and big tech companies are outliers regarding Mac usage.
One of these days I'll write a todo list manager to replace OmniFocus and then be able to switch platforms.
Either way, software engineers build the applications that drive your platform's popularity, meaning developer adoption of a platform is incredibly important, and thus they probably have a larger influence on Apple's decision making than other demographics would at similar percentages.
They need to continue to sell more to keep shareholders happy.
It's a calculated shift.
If I could fix one thing in the new UI, it would be to give buttons borders, or at least a different background color, like pre-iOS 13 with button shapes. The current iOS buttons (and the new macOS design) don't give any visual cue for the user to know whether something is an icon label or a button. That's a problem, and one that macOS had solved elegantly by having different textures (like the toolbar).
It's now... more like Windows 10 where everything is on a flat surface without any differences.
For one thing, it’s not like cohesion was achieved by bringing in the best parts of OSX and the best parts of iOS. Cohesion has been achieved by transforming the OSX UI into iOS’s to the extent possible. Apple has been clever to do it bit by bit but using OSX from 10 years ago to today, you can feel the alienation.
The other part is that until a few years ago Apple was the biggest proponent of the idea that a Mac, with its pixel perfect input mechanism, and a keyboard, and iOS whose input was fingers, should not have a similar UI. Their execs scoffed at Microsoft’s attempts to make a device which when in laptop mode ran traditional windows but in tablet mode ran a version of windows optimized for touch. Somehow those UI concerns have evaporated for no good reason.
That said, the user experience of Mac OS still has plenty of low hanging fruit:
- Window management and app switching could be much better. The "app switching system" I liked the most was in Gnome 2 / MATE - I could just quickly move the cursor to the bottom task bar and scroll. It was so effortless, fast and easy.
- When I download an app (.dmg) and click it, it shows up as a new disk. WTF?
- The file manager could be made much better. Again, MATE had (probably still has) the best file manager I've ever used in terms of UX.
So download (or provide for download) a .pkg instead of a, you know, Apple Disk Image file
Normal people don't know, and don't want to know, what Apple disk images are or what the difference between .pkg and .dmg is.
My understanding is that it's done this way so that there's a consistent experience installing software from a CD/DVD or from a download online.
But I think you're right, Apple should've rethought the software installation process the moment they dropped the optical drive from their computers.
It's only a "UX fail" if you're of the opinion that all platforms should work exactly the same way, and/or that UX paradigms that aren't what you already know are inherently wrong.
macOS users are aware (even though they might not know the technical terms in question) that a .dmg is not "an app". It's a virtual disk format - something reinforced by the literal disk that's its file icon - and using one is reminiscent of inserting a CD or USB stick to get files from. _That's the point_. You insert a disk and copy its contents to your own hard drive (applications aren't the only things stored or distributed via virtual disks).
It's like complaining that clicking on a ZIP file has the "side effect" of creating a new file/folder. People who actually use archives understand that they're container formats.
I've been using Mac OS for years, so I'm used to it. And good UX works well even for people who're not used to it. See iOS, you just tap "install" in the App Store and the app installs, that's how it should be.
Seriously, ask a good UX designer what they think about this - or just ask normal people why they think there's a virtual disk drive on their desktop and whether they know what a virtual disk drive is. I don't have the time or energy to convince you.
Now if Apple were to follow your suggestion and lock down app installation for better UX, there would be pitchforks everywhere. In fact, the pitchforks were out just last week for this issue.
How do you think it should show up?
I don't know the exact purpose behind the "mount a virtual drive to install an app" model, but it seems far too deliberate to not have a purpose.
For me, AppCleaner is one of the first things I download on a new machine.
% brew uninstall <appname>
% brew cask zap <appname>
1) Delete spotlight, Siri and 'look up,' make alt-tab handle multiple windows of the same app, fix searching within folders and also separately fix whole-filesystem search.
Levelling your UI to the lowest common denominator is a sure-fire way of making it worse and reduces the efficiency and ability of everyone in the long run. macOS and iOS should be entirely distinct UI-wise.
The new one looks like a broken stop/play button (the play is backwards).
I know the old one was "skeuomorphic", but it was useful. This one burns brain calories just trying to figure out if it's a stop/play button or not.
Why not? I think it's a distinct possibility.
Put together a touch-friendly new OS UI with the ability to run a zillion apps designed-for-touch in a context where all your competition has touch-enabled devices and it's hard to believe they'd hold out any longer.
I could be wrong, it's just my instinct.
This, and increased font sizes. Do any of you know how to change the font size of the window title bar in macOS 11?
(To be clear, I'm not insinuating anything, just interested in the dynamic.)
This has been the fear for the last 10 or so years, maybe longer. It hasn't strictly happened yet. Maybe someday, but this definitely isn't it.
As a UI geek, these kinds of macOS updates are some of my favorite - they impact everything you do in a small but substantial way.
A more cohesive brand experience is not to be undervalued either, both in terms of the cost to produce assets and in terms of perception by users.
It's been fairly obvious that this was their plan. Now it's been categorically confirmed.
So much has to be inferred from the way things used to be that it's harder to argue this is the "simpler" operating system designed for humans with a keen eye on UX design.
FWIW I also think the flat UI on iPhone (which has been prevalent for >6y now) is a horror show.
Steve Jobs famously once said (while working at NeXT): "There are two kinds of people at Apple: those that want to push computing forward, and those that want to be the Sony of computers... and the Sony guys are winning."
macOS moving to ARM is the biggest leap for developers in recent years. Amazon is also developing its own ARM chips, and cloud services running Linux on ARM are the future.
Having a Mac with a Unix-like terminal environment and support for running ARM Linux natively when developing for those services is a huge deal.
On the same note, I don't think the "decreasing information density" would be a problem if you're switching between the terminal/IDE/browser/Excel. The "table views" still seem fairly "compact", so...? /shrug
I'm glad it isn't a polished XFCE; there are a lot of options for people who like this sort of thing already, just run Linux. In its current state, Linux on the desktop is definitely not for me, but then, I like the touchbar (as a software engineer, even), I like their new keyboard better than the pre-USB-C one (the butterfly one was terrible though), I'm fine with four TB3/USB-C ports, and I don't miss MagSafe much. I like not having to bother with a myriad of settings, because the defaults are usually just fine and most things can't be fine-tuned anyway.
Big Sur seems to be a really interesting spin on a desktop iOS that embraces and extends the security and UX concepts from iOS, but for keyboard/mouse and without being so locked-down and constrained. From what they've shown so far, I like it.
Now please have iPadOS become more like MacOS – that would be terrific.
What's much more likely is that they'll find some new form factor of sensors and projectors (keys, screens, pointers, loudspeakers, data i/o) that does away with "computers" and "tablets" entirely. And that'll have its own widgetOS.
But instead they explicitly mentioned virtualization support for Linux. There seem to be no additional measures like Gatekeeper and Notarization. The security measures introduced in Catalina have carefully-placed gaps to not break terminal workflows, and unless I'm mistaken they can be disabled. Notarization is a bit of a mixed bag; I get the security benefits, but it does give them quite a lot of control.
I may be totally naive and utterly wrong, but the vibes I get from all of this isn't that Apple is currently pursuing a strategy of forcing MacOS into the iOS mold and turning it into a two-apps-side-by-side, fully locked down, anti-productivity, content consumption and light Excel, iPad-with-permanently-attached-keyboard device.
Rather, they seem to be trying to build a nice and coherent user experience across phone, tablet, and laptop/desktop on the UX front that works for a wide audience with little customization, and to find a security compromise between hundreds of millions of (potential) users and organizations who need an ecosystem that keeps them as safe as possible vs. developers, researchers etc. who want their systems to be very open and unrestricted and safeguards to be very lax. I imagine it is kinda hard to build a product that serves both of these well while remaining coherent and fun to use.
With that in mind, I think they did a decent job at this with Catalina and the latest iOS. Big Sur seems to be a continuation of that; it remains to be seen what it'll actually be like in the beta, but right now, I'm optimistic.
And making toolbar buttons all monochrome outlines does make toolbars have a more consistent appearance, but that actually makes finding the button you need more difficult. The differences are what we notice, and later what our brains automatically key on when we need to actually use the interface.
Less than half of these changes are arguably worth having. The rest seem like marketing saying, "We need to be fresher, guys!"
Using a new interface can be fun because of the change, but that fun quickly wears off. Then we're left with the realities of the interface. That Finder window is going to be much less useful in the real world with real file/folder names that are longer than 12 characters wide.
This is a UI for casual users. It might be good for grandma, but it's less good for people who need to get work done.
* Putting the title (folder name) before the controls introduces a fundamental instability, much like OS X did with the menu bar by placing the named app menu before any other menus. Also, the amount of available space for displaying a longer title/file name seems limited.
* It seems the proxy icon is gone. I can't see which element could bear the functionality in its place. The title is already editable; making it draggable as well might be too much. Is this the beginning of the end of the drag-and-drop paradigm? (Notably, it doesn't suit touch interfaces well. Many aspects of these changes seem to prepare for "touchability".)
Wish every other OS would replicate it already.
That's a good point I hadn't thought of -- I love the proxy icon. I'm glad it's still there, but this kind of feels like Apple is implicitly saying, "Look, for the ten of you who knew what this was, you still do it, but we're just hiding it from everyone else, sorry."
I guess I shouldn’t be fighting the wars of 20 years ago though. It is what it is, and since the goal seems to be to rebuild everything in SwiftUI eventually, maybe the Dock will finally be reconsidered in the next few versions when they get around to it.
John Siracusa has a different theory though, and having had time to go through and examine the changes in the HIG in more detail, I'm inclined to agree. The spacing, layout and sizing appear to make Mac OS XI more "touchable", i.e. we should expect Macs with touch screens at some point.
However, at the same time, this also illustrates why it may not be that great a choice for a WIMP interface: the kind of "intimacy" related to the specific mode of interaction is essentially different. Where contrast and too much variety may be irritating for a touch interface, it may already be too low in contrast and information density for a WIMP GUI. On the other hand, visual grouping (and semantic collapsing of groups) is much more important for the more remote WIMP interface, whereas touch is already much more locally focused. Generally, treating them both the same isn't a good idea.
Re: Zoom. Maybe you already know this, but just in case you don't, double click the titlebar. Even works in Safari, although there's less "titlebar" to work with. If you already know all that, well, maybe it will help someone else reading through the thread.
That said I did use BetterTouchTool to modify its behavior because I too was frustrated with changing functional behavior and I recently bought a three button mouse to use in addition to my other input methods, but I decided to make it default Zoom, but middle click Full Screen.
You can also set a keyboard shortcut if you like in the Keyboard settings (no third party software required) within System Preferences. Just go to App Shortcuts, All Applications, add a shortcut, for the Menu Title type Zoom and then just record your shortcut.
To be fair, the click target of the traffic light circles isn't limited to the actual drawn pixels. You can click a fair few pixels outside of it and still get the same result (basically wherever the hover state activates).
The stoplight was a pure aesthetic choice, I would have preferred more functional choices, but it was probably the right call at the time to go for aesthetics. A new aesthetic on a new operating system, that’s exciting, that gets people talking. That said, it’s one of the last elements of Aqua that has remained almost entirely untouched, through pinstripes, brushed metal, stitched leather and now apparently, iOS 7 Strikes Back or whatever we’re going to call it in Mac OS XI.
I love the proxy icon, too, by the way. But it's always been a badly documented, surprising, quirky feature. I found it by accident, by slipping and dragging a folder onto the menubar. That's not even useful; I didn't understand what I'd done. Eventually, after reading blogs and manuals, I use the proxy icon (and the brilliant Finder<->save dialog tricks) all the time.
I don't think we gain much by framing the argument around "discoverability" any more. I think it's much more useful to talk in terms of the overall interface paradigm. The proxy icon kind of fitted into the "desktop" paradigm - i.e. a scatter of media and tools strewn across a desk. But Apple are moving away from that, and into a new paradigm of "surfaces for interaction". The i-devices have been in that space for years. The "mac" devices are now going there. Both are gonna change again, bit by bit, in the future as humans develop new ways to interact with this weird networked functional data thing they've created, and Apple try to provide novel tooling for novel technology. They do seem to be going faster, and breaking more things, than they did in the 80s and early 90s when they led the desktop paradigm. They're possibly experimenting with ideas on the wider user base, rather than doing closed-door research. A big part of that is due to the wider user base being MUCH more comfortable with computing than before. People with very little "computing" training are contributing daily to the development of digital culture.