
After Mac? - emilong
https://www.tbray.org/ongoing/When/201x/2016/10/29/Post-Mac
======
rjzzleep
Every time I mention the biggest flaw of the Linux desktop, people freak out
and point me to Qt. I know Qt, but the effort of making a half-decent GUI app
in Qt is far greater than it takes with Cocoa.

I very much remember the development and push of Cocoa and the slow transition
away from Carbon and QuickDraw.

I believe NSDrawer was something that first showed up in third party code and
then was adopted in Cocoa.

Apple has constantly refined their toolkit by looking at well designed apps.
Everything on Linux seems to be doing the exact opposite.

Although GTK was developed out of GIMP, it's much easier to make a
good-looking photo editor on a Mac than on any other platform.

Until we sit down and design a good-looking, easy-to-use GUI toolkit (with the
emphasis on easy to use), we can forget about decent GUI applications on Linux.

~~~
pabloski
Huh? Never heard of QML?

~~~
usernam
Can you point to some well designed QML interfaces?

I can spot QML from its ill-fitting looks and mostly poorly thought-out
interaction.

People think QML has magic powers, but in reality a UI designed in 5 minutes
has only the value of a 5-minute interface.

To me QML is the biggest mistake in Qt5.

~~~
pjmlp
> To me QML is the biggest mistake in Qt5.

I think they are trying to appeal to the JavaScript hipsters as a way to stay
relevant, and in the process have annoyed many C++ developers who don't see
any value in it.

None of the other major declarative toolkits (XAML, AXML, JavaFX, Cocoa) uses
a JavaScript-like language for its UI descriptions.

------
tlow
Apple laptop hardware seems to be best in class in terms of performance,
support and longevity.

I have some complaints about the way macOS (or OS X) has progressed. My
computer feels more like a Windows 8 desktop running a tablet operating system
than ever before. Core usability features that I love have morphed and changed
(Exposé, Spaces). I have more complaints, but that's not the point. The point
is, despite my personal feelings that macOS continues to decline in quality, I
haven't found any hardware that is in any way comparable.

I use a base-model 13" MacBook Air and I've used its Dell XPS 13 clone. The
Dell has a basically unusable trackpad, while the Air has the best trackpad on
the market. Little things like this make a big difference. Longevity is
another big one.

~~~
Osmium
> despite my personal feelings that macOS continues to decline in quality

Not disagreeing about the changes to Exposé etc., but I find it interesting
how often this 'decline' is talked about. I read somebody saying the other day
that Snow Leopard was the last solid release... and I'm wondering, do some
people have selective memory here? Are they remembering the same OS I
remember?

I've been using OS X since 10.3 Panther, and it's _always_ been a 'work in
progress', rough around the edges. It's currently the best it's ever been in
terms of security, for sure. There are some superficial areas I don't like
much (Mission Control, Launchpad), but they can be ignored. The OS itself is
as solid as it's ever been... which is to say it does still have issues (the
discoveryd debacle being a notable example), but it's also had a lot of
improvements to its fundamentals (Core Storage and APFS, to pick two big
examples).

It also hasn't helped that the change to high-DPI screens made the whole OS
feel slower, and I don't think we've quite recovered from that yet.

~~~
drawnwren
My personal complaints with OSX are all UI related. I give up too much of my
screen and time to animations and buttons that I don't want. The underlying OS
is quite good. Much more stable than me endlessly fiddling with Linux. That
said, seeing as I can't customize the UI or workflow to what I consider to be
a reasonable level, I'll stick with a custom WM on Linux.

~~~
tlow
I agree with the sentiment of this comment. I think you're articulating what I
perceive as feature decline, particularly in the GUI. Advanced features seem
to be disappearing, and unless I'm using the terminal I feel like my machine
is increasingly operating with settings and parameters I can't access,
customize, or eliminate (iOS analogy: non-deletable apps; an obscure OS X
example: Cmd-Ctrl-Option-8 used to invert the screen colors, but it's now off
by default, and you can only use it if you already know it's a feature you can
enable). Huge UI bugs in OS X not previously present include update alerts
that cannot be dismissed and will overlay full-screen presentations and videos
until closed, with no way to turn this off; the same is true for Time Machine
backup notices.

------
Alex3917
The reason you can't get 32 GB of memory is that Skylake (and Kaby Lake) don't
support low-power mobile DDR4 (LPDDR4), which is necessary to keep power
consumption low enough for a laptop. This isn't Apple's fault. Short of making
a 7-pound Windows-style gaming laptop, there's not much they can do until
Cannon Lake comes out in 2018.

~~~
sirmike_
Bologna. Dell and System76 offer current- and last-gen Intel processors with
more than 16 GB of RAM, going up to 64 GB in the case of System76.

~~~
Veen
Yeah, but what sort of battery life are they getting? It's not that you can't
have more RAM with those processors; it's that you can't have more of the
energy-efficient mobile RAM.

~~~
gcr
Is it possible for a machine to dynamically turn the power to RAM banks on
and off?

In periods of low use, the OS could swap most of RAM to disk and switch it
off to save energy.

~~~
Mandatum
You can't turn off blocks of memory within a single RAM chip AFAIK; it's all
or nothing. And I don't think enabling and disabling individual RAM chips
would be very viable, since you'd need to move memory from one chip to another
before turning one off, and there would be a lot of potential issues when that
happens.

However, enabling or disabling RAM chips and then rebooting the laptop, like a
low-power toggle, would be possible.

------
bluejekyll
I totally agree with this article. I have a MacBook Pro circa 2015, and it's
awesome.

I haven't touched the new machines, but they aren't inspiring, though the
black option is really nice to see :)

I haven't tried the keyboard, which it sounds like they changed. I really like
the one on the generation I have, and I also like the wireless model Apple
came out with. Anyone know if the new one matches the wireless keyboard?
Because I like that one: short throw and large, quiet key movement.

In terms of the Touch Bar: eh. I really like the idea of having thumbprint
access to the computer, and I hope this is accessible to other applications.
As a developer I'm mildly annoyed at losing function keys for certain tasks,
but maybe I can use it to create custom keys and icons in different contexts,
which could be really cool.

Otherwise, while it doesn't offend me, it's definitely not worth upgrading to.
But IMO, like this article says, it's still the best Unix-based laptop on the
market (yes, I know it's XNU, but it's fully POSIX with a full BSD subsystem,
which is better than Windows's POSIX mode and Linux veneer).

Anyway, while it isn't better than my current machine, it is thinner and
lighter, which are nice-to-haves, and _I_ might actually like the new
keyboard.

~~~
extra88
I briefly tried the keyboard on a 2016 12-inch MacBook in an Apple Store and
didn't like how "shallow" it felt relative to other Apple keyboards I've used.
I hope the 2016 MBP's keyboard isn't like the 12-inch MacBook but it's not a
deal-breaker.

I'm not excited by the Touch Bar, but I'm not concerned about it either. I
think it will take some time to see how useful it is and how committed Apple
is to it (e.g., will it appear on a keyboard peripheral soon?). You don't lose
the function keys entirely; you can always call them up by holding down the fn
key.

Touch ID is available to developers; 1Password will support using it to unlock
a vault instead of entering your Master Password, the same as they do in their
iOS app.

I don't care about them being thinner but appreciate them being half a pound
lighter. That was achieved, in part, by a smaller battery (24% smaller in the
15-inch model), yet they claim the 2016 model gets 10 hours of use vs. 9 hours
for the mid-2015 model.

~~~
Russell91
The 2016 MacBook Pro uses the "shallow" keyboard.

------
seiferteric
I use Linux exclusively, so I'm a bit biased. I have used Macs in the past,
but I just don't see the value when all I ever need is a web browser and a
terminal and I'm set. There's no question that their hardware is the best,
though. My question is: what's stopping a company like Canonical from
polishing Ubuntu to be on par with macOS in looks and ease of use, and then
designing the "perfect" developer laptop? It seems like they should have done
this years ago.

~~~
Russell91
Their hardware is not the best anymore.

I just visited the Palo Alto Apple Store, and I'm pretty sure a Microsoft
Surface tablet has a better keyboard than the new MacBook Pro. They really
smushed down the keyboard to make the whole configuration thinner; whereas the
old Pros were halfway between the quality of a dome keyboard and a mechanical
keyboard IMO, the new ones are even worse than domes.

~~~
erokar
I certainly didn't like the shallow keyboard on the MacBook. I don't even
particularly like the keyboard on the MacBook Air I'm typing this on, because
of the short key travel. But I'm curious as to whether I could get used to the
new butterfly keyboards. It would also be interesting to hear what an
ergonomics expert has to say about them.

------
pault
I've been using macs exclusively for 10+ years, and my next work machine will
be a Razer Blade Stealth with the Core external GPU Dock.

[http://www.razerzone.com/gaming-systems/razer-blade-
stealth](http://www.razerzone.com/gaming-systems/razer-blade-stealth)

~~~
skeptic2718
MacBook Pro - 6th-gen i5 2.9/3.3GHz, 8GB 2133MHz memory, 512GB PCIe SSD, Intel
HD 550 iGPU, 2560x1600 screen, aluminum body - US$ 1999

Razer Blade Stealth - 7th-gen i7 2.7GHz/3.5GHz, 16GB 1866MHz memory, 1TB PCIe
SSD, Intel HD 620, 3840x2160 screen, aluminum body - US$ 1999

~~~
Bud
And the Razer will be heavier, have half the battery life, and have inferior
fit and finish; plus you're cherry-picking MacBook Pro configs to
intentionally leave out the i7 and higher-end graphics options. Also, a
3840x2160 screen on a laptop is stupid: you're paying extra and taking a large
performance penalty to get 4K on a screen where you can't really appreciate or
use the higher res.

~~~
pault
To be fair, I plan on getting the base $999 model and putting a GeForce 1080
in the breakout box, driving a 5K monitor or five 1440p monitors in portrait
mode. I don't really leave the desktop often unless I'm using the laptop in
bed.

~~~
swozey
I'm pretty sure I read that a 1060 or 1070 saturates the Razer Core's
bandwidth. I may be wrong, but bear that in mind. Specifically, I think I read
a review where only 80-85% of a 1070 could be fully utilized, so I doubt
throwing a 1080 into the mix will be a net gain. Just mentioning it so you
research it before you spend the cash.
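For a rough sense of why that happens, here's a back-of-envelope sketch. The
link rates are nominal spec numbers, not measurements of the Razer Core
itself, and real throughput is lower still once protocol overhead and
DisplayPort/USB traffic on the same Thunderbolt link are factored in:

```python
# Why a high-end GPU can be bottlenecked in a Thunderbolt 3 enclosure:
# TB3 tunnels at most a PCIe 3.0 x4 link, while a desktop GPU normally
# gets a full x16 slot.

def pcie3_gb_per_s(lanes: int) -> float:
    """Usable PCIe 3.0 bandwidth in GB/s: 8 GT/s per lane with
    128b/130b encoding, divided by 8 bits per byte."""
    return lanes * 8e9 * (128 / 130) / 8 / 1e9

internal_x16 = pcie3_gb_per_s(16)  # typical desktop GPU slot
egpu_x4 = pcie3_gb_per_s(4)        # the most TB3 can tunnel

print(f"PCIe 3.0 x16: {internal_x16:.2f} GB/s")
print(f"TB3 eGPU link (x4): {egpu_x4:.2f} GB/s")
print(f"The eGPU link offers {egpu_x4 / internal_x16:.0%} of a desktop slot")
```

That quarter-bandwidth link is consistent with reviews seeing a 1070 already
leaving performance on the table; a 1080 just pushes more data through the
same straw.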

~~~
pault
You are correct. That is incredibly disappointing.

------
pavs
Linux can look nice. Unfortunately not by default.
[https://www.reddit.com/r/unixporn/top/](https://www.reddit.com/r/unixporn/top/)

~~~
izacus
elementaryOS IMO looks more than nice by default.

~~~
dkuntz2
If only they worked to make Pantheon run well on other distributions instead
of forking off as a separate one. And picking Ubuntu as the base to fork from
is not the best choice.

------
trapperkeeper79
I bought a Windows laptop this year after being a long-time Mac user. I still
have Macs in the house and use a beefy 15-inch MacBook Pro for work. I'm
pleasantly surprised by my Win 10 experience. It isn't as polished as a Mac,
but hey: I have a touchscreen, a new processor, and lots of ports (which I
care about), at almost half the price of an entry-level MacBook Pro. The cost
isn't a big factor; I would have loved to buy an expensive MacBook with beefy
hardware specs. What's being sold right now just isn't for me.

~~~
mercer
Which laptop did you get?

~~~
trapperkeeper79
HP Spectre (the one before the latest, since the latest doesn't have a regular
HDMI connector). It's pretty decent; the only con is the trackpad. I fooled
around with the settings to make it livable, and I also use it with an
external keyboard and mouse when I'm coding at home. The Windows Subsystem for
Linux is pretty rad.

------
thaw13579
After spending the last week using the Windows Subsystem for Linux, I can see
myself potentially using it exclusively for development (assuming they iron
out the bugs and make GUI programs perform well). I wonder, have others used
it and gotten a sense of how much they can depend on it?

~~~
DoofusOfDeath
There are two reasons I'm avoiding Windows 10 for anything important: (1) I
can't meaningfully control when OS patches get installed, and (2) the
plausible risk of unacceptable spying by MS.

So even if W10's Linux subsystem ran flawlessly, I don't trust the host OS in
general to provide a reliable, acceptable environment.

I'm curious if you have the same concerns.

------
jsz0
> I want a meat-grinder CPU to make photo-editing

Isn't Apple using the latest & greatest Intel mobile CPUs?

> I want a modern video card so game-playing is cool when I feel like it.

I don't know much about gaming, but isn't the Radeon in the 15" model based on
the latest AMD architecture? I can't even find any benchmarks for it, so are
you sure it's not good enough?

> I want ludicrously excessive amounts of memory.

Do your memory requirements take disk IO speed into consideration? When you
can read from disk at 3 GB/sec, I'm not sure the old assumptions about memory
capacity hold true anymore.
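To put hedged numbers on that trade-off, here is a sketch with ballpark
figures chosen purely for illustration: the 3 GB/s SSD rate is from this
comment, while the DRAM bandwidth and latency values are rough textbook
assumptions, not benchmarks of any particular machine:

```python
# Fast SSDs narrow the gap for streaming workloads, but random access
# is still orders of magnitude slower than keeping data in RAM.

ssd_seq_gb_s = 3.0      # claimed NVMe sequential read rate
dram_gb_s = 30.0        # ballpark dual-channel mobile DRAM bandwidth
ssd_latency_us = 100.0  # typical NVMe random-read latency
dram_latency_us = 0.1   # ~100 ns DRAM access

working_set_gb = 16
print(f"Stream {working_set_gb} GB from SSD:  {working_set_gb / ssd_seq_gb_s:.1f} s")
print(f"Stream {working_set_gb} GB from DRAM: {working_set_gb / dram_gb_s:.1f} s")
print(f"Random-access penalty: ~{ssd_latency_us / dram_latency_us:.0f}x")
```

Under these assumptions, re-reading a large working set sequentially costs a
few seconds, which is tolerable for ingesting photo files; workloads that hop
around memory randomly still pay a roughly 1000x latency penalty, which is
where the "just use the SSD" argument breaks down.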

> I want lots of connectors so I can plug in my current USB drives and
> mouse and keyboard.

USB-C is great for this. You can connect all your accessories to a hub and
only need to plug in one cable for power, external display, storage, input
devices, etc.

> I want an SD card reader because I shoot RAW so my photo files are huge
> and they ingest faster from a card

USB-C card readers cost about $10-$15, so you could probably just buy a few of
them if you're concerned about losing them. Most of them are multi-card
readers too, so you're actually gaining functionality in the process. Also,
when newer/faster SD standards come out, you can simply buy a new USB-C card
reader instead of being stuck with the one built into your computer.

~~~
vladimir-y
> Isn't Apple using the latest & greatest Intel mobile CPUs?

Surely Apple is not using the latest Intel mobile CPUs; no Kaby Lake.

> Do your memory requirements take disk IO speed into consideration? When you
> can read from disk at 3GB/sec I'm not sure the old assumptions about memory
> capacity hold true anymore.

Why should I use an SSD instead of RAM? I use the SSD for persistent storage.
And why, in general, should you accept a trade-off when other laptop
manufacturers ship decent laptops?

> USB-C is great for this. You can connect all your accessories to a hub and
> only need to plugin one cable for power, external display, storage, input
> devices, etc.

Yeah, a hub ... which you will have to take with you every time.

> USB-C card readers cost about ...

Yeah, a USB-C card reader ... which you will have to take with you every time.

~~~
jsz0
> Surely Apple is not using the latest Intel mobile CPUs; no Kaby Lake.

Ah, I must have missed that in all the hysteria over the Touch Bar and USB-C.
That is valid criticism, especially for the 13" model, which would benefit
from a faster integrated GPU.

> Why should I use an SSD instead of RAM?

The benefit of faster disk IO is that you can shuffle things in and out of
memory faster. For a photographer dealing with lots of relatively small files,
faster disk IO can be more important than raw memory capacity. In other
scenarios, raw memory capacity matters more. In every scenario both raw memory
capacity and faster disk IO are great, but there are tradeoffs there too,
mainly weight, battery life, and cost.

> And why, in general, should you accept a trade-off when other laptop
> manufacturers ship decent laptops?

There are many factors to consider and different people have different
priorities.

> Yeah, a hub ... which you will have to take with you every time.

Along with all the accessories you're already lugging around to plug into it?
Never really understood this argument.

------
tgarma1234
I have a 13" MacBook Air, the basic one with 4 GB of RAM and a small SSD. It
works fine. I can't really see why it would need to get a whole lot better,
honestly. If I need a "big" computer I use my desktop, and if I just want to
read I use my tablet or my phone. I would buy another Apple laptop, sure, but
not really because of the specs. They are just good computers in a world where
there are lots and lots of good computers. I think Apple is right to just keep
on keeping on with these machines, because there isn't much innovation in
parts happening now. Touchscreens are a gimmick to me, and newer CPUs are only
a little better and not worth the price of the upgrade. New computer specs are
basically about as exciting as reading about innovations in refrigerators or
other durable goods.

~~~
swozey
The goal for a lot of us is to not need a "desktop" to do our "big" work.
That's the thing. This was the Macbook Pro line, meaning you'd carry it home
and to work and plug it into a dock/monitors and never need a "big" computer.

~~~
tgarma1234
In that case it seems like using "big" desktop computers on Azure or AWS is
right around the corner, especially if you factor in better internet speeds
making the remote desktop experience more seamless, so that the client
computer doesn't need much power. It just doesn't seem like there's a future
in developing computers with massive specs as personal machines anymore. If
there were a profitable market in it, the analysts at all of the major
companies would know it. They seem to see that there isn't a market worth
investing in.

~~~
swozey
There is a HUGE monetary and time benefit to having developers spin up
Docker/Vagrant environments to run integration tests and the like on their
laptops before anything hits CI/CD systems, so I completely disagree. People
thought we'd offload developer computing to the "cloud" 5 years ago. It's not
coming soon; I doubt it's coming in the next 5-10 years if you're making
applications that now rely on microservices, Kubernetes, etc.

AWS charges by the hour; GCP charges in 1-minute increments after the first 10
minutes. That all gets extremely costly if you have a number of developers
sending all of their tests through those systems.
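A toy illustration of how that billing granularity compounds: the $0.50/hour
rate, 12-minute run, and 40 runs/day below are made-up numbers, not real AWS
or GCP prices, but the rounding behavior matches the hourly-vs-minute schemes
described above:

```python
import math

RATE_PER_HOUR = 0.50   # hypothetical instance price
RUN_MINUTES = 12       # one integration-test run
RUNS_PER_DAY = 40      # a team kicking off test jobs all day

def hourly_billed(minutes: int) -> float:
    """Each run rounds up to a whole billed hour (AWS-style at the time)."""
    return math.ceil(minutes / 60) * RATE_PER_HOUR

def minute_billed(minutes: int, minimum: int = 10) -> float:
    """Per-minute billing with a 10-minute minimum (GCP-style at the time)."""
    return max(minutes, minimum) / 60 * RATE_PER_HOUR

print(f"Hourly billing: ${hourly_billed(RUN_MINUTES) * RUNS_PER_DAY:.2f}/day")
print(f"Minute billing: ${minute_billed(RUN_MINUTES) * RUNS_PER_DAY:.2f}/day")
```

Even at these toy rates, per-hour rounding makes the same workload five times
more expensive, which is one reason short, frequent developer test runs stay
on laptops.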

~~~
tgarma1234
And that seems to me like the main incentive they would have to force
developers into that usage model. On the one hand it makes sense to charge
$2500 for a laptop, but if the quality is now so great that the machine can be
expected to work well for almost every user for nearly 10 years, then that
price point is actually almost a Pyrrhic victory. It would be better to sell
smaller, cheaper machines that justify more frequent upgrades and then charge
power users for time in the cloud. I think that's pretty much the inevitable
endgame here, in an industry that is all but fully commoditized.

------
droopyEyelids
> So, there are new MacBooks and many people are unhappy

I'm seeing this sentiment repeated quite a bit, and I'm surprised because it's
almost a pre-literate view of things.

Whenever something changes with regards to human life, there is a brief window
for content generators to capture increased attention by writing a timely
article.

Content creators get more attention (which directly translates into revenue)
if the content is contentious, or can be forced into 'explaining' a larger
trend.

Understanding this is basic Internet literacy. "Many people" are not unhappy;
that is just the content machine striving to grab your attention.

------
peatmoss
I feel like Ubuntu has been the F/OSS replacement for OS X for some time.
Unfortunately, I feel like the strengths of a free *nix (don't like the UX?
Change it.) are exactly the things that prevent a critical mass.

Sadly, a critical mass of support seems to be crucial for things like hardware
drivers and commercial software with roots in the shrinkwrap era (Photoshop,
Office, etc.).

~~~
lucideer
elementaryOS seemed to be angling more directly for that market (Ubuntu was
always quite Mac-ish, at least up until Unity, but still very much looked to
attract a more general user base, e.g. targeting installers at Windows users),
but I think elementaryOS is probably a bit young to take full advantage right
now. They may have gone a little too far on the "simplicity" side for power
users coming from macOS.

~~~
dkuntz2
What do you mean by "up until Unity"? Unity has always looked like a macOS
desktop clone, from having the menu bar always at the top to integrating a
dock by default. Unity seems significantly closer to the macOS desktop than
the old GNOME 2 setup.

~~~
lucideer
Unity still has its Mac-ish elements, true, but I think the post-Unity GTK+
and general default-app experiences tended to move toward "doing their own
thing" rather than aping Windows or macOS.

Also, while the Unity side dock is iconified like macOS's, I think its
functionality differs more from the macOS dock than the old GNOME 2 bottom
panel's did.

All subjective, I guess.

------
macawfish
GNOME 3 with extensions is really lovely now, heads up.

------
frik
It's not just Apple with the MacBook/macOS. What does a post-Windows 10 future
look like? So many things are not okay; I simply refuse it and will stay with
Win7 for some time. Microsoft, like Apple, doesn't care about power users
anymore. Apple at least still makes good iPhones/iPads; MS is worse at
everything nowadays (Xbox, Surface, Windows Store, Windows Phone, destroying
the BIOS with DRM'd EFI), except at paying trolls.

A paradigm shift: the desktop gets less and less use, and I'm hoping for an
Android/Fuchsia OS on notebooks. Today many things are done on mobile iOS and
Android; I just want to connect the device to a huge TV/projector/monitor plus
keyboard and mouse and use it for work too.

~~~
aweb
I'm sorry, but I find this a weird assertion when you see how much more
detailed Windows 10's Task Manager is, or the fact that you've been able to
run a Linux subsystem on Windows since the summer. What makes you think MS
doesn't care about the power user when all signs point to the contrary?

~~~
XaYdEk
I'd like an answer too.

I'm a Linux guy, and Win7 is the last Windows I will ever use personally on my
own machines (unless I really need it for work, in which case I'll run it in a
VM anyway). But my reasons are different. I don't really think they ruined it
that much for power users; it's other details that annoy the hell out of me,
like online accounts by default, way too much telemetry data being sent, and
the fact that it still feels like it was designed for a tablet (even the
Enterprise versions). It pushes too hard for the always-online,
always-connected computing experience, which feels like turning PCs back into
terminals.

@aweb I dunno, it seems they just moved Services and Startup from msconfig
into Task Manager. The Linux subsystem is nice, very nice actually; it's the
one thing I really liked.

