
Mac Pro Puts the Pedal to Metal in Apple's Race with Nvidia - webwielder2
https://appleinsider.com/articles/19/10/18/editorial-mac-pro-puts-the-pedal-to-metal-in-apples-race-with-nvidia
======
chipotle_coyote
It's refreshing to see that Daniel Eran Dilger's talent for writing thousands
of words of speculation based on, say, two sentences in release notes is still
going strong.

And, yes, that's a little mean of me, but I'm not kidding. In July 2010 he
wrote three pages _with illustrations_ about how Xcode 4 "could portend new
HTML5 development tools" that delivered "solutions for parallel development
tasks"; in April of that year, he wrote what might as well have been a short
novel about how the iPad would "kill" DVDs, Microsoft Office, DVRs, and "idle
moments." He at least used to fill his long, long articles with links that
_look_ like citations, but are actually just links to... earlier speculative
articles he's written. Actual references tend to be very, very thin. He's also
still remarkably committed to the notion that everything Apple does is mad
genius four-dimensional chess, to the point where other bloggers in the Apple
space including John Gruber (and, back when I could be described as "a blogger
in the Apple space," me, I suppose) called him out for it.

The notion that Metal is a SECRET GRAND STRATEGY to replace CUDA is
fascinating, it's just... not supported by any material evidence, despite the
length of the article suggesting otherwise. In other words, it's pretty
classic Dan Dilger.

~~~
scarface74
He was writing his entries at “Roughly Drafted” for years; he was a
well-known but not well-respected blogger going back to the early 2000s,
I think.

------
kevin_b_er
Metal is racing against Vulkan as well, because Apple needed to have
their own standard that they owned. Except Metal has zero support outside
of Apple products.

Besides, what Metal represents is Apple's attempt to have yet another vector
for their infamous walled garden and vendor lock-in strategies.

Further, the article and the website are intentionally and blatantly
biased. Quotes like this showcase their extreme bias:

> One can be righteously indignant that Apple isn't subsidizing everyone else
> with support for their platforms, whether CUDA, Vulkan, or even Android. But
> such emotions won't have any bearing on the final outcome of who wins and
> who loses in the market for developing and commercializing the graphics
> technology of the future.

The absolute derision toward open standards, coupled with the insults
against those who would support them, showcases quite well the position
of the article, its authors, and the website itself.

~~~
reaperducer
_Further, the article and the website are intentionally and blatantly biased_

Seems like an unnecessary criticism. It's an Apple fan blog, not journalism.
Nobody is going to Apple Insider for balanced information about tech trends.

~~~
jonas21
_> Nobody is going to Apple Insider for balanced information about tech
trends._

Many people are going to click on an HN link and assume it's a reasonably
objective technical source. I didn't realize it was an Apple fan blog until I
was well into the article.

~~~
bitwize
I like to think the average Hackernews is a bit more savvy than to assume all
or even most links here are objective. Still, there's no harm in saying "yo
dawg, this article has an agenda, so just be aware of that when you read."

------
exabrial
Without the ability to spin up OS X instances in the cloud, I don't see
Metal making a dent in CUDA.

~~~
CoolGuySteve
The cloud is staggeringly more expensive than building your own
workstation. A couple of weeks of on-demand GPU instances in AWS's
Virginia region costs about as much as a machine with 2 RTX 2080 cards in
it that will stay performant for at least a couple of years.
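
Back of the envelope (assuming the single-V100 p3.2xlarge at its roughly
$3/hour on-demand price in us-east-1): two of them running for two weeks
is about 2 x $3 x 336 hours ≈ $2,000, right in the range of a dual-2080
build.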

But coming back to the Mac Pro, unless you need Apple-specific software, the
same is true compared to Ryzen/ThreadRipper + NVidia cards. The Mac costs 2-3x
as much while delivering less performance.

At least the cloud can scale horizontally; the Mac Pro is just slow and
inefficient.

~~~
GenerocUsername
Why is that? Shouldn't competition drive cloud prices closer to ownership
prices?

~~~
sseveran
It's because NVIDIA requires Teslas in the cloud, which, while more
powerful than 2080 Tis, cost roughly 6-8x as much.

~~~
Analemma_
How does that work? If I buy a 2080 Ti and then decide to use it in an
internet-hosted box that I rent out to other people, the first-sale doctrine
should tell Nvidia to piss off if they have a problem with it.

~~~
angry_octet
When you are operating at any kind of scale you need the Tesla packaging
(airflow, connectors, power) and testing (running many in a system). The
consumer desktop cards cause lots of trouble.

Also the consumer cards have artificially small memory sizes (e.g. 11GB max)
which painfully constrains DL jobs.

No manufacturer is allowed to build Tesla-like cards. Theoretically AMD could
crush the nVidia profit margins by releasing cheap data center boards, but
their developer support is rubbish and they want that high margin cash too.

~~~
fit2rule
.. what this means is that Apple is competing against the Titan/NVIDIA
_hardware designers_ to make better compute power available at scale, and
in a way which makes sense across consumer/pro boundaries.

In that context, the Mac Pro doesn't sound like too bad a proposition. I
say that as someone who recently built a dual-Titan/AMD Ryzen system, and
while the pain of the build has almost gone away .. I do lust after that
sexy Apple box, it being plug 'n' play and all ..

~~~
angry_octet
Intel competes with nVidia on high-end GPU hardware and so far (KL and
KF) is doing an abysmal job of it, despite using its CPU control to lock
nVidia out of the CPU bus.

Apple can't really compete except in mobile GPUs, where power constraints
are in the single-digit watts, vs >100W for servers. That isn't all bad
for Apple; there is lots of inference to be done, but by refusing nVidia
hardware they impose a developer hurdle.

I'm looking forward to people putting 2080s into the new Mac Pro and seeing
how reliable it is.

------
jchw
In all honesty, with Metal limited to just Apple and Vulkan representing
roughly “everyone else,” will this even be a fair fight? This could be a
horrendous prediction, but I see Metal as the modern-day Betamax. There’s
nothing wrong with it, but is this how you win a competition of APIs? I have
enormous doubts.

~~~
lostmsu
Does Vulkan work on Xbox and PS?

~~~
jchw
Excellent question, but let’s be clear: it’s more of a question of whether or
not Xbox or PlayStation ship Vulkan implementations. As far as I know,
PlayStation has traditionally used proprietary APIs, and you would need a
third-party implementation, if one exists. As for Xbox, I assume it still
uses a modified Direct3D API for graphics.

I assume you didn’t mention Switch because it, somewhat famously, _does_
actually ship with Vulkan support natively. Which is cool.

Of course, even though video game consoles continue to converge into looking
like computers, developers still don’t and can’t treat them as such. Sure,
Switch runs Vulkan but that doesn’t mean you can compile a Vulkan game and
ship it to the eShop. Same for a DirectX game on Xbox. Consoles are, by
nature, very proprietary, and much more frugal than general purpose computing
platforms.

Does adoption of Vulkan for game consoles matter? Maybe, but I’d guess
probably not. It might help the consoles attract developers, but at the end of
the day every console is going to get its own port with a decent amount of
retooling to better suit the platform, especially in the case of Switch which
is not a terribly high end piece of hardware in this day and age.

~~~
pjmlp
Switch also has their own proprietary API, NVN.

Vulkan is a kind of addendum.

~~~
jchw
Sure: it’s clear Vulkan is made to attract developers. I don’t think there’s
any delusion that a unified API would really mean you didn’t have to port to
each console individually; you need more fine grained optimization for most
games still, especially in Switch’s case. Eventually this may stop being true,
but as long as a $300 laptop has trouble gaming, it would be unfair to expect
a $300 portable game console to run unmodified games significantly better. In
this case, the API only matters to a smaller degree.

------
gdubs
I’ve been exploring Metal a lot in the past year, and I really enjoy working
with it.
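
For a sense of what that looks like, here’s a minimal compute sketch in
Swift. The API names are from Apple’s public Metal framework; the kernel
and the sizes are just illustrative, not anything from the article:

    import Metal

    // Minimal Metal compute sketch: double an array of floats on the GPU.
    // The MSL kernel is compiled from a string so the example stays
    // self-contained.
    let source = """
    #include <metal_stdlib>
    using namespace metal;
    kernel void double_values(device float *data [[buffer(0)]],
                              uint id [[thread_position_in_grid]]) {
        data[id] = data[id] * 2.0f;
    }
    """

    let device = MTLCreateSystemDefaultDevice()!
    let library = try! device.makeLibrary(source: source, options: nil)
    let pipeline = try! device.makeComputePipelineState(
        function: library.makeFunction(name: "double_values")!)

    var input: [Float] = [1, 2, 3, 4]
    let buffer = device.makeBuffer(bytes: &input,
                                   length: input.count * MemoryLayout<Float>.stride,
                                   options: .storageModeShared)!

    let queue = device.makeCommandQueue()!
    let commands = queue.makeCommandBuffer()!
    let encoder = commands.makeComputeCommandEncoder()!
    encoder.setComputePipelineState(pipeline)
    encoder.setBuffer(buffer, offset: 0, index: 0)
    // One threadgroup of four threads is plenty for four elements.
    encoder.dispatchThreadgroups(
        MTLSize(width: 1, height: 1, depth: 1),
        threadsPerThreadgroup: MTLSize(width: input.count, height: 1, depth: 1))
    encoder.endEncoding()
    commands.commit()
    commands.waitUntilCompleted()

    let out = buffer.contents().bindMemory(to: Float.self, capacity: input.count)
    print(out[0], out[1], out[2], out[3])  // 2.0 4.0 6.0 8.0

The same device/queue/encoder vocabulary carries over to render passes,
which is a big part of why the API feels coherent to me.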

This article has a lot of interesting historical bits, but it conflates a
lot. Maybe Jobs hoped to take on the high-end graphics market, but my
understanding is that the NeXT cube was fairly underpowered. It was its
application development environment that made it a cult classic. (And
arguably what led to the success of macOS and iOS.)

The Mac Pro is definitely reminiscent of a time when SGI had some of the
sexiest machines available. There’s always been the “Apple Tax”, but
there have also always been people willing to pay it because they love
the Apple ecosystem and the aesthetics of the platform.

------
wazoox
Given the hefty price of the new Mac Pro and the lack of dense, rackable
hardware setups, I don't see how it could make any significant dent in
the 3D market.

~~~
pram
They said the new Mac Pro will come in a rack mounted variant at the WWDC
presentation. Rebirth of the Xserve ;P

------
siod
It's fairly standard for rendering/game engines to support native
graphics APIs; they're built to be cross-platform.

Not to mention machine learning is a totally different market; using OS X
on workstations would be counterproductive, as there aren't any feasible
server solutions to run production data on.

Also, the Tegra chips aren't irrelevant; they're powering the Switch and
are extremely popular as drone/robotics application processors.

This entire article is biased beyond belief.

~~~
pram
The Mac Pro will come in a rackable server chassis version. That's a detail
from the announcement a lot of people missed. Presumably Apple intends for the
new hardware to be deployable Xserve style, not just workstations.

[https://i.imgur.com/D1HaH9l.png](https://i.imgur.com/D1HaH9l.png)

------
gavanwoolery
Competition is generally good, but competing standards are bad, especially
when they have no real advantages over each other. Imagine if every platform
had its own incompatible version of C (ok, yes there are incompatibilities but
they are not there by intention).

If you want to win me over as a developer, do it the right way. Provide
superior tooling and win a customer base with a superior implementation.

~~~
burlesona
I'm not sure it's that simple when you're talking about needing to evolve
hardware design and software toolkit together. NVidia has been ahead of
everyone else largely because they skip all the coordination overhead and
design CUDA to just work on their hardware, and put their energy into making
that hardware + software combination better and better without regard for what
everyone else is doing. If Apple wants to compete it's hard to see how they
could without adopting a similar approach.

What I think is more realistic is that things like Vulkan will have Metal
and CUDA adapters, etc., and that developers who don't need bleeding-edge
performance as much as they need broad compatibility will be able to use
them.

In other words it's the same tradeoff as we've seen in software for decades --
you can write a fully-optimised native application per operating system, or
you can use a cross-platform toolkit (increasingly the web) which can do a
nice job but lacks some of the power and polish that a native application can
deliver.

------
angry_octet
What an incredibly Kool-Aid-guzzling piece. It manages to get the
historical timeline right but has a totally out-of-kilter understanding
of the reasons why. And thinking Apple can supplant CUDA with their own
proprietary tech is completely delusional. The lack of an nVidia GPU is a
significant drawback to using it as a workstation, and Apple is foolish
to have gone the AMD-only route.

------
elipsey
As an aside, what's this about "Microsoft and Intel worked to steal
Apple's code representing proprietary video acceleration techniques" back
in the day?

It's before my time, and the citation is a press release. Anyone remember
that?

~~~
GeekyBear
Code written under contract for a port of QuickTime to Windows later showed up
in Microsoft's Video for Windows product.

>The lawsuit, filed on December 6, 1994, alleged that the San Francisco
Canyon Company used some of the code developed under contract to Apple,
in their additions to
Video for Windows. Apple expanded the lawsuit to include Intel and Microsoft
on February 10, 1995, alleging that Microsoft and Intel knowingly used the
software company to aid them in stealing several thousand lines of Apple's
QuickTime code in their effort to improve the performance of Video for
Windows.

On March 3, 1995, a federal judge issued a temporary restraining order that
prohibited Microsoft from distributing its current version of Video for
Windows.[1] Microsoft subsequently released version 1.1e of Video for Windows,
that removed all of the code contributed by San Francisco Canyon, stating in
the release notes "does not include the low-level driver code that was
licensed from Intel Corporation." Later testimony in the Microsoft anti-trust
trial revealed that, at the time, Apple was threatening Microsoft with a
multi-billion dollar lawsuit over the allegedly stolen code, and in return
Bill Gates was threatening with the cancellation of Office for the Mac.[2]

[https://itlaw.wikia.org/wiki/Apple_v._San_Francisco_Canyon](https://itlaw.wikia.org/wiki/Apple_v._San_Francisco_Canyon)

Microsoft and Apple eventually settled and cross licensed their patent
portfolios.

~~~
elipsey
Oops, I posted at the same time. Thanks for the summary.

------
dev_dull
For those that think Apple can't pull this off, remember that they dominate
the developer laptop market. They just need to incrementally slip it in via
software and hardware.

For example, I recently found myself doing some video encoding via
ffmpeg. The default install from brew supports VideoToolbox (T2-chip
accelerated) out of the box, and boy howdy is it fast. It smokes my
multicore Intel server optimized with VAAPI.
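
The invocation is nothing exotic; something like
`ffmpeg -i input.mov -c:v h264_videotoolbox output.mp4` (file names here
are placeholders) is all it takes to hit the hardware encoder.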

Soon users will do a `pip install pytorch` on their dev laptop and get the
metal-optimized version of it without even thinking twice. Apple already has
their foot in the door.

~~~
lliamander
I don't know about dominate. From the most recent Stack Overflow survey,
25% of developers used Mac, 25% used Linux/BSD, and 50% used Windows.

~~~
dev_dull
Maybe I should say it dominates for silicon valley tech companies and
startups, which is manifestly obvious to anyone who works here.

~~~
lliamander
Fair enough. I'm "valley-adjacent" and so I see plenty of that.

------
shmerl
_> One can be righteously indignant that Apple isn't subsidizing everyone else
with support for their platforms, whether CUDA, Vulkan, or even Android. But
such emotions won't have any bearing on the final outcome of who wins and who
loses in the market for developing and commercializing the graphics technology
of the future_

Time for Apple to get unstuck from their nasty lock-in attitude and
support Vulkan. Such a sickening lock-in approach to development tools is
so very '90s. Apple got frozen in time with this attitude.

Metal isn't even a competitor to Vulkan, due to Metal being Apple-only.
So it's not a question of which will win; it's a question of whether
Apple will remain in the dark age of lock-in or get unstuck from it and
become a good citizen that collaborates instead of taxing developers.

------
skvj
Vulkan (MoltenVK) does seem to be a valid option for those heavily invested or
interested in porting existing OpenGL projects (I plan on evaluating it):
[https://github.com/KhronosGroup/MoltenVK](https://github.com/KhronosGroup/MoltenVK)
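
(For clarity: MoltenVK implements the Vulkan API on top of Metal, so the
porting path for an existing OpenGL codebase is OpenGL -> Vulkan ->
MoltenVK, rather than a direct OpenGL shim.)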

------
temac
Either I completely missed some important things about PC history, or the
author does not know at all what he is talking about:

e.g.:

> By 1991 ATI was selling dedicated GPU cards that worked independently from
> the CPU. By enhancing the gameplay of titles like 1993's "Doom"

Hm I'm sorry, what???

~~~
SigmundA
The ATI Mach8 was an 8514/A graphics accelerator with no VGA core, so it
needed a separate base video card.

AFAIK Doom never had an 8514/A mode, but there were a very few games that
did:
[https://www.classicdosgames.com/game/Mah_Jongg_-8514-.html](https://www.classicdosgames.com/game/Mah_Jongg_-8514-.html)

Back then, accelerators were about GUI/windows acceleration, and then the
3dfx came out...

~~~
temac
Acceleration in that era was about 2D drawing, most of the time for
desktop environments. Completely uninteresting for 3D games like Doom.
Probably uninteresting even for 2D games, unless the acceleration in
question supported direct sprite rendering (like video game consoles).

------
xvilka
Anything that impedes NVIDIA is good news for open source.

~~~
bloody-crow
I mean, the whole article talks about how Apple wants to grab CUDA's
market share with their own proprietary tech that only exists in Apple's
walled garden. Not sure how much of a win for open source that is.

------
soup10
I'm happy for Apple to embrace gaming, but yet another proprietary,
non-cross-platform API is not what we need. I will not be supporting
Metal.

------
PaulHoule
The page doesn't seem to work in Firefox.

~~~
ska
It does here.

------
spamizbad
Seems strange they don't mention Vulkan. While OpenGL will continue to be
developed, Vulkan is lower-level and plays into the architecture of
modern GPUs. Edit: I am wrong here; apparently I can't search properly.

I'm not entirely confident that Metal will see widespread adoption
outside of iOS devices. And while that's a big market, it's at the
opposite end of the spectrum from the more powerful, and niche, Mac Pro.

~~~
macintux
> Seems strange they don't mention Vulkan.

Vulkan showed up in 3 paragraphs.

------
microcolonel
People who trust Apple are made to look like fools, every time. Consider using
a compatibility layer rather than using Metal directly: chances are, your
shaders will perform just as well, and you won't be tied to their cycles of
betrayal and rug-pulling.

