They have always assumed that people would be lured in by the interesting design, while making the real details hard to find. That is clueless.
The top of the line product page should have stats front and center.
The two mistaken assumptions Apple made (and is referencing there) were that 1) the GPU market would follow the same path CPUs did, so going with dual (slightly slower) GPUs would win out over single high-temperature GPUs and they would have good replacements to choose from down the line, and 2) other software would adapt to the two-GPU strategy, with Final Cut Pro leading a new wave of software using GPUs through things like OpenCL.
Neither of these turned out to be true. GPU makers pushed hard into making single-card monsters (and it has worked well), and very few software titles have figured out (or even publicly attempted to figure out) how to make things use the second GPU on the MacPros.
That didn't work out. There are external GPUs today and they perform surprisingly well, but it's just not the same as having a card you can slot into your machine.
The one thing the Mac Pro was supposed to do well, though, it excelled at. Final Cut on that machine can chew through 4K video without issues. The problem is that the people who want that machine want more than Final Cut.
We go back and forth between dual and single graphics cards every time we get the TDP down far enough that a single card can do something like 1.5x as much as the old cards without melting, because one card without all the data juggling can accomplish a lot more per clock.
Laptop: ultralights (MacBook. This can be all soldered), value (MBA, or even the old 13" MB unibody, somewhat upgradable), powerful (13"/15" MBP, ram and ssd upgradable with standard components)
Desktop: all-in-one (slick design), value (smallish, somewhat upgradable), pro (flexible, upgradable)
And the pro models really need some semblance of upgradability, especially desktop pro.
Where is the 4K 17in MBP? The old one was 1920x1200; it would be a perfectly obvious upgrade.
Sure, they are absolutely losing pro customers, but does that really matter when they have the iPhones and high-end laptops? It's only the very high-end professionals that aren't able to make do with an Apple product anymore.
I actually was all set to buy a new MBP for around 3k, but the last round's market segmentation pissed me off. They always have one cheap reasonable option and then charge an arm and a leg after that. IMO, they would be better served aiming for a 35% profit margin on everything vs the ridiculous upgrade costs for high end they currently use.
Example: the 1TB PCIe-based SSD upgrade from 256GB costs +$564.00, when 1TB SSDs are running under $300 and they are saving the cost of the 256GB SSD. Think about it: if this were a removable drive, you would not only save money getting a third-party SSD, but also end up with a spare 256GB drive. Which means they would still have a ~40% profit margin if the upgrade cost $300.
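For concreteness, the margin arithmetic behind that claim can be sketched as follows. The $180 part cost is an assumption chosen so the comment's "~40% at $300" figure works out; it is not Apple's real BOM number:

```python
def margin(price: float, cost: float) -> float:
    """Gross margin as a fraction of the selling price."""
    return (price - cost) / price

# Assumed incremental part cost of 1TB over 256GB (illustrative only):
assumed_cost = 180.0

print(round(margin(300.0, assumed_cost), 2))  # 0.4  -> the "~40%" claim at a $300 upgrade
print(round(margin(564.0, assumed_cost), 2))  # 0.68 -> margin at the actual $564 upgrade price
```

Under that assumed cost, the actual $564 upgrade price would carry roughly a 68% margin, which is the gap the commenter is objecting to.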
You could be right of course, but it seems a little unlikely, doesn't it?
I am not a power user by any means, so I don't need lots of raw computing power. To me, the value of owning an MBP lies in design, build quality, a good compromise between weight and power, OS X, and longevity. However, longevity is reduced by removing upgradability for RAM and SSD. I don't see the MBP as attractive as before, especially with the price premium. I admit that I am cheap, and I try to spend as little as possible. I typically get the lowest-spec'd MBP that fits my current need. After a few years, I upgrade RAM and storage (used to be HD, now SSD), and that lasts me another few years.
Apple's current offerings don't fit my needs, and I am exploring alternatives (currently test-driving an Asus Zenbook 305 running Xubuntu alongside my MBP). I am not bitter, just disappointed, since I much prefer Apple laptops.
Also, the lack of reasonably priced SSD upgrades for the Retina MBP, due to its proprietary interface, is driving me nuts. I am running out of storage space and can't do anything about it.
It's possible to make a computer that's upgradable with non-standard components. People will hate you for it if they realize you're gouging them, but it's a time-honored strategy.
2. Apple could sell those parts themselves at a much more reasonable profit margin than they do today.
3. Better to sell the cheapest option (at a good profit margin) than nothing at all.
This really needs to be qualified, as there are plenty of high-end professionals still served well by Apple products.
There's not a lot of choice at the high end for, e.g., notebooks. The Surface Book with Performance Base comes close to the MacBook Pro, for example, but not a lot else does on a balance of features. If I really need 32GB RAM (for example), I need to lug a back-breaking brick from Razer, or deal with the Dell XPS 15, which can't be charged on an airplane.
Non-mac PC -> probably Samsung phone & definitely not iPad
Home, end, page up, page down, and the power button moved back to not being a keyboard key.
Most people would rather have a 14 hour battery that's not replaceable than a 6 hour one that is.
I've had a lot of laptops in the last fifteen years, and the only time I ever removed a battery was to add memory or a hard drive, simply because it was in the way. Maybe one in a hundred people actually care about having a replaceable battery.
I find the built-in style is fine for casual work, even typing documents, but for programming I love my delete key too much.
On the mini keyboard I find myself using ⌘X in place of Fn-Delete, it's a one-handed thing that's more convenient. If you have nothing in your paste buffer that's important it works out well.
Those that care about a forward delete key are in the minority. They're also the type most likely to have some kind of highly personalized keyboard with very particular Cherry switches and keycaps.
So why all the excitement for Apple to make a "narrow" machine to "cater" to the pro market? That would be against what Apple has historically done. Apple can win the "pros" over by making iPads or MacBooks able to do pro video editing that previously required a $10k Windows/Linux tower with dual video cards, not by making their own $10k tower with dual video cards. The whole point may not be that there is a pocket industry of "pros" upset at Apple, but that their function is being disrupted by young grads just out of film school doing their jobs on an iMac.
WAT!?! Imagine editing an 8k, 3D feature film on an iPad Air. The "handheld" device would end up being a 12 lb behemoth with Gilliam-esque cable madness running to a single lightning port and optional (Red) branded oven mitts.
Pro video editing (and FX, hello) requires massive power, storage, and peripherals - the more the better.
Is there a substantive difference between editing in 8K and in say 720p?
Ie, could I edit my whole movie in downsized resolution, save all the transformations I've applied, then redo them all in 8K on a server someplace for the final product?
Back when REDs first came out, there was a lot of buzz from amateur and student editors who could get 1080/720 proxies of their freshly shot 4K footage almost instantly and hand it off to editors before production even wrapped. This had been the standard for many professionals doing off/online editing for a while, but getting access to that workflow for relatively cheap was the new, cool thing.
There's always an impetus to get higher resolution and quality at the offline stage though, as people get used to improving technology. It wouldn't be outrageous to have a 4k offline now, because people with 4k TVs at home will say "Why can't we? I want to see everything!", even if it doesn't particularly enhance the editor's work.
It would still be done with lower bitrate files, because original 4k material from digital cinema cameras is beyond what most machines are happy with.
No one is editing 3D right now, except a very tiny portion of the industry who are producing specialist films or blockbuster AAA movies with unlimited budgets.
By the time that those become standard and mass market, hardware will be at a place where laptops and iPads can handle the tasks.
4k is very much the standard right now; almost any Mac can handle 4k editing (some better than others), and perhaps even the iPad Pro can handle it in stride.
So the realities of the market don't hold up to your fantasies of millions of people needing to edit 8k and 3D at the same time.
Even in the world of photography, there is no Mac that edits/filters/transforms/renders 30-40-megapixel Raw images without very noticeable latency. For fluidity and efficiency in what I do, I'd gladly take 10x performance over what exists today on the Mac.
Depending on the studio, content is centralized, but for smaller studios, cloud-based storage is prohibitively expensive since they work with raw 8k/4k video files.
So Apple is still the best choice for people like me who don't want to totally break their current workflow, simply because there's no convincingly great alternative out there. But the credit, or forgiveness, for all this is dwindling fast. If I had bought this new MBP for 3500 EUR with my own money, instead of it being a company laptop rather than my personal one, I would feel seriously cheated, and probably would have gone for a Lenovo instead.
Apple, under Jobs, made plenty of mistakes, but in his last years, he was pretty much spot on.
In my eyes, Tim Cook makes one mistake after another. And it's not just the Mac Pro, but also the Mac Mini (which is a great computer format), the audio jack on the iPhone, the removal of MagSafe, dongle life, etc. So there are plenty of reasons to say, basically: let's bail from this mess. The problem is that the alternatives are not so obviously better.
GrumpyGamer's posts when compiling Thimbleweed Park for Windows were a pretty good reminder of staying away from Windows. ;)
I'm a critic of the poor pace of Mac updates, but Apple is making far more profit on iPhones and Macs (and overall) now than when Jobs was in charge. Its market position is far stronger as well.
So instead, you prefer Apple monitoring you like a little baby duck?
That said, Google seems to have some smarts about defending against high-level attacks, though also some very notable failures. Remember, John Podesta's email phishing occurred on a Gmail account.
And it's Gmail which secures the Traitor Puppet Fascist Donald John Trump's Twitter account password recovery.
Also, it is possible, and not all that hard, to run Android without any proprietary (closed) Google software. The same cannot be said about iOS.
>We don’t build a profile based on your email content or web browsing habits to sell to advertisers. We don’t “monetize” the information you store on your iPhone or in iCloud. And we don’t read your email or your messages to get information to market to you. Our software and services are designed to make our devices better. Plain and simple.
Do I have proof for this, other than the fact that they've shown themselves to be less than fully truthful about their data retention policies? No, I do not, but I do know that it is more the rule than the exception for commercial entities to eventually renege on their promises of personal privacy, probably because that huge 'big data' carrot on that stick is just too juicy to be ignored.
Given Apple's enormous cash reserve they pose the additional threat of being able to outright buy large industrial players in lucrative fields which would allow them to use the collected data to target their customers without needing to 'sell to advertisers'.
Hell what? Not even that works. Apart from the curious*, I've seen no live use of anyone using an iPad instead of a MacBook. From custom window management to the simple fact that a hinge between the screen and the keyboard is just the perfect HID for a TUI.
* Yes, it does work, and some people seem to like it (to each his own), but every attempt I've seen just ends up turning an iPad into a frankenlaptop just to end up claiming "hey, I work on an iPad, iPads are fine!" — which they are, but I'd rather jack a car up with a jack than with a beam and a log.
Executive and field workforce people love the ipad. When I stood up our mobile team, a group of field auditors literally sent us cookies and brownies when we trashed their corporate crapbooks with iPads and VDI -- despite significant UI issues.
As SSO has become better with iOS, people have gotten even happier. The instant ok and lower friction (yet secure) connectivity experience is huge.
The biggest issue is legacy Windows apps. But people come up with solutions for that as well. In one case, a guy wrote an Access front end for a Windows workflow with giant buttons that were easy to click. Ugly as sin? Yes. But it worked.
That right there tells you the iPad will never be a pro machine or replacement for a laptop or anything but a toy. I bought my sister one a couple years back. I still have no idea how to play movies on it in non Apple-approved formats. Yes, I could research it. But this "it just works" machine is one now that requires research to play a movie. From what I could tell, each app will do it differently. Yeah, that's progress all right. Progress on a toy, a toy that by design (of its OS) will never be anything more.
You download VLC from the App Store and use that. Not much different from Mac or Windows, both of which require you to download VLC or another third party application to get a wider range of video formats.
(And iOS does have what amounts to an accessible filesystem these days; it's called iCloud Drive.)
The trashcan Mac Pro came out after Jobs died, but in spirit it's a classically Jobsian product: fascinated with aesthetics and style, disinterested in performance and expandability. It's striking to look at, but required a whole bunch of increasingly painful compromises on the inside in order to make that visual statement possible.
Which makes it a lot like the original Mac, which was also a Jobsian product. Its revolutionary industrial design makes the 1984 Mac a wonderful object to contemplate even today; it's like the Platonic ideal of the term "personal computer." But the original Mac was also hobbled in important ways to fit Jobs' vision; it came with just 128K of RAM (low even by the standards of the time), and no mechanism was provided for adding expansion cards or other upgrades inside the enclosure. (It took two years for Apple to come out with a version that included a SCSI port!)
What the pros want, by contrast, is a Mac that puts functionality over aesthetics; it's OK by them if it's a big ugly box, if that big ugly box comes with cutting edge parts and can be stuffed full of RAM and drives and expansion cards. Which sounds a lot like Gassée's baby when he was in charge of product design at Apple, the Mac II: https://en.wikipedia.org/wiki/Macintosh_II
The Mac II discarded the beautiful lunchbox form-factor of the original Mac completely, replacing it with... a big ugly box. But it was a big ugly box that was twice as fast as the lunchbox Mac, supported true-color graphics and had plenty of room for expansion, so the pros of the time loved it. The Mac II was the start of the long line of high-performance Macs that would make the Mac the go-to computer for people in creative industries.
So when pro users complain about there not being a Mac for them anymore, this is what they're talking about. They want a modern Mac II. They want a Mac that's less Steve Jobs and more Jean-Louis Gassée.
Ironically they didn't learn their lesson and thermal failure was still the Achilles heel.
In their defense, Apple hits home runs with those kind of products. I worked in a computer store when the original iMac came out... the serious computer people thought it was a joke, but it sold like hot cakes.
What "kind of product" do you mean, that includes the G4 Cube and Trashcan Pro (interesting but expensive failures), and the original iMac (decent mass-market computer that was incredibly successful)?
The bondi blue iMac was an incredible gamble. It was bondi blue, looked like a hard boiled egg sliced in half, lacked a floppy drive, and was incompatible with every Apple accessory/peripheral on the market.
But it hit the mark and was a hit.
Apple makes plenty of turkeys; it's just when they score a hit, they hit it out of the park.
Jobs didn't need a Mac Pro and a huge external monitor; he could clearly have used a laptop or an iMac. The fact that he used the company's pro computer in his home office tells you a lot about him and the kinds of computers he liked.
The Mac Pro is very much a Jony Ive design and idea (and not updating it feels like something that a numbers-obsessed CEO like Cook would do). I'm not convinced at all that Jobs would have let the Mac Pro flounder like this.
I truly believe that if Steve Jobs were alive today, the Mac Pro would be routinely updated.
That's the description for a Jobsian consumer product.
All the previous Mac Pro designs (and Xserve designs!) were greenlighted under Jobs, and it's obvious that tons of thought was put into their expandability and maintainability.
The obvious example is the PowerMac G5 with its screwless design (before that was at-all common); but looking further back, you've got things like the PowerMac G3, where most of the computer lived on a slide-out tray (https://www.ifixit.com/Teardown/PowerMac+G3+All-In-One+Teard...).
The G3 AIO was obviously an "aesthetic" product (however people might feel about the aesthetics of said "Molar Mac") but it was also built to serve the education market, not the consumer market, so it had maintainability as a goal.
> But the original Mac was also hobbled in important ways to fit Jobs' vision; it came with just 128K of RAM (low even by the standards of the time)
This was about cost savings! Same strategy as Nintendo: cheap parts + lots of value-add in integration and software = huge profit margin. (But, unlike modern Nintendo, early Apple did this out of necessity—a risky investment into a new product line when you don't have much capital is even riskier if you have to make minimum purchase orders for large quantities of expensive parts. May as well design the first one to be low-end, and then, if you can't sell it, repurpose the low-end parts for something else, like, say, disk drive controllers.)
> no mechanism was provided for adding expansion cards or other upgrades inside the enclosure
I'm not sure why they did this for the Mac 128K, but I'd guess it was simply that there wasn't room on the motherboard, with the motherboard constrained by the tiny case (where the form-factor of the case was part of the value-prop of the Macintosh product: it fit on your desk without stealing your desk.)
Look at the 128K's motherboard (https://www.ifixit.com/Teardown/Macintosh+128K+Teardown/2142... )—there's no room there to add anything.
> It took two years for Apple to come out with a version that included a SCSI port!
The Macintosh 128K was essentially meant to be an all-in-one, classy "8-bit micro" (at only a slightly higher price than one, once you factor in the price of a computer monitor in the 1980s). Its competitors were the Amiga, Amstrad, Commodore, etc. Nobody else in that world used SCSI ports either.
Apple basically noticed the reaction creative professionals had to their not-originally-intended-for-creative-professionals product, and then started adding "pro" features to it (while still never thinking of the Macintosh line as for professionals, until the popularity of Macs pushed them to eliminate their other lines.)
> The Mac II was the start of the long line of high-performance Macs that would make the Mac the go-to computer for people in creative industries.
Note that the Macintosh SE continued this trend, and yet returned to the lunch-box form-factor. The difference was that, by then, the chips on the motherboard had miniaturized enough to let them stick some expansion sockets on it as well.
That said, he definitely came around after his stint at NeXT. Just look at the PowerMac G3/G4, with the single hinge (zero screws) fold out motherboard. Still the best computer case design ever IMHO. https://d3nevzfk7ii3be.cloudfront.net/igi/Nm4mRELLtcYI4YYI.m...
I think this is wrong. This is what the benchmarks say:
- Apple A10 Fusion @ 2.35 GHz  / Geekbench 4 single-core: 3500 (typically) - that's 1489 points per GHz
- Intel Core i7 7700 @ 4.20 GHz turbo / Geekbench 4 single-core: 5400 (typically)  - that's 1286 points per GHz
- Intel Core i3 7100U @ 2.40 GHz (no turbo) / Geekbench 4 single-core: 3150 (typically) - that's 1312 points per GHz
In clock-for-clock performance, Apple already is competitive with the highest-end Intel parts. It's just a question of whether they can scale this up to a desktop chip with an appropriate amount of cores, clockspeed and I/O.
Before you protest, Geekbench 4 is one of the few benchmarks that tests individual core speed while being fairly independent of I/O performance. Of course any desktop processor will easily outclass a mobile chip when it comes to I/O. It's not a perfect comparison for sure, but I think it shows that it's quite conceivable Apple will use their own chips in their next Mac Pro.
* This fails to take into account any potential 'turbo' functionality the A10 Fusion has; there is so little known about it that we don't know if it exists or how much it boosts.
* I chose the non-K variety because the 7700K results on Geekbench typically seem to vary wildly due to its overclockability.
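The per-GHz figures in the list above follow directly from dividing the score by the clock speed; a quick check, using the "typical" scores quoted in the comment:

```python
def points_per_ghz(score: int, ghz: float) -> int:
    """Normalize a Geekbench 4 single-core score by clock speed in GHz."""
    return round(score / ghz)

# (score, GHz) pairs as quoted in the comment above
chips = {
    "Apple A10 Fusion": (3500, 2.35),
    "Intel Core i7-7700 (turbo)": (5400, 4.20),
    "Intel Core i3-7100U": (3150, 2.40),
}
for name, (score, ghz) in chips.items():
    print(f"{name}: {points_per_ghz(score, ghz)} points/GHz")
```

Note this is a crude normalization: it assumes the score scales linearly with clock and says nothing about how far each microarchitecture can actually be clocked up.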
We might get there one day, but NEVER while Apple maintains its walled-garden approach to iOS.
Apple allowed its 'Pro' lines (less so the Mac Pro and more so the MacBook Pro) to start catering to its biggest buyers: consumers who had lots of money and simply wanted 'the best'. And back in 2012-2013, the 'Pro' lines were really well regarded amongst professional users, which reinforced in the consumer's mind that they were the best.
Apple focussed on thinner and slicker looking devices and added gimmicks such as touchbar and cylindrical cases.
Suddenly we find that the MacPro hasn't been updated in years (yes Mr Schiller CAN'T INNOVATE YOUR ASS indeed) and the MacBook Pro can handle the same amount of memory as it did over 3 years ago (battery you say Mr Schiller? Well most 'pro' users will sacrifice some battery life for performance and those that don't want to do that could have gone for regular MacBooks).
The Pro lines aren't so well regarded anymore and Apple will watch slowly as those rich consumers who propped up sales dwindle. Maybe Mr Cook and co will wake up and actually realise they should rather just drop the pro line than consumerize it.
iPads could have done so well in the pro space if they'd supported proper integration with Macs and could be used as extension devices rather than exclusively as replacement devices. But no, Apple had to try to will a market that simply isn't there into existence.
For most non-pro users, a walled garden is awesome. You can actually use a computer without it becoming progressively more and more infested with malware and without it "rotting" from badly written installers, etc. For most users closed is a feature, not a bug. Closed means it works. Closed means you don't have to do your own IT.
A real solution to these problems that isn't based around walled gardens and app whitelisting would require very innovative solutions to major problems in application and operating system security. These aren't the kinds of problems that get solved overnight, and unfortunately I don't see very many people working on it.
Ultimately I think what's going to happen is more pragmatic: the computer market will fully bifurcate into one kind of device and OS for power users and developers and another for everyone else.
My understanding is that Apple expect everyone who uses an MBP "for work" to actually be using it as the portable sidekick to a desk-bound workstation, rather than as their primary computer. They're supposed to be the cutter you launch off the side of your ship, not the ship itself. The Mac Pro (or often the iMac) is supposed to be the ship.
You might think that the lack of updates to the Mac Pro line belies that argument—but I'd say it's completely possible to have a strategy and then fail in tactical execution in a way that makes it look like you don't have that strategy.
If that's the case, how are you supposed to do it? Specifically, how do you keep your two accounts, files, bookmarks etc in sync between the two, especially 'live' so you can just pick up your laptop and continue?
I'd actually very much like to do this, so Apple's expectation or not it would be great to achieve.
Window state doesn't sync, but then, it wouldn't make much sense for it to, workflow+performance-wise. The cutter is not the ship; it doesn't have a 50-gun battery, nor room to take a 1000-person crew aboard it. You're expected to multitask on the desktop, and then single-task on the laptop (picture an executive VP with a fancy watch whose time is Very Valuable: if they're working at all while not at their desk, it's because there's one very important/urgent thing they need to do.)
As such, it makes much more sense to save your GarageBand project, or your PSD file, to iCloud on your desktop, and then re-open just that one thing on your laptop, rather than attempting to squash your whole workflow around as if you'll keep interacting with it the same way at the destination. (Though, if you're desperate to have that particular experience, Back To My Mac exists.)
(I do imagine an OS could pull off something like the Firefox UI experiment of "tab groups", with native app windows, by combining a ubiquitous cloud store with all apps being wired to save+restore window-state, and requiring that any apps installed in a "session group" get installed on all the computers in the group. You could then grab out individual pieces of your workflow and un-hibernate them onto any computer you liked, work for a while, then put them back away. But that'd require a radically different style of window management than we have today.)
* Chrome, with window state / open tabs
* A VM
* Various documents, downloaded files etc
ie, really, syncing much of the profile's files and app state.
iCloud seems suitable if you want everything going up to the cloud and back, which for two computers sitting next to each other most of the time seems overkill. I'd love an automatic local file sync and something similar to Continuity that worked Mac <-> Mac, not iDevice <-> Mac.
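In the absence of a Mac-to-Mac Continuity, the "automatic local file sync" part can be approximated with off-the-shelf tools like rsync or Unison. Purely as an illustration of the idea, here is a minimal one-way "newer wins" sync in Python; the paths and the policy of trusting modification times are assumptions, not how any shipping sync product actually works:

```python
import os
import shutil

def sync_newer(src: str, dst: str) -> list[str]:
    """One-way sync: copy each file from src to dst when it is missing
    there or has a newer modification time. Returns relative paths copied."""
    copied = []
    for root, _dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target_dir = dst if rel == "." else os.path.join(dst, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in files:
            s = os.path.join(root, name)
            d = os.path.join(target_dir, name)
            # copy2 preserves mtime, so an unchanged file is skipped next run
            if not os.path.exists(d) or os.path.getmtime(s) > os.path.getmtime(d):
                shutil.copy2(s, d)
                copied.append(name if rel == "." else os.path.join(rel, name))
    return copied
```

A real two-machine setup would need conflict handling and deletion propagation, which is exactly why tools like Unison exist; this sketch only shows why "local sync" is conceptually simple compared to a round-trip through iCloud.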
And indeed, the places I see iPads used most heavily are exactly those. You see it heavily in the audio industry: Logic Remote is really really good and all the other similar tools (Lemur, TouchOSC, etc.) have huge uptake. I'm in the process of building a video/CG overlay system for TV/streaming and the best way to use the backend management system I've found while I'm trying to concentrate on something else is my old iPad 2.
This is a market in which a $5k 10 core PC is considered an entry level machine. 3D houses and some music studios have racks of server-grade hardware that is far outside the reach of most mortals.
There is a bigger market of users who like to label themselves pro, but are actually prosumer. It includes smaller music studios (home or not), design studios, serious hobbyists and dabblers, smaller video houses, academic music composers, and so on.
Also many software developers.
These users are not in the Hollywood league, and they typically have a limited budget.
The old cheesegrater MacPro was a good fit for this market. It was also good enough, and expandable enough, to get some interest from the no-compromise pros.
The new MacPro is too expensive for the prosumers, and poor value - and underpowered - for the no-compromise pros.
I'm not yet convinced Apple understands this. Jobs understood it because his links with Pixar meant he could see first hand what Hollywood pros were using.
I don't think the current management is as open to creative input from creative professionals at all levels.
If I'm right, the 2018 Mac Pros will be open like PCs, but they still won't truly satisfy either type of user.
Generally, a TV show workflow (since FCP X killed its user base) is Avid-centric. Editors grab all their footage off a central server to edit the down-converts (1080p). An SSD is great for a boot drive, but you'll need hefty internal storage; without it, you need to buy an expensive external solution. The same goes for digital musicians and recording, although the storage demands are quite a bit less.
The dual-GPU setup was under-utilized, and consumer-grade cards are far better bang-for-buck for render assistance, so much so that it was better just to upgrade GPUs with gaming cards. Notably, the FireGLs are underperformers for After Effects. Also, musicians get almost zero benefit from dual GPUs; it's unnecessary cost.
Since in Hollywood TV you work on contract, you bring your own hardware. When USB 3.0 took off, most of my editor friends just grabbed USB 3.0 cards and popped them into their computers. Mixing guys tended to have a lot invested in PCIe audio interfaces for Pro Tools. A few friends had MPEG-4 encoder accelerators, which were great for quick encodes for dailies. I'm not sure in what fantasy world Apple thought the small size of the Mac Pro 2013 mattered; post guys set up shop for weeks at a time, and it's easier to cart one heavy tower followed by monitors than a small computer with a box of dongles and misc items.
The friends I still keep in touch with in LA have mostly switched to Windows instead of going for the Mac Pro 2013. More bang for the buck (time is money) and more performant. Anyhow, other than that I agree: just rerelease the 2012 with new innards and call it a day.
I just think that the pro market for Apple became so small because normal computers are getting fast enough for most pro people.
For example, photo editing works fine on an Apple laptop even with very high resolutions and a lot of layers.
So I can imagine that Apple just doesn't need a pro desktop anymore because the market became too small.
By ignoring the pro market's demand for faster turnaround and bleeding-edge performance, they pushed everyone to jump ship faster; now that the market for the Mac Pro has shrunk to the point where the line is unprofitable, the circle is complete.
Meanwhile, the trillion-dollar render/graphics/CAD industry is going to need something to keep up with the always-increasing demand, and will simply look elsewhere.
During that entire time Apple was always positioned in a niche. For a time it was desktop publishing, video editing, photography, audio. A lot of that had to do with loyalty of the developer community, who produced killer apps that only really ran on Apple hardware, as well as appealing to a customer base that deeply preferred Mac OS.
But it's worth keeping in mind that the Mac Pro wasn't really ever dominant in certain high-end spaces like 3D. I mean, I'm pretty sure I remember hearing that the ID team at Apple was for a long time using non-Apple hardware to run their CAD stuff, until one day Ive and Jobs were like, 'this is unacceptable'.
As far as development work, I run it all on servers. Whether it's something cloud based, or my own Linux box at home, A low powered laptop SSHFS'd into a server is my preferred way of working.
Which wasn't even a Mac Pro. Interesting.
That's the market Apple has historically singled out, but it isn't the only "pro" segment. Another big one is software developers, who care a lot about compile and/or test time (and maybe having enough RAM to run a bunch of VMs...) -- and this apparently was recognised in the April meeting.
Not that these aren't great journalists, but Buzzfeed, Tech Crunch, and Mashable are no strangers to clickbait.
The Mac Pro isn't important to Apple for the money its sales make. It is important for keeping the Mac line alive. It is the machine for all the people writing the software for iOS devices, and of course for all the Mac hardware that does create revenue for Apple. It also keeps professionals on the Mac platform. Even if Apple sold the Mac Pro at a loss, it would be worth it for its role in the ecosystem. It plays the same role for Apple as all those photographers at sports events do for Canon and Nikon: there isn't much money to be made from them, but on the strength of their reputation, all the equipment is sold to the broader public.
In the mid-2000s, the first sign of Macs becoming popular was that at conferences, 80% of attendees would have an Apple laptop. They were leading the wave. Today their market share is still high, yet one can read more and more blog posts on how to switch to Linux or build a Hackintosh. I guess this is what got the message, at least partially, through to Apple management.
I am writing this on a 5K iMac, which I quite like (except: no video input, what were they thinking??). In a year or two I would be interested in upgrading to a Mac Pro, if it's a reasonably versatile device. I don't need much upgradability, but I would expect to be able to order it with a current high-end graphics card and proper internal storage. Otherwise, I might consider going back to Linux eventually.
The second references Matt Mariska's misleading tweet: "Geekbench has the iPhone 7 beating the $6,500 12-core Mac Pro in single-thread." - Never mind that it's a $6,500 12-core Mac from over four years ago. This one gets regurgitated over and over.
The third still illustrates performance edging close to the Core M series of processors.
(I can't back that up, except to note that Apple fans are usually fairly scrupulous about referring to their MacBook Pro, iPad Air, iPhone 6S etc specifically rather than lumping them together).
In any case the current MacBook is quite similar to the MacBook Air and the rest of your comment probably stands.
Perhaps without understanding the full context.
I could be wrong, but I imagine he still knows a thing or two about CPU architecture and performance.
This could, of course, be solved with software.
Linux is great, but it just doesn't do it superficially for me. I can make it look great for screenshots, but as soon as I click a button or engage with the DE, the facade falls apart.
macOS/OSX seems to be like a Linux distro with a very well developed desktop manager.
The only problem is, I need to buy a Mac. Who the hell can afford that, holy sh*t.
This did not happen when I installed Creators Update a few weeks ago. Firefox was still my default browser after updating.
I find all the bad press and hate confusing. It's just a computer.
1.) Why is Apple not keeping this system in line with the state of technology? If it's targeted at professionals, why have they left their only option in that department 5 years behind in technology?
2.) Are they planning on making the next iteration upgradeable? For some time now, Apple has used third-party silicon, interchangeable with that of any other system: Intel, AMD, Nvidia. Professionals on any other platform can spend a couple hundred bucks and modernize their computer for another 2-3 years. Isn't it therefore possible that the Mac Pro's lack of sales isn't due to lack of interest or a small professional market, but to the bad idea of making a proprietary desktop tower that locks down upgradeability?
They do what they do and that's it. Only Apple understands what Apple does. No one else. How is that not clear?
Sorry for the rant.
Really? Do you think the number of people who seem to devote themselves to this is reasonable?
If industries like video and graphic design (and possibly even app development?) are forced to switch to Windows to maintain pro-level capabilities, what's going to motivate them to stick with iPhone?
That was more than 25 years ago; any company would change a fair amount in that time.
So, no, not of Apple, but of two other technology companies.
What I can't understand is why people feel the need to make this kind of comment. Do you want me to explain why I enjoyed it? Or do you get some personal satisfaction from publishing your bewilderment?