It will almost certainly be a low-end MacBook. In fact, there's a non-Pro 'MacBook'[0] that already exists which would be perfect; it was discontinued a year ago, but it used ultra-low-power CPUs. Even before its release, there was already speculation that it would ship with an ARM CPU.
It is fanless and has a 5W thermal envelope, which would also suit the current A12X.
The major criticisms of the design at the time were the butterfly keyboard, the single USB-C port, and the speed of the device (Intel Core M is truly, truly painful). But for a $600-$700 machine with an ARM CPU? That's insanely competitive, and it's in line with the "basic" MacBook branding.
Optics dictate that Apple show the superiority of the new silicon, and it wouldn't want it immediately pigeonholed as something like Surface X, a compromised curiosity.
Apple's been consistent about ~10 hours of battery life for a while. They've hit their spec and think that most people don't need more in typical usage. So they'll be able to shave battery volume down to continue their trend toward thinner machines without sacrificing other specs.
The recently discontinued MacBook was a vision statement that the hardware could not quite reach, much like the first MacBook Air. Here, I think that statement realized would be a MacBook Pro that is as thin as that MacBook, and is still notably faster than the current MacBook Pro.
I don't see why a standalone MacBook would portray Apple Silicon negatively.
"MacBook" is the heart of Apple's entire portable line. "MacBook" is the grounding of the brand itself.
I think Apple will release the Macbook as the first Apple Silicon machine because of the optics of confidently putting the hardware at the symbolic heart of the entire portable line.
I do not think the 12" that was discontinued should be thought of also as where the machine will be picked up in terms of form factor. This is not how Apple behaves, when tech shifts or they don't like where they left off, they change direction. (i.e. the Mac Pro trashcan -> modern Mac Pro)
Apple Silicon will allow Apple to reintroduce the Macbook as a compelling machine addressing many customer types. This base machine would pave to eliminate the Macbook Air and redesign the Macbook Pro.
With a few models of Macbook and Macbook Pro the portable product line will be simplified and clean up the confusing price / feature / performance comparison problems that exist today.
To the skeptics, Apple Silicon is about thin laptops and fat margins. It makes no sense for Apple to reinforce this. Apple's silicon team is exceptional, and they should prove it.
And they will, delivering thin, light, long-lasting laptops, with high performance and the fattest of margins. The only people they need to prove themselves to are shareholders, and this is gonna do the trick.
> But for a $600-$700 machine with an ARM CPU? that's insanely competitive, and is in-line with the "basic" macbook branding.
I'm skeptical that Apple would introduce their high-performance CPUs exclusively on a low-end device. A super-light, high-performance 12-ish-inch machine seems right up their alley, but they don't want the new CPU associated with being a low-end "Intel Pentium Gold" type of device.
There might be a lower-end ultra-thin laptop released, but they are going to go big and push out a high-performance CPU right out of the gate. They didn't launch their 64-bit A-series CPU on a secondary device, and they won't launch this on a secondary device either.
I've heard rumours of a low end 12" MacBook, along with a 13" MacBook pro targeting developers as launch devices. No idea if it's true, but it would certainly make a lot of sense.
I don’t know if they would actually do it, but that would be an awesome way to launch: one category-defining device in terms of portability and performance, and one super high-end device to create a halo effect around the new processors.
Yes, this would be a good strategy. The "low end" one would likely ship with the same CPU as the next-model iPad. The higher end one would be their new shiny high performance designed-for-MacBook CPU.
With the Intel transition they also released two devices at the same time (the MBP and iMac; other models quickly followed).
I’d assume they'll do the same: a high-end device (MBP?) and a low-end device (MB? MBA?); that’s where ARM will shine first (mobile: battery life, maybe even an integrated modem with eSIMs).
>I'm skeptical Apple's introduction of their high performance CPUs exclusively on a low end device
You can group notebook shipments into a few categories. At $1000+, those are nearly all MacBooks, with the remainder going to gaming machines (which are now being rebranded as content-creation machines).
You have the absolute low end, netbook-style machines at $300; those are not what Apple targets, and they actually compete with the iPad.
The vast majority of notebooks are in the $300 to $600 range. A lowest MacBook offering at $699 would be the same play as the iPhone SE at $399.
It is just $100 more than the average smartphone selling price, and it would attract plenty of people jumping to the Mac platform, all while sucking the oxygen away from other PC vendors.
If you look at the history of the MacBook and MacBook Air together, it's pretty clear that Apple expected to eliminate the MBA as a product category years ago; to just have "MacBook" and "MacBook Pro" categories. This was probably due to them believing Intel's roadmap for chip releases, such that they thought they'd be able to hit MBA-like performance targets in the MacBook's form-factor. But Intel's roadmap didn't pan out, and so Apple had to turn around and build more MBA-form-factor devices to hit those performance targets. Eventually, the MacBook went so long between releases that they just eliminated it, and redesigned the MBA a bit to cater more to the people who liked the MacBook form-factor (without really making the MBA any lighter, just smaller-seeming.)
All the MBA computers since 2015 "should have been" devices with a MacBook form-factor and MBA-level performance. Right now, that'd mean an ultralight with a CPU matching or surpassing the performance of the https://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i7-1060N... (which is the highest-specced option for current MBAs.)
Such a computer, if released today, would be impressive, no? It would redefine the MacBook (Air) category to be Apple's primary Mac product line, satisfactory for almost all workloads (in the same way that the iMac has been redefined to be the primary desktop Mac, satisfactory for almost all workloads.) In that world, the MBP would be relegated to halo-product status, akin to the current positioning of the Mac Pro. In Apple Stores, you'd have three differently-sized MacBooks, and then an MBP off in a corner where looking at it for too long summons an Apple Business representative.
I don't see how that's not already "going big" or "making a statement." It'd demonstrate that their Apple Silicon chips can eat Apple's own lunch, with their new mid-range Apple Silicon devices beating their old high-end Intel offerings in price:performance, even before Apple Silicon high-end chips get designed to truly replace them. The Intel MBPs, even brand new, would be immediately branded as a "legacy" category, outmoded even before replacement, only to be purchased if necessary for a specific business use-case.
(Before you suggest: no, I don't think Apple would be Osborning their Intel products if they did this. Businesses and freelance professionals would, for the time being, still have plenty of such "specific use-cases" for an Intel-based MBP. Apple also has lots of enterprise customers locked into long-term upgrade contracts, so part of the demand for the tail end of their Intel-MBP pipeline would be fixed/inelastic demand.)
Maybe not exclusively, but it would make more sense, as users of their more 'budget' machines likely won't be put off by its inability to run Windows, or by existing Mac software running more slowly under emulation (unless it's been recompiled in the latter half of 2020).
Edit: Thanks to the replies letting me know that Rosetta is still a thing. I somehow forgot. Updated.
> inability to run existing Mac software (unless it's been recompiled in the latter half of 2020).
You will be able to run existing MacOS software on the new hardware under emulation. The Dev kits run emulation fairly well on a 2 year old iPad CPU so I would expect whatever new hardware they release will as well.
Most Mac software should be ready with an Apple Silicon recompiled version when, or soon after, the new machines ship. Apple has been showing how easy it is for most software to retarget the new chips with very little work.
the 12" macbook is my favorite laptop of all time. Mine is the 2016 version and it's starting to feel dated and I'm hoping that they rerelease this on the new silicon first. I'll be smashing the buy button the day it's out.
I think my favorite computer was my late-2010 MacBook Air. It was such a departure from the big & clunky Windows laptops I'd owned up until that time. And of course I have fond memories of my first computer, an IBM PC Jr (although I really wanted the 2nd disk drive, and copying floppy disks with only one drive & 128kb of RAM was suboptimal).
The Air series were solid machines for a good while until they stopped updating them. It was such a good price/performance/portability balance that I didn’t miss having a Pro machine for a few years there, even doing relatively heavy tasks like photo editing and development.
I have a 2019 MacBook Pro now. The only reason I bought it was that a friend really needed a computer, any computer, and I gave her my 2014 MacBook Air. I bought the Pro because it's a bit nicer for mobile development and running VMs.
I recently found a pic of my friend holding up my Air that I had just got delivered at the office. She has this look of awe on her face at how thin the thing was.
I think Airs are the ideal machine for the majority of the population that just needs to reliably accomplish basic tasks. It's probably the cheapest if amortized over its lifetime.
At least until iPads can do everything laptops can.
Mine is a Macbook pro 2012 and it is still phenomenal.
Two years ago, my screen started to flicker. I contacted everyone and they said 'change the motherboard', and the Mac support people were like 'buy a new laptop'. So I kept the laptop in a cupboard for a few months, then opened it to check if it still worked (I had decided to sell it and buy a new one), and voila, it was working perfectly fine.
2012 MBP 15" Retina, maxed-out specs at the time (16GB DDR3, 500GB SSD), daily driver for eight(!) years, and it has maybe 5 tiny dead pixels as the only sign it's the worse for wear. My current gig requires use of a client-provided 2019 16" MBP with approximately the same size, weight, and perf. I don't hate it, but it's hard to believe my ancient personal box is about as capable. They really don't make 'em like they used to.
>I don't hate it, but it's hard to believe my ancient personal box is about as capable
Exactly my point. The funny thing is, I bought the 2012 model at half the price of the new models and I have 0 complaints about it. No fancy keyboard issue, no fancy thermal issue. Nothing.
and to think that since then, they've made more problems than laptops!
They really don't make them like they used to. I just want better battery life, that's it. I get 6-8 hrs because I did a lot of research on how Li-ion batteries degrade (it's best not to let the charge drop below 50%, because a Li-ion battery's lifespan depends on the depth of its charge-discharge cycles, so if you keep discharging the battery to 0% you're screwed in a few years).
I'll buy the 2012 model again, if this one dies out, but won't ever buy the latest models.
I have the non-retina version of the 2012 and it is still going strong. Battery is still fine, screen has nothing wrong with it, and I upgraded the RAM to 16GB as it only came with 8, then shoved in a 1TB SSD to replace the incredibly slow 750GB hard disk. Dual boots to Windows 10 and all is great! I honestly don't know what I'm going to do when it breaks because I don't like the work MBP I have (2016 too) and I don't like the direction macos is going.
I'll really miss this 2012.
I replaced my 2012 15" MBPr after 5 years with a 2017 only because it kept overheating and throttling, but in day-to-day use, the performance difference was really hard to notice. Biggest difference was probably in graphics performance when using an external display.
I recently sold my 2016 Macbook (loved the device, the form factor, it was great — but the thing was becoming horribly slow and was starting to drive me nuts).
Got an iPad Pro (2020 + the keyboard) instead. Don’t know what I was thinking... :) Guess I wanted to try it out and see if it could work (and torture myself a bit).
The iPad Pro is fun, a great device for browsing on the sofa, and the keyboard is good for emailing and such. But multitasking still sucks, apps in the background get killed or take forever to refresh content, etc. I’d go back to a MacBook, but one that doesn’t take a second to swap between tabs in Firefox.
It just seems that they could juice up the power in the MacBook Airs a bit and then there wouldn't be enough of a gap between an Air and Pro to justify having the normal "MacBook" in between.
That said, the current Airs are still pretty weak on processing power and if they're not going to juice them up then a middle option would be welcome. Plus it would be a good proving ground to launch ARM with a model that's not currently available new.
I have a 2020 Air and it's pretty great but I recently saw it struggling when there's graphics-heavy stuff, namely driving a 5k external monitor while on video calls.
I also have a 2016 12" Macbook and love it. It's my full time personal machine, having replaced an older iPad and MBP. I love the form factor and weight, but I like it more than an iPad due to having an actual keyboard and ability to run real software.
As you say it's getting a bit long in the tooth and I've been lamenting that there's really no replacement for it. I've been holding out hope that apple silicon will change that.
I agree, still running a 12" 2015, it's an incredible machine that just doesn't feel as though it has been replaced. Makes the Pro and Air feel like heavy weights.
The 12-inch MacBook is my favorite Mac, even though I have multiple MacBook Pros and a Mac Pro! People might find that strange. It's simple, beautiful, extreme in its design, and also extremely slim, light, and quiet. I'm pretty sad that they've discontinued this product; I hope they'll bring it back with an ARM processor.
It's nearly as thin and light as an iPad, but it is a full-sized computer with a full-sized keyboard. I can play with Emacs for 20 hours of writing and scripting without charging.
Another unpopular opinion: I also love the butterfly keyboard of the 12-inch MacBook. It's a bit uncomfortable, but once I got used to it, I found I could type pretty fast and it feels good. It feels firmer than the old MacBook Pro keyboard, which feels a little shaky, and it also feels clickier than the later MacBook Pros with the butterfly keyboard. It seems to me they received so much criticism over the butterfly keyboard that they tried to make it less edgy, but that made it lose its character as well, and thus they eventually withdrew the butterfly keyboard.
I have a later 2017 revision of that one and never really had an issue with CPU speed. Mine has the RAM maxed to 16GB though; perhaps that helps. It's a great machine: a real computer, but with iPad Pro dimensions.
I’ve been force-feeding myself the new iPad Pro with magic keyboard as my primary laptop for the past few months (why? So I can experience what kids are largely experiencing these days, and empathize). It’s definitely a “real computer”, just optimized in a different way than I’m used to. Doesn’t feel like a budget computer either, it’s incredibly powerful for something that uses less than 18 watts and has no fan. IPadOS has some rough edges but they’ve been getting quickly smoothed out.
I am so tempted to go buy an iPad Pro, demoting my Mac to a development-only machine.
Everything on iPadOS just seems more refined, more fluid and more fun than on MacOS. Everything from 120 Hz display, to instant waking from sleep, to battery life that is actually consistent. The overall user experience just seems lightyears ahead of the Mac, assuming your workflows are compatible with the software limitations.
If you have the cash, I'd suggest you do it. I bought one kinda on a whim and I don't regret it at all. It's pretty much replaced my ThinkPad.
iPadOS is indeed more refined, but not all of it - there's a fair number of very specific things that are a total PITA on the iPad but a breeze on my desktop Mac. That makes doing serious work on the iPad a little challenging, though it's still possible, but for most serious work I still use the desktop. But for almost anything else, I prefer the iPad's mobility and UX. It's the best drawing tablet, ebook reader, web browsing and netflix machine I've ever had, all in one.
Well I very much need Emacs and Unix-like environment for work, so iPad doesn't quite cut it. That said I've got a 11" Pro: mostly as a portable second display via Sidecar for MB12 but also some pen-based apps like Sketchr3d. It's nice and if you have a workflow that suits it, no doubt it's usable.
Pretty sad to see people touting this as a workaround solution.
It's still a workaround - possibly the most basic one for any kind of thin client there is.
I want a consistent IDE as well and right now, you can't run vscode properly on an iPad.
I'm still hopeful that future apple silicon macs will be able to do local development with more grunt than current macs. I guess we'll see.
And yes, for doing development on the iPad basically everything is a workaround, but that doesn’t automatically make it worthless. Neither does the fact that it’s basic. I don’t get your criticism.
Then, contrary to what you said previously, VS Code was never an IDE in the first place, because it's an Electron app built on web tech.
That also means there's literally zero difference running VS Code in the browser vs. the electron app save for the browser toolbar (which Safari hides if you add a bookmark to the homescreen, which also nets you an app icon. At that point this "workaround" gives you exactly the same experience as a native app. Not sure what issues you're still seeing there.)
I'm talking about local vs. remote development. Can you even spin up a web server on an iPad to run code-server? Or does it rely on an external web server to run and build your application?
If you know what a thin client is, then you're effectively describing that for an iPad. Anything is a thin client - even your phone.
> Except you've made it a DIY thing and claiming that it's a the solution for doing development on an iPad which is clearly false.
Why not? You're basically claiming that solely because it's remote it's completely unfit for development purposes. That doesn't really make sense to me. Especially with VS Code being an Electron app the browser version is equivalent to the desktop app.
Just to be clear, I totally concede that there are certain tasks where you do run into limitations, e.g. handling files is rather a PITA, but other tasks like plain coding are perfectly fine. And if you can save the former type of work for your PC then an iPad plus code-server is actually not going to cause you any trouble with workarounds.
It's no question that a "real" Unix system is more powerful, but a VPS has been surprisingly usable to me. Many commandline tasks are more enjoyable on the iPad than on my desktop. The rest, I just fall back to my desktop. Try it - you might be surprised how powerful it is.
Thanks for the tip, I might try it at some point, would need the keyboard folio first tho.
Am no stranger to bizarre setups. Have used remote shell on a Palm PDA with a foldout keyboard over IrDA via a cellphone GSM modem; now that was a cumbersome setup. Had an Agenda VR3 Linux PDA and coded on it. It's really comfy with the MB12 now tho and I like the network independence.
I code on the planes all the time, somehow it's really easy to get in the flow. So I like how this setup works on the foldout tray and without network to count on. Doing tons of field work too. In late January had to code for 6 hours with my ass on a tunnel tarmac north of Trondheim. Though of course most of the time I work within network range.
The "best" CPU that comes in that lineup is the Intel Core i7-7Y75.
I have a GPD P2 Max[0] which also has 16GiB of ram and a Intel Core m3-8100Y...
Not to brag but my CPU does seem to bench higher[1][2], and for me it can be painfully slow at times.. Though perhaps the OS is doing me no favours (Sway+Arch/Chromium)
I think the 12" 2017 model with the i5 was the best one. The i7 got a little hot, and the m3 was pretty slow. The i5 (I'm still using it today and it works great) doesn't suffer from heat issues nearly as bad as the i7.
I've used some fanless Ideapad before this with Core M as well, let's just say there is no comparison. With that one a tiling WM was very much a necessity.
My office PC is a very generously specced desktop, but for user interaction tasks like coding, browsing etc there is no appreciable difference with MB12.
Lightroom and Ableton are almost impossible to run on a Core M. It's painful, especially after a 10-15 minute editing session when the fans kick in: not only do you have to deal with a noisy laptop, but your CPU also gets underclocked, and switching from full-screen Lightroom or Ableton back to Chrome sometimes takes more than 30 seconds.
WoW and LoL are very lightweight compared to pro image or sound editing software, especially because they do a lot of the heavy lifting on your GPU.
Even the Mac Pro is kinda bad with that software, to be honest, especially if you spend more than half an hour using it, because of the thermal issues.
You can both be right. Machine defects, or just aging, can cause the battery to expand and the internals to coat with dust, causing thermal throttling, excessive fan usage, and significantly worse performance after even just a year or so.
I loved my macbook (failed to survive a rainstorm, alas) and wrote a ton of code on that supposedly non-pro machine while traveling all over the world. I hope it does reappear as an apple silicon machine.
However I suspect that if they can they’ll start with a “pro” machine that beats Intel specs as their first out of the gate, to demonstrate that it’s not a compromise option.
I daily-drive the latest version and it's my favorite Mac to date (and I have the Pro + iMac + iPads + iPhones). I have found that if you have a little bit of patience with loading, everything works just fine, and the form factor beats everything for real work on the go.
If they are really creating an ARM macbook, this will be a great product and this chassis is definitely the right one to start with.
Are there hubs yet that can turn one USB-C port into several? Or do they still only turn one USB-C port into several USB-A ports?
A couple of years ago, what I read was that the holdup was that this required more complicated chipsets which would not be available for a few months and would be expensive. More recently, I've read that this won't happen until USB4.
It's a strange thing: you can get these super-complicated docking stations with all sorts of different ports, but you can't get a 4-port hub with only USB-C connectors.
Oh wow, thanks for that link, I have never come across that!
Also, when I google with your phrase, there are no hits like that on the first page. I wonder if google is customizing my searches away from what I want...
This is the only way you can end up with >1 USB-C port, afaik. It's not currently possible to split USB-C like USB-A, so the only way you can get a second USB-C port is if you have a Mac and use a Thunderbolt dock.
I just wanted to point out that the butterfly keyboard on the MacBook wasn't much of a criticism at all. It was a trade-off that was arguably worth it in the name of the thinnest notebook.
The butterfly keyboard got most of its criticism when it moved to the MacBook Pro, because now you were putting up with a keyboard that ~50% of people find to have a worse typing experience, at the expense of a possible 1mm decrease in thickness (or not, as we have seen other vendors manage without it).
Personally I still want the old 1.5mm Scissor Keyboard.
This is the laptop I have been using for 5 years now. As a web developer, it is amazing that this 5-year-old mini laptop can drive a 4K screen through USB-C, but driving too much stuff at once does slow it down a lot.
Thinking about switching to the new iMac right now.
Agreed, and I want to add one prediction (which is maybe just a hope):
It will have a SIM card slot, and get at least 12 hours of battery life while using LTE.
This would be a truly compelling product, filling a niche Apple has tried to inhabit a couple of times, with the distinct possibility of getting it right this time.
With tethering, I'm not sure I want a SIM card in my laptop! If nothing else, AT&T will charge me extra for that (I'm already paying extra per month for my Apple Watch to have LTE connectivity).
Whilst I don't take my phone running because I can use my Watch for music and can make calls on it while out, I don't think I'd ever be somewhere with my laptop and no phone.
The problem with that is that macOS has no concept of “low data mode”. If it has a connection, apps are going to use it. On iOS, you can deny apps the ability to use cellular data. Also, well-behaved apps will usually give you an option to either not use data or, in the case of streaming apps, use less data.
It is also true of tethering, which is quite popular.
Now, I can tether my phone to my laptop. Indeed I did so fairly often, before global house arrest.
But this is true of my iPad as well, and I opted for the cellular model instead. I'm glad I did, it's a better experience, hands down, especially when I'm traveling and want to conserve my phone's charge rather than burn it at maximum.
But if you felt differently, Apple sells a WiFi only iPad, you could simply not pay the extra for the cellular model when checking out. This would almost certainly be true for this imaginary Macbook as well.
I wouldn't buy it at all, I'm a professional developer with mild presbyopia, and purchased the standard-fancy model of the 16". But if I were an incoming freshman again? Bet I'd be pleading with my parents to get the cellular model.
Unlike a touchscreen, an LTE modem doesn't demand any changes to a desktop operating system to provide a good user experience. Again, tethering. Cell bandwidth gets cheaper every year.
The Mac mini is a vital part of the lineup, but it is so much of a utility computer. That's why they utilized it for the Developer Transition Kit. It's hard to market it since it's nearly invisible on a desk and there's no Apple monitor that is reasonable to pair with it. For 'Apple Silicon' for xmas, one would assume several notebooks and maybe something like an iMac with a built-in display.
Interesting how much emphasis they put on the camera, speakers, and mic
People realized how crappy the camera (and mic) is on Macs compared to other computers now that we are all video conferencing. It's sad that every PC user with a Logitech webcam looks much better.
Few if any laptops have decent cameras. This is partially because until recently most users didn't actually care that much, but also for practical reasons; modern laptops have very slim lids, so you can't really fit decent optics (a phone is a good bit deeper).
For comparison, the MacBook Air has about 2-3mm of usable space in the lid for optics + sensor. The iPhone has something like 7 or 8mm. For lens design and the ability to have a very slightly larger sensor, that's a huge difference.
Right, but having an integrated camera that is absolutely awful completely defeats the point of having it there in the first place. It's really noticeable when you FaceTime with someone on an iPhone and compare the video quality of the two.
> having an integrated camera that is absolutely awful completely defeats the point of having that there in the first place
This is a ridiculous statement considering the camera we're talking about is 720p. The current camera is indeed better than "no integrated camera at all" and is perfectly fine for video calls with your family, or even for work, since you're probably sharing your screen and your coworkers don't need to see every pore on your face.
Which is ridiculous, because if they can fit a good front camera in the iPhone, they should be able to do that in a much bigger device like a laptop, right?
I'm sure they can, but for some reason they didn't. It's fun to speculate, but apply Hanlon's Razor and Occam's Razor; it's unlikely there is some evil corporate scheme at play, and it's highly likely that the reason is very simple.
Perhaps they never got the level of feedback that would have driven them to upgrade it. Perhaps it has to do with internal designs using a USB 2.0 bus, or something that now has to be changed to a CSI- or MIPI-style interface that first has to be processed by something like the T2 before it is a readable data stream for the standard Intel architecture.
They probably won't be able to fit the exact same iPhone module, the optics (Be it plastic or glass lenses) are too deep, but you could probably do with a better sensor for sure. Some manufacturers tried to 'fix' this by putting the camera below the screen which gives it a bit more space because you can then use the hinge area for the components; but now you end up looking at someone's chin all the time.
An iPhone is much deeper than a laptop lid. Perhaps they'll have to redesign the lid not to taper at the edge to provide more space for a better camera.
The camera on my iMac Pro is better than the Logitech C920 I was using with my older trashcan Mac Pro. I have a much better mic that I use, so I'm not sure about the built-in mic on the iMac Pro.
The original Apple FireWire webcam was very good, with a large, motorized lens. Despite boasting higher resolution, I doubt many modern webcams are as good, especially integrated ones.
You can still judge the macbook cameras by looking at the images they deliver. They're bad, really bad. I suspect that apple's laptop guru / team moved to iPhone and never got backfilled, leaving the macbooks with budget flat cameras from 2007 or something.
How awesome would an updated iSight be today? You could fit an APS-C-sized sensor (1.6x crop DSLR) with a 22mm f/2 pancake lens for some insane bokeh, probably at around the $300 price point. Sure would be easier than hooking a "clean HDMI" DSLR or mirrorless camera into an HDMI-input/USB adaptor...
Especially if the camera could compress the stream in-body so you don't have to blow up your CPU/GPU to get a good 60fps HD video signal like you do using a mirrorless/DSLR.
IMO YMMV etc but wide lens distortion can drag down what people would call “image quality”. 50 or 85 would work better for portraits if you can afford the distance.
I'm guessing they will go right to the meat of things and upgrade one of their MacBook Pro models. That is right up the "Performance per watt" avenue which Apple claims their new silicon is best for. They are going to want to squash any rumors that these don't perform well right off the bat so I don't think they are going to be conservative here.
For similar reasons, I don't think the iMac was ever slated to be the first Apple Silicon machine. The desktop form-factor just doesn't highlight the benefits of the new architecture the way a laptop does. Since they sell a lot more laptops than desktops, it's likely they don't even have a desktop specific CPU ready at launch time. That'll come next year or maybe even 2022 towards the tail end of their 2 year launch window.
Yes, they need at least one Apple Silicon machine that demonstrates it is competitive in absolute CPU power, and of course serves as a machine for all the developers. They need not just a testing machine; they should be doing the development itself on an Apple Silicon machine. That is why I consider the rumor about a 13- or 14-inch MB Pro with Apple Silicon highly plausible. The 16-inch would then follow later, with an even beefier CPU.
I wouldn't be surprised to see the 13/14 and the 16 inch both released this year.
That would mean 2 significantly different 16" MBPs in the same year. My thinking here is that it would be quite odd if they released a 14" MacBook Pro which is faster than the more expensive current 16" model.
The 16" was just refreshed and requires both a very beefy cpu and gpu. Not sure Apple will bring that with the first iteration. The 13" MB Pro has an integrated GPU, so that sounds more like a first step for Apple. Also, I think the Intel version of the 16" will stick around for quite a long time, might even be the last Intel machine sold in parallel to Apple Silicon, as developers and many others might require an Intel-based machine.
How can Apple say their new CPUs are the best/ fastest... but aren't good enough to ship in their flagship laptop?
This first launch is going to be the most scrutinized & criticized Mac Apple has launched in years. Apple knows this and they are going to put a beefy CPU in their new machine to silence the critics. If they can't beat Intel performance at launch, what are the chances they are going to be able to beat them a year down the line?
And since it would be quite weird to have the smaller MacBook be the better/ faster iOS development machine, it seems like the 16" is pretty likely. Maybe not launch day, but within the first few months at the latest.
They might keep the Intel based MacBook around and sell them at the same time, but I doubt it will be the only 16" MacBook they sell for long. It just doesn't make any sense.
That is the reason why I think at least one of the MB Pros will be available on Apple Silicon from the start, I am just not sure they will start with the 16", as they would have to replace a dedicated GPU for that machine.
Because one important feature of the large MB Pro is the availability of a dedicated GPU making it a decent graphics machine when connected to an external screen. I am certainly stressing the GPU of mine :)
My thinking is this is exactly the use-case Apple needs to prove they can compete with and arguably the case where they should shine. With ARM's much better thermal characteristics, Apple should be able to get much better graphics performance.
Worth noting that the very first Intel Macs released in January 2006 were the MacBook Pro and the iMac. At the time the MacBook Pro came in 15”/17” sizes, now after multiple generations they’re 13”/16” machines. The iMacs were 17”/20” machines as opposed to the 21.5”/27” machines they are now.
The rest of the PowerPC line was brought to Intel within the calendar year, and iMacs and MacBook Pros received an additional refresh replacing the Core Duos with Core 2 Duos later in the year.
At the time I remember thinking the release cadence for which models they transitioned over made perfect sense. You could make a strong argument for them to follow a similar roadmap this time around, because while Macs have changed substantially since then, each Mac’s place within the lineup has not changed very much, although a MacBook Pro and a new ultra-thin MacBook released around the same time, to show off the advantages of Apple’s chips at both ends of the performance-per-watt spectrum, wouldn’t surprise me.
Man, this will put a lot of us in a quandary: buy this knowing Apple Silicon is coming down the pipe, or wait it out. I am disappointed no obvious changes to cooling have been made. Current i9 models can really spin up the fans; a number of owners have said it's one of the loudest Apple computers ever.
Nano-texture glass is around a $500 upcharge, which does not seem bad, but I seem to recall it has special cleaning requirements, so be careful if you have family or friends who are touch-prone.
SSD upgrade from 512 to 1TB is reasonable as well, around $200
A Crucial P1 NVMe 1TB is around 100 euros in my neck of the woods, so charging twice that for a 512GB upgrade is bonkers in my book, but clearly I'm not the intended customer base.
I guess for companies flush with cash, where every employee's salary is in the six-figure ballpark, the price of Mac configs doesn't raise any eyebrows on the expense list, but at least at all the companies I've worked for so far, if I'd suggested we buy Macs, the bean counters would have had a fit.
> I am disappointed no obvious changes to cooling have been made
Same.
I own a 2017 5K model with an i5 and the terrible cooling is my only gripe with it.
It's totally fine for bursty workloads, but once you get into a light sustained workload (eg: music production) temps go to +70ºC and the fans become quite annoying.
As for the nano glass, I ordered an anti reflective screen protector from a local Dutch supplier. It works fantastic on my 16". I do programming though, no color-sensitive photo or video work.
Consensus is that the first ARM devices will be 13.3-inch MacBook Pro and a new redesigned 24" iMac, based mostly on Ming Chi Kuo's report from just before WWDC2020.
I think this means iMac will be one of the first computers getting Apple Silicon, otherwise it's weird that they're updating the 27" without updating the smaller iMac.
There have been rumors about an updated smaller iMac which will shrink the bezels and basically look like an ipad pro on a stand, with a screen size increase to 24" and Apple Silicon, coming either later this year or early 2021.
I do wonder why separate the smaller and larger iMacs lifecycles, maybe because the desktop-level Apple Silicon chips aren't ready yet? That would make sense especially if they're aiming to replace the AMD GPUs even on the top of the line larger iMac and use integrated graphics there too.
I can imagine two priority devices for them to feature from the start:
- A super-lightweight 12" Macbook-like device that lets them demonstrate how Apple Silicon opens up new categories and form factors: small, powerful and amazing battery life
- A developer machine (MBP or similar) will be necessary. There are precious few DTKs out there, and they need more developers to be running Apple Silicon. I can't imagine them not shipping this in the first wave.
IIRC the iMac and the PowerBook were among the last computers to get an update after Apple announced that they were switching to Intel, and they were also the first computers to get Intel processors.
So this probably doesn't mean anything in that regard.
There is a lot of speculation that they might have a replacement MacBook 12” at launch. It would be a good fit. That one is not listed in any of the stronger rumors though.
The strongest rumors are for a MacBook Pro 13/14 and a 24” iMac. The 24” iMac suggests that they will switch to new designs with the new chips.
I don’t think that Apple will restrict their new chips to just a little MacBook. It would make it look like that is all they can do. They are going to want to come in strong and have a range of chips on different machines. Some with high efficiency and some with high performance. They seem confident and I suspect they will pull this off.
There should be at least one mid-range machine coming out early, so that developers really have a machine they can work on. Also, Apple needs to demonstrate that they can make powerful CPUs beyond what they already do for the iPhone and iPad. So the rumor of the smaller MB Pro sounds plausible; the 16" would be migrated in a second generation of Apple Silicon, likewise the large iMacs and the Mac Pro.
The performance of their chips is good enough that it wouldn't make much sense not to update everything once silicon availability is assured. It would be ridiculous to keep their iMac slower than their Macbook Pro.
I’m sure there are practical product development and supply chain reasons they would not be able to release an entire line of updated macs at the same time
MacBooks and iMacs are not direct competitors - one is a laptop, the other is a desktop computer. You can have up to 128 GB of RAM, 8 TB of SSD, and a beefy GPU in an iMac, something I bet won't be available on ARM for some time.
> Massive upgrade for people working at companies whose IT refuses to buy anything but the cheapest model.
In my experience, the companies who always buy the cheapest model tend to avoid Macs entirely. The big place I see the base iMac popping up is at schools.
The 27” models have user-accessible DIMM slots in a compartment on the back, so you’re ok to get that RAM from Newegg or Crucial. That’s what I will be doing.
It’s crazy, even though my work laptop (https://www.cnet.com/reviews/dell-latitude-e6500-review/) was considered high end in 2010. My company ordered it for me because of some Windows CE development I was doing. It had 8 GB of RAM.
They gave me the laptop when the company went out of business. It was my Plex server until last year.
Depends on which model. Some require taking the screen off, which unlike earlier models is taped on, and some have soldered-in RAM. Until someone has seen a hatch in the back or has torn one down, you just don’t know.
A friend of mine upgraded his 21-inch one right out of the box because Apple's RAM pricing and turnaround are terrible. However, he damaged the screen ribbon cable in the process. Happens a lot.
The iMac Pro is starting to seem pretty redundant even with this slight spec boost.
If you upgrade the new 27" iMac to the same specs, you can get a faster machine than the base model $4999 iMac pro for $900 less - it's a faster clock speed 10-core processor, better performing GPU with Radeon 5700 XT 16GB (as opposed to Vega 64 8GB), and same 32GB RAM and 10Gb ethernet and 1TB SSD for that price.
The iMac pro has even further upgrade options that the iMac doesn't, but still, I wouldn't expect a $5k non-upgradeable pro machine like that to be underspecced. It'll also be interesting to see some teardowns of the new iMac and compare actual benchmarks between these machines, the iMac Pro might still have better thermals that lets the cpu/gpu perform better in the real world.
Particularly when you consider they didn't even add the anti-glare display upgrade to the iMac Pro. It seems likely the iMac Pro will go its entire life with only this one small processor bump before it's retired.
5K iMac was introduced in 2014, several years before iMac Pro. I think the point of iMac Pro was to fill the gap while the new Mac Pro was in development.
The rumor that they would redesign for Intel didn't make a ton of sense.
The MacBook Pro design had a major pain point so they needed to push an update for their "final" Intel MBP. The iMac design is a touch dated, but has no significant flaws (at least none which Apple intends to fix).
> iMac Pro now comes standard with a 10-core Intel Xeon processor. Designed for pro users who require workstation-class performance, iMac Pro features Xeon processors up to 18 cores, graphics performance up to 22 teraflops, up to 256GB quad-channel ECC memory, and a brilliant 27-inch Retina 5K display.
I think just a bump in the base CPU and maybe the graphics? There is a reason it's buried deep in the PR.
I'm always interested in the differences between Apple's Intel OEM CPUs and the ones Intel lists in their catalog. There is no 10th-gen Core i9 CPU that exactly matches what Apple is selling here. In the iMac Pro they are installing a Xeon-W part that has almost twice as much cache as the standard part (43MB vs 25MB). They're not saying how much cache is in the new 27" iMac.
I'm picturing some binning shenanigans on Intel's part and Apple doing enough benchmarks to pick a minimum amount of cache. Or maybe Intel offered them something with odd specs that they have too many of?
Usually it comes down to either not being socketed or, as with the cheese-grater Mac Pros, the CPUs lacking the integrated heat spreader. Sometimes it's cache or clock variations, but usually it's more along the lines of the chip packaging that differs.
The one thing I keep missing on my 5k iMac is some form of video-in. Would be great if I could connect my MB Pro to it and use the iMac as a display. Perhaps even in a window and all mouse and keyboard events to the window get passed to the connected MacBook.
I researched why the 5K iMac doesn't have video-in, and it seems that the reason is that the display has so many pixels that it needs two pieces of hardware to drive it. Getting the timing and color calibration and what-not right clearly wasn't going to happen externally, so rather than have a suboptimal experience they just took video-in out. There's also the issue that few video cards can drive that resolution.
Yeah, my 2008 iMac had this functionality, and I was surprised/bummed to hear they got rid of it a few years later. I’d think Apple would keep the feature around, if only to make it easier for people to buy both a Mac laptop and a Mac desktop. My wife would totally use one at work if it were easy, and I’d love to have one for us at home. But instead we just get laptops with fans that come on all the time, and SSDs that are far too small.
This one kills me. With COVID, I'm now stuck working from home. I literally spend all day staring at my work Macbook Pro's small screen with a glorious but unused iMac screen sitting five inches behind it.
Same here - using my MacBook Pro when working from home. I was very fortunate, though, to get myself a Dell 2415Q in January. While not quite as good as the iMac, it is a good "retina" screen. I can only recommend it.
Previously, I would sometimes use screen sharing to show the desktop of my MacBook on my iMac, but due to its slow speed, that is only an emergency measure.
The larger the screen, the lower the resolution in ppi gets. I have seen a 28" 4K screen on a Mac - much fuzzier, as the pixels get larger. Actually, to get the same ppi as the iMac, you would need roughly a 22" 4K screen. So a 24" 4K works in scaled mode, which gives about 2,300 horizontal points; that is still reasonably sharp.
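For anyone who wants to check the arithmetic, here's a quick sketch (assuming the usual panel resolutions of 5120x2880 for the 27" 5K iMac and 3840x2160 for 4K UHD; the diagonal sizes are nominal):

    import math

    def ppi(width_px, height_px, diagonal_in):
        """Pixels per inch for a panel of the given resolution and diagonal size."""
        return math.hypot(width_px, height_px) / diagonal_in

    # Assumed panel resolutions: 5K = 5120x2880, 4K UHD = 3840x2160.
    print(f'27" 5K iMac: {ppi(5120, 2880, 27):.0f} ppi')  # ~218 ppi
    print(f'28" 4K:      {ppi(3840, 2160, 28):.0f} ppi')  # ~157 ppi (noticeably fuzzier)
    print(f'24" 4K:      {ppi(3840, 2160, 24):.0f} ppi')  # ~184 ppi
    print(f'22" 4K:      {ppi(3840, 2160, 22):.0f} ppi')  # ~200 ppi (closest to the iMac's density)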
Many people are only able to work on machines provisioned by their companies. Even if you could get your home iMac provisioned during Covid, you may be wiping out all your personal data and now you are dedicating a machine you paid for to your company.
I use Luna Display[1] for this. It is a bit laggy and some keyboard shortcuts are broken, but it lets me use my 5K iMac (and its keyboard) as a second display for the MBP.
It is a shame that iMac does not provide normal video input.
This is the reason I went Mac Mini + LG 5k display.
It's much more expensive but I needed to be able to hook in my work laptop to a 5k display while working and switch easily to my home setup with a single cable when I want to.
Ended up being perfect! It's paying off big time with COVID and WFH.
I had a Mini before the iMac, but the Mini had the problem of lacking a decent GPU. Also, at the time I got the iMac (2015), the Mini was really falling behind. The iMac gave me a quad-core i7 and a GPU. I wish the Mini were available with hardware comparable to the 16" MB Pro (8 cores, a GPU).
If Apple only offered a desktop class Mac, I would even start thinking about the pro display.
In some ways the outlook for the Mac is much better today than it was a couple of years ago.
But I agree it does suck that you can't get a decent desktop that's not an all-in-one iMac. Users have been asking for it for years, and Apple's response was "Fine, you want upgradeability? Here's the entire kitchen sink with terrible base specs for $6000. Oh that's too expensive for you? Guess you're not a real pro after all so shut up and buy the consumer level products we sell you."
It's maddening. If they put an i9 and non-ECC RAM in an upgradable desktop, I wouldn't have built my PC. And as the other commenter said, I'd maybe even consider the new display.
Too bad for them. They're missing a huge chunk of the market.
Only looked briefly at it. Combining the tiny Mini with a huge external GPU seems a bit off, and unfortunately, for all the size, they don't integrate any storage space. Also, currently the Mini doesn't offer good CPUs. Anyway, I will wait now with my next purchase till there are good Apple Silicon offerings.
For me, the eGPU setup is not worth the benefits. I am also waiting for Apple Silicon to streamline my setup.
I have a Mac Mini + eGPU. Every time I (re)boot the machine I have to remember to remove the eGPU; otherwise I cannot enter the FileVault password. This is super annoying, since the machine appears to have crashed every time I want to use it (even without an eGPU).
Previously I used the eGPU with an 13” MPB and the experience was annoying in different ways: Simply unplugging the eGPU kills the programs using it. Every time I wanted to “undock” the MBP I had to remember to click to remove the eGPU from the MBP: otherwise half the programs running are killed/not responsive.
> 1. If you have a Mac mini (2018) with FileVault turned on, make sure to connect your primary display directly to Mac mini during startup. After you log in and see the macOS Desktop, you can unplug the display from Mac mini and connect it to your eGPU.
It might not be more expensive in the long run, because now if your Mac Mini needs an upgrade you only have to replace the Mac Mini and can keep the display, which is cheaper than a new iMac.
You know, there are a few key features about macs that have made them much better computers.
Target disk mode. Target display mode. Boot from an external drive using option. Boot from the network.
And you know the "decision makers" don't understand how much nicer it makes the computers to administer. (understand in the sense they would defend them for real in a feature-cutting meeting) Instead we get dongles.
Can't you boot from an external drive (time machine backup) using option, on Catalina? I was under the impression booting from external USB was how you would format the mac.
I don't know about time machine. I suspect you might.
To be clear, I wasn't saying they were taking out the option boot-picker.
I was saying these lesser known features - although they will never be used by a general audience - make the mac significantly easier to administer for people who have been using macs for years and know what they're doing.
I will say I'm glad you can upgrade an older system to a new usb-c style thunderbolt machine using target disk mode - but the dongles/cables to achieve this feat are ridiculous.
Oof, that is brutal. I just ordered one and assumed it would support it. My iMac from way back in 2012 supported it, and with connectivity only improving since then, it never crossed my mind that this one wouldn't.
Actually, my research shows that the 2009 iMac was the last one that offered this functionality, which is described in 2009 as "27-inch models also support input from external DisplayPort sources (adapters sold separately)."
This matches my recollection that my 2008 iMac had this feature, and the one my wife got in 2012 did not.
This is most certainly why video input does not exist. What's ridiculous is that they clearly figured it out for the LG 5K display, which is the exact same panel, over TB3, which the iMac fully supports.
More likely Apple don't supply it since having so many ugly ports ruins the aesthetic of the back. I mean, they've also done questionable things for aesthetic reasons, like putting the USB ports at the back, the power button at the back, the headphone jack at the back. All these ports need to be facing the user, not behind the screen (I have a 2014 iMac, so this is speaking from experience). I honestly believe an artist determined that the iMac cannot be used as an external monitor.
Just say "No" to the product. I'm not making the same mistake again.
Right. But now that all Macs support even 6k output, there is no longer a justification. Also, they could have allowed target display mode in a lower resolution or in a window on your iMacs desktop, if they had tried hard enough.
1/4 TB of storage in the $1800 model, 1/2 TB in the $2300 one. Ugh. (So before the lower-end one had 64GB? [Edit: Nope, 1 TB Fusion Drive, thanks smnrchrds!] That would be FULL just from installing the one game the press release mentions.)
What kind of magic fairy dust SSDs are they using that you can't have a sensible amount of storage space at those prices? Looking at the upgrade options (which aren't even available for the lower-end model), they charge $300 per TB.
Don’t buy a desktop computer with non-user-replaceable/upgradeable disks. In an Air form factor I can buy the argument that it saves 50 grams or one mm of thickness. In a desktop it’s just ridiculous. Having any standard other than 2.5” or NVMe is silly, and ideally a desktop, even in the iMac form factor, should have at least one empty 2.5” slot.
Yup. I've had my 2015 5k iMac apart a couple of times, and it's not for the faint of heart. Screen is glued on, takes a few hours, and for certain tasks the entire computer has to be taken apart. Nothing like the drive swap on my mom's ancient 2008ish (?) iMac where you just suction cup the screen off and the drive is right there.
I accidentally ordered the 'wrong' iMac (256GB SSD), which turned out to be the right option because the Fusion Drives seem to have issues. I had a 2TB SSD hanging off the back for the past 6 months until I finally went to install it internally. Sadly, if you didn't order your machine with a SATA drive, it doesn't have a SATA cable -- hence having to disassemble the entire thing to get to the plug on the logic board that lets you add a drive.
There's a great reason for it: profit! Apple knows their customers will pay because they don't have a choice - their computers are bought for them or their software only runs on macOS.
Meh - my Mom's iMac has the 2TB Fusion Drive and it works surprisingly well. When I dual-boot it into Boot Camp with Windows, you can tell the SSD caching is not there; it's pretty obvious. Sticking in a 32GB SSD (formatted NTFS) and dedicating it to ReadyBoost helps a bit, but it's still nowhere near as effective as Fusion under macOS. Obviously being pure SSD would be better, but when she got it almost five years ago that much SSD - even third party - would have been cost prohibitive, and it's still pretty expensive to get 2TB as all SSD.
I’ve been very happy with the 3TB Fusion drive in my 2014 27”, it’s hugely more performant than a bare hard drive. I’ll probably be getting a new one with a 4TB SSD, but that will make the new machine over a grand more expensive than my old machine was. These new SSDs had better be zippy.
They dropped the SSD portion down to less than that for a while, almost completely negating the benefit, since something you needed would always end up on the slow spinny drive :/
> What kind of magic fairy dust SSDs are they using that you can't have a sensible amount of storage space at those prices?
The kind where Apple can make a 300% or more margin, knowing that people will still pay.
Don't think Apple have ever sold SSD upgrades at anything remotely resembling competitive $/TB prices. Clearly they don't need to, because people still buy their machines.
Apple wants a certain profit margin for the model as a whole. Yet for customers who will never pay prices that will give Apple that full margin, they offer a model at a low price point with a compressed margin, but with a painful aspect that will encourage customers who can pay to go with a higher priced unit — here, the painful thing is the small SSD size.
So the idea that the minimal-SSD model is somehow the "real" price and Apple is selling SSD space at an exorbitant markup, isn't quite right.
You could almost think of it as Apple offering a discount if you'll go with the smallest SSD, so they can capture more of the market while still keeping their average profit margins high.
If Apple charged "competitive $/TB prices", the small-SSD option would likely be more expensive, rather than the larger-SSD options being cheaper.
Perhaps that's true for people who buy personal Macs, but in several companies I've worked for, the companies that buy Macs for work almost always use the smallest SSD option: you don't need a lot of local storage, since most files will be in company-managed storage systems.
Sure, there will always be customers for whom the low-price option is perfect.
Outlet malls are another form of price discrimination, as they offer lower prices for those willing to travel further to get them. But there will always be some people who happen to live next to the outlet mall.
If people want to "cheap out" they can always get it with the base storage and hang a big thunderbolt drive off the back. It basically performs the same as if it was built-in, at a much lower price. One can get the entry level 27 inch imac and add storage and RAM to get a high end config on the cheap (graphics excepted).
The iMac competes via its 5K display and having macOS in general. Remember that Apple’s comparable offerings for this tier display were priced at $1299, in the same price range as many 5K displays.
The $1800 price point isn’t hard to grasp once you consider the display.
EDIT: I previously compared to / referenced possibly sub-$1000 5K monitors, but it seems the ones I was looking at in that range were actually 4K.
It’s like buying a 5k display and getting a subsidized computer. There’s ~1 5k monitor on the market today (LG)?
I wonder how expandable this is? Buy the bottom end machine with the CPU you want and upgrade SSD and RAM? I would imagine SSD is upgradeable but RAM soldered onto the mobo?
Thanks. Hopefully the SSD is accessible too. I have a ~2010 iMac 27" and I had to pull off the front glass of the monitor and disassemble half the computer to replace the HDD with an SSD. That was a gigantic PITA.
I think the SSDs on all the T2 computers are built into the mobo — the T2 is the disk controller — so not even a removable card on the back side like the iMac models up to now.
On Apple laptops "SSDs" are also soldered PCIe flash memory chips or something like that, so maybe on this very thin desktop computer they will also be doing that.
And it's interesting how hard it is to find a 5K "retina" display; I was looking at this, but other than the hard-to-find (these days) LG model, everything from the usual suspects (Dell, etc.) gives you some gigantic screen at a lesser DPI...
Apple NVMe disks are fast as hell - faster than any other single disk I've used, outside of the Dell NVMe disks we have in some of our servers that cost like $3000 each. I can also trash whatever disks Apple is using for years without ever running into failure, whereas I've blown out the wear leveling on plenty of "affordable" SSDs.
Apple intentionally prices a lot of customers out of their market. Their monetary strategy relies on it, oddly enough. If they sold to more people, they’d run into many more issues being seen as a monopoly.
Tim Cook spelled it out pretty clearly to Congress when he testified last week; Apple does not have majority market share in any market they’re in. Not in smartphones, tablets, laptops, desktops, or wearables. And they don’t want to have more market share. The inflated prices and being seen as a “luxury brand” is a great way for them to continue making a huge profit while restricting market share to the point that they can skirt around most monopoly law.
The downside there is to maintain growth, they need to continually push into new markets, which they’ve been doing about once every 5 years (desktops -> laptops -> phones -> tablets -> wearables -> automotive/AR?)
>Apple does not have majority market share in any market they’re in.
That is his spin. iOS has a majority market share in the US (over 50%), plus iMessage, etc. So it is not a niche.
Of course he would like to use shipments, brand, or whatever other metric proves his point. Although Congress seems to be dumb enough (or not) not to press those questions.
I imagine they also reduce their support resources, and improve their customer retention, by pricing where they do. Low-cost shoppers are going to be hard to retain.
Well, sure, that's just the law of demand. If Apple's goal was simply to ship as many products as possible in the short term, they would sell them for a penny until they ran out of money.
The only ones that had anything close to reasonable prices were the 2012-2013 MacBook Airs. The base one came with 128GB (which was decent at the time) and upgrading to 256 or 512 wasn't a lot more than those drives cost at retail.
Throughout its history, Apple went back and forth between reasonable memory upgrades and ridiculous memory upgrades. They seem to have settled on a general trend recently - if the user can do it, like in a Mac Pro or iMac, charge a ridiculous amount. If it's not user-accessible memory, like in a portable, charge a more reasonable amount (but admittedly still on the expensive side) for memory upgrades.
I bought a tricked out iMac last year and the only thing I didn't upgrade was the memory. Got 32 GB from Other World Computing at the same time for a much more sane price.
They learned their lesson and made storage, RAM, etc. non-user replaceable in newer versions. You used to be able to buy a Mac from Apple and upgrade its RAM at a reasonable price on your own. Not anymore.
You are right. But you used to be able to do that to any Mac. And I fully expect the ARM Macs to get rid of this feature. Though I would be happily surprised if they don't.
I don't expect them to, but I don't not expect them to, if that makes sense -- it feels like an "all bets are off" kind of thing to me. The industrial design of the iMac is likely to finally change when it goes ARM, and it doesn't strike me as impossible that they'll make it slightly more open, or at least not make it worse.
On the one hand, Apple loves to eliminate options and lock things down, both for quasi-defensible reasons like simplifying product lines and for less defensible ones like increasing profit margins (and making everything obsessively thinner, like they're in the grip of some industrial design anorexia). On the other, their most recent hardware design changes have often shown response to customer complaints (e.g., replacing the butterfly keyboard with an improved iteration of the "Magic" keyboard) -- and, if they really intended to lock down the Mac like iOS, the move to ARM and the sweeping UX changes in macOS Big Sur would almost certainly have been when that happened. The fact that it hadn't happened makes me considerably more skeptical it's going to. (I'm also more skeptical now that iOS will ever be allowed to blossom into a full general-purpose OS, but that's a different topic.)
As an aside, I'm not sure whether adding the T2 chip would make it more complicated to use a third-party SSD. It's my understanding they function as the SSD controller and do some kind of wonky things, but I am not taking the time to look that up and could be completely wrong. :)
I feel like they are taking a page out of consoles, at least if the iPad's influence is to be felt on future Macs. Relatively rigidly defined hardware specs and the apps are designed to take that into account.
On the 27" iMac, the memory upgrades are able to be done easily by the user. The 21" Models require taking it to an Apple Store to do so. I believe the iMac Pro also needs to be taken apart in order to upgrade the RAM.
Frankly, I don't think people upgrading their own machines ever played into this.
I suspect the vast majority of people who pay for the upgrades are businesses or consultants where the Apple upgrade price is fairly negligible compared to cost of their professional time.
If you’re someone who didn’t want to pay the Apple premium before they took user upgrades away, it seems a little unlikely you’ll be happy to stomach them after. Rather you would just do without (of course there will be some who do upgrade, but I suspect they’re in the minority).
> I suspect the vast majority of people who pay for the upgrades are businesses or consultants where the Apple upgrade price is fairly negligible compared to cost of their professional time.
Yes exactly. The premium on my time has been gradually edging out in my priorities. I'd rather pay a little extra to have something already there than spend the same amount in my own time instead. It's a fair tradeoff. "But there's not much time involved, it's a ripoff," say many. But there is time involved for people who don't regularly upgrade computers. Figuring out what to buy, the best place to buy it, then the process of doing the upgrade yourself (if it's possible) is not a trivial amount of time unless you do this often enough.
Let's say you can upgrade a hard drive in 2 hours total, which is conservative -- the total time of researching what to buy, reading how to install it, ordering it, opening the package, putting in the drive, configuring the stuff you need to do, if necessary. Even at two hours, for my wage, that's about $200 of my time. I'd rather just spend the $200 or even a little more to not deal with it. And in reality, for me at least, it would take more than 2 hours of my time all-in anyway.
Upgrading the RAM on a Macbook used to be a five minute process: Order "macbook ram" from the online retailer of your choice, pop out the battery using a coin, slide the RAM into the slot. Done. Anybody could do it.
Apple has put a huge amount of real effort into direct outreach and support of their customers, really.
Microsoft chased business.
Apple chased people.
Google is chasing its own tail.
For the markup over the retail price of the SSD, someone else does the installation and validation, and builds dead-simple restore and backup services to boot.
Dell? Lenovo? Microsoft? Still ramming bloatware down your throat and persona non grata in malls or shopping centers, or wherever a digital nomad might be roaming.
Apple built a hardware and software ecosystem for normies. Free from the Machiavellian incantations of pretentious experts with their opinions on memory layouts, how big their data is, when people just want to edit and backup files.
People think the markup is worth it to avoid IT people. Can’t say I blame them. Have you worked with the “professional” level IT crowd? Alpha bro sausage fest and foot fungus eaters.
>Have you worked with the “professional” level IT crowd? Alpha bro sausage fest and foot fungus eaters.
I work at a network security company and a very small percentage fit your ignorant stereotype of them. Overall it's the best group of people I have met in my entire life.
You run into way more "Alpha bro sausage fests" when you hang out with "normies".
I agree with your overall idea, but not your sentiment. I think that plenty of computer nerds use Macs. It's still a popular platform for software developers and computer enthusiasts. There's no need for gatekeeping between "normies" and "computer nerds."
You seem to be really angry at people who make choices different from yours. So much sarcasm and contempt for others here.
Anyway, if you want to get into it... Microsoft dominated the consumer market in the 1990s. The same people you say were not nerds and didn't care... didn't care to get an Apple machine either; they got whatever everybody else got, which were Wintel PCs. Most didn't care if something else had a nicer design or UI or was friendlier or more efficient. It was Win9x almost everywhere.
I think this negates your "Microsoft went after businesses" hypothesis somewhat. They had total domination everywhere, and maybe got a little lazy or complacent and the lead eroded, at the same time Apple got Jobs back and grew as a consumer brand due to iPod etc. But Microsoft is still a major force outside of techie circles.
> Microsoft dominated the consumer market in the 1990s.
Microsoft dominated everything in the 1990s. Apple more or less lost it all.
> Most didn't care if something else had a nicer design or UI or was friendlier or more efficient.
Macs weren't very good in the 90s. They weren't fast, they weren't pretty, and they were quite expensive. The original iMac changed that in a fairly big way. It was more affordable and a lot more approachable than pretty much any other PC on the market and non-business consumers loved it.
Every successive generation of the Mac has shaved off just a little bit more of the consumer end of the PC market. Then the iPad came along and completely crushed the low-end netbooks. So now Apple has much of the mid-to-low-end consumer space with the iPad and the upper end of the consumer space (and a big chunk of professionals who have the choice) with the Mac.
Microsoft still truly owns corporate PCs and gamer space though.
I used to build gaming PCs for myself and friends, salvage laptops from parts picked up at auction, etc. But I'm no longer interested in doing that. I'd much rather just buy the computer I need and not fuss with it. I think both of those "modes" are completely valid, and it's great that there are many options on the market for people in both groups.
Is it perhaps "annoying in theory" that I couldn't upgrade my Mac's hardware if I wanted to? I suppose so. But I'm never going to want to do that, so I don't experience any practical annoyance. I suspect my experience matches the overwhelming majority of Apple's customers and potential customers.
Before switching to Macs I used to build my own machines. You want to know how many times I upgraded any of them? Never. Seriously, I specced them out quite generously and never needed to, they lasted as long as I intended them to. Now I’m a Mac user it’s the same thing. I’ll spec out a 27” with plenty of headroom and Ill be done.
> You want to know how many times I upgraded any of them? Never.
I did it several times. Seriously, upgradability is an advantage. Some Apple fanboys may deny it, but it's better to have that possibility.
And Apple's SSD specs in 2020 make me laugh. But I'm not whining: you want a Mac, be ready to pay a premium price or spend time with a Hackintosh. It's just business.
Oh come on, Tom's Hardware did a roundup of desktop SSDs a few months ago. The iMac drives beat all but one of them, the Sabrent Rocket, and that comes in at £750 for a 4TB card.
So sure Apple storage is pricey, and if you’re willing to compromise on performance you can get something that looks equivalent for a lot less, or something actually equivalent for a bit less. But there are a lot of other things about Apple gear you can’t get anywhere else full stop, at any price.
According to who? Everybody cares about value for money. Specs aren't the only reason a product is valuable though, of course. I pay a mark up on Apple gear because of the value I get from the software and customer service, but I still need effective tools and that means up to scratch spec-wise.
Apple supports their hardware (both repairs and software updates) for much longer than anyone else ever. While I have no qualms about opening up my Dell desktop, I wouldn't dare touch a laptop. Laptops, even before Apple soldered the components in, were extremely fragile. My Sony Vaio for example had a broken key and the only option was to replace the entire keyboard.
In essence, I'd rather go with a company that will fix my hardware even if they charge me for it. Rather than a company (like HTC/Sony) where I have to wait many many weeks if at all and software upgrades cut off in under two years.
I agree it is. I still build all my desktop PCs myself. I've been doing it for over 20 years, and will do so in the future. Back in the day you could really upgrade Macs. I had a PowerMac 7500 in which I upgraded the CPU, storage, memory and video cards, and even on the older Mac Pros you could do this. I think you can upgrade the CPUs on the newer Mac Pros, although it's a decent amount of work to do so.
I have tried linux on a laptop and the battery life was abysmal (it was an older Thinkpad X1). With macos (which I don't particularly like) I get a solid dev experience and reliable battery life. Plus, their hardware is untouchable IMO.
I have a linux desktop that works well, but I can't always work on it unfortunately.
I can't remember the last time I upgraded the memory/disk in one of my computers. It really is a non-issue for the majority. There are other issues that are much more important to me.
Apple SSDs have been historically leaps and bounds ahead of the competition. Apple started using SSDs driven by PCIe while consumers were all using SATA. Jacking up the price was justified.
Not anymore. Fast NVME drives are commonplace. Unless these drives are again significantly faster than the competition, which I doubt, that's just Apple doing their thing and not adjusting fast enough.
HN clearly isn't the target market. When HN frets about how it isn't suitable for them, it's just a lot of meaningless bluster and noise.
Though let's be real here. I have a 2018 MBP with 256GB of storage. I have Xcode, the Xcode beta, Logic Pro, virtually the entire Adobe gamut of software, brew and a massive selection of brew packages, every browser, IntelliJ, GoLand, and just a tonne of crap.
I've used about 150GB. As fair disclosure I have a USB 3 1TB 970 Pro in an enclosure that I use for the occasional massive file download, purely because I'm paranoid about flash exhaustion, though by the system metrics I'm still at less than 1% wear.
Yeah, someone buying this for their kid to do their homework is going to be completely fine with 256GB. Though it's worth noting that the next option is just $200 more and gives you a faster processor and 512GB. The lowest-end one is just there to frame the value, and presumably isn't their recommendation.
I agree that Apple has focused its attention exclusively on the consumer market... and that's a problem.
Developers write code for platforms that they're using. This is what drove adoption of Apple hardware in the early 2000s (a POSIX that runs MS Office!), which in turn set the stage for the iPhone and iOS.
Unlike then, Apple now has a good grip on the consumer market: not targeting iOS with a mobile/tablet release is a bad idea, regardless of whether or not devs are familiar with it. But OSX? There are a few areas where it still has strong devotees: color management in OSX is still fantastic, creating a lot of loyalty among artists, photographers, etc. It's still technically a POSIX, so it's still attractive to developers.
In short, OSX is not targeted at consumers. When someone needs a "computer" for their kid to do their homework, that's increasingly going to be an Android/iOS device.
For a platform to survive, it needs a healthy developer community. A developer community needs incentives (e.g., market-share or devs already using the platform). Right now, what are the incentives for OSX devs?
Not entirely sure if you intended to reply to my comment, however just to clarify I'm saying that this particular machine isn't for developers or hackers. Apple has different machines for different markets.
Developers buy MBPs. Industry users buy iMac Pros (which starts at 32GB RAM and 1TB SSD, going up from there) or even the goofy Mac Pro.
"When someone needs a "computer" for their kid to do their homework, that's increasingly going to be an Android/iOS device."
Lots of people do the vast majority of their "computing" on pads and smartphones, but they still like a computer on the desk for..."productivity". For many of those people, this device is more than adequate.
It seems crazy. I bought a 2TB M.2 PCIE gen 3 drive not too long ago for 200 EUR.
These drives are so tiny now they could easily offer an expansion slot which is accessible to the user, like the PS5 will. I miss the early-2000s era Apple who would actually do something like that.
Apart from the margin argument advanced in the comments (which I agree with), this TB3 machine can be upgraded with an external drive directly attached to the PCIe bus (that TB3 interface is faster than your disk, last time I checked, so supposedly no performance issue).
You may not find this adequate (actually, I don’t like it either) but it’s not an unreasonable position to take.
Having worked in the hw biz not only do memory sockets add to BOM, they reduce reliability (statistically — not on any one machine, but over your installed base)
How much is it to get a really good quality 5k display with a Dell?
> What kind of magic fairy dust SSDs are they using that you can't have a sensible amount of storage space at those prices?
I'm using 165GB of 500GB on my current system. Considering all the systems issued by work are configured the same and nobody complains about space issues, I don't think it's a big problem.
I'd take the 5k display over 1TB storage any time. Sadly my work issues laptops and not iMacs.
It's odd that you can't increase the storage for the £1799 / $2300 model. I don't think 256GB SSD is enough these days.
Personally, I am always inclined to buy my Apple products when travelling, as most times it's a lot cheaper in the US or Singapore. I mean $2300 (incl. VAT) vs $1799 × 1.08 ≈ $1943, roughly $350 cheaper.
> What kind of magic fairy dust SSDs are they using
I don't know about magic fairy dust, but Apple's SSDs are known to be phenomenally fast in benchmarks. $300/TB compared to $200/TB aftermarket for a high-end NVMe SSD. Is that a markup? Yes, about 50%, but it's not 300%.
Apple uses a file system especially designed for fast file transfer speeds, and guess what benchmarks they're using to declare their SSDs superior? With newer M.2 drives like the 970 EVO or Sabrent Rocket Apple has no valid excuse for charging so damn much for storage...
> With newer M.2 drives like the 970 EVO or Sabrent Rocket Apple has no valid excuse for charging so damn much for storage...
Can you show any other major manufacturer that includes a competitive internal drive for significantly less? Because otherwise you're comparing very different things. Dell also upcharges $300 for a 1TB NVMe SSD.
"If I buy it separately and install it myself" isn't a valid market comparison. If you want to do that, then do it.
Not presumably, definitely, at least in benchmarks. Max transfer rates on a current-model Gen4 SSD can reach upwards of 5GB/second, but a top-end Gen3 model is faster at other things. The Samsung 970 Pro is still the fastest NVMe consumer disk in almost all categories, besides max transfer rate, due to its use of 100% MLC instead of QLC with an MLC cache.
For all of the handwringing over SSD endurance figures over the years, the fact of the matter is that normal use won't ever get close to using it up. The only people who ever really needed to worry about those figures were doing heavy data processing every day.
I actually do heavy data processing but not every day.
It can be hundreds of RAW images (50MB/image) or the simulation software I'm working on which can generate GBs of output in 3 seconds if I leave the wrong flags on.
In my other workstation I can actually see the big jumps of write accumulation when I do these tasks via SMART (I log the data periodically).
So mine was an honest question rather than covert fanboyism.
The Sabrent Rocket 1TB, a reasonable, fast $130 SSD, has an endurance rating of 1665 TBW. To have hit that rating, you'd have to have written 760 GB to it every day over those six years. At 5%, that's still 38 GB a day, every day.
If you were spending more to get better endurance, you could have just bought the 2TB version, which would have guaranteed 1422 GB/day.
(NB: SSDs suffer write amplification, so if you actually need 1.4TB of writes per day you should choose an enterprise drive instead.)
Endurance isn't a problem with SSDs for consumers, at all. Even QLC, which has significantly worse endurance than TLC, has plenty.
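For anyone who wants to sanity-check those endurance numbers, here's a minimal back-of-the-envelope sketch in Python. It only uses the figures quoted above (the 1665 TBW rating and the six-year service life); nothing here comes from a datasheet I've verified:

    # Back-of-the-envelope endurance check using the figures quoted above.
    TBW_RATING_TB = 1665      # rated terabytes written (Sabrent Rocket 1TB, as quoted)
    SERVICE_YEARS = 6         # service life assumed in the comment above

    days = SERVICE_YEARS * 365
    daily_gb_to_exhaust = TBW_RATING_TB * 1000 / days

    print(f"Writes needed to exhaust the rating: {daily_gb_to_exhaust:.0f} GB/day")        # ~760
    print(f"Even at 5% of that pace:             {daily_gb_to_exhaust * 0.05:.0f} GB/day")  # ~38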
I'm not aware of the TBW rating of the Mac's SSD. I didn't drill into its SMART stats; maybe I should do that. My home workstation's home directory is on a Samsung 860 Pro, the 256GB version, which is rated for 300 TBW.
I have a slower storage tier on that computer for big files and archives, so it's only hammered when I really need that speed.
When I bought the Mac, a 1TB SSD was top of the line. 2TB was not available for mid-2014 Macs. Actually, I upgraded everything Apple would allow while buying it.
I'm aware of the dynamics of SSD writes and TBW values. I personally don't write that brutally. Things get hot if I'm processing images or working on my software and need to see some detailed logs or interim results along the way.
If I was using my computer as an ordinary user, I'd not worry about it at all. My family's computer runs on much simpler drives and their write volume is nowhere near me.
Actually, my method is very simple.
Every 10 minutes I log the Total LBAs Written attribute from the SSD to a log file alongside a long time stamp.
Periodically I graph the data with GNUPlot. There's a slight slope most of the time. In some regions there are jumps. These are generally when I seriously work with my software, generating logs and other output.
System is on another SSD and swap is on a high performance HDD so, they do not affect the graphs I'm getting.
Those jumps helped me catch two KDE bugs. One was an isolated case with Akonadi. The other was reported, and KRunner's bookmarks extension will possibly see some more revisions to eliminate disk thrashing.
Addendum: when I have spare time, I'd like to write a small HTML/JS file to pull the data from the log file and graph it interactively. A fun and useful project.
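In case anyone wants to replicate the logging step, a minimal sketch in Python would look roughly like the following. Assumptions (these are mine, not from the setup described above): smartmontools is installed, the drive is a SATA SSD at /dev/sda that reports a Total_LBAs_Written attribute (NVMe drives report "Data Units Written" instead), and the script is run from cron every 10 minutes. The device node, log path and parsing are illustrative placeholders.

    #!/usr/bin/env python3
    # Append the SSD's Total_LBAs_Written raw value to a log file with a timestamp.
    import re
    import subprocess
    from datetime import datetime

    DEVICE = "/dev/sda"            # hypothetical device node; adjust for your system
    LOGFILE = "lba_written.log"    # hypothetical log path

    # 'smartctl -A' prints the SMART attribute table; on most SATA SSDs the raw
    # value of Total_LBAs_Written is the last column of its row.
    out = subprocess.run(["smartctl", "-A", DEVICE],
                         capture_output=True, text=True).stdout

    match = re.search(r"Total_LBAs_Written.*?(\d+)\s*$", out, re.MULTILINE)
    if match:
        with open(LOGFILE, "a") as log:
            log.write(f"{datetime.now().isoformat()}\t{match.group(1)}\n")

The resulting two-column file plots straight into GNUPlot, as described above.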
The only constraint on price is that it must be greater than cost. After that, price is determined by what the customer is willing to pay based on the value they receive.
I realize that games these days are full of high detail world-building, but I shudder to think that 250-500 gb for an install is considered commonplace these days...
Yeah, entry level should be a 512GB SSD and 16GB of RAM for any "premium" desktop in 2020. I'm sure some bean counter at Apple has calculated that arranging the options in this way makes more money for Apple per unit, but I can't tell you how many times I have looked at an iMac or Mac Mini and thought "that's a pretty good price", then add the correct amount of memory and disk capacity and noped right out of their website.
Schools like buying Macs, and they don't particularly care about performance. The low-end iMac has always been a top seller to lower education because a kid isn't going to complain that their computer takes four minutes to boot.
Schools have historically liked buying Macs; I'm not sure how true that is anymore. Chromebooks have swept the schools, both because they're very cheap and because Google has, from what I've read, just terrific classroom management software compared to most competitors. More recently Apple has been pushing the iPad into education pretty hard -- competition from the Chromebook is arguably why the $329 iPad ($299 for education) exists. Macs are increasingly positioned as being for teachers and administrators, not students.
Maybe I misread the comment but it was a response to someone noting how expensive they were... so I read it as saying that they're supposed to be that expensive so schools would buy that. That seemed odd.
Until you have to manage a lab full of Windows machines, then a Mac becomes a much better value. With a Mac you don't have to buy MS Office, you get iWork. You also get iMovie, GarageBand and other Apple software that covers almost everything a typical school would need. And you never have to buy updates for either the OS or the other Apple software. You also have less malware to deal with.
Having worked at a school with labs full of Windows machines, the real cost of IT support is not to be discounted.
Maybe a Mac isn’t the best solution for everyone, but in the school environment, they’re great. Most schools have kids doing video projects, presentations, desktop publishing, etc. —- tasks for which an out of the box iMac can handle with aplomb and with minimal hassle or additional expense.
Sure there are probably FOSS alternatives, but if you are the computer lab teacher, the last thing you have time for is managing a lab full of temperamental Linux boxes and the multiple flavors of FOSS software alternatives that frankly, aren’t that good compared to what Mac gives you included. For example, what’s a FOSS version of iMovie that works so powerfully and intuitively?
Spend later like looking for the one guy who has a Mac in the region and knows how to fix problem x? Or spending later like sending kids to extra courses because they have no idea how to operate the most common office suite on this planet? Spend later like taking that one broken device for a xxx km tour to the next Apple store? And so on...
I wonder how people here are unable to understand basic reality. Especially in third world countries.
I wish I could use it as an external display. I would buy one as my personal machine and then use the display during the day with my MBP for work. Would be nice in this new world of remote work.
Always found it irritating that they never brought back target display mode. It was forgivable on the first retina iMac, as there was then no widespread connectivity for 5K screens (iirc that Mac internally used overclocked DP1.2 or something), but once Thunderbolt 3 was available there was really no excuse.
Though maybe I should be thanking them; I'd almost certainly have bought one if target display mode was available, so they've saved me money!
You can always boot from your mbp? Treat it like an external bootable storage system. Yes you won’t be able to use the display on the laptop, but you will be able to run your applications on the iMac. (Assuming that matters more? For me it does. I can’t stand how slow docker is on my mbp, pretty much puts my cpu to 100% and has the fans running at full speed)
I wanted this so much. Turns out that doesn't work with encrypted drives, which for me are mandatory. Target disk mode only allows booting from non-encrypted drives, or perhaps only non-T2 MacBooks.
Never used it, so I don’t know how well it works, but https://support.apple.com/en-us/HT204592: “With target display mode, you can use your iMac as an external display for another Mac.”
From that same article: "This article has been archived and is no longer updated by Apple."
Because Target Display Mode is no longer supported:
"Make sure that your iMac is using macOS High Sierra 10.13.6 or earlier. You can't use target display mode with later versions of macOS, or with Boot Camp and Windows."
Having mentioned that, it'd be awesome if we could plug it into a MacBook Pro via USB-C and use it as a power supply, an eGPU, and an external monitor. And maybe also mount the disks.
I can't believe it took so long for Apple to add SSDs to the basic configuration. I had the misfortune of coming into a company who purchased 2017 21" iMacs and they were almost worthless with how slow the fusion drive is.
A lot of Mac users get the computer and never plug anything to it besides a phone or some USB sticks. If all you want is a computer you can read your e-mail on, 8 GB is fine. Memory, AFAIK, is upgradable post-purchase in all current models.
Also, the three standard configurations (i.e., the ones you'll be able to walk into an Apple store and pick up, rather than ordering a different configuration online) max out at a 512 GB SSD. It's just bizarrely small.
It just feels wrong to buy anything new from Apple with "Intel Inside" - when they're clearly going to dump it soon, in favor of ARM (excuse me for not using the cringey term Apple Silicon).
This (and the 16" MacBook) are well timed upgrades. Apple is giving their more conservative/ cautious users a good upgrade choice right before they launch their new line. So if you want to stick with Intel, you can buy a system now and put off making the riskier move to ARM for a few years.
I'm sitting around waiting on the 2020 CPU revision of the 16" MBP. Will be getting that top GPU too. Just don't want to buy right now if we might get that CPU upgrade in the next 1-3 months.
I think this is every Apple Intel platform. My i7 6-core Mini can manage about 30 seconds at full clock if they're all lit. One problem seems to be that the fan hysteresis is too long, it spins up the fan too slowly and the CPU is already throttling down before it reaches full airflow.
Why does it matter? There will be a period where the Intel Macs are still supported and probably more stable, and it could be ~2y before your upgrade is available if it's ARM based.
If you have a retina macbook pro from 2014~ it definitely makes sense to upgrade to a macbook pro 16" today. The thing will see you through another 5-6y easy.
My fear isn’t around Apple and OS-level support. My fear is around smaller or indie developers who are already being asked to support macOS, Windows, Linux, iOS, and Android and now have to support another target. It’s not uncommon to see these devs not supporting Android or Windows because they can’t keep up. So how long before those devs drop support for Intel Macs (or don’t support ARM Macs for several years)?
For most apps it’ll be fine, emulation and cross compiling will be enough. There will be some apps where this isn’t enough though. When the App Store switched to 64-bit, how many apps just disappeared, never to be seen again?
I fully support the transition and I think it’s a logical step for Apple, but it’s going to be one more thing that indie or smaller devs have to worry about. Which means it’s one more thing everyone on this forum has to worry about.
It's trivially easy to build software for both ISAs from within Xcode, unless, of course, you are playing with Intel or ARM intrinsics or other pathologically non-portable thing, in which case it's anything but trivial.
It's not much harder than supporting Linux on x86 and ARM.
> It’s trivially easy to compile your iOS app for 64-bit as well, but like I mentioned... that transition left a lot of dead apps in its wake.
Most of those "dead" apps were not under active development and had no income flow. Making a "trivial" change to an inactive project means rebuilding software which you have shelved for some time.
If you are actively developing software, adding Intel support to a build is pretty much just clicking the option to build a Universal binary. Adding a flag to an existing build is a lot easier than dusting off an old app and making a bunch of tweaks and a new build with near-zero chance of return on that time invested.
Some packages don't support ARM because there is little demand for that. People won't use RPis for heavy number crunching or any demanding application. Until recently, you could barely go beyond 2 GB of RAM on any ARM platform and still very few ARM boards will support more RAM than that. Another issue is that most ARM hardware is underpowered and it takes a long time to build and test software on it. Not everyone is willing to do that.
Apple is going to change the landscape with a mainstream ARM platform that can compete in specs with Intel desktops and laptops. Let's see what happens then.
The dynamics of proprietary software are somewhat different from open source - unless you make a steady revenue from an app you launched some time ago, there is little incentive to dedicate resources to port it to a new platform (even if it's from 32 to 64 bits) unless you expect it to drive new sales. There is no penalty apart from lost potential revenue and a huge incentive to focus on the next product.
Apple only supported PPC for about 3 years after the Intel transition... what are you basing 5-6 years of support on? Apple hasn't made that promise.
And what user that is ok with a 6 year old computer is interested in spending several thousand on a new macbook that they're going to have to throw out in 3 years? The only user I can imagine that would be ok with the 16" macbook are the users that upgrade every 2-3 years... they can risk it and likely won't lose out on anything except maybe the resale value.
I still run my 2014 MBP model because it still seems better to me than the newer models.
No Touch Bar, classic function keys instead; HDMI output; 2 DisplayPorts + 2 USB 3 ports; MagSafe!; and most importantly a card slot I have put a Nifty 512GB card into, so I have 768GB of space, of which 256GB is from the SSD. Works perfectly.
Having a small SD drive is so incredibly useful when you can buy half a terabyte of extra space for $100, granted not that fast but still useful as storage.
The newer models are so incredibly expensive they make no sense to me when they remove features.
Fair point. It does make sense if you really need to upgrade a very dated machine.
However, I am assuming since the Macbooks are rumored to get ARM as early as late this year, it is likely iMac would follow next year too. The Developer Transition Kit has a Mac Mini with ARM already.
Yeah I agree. Since this is an iMac thread.. I think notebooks will definitely be first to go ARM, and I wouldn't be surprised if the iMac user base would prefer that native x86 based processor for at least the next 5 years or so.
I’m guessing intel will be supported much longer than we think. They just (relatively speaking) launched the new Mac pros and I don’t see them switching that over to ARM quite as quickly.
Apple stopped shipping updates to PPC systems after the 2 year transition was over last time. So if you buy today, you’ll need a new system in less than 3 years
Yes – anyone saying otherwise is an early adopter (so most of this site), but for the average user, an ARM-based device won't be perfect at launch. Plenty of apps (and games in particular) will outright not work; buying an Intel device now will mean support for 3+ years and avoiding transition pain, even if you miss out on 'new shiny' for a while.
>It just feels wrong to buy anything new from Apple with "Intel Inside" - when they're clearly going to dump it soon,
I'd be very surprised if Apple Silicon is going to be competitive against the Xeons in their highest end machines any time soon, so they'll be bound to supporting Intel at the OS level for the foreseeable future. They've basically said as much. I suspect you should be able to get about 5 years, at least, before they start phasing things out.
I don't think it'll be close to that long. Rumours are that they'll release an 8+4 core this year, which is enough for everything bar the top-end iMac Pro and Mac Pros, and I'm sure they can get out a 16+N core by 2021 or early 2022. That would be as fast as any Intel CPU on those models. And heck, the margins on those models are so large (they sell a $3000 Xeon W for $7000) that they could comfortably push past yield issues to throw out a larger die than that.
Good point. I still think 5 years is a realistic time-horizon for how long before they start phasing Intel stuff out, but that could just be like that Bill Gates quote about how people tend to way overestimate how much progress will be made a year from now and way underestimate how much things will change 10 years from now.
Nah I think it's good news that apple is continuing to release Intel-based devices. It means all of us with existing Intel macs will get support for years and years to come.
When buying a Mac one should factor in that they'll get a new one in 5-6 years (or earlier). In which case, buying an Intel now is fine -- there will be too many Intel Mac in 2-3-4-5-6 years for it not to be supported, and when it's near its last legs, it would be time to buy a new laptop anyway...
The chip also contains a bunch of coprocessors which have little or nothing to do with ARM, e. g. the GPU.
That’s one more reason to call the chips Apple chips, not ARM chips.
Anyone in music should stick to Intel, maybe for as much as three years.
The whole industry of music software is kind of weird… their products tend to be somewhat “closer to the metal” than typical software, even a bit more than video, graphics, CAD and 3D… improvements and bug fixes are few and far between.
Also, like most artists (in my experience), musicians tend to learn one way of doing things very very well and getting them to try anything new is next to impossible.
But your use of ARM is not encompassing of the entire change here. Apple Silicon is the whole SoC environment. All the custom cores, power management, specialized hardware decoding parts.. etc...
This is far more than an instruction set change.
I guess people who buy Intel Macs now know what they're getting into. Plus, Intel Macs will still hold strong for at least another 3-4 years minimum, and they'll definitely live a lot longer.
This essentially confirms Apple's strategy to just refresh existing models/chassis without redesigns going forward; the new Apple chips will be used exclusively for redesigns/relaunches like the MacBook.
Smart move since it increases the appeal of the Apple-powered Mac devices. The pricing difference will also help, as based on the expensive base configurations (RAM, SSD as others have pointed out here), it will make the new devices look that much more attractive.
Seems like this would be part of the reason the Mac Pro launch was such a teeth-pulling exercise. It crunched them on both sides. It was just too long of a wait from the old design (which couldn't be refreshed for thermal reasons).
I have a first generation iMac 27" 5K (late 2014.) They were shipping with 8 gigs back then and I immediately upgraded to 24G. It's ridiculous that 8 gigs is still the default config! I recently ordered one of the new Raspberry Pi 4's with that much.
My wife has an 8 GB Macbook Pro and she's quite happy with that. For most uses (and she usually has a dozen Word documents open at any time) it's perfectly fine and the machine is quite snappy.
OTOH, I'd expect the average HN user to need a lot more. I'm currently finding 16 GB a bit constrained and would be delighted to have 32 in the same footprint. The server under my desk has 64 and the swap indicators tell me it should have more (or less stuff to do).
It's so the "starting at" price is low but they can mark up RAM and Storage upgrades when you actually go to order as if they were the expensive piece. They are making something like 80% margin when you max out the RAM, I assume similar for the "SSD storage" of ambiguous type.
Dell did the same thing with the RAM in a 7740 and 7750 I recently ordered (work and personal), ironically about the same markup.
I swear that the MBAs are ruining the world. This is the crap that mature industries pull to grow profit since they've saturated other growth channels. Instead of product growth, MBAs come up with product bundlings to maximize profit by obscuring the true cost to customers.
When we read "10th generation Intel Core" we have to go and research whether that means a real 10th-generation part (like the i5/i7 MBA, some MBPs) or just another rebadged Skylake (6th gen) derivative.
Why do you care? They're the fastest cores you can buy and 10 of them in a box is pretty respectable on throughput terms, too. They're all just the umpteenth revision of the pentium pro, if you're trying to be reductive.
All 10xxx parts are either Ice Lake (10nm) or Comet Lake (14nm), none are Skylake. I mean, if you want to get technical both are "Skylake derivatives", I guess.
The uarch and core differences between Skylake, Kaby Lake, Coffee Lake and Comet Lake are negligible. The most significant changes were claims of hardware fixes for some Spectre and Meltdown variants in Coffee Lake, which is still a really minor change. The minor IPC differences arise from different cache sizes and interconnect.
I still find it fascinating how the choice of Intel chip generation is almost an afterthought (well, maybe only for a minority, but do most buyers look carefully? are they aware?) in the difference between models or computers when you click to buy something.
Is this a strategy of Apple (or Intel) to be able to get people to ignore (and therefore they can optimize the cost of) the generation of chip used? It's almost as if they realized people fixate on the GHz figure, and overlook the chip generation, which (I guess?) must contribute much more to effective clock speed than a small difference in Hz rate? I even fall into this trap sometimes.
It is a bit complicated for the casual buyer: the clock speed, which used to be the metric of "goodness" for your machine, now has to be weighed against the generation of the processor, and the name of the processor itself doesn't provide clear differentiation between generations.
I think part of this is driven by the fact that Intel's chip generations are a bit of a mess at the moment. It takes them over a year at this point to release a full line of new-generation chips, and even within that generation you get a mix of tech thanks to Intel's fabrication struggles.
Obfuscating the chip generation is almost a necessity at this point because the variation within and between generations is too complicated to keep track of.
I still don’t fully understand this world, I ‘downgraded’ from an i7 to a slower i5, and my computer got faster. I wish they’d use sensible clear names for the generations so you can clearly understand what’s better.
The first number after the i# is the generation. i5-5350 is fifth generation, i3-9350k is ninth generation.
That said, the difference between generations is pretty minimal thanks to Intel's huge process blunder last decade.
If you want to know which chip is actually faster for your workload that turns out to be a fairly difficult question to answer. A lot depends on what you're trying to accomplish and how your code is built.
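A toy sketch of that naming rule in Python, treating everything before the last three digits of the model number as the generation. This is just a heuristic covering the common four- and five-digit Core model numbers; it ignores suffixes like K/U and older naming schemes, and it isn't an official Intel decoder:

    import re

    def core_generation(model):
        # In 'i5-5350' or 'i7-10700K', the digits before the last three
        # are the generation (5 and 10 respectively).
        m = re.search(r"i[3579]-(\d{4,5})", model)
        return int(m.group(1)[:-3]) if m else None

    for name in ("i5-5350", "i3-9350K", "i7-10700K"):
        print(name, "->", core_generation(name))   # 5, 9, 10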
And I would point out, the level of chip model detail you gave above isn't even provided in the buying screens! For some users this is ok -- does the fact of not providing it give some hint about what level of user is being aimed at?
I think even for the Mac Pro, they don't go into this detail.
Your comment seems a bit disingenuous - how much time passed between when you got the i7 and when you got the i5? Today's i5's can be faster than long-ago's i7's but todays i7's are still far faster than today's i5's.
Their comment isn't disingenuous - that's just how a regular, normal, non hardware nerd would most likely interpret Intel's branding. Bigger number = faster, as far as consumers are concerned. If you understand the i3/i5/i7/i9 tiers, you're in a small minority.
I think so, the only time Apple seems to have highlighted the use of Intel chips was during the transition from PowerPC architecture. They focus on the value of the computer as the entire package.
Interesting deviation is how they've been highlighting the performance of the A-series chips since the A7.
I bought a 2012 iMac with 8GB of memory as the baseline. We’re more than halfway through 2020 and Apple is telling us 8GB is still just fine for a >$2000 computer?
It seems very important to Apple to get the T2 (similar to an iPhone 7 CPU) controller chip into all the Macs. But it seems unable to accommodate internal HDDs or Fusion drives. And most of the 21.5" models are bargain-basement (as far as that goes for Apple) models. Fusion drives were an interesting digression, but obviously not the wave of da future.
Well they can quite frankly fuck off with that one.
The 32GB additional memory option costs 4x the price of the entire 32GB of RAM in my desktop. The upgrade alone costs 60% of what my desktop PC base unit cost.
The bottom end unit features 256gb SSD (probably soldered) and the next tier up wants +£200 to go from 512 to 1TiB. My entire 1TB Samsung evo plus nvme cost £165.
The 5k display isn’t worth it for my use case so I’ve got a 27” iiyama 4K display I paid £379 for. I suspect for 90% of people the story is the same.
Add the shit show that has been Catalina so far, and this is comedy-level gouging.
Edit: equivalent spec of my desktop build is £3399 here. I paid £1520 for my desktop approx: ryzen 3700x, 32gb ram, 1TB ssd, gtx1660, 27” 4K display, decent mechanical keyboard, decent mouse, Windows 10 pro.
Maybe just put Apple out of your mind and don't get so angry about it. They're clearly not trying to compete for your business / segment of the market.
I'm also part of the Apple cult for now. Every new computer hardware release just keeps getting frustrating. The only thing I like about Apple computers now is just Mac OS. Their hardware has been waning for the past few years. What people want is an affordable headless iMac in desktop form. They were just tone deaf when they released a $5k desktop. I don't want a headless Macbook ala Mac Mini
I still feel good about iPhone, iPad, and Apple Watch at this moment. However, everything else hardware wise feels like a disaster to me. I mean where is the home pod mini? Homekit will never be taken seriously or as useful as Google Home and Alexa, until Apple finally releases smaller, less expensive smart speakers. "Hey Siri" just doesn't work well on current Apple devices.
This. Also they ship the Mac mini and no reasonable display options themselves leaving you at the mercy of LG (unreliable and almost completely unavailable) or 1.5x display scaling on other 4K monitors or a monitor which costs three kidneys attached to a stand that costs another one.
And let’s not get into the overpriced input devices. Wobbly keys and a mouse designed much like the furniture at the start of Men in Black.
Gah I’ve just convinced myself to sell the Mac mini and the iMac I already have now. And probably the iPad I’m whining about this on. Yes I’m a member of the cult. I have a whole cupboard of white boxes. But the pc pays the bills.
Obviously Apple is not courting the pcpartspicker market.
This is competing against products like HP Envy 32, Microsoft Surface Studio 2, etc which are priced similarly. You pay a premium for the design and form factor.
Ah yes the design with the ports on the back that get scratched up, the unopenable case without destroying the screen cables, the dangerously fragile corners, the thermal issues, the proprietary storage thermal monitoring and the fact it’s all in one lump so if something goes wrong you’re in trouble.
I have repaired a few and own one (2015); I think the design is absolutely dire.
As someone who never purchased an iMac, but enjoys Apple's MacBook line of products - it blows my mind that people are praising them for this. Considering the price point, this should have been the case like 4 years ago.
I had the same reaction. It pains me to use a machine that has an HDD... I had no idea that Apple were selling new, £1000+ machines that relied on spinning rust.
Nice spec-bump for those, who want to stay with Intel-based Macs a bit longer (e.g. depend on x86-virtualisation, running Windows). While I was about ready to upgrade my current iMac in 2020, I am going to try to hold out till the step to Apple Silicon. The one thing I would hope for would be a bigger screen. The 27" is a tiny bit too small, especially vertically. Considering that Apple once sold the 30" Cinema display, the step to 27" always felt like a step back. When going to a design with less bezels, they really should use some of the saved space for a bigger screen.
It's weird how restricted the choices are on SSDs too. The $1800 model is stuck at 256GB, the $2000 model lets you do 512/1TB/2TB, and the $2300 model adds options for 4TB/8TB.
Like, this has to be a totally artificial restriction. I'd much rather buy the cheap one with a bigger SSD, but instead I'm stuck paying hundreds of dollars extra for features I don't care about.
90% of people don't need more than what's in the iMac. For pros who would use it the iMac Pro or Mac Pro are the obvious choices. For gamers Apple has been ignoring the market and everyone knows you are better off building your own rig.
The 5700 XT is a very high-end GPU, especially the 16GB option (can you even buy a 5700 XT with 16GB for PC? I can't find any in the local price aggregator, all of them are 8GB). It isn't the fastest GPU you can buy, but it is still a high-end one.
If anything, this is the first time i see an iMac which has a good GPU. Though considering iMac's form factor and my experience of having a PC with a 5700XT i wonder about thermals.
Does the imac pro get the better camera and other features listed?
I used laptops for years until an RSI issue made me have to avoid trackpads. I can say I've been extremely pleased with having a desktop iMac. I use an iPad Pro when I need something portable. I like just having a station I can get in front of that signifies work, and the iMac Pro fans basically never make noise.
iMacs always had relatively better cameras than the laptops, and are not really limiting for most video chat usages (unless you're seeking 4k video or something). Their form factor allows real-sized cameras to be used, unlike a laptop lid that constantly compromises.
I'm not so sure - ARM and Intel are bound to meet at some point in the power/performance space because both share the same limitations in terms of physics.
I have an iMac Pro and can confirm the microphone is amazing. Miles better than the internal microphones on my colleagues PC laptops. (I don’t have a strong opinion on the camera, and don’t have an iMac to compare to.)
As I get older and wrinkled, I appreciate the lack of pixels and soft focus of a crappy camera. I'd be much happier with low noise in dark settings than more pixels. Kind of the same way I appreciated the Dell XPS "nostrilcam" that fools people about the hair on the top of my head in ways my Macbook doesn't.
I definitely agree with the general consensus here that this is a solid spec bump, and most welcome at that!
And like others, it made me think of their transition to ARM. I know the first ARM Macs this fall will be their lowend MacBook and 13" MBP line, but at WWDC they said the Intel to ARM transition would be complete in under 2 years (like they said with the PPC to Intel transition). And since the PPC to Intel transition only took a little over a year, it could mean that ARM-based iMacs may be here next year to replace the current iMac lineup, and I'm dreadfully curious as to whether they'll be lower performing than this iMac update (which is quite impressive, performance-wise). Or perhaps they'll use multiple ARM CPUs to boost performance. Regardless, every time I see a spec bump now from Apple, I can't help but think of these things.
Just my speculation, but rather than “lowering the specs“, I think they’re positioning themselves to release a “budget” option in the $800 range with a 4k screen, which will probably match performance with the low end intel iMac.
Matching encoding speeds with Apple Silicon is probably an easier task, since they can create on-chip, hardware accelerated encoders, which exist on iPhones already.
The current bottom end is $1099 and that's the last model with a low-dpi screen. I'd be surprised to see the price drop much below that when it gets renewed.
Just did the fun "what kind of car could I buy instead" of maxing out the specs on an Apple computer. 10 cores, 128GB RAM, 16GB GPU, 8TB SSD, 10GB ethernet comes out to $8799. ~ A used Civic/Corolla I guess. Not as sticker shocking as the maxed MacPro. Would love to see how the machine performs though.
I think I might be the only one who likes and wants bezels on everything that has a display. Extra large bezels. Perhaps the only exception would be phones, as bezels increase the size of the device.
Bezels are important to frame the screen content from a functional perspective. Think of it like a frame for an artwork. Black bezels occlude the distracting edges and background noise around the screen, and you always have a constant black border. They can be made to look nice as well, like some of those Bang & Olufsen TVs[1] and Sony Trinitron professional monitors [2].
Contrarily, there is no reason for the opposite besides "aesthetics". Can you think of any?
Alas, the momentum behind removing bezels is so massive that it is impossible to reverse this trend. Same with "borderless" trends in UI/UX.
I agree that bezels can look good, and small bezels are nice to frame the image as you mentioned, but the iMac bezels are neither small nor good looking.
I am arguing that they're invaluable even if they're no longer strictly needed, now that LCD technology has advanced to the point where near-bezel-less assembly is possible.
It actually doesn't matter if it's a screen, artwork or a photo. IMO bordered (black or white) photos are vastly superior to borderless photos. There is a reason why almost every museum that displays fine art photography uses borders.
I don't see any distinction with screens. Perhaps it's even more important to have distraction-free edges on monitors than on photos.
What on Earth is everybody's sudden aversion to bezels?
To be fair I'm biased because the shift to bezel-less (on phones) was accompanied by a shift to humongous screens that I can't use. Now I'm stuck with the new iPhone SE as my only viable upgrade.
Great machine and no Apple Silicon as yet (which may be a huge advantage if you still need to run Boot Camp or x86 VMs.)
I'm also excited about the nano-texture display. I've learned to live with reflective displays because I'm addicted to their high contrast and deep blacks, but nano-texture sounds like the best of both worlds.
Nice that they upgraded the anemic FaceTime camera and pulled the plug on the low-speed hard disk (even if it was configured as a fusion drive.)
Apple has generally had decent mics and speakers as well, so it's exciting that they are trying to make them even better. Anyone who has suffered through horrible Zoom audio will appreciate the benefits of good mics (and not using Zoom, which seems to make bad audio even worse.)
I'm sorry, I know it's a minor cosmetic thing, but those bezels make it look so dated that I can't really see it as new. Not that it matters since I will probably never buy an iMac again after experiencing the issues with dust getting beneath the glass of my last iMac 5k.
Presumably you cannot install Mojave on these, only Catalina. I sacrificed a Mac to Catalina, then wiped it back to Mojave. Hopefully Big Sur will fix the flaming dumpster fire that is Catalina, but I am not hopeful and working on my Linux migration plan.
Curious about the bugs in Catalina that made it useless for you. Have been on it since the public beta, and haven't really had anything that has made it impossible to do things I used to do. Care to elaborate? (Edit: I did on 10.15.13 have a bug where I couldn't access files and folders on the desktop, and had to repeatedly reboot. Downgraded to 10.15.12 and went to 10.15.14 when that came out and fixed it. Will admit that this was incredibly frustrating!)
I'm sitting here on a 2012 iMac with a fusion drive. This drive never seemed to give much performance, but it absolutely tanked once it converted to APFS. The computer became unusable to the point that I had to buy an external SSD and use that as my main disk. Now I'm very disappointed that it doesn't support Big Sur despite being only 8 years old. I will wait for Apple Silicon, because I don't anticipate that Intel will be supported for too long. Apple will make "core features" of their OS depend on something that isn't shipping in Intel silicon sooner rather than later.
How much RAM do you have? I recently upgraded my 2012 iMac from 8GB to 32GB and the performance is like night and day, I suspect because it has more memory to use for disk cache. It's a fully usable machine again.
I'm also on a 27" 2012 iMac. Have 32gb inside. I've had the internal drive fail a few times, once under warranty - another time due to age. I'm about to transition to an external SSD as my boot drive since the SSD part of the fusion drive is on its last legs, and keep the internal 6tb HDD for storage. When Apple silicon rolls around, I'm planning to update. Apart from the drives, the machine has been a total rock. No performance issues, apart from when the drives had their issues, but hard to complain this deep into its lifespan.
APFS is, as far as I know, designed for flash and performs really badly on spinning disks. They still provide HFS+ and ExFAT as options if you format an external spinning disk.
It depends on the macOS version. Only Catalina and higher (macOS 10.15+) require APFS; macOS 10.14 and earlier may use HFS+. That is, though High Sierra and Mojave both "want" APFS, they may be coaxed (cloned) into booting from HFS+. [0]
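For anyone who wants to keep an external spinning disk on HFS+ rather than APFS, diskutil handles that from the command line. Here's a minimal sketch driving it from Python; the "Backup" volume name and "disk2" identifier are placeholders I made up, so check the "diskutil list" output and substitute your own external disk before erasing anything:

    # Minimal sketch: reformat an external spinning disk as Journaled HFS+
    # instead of APFS. "Backup" and "disk2" are placeholder values.
    import subprocess

    # Show attached disks so you can identify the external drive.
    subprocess.run(["diskutil", "list"], check=True)

    # WARNING: erases the target disk. "JHFS+" = Mac OS Extended (Journaled).
    subprocess.run(["diskutil", "eraseDisk", "JHFS+", "Backup", "disk2"], check=True)

The same eraseDisk call with "ExFAT" instead of "JHFS+" works if the disk also needs to be readable from Windows.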
Just when you think they've forgotten the iMac. Can't wait for Barefeats to post the benchmarks and compare the top-level 27" to the Pro. The 27" is interesting now because you can get 10GbE.
One gripe is the 27" iMac comes standard with only 8GB, while the 15" MacBook Pro has came with 16GB standard since 2014... I'm guessing most people are going to the aftermarket to throw 32-64GB in there anyway. Apple offers the 16GB upgrade (from 8GB) for $200, but you can buy 32GB of RAM for $150 and install it yourself.
RAM on iMacs has been user-upgradeable since at least 2014. I have a 2014 27" iMac and a 2017 27" iMac, and I upgraded the RAM on both of them. It's not a crazy "disassemble half the machine" type of job either; it's really easily done via a panel in the back.
Only the 27" model has the access panel, the 21" model doesn't. I'm not actually sure if the RAM in the smaller one is soldered on or if it even has additional slots. But even if it does replacing it isn't something you can do easily.
iMacs have always had upgradeable RAM, just about the only user-serviceable part in them. There's a little door you can open on the back by the power connector. https://support.apple.com/en-us/HT201191
Haven't the iMac and Mac mini always offered (relatively) easy memory upgrades? I've upgraded probably 10-15 of them at work. They just use the smaller laptop modules instead of desktop modules, and (usually?) only support 2 modules. The Mac Pro offers it too, though that's a different style of computer entirely.
The current 2020/2018 Mac mini (they are the same) does use DIMMs, but you have to almost entirely disassemble the computer to replace the RAM, which is not easy. Some (2014?) Mac minis didn't have replaceable memory at all. The 27" "regular" iMac still uses a design from late 2012 with a simple door behind the stand hinge that gives very easy access to the RAM. Even the iMac Pro has to be totally disassembled to change the RAM!
The post-lamp iMac has always offered user-replaceable SODIMM slots in numbers sufficient to occupy all of the CPU's memory channels. The late-2009 model had 4 on the bottom, the 2015 has 2 on the back, these new ones have 2, and the Pro model has 4 (because the Pro CPU has 4 memory channels).
I'll never understand the pricing. $1000 extra to get 64gb of memory? Incredible. What circumstance leads a purchaser to look at that and say, "ok sounds great?"
My boss. I mean, we don't have unlimited cash to throw away, but let's say I'm going to keep a new Mac for 4 years (which is a lot less time than I'll probably actually have it, but we'll round down). I work about 2,000 hours a year, or about 8,000 hours over the time I'll own it. That $1,000 amortizes out to about $0.13 per hour, which is a rounding error compared to salary, benefits, air conditioning at the office (back in the old days when that used to be a thing), etc. If it makes me 0.2% more efficient -- say by making compiles or "docker build" commands run faster -- then the company made a profit on it.
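To put numbers on that back-of-the-envelope reasoning, here's the same amortization as a tiny Python calculation; the $75/hour fully loaded cost is an assumed figure for illustration, not something from the comment above:

    # Back-of-the-envelope amortization of a $1,000 RAM upgrade.
    upgrade_cost = 1000.0        # extra spend on the machine
    hours_owned = 4 * 2000       # 4 years at ~2,000 working hours per year
    cost_per_hour = upgrade_cost / hours_owned
    print(f"Upgrade cost per working hour: ${cost_per_hour:.3f}")   # ~$0.125

    # Break-even: the efficiency gain needed to pay for the upgrade,
    # assuming a fully loaded cost of $75/hour (an assumption, not from the post).
    loaded_hourly_cost = 75.0
    break_even_gain = upgrade_cost / (loaded_hourly_cost * hours_owned)
    print(f"Break-even efficiency gain: {break_even_gain:.2%}")     # ~0.17%

At that assumed rate, anything above roughly a 0.2% productivity improvement over four years covers the upgrade.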
After 3 weeks of waiting, I just got my new iMac. Paid $4399 before tax. Old specs, same price as new one with improved camera, 2 additional cores etc. :-(
I can't use Zoom with the old model's camera without a 3rd-party tool, as the color balance is way off, making everybody purple/red. Sad.
Erm, no. During the introduction of Apple Silicon they stated that they will keep producing and supporting Intel-based Macs for some time alongside the Apple Silicon versions.
As expected. They're not going to do a radical form-factor redesign for an update with Intel chips; they're saving the all-new iMac design for the ARM transition for the extra wow factor.
Not sure if the value proposition is there though for a specced out iMac.
It doesn't matter what they release; they've already Osborne-effected themselves by announcing they're transitioning to ARM chips in a couple of years.
It looks like the last iMacs with (Thunderbolt) Target Display Mode came out in 2014, but the Retina (and later) models unfortunately don't support it:
It does not make any sense for me to buy new Macs anymore, especially when I have to run the new macOS, which takes SECONDS to open apps that open in MILLISECONDS when you switch off Wi-Fi or just use a good OS.
Also, I do not like the beta product experiments they have been running on customers for years; I will wait years before I trust their hardware and software to be useful for work or even personal things.
If they want me to beta test they should give me money for it.
I'm not really sure about that. It would probably be useful for artists and draftsmen, but I keep my iMac pretty far from my face while I would want to keep a drawing table pretty close. The Studio was really cool as a design concept, but I'm not sure how well it translated from showroom to daily use for anyone who isn't a professional artist.
And even for professional artists, I have to imagine the portability of an iPad Pro probably blows it out of the water for those use cases. Some people find it more natural to move the canvas around the pen than vice versa, so just being able to move the device/screen around is huge.
I wonder what the ARM-based iMacs will bring. There is a lot of talk about a fundamental redesign, and Big Sur really looks like it was designed for touch. If an iMac is to support touch and pencil, it would need a stand similar to the Surface Studio's.
This is likely the final iteration of this design, and the last one for Intel processors. They’ll save the redesign for Apple Silicon. They are definitely road-testing a few new technologies for that future iteration, like the new screen technology.
Wonderful computer. How much you wanna bet that in < 5 years they drop support forcing almost everyone that purchased one to throw it out and get a new one?
Firstly, you have to consider that Apple has said the transition to ARM will take about two years. So it’s going to have to release a few more Intel Macs next year too.
For the less than five years bet, if you’re referring to macOS updates, I can personally bet my entire wealth (which is a decent figure) that you’ll lose. There’s no way the Intel Macs released this year are going to lose macOS support for another six years (into 2026), at the very least. The latest macOS (Big Sur) that’s now in beta supports models that are seven years old (or even older than that).
If you’re referring to hardware support and repairs, there are standard ages for each model that Apple uses for that purpose to define what’s obsolete, vintage, etc. I’m sure these will follow the same cycle.
Yea, I got the timeline wrong. More like 10 years.
I've worked e-waste recycling, and the number of beautiful but unsupported Macs that were dropped off was very sad. To this day I consider Apple a significant force for evil on this planet, for the ecological damage of planned obsolescence alone, let alone the right-to-repair situation.
I've gotten Linux running on a few of them and they are wonderful machines. Sad that the owners had to throw them in the trash.
For 5 years, you're probably right. For 10 years, I think not. For example, the 2011 iMacs are stuck on High Sierra, which hits end of life later this year.
IMHO they should build some hysteresis into their cost structure, so when something much better has just crashed in price, they can install that much better thing even though they may have contracted their parts inventory at a higher price. They can eat the cost difference in order to deliver a better machine to the customer. It will be made up in time. For example, over a year ago (maybe two?) I was able to buy a 1 TB SSD, top speed available at the time, and a top brand, for just over $100, retail. They probably plan out their costs ahead of time, so they may have considered the equivalent to be a $200 part. Ignoring the fact that they don't pay retail, they could eat the $100 difference on paper, because future inventory will cost them even less. Sure prices go up sometimes too, but the general trend is down for tech as we know. Not sure why they are so stingy with putting the best into their products when I have the dollar cost amounts in front of me and they are not that high. /rant
Yeah, I know. They even provide you with a standard approved cloth, which is hilarious.
Still, the results are amazing, and anyone who dares to even get their filthy, greasy fingers near my screen, let alone touch it, is in for some serious trouble anyway.
In the case of a nano-texture screen, I should probably require people to sign an insurance agreement to be in the same room as the device.
There are people who need a new machine now, so it is good that they get an up-to-date offering. There are also people who need x86 compatibility for running virtual machines. And as long as Apple expects developers to support x86 Macs, they need to offer x86-based machines for those developers.
Honestly a fairly mediocre upgrade considering the pricing.
It's good that they've finally improved the webcam, but Apple clearly went out of their way to ensure people looking to get an Intel Mac because they need to dual boot weren't going to get anything too compelling.
Edit: Downvotes be damned, it is mediocre. Not bad, because it is a spec bump, but there are lots of things about the iMac's architecture that are beginning to show their age and are unchanged here, despite becoming bottlenecks. The cooling system is a big one, and storage controller support is another.
It includes a 5K display which may become useless in five years (when Apple plans to stop updates for Intel-based Macs). It seems that iMacs do not support being used as external displays right now. I would prefer a 5K monitor to last a bit longer than five years.
It has included a 5K display at a much lower price point before too, and that wasn't adequately justified by currency changes either, so I'm not sure that means anything.
The iMac had NVMe SSDs at least as early as 2015[1]. Apple has since moved away from NVMe modules and gone to raw flash chips hanging directly off the proprietary "T2" system controller chipset. I think it's fair to criticize the proprietary/non-upgradeable nature of this storage solution but benchmarks have shown it to be comparable in performance to standard PCIe 3.0 NVMe drives so calling it outdated is not really a valid criticism.
I’d also speculate this means iMac won’t be the first computer getting Apple Silicon. I wonder if it will be the last?
What’s the consensus guess now? Perhaps a new MacBook Air with good performance but the real “breakthrough” is > 12 hours battery life?