Look how big the escape key is on that keyboard, finally, revenge for the touch bar!
I like the direction Apple is going here, I assume they will have higher end iMacs later this year or early next year, and that's what I'll be waiting for.
I'd like 27 inches of screen, at least 16GB of RAM, but preferably 32, and that's about all I'd ask for. My current workstation is a 2013 27" iMac fully upgraded (Fusion Drive, 16GB, good dGPU for its time - still plays Kerbal Space Program fine), so I'm looking forward to more upgrades on the internals.
I've already run into the old WiFi and Bluetooth 4.0 as being bottlenecks and those are difficult to replace in an iMac by a normal person.
>Look how big the escape key is on that keyboard, finally, revenge for the touch bar!
Jony Ive's departure is the best thing to happen for Apple products in a decade. They're actually *gasp* listening to their users again. There's news that the new 16 inch M1 MBP's will even have physical function keys and more ports. I cannot wait.
I also heard that rumour about the ports, but I'm not sure I'm entirely convinced with them releasing this new iMac with only USB-C ports. If their desktop Mac doesn't have ports, why would their laptop?
As I said in another comment, M1 seems to be severely I/O-limited. The M1 Mac Mini even has an internal USB hub iirc to solve this. It's safe to say that the next generation of SoCs will be better in this regard.
It’s basically a laptop without the portability. Only 256GB of disk. No numeric keypad. Have to use a bulky charger. Have to carry dongles for the Blue Yeti microphone in USB 2 (and the printer). Need an SD adapter to download the photos from my reflex camera. Can’t plug in headphones.
It's funny/weird that they only put the ethernet port on the power supply. They could've put some more USB ports on it for connecting accessories in a clean manner. If you're using a docking station already, why the hell not do it properly? For this iMac the connection could've been USB-C, btw (but it probably can't be for the larger ones).
Furthering the idea of putting things into an external unit to save space on the desk, what if they also put the computing hardware into something like a power brick? Sounds crazy, I know, but the display could be very thin that way and there would be no need for thick chins or tons of dongles on the table.
While Ive was there, Macs had an extreme focus on being thin. Ive started at Apple on the candy iMac. Seems like the combination of both of these qualities is what we're seeing here.
iPhone 11 feels way better in the hands than iPhone 12 with its new rectangular design. It's night and day. Presumably, Jony didn't take the lead with iPhone redesign.
This one's pretty contentious. I strongly prefer the flat edges. It's one of the main reasons I've held on to my SE for so long. The 12 is the first phone I've considered upgrading to in years.
I can't see what the arrow key layout is on the narrower (non-numeric-keypad) version because a hand is over it. It looks like there's a big key where there would be space above the left / right keys on the most recent one. I've got one of the interim ones where the left / right keys are full size so it's difficult to know where they are by touch, and I'm still irritated by that. I hope they haven't gone back again.
Unfortunately the shot I saw showed the same big left / right arrow keys. Small improvements to the overall design, but that is still a huge tactile wart for me as well.
If I end up using one of those keyboards I'll probably replace the larger arrow keys with something that has the top half recessed, or just add a dimple/bar over them for a similar sensation.
Ugh, yeah - first image down on https://www.apple.com/uk/imac-24/. That's a real shame after they temporarily rowed back on both the touch bar and the full-height left / right arrow keys.
I think it's because of the more rounded corners - hard to see how they could have a shallow right arrow key with that corner radius. Sigh. Two steps forward, one step back.
If you search for the heading "All-New Design Enabled by M1" (near the beginning), then there is an image that shows the layout of the small keyboard. Further down, under "Touch ID Comes to iMac, Along with Color-Matched Accessories", there is a gallery, and the third image is of a full-size keyboard.
> I assume they will have higher end iMacs later this year or early next year
I wouldn’t be so sure. Unless their M1 strategy changes, I think they’ve temporarily gotten rid of iMac Pro entirely and are probably going to rely on the Mac Pro to fill that void instead.
The M1 was a stunning upgrade for the MB Air, but it is already entry-level in the 24" iMac, especially in the GPU department. Apple is lacking any desktop-grade hardware based on ARM, so a larger iMac with more cores and much more GPU power is desperately needed. Unless they plan to reintroduce the Mac Pro as a desktop Mac :).
I am not getting one of these - not because of the RAM limitation per se, but because I currently have 16GB already and buying the same amount "feels off".
At this point I'm convinced that 16GB M1 isn't comparable to 16GB DDR3/4 simply because of the architecture and that the RAM is so close to the CPU. We've already seen that they are doing some crazy efficient stuff with memory, but all that said, it still feels anemic for a desktop, and I suspect the response will be the same as it was for MacBooks: normal people don't care, tech people who love Apples will wait for the inevitable upgrade.
From what I’ve read, with the M1’s unified memory and the ability to swap extremely quickly, it’s like the SSD is more of a cache, making the limited RAM much less of an issue— https://singhkays.com/blog/apple-silicon-m1-black-magic/
Curious to see how this plays out. I would have expected apple to just extend the M1 a little longer rather than jumping to M2 since the M1 is still pretty limited in its robustness.
I don't care what they call it, but I'll buy one :)
I don't think we will see an M2 chip before next year. It's rumoured to be on 4nm, and yields will make it highly problematic to push a 4nm product out that soon.
I think Apple will try to stick to a proper one year cadence for the M1, M2, M3 line, just like they have successfully done for years with their A series.
But we might still see an "M1X" this year: an M1 with 6 or 8 big Firestorm cores.
Hard to tell. Apple might be planning to skip over the larger M1 in favor of putting the remaining products directly onto the M2 generation.
Interesting choice of colours. I suspect they look better together than individually.
They're certainly not for pro video or photography, where a coloured frame is the last thing you want. Which suggests they're aimed deliberately at the school, home, and hipster office markets.
Add me to this list of those waiting for an M1X iMac Pro with at least 64GB and no-compromise screen. And a black bezel.
So does this mean the M1X is delayed? Or this is Apple's tick/tock?
> They're certainly not for pro video or photography, where a coloured frame is the last thing you want. Which suggests they're aimed deliberately at the school, home, and hipster office markets.
1. A silver colored M1 iMac is part of the lineup, which is a neutral color.
2. A Mac with a 4.5K Retina display, Wide color P3, True Tone and support for their 6K pro display clearly isn't exclusively targeted at school, home and hipster office markets.
3. Apple will announce build-to-order configurations where you can get 16 GB of RAM as we get closer to April 30.
> So does this mean the M1X is delayed?
How can something that's unannounced be delayed? Apple hasn't announced a timeline for future M-series processors. WWDC in June is the next logical time such an announcement could be made.
Having said that, an M1X processor would make sense for the Apple Silicon versions of the iMac Pro and 16-inch MacBook Pro.
> How can something that's unannounced be delayed?
This is the benefit of Apple's policy of only announcing products when they're ready. It prevents bad publicity if there's a delay, and it also prevents people from holding out for the upcoming product instead of buying the available ones.
There was a great interview with the iMac product manager Colleen Novielli on the Upgrade podcast in 2019, where she talked about the tremendous number of iMacs used in business settings because of their look compared to black plastic Dell monitors. Think reception desks, salons, hotels, etc.
These colors will stand out in a way that really nothing has since the original iMac G3.
I'm super excited to see them figure out/be comfortable with a detached Touch ID.
I've wanted my watch to act as Touch ID, or my wireless keyboard for my Mac mini, and historically Apple seemed like they'd only allow physically attached Touch ID. With this product, it looks like they're providing a Touch ID keyboard that's wireless. I'm very curious how that works logistically, and whether they'll make the keyboards compatible with other Apple products.
I was super happy to see that option for sudo, and it worked great. Until macOS reverted it. Not sure if I did an update or what, but the change was undone.
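For reference, the usual way this gets enabled is a one-line PAM change, and macOS updates have been known to rewrite this file, which would explain the reversion:

```
# /etc/pam.d/sudo — add this as the first auth line to allow Touch ID for sudo
auth       sufficient     pam_tid.so
```

After saving the file, the next `sudo` in a terminal should prompt for a fingerprint instead of a password (until an OS update resets the file).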
Now if only I could somehow use the Secure Enclave on my Apple Watch to store WebAuthn and other U2F credentials, so the device I have on me most of the time could act like a YubiKey.
Is them releasing a keyboard (detached and presumably replaceable) with a Touch ID sensor basically an admission that the iPhone Touch ID did NOT have to be permanently paired to the phone to keep it secure?
Because in this case you can replace the keyboard, which contains the Touch ID sensor, and still keep the system secure.
I would not be surprised if the sensor is permanently paired to the keyboard's controller chip, and then there's a load of security involved there to keep you from swapping the sensor on the keyboard itself.
Still, better to replace a $100 wireless keyboard than a $1200 computer.
I was really hoping they would add touchID to the magic trackpad. I love my trackpad but there's no way I'm trading in my nice mechanical keyboard for a magic keyboard, and I doubt they'll add it to both peripherals.
They have facial recognition, but Face ID involves a complicated array of sensors and an IR projector to create a 3D mapping of your face. Iirc, the module is fairly expensive to produce which is why we aren’t seeing it on more products.
I'm self employed (engineer, not comp-sci or coder bro) and as my company's revenue grows I am more and more tempted to "treat myself" and integrate more Apple products (I have an original SE phone) for the ease of use and having "corporate" Unix products in the arsenal.
I see new products like these and instantly get excited. It looks so cool! It's thin, the screen is nice, looks like a great product! Maybe I'll quit RHEL on desktop and move on with my life!
Then I see the price, the specs, and the 2 USB Type-A devices plugged into my ThinkPad while I'm out of town helping an older relative. I had a Macintosh a long time ago when in college. Now I think I'm too far down the RHEL / ThinkPad corporate hole to ever go back, and System76 has ruined my expectations for power on a desktop level (I have a Thelio). Hell, my ThinkPad is 7 years old and the only hiccup is its low screen res.
As pretty as the UI is, I'm not sure I could get used to it from years on stock GNOME and a stint on XFCE.
I might as well start buying New Balance or Nike Monarch's while I'm at it.
I think it's worth it. I'm at the point where I think any non-Apple laptop is a waste of money. I own a software consultancy and charge about $700 a day for my time. My MacBook regularly has uptimes of over a month. Sometimes I work in an office with people using Windows machines. Sometimes their Bluetooth earphones won't connect. The mic doesn't work on Zoom sometimes. Doesn't wake from sleep. Things like that. That's what I'm happy to pay for. My last 2 Macs have both lasted 5 years with no problems, and if you ever do have one you can pretty much go into any Apple store and have it back in a week. That's the benefit to me. If the machines were 2x the price I'd still pay it.
In the process of searching to fix my issues I found many other people complaining about related problems. I don't have actual data on the issue, Apple would never release such a thing, but if it was very rare there wouldn't be so many others complaining.
I totally feel you on the Apple store part. That's another feature that is very tempting to me. Anything break? Fine, drive straight to the Apple store and get it taken care of.
Yep, anywhere in the world! And the nice thing is that a lot of people buy the products, so manufacturing issues are usually brought to light and you get an extended warranty for that issue.
The New Balance refers to the fact that I'm old and crusty. I work in engineering but am not a software developer (the majority of folks here are). Running Red Hat makes you old, too.
I'm not sure what Pop!_OS makes you in all honesty. I do know Arch makes you leet / insufferable, and RHEL does in fact make you old, boring, and corporate.
I'm really surprised that Apple decided not to have the Apple Logo emblazoned on the front of the iMac. It's a good looking computer but if you had shown me one before this event I would have assumed it was some other all-in-one PC.
I also can't help but think a black bezel would look better.
Apple's so popular that the device itself is the branding; no logo needed. Only the biggest brands have this luxury. I agree though - the logo would have made more sense.
Jony Ive is no longer working at Apple and he was the God-King of industrial design. I suspect the MagSafe charger disappeared because of his efforts to make USB-C the one cable to rule them all. But after all the trouble with dongles and the butterfly keyboards I suspect more sensible heads have taken over and we can have these nice things again.
I don't understand the skepticism; with Apple's design chops you'd think they would be able to make a USB-C MagSafe hybrid.
Hell, I can fantasize about one right now: make a specialized Apple USB-C charger cable with a plug a teensy bit smaller than standard USB-C, then magnetize the encasing of the plug. In addition, the furthest left and right side USB-C ports on the MacBook should be magnetic where this encasing would touch the port.
This would make it so that the plug itself is loose in the socket and is purely held in place by the magnetism.
Of course this means that particular cable is only suitable for use with MacBooks, as it would slip out of regular USB-C sockets. It should carry a clear MagSafe symbol because of that.
In a further iteration, that kind of MagSafe-C port could even be added to iPhones and iPads.
There are MagSafe-style USB-C cables. [1] I owned a similar one; it worked OK but was itself of bad quality. I'm pretty sure good-quality cables like this are very possible, though.
USB-C everything is a great idea; USB-C ports on MacBooks weren't an issue. It's that they were stubborn and kept Lightning on the iPhone while providing a Lightning to USB-A cable, which was ridiculous.
I guess that they would have used USB-C here too, but the power supply is 143 watts and USB-C only supports 100. As long as they need to use a non-USB-C connector, they might as well make it a nice one.
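For context, that 100-watt ceiling falls out of the USB-C Power Delivery spec's maximums at the time (20 V at 5 A, the latter requiring an e-marked cable), so the iMac's supply simply doesn't fit:

```python
# USB-C Power Delivery (before the later PD 3.1 revision) tops out at 20 V x 5 A.
pd_max_watts = 20 * 5                  # = 100 W
imac_psu_watts = 143                   # figure quoted above

print(pd_max_watts)                    # 100
print(imac_psu_watts > pd_max_watts)   # True: PD alone can't supply this iMac
```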
I think the worst thing is that the iMac needs to ship with a USB-C to Lightning cable in order to charge the terrible Magic Mouse via its ridiculous bottom port that prevents use of the mouse while charging. And even the brand-new Magic Keyboard still charges over Lightning. Lightning needs to die.
I had this question too, but a couple of days ago I tripped over the USB-C power cable for my MBP, and it just unplugged without pulling my laptop off the table or any damage whatsoever. So... maybe it fits this use case anyway?
But on the other hand, I'm not sure who has a desktop computer with cords draped across areas people are walking.
Photo studios for one. Or anyplace outside of a studio where photography is being done long term. I saw this at a factory where a photographer was hired to take pictures of all 14,000 (!) SKUs for a new web site.
iMac 2013: simply adjusting the screen yanks the cable off. iMac 2021: can’t wait to lose my work. It makes so much sense on a MacBook, where there is a battery to take over.
It wasn't marketed as MagSafe, just as a magnetic connector. And why not? Magnetic connectors are awesome.
Also, putting the power brick on the floor and extending it so you can route ethernet up to the desk via a single cable is a nice idea. I hope that is a hit with customers and they expand it to other systems (namely MBP) in the future, I wouldn't mind having a couple USB-C ports down there too.
The lightness, size, wireless connectivity, and boot speed of this device makes it more likely to move around the house than a traditional desktop or even the recent iMacs. I could certainly imagine scenarios in which someone purchases multiple power cables, plugs them in throughout the house, and then brings the computer from their desk into the kitchen while making dinner or something. That is already a common practice with iPads and physically this is basically just an iPad attached to a stand.
Except this doesn't have a battery, touchscreen or keyboard attached. It seems extremely unlikely for someone to do this instead of just getting a laptop or tablet for this price.
I think you'd be surprised. I've seen a few people with iMacs who treat them as portables and move them around a lot, and I kind of get it. It's a size that's big enough to be a nice screen for just about any room, and small and light enough that moving it doesn't feel like a big hassle. Really, having to plug it in is the only drawback, but for a lot of people the laptop always stays plugged in as well, so this isn't a total dealbreaker. The fact that it isn't a touchscreen and you need to move the keyboard and mouse is more of a hassle to that kind of use IMO.
I have an iMac pro as my main workstation in an open plan apartment/home office, and will often move it from my desk to the large kitchen table (to collaborate with someone when they're over). The extension cord I use is long enough to keep it powered on.
I quite like it, and with the recent shift to working from home and decrease in travel over the next couple of years I think I'll stick with this set-up for the foreseeable future: powerful desktop driving 3 5k retina monitors + cheapest laptop or ipad I can get away with on the go.
Hence the updates to the wireless keyboard, touchpad, and mouse.
>Seems extremely unlikely to do this instead of just getting a laptop or tablet given the price.
This is the wrong way to think about these features. Yes, if this is your only use case for the device, you are better off with a laptop or tablet. If this is something you do occasionally, but your primary use case is a desktop, the iMac can now serve both uses and save you from purchasing a second device.
No, I don't. My current setup requires several power bricks ("weird boxes") for the displays and speakers anyway. The good thing about those is that you can just hide them away, unlike your monitor. And what exactly is the problem with extension cords?
The nice thing about the All-In-One Macs was always how portable they were. The Mac Plus, the Colour Classic, the all-in-one Quadras, and every previous version of the iMac -- just a single box, a standard power cable, a keyboard and a mouse.
It was always a lot easier to move them around than the PC towers everyone else had. Apple even ran an ad once that showed how few cables iMacs had in contrast to a typical PC.
I mean, it's not a huge difference in portability, but it's an Apple product -- people obsess over details like this, and they pay a premium for a product that gets it right.
Also, it's nice that you can hide the external PSU in your setup, but often that's not easily possible. Eg. if you have a minimal setup with a sleek desk, there's just no place to hide the power brick. Or if you bring the computer to the living room to watch a movie... I mean, there's lots of ways to use this computer, and the external PSU is just optimized for one use case.
And it's made even worse by the fact that many people are going to end up having to attach an external hub with an extra power brick, since this iMac severely lacks ports.
EDIT: And you can see that Apple doesn't like the PSU since it is conspicuously absent from their marketing page. Even the "What's in the box" picture hides the PSU: https://www.apple.com/shop/buy-mac/imac
They know that (some) people don't like power bricks.
There are desks that solve the cable management issue by mounting a tray just underneath the tabletop. If it's such a problem for you, a desk like this is probably worth it.
The portability argument I understand though. I regularly move my desktop between two places and having duplicate cables and peripherals for each one has been very convenient. So seeing that, you're actually right, I would need to either buy a probably fairly expensive proprietary cable or always carry the brick; and an integrated PSU would've been more convenient.
But in the end whether this tradeoff is worth it comes down to your particular situation, and apparently Apple have decided that in general it is worth it. Personally I also don't think too many people move their desktops around that much.
The point about the power brick not showing up on the page I'd chalk up to OCD design, not malice, to be honest. It is at least hinted at because the two pictured cables are different.
I can only speak for myself, and I hate power bricks with a passion. You need to find a place to hide them, they are annoying when cleaning, and they are annoying when you move devices (need to untangle not only cable but also power brick).
Why get an all-in-one when it requires attaching a box to it? I might as well get a Mac mini at that point (which still has a built in PSU)
Every picture features an iMac on a beautiful desk with the power adaptor conveniently not in the picture. There's not really a place to hide it if you have desks like these, it's just gonna lie somewhere on the floor.
...yet it somehow still has that huge pointless "chin" at the bottom. I don't think that's beautiful. I guess it's "iconic", like the ugly notch on the newer iPhones.
I'd rather have the thing a bit thicker instead, because 99% of the time I'm not looking at it from the side.
I once needed to bring my old 27-inch Intel iMac along to collaborate in person on a project. It weighed 30.5 pounds, which isn't super heavy, but it was a hassle to transport.
The 24-inch M1 iMac weighs a little under 10 pounds. You could bring one of these into a conference room at work with no problem.
Since it is a desktop computer, it makes total sense to move Ethernet to the power brick. That way, you can have a single cable running up from under your desk to the computer. I'm collecting all my cables in a rack under the desk anyway, and the fewer cables I have to route over the desk the better.
It doesn't need to be but it is. What are you gonna do, bloat the case just to fit an Ethernet adapter? Also, what's wrong with it being in the charger? This is actually genius, this way only a single cable leads to the machine itself.
I suspect Apple designers think RJ45 is an ugly connector that should not sully the minimalism of their smoothed rectangles.
Jokes aside, putting the connection on the power brick actually seems like a really smart idea for cable management under the desk. And, you will have one fewer cable plugging into the back.
It's an aesthetic win since you don't need the cable coming out the back of the machine and it's a pragmatic win since cable management is much easier if it can plug into the power brick that's already on the ground.
Perhaps weirdly, but it's basically a dock... you could reasonably hot-swap the Mac with another one, for purposes as yet unknown to me, and be fully set up.
It's an almost ridiculously portable stationary device
I initially thought that RJ45 on the brick was weird, but it seems to make sense. Just curious about the intermediate wire interface: 1000BASE-T or USB 3? I also expect a 10GbE update would be hard to come by.
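On pure bandwidth, either interface could carry gigabit, but (assuming a USB 3.0 gen 1 link at a nominal 5 Gb/s) a hypothetical 10GbE variant wouldn't fit over that link, which supports the hunch that a 10GbE update is unlikely:

```python
# Nominal link rates in Gb/s (assumed, for a back-of-envelope comparison)
gige = 1         # 1000BASE-T
ten_gige = 10    # 10GBASE-T
usb3_gen1 = 5    # USB 3.0 / 3.1 gen 1

print(usb3_gen1 >= gige)      # True: gigabit Ethernet fits easily
print(usb3_gen1 >= ten_gige)  # False: 10GbE would need a faster internal link
```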
But you have a “charger” on a device without a battery.
Power bricks are the most inelegant solution to a problem ever. Can’t think of a clever design to accommodate the features we want? Shove it in a box and put it to the side. Lazy.
Yes. Or I wish the charger were small enough to plug directly into the wall, like the MacBook Pro's, and not something that now hangs / sits in the middle of my cable run to the wall.
I mean it does help with heat and electrical noise isolation, plus if a manufacturer cared to they could probably add more heat fins on an external brick not constrained by the case.
But why just Ethernet? Why not go further, make the power cord also USB-C (or even Thunderbolt) and have the power brick be a dock with USB and whatnot plugs on it?
I think it makes sense. The power cable is usually fixed and not moved for a desktop computer. The same holds true for an ethernet cable. It's fixed and plugged into the wall.
So both ports that are used for fixed cords that plug into a wall are located in the same spot.
The ports on the computer itself are for devices that you might plug and unplug on a daily basis. Peripherals, charging cables, USBs etc.
This makes a lot of sense the way it's set up. Fixed cables are grouped and separated from "free floating" cables.
From my own experience, most people end up using wifi anyway at home and in office regardless of the device type (desktop/laptop). This way the least likely to be used port is also separated from the device and also means that if you need to for whatever reason swap one iMac with another, a single plug into the new iMac will connect power and ethernet with no other setup necessary.
I don’t mind the Ethernet port, but I was wondering the same thing. Is there no longer any value in making the computer larger? I’d expect that there is something that was cut for the thinness, but maybe not?
The computer is so thin, the headphone jack had to be moved to the side. Now you have a desktop computer with a speaker cable hanging off the side, going to your external powered speakers. How's that for elegant?
From an audiophile's point of view: the thing on the side is a headphone jack (digital out through the jack has long been gone, AFAIK), not a line-level output, so you're not really supposed to connect your speakers (directly or through an amplifier) to it. It usually works, sort of, but it's far from ideal.
I'm not sure whether that could be Apple's line of reasoning, but I wouldn't be surprised.
Why does a stationary computer need to be so thin that they can't fit an ethernet port and have to place it in the charger?
Not everyone lives in a college dorm. Some people have beautiful homes with very carefully selected furnishings, and spouses that appreciate design. This fits in with that.
Those new iMacs look really lovely. They should be a perfect desktop for many users. At 24 inches, the screen is large enough, and the M1 is powerful, silent, and also doesn't use a lot of electricity. With all the environmental concerns, the power usage of computers should not be overlooked.
I am very happy to see the return of colors, love that! But white bezels around the screen? Those are a bit of a no go for me.
And of course: why is there still no video in, especially with such a great screen? In times of home office, it would be so great if you could connect your work laptop to your iMac for video output. I might have gotten one just to replace my Dell 24" 4K screen (those are getting rare; it seems to be basically the only screen with almost 200 DPI on the market). (A VESA-mount option would be great too, especially considering how small and light the iMac is.)
Exact same boat here about video input. I'd have gotten one in a heartbeat. My wife, who uses Macs professionally, and I, who am bound to a Windows work PC, share a home office desk. That would have given exactly the flexibility they touted in the keynote. Especially considering that they don't have any external display options other than the totally overkill and overpriced XDR. Why not have the option of using an iMac as a second screen for another Mac or PC?
The traditional opinion is that the AIO PC is bad design (looks aside) because it merges display and computer. The two have different lifetimes (the display lasts longer), you can't repair it yourself, you have to send the whole display-plus-computer unit to the repair station, it's non-upgradable, and you can't use it as a monitor.
But now maybe both lifetimes are almost the same, and failures are rare thanks to solid-state parts (fans excepted). So an AIO would be totally fine if they implemented HDMI-in, assuming you don't need upgradability.
Can someone explain the 8GB obsession (edit: 16GB is possible, but why not more)?
Is there a real limitation with this architecture, or are they just making a business decision so they can sell multiple models over 2-3 years instead of creating what the market wants immediately?
Planned obsolescence has never really been Apple's style (their devices generally get longer software support and have better resale value than anyone else).
It's just about scale for the time being, probably.
The M1 chip is now in 5 different devices. That is 100% worth not fulfilling everyone's RAM needs for the fantastic economies of scale and engineering simplicity that brings. They'll come out with an M2 soon enough, which will undoubtedly have more RAM and they'll stick in the iMac Pro, Mac Pro, and MBP 16" (and possibly "backport" to other devices).
That being said I just "downgraded" from an Intel 16" MBP with 32GB RAM to an M1 13" with 16GB RAM and haven't noticed much of a difference, even though I was regularly getting to 80-90% RAM utilization on my 16".
> Planned obsolescence has never really been Apple's style (their devices generally get longer software support and have better resale value than anyone else).
I think that only applies to their mobile devices. I have had Intel Macs that can't run the latest macOS but can still Boot Camp into Windows 10. Of course, this is all moot in a post-Intel Apple world.
Most non-technical people don't share the same definition of obsolete. I know a few people who have gone more than 10 years between hardware purchases using Apple computers.
Windows 10 was released in 2015. Apple releases a new major macOS update every year. I'm not really surprised that you can run 6-year-old software plus minor updates on a 6-10 year old machine but can't run a major software revision from the past couple of years.
Yeah, but the point still stands that you can still run new applications on Windows 10 in Boot Camp on an old Mac. There's a good chance that if you're using a "left behind" Mac, you can't even update your apps because they stopped supporting your version of macOS, even though it's working perfectly fine.
Also, Windows 10 in 2021 is far from being exactly like Windows 10 in 2015. There have been a lot of changes, Microsoft just doesn't increment the major number any more.
> Windows 10 was released in 2015. Apple releases a new major macOS update every year. I'm not really surprised that you can run 6-year-old software plus minor updates on a 6-10 year old machine but can't run a major software revision from the past couple of years.
If you're saying that Windows 10 from 2021 is Windows 10 from 2015 with "minor updates", you haven't been paying attention to Windows updates :-)
Windows 10 updates are probably bigger than macOS updates.
> Planned obsolescence has never really been Apple's style (their devices generally get longer software support and have better resale value than anyone else).
I agree with the latter, but tell that to their previous defaults of 4GB of RAM and a 120GB SSD (for so long) and 16GB of storage on iPads.
Not really sure how you can agree that the resale value is excellent and in the same breath complain about features of the device being somehow egregious.
Every single device ever made has to make tradeoffs. No device can be everything to everyone. Apple has (correctly) identified that RAM is really not the end-all-be-all that people obsessed with specs seem to think it is. If you spend more time working on other aspects of the device you can get away with less RAM and still have a device that people resell years later for good money.
Because I think we're using two different definitions of obsolete. Just because people will buy a thing years down the road doesn't mean it wasn't already obsolete then, or when it was brand new, or that those second purchasers can't make some use of it, however strained. I disagree about RAM, at least at the minimum threshold, evidenced by the numerous naive customers I've witnessed being incredulous as to why a few Chrome tabs and some temp files are bringing their $1k+ computer to a halt.
Would you have bought a MacBook Air or Pro in 2017 with 4gb of non-replaceable RAM for $1000? No, because it's obsolete and you know that. What if they told you it was a great computer, and you didn't know anything about computers? Probably.
The M1 is using LPDDR4X-4266. It's possible that packages bigger than 8GB just don't exist (8GB was all that was announced just a few years ago, and the RAM announcements usually come far ahead of shipping products https://www.anandtech.com/show/11021/sk-hynix-announces-8-gb... ).
That'd then be 1 module = 8GB, and 2 modules in a daisy chain or T-topology gets you 16GB, and there just isn't anything more to be done without a bigger memory controller (more channels) or bigger DRAM modules.
Samsung claims to have very large chips (96Gb) in mass production. It really seems like a reason to upgrade next year when the CPU isn't getting much faster.
I think you've got a bits vs. bytes confusion there. Samsung is only claiming up to 12GB modules. 96Gb(its) = 12GB(ytes). So that'd get you +50% max RAM on the M1's, which would be a nice offering, but still won't be enough to reach the 32GB+ of the outgoing Intel lineup.
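For anyone tripping over the units, the bit-to-byte arithmetic is just a divide-by-8. A quick sketch (the helper name is mine, purely for illustration):

```python
# DRAM die capacities are quoted in gigabits (Gb), while module and
# package capacities are quoted in gigabytes (GB); the factor is 8.
def gigabits_to_gigabytes(gigabits):
    return gigabits / 8

# Samsung's 96 Gb part works out to a 12 GB package:
print(gigabits_to_gigabytes(96))       # 12.0

# Two such packages on the M1's two channels would give 24 GB,
# a 50% bump over the current 2 x 8 GB = 16 GB ceiling:
print(2 * gigabits_to_gigabytes(96))   # 24.0
```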
That's likely the largest LPDDR4X is ever going to go, because the tech is outdated. If they had a 24GB option, you'd hear a lot fewer complaints (and the cost difference isn't massive).
They announced 16GB (128Gb) LPDDR5 chips were in mass production in Q3 of last year. These both use less power and have better performance than LPDDR4x.
Both Intel and Qualcomm have support for LPDDR5 in their chips that launched last year. No excuse for Apple here.
As a developer who moved from a 16gb Intel MBP to an 8gb Air, I see no difference at all. I think a combination of the nature of the SOC and the massively faster retain/release cycle just make it much less of an issue.
That said, I did straight up run out of RAM the other day, but my editor was chewing up something like 32gb of swap. Still no slowdown, though. (I suspect a memory leak, it's very new software)
I'm having no trouble running Xcode + iOS simulators at all. In fact, I think there's some possibility that the iOS simulator could be lighter weight due to ARM -> ARM rather than ARM -> Intel, but that's pure speculation.
Not currently doing any Android work so I can't comment.
One theory: we'll see the iPhone strategy applied to Macs with this year's mainline models becoming next year's budget option at lower prices. All as part of a move to grab more of the desktop / laptop mid-range market.
They have cut CPU costs with M1 so keeping RAM costs down ready for material price cut on these models next year.
As far as I know there are only three (four?) M1 chips in existence: M1 8GB, M1 16GB, M1 7-core GPU 8GB. There may be an M1 7-core GPU 16GB but I suspect that there is actually only ONE M1 chip (M1 16GB 8-core) and the rest are binned versions.
The way they did memory on the M1, they can only use two LPDDR4X chips. DRAM is on package, so they can only physically fit two on their package and presumably they do not have pins that would go off package for more memory channels at all.
Micron and Samsung make 12 GB LPDDR4X chips, but Apple probably decided that having a 24 GB option wouldn't be worth it, or would be an ugly number, or couldn't get enough supply, or all of that.
The M1 is for "consumer" tier devices anyway. Wait for M1X or Mwhatever – when they port the larger MBPs to their own silicon, that will have larger memory.
Good answer! Apple doesn’t seem to shy away from “odd” numbers, like the 4.5K resolution model.
I would go with supply: they have enough of the 8gb modules but not enough of the 12gb ones, and offering both would create strange tiers, even if they'd otherwise lean towards a 24gb machine.
They make the odd resolution because they've committed to a certain amount of dpi ('super retina' resolution) and are marketing it as such.
While it would be fun to be able to run a complete k8s cluster locally, that's quite a niche use case. There's not much software on the consumer side where having 32gb+ really makes a huge amount of difference. At least not the software Apple is showing in their presentations. Lots of people think they want a huge amount of memory, but macOS seems to handle memory pressure (with transparent compression and swapping) quite well, even on machines with little memory.
Why the chin bar? They could have easily removed it now that there is no logo on it, and the logic board footprint is now next to nothing. Their design decisions are just plain weird. Also, why is there no black version? No network ports? No USB type A ports? Really?
Well, the thinness at least explains why. Both RJ45 and type A USB ports are huge. By that logic, they could have introduced fibre channel adapters beside the 1GBit Ethernet port.
IMO it looks pretty iconic. An all-screen look would've been very very cool, but I don't think this looks bad.
Also, I'd imagine with this design the motherboard, cooling and speakers don't overlap the screen. I wouldn't be surprised if it's just much cheaper to manufacture them this way, or if, given that this is supposed to have a beefier sound system and an M1 configured to run hotter, there were other considerations with having screen and logic overlap that would've added thickness beyond just that of the stacked components.
At the end of the day Apple is one to choose thinness over all other things, and to leave room for future upgrades haha.
but seriously, the cooling system and speakers are built around it. Speakers in particular obey the laws of physics, so they either need to go behind it or under, and someone made that call.
I’m going to go ahead and say if you haven’t gotten onboard with USB C by now you’re behind. It’s been half a decade since the USB A port went away on Mac. I know everyone has their legacy peripherals they love but... it’s time to move on.
They're not "legacy peripherals". Mice and keyboards are far from "legacy", and almost all of them come with USB A cables still.
Having to add a flaky USB hub to your desktop just to plug in your keyboard is actually ridiculous. It makes sense on a laptop, but these desktops have _a lot_ of surface area on which they could put a few USB-A ports.
EDIT: Though, Apple removing the USB-A ports from everything might be what the industry needs to push peripherals to use USB-C for everything? If that's the case, then that might be a win for everyone, except for people who buy these new iMacs.
So, I was curious if you are correct and the market has changed a lot, or if I'm correct and most peripherals still use USB-A. So I checked out the website of a tech reseller, sorted their keyboards by most sold, and counted the technology used.
Before I stopped counting, I found 8 wireless keyboards using bluetooth, 9 wireless keyboards using a USB-A receiver, and 10 wired keyboards using a USB-A cable. It seems like USB-A is still alive and well in the peripheral ecosystem.
It's a website which is commonly used by gamers, so wired keyboards may be over-represented compared to the general population. But it clearly shows that actually, USB-A is still really commonly used in new peripherals sold today.
In the context of discussing the port selection of an iMac, let's not forget that the device already comes with a wireless keyboard and mouse/trackpad. While using your own input devices is an option, very few customers will want to make use of it.
Yeah, it should go without saying that gamers and people who code are in a different class: hobbyists. I'm one of them, and most of the people on this site are too. We love our periphs. I have a wireless (mech) keyboard and a wireless mouse that both charge via usb-C cables, though it is something I filtered by when I was shopping for them a year or so ago in anticipation of this happening.
I think Apple (correctly) determined that these folks will tolerate whatever they put out, buy adapters to use their beloved special keyboard, etc. And for the ones who won't tolerate it: they're probably not buying Apple hardware anyway.
I mean, now your argument has changed from "most peripherals use bluetooth so it doesn't matter" to "most peripherals use USB-A, but many Apple users will nonetheless tolerate the lack of USB-A on their desktops". I can't really argue against that, and maybe it's correct.
EDIT: Just to respond to the "gamers and people who code are in a different class" thing: I went to a different store which is more general, went through the most sold keyboards, and skipped all "gaming" keyboards (so everything from Razer, everything branded Logitech G, everything with RGB), and counted to 14 bluetooth keyboards and 19 keyboards which require USB-A (7 wired, 12 with the USB receiver dongle). People just use USB-A peripherals a whole lot. (Store URL: https://www.elkjop.no/INTERSHOP/web/WFS/store-elkjop-Site/no...)
I think we mostly agree, we're just splitting hairs over the word "use". I'd say my mouse "uses" bluetooth instead of USB, even though it does still technically "use" USB as well to charge, albeit very infrequently.
I am surprised that most peripherals do still use USB-A to charge, and my argument did change to reflect that new information.
I'm not talking about devices which use USB-A to charge, I counted all peripherals which have the capability to connect via Bluetooth in any form in the "Bluetooth" category. If I counted every peripheral which comes with a USB-A charging cable as a peripheral which requires USB-A, the numbers would look even more bleak.
The only keyboards I counted as requiring USB-A are ones which are wired with a USB-A cable, and ones which are wireless but only have the ability to use an RF receiver dongle such as the logitech unifying receiver. Your mouse would be counted in the "Bluetooth" category.
I have a recent-vintage Logitech mouse (came out this year). The charging port on the mouse is USB-C, but the charging cable it came with was still A-to-C. Same goes for the 2.4GHz unifying dongle it came with.
All Mac computers except the Mac Mini come with a keyboard and mouse. The Mac Mini has 2 USB-A ports, which would be suitable for a keyboard and mouse with USB-A.
Pretty sure that the USB dongle was a solution to have the mice work on all computers because at the time, pretty much all desktops had no Bluetooth connectivity. The next step for them is going all in on Bluetooth (at least for the casual user products).
While I'm technical enough to know what to buy (most of the time) as it relates to USB-C, I think it's still a confusing mess for non-techy consumers, who aren't reading the fine print on USB-C related product descriptions when buying stuff.
A lot of USB-A to USB-C data cables sold on Amazon are still only USB 2.0 standard (480 Mbps). Not all C-to-C cables can be used for high-wattage power delivery, etc. Some cables only charge. And then there are those USB-C and/or Thunderbolt docks whose video ports may or may not work depending on what your PC's capabilities are.
I think the creators of USB-C should have come up with some type of lettering or numbering system to reduce the confusion. At least there would be a chart that "normals" could refer to in terms of whether they're buying the right cable.
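A rough version of that chart, sketched as a lookup table (speeds are the nominal signaling rates from the USB specs; the tier names reflect the USB-IF's renames):

```python
# Nominal signaling rates for the main USB tiers. A USB-C *connector*
# guarantees none of these: the cable and both ports must support the mode.
USB_SPEEDS_MBPS = {
    "USB 2.0 (Hi-Speed)":          480,
    "USB 3.2 Gen 1 (was USB 3.0)": 5_000,
    "USB 3.2 Gen 2 (was USB 3.1)": 10_000,
    "USB 3.2 Gen 2x2":             20_000,
    "USB4 / Thunderbolt 3":        40_000,
}

def speed_ratio(fast, slow):
    """How many times faster one tier's nominal rate is than another's."""
    return USB_SPEEDS_MBPS[fast] / USB_SPEEDS_MBPS[slow]

# A cheap "USB-C" cable that is really USB 2.0 inside moves data ~83x
# slower than the Thunderbolt mode the port itself may support.
print(round(speed_ratio("USB4 / Thunderbolt 3", "USB 2.0 (Hi-Speed)")))  # 83
```

The point of the table is exactly the parent's complaint: the connector shape tells you nothing, so "normals" end up with a 480 Mbps cable plugged into a 40 Gbps port.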
Until there are readily available, not too expensive, hubs that (1) hook up to the computer via a USB-C to USB-C cable, and (2) have multiple USB-C ports that you can use to hook up USB-C devices via USB-C to USB-C cables, it is not time to move on.
Exactly! It’s crazy to me that more than 6 years into this USB-C journey, you still can’t buy a USB-C hub for under $100. Back in the day USB 2.0 hubs were under $50 within a couple years, and then practically free (literal giveaway swag) within 5 years.
Yep, we moved on. It's crazy just how much more pleasant it is to use my 13" Razer laptop, which is for all intents and purposes equivalent in build quality to my previous 13" MacBook Pro... except that the designers had the amazing idea to put 2 Thunderbolt USB-C ports and 2 USB-A ports on it.
No. More. Frigging. Dongles. For anything - you can plug both modern USB-C docks AND keyboards and USB sticks easily. Crazy.
I'm perfectly fine with getting on-board with USB-C but I don't design the peripherals. The only USB-C product I own is a charging cable to a wireless mouse and the host end is USB-A.
Well, this is supposed to be a professional desktop machine. I don't want to run my work machine off of wifi that could randomly disconnect in the middle of meetings though. I know I'm not the only one. They could have spent the 3 dollars per machine it would cost them to add an ethernet jack, but instead they're going to have you spend $20 on some adapter dongle.
As for USB-A, there are tons of peripherals still in existence out there. Maybe you're fucking loaded and you don't care about just buying everything USB-C, but that's definitely not the case for everyone. Suppose I have a nice USB-A optical mouse and mechanical keyboard that I like for example, or a really nice $200 audio interface. I guess I need to buy yet more dongles, or throw them away and replace everything with USB-C peripherals because some 20yo on HN says that USB-A is legacy. Thanks Apple. You definitely value user experience above all else.
>I guess I need to buy yet more dongles, or throw them away and replace everything with USB-C peripherals because some 20yo on HN says that USB-A is legacy.
The concern about dongles is so overblown. Let's take your examples each in turn.
>Suppose I have a nice USB-A optical mouse
You buy a USB-C to USB-A adapter and it lives on the mouse's USB-A connector, never being taken off, forever. And then... it's one piece. Are you taking the mouse with you? The dongle comes with it. You don't gotta think about it.
>mechanical keyboard
Same deal. Lots of keyboards even have USB-A ports on them -- which means you only need 1 dongle. One for the keyboard. That lives on the end of the USB-A connector. You have now converted your legacy USB-A device to USB-C and can forget about it forever, for the cost of $5 from Monoprice.
>a really nice $200 audio interface
If you really, for some reason, just cannot stand to add a $5 add-on piece to get the latest gear from Apple, many of these devices have detachable cables. So you can just... buy a different cable. Let's take the Focusrite Scarlett 2i2, which matches your use case at $200. A replacement cable from Monoprice is... $5.29. And you can put away the old cable and just... have this one.
At the very, very worst case, you're talking about a $1200 machine and an additional cash outlay of approximately $15. For adapters which attach to the thing you want to use, forever, and don't have to be thought of again.
And somehow this is a travesty against the user? Worthy of a scrap-and-redesign of the $1200+ machine?
Or get one of the tiny USB C hubs with four USB A ports on them. A hub like I have now anyways, for an iMac that still has USB A ports, because I needed to plug in more shit.
Consumers don't need Wide color P3 displays or the ability to attach a $5000 6K display. Clearly this machine can do professional level tasks, as demonstrated during the keynote.
> I know I'm not the only one. They could have spent the 3 dollars per machine it would cost them to add an ethernet jack
It's too thin to support a standard RJ45 port in the back like previous Macs.
macOS has supported Thunderbolt Ethernet/USB-A dongles for years if you want to go that route instead of using the Ethernet built-in to the power adapter.
Logic board and speakers have to go somewhere, they can either go behind the display and make things thicker or they can go in the chin bar area.
I was surprised they did that too - I expected a look that was more like the pro display. Other than that though I'm pretty impressed with the design, and it is quite thin. It looks like how I'd imagine a 'future' computer to look.
Even without the logo, it makes the design more recognizable as an iMac from the front. Apple doesn't want people to mistake it for a regular old monitor.
Especially when iMacs end up in movies and tv shows.
The chin bar lines up with the ports (USB and power) on the back, and is probably where the speakers are.
So I think they prioritized thinness, which meant there was not enough room for screen+board+ports stacked, and they had to offset them. It's probable that the whole SoC and board are in the chin, given how small they are in the MacBook Air/Pro.
Agreed. Removing bigger ports from a laptop is a great tradeoff, but having them on a desktop computer is convenient; thinness/lightness for a desktop isn't a good tradeoff.
Why would anyone want their screen to be anything other than a featureless rectangle?
There's a huge market for making TVs look like anything other than a slab of glass. Aesthetics count. That's why the 17 inch iMac with the crane neck is still coveted.
For what it's worth I don't think featureless rectangles are a bad thing. I'm just saying that it gets hard to distinguish products from each other when the designs converge so much.
Pretty much exactly what was speculated, although it's cuter than anticipated! I'm surprised they kept the chin, but that's where the logic board is now.
The fact that it's effectively the same price as the M1 Air is interesting. Like the Air, it's a very good deal, and it'll put other all-in-ones on notice.
The 4 identical ports, 2 with Thunderbolt icons feels like a real swing and a miss at cleaner UX.
Is it really that infeasible to just make all identical ports behave identically?
My parents have a story about how at age 6 I moved our Apple IIGS into my bedroom. They were more bewildered than angry. The one thing I remember is my older brother saying how because every port was uniquely shaped even a child can set up an Apple.
I remember when USB 3 was finalized and they introduced a MacBook Air, Schiller or someone was on stage and said “You’ll notice, unlike other pc manufacturers, we didn’t make our ports blue to signify USB 3 vs USB 2. That’s because every port is USB 3.”
(It’s a limitation of the I/O bandwidth on the M1, which makes me wonder why they didn’t wait for a new generation before building desktops with it. Maybe they were afraid of having no iMacs on the shelf since the old ones were discontinued?)
Which is funny, because on some of the early MacBook Pros with USB-C/thunderbolt ports, the ones on the left vs right side of the laptop didn't have the same capabilities
> (It’s a limitation of the I/O bandwidth on the M1, which makes me wonder why they didn’t wait for a new generation before building desktops with it. Maybe they were afraid of having no iMacs on the shelf since the old ones were discontinued?)
These are entry-level iMacs and the majority of potential buyers will be fine with the port selection. It's probably not worth delaying an important device like the iMac for such a relatively minor downside. That said, I'm personally waiting for the larger iMac (Pro?) which won't feature such limitations.
I don’t get why they are using different arrow keys on the MacBook and external keyboards. I love the short left and right keys on my new MacBook Pro; I wish they kept it consistent. Muscle memory is a bitch.
I avoided replacing my 2015 MBP until 2020 because of the arrow keys (which they finally fixed). Not great to see them still clinging to that asymmetric design.
They probably only introduced the 24-inch because of the maximum pixel output limitation of the M1. The 24-inch is 4.5K, and it supports one external display up to 6K.
A 27”+ model would require pushing at least 5K (possibly 6K) for the internal monitor + 6K for the external.
For comparison the M1 Mac mini supports one 4K display and one 6k display. So it seems that this is close to the maximum for the M1 chip
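Back-of-the-envelope pixel counts support that theory. (Panel resolutions below are the usual ones for each marketing name; the 27"+ scenario is my assumption, not a known configuration.)

```python
# Total pixels per display, using common panel resolutions.
DISPLAYS = {
    "4K":   3840 * 2160,   # UHD external
    "4.5K": 4480 * 2520,   # 24" iMac internal panel
    "5K":   5120 * 2880,   # 27" iMac-class panel
    "6K":   6016 * 3384,   # Pro Display XDR
}

# Known M1 configuration on the Mac mini: one 4K + one 6K display.
known_ceiling = DISPLAYS["4K"] + DISPLAYS["6K"]

# A hypothetical 27"+ iMac: 5K internal + 6K external.
larger_imac = DISPLAYS["5K"] + DISPLAYS["6K"]

print(f"{larger_imac / known_ceiling:.0%}")  # ~123% of the known ceiling
```

So a bigger iMac would need to drive roughly a quarter more pixels than anything the M1 is currently known to handle, which fits the "wait for the next chip" reading.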
It also means that for us with 5k2k monitors we will not get back our missing scaling options (at least not with M1)
I would pay the same price for a monitor in the same case with the M1 replaced by some Thunderbolt controller thing.
Really hope Apple will return to the more normal end of display offerings; I haven't found any alternatives with ""retina"" DPI from other manufacturers (except the LG Fine models...).
Indeed. Plus, I'd like to use an iMac in a dual display setup without one display looking out of place. And needless to say I have neither the budget nor the need for the Pro XDR display.
Can't wait to see the teardown, I wonder how much of it is glued together.
Who wants to bet that, despite there probably being no pressing need to do so, the SSD will be permanently soldered onto the motherboard rather than being an M.2 NVMe card?
The design isn't super clear on this, but unlike prior models, the entire machine face is one piece of glass. So I imagine just the typical display tape of the past, and then everything else is screwed in.
And yes, it will lack the NVMe SSD, same as the Mac mini. One board for SoC, RAM, and SSD.
Former x86-64 hardware manufacturer view: the max TDP in watts of a CPU that can reasonably fit in something as thin as an Air is a lot lower than in a MacBook Pro or a similar cooling solution (combined heatsink/heatpipe/blower fan).
In the Intel world there's a whole class of CPUs that absolutely max out at 8W, 12W or 15W; you can go somewhat higher in a fatter laptop.
If I had to guess I'd say that the air probably thermal throttles or reaches a steady-state of lower performance at an earlier time than the Pro, if the pro has a larger cooling package on the same CPU...
If you were to sit the two side by side on a desk and run a complicated pure-CPU x265 1.5-hour ffmpeg encode of a 4K video from raw y4m, the Pro will probably race ahead after about 20 minutes, once the heat has soaked into the Air.
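A toy lumped-thermal sketch of that argument (every constant here is invented purely for illustration, nothing like Apple's real numbers): sustained power is capped by how fast the chassis can shed heat before hitting the throttle temperature.

```python
# Toy steady-state model: a chassis dissipates roughly k watts per
# degree C above ambient, so sustained power caps out at
# P = k * (T_throttle - T_ambient). All constants are made up.
def steady_state_power(k_w_per_degc, t_ambient=25.0, t_throttle=95.0):
    return k_w_per_degc * (t_throttle - t_ambient)

fanless_air_k = 0.125  # W/degC, passive Air-like chassis (invented)
active_pro_k  = 0.25   # W/degC, fan + heatpipe Pro-like chassis (invented)

print(steady_state_power(fanless_air_k))  # 8.75 W sustained
print(steady_state_power(active_pro_k))   # 17.5 W sustained
```

Both machines burst identically until the chassis heat-soaks; only the steady state differs, which matches the "20 minutes in, the Pro races ahead" observation.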
Yes, that's exactly how it works. I'm just not sure why you're saying this. It's obvious, given the fact that both machines use the same chip and different cooling.
It has nothing to do with the shape and size of the board, which differs between the two machines significantly.
Even if it's exactly the same layout, an ultra-slim laptop motherboard will be designed for different thermal limits than a desktop board. A laptop board will also have its shape and orientation optimized for a very thin heatpipe + flat blower cooling system; something in a desktop might be different. Take the same CPU used in a gaming laptop or in an Intel NUC/tiny box-shaped PC: the heatsink/fan arrangement will be unique to each. But then again, this new iMac is so thin it might as well be a laptop.
I take a lot of photos and videos and I actually opted for a camera that’s mostly 1080p this year. Why? Because at this point barely anyone is actually using 4K to view content. And the bandwidth and processor power add a lot to the cost.
It’s nice to shoot in 4K for flexibility of cropping etc later. But for me, I decided to save money this year. And I’m happy with my decision.
So for a video call camera, personally I think 1080p is totally sufficient.
Even most external webcams are limited to 1080p. It's not practical to stream 1440p or higher right now, on demand streaming is super computationally heavy.
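The raw-throughput arithmetic behind that: the encoder's work grows roughly with the pixel rate it has to chew through. A quick comparison (30 fps assumed; real encoder cost also depends on codec and settings):

```python
# Raw pixels per second an encoder must process at a given resolution.
def pixel_rate(width, height, fps=30):
    return width * height * fps

p720  = pixel_rate(1280, 720)
p1080 = pixel_rate(1920, 1080)
p1440 = pixel_rate(2560, 1440)

print(p1080 / p720)    # 2.25  -> 1080p is 2.25x the work of 720p
print(p1440 / p1080)   # ~1.78 -> 1440p adds another ~78% on top
```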
It's too bad Apple has an irrational vendetta against the company that makes the best hardware video encoders on the market. NVENC is really, really good; at 1080p you need to be running 6-12 dedicated cores (depending on settings) in order to match the quality.
AMD chips would have failed if they had used them too - that was the era of "baking your GPU" due to failed solder bumps and that definitely applied to AMD too, Apple is just such a clientzilla that they demanded payment for an industry-wide problem caused by the switch to RoHS-compliant solders. I think it's much more about control - CUDA is a third-party ecosystem that Apple doesn't want its clients getting dependent on, because then that would constrain apple's choices in hardware. Nobody locks in Apple's clients except Apple.
I've read that quality is largely limited by optics, and optics are limited by space in the MacBook line (just a few mm). Not sure why Apple wouldn't improve it on the iMac (a few cm depth to work with!).
I don't consider it a constraint they're imposing on themselves but rather a goal they're actively trying to achieve (decreasing the device size and profile). Some people have small desks. A small footprint desktop computer that looks incredible because of its thinness is a wonderful achievement. Even on larger desks, now the end consumer has more room.
Reducing the size of electronics has always been a goal for consumer hardware companies. Look no further than the memory stick which now supports tens and hundreds of gigabytes on a piece of metal smaller than your finger nail.
It's quite nice having devices that have such small profiles. It's the result of incredible engineering to fit more power in a smaller space for really very few tradeoffs.
I'm not sure that, one, I want people to see my mug in even higher def or, two, that I trust my ISP's upstream bandwidth to carry my 2k+ video feed very well.
I'd just like to have a webcam with a large aperture and nice bokeh, but it seems the only way is a complex setup with a DSLR, capture card, and associated mess of cables and mounting brackets.
The 2020 iMac has a similar 1080p camera + ISP, and image quality is already very good even in low light. And webcams higher than 1080p aren't as necessary since no consumer video conferencing supports higher than 1080p anyways.
> A lot of Youtubers like using 4K (or higher) cameras because the resulting video looks better at 1080p.
A lot of Youtubers are not streaming, which is what most people use their laptop webcams for.
> I would think the M1s are more than fast enough to do a nice job downsampling in real time from 4K to 1080p.
Just video encoding a 1080p stream in itself is already extremely intensive even for high-end CPUs. Especially considering that I (like most users) am simply trying to take a call, and have actual work already going on that I want my processor to spend its time on.
Why? I think 720p is the maximum needed for a front-facing camera on any device. What would you want a 4K front-facing camera for? Most video-conferencing won't even allow more than 480p to preserve bandwidth.
Because the less pixelated it looks the more natural a conversation flows?
Are you asking why is it important to have a good camera for online video conferencing? Look at all the guides that teach people how to use DSLRs as their webcams.
Plenty of very good DSLRs come at 1080p, exactly like this webcam. And they take better photos and videos than many sensors out there stuffed with unnecessary pixels.
Using resolution as a shorthand for actual image quality doesn't reflect the reality of how cameras work, never mind that full HD already pushes the limits of most people's streaming capabilities (both video encoding and bandwidth).
If you are a professional video maker, you'll have something better than the internal mic and camera anyway. Not just for quality but also for the position and practicality of moving it.
We are talking here about the average person's use case.
It'd be cool if they made a single-key Touch ID accessory for people who use other keyboards. Since they probably won't, I wonder if it's feasible to take the keyboard and chop it down to a smaller size.
On the product page https://www.apple.com/imac-24/ the section talking about the "Brilliant 4.5K Retina display" is showing a clip from "The Croods: A New Age".
I thought Apple was fairly close with Disney, so I'd have expected to see a Disney or Pixar movie, not a DreamWorks movie. Or was that only when Jobs was running things?
"On January 24, 2006, Jobs and Iger announced that Disney had agreed to purchase Pixar... When the deal closed, Jobs became The Walt Disney Company's largest single shareholder with approximately seven percent of the company's stock."
I assume that is the primary reason Apple and Disney were close in the Jobs days.
I noticed that too - a footnote says "available on Apple TV" or something - so I assume that Apple made a deal with Dreamworks (and can't with Disney because of Disney+).
I cannot tell you how delighted I am that Apple is (slowly) moving away from its entirely clinical brushed-aluminum aesthetic. Can't wait for them to ship a pro machine that doesn't look like a bulky surgical instrument that needs to be dipped in a bath of hydrogen peroxide.
The ethernet on the power brick is a great addition, but I wonder why they didn't turn that into a full hub with multiple different ports. There would be such potential for the Magsafe connector to be the only cable connected to the device for a lot of use cases, and it could really give the iMac true in-home mobility.
This has been the case with iMac models for a couple of years now. It's ridiculous. If you want to upgrade to a more ergonomic setup you need to buy a new computer...
(Also, 2m power cable is a bit short for an articulating monitor arm...)
Didn't know that. Ridiculous indeed. I mean, it's one thing with the curved back ones (I kind of get that it would be a bit awkward), but these new ones are similar in shape to their $5000 standalone monitor, which supports both their stand and VESA mounts, I believe.
24" devices are tiny - I'd consider 27" to be a reasonable size and 32" to be optimal for detailed work or for aged eyes.
I suspect this device is targeted towards kids and students (as evidenced by the colours and the choice of voice actor), and Apple wants adults and workers to buy a (more expensive) future iMac Pro with a bigger screen.
Probably not. Windows-on-Bootcamp used drivers written by Apple for their peripherals, but with M1 devices they aren't writing or offering Windows drivers anymore. So Microsoft or someone would have to write them, which seems unlikely. You can run ARM64 Windows 10 in Parallels on M1 Macs though, and that does have DirectX 11 support and x86_64 translation, but it's still pretty slow I think.
Honestly, if you want to play some Steam games on ARM macOS (and to be clear, I think PC gaming on a mac will always be rough), your best bet is to fork over some cash and buy CrossOver for Mac. CrossOver combined with Rosetta 2 can run x86 Windows binaries on an M1 ARM Mac, and Rosetta 2 is really impressive and well tuned.
Thanks. I didn’t think this through. Indeed, all binaries, not just Windows itself but also games running in Windows, simply are not compiled to execute on this new chip.
It’s not just drivers then, afaik? It’s the entire Win10, not to mention all third-party drivers, I guess, like Razer and whatnot.
For those wondering about the new location of the ethernet socket, remember in many environments (offices, schools, etc.), the network port is often beside the power on the wall/in the floor. Makes sense to have them beside each other.
I am not excited at all for the Ethernet port to be part of the power supply. It just makes my job harder. I will probably be buying external 10Gbe adapters anyway.
The memory is also a concern. They didn't show the memory in the summary slide like they did with the MacBooks and Mac mini. This is just disappointing. I hope 16gb is an option.
For myself, yes. I am waiting for the first 32GB model to jump. We have bought 16GB models of the iMac and MacBook Air, with the people receiving them being quite happy. I don't see any of our iMacs getting replaced just yet. Most of those people are running Parallels, and that opens a whole can of worms.
When the M1 Mac mini was released I ordered the 8GB model. It was always swapping, and memory pressure was high. After replacing it with a 16GB model, swapping decreased, but 16GB of RAM is still not enough. I hope Apple will release future models with a lot more RAM, otherwise the M1-based devices are useless.
I have tons of room behind my desk; I have much less room on my desk. I don't want the power brick anywhere near my computer. As for 8GB, do people not see what Apple is doing here? All of their M1 entries are entry level. The true power-user hardware will come, but this gets a lot of people converted over to M1, testing out the software/hardware, and giving pro software vendors time to focus on M1.
The additional ram will come (though with M1 it doesn't seem nearly as big of a priority to have more).
> I have tons of room behind my desk, I have much less room on my desk. I don't want the power brick anywhere near my computer.
There wouldn't really be any increase in desk usage if the power brick were integrated into the display. The stand's size would stay more or less the same even if the display were twice as thick, since the stand is sized to counter the leverage of the upper part of the display at whatever angles it supports, not to handle a bit of extra thickness and weight in the bottom part.
The problem with the brick is that it's something you have to cable-manage. You need to mount that brick somewhere or have it sitting, visibly ugly, on the floor. And since it has a port on it, you need to be able to access it more often than you otherwise would. It's classic form over function. They wanted it to look like a big iPad, and they made compromises to achieve that. It does look nice, but you are paying for that in more than just the MSRP.
If you really care about the minimal amount of desk footprint then get the one with a VESA mount instead & put it on a monitor arm.
We don't, but their marketing photos clearly show a fan, which would lead to a reasonable conclusion that the thermal design is more along the lines of the Mini, and not similar to the Air.
Well, given the power supply is mostly out of the box, I would expect it not to be along the lines of the Mac mini, which has a fully internal power supply, thus my original comment.
How many external screens can be connected to these machines, and how, and at what resolution? If it's over USB, would all the ports be taken? I'm clueless about USB-C.
Is it still possible to connect a hub for additional peripherals? I have 6x connected to my machine currently (external drives, printer, audio stuff, backup mouse, card reader), not counting extra screens.
Really hope this works cause these machines look awesome.
I'm sort of disappointed by the pricing. The base model is ok, but the higher price tiers add way too little to justify their price bump. You'll be hanging a dock or dongle off the back anyway with so few ports, so I only see value in the ethernet jack for the middle tier, and in the 512 GB bump for the top tier.
The M1 air and mini were great value, because they replaced machines that were ridiculously slower at the same or lower price. The new iMac does a secret price hike (by starting at $1299 instead of $1099) and the performance jump for those middle tiers is not as big as for the air and mini. I don't think this offers the same amazing value as the previous M1 macs, especially since a desktop needs more RAM and more storage.
Also, it is very amusing they're still selling the dual core i5 21.5 inch non-retina iMac to hit that all-important $1099 price for schools. That's a terrible product, but they just keep selling it indefinitely. I feel sorry for all the people that will buy that model.
I love the new look, but having the audio jack on the side is kind of insane. I realize that it is not deep enough for the full length of the connector, but the idea of having a cord dangling off the side 24/7 for those of us who use external speakers is unwelcome. But I’m waiting for a 27 or 30” model anyway...
The iMac is an all-in-one, so I don't think many people will be adding their own speakers (might as well get a Mac Mini at that point). I think most users will use the jack for wired headphones, in which case having it on the side makes it easier to access.
Looks like they have moved the headphone jack from the back to the side.
On the one hand, that would be better for me, because the way my desk is laid out to plug in my headphones on my iMac I have to reach under the front and jumble around blindly.
Having the headphone jack on the left side would be easier.
On the other hand, someone with a second monitor on that side might find that it gets in the way of their headphones.
What I really wish they would do is make it so that you can keep your headphones plugged in, and choose via the Sound settings whether you want sound to go to them or the internal speakers.
EDIT: My iMac is a 2017 model. From what I've read elsewhere, they may have changed the behavior described in the "What I really wish" paragraph in later models.
I do this on my MacBook Pro. I keep my headphones plugged in. I have the system sound go via the Mac speakers, and I have Zoom set to use the headphones.
It’s easy to switch between built-in speakers, wired headphones and AirPods, all without unplugging anything.
Yeah, it was easy on my 2008 and 2009 Mac Pros too. And from what I've been reading, it might be easy on the 2020 iMac and maybe the 2019 iMac. But from the 2017 iMac (which I have) back to at least 2011, macOS considers the headphone jack and internal speaker to be the same output, with whether or not you have headphones plugged in determining where it sends the sound.
You can indeed switch between sound output devices, but on iMac "headphones" and "internal speakers" are the same device, so switching is only available if you have something else, like USB speakers or an AirPlay device.
If you click the sound button on the menu bar to open the menu, it will have a headphones entry if headphones are plugged in, and an internal speakers entry if headphones are not plugged in.
As you plug in and unplug headphones, with the menu showing, that entry dynamically changes. (Same on the Sound panel in System Preferences).
This appears to be entirely a software decision on Apple's part, rather than some sort of electrical/mechanical design that forces the speakers to be disconnected when the headphones are inserted. In this discussion [1] on Ask Different people report that when they run Windows via Bootcamp, they can choose via software between headphones and internal speakers.
BTW, my iMac is a 2017 model. They might have changed this with later models.
Now that all the devices across the line have M1 chips, it will be interesting to see which other features Apple differentiates the products on once the full switch happens.
Artificially limiting the RAM options in the Air/13" Pro versus the 14-16" Pros? What else could be different, except maybe 1-2 ports?
Same for iMac 24 vs 27 coming in the fall?
iPhone Pro vs iPad Pro: the difference is the big screen and a port?
It will also allow for faster iterations now that they are not dependent on Intel so I think the future will be even more interesting than this first release of M1.
Most likely 14-16 laptops (and iMac Pro if they decide to do that again) will have a larger chip (say "M1X") that supports more memory. The M1 only supports two LPDDR4X chips (physically on the package there's only space for two).
No one even makes 16GB on one chip. Samsung and Micron currently max out at 12GB, and I would imagine there's much less supply of those than of the 4GB and 8GB parts.
I think iPad and iPhone designs are getting better, whereas the iMac (and even other Macs) is getting worse. The bright colors and thick bezels, including a chin, make this look super ugly from the front. And since iPads and iPhones have thin bezels, and the iPad even has the M1 chip, there’s no reason for the chin on the iMac. They could have at least made the bezels black for less distraction. I genuinely prefer the old design. This looks like an iPhone 5 stretched out.
Why would you want a smaller screen and less memory? I have a 2014 iMac 5K and I wouldn't consider upgrading to this. 24" is way too small. I'm waiting for the 27 to 30" model.
You have 64 GB of swappable memory and can cheaply upgrade it to 128 GB without voiding the warranty. You have 10GbE Ethernet. You have a bigger screen. Why sell now?
Can this thing really only power a single external display or is a single display just suggested? I have one on each side of my ancient late 2013 iMac and I'd hate to have to lose one.
Odd, so someone tugs accidentally on the power cable and the machine just shuts off? MagSafe seems like a great idea on a laptop. Not sure I see how it makes sense here.
It's ridiculous that Apple ships 8GB configurations in 2021. My company bought me a MacBook Pro M1 with only 8GB of RAM by mistake and it's extremely frustrating to have so little RAM. I look forward to replacing it with a laptop with 32GB of RAM. Meanwhile I run devops stuff in the cloud because the laptop doesn't have enough RAM for that. Once Safari advised me to close the Jira tab to save RAM. Such a shame.
I have the 16GB but I push my machine hard and I usually have gigs swapped and I've been absolutely blown away by the performance of my M1 MBP. Was able to step down from a 32GB version without any issue. I'm kinda surprised you're having such a hard time with 8.
Yes I wanted the 16GB one since 32GB is not available yet on Apple M1. But Apple sells a premium laptop that is not good enough for many of their users. That's a fact. 8GB is ridiculous and Apple is rich enough to afford 16GB by default on their premium "pro" laptops.
- A web browser with many open tabs.
- An audio player
- A few code editors / IDE
- A chat / video conference software since COVID-19
- An email client once in a while that I tend to leave open
- Various development tools
- The software stack of my company for development in a Linux VM
Fair, my setup is almost identical, except I also do some video editing and I have been very happy with the 16GB MBP 13" (coming from a 32GB MBP 16"). So maybe consider trying to swap your 8GB for a 16GB before thinking you absolutely need 32GB.
Given they come in multiple colours (and this was a selling point), it looks more aimed at home users/classroom - web browsing, watching movies, light photography/video/audio editing, etc.
Those seeking more power would likely opt for a MBP and external monitor.
I think it is a pretty neat idea. It is not a cable I want to add/remove often, so not having it coming up to the table, at least sounds nice on paper.
No Ethernet port on a desktop is just insane. I can see the sense in saying "your dock is Thunderbolt and Ethernet goes on the dock" for laptops, but on a desktop?
I don't think it's a bad choice. That way you only need one cord dangling from the device; the ethernet cord can be attached under the desk. It's actually a thoughtful feature; most people are annoyed by the usual mess of wires. This helps with that.
That OS and the apps over there are super efficient.
It's the same story as iPhone vs Android phones: Android has always needed 2x the RAM/battery of an iPhone because the Android ecosystem is bloated and slow, just like Windows.
There is nothing that prevents a developer from writing bad applications for any platform. What you said is also not empirically true: there are many sluggish and resource-intensive apps on OSX; Adobe's apps, Chrome, and Apple's own Xcode come to mind.
I haven't used Android phones but my experience with Apple tells me that the phones get slower with every update.
I have an 8GB M1 Air. It's fine on day one when you power it on and have a single Safari tab, but leave a few things open and it's swap hell. The SSD may be fast, but it really takes away from the magic when you're waiting multiple seconds to switch to another app's window. I regret not waiting for a custom-configured 16GB one.
That is why they offer options; my grandmother doesn't need 32GB of RAM to browse a recipe website or to watch Netflix.
8GB is to remind software developers that it is plenty enough
You don't want to be in the Android situation where everyone has 4GB-16GB of RAM in their phones, and as a result developers bloat their apps with poor programming models, which results in the need for bigger batteries and more disk space. This is bad.
8GB of RAM and a 256GB SSD make this a fashion accessory, a fancy email/news/Facebook machine for your desk/office. I think these are going to be wildly popular, actually. I hope it leads to a better mid-range iMac somewhere between this and a $5000 iMac Pro.
Think this is a hard limitation of some sort on the M1 itself. This iMac seems like it's the Air version of the iMac, definitely not the "Pro" version. All the pro stuff will likely go there.
No dual out, though this is much better than a current M1 Pro as you can at least have a total of dual high-quality desktop displays. If you want more you can get a DisplayLink as a stopgap for now.
No: 4 Firestorm, 4 Icestorm. Same as all the other M1s.
Still a killer CPU however, I think this CPU fills all gaps for 99% of all users. Anyone who needs more can afford to spend more as their workflow is probably pretty expensive.
M1 has on-package RAM and is limited to two chips. I’ve read that in theory they could use the 12GB version and get to 24GB, but they choose not to.
Of all the available Macs, the new iMac is the only one not to feature a black display bezel. This is so sad! Dark mode is gonna look so stupid that way, and watching a movie is gonna be annoying too. I guess they needed to avoid having an iMac that looks no different from the front than the previous model. But whyyyy???
I wonder how quickly the third-party accessory market will deliver a solution. I'm not a fan of the white bezel either. Kind of a reverse Ford Model T: you can have the iMac in any color as long as it's not black...
Everyone else is building 2+ inch thick crappy monitors, and here's Apple building a whole computer into a nice, slim, portable-ish display. It's a sickeningly limited, conventional set of offerings from everyone else.
In 12 years and 3 months, Windows lost 20 points, falling to 75% (about 1.63 points per year), and OS X gained 12 points, rising to 16% (about 1.02 points per year). At the current rates, they'll converge in just 22 years and 3 months, with Windows and OS X at roughly 39% each.
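A quick sketch of the arithmetic behind that extrapolation, assuming both trends stay perfectly linear (which they almost certainly won't):

```python
# Current shares and the quoted yearly rates, in percentage points.
win_share, win_rate = 75.0, -1.63   # Windows: 75%, falling
osx_share, osx_rate = 16.0, +1.02   # OS X: 16%, rising

# Solve win_share + win_rate * t == osx_share + osx_rate * t for t.
t_years = (osx_share - win_share) / (win_rate - osx_rate)
crossing_share = win_share + win_rate * t_years

print(f"converge after ~{t_years:.1f} years at ~{crossing_share:.1f}% each")
```

This prints roughly 22.3 years at about 38.7% each, matching the "22 years 3 months" and "39%" figures above.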
If you don’t see it side-by-side, you don’t really notice it, but Windows 10 is slower at everyday tasks compared to even older Intel Macbooks. Possibly related to the antivirus implementation?
Its kind of like Light Mayo vs regular Mayo, you don’t notice the difference in isolation, but things get awkward side-by-side.
My M1 Mini is much faster than my 2017 i7 PC with 32GB of RAM and a fast M.2 SSD.
Edit: to be clear, my main way of testing was trying to launch the same apps side-by-side, and load the same websites side-by-side. 2014 Macbook Pro beat the much newer Windows 10 device every time. The Windows device doesn’t melt down with Minecraft, but that doesn’t change the fact that latency for basic tasks (e.g. opening the browser) is worse for Windows, for inexplicable reasons.
Welcome to the 2009 "8GB of RAM on the desktop" club, Apple! With built-in planned obsolescence, you will be replacing this computer like a phone or a tablet in 2-3 years.
Paying more to get more cores enabled on the same chip is standard practice, everyone does it, and it makes complete sense. The alternative is to just throw away all chips with a fault which affects one core, even though there are customers who would gladly have taken a unit with one broken GPU core for a lower price.
If you wanna complain about the practice, you can complain that they have no way to attempt to re-enable the locked core. AMD would let you do that back in the day, and it was a great way for customers to buy a 3-core and potentially be able to unlock a fourth core (even if it might have to be clocked lower than the others). Though I believe neither Intel nor AMD lets you unlock cores these days.
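As a toy illustration of why binning pays off, here's a sketch with invented yield fractions (the real figures are not public):

```python
# Hypothetical per-wafer yield figures, invented purely for illustration.
chips_per_wafer = 100
frac_all_cores_good = 0.70   # sellable as the full-GPU-core part
frac_one_bad_core = 0.20     # salvageable as a one-core-disabled part
# The remaining 10% are assumed unsellable (defects in shared logic).

without_binning = chips_per_wafer * frac_all_cores_good
with_binning = chips_per_wafer * (frac_all_cores_good + frac_one_bad_core)

print(f"sellable chips per wafer: {without_binning:.0f} without binning, "
      f"{with_binning:.0f} with binning")
```

Under these made-up numbers, binning turns 70 sellable chips per wafer into 90, which is exactly why every manufacturer does it rather than scrapping dies with a single faulty core.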
I so want to go back to an iMac... I loved mine back in ~2010.
After about one year I had a simple fan issue: the fan was making a high-pitched noise (revving at full speed) and would start up randomly.
It was likely a very minor issue... but then I felt so powerless: to fix it I’d have had to send it back and wait, with my entire development environment, private files and notes, and a browser with possibly logged-in sessions on it. I mean, you have to return the entire thing; it’s not like you could replace just the power adapter or the screen.
I’ve been wondering about workarounds, as I don’t care for AAA games; there are plenty of awesome smaller games that don’t need a beefy, hot GPU.
But the only idea I have is to install the OS and everything on an external SSD, so at least if I have to send it in for repair, I have all my dev stuff with me.
I could also have a Linux install so I could in theory even use the SSD without the Mac, but then what is the point of getting the iMac and its OS experience, right?
So I don’t know what I am missing... how are other devs comfortable with this?
It's easy enough to "clone" a Mac boot disk to an external device if you need to ship it back, but at some point the "computer isn't working" problem comes to all systems.