How are companies still releasing hardware with 8GB RAM? My MacBook Air from ~2016 has 8GB of RAM.
My first PC which had 8GB of RAM was built in ~2010 if I remember correctly. My current machine has 32GB. In all honesty, 32GB is not even much, considering my motherboard can take 128GB (in the old days, if your motherboard could take max 8GB RAM, people would fill all the slots to capacity). Also, I have a Lenovo machine that I paid $200 for that has 8GB RAM.
WHY any power user would buy a MacBook is beyond me. They've become devices for people like my mother, not for getting work done.
Apple should've made the baseline 32GB with 128GB as the max spec. It would've forced the whole industry to give us more RAM. So for the next two years RAM will stay stagnant; all the other manufacturers that copy Apple will keep shipping 8GB without blinking.
This was an expected step of the evolution of the Memory Hierarchy. RAM is just bigger-but-slower CPU cache. Swapfile is just bigger-but-slower RAM. There's always a trade-off between size and speed.
Incorporating RAM into the CPU package gives a big performance boost at the expense of maximum capacity. To compensate for the reduced performance in tasks that require more memory, you can either add another layer into the Memory Hierarchy (i.e. CPU Cache -> Fast RAM -> Slow RAM -> Swapfile), or you can improve Swapfile speed. It appears Apple have done the latter - they claim their SSD is up to 2x faster.
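To make that size/speed trade-off concrete, here's a toy Python sketch (my own rough illustration, not anything Apple published) that touches data sitting in RAM and then re-reads the same bytes from storage. The OS page cache may hide part of the disk cost, so treat the numbers as orders of magnitude only:

    import os
    import time

    # Toy illustration of the memory-hierarchy trade-off: touching bytes
    # already resident in RAM vs. paying the cost of pulling the same bytes
    # back in from storage. Results vary wildly per machine.
    N = 64 * 2**20  # 64 MiB of data
    buf = os.urandom(N)

    t0 = time.perf_counter()
    in_ram = sum(buf)                     # every byte is already in RAM
    t_ram = time.perf_counter() - t0

    with open("scratch.bin", "wb") as f:  # park the same bytes on disk
        f.write(buf)
    t0 = time.perf_counter()
    with open("scratch.bin", "rb") as f:
        from_disk = sum(f.read())         # same work, plus the storage round trip
    t_disk = time.perf_counter() - t0
    os.remove("scratch.bin")

    print(f"RAM pass:  {t_ram:.3f}s")
    print(f"disk pass: {t_disk:.3f}s")

A faster SSD shrinks the second number, which is exactly the lever Apple appears to be pulling instead of adding another RAM tier.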
Video game consoles have done something similar with the latest generation. Only a 2x increase in RAM from the last generation (compared to an 8x/16x increase from the generation before), but a focus on storage performance.
Jisss, to be honest the new PlayStation and Xbox hardware is amazing, specifically how the memory/CPU/mobo and streaming system will work. Wish we had something like that in PC land. Then again, their workload is basically streaming data from disk, so this round they optimized the hell out of that chunk of the pipeline.
Hopefully AMD learns some lessons from that sphere and can bring some of it over to ours.
The problem I see with the PlayStation 5 at least is repairs. If the soldered SSD kicks the bucket you will probably need to either buy a new chip from Sony and resolder the faulty one (or even worse, resolder all of the SSD chips, because you can't rely on which cells physically hold the data, and swapping a single chip will probably corrupt the entire disk), or buy a new PS5/motherboard.
You can't just swap the hard disk like before; I'm sure this will be a major headache for repairs.
Battery life is easier to market than an acronym that sounds like it's either related to trucks or sheep.
RAM eats battery. Apple has made a strategic decision to pretend RAM doesn't exist, ship underpowered hardware, then brag about the battery life.
This is also the reason RAM is not listed on iPhone specs.
If I remember correctly the iPhone Pro 12 Max has twice as much RAM as the other 12 models and it's nowhere in the marketing materials. This is not accidental, it's a fundamental part of Apple's strategy.
EDIT: The max has 6GB and the rest of the line has 4GB. That is not twice as much. I did not remember correctly.
The performance of the phones seems smooth compared to Android phones with similar (or more) RAM. The difference comes from vertical optimization, so the raw numbers on RAM don't give us a full picture.
The relative openness of Android means you're not always getting the most optimized ROM; a lot of vendors add custom crapware to their phones out of the box.
While Java GC on Android may not always be optimal with regard to memory utilization (given that there are other concerns) and AFAIK doesn't allow tuning like you can do for enterprise Java apps, it seems like quite a few improvements have been made over time and many GC implementations have resulted from this: https://proandroiddev.com/collecting-the-garbage-a-brief-his...
Personally, I'd look at how common cross-platform apps are and how many of the frameworks (like Ionic and React Native) don't always use the native system UI in an efficient manner.
It seems a bit like running Electron apps on GNU/Linux and being puzzled about where all of the RAM goes.
The native UI in React Native is where the "Native" in the name comes from. It has nothing to do with an Electron app, which is a full-fledged browser with a Node process attached to it. Not a fair comparison by a long shot.
And yet, that is not the case! While React Native might not carry quite as big a performance penalty as embedding a browser engine, its performance can still be worse than actual native apps, given that it adds additional abstractions to make developers' lives easier and allow for cross-platform development: https://medium.com/swlh/flutter-vs-react-native-vs-native-de...
While one could argue that there aren't that many visually driven experiences like that out in the wild, the benchmarks still prove that most abstractions have drawbacks.
Of course, there are good reasons for using them, which is also why many choose even the previously mentioned Electron platform for development, instead of something like Qt/GTK on desktop.
But for the discussion at hand, it simply serves as a suggestion that there's more at play here than just "Java uses a lot of memory and can be slow".
Lots and lots of people use their laptops in a way where there's little to no noticeable difference between 8GB and 16GB RAM. Why should those people pay extra for something they don't need? I think the HN crowd - with 4 docker VMs, 3 electron apps, 50 Chrome tabs, ML training, long compile times, etc - need to realize that their workloads and needs are not representative of the majority of users. It's ok for Apple to also make computers for people that are not you.
> It's ok for Apple to also make computers for people that are not you.
Yes, it's perfectly fine to sell machines like these with specs from 2012, but the cost savings should be reflected in the price. Honestly, what Apple is doing now seems like pushing power users who are buying new hardware towards non-macOS systems.
Uhm. Adding 16GB of RAM instead of 8GB costs almost nothing and, by and large, is useful to all users. The only reason they ship 8GB is that they want to sell you upgrades, similar to why Apple products come with minuscule SSDs. Certainly, only professionals would need something like 512GB, right?
Yes, but those professionals can now only go up to 16GB on a MacBook Pro in 2020, which is shameful. Oh, and it's soldered, so you have to buy the beefed-up version today; you cannot upgrade the RAM 6 months from now.
Go look at a non-techie's web-browsing habits alone: they have 50 tabs open in Chrome, some of which are months old. And they install all sorts of other crappy software, anti-virus, browser extensions. Some of them use Chrome, Firefox and Internet Explorer at the same time. And/or if they use Slack/Steam/Spotify/VS Code/etc. they are running another 300MB of RAM per app.
If you are a programmer, try all that with Visual Studio or Eclipse running, sometimes at the same time as Android Studio with an emulator, maybe with a local Postgres or SQL Server etc... it easily eats up that 16GB. Oh, let's sprinkle some Docker/Vagrant on top..
Good luck if you are into video/image editing, 3D modeling, illustrating, CAD etc... then MacBook Pros are basically the wrong tool for the job today, whereas they became famous for being the right tool for the job around 2010.
Same problem with Intel CPUs... they've been riding the wave of success of their initial chip redesign for 10 years, and now AMD seems to be wrecking them every 6 months.
My problem with the RAM situation is that we've been paying ~$100 for 16GB for 5 years now, instead of the scale shifting so that ~$100 gets you 64GB and ~$25 gets you a 16GB DIMM. The RAM companies are riding the wave too at the moment. Luckily SSDs forced HDD manufacturers to up their game, or we'd still be paying ~$200 for a 500GB HDD with 8MB of cache; now we at least have 10TB+ drives available to consumers.
That's (slightly) unfair because obviously the RAM produced in 2012 isn't the same as the RAM produced in 2020. Better to compare Apple's price against the cost of top-of-the-line RAM available for other laptops. I checked around and it looks like you can get 8GB 3200MHz DDR4 from a highly respected brand for 56 USD, while Apple will sell you some mystery RAM for 200 USD. It's not an (ahem) apples-to-apples comparison because the Apple RAM is closely integrated with the SOC, but I think it's fair to say they're making a comfortable margin.
Do we truly believe that on-die memory has similar costs? I don't see how this is remotely possible, but it's going to take teardowns of both configurations to check what is going on - as in, are the 8GB units using chips that never had the extra capacity, or is it still there but disabled in hardware?
Yes, right now the memory is built into the motherboard... so you need to buy overpriced memory from Apple (and it makes me think about leaving the Apple platform).
For a work laptop that only serves as an interface to a personal workstation set up in a server room, Macs are so much more enjoyable to use. No worrying about wifi drivers, being able to use the Mac touchpad etc.
Absolutely agree. First they took away the ability to upgrade RAM, then to change the HD/SSD. What's next: can't boot any other OS? They're taking away freedom step by step.
I am a die-hard Apple products aficionado - I don't use any other computing device - but why so many HN commenters stubbornly defend every Apple decision that limits its users is beyond me.
Edit: Re taking away freedom: let's not forget the many App Store dramas on iOS. This is Apple's vision for macOS as well: install only approved apps, show a warning for all other apps (and finally disallow them?)
There’s a fairly significant performance difference between Apple’s new cores and the Pi 4’s A72s.
If you ignore that and the high-end display, touchpad, and keyboard, then I suppose it’s not too far off from a Pi in a fancy case, but then you could say the same for any laptop.
In 2012, the same Mac mini case held a cooling system for a 45W chip, 2x2.5" storage bays (or one storage bay and a DVD-Drive), 2 RAM slots, and had three more ports and a card reader. They really used every last corner.
None of these new M1 devices will be densely packed. They’re putting smaller boards into the same enclosures. I assume next gen is when they will take advantage of space savings.
My 2012 Macbook Pro was also slowly dying. As a Hail Mary I opened it up, cleaned it and renewed the thermal paste on CPU and GPU. Now it's back to old speeds!
And this is not only about 2020; most likely people will use the machine until 2025. (Also consider the fact that Apple solders the RAM to the motherboard.)
Not only the base spec is 8GB but the largest possible upgrade is 16GB. My previous-generation Macbook Pro had the possibility of 32GB, which is what I bought because I think that's a decent minimum for a computer I want to last.
We will probably see the next round of new Macs in either March or June. I suspect that is when we will see the next version of the chipset that would support higher RAM limits and more I/O ports. We might start seeing the new hardware design language, too.
Because not everyone has jumped on the containers fashion of running full-blown clusters on their laptop, and 8GB is more than good enough for enterprise developers.
One doesn't even need to be a developer to fill up 8GB with just a browser, Teams and Slack in the background (or any other Electron apps of their employer's choice).
Another wise choice is not to use anything Electron-based unless forced to by an employer or customer, and in that case it is expected that they pay for the hardware.
This is a confusing comment. If your mother doesn't need more than 8GB RAM, why should she have to pay for more? It's not as though you can't order more if you want it.
It wouldn't force the industry to offer more. It would just mean purchasers such as your mother wouldn't buy a Mac.
I only upgraded from my 2013 MBP this time last year. Through most of that time the macbook was my workhorse as a contractor. It's still more than sufficient for many of the web and data jobs I did.
For this, 8GB is fine though, right? I get the idea - we want Apple to push the bar. But my 2011 4GB MacBook Air with Catalina does those things fine... swapping to an SSD is reasonably fast.
Not true at all.
Maybe in the US, but not elsewhere. I know more than a few developers that work on sub-$999 laptops/machines. Specifically, Dell/Lenovo laptops seem to be the go-to, since they can get them with 16GB RAM + 1TB HDD + non-discrete graphics, and then they just swap out the HDD for a 256/512GB SSD.
About half of my dev friends have given up on laptops (specifically Apple gear, incl. iPhones), and ALL of them have built Ryzen-based machines and work from home. My previous workplace refused to buy us laptops and built typical workstation PCs for everyone (quad-core i5 + 16GB RAM + 256GB SSD + 1TB HDD; this was around 2017). The guys that do have iPads tend to hang on to them, but they swapped their iPhones for Android phones when they ditched their MacBooks. Main complaints? Ports, and performance vs price compared to a Ryzen PC at the same price.
> How are companies still releasing hardware with 8GB RAM?
8GB is a huge amount of RAM. I cannot imagine writing a serious program that needs more than 2% of that. The real problem is programmers using "frameworks" and shit that requires inordinate amounts of unused memory.
Mostly Electron. I don't like it either, but Electron is a fact of life now.
While you don't have to join in on that as a developer, you still have to be aware that 95% of users will have no idea that their instant messaging application uses 800MB of RAM to idle, or whether that's a bad thing.
I run regressions on half a million data points. My 2017 Macbook Pro with (checks) 8GB of RAM works fine. I'm not sure what all you guys' mothers are doing.
Half a million data points sounds like a lot, but it's smaller than a 1k texture. RAM for integrated graphics is VRAM too; just rendering to the screen needs a ton of it, for every open window.
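Quick back-of-the-envelope math, assuming 64-bit floats for the data points and an RGBA8 texture (those sizes are my assumptions, not from the original comment):

    # Rough sizes, assuming 64-bit floats and a 1024x1024 RGBA8 texture.
    points_bytes = 500_000 * 8          # half a million doubles
    texture_bytes = 1024 * 1024 * 4     # one "1k" texture

    print(points_bytes / 2**20)   # ~3.8 MiB
    print(texture_bytes / 2**20)  # 4.0 MiB

Either way, both fit into 8GB hundreds of times over, which is the point.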
Not op, but my main laptop is a 2013 MacBook Air with 8GB of RAM. It’s old enough to have developed dead pixels, the backlight is now noticeably uneven, the USB ports have become loose, several keys now only work unreliably, and it now does an emergency low power shutdown when the battery reports 55%.
(Naturally this means it is now permanently plugged into mains power and an external monitor, keyboard, and mouse).
Yet the RAM is still sufficient for my use of macOS, dozens of Chrome and Safari tabs, Xcode, TextWrangler, iTunes, etc.
A few dozen tabs, as in often in the range of 24-60.
That one aspect never even seems problematic or limiting, and it’s not like I’m oblivious to all the other things wrong with such an old piece of hardware.
Perhaps this pricing was brought about by the same head injury that caused someone in management to believe 8GB is an adequate base spec for RAM in 2020.
They aren't doing it to save $20 per machine; they are doing it to make the higher price of a 16GB machine seem more reasonable than if it were the base model.
Yeah, this screams classic Apple. Advertise the cost of the Mac as lower, but with a shitty option that most people will have to upgrade. The majority are not going to want to buy an 8GB laptop for that kind of money and not just bump it up to 16GB.
This reminds me of how movie theatres offer a small, medium and large but the large popcorn only costs slightly more than the medium, thus enticing you to go for the large.
I’ll add classic Lenovo to the list, whose base-level ThinkPads are underspecced and have terrible screens. The fact that Apple doesn't perpetually claim their machines are on sale is their last bastion of superiority.
You used to be able to replace/extend ThinkPad parts quite easily. We'd buy base models exclusively and just add an extra stick of RAM and swap the CD-ROM for another hard drive or battery - those were super cheap, high-quality laptops.
Only recently has Lenovo started to solder their parts, so base models have lost that advantage :(
They're also charging $200 for a 256GB storage upgrade - spot price of 3D TLC is $3/256Gb ($24 for 256GB) - you can currently buy a 256GB NVMe SSD (full stick with controller) at retail for <$30 as well.
That might be a somewhat relevant analogy if you had the choice to go buy your bread elsewhere, but since both memory and storage are now soldered, you are forced to commit to upgrades, for the lifetime of the product, at purchase. The relevance of the parts pricing here is to point out that Apple is being extremely abusive to its customers in its pricing.
Many laptops now have soldered RAM, but other vendors have not chosen to do what Apple has done. In HP's premium laptop (Spectre x360 13t), adding 8GB of RAM is a +$70 option. For reference, an 8GB DDR4 SODIMM at retail pricing is about $30.
Also, AFAIK, no other major laptop or mini-desktop manufacturer uses soldered storage like Apple does, so in this case, the retail cost for storage is even more relevant.
These are commodity parts, so there's not any R&D to recoup - this is just profiteering on Apple's part. Especially for the Mac Mini, where there's not the same space constraints, the lack of storage upgradability is also a rather egregious form of forced obsolescence. Fine from a business perspective, but hypocritical for a company that claims to care about the environment.
Well, no. Apple buys RAM chips at that price, and they just have to swap one chip for another. The differential in price from one to the other including all costs is that much. Labour costs and OpEx are not affected by choosing Chip A or Chip B that are essentially identical.
This is pretty much the reason why I'll never buy Apple devices again. Recently I bought an iPhone 11 for my girlfriend as a present, and the price difference between the 64GB model (which honestly might as well be a dead brick in this data age) and the 256GB model was around 30% in my country, which is beyond absurd.
You can call me bitter but this sort of manipulation is making me extremely salty to the point where I'll be having seething hate for the company for the rest of my life.
But I think a more important factor is future sales. They will likely sell more laptops as these 8GB owners upgrade earlier (in 2-3 years) as the OS and apps continue to bloat.
So on one side they save $20 per laptop and they likely sell more laptops.
So charge $50 more per laptop and make $30 more profit while not producing 8 GB junk that'll be e-waste in a few years because that's barely enough RAM to run an average browser session anymore.
>Even if you somehow could argue that 8 is all you need now, what about in few years?
What about it? It's not like someone who mostly browses the web, checks email, works with office documents, and so on, will change what he does in a few years...
The fatal flaw in this argument is the failure to realize that web pages are bloating, video is becoming more prevalent, higher resolution images, etc.
So “browses the web” has ever increasing hardware requirements if you want to maintain the experience.
The large consumer market that uses mac for common computing activities will not need more soon, perhaps, but many of Apple's bulwark clients - graphic designers, musicians, and even developers - probably will.
Despite the software they demoed, the machines they showed (with the exception of the Mini) were all aimed at non-demanding users.
I would like to see Photos in those benchmarks; that is one program that would benefit from lots of RAM and would be used by many of the Air’s customers.
Don't forget - with unified memory that means that the GPU has to pull from the same pool, so your usable RAM after display buffers is likely much less.
I have Firefox, Slack, MS Teams, MS Word, MS Excel, and OneNote open. The world's most boring office worker use case.
I know it's a deceptive metric, but Task Manager shows 12.5GB "In Use" and 16.2GB committed.
Each Firefox tab has between 250MB and 1GB of "Memory (Active Working Area)".
HUUUUUGeeeee long tail of 100MB trailing down to 20MB services and things. Adobe has half a dozen at all times, etc etc etc.
I'm game to use any other "more realistic" specification or monitor as I know that with memory paging and virtualization things are wonky.
I don't think this indicates I MUST have 12 or 16 or 32GB - I have a 4GB media PC that works just fine. But I do believe it's an indicator average user can use 16GB, if it's available. Or in other words, if I buy a $1000USD laptop, I don't think 16GB is going to waste away unused with zero benefit ever :-\
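If anyone wants a slightly less wonky view than Task Manager, here's a rough cross-platform snapshot in Python using psutil (assuming you have psutil installed; note that RSS double-counts shared pages, so it's still only indicative):

    import psutil

    # Whole-machine view plus the top memory consumers by RSS / working set.
    vm = psutil.virtual_memory()
    print(f"total {vm.total / 2**30:.1f} GiB, used {vm.percent}%")

    rows = []
    for p in psutil.process_iter(attrs=["name", "memory_info"]):
        mi = p.info["memory_info"]
        if mi is not None:                 # None when access is denied
            rows.append((mi.rss, p.info["name"]))

    for rss, name in sorted(rows, key=lambda r: r[0], reverse=True)[:15]:
        print(f"{rss / 2**20:8.0f} MiB  {name}")

Same caveat as Task Manager though: none of these numbers map cleanly onto "how much RAM do I need".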
Oh, I don't disagree even remotely! We can have a nice thread about optimized apps and 4k demos :)
But in context of expectations in the $1k laptop market, my point was that perfectly normal usage can grow to benefit from over 8GB of RAM these days, whatever the background reasons may be.
On my MBP with Catalina just browsing will get me over 8 gigs. Reaching 16 is another thing; that I did not manage even with Docker and multiple IDEs in a typical day.
I personally wouldn't buy something without 32GB these days. But I would tell family/friends to get at least 16GB. With 32GB, I rarely have to think about RAM. But I'm probably 85%+ utilization all the time. I'll admit that I'm a "power user", but my wife has a fairly new laptop with 8GB and it fills up FAST, even when doing nothing serious. It's at 81% right now with just Firefox, Word, and Evernote.
If I were going for a desktop it'd be at least 64GB.
I'm not saying you're wrong, but this is such a catch-22.
When people complain about Electron apps, the response is always "modern computers have so much memory anyway, it's fine!"
But then when a manufacturer introduces a computer with slightly less memory (but still a perfectly reasonable amount) the big question is whether it can run all those Electron apps!
Is there a way to break the cycle? This isn't good!
> Never measured whatsapp but it's really pretty performant
Are we really talking about the same WhatsApp? It takes a good 4 seconds to load here. I don't remember ever waiting for MSN Messenger to load, on potato-powered computers.
Electron is cancer .. but Skype is native (for now, at least on Linux. Although I'm sure they'll switch to the Teams codebase and Skype native will disappear forever too)
I don’t do any tab-hoarding in Chrome (that’s the job of Safari on my machine), but I’m pretty sure that it will suspend tabs long before you run out of RAM. 8 GB is still plenty for web browsing.
The median Macbook user is not a developer. A web browser and videocalling are the only two from that list you'd expect the median user to be doing with a Macbook.
2. Their entire presentation focuses on editing large image files, editing 4k video, machine learning, and graphics-intensive games (games were mentioned more than anything else)
So, given how these are marketed, they are extremely low on RAM and storage.
Does macOS handle memory in a way where it just lets RAM fill up to some point before actively doing something about it? Because on my 16gig Macbook 8 gigs are gone just with browsing & spotify.
Yes. If you run vm_stat in the Terminal, you can see how few pages of RAM are actually left free at any given time. Modern operating systems aggressively cache files in RAM to improve performance.
The green/yellow/red graph in the Memory tab of Activity Monitor is the best way of seeing at a glance whether you need more RAM. In my experience, macOS won't do any swapping in the green. In the yellow, the system is still pretty usable but there's some swapping going on. When it turns red, your SSD will be getting thrashed pretty hard.
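For anyone who prefers numbers to the colored graph, here's a rough macOS-only sketch that shells out to vm_stat from Python; the key names and the hard-coded page size are assumptions on my part (vm_stat prints the real page size on its first line):

    import re
    import subprocess

    # Rough look at macOS paging/swap activity via vm_stat.
    out = subprocess.run(["vm_stat"], capture_output=True, text=True).stdout
    stats = {}
    for line in out.splitlines()[1:]:
        m = re.match(r'"?([^":]+)"?:\s+(\d+)', line)
        if m:
            stats[m.group(1).strip()] = int(m.group(2))

    page = 4096  # assumption; read the actual page size from vm_stat's header
    print("free:", stats.get("Pages free", 0) * page // 2**20, "MiB")
    print("swapouts since boot:", stats.get("Swapouts", 0))

A steadily climbing swapout count is roughly the same signal as the graph going yellow/red.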
They're not really "gone", they've just been put to use because there's no point in leaving RAM empty. You can do much more than run a browser and Spotify with 8GB of RAM.
It's a good question -- do most people still use desktop office suites these days? I don't know the answer. I use the Google Docs suite for both work and for personal stuff, and I personally haven't seen the need to pay for Office in awhile.
I have a Mac mini that still had upgradable RAM and could be kitted out with a second hard drive. I put 32 gigs of RAM and a 1TB SSD into it. I’ve had it for roughly 7 years. Any current Mac mini is a downgrade except in processor speed.
We're talking about RAM. RAM usage is a direct function of how many apps you have open at the same time. For an iPad that number is a lot smaller than for a laptop.
I don't think that's the reason the iPad Pro performs well at video encoding (or else you'd be able to say the same thing for any Android tablet). It's a combination of dedicated hardware and a fast CPU.
What I'm saying is that just because the iPad Pro is good at video encoding it doesn't mean that it is capable of running a windowed OS with real multitasking. For that you need more RAM. The iPad is built for a specific workflow where you use one or two apps at a time, laptops are designed for a workflow that involves more multitasking.
What Apple appears to have done is transpose the hardware from iPad to Mac without thinking about the different requirements of each. There is a reason that RAM is not part of the package on laptop CPUs. This might work for casual users but people spending $$$ on a "pro" computer will wise up pretty fast.
I used to run a Windowed OS with real multitasking on a 33MHz 486 DX with 4MB of RAM! Sure, apps are a little more resource hungry these days, but there's nothing about multitasking or windowing that inherently requires tons of RAM. An iPad could for sure enable full multitasking without any problems. Apple has made millions of laptops that ran OS X perfectly fine with less than 6GB of RAM. It's just that iPad users aren't accustomed to having to manually manage which apps are open at any given time.
I have a 2019 Macbook Air with 8GB RAM and it runs OS X fine for dev work. RAM compression and fast SSD storage make a big difference. The newer Macbooks will have even faster SSDs.
I think the main reason Apple don't want to add lots of RAM to the base Macbook models is that it's the wrong compromise between battery life and performance for most users. They are probably betting on continuing gains in SSD performance obviating the need for additional RAM.
Video encoding usually uses special dedicated hardware, so it's more a measure of the hardware encoder than of the general speed of the device. Go head to head with AV1 encoding, for example, and you'll see it matches its general benchmarks.
Yes, but Apple are moving in the direction of adding dedicated hardware to support a lot of common use cases. The point remains that video encoding is something that people often use laptops for. Users just want it to go fast; they don't care how exactly the hardware is making that happen.
The iPad Pro has strong raw CPU performance in any case. It certainly outperforms plenty of entry-level Windows laptops.
Seriously though, what’s the connection between it being 2020 and there being more default RAM? Are you suggesting that in the year 2030 we should expect 100GB of RAM?
The goal shouldn’t be to increase ageing technology but to replace it with something better.
If you do anything like virtualization, 8GB of RAM is absolutely paltry, borderline unusable - due to the need, for example, to allocate a specific amount of RAM to a running VM.
To be honest, I find even 16GB to be limiting. The lack of offering of a 32GB configuration killed the immediate sale for me, absolutely no exaggeration.
Because you can't upgrade it post-market, I'd consider these to be some of the least future-proofed releases from Apple in a while.
> Are you suggesting that in the year 2030 we should expect 100gb of RAM?
100GB sounds paltry for 10 years of technological advancement. In 2010 DDR4 didn't exist yet and DDR3 only supported up to 16GB per RAM stick (8 on Intel at the time). Now we have RAM sticks with capacities up to 256GB. The 2010 MacBook Pro only came with 4GB! But really, "640K ought to be enough for anybody", am I right?
Well, I get where you're coming from, but would compare it maybe to electric or hydrogen vs gas cars. If I was in the market for a new car and drove certain distances regularly, I probably wouldn't be the first to buy something that could only go half the distance less conveniently, but does it faster.
I've had 16 gigs for the last 8 years. I thought about getting 32, but I haven't noticed needing more memory and it does use (battery) energy so I decided against it.
More RAM is a patch for components that aren't hyper-optimized to work together. The whole point of what Apple is doing with their own silicon is to create that optimization, thus reducing the need for excessive RAM.
It's worked for them on iPad and iPhone. Samsung et al. would boast more RAM, but oddly enough iPhones were and remain faster.
An iPad and iPhone mostly display one thing. One app or maybe a second one in split screen.
A laptop can run many things simultaneously.
Web browser, office suite, music, email client, cloud sync...
Even my mother struggled with 8GB of RAM on her work machine, and she is by no means a PC expert.
She writes emails and letters, opens spreadsheets and websites. Sometimes she gets a call and has to open another document or website.
End of RAM.
I would NOT recommend the 8GB Macbook Pro for professional work.
You're confusing CPU and memory pressure, and attributing blame to the wrong component.
Android's lack of smoothness is due to thread contention, whereas the iOS kernel uses a dedicated UI thread running at high priority. RAM size has no impact on UI performance, beyond a sufficient amount for the kernel.
To add on to this, the person you are replying to is also quite wrong. Android devices definitely can be smooth and many go past 60hz now and have higher refresh rates than Apple products.
Agreed. I have a Pixel work phone and an iPhone (XS/12 Pro) for personal usage. Even though the Android is also current gen, it can be unresponsive at times, and the face unlock is terrible on it. I can’t remember the last time Face ID failed, but on Android it’s at least 2-3 times a day.
MacBooks, and PowerBooks before them, were about that 5% of power users who fell in love with their tools and preached to all their friends and family to get one too.
Now that Apple has the remaining 95%, it is letting its power users down. But that is only natural; establishing a beachhead with the power users was "just" a genius marketing move that worked pretty well.
>MacBooks, and PowerBooks before them, were about that 5% of power users who fell in love with their tools and preached to all their friends and family to get one too.
And they still are. Just not in their base configuration, like they have never been. I've been using their stuff since the Motorola era.
Was there any time Apple gave "ample" RAM/DISK in their base configuration? No.
They might both be an M1, but only the Pro has a cooling fan.
The Air is going a more iPad route; totally silent and more than enough performance for typical tasks, but under sustained load it's going to throttle hard.
Or they may be throttled lower all the time, so that using the four performance cores still stays within its cooling budget. Either way, it's not going to run as fast as the MBP without having a fan.
I'm very curious to see benchmarks on these things. Also wondering if processors are performance binned from one SKU to another, or if it's really just the amount of RAM and storage.
Apple's been known to have secret hardware differences like that, the 2018 iPad Pro had 4GB of RAM unless you specced it up to 1TB storage, which included a secret upgrade to 6GB RAM. You wouldn't know from the specs, since Apple doesn't talk about how much RAM their iPads have.
Now here we are with the M1 and Apple doesn't talk about the clock speed. All MacBook Pros have an M1, but do they all have the same clock speed? Maybe.
Here[1] is an article from The Verge that goes into the full differences with confirmation from Apple. For those too lazy to click through, yes the fan is the biggest difference. Beyond that the screen in the Air is slightly darker, the base model has worse chips due to binning, the battery is smaller, and there is obviously the Touchbar.
Thermals and sustained performance. I expect a 10W passively cooled CPU in an MBA body to throttle quickly - I currently have a 2017 12" MacBook, and for about 60 (winter) / 15 (summer) seconds of sustained load it's the most awesome little machine in the world, and then it becomes a slog.
It has a 7W CPU in a slightly smaller body than the new Air, so I expect the Air's performance reviews to be 'the story about throttling'.
The Pro comes with fan cooling.
With that, the Pro can run for much longer without hitting max temperatures.
The Air is fanless, so in an intensive task you will hit thermal limits very soon and the CPU will be slowed down.
I have a 2018 MBP with an i9/Vega 20 - I can testify to how terrible their thermals are. But given that this chip can be passively cooled, even Apple's crappy venting will probably do a decent job with it. Need to see what happens when the reviews land.
Oh I am sure judging by how excellent MacBook Pros cool things, having passive cooling will make no difference at all. I mean, during summer I put my MacBook on a large ice block that I freeze over night, this way I can maintain acceptable build & development speeds during the day. Should work the same for the MacBook Air. No?
The emphasis here being on the word "should". I wouldn't buy just yet until some third party has published a sustained load/heat test. Apple have sucked at this for quite some years now, it's folly to touch a MacBook for anything remotely compute-intensive
Only when you compare apples to oranges, but this is an apples-to-apples comparison. Same underlying silicon, almost identical configuration, different chassis. It's impossible for the Pro not to outperform the Air when the only substantial difference for performance is TDP, and we have zero reason to believe that a heat pipe + fan would be outperformed by passive cooling of all things.
Agreed, the Pro's cooling should be better. However, I think it's worth waiting and seeing. There might not be that big of a difference in the end (e.g. both are uncharacteristically well designed and run fine; or conversely both are so badly thermally throttled that no serious intensive tasks can be performed) in which case other factors could end up being more significant.
Apple has always sucked when it came to cooling. The Apple /// and Lisa were plagued by “IC creep” where, due to heat expansion from inadequate cooling, the ICs would wiggle out of their sockets ever so slowly.
You can solve that problem by going into keyboard settings and changing the "Touch Bar shows" setting to "Expanded Control Strip". Or "F1, F2, etc. keys" if you prefer. Whichever one you set, holding down the "fn" key will temporarily toggle to the other one.
You still don't get the same feel as physical keys, but no Siri, no sliders to adjust volume or brightness, no stuff shifting around as you switch apps.
Yes, it's just annoying that the best workaround involves replacing the touch bar functionality with an inferior replica of the physical keys it replaced.
The M1 chip will for sure sustain higher performance in the Pro because it has active cooling.
Apple ramps their chip performance dramatically and frequently in iOS devices; that’s how they hit such high benchmarks but also deliver long battery life and stay cool. Those chips spend most of their time not performing.
In such a system, having “the same chip” in two different devices will not tell you much about their relative performance under real world loads.
I literally had to cancel orders after noticing this as well, was very confusing even for a developer.
Also, what's the difference between the left and the right versions? Just the storage? Can't you configure the storage? Why do they make it look like there are two different options?
I'd also be very interested to see if that $250 difference between the 7-core & 8-core GPU is really just a software toggle. It sounds like they've built 1 SOC, not two, so I don't know how they came up with that.
The tech specs say 500 nits on the Pro and 400 nits on the Air, so it's possible they are different screens, but it's also possible they just have different backlights.
The larger chassis allows for better cooling and more battery. Their comments would make it seem that the same processor in the MacBook Pro performs better because of more thermal headroom due to active cooling. The Air is fanless.
I wonder how useful they are. I used to have a gaming laptop in the early 2010s and the nicest cooling fan I could find, but it only dropped the temperature down by 2-3 degrees
My suspicion is that they're not useful at all. Once I bought such a cooler pad, and placed a 2011 13" MacBook on it. Then I benchmarked it (i.e. max out the CPU), and measured the sensor "CPU Near". I could not measure a difference between the cooler pad on or off.
I can't remember the brand of the cooler pad, but it had two 10 cm fans.
Apple mentioned in the event that these Macs will have hardware verified secure boot. Since I’m not very knowledgeable in this area, can someone explain (or even try to guess) what this would/could mean for running Linux on these? I use Macs way beyond Apple’s support timeframe with OS X/macOS, and Linux is the one that runs on some of the older Macs and provides adequate security and security related software updates.
I don't know the answer^, but how old is your current old Mac hardware? I don't know about the desktops, but Macbooks from 2016 are not well supported hardware-wise in Linux - things like no WiFi even. There was a good GitHub repo tracking it for up to I think the first touchbar Pro, and basically it was dismal then and only got worse (according to repo owner who consequently stopped bothering iirc).
So.. depending what you want to do on these older machines, my point is that this may be the least of your worries.
^(though I think it's fine, because it's the reverse that would be a problem? Bad news for 'hackintosh' if all supported versions of macOS can expect secure boot hardware, I think)
Until 2019 Apple sold 13 inch MBPs without the touchbar, and these models did not have a T2 chip. They are still miserable computers to run Linux on, although I think the original 2016 touchbar-less MBP performs better than all the rest, albeit (IIRC) no working audio, very poor suspend/resume functionality, and until pretty recently no keyboard/trackpad functionality.
Oh, and Apple's NVMe interface is non-compliant. This is widely reported as Apple locking Linux out with the T2 chip, but that's not really true. The T2 chip will by default prevent unsigned kernels from reading/writing to the SSD, but this can be disabled.
Even if it's disabled, the controller is not standards compliant, and Linux won't see the underlying block device. I saw some diffs floating around on github a few years ago that fixed it, but I don't think it was ever mainlined.
Basically, post-2016, Apple seems to have incorporated so much custom (and undocumented) hardware that running alternative OSes on these machines is basically impossible. Windows works because of the Apple-provided HAL + drivers for WinNT.
Even in Boot Camp, Apple did not bother to expose all the hardware to Windows. The touchpad is reported as a mouse with a scroll wheel, and there's no option to enable hardware encryption or to use Touch ID to unlock.
Windows supports various biometric logins and provides a rather generic API. Some manufacturers use that to provide login based on the veins in a finger, not fingerprints. Apple could have implemented those APIs.
I have a MacBookPro15,2 (2019, with T2), on which I dual-boot Arch Linux. It is perfectly usable, though the hardware support is not great. In particular, resuming from suspend is very slow, and I haven't gotten the built-in mic to work. And getting the system to work did require using a patched Linux kernel installed from GitHub. So not easy, but possible.
Your claims about "dismal then and only got worse" are unfounded. The repository you refer to is still active. https://github.com/Dunedan/mbp-2016-linux If anything, activity has slowed down in these threads because it was figured out how to make it work.
Even among people who run Linux on these MacBooks, the general recommendation is to keep a macOS partition around for stability. Some of the value you get from any Apple computer is in the software. If you intend on instantly installing Linux or Windows as your only OS, this probably isn't the computer for you. But if you want to or have to use Linux sometimes, these T2-chip Macs can do it.
> There was a good GitHub repo tracking it for up to I think the first touchbar Pro, and basically it was dismal then and only got worse (according to repo owner who consequently stopped bothering iirc).
As I'm said repo owner, let me chime in here quickly to shed some light on that.
I used a 13" MacBook Pro 2016 for 3 years with Debian as my sole machine for work. When ordering it back in 2016 I wasn't sure how difficult it'd be to get Linux properly working on it, as at that point it was only known that it was possible to boot Linux, but nobody had figured out even such basics as support for the integrated input devices or the NVMe SSD yet. However, as I had been using Linux on Macs since 2006, I figured it'd somehow be possible to get it to work for me.
Fortunately I wasn't the only one serious about running Linux on these 2016+ MacBooks, as I have very limited knowledge of the required low-level programming skills. What I did was provide and moderate a GitHub repository (https://github.com/Dunedan/mbp-2016-linux) as a central place to document and discuss the status of hardware support for these MacBook Pros, along with some little patches and lots of feedback and bug reports. A big shoutout to all the contributors who did an incredible job of reverse engineering, implementing and upstreaming drivers for various components! That's quite an achievement for such a complex device with no public hardware documentation at all!
After a while it turned out that support for certain components would be rather difficult to get working flawlessly. As an example, even at the end of the 3 years I used the MacBook Pro, I had to use an external adapter to be able to use WiFi. With that in mind I started to reconsider why I bought Apple products: I bought them because of their superior hardware quality. But if I'm not able to use the hardware as intended, what's the point of paying a premium for Apple products? And let's just not talk about the butterfly keyboard or the horrible thermal management. So when it came to replacing my MacBook Pro, I decided to go with a Lenovo Thinkpad X1 Carbon instead. It's not perfect, but I'm way happier now than I ever was with the MacBook Pro 2016, as the hardware just works.
As I don't own any 2016+ Apple device anymore, the help for further Linux support I can provide is limited, but I didn't stop bothering at all! I'm still actively managing said Github repository, but activity in general has significantly dropped there over time. Either the devices work well enough for other people now or they also replaced them with non-Apple hardware.
I bought a 2017 MBP hoping the situation would eventually improve but it never did, so I never got around to installing Linux. I'm expecting it'll be even worse for these M1 systems.
It is a shame, it's not something I ever really did (or not for long, for a period I do recall having Arch on my 2013 Air) but I like the idea - I like Apple's hardware, just not the software.
Oh Apple, why are you doing this, taking freedom from your customers. I don‘t want to use Windows, neither do I want to tinker with Ubuntu. But if you keep going that path, you are forcing your power users to think about migrating to platforms that respect users freedom to do whatever they want to do with their machines.
After two years of using an otherwise beautiful iPad Pro (along with my MBP) I came to realize that a crippled machine that is very limited in how I use a computer is not the future of computing I like. The device collects dust for quite some time as I prefer a computing environment where I use the terminal a lot, where I use my bash and Python scripts a lot to automate, where I use Emacs a lot to write tech docs, do my project planning, writing, automating workflows, and many more things that are not doable on a crippled (iPad)OS.
You keep going toward your vision of a computing platform where your customers are just consumers, not hackers and doers, and we hackers need to look for alternative platforms, most probably Linux.
You can get XPS Developer Edition, System76, Purism, or many other laptop brands (eg. any of these https://elementary.io/store/ with elementaryOS, whose DE should feel fairly familiar to a macOS user) with GNU preinstalled these days.
Yes the iPad is "crippled" in that sense, but I find it's an excellent accessory to a computer. Not everything I do needs a terminal, my Python scripts, favorite text editor, and rapid multitasking. The iPad is a wonderful (albeit expensive) side device for lighter activities on the couch, in the kitchen, or on the go.
It doesn't need to be our only computing device to be appreciated, and not every computing device needs to be powerful.
What I am lamenting is the observation that iPadOS seems to be Apple's vision for how computers should work: crippled, not much user control, just content-consuming devices with Apple controlling every aspect of them. That's not a personal computer anymore, not a device we have much control over.
I'm perfectly happy with the division between "consumer machines" and "creative machines".
I obviously count myself among the people who needs and wants creator capabilities, but for my technologically challenged family and friends there's no reason to learn and manage all the complexities of a classic computer environment if they just want a point of access to youtube, netflix, spotify and social networks.
I still shiver remembering the times of browsers riddled with search bars, trojans and antivirus software slowing computers to a crawl, and the people who are "good with tech" being dragged to friends' houses to see what's wrong with the computer.
Same here. I can't stand macOS, the interface is terrible and it's an awful development environment.
But the iPad is an excellent companion, since I use it to scribble around, consume media, edit photos, keep my music sheets, and do all that stuff that would suck on Linux.
What? The Apple II was a fantastically open machine! It even came with the circuit schematic and ROM source code right in the manual! It had lots of slots and there was a massive third-party ecosystem. It was when Jobs got to design machines with the Apple III/Lisa/Mac that things closed up.
Linux does support many ARM architectures already, and even supports the x86/x64 version of Secure Boot in some configurations. If Apple wants to either allow their Secure Boot to be disabled or to allow end users or Linux distros to somehow get their own keys trusted, I'm sure the port can happen in the coming $smallnum years with enough interest + resources + time. (But not $smallnum months, sadly.)
I use a Mac for work (and paid for by work) but refuse to spend my money on something I can't use the way I would like to use it. I think that these companies shouldn't be able to lock you out of using your tractor/car/computer, which is the direction everyone seems to be moving in. It's a real shame. I understand if they want to void the warranty because a user blew away some critical firmware, but that's another ball of wax, and it's on the user to suffer the consequences.
Void my warranty, boot with a scary splash screen, whatever, but don’t lock me out of the thing I ostensibly own. Or, maybe change the “buy” button to a “license to use” button in your store.
Maybe, just maybe, it shouldn't be a $1T company? Maybe it should once again become a company that puts its customers and their experience before profits?
People are frustrated with technology lately. Even non-tech people. Apple has everything it would need to change that, but it decides to contribute to technology becoming ever more frustrating time and time again instead.
Agreed. But I can't think of an example of a company that ever voluntarily downsized. Downsizing usually happens because a new competitor arises that makes a product customers prefer. That's a very difficult proposition in the personal computer space.
If history is a lesson, any company that arises to compete with established players, gets acquired by them. And it's a shame no companies actually decline these acquisitions.
>no companies actually decline these acquisitions.
Not true. Yahoo made 2 separate offers to acquire Google, and an established social-media company offered to acquire Facebook (for a billion dollars IIRC).
Craig Federighi said himself that they don’t boot other operating systems.
Could you link the talk where they said it can run binaries not signed by Apple? The only thing I could find is where they still allow you to boot older versions which they don’t let you download anymore. To keep the actual mac experience.
Can’t find anything in either document that allows booting of non-Apple-signed software. The only thing I see there is something like Secure Boot on PCs, where Apple would need to sign your bootloader in order for it to be able to boot.
It changed. You use kmutil create to create the artifacts and add the hash to the Secure Boot policy. (--help at https://pastebin.ubuntu.com/p/mN3Z2kfJWy/, no manpage)
It is no longer a personal computer. And it is a security disaster if you cannot control your own computer's hardware. It should be made illegal for Apple to operate like this. The user MUST have full control of the computer. It is a user right and should be a human right. The only reason I used Macs was that they respected my ability to use any OS I want, if I want.
Apple's not going to ship drivers for Linux, and it's a SoC. So someone needs to somehow write an open source driver for Apple's proprietary black box of a GPU.
I suspect very little will work at all if someone can get Linux to boot on one, and it will be a very lengthy endeavor to get things up to being usable.
The reality is that we can't answer this until we have some hands on.
It really depends on whether their secure boot architecture can be disabled (unlikely, knowing Apple), or allows adding one's own keys (also unlikely). Boot Camp probably won't happen since Windows does not support the architecture: they'll be pushing people to use VMs.
They might also provide some untrusted path to boot without it being able to access certain secure features. I wish they would do this, but I won't hold my breath!
That said, the kernel itself needs to have support for the hardware architecture, and then drivers for all the new hardware they're pushing out. I don't expect this to be soon, though I'd definitely be willing to sponsor anyone willing to work on this.
Who downvoted your comment, and why?
This is a good comment and a good strategy to teach them a lesson. Without some effort those companies will not recall their moral values. Richard Stallman warned us about this development long ago, and he was right. Crippled hardware is useless for a hacking mind.
Dude they'll just sell them to someone else. It doesn't change anything, the material and resource cost has already been paid. Stop making this about something it absolutely is not.
Companies DO care when people return something, because that is pure signal. "I got it because I thought I would like it and I don't" is a much different signal than "I have no idea what you think because I never interacted with you". That is likely one of the most effective ways to make a company sit up and take notice, the return rate of a product is a key indicator of its success.
I really don't see that there's anything to disagree with there. Loving Apple, as you may, doesn't make the above point wrong.
Do you think Apple is just going to take the computer you touched, turn around, and sell it to another person? No, they’re going to take the whole thing apart, replace all the consumables and user-facing parts, then sell it as refurbished. And that’s the best case: they might have to strip it for parts or trash it depending on what it was that you bought.
This has nothing to do with a love of Apple or anything, and everything to do with “you’re abusing a program that they are going to either ban you from, or remove because you abused it too much”.
And degrading the personal computer into a machine under someone else's control, attacking the rights of the person and stripping people of privacy completely, is not a wasteful thing to do? I mean, it's garbage by definition, and sure, it takes time for people to understand this, but this machine is useless by design for a freedom-respecting society - is that not a waste? It's a huge waste of resources, I would say. Returning the product doesn't add too much to that waste; it simply tells them what it is.
Dude, you’re arguing about control to the wrong person. Apple knows about this already and you returning a bunch of devices isn’t going to get them to change their policy.
I haven't bought Apple stuff for 5 years already, because I can't stand the stupidity, and their MacBook Pros would just cripple my abilities and mobility with those stupid dongles, non-upgradable memory and idiotic touch bars.
The only reason I could bear some of their hardware was that I knew I could put Linux on it when I'd had enough of it, and now what?
I am not buying, sophisticated people are not buying, and it doesn't help, so from my perspective IF anything is ever going to change their policy, it is returning products to SEND A MESSAGE. The other option is to wait until it's too late, like it was when Steve Jobs had to return to save them..
Or do you suggest even stronger action than returning?
This is intended. They should stop selling things that attack a person's privacy and freedom. Or is this concept not your priority, and are you OK with having a computer controlled COMPLETELY by someone else, which means ZERO privacy?
PS: Well, this one is downvoted too. Looks like some have lost even the sense of what PERSONAL computer means.
OK. Keep downvoting! It's a good strategy for silencing someone when there are no valid arguments.
They’ll stop selling the device to you or accepting your returns. So you haven’t really done much.
Also, people who complain about downvotes usually attract more. I’d suggest not doing that. Claims that there are “no valid arguments” against your position rather than nobody wanting to deal with you are, well, absurd.
If many people do that, it's a different story. Maybe it's OK with you and you see no danger in their strategy, but I see this issue as a huge attack on freedom and rights, including the right to privacy. History has many examples of how people protected their rights and freedom. Returning a product is a very light way to send a proper message.
Recently I've seen more and more perfectly valid comments being downvoted, and I do not like it. If this forum becomes mob-controlled with bullying, then I see no reason why bright people would stay here. If one doesn't want to deal with a comment, one usually moves on, like I do. But when an argument is perfectly valid and instead of an answer I see someone simply downvote it, that looks like bullying.
I also prefer secure hardware, but I find macOS completely useless for work.
While I can appreciate that some see other OSes as something of a curiosity, for many of us it's a big deal-breaker, and it's a shame Apple is not willing to provide their hardware to so many potential clients who simply don't want their software.
What advantage do you see to "secure hardware", I'm unaware of any recent Mac security issue that would have been prevented by it. It gives Apple a lot more control over the device but I don't see any advantage to the user.
It means that you would need to find a vulnerability in the bootloader and exploit it to break free from Apple's secure garden. Linux has worked on ARM for years, so I'm sure it won't be impossible to port it over, but whether enthusiasts will do it or not is another question, as you would need to write drivers for the proprietary GPU and storage to make it useful.
That’s not really what I’d call a “WWDC talk”, but sure, it mentions that Apple won’t provide Boot Camp, and that they are running their OS demos using virtualization. I didn’t see a claim that they won’t let you reduce the boot security.
"We're not direct booting an alternate operating system, it's purely virtualization. Hypervisors can be very efficient, so the need to direct boot shouldn't really be a concern."
For me that quote means they're not allowing booting any alternate operating system, and they expect developers to use virtualization if another operating system is needed. I would be happy to be wrong about that.
I understood that as "we weren't booting something else in our demos, we were using virtualization" but not "we can't boot anything else". I am sure, however, that they would like you to use virtualization instead of direct booting.
I'll be sticking with my 2015 Macbook Pro Retina 13". Great machine, not too thin, heavy enough, no stupid touch screen, usb ports, great keyboard. Everything apple has done since hasn't compared.
I agree here. I really don't understand some of the bashing that happens every single time the Macbook Pros come up in HN comments.
I was a longtime happy owner of a 2015 15" MBP until earlier this year when I decided it was time to upgrade to a refurbished 2019 16" MBP. I was a bit nervous at first but I have to say that I have no major complaints, other than the fact that I wish I had F keys instead of the touchbar. Contrary to everyone else, I really like the keyboard.
As for solving the touch bar issue, I use Pock [1], which I read about on HN. It removes the need for using a slider every time I want to adjust the volume or brightness, and lets me control Spotify and see what song is currently playing.
I also have a 2018 15" MBP for work. If you believed the comments here you'd think that I am unable to type or use the damn thing. Honestly, after 1 week of use I'm already used to the keyboard. Not having an escape key kind of sucked, but I have rebound caps lock to escape on all of my Macbooks and enjoy that even more than having an escape key.
My other complaint is that the trackpad is a bit too large - I find myself accidentally hitting it sometimes and it just seems excessive. Finally, it's a shame that you can't mess with the battery/RAM/SSD yourself but unfortunately that's more of a trend for the industry than just Apple.
Overall I'm a totally happy user on both the 2018 and 2019 Macbook Pros. I was quite nervous about the possibility of having to use a Windows PC for work when I started my new job. And don't even get me started on having to use a Pixelbook at Google.
> Contrary to everyone else, I really like the keyboard.
Lots of people like it; they just don't post about it.
Mine has a bunch of keys that print 2 or 3 times when I press the key once (the issue that everybody eventually gets with the model I have), but I still prefer that flat keyboard to the old one. Takes a few days to get used to, but then it's great.
I wonder how many people are informing their opinions solely from the gross incompetence Apple exhibited with the double-typing keys issue you mention. I seem to recall that they have fixed it in newer models, but I wouldn't be surprised if it still persists.
I hate to admit it since I've drunk the mechanical keyboard koolaid, but I like the mbp flat keyboard. I easily prefer it to the 2015 keyboard, which imo is really nothing special (at least it doesn't gunk up). I hear people say great things about the thinkpad keyboard, but from my little experience with it, I'd still rank the mbp flat keyboard higher.
With BTT and GoldenChaos you can swipe anywhere on the Touchbar with two fingers to change the volume, and with three fingers to change the brightness of the display.
I use the 2020 macbook air personally and 16in professionally and the keyboard drives me insane, to the point I bought a magic keyboard to be able to function without screaming at my computer for accidentally turning my video camera on yet again during a meeting. I really wish they would have a no touch bar option, because I could use more power but still be able to touch type. Touch bar is the worst technological "improvement" I've ever come across.
Did you know that you can change what buttons are on the Touch Bar? You can make it almost behave the same as other keyboard by having it always show F-keys.
The difficulty isn't the content but the fact that you can't blindly hit the key reliably. Reminds me of back in the day when you could compose and send a complete text message from your phone without taking it out of your hoodie pocket in class. Not the case anymore in our touch-screen world without tactile feedback.
Must be finger positioning. I hit the TouchBar accidentally several times a day, each of which was infuriating, until I managed to semi-disable it by turning it into a row of inferior function keys.
My company gave me the current 16" late last year, and it's been a piece of junk: it suddenly starts to roar for no reason (while plugged in to the power supply), so people on a conference call complain that I should switch off the hoover.
Often freezes with not much open other than a few Chrome tabs, O365 and Sublime (not actually DOING much with any of them at the time).
The keyboard is poor, the touch bar is useless but doesn't get in the way that much, and the touch pad is too large and disturbs me sometimes while typing source code. The lack of ports means that without a hub the whole machine is pretty useless (so all users will have to buy a USB-C hub for another $100 - annoying, but the real annoyance is having that thing hanging off your laptop all the time, and you mustn't forget to pack it, or you're screwed).
I also dislike the 16" form factor compared to 13", as it breaks all my leather bags when I try to carry it (not so relevant anymore in the days of the home office..).
I was having issues with my 16” until I disabled Turbo Boost. I dunno if it’s specific to the i9 model, but it was running hot and processes were spinning out of control. Since disabling, it runs a lot cooler and I have pretty much zero issues.
Ok, so like a reverse Turbo button. You’re running an i4 most of the time (and that’s fine) and sometimes you crank it up to i9 (when the solo begins, to stretch this metaphor to the limit).
I don't think that's a problem specific to Apple. I have a Dell XPS with an i9 in it that work gave me; I don't use it because even idling in Windows the fans ramp up... If I use a laptop I end up using my personal Lenovo ThinkPad X1 Extreme (gen 1), which doesn't have fans trying to take off into space.
The original retina macbooks are still the best. :(
So what do you do if a powerful work machine is needed now: buy the current MBP 16" or go with a ThinkPad? Disappointed they didn't even upgrade the Intel CPU in the MBP 16" to the newer generation.
The newest one fixed it enough to satisfy me, i.e. it has a physical escape key again. I never used the rest of the function keys anyway, and I like the customizable nature of the touch bar. The lack of a physical escape key when they first introduced it was my only real gripe.
I was personally really hoping for an Intel 16" refresh today. I'm up for a new laptop at work and it's a little bit of a bummer to drop that much on a year-old machine. Not quite sure what to do at this point.
It's very similar to the 2014 models, it has scissor switches and not the controversial butterfly keys. If you ever tried a magic keyboard then it's basically that.
My work gave me a 2020 16" MacBook Pro and it's probably the worst business laptop I have ever had. The screen is smudgy and glary, the keyboard is awful, the touch bar is a meme, there are no ports, and finally it literally cuts into my wrists, because it was designed to be looked at, not used.
You are missing out if you don't have 4 USB-C ports spread across both sides of your laptop. You can charge your laptop and connect a display with a single cable, and without caring about putting the computer in the right orientation.
My 2017 13" Pro (no touch bar) has two USB-C ports, both on the same side. It is annoying. Especially after it turned out there is a design flaw in the 2017 Pros where charging on one side tends to trash your battery - guess which side both ports are on?
It’s nice to only have to plug in one cable when I dock it at my desk but I would have greatly preferred not having to replace the battery twice in the three years I owned it due to this. Especially since only one replacement was covered under the warranty thanks to the ‘rona shutting down all the authorized service centers in my city.
Overall I have positive feelings about my 2019 16" MacBook Pro, and USB-C on both sides is definitely part of that... However I have yet to find a decent adapter that will both charge my laptop and output 4K @ 60Hz. I've gone through several iterations now, and the closest I've come is a monitor that can connect via USB-C directly, but even then it only delivers enough juice when the machine is near-idle... I'm looking forward to the day when I can have a 1 (or even 2) cable dock setup that won't run me $250+. Until that day, I have to use all 4 ports for:
Looks like the just announced 13" MBP only has 2 usb-c/thunderbolt ports and they're both on the same side.
It never occurred to me that they would go backwards on this. I bet this is one of those overlooked details that will get a lot of notice once people actually start receiving these new laptops.
There are two models of 13" MBP - based on the other specs of this new machine, it occupies the lower tier - which in the Intel models also only has two ports.
Presumably, when they fill the higher tier with the new chip it will also have more RAM and more USB ports.
Don't MacBooks mysteriously run slower if you plug in power on the left side instead of the right? I don't remember the cause, but I've personally experienced this handedness issue on my Mac.
I heard yesterday that Apple does not offer a battery replacement for 2014s and you have to go third party. This is an unfortunate development and I'm wondering if I should inquire about getting mine replaced.
I have this machine, it’s a super portable and reliable powerhouse. I recently upgraded to a 2019 15”, mostly to go up to 32G ram, and it feels like a downgrade in terms of design and usability.
Also have this one. It was my first macbook, and I think I got really lucky with getting in when I did. Been going strong for 5 years, and has no difficulty doing anything I need it to do. Only thing I wish I did was get more storage (only got the 128GB model).
FYI, the storage is upgradable on these machines (and it's very straightforward to do). It's a nonstandard SSD interface, but you can find official modules second-hand on eBay quite cheaply these days.
Do your research on the adapters and drives, though. Some have issues with suspend. Make sure the hardware you add has been tested by some brave YouTuber or blogger before buying anything. The actual act of replacing it doesn't look that bad, though.
Used Apple products maintain their value very well. The 2015 MBP Retina can be found on Ebay between $700-$1000. I've thought about getting one myself.
The MacBooks that were for many years the smallest and lightest Apple laptops would get unbalanced and knocked off your lap just by placing your hands on them slightly asymmetrically.
I think these super thin laptops are ugly, too hot and not ergonomic - also this sacrifices the keyboard. I actually think this model represents the perfect mix of all of these things - thickness (being the thinnest thing I can buy is not a feature for me) - weight (I really don't mind 1.3kg) and keyboard.
> Modern web is too heavy for the old dualcore i5, unfortunately.
It's a shame that the web has become this borderline unusable mess. I shouldn't need a quad-core machine with multiple gigabytes of RAM to just read the news online.
I have the same machine too, and I'm starting to notice this as well. It's absolutely ridiculous that LinkedIn for example makes my machine slow down to the extent it does.
I've been waiting to upgrade my 2015 for 5 years now! It's funny, when I bought it I only went with 8GB of RAM because I figured I would replace it in 2 years-ish. It has served me well, but it's struggling with only 8GB these days. Now I'm ready for 32GB, because I'm planning on keeping the next one for another 5 years. Let's hope the rumors of a 14" MacBook Pro in 2021 (with a mini-LED screen) have legs. Hoping this is only the interim state of the 13" Pro and is due to really needing a different ARM CPU that isn't available yet.
I am in the same boat. I love this Laptop, but Lightroom (CC) is really slow compared to how it runs on iOS. I would assume the new M1-based machines could run the iOS version of Lightroom almost out of the box.
Same boat as you, but for some reason with the new MacOS update my computer kept getting the most random freezes to the point where I gave up and gave it a fresh Linux install. Haven’t looked back and I’m extremely happy with the work I’ve been putting into it.
I have this generation air and a recent pro... call me crazy, but somehow the hard edges of the pro are super uncomfortable when the laptop is on my lap. Maybe an air vs pro thing rather than generation thing
Did I miss something, or are they, once again, not upgrading the camera? The low-light performance of the current version is really bad. Since we're all using these cameras way more, I really thought there would be a hardware bump. You can only squeeze so much detail out of an under-exposed, noisy image with software.
If you look at the cooling solutions on the current macs I’d say the inverse: that these chassis were designed with the M1 in mind and they shoehorned Intel CPUs into them.
This is exactly right. They could have done everything in one go, but they had to change the keyboard and call that the 5th-generation MacBook Pro. They didn't even include Wi-Fi 6 in the previous model, nor did they hold an event for it. Kuo says a redesign is due in 2021, but Apple can't keep getting away with doing the same thing it has since 2016. There are laptops with dual screens, 4K touch screens, vapor chamber cooling, and Apple just keeps swapping out the keyboard for the next gen. This new chip is good for the chip industry, but it's not really a different laptop at all.
I'm curious, are higher-quality cameras even available that fit into the thin lid? (At a reasonable price?)
I was always under the impression that the built-in camera is so much worse than your phone's front-facing camera simply because the phone is thick and has room, while the MacBook lid is much thinner.
The reality is, for most people 720p is more than enough, since most videoconferencing is at a heavily compressed 480p anyways, and it's not like most people really want it super obvious that they missed a spot shaving or have a small zit anyways. And sure the quality arguably falls to below 480p in very low light conditions, but that's not an issue in any normal well-lit office, coffee shop, or kitchen or living room.
I would assume Apple is making the right call here that a higher-resolution camera isn't worth additional thickness.
Yeah, I hate the way the background blur feature in Google Meet suddenly makes my face way brighter and "pop" in the video. With winter coming, I'm getting a lot less natural light in my work area and my videos just look dreary.
Considering how good they are at iPhone cameras, there is absolutely no reason to have a poor camera in the laptop nowadays. They know how to make cameras that fit in tight places. They're just being lazy/cheap.
Instead of having a camera as thin as the lid, they could have the camera stick out a little bit and fit into a recess on the other side of the clamshell when closed.
Surprising as it seems, 8GB seems to be enough for most "normal" users. My wife has an 8GB MacBook Pro and I'm always shocked by how many open tabs she keeps in Chrome and how many Office documents are open at any given time.
My 16GB MBP is constraining sometimes and the next one will, doubtlessly, come with at least 32GB. Until Apple can do that, I won't use an ARM-based Mac as my daily driver.
As for the 8-core, it's 4 beefy ones and 4 low-power ones. Load will shift from one kind to the other according to usage patterns. This is how they achieve that 20-hour battery life.
I bought a decent external camera, and while being able to position it is a must-have, the quality means that if I haven't shaved, or there's a nick or something in the background, it's obvious.
Whereas the people I'm on an online call with have about as many pixels dedicated to their hair as Lara in Tomb Raider 3.
I still care enough that I use it, and I am disappointed in Apple not including their iPhone camera, but it might not be something the average person wants.
I agree that the cameras are subpar. I wonder if Apple figures that most Mac users already have an iPhone, and that if they want to use a better camera for videoconferencing, they can always just plug in their iPhone and use it as the camera.
Has anyone done this? I've considered using my iPad Pro's camera this way but have never cared enough to actually do it.
I'm pretty sure there was a part of the presentation talking about a camera upgrade. But it's not something on my wish list, so I didn't pay attention.
Double-check the recording on Apple's web site. The answer may be there.
Super interesting how they kept the touchbar on the Macbook Pro keyboard but not on the Air. As a software developer, I'm going to be more likely to buy an Air just due to the keyboard.
Having bought a Macbook Pro with an Esc key earlier this year, lack of an Esc key was 90% of my issue with the touch bar. My biggest issue now is when I accidentally activate it. I didn't have an issue accidentally hitting the function keys on my 2013, so why not bring those back? Or invent something new. Forward or back, I don't care, just admit the touch bar was a dud and get it off my keyboard.
EDIT: My mother also says she gets distracting autocomplete suggestions on the touch bar while she's typing. I vaguely remember doing something to turn that off on mine, but she is terrified of Covid so I haven't had a chance to get at her laptop and fix it for her. I don't know what human factors genius at Apple decided it would be helpful for people to see words hopping around at the edge of their vision while they're trying to type.
> I vaguely remember doing something to turn that off on mine, but she is terrified of Covid so I haven't had a chance to get at her laptop and fix it for her.
You can use Messages to easily initiate screen-sharing (with you controlling her Mac) to do that. No 3rd party software, no action on her side needed.
It's absurd that you had to resort to this.
I wonder if someone also released a standalone bluetooth escape key that attaches to the side of a keyboard (equally absurd).
This is such a bizarre unforced error. Why would they just arbitrarily remove a row of the keyboard? Taking it away and asking why I need it is like asking why I need right-click, pinch-to-zoom, or my left pinky finger. It makes no sense that we have to choose between fully featured input and active cooling.
Either way, based on this announcement I'll probably hold off on upgrading for another generation. That'll leave some time to see how the transition goes, and with any luck the next ones will include 32 GB RAM, Mini-LED, and 5G (along with a full keyboard).
I don't know why they didn't just add a screen in the blank, function-key-width strip of bare aluminum between the keyboard and the hinge if they were looking to add some decoration to the keyboard. That would have been praised; instead it's been a pariah.
Step Over/Into/Out shortcuts in Xcode (F6/F7/F8) are the really obvious ones which I hit many times per day. Play/pause/etc. with the Fn key are easier to hit without looking than the touchbar.
I prefer that they change labels in a context-sensitive way. For keys that I don't use that often the icon is a better trade-off than a tactile response. Sliders are useful as well.
I'm not the person you replied to, but I frequently use the play/pause key and also the volume keys. I occasionally use the function keys in my text editor too.
I'm with you. My work computer has the TouchBar, and my personal computer has the physical keys.
While I prefer the physical keys, once you get the TouchBar configured properly (I use MTMR), it's really quite nice. Combined with a physical escape key, it would be ideal.
Considering how the people on HN boast so much about being L337 Haxxorz, I'm surprised they don't see the value in a secondary interactive screen that they can make do anything they want.
I don't look when I type, either, except for the function keys, because I don't use them that often. So since I'm already looking down at the function keys, looking at the TouchBar makes zero difference.
One minor annoyance, however, is that the TouchBar isn't visible in direct sunlight, which is to be expected of any screen. Function keys don't have that problem.
It is Cmd+Backspace for me because I remapped my MacBook's keyboard so it matches my ThinkPad's[1] - I really don't want to have to play Twister with my fingers.
[1] Yes, I've remapped/swapped the Ctrl+Fn keys on my ThinkPad too.
I can't fathom why Apple couldn't just include both. I'm using a company-provided touchbar'd Macbook Pro for work, and it's handy (I've got it setup with screenshot and screen lock buttons so I don't have to memorize the absolutely bonkers keyboard shortcuts for them); it just seems boneheaded to treat it as a replacement of the function keys instead of an addition.
I've been using Airs as my primary machine since 2010. I don't use bulky software (XCode, Adobe suite, Office, etc), so if you use any of that I wouldn't recommend it, but for software dev it's been plenty fast and a real joy to use.
The only time I'm speed constrained is deep learning, but generally I just run tiny test sets locally and then run full jobs on a cluster or the cloud.
In the mid-2010s I had a desktop with like 64GB of RAM and four million processors, and I found I was still more productive programming on the Air. Productivity-wise, I think it's a very high-dimensional space to consider.
There's a noticeable latency using gmail.com on my iMac Pro, too.
I've always used powerful desktops (Mac Pro, iMac Pro) and MacBook Airs while traveling, and while the Airs aren't very good for video games like StarCraft II, I've never had a problem with performance while doing anything else (I work in VS Code on TypeScript). My unit tests run in ~20 seconds on the MacBook Air as well as the iMac Pro.
Since the new MacBook Air doesn’t have a fan while the MacBook Pro does, I’m sure there are some differences that Apple isn’t admitting right now. The battery life claimed by Apple is also higher on the Air compared to the Pro. Maybe the Air is throttled or runs at lower clock speeds.
If you configured them the same (chose the more expensive Air), the Air and the 13" Pro have the exact same specs from what I can tell other than a slightly brighter screen (500 nits vs. 400 nits) and slightly bigger battery (58.2 whr vs. 49.9 whr) on the 13" Pro.
I am also not a fan of the keyboard or touch bar, but especially the touch bar. It's a cool idea and I can see useful functions, but not pushed up against a place where you're typing. I am constantly bringing up Siri or changing volume when hitting numbers, -= or delete from a finger swiping across the touch bar accidentally. There used to be a tactile force needed to activate F key functions, but now there's just a capacitive touch bar.
What's wrong with the new keyboard (touch bar aside)? I haven't tried one yet, but I thought it was supposed to be more or less the same as the 2013 - 2015 MBP's.
Personally I tried using it for F keys, which is quite frankly the only requirement I'd have, and it was awful. With your hand fully extended, which it has to be if you're also holding a meta key, it's impossible to hit the right F key every time. I don't know how anyone writes code on those machines.
It went back to the Apple store within a week and I got an air. Which went to the Apple store in a week because the keybbboooaaarrdd wwaaass aawwwful.
Then I bought a thinkpad. Which is less of a shiny toy.
T495s running Ubuntu 20.04. Absolutely perfect machine as far as I am concerned. Everything works flawlessly that I've tried and the quality is excellent. Really like it.
Like most oldsters, I have presbyopia, which is exacerbated in low light. When I need to turn up the brightness of my monitor, I can actually see the "button" on a touchbar. On all my non-touchbar laptops, I randomly press F keys and hope for the best or pull out my phone for a flashlight.
Yes, I miss the physical esc, but F-keys are usually SW configurable and I can configure splat-1-10. That said, I am not a frequent programmer...
I love the Touch Bar. All those cmd-opt-shift bizarro keyboard shortcuts, I have mapped to custom buttons across lots of apps. A couple popup apps on globally-available buttons. Even a swiping gesture while on iTerm to fly through command history.
It's great if you take the time to actually customize it.
Incidentally, I place the “blame” for the poor uptake of the Touch Bar at Apple's feet. They relied on uninterested developers to make it useful, and didn't put the necessary tools in the hands of users to make it useful themselves. In the five years since its debut, they've done nothing to expand its capabilities.
Everything cool I can do with the Touch Bar is thanks to a third-party tool, BetterTouchTool. Also enables some cool stuff like three-finger-trackpad-swipes to cycle through window tabs! Worth the price of admission for that feature alone, give it a look. https://www.folivora.ai
I really have no clue what apple is thinking with regards to the touchbar.
Why should any developer spend time developing sorely lacking features for the touch bar, when they just released the most popular mac without one?
The only way I can make sense of the decision to release the air without a touch bar, is that it must be really expensive to manufacture and they were struggling to hit the $1000 price point.
As a result the touchbar lives in this weird limbo state. Apple themselves clearly are uncertain what they want from it, and it shows. Since its release the touch bar has been left mostly abandoned and it’s been up to third party developers to make it useful.
It can be useful, at least after heavy customization – something a screen lends itself well to. Why isn’t apple doing more to help users customize it?
macOS comes with an application called Automator. Automator is confusing for power and regular users alike and as a result, nobody uses it. Why not rethink Automator completely with the touchbar in mind? Bring the power of programming to regular users with an easily accessible ‘create your own button’ feature that lets users add custom commands accessible from their touchbar?
My fondest wish is a high-end MBP 16" with no touch bar. The touch bar's sole role is to generate errors and missteps resulting from merely grazing over it.
Even setting the touch bar to vanilla F-keys, grazing triggers F-key actions which is so frustrating.
FWIW there are third-party software options for the Touch Bar such as MTMR (my touch bar, my rules), which at least allows you to activate the haptic feedback in the trackpad when touch bar buttons are pressed. I found that it helped dramatically with accidental touch bar presses.
MTMR also solves the other main problem with the Touch Bar which is that it hides brightness and volume controls behind a tap (so you can't, for example, instantly mash "volume down" when you find it is unexpectedly loud.) With MTMR (and others I believe) you can make multi-touch gestures on the bar to adjust volume and brightness swiftly.
All that said, I'm not convinced that the touch bar adds enough value to justify its cost. If your day-to-day computer use includes tasks it is good at, maybe. As a developer, probably not.
Just my $0.02 as a touchbar-skeptic-cum-macbook-owner.
I use it for exactly what I've always used function keys for- brightness and volume.
I also occasionally use the emoji picker or app specific stuff. For example- my markdown editor has Touchbar selections for code blocks and all that sort of stuff. I don't write enough markdown to remember everything- so the touchbar makes it nicely discoverable without having to click through menus.
I only use it for brightness and volume controls (and TIL about the Emoji picker thanks to a sibling comment). I LOVE the analog-feeling controls on brightness and volume. Even though I know it's the same under the hood as an incrementation button, the UX of it makes me feel substantially happier. I absolutely miss the volume and brightness sliders when I go to use my Chromebook or my wife's laptop.
I normally keep the touchbar configured to only show the ~4 most used buttons (night mode, brightness, volume, and mute), and find that I use the other buttons rarely enough that I forgot what else was on there -- mainly because I use touchpad gestures instead.
I rarely (never?) used F-keys in my IDE (Jetbrains) or other apps, so I don't miss them. I have a physical Esc key (which is nice), though I used it rarely enough on my previous generation touchbar mac that it wasn't very infuriating. Having the physical escape + power keys removed any complaints I previously had about it.
Interesting. Volume and brightness adjustment is my least favorite part of the touchbar. With physical keys I can adjust the volume/brightness without looking down.
If it were a physical slider, I'd totally agree with you.
(Not that there's any right or wrong answer here.)
I dislike it immensely, mainly because it changes all the time, and this makes it hard to use without looking at it every single time. The best way is to use an external keyboard.
Depends on how you use it, if you use a lot of actions (basically macros) via keyboard shortcuts then you probably hate the touchbar, because they can only be bound to f-keys.
VirtualDJ supports using it for a crossfader, but then, it would be a way better move to buy an Air, and a real audio interface and mixing board, or at least an external controller.
If Touch Bar were in addition to the function keys + Escape key, everyone would praise it as yet another brilliant Apple innovation, and rivals would be copying it the way every notebook nowadays looks like the late-2008 unibody Macbook Pro (and really, all the way back to the 2001 titanium PowerBook). But it's not, so they don't and they aren't.
On a tangential note, there were reports about the Intel based MacBook Pro from earlier this year having CPU usage spikes and heating issues when charging from the left side port. For users with this problem, it was effectively “charge using the right side port”.
I definitely get more heat and throttling powering on the left side than the right, on my 2019 16”. There are quite a lot of people who have reported the same, for all models which have USB-c ports on both sides. If you’re not seeing a difference, it’s pretty likely your workload or some background process is still pushing the thermal limits of your machine.
Hmm, I wouldn't say this is nonsense. I can consistently reproduce the issue by charging on the left side (making the computer completely unusable), and then switching to the right side and having it run perfectly fine. Since I started charging on the right side I've had zero issues with it.
You would also think they would keep the MagSafe connector, which was truly revolutionary (and has definitely saved me at least a dozen times), but yet here we are.
I've already tripped or stumbled on my usb-c cable dozens of times working from home. The damn thing is already pulling out at the ends. Why can't apple make good cables? They make their environmental statement making you buy the charger for your phone rather than shipping one in the box, just to sell you junk cables that don't last. I had old multipin cables and earbuds from the original ipod that still hold up. All of their textured grippy feeling cables have been crap.
You can't charge from either side if there are only ports on one side, which, if I understand the design and GP's comment correctly, is the case for the ARM Macs. Which would be the ones GP meant by "this generation".
Five years later and I still can't buy a new 13" laptop from Apple with more RAM than my 2015 MBP.
Edit: Apparently you can configure a (Early 2020) Intel-based 13" MBP with 32GB of RAM - I was not aware of that. Hope they bring that option to the ARM versions ASAP, especially if the performance gains are as good as Apple claims.
I'm guessing that since the RAM is now part of the SoC package instead of on separate modules, creating models with more RAM becomes more difficult, not only because of space constraints but also the cost of manufacturing more variants?
The iPad Pro has a mere 4GB of RAM and is faster at almost every task AND multitasking than any of my other beefy machines. I wouldn’t discount this yet.
I think it depends what they've done with Big Sur.
Take a look at your Activity Monitor and see how many background tasks your laptop is running right now (I currently have 481, and I only have 4 apps actually running).
Compare that to the iPad, which can run 2 apps max, and where Apple controls all the background activity.
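If you're curious about the raw count without opening Activity Monitor, here's a rough sketch in Python; it assumes the third-party psutil package (pip install psutil), which is not part of macOS:

    # Count running processes, roughly what Activity Monitor's process list shows.
    # Assumes the third-party psutil package is installed (pip install psutil).
    from collections import Counter
    import psutil

    procs = list(psutil.process_iter(attrs=["name"]))
    print(f"{len(procs)} processes running")
    # The ten most common process names, to see what the background chatter is:
    print(Counter(p.info["name"] for p in procs).most_common(10))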
Hitting a wall how? The machine stops working properly? Or you see all the RAM allocated in Activity Monitor? Because MacOS aggressively allocates RAM even under a normal workload.
Well when you start getting memory pressure you start losing cycles to compressing and decompressing pages which makes everything run like complete shit.
That's not how macOS works. It "keeps the pressure on", so to speak, to try to have as much in memory as possible before you need it. Have you experienced actual issues with specific software?
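Rather than argue about it, you can watch the pressure build yourself. A minimal sketch, assuming the third-party psutil package (pip install psutil); it allocates incompressible random chunks and prints available RAM and swap usage as it goes. Kill it before the machine becomes unusable:

    # Allocate incompressible memory in chunks and watch available RAM and swap.
    # Random bytes resist the compressed-memory trick, so pressure builds for real.
    import os
    import psutil

    CHUNK_MB = 256
    chunks = []

    for _ in range(40):  # up to ~10 GB; Ctrl-C whenever you've seen enough
        chunks.append(os.urandom(CHUNK_MB * 1024 * 1024))
        vm = psutil.virtual_memory()
        swap = psutil.swap_memory()
        print(f"allocated {len(chunks) * CHUNK_MB:5d} MB | "
              f"available {vm.available // 2**20:6d} MB | "
              f"swap used {swap.used // 2**20:6d} MB")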
Blows my mind, I do most of the same on a 2013 MacBook Air with only 8GB RAM. I’ve never had a perf hiccup unless I try video editing or gaming. But I just don’t really have a want to do those things, hence why I’m still getting by on this thing. I was going to upgrade once for the hell of it but all the keyboard issues kept me seated.
I expected you to say you did something with graphics/media. I guess the VMs could be the difference. I don't use them. But I have 100 Chrome tabs open at most times lol.
All these things work with 8GB RAM. macOS uses all available RAM aggressively, so if you have 32GB RAM it will appear as what you are doing requires 32GB RAM, which is not the case.
On my machine, if it goes more than 5G into swap it starts to become less responsive.
It really depends on your usage patterns. VM abusers and people who need to keep an eye on more than one project at the same time (as in multiple IDEs) can't make do with 16 GB.
And please don't tell me to start closing software, I'm willing to pay for more ram to have everything handy. Except... I can't.
Right now I have 6GB for VMs (for ~12 docker containers) and 8GB for various IDEs and editors. Probably typical for those on my development team. Surprisingly Chrome isn't even the top ten for memory footprint on my laptop.
So far 16GB has been fine on this 2017 MacBook. Wouldn't turn down 32 though :-)
Could be true. But I get instant feedback from all apps, so what's the measure of "reduced performance" that would matter? I'm actually curious, if nothing lags, would I notice a difference by upgrading?
My read of your comment is that you think I'm used to the lag. That's not the case. It's not my only device, just my only Apple laptop. My newest device, an iPhone 12 Pro, is an upgrade from my 6s that I surely noticed, and knew I would. I had been frustrated by the 6s for a while but held out for 5G.
Yeah, I found running all my dev environments in Docker/VMs made everything too slow. So I've just installed everything natively and have my own little docker-compose-style runner to simplify running our in-house microservices. I would definitely appreciate decent Docker performance, but it was actually surprisingly painless to set this up.
I did some benchmarking on that public taxi rides dump (~600m records) and could run through it all in under 15 seconds directly from SSD. No way in hell your CPU will keep up with anything close to memory bandwidth or beyond 3GB/s.
The only thing that could explain it is some really aggressive swapping that offloads anything not needed that second. But that won't work for things where you need all the RAM right now, video editing for example.
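For what it's worth, here's a back-of-the-envelope sequential-read benchmark in the same spirit as the taxi-data scan above. The path is hypothetical (point it at any multi-GB local file), and the OS page cache can inflate the number on repeated runs:

    # Time a sequential read of a large file to estimate SSD throughput.
    import time

    PATH = "/tmp/bigfile.bin"    # hypothetical: substitute any multi-GB file
    CHUNK = 8 * 1024 * 1024      # 8 MiB reads

    total = 0
    start = time.perf_counter()
    with open(PATH, "rb", buffering=0) as f:
        while block := f.read(CHUNK):
            total += len(block)
    elapsed = time.perf_counter() - start
    print(f"{total / 2**30:.2f} GiB in {elapsed:.2f} s "
          f"-> {total / elapsed / 2**30:.2f} GiB/s")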
But you don’t have background services on the iPad and apps are hibernated to disk if they aren’t active and iOS needs more ram. It’s possible that Apple is doing the same thing on macOS, but we don’t know (it would break a ton of apps).
This is a good point and raises an interesting question: will they continue using the same ARM chip across the whole line when the 16" MBP and the iMacs make the switch, and if so, will all Macs of the same generation always have the same amount of RAM? Or will they branch the chips (M2 and M2S, or something)? Is RAM becoming less relevant when you have smart integration of software and hardware components, to a point where stratification is no longer necessary in most cases?
> Is RAM becoming less relevant when you have smart integration of software and hardware components, to a point where stratification is no longer necessary in most cases?
If they intend these things to run software development, audio/video editing, CAD, or any other resource-intensive workloads, there will absolutely be demand for more.
I could see the M2 coming with 32GB across the board, at which point the only current outliers (setting aside the Mac Pro as a special case) would be the 64GB configuration for the 16" Macbook Pro and the 64GB and 128GB configurations for the iMac. I could imagine Apple's optimizations closing the performance gap with the current 64GB offerings, and then perhaps they just leave 128GB customers to go all the way for the Mac Pro. I would be surprised to hear that they sell very many 128GB iMacs right now anyway.
I used a MBP with 4 GB of RAM up until 2018 for rather heavy duty data science workloads. Wasn't ideal, but one thing I learned was that you really can't infer the amount of swapping and performance degradation that occurs just by looking at how much RAM is in use vs. how much you have, because the OS will eat up whatever it gets. The little memory pressure graph you can see in Activity Monitor is quite good, and on my current machine with an "unfortunate" 16 GB of RAM, memory's always full but I have no complaints about speed whatsoever.
Crazy to imagine this, but restricting developers to 16GB machines might be the most effective way to fight against (cough... electron driven) software bloat :-)
fwiw, you can configure a 13" Macbook Pro with an Intel chip up to 32GB. But I agree, I wish they launched the new M1 based 13" Pros with up to 32GB RAM
The low-end Air only has 7 GPU cores, compared with 8 on the one with more storage. So they must be disabling a bad core and selling it as the cheap model. Other than that, all these machines use the exact same CPU. Which means an iMac or 16" MBP is probably going to use an M1X or something with more cores.
Yes. Chip fabrication is super sensitive to the condition of the silicon wafer used. Chip companies talk about yields, because some percentage of chips can't, for example, be run at the highest clock rate. Indeed, some can't run reliably at all. If there is a microscopic flaw on the wafer that ends up being where one of the cores is located, disabling that core altogether is an option to keep that silicon marketable.
The Intel 13" MBP with 2 TB3 ports also topped out at 16GB. They're actually still selling the Intel 13" MBP with 4 TB3 ports, which is the model that can be configured to 32GB.
This is 16GB which is shared with their GPU so it will be interesting to see what limit Apple puts on the GPU for grabbing memory.
There are stories that Apple is working on a discrete GPU, which would free up space on a future Mx chip for additional processors or memory. However, looking at the space occupied by the DRAM and GPU, it looks like a larger die would be required for any on-board memory expansion.
Do most people with MacBook Airs have workloads that require more than 16GB of RAM, or do they generally value a lower price point?
Cost, performance, light weight - pick two right? Apple seems to have picked a combination of lower cost and light weight. Customers who need more ram probably move up to a MBP.
For the lowest end models right? It's the same tradeoff to bring down prices for people buying the lowest end machines. Presumably they understand that people who need the higher end spec'd machines will wait to buy a new laptop until more software has been migrated to run natively.
If you need a 13" MBP that supports more than 16gb of ram, they sell it. It seems unlikely they'll stick with a 16gb limit once they start replacing the higher end devices.
I honestly can't think of a situation where you need more than 16GB of ram on an ultrabook. If your job is heavy video editing - I don't think getting a laptop is a good idea.
I know I use a lot of RAM for compiling and tests, but we have cloud instances with up to 500GB of RAM for that.
Also, with the move to ARM, I suspect heavier tasks like these will be forced to move elsewhere. Businesses might not want to switch for a few years while they wait for dev tools/ARM servers to be available.
Please do explain what your workflow is that requires you to have so much RAM in such a small form factor with a severely limiting CPU and an almost nonexistent GPU?
GPU: don't need it except to drive 1 or 2 external monitors if I'm at my desk (thus small screen not an issue either)
CPU: 8 cores is fine for me
RAM: mainly for a local dev environment running in Docker
For reference, I currently have a 2019 16" MBP with 64GB of RAM and an 8-core i9. Right now I'm using ~45GB of it; usually it's up around 50 or 55. The CPU is at ~10%. While I'm developing locally, the workloads are highly bursty, but sometimes they will chew up almost all of the CPU for about 10 minutes.
I used to have a 2016 15" MBP with 16GB (4 core i7) and while I could still run everything, the RAM was always pegged. Before that I had a 2016 13" MBP with 16GB (2 core i7) (I actually swapped with a coworker because I thought his 15" had 32GB). With both of these the RAM was by far the limiting factor, and while the increased CPU cores have been nice, most of the time I haven't noticed. The RAM increase on the other hand is always noticeable. Thus, 16GB is a nonstarter for me.
Edit: To add, I prefer the 13" form factor when I'm not at my desk.
More correct would be to term this limit "absurd" or "clownish".
Even if Apple views the use-cases of these machines as primarily consumption devices and the users of these machines not as creators but as consumers, the bloat of the modern web is expensive and getting more expensive every day.
The end user needs this memory even if they really are just using these as facebook machines.
EDIT: That's not just the notebooks - the mini appears to have this limitation as well (down from 64GB previously).
Not all RAM is created equally. Better memory management means less RAM can be better than more RAM. Pair that with superfast flash storage for SWAP and you might not even be able to tell the difference between 16 and 32.
Besides, this release cycle is 100% optimized for an impressive speed boost, tempered by a need for a more impressive battery life.
> Better memory management means less RAM can be better than more RAM
No, it can't. If more RAM means it's slower RAM then maybe it can be better to have less of it in some workloads. But otherwise it's never better to have less RAM than more RAM. Better memory management can make the impact of less RAM be less severe, but it's still unambiguously worse.
Especially if you have workloads that actually need the RAM like large ML models or editing 8K videos.
> better memory management absolutely can be better than just throwing more RAM at the problem.
Those are not competing in any way. Better memory management does not require nor benefit from less RAM.
Apple doesn't give you a different kernel when you choose the 8GB SKU instead of the 16GB one. It's the same software, just with less RAM. And having that less RAM is best case break-even in "day to day" experience, but never better.
It's true that if the OS could predict exactly which memory pages to keep and which to swap out, we could save memory wastage, but so far I haven't seen any memory management scheme that can reduce memory consumption by half.
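For context, the usual heuristic an OS falls back on is recency: keep the pages touched most recently, evict the rest. A toy sketch of LRU eviction below, purely illustrative and not how macOS's pager actually works:

    # Toy LRU "page cache": keep the N most recently used pages, evict the rest.
    from collections import OrderedDict

    class LRUPages:
        def __init__(self, capacity):
            self.capacity = capacity      # number of resident pages
            self.pages = OrderedDict()    # page_id -> data, oldest first

        def access(self, page_id, load_page):
            if page_id in self.pages:
                self.pages.move_to_end(page_id)   # mark most recently used
                return self.pages[page_id]
            data = load_page(page_id)             # "page fault": fetch from swap
            self.pages[page_id] = data
            if len(self.pages) > self.capacity:
                self.pages.popitem(last=False)    # evict least recently used
            return data

    cache = LRUPages(capacity=3)
    for pid in [1, 2, 3, 1, 4]:                   # page 2 is evicted, not page 1
        cache.access(pid, lambda p: f"page-{p}")
    print(list(cache.pages))                      # -> [3, 1, 4]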
For me personally, I won't even consider a machine with less than 32GB ram in 2020/2021. With 32GB, I never close out of applications that I use regularly, and so it allows me to switch state instantaneously for not that much more money. My workloads are typically read-heavy & multiple GBs - editing/screening/cataloging hundreds of RAW photos, loading of multi-GB games, having about 50 tabs open in FF, etc. After having switched careers I rarely code anymore, and I don't think these are uncommon requirements.
Yes that's true, for the "average consumer" that really only needs that RAM to power the 100+ browser tabs they have open. But if you're doing lots of virtualization or containerized work, super fast SWAP isn't going to cut it.
I no longer code, so I'm probably close to the 'average consumer' now. I personally consider 32GB to be the minimum amount of RAM that anyone should consider in 2020/2021 (with obvious caveats on money). My multi-GB workloads are read-heavy and include loading multi-GB games, editing hundreds of RAW images, opening 50+ tabs, etc., all without leaving the cosy confines of my RAM - which I still sometimes do. I have a 100GB system commit limit on my W10 box, and with my current usage pattern, I hit about 50GB at peak.
I was literally about to order a new mac mini, until I noticed it only has 16GB max memory. How can they do this?! The previous model could be upgraded to 64GB, had 4 USB-C connectors (instead of now 2) and the option for faster ethernet. It really sounds like this new mac mini is a downgrade from the previous model. I wonder if the new M1 chip/architecture really makes that much difference to make up for the downgrade of the rest.
The new Air looks good. The new MBP 13" not so much. If they're going to lower the max RAM, they should at least provide an explanation of why 16GB in the new architecture is comparable to 32GB in the previous one, if that's indeed the case.
Dell XPS: $1000+ with 8GB. Lenovo X1: $1000+ with 8GB. ThinkPad T14s: $1000+ with 8GB. This list could be long. Even System76 Linux computers are $1000+ for an 8GB config. I do not know where you got the idea that $1000+ laptops have 16GB by default.
I am sure there are consumer (crap) computers like the HP Pavilion, Dell Inspiron, etc. where one can get a $1000+ computer with 16GB or so of RAM.
My Huawei MateBook 13 2020 Ultra has 16GB and cost ~€1200 i.e. same ball park as the ones you listed (i7, 512GB SSD, 16GB RAM, GeForce MX250, 2736x1824 display).
Apart from the RAM the main reason I went for it was the resolution (I liked the extra height in a 13" form factor). Also, it works with linux - though haven't tried it yet.
Recently bought a Thinkpad E15 Gen2 with 24 GB of RAM for less than $1000. (Different country though (Japan), and ~6 week wait, but extremely reputable retailer)
* As to how that deal worked, I'm guessing that large retailers that agree to buy a lot of laptops get a large discount.
I finally switched to 32GiB this year, and man, what a difference. I hadn't thought I was all that constrained by 16, but now that I'm not, all the edge cases where that was the pain point are thrown into relief.
Not strictly comparable, but my company provided Lenovo P1 that's a few years old got 64 GB. Can't imagine having to deal with 16 GB a few years later..
I think it's actually a good move for the first gen. If they put out the CPU frequency, people would be comparing it to the intel parts. ARM and x86 being different architectures, a 2GHz M1 is going to perform differently than a 2GHz Core i5
There have been dozens of Macs that came out with a lower baseline frequency than the last generation. My ancient 2004 iBook G4 has a higher frequency than a 2020 MacBook Air; that doesn't mean customers conflate that with being faster, nor should they, if your marketing is worth a damn.
It is sorta meaningless these days. The 3.6GHz chip in my P consistently averages at 4.3GHz, and climbs up to high 4's when it needs to. It's also much faster than a 3.6GHz processor from 5 years ago.
It's a great baseline for comparing same-generation computers, though. Without benchmarks, none of us has any idea which is faster, the Air or the Pro, or the 15".
Note they say "latest", not "best". That means that a Celeron 5205U, dual core processor with 1.9GHz, no turbo, no hyper threading, 2MB L1, would be a valid baseline.
Apple's direction of marketing is better suited to mainstream consumers. Many of them don't know what CPU frequency means. For them, it's additional noise with no added value. Consumers have too many choices, and throwing a spec sheet at them often makes them compare for hours and still not be able to make a decision, whereas you know that this year's Mac is likely faster and better than last year's.
What I can't find is how many external monitors can be supported on the M1 chip. I don't see any detailed specs. All they say about the M1 specs is:
"The Apple M1 chip is the first system on a chip (SoC) for Mac. Packed with an astonishing 16 billion transistors, it integrates the CPU, GPU, I/O, and every other significant component and controller onto a single tiny chip. Designed by Apple, M1 brings incredible performance, custom technologies, and unparalleled power efficiency to the Mac.
With an 8‑core CPU and 8‑core GPU, M1 on MacBook Pro delivers up to 2.8x faster CPU performance¹ and up to 5x faster graphics² than the previous generation."
At practically every desk in Silicon Valley for 4+ years, yes. Two 27" 4K Dell UltraSharps and a choice of 13" or 15" Macbook Pro are standard issue for an engineer.