Instruments.app is the official way to access performance counters. I believe it can use a few more (non-whitelisted) events, which are described in /usr/share/kpep/a14.plist - I couldn't figure out how to hijack the single-consumer API that I think it's using. (Edit: it seems it just shows non-whitelisted events in the GUI, but doesn't let me use them.)
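If anyone wants to poke at that plist themselves, here's a minimal sketch; it only assumes the file is a standard property list and doesn't assume anything about its internal layout:

```python
import plistlib

# Path mentioned above; the internal layout isn't documented, so this just
# loads the file and dumps its top-level structure instead of assuming
# where the event definitions live.
KPEP_PLIST = "/usr/share/kpep/a14.plist"

with open(KPEP_PLIST, "rb") as f:
    data = plistlib.load(f)

if isinstance(data, dict):
    for key, value in data.items():
        if isinstance(value, dict):
            print(f"{key}: dict with keys {list(value)[:5]} ...")
        else:
            print(f"{key}: {value!r}")
else:
    print(type(data).__name__, data)
```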
I don't really understand the kernel module process/policy - it was a lot of trying random things and seeing what worked. I think you shouldn't need permission, as long as you're building it and running it on your own machine, but I do have the paid Apple developer account that lets you develop on iOS, so that might have been a factor. You do need to disable some security features to run kernel modules.
I'd rather be running this kind of instrumentation on Linux (or without a kernel, e.g. using m1n1), but I haven't set that up yet. (It'd be nice to lock the CPU to a frequency, and pin threads to cores, etc.)
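For comparison, the Linux side of that is pretty straightforward. A rough sketch, with the caveat that the sysfs path and governor name depend on the kernel and cpufreq driver:

```python
import os

# Pin this process to one core so measurements aren't polluted by
# migrations between performance and efficiency cores (Linux only).
os.sched_setaffinity(0, {4})  # CPU 4 is an arbitrary choice

# Locking the clock happens outside the process, e.g. as root:
#   echo performance > /sys/devices/system/cpu/cpu4/cpufreq/scaling_governor
# The exact path and available governors depend on the cpufreq driver.
```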
Could anyone explain the instructions/float metric? I just tried running the same code from the blog on an M1 Air, and I get quite a different number of instructions/float, whereas the instructions/cycle are pretty close:
# parsing random numbers
model: generate random numbers uniformly in the interval [0.000000,1.000000]
Thanks a lot, yeah, it makes sense.
Edit: it seems weird when I consider that on the blog it takes 490 instructions whereas in my case it takes 360. That is roughly 26% fewer instructions required than on the blog.
Instructions per cycle are more or less the same in both results, telling me that my M1 Air is somehow using fewer instructions to run the same code (or is faster)
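For what it's worth, this is the arithmetic behind those figures (my own back-of-envelope, not from the blog):

```python
# instructions/float is retired instructions divided by the number of floats
# parsed, so at a similar IPC, fewer instructions per float also means
# proportionally fewer cycles per float.
blog_insns_per_float = 490
air_insns_per_float = 360

reduction = 1 - air_insns_per_float / blog_insns_per_float
print(f"~{reduction:.1%} fewer instructions per float than the blog's run")
# -> ~26.5% fewer: the Air run is doing less work per float at roughly the
#    same instructions/cycle (different compiler output would be my guess).
```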
I was skeptical about the M1 processors, specifically around whether they'd deliver the snappy performance you expect from something like a desktop.
I have definitely changed my thoughts, and if Apple can solve getting more RAM into their systems (it’s pretty shameful that in 2021 you max out at 16GB on a MacBook Pro), I feel like they are going to knock it out of the park with their new architecture.
It wasn't until 2017 or so that you could even get 32GB in a MacBook Pro, so the fact that they have 16GB in the first gen seems about right. According to Apple's propaganda and the blogosphere, the M1 is so much more efficient that it doesn't "need" the extra RAM. That's just too much for my old dog to wrap my mind around.
I’m using one as a daily driver (red teaming) and it easily handles a couple of busy VMs, a terminal, a few services, an editor, Obsidian, Slack/Teams/whatever, and a browser. I stopped even bothering to carry the charger with it.
Sometimes I look at Activity Monitor and it is swapping a bit, but I don’t notice any perf degradation, so I stopped thinking about it. Shrug.
I replaced my i9 16” MBP with a 13” Air. Performance with Xcode is either the same or better on the Air.
Battery life is the true game changer though. When I went out with the 16” I had to make sure I had a 100% charge if I wanted to use it for ~3-4 hours while out. With the Air I can grab it with 25% battery left and be fine for the same amount of time.
The “actual” pro machines with M1 derivatives are going to be incredible…so long as Apple doesn’t make another Touch Bar/butterfly-keyboard-esque mistake.
I also replaced a 16" (although i7) with a 13" Air. Best computing purchase I've ever made.
I was lucky if I could even make it to 4 hours battery on the 16", but the real killer for me was heat and noise. It was like sitting in front of a little aeroplane all day long, and because the video out is wired up through the dedicated GPU, just plugging it into an external monitor added an extra 20W to dissipate. It also had weird sleep issues the entire time, if I left it connected to a monitor overnight I'd often sit down at my desk the next morning to find it burning hot, fans blasting, with the lid closed.
So happy to see the back of that piece of crap. It got reasonable reviews at the time, I can only imagine that either no-one tested it with external monitors, or because it was the first model to drop the butterfly keyboard people just gave it a pass on everything else.
I'm sure that swapping isn't a huge deal, but the big issue is around longevity of the SSD. If you have workloads that use more than 16GB of RAM, swapping a lot will prematurely wear the SSD, which isn't replaceable.
I think this has been largely debunked by now; early SSDs had these problems, but newer ones? Not so much. I mean, here's what SMART says about the drive in my Mac at the moment:
Percentage Used: 0%
Data Units Read: 12,515,207 [6.40 TB]
Data Units Written: 13,202,995 [6.75 TB]
Host Read Commands: 264,092,805
Host Write Commands: 158,635,815
Controller Busy Time: 0
Power Cycles: 188
Power On Hours: 223
The % used is (I think) the capacity lost to wear. I've hammered it fairly hard so far and it's not even at 1%, so I'm pretty sure I'll get at least a few years out of it based on that.
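If you want to pull those same fields yourself, here's a rough sketch using smartmontools (brew install smartmontools); the device name and the JSON field names are assumptions based on smartctl 7.x, so adjust for your setup:

```python
import json
import subprocess

# The same NVMe health fields as above, pulled via smartmontools.
# "/dev/disk0" and the JSON key names are assumptions based on smartctl 7.x.
raw = subprocess.run(
    ["smartctl", "-j", "-a", "/dev/disk0"],
    capture_output=True, text=True, check=True,
).stdout
health = json.loads(raw)["nvme_smart_health_information_log"]

# NVMe "data units" are 1000 x 512-byte blocks, i.e. 512,000 bytes each.
written_tb = health["data_units_written"] * 512_000 / 1e12
print(f"Percentage used: {health['percentage_used']}%")
print(f"Data written   : {written_tb:.2f} TB")
print(f"Power on hours : {health['power_on_hours']}")
```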
Uhh, that may be what Apple is reporting, but most SSDs only last for about 100-200TB of write, so I would take that percentage used with a grain of salt. They are probably counting the wear-leveling percentage used.
Assuming 200TBW, that means you have used about 3.3% of the life of your SSD; assuming 100TBW, you have used 6.75%. Let's say you got your M1 at release in November: that works out to roughly 1.68%/month on the pessimistic assumption and .84%/month on the optimistic one.
Assume it lasts 200TBW (which I don't think these drives will) and you would be on track for roughly an 8-year lifespan for the SSD; otherwise it's more like 4 years on the low end. That assumes the write volume stays the same (which isn't likely), i.e. that you keep using the same amount of swap or less.
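Here's that back-of-envelope laid out; the endurance ratings and the number of months of ownership are assumptions, not measured values:

```python
# Back-of-envelope SSD lifespan from the numbers above.
written_tb = 6.75         # from the SMART dump
months_owned = 4          # guessing: M1 launched Nov 2020
ratings_tbw = (100, 200)  # assumed endurance ratings, low and high

tb_per_month = written_tb / months_owned
for rating in ratings_tbw:
    years_left = (rating - written_tb) / tb_per_month / 12
    print(f"{rating} TBW rating -> roughly {years_left:.1f} more years "
          f"at the current write rate")
```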
Every year, I find that I use about 1-2GB more on average, so for me and my usage patterns I almost find the 16GB of memory in the MBPs to be a slap in the face. I have a 13" and a 16" MBP with 32GB and 64GB respectively, and I don't think I would find anything less acceptable. Right now, my typical memory usage is around 30-40GB when I run Docker, VS Code, Chrome, Slack, Zoom, etc. I am not even running any VMs (other than the Docker VM); I typically spin those up elsewhere.
For reference, this is my 16" daily MBP after nearly a year (4.84TB written), and I typically pull down lots of logs/data.
Available Spare: 100%
Available Spare Threshold: 99%
Percentage Used: 0%
Data Units Read: 4,406,559 [2.25 TB]
Data Units Written: 9,454,881 [4.84 TB]
Host Read Commands: 183,307,692
Host Write Commands: 406,821,556
> "most SSDs only last for about 100-200TB of write"
My 3.5-year-old MacBook Pro has 119 TB of writes and still claims 79% "available spare":
Available Spare: 79%
Available Spare Threshold: 2%
Percentage Used: 15%
Data Units Read: 256,111,725 [131 TB]
Data Units Written: 232,470,299 [119 TB]
Host Read Commands: 2,872,362,421
Host Write Commands: 1,547,764,735
That said, my 1-month-old M1 MacBook Air is already up to 7 TB of writes, so it seems to be writing at a significantly higher rate. Both are 8GB machines.
From what I have seen, the available spare value will decrease rapidly near end of life.
As flash cells start to fail they are replaced by spares. Because of wear leveling this is delayed for a while, but then, as each cell reaches its rated 300 or 3,000 program-erase cycles, they all start to fail at around the same time.
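A rough illustration of where those TBW figures come from, using the 300 / 3,000 program-erase cycle numbers above; the capacity and write amplification are assumptions for the sake of the example:

```python
# Where TBW figures come from: cells rated for N program-erase cycles give
# roughly capacity * N writes, reduced by write amplification. Capacity and
# write amplification here are assumptions for illustration.
capacity_tb = 0.512        # 512 GB drive
write_amplification = 2.0  # varies a lot with workload

for pe_cycles in (300, 3_000):
    endurance_tbw = capacity_tb * pe_cycles / write_amplification
    print(f"{pe_cycles:>5} P/E cycles -> roughly {endurance_tbw:.0f} TBW")
```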
If you care about keyboard longevity, I'd get the MacBook Air or wait for the next MacBook Pro refresh, when they bring the Air keyboard over to the Pro.
The MacBook Air keyboard is exactly the same as the Pro keyboard, except for the Touch Bar. Any problems with the Touch Bar are UX issues: I don't think anyone has questioned its longevity/reliability.
I got the Air, and when I was compiling Node.js to install it, the chassis got really hot. I read a lot about it barely ever getting warm, but I guess those reviews were a little exaggerated.
Wasn't the 32GB MacBook Pro initially infeasible (for battery life) because Intel kept recycling their memory controller, and it wasn't until the 8th-gen processors that LPDDR4 was officially supported [1]? It appears that Apple gave up trying to do LPDDR4 on the 15/16-inch [2] and released it with DDR4, while on the 13-inch they released a 32GB version using 3733MHz LPDDR4X [3].
It was Intel's fault that there was no low-power memory controller that would work with 32GB in a laptop.
There were definitely PC laptops available with 32GB of RAM, but the wired power consumption roughly doubles, from ~6 to ~11-12 watts [1] [2].
Reviews of PC laptops at the time (e.g. Skylake) with 32/64GB of RAM showed noticeably worse battery life, and it appears that in Apple's case it wasn't a tradeoff they were willing to make.
To me it seems as though Apple gave up and just put DDR4 into the laptop anyway, and also increased the battery from 76Wh in the 2016 model [3] to 83.6Wh in the 2018 [4], offsetting some of the battery life hit (though there may have been other hardware improvements that reclaimed some battery life as well).
As for the 13-inch MBP, the 2020 model was the first to include LPDDR4X, which allowed the bump to 32GB, but it is much more battery-constrained at 58Wh. [5]
These are not new and have been part of Apple’s ISA extensions for several years. For some reason marcan decided to blackbox them, but they’re fairly easy to find if you open the kernel in a disassembler so I’m not sure why he did.
Honestly the SSD is so fast that for everyday workloads you will not notice the lack of RAM even on an 8GB system (I am coming from a system with 32GB of RAM). But RAM is RAM, and using swap that aggressively can shorten the lifespan of the SSD very quickly.
Not true. I desperately wanted to keep my M1 MBA, but with Cities: Skylines I have enough mods and assets that I really do need 32GB of RAM. When you really need it, there is no substitute for RAM.
When I loaded up a new game with minimal assets (it easily fit within the 16GB of RAM I had), the MBA played C:S more smoothly than my i7/1080 Ti gaming machine. I think if I can get a laptop with at least 32GB of RAM I won't have to maintain a Windows gaming machine and will be able to consolidate everything onto an M-powered MacBook Pro.
Now if Apple would just produce a 5K external monitor, life would be perfect :)
> That's just too much for my old dog to wrap my mind around
My theory is that Apple has dedicated hardware for much faster and better memory compression and decompression. macOS has always had memory compression, but I think that has historically been done in software on the CPU.
You may also get some additional headroom from unified memory. (Not sure how it worked previously on macOS, so this is from a PC perspective.)
But yes, generally I agree: that's just too much for my old dog to wrap my mind around.
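You can at least see how much the compressor is doing on any Mac with vm_stat; a quick sketch (the exact field labels can vary between macOS versions, so treat them as assumptions):

```python
import re
import subprocess

# How much the macOS memory compressor is holding right now, via vm_stat.
# The exact label wording can differ between macOS versions.
out = subprocess.run(["vm_stat"], capture_output=True, text=True).stdout
page_size = int(re.search(r"page size of (\d+) bytes", out).group(1))

def pages(label: str) -> int:
    m = re.search(rf"{re.escape(label)}:\s+(\d+)", out)
    return int(m.group(1)) if m else 0

stored = pages("Pages stored in compressor")      # logical data compressed
occupied = pages("Pages occupied by compressor")  # physical RAM it costs
print(f"Data held by compressor : {stored * page_size / 2**30:.2f} GiB")
print(f"RAM used to hold it     : {occupied * page_size / 2**30:.2f} GiB")
```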
They say that because the SSD is fast and the performance hit from hitting swap isn't that big of a deal, without mentioning that the SSD is soldered to the motherboard and is a wear item. The fact that you can't swap drives is silly, almost like having a car with tires that can't be changed.
SSD is NOT as fast as RAM. If you really need more RAM, do yourself a favor and wait until they come out with a machine that has more RAM. I'm waiting, rather impatiently. It was hard to return my M1 MBA but I think it will be well worth it in the long run.
> "According to Apple's propaganda and the blogosphere, the M1 is so much more efficient that it doesn't 'need' the extra RAM."
The M1 Macs come with very fast SSDs. I think what’s really going on here is that, with write speeds exceeding 2GB/s, macOS can use NVM almost as if it is RAM with little performance penalty.
It works. With 8GB, my MacBook Air feels as if it has a lot more RAM than it does.
The downside is increased wear on the SSD. My 512GB SSD has accumulated 7 TB of writes in the few weeks I’ve owned it.
I'm not so sure it's so hardcore. The standard laptop that new engineers at LinkedIn get is the 16-inch MacBook Pro with a Core i9 and 32GB of RAM. I'm sure we're not the only ones issuing this spec, and it's not even the maxed-out 64GB spec.
For work, the $400 difference between 16GB and 32GB is pretty negligible over the course of 3 years. Consider the cost of each engineer: $400 over 3 years for even a tiny improvement in productivity is worth it.
And as soon as 32GB became available, those same hardcores were asking where the 64GB options were. I had workflows on desktops with only 32GB of RAM that would choke, but it was a Hackintosh, so I was able to bump up to 64GB and the issues went away. Which is why a 16GB max is so hard for me to wrap my head around.
Can you give some examples of workflows that need that much RAM? I stay away from Docker and VMs, and I really struggle to think of everyday situations where more than 16GB of RAM is going to be worth it for me.
I shoot a lot of timelapse RAW on a full-frame DSLR. I import my image sequences into After Effects at full image size in 16-bit color (the RAW is 14-bit native). Exporting those at 1:1 image size would choke on 32GB; upping to 64GB made the errors go away.
IME, if you are not doing video editing, it will be Docker and VMs. So it makes sense that devs would want the 32GB (standard at my work, too). When I got my own personal laptop, 16GB was fine, and I haven't had issues.
I'm pretty sure the MBP line won't actually "max out at 16GB"; they just haven't released the higher-specced MBP configurations yet. Think of it as "some configurations are experiencing shipping delays", where those "shipping delays" are ~1 year.
Intel and AMD both document their performance counters extensively (unlike Apple) so if you have Linux installed you should be good to go to do it yourself.
But I said I would get the first production-ready ARM laptop chip... it had to be an Apple, and here I am writing this on an M1 laptop now (to my credit, I am trying to minimize as much of the phoning home as I can via various means, although I understand I'm using complete spyware).
> On modern versions of macOS, you simply can’t power on your computer, launch a text editor or eBook reader, and write or read, without a log of your activity being transmitted and stored.
> It turns out that in the current version of the macOS, the OS sends to Apple a hash (unique identifier) of each and every program you run, when you run it. Lots of people didn’t realize this, because it’s silent and invisible and it fails instantly and gracefully when you’re offline, but today the server got really slow and it didn’t hit the fail-fast code path, and everyone’s apps failed to open if they were connected to the internet.
> Because it does this using the internet, the server sees your IP, of course, and knows what time the request came in. An IP address allows for coarse, city-level and ISP-level geolocation...
> Apple (or anyone else) can, of course, calculate these hashes for common programs: everything in the App Store, the Creative Cloud, Tor Browser, cracking or reverse engineering tools, whatever.
> This means that Apple knows when you’re at home. When you’re at work. What apps you open there, and how often. They know when you open Premiere over at a friend’s house on their Wi-Fi, and they know when you open Tor Browser in a hotel on a trip to another city.
"We do not use data from these checks to learn what individual users are launching or running on their devices.
Notarization checks if the app contains known malware using an encrypted connection that is resilient to server failures.
These security checks have never included the user’s Apple ID or the identity of their device. To further protect privacy, we have stopped logging IP addresses associated with Developer ID certificate checks, and we will ensure that any collected IP addresses are removed from logs.
In addition, over the next year we will introduce several changes to our security checks:
A new encrypted protocol for Developer ID certificate revocation checks
Strong protections against server failure
A new preference for users to opt out of these security protections"
I'm with Stallman on this one. Without access to the source code of their entire server stack, there really is no way to be sure of what they are or are not doing.
Even with it there is no way to be sure of what they are or are not doing. Trust in an external endpoint is not improved by some theoretical source code dump on GitHub.
This post is factually incorrect: it does not send a hash of the application (rather, an identifier that can be shared across apps), and the server doesn't save any logs.
But a content-hash would have given them a release ID (i.e. which specific version of the app bundle you're running.)
A series of {release hash, launch timestamp} events could be used to build a much more precise profile of your computing habits than just {app ID, launch timestamp} events would.
Also, you're ignoring the power-law: while yes, the majority of software exists in a long-tail of ISVs, most of the apps that people use are made by big corps that make a lot of apps each. 80% of the apps on any computer (Windows or macOS) are Microsoft or Apple or Adobe apps. When you're using any of those, all Apple gets is {Apple, timestamp} or {Google, timestamp} or {Adobe, timestamp}. That's... not very useful for profiling. Especially the first two. Safari and iTunes are both just "Apple" through this system. Are you working? Relaxing? Who knows?
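To make the distinction concrete: a content hash pins the exact build of an app, which is what the quoted post assumed was being sent. A small illustration (the path is just an example; point it at any binary):

```python
import hashlib

# A *content* hash pins the exact build of an app (what the quoted post
# assumed was being sent), unlike a developer-certificate identifier that's
# shared across everything a vendor ships. The path is just an example.
APP_BINARY = "/Applications/Safari.app/Contents/MacOS/Safari"

digest = hashlib.sha256()
with open(APP_BINARY, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        digest.update(chunk)

# This value changes with every release, so a log of {digest, timestamp}
# would reveal exactly which version you launched and when.
print(digest.hexdigest())
```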
Those are fair points. The original report was much more serious before Apple changed policies to make the reports encrypted in transit and stopped logging IPs.
Apple telling you they don't log IPs means nothing. Facebook did this and nobody batted an eye. [1]
It's also incredibly unlikely. I'm just trying to picture what this server that does no kind of user identification at all looks like. What value would it possibly offer? Just to count how many times an application is opened? How can any kind of analytic application function without some kind of user profiling mechanism and a place to store that data for analysis?
It’s not an analytic application; its purpose is to benefit users, not Apple, and Apple doesn’t actually have any profit incentive to collect user data from this service.
Facebook operates a free service that they use to collect data about you to sell for advertising purposes. Apple sells expensive personal electronics directly to consumers, and has made it a part of their core brand to be privacy-conscious. They’re certainly not perfect, and they’ve clearly made missteps along the way, but they’ve done more than virtually any other public company to further their customers’ privacy and demonstrably collect as little data as possible. When they have made mistakes, they’ve carefully explained what circumstances led to it and have generally gone above and beyond in ensuring that kind of mistake can’t happen again.
Your perspective is little different than the indefensible “both sides” mentality many people have toward politics. Apple is not Facebook, and there is a massive difference to anyone actually paying attention.
I, too, used to think incentives were good enough to generally guard against bad behavior like this. The problem is that incentives can change quickly, and unless data is explicitly (and with some guarantee) removed, there's always the chance for it to be accidentally exposed, nefariously exported, or repurposed as incentives change. The only safe amount of data to send out by default is what's essential to accomplish what you are trying to do.
Relying on Apple to do the right thing when they're sent a bunch of data which has some use to them, and to their users, if they keep it and run statistical analysis against it, is like relying on that handshake agreement to store some of your belongings in your kindly old neighbor's shed. Sure, you trust him, but he's not going to be around forever, and who's to say what will happen to it if someone takes over his property after he's gone. And if that kind neighbor had a habit of cleaning up the stuff your kids left in your yard by putting the items in that shed of his... well, it's nice that he allows your kids to get their stuff from there whenever they want, but still, that's just asking for problems down the line.
> Relying on Apple to do the right thing when they're sent a bunch of data which has some use to them, and to their users, if they keep it and run statistical analysis against it, is like relying on that handshake agreement to store some of your belongings in your kindly old neighbor's shed.
No, it's relying on this being disaligned with their profit incentives. They've made a selling point of their products being privacy-focused, and actions that go against that directly impact the profitability of these products.
There have been several cases where data was mistakenly collected. Nobody's perfect! And in every one of those cases, they've gone above and beyond in explaining what went wrong and how they'll prevent those situations from occurring in the future. In several cases, they've even published white papers pushing forward the current state of the art on preserving privacy while collecting the minimal data necessary for services to function.
Apple is not Google and Facebook. The latter two have direct profit incentive to maximize data collection and analysis of you, personally. Apple wants to sell you consumer devices, and—outside of specific counterexamples like Siri—collecting your data rarely aligns with those profit incentives.
There is some phoning home from applications, though, on top of the occasional Gatekeeper SNAFU. I like to rely on Little Snitch, and I feel nervous when using a computer without it.
Google, Adobe, and Microsoft tend to be quite bad. The other ones tend to be reasonable and just check for updates every now and then. It has flagged a couple of daemons, mostly related to iCloud, but nothing very suspicious after a quick investigation.
Hash checks for malware are not the same as spyware. If anything, it's the opposite.
One could argue that these hashes should be downloaded and checked against a local database, and I can't really argue against that; it sounds like a standard anti-virus MO, and I would prefer that.
You can block those connections if they bother you.
Wouldn't that be ripe for manipulation by an infected system, though? By keeping it central at the home office, they have control over a single location rather than trying to ensure every single system stays current.
They can be blocked, but not in a simple way for most users. If it were a preference synced with iCloud, or at the very least in System Preferences, I'd be okay with it, but I shouldn't have to fight with my OS. Plus, I had to unblock ocsp.apple.com at one point because it was causing an issue with something else.
If it's of interest, these performance events (and the whitelist for this API) are described by Apple at https://github.com/apple/darwin-xnu/blob/main/osfmk/arm64/kp...
(And, for my own measurements, I use a kernel module to bypass the whitelist, which is even more likely to blow up the computer, and definitely not recommended: https://github.com/dougallj/applecpu/tree/main/timer-hacks )