I'm not sure I understand your comment here: are you referring to the end of your MacBook's life, or the beginning? It's somewhat ambiguous.
Assuming you are referring to the end-of-life process:
Apple's kit sold as refurbished comes from their incoming returns: faulty, or otherwise unwanted, systems.
Everything sent to them as a trade-in isn't handled by Apple at all; it's contracted out to third-party companies. (This likely applies to their recycling programme too, but I've not looked into it.)
At least, these things certainly used to be the case, and there have been a whole bunch of articles online over the years that support this. I'd love to see evidence to the contrary if things have changed.
But as I said, I'm not sure I understand your comment, so maybe my points here are irrelevant.
I'm pretty sure your (repeated) comment here goes against HN guidelines — arguably: it's a shallow dismissal, it's snarky, it doesn't provoke thoughtful conversation, it supports an ideological battle, etc., etc.
Except, with Apple's new kit here, the RAM isn't simply some external chip soldered to the main board; it's actually on-die with the CPU silicon (along with everything else in that silicon: GPU, memory controllers, etc.).
So yes, arguably there are fewer parts (just one), but in the event of e.g. some bad RAM during manufacture, it's far more costly to throw out the chip containing that bad RAM.
No. It is not possible to make DRAM on the same silicon process as high-performance CPU logic. It is a myth that Apple Silicon includes the RAM on its die. Apple uses external LPDDR packages, just like everyone else, which you can clearly see in this photograph of the Mac mini's CPU module: https://valkyrie.cdn.ifixit.com/media/2021/01/28102657/m1_ch...
Those chips on the right side are LPDDR4x chips (which you can verify by googling the part numbers visible on them). They are "off-the-shelf" so to speak, not custom on-die memory.
The OP here is more or less a rewording, and an invocation, of the 'think of the children' argument. And because of this, personally, I don't really buy it — at least, not the additional emotional weight/baggage that such an argument tries to carry.
A lot of children's books can be viewed as, or compared with, toys: they're fun whilst also being educational in some way, with the main difference being the focus on words and language (plus images, and story, I guess).
But that specific focus aside: a lot of folk still think it's perfectly fine to give their kids toys that are basically plastic tat mass-produced in China (almost the equivalent of a content farm, perhaps?), as opposed to beautifully hand-crafted toys. Because, well, that's fine, really.
Similarly, if a parent or relative makes a toy for a child themselves, it's OK for some of those to be minimal- or low-effort, particularly if the end result is still good enough (or better) and, more importantly, still treasured by the child — even if only for a short period (e.g. a paper plane, boat, or crane).
I can recommend the Storz & Bickel 'Mighty' (by the same makers as the Volcano) — perhaps plus a pack of spare parts/accessories, so one can postpone cleaning of the cooling/top part of the unit.
Sure, it is expensive, as you say, but it's very good. And no matter how gunked up the top cooling part of the unit might get, it will still turn on and off. I don't find the maintenance of the unit to be a big deal (particularly with a spare top, etc.), but as always with such things, YMMV.
Just as a point of reference (I was a games dev in the past): a game/sim running at 60 frames per second means one has a budget of ~16ms per frame, a figure you may want to consider when benchmarking.
Obviously, games rarely have 500k objects that need collision checks — and logging can also be costly. (I didn't look specifically at where/when you were logging, but even one log statement in a regularly called function can significantly skew a benchmark.)
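To make that concrete, here's a minimal Go sketch (Go picked arbitrarily; the object count and per-object "work" are made up, not taken from your benchmark) that times one simulated frame against the 60fps budget, with and without a per-object log call:

```go
package main

import (
	"fmt"
	"io"
	"log"
	"time"
)

// updateObjects stands in for one frame's worth of work; the per-object
// "work" here is trivial and purely illustrative.
func updateObjects(n int, logEach bool) {
	for i := 0; i < n; i++ {
		_ = i * i // placeholder for real per-object work
		if logEach {
			log.Printf("updated object %d", i)
		}
	}
}

// timeFrame reports how one simulated frame compares to the ~16.7ms
// budget that a 60fps target implies.
func timeFrame(name string, n int, logEach bool) {
	const budget = time.Second / 60
	start := time.Now()
	updateObjects(n, logEach)
	elapsed := time.Since(start)
	fmt.Printf("%-14s %v (budget %v, within budget: %v)\n", name, elapsed, budget, elapsed <= budget)
}

func main() {
	// Discard log output so we measure only the cost of formatting the
	// log line, not terminal I/O.
	log.SetOutput(io.Discard)

	timeFrame("no logging:", 500_000, false)
	timeFrame("with logging:", 500_000, true)
}
```

Note that this discards the log output, so it only measures the formatting overhead; writing to a real terminal or file would make the logging pass worse still.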
Text message via one's phone number isn't the only way of doing 2FA...
Twitter also supports the use of an authenticator app to generate codes [0] (or a physical security key, which is likely far from practical for most folk).
I use an authenticator app for 2FA on a few different sites; I think the process is alright, and I don't mind the small amount of extra friction for the gain in security (whilst additionally not disclosing my phone number). I'm currently using Authy [1], recommended by a friend, and haven't really looked at others, but there are a few alternatives available (some of which are mentioned on the Twitter 2FA page here).
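For anyone wondering what those apps are actually doing: the codes are standard RFC 6238 TOTP values, which can be sketched in a few lines of Go using only the standard library. (This assumes the usual defaults of SHA-1, 30-second steps, and 6 digits; the base32 secret below is the common documentation example, not a real credential.)

```go
package main

import (
	"crypto/hmac"
	"crypto/sha1"
	"encoding/base32"
	"encoding/binary"
	"fmt"
	"time"
)

// totp computes an RFC 6238 time-based one-time password from a base32
// secret. Sketch only: real implementations also handle clock-skew
// windows and constant-time comparison on the server side.
func totp(base32Secret string, t time.Time) (string, error) {
	key, err := base32.StdEncoding.WithPadding(base32.NoPadding).DecodeString(base32Secret)
	if err != nil {
		return "", err
	}

	// The moving factor is the number of 30-second steps since the Unix epoch.
	var msg [8]byte
	binary.BigEndian.PutUint64(msg[:], uint64(t.Unix()/30))

	mac := hmac.New(sha1.New, key)
	mac.Write(msg[:])
	sum := mac.Sum(nil)

	// Dynamic truncation (RFC 4226 §5.3): take 4 bytes at an offset given
	// by the low nibble of the last byte, then keep 6 decimal digits.
	offset := sum[len(sum)-1] & 0x0f
	code := binary.BigEndian.Uint32(sum[offset:offset+4]) & 0x7fffffff
	return fmt.Sprintf("%06d", code%1_000_000), nil
}

func main() {
	// "JBSWY3DPEHPK3PXP" is the usual example secret, not a real credential.
	code, err := totp("JBSWY3DPEHPK3PXP", time.Now())
	if err != nil {
		panic(err)
	}
	fmt.Println("current code:", code)
}
```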
> the vast majority of software developers do not consider a strict conformance to the 10 OSI criteria as being necessary to apply the term "open source"
[citation needed]
My counter-claim, without citation, is that I actually believe (from experience) that the vast majority of 'open source' projects are in fact released under licenses that already comply with the 10 OSI criteria and are therefore OSI-approved. This is easy to verify by looking at the licenses of the majority of open source projects — or perhaps even just the most popular ones.
That would seem to go against your claim regarding 'most developers'.
But it's not actually a debate about 'most developers'; it's about the OSS projects out there, not individual devs, no?
From the headline alone, I guessed this was to do with pointers/references to values vs values themselves.
Yep, with values that take a lot of memory, it's faster to pass pointers/references around than it is to pass the values around, because there are fewer bytes to copy.
Of course, there is more to such a decision than just performance: if the code makes changes to the value that are not meant to be persisted, then one wants to be working with a copy of the value, not a pointer to it. So one should take care when simply switching some code from values to pointers-to-values.
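As a rough illustration, here's a small Go sketch showing the cost difference between passing a large struct by value and by pointer (the struct size and iteration count are made up; the go:noinline directives just stop the compiler from optimising the copies away):

```go
package main

import (
	"fmt"
	"time"
)

// Large is an illustrative value type that is expensive to copy.
type Large struct {
	data [1 << 16]byte // 64 KiB payload
}

// byValue receives its own copy: mutating v here would never be seen by
// the caller, but every call copies the whole 64 KiB.
//
//go:noinline
func byValue(v Large) byte { return v.data[0] }

// byPointer receives only a pointer: cheap to pass, but any mutation
// through it is visible to the caller.
//
//go:noinline
func byPointer(v *Large) byte { return v.data[0] }

func main() {
	var v Large
	var sink byte // accumulate results so the calls aren't optimised away
	const iters = 100_000

	start := time.Now()
	for i := 0; i < iters; i++ {
		sink += byValue(v) // copies 64 KiB per call
	}
	fmt.Println("by value:  ", time.Since(start))

	start = time.Now()
	for i := 0; i < iters; i++ {
		sink += byPointer(&v) // copies one pointer per call
	}
	fmt.Println("by pointer:", time.Since(start))

	fmt.Println("checksum (ignore):", sink)
}
```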
These are all things that coders with more experience of languages with such semantics know almost as second nature, ever since the first day they got caught out by them. But everyone is learning, to various degrees, and we all have to start somewhere (i.e. knowing little to nothing).