The author writes
"I don’t consider Windows good enough. Historically there have been to many ways to compromise a Windows-based computer, and new techniques keep showing up with alarming regularity."
And then later
"I am writing this on my newest computer, a Late 2008 Aluminum MacBook running OS X 10.11 El Capitan"
If you're going to knock Windows on a lack of security, at the very least do as much as you can to protect yourself on a Mac. Note the long list of security fixes in the latest version of MacOS https://support.apple.com/en-us/HT201222
I mean, this is front page for several hours now and it's embarrassingly low quality. The only reason it triggered a conversation is because it's highly inconsistent.
My point is that both are very vulnerable, precautions are what keeps you safe.
Did they really have viruses (which for OS X are few and far between, to the point of lore) or trojans?
Anyone remember when you could click Cancel in the login screen and get access to the desktop? Good times. :)
If you're getting infected all the time using Windows, you're doing something stupid. The operating system provides the vulnerabilities, but you might want to reassess how you're using your software.
Or, you know, just switch to an OS where you don't.
However (1) this isn't realistic since so many issues are found with bloatware installed by vendors which you don't get on a Mac and (2) the severity of the recent mass release of zero-days by Wikileaks has really skewed the landscape.
Microsoft needs to do more to rein in vendors, because if a brand-new Samsung laptop takes 7 restarts just to update, then something is off with the ecosystem.
If you're behind a firewall and you practice safe computing (adblockers, NoScript, click-to-play Flash and Java or just don't use them at all, don't visit sketchy sites, don't open email from unknown sources, etc.) you can remain just as safe on an older Mac as you can on Windows. Even a 2010 era Mac can still run Sierra and therefore get all the current security updates, just as Windows 10 does.
> Note the long list of security fixes in the latest version of MacOS https://support.apple.com/en-us/HT201222
The list you shared contains updates for much more than just macOS Sierra, there's Apple TV, iTunes for Windows 7, Apple Watch OS, and so on. Microsoft's list contains 876 entries, of which approximately 250 apply to Windows 10 x64 alone, so I'm not sure what point you're trying to make here.
If you are targeted by an individual or a funded organization, it doesn't matter which OS you use: there only needs to be one vuln and you are owned.
Therefore, it makes sense to use an OS that has a smaller marketshare. That keeps you safer than anything else.
But in the real world, Windows is good enough for a lot of stuff, specifically gaming and office work. Kept up to date, it will be secure enough.
I think talking about security superficially is not really valuable.
> if not more
You could say they're similarly vulnerable, but such emphatic phrasing just makes it sound like you (and the GP) have an agenda.
But the author also says he runs 10.9 in another partition, and claims to want to run 10.2, although looking closer that appears to be a typo.
Malware? No, not really. Windows spyware only really started ramping up in the early 2000s, by which point classic Mac OS was already on the way out. I'm not aware of any significant malware for the platform.
Geez, that's counterintuitive.
This lifecycle depended on having some way for those infected executables to get run by uninfected users. That was where personal software trading came into the picture.
Anyways, as more users started downloading software instead of trading it, that cycle broke. The big shareware sites like Info-Mac would only accept software from authors, not from users, and would run virus scans on anything they published anyway.
Of course, hilariously enough, he retired Disinfectant after the onslaught of Microsoft macro viruses/malware.
Well, those security fixes don't matter much. There are as many, if not more, for Windows, for one.
But even more so, what matters is the actual number of exploits in the wild. Sure, any OS can be exploited. But in reality, not all are, and Macs have traditionally had far fewer malware/virus issues (close to zero). Time and again, "OS X viruses" are shown to be just trojans (which you can avoid simply by practicing software download hygiene). And even at worst, they affect something in the low single digits (or less).
It's not a case of market share either. When Windows had 98% of the market, and Mac OS (pre-OS X) was dwindling to 1-2%, it still had more viruses than it has today with 5-10% of the US market.
My 2011 Macbook Pro is still perfectly usable for everything a normal person would want to do with a computer. So is my parents' 2012(?) Mac Mini. I just bumped the RAM up to 8GB on both of those computers and they are just fine for web browsing, word processing, and HD video playback.
I suspect that Apple's problem selling iPads is related to this -- my iPad Air 2 is never noticeably slow doing any task, and I know people with pre-Air iPads who are perfectly happy with them. These are people who have plenty of money to buy a new iPad and would do so if theirs felt slow; they just don't notice any drawbacks to using an older one.
In fact, it's not the computer at all that's making me upgrade - it's the fact that the battery is toast, but I can't get an OEM replacement. I've tried to buy a couple of knockoffs, but they've all been terrible.
If it gets the job done, buying a newer laptop is just vanity.
Just buy a dual-band 802.11n or 802.11ac USB WiFi adapter, you can buy them for less than $20. 
You won't get more than 20MByte/s over USB 2.0, but it should still be a massive improvement over 802.11g.
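A quick back-of-the-envelope check on that claim (the 802.11g figure below is a ballpark assumption, not a measurement):

```python
# Rough throughput comparison between 802.11g and a USB 2.0-capped adapter.
# All figures are ballpark assumptions for illustration.
G_REAL_MBIT = 20       # typical real-world 802.11g TCP throughput
USB2_CAP_MBYTE = 20    # practical USB 2.0 ceiling cited above

def headroom(cap_mbyte, baseline_mbit):
    """How many times faster the USB-capped link is than the baseline."""
    return cap_mbyte * 8 / baseline_mbit  # convert MByte/s to Mbit/s

print(headroom(USB2_CAP_MBYTE, G_REAL_MBIT))  # → 8.0
```

So even with the USB 2.0 bottleneck, you're looking at roughly 8x the real-world throughput of an 802.11g link.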
My notebook is also from 2012 and still going strong. I just deeply regret getting one with soldered-on RAM and no expansion slot. 4GB is still enough for productivity work, but games could use more.
This is actually a bad idea - the 2011 15/17" MacBook Pros have dying graphics chips, which Apple stopped repairing at the end of last year.
My employer still has "late 2012" MBPs with the nvidia-of-death cards, as I call them, and Apple hasn't replaced any of them. Even back in 2013.
This whole article is a disservice. You do not need a second machine to write. And even if I needed one, I'd use a machine that runs the latest Linux with all the security considerations of a modern system, and then I would just install my text editor and nothing else.
This is nothing but a gateway drug for people to join the Mac cult. Meh. Not even an expensive Mac is any good, let alone a free one. As Mac fans always remind us: you get what you pay for. And free + Mac tax = lousy machine.
And I hate Apple.
I built that machine from three broken PowerBooks in 2006 and used it daily until my ex-wife took it in the divorce back in 2010 (she hated Macs but she knew it would hurt me to take it). Otherwise I'd probably still be using it for basic tasks like distraction-free writing or classic Mac games.
Or maybe to avoid getting distracted by all the stuff available on newer computers (going to guess much of the internet is not easily accessible from Mac OS 9.2).
Or maybe just cuz it's cool.
(Significant minus for those stupid non-standard screws, though.)
I could easily see using one as my daily use machine if I was in a more mac-centric environment without feeling like I was missing anything at all.
Also his mention of Windows security is a little tired.
I think there'll be a minor trend of people keeping the same laptops they have, or buying older laptops to save money, now that improvements have stagnated. Processors are literally getting slower - I challenge you to find a laptop today that is faster than my X230T. Sure, manufacturers might claim a higher clock speed, but with modern horrible thermal management they can only sit there for brief amounts of time. The X230T can literally sit all day at 3.8GHz, and fairly quietly too.
Sure you can go 0.6GHz faster, with the Alienware 18 with a 4.4GHz processor, but that costs six thousand dollars, is as thick as six laptops stacked on top of each other, and is twenty two inches measured diagonally. Oh yeah, and people found that because of heat problems the processor wasn't able to reach the clock speeds claimed. This review  shows that it only got to 3.2GHz, which is substantially slower than my $250 laptop. Oh yeah, and that laptop easily goes over 100F on the outside of the case in normal use, so not only will your legs be crushed by the 12 pound 22" monstrosity they'll also be burned if you dare put it on your lap.
It's very, very rare for non-ultrabooks to throttle, and when it happens it's usually because some manufacturers intend it for whatever marketing reasons, not because heat is really building up.
You likely wouldn't benefit from a 7700 in a laptop in real life tasks but to suggest that all new laptops with high end processors throttle enough that your old one ends up performing better is an outlandish exaggeration.
Not to mention that the GHz race has been over for some time now and clockspeed isn't all that matters.
For even more proof, to dissipate power you need a lot of copper and some giant fans. The X230T has a substantial amount of its very thick (1.2") body dedicated to heat pipes, fans, and copper fins. A handful of things I work with are very closely related to heat (Peltier elements, batteries, and motors) so I often look at things with a thermal camera to see how heat is dissipated throughout the laptop. In the X230T I was very impressed: Lenovo's engineers did a fantastic job. The laptop is capable of a substantial amount of heat dissipation. Meanwhile in my previous, fancy, new Yoga Pro laptop with the fanciest processor you can put in it, the thermal management is crap. There's a tiny little fan maybe 5mm thick that sits in a strange place and doesn't seem too effective at anything except making noise. The same goes for every other recent laptop I've seen: most of them seem deathly afraid of making the laptop too heavy or too thick, so they sacrifice heat management.
For the equivalent Haswell processor, maximum TDP is 55 watts vs approximately 40 watts for mine. So your laptop, which is almost undoubtedly thinner and with worse thermal management, has to dissipate almost 50% more heat. And don't go thinking there might be some revolution in heat management that lets yours dissipate more heat in a similar space, because "copper plus fans" has been the formula for the past 30 years. (With the exception of water cooling and peltier elements, which are most certainly not in your laptop.)
Furthermore, if your processor is putting out 55 watts of heat, _more than_ 55 watts have to go in somewhere. So that's also draining your battery. Standard batteries are around 45ish watt hours in recent laptops, so assuming the rest of your laptop (screen, RAM, motherboard, WiFi, hard drive, etc. etc.) literally drains zero power, you'd have under 50 minutes of battery power at maximum load (45 Wh / 55 W ≈ 49 minutes). Does that sound like something that would freak out a consumer? Yep. So even if the laptop were thermally capable, this maximum power mode is always limited so the consumer doesn't freak out about getting a battery life measured in minutes.
Meanwhile, my Thinkpad, which has a 55 watt hour battery, lasts about an hour and change at maximum power. Did this freak out some people? Maybe. But 2012 Lenovo said "yep, we want to give our consumers maximum power, and if they drain their battery that's their problem." Personally I love it. If I want to limit performance I can just put it in energy saving mode. But most of the time I keep it at maximum, because I like it and in a world with good RAM and a nice SSD the CPU starts to become a bottleneck reasonably often.
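The battery arithmetic above is easy to sanity-check (the 45/55 Wh and 55 W figures come from the comments; the 15 W system overhead is a made-up allowance for illustration):

```python
# Back-of-the-envelope battery runtime at sustained load.
def runtime_minutes(battery_wh, draw_w):
    """Runtime in minutes for a given battery capacity and power draw."""
    return battery_wh / draw_w * 60

print(round(runtime_minutes(45, 55)))  # CPU TDP alone: ~49 minutes
print(round(runtime_minutes(45, 70)))  # plus a made-up 15 W for screen etc: ~39
print(round(runtime_minutes(55, 55)))  # the Thinkpad's 55 Wh pack: ~an hour
```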
In my opinion, computers should still have a boot-to-BASIC mode with a simple interface, VGA graphics, and passable sound. Something like PICO-8, but with a better editor with vi bindings. :-)
It's a good thing we have the raspberry pi.
Nothing stops you from creating a 'boot to basic' image from power-on. Whether it would get much traction is another thing. Expectations are rather higher now than they were in the 70's and early 80's.
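As a sketch of how little that takes on a modern Linux box (assuming systemd; the unit name, and using a Python REPL in place of BASIC, are my own choices):

```shell
# /etc/systemd/system/boot-to-repl.service  (hypothetical unit name)
# Replaces the tty1 login prompt with an interpreter, 80s-style.
[Unit]
Description=Drop straight into an interpreter on tty1
Conflicts=getty@tty1.service
After=getty.target

[Service]
ExecStart=/usr/bin/python3
StandardInput=tty
StandardOutput=tty
TTYPath=/dev/tty1
Restart=always

[Install]
WantedBy=multi-user.target
```

Then something like `systemctl disable getty@tty1; systemctl enable boot-to-repl` and reboot. Whether anyone beyond hobbyists would want it is, as you say, another thing.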
Even a mouse was a pretty rare peripheral in those days (and would set you back some serious money).
And is the Raspberry Pi comment a reference to RISC OS, with its ability to access BBC BASIC by pressing F12 and typing a quick command?
I remember the IIgs being a pretty sweet machine.
The 8-bit guy did a video about this a while ago: https://www.youtube.com/watch?v=7h4tepFbMso
Are there any hacks that can convince macOS (or the older versions of OS X described in this article) not to treat the display as HiDPI? Yeah, I realize the machine will abruptly feel like it needs a magnifying glass to use, but in a pinch (laptop on lap <2ft from eyes) it might work for some (insert standard disclaimers here about eyes being non-replaceable and needing to last the distance).
The late-2015 21″ iMac is ~$1.5k+, and "has a multi-core Geekbench score of 5623."
Then the late-2011 17″ MacBook Pro which is ~$1.3k checks in with a "9240 Geekbench score".
Is there some datapoint I'm missing here?
macOS already supports several scaled resolutions higher than the standard half-native that Retina uses by default (half the native pixels in each dimension, with everything rendered at 2x).
IIRC, the "default" resolution on newer Retina MBPs with the Touch Bar is already higher than half-native (which used to be the default on Retina laptops).
There are also apps like: https://www.thnkdev.com/QuickRes/ and http://www.madrau.com/ for more flexibility and full-native resolution even.
That said, the full native retina resolution on something like a 15" screen doesn't make any sense to me except for some special circumstances (maybe 4k movie viewing, but doesn't that already use the full resolution?).
>The late-2015 21″ iMac is ~$1.5k+, and "has a multi-core Geekbench score of 5623." Then the late-2011 17″ MacBook Pro which is ~$1.3k checks in with a "9240 Geekbench score". Is there some datapoint I'm missing here?
Yes: one is a Geekbench 3 score, the other a Geekbench 4 score. Scores from versions 3 and 4 of the Geekbench suite are not comparable.
Ah, that's what I was missing. Thanks.
You can change the display scaling directly in system preferences, I can crank mine all the way up to 1920 x 1200 scale (Late 2016 15"). Beyond that, there's QuickRes: https://www.thnkdev.com/QuickRes/
(RDM has never had an official home, I don't think, and I don't remember where I got my copy from. Think it was a link from reddit. So I can't comment about this particular version, which I found via Google just now.)
SwitchResX and others allow you to use your Retina Mac at its native resolution
I think I'm trying to run too much locally, so these days I am still getting a lot done using cloud VM's.
Huh? Out of the box resolution is 1440x900 and maxes out at 1680x1050 (without using a 3rd party application like SwitchResX). Apple used to set the default to 1280x800 but this is no longer the case.
If it can do vim and owncloud I'd be golden for text.
I do however work with media, so no, low end is not good at all.
We detached this comment from https://news.ycombinator.com/item?id=14535697 and marked it off-topic.