Back on Linux (after one year of Apple and OS X) (dywypi.org)
593 points by akheron on Feb 13, 2012 | 321 comments



What an uninformed rant.

When someone rants about memory usage, it is usually a sign he doesn't know what he is talking about. On virtual memory systems with on-demand paging that use shared libraries and where all file system I/O is mmap(2) based, memory is managed in a very different way than most people expect. It's understandable: most people don't know, and don't have to know, what virtual memory is, even if they have a superficial understanding of swapping. Most people, even most technical people, don't know about the implications of shared libraries for memory measurement.

The users are presented with data they don't understand. Everybody talks about things like "this app is using 300MB of RAM", when such statements don't make much sense in the modern world. The way file systems, file system caches, virtual memory, and shared libraries in the context of virtual memory interact is architecturally identical on all major operating systems today, including Windows, Mac OS X, Linux, Solaris, and the BSDs. There are various differences in implementation making each system optimized for particular workloads, but understanding the differences between the systems is out of reach of most people who complain on their blogs, and those differences only matter for extreme workloads anyway. It's funny how much one can advocate for something when all the alternatives are the same.
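A quick way to see why per-app numbers mislead: sum the reported RSS across all processes and the total routinely exceeds the RAM physically installed, because every shared page gets counted once per process that maps it. A rough sketch (works on OS X and Linux alike):

    $ ps axo rss | awk 'NR>1 {sum+=$1} END {printf "%.0f MB\n", sum/1024}'
    # RSS is reported in KB; the total will happily exceed your installed RAM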

But all memory management rants are nothing compared to mentioning Mac OS X's repair disk permissions feature. Of course, this feature doesn't magically repair anything, but it's sold as a panacea. I read the first paragraph about memory management and decided to give the post one more chance, but then repair permissions was mentioned as a solution. Sorry, this is not HN worthy.


Okay, I really think you know much more about memory management than the writer of the original post. The question for me: using Chrome with a few tabs (8, for example) and Dropbox besides the default applications, I face huge slowdowns (the system becomes unresponsive for a few minutes), and the same happens with Safari. For a system that was described to me as "it just works", this really sucks. This with 4GB of RAM in my MacBook Pro.

I face no such problem with Linux or FreeBSD on the same hardware.


Open Activity Monitor and check which applications use a lot of memory (look at the "Real Mem" column).

If your computer frequently freezes, the problem is mostly a specific application (VirtualBox comes to mind), not Mac OS X itself. 4GB is more than enough for casual browsing.


Activity Monitor is too blunt a tool for these kinds of scenarios. It pains me: Mac OS X has DTrace, which makes it a breeze to find out what is really happening, yet the only GUI tools built on top of DTrace are profilers for programmers, nothing for the casual user.

Activity Monitor presents data that isn't really useful, and does so in an intrusive manner and a confusing display.

Mac OS X already comes with parts of the DTrace toolkit, and you can install whatever you're missing: http://www.brendangregg.com/dtrace.html#DTraceToolkit. These tools allow a very deep understanding of performance problems.
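For a taste without installing anything, a classic DTrace one-liner (needs root) counts system calls per process, which already says more about what a sluggish machine is doing than any memory column:

    $ sudo dtrace -n 'syscall:::entry { @[execname] = count(); }'
    # let it run during a slowdown, then Ctrl-C to print the per-process totals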


If your problem is that you are running out of RAM, Activity Monitor will tell you precisely which app is the culprit. Nothing "blunt" there.

If you want to know why a specific app is frozen (beachballing), you can click the "Sample Process" button in Activity Monitor to perform a time profile.

Yes, if you want more details, you have to use Instruments or write your own DTrace scripts. But I doubt that even a time profile is useful to the "casual user".


Sorry, but no, you're just illustrating the problem I mentioned. In a virtual memory system with on-demand paging, memory-mapped I/O, shared libraries, and copy-on-write pages, even measuring and interpreting memory stats is very difficult and subtle.

The "real memory" column is resident set size, a completely useless metric for the problem at hand because of many reasons. One reason is that much of the physical pages can be shared, indeed most of them usually are, on my system a chrome process has a 120MB RSS, but after closer inspection 110MB is shared, and after even closer inspection 90MB is shared with non-chrome processes. Closing the process with top RSS usage might do very little for decreasing memory pressure. Another reason is that physical memory usage is very misleading, if the system is swapping, a process has less resident physical pages than the virtual pages it uses, that's the reason the system is swapping in the first place! A process can thrash memory and have a relatively small RSS.

You also simply assume what the problem is without actually testing for it. You need to look at why the system spends time in kernel mode. Maybe it's not swapping; most likely it's not swapping in this particular case. It's more likely to be the random I/O caused by on-demand paging of memory-mapped things, or something more subtle, like copy-on-write pages being written to. It could also be a million other things.
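Checking whether the machine is actually swapping takes ten seconds, e.g.:

    $ vm_stat 1
    # one line per second; sustained nonzero pageouts mean real swapping,
    # while pageins alone may just be memory-mapped files being faulted in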

Even if the problem is caused by memory pressure, memory pressure is a remarkably generic term: the VM system has many components, and different components are affected by different workloads. A simple metric like RSS can't tell you much.

Yes, Activity Monitor is a very blunt tool.


I do understand how memory works on modern systems, and I agree that Activity Monitor can still be used as a blunt tool.

My system (MBP, 4 GB of RAM) works beautifully most of the time, but I noticed that sometimes when I went to a new tab in Chrome, there was a multi-second delay. I opened up Activity Monitor, went to a new tab, and noticed that my disk activity had spiked. I figured that the memory system was swapping in/out a lot of pages, so I looked at rough memory usage, and saw that the Shockwave Flash plugin had close to 1 GB in RSS. I then realized that using YouTube as a music player is probably not a good idea - Flash was designed to run as the main thing you're doing, not in the background while you're doing other things. I killed the tab, and I had no more problems.

Blunt tools can still be useful. The problem isn't the tool, it's the knowledge people have when they use it. For example, I knew that RSS overcounts, but I also suspected that the Flash plugin wasn't sharing enough to make a big dent in 1 GB. And it turns out I was right.

You wrote an excellent explanation above about modern memory systems, I find it strange that you're harping on this particular point so much.


Yep, I checked my Activity Monitor and the Flash plugin was also the culprit, using over 890MB of RAM. I'm running Firefox with multiple tabs (5), Chrome with multiple tabs (7), Pages, X-Lite, Adium, Mail and Thunderbird on 4GB of RAM with no problems.


You are missing the point. The idea is not to determine exactly what's going on, because that doesn't matter. You mentioned the casual user before, but all these details are only important to specialists trying to debug an issue with a specific application.

When you run out of RAM, it is completely irrelevant which application is thrashing, because the problem is that you ran out of RAM. It doesn't matter whether the reason is an app accessing a memory-mapped file or modifying a copy-on-write page, when the underlying reason is that you ran out of RAM.

Activity Monitor is perfectly suitable for finding out why you ran out of RAM. Oh, Mathematica is using 2GB of RAM? Maybe I'll try closing that. It doesn't matter if "Real Mem" actually counts some memory twice; it is still a useful measure.

EDIT: Yes, I assumed that the problem is memory pressure. Since this is a common reason for "The whole system becomes sluggish", it seemed reasonable to start testing for this.


Fantastic argument: the user is having a problem, but it doesn't matter what's going on; let the user try arbitrary things (which I already explained are folklore, and why they make no sense), and maybe something works.

If that's what you advocate, let's stop this discussion here, as there's no common ground.

When you're having a problem, first you try to understand it, in order to solve the root cause. Applying rules of thumb like "kill the top RSS process" is as sensible as rules of thumb about running repair permissions, sizing paging files, or hoping arbitrary herbs cure arbitrary diseases.

Activity Monitor is useless because it's impossible to assess how a specific action will affect the system. Users should understand what's going on when they kill a process, and the tools should help them do so. When people do something, they should understand it, even casual users. Activity Monitor exposes data that's not understood by most users, although it leaves the impression that it is.

Just for trivia: memory pressure hasn't been the primary reason for "the whole system becomes sluggish" for a few years already.


1) Nowhere have I suggested to "kill the top RSS process".

2) If you start Activity Monitor, you immediately see whether memory pressure could be the problem (well, after 30 seconds, because it takes time to start Activity Monitor if you ran out of memory).

3) If you ran out of memory, looking at which apps use much RAM is useful. And it's not as unpredictable as you make it seem. Quitting an app, you'll free at least the private memory, you might free some of the shared memory, and other processes (e.g. the window server) might also free some more memory as a consequence. This might not be a precise prediction, but it's not quite "impossible to assess".

I do not know how prevalent memory pressure is for other users; it's been the primary reason for a "sluggish computer" for me. If you know more about this topic, I'd be thrilled to hear other possible explanations beyond "it's more complicated than that".


Your argument basically calls for casual users to stop being casual and become experts. Activity Monitor certainly does help casual users understand what's going on. For a casual user using a system normally (a browser open, checking mail, writing in a word processor), seeing that application X is using the most memory means that's the problem, and to them it's correct to assume they should kill it. It may be hit or miss, but that's all they know, and in most cases things get fixed that way.

As professionals we often forget what it is to be a casual user. Asking a casual user to learn what you explained as folklore is simply too much to ask. Everyone who uses a computer should be technically literate to a degree, but that means understanding the basics. To a casual user the basics are: my computer has a processor that executes tasks, it has RAM that stores data for quick retrieval, and a hard disk for long-term storage. Each application uses a percentage of my finite RAM, and when it runs out my system slows down. Therefore logic dictates that if I kill the app taking up the most RAM, my computer will go faster.

That's all they usually know. We understand that Activity Monitor lies to us and that killing random processes is voodoo, but we also have to take into account how we use our machines. The casual user will solve their problem by killing processes more often than people like us will, because of the way they use their systems; plus there is a placebo effect for them. When they kill a process, they often feel like the system just got faster regardless of whether it really did.

I liken it to driving a car. Ask some random person about fuel economy. Their thinking is "high-octane fuel has more energy per gallon, therefore if I use it I'll get better fuel economy". They might even know the relationship between tire inflation and fuel economy too, if you're lucky. Ask a professional driver about those things and they'll look at the average person like they're crazy. They know how octane, oil, air filtration, shocks, struts, aerodynamics, etc., all contribute to better fuel economy. "If only the average driver knew what I knew, then they'd save a ton on gas", they'd think. But alas, that's too much to expect, so we just have to make sure they get the basics, and it's up to the professionals to provide the average person with something that just works, doing our best to stay one step ahead of users by anticipating their usage patterns. This applies to hardware/software engineering, car manufacturing, and anything else. You just can't expect the user to learn, or even take an interest in, a quarter of what we know.


Well, I know the basics of SystemTap, but I'm not very familiar with low-level stuff. I'll look into DTrace to see what's happening. Thanks for the information.


SystemTap is a Linux thing, which you can use on Linux to see how this works, but it won't be a pleasant experience.

DTrace is also available on FreeBSD, so if you look at what happens on Mac OS X, it might be helpful or insightful to also look at what happens on FreeBSD.


Activity Monitor is way older than DTrace and I don't think Apple has made any code changes to it in a long time.


The casual user doesn't know what any of the words in your post means.


The private memory column is likely a better measure when looking for memory hogs.


I've never had this happen, and I run Mac OS X on a MacBook Pro 5,5 with 4 GB of RAM (Core 2 Duo at 2.33 GHz). This laptop is from mid-2009.

My default set of apps that I run:

1. Safari
2. Mail.app
3. Terminal.app
4. Adium
5. Twitter
6. iCal
7. X11
8. MacVim
9. GitX
10. Activity Monitor

I have no slowdowns, and I regularly compile the entire company's code-base on this little machine. Sure, it isn't fast compared to some of the newer monsters out there, but I have had no issues with slowdowns whatsoever.

That being said, I don't use Chrome, and I don't have Dropbox on my work laptop.

I keep hearing about people having really bad slowdowns and I just don't understand it. Mac OS X has never been anything but "just works" for me.

11:29 up 69 days, 17:34, 11 users, load averages: 1.38 1.24 1.18


Aside from that, OS X works fine for me, although I don't use it for serious work, which is programming in C++, Python and R (I work as a statistician). I will upgrade the memory to 8GB and see if things get better for me.


I can't directly answer the question, but I get the feeling most memory issues have to do with an individual's use of their system, and not the system itself. I have a 2009 iMac and a little netbook running Xubuntu. The iMac has 4GB of RAM and the netbook just a single gig of RAM and not even a 2GHz processor. The netbook doesn't slow down on me, while the Mac is prone to freeze-ups. I could say "Linux is better than my Mac because it doesn't slow down", but that's not fair. They each have their own pros and cons, and I don't see one as being better than the other; it's all a matter of preference.

When the slowdowns bother me I look at usage. On the Mac I always have 4 spaces open (the Dashboard is not set as a space on my Mac), and at all times I have the following apps open: Mail.app, Terminal.app, Chrome w/ >=5 tabs at a time, MAMP, CodeKit, iTunes, and Sublime Text. Then Photoshop is open a lot, in addition to a lot of others that get opened at times. On the netbook I've got a terminal session, Chrome, and Sublime Text. That's it. It's no wonder the Mac slows down. So I'd say look at how you use the thing. No machine has unlimited performance, and when it comes to memory usage it's a lot like money: the more you have, the more you tend to spend, and you never seem to have enough.


I thought about that some time ago, but I use the laptop mainly to browse the web, never more than 10 tabs, and to log in via ssh to a Linux desktop. Maybe it is slow because of that, but the main problem I have is that I get no slowdowns running Linux and FreeBSD on the same computer with the same use case.

This computer stays up for several days[1], and I suspect the main cause is Chrome leaking memory.

[1] Now: 20:16 up 43 days, 23:41, 2 users, load averages: 0.38 0.31 0.26


The parent told you he was using Linux and the Mac the same way.


But the "inactive" memory on OS X is a real problem.

If I malloc 7GB with 6GB free and 800MB of inactive memory on my MBP, the system starts paging before it deallocates the inactive memory.

If I ctrl-C the application, the active and inactive memory suddenly go down drastically, with inactive dropping to something like 100MB, so it obviously could have been given to the process wanting it before paging started.
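For anyone who wants to reproduce this kind of test, a rough sketch (sizes illustrative; bytearray zero-fills, so the pages really are touched):

    $ vm_stat 1 &                              # watch paging activity in the background
    $ python -c 'x = bytearray(7 * 1024**3)'   # allocate and touch ~7GB, Ctrl-C to release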


+1. It is interesting that there are no complaints that "system A is slower than B". I don't understand why people bother with VM stats if there are no performance issues.


I agree with this; what is left is one guy's opinion, which makes it to the Hacker News front page. Next up: vim vs emacs.


another uninformed rant (this is also an uninformed rant)


Last July I bought one of the new MacBook Airs. The price for the hardware you got was unbeatable, and that's before considering build quality! While previous Airs were anemic, tarted-up netbooks, the 2011 Airs were (and are) powerful enough for everyday use.

There was just one problem with the Air, however: horrible Linux support. I'd been using Kubuntu happily for years. After looking at all the ugly, dirty hacks people were using to get their Airs running, I decided to give OSX a trial for 6 months or so, long enough for Linux support to mature. I hadn't used OSX since the early 2000s, and it looked like the OS had come a long way. To be precise, what really changed my mind was how far the OSX modding community had come. Despite being hated and loathed by Apple, they had managed to fill in some of the gaping holes in core functionality that Apple philosophy forbade, such as a way to remap keys (in all applications, not just some). I could finally remap the Apple key to something that didn't break my touch-typing habits on all the other OSes I use daily!

6 months later, I'm ready to jump ship. I like OSX Lion's touchpad gestures, but beyond that I'll miss little else about the OS. OSX isn't bad, mind you, but it's infuriatingly difficult to modify when it does something you don't like. It's buggy. It's actually pretty dated and ugly looking now too. OSX's virtual desktop management has absolutely nothing on KDE's.

Unfortunately, just as the next version of Kubuntu was starting to look like a good one for the Air, Canonical announced that they are ceasing paid development of Kubuntu. My favorite KDE distro is now officially on deathwatch. Maybe it will live on with community support, like any other distro has to, or maybe it'll fall by the wayside. I appreciate what Canonical is trying to do with Unity, but it's not for me. I'd long felt that Kubuntu, despite its many virtues, was being treated like a red-headed stepchild. This tears it. I haven't decided which distro I'm going to yet, but it will be one that puts KDE first, and that rules out anything Ubuntu.


Reports of Kubuntu's death may well be greatly exaggerated. Ceasing paid development means pulling the funding of one developer, a developer who was previously absent for a release cycle when he was on the Bazaar rotation instead, and Kubuntu still managed to make a release. At the same time, other Kubuntu developers have pointed out how this change will actually make it easier for the community to put Kubuntu together in some ways, because moving their packages from main to universe removes the need to get Canonical's sign-off for various things.


Kubuntu isn't on deathwatch: there was one person being paid to work on it, and the rest was community support. He didn't even work on the last version (he was off working on Bazaar or something). If Kubuntu does die, it will be at least three releases out.


I don't find that encouraging. 4.8 was the first update that badly broke my system in a long time... since the original KDE 4 upgrade, IIRC.


Canonical recently stopped funding Kubuntu: http://www.geek.com/articles/chips/canonical-to-stop-funding...


If you read the article you posted, you'll know "stopped funding Kubuntu" means Canonical "has paid for developer Jonathan Riddell to spend the last seven years working on Kubuntu" and will ask him to do something else after 12.04. He already hadn't been fully dedicated to Kubuntu for a year or so.


I wouldn't mind the official end-of-life of Linux software.

I used to be a Window Maker user (I always hated KDE and did not like GNOME's sluggish performance). Window Maker is nowadays not even on the DVD distributions, but it still works fine, although I think nobody has really touched the code in a long time.

That is one of the nice things about Linux compared to OS X. On Linux, dead software stays alive for a long time, and you get it via your favorite package manager. On OS X, dead software is really, finally, absolutely dead.


I wouldn't say the Linux support was horrible. It's a bit of a pain to get it installed to begin with, but that seemed to be an issue with how locked down Macs are...

Once I had Ubuntu running, everything worked out of the box. The main issue was terrible trackpad support -- not that it didn't function, but that it didn't feel even close to right. And a few more complex gestures, like tapping with one finger and then dragging with a second, didn't work.


I've read that openSUSE is one of the best KDE distros around.


Personally, I'm fond of Fedora's KDE offering as I explained the other day: http://news.ycombinator.com/item?id=3561927

Again, not that there is any reason to leave Kubuntu if you're presently happy with what it offers you.


I've been using Fedora with KDE as my primary system for a year now. After switching from F14 to F16 two months ago, I decided to leave KDE and try GNOME 3 again (or perhaps Unity on Ubuntu). Why? Because KDE sucks. OK, I should explain. KDE is great. So much better than GNOME 2, which people love for some very strange reason (but I guess some /same?:)/ people loved Win95). I love its customizability. I like its unified feel across the apps. But the out-of-the-box feel is really bad. It looks bad, it works badly. I would say it's Fedora's problem, as KDE really is capable... but still. Also, the apps are painful sometimes. But that's no different from the GTK "suite", from what I've tried.

PS: It's my personal opinion, of course. Maybe I'll switch back to KDE right after I try Unity.


KDE on Fedora is my primary environment and has been for years, even through the rough times when KDE 4 came out. Ugly is subjective, but I'm not sure what you mean by "works badly", unless you just don't like the design choices. My F16 machines have been rock solid and I've had no major problems with KDE.


openSUSE is the best KDE distro around, not least because the vast majority of SUSE developers use KDE. Dogfooding really works in this case.


My first distro was openSUSE 10.1, before KDE 4, in 2006. The system has been upgraded to a new machine, now headless, and now lives on openSUSE Factory. Even running through VNC, KDE is still decent to interact with. It spoiled me for all other implementations of KDE. openSUSE has a lot of other tidbits going for it too, YaST2 being a huge one!


All Linux distros I have seen look tacky and gaudy compared to OSX. I guess that's just your style, though.


I realize you're doomed to be downvoted into oblivion, but I have to agree. I had a tough time following someone whose logic is "OS X is pretty dated and ugly looking... it's back to KDE for me".

There are certainly fair points to be made about the inability to customize certain aspects of the window manager (coming from a long-time Linux user's perspective, it was one thing I immediately missed), but I'm really not sure "ugly design" is one of the criticisms I could make.


Why can't you just take a standard Ubuntu and install the KDE packages? As far as I understand, Kubuntu is exactly the same as vanilla Ubuntu, just with different default packages.


This is true, but realize that the Kubuntu team maintains those packages, so "taking a standard Ubuntu and installing the KDE packages" does nothing to remove one's dependence on them. For why this still is nothing to panic about immediately, see my earlier reply to the parent.


As long as the upstream Debian guys keep maintaining them, you should be able to do KDE pretty well still, right?


This is what I do with Fedora. I started using Red Hat in 2000 or so, and it's been my OS of choice since. People often assume I'm a GNOME user, but I jumped to KDE in the first few months and never switched back. I do install both, though, as disk space is cheap and I like having the ability to run some GNOME applications.


I don't even know where to start with the author's complaints about the memory model. Inactive memory is not "memory from a recently used app". Inactive memory is just like active memory; it is not unused memory. It is memory that has not been accessed recently and will therefore be swapped to disk first. Just like active memory, it can be freed immediately if it has not been modified (e.g. a memory-mapped file, or a local copy of some shared memory), but if it has been modified, it must be paged to disk, because otherwise data would be lost.

All the purge command and "Repair Permissions" do is swap out lots of memory. But this is not unused memory; it will likely be paged in again sooner or later. It does not help much if you simply have too little memory. That's why the author claims he has to run the commands again and again. The problem is not a broken memory management system. The author simply has too little RAM.
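You can watch this happen yourself (purge ships with the developer tools on this era of OS X):

    $ vm_stat | grep -E 'free|inactive'   # note the two counts
    $ purge                               # forces the file cache to be flushed
    $ vm_stat | grep -E 'free|inactive'   # "free" jumps, "inactive" shrinks, nothing is actually gained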

Which leaves the question: Why does Mac OS need more RAM than Linux?

Well, there's the simple fact that there is much more stuff running in the background on Mac OS. You have automatic indexing of every file on your hard drive, and file system monitoring: try downloading an app that opens some file type, and the moment it is unzipped, the Finder automatically uses it to open supported files.

Then you have an automatic version control and backup system running in the background for every single file on your hard disk (Time Machine). Additionally, many Mac apps aggressively cache data in memory. Take iTunes: you can scroll lag-free through music libraries with tens of thousands of songs and their album artwork, and filter them instantly.

These things use up RAM. Lots of it. And that's why you can't run 3 VMs on your Mac if you only have 4GB.

If you don't want the extra features / bloat of Mac OS, please go ahead and use Linux. But don't write a completely ignorant piece about how you think the memory management model in Mac OS is broken without reading the real docs: https://developer.apple.com/library/mac/documentation/Perfor...


After two decades of Unix use, in 2000 I switched to Apple and OSX. The only reason I considered it was the design of the Titanium PowerBook, which I saw in those days as an amazing piece of hardware design and which, amazingly, gave me a Unix workstation in a fantastic portable package. I simply couldn't believe that Apple, of all people, were delivering what I'd wanted for years: a smart, functional, fully working Unix workstation in a portable format.

So I was very happy for a year or two, and am now on my 4th MacBook Pro. The MacBook Pro (17") has come to represent everything that I desired in the '80s and '90s in a Unix workstation.

But: only because I'm running Linux on it. Mac OSX, sure, has its time and place, but when it comes to putting the power of this amazing bit of hardware to good use, nothing beats having a proper Linux distribution onboard. Proper memory management, a proper user security model, proper levels of abstraction between a user program and a system service, and so on. It's simply an amazing bit of gear, now that I've set it up right.

Oh, though .. how I wish SGI had gone down a different path and released an Indy laptop to great fanfare. How I wish they hadn't been usurped by Microsoft; it would be so, so nice to have an SGI laptop in the 21st century ..


This is pretty similar to my story except I was using Macs before OS X came out. I grew up using Macs (my first computer was an SE/30) but I started playing around with Linux around the end of the 90s. When Apple released OS X I was really excited and thought I'd be able to have the best of both systems but I'm really uninterested in OS X at this point. I'm currently using a Santa Rosa MacBook Pro from 2007 and despite having absolutely wretched touchpad support in Linux (the cursor doesn't like to move in diagonals...) I'm much happier running it than I was with Mac OS.


How is the power management? I run a linux (ubuntu) thinkpad for work, and while I find the OS pleasant to use, I have found the battery life dire compared to windows-using colleagues.


I run Ubuntu 11.10 on a ThinkPad and I find that battery life is on a par with Windows.

You might want to investigate the ASPM power regression issue; pcie_aspm=force might work for you (or wait for Ubuntu 12.04, which ships the 3.2 kernel containing the fix).
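If you want to try it before 12.04, the usual place is the kernel command line; on Ubuntu that's roughly (assuming the stock GRUB 2 setup):

    # in /etc/default/grub:
    GRUB_CMDLINE_LINUX_DEFAULT="quiet splash pcie_aspm=force"

    $ sudo update-grub    # then reboot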


Thanks, I'll give it a try.


My understanding is that much of this is due to regressions in the kernel. Some of the more recent 3.2 kernels are supposed to fix it.


lesswatts.org has a bunch of things you can do to improve Linux power usage.


What distro do you run?


Ubuntu Studio. It's taken quite a beating, but works absolutely wonderfully with my studio equipment.

The audio experience with this setup is better than that of Mac OSX, but of course I had to choose my hardware well, and administer a good chunk of it myself before it got that way (PreSonus FireWire-based audio I/O, complete removal of PulseAudio, JACK+FFADO configuration). Nothing beats being able to easily install, modify, and compile the sources of every bit of useful software you're using, especially things like audio effects/synthesis plugins, and so on. Need to tweak a filter? Easy: install sources, modify, re-package, install new version. You can't do any of that on Mac OSX nearly as smoothly as on Linux.


How is the power management of Ubuntu on the Macbook and does close-lid-to-sleep work flawlessly?

Those are the two non-starters for me with linux on a laptop. OSX just does an awesome job with both of these.


Works for me on Debian. Battery life is somewhat shorter, but not to the point that it bothers me.

That said, it took a few kernel iterations before every last bit of hardware was fully supported.


Works for me without any hassle whatsoever.


Did you find a way to get a second mouse button? I saw several howtos that said that a keyboard key could be mapped to it but none worked for me last time I tried (four years ago?)


I used to use Ubuntu on a Macbook, and it was pretty easy to map two-finger-tap to right-click and three-finger-tap to middle-click. I guess if you're the kind of person that hates tap-to-click and prefers hardware buttons, that's not so useful, but I had no problems using it.


I've had a few Apple machines over the years (mainly as test machines and toys): two MacBooks, one MacBook Pro, an iMac and a couple of Mac Minis. I have nothing now other than a cheap second-hand desktop PC (and some cash in pocket!).

I agree about the build quality: they are good quality, as in good materials and good fit. However, with my EE cap on, the designs themselves are bad and quite dangerous. When there are that many LiPoly cells sitting inside a chassis, you want to be able to isolate the power. One bit of water in it and it's effectively an incendiary device. I've seen one recent MacBook Pro (pre-Thunderbolt) go up quite spectacularly with my own eyes, and I wouldn't want something you can't drop the cells out of rapidly when you inevitably pour your coffee into it.

With regards to the software, I found the OSX environment inconsistent and Xcode absolutely terrible. The OSX environment is inconsistent because of the "task focused" application designs that you see: every shipped application has its own set of behaviours and pretty much ignores any common standard, resulting in head scratching. The keyboard shortcuts system is horrid, and doing anything without the trackpad is hard work. Xcode was just a mishmash of concepts thrown together badly. As a comparison point, Visual Studio is a lot more mature and consistent, and that is saying something.

The whole Apple/OSX ecosystem is a good attempt, but it's not good enough for the money, on the basis that some of the fundamentals are flawed. I'd actually throw more money behind Microsoft at the moment, as they are heading in what I perceive to be the right direction. Apple started at a good point and has got worse; Microsoft started at a bad point and is getting better.

TBH, however, the best OS/hardware ecosystem I've come across so far was SunOS 4 on the Sun4 architecture in the early '90s.


Sorry, I find it difficult to believe that Apple would release a laptop that would catch on fire if you spilt a cup of coffee on it. I suspect your story has been sprinkled with a touch of hyperbole. Evidence, please :)


LiPo batteries explode if you short them. It doesn't even need to be a very long short, or a very big one; they are incredibly sensitive to fast discharges. Water easily conducts well enough to cause problems, especially if it messes up another power conduit. A fire isn't guaranteed, but the risk is certainly there, especially in such a thin form factor: you don't have much space in there for containment or fancy routing to reduce the chances of a short.

Again, it's not guaranteed, but Airs are rather more at risk than other laptops because of their construction.


The battery is OK for direct exposure to water, as it's well sealed, but as you say, the power output from the cells in a short circuit is immense, which results in fires if you can't isolate the supply. Water + high current = interesting :)


I'd believe it just based on the ridiculous amount of heat the Mac laptops seem to generate. I'm constantly turning mine off to cool down.


> I'm constantly turning mine off to cool down

Use http://www.macupdate.com/app/mac/23049/smcfancontrol - when the thing gets hot, just increase the fan speed (to 5700 rpm or whatever your machine's max is) for a minute or two, and then everything goes back to normal.


They generate just as much heat as other systems with the same chip and chipset, but the metal case is much better at conducting heat.


No hyperbole. The design is simply dangerous.

One glass of water on the desk (not in the machine!). Capillary action sucked water around the seal on the base. It was turned over to remove the battery, the sucked-up water rained onto the logic board, and the result was all sorts of crackling, smoke, etc., and one dead MacBook.

As a qualified and experienced EE, I can say that it's electronics 101 to be able to isolate power (like every other vendor allows).


Your knowledge may be obscuring the bigger picture here. I suspect the general public wouldn't behave the way you would: the only spill I've ever seen resulted in the user turning the machine over and furiously shaking it. That this is the only spill I've seen in 16-odd years of being around and using laptops suggests to me that for a company like Apple, optimising for spills wouldn't be a priority.


A company like Apple optimises for good looks. We've all known that, and we've had proof since Antennagate; don't try to deny it.

After breaking my second laptop with fluids (the first was ruined with coffee; the second by standing in water, rain that had accumulated in my not-properly-closed "watertight" bag), I started buying ThinkPads. They have specially designed "fluid drains", because all laptop manufacturers know that spills are one of the leading causes of death for laptops. http://youtu.be/d7cvi00OZDM


The keyboard and speaker holes are sealed with some kind of rubber plate, which is glued onto the case. It therefore takes some time for liquids to enter the casing. When turning the MB directly upside down, nothing should happen.

But as always, you shouldn't place your drinks right next to any kind of electronics. It's common sense.


I agree there - the real issue is the spill.

However, there is no excuse not to design something with safety in mind.

If you look at the base, the edge rim is where the water seeped in; it sat in the curved section like a pool when the machine was oriented normally. There was at least 10ml of water, sucked off the table via that rim. The logical step is "isolate power". Any movement of the device resulted in the water spilling onto the logic board.

Pictures: http://www.ifixit.com/Guide/Installing-MacBook-Pro-13-Inch-U...

Now, TBH, I've personally done this with an Acer Timeline. I yanked the battery out in 5 seconds flat and hung it up to dry. It was fine the next day.


MacBooks catching fire is kind of what they are good at; just do a little google-fu:

The Powerbook 5300 was recalled because some batteries caught fire on the assembly line. (1995)

The batteries, manufactured by LG Chem Ltd. of South Korea, could overheat and pose a fire hazard, according to the CPSC. The recall affects laptops sold since January, which contain batteries produced last December. Approximately 28,000 batteries are affected by the recall. (2004)

Apple Computer Inc. on Thursday recalled 1.8 million Sony-built notebook batteries that could overheat and catch fire. (2006)


Great, so someone explicitly asks for evidence that Apple would release laptops that caught on fire; I provide three examples, directly answering the question; then I get downvoted.

Maybe I didn't provide references. Here's the references, most with the official statement from the U.S. Consumer Product Safety Commission.

The PowerBook 5300 was explicitly recalled, by Apple, for the short-circuit problem. References:

http://www.nytimes.com/2001/03/15/technology/laptop-batterie...

http://news.cnet.com/Apple-woes-continue/2100-1001_3-211692....

http://books.google.com/books?id=R7zgbMJM3vwC&pg=PA10...

2004 recall, same problem:

http://www.cpsc.gov/cpscpub/prerel/prhtml04/04201.html (Problem: An internal short can cause the battery cells to overheat, posing a fire hazard to consumers.)

2005 recall, same problem:

http://www.cpsc.gov/cpscpub/prerel/prhtml05/05179.html (Hazard: An internal short can cause the battery cells to overheat, posing a fire hazard to consumers.)

2006 recall, related problem:

http://www.cpsc.gov/cpscpub/prerel/prhtml06/06245.html

http://www.seattlepi.com/business/article/Fire-threat-spurs-...

http://www.macworld.com/article/52084/2006/07/recall.html

This is precisely what he asked for, and now I've included references. But really, if you guys don't value facts, research, and actual answers, then to hell with it. Why do I try? Might as well hang out on Digg.


FWIW I had a large glass of wine spilled all over my macbook with no long-term ill effects after the drying period.


Good for you. You must have been a lucky one.


I've spilled, in separate instances, water, coffee, and baby formula into my MBP keyboard, and had no problems, except with the formula, which made the keys sticky.

So presumably I'm so lucky I ought to be winning the lottery at least once a week.


Google for "macbook fire" and look at Google images for some other unlucky ones.

Yes, you are lucky, because you spilled from the top, which it's obviously at least slightly better at handling. If the base gets even slightly wet, boom.


"No hyperbole, the design is simply dangerous." "If the base gets even slightly wet, boom."

Right. No hyperbole.

But what you said before is that the notebook in your story was sitting in a puddle of water, 10ml seeped in, and then when you turned it over, you heard "crackling, smoke, etc."

An electrical short, yes. But not an explosion.

Sure, if the cathode comes into contact with water, that's a big problem. But the battery itself is well sealed. Your criticism seems half-baked here:

When you say their design is "dangerous" that sounds like you're saying its dangerous to the user. That it will cause injury.

But your real criticism is that you can't "isolate the power". That is, the sealed case makes the battery dangerous to the machine: you turned the machine over, and the logic boards, still hot with power, shorted when the water hit them.

But the implication that it's dangerous to consumers just doesn't make sense: Removable or not, if the lithium comes into contact with water, you've got a problem.

Moreover, I think it's disingenuous when you pull the "I'm an expert" card and then provide editorial analysis. Of course if you google you'll see the "unlucky" MacBooks that got wet; the "lucky" people do not post pictures of a pristine, dried-out computer. There's a selection bias there.


I spilled soda, still works, although the keyboard is a bit sticky. Also a lucky one?


Funny, they did a development comparison (a SPARCstation 2 could run SunOS 4, but I'm not sure):

http://www.youtube.com/watch?v=yGBoJjfMuAk

Did you do it a different way under SunOS 4? It seems like a lot of boilerplate in the video (Apple bias, I guess).


That video is a fallacy. I see where Apple's marketing started now.

Like everyone else back then, we wrote pretty much everything in Perl, and occasionally C when Perl hit a bottleneck. GUIs were built in Tk, not in OpenLook. Still far easier and quicker than IB in Xcode.

TBH it took as long as it takes in Visual Studio now, which says exactly how little progress the world has made in the last 15 years.


I prefer Linux over OS X as well. The OSX userland is a strange mixture of GNU and BSD tools, sometimes in ancient versions. OSX has a UNIX certification, but there are bugs in its POSIX API and even missing functions, like clock_gettime. Linux seems more POSIX compliant.
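The mixture is easy to demonstrate from a fresh Terminal (behavior from memory, not verbatim output):

    $ sed --version     # fails: BSD sed has no GNU-style long options
    $ bash --version    # GNU bash, but stuck at 3.2 because of the GPLv3 freeze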

The OSX package managers suck. I tried Fink, MacPorts, and Homebrew. A lot of packages are simply broken, especially now after Apple switched to clang. You end up in all kinds of weird situations. A friend of mine recently wanted to install Octave and gnuplot with Homebrew, and after several days he gave up.

It's hard to get help with problems on OSX. This might have improved, but back when I started using OSX in 2005/6 it was really bad. It was rare to find people with good knowledge of the deeper layers of OSX. It was a completely different culture from Linux or even Windows. Maybe that has improved now that OSX is much more popular.

And it feels like Apple is more and more ignoring the UNIX folks.


The problem is that you are treating Mac OS as if it were a Linux distro. If you want the Linux experience, with package managers and command line utilities and X11, then Mac OS is not a good choice.

Package managers with automatic dependency resolution are a typical feature of a Linux installation, but they go against the core principles of Mac OS X. Software on Mac OS X is distributed as self-contained packages that can either be simply dragged to the Applications folder or installed by the Mac OS Installer program (or with a single click if the software is available on the Mac App Store). Typical software written for the Mac has no external dependencies.

The fact that package managers like Homebrew or MacPorts exist does not mean that they are the preferred way to install software on your Mac. Compare it to Wine on Linux: you can run Windows software on Linux, but it will never be the real thing. Mac OS tries to be compatible with Unix, but nothing more.


Part of the rationale for package management with dependencies on Linux is that libraries are shared (only loaded into memory once) between different executables. If OS X is giving up some of that sharing in exchange for simpler installation, I wonder how much that has to do with the memory usage complaints in the comments here.
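The sharing isn't entirely given up, for what it's worth: app bundles still link the system frameworks dynamically, and it's mostly third-party libraries that get duplicated inside each bundle. You can check any app with otool (path illustrative):

    $ otool -L /Applications/Safari.app/Contents/MacOS/Safari
    # lists the dynamically linked frameworks and dylibs the binary depends on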


> If OS X is giving up some of that sharing in exchange for simpler installation,

I don't really see how OS X has 'simpler' installation than any well-structured Linux distro. What could be simpler than typing pacman -S <packagename> or apt-get install <packagename>, using a single, built-in package manager that automatically fetches, builds, (possibly) compiles, and installs all dependencies at the same time?
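Concretely, the whole install-plus-dependencies dance is one line (package name illustrative; Arch and Debian flavors respectively):

    $ sudo pacman -S inkscape
    $ sudo apt-get install inkscape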

Even if you're terrified of the command line, it's not like you can't build a GUI that makes that process look more appealing.

And please don't try to tell me that brew or MacPorts are replacements - they're not. They're ersatz substitutes that do a passable job of installing some (but not all) development packages and basic applications. They don't manage system libraries, and they don't manage the many applications that have to be installed from the App Store or from .dmg files. That's not one system - that's three or four, and they don't all play nice with each other.

In the time it took me to write this comment, I was able to upgrade all of my installed applications, system libraries, python modules, and absolutely everything else - all in one go.

Command-line or not, I can't see how one can argue OS X has 'simple' installation on any dimension, compared to (most) Linux distros.


"Even if you're terrified of the command line, it's not like you can't build a GUI that makes that process look more appealing."

Just FYI, you don't have to build one yourself. Plenty of GUI front-ends to Linux package managers already exist.


No, I'm treating OSX as a UNIX; it is advertised as such. I know that I can use the app bundles, but they are only useful for GUI applications. I usually run applications like Octave or gnuplot from the terminal or from other applications (e.g., Emacs).


It doesn't matter if MacOS is advertised as UNIX or not. Technically, it is UNIX.

The practical issue is that you are using GNU programs written with the Linux/GNU operating system in mind. Of course this experience is sub-par on the Mac, because neither the Octave nor the gnuplot developers see Mac OS X as their primary market.


I agree with the author: Apple has by far the best hardware in the PC business -- here's a great example of the extremes they go to in order to manufacture superior machines: http://www.businessweek.com/magazine/apples-supplychain-secr...

Alas, Apple's #1 target user is not the developer; it's Aunt Tillie. And this shows in the choices they make for OS X.

IMO Linux distributions have vastly superior software installation and management tools because they had to evolve them in order to survive and thrive in the wilds of the Internet. There has never been an 'Apple Store' with a 'Genius Bar', so to speak, for Linux users.

I agree with the author that Debian and its derivatives, like Ubuntu, have perhaps the best software-management tool on earth: apt-get. The damn thing is fast, and in my experience never breaks anything -- the system is always kept in a consistent state. (My experience with other software management tools -- particularly RedHat's yum -- has been less than 100% trouble-free.)


I've never had any of the hardware in my three ThinkPads fail on me. My MacBook, in the space of three years, had its plastic casing start to flake off, its battery fail, its power adapter fail, and its hard drive fail. The other piece of Apple hardware I owned is a Time Capsule. That lasted a couple of years before it conked out; luckily this was a known problem and I got a free replacement. I have a friend who went through three iPhones that all just conked out on him before he upgraded to an Android device, with which he has had no problems.

Apple hardware looks good, but it's limited in function and poor in quality.

A ThinkPad keyboard kicks the shit out of a MacBook keyboard. My ThinkPad has a built-in smart card reader, GPS and a 3G modem. All work flawlessly under Ubuntu. It has both a nipple and a trackpad, and the nipple on my ThinkPad is much better than the trackpad on my MacBook was.


mike-cardwell,

I had the internal fan on a ThinkPad suddenly fail at a critical time during a business trip, rendering the laptop unusable. Very unpleasant. The machine was less than two years old. It soured me on the ThinkPad brand.

Based on your comment, and given your high HN karma, I'll reconsider ThinkPads. I just bought a System76, but next time I'm in the market for a new laptop, I'll look at Lenovo's offerings.

Thanks!


I think Apple's hardware is mediocre at best. Yes, it's way better than the vast majority of bargain-basement plastic-shelled PC laptops, but it's nowhere near as good as the Lenovo (ex-IBM) ThinkPad X series.


My first reaction was "Linux developer prefers developing Linux software on Linux - News at 11".

However, I suppose there is more to it than that. The issue is that the fact that OSX is Unix under the hood is merely an implementation detail, and always has been. I'd much prefer it if Apple used a solid, up-to-date Linux distro under the hood, but they don't. To me, using the Unix system in OSX feels a bit like using Cygwin on Windows.

Conversely, with modern virtualisation software, you can have your cake and eat it. I use OSX to run desktop and media apps, at which it excels, and have Linux and Windows 7 running in VMs. Perhaps not good enough if you're doing resource-intensive stuff like heavy-duty compiles on your Linux system, but for my purposes it works very well. It has the added advantage that if I hack around with the VMs and something goes wrong, I can usually revert to a recent VM checkpoint.


That UNIX environment works perfectly well for me as a software developer, and I don't see how it compares to Cygwin in ANY way, shape or form. All my favourite tools are there, and I can install more using Homebrew. I build server backend software on Mac OS X and deploy on FreeBSD and soon Linux, and I've never had an issue. Cygwin is COMPLETELY different; man, is that crap a pain in the balls.


I got a Linux portable early this autumn instead of a new Mac, mostly to get apt.

It is a Latitude, so it isn't that bad (comparable to a cheap ThinkPad), but it can't really hold a candle to a real Mac. Sigh. I wish I'd made your choice and gone Mac again with Ubuntu/Debian (VM or not).


Inactive memory is not OS X specific; it's actually implemented in a number of other OSes, including FreeBSD.

> when it's on disk, it definitely is not made active quickly.

If an application is unloaded, then reloading it requires many operations: reading various things from disk, processing some data, allocating memory (which will be zeroed out, then initialized with whatever structs and data the program needs).

If an application memory is swapped, then reactivating that memory consists of:

1. paging memory back in RAM

2. there is no step two

Paging is key, as it means the data is in an efficiently readable format that can be put back in memory at a very reduced cost. Compare this to reading assorted files scattered across a filesystem and a disk, plus doing some more processing.

> First usually freed around 200MB of memory

Out of 4GB. Wow, what an incredible improvement! Pardon me while I go write a cron entry running that command every minute so that my system can stay in good shape!

> When arriving to work the first thing was to hit repair disk permissions

This is absolutely astonishing. Seriously, Repair Permissions is a glorified ch{mod,own} -R. Quiz time! Why do you think it reduces the "Inactive Memory"? Because it's hitting the disk. Hard. Actually, every system file gets hit, and in doing so those files make their way into the cache and the inactive memory gets properly evicted. So the supposedly non-functional memory management turns out to be perfectly functional after all.
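If that explanation sounds too convenient, note that any bulk read churns the cache the same way; a sketch, and not one to run casually, since it reads every file on the root volume:

    $ sudo find -x / -type f -print0 | xargs -0 cat > /dev/null
    # same cache-eviction "effect" as Repair Permissions, minus the chmod/chown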

> And of course this does not support installing Python, Ruby, Perl on any other software that has its own way of distributing software.

which is bullshit (although there's no Perl).

   $ brew install python
even gives you distribute's easy_install out of the box. You can install Ruby the same way (and since it's 1.9 it includes RubyGems), but I'd recommend using rbenv+ruby-build, which is also in the package list.

Apparently the author wants Python/Ruby/Perl packages provided by the package manager, which might just be a bad idea given how bad the state of those packages is in Debian. One would be much better served with pip+virtualenv and rbenv/rvm+bundler.
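For the Python side, the pip+virtualenv route is a few commands (project and package names illustrative):

    $ sudo easy_install pip virtualenv   # one-time bootstrap on the system Python
    $ virtualenv myproject               # isolated site-packages, no root needed afterwards
    $ source myproject/bin/activate
    $ pip install flask                  # or whatever the project actually needs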

There's a brew-pip if you really want to integrate.

> And in case you mix up MacPorts and homebrew, you're deeply screwed.

How so? They live in completely different directory trees. As long as you don't screw up your PATH or something, they're oblivious to each other. I had them living side by side for some time before dropping MacPorts, without any issues.

> working command line tools

    $ brew install coreutils
But I'd hardly describe the BSD utils as non-working (hint: I did not install coreutils, yet I spend my days on the command line).

As for compile time, it's hardly a problem, as Homebrew mitigates it (contrary to MacPorts) by not duplicating every library already available in the OS. Besides, the system (much like ABS on Arch Linux) is designed to make writing your own packages, or tweaking an existing one, a straightforward affair. Compare to creating a .deb properly, which is, ahem, non-trivial. Yes, it would be faster not to build stuff (like Arch, which brings the best of both worlds together), but hosting binary packages has a cost that skyrockets as you gain users (plus one would need to make binary builds for the various OSX versions, a problem that simply doesn't exist with source builds). What's more, having software compiled from the original source instead of a third party's is interesting in a number of ways, including running vanilla software instead of the heavily patched versions in Debian.

I'm glad the author has found a place that suits him, but going on such an uninformed rant is unfair.


I feel you are being unfair about the uninformed rant. Picking at a couple of things I know about:

1) I've tried installing both MacPorts and Homebrew, and some utils did not work, as whichever got put in the path first confused the other. I suppose I could have made a separate symlink directory for just the binaries I wanted.

2) I found 'brew install coreutils' broke some build scripts, which expected the proper Mac tools, if I put them all in the path.

3) I find that about 10% of the time I try to install something from brew, it fails to build. g++ 4.6 failed just 20 minutes ago. That is a real pain.

I can't comment on the memory, except that I find my 4GB of RAM seems to go much further when I dual-boot my MacBook into Linux. It might just be the programs that I run.


1) You could also create some wrapper scripts setting the path, or have something inspired by virtualenv handle it, or maybe different terminal profiles. Those breakages really are edge cases.

2) Don't put them in the PATH. They're prefixed with 'g', so you can make a non-prefixed alias for your interactive shell and use the prefixed variant in your scripts if need be (see the sketch after this list). If you write portable scripts, you're either using common features or using variables for those utils, aren't you?

3) g++? That's not in Homebrew...
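On point 2, the alias trick looks something like this (interactive shells only; scripts keep seeing the BSD tools):

    $ ls /usr/local/bin/g*           # Homebrew's coreutils all carry the g prefix
    $ alias ls='gls --color=auto'    # opt in per command, e.g. in ~/.bashrc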

Honestly though, MacPorts is a real pain. That's why I worked with others to bring Arch's pacman to OSX (dubbed ArchOSX) some years ago, but binary hosting was proving to be a chore, and then Homebrew started taking off, and in my case it Just Works.


Homebrew does have binary packages (we call them bottles).

I'm the guy who implemented the feature, and I'm realising that because I didn't shout very loudly, people don't realise it exists.

For example Qt has a bottle: https://github.com/mxcl/homebrew/blob/master/Library/Formula...

The reason we aren't doing this for more packages is basically people power and hosting (I'm not sure SourceForge would be happy with us hosting the number of binaries we'd want to have).

Hope that explains a bit. Feel free to comment if you think we're taking a good/bad approach here.


Hundreds of organizations donate mirror space to linux distributions (e.g. http://www.debian.org/mirror/list). I'm sure some of them would provide space to homebrew.


    3) g++? that's not in homebrew...
He probably meant GCC, which does have recipes in Homebrew.

    Homebrew started taking off, and in 
    my case Just Works
This is the first time I ever heard anyone saying it.

In general Homebrew did a good job for me, but it did break on me a couple of times. And when it did, fixing it caused me a lot of stress, because in the end it's really not much better than "./configure && make && sudo make install".

What I don't understand is why we can't have binary repositories like Debian's. Certainly Debian has to handle many more architectures, and its package count is really huge. So how come there isn't such an alternative for Mac OS X? Why do solutions like Homebrew and MacPorts insist on compiling packages locally?


    Homebrew started taking off, and in 
    my case Just Works
This is the first time I've ever heard anyone say it.

Here's another: it worked for me perfectly, in and of itself. I use pianobar, which updates frequently (due to Pandora changing protocols or keys), and I often have to do some recipe editing to get the latest update, but that seemed simple enough. Since the brew update list doesn't pick up fixes anywhere near as fast as the pianobar author ships them when something breaks, it seems to me that waiting for someone to compile pianobar elsewhere would mean waiting days for the packaged version.


If you are attached to binary packages, there's always Fink, which may even predate MacPorts.

Unfortunately, building the packages and then keeping them current takes quite a bit of infrastructure which is why fink's binary packages are really outdated at times.

The other issue is with runtime dependencies: self-compiling packages gives you the freedom to, say, build vim without X11 support. With binary packages, the maintainer (or the packaging system) must create n packages for n possible combinations (if the project doesn't have some dynamic-library based plugin system), which is, again, problematic from a resource-requirement perspective.
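MacPorts, for instance, surfaces exactly this trade-off through its variants mechanism (a sketch; which variants exist differs per port):

    $ port variants vim            # list the build-time options the port offers
    $ sudo port install vim -x11   # build with the x11 variant disabled

Every such combination is another binary a maintainer would otherwise have to pre-build and host.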


That being said, if Debian can do it for around 30,000 packages, on almost as many architectures as OS X has mere hardware models, why can't OS X? With all the app stores and cloud services, Apple is big on 'quite a bit of infrastructure'.


Because Apple don't provide these package management tools, volunteering third-parties do.

Apple are never going to provide Linux-style package management tools because the market for them is minuscule compared to Apple's real market: normal people. For normal people, there's the App Store.


Developers aren't "normal people" then? If that's seriously the attitude Apple takes towards its developer community ("no, we won't give you the software you need, get it from the crappy community projects") then why do you put up with it?

It seems like the OP's point is pretty valid to me. I use an OS that gives me what I need to do my job.


> Developers aren't "normal people" then?

No. They aren't. Developers have needs far greater than that of your average everyday user. Apple sells a machine that is the best possible for the greatest number of users, and doesn't really cater to niche markets. I don't get what's surprising about this.


Because even developers do "normal people" things. There are a lot of trade offs, but in my experience Apple provides the best middle-of-the-road machines.

Linux and Windows feel like they exist on opposite ends of the spectrum.


It may not have started out that way (or it may have, I'm not sure), but MacPorts is an Apple-hosted project: http://www.macosforge.org/

What they don't host is the source or compiled versions of any of the packages in the MacPorts repository - potentially for the same reason they include no GPLv3 software in their OS.


It would be perfectly safe for Apple to offer the resources required by the MacPorts project to function adequately.


In some cases, they do. This is especially apparent if you're still using MacPorts with PPC, as most of the packages it pulls down are from the MacPorts servers.


While I agree, the point is moot as this is about developers. So yes, Apple did not make a package management tool. That pisses developers off. It makes no difference, though, because you suck it up if you want to develop for Mac/iOS.

It's kind of like developing on Windows, for Windows: gotta do what you gotta do.


You could always try to revive the Arch OS X project (it was 100% functional end-to-end) if you want, but it lost out to Homebrew for a reason: no one seems to want to assume the cost (both in time and money) of the binary side of things (hosting, building distributable packages, uploading, etc.).


FWIW nowadays MacPorts does install binaries for packages when available. I just did a "port upgrade outdated" a couple days ago and many of the installs were very fast as a binary of the current version was available for my OS.


Why do solutions like Homebrew and MacPorts insist on compiling packages locally?

So that an OS upgrade, in which Apple tends to include arbitrary upgrades to system libraries (or the Ruby/Python etc version), doesn't break whatever you installed with Homebrew/MacPorts.


gcc and g++ are part of Xcode. That's the reason you have to install Xcode before you can use Homebrew.


Can't say I love the symlinks; however, I use homebrew for convenience, but in most cases it's ./configure --prefix=$HOME/local ...

This rarely fails. When it does, it's user error.


I believe Fink provides binaries.


This is the first time I've ever heard anyone say it.

What do you mean? People say that about Homebrew all the time. Have you read any blog posts on it?

It just worked for me too, and I have around 20-25 packages installed.


1) Well, I could do I suppose. That all sounds a bit complicated however.

2) I am not talking about my own scripts, I am talking about other people's. I could obviously go through and debug them, and then check I haven't broken them on a couple of Linuxes, and on a Mac without MacPorts/Homebrew, but... I don't want to.

3) You are right, it looks like gcc and gdb have both been taken out of homebrew, I assume because they didn't work. They were there previously.

Certainly I find Homebrew very useful. Just now I noticed the one thing I used in Fink is now in Homebrew, so I have removed Fink, which should also hopefully solve problems.

However, as time goes by, I find OS X is slowly getting worse. In the days of OS X 10.1, the command line tools were in sync between Linux and Mac OS X; now that is certainly not true, and I find the Linux set more useful, especially in its default state.


2) I am not talking about my own scripts, I am talking about other people's. I could obviously go through and debug them, and then check I haven't broken them on a couple of Linuxes, and on a Mac without MacPorts/Homebrew, but... I don't want to.

Which is why the alias solution is nice, as it will only impact the interactive shell, not the scripts.


I'm curious, what are some of the pains you've had with MacPorts? I've been using it for about 8 months with no real issues, but then again I'm not a C/C++/native developer.


I've had it fail to install packages for me before, forcing me to grab them from source, modify the source with patches, and compile by hand.

I've been using Homebrew for about a year and haven't ever had an issue like that.


I agree on package management; I haven't used Homebrew, but I definitely find MacPorts breakage much more often than I find Debian breakage, even in Debian sid/unstable.


My experience with Homebrew has been a world of improvement over MacPorts. Every time I'd try to update things with MacPorts I might as well have started filing an issue ahead of time. Homebrew rarely requires additional intervention.


I have literally never had a problem with Homebrew.


I have had a few, but largely they were easy to resolve. One thing I dislike about Homebrew compared to apt (and similar) is that it requires a fairly unpolished workflow to do personal mods and patches on a package, while the apt tool-chain allows for this in a fairly polished (if idiosyncratic) way.


> it requires a fairly unpolished workflow

I find it quite the contrary, as it summons git power upon /usr/local. The workflow is basically just editing live in /usr/local/Library/Formula ('brew create url' scaffolds in many cases, 'brew edit formula' gets you to an existing one), committing, and handling merges on pull (which is what 'brew update' does).
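A sketch of that loop (the formula name and URL here are made up for illustration):

    $ brew create http://example.org/foo-1.0.tar.gz   # scaffolds a new formula
    $ brew edit foo                                   # opens it in $EDITOR for tweaking
    $ cd /usr/local && git commit -am "foo: local patch"
    $ brew update                                     # effectively a git pull; your commits get merged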

If you want to contribute back, fork on github and add your repo as a remote, then push and submit a pull request.

It's really different from apt (which is normal, as Debian packaging has a massively different scope), but IMHO much, much simpler and more efficient.


A great way to run Linux on Apple hardware: boot native.

Bonus points for then using working, legal VMs for the OS X moments. Or, if you like, just boot from a removable Linux SSD for work, and reboot for OS X.


Any advice on how to achieve this?

I have a slight edge-case, where I would like to install Debian Mint onto a 32GB flash drive, and boot from that. I have a Macbook Air and disk space is at a premium.

Googling around has been to no avail.

If it's possible (though absolutely not necessary) – my OSX and Debian files could be shared when booted into either – that would be great!


> my OSX and Debian files could be shared when booted into either

You can, but you'll have to share data on a non-journaled HFS+ partition for it to be writable from Linux. Watch out for UIDs/GIDs: https://wiki.archlinux.org/index.php/MacBook#Home_Sharing
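Something like this, roughly (device and mount point names are illustrative, so double-check them against your own partition table):

    # On OS X: turn off journaling on the volume you want to share
    $ diskutil disableJournal /Volumes/Shared
    # On Linux: mount it read-write, mapping files to your Linux user
    $ sudo mount -t hfsplus -o rw,uid=1000,gid=1000 /dev/sda4 /mnt/shared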

ArchLinux has its image booting from USB, so I don't see why not.


You might need an external optical drive to install to the flash drive. But booting from USB may be difficult. You could try rEFIt: http://refit.sourceforge.net/


I've tried installing both Macports and homebrew, and some utils did not work, as whichever got put in the path first confused the other. I suppose I could have made a separate symlink directory, for just the binaries I wanted.

Yeah, you could. Also, why install both? Anyway, welcome to software. It ain't perfect. You think it'll be better with Linux? I've had that happen to me on Debian a few times, and lots of times with RPMs. But even if it doesn't happen, there's plenty of other software you cannot install on Linux, like all native Cocoa apps, from iWork to Photoshop. If you don't care about having access to those, then you don't need OS X in the first place.

I found 'brew install coreutils' broke some build scripts, which expected proper Mac tools, if I put them all in the path.

So, you generally wanted a Mac in order to install large quantities of third party unix software in the core system on top of the same regular binaries? Do you go install FreeBSD and then add a Linux userland? Do you install Debian/Ubuntu etc. and then go change the system, say, Python? Because people who tried that also found it's a world of pain.

I can't comment on the memory, except that I find my 4GB of RAM seems to go much further when I dual-boot my macbook into Linux. It might happen to be the programs that I run.

You're doing it wrong.


So, you generally wanted a Mac in order to install large quantities of third party unix software...

Most software developers that use OS X just want a decent GUI on top of Unix. So yeah, they do generally want a Mac in order to install large quantities of third party Unix software.

Do you install Debian/Ubuntu etc and then go change the system, say, Python?

Yes. I have had no problems with it.

You're doing it wrong.

You're probably not doing enough to notice that memory usage absolutely sucks under OS X.


I am a software developer and I use Mac OS X. I develop Unix backend server systems. I use Mac OS X as my primary build platform, then actually run on top of FreeBSD.

I use homebrew for almost all of the dependencies that I have, there are one or two that I compile by hand.

I've not had any issues. Best of all, I don't need the g* stuff to do my builds. I am perfectly happy with clang and the BSD tools available.

I don't understand why people purchase Mac OS X based computers then want to run GNU coreutils on top of it, or want to Linuxize their entire install. Of course something is going to break at that point, especially when build tools expect certain versions of certain tools to be available.


I don't understand why people purchase Mac OS X based computers then want to run GNU coreutils on top of it

Because only Apple has figured out how to make and sell great laptops running Unix at a (relatively) reasonable price, where all the hardware and drivers just work out of the box. If I could have bought a computer as good as the MacBook Air in every way, but with Linux instead of OS X and guaranteed zero driver or hardware compatibility issues, I would have. But I couldn't, so I bought a Mac.


My wifi is still broken after the Lion upgrade. I tried everything and no one has been able to figure it out, including Apple's support. The only thing that works is replacing the Atheros driver with the Snow Leopard one. So much for drivers just working.


I wish users could understand they are not the center of the universe, and "one machine I got didn't work for me" doesn't automatically translate to "that brand of machines is bad". Bugs, problems, bad runs happen with everything. One personal case (or 10,000 or 50,000) out of some 30,000,000 computers is NOT the determining factor with respect to whether a brand of hardware works reliably or not.

"Just works" is relative. Given that Apple controls both the Wifi card and the driver for their machines, and that it offers a limited range of such cards, it sure is better poised to "just work" than some obscure wifi card used in a PC alongside with some third-party open source driver for it. This is, pretty much, common sense.

So, as relativity goes, it's pretty much true, or Apple wouldn't have held the highest user satisfaction position for multiple years in a row, with over 20% separation from the runner-up.


If I'm paying double what I would pay for other brands, I expect to have driver issues fixed in short order. It's been 7 months now and a lot of other people have the same problem. It's an undisputed bug. What on earth does that have to do with relativity or user satisfaction statistics?

I wasn't making any statistical claims whatsoever. I replied to someone talking about "zero driver or hardware compatibility issues".


If I'm paying double what I would pay for other brands

For starters, you are not. You are just buying the equivalent of top-tier machines from other brands. If you compare the equivalent hw specs AND build quality (from the external design to materials like aluminum, to the extra cost of unibody construction, to the extra engineering effort and cost to pack things light and thin, to the Thunderbolt ports, to the display quality, etc.), you either end up at the same price, or cheaper, or the thing doesn't exist at all on the PC side. Even worse with iPads and iPhones, whose competitors struggle to compete on price.

I expect to have driver issues fixed in short order

Well, I guess you can go to an Apple Store and have the machine changed if it doesn't work, or get your money back.

But in general "X issue fixed in short order" is not how it works, even when buying mainframes for top dollar. Sometimes you just have to wait until the engineers find the root of the problem and come up with a solution. Sometimes it even takes the next generation of machines for the problem to be fixed, if it's a HW bug. Sometimes it never does, if it affects some small percentage of machines with some strange setup (e.g with that brand of router, when set to those settings, etc).

I replied to someone talking about "zero driver or hardware compatibility issues".

If anyone claims "zero driver or hardware comparability issues, he is clearly delusional or just speaks for himself. iBooks circa 2003 even had their logic boards fail multiple times, for example. Or G5's had strange goo coming out from their cooling system. I had a failed DVD on a Macbook Pro. Still, the same kind of things happen to PC runs all the time (I've had too many such cases from '91 to '05), they are just so fragmented as a platform that you never get to hear from them.

A Macbook run is 10 million machines of the same specs. How big is an Asus 105-SH/i-mkII run? Or a Dell run, considering Dell offers 2,000 build-to-order configuration combinations? 1% of a Macbook run is hundreds of thousands of machines; 1% of those PC runs is like nothing, so you don't get to hear much. Not to mention that they don't have forums and sites dedicated to the machines anyway, just broad websites for all PCs.


Your way of comparing hardware is not very useful for me. It just shows your own personal preferences, and I don't share your preferences. For me, "unibody construction" adds about as much value as a gilded keyboard would.

So when I said "I'm paying double what I would pay for other brands" I meant it literally, including the "I". The set of machines that meet my requirements includes machines from the likes of Toshiba that cost less than half of Apple's cheapest offering.

The reason why I still bought from Apple is that I need a Unix system and I hate dealing with driver issues. So having to deal with unfixed driver issues is the quickest way to drive me away.

(I did not downvote you by the way)


How does the Lenovo x-series compare for you, where does it not measure up? I was recently at a linux conference and probably about 60-70% of delegates had either an Apple or a Lenovo laptop.


I've used IBM and Lenovo x-series (x41 and x60) laptops before, and while I'm generally a fan and would definitely go with them as my number two choice, I've never had one "Just Work" with Linux. It's always very close, but there is always something. The X220 was also more expensive for equivalent specs when I bought my Air, especially once you took into account the cost of replacing whatever drive it came with with a 120GB SSD. Also it is slightly larger and heavier than the Air, and for me size and weight were a big factor.

Then there are minor things, like that there are no stores anywhere around here that sell them, meaning I'd have to buy one without playing with it first. Also, it's basically impossible to buy one without a Swedish keyboard layout in Sweden, while Apple will happily let me choose any keyboard layout I want. None of these are deal-breakers in themselves, but they're things that add up.


Yeah, I'm just running Logic, Photoshop and Final Cut Pro for various projects, I mean those use very little memory, so what would I know...


(I'm the author of the original post)

You bring up good points and my article would need some clarification on some parts, I agree on that. I'll just write quick replies back to you, and try to format something on the article itself later.

Purge really did free memory, and quite a lot of it. I'm no expert (as you can probably tell) on how OS X memory management works, but I mostly settled on solutions that seemed to help my problem. Maybe there was some third party software that messed things up.

The problem with the inactive memory is that it is not freed, it is swapped. So when hitting the memory limits of my system, the computer started swapping. Just freeing the memory would, in my case, have been much quicker. Practically, my machine was constantly swapping once the memory limit came up. As you said, repairing disk permissions caused all this to happen by filling memory with disk cache. So that was a nice solution to my problem: a way to force swapping of inactive memory.

The Python point is a bit wrong; I was indeed trying to argue that installing Python packages is impossible through Homebrew. However, I did use pip+virtualenv a lot, so that's at best a vague argument on the OS X side. Still, in production I always rely on the packages provided by the OS, not pip+venv, unless really necessary. This is mostly because it makes it easier to keep the system up to date.

I'm sorry if this came across as an uninformed rant, but I just wanted to share how I felt using OS X and a MacBook for a year as my primary computer.


The problem with the inactive memory is that it is not freed, it is swapped. So when hitting the memory limits of my system, the computer started swapping. Just freeing the memory would, in my case, have been much quicker. Practically, my machine was constantly swapping once the memory limit came up.

That's just not true. If inactive memory is something that's already backed by disk (like a memory-mapped file), it'll be discarded. Only if it's read-write memory that's /currently held allocated by a running process/ will it be swapped out to disk. Unless you're doing something pathological (like suddenly allocating lots of RAM and forcing paging, which is what purge utilities do...), the architecture /speeds things up/.

If it's really true that, without doing anything special, things were always being swapped out to disk for you, it necessarily means that there was a process that had allocated lots of RAM (it sounds like) and written to it, so that stuff had to be paged out to disk to free up RAM without losing data.

It sounds like you were running purge commands or utilities to 'free up RAM'. That is counterproductive. It causes the system to release cached 'clean' (i.e. already on disk; a mapped file, essentially) RAM and swap read-write memory out to disk, only to have to re-read it all when you actually need it. In other words, using purge 'utilities' actually puts the system into the worst possible state.
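You can watch this happen yourself (a sketch; the exact counters shown vary a little across OS X releases):

    # Print paging statistics every 5 seconds; steadily growing
    # pageout numbers indicate genuine RAM pressure:
    $ vm_stat 5
    # Deliberately drop the caches, then time how long your next app
    # launch takes; that 'free' RAM was earning its keep as cache:
    $ purge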


The problem with the way OS X keeps data cached in inactive memory is the assumption that you are going to re-use the same app within a reasonable amount of time. With the current behavior/performance response (without knowing exactly how Apple's engineers implemented it), it really feels like a giant garbage collection system that takes ages to free up its own memory, with no real sense that if you haven't used certain apps for a day, chances are it will be a long while before you use them again.

I am one of the devs out there running a MacBook Pro with 8GB of memory (I wish I could have more, but I have an old 2010 model). For web development, I have at least Firefox/Chrome/PhpStorm/SmartGit/MAMP/Thunderbird/Terminal/Notational Velocity/Dropbox/Alfred/Sophos AntiVirus open at all times. Along the way, I may open a few other apps that I use rarely, like Photoshop/CyberDuck/VMware Fusion/iTunes/iCal/iOS Simulator/Preview/LibreOffice/Skype. Pretty quickly the 8GB gets used up, and the system runs into the ground shortly after.

If I then have VMware Fusion shut down for a while and relaunch it later, the system really just can't take it anymore. The last resort? purge.

At least, that's my day-to-day experience with OS X. Personally, I find the memory management really lousy, worse than the other OSes I've used in the past (both Windows and Ubuntu/Fedora/Gentoo).

So why the heck do I still use a Mac? Because the driver support is still far better than Linux's. With a Mac, you are less likely to need to blacklist some driver because of freeze-up issues.

Either way, I am definitely not a happy camper with the current memory management system in OS X.


The inactive count is a misleading figure in many ways. It includes both 'dirty' and 'clean' (i.e. identical to what's on disk, unaltered) memory that hasn't been used for a long time. Purging 'clean' memory is instantaneous. The only cost you'll see is writing 'dirty' memory out to disk.

Your explicit purging is changing the cost of writing dirty data out from an ongoing cost to a single, longer, upfront cost. Instead of writing only when more RAM is required, you're forcing it all to happen at once.

Incidentally, the OS does try to keep an area of free RAM so that some memory can be allocated instantly; it's not only swapping things out when RAM is absolutely full. It's possible to outrun this process, though, if an app tries to allocate huge amounts of RAM at once (i.e. more than is kept free for this purpose).

For your specific case, presuming apps are behaving well (see below), you would be better off quitting apps and relaunching them when you need them later. This will free up all the dirty RAM they've allocated (just like when you purge), but the 'clean' inactive RAM will not be purged (because that's not necessary; as I said, it's free to purge that kind of memory when it's needed for something else).

You also want to run Activity Monitor when your system is in its bad state and see if one of the apps you're using is allocating lots of memory in particular (check out the "Real Mem" column). The OS can't do anything if an app is really allocating and writing to memory; it's obviously not able to just discard that written-to memory.

Really though, if you want to do all those things at once, more RAM might be required. Remember, with VMware and the iOS Simulator running, you've got two whole other OSes running at the same time; it's reasonable that they'd require lots of memory to work well!

By the way, the purge command was written to simulate /worst case/ conditions when performance testing. It's designed to flush out caches so that the system has to e.g. load all an app's code from disk when launching.

[Source: I worked analysing this kind of thing at Apple until a couple of years ago].


> So why the heck do I still use a Mac? Because the driver support is still far better than Linux's.

I feel like this is a stereotype that just won't die.

Yes, up till a few years ago you might have had to do some poking in /etc to get things working, but as long as you spend a few minutes looking up basic background info before you buy, those problems just don't happen these days. I haven't had to edit a config file to get hardware working since 2007.


I've had to edit configurations files to get hardware working in the last month.

On a laptop we have here at the office, I had to keep a certain driver from loading before a different one, or else the two would squabble over the wifi card and it would never show up.

The other thing, more software-related than hardware, is that it is an MS Windows shop; all of the local domains are machine.domainname.local. This conflicts with mDNS, as you can imagine, so the Linux machines are unable to access any of the resources on the machines named machine.domainname.local, because mDNS responds with a failure. I had to modify /etc/nsswitch.conf to fix that issue.
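For anyone hitting the same thing: the fix amounts to reordering one line so .local lookups fall through to unicast DNS instead of stopping at mDNS (a sketch; the stock line varies per distro). It also explains why dig keeps working while ping fails: dig queries DNS directly and bypasses nsswitch.conf entirely.

    # /etc/nsswitch.conf -- typical stock line:
    hosts: files mdns4_minimal [NOTFOUND=return] dns
    # changed so unicast DNS also gets to answer for .local names:
    hosts: files dns mdns4_minimal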

Linux is not without its failures. Saying it just works is certainly not the case. Whereas the Mac OS X machines I deploy come out of the box, get configured, and are ready to go. Drivers work, software works; no need to go googling for hours trying to figure out why ping won't resolve a machine.domainname.local address while dig is doing just fine.


> Saying it just works is certainly not the case.

I said if you spend a little time scoping these kinds of things out up-front, it's very easy to get a machine on which it will work without issues. Obviously if you found a machine lying around the office and tried to load an OS on it, your chances of it working well are not going to be as good. You can't just load OS X on random hardware and expect it to work either.


After 10 years of Linux desktops, including purchasing a laptop with low-performance hardware just because it used entirely OSS drivers, I still had issues with basic things like multiple monitor support (it worked, but only when I disabled compositing, which makes redraw suck).

Sorry, from my personal experience that still isn't true, as much as I wish it were.


I've been using dual monitor setups at home and at work for a few years now, through several hardware builds, both desktops and laptops, first using Ubuntu, then switching to Debian Testing for a rolling distribution about a year ago (after finally realizing I love the latest software but don't want to spend time upgrading/rebuilding every 6 months to keep up with Ubuntu's release schedule, or deal with the hassle of installing everything from source). I've been using Nvidia cards with the nvidia driver, and dual monitor support has been fantastic for me.


Well, I've been using Nvidia cards with the nvidia driver and dual-monitor support as well, and it was fantastic, until I upgraded my Ubuntu install and got Unity without asking for it, after which multi-monitor support was completely broken. My colleague, who has 2 screens of a slightly different type (all are Lenovo ThinkVision) and is running Arch with Gnome 3, has been experiencing random multi-monitor glitches since day one. Sometimes, for no apparent reason, one of his displays doesn't get a signal after waking his laptop from sleep or hibernate, and the only way to resolve it is a reboot.

To make a long story short: we could exchange anecdotes all day about the state of 'Just Works' on Linux, but at the end of the day, I think no one with enough experience using various Linux distros and OS X can honestly and sincerely say Linux is even close to OS X in that respect.

Myself, I've been using Linux since Slackware 4 and have tried about 10 different distros over time, alongside OS X for the last 5 years or so. Up to this day, I regularly run into problems that need fixing on Linux, particularly after upgrades or when switching hardware. Whether it's Wifi cards, USB hardware, multi-monitor support, network configuration issues, software that stops working, or system library problems: there's always something. With OS X on 3 different machines, from 10.4 through 10.7, I've had only one issue that required maintenance, a b0rked upgrade. It was pretty nasty, but fortunately OS X has Time Machine and target disk mode, so in no time I was able to pull off any important data just to be sure, re-install the OS, and restore my Time Machine backup, only to find everything back to normal, to the point that I didn't even need the files I had pulled before the restore.


I still haven't figured out how to get my phone to tether over USB on my friend's Mac. Works flawlessly on my Debian machine though. It's not like OS X is seamless.


You just claimed that Linux hadn't been that way in a while... I just provided a counter-point.


Well, I would say yes, Linux has come a long way to be much more mature and much more usable than before.

However, up to this date, it is not without issues, especially on laptop hardware. Remember the Lenovo ThinkPad T400 from a few years ago? The level of stability from a popular distro such as Ubuntu has been quite up and down: with one release (11.04), I had trouble getting it to boot and play nice with dual graphics mode; today, with 11.10, it is much better. How about that shiny Acer AspireOne 722 netbook? You should check the online threads: there are still discussions about how to prevent freeze-ups, etc. All these little quirks here and there are the reason why I still run OS X.


Interesting - I do about the same thing (minus the antivirus... that could be your big pig there), with 8 gigs of physical RAM, and I have paging (swapping?) disabled... and I've never had a "not enough memory" crash or whatever. With VMware going, iTunes, Xcode building stuff, Skype, Dropbox, iTerm, Mail, a few browsers, a bunch of tools... maybe a video going on the second monitor. On a mid-2009 MBP.


I do happen to have an 8 GB machine, and almost never reboot (only reboot for OS updates).

I run half a menu-bar full of resident helper apps, like Dropbox (a big one), Fantastical, ScanSnap, Xmarks (another big one), Transmit (another big one), Evernote, and more. I also keep Apple Mail running, mapped to a half dozen Gmail IMAP accounts. I have "geek tool" updating my desktop with iCal appointments and various ps outputs.

I'm running a local MAMP stack and local Django stack. I run a Parallels Windows 7 VM for testing things in IE and testing from Windows in general.

Other running software is usually Safari, Terminal, Sublime Text 2, Codebox, Source Tree, Sequel Pro, Adium. I run and quit Office 2011 every time I need to edit a document. I run and quit Aperture and Photoshop.

Using this command line to see memory used by processes:

    ps -x -o rss,command | awk '{sum+=$1} END {printf "%9.3f %s\n", sum/1024, "MB" }'
Gives 3000.535 MB for all of the above, 3235.000 MB after adding Word and Excel, or 3794.277 MB after adding Photoshop CS5 and iTunes streaming radio.

Makes me wonder what you're doing to run out of memory.


I do happen to have an 8 GB machine, and almost never reboot (only reboot for OS updates).

Here's what I don't understand - why would anyone in their right mind purchase a $2000 laptop and then not spend the 20 minutes and 100 bucks to max out the memory on the thing? It's the easiest thing in the world to do, and basically means you never have to worry about memory usage again.


Well, it's only a $1000 laptop, but...

    jerf@jerfhom ~/tv $ uptime
     01:01:32 up 2 days, 23:45,  4 users, ...              
    jerf@jerfhom ~/tv $ free
                 total       used       free ...
    Mem:       3904192    3682400     221792 ...      
    -/+ buffers/cache:     740212    3163980
    Swap:      6297444      20584    6276860
Recall that what "matters" is really that second line, which translates to 740MB being really "used" and 3.1GB being just "stuff I happened to read from disk at one point", which as it happens includes rather a lot of media files. Loading another 4GB of media into RAM isn't going to help my system performance any.

This is with a respectable Linux dev loadout, minus my VMs, though even those tend not to strain my system any. $100 on RAM would just be a wasted $100.


No kidding. I just put 8GB into my '09 MacBook Pro and it was $45. That's insanely cheap -- just max your system out and don't worry about it.


And-or get an SSD - the heavyweight disk-space users like movies and music are trivial to move to an external drive on a Mac, and with an SSD so much disk-thrashing pain just goes away. It's pretty great.


After noting my rather older MBP with 8GB RAM was outperforming my new iMac, I spent $16 for 4GB of RAM (doubling the total to 8GB); the iMac's performance improved immeasurably. By far the best bang for the buck on OSX is maxing out the RAM. (Well, at least until I can afford a solid state drive of adequate size.)


You add up the individual RSS of every process, but processes share physical pages, so that doesn't make any sense: you're counting physical pages more than once. For example, on my 4GB laptop your command yields 5.6GB, and I don't even have a swap partition.
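On Linux you can compute a sum that does mean something (a sketch; you need root to see every process's maps):

    # Pss (proportional set size) splits each shared page evenly among
    # the processes mapping it, so the values can be summed, unlike RSS:
    $ sudo grep -s '^Pss:' /proc/[0-9]*/smaps | awk '{sum+=$2} END {printf "%.1f MB\n", sum/1024}'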


I'm skeptical about your little script (though I too have a lot of heavy apps open all the time, almost never reboot, and have certainly never had any memory-related problems whatsoever).

My MBP has 8 GBs of RAM, and this is what Activity Monitor tells me:

   Free: 2.19 GB
   Wired: 1.01 GB
   Active: 3.70 GB
   Inactive: 1.08 GB
   Used: 5.80 GB
but your script gives me:

   2598.480 MB
Why is that? Isn't Active kinda analogous to -o rss?


I think that's because 'ps -x' only gives your own processes and not all processes. Add the '-a' option to show stats for all processes.


You're right, typo. It's supposed to be -ax.


The script is wrong: it simply adds the RSS of processes together, but physical pages are shared between many processes.

Active is not like RSS, what Activity Monitor calls real memory is RSS.


I think the problem is that you used OS X for a year. I've been using my Mac for 4 years, and Linux for another 4. And I have Python (in fact I have Python 2.5, 2.6, 2.7, 3 and 3.1 at the same time, for testing purposes); I have installed so many packages I've lost count. You only need to be sure the path is correct when doing it.

And I have 2 gigs of RAM. Mac OS can be a bit hungry, but there's an "easy" solution for this: closing unused applications. That bright dot in the dock means something is using an incredible amount of RAM just to sit there doing nothing. If I only leave open my mail app, Chrome, emacs and my twitter client, I can go for days without ever hitting the disk cache. If I start adding more things (or tabs like I'm crazy), I'll hit the cache, of course. But this also happens on my Linux systems; it's not a Mac OS problem.

As for the DVI port... I usually leave the DVI adapter attached to the monitor I use, and use bluetooth devices when possible. So far I have used a hub to connect 3 things to my Macbook only once (an SD reader to copy images, my Ben Nanonote to install some packages and my iTouch to sync with iTunes). That happened once, in 4 years (and I also have a drawing tablet, btw).


>Closing unused applications.

That's not a solution, that's a workaround. I'm sitting here with a 4GB RAM machine running Arch Linux with XMonad. I usually run the following applications in day-to-day usage:

    - Firefox Nightly with 50-100 tabs (largest memory hog, 
      the rest doesn't even come close)
    - Thunderbird
    - Pidgin
    - XChat
    - smplayer, which I usually keep open
    - deluge
    - mpd + ncmpcpp
    - half to one dozen of terminator instances with zsh, 
      most running some text mode applications like vim, 
      htop or the aforementioned ncmpcpp
I reboot my PC once in a blue moon, usually after kernel updates; otherwise it's running 24/7. Right now, I sit at ~35% memory (and that should include memory used for disk caching), a bit less than half of which is Firefox with ~16% (65 tabs). I usually never go over 50% unless compiling heavy stuff (Qt-level heavy). I don't really know what the fuck OS X does to eat all that RAM, but it apparently does something wrong.


Emulating a setup similar to yours, the OS X machine in front of me sits at 2.8GB active, of which a little less than 800MB is taken by Terminal.app, due to me having activated infinite scrollback and having two Rails processes being hit 8 hours a day since, like, forever as I develop. So it's really closer to 2GB, and I didn't even try hard. This includes mds+mdworker (the indexer), which clocks in at 200MB. Normally I also have LibreOffice, Pixelmator, iCal, iTunes, Reeder and VMware Fusion running, which makes it balloon to 3.1GB and up as I open more documents/VMs.

OSX is not doing anything "wrong".


No. Re-read the comments above about the OS X caching setup. Rather than leave a bunch of memory marked "free", it uses this RAM as cache.


I have experienced OS X's "swappiness", having gone as far as disabling dynamic paging in an attempt to avoid it. Upgrading memory was the only real solution. A little research will reveal that a lot of other people have run into the same problem; you aren't alone at all in that.
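(For reference, disabling dynamic paging is typically done by unloading the pager daemon; a sketch, and not something I'd recommend leaving in place:)

    $ sudo launchctl unload -w /System/Library/LaunchDaemons/com.apple.dynamic_pager.plist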

I split my time between OSX/Linux and it's pretty obvious to me that Linux is vastly superior in terms of performance, in a wide-range of scenarios. I prefer to use Linux on older and/or memory-constrained systems.


The purge command is pretty much the worst thing you can do to OS X, essentially blowing away the filesystem cache. Back when I worked at Apple, I would use it to debug seek-incurred race conditions in various files.

So much bunk in that article.


>> And of course this does not support installing Python, Ruby, Perl on any other software that has its own way of distributing software.

> which is bullshit (although there's no Perl)

Last time I looked, Perl could be installed via homebrew...

  $ brew install perl
Also Perl can be installed via MacPorts...

  $ port install perl5
However my preferable way is to use perlbrew (http://perlbrew.pl) which allows you to install & manage multiple versions of Perl and it then allows you to use the normal CPAN toolchain to manage/install your modules.

  # install perlbrew (normal user in ~/perl5)
  $ curl -kL http://install.perlbrew.pl | bash
  
  # install & switch to perl 5.14.2 via Perlbrew
  $ perlbrew install perl-5.14.2
  $ perlbrew switch perl-5.14.2
  $ perl -v   # (will show the perlbrew perl 5.14.2)

  #  Add cpanminus to perl-5.14.2
  $ curl -L http://cpanmin.us | perl - App::cpanminus

  # load Moose module into perl-5.14.2
  $ cpanm Moose


You're not really fair. Yes, it's true that OS X has third-party package managers. But it's also true that they can be very complicated, or simply broken. The easiness of a good Linux distribution is just not possible with OS X. The days where you need to hack something together just to get some basic applications are gone... at least on Linux.


You're not really fair. Yes, it's true that OS X has third-party package managers. But it's also true that they can be very complicated, or simply broken.

MacPorts is like FreeBSD ports and Fink is like apt-get. Brew is also dead-easy. While they're not perfect, there are some far more complicated systems in some Linux distros.

The easiness of a good Linux distribution is just not possible with OS X.

It is perfectly possible, as in nothing in OS X prevents it. It just hasn't happened, because, well, not enough OS X users contribute to it, compared to the Debian/Ubuntu community.

Still, it's not that bad. I work with Ruby, Python, Node, C plus various web technologies, and use lots of unix stuff. I seldom have problems installing them with brew.

Also consider the alternative: I can install apps through the App Store or through a DMG image that no Linux can run today (because they are native Cocoa apps): stuff like Photoshop, Office, Premiere, Aperture, etc. The ease of installing industry-standard proprietary apps that lots of people need, and a large number can't do without, is just not available on Linux.


As someone who administers a lot of Macs and a few Linux machines, package management on Macs is simply indefensibly bad. There is no getting around it. Apple chose not to provide this as a service to their users.

I use a Mac laptop and would never have written this post. But I don't have to pretend that the package installation setup on OSX is remotely acceptable. It is horrible. Stuff breaks or won't install all the time. On mainstream Linux distros, stuff generally Just Works.


Apple's users are not people that dive into the command line. Apple's users are people that open the App Store and have their "package management" solution.


Well, if Apple's users aren't the sort to dive into a command line, then most developers aren't Apple users.

Also, have you seen any modern, user-friendly Linux distros (e.g. Ubuntu)? You never have to dive into the command line there, and yet they have nice package management that just works.


Well, if Apple's users aren't the sort to dive into a command line, then most developers aren't Apple users.

Well, if you have been to any developer conference, you'd have deduced that most developers are Apple laptop users.

It's just that they don't bitch about any package that breaks.

Some of us also use a virtual machine like Fusion for an isolated environment if we want to do development with a Linux userland; we don't pile one on top of OS X and its BSD core, and we don't expect a volunteer effort like brew, with 2000+ packages that fewer than 20K people use, to work perfectly.

(The guy in the other comments said he manages multiple Macs (a sysadmin) and had trouble installing the same packages on all of them, etc., presumably also across different OS versions. That's a slightly different problem.)


That doubtless depends on the developer conference in question; I bet an iOS conference and a .NET conference have different distributions of Mac users.

Looking at the recent StackOverflow survey[1] (I think it's fairly representative of developers in general), we see that about 20% of the respondents used Macs, another 20% used Linux, and the rest Windows, so Mac users emphatically do not represent "most" developers.

[1]: https://www.surveymonkey.com/sr.aspx?sm=2RYrV_2bFw2aZ2RfedWH...

But my real point--which I realize was poorly worded--was not that no developers use Macs, but rather that the ones who do are not "Apple's users" in the sense calloc used.


It's been a long, long time since command line tools were required to use package management. Synaptic is a very easy interface for adding new software, and Ubuntu's Software Center, while a bit rough around the edges, is a very App Store-like experience.

Besides that, package management also offers an easy way to keep your system updated. On a Mac, the App Store excepted, there is no central way to keep your system up to date: Software Update will update Apple's software (often by downloading huge packages), and you are on your own to update whatever is left. Red Hat and Debian mastered this in the early 2000s.
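On Debian-family systems, keeping everything current, third-party packages included, is a single step:

    $ sudo apt-get update && sudo apt-get upgrade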


As someone who uses both Linux (most familiar with Fedora and Ubuntu) and OS X regularly, I have to say I completely disagree with you.

My Linux systems are always a headache. Last week I pulled a recommended patch from the system updater and it broke Xorg. I had to remove it by hand and reinstall it.


I don't buy that at all. If a serious distro pushed an update that broke X, we'd have all heard about it. Which distro? What package was it that broke? Are you sure you weren't mucking about with non-distro stuff like the NVidia driver installer?


Ubuntu, and nope.


The last time my X broke was a couple years back. Ubuntu pushed a defective update. Before that, the last time something like that happened was when I was using Debian Sid.

Breaking X is something you expect with Sid. And if you are running it, you're supposed to be able to fix it and submit a patch.

If you are breaking X, you are doing something wrong.


I think it probably depends on the distro; I've had quite a few systems break on Arch because I didn't read the update warnings on the Arch Linux website after they had pushed a breaking update.

Tangentially, that is why I moved from Arch to Xubuntu, though I'm sure Arch is a bit nicer with regards to headaches now.

Fedora likes to break frequently though I don't know if packages as big as X are likely to fall through the cracks.


Both Arch and, to a lesser extent, Fedora are aimed at more advanced users who want faster updates and newer technology over stability.

I think this is a great compromise, but it does mean you may have some issues with updates. That said, I have not had any issues on Fedora that weren't my doing.

I've been using Fedora for about a year. Earlier, I used OS X for about the same amount of time and I did have problems that weren't entirely my fault, largely with Java and Eclipse. Since all I was doing during that time was simple Java development for school, there just wasn't anything else that could have gone wrong.


I am sure things are simpler if you pay to stay on the latest version of Mac OS, but pick up a laptop with an old version of Mac OS (as I did last week) and try to install a modern version of Python on it without reaching for your credit card and/or signing up for all sorts of accounts/development programmes. Is it too much to ask to be able to install gcc on a machine someone has paid > $1000 for? Searching for Xcode asks you to install macshop (or whatever it is called), which in turn asks you to first upgrade your OS, which in turn asks you to install macshop... etc. A terrible experience; it made me pine for Windows. I got things working, with a lot of googling, but I still have no idea how and where things are installed, or how to uninstall them.


If you want the free version of Visual Studio, you go to a web page (http://www.microsoft.com/visualstudio/en-us/products/2010-ed...) and click on the download link. You don't need Windows installed, you don't need to be signed in, and you don't need special software on your machine to download the file.

If you want the free version of Xcode, you first need a Mac. Now you need a Mac with a recent OS. OK, good, go to the web page, click on the "App Store" link. Make an account in the App Store, then hand over your CREDIT CARD information. Great. Now it will download through the App Store. Click on "Purchases" to see the status.

Microsoft doesn't need to know who I am and doesn't even care if I'm on Windows, but Apple wants me on a newish Mac, handing over my credit card, before I can get their IDE. Really? Ridiculous and almost intolerable.

But that's how they roll.


Well said. I would add that for Perl, instead of using homebrew, one can use 'perlbrew': http://perlbrew.pl/ which is a lot like RVM.


I was considering going the other way - but after reading this, maybe I'll stick where I am.

--

One thing that does seem a bit odd:

>"I'm a long time Ubuntu user, but this time I decided to go with Debian. Why? Mostly because our servers are Debian and because latest updates of Ubuntu have mostly focused on breaking the desktop environment."

vs.

> "Do I miss something? Sure. Even though Linux in modern times mostly works out of the box, there's still slight issues with external displays, for example I can't set the 30" Dell monitor at work to be the only display without doing some xrand magic. I guess that's really the only thing I'm missing from OS X, a sane and automatic way of handling external displays."

I'm a bit sick of hearing this meme perpetuated. Give Unity a chance ... in fact, the author's main gripe about Debian is resolved in a really fluid way by Ubuntu + Unity. I think Unity's multi-monitor support is one reason why it's worth sticking with.


What's so special about Unity's multi-monitor support? I didn't realize there was such a thing. As far as I can tell, Unity sits well above the layer that causes most multi-display issues.

Once you've got multiple displays, Unity/Compiz can do things to make working with them nicer, e.g. moving windows from one to the other. But if one of the connected displays flat out does not work, or displays the wrong resolution, rotation, or whatever, Unity/Compiz usually are not to blame and can't really do anything to help you.


> I'm a bit sick of hearing this meme perpetuated. Give Unity a chance ... in fact, the author's main gripe about Debian is resolved in a really fluid way by Ubuntu + Unity. I think Unity's multi-monitor support is one reason why it's worth sticking with.

I've used Unity on my desktop for six months or so, but I just wasn't compatible with it.

The display issue is more an issue of drivers or something similar: the issue I'm having with my DP-connected 30" Dell is that I can't make it the sole display without first disabling the laptop's internal screen with xrandr. If I keep my laptop display on, the screen works as a mirror or secondary screen just fine.
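For the curious, the xrandr magic in question looks something like this (output names are illustrative; they vary by driver, so check 'xrandr -q' first):

    $ xrandr --output LVDS1 --off --output DP1 --auto --primary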

Now this might work in Unity, but unfortunately I've got no way of testing it.


Is there anyone who got Linux to work fully on a Macbook or MB Pro? I get most flavors of Linux to boot and get a nice X running; however, things I cannot do without, like touchpad control, screen backlight controls and keyboard backlight, are completely non-functional. Also, the battery drains faster than my Android's under Linux.

Is there any Linux that works well? I have no problem with tweaking (I hacked on device drivers for previous laptops when they didn't work), but even with tweaking I never managed to get it working well. Otherwise I could have nice Apple hardware with my favorite OS...


I have a late 2008 unibody Macbook running Ubuntu + Gnome 3 (no Unity). Almost everything worked without configuration. Brightness control required one line in xorg.conf, and AirPlay required installing one Pulseaudio module.

I really like Gnome 3. It's much faster than Lion on that machine (4 GB of RAM, SSD), the battery lasts as long in my normal use as under Lion, the user interface fits my free-time use, the visuals are simple and pretty, and best of all, Nvidia has much better drivers for Linux than for OS X, so I'm actually able to play Minecraft without any hiccups.


I ran Ubuntu 9.04 and 10.04 successfully on a MacbookPro 5,4 with very few hiccups for ~2 years. There's a bit of community support around Ubuntu for getting it running on Mac hardware, but your results may vary. rEFIt and friends are a bit difficult to get working initially, but it's not impossible.

I had the same feeling as the author: displays are a pain in Linux. The sound, wireless and touchpad support weren't as seamless as they are in OSX either.

Even now I've just switched back to OSX because my Ubuntu partition stopped booting, and I do most of my work over ssh on an Ubuntu VM with gnu screen / bash / vim.

The thing I miss most from running the Linux desktop is the highlight-to-copy, middle-click-to-paste clipboard.


Regarding hardware quality:

  "...why there's no PC manufacturers that would
  have the same overall quality of the hardware."
In my experience, business ThinkPads beat MacBooks by a mile for work (e.g. programming). I am surprised to see so many coders use MacBooks. ThinkPads have better ergonomics, are more robust, and have better-performing/more configurable hardware (e.g. RAM).


This is a content-free comment. You claim ThinkPads have better ergonomics, but you say nothing about why. You say they are more robust (more robust than a solid aluminum shell? Not in my experience), but you don't back it up. You claim better performance and say something vague about RAM, which is similarly unsubstantiated.

The IBM has a vastly inferior, smaller, less responsive trackpad, inferior display, inferior overall fit and finish of the case, inferior trackpad buttons.

You'll have to do better than this.


In my list, generally preferring Thinkpad over Mac:

Better keyboard feel and layout. Especially if you're used to PC/Unix keyboards.

Trackpoint. I vastly prefer this to a trackpad (and generally disable trackpads in BIOS).

Cooling/airflow. Mac's sealed design is nice, Lenovo's gets the heat out (and the dust in, sigh).

Thinklight.

To Mac's benefit:

Magnetic breakaway power connector. Sheer genius.

Illuminated keyboard. Kind of neat.

CD slot (not on Airs, obviously). I'm always accidentally opening my CD drive on my Thinkpad. Some way to lock the damned thing would be nice.

Displays -- I've been consistently impressed by the brightness of Apple displays, if not the aspect ratios (I prefer Thinkpads generally here, though they're converging on Apple's standards).

And in 10+ years of lugging ThinkPads around, I've had few if any hardware/robustness issues: one screen that pixelated badly after a fall onto the street in my satchel, replaced by IBM. Otherwise, nada.


On the author's point, anyway, there ARE laptop makers who make hardware of that quality. I bought an HP Envy 14 last March: solid metal with a high quality screen, huge multi-touch touchpad, no fans on the bottom, no stickers all over the place, switchable graphics, and a wonderful backlit island-style keyboard. It's the PC Macbook. I sold it last month for more than half of what I bought it for, still in perfect condition.

Not saying this is what the author thinks, but a lot of people I know complain that their PC laptops are low quality and end up switching to Mac because they refuse to buy high quality laptops. My Envy was $900 with an i5, 4GB of RAM and a Radeon 5650. Spec a Macbook at that price.


Honestly, contrary to this article, I think you'd find that most developers using Macs use them more for the OS than for the hardware.

If I could build a seriously beefy desktop machine and run OSX properly on it, I would. For laptops the nice hardware (as others have said, especially the trackpad, but there are plenty of other reasons) is a definite plus, but OSX is just great.

Wanting some specific hardware for a desktop machine recently (I was actually in a similar position to the author, it seems, I wanted to run basic stuff + dev tools + a lot of VMs) I spec'd and built a Ubuntu box. I lasted for about 8 months being annoyed and decided to use the box as the VM host and a Macbook Air to access them.


You can, it's called a Hackintosh.


Tried it several times over the years. Unfortunately it doesn't meet the 'properly' criteria :)


I can think of a few reasons:

1) MacBooks are pleasing to the eye. ThinkPads are well-designed functionally but ugly. Aesthetics don't matter to everyone, but to those for whom they do matter, they are not a frivolous concern. Working in an elegant environment (including hardware) can have a positive effect on one's state of mind.

2) A lot of people are interested in developing iOS apps. This is much easier on a Mac.

3) MacBooks are status items like designer clothing and may make people feel more attractive / successful. Obviously this is not universal but neither do I think it's a negligible effect.


Depends on the person. I find many business notebooks pleasing to the eye.


Do they still ship with unusable trackpads (which require you to carry a mouse everywhere if you're ever going to use your ThinkPad), or has that improved in recent years? It's the only pain point against ThinkPads for me.


If you're a programmer using Emacs/Vim, you really don't need a trackpad or mouse for the great majority of your computing needs.


The problem is, even with how little I used the mouse (I ran Vimperator back then), it was still annoying when I had to.


That's not true. Trackpad gestures are pretty awesome, and it's easier to switch between applications with them than with command-tab.


I think the trackpads are good (I mainly use the keyboard, so I prefer the smaller ones over Apple's because the keys are more easily reachable). What is it you don't like?

(I'm judging the T510, T60 was also ok)


It's hard to describe why Apple's trackpads are better; they simply are. If you are used to them, every other one simply feels clumsy. It must be a combination of the surface (friction), the sensor (precision), and the software drivers that makes them special. I think it was when they introduced the unibody Macbooks that the trackpads became so good. The old plastic Macbook trackpads were vastly inferior in comparison.


Too small (I have fat fingers) and low sensitivity compared to the one in my PowerBook (back then). This is from a company-issued laptop four or five years ago, though, so I don't know if it has seen any improvements since. (Especially the "too small" part.)


I've got some stubby little fingers, so much so that my guitar coach in high school told me I should quit and become a bassist (a few years later I invited him to a show so he could eat his words). I use a T61 and I love the tiny trackpad. Granted, I use a trackball with it, but on my home laptop with a giant touchpad, I have to move my hand to get all the way across. On the T61, I just move my finger. It feels more precise, I think. I feel like a surgeon with a scalpel.

Granted, this is just my opinion, and when I put it down into words I realize how ridiculous it sounds. But I still prefer it.


I've got an MBP and a Thinkpad, and though the TP is by far the best PC (Windows/Linux) notebook, it's nowhere near the MBP if you compare build quality and ergonomics.


I think the bottom line is that the author is used to working with Debian and thus perceives differences between OS X and Debian as bugs.

It is true that jumping between Linux and OS X can be difficult at times. It's also true that Debian's packaging system is better than OS X's.


On the memory side of things (in Linux), RAM is cheap: buy 32GB or whatever your machine can support. Once you have a large amount of memory, set /proc/sys/vm/swappiness to a low number like 10 (sysctl -w vm.swappiness=10). If you want to free up memory because some application was eating it up, run sync; echo 3 > /proc/sys/vm/drop_caches. I rarely if ever have to do this, but it helps to know about it if you ever need to.
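
A minimal sketch of those two knobs, assuming a Debian-style system where you have root:

    # Lower the kernel's eagerness to swap (the default is 60); takes effect immediately
    sysctl -w vm.swappiness=10
    # Make the setting survive reboots
    echo "vm.swappiness=10" >> /etc/sysctl.conf
    # Flush dirty pages to disk, then drop the page cache, dentries, and inodes
    sync
    echo 3 > /proc/sys/vm/drop_caches
Note that drop_caches only discards clean caches the kernel would reclaim anyway, so it's mostly useful for benchmarking rather than day-to-day speed.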

I run Debian/KDE and Kubuntu and it's a fantastic setup. Setting up multiple monitors (I have two 27" monitors) is a breeze with TwinView using KDE's System Settings program. An xorg optimization tip: Option "UseEvents" "On" and Option "RenderAccel" "On" in the Device section of your xorg.conf file will speed things up even more.
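
For reference, a sketch of where those options live, assuming the proprietary NVIDIA driver (TwinView, UseEvents, and RenderAccel are NVIDIA-specific options):

    Section "Device"
        Identifier "Device0"
        Driver     "nvidia"
        # Wait for hardware using events instead of polling
        Option     "UseEvents"   "On"
        # Hardware-accelerate the X RENDER extension
        Option     "RenderAccel" "On"
    EndSection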

I run all the other OSes in VirtualBox; it's fast (Windows boot time is about 2 seconds, OS X takes a bit longer), and you can even do some 3D stuff, though I don't play any games so I have no idea if those work.

Out of all the operating systems I've used over the past 21 years of working with computers, I have found Linux to be the best fit for customization, speed, available software, ease of use, and friendly community. Though I did like VAX/VMS when I was a kid. I had a mouse! It was awesome :D.

Overall, though, I would say if you're going to be doing development, especially in a server-type environment, use Linux; OS X was built for the average Joe who doesn't know how to use a computer. Linux is usable by the average Joe too, but it goes beyond that, easily allowing extreme customization of just about every facet of the operating system that you can imagine. I feel lost without my build. The nice thing is, I put it on a USB stick and I can use it on any computer; thankfully I have never had to do that :D.

And don't worry about KDE: we have a great community, and we'll keep it going. It's not about profitability, and that is what a lot of these business people seem to forget. We work on Linux because we love the system, not because we get paid to work on it.


I really can't understand bitching about memory in this day and age. The first thing I do when getting a new machine is max out the RAM. My Mac Pro has had 32 GB since 2008, my iMacs all have 16 GB, and my Macbooks 8 GB. It's such a cheap thing to do and it improves your daily life so much that it's essentially free in the end.


My MacBook Pro maxes out at 16GB. Putting that much in would have cost me 4-5x as much as going for 8GB! RAM is cheap, but there's a definite sweet spot much of the time.

8GB is more than enough for the time being.


I must be missing something. Why waste 10 min running purge and repair disk permissions to retrieve memory when a restart is much faster???
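
For the record, both of those are one-liners in Terminal (purge came with the developer tools on systems of this vintage):

    # Flush the file system cache, approximating the memory state of a cold boot
    purge
    # Re-apply the permissions recorded in the installer receipts
    diskutil repairPermissions /
Neither frees memory that an application is actually holding; purge just empties the cache.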

I abandoned MacPorts for Homebrew. One feature I like is that I can build any package I want (that doesn't have a formula) with './configure --prefix=/usr/local/Cellar/name/version-no' and then do a 'brew link name' to make all the symbolic links or 'brew unlink name' to remove them. Helps solve annoying problems.
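
A sketch of that workflow, with "foo" and "1.2.3" standing in for a real package name and version:

    # Build into Homebrew's Cellar so brew can manage the result
    ./configure --prefix=/usr/local/Cellar/foo/1.2.3
    make && make install
    # Symlink everything into /usr/local/bin, /usr/local/lib, and so on
    brew link foo
    # ...and remove the symlinks again later
    brew unlink foo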

My main grump with MacOSX (still on Snow Leopard) compared to Linux is issues with 64-bit Python, and that MacOSX seems to store files all over the disk. Basically, if you want to work differently from the Steve Jobs Way, it takes a LOT of work. I couldn't agree more about the superior hardware quality.


I think this is mostly a case of someone who develops for Linux but prefers to develop on his desktop instead of on a development server that is close to the production environment. This requires the desktop itself to be closer to the production environment, and I think it's the main reason I've seen people switch away from Macs. Personally, I've always treated my desktop and laptop as terminals, and this approach has been working for me for 20 years now. It also means I don't have to upgrade my desktop often or waste a chunk of my life fiddling with it.


>There are at least four different ways of installing software on OS X. Download a DMG image, drag the icon from there to Applications folder, run an installer, install stuff from Mac App Store or compile it yourself. There seems to be no standard way how to do this properly. Of course, being a software dev, I ended up using the last alternative a lot.

It sounds to me like the author expects OS X to be like Linux, where the UNIX underpinnings are the centre of the OS. That's true to some extent; however, OS X abstracts a lot away so that the user never necessarily has to touch the UNIX part of the package. Once you understand it that way, the ways of "installing" software on OS X are reduced to two (plus one) for most people.

One is dragging and dropping an icon somewhere on your hard drive, which is not even "installing", since what you do is simply... copy it to your local disk. Installing applications via the Mac App Store simply automates this (the plus-one part). The other is via a PKG installer; if an app does this, you should be alarmed: it is modifying your system, it is going to scatter files across your system, uninstalling it is going to be a nightmare, and so on.
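
If you do end up with PKG installs, OS X at least keeps receipts you can inspect; a quick sketch (the package ID is illustrative):

    # List every package receipt the system knows about
    pkgutil --pkgs
    # Show every file a given package scattered across the disk
    pkgutil --files com.example.someapp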

Once you step into UNIX land, you're on your own. It would be nice if Apple provided a central repository of packages, but then I'd have to worry about outdated packages (given Apple's tendency to avoid anything GPL/GPLv3). Homebrew has already done a great job covering that.


Everything is a tradeoff. There's so much more going on in Mac OS X (especially Lion) than a typical Linux distro that you are literally talking apples and oranges.

For my money, Mac OS X makes the right set of tradeoffs to give an overall good to great user experience, even at the command line. It gets enough of the right things correct that for a developer/hacker/tinkerer, they can take it from there and add whatever else they need.

For me, Vagrant (http://vagrantup.com/) has been amazingly useful. I can install any of the popular Linux distros and configure them any way I want without worrying about Mac OS X's installed apps and libraries. A killer feature: because Mac OS X and VirtualBox share folders, I can still use all of my Mac OS X tools (editors, IDEs, whatever) with Linux. And thanks to Chef, I can spin up specialized configurations in just minutes. Having access to apt-get is cool and all, but being able to create specialized, configured environments using Chef cookbooks is faster and way less error-prone.
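
Getting a throwaway Ubuntu box this way takes about four commands; a sketch using the lucid32 box from Vagrant's own documentation:

    # Download a base Ubuntu 10.04 box and register it locally
    vagrant box add lucid32 http://files.vagrantup.com/lucid32.box
    # Generate a Vagrantfile in the current directory
    vagrant init lucid32
    # Boot the VM (the project directory is shared inside it as /vagrant)
    vagrant up
    # SSH in
    vagrant ssh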

All of this (and more), and I still get all of the benefits of Mac OS X native applications, superb driver support, etc. And all of this on a 4GB 2009 MacBook Pro.


With regard to memory usage in OS X I have never had any issues with my current iMac. I have 8 GB of memory, and despite very heavy usage (Safari/Chrome/Firefox for browser testing, Coda as IDE, Tower for Git management, Skype, Terminal, iTunes, MySQL Workbench, occasionally MAMP) I rarely see any swap usage. Currently my swap used is 2 MB.

(I must say that I am still using Snow Leopard. I refuse to install Lion because it is pathetic compared to the much more efficient Snow Leopard. I have installed it on my other Mac machine and am not impressed.)

Interestingly the most memory hungry program on my computer is Chrome. It seems to spawn a ridiculously huge number of processes:

    Google Chrome 137.9 MB
    Google Chrome Helper 29.5 MB
    Google Chrome Helper 7.7 MB
    Google Chrome Renderer 123.5 MB
    Google Chrome Renderer 36.7 MB
    Google Chrome Renderer 46.8 MB
    Google Chrome Worker 26.6 MB
    Google Chrome Worker 26.2 MB
That's a grand total of 8 processes and over 400 MB of memory for just three tabs. As much as I love Google Chrome, it's a little ridiculous.
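
Summing per-process numbers like that double-counts shared pages, but as a rough cross-check you can total Chrome's resident memory from the shell; a one-liner sketch:

    # Sum resident set sizes (reported in KB) across all Chrome processes
    ps axo rss,comm | grep -i chrome | awk '{sum += $1} END {print sum/1024 " MB"}'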

iTunes is the second worst offender: 300 MB. I have a pretty large library, but it's not that large. Perhaps it does some very aggressive caching.


mmap+shared libraries. Don't put so much faith into what Activity Monitor tells you about processes and their Memory Usage.

Even so, having free memory gives you zero performance by itself; the OS deliberately keeps otherwise-idle RAM busy as cache.
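
If you want numbers that mean something on OS X, the command line does better than Activity Monitor; a sketch (the pid is illustrative):

    # System-wide page counts: free, active, inactive, wired, pageins/pageouts
    vm_stat
    # Every mapping in one process, including shared libraries and mmap'd files
    vmmap 1234 | less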


That's a pretty self-indulgent rant. I've got nothing against the ultimate decision by any means, but that decision is being rationalized -- and rationalized poorly.

8GB of RAM costs $39. If the OP's time is worth so little that they'd rather stew for 5-10 minutes every day waiting for repair permissions to finish, then that's simply their choice. FWIW, I haven't done anything like that on my 4GB MB Air since I bought it, and it almost never gets rebooted. Granted, the SSD makes VM usage a lot more transparent, but still...

The software installation comment displays a lack of understanding of what's going on with the App Store; e.g., it's new, things are still ramping up, etc.

As an aside, my 2-cents to anyone new to the platform is to stay away from MacPorts. It does things just differently enough to make future updates (outside of what they provide, or before they get around to providing it) a potential time-sink to sort out. Not worth it to me; it's easy enough to build what you need, and there are plenty of places for excellent guidance.


8GB would reduce battery life and, in some computers, it's not even an option. My Dell notebook is maxed out at 4. And I don't want the extra memory because I prefer long battery life.


I honestly can't tell if you're serious or not :)


serious || !serious == true


Do you know how much power 4GB vs 8GB of RAM draws? I've heard this before but always wondered how big of a battery drain it actually is.


These complaints about OSX, esp. installing software, match my problems exactly. I'm a programmer and program a lot. It Just Works™ on Linux with a proper package manager.


I recently went back to GNU/Linux (Ubuntu). This was after a three year stint on OSX though. OSX is good for people who don't like to tweak. I'm much happier with my Ubuntu laptop than I was with my Macbook, now I have it set up how I like. I'm generally happier using free software anyway.


I can only applaud the author for switching to Linux. Developers using Linux will develop for Linux, and we will all be better off. What exactly the reasons are is of less importance.

I really tried several times to switch completely, and it's not easy. My biggest gripe is that even with graphics acceleration, Firefox and Chrome are still sluggish in certain aspects, and productivity shortcuts are missing on Linux. I tried different flavors and just couldn't make things work. It can't be my everyday machine yet.

If someone made a distribution focused on developers, that would be the best thing to happen to Linux. Do I really need to install git as a package while a stupid office suite comes preinstalled? Zsh and gvim and emacs out of the box.


You should check out OpenSUSE which lets you specify which packages you want when you're installing it. You just select the appropriate categories like "Web Development" and it installs a selection of appropriate packages.
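
A sketch of how that looks from the command line after installation (pattern names vary by release; devel_basis is one example):

    # List the available installation patterns
    zypper search -t pattern
    # Pull in a whole category of packages at once
    zypper install -t pattern devel_basis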


I will check it out, thank you. However, what I was talking about is a little more customization than that. If it is a dev build, it should link to GitHub, start with zsh, etc.


I run Debian unstable on a quad-core T520, since our servers run Debian testing and everything Just Works. Most of the developers in my shop run Mac OSX, though, and they have endless it's-not-quite-Linux problems: installing MySQL fails because the build thought it was a fat binary, or that it was 32-bit but the libraries were 64-bit; Mac OSX system paths are kinda weird; or Homebrew installed a binary in a bizarre directory.
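
For what it's worth, the fat-binary confusion is at least quick to diagnose; a sketch (paths are illustrative):

    # Show which architectures a binary actually contains
    file /usr/local/bin/mysql
    lipo -info /usr/local/bin/mysql
    # And which architecture a linked library was built for
    file /usr/local/lib/libmysqlclient.dylib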

I can't help but wonder: since all the WMs are adopting Apple's desktop experience, are the multi-touch gestures worth the minor-but-frustrating lack of compatibility? I use Gnome 3 and, while I complain about X, Y, or Z, it's basically the same as Mac OSX.


To me, the three major operating platforms are tools that all have strengths and weaknesses. In the same way that I wouldn't use my nice chisels to loosen a laptop screw, I wouldn't use a MacBook for writing code for our Linux infrastructure. I am more efficient doing that work on Linux itself.

At the same time, I shoot photos and video, and do some writing to take a break from IT. I've tried doing that work using the included tools on all three platforms. I find the Mac platform the most efficient and trouble-free for that work. Linux is second, but is frustrating at times - especially regarding video.

At work, even though we have a heterogenous server environment, we communicate using Office and SharePoint. Also the wireless network seems to work best with Windows. Thus, at work I use Windows 7 with PuTTY, Gnu Screen, and several Linux VMs, and at home I use a MacBook with a Linux VM. These two setups let me use the three PC platforms for the workflows for which they seem best suited.

I think it's missing the point to debate which is the one true platform. We all have things we want to do, things we want to create. In my experience, the question is not "which platform is better in general?", it's "on which platform can I most easily get my work done?". If my current platform no longer works well, I try the others. In the end, I'm paid more for getting more work done in less time, so the efficiency of a platform for that work decides the question.


95% of my problems with any computer occur as a result of me wanting to use a piece of software that requires me to upgrade something, be it the OS or just some library.

In the old days when I ran Red Hat Linux, this was the main reason my system would rot: I would want to run some piece of software, the developer had decided to depend on some very recent version of a library not present on my system, and I would have to roll the dice and install it. Usually it would be okay, occasionally things would break, but eventually it would lead to my system becoming unusable from all the dodgy packages that were installed.

Things got a bit better with Debian, and even better with Ubuntu.

To this day I still have this problem. This weekend I had to upgrade my laptop to the latest OS and in the process I managed to brick it. I spent most of my weekend getting it back on its feet.

The thing is: I spend perhaps 1/10 as much time dicking around with my system now as I did when I was running Ubuntu on a Thinkpad, and more of the system works more of the time. I think most of this is down to Macs and OSX being a much less diverse environment: the hardware is well defined, the OS releases are far fewer and thus better defined, etc.

That being said: when things go wrong on a Mac, it is much more of a pain to sort things out. It is much easier to find solutions online for Linux problems -- and Linux is much easier to diagnose. I'm not entirely sure why.


The issue with transitioning back to Linux is not the OS or its applications. The main issue is finding a decent laptop that can run the software well. Lenovo, HP, and Dell are among the brands that won't offer a quality experience (think antiglare screen, light weight, and battery efficiency). Noise is also an issue, given the fan levels on most PC laptops today. SJ didn't like fan noise, and that may have helped in designing the best laptop on the market today.


Except when you do serious computations, the MacBook Pro begins to squall seriously.

If you're doing nothing, then yes, it's even better than Windows.


I have been a Mac user for the last four years. I used to really like it, everything was stable and polished and just worked really well.

But in the last few months, things started falling apart. My computer would freeze every few days, syncing would destroy parts of my data, the iPhone would crash every now and then, there would be weird random glitches...

I'm certainly not ready to abandon ship yet, but I can see it coming. The Mac is not what it used to be any more.


If your Mac freezes every few days, you probably have a hardware problem (or you've installed some third party driver that's messing things up).


May I ask if you upgraded to Lion a few months ago? My 4-year-old Macbook was not as bad as you describe, but it did become a lot less stable than it was under Snow Leopard. Even Safari does not work properly in Lion.

Snow Leopard was truly very stable; it's the best major version of OS X that I've used. I hope the minor OS X updates will iron out all of the Lion bugs.


Well, if I remember correctly, all that stuff did not happen to me when I upgraded to Snow Leopard. That's really all I'm saying.


Yes, I did update to Lion.


So you must swap distributions, or try recompiling the kernel.
