Enabling apps from everywhere:
Don't disable this. It's actually a good security feature. If you want to run an app that isn't signed, right-click it, choose "Open", and confirm you really want to run it; the OS will remember that choice for that app in future.
The default Terminal is actually pretty solid. Much better than it used to be. Two common complaints (lack of tabs, lack of colours) are now moot.
Oh My ZSH:
Or you could use fish – a new Mavericks-compatible build (2.1) should be coming to the homebrew package manager literally any time now.
Also, for Python specifically (i.e. the title of this post), you can install a new version of Python via homebrew (and then place /usr/local/bin above /usr/bin in your PATH). This has the advantage of being able to run pip without superuser privileges. Though if you're using virtualenvs, it's probably not that big a deal anyway.
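As a sketch, that PATH change is a single line in your shell profile (~/.zshrc or ~/.bash_profile):

```shell
# Put Homebrew's bin directory ahead of the system one, so a brewed
# Python (and its pip) resolve before /usr/bin/python.
export PATH="/usr/local/bin:$PATH"
```

After a `brew install python`, `which python` should then point at /usr/local/bin/python rather than the system copy.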
Maybe they've fixed it, but not long ago the whole Adobe suite just wouldn't work on a case-sensitive HFS+ filesystem.
You should use a naming convention for your filenames anyway, e.g. in Python filenames are always lower case.
We've had a couple of hiccups going from Mac to Linux because of this (if you call a folder 'Assets' when there's already one called 'assets', you'll get two distinct folders on your Linux server and scratch your head for a while until you figure out why something didn't change), but you mostly don't have to worry too much about it.
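A quick way to see the difference (folder names are illustrative; run in a scratch directory):

```shell
# On a case-sensitive filesystem (typical Linux) these are two folders;
# on the default case-insensitive HFS+ the second mkdir fails with "File exists".
cd "$(mktemp -d)"
mkdir assets
mkdir Assets 2>/dev/null \
  && echo "case-sensitive: two distinct folders" \
  || echo "case-insensitive: 'Assets' collides with 'assets'"
```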
I don't know about Mac, but I once had a problem where a game installer created a folder " Games" (note the leading space), although "Games" was already there.
Fun time trying to delete this " Games" folder, or rename it to something sane, using Windows Explorer, Total Commander and the command prompt. Each one of them got confused and tried to delete "Games" instead of " Games". In the end, I just fired up lcc and made a simple C program calling `unlink(" Games");`. And it worked ;)
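For what it's worth, the POSIX tools handle a leading-space name without that confusion, as long as you quote it (a throwaway demo in a scratch directory):

```shell
cd "$(mktemp -d)"
mkdir "Games" " Games"   # one normal folder, one whose name starts with a space
rmdir " Games"           # quoting makes the leading space unambiguous
ls                       # prints only: Games
```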
If you're using OS X for an all-round machine, it might be the wrong choice.
This is something I've noticed from using both OSX and Linux for the past few years — unless software is available as a drop-and-install package, it's universally _easier_ to install in Linux* than it is in OSX. Given the market for OSX and the amount of development that happens on it, I'm surprised the situation isn't better (for OSX), but perhaps that just speaks to the work people have done on package managers for the various linux distributions.
* - Holds true, in my experience, for Ubuntu and Arch. Likely the same for other distros.
If anything like this happens in a dependency of your module, you're probably going to have to recompile all of the downstream dependencies by hand, or risk breaking your development system.
To avoid all of this you end up compiling static versions of the libs from your dependencies, and linking them statically to your module. This is particularly true if you don't want to be at the mercy of upgrades to those dependencies in your distribution that break your code.
Looking at things from the other side, there have been various package managers for OS X that give some of the ease of development available thanks to yum or apt-get. At the moment homebrew seems to be getting quite a bit of traction, and has addressed many of the pain points that predecessors such as MacPorts and fink had, making OS X decidedly more linux-like when you're writing software without plans for distribution.
If anyone knows a reliable and simple way to do this I'd be very grateful.
I've never had to deal with multiple configurations on the same Mac (my solution, offhand, would be to use Debian VMs, possibly through SSH rather than a desktop GUI if I needed >1 at a time).
EDIT: and in terms of resource usage, RAM is cheaper than headaches.
Unless the package is not in the repositories or you want to have an instance with some different flags. One of the really nice things of Homebrew is that besides being a package manager, it is a great /usr/local manager.
For instance, installing something that is not in Homebrew is as simple as setting the prefix to /usr/local/Cellar/whatever/version and then running brew link whatever, and it puts symbolic links in /usr/local. Want to make it 'invisible'? Just do a brew unlink whatever.
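"whatever" is a placeholder name; this sketch imitates the symlink dance in a scratch prefix rather than the real /usr/local:

```shell
PREFIX="$(mktemp -d)"    # stand-in for /usr/local
mkdir -p "$PREFIX/Cellar/whatever/1.0/bin" "$PREFIX/bin"
printf '#!/bin/sh\necho hello from the keg\n' > "$PREFIX/Cellar/whatever/1.0/bin/whatever"
chmod +x "$PREFIX/Cellar/whatever/1.0/bin/whatever"

# `brew link whatever`: symlink the keg's files into the prefix
ln -s "$PREFIX/Cellar/whatever/1.0/bin/whatever" "$PREFIX/bin/whatever"
"$PREFIX/bin/whatever"   # runs via the symlink

# `brew unlink whatever`: drop the symlink; the keg itself stays put
rm "$PREFIX/bin/whatever"
```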
It's also much easier to maintain your own Homebrew repos (taps) than e.g. maintaining an Ubuntu PPA.
I'm used to being able to do 'brew install foo' and not needing to worry about whether I need libfoo, libfoo-dev, libfoo3-dev or just plain foo.
apt-cache search foo
And usually the headers aren't that big (with the notable exception of template-heavy C++), so it's much simpler installing everything.
But such splitting primarily occurs in the Debian family.
You can see its value with:
xattr -p com.apple.quarantine <filename>
xattr -d com.apple.quarantine <filename>
And don't forget to `brew install python`!
then upgrading pip and virtualenvwrapper:
$ sudo easy_install pip
$ sudo pip install --upgrade virtualenvwrapper
If that doesn't do the trick, you might need to install the Xcode command line tools.
[update: oh well, it looks like the parent deleted the question, but for posterity, it dealt with problems with virtualenvwrapper and pip after upgrading to Mavericks]
For several situations though (like when the Xcode command line tools stop working after a major OS update) it won't matter. You need a C compiler for several packages, and having your stuff isolated per-user or using a non-system Python won't help that one bit.
Installing numpy/scipy/matplotlib can be pretty tricky because of non-standard homebrew library locations (for stuff like gtk+, freetype2 etc), so you're better off using this http://github.com/samueljohn/homebrew-python and http://github.com/Homebrew/homebrew-science for all your scientific needs. By the way, Qt, Numba and a couple of other libraries seemed to get broken (upstream) in Mavericks, but I guess that wouldn't take too long to fix.
Also, upgrading homebrew libraries on Mavericks requires reinstallation of the entire dependency tree (because newest clang uses the newer libc++ instead of libstdc++).
wget https://bitbucket.org/pypa/setuptools/raw/bootstrap/ez_setup.py -O - | python
Why yet another clone of powerline?
vim-airline is a nice plugin, but it uses too many functions from other plugins; that integration should be done by users in their .vimrc.
But I will try airline out. :)
And now, I don't have a polluted host OS with tons of versions of software. I can easily try out new technologies, kill the VM, and start over.
Keeping VMs for different projects siloed in their own environments is really convenient.
When switching OSes, upgrading, rebuilding a computer, etc., I don't lose these configurations.
The list goes on and on.
RAM is cheap. I think I paid $100 to upgrade my MacBook Pro to 16 GB.
I can also switch back to a host Windows machine and take my VMs with me (should the need arise).
RAM was cheap. Prices have gone up a lot. Most DDR3 prices have almost doubled since this past winter/spring. 16 GB laptop kits are ~$140 when they were $75 a few months back. Still relatively cheap, though.
Not to mention Apple only makes one laptop you can add your own RAM to nowadays (the 13" MBP). It's $200 to go up to 16 GB on a Retina.
Did you try http://postgresapp.com ?
Once set, I don't have to worry about re-installing my development environment, and can just let anyone check out my sources; they then do a simple "vagrant up", wait a few minutes, and they're up & running. No worrying about which things they specifically have to install and configure; my vagrant config takes care of that.
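A minimal Vagrantfile along those lines might look like this (the base box name and puppet layout are illustrative, not taken from the comment):

```ruby
# Vagrantfile (illustrative): one `vagrant up` builds the whole dev box
Vagrant.configure("2") do |config|
  config.vm.box = "precise64"                   # Ubuntu 12.04 base box
  config.vm.provision :puppet do |puppet|
    puppet.manifests_path = "puppet/manifests"  # hypothetical project layout
    puppet.manifest_file  = "site.pp"
  end
end
```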
Also, the memory requirements are very limited imho; I run most of my Vagrant Ubuntu 12.04 server VMs with 128 or 256 MB...
Having used Homebrew now for many years, this is really just lingering FUD from many years ago. I've spent at least as much time building custom RPM/DEB packages, finding PPAs, etc. for things which don't compile cleanly on a particular Linux distribution. That's just a cost of using bleeding-edge or obscure packages: the right lesson to draw is to avoid taking on the cost of using less stable packages unless you gain significant benefit.
> Vagrant boxes also are platform independent if he had a vagrant development env, he wouldn't have to blog about how to install python
He didn't — that's a local cost he chose to take on for perceived benefits, but it's neither necessary nor advisable if you don't need something other than the system-installed Python (2.5-2.7). This is exactly the same situation as on a Linux distribution.
> You certainly can install all pythons that your projects use side-by-side, but this quickly turns into a mess when you have multiple projects that deploy on different target systems, maybe with different sets of libraries etc.
This is a platform-neutral problem which has been well solved for many years using Virtualenv:
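For example (the project name is illustrative; virtualenv from PyPI works this way, and on Python 3.3+ the stdlib `venv` module gives the same effect):

```shell
cd "$(mktemp -d)"
# --without-pip keeps the sketch dependency-free; a real env would include pip
python3 -m venv --without-pip projectA       # one isolated environment per project
. projectA/bin/activate
python -c 'import sys; print(sys.prefix)'    # now points inside projectA, not the system
deactivate                                   # back to the system interpreter
```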
> All in all: It's cleaner, less work and more flexible.
What you described is a LOT more work and adds significant performance and usability overhead to things you do many times a day, particularly if you need to work on more than one project and spend time spinning Vagrant instances up and down, installing updates, etc. rather than doing something productive.
It might still be worth it if you really need to test your deployment process all the time but otherwise that cost adds up. Personally, running locally and testing on EC2 has been much easier, faster, and makes my testing realistic by using the actual real production infrastructure rather than a facsimile crammed onto one laptop.
Actually, I can cite an example. Try installing libmediainfo on your Mac. It's a PITA that requires hacking the makefiles in obscure ways. Even after hacking the makefile it installs in different locations than on Linux, making all software that depends on it more difficult to compile. Oh, and yes, there's a GUI package but it doesn't contain the required library files and headers.
Homebrew is fine for everything that homebrew packages exist for. Anything else you'd better hope that the authors provided a working makefile or be prepared to learn how to fix it.
> spend time spinning Vagrant instances up and down, installing updates, etc. rather than doing something productive.
I install updates when the target env installs updates. I share the puppet/chef files that are used to update the target env, so the maximum amount of time I spend doing unproductive things is typing "vagrant up" in the morning and "vagrant suspend" in the evening. Each of those commands takes about 30 seconds.
Since when does virtualenv install the actual pythons?
> Personally, running locally and testing on EC2 has been much easier
You're certainly aware that you can use packer and vagrant in combination to build the same box for virtualbox/vmware for local use and as an AMI for use in EC2, making your development env a clone of the production env? Just with faster connectivity and no need to be online for work? Vagrant can even spin that ec2 instance up and down for you.
I did it that way for a long time too, but running VMs became so convenient with Vagrant and all the box configurations you can download that it's kind of a no-brainer as long as you have 8 gigs of RAM or more.
Sadly it seems that many Python packages have big issues with overloaded `=`, which means that brew can't compile many tools I need. So if you need anything relying on Boost, you might want to delay the update.