A Way to Manage Dotfiles (github.com/kalkayan)
68 points by bminusl 79 days ago | 40 comments

Personally, I use git [0] along with GNU Stow [1], combined with generating the files directly from a literate Readme.org (e.g. [2]). I sync this repository between machines to update files, and when I make changes in the org-mode Readme file it automatically generates the new file. There are ways to pull in changes made to that file directly, but I haven't needed to do that. My repo doesn't walk you through it completely, but I think it is pretty straightforward. If you want to see it in action along with a few links and pointers, do take a look at [0]. I really like having it all together in one place, and with org-mode everything is very (human) readable.

[0] https://github.com/podiki/dot.me

[1] https://www.gnu.org/software/stow/

[2] https://github.com/podiki/dot.me/blob/master/x11/README.org

This is beautiful. Thank you for sharing it.

(I use GNU Stow too, though have not thought to create a literate version of my scripts and hacks.)

Thank you! The move to literate files is pretty easy, and it is nice to work with. Now I just have to do the rest, with the real time sink being documenting things a bit better.

I use stow as well, highly recommended!

I've been using this bare repository approach for a while. I forget where I first saw it, I'm pretty sure it was on HN but it was not this project. I do like it, but I have a few minor issues.

The first is that I have a habit of running `git add .` when I'm working on source code, and as a result I have accidentally added my entire home directory to the bare repo more than once... Easy enough to undo but a bit inconvenient. `dotfiles add -u` is the safe option, or just be explicit about which files you are staging.
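One possible guard (a hypothetical sketch, not part of the linked project) is to replace the usual bare-repo alias with a function that refuses a bare `add .`:

```shell
# Hypothetical wrapper around the common bare-repo alias: refuse the
# dangerous `dotfiles add .` and pass everything else through to git.
dotfiles() {
    if [ "$1" = "add" ] && [ "$2" = "." ]; then
        echo "dotfiles: refusing to stage the whole home directory; name files explicitly" >&2
        return 1
    fi
    git --git-dir="$HOME/.dotfiles/" --work-tree="$HOME" "$@"
}
```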

The second issue is the 'branch per machine' approach, which I do use. I have two machines I use regularly, and a third occasionally. There are some bits of config (e.g. for vim) that are shared across all the machines, while other bits are not. If I use one machine for a while, then I end up with lots of commits that I need to cherry-pick when I next use another machine. Depending on how long it has been, this can be a bit of a faff.

Finally, because there is no clear mainline branch, you end up cherry-picking commits to and from any of the branches. If you are undisciplined like me, this will leave you in a 'three ways ahead and behind' scenario at some point.

Anyway, I like the approach overall. If anyone has a suggestion to ease those pain points I'm all ears.

I, too, saw this approach on HN first – here: https://news.ycombinator.com/item?id=11071754

I combined the bare repo approach with a per-machine custom branch approach described in https://www.anishathalye.com/2014/08/03/managing-your-dotfil...

The idea is that you have the shared configuration in one repo, and at the end of each config file, you include a local version. The local versions live in a separate repository and use a separate branch for each machine.

For example, at the end of .bashrc, you'd have

    if [[ -r $HOME/.bashrc_local ]]; then
        . "$HOME/.bashrc_local"
    fi
and so on, for each config file. My general dotfiles repo is public here, if you want to take a look at how I did it for the tools I use: https://github.com/bewuethr/dotfiles

This still isn't ideal. For example, I use Git submodules for Vim plugins in the shared repo – but maybe I don't need all of that on my Raspberry Pi. I feel like at some point, a config file based solution could be better; or using a tool such as https://yadm.io/, which uses bare repos under the hood.

I do that too, with a function in my .profile that I can call in the `dotfiles/local` [1]. That lets me "advise" the global version of the file with before:

    cat ~/dotfiles/local/profile
    load-global-config "$BASH_SOURCE"
and "around":

    cat ~/dotfiles/local/profile
    echo "BEFORE"
    load-global-config "$BASH_SOURCE"
    echo "AND AFTER"
The nice thing about keeping local dotfiles in a separate directory is that you can `.gitignore` it for your "public" dotfiles repository, but still keep the local dotfiles under source control easily.

[1]: https://github.com/svieira/dotfiles/blob/4b7e948b698b623a498...
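A `load-global-config` helper along these lines might look like the following (a hypothetical sketch; the actual implementation is in the linked repo, and the `~/dotfiles/global` layout is an assumption):

```shell
# Hypothetical sketch: given the path of the local config file that is
# currently being sourced, source the file of the same name from the
# shared ("global") dotfiles directory.
load-global-config() {
    local name global
    name="$(basename "$1")"
    global="$HOME/dotfiles/global/$name"   # assumed layout
    [ -r "$global" ] && . "$global"
}
```

Because the global file is sourced in the caller's shell, anything the local file defines before the call "advises" it, and anything after runs on top of it.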

Yes that comment is where I first saw it, thanks for reminding me.

Thanks for those pointers and the link to your dotfiles too. I will check that out and maybe steal a couple of ideas!!

I use a normal dotfiles repository with an install script that symlinks files out. My solution for multi-machine is that the install script just checks for a file suffixed by -$HOSTNAME to link instead:

    install() {
        SRC="$PWD/$1"
        # prefer a machine-specific variant if one exists
        if [ -e "$SRC-$HOSTNAME" ]; then
            SRC="$SRC-$HOSTNAME"
        fi
        echo "installing $SRC -> $2"
        mkdir -p "$(dirname "$2")"
        ln -sf "$SRC" "$2"
    }

    install vim ~/.vim
    install bashrc ~/.bashrc
    install gitconfig ~/.gitconfig
    # etc.
This is much simpler than messing around with branches since 99% of files are the same on different machines. Just add e.g. gitconfig-workpc to use a different git identity at work.

For some files I also support a +$HOSTNAME suffix that will append instead of override. In this case it assembles the destination file rather than creating a symlink. This is a bit annoying because you can't edit the destination file; you have to edit the originals and re-run the install script. But it's worth it in a few cases (e.g. i3 config) to reduce duplication.
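The append variant might be sketched like this (hypothetical; the commenter's actual script isn't shown):

```shell
# Hypothetical sketch of the +$HOSTNAME append behaviour: the
# destination is assembled from the base file plus an optional
# per-host fragment, so it is a generated file rather than a symlink.
install_append() {
    SRC="$PWD/$1"
    cat "$SRC" > "$2"
    if [ -e "$SRC+$HOSTNAME" ]; then
        cat "$SRC+$HOSTNAME" >> "$2"
    fi
}
```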

Check out https://chezmoi.io/ - it solves all of these problems.

I have a `.gitignore` file with `*` in it to ignore all files, so I explicitly type `dotfiles add --force`.

Just throwing my solution for dotfiles synchronization into the mix as well here, homeshick: https://github.com/andsens/homeshick

Maintained for 8 years now, 1.7k stars, only needs git >=1.5, bash >=3, and no root access to install.

It's well tested, stable, and super hackable to fit your needs.

Hey! Cool to see you in the wild, I have used homeshick for a long time. Installing it is basically the first thing I do on any new machine, to the point where I have the GitHub url (and therefore your username) memorized.

Thanks for all your work on homeshick. It's a wonderful replacement for the old ruby version.

I just maintain an install script to do the linking. It's just a few lines of zsh to mirror one directory into another with symlinks.

I've found that the bare repo approach has too many rough edges, and that the various dotfile management frameworks are overkill.


Yep same. How does a bare repository make it simpler if you have to run a setup script anyway? A setup script can be as simple as this:

    mkdir -p ~/.config
    ln -sf ~/.dotfiles/vim ~/.vim
    ln -sf ~/.dotfiles/bashrc ~/.bashrc
    ln -sf ~/.dotfiles/gitconfig ~/.gitconfig
    ln -sf ~/.dotfiles/i3 ~/.config/i3
This is two steps instead of three. You can manage ~/.dotfiles like a normal git repository instead of using the "dotfiles" command in the article. It isn't full of hidden files, you can put other scripts and stuff in it, you don't have to turn off showUntrackedFiles or add your whole home directory to gitignore, etc. It's so much simpler and easier to keep clean and organized. The only real downside is that to add a file you have to move it to ~/.dotfiles and add a line for it to the script.

Side note, I get the benefit of publishing dotfiles for others to learn from, but why go through so much trouble to document how to install your dotfiles? Does anyone actually use the dotfiles of strangers? One time upon joining a company it was strongly recommended to me to just install another developer's dotfiles on my computer to get started. I had such a strong feeling of revulsion from this. Using someone else's dotfiles is like using someone else's toothbrush. I have my own, thanks.

It's good to look at others for ideas but for anyone new to versioning your dotfiles I would strongly recommend starting from scratch and doing the simplest thing that could possibly work. Hence a script like above.

Often enough, when I see something like this, the real value isn't the software itself, but the idea that perhaps the issue it addresses is worth thinking about a bit more. The solution itself may be trivial, but it can have a large impact.

E.g. I have created [0] the simplest of scripts for managing updates for manually-installed / source-compiled applications (something I've dubbed "misc", very proud of this backronym :p).

The script itself is extremely simple (just a list of greps over latest release announcement urls), but it has solved a big problem for me, of helping me keep such "misc" items seamlessly up-to-date.

[0]: https://github.com/tpapastylianou/misc-updater

Any love for rcm [1]? I settled on this after trying many dotfiles systems - works for me with just the right feature set. I don't often see it mentioned on dotfiles discussions online though.

[1]: https://github.com/thoughtbot/rcm

Huge fan of rcm! I’m in a very similar position - I’ve used many a different dotfile system, and after finally settling on rcm, I couldn’t want more from it.

yep I love rcm. One of the features I use that most of the more barebones solutions don't have is the tag system so I can keep files for different machines in the same dotfile repo.

I do a similar thing, but with dotbot instead https://github.com/anishathalye/dotbot

Bare git repos are fine for starting, but you quickly hit limits if you want to use your dotfiles on different machines, e.g. share your .zshrc config between Linux and macOS.

A comparison of popular dotfile managers:


I sync a directory of dotfiles across my workstations using syncthing. The .bashrc and .profile simply do:

    for FN in ~/.paths/sneak-sync/bashrc.d/*.sh ; do
        source "$FN"
    done
Then updates (by adding or removing files to these directories) propagate to all my workstations. I have machine-specific ones, too, that also sync but aren't included due to differing hostnames.
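The machine-specific part could be gated on hostname with the same pattern (the directory naming here is illustrative, not the commenter's actual layout):

```shell
# Illustrative sketch: also source fragments from a per-host directory,
# so synced machine-specific config only takes effect on the machine
# whose hostname matches.
HOST_DIR="$HOME/.paths/sneak-sync/bashrc.d-$(hostname)"
if [ -d "$HOST_DIR" ]; then
    for FN in "$HOST_DIR"/*.sh; do
        [ -r "$FN" ] && . "$FN"
    done
fi
```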

Wow, that’s awesome.

I never liked the git bare repository approach, because it's easy to accidentally add files to it that you don't intend (if you forget to run git init in a new project directory, or if some code generator decides to "re-use" your existing git repository). I prefer the symlink approach, but I never liked how all of the symlink managers tend to leave broken symlinks all over the place.

That's why I created my own solution, which maintains a state file in the repo, so doing something like deleting a config file or switching a git branch doesn't result in a bunch of broken symlinks lying in your system.


I like the bare repository approach here. I've been using symbolic links to version control my dotfiles with git.

I do the same using `stow`, and like it a lot.

I gave it a quick glance but didn't understand: how does OP's link manage it without symbolic links?

Files are tracked by a bare git repository, e.g. `~/.dotfiles`.

The trick is to use the combination of --git-dir and --work-tree git options. An alias can be defined to simplify the process: `alias dotfiles='/usr/bin/git --git-dir=$HOME/.dotfiles/ --work-tree=$HOME'`.

`dotfiles` can be used as you would use `git`, e.g.:

- `dotfiles add <file>`

- `dotfiles commit [options]`

By telling git to use `--work-tree=$HOME`, they tell git to directly work on the files in their home directory; no symlinks needed.
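Setting this up from scratch looks roughly like the following sketch; `status.showUntrackedFiles no` (mentioned in an earlier comment) stops `dotfiles status` from listing everything in `$HOME` as untracked:

```shell
# Create the bare repository and an alias to drive it.
git init --bare "$HOME/.dotfiles"
alias dotfiles='git --git-dir=$HOME/.dotfiles/ --work-tree=$HOME'

# Don't show every file in $HOME as untracked.
dotfiles config status.showUntrackedFiles no

# Then use it like git:
dotfiles add ~/.bashrc
dotfiles commit -m "Add bashrc"
```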

I've had my dotfiles under version control for over a decade (hg for a couple years, then git). I've never used or felt the need for any special tools, and I don't use any extra symlinking.

If you do that stuff, great, but here's what I do if you don't want to use any specialized tools:

My home directory is just a git repo. There's a regular old ~/.git directory. All of the management I do is regular git stuff, like any other repo. Everything I don't want to commit is listed in .gitignore, like any repo.

The only wrinkle is setting up a new machine. You can't clone into an existing directory (AFAIK), and your home directory already exists, so you need a workaround. An easy solution: clone the remote repo to ~/dotfiles, then `mv ~/dotfiles/.git ~`, then `rm -r ~/dotfiles`. Now your home directory is a git repo where the working copy is exactly as the home directory was before, and HEAD points to your repo's default branch. It will be dirty because all of your config files are missing. Use regular `git status` to take a look. It's always looked fine to me, so I `git checkout .`, which effectively installs the dotfiles. Then I'm done until I need to do that copy trick again when I buy my next computer. In the meantime I just commit and push as normal, and occasionally add a new file to .gitignore.
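The bootstrap steps described above, as a sketch (the repository URL is a placeholder):

```shell
# Clone to a scratch directory, move the .git dir into $HOME, and
# check out to restore the tracked files. URL is a placeholder.
git clone https://example.com/you/dotfiles.git ~/dotfiles-tmp
mv ~/dotfiles-tmp/.git ~/
rm -r ~/dotfiles-tmp
cd ~
git status       # tracked files show as deleted until checked out
git checkout .   # "installs" the dotfiles
```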

That's nice, until you accidentally type "git clean -fdx" into the wrong terminal and nuke your home directory. I would be way too scared to have all my files be untracked in an unclean repo.

I use `git clean` maybe once per year on average, so very low risk for me. And if that ever happens... that's why I have hourly backup snapshots! If I didn't trust my backups, I'd definitely address that before worrying about dotfile management. If you don't have a good safety net then you end up reinventing the net 100 times in other ways and everything gets more complicated.

I've found that a simple dot_files directory that is sync'd via a public git repository [0] works well for me. When I'm on a new machine all I need to do is clone the repository and run `ln -s ~/dot_files/bash/bashrc ~/.bashrc`. The bashrc file then takes care of everything else.

It might not work for every program out there, but it does for the small number that I use.

[0] https://github.com/andypea/dot_files

How do people manage the platform differences between Linux, macOS, Windows WSL, Windows Cygwin, Solaris (at one point I had it on my NAS); and environment differences, such as work machines vs home machines?

I have my own dotfiles mechanism which splits out paths and configs by operating system, portable runtimes (e.g. Mono - may work on different OSes) and environments (enabling home and work configs to be stored in a separate repo and avoiding data leakage). Things like .bashrc and .bash_profile are assembled piecewise from fragments in each component.

Modifications to generated files by installers (notoriously, things like rvm, nvm etc. all want their stuff to be last in the config file, first in the path and all that) are detected so they can be integrated and not accidentally overwritten.

It's somewhat complex - I'm deliberately not linking it because I wouldn't recommend other people use mine, I have no time to support it - but to my mind these features are critical for any dotfile management system.

I stole this from somewhere, but in bash at least you can do something like this:

    if [ "$(uname)" = "Darwin" ]; then ... fi   # macOS

    if [ "$TERM" = "cygwin" ]; then ... fi      # Cygwin

I'm sure there are other ways to determine WSL and the others you mentioned.
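For WSL specifically, one common heuristic (an assumption: WSL kernels report "microsoft" in /proc/version) is:

```shell
# Common WSL heuristic: the WSL kernel version string mentions
# "microsoft". The check is quietly false on other systems, and the
# redirect swallows the error if /proc/version doesn't exist.
if grep -qi microsoft /proc/version 2>/dev/null; then
    : # WSL-specific config goes here
fi
```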

Oh, there are selectors you can run, but what if you want different config files? A different inputrc or dircolors because you use different terminals, etc. And then there's the difficulty of managing an enormous do-everything script, versus something which is compiled for the platform.

I've been happy with https://yadm.io/ to manage dotfiles.

Since we're sharing, my dotfiles setup has pretty much reached its final form. I use my symgr[1] to symlink my dotfiles repo into my home dir. Pretty much everything I think about this topic is in its readme, as well as a link to my setup[2] repo with my dotfiles showing how I use symgr.

[1] https://github.com/kbd/symgr

[2] https://github.com/kbd/setup

I would highly recommend zero-sh[1] for those looking for a consistent simple tool to manage dotfiles on a mac.

1. https://github.com/zero-sh/zero.sh

My solution is to just nuke them every time I get a new comp. I never really miss my old setup that much.
