Using GNU Stow to manage your dotfiles (2012) (invergo.net)
203 points by matthberg on Dec 28, 2020 | hide | past | favorite | 113 comments

Call me old fashioned but what's wrong with having a regular git repo anywhere you want with all of your dotfiles and then symlink those files to where they need to go?

If you wanted to you could && a few symlinks and other commands into 1 copy / paste'able command to get up and running really quickly. It's also painless to manage secrets by sourcing in optional files. It also works nicely when you want files living across 2 systems (such as WSL 2 and Windows) where on a Windows box you consider certain Windows files dotfiles even though technically they exist elsewhere.
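A sketch of that kind of chained one-liner (the repo URL and file names here are hypothetical):

```shell
# Clone the repo and chain the symlinks into a single pasteable command
git clone https://github.com/you/dotfiles.git ~/dotfiles \
  && ln -sf ~/dotfiles/.vimrc ~/.vimrc \
  && ln -sf ~/dotfiles/.bashrc ~/.bashrc
```

`ln -sf` makes re-running safe: an existing symlink is simply replaced.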

I've been managing my dotfiles like this for a few years and it seems to be working out well, not just for myself but for others who want to use them either partially or in full. I open sourced them here: https://github.com/nickjj/dotfiles

I purposely broke up some of the commands because I know that not everyone will want a 100% replica of my dotfiles. Folks copy / paste only the commands they want.

>Call me old fashioned but what's wrong with having a regular git repo anywhere you want with all of your dotfiles and then symlink those files to where they need to go?

Well, that's what you end up with. Stow just automates the "symlink those files" part.

More importantly, Stow can intelligently convert a symlinked directory into a real one when a second package wants to share it.

Stow does this; it also has a --dotfiles switch so you can save your files as `dot-vimrc`, `dot-bashrc`, etc., so you don't have a bunch of hidden files in your repo.
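For anyone who hasn't seen that switch, a sketch of the layout it enables (package names here are hypothetical):

```shell
# Repo layout with no hidden files:
#   ~/dotfiles/vim/dot-vimrc
#   ~/dotfiles/bash/dot-bashrc
cd ~/dotfiles
# --dotfiles maps the dot- prefix to a leading dot on install
stow --dotfiles -t "$HOME" vim bash   # yields ~/.vimrc and ~/.bashrc symlinks
```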

Too bad that's broken for directory names: https://github.com/aspiers/stow/issues/33

Stow maintainer here. Thanks for the nudge on that, and apologies for not doing a better job. I hadn't noticed that this issue was affecting so many users. I have some downtime in the coming week or two so will look at this.

Thank you! That fix will make my dotfiles repo look a lot nicer :)

That's pretty much what this is. I have a git repo with one dir for everything that needs configuration. Whenever I spin up a vps/vm or what have you, git clone dotfiles; cd dotfiles and stow what I'm using on that machine.

stow links based on the directory structure of the folder you call it from, so I don't have to worry about where things are; just stow fish; stow nvim

There's no symlink script or adapter script to work with stow, everything 'just works' for me.
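In other words, each top-level directory in the repo is a Stow "package" whose contents mirror $HOME. A sketch of that layout (using the fish/nvim packages from the comment above):

```shell
# Assumed layout:
#   ~/dotfiles/fish/.config/fish/config.fish
#   ~/dotfiles/nvim/.config/nvim/init.vim
cd ~/dotfiles
stow -t "$HOME" fish   # symlinks ~/.config/fish into place
stow -t "$HOME" nvim   # same for ~/.config/nvim
```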

> Call me old fashioned but what's wrong with having a regular git repo anywhere you want with all of your dotfiles and then symlink those files to where they need to go?

I prefer to skip the symlinks, and just directly use the git repository as my home directory. (I have a sizable .gitignore for things I don't want to track.)

I do the same, but I just have a single '*' in my .gitignore.

The only drawback to the homedir-as-git-repo approach I've found is that you're always in a git repo, so you have to be careful not to add stuff that was really supposed to go in another repo.

Here's a better solution.

  alias dotfiles='/usr/bin/git --git-dir=$HOME/src/dotfiles --work-tree=$HOME'
  dotfiles add .vimrc
  dotfiles commit -m 'Update vimrc'
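One step that sketch leaves out is creating the detached repository in the first place. A minimal bootstrap, assuming the paths used in the alias:

```shell
# Create the detached repo once; $HOME itself stays a normal directory
git init --bare "$HOME/src/dotfiles"
# Optional: stop "dotfiles status" from listing every untracked file in $HOME
/usr/bin/git --git-dir="$HOME/src/dotfiles" --work-tree="$HOME" \
  config --local status.showUntrackedFiles no
```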

I really like that solution, thanks. It also means I can easily move my current git dir to the new place and still have everything work.

Does anyone know if syncing a git repo over Syncthing (or similar) is a good idea or whether there are likely to be conflicts? I don't know if git names its files such that conflicts aren't possible.

Brilliant, thanks!

> The only drawback to the homedir-as-git-repo approach I've found is that you're always in a git repo, so you have to be careful not to add stuff that was really supposed to go in another repo.

I tend to always run "git status" before adding anything, which makes it fairly obvious what repository I'm in. (Or I'm working via fugitive in vim, which helps in the same way.)

I do it with a renamed `.git` and a bash alias `githome` that points `git` to the right place. This way it doesn't look like my home is a repo in normal use.

I still remember when git was the hot new version control system, so old fashioned sounds funny to me.

I had the same reaction. Stow was first released (v>1.0) in 1996, git in 2013. Yet git is the old-fashioned choice? hahah.

1.0 of Git was actually 2005.

ah cool. i just had a look in the project's releases, 2013 was the first nonzero version they had listed there. I did wonder why they didn't list a v1.0 though.

Not sure if Stow fixes these, but here are some problems I've had with that setup:

1. If you want to delete a file, you have to delete it (rm) in one place and delete it another way in another place (git rm).

2. Adding a new file in-situ (in the home directory) requires some finessing to get it back into the git repo and symlinked properly.

3. No easy way to apply a rename operation.

4. After changing the checked out branch, there is no easy way to apply these changes to the home directory.

If all your configs follow the XDG spec then you can turn ~/.config itself into the repo, and then use .gitignore as a whitelist.

I do this and clone my repo directly to .config on a new machine. The only hold out was tmux but that's solved by installing or compiling one of the recent versions released this year (3b/3c). I'll still use manual symlinks for things like bash, but they're outliers.

Works great for me.
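The whitelist-style .gitignore mentioned above might look like this (package names hypothetical; note that with a blanket `*`, directories have to be un-ignored before their contents can be):

```gitignore
# ~/.config/.gitignore
*
!.gitignore
!nvim/
!nvim/**
!fish/
!fish/**
```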

I believe Stow can fix everything except #2 [1], although I haven't actually used it before. But it's also easy to create your own "garbage collector" that cleans up dangling symlinks in your home directory.

All you need to do is to keep track of symlinks you've installed with your setup script. This can be done by creating a symlink to the symlink you've installed, which acts as a "GC root." The next time you run the setup script, it would check those "GC roots" to see if they point to a valid file and remove any dangling symlinks.
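A sketch of that garbage-collection pass, assuming a hypothetical ~/.dotfile-roots directory holding the "GC root" symlinks:

```shell
roots="$HOME/.dotfile-roots"    # one GC root per symlink the setup script installed
for root in "$roots"/*; do
  [ -L "$root" ] || continue      # skip if the glob matched nothing
  link="$(readlink "$root")"      # path of the symlink we installed in $HOME
  if [ ! -e "$link" ]; then       # its target no longer resolves...
    rm -f "$link" "$root"         # ...so drop the dangling link and its GC root
  fi
done
```

`-e` follows the whole symlink chain, so a link whose repo file was deleted tests as nonexistent and gets cleaned up.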

This is the approach I take for my own dotfiles. I seriously considered using Stow or the bare git approach before, but I decided against it because setting up my dotfiles involved more than just installing symlinks. I had to be able to download files (e.g., vim-plug), clone git repositories, change file permissions, and maintain files that aren't meant to be linked into the home directory. I found the flexibility of a custom shell script the best fit for my needs.

[1]: The "Deleting Packages" section in https://linux.die.net/man/8/stow

> and delete it another way in another place (git rm).

I use `rm` all the time in git repos... Just `git add` when it is time to stage changes.

Right, but when you unlink it from your home directory, the file still exists in the git repo. So there is a manual synchronization that must happen between the two directories.

>Right, but when you unlink it from your home directory, the file still exists in the git repo.

That's a feature.

Not when you are looking for a workflow to make dealing with dotfiles faster and more painless. With the current (symlink based) setup, there is a hidden state you must hold in your head or else carefully inspect: the discrepancy between your git repo and your home directory. Maybe you forgot to link a dotfile, and that’s why your latest configuration doesn’t work. Ditto for unlinking a dotfile.

Isn't that what -D, -S, and more simply -R do?

I did not use stow 2.x, but my understanding is that running stow -R will remove previous dangling symlinks (during the unstow) and add missing symlinks (during the restow), effectively allowing you to manage only a single state: your git repository.

I delete dotfiles so infrequently (once a year?) that this just isn't a problem for me.

I also don't have all dotfiles under version control, just the ones I've essentially "built myself", so maybe that makes it easier.

`git rm` will still leave the file available in the repo history, so violating the consistency seems more like a bug than a feature.

Still a feature. You might want to revert back to it. And history is history.

Call me old fashioned, but I remember using stow in 2003/2004, about a year before git existed.

Yes though: use a repo for sync, but Stow takes care of linking your dotfiles in the right place.

> but what's wrong with having a regular git repo anywhere you want with all of your dotfiles and then symlink those files to where they need to go?

Nothing wrong with tracking dotfiles with Git. It's probably the simplest and easiest approach, not requiring any fancy tools other than Git. However, as mentioned in the Arch Linux wiki page, the disadvantage of the Git approach is that "host-specific configuration generally requires merging changes into multiple branches."[1]

[1]: https://wiki.archlinux.org/index.php/Dotfiles#Tracking_dotfi...

I've had my dotfiles in git for 7+ years. Work on Windows, Linux and macOS. Conditionalization in BASH is dead simple.

EDIT: 12+ years at work.

Yeah, it seems like keeping the conditionals inside the dotfiles, rather than in a dotfiles manager, is the way to go.

Can you share some of your setup there to exemplify? Thanks.

I use $(uname -s) (Cygwin required on Windows) and dispatch on that. There's really nothing more than that.
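A sketch of that dispatch (the per-OS settings are made-up placeholders):

```shell
# In ~/.bashrc: branch on the kernel name reported by uname
case "$(uname -s)" in
  Linux)    alias ls='ls --color=auto' ;;
  Darwin)   alias ls='ls -G' ;;
  CYGWIN*)  export TERM=xterm-256color ;;  # Cygwin reports CYGWIN_NT-<version>
esac
```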

> and then symlink those files to where they need to go?

This is exactly what Stow does.

Yeah, I set it up with the Atlassian tutorial, https://www.atlassian.com/git/tutorials/dotfiles, and store it in a git repo in Keybase.io

I've been using bashdot (https://github.com/bashdot/bashdot), which does exactly this. It's a simple bash script that sets up symlinks for a given directory. It can also switch symlinks to different directories, so you could use that for swappable profiles.

> If you wanted to you could && a few symlinks and other commands into 1 copy / paste'able command to get up and running really quickly

Exactly! I do exactly that. I have an install.sh script that simply iterates over the files and directories in my dotfiles repo and does mkdir and ln

That's what I do too. I even made a little Stow-inspired tool to help me, called "stowage", if you are curious: https://github.com/michaelpb/stowage

Yeah, the command is literally (cd ~; ln -s source-location/{.,}[A-z]* .) which is a lot easier than installing all that stuff.

If we’re codegolfing dotfile setup, then really go crazy: `cd; ln -s source-location/{.,}[A-z]* .`

Stow is great: it is simple and works well, especially combined with git. That's what I do [0], and recently I combined it with org-mode for literate programming, so each program has just a README.org that then generates all the files via org tangle [1] [2]. For example, here is the file that generates my Xorg configuration [3] over several files, nicely readable on GitHub, in Emacs, or just as plain text.

[0] https://github.com/podiki/dot.me/

[1] https://web.archive.org/web/20190924102437/https://expoundit...

[2] https://orgmode.org/manual/Working-with-Source-Code.html

[3] https://github.com/podiki/dot.me/tree/master/x11

I really like the idea! Thanks for mentioning it. I thought about converting my dotfiles to org-mode files but managing the tangling is always the step where I stop. I think I'd rather automate the tangling. I see you commit the tangled "results". Is this by choice or is it a compromise?

I commit the tangled results so that I can use Stow directly on other computers. But this could be easily automated with a script (since you can tangle without opening emacs directly, or within the org file itself!) to tangle after any update, or even to pull in changes if you make them to the file directly. For example, you could have a hook so that on saving the org file it tangles to update any files, or on opening pull in changes you made directly in the result file (e.g. not in the org file directly). I saw some links on this but unfortunately don't have them handy. I'm sure they will come up with some searching, but let me know if you can't find anything.

I have found that managing dotfiles is not enough. The dotfiles serve no purpose without the software that uses them. My method is to write small scripts, I call them setuplets, that install the software and then symlink the dotfile to its master that I manage in git. In the simplest case, it is just a two line script in a directory, but I have one for each program, and a tool to select which I run when setting up a new linux machine. The more complex ones may install a number of tools and set up environment variables etc. For environment variables, the script drops a script file in a directory, where my bashrc sources it.
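A minimal setuplet in that spirit might be just two lines (package name and repo path are hypothetical; zypper stands in for whatever package manager the host uses):

```shell
#!/bin/sh
# Setuplet: install tmux, then link its dotfile to the git-managed master copy
sudo zypper install -y tmux
ln -sf "$HOME/git/dotfiles/tmux.conf" "$HOME/.tmux.conf"
```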

Interesting! https://github.com/nix-community/home-manager uses Nix to solve this problem. It adds reproducibility and rollbacks but with the cost of learning the Nix DSL.

+1 for nix and home-manager - I recently switched to this setup (still learning/experimenting a lot, link here [0]) and have been very happy!

[0]: https://github.com/jpetrucciani/nix

+1 for nix and home-manager. Most other solutions make it hard to manage installing packages on different flavors of Linux. You can get even more fancy by using the direnv integration to have specific versions of binaries available when entering a directory. See https://www.mathiaspolligkeit.de/dev/exploring-nix-on-macos/ . By checking the nix files into your project's directory, you get really reproducible projects, because all you have to do is check out the project, then run direnv allow, and nix/home-manager will install all the tools needed to run/build that specific project. Different projects can use different versions of tools.

I used to do this for macOS and Linux but it was a fragile approach, since using Nix and NixOS I've had great success using home-manager[0] which takes care of what software is installed and any overrides or patches I desire.

I also use GNU stow in my dotfiles[1] especially so that my setup still works on systems without Nix installed.

[0] https://github.com/nix-community/home-manager

[1] https://github.com/siraben/dotfiles

I don't want to sound condescending, but are you not basically doing packages? Instead of the standard packages your distro gives you, you have "custom" packages that also contain more specific installation instructions (write a file with specific content, set environment variables, ...). I know that PKGBUILDs in Arch allow you to do such things if needed.

Bonus: now even configuration files can be linked to the software that needs them; when you remove a piece of software you also remove all the files related to it, and there are no leftovers.

Because package managers are different across distros? Also, a custom setup allows you to have those tools even if you don't have root permission to install them with the package manager.

No. My script calls the package manager, sudo zypper install..., and symlinks a dotfile. That is usually all.

I've been logging each install as a markdown document with directions, and manually symlinking my dotfiles, but tiny scripts that do both are a great idea!

Yes, as soon as I have installed a program which I like and think is a "keeper", I write my setuplet script, usually by copying and pasting from the bash history.

I am not an expert in Ansible, or any other software of the kind, but I think this is one of the easier use-cases of such a program. I wrote an ansible script to compile emacs from source, after some initial configuration, and it serves me pretty well.

I started with Ansible for automation, and still use it, but have abandoned it for setting up my personal environment. I still use it to set up services, such as sshd, webserver, printer, samba, smartd etc. I found it was more cumbersome for personal environment setup than my script method. My method of re-initializing my environment rests on three pillars:

Ansible, for server/OS setup. Things run by init/systemd, involving anything outside of $HOME basically

Backup, using borgmatic. I use this to restore most things in $HOME, except random dotfiles. My documents, my checked out git repos, etc.

Script "setuplets". These I run on demand, to set up my environment piece by piece. Perhaps I do not want to restore my programming environment just because I want to have my custom prompt on a host, for example.

Finding the balance between these has been difficult. What bootstraps what, and especially, how to handle credentials. My backup is encrypted, but how would I make sure I had the keys to restore it? I could restore it from my pass password store, but how would I get the gnupg keys in place first? I have not solved this completely satisfactorily yet.

I am you in the past then, I am still trying to find the balance. Thank you for the insights!

I do this but include the symlink command in the document. That way all the links become one copy and paste.

May I recommend Makefiles? Nothing fancy, just a bunch of .PHONY targets. It's a convenient way to bundle a bunch of scripts (or single arcane commands) into one file, with autocompletion.

That’s a good idea. Thanks!

I have a similar setup, but I don't have to write that script myself. Instead, I use Zinit (https://github.com/zdharma/zinit) and let it set up the tools for me.

> My method is to write small scripts, I call them setuplets,

Finally I have a proper term for what I, too, have been doing all these years! :-) It's indeed the best-possible approach I've found, though there are a number of things that I haven't yet solved for myself in a satisfactory manner:

- With shell scripts there are no idempotency guarantees and there is no easy undoing / uninstalling / clean-up, especially after updating a setuplet.

- With shell scripts everything is defined imperatively as opposed to declaratively. In particular, setuplets usually operate on the filesystem directly and testing and dry runs are almost impossible.

- No status report as to what a setuplet wants to set up (software, configs, cronjobs…) and what is already set up on the current machine. That is, no diffs. This makes sharing setuplets and configs between multiple machines (say, personal and work laptop) rather cumbersome. For instance, I might forget to re-execute a setuplet on the second machine which could then lead to a missing software dependency or a mismatch between config and software.

- No simple, out-of-the-box way to have different configs/dotfiles for different machines, in particular: no config templating.

- It's hard to share common settings across applications without duplicating them everywhere. For instance, I would like to define a common set of colors / a common theme for my window manager, my terminal, my editor and so on. Similarly, (some) keybindings should be the same across applications. Moreover, I have a set of common directories in my home dir (for binaries, logs, cache etc.) that all my setuplets & dotfiles should use.

- Dependencies and interactions between setuplets are often implicit. They interact with and depend on one another through a myriad of ways, like software dependencies (of course) but also PATH modifications, cronjobs, bash aliases, file system modifications … These are very hard to recognize and, even worse, to refactor.

- Bash scripts are error-prone and cumbersome to write and debug (and refactor).

- My setuplets don't have a common command line interface and their relation to one another is unclear. (In which order should they get executed?) I tend to write scripts that invoke all the setuplets in the right order but it still seems messy and error-prone.

I've tried solutions like Ansible but I've found that its purely declarative DSL is not flexible enough to cover all my use cases in an elegant manner.

…which is why I'm currently working on a small Python library that will hopefully solve or at least ease the above pain points for me. Once the library is finished, I will rewrite my setuplets in Python (using the library to do the hard and tedious work), so that I end up with one single Python project of dotfiles and setuplets (exposed through one single command line tool) that, once executed, will automatically set up an entire machine for me within a few minutes. One nice thing would be that all inter-setuplet dependencies would be expressed through Python code (with proper typing, encapsulation in modules and everything) which could then easily be explored (and also refactored) with an IDE. Sure, this sounds like a lot of work but given that I intend to use my dotfiles for a couple more decades, it seems well worth it.

Of course, whether my approach will ultimately be able to solve all the challenges above remains to be seen but I'm at a point right now where I'm convinced that proper software engineering methods (especially dependency injection, type checks, tests etc.) would be a real boon for managing my (hundreds of) dotfiles.

Have you played with NixOS or Guix at all? They attempt to solve this problem from the ground up for the entire OS. It obviously has trade offs, but IMO it is the best solution around today (other than Kubernetes, but that is an abstraction level higher).

I really love nix and NixOS, and I'm also using home-manager on my macOS machine and a WSL instance. But I decided to turn away from it. Using just the nix package manager on a distribution feels off and doesn't bring you all the benefits. On macOS it can be very frustrating to find a program that is in the nix packages but not available for macOS. Plus some tools tend to be very outdated and use very general compile settings (light theme etc). I also have issues with how libraries etc. are linked. I have a rust project that, for the love of god, I could not compile on an Arch installation with the nix package manager managing all the tools (rustup, git, etc.). It was failing with some error around a standard C lib. It was working fine on a popOS installation with nix, so I have no clue what went wrong where. The rabbit hole is very deep with nix. It is very, very awesome, but you also leave behind the conventional way of Linux when going all in. I played around with NixOS and really loved the idea and how easy it is to reconfigure the system from a single config. But if you don't want to write custom package overrides and dabble with nix expressions, I would not use it as a dotfile manager. If anyone knows a good way to integrate nix with Arch that does not feel like a strapped-on solution, I'm all open.

Thanks, I was entertaining the idea of giving nixOS a try for managing my dotfiles but your post confirms my suspicions!

You don't need symlinks, setup scripts, etc. Try this:

1. Bare git repo in your home directory ($HOME/.files)

2. Alias for prefixing git commands ("env GIT_WORK_TREE=$HOME GIT_DIR=$HOME/.files")

3. Strict .gitignore file (that ignores all files by default)

Simple to add files: `h git add .vimrc`

Have this set up for myself. Works great https://github.com/tmm/dotfiles

This is similar to a resource that Atlassian has put out on storing dotfiles [1]. It is also similar to what I use, and I can vouch for this method as well.

[1] https://www.atlassian.com/git/tutorials/dotfiles

Not sure if I got it from the Atlassian post or the original Hacker News post - I am aware of both. I don't see the need for anything more complex. My first config commit is in early 2017.

I've been doing something similar.

Basically, I add an alias to my zshrc that runs: `git --git-dir=$HOME/dotfiles --work-tree=$HOME`.

And then set status.showUntrackedFiles to no.

See: https://github.com/sp1ritCS/dotfiles

I've got this idea from the following blogpost: https://news.opensuse.org/2020/03/27/Manage-dotfiles-with-Gi...

This is the simplest way of doing it and, in my opinion, the best. You can just have all of the files where they are supposed to be without the need for symlinks or anything. I've been doing it this way (well, a bit different but very similar) and it works like a charm.

I explain here the steps of how I do it: https://github.com/josepmdc/dotfiles#2-if-you-want-to-manage...

I use this technique, based on a bare git repo https://www.atlassian.com/git/tutorials/dotfiles

Works quite well ... but I'm really, really ready for a rebuild of how Linux systems are built. I hope systemd-homed will deliver.

Imagine you could ssh into a machine with a flag that forwards your local config. Or even your local bin.

I don't think systemd-homed does anything for dotfiles, let alone portable dotfiles. The user record only contains (the path to) a skeleton directory used to create the homedir if it doesn't already exist. Populating such a newly-created homedir with your dotfiles is still your responsibility.

You mean what we had with Project Athena for decades? Or NFS-mounted home dirs? Or Plan 9?

> Or NFS-mounted home dirs?

I would not wish this pain on others. It works great when your systems are homogeneous. It rapidly becomes a pain when you need to make customizations that only apply in a particular network, or on hosts with a particular OS version, etc.

My bashrc has basically just become a monstrosity to control what files get sourced for this particular host. Answering why a particular env var is set to what it is ends with me trying to trace what files get loaded, what they set, and the order they get loaded in.

Much more painful than separating the concerns and having Ansible template out my bashrc.

I moved to splitting my bashrc into multiple files and having my main bashrc source them from a ~/.bashrcd directory.

At heart it's a short snippet that just checks for existence and sources each file in the directory:
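A minimal version of such a snippet, assuming the ~/.bashrcd name used here:

```shell
# In ~/.bashrc: source every readable file in ~/.bashrcd, in name order
if [ -d "$HOME/.bashrcd" ]; then
  for f in "$HOME/.bashrcd"/*; do
    [ -r "$f" ] && . "$f"
  done
  unset f
fi
```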


I added aliases to list/edit/remove entries from the .bashrcd directory and resource it. And a script I can call with a one-liner to edit bashrc on a new machine to add the sourcing and the helper aliases.

It'll load alphabetically so I can prefix entries with a number to specify load order (defaulting to 0100 so I don't need to specify this in the commands unless I explicitly changed them).

So the end result is that I can quickly edit or create a new bashrc entry by running 'ebrc entryname'. This opens ~/.bashrcd/0100--entryname in vi, and when it's saved it'll re-source so the add/change takes effect immediately.

Or 'lbrc' to list contents of the directory, or 'rbrc entryname' to remove ~/.bashrcd/0100--entryname

It's fairly simplistic but takes away most of the cognitive load of managing a complex bashrc.

NFS, The Network is the Computer (TM).

It's a bit strange that now it's a Cloudflare trademark.

That sounds really convenient. How far away is the homed system from adoption?

I really like this method as opposed to using a bare Git repository. For one, it's conceptually simpler in my mind; you don't have to understand Git internals to get this working. Secondly, this lets you pick and choose which config files you want to "install" on a machine.

I feel obligated to share my Bash script, dotfiles.sh[1], that accomplishes what Stow does, but with a few tweaks that I found particularly useful:

dotfiles.sh targets the user's home directory by default (i.e. stow -t $HOME).

dotfiles.sh never symlinks directories, only files (i.e. stow --no-folding). (This was the straw that broke the camel's back and made me roll my own script in the first place.)

dotfiles.sh makes backups of local config files and can restore them if you remove your symlinked version.

My script is quite old now, and I use it so seldom that I'm not convinced there aren't bugs. YMMV.

[1]: https://github.com/kevin-hanselman/dotfiles

I'm very happy with yadm (https://yadm.io/) – it's a thin wrapper around git that adds things like alternate and templated files (to use different versions of a file, or to switch out part of a config, for different systems) and built-in support for secrets (not using the latter myself, but it's there).

I’ve been using stow to manage my dotfiles for a long time, works very well.

Recently I’ve moved to NixOS and have been considering trying out Nix’s home-manager (mostly just out of curiosity - I’m perfectly happy with stow). If anyone has tried both stow and home-manager, I’d be interested to know your thoughts on how they compare.

I've tried both. I was using stow before I switched to nix (on mac) and nix-darwin + home-manager. I use home-manager for everything except my vim config (I don't want to bury it in nix config and lose portability). home-manager is very nice when you want to create configuration that depends on other programs, e.g. complex gitconfigs or crazy mutt setups.

I'm not sure if it's all worth it. I recently got a new machine, and it sure reduced the pain of setting it up. But I get a new laptop every four years or so. Bike shedding is fun.

home-manager is how I'm getting into nix. Having generations for my dotfiles is very appealing. Maybe I never really got the hang of stow, but I would also get bitten by it when I would rearrange my dotfiles, which would leave dead symlinks around. Perhaps this is what "unstowing" is for?

I like dotbot [0] and have been using it for a while. How do people manage secrets? I am encrypting with Keybase, but really want to move away from it.

[0]: https://github.com/anishathalye/dotbot

I use dotbot as well, after having used fresh for many years. It took me a while to unwind the shell logic I'd built into fresh, but I prefer dotbot's YAML config.

I have a strict no secrets policy and they get laid down manually on any system that needs them.

I make a script purely with `export secret=...` lines and keep it in a local folder and KeePassXC. When I need secrets I manually source it. Not super automatic, but I don't add a lot of secrets a lot of the time.

I reinstalled my OS (actually, a jump to an adjacent distro) just last night and was thankful I had set up a dotfiles manager (yadm [0]). yadm is a git wrapper, so a yadm repository is pretty much a git repository.

`yay -Qe | cut -f 1 -d " " > .config/packages.txt` to back up my packages (without version), push to remote, and then `cd ~ && yadm clone -f <remote> && yay -S $(cat .config/packages.txt)` to reinstall everything. Piece of cake.

The only thing I am missing is tracking system files, like /etc/pacman.conf, and it looks like there's a way to do it with yadm, but it looks a bit clunky and I haven't tried.

[0] https://yadm.io/

I follow a similar but handcrafted approach. I have a dotfiles repo with a setup script that automates the creation or deletion of all the symbolic links: https://github.com/susam/dotfiles/blob/master/setup

So what I do on any new system is just:

  git clone https://github.com/susam/dotfiles.git
  cd dotfiles
  ./setup
And if I want to undo all the setup for some reason:

  ./setup rm

I love stow. My only complaint is that I cannot seem to remember the name of the command: Is it "grab", "collect", "pirate", "organize", "install", "push", "pull", ...?

So I added a note to myself into my .bashrc about this command, and patted myself on the back. The next time I reinstalled Linux, I couldn't remember the name, but I remembered that I wrote a note to myself about it. So opened up my .bashrc, and... I didn't feel so smart.

It doesn't solve the actual underlying problem, but you can enter the definitions in Anki. I bet you'll be able to memorize all of them after 30 days like a law or medical student. Okay, it's a stupid idea, stop laughing... But seriously, if you decide it's worth your life to spend 10 minutes each day for a month to memorize the stupid commands of a dotfile manager, it really works. I had some success on remembering Vim movement commands.

stow needs a GUI frontend

I made https://dotfilehub.com as another solution for this.

I am sure there is a better way but I just do this. It works well enough for my purposes.

  cd ~
  git init
  git remote add origin [URL]
  git fetch
  git checkout -f master

Recommend you also add "*" to your $HOME/.gitignore in this setup. Then you simply need to use `git add -f` to force-add files to your dotfiles store, so that you don't accidentally add files you don't intend to keep there.

I also do this. I would highly recommend doing it otherwise you may accidentally commit your ssh keys or other secrets.

Can confirm this works extremely well. Here's how I do it:

    git clone --bare <dotfiles repo> ~/.dotfiles
    git --git-dir=$HOME/.dotfiles/ --work-tree=$HOME checkout
    git --git-dir=$HOME/.dotfiles/ --work-tree=$HOME config --local status.showUntrackedFiles no
And then I add an alias:

    alias cfg='git --git-dir=$HOME/.dotfiles --work-tree=$HOME'
Now I can just manage my dotfiles as a git repo, and I don't have to worry about accidentally adding local secrets to the repo.
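The whole flow can be reproduced in a throwaway directory to see what ends up where. Everything below is a stand-in (the temp paths, the demo `.bashrc`, the identity settings) and `cfg` is defined as a function instead of an alias so it works in a script:

```shell
# Simulate the bare-repo technique with a temp dir standing in for $HOME.
tmp=$(mktemp -d)
home="$tmp/home"
mkdir -p "$home"
git init -q --bare "$home/.dotfiles"
cfg() { git --git-dir="$home/.dotfiles" --work-tree="$home" "$@"; }
cfg config --local status.showUntrackedFiles no
cfg config user.email you@example.com   # identity needed for the demo commit
cfg config user.name you
cd "$home"
echo 'set -o vi' > .bashrc
cfg add .bashrc                         # stage a dotfile like any git file
cfg commit -q -m 'Track .bashrc'
cfg ls-files                            # prints: .bashrc
```

Because `status.showUntrackedFiles` is off, `cfg status` stays quiet about the rest of the home directory, which is what keeps secrets from being added by accident.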

Yeah, that's what I do. And I have an ever-growing .gitignore file with all the dot files and things I don't care about like ~/Downloads...

If you put * in the gitignore file you don't have to deal with ignoring the downloads directory. When you want to add a new file / directory just add the -f flag (git add -f) and everything else works as normal.
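A quick sketch of that pattern in a scratch repo (the file names here are made up):

```shell
# Ignore everything by default, then force-add only what you want tracked.
tmp=$(mktemp -d)
cd "$tmp"
git init -q
echo '*' > .gitignore                # ignore everything by default
touch .vimrc Downloads.zip
git add .vimrc 2>/dev/null || true   # refused: .vimrc matches the ignore-all rule
git add -f .gitignore .vimrc         # -f overrides the ignore for chosen files
git status --porcelain               # Downloads.zip never shows up
```

Note that `.gitignore` itself is also caught by the `*` rule, so it needs the same `-f` treatment once.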

Sounds interesting, but I can't help but wonder:

> all your dotfiles are now neatly organised

isn't that what .config is for?

> install locally built packages in /usr/local/stow/PKGNAME-VERSION [...] so you don’t have to worry about any stray files

Isn't that what /opt is for?

> (HN comments live-linking to a repository)

So maintain externally and export as needed.

... am I missing some really important context here? I downloaded stow to check it out, but the whole 'dotfile/installation' use-case in the article sounds like a nothingburger to me.

it just automates putting dotfiles where they need to go. I have like two dozen different config files and every time I set up a new machine it'd be a drag to symlink everything by hand.
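For example, with a made-up `vim` package in a temp directory (falling back to a hand-made symlink if stow isn't installed, just to show the end result):

```shell
# One directory per program ("package"); stow mirrors it into the target as symlinks.
tmp=$(mktemp -d)
mkdir -p "$tmp/dotfiles/vim" "$tmp/home"
echo 'set number' > "$tmp/dotfiles/vim/.vimrc"
cd "$tmp/dotfiles"
if command -v stow >/dev/null 2>&1; then
  stow --target="$tmp/home" vim      # links everything under vim/ into the target
else
  ln -s "$tmp/dotfiles/vim/.vimrc" "$tmp/home/.vimrc"   # the link stow would make
fi
ls -l "$tmp/home/.vimrc"
```

In real use the target defaults to the parent of the stow directory, so a `~/dotfiles` checkout needs no `--target` at all: `cd ~/dotfiles && stow vim` does the whole package in one go.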

Some more ways to manage dotfiles are linked from the vcs-home wiki:


I did my own super-simple Python script[1] with zero dependencies for symlinking files between multiple computers and platforms.

[1] https://gist.github.com/XVilka/f124936a336a5a43f54915369719e...

I originally went with the bare git repo technique, but found myself not liking how it handled storing multiple computers' dotfiles in a single repo.

I ended up switching to https://www.chezmoi.io/, after learning about it here on HN.

Absolutely love Chezmoi.

It's a small, statically linked binary which is easy to install, and the templating feature is really useful.

There are a ton of utilities for managing dotfiles. Some of them are listed here: http://dotfiles.github.io/utilities/

It’s over-engineered, but I’ve been using https://github.com/RichiH/vcsh for this for years.

Is this project's name a pun on "gnusto"? That's the spell in the Infocom game "Enchanter", which copies a spell into your spellbook.

Have GNU reinvented the Windows registry?

Obviously the Windows registry works terribly in practice, but the idea of a centralised store seems to be coming back into fashion.

For this I highly recommend https://zolk3ri.name/cgit/zpkg/ which I have been using for years now. It works wonderfully for my use case, and might work for yours, too. It is very simple.

Environment variables and their defaults:

    ZPKG_SRC = ~/.local/pkg
    ZPKG_DST = ~/.local
    ZPKG_DB = ~/.local/pkg/.db
It means that if you install anything from scratch, you have to `make install` (or the like, depending on the build system) it to, say, `~/.local/pkg/foo-1.0` and then run `zpkg link foo:1.0` to install (i.e. link) the "package".

After that you just have to make sure you have (typically in your `~/.bash_profile` file):

- added `~/.local/bin` to your `PATH` environment variable, and

- added `~/.local/man` to your `MANPATH` environment variable
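With the default `ZPKG_DST` above, that amounts to roughly these two lines in `~/.bash_profile`:

```shell
# Assuming the ZPKG_DST default of ~/.local:
export PATH="$HOME/.local/bin:$PATH"
export MANPATH="$HOME/.local/man:$MANPATH"
```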

Seems to do the job. For help, read the source code or type `zpkg --help` which should be of tremendous help.

By the way, I have noticed that someone created a package manager with the same name, but its initial commit was in 2019, while this one's was in 2017.

It is written in OCaml, so you do need the OCaml compiler installed. I recommend installing it via `opam`, but your Linux distribution's package manager will suffice (simply `ocaml` on Arch Linux, for example). Run `make` to compile it; it will produce a working executable.

Direct link to the source code: https://zolk3ri.name/cgit/zpkg/tree/src/zpkg.ml

If you find any bugs, report them via the e-mail address in the `LICENSE` file. I reported a bug before and it was fixed almost immediately. I suppose you can send pull requests or mention missing features; the creator seemed friendly to me.


The above is a modified version of an old comment of mine: https://news.ycombinator.com/item?id=24238587


To be frank, I forgot how it compares to GNU Stow because it was many years ago, but I did use GNU Stow prior to finding this program. All I remember is that it is way simpler, and it seemed to be perfect for my use case, no more and no less than what I needed. Maybe it works for you, too.

guys, STOW + SYNCTHING is the way to go for me.

Anyone move from thoughtbot's rcm to stow? How do they compare?

yadm is better.
