Many programs (like git) respect the environment variable $XDG_CONFIG_HOME, which allows you to set the location of your 'dotfiles' to a different directory session-wide. This means you don't need to make your entire home directory a single git repo.
My dotfiles repo (https://github.com/fredsmith/dotfiles/) uses this to maintain a clean, readable configuration directory that doesn't mess up my homedir with a bunch of symlinks, and the only installation I have to do is: "ln -s ~/dotfiles/bashrc .bashrc"
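To illustrate the idea, a minimal sketch of the XDG approach (the `~/dotfiles/config` path is an assumption; use whatever layout your repo has):

```shell
#!/bin/sh
# Point XDG-aware programs at a config directory inside the dotfiles repo.
export XDG_CONFIG_HOME="$HOME/dotfiles/config"

# git, for example, reads $XDG_CONFIG_HOME/git/config when ~/.gitconfig
# is absent -- so that file can live in the repo with everything else.
mkdir -p "$XDG_CONFIG_HOME/git"
echo "git config now lives in: $XDG_CONFIG_HOME/git/config"
```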
I've been using rcm (https://github.com/thoughtbot/rcm) for managing my dotfiles and strongly recommend it. It works by symlinking everything from your dotfiles directory to your home directory and gets out of the way. You can also do more advanced stuff like run "hook" scripts at first install or only install a subset of dotfiles using tags. Here's my rcm repo in case anyone is curious: https://github.com/olalonde/dotfiles
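For reference, a typical rcm workflow looks roughly like this (the commands are rcm's own; the `work` tag name is a made-up example):

```shell
# Adopt an existing dotfile into ~/.dotfiles (rcm's default directory)
mkrc ~/.vimrc

# Symlink everything from ~/.dotfiles into $HOME
rcup

# Install only the files tagged "work" (tag name is hypothetical)
rcup -t work

# List what would be linked, without touching anything
lsrc
```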
I think it works particularly well when you have a dotfiles setup where specific pieces differ by host but the rest is largely the same. For example, I've broken my zshrc, bashrc, and gitconfig into common parts and host-specific parts to maximize DRY and maintainability.
It's also on GitHub if anyone's curious about the setup
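One common way to do the common/host-specific split, sourced from your shell rc file (the directory layout and file names below are assumptions):

```shell
#!/bin/sh
# Shared settings first, then per-host overrides if a file for this
# host exists. Layout under ~/dotfiles/shell is illustrative.
common="$HOME/dotfiles/shell/common.sh"
[ -r "$common" ] && . "$common"

host_rc="$HOME/dotfiles/shell/hosts/$(uname -n).sh"
[ -r "$host_rc" ] && . "$host_rc"

echo "host-specific file would be: $host_rc"
```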
Even better: place your dotfiles in a public repo on GitHub with an open license, so that everybody can learn from it and contribute improvements/fixes. I've seen immense value from this approach.
Yeah, I wouldn't waste a private repo on dotfiles; however, it's important to maintain a .gitignore that excludes locations such as .bash_history, .ssh, and so on. You wouldn't believe how many private keys, IP addresses, and user IDs are exposed on GitHub because people don't practice due diligence with their public dotfiles repos.
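A starting point for such a .gitignore (these are common offenders, not an exhaustive list, and the `*.local` pattern is a hypothetical convention for machine-local secrets):

```
# Never commit shell history or credentials
.bash_history
.zsh_history
.ssh/
.gnupg/
.netrc
.aws/
# Machine-local secrets kept out of the public repo
*.local
```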
I've never worked with a public GitHub repository, so I probably just don't understand; but how do you prevent disaster when allowing other people to modify your configuration? Setting aside malicious modifications, no one else knows your setup like you do, and it seems like it would be easy to inadvertently get yourself into an unrecoverable state.
Just because you're posting the code publicly doesn't mean anyone can change your copy willy-nilly. There's a pull request process to go through, something that requires positive consent to do.
I have a separate folder with the git checkout, then use symlinks to connect the actual dotfile locations. I think the author may have alluded to the same thing:
> I can quickly clone the git repository and fire up my bootstrap script to wire all the symlinks, aliases and scripts
Though personally I don't use symlinks for my scripts (only my dotfiles) - for my scripts I simply add my git checkout to my $PATH
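A sketch of that arrangement (the repo path and file names are assumptions; the `.bashrc.example` target keeps the demo from clobbering a real rc file):

```shell
#!/bin/sh
# Symlink dotfiles into place, but run scripts straight from the checkout.
repo="$HOME/src/dotfiles"
mkdir -p "$repo/bin"

# Dotfiles get symlinked (here: just a bashrc)...
touch "$repo/bashrc"
ln -sf "$repo/bashrc" "$HOME/.bashrc.example"

# ...while scripts are found via PATH, no links needed.
export PATH="$repo/bin:$PATH"
```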
This is what I tend to do, along with a script that runs all the required install (rpm/apt generally) and stow commands for everything I use.
It does mean I have to update it whenever I add a new application to my default setup, or an application/tool has a new configuration layout. But those are pretty rare.
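For anyone unfamiliar with GNU Stow, the commands in such a script look roughly like this (the package names vim, zsh, tmux are examples; each is a subdirectory of ~/dotfiles that mirrors the layout it should have in $HOME):

```shell
cd ~/dotfiles
stow -t "$HOME" vim zsh       # symlink vim/ and zsh/ into $HOME
stow -n -v -t "$HOME" tmux    # dry-run: preview what would be linked
stow -D -t "$HOME" zsh        # remove the zsh symlinks again
```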
Like others, I wrote a tool to manage "layers" of configuration and personalization files (https://github.com/dcreemer/wsup). "wsup" lets me compose various work configurations, personal dotfiles (https://github.com/dcreemer/dotfiles), dot-emacs files (https://github.com/dcreemer/dotemacs) and others from multiple git repositories.
The project was born from my need to manage multiple changing personal and work environments. Now I can go from zero to full environment with a couple of commands.
Nowadays I simply rsync my dotfiles to wherever I need them. I have multiple trees in a repository, one for secure boxes, one for client boxes, one for boxes with an xserver, etc. This gives me the most flexibility, and I can install everything I need if I simply have ssh access. If I don't, I can create a zip file of the trees I want.
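Something like the following, presumably (tree names, host, and archive contents are placeholders):

```shell
#!/bin/sh
# Push the trees this box needs over ssh...
rsync -avz ~/dotfiles/common/ ~/dotfiles/xserver/ user@host:~/

# ...or, without ssh access, bundle them into an archive instead.
zip -r dotfiles.zip ~/dotfiles/common ~/dotfiles/secure
```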
I wrote a bare bones "store-and-fetch" [dotfile tool][1] that's been working well for me, mainly because it's easy to set up on a new device and it delegates a lot of functionality to GitHub. It's more or less equivalent to the "Git repo in home" approach, it just streamlines the `git commit`, `git push`, `git pull` workflow.
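The streamlining could be sketched like this -- to be clear, a hypothetical wrapper in the same spirit, not the linked tool's actual interface (the ~/.dotfiles path and function names are made up):

```shell
#!/bin/sh
# Wrap the commit/push/pull cycle in two short commands.
DOTDIR="${DOTDIR:-$HOME/.dotfiles}"

dot_store() {                       # stage, commit, and push everything
    git -C "$DOTDIR" add -A
    git -C "$DOTDIR" commit -m "${1:-sync}"
    git -C "$DOTDIR" push
}

dot_fetch() {                       # pull the latest from the remote
    git -C "$DOTDIR" pull --rebase
}
```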
The situation can be a bit shakier with GUI programs. I like it when they offer an explicit way to export & import configuration (Since what's findable on disk just by searching is not always reliably portable IME). E.g. for Terminal.app on OS X, which I have configured fairly extensively, I manually export a .plist file and put that in my git repo for safe keeping.
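The export/import round trip can also be done from the command line with `defaults` (macOS only; the repo path is an assumption):

```shell
# Snapshot Terminal.app's settings into the repo...
defaults export com.apple.Terminal ~/dotfiles/osx/Terminal.plist

# ...and restore them on a fresh machine.
defaults import com.apple.Terminal ~/dotfiles/osx/Terminal.plist
```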
I've been going a step further than this. Not only revision controlling my dotfiles but also using Ansible to automate the process of installing all the apps I want, getting the environment in order, etc. Emacs and Mercurial have both changed my programming life and I think Ansible is a new addition to that group.
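With Ansible, bringing up a fresh machine can be a single command via `ansible-pull`, which clones a repo and runs a playbook from it (the URL and playbook name here are placeholders):

```shell
ansible-pull -U https://github.com/you/machine-setup.git local.yml
```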
The linked article is bizarre[1], but brings up an important topic. It seems like most people are using this as a platform to plug/suggest their favorite way to manage dotfiles, so I will follow suit. I'm a big fan of vcsh and mr:
[1]: You could deck me out with a Klein Tools bag full of their wares and I would still have no business being in front of your breaker panel. A craftsman is so much more than his collection of physical things. Is SEO the answer to why the article draws a connection between the craftsman and the streamlined factory, or why every tenth word needed to be emphasized?
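For the curious, the basic vcsh + mr workflow looks roughly like this (repo names and the URL are placeholders; vcsh overlays several bare git repos on $HOME, and mr batch-updates everything listed in ~/.mrconfig):

```shell
vcsh init shell                       # new repo tracking files in $HOME
vcsh shell add ~/.bashrc ~/.profile
vcsh shell commit -m "track shell config"

vcsh clone https://github.com/you/vim.git vim   # fetch another layer
mr update                             # update every registered repo
```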
A craftsman is absolutely more than their tools, but on the other hand, they can't do their craft without their tools. Also, having their tools laid out efficiently greatly improves their ability to do work efficiently.
An interesting example of this is provided by Adam Savage - look up anything he's written or said about "first order availability", and about his toolboxes he created and maintained while working at ILM.
If anyone has resources or is willing to share their knowledge of OS X System Daemons and Agents: spotting and unloading the unnecessary ones is the main aspect of my dotfiles[1], but resources are often elusive, outdated, or plain inaccurate guesswork. The whole process usually takes a lot of reading binary metadata and sometimes plain trial & error (with the occasional frustration that follows).
But then, I managed to save around 2 GB of RAM just by unloading daemons.
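The basic unloading mechanism, for anyone who wants to experiment (macOS only; the `com.example.unneeded` label is illustrative -- unloading the wrong daemon can break things, so research each one first):

```shell
launchctl list | less                  # see what's currently loaded

# -w makes the change persist across reboots
sudo launchctl unload -w \
    /System/Library/LaunchDaemons/com.example.unneeded.plist
```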
If anyone is interested in chipping in, I'd be thankful for any link or suggestion: