
Ask HN: How do you keep your multiple workstations in sync? - rsto
During a regular work week I work a couple of days in a co-working space, the other times either from home or at a client site - and I’m tired of carrying around my laptop all the time.<p>My dream setup would consist of a couple of low-fi workstations, say a bunch of Intel NUCs or the like, one at each site. Then, whenever I leave in the middle of work at one site, I can go to the other and pick up from there. However, due to sub-optimal internet connectivity I can’t use thin clients with a cloud-based X server. At each work location, I’d need to run my jobs on that machine (either bare metal, or virtualised). Is there a way to e.g. synchronise one virtual machine image across the workstations when I switch locations? I wouldn’t need to run two workstations at once. Bonus points if a program or configuration that I install on one machine also shows up on the other workstations. I have no special requirements; any Unix-like OS with a window system is OK.<p>Is anyone of you running such a setup? Or have you found a similar solution? I have that nagging feeling I can’t see the forest for the trees, given that I haven’t kept up with IT setups since 1999. Thanks!
======
davmre
A couple of ideas that don't exactly fit your question but might be relevant
to others in a similar position:

\- You could buy a more portable laptop that you wouldn't mind carrying
around, for example the new Macbook Retina or an MS Surface.

\- Even a poor Internet connection might still be good enough for command-line
work, and you can get a lot done on the command line. I've had great success
keeping all my code and data on a central server, and editing code either
directly in command-line Emacs via mosh
([https://mosh.mit.edu](https://mosh.mit.edu)) or in a local Emacs instance
using tramp
([http://www.gnu.org/software/tramp/](http://www.gnu.org/software/tramp/)) to
save changes directly to the server. This has the advantage that I can start
long-running jobs that persist on the server even as I'm moving between
locations. For data science work, IPython Notebook is also very easy to set up
for remote web access.
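Sketched as a tiny helper, that workflow might look like this (the host name and file are placeholders, and this assumes mosh and Emacs are installed on both ends):

```shell
# Open a command-line Emacs on the server over mosh; mosh keeps the
# session alive across IP changes, sleep/resume, and flaky links.
remote_emacs() {
    host=$1
    file=${2:-.}
    mosh "$host" -- emacs -nw "$file"
}
# Usage (hypothetical host):
#   remote_emacs devbox.example.com project/notes.org
# Or edit the same files from a local Emacs via tramp:
#   emacs /ssh:devbox.example.com:project/notes.org
```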

~~~
lgbr
> You could buy a more portable laptop that you wouldn't mind carrying around,
> for example the new Macbook Retina or an MS Surface.

Here I think you're setting yourself up for failure. What happens when said
laptop is stolen, has a hard drive die, or you forget your encryption key? If
it's more convenient for him to have multiple workstations, then why not also
make sure he maintains a good practice of having his working data synchronized
and therefore backed up?

~~~
davmre
Sure, everyone using a laptop (or desktop!) should run regular backups. I
think that's mostly orthogonal to the portability solutions being discussed
here, though. For example, the VM-on-a-USB-stick solution would also require a
separate backup mechanism.

------
detaro
Biggest issue I see with syncing system images/full file systems around is
that you can't easily do a merge: If you arrive at a new location and for some
reason the sync with the latest changes isn't completed yet, you probably
wouldn't want to wait for that to complete before you can safely work.

For that reason, I probably would go with only syncing home/data dirs, and
maybe doing so explicitly or using something like git-annex. If you want to
sync installed packages and system config, maybe force yourself to apply those
only with scripts or a config management tool like puppet or ansible.
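A minimal sketch of that "only via scripts" discipline, assuming a Debian-style system (the package list is illustrative):

```shell
# Kept in a git repo: every package install goes through this function,
# so any workstation can be replayed to the same state later.
PACKAGES="git tmux emacs unison"

install_packages() {
    # With no arguments, install the canonical list.
    if [ $# -eq 0 ]; then
        set -- $PACKAGES
    fi
    sudo apt-get update
    sudo apt-get install -y "$@"
}
```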

But there are options to avoid online sync: If your system fits in 1TB or
less, you could carry only an external 2.5" SSD (or maybe even just a USB key)
around and boot all machines off that. Or even carry an entire NUC; they're
still more compact than a laptop.

~~~
rsto
Ah, the USB key suggested by you and i336_ is beautifully simple! I hadn't
thought about that (but did come close by looking at the Compute Stick, which
unfortunately is under-specced for my requirements). Thanks for this.

~~~
star_sailor
Just make sure you back the sticks up. USB sticks have a habit of dying
without warning.

~~~
rsto
Yeah, I thought more like: using the USB stick to carry my image around, but
transferring it onto the local workstation. That might take a couple of
minutes at the start and end of work at one location, but would give me better
performance (and some redundancy).

------
i336_
One incredibly simple-minded possibility could be to keep VirtualBox or Qemu
VM disk images on a USB HDD, and suspend the VMs to the HDD to move between
machines.

Also, since it's better to err on the side of saying something irrelevant than
not saying something at all, check out [http://xpra.org/](http://xpra.org/).
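Assuming VirtualBox, the leave-one-machine step could be a one-liner (the VM name is a placeholder):

```shell
# Suspend the VM's running state to its disk image on the USB drive;
# it can then be resumed on another machine (ideally with the same
# VirtualBox version on both ends).
park_vm() {
    VBoxManage controlvm "$1" savestate
    sync    # flush writes before unplugging the drive
}
# On the next machine:
#   VBoxManage startvm workvm
```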

~~~
rsto
Yes, the USB image is simple enough that I'll just have to give it a try. I
didn't know about xpra, which could come in handy from time to time. Thanks.

------
corford
Depending on exactly what it is you're doing, maybe keep a master vagrant file
that describes your dev environment (and every time you install a new app or
modify the env, update the vagrant file accordingly)? Then on each workstation
you use, you can just spin up a dev vm from the vagrant file.

That's one way to quickly replicate your env across workstations. Then all you
need is git/rsync/btsync + a few bash scripts to keep your data in sync.
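A sketch of that flow, assuming the Vagrantfile lives in a git repo (the repo URL is a placeholder):

```shell
# Clone the environment description (once per workstation) and boot
# the identical dev VM from it.
spin_up() {
    repo=${1:-https://example.com/me/dev-env.git}
    dir=$(basename "$repo" .git)
    if [ ! -d "$dir" ]; then
        git clone "$repo" "$dir"
    fi
    (cd "$dir" && vagrant up && vagrant ssh)
}
```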

------
javert
I have an X1 Carbon thinkpad. It's not a problem at all to carry around---very
light and thin. I have separate mouse+monitor setups in different places (e.g.
office, home). So I just plug it in wherever I am. I keep the monitor and
mouse plugs right there on the desk(s), so it takes two seconds to plug them
in.

I even have multiple separate power adapters. I have one in my kitchen and
another in my bedroom, for instance. They are literally 15 feet away. Totally
worth the $70 or whatever since I can easily move back and forth multiple
times a day.

Carrying around and dealing with a power adapter is 60 to 70% of the hassle of
carrying a laptop and that can mostly be eliminated.

Oh, I also have a separate pair of headphones in each location AND one that
always stays in my bookbag.

------
NamTaf
My company gives employees laptops with Cisco AnyConnect. It basically punches
a VPN back to corporate over any HTTPS connection it can get its hands on,
which means that it can map all of your network drives, citrix apps run fine,
etc.

By encouraging everyone to work off network drives, everything tends to stay
in sync. Other solutions include having a server-located user data folder
(desktop, documents, etc. - but that slows down for regional sites or mobile
data if you load it up too much) and horrors like sharepoint or other remote
content storage programs (HP records manager, etc.).

For code, I run a git server on a network share and point visual studio at it.
My code gets committed to it and I can see it from any device on the network,
including via VPN.

------
mhw
zerotier-one ([https://www.zerotier.com/](https://www.zerotier.com/)) is
useful to create a consistent network between individual nodes in different
places. You can then use consistent hostnames to connect to a given node on
the zerotier network instead of having to think about where a particular node
is connected.

I've tried using BTSync and syncthing
([https://syncthing.net/](https://syncthing.net/)) to automatically keep
directories in sync with each other, but there's quite an overhead to have
real-time change propagation and it doesn't seem to add much value if you
aren't actually using the remote nodes at the same time. Instead I've reverted
to using Unison
([http://www.cis.upenn.edu/~bcpierce/unison/](http://www.cis.upenn.edu/~bcpierce/unison/))
when I've finished working in one location and want to push those changes to
another node.
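The push step when leaving a site might look like this (the host and directory are examples):

```shell
# One-shot sync of the work directory to the other node; -batch skips
# interactive prompts and -prefer newer resolves any conflicts toward
# the newest copy.
leave_site() {
    unison -batch -prefer newer "$HOME/work" "ssh://othernode/work"
}
```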

------
neya
I'm a UI guy + Rails developer, and I own 3 Macs (1 Mini and 2 MBPs). For most
of the stuff, I simply use iCloud. For the actual source code of what I work
with, I use Google Drive w/ Git, and I cd into it no matter which system I'm
working with.

Actually, Macs are quite enjoyable in the sense that everything else is taken
care of by Apple - the App Store handles all my software installs and
purchases between computers, iTunes handles all my music collection and
ensures all 3 Macs have the same tracks (I use Apple Music w/ Match), and
iCloud makes sure all my docs are available from any device I use
(iPad/iPhone/etc.)

As for browsing and bookmarks, I use Chrome.

------
anotherevan
I work with a linux machine at home, and windows 7 at the office. Here’s some
of the bits and pieces I use to keep data synchronised as needed.

LastPass and XMarks to keep passwords and bookmarks synced across both (and
across browsers as I use both Chrome and Firefox).

Dropbox. In particular, I run Pidgin for my instant messaging (with
skype4pidgin) on an encfs-encrypted FUSE file system. (On the Windows side, I
use Boxcryptor Classic for encfs.) I also use Dropbox to push repositories
back and forth (I use Mercurial for that, but Git would be just as good.) If I
didn’t need to use Dropbox for sharing some common folders with others, I
would consider SpiderOak and then not worry about encfs.

On the work machine, I have a fairly large TrueCrypt file container that holds
my Firefox profile and Chrome user data, Dropbox folder and any other data I
don’t want other users of the computer to be able to access when I’m not
there.

I can also ssh to my home machine from the office, which is handy if I’ve
forgotten something. I also use it for running rsync and/or unison as
appropriate on some data (got to keep that office music collection in sync!).
I use Cygwin on Windows for the rsync and unison clients (as well as a bash
terminal).

I can also ssh to linux machines at work from home, and use x2go to
rdesktop/vnc to office machines when I need to occasionally. Have found this
much faster than alternatives (rdesktop/vnc over VPN, or xpra).

Hope that gives some ideas of things you could try if a full VM image
shuttling back and forth does not work for you.

------
lewisl9029
I personally can't wait for 1) OSes that can move seamlessly from phone UI to
full desktop UI depending on the size of connected screens, and 2) hardware
that can handle heavy dev work (multiple VMs, lots of random I/O) that fits in
a smartphone form factor and power envelope.

Then sync would no longer be necessary because all you would ever need to use
is that 1 unified device.

Until then, I use network shares on a cloud server through VPN for media, and
Git and Vagrant/Docker for dev stuff.

~~~
andreineculau
Basically Ubuntu Edge [https://www.indiegogo.com/projects/ubuntu-
edge](https://www.indiegogo.com/projects/ubuntu-edge) , or do you know of
another similar initiative?

~~~
lewisl9029
Yep, I backed one of those. Too bad it didn't work out...

------
flarg
Welcome to the club! I've tried thinking about this for many many years...

Your best bet is to run your workstation in a VM, put your data files on a
mount from the underlying file system, and then run Unison file sync across
the two file sets (the VM image(s) and the files).

Putting all your files inside the VM will work as well - but copy times will
be longer because the whole thing will be bigger.

With your bandwidth limitations anything magical is out of the question; live
file sync is an attractive idea but it's too slow for large file sets -
Dropbox, Lsyncd, Git Annex - all suffer when dealing with a lot of files
across low bandwidth. Unison, on the other hand, works well with a USB key as
the transport mechanism - and only runs when you ask it to - so it doesn't
keep transferring files as you change them (which seems like a good idea - but
is really a PITA when you're not actually sharing files with other people).

Going a bit fantastical here - but IMHO the gold standard is block-level sync
with DRBD - which will keep two or more systems identical at all times; but
it's too heavyweight for your purposes.

More practically, just install the same software on both your workstations
and sync your data and dotfiles (e.g. your home directory) with Unison.

This is what I have been doing for over 5 years - and it works really well.

------
davesque
Git. Also, occasionally doing work with tmux on a Linode. I really like the VM
disk image on a USB that some people are suggesting though. And, again, if you
used tmux or screen, it would make resuming work easier.

~~~
guyromm
If you don't want to send your unencrypted data to places you do not control -
use git-annex for the heavy binaries (photos, PDFs, videos, music) and just
sync between your instances. Though git annex can do end-to-end encryption and
upload to Dropbox/gdrive as well.

Pure Git is great for consolidating configuration as well as keeping plaintext
format files in sync. Invaluable if you are an org-mode or markdown jockey.
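A minimal git-annex sketch (the description string and paths are examples):

```shell
# Start tracking heavy binaries in an existing git repo; file contents
# are only copied between your own machines when you ask for them.
annex_setup() {
    git annex init "${1:-home-desktop}"
    git annex add photos/ videos/
    git commit -m "track media with git-annex"
}
# On the other machine, later:
#   git pull && git annex get photos/   # fetch only what you need
```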

~~~
rsto
Yes, git-annex looks nifty, and its prototypical Nomad use case on the
website comes close to what I am looking for (save for the system files,
executables, etc.)

------
viraptor
What exactly do you want to sync? Is there any data, or is it only your
environment?

Almost all the ideas so far mention moving the whole thing with you (usb
sticks with system or vm), or working remotely (X / mosh / similar).

There's another possibility if you don't require any data (or only
static-like data) to be migrated - use a system which can be updated/rebuilt
in sync. Set up a salt/chef/puppet server and make sure your local machines
sync to it periodically. Whenever you need some modification, you implement
it on your server and push.

Advanced version - implement it over something like NixOS for consistent
builds.

Pros: anything fails - just rebuild it (even remotely if you can set up the
satellite servers to boot over network); you won't forget your physical media;
you can update all machines at once; you can store all configs in a repo with
full history; if you need per-site specialisations, templates are great for it

Cons: if you need to sync any large chunks of data, this is not a good idea
anymore
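The periodic-sync half can be a cron job on each workstation; a masterless puppet variant might look like this (the repo path and manifest names are hypothetical):

```shell
# Pull the shared config repo and apply it locally, so a change pushed
# from any site reaches every workstation on the next run.
apply_config() {
    repo=${1:-/srv/config}
    cd "$repo" || return 1
    git pull --ff-only
    puppet apply --modulepath=modules manifests/site.pp
}
# crontab entry, every 30 minutes:
#   */30 * * * * /usr/local/bin/apply-config
```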

~~~
rsto
I haven't looked much into salt/chef yet. The "anything fails" approach is
intriguing, though. Since my data is mostly source and config, a central
puppet server plus cloning my work products from a Git repo would work.

The only drawback I could see is if "implement [a modification] on your
server and push" turned out to make trying out libraries and programs a
hassle. But that's just me speaking without any hands-on experience with
these tools. Thanks.

------
francisb07
I use a MacBook Air and an iMac 5K at work. I keep these heavily synced with
cloud-based solutions like GDrive and Evernote, plus VPN access to the Hadoop
clusters I work with.

Config-wise, I just built a shell script that I ran on both machines to
install the same packages and libs I use for work.

For my startup, we use something like this
[https://github.com/thoughtbot/laptop](https://github.com/thoughtbot/laptop)
(modified to our specs of course) to keep our machines all synced up. I'm
thinking of using Docker or Vagrant to consistently provision machines with
regular updates, though. That still has to be explored.

------
dafink
I use Unison file sync:
[https://www.cis.upenn.edu/~bcpierce/unison/](https://www.cis.upenn.edu/~bcpierce/unison/)

Does exactly what it says on the tin!

------
akurilin
All of my machines are Linux, so I like to use ansible for this. I can sync
my dotfile git repos, configure the OS how I need it, automate syncing vim
plugins, and get all of the PPAs I need. This doesn't quite solve the problem
of half-finished work on one machine being available on a different machine
(unless git counts), but at least your systems are now set up identically
everywhere. Another option is to work remoted into a VM somewhere in the
cloud very close to you (to minimize latency); I've heard of people doing
that.

------
ivan_gammel
I'm a manager who sometimes does dev work, so I have a lot of different data
to sync. Here are my solutions:

1\. Personal - OneDrive for documents, Yandex.Mail + IMAP sync for mail on
devices, GitHub for code, OneNote for quick notes on my cell phone, and
Confluence on Atlassian Cloud. I also used Mindmeister.com for a while for
storing mind maps online (unfortunately they don't have an app for Windows).

2\. Corporate - MS Exchange and lots of services behind VPN (Git, Confluence,
SharePoint, to name a few).

I sync between several devices:

1\. Corporate desktop workstation (the most powerful beast for working with
code, Windows)

2\. Corporate laptop (Windows)

3\. Personal tablet (Windows)

4\. Personal cell phone (Windows)

Most of the syncing (except Git) happens automatically, so I don't worry
about pushing changes to a server - I just open the document/email on my
phone that I've recently edited on the tablet or laptop. The trickiest part
is the development environment setup (IDE settings, app/web server configs,
build system configs, etc.): I've unified the setup across all my devices, so
it is easier to transfer it with scripting (I'm using PowerShell). I've also
set simplicity of dev environment setup as a team goal at work, so everyone
can pick up the code on a bare machine and be ready to debug in 10-15 minutes
max. Most of this is achieved by following conventions, either standard ones
or those recommended by the software vendor, and by using the right tool
rather than a generic solution. Even though I'm using Windows, half of my
team works the same way on Macs. I'm pretty sure this is also possible on
Linux.

------
aruggirello
USB keys are known to be too unreliable in the long run, so my advice is a
big STAY AWAY from them; but I can testify that an external box, connected
via a USB2/USB3 cord, plus an (Intel 530 series) SSD can do wonders as a
portable, full GNU/Linux system (Kubuntu in my case). It boots to desktop in
under 1 minute even over USB2 (USB3 is a lot faster), and is very usable
despite the USB bottleneck ( _way_ better than any USB stick). 120 GB, even
after installing a full distro, leaves you with plenty of space to carry
along your favorite music and virtual hard drive images, so you can carry
along your Windows virtual machine(s) intact and already configured [1]. You
can then use TimeShift (or manual rsync cron jobs) to back up your full
system to local drives/USB keys, and sudo apt-get upgrade the system whenever
you get a reliable internet connection.

Minor issues I have experienced with this setup:

1\. The biggest issue of all is that, due to the peculiarity of USB power
design, most SSDs might experience failures when the machine is powered off.
This is a known issue with USB keys too! Believe me, booting a system every
time not knowing if it's going to end in the dreaded fsck mess with thousands
of errors really is a nightmare. Going with the Intel 530 series SSDs (which
is a known solution) fixed the issue, as they have larger capacitors, though
Samsung SSDs might be reliable too.

2\. USB power: on most laptops, your single USB power cord is unable to power
the external drive; you might have to use the popular Y (double connector) USB
cable.

3\. Stay with open source video drivers. Especially swapping nVidia <-> AMD
drivers is a recipe for disaster.

4\. Audio HW devices will stack up, possibly resetting their settings when you
return to a previous machine.

5\. You might sometimes have issues with UEFI when you first boot your system
on a new machine.

6\. VirtualBox VMs sometimes have issues when the system unexpectedly swaps
the eth0 <-> eth1 interface names, but this is easily fixed in the network
configuration before booting them. Also, swapping between AMD and Intel
processors is risky and can unexpectedly hang your Windows system on boot.
BTW, don't forget to carry along on your external drive every hard disk image
the VMs might use, or they won't boot. Of course this applies to Vagrant
boxes too, but it is less relevant as you can destroy/recreate the box any
time.

------
kriro
I have a MBP, desktop (Xubuntu), netbook I rarely use (Xubuntu) and Android
phone + personal webspace (just use it for a simple wordpress and
owncloud+mail accounts). My setup is pretty consumer/not very sophisticated. I
recently removed my last Windows VM but will probably create a new one soonish
because there's the occasional Windows-only thing I need to do. That VM is
just kept up to date and has antivirus etc., but no syncing with the other
machines (just a shared folder with the host).

\- GitHub/GitLab

\- owncloud with personal "webspace" for data, especially important for
Android (+ a photobucket account for images for random funny forum posts)

\- Zotero for Firefox (with said owncloud) for my academic work. This is
worth its weight in gold. I prefer to work with LaTeX but you are often
forced to use Word/LO, and the plugins are amazing (instant
citation/bibliography; snap-switch to a different format).

\- Thunderbird (and Outlook on OSX, meh but needed/convenient for work would
prefer 100% TB)

\- Firefox sync (not really using it anymore but probably should)

\- KeePass to store my passwords across systems

The overlap between the machines isn't huge and I mostly just keep them up to
date. The stuff I use on multiple machines is on the same version if possible
but not automated in any way. My personal backup strategy is pretty horrible
(I just randomly put important stuff on external storage); I need to up my
game there. I'm not even keeping my configuration files etc. on GitHub as of
now (planned for the near future). tl;dr: need better backups, need scripted
OS sync.

Note: I don't use my phone all that much for stuff I can do on other machines
(don't mobile browse a lot). It's mostly an IM/call platform with occasional
data use.

------
ajhit406
I've been working on customized EC2 instances, DO droplets and Nitrous
([https://pro.nitrous.io](https://pro.nitrous.io)) since 2012 and haven't
looked back. There is the issue of connectivity, but I'm unproductive without
an internet connection so it has worked well for me.

I use tmux and then connect to the session from work, home, etc... and setup
ssh config ([http://nerderati.com/2011/03/17/simplify-your-life-with-
an-s...](http://nerderati.com/2011/03/17/simplify-your-life-with-an-ssh-
config-file/)) so I have shortcuts to all of my remote machines. App
environments are built with docker, snapshotted, and pushed to dockerhub.
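The ssh config shortcut mentioned above might look like this (host details are placeholders; RequestTTY/RemoteCommand need a reasonably recent OpenSSH):

```shell
# Append a "dev" alias to the ssh config, so a plain "ssh dev"
# reattaches the remote tmux session in one step.
add_ssh_shortcut() {
    cat >> "${1:-$HOME/.ssh/config}" <<'EOF'
Host dev
    HostName 203.0.113.10
    User me
    RequestTTY yes
    RemoteCommand tmux new-session -A -s main
EOF
}
```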

------
jmnicolas
If you can use Microsoft and provided you have USB 3 ports on your
workstations, Windows To Go could be a good fit.

[https://en.wikipedia.org/wiki/Windows_To_Go](https://en.wikipedia.org/wiki/Windows_To_Go)

------
haser_au
You could always just use your phone as your desktop.

[http://www.andromiumos.com/](http://www.andromiumos.com/)

[http://thenextweb.com/microsoft/2015/04/29/windows-10-will-l...](http://thenextweb.com/microsoft/2015/04/29/windows-10-will-
let-you-use-your-phone-as-a-full-computer-sort-of/)

[http://www.gizmodo.co.uk/2012/02/who-needs-a-pc-when-you-
can...](http://www.gizmodo.co.uk/2012/02/who-needs-a-pc-when-you-can-just-
hook-up-your-android-phone-to-a-monitor-and-keyboard/)

Ubuntu Edge (no longer available)

------
StavrosK
I keep my dotfiles in git, and I was setting up my new laptop the other day,
so I figured I'd do it right. I added an Ansible file to provision the laptop
from scratch. It installs all my packages, tools, preferences, etc, and it
also goes and gets my repos from a server and puts them in the right places.

It's idempotent, so whenever I need to make a config change, I just change
that file and run "provision", and everything is the same across machines. I'm
not sure if this answers all of your questions, but it's good enough for work
(personal data can go on something like Dropbox or Syncthing).
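That "provision" command is typically just a thin wrapper around ansible-playbook run against the local machine (the playbook name is an example):

```shell
# Safe to re-run at any time: the playbook is idempotent, so packages
# and configs already in the desired state are left untouched.
provision() {
    ansible-playbook -i "localhost," -c local laptop.yml
}
```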

------
hayksaakian
google drive to sync documents, normal files, etc.

git for code.

then i have VMs on EC2 for specific tasks.

>tired of carrying a laptop

i solved this with a lighter weight laptop.

you can usually leave your multiple displays wherever you need them, and plug
into the laptop.

edit:

looking back, i'm very invested in google's ecosystem of "synced" things:
google apps.

it's almost to a 90's microsoft extent -- where i will actively avoid and
discourage incompatible formats or platforms.

at the very least you can use any of google's things with an OK computer and
an internet connection.

that leaves me wondering what will "obsolete" the "google office"

------
Gustomaximus
The key ones for me that let me leave the laptop at home:

\- Outlook: MS Exchange does the syncing of email/calendar/tasks (I use Nine
for my mobile mail client).

\- GDrive: General file storage/accessibility.

\- Chrome browser: Syncs tabs/history. I use Tab Cloud to save sessions
across Chrome too.

\- OneNote: I use this for meeting notes and brainstorming.

\- Google Keep: I use this like sticky notes that are accessible across
devices.

With this mix I can easily move between my laptops/mobile devices as needed.

------
sgtpep
I've tried a few approaches, which worked quite well:

\- Linux on a DigitalOcean droplet through ssh/mosh + tmux (requires CLI/TUI
software and a stable network connection)

\- a full Linux system on a USB 3.0 flash drive (requires a reliable backup
configuration)

Now I use similar Linux desktops on each desktop/laptop with:

\- notes syncing using unison

\- mail stored on mail server and accessed through IMAP client (mutt)

\- dotfiles managed with git repository

\- projects managed with various VCSes

\- sensitive data managed with pass/gpg

------
geocar
I got tired of carrying a laptop around, so I got a smaller laptop.

You sound like you would do well with that idea, but maybe it takes the form
of an external (e.g. USB) storage device that you store virtual machines on
_but don't forget about backups_. Perhaps you have a script that snapshots
and rsyncs diffs between your storage and your machine, so that each
machine's local storage can also be your backup.
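One concrete reading of "snapshot and rsync diffs" is rsync's hard-link snapshots, where files unchanged since the last snapshot cost no extra space (paths are examples):

```shell
# Copy the USB drive's contents into a dated snapshot directory;
# --link-dest hard-links any file identical to the previous snapshot
# instead of copying it again.
snapshot() {
    src=${1:-/media/usb/vms}
    dest=${2:-$HOME/backups}
    stamp=$(date +%Y-%m-%d_%H%M)
    rsync -a --link-dest="$dest/latest" "$src/" "$dest/$stamp/"
    ln -sfn "$dest/$stamp" "$dest/latest"
}
```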

------
blfr
I tried running a home server with data, running a home server and forwarding
X from there over ssh, limiting my needs to CLI (tmux, irssi, vim, etc) and
running everything on a remote server, using built-in sync for various apps
(like notes)...

None of it ever worked well so I bought a nice leather messenger bag and
learned to like carrying my laptop around and using the small (13") screen
most of the time.

------
citrons
For my notes I use private gists or Evernote. For projects I use BitBucket or
GitHub. For syncing program configurations I use a script I wrote myself that
creates symlinks to the hard drive from a Google Drive folder. Chrome does a
decent job of syncing all the web-related stuff for me.

Not perfect but it is ok.

------
NietTim
You can get around the internet limitation by using a USB stick with a live
Linux environment and booting from that. The downside is that when you lose
the stick, you're boned.

------
8note
With Windows, I've done very well with GoodSync. It tries very hard to keep
things synchronised nicely, including when the connection between clients is
patchy.

------
sneak
syncthing is wonderful.

~~~
miscellaneous
+1 Syncthing is super handy for keeping my core files (~10GB) synchronized
between all my devices.

It's more convenient than cloud storage in that I don't have to worry about
sensitive information being stored on remote servers, but obviously it is
restricted by the storage capacity of each device.

------
cpursley
KISS: Have just one workstation (laptop)

------
atmosx
What about bringing your laptop to work?

------
jkot
Swapping an SSD with the OS and data.

------
rando289
Unison. Know where your data is. Don't sync OSes; instead, keep their setup
in scripts. You can get a faster sync if you can say which side is newer, by
using btrfs snapshots over the network.
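A sketch of the btrfs approach, assuming you already know the local side is strictly newer (subvolume paths and the host name are placeholders):

```shell
# Take a new read-only snapshot, then ship only the delta between it
# and the previous snapshot both sides already share.
send_delta() {
    old=$1     # last snapshot both machines already have
    new=$2     # fresh read-only snapshot to create and send
    remote=$3
    btrfs subvolume snapshot -r /home "$new"
    btrfs send -p "$old" "$new" | ssh "$remote" btrfs receive /home/.snaps/
}
```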

