
I used to be like this... crazy about dotfiles, backgrounds, shortcuts, apps, etc. Eventually, I learned that I'm better off with the standard setup and little customization. No fancy aliases, no recorded dotfiles, no crazy editor configuration, nothing. If I get a new computer or am using someone else's even, it's easy for me to install what I need as I need it and get going out of the box.



The Rob Pike approach https://usesthis.com/interviews/rob.pike/

I'm coming round to it (at a considerably less exalted level).


>> Twenty years ago, you expected a phone to be provided everywhere you went, and that phone worked the same everywhere. At a friend's house, or a restaurant, or a hotel, or a pay phone, you could pick up the receiver and make a call. You didn't carry a phone around with you; phones were part of the infrastructure. Computers, well, that was a different story. As laptops came in, people started carrying computers around with them everywhere. The reason was to have the state stored on the computer, not the computer itself. You carry around a computer so you can access its disk.

In summary, it used to be that phones worked without you having to carry them around, but computers only worked if you did carry one around with you. The solution to this inconsistency was to break the way phones worked rather than fix the way computers work. <<


What I do is the VDI approach. I have one AWS Windows instance and I RDP to it from any of my computers at home, at the office, or at any of my second homes; those computers happen to be iMacs but could be anything.


How's the latency for text editing? I tried doing something like this with *nix tools but found it unbearable even when connecting to localhost.


In my case it is indistinguishable from local text editing. I remember, however, that I chose to turn off Sublime Text's scrolling animations because they were way worse than locally. My fiber latency to my AWS instance is about 60ms, and over my phone's 4G it's about 80ms.


What does that run you per month? What are you doing on it compute wise?


I do not have exact costs at hand because it is just one of several AWS instances we have, but my fully loaded costs for a 160-hour month on this instance would probably be something like:

- 150GB SSD: $15
- Last 30 days of daily snapshots of that 150GB volume, as a backup: $3.45
- 160 hours of t2.medium (4GB) Windows: $8.64

That comes to roughly $27/month all in.

The typical applications I use are Visual Studio, Delphi, Office, Chrome, and some domain specific apps.

Admittedly, you could reduce the $15 SSD cost further to 160/744 * $15 = $3.22 by snapshotting and deleting the SSD volume each time you shut down, but I never did that optimization because startup time would then not be seconds but probably something in the range of 5 minutes: you need a lambda function to create a new instance, create a volume from the last snapshot, shut down the newly created instance, replace the boot volume, and finally start the instance with the right boot volume.
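The shutdown half of that optimization would look roughly like this with the AWS CLI (the volume ID is a placeholder, and this is just a sketch, not something I actually run):

    # Snapshot the boot volume, wait for it to finish, then delete the
    # volume so you stop paying for provisioned SSD. $VOLUME_ID is a placeholder.
    SNAP_ID=$(aws ec2 create-snapshot --volume-id "$VOLUME_ID" \
      --description "boot volume backup" \
      --query SnapshotId --output text)
    aws ec2 wait snapshot-completed --snapshot-ids "$SNAP_ID"
    aws ec2 delete-volume --volume-id "$VOLUME_ID"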

What I also like about this approach is that I do not have to overprovision disk or instances: if I ever need a larger drive, I just modify the volume, and if I need a bigger instance, I just shut down and start again with a bigger instance size.
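Both of those are a couple of CLI calls; something like this (IDs and sizes are placeholders, and the type change requires the instance to be stopped first):

    # Grow the volume in place (the filesystem still needs extending inside Windows):
    aws ec2 modify-volume --volume-id "$VOLUME_ID" --size 200

    # Change the instance type while stopped:
    aws ec2 stop-instances --instance-ids "$INSTANCE_ID"
    aws ec2 wait instance-stopped --instance-ids "$INSTANCE_ID"
    aws ec2 modify-instance-attribute --instance-id "$INSTANCE_ID" \
      --instance-type Value=t2.large
    aws ec2 start-instances --instance-ids "$INSTANCE_ID"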


I'd love to do something like this, but I'm 150ms away from cheap VPSes.

Do you even watch YouTube (say) inside the instance?


If I do not have fiber, or my ISP is having a bad day, I get some 80ms using my phone's 4G as a personal hotspot. I use USB instead of wifi to save 5ms. To save some money, I shut down the AWS instance when I do not need it, because it starts up in seconds.

For web browsing I often use the local browser, because on Macs switching the current desktop is just a keystroke away (which is not the case on W10 if one of those desktops is the RDP client).


It's not really true, since he wrote his own OS, text editor, and programming language. Now he just installs the language and the editor/IDE wherever he goes and uses that.


Why not get the best of both worlds? Spend time tinkering to find force multipliers, then throw those dotfiles or Brewfile or installation scripts into a git repo and it's all a `git clone` away.
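For example, a minimal bootstrap could look like this (repo URL and file names are made up):

    # Clone once, symlink configs into place, let Homebrew replay the Brewfile:
    git clone https://github.com/you/dotfiles ~/dotfiles
    ln -sf ~/dotfiles/bashrc ~/.bashrc
    ln -sf ~/dotfiles/vimrc  ~/.vimrc
    brew bundle --file ~/dotfiles/Brewfile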

What works for you certainly works for you, of course; it's just worth noting that there exists a very productive middle ground in which one can use tools without spending an unproductive amount of time on hyperconfiguration.


Yeah. I've started doing this; it keeps my two Macs and Linux systems all in sync. Warning though: make sure to use a private repo.

I knew someone who accidentally uploaded their AWS credentials to a public repo; that was a huge shitshow. (Btw, don't put creds in repos at all, but at least a private repo will give you a buffer.)



I used to be a tinkerer, happily maintaining my own fork of, say, Rails. Or Postfix. Or the Linux kernel. It was tremendous fun and a great way to learn internals, but also a colossal time sink.

I don't fight the platform anymore, I just use whatever's vanilla out of the box. And I'm actually a lot more productive as a result.

I think it's more than okay to go through the tinkerer phase, it's one path to growing as a developer.


My Linux distro progression was Linux From Scratch -> Gentoo -> Ubuntu -> Kubuntu -> Ubuntu -> Kubuntu...

My personal laptop has been on Kubuntu LTS since I bought it seven years ago, and it works fine. My employer-provided laptop is a Mac, so I use MacOS there (with some tweaks, especially to get keyboard shortcuts for window management). My home desktop runs Windows 10 for gaming, but I find it fine for occasional dev work.

Basically, I use whatever is in front of me, now. I seem to have gotten more flexible as I've gotten older.


IMO that's a bad trade-off. You're going to spend far more time using your personal computer than switching computers, so it makes sense to customize your setup to fit you. How often do you get a new computer?


True for most people, but I have been in situations where I got new Windows machines pretty often. There I learned to suppress my itch to customize and just use the darn thing.

Now I have been on my Mac for a few years, and I also transferred the whole thing once from a Time Machine backup, so I do customize. Though that list of the OP's... SO MANY APPS?! I get weak knees just from scanning it. Though several I may pick, like the Focus app and maybe a few others.


This is the primary reason why I stopped using Windows. The defaults wound me up rotten, and after a fresh install it would take me hours (literally) to configure Windows to work the way I liked. It just got worse and worse with each new version of Windows pushing itself further and further away from my ideal workflow. But with Linux, as much as I have my preferences with desktop environments et al, I found I could more easily work with whatever was put in front of me (after all, if all else fails I can just fall back to Bash). So I gave up fighting Windows and switched to Linux full time.

This was around 15-20 years ago now, and I do have a custom Bash profile and a custom tmux config. The tmux config never needs to leave my workstation, but the bash profile gets copied onto each server the moment I SSH onto it (I've aliased SSH to do this), which means I have a familiar environment on remote systems with zero extra maintenance.
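Roughly like this, as a shell function (a simplified sketch for the plain `ssh host` case; option handling is omitted):

    # Copy the profile over first, then log in as normal.
    # Assumes $1 is the hostname; breaks if you pass ssh flags.
    ssh() {
      scp -q ~/.bash_profile "$1":.bash_profile
      command ssh "$@"
    }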


Copying your bash profile is fine as long as you're _the_ user on that server, or when every person has their own user account on the server.

At my work, we all use the same user for ssh (uid 1000), on all servers, for all projects. It's a good opportunity to exercise mode-switching when going from (local) fish to (remote) bash and back again. I'm happy that fish and bash share the common readline bindings, and it also makes it clear what kind of shell you're in if you're jumping back and forth between docs, manpages and shells.

I have one itch: fish and bash do word boundaries differently when deleting with Ctrl-W (delete word backward). bash even uses different word boundaries for Ctrl-W than for Meta-D (delete word in front).
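If it bothers you enough, you can make bash's Ctrl-W use readline's word boundaries (the same ones Meta-D uses) instead of the terminal driver's whitespace-only WERASE. Something like:

    # In ~/.bashrc: stop the tty driver from grabbing Ctrl-W before readline sees it
    stty werase undef

    # In ~/.inputrc: rebind Ctrl-W to readline's word-aware backward delete
    "\C-w": backward-kill-word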


I don't know your specific setup, so I don't want to come across as preachy, but it's not generally good practice to have everyone using the same UID. I know for a fact we would fail our regular PCI and gambling commission audits if we did that. Even for businesses that don't need that kind of regulation, sharing a UID means you're sharing passwords / private keys, which seems like a major security breach just waiting to happen (eg what happens when someone leaves the team?).


You don't need to share private keys or passwords to share a UID; each person can have his or her own public key in .ssh/authorized_keys.
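i.e. the shared account's key file just lists everyone, so individual keys can be revoked when someone leaves (names made up, keys truncated):

    # ~/.ssh/authorized_keys on the shared account -- one entry per person
    ssh-ed25519 AAAAC3Nza...truncated alice@laptop
    ssh-ed25519 AAAAC3Nza...truncated bob@desktop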

Of course I agree that this is still bad practice from an auditing perspective (and assuming a password is needed to su/sudo you do need to share that password).


Good point about authorized_keys, I'd forgotten about that.


> Though several I may pick, like the Focus app and maybe a few others.

And when you're used to those apps, you might pick up a few more here and there. And a few more. And in a while you'll have a list just like the one linked.


No way; in this age of cloud infrastructure I use far more other computers than my own, and the other computers I use change all the time.


> If I get a new computer or am using someone else's even, it's easy for me to install what I need as I need it and get going out of the box.

Mackup [1] might also be worth considering. It symlinks all of your config files to a supported storage location (Dropbox, Google Drive, iCloud, Git etc.) which enables you to either backup the settings or sync them between one or more other Macs. To restore your settings on a brand new Mac simply run "mackup restore".

[1] https://github.com/lra/mackup
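The whole flow is roughly this (iCloud picked as an example storage engine):

    # Install, choose a storage engine, back up, restore elsewhere:
    brew install mackup
    printf '[storage]\nengine = icloud\n' > ~/.mackup.cfg
    mackup backup     # on the old machine: moves configs, leaves symlinks behind
    mackup restore    # on the new machine: recreates the symlinks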


Yeah, this guy is pretty crazy with the automation and keyboard shortcut apps. I mean, 3 different apps for keyboard shortcuts or other automation?

It's a good idea, but I prefer simplifying my own workflow so I can perform it without any external apps. I used to like the idea of automating the setup of a workspace, but now I just open and close whatever I need at that moment; it really goes into your muscle memory after a while.

The only thing I really customize is my programming environment: my vim setup is highly customized, as are my shell functions. But to each his own :)


I had that philosophy for a long time, but I’ve since written a few bash scripts and put them in a git repo to automate the things I do frequently.

I, too, try to stay close to the defaults, but I will change them if I feel strongly enough.


Same here.

Oh, the days of trying out every WM under the sun.

Nowadays on Windows I do a few customizations, like showing file extensions and enabling a separate process for each Explorer window, and that's about it.

On GNU/Linux, I have long settled on whatever is the default WM for the respective distribution.


I have an install.sh in my dot-files repo - https://github.com/HashNuke/dot-files/blob/master/install.sh (no secrets in this repo). Works for both my macbook and online workstation.

Recently I heard someone else's smarter idea: put the dotfiles into a Dropbox dir and symlink them into place, so updates auto-sync across computers.

[EDIT: I think it was Kenneth Reitz that tweeted about his Dropbox setup]
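The mechanics are about as simple as it gets; for example, with a vimrc:

    # Move the real file into Dropbox once, leave a symlink behind:
    mkdir -p ~/Dropbox/dotfiles
    mv ~/.vimrc ~/Dropbox/dotfiles/vimrc
    ln -s ~/Dropbox/dotfiles/vimrc ~/.vimrc

    # On every other machine, just create the symlink:
    ln -s ~/Dropbox/dotfiles/vimrc ~/.vimrc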



