I'm coming round to it (at a considerably less exalted level).
In summary, it used to be that phones worked without you having to carry them around, but computers only worked if you did carry one around with you. The solution to this inconsistency was to break the way phones worked rather than fix the way computers work.
- 150GB SSD: $15
- 150GB, last 30 days of daily snapshots as a backup: $3.45
- 160 hours of t2.medium (4GB) Windows: $8.64
The typical applications I use are Visual Studio, Delphi, Office, Chrome, and some domain specific apps.
Admittedly, you can reduce the SSD cost further, from $15 to 160/744 × $15 = $3.22, by snapshotting and deleting the SSD volume each time you shut down, but I never did that optimization because startup time would then not be seconds but probably something in the range of five minutes: you need a Lambda function to create a new instance, create a volume from the last snapshot, shut down the newly created instance, replace the boot volume, and finally start the instance with the right boot volume.
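For the curious, the proration above is just hours-used over hours-in-month. A quick shell sketch (the $15 and 744-hour figures come from the numbers above; I work in cents because POSIX shell arithmetic is integer-only):

```shell
# Prorate a $15/month volume over 160 hours of use in a 744-hour (31-day) month.
hours_used=160
hours_in_month=744
monthly_cost_cents=1500   # $15.00 expressed in cents

prorated_cents=$(( hours_used * monthly_cost_cents / hours_in_month ))
printf 'prorated cost: $%d.%02d\n' $(( prorated_cents / 100 )) $(( prorated_cents % 100 ))
# prints: prorated cost: $3.22
```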
What I also like about this approach is that I don't have to overprovision disk or instances: if I ever need a larger drive, I just modify the volume; if I need a bigger instance, I just shut down and restart with a bigger instance size.
Do you even watch YouTube (say) inside the instance?
For web browsing I often use the local browser, because on Macs switching the current desktop is just a keystroke away (which is not the case for W10 if one of those desktops is the RDP client).
What works for you certainly works for you, of course; it's just worth noting that there exists a very productive middle ground in which one can use tools without spending an unproductive amount of time on hyperconfiguration.
I knew someone who accidentally uploaded their AWS credentials to a public repo; that was a huge shitshow. (BTW, don't put creds in repos at all, but at least a private repo will give you a buffer.)
I don't fight the platform anymore; I just use whatever's vanilla out of the box. And I'm actually a lot more productive as a result.
I think it's more than okay to go through the tinkerer phase, it's one path to growing as a developer.
My personal laptop has been on Kubuntu LTS since I bought it seven years ago, and it works fine. My employer-provided laptop is a Mac, so I use MacOS there (with some tweaks, especially to get keyboard shortcuts for window management). My home desktop runs Windows 10 for gaming, but I find it fine for occasional dev work.
Basically, I use whatever is in front of me, now. I seem to have gotten more flexible as I've gotten older.
I've been on my Mac for a few years now and even transferred the whole thing once from a Time Machine backup, so I do customize. But that list from the OP... SO MANY APPS?! I get weak knees just from scanning it. Though there are several I might pick up, like the Focus app and maybe a few others.
This was around 15–20 years ago now, and I do have a custom Bash profile and a custom tmux config. The tmux config never needs to leave my workstation, but the bash profile gets copied onto each server the moment I SSH onto it (as I've aliased SSH to do this), so I have a familiar environment on remote systems with zero extra maintenance.
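For anyone wondering what that alias amounts to, here's a minimal sketch — the names and paths are my guesses, not the poster's actual setup, and it only handles a bare `ssh host` with no extra flags:

```shell
# ~/.bashrc snippet: an ssh wrapper that pushes the local bash profile to
# the remote host before connecting, so the familiar environment follows you.
ssh() {
    # Only attempt the copy for a plain `ssh host` invocation.
    if [ "$#" -eq 1 ]; then
        scp -q ~/.bash_profile "$1":~/.bash_profile 2>/dev/null
    fi
    command ssh "$@"
}
```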
At my work, we all use the same user for SSH (uid 1000), on all servers, for all projects. It's a good opportunity to exercise mode-switching when going from (local) fish to (remote) bash and back again. I'm happy that fish and bash both share the common readline bindings, and it also makes it clear what kind of shell you're in if you're jumping back and forth between docs, manpages, and shells.
I have one itch: fish and bash do word boundaries differently when deleting with Ctrl-W (delete word). Bash even does word boundaries differently for Meta-D (delete the word in front) than for Ctrl-W.
Of course, I agree that this is still bad practice from an auditing perspective (and, assuming a password is needed to su/sudo, you do need to share that password).
And when you're used to those apps, you might pick up a few more here and there. And a few more. And in a while you'll have a list just like the one linked.
Mackup might also be worth considering. It symlinks all of your config files to a supported storage location (Dropbox, Google Drive, iCloud, Git, etc.), which lets you either back up the settings or sync them between one or more other Macs. To restore your settings on a brand-new Mac, simply run "mackup restore".
I mean, it's a good idea, but I prefer simplifying my own workflow so I can perform it without any external apps. I used to like the idea of automating workspace setup, but now I just open and close whatever I need at that moment; it really goes into your muscle memory after a while.
The only thing I really customize is my programming environment: my vim setup is highly customized, as are my shell functions. But to each his own :)
I, too, try to stay close to the defaults, but I will change them if I feel strongly enough.
Oh the days trying out every WM under the Sun.
Nowadays on Windows I do a few customizations, like showing file extensions, enabling a separate process for each Explorer window, and a few other things, and that's about it.
On GNU/Linux, I have long settled on whatever is the default WM for the respective distribution.
Recently I heard someone else's smarter idea: put the dotfiles into a Dropbox dir and use symlinks, so updates auto-sync across computers.
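A hedged sketch of that setup (the `link_dotfiles` helper and the Dropbox path are my own illustration, not from the tweet): the real files live in the synced folder, and $HOME just holds symlinks, so an edit on one machine shows up everywhere Dropbox syncs.

```shell
# Symlink every dotfile from a synced directory into a target directory.
link_dotfiles() {
    src="$1"; dest="$2"
    for f in "$src"/.*; do
        [ -f "$f" ] || continue                 # skips ".", "..", and non-files
        ln -sf "$f" "$dest/$(basename "$f")"    # -f replaces any existing link
    done
}

# Typical use: link_dotfiles "$HOME/Dropbox/dotfiles" "$HOME"
```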
[EDIT: I think it was Kenneth Reitz that tweeted about his Dropbox setup]
I don't spend time "configuring and tinkering", and yet after several years on a Mac I use a good number of apps.