This isn't a dig at Linux; it's simply that we still have lazy content creators using things such as Silverlight to provide content. These applications "just work" on Windows, and when they are possible at all on Linux, they tend to require a lot of manual effort and simply aren't as good as the Windows solutions (Google Drive, Spotify, VPN applications, etc.).
I also use my personal laptop for dev, and as I'm an OSS-stack developer, I use Linux. Having the ability to run Linux natively on Windows is superb, and is genuinely usable for work purposes for the most part.
The article doesn't capture a few other bugbears with WSL. The file system is slooooooow (really slow), quite a few applications just flat-out don't work, and others are buggy as hell (psql, for example). Would I use it in a serious work capacity daily? No, but I'm certainly appreciative that I can use it and it works pretty well 95% of the time.
I had a couple of years with MacOS instead of Windows. Definitely better in so many ways. But MacOS Unix is worse than WSL Ubuntu in every way, mostly because it's a bizarro BSD variant with 10-year-old tools, and Homebrew is a kludge.
I don't do much with WSL but find it occasionally useful in a pinch. The lack of an init system is very limiting, I find I still need a full Linux environment for most things.
Last I looked, unzip on MacOS still couldn't handle files > 2GB in size. That was a patch submitted about 10 years ago. The version of less is also ancient and compiled without standard features like LESSKEY. At least that was the case as of a year or two ago; it's been a while since I've tried using MacOS.
It will be a difficult choice if my next employer offers an OS choice. The more I ask myself what I really enjoy about Linux and Windows, the more I understand that I might just prefer Microsoft's window manager to customizing one of my own.
I'd rather tinker with Linux at home than fret about keeping it fixed at work. WSL and Docker let me do that in spades.
What do you mean by this? I've run pretty esoteric setups at work, but none of them have required more maintenance than the Windows systems some of my coworkers run.
If a bad patch comes down for Windows or MacOS, the company IT department will likely shoulder some blame. If a package manager update comes through for Linux and fails, the onus is on me; IT hasn't supported personal Linux installs at any company I've worked for.
From the job-insurance standpoint, it's a liability to use Linux at work if the business makes you take the full helpdesk responsibility. Plus, we're all inevitably asked to open Visio, Photoshop, Outlook, or one of the many other popular Windows-only tools. Now we get into virtualization and VMs.
Windows works for me. WSL is enough and Docker fills in the gaps. Frankly, I can run Visual Studio and Neovim side by side, and both Docker and WSL let me run what I want on Linux. In the end, it's just easier to virtualize Linux in Windows than the inverse. I've never been afforded the option to "just" run Linux at the office.
Now at home, where I can let projects sit, they run Linux. My personal servers are Linux. My embedded electronics are Linux. If they break? It's fine. I like to tinker and play with those configuration files. Fixing that obscure display bug on your hardware is very fulfilling to me. When reviews come up, I'd like to be able to demonstrate the ROI I've created rather than hoping that custom WM takes my salary up a notch.
I run the latter because the former doesn't make sense to me; Linux makes for a better foundation security-wise, and I can rely on it to do what I tell it to, something that's been a problem for Windows users these past few years.
I also get company policies. I'd try to make this part of the contract (I simply cannot work as efficiently if I'm fighting tools instead of business needs), but if it wasn't a possibility I'd definitely run a full X11 environment on top of Windows. It does make sense in that case.
But I still do not get your point about maintenance; as you said, this isn't the 2010s anymore (I'd argue it was fine at that point as well, as long as hardware wasn't simply chosen at random), and issues do not come up any more often than they do on Windows or macOS. The opposite has been the case in my office: those of us who run Linux have far fewer issues, both with development tools and with system maintenance (and we have more Windows experts than Linux experts), so the implication that you'd produce a lower ROI by running Linux is a bit laughable from my own point of view.
At the end of the day you have your own situation and it's up to you to decide what's best for you, but I believe that if maintenance cost is your deal-breaker you probably should reassess your choice.
> I also get company policies, I'd try to make this part of the contract
I've never had anything close to that leverage during negotiations. If I had leverage like that then I'd use it on vacation or salary.
Regarding maintenance, Windows and macOS both have package managers now. My dotfiles work across Linux and Windows; macOS is the odd duck. If the operating systems run the same tools then it makes the OS a means to an end instead of a religious debate.
In practice, I've used all three at various jobs for my developer machine. They all have security updates and maintenance options that work fine. All can be successfully maintained, used, and developed on.
> I believe that if maintenance cost is your deal-breaker you probably should reassess your choice.
Your comments demonstrate that you prefer Linux strongly enough to negotiate for it. I'm just a developer who enjoys solving problems and thinks WSL is really useful.
Is that for Linux accessing the Windows parts of the system, or even when accessing files within the Linux sandbox?
Install VirtualBox, create a Linux VM instance and configure for at least 2GB RAM, 3D video acceleration, 128MB of video RAM, and at least 20GB of storage. DO NOT choose EFI or any other fancy/experimental features; the reason we're using Slackware is because it is dead simple and bulletproof in its default configuration, and doesn't need any of that mess.
Install Slackware on the VM instance (I prefer Xfce desktop as it integrates well with Windows in Seamless mode but use what suits you if you find another is better).
If using Xfce: Remove the bottom panel but keep the top panel, and configure it to your liking. If using KDE you may wish to move the bottom panel to the top edge.
Install the Guest Additions for Linux.
Activate Seamless mode; you'll find that your Linux desktop sits as a layer on top of your Windows desktop. If you use Xfce, your top panel will be at the top of the screen and will by default float behind any Windows-native windows. This allows you to keep both OSes running all the time and switch back and forth as necessary. Optional: Install the Numix GTK themes and Numix icon themes available from slackbuilds.org for a more Windows 10-esque look and feel in your Linux native apps.
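The VM-creation step above can also be sketched from the command line with VBoxManage; the VM name and disk filename here are just examples, and exact option names can vary slightly between VirtualBox versions:

```shell
# Create and configure the VM roughly as described above (illustrative values)
VBoxManage createvm --name "slackware-dev" --ostype Linux_64 --register
VBoxManage modifyvm "slackware-dev" --memory 2048 --vram 128 --accelerate3d on
VBoxManage createmedium disk --filename slackware-dev.vdi --size 20480
VBoxManage storagectl "slackware-dev" --name SATA --add sata
VBoxManage storageattach "slackware-dev" --storagectl SATA --port 0 \
  --type hdd --medium slackware-dev.vdi
# Deliberately no "--firmware efi", per the advice above to skip EFI
```

The GUI does the same thing; this is just for those who prefer a reproducible setup.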
Again, the reason I use Slackware for this (besides my nearly two decades of familiarity with it) is because it is simple, stable, and stays out of the way. You don't need any experimental VM features; it's pretty much pure Linux. That said, something like Alpine or Arch may be more suitable depending on your workflow, and they both are also simple and VM-friendly distros. Alpine in particular is designed to integrate with VM and container setups with minimal fuss.
Update: Also, can you explain what you mean by this:
>If you use Xfce, your top panel will be at the top of the screen and will by default float behind any Windows-native windows. This allows you to keep both OSes running all the time and switch back and forth as necessary.
I use Ubuntu on VirtualBox on Windows and can already do the switching back and forth, but am not sure what you mean by "will by default float behind any Windows-native windows".
Yes, to the extent VirtualBox allows it. Make sure you have it turned on in the VM's settings and the Guest Additions are installed.
> am not sure what you mean by "will by default float behind any Windows-native windows"
Sorry, I just meant that the VM apps are treated like any other Windows object and aren't "always on top" or "always on bottom". Let's say you have the VM running as I described, and you open a Windows application. That application will be the "top" layer because it has focus; the VM won't override it. If you then click on the title bar of a Linux application, it gets pulled to the front and the virtualized app becomes the app with focus, along with the Xfce panel, but there's no opaque "desktop" layer blocking any Windows apps underneath it, as it would be if it weren't in Seamless mode.
In short, it does what it says on the tin: Host and Guest apps work together seamlessly as if it were all one OS with one "desktop".
>If you then click on the title bar of a Linux application, it gets pulled to the front and the virtualized app becomes the app with focus, along with the Xfce panel, but there's no opaque "desktop" layer blocking any Windows apps underneath it, as it would be if it weren't in Seamless mode.
Got it now, thanks. That's a good feature.
Even if one believes a post-Gates/Ballmer/Myhrvold Microsoft no longer behaves illegally/unethically, one can still look forward to an embrace being followed by extension and extinguishing. Because that's how the incentives play out.
Embrace: Of course we want to give our users the best, even when some of that originated elsewhere.
Extend: We're not going to hold back our users by waiting on slow-moving standards bodies - often slowed by our competitors. Helping competitors isn't a priority for us. And the wellbeing of those few people who aren't our users, is understandably also not a high priority for us.
Extinguish: Why should we expend any effort at all toward keeping competitors viable? Why shouldn't we actively help potential customers join us? Why should we devote resources to helping a few disgruntled ones leave? Why shouldn't we fully monetize our intellectual property, both directly and through affiliated third parties like Intellectual Ventures?
Microsoft failed on phones, but VR/AR is coming, and it will transform the market for phones, laptops, and desktops. Once upon a time Microsoft created Windows-only web extensions. Now Mozilla writes them for it. Windows Everywhere may still happen. And "you can use Linux inside of Windows" is a big step towards that. It's not clear that's something to be happy about.
The times have changed; charging monthly for services is a much better economic strategy than making people pay upfront for a product.
It's clear to me, at least, that Linux will most likely never replace Windows as the main OS for everyday users, so there is no real financial incentive to extinguish it anymore. I don't think MS cares much anymore whether you run Linux or Windows on the server side, as long as you do it in Azure.
This thread was on a major change in the relationship between Microsoft and Linux. But comments were focused on the short-term impact on individual developers. Pointing out longer-term ecosystem impact seemed worthwhile.
> Kinda tiring to see this gets rehashed [... Microsoft has changed ...]
I encounter two versions of this sentiment. One is "yes, it's unfortunate that our only choice is to pay organized crime to collect our trash, but I don't want to discuss that every time we put out the trash" (to use an example from NYC of a few decades ago). And as long as that doesn't drift into "organized crime isn't a problem", fine.
The other version is "Microsoft's conduct is no longer something to be concerned about". And if one knows that pharma, IBM, and Microsoft are the current leaders in expanding the scope of software patents and blocking patent reforms, and one agrees with those positions, then, well, ok. But often there isn't that awareness. Or other impacts are not fully appreciated, or not clearly reasoned about.
We're way off the front page, and my break is short, so for the rest, I'll just note that Linux is unlikely to ever work as well as Windows on Azure, and MS has a strong incentive to diminish Linux and Mac as competing centers of gravity for software developers. For instance, developers being able to easily avoid Microsoft products while developing for and serving Android and iOS is something Microsoft would obviously like to change. And specifically, Microsoft will do whatever it takes not to miss VR/AR as it did phones, and would really prefer to dominate VR/AR as it does desktop/laptop.
Can you point out something about Linux on Azure that doesn't work as well as Windows does? Genuinely curious. I haven't run any Linux images on Azure, so I have no real experience in the matter.
I don't think there's anything wrong with them entering the VR/AR space; it's exactly what the field needs. The more heavy players behind it, the faster the technology will evolve. Right now there is still a big lack of both games and software for it, so there's not much incentive to buy such a system. I am very excited for the future of tech, and I am not worried for one second that Microsoft will stop supporting open source; if they do, I'll be the first to object [as a heavy user of their tech].
Terminal emulators like Hyper or ConEmu (mentioned in the article) work great without WSL as well. Git's Windows build comes with bash and most coreutils built in, so if your colleagues filled your npm scripts with sh scripts, there's a fair chance they'll "just work". Scoop is a fantastic no-nonsense "apt install"-like tool, which is leaner and less in the way than the better-known, NuGet-based Chocolatey. Obviously, VS Code, Atom, Sublime Text and the entire JetBrains suite all work great on Windows.
Finally, I'd like to recommend Git Extensions if you like using Git with a UI. It's the best Git UI I've come across, and it's native to Windows. It has a funny name because it started out as a set of extensions for Visual Studio, but it has little to do with that anymore.
I'd like to particularly commend the Node ecosystem for making cross-platform dev a breeze. The last few little details that don't work the same in Windows as they do on Linux/macOS (even if you have Git's okay-ish sh/coreutils tools in your PATH) are very easily bridged with tools like cross-env and shelljs/shx.
Note: not disagreeing with anything in this article - if you want a Linux with a nice shell, Windows is a pretty decent option these days. I just want to point out that it's also a pretty decent option these days without WSL, and share some pointers on how to get started.
We build upon so much open source software: Linux, GNU, compilers, editors, servers, and hundreds of libraries.
I use KDE on Ubuntu, and on the rare occasion that it is lacking, a bug report or a patch helps the whole community. All the users, in Ubuntu's statistics, help the KDE developers feel they're doing something worthwhile.
Running Windows and perhaps improving tooling on Windows helps Microsoft, something I refuse to do.
(Equally, I will correct Open Street Map, but not Google Maps.)
Linux dominates smartphones, servers, and supercomputers.
It's the software development model and the ideological underpinnings that made Linux technically superior, so I'd be careful not to wave them off as some religious nonsense.
I want the following "features" from a terminal emulator:
* Tab support
* Support for more demanding applications (like tmux and other curses based applications)
* Sensible defaults
* Reasonably clean UI
* Open Source
Going through Łukasz's suggestions:
* Hyper -- Not tried this, but: Electron-based, non-starter.
* Babun -- Not tried this, but: no tab support.
* Cmder -- Tried this; no tab support, and I found it a little glitchy under tmux.
* ConEmu -- Tried this; supports tabs... but I found that a lot of configuration was required, the UI (at least out of the box) was cluttered, and I also found it to be glitchy under tmux.
* MobaXterm -- Closed source, cluttered UI (for my needs).
I haven't tried running a Linux-native terminal under WSL, but I'll give it a go soon -- I hope it doesn't make everything ugly!
Cmder is just ConEmu plus some extra things you probably don't want.
Strange, all I changed was to launch "git bash" task for each tab and it was good enough for me.
> it is an Electron based app and it’s a bit sluggish but works well, scales well and looks like it’s 2018.
Electron? Would anyone in the Linux world accept an Electron prompt?
All you need is Git + clink, and you get a ton of GNU tools plus readline/history. Then it's as fast as possible, and you get sharper and clearer fonts than in all the other pseudo-terminals on Windows. That's the stuff that's really important to me.
I couldn't get the console to handle vim colours (although 24-bit colour support is supposed to be there now), which is a real shame. And there's certainly no bold / italic.
Why would you do that?
- Windows development
- Productivity software
- Email and scheduling, if your company uses Exchange
- Support from corporate IT
- Company intranet apps, if you happen to work at a company whose intranet apps are only tested on IE. (It still happens.)
- Video games
The computer I use for my day job is exclusively Linux, as I have no need for Windows software or online games there. (I'm guilty of playing Minecraft during my lunch break, but that fortunately runs quite pleasantly under Linux.)
I excitedly installed the new CrossOver Linux on my Manjaro machine at home because of its Office 2017 support, and yeah, I guess it "works", but the experience was slow, painful, and I wouldn't call it stable.
Beyond that, I find I am constantly having to put work into my Linux machine to get things to run. Yes, it's a million times better than it used to be, and I want to run Linux, but it's always more work.
The major points of vulnerability nowadays are the browser and the user. Secure the shit out of your browser - aggressive adblock, disable/click-to-play all plugins, no unnecessary addons. As a user, educate yourself about the realistic threats. Download software from reputable places - if you're used to a Linux package manager, this will take some adjusting but it's not difficult. When you're installing something, actually read what it's doing and think, don't just click next excitedly. I shouldn't even have to say this, but don't open email attachments unless they're from someone you know and you're expecting exactly what they sent. Let Windows install updates automatically and reboot when it asks. Keep your other software up to date - something like Chocolatey (https://chocolatey.org) might help, but that has its own security upsides and downsides. Turn on Windows Defender and let it keep itself updated.
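For the Chocolatey suggestion above, keeping everything it manages current is a one-liner from an elevated prompt:

```shell
# Upgrade every Chocolatey-managed package in one go (run as Administrator)
choco upgrade all -y
```

Scheduling this (e.g. via Task Scheduler) gets you most of the "keep software up to date" advice with little effort.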
Windows firewall is a pain in the ass, but if you have some time you can lock stuff down quite a bit (Windows Firewall Notifier).
Personally, I don't run an antivirus at all; I got sick of Windows Defender making disk accesses so slow so I disabled it. At most I'd set up scheduled scans.
Turn on controlled folder access if you're paranoid about ransomware. Not much else, really. I haven't noticed any viruses / slowdowns / malfunctions this way.
Don't install antiviruses; they're basically malware created to fight malware. Windows Defender in Windows 10 is enough.
Keep your device updated.
Mainly because Windows doesn't allow keyboard shortcuts to be rebound and the defaults often require two hands, which with the mouse normally in my second hand is really annoying and not particularly fast.
Other things I'd like:
- Ability to bind clicking the left and right mouse button together to middle mouse click (allows you to just click in the middle on laptop touchpads).
- Tabs in the file manager.
- Workspaces, in a usable form. Windows 7 doesn't have them, and it's not legal for us to use Windows 10 on anything that's connected to the internet. When I do use Windows 10 / Server 2016 in isolated playground VMs, its implementation makes me feel like some old person who forgets about applications they have open, because there's no indication of other workspaces existing, nor of the applications in them. The shortcut for switching between workspaces being one of those two-hand shortcuts also means that checking workspaces to see what's in them is not viable.
- A functioning search. I do not understand how you manage to make basic system search as broken as it is on Windows.
- The dream would be a tiling window manager and an actually functional terminal.
I know, this is not achievable without third party software, which I can't really install for security reasons, so yeah, this is unfortunately a pipe dream.
VirtuaWin - https://sourceforge.net/projects/virtuawin/
Locate32 - https://locate32.cogit.net/
Why not use pure Linux? First, I like it when my OS just works, without my having to fix something frequently (that was my experience when using Linux, both Ubuntu and Fedora). Windows has better hardware compatibility too. Second, I use a Surface device, on which I like to write or sketch with the pen frequently.
1. Use X11 or ssh/telnet to connect to your real development system and work there.
2. Just use Hyper-V to install a VM that is exactly the same as your target environment.
Developing locally on a different operating system just adds a whole other set of risks you don't need.
It's also lighter weight - just a Linux userland living on top of a special translation layer that maps from Linux kernel calls into Windows ones. That saves you all the overhead of running a full virtual machine with its own dedicated chunk of your RAM and its own dedicated chunk of your hard disk.
You do, of course, have at least separate dev, test, and live systems.
And in these days of Threadripper the performance argument makes no sense, as you have cores to burn.
Don't take this the wrong way, but you need to take a look at how you work and whether this is actually how a professional should work.
Even for PowerShell users the thing is painful. Bash via the Linux Subsystem is just another reason/justification for a refresh.
I'm aware of the third party options (as listed in this article), but third party software isn't always easy to deploy for political/policy reasons in enterprise or government.
I definitely think the cmd is a boat anchor around the Linux Subsystem's neck either way.
Which PowerShell issues are you referring to? I'm not claiming it's perfect, but I don't look at it as a second-class shell environment.
Conhost has a long way to go, but they've made far more progress in the last year and a half than at any other point in time I can recall, so I'm optimistic that it will be modernized even further to fit the growing capabilities of WSL.
As it should for most users.
> forcing default browser by itself.
Forcing? I'm not sure what you mean. You can easily change the default browser just like in any other OS.
> Forcing? I'm not sure what you mean. You can easily change the default browser just like in any other OS.
I installed Firefox and changed the default browser to it; then Windows asked me every 15 minutes to switch to Edge, and eventually changed the default back all by itself without my knowledge.
And it really is remarkable how much better development on Windows has become over the last 2 years.
The problem might arise should they diverge enough that software has to be maintained in multiple versions, so that commercial software companies will have to choose which one to support, and they invariably go for the one with the biggest company/corporation attached to it, which of course would be Microsoft (just look at Debian vs Ubuntu).
Should this scenario seem impossible, think about what would happen if Microsoft decided to write a library that exposes all important device drivers to the underlying Linux subsystem: gaming companies could write their games without any fear of incompatibilities because they'd be using the Windows graphics card driver, the same way I use Linux drivers for a sound card Windows ceased to support years ago when I load Windows audio software under WINE (example: Reaper + Tascam US-122). That day, Linux native installs would become less and less appealing for commercial (and sadly many amateur) software developers, which makes me very pessimistic.
Microsoft is already a platinum member of the Linux foundation for reasons that go beyond my level of comprehension; what would prevent them say 5 years from now, from telling the world that their Linux is the real one?
edit: minor correction
So I'll enjoy it while it's here, but I'm nowhere near uninstalling VirtualBox. Once burned and all that.
One thing you may want to consider doing is mounting your drives in WSL so that Docker volumes work. With your current setup, none of your volumes would work.
Details on how to do that can be found on:
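A minimal sketch of that remount, assuming Docker for Windows as the daemon: it expects host paths like /c/Users/..., while WSL mounts drives under /mnt by default. A common workaround is a bind mount (or, on newer Windows builds, changing the automount root in /etc/wsl.conf):

```shell
# Make C:\ visible at /c so volume paths match what Docker for Windows expects
sudo mkdir -p /c
sudo mount --bind /mnt/c /c

# Alternative on recent Windows 10 builds: have WSL mount drives at / directly
# by putting this in /etc/wsl.conf and restarting the WSL session:
#   [automount]
#   root = /
```

The bind mount has to be redone per session (or scripted in your shell profile); the wsl.conf route is persistent.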
Also, in case anyone is wondering "why not just use a VM?": I did for ~5 years, but the WSL setup is a lot nicer for day-to-day development.
Details on the VM setup vs WSL can be found on:
With the above WSL setup, the development performance is pretty nice, I must say. Around 250KB of SCSS runs through a fully Dockerized webpack chain (sass-loader, precss, autoprefixer, css-loader) in 1.8 seconds, and live reload works. I haven't even begun to try to optimize it with fast-sass-loader and other tweaks. I'm also using the stock node-sass package.
15k+ line Rails apps with 75+ gems also reload in under 100ms for code changes (non-assets).
This is also why file system access is so slow in WSL — the two systems are far enough apart that building such a fast path is either hard or really hard, depending on how much compatibility is important to you.
I personally don't use this, but I installed it a while ago to try it out and have since forgotten about it. My point: it doesn't alter your host OS half as much as a VM does. So it is superior in some ways to that solution as well.
Anyway, do you think there's any reason at all that Apple goes out of their way to support Windows on their Mac machines? I mean here is a company that has sung the praises of their Macintosh OS, talking nasty shit about Windows the whole time....and yet they actually do non-trivial work to support this supposedly shitty OS that they apparently hate. Hmmmm. It's a mystery for the ages I guess.
With regards to having access to a Linux environment, I've tried developing using a Linux Vagrant box while sharing the Windows filesystem using VirtualBox Shared Folders. This worked for smallish directories but I still had to contend with workarounds for symlink issues. Then I went down the WSL route before and after the Fall Creators Update. It's impressive how far WSL has come but the file system performance hits are noticeable.
Now I'm settling on having just one beefy Debian Vagrant box. Its filesystem is exported to Windows using Samba so native text editors can be used but all fs operations run within the VM with no NTFS compatibility issues. Performance is great and Docker works without the Hyper-V lock-in.
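As a rough sketch of that Samba export inside the Debian box (the share name, path and user here are examples; harden to taste):

```shell
# Install Samba and export the project tree so Windows editors can open it natively
sudo apt-get install -y samba
sudo tee -a /etc/samba/smb.conf >/dev/null <<'EOF'
[code]
   path = /home/vagrant/code
   read only = no
   valid users = vagrant
EOF
sudo smbpasswd -a vagrant     # set a password for the share user
sudo systemctl restart smbd
```

On the Windows side the share then maps as \\<box-ip>\code, and all file operations still happen on the VM's own filesystem.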
Ninja addition: don't get me wrong, the write-up is quite nice anyway! And if you are in a mixed environment where you need to run MSBuild locally for Win32 apps and do a small amount of local Linux work, it might make sense.
Windows isn't a nice self-hosting toolbox OS with *nix semantics and probably never will be, no matter how much emulation or how many subsystem layers are added. In a way, macOS is moving a bit in that direction as well, but without killing the foundation it is built upon. Perhaps, if at some point the NT kernel gets replaced (no, it's not a bad kernel; it's just a 'different' kernel with no compatibility with BSD, Linux, Unix, Mach, L4, or any other open kernel), it will allow for a more toolbox/developer-centric environment.
Now, while this argument might seem like the one about "I need things to be open source because I want to touch it" (which gets stale pretty quickly, since 50% of the users never do this or probably don't even have the skills), that's not what I'm aiming for. It's more about the fact that you can interrogate, manipulate and visualise all levels of the system you are working on, in an identical way to what you can do on your target system.

While most types of actions have comparable methods on Windows, there is nothing like /sys, /dev or /proc on Windows; you can't easily 'see' what you are doing, and that is a really problematic thing once you get past entry-level web development. Most of the inspection and introspection on Windows comes in the form of closed-source, bulky IDEs that may or may not have that single feature you needed to figure out why your program has trouble reading some device node, or why its performance varies depending on connections, handles or other OS-dependent actions. At best, you can do some stuff with the Sysinternals suite, but that doesn't even come close to basic tools like top, ps, lsof, sysctl, ldd, strace and gdb.

The whole Windows ecosystem is so tightly bound to opaque systems that cannot be reached unless you dig around in some GUI that it makes basic debugging hard. While it has gotten better over the years, we are still stuck with a ton of half-invisible things in ancient GUIs or MMC plugins that have an impact on your work but have nothing to do with your target system. Basically, you now have to maintain two systems, one of which isn't really the one you wanted to work with in the first place.
It's not that you can't do it, it's just that much harder to do on Windows. This is not unique to Windows, and applies to smaller scopes as well. If you build an application in an IDE and you want to automatically test and integrate it, but you didn't check how to build, link and package your product because the IDE did it for you, you suddenly can't reliably do what you needed to do, and you have to reverse-engineer the somewhat opaque process that the IDE did for you in the background and reproduce it using toolbox-tooling in order to automate it or deploy it on your target system. While not using that IDE isn't the super-solution either, purely depending on one blind/black-box integrated system is hardly the smartest choice out there.
This also points to another issue: just because the "no" department (old-style security / CISO) thinks something should be done one way doesn't mean you just take it; if they have a wrong idea, or if a policy doesn't do what it says it does, having a conversation might actually make things better. Same goes for taking on a job: if there are policies in place that prevent a good working environment, either the policies need to be updated or you might as well not take the job. (And yes, that is more often an option than you might think, at least where I live -- I know that is purely anecdotal/N=1.)
Reasons not to change or not to make things better are easy to find, it's the solutions that are hard.
Not because my own company's IT department is populated by sticks-in-the-mud, but because we work with the financial information of some of the largest companies out there. As such, what they say goes.
I have been using Linux on servers since 1993, dual booting on my desktop 2000-2004, then went full Linux on my desktop/laptop and now WSL.
The only really annoying aspect is the slow filesystem https://wpdev.uservoice.com/forums/266908-command-prompt-con...
In other words, both Cygwin and a Linux VM with properly shared folders appear easier to use and more useful in practice than WSL; I have no idea of what Linux software is worth running under WSL despite these limitations.
Emacs, Node.js, Bash, Python, Git and many other important tools have good Windows versions, while almost everything popular is available in a package or easy to compile, and in both cases much more dependable, under Cygwin.
I appreciate that when I have to use Windows (e.g. for some apps that just don't work on Linux) I can still take advantage of a true Linux system's power. But that's about as far as it goes. Claiming it to be a "dev machine" is still quite an exaggeration. I'll see how it goes in a few years. Hope Microsoft and Canonical can keep up the work, for sure.
Wayland is the most serious effort to create a replacement at the moment and it is being pushed hard. But its design prevents it from running on WSL in any reasonable way.
1) install the ubuntu-desktop package
2) run “ubuntu-settings-daemon &”
3) while #2 is running, start a GUI app. Good font smoothing now.
4) I think at this point I could close #2 and still get good font rendering. At least until the next restart, or perhaps as long as I didn’t close all Linux processes.
I might also have added all of the fonts from Windows to the system-wide font folder on the Linux side.
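Condensed into a shell session, the steps above look roughly like this (package and daemon names as given by the commenter; the exact settings daemon may differ between Ubuntu releases, e.g. gnome-settings-daemon on newer ones):

```shell
sudo apt-get update
sudo apt-get install -y ubuntu-desktop   # step 1: pulls in the desktop stack
ubuntu-settings-daemon &                 # step 2: leave this running
# step 3: launch a GUI app from this same shell; fonts should now be smoothed
```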
I highly recommend using Docker, mounting a volume to it and running all the Linux code inside the container. You can still modify the files both ways but avoid most of the messy incompatibility issues. There's some learning curve to it but I think it will pay off in the long run.
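For example (image name and paths here are illustrative), mounting the current project into a container and running the Linux-side tooling there, while still editing the files with native tools:

```shell
# Edit files on the host; run builds/tests inside the container
docker run --rm -it -v "$(pwd)":/app -w /app ubuntu bash
```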
Of course, you can run TensorFlow on 'plain' Windows:
(yes, I troll, but I swear I'll submit a pull request for this to Ammonite eventually)
I do use ConEmu, but I use PowerShell as my shell and don't run WSL or Cygwin. All our dev tools (Erlang, CUDA C/C++, Python) run fine in Windows, and using Windows as it was meant to be used, without any "shims", makes it much easier to deal with differences in file naming conventions, path separators, etc.
Plus, PowerShell really is the Best Shell Ever.
The primary purpose of a shell is not scripting; it's meant to be interactive. PowerShell is too heavyweight to be a good daily driver for a shell.
That being said, even with commands being long, tab completion does an absolutely amazing job of shortening what you type. Also it's usually just the first command of a pipeline that doesn't have an alias because after that you tend to use the same few cmdlets over and over again (%, ?, select, ft, ...), which do have short aliases.
I use Cmd over PowerShell, though, since PowerShell can't process commands separated by "&&", which are used extensively in package.json scripts. I'm happy with this setup, since I don't think the CLI is a good interface at all (I prefer GUIs with lots of keyboard acceleration), so I tend to minimize CLI use as much as possible by automating everything down to single, simple commands.
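To illustrate the incompatibility with a typical package.json-style command:

```shell
# cmd.exe (and any POSIX shell): runs the second command only if the first succeeds
npm run build && npm test

# Windows PowerShell 5.x: "&&" is a parse error; the closest equivalent is
#   npm run build; if ($LASTEXITCODE -eq 0) { npm test }
# (PowerShell 7+ later added support for && and ||)
```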
With VSCode, I can run my package scripts in the built-in terminal as well. Also, if I do have to edit some code on my Mac or Linux boxes (which I usually reserve for just compiling and testing things), VSCode gives me the same experience everywhere. I like that a lot!
Software that needs to be deployed on UNIX systems is usually written in Java.
All the scripts that I need to access on both systems are written in Python.
I'd rather make the best use of the tools each platform offers.