* we also shipped OpenSSH's sshd server, but it's a little tricky to configure right now. Expect a blog post this week.
* This is not production-ready in the current version of Windows 10 (hence the "(Beta)" in the label), but we hope to be soon.
* All of this is being done in open-source out of a fork at https://github.com/powershell/Win32-OpenSSH (code work is technically checked into https://github.com/powershell/openssh-portable/ first, but those will be consolidated at some point). Check the Wiki there for a ton of info.
* We've been working closely with the official OpenSSH Portable maintainers to get our changes upstreamed at some point in the future. They've been awesome to work with so far, and we're really looking forward to moving Windows onto the world's definitive SSH implementation.
This has been a super fun project to work on over the last couple of years, and I'm glad that there's such a groundswell of excitement around it. Like I said, I hope to publish a lot more documentation/material on our plans as we get to a production-ready state.
I've also been super swamped with the release of PowerShell Core 6.0 for Windows/macOS/Linux coming early next year, hence the lack of a good release announcement for these beta bits... Thanks to Patrick Kennedy for finding it and letting everyone know! :)
When I connect to a Windows box running an sshd, what exactly happens? Do I just get dropped into a command prompt or PowerShell session or what?
With PowerShell Core 6, we also support PowerShell Remoting Protocol (PSRP) over SSH as a transport, which means that you can do stuff like New-PSSession and Enter-PSSession without WinRM. (PowerShell just gets registered as a "subsystem" of sshd, same thing sftp-server does.) You can check that out here: https://github.com/powershell/powershell/tree/master/demos/S...
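For anyone curious what the registration looks like: it's one line in sshd_config plus a normal PSRP cmdlet on the client side. A sketch; the install path and exact flags are assumptions (check the linked demo for the authoritative version), and the host/user names are placeholders:

```
# In sshd_config on the Windows box (path assumed; adjust to your install):
Subsystem powershell c:/program files/powershell/6/pwsh.exe -sshs -NoLogo

# Then, from any machine running PowerShell Core 6:
Enter-PSSession -HostName winbox.example.com -UserName alice
```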
If it were direct to PowerShell, you could pipe objects directly into remote cmdlets.
Windows Client Access Licensing is annoying, but they have some web exception for web servers. SSH remote access is a whole new ballgame! I assume it is encouraged for one-at-a-time administrator use in much the same way Remote Desktop currently works, but even that isn't really communicated clearly yet (even from a high-level perspective).
It seems like getting the plans out there as soon as possible to begin collecting feedback and managing expectations is going to be win-win for everyone.
It works fine without a passphrase.
Those builds will also be showing up in Insider builds over the next few weeks (if they haven't already).
If you grab our GitHub bits, we compile against LibreSSL there, and you should see LibreSSL showing up in future Insider builds of Windows. That brings support for all the other crypto algorithms you'd expect (RSA, DSA, etc.)
If you mean, "interact with them from an interactive shell like cmd or PowerShell", yes, you can absolutely do that. Our implementation literally stands up a console host on the server side, so whatever you can do in that console host with the privileges you have, you can do over SSH.
If you mean, "do you support SCP and SFTP for transferring files in a network-drivey way" the answer is also yes. SCP is supported directly against sshd, and we ship both an SFTP server and client.
If you're talking about something NFS or SMB related, I'd be curious to hear more details.
It has said "ON THE BACKLOG" since October 01, 2014. You have a pretty long backlog if the second-most-voted request hasn't made it out of it in three years ;-)
That being said, yeah, I also want tabs. In the meantime, I switch back and forth between conhost and Cmder. :shrug:
It comes with bash and all the standard tools (including ssh IIRC), so I can work on a real command line. Especially as a vi user, I'm right at home.
I find this a lot easier to manage than Cygwin: a lighter install, no packages to select, and smoother integration.
When I bought my first Mac, back in the days of MacOS 10.2, I did so because of the BSD system underneath. Having a Unix system, whether it's GNU or BSD, gives me access to tools that I'm familiar with and sometimes prefer to their GUI counterparts (e.g., scp/sftp vs FileZilla).
Now that Windows has this, switching back is something that I'm giving serious consideration to.
It's not clear to me that Windows has this ...
I, like you, adopted the MacOS ecosystem because it was UNIX underneath ... but there's a big difference between underneath and alongside.
Although it is not commonly done, you can control and interact with your OSX system with UNIX commands ... there's one single filesystem namespace and you can interact with it from the command prompt as well as kill GUI apps or set preferences or ifconfig, etc.
It's my understanding that the Ubuntu subsystem in recent Windows is sort of a parallel environment ... but is not meant to control the system directly or serve as an alternate path of interaction with Windows, correct?
The Windows Subsystem for Linux (WSL) has its own directory in the Windows filesystem that corresponds to /. It uses some NTFS magic to store the Linux file attributes that don't directly correspond to NTFS file attributes.
It also, within the Linux environment, mounts your host drives under /mnt.
For what I generally use the OSX terminal for, WSL probably hits about 80-90% of my use cases. A lot of lower level utilities have weird issues - e.g., 'ip addr' seems to present the Windows network interfaces as though they were typical Linux ones, 'ss -a' gives a bunch of netlink errors, dmesg says "dmesg: read kernel buffer failed: Function not implemented", 'tcpdump' doesn't work, etc. On the other hand, curl, scp, ssh, etc do exactly what you'd expect.
I hope that these get filled out, at least with more constructive error returns, at some later date.
For now, I'm happy that enough of the low-hanging fruit has ripened to make it possible to do 'normal' things from within WSL. I'd honestly rather they make the local (and network) filesystem more performant and robust; maybe they have.
My use case for WSL is coming around in the calendar year again so I'll be revisiting it soon.
I am not so sure about this. I think Dustin Kirkland sheds some more light here:
And he seems to be relatively excited about it so he benched it more recently:
It seems more like an inverse Wine than a parallel environment.
You can call executables in both directions, even use windows executables in a bash script and pipe its output to awk.
Full disclosure, I work at Microsoft on WSL.
You can do some things to and with MacOS from CLI. Honestly though, it's all stuff I wouldn't miss awfully.
It's the same kernel underneath so it's not really two parallel systems.
For Xming, simply set DISPLAY in shell and local GUI programs just work, as well as SSH X forwarding.
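For anyone trying this: assuming Xming (or any local X server) is listening on display :0, the setup inside the shell is just one variable:

```shell
# Tell X11 clients (and SSH's X forwarding) where the X server lives.
# Assumes an X server such as Xming is running locally on display :0.
export DISPLAY=localhost:0.0
echo "$DISPLAY"
```

Putting that line in ~/.bashrc makes it stick across sessions.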
I'm on a Dell XPS 15 9560 if that matters.
It does as I've had no issues personally on as self-built desktop, an ASUS laptop, a MacBook running Bootcamp, and now a Surface Laptop.
There are also some systems where kernel support for the devices is lagging, either because they are proprietary and poorly documented or because they have insufficient market penetration to get someone interested in writing good driver support. For example support for the pen on the Surface Pro 4 line is really horrible (IMHO) on Linux.
If that's true, will those new drivers also work on a regular non-Windows Linux install? That would be really great news, and pretty ironic, if device manufacturers or even Microsoft itself were suddenly writing more/better drivers for the Linux kernel. :-)
When you run Ubuntu on top of Windows, Windows emulates the Linux kernel - at least the parts it needs to run the subset of Ubuntu that it currently can. This emulation sits at the interface between the kernel and userspace; it is not done at the device/driver level.
Drivers are OS-specific; the drivers in question here are either Windows drivers, which work only on Windows, or Linux drivers, which work only on Linux. (No one is writing drivers for Windows that could also work on Linux.)
Also, the native Windows git is still your best bet for git operations. That's one of the cases where I will have a PowerShell and an Ubuntu bash window side-by-side, working in the same /mnt/d/... | D:\... directory. git operations in PowerShell and Jekyll (or whatever) operations in Ubuntu bash.
Also, they _really_ chose a poor name for that executable; it should have been wsl.exe or some such, not bash.exe. At least lxss.exe isn't already in use by other software in my path...
Also, totally fair feedback about the naming. In FCU, we added wsl.exe in addition to bash.exe. It launches into your default shell rather than /bin/bash.
* I work at Microsoft on WSL
Calling Windows EXEs from bash will automatically retain the current working directory under /mnt/c/*, so it should just work out of the box. Looks like you might be able to get away with just adding native Windows git to your bash path.
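So if all you want is native git speed from inside bash, something like this in ~/.bashrc may be all it takes (a sketch; it assumes git.exe is already on your Windows PATH, which WSL appends to the Linux PATH by default):

```
# ~/.bashrc inside WSL: use native Windows git for all git operations.
# Works because WSL keeps the current directory when launching .exe files
# from under /mnt/*.
alias git=git.exe
```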
Next up is to try running something more taxing, and then look into GPU access from within it.
I'm looking forward to them developing the subsystem further.
There's still a place for windows native tools.
I like PuTTY, and have used it a lot, but MobaXterm has been great.
I wish Cygwin performed better on Windows: git operations that are fast on Linux take minutes on Windows because it has to spin up the whole Cygwin environment. This still makes me want to use Linux, but MobaXterm makes my Windows desktop very usable for administration.
MobaXterm is nice, but I don't like how heavy it is...
Great wiki about set up and use here: https://github.com/msys2/msys2/wiki
However, in the current version of MSYS2 (after an update a few months ago) they suddenly seem to mix the different Linux-like environments, and for me it's really hard to choose which shell to use and which package to install for that shell.
Plus, the real reason you get PuTTY (or SecureCRT, if you're lucky) is that they are actual terminal emulators and not a DOS shell.
Sorry, I don't find Moba compelling at all.
I used to use Cygwin, but it became a pain to manage the packages, and it couldn't update its own setup.exe last I checked.
I recently tried out WSL and I still run into problems with the file system. My home directory is in a new place which is kind of annoying. And cloning a repo with Git changed all the line endings. autocrlf didn't do anything for me either.
Back to good ol' git.exe
I still can't run those command under Windows Subsystem for Linux.
PowerShell's syntax is just too esoteric for me.
I did try that a few times.
I really can't believe somebody ever bought a MacBook because they found installing putty too much of hassle.
The pleasant interaction with other *nix systems was and is a primary driver for my choice of OS X.
PuTTY sucks. Every other terminal I have tried on Windows sucks. Getting terminal-based software to work on Windows sucks. I stopped trying years ago because a terminal on OS X doesn't suck. It might not be a perfect 10, but I'm happy with it.
I'm also getting more and more interested in Windows as time passes.
Losing its position as the most popular OS has made Microsoft start doing some of the right things.
I think it's the explosion of open source and the corresponding command-line tools that got them investing time and effort in the command line.
A natural result is that devs working at Microsoft now ALSO wish they could use the command line for everything in Windows. Thus PowerShell, and things like the AZ CLI, becoming much better.
I doubt Windows 10's inclusion of ssh alone addresses this use case; for that you'd want a full cygwin or something. I don't know anymore - I don't use Windows for any purpose other than games on dedicated gaming computers these days. Despite Apple going backwards with each successive macOS release (probably since 10.8) it's still a much more usable OS for the majority of the things I do than Windows 10. I do like Windows 10 better than Vista and 7, though, on my current gaming computer.
Never been a problem for me. What are your complaints?
I barely notice the difference, but if I had to choose I'd take putty over Terminal.app, which has default keyboard-shortcuts that clash with Bash. (I believe it was Alt-b, Alt-d, or maybe Alt-f. I forget exactly.)
It is worth emphasizing what has been said here before: since Windows' popularity is declining, they are starting to catch up with that attitude. Nowadays, for instance, you can open a terminal from every folder in explorer.exe -- this is exactly the kind of integration Windows missed for 20 years. Once they kick out cmd.exe in favour of something like Console2 (https://www.hanselman.com/blog/Console2ABetterWindowsCommand...) and update the toolchain even more than they did with PowerShell (actually integrating GNU/bash was a major step), developers will come back.
But it's not a problem with PuTTY, it's a problem with the Windows networking subsystem. Once you remove the cable, the whole interface immediately goes down, the IP address gets deconfigured, and all open sockets using that address get closed. (I was bitten in the ass by this when I had several Windows Server machines under my care.)
You'll get exactly the same with any other long running TCP connection.
> Remove the network cable while in an SSH session...
This argument seems a bit contrived. Is this honestly something you're concerned with on a daily basis? I doubt it.
> Try doing a port forward, and you'll need to dig around instead of just typing it into a terminal.
No, you don't just type "it" into a terminal. First, you have to look up the command syntax if you haven't used it in a while. Then you have to type it in correctly, and if your syntax is off by even one character, things don't work.
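For reference, the syntax being argued about is short; a local-forward invocation looks like this (host names are placeholders):

```
# Forward local port 8080 to port 80 on the remote machine's localhost.
ssh -L 8080:localhost:80 user@gateway.example.com
```

And because it is a command, it lands in shell history, so "re-typing it correctly" next time is usually just Ctrl-R.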
> Also aesthetically, PuTTy has nothing on Terminal.app or iTerm2 etc.
Aesthetics are nice but do you even have a tool like WinSCP? It graphically displays the remote filesystem over your SSH connection and then lets you open a terminal to the path you were looking at. I don't even have to type my password when I open a new terminal from WinSCP.
I say all of this having been a paying MobaXterm customer in the past.
These days I use ConEmu, WSL, and Bitvise (for those rare occasions I want to easily tunnel to the Windows side).
It's generally difficult and awkward to work with and unpleasant to look at.
If you use git on Windows with PuTTY/plink it's also significantly slower than OpenSSH. I saw clones of a large repository go from 60 KB/s to 600 KB/s after switching from PuTTY to OpenSSH.
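If you want to switch git's transport without fiddling with GIT_SSH environment variables, newer git (2.10+) can be pointed at an ssh binary directly; the path below is an assumption for illustration:

```
# Tell git to use the native Windows OpenSSH client instead of plink.
git config --global core.sshCommand "C:/Windows/System32/OpenSSH/ssh.exe"
```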
I like Putty's graphical session manager; on OSX and Linux I don't want to fiddle with ~/.ssh/config.
The lock-in with Putty is annoying though, being able to export all configured sessions to a .ssh/config file would be awesome.
With Putty, if I want to change, say, the size of the terminal opened on connections, I'd have to update every Putty session individually. Been there; done that. Ridiculous. That should be a function of the terminal you run SSH under, not the SSH session itself.
It is definitely a lot messier than working with ssh_config.
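For comparison, the equivalent of a saved PuTTY session in ~/.ssh/config is a few lines, and things like terminal size stay the terminal emulator's business (host, user, and key names here are made up):

```
# ~/.ssh/config
Host prod
    HostName prod.example.com
    User deploy
    Port 2222
    IdentityFile ~/.ssh/id_prod

# after which the session is just: ssh prod
```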
Okay, I think I just found a new contender with GIMP for 'worst software title'...
Again CAVEAT EMPTOR (as with anything to do with regedit).
Oh, you meant on Windows.
Sure you can: https://devops.profitbricks.com/tutorials/use-ssh-keys-with-...
What are the killer features of iTerm2?
The web, Unix, Linux, OSX, pretty much everything other than Microsoft uses properly namespace driven forward-slash separated paths with no drives and case sensitivity.
You can't fix that, merely abstract it away. I am tired of mapping between the two.
scp myfile.txt winmachine:/d/Users/Whoever/Documents/Projects/X/File.txt
scp myfile.txt linmachine:/home/whoever/documents/projects/x/file.txt
I'm a Unix user since about '93 for ref as well. I am poisoned with forward slashes everywhere :)
Name one that isn't just "Bourne Shell does it different".
While Microsoft has started doing good things which I can agree on, when did they lose OS market share to the point of not being number one? If we're talking about Apple hardware outselling any one vendor, that's true, but there's volume. Also, one of the first things I did after getting my MacBook Pro (and others I know who don't want to leverage VMs/Parallels. I like Windows and OSX being separate from a context point of view) was dual boot Windows.
Android is first by a large margin. iOS almost outnumbers Windows as well.
Microsoft post iPhone needs to prove that it's worth using a desktop at all, not that they deserve to have the highest share of desktop users. Stopping the bleeding inflicted by Macs, which sync so well with iOS, is part of that.
The number of people using Windows to do their "computer things" has gone down dramatically. Even if they have a Windows computer, much of the time spent doing computer things has moved to Android and iOS.
If I had a Linux laptop, an extraordinary amount of my time would be spent trying to make it work. Wireless breaks, sound breaks, upgrades break everything. I would have to spend a serious amount of time and research finding a laptop that had good Linux support... regardless I would still probably have to spend hours trying to get the sound or the wireless or sleep or some feature or another to work properly.
OS X just gets out of the way. I have never had to put any work into making the graphics card work or making sound work or making the network work or fixing boot... you get the picture.
If I'm using Linux on a laptop to do any sort of work, a sizable portion of the work becomes keeping Linux working on the laptop and I don't want that.
But yeah, if you buy anything that's not part of the "top of the line developer notebook" category, specifically the Intel not AMD ones, then almost nothing will work out of the box :)
In years past you may have spent hours getting wireless or sound working. These days, if you buy decent, mainstream hardware it just works. Ubuntu does a great job of getting out of the way.
I'm a Fedora guy, myself, but find myself recommending Xubuntu to more and more people for casual computing because 1) it really does just work, and 2) Xfce is lightweight and lets you use your CPU for doing real work vs. holding up a bloated windows manager.
Linux on unsupported laptops (which is most of them) sucks big time, but not linux desktop itself. It works great on standard desktop computers (towers)
you should make that distinction, because otherwise you'll just start a flame war.
The Linux desktop has come a long way, even in just the last year or two. I know everyone always says that, so take it with a grain of salt, but I have never been happier with the state of the Linux desktop.
Might it be that we tend to like what we're used to? When I jumped from WinXP to a Gnome 2 desktop, I tried to make it a bit more like WinXP, even though these days I would want nothing of how the WinXP desktop is laid out.
One thing I've learned is distrohopping is a must if your hardware is an edge case... that said, since moving to Manjaro, I haven't had to use anything else.
I hear that argument frequently. On macOS (or even Windows) things just work(tm), while on Linux (or some BSD), there is always something that does not work correctly.
I guess I must be quite lucky. A long time ago, I positively enjoyed spending an entire weekend getting a sound card to work or something. To be honest, these days I consider myself too old for that stuff. I, too, like it when things just work. But really, Linux as a desktop system has come a long way, and for the past couple of years, things have pretty much just worked(tm) for me.
Picking hardware that is supported by Linux takes a little care, and so does picking a distro, especially with laptops. But I prefer to do a little research before buying a laptop anyway, because I am usually on a budget.
In my experience, the more recent the hardware, the less likely Debian is to work. OpenSuse has worked well for me, though.
Also, I run an Intel Nuc (laptop hardware, essentially) with Ubuntu as a main development machine. 0 problems. 0% in the way.
OSX is great, don't get me wrong. But, it's been years since I've ran into driver problems with desktop Linux.
At our office, the macOS laptops tend to have the most problems with sound & graphics; the Linux desktops, OTOH, Just Work™. I find that pretty funny, actually, because I would have expected it to be the other way around.
I've never had anything similar happen with a PC (we use Thinkpads), whether running Linux or Windows.
My Mac problem was solved using ethernet adapter.
Sure, not the very newest hardware, but it's not like laptops have gotten significantly more powerful for anything that actually matters, for the last couple of years.
The biggie is the GPU. T420 has no oomph to run an external 4K display (at more than 24 Hz, i.e. in usable mode). Broadwell and newer do have the capability to run 2 of them.
The CPUs got less power-hungry. You can do the same work with less juice, so your battery lasts longer.
The SSDs with the new interfaces got much faster. There is simply no comparison between M.2 nvme drive and SATA3 SSD.
Sadly, wifi took a step back, with almost universal unavailability of anything better than 2x2 MIMO 802.11ac. In the past, MIMO 3x3 used to be available (Broadcom, but the option was there).
So yes, a modern laptop is a significantly different experience than a few-years-old one, despite the CPU having the same GHz.
My point is that they're not noticeably faster for the things most people use their computers for. They still run browsers, Spotify, email, word processors and all of that just fine and last "long enough" on a charge.
The improvements have been incredibly marginal for most people.
Also, they do not have a problem with Electron apps. For most people, they are just apps like anything else, and they do not perceive any performance problems with them.
And where did Electron apps come into the discussion?
Even the Apple 2015 M.2 drives, which are "only" AHCI, but with a much deeper queue (i.e. nonstandard), can do 1.something GB/s, and the difference compared to a standard SATA SSD is noticeable to normal users.
Electron apps are something normal people use, and some of them complain about the speed/lag/memory usage. Usually those with older computers, who think nothing has changed in the last few years ;).
If you do the first, you will also do the second. It makes a difference even when launching Excel or Firefox.
Between SATA SSD and PCIe? Sure, for extremely IO-heavy tasks there's a difference to be felt, but not really in everyday use.
Macs historically have always been a decent compromise for having decent GUI/Linux-like terminal/nice hardware.
It's kind of a different environment now, with PC manufacturers coming up with well designed laptops and Windows offering more Linux integration.
Linux could be the new Mac OS X if all the Linux distributions chose to focus on one platform and one way of doing things, but nobody in the Linux world is going to do that.
Also, Mac has unified hardware, so they pretty much can optimize so that stuff Just Works.
Of course there are things that do not work so nicely, and are not meant to. For instance, running RHEL (which is a server OS) on a laptop (which typically has new consumer hardware not meant for servers) and then expecting graphics and WiFi to work on a kernel that is much older than your hardware and thus has no drivers for it.
Perhaps you might expect some trouble also when running Windows Server 2012 on a new laptop? I don't know, I haven't tried, but wouldn't be surprised.
And so therefore, no one else has either!
One of the big reasons Linux Desktop still sucks is the outright refusal of the community to acknowledge any of its problems.
How do you define "mainstream laptop"? Or is it a laptop that can run desktop Linux without problems?
The Dell XPS 13 is a good choice if you want something from a major manufacturer, but there are loads of smaller companies, too.
There’s a lot of things I like about macOS [eg: better power management], but the main reason I switched was because it was a Unix I could put Photoshop and InDesign onto.
That argument trumps pretty much everything else. If my boss came into my office today and told me to switch all of our desktop computers to Linux, as much as I would enjoy that, I would have to tell him that it cannot be done, because we use a lot of software that just is not available for Linux. (And a fair amount of it is not even available for macOS.)
I don't know what examples you had in mind when you mentioned applications having incompatible keyboards. The only one I know that does that is Vim (which by default uses the internal clipboard when you use its cut, copy and paste commands) but Vim is definitely not your average Linux application...
I usually use iTerm2 as well.
PuTTY is amazing, and my experience of using Windows over the years would have been far worse without it.
However, I do understand the point: compared to using a real terminal, it does indeed suck to be stuck on Windows using PuTTY.
For a terminal on Windows, I have now taken to installing Cygwin, X on Cygwin, and then xfce-terminal on top. It's not perfect either, but it is better.
I think the question is not whether someone would buy a MacBook because it has these tools built in but rather, why would someone buy Windoze given that it doesn't have these tools built in?
There are some workarounds if you use particular terminals (for instance you can't use Mobaxterm as it forces 777 regardless) but it's not exactly 'it just works' yet.
Not trying to be condescending, but it seems to need highlighting that just having SSH access isn't enough.
Not to mention that these other environments have offered other benefits [especially] to developers looking for a programming environment that works for them. The VS.Net IDE-play-button approach to automating testing and deployment doesn't scale for everybody.
> the people buying Macbooks, or using Linux directly [...]
... don't give two hoots about Powershell. Sorry, but I did limit what I was saying to describe people not using Windows.
PS: From the fact that you refer to Visual Studio as "VS.Net", I take it that you haven't been in Microsoftland for a while. No serious developer uses these "play-buttons" to do their automated testing and/or deployment.
It's been about a decade since I moved to Linux. Just skimming the latest VS docs, though, the structure of unit testing within VS is pretty much identical to how it was. The feedback is a little more inline, and I guess the extensions are a fairly big deal... But that sort of lends to my point that the IDE is still pushing itself down your throat for an IDE-centric build environment.
The thing I've liked about collaboration since moving away from VS is our build and testing toolchains sit completely separate from development settings and IDE projects. It's very simple to switch things out and script in new workflows. It also means we don't need to install things like VS to test on a new (eg rebuilt production test) machine.
I'm sure the same is possible in Microsoftland, I've just never —even recently, I still interact with .Net developers and their work product— seen people making use of it, falling back on what Mother thinks best. Maybe they're not serious enough developers.
The historical bias towards the IDE and component kits still exists (and probably will forever), but some of those things are reasons I found it so easy to pick up and run with VB.NET and C#. I won't pretend it didn't take me a while to work out what I was supposed to do when I left it all behind.
That all said, they are getting closer to the point (may be there already) where you can go from Powershell to remote Powershell over SSH.. But that still assumes you're lunatic enough to want to host anything on Windows. I think I've had too much freedom for too long now to ever consider that a good idea.
No, but something like installing putty was the last straw.
Putty is a terrible experience; very difficult to use. Every misstep or disconnection involves a dozen or so clicks back into the configuration area to "try again". This feels normal on Windows, though.
I recently tried to port my workflow to Windows for a year, but I eventually gave up, simply because of things like this. I might've made it 6-7 months; I can't recall exactly what it was. Death from a thousand cuts, maybe.
Hats off to you; I didn't even make it that many weeks.
Between filenames/paths being too long, poor terminals with janky colour schemes (save for Mobaxterm, it's great), WSL issues (like umask handling), gvim/vim issues with plugins, and loads of other annoying bits I just went back to Linux.
Linux is FAR from perfect, especially as I have an Nvidia graphics card, but there's a lot to like there too. Having run Fedora 27/Ubuntu 17.10 I've found I really really like Gnome. I seem to be in a club of one there (and it's buggy as hell on 17.10) but for me it's been great to use.
Not that I wouldn't use Mac if I found a couple grand down the back of the sofa, but as a daily driver Linux has been less painful. Even on the laptop, though I do have one of those Dell certified ones.
Normally you have a fully-fledged command-line SSH client, and a universal one at that: you can pipe data through it, you can pipe data out of it, you can run remote commands non-interactively with it, you can jump through other SSH hosts with it, you can set up TCP tunnels in an ad-hoc manner without going through configuration, and obviously you can run an interactive shell. Plus you get scp for file transfer out of the box, and lftp (which understands the sftp protocol) and rsync after installing some barely related software, all of them still able to jump through other hosts.
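Concretely, the kind of composition being described looks like this (host names are placeholders):

```
# Pipe data through ssh: stream a directory to a remote box.
tar cz ./site | ssh user@web.example.com 'tar xz -C /var/www'

# Jump through a bastion host with a single flag (OpenSSH 7.3+).
ssh -J bastion.example.com user@internal.example.com

# Ad-hoc tunnel, no configuration dialogs involved.
ssh -L 5432:db.internal:5432 user@bastion.example.com
```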
Now compare it to the glorious ability to open a window with a remote shell (and only a direct one, no jumping through a bastion host), and maybe a TCP tunnel with some clicking through every single time (you can't recall the history of commands, because there was no command in the first place; the best you can do is save the tunnel configuration permanently). And maybe file transfer, if you remembered to download PuTTY's scp client.
You can do SSH proxying with PuTTY, too. It's just called something different:
Plus PuTTY has a command line interface and everything. You can do all the things you mention with PuTTY from a cmd.exe or PowerShell prompt.
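For completeness, the CLI in question is plink, PuTTY's command-line connection tool; a sketch with placeholder names, covering the tunnel and non-interactive-command cases:

```
# Ad-hoc local port forward from a cmd.exe or PowerShell prompt.
plink -ssh -L 8080:localhost:80 user@gateway.example.com

# Run a remote command non-interactively.
plink -ssh user@host.example.com "uptime"
```

And because these are typed commands, they do end up in your prompt's history.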
Using PuTTY or MinGW tools on Windows sucks compared to using them on a Unix-based system. I gotta say the Windows Subsystem for Linux really helps in that regard, as does native SSH in cmd and PowerShell (PowerShell by itself is also pretty awesome once you get the hang of it). Linux didn't use to be such a great competitor for the rest of desktop usage (some will argue it still isn't).
Depends; for the occasional dip into the CLI world to run a script, it's quite OK.
Granted, the defaults regarding mouse copy-paste and window size were only improved on Windows 10.
> Not to mention that since developers are only using Linux and Mac, you will have libraries which does not work on windows (because nobody even tried).
Not all developers are pure UNIX devs doing POSIXy stuff.
That they're trying to replace it with PowerShell rather than improving cmd.exe is silly, however.
However I do think it could have been made less verbose and dislike the ps1 extension, as the 1 (one) does not make sense.
For instance, "%" is an alias for ForEach-Object, "ls" for Get-ChildItem, etc. And you can of course define your own.
Just install Ubuntu Shell.
Also, it's not like you don't need to install iTerm, zsh, Docker, Python libraries, etc. on a Mac. It's not like it comes with them by default. Your argument is kind of moot.
cmd.exe and conhost.exe won't do the job.
Maybe Homebrew, and its predecessors, were the real killer apps for the Mac?
Maybe I just don't remember installing it, but I think macs have zsh out-of-the-box too.
From a resourcing point of view, I mean. Here we have Microsoft with a gazillion dollars of capital, and one of the most critical tools for a lot of people is developed by a single talented hobbyist who gives it away for free.
And of course my next computer will be a MacBook again.
I'm one of these somebodies.
It's not that installing Putty is a hassle, but my choice to buy a MacBook was very heavily driven by the fact that macOS is Unix based. My time spent between Windows 10 and macOS is pretty even, but it is a much more enjoyable experience to develop in macOS partly because of the Unix-based shell.