Rumors of Cmd’s death have been greatly exaggerated (microsoft.com)
168 points by MikusR on Jan 5, 2017 | 194 comments

Windows really needs a public pseudoconsole API.

For the uninitiated: cmd.exe is the shell, like bash. conhost.exe is the terminal emulator, like xterm. bash or cmd (the shell) talks to the terminal via its standard input, output, and error file handles. Terminal handles are like full-duplex pipes, but richer: they transmit all the information needed for interaction with the user. Each terminal handle has two ends: master* and slave. The slave is the terminal to which cmd or bash talks; it basically emulates a teletype. The master handle allows for reading both in-band and out-of-band information from the slave terminal handle and supports special APIs that allow anyone with access to the master handle to fully emulate a terminal.

POSIX systems allow anyone to create master and slave terminal handles. When you start xterm, xterm calls openpty, gets master and slave handles, starts your shell with the slave handle, and uses the master terminal to get the information needed to draw characters.
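That openpty flow can be sketched with Python's stdlib `pty` module (a POSIX-only illustration; the module wraps the same openpty call xterm makes):

```python
import os
import pty
import subprocess

# Create a connected master/slave pair, as xterm does before
# spawning your shell.
master_fd, slave_fd = pty.openpty()

# Give the child the slave end as stdin/stdout/stderr, so it believes
# it is talking to a real terminal.
proc = subprocess.Popen(
    ["sh", "-c", "test -t 1 && echo interactive"],
    stdin=slave_fd, stdout=slave_fd, stderr=slave_fd,
)
os.close(slave_fd)  # the child holds its own copy

# The terminal emulator reads the master end to learn what to draw.
output = os.read(master_fd, 1024).decode()
proc.wait()
os.close(master_fd)
```

The child's `test -t 1` succeeds because its stdout really is a tty, which is the whole point of the slave end.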

There's no equivalent for Windows. The only Windows program that can call the Windows equivalent of openpty is conhost, which ships with the operating system. It's not possible for a different program to make a master-side terminal handle. There are two workarounds, which both suck: 1) use pipes, not console handles, and 2) make a hidden conhost.exe and scrape it. The former causes programs to think they're being run non-interactively, with all sorts of negative side effects (like excessive buffering). The latter loses screen updates, is inefficient, and is generally awful.
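The first workaround's side effects all stem from the isatty() check (roughly GetFileType/GetConsoleMode on Windows): handed a pipe instead of a console handle, a program concludes it is running non-interactively. A quick POSIX illustration in Python:

```python
import subprocess
import sys

# Ask a child whether its stdout is a terminal. capture_output wires the
# child's stdout to a pipe, just like workaround 1 above.
check = [sys.executable, "-c", "import sys; print(sys.stdout.isatty())"]
piped = subprocess.run(check, capture_output=True, text=True)

# The child sees a pipe, reports False, and a real tool would now switch
# to block buffering, drop its prompts, disable color, and so on.
print(piped.stdout.strip())  # False
```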

Microsoft could easily make this situation much better by opening the protocol that conhost uses to communicate with the kernel. If any program can create a console handle, it's much easier to implement things like ssh daemons, Cygwin, and terminal emulators that suck less than conhost.


* As an aside: when bash talks to xterm, it uses in-band signaling (mostly: see termios): it sends control code sequences, which xterm understands as "move cursor to 3,3" and "set color to green". By contrast, when cmd communicates to conhost, it uses out-of-band signaling: it uses special API functions to move the cursor and set the text color.
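The in-band half of that footnote is easy to show concretely. These are standard ECMA-48/VT100 escape sequences (Python used only to spell out the bytes):

```python
# In-band signaling: control instructions travel in the same byte stream
# as the text. CSI is the "Control Sequence Introducer", ESC [.
CSI = "\x1b["

move_to_3_3 = CSI + "3;3H"  # CUP: move cursor to row 3, column 3
set_green   = CSI + "32m"   # SGR: set foreground color to green
reset       = CSI + "0m"    # SGR: reset attributes

# A shell simply writes this to stdout; the emulator on the master side
# parses the escapes back out of the stream. A Win32 console program
# would instead call SetConsoleCursorPosition / SetConsoleTextAttribute.
payload = move_to_3_3 + set_green + "hello" + reset
print(repr(payload))
```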

Forking a process on Windows is really slow, making most open-source toolchains sluggish on Windows.

The UNIX philosophy is that everything is a file, so you can read and write from files and pass data around with convenient tools... that mindset is also missing in Windows.

Command lines were not foreseen as a preferred way of interacting with the system in Windows until recently. PowerShell tried to fill some gaps, and many tools integrate with it... but it's not the same. It has a long way to go.

Also, on Unix everything is an fd until it isn't: timers, signals, semaphores, pids... (and sockets have their own peculiar API). Pids not being fds leads to a really annoying problem: there's no easy way to implement robust handling of child exits.
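The standard workaround for that annoyance is the self-pipe trick; a minimal POSIX sketch in Python:

```python
import os
import select
import signal
import subprocess

# Self-pipe trick: turn SIGCHLD into a readable byte on a pipe, so
# child exits can join ordinary fds in a single select() loop.
rfd, wfd = os.pipe()
os.set_blocking(wfd, False)  # set_wakeup_fd requires a non-blocking fd

signal.signal(signal.SIGCHLD, lambda signum, frame: None)
signal.set_wakeup_fd(wfd)    # the C-level handler writes a byte here

child = subprocess.Popen(["true"])

# The pipe's read end now wakes select() when a child exits, alongside
# any sockets or pipes you are already multiplexing.
ready, _, _ = select.select([rfd], [], [], 5.0)
os.read(rfd, 16)       # drain the wakeup byte(s)
status = child.wait()  # reap the child; "true" exits with 0
```

It works, but it's exactly the kind of bolted-on machinery that an fd-per-process design would make unnecessary.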

NT is more Unix than Unix in a lot of ways, where everything is an object (more or less)

Plan 9 though really ran with the whole file thing

> NT is more Unix than Unix in a lot of ways, where everything is an object

Yep. NT's design in this respect is great. Each NT object also has a name in a global system namespace; each object has its own access control list; and most objects are dispatcher objects that you can use with functions like WaitForMultipleObjects. In Windows, it's trivial to wait for, say, a socket* and a process at the same time.

I've always wanted to add NT-style process handles to Linux. A few other people have tried, but Linus has rejected their patches on the grounds that if you can open a handle to a process (preserving its PID), a program can fill up the process table.

I've never bought this argument. You can fill up the process table in other ways today without a ulimit, and we can make the process ulimit apply to process handles too.


* Screw you, in-process winsock providers. My programs treat sockets as regular HANDLEs, and if you install some shim that breaks this model, shame on you.
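For what it's worth, Linux did later gain something close to this: pidfd_open (kernel 5.3, 2019, well after this thread) returns a real file descriptor referring to a process. A hedged sketch, assuming Python 3.9+ on a recent Linux kernel:

```python
import os
import select
import subprocess

child = subprocess.Popen(["sleep", "0.1"])

if hasattr(os, "pidfd_open"):
    # A pidfd is a real file descriptor referring to the process, so it
    # can sit in the same select()/poll() set as sockets and pipes:
    # essentially the NT-style "wait on a process handle" asked for above.
    pidfd = os.pidfd_open(child.pid)
    ready, _, _ = select.select([pidfd], [], [], 5.0)  # readable on exit
    exited_via_fd = bool(ready)
    os.close(pidfd)
else:
    # Kernel < 5.3 or non-Linux: fall back to a plain blocking wait.
    exited_via_fd = False

status = child.wait()
```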

One day the Capsicum port to Linux might happen, but in the meantime FreeBSD does have it, so it has file descriptors for processes.

And signals, timers and events (inc semaphores) have non-standard fd versions too, but they've been bolted on and you need all kinds of new syscalls to make them work right.

Plan 9 did it right if you want to go "everything is a file"

Fork doesn't make a lot of sense for win32 processes (how do you handle a forked message loop?), so the functionality isn't exposed, though NT native processes can do it. The best you can do is a spawn equivalent with manual copying of structures, so you lose the CoW fork semantics that make fork fast on Unix.

For Linux subsystem processes fork works pretty well, but they are different animals

> Fork doesn't make a lot of sense on win32 processes (how do you handle a forked message loop?)

Message loops are per-thread and per-HWND (which are thread-affine) anyway. One way of making fork "make sense" in win32 processes is to say "Okay, you can fork, but your new process begins execution in a new thread, and none of the old threads survive."

New thread means new message loop. You could even do things like automatically tear down single-threaded COM apartments in the child process.

Fork was implemented in the NT kernel for WSL - https://blogs.msdn.microsoft.com/wsl/2016/05/23/pico-process...

Fork worked already, as the article notes. What I really want is fork to work for the win32 subsystem! I want Cygwin, not WSL! I want tight integration between the POSIX and Win32 worlds, not the pseudo-container that WSL gives you.

There is a third way: you can create a PowerShell host application with the same .NET API that PowerShell ISE uses.

I tried that --- specifically, I made a readline-powered psh host for my own use. There are a few problems in practice:

1) readline hardcodes the backslash in a bunch of places, especially in history expansion. Psh needs '^', not '\'.

2) the psh hosting APIs aren't ready: powershell.exe uses a few internal APIs to which other hosts don't have access, and without these APIs, psh does helpful things like wrapping lines at 80 characters and paginating output.

I also recall there being some problem with psh's completion.

Still, it's amazing that embedding the CLR and posh inside a Cygwin process works at all.

Is there any historical reason that conhost is the only executable able to open a master handle?

Well, originally, console handles weren't real handles. They were userspace hacks, which is why DuplicateHandle[1] has special cases for console handles. (Imagine using negative file descriptor values that libc understood.) Originally, the terminal emulator ran as SYSTEM and was spawned from (IIRC) csrss (which is special, because reasons). In Vista, with integrity levels introduced, the terminal emulator had to run in a different security context, and so wasn't themed. (To prevent shatter attacks.) In Windows 7 and 8, Microsoft developed a real, honest-to-god pseudoconsole API, with a real protocol. In this implementation, console handles are real handles and conhost is just a normal program that speaks a protocol.

IIRC, the only thing at this point stopping Microsoft creating a pseudoconsole API based on real, honest-to-god kernel HANDLEs is a willingness to externalize, document, and support what has been until now a Windows-internal API.

[1] https://msdn.microsoft.com/en-us/library/windows/desktop/ms7...

Regarding a public API, this GitHub comment suggests that people at Microsoft are working on it.[1] Of course, Microsoft can't commit to anything. As the winpty maintainer, I hope some kind of public API emerges to reduce my maintenance burden and make winpty more robust and efficient.

Aside: About a year ago, I studied how the console handles worked (especially in relation to console attach/detach/alloc and CreateProcess), and I discovered a bunch of interesting things. For example, the "fake" handles were inherited regardless of the bInheritHandles parameter to CreateProcess, and an entire "ConsoleHandleSet" was inherited if a process attached to another process' console. Real handles in Win8 seemed to be either "bound" to a particular input/screen buffer or "unbound" so they worked with any attached console. It was possible to have a console open with no attached processes.

I tried to formalize the behavior I observed here: https://github.com/rprichard/win32-console-docs

In the course of testing, I found some neat bugs like:

- A small program that blue-screens Vista

- The inheritable flag on a console handle can be changed on any Windows version, except for Windows 7.

- CreateProcess has an undocumented(?) behavior where it duplicates stdio handles into the new process rather than inherit them. (The handle values will generally change.) Up until Windows 8.1, a stdio handle of INVALID_HANDLE_VALUE was duplicated into a true handle to the parent process. (Because INVALID_HANDLE_VALUE is actually the GetCurrentProcess() pseudo-handle, you see.)

- Assorted inconsistencies in Windows XP and WOW64 (probably fixed in newer OSs).

I'm not sure how much confidence I have in the spec. It is backed by a big test suite, but there are so many cases to consider.

[1] https://github.com/Microsoft/BashOnWindows/issues/111#issuec...

> winpty

Cool. There's a third option that you might want to consider for winpty. (I was experimenting with it before I left Microsoft.)

The basic idea is to use a DLL shim to provide your own console API implementation for a process and its children --- you can replace the system console API with your own by shimming DLL import and export tables. You'd use your own type of pseudohandle for your fake console handles; programs written to work with the old console pseudohandles should work correctly with your pseudohandles, provided you've hooked the relevant HANDLE-using APIs.

Yeah, I considered that route, but I was concerned that it'd be unreliable -- maybe with other programs doing clever tricks with import/export tables, or maybe with antivirus. I can't think of a specific reason it wouldn't work, though. I think it'd have to override CreateProcess to propagate the API hook. IIRC, ConEmu's CreateProcess starts a child in a suspended state so it can install its hook before resuming the child. Maybe it'd have to hook GetProcAddress, too.

I was assuming I'd use genuine console handles, but recognize that they're associated with the special console, so divert their API calls.

Maybe the technique would make programs slower. I know that's a complaint people have about ConEmu.

There's a similar (4th?) technique I've considered -- instead of hooking every process attached to the console, hook APIs in the conhost process and reimplement the internal protocol. It should avoid the performance problems and confine the hackiness to one address space. The trouble is that the protocol is undocumented. A small change could break everything without warning, but MS could also redesign the whole thing, making a fix impossible.

I also tried using the console accessibility APIs [1] to at least synchronize scraping and forwarding to the pty. The problem with this approach is that the console code issues EVENT_CONSOLE_UPDATE_REGION and such with its internal locks held, so attempting to read from the console from inside the accessibility handler deadlocks.

[1] https://msdn.microsoft.com/en-us/library/ms971319.aspx

To do that, you have to attach the conhost.exe process to its own console with AttachConsole, right? I tried that and also noticed the deadlock. An out-of-context handler avoids the deadlock, but then it's not synchronized.

The other interesting problems I found with that API were: (1) when editing an input line, EVENT_CONSOLE_CARET notifications were delayed, (2) if I hit ESC when editing an input line, the line would clear, but there'd be no update-region notification, and (3) it's impossible to distinguish between a call to ScrollConsoleScreenBuffer and the whole buffer scrolling because we've reached the end. ScrollConsoleScreenBuffer might scroll only part of the buffer.

winpty doesn't use the WinEvents API (yet), but it might be good enough to reduce polling when the console is idle.

> - A small program that blue-screens Vista

You mean like on current up-to-date versions of Vista and Server 2008? With a PoC on the GitHub page? You realise that you have found a security vulnerability and are disclosing it publicly, right? Userspace must never crash the kernel, even if it's not further exploitable, and especially not for an unprivileged user. Be responsible: send a mail to secure@microsoft.com

Yes, it's a tiny program that (1) detaches the console, (2) attaches a new console, (3) closes all screen buffer handles, then (4) creates a new screen buffer. The fourth step caused a BSOD on Vista and Server 2008. AFAIK, they were up-to-date.

I did report it to Microsoft before making it public. The reply was:

> Thank you for contacting the Microsoft Security Response Center (MSRC). I would suggest trying on a local VM to confirm BSOD. However, this currently is just a local DOS, which would not be something we would investigate further. If you have any additional information on how this could be further used to exploit another user or a remote DOS, please let us know and we will look into it.

> For an in-depth discussion of what constitutes a product vulnerability please see the following:

> "Definition of a Security Vulnerability" <https://technet.microsoft.com/library/cc751383.aspx>

> Again, we appreciate your report.

I might be suffering from reverse nostalgia, but it was never hard to bluescreen Vista, and I never considered those issues worth reporting.

Just debugged it and it looks like a NULL pointer read in CSRSS, in winsrv!SrvCreateConsoleScreenBuffer.

In fact, the same bug exists in XP, but the NULL page is mapped in XP which is why it does not BSoD. So far I have not seen writes, only reads.

Update: it looks like the NULL pointer is accessed several times in the code. They are mostly 16-bit accesses, so it is probably not addresses. They seem to be in font-related code.

Yes. Please report these bugs!

On Vista, conhost also ran in the user context and console windows were themed; on XP they were not. I believe that conhost.exe was introduced in Vista together with the new desktop separation mechanism, and on earlier versions all the console window logic was internal to csrss (which in the original NT design was essentially the display server and window manager, so placing console emulation there makes some kind of sense given NT's model of that).

Ah, yes, you're right! The windows were created directly.

Win7, not Vista.

How do you know all this? :)

Three years on Windows and Windows Phone. I sometimes miss those days. :-)

ConEmu doesn't suck, and it uses method 2. It has a few quirks, but nothing serious; it's almost on par with tmux in usability and in some respects even better.

I use conemu at work. It's not as awful as some alternatives. I would recommend it, but I can't say that it's a good experience. Maybe I'm doing something wrong here... do you run vim in conemu? For me, it's slow when redrawing lines of text while paging and sometimes renders artifacts in the pane, especially when I vertically split vim.

For a command line interface, I find it to be fairly unstable, too.

I use vim in cmder occasionally. The only way I've been able to use it properly is to disable conemu hooks before I execute it.

Basically, I've defined a function in my profile to:

* Save the ConEmuHooks environment variable value

* Set it to "Off"

* Execute vim

* Reset it to the previous value

I alias this command to "vim".

It's not pretty at all, and it caused me to use Sublime or VSCode most of the time, but it worked for me.

More info here: http://conemu.github.io/en/ConEmuEnvironment.html

Why would you do that when gvim is available and is a far better experience than vim? For the sake of being a shell purist? On a remote shell?

I'll give this a shot, but I'm unclear how the hooks affect vim or other processes. Does this address all of my vim+conemu problems? Thank you, in any event.

Have you tried Cygwin vim in mintty?

Also, Emacs has a mode where it'll run as a Cygwin process, but use native win32 widgets. I've found using it to be the most pleasant way of interacting with Windows systems.

When I was at Microsoft, I joked that I wasn't so much running "Windows" as GNU/NT. :-)

Agreed. It's the best terminal emulator I've found on windows. I usually use it for other shells and avoid cmd like the plague but, whatever shell you use, it's a far better app than conhost.exe.

I haven't experienced the instability that others have but I do find its configurability a bit excessive. Somehow I want it to be much simpler but still do everything that it does now. I don't know the way to achieve this but I believe it's possible.

Cmder is basically Conemu plus Clink with nice defaults and useful initial profiles.

I tried it a few times, but it always crashes rather quickly. So it sucks.

Have you tried using vim inside it?

Conemu does a decent job given all the restrictions the Windows platform has, but it's still miles behind a real Linux terminal emulator.

No... there is no need for that on Windows. I use gvim all the time.

> Typing “cmd” (or “PowerShell”) in File Explorer’s address bar will remain a quick way to launch the command shell at that location.

Where the hell is this stuff documented?!

This is one of the biggest problems with Windows, Visual Studio, Office and other larger (enormous) software Microsoft makes: an insane amount of features but documentation is nonexistent or hard to find.

The software itself is quite powerful and feature-rich, but finding the right tools from the menus, hidden hotkeys etc is difficult or impossible.

Still way better documented than most FOSS software that isn't a copy of some decades-old UNIX utility.

Just like with many FOSS projects, one gets the actually valuable documentation by buying books.

Well, with FOSS projects, you can at least have a look at the source. Won't replace proper documentation, but that's still better than "reverse engineering" binary tools.

Which is not really an option for your average joe.

This has just changed my life.

I thought right click -> Open Command Window Here was the only way. (Sometimes need to hold shift? Maybe just in user directories)

On a related note, I recently discovered WoX [1] and I'm pretty happy with it. I think it is an even more convenient launcher: Alt+Space from anywhere, type cmd, ENTER.

With a couple plugins you can type things like "win <TAB>" to select from a list of open windows, "g search-term" to google a term, etc.

1: https://github.com/Wox-launcher/Wox

The plugins part I understand, but isn't "Alt+Space, type cmd, ENTER" the same as "windows key, type cmd, ENTER", which has been baked into the OS for a few years?

...or Win+R, "cmd", enter – which has been there for more than 20 years.

Because Win+R doesn't autocomplete the command, other than from the history of things you've already run. WoX will pick up and do fuzzy completion on all programs in the Start menu, for instance.
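Fuzzy completion of this kind usually boils down to an in-order subsequence match; a toy sketch (not WoX's actual algorithm, which also scores and ranks matches):

```python
def fuzzy_match(query: str, candidate: str) -> bool:
    """True if query's characters appear, in order, within candidate."""
    chars = iter(candidate.lower())
    # `ch in chars` advances the iterator, so matches must be in order.
    return all(ch in chars for ch in query.lower())

programs = ["Visual Studio Code", "Command Prompt", "Control Panel"]
print([p for p in programs if fuzzy_match("cmd", p)])  # ['Command Prompt']
print([p for p in programs if fuzzy_match("vsc", p)])  # ['Visual Studio Code']
```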

It's also in the File menu.

Not on Windows 7 here, but Windows 8 has it at least.

Well, I was kinda counting on a discussion about a change in Windows 10 to have that implicit context.

Not much about the UI is actually documented these days, probably on the assumption that the vast majority of users never read or even opened the online help. In any case, this has been working since forever. The address bar was effectively a Start ▸ Run with the current folder as working directory (if possible) when it was introduced. They have been careful not to break that.

I still haven't forgiven them for changing what <Backspace> does in File Explorer (used to go up one level, now goes 'back' in the history). Even though that was undocumented.

Hopefully you have discovered Alt+<Up Arrow>

I have now! :D

Dunno, but been using this for years.

Works not only for cmd, but also for any other executable on %PATH%

Maybe it's not that well documented because it is kind of an oddball use of other features?

From what I can tell:

To be as useful as it can be, the File Explorer does everything it can to resolve a path. This seems to include an eventual last-ditch fallback to shell create-process, for whatever reason. (Maybe a leftover from the era when pseudo-folders proliferated as separate processes, or from when the address bar was reused as a taskbar search bar in 9x? Sounds like a good topic for the oldnewthing blog, I'd imagine.)

Because the new process inherits the current working directory from the File Explorer window that launched it, you wind up where you expect to.

You can use any command on your path, too, beyond just cmd or PowerShell. For instance, I just tried gVim and VS Code for fun. (The interesting part here is that I apparently have a Code.lnk file on my desktop, so I had to over-specify VS Code's command name to get it to work: `code.cmd`. I also had to remember to type `code.cmd .` to get it to actually open that folder in a new window.)

Here [1]

I have my own question though: Any idea how to give keyboard focus to the 'address bar' in file explorer, like Ctrl-L does in browsers?

[1] http://www.howtogeek.com/235101/10-ways-to-open-the-command-...

Alt+D (only on English UI languages), or Ctrl+L. Works for me at least.

Alt+d works in all browsers also :)

I know this is a bit late, but F6 is another option.

I learnt it from colleagues. Maybe it was mentioned on a blog somewhere. It's the Microsoft way!

PowerShell should have been made the default years ago, particularly on Core.

It escapes me how the Windows admin community still largely ignores PowerShell while clicking Next in a loop.

Until MS makes Windows PowerShell-first, adoption will keep changing very slowly, as we've witnessed so far. Nano Server will hopefully bring it faster.

It's a terrible injustice, as posh is the best shell in existence, IMO.

What I'd really like to see in the future is exporting the entire Control Panel config as a posh script. This is the perfect time to dig into it, given that the Control Panel is being transformed into 'Settings'.

Many people are put off by PowerShell because they believe that 20 years of experience in bash or CMD will allow them to just fire up PowerShell and get by. You will not get by. There's a good amount of gotchas, and you will get stuck on some unexpected behavior within minutes.

The thing is, if you want to use PowerShell you do need to spend a few days just learning PowerShell and nothing else. I have the impression some people don't want to make the effort. It's a pity, because it truly is vastly better than any other shell in existence.

PowerShell is almost like using the python interactive interpreter as your command line but better. I was convinced how cool it was when I could import a C# .dll and load the types into PowerShell. You can even just write C# 'inline' and it compiles it on the fly.

It baffles me as well how many Windows people I work with almost ignore it.

This is really the best analogy to explain PowerShell to Unix people. PowerShell's sweet spot is really not interactive use, but sysadmin-type scripting. It's very much like using Python for sysadmin tasks.

So essentially it's a programming language? For "scripting", why would I use it over just plain C#?

It's a bit like asking why you would use F# or (Iron)Python over C#. Different syntaxes have different strengths and weaknesses, and the trade-offs can target different use cases and needs. At the end of the day it's whatever makes you (feel) productive.

In terms of specifics, PowerShell's strengths over C# for scripting lie particularly in its ability to succinctly navigate hierarchies (folders, but also other folder-like trees) in a way reminiscent of shells from time immemorial, as well as to easily interleave native binary calls (and pipes/redirects/captures).

Some of it is the difference between grabbing a C# binding for libgit off NuGet and LINQing my way through the git DAG for the information I need, versus running `git show xyz` or `git describe` directly in a shell and capturing that into a variable or piping it to whatever my next steps are. Depending on my level of abstraction, how often the script is called, and whether I'm paid to write that script or it is getting in the way of what I'm actually trying to work on, I use a different tool for the job.

Because C# is designed around application development, whereas the PowerShell scripting language is designed around scripting and its peculiarities.

For starters, I would look into it if it supported a different look, like ConEmu does (fonts, themes, fullscreen).

It does? You can run powershell in ConEmu—I run it in cmder.

Except PowerShell isn't a real shell in the same way that cmd or any of the POSIX shells are. It's a REPL for a language running on the .NET VM. I'm neither a fan of the language nor of having to wait several seconds for the .NET VM to start when you start posh.

What is a shell for you then? If the defining characteristic is that it's not written in a managed language, well, okay, but I don't think many people share that sentiment.

I last observed the several-seconds startup time with PowerShell v2 or so, and that was not the time needed for the CLR to start, but rather PowerShell itself being slow to start. v5.1 starts instantly here, and the only slow thing about it is my profile, which admittedly does a bunch of things that take time (it's not yet at the point where it annoys me enough to edit it, and I can start typing commands during that time anyway; considering that I often spend more time thinking about what to write, that's okay).

> What is a shell for you then?

A way to execute programs. The parent is right; PowerShell is more like a REPL for executing .NET code. It's still nowhere near as good as a Unix environment.

But as long as you only execute programs, a script for cmd, bash, PowerShell, and a number of other shells looks pretty much the same. In PowerShell, executing a program is simply stating the program name (barring spaces in the path for now). You don't have to do something like execute_program(name, args, input_redirection, output_redirection, ...). Well, you can, if you're so inclined, but it's usually unnecessary. Any shell is "just a REPL for a scripting language". Interactive use is one of the tenets of a shell; otherwise it's more of a batch processing system, I guess. And just like in other shells, the language is one geared towards running commands (cmdlets or external), managing their execution and processing their output. That it works better for general programming than many other shell languages (in my eyes) is a nice bonus.

Heck, running programs is even easier than in most other shells, since there's a specific token (--%) that stops argument parsing completely, which means you don't have to care about quoting at all (an ever-present nightmare at times).

The difference is that Windows/PowerShell don't provide the programs that make a shell useful.

Wow, then I've done a lot of apparently-useless PowerShell work.

I beg to differ, by the way. Different is not the same as »not useful«. It's not a traditional Unix shell and doesn't pretend or want to be. That doesn't make it useless, just different.

> It's not a traditional Unix shell and doesn't pretend or want to be.

Except in the sense that it provides aliases for most popular Unix shell commands. They even made their aliases incompatible with other Unices, like every other traditional Unix shell.

I agree that powershell is far from useless, but it does feel like there are a lot of cmdlets missing out of the box. It's real easy to end up in Get-WmiObject land for everyday admin tasks.

But WMI is really, really huge. I can understand that there is a bit of resistance to providing cmdlet wrappers around every single WMI class, although I think on Windows Server a lot more things are exposed by default (and I think even more in modules that are not loaded by default).

Also, since PowerShell can work with WMI objects natively it's already close to what a cmdlet can provide.

Doesn't Bash (or any Unix shell) fail to fit your definition, then? Bash contains an interpreter for a language that, while lightweight, is still a separate interpreted language. In particular commands like cd can't be executed as separate programs.

A shell is so named because it was envisioned as a way to interact with the operating system core (not kernel, the core typically consists of both kernel and userland processes/tools). Core and shell, get it?

A command shell is a shell that uses commands to interact with the operating system, whereas a GUI shell is a shell that uses a graphical user interface to let the user interact with the operating system core (services/tools/processes).

In the Unix philosophy, a 'command' is almost synonymous with a 'program', except when it's not. It seems to me that you are stuck in the Unix mindset and refuse to entertain the idea that a "command" could be anything but a program. In fact, Unix shell commands are already a hybrid concept: commands can be magical built-in commands (like 'cd'), programs (like 'ls'), or script files which execute in a special surrogate process (a separate shell process).

However, there is a problem with programs as commands: they are opaque. And because every program is started within its own process, they are also limited when it comes to extending the shell's features. This is why 'cd' is a special command that is not a program.
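Why 'cd' can't be an external program is easy to demonstrate: the working directory is per-process state, so an external 'cd' could only change its own, soon-to-exit process. An illustration in Python:

```python
import os
import subprocess
import sys

before = os.getcwd()

# Run "cd" as if it were an external program: a child process changes
# its own working directory and then exits.
subprocess.run([sys.executable, "-c", "import os; os.chdir('/')"],
               check=True)

# The parent's working directory is untouched, which is why every shell
# must implement cd inside its own process.
after = os.getcwd()
print(before == after)  # True
```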

PowerShell is a fresh take on this: PowerShell commands are still used to interact with the operating system, but they need not be full-blown programs executing in their own process. Hence, PowerShell is in some ways even terser than the traditional sh shells. An example of this is the way 'cd' is not a magic built-in command. In PowerShell, 'Set-Location' (the command for which 'cd' is an alias) is implemented like any other command and packaged in a module. It's just that this module is always distributed with PowerShell.

This approach also has several other advantages (not limited to):

* PowerShell "discovers" the names, types, and positions of command parameters. It is the shell that performs parameter parsing, position matching, and (most importantly) value coercion, e.g. parsing a string as a number when the parameter expects a number. This leads to much higher consistency.

* The shell can generate documentation from the command definition. Not possible with sh shells, where you need authored documentation, because parameter parsing is really the responsibility of each command.

* The shell can generate tab-completion and auto-suggestions from the command. Not possible in sh shells, where you need authored completion files to support completion. In PowerShell you get documentation and tab-completion even for scripts you write yourself!

* The commands may interact with the shell itself. This is important not just because you can implement commands like 'cd', but because the "shell" may actually be part of an application. A GUI administration tool can actually act as a shell, and commands can control the GUI. A very generic example of this is PowerShell ISE, where commands can actually open new windows/panes, change menus etc.

> The parent is right, powershell is more like a REPL to execute .net code.

Wrong. PowerShell is a REPL (i.e. a command shell) to execute PowerShell commands. PowerShell commands may be cmdlets but may also be programs. PowerShell may be implemented in .NET, but PowerShell scripts are not .NET code.

Yes, I'm aware of all the selling points. They even sound great in theory. In practice though, PowerShell is nowhere near as easy as a UNIX environment.

Says the one who never used a Xerox PARC-like OS in anger.

A GUI workstation using a REPL for scripting/automation is so much more powerful than a plain UNIX shell.

Type program.exe in powershell, it runs

Fits the definition for me

With Nano Server, startup speed is optimised: it starts in under 500 ms. It has to, otherwise it would be slower than docker run.

It's around 1.5 seconds here on an i7-6600U (whether I launch it in its own window or in a cmd window). I'd rather it was 1/10th of that as a maximum, but it's worth the wait!

The horrible syntax of Powershell isn't something I can look past.

I think it's fairly neat (I mean, for shell scripting, not for interactive use). The problem with shells is that they want to do both. The alias system in PS tries to make it possible to use a more bash-like syntax interactively and the verbose syntax in scripts, which is a good idea, but it's hard to get right.

As someone who golfs in PowerShell and very frequently uses it interactively too, I think the complaints about lack of terseness are quite overblown. Aliases, parameter binding by position or pipeline, the ability to shorten parameter names as long as they remain unambiguous, parameter aliases – all features that are most important for interactive use. Apart from a few nagging things where parameter prefixes clash (gci -f matches Force and Filter) I think it's quite good. Besides, I gladly pay that little price for much improved readability, discoverability and memorability. No more pondering whether the -Recurse flag is -r or -R in this particular command, for example.

Also, since PowerShell is able to tab-complete cmdlets, parameters, parameter values (when they're enumerable) and even object properties, you hardly have to type much most of the time.

The only problem with the tab-completing is that since everything has a standard verb prefix, you always have to type the beginning of the cmdlet name.

I haven't used Powershell since Windows 7, but at least back then it was super slow to start. cmd.exe opens immediately, like a normal terminal emulator. Powershell took several seconds to open.

At least for me, if I'm opening the command line it's to get something done quicker and easier than I could with the GUI. Powershell wasn't that, at least on Windows 7.

Just for your reference, I've got Windows 10 and Powershell 5.1 running on an SSD (probably the only hardware that matters for program launch, really).

I can hit Win-X-I and start typing commands as fast as possible and Powershell is up instantly and I've never missed initial characters on a command. For interactive use, it's always there and up quickly.

This is in contrast to the typical start menu search which takes about a second before it registers keystrokes and I'm always searching for something missing the first few letters of the file or executable name.

Windows 7 shipped with PowerShell v2. So yes, back then it was slow. It's not anymore.

Side note: A terminal emulator is completely different from a shell (and on Windows (prior to 10 at least) there wasn't such a thing, but rather the console, which did not emulate a terminal).

Helge Klein wrote an excellent article "What I Hate About PowerShell": https://helgeklein.com/blog/2014/11/hate-powershell/

Be sure to read the comments where PowerShell guru Jeffrey Snover politely addresses them.

I share many of these same frustrations that have prevented me from adopting it as a Windows scripting language. I use it when I have to, but I still will choose C#-compiled EXEs, VBScript, or even Py2Exe'd Python programs over PowerShell for sysadmin-type tasks.

I was in the audience at MS TechEd 2005 in Orlando and watched as Jeffrey Snover presented Monad and highlighted its Perl, awk, sed, and Unix influences. I was immediately impressed.

However, PowerShell seems to fit best as a REPL shell, and for me it is more mentally cumbersome than using C# (which of course can technically do everything PS can).

My gripes with PS are kind of minor (a bit like my dislike of Python's whitespace enforcement), but it's kind of like a death by a thousand annoyances with PowerShell for me.

For example (stolen from Helge's article), these are some WTF's for me

* Testing for string equality: "abc" -eq "abc" ( why isn't this "abc" == "abc" )

* Greater than: 8 -gt 6 ( why isn't this 8 > 6 ?)

* Functions can only be used after being defined. Leading to scenarios where a script begins waaaay down at the bottom.

* dynamic scoping: http://ig2600.blogspot.de/2010/01/powershell-is-dynamically-...

* No way to syntax check a PS script before it's run

These aren't all major, but the last one is a show-stopper for me. If I'm running a script on a couple hundred thousand machines (and I have) I can't afford to have my script bail out on an unexpected code-path.
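(For what it's worth, PowerShell v3 and later do expose the parser, so a syntax-only pre-flight check is possible along these lines; the script path below is a placeholder. It still won't catch runtime failures on unexpected code paths, which is the harder problem.)

```powershell
# Sketch: parse a script without running it and report syntax errors.
# The Parser API is available since PowerShell v3; the path is a placeholder.
$tokens = $null
$errors = $null
$null = [System.Management.Automation.Language.Parser]::ParseFile(
    'C:\scripts\deploy.ps1', [ref]$tokens, [ref]$errors)

if ($errors) {
    $errors | ForEach-Object { Write-Warning $_.Message }
} else {
    'No syntax errors found.'
}
```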

I really like the idea of PowerShell, and I am honestly glad it exists, but I think it would benefit from some changes outlined in Helge's article.

And this is why I, someone who has worked professionally with Windows for over 15 years (sysadmin, developer, security), don't choose to use PowerShell very often.

> * Greater than: 8 -gt 6 ( why isn't this 8 > 6 ?)

Wouldn't that be ambiguous with ">" when used for redirecting IO?

I was happy to find out about Clink: https://mridgers.github.io/clink/

It's not a cmd.exe replacement/wrapper - it just hooks the program and beefs it up with decent stuff like Unix style keybindings, history and completions.

For C# developers, have a look at csi (C# Interactive) as a potential cmd replacement. I've been using it extensively for a couple weeks and like it very much. Also, the same scripts can, with small modifications, be posted to Azure Functions to run on the cloud.

Wow, that is pretty awesome. Is there any way to get Intellisense into the command window for CSI? I, all of a sudden, realized that I don't remember properties of the objects without it.

You can get that in VS via the Interactive Window (View->Other Windows->C# Interactive).

Theoretically you could implement a completion system for the terminal but we reused the WPF editor layers for the completion system rather than building a new one from scratch, so it isn't available in the terminal.

Also, all that is open source at https://github.com/dotnet/roslyn and the scripting core can be rehosted as an API.

Last time I tried this there was no way to load nuget references. Very limited usability if that's true.

I wasn't trying to use it as a shell replacement but like a REPL for interactive development/prototyping. F# works better for this IIRC, but I have not used F# in several years - need to brush up on it.

When I work locally, I have a dummy VS project for grabbing assemblies from NuGet. With Azure Functions, your function.json file can reference NuGet packages.

You might want to look at ScriptCS for that, that also has a REPL and it does support Nuget packages.

Or even better, the F# version of it (.fsx).

I use Powershell as my default command shell (via ConEmu). But its verbosity makes it a pain to work with as your everyday shell. For instance, in cmd.exe you can say:

dir /od

But in Powershell, you have to do:

dir | sort-object LastWriteTime

I totally get the Powershell verbosity when it comes to writing scripts. But it can be a PITA when you're using it interactively.

First off, if you are using the built-in `dir` alias, why not use the `sort` alias? Note also that you get tab completion for `dir`, `sort`, and `LastWriteTime`, so "verbosity" becomes decoupled from "speed of entering commands", which is the real goal.

    dir | sort LastWriteTime
I think many in the "Powershell is too verbose for interactive use" camp have simply been trained by cmd to have a very narrow view of what is appropriate for interactive shell execution. Cmd's syntax and toolbox are so poor that only the simplest of tasks are considered tractable for interactive scripting - doing anything non-default is incredibly difficult (or impossible) without a dedicated flag on some windows util (e.g. `/od`). Thus you come to a state where the only commands anyone enters interactively in cmd are very short and simple - not because the language is concise but because it's too painful to attempt anything more.

Translating that narrow set of cmd-suitable tasks into Powershell, one finds that indeed on average Powershell is a bit more verbose, and one says stuff like "Powershell is more verbose than cmd".

What you're missing is that Powershell actually gives you a general, complete, and uniform set of tools, which really are fairly concise given their power. The space of tasks which are now suitable for interactive scripting is now vastly larger.

So before you say Powershell is "too verbose", I ask how concise batch/cmd is for the following tasks:

    # assign earlier result to a variable (quite hilarious the gymnastics cmd makes you do simply to assign stdout to a damn variable)
    $x = dir | sort LastWriteTime

    # sort by the modified *date*, then by filename
    dir | sort {$_.LastWriteTime.Date},Name

    # get the sum of file sizes in the current dir, in MB
    (dir | measure Length -sum |% sum) / 1MB
Many would reply to this with something like "well if I need to do something like that I'd be using C#/Python/VBS". That's my point! With powershell you have a great interactive shell for super short stuff like file copies or `dir` listings (not the case for C#/Python/VBS), but ALSO the ability to do powerful general scripting with one-liners in the shell.

My "everyday shell" usage frequently covers scenarios similar to or more complex than my examples above. I am quite happy to pay a minor verbosity tax on the trivial commands so that I have the power to do the non-trivial stuff without leaving the shell.

When I started working on Windows again at the end of last year, the first thing I did was pin PowerShell to the taskbar.

I have not used the plain CMD for a long time and would not go back to it. Muscle memory makes me type "ls" in any directory I enter when looking around for something, the fact that PowerShell has this is already a reason for me to use it.

I'm not a heavy PS user though, and feel that it still compares poorly to Linux or Mac's terminal for that matter.

The terminal itself is poor, but PS the language is so much more sane and ergonomic than bash. Doing text-in-text-out where the sed/awk/split/sort relies on every program in the chain (perhaps a git log, or an ls or whatever) to spit out its text in exactly the same way it always has just feels like a terrible idea.

Example: count the number of files of each extension in the current directory and list the results alphabetically by extension (I honestly didn't pick this as a way to show off PS's strengths, and I realize that you can always pick a benchmark that shows what you want - but I wanted something small and understandable).


    ls | group Extension | sort Name | select Count, Name

    ls | awk -F . '{print $NF}' | sort | uniq -c | awk '{print $2,$1}'
These lines accomplish the same thing, with about the same amount of typing required. You can argue that objects are brittle too, or that text is a universal format whereas objects are not. But you can't tell me the bash line looks better, or that for everyday tasks PS is always a verbose mess. Also: just typing the bash line on my keyboard includes hitting Ctrl+Alt/AltGr eleven times!

I don't see why we should have to learn a new language for this at all.

The set based paradigm of SQL lends itself very well to reporting and processing files, and is known by 7 million people.

SELECT extension, count(*) FROM files GROUP BY extension ORDER BY extension;

We built a tool, free for personal use, that does SQL at the command line


Shells aren't file processing only - you process anything (dates, logs, configs, environment, apps, ...).

Unless the underlying model has all of that, I don't see how it competes with bash/ps

It should be doable but requires more than a filesystem relational model

Select * from processes order by priority

I mean, I see the benefit of using SQL, had I mastered it, for files, but I'd be reluctant to sort and group reg keys or users or devices or processes or env vars differently from files in my shell language. You must have a "relational model" for the file system; do you have a way to (or a plan to) generalize this to general structured data?

I have not used PS that much so I am not sure how much can be accomplished with it.

One of the great things about a Linux system is that you can use the terminal to install and delete programs. Can you do that with PS without using something like Chocolatey?

Not sure I understand the question. You can always just copy a program somewhere, add it to PATH and so on with both PS and cmd (or script it, the equivalent of "make install").

You can also download and install Windows installer setups from a shell, and also uninstall them, e.g. by name (more elegantly in PS than in cmd). Because win apps are usually binary, this is a more common way than the "make install" equivalent on Linux (where I don't believe there exists a distro-independent binary installer format?).

You can't install apps like "packages" without a package manager like Chocolatey, but that I assume is the same in Linux: you need to install a package manager like apt or yum if the distro didn't come with one.

Indeed, you have package manager interaction but I think that on most linux boxes that is standard? I could be wrong there though, it just goes for the ones that I have used.

Still pretty neat that you can uninstall a program from PS. Might need to give it a closer look, any suggestions on where to learn it from?

It's not all that standard on Linux under the covers either, because the package manager is a function of the distribution and its ecosystem and there are two main package ecosystems rpm (current package manager: yum) and deb (current package manager: apt) and a very long tail of other package systems and ecosystems. If you stick primarily to a single distribution you wind up mostly fine (until you need packages outside your ecosystem), if you need to interact with multiple ecosystems things get more complex.

The big difference between something like Chocolatey and rpm/deb Linux package ecosystems is more one of scale (how many packages exist in the ecosystem; how many players are expected/required to buy into the ecosystem).

An interesting twist to this is that PowerShell in Windows 10 'recently' added a meta-package manager "OneGet" that was designed to use a single set of commands to learn to work with a wide collection of package ecosystems (including, but not limited to, Chocolatey, NuGet, MSI installers, Docker packages, whatever other sort of provider someone wants to build).

There are several different ways, see e.g http://stackoverflow.com/questions/113542/how-can-i-uninstal...
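One of the approaches from that thread, sketched (the '7-Zip*' display-name pattern is just a placeholder; note that enumerating Win32_Product is slow and has side effects, so the registry-based answers there are preferable for real use):

```powershell
# Sketch: uninstall an MSI-installed program by display name.
$app = Get-WmiObject -Class Win32_Product |
    Where-Object { $_.Name -like '7-Zip*' }   # placeholder pattern
if ($app) {
    $app.Uninstall()
}
```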

> Cmd is one of the most frequently run executables on Windows with a similar number of daily launches as File Explorer, Edge and Internet Explorer!

Wait, they track that?

There has been an opt-in for »Share anonymous usage details with Microsoft to make stuff better« for ages in Windows and other Microsoft products. So if you enabled that, I presume they will log telemetry about what features of Windows are used, how often, together with which other features, etc. I wouldn't expect that to extend to non-Windows programs since there's little they can learn from it.

Executable name would probably be valuable. They could easily maintain a hand-curated list of Office-alikes, for instance (the list would be small enough to manage), to track what competitive products are being used and any trend in those.

This would provide competitive information and could help point the way to features that are important to users, based on the differentiation between the competitive product and Office.

I have no idea if they do this or not, but it's something that anonymous data could be used for outside of just Windows features and programs.

Yes, you could circumvent this sort of thing, but this is a case of the 99.99% of people who wouldn't care or know how to vs the .001% who do. For the use case outlined above, that's irrelevant.

I would be interested to know why you are being downvoted.

Because it's well known by anyone interested in this area? There's been a barrage of "Windows 10 Telemetry/Privacy" related stories since its release. It's also publicly documented by Microsoft itself: https://technet.microsoft.com/en-au/itpro/windows/manage/con...

"Performance and reliability data, such as which programs are launched on a device, how long they run, how quickly they respond to input, how many problems are experienced with an app or device, and how quickly information is sent or received over a network connection."

Because it's a comment that asks a question based on actually reading the linked article, and this is hacker news.

Also my first reaction

I was more surprised seeing internet explorer in that list

Did not read the terms, eh? Even with "tracking disabled", as some like to call it, they still collect usage and crash data. It's all documented, but nobody cares...

From the Article:

> Notice the nuance here: The above paragraph states that “[PowerShell] replaces Command Prompt (aka, “cmd.exe”) in the WIN + X menu, in File Explorer”s File menu“. It does not say “[PowerShell] replaces Command Prompt“, period! The paragraph even goes on to show you how to quickly launch Cmd, and points out that you can revert this default setting to launch Cmd by default instead if you prefer.


> So, to be ultra-clear here: All that’s happening is that, in the [Win] + [X] (“Power User’s menu”), or File Explorer’s File menu, PowerShell is presented instead of Cmd. That’s all! Nothing is being removed, your scripts will continue to run just as they always have, the sky is not falling!

Looks like their release notes writer did a poor job of explaining this, so all hell broke loose, and now they've forced the "Commandline" blog team to write a blog post screaming in CAPS and with underlined words and big fonts (and even more confusing language and punctuation) to tell everyone what they really meant to say in their release notes.

Agree that the phrasing could have been a lot clearer, but really, Microsoft removing cmd just doesn't pass even a basic sanity check. I remember seeing the original news articles and just ignoring them as obviously rubbish. It's pretty sad to see so many tech news sites & aggregators just pick up and run with stories like that, which anyone at all knowledgeable about the subject area would immediately know is hyperbole.

They implemented bash into Windows 10, and their web-ish/app-ish "platform" Silverlight does not work in their own browser in Windows 10 and isn't intended to.

Microsoft have improved leaps and bounds in the last few years but saying that a story is immediately hyperbole because "X-good-technical-reason" isn't something I'm willing to commit to for any large corporation.

Sure, but the degree of usage of cmd makes Silverlight's usage look like a hobby project. It's used everywhere, including all over Microsoft's own software. It'd easily be the single biggest backwards incompatible change ever made by Microsoft and would affect software going back to the 1980s. The engineering effort involved to migrate anything that is using cmd is mind boggling, to the extent of being completely infeasible. It's a safe bet that for as long as Windows is around, cmd is going to stay with it.

And it is exactly because of this mentality that Windows' next move is to add a Unix subsystem (Ubuntu) alongside all the old crap, instead of just becoming Unix compliant and progressively dropping old chunks of code as the years go by, like Apple started to do 15 years ago.

UNIX is oversold, and if it had been sold at the same price as other commercial OSes back in the day, it would never have taken over the IT market.

I appreciate the option of running other OSes more open to exploring the ideas of Xerox PARC.

Also, Apple only got UNIX by acquiring NeXT; with Be, the tune would have been different.

Even in NeXT's case, UNIX compatibility was a means to allow software into the system, not out of the system.

No valuable NeXT application, or for that matter Mac OS/iDevices application, made that much use of UNIX APIs.


Still, I find it interesting that Apple and Microsoft have again chosen different strategies to try to fill the gap between them and FOSS supporters: Microsoft is going to include a FOSS Unix subsystem, while Apple, for the first time in its history, is going full FOSS for Swift development.

It's just the Swift language itself that is open source and a few libraries. Most of the OS X layers above BSD are still closed source.

It's not "more open" than C# and .NET in that regard.

Sorry if I was wrong, I absolutely know nothing about C# & .NET.

Does it mean one could actually review the code and fork these languages too if need be?

> MS-DOS’ command-line shell’s scripting language was relatively terse and moderately powerful, but lacked many of the richer, and more advanced features we enjoy in modern-day PowerShell, Bash, etc.

Nice try, but the Bourne shell, which dates from 1979, already had most of the features commonly found nowadays in Unix shells.

I want to use powershell more but I am too trained to <Win+R> cmd <enter>

To do the same with powershell I need to type "powershell" within Run, many more keystrokes. I would love to just type "ps" or something short to do this (natively, not via an alias, for when I'm using other people's computers).

The inability to launch an elevated process with an easy key shortcut in the "Run" <Win>+<R> dialog has been breaking me of that muscle memory over last few years. The <CTRL>+<SHIFT>+<ENTER> sequence to elevate a new process works from the "Search" text-box in the Start Menu, but not the "Run" dialog.

You can try this little utility: http://code.kliu.org/misc/elevate/

It's very handy.

That's pretty cool. I've been using "su" [1] myself which is different, I think. From the website:

    "it will attempt to restart cmd.exe with administrator privileges, in the current directory, much like if you had started cmd.exe with rightclick "Run as Administrator"
1: http://p-nand-q.com/download/supershell.html

Haven't used that "su", but elevate will use the built-in mechanism for privilege elevation; if you make UAC prompts silent it will simply work without bothering you.

You just made my life so much better. Right click->Run as Administrator was pretty cumbersome.

That <CTRL>+<SHIFT> works on left-clicks on taskbar icons and double-clicks on other icons, too.

It also works with middle-clicks on taskbar icons to launch a new instance as admin.

You might be interested to learn that for shortcuts, Rightclick->Properties->Shortcut Tab->Advanced has an option to "Run as administrator".

Yeah, I knew that but usually, I'm fine unelevated.

I have pinned PowerShell to the task bar, so on machines I use with my users it's always Win+Shift+4 to get a new instance. Win+X,i is another way (provided you enabled the option for PowerShell to replace cmd on that menu, which the article mentions as becoming the default eventually).

Win+X,I (cmd or PowerShell, configurable, and the default that is about to swap hence all the articles lately) and Win+X,A (the same shell, but this time Admin).

Set up an alias for ps via .bat which then closes the cmd window - <Win+R> cmd <Enter> ps
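A minimal sketch of such a ps.bat (drop it in a directory that is on your PATH; the exact location is up to your setup):

```bat
:: ps.bat - makes <Win+R> ps <Enter> open PowerShell and close the cmd window
@echo off
start powershell
exit
```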

Sorry didn't read to the end of your comment :)

At least PowerShell is taking over as the default. Should have been changed back in Windows 7. Copy & paste with Ctrl-C and Ctrl-V is what made me switch to PowerShell.

The name CMD.EXE actually comes from OS/2.

The recent trend is to use the git shell on windows and execute bash scripts.

What a strange writing style! Almost every sentence ends in an exclamation point! I never realised CMD could be so exciting!


"Cmd has served us all well for almost 30 years now"

No it hasn't.

If Cmd is not going away, why doesn't MS invest in it and make it more useful? It could learn a few things from xterm, bash, etc.

The article mentions that. Every change, no matter how trivial, in cmd breaks some behavior that someone is depending on, or at least it feels that way. Even fixing bugs tends to break large number of scripts in the wild.

I'd like to see a 'bacmd' or something where they could break some history without fear, but clearly MS has decided to focus their resources on a very different way of thinking for their new shell.

Bullshit. cmd has an extension mechanism specifically designed to allow for breaking extensions [1] to the cmd command language. This mechanism hasn't been used very much, but in principle, you could easily add new SETLOCAL [1] commands to enable support for modern command shell features.

But cmd is in "maintenance mode", which means nobody is interested in adding features. Powershell is the new hotness. And Powershell wouldn't even be so bad if some of the more shameful problems were fixed --- a pipe between two native programs should be 8-bit clean!

[1] https://technet.microsoft.com/en-us/library/bb491001.aspx

It'd also be of dubious value by now. Cmd has been feature-stable since Windows 2000, I think. Any features added now (i.e. available only in Windows 10, update.next) would not be used by anyone since oftentimes batch files are used where you need stuff to work on pretty much every Windows installation. (Or they are so simple that they wouldn't benefit anyway.)

Yes, just freeze the old version and create a new CMD. And while they are at it, make notepad useful.

I can understand why notepad is simple. You can easily install alternatives, like notepad++.

Does Notepad understand Unix line endings yet?

No, but wordpad does - use it in lieu of Notepad when you need to edit text on a Windows machine and don't want to install external tools.

Or open in wordpad, save (which will convert the line endings), then open again in notepad and edit there.


That's part of its charm, like the two-state undo buffer

There are possible improvements that wouldn't affect the scripting language. *nix separates out the terminal (e.g. xterm) from the shell (e.g. bash).

Windows also separates that, although the separation is slightly different. For example, much of the line editing and tab completion in cmd is actually implemented by the console window itself.

They have powershell for that.

I've been using Powershell a lot lately. I keep hoping I'll get the hang of it, but everything sucks. It gives you Compare-Object, but what it returns is an object. Even if I waste 10 minutes converting it to text and want to then split the strings it returns there is a 99% chance it won't work. Sure it is great for automating a few IT things like installing new users, but the problem is it shouldn't be that hard. In bash I would just pipe the output of diff to Awk and simply split on "," and tell it to print what I want. I'm sure it is possible in Powershell, but the 6 ways I tried never worked.

I suspect that this is not a case of Powershell "sucking" but it being a tool that is completely unfamiliar to you.

You say you run compare-object and you're surprised that it returns an object; Powershell is essentially an object-oriented pipeline, so a return object should always be the expected case. You say is that it's hard to coerce the object into a text-stream that you can munge, but again I suspect that your difficulties are again a misunderstanding of exactly how you should be operating in Powershell. Even me saying "coerce the object into a text-stream" is disingenuous, because what you'll end up with is almost always going to be `string[]`. Once you've thrown away all of the data contained in the object except for the one string property you want, you've thrown away all of the value of Powershell.

None of this difficulty is your fault. In bash you would "just" pipe the output of `diff` into `awk`, but that "just" is a cop-out: bash is nearly as complex as Powershell itself before you throw awk into the mix. You already know awk and bash, so it's natural to you and all of your preconceptions about how the shell should work are met.

Powershell is entirely capable of doing what you want to do with it, but it's a completely different world. If you persist with it, and you do get the hang of it, I promise you'll understand the value of the concepts behind it.

True to a degree, but I only had to put a few hours into bash and I haven't found anything I can't do. I've spent a fair amount on PowerShell and keep running into hurdles. I found no fewer than 5 ways to add text from a certain process to a file. PS did it, but something like 800x slower than Python and Perl. The only fast way I can find is a direct C# implementation in PS using streams. Why so harsh? There is a lot I like, such as being able to build GUIs faster than anything out there (except Red or Rebol), but the language fights you at every step. I was aware that mostly everything was supposed to be an object. This is great for certain IT work, but just adds complexity for others. I have to then hunt for properties I want like "-property name" and it usually ends up as another rabbit hole.

> I suspect that this is not a case of Powershell "sucking" but it being a tool that is completely unfamiliar to you.

Maybe, but I still want to make the case that Powershell sucks:

- Function return values. WTF were they thinking when they decided it was useful for the result of a function to be an array of every statement result within that function? And why isn't there a useful way (like, say, "return") that overrides that behaviour?

- Automatic unboxing and splatting of arrays. If I have a list of one element, I want to know that I have a list of one element. If I ask for the Length of that array, I do not want to know the length of that single element (which is not 1, in the case of a list of strings).

- Case sensitivity. Sorting in Powershell is always case-insensitive, except when it isn't (when you use a .Net sort algorithm instead of Sort-Object). Still, Get-Unique is always case sensitive, as is Select-Object -Unique. Of course, you can get case insensitive uniqueness by using Sort-Object -Unique.

- The -AsString weirdness. Multiple Powershell commands can process String lists, and know they're processing String lists, but they will still return their results as Strings wrapped in PSObject. This leads to very fun results when you're trying to use those results as HashTable keys. I've lost days debugging this madness, especially when coupled with the implicit autoboxing I mentioned above.

> WTF were they thinking when they decided it was useful when the result of a function is an array of every statement result within that function?

Because in a pipeline results are usually »whatever the element happens to have as its output«. Usually you know or notice if statements within your function have return values or not. I haven't found it problematic so far, either. You don't write PowerShell like C# anyway, so different patterns are idiomatic. You'll get the hang of it fairly quickly.

> And why isn't there a useful way (like, say, "return") that overrides that behaviour?

There is. It's called return:

    return (expr)
is the same as

    (expr)
    return
and return will immediately exit the function / script block.

> Automatic unboxing and splatting of arrays.

Admittedly, this can be annoying, however this only ever applies to function or cmdlet output. When you're creating a one-element array (via @(1) or ,1) it stays that way unless used as the first part in a pipeline, where unwrapping is usually wanted.

You can also use Count instead of Length (unless your items have a Count property) since v3, as PowerShell implicitly understands reading the Count property on a scalar value that doesn't have it and returns 1.
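A minimal sketch of that v3+ behaviour (assuming PowerShell 3.0 or later):

    PS> (42).Count     # scalar with no real Count property
    1
    PS> @(42).Count    # explicit array wrapping gives the same answer
    1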

The usual fix is to explicitly make the return value an array, though:

    $foo = @(Get-Foo)
PowerShell cannot know that a call results in multiple values or just one, e.g. Get-Service yields an array, but Get-Service foo yields a single object. It's not unreasonable to expect something like (Get-Service foo).DisplayName to work instead of having to do either (Get-Service foo)[0].DisplayName or Get-Service foo | % DisplayName.

So it's a trade-off between two ugly things. Trade-offs are always bound to annoy people who consider one thing worse than the other.

> Still, Get-Unique is always case sensitive, as is Select-Object -Unique. Of course, you can get case insensitive uniqueness by using Sort-Object -Unique.

Huh, learned something new there, thanks. I think the only instances where I needed unique strings was in golfing, so sort -u was the way to go anyway. At least it's documented in each of those cases.

As for sorting using the .NET method not adhering to the defaults used throughout PowerShell, I think that's hardly surprising. You're explicitly leaving PowerShell and using the BCL directly. Of course other rules apply. And there are even worse annoyances on that front, such as the fact that PowerShell's current directory is not the process's current directory (so you can't really use relative paths with .NET APIs).
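That pitfall can be sketched like this (paths are illustrative; resolving with Convert-Path is the usual fix):

    Set-Location C:\Temp
    # The process working directory does not follow Set-Location:
    [System.IO.Directory]::GetCurrentDirectory()   # may still be where PowerShell started
    # So resolve relative paths before handing them to .NET APIs:
    [System.IO.File]::ReadAllText((Convert-Path .\notes.txt))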

> -AsString

Never used, to be honest, so no opinion here.

> There is. It's called return

I know. Just to be sure, my beef is that:

  function test() {
    1
    return 2
  }

still returns @(1,2), not just 2

If I remember correctly, you can put [void] in front of a statement to prevent it from being included in the return.

You can do a number of things:

    $null = (expr)
    (expr) | Out-Null
    [void](expr)

Same here. Every time I've completed a script I can't help but think I could have finished it so much faster in C# or just about any other language. I tried liking it. Maybe I'm just not finding the right useful examples.

I also like that a system touted for comprehensibility and discoverability has the following very commonly used command:

    Get-ChildItem

Guess what it does by looking at its name.

Correct, it lists the contents of a directory! What else could it be? Conversely, if you want to find out what "dir"/"ls" would be in PowerShell, where else would you be looking!

I like the idea of PowerShell a lot (object-oriented piping), but I hope it evolves into something with saner default aliases and better documentation, with easy things accomplished easily.

> Correct, it lists the contents of a directory

It does so much more than that. It can display child items from any PSProvider (Get-PSProvider): defined aliases, environment variables, defined variables, defined functions, registry nodes, X.509 certificate containers and even Active Directory paths. "FileSystem" is just one provider amongst many.

You can also `cd` to any of these providers to use them in a filesystem-like manner. e.g. `cd HKLM:\`, `cd AD:\` (if you have the ActiveDirectory module)

That's why the name is abstract. Many of the default cmdlets work on all kinds of stuff, not just bare filesystem stuff.

> Conversely, if you want to find out what "dir"/"ls" would be in PowerShell, where else would you be looking

Not so hopeless (although if you ask for detailed docs it opens a browser for you, internet required):

    PS C:\Users\xxx> help ls
    Get-ChildItem [[-Path] <string[]>] [[-Filter] <string>] [<CommonParameters>]
    Get-ChildItem [[-Filter] <string>] [<CommonParameters>]
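The reverse lookup works too; a short sketch (output elided):

    PS> Get-Alias ls                          # which cmdlet does this alias run?
    PS> Get-Alias -Definition Get-ChildItem   # which aliases map to this cmdlet?
    PS> Get-Command dir                       # also resolves aliases and functions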

From a basic data structures class in college, one might recall that nodes in a tree can have child nodes.

Filesystems, most of them anyway, are structured as trees of files and directories.

Therefore one would expect a command named "Get-ChildItem" executed against the current directory to do...?

(Aside from that, the Powershell default alias set translates both "dir" and "ls" to Get-ChildItem.)

Get-ChildItem also works on things other than directories. And Powershell understands both ls and dir!

It knows ls, but there are a lot of UNIX commands it doesn't, and it isn't a true implementation, just an alias for Get-ChildItem.

Yes, the point is to get started easily if you're used to another environment, not to be able to use it as a cmd or bash replacement.

I wanted to play around with PowerShell, but the learning curve looked a bit too steep.

They said themselves that PowerShell is not a replacement for Cmd...

Not a literal replacement, but they do want you to use PowerShell instead of Cmd (and for good reason).

I get that they want to push PowerShell.

Instead they could adopt Linux and release a Microsoft Linux. Why bother with Windows?
