PowerShell is open sourced and is available on Linux (microsoft.com)
1217 points by platz on Aug 18, 2016 | 760 comments



PowerShell is my guilty pleasure of the computing world. Once you've piped strongly-typed objects around between your shell commands, text scraping seems barbaric by comparison. The problem, of course, is that you can't use all that power for much, since the pool of PS-compatible tools and utilities is much shallower than it is for Unix shells. I'm really hoping this will help spur a new wave of PowerShell-compatible tools.

Come to the dark side, we have (strongly-typed) cookies.


After a number of years using PowerShell, my conclusion is the opposite: text "scraping" is just better for most cases.

Normal shell usage means doing a lot of one-shot pipelines incrementally. This is just easier and faster to do when passing text around, because you can look at it and don't have to inspect some object for its type and the attributes/methods it provides. Parsing the text is not the problem here (although many people think it is); reasoning about what we're trying to do is.
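
(To make that concrete: in PowerShell the inspection step usually looks something like this rough sketch, with Get-Process just standing in for whatever happens to be in your pipeline.)

    Get-Process | Get-Member            # what type is flowing here, and what properties/methods does it have?
    (Get-Process)[0] | Format-List *    # dump every property of one item to see the actual values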

And the over-verbose syntax doesn't help.

I compare this to human languages. It would be tempting to create a language with minimum ambiguity and clear cut concepts, but it wouldn't be practical. I guess PowerShell as an interactive shell is somewhat like this.

For automation, PowerShell is nice. The language sits between a shell script and going the Perl/Python route, but I still prefer shell scripts for simpler things and Perl/Python for more complex tools.

Having said this, PowerShell is the only sane choice on Windows and has made my life easier by no small amount. I never enjoyed managing Windows servers with their focus on GUI tools and their terrible CLI. PowerShell changed that and even "invaded" products like SQL Server and Exchange making them also nice to manage.


I'm not entirely convinced that "plain text" is "simple." For one, utf-8 is a variable-length encoding, which can cause all sorts of subtle bugs and sometimes leads to security issues because somebody failed to parse a character correctly somewhere. On top of this, using plain text means that every program has to choose its own control characters, essentially an encoding within an encoding. This is great for readability, but it's not always easy to know how a certain program will treat corner cases, or even how it will behave at all. I'm not sure what you mean by "you can just look at the text," because unlike something that encapsulates functionality alongside the data, text input into a program might do anything.


> For one, utf-8 is a variable-length encoding, which can cause all sorts of subtle bugs and sometimes leads to security issues because somebody failed to parse a character correctly somewhere.

UTF-8 is nice in that extended characters, despite consisting of multiple bytes, will never contain a low-ASCII character amongst them. Unless you're dealing with byte offsets, splitting and scanning UTF-8 strings by delimiters works just like ASCII.
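
You can check this for yourself from PowerShell; a quick sketch:

    [System.Text.Encoding]::UTF8.GetBytes("é,x") | ForEach-Object { "{0:X2}" -f $_ }
    # C3 A9 2C 78 -- every byte of the multi-byte "é" is >= 0x80, so the comma (2C) stays unambiguous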

> because unlike something that encapsulates functionality alongside the data, text input into a program might do anything.

On the other hand, that "encapsulated functionality" is even more hidden and non-obvious. Unlike a stream of bytes that you can easily inspect simply by dumping them into a file, PowerShell objects are far more opaque entities.


Is there no trivial way to print a PowerShell object? As JSON or XML or an ascii table formatted into columns? That seems like a point of friction, then, I'd agree.


By default, an object at the end of a pipeline will be printed to the screen in tabular format.

    (australis) ~\Desktop % Get-ItemProperty e
        Directory: C:\Users\Dustin\Desktop
    Mode                LastWriteTime         Length Name
    ----                -------------         ------ ----
    d-----        8/12/2016   9:06 AM                e
I believe types can specify which columns to display by default; if you want more info, there's always `Format-List`:

    (australis) ~\Desktop % Get-ItemProperty e | format-list
        Directory: C:\Users\Dustin\Desktop
    
    Name           : e
    CreationTime   : 8/11/2016 10:58:56 PM
    LastWriteTime  : 8/12/2016 9:06:13 AM
    LastAccessTime : 8/12/2016 9:06:13 AM
    Mode           : d-----
    LinkType       :
    Target         : {}
It's also possible to format any object via `ConvertTo-JSON`:

    (australis) ~\Desktop % Get-ItemProperty e | ConvertTo-JSON
    {
        "Name":  "e",
        "Parent":  {
                       "Name":  "Desktop",
                       "Parent":  {
                                      "Name":  "Dustin",
                                      "Parent":  "Users",
                                      "Exists":  true,
                                      "Root":  "C:\\",
                                      "FullName":  "C:\\Users\\Dustin",
                                      "Extension":  "",
  ...


> I believe types can specify which columns to display by default;

Indeed, the default view (table/list/custom) can be specified, and for the view the default properties can be specified.

Consider how 'ls' produces a sequence of FileSystemInfo objects (when run against a disk file system). The view specifies that the list be grouped by "parent".

On the terminal this gives a nice list a la:

        Directory: C:\Dell\Drivers\T7MFF\PhysX\files\Engine\v2.8.0


    Mode                LastWriteTime         Length Name
    ----                -------------         ------ ----
    -a----       12-12-2014     07:42         362232 NxCooking.dll
    -a----       12-12-2014     07:42        5520120 PhysXCore.dll


        Directory: C:\Dell\Drivers\T7MFF\PhysX\files\Engine\v2.8.1


    Mode                LastWriteTime         Length Name
    ----                -------------         ------ ----
    -a----       12-12-2014     07:42         363256 NxCooking.dll
    -a----       12-12-2014     07:42        5823736 PhysXCore.dll


        Directory: C:\Dell\Drivers\T7MFF\PhysX\files\Engine\v2.8.3


    Mode                LastWriteTime         Length Name
    ----                -------------         ------ ----
    -a----       12-12-2014     07:42         339192 PhysXCooking.dll
    -a----       12-12-2014     07:42         412408 PhysXCooking64.dll
    -a----       12-12-2014     07:42        5952248 PhysXCore64.dll

However, the underlying stream is still just a continuous stream of FileSystemInfo objects. It is the terminal display format definition that causes the formatting to break and write a directory heading whenever the next item has a different parent from the previous one.
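
You can convince yourself the headings are display-only by re-piping the same stream through something else; a small sketch (the filter pattern is just an example):

    ls -Recurse -Filter *.dll | Sort-Object Length | Select-Object Name, Length
    # one flat table, no per-directory headings -- the grouping was purely a view over the stream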


You can certainly print them, but there's no guarantee you'll get all its contents by default, and there's still the problem of how to create those objects from some other output, e.g. in a file.

http://windowsitpro.com/powershell/powershell-objects-and-ou...

It could be said that, in some ways, unstructured streams are far more WYSIWYG, which can conceptually mean easier understanding and use.


Or it can mean a lot harder to understand and use. With types you get all of the metadata that you might not even have if it was plain text. In addition, every application chooses its own serialization format. In the end, the only programs that compose well are those that work on text, not structured data (e.g. grep, awk, sed, etc.)


> You can certainly print them, but there's no guarantee you'll get all its contents by default, and there's still the problem of how to create those objects from some other output, e.g. in a file.

That's what CliXML is for. Export-CliXml writes objects to a file. Import-CliXml reads them back.


I think they're assuming the default view output is being used for Export-CliXML, when really it's serializing the object passed to it.


You can print them, but you can't serialise them in a way that can be later unserialised.


> You can print them, but you can't serialise them in a way that can be later unserialised.

Nope, that's wrong:

    Export-CliXml
    Import-CliXml
(https://technet.microsoft.com/en-us/library/hh849916.aspx)

For instance:

    PS> ps|Export-Clixml myprocesses.xml
    PS> Import-Clixml .\myprocesses.xml

    Handles  NPM(K)    PM(K)      WS(K)     CPU(s)     Id  SI ProcessName
    -------  ------    -----      -----     ------     --  -- -----------
        297      15     3856      13304              4024   0 aeagent
         88       7     1420       4808              3804   
    ...
CliXml (Command Line XML, IIRC) is a serialization format for PowerShell objects. It is the format used to transfer objects to/from remote machines when PS remoting is used.

Granted, the re-hydrated objects are not the original objects. Most methods will not work after re-hydration, as the object has been ripped from its context. Think of the objects returned from 'ps' (Get-Process) - in their original form they can be used to interact with the process, e.g. terminating it. In rehydrated form they can only be used to inspect properties.
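
A quick sketch of the difference (the file name is arbitrary):

    ps | Export-Clixml myprocs.xml
    $live = (ps)[0]
    $live.Refresh()                        # live object: methods still work
    $dry = (Import-Clixml myprocs.xml)[0]
    $dry.ProcessName                       # properties survive rehydration
    $dry.Refresh()                         # fails: the method is gone after deserialization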

Speaking from experience, this is rarely a problem. CliXML works remarkably well.


> Normal shell usage means doing a lot of one-shot pipelines incrementally.

Sure because no-one ever takes those "one-shot" pipelines full of `cut`s and `grep`s and distributes them in real products. /s

As far as I can tell, Powershell is designed for scripts that are meant to be distributed, so they must be robust. Normal 'text scraping' Bash is shit at that.


If we are not talking about interactive use, then to distribute "robust" scripts you could use the nicer and richer Perl and Python ecosystems.

For interactive use, the conciseness, the human-readability of input/output and the fault-tolerance to unrelated changes in the input/output formats provided by Unix shells are also preferable.


The motivation for PowerShell was to have something good at both. Why switch contexts when moving from interactive use to coded automation if you don't have to?


Because the requirements change depending on which one you're doing: quick shell automation isn't the same as the kind of automation you'd use perl/python for.


And yet most of the time I see bash being used for the same kind of automation that you think is good for perl/python.

And not because bash is good, but because it is what people know (since they use it interactively) and because it is installed everywhere.

So if someone had a tool that was installed everywhere, and used interactively, that could also be used to create more robust automation tasks, that seems like a win to me.


It depends on how complex the automation is. I'd no doubt use a pipeline in places where you'd use perl/python.

As for interactive use and robust automation, Bash isn't as bad as you'd think. The reason I'd go to python is because of script complexity, not lack of robustness.


> text "scraping" is just better for most cases.

Unix pipes handle bytes, not just text. For instance, copy a hard disk partition to a remote hard disk via ssh:

  (dd if=/dev/hda1) | (ssh root@host dd of=/dev/sda1)
The KISS principle ("Keep it simple, Stupid") is the best way in many cases. In Unix you can quickly enter a pipe without special coding which does also non-trivial stuff. For instance,

  find -name "*.xls" -exec echo -n '"{}" ' \; | xargs xls2csv | grep "Miller" | sort 
gets a sorted list of all entries in all Excel files which contain the name "Miller", no matter how deeply the files are located in the directories. Can you do this in Powershell quickly? I don't know, I am actually curious.

Objects in pipes are convenient and powerful. However, for most applications of pipes they are probably overkill. If things get tough you can simply use files rather than objects.

I would not be surprised if many Windows users will prefer bash pipes from the Ubuntu subsystem in Windows 10 rather than Powershell because it is much more handy for most pipe applications.


> Can you do this in Powershell quickly? I don't know, I am actually curious.

You can basically do the same thing with Powershell. I don't know of a 'built-in' module to handle XLS->CSV conversion, so you need to bring one in:

  Install-Module ImportExcel
Then:

  ls *.xlsx -r | %{ Import-excel $_ } | ? { $_.Name -eq "Miller" } | sort
That's my naive, Powershell noob approach anyway.

There's another option however, where you can leverage Excel itself (granted, this is likely to be a Windows only approach):

  $excel = New-Object -com excel.application
And you can now open XLS/XLSX files and operate on them as an actual excel document (including iterating through workbooks/ worksheets, etc.). It's all just objects.
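
For instance, a minimal sketch (the path is hypothetical, and this assumes Excel is installed on the machine):

    $excel = New-Object -ComObject Excel.Application
    $wb = $excel.Workbooks.Open("C:\data\report.xlsx")    # hypothetical file
    foreach ($ws in $wb.Worksheets) { $ws.Name }          # walk the worksheets as objects
    $wb.Close($false)                                     # close without saving
    $excel.Quit()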


Thanks for pointing this out. However, your example is not equivalent since "ls" searches only in the current directory. Nevertheless it is good to know that it's basically possible since it helps a lot to manage mixed Windows/Linux networks.


Not true. The -r flag I specified is short for "recurse"; it will search the current directory and all sub-directories.


A better way to do that find command is:

    find -name '*.xls' -print0 | xargs -0 xls2csv


> I compare this to human languages. It would be tempting to create a language with minimum ambiguity and clear cut concepts, but it wouldn't be practical.

Lojban[1] is usable in practice (though that may be because it still allows you to be ambiguous when desired).

[1] http://lojban.org


> And the over-verbose syntax doesn't help

This is basically the reason I have not been willing to put any time even trying to learn ps. I mean, trying to translate

unzip file.zip folder

To ps was a nightmare. Like:

[System.IO.Compression.ZipFile]::ExtractToDirectory($zipfile, $outpath)

I don't _want_ to learn to type stuff like that for simple shell scripts.


But that's picking an example where

a. You can still use the native binary just as well. Especially in this case.

b. There was no PowerShell cmdlet for it at the time. There also isn't one for executing Python code, you still have to call python. Or, if you're so inclined, load up IronPython's assembly, and use an overly verbose .NET method call ...

c. There now is a cmdlet for doing so: Expand-ZipFile.

I'm actually curious as to those complaints: Do you never write a function in bash to abstract away something? A function in other languages? Is everything just a series of crystal-clear one-liners?

Shell scripts in Unix-likes tend to glue together a bunch of other programs and (usually) not do much programming in the shell's language. You can write PowerShell just the same, actually. It is a shell, after all.


If you're used to cmd or bash, the jump in verbosity is very noticeable and definitely gets in the way.

It might be a cultural thing, because I'm no fan of C# and its typical style either; no surprise then, that I find PS syntax exceedingly verbose too. It just feels excessively bureaucratic and awkward to have to write so much. On the other hand, bash, awk, sed, and all the typical Unix commands and its associated ecosystem seem like they "get out of your way" far more effectively.

> c. There now is a cmdlet for doing so: Expand-ZipFile.

That example already shows the verbosity increase clearly - why is it "Expand-ZipFile", and not "ZipFile-Expand", "Expand-Zip-Format-Archive", or something else? In contrast, "unzip" is short and easy to remember. The fact that a native binary might exist for a given task is irrelevant to the observation that the shell's language is itself more verbose.

> I'm actually curious as to those complaints: Do you never write a function in bash to abstract away something? A function in other languages? Is everything just a series of crystal-clear one-liners?

Abstraction helps reduce code duplication but is not useful when each line of the script is quite different, and in that case PS remains more verbose. Ultimately, the overhead is still higher.


> Why is it "Expand-ZipFile", and not "ZipFile-Expand", "Expand-Zip-Format-Archive", or something else? In contrast, "unzip" is short and easy to remember.

Iff you know it already. PowerShell is built around a few conventions. One is that all commands share the Verb-Noun pattern. The verb always comes first. Then there are a bunch of common verbs that all share the same basic meaning, regardless of context. Expand is such a verb, the opposite of the Compress verb. You may not like the choice of verb, but there are always going to be names you didn't choose, so that's probably a petty point. In the end, PowerShell makes it easy to discover commands you may only vaguely know exist. I'd argue that also helps remembering them once you know them. Just from knowing unzip you wouldn't be able to guess the command to extract a GZIP file, for example.
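
That discoverability is concrete; for example:

    Get-Verb                      # the standard verb list, with their meanings
    Get-Command -Verb Expand      # everything that expands something
    Get-Command -Noun Archive     # e.g. Compress-Archive, Expand-Archive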

___________

P.S.: I have to retract part of my original post here. Expand-ZipFile does not exist natively in PowerShell. I stumbled across http://ss64.com/ps/zip.html and while that site does have documentation for all built-in cmdlets, I didn't read too closely and this was actually documentation for a wrapper around the .NET framework's own functionality. This does not change the discussion about the name and discoverability, though, except that New-ZipFile should probably use a different verb. Actually, the PowerShell cmdlets for handling ZIP files are Compress-Archive and Expand-Archive, exhibiting the properly mirrored verbs.


That was an example from my previous attempt at working with PS. I was not the admin of the server, so no installing extra programs. Or upgrading PS to the newest version.

The shell scripting I have needed has typically been very simple. Like scheduling a job to copy this file from that server and run this command (which was precisely all I was trying to achieve the last time, only the file was compressed; I would have done it with .bat, but that was even more difficult). As you say, using it as glue and building the complexity into the programs behind the shell.


Use this.

Expand-Archive file.zip


This is all a big part of the reason why I personally advocate using a standardized text-based object serialization format (like YAML) for these sorts of things. In particular:

* Still text-based and human-readable, which helps for debugging/troubleshooting

* Still text-based and machine-readable, so it's inherently cross-platform (assuming that all said platforms agree on their text encoding)

* Still less troublesome than piping arbitrary text through `grep` or `sed` or `awk` or what have you

* Still provides the "everything is an object" benefit that's lost with arbitrary text streams

* Still orientable around streams of data, at least for YAML (by using the document delimiter)
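
A rough sketch of the idea from PowerShell itself, using JSON as a stand-in since a ConvertTo-Yaml cmdlet isn't built in:

    Get-Process | Select-Object Name, Id | ConvertTo-Json | Set-Content procs.json
    # the intermediate file is plain, human-readable text, yet round-trips back into objects:
    (Get-Content procs.json -Raw | ConvertFrom-Json) | Where-Object { $_.Name -like 'p*' }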


Passing CSV around can be a useful middle ground, if you're using utilities that all do proper escaping and allow newlines in strings etc., such as csvkit.


What is proper escaping for CSV?


The problem is that not all tools agree on this one, but: lines are separated by newlines, items are separated by commas. An item can be quoted by surrounding it with double quotes (and this must be done if it contains a newline, comma, or double quote). Inside such quoted items, double quotes are escaped by doubling them.
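
PowerShell's own CSV cmdlets happen to follow these rules, which makes for a quick demonstration:

    [pscustomobject]@{ Name = 'He said "hi"'; Note = "two`nlines" } | ConvertTo-Csv -NoTypeInformation
    # "Name","Note"
    # "He said ""hi""","two
    # lines"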


RFC4180.


DSV is the default format for many unix utils, and AWK and cut parse it easily.


Yes, of course; that is one end of the "middle ground" I described. The problem is that in Unix, there is no common convention for how to escape the delimiter, so you cannot pass arbitrary strings. CSV can do this, with the proper tooling. Being able to pass around arbitrary strings is quite valuable, and although it is not quite the same as passing objects around, it has the advantage of still being human-readable text. That is why I called it a "useful middle ground" between Powershell and the delimiter-separated fields of Unix tools. However, you do need special tools such as csvkit, as Unix tools like awk, sort and cut cannot properly parse it.


Oh, you can pass arbitrary strings, you just can't pass arbitrary strings as part of compound datastructures. The closest you can get is using null or some other weird character to separate lines (RS='\0' in awk), and something similar for field separation (like -print0 in find, which corresponds to FS='\0' in awk). But \0 is only one special character, so you need another one.


As a note, there have been strongly-typed shell languages for Unix for some time. Scheme Shell[0], Turtle[1] and TCSH[2] leap to mind; but there's nothing preventing a developer from using Perl, Python, Lisp or any other language that allows itself to accept stdin as an interpreted script.

0: https://scsh.net/

1: http://www.haskellforall.com/2015/01/use-haskell-for-shell-s...

2: http://www.tcsh.org/Welcome

Edit: oh my, I posted the wrong link for tcsh. No one noticed, but my apologies nonetheless.


This is why I find PowerShell to be an answer to a question nobody asked. If bash/zsh isn't cutting it for you, drop into something like Python, Ruby or Perl to do the heavy lifting.

PowerShell will never accumulate a library as comprehensive as what any of those three have, each has had decades to accumulate packages of all kinds, and more are still being added.

It's odd, but not surprising given their history of "Not Invented Here", that Microsoft never even considered using Python or Ruby when they both have .Net implementations. Their enthusiasm for JavaScript wasn't sufficient for them to embrace that either.


[Disclaimer: I work for Microsoft.]

As far as I recall, the original impetus behind what became PowerShell (as handed down to us in a conference room by Ballmer himself, sometime around 2001) was to fill the gap between ops people who did enterprise administration manually with tools like MMC and engineers who automated enterprise administration with tools like C++/DCOM. The latter were necessary in a lot of cases, but they were expensive, and we needed to give the industry a way to do powerful automation without hiring a bunch of PhDs. So, yes, someone did ask for it - the IT industry.


I was part of the Hotmail team which consisted mainly of Solaris admins trying to mass administer a giant number of Windows servers. We kept pleading with the Windows team and upper management (forgot who it was at the time, Raikes?) to give us a powerful shell and ssh on Windows. We ended up licensing FSecure's ssh daemon for windows. We also used cygwin, too.


>forgot who it was at the time, Raikes?

It was Allchin when I was there... about that era.


> So, yes, someone did ask for it - the IT industry.

... who couldn't use the well-known mature shell tools since they were locked into a closed platform, and Windows' cmd.exe was comparatively very weak.


It wasn't cmd that was the problem (although it is weak); it was how Windows exposed APIs to allow automated solutions.

You had two options: VBScript (which is a glorified COM+ agent) or a "real" programming language like VB6/C# which mixed COM+ with Win32 API calls (which are too hard to use for most non-programmers).

Powershell in itself does improve on CMD. But that isn't primarily how PS pushed Windows automation forward: Microsoft said that every major server feature and function should work in Powershell, so engineers at Microsoft had to look at automation (be it fixing their lacking COM+ interface, providing a new PS interface, or something else).

The net result isn't that CMD got replaced by something better. It is that we have a lot more APIs and ways of interacting with Windows features and services than we ever used to.

The ironic thing is that today you could write MS-DOS-style console applications that work in CMD and provide all of this new automation functionality, because of the Powershell push. Back before Powershell you'd have to hack it using unsupported APIs, registry entries and similar.


> Powershell within itself does improve CMD.

Kind of depends on how you define improved. I find powershell slow enough that I usually still use cmd unless I'm dealing with a powershell script.


If all this API effort had gone into a real cross-platform language (Python, Ruby, or even Perl which was king in 2001), Windows would have flourished. They went for their traditional lock-in approach instead, so now they have to catch up.


Windows did flourish...

And are you suggesting that Perl would have been a better solution for a Windows interactive shell and automation tool?

Are you going for the more readable argument? Or the more strongly typed argument?


> Windows did flourish

Late reply, sorry. I don't think we're on the same page: Windows sold relatively OK, basically tying the battle for the Enterprise sector, but in the meantime it lost the cloud war and was wiped out of several other segments (mobile, web etc). But don't take it from me: if everything were hunky-dory, they wouldn't be busy steering like crazy towards open source, something that was unthinkable 15 years ago.

> are you suggesting that Perl would have been a better solution for a Windows interactive shell and automation tool?

From a mindshare/commercial perspective, yes it would have been (and I say this as a complete Perl-hater). By starting from scratch, MS then had to spend a decade evangelising their new solution, with mixed results. It would have been tremendously easier to just offer first-rate support for established sysadmin tools (ssh, perl/python etc) from day 1; but again, don't take it from me: the big new Windows Server features are Ubuntu-bundling and native ssh support. This would not be necessary, were Powershell and Windows dominant from a commercial perspective.


> Windows did flourish...

Sure. Just see all those buildings full of Windows servers running Facebook, Google, Twitter, Amazon...


It's moving the goal posts a bit to suddenly restrict this discussion to only servers, no?

There are plenty of Windows servers running plenty of businesses. They may not outnumber non-Windows servers, but I don't think "majority" has to be a prerequisite for "flourish".

There are also buildings full of Windows clients...


This discussion is about PowerShell, which is primarily intended to automate server management and has been ported to Linux, presumably, for the same purpose.


> This discussion is about PowerShell, which is primarily intended to automate server management

Many (me included) find PowerShell extremely useful for automating any sort of Windows management, client or server. So I disagree that the discussion should be implicitly assumed to be limited to Windows servers.


I was administering Windows computers at a university at the time of Windows 98, and most of my automation came from ActiveX, where I could register external libraries and script most OS interfaces with VB or JS. Powershell is just that plus lots of objects and types.


No. The well-known mature shell tools manage files and processes - neither of which are a focus of managing a Windows system. Windows revolves around services which you RPC to, which internally store their state - attempting to faff with their state from outside the service is almost always both undocumented and likely to cause failure.

Powershell allowed those services to standardise their RPCs in such a way that sysadmins could call them easily. Alternatively, you'd have COM or a variety of other RPC mechanisms, depending on the service, often undocumented themselves as they were only ever intended to be used by a GUI packaged with the service.


It's been years since I used Windows APIs -- back then there was OLE Automation, which allowed dynamic, late-bound (call-by-name, really) RPC calls into COM from scripting languages. Has this been superseded by something else? Using IDL-defined COM interfaces from scripts sounds like something nobody would want to do.


Indeed, nobody would want to do that, which is why Powershell exists. Scripting VB or Javascript against COM or whatever else the software vendor decided to use was generally an awful experience, if the interface was even documented at all. Powershell replaces all of that for system administration purposes - you can even run Powershell commands easily from within any .Net application and get structured, object-oriented output, which is what at least all the Windows Server GUI stuff does now.

You're still stuck with OLE/COM for e.g. Excel automation, though, I think.


I'm having to do this on a project right now, and I wholeheartedly agree. Nobody would want to do this.


Well, lots of windows admins used things like cygwin and the like to get a unix-y working environment, but PowerShell offers a much more robust set of mechanisms to manipulate the underlying Windows operating system.


> who couldn't use the well-known mature shell tools since they were locked into a closed platform

Yeah, I remember the bad old days, when I'd download that Cygwin installer and Windows would just give me the old skull and crossbones and say "You're not allowed to install GNU tools, Sucka!" One time I jailbroke XP and got it installed, but then CMD was all "You can't run grep.exe and sed.exe under CMD, Sucka! Not allowed!" I was so sad that I couldn't use those tools on Windows, so sad.


> a way to do powerful automation without hiring a bunch of PhDs.

I remember writing Windows software with DCOM was difficult, but it was more on the painfulness side than a difficult intellectual endeavor. Sure you didn't need PhDs to automate things.

> So, yes, someone did ask for it - the IT industry.

The part of it stuck in Windows, that is. Vendor lock-in is a powerful thing.


Yeah, DCOM was a task of filling out the correct fields with the correct values, and trying again until you got it right. Anyone with passing C++ knowledge could do it through sheer brute force.

It's one of the things which made me never want to program on Windows again.


I ported a bunch of COM to Linux in the late 1990's. Not DCOM with the full RPC, but just in-process COM. I had .so shared libs with DllMain and DllCanUnloadNow in them, I had CoCreateInstance, I had a significant part of the registry API implemented in terms of a text file store (HKEY_CURRENT_USER being in the home directory, HKEY_LOCAL_MACHINE somewhere in /etc, ...) and such. IUnknown, QueryInterface, AddRef, Release, ...


Netscape did that with XPCOM [1]; they once thought it was a good idea - I guess for versioning of interfaces. Nowadays Firefox has moved on from this.

[1] https://en.wikipedia.org/wiki/XPCOM


Haha, funny enough our company did this as well - we had lots of legacy C++ code using COM and wanted to port to Linux. It was not that hard after all.


That's nice, but having CORBA implementations - why?


To port some code as-is.


I read this a while back. It's an interview with the dude who wrote PS https://www.petri.com/jeffrey-snover-powershell


His HN handle is JeffreySnover; He is commenting on this story at this moment.


Not "The IT Industry", surely only a subset of Windows systems managers.


That's actually a pretty substantial chunk of "the IT Industry" though. And if you're the people who make Windows, it's not surprising that it's a big enough chunk to do something about.


I personally find that the term "IT" has become a bit of a code word for "Enterprise Wintel", and that anyone working outside of the gravitational pull of Microsoft prefers to describe themselves as working in "tech"


I think that IT has become a code word for being a tech worker in a company, rather than a worker in a tech company.


All those sysadmins would have been perfectly happy with just a properly working unix like shell.


Yes. Like all those early car buyers would be perfectly happy with faster horses.


Both PS and bash are horses actually.


Unix like shell + Python/Perl/Ruby = car

That being said I really don't mind that MS went the PowerShell route. It improved their ecosystem a lot and I believe it was also one of the engines behind the more open-minded move towards open source... maybe someone at MS can chime in, but from talking to some MS engineers I feel like PS was always one of the strong projects pushing that forward.


false equivalency


No, actually it's quite apt.

PowerShell, or at least the concept behind PowerShell, is an improvement/superset over the Unix shells -- and can do whatever the shells can do, plus stuff they cannot do, because they don't support typed entities.

Its problems are not technical or conceptual -- they are ecosystem and historical: lack of compatible tools, windows-only, etc.


Or to the point of the original parent: the industry considers powershell a solution in search of a problem.


Only those that disregard the inventions from Xerox and ETHZ.

Powershell ideas go back to how REPLs work in those environments.


And everybody remembers how Ford was forced to reintroduce a carriage with a horse a couple of decades later because nobody was really using those horseless carriages. Right?

https://blogs.windows.com/buildingapps/2016/03/30/run-bash-o...


I feel very similarly, but then I'm of the mind that if I can't apt-get it, I'm not interested, and I'll bet that "open source" doesn't mean to MS what it means to the open source world.

As for the NIH syndrome, one of my favorite Paul Graham essays is still "What Languages Fix" (http://www.paulgraham.com/fix.html) particularly for C#: the problem that C# was invented to solve is that Microsoft doesn't control Java.


C# incidentally solved many other shortcomings with Java. The languages are getting closer to parity, but for many years C# was pretty far ahead (if you only consider language features and not tooling/ecosystem). Between reified generics, better native interop, and many functional features, C# felt and feels more concise than idiomatic Java. And I've written plenty of both.

Disclaimer: MSFT employee expressing personal views


In what way was C#'s tooling not better?


Java seemed to have a lot more options for deployments and monitoring for a long time. More advanced 3rd party libraries in general. It had simply been around longer and thus had those tools available. Now that C# has been standard for a while almost all of those tools have a C# equivalent.

As a .NET dev I'd actually argue that the things available to C# now are actually better (especially Visual Studio), but that's certainly biased.


> ...things available to C# now are actually better (especially Visual Studio), but that's certainly biased.

You call Visual Studio good? Have you actually used it?

VS2015 is painfully slow, and yes, I've installed all of the updates and I have no plugins or such. Heck, even Eclipse feels zippy compared to it. Something as simple as finding the next text search match takes about a second. Changing to a different source code tab is also sluggish.

Context sensitive search (right click identifier -> Find All References) is useless for anything that is even a bit generic. Instead of actually analyzing context, it seems to simply grep all reachable files for a text string. Including SDK headers...

I use VS2015 only for C++, maybe that can have something to do with it. Maybe it works better for C#?

I also dislike how I have to log in to use an IDE. Once it cost me 30 minutes of working time: I couldn't log in to a god damn IDE that had an expired license. That's a great way to make developers hate your product and ecosystem.


It is a lot better for C#. All the features work perfectly for .NET languages, but because of the nature of C++ it's a lot harder to get these kinds of things working well with it. Some answers about why are here: https://www.quora.com/Why-is-C++-intellisense-so-bad-in-Visu...

Eclipse can run better on a potato but it's also very simplistic. You should also be able to turn off most of the features you don't want in VS.


Well, heck, even Emacs with some LLVM wrappers can do a better job at C++ Intellisense than Visual Studio. Eclipse's CDT runs circles around it as well (so much for being very simplistic). And don't get me started on trying to use Intellisense on large codebases - such as Unreal Engine's source code. It is just completely unusable (30+ seconds before the completion pops up, often blocking your IDE (a total usability no-no) with a modal "Intellisense operation in progress ..." window).

Also anything using complex templates (= pretty much any C++ code these days) throws it for a loop - making it useless where it would be needed the most.

And don't even get me started at the ridiculous state of refactoring support for C++ - even the stupid identifier rename doesn't work reliably, and that is the only C++ refactoring feature VS has (and we had to wait until 2015 for it!).

Sorry, but that something is hard doesn't mean it can't be done - especially when there are free tools which can do it, and orders of magnitude better. Even Visual Studio can do it - if you buy the Visual Assist plugin.

Even the C# tooling isn't that great - yes, it does work better (I often do combined C++/C# projects) than the C++ one, but that isn't a particularly high bar to clear. Ever tried the Java tooling in Eclipse? Heck, that editor and the various assist features almost write your code for you ...

Where Visual Studio is great is the additional tooling - integrated debugger and the various profiling functions. Eclipse can't really compete with that, even though most of the tools (gdb, gprof, etc.) exist and have a lot of features that even VS's debugger doesn't have (e.g. reverse debugging, record/replay, etc.). Sadly, the Eclipse front-end doesn't support all of them.

However, the Visual Studio's editor and the tooling associated with actually writing and managing code (i.e. the part a programmer spends most time in) is stuck straight in the late 90's - just look at how code navigation (jumping to functions, navigating between files, projects, etc.) works in the ancient Visual Studio 6 and 2015 - it is almost identical.

There is also zero integration for alternative build systems like CMake. That is important for writing cross-platform code, or even just basic sanity on large projects - Microsoft's build configuration/project/solution system, with millions of settings buried in cluttered dialog boxes, is just horrid and error prone, especially with its myriad mutually incompatible options. Set one option incorrectly on one library and your project starts to mysteriously crash. Good luck finding that - and then spending hours recompiling your project :(

However, nothing else is really supported properly. (Yeah, I know that CMake generates VS solutions/projects, but there is no integration within VS for it - so you have to be careful how you add files, how you change build options, etc.)

The saddest thing is that many of these issues are pure usability problems where changing/correcting a stupid design decision made a long time ago would already reduce the programmer's frustration by maybe 80% - like the modal, blocking Intellisense operation windows; the ergonomic disaster that are the build configuration dialogs; the decision to show compilation errors involving templates on the deepest template in the expansion (i.e. somewhere in the STL header) instead of where they occurred in your code, forcing you to mentally parse several pages of error messages to find the line number; or just re-sorting the Intellisense matches to show you the project-relevant ones first, instead of the unrelated stuff from the frameworks/header files, so that you don't have to dig through it. I.e. common sense stuff.

C++ isn't going anywhere on Windows, no matter what MS and C# advocates say and it is really sad that the official tooling is in such poor shape.

I could continue ranting about many other things that regularly drive me up the wall at work (like the incredibly slow and memory-hungry compiler, horrid and unreadable error messages, especially for templates, etc.), but that would be pointless. IDEs are a matter of strong preferences and opinions, but I do believe that with Visual Studio, especially the C++ tooling, Microsoft missed the boat many moons ago. And it isn't even trying to catch up, it seems - they'd rather focus on adding stuff like Node.js support (are any Node programmers using VS or even Windows??).


.. if this is you not getting started, I'm kind of frightened for when it does happen ;)


Ah, sorry for the rant.

However, it does piss me off when I see people uncritically praising VS as the best thing since sliced bread and flat out dismissing the alternatives without even looking at them - when VS is really a terrible IDE.

Unfortunately, this is something that is difficult to comprehend for someone who has never seen how it actually could work properly - either because they have never been exposed to the alternatives or because they have the VS-centric "workflow" (more like kludges working around VS silliness) ingrained so deeply that anything else will perturb them.


"Something as simple as finding next text search match takes about a second."

I'm not sure how our configurations differ, but on a fairly modern machine (say, Surface Pro 4 i5) I've not had Visual Studio 2015 slow down, and editing and searching is as zippy as with Vim for me.

I've found that some third party tools bog down the experience, though.


I mostly run it on a Xeon workstation with 32 GB RAM, SSD, Nvidia K5000, etc. I think you can call it a modern machine. Other software runs fast.

I don't have any third party software installed on top of VS2015.

Only unusual aspect is that I also have several other Visual Studio versions installed, like VS2008, VS2010, VS2012 and VS2013.

I also have many Windows SDK and DDK (driver devkit) versions.

But that shouldn't affect it, right?


I run it at work on Xeon with SSD and 16 GB. Project sizes might be different. I don't have experience how the performance would scale with a project with tens of millions of lines of code, for example. (seems to be fine for me below that).

It does exhibit strange behaviour (including becoming suddenly very tardy) that only goes away by deleting the local cache files (intellisense db and whatnot) so I'm not claiming it's technically perfect.


It is meaningless to make such comparisons unless you also compare codebase sizes. VS is perfectly zippy if you have a small project - but try to hit CTRL+SPACE to complete an Unreal Engine or OpenCV identifier! You will be sitting there for 30 seconds or longer while VS is busy parsing/searching files ...


There's a box you can tick that says no to all of that stuff, though I don't know if you can access it after the first run. (I never have to log in when I run Visual Studio.)


No, at least in the version I have, license expires quickly.

If I don't enter the credentials, it won't be able to renew the license, so it's not possible to use it at all. Regardless of any checkbox ticks.


The latest update to VS2015 fixes a lot of the performance issues for C#.


Meh, to make Visual Studio do IDE-like things, people install things like ReSharper, built by the same people that are building IntelliJ IDEA. And I worked with both and IntelliJ is much smarter, and has support for way more technologies. And yes, it's built in Java and can be sluggish, but then again I'm using it on both Linux and OS X and runs just fine on Windows too, so it doesn't tie me to an OS I don't want.

Also, I keep mentioning this, but .NET lacks a build tool like Maven or something similar (SBT, Gradle, Leiningen). .NET is still at the level of Ant / Make and .NET devs don't realize what a huge difference that is.


>people install things like ReSharper

This used to be true but VS2015 is actually completely fine without ReSharper. Roslyn is amazing; I've even built my own custom code analyzers (i.e. compiler extensions) that integrate into the entire ecosystem seamlessly (a NuGet package registers them with the VS project, the IDE shows you code analysis on the fly, and the C# compiler uses them on every build, including on the CI/build server - and you can enforce stuff with it, e.g. raise build errors).

>but .NET lacks a build tool like Maven or something similar

There are .NET build tools, FAKE, CAKE, etc. etc. but people don't use them much, tooling integration is notably missing. It would be nice to have something like Gradle in .NET but VS support for msbuild is good enough for most.


FAKE and CAKE are not replacements for Maven, but for Ant/Make. Maven and Maven-like tools on the other hand are so much more.

Here's a personal anecdote. I work on Monix.io, which is a cross-compiled Scala library for the JVM and Javascript/Scala.js.

I could easily do that because of SBT, which is Scala's flavor of Maven. Basically Scala.js comes with SBT and Maven plugins. And you can easily configure your project, indicating which sources should be shared between the JVM and Scala.js and which sources are specific. And afterwards SBT takes care of compilation for multiple targets, building JAR files for both the JVM and Scala.js. And then you can also sign those packages and deploy them on Sonatype / Maven Central. The tests are also cross-compiled too and running whenever I do "sbt test", no tricks required or effort needed to do it. And this project has a lot of cross-compiled tests.

And if I were to work on a mixed JVM/Scala.js project, I could also include a whole assets pipeline in it, with the help of sbt-web, another SBT plugin. SBT and Maven have a lot of plugins available.

The Scala.js ecosystem relies on Maven / SBT for build and dependency management. There's no "npm install" for us, no Gulp vs Grunt cluster fucks, no artificial separation between "real Scala" and Scala.js, only what's required. And trust me, it's a much, much saner environment.

Compare with Fable. Compare with FunScript. Don't get me wrong, I think these projects are also brilliant. But it's because of the ecosystem that they'll never be popular, because the .NET ecosystem inhibits reusable F# cross-compiled libraries.


Umm, .NET has nuget packages.


The JetBrains folks make incredible tooling.

I got into their stuff via pycharm and phpstorm and then ended up using intellij with the php and python plugins (plus node and a bunch of others).

I haven't enjoyed using a development tool so much since back when Borland was good, intellij is pretty much 99% of my development tool usage at this point and that 1% is mostly nano for quick 'I need to edit this one line' edits.

The thing they seem to get so right is that intellij with say php plugin feels like phpstorm and with python plugin like pycharm at the same time, it's not the features so much as how they are all seamless integrated and the tool as a whole feels designed.

One of the few bills I genuinely don't mind paying each month; everything else I've tried just doesn't compare*

* Your Mileage May Vary.



No, FAKE is not a replacement for Maven. As I said, .NET is left behind in the Ant/Make era.

And since you mentioned an F# project, the absence of a Maven-like build, dependency management and deployment tool is why F#'s Javascript compilers are behind, because you have no shared infrastructure, no cross-compiled libraries, etc, because it's just too painful to do it.


Why isn't it a replacement?

NuGet/Paket for dependencies.

Octopus/Web Deploy etc for deployment.

All of which can be used from this tool.


>Also, I keep mentioning this, but .NET lacks a build tool like Maven or something similar (SBT, Gradle, Leiningen).

You mean in terms of integration with Visual Studio? Because both Maven and Gradle can compile .NET code (msbuild or xbuild). Gradle doesn't natively support Nuget's format, but there are plugins for it.

> .NET is still at the level of Ant / Make and .NET devs don't realize what a huge difference that is.

Gradle just uses javac under the hood.


What am I missing building projects with VS or xbuild?


For one, sane dependency management that doesn't require vendoring.


You can reference projects or other files that are in submodules. What about the .NET build process makes vendoring the only option?

Even so, it's always been more convenient in my experience when something just ships with fixed versions of dependencies already inside it. Those versions are known to work with the codebase and there's no chance of new bugs or other incompatible behavior being introduced from updates to the dependencies.


> What about the .NET build process makes vendoring the only option?

AFAIK, even NuGet doesn't allow you to just check in the dependency spec, rather than the contents of all of your dependencies.

> Even so, it's always been more convenient in my experience when something just ships with fixed versions of dependencies already inside it.

I agree. But that doesn't mean that they belong in Git, nor that they should even be copied to every single project.


> AFAIK, even NuGet doesn't allow you to just check in the dependency spec, rather than the contents of all of your dependencies.

I have a project where I only checked in the solution's repositories.config (packages/repositories.config) and the project's packages.config (ProjectName/packages.config), and it seems to work fine.


I'd have to agree, though I spend most of my time in the node space, today via VS Code (though I don't use the debugger at all). VS Proper with C# has felt better to work with than any Java tool/project I've ever used, though haven't done much with that ever. I jumped on the C# bandwagon in late 2001, early 2002 when it was still in development. Started moving over towards node around 5-6 years ago, and am currently looking for a reason to work with rust or go.

Most of my issues with C# have always, like Java, been about "Enterprise" frameworks, patterns and practices implemented where they aren't needed, and complexity for complexity's sake. VS + C# is pretty damned nice, if you can work on a green project and not bring in everything under the sun.


VS was probably always a better coding and debugging environment. I've used VS as far back as VS 2010 and it's always been great for C#, but I've never had more than a mediocre experience writing Java in an IDE. Did anything support VS's level of convenient debugging ability for Java?


I have not used VS as much as I have used Java IDEs, but I'd say they are on par. VS shines in how well the debugger is integrated with the rest of the environment, but both NetBeans and IDEA offer very close to the same level of convenience. Admittedly, I have never liked the Eclipse debugger.


> As for the NIH syndrome, one of my favorite Paul Graham essays is still "What Languages Fix" (http://www.paulgraham.com/fix.html) particularly for C#: the problem that C# was invented to solve is that Microsoft doesn't control Java.

That's not what Graham said. He said, "C#: Java is controlled by Sun", which is quite a bit different from Microsoft not controlling Java.

Microsoft is fine with using and promoting languages they do not control. C++, for example.

The problem with Java from Microsoft's point of view was that Sun did not want people to use Java as a mere programming language. They saw Java as a platform, and wanted people to develop for that platform instead of developing for Windows, or Mac, or Linux, or whatever. Sun wanted all access from Java programs to the underlying facilities of the OS upon which a particular instance of the Java platform ran to go through the Java platform.

Microsoft wanted something with the good parts of the Java language without the baggage and restrictions of the Java platform, and so they made C#.


>and I'll bet that "open source" doesn't mean to MS what it means to the open source world

The GitHub repo uses the MIT licence, so I don't see what you're talking about.

If you want it in your distribution repos, ask your distribution maintainers (and I bet Microsoft is going to do so anyway for the big ones)


Yes, please ask them....happy to help adding PowerShell to any distribution .NET Core supports.


> I'll bet that "open source" doesn't mean to MS what it means to the open source world.

That doesn't make sense. ASP.Net, .Net Core, F# are all good examples of open source projects. This announcement promises the same for Powershell: Development with the community.

What are you missing? What's the criticism?

If you want to see a broken open source project, check out Android instead..


> if I can't apt-get it, I'm not interested,

It was released just now, under MIT license. Give it a little time and it'll show up.


Or even more! It will need to be backported to Trusty Tahr if one wants to apt-get it on Windows (subsystem for Linux) ;-)


WSL will end up on Xenial at least.


except that C# as a language is by leaps and bounds better than Java (both in syntax and useful features departments), so there have been other problems to solve, too.


Its ecosystem isn’t. C# will never get accepted as an open source product. It only has open source code, it’s not actually ‘open-source.’ Compare it with Typescript, which—despite originating from Microsoft—is a truly open project, and getting love left and right.

C# might be better than Java/JVM, but it’s not better enough. The culture/ecosystem barrier is so high that C# would have to be miles ahead, technically superior in every way, to overcome it. I do hate the “worse is better” adage, but there’s no mistaking it, it applies here.

It’s just too little, too late.

But Typescript is awesome, keep it up.

Ironically, Google’s competitor (Closure Compiler) has actually been unsuccessful for similar reasons. To this date its main repo is just a clone of the internal master. For whatever reason they’ve never attempted to rebase on open-source release.


What do you mean by it not being open source? The core CLR is MIT licensed and the compiler is Apache licensed.

This sounds like fud to me.


I have no issue with the license.


So it's maintained in the open on GitHub, it's technically open source in terms of licensing. Yet you claim it's not really open source. Care to clarify?


It’s technically open-source, that’s the point. There’s more to open-source than license. Sorry but there’s no way for me to clarify without just repeating my original comment.


Your original comments mostly contained your personal opinions, not actual facts. The fact is that it is open source and you are FUDing.


> Your original comments mostly contained your personal opinions

I never tried to pass it as anything else.


It is open source in every sense. The development is open, they accept patches/contributions etc.

.net is far more of a true open source framework than android is.


> It’s technically open-source, that’s the point. There’s more to open-source than license. Sorry but there’s no way for me to clarify without just repeating my original comment.

It's free software, and in that sense the license is the only thing that really matters. However, if you're discussing open collaboration styles then that's a whole different discussion. Either your project is free software or it isn't. Whether it has a diverse and open development community is a separate problem, and doesn't fall under "is this project [free software]".


There’s more to open-source than license.

Such as? You seem to have a mental model of things that make a project objectively open source, that don't include the license. I'd be curious what those things are.


I really don’t, it’s more of a feeling. With an open-source release like .NET it seems more like better documentation. In fact that was the case for early commercial Unixes—you needed the source code to actually use the system, but it wasn’t open-source.

Open-source-as-documentation (for lack of a better term) is still useful. It makes bug fixing a whole lot easier, for one thing. But it's not quite the same as an open-source ecosystem. For that you need to have a diverse set of actors sharing the same goal. That's what I think makes a successful open-source project. You need to accept the fact that the project is not just yours. Something like that.

Of course Microsoft could do all those things. Who knows, if they're determined enough they might turn it around. The problem here is, like I said, that Java is just good enough. No one really cares, except people who could use some better documentation and are already invested in the ecosystem. That's why open-sourcing is still valuable, but also why they'll never gain the kind of adoption they'd need.

Sorry if that sounds like rambling, it’s sort of late.


by that definition, it'd be hard to call Python open source.


Do you mean more like the decisions, planned changes, etc. aren't open? (Along with being tied to the whims of the CEO and the company's money?)


They might be open, but there’s democracy and then there’s democracy. See for example the recent MSBuild incident (but don’t try to argue about it; it’s just an example).

As I said, it’s a feeling. The feeling is it’s Microsoft’s project, everyone else is along for a ride. And that’s fine, but it’s something different. Let’s just not pretend technical merits drive adoption, that’s rarely true.


> They might be open, but there’s democracy and then there’s democracy. See for example recent MSBuild incident (but don’t try to argue about it it’s just an example).

> As I said, it’s a feeling. The feeling is it’s Microsoft’s project, everyone else is along for a ride. And that’s fine, but it’s something different. Let’s just not pretend technical merits drive adoption, that’s rarely true.

Uhm. So many free software projects work like that. A company creates something, releases it as free software. Yes, people contribute (and that's awesome by the way) but in general all of the engineering talent works at the company because they wrote it in the first place (and they hire contributors). At SUSE this is how YaST, zypper, spacewalk, et al development works (with significant contributions from the openSUSE community, but we have teams of people working on those projects so our contributions are more of a concentrated effort). There's nothing new or different about this model of free software development (Sun did the same thing with OpenSolaris and Joyent does the same thing with SmartOS). Yes, GNU and Linux follow the hobbyist model but that's not how all projects work.


I was too harsh with the ‘open-source as-documentation’ term.

My point is this is just not enough to compete with JVM, which is already a vibrant open-source ecosystem.


I have a hard time with this argument. On one hand what you say is true: C# is a strictly smaller community than Java. OTOH that's true of pretty much any language, and yet the Python, Ruby, Elixir, Swift, Go, etc. communities are healthy and vibrant.

If what you really mean is 'Java people won't switch to C# anyway', then I agree, but C# isn't really a language for them. It's a language for people who don't like and/or aren't forced to use Java by their employers.


People who aren’t forced to use Java will choose Scala or other JVM langs, like Kotlin, Ceylon (a favorite of mine) or even Clojure.

C# the language is not exactly that exciting. I get it, it looks attractive next to Java, but it’s still a verbose, corporate-first, sort of thing. If anything, F# is much more competitive. Too bad it’s on CLR.


[disclaimer, also MS employee].

This has already devolved into opinion territory but I don't think you're giving C# enough credit.

I picked up F# relatively early in its lifetime (2006ish?); back then there were many language features in F# that you just couldn't get in C#. The gap closed a lot when C# got LINQ, generics, and lambdas/first-class functions (these are relatively old language features by now).

If I want to write in a quasi-functional-programming style I can do it without having the language get in my way. I certainly wouldn't call it a "verbose, corporate-first" language, although the fact that it can be used for that is a bonus.


Don’t forget Rx. Not exactly language feature, but certainly a great contribution to come out of C#/.NET. And who knows if it would have happened without LINQ.

I like the language. Just not enough to use it over JVM. And I think most people feel the same way.


That is not quite true.

Many of us doing enterprise consulting do jump between Java and .NET projects all the time.

Sometimes even doing mixed projects, like the UI in .NET and the backend in Java.


i'll grant you that after a very brief consideration

> UI in .NET and the backend in Java.

makes a lot of sense.


The problem with native desktop Java applications is that although Swing is quite powerful, it requires developers to go through books like "Filthy Rich Clients"[0] to really take advantage of it. Which very few do.

To this day I still meet developers that aren't aware how to change the default L&F to native widgets, for example.

Whereas Windows Forms and WPF already provide a much easier out of the box experience, and have tooling like Blend available.

I am curious what the JavaFX sessions at JavaOne will cover.

[0] http://filthyrichclients.org/


As if C# were any less driven by PHB-dictated internal enterprise mandates than Java is.


That's not what Worse Is Better means. Did you read the Gabriel paper?

/nitpick


I got it from the Unix-Haters Handbook.


Well, you clearly didn't get the full definition. Go read Lisp: The Good News, The Bad News, And How To Win Big


Sure, it beats Java, but the VM is worse for running other languages, and that's where .NET loses. F# tries to be nice, but reified generics are more of a limitation than help in that world.

In JVM land, Java now just hands you some specific, well-tested libraries: you write business code in Scala or Clojure, which I'd pick over C# by about as much as I'd pick C# over Java 7.

That said, I have far more faith in Microsoft improving their tooling than I have about Oracle doing the same: It's just that Oracle has to carry a far smaller weight, because the good JVM languages aren't even theirs.


Uhm, the CLR (hence the name) was specifically designed to host different languages that can interoperate with each other and unlike the JVM was not built around the capabilities and limitations of a single language.


> Sure, it beats Java, but the VM is worse for running other languages,

based on what criteria?


>I'll bet that "open source" doesn't mean to MS what it means to the open source world.

Before Satya Nadella took over it didn't. Now it mostly seems to.


Ah, that's beautiful. Using a true-Scotsman argument to claim it's not "Real" open source, while denigrating Microsoft for NIH. Classic.


> ... if I can't apt-get it, I'm not interested ...

I feel bad for you, then. The Ubuntu repos (and Debian) have almost nothing that is even remotely new and will usually be behind on most things. Trying to keep configs for an ArchLinux box and Ubuntu box synchronized is a bitch if new versions have good features, because you can't use them in your general config unless you give up on apt and install from source.


> This is why I find PowerShell to be an answer to a question nobody asked. If bash/zsh isn't cutting it for you, drop into something like Python, Ruby or Perl to do the heavy lifting.

I'm not sure this is really right though. You could make the same argument against Ruby or Python in favor of Perl or Pike, could you not?

PowerShell syntax is quite minimal, has a novel methodology for strongly typed interactive use, and has the ability to directly invoke managed code. You can write shell components in any .NET language and invoke them (with care at writing time, this can be done very cleanly, but even without said care it's possible with a bit of a mess).
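
For instance, a trivial sketch of what directly invoking managed code at the prompt looks like:

    [Math]::Round([Math]::PI, 2)
    [System.IO.Path]::ChangeExtension('report.txt', '.bak')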

> PowerShell will never accumulate a library as comprehensive as what any of those three have, each has had decades to accumulate packages of all kinds, and more are still being added.

It has the entire .NET library at its disposal. I certainly never feel a lack of support using it.


> This is why I find PowerShell to be an answer to a question nobody asked. If bash/zsh isn't cutting it for you, drop into something like Python, Ruby or Perl to do the heavy lifting.

How are Python/Ruby/Perl going to give you structured objects from "ps"?

That's the promise of object-based pipes. You can get useful records out of your command-line tools without having to write an ad-hoc regex.
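
For instance, a sketch (the property names come from the underlying .NET Process class):

    Get-Process | Where-Object { $_.WorkingSet64 -gt 100MB } |
        Select-Object Name, Id, WorkingSet64

No regex, no counting columns, and it keeps working even if the default display format changes.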


I find that to be the issue. You are considering it just RAW text when it is actually formatted text that has been parsable for years with common unix command line tools. It not being in the format you consider a structured object does not mean it's not an object or even parsable. If you are using ad hoc regexes I suspect you are not using all the tools available to you.

I feel like Kernighan and Pike do a much better job of explaining than I ever could.

https://www.amazon.com/Unix-Programming-Environment-Prentice...


> You are considering it just RAW text when it is actually formatted text that has been parsable for years with common unix command line tools.

Parsing command output with sed/awk/etc (ie. "common unix command line tools") is absolutely an ad hoc parser.

Let me give you an example that I recently ran into.

I have a tool that parses the output of "readelf -sW", which dumps symbols and their sizes. The output normally looks like this:

     885: 000000000043f0a0   249 FUNC    WEAK   DEFAULT   13 _ZNSt6vectorIPN3re23DFA5StateESaIS3_EE19_M_emplace_back_auxIJRKS3_EEEvDpOT_
     886: 000000000041c380    64 FUNC    GLOBAL DEFAULT   13 _dwarf_get_return_address_reg
     887: 0000000000424e60   122 FUNC    GLOBAL DEFAULT   13 _dwarf_decode_s_leb128
     888: 000000000043dca0   157 FUNC    GLOBAL DEFAULT   13 _ZN3re23DFA10StateSaverC2EPS0_PNS0_5StateE
So I wrote a regex to parse this. Seems pretty straightforward, right?

But then I noticed a bug where some symbols were not showing up. And it turns out those symbols look like this:

     5898: 00000000001a4d80 0x801058 OBJECT  GLOBAL DEFAULT   33 _ZN8tcmalloc6Static9pageheap_E
Notice the difference? Because it's a large symbol, readelf decided to print it starting with "0x" and in hex instead of decimal. I had to update my regex to accommodate this.

That is what makes a parser "ad hoc". You write a parser based on the examples you have seen, but other examples might break your parser. Parsing text robustly is non-trivial.

Worse, it is an unnecessary cognitive burden. Readelf already had this data in a structured format, why does it have to go to text and back? Why do I have to spend mental cycles figuring out which "common unix command-line tools" (and what options) can parse it back into a data structure?


The Unix answer is: why are you using a regex on tab-separated values? Wrong tool for the job.

Of course the problem with Unix is that there are a thousand different semi-structured text formats, edge cases, and tools that must be mastered before you can make any sense of it all. Any time you point out the pain a Unix fan can just respond by pointing out your ignorance.


> The Unix answer is why are you using a regex on tab-separated values?

They aren't tab-separated. There are no tabs in readelf output.

Also if you assume they are space-separated, note that the first few lines look like this:

    Symbol table '.dynsym' contains 164 entries:
       Num:    Value          Size Type    Bind   Vis      Ndx Name
         0: 0000000000000000     0 NOTYPE  LOCAL  DEFAULT  UND
> respond by pointing out your ignorance.

Indeed.


I've actually become a pretty big fan of line terminated json (all string linefeeds, etc are escaped). Each line is a separate JSON object... In this way, it's very easy to parse, handle your use case, and even pipe-through as JSON some more.

In this case, you can have objects as text, and still get plain text streaming with the ability to use just about any language you like for processing.
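
In PowerShell terms, a sketch of producing such a stream (piping one object at a time through ConvertTo-Json gives one compact line per object):

    Get-Process | Select-Object Name, Id |
        ForEach-Object { $_ | ConvertTo-Json -Compress }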


First off, readelf shouldn't switch between hex and base 10. Secondly, that's DSV, so you shouldn't have written a regex for it. You should have used either cut or awk, both tools SPECIFICALLY DESIGNED to do what you want.


What's DSV?


DSV: Delimiter-Separated Values. readelf uses a delimiter matching the regex /\s+/. In AWK this is the field separator FS by default, so AWK will parse it out of the box. Or you can pipe through

  tr -s [:blank:]

to cut, which will give you the field you want.


I would like to see how it will look in PowerShell. $(readelf).ShowMeTheSecondStringFromTheEnd or $(readelf).PrintTheLastColumnInHeX?


A comparable PowerShell cmdlet would give you one object per line with properties corresponding to the columns. And no, those properties usually have sensible names, instead of "LastColumn".

Of course, for wrapping the native command you'd still have to do text parsing if you want objects. This was meant more as a comparison of the different worlds, not as "if I ran readelf in PowerShell I would get magically different output".
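
A rough sketch of such a wrapper, just to make the comparison concrete (property names guessed from readelf's header row):

    readelf -sW ./a.out |
        Where-Object { $_ -match '^\s*\d+:' } |
        ForEach-Object {
            $f = $_.Trim() -split '\s+', 8
            [pscustomobject]@{
                Num   = [int]$f[0].TrimEnd(':')
                Value = $f[1]
                # handles readelf's habit of printing large sizes as 0x-prefixed hex
                Size  = if ($f[2] -like '0x*') { [Convert]::ToInt64($f[2], 16) } else { [int64]$f[2] }
                Type  = $f[3]; Bind = $f[4]; Vis = $f[5]; Ndx = $f[6]; Name = $f[7]
            }
        }

From there, `Sort-Object Size` or `Where-Object Type -eq 'FUNC'` work without any further parsing.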


I happen to find your example of why to use an object a bit hilarious.

You are right, readelf has the object in a structure -- because that is what ELF is...

           typedef struct {
               uint32_t      st_name;
               unsigned char st_info;
               unsigned char st_other;
               uint16_t      st_shndx;
               Elf64_Addr    st_value;
               uint64_t      st_size;
           } Elf64_Sym;

If you wanted the object why did you need readelf in the first place? Why not just read the elf format directly and bypass readelf all together? That seems to be what you are advocating by having readelf passing a object instead of what it does today.


You're asking me why I use a tool instead of parsing a binary format manually? Does that really need explanation?

If that is your attitude, why use any command-line tools ever? Why use "ls" when you can call readdir()? Why use "ps" when you can parse /proc?

You just pointed me to Kernighan and Pike a second ago. I didn't expect I would need to justify why piping standard tools together is better than programming everything manually.


I never said anything about not liking command line tools. In fact I love them and think they do an awesome job!

In any case you just proved my point. You think it's insane to parse binary data while scripting and I do too. That is why I think passing binary objects around on the shell is insane.

Now if you were talking about text-based objects (not binary ones) then that is an entirely different story, and I feel that is what we do today. In your example you have rows which could be called objects, and members which would be separated out in columns. To argue that one text-based format is better than another is not something I am interested in doing -- mostly because there are a million different ways one could format the output. If you were to do "objects" I think they would have to be in binary to get any of the perceived benefits.

To be honest I feel the output you posted is a bug in readelf. I would expect all data from that column to be in the same base.

I will level with you: I can see some benefits of having binary passed between command line programs, but I think the harm it would do would outweigh the benefit.

But if you really wanted to do that you could. There is nothing stopping command line utility makers from outputting binary or any other format of text. You don't need the shell to make that happen.

What I think everybody is asking for is for command line developers to standardize their output to something parsable -- which I feel most command line utilities already do. They give you many different ways to format the data as it is. Some do this better than others, and I think that would hold true even if somebody forced all programs to produce only binary or JSON text format when piped.


This isn't about binary vs text, it is about structured vs. unstructured.

The legacy of UNIX is flat text. Yes it may be expressing some underlying structure, but you can't access that structure unless you write a parser. Writing that parser is error-prone and an unnecessary cognitive burden.

PowerShell makes it so the structure of objects is automatically propagated between processes. This is undeniably an improvement.

I'm not saying PowerShell is perfect. From what I understand the objects are some kind of COM or .NET thing, which seems unnecessary to me. JSON or some other structured format would suffice. What matters is that it has structure.

I still don't think you appreciate how fragile your ad hoc parsers are. When things go wrong, you say you "feel" readelf has a bug. What if they disagree with you and they "feel" it is correct? There is no document that says what readelf output promises to do. You're writing parsers based on your expectations, but no one ever promised to meet your expectations. But if the data was in JSON, then there would be a promise that the data follows the JSON spec.


> From what I understand the objects are some kind of COM or .NET thing, which seems unnecessary to me. JSON or some other structured format would suffice.

They are .NET objects, which, in some cases wrap COM or WMI objects. The nice thing about them isn't just properties, though. You can also have methods. E.g. the service objects you get from Get-Service have a Start() and Stop() method; Process objects returned from Get-Process allow you to interact with that process. Basically wherever a .NET class already existed to encapsulate the information, that was used which gets you a lot more functionality than just the data contained in properties.
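
For example (a sketch; 'Spooler' is just a service that happens to exist on Windows):

    $svc = Get-Service -Name 'Spooler'
    $svc.Status    # a typed property
    $svc.Stop()    # a method on the same live object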


If the data was in JSON it would promise that it followed the JSON spec -- but it's not; it follows its own defined spec, which in the case of readelf is apparently undefined.

Other programs that expect to be machine parsable define in great detail the output. In your initial post I replied to you mentioned ps. In the case of ps it has many options to help you get the data you want without using standard parsing tools. That is because its output was expected to be consumed by both humans and possibility other programs.

Now take readelf on the other hand. Its man page clearly talks about being more readable. Its author cares about how it will look on a terminal and even goes through the effort of implementing -W, which makes it nice to view on larger terminals. It even shows in print_vma, where somebody went out of their way to print hex if the number was larger than 99999. If the author really cared about the ability to be parsed they would have added an OUTPUT FORMAT CONTROL section that would provide you the contract you are looking for. Just saying "if the data was in JSON" does not solve your problem. Why? Because an author who did not spend time defining readelf's output properly in the man page is not likely to have implemented a JSON output type when piped, let alone taken the time to provide the object structures in the man page.

You say it's not about binary vs text but I don't think that can be said. There are lots of things to consider.

* Speed of encoding and decoding.

* Memory consumption issues with larger objects needing to be fully decoded before being able to be used or processed.

* Binary data would need to be encoded and would likely result in much more overhead.

It's not clear to me that a binary option would not be better than a text one. Pipes today are not just used for simple scripts and system management.

There are lots of things that concern me, maybe it is just the implementation details.

* Not all command line programs are written with the expectation of being parsed. How do we handle that? Force the programmer to make all output parsable regardless of whether they ever intended the program to be used in some script?

* Would a program put the same thing to stdout even if it was flowing to a terminal? Are terminals not for humans?

* Would structure be enforced? One of the awesome things about stdin/stdout is that you can send ANY data you want.

That all said I would love it if programs who intended on their output to be parsed offered a JSON output. I am not against structured output. I am against forcing programmers to shoehorn their output into some format that may not be the best for their program. I think a well designed and documented command line tool that expects to be parsed by other programs will go out of its way to ensure the format is documented and adhered to when operating.


It does follow a standard. It's DSV. Unix tools are really good at handling that. Awk and cut specifically.


The dollar sign is charming.

There's a few points here:

1) Not all data is text. In fact, very little of the data people see/work with day-to-day is raw text. It's silly to transform a PNG image into text to be able to pipe it around. (Or to pipe around its filename instead and have a dozen tools all having to open and re-parse it each time.)

2) There's nothing on PowerShell preventing you from serializing a piece of data to text if you want to. The key is: you don't have to.

3) Systems that depend on 50,000 CLI tools all having their own ad-hoc text parsers are cemented, mummified, cannot change. You can't change the output format of ps (to use an example in this thread) without breaking an unknown number of CLI tools. Even if you come up with a great way to improve the output, doesn't matter, you've still broken everything. This is less (but not none!) of an issue with PowerShell. I like computers to evolve to become better over time, and text-based CLIs are a huge anchor preventing that.


Unfortunately PowerShell relies on everything running on .NET (well, I think COM works, too); the idea of a shell that can expose live objects is useful, but the platform limitations PowerShell imposes on actually doing that make it a far-from-ideal implementation of that concept. Something built on a platform-agnostic protocol would be better.


Live objects don’t usually expose any protocols at all. They only expose an ABI. Do you know any platform-agnostic OO ABI, besides what’s in .NET?

If you wrap your objects in some platform-agnostic protocol like JSON, you are going to waste an enormous amount of CPU time parsing/formatting those streams at the objects’ boundaries.


You can run streams of many millions of JSON objects pretty much as fast as the IO can feed it... most of the time, in situations like this, you're constrained by IO speed, not CPU... assuming you are working with a stream that has flow control.

I tend to dump out data structures to line terminated JSON, and it works out really well for streams, can even gz almost transparently. Parse/stringify has never been the bottleneck... it's usually memory (to hold all the objects being processed, unless you block/pushback on the stream), or IO (the feed/source of said stream can't keep up).


Even if printing and parsing is computationally cheap, memory allocation is less so.

If you expose JSON, each serialize/deserialize will produce new instances of objects with the same data.

The architecture of PowerShell implies commands in the pipeline can process the same instances, without duplicating them.

Another good thing about passing raw objects instead of JSON — live objects can contain stuff expensive or impossible to serialize. Like an OS handle to an open file. Sure, with JSON you can pass file names instead, but this means commands in your pipeline need to open/close those files. Not only this is slower (opening a file requires kernel call, which in turn does various security checks for user’s group membership and file system’s inherited permissions), but can even cause sharing violations errors when two commands in the pipeline try accessing the same file.


And just think how much faster it would be if there were no serialization and I/O involved at all...


And my method can work over distributed systems with network streams... There are advantages to streams of text.


What advantages?

PowerShell works over networks just fine, thanks to the standardized ISO/IEC 17963:2013 protocol, a.k.a. WS-Management.


> Even if you come up with a great way to improve the output, doesn't matter, you've still broken everything.

You phrase this as if changing things for the sake of change were a good thing. It is not.

Well, perhaps it is good for the software vendor, but from the customer's point of view, having to re-learn how to do the same stuff over and over every other year is a PITA.


This is why I have trouble getting along with Linux users.

Without change, there's no improvement.


It's often the customers who are complaining that your current output is not suitable.


First off, you can pipe binary data around. Most tools just expect text.

Secondly, if you used DSV to parse PS, like you should, adding a new column to the end won't break anything. A fancier parser won't even break if you add to the middle, but that's usually not worth the effort to write.


> What the M$ community fails to see is that text streams can be consumed by __everyone__.

Text can be poorly parsed by everyone, yes. I especially love it when the default tool settings mean two different computers will give different text results, because the installation defaults changed at some point. What's not to love about trying to properly escape and unescape strings via the command line, while simultaneously keeping in mind your own shell's escaping, and having scripts which are neither backwards nor forwards compatible? And this is to say nothing of different distros.

It's as if the text was built for humans instead of my tools half the time, or something. I usually try to centralize my parsing of text in one place so when it invariably breaks, I don't have to rewrite my entire shell script.

Some basic structuring - I hesitate to call it a brittle object model, when most of the time I'm dealing with something more akin to C structs than invariant and abstraction laden Smalltalk or somesuch, or Java's sprawl of factories and inheritance - makes things a bit easier. New fields can be added without breaking my regular expressions. I don't need to worry about escapes to handle special characters. I can trivially dump to text for display (by simply running the command as this is the default behavior of the shell), or to feed into text consuming tools.


It's even more than that: text formatting allows the use of generic filtering and text processing tools. Whereas if you are using objects, you will tend to do less on-the-fly command-line composition, and write or reuse more tools dedicated to one particular object or another. In the end I'm not sure keeping the structuring as the default use case yields a much-improved usage in the real world, because if you work on a broad number of types you will tend to need to know more specialised tools, instead of generic ones, and have fewer higher-order composition tools at your disposal.

Now of course you can always format to text from your structured object, but this does not matter. What matters is what is convenient in the proposed UI, and what is mainly practiced in the real world because of such convenience and the amplification loop it creates between tools authors and users.

Also some objects are originally handled in text form, and their structured decomposition is extremely complex compared to an informal description and the naturally refinable heuristic which comes with it. For example you can grep through source code in any language (with more or less efficiency depending on the language, but at least if there is a unique symbol you will find it), whereas trying to get and handle it in a structured way basically means you would need half of a (modular) compiler, a huge manual to describe the structure, and possibly a non-trivial amount of code to actually do the lookup.

The PowerShell approach is not all bad, though, and obviously there are some usages where it is superior to text-based shells. But for day-to-day usage as a general purpose shell, and a programmer's shell, I'll stick to the Unix approach.


Structured data allows the use of generic filtering and structured data processing tools. The basic requirement is reflection, objects being able to tell you about their structural composition.

If code was stored in a structured representation you could still search for a string object containing a symbol name in the structured representation. You can match a structure pattern just like you match a regex to text.

Typical shells can be thought of as REPL for a programming language that makes the file system easily accessible and uses said file system or pipes to pass around data between functions/commands, sometimes as human-readable text. Most programming languages don't encourage passing data around as strings.


Nothing prevents me from changing those objects into a text stream. In fact, it's infinitely easier than turning a text stream into objects.


I feel you missed my point. It's not just a stream of raw data. It's a stream of formatted text... There is no magic or hand waving involved.


I guess that's why HTTP2 is now binary, because text is awesome.

https://http2.github.io/faq/#why-is-http2-binary


> How are Python/Ruby/Perl going to give you structured objects from "ps"?

The usual answer would be "by parsing /proc instead and returning meaningful objects that expose the relevant parts of the OS" but why would you want to do that when you could just output the relevant data in easy to parse XML or JSON?


For the most common use cases there would be a standard function to get structured objects; in Python you have os.listdir() instead of ls, etc. If there is none, then you can call ps and parse the result manually, or call the system functions with C interop.


For programming, I agree. However, Python doesn't work as a system shell; for that, you want something optimized for running commands, without having to wrap them in a function call with a list of arguments as strings.


did you try xonsh?


I did, and I found it a pretty big chore in practice. Its two modes (shell mode and python mode), never quite knowing which one you're in — I could never quite get used to that. PowerShell doesn't have this problem, because it was designed for the command line first and for scripts second (which is, incidentally, also its major downside if you ask me).


> Its two modes (shell mode and python mode), never quite knowing which one you're in, I could never quite get used to that.

Drive-by armchair comment: it seems like the obvious answer would be to change the prompt depending on what mode you're in?


I find this criticism of xonsh's "modes" baffling. Xonsh is brilliant precisely because it isn't modal the way, e.g., vi is.

Xonsh lets you freely mix python code and command invocations in a highly transparent way.


you can force captured subprocess mode if necessary


Powershell can use any .NET library... "Nobody" meaning you. Which is perfectly fine. But there are quite a few users of Powershell out there.


Now that it exists there are users, but when people were hoping for a modern or more portable replacement for DOS batch files I'm not sure this is what they had in mind.


Powershell In Action covers a bit of the history and decision making around powershell; interesting stuff.

Ultimately, given that PowerShell was initially targeted at systems admins/engineers and that Windows is highly object-oriented (in contrast to Linux's text streams), I believe PowerShell was an apt development. Having done plenty of systems engineering across Linux and Windows, I think it fits the needs of Windows pretty well; a fantastically unique shell that wouldn't have existed if not for Windows.


Think of it as a replacement for vbscript instead, with a commandline.


> PowerShell to be an answer to a question nobody asked.

It's conceivable that people working in the MS World who have built up (or inherited) a base of PowerShell code are asking, "what would it take to run this on Linux, or at least some of it in some way?"

Of course programmers who don't currently work with the PowerShell at all are probably not asking "I want the PowerShell on Linux" in any significant numbers.


FYI IronRuby never made it into a usable state and never came close to other implementations like JRuby or Rubinius, and IronPython hasn't received an update since the end of 2014, which for an open-source project like a language implementation means it's nearly dead.

That said, adopting Ruby or Python as a shell language wouldn't make sense. Especially Python with its whitespace-aware syntax; I mean, can you imagine piping commands at the command line using Python? I can't. These languages haven't made it as shell replacements on UNIX either, even though everyone complains about Bash.


Last IronPython release: 2014

Last IronRuby release: 2011


Both of these projects attempted to solve a problem that very few people have - using .NET libraries from Python and Ruby. In exchange, you had to give up all of your existing native libraries. That is a trade-off that very few people wanted. Source: I started the IronRuby project.

I learned this lesson the hard way a few times in my career - always have respect for people's existing code investments.


Looks like things are changing in IronPython land:

https://www.infoq.com/news/2016/08/IronPython-Leadership


I can't speak to Ruby, but Microsoft has an active Python group.[1] Granted, it seems to focus on Visual Studio (which I'm told has excellent Python tooling) and Azure.

[1]: https://blogs.msdn.microsoft.com/pythonengineering/


Neat, but that doesn't seem to be integrated into .NET like IronPython was (well, is, just not maintained it seems). Which is what the post I was responding to was talking about.


Ah, touché. I'm not a Windows person, so I don't know the boundaries well.


I used to love Python when I came to it years back after slogging through QBasic, VB6, C++ and C, but in this day and age it doesn't really offer much to me over doing things in modern C#, especially on the CLR. I appreciate some of the changes that were made to the runtime to support IronPython, like dynamic support and the DLR, but if I'm going to be doing something not-C#, then I'd rather go towards F#.


Lately I use Node.js and have replaced the cmd tools with node modules. It is more verbose than piping shell commands, but easier to understand, and the same script can be reused across platforms.


Really? I have to do this to execute docker commands and it is painful. A lot of child_process I/O and it gets hairy.


I find some of it is a lot easier with shelljs or mz, but YMMV. Also, streams of line-delimited JSON are really nice as well.


> If bash/zsh isn't cutting it for you, drop into something like Python, Ruby or Perl to do the heavy lifting.

or Lua?


Hehe, I was so happy when I came across a Lua module for Windows administration. And so sad when I found out it was meant for Windows NT 4.0 administration. :(

A current reincarnation of such a module would be really sweet, though.


0: scsh is a nice scripting language; however, it's not meant for interactive usage (PowerShell is). See: http://www.alphanet.ch/~schinz/scsh-faq/scsh-faq_4.html#SEC3... Scheme is also dynamically typed.

1: It's the first time I've seen it, but from a cursory glance and some docs (http://hackage.haskell.org/package/turtle-1.0.0/docs/Turtle-...) I think it doesn't include simply calling external commands (I couldn't find any info on how to wrap an external executable with types, either). Without this, it can't be a real "shell", only a scripting DSL for Haskell. It's not very good for interactive use as it relies on ghci.

2: tcsh is meant for interactive use and is a shell; I used it when working with FreeBSD but never wrote any scripts in it. However, I don't see any mention of types in its manual (other than file types): http://linux.die.net/man/1/tcsh

PowerShell is statically typed like Turtle, has a nice scripting language like scsh and is meant for interactive use. It's not only typed, but also (partly) object oriented and with full and direct access to all .NET/Mono classes. That also implies that you can write commands in any of the CLR-targeting languages, by the way, like F# or Clojure-CLR.

PowerShell is unique (not literally so, as I'm sure there are other projects doing similar things, but I think it's safe to say that it's unique among widely used shells) in what it provides and in that it takes all that into interactive use. It's definitely worth checking out if you haven't already. Of course, keep in mind its problems, like the mentioned smaller number of already available utilities.
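
To illustrate the typing at the prompt, a quick sketch:

    [int]$i = 0
    $i = "1"      # OK: "1" is converted to the int 1
    $i = "foo"    # error: Cannot convert value "foo" to type "System.Int32"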


> I think it doesn't include simply calling external commands

proc and inshell. Did you check out the built-in tutorial?

http://hackage.haskell.org/package/turtle-1.0.0/docs/Turtle-...


You're linking to an old version. Here is the current version: http://hackage.haskell.org/package/turtle/docs/Turtle-Tutori...


Yes, I skimmed it. But please note that I wrote about "simply" calling external executables, where you can just type the name of a program and have it run. Turtle doesn't even try to allow this (while PowerShell has it), instead going in the opposite direction:

> Most of the commands in this library do not actually invoke an external shell or program. Instead, they indirectly wrap other Haskell libraries that bind to C code.


inshell "ls" would do what you are talking about, unless your bar for simple is just typing ls.


I was mistaken about tcsh's type system. It's not strongly typed:

    ~> @ i = 0 + "foo"
    @: Expression Syntax.
    ~> @ i = 0 + "1"
    ~> echo $i
    1


I want to throw my support behind scsh!

It's useful for those things that cross the boundaries between a program, and a shell script. I've found once you need functions for a shell program, it's time to step up to scsh.

I also wish scsh ports were more well maintained so people could use their favorite schemes with scsh syntax, but it's quite a large codebase to port, so remember that it's scheme48 scheme.


bits of SCSH have been ported to every scheme out there now. And some have been improved. Irregex, in particular, is a dramatically improved version of the SCSH regex engine.


Oh yes, ashinn's irregex is so good, I just include it in my home directory.


I thought I was the last person using scsh!

Now if I could just imbue my coworkers with your good taste....


Nice! :)

I write a lot of my stuff in guile when it's more formalized, but scsh's run syntax is just the bees knees. Can't get rid of it.


There is also Ammonite Shell[0], XONSH[1]

0: http://www.lihaoyi.com/Ammonite/#Ammonite-Shell

1: http://xon.sh/


I've been loving using Node.js for scripts. You get all the NPM goodies plus STDIO, and they end up working just as well as shell scripts.

One of the big advantages IMO is that I only need to master one language for all of my needs, from browser to server to shell :)


How do you find dealing with async? Virtually everything I'd want to do with scripting would almost certainly be async in node libraries, and while I'm willing to pay the callback / promise tax in production code, for scripts it seems like the convenience of just dealing with everything synchronously would outweigh the advantage of keeping things in one language.


Personally I use ES2015 and async/await in shell scripts, since it's so close to writing sync code for me. I have an alias to my global babel so it's uber easy to transpile and run the code. I write a lot of node code for work so it's just native to me, but I'd recommend others look into it. NPM is such an amazing ecosystem for it.


In scripts I'll keep things synchronous unless I have a specific need for async.


I tend to favor es2015 + async/await (via babel-node). Combined with bits from shelljs and mz packages, it all comes together pretty well.


Pure madness. I'm sure this is how elated the architects of the Tower of Babel felt about themselves right before the crash.


I agree... though my default reach-for these days tends to be node. I don't much care for ps, even if it's now available on Linux, I just don't see myself using it much... I like C# a lot, and even other .Net languages. But I don't like all the extra typing for one-off scripts, even if they do get re-used.

I can paste various commands for other terminals/environments in a tweet, the same can't always be said for ps. It's just a lot of typing.


There is also GNU Guile, with a fantastically easy to use FFI. Has a great manual too.


I've never seen tcsh referred to as strongly typed. Could you elaborate?


I was mistaken, it's not.

    ~> @ i = 0 + "foo"
    @: Expression Syntax.
    ~> @ i = 0 + "1"
    ~> echo $i
    1


I love the strongly-typed cookies but I absolutely hate the powershell syntax. It's like C# and Bash got together and had a really ugly baby.


As long as you bear in mind that its main purpose is to be a usable replacement for the CMD.EXE batch language, you should be willing to forgive PowerShell its warts.


Except that they added a bunch of new warts that don't have any relation to CMD.EXE. They took the opportunity to re-invent how a shell works from the strongly-typed object perspective, but then they copied some of the terrible syntax of 40-year-old unix shells. I just don't understand it.

I'd rather write scripts in pure C#.


The point of PowerShell isn't just to write scripts but to work as an interactive shell.


That is not quite right. At the end of the day it is all about automation. But we took the position that the way admins automate is by scripting the things that they do in an interactive shell. Bruce Payette had a great way of saying it - he said that the lifecycle of 99% of scripts starts at the prompt and ends at the carriage return. Then, sometimes you want to codify things, so you put them into a script. Then as the function becomes more used and more important, you want to make it more and more formal without having to throw everything away and start with a new language.

So YES, an interactive shell is important, but it is important as a stepping stone to automation.

That is the PowerShell mindset. Jeffrey Snover [MSFT]


That is not quite right either. I enter about 200 commands on a normal workday and write maybe two scripts a month in a shell language. A shell language should be a shell language, it's not merely a stepping stone to write a script after using a command just once. If that's what I'm looking for I'll use Python. And from the other side, Python would be a terrible shell (by default), and that's fine because that's not what it's used for.

If Powershell is made for both, it has to make compromises both as a shell language and as a scripting language. This is one of the things that annoy me about the Windows ecosystem: it ships with no languages beyond a shell, which makes scripting for it very annoying, and you can't even install a language of choice via a package manager (one has to go find a download somewhere on the web).


This is a really strange argument -- he's saying that PowerShell was designed so that you could more easily move from shell to script to library ...

Your counter is -- you don't do that? Of course you don't, because your shell is bash. That's exactly the point. If your shell is bash, when you write a script you use a different language. What if there was a shell where you could actually script ...

You're absolutely right, PowerShell is made for both, and it has made compromises (like using comparison operators like -eq instead of ==).

Compromises are not inherently bad.


> If your shell is bash, when you write a script you use a different language.

The number of bash scripts I see for various *nix-related tasks suggests that this is substantially untrue; people do, in fact, do scripts in bash.

> What if there was a shell where you could actually script ...

I can't think of any shell that you can't script. OTOH, interactive use and scripting are substantially different use cases, so it really makes sense to have both interactive-optimized languages that may be usable in script and scripting languages that may be usable interactively, but focus use of each in their optimized role.


> Your counter is -- you don't do that? Of course you don't, because your shell is bash.

Yeah of course I'd use Bash because Powershell only just became available for other platforms, but that's not my point.

I understood his point like this: the grandparent of his post said powershell has some bad things for scripting. Someone responded to that by saying it's not just for scripting. He said something which I understand as 'that's not quite right, I see a command as a stepping stone to writing a script', to which I finally responded 'I enter a lot of commands but rarely write scripts in [my interpreter of choice's language] because that language just isn't that great, and that's fine because the language is great for its purpose: command line usage'.

And compromises are inherently bad (in a way), that's the definition of a compromise: there are two or more mutually exclusive options and different parties have different first choices. Since they're mutually exclusive, a compromise must be made which is acceptable for both, but it's the first choice of neither (or at most one of them). Compromises are necessary for a dual-use language, but they don't make it prettier for both usages either.


> it has made compromises (like using comparison operators like -eq instead of ==).

wat.

The use of -eq instead of == is a compromise?

I don't get that.


You're correcting Jeffrey Snover about the point of Powershell. I suspect that the guy that invented it and championed it through years of resistance at Microsoft probably knows what is or isn't intended with Powershell.


Besides mirroring the condescending tone, I was mentioning that in my (GNU/Linux) experience -- which is a lot more command-line oriented than Windows -- it's not true what he's saying. I use a lot of commands and very few of them become scripts, thus the conclusion that it's not merely a stepping stone. A command line language can have different features than a good scripting language. Further I was remarking on the limitation of Windows which ships with a "one size must fit all" language: the language used by their shell (and now there's a 2.0 version of that, called powershell).

I'm sure he knows a lot more than me about a lot of powershell-related things. Still, beyond him being (according to you) the inventor and "champion" of powershell, there's nothing that tells me they did UX research into this, so I added my experience because it differs from his viewpoint.


I think you're misreading what was exchanged. That is unless you're arguing with him about the intent behind the design of a product you had no part in creating. He was replying to "The point of PowerShell..." not, "The point of shells...". He was simply correcting someone who misinterpreted the intent.

I think you're also reading tone into his message that's not there. He's been very chill and helpful in this thread even in reply to open hostility.

I'm not sure what "according to you" means, but you can verify it yourself. Feel free to read Wikipedia or you can probably find one of his talks about it on youtube.


Windows did ship with VBScript (and probably does still). You can also install languages via a package manager; an example:

- https://chocolatey.org/packages?q=python


After installing a third party bootstrapping program. I suppose that's as close as it gets.

Windows still ships with VBScript, but it's not a language I'd want to look into either. Everyone that has worked with it sighs when it comes up and nobody actually uses it for automation as far as I know (everything's either powershell, good old batch scripts, or custom .NET software).


You could just use WSH (Windows Scripting Host) with either VBScript or JScript the way you've always been able to. AFAIK, Windows has shipped with that scripting solution since like Windows 2000 or possibly even earlier.


I noticed that on old computers (Windows 95, I think, and certainly 2000 and XP) it would classify .js files as executables. I was about 11-14 so I didn't understand much of Windows at the time, let alone in a foreign language (I'm Dutch) when I wasn't even allowed to use the Internet (dialing up was expensive per-minute stuff), but yeah I remember that.

Still, I've never heard of anyone using Javascript or VBScript to automate anything. It's either powershell (these days), cmd scripts or custom .NET software. I've never even heard anyone mention using javascript in Windows (until apps came around in Windows 8) and vbs is only ever patchwork for legacy third-party software products as far as I've seen in my sysadmin days.


> I've never heard of anyone using Javascript or VBScript to automate anything.

As just one example, it's common to include JavaScript or VBScript steps in Windows installers. You would never know it, if you ran an installer that used a script to automate a few things.


> Still, I've never heard of anyone using Javascript or VBScript to automate anything.

Well thousands of companies have done it, so. I don't know what to tell you.


I thought Windows shipped with .NET which includes a command line C# compiler, csc.exe.


The current C# compiler is no longer distributed as part of .NET. They're separate again (as is MSBuild).


As one of those enterprise pseudo-sysadmin scripting guys, I do agree with the sentiment; I think the fundamental object-ness of PS is nice, and the emphasis on docstrings is nice. However, I still think a Python-like syntax would have been miles better. As much as I tried, I just can't get into PowerShell -- way too many special characters look awfully "dirty", and $vars give me PHP/Perl-induced PTSD. Deployment is also a bit of a pain.


I'm not at all convinced that the syntax is any better for typing in as commands in an interactive shell.


I think it works pretty well for that beyond the weird omission of anything like &&


It works fine as an interactive shell but that's not exactly high praise. It could be better both for that and for scripting, but instead it tries to be more familiar to bash users. As a bash user, I'm not terribly happy about its syntax either, but that's the price to pay for 40 years of backwards compatibility. PowerShell has no such excuse.


True enough.


It sorely needs an && and a &


I would hope it's also meant to put vbscript out to pasture as well.


In my admittedly limited experience there seem to be at least 3 aliases for every cmdlet, which makes reading someone else's code a real hassle sometimes. Our ops guys swear by it though.
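
For example (a sketch; all four run the same cmdlet, the last three via aliases in Windows PowerShell):

    Get-ChildItem -Recurse
    gci -Recurse
    dir -Recurse
    ls -Recurse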


You're supposed to avoid the aliases in scripts. Or that's what's recommended anyway.


Yes, this makes for much easier to read code for people down the road :)

I think OPS folks (myself included) sometimes create things thinking nobody but them will use them, then an occasion arises to share, and you're stuck with something functional but horrible to read.

It took just one such occasion for me to convert to writing everything from the perspective that 'other people may need to use this.'


There seem to be a few tools to do it for you automatically. ctrl+shift+a if you're using powershell studio for example. Or https://gist.github.com/DBremen/9db6632423d673ff18f6


This is cool, I'd not seen this before! Thank you for the link.


The implicit return semantics are insane, since you can implicitly early-exit when you do things like call ArrayList.Add (which returns the index at which the item was added).


I know the problem you are talking about - it has bitten all of us.

The new language mode you use when writing classes in PowerShell shifts the needle more towards programming semantics than shell semantics and addresses this.
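
A rough sketch of the difference:

    class Tally {
        [int]$Total
        [void] Add([int]$x) { $this.Total += $x }  # stray values are discarded, not emitted
        [int] Get() { return $this.Total }         # output happens only via an explicit return
    }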

Give it a try!

Jeffrey Snover [MSFT]


I really like the functional approach of PowerShell. The pipeline, while not the end-all-be-all of perfection, is still very, very good and very flexible. Once you get into the functional mindset, it is very easy to write terse, powerful code.

My real wish is that there was a syntax for writing function decorators and function generators. This would be similar to the [cmdletbinding()] syntax, but would allow me to write my own.
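
For reference, the kind of thing I mean (a sketch of today's syntax):

    function Get-Thing {
        [CmdletBinding()]
        param([Parameter(Mandatory)][string]$Name)
        Write-Verbose "Looking up $Name"   # -Verbose and friends come for free
    }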


Interesting. I'll have to look into it. Thanks.


Our goal is to be incapable of sustained error. :-)

Jeffrey Snover[MSFT]


Insane in a good way, right? You aren't "returning", you are emitting to the pipeline.


Not if you're in a script and declaring a function. So, no, it's a source of confusing bugs. Implicit return works better in, like, Ruby, where it has to be at the end and can't just randomly exit in the middle of a function declaration.


ArrayList.Add ... whatever ... | Out-Null

It's not implicitly returning. The return value wasn't assigned, so it is emitted to the pipeline.


OK, sorry (there is an explicit "return" keyword that I thought worked the same way). Either way, yes, I know you can avoid it by piping to Out-Null, but that behavior is surprising.


    return {expression}
is equivalent to

    {expression}
    return
the only place you have an exit from a function, script block or similar is either at the end or at a return statement. Every value that isn't assigned or piped into nirvana gets pushed to the pipeline. I actually can't remember having used return statements more often than once or twice so far (and I use PowerShell daily, heavily).

Then I rarely found the need to use ArrayList. Sure, you can use it, but you can just as well do

    $array = @()
    $array += $item
A personal pet peeve of mine is actually that lots of people write PowerShell very similar to equivalent VBScript or C# code. And that makes for horrible PowerShell code (in my eyes). The pipeline is immensely powerful and not using it often makes for pretty crappy code.
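
A sketch of the contrast I mean:

    # C#-style PowerShell
    $big = @()
    foreach ($p in Get-Process) {
        if ($p.WorkingSet64 -gt 100MB) { $big += $p.Name }
    }

    # pipeline-style PowerShell
    $big = Get-Process | Where-Object WorkingSet64 -gt 100MB | ForEach-Object Name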


Most people writing PowerShell scripts are likely dabblers though. Anyway, I have to admit I don't really appreciate the whole pipeline idea; I just know that `$something = Some-Function` sometimes wouldn't do what I thought it would because of these semantics.


Yeah, I wanted to switch to PowerShell but then I had to call an exe with variable arguments from the PowerShell script. That was a quick way back to cmd.


Usually everything works as expected, e.g.

    someprogram.exe arg1 arg2 arg3 $foo (Get-Item bar).Length
Variables get expanded in the argument list, arguments get quoted automatically based on their content. The only gotchas I can think of are:

    & "command with spaces.exe" args
It's a bit unexpected that the & operator is needed.

Arguments sometimes get quoted unexpectedly, or need (single) quotes depending on PowerShell syntax rules to avoid evaluating something.

The worst (but sadly common) thing people can do when faced with problems here, though, is heading for Invoke-Expression, sometimes in combination with all sorts of wrappers, each of them making the problem worse, not better (it cannot get better that way).

The most robust way I tend to use when necessary is to just prepare a single array with arguments and splat that when calling the program:

    program.exe @array
No surprises with PowerShell's parsing in command mode because we prepare the array in expression mode. Things are generally more predictable. Automatic quoting of the arguments still happens, though.
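
e.g., a sketch (the program and flags are made up):

    # $argList rather than $args, which is an automatic variable
    $argList = @('--input', $inFile, '--verbose')
    if ($useCache) { $argList += '--cache' }
    & 'C:\tools\converter.exe' @argList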

(Side note: My own little echoargs replacement is actually a batch file :-)

    @echo off
    echo.argv[0] = %~0
    set cnt=1
    :loop
    echo.argv[%cnt%] = %~1
    set /a cnt+=1
    shift
    if not [%1]==[] goto loop

)


Couldn't you use $() for this?


They fixed this in later versions with the literal: --%

But you can also always use start-process like:

    start-process app @'
    your arguments and parameters go here
    '@


I was frustrated as well by this. There's a program echoargs.exe you can use to find out what you need to escape in your powershell args.


This was the hardest thing to figure out. I have a program just to ensure I'm sending the arguments that I think I am.


Yeah, it's a mixture of Perl and Bash with sudden C# calls interspersed.


Is this really so useful in practice? I have never used PowerShell before (nor do I have plans to do so in the near future), but I would imagine the downside is complexity, both for users and especially for developers. With text you just read and parse, while with objects you need to know the object type, its fields,... Whatever happens, I hope text stays around (and I have had a bad feeling about that ever since I met systemd).


Could you elaborate on what the actual complexity is, rather than what you imagine?

For a developer writing a tool, exposing an interface as objects is far easier than having to constantly serialize data into a string. Keep in mind that once you serialize data into a string, you can never change the order of the data or else other tools will break. Not so with an object-based interface.

In other system-admin-type tasks PowerShell hides complexity. E.g. finding the IP address of a network interface is simply

  Get-NetIPAddress | select IPAddress
compared to something horrendous like

  ifconfig | perl -nle 's/dr:(\S+)/print $1/e'


As someone who recently got his first exposure to PowerShell, I'll chime in that the case I needed to deal with was one where it added complexity.

In summary:

I need to pull data from a company we're partnered with, and get it into our systems. They ship it as SQL Server backup files. OK, no big deal, we can mount those up and run queries to dump stuff out to whatever format we like and ingest it.

Except part of that data is files, which were stored in the DB as binary blobs (in a table along with filenames and metadata). Need to get those out of there and into our storage as real files. OK, no big deal, I can just query for filename and file contents, dump out onto disk and then batch-transfer into our archival storage, right?

Except no matter what kind of obvious shell-idiom method I try to use, it doesn't work because PowerShell stubbornly insists that I must be working with text data when I'm using all those handy shell constructs, and so the files end up corrupt.

So now instead of just being able to shove data through pipes like I would on Unix (which, for all its faults, doesn't really care whether I'm sending binary or text data), I have to have my script go and instantiate objects for creating a file and reading and writing binary streams, and remember to clean up after them, and my script ends up way more complicated.

What would have been a pretty simple script anywhere else now has to know a bunch of .NET libraries and how to create and work with their types, and that's less useful to me.
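
A rough sketch of the kind of .NET plumbing that ends up being required (connection string, table, and column names here are all made up):

    $conn = New-Object System.Data.SqlClient.SqlConnection 'Server=.;Database=Partner;Integrated Security=True'
    $conn.Open()
    $cmd = $conn.CreateCommand()
    $cmd.CommandText = 'SELECT filename, contents FROM dbo.ArchivedFiles'
    $reader = $cmd.ExecuteReader()
    while ($reader.Read()) {
        $name  = $reader.GetString(0)           # filename column
        $bytes = [byte[]]$reader['contents']    # varbinary column, as raw bytes
        [System.IO.File]::WriteAllBytes("C:\dump\$name", $bytes)
    }
    $reader.Close()
    $conn.Close()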


It's true that PowerShell, being a newer tech, probably doesn't have the large library of "known ways" to do things that the UNIX world has. And BTW I'm not even a PowerShell person; I write embedded software in C++ for a living! Having said that, I'd imagine you would parameterize something like

  Invoke-Sqlcmd -Query "SELECT ..."

and write it out to a file? Or avoid it and just use BCP "select blah" queryout "c:\file.jpg" etc.


What about:

    ip addr show | grep inet
Edit: wrote back original command and replied below instead


That doesn't work on my CentOS box.

  [kapil@localhost ~]$ ip addr show | grep inet
      inet 127.0.0.1/8 scope host lo
      inet6 ::1/128 scope host
      inet 192.168.128.236/24 brd 192.168.128.255 scope global dynamic eth0
      inet6 fe80::215:5dff:fe80:501/64 scope link
  [kapil@localhost ~]$


Correction then:

    ip addr | grep "inet " | cut -f 3 -d " "


It still doesn't work (you forgot to add the tr back in your reply after your edit)

  [kapil@localhost ~]$  ip addr | grep "inet " | tr -s " " | cut -f 3 -d " "
  127.0.0.1/8
  192.168.128.236/24
  [kapil@localhost ~]$

To be honest, I don't consider this to be that bad, I was just replying to a person who thought that bash was "simple", and powershell was "complex". Also, I don't have a problem with either tool. I'm very open to the idea that Microsoft can and does create awesome tech, despite having other shitty products.


Thanks, too many edits indeed!


Nice way to prove the point that object-based is far more reliable than text-based.


ip addr | awk '/inet / { print $2 }'


That doesn't work either.

  [kapil@localhost ~]$ ip addr | awk '/inet / { print $2  }'
  127.0.0.1/8
  192.168.128.236/24
  [kapil@localhost ~]$
You guys are just proving my point. _I_ already know how to do it using bash. It's just that using objects and querying properties is far simpler. And I just took a random example because it's a common thing that someone would want to do.


You really couldn't have asked for a better chain to demonstrate the validity of your point. Having been blinded by my own world view and biases in the past I sympathize with them but this is too funny not to have a little laugh at their expense.


I think what others might be demonstrating is that there isn't a reliance on what object Microsoft decides to ship to you; in *nix we can combine commands and builtins and get the information in the structure we need, without having to wait 8 months for a system reboot (or worse, a paid-for upgrade) that will give us what we need.


Well, to be fair, what was demonstrated here was that people didn't actually extract the data after multiple edits and tries, regardless of what tools they had at their disposal.

I think the idea of an object pipeline is sound for precisely this reason, but I don't think it should be connected to any one language or platform.


First of all you are incorrect in your assumptions. You are reliant on the author of the tool in both cases. Just as there is no "UNIX Person" who decrees that all executables on UNIX must output text, there are vendors other than Microsoft who write software for Windows. You can write one too, it takes about 5 minutes to expose a powershell interface. And I'm not even a .NET developer. I write embedded software for a living.


But once again, you're simply scraping text, instead of querying an object for properties.


It is parsing text, but that text does have some consistent structure. Contained within it are properties. One can make the same comment of reading JSON or XML, but I'll admit those are more like what you'd choose for an API. In general there is a not-entirely-obvious (to me) question of where does "formatted for machines" end and "formatted for humans" begin? We can't say "it's for machines if and only if it must conform to a schema that's defined in advance", not any more.


You would have to be aware of that fact in order to write the command. The question is about relative advantage, not about what the tools do.


> Could you elaborate on what the actual complexity is, rather than what you imagine?

I can only use this feature by using a .NET language. I can't take the strongly typed objects into a Node.js script, or a perl script.


>I can only use this feature by using a .NET language.

It goes without saying that a binary interface requires use of a system that understands that particular binary interface. With .NET being open source though, anyone is free to write a bridge to using the objects in COBOL if they wanted to.

>I can't take the strongly typed objects into a Node.js script, or a perl script.

Yes, because they have no concept of strongly typed objects. You can convert objects to strings if you're using an incompatible tool/language. I don't know what your point is. Sorry..


The point is that this thing is less valuable by being limited to only being used by things that Microsoft created (in reality). Text, with all of its flaws, can be used by everything.


I guess the point is the point. Potential bridges that do not exist already and probably never will do not matter -- extending that idea of potentiality I could say that you are free to reimplement the CLR in Fortran, so PowerShell is compatible with Fortran? Oops, sorry, you already reached that ridiculous point, only with COBOL.

Virtually any language can already input and output text. That's the real lingua franca of computing (even if it is messy and dirty), not .Net objects...


>Potential bridges that do not exist already and probably never will do not matter

The difference between impossible and possible is important. Please read what I was replying to. Using your argument, any time a project is open sourced, you could reply with "who cares about potential since nobody has actually done anything with the source". Good job.

>Virtually any language can already input and output text.

Um, you seem to have missed the obvious point that objects can also be converted to text through PowerShell if the other side can only accept text. So, I don't quite know what you're complaining about. Could you detail your actual technical complaints with PowerShell, rather than having a philosophical argument?

>That's the real lingua franca of computing (even if it is messy and dirty), not .Net objects...

I don't agree with your opinion, nor with the idea that only a two-choice system can exist.


That's not really true; you can serialize as JSON in all of those.


And how do you delete files older than 2 days? Get-Files / | select olderThan 2 | Delete-File?

But it's not bash, it's coreutils:

  find ./ -mtime +2 | xargs rm


    Get-ChildItem -File | Where-Object LastWriteTime -lt (Get-Date).AddDays(-2) | Remove-Item
If you prefer shorter (and similarly obscure if you don't know the language):

    ls -file|? LastWriteTime -lt ((date)+'-2')|rm
I actually find PowerShell's version more readable, even if more verbose. But code is written only once and read a lot of times. I may not know what the +2 actually means with the -mtime argument. How would I change it to look for the age of two hours instead of days?
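
For comparison, the two-hours variant of the PowerShell version only swaps the method name:

    ls -file | ? LastWriteTime -lt (Get-Date).AddHours(-2) | rm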


> I actually find PowerShell's version more readable, even if more verbose.

Sorry, I can't agree with you.

> I may not know what the +2 actually means with the -mtime argument. How would I change it to look for the age of two hours instead of days?

If you forget or don't know what mtime is, you can open man find. For finer granularity than days, use mmin.


Just as an aside (and this probably applies to powershell as well), what if you have mapped folders from multiple servers into your file system and those servers are in different timezones? Something like

  /Mapped-Temp-folders/Server-Asia/temp
                      /Server-UK/temp
                      /Server-Japan/temp
I guess the script wouldn't work then if you wanted to clean out data older than 2 days, as per their timezone.
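
One way to sidestep that in PowerShell, assuming the mapped filesystems report timestamps that convert to UTC correctly, would be to compare UTC against UTC:

    ls -file | ? { $_.LastWriteTimeUtc -lt (Get-Date).ToUniversalTime().AddDays(-2) } | rm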


    ls|? LastWriteTime -lt (date).AddDays(-2)|rm


I just now had the chance to use PowerShell on Linux; with that PowerShell command you demonstrated, I get a command not found.


> With text you just read and parse, while with objects you need to know the object type, its fields

Parsing is actually a huge pain in general. A lot of unixy tools produce different output when their stdout is not a terminal, to be more pipeline-friendly. That output is generally easier to parse (often tab-separated or so), but you still need to know the same things about the output (the "type" could be whether it came from `ps` or from `ls`, and the fields are just in some static order that was usually chosen decades ago, and if it's extensible it's cumbersome).

If `ps`, `ls`, etc produced typed output, I would be sooo happy.

When I worked at google I thought about what it would be like to have a unix userland where all the parts communicate with protobufs instead of raw text. There are some problems (like portability of message definitions, which was a solved problem within google, but which is harder when you have multiple parties/entities/companies contributing message definitions).

edit: accidentally a word


I wrote a bash script to sort processes by age on an old redhat 4 machine.

What sounded like a very simple thing quickly ended up being a complete nightmare of bash, python, and an unholy bunch of linux commands.

It was ugly, and a terrible way of doing it, but we got it working.

It would have been amazing if I could pipe ps into something that could sort by the dates in the 4th column, and return the last column in sorted order.


Pretty sure the section in the `ps` man page called OBSOLETE SORT KEYS indicates that even the `ps` on very old redhat machines could do this out of box... see the field `start_time`.

Failing that, consider the `etimes` field (seconds since the process was started) used in conjunction with the -o feature (control output format) so you can put `etimes` up front, then simply pipe the output to `sort`.

Those things being said, the situation you describe is indeed hairy. `ps` output was designed for human consumption, that's why the process age field format varies (1w vs 10:15) and that's presumably why you wanted to bring python into the mix!

...You might have had an easier time looking in /proc...

Indeed, this looks promising:

    cat /proc/[0-9]*/stat|sort -n|less -S
EDIT: hm, I don't know how old /proc is!

EDIT #2: But try doing ANY of this on an ancient Windows machine! The "peer" of your old Redhat machine would be what, a Win2k3 machine, or a Win2k? How old is PowerShell? :)


Honestly it was a quick and dirty one-time project, and I just needed to cobble something together.

I was more just commenting on the fact that with a "strongly-typed" shell some of this stuff becomes much easier without every single tool needing to add all these options.

It gets much more "Unix Philosophy" when every command doesn't need its own sorting logic, and doesn't need its own date display logic, and doesn't need its own way of displaying it to the user.

It can output an object, which can be piped into another program to use what it needs, which can be piped to something else to do some sorting, which can pipe it out to another to display it in a nice table.
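
A sketch of that flow in PowerShell (note that StartTime can be inaccessible for system processes unless you're elevated):

    Get-Process | Sort-Object StartTime | Select-Object Name, Id, StartTime | Format-Table -AutoSize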


I hear you. I myself was impressed by some things PowerShell can do easily.

But let's not throw out the baby with the bathwater... The unix shell ecosystem is powerful magic. There are LOTS of things that can be done there with great ease.

Not to mention--and I understand full well that this stance is falling out of fashion--MICROSOFT ... KILLED... my PAPPY! :|


PowerShell was released after Win2k3, however you could install it on Win2k3/XP, even PowerShell 2 afterwards: https://support.microsoft.com/en-us/kb/968929


It's entirely possible this is a recent development, but "ps -o cmd --sort etime" seems to do what you're asking for.


Sadly that wasn't available, and researching an easier way to do it without that was met with pages and pages of people saying to do that...


There's literally an example in 'man ps' that does exactly this. It's right there in the output formatting section.

    ps kstart_time -ef


You could probably hack something together with awk, but this sort of thing is exactly why I like powershell a lot.


Yeah, I ended up using awk, some python for the date parsing and sorting, back to bash for grep and another thing I can't remember, then into the program that actually needed the data.

It was a mess, and I remember thinking it would be such a simple thing that was so much more complicated than I thought it would be.


I find it massively useful in practice, even for mundane tasks.

For example, the other day I was trying to search the columns of a very large SQL Server table¹ but also needed information about the column type.

  …\dbo.site> ls Columns | select Name, DataType, {$_.Properties['Length'].Value} | where {$_.Name -like '*int*'}

  Name            DataType $_.Properties['Length'].Value
  ----            -------- -----------------------------
  internal_end    datetime                             8
  internal_start  datetime                             8
  interval_code   varchar                              5
  prod_intv_start int                                  4
  prod_intv_stop  int                                  4
  working_int     decimal                              5
Quick, easy. In a *nix shell that would be very tricky.

¹ SQL Server installs a Powershell extension so that you can essentially treat databases like a fancy object filesystem. It's wicked cool actually.


How would that be hard in a *nix shell?

    mysql > SELECT  column_name, data_type, numeric_precision FROM information_schema.columns WHERE table_schema = ? AND table_name = ? AND column_name LIKE "%int%"


The reason I chose that particular example was not for the database integration (which is very cool) but for the fact that it's actually something I did.

Point is, imagine trying to grep anything tabular in bash. Hypothetically, the output of ls -la for files with execute permission that have been modified in the last day. Or, more realistically, anything involving ps.
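
For the ps case, the object version sidesteps parsing entirely; a sketch, showing the five biggest processes by memory:

    Get-Process | Sort-Object WorkingSet64 -Descending | Select-Object -First 5 Name, Id, WorkingSet64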


Imagine trying? Sir/Ma'am, I have done so.

When boxed in-between the fearsome foursome--`bash`, `sed`, `awk`, and `emacs`--every problem involving tabular data (2D) is trivial. Hm, I'll give a shout-out to `sort` and `uniq`, too! When you go 3D, you might need to bring in `join`.

But anyway, you'd use

    find ~/bin /usr/bin -mtime -1 -executable -type f -ls
..to provide `ls -la`-style output for every executable ordinary file (not directories, which are often +x) modified in the last 24 hours, under /usr/bin or the bin/ under your home directory.

:)


Most users do not find awk trivial. Most people basically use it for {print $colnum} when simple columns are available.

PowerShell's integrated query syntax that you can patch into various things is a lot more consistent and reasonable. It is difficult to imagine volunteering people into a world of awk, sed and cut with good intentions in one's heart.


I didn't call awk trivial, I meant to say that the prudent application of powerful tools makes quick work of otherwise intractable problems.

Forgive my being frank, please, but I think that given the context of this conversation, it's appropriate: Most users don't HAVE non-trivial problems!

There aren't features in sed nor awk, nor regular expressions in general that weren't put there for some specific reason in the past.

When you wield your shell, well, you wield a tool honed through the ages by people who think like you! Do so with pride! :)


I think this speaks to the difference in mindset I've seen a lot in this thread. It basically comes down to people who have gained and honed experience with these tools vs. people that haven't. Yes, for someone with a lot of experience with these tools something like Powershell is redundant... for them.

Some people just want to get shit done because this is just a footnote in a task that's a small part of a project. Utilizing gained proficiency in several tools is cool but if you have a choice between doing that and just taking advantage of a simple tool that streamlines things and lets you move on to your next task quickly the correct choice is obvious.

It could be that these tools are better and may be more valuable in some situations but asking someone to spend time gaining proficiency or mastery when another path is available is just premature optimization.


I have no time for tool worship, language worship, and other sentiment like that. These are tools, not people. I'll use the best I can buy, borrow, or make until I see a better one come along.


A corporation has no incentive to give you (let you keep forever) the best possible tools.


If that is true, then it is equally true of the authors of shells.

But of course, it's not true. Many great and lasting tools have been founded out of corporate work in a bid to improve mindshare and ease recruiting by increasing the esteem of corporate engineering efforts. Many of the things we think of as "open source" and "free" were built subsidized either directly or indirectly by capitalistic interest.

But I'm not concerned about keeping it. Ossifying my process so that I can give myself the luxury of time off from learning is not something I'm prepared to do, because I think it will make me soft and less capable as an engineer. I regularly rotate my editors and learn new languages because I don't want to become like the sad specters of the previous generation of software engineers I see now, starched shirts and ties and a fanatical devotion to the dated technologies they stomach in order to collect huge consulting fees from corporations.

I've switched from Fish to PowerShell as my primary shell for a while. It's interesting.


Hey, don't forget cut and xargs!


And paste for tasks requiring the opposite of cut; then there's tr, the lightweight sed. And jq for structured queries on JSON. But all these are built into PowerShell, essentially.


No they're not. PS can't process text all that well...


*gulp* I've heard people say that, but I haven't delved into them!! :-S


cut doesn't match awk, but it's handy for the little stuff. And xargs is the most useful thing ever.


Imagine for a second that the DB can't do it and you need to filter the STDOUT output.


That doesn't sound hard at all. AWK, anyone?


Writing awk to do it is definitely more work than calling obj.Property though.


  $mysql_cmd|awk -F\| '$2 == "what you want the second column to match" {print $0}'
Not that much more work. And you can do better if you can coax mysql to emit proper DSV.


That kind of stuff adds up fast in an interactive shell.


...Which is why you write a script to do it if it comes up often.

Extendable tooling!


Okay, I've imagined it. It does seem terrible.

Fortunately, that's not a world any of us live in.


Er, it's a world all of us live in. Say, for example, ‘ps’ or ‘ls -1l’.

Then you end up using awk, which is rather like a half-assed version of powershell to work around shortcomings with the unix everything-is-a-string design.


AWK isn't a workaround. It's a tool for processing tabular data, in a textual format. The fact that it can solve so many problems is actually a strength of the EIAS design.


Parent comment was talking about parsing the raw output from a database in stdout. That's what I was responding to, and I'm not wrong.


OK, but now imagine it's not a database, because there are programs that do act this way.


And those programs usually emit proper CSV. You could use regex to match on what you want, but AWK was literally built for this. It's fantastic, it works great, and if you don't use that tool that was built to solve your problem, that's your own lookout.


I don't think awk is actually capable of parsing CSVs in all cases.


It doesn't parse escapes, IIRC, but those are rarely emitted.


Pretty much all databases will support "describe tablename" to do exactly that.


> Quick, easy. In a *nix shell that would be very tricky.

You'd probably need to make an explicit SQL query since you wouldn't have the database integration.


Well, yeah, but I mean the concept (grepping in tabular data).

If, hypothetically, bash had database integration, ls Columns would probably return something like

  foo	varchar	30	NOT NULL
  bar	decimal	 8
  baz	int	 4
Which would be a royal PITA to search for anything in—if I were to search for ‘int’ I'd get all int columns and any columns with int in the name….

I could've used an example involving process lists, but I didn't want to trigger PTSD in anyone who's had to grep the output of ‘ps’ for anything non-trivial. (Plus, the database thing was something I actually did a few days ago.)


But that's why there are tools like awk.


awk is rather similar in many ways to powershell, actually, but untyped. I agree most of what powershell does you can do with awk, but it's often less pleasant than powershell (in my experience).


ps aggregates /proc info for human viewing. Why grep the formatted output rather than the source data?


Maybe on Linux. OpenBSD doesn't even have procfs, FreeBSD deprecated it, DragonFlyBSD doesn't mount it by default.

/proc is a race condition waiting to happen, so I can't really say I recommend using that.


Objects are discoverable in PowerShell. Enter the object, and it lists the object out as text.


More specifically, you can find out all of the type information about anything you get access to by piping it to 'get-member'

  get-netadapter | get-member
Which gives the following output http://pastebin.com/bM0cBeXb about the type information for the result of 'get-netadapter'. In addition to this, tab completion for properties is available.

Alternately, you can get all of the current properties of any existing object by piping to 'select' (to get a general subset) or 'select * ' to get all properties. Note that this gives all information for the objects in the pipeline.

  get-netadapter | select
  get-netadapter | select *
The first will provide a short list of fields for all network adapters, and the second will provide all properties.

'Select' can also be used to easily filter the properties of items in the pipeline to ones you care about.

  get-netadapter | select Name, MacAddress
You can also just generally filter the pipeline based on generic filters (give me all of the network adapters that are currently not enabled, and have a driver provided by Mellanox).

And ultimately, if you want to use powershell to do the stuff that typed objects are good at, and then leverage command line tools to do the rest, you can if you like.

  get-netadapter | select Name, MacAddress | ConvertTo-Csv
Which will take all of the network adapters, gather their name and MAC address, and spit the result out to stdout as CSV, which can then be sent to any number of external tools; or, using the '-ExpandProperty' flag to 'select', you can get just a bare list of property values, one on each line.


Yes, this is so awesome. Can't remember the exact parameter/field? Just dump it to the console.


That's what you would do with text, though? If you have no idea what it's outputting, explore it by printing it to the console.


Sure, but with text you might have empty fields that are just omitted. With Powershell you can see that the field exists but is empty.

There's a bunch of subtle distinctions like that which don't sound impressive but end up being really nice when working with it.


What you can't do so easily with text is also just auto-convert that object directly to JSON or YAML or CSV, for that matter, and forward that easily into your favorite JSON/YAML/CSV exploration tool.
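
For instance (ConvertTo-Json ships with PowerShell 3.0 and later):

    Get-Process | Select-Object -First 3 Name, Id | ConvertTo-Json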


Ok, that sounds good. But what if some developer decided that some field would have the prefix "ABC" in all values, then the value I am interested in, then some value I am not interested in at the end? (You say this can't happen? Clue: WMI.) Can I take the value, convert it to a string, do some magic on it and convert it to another object entirely?

I don't know, I never had any problems with text parsing. I love strong types in programming languages, but in system administration I think it would only get in the way. But as I said, never tried it, so I might be missing something.


The default regex functionality is absolutely terrible (maybe this can now be fixed with a PR). You'd write your own cmdlet[1]. A few things are demonstrated in the cmdlet: functions that take pipes, functions that return pipes, anonymous functions, actions to perform before processing a pipe and dynamic objects. The Powershell syntax gets ridiculed for being ugly, but it's beautifully expressive.

> Can I take the value, convert it to string, make some magic on it and convert it to another object entirely?

    Get-ChildItem *.js | Get-Content | Select-Regex 'require\("(?<Require>.*?)"\)' | Select-Object -ExpandProperty Require
In other words: find JS files | read lines | turn regex capture groups into an object (another object entirely) | select a single capture

Ultimately, Microsoft provides WMI cmdlets[2]. You'd have strong types from the get-go and wouldn't need to resort to this string silliness.

[1]: https://gist.github.com/jcdickinson/cee4582448300c0d404bbff1... [2]: https://technet.microsoft.com/en-us/library/ee176860.aspx


The default regex functionality is absolutely terrible

How so?


Replicate the above without using [Regex]:: (i.e. use -matches and $matches). It forces you to radically depart from a typical Powershell mindset and approach.


I'm not certain what you're getting at; -match doesn't handle multiple matches so you can't replicate the above just with it? Yes, that's a bit annoying. You could match line by line:

    gci *.js | foreach { gc $_ | foreach { if ($_ -match ..) { $Matches... 
which is ugly, but doesn't use [regex]::Matches. But you can do:

    sls 'require\("(.*?)"\)' *.js | select { $_.Matches.groups[1] }
or other variants of Select-String, depending on exactly what data you want, without departing into the .Net Framework too far.


> "But what if some developer decided that some field would have prefix "ABC" in all values, then the value I am interested in, then some value I am not interested in at the end?"

I don't see why this would be a problem for PowerShell. Just use Select-String captures (or the -replace operator) to strip away the characters you're not interested in. The advantages that derive from passing around objects aren't based on all data being automatically in the format you want.


There are many ways to accomplish this (in general). My favorite is that you can create 'computed' properties on the fly which means you can continue to keep the strongly typed object in the pipeline, and still have access to the data you care about.

https://technet.microsoft.com/en-us/library/ff730948.aspx

You use the exact same functions (first, last, substring, replace, etc) as you would normally. Then the rest of the pipeline has it available if needed.
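
A small example of such a computed property (the 'SizeKB' name is made up):

    Get-ChildItem -File | Select-Object Name, @{ Name = 'SizeKB'; Expression = { [math]::Round($_.Length / 1KB, 1) } }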


I spend about half my job inside Powershell and had no idea about this. Very cool.


Yep! There are a few ways to accomplish such a thing, but probably the most straightforward would be the Foreach-Object command (aliased as "%") to do a mapping from source object to something else. Free-form objects/dictionaries are easy to make and consume.
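
For example, a minimal mapping from FileInfo objects to free-form objects (the property names are arbitrary):

    Get-ChildItem | ForEach-Object { [pscustomobject]@{ Base = $_.BaseName; Ext = $_.Extension } }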


Select-Object also makes it very, very easy to make new objects


Yes, it's quite useful in practice, because the alternative is knowing exactly the format of text some command is going to produce and munging it, which is hardly better.


It's nice when it works, but it's a pain to write a command-line utility that outputs into objects. Whereas in Unix, everything automatically outputs into text, so you have to work hard to prevent it from being chainable to the system.

There are also details of Powershell that are very annoying, such as the lack of < redirect, and the difficulties it has on binary output sometimes.


Difficulties with jumping through hoops to consume and produce binary output are really the only "problem" I've had with PowerShell.


Pipe anything to Get-Member. For example:

    (ls)[0] | gm

will list every field and its type on a File object. Wicked easy.


You can even just do

    ls|gm
and get the lists for both FileInfo and DirectoryInfo. There will be no duplicate types listed, because Get-Member already sees to it that there won't be.


It's not about complexity; it's about robustness, and avoiding insane hacks like this:

    REASON=$(echo "$line"|tr ' ' '\n'|grep reason= |cut -d= -f2)


I've done some Powershell work in my brief time as a Windows sys-admin. I personally struggled with it. The object interaction can be complex and the naming feels less scriptable and more directly out of the C# world.

That being said, I can see the appeal and I know people who love it. I have seen some amazing shit written in it, like this security tool that was just open sourced this year:

https://github.com/adaptivethreat/bloodhound

I have used Mono/GTK# for years on Linux and prefer it over other GTK wrappers. Powershell might be worth a try on Linux, but it will take time to build up the libraries and wrappers needed to interface with most of the Linux subsystems (unless Microsoft has ported a lot of useful stuff in this initial release).

People don't (or shouldn't) use bash/zsh/fish for really complicated stuff anyway. I have a feeling most sys admins/devops people will continue to use Python/Ruby for complicated tasks; and they both have a ton of libraries that help avoid the stdin/out/parsing situation you have with chaining together regular shell scripts.


Python and ruby for the big stuff, bash and the Holy Trinity (sed, grep, and find) for the small stuff. AWK for anything in between.


You can just use the Mono bindings for the Linux-specific API's, I hope.


I really hope this drives a new wave of experimentation and development in shell interaction. Text as the lowest common denominator will never go away but having more powerful constructs available will be great for a lot of use cases.


Check out the language I wrote for that exact use case: http://tkatchev.bitbucket.org/tab/


I wonder what's the difference between an object-pipeable shell and a Lisp machine (only 10% troll here). Kalman Reti said that passing pointers as IPC meant zero perf penalty, btw. Text serialization has design value sometimes, but the number of CPU cycles spent on string munging seems flabbergasting.


> Once you've piped strongly-typed objects around between your shell commands, text scraping seems barbaric by comparison

This is why Powershell is a great thing on Windows. Windows works with .Net objects and Powershell works directly with those objects.

It's also why I don't see the reason to be excited that it can run on OS X and Linux now, because neither of those systems work by passing .Net objects around. You parse text with bash/sed/awk/perl/python ... because *NIX works on text based config files and commands that output text.


> Windows works with .Net objects and Powershell works directly with those objects.

I'd say COM, WMI, and the registry are more prominent parts of Windows itself than .NET is. However, PowerShell seamlessly works with those as well.


I'm assuming that the Powershell port includes some subset of the standard cmdlets, which will handle serializing to and from objects.


A subset means it doesn't work as well or the same way.


No... It just means that all features may not be available. It makes no comments on how well the existing features work and how compatible they are with the Windows version.

I mean, `Get-ComputerRestorePoint` and `Get-WmiObject` don't mean anything on Linux. I assume those won't be ported. `Get-Process` is certainly meaningful; I assume that will be. The ports might also have cmdlets not available on Windows. I could certainly see cmdlets to interact with `apt` in Ubuntu, or `yum` in RHEL. Especially since PowerShell is focused on provisioning and managing machines.


Strongly typed cookies you say..

Maybe we all can just write our scripts in Haskell and eat some delicious type cake!


Along the same lines, for Linux and MacOS: https://github.com/geophile/osh


Interesting, nice work


"text scraping seems barbaric by comparison"

"can't use all that power for much, since the pool of PS-compatible tools and utilities is much shallower than it is for Unix shells"

The explanations are obvious though... aren't they? Text might be barbaric, but every programming language in the world makes sending text to stdout dead easy. The unix shell utilities are smart to support that case.


Elixir's REPL (iex) makes a surprisingly good environment for "pip[ing] strongly-typed objects around between your shell commands." It could use a form with added convenience magic, the way Pry is for Ruby, but I've managed to do some decent interactive ops work on systems through an Erlang-remote-shell (remsh) connection to an IEx shell.


IIRC, the major PowerShell push started with Win Server 2008 and its headless server mode (no desktop). A lot of WinX sysadmins pissed their pants that they couldn't use GUIs to control everything, and PowerShell was pushed as the way to do server management w/o GUIs (crazy, I know).


I don't think it's too late for PowerShell, I think it's too late for UNIX. You need a big CLR ecosystem for this to play out.

If somebody resurrected Midori as a CLR Unikernel I'd be very pleased.


Care to post a few simple PowerShell examples that do the sorts of things those familiar with typical *nix shell scripting might find useful?


It's junk. Tried to get something useful done, and version incompatibilities and poor documentation made it a losing battle. Type theory is overwrought for getting things done. My boss doesn't care that I have to cast an integer to a string.


I'm hoping this means running EF migration scripts on Linux will be less of a nightmare. It was one of the biggest issues I ran into developing a Web application in .net hosted on Linux.


PowerShell is awful


[flagged]


I take a super utilitarian view. At the end of the day, PowerShell on Linux is another tool in the toolchest.

PowerShell has a POV about abstracting gorp into high level task oriented abstractions, object pipelines, structured data, and the workflow from interactive shells to ad hoc scripts to formal scripts to production scripting.

That is what a number of people are looking for in their tools and that is why they like PowerShell.

We are constantly looking for ways to improve PowerShell and make it more useful, so I'd encourage you to suspend disbelief long enough to kick the tires. If you see some stuff you don't like, I would be very interested in hearing the details. You don't need to be polite in your feedback but I'd ask you to be specific so that I can identify potential changes.

If you are happy with your existing tools - happy days!

Thanks!

Jeffrey Snover [MSFT]


I admire how polite your responses to these posts are. Kudos.


> So many strongly typed shells exist already.

But no one uses them, and no one is building an ecosystem around them. That's what I'm hoping will change thanks to this announcement. Even if it's not PowerShell itself, if a different strongly-typed shell gets traction because of this, I will be happy. That's what I was trying to say with my comment.


[flagged]


> the only people I've seen say nice things about powershell are microshit fanboys that don't know any better.

I really like Powershell. If you install RHEL or Fedora I'm the 'mikem' in the default /etc/sudoers file.


[flagged]


I've been using bash for 20 years, trained a few hundred RHCEs, and spent a large part of my life working for Red Hat and then IBM's dedicated Linux group. So yeah maybe I don't know any better, but that's because I haven't encountered it before. Piping objects to 'where' and 'select' is simply a better approach than scraping text. The implementation details - having to use .net to make cmdlets - aren't great but the fundamental idea of powershell is excellent.


[flagged]


We've banned this account for repeatedly violating the guidelines. If you'd like to commit to only commenting civilly and substantively, you can email hn@ycombinator.com and we'll happily unban the account if we believe you'll do so.


[flagged]


"Be civil. Don't say things you wouldn't say in a face-to-face conversation. Avoid gratuitous negativity."

That's pretty clear, and has been in the guidelines since long before sctb was a mod.


Back in the day I thought VAX/VMS was amazing, particularly things like file versioning; however, I had to leave it behind and do things on UNIX (and then Linux) terminals. That change was relatively painless even if I did delete a file or two that I should not have. However, I have moved on and nowadays I would not have a clue how to use VMS beyond typing 'dir'.

If the VMS 'shell' was suddenly available on Linux, despite its 'proven' benefits in various scenarios, I just would not have the working knowledge to get going with it again. Furthermore, there would be no easy answers on StackOverflow to do basic things such as recursively renaming part of all filenames, or batch modifying the content of said files.

The problem PowerShell has for me is that it might as well be a souped-up version of VMS (or even BBC Basic). It is not that I am averse to learning; it is just that I left Windows World a long time ago. Much like my move from VMS to UNIX, there were some good things that I left behind (such as the file versioning thing); however, if you move to a new way of working then there is no need for the old stuff. I appreciate that PowerShell is new, but it does not suit my way of working, i.e. cribbing answers from StackOverflow and tutorial guides.


I appreciate that PowerShell is new, but it does not suit my way of working, i.e. cribbing answers from StackOverflow and tutorial guides.

PowerShell is ten years old and there are 37,000 Stack Overflow questions tagged with it - http://stackoverflow.com/questions/tagged/powershell


>We will be extending the PowerShell Remoting Protocol (MS-PSRP) to use OpenSSH as a native transport. Users will have the option to use SSH or WINRM as a transport.

Is possibly an even bigger announcement (this means there's actual momentum behind https://github.com/PowerShell/Win32-OpenSSH)


https://github.com/PowerShell/Win32-OpenSSH/releases

Microsoft has an SSH server for Windows right now.

[Edit] Here is a guide from the WinSCP peeps on how to get it running. https://winscp.net/eng/docs/guide_windows_openssh_server

The native SFTP works well!


This is what I have been waiting for right here!


yup, me too!


I think it might have rolled out already. Turn on Developer Mode in Windows 10 Anniversary Edition, and ssh to port 22 with a valid l/p?

Doesn't look like you even need to install WSL.


It looks like they're a little different for now. See the note under here.

https://msdn.microsoft.com/en-us/windows/uwp/get-started/ena...

Full Disclosure: I am a MSFT employee.


More specifically, I don't want to SSH from a Windows box. I want to SSH into a Windows box (and say figure out a way to easily install a LE cert)


There have been ssh servers for windows since forever.


>This is not Microsoft's OpenSSH implementation, which you can find on GitHub.

Nope, but it is the same SSH implementation that was in one of the Windows 10 for Raspberry Pi images!


PowerShell is interesting, and I don't consider it evil. I know some people swear by it, and I am not one of those people - I'll take my AWK with a side of Ruby/Python, and a helping of SCSH, thank you very much.

It does seem a bit out of place in a UNIX system, though. In Windows, PowerShell ties a ton of strongly typed databases (SysReg, &c.) and Windows APIs together into a nice package. But on UNIXes, those APIs are already well exposed elsewhere, and that data doesn't exist. UNIX data is typically stored as text, or can be extracted as such easily. We don't tend to do a lot of binary object munging. I mean, maybe if it integrated with databases, it could at least be handy for dealing with RDBMS scripting, but those bindings are available elsewhere - ruby, perl, python, etc. And MS could build bindings to POSIX, so you'd get a PowerShell version of tools like ps that returned objects, but what are you going to do if you want to munge /etc/fstab, and, say, generate user accounts for all your boot filesystems? I could do that with a line of bash and a few lines of AWK. I don't know why I'd want to, but I could.

So to conclude, PowerShell seems like a neat tool, but it was designed for an ecosystem that doesn't exist on the *nixes.


> But on UNIXes, those APIs are already well exposed elsewhere, and that data doesn't exist.

I'd go with exposed, but not necessarily well. Examples I run into all the time: get an IP from an interface, change iptables entries, even change fstab options. Sure, if you write enough regex, it will probably work most of the time. But it's never going to be as reliable as get_ips('eth0'), or get_filesystem('/tmp').options.add('nosuid').
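
On the Windows side, the first of those is roughly (a sketch; Get-NetIPAddress ships with Windows 8/Server 2012 and later, and 'eth0' just mirrors the hypothetical call above):

    (Get-NetIPAddress -InterfaceAlias 'eth0' -AddressFamily IPv4).IPAddress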


Getting an IP is still pretty ugly, somehow:

  ip address show dev eno1|grep 'inet '|cut '-d ' -f6
But I wouldn't call it unreliable. The cut is the only really nasty bit.

Changing your fstab is easy with a little awk:

  $2=="/tmp"{$4=$4 ",nosuid"}
  {print $0}
Granted, it won't remove contradicting options...


It's unreliable, because you don't know where you'll find 'inet ' or what the specific spacing of the output is. Also, that's not the IP, because you've still got the CIDR suffix attached.

And no, fstab is not easy either. For example it fails for:

    # /tmp has option_blah in order to ...
changing it to:

    # /tmp has option_blah,nosuid in order to ...
Confusing the next person to look at the well intended comment.

It's not impossible to implement fairly reliably, and it's not going to take days to do, but you're likely to make mistakes. And in both cases you're spawning extra processes instead of just getting the information you need.

PS. also slightly annoying is that everyone is so concerned about the text processing part that barely anyone checks what the tool itself can do. (ip -4 -o a s dev eno1 | awk '{print $4}')


>PS. also slightly annoying is that everyone is so concerned about the text processing part that barely anyone checks what the tool itself can do. (ip -4 -o a s dev eno1 | awk '{print $4}')

Actually, I spent a while digging through the manpage to try and find this. ip has awful docs.

As for the comment problem, yeah, not optimal.


Oh, I forgot: the comment you showed wouldn't trigger my AWK script. Read it more carefully.


Yes it will. If you mean the $2 part, the comment character takes the place of $1. Or did you mean something else? (either way, I tested it before posting)


Nope, you're right. I just miscounted.

You could always add

  && $1 != "#" 
to fix this, though.


And either the "ip" command can never change how it formats its output, or your script with that command will break some day.


...Which is why I said it was ugly. The grep probably won't break, the cut is more likely to. This is in part because the 'ip' command isn't very well designed. A better solution I found is

  ip addr show wlan0 | grep -Po 'inet \K[\d.]+'
Which is less likely to break. If your system supports it,

  hostname -I
Works great. However, this seems to be an ubuntuism, and it doesn't work on my system.


  ip addr show eth0 | awk '/inet/ { print $2 }'
Will be better ;) 'inet ' (with the trailing space) won't match IPv6 addresses.


I thought there was a specific desire for ipv4 addrs. That's why the space was there.


> In Windows, PowerShell ties to a ton of strongly typed databases (SysReg, &c) and windows APIs together into a nice package. […]

> UNIX data is typically stored as text, or can be extracted as such easily.

They could ally with the systemd folks in their mission to change that.


Which I’d be very happy about – see also the discussion about getting the ip for an interface above.


Are you out of your mind? Or do you just not understand what you're saying? Adding an interface to PS to get this info natively is good. But Lennart has a magic ability to get everything wrong, and he's NOT messing with my fstab any more than he already has.


Then which audio system is simple to set up AND allows per-program volume, effects, etc?

Which init system "just works" without requiring me to copy and modify a hacky init script from stackoverflow?

Lennart's products are controversial, but they're the classical "least worst" options.


>Then which audio system is simple to set up AND allows per-program volume, effects, etc?

JACK

>Which init system "just works" without requiring me to copy and modify a hacky init script from stackoverflow?

s6, runit, bsdinit (yes, really).

>Lennarts products are controversial, but they’re the classical "least worst" options.

No, they're ugly, broken messes that look nice. Systemd is a monster. If it was an init system, I wouldn't complain, but it's not. It's an init system that's also a cron replacement, a syslog replacement, an automount replacement, a docker replacement, a consolekit replacement, steadily gaining a tight integration with udev, overcomplicates everything it touches (we need a hostname daemon now?), provides libraries to high-level userland that are DELIBERATELY POSIX incompatible, promotes binary formats for no reason, flies in the face of tradition for no reason and breaks code (screen sessions die now), overcomplicates PID1, increasing the chances of a kernel panic, and that's just the tip of the iceberg.

Systemd is a tiny step forward, followed by a massive step back, and if you can't see that, you haven't been paying attention.


> JACK

> simple to set up

Try getting it to run with basically anything, seriously. Pulseaudio is plug-and-play.

> Which init system "just works"

Neither of the ones you mention fit that description

> It's an init system that's also a cron replacement

Wrong.

SystemD is a project with a similar branding issue to the one KDE has.

Systemd is a project.

It develops a set of libraries.

And, on top of that, an init system.

And a cron replacement.

And a syslog replacement.

You do not need to use them together – just like you can use Gtk apps in a KDE environment – but together everything works best.

> for no reason and breaking code (screen sessions die now)

For the reason that shutting down my PC means it spends 10 minutes waiting for screen sessions to kill themselves, and I can not stop that without either doing a killall, alt-print-reisub, or just pulling the plug? It also provides a way to keep things running still.

> increasing the chances of a kernel panic

Never had one, and I’m running bleeding-edge arch

> Systemd is a tiny step forward, followed by a massive step back, and of you can't see that, you haven't been paying attention.

Systemd is the least worst option.

There is not much of an alternative for such a kind of project, that properly handles things in a more modern way.

I want a userland that works plug-and-play, I want syslog as a daemon with database, not as a fucking flat file. I want more dbus, and, ideally, I’d have all config files in a database or registry for easier administration and sync.

I want proper udev and containerization, I want programs of my user to be killed upon logout unless specified otherwise.

Lennart's projects don't provide what the old crowd of nerds want.

But they provide what the average user want, and what’s required to make Linux actually usable for your sister, your grandmother, and your retarded neighbor that can’t even turn on a PC.

It also reduces the workload for server admins, like me, again.

I can write systemd unit files in a few minutes, can easily stream syslog over the network from multiple servers and aggregate, I can easily handle namespacing of software.


>I can write systemd unit files in a few minutes, can easily stream syslog over the network from multiple servers and aggregate, I can easily handle namespacing of software.

You could do two of those three things already (syslog and namespacing), and bsdinit init files aren't hard to write.

>Systemd is a project.

>It develops a set of libraries.

>And, on top of that, an init system.

>And a cron replacement.

>And a syslog replacement.

>You do not need to use them together – just like you can use Gtk apps in a KDE environment – but together everything works best.

That's BS. A lot of it is one executable.

>I want a userland that works plug-and-play, I want syslog as a daemon with database, not as a fucking flat file. I want more dbus, and, ideally, I’d have all config files in a database or registry for easier administration and sync.

Well, if you want Windows, you know where to find it. But stop trying to turn my UNIX into it.

>I want programs of my user to be killed upon logout unless specified otherwise.

Well, you're in an EXTREME minority.

>But they provide what the average user want, and what’s required to make Linux actually usable for your sister, your grandmother, and your retarded neighbor that can’t even turn on a PC.

...and if you think "the year of linux on the desktop" is ever coming, you're delusional. Besides, Ubuntu had a reasonable UX for that audience long before systemd came around.

>I can write systemd unit files in a few minutes, can easily stream syslog over the network from multiple servers and aggregate, I can easily handle namespacing of software.

Software namespacing doesn't require systemd. Syslog streaming has been a thing for a long time. And BSDinit, s6-rc, runit, openrc, shepherd, nosh, systemXVI, perp, watchman, and a variety of other init systems exist that make writing unit files easy. Many of them predate systemd.


> That's BS. A lot of it's all one executable.

That's not even true if you use the packages for debian or redhat. In both cases, it's over a dozen packages for systemd's independent projects, and you can replace each of them separately, if you wish to.

I’ve done so in the past on a server where I couldn’t update everything because it was an old-style containerized VPS with shared kernel.

> Well, if you want Windows, you know where to find it. But stop trying to turn my UNIX into it.

> Well, you're in an EXTREME minority.

Then go and use UNIX, aka BSD. This is the linux world, and most people want it – as seen in the countless times when distro maintainers held a vote pro or con systemd.

> ...and if you think "the year of linux on the desktop" is ever coming, you're delusional. Besides, Ubuntu had a reasonable UX for that audience long before systemd came around.

Does Ubuntu allow me to configure all services and systems via a neat UI? No? Why? Because it still uses a flawed file-based approach, which is useless in the 21st century.

Ubuntu replaced user-facing UX, but even there not everything could be properly done yet.

Despite that, Ubuntu also created their own init system, display server, syslog daemon, and a registry for configs.

So, they had the same stuff systemd now provides, and more.

> You could do 2 of those three things already (syslog and namespacing), and bsdinit init files aren't hard to right.

> Software namespacing doesn't require systemd. Syslog streaming has been a thing for a long time. And BSDinit, s6-rc, runit, openrc, sheperd, nosh, systemXVI, perp, watchman, and a variety of other init systems exist that make writing unit files easy. Many of them predate systemd.

I’ve tried unit files for other init systems, it was a pain in the ass.

Systemd is the first where every software provides matching unit files, I can easily write my own, and it just works.

Again, the entire point is that it just works and is still easily configurable and modularized.


It's not modular, it's a mess. Systemd may be organized into separte components, but make no mistake - it's a monolith in sheep's clothing.

As for unit files, nosh, at least, provides the same format as systemd. s6 "unit files" are just the command to daemonize the process, in most cases. BSDinit and OpenRC are pretty simple as well.

>Then go and use UNIX, aka BSD. This is the linux world, and most people want it – as seen in the countless times when distro maintainers held a vote pro or con systemd.

Linux is a UNIX too, and most of the people who know and care about init systems don't want it, judging by the massive protests that show up constantly from people who understand the technical aspects.

>Again, the entire point is that it just works and is still easily configurable and modularized.

It "just works," but it's a technically deficient mess that goes against everything unix is about, is a monolithic beast in disguise, and is trying to force everyone to use it.

If it weren't for that last bit, I really wouldn't care.


Systemd is to sysvinit what Linux was to Hurd and other microkernels.

Sure, it’s theoretically monolithic, but you can replace modules, and it just works.

And "s6 "unit files" are just the command to daemonize the process, in most cases" is a horrible solution, too.

The reason systemd unit files are so awesome is because they’re simple configuration, not having to string complicated commands together.

I configure the user it’s run under, the context, the syslog identifier for stdout, etc.

It all just works.

That’s the big thing.

You fundamentalists had 2 decades to make a working system, yet, what we ended up with was a clusterfuck that was worse than the X11 clusterfuck.

It’s time to end this.


>Systemd is to sysvinit what Linux was to Hurd and other microkernels

That's just not true. Linux was a return to a simple, tried-and-true model from a more complicated and theoretically beneficial one. Systemd is a divergence from a simple, tried-and-true model to a more complicated one that really doesn't have many benefits.

>And "s6 "unit files" are just the command to daemonize the process, in most cases" is a horrible solution, too.

>The reason systemd unit files are so awesome is because they’re simple configuration, not having to string complicated commands together.

So are S6 unit files. So are a lot of config formats, actually. Sysvinit was really bad, but other inits had moved on. The complicated commands are only required when the app has complex requirements, or when init is stuck using fragile pid handling, which S6 and the like don't need, because they implement proper process monitoring.

>You fundamentalists had 2 decades to make a working system

BSDinit, OpenRC, s6, hell, even upstart, and countless others are all working systems.

Anyways, I'm not a fundamentalist. I don't think that systemd's unit files are broken. What I think IS broken is systemd itself.

Init has a few jobs: managing startup, managing shutdown, reaping processes, handling daemons, etc. By this point, init's job is well defined. How it should do them has been argued, but the jobs are well defined. That is what init should do. NOTHING ELSE.

Systemd tries to do everything. Thus, it's overly complex, AND it's a single point of failure.

>It’s time to end this.

I quite agree: When Lennart stops making me try to use his piece of trash disguised as software, I'll stop complaining about it.


> Sure, it’s theoretically monolithic, but you can replace modules, and it just works.

Show me an example of that being done with systemd.


Great! So I can now run bash on windows (with WSL), open tmux, and open powershell prompts in each tab: the loop is complete!

I wish PowerShell's documentation was a bit clearer about what's PowerShell 1.0, 2.0, 3.0... etc.

A lot of the time I'd find a verbose and tedious way to do one thing because the code was written in "old PowerShell", when the latest version had a much more elegant solution (either because of new language features or simply because more cmdlets were added, removing the need to fall back to some C# syntax)

On Stack Overflow it often means reading more than the chosen answer; somebody will have added a newer way to do it. I wish Microsoft's "Hey, Scripting Guy!" blog had some versioning info (this post was last updated in 2016, using PowerShell 3; see the PowerShell 2 version here)


The official cmdlet documentation is versioned, though for some reason it only seems to have versions 3, 4, and 5. Which, agreed, is very annoying, since Windows 7 has version 2 by default.

https://technet.microsoft.com/en-us/library/hh849971.aspx

(You should upgrade to at least 3. It has all sorts of useful stuff added! File hashing and web requests at the very least.)


Does tmux work well on Windows? Everything Microsoft has been doing recently with bash on Windows has made me consider buying a surface pro as my next machine.


I have a Surface Pro 4 i5 and while I personally love it, I can't really recommend it due to various issues (battery life is shit; sleep is so bug-ridden it's best to hibernate only). It's uncomfortable as a tablet, and near useless for the main thing ultralight computers are used for - traveling a lot.

I use it at home. The pros: a really tiny size that still packs sufficient computing power for many things, and an awesome screen. I can use it at a desk or take it to the sofa - but I need to keep the power adapter handy.

If the execution were as flawless as Apple's iPads, it would probably be the best computer ever (by my subjective scoring, which appreciates a pen and an sRGB-accurate screen probably way more than the average user does).

But as it is - and I'm comparing it to my iPad Pro, which makes all the problems stick out like a sore thumb - it's not an obvious recommendation.


It really is annoying that the battery issues are so persistent. That's the main thing that has kept me from even going to a store to check one out.

I've been stalking /r/surface and I've been seeing complaints about this for over 6 months. Apparently it's more Intel's fault than Microsoft's, but still very disappointing. I love the form factor and I want to support Microsoft lately.


I was a big GNU Screen user on Linux, which unfortunately doesn't work on WSL. So I tried tmux, and it works fine for the few minutes I tried it: create new tabs, detach session, etc.

But once you close the Bash primary window, the session dies, so it's not as persistent as running tmux on a Linux machine, which makes sense given how WSL works. I haven't figured out yet if there is a way to always keep WSL running in the background so that I could close the bash terminal, reopen, and reattach my tmux session.


Execute tmux as a scheduled task.


Tmux came first to cygwin, and now it works on msys2. I don't know about MS-BASH, but MinTTY + msys2 + tmux has been working great for me.


tmux works quite well. I'm less sure about the surface pro :p


I agree, there's been a lot of churn in the PS versions and it makes it hard. Especially when you're new to it and trying to learn.


Yep, in my experience the documentation for Python and really anything OSS far outstrips PowerShell's


Microsoft is projecting that the .NET ecosystem will become a main revenue driver, regardless of the underlying operating system.


Which is why I have no doubt the Universal Windows Platform will become the Universal Platform and be a true desktop/tablet/mobile/server/etc Java competitor on all major OS (mobile, traditional and server).

It must be rather frustrating for Microsoft to look back a decade or more and see they were sitting on the exact technology a hell of a lot of people said would be their future had they made it cross-platform from the start. I remember using C# for the first time around 2004 and thinking "god damn if this were cross-platform like Java it would kill Sun". Back then Microsoft couldn't see past the tip of their nose though.


Yup, I've been playing around with C# more lately and it's such a good language.

Things like embedding with native code are trivial compared to Java. Want to call a delegate (with extras) from a C function pointer? Sure thing, just specify some syntax and away you go. Marshalling strings? Just works.

Totally a breath of fresh air compared to the mess that is JNI.


Me too. Back in the first .com wave I had access to early betas and was quite critical of it; nowadays I enjoy using it more than Java.

Although I do still like Java, just don't like having to wait for Java 10 for some of the .NET features already available (possible free AOT compiler, value types, reified generics, JNI replacement).


But that's what .NET wanted to be. That's what Silverlight wanted to be.


Until now they didn't do what was necessary to make that a reality -- actually fully embrace other platforms.


IMO this is why Google is currently building a new universal OS from scratch. They know the power Microsoft can wield with this. It will take some time to catch on, but when it does, it's going to be so big.


I think this can be explained by the fact that Microsoft has a long history of trying to kill OSS software and alternatives to its virtual monopoly in the desktop OS market, killing innovation across many technology sectors in the process. Being cross-platform was the opposite of their goals. They specifically created their own versions of open technologies and promoted them as the best/only way of doing things on Windows - see DirectX/OpenGL, and Microsoft intentionally degrading OpenGL performance on Windows in order to hurt the OSS community.

http://blog.wolfire.com/2010/01/Why-you-should-use-OpenGL-an...


Well they certainly have the resources to grow and support the .Net tech stack, that's for sure. I'm finding myself stuck in it more and more.

TypeScript was my gateway drug.

Linux is still my daily driver for getting work done but I can't help but enjoy breakpoint debugging Node applications in VS Code.


Microsoft operates as though, and under the assumption that, its software is needed. It's a faulty premise. The presumption is arrogance incarnate.

As a business I can fully, completely operate without any of the their software, profitably, peaceably, and with at least the same amount of friction or less. And more securely.

If you think otherwise, you are ignorant of the tools available for free and of some intelligently thought-out engineering.

Quality of a language is such a trite, over-used discussion that the results/goals of software are completely forgotten. This too is arrogant ignorance.

How many software engineers are there in the world? It's more than taxi-cab drivers. More than doctors. More than teachers. Cooks, farmers, and builders (not combined, of course). This is evidence of software for software's sake. There is no quality. Just a moving shell game of "you need this".

I love writing algorithms--but the idea that PowerShell and .NET will solve people's problems any better than something else is simply marketing. I have a pet rock to sell you.


As a Windows Sysadmin, my life would be a living hell without PowerShell.


Maybe consider moving away from Windows. You are living in their world. If they want PowerShell to be adopted, their whole program management will make sure you can only get things reasonably done with PowerShell. It's their world, and you don't have to be a non-paid employee of Microsoft.


By giving everything away for free?

The only related revenue stream M$ seems to have left is Azure, with some vague notion that giving away their ecosystem will drive people to use their cloud more


Windows isn't free. Office isn't free. All of their enterprise server software isn't free.


Support and training contracts aren't free, either. I don't know if Microsoft does much of that, but it's another option for revenue with .NET.


Microsoft does not grow food, build houses, or create clothes or cars, nor protect you from foreign invaders. The basic human needs are not met by MS. The business that claimed open source was a cancer is just a marketing company with memetic feedback loops. Plenty of other operating systems are out there. Their office product is no longer relevant (vis-à-vis LibreOffice, Google Docs, and others). Are you telling me a multi-billion dollar company cannot compete with a startup such as Google? There's a reason why. They do not create. Just marketing.


>Their office product is no longer relevant (vis. a vis. Libre Office, Google Docs, and others).

You have no idea just how wrong you are. Excel has no competitor worthy of even talking about. Libre Office is a mess. Google docs doesn't do everything word can and is in the realm of a toy web app.


Why is Google Docs a toy web app?


I'm always in favor of things being open-sourced, but I do kind of wonder in what universe I would use PowerShell when Bash is readily available.


As a full-time Linux user, I have to say that bash is overrated.

For instance, on Windows I could press the up arrow through my history 20x to find a group of commands I want to run, press Enter, then press DOWN for the following command, then enter, then DOWN until I run them all. In Bash, I need to press UP 20x every single time.

Yes, you can customize it a lot, and you can script it but default bash isn't that special to me.


You can use control-O for that, which does "execute this command as if I pressed enter, then when it's done bring up the following command at the prompt". So if you have a set of commands to run that are 20 lines up in history, you can first find them (with 20x UP or by reverse-search), then just control-O control-O control-O control-O ... to execute them one after another.


Don't forget control-R. I find that even the GP's example of up arrow 20x to be ridiculous when I can type C-r and a few unique letters that were anywhere in the command to find it instantly.


Which is why any sane PowerShell user would probably have installed PSReadline to get this and other goodies. Preferably running in ConEmu.



I wasn't aware of this. Thank you.

Not that I've been using Bash and zsh every day for twelve plus years...


Twelve? Youngster. I'd guess I'm coming up on thirty years. And I did not know this.

Now the trick is to remember it next time the need comes up. :-)


Seems to be an instance of Cunningham's Law:

https://meta.wikimedia.org/wiki/Cunningham%27s_Law


Oh. My. God.

Life changing!

Out of curiosity, how did you discover that you could do this?


I am not the GP but I learned about C-o by reading the man page. If you spend a lot of time in Bash, why not learn to use it properly? There are fewer than ~100 key bindings; see "Readline Command Names" in http://linux.die.net/man/1/bash You should at the very least skim over them.

  operate-and-get-next (C-o)
  Accept the current line for execution and fetch the next line relative to the current line from the history for editing. Any argument is ignored.


It's been something I've used for many years, so I really don't recall any more just where I found out about it...


Maybe it went something like "god damn it why can't I ctrl+v to copy, what other ctrl keys are there..."? And now I'm looking at http://ss64.com/bash/syntax-keyboard.html and being amazed I didn't know about / forgot a few of these. It'd be nice to put some of the good ones on a coffee mug, I first learned useful vim commands from a coffee mug...


Nope, because I've never used control-keystrokes to do cut-n-paste. I always use (and love) X-style mouse cut-n-paste and hate that there are no good implementations of it on other OSes.


Dating myself, the way I learned copy and paste was CTRL-insert and SHIFT-insert, and those still work fine in konsole, mate-terminal, most any other half-decent terminal program, and straight text mode outside of X.

Not xterm, but I never use that.


are you serious? you guys need to learn more shortcuts. http://teohm.com/blog/shortcuts-to-move-faster-in-bash-comma...


Doesn't work in bash running in my Terminal.app on macOS, although it certainly does when I `ssh` into one of my Ubuntu boxes. Wonderful shortcut.

EDIT: If anyone knows an easy way to get this working on macOS, I'd love to know it.


Stackoverflow to the rescue: http://apple.stackexchange.com/questions/3253/ctrl-o-behavio...

This turns out to be because on OSX the terminal driver defaults to throwing away control-O which is rather unhelpful of it. "stty discard undef" fixes it.


> For instance, on Windows I could press the up arrow through my history 20x to find a group of commands I want to run, press Enter, then press DOWN for the following command, then enter, then DOWN until I run them all. In Bash, I need to press UP 20x every single time.

I think the Bash default is better here. If I wanted to run 20 closely related commands, I would make them a single line; far more commonly I want one command from 20 commands ago and then the last command, which is a pain to do on Windows. Much worse is that Windows seems to lose all history every time I close and reopen a command window.


If you put this in your `~/.inputrc` [1]:

    "\e[A": history-search-backward
    "\e[B": history-search-forward
    "\e[C": forward-char
    "\e[D": backward-char
  
you can start typing a previous command and go up through your history for all commands starting with whatever you currently have typed into the bash shell. Reset the scrolling through the history with `ctrl+c`.

Personally, I hate that PowerShell remembers your position in the shell history. I don't want to maintain a mental model of the shell-history Rolodex and where I am in it. That's one more part of my brain that I can use for thinking about the code. If I have a set of commands to type over and over:

    history 20 | head -10 | sed -r 's/^ *[0-9]+ *//g;' > new_script.bash

* [1] http://codeinthehole.com/writing/the-most-important-command-...


Exactly, but it makes more sense to bind history-search-backward and history-search-forward to <M-p> and <M-n> in my opinion.


The search by previous characters seems to be the default behaviour of FreeBSD's t/csh. Confused the hell out of me at first because Ctrl+R isn't reverse search.


I think I tried something like this once before, and it broke something else. I can't remember what that was, so I'm totally trying this.


Yeah, this breaks the "move over by word" with Ctrl+Left and Ctrl+Right which I use a lot. Maybe I need to add them explicitly.


Ah. I use the standard emacs-esque commands to move over by word: alt+b for back word, alt+f for forward word. Not at all intuitive... Windows line-movement commands win over emacs bindings from an intuitive perspective.

Also you don't need these to get up/down to work:

    "\e[C": forward-char
    "\e[D": backward-char
Those are for left/right arrows.


For the record - the PowerShell team loves Bash and much of PowerShell is modelled after it. To be fair - we also modelled portions after VMS DCL, Perl, TCL and AS400 CL. There are lots of awesome engineers out there working on different platforms and we learned a lot from all of them. Jeffrey Snover [MSFT]


In many bash configurations, ctrl-o is bound to 'operate-and-get-next’. If you hit up 20x, you should then be able to hit ctrl-o to run that command, and your history will automatically go to the next command down. Check it out, it’s great!

EDIT: beaten by pm215!


Bash is 27 years old. It was written to replace the Bourne shell, which is 39.

It is not terribly surprising that a 9 year old tool like Powershell made some improvements over an earlier tool that was already old enough to vote!


I don't see a tremendous amount of accumulated wisdom in Powershell. It may be newer, but I'm not sure they looked at prior art or had design members who were shell gurus in any other shell than `cmd.exe`, which is a horrific shell. Why are commands horrendously long in powershell? If you spend time in the shell, you don't want your fingers falling off due to overuse. Aside... one of the design decisions Microsoft took that absolutely blows my mind is using `\` for file hierarchies instead of `/` when you're writing an operating system in C!


There are lots of aliases for cmdlets for exactly this reason - interactive use. The verbose commands are useful when reading a script that isn't working at 3am when your boss is breathing down your neck to fix it "stat!".
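For instance, the mapping is discoverable from within the shell itself (gci/ls/dir really are built-in aliases; output elided):

    Get-Alias -Definition Get-ChildItem    # -> gci, ls, dir
    Get-Alias -Definition ForEach-Object   # -> %, foreach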

RE - your aside '\' vs '/' - I think most of us still shake our heads at that decision.

Jeffrey Snover [MSFT]


I don't know if that was really a "decision" or an unfortunate compromise. DOS inherited /foo style command line parameters from CP/M, and DOS 1 didn't have a hierarchical file system, so there was no conflict with the path delimiter character UNIX was using. When DOS 2 introduced directories (and brought over the cd, md, rd, etc. commands from UNIX) suddenly we had to choose between massively breaking backwards compatibility or using something else, like \, as a delimiter.


> Aside... one of the design decisions Microsoft took that absolutely blows my mind is using `\` for file hierarchies instead of `/`

That decision goes back to MS-DOS 2.11, circa 1983. (DOS 1.0 didn't have subdirectories.) The model for DOS directories came from UNIX, which at that time was the very aggressively defended intellectual property of AT&T. As Microsoft was an AT&T licensee -- for Xenix, their version of AT&T Version 7 UNIX, which was Microsoft's vision of the future of operating systems when PCs became powerful enough: they licensed it in 1978 and resold it via OEMs -- I speculate that they were afraid that AT&T might sue them for patent or copyright violation relating to the filesystem design if they went with a forward slash.


They used \ for directories because most DOS programs used / for switches.

https://blogs.msdn.microsoft.com/larryosterman/2005/06/24/wh...


Interesting hypothesis! I'll ask Bill the next time I see him.

Jeffrey Snover[MSFT]


I'll take his word for it. (If he remembers trivia like that a third of a century later ...)


Well, he made that goddamned donkey game...


> using `\` for file hierarchies instead of `/`

You know you can use both since like XP, right? You can even mix them up in one path which can look quite confusing c:/windows\system32/etc/drivers\


`/` is not a first class citizen. You encounter different oddities throughout the windows environment in the shell and external utilities when you use `/`.


There are many (*nix-inspired) aliases for frequently used commands though. I think PowerShell's preference for verbosity is a strength more than a deficit. Why "grep" when Select-String ("sls") is so much more intuitive and promotes readability?
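For example, a rough equivalent of `grep error app.log` (the log file name here is made up):

    Select-String -Pattern "error" -Path ./app.log    # spelled out, reads like documentation
    sls error ./app.log                               # terse form for interactive use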


The bash I'm running is actually 3 years old. You make it sound like bash was written 27 years ago and no one has touched it since then.

Secondly, what constitutes an improvement is a matter of opinion. For example, I don't see this PowerShell as an improvement at all.


Autocomplete in Bash was actually a revolutionary improvement


And autocomplete in PowerShell is again.

Cmdlets tell the shell their parameters, and their parameters are long names, so you can just tab through them to find a) the available parameters and b) the ones which might help you.
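The metadata that drives this is itself queryable, which makes parameters discoverable even without tab:

    Get-Command Get-ChildItem -Syntax             # prints every parameter set
    (Get-Command Get-ChildItem).Parameters.Keys   # the long names tab completion cycles through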


Well, it's not an improvement, it's just a different design-choice. I for example much prefer Bash's way of doing it.

Also, Bash does have a way of doing it the Powershell-way, if you want that, as others have commented.

Which is also why I disagree with your statement. Bash may be old, but neither the underlying technology nor the use-case changed much in that time. So, there is not a whole lot where you can innovate with a new tool, at least not for such basic tasks, and instead, Bash has matured throughout 27 years to perfectly fit that underlying technology and use-case that millions of people have.


It seems weird that the feature of PS that you're most impressed with is something so... trivial? You have full access to your history in both shells so you could easily emulate that feature in bash/zsh or you could create a special binding to run all commands from a point in your history so you only have to press Enter once.


I think it's actually the basic Windows command line that does it, as I was never in the habit of installing PS while on Windows.


Or just be sane and use ctrl-r and search.


I get the feeling that most users have only accessed 5% of the features available in their unix shell of choice


I would say not just the shell but most software


And almost certainly languages, too. For example C++.


As someone who has created templated subclasses with both private and public virtual inheritance, I don't think that's a bad thing. C++ is wildly powerful, but it's a mess, and most people should stick to easily understood idioms/patterns, for the sake of maintenance if nothing else.


I also highly recommend enabling the pgdn/pgup bindings for history-search-forward/backward. If you are searching for a prefix it's a bit easier than ^R.
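Something like this in ~/.inputrc should do it, assuming the common xterm-style escape sequences for PgUp/PgDn (terminals vary):

    "\e[5~": history-search-backward
    "\e[6~": history-search-forward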


I bound these under C-n and C-p. Very convenient indeed!


I do use this too.


>For instance, on Windows I could press the up arrow through my history 20x to find a group of commands I want to run, press Enter, then press DOWN for the following command, then enter, then DOWN until I run them all. In Bash, I need to press UP 20x every single time.

I know other users have addressed this issue in other comments with solutions, but you do know this isn't an issue with Bash so much as it is with readline, right? There are plenty of ways to customize readline/inputrc to make it do what you want. There's even linenoise, if you hate readline for what it is. It wouldn't matter how good Bash got as a shell language, this scenario wouldn't be any easier unless readline also gets a facelift.


I actually did not know that.


Bash is awful. But so are shells in general.

Even Rob Pike says as soon as your shell program gets longer than a few lines, write it in a proper language.


I feel that principle is true from all high-level languages as you move down the chain.

Shell scripts should be 1-20 lines. Python/Ruby/whatever for 20-100 and compiled for anything >100.


I was about to refute you with a recent example of mine that just needed to serially execute a bunch of commands, and it's well over 20 lines. Then I remembered that I needed to introduce some branching logic, said "not in bash, I'm not", and rewrote it in Python.

So for me, it's not a matter of LoC but complexity. Need to copy a file from here to there, then commit to a git repo, blah, blah, blah, then kick off a build? Whether it's 1000 lines or 2, I don't have a problem with doing that in bash. But as soon as I need some logic (or on rare occasions, perf), I go to something else.

Given that, why do you cut shell scripts off at 20 lines? I ask in the spirit of being open to the possibility that maybe I'm doing it wrong. :-)


Well it isn't a hard limit. I am okay with a 25 line script for example ;)

Well really you are more right than me. It is about complexity more than the number of lines. However for the most part the longer something is the more complex it is to work with even if it does something simple. Managing a 1000 line bash script is almost certainly going to be more painful than doing it "properly" in Python or Java or whatever.

It mostly comes down to this for me: "is this thing going to be something I need to understand and manage outside of my own little bubble?" If it is, then I want it to be better designed/maintained, and I find that gets easier as I work "down the chain".


It mostly comes down to this for me: "is this thing going to be something I need to understand and manage outside of my own little bubble?" If it is, then I want it to be better designed/maintained.

Thanks for taking the time to reply. And you're right, even I wouldn't want to maintain 1000 lines of serially-executed commands, meaning I have a line limit somewhere as well. So maybe a mix of complexity, and at some point length even if I don't put a number on it like you did.


Everyone has their own way of working :) I first got into things with batch files on Windows 95 and quickly realised how awful they are at more than 5 lines! I quickly moved on to the Windows Scripting Host stuff, which was better but still painful. Then VB6, then C++/MFC.... The usual Windows user story.

Scripting on Unix is a lot better, but in my experience it's only recently that places seriously use VCS for scripts. So before the days of Git, if you wanted to do anything with an iota of manageability, it meant you did it in a "real" language hehe.


For me, the marker point is arrays. Arrays in shell are terrible - if I'm going to need an array, drop shell for something else.


Respectfully disagree. I've been writing full-on applications using Python and Qt4 which are well over 100 lines of code.


That's cool, we are just different is all. I use Python for automation tasks that require more than what I can do with Bash/Batch/PowerShell but don't warrant a full write/compile/deploy process. If your job is working in Python then sure you are going to use it way past the limits of what I would use it for.

Things are a lot nicer these days than 10 years ago though with PyCharm, etc. it makes managing a Python project so much easier.


Me too. And personally I'm somewhat conflicted as to whether or not it's a good idea. On the one hand I love python and use it for virtually all code I write (that doesn't have to run in a browser) and I'm quite convinced that for what I'm doing it's the most efficient language for me to use. On the other hand I have found myself on more than one occasion really wishing python wasn't quite so dynamic when working on large python code bases and if I was starting a new large desktop application today I'm not sure python would be my first choice. There really is something to be said for a better/stricter type system.


What if you use Nuitka or PyPy for Python, does that magically allow you across the arbitrary border?


When my shell program gets longer than a few lines, it's time to break it into some makefile targets with intermediate files.

I have in the past written a moderate size program in Powershell (installer for a program which needed a lot of configuration for a lot of components and got installed into a lot of different environments). And while that probably wasn't the best approach in hindsight, it wasn't actually that awful an experience.

It's not a million miles off writing a moderate size JavaScript program, except that it has an actual module system.

I wish they hadn't chosen dynamic scope though, nor the silly array result squashing behaviour.

Also the IIS 7 Powershell Provider - that was bad.

(Or a proper language is good too.)


Citation please? I would love to share this with some of my coworkers.


I can't find it anywhere, I have tried.

I think it is in the plan 9 mailing list - 9fans, as that is where I have interacted with Rob


You can use Ctrl+R for reverse search. However, these are minor things... You can find plugins and what-not to add these kinds of functionalities.

I use oh-my-zsh and I'm more accustomed to searching a sub-history of commands, e.g. nmap <up> #=> shows flags...


I do that same on bash and zsh since... ever?


It's just default behavior and it's customizable for a reason. Different people have different preferences; powershell's default behavior is in no way 'better' than bash's. You can also use a different shell like zsh.


It is just another tool in the toolchest. As you know, on Linux, there are tons of great interactive shells and it is sort of a lifestyle choice. You might want to kick the tires and see whether it fits your lifestyle or not. If you use Bash as your shell, think of PowerShell as you would python or perl - something you write scripts in and call from Bash.
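For example, a one-shot PowerShell pipeline from a Bash prompt might look like this (assuming the powershell binary from this release is on your PATH):

    powershell -Command 'Get-ChildItem | Sort-Object Length -Descending | Select-Object -First 3 Name,Length'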

Jeffrey Snover [MSFT]


Thank you for doing this!

Anyone who's done any kind of if/else/while/for in bash/zsh and PowerShell knows what a relief PowerShell is. I'd love for PowerShell (PSReadline actually) to support the common options of readline - e.g. HISTIGNORE, HISTCONTROL, etc.


does powershell compile to IL? is it possible to create stand-alone executables from it that do not depend on the runtime?


PowerShell is interpreted, as far as I know. Class declarations may be compiled on the fly, but I've never used them; I'm basically still at PowerShell v3/v4.


We have a hotspot compiler.

We have talked about having it create a stand-alone executable but it has always fallen below the cut line.

Jeffrey Snover [MSFT]


It is about Microsoft being a neutral player in a multiple-platform world. Microsoft want people using their tools as it means they are more likely to use their platforms and services. Be it PowerShell on Linux, Bash on Windows, or VSCode on macOS, they want to make the best tools in the hope that it will push companies/developers towards Azure/SQL/Cloud, as that is where the future for MS is.

Windows is on life support and they know this. The future isn't about companies buying hundreds to thousands of Windows licenses but an instance of an OS which is billed per X/month. They have seen businesses are more than happy to enter into such a model with Office 365 and The Cloud so it makes sense for them to move in that direction.


>Windows is on life support and they know this.

How is Windows "on life support"? Desktop/laptop computers are not going anywhere, at least until they invent some kind of VR (and no, mobile OSes like iOS are not suitable for doing real work and managing data in a networked environment the way desktop OSes are). Windows still has at least 90% of the OS market. People and businesses could change to an alternative, but they just aren't, and they aren't going to unless something really drastic happens to force them to (and this would likely have to include many popular applications also providing support for one or more of the alternatives). In the business/enterprise world at least, there is simply no indication that Windows is in any danger at all of losing marketshare or revenue, in fact it's probably going to increase since businesses seem to love the software-as-a-service model.

MS could probably just make Windows free for home users (and even make money on them with advertising, selling telemetry data, and support calls), maybe small business users too, and then soak the big businesses with high fees for their site licenses and SaaS and other services.


Windows revenue has been shrinking for a few years now. The growth is in Azure and related cloud technologies.

Sure, Windows probably won't ever go away (until a massive paradigm shift, as you mention) but businesses are not upgrading like they once were. You have a mixture of reasons for this: BYOD has chipped away at some of it and forced companies to allow non-Microsoft (i.e. Apple) systems to connect to the network for remote and onsite.

With more and more work being done in the browser it makes specialised software that forces upgrades to Windows out of the picture.

I have worked for a number of large companies, and recently the reduction in annual spend on Windows clients is pretty staggering. With almost all "managers" going BYOD with their (mostly) MacBook Pros, and onsite machines lasting a good few years longer than they used to, we just don't need to buy as much from Microsoft.

You can see Microsoft knows what's up as well. Windows Enterprise subscriptions and Azure are the future for Windows in business. For home users Windows is already "dead" from a revenue perspective, with the exception of charging via the OEM at first sale. Android, macOS, iOS, tvOS - all the operating systems home users are aware of are free upgrades on supported devices. This is where Microsoft has a problem, as obviously they support everything, so they can't lock out some users with a "2011 Dell XPS". However, going forward I expect we will see them do something along these lines, with limitations around what minimum hardware is supported, so you will only get upgraded to the latest Windows if you have AVX2-or-newer hardware.


I would agree that Windows is nowhere near being on life support but I do think that the size of the desktop OS market is shrinking and that in the future we may see a lot more home computer users move to tablet only or something like a Chromebook.

Enterprise users are much slower to move, but as enterprise SaaS offerings become more robust we may see a lot more businesses move towards a thin-client (a la ChromeOS) and SaaS model to meet their needs. Microsoft is doing the right thing by quickly getting out in front with Office 365. If they hadn't, maybe something like Quip or Google Docs could have started stealing significant market share, but now I don't think many enterprises would bother switching if they can stay with Microsoft.


I wonder the same, but I hope it succeeds.

When PowerShell was written, we knew more than when Bash, let alone the original Bourne shell, was designed. PowerShell understands that parsing is sometimes hard. PowerShell does not invite data-insertion attacks with the fervour of Bash. In short, it does many things right that Bash would do too if it could be rewritten.

And yet -- I've rarely found it useful in practice. But it's hard to judge that, since I don't do as much in Windows as I do on Unix. So this move will level the playing field a bit, and we shall have a fairer comparison.


Honestly, I prefer the object based approach compared to the text wrangling approach.


I think PowerShell actually would be way ahead of Bash. But we already have things so much better than Bash: Python/Ruby let you do a lot of amazing stuff (not shells, I know, but still immensely more useful scripting languages than Bash).

Also, if you're still using Bash, I'd highly recommend looking at newer shells. Fish is pretty amazing; I've been using it for over a year. There are others too, like Zsh.


I actually use Zsh most of the time, but my point is that pretty much every Linux box has Bash at this point, so I'm a bit curious what Powershell has to offer.


Powershell is really more about the scripting language and the command processing. The "shell" part is modular. On Windows you can still use most cmd tools inside powershell and with its commands.

So you shouldn't give up bash, you should use both and take advantage of the tools that help you get the most done.


What do you know about Powershell? (The answer will depend on that)


Wow. I've been very impressed with Microsoft lately. They could have easily set themselves up as the next IBM, in the sense of being a monolithic corporation relying on corporate practices, but they've been shipping some crazy fun things that show they're not out of the league yet.

Might just be MY misconstrued perception that Microsoft was becoming old/monolithic, but I'm really excited about the direction they are heading.


I speak only for myself.

Microsoft have been hugely focused on hiring people who have experience outside the Windows bubble, and have been encouraging cross-pollination of ideas with the OSS world. There are holdouts for the old guard of course (there's still a BU that requires Windows Phones), but I think everyone knows that the Microsoft of old cannot compete today by bringing an attitude that pleasing the CFO is all that matters. There are techies in the boardroom now who are familiar with the early-mover advantages of free software and the importance of the OSS culture to their workers.

So when a company of massive resources, who can see the writing on the wall, decides to hire thousands of people who actively disbelieve in the value of monolithic lock-in and who enjoy using open-source tools, and empowers them to build interesting things without regard to company loyalty, the result is Powershell on Linux and Bash on Windows. Oh, and a whole lot of goodwill amongst the people you had previously alienated.


Yet that does not prevent them from harassing the Linux world with patent attacks. And claiming they love Linux nonetheless. That's schizophrenic behavior.


Well said. If it only happened a decade or more earlier than it did, I think MS wouldn't have lost as many customers.


If only their recruiters were a little more responsive...


Very much so. They're returning to their roots (winning over developers with world-class tools) and it's really refreshing to see.

Between this, VSCode and .NET Core it really is an impressive effort.


Let's hope the awesomeness they are showing on the developer relations side finally leaks into the Windows side and they stop their shenanigans with W10 and Android and VFAT patent threats. It's so disappointing to see this kind of split-brained behavior.


Nah, why should they?

Developers are pretty much the only demographic that might realistically switch operating systems, so they only really need to cater to them. The rest, they can pretty much treat however they want, and they won't lose them as customers.


Exactly. The developers they can get to work for them are "useful idiots", whose work they then use to further screw over their main customers, who aren't going to abandon them no matter what, and also bring in more developers to their platform and away from the competition. As long as these developers are easily and naively wooed by a bunch of BS talk about "the new Microsoft", they get to help MS with their evil shenanigans.


Sigh. I don't even know how to start with this one. I'm a "useful idiot" because I'm interested in making PowerShell work as a proper Debian package, or because I might look into what's required to get it into FreeBSD?

Seriously?

MS is not a monolith, it's made up of lots of different people with different ideologies and different goals.


I think the parent comment's point, with which you are likely to disagree but haven't fully addressed, is that even the best intentions of developers can only serve the ends of the less noble segments of Microsoft.


Fair enough. I think that falls into the category of "what if someone does something evil with my code?", to which I don't really have an answer.


In response to the flagged sibling comment: I use Linux as my main OS and have almost non-stop since ages ago. I have in the past half-seriously wondered whether infamous projects that make Linux worse were MS false flag ops. But here, rather than making personal accusations (regardless of how true they might seem to you), it's probably more productive to provide a counterexample, a reminder of MS's negative behavior, or evidence that they haven't really changed. Then either people can see for themselves, or people within MS will see they still have a long way to go and keep pushing for change from within.


> “Microsoft loves Linux”

Not really sure. Does Microsoft really treat the C# and F# compilers on Linux as first-class citizens?

What has MS done to bring MS Office to Linux? Alternatively, has it contributed to other efforts like OpenOffice or LibreOffice?


I agree with you! Even if this is all for microsoft's bottom-line (and there's nothing wrong with that since they are a business after all), we all (devs and consumers alike, directly or indirectly) stand to benefit. The prospects certainly seem exciting.


I'm glad that Microsoft is doing stuff like this. But I can't imagine a lot of Linux users are going to want to use it. As a former Windows developer I always found PowerShell to be lacking in nearly everything I wanted to do on the command line; in fact, I mainly did shell commands on a remote Ubuntu server.


Would you mind elaborating on PowerShell's lack? I've never used it so I'm curious to know more.


I think the most succinct way of describing it is that UNIX tooling is scatterbrained and at times a little ugly, but each command exists because it solved someone's real-life problem. PS by comparison is a set of tools that try to be all-encompassing and solve problems MS devs think people will have.

Our PS repository is full of scripts that do nothing but abstract PS into useful high-level actions.


PowerShell's object-passing approach ultimately is a heck of a lot more composable and reusable than the "blobs of text with AWK and SED magic" approach that its predecessors took.

It's somewhat surprising to see people here arguing that the pure text methodology is superior. It's difficult for me to imagine a technically motivated scenario that is improved by having everything dump out inconsistently formatted and specialized plain text.

Great example: ifconfig and netstat. Regular interfaces and an object model over these, under the covers, is a whole universe better than the ad hoc tabular formats they present.


And now PowerShell on Bash on Ubuntu on Windows is a thing.

(Spoiler alert, it doesn't work very well right now. The cursor keeps jumping to the top of the console and overwriting existing text.)


No, PowerShell on Bash on Ubuntu on a Windows VM on a Linux host is a thing :)


And soon Windows VMs will be able to run inside of other Windows VMs, so..... VMs all the way down.


I was just about to try that exact series of steps to see how things went.


If you run it inside of tmux, that issue seems to go away for some reason.


Finally! I love the methodology behind PowerShell. It's more verbose than bash, and that's OK to me. The readability and _consistency_ far outweigh the extra characters. Now, I get that coupled with all the flexibility of *nix tooling.

It's going to be a long road to hook everything up, but this has the chance to finally standardize command parameters.

I'm sorry. This is truly amazing to me and I'm quite excited!


I Can Only See One Problem With PowerShell :)


So, that's kind of the beauty to me. For long lived scripts I can always type out `ForEach-Object { ... }`. When I'm ad-hocing there's always `%{ ... }`.

Bash is a strange beauty, but part of that beauty is how arcane it can be. Generally, people use the smallest parameter possible in scripts instead of the full name. While this can definitely continue with PS, the community has decided on "alias for ADHOC and Full-Name for scripts". This helps the readability tons if someone hasn't memorized all the command parameters.


Powershell is case-insensitive, so feel free to avoid that shift key if you're so inclined :)


Been using PS for a long time and I had no idea about this. Thanks !


It's happy to gobble up whatever improperly-capitalized cmdlets you throw at it, heck you can even forget some letters ;). And with a little tap on that tab key it will helpfully fix your laziness so there's no trace of it in your shell history.


Having powershell + bash on linux lets me compare this way:

powershell, 14 pids (threads?), 3119M Virt size, 80160kb Resident.

bash, 1 pid, 22068kb Virt, 3976kb Resident.

And this is simply starting the process to an idle prompt.

This is on a Debian 8 64bit system. (using their Ubuntu 14.04 binary)


I'm not a huge fan of PowerShell but you're not really making a good comparison here. Bash is "just a shell" that executes commands with a few bits of syntactic sugar here and there to make it easier to string those commands together and perform some primitive logic.

PowerShell is basically an interpreter that works more like a Java Virtual Machine. Consider the hypothetical example that you need to look up 100 users in an LDAP directory from Bash: you run an OpenLDAP client command like ldapsearch 100 times in a loop. That's 100 processes that need to fork and quit in a serial fashion in order to perform the task at hand.

To do the same thing in PowerShell you'll be calling a .NET LDAP API in-process, which will execute all 100 lookups without having to fork any separate process, resulting in vastly faster execution.

It's better to think of PowerShell like the Python or Ruby interpreters. IMHO, PowerShell is a vastly superior improvement to cmd.exe as a shell but on a Unix host it will be inferior to bash for day-to-day tasks.
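A rough sketch of the in-process version (assuming a host where the System.DirectoryServices assembly is loadable; the filter, attribute name, and users.txt are made up for illustration):

    $searcher = New-Object System.DirectoryServices.DirectorySearcher
    foreach ($name in Get-Content users.txt) {
        # one in-process LDAP query per user - no child process is forked
        $searcher.Filter = "(&(objectClass=user)(sAMAccountName=$name))"
        $result = $searcher.FindOne()
        if ($result) { $result.Properties['displayname'] }
    }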


I fully agree -- they're two different animals. IMHO, it's a symptom of the two different paradigms of running a process on Windows vs Unix.

When scripting on Linux, I'll need to get out of the habit of forking off a new process for each iteration of a loop, and instead push the loop into PowerShell for better execution.


So I wonder how it compares to having the python interpreter open.

According to my machine, running the python interpreter results in 1 PID, 129M virt, 4.6M res.


Using java + jython, results in 15 pids, 2010M Virt / 196M Res.

Jython is probably the more apt comparison, vs the compiled interpreter. But, it falls into the same trap as powershell: You couldn't invoke jython in a loop like you would 'python' or 'perl'.


I'm not so sure it results in a vastly faster execution, at least on OSes which don't need half an hour to spawn a new process. So, yes, of course it was needed for Windows, because there creating a new process is so expensive that sometimes you've got to mortgage your house to do it.


I was going to cover this topic (the cost of starting/forking a process) in my original comment but decided it would be a bit tangential since I was able to make my point without it.

On Windows the cost of forking a process is high whereas on Unix/Linux it's low but on both platforms every application has a nontrivial startup time that will be significantly greater than a native call within an existing process.


Looked at from another direction, the unix world works beautifully with slim VMs, unlike windows which requires gigabytes of ram. There are plenty of 0.5GB ram unix VMs out there that do their job just fine... and to fork over 20% of that ram just to do shell scripting? Not very portable.


PowerShell is essentially running a JVM-style virtual machine in the background though.


Finally! I began learning bash recently and it was driving me nuts. I've been more productive on my 1st day with PowerShell than I've been with bash in the past month.


Cool! I'm having a play with it and it seems to work really nicely. Does anyone know why the following commands don't produce the same output?

    get-childitem | ConvertTo-Csv
    get-childitem | ConvertTo-Json | ConvertFrom-Json | ConvertTo-Csv
It seems like, if there's a perfect underlying data model, doing what's effectively an identity conversion shouldn't change anything.

Edit oooo, and all the linux stuff is available within it too. So you can run this to just augment your linux experience.

    ls | grep '_' | ConvertTo-Json


ConvertTo-CSV converts each property into a single string. Rich object? Tough. You get a string. If the type does not support a nice string representation, you get type names, like "System.Collections.Generic.List`1[System.String]" in your CSV output.

ConvertTo-Json serializes in-depth ... properties that are actually rich objects become nested dictionaries instead of simple strings. For instance, the "Parent" which is a Folder object. Recursively, up to the depth allowed.

Anyway, when you convert back, you don't get actual FileInfo and DirectoryInfo objects, you get dynamic property bags. So those complex properties remain dictionaries ... which ConvertTo-CSV turns into strings in a very different way.

If you specify -Depth 1 when you ConvertTo-Json ... the resulting output would be very similar.
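In other words, per that last point, this round trip should come out looking very close to a plain ConvertTo-Csv:

    Get-ChildItem | ConvertTo-Json -Depth 1 | ConvertFrom-Json | ConvertTo-Csv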


I don't have a powershell handy, but one thing that might be causing changes is that json dictionaries usually don't have a defined order (parsing libraries are free to render the keys of a dictionary in any order); are the changes you're seeing just ordering changes?


If you have a Docker host, you can simply run:

    docker run --rm -it trevorsullivan/powershell

To clean up the image after you're done:

    docker rmi trevorsullivan/powershell


ls and grep are aliases in powershell, so not exactly the same, but pretty close

For your first question: you have converted types. When you did a ConvertTo-Json, you serialized the System.IO.DirectoryInfo type into JSON-compatible data. When you ConvertFrom-Json, you have a PSCustomObject.

The conversion to CSV is different for the System.IO.DirectoryInfo than the PSCustomObject. This has to do with the sub-types (converting from objects into serializable JSON objects). It also appears to handle time strings differently as well.


ls is an alias for Get-ChildItem. However, grep is the native command in this case. While such things work to some extent, it has the problem of turning your objects into plain strings. Effectively the same as if you had run

    (ls | % Name) -match '_'
The fact that the file's name is its default string conversion can be pretty annoying when dealing with results, e.g. from gci -recurse where it's quite important to retain the path to the file. So, careful with such things. On Stack Overflow I tend to scold people who reduce PowerShell objects to strings without good reason ;-)


PowerShell is the first language where I managed to save time by writing a program for a task. Needless to say, I'm thrilled at the prospect of using it on Linux. The only thing I hope they work on is startup time, making sure it performs well on low-spec servers (it can be a bit slow starting cold on older laptops).

    ls -Recurse | ? { $_.Name -match ".+docx" }


I am not sure what your example does. How does it differ from

find . -name "*.docx"

ls -Recurse | ? { $_.Name -match ".+docx" }

And, if these are the same, why type in twice as many characters as a command? Actually,

ls -R | grep "\.docx$"

is, I think, closer in spirit.

Now, the issue is things further in the pipeline, I imagine. The find solution allows selection on type name, type, date, depth, mounts, properties, file system type, ownership, relative age, inode, symbolic link. find actions include delete, exec, print, quit. This can pass on needed information to further pipeline commands.

However, an interactive shell is meant to allow expression of commands with a reasonable minimum of keyboarding (thus ls and not dir). I could argue that the "Unix" command should be

find . -n "*.docx"

(because the most common -n... option used is -name).

I think, to be fully reasonable the find command you want is

find . -regex ".*\.docx$"

which is still shorter than the PS example. I am not sure that for most interactive tasks, PS is a win. It does appear to be more elegant, but only usage over time will show if it is indeed more productive. I am looking for academic productivity studies to be done, now that both bash and PS are available on common platforms.


If you're just talking about ls/dir, then PowerShell isn't really a win. The advantages are clearer when you start talking about lots and lots of commands. A lot (but not all!) Unix tools have flags to control their output format or input filters, but that means you have to memorize different flags for all of them. OTOH, in PS you just pipe to where/sort/select and they work for everything. IMO PowerShell has better separation of concerns.
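For instance, the same three cmdlets filter, sort, and project any object-producing command (the thresholds and counts here are arbitrary):

    Get-Process | Where-Object { $_.CPU -gt 100 } | Sort-Object CPU -Descending | Select-Object -First 5 Name, CPU
    Get-ChildItem | Where-Object { $_.Length -gt 1MB } | Sort-Object Length -Descending | Select-Object Name, Length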


In Powershell you can also do this:

  ls -Recurse -Include *.docx
or even this:

  ls -Recurse *.docx
which is really quite similar to your find command.


I should have made it clear that this wasn't an example of Powershell being superior to bash, just what I can remember from that little script.


I've been hoping for a fully-featured Linux version of Powershell for a while, very awesome. I've used various Linux distributions for a long time and find bash very practical usually, but when I started learning Powershell when working in Windows environments I've been impressed with how easy it was to do some powerful stuff, and felt conflicted since I wanted to be able to use similar functionality and scripts across Linux/Windows.


Powershell isn't competing with bash, it's competing with Python. So then, how does it compare against Python? While plenty of small scripts are still done with bash, I think Python is pretty common for serious production needs. I use it for virtually all scripting these days.

I had a recent experience where a developer spent a day and couldn't get Powershell's webrequest library to do what he wanted. I was able to accomplish the goal in 15 minutes with Python on a Debian system.


Sure, there are different ways to do various things, especially depending on what one is familiar with and what the task at hand is. It sounds more like you knew how to get something done with your tools faster than the other guy/gal in this case. :)


There are obviously many factors such as domain knowledge and language familiarity. I respect the developer who failed to do this in Powershell, he's very capable.

In this case, I would chalk it up to Python having a longer history and being more mature. Neither of us were particularly familiar with this domain, but because of the quality of Python's online community and documentation, it wasn't difficult to accomplish the goal.


Powershell really became interesting for me recently. I was working on diagnosing a production issue where one of our win32 applications failed on loading a .net assembly (dll). The error messages weren't terribly useful, and I wasn't getting a good stack trace. While Googling around, I found out that you can randomly instantiate .net objects from the PS command line. A few commands later and I had a full stack trace indicating the problem.
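The technique looks roughly like this (the path and type name are made up; the catch block is where the detail the error dialog hides shows up):

    try {
        Add-Type -Path 'C:\app\Some.Dependency.dll'     # load the suspect assembly
        New-Object Some.Namespace.SomeType | Out-Null   # poke it to force dependency resolution
    } catch {
        $_.Exception.ToString()   # full exception chain, including the stack trace
    }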


It's been such a difficult road. The consistency that PowerShell provides makes discovering things simple. There's no referencing a MAN page to see which parameter switch it is for _this_ tool. Yet, there's been no power. No ability to harness OSS tools in that way. Now, maybe I can have both.


"If Microsoft ever does applications for Linux it means I've won." - Linus Torvalds


For Fedora 24 users:

For some reason, the rpm that Microsoft provides doesn't list all the dependencies that it actually requires. What is even worse, the compiled binaries inside require quite outdated versions of icu & openssl that are not available in Fedora 24 any more.

What I've come up with to force powershell to run:

  # dnf install icu lldb lldb-devel lttng-tools lttng-ust
To view what we're still missing:

  $ find /opt/microsoft/powershell -name \*.so -type f | xargs ldd 2>/dev/null | grep not\ found
	    libcrypto.so.1.0.0 => not found
	    libicuuc.so.50 => not found
	    libicui18n.so.50 => not found
(libcypto* is openssl-libs package, libicu* is libicu.)

Now, you need to manually download:

http://archives.fedoraproject.org/pub/archive/fedora/linux/r...

(Yes, it's Fedora 17 == 2012 (!), which is ridiculous.)

http://archives.fedoraproject.org/pub/archive/fedora/linux/r...

and extract usr/lib64/* files from both rpms to, say, /opt/tmp/lib64.

Then finally run:

  $ LD_LIBRARY_PATH=/opt/tmp/lib64 powershell
  PowerShell
  Copyright (C) 2016 Microsoft Corporation. All rights reserved.

  PS /home/hf> $PSVersionTable

  Name                           Value
  ----                           -----
  PSVersion                      6.0.0-alpha
  PSEdition                      Core
  PSCompatibleVersions           {1.0, 2.0, 3.0, 4.0...}
  BuildVersion                   3.0.0.0
  GitCommitId                    v6.0.0-alpha.9
  CLRVersion
  WSManStackVersion              3.0
  PSRemotingProtocolVersion      2.3
  SerializationVersion           1.1.0.1
It's a very strange feeling to see "Copyright (C) 2016 Microsoft Corporation" in xterm, if you ask me.


The build scripts for PowerShell right now require an existing powershell binary to work. If I can untangle this and work out what's required to build from scratch, then that'll probably help you build a proper Fedora RPM.

(I'm looking into it for Debian, but the work will help other distros, too).


Thank you for taking the time to share all that work you did to get Powershell running. Your instructions work for Fedora 23 as well.


I've been working on this for some time now, and am ecstatic to share this project with the world!


That is awesome. Truly an inspired set of changes at Microsoft lately.



If people could be relied upon to maintain their own systems and apply security and other updates, this wouldn't be a problem. The unfortunate reality is that they can't.


PowerShell is an improvement on Bash, but I'll still be using F# .fsx files for my .NET scripting needs. That said, the real reason I'm excited about this release is that there's finally a great example of a very large project based on .NET Core [1]! Most other examples I'd seen were pretty small, so this is really helpful.

[1] https://github.com/PowerShell/PowerShell


Can you share the advantages or improvements PowerShell has over Bash? Or is that an overstatement?


PowerShell is one of Microsoft's better offerings, it runs rings around Batch, thank heaven. However, it has its own quirks. Loads and loads of stuff done implicitly (unwrapping arrays as parameters when constructing an object?) and the .Net integration can be annoyingly flaky. Add in the bizarre syntax, some kind of halfway house between Bash and C#, and you quickly end up writing enough code to hang yourself.

My favourite: defining functions more or less like any other C-inspired language: function abc([string]$param1, [string]$param2) { #do stuff }

...but if you CALL that like any other C-inspired language: abc("string1", "string2")

PowerShell mashes the two strings together and ignores the second parameter. Drove me mad when I started out. Incredibly inconsistent.
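
To spell out the pitfall, a minimal sketch (abc is the toy function from above):

    function abc([string]$param1, [string]$param2) { "p1=$param1 p2=$param2" }
    abc("string1", "string2")   # one array argument, coerced: p1=string1 string2 p2=
    abc "string1" "string2"     # shell-style call: p1=string1 p2=string2
    abc -param1 "string1" -param2 "string2"   # named parameters, unambiguous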

I like Microsoft's effort here, and the design is a lot better, working with objects, but not on my Devuan box!!


You can run PowerShell Core inside of a Docker container quite easily as well.

https://channel9.msdn.com/Shows/msftazure/Run-PowerShell-Nat...


I'm something of an OO zealot and I'm a C# developer, so I'm familiar with PS and probably its target audience, so to speak.

PS is a fantastic product but I just find it too verbose. When everything is expressed as an object with properties you need to reference the object and the appropriate property. I know there are aliases for common Bash operations and idiomatic PS allows shortcuts, but most people who use PS for short spells to automate something are unlikely to be adopting idiomatic styles.

For PS to gain traction I feel like it needs to be a superset of Bash and offer an easier, intuitive gateway. I know it has enormous potential, I know it has massive capabilities but still I only reach for it as a last resort. I don't think I'm alone in the matter :/


I like Microsoft's move here. They first made available bash on Windows and then PowerShell on Linux. So no one is going to cry. And now they will improve PowerShell and win over users with a good tool rather than just because that's the only one available.


PowerShell is old tech, so browsing the source tree on GitHub should show some of MS's internal practices & guidelines from the old times. Yet the GitHub project starts in Feb 2015. Where is the source code from before that?


On some internal MS RCS.


But why? It's like if VW invents doors that can be connected to BMW cars. (spoiler: BMW cars already have doors)


I don't understand Microsoft any more. They are making all their proprietary lock-in stuff open source, and they are alienating their user base. I've heard a number of long-term Microsoft users express frustration with the user experience (not to mention the Windows 10 upgrade shenanigans). It was bad enough that my dad, a long-time lover of Windows NT (he liked its VMS legacy), switched to a Mac and loves it.

I feel like Microsoft might be pulling a Sun here. However, I'm sure I'm a lot more likely to use Microsoft stuff if I'm not tied in to Windows.


As a counterpoint, I've been a full-time Linux user (desktop and server) for well over 10 years, with server use for 20. (I'm 40, so while others have used Linux/unix longer than me, this is the entirety of my adult working life.)

With the open-sourced .NET stuff, I've been active in pushing for projects I touch to use it. And, I now dual-boot Windows 10 and Linux, actually using Windows for daily work for the first time in a decade.

Open source C#? Sign me up. And I really do like the Windows 10 interface. The telemetry stuff just doesn't bother me. I think it's largely a red herring.

(And, furthermore, I think privacy stuff really has to get nailed down in laws and society, not just with technology. Going just for a technical solution means you'll always be chasing the answer. Encoding it in laws, and setting people's societal expectations, means you get to set the answer instead of follow behind.)


The main fallacy is to think someone can 'understand' any big corp. Especially Microsoft - http://www.bonkersworld.net/organizational-charts/


They're just trying to get some developers back on board. It hasn't worked out so well so far, I guess.


Finally! It's exactly what we need since there are currently no scripting languages available in Unix user-land for when you need to do something beyond /bin/bash


Poe's law


The initial release is an “alpha” and is community supported. In the future, we will deliver an official Microsoft released version of PowerShell based on open source to anyone running a supported version of Windows or *nix. The timing of the official Microsoft version will be based upon community input and business needs. We hope all of you will help us get it right!

'Based on business needs' = if there's ever enough demand.

Still, pretty cool.


Windows can have a neat-o shell all they want but when you open powershell on a windows computer you still get the following when you try to ssh... so, not really seeing the point here:

ssh : The term 'ssh' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again. At line:1 char:1


see: https://github.com/PowerShell/Win32-OpenSSH. ssh is coming, likely in an official capacity in the future (this is an official MS group on github)


[flagged]


he's not wrong though. ssh is the quickest way of accessing remote machines, not to mention secure (i have never seen remote desktop done with priv/pub key in practice). it also makes automation and transferring files easy. putty is terrible compared to ssh on linux. what i would have given for good ssh on windows back when i was still doing C#...


he's not wrong though. ssh is the quickest way of accessing remote machines

    Invoke-Command -ComputerName $ListOfComputers -ScriptBlock {#code...}


yes, yes, well done. how does this work with public/private keys again?



Having lived around the Windows world since Win 3.1, I cannot get over the sneaking suspicion that somewhere deep down Microsoft is working towards an attempt to "embrace and extend" Linux. Maybe it's just me, but they've gotten rather interested in Linux lately and they are reaching out to it in seemingly innocuous ways. I hope I am wrong.


I just spent all day trying to figure out why Invoke-SqlCmd only works in certain extremely convoluted ways. I'm going to assume this has more to do with the Sql Server team than it does to do with the powershell team, but still: I cannot be excited about anything to do with Powershell since I'm now up to like 10 Googles per line.


Strongly-typed scripting is nice, but I think PowerShell has horrid syntax. I would rather take .fsx scripts any day.


(.fsx scripts are F#, so also strongly typed)


I think PS is miles above CMD but there are still things that bug me. For one, loading scripts into the current session is super confusing to me. Sometimes they are automatically available, other times I have to load them, still others aren't available at all. I'm guessing there is some way to "install" scripts so that they become cmdlets but I don't know what it is.

Another big problem I've had is backgrounding commands. There are multiple ways to do it and I always seem to pick the wrong one. For example, I want to install a bunch of things with Chocolatey so I do a lot of Start-Process commands, but they seem to run serially and not in parallel. It's just very difficult to manage jobs in PS, or at least I haven't found any good information on how to do it properly.
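
For what it's worth, the built-in job cmdlets can run installs in parallel - a minimal sketch, assuming Chocolatey is installed (the package names are just examples):

    $jobs = 'git', 'nodejs', '7zip' | ForEach-Object {
        Start-Job -ScriptBlock { param($pkg) choco install $pkg -y } -ArgumentList $_
    }
    $jobs | Wait-Job | Receive-Job   # block until all finish, then collect output
    $jobs | Remove-Job               # clean up the job objects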


OK, so you can finally fix the bug that breaks non-ASCII, LF line endings and binary redirection. Link: https://github.com/PowerShell/PowerShell/issues/1908


A lot of people seem to be missing the point of PowerShell. The concept is far more powerful than the unix shells, because it's a superset of them. It's like Haskell and Haskell-without-types.

We could only dream we had something like strongly-typed pipes in Unix -- and that various tools supported it too.


It's amazing what happens when a company starts to lose market share, isn't it?


I've got this embarrassingly ugly perl-inside-cygwin to automate something with a GUI tool that happens to have an even worse scripting plugin.

I suspect Powershell would allow me to take the whole process native?


I dreaded this day would come. As a recent ex-Microsoft alumnus, and with years of System Center insanity behind my back, I saw this coming, and even heard faint echoes of it in the hallways as I left.

I have deep respect for Jeffrey and what he's accomplished (there's a great video of Jeffrey talking to Linux sysadmins and explaining the genesis of PowerShell, including the insane amount of money we burned before he wrote the bits that actually worked over Christmas break), but the grand (corporate) motivation for sponsoring something like this is more driven towards making sure the MS datacenter management tooling (which comprises trendy things like OMS and System Center) attain minimal mindshare in both the on-premises datacenter and on Azure.

That is not something most people would care about, but Enterprise IT likes having a single hammer for all its nails, and PowerShell's promise is to be the one thing that your garden variety IT sysadmin would love to use to automate every single system in your company.

Ironically, PS is effectively lousy at automation because of its inconsistent error handling, and it is incredibly hard to maintain over time due to constant changes and updates, so I'm curious to see how it will evolve to manage all the different abstractions in the UNIX world (knowing my former colleagues, it will likely follow the OMS route of "Windowzifying" those abstractions and making them "uniform" - i.e., Windows-like).

The underlying issue where it regards PowerShell is not the language itself (although yes, something better, more coherent and with an actual standard library would have been nice, like Ruby or Python). It's Windows.

Windows (the enterprise platform) has always been a hectic patchwork of multiple components (AD, SQL, IIS, etc.) that evolve at their own pace, and PowerShell grew to encompass them not merely because basic automation was an actual checkbox we had to have to be able to sign some hefty deals, but, technically because _none of it had open APIs_. Nor would it be feasible at the time to go back and re-do any of it with proper language-agnostic APIs.

It's very nice to iterate over AD entries and set some fields with a couple of lines of PowerShell, yes. But it's also fundamentally impossible to do so with just about anything else.

Tying it all to the .NET stack (which is still prevalent in everything done internally, because that's what customers invested in) turned out to be a lazy-ass way to ensure we could check all the boxes in terms of automating new things as they came along (mostly - WMI is still iffy). Effective and "scriptable", but essentially still proprietary.

And right now, since my former colleagues are betting the farm on Azure, I expect PS for Linux to become the de facto scripting language for managing Linux on it (it already is, to a degree, since PowerShell DSC already lets you "manage" Linux VMs).

For interactive use, PS was a pain to learn (I was a VMS guy, originally), but, most importantly, it was a nightmare to maintain, because cmdlets would pop up and disappear between releases, change semantics, and were quite often inconsistent among themselves.

I shudder at the level of technical debt would-be adopters will be buying into, and wish them luck.


mod parent up!

(sorry, all this MS vs. Linux discussion brought me back to the slashdot days)

Seriously, powershell finally makes sense to me after reading this. Thank you.


I am pretty excited about this as I have few cross-platform languages that I can use (due to conditions at work). I am wondering how many distros will adopt it into their repos.


Probably most distros that want to be available on Azure. Which is why this happened, really.


I fully expect this to be in the AUR in under a week. Probably within 48 hours, even.


...and it's already in the AUR. It doesn't run yet, though.


Get-CobolFingers -Variant=Microsoft -Payoff=MuchLessThanAdvertised


    Get-Cob{tab}{tab} -v{tab} M{tab} -P{tab}{tab} M{tab}
Cmdlet long parameter names both a) work if you type the shortest non-clashing prefix, and b) tab-complete. And if a parameter takes an enumeration you can tab-complete those, too. Which is incredibly nicer than usage: somecmd -umhcglsrfmh line noise and man page lookups.


Actually, you can ctrl+space in the latest versions to get a list of parameters (or values, if the parameter takes an enumeration) that you can scroll around with the arrow keys :)


It's better than reading regex linenoise.


Is there anything preventing Microsoft from putting a Linux shell natively on Windows? I'm sure PowerShell is awesome, but the DevOps person in me is not excited at all.


Not quite what you're looking for, but arguably better: https://msdn.microsoft.com/en-us/commandline/wsl/about

Been playing with it ever since I got a new Windows laptop for college (I usually would have installed Debian but I prefer OneNote to the note-taking apps available on Linux). The interesting part is that it's not a VM; the LoW subsystem translates Linux syscalls to NT ones.


this x100. i wish they'd work on chef/puppet/ansible/salt support instead. probably never going to happen though, so much functionality in windows isn't available from the command line.


Yeah, probably never happen. You'd never see J. Snover the 'father of PowerShell' at ChefConf last year talking about PowerShell Desired State Configuration (DSC) and its support for Chef (https://www.youtube.com/watch?v=Gh9zfm5gVEg) or two years ago the PowerShell team announcing a DSC Cookbook for Chef (https://blogs.msdn.microsoft.com/powershell/2014/07/29/chef-...) or last year announcing Puppet's announcement of support for PowerShell DSC ( https://blogs.msdn.microsoft.com/powershell/2015/10/06/desir... )

You know that the entire point of PowerShell is to make Windows functionality available from the command line, for centralised management and configuration, for automated provisioning on Azure, right?


I think we have different definitions of support. What constantly amazes me is how similar many OSes are - except Windows. DSC hasn't changed that, even though Mr. Snover is a very clever guy. If you're in a homogeneous environment, DSC is probably great. But it's 2016, and environments are heterogeneous. Even shipping an SSH client and server with Windows would go a long way as far as interop goes - which may be happening AFAIK (OK, I admit I love SSH, but it's so useful). PowerShell feels a lot like the xkcd n+1 standards thing. I don't use bash any more, but it (along with ash/dash) is the de facto standard.

Sorry I can't get more excited about this, which is especially sad because my background is QBasic -> VB -> C#.


That is how we got started on this. We didn't want to invent anything here. We had a technology called Services for Unix which provided all the shells and utils and I got 99.5% of the way to getting that shipped natively in Windows but it got hung up over IP concerns. So we made it free instead.

It turns out it didn't help managing Windows at all.

The heart of the problem is that Unix is a Document-oriented OS and Windows is an API-oriented OS. In Unix, if you can edit a file and restart a process - you can do most management tasks. Therefore text processing utils like awk, sed, grep are actually management tools. When these became available on Windows - they didn't help manage anything because awk didn't work against the registry, sed didn't work against WMI, grep didn't work against Active Directory. An API-oriented OS needed an API-oriented solution.
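
To make that concrete, PowerShell's provider model lets the same generic cmdlets traverse the registry like a filesystem - a small illustration (the exact keys and values vary by machine):

    cd HKLM:\SOFTWARE\Microsoft                  # the registry is just another drive
    Get-ChildItem . | Select-Object -First 5 Name
    Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion' |
        Select-Object ProductName, CurrentBuild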

That is why we HAD to invent PowerShell. I believed that the Unix automation model was fundamentally correct: Interactive shell composing small tools together in ad hoc ways to quickly construct novel solutions to novel problems. I like to call this the " A | B | C" model.

I did a deep rethink of the "A | B | C" model and asked myself the question WHY? Why not just type A? Why didn't A do what I wanted it to do?

There is the traditional answer about a toolchest of tiny tools, but I kept chewing on it and came up with a different answer:

  A tightly binds 3 steps into 1.  It

   1) Gets a set of objects

   2) Processes those objects

   3) Outputs those objects (typically as text)
When we say A doesn't do what I want it to do, what I'm REALLY saying is that A did one of those 3 steps in a way I didn't want. So piping the results to B and C is REALLY all about taking the text and reverse engineering my way back to the original objects to do one of the steps differently.

My observation was that the pipeline should go between the 3 (smaller) steps and that the pipeline should pass objects and only output to text when you wanted text.

This object orientation is one of the great simplifiers of PowerShell. It allows us to shift our focus from the mechanics of text parsing to the semantics of what we want to achieve. We like to call this THINK, TYPE, GET.

THINK about what you want.

Type it

Get it

Imagine you wanted to get all the processes whose workingset was greater than 5mb and sort it by workingset and then format that as a table just showing the name the id and the workingset - in PowerShell, you would do that with this:

PS> Get-Process | Where workingset -ge 5mb | sort workingset | format-table name,id,workingset

Some people complain about the verbosity of PowerShell, but that is there for scripts. We have lots of aliases for interactive use. Here is an equivalent:

PS> gps|? workingset -ge 5mb|sort workingset|ft name,id,workingset

So that is the answer - a lot of the great Unix tools don't work on Windows because it is an API-oriented OS. So we had to develop an API-oriented automation model - that is PowerShell.

Jeffrey Snover [MSFT]


In the fifth paragraph: "But this is a new Microsoft". They are acknowledging that they made mistakes in the past due to which they missed out big on the server market. As popular as .NET is, it would have been a lot more popular (vs the Java ecosystem) if it had been OSS from the beginning. As primarily a Java developer, C# seems more elegant to code in.

As they say - the first step to fixing a problem is admitting that you have a problem.


the Unix shell has been quite static in recent decades; it's a sign of success. Sharing unstructured text through a pipe is good enough for most administration tasks, and people got used to the many command line options of grep/sed/ps and friends.

is this really the best of all possible worlds? I think it's good that the shells get some new competition in the form of PowerShell - maybe some new ideas will come out of it (maybe like passing s-expressions or json around the pipe - something more structured, so that we could do with fewer command line options - maybe)

doing structured data 'right' is difficult (right meaning uniform and expressive ways of handling it); historically it was easier to stick with unstructured text - but maybe there is a better way that would be easier to learn and handle.

(Well a major problem of structured data is that there are many possible ways of structuring - leading to more problems for the consumer of the data; maybe it was not adopted in system administration because there is not common/agreed upon culture of how to structure things).


Excellent news. Now I might actually learn PowerShell


It's not clear to me what this means for the PowerShell SDK (https://technet.microsoft.com/en-us/library/ff458115.aspx). Will I be able to write applications (not scripts) in, say, C++ to communicate with, say, SCVMM?


Great. I wonder what Steve Ballmer is thinking.


Ballmer's long gone, so he's probably thinking about the LA Clippers (he's the owner).


Probably "<something>, <something>, developers".

I'd love to know his reaction to and thoughts surrounding this move.


he's thinking about an nba championship.


Steve who?


He gets his share of evilness with Windows 10 wreaking havoc on users' privacy. Microsoft doesn't seem to have changed that much.


Still need something other than powershell to access a linux server from a windows machine, which seems ironic...

ssh : The term 'ssh' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again. At line:1 char:1


What would a shell have to do with the existence of a userland program? ssh isn't exactly a bash built-in either ...


An (unofficial) PowerShell AppImage is now available for anyone interested in trying out PowerShell. No installation necessary. https://bintray.com/probono/AppImages/PowerShell#files


  milek7: ~$ time powershell exit
  
  real	0m3.323s
  user	0m4.140s
  sys	0m0.507s
Seriously?


Probably a bit unfair comparison, but...:

  milek7: ~$ time bash -c exit
  
  real	0m0.009s
  user	0m0.007s
  sys	0m0.000s


Can I write tools that use PowerShell and .NET objects in non-.NET languages, like Python or Perl (using the C implementations, not a .NET Python)? Like, is there an API I can implement to build PowerShell-like "native" tools?


It's not the most efficient way to do it, but if you just want to get it done, Powershell has had e.g. ConvertTo-Json since version 3, I think? You can serialize pretty much anything at that point, and just hand that off to your target language.

If you're talking an FFI though, not so much, I think.
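
A minimal sketch of that hand-off (assuming python is on the PATH):

    Get-Process | Select-Object -First 3 Name, Id, WorkingSet |
        ConvertTo-Json |
        python -c "import json, sys; print(json.load(sys.stdin))"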


use pythonnet to interact with .NET dlls from CPython:

https://github.com/pythonnet/pythonnet


On the Java side here I tend to use groovyConsole when I need to do crazy one-off stuff like open Excel files which then require a DB query, and maybe to run regexes for little things. I use bash for running docker commands in a specific sequence.


I made a Docker container running PowerShell on CentOS7: http://github.com/gbraad/docker-mono Have fun with it...



If PowerShell on Linux takes off properly, then that will lead to one of the biggest improvements in the general computing ecosystem in a long time; roughly comparable to the adoption of DVCS, maybe.


`cat youCantBeWorseThanCmdRight.utf8.txt > openMeAfter.txt` in PowerShell and cmd, and tell me how good posh is

now go tail a 1GB file in posh and bash and tell us your results


PowerShell defaults to UTF-16 encoding, indeed. It's a design choice on a system that uses UTF-16LE as its character encoding and one from the early 2000s. UTF-8 wasn't nearly as prevalent back then. However, the text in both files is still the same.

    Get-Content -Tail 10 some1gbfile.txt
works quite well, by the way.


well, enjoy typing

    Get-Content -Encoding utf8 .\utf8.txt | Out-File -Encoding utf8 out.txt
and manually removing the BOM, for the rest of your life


It's still the same text. Not all of us have to deal with tools that can't handle Unicode or require exact byte-for-byte round trips. Yes, it bugs me too, from time to time, although less so than the extra line break at the end. In that case I have to go through [IO.File]::ReadAllText and ::WriteAllText and thus deal with full paths again. There are a few gotchas.
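
For reference, that workaround looks something like this (a sketch; the paths must be absolute because .NET doesn't track the shell's current directory):

    $text = [IO.File]::ReadAllText('/tmp/in.txt')
    [IO.File]::WriteAllText('/tmp/out.txt', $text)   # .NET writes UTF-8 without a BOM by default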

However, by now we can actually fix that issue. The encoding parameter that the file system provider adds to a bunch of cmdlets is an enum and we can add a new value there.


Parsing text is unfortunately not one of posh's strengths... but one of the beauties of posh is that you can very easily include .NET classes, so you could e.g. use the System.IO.StreamReader class https://rkeithhill.wordpress.com/2008/03/02/nothings-perfect...

On the other hand, with posh I can ingest a json file and convert it into an object (or an array of them) with ConvertFrom-Json. Once I've got the object, I can query/navigate it instead of parsing it. I'm not sure it is that easy with bash, but then again they are not mutually exclusive, so it's great to have both now on Linux and on Windows.
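
A minimal sketch of that (config.json and its fields are hypothetical):

    $cfg = Get-Content -Raw config.json | ConvertFrom-Json
    $cfg.servers | Where-Object { $_.port -gt 1024 } | Select-Object name, port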


I am still trying to understand if there are any tangible benefits to learning PowerShell over bash/zsh.


I wouldn't think of it as a replacement for bash/zsh or other shells. Those are meant to control the OS and run programs, whereas PowerShell is an environment to programmatically interact with APIs. They can both fill similar roles, but they're not the same.


Piggybacking on this thread, I wonder if there's a robust PowerShell profiler out there.


I bought a new laptop recently, and used Windows for a little while (I didn't have time or a USB drive handy for installing Linux for a couple of weeks). I was pleasantly surprised at PowerShell. Not because of all the stuff people normally like about PowerShell; all the PS things are still confusing as hell, to me. But, because MS has embraced compromise and meeting developers where they are in ways that I have never seen from MS until recently.

PowerShell now accepts most of the common UNIX shell commands (this feature could have come at any point in the past decade or so, as that's how long it has been since I last tried PowerShell, when it was new). The changes that resulted from that change of mindset have made it so I could actually stay in PowerShell long enough to learn other things. Having to re-train my muscle memory away from using stuff like ls, cp, cd, was just a bridge too far, and so in the past I'd revert to cmd (as truly awful as it is) until I could get bash installed from Cygwin.

They seem to have, somewhere along the way (again, it may have been a decade ago, for all I know) given PowerShell a reasonable UNIX shell facade. Sure, it's not quite shell, but bash/POSIX shell isn't flawless; I'm OK with changes, as long as I don't have to think every time I want to do basic stuff that I've been doing for 20+ years with a POSIX or bash shell. PowerShell has some Long-Ass-Command-Names, and I still find them difficult to stomach, but at least all the common sh equivalents (except which, for some reason) already have aliases.

Anyway, the change from a company that tells developers how to work (and for whom the command line takes a back seat to GUI tools and most developers live in the Visual Studio environment, I guess) to one that looks at what's working in all sorts of developer communities and borrows liberally from the best of them, means that having to work on Windows is no longer the worst possible thing I can imagine (I'm even considering porting our products to Windows, which is a suggestion I have laughed at numerous times over the years).

PowerShell, as it is today, reflects that change, and open sourcing so much stuff lately seems to prove it is sincere rather than merely aping the most visible traits of these OSS communities without actually embracing the single most important facet of them.

I still have an old OSS nerds suspicion of Microsoft, and I will probably never choose Windows as my favorite desktop OS (and I can't even imagine it for server usage...how would you even do that without a decent system package manager?), but my most recent experience with Windows was almost entirely positive. Interoperability has gotten pretty good. SSH is a normal thing, rather than a complicated setup process.

It's interesting, to me, that Apple seems to have been retreating on these fronts. They seem to be less interested in participating in OSS than they were in the early days of Mac OS X. That may be my own ignorance of the platform, but I don't see Apple making these kinds of overtures to Linux/UNIX developers on such a regular basis. Maybe they figure the community is taking care of it for them, because so many devs use Apple laptops even while deploying to Linux servers, so the ecosystem is still pretty strong even without Apple doing much of the work.


"and I can't even imagine it for server usage...how would you even do that without a decent system package manager?"

I'd be pretty hesitant myself to use it as a web-facing server. But if you're running an enterprise network, the reality is almost everything you need on your server can be added and removed from the "Add Roles and Features" wizard. There's not a whole ton of outside packages you would want to add.

If your server is running third party software, you should probably only run that one piece of software on that particular server, and that software should install everything it needs to run on its own.


PowerShell released in 2016 still isn't as powerful as Stephen R. Bourne's shell released in 1977.


Is this actually useful?

I always thought powershell was largely just for interacting with windows components, like installing an msi or talking to active directory.

What would you actually practically use this for on a linux box?


The ramp-up period is much shorter with PowerShell.

It is truly an interactive ("online") environment. Almost everything is typed and through introspection the shell can offer very rich auto-completion hints, without needing tedious scripts such as those used by bash/zsh.

Quick example:

get-<Tab> cycles through all get-[whatever] commands, as you'd expect.

get-certificate <Tab> cycles through all parameters available for get-certificate.

get-certificate -ErrorAction <Tab> cycles through the list of valid values for ErrorAction (there's like 6 of them: stop, suspend, etc.)

It is also highly standardized: there's a set of allowed "verbs", so it's much easier to find something since everything is in a certain category (https://msdn.microsoft.com/en-us/library/ms714428(v=vs.85).a...)

It has its bad/ugly bits, but the principles are quite solid. In many ways they are what I'd expect if Ritchie and Thompson had designed Unix in a modern environment (i.e. not an environment where the output of your commands would be printed :) ).


You can also use things like Get-Command (with options for -Verb, -Noun, -Module filtering) or Get-Member to inspect types/methods/properties. Very helpful when learning a new area.
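
For instance, something along these lines:

    Get-Command -Verb Get -Noun *Item*        # find cmdlets by verb/noun
    Get-Command -Module Microsoft.PowerShell.Management   # or list a module's cmdlets
    Get-Process | Get-Member -MemberType Property         # inspect what a cmdlet emits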


Scripting. While I typically use Bash on Windows to use grep to interactively test my webserver, I'm much happier using PowerShell ISE (which I don't know whether it was ported [1]) to write standard tests with multiple parts, particularly ones that require e.g. checking cookies on the response.

1. Looks like they ported a service that does the same. https://github.com/powershell/powershelleditorservices
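
The cookie checks look something like this (a sketch; the URL is hypothetical):

    $r = Invoke-WebRequest -Uri 'http://localhost:8080/login' -SessionVariable s
    $s.Cookies.GetCookies('http://localhost:8080') | Format-Table Name, Value
    $r.StatusCode   # plus whatever other assertions the test needs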


Can you be more specific?

Powershell instead of using grep? Or as a testing framework? Or an embedded interactive REPL?

I mean, just broadly, with basically zero knowledge of powershell, what I see is a custom scripting language with a set of common 'macro' commandlets that do little tasks.

...but, I don't see how that's useful?

How do you import a package to upload a file to S3?

I can't just install https://aws.amazon.com/powershell/ right? ...because that's not ported? So...? etc. etc. for other practical tasks.

Not trolling; just genuinely curious what actual tasks are actually supported that you could actually use this for, beyond trivial stuff like 'check if file exists'?


We are currently performing final validation on our new AWSPowerShell.NetCore module and hope to publish it to the PowerShell Gallery in the next couple of days.

Blog post to the announcement: http://blogs.aws.amazon.com/net/post/TxTUNCCDVSG05F/Introduc....


Since it's only just come out, not all 3rd-party scripts are going to be available straight away.

But there are standard HTTP POST/GET/PUT commands, so you can write your own.
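
A minimal sketch with the built-in web cmdlets (the URL and payload are placeholders):

    $body = @{ name = 'test' } | ConvertTo-Json
    Invoke-RestMethod -Uri 'https://example.com/api/items' -Method Post `
        -Body $body -ContentType 'application/json'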


Yes, and there's an extension to use it from VS Code[1]. It works pretty well (autocomplete, F8 to run a script, debugging, go-to-definition, etc.). The fact that it's a language server means it can be used from other editors too (although I don't know if any other extensions exist).

[1] https://marketplace.visualstudio.com/items?itemName=ms-vscod...


I shouldn't comment before coffee - not grep, curl.


symbolic soft link yet?


People laugh at me when I tell them I switched "back" to Windows after decades of Unix and Mac programming. But it's a really good, productive platform. Microsoft really seems to be heading in the right direction now.

I hope .NET starts taking off on other platforms, too, because it really is a much better system than Brand "J".


If only they resolve the privacy leakage concerns in a reasonable manner.


In the Windows 10 thread someone posted this hilarious comic: http://www.bonkersworld.net/organizational-charts/

We just have to hope that "our" guys (OSSers) have bigger guns than "those" guys (privacy destroyers).


Considering most OSS devs slam analytics onto literally every website, there really isn't much difference.

That's what Windows is sending: analytics information. Yet I can guarantee every major website developed by open-source-using devs is sending the same information for their apps. And no one complains.


Look at the comic. I'm talking about divisions inside Microsoft. I doubt it very much that Scott Hanselman (for example), who is in the camp that pushes Microsoft to open up, is the kind of person that designed the privacy controls (or their lack) in Windows 10.


Amen. Having used Windows Server and various Unixes, I much prefer Windows.


If you meant Java, say Java please. Brand "J"? To me, that would mean J/K/APL/APL2 (MATLAB, Nial, S, R, and other array systems). But I guess Brand "J" is cute.


I don't want to be that smug ITT guy but it has to be said:

Bash != Readline and Bash != coreutils and Bash != (what gives your machine the ability to do ssh)

Bash is the stupid if/for/while syntax

You have Readline in powershell and you can use coreutils binaries in Powershell. No problems there.


[flagged]


Just like Google, Apple, IBM, etc. etc. etc.

I'll join in on the Microsoft bashing club when every other major corporation gets bashed the same way by devs. The day that will happen is when socialism makes more sense to people than capitalism, and it becomes a pervasive idea, but sadly, I think that day won't be coming around any time soon.

Until then, let's enjoy the fruits of capitalism.


I've been enjoying the fruits of Bell Labs and UC Berkeley.

I'm not a "dev" or even a Linux or Bash user. I like the simplest Bourne shells the best. Plan 9's rc is also good.

I've never enjoyed using Microsoft Windows. (But if they gave away the source for NT kernel I might find use for it, after a lot of work trimming it down to size.)

Maybe the size and power of UNIX has spoiled me.

MS obsessively focuses on their competition to the detriment of users and software quality.


> Ballmer said Linux is cancer

He more or less said that the GPL is cancer which is pretty accurate. The entire quote is

"Linux is not in the public domain. Linux is a cancer that attaches itself in an intellectual property sense to everything it touches. That's the way that the license works"

Not very nice but not entirely wrong either.


They see all things that flourish outside their control as competition.


[flagged]


We detached this subthread from https://news.ycombinator.com/item?id=12313389 and marked it off-topic. Baseless accusations of astroturfing are not allowed on HN. If you believe this is happening and you have evidence, please email us hn@ycombinator.com and we will happily investigate.


Powershell is a horrible abortion, what the hell do people see in it? I mean that on a language level; I can't speak for Windows-specific features. I once wrote a simple script to copy a folder recursively to a file share, rename some stuff, and edit some files. Took me 4-5 hours. I then did it with Julia in 30 minutes, and I had never done anything in either language before that. It's insane the kind of weird language crap you have to deal with, the weird options that some functions need, and worst of all: the functions have weird problems. I don't remember the specifics, but I'm pretty sure the folder-copy function had no option to actually copy all of them recursively, so I had to work around that. Then I had to work around some more specifics around files/folders. In a normal language you would just call something like "copyFolder(srcPath, dstPath, recursive: true)". Nope.


I wouldn't go so far as to call it a "horrible abortion" - but it can be painful to use at times. I manage several Azure environments with PowerShell and not a day goes by that I don't wish there was something better. Especially on those days when I have to dig into Azure's JSON templates.

There is an SDK for Python, but I haven't had an opportunity to try it. If anyone has used it I would love to hear about your experience.


The Linux ecosystem has a huge number of different tools and scripts for most situations you can imagine. I've been working with Linux for more than 14 years, so I can say that the text-processing idea is the best way. It's the pure KISS principle. If you need just to see something - type a command; for complex operations - awk/perl/python. PowerShell looks ugly because it's something between a programming language and a shell. It's not KISS. It's an all-in-one tool.


There are the Azure REST APIs that you can curl. Or the Azure-CLI on Linux and Windows.

That said, I've noticed that PowerShell seems to get priority with Azure support before the azure-cli client, so there are a few gaps here and there. I've been told that the Azure REST APIs are all in step with PowerShell support, but they sometimes go in undocumented, unfortunately.


To be honest, I prefer PS to the Azure CLI but I have had to resort to the CLI on a couple of occasions. If I remember correctly it was to reset credentials on a VM (which wasn't possible with PS at the time). I guess the main problem with the CLI is that it's not a shell or a general purpose programming language.

The Azure REST API requires a ton of deeply nested JSON data structures. Take a look at this [1] sample. I just can't bring myself to work with that every day.

I really need to buckle down and try that python API.

[1] https://msdn.microsoft.com/en-us/library/azure/mt163591.aspx


Azure CLI, Azure PowerShell and even the portal are nothing more than API clients. You can capture traffic while working with any of them and see it for yourself.

Also, for PS and CLI the code is in GitHub :)


oh man I agree with you so much. You can't do anything data intensive with powershell, it is too slow.


This might come across as a little rude, but I really wish Windows developers stayed away from Unix. I moved away from Windows a long time ago because I didn't want anything to do with Windows or Microsoft, and I went to great lengths to avoid their software and their proprietary file formats. Now we had command-line interfaces, desktop environments, running on top of an incredible, powerful Unix-like OS. It was free software, and it was Microsoft-free. Please, don't ruin it.


"Um excuse me, but I really like dogs. However, I don't like you, so I don't want you making movies about dogs. Dogs are my thing. Stay away from the concept of dogs."


I have to agree. I can only feel like this is pollution or another EEE attempt.


Tell me how well EEE will work with an optional application released on GitHub under an MIT license.


GitHub is just the distribution here, and software licenses are subject to change. That's the embrace portion.

Hypothetically, Powershell becomes widely used on Unix, and Microsoft releases Powershell 2, which extends Powershell. This version only runs on Windows and contains killer features. These new features can't be implemented from the GitHub source, either technically or because of intellectual property. That's the extend step; the extinguish comes when the open version withers and users are pulled back to Windows.


So then you switch back to another shell?


Yes, or you buy Windows. That's the purpose of EEE.

That's similar to saying, use a different word processor. Just in this case Powershell is less used than Microsoft Word.


The purpose of EEE is to extinguish a technology. Your scenario doesn't hold up because PowerShell is MIT licensed. Even if Microsoft did release a new version that was Windows only (which, given their use case, makes no sense), there's nothing stopping the community from forking the repo and adding those features, or porting those features to another technology.

People will continue to sound the EEE alarm every time Microsoft does something that people like. The problem is that waging an EEE campaign with MIT-licensed technologies on GitHub is extremely ineffective. Also, it's pretty clear that while Windows is still a big part of Microsoft, they are trying to move beyond the desktop. The rise of macOS and the proliferation of mobile devices with iOS and Android have caused the company to shift its strategy to the cloud, namely Azure. Even in that space, they can't wage an EEE campaign, because with competitors like Amazon and Google, there's no way they can lock you into Azure.

Times have changed, and so has the company's strategy. Whether or not the company's culture has changed is not something I am concerned with. They have enough watchers that call them out, like with the Windows 10 update debacle and the collection of telemetry data. I am more interested in their actions with regards to their technology, because it's my opinion that have some of the best people working on languages and tools.


Cool. https://powershell.org/ needs some work though. It screams WordPress 2008 website, and I'm not given any idea how to install or use it, or what it's good for, just by looking at the front page.


It is not a Microsoft website.



This could really be used as the default reply for a lot of comments here that don't get why people are still a bit antsy about MS apparently doing things for the good of others besides MS. But it's probably an argument only worth having in one thread. I can't really add much to the argument, other than that I think my feeling matches a lot of others in the OSS world: ever since Nadella took the helm I've been cautiously optimistic that we're not just part of a very long embrace+extend cycle that will end with a crushing extinguish. Additionally, as the years go by without an attempt at extinguishing, but with even more embracing+extending instead, and with MS having lost so much of its former power and weight in the industry as a whole, I think even an extinguish attempt wouldn't be all that successful. It's not enough for me to actually use these offerings, but I'm not going to go on a rant that no one should use them or that distros should avoid packaging them. (e.g. Stallman once had a Mono+Banshee rant that was probably justified at the time, but it's hard to argue the position now.)



