Curl author asks Microsoft to remove 'curl' and 'wget' aliases from PowerShell (github.com/powershell)
475 points by laurent123456 on Aug 19, 2016 | 341 comments



I kind of feel sorry for the MS chaps here. They admit it's wrong and say they will try to fix it, but that it'll take time because of bureaucracy. But then they get attacked in the thread because they are Microsoft. It's sad to see when they are genuinely trying.


Why do people call structured governance a bureaucracy? About a month ago there was a well received post on GTK's rotten foundations that boils down to several core points, one of them being: if you keep breaking stuff all the time, there's no point in investing in your API.

And here we are with the initial OSS release of PS for *nixes. It's broken from the (valid) POV of its target audience, and the authors want to fix it in an orderly fashion. But that's not enough. People at MS are "stupid or malicious", as some commenters put it, and it should be changed ASAP the way the community wants, or else.

Or else what? How many of the people arguing for immediate change are actual and/or potential consumers of PS? How many of the people thinking it's a simple change (just do it) have actually maintained a large piece of software deployed to thousands of users? I bet that with the exception of Daniel, who opened the bug, none.

I don't envy the folks maintaining PS. I used to share an apartment with a dude who can't get over US v. MS to this day, and anything even tangentially related to MS (or Gates) is definitely, without any doubt, evil and devious, and is some sort of extortion or at the very least an embrace-extend-extinguish strategy. No matter that US v. MS was 15 years ago, when he was 10. This attitude is pervasive and turbo-counterproductive in cases like this one.


Why do people call structured governance a bureaucracy?

Being pedantic, but that's what a bureaucracy is. https://en.wikipedia.org/wiki/Bureaucracy

I'm sorry for being pedantic....


People are surprisingly bad at asking questions, which is why the help desk should always first ask "What are you trying to do?"

So why does "bureaucracy" have such a negative connotation? Like a lot of powerful tools, it is very easy to abuse. Unless you are very careful in design, you are likely to create an amplification attack of work. Most managers aren't careful, so few people have a positive experience.


> No matter that US v MS was 15 years ago when he was 10.

US v MS is an ongoing thing. It's never been over... So MS is releasing OSS stuff, but making it hard to run Linux alongside their OS on the same hardware. It's like a guy giving you a hug while stabbing you in the back.


It's not really like a guy at all. MS is a large organization, and can't be expected to act like its decisions are as simple as one person's.


Ah, so if a large corporation stabs me in the back for completely understandable reasons that are hard to avoid in the real world ... I'm still stabbed in the back.

I don't think the venom here is justified at all, but I also don't think it would be justified if it was one malicious person who did it.

But the problem and the poor decision need a ruthless critique; the reasons it ended up this way are a side problem that most people shouldn't care about.


Decisions like the one I'm referring to are made at the very top level, so the "MS is a large organization, and can't be expected to act like its decisions are as simple as one person's" line does not hold water.


Not really buying this argument.

This isn't some obscure situation tucked into the back corners of some IRC chat. Open sourcing and reaching out to the community is, apparently, the company's direction going forward.

As such, they could/should prioritize these situations as a way of showing real commitment.

Dragging their heels is going to make it feel like nothing but marketing bullshit.

It might also signal to MS rank and file to cut their territorial bullshit (which MS has a storied history of).

Satya could get this fixed today if he said "This is the future. Get it done."


> Satya could get this fixed today if he said "This is the future. Get it done."

Or, as Bezos would say, "?"

I think the takeaway from this is they should fix things like this before they decide to open source.


One of the reasons I haven't open sourced some stuff is that I don't want to deal with parts of the open source community. It's not even about having a thin skin (which I don't); I'm simply not interested in dealing with them.


99% of open source software floating around on the internet has no "community", or any expectation of one.


You mean horrific levels of entitlement?


Microsoft is a convicted monopolist in the EU. That's black and white criminal (and immoral) behavior.

The PS team seems like it's doing its best, but you lose some degree of presumption of goodwill when you work for a criminal entity.

Ultimately, I don't think either side is right, but I also don't think either side is wrong.

MS made a choice not to cultivate goodwill in the OS community for years, and they profited from it. It's reasonable for that community to make them earn it back, and this is what it looks like (unfortunately for the individuals on the PS group).


> Microsoft is a convicted monopolist in the EU.

As an aside, what was their punishment?


the main sanction was having to un-bundle IE from Windows.


Sorry, but this is totally irrelevant. Any large organisation will have legal issues.

Google: https://www.theguardian.com/technology/2010/feb/24/google-vi...

Uber.. hasn't been quiet in the courts.


My point is simply that Microsoft has to convince the community of their goodwill, and it will take time.

I don't think the existence of other companies with (maybe even bigger) PR problems has any bearing. So what? You want to be equally critical of Uber? OK. Let's make them prove their goodwill too.

Furthermore, Microsoft is unique in this regard, because at the time of their conviction, they were fighting the browser wars with two open communities: FF and chrome.


You don't seem to understand the definitions of "convicted" and "community".


Would you like to provide a definition? I don't understand what you are saying.


> How many of the people arguing for immediate change are an actual and/or potential consumers of PS?

Right. I would think the _author_ of curl should probably have a bit of a say if someone adds a curl command to a shell but it doesn't work like curl.

He's going to be the one receiving the angry tweets and the GitHub issues.

> Or else what?

People will see how ridiculous this practice of adding aliases-but-no-aliases to existing tools is, and perhaps decide not to use PS for *nixes. Is that bad? Good? I don't know. I am guessing humanity will still go on.

> No matter that US v MS was 15 years ago when he was 10. This attitude is pervasive and turbo-counterproductive in cases like this one.

Yet decisions made at that point still have repercussions. Ship a stupid API -- reap pain for years to come. Don't think that's major news here.

This is not the first time; Microsoft has done stuff like this before. Any web developers remember IE8 and its incompatibilities with everything else out there? WebRTC was discussing and working on an API, and Microsoft showed up at the last minute and said "Yeah, we've got a new, completely different proposal". It's shit like that. Some people are more upset about stuff like that than others.


Each time Microsoft releases free software, there has been discussion about whether they're really participating in a community. When Jason closes a PR, saying that it needs an RFC, and oh, by the way, you aren't allowed to start an RFC, I'm skeptical. When Jeffrey provides a work around, opens an RFC, and is generally diplomatic, I'm more hopeful.

One thing I find interesting is Frank's claim that "You have then ignored bug reports about this for years." It seems like users have a better chance of getting problems addressed on Github than the previous feedback mechanism.

Edit: after reading Daniel's blog post, which laurent123456 mentioned in another comment, this has indeed been a longstanding nuisance for curl users.


> One thing I find interesting is Frank's claim that "You have then ignored bug reports about this for years." It seems like users have a better chance of getting problems addressed on Github than the previous feedback mechanism.

Maybe, but I see that more as just that the "new MS" is trying to become more open and integrated with the OS community. The change being in the company culture and approach rather than the reporting platform. Ignoring long-standing bugs on many products was just the modus operandi for the old MS.


Yeah. The whole "new MS" thing is nice.

Now if only UWP will either open itself or die, I'll finally say that old MS is gone for good.


W00t, the new Microsoft is totally not like that old Microsoft that used to fund SCO's lawsuit and spread FUD about competing open-source software and patent violations.

https://www.eff.org/deeplinks/2016/08/windows-10-microsoft-b...

http://www.zdnet.com/article/310-microsoft-patents-used-in-a...

http://techrights.org/2015/10/01/microsoft-loves-linux-brain...

They indeed changed. Back in those days they weren't violating their customers' privacy.


What's your point? Even Ubuntu sold what you typed in Unity Search to Amazon.

By the way, the thread to discuss Windows 10's privacy issues is here: https://news.ycombinator.com/item?id=12305598


1. Ubuntu doing it doesn't excuse MS. They're completely unrelated companies, and Ubuntu doesn't represent all of the FOSS OS world. 2. Ubuntu actually listened and stopped doing it.


If all of your friends jumped off a bridge, would you do it too?


> If all of your friends jumped off a bridge, would you do it too?

Probably: https://xkcd.com/1170/


I love how the EFF singles out Windows 10 but doesn't mention that iOS and stock Android do pretty much the same thing.

Your Techrights article points out IV, but fails to mention that they raised money from Google. They also like to equate Linux, Android, and ChromeOS. That article is just poor. Microsoft has done, and continues to do, some shitty things. Windows 10 updates and privacy are a debacle. But their releases from a tech standpoint are a welcome change.


Yeah, this is the kind of thing I was talking about.

OTOH, you're seriously linking techrights? They're like the Free Software version of Fox News for tech.


Very true! When those decisions were made, PowerShell was never planned to run on anything that ran those commands natively. Now that they have changed the rules, this should be expected as they move to more neutral ground. I would be the first to call M$ on being jerks, but in this case give them a break; the ship just did a 180. If you set up your furniture a certain way, I guess you will have to set it up again... deal with it.


wget has been on windows for 15 years: http://web.archive.org/web/*/http://gnuwin32.sourceforge.net...

PowerShell first appeared 9 years ago.

Edit: curl has been on windows for at least 14 years according to this: https://curl.haxx.se/mail/archive-2002-06/0114.html

Edit 2: 17 years: http://ftp.sunet.se/mirror/archive/ftp.sunet.se/pub/www/util...


They have been on there, but not as standard Windows components. Perhaps they had lofty goals to replace the tools with a "better" version and changed direction.


Open source projects should really register trademarks and sue companies like Microsoft. It really doesn't pay to be the good guy. After all, if anybody used the word Windows for an alternative OS, I'm pretty sure Microsoft wouldn't be happy.


You shouldn't be able to trademark function names and executable file names. Therein lies madness.


OTOH, if I (or a big entity like, say, Apple) released something called Java which didn't really behave like Java™, that would also result in a bit of madness.

I think trademarks are good; and when compatible alternatives exist, of say, a product named 'Foo™', they should be able to market themselves as 'Foo™ compatible' (and obviously not as 'Foo™').



So assuming they have a similar problem where 'ls' is aliased to 'dir'. Who should have trademarked 'ls'?


Replacing something with "better" version is step #2 of EEE.


Replacing something with a better version is also the basis of the success of life.

It's all in how you approach things.


If you are going to argue that Microsoft is out on a new EEE trip you are going to have a hard time now that it releases everything as Open Source.


There have been EEE accusations against e.g. systemd, so it's not clear that other OSS couldn't get the same. Actually now that I think about it I don't see why EEE would depend on unreleased source at all.


They aren't, but the only reason for these aliases in the first place is that type of attitude. curl is a commonly used utility, so let's replace the command with something completely different.


> now that it releases everything as Open Source.

Windows?

Office?

SQL Server?

Seems to me, that Microsoft now releases "everything" as open source only by the most ridiculously limited definition of "everything".


>>PowerShell was never planned to be run on anything that ran those cmds natively.

Really? I thought that Windows versions (native versions, not even Cygwin) of wget and curl have existed for more than a decade. I am not very good at googling old stuff, but I remember using curl natively in 2004 or so.


"Native" for Windows means comes out of the box or as an official release from Microsoft, according to many enterprise security teams. PowerShell was not allowed on stuff here until Microsoft started including it with the OS, even though Microsoft fully supported it on downlevel OSes.


On the other hand, curl.exe has been available for five years.


PowerShell was originally designed so that you would mostly just run cmdlets with it, so colliding with an actual program wasn't considered an issue. And these aliases have been available since 2007.

They even admitted it themselves that it was stupid and needs to be fixed, though, so just wait I guess?

edit: As a separate note I really would like to have versions of curl and wget that actually have all the params working but return the correct PowerShell objects (so the web request result thingie) instead of a String object. Or maybe have it do that with a flag or something.


I get the feeling there are a lot of commands that are going to need to be customized for actual typed output. It would be an interesting project to do a usr-bin for PowerShell.


curl on Windows existed many years before PowerShell.


Microsoft is being hated because there's a double standard here.

If Linux tools used Microsoft utility names, someone would've been sued.

Microsoft used wget and curl `aliases` (while native versions were there), and not only did they not get sued, they are causing problems for the original authors.

Of course no hate for the engineers doing this. They are just doing their job and this is a tough compatibility issue they are facing now.


> If Linux tools used Microsoft utility names, someone would've been sued.

In a universe where Wine exists, you think someone would be sued if they released an app with a default alias mapping dir to ls?


Wine implements a set of API's.

curl and wget could be considered trademarks.


wine also ships a regedit.exe, a notepad.exe, ... you can construe this as implementing an API (it's usually the motivation) but it gets mushy. PowerShell has a curl also to "implement an interface", however loosely.


Agreed. That thread is a mixture of some useful and constructive dialogue and others just "piling on", offering problems without solutions. That type of commenter would frustrate me to no end.


That's basically what happens in open source, once you get enough people in and around your project.


Best unintended consequence of GitHub adding comment "reactions" is the redditification of discussions (thumbs up/down brigading). Lovely.


If only it was just thumbs up and down, but it's also smiley faces, fireworks, hearts and probably many more. It feels like reading a conversation between teenagers on MSN Messenger.


This is such an old fud thing to say. As if using an emoji wasn't a mainstream thing for the vast majority of people to do.

Personally I like the inflections that emoji bring; plus, they make statements that might otherwise be taken as hostile read as either well intentioned or just good old fashioned ribbing.


Plus, these reactions allow folks to add their sentiment/support to a comment without adding an extra 'I agree' to the comment chain; it actually helps maintain a bit of focus.


In the semi-professional context of a PR discussion, all these big emoticons are a bit out of place.


You're trying to hold back the tide. If anything PR seems to be driving this sort of thing now.


Are you guys talking about Public Relations or Pull Requests?


Yes.


>As if using an emoji wasn't a mainstream thing for the vast majority of people to do.

It shouldn't be. Fuck emoji.


Fortunately the "reactions" are purely metadata, and do not affect the visibility of the comments. Who cares if there are a million thumbs down?


Unlike Reddit, no action is taken upon a certain number of "reactions". Your comment isn't censored after receiving an arbitrary number of :-1:'s.


They added the buttons so brigades would use those instead of commenting.


Yeah, I honestly think it's an improvement.

+1

;)


GitHub never would have had that problem in the first place if :+1: were not so easy to do. There's no reason to make emoji part of GitHub flavored markdown or for it to be an autocomplete.


People have used plaintext "+1"'s for ages, I wasn't even aware there was markup for it now.


There is no longer a reason to make emoji part of the markdown, as the vast majority of devices can now do it natively. When GitHub flavored markdown was first created, emoji were most certainly not at all widespread natively.


Not quite; it's difficult to avoid using votes as a heuristic for post quality. A reader's first impression of a heavily downvoted comment is negative. It encourages groupthink: downvoted comments are unacceptable, and they serve as examples to others who might step out of line. Also, there is less pause when downvoting a comment if it is already well in the negative. When you think about it, the whole downvoting thing is antisocial. It's bizarre that almost every popular discussion platform these days allows users to passively shit on each other as a core feature.


In a system that doesn't rank, sort, hide, distinguish, or otherwise do anything other than increment a counter next to a post... huh?

Groupthink: You mean a community that downvotes a comment about X suddenly has a positive notion about X because the votes are hidden?

An example to others: I see this as a good thing - it lets an outsider get a good feel for what community members like. A number is much easier to parse than thousands of comments.

"Shitting on others" is a very melodramatic way of "people saying they don't like a comment with numbers".


Open a request to github to enable per repository/org disabling of voting.


That's not exactly what they say and I agree with them that it would kind of suck if you wrote a bunch of scripts using the alias and MS broke them overnight.


Agreed. I also think people are not appreciating that even if you agree a part of your interface is bad, it's not a given that you should change it. There's a trade-off between improving your interface, and making breaking changes for your users. Evaluating that trade-off requires discussion and investigation.


Given that the PowerShell that ships with Windows 7 doesn't have these aliases, whereas the one that ships with Windows 10 does, it seems that Microsoft was already happy to make the diametrically opposite breaking change without the corresponding discussion.

It seems that the users of faux-curl-on-PowerShell are now getting more consideration than the users of real curl under Windows got a year or three ago, when Windows 8 or 10 made this change.

It's maybe a good sign. Opening the source and putting it on github means that curl and wget devs and users now feel empowered to ask Microsoft to change stuff in a way that they didn't before.


Yeah, I don't understand what that gevaerts guy is trying to accomplish.


I think he's trying to accomplish being an asshole, and he's succeeding. The whole thread is terribly saddening. It's a breaking change; they have a process for making breaking changes, and the creator of PowerShell specifically says, "We need to fix this, let's get that process rolling. If you're on Windows here's how to work around it."

Then a bunch of jerks pile on about how it was a bad decision and all the Microsoft developers should feel bad.

The whole PR went from something pretty awesome to a great example of why we can't have nice things.


I sometimes have the feeling that a lot of developers don't really know what stability in APIs and breaking changes actually are. They themselves don't really care as long as the change is in their interest, but they can't (or won't) see the impact it has on existing code.


It's okay: this is also an opportunity to show that they can play well with others. Done correctly they can build trust within a community that has traditionally disliked them.


That's a "Standing up to the bully adds character" argument. I prefer to foster a culture that didn't have the bullies.


Microsoft standing up to a bully..? My god, what is this world I'm in.


That's not Bill Gates they are messaging, it's developers like you and I just doing their job.


Yeahhhh, because developers like you and me are assigned the IP... Nope. We take home a paycheck but relinquish the IP to MS. What you're suggesting is tantamount to Walmart employees taking offense at Kmart for copying "everyday low prices". Sure, a real live human being came up with that tagline, but do you think Kmart or Walmart sends them checks every month for the benefit derived from the now-famous line? The open source world is different. In many cases individuals do maintain ownership of their IP. And when they don't, often it is held by a trustee for the benefit of the community.


Also, that whole bit was supposed to be a reply to a different comment, but this app I'm using sucks so I can't correct it. My bad.


To me it seems they only try to fix it because they got caught (and now publicly shamed).


Agreed. The first post from a PS team member was essentially "no, we don't want to do that, if you have a problem with it open an RFC" ... only later when it was plainly obvious that they were wrong did they agree to do the RFC themselves


How is it wrong to put an alias when wget and curl are not standard Windows applications? Sure a lot of people use them but it's easy enough to create a new alias.

Don't people have better things to do than bitch about an alias??

Perhaps MS had goals at their inception to make the cmdlets behave more like the true applications and then changed direction.


I think the outrage is over the fact that MS broke any use of the commands wget or curl, and also that it doesn't even provide a compatible alternative to what they were aliasing, a la busybox.


That's great and all, but their alias does not support all the switches of wget/curl. I have both installed as compiled Windows binaries on my machine and was pulling my hair out wondering what the hell was going on.


> How is it wrong to put an alias when wget and curl are not standard Windows applications?

Isn't that a trademark violation when a Big Corp is doing it officially in their software?


Are either curl or wget trademarked?


Yes, because all that is needed for something to be "trademarked" is for someone to use the term "in commerce" in a consistent fashion and to build an audience of expectation for what that word means in a specific region and context of discourse. When I see "curl", I know what that is: I know what it is supposed to be, and so it is trademarked in most reasonable countries. I assume your question is "did they register their trademark?", but one does not need to register one's trademark to receive the protections of a trademark, as that would make the law somewhat useless: the goal of trademark law is in many ways to protect users who are being tricked, not trademark owners.


cURL: https://curl.haxx.se/legal/thename.html, so it seems he has not obtained a trademark on this in the US and lives in Sweden, where trademarks aren't a legal thing.

wget: https://www.gnu.org/prep/standards/html_node/Trademarks.html

wget is a GNU project (now at least, originally?). They seem to (makes sense) have a thing against acknowledging trademarks period. So I'm not sure it'd make sense for them to even try to enforce a trademark should someone make a product named (or alias their product to the name) wget.

Trademarks aren't a universal thing, not every country or culture will have an equivalent concept, legal or otherwise. And if products aren't trademarked then claiming trademark abuse is inaccurate.


"Don't people have better things to do than bitch about an alias??"

Now that's a cordial and productive tone.


They're talking about changing it on the Windows version of Powershell too; I don't think it's something to be "ashamed" of, I thought it was a huge convenience - I knew that wget wasn't literally wget, but when I needed to fetch a webpage I knew the command to use without looking up Invoke-WebRequest or whatever it is and without having to go out and download the real wget.

I think taking the alias out is actually disappointing and a very slight change for the worse on Windows.


Let's not forget MS's real motive here, which is, indirectly, selling more of their software and bringing in more cash.


Everyone has a family to feed right? There is no shame in trying to make money.

Satya was super clear on this point - he told us to get out of our offices and go talk to customers and find out what they needed to be successful. "Don't worry about the money - if you are making customers successful, we have smart people that can figure out how to make money".

To all the engineers in our company - that was music to our ears!

I understand your skepticism but there is a new Microsoft at work here.

Jeffrey Snover [MSFT]


Sure thing, I get that.

That's nice to hear from satya btw.

It was just that the sentiment of now "feeling bad for MS" is also a bridge too far.

That said, a healthy, fit, "new" MS would be healthy for the whole ecosystem.


Apparently a lot of people have this "MS hates Open Source" idea. You'll get a totally different experience when you follow the development of VSCode; the team is just amazing. I don't know them, and I don't like or hate MS, but recent changes at MS have made them embrace FOSS. Gone are the days when they pushed hard for the Windows platform; now everyone runs multiple OSes, says the blog entry announcing PS.


Going to work for MS in the first place, you kind of take that risk.


Wow, people are too lazy to simply create a new alias to the curl/wget that they want... What's the world coming to?


The problem isn't about laziness, it is about defaults.

If Microsoft removed them, installing an update to PS could literally break existing scripts. Sure, you can trivially fix it by re-creating the aliases, but only after you realise that an update broke it.


Exactly, which is why they can't just switch it back. But people are on the attack about something so trivial. Anyone who needs "the de-facto curl or wget" can create a custom alias or re-assign the old one. It's a minor inconvenience, whereas if Microsoft were to patch it and break existing systems, that would be more worthy of criticism.
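
Something like this in your profile does it (the paths are just examples, point them at wherever your real binaries live):

  # Drop the built-in aliases for this session/profile
  Remove-Item Alias:curl -Force -ErrorAction SilentlyContinue
  Remove-Item Alias:wget -Force -ErrorAction SilentlyContinue
  # Re-point the names at the real binaries instead
  Set-Alias curl 'C:\tools\curl\curl.exe'
  Set-Alias wget 'C:\tools\wget\wget.exe'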


One imagines a sort of "boilerplate" at the top of every script, patiently fixing various P$ mistakes before the script actually gets to work. I've seen similar things in other challenged environments.


All the comments make me incredibly sad. Established projects need a sound backwards compatibility policy; this is particularly true for shells and programming languages, so as not to break existing scripts.

People make mistakes, and it's been agreed that introducing these aliases years ago was a mistake, albeit I think an understandable one back then, given the idea of bringing a shell to Windows that has the power of a traditional Unix shell.

Now people are loudly complaining, laughing, etc., but without any constructive feedback that takes into account the backwards compatibility needs of a tool that is literally shipped to hundreds of millions of people, and stomping on RFC approaches that are pretty similar to those in other projects (see PHP RFCs, Python PEPs, etc.).

Since when did people become so hostile towards open source projects that try to do the right thing?


Powershell's "native executable" story (running things which aren't cmdlets) has never been simple or graceful [1], which means that posh scripters might call exes like curl more explicitly and avoid the alias altogether.

After months of variable-expansion frustration I have landed on the form "& foo.exe par1 par2", spelling out the .exe and avoiding the alias.

[1](https://web.archive.org/web/20110726162028/http://huddledmas...)
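
In other words, something like this (ordinary curl flags; the full path is just an example):

  # Spelling out the extension bypasses the alias, since the alias only matches the bare name
  & curl.exe -sL https://example.com/ -o page.html
  # The call operator also handles quoted paths with spaces
  & 'C:\Program Files\curl\bin\curl.exe' -I https://example.com/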


It's hard, and I think it scales with a project's popularity. GitHub is very much susceptible to the lowest common denominator running in and yelling at maintainers. You can tell by many of the comments that both the users and the maintainers forget that they are dealing with real people. I wouldn't talk to anyone on the streets the way these people do to each other. It's also crazy because it's open source. Open source only works if people are willing to cooperate.


It's because it's Microsoft; it doesn't matter what the issue is. People just want an excuse to bash them.


To be fair one of the first comments on the issue is some guy from Microsoft saying:

"We can't make this change without an RFC. Oh, by the way, the community cannot create an RFC."

If I was actually interested in using PowerShell on Linux, I too would be a little upset that Microsoft is so unwilling to work with the community.


You are not giving the whole context:

> We can't make this change without an RFC. Oh, by the way, the community cannot create an RFC, WE INTEND TO OPEN THIS UP.


After so many years of Microsoft trying to do exactly the wrong thing towards Open Source, I can understand why people are a little slow to trust and accept them. It is in fact very wise at this point to be careful with Microsoft.

Microsoft needs to be patient and understanding of this mistrust, given the weight of history behind them.


And in what is starting to seem like a pattern, Microsoft makes a public move to do the Right Thing.

>jpsnover commented ... @bagder You bring up a great point. We added a number of aliases for Unix commands but if someone has installed those commands on WIndows, those aliases screw them up. ... We need to fix this.

> Snover is a Technical Fellow at Microsoft & the inventor of Windows PowerShell ...

The thread goes on at length about bureaucratic requirements for the change. I guess this is what happens when you hold a bazaar inside the cathedral.


I find this comment the most important in the whole discussion:

>To be honest, I don't see how this dscussion about how many people might depend on these aliases is relevant. The story as I see it is that a few years ago you decided to usurp the names of some well known tools, in the full knowledge that doing so would break those tools. You have then ignored bug reports about this for years. You really can't deny that you knew a long time ago that what you did amounted to hostile behaviour towards other people.


How is that comment useful? No-one is denying anything. It's obvious that the decision to implement those aliases was incorrect. It's not like Microsoft is saying that it was correct.

It's not a question of who was right or who to blame. It's a question of how to make the changes because there _are_ users that depend on those aliases.


They are also honestly pretty useful as a new user. `ls` and `dir` are more obvious things to try than `Get-ChildItem`.


The half-way solution could have been making the 'aliases' output migration documentation about how to accomplish common tasks with the PS commands.

That way PS would be far more discoverable to a UNIX person and would save time searching "$command in powershell".


Yeah, but frankly it also sucks to type out such a long command rather than using the aliases.


Most of the commands have PS "native" aliases which also follow a more consistent pattern, e.g., get-childitem->gci, invoke-webrequest->iwr.
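
Get-Alias shows the mapping if you're curious; roughly (output abridged, and exactly what you see depends on the platform/build):

  PS> Get-Alias -Definition Get-ChildItem, Invoke-WebRequest

  CommandType     Name
  -----------     ----
  Alias           dir -> Get-ChildItem
  Alias           gci -> Get-ChildItem
  Alias           ls -> Get-ChildItem
  Alias           iwr -> Invoke-WebRequest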


That's true, but after years of typing "ls" into terminals that's just more natural.


Then maybe design new ones?

"download" aka "dl" instead of curl/wget?

"list" instead of ls/dir?

Would also make them more easily googleable.


That's not the problem MS was trying to fix with the Linux aliases -- the problem they were trying to fix was ergonomics. A lot of us are used to Linux commands and are turned off by Windows shells when we have to learn a new set of commands and can't use `ls`.

In fact, being able to do `ls` rather than `dir` on PowerShell was one big reason I gave it a chance. My other Windows shell experiences have been supremely intolerable because of that.

With curl and wget, Microsoft admits they were overeager.


I think the correct ergonomics would be if you saw:

  $ curl http://example.com/foo
  Fetching http://example.com/foo by emulating curl
  This command is an alias for "...".
  If you've installed a real curl command, disable this by typing...
so that you could still use muscle memory to type the commands you know, could see what the actual underlying command is so you could adapt to it, and would know how to use the Real Thing if you have it.
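
A rough sketch of such a shim (hypothetical, not what PowerShell ships); writing the notice to the warning stream keeps redirection of the real output intact:

  function curl {
      Write-Warning "'curl' here is a shim for Invoke-WebRequest; install the real curl and remove this function for the original behaviour."
      Invoke-WebRequest @args
  }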


Kind of noisy, isn't it?


It is, but the noise is telling you how it differs from what you'd normally expect and how to address it. I think that's an acceptable tradeoff.


I think it is immediately obvious from the output that it's not the same program, without the message.


Really? What's the normal output? The content of the URL being fetched. If that message went to STDERR, so that curl > foo worked without alteration but the help text still showed up on the console, I think fewer people would be surprised.

The number of bugs about this indicates that people are still being surprised.


That is not the output of the PoSh command. Like most of them it outputs a list of object properties and values: https://mcpmag.com/articles/2013/01/22/~/media/ECG/mcpmag/Im...


PowerShell has other ones in addition to the cannibalistic ones. `Get-ChildItem` has `gci` and the command in question, `Invoke-WebRequest` has `iwr` on top of `wget` and `curl`. Admittedly, it's pretty kitchen sinky.


So, there’s already a solution.


Those are not discoverable in the same way.


I think this comment is important in understanding why so much open source software doesn't have backwards compatibility

> If removing these aliases (retroactively, if you want to clean up your act you have to remove this from all versions in which this was ever released) breaks things for some people, then you can go and send people out to help them fix their issues. You knew it was a hostile action, you broke things, you fix them.

retroactively remove things from old versions? This is crazy


No one does that. How is this comment from someone who doesn't understand the "release" concept so important?


When I had no customers - I had lots of flexibility.

PowerShell is now installed on many hundreds of millions of machines.

With great success comes great responsibility.

We can and will change things but we have to do so in a thoughtful manner.

Jeffrey Snover [MSFT]


Kind of worrying that the inventor of PowerShell could make such a mistake. Reminds me of when rvm used an alias for "cd", which was even worse. Making up aliases for popular tools such as cd, curl or wget is always a bad idea, both for security and practical reasons, since it breaks many tools in non-obvious ways and can be hijacked, etc. It's great though that it's being fixed.

I installed PS on my Ubuntu last night; it works, but I don't really see the point in using it on anything but Windows. The last thing I need is to worry about whether my scripts will run on a custom shell, as if running Ubuntu didn't come with its own issues to worry about already.


> Making up alias for popular tools such as cd

cd is a shell builtin on every single *nix shell, not an independent tool. Every shell has its own cd implementation.


You might also add that it's actually impossible to implement cd as a standalone tool - processes cannot change their parent's working directory.



Well, how about that. Serves me right for calling something "impossible".

Now what do I say? "Impossible without awful hackery"?


Are other commands also different from shell to shell? It seems like an interesting point, if true.


Lots of simple shell functionality is provided as builtins. See the bash manual for a list of builtins inherited from the Bourne Shell[1] and that are bash-specific[2].

Note that some of these are part of the POSIX standard, so there's not a huge amount of variance allowed between shells which are being compliant. Obviously, PowerShell doesn't adhere to this.

[1] https://www.gnu.org/software/bash/manual/html_node/Bourne-Sh...

[2]: https://www.gnu.org/software/bash/manual/html_node/Bash-Buil...


Here's a couple of pages listing the builtins in bash: https://www.gnu.org/software/bash/manual/html_node/Shell-Bui...

And zsh: http://zsh.sourceforge.net/Doc/Release/Shell-Builtin-Command...

PowerShell doesn't really have "builtins" per se; every cmdlet is executed in the process of the shell itself, but they're all contained within loadable modules (though 2-3 modules are loaded at shell startup by default). Invoke-WebRequest is contained within Microsoft.PowerShell.Utility, for example.


> The thread goes on at length about bureaucratic requirements for the change.

I don't like the characterization of the RFC process as a bureaucracy. RFC processes are more open than a pure "bazaar". In the absence of any formal, open process to make project changes, there ends up being a hidden, informal "be friends with this person to get your patch merged" process, which is far less transparent.


It's nice to see the Microsoft people being open to the dialogue but they're in a hard place here. These aliases have been in PowerShell for years, they can't be removed without breaking existing scripts.

Anyone who's used PowerShell knows that a lot of commands work differently, even `ls` and `rm`. They have different options, different text format (colors!), different output and different ways to interact with other commands. That's the very definition of PowerShell: structured interchange between commands with real objects instead of fragile strings.

wget and curl are no different. They have the names of well known UNIX commands but within PowerShell, they have their own syntax and behavior. It's not unexpected and anyone who uses bash and PowerShell regularly (I kinda do) is used to the differences between these two shells.


I'm not too familiar with PowerShell, but I wonder if they could introduce new syntax to fix this. Add a new statement, like import, include, use, or using; if your script includes that statement, you don't get the aliases unless you ask for them (e.g., using LegacyAliases).


The problem with that is that it breaks existing scripts. If you have to update old scripts anyway, why try to maintain the legacy stuff in the first place?


No, it doesn't, because existing scripts don't include this new statement. You only have to make your aliases explicit if you edit a script to use this new namespace/module feature.


Version pinning? i.e. you specify in your script the version that you will use. Scripts missing the version pinning information run under backwards compatible mode.

It...sorta worked for HTML? Over the long run, old pages could be phased out and now we have a pretty nice situation with HTML5.
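
PowerShell's closest existing knob is the #Requires statement, though it only enforces a minimum engine version rather than opting into versioned behaviour; a sketch:

  #Requires -Version 5.1
  # The engine refuses to run the script on anything older than 5.1;
  # it does not, however, freeze the script to 5.1 semantics.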


This actually works reasonably well in Perl, where you can, for example, stick "use 5.22.2;" in a script and thus enable all the features of that version (and the versions before it) while keeping the features of newer versions disabled, thus ensuring backward compatibility.



This seems far less dramatic than the thread is making it out to be.


I think MSFT needs to try harder at their customer service. Their first response from a human was a hard rejection.

lzybkr wrote "We are rejecting this PR as it introduces "Unacceptable Changes", see our breaking change contract."

It is not, "let's investigate it where what the problem is and get back to you some feedback.".


lzybkr is a developer on the PowerShell team, who I would guess is reacting according to standard operating procedure. Then jpsnover, who is a Technical Fellow (a senior position), gets involved and makes the call to start the RFC process. This seems like a reasonable escalation process to me. lzybkr may well not have the authority to commit to an investigation/fix.


They don't even have an obligation to consider outside PRs, let alone provide a reason. It's rather a stretch to say people who want to change the behavior of PowerShell are their main customers.


Depends what main customers you are thinking of. For this product in particular, it seems these users might be potential customers of PowerShell. Each product has its own main customers.


The customers who pay money.


FYI - we don't charge for PowerShell.

Jeffrey Snover [MSFT]


That was exactly part of my point: "main customers" are people who pay to Microsoft, e.g. for Windows licenses.


> I think MSFT needs to try harder at their customer service. Their first response from a human was hard rejection response

All open source projects (on github and everywhere else) that require a contributor agreement do the same thing (e.g. Android). They usually have bots that auto close pull requests when the person hasn't signed the agreement first.

It's an unfortunate but necessary requirement for some projects. There's no malice, just bureaucracy.


He wasn't talking about the attribution bot (which did not actually reject the change). He was talking about the first human response from lzybkr, which was a "no, it's a breaking change".


Fair feedback - we are in learning mode.

Jeffrey Snover [MSFT]


Agreed. The irony here is their first human response sounded more robotic than the initial CLA bot response.


> I think MSFT needs to try harder at their customer service.

You don't think any of the particularly spazzy responders on that github thread are really Microsoft customers, do you?


They are people who work at companies that have the option of buying Microsoft products, and who will discourage coworkers from buying Microsoft products...


Devil's advocate: If you're using PowerShell on Linux you probably mean to remain compatible with a Windows workflow. PS uses other unix aliases such as ls and cat. But these all have different semantics than their namesakes. So if PS on Linux deferred to the native tools it would break compatibility with PS on Windows where those tools aren't typically available. If PS wants to remain true to its own goal of being a unified toolchain for cross-platform development, it should stick to its own implementations even if it perturbs some people who are primarily *nix developers. Because developers not working with Windows are not the primary audience of PS.

Consider that PS also aliases DOS commands such as DIR but is not compatible with the COMMAND.COM commands.


This seems like a similar issue to how `sed` is implemented slightly differently on OSX/Darwin systems and Linux systems.

How have those communities dealt with the problem? Or is this a totally off base comparison?


Sed predates Linux (by a very long time). There is an opengroup standard for sed: http://pubs.opengroup.org/onlinepubs/7908799/xcu/sed.html

As long as the OSX version of sed conforms to the standard there is no problem if it is incompatible in some areas with the GNU version used on Linux.


The comparison is a bit different: the difference for 'sed' is between the BSD version and the GNU version. Two tools based off the same original UNIX version of 'sed' whose development paths have diverged over the years.

Also, the version of sed on OSX is around 11 years old, which helps contribute to its differences.


Scripts tend to allow you to override it, so you can do "export SED=gsed" or similar if you need to. (Good) script authors try to stick to the POSIX-standard subset of options (and ideally test on multiple systems). Sadly there's no test suite for "will this script work on any POSIX-compliant system" AFAIK.


Totally off base. BSD versus GNU conventions.


If you think that that does not, similarly, generate long "But it breaks my scripts!" "Your scripts were wrong." "You should have named the right shell interpreter." "BSD commands are insane." "GNU commands are non-standard." "Are you going to fix all of the scripts that currently rely upon this?" "But this is the de facto standard!" "Have you forgotten about this large userbase over there?" "I don't care what you say, as far as I am concerned this is bash." "I don't care what you say, it's Terminal because I use a terminal to type." "It should be tar." "It should be cpio." "Now it's pax." "What exactly are the options to the 'ps' command?" "Why did you think that you had a command named 'll'?" "No, the superuser's login shell is intentionally the same as it has been for 30 years." "Why does the user manual have a (sometimes little more than a placeholder) note telling me to read something else for the user manual?" "We're settling on info documentation." "No, we're not." "Hey people, I've had a bright idea of documenting everything in HTML and I'm calling it http://cr.yp.to/slashdoc.html ." "The compatibility mechanism is of course to put /usr/xpg4/bin in your PATH." "No, that's gmake." "No, that's gawk." "Ah, actually it's mawk." "No, I think it's nawk." "I'm the real ksh!" "I am the real ksh!" "That's because it's the -I option, not the -i option." "We've actually had a non-interactively-usable general-purpose in-place text editor since Bill Joy wrote ex." "But ed is the Unix standard." discussions ...

... then you have not read enough of Usenet, WWW discussion fora, mailing lists, and others. (-:


What was the thought process behind adding those aliases? "Yes, let's implement broken aliases that supersede the actual working Windows versions of those utilities"?


I'd guess they started with stuff like "ls" (where it makes a lot more sense to just execute a Windows equivalent, at least in interactive use) and then somewhere someone was a bit too enthusiastic?

For stuff like this, "weak" aliases which point you to the right command if the shell can't find an executable for it would have been a way better design: "No command with that name, you are likely looking for Invoke-WebRequest". Only slightly more work in interactive use, and safe in other cases.
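
PowerShell even has a hook that could carry that design; a minimal sketch (the suggestion table and wording are mine):

  $ExecutionContext.InvokeCommand.CommandNotFoundAction = {
      param($Name, $EventArgs)
      $hints = @{ curl = 'Invoke-WebRequest'; wget = 'Invoke-WebRequest'; ls = 'Get-ChildItem' }
      if ($hints.ContainsKey($Name)) {
          Write-Host "No command named '$Name'; you are likely looking for $($hints[$Name])"
      }
  }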


Yes, but the dir/ls alias behaves nothing like its counterparts, so it's actually confusing when you use dir or ls but none of the familiar usage works.


Just wanted to add my two cents here that I normally use `ls` on PowerShell (instead of `dir` or `get-childitem` or `gci`) because it seems expedient. (I come from a Unix background.)

Yes, it does confuse me that plain `ls` in PowerShell is more like `ls -l` on Unix, and this confusion sometimes causes me to type `ll` (my normal Unix alias for `ls -l`). But still, I prefer `ls` over `dir` in PowerShell.

So I guess I'm saying that I'm fine with some of the aliases. (But `curl` and `wget` should be removed, yes.)


Hypothesis: The Windows world is a closed ecosystem. The developers and most of the users aren't familiar with any of the non-Windows systems. They were given a list of commands to map to the closest Powershell equivalent and made aliases for the basic functionality.

That is the only explanation I can see for why the would pick those commands.


Back when it was introduced, PowerShell effectively had a set of commands no one knew. To ease migration, and make things more familiar, aliases were added for many common Unix and cmd commands, e.g. both dir and ls map to Get-ChildItem.

However, having those aliases (not even limited to curl/wget or all the Unix-like aliases like ls, cp, rm, ...) is a risk since they shadow native programs that might exist. It's not as bad with dir, copy, del, since those are cmd-built-ins and thus cannot be called from anything that isn't cmd, but where already shadows where.exe.

So in general, since command resolution order places aliases before native programs, any default alias can break stuff on any machine. Which places this in pretty iffy territory. They cannot remove aliases without breaking existing scripts. This is still Microsoft we're talking about here. They don't just go around breaking stuff left and right. And they actually could never have safely introduced default aliases without a chance of breaking things.
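
You can watch the shadowing with Get-Command; on a box that has the real binary on PATH it looks roughly like this (columns abridged, path will vary):

  PS> Get-Command curl -All | Select-Object CommandType, Name, Definition

  CommandType Name     Definition
  ----------- ----     ----------
        Alias curl     Invoke-WebRequest
  Application curl.exe C:\tools\curl\curl.exe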


They aren't necessarily built-ins. Years ago, I wrote a CMD where DIR, COPY, DEL, and others were intentionally not built-in. I even supplied it with two different versions of an external COPY command, one with syntax that was aligned with MOVE, REN, and so forth, and one with the full original idiosyncratic syntax of the IBM/Microsoft COPY command (that could concatenate multiple parallel lists of files).

I went in a different direction to JP Software's (then) 4OS2, which took the tack of adding further built-in commands. JP Software's (current) Take Command is interesting to consider in light of this discussion. It has its own built-in versions of bzip2, gzip, jabber, rexec, rshell, sendmail, tail, tar, zip, 7zip, and others.

* https://jpsoft.com/help/index.htm?bzip2.htm

* https://jpsoft.com/help/index.htm?gzip.htm

* https://jpsoft.com/help/index.htm?jabber.htm

* https://jpsoft.com/help/index.htm?jar.htm

* https://jpsoft.com/help/index.htm?rexec.htm

* https://jpsoft.com/help/index.htm?rshell.htm

* https://jpsoft.com/help/index.htm?tail.htm

* https://jpsoft.com/help/index.htm?tar.htm

* https://jpsoft.com/help/index.htm?zip.htm

* https://jpsoft.com/help/index.htm?7zip.htm


To help bridge the gap for UNIX users who are switching over - but that being said, it's dumb since I think having the same commands that don't function the same way as the UNIX tools would confuse me more than not having them at all and having to learn a new web request command.


It's great to make things better for your users. But that doesn't make it OK to use others' IP to do so. For decades MS has used IP to crush its competition, so there is absolutely every reason to treat this type of behavior from them as hostile. Is MS gonna give back all the $ Google paid it for Android lawsuits? No? Then it's still just business.


They could have just bundled cURL with PowerShell/Windows like macOS does. But NIH, I guess?


Powershell is an object world, cURL is a string utility. You wouldn't want to bundle it, you want people to use Invoke-WebRequest (current cURL alias).

The alias just needs to be removed. They should replace it with a manpage that suggests cmdlets for popular UNIX commands if those commands aren't found (e.g. if curl returns "not found" it tacks on, "You could try XYZ cmdlet instead").


> Powershell is an object world, cURL is a string utility. You wouldn't want to bundle it, you want people to use Invoke-WebRequest (current cURL alias).

Specifically, for non-PS users reading, Invoke-WebRequest returns an object with the page content parsed and a live DOM to work with, as well as the request headers and the raw data. It's very convenient.

Including curl.exe would not do that.


If they really wanted to help users switching from Linux/Unix etc. then invoking commands like 'ls' would display a help page informing users how to get the functionality from native tools.


wget and curl aren't included with Windows. You have to go out and get them.

Powershell does come with Windows. If you don't know Powershell, having wget aliased to Invoke-WebRequest is hugely convenient, because it's the quickest way of getting stuff done; assuming your Linux knowledge is somewhat, sorta valid is a lot easier than reading a bunch of documentation or googling to see what the actual name of the command is (even though Powershell has pretty discoverable commands.)


I bet there was no RFC for that breaking change back then ;-)


Has anyone looked at the big list o' Powershell aliases? This is from the section containing wget and curl:

  ...
  // Porting note: #if !UNIX is used to disable alises for cmdlets which conflict with Linux / OS X
  #if !UNIX
      // ac is a native command on OS X
      new SessionStateAliasEntry("ac",
          "Add-Content",     "", ScopedItemOptions.ReadOnly | ScopedItemOptions.AllScope),
  ...
      new SessionStateAliasEntry("cpp",
          "Copy-ItemProperty",   "", ScopedItemOptions.ReadOnly | ScopedItemOptions.AllScope),
      new SessionStateAliasEntry("diff",
          "Compare-Object",  "", ScopedItemOptions.ReadOnly | ScopedItemOptions.AllScope),
  ...


...And I'd say that this displays a problem with the PowerShell design more than anything else. No, not that the curl and wget aliases exist: it made sense at the time, and no longer does, that's fine. The problem is that PS, unlike every other shell out there, handles structured data, which means it necessarily has to live in its own little world, with its own commandset, because even if it uses external commands, they're not going to be expecting structured data, because that's not how external commands work. So, by necessity, PowerShell is semi-monolithic, and doesn't integrate well with the rest of the system on UNIX, and integrates poorly with regular CLI apps on Windows.

This is an unavoidable problem with what PowerShell is trying to do. It doesn't make PS bad, but it causes problems.


Your comment ignores the major integration that PowerShell does provide: .NET. PowerShell can invoke any .NET code and work with the objects returned therefrom. Yes, it has terrible integration with text-based tools, because it's not a text-based shell. It has great integration with the .NET object-based environment, because it's a .NET object-based shell.

(This is also completely ignoring the fact that you can still call any executable; you'll just receive a bunch of strings back. The only failing here is that PowerShell doesn't have the developed string-manipulation tools like sed, awk, etc. because working with text is typically unnecessary.)


> The only failing here is that PowerShell doesn't have the developed string-manipulation tools like sed, awk, etc. because working with text is typically unnecessary.

What kind of tools does it not have? It has all the .Net style string methods (split, trim, and so on), all the .Net regex methods, built in arrays and hashtables, looping over lines of text shortcuts.

It's pretty trivial to do small-scale ad-hoc text parsing: cut lines by a character, take element 4, cast to 'datetime' and add +4 days to it, handwave a counter into existence and +1 it, match a regex and do some replacement or group some fields.

It doesn't go as far as sed "in-place text replace" or awk records, but saying it "doesn't have developed string-manipulation like sed" is like saying that of Python. And if you were writing Python you wouldn't need to call out to awk because you'd just use Python string processing and basic data structures for what you were doing, right?
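
A quick sketch of that kind of ad-hoc parsing (the file name and field layout are made up):

  Get-Content .\requests.log |
      ForEach-Object {
          $f = $_ -split ','
          [pscustomobject]@{
              When  = ([datetime]$f[3]).AddDays(4)   # element 4, cast to datetime, +4 days
              Count = [int]$f[4] + 1                 # handwaved counter, +1
          }
      }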


I meant exactly that "it doesn't go as far as sed... or awk...". PowerShell has what I would term the primitives of string manipulation, inherited from .NET -- substring, trim, regex replace, etc. But it does not have the advanced methodologies that sed and awk provide. Especially in regards to editing, since PowerShell also inherits .NET's immutable strings.


This. AWK is the best tool for DSV record manipulation I've ever used. Some would say that it has since been surpassed by other scripting languages, but I disagree. It's fast to learn, and is incredibly good at what it does, with a very simple and elegant design. Simple AWK scripts can read like descriptions of what you want done. Like this script that prints all users in /etc/passwd that have home directories in /home:

  BEGIN{FS=":"}
  $6~/^\/home.*/{print}


Well, if I was using Python to write a script better suited to AWK, I'd use AWK instead.


Yes, and on most platforms, .net has fairly weak integration.


If your sole argument is that .NET has weak integration on platforms for which version 1.0 of the core runtime was only released two months ago, then I don't really have anything to add. Of course it's going to have a weaker integration than a tool set that has existed and been developed together for decades.

However, your original argument was more along the lines of PowerShell being a walled-garden of cmdlets. Which is patently wrong -- it works with any .NET code. It also works with the standardized structured text formats of CSV, JSON, and XML.
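For example (the file names here are made up; the cmdlets are the standard ones):

  '{"tool":"curl","version":"7.50.1"}' | ConvertFrom-Json            # object with .tool and .version
  Import-Csv .\users.csv | Where-Object { $_.shell -eq '/bin/bash' } # assumes a 'shell' column
  ([xml](Get-Content .\web.config)).configuration                    # walk the XML as properties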

It does not currently have tools to work with the two-dimensional text structures typically found on *nix machines, because until this week it was not even available in those environments. We might very well see creation of tools to ease this process.


This is a problem of most platforms.


Weeell, Unix commands also live in their own world, with their own commandset, where everything expects structured data — only the data are streams of bytes.

The problem is at the interface between these two modes of operation (and it'd be just as bad with a third mode, e.g. S-expressions or something else).

Honestly, it'd probably better for PowerShell to only deal with PowerShell commands, and Unix commands to only deal with Unix commands, with a well-defined interface between these two very different ecosystems.


Yeah, but streams of text are standard: every programming language can handle text streams. Even brainf*ck can do that. PowerShell objects are PowerShell specific.

>Honestly, it'd probably better for PowerShell to only deal with PowerShell commands, and Unix commands to only deal with Unix commands, with a well-defined interface between these two very different ecosystems.

Well, yeah, but that would make PowerShell even less useful on UNIX than it already is.


> Yeah, but streams of text are standard: every programming language can handle text streams.

What makes you think PowerShell can't handle streams of text? You'll get an array of System.String objects, one for each line, to do with as you please.

> PowerShell objects are PowerShell specific.

PowerShell objects are .NET objects living in the .NET runtime, and can be passed to any other .NET code. Yes, some object types live in the PowerShell namespace, because they expose PowerShell-specific functionality. But, for instance, `ls` returns System.IO.DirectoryInfo and System.IO.FileInfo objects, which is exactly what you would use in C# or VB.NET or managed C++ or F# to represent the same information.
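Easy to check for yourself (ipconfig.exe here is just a stand-in for any external executable, so this is a Windows example):

  $out = ipconfig.exe        # external command: an array of System.String, one per line
  $out | Get-Member          # TypeName: System.String
  $items = Get-ChildItem     # ls/dir: System.IO.FileInfo and System.IO.DirectoryInfo objects
  $items | Get-Member        # TypeName: System.IO.FileInfo, etc.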


It CAN, but that's not its MO.


Exactly right. It's so funny when people cite one of Powershell's selling points as the fact that it's "extensible", meaning that you have to write programs specifically designed to be compatible with it.


a) You don't, it can read text through stdin from any utility which outputs text.

b) Other shell utilities aren't magically compatible with each other, they have to be "specifically designed" to output representations of information in compatible text forms. Which is an ad-hoc thing, hence the existence of parameters like -0, which make the output null-terminated instead of newline-terminated, for compatibility with tools x, y, z.


Well, it CAN call out to unix commands, but it has poor text manipulation capabilities...


This is funny.

I work on linux or mac os and a couple months ago I was trying to show an intern on windows how to use curl to make HTTP requests from the command line. Having no PS experience, I opened up PS and tried curl, saw that it was a known command but couldn't get it to work as expected since it didn't behave like curl usually does.

Was very confusing. This should definitely get fixed.


I wonder if the creators of open source software should claim trademark rights in the names of that software. "curl", "wget", "git", and "nginx", for just four examples, are probably distinctive enough in their areas of use to qualify. If the creators had trademark rights, they could then write "cease-and-desist" letters to ask others to stop using the names in confusing ways.

This wouldn't necessarily open up a trademark-troll sinkhole, because the creators of widely used software could, in their licenses, grant the use of the name to implementers of their packages.

The best way to claim trademark rights is to register the trademarks. But that's not necessary.

(Unrelated: Jeff Snover isn't a creepy guy; he's just trying to do the right thing.)


To be honest powershell has always struck me as over engineered, brimming with technological smugness, a solution in search of a problem.

Powershell fans get so excited about how their rich CLR object pipeline is so much better than a textual representation that they forget that their shell is basically an unusable theoretical exercise and still blown out of the water by "the real thing".

The fact that the designers of this platform would over-extend and try to clone "Unix utilities" (which have had windows ports for multiple decades) but completely oversimplify the purpose of these tools and fail to replicate them accurately - it's no shock to me at all.


Perfectly said, can't agree more.


Some of the unthinking, reflex accusations of conspiracy by Microsoft that I'm reading here are saddening and a bit disturbing.

Somebody decided to add some aliases for convenience. It turns out to have unforeseen consequences. It'll get fixed using a formal process intended to ensure that more unforeseen consequences don't follow from a breaking change.

This sort of thing happens all the time in this industry. It's not a conspiracy.


If you're referring to my post, I don't think "conspiracy" is the right word. The "extend" part of "embrace, extend, extinguish" is just due to them having a somewhat insular culture that doesn't make use of upstream.

You can be an entirely good person and do "embrace, extend". You're probably not the one making the decisions to extinguish.


I only see one post by you on this topic, and that's the one I'm replying to. So no, I wasn't referring to you.


It's not a conspiracy, no.

It is harm from a historically bad actor.

So it's not accepted with enthusiastic gratitude.

And there's no reason why it should be.


And this sort of stuff is definitely going to come up as Microsoft open sources things. I half think Microsoft would open source Windows today if they weren't terrified to show some of the stuff that undoubtedly lies beneath the covers.


Embarrassment would be the least of their troubles if they did that; for a start, Windows would instantly be community-forked, and the fork would be instantly superior by virtue of having all the annoying stuff removed.


There is a way to address these things that I'm surprised they don't know about as language designers.

For these situations, where a change breaks people's code, you provide a run-time compatibility mechanism. A command line switch or environment variable can tell the language implementation "please behave like version N".

Then all changes since N which are backwards incompatible are suppressed. For instance, troublesome commands that were removed make a re-appearance.

You can then boldly fix something that is obviously wrong, while still giving any negatively affected users a way to fight any fires.

It's not a perfect solution but it placates most of the concern of the form "this is a good all-round fix, but it will break things for some unknown numbers of users".

(Users who have to be absolutely sure that their code will work the same way regardless of language interpreter and run-time updates simply have to package their work together with a specific version of those components, and have their code refer to them instead of the standard installation.)


> A command line switch or environment variable can tell the language implementation "please behave like version N".

That's not a fix. If you tell people running existing code that they need to pass a new parameter to maintain the existing behavior, you've broken them. This is no better than just telling them to update their scripts to call "Invoke-WebRequest" instead of "curl". For the record, PowerShell already supports this. You can request a specific version of PowerShell. Most of the time people don't use this functionality, though.

> It's not a perfect solution but it placates most of the concern of the form "this is a good all-round fix, but it will break things for some unknown numbers of users".

It's actually not a good all-around fix. It might be the best fix for this situation, but breaking an unknown number of users' scripts of unknown importance is still a pretty bad fix. Some guy's payroll processing will break because of this. Some startup's web scraping logic will break. Lots of stuff will break if they fix this.

> Users who have to be absolutely sure that their code will work the same way regardless of language interpreter and run-time updates simply have to package their work together with a specific version of those components, and have their code refer to them instead of the standard installation.

So basically no expectation of backwards compatibility? That's why Chrome ships with its own copy of Windows, right?


> If you tell people running existing code that they need to pass a new parameter

So, make it an old parameter by having this from the beginning in your language.

Once the parameter is several years old, it's no longer a new parameter.

Users can use it proactively, before something breaks. It can be a recommended practice for deploying code.

> This is no better than just telling them to update their scripts to call "Invoke-WebRequest" instead of "curl".

It is substantially better than asking people to change source code.

Users can still change their scripts to call Invoke-WebRequest (and should!); in the short term, they just adjust some version mechanism.

> That's why Chrome ships with its own copy of Windows, right?

Windows has versioning mechanisms in some of its API's, by which it can tell that a program is calling it that was compiled for an old version.

Speaking of browsers, web pages declare what version of HTML they are in with <!DOCTYPE ...>. If your page begins with <html>, you can't expect it to render the same way everywhere.

Deploying applications in containers with copies of operating systems is not unheard of these days.

> So basically no expectation of backwards compatibility?

Backward compatibility is the normal modus operandi. This type of mechanism is just for "oh shit" situations: there is no way we can fix this thing while remaining backward compatible. Any newly introduced use of the version emulation is treated with great regret.


> So, make it an old parameter by having this from the beginning in your language.

It's already part of PowerShell. You can request a specific version when you run PowerShell.

https://msdn.microsoft.com/en-us/powershell/scripting/core-p...
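Concretely (the script name is made up, and the older engine has to be installed for this to work):

  powershell.exe -Version 2.0 -File .\legacy-script.ps1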

> Users can use it proactively, before something breaks. It can be a recommended practice for deploying code.

I'm sure it is recommended practice. That's irrelevant if people aren't following that best practice. And they generally aren't, because it adds friction to development.

The "go back in time and do it right from the beginning" fix isn't feasible due to the lack of reliable time machines.

> It is substantially better than asking people to change source code.

It's really not. If the workaround avoided compilation, this would be lower cost. But the cost to make a small change to a script and the cost to change the way the script is invoked are pretty much the same.

> Windows has versioning mechanisms in some of its API's, by which it can tell that a program is calling it that was compiled for an old version.

No. Windows is just super-serious about backwards compatibility. Some APIs are versioned. Most are not. Microsoft just bends over backwards to keep stuff running (to the point of emulating bugs that programs relied on).

> Speaking of browsers, web pages declare what version of HTML they are in with <!DOCTYPE ...>. If your page begins with <html>, you can't expect it to render the same way everywhere.

Also no. HTML5 did away with the versioned doctype crap because it wasn't actually useful. It's just "<!DOCTYPE html>" now, because that's what you need to get browsers to render your site in a standards-compliant way.

> Backward compatibility is the normal modus operandi. This type of mechanism is just for "oh shit" situations: there is no way we can fix this thing while remaining backward compatible. Any newly introduced use of the version emulation is treated with great regret.

This type of mechanism 1) already exists for powershell, and 2) doesn't really help much here. I'm sure if they take this breaking change, they'll advise impacted people to use this workaround if they cannot for some reason fix their scripts. But this is not a "fix".

Disclosure: MSFT employee, but not on PowerShell or Windows


I never called this a "fix". I used "fix" in reference to whatever backward incompatible change was being made to make this workaround necessary. (For whatever reasons, it is considered a fix that is desirable; and so the question is then now to mitigate the impact.)

Of course the fix for any behavioral regression is to make it go away without the user having to do anything (other than apply the fix).

The versioning request mechanism has to be supported via an environment variable also, because if a user has a tree of scripts calling each other, it may not be feasible to insert this extra argument into all those calls.

> But the cost to make a small change to a script and the cost to change the way the script is invoked are pretty much the same.

That is true, but the cost to make hundreds of changes to a script versus one command line parameter isn't the same. A change which breaks backward compatibility could affect some programs in many places.

There is also the cost of finding that there is a problem, and what that problem is: where is it breaking and what changes need to be made. All while ensuring that those changes work for the older versions of the interpreter too, not just in the upgraded environment.

Say I have some big, 10000 line script. I update to a new interpreter, and the script doesn't work. First thing I will try is the compat option to emulate the previous version. If it works, then there is my workaround; for the time being, I don't have to care why, or whether forty places in the script are affected or only three. The thing has a way of continuing to work (for a good, long time) so I have plenty of time to investigate it. I can treat it as a low-priority issue and give it as a background task to a co-op student instead of as an urgent blocking issue.

> HTML5 did away with the versioned doctype crap because it wasn't actually useful. It's just "<!DOCTYPE html>" now,

I.e. HTML5 just shortened the spelling of the utterance that you need to indicate that "this page is HTML5". If your page is HTML4, you need the older, more verbose utterance.


> I never called this a "fix".

You called it a solution. You called it a way to address the issue. Nitpicking use of the specific word "fix" is pointless, when you were clearly proposing this as a fix.

> That is true, but the cost to make hundreds of changes to a script versus one command line parameter isn't the same. A change which breaks backward compatibility could affect some programs in many places.

You're right. The cost to hack it with a specific version is higher. You either have to set a system-wide environment variable (which someone will forget about and ship a breakage because locally they were using a newer version) or you have to inject it at each point a relevant script could run. Or you could do a find/replace for the problematic alias and be done.

> There is also the cost of finding that there is a problem, and what that problem is: where is it breaking and what changes need to be made. All while ensuring that those changes work for the older versions of the interpreter too, not just in the upgraded environment.

It's literally a grep/findstr. I don't know why you're acting like it's hard to find uses of the tokens "curl" and "wget" in ps1 files.

> Say I have some big, 10000 line script. I update to a new interpreter, and the script doesn't work. First thing I will try is the compat option to emulate the previous version. If it works, then there is my workaround; for the time being, I don't have to care why, or whether forty places in the script are affected or only three. The thing has a way of continuing to work (for a good, long time) so I have plenty of time to investigate it. I can treat it as a low-priority issue and give it as a background task to a co-op student instead of as an urgent blocking issue.

What are you talking about? The whole reason to maintain backwards compatibility is so you don't end up in this situation. You can use the version switch if you need to, but in general if you need to in production it means someone screwed up.

In the scenario you described, there's a 95% chance you'll never take the version switch off once it's in place (because it's "low priority"), so you'll have this hanging over your head until something breaks and it becomes critical to upgrade, at which point you'll be frantically trying to fix the problem and cursing Microsoft for not maintaining backwards compatibility.

> I.e. HTML5 just shortened the spelling of the utterance that you need to indicate that "this page is HTML5". If your page is HTML4, you need the older, more verbose utterance.

Not exactly. If you slap the "HTML5 doctype" on an HTML4 page, it's expected to work. Because again, that abbreviated doctype is all you actually need to get a browser to use standards-compliant mode. But also, this is the doctype going forward. So far as I understand, there is no plan for HTML6/7/whatever to change this. Because backwards compatibility is greatly preferred over trying to force a specific version.


If I have a problem with Firefox and switch to Chrome, that addresses my problem and is a solution; yet it isn't a fix! When I use words like "address", I'm specifically being weasely, avoiding the word "fix". :)

> The whole reason to maintain backwards compatibility is so you don't end up in this situation.

That's the ideal, which ignores the negative aspects of absolute backward compatibility. Very good backward compatibility most of the time is all round better than perfect, absolute backward compatibility.

If you want perfect backward compatibility and an excellent design everywhere, then you have to make only perfect design decisions in everything right from the start.

Dennis Ritchie regretted not fixing the precedence of the & operator in C. It's strangely low because at one time there had been no logical && operator and & was used in its place. He wanted to fix it, but, alas, the story goes, they already had several hundred kilobytes of C code written across three machine installations. The result: a piece of technical debt spread to immeasurable numbers of lines of C written since, world over.

> I don't know why you're acting like it's hard to find uses of the tokens "curl" and "wget" in ps1 files.

Simply because I'm thinking of the whole class of possible backward-incompatible changes in a language or library, not all of which can be necessarily found this way. For the ones which can be found by looking for specific identifiers, you need reliable release notes to tell you what they are.


I imagine they do know about that technique, and like me, they do not like it. It complicates the implementation itself, as well as the interface provided to the user. I generally would rather pick a side: fix the issue and deal with the consequences, or live with the badness because fixing it would be too big of a breaking change.


In my experience using the technique in a language implementation, it is a breeze and the interface to the user is unobtrusive.

I imagine they don't know about the technique.

Engineers who know about rejected alternatives tend to mention them, to avoid the embarrassment of someone else doing it for them.


Implementing it and presenting it to your users once is easy, and not that obtrusive. Doing it N times is not, and the complexity multiplies. It also increases the complexity of support. I see it as avoiding design decisions instead of making them. Since I don't see it as a viable option, I imagine they don't either; it doesn't take much insight to realize you can provide an external switch to go between two different behaviors.


In my experience, no, the complexity doesn't multiply.

Compatibility switches hinging on different versions for different features in the code are basically unrelated.

Sometimes multiple switches affect the same area of the code. This is fairly linear. E.g. if the same logic is affected by two version checks, basically the behavior is split three ways: the ancient behavior before those two versions, the behavior starting with the older of the two versions up to before the later one, and then the new behavior starting with the later version. The code variants do not grow exponentially in N (the number of places in the implementation where a version check is made).

That could happen if you introduced individual Boolean controls for individual behaviors rather than a linear scale on version: the same area of code being "hit" by three run-time flags for altering behavior could have 8 possible behavioral combinations and up to as many blocks of code to implement them.

I don't see how it avoids making design decisions. If you have a plan for somehow mitigating the impact of backward-incompatible changes so that users can get through them, you have more freedom to uproot bad design and replace it with good design.


I have to wonder how many people are really trying to run curl and wget from Windows PowerShell and running into a problem. I feel like if you're the kind of person who would do that you are also the kind who could easily configure your environment to get it the way you want. Breaking existing behavior seems like a bigger problem.


You can bypass this problem by appending ".exe" to the end of your command anyway, as in, "curl.exe" instead of "curl" fixes the alias problem and allows native curl to run.

Alternatively you can use a fully qualified path (c:/utils/curl.exe).
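Interactively that looks like (the path is just an example):

  curl.exe https://example.com                    # runs the native curl, if it's on PATH
  & 'C:/utils/curl.exe' -I https://example.com    # or via a fully qualified path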

Technologically it is trivial to bypass the alias. But that doesn't mean the curl/wget people don't have a legit point about trademark usage and or incompatible args between the cmdlet (actually Invoke-WebRequest) and original UNIX utilities.


Yeah, but the aliases are mostly meant for an interactive scenario where it has real value to be able to just try typing in a command you know and then you can actually look at the docs to understand the details, rather than having to Google "PowerShell equivalent to wget" or whatever.


I think I may quote this when the periodic "text input/output is stupid" threads come up.

"The output of curl.exe (and every other exe or other command external to PowerShell) is System.String.

"So changing curl to run curl.exe instead ofInvoke-WebRequest` would break this script because the output type would change."


What that actually means is that if you change a command so that it runs a totally different program the script would break, which is equally true in a stringly-typed world.


Though in the stringly-typed world, it's when the format changes from:

  A B C

to:

  A B D C

or some other such thing. Your string processing code (perl, sed, sort, uniq, etc.) expects certain things that have now changed because <reasons> (which may be totally reasonable or unreasonable).


Yes, but if it's a totally different program there's no real reason to expect the output to be even slightly similar.


That isn't a reason to go back to only passing strings between processes, that's like saying "I can't assign this string to an integer so all strongly typed languages are stupid."


The aliases in PowerShell are a pain even for normal operations, because they don't behave like their DOS or Linux counterparts. I mean, Get-ChildItem is aliased to dir and ls, which makes it discoverable... but its parameters behave completely differently, which just makes it confusing.


Maybe `cd` is a shell builtin, so I expect its behavior to change when I change shells. So it's not a problem to have `cd` behave slightly differently in bash, zsh, fish, PowerShell.


It's great that Microsoft has made PowerShell available to anyone who wants to do as they like with it and even took the trouble of packing it up for Linux and macOS, but for release, they have to care about their paying customers first, which includes some Windows admins who ignored 10 years of advice to never use aliases in a script. I'm a Skype for Business/Exchange admin who has generally followed that advice, but even I tend to use "select" instead of "Select-Object" in scripts.

This sort of thing is part of why open sourcing stuff that was developed closed and commercial isn't just a matter of tossing it up on GitHub and dropping a note here.


Is there a list of all the commands they alias?

What is the filename of the profile jpsnover is referring to with the workaround of adding:

  Remove-Item Alias:Curl
  Remove-Item Alias:WGet


`Alias:` is a virtual drive exposed by a PowerShell path provider, so just run:

    Get-ChildItem Alias:

For the profile path, it's stored in the variable `$profile`.
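For example, to see which aliases currently point at Invoke-WebRequest, and to open the profile the workaround goes into (notepad being the convenient choice on Windows):

    Get-ChildItem Alias: | Where-Object { $_.Definition -eq 'Invoke-WebRequest' }
    notepad $profile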


get-alias will list all the aliases on your system.

The profile is here (at least in Windows 10, it might be in a different location in other versions): C:\Users\username\Documents\WindowsPowerShell\Microsoft.PowerShell_profile.ps1


$profile already points to it.


I'll leave the MS politics out of my comment, and just summarize my suggestion [0] for handling this sort of thing. A module that exports members allowing you to opt in to future breaking changes now could be added to PowerShell, similar to how Python has from __future__ import [whatever]. So you could remove the curl and wget aliases by adding Import-Module Future.RemoveAliases.Net.Http to your profile, or even opt in to all future breaking changes by adding Import-Module Future to your profile. I wonder what Daniel would think of that idea.

[0]: https://github.com/PowerShell/PowerShell/pull/1901#issuecomm...
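Edit: the module body itself could be nearly trivial. A purely hypothetical sketch of what a Future.RemoveAliases.Net.Http.psm1 could contain:

  # Hypothetical module: opt in now to the future removal of the curl/wget aliases
  foreach ($name in 'curl', 'wget') {
    if (Test-Path "Alias:\$name") {
      Remove-Item "Alias:\$name" -Force
    }
  }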


One of the arguments I saw from the Microsoft guys was that it's consistent with other "aliases" PowerShell has (like ps and ls).

Reminds me of a demotivational poster we have at the office:

https://www.google.ro/search?q=demotivational+poster+consist...

It reads: "Consistency: Only a virtue if you're not a screwup".


I think that *nix commands can be split into roughly two groups: ones that feel like generic "actions" and have multiple implementations (ls, cd, cat), and ones that refer to specific pieces of software (curl, emacs). The first category should be aliased to the PS equivalents, while the second should be left as-is.

With commands like ls, I'd prefer to have my muscle memory get me the first-class Powershell experience (which means a structured list of files, not a string containing ls's output) instead of attempting to act exactly like ls acts in bash. If I didn't want to use Powershell's features (which /bin/ls does not support) I wouldn't use Powershell.


PowerShell is really cool but it's not a traditional shell at all. Totally different model. Also, Windows is not Linux. Stating the obvious, but people seem to be doing a mashup without quite noticing the level of difference.

It's not an open source organization. If it were, they would have the discussion there rather than just closing it and talking about an RFC.

I think they should add an install option rather than just defaulting to those aliases.

MS does need to continue making money. No matter how much people pretend otherwise, that means they have a conflict of interest with open source and Linux. And the web platform (still). And with so many billions at stake, yes their actions reflect this conflict.


There is so much weight of history for Microsoft to overcome if they want to now be accepted and trusted by the open source community. I'm willing to believe that at least parts of the company are sincere and serious about achieving that, but they need to have a thick skin and be understanding when people don't immediately trust them right off the bat.

Many long years of name calling and Embrace and Extend (which this particular issue feels a lot like) are going to take some time, patience, and long suffering on their part to counteract.


For the curious here is the authors blog post about it: https://daniel.haxx.se/blog/2016/08/19/removing-the-powershe...


Dear Microsoft, bundle curl and wget in the next patch / major release. Your historical blunder can be forgiven and you can make new friends in open source.


You think that's bad. They alias ls as well...


ls is a standard on many different shells, with different implementations.


Are there shells where `ls` is a built-in?

I just replied to another comment where I wanted to say that I don't mind that the PowerShell version is different because `ls` is a built-in. But then I discovered it's not. So I just used `cd` in that comment...


Read about the C shell's "ls-F" built-in command.


Don't get me started on the awfulness of csh...


Isn't there a third way? Make the PS versions of curl and wget adhere exactly to the original interface/specs?


Yes but I think the better solution is to fix the alias problem and then work with Daniel to make it super simple for people on Windows to get his latest/greatest bits. I've reached out to him to start that conversation.

Jeffrey Snover [MSFT]


That'd introduce the same kind of breaking changes and wouldn't fit with the PS model.


This would be difficult to do unless you can tell which alias you are coming from in `Invoke-WebRequest`.


I wonder how many venomous comments in the PR were authored by people even remotely affected by the issue. I wager the answer is 0.


To me this looks like a twisted version of step two of the “Embrace, extend and extinguish” strategy – creating interoperability problems: https://en.wikipedia.org/wiki/Embrace,_extend_and_extinguish

To recap:

1. Embrace: Development of software substantially compatible with a competing product, or implementing a public standard.

2. Extend: Addition and promotion of features not supported by the competing product or part of the standard, creating interoperability problems for customers who try to use the 'simple' standard.

3. Extinguish: When extensions become a de facto standard because of their dominant market share, they marginalize competitors that do not or cannot support the new extensions.


To me, you're looking for ghosts when there are none. The Microsoft representatives stated multiple times on that thread that they fully agree the aliases for cURL and wget should be removed. They want to do an RFC to see how removing those aliases would affect existing scripts that use them. PowerShell has been around for a decade, it's more than reasonable to assume people have built workflows with those aliases.

Plus, step 3 literally makes no sense, because these aliases only exist on the Windows implementation and everyone agrees they fall short of the standard implementations.

But that won't prevent people from calling out EEE whenever Microsoft is involved.


It may look like it, but do you really think that's the most plausible explanation?

https://daniel.haxx.se/blog/2016/08/19/removing-the-powershe...


How does this fit? The two Microsoft reps who post first say that the alias is wrong, and they'd like to remove it. I'd say that's the opposite of EEE, which would involve saying that you have to keep the Microsoft way.


Or they have realized that at this point they have to go back to the embracing state to compete with the Linux ecosystem. Hence all the Open Sourcing and the NT Linux subsystem.


That only fits if extinguish is still their eventual plan. Red Hat, Google, Amazon, even Apple all at some point embrace open source ideas/software/standards. Embracing is good. Extinguishing is bad. (Extending is...sometimes good, sometimes bad).


I think Hanlon's razor applies better here than embrace-extend-extinguish.


Hanlon's razor applies when there is no evidence pointing towards malevolence. There is ample evidence pointing to EEE in the case of Microsoft.


"Microsoft" is not some uniform entity. The people who perpetrated this mis-design aren't necessarily related to the decision makers who steered IE/Office/Windows to have the EEE behavior.


Not a uniform entity, but a real entity with real habits of behavior. You don't have to attribute any malice to the "people who perpetrated this mis-design"; Microsoft's bureaucratic structure might just be designed in a way that EEE strategies arise organically. If the people who drove the use of that strategy with older software were rewarded, and held up as examples within the organization, their ideas could be permanently embedded in the nature of the organization itself.


Embrace, perhaps. But I think it ends there.

My guess is that they wanted to keep onboarding easier for Linux types (Embrace) by creating an implementation they thought would handle the typical use case (like ls and the like) using a wrapper for their Invoke-WebRequest.

Unfortunately, they've now created a problem in that they need to modify things to prevent users of real curl from getting hosed but at the same time not breaking the Invoke-WebRequest wrapper.

But the goal of something like this may be no more nefarious than, say, Xamarin or similar where the goal is for reuse across systems.


>Embrace, perhaps. But I think it ends there.

Maybe. I haven't got a clue. The point is simply that Hanlon's razor is a poor defense to the accusation that this is EEE.


Poor defense against accusation of EEE?

Do we think Invoke-WebRequest is going to somehow become the new standard for curl functionality? Do we think Microsoft has any intention of that coming to fruition?

I imagine if they wanted curl for the sake of curl, they'd have included it (with attribution in some text file), and then parsed the results back into the object format preferred by PowerShell. Instead, it seems more like a stopgap to prevent people from having to totally rewrite their shell scripts to work in Windows, as there was enough moaning over PowerShell at launch as yet another shell. Being completely incompatible with users' potentially existing scripts? That'd be a big negative and, honestly, a move the old Microsoft would've done without a doubt.

Personally, I'd prefer (and maybe it is this way, I don't use PowerShell) that any text editor with IntelliSense, on seeing a curl request, give an error/warning (a warning at the very least if you're going to do the alias thing) telling the user to read the Invoke-WebRequest API (and, if it is something they're currently mapping, offer to replace the curl statement with the IWR equivalent).

Seems the curl author's issue is primarily that the alias created a situation where he receives false issue tickets (and don't get me wrong, that is terrible) due to the user thinking they're using curl when, in fact, they're not. And honestly, I could completely see where the user would be confused.

So back to the initial question, I don't see malice here. If they rewrote to include every piece of curl functionality and then went above that, then yeah, I'm with you. I do, however, see where someone screwed up royally (both in the initial shortsighted action and also in the headache it now creates for the PowerShell team in addressing w/o breaking for current users who knowingly/unknowingly are using the alias). I think that counts as stupidity.


>There is ample evidence pointing to EEE in the case of Microsoft.

Forever?


Let's flip this around. Are you sure you want to argue that past behavior can't inform us (to some degree) on future behavior?


Microsoft has been doing this for a long time: https://en.wikipedia.org/wiki/Embrace,_extend_and_extinguish...


This is of course their old and time-honored playbook, which served them well in the past, when they owned the PC market (literally).

But in the era of Linux, Google, and iOS, I don't think they have enough leverage to successfully implement the Extend. So they can forget the Extinguish.

That leaves only Embrace, which might work out best for all involved, a scenario wherein MS becomes a good neighbor in a diverse computing ecosystem.

In this case, nobody is going to keep using PowerShell on Linux or Mac if MS implements or promotes incompatible features. There is an exit available, where there wasn't on PCs in the 90s. I can go back to my plain old unix shell scripts, which have worked well for 30+ years.

To keep this project going, they are going to have to play nice and make sure it works well with everything. Or just abandon it. But they aren't going to be able to Extinguish anything, not in this day and age.


While I'm not sure I agree with you (this does not seem like it was done out of malice), I think it's interesting to note that MS employees probably need to be careful about adding fairly innocent changes/differences to popular APIs in order to avoid accusations of EEE.


If you want people to believe that this is a deliberate act to extinguish use of wget and curl, then I think you should explain what you think MS expects to gain from this. Why are they picking on two super-useful but comparatively specialised OSS tools? How much is the http data transfer tool business worth to Microsoft?


That's nothing more than old EEE strategy targeted for years against Unix/Linux. We will see this very soon again in Edge once its API becomes 'compatible' with Firefox and Chrome API for addons.

http://www.theverge.com/2015/4/29/8515771/microsofts-edge-br...


Seriously, how does this work in your mind? So Edge implements extensions the way Chrome and Firefox do, and then Microsoft tries to extend and extinguish. Then what? People will suddenly not use Chrome or Firefox? EEE only works when people don't have a reasonable choice.


To be fair, Firefox is planning on extending the chrome API for addons after (or really around the same time, in all likelihood) it reaches compatibility as well.

If it weren't, Firefox wouldn't realistically be able to deprecate XUL addons, because there would be no alternative for many of them.


I guess Powershell is going to have to take out all of its useful aliases...

Bye ls, rm, cp, etc


They should put a flag that's something like $clobber_posix with default state of off.
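A hypothetical sketch of how that could look (neither the variable nor the behaviour exists today):

  # Hypothetical: create the Unix-named aliases only when the user opts in
  if ($clobber_posix) {
    Set-Alias -Name curl -Value Invoke-WebRequest
    Set-Alias -Name wget -Value Invoke-WebRequest
  }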


429 points by laurent123456 4 hours ago | 289 comments

And already pushed off the frontpage, yet stories with 16 votes are now on the frontpage.


[flagged]


We detached this subthread from https://news.ycombinator.com/item?id=12320681 and marked it off-topic.


Wait, so now you're attacking me? The ESR post was entirely justified given the context. And I'm not an anti FS zealot by any stretch.

Also, ESR actually is a bigot. I mean, seriously, have you READ his blog?

In addition, I still do remember a lot of usernames because I trust them. And in this case, I remember site names because I don't.

And I'll remember your name, because it's one that I'll take less seriously from now on.


I read his blog, and disagree. Some of the posts (racism is just a byproduct of the uncanny valley effect, for instance) actively disprove your statements.

Either way, your post history proves me correct, and has for a while now.


I have brought up ESR only when it makes sense. If you looked at the parents, you'd know that.

I am sometimes skeptical of RMS and his organization. That doesn't make me a bigot. And I'm looking at my post history right now.

Also, that post doesn't really disprove my claim. And yes, I read it.


Offtopic: What I find interesting about this discussion is that comments regarding “Embrace, Extend, Extinguish” and Microsoft's past misbehaviour got upvotes first, then suddenly many downvotes at once. I have seen this earlier on HN with other controversial topics and I do not know what it implies.

Is this voting brigading / shilling or just normal behaviour of groups trying to suppress dissenting voices?

Also, is this part of why semi-noobs think that HN is turning into reddit?


> Is this voting brigading / shilling or just normal behaviour of groups trying to suppress dissenting voices?

Neither. (If anything, perhaps the assumption of malice on the part of people who disagree with you is what smells like Reddit.)

The dynamic is something like this: The comment gets up-votes because nothing is easier than hating on Microsoft, and down-votes because it isn't actually insightful at all. You can squint at any action by any company in just the right way and make it look like it's part of "Embrace, Extend, Extinguish". The comment essentially boils down to "Microsoft has done bad things in the past, so they're probably still doing them" with precious little in the way of actual analysis.


We detached this subthread from https://news.ycombinator.com/item?id=12319735 and marked it off-topic.


I think I understand what detaching means, but what effect does marking a thread off-topic have?


tl;dr: EEE is overused in many cases and thus not good for conversation, but this comment does bring a new spin that people can engage with.

First for your general question:

I think that people get tired of seeing the Embrace, Extend, Extinguish come up in every single thread about Microsoft.

It's a simple argument that seems to be overused and an easy grab for votes, or to bash Microsoft. If someone brings something new to the conversation by calling Embrace, Extend, Extinguish then I'm game for listening, but most of the time it's just: Microsoft is a wolf in sheep's clothing, look, look, they are Embracing. OH NOOOOOO. Open source? They can ALWAYS change their licenses! Runs on Linux? What are they going to do, start offering free Windows 10 upgrades on Linux? Extinguish! Extinguish!

Microsoft is a huge company; some departments suck and are run by unethical dicks (looking at you, Windows 10). Their developer tools have taken some huge steps lately and have had huge success. Visual Studio Code is used by a lot of people whether they like or dislike Windows; some don't even use Windows. I'm just kind of done with this argument.

Some comments on the OP:

This particular "Embrace, Extend, Extinguish" argument does bring something newish to the table calling this particular action one of the steps as foretold by our CS prophets on high. The OP gives a new spin and is on topic and shouldn't (imo) be down voted because it brings out healthy conversation (which is the point of reading comments on HN vs Reddit). If this were Reddit it would have been punnier.


Didn't vote either way but the post is a "do you think it's like this?" without presenting any real evidence, which may have been interpreted as them being a bit too pro or con one way or another. The thing that makes this place not-reddit to me is that the discussion is much more technical and more importantly, there are quite often references backing up someone's assertions.


I would favour HN investigating this. MS puts a lot of energy into mobilising its employees (or whoever) to take over an HN thread like this.

Their minions write a lot of blub comments so that the comment count exceeds the vote count of a story, which has the same effect as flagging the story!

Another very visible behaviour is that they very quickly push out positive news as soon as there is negative news.

If you track HN stories about Microsoft with a third-party HN tracking service, you can clearly see that unfavourable HN stories about MS vanish very quickly, either through flagging or through flooding the story with a lot of blurb comments to hide it. And on the same day another, often minor but positive, story surfaces and sits on the HN frontpage for several hours; and of course they flag every comment that even hints at the privacy issues with Win10, or any otherwise normal comment that isn't hyping their software/service.


We always take users' concerns about astroturfing or possible manipulation of HN's front page very seriously, and I'll look at the data on this post as soon as I have time. I've addressed your concerns thoroughly in the past, though, and we've seen no evidence since to change our minds:

https://news.ycombinator.com/item?id=11844253


I suggest creating an internal view that lists the top 50% of stories (by vote count) per day, ordered by date descending. Mark all stories that are currently on the HN frontpage in one color. Also include all stories that got flagged or deleted, in another color.

I have created such a view and see a correlation with stories about MS, but not with Google, Apple, Amazon, or any other company.

The usual tactic one can see is that a negative story about MS gets flooded with many comments (starting x minutes after they begin to react). As soon as the story has more comments than upvotes, the HN sorting algorithm acts as if the story had been flagged and pushes it off the frontpage to page 100+; add a few flags to the mix and the story vanishes pretty quickly. One can also see that MS seems to have a backlog of prepared positive stories: every time a negative story hits HN, the negative story is treated as described and a new positive story is submitted and upvoted as a replacement.

Edit: everyone else can do this too, and can watch these behaviours.

I come to HN to read the news and the comments. So I sincerely don't understand why a story with e.g. 400 upvotes and 500 comments, the number one story on HN that day, is hidden from me. I suggest changing the threshold so that a comment count higher than the upvote count doesn't alter the story's position at all, or at least adjusting it to make the site experience more transparent and fair.


Please share the code to generate your view.


Thank you for the linked comment. I have rarely seen moderators address users' concerns in more depth – especially if they think such worries are unfounded.


IANAL but this seems like a pretty clear-cut case of copyright infringement, i.e., a situation where an ordinary person might be confused into thinking that PowerShell curl/wget provides the real thing, but does not.

Although I suppose that PowerShell could fall under "parody."

EDIT: this was sarcasm.


That's absolutely not the standard for copyright infringement (under US code). If curl or wget were trademarked, however, it could definitely be "confusingly similar" which would probably make a valid trademark infringement claim.


curl and wget automatically earn trademark protection simply by using a 'mark' in 'trade.' They do not need to be registered to earn this protection.


I think you're thinking of trademark infringement.

Copyright infringement would be if Microsoft had used GPL'd code (for example) in violation of its license. I don't think there's any suggestion that's the case here.

Your contention that there is a "real thing" would have tremendously negative impacts for the Linux/BSD community. Your typical Linux/BSD contains many utilities that share names and functionality with closed-source System-V utilities but are not in fact the "real thing" (in their case, for licensing purposes). Is everyone supposed to go back and purge those just because they aren't perfectly equivalent in options and functionality to the original System-V, lest someone like the SCO group go after them legally?

I don't think there's any doubt that PowerShell provides a bad clone of wget and curl, as there are many complaints about it. And as a matter of user-friendliness there should be a way to easily access the real executable if desired (I can't comment either way on if there is, I don't know PowerShell). However, I see Microsoft as just as within their rights to provide a "wget" alias as a Linux distribution is to provide a "ps" utility to their users.


I feel there is a big difference between making a replacement for a tool that comes with a generic name in a suite and replacing an entire tool. If you are going to make a reimplementation of Microsoft's spreadsheet software that is open source, you don't get to call it Excel. If you make a reimplementation of cURL, you also shouldn't get to reuse the name of their project.


So again, what is the difference between making a new spreadsheet app and calling it "Excel", and reimplementing a suite of smaller applications and reusing all their same names? Say, how coreutils steals the name of a large number of SysV utilities?

It certainly feels like there's some kind of a difference between the two, but I can't put my finger on exactly what puts one into either category. IMO it may simply be a matter of willingness to sue/defend your trademark. AT&T didn't defend their trademark and now it's genericized.

To throw extra fuel onto the fire, let's add Oracle v Google into this discussion. To me, the coreutils are almost like the API of a programming language - you string together application calls just like function calls, to produce a useful application. Oracle v Google established that the API itself is not copyrightable. Should an attempt to defend based on trademark of the function names have succeeded? If not, what differentiates the "ls" utility from Excel?


What you are describing is trademark, not copyright.

There is no trademark on 'curl' or 'wget', so there is no legal basis to force anyone to remove features based on these names.


Trademarks (at least in the US) are acquired automatically when they are used (just like copyrights). You can't use someone else's name just because they didn't file a trademark application.


PowerShell isn't copying the curl or wget code. This would be more like trademark infringement but I don't know if curl or wget are registered trademarks.

Still not a cool move.


You don't need to register a trademark to have trademark protections. If you use a 'mark' in 'trade' and can demonstrate consumer confusion, then you have a case.


There's a problem with your argument (replacing 'copyright' for 'trademark' as others have pointed out).

What level of name protection applies to command-line tools? For example, can the GNU project replace existing Unix tools with their own implementations which have the same executable name?


Yes, absolutely they can. In fact, they almost certainly DID! GNU's legacy was in creating tools that were compatible replacements for Solaris/Unix/etc. ones.


Just because someone did something without getting sued doesn't necessarily mean it was legal.


You mean trademark infringement.


Alternatively you could see curl not as a program, but part of the shell API. Microsoft has then reimplemented the API. Sure, from the Oracle vs Google case, even this may be infringement, but only in a more anal way.


The point is that they have not reimplemented it.

If the Microsoft versions of curl and wget were compatible with the original utilities, there would not be any problem.


Which actually also came up in the Oracle case: Google claimed to be building a compatible implementation of the Java API, but they couldn't come up with a single example of an app that worked on the JVM which could also run on Dalvik, or the other way around: they were pulling the exact same "embrace extend extinguish" stuff with the Java API that Microsoft had with Visual J++. In that case Sun sued Microsoft and won, leading people to cheer from the sidelines.


Are curl and wget copyrighted? I thought they were free software under a copyleft-style license.


Yes, their source code is protected by copyright. What does this mean? It means that the people have granted the author of that code a limited-term monopoly on making copies of that source code. In turn, as you said, the author has made it available under a restrictive license requiring redistribution of source.

What did Microsoft do? Well, apparently they created aliases that mimicked some subset of the curl/wget behavior for PowerShell. This might allow simple *nix shell scripts to work with little or no porting to PowerShell.

The names of the executables are not protected by copyright. Although -- who knows, if APIs can be considered protected by copyright maybe the names of the binaries themselves are a shell-style API. It seems like a case you'd be unlikely to win. :/


They are copyrighted. They are free software. curl is distributed under an MIT/X-style license [1]; it is not copyleft. wget is distributed under the GPLv3 [2], which is a copyleft license. Both licenses' legal basis is copyright law, and the copyrights held by the tools' authors.

All of this probably does not apply to shell aliases anyway...

[1] https://curl.haxx.se/docs/copyright.html [2] http://git.savannah.gnu.org/cgit/wget.git/tree/README


He mixed up copyright and trademark, as far as I can tell. And trademarking is then no problem with copylefted software. For example, Linux is trademarked by Linus Torvalds and Firefox is trademarked by Mozilla. And at least in the case of Firefox, it's actually pretty important to have, as otherwise someone could bundle malware with Firefox and offer that for download as "Firefox", with nothing that Mozilla could do about it.


That doesn't mean that they aren't copyrighted. It just means that you can freely use them under the terms of their copyright license. If the code for them were not copyrighted (usually referred to as Public Domain), then you could do whatever you wanted with them-- things like making derivative works and using a commercial license.


Let's be very clear here.

This is not true:

No copyright notice == Public Domain.

Everything You make gets Your copyright by default. This means the default is: nobody can use what You make.

You can put other copyright terms on your works by adding a copyright notice.

You can put something in the Public Domain by adding a notice to your work.

Things can also become Public Domain automatically after a certain amount of time. But that time has been extended several times in the US to something unreasonable: like 70 years after the death of the author. You can thank Disney/Mickey Mouse for that.


Create a list of tests. Report how many of the tests pass/fail.


I don't think you understand the problem.


This is kinda awesome. This repo has been public for less than 24 hours and Daniel Stenberg (@bagder) already put in a PR to remove wget and curl? That's amazing. What a Swede.


One is always left with the impression Microsoft thinks their users are ignorant, for lack of a better word.

Of course, along with all those users who Microsoft treats as ignorant, again for lack of a better word, there are some very sharp people who use Windows -- who will perhaps take offense at any criticism of Microsoft -- no offence intended! (I am just not smart enough to feel confident with Windows myself -- too complicated.)

What is amusing to me about this incident is that I imagine Microsoft would argue that "Monad/Powershell" is aimed at its so-called "power users". Apparently these "power users" would not notice or would need something like this?

Intentionally or not, Microsoft is targeting beginners, not "power users" -- tactics like this leverage the popularity of the curl and wget programs and at the same time prevent beginners from learning how to use those programs, not to mention the long history behind them.

Disclaimer: I prefer nc, socat and tnftp myself; I have no bias in favor of curl or wget.


> One is always left with the impression Microsoft thinks their users are ignorant, for lack of a better word.

My impression is that these aliases were added so people wouldn't need to install cURL and wget on multiple machines.

> What is amusing to me about this incident is that I imagine Microsoft would argue that "Monad/Powershell" is aimed at its so-called "power users". Apparently these "power users" would not notice or would need something like this?

PowerShell was initially aimed at providing sysadmins a better tool for scripting tasks within Windows. They added aliases to help reduce the mental shift when changing technologies.

> Even if this can be explained away as a purely innocent "mistake", it still says something about the level of attention to detail.

At the time, it was intentional. The reality is that when PowerShell was designed, there were no plans to open it up and put it online; it was a pure Windows tool. The decision to add aliases made sense.


Yes or no question: Is it deemed too difficult for a user to create their own aliases, based on their own needs?

Why should anyone have to make a "mental shift"? The answer is not because UNIX is incompatible with other systems. The answer is more likely because Microsoft Windows is a proprietary commercial product; its sales and marketing staff have traditionally sought to discredit UNIX. Having followed Windows from 3.11 onwards, I would argue that it is "different" and incompatible by design. The "mental shift" is intentional and strategic, but not at all necessary.

Why were there no plans to make PowerShell free and open source and publicly available initially? That's a rhetorical question of sorts.

UNIX as the OS that generally "runs the internet" has become better known and more popular in recent years as a result of several factors. That's why MS has to make the moves they're making. keywords: "has to"


>Yes or no question: Is it deemed too difficult for a user to create their own aliases, based on their own needs?

Are you intentionally trying to be difficult? The answer is of course not, considering there is the New-Alias cmdlet. These aliases exist purely as a convenience.
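
For instance, a user who wants "curl" to run the real tool can set that up themselves in a couple of lines. This is just a minimal sketch; the curl.exe path is a placeholder, and the change only lasts for the current session unless it goes into $PROFILE:

    # Drop the built-in alias in this session, then point "curl" at a real curl.exe
    Remove-Item Alias:curl -Force -ErrorAction SilentlyContinue
    New-Alias -Name curl -Value 'C:\Tools\curl\curl.exe'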

> Why should anyone have to make that "mental shift"? The answer is not because UNIX is incompatible with other systems. The answer is because Microsoft Windows is a proprietary commercial product and its sales and marketing staff have always sought to discredit UNIX. Having followed Windows from its inception I would argue that it is "different" by design. The "mental shift" is intentional, but not at all necessary.

You followed Windows from its inception, and this is the best argument you can come up with? A not-too-subtle attempt to discredit the company? Your answer ignores the fact that Windows was built on top of a DOS derivative, and the scripting language for Windows for many years was batch. It was different because it wasn't based on UNIX; Linux was still in development at the time.

>Why were there no plans to make it free and open source and publicly available initially? That's a rhetorical question of sorts.

I'd imagine because it was aimed at people who were managing Windows servers. Considering Linux already has a robust set of tools for that, there would have been no reason to release it anywhere else.

>As I see it, UNIX as the OS that generally "runs the internet" has become better known and more popular in recent years as a result of several factors. That's why MS has to make the moves they're making. keywords: "has to"

I don't think anyone on this thread would deny that. The question is how is that remotely relevant? Companies have to change strategies in order to respond to the markets they operate in. Companies survive on profits and revenue. It makes sense their actions would be guided by those. Why is this a difficult concept for people to understand?


OK, if you meant going from DOS batch to UNIX sh scripts, i.e., from one scripting language to another, as the "mental shift", then that's fair.

But using an alias for a program that may be called by a script is not going to address that.

A user can just as easily call curl.exe using BAT, WSH, etc., or PS.
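
As a small illustrative sketch (the path and URL are placeholders): spelling out the .exe extension in PowerShell bypasses the alias entirely, and the call operator works with an explicit path.

    # Spelling out the extension skips the PowerShell alias (assumes curl.exe is on PATH)
    curl.exe -sO https://example.com/file.tar.gz
    # The call operator also works with a full path
    & 'C:\Tools\curl\curl.exe' --version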

As for DOS, keep in mind MS did not even write the software that became MS-DOS; they bought it.

Many believe it was a copy of the work of Gary Kildall, whose CP/M was indisputably more original and superior in quality to anything else available for the "PC".

Using their usual tactics (remember "vaporware"?), MS extinguished Kildall's DR-DOS, and put his company out of business.

Later they paid a $150 million settlement for this move.

What is difficult to understand is why MS cannot make money without copying and interfering with other companies.

What's wrong with their "original" work? Can't they sell that?

Will they use the same tactics against open source projects that write software for UNIX? Maybe that is what concerns people here.

It is also difficult to understand why they must force users to "upgrade" an OS that already works. Profits and revenues. Right. Rah rah Redmond.

If this behavior is what they must do in order to derive "profits and revenues" then why would you be confounded by people who would question it?

I'm all for winning in business, profits and revenues, but truthfully I am here because I like using, reading and trying to write software that is better than average.

Microsoft offers nothing in this regard.


What are you talking about? Most PowerShell users who aren't idiots know that these aren't the real versions of the commands.

They're simply there to help people who might know bits of bash.


curl and wget have nothing to do with bash


But they are commonly used in bash.


"They're commonly started from bash" might be a better way of putting it. When you're running curl, you're not running bash; bash is suspended. But the program you use to start another program isn't really relevant here.

Sorry for nitpicking... people describing everything related to the UNIX command line as "bash" is a pet peeve of mine.


They're commonly used on Unix. Unix has many shells besides bash.


Yeah, there were a lot of "power users" working around these aliases. I mean a lot. That's something that was discussed on that pull request, in fact. People definitely noticed.


This is how it went for me: One day I typed "curl" in a PS window and I got this Invoke-WebRequest thingy. I toyed with it and decided I preferred to actually download curl for my initial purpose.

Now I know Invoke-WebRequest exists and I like it, so now I have two different tools to accomplish similar things.
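
For anyone curious, PowerShell will show that redirection explicitly; this is just an illustrative check, not specific to any one setup:

    # Show what the name "curl" resolves to in the current session
    Get-Command curl   # reports an alias pointing at Invoke-WebRequest by default
    Get-Alias curl     # the same information, via the alias provider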

And don't get me wrong, I actually agree that this use of the aliases is wrong and should be changed. We now have two options here: wait for MS to do it, or fork the project. After all, it's open source.


Great comment.


Am I crazy, or does Microsoft announce new open source or dev-friendly stuff every time there's a PR bomb here on HN?

18/08/2016 https://news.ycombinator.com/item?id=12305598 With Windows 10, Microsoft Disregards User Choice and Privacy (eff.org)

A day later they announce PowerShell is open source.

I remember a few months ago they blasted HN's frontpage with news after a few discussions about telemetry and EEE (first wave of Windows 10/UWP showing its true colors).

I don't care enough to do a reddit-esque investigation on HN stories. I just find it funny, that's all.


You are definitely not crazy, nor alone in watching these PR tactics. I dislike that so many MSFT employees/puppets/bots/fanboys have frequented HN since the 2015 Build conference. I would prefer the HN of pre-spring-2015, with news about venture capitalists, startups and software that matters to startups, not corporate PR about software that doesn't make sense for startups. Look at how MySpace lost: look at what software stack they used and how it burned through cash because of very expensive license fees, and compare that to Facebook. Everyone who got burned by the Seattle guys once usually learns and avoids getting burned again.


If that's what they're trying to do, then they're doing it badly.



