You may finally use JSHint for evil (mikepennisi.com)
78 points by coldsnap427 on Aug 4, 2020 | 80 comments



The project has been "rewritten from scratch" [1] in order to sidestep the legal requirement of preserving the original license (inherited from JSLint, which it was forked from), against the wishes of both original authors [2].

A whole lot of post-rationalization of how the unconventional license caused the project to slow down: http://mikepennisi.com/blog/2020/jshint-watching-the-ship-si.... Certainly not due to ESLint's extensible design, customization options, better error messages, ES6 and JSX support and adoption by multiple mainstream libraries.
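(For the unfamiliar: that extensibility largely comes down to per-project configuration. A minimal sketch of an `.eslintrc.js` follows; the option and rule names below are real ESLint configuration fields, though the specific choices are just illustrative:)

    // .eslintrc.js -- minimal, illustrative per-project ESLint configuration
    module.exports = {
      extends: 'eslint:recommended',   // start from the recommended rule set
      parserOptions: {
        ecmaVersion: 6,                // enable ES6 parsing
        ecmaFeatures: { jsx: true }    // enable JSX parsing
      },
      rules: {
        'no-unused-vars': 'warn',      // downgrade a rule to a warning
        'eqeqeq': 'error'              // require === over ==
      }
    };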

Legal implications aside, it's still sad to see that 'do no evil' is so hard to agree with.

[1] https://jshint.com/relicensing-2020/index.html

[2] https://github.com/jshint/jshint/issues/1234#issuecomment-23...


The part that explains why this clause (originally written by Douglas Crockford for the JSON license) is a problem for free software:

> If you’re not versed in legal matters, that probably seems like an odd restriction. By rejecting JSHint, are people admitting that they want to do evil? And is that clause actually enforceable, anyway?

> The answer to the second question is “no,” and that helps answer the first question. Legally-conscious objectors aren’t betraying their own dastardly motivations; they’re refusing to enter into an ambiguous contract. Put differently: they’re not saying, “I’m an evildoer,” they’re saying, “I don’t understand what you want.” This consideration disqualified JSHint from inclusion in all sorts of contexts.


As an admirer of the WTFPL, I find it hard to sympathize. The lawyers can still play their games in court; it's free software either way. The fact that a couple of central entities get to decide what qualifies as 'real' open source is the most concerning part.


>The lawyers can still play their games in court, it's free software either way.

That isn't true at all. If someone takes GCC, makes some changes and re-releases GCCv2 as closed source, there's no open source boogeyman that will force GCCv2 developers to release their code. It's up to courts, and by extension lawyers. Free software implicitly depends on lawyers to hold up the contract of Free Software.

If lawyers tell you "this contract is unusable because of this clause" then you don't have free software, you have a weird proprietary license. The entire point of free software licenses is to define rules for lawyers to play "their games in court". It's worthless otherwise.


IMO, the fact that lawyers can't even agree on the scope of the AGPL means that semantics are just that, semantics.

Many licenses haven't been tried in court, including the AGPL, yet it's used fairly broadly.


AGPL is widely avoided by corporations precisely because of this problem...

Corporations vastly prefer BSD/MIT style licenses or at most GPLv2.

GPLv3 is avoided, AGPL is avoided even more.


AGPL is avoided because cloud/server corporations know exactly what it means. It's expressly written to ban their business models.

Likewise for GPLv3 and Tivoized corporations.


The license JSHint is moving to is MIT. It is by far the most common license used in OSS libraries and it allows usage or re-licensing in proprietary software.

That is the context from which I'm speaking, where enforcing openness through legal means is not a concern as the license is permissive (I imagine you were thinking of protecting GPL software, which is kind of the opposite situation). There have been zero court cases involving legal dispute over MIT licensed software.


Right, the unmodified MIT license is fine. There haven't been any cases precisely because of how unambiguous it is in giving permission.

The question is, how do I know that a contributor to software under the JSLint license (perhaps a corporation that paid an employee to contribute something) won't sue me on the grounds that I'm doing evil with their code?


It is not free software.

To quote https://www.gnu.org/philosophy/free-sw.en.html

"The freedom to run the program means the freedom for any kind of person or organization to use it on any kind of computer system, for any kind of overall job and purpose"

"Do no evil" is maybe tongue in cheek, but it's a real issue when people put things in licenses like "nobody from the US government is allowed to use this".


There are a couple of other central entities which do actually decide what qualifies as a legally binding agreement, and the WTFPL likely falls short of pleasing them.


WTFPL isn't ambiguous though.


Yep. WTFPL, while profane, is unambiguous in its permissiveness. It is perhaps ill-advised in its lack of a warranty disclaimer, but that's a problem for the developer, not for the end user.


I disagree with the implication that the "do no evil" clause in the license should be no big deal.

Imagine the library had some code like:

    if (get('http://example.com/am-i-evil') === 'yes') {
      fs.rmSync('/', { recursive: true });  // recursively wipe the filesystem
    }
And the author assured you, "oh, it's just a joke, it's no big deal, we just want you to think about not doing evil, but we wouldn't ever actually use that to delete your hard drive".

Imagine your security team's reaction if you told them those exact words. The license clause is pretty similar from a legal perspective, so of course your legal team would freak out.

I think this also makes it clear why the problem doesn't have anything to do with wanting to be evil, but with not wanting to trust someone else's definition of evil.


Ok, we'll change the URL from http://mikepennisi.com/blog/2020/you-may-finally-use-jshint-..., which has the catchy title but is just a summary page, to the post you linked to, which seems to have the most information.

Off-topic side note: Blog post series are a problem for HN because usually only one element in the sequence gets attention, meaning readers only get part of the story. Alternatively, more than one thread gets attention, but then the discussion is split and we run into the problem of follow-up posts [1], i.e. the exponential decay of interestingness under repetition [2]. So from an HN point of view it's best to just make one long article on a given topic—but that's just the local perspective.

[1] https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...

[2] https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...


> Legal implications aside, it's still sad to see that 'do no evil' is so hard to agree with.

I disagree that you can ever put the legal implication aside when discussing license terms. The legal implications are the entirety of what's being communicated by a license, which means you're basically saying "if you ignore what these words mean, nobody should disagree with them". That's very true! It is easy to agree with something if you ignore what it means. And?


> Developers from the Debian and Fedora GNU/Linux distributions independently concluded that they could not include JSHint due to licensing concerns.

Without knowing anything else about the project or its competitors, I would absolutely expect that being excluded from major repositories would hurt its adoption.


It is my religious belief that we all do evil, at least a little bit, at least sometimes, even if we wish to do good. As St. Paul said, "For I do not do the good I want to do, but the evil I do not want to do—this I keep on doing."

That means that I cannot, in good conscience, use software that requires me to agree that I'm not doing evil. If your conscience is confident that your use of JSLint involves no evil - that you are sinless and blameless - I'm happy for you. Unfortunately I cannot say that of myself.

(And if your conscience lets you say that of your employer and all your co-workers, I really want to know where you work.)


> The Software shall be used for Good, not Evil.

It's so funny because Crockford is one of the most persuasive speakers and coders on avoiding ambiguity, on readability over coders' desire to "express themselves" in code, and on making language design decisions based on research.

He then takes a step outside his area of expertise and chooses to express himself with an ambiguous license created without consulting anyone who has domain expertise in software licensing.


He probably sees it as a form of "sticking it to the man". He probably did it as a joke initially and is now digging in his heels. He remembers the 80s when hacker culture abounded and software could have lots of inside jokes and lawyers were nowhere to be found.

Now that software has gone "mainstream" and copyright and patents and lawyers and courts and legal red tape have been rolled out everywhere, he is refusing to conform and is instead sticking to his guns.


He also made some strange linguistic choices in his latest book[1]:

"The word for 1 is misspelled. I use the corrected spelling wun. The pronunciation of one does not conform to any of the stan­dard or special rules of English pronunciation."

[1] https://howjavascriptworks.com/sample.html


The context matters here:

> The word for 1 is misspelled. I use the corrected spelling wun. The pronunciation of one does not conform to any of the stan­dard or special rules of English pronunciation. And having the word for 1 start with the letter that looks like 0 is a bug.

> The spelling of wun is unfamiliar to you so it might feel wrong. I am doing this intentionally to give you practice with the idea that a bad feeling about something unfamiliar is not proof that it is wrong.

> This is how spelling reform happens. For example, some cat decides that it really would be better if through were spelled thru because it does not make sense that half of the letters in a popular word be silent, being wildly inefficient and putting an unnecessary burden on students of the lan­guage. Spelling reform is a struggle between tradition and reason, and sometimes, reason wins. I feel the same way about programming lan­guages. So if wun makes more sense to you than one, then please join me in the effort.

I understand that he's making a point about how reforming systems can feel; the example of "wun" instead of "one" is a way to communicate that to readers.


Side note, but that is the best typography I've ever seen in an HTML document.


There are a few LaTeX-mimicking CSS styles with a similar vibe; I suggest having a look at those.


It is quite amusing that he thinks the English language has "rules" as though it were a well designed programming language.


The rules of a language are its conventions.


I have found it funny that Douglas Crockford apparently gave IBM this exception: "I give permission for IBM, its customers, partners, and minions, to use JSLint for evil." (for JSLint, which JSHint was forked from): https://web.archive.org/web/20170722132351/https://dev.hasen...


> The Software shall be used for Good, not Evil.

I get that software developers add these things to licenses in good humor, but lawyers, unfortunately, have none. If you want your project to be used in a serious way, it's better to leave the ambiguous phrases out of your license and choose something standard.


"Lawyers have no sense of humor" is one way of saying that they are risk averse. After all, part of their job is to help their clients avoid worst-case scenarios. Therefore, the weird kind of ambiguity in that license, which has no precedent to give the lawyers any idea as to how judges would rule, is basically a big blind spot in the ability to manage risk.


Well, JSON is used in a serious way; in fact, I doubt you can use most web applications (and many desktop apps, too) without your machine parsing some JSON. And... https://www.json.org/license.html


That license does not cover the JSON spec; nearly all JSON parsers are not distributed under that license. I assume Crockford's original reference implementation of a JSON parser is released under that license, and not much else.


But JSON isn't software, it's just a specification. Any text that matches the specification is JSON, whether it is binary, a string, printed on a piece of printer paper, etc. So what does this license mean, exactly?


The JSON License was created to cover the original Crockford JSON parsing and linting libraries.


My employer specifically forbids us from using the json.org license because it's not considered FOSS


> ...licensing concerns. That’s why Ubuntu users can’t download JSHint via sudo apt-get install jshint.

Okay, this is misleading. You can install tons of non-free packages using apt from the official archive. They are in the multiverse section.

http://archive.ubuntu.com/ubuntu/pool/multiverse/


I've also had customers dig their heels in because I depended on this library: https://dst.lbl.gov/ACSSoftware/colt/license.html. Exactly when what you're doing is considered a military application is ambiguous. Have an obscure DoD contractor as a customer? Uh oh - can't use this...


Sorry for commenting on the formatting instead of contents, but how could someone say "The following graph shows how many times JSHint has been downloaded from npm each week over the past five years" and then present a graph with four data points?

Frankly that makes me doubt the author's central piece of evidence (that JSHint gets much fewer downloads than ESLint).

* Graph from this URL in case it changes again: http://mikepennisi.com/blog/2020/jshint-watching-the-ship-si...


Here's a better set of charts, using the `npm-stat` site:

https://npm-stat.com/charts.html?package=jshint&package=esli...
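
If you'd rather pull the raw numbers yourself, npm also exposes a public downloads API at api.npmjs.org. A minimal Node sketch (the endpoint and response shape are npm's documented downloads API; the comparison itself is just illustrative):

    // Print last week's npm download totals for jshint and eslint
    // using npm's public downloads API.
    const https = require('https');

    function fetchJSON(url) {
      return new Promise((resolve, reject) => {
        https.get(url, (res) => {
          let body = '';
          res.on('data', (chunk) => { body += chunk; });
          res.on('end', () => resolve(JSON.parse(body)));
        }).on('error', reject);
      });
    }

    (async () => {
      for (const pkg of ['jshint', 'eslint']) {
        // "point" endpoints return a single total for the period;
        // "range" endpoints return a per-day breakdown instead.
        const data = await fetchJSON(
          `https://api.npmjs.org/downloads/point/last-week/${pkg}`
        );
        console.log(`${pkg}: ${data.downloads} downloads last week`);
      }
    })();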


Writing software is not a neutral act. You are morally responsible for its uses. Restricting those uses is both a responsibility and a good. Circumventing this restriction is itself an immoral act.


You know what? I’d like a world where people don’t decide that software licensing is an ideal place to wage ideological war and inflict harm on their enemies. I want collaboration and communication between people not in perfect agreement. I understand this is an overtly political stance; I am willing to commit to it as a positive and trust that it will benefit communities and the world at large, and that fragmentation into polarized ideological splinters will harm communities and the world at large (and enable our well resourced enemies to strike us in our weakness).

Because this is a linter that will help coders code well, not a facial recognition suite that will facilitate totalitarian regimes as they crack down on dissidents.


I think it's worth pointing out that software licenses are what enable the kind of world you're describing. Unless you're suggesting massively changing copyright law (which may be a fair thing to suggest), the typical well-known software licenses enable lots of code sharing and contributions by making it very easy to understand the terms on which you can use it. Without them, you could never easily share or take code contributions without potentially having tons of copyright issues.

Put another way, the issue here is not software licenses in general, it's the fact the author picked an obscure one with dubious requirements. Picking no license would have actually been worse than the license he had.


> Writing software is not a neutral act. You are morally responsible for its uses.

That's an unsound argument - it suffers from a confusion of agency. The most I can be responsible for as an author is the intent of my software.

If I'm a wrench manufacturer, I have no moral obligation to attempt to control how people will use my wrenches. They are amoral tools with a good intent (to help with construction projects). Whether my wrenches or those projects will end up being used for evil is beyond my purview.


This I would generally agree with. However, care should be taken with powerful tools – deepfake software, explosive material, and so on. The laissez-faire “guns don’t kill people” attitude isn’t wrong, per se, but it can be inconsiderate if used blindly.


True, but I think your caveat is covered by my assertion regarding the intent of the tool.

The intent of deepfake software is clearly either fraud or development of deepfake detection tools. If you're a vendor and your intent is the latter, it would be morally commendable to do your best to sell to reputable customers only... but with security tools like this the authorial intent almost doesn't (and shouldn't) factor in. Unfortunately, if you try to go soft so it won't "hurt someone" then the fraud detection R&D can't be as robust.

Regarding guns, they're either for killing people, killing varmints/animals, or defending oneself against bad guys. As a salesperson, you can reserve the right to refuse sales to known crooks -- that would be morally commendable (as far as it's possible). But it's your decision based on your own judgment. As far as the gun itself is concerned, if as a manufacturer your intent is defense against bad guys (or legal hunting/varmint control) and your product reflects that intent, then you're morally in the clear.

Incidentally, I'm strongly against sales of surplus military gear to police departments, because of the mixing of intentions that are at-odds. The intent of military gear is to fight foreign armies commanded by bad guys -- which is an intent that police departments should never have, as they're dealing with civilians. That's why in the US, we have the National Guard if an army of bad guys ever appears within our borders--it's different from the police, with different training and different intent.


"The intent of deepfake software is clearly either fraud or development of deepfake detection tools. "

I think there are a ton of positive, creative uses of it. For instance, de-aging an actor for a flashback sequence, with their permission. Deepfake software can potentially do this way cheaper than the state of the art, such as the young Tony Stark / Peter Quill's dad / Leia. And eventually, it should be able to do it better. When the technology is truly undetectable, it will be used all over the place in filmmaking and even video games.


Do we need to consider only intent, or can we extrapolate probable futures?

Because then I worry about deepfakes making actors obsolete (as we have made obsolete so many other professions)


Maybe I'm older than you, but I remember being worried about those newfangled bulldozers making us ditchdiggers obsolete. :)

Or how about movies making stage actors (and set builders, etc) obsolete? I mean, when you only have to act it out once and millions can see your performance, as opposed to actors performing shows every night in small venues in towns all over the country, it puts a lot of people out of work. There are pretty few people today making a living as actors.

And of course photography put portrait painters out of work. And high quality cameras in cell phones probably put a lot of photographers out of work.

What might happen is that there will still be actors, but all they have to do is act, not be beautiful. They can use a model for the beautiful part. And then you can have talented people on computers merging it all together.

I do like the idea of bedroom production of movies, which can happen if you don't need actors. Previously, if you wanted to be a film director, you needed to be born into a rich family, and then after extremely expensive film school, you needed to get big money to finance your movies. So there's that.


I stand corrected -- you're clearly right.


> Regarding guns, they're either for killing people, killing varmints/animals, or defending oneself against bad guys.

Also just shooting inanimate targets for sport because it's fun.


Deepfake software will eventually be so easy to use that script kiddies can and will flood the internet with deepfakes. That will hopefully "cheapen" the value of online video and audio clips and force people to learn where reliable sources of truth come from.


In a world where people still get fooled by Onion or Onion-like articles, I'm not holding my breath for it.


Verified instagram/twitter accounts with text?


You're buying a wrench. The checkout person says, "I hope you're not going to murder someone with this!" You say, "actually I am." The checker then refuses to sell you the wrench.

Freely provided software unifies the producer and distributor into one agent. That person has responsibility. The absolute minimum they are morally responsible for is restricting the uses of their product to ones they do not find morally objectionable.


You're buying a wrench. The checkout clerk disappears in a cloud of sulfur and reappears as a demon. She tells you she will sell you the wrench, but if it's ever used for evil, your soul will be bound to her for eternity. She then holds up a virtually identical wrench and says "this one is fine, though, do whatever you want with it." You really just need it to tighten some fittings under your sink. Which wrench do you buy?


You're buying a wrench online. You add it to your cart. Dang, 5 day holding period while the manufacturer does a background check. Ugh, you need to fix your toilet. You head over to a hardware store and get a wrench. The checkout person jokes "I hope you're not going to murder someone with this!" You roll your eyes and say "ya got me!" and put your hands up like Walter White. The checkout person suddenly gets serious and refuses to sell you the wrench to legally cover the store's hide. You go to a second hardware store. This time you've learned your lesson. You buy the wrench without saying a word. You do not respond or interact with employees and you keep your purpose a secret. The End.

What an ideal world.


No, you're not. This is an argument that doesn't hold water. Why am I morally responsible when none of the abstractions below my code are? What about chip manufacturers? The developers of CPU instruction sets? Compiler developers?


I think adding the term to the license doesn't solve this problem (and is silly), but I think your argument is the wrong way to look at it. People should take moral responsibility for the code they write, and refuse to write code if they view its uses as immoral, but you have to weigh for yourself what that means and what those criteria are, and adding a license requirement doesn't solve this problem.

It's not hard to justify developing a compiler or CPU (or a JS linter) and deciding that it's just a generic tool and you have no control over the code compiled with it - and that's perfectly fine. You might however find it harder to justify writing functionality that's designed to track people or that you know is intended to be used for nefarious purposes, and in those situations you should be willing to hold yourself to whatever moral standard you have determined. Just because everyone below you hasn't prevented you from doing it isn't a reason to go along with it; you know more directly what your software is going to be used for than they do, and they're effectively trusting you to make these decisions.

Again though, a JS linter doesn't exactly have these concerns ;)


Tracking someone isn't evil. Killing someone isn't evil. Pushing a kid into the bushes isn't evil either.

You're tracking someone exploring a new cave and recording the paths they find. You kill someone who is about to kill other innocent people. Pushing a kid into the bushes to avoid a car could save their life.

Evil depends on context. What's evil to you may not be evil to me.

Courts generally throw out clauses like these unless the definition of evil is part of the agreement.


I think it's worth pointing out that I agree the clause is silly. I was talking in a broader context of software development.


They are. Why wouldn't they be? We put people in jail for distributing potentially harmful tools to people who are known to desire to harm others. This is not a new or controversial idea.


Name one tool that has not been used to kill someone or commit an immoral act


This attitude disturbs me; it tries to absolve responsibility by denying that responsible behavior is possible. That's not just obviously false; whole troves of law cover exactly this point: individuals and corporations have responsibility that follows their products.


I just want to say I agree 100%. All these arguments people are making that you can just absolve yourself of responsibility for anything you create because "you don't know what they'll do with it" worry me. Sure, a JS linter is probably not a thing to be worried about, and adding an entry to your software license doesn't really help, but people (and developers) are asked to do arguably bad things, and we should encourage people to use their moral judgement in deciding whether to do something rather than just decide "I'm not responsible, so it doesn't matter".

I think it's also somewhat interesting to compare these comments against the ones on articles where we find out a company is helping China (or a different country) censor their website, either locally or globally. In those situations people are upset they were willing to do it - but why wouldn't they if they have no moral responsibility for what they're making and it brings in more dollars? Someone in those companies had to approve it and eventually add those lines of code or those entries in a database, and good or bad they should be willing to stand up for those decisions and say no if they truly disagree with what they're doing. They are, quite literally, our last line of defense against such actions - and it might not be "fair" to them, but it is reality.


I don't think Linus Torvalds loses sleep at night just because North Korea has created a controlled linux distribution for use in their country[0]. Would you if you were the creator of Linux? If so, what would you have done that Linus Torvalds didn't do?

[0] https://en.wikipedia.org/wiki/Red_Star_OS


Your example is simplistic; not all situations are so passive and indirect. Here are some better ones:

Do you think Linus would be willing to add a driver for a USB device developed by North Korea that could or would be used by them for nefarious purposes?

Do you think he would be willing to add code that would make it easier for the NSA to spy on Linux users?

Do you think he would be willing to add code to track Linux usage?

Do you think he would continue to develop and work on Linux if it turned out North Korea was the only user of it? Not just one of, but the sole consumer?

The real world involves actual hard problems and questions, and in some cases (In Linus's case, probably more often than not) your choices will have a direct impact on people. And while I can't tell you Linus's answers for those questions, it's not the answers that matter. What's important is that he should be willing to stand up for those decisions. It shouldn't be acceptable to us or him for him to simply throw his hands in the air and say "well I don't have any responsibility here so who cares how this software is used".


That's the beauty of GPL. Linus doesn't get to decide whether or not people can add stuff to a Linux fork. And thus, Linus is not encumbered with being a moral gatekeeper. North Korea can add a driver to their own linux kernel that makes it easier to control torture devices if they want and Linus can reject that same driver from being merged to the upstream master copy if he wants.

Software that is morally gatekept by its creator is not free (as in freedom) software. I value freedom more than I value "preventing harm" or whatever you're worried about. The whole world is trending toward less freedom (of speech, etc.) in the name of "preventing harm" which I think is a big mistake.


> Linus can reject that same driver from being merged to the upstream master copy if he wants.

You really missed the point. Yes, anybody can maintain a fork, but Linus still has to make the judgement on whether to add such a driver to Linux. And making such a call is a big deal: it means he and the kernel developers are committing to maintaining it (to some degree), and it makes it easier for North Korea to keep it up to date and functioning. And Linus's tree is the tree people get their Linux kernel from; just because forks exist does not mean he doesn't need to think about what he adds to his. Linus is a "moral gatekeeper" whether he wants to be one or not, and in some cases he has to answer such questions; the only relevant part is how he chooses to answer them.

> Software that is morally gatekept by its creator is not free (as in freedom) software. I value freedom more than I value "preventing harm" or whatever you're worried about. The whole world is trending toward less freedom (of speech, etc.) in the name of "preventing harm" which I think is a big mistake.

These are two completely different things. There's a big difference between "I'm not going to prevent you from writing X" and "Here's X, it's only used by you and I wrote it or I'm keeping it up to date for you". That was the whole point of the example questions I gave. At some point it's not just some theoretical boogeyman, at some point you are aiding those who are doing things you don't consider OK. Just because they can fork Linux doesn't mean Linus has no responsibility for what he puts into his fork and develops.

> I value freedom more than I value "preventing harm" or whatever you're worried about.

And what's your point? Refusing to develop or maintain software for someone is expressing freedom; I would argue that very thing is what will protect freedom from various nefarious actors. Or do you suggest such people (like Linus) should just blindly develop whatever they're told without any concern for what the results may be?

The real irony here is that some of the software I'm talking about is software that is currently restricting the freedom of people across the world. And I can guarantee you that some of the people who go on Hacker News help develop it, and justify it via "It's not my responsibility, it's just code and I get paid to do it" - and that shouldn't be acceptable to them or us. My whole point is that they should have a higher standard and a realization of the effect the code they write has on other people, and I think we're actually in agreement on this part.


Your attitude disturbs me. You want the ability to yank the rug out from under individuals and corporations if they ever do something that runs afoul of your personal morality system. How do I know your sense of morality isn't capricious and subject to the whims of twitter mobs? You could destroy my whole business if you arbitrarily deem my current or past actions immoral.

For that reason, I will never, ever use software with a license like that. Therefore, I will never, ever create software with a license like that.


So if I just write "you're not allowed to do bad things with this" in the license agreement, I'm covered?


Is a vegetable farmer responsible for ensuring that people who eat the veggies only do good deeds?

I suppose if you really wanted to, with GMO seeds sold as a service, you could conceivably enforce that.


You're not entirely wrong, but looking at things this way will take you down the wrong path. The big question here is: who decides what is right and wrong? Is it the author of JSHint? Is it the random dev tasked with adding a linter to a project? Is it the lawyer who's vetting the organization's licenses?

Ultimately, these questions of how we translate morality into a system of rules that we use to punish people shouldn't be decided by a few technical experts, but by everyone, and we already have a system for that.


This is too simplistic and requires an objective sense of morality.

I have to accept that some of my work may be used for things I do not agree with. I do not believe this is a stain upon my honour.


Asking software authors to make moral considerations is less simple than saying "you're free to ignore them," precisely because morality is subjective. Being irresponsible is much easier than being responsible, after all.


There is some nuance. If I were writing software to control missiles then I had better be comfortable with how those missiles get used.

But if I were writing a compiler for a mainstream programming language, then I don't think I could be expected to lose sleep over the fact that _someone_ might be using it to write missile control systems.


It goes further. Unless you stop the project, the missile control system will be built by less qualified developers. If missiles keep hitting buses of ordinary citizens, you are at fault, because you could have prevented that by finishing the project.

Then you can extend this and say that because the missile system wasn't built (because you left), when another country bombs and kills kids, you are at fault.

You could even say that because the missile control system wasn't built, the next Hitler was born, and all of those deaths are your fault.

It's no wonder all presidents age so rapidly for those 4/8 years.


I think it's reasonable to expect people to refuse to write software that has only or mainly immoral uses.

But suggesting that a software developer is responsible for any and all possible uses of their software, whether or not they could have reasonably thought of all of those uses, is absurd.

Regardless, a tool is a tool. If something has, say 60% good use and 40% bad use, should it not be built? Where is the cutoff? 90%/10%? 99%/1%?

Should hammer manufacturers be responsible for murder done using their hammers?


A license is not a pact with a demon, known as the original developer. A license is a legal document to be enforced by many different courts across the world.


I disagree. If someone publishes something, or otherwise makes something available, then it is available. It is not the author's/inventor's job to decide who is going to use it and how; they only decide how they will use it themselves. That includes money, too: they can charge money to provide a copy to someone else (or refuse to provide a copy at all) if they want to, but it should not be their job to stop the customer from making additional copies, with or without any kind of payment, and with or without modification. I think copyright and patents are bad. Of course something can be used immorally, but that does not necessarily make the author immoral. If you write a program or a book or whatever and you think bad things will be made from it, then you might consider not writing it at all (if that is your choice), but you might also have a different consideration, such as that good things will be made from it, or whatever other consideration it might be.


This makes no sense. If I write a todo list app, I bear no responsibility for what people put on their list.


There's a difference between writing a spreadsheet program that ends up being used in a concentration camp and writing the AI that decides who is sub-human.

Neutrality doesn't mean much, although I agree that, as an industry, software can be very naive ethically.



