Usefulness of the "not in" operator (python.org)
52 points by llambda on Dec 29, 2011 | hide | past | favorite | 15 comments



Separate from this individual post, I'm not sure I buy the original claim that led to this (that "x not in y" is worth having in a language when it's identical to "not x in y").

They use a veiled slippery-slope argument: "if we did that, we might as well use not and >=". Well, no: when you say something is less, you don't say "not greater than or equal"; in English those two sentences sound different even if their net meaning is the same. "not ==" instead of "!=" would be awful, not because "not ==" is inherently bad looking, but because precedence levels would force you to use parentheses every single time; "in" doesn't suffer from this problem since it has higher precedence than "not". With "!=", you avoid five or seven tangled characters with two simple ones. With "not in", you are adding something to the language so you can type five simple characters instead of the same five simple characters in a slightly different order.
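The precedence point above can be checked directly against CPython's own parser (a small sketch using only the standard library; the expression strings are just illustrative):

```python
import ast

# `in` is a comparison operator, so it binds tighter than `not`:
# "not x in y" always parses as not (x in y), never as (not x) in y.
tree = ast.parse("not x in y", mode="eval")
assert isinstance(tree.body, ast.UnaryOp)          # outermost node: the `not`
assert isinstance(tree.body.operand, ast.Compare)  # inner node: x in y
```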

You could just as easily go the opposite direction and say: if we are going to have a "not in" operator, why not have an "unless" that is equivalent to "if not", or an "until" keyword that is the same as "while not"?

Is there any legitimate claim to be made about why "until" in VB is widely accepted to be awful while "not in" in Python is beautiful?


I personally find "not in" to be a bit more readable. If I say:

    if not x in y:
        ...
Do I mean:

    if (not x) in y:
        ...
Or:

    if not (x in y):
        ...
Same with "is not".
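To make the two readings concrete, here is a small sketch with illustrative values (the variable names are just for demonstration):

```python
x = False
y = [False, True]

print((not x) in y)   # True  -- (not False) is True, and True is in y
print(not (x in y))   # False -- x is in y, so the negation is False
print(not x in y)     # False -- parses the same as not (x in y)
print(x not in y)     # False -- same result, but reads unambiguously
```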


"that 'x not in y' is worth having in a language when it's identical to 'not x in y'"

It's not identical. Their effects are identical -- which is a different thing.

With Turing completeness you could get by with even less, but that's not the point.

The point is: "x not in y" is more natural -- and Python is praised for its ability to read like natural language/pseudocode (which it doesn't always achieve, but this is one case where it does).

We say: "if my tie is not in the suitcase, do something".

We don't say: "if not is my tie in the suitcase, do something" (except Yoda).


The question I am raising is not whether "x not in y" is more natural looking than "not x in y"; I completely, 100% accept that it is. My question is why people consider it so sufficiently natural looking that it demands its own syntax, when "unless x:" is considered abominable.

You didn't respond to what I wrote. I said "until and unless are just as much more natural, compared to the way you currently implement them, as 'x not in y' is compared to 'not x in y'", and you said "'x not in y' is natural!". That doesn't show my claim is untrue; to argue against it, you would have to claim that people don't say "until you are done, do the next thing", or "do the next thing until you are done", or "bake cookies until done". Python has no do-while construct, much less do-until.

So again: why is "not in" sufficiently important to be a language feature, while do-until is hideous and unnatural?
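For what it's worth, the standard Python workaround for the missing do-until is a "while True" loop with a break, which makes the "body runs at least once, test comes after" shape explicit (the function and parameter names here are just illustrative):

```python
def bake_until_done(is_done):
    # Emulates "do ... until done": the body always runs at least once,
    # and the termination test happens after the body, not before it.
    batches = 0
    while True:
        batches += 1          # the "do" part
        if is_done(batches):  # the "until" part
            break
    return batches

# "Bake cookies until three batches are done":
print(bake_until_done(lambda n: n >= 3))  # 3
```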


This is a pretty cool demonstration, but it should go without saying that all computers are built, fundamentally, to be "functionally complete." (http://www.cs.umd.edu/class/sum2003/cmsc311/Notes/Comb/comp....)

The NAND and NOR functions are the best-known single-operator functionally complete sets -- that is, all Boolean logic can be derived from either one of those Boolean functions alone.
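As a quick illustration of functional completeness, here is NOT, AND, and OR derived from NAND alone, checked exhaustively over both truth values (the helper names are my own):

```python
def nand(a, b):
    return not (a and b)

def not_(a):
    return nand(a, a)              # NAND(a, a) == NOT a

def and_(a, b):
    return not_(nand(a, b))        # negating NAND recovers AND

def or_(a, b):
    return nand(not_(a), not_(b))  # De Morgan: a OR b == NAND(NOT a, NOT b)

# Exhaustive check against Python's built-in Boolean operators:
for a in (False, True):
    assert not_(a) == (not a)
    for b in (False, True):
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
```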



Um... that's a basic topic covered in any CS discrete maths course. What's so special about this one?


Believe it or not, there exist some programmers who have never taken a course in discrete math. ;)


Not really. The argument isn't that "x not in y" isn't useful, but that it duplicates "not (x in y)" by adding an operator spelled "not in".

I like "not in" and use it lots but I wondered the same thing the first time I saw it.


That might be the argument elsewhere in the email thread, but it's not apparent from the actual page linked to. Without any other explanation, I'd go with the assumption that the poster thought that this particular email contained an interesting idea, and submitted it.


It was not covered in my discrete math course.


We didn't have a "CS discrete math" course. We took the same discrete math and algebraic structures courses as math majors, then we had a languages/theory-of-computation course that covered the CS-specific bits.


The demonstration reminds me of the link shared a few weeks ago about Ruby: http://news.ycombinator.com/item?id=3343205 Programming With Nothing


If you really want to go all out, check out The Elements of Computing Systems (here: http://www1.idc.ac.il/tecs/). The book walks you through defining basic gates in an HDL up to having created your own "fully featured" processor, programming language, and runtime.




