
> I don't understand, what is wrong with extending "=" such that in the example you gave "if(x=3)" it stores 3 in x and then makes the comparison?

There are many theoretical and quasi-religious arguments you can make about this point, but I think the real reason is a lot simpler: in practice, allowing assignment during expression evaluation has led to tons and tons of bugs.

Weighed against the efficiency gain of avoiding an extra line of code, the number of times programmers have accidentally typed a single "=" instead of "==" in a conditional has empirically been a net negative for productivity.

This isn't a theoretical or philosophical point; it's one rooted in collective experience. That collective experience may not match your own, but honestly, for experienced developers it's such a minor issue that I'm fine giving up 0.001% of my productivity if it lets less experienced programmers gain more than that.
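
To put it in Python terms: an accidental "=" in a condition has always been a hard syntax error there, and PEP 572's ":=" keeps the mistake hard to make, since ":=" is not a plausible typo for "==". A minimal sketch:

    # An accidental assignment in a condition fails loudly in Python:
    #
    #     if x = 3:            # SyntaxError: invalid syntax
    #         ...
    #
    # The walrus operator makes assignment-in-expression deliberate:
    if (x := 3) == 3:
        print("assigned and compared:", x)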




I don't believe this. Sorry. It really isn't that complicated. If someone intends to compare and, instead, assigns by mistake, well, that code isn't going to do what they wanted it to do in the first place.

Yet, at a more fundamental level, there's nothing wrong with code like this (taken from the PEP):

    reductor = dispatch_table.get(cls)
    if reductor:
        rv = reductor(x)
    else:
        reductor = getattr(x, "__reduce_ex__", None)
        if reductor:
            rv = reductor(4)
        else:
            reductor = getattr(x, "__reduce__", None)
            if reductor:
                rv = reductor()
            else:
                raise Error(
                    "un(deep)copyable object of type %s" % cls)

What is wrong with this code that desperately needed fixing through the introduction of a new operator?
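
For reference, here is (roughly) how the PEP itself rewrites that snippet with the proposed operator, reusing the same names as above; judge for yourself whether the flattening justifies new syntax:

    # PEP 572's rewrite of the copy.py snippet, paraphrased:
    if reductor := dispatch_table.get(cls):
        rv = reductor(x)
    elif reductor := getattr(x, "__reduce_ex__", None):
        rv = reductor(4)
    elif reductor := getattr(x, "__reduce__", None):
        rv = reductor()
    else:
        raise Error("un(deep)copyable object of type %s" % cls)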

There's something in the PEP about programmers wanting to save lines of code. Really? The code at the microprocessor level is the same. Heck, if lines of code were so important I would still be programming in APL, where I could shrink a hundred-line C program into just a few characters on a single line.

I guess my problem is with the idea of fixing something that isn't broken while ignoring stuff that is just nonsensical. A simple example is the lack of pre- and post-increment/decrement (++/--); frankly, it's just silly. We are writing "x = x + 1" instead. The real irony is that every microprocessor I know of has increment and decrement instructions! I mean, it's hilarious: "x = x + 1" is likely to be compiled down to such an increment instruction, which is, in rough strokes, the machine-language equivalent of "++". One could not make this stuff up. Who are we fooling?
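
For what it's worth, Python doesn't merely lack "++"; it silently accepts it as something else, which is arguably worse:

    x = 5
    x = x + 1      # the only spellings Python offers: this, or x += 1
    y = ++x        # legal! but parses as +(+x): two unary plusses, no increment
    print(x, y)    # prints "6 6" -- the '++' never incremented anything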


> I don't believe this. Sorry.

Your belief conflicts with Swift, which disallows assignment in if conditions for exactly this reason [0]; with GCC, which warns "suggest parentheses around assignment used as truth value" when compiling with `-Wall` [1]; and with the most popular JavaScript linter, whose no-cond-assign rule exists for the same reason [2].

The tricky part is that "that code isn't going to do what they wanted" isn't always immediately obvious. For example, when I googled "comparison assignment typo", this was one of the first results: https://gitlab.freedesktop.org/wayland/weston/commit/209e8f1... If the author expected that test to pass with ==, then = would introduce a subtle but devastating bug: someone could later break that function, and the unit tests would never catch it.
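
To sketch that failure mode in Python 3.8+ terms (the names here are hypothetical, not taken from the linked commit), an assignment where a comparison was intended makes the assertion vacuously true:

    def compute_width():          # hypothetical function under test
        return 100                # imagine this regressed and should return 200

    def test_resize():
        width = compute_width()
        # intended: assert width == 200
        assert (width := 200)     # typo: assigns 200, which is truthy,
                                  # so this passes no matter what the code does

    test_resize()                 # "passes" silently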

[0] https://docs.swift.org/swift-book/LanguageGuide/BasicOperato...

[1] https://gcc.gnu.org/ml/gcc/1998-07/msg00085.html

[2] https://eslint.org/docs/rules/no-cond-assign


> There's something in the PEP about programmers wanting to save lines of code. Really? The code at the microprocessor level is the same.

No, they don't want to save lines of code; they want to save the time it takes to understand code. Disallowing assignment inside expressions is one way to advance that.
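
The trade being debated is readability, and the PEP's own motivating examples are about removing duplication rather than saving lines; e.g. the loop-and-a-half pattern (a sketch, with a stand-in stream and a hypothetical process function):

    import io

    f = io.BytesIO(b"example data " * 1000)   # stand-in for a real file

    def process(chunk):                        # hypothetical consumer
        pass

    # before PEP 572: the read call and the loop-exit test live apart
    while True:
        chunk = f.read(8192)
        if not chunk:
            break
        process(chunk)

    f.seek(0)

    # with ':=': the condition states exactly what drives the loop
    while chunk := f.read(8192):
        process(chunk)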

But I'm not really understanding your bigger point here. Who cares about such minutiae? Is "x++" versus "x += 1" versus "x = x + 1" really all that different to the programmer? When have you proudly used the shorter version and actually made code better in some measurable way?

These programmer purity metrics are silly and counter-productive; we should instead focus on code quality and the speed of getting things done.



