
Sigh, I knew using that contrived get_cached_user function with all those arguments would detract from my point, which is why I tried to couch it in disclaimers.

My point still stands though: nesting is bad. The takeaway from your comment is that you can sometimes solve the issue by factoring out smaller, more focused functions instead of returning early, as you've demonstrated.
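
Roughly what I have in mind (stand-in names only, since I'm not reproducing the original function here):

    def fetch_user(user_id):
        # one focused job: build/load the user; nothing about caching
        return {"id": user_id}

    def get_cached_user(cache, user_id):
        user = cache.get(user_id)
        if user is None:
            user = fetch_user(user_id)
            cache[user_id] = user
        return user
Each piece stays a level or two deep, and the caching concern is separated from the actual fetch, with no early returns needed.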




I have always liked the Linux kernel coding style in this regard:

Now, some people will claim that having 8-character indentations makes the code move too far to the right, and makes it hard to read on a 80-character terminal screen. The answer to that is that if you need more than 3 levels of indentation, you're screwed anyway, and should fix your program.

Yes, indentation != nested ifs, but in the end it's about the same. I don't mind a bit of nesting, but I think Linus is onto something with >3 meaning: you've done something wrong; step away from the keyboard and think about what you've done.


> The answer to that is that if you need more than 3 levels of indentation, you're screwed anyway, and should fix your program.

That particular cliché doesn't work if the essential complexity of data/algorithms you're working with implies more than three levels of decision-making. This can easily occur with graph-related operations like subgraph matching and graph rewriting, for example.

Splitting out the inner levels of such algorithms into separate functions doesn't gain you anything (unless the particular nesting you've got represents a subgraph that is meaningful in its own right out of context, of course), so deep nesting within a single function may be making the best of a fundamentally awkward modelling problem.
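
To make that concrete, here is a toy brute-force subgraph matcher (my own made-up example, not taken from any real codebase) where every level of nesting corresponds to a level of the problem itself:

    from itertools import permutations

    def find_embedding(pattern, host):
        """Graphs are adjacency-set dicts, e.g. {'a': {'b'}, 'b': set()}."""
        p_nodes = list(pattern)
        for candidate in permutations(host, len(p_nodes)):  # each candidate mapping
            mapping = dict(zip(p_nodes, candidate))
            ok = True
            for p_node in p_nodes:                          # each pattern node
                for p_neighbour in pattern[p_node]:         # each pattern edge
                    if mapping[p_neighbour] not in host[mapping[p_node]]:
                        ok = False
                        break
                if not ok:
                    break
            if ok:
                return mapping
        return None
The exponential search is beside the point; the point is that the mapping loop, the node loop, the edge loop, and the adjacency check are all essential levels of decision-making.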


>if you need more than 3 levels of indentation, you're screwed anyway, and should fix your program.

A single if statement in a class method would put the indentation at 3, meaning if you have something like this...

    class MyClass(object):
        ...
        def my_method(self, param):
            try:
                self.calculate(param)
            except MyException:
                if param == 100:
                    ...
                else:
                    ...
you're screwed? I don't see how you can set such a definitive measure of bad code.


Linux is written in C, so every "top-level" instruction is going to be one-level deep. In Python, the "top-level" is often the body of a method, which means it's two deep. So to apply Linus' advice to Python code, you can go four levels deep:

    class Foo(object):
        def my_method(self):
            try:
                if foo is bar:
                    self.quux()
            except:
                pass
This seems acceptable to me. If you want more conditionals, factor out the try block, and use polymorphism instead of conditionals. Rather than writing:

    class X(object):
        def quux(self):
            if self.foo is self.bar:
                self.quux_foo()
            else:
                self.quux_bar()
Write:

    class X(object):
        def quux(self):
            self.quux_bar()

    class FooIsBar(X):
        def quux(self):
            self.quux_foo()
No conditionals, and your intent is encoded in your class hierarchy instead of inside some random method.


Thanks for the thorough reply. I think his argument fits for C, but the linked article uses three different OO languages that would require a deeper "top-level", so I don't think it's fair to copy/paste his quote without any explanation or context.


FooIsBar doesn't need to be a subclass of X, does it? The polymorphism comes from the interface (implements `quux`), so inheritance doesn't seem relevant.

Feel free to correct me if I'm wrong. Still trying to learn the difference between inheritance and "evil" inheritance :)
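
Something like this is what I have in mind (toy names, obviously):

    class NotRelatedToX(object):
        def quux(self):
            print("quux, without inheriting from X")

    def run(thing):
        thing.quux()  # only cares that a `quux` method exists (duck typing)

    run(NotRelatedToX())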


Python doesn't have traits/roles, so it's hard to give a good example. You should build classes by composing them from smaller pieces of classes. There are two good ways to do this: delegation and role-based composition. Delegation is where you pass the constructor of your class an instance of another class, and your "foo" method becomes an alias to that instance's foo method. This allows you to expose the same interfaces/roles as the object you're "wrapping", but still lets you customize the behavior of each method call you choose to delegate. (The Law of Demeter protects you from other callers touching the non-delegated methods.)
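
A minimal sketch of what that looks like in Python (class names made up just to show the shape):

    class Engine(object):
        def start(self):
            print("engine started")

    class Car(object):
        def __init__(self, engine):
            self._engine = engine        # hand the constructor another object...

        def start(self):
            return self._engine.start()  # ...and alias the method to it, customizing
                                         # the delegated call here if you want

    Car(Engine()).start()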

Roles let you implement pieces of classes without being a full class yourself. A classic example is a "Comparable" role, that requires the consumer to implement a "cmp" function, and then provides lt, gt, and eq implemented in terms of "cmp". You mix this into your class that does cmp, and now it gets lt, gt, and eq for free. You also get the security of knowing that you're calling the right method; duck typing will treat a tree's bark and a dog's bark as the same actions, while roles will let you determine whether you have an object with a wooden outer layer, or an object that can talk like a dog.
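
The closest Python gets to that is a mixin; a rough sketch (note that nothing here checks that `cmp` is actually implemented, which a real role system would do for you):

    class Comparable(object):
        def __lt__(self, other):
            return self.cmp(other) < 0
        def __gt__(self, other):
            return self.cmp(other) > 0
        def __eq__(self, other):
            return self.cmp(other) == 0

    class Money(Comparable):       # made-up consumer class
        def __init__(self, cents):
            self.cents = cents
        def cmp(self, other):      # the one method the role requires
            return self.cents - other.cents

    print(Money(100) > Money(50))  # True; lt, gt and eq came "for free"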

Python's philosophy doesn't seem to push delegation or composition, so we use inheritance in the above example to stick with Python programmers' expectations. Just beware of invoking the wrong type of bark; duck typing is flexible, but it ain't safe.


In my experience, a lot of the lessons of kernel/driver code don't really carry over to more general-purpose app development in higher-level languages.


Your get_cached_user function is very authentic and is a great example of the kind of hairy code that a programmer should be looking for in his own code. There is no need to backpedal on this choice of example; it's fine.

You should go one step further and conclude that nesting is a warning sign often indicative of bad code, and that just fixing the nesting is not the solution to the problem.




