More than just the normally used: if err := foo(); err != nil { return err }
Large Go code bases are so much better (more maintainable) for "hyperliteral case-by-case error checking". My impression is that only people who haven't written large Go projects have this complaint.
Also, there is no reason you could not define a function like:
That seems to me like an awful example. But speaking of generics, exceptions and other things that Go lacks, like pattern matching, lazy values, currying and so on, in my opinion Go sucks because you can't abstract well over common patterns. To take your example as a use-case:
    object NonFatalError {
      def unapply(err: Throwable) = err match {
        case _: TimeoutException => Some(err)
        case _: IOException => Some(err)
        case _ => None
      }
    }

    def executeWithRetries[T](maxTries: Int)(callback: => T): T =
      try {
        callback
      }
      catch {
        case NonFatalError(error) if maxTries > 0 =>
          logger.warn(error)
          executeWithRetries(maxTries - 1)(callback)
        case other: Throwable =>
          throw other
      }
And usage:
    def funcMayErr(): Int = throw new TimeoutException

    val value = executeWithRetries(5) {
      funcMayErr()
    }
But there's more. Because with generics and exceptions you can actually wrap the whole result in an algebraic data-type that also has handy methods for dealing with failure (e.g. a monad), as in:
    Try(executeWithRetries(5)(funcMayErr)).map(x => x + 1).getOrElse(default)
Code has several audiences:
* those casually glancing over it to figure out what it does
* those actually trying to figure out what a given expression does, either because they are reviewing or debugging
* compilers are people too
Conciseness is often overvalued and pursued to the extreme, where the author first strains for the perfect one-liner, and the reader then has to work out whether that code actually does what is expected.
Composition is important, but I don't think I've found great real-world examples of composition that weren't either working only inside a tightly controlled code base or contrived just to prove a point.
Don't get me wrong, I love Scala/Haskell; I find playing with those constructs interesting and beautiful.
It's just that Go is a different thing: a modern approach of getting back to basics, a minimal toolset for just programming, a more or less direct translation of thought into instructions.
And it works very well: it's very easy to get things done quickly, and the produced code tends to be easily maintainable. It's easy to keep control over the memory footprint. The tooling is very mature (http://blog.golang.org/race-detector, gofmt for formatting and refactoring).
I think many of us want Go to be something it doesn't want to be and never will be. There is potential for a fast, simple, statically typed language that borrows good ideas from Lisp, ML and others (and does NOT end up as something like Scala).
I'm not sure how that code example demonstrates something other than error checking that has to be written case by case at the call site of any function that might return an error.
Well it does demonstrate that, and I think that is a good attribute of Go. (And if the caller returned the error, it could be handled by a previous caller too btw)
Not if you want to parameterize the exceptions. This kind of "matching" requires that you compare your exception "types" by exception "values", so you cannot attach parameterized information about what exactly failed... which key? Which URL?
Underpowered error handling (and I'm not advocating for exceptions, per se) and the lack of generics (with the resultant copypasta party and `interface{}` runtime casting, aka the return of `void *`) are real warts in an otherwise fine language.
And I'm not just theorizing: I spend my days writing a large, nontrivial system in Go.
I've used Haskell a lot before, and I'm not asking for no nulls or real ADTs (though I wouldn't complain), but generics + better typed errors would really help clean things up.
Meanwhile, a lot of us are just waiting for Rust...
It just depends on what you're trying to accomplish, but most use-cases can be accomplished without exceptions. The other use-cases often indicate bad design.
As for generics, you can get 90% of the way there with interfaces. The use of `interface{}`--while sometimes necessary--is often an indicator of bad design.
In large code bases, you often don't need (or care) to know what the underlying type of something is. For example, you shouldn't care whether an `io.Reader` is a TCP socket, a file or completely in-memory à la `io.Pipe()`.
There are times when type assertions are the best/only way to get something done, and that's why they're there, but those cases should be relatively infrequent.
Generics would make some things easier (Rust's implementation is quite nice), but it's not significantly impacting my productivity, and I certainly wouldn't consider switching languages just because Go lacks them.
Of course you can: `error` is an interface, any number of concrete types can satisfy it, and you can use type assertions or type switches to unbox (and use) the concrete type.