
I haven't watched the full talk, so let me know if this gets explored by Lamport, but...

Here's a thought: PL researchers seem to generally agree that typed languages are superior to untyped languages, yet programmers tend to prefer untyped languages to typed languages, to the point where Java and C++ have the fanciest type systems in common use, with ML being the closest thing to an academic language that gets significant use.

It seems intuitive to me that typed languages are bad for exploring algorithmic design, so could this be where the disconnect is? Most programmers do not know what algorithm their final product is going to have when they start, and dynamic languages are much more ergonomic when you are exploring the design space.

Certainly if I knew exactly what program I was going to write when I started, I could probably write it in haskell; indeed once I have a working program, translating it to haskell is often fairly mechanical. However much of my time is spent trialing algorithms that may or may not be a good fit for the problem I'm solving and in those cases I feel like haskell gets in my way, while lisp gets out of my way.

I don't know if there is any practical takeaway from this, but it does seem to at least give an explanation for the divide between PL theory and practice.




> programmers tend to prefer untyped (dynamic?) languages to typed languages

I have not found this in practice. I have found that some developers prefer a dynamically typed language for certain situations, like rapid prototyping. However when maintaining code bases, I've found that developers often bemoan the lack of static types to assist in small modifications to code they haven't touched in a while.

Personally I love the ability of static, declared types to organize assumptions about code; I also love the ability to do terrible things to types to shoehorn poorly designed third party libraries into my codebase. I don't see it as a developer preference so much as dependent on the problem domain.


I agree.

I recently tried to port an ~11k SLOC object-oriented Perl program from GTK2 to GTK3. The lack of types was maddening. I would have some simple goal, like "find the path of $image." I would immediately run into a roadblock: what type is $image? Is it a string? A file handle? A Gnome2::File?

So I trace the variable back. It's a parameter, so I look at one of the places that call it. That call site passes a global variable. Aha! I go to where it's initialized. It's initially set to undef. Dead end.

I couldn't help but think, "I wish I was programming in Java right now." That was a new low for me.


For me it's typically weighted less on the type system and more on the fact that I don't need to wait for a compiler or use any fancy tools to get a quick idea realized.


I’ve been developing with compiled languages since the early 90s and even with a computer of that era, compiling within a decent IDE hasn’t slowed me down.


It's not some hard rule. Just pointing out that a lot of these decisions can have very little to do with the type system.


If you remove a lot of the cruft and redundancy and add memory management, a statically typed language with fast compiles can have some of the benefits of a dynamic language.

If I could have a dynamic language and static typing, that would be the best of both worlds. However, gradual typing doesn't seem to have caught on.


Gradual typing has a strong tendency to turn into never-typing because when you've got it running there is no real point to going back and adding types. Which divides people back into those who want dynamic typing and those who want static typing.


Tracing JIT VMs already gather information on what the types actually are at runtime. I wish JIT VMs running server software would record that information so it could then be applied back automatically.


The JIT VMs gather that information, but then are careful to hide all of the typed calls behind run-time checks for the validity of the types. Failing to do so is an invitation to create bugs on rare use cases that didn't happen to be in your run. Making those code changes either results in duplicated auto-generated code that nobody looks at, or results in a source of possible bugs.

To give a concrete example, suppose that you have a weighted graph whose weights are integers. Code to find a Minimum Spanning Tree will always have an integer for the weight. Great. But if you add that type annotation, you'll run into trouble when someone checks whether the previously found MST is unique by calling the same algorithm again with (weight, edges_in_common_with_prev_mst) pairs as the weights. (This finds the MST that shares as few edges as possible with the original, so if there is more than one MST you'll find a different one.) Those ordered pairs were just fine in the original code, but don't fit the type annotation.
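A rough Python sketch of that scenario (the Kruskal-style helper and the edge format are my own assumptions, not from the comment above): the code only ever compares weights, so plain integers and (weight, shared_edges) pairs both flow through unchanged, which is exactly the reuse an "int" annotation on the weight would forbid.

    # Hypothetical Kruskal-style MST; edges are (weight, u, v) and only
    # ordering is required of the weights.
    def minimum_spanning_tree(edges, num_nodes):
        parent = list(range(num_nodes))

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x

        mst = []
        for w, u, v in sorted(edges):   # sorted() only needs weights to be orderable
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv
                mst.append((w, u, v))
        return mst

    # Integer weights:
    edges = [(1, 0, 1), (2, 1, 2), (3, 0, 2)]
    first = minimum_spanning_tree(edges, 3)

    # Re-run with (weight, edges_in_common_with_prev_mst) pairs as weights;
    # an "int" annotation on the weight would have ruled this out.
    prev = {(u, v) for _, u, v in first}
    reweighted = [((w, 1 if (u, v) in prev else 0), u, v) for w, u, v in edges]
    second = minimum_spanning_tree(reweighted, 3)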


> The JIT VMs gather that information, but then are careful to hide all of the typed calls behind run-time checks for the validity of the types. Failing to do so is an invitation to create bugs on rare use cases that didn't happen to be in your run. Making those code changes either results in duplicated auto-generated code that nobody looks at, or results in a source of possible bugs.

Of course, you want to use data correctly. There was a study of server code in Python that found that lots of playing fast and loose with type information happens just after startup, but then types settle down. If the initialization can be isolated in certain parts of the code base, it would make such data easier to use.


With C# you can mix dynamic and static to some degree. In some cases, like parsing JSON, dynamic code can be much shorter. But for maintainability the dynamic code parts should be kept localized and to a minimum.


Works pretty well in typescript, too.

I don't find myself actually using gradual typing though. Not very often at least. Even in prototypical stages of an application, I usually just put together the types of what I currently have, even if I think they will change later.


Typescript is pretty similar to C# from what I can tell. Not a surprise considering that the designer is the same.

I have found it very useful for things like translating incoming JSON or XML to the typed world. As long as the input data is in a known format, dynamic code is much shorter and more concise.


I think this fits squarely into the “rapid prototyping” scenario. Perhaps if you had to design your software to very tight constraints you might find static typing to be a productivity boost.


Vote for Typed languages here, every day of the week. Why introduce yet another (and unnecessary) sub-game of "find the type"? Keep It Simple!


>yet programmers tend to prefer untyped languages to typed languages

The more strongly typed a language is, the more it forces you to think about the big picture and architect your entire application in a way that will map nicely to the type system. For instance, with a language like Rust, which is both strongly typed and integrates concepts like lifetimes into the type system, you need to think long and hard early on about how your "data" and your "processes" are going to interact to figure out a way to map everything cleanly.

Doing this correctly requires a lot of experience. You need to be able to project quite accurately what your project is going to look like (lest you end up refactoring everything a few weeks from now or, worse, lose a lot of time fighting the type system instead of leaning on it) and you need a deep understanding of your tools to figure out the best way to model it and the tradeoffs involved.

If you're a beginner it can seem overwhelming and maybe even counter-productive: let me code and stop bothering me about integer-to-string conversions!

At this point I'm tempted to draw the conclusion that "dynamic languages are for n00bs and static typing is for the 31337" but my self-awareness douche alarm is starting to beep very hard, so I'm going to deviate and concede that many brilliant programmers seem to favor dynamic languages, so clearly some thrive in this environment. So instead I'm going to draw a weaker conclusion: while competent programmers may or may not prefer dynamic languages, beginners will almost certainly favor the more forgiving nature of dynamically typed languages. That biases the general population towards dynamic languages (and especially the population that cares about engaging in language wars in the first place. We experienced coders know that the only worthwhile debate is tabs vs. spaces. It's spaces, by the way).


I believe that Steve Yegge got it right in https://plus.google.com/110981030061712822816/posts/KaSKeg4v... and it is all about your political leaning. Or more precisely, it is about how much ceremony you are willing to go through for some promised benefit. If you're willing to go through a lot of ceremony, you're a software conservative. If you don't see a point to the ceremony (either because it gets in your way, or you don't believe that the promised benefit is that big), then you're a software liberal. Which you are likely to be is a function of both your personality and the kind of software you write.

It really isn't about competence. Competent programmers can fall into either camp. However there is a strong tendency for people to conclude that people who disagree with them are incompetent. As you just did. And then backed up to realize that there are good people who disagree with you.

I would also disagree with your weaker conclusion. In general bad programmers agree with whatever role-model they have most recently imprinted on. If they've picked up a Java book that talks about typing, they assume that it is right and types are worthwhile. If they pick up a PHP book, then vice versa. The current popularity of Javascript and Python mean that a lot are going to be on the dynamic side, but far from all.


> The more strongly typed a language is the more it forces you to think about the big picture

This is a good point. A straightforward corollary is that the size of the picture can be drastically different for beginners and experts.

One would imagine that beginners work on more small toy projects than experts. In such cases the protections that static languages offer can seem less valuable, and the friction more painful.


I think most of the dynamic/static bias depends on the specific experiences of your problem domain.

If you have a lot of types of data to model (layers of records, sequences, indirection, etc.), dynamic types make it easy to start bodging something together, but result in write-only code. So you end up wanting additional structure and definition - maybe not immediately, but just after prototyping is done and you have a first working example.

If you're just applying a routine algorithm that ultimately does a round trip from SQL, you don't need that additional assurance. The database is already doing the important part of the work.

If you have simple data but it needs to go as fast as possible, you end up wanting to work at a low level, and then the machine size of the data becomes important - so you end up with a static types bias.


Most of the oldies I know prefer static typing.

A lot of the new people use dynamic types because that's all they know.


This. I'm an oldie and I prefer static typing. Heck, I can't even get my head around dynamically typed languages. And for any non-trivial, production-level software I strongly believe the choice of programming language has no bearing on productivity. I.e., a team of Java experts will be about as productive on a Java stack as a team of Python experts on a Python stack.

A big reason for the recent shift towards dynamic languages, in my opinion, is what's taught in college nowadays. I believe it's mostly Python or the like. In my college days (late 1990s) it was all statically typed languages (Fortran, COBOL, C, Pascal, etc.). So that's how my generation ended up learning to model a problem and solution in statically typed languages, and we have a natural preference towards them.


I agree. People obsess about the choice of programming language, insisting that months of study are needed to choose. I just don't see it except for very specialized tools or applications.

That said, I find maintaining an old, large application written in an untyped language pretty miserable.


I've had equal misery both typed and untyped on large, old, legacy stuff.

Maybe programming just sucks.


At least you don't have to read the method to figure out the input and output types. And the IDE can help you much more. I find these things big when trying to modify some massive codebase I don't know inside and out.


True. I started coding a couple years ago and used a statically typed language for the first time a couple months ago. Definitely don't want to go back.


Care to define "old"? I'm 38, and the programmers I know that are older than me don't seem to have a significantly different spread compared to younger (excluding 1-language programmers in both cases).

Maybe a bit less python and a bit more perl/awk/apl/smalltalk, but they still seem to choose untyped languages over typed languages when they are free to pick the language.


Only until they have to refactor an older codebase of some size...


That's when they become old people


They who? Old or new?


"A lot of the new people"


But that's only if they also got some experience working with static typing. Without that, they won't see the difference in difficulty, as they have no reference to compare to.


Feeling the pain would already be a good start :-). With a lot of JavaScript projects being very short-term, a lot of people never get the experience of maintaining a code base for years.


Ah, thank you for disambiguating the pronoun 'they' whose reference wasn't clear as you provided no antecedent.

EDIT: Heh, kind of ironic really. A pronoun without an antecedent is like a variable in a dynamic programming language whose context isn't clear.


I guess I am a dynamic writer but a strongly typed programmer :-)


Lamport would argue that if you're debating typed vs. untyped PLs, you're already missing the point, as all programming languages are necessarily[1] at a level that's too low for system/algorithmic thinking, and if you're working in a language appropriate to thinking about algorithms/systems, then the considerations surrounding typing are quite different from those pertaining to PLs (TLA+ happens to be untyped, but it's not a PL by any means).

[1]: Due to the requirement that they be efficiently compiled/interpreted.


As I understand it, TLA may technically be untyped but that's simply because you are supposed to implement your own more granular type checking as part of your spec.


It's true that TLA+ doesn't rely on types to specify properties, but it's still untyped. You could implement similar arbitrary properties in any programming language as predicates.
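To make that concrete, a hedged Python sketch of the same idea (names invented for the example): a "type" here is just an ordinary predicate over values, checked like any other invariant rather than enforced by a static checker, roughly the role a TypeOK invariant plays in a TLA+ spec.

    # Hypothetical sketch: "types" expressed as plain predicates.
    def is_nat(x):
        return isinstance(x, int) and x >= 0

    def is_queue_of_nats(q):
        return isinstance(q, list) and all(is_nat(x) for x in q)

    def type_ok(state):
        # An ordinary boolean property, checked wherever we like.
        return is_queue_of_nats(state["queue"]) and is_nat(state["count"])

    state = {"queue": [1, 2, 3], "count": 3}
    assert type_ok(state)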


I think a lot of the disconnect comes from "static typing" meaning different things to different people. When some people say "static typing" they are thinking of a language like Haskell with a powerful type system. However, when most programmers say "static typing" they are thinking of a language like Java or C++ with a rather underpowered type system. Programmers who prefer dynamic typing are largely deciding that they would rather work in a language without a static type system than with a crappy one.


The correct distinction is static vs. dynamic typing on the one hand, and strong vs. weak vs. no typing on the other.


I think the point is way more complex than this. Some type systems are more expressive than others, some type systems require more ceremony than others, and some type systems are more constraining than others.

The real problem with all those discussions about types (and why I think nobody should weigh in with an opinion unless they have tried something at least as good as Haskell) is that the most used static languages have the least constraining, most ceremonious and least expressive type systems around. Developers gain nearly nothing from their type systems, but they are incredibly demanding of upkeep.


I find that ML (Ocaml and F#) does not get in my way (Haskell being the exception due to the absence of quick and dirty mutability) while dynamic typing (in python) does hinder me when I depend on external libraries.


> Haskell being the exception due to the absence of quick and dirty mutability

Haskell has ST for mutable local variables, and for the truly "quick and dirty" cases there's always unsafePerformIO. What is it you're trying to express that wouldn't be handled by either of these options?


> Haskell being the exception due to the absence of quick and dirty mutability

I don't think that's "getting in your way" so much as asking you to learn pure, immutable data structures. Which is good even setting aside concerns about laziness.


Which programmers?!

With the exception of JavaScript, I only use dynamic languages for throwaway scripts.

I had lots of fun with Lisp and Smalltalk, but can't imagine using them with the team sizes I usually work with.


> I had lots of fun with Lisp and Smalltalk, but can't imagine using them with the team sizes I usually work with.

My impression is that it correlates less with team size than with average member skill.

Well-written and logically organized Lisp code shouldn't be any harder to maintain just because the team size has increased. Rather, Lisp provides too much power (as in, macro system, compiler easily available at runtime, etc). These can be easily abused and wreck a codebase.

I have chosen Golang for most new development in the group I am in, basically because it forces a straitjacket onto people, even down to things like code formatting. I got tired of asking people to write unit tests for their Python code – the fact that people can get away with throwing Python code with zero tests into production baffles me; it's like taking C code to production without compiling it.

So at least I got a compiler enforcing some level of code correctness and some uniformity in code conventions. And the fact that no one is overriding anything they should not...


> Certainly if I knew exactly what program I was going to write when I started, I could probably write it in haskell;

If you don't know what program you want from the start, it's yet another reason to write in Haskell.

Haskell programs are much easier to change than programs in any dynamically typed language. Each time you change your mind in Python, you basically have to rewrite your entire program, and if that happens often enough (2 or 3 times are already enough) you will still end up with a mess because of all the changes.


Should Your Specification Language Be Typed? Leslie Lamport & Lawrence Paulson, ACM TOPLAS, 21, 502-526, 1999: http://lamport.azurewebsites.net/pubs/pubs.html#lamport-type...


Personally, while I strictly use dynamically-typed languages and wouldn't want to do all that typecast typing in Java, I still use type hints for the IDE—in PHP, Python and JS. Because hints give valuable information and enable IDE features that can be directly used by me in my work, as opposed to typecasts which make me toil for the compiler.

So for me, the divide is that the tools shouldn't make me do work that they can do themselves, namely type inference. And I'm baffled that Java people just outright decided they don't want type inference. They chose to type things that the computer can calculate for them! It makes no sense.

Granted, the usefulness of the work I still choose to do scales with the longevity and involvedness of the code, so I skip the hints in scripts that will only live for half an hour.
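For what it's worth, a minimal Python illustration of that workflow (the names are made up for the example): the optional annotations feed the IDE and external checkers, while the interpreter itself never enforces them.

    from dataclasses import dataclass

    @dataclass
    class Image:
        path: str
        width: int
        height: int

    def thumbnail_name(image: Image, max_side: int = 128) -> str:
        # The annotations are hints only: they drive completion and tools
        # like mypy, but Python ignores them at runtime.
        return f"{image.path}.{max_side}.thumb"

    print(thumbnail_name(Image("cat.png", 800, 600)))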


Java already had limited generic type inference, and Java 10 just introduced local variable type inference: https://developer.oracle.com/java/jdk-10-local-variable-type...

That being said the only thing I've run on jre10 so far has been Scala, so ymmv.


I think the crufty type systems and semantics of Java (pre-1.7) and C++ did drive a lot of programmers away from static typing, but most recently popular languages (Go, Rust, Scala, Typescript, Kotlin, Swift) have been statically typed.


I don't think it is fair to say that TypeScript is not in common use, and TypeScript has a far more sophisticated type system than Java or C++, particularly since it uses structural typing rather than nominal typing.


I was thinking the same thing about TypeScript. It's certainly catching up, and it's a great way of having a gradual shift from dynamic to static types as your requirements crystallize.


There are very few untyped languages.

* The distinction between dynamically typed and statically typed is clear.

* Weak versus strong typing has a less clear definition. Usually weakly typed means many implicit conversions between data types and representations.

Most programmers like the late binding approach. That is easy to do with dynamic typing. Adding static typing or type inference for late binding is more difficult.

Programmers usually hate the "weak" typing aspects of their dynamically typed languages when they are chosen wrong. JavaScript is a good example of choosing implicit type conversions in a way that leads to confusion.
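A tiny illustration of that weak/strong distinction, using Python as the strongly-but-dynamically typed counterexample (my example, not from the comment above):

    # Dynamically typed, but strongly typed: no implicit string/number coercion.
    try:
        result = "1" + 1        # JavaScript quietly evaluates '1' + 1 to "11"
    except TypeError as err:
        print("refused:", err)  # Python raises TypeError instead of guessing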


> The distinction between dynamically typed and statically typed is clear.

Not so clear as you might think. For example, it's relatively easy to construct a typed version of the untyped lambda calculus using isorecursive types.


That's a completely abstract example, not relevant to actual practice.

It's hard for an actual programmer to make that mistake.


It's a concrete example, but I understand your point that "in practice" we know what dynamic vs. static typing means. Personally, I think a lot of that just comes down to marketing, though :)


There are many PL researchers in the dynamic typing camp; “generally” is hardly justified.


I feel like all the hype for dynamic languages has led to a lot of backlash in the opposite direction.


For me it was writing unit tests for Python where you're pretty much manually checking types. Or running slow one-off processing scripts that fail at the last step because of an issue that static types would have caught. Writing the same thing in Go usually takes the same amount of time, runs 5x as fast, and works the first time.


So, use python and static type annotations for the best of both worlds?
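A hedged sketch of what that could look like (function and values made up for the example): annotate the signature and let an external checker such as mypy flag the mismatch before the slow script ever runs, instead of hand-writing type checks in tests.

    def total_bytes(sizes: list[int]) -> int:
        # mypy (or another external checker) verifies callers against this
        # signature; the interpreter itself still runs the code unchecked.
        return sum(sizes)

    print(total_bytes([10, 20, 30]))    # fine, and fine for mypy
    # total_bytes(["10", "20"])         # mypy: incompatible type; also fails at runtime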


The type system in C++ (with templates) is actually incredibly powerful.



