
>The author isn't claiming to have had an informed opinion of types. They're saying that they found learning about types to be difficult and stressful. That's a real problem that I think people who understand types forget.

Respectfully, it's really hard for me to wrap my head around this mindset.

When I wrote my first line of code, almost 10 years ago, one of the very first concepts I learned was types. You know the basic OOP lessons where they teach that a Dog is a type of Mammal, which is a type of Animal, etc. Typing was ingrained in me before I wrote any meaningful code. So from my perspective (and the perspective of anyone who learned similarly), typing is basic, foundational knowledge, and programming without an understanding of types is akin to running before you can walk.
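To make that concrete, here's a minimal sketch of that textbook hierarchy in TypeScript (the class names are just the usual teaching example, nothing from a real codebase):

    class Animal { breathe() { console.log("breathing"); } }
    class Mammal extends Animal { nurse() { console.log("nursing"); } }
    class Dog extends Mammal { bark() { console.log("woof"); } }

    // A Dog is a Mammal is an Animal, so the assignment below type-checks.
    const pet: Animal = new Dog();
    pet.breathe();  // fine: every Animal can breathe
    // pet.bark();  // compile-time error: the declared type Animal has no bark()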

I think a lot of the developers learning nowadays are learning to code only in JS. And in JS you can do a lot without ever thinking about types. So they get stuck in the mentality that typing isn't necessary, and never put in the effort to learn types.

Learning about types isn't that hard. But if you've been conditioned into thinking that typing is esoteric (perhaps due to modern coding courses completely glossing over types), then you're likely going to find it very challenging.




I've been thinking about this (and will probably try to put together an essay about it at some point). I think that the types you learn in programming 101 and the advanced types (the kinds of things written about in type theory books and papers) are related, but different beasts.

If we drop down to the lowest level, basic-programming-types are about one thing: interpreting what is otherwise a pile of bits in memory. Without any notion of the difference, how would you tell characters from integers from floating points? There is nothing about the ones and zeroes to tell you. So the first thing you learn is to write something like `char a = 'b';`
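A rough TypeScript sketch of the same idea (the value is arbitrary): a DataView lets you read the exact same eight bytes as either a double or a 64-bit integer.

    // Write 1.5 as an IEEE-754 double, then reinterpret the identical bytes as an integer.
    const buf = new ArrayBuffer(8);
    const view = new DataView(buf);
    view.setFloat64(0, 1.5);
    console.log(view.getFloat64(0));   // 1.5
    console.log(view.getBigUint64(0)); // 4609434218613702656n -- same bits, different type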

Dependent types (and related type theory constructs) do get used for this in languages built from the ground up with that as a theoretical foundation, but they have their basis in ideas that are more related to formal verification and logic.

And that's where the divide comes in.

Using types to pick a codec for some bytes and using them to verify systems are very, very different mindsets.

And we simply call both "types".


That's an excellent point. "Do we interpret these 8 bytes as a 64-bit integer or a double-precision floating point value?" is "typing", but it's hugely different from defining a typeclass that describes a monad and then implementing it for a list type.


> Respectfully, its really hard for me to wrap my head around this mindset.

> When I wrote my first line of code, almost 10 years ago, one of the very first concepts I learned was types.

Well, that's great for you. When I wrote my first line of code, almost 20 years ago, the first thing I learned was string interpolation. Then came a long, somewhat embarrassing course in what happens when you interpolate strings that weren't of the same "type". If you're introduced to programming in any kind of self-taught way, and you start with dynamically typed languages, you're doing stringly typed stuff.
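For anyone who hasn't lived through it, a tiny sketch of the kind of surprise I mean (plain JS/TS semantics, made-up values):

    const id = "5";               // came in from user input as a string
    console.log(id + 1);          // "51" -- concatenation, not arithmetic
    console.log(Number(id) + 1);  // 6   -- only right once you convert explicitly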


A move at my alma mater to switch the introductory course from Java to Python was criticized on the same grounds. The students had a hard time in subsequent coursework.

It seems to be a lot harder to get people who learn dynamic typing first to learn static typing than the other way around.


It's dangerous to assume that the same number of people will continue programming whether you start them on dynamic or static typing. Maybe the static-first programs weed out more people?


The anecdotal information I heard was that, in general, fewer people adjusted successfully to the later coursework, in absolute numbers.

From a pedagogical standpoint, if the point of an undergrad education was to produce people with well-rounded abilities in their field, I don't know that switching introductory courses to Python was the best move.

Personally, I'm thankful that the first program I was enrolled in allowed me to "fail" fast, so that I quickly figured out that I shouldn't sink more time and energy into the program, and switched into CS. It's a lot easier to pivot as a freshman in your second semester than as a junior finishing the academic year. (And I knew people who took that long to switch majors, who either spent way more time and money or dropped out.)


> When I wrote my first line of code, almost 10 years ago, one of the very first concepts I learned was types.

Lucky for you. Not everyone did.

It's very hard to remember what it felt like to not know something and likewise hard to remember how hard it was to learn something once you do.


The opposite for me. Typing makes zero sense. I know what I'm doing with my variables so why do I need to specify a type? Why can't the computer figure this out? Isn't that what computers are good at?


Well, as the author of the code, you should be able to tell the function what type of value you expect (a string, an int, or a business entity, e.g. Book or Bookstore).

That way, if you later try to call the function passing an int when it expected a Book, you have an early warning system at compile time, not run time.
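A minimal sketch of that early warning in TypeScript, with a made-up Book type:

    interface Book { title: string; isbn: string; }

    function shelve(book: Book): void {
      console.log(`Shelving ${book.title}`);
    }

    shelve({ title: "Some Book", isbn: "000-0000000000" }); // placeholder values; this is fine
    // shelve(42); // compile-time error: number is not assignable to parameter of type Book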


The evidence from a whole lot of code written by a whole lot of people seems to indicate that, for the most part, a given developer actually may not know "what [they're] doing with [their] variables".

For everything else, there's type inference.


The computer can indeed figure this out. Most statically typed languages can infer types nowadays.
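For example, a quick TypeScript sketch where nothing needs an annotation:

    const count = 3;                          // inferred as number
    const names = ["Ada", "Grace"];           // inferred as string[]
    const lengths = names.map(n => n.length); // inferred as number[]
    // count.toUpperCase(); // still a compile-time error: toUpperCase does not exist on number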


Python has this: https://github.com/google/pytype

"Pytype checks and infers types for your Python code - without requiring type annotations."



