
I remember reading about this issue from my C# days. But does all this stuff actually help write better programs?

I now work in a language that doesn't really support building "types" in this way, and I can't say I miss thinking about this sort of thing, nor do I consider its lack a problem. Types in this sense strike me as a boondoggle of overhead that gets in the way more than it helps me achieve the goal of whatever I'm writing.

That's a good question. I don't know the answer.

What language are you working in now? Some really don't support variance at all (Go, apparently, and C would be an obvious example), while many others support it but just don't enforce it (Python and all the other dynamically typed, duck-typed languages).
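To make the Python case concrete, here's a minimal sketch of what "supported but not enforced" looks like there (assuming mypy or a similar checker; the Box/Animal names are just made up for illustration): the typing module lets you declare a TypeVar as covariant and a checker will verify uses of it, but the runtime ignores it entirely.

    from typing import Generic, TypeVar

    T_co = TypeVar("T_co", covariant=True)

    class Box(Generic[T_co]):
        # A read-only container, declared covariant in its element type.
        def __init__(self, item: T_co) -> None:
            self._item = item
        def get(self) -> T_co:
            return self._item

    class Animal: ...
    class Dog(Animal): ...

    def feed(box: Box[Animal]) -> None:
        print(box.get())

    feed(Box(Dog()))   # fine for mypy: Box[Dog] is usable as Box[Animal]
    feed(Box("oops"))  # mypy rejects this, but it runs without complaint

Only the static checker catches the last call; nothing at runtime cares about the declared variance.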


As a language designer, I’ve been thinking about this sort of thing lately. It’s basically a tradeoff between how much power you give a programmer to enforce properties of their programs, and how much work you ask of them to do so.

More expressive type systems let you write programs that are more internally consistent (and, by proxy, more correct), but at the cost of more work to write the appropriate types, and to read and understand such types written by others.
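To make that tradeoff concrete, here's a small hypothetical Python sketch (Python 3.9+ for the list[...] annotations): the generic signature lets a checker prove that the callback, both inputs, and the result all agree with each other, but it is plainly more work to write and to read than the untyped version that does the same thing.

    from typing import Callable, TypeVar

    A = TypeVar("A")
    B = TypeVar("B")
    C = TypeVar("C")

    # Expressive: a checker can verify the callback, the input lists,
    # and the result element type are mutually consistent.
    def zip_with(f: Callable[[A, B], C], xs: list[A], ys: list[B]) -> list[C]:
        return [f(x, y) for x, y in zip(xs, ys)]

    # Cheap: says nothing, so nothing is checked.
    def zip_with_untyped(f, xs, ys):
        return [f(x, y) for x, y in zip(xs, ys)]

    lengths: list[int] = zip_with(lambda s, n: len(s) + n, ["a", "bc"], [1, 2])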

A lot of PL research is going into advanced static type systems, and I think that’s a good thing, but it doesn’t necessarily translate into better programmer ergonomics or productivity.

Maybe we just need new notations for expressing types (and code!)—notations that are more in line with how people naturally think about these things.



