Hacker News

That argument would make more sense if such a language were widely available, but in practice today it isn't, so we live in the universe of less ideal solutions. It also doesn't really respond to DJB's point: his case is that the downstream labor cost of compiler churn exceeds the actual return in performance gains from new features, and that a change in policy could give security-related code a more predictable target without requiring a whole new language or toolchain. For what it's worth, I think the better solution will end up being something like constant-time function annotations (rather than a halt on new compiler features), but I don't discount his view that, human nature aside, we might be better off focusing compiler development on correctness and stability.



> his case here is that the downstream labor cost of compiler churn exceeds the actual return in performance gains from new features

Yes, but his examples are about churn in code that makes assumptions that neither the language nor the compiler guarantees. It's not at all surprising that compiler upgrades break code that depends on coincidental properties of a particular compiler. You can't build your code on assumptions and then blame others when those assumptions turn out to be false. Then again, it's perhaps not too surprising that cryptographers would do this, since their entire field depends on unproven assumptions.
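To make that concrete, here's a sketch of the classic case (my own example, not from DJB's writeup): a branch-free byte comparison of the kind crypto code relies on. The C standard says nothing about its timing, so an optimizer is free to rewrite the loop into an early-exit form; the empty asm statement is a common GCC/Clang-specific hedge to discourage that, not a guarantee.

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* Compare len bytes without data-dependent branches: accumulate all
   differences with XOR/OR instead of returning early on a mismatch.
   The timing property is a coincidence of how compilers happen to
   translate this today, not something the language promises. */
static int ct_memeq(const uint8_t *a, const uint8_t *b, size_t len) {
    uint8_t diff = 0;
    for (size_t i = 0; i < len; i++)
        diff |= (uint8_t)(a[i] ^ b[i]);
    /* GCC/Clang extension: an empty asm "barrier" that forces diff
       through a register, making the loop harder to transform into a
       branchy early exit. Non-standard, best-effort only. */
    __asm__ volatile("" : "+r"(diff));
    return diff == 0; /* 1 if all bytes matched, 0 otherwise */
}
```

The functional behavior survives any compiler; it's only the timing side channel that rests on unstated assumptions, which is exactly the churn being argued about.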

A general policy change here makes no sense because most language users do not care about constant-time execution and would rather have their programs run as fast as possible.


I think this attitude is exactly what is driving his complaints. Most engineering work exists in the context of towering, teetering piles of legacy decisions, organizational cultures, partially specified problems, and uncertainty about the future. Put another way, "the implementation is the spec" and "everything is a remodel" are better mental models than spec-lawyering. I agree that relying on, say, the stability of the common set of compiler optimizations circa 2015 is a terrible solution, but I'm not convinced it's the wrong one in the short term. Are we really getting enough performance out of the work to justify the complexity? I don't know. It's also completely infeasible given the incentives at play: complexity and bugs are mostly externalities that, with some delay, burden users and customers.

Personally, I'm grateful the cryptographers do what they do; computers would be a lot less useful without their work.



