C wasn't bad for making that design decision in 1973, carving out syntactic space for octal literals. Programmers actually used octal notation then. It made sense in an environment where contemporary tech included the PDP-8's 12-bit words (four octal digits) and seven-segment numeric readouts (where octal is useful, since the letters hexadecimal needs can't be displayed and base-10 needs conversion logic).

JavaScript was bad for making that same decision in 1995, when almost nobody seriously used octal for anything, so a leading zero was vastly more likely to introduce a class of WTF bugs than to do anything useful.
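
For illustration, a minimal sketch of that bug class as sloppy-mode JavaScript still parses it (strict mode rejects the legacy octal form, and ES2015 added the explicit 0o prefix):

  console.log(010);        // 8  -- the leading zero silently switches the literal to octal
  console.log(010 === 10); // false
  console.log(019);        // 19 -- 9 isn't an octal digit, so the literal falls back to decimal
  console.log(0o10);       // 8  -- the unambiguous ES2015 spelling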



