Back in 1970, no language had a 64-bit integer type. And it started with Unix, which was a skunkworks hobby project, so an attitude of "we'll solve it within the next 68 years" is perfectly reasonable.
They could have made it unsigned instead of signed, which would have pushed the overflow out to 2106, but I think even a 68-year horizon is more than most systems being built today have.
>They could have made it unsigned instead of signed
C actually didn't have unsigned integer types in the beginning. They were added years later, and not all at once. For example, the Unix V7 C compiler only had "unsigned int".
If anything, I imagine guys like Ritchie never thought we'd still be using a Unix-based system this far in the future. Back then OSes were a dime a dozen and the future of computing far too cloudy to predict.
>but I think a 68-year horizon is more than most systems being built today have.
That's a lot of time, especially considering Linux only broke into the mainstream around 1995. That's still 43 years to worry about this. Meanwhile, Microsoft broke into the mainstream around 1985, which gave us only 15 years to worry about Y2K.
> Back in 1970, no language had a 64-bit integer type.
It would be more accurate to say that "no language had a two-word integer type." 1960s CDC 6000-series machines had 60-bit words, and Maclisp got bignums sometime in late 1970 or early 1971.