I mean, in C and other low-level languages, unlike Python, it's generally assumed you have a basic understanding of the machine model and of the consequences of arithmetic on limited-width types.
If most people make mistakes, they should self-select into languages whose properties protect them from their ignorance. For example, I mostly program in Python, using longs, so I don't have to worry about overflow.
What kind of "long" are you referring to? Most programmers would read "long" as a C long (an int64), not a Python long (a bigint).
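To make the distinction concrete, here's a minimal sketch: Python's int (the old Python 2 "long") is arbitrary-precision and never overflows, while a C long on most 64-bit platforms is a signed 64-bit value that wraps (or, strictly, invokes undefined behavior on signed overflow). The `ctypes` truncation below stands in for C's behavior.

```python
import ctypes

# One past the largest signed 64-bit value (2**63 - 1).
big = 2**63

# Python ints are arbitrary-precision: this is just a bigger integer.
print(big)  # 9223372036854775808

# Forcing the same value through a 64-bit signed type, as C would
# store it, truncates to the low 64 bits and wraps negative.
wrapped = ctypes.c_int64(big).value
print(wrapped)  # -9223372036854775808
```

So "long" in the Python sense really does mean "don't worry about overflow"; "long" in the C sense means the opposite.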
Maybe people ought to self-select, but that would require the training or experience to recognize what they don't know. It's often the most ignorant people who believe they have the most expertise.