I don’t get this take. Is it so hard to understand that a computer operates on a giant array of bytes?
I think the hard thing to understand is that C’s pointer syntax is backwards (“declaration follows usage” is weird).
I also think understanding how arrays silently decay to pointers and how pointer arithmetic works in C is hard: ptr+1 is not address+1, but address+sizeof(*ptr)!
Pointers are not hard. C is just confusing, but happens to be the lingua franca for “high level” assembly.
> Is it so hard to understand that a computer operates on a giant array of bytes?
Beginner programming languages universally (since BASIC and Pascal) were designed to hide this fact. There's nothing in a beginning Python course that explains the true nature of computers. You learn about syntax, semantics, namespaces, data structures and libraries. But there's nothing that says, "a computer is endlessly incrementing a counter and executing what it finds where the counter points". And this is probably partly because of "go-to considered harmful", which posited that unstructured control flow (raw jumps, which are a fundamental fact of how computers actually work) is harmful to reasoning about programs.
It's probably objectively true. But a lack of go-to also keeps people from seeing a fundamental truth of the von Neumann architecture: data and instructions are indistinguishable. That may also make it difficult to explain GPU computing to students (because it is best understood by contrast with the von Neumann architecture).