> Is it so hard to understand that a computer operates on a giant array of bytes?
Beginner programming languages have been designed to hide this fact pretty much universally, from BASIC and Pascal onward. There's nothing in a beginning Python course that explains the true nature of computers. You learn about syntax, semantics, namespaces, data structures and libraries. But there's nothing that says, "a computer is endlessly incrementing a counter and executing what it finds where the counter points". And this is probably partly because of "Go To Statement Considered Harmful", which argued that unstructured control flow, i.e. the raw jumps that are a fundamental fact of how computers actually work, is harmful to reasoning about programs.
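To make that concrete, here's a minimal sketch of the model being described: memory is one flat byte array, and the machine does nothing but fetch the byte the program counter points at, act on it, and move on. The opcodes here (0=halt, 1=load, 2=add, 3=print) are invented for this illustration, not taken from any real instruction set.

```python
# A toy von Neumann machine: memory is a flat byte array, and the CPU
# just keeps fetching whatever byte the program counter points at.
# Opcodes are made up for this sketch: 0=halt, 1=load imm, 2=add imm, 3=print.
def run(memory: bytearray) -> None:
    pc = 0          # program counter
    acc = 0         # a single accumulator register
    while True:
        op = memory[pc]              # fetch the byte at the counter
        if op == 0:                  # HALT
            break
        elif op == 1:                # LOAD n
            acc = memory[pc + 1]
            pc += 2
        elif op == 2:                # ADD n
            acc += memory[pc + 1]
            pc += 2
        elif op == 3:                # PRINT
            print(acc)
            pc += 1
        else:
            raise ValueError(f"unknown opcode {op} at {pc}")

program = bytearray([1, 40, 2, 2, 3, 0])   # LOAD 40, ADD 2, PRINT, HALT
run(program)                                # prints 42
```

The whole machine is the loop: increment the counter, execute what's there, repeat.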
That's probably objectively true. But banishing go-to also keeps people from seeing a fundamental truth of the Von Neumann architecture: data and instructions live in the same memory and are indistinguishable from one another. That in turn may make it difficult to explain GPU computing to students, because GPU computing is best understood by contrasting it with the Von Neumann architecture.
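A hedged illustration of that point, using the same kind of toy machine as above: nothing marks a byte as "code" or "data"; whether it is an instruction depends only on whether the program counter ever reaches it. With an invented STORE opcode, a program can write a byte that it later executes.

```python
# Same flat memory, but now the program can write into it — including into
# the bytes it is about to execute. Opcodes (invented for this sketch):
# 0=halt, 1=load imm, 3=print, 4=store acc at addr.
def run(memory: bytearray) -> None:
    pc = 0
    acc = 0
    while True:
        op = memory[pc]
        if op == 0:                    # HALT
            break
        elif op == 1:                  # LOAD n
            acc = memory[pc + 1]
            pc += 2
        elif op == 3:                  # PRINT
            print(acc)
            pc += 1
        elif op == 4:                  # STORE addr: write acc into memory
            memory[memory[pc + 1]] = acc
            pc += 2
        else:
            raise ValueError(f"unknown opcode {op} at {pc}")

# Byte 6 starts out as "data" (a 0, i.e. HALT), but the program stores a 3
# (PRINT) there before the counter reaches it, so it runs as an instruction.
program = bytearray([1, 3, 4, 6, 1, 99, 0, 0])
run(program)   # prints 99: the instruction at offset 6 was written at runtime
```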