Interesting. How would you advocate actually gaining that knowledge then?
It seems like a student would need to know a significant amount of coding in order to learn those abstractions in an interactive manner.
And by learn them I mean learn them (not just following a tutorial); organizing the code for a fully working x86 system is no joke.
But a student with that level of skill probably doesn't need to learn the x86 architecture so intensively, they are probably already employable.
I am asking this seriously, by the way, not trying to nitpick. I'm trying to put together a free course based on the video game Turing Complete[1] but from what you're saying it sounds like it might not be very effective. (to be clear the goal is to teach programming, not Computer Engineering)
My working assumption throughout was that the people in a computer architecture class already had 1 or 2 semesters of other programming courses where they worked in a high-level language, and are looking to learn how computers work "closer to the hardware". And educational architectures create a completely false impression in this domain.
If I had to teach assembly programming to people who never programmed before, I'd _definitely_ not want to start with x86 assembly. I'd start by teaching them JavaScript so that they can program the computers that they themselves, and other people, actually use. At that point they'd be ready to learn computer architecture through an x86 deep dive, but would no longer need to learn it, since, as you said, they'd probably already be employable. But the same goes for learning LC-3, and much more so.
To be honest, my opinion is only that educational architectures are a poor way to learn what modern computers actually do, and while I think I have good reasons for holding that particular opinion, I don't have the breadth of experience to generalize this specific observation into an overarching theory about teaching programming and/or compsci. I hope your course will be a useful resource for many people, but I doubt listening to me will make it better: my experience does not generalize to the domain you're targeting.
Many years ago I taught CMPT215 - Intro to Computer Architecture. This was a second year course and taught MIPS assembler. There's a chain of follow-on courses culminating in a 400-level course where you implement a basic OS from scratch on x86 (when I took it) or ARM (present day, I believe).
MIPS was, I think, a decent compromise between an academic architecture and an industrial one. There was a decent amount of practical stuff the students had to deal with, without having to, e.g., do the dance from x86 real mode to protected mode.
One of the most memorable lectures, though, was on the second-last day. As it turned out, I had gotten through all of my material one day early. We had a review/Q&A session scheduled for the last day but needed 1h30 of filler. I ended up putting together a lecture on JVM bytecode. Throughout the course I would often use C <-> MIPS assembly in examples, because CMPT 214, which was a prerequisite, was a C course; all of their first-year work had been in Java. Anyway, we took real Java programs, disassembled them down to bytecode, and walked through it, showing how it's very similar to hardware assembly but with extra instructions for, e.g., placing objects on the stack or calling methods on objects. The class ended up going almost an hour over because everyone was oddly engaged and had a ton of questions.
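To make the comparison concrete, JVM bytecode is stack-based, and a toy interpreter captures the flavor of that walkthrough. The sketch below is not real JVM semantics; the opcode names mimic real ones (`iload`, `iadd`, `ireturn`) but the machine is heavily simplified, and the example program is only roughly what `javap -c` would show.

```javascript
// A toy stack machine in the spirit of JVM bytecode.
// Each instruction is [opcode, operand?]; `locals` is the local-variable array.
function run(code, locals) {
  const stack = [];
  for (const [op, arg] of code) {
    switch (op) {
      case "iload":   stack.push(locals[arg]); break;             // push a local
      case "iadd":    stack.push(stack.pop() + stack.pop()); break;
      case "imul":    stack.push(stack.pop() * stack.pop()); break;
      case "ireturn": return stack.pop();                         // pop and return
      default: throw new Error(`unknown opcode ${op}`);
    }
  }
}

// Roughly the bytecode for: static int f(int a, int b) { return (a + b) * a; }
const program = [
  ["iload", 0], ["iload", 1], ["iadd"],
  ["iload", 0], ["imul"],
  ["ireturn"],
];
console.log(run(program, [3, 4])); // → 21
```

The point of the exercise is that nothing here is conceptually different from hardware assembly: instructions, operands, a place to hold intermediate values.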
This is very interesting. I assume most of the students will never end up in a job that involves JVM bytecode or MIPS assembly, but I'm sure they'll recall your class with a smile.
The thing is that you are conflating CPU architectures with computer architectures; in academia they are treated as two different educational topics, for good reason.
The first one covers the lowest level of how logic gates are physically wired to produce a Turing-equivalent computing machine. For that purpose, having the simplest possible instruction set is a pedagogical must. It may also cover more advanced topics like parallel and/or segmented instruction pipelines, but they're described in the abstract, not as current state-of-the-art industry practice.
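As an aside on that "gates up to a machine" progression: the standard first step is showing that a single universal gate suffices. A minimal sketch in JavaScript (booleans standing in for wires; the gate and function names are just illustrative):

```javascript
// NAND is functionally complete: every other gate can be derived from it.
const nand = (a, b) => !(a && b);

const not = (a)    => nand(a, a);
const and = (a, b) => not(nand(a, b));
const or  = (a, b) => nand(not(a), not(b));
const xor = (a, b) => and(or(a, b), nand(a, b));

// A half adder -- the first step from logic gates toward arithmetic.
const halfAdder = (a, b) => ({ sum: xor(a, b), carry: and(a, b) });

console.log(halfAdder(true, true)); // sum: false, carry: true (1 + 1 = 10b)
```

Chaining half adders into full adders, registers, and a control unit is exactly the path such a course (or a game like Turing Complete) walks.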
Then, for actually learning how modern computers work you have another separate one-term course for whole machine architecture. There you learn about data and control buses, memory level abstractions, caching, networking, parallel processing... taking for granted a previous understanding of how the underlying electronics can be abstracted away.
I appreciate the candid response. I have noticed there is a class of very intelligent, well-educated adult learners who have nevertheless been unexposed to software education until adulthood who are now looking for a career change. I've found that there is a lot of difficulty initially with combining abstractions, i.e., "a variable holds a value, a function is also a value, a function is also a thing that sometimes takes values and sometimes returns values, therefore a variable can hold a function, a function can be passed as an argument to a function, and a function can return a function".
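That chain of abstractions fits in a few lines of code, which is part of why it's so easy for early-exposed programmers to forget it was ever hard. A minimal JavaScript sketch (all names illustrative):

```javascript
// A variable holds a value.
const n = 2;

// A function is also a value, so a variable can hold a function.
const double = (x) => x * 2;

// A function can be passed as an argument to another function.
const applyTwice = (f, x) => f(f(x));

// A function can return a function.
const makeAdder = (k) => (x) => x + k;

console.log(applyTwice(double, n)); // → 8
console.log(makeAdder(10)(5));      // → 15
```

Every line is trivially mechanical to someone who already writes software, and each one is a reasonable place for a newcomer to ask, "but what is the computer doing?"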
Reasonable adults might have reasonable questions about those facts, such as, "what does any of that have to do with a computer?"
To my embarrassment, I realized they were completely right and my early exposure to software made me overlook some extremely important context.
So for these adults, the expectation of struggling through a few semesters/years of JavaScript is not an optimal learning route.
My hope was that working from the logic gate level up would at least provide the intuition about the relationship between computers (Turing Machines, really, not modern computers) and software.
However, I think based on your excellent critique I will be sure to include a unit on how "educational architectures are very different from modern architectures and I may have ruined your brain by teaching you this" haha.
[1]: https://store.steampowered.com/app/1444480/Turing_Complete/