(Disclaimer: Most of what I say here has caveats and qualifications – hard to capture full nuance in a comment's space.)
One of the distinguishing features of computers is easy simulation. Humans learn well by direct, tight interaction with a system – poking at a thing to see how it works. Tight feedback loops are how we quickly build intuitions.
Explorable Explanations [0] are a great extant example of what I mean, but we can do more... Imagine a "textbook" constructed around a well-built simulator (extant example: Earth Primer [1]). Sections of the textbook would present the simulator configured in a pre-set state, with various simplifications and initial conditions. Students can poke and prod these simulators, reset them, test out new states, and answer questions. Assignments could be on the order of "Given Sim[Initial Conditions], figure out what gets you to Sim[Desired State]".
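To make that concrete, here's a toy sketch of the shape such an assignment could take (every name here is invented for illustration, not an existing API):

  # Toy sketch of a sim-backed assignment: the section ships a pre-configured
  # simulator; the student tweaks the exposed knobs and is checked against a
  # target state. (All names hypothetical.)
  from dataclasses import dataclass

  @dataclass
  class HeatSim:
      """Tiny 1-D heat-diffusion 'simulator' with one exposed knob."""
      temps: list
      diffusivity: float = 0.1

      def step(self, n=1):
          for _ in range(n):
              t = self.temps
              self.temps = [
                  t[i] + self.diffusivity * (
                      (t[i - 1] if i > 0 else t[i]) - 2 * t[i]
                      + (t[i + 1] if i < len(t) - 1 else t[i]))
                  for i in range(len(t))
              ]

  def check(sim, target, tol=1.0):
      """Did the student's manipulations reach Sim[Desired State]?"""
      return max(abs(a - b) for a, b in zip(sim.temps, target)) < tol

  sim = HeatSim(temps=[100.0, 0.0, 0.0, 0.0])    # Sim[Initial Conditions]
  sim.diffusivity = 0.25                         # a student's guess
  sim.step(200)
  print(check(sim, target=[25.0] * 4, tol=5.0))  # did we reach Sim[Desired State]?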
Current explorables are lovely, but (usually) incredibly bespoke. One concrete improvement would be to produce an OpenSim standard, which would allow other content creators to embed and customize these sims to construct new narratives.
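Concretely, an embed descriptor under such a standard might look something like the following. This is purely hypothetical – no such spec exists today – but it shows the idea: a content author pins the sim, its initial state, and which controls the student may touch.

  # Hypothetical "OpenSim"-style embed descriptor (all fields invented).
  embed = {
      "sim": "earth-primer/volcano",                  # made-up sim identifier
      "spec_version": "0.1",
      "initial_state": {"magma_pressure": 0.3, "crust_thickness": 0.7},
      "exposed_controls": ["magma_pressure"],         # everything else locked for this narrative
      "goal_state": {"erupted": True},                # the state the surrounding text asks for
  }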
Another powerful computer affordance is extreme specificity – a system should be able to build a representation of a learner's current knowledge / skill graph, and figure out one (or many) shortest path(s) from current knowledge to desired knowledge, scoped to that user's interests. (Many LMSes attempt this, but we're still in the early, clunky days.)
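Stripped of all the hard parts (partial mastery, interest weighting, many equally good paths), the core computation is just graph search. A toy skeleton, with invented concept names:

  # Toy version of "shortest path from current knowledge to desired knowledge".
  from collections import deque

  unlocks = {                     # concept -> concepts it directly unlocks
      "arithmetic": ["algebra"],
      "algebra": ["calculus", "linear_algebra"],
      "linear_algebra": ["machine_learning"],
      "calculus": ["machine_learning"],
      "machine_learning": [],
  }

  def shortest_learning_path(known, goal, graph):
      """BFS from everything the learner already knows to the goal concept."""
      queue = deque((k, [k]) for k in known)
      seen = set(known)
      while queue:
          node, path = queue.popleft()
          if node == goal:
              return path
          for nxt in graph.get(node, []):
              if nxt not in seen:
                  seen.add(nxt)
                  queue.append((nxt, path + [nxt]))
      return None

  print(shortest_learning_path({"arithmetic", "algebra"}, "machine_learning", unlocks))
  # e.g. ['algebra', 'calculus', 'machine_learning'] -- three concepts, not a whole curriculum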
Next we have non-linearity. Most textbooks and courses impose a false idea of topic dependence. Yes, there are some intrinsic dependencies, but there are many, MANY more ways (orderings) of moving through a learning space than your textbook's chapter order suggests.
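A tiny illustration of how quickly the valid orderings outnumber the single one a book commits to (prerequisite graph invented for the example):

  # Even a tiny prerequisite graph admits several valid orderings.
  from itertools import permutations

  topics = ["vectors", "matrices", "probability", "statistics", "regression"]
  hard_prereqs = {
      "matrices": {"vectors"},
      "statistics": {"probability"},
      "regression": {"matrices", "statistics"},
  }

  def valid(order):
      seen = set()
      for t in order:
          if not hard_prereqs.get(t, set()) <= seen:
              return False
          seen.add(t)
      return True

  print(sum(valid(p) for p in permutations(topics)))  # 6 valid orderings vs. the 1 a textbook picks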
I can go on like this forever. I myself am currently focused on constructing usable knowledge graphs (and localizing incoming students on them), and on simulations students can easily play around with to build powerful intuition. For the latter, we've found the wealth of STEM tools available in Python to be a huge boon. We (I work for an ed-tech non-profit startup) usually teach students some programming, and then have them start building models (physics simulations) or interacting with existing toolkits (such as, recently, the Rosetta protein modeling suite [2]). These tools act as a forcing function for real-world relevance, and allow for direct intuition building over rote memorization. More effective, and much more engaging.
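For flavor, the kind of first model a student might write is only a dozen lines – for example, a mass on a spring stepped with naive explicit Euler (this exact snippet is illustrative, not our curriculum):

  # A first physics model a student can write and then poke at: changing k, m,
  # or dt and watching the behaviour change is where the intuition comes from.
  def spring_sim(x0=1.0, v0=0.0, k=4.0, m=1.0, dt=0.01, steps=1000):
      x, v = x0, v0
      xs = []
      for _ in range(steps):
          a = -k / m * x        # Hooke's law: F = -kx
          x += v * dt           # explicit Euler: position from the *old* velocity
          v += a * dt
          xs.append(x)
      return xs

  xs = spring_sim()
  print(max(xs), min(xs))  # amplitude slowly grows -- a teachable artifact of naive Euler integration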
To address your comment: VR (and IMHO, AR) look to have a lot of potential in the future, though they are still in their infancy. I think there's a lot of potential for the humanities to provide more immersive experiences of places and time periods. (Re humanities, simulations also work quite well. Mock trials, political re-enactments, etc. There's a reason simulation games are so popular...)
[0] https://explorabl.es/
[1] https://www.earthprimer.com/
[2] https://rosettacommons.org/
A knowledge map (or graph) is definitely something that should be used more intensively in course structure and instruction.
I remember very well reaching an understanding of a subject by different means than what the teacher expected(?) or was taught to expect. That path of learning was rejected (despite the result being the same) and accordingly graded as a failure. Such resistance to different learning paths, a "status quo" of how people should learn, and the tendency to try and fit everybody into well-defined boxes is frustrating and counter-productive.
This is fantastic. Not that you’re lacking in ideas, but as you mention the value of simulation I wanted to ensure that you had seen this historical case study: https://obscuritory.com/sim/when-simcity-got-serious/
You might also be interested in what I'm building at https://learnawesome.org : a rich learning map of educational resources, augmented with tools like spaced repetition, notes with bidirectional linking, and project-based learning cohorts with peers and mentors.
I'm 100% on board with these ideas, and I've also been following a lot of these ideas for a while. I'd like to learn more about the startup if possible, my email is in my profile if you don't want to post here.
Likewise, I’d love to learn more about what you’re doing! I’ve found the learning-by-doing paradigm has worked for me (e.g. learning ML concepts by applying them alongside reading theory, rather than after), and your simulation-driven learning using cutting-edge tech and applications (like your protein folding example) is a very intriguing extension of that idea. Email also in profile.