Knowledge of the entire machine's activity. When I was a young teenager, the home machine of choice was the Commodore 64 and its competitors. When you turned it on, you usually got an interactive programming-language REPL, probably some variant of BASIC, and that was it. Off you went:
No OS to worry about (the machine had a small ROM to handle peripherals, the KERNAL on the C64, but that was it)
No permissions model
A tiny API to interact with whatever graphics/sound facilities were available (often just a handful of memory-mapped registers, as sketched below)
No need to worry about what resources simultaneously-running programs could be using (there weren't any)
Actually no need to worry about concurrency at all (you just couldn't do it)
No need to worry about what language to use (either the one blinking at you when you turned on the machine, or assembly language for the processor)
No need to worry about how to restructure computations to use a compute shader or SIMD
_You_ were the owner of every resource in the machine and it all danced to your tune. And, best of all, you could come to a complete understanding of how every part of those machines worked, in just a few weeks of practice. Who today knows the intricacies of their laptops to the same extent?
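For anyone who never touched one of these machines, the "tiny API" point is hard to overstate. On the C64, the graphics and sound hardware was a set of memory-mapped registers you POKEd directly. A sketch from memory (the VIC-II/SID addresses are the standard ones; the envelope and note values are arbitrary):

    10 POKE 53280,0 : POKE 53281,0   : REM BORDER AND BACKGROUND TO BLACK
    20 POKE 54296,15                 : REM SID MASTER VOLUME TO MAXIMUM
    30 POKE 54277,9 : POKE 54278,0   : REM VOICE 1 ATTACK/DECAY, SUSTAIN/RELEASE
    40 POKE 54273,30 : POKE 54272,0  : REM VOICE 1 FREQUENCY, HIGH/LOW BYTE
    50 POKE 54276,17                 : REM TRIANGLE WAVEFORM, GATE ON: SOUND STARTS
    60 FOR I=1 TO 500 : NEXT I       : REM CRUDE DELAY LOOP
    70 POKE 54276,16                 : REM GATE OFF: THE NOTE DIES AWAY

That was the entire stack: no driver, no mixer daemon, no permission prompt. The register was the API.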
Yeah, those were the days. Learn to read input from the keyboard, display characters on the screen, and read and save files, and then you could do almost anything that any professional software did.
To add to that, you could learn to read/write a serial port and poll a mouse to find out where the pointer was and whether a button had been clicked, and that was cutting edge. At that point you were doing things that much commercial software didn't even do yet.
Just a few simple I/O things. All the rest was whatever logic you coded up.
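To make that concrete, here is roughly that whole skill set in a dozen lines of Commodore BASIC; a sketch assuming a stock C64 with a disk drive as device 8 (the filename is made up). Poll the keyboard, echo to the screen, save to disk:

    10 PRINT "TYPE SOMETHING, THEN PRESS RETURN"
    20 S$=""
    30 GET K$ : IF K$="" THEN 30     : REM POLL THE KEYBOARD BUFFER
    40 IF K$=CHR$(13) THEN 80        : REM RETURN ENDS THE INPUT
    50 PRINT K$;                     : REM ECHO THE CHARACTER TO THE SCREEN
    60 S$=S$+K$
    70 GOTO 30
    80 OPEN 2,8,2,"NOTE,S,W"         : REM SEQUENTIAL FILE, OPENED FOR WRITING
    90 PRINT#2,S$                    : REM WRITE THE LINE OUT
    100 CLOSE 2
    110 PRINT : PRINT "SAVED."

Keyboard in, screen out, file on disk: the three primitives, with nothing between you and them.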
You ran it and it either worked or it didn't, and if it didn't, you knew it was a bug in your code. Code that you knew, because you'd written it.
No stack of components, no frameworks and libraries and dependency manager configurations and VMs and container configurations and network connections and other programs that might interact with it. It was just you and your code.
And sure, the IDEs today are technically better. But they are so much more complex that you could spend a lifetime studying them and still not understand all their functions. Turbo Pascal's IDE, by contrast, was much simpler. You could easily grok it entirely within a week of normal use.
So without all the cognitive overhead of a modern code ecosystem, you could just focus fully on solving the problem, doing the logic, figuring out the best way to do it.
Nowadays you spend most of your time figuring out all the tools and dependencies and configurations and systems and how to glue them all together and get them actually working together properly. There's relatively little time left for the actual problem you're trying to solve and the actual code that you're writing. Actually doing the thing that achieves the objective is kind of just an afterthought done in your spare time when you're not busy babysitting the ecosystem.
Well said.
"The Emperor's New Clothes" parable kind of applies here. Almost no one wants to openly say how ridiculous the situation has become, for fear of being discredited by others with an axe to grind [1], or by those who just do not get the point, or by those who are into unnecessary/accidental complexity or RDD (Resume-Driven Development). A sad state of affairs, more so since it is among a group who think of themselves as, and claim to be, smarter than "normals". Even that last word is ugh, and revealing of the mentality.
After doing a little bit of Arduino programming, I really liked how limited the resources were. It reminded me of installing and running Linux back in the early 00s, and of messing with Sinclair Spectrums and Apple computers even further back.
I reckon that, because of the constraints, it's a really good learning environment.
I was going to say the same thing. I feel so lucky to have started programming in the era of 8-bit home computers. Looking back, it was a Zen-like experience. Switch the machine on and, literally, within a few seconds you were faced with a blank screen and a blinking cursor. It is as if the machine was saying "Go on then... do something wonderful".
And one's own wonderful was within reach because of the things you point out. Of course context matters here, with nothing to compare it to, but every learning point seemed like magic to me.
Out of necessity, learning how the hardware worked and relating that to software was such a big part of the culture. Books and magazines wrote about CPU architecture, address and data buses, video programming, etc.
Being into electronics at the time, I constructed external address decoders and data line drivers (7400 series) to make lights and relays turn on, as well as to sample and store an external voltage in memory via homemade R-2R ADCs.
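The software half of that trick was pleasingly dumb. A sketch of a ramp-conversion read, with the hookup entirely my assumption rather than the parent's: an R-2R ladder on a C64-style user port (CIA2 port B) feeding one comparator input, the unknown voltage on the other, and the comparator output wired to PA2, pulling it low once the ladder voltage passes the input:

    10 POKE 56579,255                 : REM CIA2 DDRB: USER PORT PB0-7 AS OUTPUTS
    20 POKE 56578,PEEK(56578) AND 251 : REM PA2 AS INPUT, LEAVE OTHER BITS ALONE
    30 FOR V=0 TO 255
    40 POKE 56577,V                   : REM RAMP THE VALUE OUT TO THE R-2R LADDER
    50 IF (PEEK(56576) AND 4)=0 THEN 80 : REM COMPARATOR TRIPPED: LADDER >= INPUT
    60 NEXT V
    70 V=255                          : REM INPUT WAS ABOVE FULL SCALE
    80 PRINT "SAMPLE =";V

Eight bits, one loop, and you had a one-channel data logger.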
Years later, I was writing control and logging software in TopSpeed Modula-2 running on DOS as part of my postgrad work. I'd arrive, somewhat stressed, in the lab after a fairly lengthy two-train commute, then relax to the sound of the hard disk as the PC booted up. Then it was me, the machine, a single language and a couple of RS-232 serial ports for I/O. Bliss!
>_You_ were the owner of every resource in the machine and it all danced to your tune. And, best of all, you could come to a complete understanding of how every part of those machines worked, in just a few weeks of practice. Who today knows the intricacies of their laptops to the same extent?
This is because you lived in the microcomputer world back then. Mainframe and minicomputer programmers lived in a very different world, and had many of the same problems: OS, permissions, concurrency, other users on the same machine, scalar vs vector processors, etc.