It is a fantastic book. It doesn't take you into typical algorithms (at least that I recall), but rather it explains as intuitively as possible how a computer is built up from flip-flops and binary logic to assembly, intermediate languages, and on to full-on compilation of a usable language.
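To give a flavor of that building-up, here is a rough sketch of my own (not taken from the book) in Python: a one-bit full adder written with nothing but AND, OR, and XOR, then chained into multi-bit binary addition.

    # A rough sketch (not from the book): build a full adder out of boolean
    # gate operations, then chain adders to add whole binary numbers.

    def full_adder(a, b, carry_in):
        """Add three bits using only AND/OR/XOR; return (sum_bit, carry_out)."""
        sum_bit = a ^ b ^ carry_in
        carry_out = (a & b) | (carry_in & (a ^ b))
        return sum_bit, carry_out

    def ripple_carry_add(x_bits, y_bits):
        """Add two equal-length bit lists, least significant bit first."""
        result, carry = [], 0
        for a, b in zip(x_bits, y_bits):
            s, carry = full_adder(a, b, carry)
            result.append(s)
        return result + [carry]

    # 3 (011) + 6 (110) = 9 (1001), bits listed least significant first
    print(ripple_carry_add([1, 1, 0], [0, 1, 1]))  # [1, 0, 0, 1]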
Basically, beginner programmers can acquire a broad understanding of the foundation the programs they're building rest on by reading this book. It reads more like a non-fiction exposé than a programming tutorial, which is to say, given its subject, it's an easy read you can do on the couch. Depending on your skill and knowledge level, there may be a few sections you have to re-read several times until you understand them, but you won't feel as though you need to go over to your computer chair and try something to fully grasp it.
If you can do basic arithmetic, you can get through this book. That seems to be the hidden premise: that computers are easy and should be easy to understand. This book is a testament to that. I'm sure some will find it doesn't go deep enough, but the point is: learning this generally creates many entry points for you to follow up on in your journey into programming and computer science. It clears up many things and makes the path seem less scary and out of reach. This book achieves that really well. High-level programmers will come away feeling far less insecure about their lack of knowledge of the underpinnings of whatever it is they are developing. I know I did. I can't say enough about this book. It's the real deal. I'm sure those with a computer science degree might have more to say (that is, they likely think it's a cursory overview), but I think for everyone else it's a computer science degree in a book you can read in one or two weeks. At least half the degree. For the second half, I recommend Algorithms in a Nutshell. And done! Go back to programming your high-level JavaScript React app and get on with your life.
On a side note: it's my opinion that theory first is the wrong way. Application first, theory as needed, is the right approach. Otherwise it's like learning music theory before you know whether you even like to play music. You might not even like being a programmer or be a natural at it, and if you spend four years studying theory first, you will have spent a lot of time to discover what you could have discovered in a month or so. In addition, it can suck the joy and fun out of the exploration of programming and computer science. It's natural and fun to learn as you dive into real problems, and everything you could want to learn is on the internet. It's very rewarding and often faster to learn things when you are learning them to attain a specific goal, and the theory you do learn makes much more sense in the face of some goal you are trying to apply it to. In short, over your computing career you can learn the same material far faster and far more enjoyably if you do so paired with actual problems.
That said, sometimes you do have to step back and allocate time for fundamentals, even if you have no specific problem they relate to. You will know when it's time to brush up on algorithms, or finally learn how the computer works below your day-to-day level of abstraction. Just know that a larger and larger percentage of us programmers went the applied route rather than the computer-science-theory-first, formal-education route; it's probably the majority of programmers at this point. In short, you are not alone in learning this as you go. Learn to enjoy that early on and save yourself the pain of feeling insecure about not knowing everything. This is an exploration and investigation, and perhaps you will make some discoveries nobody else has been able to make, long before you have mastered and understood everything there is to know about the computer. Perhaps that's its biggest selling point: you don't have to know everything before you can contribute to the computing world! So enjoy your pursuits in programming, knowing that in your unique exploration you may at any time come up with something highly novel and valuable.
I think everyone absorbs new information better when it fixes an immediate issue or clarifies an immediate doubt.
But this is not a contradiction. Theory can be presented in such a way that you want and need to know the next piece of information, the way mystery novels work.
It can be easier for the writer to create this need with examples instead of narrative, no doubt about it.
But let's not fall into the opposite extreme of having only examples and no theory, which is so common in blogs now. I feel empty when I read such material.
Yes! I read it about 5 years after finishing my degree (which already covered many of the topics in depth), and it was very enjoyable. It gives a very good, succinct (if simplified) overview of computer architecture.
[1] https://www.amazon.com/Code-Language-Computer-Hardware-Softw...