My parents bought me this book when I was in elementary school. I was fascinated with it; I read it and reread it until I understood everything in it. It took me something like eight months, but when I was done I had an instinctive grasp of computers and mathematics that many of my college classmates still don't have. Highly recommended.
So, I picked up this book on a whim from Half-Priced Books about six months ago. I've since loaned it out to a mechanical engineering friend of mine, and he's finished it; I'll be loaning it out again soon--it's one of those books that I consider essential for spreading the good word of CS/EE.
ACTUAL REVIEW/SPOILER CONTENT:
So, the book does start pretty slowly. It considers the case of signaling with flashlights. This motivates semaphores (not the concurrency type!), which in turn motivate telegraphs. The telegraph repeater and its automation are used to motivate logic gates. Around here (I forget exactly whether it's before or after) the author diverges for a chapter or two into Boolean logic and counting mechanisms. Then the idea of storage is brought in, then the idea of state machines. Calculation is brought in, and soon the author has a simple little ALU. He talks about wiring up an interface for all of this, and then moves on to interrupts, operating systems, and real embodiments of this sort of hardware--6502 and 8086 assembler are introduced and discussed.
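If you want a taste of what those gate-level chapters build, here's a rough sketch of my own (C instead of relays, and all the names are mine, not the book's): every gate derived from NAND, and a half adder on top of those.

    #include <stdio.h>

    /* One-bit "wires" represented as ints (0 or 1), in the spirit of the
       book's relays-as-switches. Every gate below is built from NAND alone. */
    static int NAND(int a, int b) { return !(a && b); }
    static int NOT(int a)         { return NAND(a, a); }
    static int AND(int a, int b)  { return NOT(NAND(a, b)); }
    static int OR(int a, int b)   { return NAND(NOT(a), NOT(b)); }
    static int XOR(int a, int b)  { return AND(OR(a, b), NAND(a, b)); }

    int main(void)
    {
        /* Half adder: sum = a XOR b, carry = a AND b. Chain these (with a
           carry-in) and you get the ripple-carry adder inside the book's
           simple little ALU. */
        for (int a = 0; a <= 1; a++)
            for (int b = 0; b <= 1; b++)
                printf("%d + %d  ->  sum %d, carry %d\n",
                       a, b, XOR(a, b), AND(a, b));
        return 0;
    }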
The writing is geared such that someone in high school shouldn't have trouble understanding anything, and there is enough history thrown in with the light style that it isn't a chore to slog through.
WHY THIS BOOK IS NOT A WASTE OF TIME:
So, I've already taken the introductory computer engineering/CS classes during my time at university. I've already read a lot in high school and a lot after college on computers and their architecture. This is not a new subject to me. Why is this still a good read?
Folks, our professions (whether you are a mechanical engineer black-boxing gear trains, a software engineer black-boxing Java/Scala/JVM bytecode, or a Perl hacker black-boxing the very mouth of madness) are all rooted in abstraction. Everything we do, everything we touch, is a slice of a pyramid of (sometimes shaky/leaky/smelly) abstractions.
The great thing about Code isn't what it teaches you about programming (it doesn't cover much other than assembly, if that) or computer engineering (no help soldering or designing ring buses or whatnot) or even mathematics (boolean algebra is pretty straightforward in its presentation); instead, Code focuses on bringing us to a functioning microcomputer from a flashlight in our bedroom, without ever skipping a layer of abstraction.
Even if you already know each slice (in broader detail than presented), seeing the entire journey is at worst enjoyable and at best extremely educational.
As my ME friend (at that time, roommate) was chugging through the book, he'd randomly come up and ask me questions or for clarification on some point. This culminated one evening during dinner.
He, my other roommate (a Linux sysadmin/applied mathematician), and I (software developer and former ME) were enjoying some Indian food, and the ME began bugging us about interrupts. He couldn't understand quite what they were useful for.
So, over the course of four or so hours, we probed him with questions geared towards increasing his understanding. Something along these lines transpired:
ME:"So, what are interrupts good for?"
SD: "Well, they're good for getting the processor's attention. How would you handle checking if a button is down?"
ME: "I could have processor check a line in."
SA: "And if that check happens to miss the button press?"
ME: "...erm, I could have something else check it, faster."
SD: "And how would you check it?"
ME: <thinking hard> "You...could have the switch be controlling a bit of the memory, I guess."
SD: "But you still haven't told us about how to solve interrupts."
ME: <several more suggestions that are basically around polling>
SD: "Let's think about the problem. What you want to do is have the processor poll the button, right?"
ME: "Yes."
SD: "And only do this when it's needed, right?"
ME: "Yep."
SD: "So, what if there was some kind of way of signaling the processor that data was ready?"
ME: "...oh. I guess you could use a pin for that."
SD: "What would that pin do?"
ME: "I think it would have to tell the processor to go handle input."
SD: "And how would that work?"
ME: <several suggestions about dedicated copying logic>
ME: <converges to idea of setting program counter to interrupt code in memory>
SA: "What happens if I want to go somewhere else to handle my interrupt code?"
ME: <suggests looking up equivalent of jump instruction to trampoline into handler>
SA: "You've just invented the interrupt trap vector!"
...etc...
So, by the end of the night, with proper questioning we'd gotten him to come up with MMIO, virtual memory, and paging, and eventually to consider how to implement concurrency via scheduling and his newly discovered virtual-memory mechanism.
It was delightful, and wouldn't have been possible unless he'd been getting the lay of the land from reading Code.
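For the curious, the punchline of that conversation in code form looks very roughly like this toy C model. Every name and detail here is made up by me for illustration--nothing corresponds to real hardware--it just contrasts polling with a vector table of handlers.

    #include <stdio.h>

    /* A toy software model of the two approaches from that conversation:
       polling a status flag vs. letting a "device" vector the CPU into a
       handler via an interrupt table. */

    #define BUTTON_IRQ 3                     /* made-up IRQ line number */

    static int button_pressed;               /* stands in for a status register */
    static void (*irq_vector[8])(void);      /* the "trap vector" table */

    static void button_handler(void)
    {
        printf("handler: button press serviced\n");
        button_pressed = 0;                  /* "acknowledge" the device */
    }

    static void raise_irq(int line)          /* what the hardware would do */
    {
        if (irq_vector[line])
            irq_vector[line]();              /* jump through the vector */
    }

    int main(void)
    {
        /* Polling: the CPU keeps asking, and does nothing else meanwhile. */
        button_pressed = 1;
        while (!button_pressed)
            ;                                /* spin until the flag shows up */
        printf("polling: noticed the button eventually\n");
        button_pressed = 0;

        /* Interrupts: install a handler once, then the "device" interrupts us. */
        irq_vector[BUTTON_IRQ] = button_handler;
        button_pressed = 1;
        raise_irq(BUTTON_IRQ);               /* device yanks control to the handler */
        return 0;
    }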
The easiest way is probably through a hobby, in the widest sense. Be it a science fiction club, an HN meetup, a charity that helps connect disadvantaged high school kids with technology and programming, Toastmasters International, or some political activism. Basically anything that involves meeting other people and, as a filter, requires a certain minimum level of commitment and getting-off-your-ass.
If you are in Germany or Britain, I can give you more concrete pointers.
After reading Code my hacking skills increased considerably. I am never comfortable relying on abstractions, so understanding more of the programming stacks I use increased my willingness to experiment with them. It also helped me to better understand pointers and memory management, as memory became a real thing to me.
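To give one concrete example of what I mean (plain C, nothing specific to the book): once you've seen memory built up from flip-flops, a pointer is just an address into that big array of bytes, and snippets like this stop being mysterious.

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        int x = 42;
        int *p = &x;              /* p holds the address where x's bytes live */

        printf("x lives at %p and holds %d\n", (void *)p, *p);

        int *heap = malloc(4 * sizeof *heap);   /* four ints somewhere on the heap */
        if (!heap)
            return 1;
        for (int i = 0; i < 4; i++)
            heap[i] = i;          /* heap[i] is literally *(heap + i) */
        printf("heap block starts at %p\n", (void *)heap);
        free(heap);
        return 0;
    }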
Also, I had fun building my own simple computers in logic gate software.
Sounds a lot like my Digital Circuits course in university. We started with a breadboard of transistors, and after we built each component we got that component as an IC to build the next level of abstraction (flip-flop → memory → adder → processor → etc.). At the end we had a fully programmable computer that we were tasked with programming as a sensor-driven traffic light controller.
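As a rough illustration of the first rung of that ladder (a software model I just sketched, not anything from the course): a gated D latch holds one bit, and stringing latches together gives you registers and then memory.

    #include <stdio.h>

    /* A software model of a gated D latch: while the enable input is 1, the
       stored bit follows D; when enable drops to 0, the bit is held. */
    struct d_latch {
        int q;  /* the stored bit */
    };

    static void latch_tick(struct d_latch *l, int d, int enable)
    {
        if (enable)
            l->q = d;   /* transparent while enabled */
        /* otherwise hold the previous value */
    }

    int main(void)
    {
        struct d_latch bit = { 0 };

        latch_tick(&bit, 1, 1);  printf("write 1 (enabled):  q = %d\n", bit.q);
        latch_tick(&bit, 0, 0);  printf("input 0 (disabled): q = %d\n", bit.q);
        latch_tick(&bit, 0, 1);  printf("write 0 (enabled):  q = %d\n", bit.q);
        return 0;
    }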
Seeing every step from the ground-up really helps demystify the machines you see around yourself every day.
I've decided today that I need to start playing with Arduino for precisely this reason. Before I switched from EE to CS, I took one of the first EE courses where we used a breadboard and a controller we had to program, and we actually made some type of device (can't recall now what it was). I can picture my engineering notebook full of detailed notes, complete with the numbered pages and initialed corners after filling in a page. It was a lot of work, but tons of fun, and so cool to see it working at the end.
I can only imagine how much easier it will be today than in 1994. Oh the gadgets I will build... Good thing my birthday is here and Christmas is coming soon!
Before reading this book, I had what I thought was a fairly solid grasp of computing. I was 19 (20 now), largely self-taught from spending nights on Google instead of going to parties, and if I didn't know the details of something, I knew enough about the concept to get by. I could program well enough in a few languages, talk about hardware, security, and protocols for hours...
I still had no idea how 10010011000110101001010 etc etc allowed me to play World of Warcraft. There was a small part of me that hoped beyond hope that computers were magic and there was some grand conspiracy going on to cover that fact.
Then I read CODE.
My hopes were shattered, but they were replaced by something much more treasured: Understanding.
CODE is good, but there's a weird disjoint between the very slow start (really, the stuff with light switches and relays goes on for much too long) and the suddenly much harder late section.
I am a programmer - I have been earning my bread by programming for over a decade, and before that I studied CS for another 5 years, with all those courses on Turing machines, random access machines, the von Neumann computation model, logic, and so on. Why should I read this book? Is there anything in it that was not covered by this standard curriculum?
No, there isn't. But it brought me so much joy to see that the core of what I learned about computers at university can be conveyed in one small book, in such simple language and with such a gentle learning curve! It starts off by explaining how a digital switch works, using the example of a flashlight (who can't understand that!), and goes all the way to digital circuits, the CPU, and low- and high-level languages at such a gentle pace that it is marvelous to behold even if you don't actually learn anything new.
Ever since I read it, I recommend it to every person who wants to learn about computers, at any level; I think it should be the first book they read.
Think from the point of view of someone who does not have as much experience as you do.
Plus, not all universities cover everything that is in that book. For example, we studied logic and elements of computability theory, but we were never taught how all these concepts are brought to work in modern hardware/software.
This book brings the knowledge (that some are only able to gather through experience) in a compact and systematic form.
Way back in the early nineties, when I was in college, I went to go visit my friend at his school. He had class, so I went to the library to kill a few hours. I dug through the (rather ancient) CS section, and pulled some books to read. Two have stuck with me:
One was __The Mythical Man-Month__ (the old one, not any of the reissues). That put into words so many things that I felt about computer programming, if nothing else the humanness of programming as a pursuit.
The second was __Software Tools__. Even then, whichever version I encountered was ancient (I was learning C in school). It didn't impact me quite as much as TMMM (which I think I read most of in the library), but it was enough for me to remember it and pick up a used copy a few years back.
I still haven't read it, so I'm glad you reminded me about it.
As far as a modern book, the one that seems closest to me is __The Practice of Programming__.
I graduated w/ a degree in Electrical & Computer Engineering with a minor in CompSci. From the review, this sounds like all the stuff I learned that wasn't CompSci, distilled into an easy-to-read form. Stuff like flip-flops, RISC assembly, how memory works, etc. It sounds fantastic.
Having the benefit of an EE-based undergrad degree was awesome for this reason, esp. since my first job was firmware design. The downside is that I never took a compiler course and have had very little exposure to stuff like discrete mathematics and Turing machines. I didn't learn anything about OO design 'til after school, and I still haven't gotten into functional programming.
Sometimes I worry that I actually missed out on the good stuff. :/
I was a physics major in college and got into programming a couple years ago.
In my experience, of the hard sciences, CS is by far the most amenable to self-study. CS isn't easier, whatever that might mean, but CS is amazingly inviting: if you've got a computer and a textbook budget, you can learn some really neat stuff. So don't despair; get studying!
The dean summons the CS department chair to his office.
"You people are bankrupting us!" he fumes.
"Why do you need all this expensive equipment? All the mathematicians ever ask for is pencils, paper, and erasers. And the philosophers are better still: they don’t even ask for erasers!"
It's somewhat the other way around for me: I got to study logic, discrete math, functional programming and OO, but I feel I missed out on stuff like electronics.
I would not propose this book as one that every programmer should read. But others have posted about books that started with fundamentals and illustrated how computing systems evolved, which brings another book to mind: Henry D'Angelo's "Microcomputer Structures".
"Microcomputer Structures" is out-of-print, but I recall the small text starting with atomic physics and building up to an introduction of the Von Neumann architecture.
As a freshman at Boston University in the 1980's, I took this mind-expanding course (probably because it was way above my level) with Professor D'Angelo. The final project was building an interface to a single-board computer.
I have given this book to 2 people who told me they wanted to know more about computers. It's very easy to read and does an excellent job of explaining basic computer concepts. Probably not too interesting for this crowd. But it's an excellent introduction to computing.
alright, meh, not convinced. we're all busy people and my bedside book stack is 12 high.
compare books like CODE, SICP, and related books discussed below, to books like Beautiful Code, Higher Order Perl, The Mythical Man-Month, Seven Languages in Seven Weeks, and tons of whitepapers.
do I really need to read CODE? Seems like a bottom-of-stack book that will never quite make it to the top.
I wouldn't say that you should read it, that depends on your priorities.
But the truth is that the benefit/effort ratio of that book is pleasantly high. It is very easy to read, and it did give me quite a few "Oh! So that is how it works!" moments.
I ordered The Mythical Man-Month a year or two ago. This is something of a blasphemous statement on Hacker News, but for me the book was a letdown. It felt like a rehash of ideas that I already knew from reading Paul Graham's essays, other technical blogs, and comments on Hacker News.
The reality, of course, is that the book was so spot-on and insightful that many of its ideas have simply been absorbed into the way we approach software development. It all seems so common sense today that people don't even cite TMMM when they borrow its ideas.
Unfortunately, this meant that nothing in the book was new to me, and reading it was rather dull. Because of that, if you've spent much time following the tech community, my recommendation is to prioritize reading TMMM below many of your other suggestions.
What you're talking about are _much_ higher level books. They'll make you a better programmer for sure.
This book (probably) won't do that, but it will give you a more intuitive sense of how the machine actually _works_ at its core. This is not meant to be hands on stuff for someone who's programming at your level of abstraction.
Compared to those books you could clear this one out in a few days, and the benefit/effort ratio is probably actually better, although much less directly applicable.
It really depends on where you are. If you've done a CS undergrad program you probably already know everything in CODE. But for a bright high school student or self-taught programmer interested in CS the only book I'd rank higher would be SICP.
yeah, that seems to be accurate. that's the thing about these "best book ever for all software engineers" posts -- anyone reading even just a few industry books a year already knows this stuff.
Just trying to teach them that somewhere, down under all that runtime, there's a bunch of hardware which deals with the world 32 bits at a time, and that in the end all these magical "objects" really cook down to a bunch of memory addresses... yeow.
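A tiny C illustration of what I mean (my own toy example, not anything official): an "object" is just a struct whose fields live at fixed offsets from some base address, plus functions that take that address.

    #include <stdio.h>

    /* A made-up "object": a struct whose fields sit at fixed offsets from
       its base address, plus a function that operates on its address. */
    struct point {
        double x, y;
    };

    static void point_print(const struct point *self)
    {
        printf("point at %p: (%.1f, %.1f)\n",
               (const void *)self, self->x, self->y);
    }

    int main(void)
    {
        struct point p = { 3.0, 4.0 };

        point_print(&p);
        printf("field addresses: x at %p, y at %p\n",
               (void *)&p.x, (void *)&p.y);  /* just memory addresses,
                                                typically 8 bytes apart */
        return 0;
    }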
That's what I loved about being in computer engineering. You'll sit there and fill in charts of the cache, perform Tomasulo scheduling by hand, and by god at the end you'll have a pretty decent idea of how the machine does things. Not to say that CS students can't or don't learn the same thing, just that the ones I knew tended not to.
It sounds like the book does a lot to flesh out software engineering as a legitimate form of engineering. Programming is how we apply and manipulate electrical circuits, which arise from the natural laws of electromagnetism.
I picked the basics of programming and how computers work with a book called "The Beginner's Computer Handbook - Understanding & Programming The Micro". Actually this book was the reason I became a programmer. I was 8 years old when received it as present on birthday from my dad.
I could not find a link to the book, but found this blog post which has also some pictures:
At my university, this kind of stuff is covered in a class called Digital Logic, which is a precursor to Computer Architecture. I've a pretty good understanding of how all the stuff works that's mentioned in the blog post.
edit: It may actually be more of an electrical engineering class, but the CS program here is ABET certified, so it's essentially an engineering degree itself, but this is the kind of thing that any CS graduate should be intimately familiar with.
Another book that takes this "soup to nuts" approach to the pillar of computer abstractions is Zalewski's "Silence on the Wire". It's mostly focused on computer security, but it does an excellent job of peeling back software abstractions and showing how innocent decisions like blinkenlights and link-following web spiders can be utilized to compromise security. My favorite computer security book, easily.
Do CS programs not teach this? Pretty early in getting my degree I was taught about gates, timing signals, flip-flops, De Morgan's laws, etc. By the time of the Sys Architecture class it was assumed this had been covered.
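For anyone who hasn't run into them, De Morgan's laws are small enough to check exhaustively--here's a throwaway C snippet of mine that does exactly that.

    #include <assert.h>
    #include <stdio.h>

    /* De Morgan's laws, checked over all 1-bit inputs:
       !(a && b) == (!a || !b)  and  !(a || b) == (!a && !b). */
    int main(void)
    {
        for (int a = 0; a <= 1; a++)
            for (int b = 0; b <= 1; b++) {
                assert(!(a && b) == (!a || !b));
                assert(!(a || b) == (!a && !b));
            }
        puts("De Morgan holds for all inputs");
        return 0;
    }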
While I agree that programmers should understand the fundamentals of how a CPU et al. works, is this book the best one?
A good layman's guide for computer science is Danny "Connection Machine" Hillis' "The Pattern On The Stone: The Simple Ideas That Make Computers Work". He builds up from Boolean logic to Turing machines to quantum computing in fewer than 200 pages.
K&R is a book that influenced me a lot, but its most important lesson is not laid down very explicitly. IMHO, this lesson is its notion of programming style: terse, small, efficient. It permeates the whole book and C/Unix design but the book is not explicitly about it.
Unfortunately, it is a lost lesson. Today, programming has become just the act of gluing together bloated frameworks and libraries and doing server configuration. The small, simple, and powerful K&R approach is totally lost in a world dominated by bloatware (e.g. Java/.NET/Boost).
I firmly believe that every programmer should be able to read and write C to some extent, and to learn it one must read K&R. So, yes, I agree that it is also one of the books that every programmer should read.
What does learning C provide that other programming languages (the majority of which have some basis in C) don't teach you?
Memory management (allocation, pointers, etc)?
Pipes and buffers?
They are present in other languages, and in the case of memory management one could argue about why 'every' programmer must learn those techniques even if they are never really applicable to them.
I'm really trying to think back to my C days (I haven't done it in a long time) as to why it is so important to know. What is unique about it?
I think I am a better programmer for having used C.
C is interesting because, like PHP, it is very easy to end up with an unmanageable code base (but of course C is a lot more powerful than PHP).
I learnt a lot from good C code. The best C libs I have seen use a pseudo version of objects: they create a data structure to hold state and pass it as the first parameter to each function that can be used on it. Once I got my head around that, I found I had a better understanding of the concept of object-oriented programming.
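In case that description is too abstract, the pattern looks roughly like this (the names and details are mine, not from any particular library):

    #include <stdio.h>
    #include <stdlib.h>

    /* The "pseudo object" pattern: a struct holds the state, and every
       function that operates on it takes a pointer to that struct first,
       which is more or less what this/self compiles down to anyway. */
    typedef struct counter {
        long value;
        long step;
    } counter;

    counter *counter_new(long step)
    {
        counter *c = malloc(sizeof *c);
        if (c) {
            c->value = 0;
            c->step = step;
        }
        return c;
    }

    void counter_bump(counter *c)       { c->value += c->step; }
    long counter_read(const counter *c) { return c->value; }
    void counter_free(counter *c)       { free(c); }

    int main(void)
    {
        counter *c = counter_new(5);
        if (!c)
            return 1;
        counter_bump(c);
        counter_bump(c);
        printf("counter = %ld\n", counter_read(c));   /* prints 10 */
        counter_free(c);
        return 0;
    }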
One good reason is that by writing algorithms and data structures in C, you can see them in all the detail, see how the memory is laid out and how it's managed, and understand their costs and benefits.
I used ANSI C to implement most of the algorithms we studied at the university and I can say that it helped me understand them better.
Plus, an understanding of what happens on the lower level can be quite handy when debugging some software written in a more high-level language.
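For instance, a hand-rolled linked list in C (a throwaway example of mine, not from any particular course or book) puts the per-node allocations and pointer overhead right in front of you:

    #include <stdio.h>
    #include <stdlib.h>

    /* A singly linked list written out by hand: every node is a separate
       allocation, and the pointer overhead and scattered layout are costs
       you see directly, where higher-level languages hide them. */
    struct node {
        int value;
        struct node *next;
    };

    static struct node *push(struct node *head, int value)
    {
        struct node *n = malloc(sizeof *n);
        if (!n)
            return head;          /* allocation failure: leave list unchanged */
        n->value = value;
        n->next = head;
        return n;
    }

    int main(void)
    {
        struct node *head = NULL;

        for (int i = 1; i <= 3; i++)
            head = push(head, i);

        for (struct node *n = head; n; n = n->next)
            printf("%d at %p (next: %p)\n", n->value, (void *)n, (void *)n->next);

        while (head) {            /* even freeing has to be done node by node */
            struct node *next = head->next;
            free(head);
            head = next;
        }
        return 0;
    }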
C and some limited amount of assembly are what drive most of the electronics that surround you; most of the "smart" devices in your house, car, and pocket contain large amounts of code written in C. The phone infrastructure you use, the traffic signals, the network you get news from, the browser you're using right now? C.
Scripting languages and bytecode-driven languages? Their interpreters and jitters are likely written in C.
There are exceptions. I understand that some Java-based systems are built in Java all the way down to the metal, and Apple was a Pascal shop until it turned to C in the late 80s. But by and large these are systems on the fringe, and the bulk of the heavy lifting today is done in C.
Same reason you can't be a mechanical engineer without having taken a shop class.
You can sit in front of a CAD and FEM package all day - but if you don't appreciate how the things will actually be built, you aren't going to be a really good engineer.
While this book sounds like it would cover a lot of what someone might have missed in a CS program if they were not very inquisitive, I picked that stuff up, and I continue to do so as I forget things, by going back and looking them up.
My vote for a must-read is: "Zen and the Art of Motorcycle Maintenance" - Pirsig
I wouldn't even say it is geared towards the CS student/CS grad; it's more like general nerd reading. If I knew a kid who was "interested in computers" but there was no one around to give them a helping hand, I'd definitely give them a copy of CODE and The C Programming Language.