Books I recommend to my software engineering students (utk.edu)
411 points by ashort11 on April 23, 2020 | 116 comments



I've advocated for a "canon" of books for software engineering and computer science that could be taught in school. The canon would accomplish several goals: it'd inform students about the less technical but profoundly important ideas within CS (Brooks' No Silver Bullet, the 10x cost incurred when moving from stage to stage in waterfall, how to treat and manage failure, etc.), it'd teach students how to think about problems (How To Solve It by Polya, The Design of Everyday Things), and it'd provide history/culture (Coders at Work, Soul of a New Machine). One should, of course, learn how to program in a CS major. But there are diminishing returns to teaching programming in a class; ultimately students must make the shift from learning in class to teaching themselves. Perhaps teaching more of the "soft" aspects of CS would give better returns, especially as students move up the ranks and start managing people.


I disagree.

I think books on software management and process are mostly lost on someone who hasn't been involved in the process. (Hell, many of those books are lost on project managers.)

I think the most important subjects to teach students are JavaScript, CSS, & SQL (or Mongo). Before you teach a beginner woodworker the "Zen of wood", you teach them how to cut a piece of wood without sawing off a finger.

The rest will come later.


> I think books on software management and process are mostly lost on someone who hasn't been involved in the process.

I think that's true and fair. However, once some experience has been acquired, I believe it's helpful to evaluate whether it's good experience or bad experience. Comparisons against the literature can be helpful at that point. Case in point: I tried reading 'The Psychology of Computer Programming' in college, but it didn't make much sense to me. Coming to it 15 years later, I realize it'd have been the perfect companion to my first five years in industry, naming and describing problems I'd faced and solutions that only came through experience more painful than reading.


Interesting counterpoint. I kind of agree that these books will be lost on people (although I believe I got something out of them and I'm early into my career). But I'm also thinking of it as providing early exposure, so that if the students need guidance later on, they know where to look. Much like high school students who find the classics boring but, upon revisiting them later in life, discover their brilliance.

I agree that you should teach the practice. My ideal education would be co-ops combined with a reading course or two on the canon.


Any books to recommend for the topics you mentioned? Especially SQL or Mongo.


I really liked 'The Art of SQL', Stephane Faroult


Mongo has a series of online courses


I'm kind of bothered by the lack of rigour and science in software engineering disciplines in general. The vast majority of decisions seem to be made subjectively rather than by any objective criteria. Also, industry is very bad at documenting its processes and experiences for the community; most documents, if any exist, are available only internally to the company.

Also, detailed histories of the development of large and complex software need to be written and studied, but I just don't see it happening.


I wish I had more time to do that. I work at a feature factory.


At least for the written part there's "The Architecture of Open Source Applications" series.


I always thought a history book dedicated to snippets of old code or software problems that have a story to tell could be interesting.

E.g. explaining an interesting or significant software problem and its surrounding environment from each decade for the last 7 or so decades.


> The 10x cost incurred when moving from stage to stage in waterfall

This seems to be a common falsehood propagated by Agile consultants and swallowed whole by an industry that doesn't know any better.

It doesn't look like the waterfall model was ever used on a significant scale; not as far as I remember, anyway.

https://softwareengineering.stackexchange.com/a/139107


> It doesn't look like the waterfall model was ever used

Well - yes, it was used, and still is: it's the default if you're not careful or if you don't think very hard or realistically, or are very naive. The confusion is that it was only given a name to disparage it: Winston W. Royce observed that the way most people managed software projects was completely unrealistic and didn't take into account changing requirements. He called it "waterfall" as a way to underscore how inflexible the default "write down all the requirements, then write down how long they're going to take, then do them in that amount of time" approach was.

Unfortunately, most people who adopt what they refer to as "agile" processes are still stuck in that same mindset; they think that, by having daily standups, putting in JIRA tickets, and referring to every two weeks as a "sprint", they'll somehow meet the project manager's pipe dream of 100% predictable software development schedules.


Royce didn't set it up to disparage it. He set up his idealized flow and then proceeded to add details that he believed were critical to making it work. In the end, what he proposes is what most people understand to be Waterfall but with some extra details:

- Feedback loops (because problems or deficiencies will arise)

- Involving the customer (because you don't want to spend 12-60 months building the wrong thing)

- Build a prototype (good advice)

- Document, document, document (he thinks 1500 pages is a good target)

The only thing practitioners seem to have taken away is that last bullet. He still made a strong distinction in his model between analysis, design, and implementation. Though, to be fair, at the time "programmer" was more of a technician-level role, and design often involved making detailed designs like flowcharts and such that could be more easily translated into code. These days, design and implementation are really tangled up, and the documentation is awful and non-actionable (I've been on those teams; it's nightmare-inducing). People think prose can replace a diagram, so they fill out their 1500-page quota with lots of words but no clarity.

The name came later, and was enshrined in a DOD standard by people who couldn't read past the first few pages. So they entirely missed the lessons learned he was trying to apply to it.


Agreed. Pure waterfall has become a strawman for the Agile zealots. Realistically, Waterfall is used as one component of the overall design and implementation process.


The waterfall model is definitely used at scale in DOD projects. It is as horrifying as it sounds.


'The Pragmatic Programmer', Hunt and Thomas


Definitely needs DeMarco and Lister's "Peopleware".


I'd like to plug Responsive Communication.

It's available entirely online here http://www.ankn.uaf.edu/curriculum/AxeHandleAcademy/rc/50pat...

It was written in 1986 by the linguist Ron Scollon and his wife Suzie.

I found a paper copy on a free-book shelf outside a thrift store, and consider myself blessed to have stumbled upon it.


I agree with the idea of a canon for software engineering, probably more as a living, shifting document than as something set in stone (except, maybe, MMM, which seems perennial.)

I'm going to try an experiment, posting five books I'd put on the list, one per comment. Downvote, upvote, add your own if you like the idea.


Would suggest also The Inmates are Running the Asylum, and for history/culture, the Jargon File.


'The Mythical Man Month', Fred Brooks


It's shown in the main photo at the top of the article?


I'd add the "Elegance is not Optional" essay from the preface to The Art of Prolog.


You probably meant The Craft of Prolog by Richard O'Keefe. The preface of the Art of Prolog is about David H. D. Warren's recollections of his first encounters with Prolog.


Good catch! Thanks for pointing that out!


Design Patterns by Gamma, Helm, Johnson, and Vlissides (aka GoF)


I still don't understand why people worship this book. Contrary opinion here but I think this book is actually below average.

I think it's good that it introduces the overall concept of Design patterns, but the book itself isn't really about the general concept of Design patterns.

The book is focused on OOP techniques and tricks. It makes the assumption that OOP is the most general way to modularize things and think about computation and design.

Additionally, a lot of the "patterns" aren't actually good; they're really bad. Many of the patterns introduced by this book actually increase the complexity of a program while giving only an illusory sense of increased modularity and reusability.


Yeah, I hated that book. I didn't really understand it. Then I read "Head First Design Patterns" and it clicked. I went back to the Gamma book and now it makes sense, but not as a first book on Design Patterns.


The book didn't introduce people to design patterns, it invented the concept within the discipline of software programming.

Perhaps one needs to have been in industry before and after this book to appreciate how it introduced a common vocabulary, a mechanism for documenting and sharing experience, and a framework for thinking about common problems.


I agree with the first reply that the book is overrated. While it's not irrelevant, I wouldn't think it fits into a "must read" list for people learning software engineering. I often hear "Head First Design Patterns" cited as a better reference if you're interested in design patterns. I own the GoF book, but have thumbed through Head First and am inclined to agree.

What I think is missing is the idea that design patterns are often a language smell. They're common patterns that compensate for deficiencies in the language. The GoF book is most applicable to C++ code that uses a lot of OOP. Like the previous reply said, reaching for these patterns too quickly complicates things. Like using unnecessarily large words when simpler language would suffice.


The first expression of a concept is not necessarily the best or most useful expression. As such the book can be the first expression of the concept of Design Patterns and be worthless as a lasting text.


>The book didn't introduce people to design patterns, it invented the concept within the discipline of software programming.

By introduced I mean introduced to the world for the first time. Invented is a debatable concept, as in: was math invented or discovered? I avoid that debate by using the word "introduced."

>Perhaps one needs to have been in industry before and after this book to appreciate how it introduced a common vocabulary, a mechanism for documenting and sharing experience, and a framework for thinking about common problems.

Humans invent vocabulary for everything that they do; giving patterns names is inevitable. I would hardly call it revolutionary. This book just gave the concept of naming patterns a meta-name in itself: "Design patterns." I think it's good that this was given a name, but its importance is largely exaggerated.

Case in point: Other disciplines of programming don't use the term "Design patterns" or formal pattern names that frequently. In fact, technically, there are tons and tons of "Design patterns" in procedural programming, declarative programming and functional programming, yet experts in these respective fields don't feel the need to give these concepts over-inflated and complicated nomenclature.

Take currying for example. Functional programmers don't call it the "Curry pattern" nor do they refer to it as a design pattern (even though it technically is). A name just arose naturally. No need to give the concept of naming concepts another word.
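(For anyone unfamiliar, a minimal sketch of currying in TypeScript; the function names are just for illustration:)

  // Currying: turning a two-argument function into a chain of
  // one-argument functions. Nobody calls this the "Curry pattern".
  const add = (a: number, b: number): number => a + b;
  const curriedAdd = (a: number) => (b: number): number => a + b;
  const addFive = curriedAdd(5);       // partial application falls out for free
  console.log(add(5, 3), addFive(3));  // 8 8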

To take the illustration even further, imagine driving techniques. Should I give the concept of naming driving techniques some name to exaggerate its importance? Perhaps I can; I'll call it "Vehicular Translation patterns." And instead of talking about cars using regular English like a normal person, I'll just communicate like this: "Execute the drift pattern in composition with the slide pattern and inverse throttle pattern to maneuver through that turn." Talking like this helps me sound smarter while obfuscating communication to the point where it can only be understood by a select few. The book Design Patterns introduced to the programming world what could potentially be done for driving as well. So if it's pointless to do this for driving, does that mean it's pointless to do for programming? In my opinion the answer to that question is "Yes."

In short, use English for most concepts and name things only where it matters and as it comes naturally. No need to invent an entire discipline and inflate it with made-up terminology.

I realize race car driving does have its own nomenclature, but it's not overused to the extent of design patterns, nor did they feel the need to give the nomenclature its own nomenclature (Vehicular Translation patterns, VTP theory for short).


Currying is a bad example, as that was a concept in mathematics before it was a software engineering concept. Also, defining a vocabulary is no trivial feat. Most of software engineering is deciding what definitions for architectures, modules, and interfaces should be used to make software systems most efficient and maintainable (in a way, creating a vocabulary for a specific domain).


Why does that make it bad? I could call it Mathematical Lambda Patterns, MLP for short, and put currying under it.

Or I could just call it English vocabulary within the field of mathematics.

Every single thing you said could be done without the usage of the word "design patterns" and is already done without meta awareness of itself in all fields of engineering, science and anything.

If I were to build a house, is there not a language for "architectures, modules, and interfaces" used to make houses more efficient and maintainable? It would also be (in a way) creating vocabulary for a specific domain.

The difference is that for physical architecture it's missing the pompous self-importance. It does not need a name for itself. Imagine if it was called "Structural Language Patterns," and every concept was suffixed with the words "Structural pattern."

You will note "design patterns" is actually derived from physical architecture, but physical architecture does not go overboard and give itself a name other than "English vocabulary" and attempt to communicate with big words where plain English would make more sense.


Design Patterns gave us the vocabulary and opportunity to argue about design for a previously ignored level of abstraction.

Source: started design patterns study group, still active today.



The value of Design Patterns is the shared vocabulary. Abuse of words and grammar drives me nuts too.

But older me now understands every generation goes through a phase where they think they newly discovered sex.

When I asked my then teenaged son the difference between "emo" and "goth", he informed me that "goth" is for old people.


>The value of Design Patterns is the shared vocabulary.

English is shared vocabulary. What is the point of "design patterns" when you can already define a word for a pattern in English?

My argument is design patterns is 100% the abuse of words.


Would you rather argue the precise meaning of words or discuss the nugget of an idea that someone needs your help articulating?


Examine the two actions below:

   Given a concept, I name that concept "Gloop" and make it part of a defined word in the "English Language."

   Given a concept, I name that concept "Gloop pattern" and make it part of a defined word in "Design patterns."
Both actions have the potential to fulfill either goal you describe above depending on the definition(s) of gloop and "gloop pattern."

Independent of the definitions, or assuming both mean the same thing, the two actions are one and the same. There is no benefit to using one technique over the other.

Let's give you another angle: defining a term under the umbrella of "Design patterns" doesn't make that definition any more precise than if you defined that term under the umbrella of the English language. There is no difference, period.

Design patterns is shared vocabulary. English is also shared vocabulary. But the term "Design Patterns" is itself part of the English language. By creating the term "Design patterns" within the English language you are essentially recursively adding complexity to the English language by defining a redundant concept inside the same concept.

Just use English. Define the pattern in English; there is no need to define the pattern in "Design patterns."

Additionally, many patterns are better described with existing English words. Why use "Facade pattern" when you can just say object wrapper? The term "Design patterns" inserts a sort of false formalism and elitism into what is essentially just creating new vocabulary in the English language. The claim I'm making in this paragraph is that while, yes, it's good to have some formal nomenclature, it's excessive to give all of these patterns their own names. Let the naming and the definitions flow naturally.
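(To illustrate what I mean, a minimal sketch in TypeScript; the class and method names are made up:)

  // Call it a "Facade" or just a wrapper: a thin object that hides a
  // clunky subsystem behind a simpler interface.
  class LegacyBilling {
    openSession(): number { return 42; }
    charge(sessionId: number, cents: number): void { /* talk to the mainframe */ }
    closeSession(sessionId: number): void { /* ... */ }
  }

  class BillingWrapper {
    constructor(private legacy: LegacyBilling) {}
    charge(cents: number): void {
      const id = this.legacy.openSession();
      this.legacy.charge(id, cents);
      this.legacy.closeSession(id);
    }
  }

  new BillingWrapper(new LegacyBilling()).charge(499);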

Other fields of programming have patterns, but they don't try to turn these patterns into some kind of theoretical field with its own nomenclature. Currying is just currying; nobody calls it the "Curry pattern." Recursion is just recursion; nobody calls it the "Recursion pattern."


I think you would have loved our study group. Especially the early years.

My personal position on the value of the art of naming, e.g. Facade, Proxy, Wrapper, Adapter, was that it somehow captures the author's original intent. As distinct from the implementation. How it's meant to be used.

Lofty sentiment coming from someone condemned to decades of code maintenance.

Professionally, methinks design patterns, and their misuse, have been mostly detrimental. Maybe because the notions were taken too literally, treated prescriptively rather than descriptively.

Calling everything a Decorator. When it's actually a Chain of Responsibility. And when you try to patiently explain the evils of silent failures, buried deep in the levels of indirection, to the "senior architect" author, that same architect insults your intelligence and walks off.

The mere utterance of Factory and Singleton in public somehow empowering legions of noobs littering entire organizations (and libraries) with innumerable implementations.

Trying to debug something called a "Write thru Cache" when it's anything but.

Etc.


A very good pattern book that I've found to be incredibly well written is Robert Nystrom's Game Programming Patterns. I'm a huge pattern advocate, and have consulted the Gang of Four pattern book many times, as well as Christopher Alexander's A Pattern Language. Still, Nystrom presents some new patterns with very concrete examples. Even though they are presented within the domain of game development, with a little creativity their utility can be imagined elsewhere.


History and culture is under-taught and under-appreciated. Maybe students can have a big debate in their third year about whether or not worse really is better?


'Code Complete', Steve McConnell


I would recommend Clean Code over this one. Code Complete isn’t bad, but I feel like it’s filled with things most junior programmers already know


'Becoming A Technical Leader', Gerald Weinberg


I probably ought to read this one. I got on a Weinberg reading kick after reading some recommendations here a few months before his death. The one I particularly liked was The Psychology of Computer Programming. While dated in many ways (like MMM), it has a lot of interesting ideas and discussion about the structure and behavior of people and teams (regardless of discipline, though he was writing about programmers). I actually have a quote that I've kept in my office IM for a while that is a paraphrasing of Fisher's Fundamental Theorem [0]:

  Fisher's Fundamental Theorem states—in terms appropriate to
  the present context—that the better adapted a system is to a
  particular environment, the less adaptable it is to new
  environments.
The theme of that quote kept coming up, I presume deliberately, in many of the later chapters.

[0] https://en.wikipedia.org/wiki/Fisher%27s_fundamental_theorem...


You might also find Weinberg's 'An Introduction To General Systems Thinking' of value. Weinberg was heavily influenced by general systems theory, as illustrated by referencing Fisher, and he wrote the book as a way to introduce others to that line of thinking.


Isn't that why most people use Docker nowadays?


'Making Things Happen', Scott Berkun


> [The Mythical Man-Month] Key takeaway: You'll soon fall victim to the problems identified by this book, even after reading it.

Yeah, that pretty much describes all of software development in one stroke. But you should read it anyway (if only to experience the feeling of someone predicting your future before (most of) you were born).


I've always liked the MMM, but there is also an undercurrent of it being a re-telling of Zeno's paradox, and it's important to remember that Achilles actually does win.

So, adding new bodies does incur cost, but the belief that adding new bodies always adds more cost than it adds impetus to the outcome is not always true.

The classic example is "the Hump", which was the cost, in fuel terms, of shipping fuel to China in order to be able to fly from China to bomb Japan. Economically ruinous, yes, but people forget that missions were nonetheless successfully conducted: it was ruinous, but the outcome was achieved within limits.

So the MMM does not say "never add a body"; it says "know what you're incurring in overhead when choosing to add a body".


I should say that I misspeak (miswrite) in characterising MMM this way; it's how it's casually talked about. The incremental benefit of adding people can be higher than zero, but less than enough to be worth the cost. It can also be negative, which is the core point MMM was making. Can be; it is not always. And of course there are problems where it simply cannot alter the trajectory at all (the baby problem).


The difference is that in software, if you add a body and, as in most organizations, that body has to appear useful, they will tramp about in the soup, making a big mess and harming progress.

Unlike in war, the marginal curve of adding an extra body goes negative in software. And right quick unless circumstances are favorable.

(If you doubt this, imagine a thousand Arthurs charged with implementing a CRUD app.)


From what I recall, it talks about 3 somewhat related things.

First is that adding people slows things down in the short term.

The second is that, due to overhead, adding people won't give a linear speed-up in completion time (see the arithmetic sketched below).

And third, adding people won't help a project that is linear in nature. The common example is using 9 women to get a baby in 1 month.
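The arithmetic behind that second point (Brooks frames it in terms of pairwise intercommunication): with n people there are n(n-1)/2 communication paths, so coordination overhead grows roughly quadratically while capacity grows only linearly. A toy calculation, sketched in TypeScript:

  // Pairwise communication paths for a team of n people: n*(n-1)/2.
  const channels = (n: number): number => (n * (n - 1)) / 2;
  for (const n of [3, 6, 12, 24]) {
    console.log(`${n} people -> ${channels(n)} channels`);
  }
  // 3 -> 3, 6 -> 15, 12 -> 66, 24 -> 276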


That's a good summary.

My years suggest that he wasn't nearly pessimistic enough.


You mean if you add 9 women you get a baby in 81 months?


I mean that in the limit, adding the n+c'th person isn't just useless, it's devastating. If the optimal (in terms of calendar date of completion) size for a project is (say) three people, it is entirely possible, if not almost certain, that putting 30 people on the project will push the expected completion date to infinity. In other words, the project will fail.

Now, you might say, "I will just bench those other 27 people" and have them play cards or something. In most organizations, though, you cannot do this. Instead, they must be seen to be useful somehow. And if you have 30 people with their hands in a three-person project, no matter how peripherally, failure is all but guaranteed.

I have seen this many times.


100% feel you.

Last time I worked at a big company, there was lots of dead wood, but one guy in particular was just a massive liability. Nice guy, tried hard, but utterly incompetent, and (I think there is a term for this?) he was unaware / ignorant of how incompetent he was. He would often check in code that would break the build (team of 300 engineers, large telecom system), he would write and run scripts that would bring computing clusters to their knees (this was late 90s, there are probably ways to mitigate that now), he would consume lots of high-quality talent's time with basic questions, etc.

I literally asked my manager if we could pay Leo to sit home and play video games. Of course, as you said, everyone's gotta look busy.

Here's the punchline: I learned a new term (to me, at the time... not sure I've heard it again) from a greybeard/wizard there -- this guy was a genius. He had a very appropriate term for Leo: negative producer.

Bingo!


Reading this book AND trying to follow its key lessons makes a huge difference in productivity, which I can testify to from my own experience. This book is often compared to the bible of software engineering, in that everyone knows about it (e.g. 'no silver bullet'), some people read it, but only a few abide by it. So its key lessons are hard to follow in the real world, but for a good reason.

We started a project 5 years ago, after a few months of failed attempts. From the very onset of the project, we tried to adhere to the key lessons of the book. Examples are: recognizing the importance of minimizing communication overhead (the most important assets are not people but time), following the surgical model (key decisions should be made by a single individual), practicing effect-free programming whenever possible, allocating enough time for testing, and so on. I would definitely attribute the success of our project to the teachings of the MMM.

Like many books on software engineering (and self-help books in general), just reading a book and learning its contents may not make any difference in practice. Only when you seriously make conscious efforts to practice its teachings do you realize what the book is really about. This is also the reason why many university courses on software engineering are boring.


I would also recommend two more good books. They are relevant to almost every aspect of life and creation process:

- Antifragile: Things That Gain from Disorder by Nassim Nicholas Taleb[1]

- Skunk Works: A Personal Memoir of My Years at Lockheed by Ben R. Rich[2]

[1] https://www.goodreads.com/book/show/13530973-antifragile

[2] https://www.goodreads.com/book/show/101438.Skunk_Works


Can I dive straight into Antifragile without having read the author's previous books?


I started with Antifragile and didn't regret it. If you feel drawn toward the earlier ideas, read Antifragile -> Skin in the Game -> Fooled by Randomness -> Black Swan. This might also be related to the fact that I usually prefer starting with practical knowledge and then digging deeper into the theory/more basic knowledge.


Sure. Any references to concepts from earlier books can be easily googled and learnt. Most books nowadays are around 200-500 pages, but the essence is contained in only 20-30 pages.


I'm disappointed not to see The Pragmatic Programmer there. I would think that it's great reading, not only for every software developer, but also for stakeholders and product managers.


Peopleware is another good one.


Software Tools by Kernighan and Plauger was the first programming book to blow my mind, and it changed the way I approach working in difficult/legacy systems. As a programmer, you don't have to accept the limitations of the system you're working with, and by building simple tools you can make even the worst system bearable.

And of course The Elements of Programming Style is a classic. The lessons seem like clichés now, but there was a time when things that seem obvious now were hotly debated.

https://en.wikipedia.org/wiki/The_Elements_of_Programming_St...


"The Design of Everyday Things", I would consider the digital versions to be

Code and Other Laws of Cyberspace

http://codev2.cc/

and

Free Software Free Society: Selected Essays of Richard M. Stallman

https://shop.fsf.org/books-docs/free-software-free-society-s...


I've read only The Mythical Man-Month from this list. It is amazing how little project management has really changed. Similar ideas (sometimes the same ones) applied to similar problems.

I would recommend watching this talk [0] from Kevlin Henney to anyone who likes that book.

[0]: https://youtu.be/AbgsfeGvg3E


A Discipline of Programming, by Edsger Dijkstra.

"For lack of a bibliography, I offer neither explanation nor apology."


I take it that the goal is to list books of general interest for the software engineering student. It is not clear what the author means by classic CS material, so it is hard to know what is excluded. Surely, Mythical Man Month and GoF qualify as classics of software engineering. Or does the author mean classics from the entirety of a CS program?

I've been teaching a software engineering class for a few years. Being conscious of the high cost of textbooks, I find that Applying UML and Patterns by Larman achieves the best balance of practice, design, and process.

For students who want to dig into agile methodologies I would recommend Extreme Programming Explained by Beck and the Poppendiecks on Lean.

In my opinion, a philosophical grounding is helpful for the programmer. Accordingly, I'd want to suggest something like Locke's Essay Concerning Human Understanding or a secondary source on Aristotelian categories. Moreover, a broad knowledge of different types of ethical systems will benefit a developer.

Lastly, as a kind of curve ball, I would recommend any STEM student to work through an LSAT preparation book. Technologies will come and go, but, throughout a career, the problems a developer encounters will likely have dimensions that require general critical thinking.


I don't think you can teach software engineering. Writing software is more like writing proofs than building airplanes. Take a formally verifiable language like Agda for example. Proofs are indistinguishable from programs in that language. The Curry-Howard correspondence shows this is true for program-proofs in general.
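(A toy illustration of the correspondence, in TypeScript rather than Agda, with names of my own choosing: read each type as a proposition and each total function of that type as its proof.)

  // "A and B implies A": proved by any total function of this type.
  const fst = <A, B>(pair: [A, B]): A => pair[0];
  // "(A implies B) and (B implies C) implies (A implies C)":
  // transitivity of implication, proved by function composition.
  const compose = <A, B, C>(f: (a: A) => B, g: (b: B) => C) =>
    (a: A): C => g(f(a));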

Large software systems fail continuously in countless ways all the time. Google search will still throw a 404 or 5xx occasionally. We're not building airplanes that either fly or crash (although even the software running airplanes is now increasingly fault tolerant). We are effectively getting from point A to B through a variety of logical operators, increasing the set of valid A through trial and error. When we find an input that throws an error, our theorem was faulty and we update our proof. When a seemingly valid input can't produce the results we want, we double-check to make sure our proof is still valid. Dilettante engineers will begin to introduce unbounded complexity at this point by making fixes without understanding root causes.

For the above reasons, we should abandon any attempt at a comprehensive software engineering curriculum or canon. We are not architecting a skyscraper to be handed to a construction company and built once and for all time. We are continually exploring and staking out a specific problem space. Our greatest compass in these new journeys will always be foundational computer science. We must take the specific and make it general. Careful application of elementary data structures, algorithms and discrete analysis will move us, and has moved us, continents further than any appeal to "best practices" ever will.


Funny, I'd say that writing proofs is a lot like building an airplane, and software engineering might be far away from both.

In writing a proof, you provide an argument that justifies some claim is true. Your argument has to be airtight, so that no matter what kind of counter example or ugly scenario is proposed, your argument still holds.

The same seems true for an airplane. It needs to do its job (fly) and do so in the face of a myriad of different external factors. If you build a plane, it should "never" fail.

Software, on the other hand, can be built without thinking about ALL of these edge cases. Obviously it's better if it always works, but it's usually okay if there is some bizarre scenario that causes an error. For some pieces of software it might even be okay if these errors never get fixed (i.e. a very small number of users ever experience them). Software can work, even when it (kinda) doesn't.

The Curry-Howard correspondence might be technically true, but we're just as bad at writing specifications for programs as we are at writing the programs themselves. It could be good to acknowledge that and just write the software that solves most, but maybe not all, of our problems.


Although programs function in an ideal universe amenable to proof and theory and therefore perfect for calculation and theoretical approaches rather than engineering "best practices," what you describe has not been the case in the industry.

The irony is that airplanes live in a non-ideal universe where no theory can predict anything to a perfect degree and testing must take precedence. Yet when building an airplane, aerospace engineers use far more theory than software engineers do, even though testing and engineering best practices are an absolute requirement given the unpredictable nature of real-world physics acting on an airplane.

I feel the reason the world is the way it is comes down to necessity. A buggy program is (usually) an annoyance. A buggy airplane is (always) a disaster. It is necessary to prevent disasters, but it is not necessary to eliminate an annoyance.


Blech. Nothing here that's obscure or likely to alter the path of someone's knowledge. No philosophy, no history, and the list includes Gladwell. Actually, the inclusion of Gladwell probably distinctly colors my thinking on the subject.

Maybe "How to read a book" or "The Alchemist" or I don't know, something that's not just so totally typical.


Yeah, I use Gladwell as a negative control on book lists as well, along with Jared Diamond.


Harari would be upset to be excluded from this list.

:)


You went too far with Coelho. Might as well throw some of Osho's books in there too.


I personally don't like philosophy books. I think there's more to be learned about philosophy by reading about specific topics, rather than actually reading about philosophy. Some exceptions like Plato, etc.

Absolutely agree with history being useful, though. Any particular periods of history you think are essential to know about, and any particular great books on them?


I'm not the OP, but I agree that history's an essential subject.

History has many themes. These appear in every time and place, sometimes in the forefront, sometimes in the background. I believe the most "essential" period is the one that answers your questions.

So, I'd recommend picking any period that you have a vague curiosity for. I'd go a half step further and recommend avoiding recent periods (late 20th Century).

IMO, it's too recent for there to be consensus on what constitutes good scholarship. There are obviously exceptions to this, but if you're new to history reading, it's difficult to disambiguate the good from the bad.

Examples of periods and geographies that are particularly well-studied, with good accessible literature:

- Late antiquity in the Mediterranean (fall of the Roman Empire)

- Inter-war period in continental Europe (Weimar, etc.)

- Revolutionary period in France, United States

- Antebellum period in the United States

- Early Russian Revolution (there aren't many good syntheses imo, because this period was incredibly complicated)

- Europe during the reign of Louis XIV (1643-1715)

- Napoleonic wars and aftermath

This list is pretty Euro-centric. IMO, these are the safest places to start, as the plurality of English-language scholarship is in these places and periods. After developing a good nose here, you'll feel comfortable reading in areas where the scholarship isn't as deep.


At the risk of sounding pretentious, I very much enjoyed Herodotus' "The Histories" (Sélincourt translation). It reads like a Game of Thrones season with all its twists and turns.

(Side-note: I usually multitask a number of books and many months can pass until reading resumes. Yes, it's weird, and yes, if someone has a nice trick for this, please help, I'm running out of bookmarks)


I'm genuinely curious why Gladwell is a negative control. The name sounds familiar but I don't know much about him


> I'm genuinely curious why Gladwell is a negative control. The name sounds familiar but I don't know much about him

He's written a lot of bestselling pop social science books. IIRC, he's a good storyteller, but is criticized for cherry-picking stuff to create the impression that you've just learned something profound and counter-intuitive when maybe you actually haven't.

Here's a review of one that makes that point: https://www.newstatesman.com/2013/10/malcolm-gladwell-backla...

Disclaimer, I actually enjoy his books.


Creativity Inc is a wonderful book, but Pixar's culture and work environment are exceedingly rare if not entirely unique. For a more realistic, yet still useful example of corporate culture, I would recommend "The Phoenix Project" to students.


I absolutely love The Phoenix Project, but it is 100% fiction and not based on any real-world evidence. It’s great for laughs but I wouldn’t use it to “teach” anyone.


My opinion: There's more we can learn from fiction than we think.

By analogy, creativity, perhaps even reality distortion. I don't think we should solely rely on non-fiction and proven methods to teach or grow. Indeed, if we did that, we wouldn't grow at all.


Scroll to the bottom of this page and watch the lectures on conceptual design: https://stellar.mit.edu/S/course/6/fa18/6.170/materials.html. It's an interesting analysis of why some software ideas fail on launch. He has a book as well, but the lectures are better, so I posted them instead.


I had a professor make us read Death March by Yourdon and Peopleware. Both were great, especially Death March. It helped me notice bad patterns at an employer and gave me the motivation to quit. It's also very well written and funny, but a bit too real at times. You'd think all of it is fake or exaggerated, but then you're living it and it's so depressing.


I bought my boss two copies of The Mythical Man-Month so he could read it twice as fast.


If he keeps one copy at home and one copy at the office, he could get done reading faster than if he left a single copy in one place and only read it at that place.


And he hired 4 people to read to him for each book. Now he could learn 8 times as fast.


Not if he outsourced the reading of said book to a foreign company. He would have to check, every hour for 18 months, that their reading matched his, for a book that would have taken him 3 days to read himself. Slowly.


I've often had thoughts of opening up a little web shop that anonymously shipped copies of The Mythical Man-Month in bulk.


I used to leave copies of Mythical Man Month on the seats of the MUNI in SF during the DotCom era. Also PeopleWare. Actually Patti Dunn gave me a few hundred extra copies of PeopleWare from some Barclays event I accidentally attended and I couldn't conveniently get them home on my bike, so that was a one-time event.

I so enjoyed the pragmatism of Crockford's "Javascript: the good parts" that I gave it away for a while too.

Sometimes I would print out great tracts like "big ball of mud" and leave them at appropriate places in SF.

Eventually I realized I was either preaching to the converted or preaching to the wilderness. So I stopped. But you've inspired me to reconsider anonymous viral software evangelism, at least for a brief moment.


| But you've inspired me to reconsider anonymous viral software evangelism, at least for a brief moment.

That is likely the best compliment I've gotten this week.


Good one. I chuckled.


nice


I would swap out Outliers for Dark Pools, which better highlights what a revolution driven by software engineering in its purest form looks like, and one that is still ongoing to this day.

Outliers is kind of entrepreneurship junk food and 10,000 hours has already been debunked.


I'm not sure it's debunked and I don't understand the rabid hatred of it (like bringing it up where it isn't mentioned at all).


Outliers is debunked. The authors of the primary sources that he quoted disavowed it as an inaccurate representation, and when you look outside of kind learning environments like sports it becomes even less correct.


It’s interesting to see “The Mythical Man-Month” in a list like this, since I’ve never once seen any team implement what it suggests. Is it meant only as a cautionary tale?


What I took away from MMM was less concrete, actionable suggestions you can implement, and more "things to watch out for" in the development process. Things like the second system effect, and Brooks' Law (adding people to a late project makes it later) are red flags. You can't always avoid them (management will add warm bodies in desperation, you probably can't convince them otherwise). But at least when you see second system effect happening you can often try to mitigate it by design/architecture decisions.

I would also say the lessons around keeping teams small have been taken to heart in industry somewhat, e.g. in the notion of the two-pizza team at Amazon.

That said, some of Brooks' suggestions, such as the "surgical team" concept, are more products of their era and haven't aged well.


“Note: This book does seem to overgeneralize and oversimplify complex situations, but there are still good points” describes everything by Malcolm Gladwell.


If they want to get started with ML, I recommend Hands On Machine Learning with Scikit Learn and Tensorflow.


Aurelien Geron is very good at explaining concepts and I thought it was a lot more approachable than say the canonical Deep Learning book.


Hey, I have no background in ML. Would you still suggest it? I do systems.


Yes


My list for all engineering students:

- Design of Everyday Things

- Founders at Work

- Antifragile


Clean Coder is great for developers just starting their career.


I'm surprised to see nothing on negotiation there.


FWIW, I found John Ousterhout's "A Philosophy of Software Design" pretty good actually (despite having misgivings about it earlier).


These books will tell them nothing.

I legit think that there is only one way of engineering complicated software and that is Entity-Component-System (https://en.wikipedia.org/wiki/Entity_component_system).
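For anyone who hasn't seen it, a minimal sketch of the idea in TypeScript (all names here are made up): entities are just IDs, components are plain data keyed by entity, and systems are functions that iterate over whichever components they care about.

  type Entity = number;

  // Components: plain data stored per entity, no behavior attached.
  const positions = new Map<Entity, { x: number; y: number }>();
  const velocities = new Map<Entity, { dx: number; dy: number }>();

  // A system: a function that runs over every entity that has the
  // components it needs.
  function movementSystem(dt: number): void {
    for (const [e, v] of velocities) {
      const p = positions.get(e);
      if (p) {
        p.x += v.dx * dt;
        p.y += v.dy * dt;
      }
    }
  }

  // Entity 1 has both components, so the system moves it.
  positions.set(1, { x: 0, y: 0 });
  velocities.set(1, { dx: 2, dy: 1 });
  movementSystem(0.5);  // positions.get(1) is now { x: 1, y: 0.5 }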


ECS systems are great, but claiming they are the only option available is more than a tad unreasonable.


As inflammatory and irrelevant as this comment is or seems to be, I'd have to agree ECS is, at least, underrated. My mind was blown when I learned about it.


Can someone explain to me what separates ECS from relational modeling in databases?



