The missing semester of CS education (csail.mit.edu)
1163 points by anishathalye on Feb 3, 2020 | 196 comments



Over the years, we (@anishathalye, @jjgo, @jonhoo) have helped teach several classes at MIT, and over and over we have seen that many students have limited knowledge of the tools available to them. Computers were built to automate manual tasks, yet students often perform repetitive tasks by hand or fail to take full advantage of powerful tools such as version control and text editors. Common examples include holding the down arrow key for 30 seconds to scroll to the bottom of a large file in Vim, or using the nuclear approach to fix a Git repository (https://xkcd.com/1597/).

At least at MIT, these topics are not taught as part of the university curriculum: students are never shown how to use these tools, or at least not how to use them efficiently, and thus waste time and effort on tasks that should be simple. The standard CS curriculum is missing critical topics about the computing ecosystem that could make students’ lives significantly easier.

To help mitigate this, we ran a short lecture series during MIT’s Independent Activities Period (IAP) that covered all the topics we consider crucial to be an effective computer scientist and programmer. We’ve published lecture notes and videos in the hopes that people outside MIT find these resources useful.

To offer a bit of historical perspective on the class: we taught this class for the first time last year, when we called it “Hacker Tools” (there was some great discussion about last year’s class here: https://news.ycombinator.com/item?id=19078281). We found the feedback from here and elsewhere incredibly helpful. Taking that into account, we changed the lecture topics a bit, spent more lecture time on some of the core topics, wrote better exercises, and recorded high-quality lecture videos using a fancy lecture capture system (and this hacky DSL for editing multi-track lecture videos, which we thought some of you would find amusing: https://github.com/missing-semester/videos).

We’d love to hear any insights or feedback you may have, so that we can run an even better class next year!

-- Anish, Jose, and Jon


That’s cool. Now, on to my possibly unpopular opinion: This isn’t what computer science is about. In fact, you don’t even need to use a computer to do computer science.

Sure, some stuff you learn in CS can make you a better software engineer. CS cannot make you a software engineer.

CS can definitely not make you adept at using computers and neither should it. That’s something earlier education institutions must tackle.

It’s always good to have optional courses for various topics of interest. _Requiring_ students to learn, say, MS Office (I had to), is just plain ridiculous.


Science doesn't abstractly drop from the heavens, fully formed. Every science has practical enablers that are required to get stuff done. Astronomers use telescopes, physicists/chemists/etc. use lab equipment, and mathematicians use various notations and other tricks (and nowadays proof assistants) to make their job easier.

MS Office might not be practical for computer science (also note how the OP doesn't list it), but learning how to write your papers in LaTeX might be, and knowing how to use a shell certainly is.

E.g. if you'd like students to learn about type theory, they will need to experiment with your compiler. You cannot expect students to miraculously be proficient in this, and explicitly teaching them (and requiring it as a prerequisite course to signal that yes, it's important) can turn weeks of frustration followed by a huge dropout rate into a productive course.


They should be writing them in asciidoc (or even markdown) and using style sheets if they really need complex formatting!!! (although at least latex is decent at typesetting and has nice defaults.)


Does asciidoc have equation typesetting support? I don't really care what I use to write my words, everything is equally good (except for LaTeX which is abnormally good at typesetting), but I care a lot about the user experience for equations, which varies widely between programs.


I always use LaTeX-style equations, although I think it supports MathML if you want a WYSIWYG option. The LaTeX math is the main argument for it; MathType is kind of miserable.

The reasons I prefer asciidoc to straight LaTeX are:

1) the formatting is completely separated from the text

2) you’re insulated from the specific rendering engine (you can use PDFLaTeX or WebKit+MathJax etc.) while still getting decent equation syntax and BibTeX.


Yes. I intended that more as an abstract example of a tool that's useful for doing science, but clearly not a science in and of itself[1].

Though latex might still come in handy once you actually want to submit papers to journals, or for a thesis. YMMV.

[1] then again, if I remember my feeble attempts to write latex macros, maybe the emergent behaviour of common latex packages would be a good research subject? ;-)


> That’s cool. Now, on to my possibly unpopular opinion: This isn’t what computer science is about.

This is not an unpopular opinion at all; CS degrees do not typically cover what's in this class, which is precisely why they called it "the missing semester".


Yes, it is exactly this naming that somehow irks me. It somehow seems to imply that this should be part of the regular CS curriculum. It should not.

Indeed, universities should once again become a place where you go to pursue a career in science. Not a half-baked vocational training center. That’s why I’m against excessively accommodating this misuse.


Are you saying that you completed a CS degree without writing programs on a computer?

There are many great experimental physicists (including Richard Feynman), chemists, biologists, and engineers who used real hardware. Why shouldn't computer scientists use real hardware?


Wasn't it Sussman and Abelson who said that computer science is both not a science and not about computers?


Yes, and MIT (correctly) dropped Sussman and Abelson's head-in-the-clouds intro class in favor of a practical one that actually teaches you about computers.

I took Abelson and Sussman's class myself, as an MIT undergrad, just before it was phased out. I got a lot out of it, because I had already been using UNIX and writing code for years at home as a teenager. If you didn't have that background it would be useless to you.

"Computer science isn't about computers" is a similar statement to "English composition isn't about pens or keyboards." If you can't use the tools, you won't get very much work done. A writer is fortunate that our grade schools generally teach handwriting and/or typing - but if they didn't, a college degree on how to tell compelling stories and understand the monomyth isn't going to help you actually write books. Computer science isn't about using editors or shells, but if you don't come in with knowledge of editors or shells, you won't get very much done.


Your comment reminds me a bit of Umberto Eco's How to Write a Thesis. He gets down into the details of using colored pens and index cards. Craft.

https://thereader.mitpress.mit.edu/umberto-eco-how-to-write-...


Yup.

'Underlying our approach to this subject is our conviction that “computer science” is not a science and that its significance has little to do with computers.' -- from the preface to SICP.

There's a similar quotation often attributed to Dijkstra, but it seems doubtful whether he actually said it.


I didn’t, because I switched to a different type of university. I could have, though.

I’m not saying science should be only in the mind or maybe pen and paper. I’m saying CS should not have a “how to computer” course.


It's not computer science. It is computer literacy.

Most people who come out of a CS course will go on to become developers, not academics. The course topics are standard domain knowledge for anyone who builds software - not because of specific tools, but because of the concept that scripting and automation tools exist to make development easier.

There is no sense in which being aware of these topics will make you less effective as an academic computer scientist, if that's what you want to be.


Also it's good to get both academics and developers to use similar tools to ease collaboration.


Astronomers do stop to study how to build telescopes, chemists do stop to study how to create glassware, and biologists do stop to study culture media.

Why do you expect CS students to thrive without learning how to use a computer?


It's only with CS where I see this crazy attitude that you don't need to know anything about the tools of implementation and experimentation in order to pursue a science.

Imagine someone trying to pursue a career in particle physics without knowing anything about how a modern particle collider works. Or someone trying to be an astronomer without knowing anything about how real telescopes work. And at least in those professions, the tools are so huge and expensive and complicated that those scientists really do need an army of technicians and operators to help them gather data or perform experiments; with computers, this just isn't the case: anyone with at least one hand can write a program on a personal computer to test their theory, so the feedback loop should be much, much shorter.


While theoretical computer science doesn't need to use computers for many problems, and the experts do not need to be extremely proficient, that does not mean that a general BSc program should not teach fundamental computer architecture and programming skills -- indeed, it would be irresponsible of them not to.

The other aspect is that CS can be experimental -- the experiments are computational ones. Large computing systems are analysed using the standard techniques of experimental science, and that needs good bench skills -- except the bench is being able to program, reason about programs, etc.

However, does a BSc program need to teach large scale software development, topics like version control and tools like git, CI, etc? No. They are more properly in a BE(Software) course or its equivalent.

Should universities teach Flash/Javascript/Python/C#/nginx/Active Directory as an end in itself? No. That is almost a technical trade qualification.

But you should be able to leave university, learn language X, program in it efficiently, and know how to learn about some system.

(I'm sorry you had to learn about Office.)


I learned tools the old fashioned way - by asking an upperclassman or TA.

I also remember taking a "Software Engineering" class and it bore little to no resemblance to any part of my 20+ year career.


It isn't at the core of the CS curriculum, and neither are general requirements, but it is highly relevant to the course work and I am willing to bet it does not only help people interested in software engineering, but also people who want to be CS academics.

Making sure students are familiar with a full featured text editor, document preparation system, a version control system, the command line, etc. will go a long way. I don't think those are topics regularly covered in earlier education institutions.


I think that's reasonable. I really wish they offered more applicable classes at lower levels of education... I did like 2 days of programmable Legos in 5th(?) grade and then had to take a basic typing / MS Office course in 9th grade (required like you). In 11th grade I took CISCO Networking, Computer Manufacturing, and an intro programming course in C++.

After high school, I did a 4-year degree in Computer Science, and while I learned a lot about algorithms, proofs, FSMs, design patterns, etc., we got very little practical experience building software.

It'd be nice if they covered version control, more *nix and text editors, extending an existing codebase, refactoring, debugging, APIs, common libraries, multiple languages, and the web in general.

I think most CS students learn to become software engineers either on the job or on their own and there's room for school to help ease that transition.


The college I went to attempted to balance these - covering both abstract concepts and concrete tools and practical skills in roughly equal measure.

I think that's fair. Some people will go on to use both to work as a programmer, some may focus much more heavily on the theory end to focus on the science part of CS.

MS Office definitely doesn't seem to have a place. Maybe for a general "computer competency" course for college students, but not as part of a CS program.


"I choose a lazy person to do a hard job. Because a lazy person will find an easy way to do it." -- Bill Gates

Some people just accept absurd drudgery interacting with computers; it never even occurs to them to seek an easier way. But if you can teach the students to be "lazy" when it comes to that kind of repetitive drudgery, they'll be set for life.


I think the problem is more a lack of imagination. They just keep doing things the way they always have.

I seem to have a surplus both of "there has to be a better way"-itis, and some skill in finding such ways.


I call it a lack of curiosity instead of a lack of imagination. Both are useful, but to me imagination is more about coming up with new solutions than understanding how something that already exists can be bent to your will.


I think they go hand-in-hand. You have to have some imagination about how something must work (however off-base) to be curious enough to find out how it really works. It often seems to me that people who aren't curious lack it because they can't even imagine something different than what is in front of them.

They don't wonder how a car works because they can't even begin to deduce how it might work, from the observable outside in, because they lack the imagination to make those deductions. Someone who's curious might see a car and think, "How does this work? I can see the wheels spin and propel it forward, but something must be making those wheels turn. How do they all spin at the same time? What turns the push of the pedal into making the wheels turn?" etc.

I see curiosity and imagination as two sides of the same personality trait expressed in different ways.


> I see curiosity and imagination as two sides of the same personality trait expressed in different ways.

That's because the world of today is both complex and complicated. Imagination is needed to decipher things, and curiosity gives you the urge to find out more. I would argue that they're two different traits, both selected for by evolution. Imagine a tribe on the savannah: someone smart enough to imagine lions hiding behind a hill, yet not so curious as to find out by just walking over, is also the one who imagines throwing rocks to scare potential predators away. So you need both; they're complementary, and they don't even have to be expressed in the same individual, though it's beneficial when they are.


I don't think it's just that. I think a lot of it is just a desire for successful results. Once you learn how to do something and it works you don't bother with asking if it's the most efficient way or not. You know it works so you just keep going back to the well. You then spend your energy on figuring out the next thing you don't know.


Three great programmer virtues, according to Larry Wall: 1. Laziness (figure a smart way to do it that is less work) 2. Impatience (figure how to do it faster) 3. Hubris (figure that what you're doing is important)


Sadly, it doesn't seem to be a real Bill Gates quote. https://quoteinvestigator.com/2014/02/26/lazy-job/

But you can quote Larry Wall on the three virtues of a programmer. http://threevirtues.com/


Since you mention scrolling using key repeat - it truly is painful to watch someone do it on default settings. And there are usually better ways to do that sort of thing. But sometimes, there is no substitute: going to the right spot in the middle of a word/line, moving around a page, browser title bars, etc. Here's the kicker though: most keyboard repeat rate settings around have a maximum that is pretty much unusable! But you can fix it thusly:

xset r rate 180 60
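For reference (on X11), the two numbers are the delay before key repeat kicks in, in milliseconds, and the repeat rate in repeats per second:

    # xset r rate <delay-ms> <repeats-per-second>
    xset r rate 180 60   # wait 180 ms, then repeat 60 times per second
    xset r rate          # with no arguments: back to the server defaults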

When I work on my computer it's like driving a Porsche. When I sit at someone else's it's like I tripped over a door threshold.

There are ways to adjust this on OSX too but it's a lot more touchy. Haven't attempted on Windows.


> There are ways to adjust this on OSX too but it's a lot more touchy

System preferences > Keyboard > Key repeat rate

Adjust the slider to your liking. Works in every app. It's been there since 1984, but to the topic, not many users poke around in system settings anymore. If you want faster than the slider allows, try this from the command line: defaults write -g KeyRepeat -int 1
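For completeness, the commonly documented pair of settings (both values are roughly in units of 15 ms, and a logout/login is usually needed before they take effect everywhere):

    defaults write -g KeyRepeat -int 1          # interval between repeats once repeating starts
    defaults write -g InitialKeyRepeat -int 15  # delay before repeating starts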


That's what I mean by more touchy. I think you also can't do this without messing with system integrity protection. And, "1" is the lowest it can go which is not that low IMO. :)


Personally this is what I choose to use the mouse for - even in Vim. A scroll wheel is nicely intuitive for descending a buffer.


Well, I think the point was a different one... more like pressing G to jump straight to the last line.


We previously had the same problem at University of Copenhagen, it was called "the hidden curriculum" among students.

When the undergraduate programme was reformed a few years ago, these subjects were integrated into various courses so they could be taught in a learning-by-doing fashion. As part of the first programming and problem-solving class (F#), we also teach LaTeX, Emacs and basic use of the command line; as part of a project-based software engineering course (second semester), Git is used extensively; and so on.


Um, quick question: so should one use Emacs, as at the University of Copenhagen, or Vim, as MIT is teaching?

(Just kidding)


That question has already been answered: https://missing.csail.mit.edu/2020/qa/#vim-vs-emacs :)


Love this. I remember it wasn't until junior year when I was reading about the theoretical underpinnings of the unix kernel that I learned what the pipe operator I'd been copy-pasting on psets really did. I mean, it's great to learn those underpinnings, but most course 6 classes assumed you already knew all these tools... if it wasn't theory, it wasn't their responsibility to teach it.
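For anyone who's in the same spot today: a pipe just connects one program's standard output to the next one's standard input, so small tools can be chained into one job. A throwaway example (the log file name is made up):

    # most frequent IP addresses in a hypothetical access.log
    cut -d' ' -f1 access.log | sort | uniq -c | sort -rn | head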

"Missing Semester" describes it perfectly. Wish there had been something like this back in my day... I remember I felt as if I had learned all the theory behind fluid mechanics but didn't know the first thing about fixing a leaky faucet in my kitchen. Keep up the good work!


My feedback is to keep doing what you're doing. Some people will talk about how this kind of mundane training might not be relevant some years later. By the same reasoning students shouldn't learn the layout of their campus or how to add and drop courses, because those skills won't be relevant after they graduate. Keep helping students not waste time now on tasks that should be simple.


I read through the article, and a lot of what you've got looks really useful. Your explanation of Git and source control is especially valuable. These are all pretty much skills a good programmer should have, but I would make one exception.

Personally, I like and have frequently used Vim. It's a useful tool to know how to use if you need to edit text files on a GUI-less system. However, I have yet to meet a single programmer with any ability to work as a team member who chooses to use Vim while using an OS with a desktop environment. Vim does have an interesting and valuable ideology, but that ideology isn't perfect, and I worry that exposing new programmers to only one command-line text editor and its specific ideology might provide too narrow a perspective. It might be a good idea to also present the ideologies behind other command-line text-editing approaches, such as Perl scripting and regexes for updating many files at once, so that students widen their perspective and consider what they need to do when selecting a tool.
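To make that concrete, here's a rough sketch of the kind of bulk edit I mean (paths and names are invented; the -i flag below is the GNU sed form, BSD/macOS sed wants -i ''):

    # rename a function across a tree without opening an editor at all
    grep -rl 'old_name(' src/ | xargs sed -i 's/old_name(/new_name(/g'
    # or the Perl flavour, keeping .bak backups of every touched file
    find src/ -name '*.c' -exec perl -pi.bak -e 's/old_name\(/new_name\(/g' {} +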


Lots of people use Vim in a desktop environment. At Facebook I would guess that something like 20% of software engineers use it on a daily basis. It’s still an incredibly powerful editor with lots of practical application and it’s constantly being updated. If you have the right plugins it can be an order of magnitude more productive than an IDE.


I’d recommend Docker / virtual machines as its own top-level module. There’s essentially no better way than containers to make projects portable and reproducible, and containers are now essential for pretty much any production system. I’ve taught docker to a few dozen people and while they hate it at first (the learning curve can be steep) they love it in the end.

Also wish this course covered bazel, maybe in potpourri. Make is important and bazel isn’t standard but bazel is pretty important for large C++ projects.

Docker and bazel both have fairly complicated interfaces and the time set aside for a course like this is the perfect time to play with them.


Ha some MIT alum friends and I were just talking about how great it would be if something like this existed, rather than having it be left to one’s first years as a junior engineer in a company. Thanks for doing this, it’s sorely needed.


I'd recommend making a course on how to create a proper Makefile, from the beginning to the advanced level.


Makefiles are worse shell scripts. /religion

I do agree with the spirit of this though. There's a big difference between "click a button in your IDE" to compile and learning about all the code-ish things that go into a real software project:

* choosing a build system (and package system too, I guess?)

* setting up CI

* static analysis

* setting up CD

* config-as-code for managing the state of prod

* staging environments?

* production monitoring

* production alerting


I would go one step even more basic and teach touch typing. It amazes me when I watch someone code by laboriously copying some other code from somewhere else, from a full code snippet to a single variable name, paste it in and begrudgingly type what needs to be typed, slowly.

The thought that always goes through my mind when I see this is "this is your interface to code, people, shouldn't you spend a little bit of time at least trying to master it?"

Forget vim, if people just learned to type they could be twice as productive in simple notepad. Vim (which I use) raises this to another level still.


Learning to use editor macros well could save anyone a lot of typing effort, and should definitely be in your course.


I think this is an amazing course and indeed missing. I learned the things presented in the course only after graduating CS and working at a company! They (rightly so) thought I was somewhat of a fool because I preferred to use Windows.

One thing on the data wrangling: I do think the Linux tools are powerful, but I would like to give some credit to R here. For example, merging tables (similar to SQL joins) is available in the standard library. This, in combination with R Markdown for visualization, makes it much easier to use than the Linux CLI tools.
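For comparison, the CLI-tools version of a simple merge is doable but clunkier; a rough sketch with made-up file names (join(1) also insists on input sorted by the key column):

    # users.csv = id,name   orders.csv = id,total
    sort -t, -k1,1 users.csv  > users.sorted
    sort -t, -k1,1 orders.csv > orders.sorted
    join -t, users.sorted orders.sorted   # inner join on column 1

R's merge() (or a dplyr join) does the same in one call and doesn't care about sort order, which is part of my point.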


Hi, how are you getting the keystrokes you type to overlay on the video? Is there a tool you're using for this? It would be very helpful if you could share it.


On macOS, we used this excellent open-source software called KeyCastr: https://github.com/keycastr/keycastr (we used this e.g. in the text editors lecture)


We shouldn't really be encouraging people to program in vim. I mean, an 80 char line length limit is very 1995. People have massive monitors now.


Your entire file can be one line in vim, regardless of its size.


You use Vim in your example. Get your students out of it and into something like IntelliJ IDEA or VS Code or (at a bare minimum) Eclipse.


Unpopular Opinion: IDEs suck when teaching people programming. A simple text editor, the simpler the better, is all we need, combined with a compiler. Modern IDEs introduce a boatload of completely irrelevant concepts and complexity that just add to the confusion. It doesn't matter which one you use.

If I were to design a CS101 course, I'd do away with Java (which is the language of choice at my university) as well. I don't know what the replacement would be; Vala looks nice but the resources are lacking. My main criteria in choosing a language to teach would be that I can do OO in it while not needing everything to be encapsulated in classes. Also, the language shouldn't care about whitespace, because if I'm teaching a few hundred students, I'm not going to debug a boatload of mixed-tabs-and-spaces issues.

Also, my CS101 course would be about debugging as much as about programming. Including stuff like "here is some source code, that's the observed bug, submit a corrected version by midnight x-y-zzzz and describe how you debugged it in bullet points". Unfortunately, plagiarism on these types of assignments makes them very much useless in the real world where grades depend on that.


Your opinions aren't unpopular, though maybe old school (in a good way):

https://blog.osteele.com/2004/11/ides

https://www.joelonsoftware.com/2005/12/29/the-perils-of-java...


> A simple Text editor, the simpler the better, is all we need

> I'm not going to debug a boatload of mixed tabs and spaces issues

As someone who uses an IDE, I have literally never had to worry about tabs vs spaces.


The problem isn't using one consistently in self-written code; it's that students will copy-paste small snippets from somewhere. While Python recommends using spaces for everything, more than enough code exists that uses tabs. Now, not every IDE is smart enough to actually convert everything to whatever is set. It may work for PyCharm or VS Code, but then again, I specifically do not want to endorse any program.

Edit: Or put another way - if my CS101 course has about 500 participants (which is not unreasonable in Europe), even 1% with whitespace problems will keep you busy and keep you from answering questions that provide real insight and understanding. Because, after all, while learning to distinguish a tab from n spaces is a valuable life skill, it's entirely not my point when teaching how to think while programming.


I think it's valuable to start with Vim, especially for non JVM or CLR languages. Learning Vim as a first step will allow the students to easily navigate and edit code on remote servers. Then, when working on a code base like Java or C#, they can install the Vim plugin for their IDE and have the advantage of a superior editing tool to go along with the tooling the IDE provides.


Why? What’s wrong with vim, gdb, and some static analysis tools like cscope and a linter?

IDEs confuse people, a lot of them start to think you can’t have the compiler/debugger/build automations without the editor and start doing stupid things because of that.


I personally can't stand IDEs, and use them as sparingly as I can get away with. I don't want the build process abstracted away from me, I want to know exactly what it's doing. Whenever I'm trying to figure something out in an IDE to fix a problem with why something didn't build correctly, I want to throw up my hands and go back to a simple text editor and the make command or some bash scripts. I dislike programs that try to anticipate me, or that fix things after I've typed them. And the part of me that grew up learning to program with 4K of memory available cries when I see how many resources are being used by it.

There are a lot of things I like about them from an editing standpoint, though. Nice for refactoring, easy to find out what arguments a function takes, easy to jump around between files. Vim with multiple buffers isn't too far behind this, though, and it doesn't bring my system to its knees just opening it up.

I guess I'm getting old.


Or preferably all of the above and Vim, and also CodeBlocks and KEIL just in case.

Really? There's a point where you get to learn your IDE...


I randomly took a half-credit Unix course where you basically did bash scripting and a huge bunch of command-line tools, and then eventually used all that to build your own Linux distribution. I swear I learned more in that half-credit class (and way more if you only count useful information) than in 90% of my other CS courses.

Edit: And since this got some traction, here is the current version of the class: https://www.cis.upenn.edu/~cis191/ it looks pretty similar to what I took but they added a little bit of Python.


Going through Linux From Scratch[1] manually and reading each component, while allowing yourself the time to look into interesting bits of each, is essentially this. I did it back in the early 2000's, and view it as one of the most useful educations I've ever gotten on how a Unix-like system functions (even if a lot of the knowledge is somewhat obsoleted now for many Linux distros with the introduction of systemd). I was doing it as research for how to ship a custom distro for a small purpose-built device, and while that never panned out, I've never once regretted spending the time to go through the process.

1: http://www.linuxfromscratch.org/


This - it methodically takes you through all the little details inside the system that are usually taken for granted and hugely improved my confidence working with Linux. It also taught me a lot of what's in this class (e.g. editors, shell scripting, even a fair bit of troubleshooting)


I never successfully made it through LFS, maybe this is the year :)


I did a similar Unix course. It was on a quarter-based system, so it only lasted 3 months. We learned Unix from first principles. There were programming assignments that started off as shell scripts. We wrote manpages for our shell scripts. Then we learned C with the expectation you figure out the hard parts really really quickly (we were a Java school). We implemented various different C and Unix programs. We even reimplemented some of our shell scripts. We implemented a custom version of ls(1) with built-in sorting features.

In the last 3 weeks we built a full-blown Bourne shell in C including background jobs, shell/environment variables, pipelines, i/o redirection, lex/yacc parsing, etc. A lot of the features were extra credit (e.g. single pipeline a | b, versus infinite pipeline a | b | ...).

The class is unforgettable in my mind. I now mentally parse everything I do in the shell and imagine system call tracing every program I run.


We had a very similar course at Penn State, as well. It's among three courses that I learned the absolute most from, and have had a huge amount of applicability to my life and career.

https://bulletins.psu.edu/search/?P=CMPSC%20311


I took this class as well and agree that it was the most useful course I took at Penn. I still use all of the things that I learned in this class every day (especially emacs). Luckily, I took it early in school and it made subsequent classes so much easier. Thank you Perry Metzger!


He recently had a really interesting talk called Emacs: The Editor for the Next Forty Years [1]

[1] https://youtu.be/KYcY7CcS7nc


Nice to see some Penn alum on HN :)


In the lab for my second or third CS course, the professor was walking us through intro Unix usage, but he didn't take attendance so it ended up being just 2-4 of us showing up. After a few weeks, he cancelled the lectures and told us he'd just be around to answer questions, help with homework, etc.

The last lecture before he cancelled things was an intro to Vim. The next would've been Emacs.

And that's the story of how I became a life-long Vim user. :-)


When I started at school we all got Unix shell accounts, mostly for email. Later years got bumped out onto multiple machines, but for that one year almost all of us were on the same 8-core box (with seriously clamped-down quotas).

Within a few weeks, a friend of a friend of a friend had figured out how to pull pranks, and that opened the floodgates. Soon we all knew about man pages, about commands to query utmp and wtmp (to deduce who just did what to you or a friend), grep, file permissions, tty write permissions, quotas, nntp.

The most fun was when someone did something to you without figuring out how to stop it from being done right back to them. Many a kernel was cat'ed to someone else's terminal.

Even today when I'm trying to troubleshoot, half of the stuff I'm using was first used trying to seek revenge on a prankster. Once in a while I wonder if this would be a good way to teach a class on Unix.


Life-long vim user here, initiated by the terminal. Vim has a special place in my heart, not because of the amount of customization I was able to add, but because it enabled me to keep an actual diary.

Here's the script.

https://github.com/Aperocky/termlife/blob/master/diaryman.sh

All I had to do was type 'diary' in the terminal and I was writing my diary in vim. I have never had a more extensive record of my life. My distracted ass would never have managed this without that kind of access.
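The core of the idea is tiny; a minimal sketch (my own simplification, not the linked script):

    # "diary" opens today's entry in vim, one file per day under ~/diary
    diary() {
        mkdir -p "$HOME/diary"
        vim "$HOME/diary/$(date +%F).md"
    }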


I’ve had something like this for about three years now, except it doesn’t open vim, I just use a single read command and it appends whatever I type with a time stamp. One file per quarter. And there’s a cron job to pop up a window each hour to remind me to type what I’m working on.

I’ve used this to figure out how much time I spent on certain tasks by sampling it Monte Carlo style. And sometimes when I run into some weird non-googleable problem, it happens again a few months later and the solution is in those files.

It’s like my own real-life syslog. Or like a scientist’s lab notebook.
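If anyone wants to try it, the whole mechanism fits in a few lines; a sketch under my own assumptions (the per-quarter files and the hourly cron popup are left out):

    #!/bin/sh
    # append one timestamped line to a running work log
    printf 'log: '
    IFS= read -r note
    printf '%s\t%s\n' "$(date '+%Y-%m-%d %H:%M')" "$note" >> "$HOME/worklog.txt"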


I love this! I've been looking for a lightweight diary tool, and this is giving me serious inspiration.


Check out Vimwiki, it has similar functionality and allows hyperlinks to other documents + index page generation


(Life-long vimmer here) At the bootcamp I teach at, one of our students almost became a life-long VIM user today, too: he had accidentally opened it and didn't know how to exit ;-) . Unfortunately, my co-worker helped him before I could introduce him to our dark ways.


As someone who uses vim consistently, I hope I'm not a life-long vim user, I plan on at least a short retirement ;)


My wife and I took a month long honeymoon. Really hoped it'd be as close as possible to one of those Tim Ferris-esque "mini retirements." The first morning I awoke early with some serious jet lag and couldn't really figure out what to do while she was still sleeping. So of course I rewrote my whole vim config from scratch and had a blast doing it.


There are two sides to this kind of teaching. If you're MIT, you can afford to hire great lecturers who know how to teach students fundamental and deep truths about the tools they're using, not trivia. That way, they can generalize onto other tools, so anyone who studied git can quickly adapt to using cvs or svn. My university (in a developing country) is on the other side. Last semester was a disaster.

We had an AWS course, half of which was memorizing information about pricing and S3 tiers. If I were going into a job as an AWS guy, I'd definitely have to know that, but this is just the third year of a CS undergrad :-/ and not a training course. The quizzes also had deliberately deceptive questions, which is the worst type of trivia!

Even better example. The Windows Server course was also compulsory (just like the AWS course) and mainly consisted of memorizing trivial information about context menus, which buttons to click and the licensing terms/durations/prices for different Windows Server versions. I'm jaded from the experience. Got my first two Cs in both since I spent time learning stuff described in the post instead of that nonsense.


I find it very weird to see something like an AWS course in a university-level CS curriculum.


AWS and Windows Server courses are definitely not university-level CS courses.


Yep. And the sad part is that some of the students I met here don't realize what they're learning is not CS. I only applied to avoid the compulsory military service; I was already learning a bunch on the job.

On the bright side, the local job market demands match what they teach here very well. According to my anecdata, lots of students who had no knowledge or experience working as programmers now have full-time jobs mostly as web developers. Looks like the university's doing its job and I'm just a whiny C student.


That's because each university caters to certain business needs. Top universities groom students for engineering careers, because that's what the market requires and expects of them.

Low-ranked universities train students to be button pushers in a world of increasingly automated systems, because that's also what the market needs: people to keep the systems running and roll out application updates.

The good thing? Development is one of the fairest trades: if you are good, you will manage to find a good job and decent money without much struggle.


I graduated a top 50 school, that gave us an “engineering” degree but had no such similar course. There were no more than 20 students, out of 300ish, who understood the material of this course well enough to teach it. From my perspective, this knowledge was generally inversely correlated to GPA and strongly correlated to postgrad performance.

Take my anecdata for what it is, but I think these skills are strongly underrated by universities and students alike. Kudos to MIT for publishing this online; I know I would have benefitted from exposure to these topics in my first couple of years.


> these skills are strongly underrated by universities and students alike.

Universities believe their focus should be on timeless principles, and expect that students will supplement their instruction as needed. For example, teaching students the Bourne Again shell isn't useful on Windows, and it's declining in importance on macOS. The same logic applies to editors, only even more wildly varied. Rather than build 20 courses, one for every IDE, and then revisit the pedagogy every year as software changes, they basically lean heavily on 'you're smart enough to figure this out on your own.'

Some of this is obviously motivated post-hoc reasoning but still makes a modicum of sense -- anyone with 5 years exp could probably teach undergrads bash and git (with maybe 40 hours of prep), so university instructors with PhDs should be focusing on knowledge gained from research findings that wouldn't find their way into industry via other means.


In general, I agree with you; I’m not suggesting that the whole degree be built around these tools, just that they should be taught, preferably early on, alongside the traditional intro to CS course.

But it’s not just about the tools, there are some seriously timeless concepts and skills under the surface: regexp, environment variables, users and user groups, documentation comprehension, piping, filesystems, streams, processes, to name a handful. These apply to any Operating System or programming environment, and give some concrete foundation to the theories you learn elsewhere.

Plus, some of these tools really do merit a "timeless" label at this point. Vim, git, bash (and bash-inspired shells), and most of the GNU utilities have been around for a very long time and have not been conclusively supplanted by more powerful tools.
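A throwaway one-liner shows how several of those pieces stack in everyday use (the process name here is just an example):

    # count running firefox processes; the [f] keeps grep from matching itself
    ps aux | grep -cE '[f]irefox'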


It takes no great skill to teach any undergraduate CS class. If you’re talking about skills that are still going to be useful in 50 years, Unix scripting is probably one of the strongest competitors. More critically, the basic model of piping commands through a set of simple programs is extremely powerful and widely applicable.


Perhaps my experience was atypical, but freshman year included a course on proofs of correctness, starting from propositional logic and ending with LTL (comparable in concept to the system Lamport published as TLA+). Upper-level courses often included a hands-on section on PROMELA. These are not something I would expect a rando practitioner to know about or teach, and they're only rarely put into practice (albeit to good effect -- Amazon uses TLA+ to formally verify their AWS services: lamport.azurewebsites.net/tla/formal-methods-amazon.pdf).

And that's really the value of an undergraduate degree: learning stuff industry hasn't yet widely adopted from the people who helped invent it.


I'm not trying to be cheeky, but you're proving the point: I'd argue that UNIX systems and tools are the foundation of AWS, not TLA+. Amazon was famous for being held together with Perl, back in the day.

If your service or company survives long enough for the marginal cost of applying TLA+ to it to be worthwhile, you can afford to hire postdocs to work on it. They probably have 10 people using TLA+ on it. There are probably 500 software engineering teams in the world that are mature enough where the payoff is worth it.

It's not to say that TLA+ isn't useful, or that it isn't worth learning. It's just that TLA+ is never going to be a bedrock tool in the same way that UNIX systems knowledge is.


> If your service or company survives long enough for the marginal cost of applying TLA+ to it to be worthwhile, you can afford to hire postdocs to work on it.

Judging from the Amazon open reqs I've seen, it's their SRE team writing TLA+. So like, both topics are useful; I just don't think it's worth paying a professor to teach you how to use computers when there are so many other resources available. This is why I support things like LUGs and student jobs -- I spent like 5 years supervising student SREs for the OSU Open Source Lab, speaking at Barcamps, presenting and advising at LUGs, etc. And why a couple of PhD students are running a one-month session and recording it for posterity.


> Universities believe their focus should be on timeless principles, and expect that students will supplement their instruction as needed.

Why the arbitrary distinction? Why have teachers at all? Surely the students can learn the timeless principles on their own as well.

What matters is where teachers can add value. Sure, you don't need a tenured prof walking students through "Intro to Vim and Bash and By The Way GDB Exists", but you do very well to have a TA run a practical lab to teach these things.


I don't think it's really coming from any sort of high priesthood dedicated to the noble, timeless principles. It's just boring human stuff.

Professors aren't typically trained educators, many of them at best tolerate their teaching duties. They muddled through, why doesn't everyone else?

In sports there's a truism that "good athletes make bad coaches". Brilliant former PhD students often make bad teachers. Whatever the timeless principles might be, they're useless without some craft.


I think what's happening is that motivated individuals learn these things by doing, sometimes starting at an early age. Teaching these topics explicitly will probably not automatically turn students into 10x hackers; however it might help some. I know I cry a little on the inside any time I have to explain grep to an engineer that has been around for more than a year.


I have those skills, no one in my data science team does. The people with better communication skills, who can gain buy-in from colleagues and raise disagreements without making anyone angry, are doing far better than me despite their lower level of technical competence.


This doesn't mean they aren't important, though. I'm in academia (not CS), and we use a lot of what's covered in the OP's MIT guide.

I don't know if this gives an edge at all, and it's certainly 100x less important than soft skills, but there has to be some advantage to knowing how to pull a git repository or how to extract text matches from the command line without having to download or use an external application.


Interesting that you think it's inversely correlated with GPA. At my school, I feel like the people who are experts with the terminal and *nix stuff are the ones who have high GPAs because they are at the top of all the CS classes.


I think it’s a bimodal distribution. The people with a lot of practical skill are also often busy with their own projects or working part time.


Metaprogramming has an actual meaning and it means something completely different from what it is used for on this website:

"Metaprogramming is a programming technique in which computer programs have the ability to treat other programs as their data." [1]

i.e.: it's a programming technique, not something related to the process of building code.

I understand the idea of giving some more practical side of using computers in a "missing semester", but please pay attention to the nomenclature. It can be really confusing for someone studying CS that does not yet grasp the concepts that well.

[1] https://en.wikipedia.org/wiki/Metaprogramming


“Meta” means “one level of abstraction above”. I find it perfectly okay to use it in this context.


The point isn't whether a word fits a definition, but whether that word already has a definition with which the new definition will be confused.


And the word is specific to programming! The usage in the course seems incorrect to me too.

Everything else looks great though!


Students will presumably learn about actual metaprogramming in one of the non-missing semesters, so maybe it's meant as a pun?


This course is wonderful! I've read through all the material and watched all the lectures and I can say it has helped me tremendously thus far. I'm still trying to master all the tools they've mentioned but I already feel much more proficient with e.g. version control, vim, using the command line, etc. If you're an experienced dev then you might already know all of these things, but if you feel that you have some gaps in your knowledge with some of these tools, this course will likely point you in the right direction.


If you're an experienced dev, you use all these every single day.

God save me if I'm asked to implement quicksort though... theory is nice, but for me, academia largely forgot about this practical stuff.


I missed out on a formal CS education. I found my way, but I always saw it as all of those things you never had time to research deeply. Otherwise, it's just a trade school. For example, you've implemented a few basic sorting algorithms so you feel confident in choosing one or kicking the tires on a new one you've never heard of. Knowing how an OS is put together gives you an idea where things should go.


At my school there is a 2 unit class that you must take along with the intro DS&A course that teaches bash, vim, git, testing, etc. It was definitely the class that helped me the most in my internship and also made me a Vim user.

http://ieng6.ucsd.edu/~cs15x/


I too use (Mac)Vim.

In my experience, most CS majors at UCSD that I've met complain about the course. On the other hand, DSC doesn't have an equivalent, which I find disappointing but understandable. Maybe it'll get one in the future.


Oof, the main memory I associate with this class was not realizing there was a midterm and therefore not showing up for the midterm.

Still passed though, and with an A somehow. Thank you, guardian angel TA!


I think the reason you don't see this kind of course offered is because it is primarily concerning training and not education.

Imagine if your training course 25 years ago focused on the Turbo C IDE and that's all the University offered. You would be amazingly proficient at Turbo C and know all its keyboard shortcuts but that wouldn't be too relevant in today's market.

Keeping such training material up to date with the latest trends is exhausting work and often not worth the effort, especially given how difficult it is to predict what may be the next tool du jour. Contrast this with more timeless educational topics and it starts to become clearer why this sort of thing is explicitly not taught.


This is one of the issues that I have with how CS and software engineering is taught in universities.

Yes, you could look at this as "vocational training" and not "education". But you could also look at this as education with a hands-on component using commonly available tools.

Sure, you could have students memorize a list of git commands. That would be terrible. But you could also teach students to understand how a distributed VCS works, the graph operations involved in commands like "checkout", "merge", and "push", and the underlying way of thinking about data and how it changes. That both provides students with an extremely useful practical skill as well as setting them up for understanding much more complicated topics like CRDTs and distributed data synchronization. And when the tool that was used gets replaced, so what? If the concepts used by the tool were taught correctly, then picking up a new one should be trivial. I've been through countless IDEs and debuggers and version control systems in my career, but the reason I can do that is because at some point I learned the first one of each and was then able to transition that knowledge along the way.
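And Git itself makes this kind of teaching easy, because the object graph can be poked at with stock commands; a rough illustration, not a syllabus:

    git log --graph --oneline --all    # the commit DAG, the way Git actually stores history
    git cat-file -p HEAD               # a commit object: tree, parent(s), author, message
    git cat-file -p 'HEAD^{tree}'      # the tree (snapshot) that commit points to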

"Training" gets looked down in a way that seems actively harmful to me. Just because a tool might not be relevant in the future doesn't mean there's not a huge amount of value that can be gained from learning it today.


I absolutely agree. If we in the industry are expected to constantly remain up to date on new technologies, it seems reasonable to expect our CS educators to do the same. Realistically these tools change infrequently enough that courses which teach them will likely remain relevant for at least the first few years after students enter industry. I've had many courses in which my peers didn't understand git even as seniors because it's not properly taught at my school. If you enter industry without basic git knowledge you're going to have a bad time, and one of the main points of education is to prepare you to enter the workforce.


Sure, or it could have focused on bash, terminal, Unix, vim, etc and still be relevant today.


Sure, but 25 years ago that'd still be placing a bet that those tools would be relevant today. That was my point.


As someone whose intro to programming class WAS built on Turbo C almost thirty years ago, let me tell you how it was:

One day's worth of handouts told you about the IDE.

Another handout was tips for debugging, including common compiler errors and what they meant.

That was it! The rest you learned in the computer cluster, with the help of your dormmate, TA, or computer cluster tutor. The course itself, from then on out, focused (quite rightly IMO) on programming concepts, structures, abstractions, and basic algorithms.

The main computer clusters on campus also had a whole panoply of brochures on how to use Unix, as well as various editors/tools. They were for use by all students, not just the CS students.

In short: it should not take a massive investment of class prep and time to teach students the practicalia they will need to know. Save it for the labs. Don't make a semester course out of it. Include the whole student body as your audience by publishing self-help materials for all.


Did they remind you to define bool?


There was a special set of libraries the class provided that wrapped the standard libraries and made sure that stuff like that was well-defined. (Well, as well-defined as it gets in C anyway.)


FWIW, Carnegie Mellon runs a "tools of the trade" course targeted at CS first-years, GPI [0], that is mostly taught by student TAs under the supervision of a professor. Asking students to create and teach the material seems to be a great way of solving this problem: TAs get teaching experience, students learn about the tools, and professors are not unduly overburdened.

[0] https://www.cs.cmu.edu/~07131/f19/


Presumably though there's change in a lot of different fields (not just CS), and it would just be up to whoever is maintaining a course to keep the tools up to date with what's used in industry.

But also, how long has it been since vim/emacs were first used? Or a vcs? I think a lot of this stuff would remain pretty well the same year on year.


It's funny, when I was in school, I was always told the difference between a good CS school and an ok school was that the good school only taught you theory and left the practical application to the reader. The ok school had courses on tools and language syntax.

It's kind of awesome to see this coming out of MIT.


The world's best CS school is an internet connection.


I'm entirely self-taught and can entirely understand wanting that belief to be true, but it just isn't. A university can give you so much more than just learning some programming language. At the very least, it's going to force you to dabble in some languages you wouldn't ordinarily touch. The same obviously goes for theory.

Assuming there are at least some things that require a minimum effort that is uncomfortable before allowing you to see their benefits, you will discover more guided by qualified people and well-designed curricula than on your own.

Then, there are the people: from my experience especially with Ivy-League faculty, these schools seem to do something right. Remember that one teacher from school that really got you into (reading Shakespeare/track/organic chemistry)? Yeah, they aren't all like that, but a rather significant number seems to be.


> Assuming there are at least some things that require a minimum effort that is uncomfortable before allowing you to see their benefits, you will discover more guided by qualified people and well-designed curricula than on your own.

Perhaps, but sitting in the lecture hall isn't the only way you can get thorough instruction from professionals. Books exist, and they don't suffer from the same monetary, timing, and pacing issues that classrooms do.

The primary historical disadvantage of books- that they weren't interactive and you therefore couldn't get help if stuck- is no longer an issue with the internet. It's possible that the internet is too disorganized and low-quality to be one's primary teacher, but its amazing supplementary value makes other media tenable.

Those are my anecdotal opinions, anyway. But I'm curious, what do you think physical teachers have to offer that Books/Online Courses/Podcasts/Whatever + The Internet don't?


I think that structure, accountability, and community are the big draws toward school. For young people I think these are extremely important: when I was in school, taking 6 classes meant I spent roughly a 40-hour work week on school stuff. I had a really hard time putting in half of those hours when I wasn't in school.

Obviously that's more of a me issue than an issue inherent to self-learning, but many of us have me issues.

If you're not the type of person who would benefit from structure and community, the value proposition clearly doesn't make sense. Even if you would benefit from those things, the value proposition isn't clear at all: it's tremendously expensive.


I guess part of "community" is your peer group, but also access to an authority to whom you can address questions (that won't leave you hanging, most of the time).


Going to Stanford doesn’t mean you lose internet connection while you’re there


My book about the GitHub API from O'Reilly had a similar idea: thinking in and about tools is an important concept. O'Reilly permitted me to release it under creative commons so you can find it here for free:

https://buildingtoolswithgithub.teddyhyde.io/


I really wish this practical stuff was more emphasized. I graduated with a lot of very smart people who could write great code - but they could not compile, run, test, or check it into VCS to save their lives.

It made group projects hell.


I graduated high in my CS class from a top school, but it turns out that I'm just good at tests/abstract thinking/hacking things together with duct tape. I'm terrible at software engineering, and really glad to see this kind of stuff being emphasized at the university level.


Depends on your point of view, but it sounds like you’re great at software engineering. :)

Abstract thinking and duct tape are the tools of the trade! The tools they’re teaching in this course are just fantastic abstractions held together with duct tape, after all.


Learning about a command-line debugger and command-line profiling tools would be helpful for those that find themselves in the past. IDEA and Visual Studio have had these things integrated for decades. I find the likelihood of knowing how to use a debugger or profiler is inversely proportional to the amount of time someone spends in the terminal.

It's astonishing how many developers rely on logging/print statements for debugging. The learning curve of using a tool like PDB or GDB is just too steep for most people.
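
For reference, the core of a pdb session really is only a handful of commands; a rough sketch (the script and variable names here are made up) looks something like:

    $ python3 -m pdb process_orders.py   # run the script under the debugger
    (Pdb) break 42                       # stop at line 42
    (Pdb) continue                       # run until the breakpoint is hit
    (Pdb) p total                        # print a variable
    (Pdb) next                           # step over the next line
    (Pdb) step                           # step into the next function call
    (Pdb) quit

GDB's basic vocabulary (break, run, print, next, step) is nearly identical, which is part of why learning one tends to make the other easy to pick up.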


Logging and print statements are phenomenally useful debugging aids. I think people get too caught up in what a debugger can give them and forget how much value you can get out of logging as well.

I'm perfectly happy to fire up LLDB and step through a program, but my go-to first step is frequently to add logs, and one of the reasons is that it's often just plain faster. It's kinda like the binary search of debugging: Sure, I could fire up the debugger and step through the long, complicated, probably multi-threaded algorithm... or I could add some log statements, reproduce, and rapidly narrow down exactly where I should spend my time in the debugger.

If you've got a project with a particularly painful compile time for simple changes (which seems like a whole different issue...) or an issue where setting up a reproduction environment is difficult, then sure, fire up your debugger and set up your breakpoints appropriately. But I think debugging via logging gets a bad rap when it's frequently a completely rational first step.


Logging is also less likely to disturb the rest of the program, in my experience (JS). If I want to figure out in what order a set of things executes, I could just sprinkle `console.log('hello')`, `console.log('helloooo')`, `console.log('does this even get called?')`, and so on around the code and see how they get spat out, or I could add a bunch of breakpoints, mess with the ordering of the event loop, and spend a bunch of extra time manually stepping around.


JS has its own debugger problems, largely because of transpilers and the shattered ecosystem surrounding them. Source maps help but aren't perfect.


> It's astonishing how many developers rely on logging/print statements for debugging.

One reason is that you don't switch contexts if you insert a print statement in your code. I personally find that to be the least distracting way to debug.


I’ve tried using GDB and respect those who can use it well, but when I am debugging, my short-term memory is generally full of the program code I'm trying to understand, and trying to remember GDB commands clobbers that.


In something like VSCode (/any other IDE) you can just hit a keybinding or click the little button just to the left of the line to add a breakpoint; all the runtime info is shown in the sidebar and the debug console is shown where your terminal normally is; and the process to run is often the same as the process to debug. So context switching isn't as big of a deal. But still, print statements definitely have their place.

(Disclaimer: I work on VSCode)


> It's astonishing how many developers rely on logging/print statements for debugging. The learning curve of using a tool like PDB or GDB is just too steep for most people.

On the other hand, since I can’t hook up a debugger to production, it’s also very useful to be able to understand my application's log output (especially tracing/debug statements) to triage the situation as quickly as possible.

Also, any time I’m debugging USB or Bluetooth LE protocols, I’m relying on some kind of Wireshark-like packet logger.

Debuggers have their place but they aren’t the end-all be-all of debugging.


> It's astonishing how many developers rely on logging/print statements for debugging

http://taint.org/2007/01/08/155838a.html

> While reading the log4j manual, I came across this excellent quote from Brian W. Kernighan and Rob Pike’s “The Practice of Programming”:

>> As personal choice, we tend not to use debuggers beyond getting a stack trace or the value of a variable or two. One reason is that it is easy to get lost in details of complicated data structures and control flow; we find stepping through a program less productive than thinking harder and adding output statements and self-checking code at critical places. Clicking over statements takes longer than scanning the output of judiciously-placed displays. It takes less time to decide where to put print statements than to single-step to the critical section of code, even assuming we know where that is. More important, debugging statements stay with the program; debugging sessions are transient.


The dev tools of the various browsers are debuggers and are very widely used; I'd guess completely independently of the amount of time spent in a terminal.

Other factors, post-C, are probably that lots of languages don't have good debuggers (or that good debuggers take a while to materialize), that debuggers are less necessary in managed-runtime languages, etc.


Ruby has a great debugging story.

You just type "binding.pry" wherever you'd like to stop the application and in your terminal, it will open a Ruby shell where you can access the program state at that point in your code.

https://github.com/pry/pry


If I sorted the debuggers I've used by how great they are, they would all be better than that Ruby debugger, including Visual Basic 3 for Windows 3.1.


Yeah, when I hear "great debugging story" I’m thinking Smalltalk, not Ruby Pry.


Testing and debugging is still a weak spot for me.

I did a bit of dorking around with web development on my own and decided to change careers. Needless to say, the boot camp didn't cover any of that, and at the three-dev company I ended up with, we're still trying to wrangle together good practices and pull old code into the future, to the point that we're not doing nearly as much testing as we should... so I'm left to my own devices.

Even my usual web resources don't really cover much in the way of JavaScript / Node debugging outside "here is how you set something up ... k bye!".


The quality of debuggers varies hugely, especially since different languages support different debugging features (e.g., conditions vs exceptions vs neither). Logging works pretty much the same everywhere.


I have never used a language that doesn't support visual debugging from the editor, except maybe embedded C. Considering the complexity of "logging frameworks", logging is not always as easy as it may sound.


I take logging in the general sense to mean “print”, which is pretty easy.

I’ve used lots of languages without usable (visual) debuggers. I was probably programming for 10 years before I saw a debugger that didn’t crash the OS half the time it was invoked.


Non CS graduate here, funny how my learning curve has been basically what y'all are saying.

I had one Java class before I officially kick-started my programming career by wiping Windows off my laptop and installing Ubuntu. Then I proceeded to force myself to do everything from the command line, not that there were many other options. It escalated quickly from there.

Starting from the terminal is much more intuitive than writing 'int main/ public static void main' in an IDE.


> Starting from the terminal is much more intuitive than writing 'int main/ public static void main' in an IDE.

I would take this a step further: starting from _any_ REPL is an advantage in learning programming. The tight feedback loop fosters experimentation. Python is another good starting point in this regard.


Here's a link to a similar course offered by UMich EECS. I really enjoyed it when I took it as an undergrad.

https://github.com/c4cs/c4cs.github.io


In other engineering disciplines this used to be called "shop class" or similar. In his day my dad was taught as much about carpentry, metal work, plumbing etc as he was taught about radios and circuits.

As an educational technique, "sink or swim" is about as efficient as spinning a wheel and handing out degrees.


This is not only useful for CS people, but also for those of us in the hard sciences. Coding is a mandatory prereq, which is usually one class on C or C++. Then we're expected to collaborate on a project with hundreds of developers. This is a great resource for those of us who are a bit lost; thank you.


Good as a primer for those who aren't naturally hackers but decided to become computer science majors and have little to no experience with a Unix-like operating system.

I learned Linux, the shell, basic scripting, and the terminal environment in high school out of necessity and then began to thoroughly enjoy it. Planning to enter university as a CS student I took the time to learn Vim, though I didn't start using it regularly until much later.

I can't exactly articulate why, but I'm fairly upset these sorts of things comprise an entire course. What happened to RTFM? Where is the general curiosity? Even if you have no prior experience with a majority of these things, these are the kinds of things you figure out over the weekend while doing your regular courseload.


Why is Compilers an entire course? Databases? Operating Systems? All of these topics are covered in books you can read.


It's kind of like a special class to teach engineering students about all the features on their graphing calculator.


When someone comes to me these days for knowledge about basic Unix/macOS/Linux CLI tools, I direct them to the GNU core utilities documentation; it is very nicely organised according to task. I also demonstrate how the CLI can do read-only operations as fast as SQL on medium-sized databases (just take an SQL dump and pipe it through):

https://www.gnu.org/software/coreutils/manual/html_node/inde...

For various sorting and filtering tasks followed by report generation in LaTeX etc., a knowledge of awk also helps. I use the mawk interpreter since I have found it to be slightly faster with ASCII-format files.
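
As a toy illustration of that kind of pipeline (the file and column layout here are made up: a tab-separated dump with a customer id in column 2 and an amount in column 3):

    # top customers by number of orders
    cut -f2 orders.tsv | sort | uniq -c | sort -rn | head

    # total amount per customer, with awk (or mawk)
    awk -F'\t' '{ sum[$2] += $3 } END { for (c in sum) print c, sum[c] }' orders.tsv | sort -k2,2 -rn | head

For data that fits on one machine, this is often competitive with loading the dump into a database first.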


It's sad to see freshman CS students getting thrown into the unknown world of programming with zero grounding in basic programming principles such as simple logic, without tools, and without even a book. I just witnessed this last semester, where the students' first class was basically "intro to C++". No books or resources were supplied or referenced.

It gets even better when the instructor implies that students shouldn't get help from the internet or look at examples for assignments; in his eyes, everything on the internet is bad or wrong. I gathered he basically wanted the students to fail. What a superhero. Glad to see the effort here to help CS students; great job.


It's even worse at the school I graduated from. The intro-level course for CS majors is in Python, but Calculus I is a prereq, so the students who are catching up in math and/or don't quite feel ready for the real intro course have to take the remedial course, which is in C. I was a TA/grader for that course, and what a shitshow it was. I don't think the professor ever explained the concept of undefined behavior.

The highlight for me as a grader was the assignment where they first had to define functions outside of main. Many of the students didn't understand that you need to actually return values from (non-void) functions. But in a lot of cases, the value they were supposed to return happened to be sitting in $rax, so they would end up producing the correct output. I was never sure what to do in those cases; they lacked the conceptual tools to even understand what they had done wrong.


> the intro level course for cs majors is in python, but calculus I is a prereq. so the students who are catching up in math and/or don't quite feel ready for the real intro course have to take the remedial course, which is in c.

I don't understand, why would students who need to take Calc 1 not just take the normal course the following semester?


I didn't explain that very well. Students who were seriously considering the CS major could certainly just take calc their first semester and then take the real intro course the next semester.

At the same time, the university basically considered having taken calc in high school to be a proxy for the overall quality of the student's STEM education so far, so if you were one of the people who hadn't, you were encouraged to take the intro-intro course first. This was also the course that people would take who were interested in CS but weren't STEM majors. It was a poorly designed course for both purposes.


Got it, thanks. Sounds like the "C for Engineers" class I took as a freshman EE. Mine sounds like it was better-designed, though, possibly because it was a gen-ed-type requirement for all engineering degrees.


There's definitely more value in taking a Udemy-type course first, just to get introduced to the concepts at your own pace. Nothing has really changed after decades; incredible.


What kind of intro course was this? Just an intro to programming, or something like data structures?

If it's the former, calculus as a prerequisite makes zero sense. It's bizarre that anyone would approve that idea.


Literally just the first course in the CS major sequence. They didn't actually have to do any math. As I mentioned in a sibling comment, the department used Calc I as a proxy for the quality of your high school STEM education. That was the official reason, anyway. There were a lot of courses with irrelevant prereqs. CS was by far the largest major at the school, and they were constantly short on faculty to actually teach the courses. A cynical person might think that the track was designed deliberately to stop people from being able to take too many CS courses at once.


In my graduate program (Neurosciences) the students with a computational background usually realize after about two years that all of our colleagues desperately need a basic orientation for how to actually use a computer to do real work. The only institutions that I know of that teach the kinds of courses we need are community colleges.

The curriculum presented here is an enormous community resource and I hope the other institutions recognize the desperate need for a course like this and allocate the resources to teach it.


CMU has a similar course called "Great Practical Ideas for Computer Scientists" (a play on the infamous "Great Theoretical Ideas in Computer Science") at https://www.cs.cmu.edu/~15131/f17/.


Purdue CompE has a required 1-credit-hour lab covering the Unix command line, bash, git, and a little Python. It's been the most useful credit hour I've ever earned.


https://missing.csail.mit.edu/2020/security/

> An example of a hash function is SHA1, which is used in Git. It maps arbitrary-sized inputs to 160-bit outputs (which can be represented as 40 hexadecimal characters). We can try out the SHA1 hash on an input using the sha1sum command:

Can we please stop presenting SHA1 as a valid choice for a hash function anytime now? Especially in a security context? This passage is probably just a holdover from the 2008 version of this page, but it's still frustrating to see.
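
For anyone working through that lecture, the drop-in replacement ships in the same coreutils package; note the longer (256-bit) output:

    $ printf 'hello' | sha1sum
    aaf4c61ddcc5e8a2dabede0f3b482cd9aea9434d  -
    $ printf 'hello' | sha256sum
    2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824  -

(There's also a long-running effort to move Git itself to SHA-256 object names.)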


I still remember my first week of engineering school (1997). Before the actual classes started, we had an intensive UNIX week. I loved it. It seemed like the real deal to me. I was already using Linux at the time, but for the first time I was properly taught things by experts, and not by computer magazines. What was my hobby was suddenly becoming something serious and respected. So much has changed since then, but I'm glad that to this day I'm still using the same commands and tools.


I just graduated from the University of Colorado Boulder this last semester with a CS degree, and am glad to say that this was a required course. It was a 3000-level course called Software Development that also incorporated a long-form group project for team cooperation skills. It definitely helped me improve my efficiency and organization when working on projects. Teaching thoughtfulness in designing not only a solution, but a work environment, is an important lesson.


When I was in college, we also followed a "traditional" CS curriculum. The introductory language was Scheme. The language in core courses was C. Advanced courses used whatever was best for the course, usually C or C++. Many of the courses have had the same problem sets since 1985.

At the time, students had problems with it because "how are we supposed to build apps with this knowledge?!" There were even student efforts to "modernize" the curriculum by supplementing it with student run workshops on app building (which actually did really well).

Personally, I really appreciated the curriculum. I was a self taught programmer going into CS, having done lots of web dev and automation stuff with PHP / Python. And I was really cocky about it. But as my adviser told me on my first day, "yes, you're a programmer, but you are not yet a computer scientist." I knew nothing about Computer Science, and had never touched a low level language, because I had no practical reason to. The next four years were valuable for me because they forced me to learn about topics that I will never teach myself.

And you know what? By the time I graduated I was fully proficient with git, python, Javascript, bash, and a whole bunch of other tools and concepts, despite the fact that no class forced me to learn them. I picked them up from working with others, scripting tests for my assignments, and my internships. It also helped to have the foundation of lower level knowledge. It's a lot easier to understand the "magic" of your scripting language when you have an appreciation of what it's actually doing underneath the hood. It's easier to understand bash when you've written a shell in C. Etc.

For me, this approach worked well. School taught me the fundamental concepts that haven't changed since 1985, which means they probably won't change much by 2054, either. Who knows what industry will be using in 2054? I appreciate that the school curriculum prioritized core, fundamental concepts over tooling trends and the language du jour. Yes, it's important to learn those practical skills, but they'll come with practice and experience in industry (especially if you have 3 internships); there's no need to take a class on them. And when the next one comes along, you can learn it in a week. Your knowledge of the fundamentals will help.


Honestly, this course would have been a game-changer for those of us who went to terrible colleges with minimal opportunities for internships. The disconnect between theoretical classes and practical ones was simply too vast to grasp for the majority.

You sound like one of the top tier of students, being a programmer before entering college (which I assume was a good one). There are colleges out there where the faculty have only a theoretical relation to any actual useful knowledge. E.g. my semester-long networking class actually had less content than just watching a few YouTube videos. My operating systems class had a prof who was so completely clueless about so many things I can't list them. On the first day of class, he said "There are 3 OS (sic) - Windows, Mac and Unix" and refused to believe a student who told him that OSX is a unix. My CS education is entirely self-taught as a result.


I can't speak for any other schools, but before I dropped out of Florida State, they had a class called "Intro to Unix", which was only worth one credit.

We were all given login credentials to a server, and our assignments were typically Bash scripts; there was one class spent on Vi, and the professor combined the Nano and Emacs classes into one. We also learned how to write Makefiles and how to run GCC...

I had assumed that this was standard in most CS majors...was I wrong?


Very wrong, unfortunately. At my school there are students halfway through their degree who can't switch a Git branch without using a drop-down menu in their IDE (and even then with difficulty) and have never used the CLI for anything. The place students tend to learn these things is their internship; the first one is typically a turning point in their programming and tooling ability.


Me too... except it's a somewhat simpler but required course for ALL students, not just those in CS! (A bit understaffed, or maybe not: inter-student cooperation worked well enough!)


Thank you a lot for posting this link.

I am majoring in CSE in India, and I think you guys still have it better. India's engineering education (except maybe at the IITs) is completely screwed up by people who don't know what they are doing. The average student here doesn't come out of interest but for the sake of getting some job. There are a lot of people who don't understand the difference between a text editor and a word processor by the end of the 4-year program, but still get an 'S' grade (>90% score) because of the rote-learning system. Never mind mastering the tools; they expect to open a PDF/docx on their phones and copy it to the computer. Having said that, the college I study at is one of the "top" colleges of the state.

They foolishly waste a year of our lives by teaching "common" subjects like chemistry, physics, elements of {electronics,mechanical,civil} engineering, etc. I mean, they may be marginally useful, but they waste time and money on that while there's a lot of stuff to learn in CS itself, just because some baldhead decided it would be good to teach all subjects to engineering students in the first year.

I really wish I could study in US.


2 words: bash reduce

(e.g. using cut, sort, split -l, nohup, and ssh to orchestrate a massive job with only bare Unix tools available)
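
A rough sketch of what that can look like (the hostnames, chunk size, and do_work.sh are all made up; do_work.sh stands in for whatever the per-chunk "map" step is):

    # split the input and fan the chunks out round-robin to worker hosts
    split -l 1000000 big_input.txt chunk_
    hosts=(node1 node2 node3)
    i=0
    for f in chunk_*; do
        host=${hosts[$((i++ % ${#hosts[@]}))]}
        scp "$f" "$host": && ssh "$host" "nohup ./do_work.sh $f > $f.out 2>&1" &
    done
    wait
    # "reduce": pull the partial results back and combine them
    for host in "${hosts[@]}"; do scp "$host:chunk_*.out" .; done
    cat chunk_*.out > result.txt   # or sort -m, if each partial output is sorted

It's crude, but for an embarrassingly parallel one-off job it's hard to beat for setup time.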


All of this seems like stuff I would expect someone studying CS to pick up on their own. I learned to use git in college, but not because anyone taught me: I just figured it out because I had group projects and needed version control. On the other hand, there's no way I could have possibly learned how something like ray-tracing worked without learning it in school or doing some really intense self-study. I feel like it's a waste of class-time to teach stuff like git, shell scripting, etc. If you're smart enough to understand graphics or compilers, you should be able to learn those tools on your own without any trouble.


I wonder how different things would be if cs departments had a special lab without mice.


We had a similar course at Columbia (half semester, 1 credit). The professor was good and the information pretty useful; very similar material to this one. The paradox was that those of us who knew the significance of such a course had already learned most of these things the hard way. Maybe our other CS profs should have advertised the course as an optional requirement, to get people interested early instead of in their junior/senior year.


I have an MIT degree and learned these things when I encountered them in my work. An important thing is to learn how to learn for the rest of your life.


University of Michigan had this as an elective three or four years back, and it's served me well to understand things like piping, output redirection, and reflog. I'm glad MIT's making this public. It seems like it could fill a big gap for people who don't have the mentorship to know what they don't know.


This is a standard second year CS course at UNSW: https://cgi.cse.unsw.edu.au/~cs2041/19T2/index.html

Topics included: shell, Perl, Git, JavaScript, web servers, Make, and performance tuning.


Vim? Why not teach a real IDE like Eclipse or IntelliJ Community Edition? It has much better auto-completion, and has a GUI for Git, debugging, profiling, etc.

Also, I'm missing test-driven development, or at least unit tests, in the curriculum. Reproducible results are important, even at university.


I would add a course or two about "communication", the kind involving people. Most of us will spend our time in teams, with highly opinionated and sometimes conflicting individuals. Computer programmers don't know how to listen or to respond; we mostly just react.


To be honest, I was kind of disappointed with this, because most of it I already knew! (That's not a criticism of your course c:)

In the vim section, you might want to cover `*` (go to the next match of the current word) and `#` (go to the previous match of the current word).


Probably at less depth, but I ended up learning quite a lot of these topics at GCSE (age 16ish), with the exception of proper version control, although I taught myself git and C++ instead of actually working. Probably quite lucky in retrospect.


We have a class like this at my uni, and it's the most hated class in the CS program.


In my observation (at a major research institution), some grad-level CS students lack an even more fundamental skill: touch typing.


I thought it was going to be communication or active listening.


> Vim avoids the use of the mouse, because it’s too slow; Vim even avoids using the arrow keys because it requires too much movement.

I'm sorry, it isn't my intent to start an editor war. Use whatever you want, I don't care. Just don't lie about it. I expect far more from MIT than this nonsense.

VI and VIM are what they are because keyboards of the era (and the entire UI) looked like this:

https://retrocomputingforum.com/uploads/default/original/1X/...

Notice the location of the Ctrl key. It used to be where the Caps-Lock key is today, making it FAR more convenient and comfortable for entering Ctrl sequences.

This is from a Tektronix terminal, which I used in the early 80's. Here it is:

https://www.computerhistory.org/revolution/input-output/14/3...

The VT-100, if I remember correctly, introduced four cursor keys:

https://www.computerhistory.org/revolution/input-output/14/3...

I also used VT-100's and clones during the '80's.

https://www.computerhistory.org/revolution/input-output/14/3...

Notice the total absence of anything even resembling much more than a simple typewriter. No mouse, function keys and other modern facilities.

So, yeah, if you were writing a text editor at the time, you would be well served to do such things as implement modal view/insert operation for more reasons than just the archaic keyboards. These terminals were used to connect with remote systems at VERY LOW BAUD RATES.

It's hard to imagine that 300 or 1200 baud was considered great speed at some point in history. In that context, cursor keys or grabbing a scroll bar with a mouse to yank it around with abandon made no sense. You were literally only able to receive 30 to 120 characters per second... and a screen of 80 characters by 25 lines held 2,000 of them, so a full repaint took anywhere from roughly 17 seconds to over a minute!

This is another reason escape control sequences had to be invented: you had to be able to address your 80x25 canvas and place text where needed rather than refresh the entire 2,000-character screen.

This is why, quite frankly, I hate the "cult of vi". Cult members are, for the most part, people who have no historical connection to where this tool came from and why. We, at the time, would have KILLED for a graphical UI with a mouse. Yet that was impossible at the time due to both machine and connection speed limitations. You literally could not have used it even if you had it.

So, yes, at the time, if you had to write a text/code editor (I wrote a few) you had no choice but to use a bunch of Ctrl-<something> codes and perhaps even implement a distinction between reading and inserting code. I still remember sequences like Wordstar's "Ctrl-k-x" running on an 8080 S-100 system with, yes, a VT-100 terminal attached.

Yes, the VT-100 introduced four cursor keys, but if you were writing software at the time you could not assume that the user had access to cursor keys; most keyboards did not have them until much later. That assumption was not safe when VI was created.

Like I said above, use whatever you like, I truly don't care. Just don't lie to yourself about it, particularly when the truth is a matter of history. I think most people who came up through that era of computing laugh at the vi/vim cult because from our perspective (not yours) it is complete nonsense. This text editor had NO CHOICE but to be as it is, because it was written for the crappy computing hardware and environments of the time. If you had to use one of those systems today you would be horrified. If you had to write a text editor back then you would write it exactly this way. And the minute a decent and ubiquitous GUI showed up you would drop it like a hot potato and try to forget the nightmares.

MIT Computer Scientists ought to know history and not print nonsense like that. That entire paragraph about VI is basically wrong, historically wrong. You can leave it like that and perpetuate a fantasy or correct it and at least show some respect for history.

And then do everyone a favor and explain that the speed of text entry is of no consequence whatsoever. MIT should not propagate that cultist belief. The time devoted to things having nothing to do with text entry is, in some cases, orders of magnitude greater than text entry. Not everyone is a PHP script kiddie. Some of us are writing real and complex software, some of it with life/death potential, and code entry is so ridiculously low in the scale of where time is spent and what things are important that it is absolutely laughable to talk to someone who has become religious about text editors because of stuff like this out of a respected university. Do you really think code entry speed and efficiency is important at all when working on the code for an MRI machine, a space capsule or an implantable insulin pump? Exactly!

If coding is factory work, then, sure, mechanize as much as possible and count strokes. That would be the day I become a gardener.


> Vim avoids the use of the mouse, because it’s too slow; Vim even avoids using the arrow keys because it requires too much movement.

I haven't read the article, but this reads to me as "using the mouse is too slow, so try VIM instead and soon you won't need to use the mouse anymore." and "Reaching for the arrow keys takes too much time, so try VIM and keep your hands on the home row for added productivity.", respectively.


The point is that this kind of productivity does not matter at all. I mean, not even 0.00001%.

We deliver complex hardware + software products that have to work correctly under challenging conditions and, ideally, not kill anyone in the process or burn down entire buildings. We do not deliver code-entry athletic performance.

In a typical project code entry time is so ridiculously insignificant that I would never hire anyone who came in and made the typical vi/vim argument about efficiency. This would instantly tell me they have no mental connection to what is actually important.

Another element is that, when you work in a multi-disciplinary environment you don't necessarily have the luxury of sticking with one tool. Which means devoting a lot of time on something like vi is pointless.

Another view: Say you have a team of 100 software engineers and you have to get a complex product out the door. Would taking a month off to train everyone to become a vi/vim ninja make this team deliver properly working bug-free code sooner and at a lower cost? The answer is, of course, no. Or, more accurately, no f-ing way. That's why this cult of efficiency is misguided.



