Why MIT Switched from Scheme to Python (2009) (wisdomandwonder.com)
205 points by behnamoh on April 21, 2017 | 97 comments



I took 6.001 in 1999 (I think). Hal Abelson taught it along with a professor who took off his sweatshirt to reveal a Microsoft tee at his final lecture (he went to Microsoft).

What was great about Scheme (Lisp) is that most programs are basically words from your vocabulary, parentheses, and cars and cdrs. With procedural languages, there is always a sense that the language provides all the tools, and the magic happens somewhere underneath the hood. But with Scheme it feels as if it's happening right before you. It makes you feel like a wizard, not a monkey. And it requires knowing all the spells. It requires knowing how to make magic happen. Or not. Once you know it, none of it is magic. But that's the point!

Python is arguably easier and more practical for both science and work. It's already popular on the web server, and is used everywhere else -- unlike Scheme.

But the recent resurgence of functional programming is super exciting with Elixir and Elm and the like.

I'm looking forward to building my next project using Elixir.

Here is the classic course: https://ocw.mit.edu/courses/electrical-engineering-and-compu...

Original video lecture on YouTube (linked from above): https://www.youtube.com/watch?v=2Op3QLzMgSY


My favorite thing, having finally taken a dive into Lisp and SICP, is not that it is very friendly to functional programming. It is that it is very friendly to showing how it all works.

My favorite section is where they go over making a constraint-based system that will either calculate degrees F or degrees C, or validate that the two given values are consistent, all depending on what you have entered. And this is done from the ground up; by and large, there is no hidden magic of the system supporting you. (Not strictly true, as this does make use of GC and such throughout.)
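For flavor, here is a rough Python sketch of that propagation idea; the book's version is in Scheme and considerably more general, and all names here are made up for illustration:

    class Connector:
        # Holds at most one value; wakes its constraints when a value arrives.
        def __init__(self, name):
            self.name, self.value, self.constraints = name, None, []
        def set(self, value):
            if self.value is None:
                self.value = value
                for rule in self.constraints:
                    rule()
            elif self.value != value:
                raise ValueError("contradiction on " + self.name)

    def attach(connectors, rule):
        for c in connectors:
            c.constraints.append(rule)
        rule()

    def product(a, b, out):
        # a * b = out; deduces whichever side is missing once two are known
        def rule():
            if a.value is not None and b.value is not None:
                out.set(a.value * b.value)
            elif out.value is not None and a.value:      # avoid dividing by 0/None
                b.set(out.value / a.value)
            elif out.value is not None and b.value:
                a.set(out.value / b.value)
        attach([a, b, out], rule)

    def total(a, b, out):
        # a + b = out, same idea
        def rule():
            if a.value is not None and b.value is not None:
                out.set(a.value + b.value)
            elif out.value is not None and a.value is not None:
                b.set(out.value - a.value)
            elif out.value is not None and b.value is not None:
                a.set(out.value - b.value)
        attach([a, b, out], rule)

    # 9C = 5(F - 32), wired as three primitive constraints: C*9 = u, v*5 = u, v+32 = F
    C, F = Connector("C"), Connector("F")
    u, v, nine, five, off = (Connector(n) for n in ("u", "v", "9", "5", "32"))
    product(C, nine, u); product(v, five, u); total(v, off, F)
    nine.set(9); five.set(5); off.set(32)

    C.set(25)
    print(F.value)   # 77.0 -- set F first instead, and it deduces C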

If you haven't seen it, https://www.infoq.com/presentations/We-Really-Dont-Know-How-... is a great video showing Sussman's style. He also showcases a program they wrote that could solve circuits based on this general idea.


Thank you so much for linking this video. I had no idea it existed. Watching that lecture was so much fun.

I wish that there was a camera pointing to the crowd when he said the more controversial things.

I didn't know that Sussman hired Stallman. Their work changed our lives.


This is excerpted (as the "technology" segment) in another great video, Bret Victor's "Humane Representation of Thought" (at about thirty minutes in).

https://vimeo.com/115154289


Also, the current grad-level class in the same area (taught by Sussman) is online here: http://groups.csail.mit.edu/mac/users/gjs/6.945/

By far the most rewarding class I've taken at MIT. Check out the psets, they're pretty fun.


Miller? He advocated turning off Athena in the 90s, because NT had obviously won. Linux must have been a Hell of a surprise.


I think there's a tension between two imperatives in teaching new programmers:

1) Learning must be applied learning. Give people problems to solve and they will come to you for data structures and algorithms, O notation, etc. If they don't, they should do something else.

2) A lot of what's out there in programming languages are cargo cults, and newbies need to be prepared for this. For instance, virtual function inheritance isn't a thing. It's a weird call-chain pipeline system glued into your vtable, which C++/Java teaching also won't bother to tell you exists. Make people start from assembly, graduate to C, then write their OWN vtable, so it's demystified (rough sketch below). Now you've reduced the ability of software designers to piss on your leg and tell you it's raining.
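The C version is the real exercise, but even a Python sketch (illustrative names, not anyone's actual curriculum) shows how little is behind the curtain: virtual dispatch is just a lookup through a table the object carries.

    # Each "class" is just a table of functions; each object carries a
    # pointer to its class's table. (A compiler would use fixed slot
    # numbers instead of string keys.)

    def circle_area(obj):
        return 3.14159 * obj["r"] ** 2

    def square_area(obj):
        return obj["side"] ** 2

    CIRCLE_VTABLE = {"area": circle_area}
    SQUARE_VTABLE = {"area": square_area}

    def make_circle(r):
        return {"vtable": CIRCLE_VTABLE, "r": r}

    def make_square(side):
        return {"vtable": SQUARE_VTABLE, "side": side}

    def call_virtual(obj, method):
        # the whole trick behind obj->area(): follow the object's vtable
        # pointer, index into the table, pass the object as 'this'
        return obj["vtable"][method](obj)

    shapes = [make_circle(2), make_square(3)]
    print([call_virtual(s, "area") for s in shapes])   # [12.56636, 9]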

So ideally I'd want students (at first) to be doing something either close to the machine or in the functional model. That said, the imperative of 1) means getting people to actually produce something, which is easier in Python.


My problem with this sort of bottom-up approach to learning how to program is that what seems "fundamental" from one point of view always turns out to be an abstraction built on an even lower-level foundation.

So, virtual functions aren't real because they're just vtables implemented in C. But C isn't real because it's just fancy assembly language. But assembly language isn't real because it's just fancy machine language. But machine language isn't real because it's just 0's and 1's zipping around the hardware. So I have to learn how integrated circuits work before learning how to program? No.

Bottom-up knowledge is important for understanding performance and other trade-offs. However, there's also a lot of benefit in learning how to program in a formal system without knowing much about how that system is implemented under the covers. If I was teaching an intro programming course to undergrads in 2017, I'd be strongly inclined to teach them a clean functional language first (e.g. Elixir, F#, etc.), and only introduce gory details like pointers and memory allocation later on.


I personally agree with your inclination to teach them using a clean functional language.

I'm sure it would be possible to produce a few excellent software developers doing the bottom up approach as well, starting with a course on hardware, then assembly and then going up the stack from there. I would have loved that approach personally, but it would probably turn the average CS student into an awful programmer.


I think a relatively balanced approach is desirable, as in most things.

Sure, start the freshmen off with Python. Let them get their feet wet and see whether they even enjoy this dark art at all. Besides, knowing about the hardware before "Hello World" isn't very helpful.

After their intro class, you don't do bottom-up or top-down. You start at both ends and work your way in. You've got your computer systems course, which teaches C and a smattering of assembly and hardware, and your algorithms and data structures class, which can use an easier language (like Python or Scheme) to reduce the cognitive overhead of teaching the material.

If the students are taking these at the same time and the professors coordinate to explain how they relate at the various levels, you get very balanced developers. (Assuming they do well in the classes!)


F# would be particularly well suited for this since it can reasonably do imperative, object-oriented, and functional styles. I'm not familiar enough with Elixir to make the judgment there. Though I suppose one could also argue for something like Clojure.


> So I have to learn how integrated circuits work before learning how to program? No.

I took a couple classes in college that were designed to teach what is going on underneath the 1s and 0s of instruction sets. I didn't take the classes about how circuits work, but they existed, I had friends who took them, and I really wish I had as well.

Maybe this isn't required knowledge to do software engineering, but it's definitely useful knowledge. Why wouldn't we want to learn these things?


I agree that these are useful and interesting things to learn. I just don't agree that they should be taught first when learning to program.


Sure, I agree.


While I think that most CS people should learn assembly and computer internals basics, it would be an absolute disaster to start students off in assembly as CS 101.

Python is a fairly decent language for starting people off:

* It doesn't have a lot of boilerplate, so no magic "I'll teach you what this means in three weeks" steps

* Output for debugging is fairly easy, since print accepts a lot of stuff without needing to muck with formatting strings

* Assignment is a statement, not an expression. You can't say "if x = 3" by accident (see the sketch after this list)

* Python is one of the major programming languages, so you're teaching students something that is pretty much guaranteed to be useful for them.
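A quick, hypothetical illustration of the last two points:

    # print handles mixed values; no format strings needed on day one
    print("scores:", [1, 2, 3], {"alice": 10})

    x = 2
    # if x = 3:   <- uncommented, this is a SyntaxError: assignment is a
    #                statement, so the classic C typo for "x == 3" cannot run
    if x == 3:
        print("never reached")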


But many students come in with years of programming in a language like Python, Java, or C++. My friends and I built video games when we were in high school so we were comfortable in all three of those languages.

Scheme levels the playing field. I was a TA for an introductory CS course, and I never encountered a student who entered university with years of functional programming experience. I think moving back to Scheme would be a wise choice for universities that are combating problems with diversity.


If students have years of programming experience, why are they being made to take programming 101?


When you say "programming 101" you might not be talking about the MIT courses being discussed here.

6.001 was a course in functional programming, based on the book Structure and Interpretation of Computer Programs.

6.01 (the replacement course that uses Python) is a project-based course involving robots and stuff.

Neither of these courses depends on any specific knowledge, but they would be very difficult courses if you've never programmed before. (Most people coming into MIT course 6 have programmed before.)

The "you don't know how to program, here's how to program" course has taken various forms over the years. When I was there it was 1.00, a Java course for engineers. Later it was 6.00, an optional intro to Python, based on "How To Think Like A Computer Scientist".


>6.001 was a course in functional programming, based on the book Structure and Interpretation of Computer Programs.

While I can't speak for MIT, SICP was used as the introductory course for CS in a number of universities. It assumes no programming. It is challenging, but that's the point of going to top universities.


As part of a reorganization of the undergraduate EECS curriculum last year the intro class is now two half-semester classes, 6.0001 and 6.0002. It's 0's all the way down!


Because they probably don't know as much as they think they know regarding algorithms, Big-O, etc. There are exceptions, of course.


Because it is a required subject in your curriculum? You don't get to pass people just because they say they know something.


If a student claims to know what the class would teach, you don't have to take their word for it: you can talk to them, or you can give them a test.

(I skipped the first two intro CS classes in college after talking to the professors and figuring out the right place to start given my background.)


Most campuses call this "challenging" - I did it several times. In city college, I successfully took courses without their prerequisites by speaking with the professors and showing them my work/experience. It wasn't test-based, just a portfolio review.

In university, successfully challenging also gives you credits for that course (though there is a small fee per unit). I successfully challenged four courses (all in graphic design) - three were based on portfolio review and the fourth on completing the course projects in an accelerated period. Additionally, though not quite a challenge, I was able to talk a couple of CS professors into admitting me into their courses despite my not having the prerequisites (I was taking them concurrently).

The course requisite structure is more a guideline than a rigid computer system that works one way and only that way.


Berkeley used to let students skip the introductory CS course, 61a (in Scheme), if they had received a 5 on the AP CS exam. They removed that option I believe after fall 2010.

One of my friends was able to skip 61a by meeting with the department chair, Paul Hilfinger, and convincing him that 61a would be a waste of time for him. My friend could drone on and on about the simplest of subjects so we assumed he just annoyed Hilfinger until he said "enough already".


Hilfinger also liked to give people enough rope to hang themselves many times over.


My university did. COS101 was an optional class for complete novices but if you weren't already at least exposed to programming your advisor would sign you up for it. COS125 was the first for-realsies class (taught in Scheme, as it happens).


In addition, computer science is not the same as programming.


Today it is not "many" students who come into college with programming experience; for instance, Berkeley's CS61A, which also moved to Python, has over 1,300 students. The overwhelming majority have not taken CS classes before.


> The overwhelming majority have not taken CS classes before.

When I went to university I had never taken a CS class before and nevertheless could program really well. I believe one should not go to a university if one is not able to teach oneself the prerequisites of the subject one wants to study.


Dang. 61A with SICP was my favorite class.


It really depends on the type of school. We started programming at university with a simplified assembly, but that made a lot of sense since it was an electrical engineering school and we learned about hardware in parallel, so we could follow what physically happens when the code is executed. I don't remember anyone having huge problems following it. For many people around me with good math skills but no previous programming experience (and little interest in it), it was much harder when we next moved to C and data structures and algorithms, where you need to think in an abstract way and solve more complex problems.


I chose electrical engineering for some weird reason - I don't think it occurred to me that software engineering was a way of making a living, even though I had been a hobby programmer since I was 12.

Luckily the introductory course was based on SICP - and it really realigned my BASIC- and C-damaged brain.

It turned out to be the only course that I enjoyed so I eventually switched to a more computer systems and sciences oriented education.

It was a good introductory course for me, and perhaps most others. Perhaps not good for the most daft though...

I guess you also need some assembler skills, but that's hard in a totally different dimension somehow.


I'll add one more item to this list: Python has a turtle graphics library out of the box. Turtle graphics is an excellent way to explain concepts like loops, functions and recursion in a visual way that is much easier to grok than the more abstract explanations. It goes step by step, too: you can show the basic notion of issuing statements by drawing stuff like triangles and squares and pentagons; then demonstrate how this becomes tedious as number of edges grows, which gives an opportunity to introduce looping; then show how it's essentially the same loop for various polygons with different parameters, and introduce functions. Drawing spirals, snowflakes and trees is where recursion steps in. And so on.
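For instance, a minimal sketch of that progression, using only the stock turtle module (the exact numbers are arbitrary):

    import turtle

    def polygon(t, sides, length):
        # one parameterized loop draws a triangle, square, pentagon...
        for _ in range(sides):
            t.forward(length)
            t.left(360 / sides)

    def spiral(t, length):
        # recursion shows up naturally once the drawings get interesting
        if length < 5:
            return
        t.forward(length)
        t.left(92)
        spiral(t, length * 0.94)

    t = turtle.Turtle()
    polygon(t, 5, 80)
    spiral(t, 150)
    turtle.done()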


Another approach is to introduce newcomers to programming with a single course that: 1) uses Python to teach general programming, 2) uses JS to show how to write code that runs in a browser, and 3) demonstrates low-level programming patterns using C.

This is essentially what CS50 at Harvard does, and it seems to work well. Once a student has been exposed to these 3 languages, subsequent courses can explain how a computer works and introduce new paradigms (FP, LP).


Yes, I think it's very reasonable to use a scripted dynamically typed language to learn control flow to start with. We tend to forget that the basics of loops and mutation are a hurdle for many people.


I'm not sure new programmers need to worry about how virtual function inheritance is implemented via the vtable. Is there even a standard for that? Or is it an implementation detail that just happens to be popular across multiple compilers?

There's time for them to learn the details, but just learning that the functionality exists and knowing how to use it is a good start.


The point isn't to learn it as if it's something you'll have to recreate--it's to understand that "advanced" language features are built on top of the "fundamental" structural model and are not magic. You need some understanding of how programming languages are built in order to keep languages and libraries honest. Otherwise learning about type/object systems becomes learning all the pitfalls and incidental complexities caused by the design as if they're benefits and not costs.


I kinda despise the notion that assembly is everything. It's better than falsely "higher" abstractions like C++ and Java, but it's not everything.

Inductive logic and the functional paradigm map onto all of these, and allow for more diverse abstractions without lying either.


One of the key things about SICP in Scheme was that the language used so few intrinsic keywords and structures (I think it was something like 7 intrinsic keywords that could be used for just about everything you needed to do) that you could move on from learning about the language really quickly.

It kept the focus on the underlying principles (every programming language is a set of primitives, a means of combination and a means of abstraction; recursion; interpreters; etc) that were powerful enough to let you pick up any other language without getting bogged down in the details. An opinionated language like Python seems like the wrong choice to teach fundamental structure (it WAS Structure and Interpretation of Computer Programs), but if they aren't approaching it from that angle and are looking more at application development then that's probably ok.
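That trio carries over to any language; here is one hypothetical way to render it in Python:

    3.0, "cat"                     # primitives

    (3 + 4) * 2                    # a means of combination: nested expressions

    def compose(f, g):             # a means of abstraction: name a pattern
        return lambda x: f(g(x))

    inc = lambda x: x + 1
    print(compose(inc, inc)(40))   # 42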

You can pick up enough of any language or framework from the manual, and google/stackoverflow/o'reilly will give you the fine points and quirks, but that fundamental base that allows you to look at ANY program in ANY language and reduce it to first principles is (IMO) a necessary component of any computer scientist's toolkit, and the best insurance against the continuing rise of packaged software that is burning the short grass in most software development.

I remember reading somewhere that focusing on application development is like training blacksmiths: you end up with people skilled at specific techniques but extremely vulnerable to "industrial" software that replicates the techniques and eliminates the need for custom development.

There are not that many blacksmiths left, and the truth is in 50 years there won't be that many application developers left either.


> 50 years there won't be that many application developers left either.

I like your comment, but the crystal ball statement in the last sentence is a bit too much.


There are still blacksmiths out there doing specialty work, and there will always be application developers. And yes, I may have the timeframe wrong. However, at the rate things are accelerating, I really do think most of the simple stuff will be gone.


I think it's probably right. 50 years is a long time.


I've been coding professionally for 30 years. And programming isn't really much different now than it was 30 years ago. The tools are better, yes. But we're still sitting at desks in front of a screen, entering code into computers as text, with keyboards.

Perhaps there will be a threshold reached where suddenly that goes away. But I'm not holding my breath.


Berkeley made the same switch at around the same time, and it made me sad.

Having done the intro course in Scheme, I'd say it helped me understand functional programming far more than I ever could have with Python, and it opened me up to different ways of thinking.

Even though I never again used any Lisp variant, I'm still really glad I learned it and feel that what I learned using Lisp has informed my future decisions.


The switch occurred while I was at Berkeley and it also made me sad. I loved that course so much I decided to TA for the self-paced version which kept SICP and Scheme.

Brian Harvey explained the decision to switch here:

"But, as I keep saying, the choice of programming language isn't the main point. I get upset when students, even the students who like 61A, refer to it as "the Scheme course." It's not a course about Scheme! It's a course about programming paradigms. MIT, where SICP was written, no longer includes that course in its lower division core sequence, not because they wanted to change programming languages, but because they changed the entire sequence from courses organized around big groups of ideas to courses organized around application areas: first "let's build and program a robot" and then "let's build and program a cell phone." That's a brave and thoroughgoing attempt to, among other things, attract non-nerds to computer science. To say, as some people do here, "MIT switched from Scheme to Python" completely misses the point of what MIT did; nobody is proposing any change half as profound at Berkeley." [1]

[1] https://people.eecs.berkeley.edu/~bh/61a.html


> That's a brave and thoroughgoing attempt to, among other things, attract non-nerds to computer science.

That's Berkeley. Abelson and Sussman once got MIT to trademark "Nerd Pride".[1] MIT still keeps the trademark active. Here's a Nerd Pride button.[2]

[1] https://books.google.com/books?id=LJq0JhoElk8C&lpg=PA78&ots=...

[2] http://www.computerhistory.org/collections/catalog/E1312


Hmm. I don't think Berkeley intentionally switched from Scheme to Python to attract so-called "non-nerds"; it was founded more upon a recognition that computer science will become essential in every field, and a desire not to put at a disadvantage any student who wanted to learn but was new to programming. I mean, current enrollment for CS61A is well over a thousand, the university lacks the resources to properly fund the department, and there's a 3.3 GPA minimum to declare the major (for Letters & Sciences, that is; College of Engineering students are admitted into the major as freshmen).


1. That looks like an interesting book. Have you read it?

2. I think you are focusing too much on the use of the word "nerd" here. Harvey chose the use of that word, not MIT.


CS61A still has a Scheme project (albeit a Scheme interpreter implemented in Python), and focuses on Scheme for about a month before switching to SQL.

[1] http://cs61a.org/proj/scheme/


61A gave me my entire background to programming and computer science. I had never programmed a computer before, and I think I was better for it.

The common phrase was that you should be made to take 61A as your first class and as your last. You don't fully comprehend it until you get wider exposure, but you need that initial exposure so your brain can begin to knit things together as you see them.

Meta-Circular Evaluator RIP. You will be missed.


I liked Sussman's answer. There is a huge difference in learning things deeply from the bottom up vs. 'get stuff done' approaches using lots of black box libraries, etc.

I have been using Ruby, Java, Scala, Python, and Haskell mostly for years now. But just recently I started working on the 4th edition of my old Common Lisp book and some preparatory work on a new Scheme book. I have been enjoying the Lisp-way of bottom up interactive development a lot.

Both general approaches have value. When doing machine learning it is great to have the universe of useful libraries for Python but for green field work that is more algorithmic, doing bottom up development in languages like Scheme or Haskell is also great.


At UBC CS a few years ago they made a switch in the opposite direction, from Java to Racket, for their intro course. I think this was a very good decision because it allowed them to spend less time teaching syntax (which in Racket is very simple) and more time focusing on fundamental topics such as graphs, recursion, functional programming, and test-driven development. Compared to the equivalent intro programming course taught to engineers in C, I think the CS students learned a lot more than the engineers, whose most complicated material involved arrays and procedure calls. What's interesting is that I believe the inspiration for teaching functional programming as an introduction came from MIT.


The sheer amount of nonsensical boilerplate required in any Java program, even trivial ones, makes it a terrible language for teaching Computer Science.

Think about trying to teach the meaning of "public class Foo { public static void main(String[] args) { ... }}" to a complete novice. Class-based design, inheritance, visibility, return types, arrays, etc. all right off the bat. It's too much.

Scheme and Python both are much better choices for this alone.


Some snarky drift: then post-college commercial reality sets in, where using mind-numbing boilerplate supported by frameworks and heavy IDEs (as opposed to libraries, custom or otherwise, and an editor) is SOP. Join the programming masses and literally think inside the box(es). So why did you go to college, then?


So that you can realize that you're inside a box, and leave it on those few opportunities that present themselves (and, perhaps, specifically seek out such opportunities).


One point of learning these is to learn the ability to learn new languages on your own.

I would have a semester teaching several "toy" languages in different paradigms (perhaps some kind of Lisp, some kind of Prolog, and some procedural language), followed by a second semester where students choose the language they want to learn, and learn it by themselves.


"Trust the natural recursion"


"And why Python, then? Well, said Sussman, it probably just had a library already implemented for the robotics interface, that was all."

That, right there, is the reason why I end up using Python so often.

As dynamic languages go, Python is not the best for anything, but it's good enough for almost everything, and there's probably already a library implemented for it.


Well, that tells us something about you. But it doesn't exactly get to the heart of why people like writing libraries for Python.

My theory: it was the first language to take a human factors approach to language design. Not the best, but the first. This design philosophy has the simple consequence that Python programs are by and large incredibly easy to read, understand, and modify, compared to its contemporaries. "Network effects" did the rest.


For all those fondly recalling learning Scheme first, allow me to provide the opposite perspective (kind of).

Learning Scheme as my first language scared me away from programming for years. Yes, solutions to problems were often elegant and puzzle-like, but accordingly, the tools and patterns often felt constrained and roundabout. For example, consider recursive patterns vs conventional looping patterns.
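To make that contrast concrete (a hypothetical beginner exercise, not something from the course):

    def sum_loop(xs):
        total = 0                 # the conventional pattern: mutate an accumulator
        for x in xs:
            total += x
        return total

    def sum_rec(xs):
        if not xs:                # the Scheme habit: a base case...
            return 0
        return xs[0] + sum_rec(xs[1:])   # ...plus a self-call on the rest

    print(sum_loop([1, 2, 3, 4]), sum_rec([1, 2, 3, 4]))   # 10 10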

In that class I often felt I wasn't learning a 'real' production programming language, but rather a language designed to teach me programming concepts.

To that point, I think Python offers a much more robust and straightforward toolset, while also being friendly enough for the young developer to sink their teeth into.

However, in retrospect I was just a bratty kid, who was learning a lot of valuable fundamentals. I'd like to think some of those have stuck with me to this day, esp. as functional patterns have come into vogue.


Eh. If you stick to software development you have the rest of your life to poke at non-ideal systems. I still have fond memories of learning Scheme and realizing that programming (specifically computationally dissecting a problem) could be a ton of fun.


Sussman co-wrote "Structure and Interpretation of Computer Programs", the book formerly used in many introductory CS courses, including 6.001.

"The wizard book" is a great read and I recently starting flipping through it again.

PDF https://mitpress.mit.edu/sites/default/files/6515.pdf

HTML https://mitpress.mit.edu/sicp/full-text/book/book.html


This was posted here a few weeks ago, but for posterity: [1] and [2] are an HTML5/EPUB3 version of SICP with 'SVGs, mathematical markup with MathML and MathJax, embedded web fonts, and syntax highlighting'. There's a PDF version [3] from the same author with decent typesetting. Additionally, I came across [4], SICP in interactive-textbook form with editable code fragments.

[1]: https://github.com/sarabander/sicp

[2]: http://sarabander.github.io/sicp/

[3]: https://github.com/sarabander/sicp-pdf

[4]: https://github.com/zodiac/isicp


Could it be, however, that there is a need for introductory courses at university for those not quite at the level for SICP yet? I've worked as a web developer for years and can barely follow most of that text. If intro CS courses are provided in a more accessible format, more people will get interested and stick with it in the first place. There's always room for more advanced courses. I think that should be the goal of most universities.


Also available in Info format, for easy browsing and searching from within Emacs:

http://www.neilvandyke.org/sicp-texi/


There's been time for the new course to shake out. Anyone taken it, and ideally read SICP too, and want to tell us about it?


We used Scheme in my programming languages course (one of my last courses before graduation). At first I hated it, but once I got it, it was fun! Functional programming forces you to think in a way that is much more conducive to understanding programming on a deeper level and to creative problem-solving. Imperative programming, while easier, is more of a blunt instrument. Efficient, but blunt.


Functional programming is an attempt to pretend that Von Neumann machines are abstract, mystical mathematical engines, rather than a bunch of registers that read in values from an electronic grid, mutate them, and write out new values back to the grid.


Every higher abstraction could be described as an attempt to pretend that the lower abstraction is less complicated and error-prone than it actually is.


I know - I'm being facetious. But really, if you want to understand how most modern computers actually work, there's probably no better way than to learn C. It's high-level enough, and provides enough structure, that you don't need to blurt out a bunch of assembly instructions, but low-level enough that your code more or less corresponds to what the underlying Von Neumann hardware is actually doing.


I'm not convinced C without assembly is much better than Pascal or even Rust. Sure, you might not need a full assembly course (write a real, multi-user, bare-metal OS in assembler using most of x86 16-, 32- and 64-bit syntax) - but a C course mixed with assembly makes a lot of sense, I think, if the goal is to learn a bit about how computers get work done.

I mean, one should be able to look at the assembler output of hello.c and have an idea of what's going on. Without that level of understanding, I'm not convinced using C over many other languages gains you much. It's still a lot simpler than e.g. C++, though.

I'm thinking something like: http://pacman128.github.io/pcasm/ but rewritten from the ground up for x86_64.


I think it depends on what "modern" means. If modern means more or less the system-architecture basics from the 80s that our current systems pretend to be, C is a pretty straightforward mapping; but if modern means Intel Kaby Lake/AMD Ryzen, there's a lot going on that C, or even assembly, doesn't really address.


> if you want to really understand how most modern computers actually work, there's probably no better way than to learn C

I have found advice like this to be a good sign that the speaker has a very limited understanding of how modern computers actually work.


How computers really work never clicked until I took a computer architecture class where we started in a logic simulator with individual gates, and progressed step by step until we had an 8-bit computer, running in the simulator, that could load in machine code generated by an assembler we wrote for that simulated computer, or compiled from a cut-down dialect of C.

That was a long semester.
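For a hypothetical taste of the first step in that progression, here is a 1-bit full adder built from nothing but a NAND function (Python standing in for the logic simulator):

    def nand(a, b):
        return 1 - (a & b)

    def xor(a, b):
        # XOR from four NANDs, the classic construction
        n = nand(a, b)
        return nand(nand(a, n), nand(b, n))

    def full_adder(a, b, carry_in):
        partial = xor(a, b)
        total = xor(partial, carry_in)
        carry_out = nand(nand(a, b), nand(partial, carry_in))
        return total, carry_out

    # chain eight of these and you have the adder at the heart of an 8-bit ALU
    for bits in [(0, 0, 0), (1, 0, 0), (1, 1, 0), (1, 1, 1)]:
        print(bits, "->", full_adder(*bits))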


You take a course in computer architecture to learn how computers work. If you follow up with assembly, you'll be able to do things you can't express in C without breaking its common model. What the underlying hardware is doing also varies considerably across implementations: x86 CISC; RISC's; embedded; SIMD/MIMD; multicore; HLL or safe/secure CPU's; DSP's; GPU's. I know C doesn't map to a lot of that because there were whole Ph.D. programs trying to extend it with things like Cilk language. However, you learn C language if you want to learn its model or how to work on software using it. It will also teach you a little bit of what a computer architecture course will but leave off parts that could help a lot.


Of course C doesn't map to all that. Neither do most of the MIPS emulators they actually use to teach MIPS assembly in universities. But it does map pretty well to basic loops/conditionals and other intro-to-programming stuff.

I mean, just write a simple C program that reads in argv[1] and then loops from 1 to atoi(argv[1]) doing fizzbuzz. Then compile the program (without optimization) and look at the generated assembly. I just did that on an x86_64 using gcc, and the generated assembly is pretty straightforward and maps pretty well to the actual C code.

The only part that doesn't neatly map to the C code is the boilerplate call-stack stuff (subq instructions on the rsp register, etc.). But everything else maps rather cleanly. The library calls to printf, etc. are mapped to a simple call instruction. It demonstrates how closely C code maps to the actual assembly for these sorts of simple programs, which is what actually matters in a pedagogical environment.

This compared to similar Python code, where the actual op codes being executed would be vastly more complicated and less obviously related to the actual "business logic" of the program - you'd have constant incref/decref'ing of objects (assuming CPython), byte code execution in the main ceval loop, hash table lookups for globals, etc. etc. The point is C code maps much better to what the hardware is actually doing. Your arguments to the contrary (talking about vectorized instructions and GPU) are really just being nitpicky. That stuff can be covered later in more advanced courses.


Your facetious claim was that understanding how a modern computer works is best done by learning C. I said it's best done with courses on computer architecture, with intros to the fundamental techniques. The comment I'm replying to has good detail on the ways C maps to computers better than most languages do. The original claim is still false across the board: people capitalizing on the underlying hardware are better off understanding the various forms of computing first, followed by the languages that take most advantage of each (or most of them).


I was using "deep" in the artistic sense, which is analogous to "abstract." You're right that imperative languages model the hardware better; I just think there's something to be said for understanding programming from the artistic side as well as understanding the metal.


Berkeley EECS is MIT-lite, and its 61A used SICP/Scheme. They even wanted us to use Emacs (which, for me, so wasn't gonna happen). So pretty much everything in this article applies to Berkeley EECS as well. They've also shifted from SICP/Scheme to Composing Programs/Python.

http://cs61a.org/articles/about.html

The last time I used Scheme was in 61A. I liked Scheme; it was elegant. I probably would have continued to use it, but Instructional Computing had a bastardized variation. I'm pretty sure that students of 61A will continue to use Python, since they're using Python 3.


Funny - I was present when this conversation took place. Hearing Gerry Sussman talk was truly inspiring. However, I would somewhat disagree. For sure, the way we program in day-to-day work has changed, and the curriculum should reflect that. However, I think learning the very fundamentals of computation is essential for a proper understanding of systems. SICP provides a great way of doing that, and the choice of Scheme, a minimal yet powerful language, fits that goal. I can only recommend that everyone at least read through this great book.


I implemented the Huffman trees (Section 2.3.4) in Clojure.

https://github.com/nickbauman/sicp/blob/master/src/section2_...

I loved the process. Scales fell from my eyes. I didn't get this with Python, even though I love that language too.



Pretty sad honestly. We used SICP and Scheme in the first computer science class I ever took and that is the most fun that I ever had programming. I remember I couldn't wait for new homework assignments to be released so that I could solve the fun puzzles.


This book (http://composingprograms.com/) is in Python and is based on SICP. I found it interesting enough to include in my ever-growing to-read list; worth checking out.


I hope they take a look at Pyret, a language designed by a team of CS teachers from Brown. They have many good points on the value of Pyret over Scheme: a somewhat more formal view of programming (recursive ADTs, tests), and their students are learning fast with it.


I think SICP is an awesome book, but I don't see why it has to be Scheme, at least for the first half of the book. Let's not be too dogmatic about our shovels. A shovel is a shovel.

For example, here is SICP with examples using the JavaScript shovel: https://www.comp.nus.edu.sg/~cs1101s/sicp/

BTW, since this site isn't super mobile-friendly, I made a mobile-friendly version a while back: http://ivanistheone.github.io/SICPapp/ (only the first two chapters)


The metacircular evaluator should be a prime example of why Scheme matters. In JavaScript it completely loses focus in a mess of parsing, and it loses the idea that code and data are the same thing. Also, that whole tail-recursion thing is kind of important.


If people aren't exposed to enough languages to a level of competent understanding, if not mastery, a groupthink monoculture will inevitably develop where engineers only understand JS and Python, not C, Tcl, Clojure, Erlang, Brainfuck, Haskell, Coq, Rust, etc.

The most important thing is for good engineers to understand both near-universal basic concepts and the low-level impact of data structures and algorithms (like branch mispredictions / deep pipeline stalls, memory usage, latency, locality).


Joel Spolsky wrote a blog post about this a while back - about the advantages of starting off with a language that is less mystified but harder to grasp, like C or Scheme, over one that might be more beginner-friendly:

https://www.joelonsoftware.com/2005/12/29/the-perils-of-java...



I trained as an electronic engineer and never gave a thought to informatics - too much theory, more math than anything else, and OOP and RUP is all there is to it anyway, so whatever.

I ended up in IT and programming, and when I bumped into SICP I regretted the 10 years I'd spent without having read it.

Someone commented that it's the first and last book you should read at Uni, and I totally agree.


One of the things I loved about the choice of Scheme was that nobody knew it, so it helped level the playing field. Also, it's a small language, so we got through the "here's Scheme" part quickly.

That said, the rationale presented in the article is a really good one. Hm.


I prefer "breadth-first" to "depth-first" for learning, which is why I think Python is a better language for learning.


All I knew about Sussman prior to reading that quote was that he was an author of SICP. Now I'll think of him as a tedious and arrogant curmudgeon. Can someone give me a nice quote by him to make that go away?


To be clear, that was a serious criticism. Pretending that there was no particular reason for choosing python is a classic case of being unwilling to credit others (python authors) for their hard work, and is thus arrogant, ungrateful and dismissive. Claiming that "back in my day programmers had to think hard" is a classic case of being tedious and curmudgeonly.


Read many comments, decided to reply to the main one so it's easier to address them all. Many people suggest that it was a bad switch and that Scheme should have been kept because it helped them understand FP.

I think it would be doing a disservice to other students if they didn't switch. The rationale given was actually a great one. OOP is there for a reason. I do not like the idea that FP is the answer to everything.

I understand many students used Python, C++, etc. This was the case for me as well. I did not go to any lectures for the Introductory course. I spent my time wisely studying for other Engineering classes (in reality: I actually didn't do shit and did other activities). You should do the same too.

What I am trying to say is: just because you already knew OOP because you took the initiative before university (which I think - and I am biased - is a good thing to do), does not mean everyone else did. University is about teaching a body of students, and all of education revolves around the mean of that body. It is a sad fact, but no professor or department or university will give you the unique attention you deserve; this is why I would not suggest university for that purpose. I only did university to get a piece of paper. I learned most programming stuff on my own time, before going in.

Teaching OOP, the easier of the two (FP/OOP), is a good thing. And teaching it in Python... is a great thing. Kudos to MIT.

I think instead it is a better idea to keep Python/OOP in first year and start teaching FP, in whatever language, in second year. Or perhaps teach them both in first year. I think we just need to realize that when we go to university/school, we will have to walk at the pace of the "mean", even if that requires having to run (to catch up) or crawl (to slow down). Sad reality.

Edit: I just noticed that if I press the post button twice, it makes two posts. I deleted the dupe.



