Why teach with Ruby? (hackety-hack.com)
68 points by steveklabnik on Oct 14, 2010 | 53 comments



I think Ruby is a beautiful language, but you really have to consider it in the context in which it was invented. Matz originally created it as a kind of spiritual successor to Perl, and its original incarnation was as an easy-to-write, easy-to-read, fun-to-use scripting language. Rails came along later (I don't think anyone was really expecting it before it showed up), and it changed the Ruby conversation entirely.

Ruby is excellent in many ways, but I feel like Matz never took into account what is needed for a group of programmers to work on the same code base. I don't want to say he completely dropped the ball or anything like that, but stuff like monkey patching and other shenanigans that make life easier if you are the sole programmer working on a code base become problematic if you are working in a largish team where not everyone might know everything that is going on.
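
(To make "monkey patching" concrete for anyone who hasn't seen it, here's a made-up example, not from any real code base: reopening a core class changes it for the entire program.)

    class String
      def shout
        upcase + "!"
      end
    end

    "hello".shout   # => "HELLO!"
    # Handy when you're the only author; surprising for teammates who
    # didn't know String had been patched somewhere else in the code base.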

What I am trying to say is, IMHO Ruby "feels" like a good language to teach in, but you can end up using it for a really long time without acquiring programming discipline and fundamentals. If you start in C, for example, and you have to mess with malloc and pointers and stuff like that, you are forced to get that discipline, because otherwise things go south really fast. Someone who starts in Ruby might never really understand the performance cost of initializing an object, or what the stack and heap are, or why it matters. I think it would be better for a new programmer to start out in C or even assembly for a little while, just to understand how a computer really works; then it becomes so much easier to understand why languages like Ruby and Python (and my personal favorite, Clojure) were created and how to use the abstractions that these more modern languages provide.


I like your point about C, and I have a related point: When beginners see something that looks or reads like natural language, it activates the wrong part of their brain, and they start to wonder why the computer doesn't "understand" what they're telling it. The idea of a logical machine mindlessly executing symbolic instructions comes pretty naturally for some people, but for others, it's the first Big Idea in programming that they have to spend a lot of time getting used to. Every time they read a line of Ruby code that sounds kind of like stilted English, it causes a little regression in their brain back towards the mental model of the computer as something intelligent that understands the natural-language meanings of keywords and variable names.

Say what you want about BASIC, but programs like

   10 ? "HELLO"
   20 GOTO 10

never let you forget you were dealing with a brainless, inflexible automaton. Same thing with C.

With C, students are encouraged to ask the question, "What's really happening inside the machine?" There are simple, concrete answers based on a simplified view of the hardware. With Ruby, the answer to that question would be more complicated, and more importantly, it would be less concrete -- it's a long, long way to the hardware. C programmers start out grounded, at least grounded in a simplified model of the machine that is consistent and often helpful. How can beginners develop the habit of looking both up and down the ladder of abstraction, if they're looking down from the mountaintop into a shapeless gray cloudbank?


One thing I took away from my SICP experience was the idea that it is languages/abstractions all the way down. You can play in a super high level language like Lisp, or you can mold silicon. Somewhere you have to choose your level of abstraction.

If I am teaching somebody how to think about an algorithm like sorting an array, worrying about memory allocations is tangential, an inconvenient burden C delivers unto us. If I want to teach them about working with memory, pointers, and so forth, C is an excellent language for the job.
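
(For illustration, this is roughly what that kind of lesson looks like in a high-level language; a hypothetical insertion-sort exercise, where the comparisons and swaps are the whole point and allocation never comes up.)

    def insertion_sort(items)
      sorted = items.dup
      (1...sorted.length).each do |i|
        j = i
        # Walk the new element backwards until it is in order.
        while j > 0 && sorted[j - 1] > sorted[j]
          sorted[j - 1], sorted[j] = sorted[j], sorted[j - 1]
          j -= 1
        end
      end
      sorted
    end

    insertion_sort([5, 2, 4, 1, 3])   # => [1, 2, 3, 4, 5]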

A crazy thing is happening here when we look down upon beginners who want something that feels like a natural language. Why CAN'T we simply say, "I want an app that opens two windows: one with a list of my music files, and the other with cassette controls. Make it so."? Someone could develop a language that does this; AppleScript, for example, has some crazy high-level stuff similar to this.

We have been mentally shackled by our languages. We think in them. It is hubris to suggest that this is the canonical way to think when programming, that, for example, "i = i + 1" is the idiom for incrementing a variable.

I do believe, to be truly effective in our line of work, that our beginners must eventually come to an understanding of the platform upon which they develop. They must understand memory concepts. They must understand timing concepts. They must understand multiprocessing concepts.

But there is no reason to ram C down some poor soul's throat as a first experience. What a terrible language for teaching beginners, full of syntax and slopped-together features. And what a great language to teach somebody in order to prepare them for the industry.


Ruby doesn't encourage you to think in multiple layers of abstraction. The next step down from C is a simple machine model with instructions, a CPU, and memory, which is actually kind of helpful for writing code and understanding performance. That layer is visible in the design of C. (It's an oversimplification with no concept of caches, buses, instruction reordering, etc., but those can be added to your mental model when desired.) What's the next layer of abstraction down from Ruby? A bytecode machine and a garbage collector, neither of which is evident in the language design (to a beginner) and neither of which is especially helpful to think about while writing Ruby code.

As an abstraction, Ruby just isn't leaky enough. It's too good. It provides no necessity and no reward for thinking about the next layer down. (Plus the next layer down is pretty sophisticated, and pretty abstract from the beginner's point of view. A beginner should know his computer has a CPU and memory and machine instructions, but bytecode machines and garbage collectors? They will be a mystery until later in the student's development.)

> A crazy thing is happening here when we look down upon beginners who want something that feels like a natural language. Why CAN'T we simply say, "I want an app that opens two windows: one with a list of my music files, and the other with cassette controls. Make it so."? Someone could develop a language that does this; AppleScript, for example, has some crazy high-level stuff similar to this.

Natural-language promises are a lie. When you understand the rigid logical nature of a machine, it's fine. There's no disappointment or confusion when the natural-language model breaks; you just fall back from your natural-language mental model to your mental model of the computer as a logical machine. You can do that because you already have that model of the computer as a mindless, mechanical device.

What happens if you don't already have that mental model? What if your closest familiar model for writing a program is writing instructions for another person? When you don't yet understand the nature of the machine and the nature of programming, natural language-like functionality intentionally misleads you. It tries to hide the nature of the machine from you. It will occasionally fail, and the cumulative effect of dealing with the occasional failures may be that you come to understand that the rigid, uncomprehending behavior that peeks through is the true nature of the machine, but why force a beginner to deal with this now-you-see-it, now-you-don't approach to learning how computers work?

Even worse, a beginner might continue to base his model of the computer on the "normal" humanistic behavior and just write the "failures" off as mysterious, uncharacteristic aberrations. When that happens, a programming language that mimics natural languages is actually much worse for beginners than it is for experienced programmers, because it encourages beginners to stick with a mental model by which the machine will always be mysterious and inscrutable.

Ruby isn't that close to natural language, of course, but I think it's close enough to occasionally revive a beginner's dominant, inappropriate mental model and hold back their learning. I see that phenomenon in myself with the "loop" macro in Common Lisp; lacking a precise understanding of loop, my first impulse is to say things that sound right, write them as code, and then see what happens. I would never actually get it right that way, of course -- it would just be an exercise in random frustration unless I strictly copied a known example -- but somehow it feels like the right thing to do, instead of developing a valid understanding. When syntax has no pretension to correspond to natural language syntax, it encourages a more appropriate mindset.


   Ruby doesn't encourage you to think in multiple layers of abstraction.  The
   next step down from C is a simple machine model with instructions, a CPU,
   and memory, which is actually kind of helpful for writing code and
   understanding performance.

Vaguely. The "C is close to hardware" point has been well put to rest by folks with better knowledge of compiler intricacies than I have. C gives you the illusion of knowing what you are doing. Your mental model does not help you much in C, at least not in modern C on modern architectures. The next step down from C is object code generated for a specific OS in a specific memory model, with a spiderweb of links to other libraries. After that, the OS processes that object code, sending instructions to your hardware: a complicated machine model that includes instruction pipelining, memory pages, and so forth, all of which are difficult to handle by hand without a clear understanding of the interactions. You pointed that out, but you oversimplified the issue.

Fortunately, we are rescued from having to have that much understanding by the C language, which abstracts a lot of concepts nicely. I should point out that printf(), the first output anyone learns in C, has its own language for outputting in the form of format specifiers, far removed from direct machine code. Which instruction do you use for "i = i * 2"? A multiply or a shift? Some developers rewrite malloc() because of various inefficiencies in the way it manages memory, but the beginner can understand malloc() on a very simple level and be successful.

The nice thing about Ruby, Python, Perl, Java, and many other languages is that you do not have to worry about the memory drudgery. In all likelihood, you are going to do it wrong anyway. If I am teaching beginners algorithms, well, we can focus on memory management techniques later. We can still understand that there is a computer with a CPU and some memory available to us.

My point is that you do not have to understand all these layers of abstraction. It is nice to know something of them later, and we are all better developers for it. In fact, knowing a little about your specific language's garbage collection can, for example, help you avoid circular-reference memory leaks.

   When syntax has no pretension to correspond to natural language syntax, it
   encourages a more appropriate mindset.

This is what I mean about being shackled by our languages. We come to believe there is some "appropriate mindset". In fact, the language of choice creates the mindset, and your approach to an algorithm will depend on your ability to translate your will into that language's primitives, means of combination, and means of abstraction. Functional languages are the latest fad in part because programmers in general have rediscovered that method of thinking; as you put it, they have had a "dominant, inappropriate mental model [that holds] back their learning". Our beginners were shown the "i = i + 1" pattern and not shown the "for my $i (1 .. 8) {}" pattern.

Sure, natural languages may never be natural enough; two individuals in the same room do not necessarily speak the same English. You have to draw the line somewhere. It just does not have to be at rock bottom, and we have continued to evolve, making layer after layer on top of it. We are corresponding here on a medium that is many levels abstracted from the silicon, and many a beginner in web design can grasp plenty of HTML without understanding how a specific browser is rendering it.

I would argue more optimistically that this is not a matter of somebody's learning being held back. They learned and applied a method that is valid. When that method became insufficient, bug-ridden, poorly optimized, whatever, they learned or experimented with a new approach. This is not holding anyone back in any meaningful sense; they are evolving.


Absolutely agreed.

Being a Ruby fan myself, I have to admit that Ruby is so sweet and so abstract that a student starting with Ruby is probably not going to move to a lower-level language of his own free will.

People still start programming with Pascal in Russia, and it's a good thing, imho.


People don't need to start in C. If we required that, there would be many fewer programmers. In University, our CS department got people started in Java (not the best starter language, but certainly cleaner than C) but our required OS course was in C. So, after being introduced to programming in a somewhat simpler form, you were required to spend a semester writing big projects in C, and you learned about pointers and memory allocation really fast.

So, I agree that programmers should learn more about how these machines really work if they want to write good code, but I'm not convinced people should have to start at that level. Doing so would unnecessarily discourage people from learning to program. Imagine suggesting that everyone should learn to start programming in assembly. I think that's an analogous claim and is pretty silly. People starting to learn to program shouldn't be worrying about low level details or unnecessary syntax, they should be discovering the joy of creating programs.


My experience has been that Ruby lets you achieve what you need at your own pace, and learn what you need as you go.

When you start to learn C, you really have a couple of technical points to understand before you can print a simple hello world.
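
(The contrast shows up in the very first program; the complete Ruby version really is the single line below, while the C version needs an include, a main() signature, and a compile step first.)

    puts "Hello, World!"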

Side note, but I'm thankful to Matz for creating a language that makes disciplined programmers able to create highly maintainable code bases, too (I find it easier than in C# or Java, at least in my experience).


"Sidenote but I'm thankful to Matz for creating a language that makes disciplined programmers able to create highly maintainable code bases"

I agree with that. The "disciplined programmers" part is, however, the reason that I think that Ruby should be a reward for programmers who make it through C programming classes first.

Ruby's a great language, and a lot of fun to program with, but the same characteristics that make it an enjoyable language to use also make it possible for undisciplined programmers to get things done.


> stuff like monkey patching and other shenanigans that make life easier if you are the sole programmer working on a code base become problematic if you are working in a largish team where not everyone might know everything that is going on.

That's why God invented conventions.


Conventions: excusing language design flaws since 1956.


The gun is not the reason you shot yourself in the foot, right?


Language design trade-offs, easily presented as flaws: inevitable since 1956.


"Matz originally created it kind of as a spiritual successor to Perl"

Really?

Matz has called his language "matz-lisp". Yes, he glommed stuff from Perl, but also from CLU, Smalltalk, Lisp, and other languages.

I agree that anyone serious about programming needs to learn what happens under the hood, but an advantage of using something that is easy to jump into yet still amazingly powerful is that people get to first see if programming is really what they want to invest time in.


Matz has also said that it was specifically motivated by his frustration with Perl's lousy OO support, leading him to come up with something similar but strongly based around objects. He specifically called it "the next language after Perl," and named the language "Ruby" for the next birthstone after the pearl. When asked how much of Perl made it into Ruby, he said, "A lot. Ruby's class library is an object-oriented reorganization of Perl functionality … I used too much I guess." The language contained a lot of Perlisms early on that fell out of favor as it diverged further. Overall, the language is more evocative of Perl (in its scripting capabilities) and Smalltalk (in its block-passing pervasive OO model) than any Lisp that I know of. I don't doubt Lisp has had some influence, but he definitely had Perl in his sights when he decided to create it.

(Source: http://linuxdevcenter.com/pub/a/linux/2001/11/29/ruby.html)


I think it depends. For teaching I would start with Scheme to teach the basics. Then switch to C to explain how all that stuff works.


The arguments are all very good, but I think I prefer Scheme as a first programming language just because the syntax is easy to teach and you can get to the good stuff much more quickly. Doesn't meet your 'real world' requirement though - although, I think a few months spent learning the basics in Scheme would make it easier to pick up subsequent languages rather than vice versa.


I think Scheme is a fine language, especially when paired with SICP. But I see it as being something that's more of the leap from intermediate to master than total-beginner to literate. I could certainly be wrong about this, however!


Unrelated, but I often wonder why people insist on a program that prints a string to the screen as the universal best first program. It's very practical, but it's also so tangential to what programming is.

Just delay compilation until the third week and you'll be able to start the dialogue at objects, "is a", "has a", actions, properties. This is the language we use to deal with the world already, and though it's got funny syntax attached, it's not a foreign concept to anyone. Talk about the runtime system and its expectations later.
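
(A rough sketch of where that dialogue can end up, with invented class names just for illustration: "is a" becomes inheritance, "has a" becomes an attribute, an action becomes a method.)

    class Animal
      def initialize(name)
        @name = name          # has a name
      end
    end

    class Dog < Animal        # a Dog is an Animal
      def speak               # a Dog can speak
        "#{@name} says woof"
      end
    end

    Dog.new("Rex").speak      # => "Rex says woof"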

(Edit: I think this is also why a lot of people hate the IO monad in Haskell so much. Side effects are actually an advanced topic in pure FP, so much of the literature on IO is based around flaky metaphors and "don't worry about how this works", just to get people printing things ASAP. If you just use the REPL and don't worry about implementing it, then you can build up to understanding the interaction between IO and the RTS in a sensible way.)


It goes straight to self-expression and provides a program that is customizable in a way as unlimited as language itself. A beginner can put in a string that no one else would ever say and then have the satisfaction of seeing a program that no one else could have written. It instantly illustrates how programming enables individuality, self-expression, and control.

Also, for environments where saving code in an editor doesn't instantly cause a change in behavior (compiled languages, web programming, working with a running Lisp image) it gives beginners an easy way to learn and troubleshoot the process of deploying their code.


>Unrelated, but I often wonder why people insist on a program that prints a string to the screen as the universal best first program. It's very practical, but it's also so tangential to what programming is.

The point is that the first step is getting your environment set up so that you can write[, compile,] and execute code. Printing a string is just a conveniently trivial program.


It's true that that's the first step to programming as a craft, but it's hardly the first step to programming as a way of thinking, managing complexity, and building abstraction.

That's like saying the first step to flying an airplane is turning the ignition. Yes, that's true, but without a firm idea of what's going to happen after that point, you're only going to hurt yourself by turning that key. It's something that comes to a head when learning to fly a plane/program.


> It's true that that's the first step to programming as a craft, but it's hardly the first step to programming as a way of thinking, managing complexity, and building abstraction.

But programming is a craft, so how is the first step tangential?


>It's true that that's the first step to programming as a craft, but it's hardly the first step to programming as a way of thinking, managing complexity, and building abstraction.

It's the first step when learning a new language, which is exactly when Hello World programs are customary.


I think the main benefit of "Hello, World!" programs isn't the action they accomplish, but the infrastructure needed to get them up and running. It's the minimum viable program that produces something from which the beginner can tell "it worked".

You need to have your text file, download & install the compiler, figure out how to compile, and finally execute the output -- all of which are non-trivial steps for a first timer. Once you get the plumbing set up, you can then introduce other things about the language.


Quite the opposite, unless you are a born plumber who somehow ended up learning programming. Downloading and installing the compiler and learning how to compile just to get things going is to programming what putting on a condom is to... you know...


I'd like to highlight your mention of Scratch. I agree that Ruby is an excellent language for introducing older students to programming, but Scratch is truly exceptional for younger students. Scratch excels at introducing fundamental programming concepts, while hiding syntax and language subtleties from beginners. The puzzle/block metaphor of Scratch really works for "building" a fully functional program. For those who have never worked with Scratch, the underlying Squeak (Smalltalk) implementation is completely hidden.

I guess the choice of language really comes down to the audience and age group. It's unlikely that colleges will ever utilize Scratch in their introduction to programming courses, thus Ruby is an excellent choice. But elementary and middle school students would benefit greatly from learning programming concepts such as logic, control structures, iteration, etc. from a tool like Scratch.


I can't really read your post because of the background. I'd recommend changing it.


Safari Reader (or the Chrome/FF extensions like it, or the Readability bookmarklet) fixes it nicely, as it does many other sites like it.


I think that's a foreground, actually. :)


I will check into that. It's just a stock Tumblr theme, and my screen is usually dirty, so I didn't even notice!

EDIT: just plain white now. Thanks! For anyone who is curious what he's talking about, the theme I'm using had this set as a fixed background: http://static.tumblr.com/nfoxfxu/r5fl3h6w1/bg.jpg


Background/disclaimer: I learned Ruby first (largely thanks to _why's guide), but have been using Python at my job and in my personal projects for about a year and a half now, and have grown to prefer Python.

The main gripe that I have with using Ruby as a teaching language is the whole magic thing. You address this really well, and I really like your passage on abstractions, but I don't agree that the magic is limited to advanced usage of the language.

Here's my gripe in its most pure form: in Python, in order to figure out all of my capabilities right now (short of what things I can import), I can use dir(object_i_want_to_know_about) and see everything that object can do. I've got a friend who's learning Ruby, and we wanted to find a built-in String method to see if a string starts with a prefix. We found it after Googling, but I wanted to find out about it in the context of the REPL. I figured that Ruby must have similar functionality to Python's dir(), but as far as I can tell, the only way to achieve that is to hack the REPL in a very minor way (you can add a few lines to your .irbrc that will achieve this). There doesn't seem to be anything like dir() built into the language, so it's a lot harder to figure out what you can do at any given point in time without Googling.

I think that this difference speaks volumes about the difference in philosophies between the two languages, and is in large part why when I introduce friends to programming these days, I tend to do it via Python, in particular via Zed's book.

I haven't used Ruby much in the past year or two, though, so if I'm completely mistaken about my assertions here, I'd love to find out about ways in which I can reason about the options available to me when I'm hacking around in irb.

PS: I really liked the code example near the end of the article.


> I figured that Ruby must have similar functionality to Python's dir()

Sounds like you wanted

   'x'.methods
    => [:<=>, :==, :===, :eql?, :hash, :casecmp, ... ]
or the more focused

   'x'.methods - Object.methods
    => [:casecmp, :+, :*, :%, :[], ... ]

?
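
(And if you half-remember the name, grepping that list from irb works too; the pattern here is just an example.)

   'hello'.methods.map(&:to_s).grep(/start/)
    => ["start_with?"]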


Yup - that's exactly what I was looking for. Awesome.


The question is why Python calls it "dir" when you just want to see a list of the object's methods...


[deleted]


That's any language that has scoping. I think what you're searching for is something like using a lambda {} that captures the outer scope.

I actually used lambda to introduce some middle schoolers to capturing behavior in Ruby. It worked well.
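
(A minimal sketch of that, with made-up names: the lambda captures the local variable from the surrounding scope and keeps seeing it.)

    greeting = "hi"
    say = lambda { |name| "#{greeting}, #{name}" }   # captures `greeting`

    say.call("Ada")       # => "hi, Ada"
    greeting = "hello"
    say.call("Ada")       # => "hello, Ada" -- the closure sees the change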


Ruby is awesome.

Please don't compare it with Java; that makes me so sad.

Java is so bad, deal with it. The good thing about Java is the JVM, to use with another language like Scala, Clojure, and so on.

Being a programmer is actually funny: you can spend your whole life using 10 frameworks to do something for the web in Java, with a lot of lines of code and bugs, and think you are so good (like I did), but you're actually so bad.

I'm still trying to learn Lisp and start doing some Clojure too, can't wait!


> Please don't compare it with Java; that makes me so sad.

I only do because Java has been the go-to year one language for colleges for the past few years. I wouldn't even mention it otherwise.

> I'm still trying to learn Lisp and start doing some Clojure too, can't wait!

Enjoy! I don't have a ton of Clojure experience, but from what I've seen, it's pretty awesome.


This post was a lot of work, and I'd love to hear your thoughts, HN. Please tear me apart. :)


It seems a lot of the "magic" with Ruby comes from Rails, and I would say that teaching a new language using a framework is probably going to be pretty overwhelming to any beginner.

One thing I would worry about with Ruby as a teaching language is that the syntax is somewhat unique. I don't think this is a terrible thing (the basic structure is similar once you get past the syntax), but I think it can trip people up if it's their first language and they then move on to Java/C++/Perl/Python/PHP, which seem to follow more of a continuity.

From my experience, I actually think learning a couple of different languages at once can be really advantageous. I pretty much learned Pascal, C, and Assembly at the same time, working in the demoscene and linking them all together.

The important part of programming isn't the syntax, it's the structure and the concepts. You can always look in the manual for syntax, but structure and concepts are the hardest parts to grok.

So maybe my suggestion is to have side-notes for the examples that say, "here's how you do this in PHP, here's how you do it in Java, here's a breakdown of the differences and similarities."

On the other hand, I may be completely wrong, because I'm biased based on how I learned programming, and not everyone learns that way.

With all that said, this is a pretty great piece.


As a TA for an intro to programming class taught using Java, I wish, wish, wish we were using Ruby instead. I spend 70% of my time teaching people Java's obtuse syntax, rather than teaching the underlying concepts. Once a programmer has developed the concepts, she can code in nearly any language, but she really has to learn the concepts. Ruby's syntax is straightforward enough that little kids get it down in days, so students can actually learn computer science rather than Java science.


It depends what you're learning. If all you're learning is to memorise blocks of code, then Ruby is probably easier to learn because there's less of it. If you're trying to learn what a programming language is actually doing, I don't think Java is any more difficult than Ruby. If anything, most Java code is a lot more clear about exactly what is going on.

FWIW, I learnt and tutored Java at University and have recently been learning Ruby and Ruby on Rails for fun.


Learning what a programming language is actually doing is good, but arguably that is a different course than an intro to CS course.

One big advantage Ruby has over Java is that irb is simply a snap to use.
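
(For anyone who hasn't tried it: you run `irb` and every line is evaluated on the spot, along the lines of this made-up session.)

    >> 2 + 2
    => 4
    >> "ruby".reverse
    => "ybur"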


We also can't underestimate how much of an effect being familiar with a certain style and syntax has on our judgment of languages that break from that style.

Since I started with Pascal and C++, and then branched to Perl and PHP, the first time I looked at Ruby code, it looked like gibberish. But Java and Google Go immediately looked very familiar to me.

I've since learned Ruby better, and it's kind of come into focus, but I think that's definitely something to keep in mind when we're teaching programming to others.


I think the hallmark of any good teaching language is the ability to get up and running quickly, and Ruby (and PHP and Perl and Python and scripting languages in general) has that over Java, C++, and others.

But I think it's important to compare and contrast languages for just that reason, so that when moving to a language which requires a lot of preparation before you can even echo "Hello, World", the student doesn't get frustrated.


I agree... it should be the language and the language only if you are teaching a beginners' class.


Thanks for the post, and I completely agree. However, there is one thing about Ruby that could have been better, considering Ruby as a didactic language: the TMTOWTDI (there's more than one way to do it) concept, which Ruby borrowed from Perl. Although it arguably helps with "the principle of least surprise", I find it confusing when it comes to learning: multiple names for the same methods, the {/} versus begin/end block syntax, and the completely superfluous for keyword come to mind first, and even optional parentheses for method calls can be confusing, especially when they collide with operator precedence.
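
(A contrived example of what I mean: the same trivial loop and the same method call, each written a few of the equally valid ways a beginner will encounter.)

    3.times { |i| puts i }      # braces block

    3.times do |i|              # do/end block
      puts i
    end

    for i in 0...3              # the superfluous for keyword
      puts i
    end

    puts("hello")               # with parentheses
    puts "hello"                # without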

Slightly unusual syntax, compared to the mainstream languages, shouldn't worry anyone, since 1) it's really not that different (we're a bit spoiled there, as languages are lately crafted to mimic each other as much as possible, to facilitate "brain drain" from rival camps), 2) the aim is not to "learn to make money by programming" but to learn to code, and 3) opinions here are highly biased, because we already learnt programming our own way.


I've been teaching myself programming over the past six months or so, and I'm very glad that I started with Ruby.

The syntax is clean, and I think it's awesome that Ruby's object model is so elegant. I agree that the magic parts are frustrating at first, but it's pretty neat that you can spend a couple of weeks reading, e.g., Paolo Perrotta's Metaprogramming Ruby, and then actually know how it works!
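
(A small taste of the kind of "magic" that book demystifies; Song and its attributes are invented for the example. define_method writes the accessors at runtime, which is roughly how attr_accessor works under the hood.)

    class Song
      [:title, :artist].each do |attr|
        define_method(attr)       { instance_variable_get("@#{attr}") }
        define_method("#{attr}=") { |value| instance_variable_set("@#{attr}", value) }
      end
    end

    s = Song.new
    s.title = "Ruby Tuesday"
    s.title    # => "Ruby Tuesday"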

In terms of conceptual and syntactic elegance, I personally prefer Ruby to, say, Python. I watched most of the videos for MIT's intro CS class (available on academicearth.org), which uses Python, and just never particularly enjoyed it. It just struck me as messier than Ruby. That said, I'm sure I should give Python another chance.

But at the end of the day, I think the main advantage of Ruby over something like Java, at least as a teaching language, is that you get a REPL. My memory of complete-noob-hood is still fresh enough to say that being able to do interesting things in a REPL is HUUUGE. And even now that I've got a bit more programmatic maturity, I wouldn't have gotten nearly as far with Haskell as I have without GHCi.

So... I wonder what it would be like to teach a class using JavaScript? Marijn Haverbeke's book, Eloquent JavaScript, is pretty awesome, and I bet you could teach a pretty sweet class using it in conjunction with something like Node (for its REPL, in addition to all its other webby goodness). Hmm...


Good post, although I disagree with it :-)

One thing that bugs me is the Hello World example. While I completely agree that the Ruby example is far cleaner, the Java/C/C++/C# example doesn't add much confusion in my experience teaching intro programming. The concept of classes is often confusing, but setting aside the code isn't much of a mental challenge for any student I've met.

It just feels like it's the wrong place to have the discussion of magic. The places I see students get tripped up are iterators, recursion, classes, by-ref, and metaprogramming.

I think the standard ceremony vs convention concern that people in the blogosphere love to talk about is generally a non-issue for almost everyone except those people who like to blog about it. Show how those other issues are made simple, yet don't miss important facts, with Ruby (or whichever language you think is most effective) and you've made a powerful case (note, I think this is largely the argument as to why SICP is so well regarded as it did a decent job on most of those).


> The concept of classes is often confusing, but setting aside the code isn't much of a mental challenge for any student I've met.

While I have little formal teaching experience, I hung around in the undergraduate lounge in school, which is where students came for help. I ended up helping a lot of non-majors with their first CS class, which was popular because it filled a general requirement for an Arts and Sciences degree.

These two classes were in Java and in VB6, and they struggled with anything that wasn't near-rote application. They had trouble with the scoping of variables, they didn't know when to use functions appropriately... if you say 'what does x = 5 do?' they can answer, but they can't apply that to writing new programs.

My friends who are TAing courses now have several students who end up turning in homework that doesn't even compile. Maybe students at your school are just smarter than mine, but a lot of people struggle here.

> It just feels like its the wrong place to have the discussion of magic.

It's true, I am conflating 'magic' slightly. I don't mean 'they can't properly metaprogram,' I mean it in the "any sufficiently advanced technology is indistinguishable from magic" sense. They struggle to apply what they see as magic incantations to new problems, because they don't have a solid foundation to work with.


So I taught intro CS for CS majors. I can totally believe that for a subset of non-CS majors this could be more problematic.

My old submission program wouldn't let them turn in programs that didn't compile (nor would it let them submit without passing the example tests I gave them).

This was about eight years ago... Maybe students just aren't as smart anymore ;-)


Absolutely agree :)

But seriously, if the students turn in programs that won't even compile, I'd say the blame should be on the TAs and the Profs!


I make a living with Ruby and Rails, but I'd stick with JavaScript if I were learning today. Server-side JS is a LOT of fun.



