There are a lot of folks in this thread who’ve shared their personal experience with learning and teaching programming languages. Some attempt to generalise this experience into principles of how much the first language matters and how to choose it.
It would move the conversation forward if people were up front about which metric they’re trying to move. For example, here are two equally valid pedagogical goals that are in tension with each other - helping everyone become a programmer (no dropouts) and making sure every programmer meets some pre-defined criteria (strong understanding of memory layout or type theory or ability to build products). If your pre-defined criteria include an understanding of type theory, you might start with Haskell. Certainly all surviving members of your starting cohort would succeed according to your criteria, but this is only success if you didn’t mind people dropping out. Conversely, a language like Python might help fewer students drop out, but they might have “weaker fundamentals” for some definition of that.
So let’s be clear about what success looks like. Let’s state a hypothesis for success and which metrics we’ll use to measure that. Let’s verify that hypothesis by offering the teaching approach to similar groups and interpreting the results.
We can do better than “I was taught like xyz, and I’m successful now and so that was definitely the right approach”.
Lastly, some people in this thread claimed that anyone can become a programmer if the teaching is good enough. On this, I’d like to paraphrase fictional food critic Anton Ego - “only now do I understand what was meant by the motto ‘Anyone Can Cook’. Not everyone can become a great artist, but a great artist can come from anywhere.”
> “I was taught like xyz, and I’m successful now and so that was definitely the right approach”.
For what it's worth, the same reasoning can be heard in other fields. I work in foreign-language education, and debates about the best way to learn another language—memorize vocabulary and grammatical rules? move to a country where the language is spoken? watch a lot of movies in the language?—often reduce to competing personal success stories.
Teachers who have taught a foreign language using a particular method often come to believe that their method is the best, even though they are usually unable to measure the actual long-term outcomes for their students. It may be just too discouraging to be uncertain about the effectiveness of what one does to make a living, so teachers become advocates for whatever approach they happen to use.
Learning a foreign language takes an enormous amount of experience. Whether that experience comes from watching a lot of foreign-language media or living where the language is spoken or something else entirely probably doesn't matter. You need to build intuition, and that seems to only come with experience. Teaching in this sense tends to provide subpar results, because no teacher can spend enough time to give students the experience required to become close to fluent. (I also suspect that it's not just the total amount of experience, but that there's some kind of minimum "experience density" over time. E.g. an hour a week for 20 years might not be enough, but 2 hours every day for 1.5 years might be. But this is just a hunch.)
At least that's my personal experience. We studied English for 11 years in school, some studied it later in university too. Every single person my age that was good at it used English outside of the classroom - mostly on the internet. They had enough experience that even without knowing the grammar rules they got the right answers "because it felt right".
I think learning anything else that requires recognizing patterns works in the same way: programming, mathematics, listening to a different language, reading handwriting, fighting difficult bosses in video games, etc. You can theoretically understand all the pieces, but still be incapable of utilizing them for a satisfactory result.
This is a great analogy. It boils down to having a reason to put in the time to become proficient. A lot of people learn to code by the book without ever finding a puzzle they passionately want to solve through code. With language, you need a reason to practice. Only a breadth of experience can fill the gaps that allow you to intuit meaning in the real world, as opposed to a classroom. In a foreign country, the reason amounts to simply living there.

I remember years of knowing simple BASIC and Perl when I was about 10-13 years old and having no idea what to make with them. I could compile a Mac program with windows and menus and didn't know what to make, other than screen savers. HyperCard was what harnessed my creativity in narrative to my curiosity about code. A similar thing might happen to anyone who wants to code who loves music or writing or art. Code, like language, is for many people only important as a means to an end. It's a tool to communicate. So the measure of success in learning such a tool isn't necessarily whether you make a career at being a translator. It's whether you can effectively use the tool to do the things you want and need to accomplish outside the specific domain of communication itself.
I've struggled with exactly this lately. I used to write and record and play in bands. It sounds awful but at some point I lost hope of being commercially successful at it. The fantasy of rock stardom was what drove me to spend thousands of hours writing and playing. But I still love music. I rearranged my goals during the pandemic, bought a pedal steel guitar and started teaching myself something completely different. Now I can see myself getting onstage in a year or two and backing up a band I like, just for the joy of playing. Just having that long term goal in the back of my mind has let me lose myself in hours and hours of practice. In the code arena, I often have a "Show HN" in the back of my mind, even if I rarely or never show anything.
Hi there. I write software for learning foreign languages, at a foreign language instruction company. If you have time and inclination to discuss things, please check out my profile.
This is a great starting point. Let's not play down tradeoffs, standards, orthogonal goals, etc.
Either interpretation of ‘Anyone Can Cook’ works here. Cooking isn't only about great art. Neither is programming.
Let's not forget that a lot of programming is making something like Salesforce do something after an employee does some other thing with their account... proverbial (and literal) CRUD. The practical requirements for this job are (1) "can program," (2) "knows how Salesforce works," and (3) "can reach an understanding of what the problem is."
There's a real tendency to downplay this "use case" for an education in programming, but this is a lot of what most programmers do for a living. Whether or not it's simple, it is definitely not trivial in the sense that most people could do it well. In isolation, the "can program" bar is not that high.
Another important part of the choice of first language (and approach) is making the student want to learn the material. That can be very different for different students. For nerdy high schoolers you might want to start with SICP while for an adult career transitioner you might want to start with “Your First Web App” to highlight the commercial possibilities.
Going to take a stab at defining measuring sticks for the problem:
Here are some metrics we should measure a "system for programming education," of whatever form, against.
--Success is measured against new students with zero programming background.
Todo would be defining zero programming background.
Ideally people who've never evaluated a statement before AND never used a CLI interface AND without an amazing mathematical background.
--Students do not feel regret, unhappiness or anxiety before they start or after they finish a 1-2 hour programming session, and they want to program in the future, all other things being equal.
--The level of expertise to target for a "solid programmer" is creating discrete systems from scratch using higher-order abstractions, while drawing on a strong mental model of what those higher-order abstractions actually look like in their physical state to improve their creations.
A secondary KPI might be "can the student make a compiler." But a mental model of what higher order abstractions actually look like in logical state seems like a good goal.
--Currently underrepresented cultural backgrounds show end state success at the same rate as other cultural backgrounds, all other things being equal.
Programming stuff is difficult, and when there are difficult academic barriers to overcome, people on the south side of social inequality fail more often. Thus it needs to be a metric.
In my (admittedly limited) experience with teaching and mentoring young programmers, it is much more important to have them write sound, code-like programs before tying them to any particular implementation. Appreciating logic is a skill they will use for 50+ years.
Having discovered that I already thought in a functional manner before learning to program, I wish I had begun with Haskell. Things which seemed like obvious things to try didn't work in C++98, so I 'learned' that my thinking was wrong and bad... I remember discovering the possibility of tail call elimination and being disappointed that it did not work.
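A minimal sketch of what I mean, translated into Python rather than the C++ I was using at the time (the function name is just for illustration):

    import sys

    def count_down(n):
        # Tail call: the recursive call is the last thing the function does,
        # so a compiler with tail call elimination could reuse the stack frame.
        if n == 0:
            return 0
        return count_down(n - 1)

    try:
        count_down(10**6)  # far deeper than the default recursion limit
    except RecursionError:
        # CPython, like C++98, doesn't guarantee the optimisation, so the
        # "obviously functional" formulation still blows the stack.
        print("no tail call elimination here", file=sys.stderr)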
Let's keep the standardized tests with the standardized humans: absent.
This is a good response. There are some people that are determined to become programmers at all costs. For them I don't think starting language matters as much. For others it might matter more.
Given the ever-increasing amount of automation in the economy, I believe that programming (of one form or another) is going to become a skill as important as writing. Everyone learns how to write in school, even though only a few become novelists or journalists. Similarly, everyone will learn to code even though only a few will become data scientists or software engineers.
Programming for non-professionals looks a lot more like Excel than Python. I imagine the trend will continue where deceptively simple interfaces allow people to automate routine tasks. Those interfaces might get more expressive languages, but most programs will be at most a dozen lines long.
That's the kind of language I think we should start with. We can worry about more detailed CS topics for the students who actually specialize in that.
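For a sense of scale, here's a hypothetical example of the kind of end-user program I have in mind -- under a dozen lines, automating one small routine task (the folder path is made up):

    from datetime import datetime
    from pathlib import Path

    # Hypothetical chore: prefix every .jpg in a folder with its modification date.
    folder = Path("~/Pictures/holiday").expanduser()
    for photo in folder.glob("*.jpg"):
        stamp = datetime.fromtimestamp(photo.stat().st_mtime).strftime("%Y-%m-%d")
        photo.rename(photo.with_name(f"{stamp}_{photo.name}"))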
I’m bearish on the long-fabled ‘citizen programmer.’ This is a concept that’s been long discussed in the legal tech world, and it just won’t happen, not here, and not yet. Non-technical people, largely, don’t want to make things work, they want to tell other people to make things work. They have the raw intelligence to do it, but don’t want to do it, and don’t want to be convinced to apply themselves to learn how to do it. It’s just a cultural gulf that I think tech-minded folks can’t wrap their heads around.
It has been a pipe dream of some since the 1960s that "everybody" would program computers in the future. It hasn't happened and won't happen. Most people simply don't have the patience for it.
> In the US, 14% of the adult population is at the "below basic" level for prose literacy; 12% are at the "below basic" level for document literacy, and 22% are at that level for quantitative literacy. Only 13% of the population is proficient in each of these three areas—able to compare viewpoints in two editorials; interpret a table about blood pressure, age, and physical activity; or compute and compare the cost per ounce of food items.
And keep in mind Romania is average at a world level, economically, maybe even slightly higher than average.
On top of that, programming is based on logic (for the layman who doesn't want to think of it as based on math; and let's not go into the layman's clear separation between logic and math), so it surely has an extra level of difficulty compared to reading or writing.
100% population literacy is neither reasonable nor attainable. Yet, functional literacy has drastically improved throughout the past century.
To say programming will never attain such a characteristic is to make a statement against the trend of history in every other academic discipline. Focusing on the present state ignores the underlying trend.
A couple thousand years ago, groundbreaking advances in trigonometry were made by some of the world's smartest people. Today the average high schooler knows soh cah toa.
But if these advances take centuries, from a purely personal perspective they don't matter much. By then, I, and everyone I have and will ever know, will be dead.
It is amazing that no one has pointed out JavaScript in this thread. The vast number of people who can come into programming by way of a free web browser with a built-in debugger is astonishing. It is incredible to think of the zero-cost (!) development environment that is Microsoft Visual Studio Code plus Google Chrome or Mozilla Firefox (run+debug). I could have never imagined it twenty years ago, and I am not a JavaScript fan-gurl/-boi.
And what about coding boot camps? Again, they have put umpteen more people to work as programmers. (Are they producing more programmers than universities yet?) Again, you don't have to love them, but they are lowering the barrier to entry and helping more people gain access to the programming job market.
I'm fairly sure that the average programmer today writes scripts for processing data in Python (or in R, Matlab, or something like that). Software engineers, data scientists, and other similar professionals who define themselves by their skills rather than their field are already a minority among programmers.
Or maybe I'm just spending too much time with biologists.
It's true, but my guess is that folks working in software companies don't see much of this. Social scientists alone are running things through Excel/R/SAS all day, let alone more quantitative fields. Software has eaten the world so to speak, it's a subset of "computer science" folks who haven't realized it yet.
True. Even if someone only understands truth tables, Excel formulas, and some basic mathematical operations (max, min, ceil, floor, avg, mod, root, pow), they can be very useful in supporting almost any company.
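As a rough sketch (Python here, with invented figures), that level of knowledge already covers logic like this:

    import math

    monthly_sales = [4200, 3900, 5100, 4750]  # invented figures

    total = sum(monthly_sales)
    average = total / len(monthly_sales)
    best, worst = max(monthly_sales), min(monthly_sales)
    budget_per_quarter = math.ceil(total / 4)   # ceil
    leftover_units = total % 12                 # mod
    yearly_growth = 1.03 ** 12                  # pow: 3% monthly compounding

    # Truth-table style condition, like a nested IF in a spreadsheet:
    flag_for_review = best > 5000 and worst < 4000
    print(total, average, best, worst, budget_per_quarter,
          leftover_units, round(yearly_growth, 2), flag_for_review)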
Another big reason is that it is so much easier to write powerful software for programmers than for non-programmers. So at some point it will be easier for the world to teach programming to everyone and have that as a baseline than to keep putting ever more programmers to work building specialized programs for non-programmers.
Usually the pipe-dream is that some sort of graphical "no-code" or "low-code" system with a crippled DSL will be more benefit than hassle, and enable them to do things without needing to actually know how programming and the whole stack of abstractions they are sitting on work.
It's one of those things that is always 20 years away, like nuclear fusion.
And what is wrong with Excel -- functional programming by formulas, and procedural programming by VBA? Until the mid-2000s, many Wall Street investment banks were still running trading desks making 100M+ USD per year in profits (each!) using only Excel for quoting and trading. I assume Biotech and Big Oil still use lots of Excel.
At most one will create an easily teachable AI they can utilize. But that’s most definitely not the same as training a NN by specifying input/layers, etc.
That's what I meant. Trying to get the already-trained AI to do exactly what you want by modifying the prompt could be thought of as a type of programming and could become a skill-set in itself.
I believe it will either be "fool-proof" and so intuitive as "Follow the wizard and click on the button you want to be clicked when this and that happens", or it won't happen.
I think the biggest obstacle standing between the general populace and computers is the amount of detail one has to provide for a task to become programmable. But it comes up in real life as well.
Either "programming" is not using "mathematical structure" to instruct a computer, OR printf("Hello world\n"); is a "mathematical structure", OR it is not programming.
If a student has trouble transferring their knowledge of programming fundamentals from one language to another, it seems far more likely they didn't really understand them to begin with, which is a failure of teaching and testing, not some sort of 'limitation' in the student.
> If a student has trouble transferring their knowledge of programming fundamentals
“Programming fundamentals” isn’t the same from language to language. It’s inconceivable that a student who learned to program with python would develop the same type of fundamental understanding as a student who used C (to develop an understanding of memory-level APIs) or Haskell (to develop an understanding of type and semantic theories).
> which is a failure of teaching and testing, not some sort of 'limitation' in the student.
This is a bold claim. I don’t think everyone can understand computer science.
> I don’t think everyone can understand computer science.
This is a bold claim. None of us are special for working in tech. Our life has presented us the opportunity and passion to invest ourselves deeply in the field, often at great opportunity cost.
Saying that a bad teacher makes no difference in student outcomes is equally ridiculous as saying an excellent teacher makes no difference in student outcomes. If that's the case, we should be able to replace professors with cardboard cutouts, no? I was a student and TA long enough to witness directly the impact that a thoughtful curriculum can have on students.
>> I don’t think everyone can understand $RANDOM_DISCIPLINE.
> This is a bold claim.
I disagree. What makes computer science so special that it can be understood by everyone, while it is generally accepted that other disciplines cannot?
After all, saying "Not everyone can understand chemistry" doesn't draw the same ire.
Why do you believe that "not everyone can understand chemistry"?
Perhaps I should be a little more specific. When I say "everyone" below, I mean "the overwhelming majority of people in the average range of human intelligence, without debilitating mental deficiencies". This group spans virtually everyone you are likely to encounter, from car mechanics to school teachers, to postal workers, to professional software engineers.
People have blind spots, but in general they're pretty smart when it comes to the things they think about every day. I believe:
* Everyone has the mental capacity to learn $RANDOM_DISCIPLINE if they put in the time and effort. A good teacher can dramatically accelerate the process, while a bad teacher can bring the process to a grinding halt.
* Everyone has a limit to how well they can understand $RANDOM_DISCIPLINE (even Terence Tao), but most careers do not demand that workers reach that limit. One can be a successful researcher without performing at the same level as Terence Tao. One can be a successful software engineer without being a computer science researcher. One can understand advanced topics in math or computer science or chemistry without having personally discovered them. One can work as a chemistry lab technician, or as a high-school chemistry teacher, without winning a Nobel prize.
You can replace $RANDOM_DISCIPLINE with things like "computer science", "constitutional law", "how to reassemble a car", "chemistry", etc.. If you start getting very specific, you might be able to get me to disagree with statements like "Not everybody has the mental capacity to have invented Inter-universal Teichmuller Theory".
> Everyone has the mental capacity to learn $RANDOM_DISCIPLINE if they put in the time and effort.
I know that this feels like something that should be true in an ideal, just world or in the HN-reading bubble, but I've never agreed with this. I know people who are incredibly smart, I know average people who are not very intelligent, but are fairly knowledgeable, I know rather simple people who live simple lives, and I know people who are, bluntly said, absolutely fucking stupid.
I am absolutely sure that I know people who wouldn't be able to get from zero to doing my job exactly the way I do it (and I'm certainly no rockstar programmer, I'm like in the bottom 10 % here) in any reasonable timespan, like 20 years even. I would literally bet any amount of money on it.
> Why do you believe that "not everyone can understand chemistry"?
At least 5% of the population can’t understand algebra. If you can’t do algebra then you might be able to follow recipes but you can’t do chemistry. Without algebra you can’t do stoichiometry, and that’s essential for even high-school-level chemistry.
Sorry, by "can" I mean "has the mental capacity to". I agree with the statement "not everyone has the knowledge they need to understand chemistry right now". I disagree with "not everyone has the mental capacity to understand chemistry". Note also my definition of "everyone". I certainly think everyone has the mental capacity to understand algebra.
> I certainly think everyone has the mental capacity to understand algebra.
I invite you to tutor some middle school students then. For values of algebra that include “can be taught to draw the appropriate line given the corresponding equation” there are plenty who can’t understand algebra.
I don’t know about middle school kids, mine’s not there yet, he’s 9 - but he’s already done algebra, including simultaneous linear equations with multiple unknowns.
No powers, quadratic equations or differentiation and the like yet, but I can see the QE coming up in next year's books (which we recently bought since 3rd grade just finished).
I’m pretty sure that when I was at school, this was a “senior school” (age 11-16) topic, and I was maybe 12 or 13. Kids are learning stuff earlier than they used to, at least it feels that way to me.
Oh, you can teach some six year olds algebra. It’s not impossibly hard. Some children are ready at a very young age. But some people you can teach it for a semester and every shaky grasp of the subject they got in four months will disappear completely in two weeks. Some people cannot do abstraction.
I think there is a huge difference between currently being unable to understand something and being unable to understand something indefinitely (e.g. because you have a mental deficiency that blocks you from forming the necessary pictures in your mind).
Having taught math to mentally challenged kids, I had only one kid where I would really say he could not learn algebra at that point of his life, mainly because he had a memory span of 10 minutes and severe developmental issues.
All the other kids were really bad at math, but given the right effort I could really improve things by just working on their stuff for 1 hour a day.
I know many tech guys like to see themselves as some kind of elite that has a secret knowledge very few others can master. But in my eyes that is just a failure of an education system.
As for kids, I agree with you that the majority of them can potentially be good at almost every field. But as for adults, I’m not so sure. Most of us become way too rigid as we age; learning capability and, more importantly, motivation decline considerably. So someone who had trouble with basic algebra (at the time mostly due to a bad teacher/experience with math) will not generally be able to overcome his/her identity of “being bad at math”. Of course, there are exceptions.
The majority of us can improve in any area. Some brains are hard-wired for programming, others for, say, music.
A programmer can make a song. A musician can make a program. I'm sure you could train those kids to sing as well. I'm sure you could train those kids to play basketball.
You wouldn't be able to train them enough to make any professional league.
Programming is genetic. Recently I met my birth parents and discovered the whole family are programmers. I thought I came up with the idea myself; I pushed myself through school. No one in my high school or elementary school did programming. I thought I determined my own choices... I was proud of the unique path... then I met my birth parents and realized how predetermined everything is.
I do think it is more about the way you explain something than about the actual topic. You can explain the same topic in such a way that it works for people who can visualize things, for people who like abstract concepts, for people where it is about social relationships etc.
Good educators will always target multiple of those patterns. Bad educators always target the same pattern and will therefore only have success with the same kind of person.
I am currently teaching electronics and programming to artists who quite uniformly hated math and physics in school, and it kinda works. You just cannot teach it in the way they would teach it to a STEM student.
> Why do you believe that "not everyone can understand chemistry"?
I didn't say that I did. My belief is irrelevant anyway, because it is generally accepted when someone says "not everyone can understand chemistry".
I was pointing out that saying such a thing only draws ire if you say it about CS.
> When I say "everyone" below, I mean "the overwhelming majority of people in the average range of human intelligence, without debilitating mental deficiencies". This group spans virtually everyone you are likely to encounter, from car mechanics to school teachers, to postal workers, to professional software engineers.
And yet there is no controversy if postal workers, car mechanics and hairdressers say that they are not able to understand chemistry. There is only controversy for CS.
I feel that you would do well to explain why only CS is controversial, while chemistry, auto mechanics and art are never controversial in regard to this particular assertion.
I guess I disagree with your premise then? In my experience someone who is likely to think that anyone can learn CS is likely to also think that anyone can learn art, car repair, or chemistry. On the flip side there are plenty of people who believe that people are born "chemistry people" or "math people" or "art people" etc.. I've not witnessed anyone say that CS is accessible but chemistry is not.
> I guess I disagree with your premise then? In my experience someone who is likely to think that anyone can learn CS is likely to also think that anyone can learn art, car repair, or chemistry.
That isn't my premise. My premise is that saying such a thing isn't a bold claim, it's a commonly accepted claim.
When you said "This is a bold claim", I replied with "I disagree", because "Some people just aren't any good at $DISCIPLINE" isn't a bold claim.
> Why do you believe that "not everyone can understand chemistry"?
I am a physical embodiment of a person who can not understand chemistry. Basics hell yes, organic chemistry for most parts yes, process chemistry ... iffily, but when it comes to physical chemistry, I'm hopeless. And I tried. Over many years. I changed my major in university from chemical engineering to CS because my head simply would not work the way you need it to work in physical chemistry.
On the upside, I think there are surprising parallels that have made my recent life (~15 years) easier because of the lessons I picked up from chemical engineering. Conflicting feedback loops, S-curves and equilibrium states are not that different from the concepts and safeguards you encounter in distributed systems.
I really want to be as optimistic as you are, but I can attest from personal experience, not about chemistry, but about math - not everyone can understand basic mathematics.
I often tried helping friends and classmates, in one-on-one sessions, to understand basic (primary and/or high school) math. Sometimes, you just end up giving up. This was after trying a number of different approaches, different learning techniques, different ways of understanding / visualising abstract concepts, etc.
Not everyone can understand abstract concepts (or our teaching methods are too primitive).
The bar for getting stuff done with programming is much lower than the bar for chemistry. While I do agree that not everyone can or will be a full time software engineer, I think that most people can incorporate programming into some aspect of their lives. Whether they want to is another thing but that shouldn't stop us from teaching at least the basics even if it means the most they'll do is write conditional excel sheets or figure out how to build their own iPhone shortcuts.
I suspect many people on HN have an unrealistic view of how most of the population operates.
People who can do any version of STEM with some minimal basic competence are unusual. They're not exceptionally unusual, but they're certainly not the population median.
Median numeracy is around Level 3 on this list. Numeracy in the US is lower than the OECD median and is somewhere between Levels 2 and 3.
I'd estimate you need Level 4 for programming to make any sense at all and maybe get you started with very simple programming tasks, Level 5 to be able to do basic code construction on a daily basis, and Level 6 to be a competent senior with some useful modelling skills.
Level 5 is maybe 20-25% of the population. Level 6 is probably less than 10%.
A very unrealistic view of the programming and mathematical abilities of the average person, even the average college educated, employed, under 40 person, is the main source of contention I find whenever this argument comes up in my social circles as well.
Those who think anyone can learn programming should pick an acquaintance ages 25-45 who has sufficient time, and attempt to teach them to program, and see the results. Or inquire about the actual success rates of coding bootcamps (which are already self selected groups and have filters themselves, far from the average population to begin with).
I've done the above with very mixed results, it's pretty clear that for a litany of reasons the average person is functionally incapable of being a useful programmer. I don't see the average citizen coding a lot in the next few decades, even casually.
I teach programming (not CS) part time. All students have been doing reasonably well and took something valuable with them, which is the intent of the course.
However only a few have what it takes to regularly sit in front of a computer and power through long term. It’s definitely special for better or worse. Let’s face it: We’re nerds!
The person you responded to didn't make any claims about bad teachers or good teachers. The claim is that not everyone can understand computer science. As an extreme example do you think a person who is mentally disabled can understand computer science? From my perspective it's clear to me that not everyone can understand computer science and the same applies to many academic areas.
Not everyone can understand computer science, but I think the OP meant that your socioeconomic status matters more - gradual learning from age 10 makes college-level education a breeze compared to being disadvantaged well into high school and only using computers in order to live and use essential services.
I agree that the whole bad vs. good teachers thing is off-topic, though.
The grandparent comment was about teaching, so I interpreted the parent comment to mean that "innate ability matters much more than good teaching", which my personal experience teaching people disagrees with.
And sure, the statement "not everyone can understand computer science" is technically true if you include people with mental disabilities. But in reality, some people draw the line a lot higher as a form of gatekeeping. I've known a number of professors who let themselves off the hook this way too: Poor student performance couldn't possibly be a failure of their teaching--the students must not be smart enough!
I've spent about 2.5 years teaching programming. Students really do seem to have some innate programming difficulty slider, which absolutely matters. Some of my students worked their absolute butts off to learn programming, and asked for help constantly (which I gave gladly). And they still barely scraped by.
Other students seem to learn programming extremely easily. I would show them something one day, and the next day they've already used what I showed them in their code.
I'm open to the hypothesis that it's my fault that some of my students failed to learn programming. I certainly felt terrible for them when they worked really hard and still failed. But I don't think it's all me. I certainly can't take equal credit for the students who did very well. My star students seemed like they barely needed me to teach them anything at all.
And for what it's worth, the perspective that talent doesn't matter is an awful thing to teach struggling students if it's wrong. Students hear that as "If I struggle at programming it must be my fault". I don't think it is. I think some people just have brains wired to make programming easier or harder to learn. There's no shame in encouraging some people to pick a different career. If programming really isn't right for someone, the sooner they swap to something else, the better.
I don't disagree with a lot of what you said. Practically speaking, some students are simply "behind" where they should be, and it might be too much effort to catch up, so the pragmatic choice is to pursue something else. I guess it's usually that I attribute their struggle more to external factors than internal factors.
* The student might have other stuff going on in their life at the time. Heavy course load, personal drama, etc..
* The student might have other interests, so they don't devote as much time to studying as they should.
* The student might not yet have had a good math education, so their abstract reasoning skills lag behind. I think it can still be learned, but someone in a college compsci course just doesn't have time to catch up.
I've been an undergrad/grad TA as well as a tutor for high school students in math/programming. My own experience tells me that learning is very "path-dependent". Students will be more receptive to my way of teaching if their background is similar to my own. Part of my job as a teacher is (was) to try to recognize the gaps in their background, but sometimes I do have a lot of trouble understanding what path a student took before coming to me, so it's harder for me to help them.
I was the one who appeared to learn it super fast. In retrospect, I had a comparatively large amount of preexisting knowledge - my parents had shown me the very basics (in BASIC) before, and we had some programming games.
And looking at my kids, this stuff matters a lot. A kid that comes in with completely zero experience has a massive disadvantage against one with some experience.
This is very key. By the time I entered undergrad CS, the kids who had owned computers growing up, and played or even modded videogames, had a massive starting advantage over the ones who hadn't.
The entire first year of CS courses was mostly there to instill fundamental computer skills (not CS skills!) in the ones without that preexisting knowledge.
I haven't seen any data on video games helping with programming. But having a strong math background definitely seems to help. I don't think it's the knowledge - there's something about how you internally model a system that math teaches you. Approaching programming with the same mindset makes learning much easier.
I read a paper over a decade ago talking about this. They gave a survey to freshman students and looked for correlations with their end of semester marks. They found the students who assumed there was a consistent set of rules underpinning programming (even if they didn't know what those rules were) dramatically outperformed the students who thought the computer would "work it out somehow".
Video games definitely help develop overall computer literacy. The basic computer skills that are missing in folks who didn't grow up around computers are so foundational they might surprise you:
- Filesystem basics, aka "where does a file go when you save it, and how do you find it again?". Do you know how many people just save everything to the documents folder, and scroll through the list of every file they've ever accessed every time?
- Window management. I clicked on something, and now my document has disappeared. Help!
- Drag-and-drop. Particularly on a laptop with a trackpad, this requires a fuckton of coordination.
Modding video games is a real entry point. It is low-key programming. But it also teaches you how to edit files and navigate the filesystem, and most importantly that you can do so and no one will yell at you. Basic computer literacy and simple administration are also things kids without free access to a computer lacked.
This stuff is not taught here. You either know it or are seen as a lost cause.
Even more importantly, I had those games that introduced you to loops, ifs, variables, programming in general.
I recall struggling in algebra as I didn’t follow the particular process that the teacher went through, while some friends seemed to readily follow this same teacher, and yet upon one on one discussion, I had no problem comprehending the same.
Thank you for the clarification. I understand your point now.
I teach math at a community college and I'm absolutely convinced not everyone can understand math. I'm not a great teacher, and I think I'm not a terrible teacher. I've come to the following conclusion: for some people, the effort required to learn the material given their natural ability is so high that they'll never get it. They won't be able to put in the effort required.
I could be wrong though and might be one of the people who let themselves off the hook as you mentioned.
Computer occupations as a whole have a median intelligence at least 1stddev in excess of the population median. https://www.gwern.net/docs/iq/2002-hauser.pdf I have to imagine that programmers in particular are even higher.
I've always felt like a good data point in favor of IQ playing a non-trivial role in suitability for programming is the fact that anti-credentialist moves like 10-week coding bootcamps have been so incredibly successful.
Bootcamps took off shortly after I graduated from my top-ranked CS program, and I remember being extremely dismissive of their value: there was no way that you could fit the four years of my CS degree into ten weeks. I've since realized that being good at Computer Science grossly overqualifies you for many programming jobs, and that bootcamps do a great job of giving reasonably smart people an on-ramp to these jobs. I know three separate people who jumpstarted six-figure careers from scratch after attending a bootcamp: one was a waitress, one an artist, and one's entire career thus far consisted of a decade of changing tires at Costco. One of the three wasn't college-educated, and two were first-generation Mexican immigrants (as children) from working-class families, belying the assumption that privilege and educational access fully explains programming ability. Unsurprisingly, all three are fairly intelligent people.
I can't think of another industry that has a similarly egalitarian entry path, and I think a very plausible explanation is that suitability for these types of entry-level programming jobs is little more than a test for high IQ (relative to the population norm).
This is my experience as well. During an ill-advised foray into building the tech org for a mediocre startup with a poor candidate pipeline, the quality of candidates we got was so poor and the work we were hiring for so easy that I pretty immediately found myself dropping every requirement except for knowledge of basic programming + intelligence. This worked pretty well: The no-experience, very-smart guy that I strongly recommended hiring became the capable workhorse of the eng team, and the dumb guy with a decade of experience that the founders hired over my objection was an absolute dumpster fire, unable to work with any autonomy and bounced from area to area in an effort to find a place where he would do the least damage.
You do indeed get very smart people doing menial work, and their talents are largely wasted.
IQ has always been a good indicator of programming skill. Unfortunately school grades and specialisation aren't a good indicator of IQ, and (I think?) it's illegal in the US to require job candidates to take IQ tests.
So the issue for companies is that without the usual whiteboarding or educational paper trail you have to find a way to identify high IQ people without being too obvious about it.
The other issue is if you find no-experience very-smart people you have to spend time training them before they start being productive. This works if you're not in a hurry, but it's harder to justify when runway is limited.
> it's illegal in the US to require job candidates to take IQ tests.
> So the issue for companies is that without the usual whiteboarding or educational paper trail you have to find a way to identify high IQ people without being too obvious about it.
Yup. I was surprised to find myself gravitating to this through trial-and-error, since I had pretty thoroughly absorbed my affluent coastal background's bullshit religious conviction in the pure blank slate of every human mind and the uselessness of intelligence as a distinct concept.
> The other issue is if you find no-experience very-smart people you have to spend time training them before they start being productive. This works if you're not in a hurry, but it's harder to justify when runway is limited.
Right, the other ingredient that I mentioned was that the engineering work we needed didn't require any specific skills. The vast majority of our engineering work was just a Python backend, which is about as "pure" an exercise of manipulating logic as you get in engineering.
> I've since realized that being good at Computer Science grossly overqualifies you for many programming jobs
Thank you. What a naked and honest thing to say on HackerNews. My brother is so much smarter than me. I went the CompSci route and suffered (toiled!) for years at a university. His boot camp was less than three months and he landed a just fine job. Internally, I was incredibly dismissive of his boot camp, but was proven wrong (again!) by my own horrible elitism.
Not everyone needs to be working on cutting-edge ML/AI. My brother makes websites for shopping. (Yay.) His income easily doubled after a year of cutting his teeth.
I experienced this from the other direction, joining Google and assuming the work would be stimulating because it's The Best Smart-Person Job (at least at the time). I didn't have a lot of good mentorship, so it took me years of wrong turns until I found myself where I am today, finally feeling like my career doing Applied CS Research is a good fit for me.
It's a bold claim, but it's been largely proven by the famous "The Camel Has Two Humps" article that one basic prereq for learning CS/programming effectively is understanding that one is dealing with a formal system that's following consistent rules, even if it's not always clear what these rules might be. It seems to be quite hard to teach this understanding precisely to those who haven't internalized it, but it's a key part of the tacit knowledge one would need to work in the field.
"That document was very misleading and, in the way of web documents, it continues to mislead to this day. I need to make an explicit retraction of what it claimed. Dehnadi didn’t discover a programming aptitude test. He didn’t find a way of dividing programming sheep from non-programming goats. We hadn’t shown that nature trumps nurture. Just a phenomenon and a prediction."
That's too bad, then. A good paper should not be retracted simply because of so many misinterpreting what it's talking about and creating some pointless controversy about it. The underlying "phenomenon and prediction" (namely, the bimodal distribution of proficiency in basic CS) stands up to scrutiny.
I dislike the way that so many people have a burning desire to belittle the difficulty of what we do. The fact of the matter is that most people cannot do it, and pretending otherwise is not helpful.
> “Programming fundamentals” isn’t the same from language to language. It’s inconceivable that a student who learned to program with python would develop the same type of fundamental understanding as a student who used C (to develop an understanding of memory-level APIs) or Haskell (to develop an understanding of type and semantic theories).
I think this is a pretty good point and it touches upon something I was thinking about: what language you use to introduce students to computer science says a lot (not everything!) about your pedagogical attitude towards what "computer science" is. If you think that it's fundamentally about humans interacting with machines, then low level languages make sense. If you think it's fundamentally about math, then functional languages make sense. If you think it's about building things which provide a lot of real-world value, then something like Python makes sense. Now of course (1) students probably need to learn about all three at some point; and (2) there are a number of other factors here, perhaps most importantly making the material enjoyable, approachable, and rewarding for students such that they'll want to delve into each of these aspects.
What inspired me to learn programming was your last point: building things. That's why I think that people should be introduced to a loosely typed, high-level, forgiving language. Compiler errors and verbose code do not appeal at first glance; you need to draw people in.
I will never forget the moment 11 year old me saw a message box appear with text that I specified. I made that happen, I built it. I felt so powerful and cool. That 1 line of code in an obscure language got me hooked and over time I moved on to other languages.
Most programmers are not computer scientists. Even most people with computer science degrees are not computer scientists at heart, they forced themselves to do the coursework so they could get a job (mostly web, utilizing almost nothing they were taught).
I learned to program as a child (I taught myself, for fun!) but I didn't learn much computer science until well into adulthood.
As someone who fought hard to switch careers into this industry and who teaches software engineering now, I understand the impulse to think this, but I have to say all my experience thus far contradicts it.
This is an interesting question. From personal experience, I have also found simple GUI work to be the "digging ditches" work of programmers. (Weirdly, I still love it, because it makes it easy for end-users to understand your tools!) Of course, complex GUIs are another beast, but GUI programming usually sits at the low end, as it is very hard to test (easily) and requires endless tweaks due to the high configurability of GUIs.
For example, if you just look at GUIs created using Excel/VBA, you will find they seem to need endless tweaking to work just right (edge cases?), but they do not at all require a formal CS degree. (I have worked with many non-CS people who are fine at mid-level Excel/VBA stuff.)
You could think of there being a universe U of programming fundamentals. Each language uses a subset of U as the language's fundamentals (no language uses all of it), and each language uses a different subset.
Some languages have subsets that overlap, but that isn't always obvious, because they clothe the same thing in different syntax and, sometimes, different semantics. Even when the semantics are the same, they talk about the semantics using different terms.
So the problems with transferring knowledge of the fundamentals from one language to another are that they aren't the same fundamentals (different subsets of U), and that they are expressed in different terms and in different syntax. That can make it hard to recognize that the parts that are in common actually are the same. And then there's the new parts.
Now here's a class in C#. One student comes in knowing Haskell; one comes in knowing Python; and one comes in knowing C. The "fundamentals to transfer" differ from student to student. It's asking a lot for the teacher to be able to help each student recognize what part of their background transfers, what part of C# it transfers to, and what the deeper fundamentals are that are expressed in both languages. An exceptional teacher could certainly do so, but as the adjective implies, such teachers are the exception.
But it's not really fair to blame the student, either. It's asking a lot of a 19-year-old to understand deeply enough to make those connections.
My own guess is that this happens best after working as a professional for several years, and in more than one language. Then it's somewhat easier to see what's going on "under the hood" of the languages.
I think you’re vastly overestimating programming fundamentals.
Every language can do some form of if-then-else, be it machine code with gotos or a purely functional language. Ask someone to find the absolute value of a number and they should be able to quickly figure out something that works in a new language. The basics of programming are not about being correct so much as not being lost.
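For example, in Python (one of many equally fine spellings), the "something that works" can be as small as:

    def absolute_value(x):
        # The same idea transfers to almost any language: branch on the sign.
        if x < 0:
            return -x
        return x

    print(absolute_value(-7))  # 7
    print(absolute_value(3))   # 3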
Uh, not exactly. I had a really hard time learning prolog because it doesn't really have if-then-else. Haskell is missing the idea of something happening after something else. Stack languages don't have variables. And so on - there's a lot more to programming than python.
There's a big class of popular structured / OO languages which are quite similar (C, JS, Python, Java, etc). But they certainly aren't "every language". Not by a long shot.
And even amongst those languages there's huge variety in how the grain of the language shapes the code we write. I was talking to a grizzled old programmer at a conference a few years ago about the performance of Java vs C. He said in some benchmarks he did the performance was almost identical. I got him to show me his code - and sure enough, he wrote Java as if it were C. He was using native Java arrays everywhere and basically everything was a primitive variable, in huge classes acting more as buckets for code than anything else. His code was unrecognisable as Java because he was still writing C, just with a different syntax.
> The basics of programming are not about being correct so much as not being lost.
The basis of programming is expression. The most idiomatic way to express yourself in each language is very different. It's as much cultural as it is syntactic.
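A rough Python illustration of the same effect as the Java-written-as-C story above (an invented example): both versions work, but only one reads like the language it's written in.

    # "C with Python syntax": manual indexing and an explicit accumulator.
    def sum_of_evens_c_style(numbers):
        total = 0
        i = 0
        while i < len(numbers):
            if numbers[i] % 2 == 0:
                total = total + numbers[i]
            i = i + 1
        return total

    # Idiomatic Python: the same computation, expressed the way the language expects.
    def sum_of_evens_pythonic(numbers):
        return sum(n for n in numbers if n % 2 == 0)

    print(sum_of_evens_c_style([1, 2, 3, 4]))   # 6
    print(sum_of_evens_pythonic([1, 2, 3, 4]))  # 6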
Also, Java was designed to handle performance critical code like that. That’s why people call it a fast language: you can write non-performance-critical code however you want, but optimize for speed and it’s going to look like C.
Maybe I'm nitpicking, but I see that construction as the equivalent of the ternary operator. It's not the if-else construct you find in imperative languages. The claim was "Every language can do some form of if then else" - which I read to imply that every language is semantically familiar to someone who knows Python / C / Java. Haskell's approach is certainly not familiar if you've only ever written JavaScript or something.
> Also, Java was designed to handle performance critical code like that. That’s why people call it a fast language
Sure; but my point is that idiomatic Java looks quite different from idiomatic C. If your first language is C, it's easy to write Java that looks like C. If your first language is Java, you can torture C into looking like a bad version of Java. But both approaches are newbie mistakes. Learning a language involves a lot more than just being able to make programs that work at all. It's also learning a culture, and learning to make code that fits well with the existing ecosystem. You won't get very far in the Java programming community if you don't know how to write Java in the popular style.
That's why they say learning your second language is difficult. You need to unlearn some of how you approach programming so you can come at your second language with fresh eyes.
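Python happens to have both forms, which makes the distinction easy to see (a small sketch; the expression form is the closer analogue of Haskell's if-then-else):

    x = -3

    # Statement form: one branch of code runs, the other never does.
    if x < 0:
        sign = "negative"
    else:
        sign = "non-negative"

    # Expression form: the conditional itself is a value, as in Haskell.
    sign = "negative" if x < 0 else "non-negative"
    print(sign)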
Doing better is irrelevant when we are talking about what constitutes the basics of programming. You can always improve your personal skills, but the basics are in effect the smallest set of skills that works.
“The smallest set that works” in the real world requires that you can write code that others can read, write code that could pass a PR. And that you can read the code that others write.
If you only know enough Java to be able to translate awkwardly from C, you aren’t fluent in Java. I know bits and pieces of Haskell - probably enough to make working programs. But I’m not fluent in Haskell. I can’t read the code others write. I can’t think in Haskell. And that’s not enough.
This is talking about students. Writing code that works and that others can read is roughly the benchmark for operating at the professional level. It’s not enough to get a job at FAANG, but many companies have lower standards.
“Professional” is not what most people mean when they talk about the basics as say a welder, cook, etc.
> Haskell is missing the idea of something happening after something else
A Monad is implicitly a monoid, which certainly does specify an ordering. And as for expressions, the order of evaluation of arguments is not specified either by C for example.
The bigger problem is that "something happening after something else" is not related to the idea of an if-then-else.
In Common Lisp the if-then-else construct is IF. The "something happening after something else" construct is PROGN. The whole point of if-then-else is that only one thing happens; "something happening after something else" requires a minimum of two things.
I don't really get your point. If/else expressions are in principle free to evaluate both branches, but that would be needless. But when we provide an IO Monad, for example, if/else expressions will not suffice in the naive way -- you have to use the monad's "bind" operation to actually provide an order. What it boils down to, basically, is that the A; B; C; of imperative languages becomes C(B(A())) in pure FP ones -- with implicit ordering.
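A small sketch of that contrast in Python (the function names are invented):

    def read_input():
        return "42"

    def parse(text):
        return int(text)

    def report(value):
        print(f"got {value}")

    # Imperative style: A; B; C; -- the statements carry the ordering.
    text = read_input()
    value = parse(text)
    report(value)

    # Composition style: C(B(A())) -- the ordering is implied by data dependencies,
    # roughly what chaining through a monad's bind makes explicit in Haskell.
    report(parse(read_input()))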
> If/else expressions are in principle free to evaluate both branches, but that would be needless.
This is true only in a pure functional language; if you admit the existence of side effects, then it is a crucial part of an if statement that the code associated with the other branch is never executed.
My point is just that it makes no sense to say "some languages have no form of if statement; for example, Haskell doesn't have the concept of one thing happening after something else". That example is untrue. But if it were true, it still would not interfere with Haskell's ability to have an if statement. Thus, defending Haskell's ability to run one statement after another statement is unnecessary to refute the claim; the claim never worked in the first place.
Having learned C++ and Python, I feel they're similar in programming model (imperative, object-oriented). My limited experience with application or kernel C code feels like C++ but with high-level constructs replaced with pointer juggling that makes me feel uneasy (C has a culture of not using vocabulary types to communicate nullability and responsibility for deallocating). Haskell (which I barely know) is an altogether unfamiliar programming model, being based on pure functions instead of mutation, but the underlying model beneath the abstractions (thunks) leaks through.
Maybe not everyone, but definitely a majority. I'm not talking about learning assembly language, but learning some HTML/CSS/JavaScript (if properly taught) seems pretty simple to me. With enough effort and youth (i.e. a malleable brain).
To me it's crazy that programmers think this; it's just a form of ingrained tribalism that tells them that what they do is somehow special. Probably because they get such crazy time-valuations in terms of salary. But that's just a supply and demand curve.
> “Programming fundamentals” isn’t the same from language to language.
You should be able to carry very, very basic concepts over to other languages with much less difficulty than learning them the first time. A student sufficiently skilled in Python will be able to make the same intro level programs in C given enough time. Badly? Yes, but they'll function.
Similarly, a watercolor artist won't forget the basics of art when handed a pencil.
C and Python are also languages with a pretty direct historical relationship. Going from, say, Python to Prolog, it will be significantly more challenging to identify commonalities.
I would say it took 10 years of professional work across many languages before I could say I feel that comfortable.
Maybe I was a shitty student, but I think I went through a generally well-taught undergrad program that grounded me in the various concepts (although not functional programming and lambda calculus - that was a pure CS-only thing at the time). It took a long time to actually build that level of intuition through experience. I wouldn't expect someone just out of college to have that trait, but to me it is a strong signal of seniority (e.g. you can start helping someone proficient in the language/more familiar with a codebase spot tricky system performance issues because you know what kinds of things to Google). Similarly, language fluidity is a skill that not everyone develops (due to whatever mix of available time, interest, or natural talent) and that's also OK. There are plenty of meaningful contributions made by people who only specialize in expressing all the CS concepts in one language, and that's OK too (or even just building useful tools/OSS contributions that are unrelated to CS topics).
> programming fundamentals from one language to another,
But the fundamentals of different classes of programming languages differ. For instance, Haskell's model of computation is not a Turing machine but a G-machine. The G-machine and the Turing machine are equivalent, of course, but that doesn't mean learning one is sufficient to understand the other. There is very little that can be transferred between the two.
To take another example, Javascript supports continuation-passing-style coding, whereas Java didn't until very recently.
I think this goes to the core of how do you teach programming.
Can you teach "programming" without conveying a fundamental model of how the machine operates?
Should you?
Because how the machine operates is very simple.
Fundamental units of state are read from memory to an evaluator, evaluated to extract meaning, and this can result in units of state being written to/assigned depending on certain conditions.
Data structures are a complex arrangement of state that has meaning other than the literal evaluation of said fundamental units. Algorithms are logically proven concepts represented in state. Compilers are a tool for converting "abstracted instructions designed for humans to work quickly in" into operations supported in hardware.
How each language abstracts it is pretty secondary.
There are so many unanswered questions in programming education.
"Whether you should attempt to teach the model of fundamental state" to all novice programmers, or just a few shorthand rules, is the biggest.
"Or it simply means the student does not have the aptitude for programming. Not everyone does."
Some students (and programmers) can excel in one programming language and struggle with a different one. Programmers don't want to admit this because it suggests a superficial focus on syntax.
Developers often say: it's semantics that matter in programming, not syntax. But you can't separate syntax from semantics - they are so closely entwined. And for many developers it shapes the way they solve problems (me included).
Some programmers develop a deep understanding of semantics that makes them easily see beyond syntax regardless of the language. I'm definitely not one of them. Dare I suggest (with no judgement), that a substantial number of programmers, possibly even the majority, also find it difficult to see beyond syntax for more complex programming topics?
Aside: A tweet from Sahil Lavingia (founder of Gumroad) from 2019:
"For those who think they can't learn how to code: try another language.
I tried PHP a few times and gave up. Then JavaScript: gave up.
Then the iPhone came out, I tried iOS development, and everything started to click!"
But the topic is how their view is shaped by the first language they learn. If they learn C first, will their knowledge transfer into other languages differently than if they learned Python first?
Concretely, I can't fathom how people learn Rust without learning one of C or C++ first, and I'd think the underpinnings of how you understand code and systems is probably different if you are subconsciously mapping new concepts back to Python or Java or whatever.
I agree they are different. I really did mean "I don't understand how people learn Rust without first learning one of several languages that are more straightforward/explicit about what is on the stack and what is on the heap".
C++ overall obviously isn't simpler than Rust; C++ in its entirety is so complicated it's like three languages stapled together. But with C++ as a first language, you can study programming for 6 months before ever even hearing about std::move, whereas with Rust you'd need to learn about move semantics around week 3, when people are still having trouble with loops.
At my university, there was an intro to programming course which used Scheme & was inspired by SICP. I took the course with previous programming experience (Java, Python), and found the course straightforward. I also knew students who took the course having programmed in C++, and who struggled with the course.
I'm not sure "do SICP exercises in scheme having learned Python/C++" is really about transferring programming fundamentals. The SICP exercises seemed really geared towards a small lisp.
You are given a set of tools, designed by folks who may not have envisioned how you want to use those tools. You make the best you can from the bag of tools available.
The biggest problem with the first language is that a student has nothing to compare it to, and so naturally comes away thinking that the design decisions in that language are a reflection of some deep underlying truth, that this language is The Way Things Should Be Done rather than a product of its time and particular goals of its designers.
When I was a wee coder I thought BASIC was the bee's knees. I don't think that any more, but it was a painful series of transitions until I discovered Lisp and never looked back.
I started with machine code. Followed by ASM, Turbo Pascal, C/C++. I played with BASIC but quickly dismissed it, as it would greatly limit what I could do (I was hooking computers to devices using a high-speed parallel bus and processing data in real time). No college. I was studying physics at university. All my programming experience I got by looking at docs and somebody else's code. Of course every once in a while I would use something like Matlab to solve my little problems, and when I switched to developing software for a living I also used many other languages. Now I'm mostly free to choose whatever tools I see fit and usually stick to 3 - C++ for backends, Delphi/Lazarus for desktops and JavaScript for web front ends. Occasionally I use C when I have to do firmware. I've played plenty with many scripting languages like Python. Personally I found that on any project of decent size they do not speed up development at all. The performance loss of the end product however is monumental.
No, it is an argument for teaching intro programming in C so they better understand what operations a computer is capable of doing. When you are used to C, most other languages feel like a library that automates tedious tasks rather than something alien; you are well aware of the things that need to happen in the background so that the code you write in other languages can do what it does.
I've never looked under the hood of a `cdr` implementation but I feel pretty confident in saying that it's likely to be an incremented pointer most of the time.
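For what it's worth, here's the kind of layout I'd guess at - a hand-rolled cons cell in C, not any particular Lisp's actual implementation (real runtimes add tag bits, GC headers, etc.) - where cdr amounts to reading the word one pointer-width past the cell's address:

    #include <stdio.h>

    /* A guessed-at cons cell: two machine words, the value then the rest of the list. */
    struct cons {
        void        *car;
        struct cons *cdr;
    };

    /* "cdr" is just a load from (cell address + offset of the second field). */
    static struct cons *cdr(struct cons *c) { return c->cdr; }

    int main(void)
    {
        struct cons tail = { "world", NULL };
        struct cons head = { "hello", &tail };
        printf("%s\n", (char *)cdr(&head)->car);  /* prints "world" */
        return 0;
    }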
Learning C (and even a little ASM) gives a better understanding of the underlying layers. I think that provides a great foundation for learning a Scheme, and then branching out from there.
> Scheme is more of a blank slate upon which many different paradigms can be taught.
Also, for a lot of first-time programmers, Scheme and other lispy languages can come with heavier cognitive friction.
Unlike procedural languages, where you can think about what you want to do, break it down into steps, and then make those steps, there is a higher level (or deeper level, if you like) of thinking that needs to go into functional development, where one thing feeds into the next, which feeds into the next.
If you already have a very strong background in mathematical theory, this might not be so bad (and might make a lot more logical sense), but it is a shift in thinking from how many people do day-to-day tasks, which is more similar to procedural programming.
C is not really representative of underlying hardware anymore, and is a language with basically no compile time defenses, making the student scratch their head at hard to debug segfaults. I would definitely not start with C.
> C is not really representative of underlying hardware anymore
While I understand where this is coming from, can you suggest any language that is closer to the mental model of how the underlying hardware works than C (other than assembly)?
Well, it's hard to define "closer" as a single dimension, but in certain respects perhaps Rust: SIMD is notoriously bad in C, for example. Also multithreading, which is not exactly a new thing either. And don't forget about pointer aliasing, though that last one is more of an optimization possibility.
Fortran may also be considered another candidate, for having SoA over AoS, which can potentially be more cache-friendly (there's a quick sketch of the difference below).
But my point was more that there is no meaningful difference between, e.g., C and another language that compiles down to machine code: the old saying that a seasoned C programmer can reasonably guess the generated machine code of a program is no longer true with today's heavily optimizing compilers, and the amount of magic happening is almost comparable to, e.g., Haskell.
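For anyone unfamiliar with the SoA/AoS distinction mentioned above, here's a minimal C sketch (the struct and field names are made up for illustration); the point is just where the data for one field ends up in memory:

    #include <stdio.h>

    #define N 1000

    /* AoS: x, y, z of the same particle sit next to each other. */
    struct particle { float x, y, z; };

    /* SoA: all the x values sit next to each other, which is what caches
       and SIMD units prefer when a loop only touches x. */
    struct particles { float x[N], y[N], z[N]; };

    int main(void)
    {
        static struct particle  aos[N];
        static struct particles soa;

        for (int i = 0; i < N; i++)
            aos[i].x = soa.x[i] = (float)i;

        /* Same sum either way; the SoA loop walks contiguous memory,
           the AoS loop skips over y and z on every iteration. */
        float sa = 0, sb = 0;
        for (int i = 0; i < N; i++) sa += aos[i].x;
        for (int i = 0; i < N; i++) sb += soa.x[i];
        printf("%f %f\n", sa, sb);
        return 0;
    }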
What about multithreading? Someone exposed to developing a multithreaded app in C using pthreads will have a much better understanding of threads. I have worked with too many younger developers who are confused about kernel threads, cooperative multitasking, green threads, 1:1, 1:M, due to only having experienced higher-level languages. You couldn't ever have this type of confusion after working in C with threads.
Memory handling of course is the other big area. So many developers raised only on higher-level languages have varying levels of confusion about memory allocation, passing buffers around by copying vs. pointers, modifying memory vs. copy on write vs. reallocating everything. You couldn't possibly come out confused by any of this after a solid stint in C.
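As a tiny illustration of the buffer-by-pointer point (and of 1:1 pthreads at the same time), here's a hedged sketch - the function and variable names are mine, not from any real project; build with -pthread:

    #include <pthread.h>
    #include <stdio.h>
    #include <string.h>

    /* The worker writes into memory that main() owns: the buffer is handed
       over as an address, nothing is copied. */
    static void *worker(void *arg)
    {
        char *buf = arg;
        strcpy(buf, "filled by worker");
        return NULL;
    }

    int main(void)
    {
        char buf[64] = "empty";
        pthread_t tid;

        pthread_create(&tid, NULL, worker, buf);  /* pass the address only */
        pthread_join(tid, NULL);
        printf("%s\n", buf);  /* "filled by worker": shared memory, no copy made */
        return 0;
    }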
And just like threads, it's very valuable understanding even if later you mostly program in higher level languages.
So yes, while it is certainly true that in the 80s you could squint at C code and see the generated assembly, and that's no longer necessarily true today, it is still true that having a solid foundation in C will help tremendously in understanding what's going on in the CPU and memory, in ways that probably no other language (that I'm aware of) will.
I do need to spend a solid season in Rust sometime to have a deeper opinion about it!
Georgia Tech around the year 2000 required most new students to take CS 1301/1311/whatever number course. It was, at one point, taught with a Pascal-derived pseudocode (incredibly unpopular, though I liked it quite a bit). Then they switched to Scheme (using SICP?) which was a fantastic catastrophe, major cheating scandal the first year. The TAs were not prepared (most had not taken it themselves and many/most were only sophomores or juniors with insufficient experience or time to teach themselves). Then they switched to Python (?), I don't know what they used after that because by then anyone I still knew at the school was beyond the freshman classes, most having graduated.
At the same time they had the Scheme-based course (and possibly before, but I know it existed at this point) they also offered a MATLAB-based introduction for non-CS/CMPE/EE students (EE and CMPE had to take the first 2 or 3, respectively, CS courses with the CS majors as part of their degree program). This MATLAB course was targeted more towards ME, AE and other majors who really just needed something that let them write programs to solve problems in their domain. Many of them did much better. The problems assigned were better motivated for them, and they were able to apply the material within their broader course of study or internships or research work.
That a first course should be appropriate to the desired outcome for the students seems obvious, I'm rather surprised there's much debate around this topic.
This feels like a narrative in need of hard evidence. I was there at what I assume was the tail end of this, in 1997. My gut feeling was that cheating was no worse than it had been, but they had automated systems for the first time.
Some of it was so silly that I was accused of cheating in the C course for using and citing the hashtable implementation from the assigned textbook. On just talking to the graders and professor, they dropped it and were amused that anyone used the book.
For the other sciences, they would shake up the rules on how to capitalize figures. The idea being that cheaters would use last year's rules. Or lazy people would just not pay attention.
The cheating came out in the Scheme class, specifically. It was something like 150 students caught. It was not a general cheating scandal in the first CS course over the years. My comment was very clear about this, I did not imply cheating was a problem elsewhere (may have been, I wasn't aware of any issues though).
It wasn't really overblown; it was a problem with that course. The instructors and TAs were grossly unprepared to teach hundreds of freshmen a semester in a language they (the instructors and TAs) weren't familiar with, one that was fundamentally harder (especially for an arbitrary student rather than those actually interested in the topic) than what was done before.
The rest of my comment was about appropriateness of the first course to the students and their objectives. MATLAB being a much better first language for the engineering majors than the general CS requirement provided.
My point was that a lot of that was the definition of cheating. It got so bad that just working together in electrical engineering would get you threatened with cheating.
So, yes. They had evidence of massive cheating on a level they had never seen. They also had evidence of near incompetence in teaching at a rate never seen before. Not shockingly, they focused on the cheating.
My assertion is that the behavior in the cs students was no different from any other students. They just had more mechanical grading schemes and were less prepared to teach such a variety of students.
Edit: and apologies for skipping the MATLAB assertion. As an EE at the time, that was my intro. I don't recall it being particularly good. Or bad. It just was. They shifted to Java soon after for many folks. As someone who was a TA for the Java classes, then the VHDL classes, then the C classes, I can't really see any benefit of any one over the others.
From [1], in 2002, 187 students were investigated for cheating at GT. 77 students had their final letter grade dropped by at least one letter grade, 26 received an F, 32 received a zero for the assignment, and one student was suspended for two semesters.
I couldn't read the full article, but that's what it seemed to say in the first two paragraphs.
There seem to be two Slashdot threads discussing the issue from 2002. Perhaps it got worse after you left?
I should have been clear on my call for hard evidence. I am not claiming point blank that there was no cheating. I caught some folks myself that left someone else's name on the submission.
Rather, a lot of what they classify as cheating in that time was stupid close to standard practice for sorority and fraternity members. Having banks of previous years tests and assignments, for example. (I say this as someone that was not in one, so mayhap I am misrepresenting.)
So, my request for hard evidence is mainly to get to a root cause and to establish norms and base rates of behaviors. Yes, it could have gotten worse or had some bad years. It feels highly suspicious that it just happened to correspond with the rise of automated graders. Yes, coincidences happen. Feels really shaky, though.
>Then they switched to Python (?), I don't know what they used after that because by then anyone I still knew at the school was beyond the freshman classes, most having graduated.
As of 2016, CS1301 was still in Python. CS1331 was in Java. MATLAB was present either in specialized CS classes (like computer vision) or in the Intro To CS for Engineers (CS1371). Not aware if it remained this way afterwards though.
It's my feeling that articles like this are evergreen, simply because the demands on a "first programming language" are contradictory and excessive.
Depending on who you ask, a first programming language (I'll say 1PL to reduce repetition of the phrase) should do one or more of the following:
1) Be practically useful even if the student drops computer science immediately.
2) Be low-level enough to teach the fundamentals of what's really going on with the machine.
3) Be high-level enough to teach the fundamentals of true computer science.
4) Be stringent enough to allow the users to learn precision in their programming.
5) Be lax enough to allow the users to not get bogged down in endless boilerplate.
6) Be in common usage to help ensure immediate practical users for the language.
7) Be exotic enough to avoid turning the first year and a half of CS courses into a competition to see who spent the most of their formative years messing with computers.
8) Bonus round: claim that the 1PL is a bogus fixation and instead the learners should learn several different languages.
The problem with all these arguments is that they are all sorta-kinda true, but the consistent application of even a few of them produces a learning load that would make the first year of computer science (or some "intro to programming" course) not so much a major as a 2-3 year full-time job carried out to the exclusion of all else.
It might make a lot more sense to think not about the 1PL but how the 2ndPL, 3rdPL, etc. are introduced and when. Instead of overloading the 1PL with a host of contradictory and pretentious goals, maybe we teach people the basics of programming as quickly as possible and move on?
For my part, I think the experience of the 1PL should be focused on usability and fun. I'm not really a fan of the whole Java thing for reasons not germane to this, but "Processing" really had a whiff of what was fun about messing around with Commodore 64 BASIC back in the day. Maybe that's more important than making sure that the language in which everyone adds up columns of numbers and prints an invoice for some fucking stupid imaginary warehouse application properly inculcates everyone into the One True Way of Thinking About Computing, whether that's being close to the hardware, or being able to write metacircular interpreters, or be a service course for some miserable engineering department.
The fact is, most of these first-year courses are dull, miserable and unappealing.
Slightly related: I am having great success with using Processing as a first language for my young child. The fact that it is very visual and easy to make "pretty things to look at" while still being pretty powerful made the onboarding a lot easier than something like JavaScript or Python.
As a bonus, it is close enough to (Java, C, etc.) that the move to a next language should be pretty straightforward.
One of them also did a ton of Scratch, which is very different in a bunch of ways - and he seemed to spend a lot of time wrestling with the fundamental limitations of Scratch. It felt so specific to Scratch that I often felt like what Scratch led to was, well, more Scratch - and also getting caught up in the politics of putting things up on their public forum.
I felt that both Processing and Scratch certainly hit the 'fun' point, and that Processing also had the advantage of being a lot closer to actual coding. I imagine that there are probably also JavaScript setups that might tick some of these boxes too.
Folks getting huffy about the fact that people aren't learning metacircular interpreters or assembly code (or logic gates!) right off the bat forget that (a) even the basics of solid imperative programming, and the inexorable way the computer does what you told it to, not what you "meant" is hard enough and plenty to get started with and (b) computing is one of many things people could be doing with their time, and making it shitty and esoteric and unpleasant to pass some purity test is not going to necessarily attract the people you want in the field.
Notably the latter point - weird culty languages, ultra-low-level approaches, etc. - is going to filter your students down to the people who knew from Day 1 they were going to be computer scientists. Nerds, like me. This is by no means going to filter people down to the 'best possible student cohort'.
When I arrived at university we had to do a Haskell course first. I didn’t realize it at the time because it just felt backwards, but it was a great “reset” of the minds of everyone like me who had done C64 Basic and Pascal. It also served as a great equalizer in the class when some people had never programmed while some of us had done it for 15 years. The Haskell course put everyone on an equal footing.
"When I arrived at university" -- Usually 18 years old.
Then: "while some of us had done it for 15 years" -- I'm confused. Are some people learning to program a 3 years old? This is breaking new records for HN exceptionalism.
University is often 19 or 20 here (mandatory military service!) but obviously there is no upper limit; some in my class had done it for 15 years (possibly more, we had people with existing careers, but they were older than me). I was 20 and had programmed for ~12 years at the time. Obviously zero years professionally.
Even MIT abandoned Scheme and SICP and switched to Python.
There's a beautiful elegance to that approach. It gives a sense of the mathematical foundation of computer science. It's not that useful for building large programs from off the shelf libraries, which is what almost everybody does today.
At one time, computer science required a good understanding of what's in Vol. 1 of Knuth. That's not in much demand any more. Just about everything in there you now get from a well-debugged library.
There are people who still write hash tables, and they're serious hash table theorists. I'm impressed with how good hash table technology is today. You can get over 90% fill and near-constant time. But only a few people today need to know about that.
University started out as a trade school for the clergy, then added law and medicine, and then became a finishing school. At all points the median student was more interested in drinking than studying. Scholarship has always been a minority pursuit at universities and research universities haven’t even existed for 300 years. Universities are now and always have been vocational institutions. The supply of idle rich who don’t need a job has never been all that high.
In more mundane terms, the fact that you go to university to get a good job has been the social contract pretty much everywhere in the world in the 20th/21st centuries.
The idea that this is “not what it was meant for” which is oft repeated on HN - and I presume in academic circles too - is not only inaccurate as you point out, but also arguably irrelevant, because the people paying the bills and justifying its existence expect otherwise.
> the fact that you go to university to get a good job has been the social contract pretty much everywhere in the world in the 20th/21st centuries.
It's an idea from the middle of the 20th century. For much of the 20th century, university was just something you did if you were upper class. Then someone noticed that upper-class people had good jobs and decided that must be due to their university attendance. It wasn't.
The clergy weren't a vocational body, they were a club. And wars were fought over whether the people in that club had to go to university to be able to join.
Ditto the lawyers, who certainly don't need to go to university to read law or argue. Being a lawyer is also basically being in a special club.
Medicine, I can only see a vocational aspect. But that is only 1 in 3.
> At all points the median student was more interested in drinking than studying.
A good use of time in the club. But only relatively rich students can afford to do this. Anyone who drinks their way through a university experience is either rich or busy trying to join a club.
I've found that for topics like this, the vast majority of Hacker News thinks the world began in 1776. That said, the Universities of Oxford and Paris do meet GP's argument - they were heavily theological until fairly recently. Cambridge was too, until the 16th century.
Also Bologna, the oldest and first university in Christendom. Notice that all of the subjects studied are either vocational, i.e. they lead to jobs, or the liberal arts, which are a prerequisite for studying theology.
> The university arose around mutual aid societies (known as universitates scholarium) of foreign students called "nations" (as they were grouped by nationality) for protection against city laws which imposed collective punishment on foreigners for the crimes and debts of their countrymen. These students then hired scholars from the city's pre-existing lay and ecclesiastical schools to teach them subjects such as liberal arts, notarial law, theology, and ars dictaminis (scrivenery).[17]
If the employers don't want people who have a detailed understanding of academic minutia, they should maybe stop hiring from university courses and set up some sort of vocational system.
Just because there is a need for vocational experts doesn't mean the universities should try to fill it. The place in society for universities is not a short-term-practical one.
This is a very minority view, unfortunately. The politics of UK higher education has been completely marketized, and university courses are ranked by what salary the graduates can achieve.
Yah, they’ve always demanded to see my Engineering degree to even consider my application, only to proceed patronizing me from day one about how the grown-ups do work differently.
Now that I’ve put enough years I don’t get the treatment any more, just strenuously opposed with script-kiddie hacks when I put some intellectual effort in what I’m doing.
There’s a ton of people out there saddling down the industry who are totally our of their depth and terrified of becoming irrelevant.
As an employer, why should I spend money, time, and attention on setting up a vocational system when I can just keep doing what I'm doing, and have other people spend money, time, and attention in the process of educating my employees?
If the employers don't want people who have a detailed understanding of academic minutiæ, why do they spend so much time testing their prospective hires for detailed understanding of academic minutiæ?
Employers don't prefer university graduates because universities offer better vocational training. Universities teach scientific thinking, which makes students better employees. Universities don't teach specific skills, but they teach the underlying principles.
Employers prefer university graduates because they have to pay extra for good people. And if they have to pay extra, they want a kind of certificate that it's worth it, plus something to justify the extra pay in front of other employees. A degree makes all this easier. Yes, scientific thinking is nice, but what really matters is that these people can deliver a complex workload by a given date. In the rarest of cases you actually need a researcher, and at that point you want someone with a PhD.
I don't know if you'd call him a theorist, but I think this gentleman maintains GHashTable and I found his posts comparing/benchmarking widely used hash tables really informative and entertaining:
> We might be able to teach a lot more people about programming if we don't expect students to know mathematics first
I'd like to know where the author thinks mathematical learning can stop. In the US we live in a society where people will say with a smile, as though it was a badge of honor, "Oh I can't do math, I always hated it in school."
I'm sure the author isn't talking about that level of learning and would still want a reasonable degree of knowledge for all people, but their opinion on what that level should be would be interesting in this context.
The most applicable is, "It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration."
But do not forget, "The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offence."
I started with BASIC as well, in the 80s, on a C64.
Dijkstra was right: I wasted a lot of time unlearning BASIC in order to re-learn. If I hadn't been exposed to BASIC at all, I could have learnt a proper modern programming language/paradigm from scratch.
It helps to have a good teacher. My years of Atari 8 bit BASIC seemed more help than hindrance when a good high school computer teacher then taught me Pascal.
I think you need to know how many students were introduced to programming with a course in Basic, and gave up programming, to make that claim. Not that I necessarily agree with the quote.
Most people don't want to program in C, and I think my high school programming course (which used C) did more harm than good in that regard. The only students who got anything out of it had already taught themselves programming on their own!
I came back to C later and enjoyed learning it properly, but it is very much not a gentle introduction into the world of programming for the average person!
"Matter" to what end? That is, if you do not have a specific goal in mind when learning a language, then of course it doesn't matter. On the other hand, if your ultimate goal is to learn a myriad of languages so that you can jump around fields, then of course it matters, as you will likely have difficulty learning advanced concepts without understanding the basic ones. (Just as you would likely not dive into calculus without understanding algebra.)
> We don’t have to make programming about mathematics.
I also don't understand the article's interest in decoupling programming from math; that seems like trying to have physics without math or philosophy without logic. The power of the field is founded on math; you can't just sidestep around it and gain its power at the same time.
>I also don't understand the article's interest in decoupling programming from math
It's an unfortunate side effect of America's long acceptance of anti-intellectualism. And people like this perpetuate it by coddling Americans who were somehow allowed to get through high school without even a basic understanding of the fundamentals of math. I am also a victim of this defeatist way of thinking, and have worked hard to appreciate math and learn it on my own to undo the trauma instilled in me by math teachers who simply taught formulas for us to remember to pass tests that we immediately cleared from our memory afterward.
I had some math teachers who either did not understand math very well themselves and just regurgitated text books, or just didn't know how to teach it well (which, to be fair, can be challenging).
But the article states near the end:
> Conversational programmers struggle to find resources that help them learn because so many of them require a focus on the logic and mathematics, but we are developing approaches to help conversational programmers learn without the math. We might be able to teach a lot more people about programming if we don’t expect students to know mathematics first
I'm just not sure that's a great (or entirely workable) idea. Embrace the math!
That said, I don't think one has to learn the math first; you can learn math and programming at the same time. In fact, it is perhaps easier to learn math when you have a concrete programming context to apply it to. I taught myself the basics of algebra by teaching myself BASIC in elementary school, years before I had a clue what "algebra" was. By the time I was taking algebra classes, it was a breeze. (Calculus on the other hand....)
In my opinion the best sequence looks something like this:
- Machine language (not assembler)
- Assembler for a couple of different architectures (i.e. Von Neumann vs. Harvard, CISC vs. RISC, etc.)
- Forth (written from scratch with one of the above)
- C
- LISP
- APL
- C++
I think of everything else beyond these mostly as C/C++ cousins or derivatives. With languages like Javascript and Python it’s far more about learning the libraries than a massively different programming paradigm.
Learning each of the languages in my list, in the suggested sequence, will provide someone with a deep understanding of different ways to solve problems with computers.
The machine and assembler should probably be taught together. Where possible, either emulations of real chips, programmable-logic simulated chips, or real chips should be used.
I recall the class which used a 6800 derivative in one of my college courses had a _very_ well written developer's spec manual for it. Complete with a (large) table that showed the logical relationship between the binary value of an instruction and the assembly mnemonic.
Today that might be RISC-V.
'x86' and ARM should also be used, and at least a taste of very constrained systems for super cheap stuff and very low-level state machines for components would help round out the package.
Forth should be a semester project for one of the classes. This should include some practical case where either a manually clocked serial link, a set of DIP switches and memory transfer captures, or some other constrained input method (maybe that 'boot sector Forth' I recall seeing recently?) is used to trickle the core into RAM and then add higher-level instructions. This might also include a small boot blob that inits the platform and opens an external data link; a useful example for subsystems.
The intent behind making a distinction between machine language and assembler is because one can offer a deeper understanding of what actually goes on at the lowest level of computing.
The way I would prefer to teach machine language would be to start from scratch. This means you would create and define a simple set of instructions in the context of an equally simple 4 bit processor the students would build on a breadboard out of simple chips. And, yes, the microcoded instructions would be coded by hand using a diode matrix.
I truly believe such an exercise has enough value that it should be the starting point of a real education in computer science.
From there it's on to assembler on preexisting processors.
The reason Forth follows assembler on my list is that you can easily bootstrap Forth from scratch on any processor. Building the language from scratch is, again, to repeat myself, a worthwhile exercise. You would use assembler to get started and then switch to the embryonic Forth to build upon that. At some point you write your own code editor in Forth. From there there are a number of interesting applications (PID motor controller?) that could be implemented with the language you just built from scratch.
C offers an entry point into a world of languages that look and feel like C. It provides a tool that is close to the hardware, yet high level enough to allow for much greater freedom of expression. Doing arrays in Forth is a very hands-on proposition. In C it's pretty easy.
Once you are good with C you can easily pick-up languages like Javascript and Python. Not the same thing but you "speak" the same words to the computer to achieve pretty much the same thing. A "while" loop and an "if" conditional are the same incantations.
LISP opens the mind to an entirely different paradigm. It is well worth learning. The same is the case for APL. I don't think of these languages as "career level" languages. That may have been the case ages ago. No longer the case. However, the perspective gained is invaluable. As we try to solve more and more complex problems the limiting factor might very well become our ability to express ideas for a computer to solve these problems. Having the perspective of languages such as LISP and APL makes someone realize there are other valuable approaches to solving problems computationally.
Finally, C++, well, it's the entry point into object-oriented programming.
This, BTW, was pretty much my path through computing. I can't complain.
As someone who has taught (or TAed) undergrad courses in all but APL and machine language, I would suggest a different sequence:
- LISP / Scheme
- Python
- C
- Assembler / machine
- whatever else beyond here
maybe leave out the LISP depending on how religious one feels about it (I'm a fan, from a pedagogical approach at least).
Most students have a very weak understanding of what a computer does (i.e. Von Neumann architecture, executing an infinite tape of instructions, etc.). I like LISP / Scheme because it really helps bridge the gap from expressions (which students can pretty intuitively grok from high school math education), and it really has no language design warts; you can write a metacircular evaluator in ~10 lines.
Python is a next step because it goes from "how does coding work?" to "okay, how can I do stuff with this?". The language itself is pretty straightforward (although it certainly has its warts too, e.g. the odd methods of variable scoping), but there are a ton of libraries and it is pretty much useful straight away for anything one wants to design a course around. Many unis combine "learning Python" with programming some robot, doing an AI thing, or other such practical outcomes.
Importantly, whenever I taught a "relatively beginner Python course", I would always spend a huge amount of time on teaching students about ipdb / pdb. Allowing students to explore the state and stack of a running program is a critical doorway for them to step through to not only learn how to debug their own problems (less of a load on me!), but also really start to grok the "program is a sequence of instructions that change state" thing.
C before assembler, although I have found it's often useful to "combine" them so that the concept of a pointer seems less odd and more useful. Also, C still has tools that students can use to explore their code (e.g. gdb), whereas they're much more on their own with assembler.
After that, it really depends on what students want to do. They can stick with systems stuff and do Rust / C++ / Go or go to Javascript or go crazy with the PL stuff with Haskell, etc.
Certain people (perhaps yourself) really understand this "program is a sequence of instructions that mutates state" thing, but in my experience, most of my students really don't get that. If I were to throw them into that deep end of the pool where they lack the tools to debug their own problems, I would get a lot of bad reviews / people switching out of the major. On top of that, students often like to really be able to produce cool stuff, which they can in Python very easily. Many students may bounce off the lower level stuff if they just want to do front end too, which is perfectly fine in a "top down" approach from higher-level languages.
I think you should teach assembly language first, and then build a few simple high-level languages on top of that. I suspect it would prevent a lot of mental dross from forming.
I agree. At the very least, teach assembly in the first term, at the same time as students are learning a high level language.
If you know assembly, you can understand why arrays exist in C (because CPUs have base-and-offset addressing), why records exist (same reason), what pointers are (assembly is pretty much about pointers), why ints are different from floats (ADD vs. FADD), how functions are called (the system stack), and so on.
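A quick C illustration of the array point - indexing really is just base-plus-offset arithmetic spelled differently (array contents here are arbitrary):

    #include <stdio.h>

    int main(void)
    {
        int a[4] = { 10, 20, 30, 40 };
        int *p = a;  /* a pointer is just an address you can do arithmetic on */

        /* a[2] means "load from (address of a) + 2 * sizeof(int)",
           which is exactly what *(a + 2) spells out. */
        printf("%d %d\n", a[2], *(a + 2));  /* 30 30 */
        printf("%d\n", *(p + 3));           /* 40 */
        return 0;
    }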
Absolutely. My first year of programming we took both Fortran and IBM assembly. It all made a lot more sense, and still does.
We have dedicated classes in computer architecture to learn all the things you mention; it's not the job of a programming language to teach them to you.
When I first learned C in high school, the assignment statement was what bugged me the most, because I already had a math background at that point, so statements such as x = x + 1 made no sense. Even in natural science in general, the idea that you can "assign" a new intrinsic property to a thing is very unintuitive. 20 years later, mutability is the exact idea that I have to unlearn every day.
My point is that all these languages and learning hierarchies were made up by people like us, and in an era when, frankly, they had no idea where this field was headed. We should not just blindly think that because they're written in a book and sound important, students should be condescendingly reminded about them in programming languages while they're trying to learn to solve real-world problems.
Well, I had a similar experience in algebra. Here's an equation; it's the problem we're trying to solve. And we do this operation to both sides, and we get a simpler problem. OK, I understand that. And we do another thing to both sides, and we get an even simpler problem. Great. And we keep doing it, and we get... the answer??? Not a simpler problem, but the answer? How did that happen? Blew my mind.
I eventually decided that what we actually got was a very simple problem, so simple that the answer was trivially obvious.
In contrast, it was trivially obvious to me what "x = x + 1" was doing, even on day 1.
Maybe I'm unique. But I suspect that different people trip on different things. Some trip on "x = x + 1", but that doesn't mean that it's a bad syntax. "x := x + 1" is going to make you type an extra character in every assignment over the next 40 years, in order to prevent confusion in a high school or college class. That may not be a good trade off. It especially may not be a good trade off when not everybody trips on it. (I didn't ask them to change the way they did algebra problems.)
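For anyone still tripped up by it, here's the read-then-store reading of x = x + 1 spelled out as a tiny C snippet (the variable name is arbitrary):

    #include <stdio.h>

    int main(void)
    {
        int x = 5;

        /* Not an equation: first evaluate the right-hand side with the
           current value of x (5 + 1), then store the result back into
           the storage location named x. */
        x = x + 1;
        printf("%d\n", x);  /* 6 */
        return 0;
    }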
My quibble is not really about the equality symbol; it's more about assignment as a fundamental building block of code.
(Re)assignments exist in social interaction so I guess it's intuitive to people that way, but if you look at code from a perspective of modeling reality that's just not a thing that exists in nature. Maybe it's just me.
I suspect that your mental model doesn't line up with what's happening in an assignment. (I suspect that a similar thing was going on with me and algebra...)
At UT Austin, undergraduate electrical and computer engineering students spend roughly the first half of the first semester building up from bits/binary to transistors to logic gates to latches and flip flops and registers. The second half of the semester finally moves on to understanding a "toy" ISA called the Little Computer 3, or LC3, and writing programs in its assembly language.
It's a much sharper and better motivated method to teach computer architecture, in my opinion.
This can work well for folks who want to stick towards the ISA / hw level, but for general coding I don't find assembly more useful than just a simple language like Python. The "simplicity" of assembly evaporates pretty quickly beyond the basic math instructions (e.g. mfence, vector stuff, the concept of a stack). Undoubtedly these things are useful to learn eventually, but they are incredibly daunting to any novice coder.
The ultimate frustration is in trying to support beginners in learning assembly, as they will feel absolutely bogged down whenever they hit an error and will be completely incapable of being independent in figuring out their problems. They will lean on the teaching staff like crazy (ask me how I know....) and it's this feeling of "my code is broken and I have no damn clue what is going on!" that drives people away from the field. They have to get eased into it.
When I taught undergrads a primarily Python-based class, I spent the first third of the semester teaching them how to effectively use pdb (or better yet, ipdb; tab completion!). Although my section did not get as far as other sections (i.e. we got to the end of the required material only, whereas other sections advanced to some optional stuff like numpy and Django), the profs teaching the following course told me how much they appreciated that the students from my section really could dive in and solve their problems (or at least give them a good first go). If you don't give beginner coders the tools to debug their own problems (which are lacking in the assembly world), it may hamstring them for the rest of their careers.
I think they should teach assembly in parallel, but I disagree with the progression you're suggesting.
High level languages are expressing informational logic that actually can have nothing to do with computers, formally.
The way that the language maps to assembly, memory, cpus etc. is obviously incredibly important, but that's crossing a major boundary of abstraction.
Assembly is an under the hood implementation detail to quite a lot of CS work.
That said - of course it has to be taught - but I actually don't believe it's a 'precursor'. That's just how computers evolved. Imagine a day in 50 years when we switch to Quantum computers. Maybe 'assembly' will be something completely different then.
One of my favorite computer jokes is "Computer Science could be called the post-Turing decline in the study of formal systems."
Your criticism is valid, I think, because the chips are now so complex "under the hood" that assembly is, in practice, no longer the thin abstraction it used to be, eh?
I’m primarily a front and developer and I wish that was the way I went. When I work in even basic back end code now I feel unprepared. I highly doubt I’ll ever write assembly code and if someone was to ask me to I would essentially face plant
I'm not gonna downvote you, but I think you should be. This is already what universities were doing up until recently and I don't think it helped anyone. Some people learn top down and some learn bottom up, some learn... middle-out? Whatever their way is, you're just saying everyone has to learn bottom up.
Assembly is indeed intimidating for novices. I agree. But I still think that newly minted programmers need to understand something about what's under the hood of languages like python.
Maybe a full course in assembly is too much for freshmen, but they should at least understand that RAM isn't like VCR tape: you can fetch any RAM memory location in constant time, unlike with a serial storage medium like tape. And they should know some of the hardware memory addressing mechanisms that make arrays possible.
No, asm is not intimidating, it's irrelevant for some people. I guess my point is not to be so close minded and arrogant as to thinking you could possibly know what is and what is not important to someone's job, and dictate how people should do things. As well as thinking there is only one true way of doing things.
Again: you learn how a computer works in a computer architecture class; you come to programming already knowing that, or not, if it's irrelevant to your learning goals. You don't design programming languages to teach how a computer works, nor do you learn programming to learn how a computer works; that makes zero sense.
Schools don't want too many of their students to quit. Especially when other schools start with more friendly programming languages. Assembly is important to know about relatively quickly, but I think it's a recipe to make people hate computer science when it's the first programming language.
I started with BASIC in middle school on a TRS-80, then bought a TI/99-4A when I was in high school. At university it was a mishmash. Every course was different. PL/1, Pascal, Modula-2, FORTRAN, Scheme, C, assembler were in the mix.
I'm not sure it matters, but I think most people would do best learning a simple imperative language like BASIC or Python to start. I didn't really "get" Scheme/LISP functional programming until after I graduated, but now it's my favorite style and Erlang is my favorite language.
One thing that doesn't seem to be called out is the 'overhead' of a language, i.e. compilation, libraries, resources, etc., versus a simple opportunity to teach basic programming principles and algorithms.
Python and Javascript have very low barriers to entry for the overhead (it can get complicated but for small programs it's easy).
You can use those to teach a lot of basic things.
In my own education we started with C/C++ and we were all drowning in pointers 80% of the time; the actual course material was pedantic.
What universities tend not to focus on very well is just teaching the real, applicable basics of a language, with some good professional 'best practices'; students can spend years wasting their time trying to get things to compile, lost in bad manpages etc.
Start with something 'very light' and for everything else - provide a short 101 course for languages of instruction which focus on the practical mechanics of the language and nothing else. Those courses might even be taught by professional developers, outside of the regular kind of pedagogy. It feels 'not very academic' but that's the reality of the discipline.
If you want to teach architecture, and expect students to 'build buildings' ... you need to teach them properly how to use the hammer, nails, measuring tape - even if that in and of itself is not particularly academic.
My first not-quite-languages were Logo and then a smattering of autolisp (I didn't learn near enough to know much to do with it). First real programming that followed was Fortran, eventually on a Cray with threading extensions. And then a fork in the road leading to both C and Smalltalk-80 in parallel.
I don't feel like this specific heritage affected how I learned programming. But then, I was mostly self-taught, with a generous amount of solicited input from older engineers and contractors I worked with/around (looking back, I'm a bit self-conscious about just how annoying young me probably was - I did try to load balance at least). So maybe this article doesn't resonate much with me because it comes from the realm of common computer science pedagogy.
In fact, my first "college programming classes" were a year or so later, two classes in C that all mechanical engineering students at BYU had to take. I was able to wow my peers by making a faster sieve of Eratosthenes than others. And then for the "choose your own project", I used Smalltalk-80 to model quantities with units and a symbolic algebraic equation modeler/evaluator, including a graphical visualizer that could plot arbitrary equations like PV = nRT -- the TA marked off points because my code didn't include enough comments, even though the next closest program was a Fortran piece that prompted for a few values and then spit out the correctly interpolated u, h, or s values from the steam tables. Needless to say, my impression of computer science pedagogy in the early to mid 90s wasn't that great. My takeaway was not so much what language was first, but who I learned from first.
Taking specific languages out of it, I think some characteristics of a good starter language could be:
* Simple Syntax - I think it's important for this to be a 'typed' language rather than blockly-style as otherwise people find it hard to transition after. It's important to learn typing in syntax otherwise that's a band-aid that's harder to rip off after.
* Mix of dynamic and static typing - On the other hand, words like 'integer', 'string', 'for', etc. confuse people who are starting. A good starter language would allow people to use dynamic typing and then encourage them to adopt static typing in the future, getting them used to the terms.
* Very helpful error messages.
* Standard functions that support drawing simple stuff to screen (I'm looking at you python - how hard is it to draw a circle for a beginner! Beginners want to break out of the terminal quickly).
Super cool, but also difficult to start for total beginners!
I suspect this is aimed more at people who can already program and want to learn functional programming, rather than people who have never programmed, but I might be wrong!
The first language is incredibly important, and I normally recommend javascript for the following reasons:
- it's /highly/ accessible (any computer with a browser can compile and run javascript - you just need to get your browser to load the file)
- it shares a lot of its syntax with C-style languages (which makes transference to other languages much easier)
- it's 'relaxed' on how it enforces rules (now, this is contentious: should a learner be allowed to get away with mistakes that will haunt them later, or should a learner be hobbled and forced to be precise from the get-go? IMO the former is easier for people to upskill with than the latter)
- it's (currently) got commercial value (that is, you can get a job with javascript on your CV)
The relaxed rule enforcement is very difficult for beginners to understand. After one has experience with other languages, we kinda understand that every language is gonna have at least one crufty WTF corner, but beginners will become really confused by behavior like what the infamous "Wat" talk [0] demonstrates.
Even Python has some quirks beyond the usual monkeypatching madness that can occur, e.g. with regards to name scoping.
I'm honestly not sure what could fit here that isn't a LISP dialect (which I think is pedagogically the right choice, but is usually impractical unless one is sure that one wants to do "hardcore" CS stuff, vs. using coding as merely a tool in another field).
I think JavaScript is a bit too advanced though and some mistakes are quite difficult to explain, such as the scopes and event functions, or the relatively invisible typing system. For example, Reddit is full of beginners making fun of JavaScript because concatenating a number and a string using the + operator produces a concatenated string and not a sum number.
However to reduce the complexity, it's possible to restrict students to a subset, using eslint and one or two plug-ins. I like eslint-plugin-unicorn to restrict my co-workers.
I learned qbasic, graphing calculator basic, and visual basic more or less on my own. I learned C++ in a mediocre way from the AP course in high school. In university, C++ was the most common language outside of low level architecture courses. I picked up python pretty quickly to make rapid progress on student projects. I pretty much use some combination of C, C++ and python for everything.
I had to use java for a robotics course, and it wasn't too hard to pick up the differences.
I'm going to have to pick up rust in the next few months for work. Reading through the reference manual, it's going to be a bit painful to rewire my brain. So far none of the concepts seem alien, though, just a bit weird.
Me in an incredibly dull Java class in college: This is awful! I dreamed of making computers do interesting things, but it turns out programming sucks and is mind numbingly boring to learn! (I eventually dropped out and went on a different path)
Learning Python on my own 20 years later from various free online resources: This is incredible! I can make computers do anything and it all makes sense! What else can I learn?!?
For me, having both the right starter language and the right form of resources were required for it work out (ingesting lessons via video as I am curious about a certain topic works infinitely better than a classroom).
I can’t imagine how many people switched majors or dropped out due to having something like Java foisted upon them in their 100 level CS classes. I have been using it off and on professionally for 15 years and I am always floored when I have to introduce somebody to Java and a Java codebase. Even something as simple as knowing which version to install requires a lengthy exposition. Do I need the JRE or JDK? What about J2SE. Or do I need J2EE? Should I use adopt (a dumb name itself) or maybe coretto? Who knows.
The current state of programming instruction in universities is abysmal. The way that universities do it right now, you get some super basic instruction in one, maybe two languages, and then are left to fend for yourself for everything else. Either actually teach programming or don't bother at all.
With mathematics, a student entering a math degree program is expected to have taken several math courses in high school. Students take placement exams to determine if they need remedial courses, but in general, you'll need to have taken algebra, trigonometry, geometry, and maybe a level of calculus.
But for computer science, we don't do any placement testing. Everyone goes into the same Java 101 course. And it's often the same course that everyone else in the college is taking as a gen-ed requirement.
Yes yes, insert any Dijkstra quote you want. I've read a lot of his papers. He was a professional troll. We need to stop holding him up as a paragon of computer science instruction. He did some great work in algorithms, but the dude is not the role model we should be pushing.
For as much as people like to parrot "computer science is as much about computers as astronomy is about telescopes", at least a telescope is conceptually quite easy to understand. A computer is a vastly more complex beast. Imagine trying to teach mathematics with students who don't even know the basics of the symbolic representation that algebra teaches. Such a student would fundamentally lack the tools necessary to succeed in a math program. But with computer science, we have students who don't know how to use their computers as tools. They only know how to use the software that other people have written and probably not even all that well. Even though "computer science is not about computers", being able to effectively use the tools of the field is still a necessary prerequisite.
So either shift the expectation to students having to learn the basics of programming entirely on their own, or introduce a course of remedial programming classes in a variety of paradigms. But this purgatory of holding students' hands through essentially nothing more than learning how to use the terminal and run a compiler, with only the most basic procedural programming scattered on top, and then throwing them in the deep end, has to stop.
It wasn’t my first programming language, but I declined to attend a college programming course in Pascal because I thought Pascal was outdated and I wanted to learn JavaScript. It was 1997, and it seemed like the future to me. The teacher told me I should just learn Pascal and the same principles apply, but I felt like I didn’t want to waste my time and mental space on a language I wasn’t going to use.
I probably should have done it anyway as I didn’t end up getting seriously into development until 2008.
My first programming language was solder. I learned it in 6th grade. My second was BASIC, which I learned 5 years later. I'm old, and I've been around the block quite a few times... I can tell you what was in those fields before they became subdivisions. I tend to go off on tangents sometimes.
Teaching isn't about knowing the one right way... teaching is about giving people information that relates to their existing knowledge base. In the language of RF Design, it's about impedance matching.
If you want to turn out Rust programmers, it's probably a bad choice to teach them COBOL first. BASIC would be less bad. The closer the languages are in terms of impedance, the easier it is likely to be to transit between them.
Tangent: Bear in mind that in RF Design, impedance is only a complex number; in a person's mind there are an infinite number of dimensions, so you'll never get a consistent number.
Tangent: Teaching someone BASIC on a machine with 2K of RAM is also radically different than teaching someone QBASIC with megabytes of RAM and a hard drive. The constraints are different, and the language itself has differences.
Tangent: Non-von Neumann systems like Verilog, FPGAs, solder, and Excel are valid tools as well.
My parents sent me to a "computer course" when I was around 8-9, to stop me from messing with all the electronic equipment at home (and sometimes getting electric shocks). I was always fascinated with machines and taking them apart.
The course turned out to be surprisingly worth it! Since it had kids up to 15yo or so (I may have been the youngest), they introduced some advanced topics. I quickly picked up HTML (with Netscape and Notepad) and at the end they even taught us C in a UNIX environment (we worked on IBM terminals).
I'm really glad and thankful I had a chance to learn C that early. I was taught Pascal, Java, and C++ before reaching my university years, but I have always appreciated the compactness of C and compared every other language to it.
Discovering Python on my own in high school was like a eureka moment for me. It was the first time since my primary school C episode that I felt "home" with another language.
This seems to be a natural evolution of a new field.
At the start, the goal of teaching is to produce highly-skilled graduates who can contribute greatly to the task at hand. During that phase, there's no problem with a high drop-out rate, and no need to "sell" the discipline by extending its reach. This is basically an exponential phase.
Later on, as faculty slots fill up, a steady state can be reached, in which the supply of graduates meets the demands of the community. That's when things get extended, and the goal can start to shift from turning out highly-skilled graduates with narrow focus to simply educating. Every university has a psychology department that has huge classes. Students in those thousand-seat classes are not going on to become psychologists; they are enriching their experience of life. And the dividing line between what they are learning and what they could learn in grade 10 is very thin, so this kind of material can also be taught in high school. Again, those 15-year olds are not destined to work in psychology, any more than they are destined to work in history, literature, or all the other fascinating courses that are being offered to them.
It looks as though the author is suggesting that education in computing is entering this state, in which there is very little probability of the student becoming a professional. And, in that scenario, I agree fully with the theses of the article.
But I still think there is a place for the old system of teaching a language and expecting that some of the students will go on to contribute at the cutting edge. It's great to have classes on photoshop and the like, but there is a question about what they ought to be called. Perhaps CA (computer applications) is a better acronym than CS for that student in sixth grade.
Using the same title for everything that happens to involve using computers is a mistake, not least because it gives students an incorrect impression of the pathways they might want to follow.
Declarative languages are then the natural choice. Instead of reasoning about arbitrary control flow semantics you can just think about what your program is actually meant to be doing. Then if you need to zoom in later and manipulate state/control flow directly that’s a new paradigm to learn.
I'm not sure that's a good reason. Handling "just trust us, we'll cover it later" is an essential skill when learning anything, especially things related to programming.
Yeah, ignoring that might be okay in a course like "Programming for non-CS Majors", but I'd expect better from an actual CS program, where understanding the foundations of the language you're working with is important.
I remember the same exact thing from Java, and similar "magical words" that weren't explained but I was simply told had to be there.
Learning for/while/do loops and variable assignment and all of that is fine for a programming course, but the first semester of a CS course shouldn't just be sticking a Fahrenheit-to-Celsius converter, a string reversal, or a simple sorter in between magic words and symbols.
I disagree. Other than the already mentioned points in replies, static typing and failing at compile time with a helpful error is really important. Runtime errors are much harder to interpret and debug - it is a skill they will have to learn later on, but inferring runtime state at a given point is arguably harder than doing so statically.
Only yesterday I commented[1] to this effect on a related thread, which led to a long subthread.
It all depends on the first couple of languages one learns and how they are learnt and taught. 10-15 years ago, people would find it hard to grok Python's functional concepts. But as it began to be taught as the first language in universities and those graduates joined the working population, you can see how natural it is for them to grok Python.
That said, in a professional environment, one should be able to pick up ~70% proficiency in 6-8 months if they aren't constrained by crazy timelines. This is based on my own experience.
Interesting; as someone who loves math, learning programming as an extension of math made a lot of sense to me. It also explains why other people I know struggled to learn to program in the same setting, but were later able to thrive given a different curriculum.
This applies beyond introductory programming. We faced this in our university's Machine Learning course. The same course was taught both to students from the engineering school, which requires a certain number of math courses (Linear Algebra, Calculus, etc.), and in the general college, which had far fewer requirements.
The professor was forced to stay surface-level on the math, so the coursework became a matter of gluing together code from scikit-learn. Unfortunately, the exams were much easier if you had the mathematical intuition.
As a biologist, I learned to program at 34, using Python (pandas and seaborn) in Jupyter notebooks. I mainly did data science in the beginning. I can tell you this worked extremely well for me. I could take small steps, adding text and outputting variables and graphs constantly as I "developed". It took me quite some time to shift to a full VS Code flow; I kept going back to notebooks. I've become more of a software developer now, but for data science (which I first learned in Excel and Origin) I still go back to the notebooks.
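To give a rough idea of that cell-by-cell notebook style (a minimal sketch; the file and column names are invented for illustration):

    import pandas as pd
    import seaborn as sns

    # cell 1: load the data and peek at the first rows
    df = pd.read_csv("measurements.csv")    # hypothetical file
    df.head()

    # cell 2: quick numeric summary of one column
    df["concentration"].describe()

    # cell 3: plot, tweak, and re-run the cell until it looks right
    sns.scatterplot(data=df, x="dose", y="concentration")

Each cell shows its output right below it, which is what made the small-steps workflow feel so natural to me.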
There’s many different styles of programming. Excel and Jupyter notebooks are a kind of “end user computing”. Programs developed in those environments often need a lot of refactoring to get turned into “software” (by some arbitrary definition). However these programs are incredibly useful for the business ideas they explore.
To put it another way, using Notebooks with pandas is a great way to write biology programs. Using Notebooks to write pandas is a terrible way to write software.
Yeah, I found this out too when transitioning from data science to software development; there was an interesting time in between where I couldn't let go of the notebook concept, but I certainly have now. That said, the notebooks got me started in programming, and did so in a way that I experienced as very gentle, with a flattened learning curve. I guess that was my point.
My father is a CS person, but he can barely code. I spent a lot of years learning programming through tutorials and none of them worked, because I lacked critical thinking skills and abstract reasoning.
I believed in miracles. I believed in illogical things.
I was bad at Maths.
I didn't stop, however. I took a critical thinking class and became an atheist three years later.
I was able to see things differently after that.
Still, it was not enough to be able to program, but it certainly boosted my speed.
I took a Python class. I was able to understand basic concepts of programming.
It still wasn't enough, because I couldn't make anything real-world.
I looked at real-world Python projects, but they were hard to read and had obscure, short names.
Then I spent my time understanding what an API is. I found bubble.is, and there I used my skills to build a real-world app and made some money.
I moved to OutSystems, and there I boosted my understanding of a lot of concepts.
I started to feel that I was ready, but I was afraid to dive into a real programming language.
I got angry one day. I wanted to build an API in a real-world language. So I googled and found express.js.
I had hated JavaScript because a lot of people hated it online. But I copy-pasted the express.js example onto my machine and modified it to my needs.
I felt relief. I had finally started real-world programming, with JavaScript.
I made a lot of money with JS for 6 months.
I then discovered C++ and the static typing world.
I went to learn C++ despite people telling me I should learn C before learning C++.
After learning C++, I gained an understanding of pointers, memory management, and compilers.
I then moved to the embedded space, where I built an OS and a kernel from scratch.
That's my journey and of course I'm still learning Maths.
Actually, learning critical thinking and programming helped me to see what Maths really is. I'm studying basic algebra and going through a lot of concepts right now.
I think the mistake in teaching languages is to start students with a blank page. I learned by typing in code from a magazine, and thus started from a working program to tinker with. So, when my then 9-yr-old son wanted to learn, we spent a weekend hacking the Minecraft code. We did the first change together: making it so the Endermen didn't kill you for looking at them. Then I left him to it. By the end of the weekend he was flying around on the Ender dragon.
Basic web stuff
MSSQL
C and C++
Bash
Intel Assembly
Overall I'd say a little bit of everything is good for a CS student. Don't keep them hooked on one thing for too long. Not that students will most likely be using C/C++ or assembly upon graduation, but they are just another tool. I just wish we had learned more about REST and doing CRUD operations. It took me a long time during my internship for it to "click" what was happening.
First language should focus on getting started. Python works perfectly fine for this. Startup is minimal, students can start writing functional code within minutes.
Subsequent languages should match the content and focus of their classes (e.g. C for Data Structures and Algorithms, C/C++/Rust for Systems Programming, Java/C#/C++ for OOP classes, Haskell/Clojure/etc. for Functional Programming classes, etc.)
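To illustrate the "functional code within minutes" point above, here is a sketch of the kind of complete first program a student can run on day one (contents invented for illustration):

    # a complete, runnable program: no project setup, no compiler, no boilerplate
    name = input("What's your name? ")
    print("Hello,", name)
    for i in range(3):
        print("that's already a loop, iteration", i)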
So much has (and hasn't) changed in the past 50 years of computing. I mean, look at C++ today and compare it with any programming language from the '70s.
Another point: are we really talking about a programming language on the surface, i.e. its syntax? Or a programming language as its whole ecosystem, with libraries, tools, edge cases, and best practices? The former is arguably very, very easy. The latter is very, very hard.
From the PL theory front, when confronted with the idea that languages shouldn't necessarily assume prior mathematical knowledge, even some well-known researchers simply shrug and say "I can't do anything about that." And those same people will also criticize more popular/accessible languages on the grounds that they lack features from their favorite pet language based in some mathematical footing.
The hypocrisy is real, and it is definitely a real problem. If your field has confined itself to formalism without a consideration for HCI, you are doing yourself harm. Not only is the transfer of knowledge to industry and other domains harder, you miss out on equivalent but more accessible ways of expressing the same ideas.
Especially nowadays, when people learn some specific JavaScript framework, think they know how best to do things (because the framework is hyped), and then their learning curve stagnates.
"The tools we use have a profound (and devious!) influence on our thinking habits, and, therefore, on our thinking abilities.
...
It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.
The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offence."
- How do we tell truths that might hurt?
Edsger W. Dijkstra, 18 June 1975
Which language you start is not as important as whether that language is used by your teacher on a daily basis. One thing you don't want is a teacher with outdated notes.
What about pointers? They are fundamental to data structures, and languages which don't clearly show the difference between pointers and values make understanding data structures much harder, as many data structures rely on multiple pointers pointing to the same data, and that simply doesn't make sense if you see them as separate values.
I think this is a good point. You can implement a linked list in Python, but, in my opinion, unless you already understand pointers, it's rather confusing. Doing it in C makes everything explicit.
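A minimal sketch of what stays implicit in Python (names and values invented for illustration):

    class Node:
        def __init__(self, value, next=None):
            self.value = value
            self.next = next

    head = Node(1, Node(2))
    alias = head              # no copy: both names refer to the same node
    alias.value = 99
    print(head.value)         # prints 99 - the sharing is invisible in the syntax,
                              # whereas C makes you spell out Node *alias = head;

Nothing on the page tells the student that alias and head are the same object, which is exactly the understanding a linked list depends on.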
That depends on the purpose of the teaching. If you want to teach them the basics, what the thread starter mentioned is enough. If you want deeper understanding, you should teach them how it boils down to machine code, or add a bit of assembly to the menu. It explains pointers and other such things pretty well.
Learning the second language is certainly hard, but learning your fifth or sixth language or so becomes easier? Hopefully? At least if you put forth any effort or interest, I would think? If anything, I feel like the biggest trap people fall into is assuming they know a lot of languages when they really just know a lot of Algol descendants... you should make sure to spend time really building something meaningful at least once in at least Haskell, Erlang, and a Lisp (and a lot of others, but if you haven't worked in those three... have you really programmed? truly?).
My story: I have been programming since I was a little kid, having used mostly stuff like Logo and BASIC for years; after spending a bunch of time with Visual Basic I kept failing to learn C. In high school I took a course on some slideshow programming system whose name is escaping me right now, and then one on Pascal. OMG Pascal was hard for me: I made flash cards to try to learn the keywords, and I was struggling. There was something concurrent I was doing with simple JavaScript, which seemed to be going OK-ish.
I then started taking a second course in Pascal--an AP class--but by this point it had all clicked: I had been programming at that point--even writing software for local companies sometimes (in Visual Basic)--for like 8 years and I finally got it. The teacher noticed, and since I was only a junior he said I shouldn't bother continuing in Pascal: I'd help him learn C++ and co-teach the class the next year with him, and take the AP test in C++.
That was... over 20 years ago now? I now feel like I can just learn any language almost immediately. I remember being in the audience when Apple announced Swift: I was looking at the slides and thinking "ok, I see what they are doing here: this is like Objective-C crossed with Scala but with some of the syntax of Ruby... I bet I could program in this". I spent the next day playing with it and had already befriended the Swift compiler people as I was finding tons of bugs, and then I gave a talk on Advanced Swift Internals two days later across the street at AltConf.
I personally feel like this kind of mental process is "teachable". I thereby sometimes have gotten to teach a course at UCSB (I have sometimes been hired as a "1/8th lecturer" or something like that to do this) on programming languages, where I try to take students through the breadth and depth of syntax and semantics, rather than just focusing on a couple examples. Sometimes we look at a semantic and how it evolved, sometimes a syntax and how it got reused, and sometimes a use case and how various languages tried to support it. By the end of the course I expect you to "know" none of the languages, but to hopefully be able to quickly use any of them with some examples. The big project for the class is to design and implement your own "esoteric" language.
FWIW, I "thereby" personally am still in camp "it doesn't (exactly) matter"... as long as it inspires the student! Which I actually think is what this author is kind of getting at (and so I actually do agree with them) with that paragraph about "what about a student who X"... a lot of teachers seem to think "learn this toy thing and you will be able to eventually learn anything", but the toy is boring and doesn't do anything the student wants to work on, and by the end of the entire experience you are lucky if they did any of the work at all. It is like teaching someone to play some horrible-sounding musical instrument using example pieces that the student doesn't enjoy listening to, with the goal of establishing enough musical theory and experience that they can learn a second instrument / style some day and eventually get good enough to do what they wanted. It is just unrealistic? I was able to get as good as I did at those languages largely because they were useful to me... and honestly, the fact that Pascal wasn't was probably part of what made it so hard :/.
Yeah, you learn _any_ language that much faster if you have a solid motivation, both external and internal, to do so. And if the language is _available_ enough, meaning it has a lot of help, examples, and 3rd-party libraries, going for practicality or fun&profit projects really helps.
Then again, switching programming paradigms is not so easy, but when you can...
The author is showing confused thinking by conflating "first" with "only". Teaching a new programmer fewer than two language paradigms warps their brain into a limited view of what is possible.
Even learning JS + HTML, or Spreadsheet + Bash/PowerShell, with good instruction, should be enough to instill the key idea that languages are abstractions, that there are radically different languages for different problem domains, and that you should find ones that fit the applications you are interested in.
I would argue that in the context of this topic, HTML is not a useful second language. It's a completely different tool, closer to markdown or, heck, Word documents than an alternative way of programming. (I see your general point, though. Just this combo I'm not sure is the best example.)
The ethics of academia requires that computer science shouldn't become a factory pumping out sausages by the millions by whatever means necessary.
Perhaps a prerequisite of the major would be demonstrable knowledge of 2+ programming languages that exist in nearly orthogonal domains.
I think low-level to high-level, with the right amount of history, is the best way to build, rather than relying on mysterious abstractions below. Microcode, assembly, and C or Fortran are the fundamentals CS graduates should be required to have, or the accreditation bodies aren't doing their jobs.
Also:
If you can teach monads, you're a great lecturer.
If you're Sean Davis from UC Davis, you are/were a great lecturer.
A great lecturer of technical minutiae both has an expressive personality and explains concepts as simply as possible using learning aids such as analogies, examples, and diagrams.
I think it's worth distinguishing between "Computer Science" and "Programming".
One of the motivating examples in the OP is "conversational programmers". The OP defines that as people who want to be able to have conversations about programming stuff, but won't program. -- For those people, sure, assuming a heavy technical/math background when teaching programming isn't helpful.
But many topics in Computer Science heavily rely on applying math. (3D rendering, Computer Vision, Machine Learning).
I learnt BASIC mostly by myself when I was in elementary school (because programming computers must be cool, and I was sure I could do it).
I've taught myself computing concepts all my life.
I taught myself Python (in earnest) while in MBA school (because I believed spreadsheets needed to die and Python would be a good spreadsheet killer).
I've been teaching Python to other students in the last few years. It's usually their first programming language (if not, R probably was).
I've always believed in being a computer language polyglot, if only to be able to communicate with others from other languages and to learn what those languages have to teach us, but I'm also of the opinion that one should stick to one's strengths.
I would agree that we should think the initial programming language doesn't matter - if we're simply looking at whether we can communicate or translate fundamental mathematics of computation.
However, when a student is an adult trying to get immediate value from the learning (instead of the gratification of increasing pure knowledge, like I did) their choice of language matters greatly.
My first recommendation to anyone asking me is of course Python, due to broad applicability (which means learning one time a language that can be used to scrape the web, provision a restful API, or do complex data analysis), license (free to use), cross-platform compatibility (MacOS, Windows, Linux), popularity (which leads to ease of getting help), all leading to a good reputation for readability and maintainability (which is encouraging broad industry adoption meaning more jobs for Python programmers).
Of course, for specific applications, the recommendation may be different, such as JS or TypeScript for front-end development, or C or Rust for low-level programming, or Lisp, Elm, or Haskell for functional programming in their respective domains.
Further, Haskell and Elm have greatly informed my approach to object oriented programming - I feel like they have made me better and made my Python more robust and straightforward. I didn't learn them first though. In retrospect, I feel like I could have easily learnt them first. I wonder how I would look at my Python if I had done so - what would indeed be different? I don't know.
But I do know, for pragmatic purposes, first programming language does matter greatly.