
Ultimately it seems strangely exclusionary to draw lines in the sand like this. Not everyone who learns how to program needs to A.) be god's gift to the science of computing or B.) have the same level of understanding/fluency in what they're doing.

It's like saying the only way to teach any amount of a language is to require six months of total immersion. There is value in someone taking one year of a language, learning a few phrases, and moving on. Ultimately, people have practical and immediate concerns, like getting paid and putting food on the table. We aren't all programming for the same reasons, and sometimes getting the job done is all people have the time or even the inclination to do. And we shouldn't judge them for that.

I'm not suggesting that he is entirely wrong, but reading things that sound like proselytizing just makes me sad. There is so much room to educate and come to mutual understandings. Some people might think they've found a better route or a deeper understanding, but treating the rest like idiots is the best way to make sure they never get your message.




I like the analogy that English writers don't have to be at Shakespeare's level; even being able to write a shopping list is very useful. Hence it's a good idea to teach everyone reading and writing at school.

I think the programming equivalent of a shopping list would be something like a simple step 1, 2, 3 cron script, or if-this-then-that event handlers. That only needs a very basic understanding of computing (and of how to find information effectively online), but it opens up lots of really useful automation, e.g. backing up files, sending an email (e.g. a notification/reminder to ourselves), etc. Even without some automated API to use, a sequence of 'print' and 'sleep' commands can be useful to help ourselves/others perform some common task, e.g. a recipe which tells us when to turn down the oven, when to put on the pasta, etc. Hence I think it's a good idea to have simple 'learn to code' classes in schools, whilst more advanced classes would be optional for those wanting to specialise down that path.
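A minimal sketch of that kind of print-and-sleep "recipe timer" in Python (the steps and timings here are just made-up examples):

  import time

  # Each entry: (minutes to wait before this step, instruction to print).
  steps = [
      (0, "Put the roast in the oven at 200C"),
      (40, "Turn the oven down to 160C"),
      (30, "Put the pasta on to boil"),
      (10, "Drain the pasta and take the roast out"),
  ]

  for minutes, instruction in steps:
      time.sleep(minutes * 60)  # wait until it's time for this step
      print(instruction)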


I would go even further and say that the most important aspect of teaching programming in schools is simply exposure. Most people don’t stumble upon programming in their day-to-day lives, especially now in the age of iPads and other locked-down devices, so it’s important that the subset of the population that would be good programmers is given the opportunity to at least try it out early in life.


I'd argue that they do, and that will become more and more true. Things as simple as an Excel file, or writing simple email filtering rules may expose you to _some_ programming.

As a side note - I believe this is going to become more and more true, especially among knowledge workers - I'm somewhat bullish on UiPath's low-code app platform [+]. As programmers we understand well the benefits (and pitfalls) of automating our menial tasks; if you can lower the bar so that most people (like assistant managers, receptionists, call center operators, etc.) can do that, you suddenly start exposing much wider audiences to programming. It's a tough problem, but I believe RPA might just be the right foundation to enable that sort of thing.

[+] Disclaimer: I work for them so maybe that's natural :)


My experience with exposure to Excel is that it isn’t used in a way that exposes all its users to programming experiences of any sort.

Even presently I’m staring down the barrel of a project that involves what is looking closer and closer to NLP to simply parse cell values to map them to positions in a layout and character styles in a given InDesign template.

The people responsible for inputting the original data do it in what are basically paper-napkin thoughts that have to be interpreted accurately for fear of running afoul of advertising regulations (should a value end up incorrect anywhere in the process).

So I’m only speaking anecdotally, but applications as seriously powerful and user-friendly as Excel do as much to obscure as they do to expose people to programming/computer science. I mean, Excel does work so well for so many—even unintended—applications that the users expect unlimited magic in all things computer and fail to understand why new limitations may be imposed when they want their workflow simplified and it can’t “just work” no matter what.

(Forgive any latent frustration making its way into that comment, the project I’m referring to has felt nothing short of Sisyphean while a new application in the space of it would be quite simple to develop)


You’re in a bubble. Most people don’t use Excel, much less write formulas, and they certainly don’t write email filters.


I don't think people will get more exposed to programming as we go on; currently we are moving away from personal computers towards dedicated app machines like smartphones, where you don't see files and such.


The best cure for optimistic thinking like this would be simply to try teaching some of that to a group that is _not_ self-selected. It becomes clear very fast how non-trivial the "trivial" concepts are.


Not only that. The very concept of programming is not something most people even grasp.

I stumbled upon a nice way to show people what programming is the other day. However, it requires the person who asked the question to be engaged. If they only see programming as a trade of wizardry and a way to make money they may tune you out.

But my latest analogy for people who ask 'what is programming' is to put a rock on a table and say 'teach this rock to go to the store; write down every single step, including getting off the table and opening the door'. They may get a bit angry, and I will tell them a computer is no smarter than a rock and it will do exactly what you tell it. You need to be very explicit in what you tell it. If something goes wrong, like you end up in the bathroom instead, something is wrong in the instructions and you have to fix it; the rock can't. Now I am sure this analogy will fail on someone at some point, but it has worked on the few people I tested it on. My job is to pretend to be very stupid and then ask 'what sort of instructions do I need?'.
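To make the analogy concrete, here is a tiny made-up Python sketch of what those "rock instructions" end up looking like (all the step descriptions are invented for illustration):

  # A hypothetical "rock" program: every step has to be spelled out,
  # and a wrong or missing step sends the rock somewhere else entirely.
  def step(description):
      print(description)

  def go_to_store():
      step("climb down from the table")
      step("walk to the front door")
      step("open the door")  # forget this line and the rock walks into the door
      step("walk outside and close the door behind you")
      step("turn left and walk two blocks to the store")

  go_to_store()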

That is just the simple style of programming. Add callbacks, async, events, injection models, and SQL in there and people nope out.


It reminds me of how Feynman described a computer as a superfast filing system: https://youtu.be/EKWGGDXe5MA

Though programming is less about precise, detailed instructions and more about gluing together mostly existing components in a manner understood by your team.


I had an Elementary school teacher that gave us a similar exercise to the rock thing. She said something like "imagine I'm an alien, and tell me how to make a peanut butter and jelly sandwich". I got so caught up in sharing my special technique I had at the time (extra glob of PB in the middle after covering both sides) that I just got a bit deflated when she stopped me to ask what I meant by what seemed like a basic step. (It was something like "okay, I have the bread", and I hadn't asked her to pick up the knife before applying the peanut butter)

Oddly I don't remember the point of the exercise at all, or if we did anything building on it. We certainly didn't do any programming.


As my dad says, computers are extremely-fast idiots.


I think teaching anything to a not self-selected group is excruciatingly hard. It's the same with foreign languages, math, biology etc.

An easy way out that schools use is to teach things that can be memorized like laundry lists. Name the 5 components of the <thing>. Name <famous person>'s 3 contributions and write a sentence to each. Specify the formula for <physicist>'s law.

As you say, if you try to teach a non-self-selected general population computing, you'll see how nontrivial even the mental model of files and folders etc. can be. But these are people who successfully do complex jobs in their lives. Sometimes even some sort of STEM-related or technical/engineering job, just not computer related, like a car mechanic. I don't think it's some sort of brain-compute-power/intelligence issue. A car mechanic uses similar brain pathways to "debug" and repair a car, keeping logical dependencies in mind etc.

Perhaps it's about the transition from physical to the entirely abstract/symbolic world.

It may be about a mental defense against the low-status of nerds, as in "I'm not like them (thank God, haha, I have a life), so I can't do this...".

Again, forcing knowledge into someone's brain is extremely difficult. They need to cooperate by their own will, otherwise you just get memorized lists that are forgotten after the test.

People learn languages by immersion when they want to interact with people around them, but when people are forced to live in a country they don't like, they can go decades without properly learning the local language, despite going shopping etc. and living a normal life.

I know relatives that run to me with all sorts of IT tech support issues, pretending they just cannot solve it. But when it's actually about something they really want done, like watching a movie or an episode of their favorite TV show, they are suddenly able to figure out all the details of torrenting.

Not that this is some novel insight, but motivation is key. If you have an actual goal in mind that you really want to achieve (like watching the next episode of a show), you will push through the discomfort and uncertainty of learning how to get there. If you want to learn a language because you love a culture, or you need it to talk to clients at work to get a promotion etc., you will learn much better.

Some charismatic teachers can create motivation where there was none. This sometimes ends up as "learning to please the teacher", but sometimes a single teacher's influence sets someone on a whole career course.

Learning basic coding (perhaps not necessarily becoming professional devs) should be possible for a large part of the general population if they have direct use for it today ("I cannot do <desired activity> unless I figure this out") outside of made up tests and teacher-pleasing.


"If someone wants to learn something you cannot stop them and if someone does not want to learn something you cannot force them"


> I like the analogy that English writers don't have to be at Shakespeare's level; even being able to write a shopping list is very useful. Hence it's a good idea to teach everyone reading and writing at school.

Language "skills" are strongly social. Literary acclaim falls squarely within celebritydom. Your ability cannot be too far off from the mean or else few will understand you. Though there may be some unmeasurable internal utility as a thinking tool.

We learn to read and write to do it as well as our peers do. This covers 99% of the utility.

There may be a social argument for programming, software is eating the world and having some shared understanding may be useful. But clearly technical ability is much less socially bound. The point at which you stop getting extra utility is very far from the mean.


You want those lines though: glue software engineers tend to get really bitter when they get quizzed on computer science stuff, so it is in their best interest to draw the line themselves instead of arguing that they should have the same title.


This highlights something interesting for me: I've literally never heard anyone "arguing" that they should have the title of "software engineer" or "programmer" or "software developer". I don't know the history of it, and it could easily just be something that I haven't personally witnessed. Ultimately I fall into the camp of someone who would do fine getting quizzed on the computer science stuff, but at the same time it is ultimately about adding value (at least, when we're talking about someone paying you to work for them).

There aren't two camps: "glue software engineers" and "computer scientists". There is a spectrum of people with varying abilities in a wide array of sub-fields.


>I've literally never heard anyone "arguing" that they should have the title of "software engineer" or "programmer" or "software developer"

this kind of discussion only happens on the internet, because in reality nobody gives a *, especially since the only place that seems to distinguish those titles is the USA

coders/developers/programmers/SEs/programming ninjas/shamans/system necromancers

all artificial titles


I used to work for a small company that developed a cellular radio network design application and also had a radio network engineering consultancy arm. We renamed the product and changed our job titles to include Engineer and Engineering because our sales people said it enabled them to charge more and sell up our services more easily, and I believe them. I was on some sales meetings to provide technical background so I could see they knew what they were doing.

It may seem silly, but words mean things and you need to clearly and effectively articulate why your product is valuable, or what value you bring as an employee. Effective communication is really important.


The entire sector is somewhat fraudulent in its nomenclature, because it’s still a pretty young field. We’re basically going through what mechanics and physics went through about 250 years ago. At that time there were no legally-defined engineers; anyone who could do reading/writing/math could develop entire factories.

At some point it will have to clean up its act. I expect it will take another 50 years or so, as employment demand stops growing and eventually falls, at which point it will make sense for incumbents to raise formal barriers to certain titles.


One of my girls is thinking of doing Computer Science, so we were checking out university courses, and there are BEng and MEng computing courses with professional accreditation here in the UK.


I do understand it from a marketing standpoint, but when we talk between people from this industry, e.g. here, it's pretty weird.


I use "nerd" as my job title.


There is far more to writing correct code than knowing the contents of undergraduate computer science inside and out; there is an entire world of software engineering where undergraduate computer science theory is damn near meaningless.

I mean, I worked at a medium-sized medical device company (10,000+ people, considered the gold standard of their industry in many respects) where you would be hard pressed to find a single person who wrote a lot of code and had anything but a Physics, EE, or CpE degree; I honestly can't remember a single person who had a CS degree, despite being on an algorithms team.

I can't even remember a single person on the algorithms team at the scientific device company I worked at who had anything but a PhD in Physics.


I agree with the first paragraph.

The second and third paragraphs are alarming. Anti-elitism is never a good thing in actual technical fields. Of course, the lead may be from outside CS, but to say that not a single CS person or expert in embedded systems and operating systems was involved in the design and testing of a pacemaker is not very confidence-building.


> Anti-elitism is never a good thing in actual technical fields

I think it's not anti-elitism but just elitism.

Outside the CS bubble, being a PhD physicist has more prestige and status than being a PhD computer scientist. A physicist is universally recognized as a scientist, nobody would ever question that. A "computer scientist" is not universally regarded as such. I'm not arguing whether it should be, but it's clearly not universally like that in people's perceptions.


> Outside the CS bubble, being a PhD physicist has more prestige and status than being a PhD computer scientist.

I disagree with this take. Just as there is more “applied” physics and “theoretical” physics, there is similarly “theoretical” computer science and more “applied” computer science. The former camp is similar to theoretical physicists, as many of them are mathematicians with a computer science bent. Even within the “applied” CS camp you have experts in number theory and algebra who apply their skills to cryptography, whereas in the “theoretical” camp you may find theorists using mathematics like homotopy type theory, category theory, and the like in their research. Number theory, graph theory, combinatorics, abstract algebra, category theory, and the like can get really hardcore in the pure math world. Similarly, theoretical physicists also have to know a lot of mathematics.

I think your picture of what a computer scientist is and does is a bit removed from reality, and PhDs in CS are highly respected (I didn’t even mention the folks working in AI or quantum computing or the other important areas of the field).


I'm talking about the perception by general people like business managers and HR etc.

I think it's silly that there's this purity contest or practicality contest going on. I studied CS but at a more engineering focused university, so the curriculum was closer to electrical engineering than to math and physics. Many engineering profs had a weird distaste for physicists/natural scientists and vice versa. They were both "elitist" just from a different view of the world.

I like Feynman's take on this kind of bickering among fields: https://youtu.be/f61KMw5zVhg?t=137


CS is a bit weird in that, partly for historical reasons, it straddles science and engineering more than most university majors. So at some schools, it's closely tied to the math department while at others it's associated with electrical engineering. Which unsurprisingly affects the curriculum in various ways.

(And, of course, there's also the matter of how applied the curriculum is--i.e. software development vs. underlying theory.)


As it is a subset of math, CS belongs in the Math Department for better reasons than historical ones. It is too easy to mistake CS for what it is not, because its name includes "computer." As was suggested to me a long time ago, it is easier to understand what CS is if you instead think of it as "Reckoning Science." The computer in Computer Science is "one who computes or reckons." What also helps conceptually is the famous analogy that a computer is to a computer scientist what a telescope is to an astronomer. Astronomy is not the science of telescopes, and likewise, Computer Science is not the science of computers, nor even "computing" as the word is most commonly understood (as "using a computer").

Programming is a subset of CS, and Software Engineering is elevating that part of CS to match the monumental interest it attracts, to the extent of placing it fully into the Engineering Department.

Some universities have CS degrees that are merely programming degrees compared to the better CS programs, which are usually made up of an equal amount of strictly advanced mathematics courses and CS courses which incidentally may involve programming, but learning to program whatever is merely a requisite of that course. CS is relative to Physics as a well-exploited tool. A lot of physics wouldn't be possible if not for computer science.

Whatever the job title may be, programming is an occupation, and salaries can range from median to lucrative. With the most lucrative salaries, programming is often a small part of a larger solution, but a fully necessary feature of the service of solving some big problem or related group of smaller problems.

When you think of a computer scientist, you should not think of a coder; coding is a tool they may exploit to achieve some goal, and that goal may or may not be entirely dependent on what is actually the telescope, an underlying physical computer. The really exciting rock 'n' roll computer science seems to be anything whiz-bang: graphics and games, digital audio processing, AI, etc., and whatever language skills support those, so it is easy to confuse being able to parse and generate various amounts of code with what all that coding work is helping to accomplish. IMO, weather modeling, modeling anything, satellite tracking, as well as informatics, are massively more interesting applications of CS. Help Desk, Desktop Support, Networking, Programming, Database construction and maintenance, and the rest of IT are all narrow and rather droll practical applications of CS; they are entirely practical, they can pay well, and a lot of the work is reasonably important (hospitals and air traffic control; really, everything relies on computers). Web design is usually kept within a company's graphics or marketing department, yet many designers have escalated their job title to "developer."

Classifications can blur because CS is so damn useful, even without computers, but CS is not just the one thing it is most popularly applied to. It is all the things.

I think of a legit lettered and experienced computer scientist as the ultimate problem-solver, and they only add a computer as needed.


I appreciate the perspective and agree with a lot of what you wrote. That said, software is not some abstract thing that exists (or at least exists usefully) outside of the context of computer hardware--which is clearly in the domain of electrical engineering.

That can be less true today given the additional layers of abstraction we continue to pile on in general. But I'd still argue that computer science is closely tied to the hardware that computer science concepts as implemented in software runs on. And therefore, it can make sense to lump it in with the engineering and, specifically, electrical engineering.

(Certainly my opinion is probably flavored by the fact that, where I got a different engineering degree, electrical engineers and CS majors typically get the same Bachelor of Science in Computer Science and Engineering degree.)


I actually studied "Computer Engineering" (rather, literally "Engineering Informatics") at a technical university (with other prominent programs including electrical, civil, mechanical, and chemical engineering). We did learn all the algorithms and data structures, complexity theorems and proofs, and lots of math rigorously, but the curriculum grew out of electrical engineering, so we were also close to the metal at the same time.

We took logic design in first year, flip-flops, half-adders, built projects with such stuff, learned about analog-digital converters, the Intel 8085 architecture. Physics, to understand electricity (Maxwell et al.) and circuits. We learned assembly, C, system programming, resource management, paging algorithms, scheduling, filesystems (following Tanenbaum), we learned some Verilog and VHDL, but also graph theory (with proofs, plus applications to VLSI routing), group theory, but also computer graphics and associated data structures, like octrees. We learned control theory, signal processing theory and audio and image processing. But also network protocols, TCP/IP, ARP, exponential backoff, Ethernet frames etc. Databases, normalization etc. Compression algorithms, cryptography, Reed-Solomon code and its use in CDs, similar codes in RAID. Public-key crypto theory with proofs but also its use in practice in SSL. Backups, differential and incremental, practical stuff like calculating with mean time between failures, etc. Understanding hard disks, like the plates, sectors etc. to understand delays and better seeking algorithms. But also GPU architecture, programming GPUs through shaders, breaking down math problems in a way that maps well to CUDA etc. (before the deep learning craze, but when GPGPU was a hot new term). Java, C++, UML. Machine learning, evolutionary algorithms, agent and voting systems.

We don't have a good vocabulary. Are the above things all CS? Or some are engineering? What does informatics even mean? Certainly a lot of the above is intimately tied to computers as physical objects with timings, latencies, voltages, not just to abstract Turing machines. And I wasn't raised to be ashamed of that or to find that dirty. Nor to be bitter about having learned about red-black trees or the proof of the five color theorem or Cauchy's integral theorem or the simplex method for linear programming or linear algebra or the conjugate gradient method etc. It's possible to have the right blend of the abstract and the concrete.

And how do you learn "problem solving" if not through actually working with things like the ones I listed? Why this distancing from concrete metal-and-wires computers as opposed the pure mathematical formalisms? Is it because it's seen too close to blue collar, wrenches-and-sweat-and-grease work? That the image of a scholar sitting in an armchair is nobler? You don't have to become a help desk technician or a person plugging in the Ethernet cables at data centers, just because you've studied the concrete technical aspects.

It sounds as if medical practice were low status and only biological research had prestige. Or if practicing law was too blue collar and only things like abstract constitutional law theory was something of high worth.


When a rather large company employs either only people with a CS degree or only people without one, a good working hypothesis is that bias is going on in hiring.


I don't have a computer science degree either. However, I do know computer science fundamentals better than most people with computer science degrees. A degree is neither evidence of skill nor evidence of absence of skill.


In my experience, embedded or control system software is more about the thing being controlled than anything else, so that would seem natural?


> glue software engineers

Where do you put the line between them and “purist” SWEs?

Even things like

  #include <stdio.h>
already make you a gluer.


It isn't possible to draw a hard line, no. But people tend to identify themselves as one or the other; just ask them whether the technical or the social is the most important aspect of their work. Gluing things together requires you to communicate more, like adding a library, depending on an API, or reusing code from another team. Building things yourself is just code: fewer questions, so less social work but more technical work.



