Hacker News
Eliminate the Computer Science major (bendmorris.com)
42 points by bendmorris on Nov 27, 2010 | 62 comments



I'm getting my CS degree because I want to know and understand computer science. This article is just wrong in so many ways.

The author may be surprised to know that just because most of us CS majors will become programmers doesn't mean that we all want to take a "Programming" degree that would probably boil down to vocational training. There's nothing wrong with having vocational programming courses, but I don't know if universities are where they belong.

Secondly, there is a false dichotomy here between "programming" and "theoretical computer science." Theoretical computer science is merely a part (albeit an important one) of the field of computer science. There are also fields like operating systems, databases, networking, graphics, and probably more. All these fields overlap with actual programming and with theoretical computer science, but aren't solely in one or the other.

As an aside, I think the term "theoretical computer science" is often applied too broadly: I wouldn't consider things like algorithms/data structures, AI methods like image recognition and data mining, or multiprocessing to be all that theoretical, since they don't require mathematical rigor to understand or to produce useful results.

My Algorithms and Data Structures class didn't have much math at all. Complexity theory (edit: and algorithm analysis) did require brushing up on some summations and limits from calculus, but it was taught from a very practical perspective. My Languages and Machines class, ostensibly the most "theoretical" of all, has been my favorite class, and other than some basic digital logic it had no math at all. That's not to say it wasn't formal or rigorous; it just wasn't what I would consider purely theoretical.
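To make the "summations" point concrete, here's a minimal sketch (my own illustration in Python, not from any coursework) of the kind of analysis involved: count the comparisons a quadratic sort performs and check the count against the closed-form sum.

  # The inner loop runs (n-1) + (n-2) + ... + 1 = n(n-1)/2 times,
  # which is the summation behind the O(n^2) bound.
  def selection_sort_comparisons(a):
      count = 0
      for i in range(len(a)):
          m = i
          for j in range(i + 1, len(a)):
              count += 1
              if a[j] < a[m]:
                  m = j
          a[i], a[m] = a[m], a[i]
      return count

  print(selection_sort_comparisons(list(range(10, 0, -1))))  # 45 == 10*9/2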

The author claims that CS students don't plan to be computer scientists, but rather programmers. I am and will always be a computer scientist, and I will probably work as a programmer for most of my life. I see no contradiction or foolishness there.


^ Some more thoughts:

The author implies that this gap between real-world skill and the related undergraduate program is unique to computer science, but certainly this is not the case.

My father is a high school History teacher. His degree is in History, not "High School History Teaching." Sure, many History majors probably go on to teach History. Sure, they probably learn way more "theoretical stuff" about the US Civil War or the Renaissance than would ever be covered in a high school class.

In re-reading the article, I'm now confused about exactly what the author is claiming or proposing. He says that CS programs are dumbing down the rigorous stuff because students want to learn modern programming, but he also complains about using "easier" programming languages. I'd like to interject and say that "easier" languages are probably easier because they're better for the task at hand. Language/compiler/interpreter design has come a long way, and I have no problem with moving away from 40-year-old languages and learning 20- or 10-year-old languages. I don't think scientists or employers will be impressed by your ability to solve a problem in a difficult manner rather than an easy one.

As for his argument that students hate the "theoretical" stuff and just want to learn to program, that's not true. I go to Missouri State University, not at all known for its CS department, and even students that don't do very well often list classes like Discrete Math or Languages and Machines as their favorite courses in the curriculum.


I don't believe that all CS students plan to be something other than computer scientists, just that most of them do. I believe people like you to be the exception. That's why I don't really believe in killing the theoretical side (as the title might suggest) but in splitting off a vocational track that would be better for most students. I think it would be better for people like you as well, because courses could be more difficult and focused.

There is overlap between the two proposed split degrees, just as there's overlap between, for example, engineering and business. That's why my university has separate calculus and statistics classes for each. But everyone knows that if you really want to learn calculus, you should take the course for engineers, not the course for business majors...


The OP's complaint seems to be mediocre students, not the BSCS. Don't eliminate the BSCS, just have higher standards for students.

I'm just finishing up a BSCS, and frankly the CS stuff seems pretty helpful overall. Algorithm design and analysis builds up my problem solving chops, AI teaches me interesting techniques, and automata and languages gives me a good theoretical basis behind development folk wisdom like "you'll never have sufficient test coverage to get rid of ALL the bugs" and "you can't parse HTML with regular expressions". Compilers and operating systems give me great examples of how to design and structure a large system.
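As a minimal illustration of that last bit of folk wisdom (my own sketch in Python, not from any of these courses): regular languages can't count nesting depth, so a regex grabs the wrong closing tag the moment tags nest.

  import re

  html = "<div>a<div>b</div>c</div>"
  # Non-greedy matching stops at the first closing tag it sees,
  # pairing the outer <div> with the inner </div>.
  print(re.search(r"<div>.*?</div>", html).group())
  # -> <div>a<div>b</div>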

On the other hand, the "software engineering" courses I've taken seem to be teaching little more than 1990s-era process and testing methodologies. My school has an apparent fondness for teaching RUP, a methodology invented by IBM[1] to sell CASE tools. The best explanation I've gotten from instructors about this is a combination of "defined process models give us something we can write exams for" and "students need to be familiarized with formal process models in case they run into them in industry". I won't say I've learned absolutely nothing from these classes--there's some good stuff in the testing class about how to design tests[2], and the first software engineering class assigned readings from The Mythical Man-Month (always worthwhile) and taught us through experience that going through the correct processes and writing up all the design documents IBM wants stretches a 2-developer, 2-week project out to an entire semester for 9 developers, and you still don't finish the project.

On balance, I'd say the people who know about computer science are university faculty, and the people who know about software development are developing software somewhere else, and I'd rather learn something useful about computer science from competent computer scientists rather than learning about software development from people who don't actually develop software.

[1] By "IBM", I mean "a company that IBM has later acquired".

[2] Especially if you're coding in Java, C++, or C#. The software engineering curriculum hasn't quite caught up to the idea that different kinds of languages are used to develop actual applications.


"The OP's complaint seems to be mediocre students, not the BSCS. Don't eliminate the BSCS, just have higher standards for students."

I think you've struck the heart of it here. Unfortunately, while "enforce higher standards" sounds nice, given the incentives involved, it's not realistically going to happen. There are plenty of smart students out there who would do well in a rigorous university program. There are also many less-qualified students who are willing to pay university-education prices for vocational educations as long as they get sheepskins after four years or so. The universities have discovered that they can cater to the latter without losing the business of the former, thus raking in considerably more cash, so of course that's what they're doing. This is probably more true for liberal arts programs than any other curriculum, but it's affecting everything.

This argument has come up a lot on HN lately, and it usually breaks down into two camps: those who think that the higher education system is being diluted to death, and those who say, "I just finished my degree, and I received an excellent education." Then come the anecdotes saying, "I've been trying to hire recent graduates, but they're all lazy, self-centered, entitled, incompetent, gormless whiners," while others respond with, "I just hired some recent graduates and was overwhelmed by how smart and driven they all were; I had a hard time narrowing the field down to make job offers." Finally, a bunch of cherry-picked studies and statistics are thrown out by both sides to support their positions.

Here's my personal concept of what's really happening: university curricula are being watered down to cater to less-qualified students, but this process is not YET having a perceptible impact on the most qualified students because they are doing what university students are supposed to be doing: extending their learning well beyond the classroom. So far the universities are managing to have their cake and eat it: they are selling watered-down diplomas to under-qualified students while simultaneously providing a good learning environment for qualified students. However, I think that those qualified students are still being hurt by the change. They may say, "I think I received a great education," but they have no way of knowing if their education would have been even better if it had been optimized for them instead of targeted at the lowest common denominator. Also, I wonder how long the universities can continue to run these schizophrenic curricula before something gives.

My understanding of the author's underlying intent was to find a way for universities to teach a rigorous curriculum to the smart students without driving out the less-qualified students, because any solution that drives out the less-qualified simply will not be adopted in the real world. His answer was to create two curricula: one for the people who actually belong in college, and one for people who just want a vocational education and a piece of vellum. You could apply this model to almost any field of university study. I think that there is a definite need for vocational education of this sort, and I don't think that universities should be providing it. However, they've discovered that they can make a lot of money doing so, so they're not going to stop. Instead of hoping that universities will abandon vocational education in favor of higher standards, we should instead look for a solution that segregates the two types of education so that universities can do both well.

I think a lot of people are focusing on his proposed breakdown, which I agree isn't quite right. He takes some subjects that do have a place in a university education and puts them strictly on the vocational side. However, his overall idea of accepting the reality that universities are providing vocational education because it is lucrative, and of separating that branch of the school from the traditional university education, is good.


In general separating vocational from academic studies is a good idea, even if we have to do it in the university environment. In practice, universities are conservative institutions ill-equipped to provide vocational education in software development; at least, mine is.

SE education seems geared towards churning out enterprise-type developers, and I guess there is a legitimate staffing need for enterprise-type developers in enterprise-type shops, but the CS-educated developer is perhaps a more important asset to cultivate, and the OP seemed to suggest that the CS-educated developer should go away entirely.


"Computer Science students, however, don't go to college to become computer scientists. They go to learn 'programming'."

When I finished my CS degree in 1988 I remember considering what it was that I had been educated to do (I think this is really only apparent at the end of a course) - it was pretty obvious: we had been educated to a level where the "natural" thing to do was continue on to do postgraduate research. The vocational element of our course was (thankfully) very low, there was a distinct (and at times alarming) theoretical component, but you also had to do a lot of development work. But developing stuff was a means to an end of demonstrating an understanding of the course material, rather than an end in itself.

So I went to university to learn about the academic field called computer science - not to learn to program (which frankly, isn't that difficult).

20+ years on I'm rather pleased with the course I did and the subsequent six years I did in academic research - I've used a lot of maths in various jobs (industrial simulation, investment banking, even more industrial modeling), it was a good course, with a good class and excellent teachers in a great location. I even met the chap who became the first investor in our startup through the course - he had graduated from it about 20 years earlier.

So yes, don't do a CS course to help you become a "standard" developer - it is almost completely irrelevant and will probably frustrate and confuse.


I decided to go into CS because I wanted to be a programmer, yet I wasn't one of those kids who programmed from an early age. I had no idea where to start and either didn't have the time or the drive to embark on a rigorous self-education spree. University seemed a good fit to me, even to learn to become a "standard" developer.

That said, three years later I'm nearing the completion of my CS degree, and I understand what computer science really is and love it. I have no plans for further time in academia. I'm also confident in my programming abilities, at least for an entry level job. Would you say a CS course was appropriate for me, or should I have spent the three years learning programming on my own (I had a decent tech support job after high school so the latter was definitely an option)? What if anything would you recommend to people who are interested in programming late in high school but don't have the advantage of having grown up programming?


Sorry, my post was a bit of an early morning coffee fueled rant.

I was really replying to the usual "I did CS and they didn't even mention RoR" posts that usually pop up when anyone mentions CS on a development oriented forum.

If you have done a course and appreciate what CS is and you are confident in your skills then it sounds to me like you have done the right thing. In my experience the absolute best people I have worked with have been fantastic natural developers who also had a good CS education - but this only pays off in technically difficult areas - either because the domain is complex and/or the solution is complex.


I also wasn't a programmer before I started university, and I haven't regretted getting a Master's in CS one bit. There's some sort of general aptitude for programming you need, and no university education in the world can help you if you lack that. But if you have that, a CS education is a lot of help in making you a good programmer. Not because of whatever languages are in fashion when you get the education, but because of the more timeless subjects: Algorithms, paradigms, complexity theory, and compiler theory. These are the tools you need to be able to quickly pick up a language, any language, and be reasonably good at it.


This debate will never end. A lot of it has to do with unrealistic expectations. Four years is not enough for most people to learn a field and become proficient at it. The good programmers I know started as teenagers. That means they had a head start before going to college. I believe this is representative, but I don't know of any studies on this.

As others have pointed out, there are now a lot of schools that have Software Engineering as a separate major. It's still debatable whether it's a better way to go. I'm not yet convinced that people graduating with a Software Engineering degree aren't just as green early in their careers as Comp Sci majors, due to my first point.

There are also those suggesting community colleges, with more vocational training, as the way to go. I have one friend who has gone that route and he's a pretty good programmer. He did start very early also. We were hacking code together in middle school.

Finally, it's really up to the companies hiring to understand this. I know it's common practice for engineering firms that hire engineers out of school to give them a year of training or pair them with a mentor. This is important.

I'm hard pressed to find a single job a person with an undergrad degree in anything could do directly out of college without some training. Good professional jobs usually require graduate school. Even accounting usually requires more than 4 years (an internship) before certification (same with engineering).

Edit: many people here on HN will say that they didn't need school to become good programmers. That is true, but how many years have these people been programming for? I bet it's more than 5 years on average by the time they're 18.


I did some extremely light programming late in high school on my TI-83 calculator (before I knew that gotos were considered harmful and before I would have understood that reference). Soon after high school (graduated in 2005) I decided I wanted to be a programmer, but I didn't really know what programming as a discipline or career entailed, and I honestly didn't know where to go to learn. In 2008 I enrolled at a university, and my first real exposure to programming was my university's intro to CS course, which of course used Java and covered basic OO and some basic data structures. I liked it and learned quickly enough.

Now, a few years later, I love both CS and programming, but obviously I am in no position to judge how good I am at either. Here's what I can say objectively, and maybe someone can give their opinion on how legit I am or how much my CS degree did for me: I took a national standardized CS test for to-be graduates (MFAT I believe it's called) and scored in the highest bracket >95%. I read, comprehended, and loved GEB. I read, mostly comprehended, and loved SICP and The [Little, Seasoned] Schemer. I had to give up on a book about Gödel's incompleteness theorems because I couldn't hang with the math. I did the Grepling challenge a while back in Python without cheating.

I've learned how to learn new languages on my own. I'm decent with Python, and have created some nontrivial web apps with Django. That said, I still make mistakes when programming that are embarrassing when brought to light, and I still discover new things that I feel I should have already known. I've never made a start-up or even contributed to an open source project, so I'm still probably near the bottom in this community.

I definitely needed a CS curriculum to become a programmer at all, and with any luck I'll end up becoming a good programmer.


You're doing exactly what you need to do: you are putting extra effort into it.

My feeling is the "bad" CS grads usually did not put in that extra effort. Reading GEB is rarely a required part of any CS degree, but reading it is an incredibly useful thing in my view. It's a thick book and takes commitment to read.

Side note: Check out this book (http://www.amazon.com/G%C3%B6dels-Proof-Ernest-Nagel/dp/0814...) on Gödel's proof. It's been updated by Doug Hofstadter, the author of GEB. I found it pretty good. Read it slowly, two or three times if needed. It will make sense.


Thanks for the recommendation. I've actually been looking for a new book.


The problem is that the Computer Science college curriculum (specifically the Bachelor of Science degree) has become the de facto requirement for software engineering careers. It's being pulled from different angles (toward a trade college degree in software engineering on the one hand and toward pure science on the other) and has failed utterly at satisfying either requirement. Ultimately the biggest improvement will come from splitting the software engineering training off on its own (whatever form that takes).


The article makes a false assumption: that it's possible to be a good software developer without understanding the basics of Computer Science (whether through university or self-education).

It's not, not even in the bread and butter application development tasks: imagine writing an accounting system without knowing how to accurately handle decimal values[1] or what a B-Tree index is or what a race condition is. I won't even begin to mention systems/library/algorithm implementation work.

Degrees that cover programming without computer science do exist, but they're intended for developers who want to transition to (people or project) management.

[1] E.g.,

  alex@jupiter:~$ python
  ...
  >>> 1-.3
  0.69999999999999996


> imagine writing an accounting system without knowing how to accurately handle decimal values[1] or what a B-Tree index is or what a race condition is. I won't even begin to mention systems/library/algorithm implementation work.

Are you certain that business systems are written by people who know all three of these things? My conjecture is that most practising business programmers will tell you that "libraries" and "frameworks" exist to solve these problems for them and that knowledge of the details is nice to have but not necessary for employment as a business programmer.

"Systems/library/algorithm implementation work" is another thing entirely, but I'm sure we can agree that at this moment in time, there are very few people doing systems work or building libraries compared with the number of people using them to build business systems.


>Are you certain that business systems are written by people who know all three of these things? My conjecture is that most practising business programmers will tell you that "libraries" and "frameworks" exist to solve these problems for them and that knowledge of the details is nice to have but not necessary for employment as a business programmer.

Are there programmers who write business applications without understanding and knowing these things? Yes, most certainly. Are these programmers working in businesses known for using software as a competitive advantage? Almost certainly no. With the exception of AWS, most of what Amazon does (and what made them successful initially) was business software: e-commerce, shopping carts, payments processing. Yet it's clear that they used in-house software development talent as a competitive advantage (the most obvious example is Dynamo's original role as a shopping cart engine, which allowed them to handle peak holiday traffic without declining sales).

Libraries and frameworks are great, but they're of no use if the application developer doesn't know they need to use them. An app developer will have to know that Double can't give him precise decimal math before he'll use BigDecimal. An app developer will have to know what race conditions are before he'll know where to use a ConcurrentHashMap vs. a HashMap vs. a synchronized block vs. a ReentrantLock. I'm not aware of an ORM that is sophisticated enough to tell you what sort of index to use (this may also be an omission on my part, and there may be ORMs and other tools that will make reasonable default choices and set up indices during the DDL phase).
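In Python terms (a minimal sketch of my own, mirroring footnote [1] upthread; the Java analogue would be Double vs. BigDecimal), the distinction the developer has to already know about looks like this:

  >>> from decimal import Decimal
  >>> .1 + .2
  0.30000000000000004
  >>> Decimal(".1") + Decimal(".2")
  Decimal('0.3')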

>"Systems/library/algorithm implementation work" is another thing entirely, but I'm sure we can agree that at this moment in time, there are very few people doing systems work or building libraries compared with the number of people using them to build business systems.

My perception is somewhat skewed: I'm in Silicon Valley and have worked at multiple Internet firms and at an infrastructure (security) vendor as a systems/library implementer (or you can say, "mostly userland systems programmer"). While it's obvious that there's lots of systems work going on at infrastructure vendors, I'd say that at least 50% to 75% of engineering in established Internet companies and leading start-ups in that sphere involves building tools, libraries, and implementing/fine-tuning algorithms rather than application-level programming. The application developers drive the demand for these tools/libraries/algorithm implementations and thus have to have more than "black-box" familiarity with them. Very rarely are people hired just for application development roles: knowing a specific library/framework well won't get you hired, as that may not transfer over to other positions in the company. If you look at the interview processes in these companies, you'll quickly see that they reflect that. The app to infra development ratio will certainly be different at smaller startups (more app, less infra), but the need to hire generalists with strong knowledge of the fundamentals is even greater.

tl;dr Let me say it again: good programmers should know the fundamentals. This doesn't just go for the systems/infrastructure developers, but product developers as well: you may still be able to get a job without knowing these basics, but you'll be "locked out" of the most lucrative jobs in a given industry (or even out of whole industries).


> good programmers should know the fundamentals. This doesn't just go for the systems/infrastructure developers, but product developers as well: you may still be able to get a job without knowing these basics, but you'll be "locked out" of the most lucrative jobs in a given industry (or even out of whole industries).

I agree with this statement, but I sing in the choir. I suspect that everybody else on HN sings in the choir as well.

Jeff Atwood once pointed out that he wasn't trying to write for his readers, he was trying to write for the millions of programmers worldwide who don't read any blogs at all. Most of them don't work in "The Valley."


> Jeff Atwood once pointed out that he wasn't trying to write for his readers, he was trying to write for the millions of programmers worldwide who don't read any blogs at all. Most of them don't work in "The Valley."

Agree and disagree. Yes, library implementers should think about application developers and the problems they're solving first and foremost. One of the most interesting developments lately has been the work on making concurrency accessible to application developers, which illustrates the "and disagree" part: I used the example of Doug Lea's beautiful work on java.util.concurrent to show that while an application developer (or even a systems implementer without a good cause, for that matter) shouldn't have to implement lock-free data structures themselves, they have to ultimately understand what lock-free data structures solve and don't solve in order to be aware of them and know where to use them.
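To ground the "what they solve" part in something runnable (a minimal sketch of my own, in Python rather than Java, so take the mapping to java.util.concurrent loosely): the thing that races is the unsynchronized read-modify-write, and locked or lock-free structures exist precisely to make it atomic.

  import threading

  counter = 0

  def worker():
      global counter
      for _ in range(100000):
          counter += 1  # read, add, write: three steps a thread switch can split

  threads = [threading.Thread(target=worker) for _ in range(4)]
  for t in threads: t.start()
  for t in threads: t.join()
  print(counter)  # often less than 400000: updates lost to the race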

Lastly, one of the recurring themes of articles I see on this site is about creating a "The Valley" elsewhere (and I don't just mean Seattle, New York, Tel Aviv, etc... - tech cultures in each are more similar to each other than they're different). Perhaps that's something to ponder: is there a culture of "cut and paste and clock out" or "build software that's an asset and not a liability"?


> developers who want to transition to (people or project) management.

And that's pure evil, for they will become managers who think they know what they are managing, but, really, have no clue.


Well it's better than 'social promotion', taking your most accomplished and most experienced workers and peter-principling them into incompetent newbie managers just because that's the next title in the series.


At the very least they can understand what they are managing.


The author seems to be claiming that higher-level languages are easier or less valuable for learning CS. I don't think that's true. Sure, it's easier to learn to write "hello world" if there's no boilerplate, but that just means there's less between the programmer and the algorithms. "Too easy" is not one of the common complaints about SICP.

Java, of course, has little place in a CS education. It's not even a good introduction to OO. Institutions that want to teach Java should probably call their degrees "Enterprise software engineering" or some such.


Slightly off-topic rant here on the topic of learning to program:

The first language I programmed in was Pascal (Turbo Pascal 5 and 7 to be precise), at the age of twelve. It was a great language for several reasons: it was mostly imperative (meaning I didn't have to try to grasp objects right away; there were OO extensions, but I didn't have to use them initially), statically typed (meaning I did have to learn about interfaces and implementations), and allowed for raw memory access (so you could learn about pointers and memory management) while having "real" arrays and strings (so you could learn these concepts and write basic programs before learning pointers).

Going from Pascal to C (at the age of 14) was easy for me, including dealing with the fact that arrays and strings (as my handle suggests) are pointers (I knew what pointers were and how to deal with them, while at the same time knowing what strings and arrays were for). Moving to C was a necessity, as I was finding myself writing far too much inline assembly without actually knowing assembly (Google "Why Pascal is not my favorite language" for the reason why), but it was much smoother than learning programming from scratch with C.

In college/university, the CS courses were all taught in C which, I think, gave me somewhat an unfair advantage over students who didn't program before coming to university (unfortunately, as a transfer student I was exempt from the "great equalizer" class which forced everyone to learn Haskell).

It's sad that schools these days face the choice of: teaching C, which forces students to deal with issues tangential to what they're learning; teaching Java or Python, which exclude very important aspects of programming and data structure implementation; or teaching a language which likely no longer even has a stable and widely available implementation but is more suited for the "pedagogic" role. Out of these choices, I believe C is the least wrong, but not the optimal, choice.


From what I understand, accreditation is the result of a mix of desires from the different companies that support the accreditation. Java exists as a major language because companies want to be able to look at a degree from a certain accredited college and trust that the degree means the person knows how to do {x,y,z}. If Java is required for accreditation, that means that companies want someone claiming a Computer Science degree to know how to work with Java.


I disagree that Java is inherently wrong for CS education. It ultimately depends on what you decide to teach using it (i.e., it's concepts that are being taught; the languages are only the carrier).

It's possible to teach courses in data structures, algorithms, and so on using Java, just as it's possible to use it as a gateway to other languages like Python and C.


Java puts boilerplate between the programmer and the algorithms and data structures without allowing direct control of the memory. Python, Scheme, Haskell and many others do it without the boilerplate, while C and any of many assembly languages require that the student learn what the hardware is actually doing.


It should be something in between. I have done my undergrad at Cambridge and I am now at Stanford, and the curricula differ a lot. Cambridge is very theoretical, and teaches a lot about core computer science - they love functional programming, discrete mathematics, denotational semantics etc. Stanford, on the other hand, is much more hands-on. It's more project-based, where you have to actually implement something. At the same time, I feel that students here learn less about the theory.

Both approaches have their advantages and disadvantages. Personally, I feel lucky to have the theoretical background as an undergrad and now be able to apply that knowledge as a grad at Stanford. I think there has to be something in between, and computer science as a major should not sit at either extreme (theory/application).

I do not want to see a CS major who has no clue about programming any more than I want to see a CS guy who lacks the theoretical understanding of his subject.


Computer Scientist from NTNU (Norway) here. During the first 3 semesters we have 5 pure mathematics courses, which cover ordinary and partial differential equations, Laplace, Fourier, statistics, numerical methods, complex numbers, discrete mathematics, linear algebra, and probably lots of other things which I've forgotten to add. We also have digital design, basic electrical circuits, and physics. To learn about algorithms and data structures, we have a course where "Introduction to Algorithms" by Cormen is the curriculum. So far, we've only had one course where we've learned Java, and that course mostly focused on object-oriented programming - not Java itself.

I'm not sure whether the computer science major is that bad over in the states, but if it is, maybe you should take a look at how we do it?


My CS program at Missouri State University isn't that heavy on mathematics (we just take Calc 2, stats, and discrete). We take digital logic and the normal kinetics and electromagnetism physics courses as well as a physics course on microcontrollers.

Our languages and machines course is excellent, as is our algorithms/data structures course (also using the Cormen text). The two intro to CS courses cover OO with Java, and also taught basic data structures like doubly linked lists and binary trees. Algorithms also used Java for homework assignments, but there was nothing specific to Java in the course. We do have a C++ course and a silly web programming course that covers html/css/js/php, but they're electives. The Database course is required and goes in depth on relational algebra. The Software Engineering course is required but I haven't taken it yet. Edit: forgot about the computer architecture course; it's required and had no math beyond arithmetic for calculating CPU efficiency with different pipeline sizes, etc.

I honestly think the anti-CS major crowd is misguided. There's no way I would have been qualified for an entry level programming job before college. I barely knew how to program, and more importantly, I didn't know how to learn to program better. Now I do.


I'm studying CS in Berlin and we have 4 mathematics courses in the first 4 semesters: linear algebra, analysis I + II, and stochastics. Besides stochastics, all of those courses are designed for engineers. In the first 4 semesters (basic study) we have 4 theoretical CS modules: basics of algebraic structures, automata and complexity, logic and specification, and semantics.

The other 2 main modules are technical CS and practical CS (methodical and practical basics of CS). In "practical CS" we learnt a functional programming language (1st semester) and Java (2nd). In technical CS: MIPS assembler (2nd), then C and ARM assembler were a requirement in the 3rd semester.

So, although if you just look at the modules the practical part takes up the biggest chunk (ECTS), it's only about a quarter of the studies.


At my school, we have three required math courses: Calc I, Calc II, and basic (non-calculus-based) statistics. There's also one discrete math course aimed specifically at CS.

That's it.


That's more or less similar to my course layout at the University of Central Arkansas 12 years ago.


I could've used more theory in my CS major, but in fitting with the article, I've learned much more outside of class than inside. My profs have been very helpful, and have absolutely aided in this, but few classes have provided me with anything I didn't pick up in 1/10th of the time prior to the class. Though that's likely true for anyone who's very interested in their field.

In defense of the course, we have programming from the very beginning. I'd assume everyone who had gone through even a couple classes could solve FizzBuzz quickly - a fair number of the recent applicants where I work couldn't, despite nearly all of them carrying degrees (I graduate soon).


I hope that someday in the not-too-distant future programming will be considered the new literacy and will become a foundational requirement for every discipline and major, just as writing is today.


Is architecture required for everyone, though we all live in houses? I don't think that will ever happen.


I think almost everyone would agree with the crux of the argument, but I don't think CS degrees need to be eliminated. I think there's room for reform.

As in any other engineering discipline there are two distinct aspects in Computer Science. One is the "more theoretical" part, which deals with topics such as complexity, AI, operations research, language construction, databases, etc. Then, there is the "more practical" part, which deals with the application of the above theories to produce real solutions (what we usually refer to as "programming").

I think universities are clearly well-equipped to cover the first part. It fits well with current teaching methods, and there's no reason to believe the courses would be substantially different from courses in mathematics.

The second part, which is what most students arguably want to learn how to do, is where I think universities really struggle. Firstly, many academics in the field aren't really qualified to teach the stuff since they have very little, often outdated knowledge of the industry. Secondly, to effectively teach "programming" you have to take on board a lot of subjective criteria. What makes good design or good code is inherently ambiguous. To my mind, effective "programming" courses have to be a lot like writing courses, where teachers lack objective criteria and teach their students best practices based on their experience of reading and writing good code. This is something CS academics are reluctant to accept, but if they did we could finally have effective CS education.


This is actually a problem with the way all engineering is now taught: an overemphasis on engineering science at the expense of engineering design. Software engineering may be worse off than most, because the term "Computer Science" implies that it's all about the science and there's no room for design at all. (Disclaimer: I studied Electrical Engineering, so I can't comment first-hand on that.)

The best Software Engineering programs (like the one bartman describes here: http://news.ycombinator.com/item?id=1944707) seem to at least bring software engineering back in line with other forms of engineering in this respect. The worst "Software Engineering" courses tacked on to Computer Science degrees (like the one philwelch describes here: http://news.ycombinator.com/item?id=1944735) merely teach wrong ideas about engineering to try and get the student into a position of being the drawer of UML diagrams rather than the programmer who translates them into code.

It goes without saying that I completely disagree with the author that there should be a degree in 'programming' since this is just training people to work at the other end of the UML diagram pipeline, a hierarchy which shouldn't exist.


“Computer Science is no more about computers than astronomy is about telescopes.” –Edsger W. Dijkstra


I hate to blaspheme Dijkstra, but I've never quite agreed with that quote. I suspect it's used out of context. Many subfields of computer science deal directly with making physical computers better: faster, more responsive, etc. I'm thinking of things like virtual memory systems, page replacement algorithms, multiprocessing, interrupts, driver abstraction, networking—basically everything you learn in the OS Theory course in a CS program.

Sure, all these things can be modeled "on paper" as a mathematical abstraction, and a human with a list of rules, a pencil, and a roll of paper can compute anything an Intel CPU can, but it's hard to ignore the fact that digital computers do the vast majority of the computation which CS is concerned with studying.

You don't have to know much about astronomy to make or improve a telescope. You have to know a lot about computer science to make or improve a computer.


Computer science helps in improving the stuff running on a computer, but it does help elsewhere, too. Examples:

1. When sorting cards by hand, I find myself switching algorithms depending on the size of the data set.

2. Caching/page replacement strategies do apply to the case of a scholar writing a book. His desk will not hold all his reference works, so some will have to move to the table next to it, others will be closed and moved to a bookshelf, possibly in a different room (see the sketch after this list).

3. Without the theory behind multiprocessing, of course, philosophers couldn't dine together :-)
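Number 2 is close enough to the textbook algorithm to write down. A minimal sketch (my own, in Python 3; the class and names are made up for illustration): the desk is an LRU cache, and the book untouched longest is the first one moved to the shelf.

  from collections import OrderedDict

  class Desk:
      def __init__(self, capacity):
          self.capacity = capacity
          self.books = OrderedDict()  # insertion order doubles as recency order

      def consult(self, title):
          if title in self.books:
              self.books.move_to_end(title)      # freshly used: now most recent
          elif len(self.books) >= self.capacity:
              evicted, _ = self.books.popitem(last=False)
              print("to the shelf:", evicted)    # least recently used goes first
          self.books[title] = True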


True. I'm not suggesting that CS is only about computers, but computers certainly play a bigger part in CS than telescopes do in astronomy.


"Computer science is no more about computers than architecture is about construction"


I would like to move computer science back to being strictly part of mathematics and make a computer science major's curriculum contain ⅓ of programming and ⅔ of CS theory and mathematics.

Then we would need second-level/bachelor's CS degrees in polytechnic/vocational schools where it would be ⅔ of programming and ⅓ of CS theory and mathematics.

Both would take at least four years and be much more merciless in letting students continue to the next year, i.e. no dumbing down of the courses.


The university where I went, Purdue, has already had this distinction for many years. They have CS in the school of science and CPT (Computer Programming Technology) in the school of technology. It was common knowledge that if you couldn't hack it in CS you switched majors to CPT, which was focused on learning technologies instead of concepts. Same as how there is calculus for math majors and calculus for business majors.


Many schools around here already have a second major that focuses on the profession of programming instead of theory: Software Engineering.


I'd like to know what they replace the "theory" with. I can't imagine getting rid of the courses on digital logic, algorithms, databases, and language design. You could probably get rid of the normal math courses and just have a condensed "Math for CS" course as a prerequisite for the above-mentioned courses, since they'll almost certainly require some knowledge of propositional logic, summations, limits, etc.

As I understand it, it's hard to teach "real-world" programming, since a lot of that just comes down to learning the specific enterprise tools your shop uses. I can see covering debuggers and version control systems, since my CS program is sorely lacking on those two topics, and perhaps requiring or strongly encouraging internships, but other than that I don't see what a Software Engineering degree will provide that a CS degree does not.


I attend a school solely offering Software Engineering. For my BSc degree I had two theoretical computer science courses, digging deep into theory of computation and formal languages; two math courses (logic, algebra, analysis, very little statistics); and one semester of digital logic (all the way up from transistors to finite automata to microprocessor internals - very intense). Databases are offered as a specialized field of study if you're into that, with one mandatory course for everybody.

Compared to what I hear from friends in computer science programs we have around half the amount of math they have, the remaining basics are about the same.

The software engineering aspect shows in the many projects we do. In every semester we had group projects, both programming and software modeling, with at least 5 fellow students. The last year of studies focuses on a project with 4-8 people and an external partner who's the customer for whatever you do. We had two intense lectures on project management/planning of software projects that mainly focused on agile methods. One of these classes was accompanied by a software project of the entire class of 80 people: we ended up having 10 development teams, each doing SCRUM (including a SCRUM Master and Product Owner); the SCRUM Masters and Product Owners of these teams met and organized inter-team collaboration, and the Product Owners also interviewed potential customers (startups in the local region), gathered their requirements, and thus steered the whole project. This was an incredible experience - except that we used svn.

I am obviously biased, yet I feel the focus on group work taught me lessons that would take many years longer to learn in industry and will help me a lot in my future life. +1 Software Engineering

edit: Another observation that struck me was that we had about 25% people who never programmed before, 50% who did something in school and maybe a little in their free time, and 25% who've been into all that since they were kids or teens. Yet in the third year all of them worked full time on a software project and got along pretty well. The lack of years of experience was still noticeable, but rarely ever an issue in day-to-day work - they did work as good as or better than some of the longtime programmers, and had solid theoretical foundations in patterns, architecture, etc. Of course there are some who just struggled through, but it's still pretty amazing to me.


OK, I admit that sounds awesome. It actually sounds like you covered most of the "computer sciency" stuff my CS program covers, just in fewer/more condensed courses. Did you study operating systems in depth, like deadlock, page replacement strategies, file systems, etc? That's one of the big courses in my program, and frankly one that I think should be reorganized. Knowing your way around a Unix system is pivotal, but it's difficult to teach in one semester or in a textbook.

The group project stuff in theory sounds awesome, but I'm not very social with most classmates, even the CS people I see every semester, and I generally groan whenever I hear that there's going to be a group project that requires out of class meetings with random students.

All in all, I don't think my program and yours could exist distinctly at the same university. They would just be subtly different emphases, and it sounds like the fundamentals would overlap so much that having two different majors would be mostly nominal.


I forgot about the operating systems - we did have that one too. It's split into two semesters, of which only the first is mandatory. We did all the synchronization techniques, page replacement, scheduling, memory management and general structure with assignments being coding on the Windows Research Kernel (which is surprisingly well written). I think the second semester talked more about file systems, but I didn't take that.

I agree that group projects can be really annoying; the main pain point seems to be the "random students". Over time (and projects) I found a group of people I could work and have fun with very well, so late night shifts weren't as annoying as they could otherwise have been.


The problem is, knowing how to program without understanding much computer science is useful only for the most basic of tasks. And I expect that the demand for some of those more basic tasks, or at least for people who know only a generic programming language (as opposed to something more domain specific), is going to dry up as the underlying problems are solved in more generic ways and higher-level languages become increasingly feasible for programmers with computer science backgrounds to use in production, without the need for vocational-training-only programmers to churn out their boilerplate.


My CS degree was a sandwich course (UK), I did 2 years at college, a full year working for an employer (of my choice, the college had the power of veto) and a final year back at college, I did a postgrad after.

The industrial training year was the only thing of real value. It taught me professional discipline and the importance of delivering what the customer wanted, and of finding out what they wanted if that wasn't clear at the outset. Everything computer specific I learned on my course (compiler theory?) was either irrelevant, outdated at the time, or became so shortly after.


My school created an IT program and funneled the practical/programming only students through it. Seemed to make sense to me. The IT program is huge now, and the CS program has remained about the same size. A lot of students will get a dual major just to pick up the theory, though.

I could do without the hyperbolic blog titles followed up with the same arguments about the CS degree.


Well, is programming even degree worthy? If you wanted to learn just programming then make it a 2 year diploma.


Probably a 10-year diploma, if you ask Norvig?


"A Biology major is training to become a biologist." Um, how many people get a BS in biology and become biologists? More often, they get jobs as food safety testers or as hospital techs.


Don't forget teachers. I bet a significant portion of all the majors the article mentioned go on to become teachers.


Let me put it this way: if you're really a passionate programmer, what you study is rather irrelevant. You're going to be working on one of your own projects anyway, whether you're majoring in liberal arts or computer science. I think what really matters is how much time you have left over besides your studies.


Computer science is bullshit, it should really be taught as an art curriculum with code as the medium.


I suspect if you explained yourself more, you wouldn't get downvoted. (I didn't downvote you.)


the lesson here is don't drink and comment on HN



