The OP's complaint seems to be about mediocre students, not the BSCS. Don't eliminate the BSCS, just have higher standards for students.
I'm just finishing up a BSCS, and frankly the CS stuff seems pretty helpful overall. Algorithm design and analysis builds up my problem-solving chops, AI teaches me interesting techniques, and automata and languages gives me a good theoretical basis for development folk wisdom like "you'll never have sufficient test coverage to get rid of ALL the bugs" and "you can't parse HTML with regular expressions". Compilers and operating systems give me great examples of how to design and structure a large system.
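For what it's worth, the automata result behind that second bit of folk wisdom is easy to demonstrate: arbitrarily nested tags form a context-free language, not a regular one, so any fixed regex only copes with nesting up to some bounded depth. A minimal sketch in Python (the tag name and pattern are just illustrative):

```python
import re

# A regex that tries to match a <div> element and its contents.
# The character class [^<>]* can't "count" open tags, so this
# pattern only handles a flat element; recognizing arbitrary
# nesting requires a counter (a pushdown automaton), which no
# finite regular expression has.
flat = re.compile(r"<div>[^<>]*</div>")

print(bool(flat.fullmatch("<div>hello</div>")))          # flat element: matches
print(bool(flat.fullmatch("<div><div>hi</div></div>")))  # one level of nesting: fails
```

You can keep hand-extending the pattern one nesting level at a time, but there's always a deeper input it rejects, which is exactly the pumping-lemma argument from the automata course.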
On the other hand, the "software engineering" courses I've taken seem to teach little more than 1990s-era process and testing methodologies. My school has an apparent fondness for teaching RUP, a methodology invented by IBM[1] to sell CASE tools. The best explanation I've gotten from instructors for this is a combination of "defined process models give us something we can write exams for" and "students need to be familiarized with formal process models in case they run into them in industry". I won't say I've learned absolutely nothing from these classes--there's some good stuff in the testing class about how to design tests[2], and the first software engineering class assigned readings from The Mythical Man-Month (always worthwhile) and taught us through experience that following the prescribed processes and writing up all the design documents IBM wants stretches a 2-developer, 2-week project into an entire semester for 9 developers, and even then you don't finish the project.
On balance, I'd say the people who know about computer science are university faculty, and the people who know about software development are developing software somewhere else. I'd rather learn something useful about computer science from competent computer scientists than learn about software development from people who don't actually develop software.
[1] By "IBM", I mean "a company that IBM later acquired".
[2] Especially if you're coding in Java, C++, or C#. The software engineering curriculum hasn't quite caught up to the idea that different kinds of languages are used to develop actual applications.
"The OP's complaint seems to be about mediocre students, not the BSCS. Don't eliminate the BSCS, just have higher standards for students."
I think you've struck the heart of it here. Unfortunately, while "enforce higher standards" sounds nice, given the incentives involved, it's not realistically going to happen. There are plenty of smart students out there who would do well in a rigorous university program. There are also many less-qualified students who are willing to pay university-education prices for vocational educations as long as they get sheepskins after four years or so. The universities have discovered that they can cater to the latter without losing the business of the former, thus raking in considerably more cash, so of course that's what they're doing. This is probably more true for liberal arts programs than any other curriculum, but it's affecting everything.
This argument has come up a lot on HN lately, and it usually breaks down into two camps: those who think that the higher education system is being diluted to death, and those who say, "I just finished my degree, and I received an excellent education." Then come the anecdotes saying, "I've been trying to hire recent graduates, but they're all lazy, self-centered, entitled, incompetent, gormless whiners," while others respond with, "I just hired some recent graduates and was overwhelmed by how smart and driven they all were; I had a hard time narrowing the field down to make job offers." Finally, a bunch of cherry-picked studies and statistics are thrown out by both sides to support their positions.
Here's my personal concept of what's really happening: university curricula are being watered down to cater to less-qualified students, but this process is not YET having a perceptible impact on the most qualified students because they are doing what university students are supposed to be doing: extending their learning well beyond the classroom. So far the universities are managing to have their cake and eat it: they are selling watered-down diplomas to under-qualified students while simultaneously providing a good learning environment for qualified students. However, I think that those qualified students are still being hurt by the change. They may say, "I think I received a great education," but they have no way of knowing if their education would have been even better if it had been optimized for them instead of targeted at the lowest common denominator. Also, I wonder how long the universities can continue to run these schizophrenic curricula before something gives.
My understanding of the author's underlying intent was to find a way for universities to teach a rigorous curriculum to the smart students without driving out the less-qualified students, because any solution that drives out the less-qualified simply will not be adopted in the real world. His answer was to create two curricula: one for the people who actually belong in college, and one for people who just want a vocational education and a piece of vellum. You could apply this model to almost any field of university study. I think that there is a definite need for vocational education of this sort, and I don't think that universities should be providing it. However, they've discovered that they can make a lot of money doing so, so they're not going to stop. Instead of hoping that universities will abandon vocational education in favor of higher standards, we should instead look for a solution that segregates the two types of education so that universities can do both well.
I think a lot of people are focusing on his proposed breakdown, which I agree isn't quite right. He takes some subjects that do have a place in a university education and puts them strictly on the vocational side. However, his overall idea of accepting the reality that universities are providing vocational education because it is lucrative, and of separating that branch of the school from the traditional university education, is good.
In general, separating vocational from academic studies is a good idea, even if we have to do it in the university environment. In practice, universities are conservative institutions ill-equipped to provide vocational education in software development; at least, mine is.
SE education seems geared towards churning out enterprise-type developers, and I guess there is a legitimate staffing need for enterprise-type developers in enterprise-type shops, but the CS-educated developer is perhaps a more important asset to cultivate, and the OP seemed to suggest that the CS-educated developer should go away entirely.