Online Degrees That Are Seen as Official (nytimes.com)
84 points by Futurebot on March 8, 2015 | 59 comments



I think the problem is fundamentally that two separate concerns are entangled: education, and certification.

The certification is what you need. But to get it, the university can require you to purchase a whole bunch of very expensive education you may not need.

Only by separating the two will this problem be solved: having certification authorities that are independent of educational providers.

In that utopia, even a regular university can't grant you the certification. They can market themselves as the best and most cost effective way for you to reach the competency needed to achieve your certificate. But, perhaps MOOCs are better, or intensive library study, or perhaps you're already that competent, and don't need any education to get there.

It is only that which can solve the scandal of indebting vast swathes of young people by tens of thousands of dollars.

So I'm suspicious of this article, because the solutions described still make the same mistakes. The body that provides you with the education certifies you as having had that education. Okay, there may be more evidence as metadata, but still, the incentive is there for providers to bias their certification towards the educational products they want to sell.


Of course... various industries have found that certifications without an education path tie-in are near-pointless. Why? Because nobody can figure out how to do certification exams very well.

What a useful certification usually actually says isn't "this person can use this tool", it's "this person has been educated in a certain way and paid attention during that education, leading to the ability to use this tool".

For the vast, vast majority of certs out there where you don't have to take a course to take the exam, you can knowledge-cram for a comparatively short period of time, take the exam without actually having much experience (even if "only" in a lab), and then forget most of what you've learned shortly afterwards. As a result, few employers care about them.

If you want to do certs separate from education paths, you have to figure out a better way of testing people - possibly real-world (or at least lab-based, scenario-oriented) testing over a period of time.


Right. I partially agree. Which is why separate certification hasn't succeeded yet. It is, I'd suggest, an area ripe for disruption.

Where I disagree is the tacit implication that, therefore, university education is somehow better at being an indicator of those skills and qualities. I've seen a lot of terrible graduates who I wouldn't employ if they paid me: graduates who couldn't reason their way out of a paper bag. So I'd respectfully disagree with that implication.


> Where I disagree is the tacit implication that, therefore, university education is somehow better at being an indicator of those skills and qualities.

It does tend to be a better indicator - just because it's not perfect doesn't make it the same.


Being a CFA carries a lot of weight. It seems like certifications (unsurprisingly) work when a lot of effort has been made to set up the certifications properly. The reason why we haven't seen such an effort in a lot of areas is probably because most places are relying on college degrees.


It also helps when a certification tests to a relatively specific and concrete set of knowledge. Some IT certs are certainly better than others, and they probably speak more to some base level of knowledge as opposed to real competence. That said, it's easier to use a test (preferably with a hands-on component) to test for $TECH administration competence than it is to test for computer science competence in a broad sense.


The CFA requires a bachelor's degree, four years of professional experience, and a series of exams with a cumulative passing rate of 16% - so it's not just a certification.


> The reason why we haven't seen such an effort in a lot of areas is probably because most places are relying on college degrees.

In the world of IT, university degrees mean very little, as universities almost universally fail to teach anything useful for that industry.

There are still very few certs that mean anything. At the entry level, all of the certs are dreadful, and past the entry level, experience usually matters more than what certs you have.


Actuaries seem to have managed to separate education and certification.


Kinda. In all 50 states, you need to complete 150 credit hours to be eligible to sit for the exam, i.e., undergrad + one year of grad school.

Most state-administered professional certification exams have similar eligibility requirements.


You appear to be thinking of the CPA requirement, which is for accountants, not actuaries. I'm an actuary, and the education requirements are thin: https://www.soa.org/Education/Exam-Req/edu-vee.aspx

You could theoretically get certified with only high school and work experience, although I don't think any employers would hire someone without a bachelor's. The degree can really be in anything though as long as you've passed an exam or two, as this shows an employer that you're serious and have learned at least something.

I believe the system works in our case because there is an ecosystem of employers, employees, recruiters, and government agencies that all buy into the exam process. It's sort of a throwback guild system that's not very similar to IT certifications such as CCNA.


You're forgetting the other important one: sorting. Why do banks, consulting firms, and even tech companies go gaga over Stanford and Harvard grads? Because the UCLA grad doesn't have the requisite skills? No, because they trust them to sort the top 1% from the top 5% from the top 20%.

Neither MOOCs nor certification programs offer that key function. When Coursera only accepts people with a certain SAT score, then issues "degrees" ranking candidates by class performance, I think we'll see employers take them more seriously.


I think that filtering is only one of many important factors, many of which are interrelated. An elite school's imprimatur, for example, can more easily convince clients and even internal management of an employee's competence.

Filtering alone doesn't explain why those companies go 50% deep into the pool of Harvard graduates before considering the top 5% of UCLA grads.


Right. The article talks about this. 'Top' universities are filters for high degrees of talent or money. It is not their value-add, then, which makes their graduates employable; it is their pickiness in choosing which students to allow to buy their services.

But I partially agree: certification programs need to indicate degree of attainment, not just pass or fail. I disagree that SAT entry requirements are particularly important to an employer. If I wanted to make sure a candidate had a certain SAT score, I could ask them for it, couldn't I?


Interesting point. Though universities allow F students in through loopholes: fail HS, go to a community college where the tests are open book, then transfer to a university, which by law must accept them.


I largely agree with this, but there's one complication. A lot of the certification aspect is that certain schools have earned a reputation over many decades for being selective, having high academic standards, and producing effective graduates (no claim as to whether they upkeep those, or even rightfully earned them in the first place). It is very difficult to invent some certification that is comparably valuable to a degree from such schools.


Perhaps a broader issue is that the current system is one that nobody designed. And designing a certification system from scratch that achieves any kind of widespread acceptance may be impossible.

I see something similar happen on a smaller scale when a business tries to re-design a successful legacy product: The conditions that made it possible to design the original product no longer exist. Nobody really understands why the legacy product was successful, and the original designers are long gone -- they were "acqui-fired."

The legacy product might have a number of glaring shortcomings, but the re-design team ends up creating something that's worse.


In Australia, certification is effectively governed by TEQSA and the relevant accrediting body for your discipline. TEQSA examine whether a university's degree meets the AQF requirements for being a Bachelor's degree (or other kind of degree), and in our case the relevant discipline body is ACS, and they also look in depth at the quality of what we teach. Each of those is a thick, thick document full of things that go far beyond the exams. It is the education in a degree, not just the exam-answering, that they attempt to accredit.

I realise that degree-certification-bodies isn't what you meant by "certification" -- that you're thinking more of the "industry cert" model, where a training centre teaches you material, and then you pay to take an exam from the certifier. The point though is that's not what society (or at least as represented by government) wants university to be. It is, at least currently, supposed to be education not just training and testing.


The same kind of thing happens in other countries, including the US. There are authorities which accredit universities as providing suitable standards of education to award a degree. That wasn't what I was getting at, no. It prevents diploma-mills, certainly, but does nothing to stop the perverse incentives. And doesn't help any alternative methods of achieving the same competency to compete.


The point is that so far it seems it's not just the "competency certification" that the market and society is demanding -- it (for the moment) really is demanding the education that you describe as a mis-sold extra.

That's not so surprising to me, in that a degree tends to be an entry to a career. And that career can be long enough that competencies are likely to change in unpredictable ways over its course. So it makes some sense for graduate employers to be more interested in an education they believe means you are likely to adapt well to those changes, and perhaps lead those changes, rather than in a competency certification that shows you know the current state of play.

Whether that will change over time is an interesting question. If the market does move towards just certifying competencies, I have an awkward suspicion that it will be because of general trends towards workforce casualisation. If you see employees as disposable (hire the skills of today, and tomorrow fire them to hire the skills of tomorrow) then education is less important than today's competency.

This isn't intended as a criticism of MOOCs, nor in any sense to diminish them. I'm not only interested in them, but trying to develop some -- I have something of a vested interest in MOOCs... It's just that I'm personally more interested in how we can improve MOOCs to offer an education, rather than how we can get the market to stop demanding an education and just want a competency-certification.


Aren't you just defining competency in a very narrow way to move the goalposts? If the goal is that students show competency in adapting to changing requirements, and mastering information literacy, then that is the required competency.

By competency, I don't mean some narrow technical expertise, but whatever skills it is that an employer needs their workforce to have, that universities claim to impart.


> I think the problem is fundamentally that two separate concerns are entangled: education, and certification.

There's actually one or two other concerns, too: networking and fun. Some of the value I and many other people got from college is meeting others with similar interests.

Regarding fun (and parties), I get the sense that most HN readers probably aren't or weren't party animals, but that's actually a pretty common reason for people to go to college, per Sperber's Beer and Circus (http://www.amazon.com/Beer-Circus-Crippling-Undergraduate-Ed...) and Armstrong and Hamilton's Paying for the Party (http://jakeseliger.com/2014/04/27/paying-for-the-party-eliza...).

I don't really have a sense of how the university bundle breaks down into core value.


Right, but I think the problem is the entangling of the two issues I mentioned.


Business owners would love if there were some exam which accurately informed them about who to hire, and there has been no shortage of attempts to create such a thing. The problem is, most of the characteristics needed seem to be things we don't understand, or don't know how to quantify.

Greatly complicating things is the fact that you often don't need to know anything in particular to succeed. There is no shortage of people who learned on the job and became successful. Filtering out such people with an exam may badly damage your hiring.


It would be nice to be able to get, say, a:

* General Computer Science Cert

* Software Architecture Cert

* C Lang Cert or a Java Cert

from an accredited institution and be instantly employable at the entry level. This is kind of how it works with Network Engineering; I wonder why it can't with Programming/SoftDev.


The University of Waterloo is a good example of getting an official degree all while doing courses online. Their transcripts do not differentiate between online courses and regular courses so if you get a BA in English or Philosophy for example (degrees fully offered online - http://cel.uwaterloo.ca/undergraduate.html), nobody will know you took it entirely online.

Loved that flexibility when I was there. The cost per credit was the same whether I took the course online or on campus.

Did three semesters online before switching to the regular campus (my program wasn't fully online). The education was top-notch too, with responsive profs and (for the most part) a good pace, even though you're doing a lot of self-studying. My final exams were either held at colleges near my area or proctored at public libraries (the latter I had to arrange on my own, though; the school just mailed the exams directly to the library).


Waterloo is also quite notable for their extensive 'co-op' program, which places students in internships at a really high rate.


Definitely. I think that's one of the reasons they have such flexible online course offerings (gives their co-op students a lot of flexibility).

At the end of my second year, I transferred to McGill and I was a bit disappointed at the state of their online course offerings (they barely had any, apart from some electives, that could count for my degree). Also felt the school had a misguided bias against online courses (if you take one externally, there's a limit of 2 courses that you can apply to your degree - http://www.mcgill.ca/oasis/away/online-courses). When I was trying to get transfer credits, any mention of the course being taken online got my advisor telling me that my course would probably be denied. It was a good thing Waterloo didn't differentiate in my transcript between the two, or else a lot of my credits might not have transferred over.

This was such a stark contrast to Waterloo, where I really felt they considered their online courses to be just as comparable to their on-campus ones. I suppose, though, it's the whole culture of the institution. It helps too that Waterloo has a whole department dedicated to just the creation, design, development and testing of online-based courses - where they have designers/devs/subject-matter/QA specialists working closely with professors to design the courses (both for-credit courses and professional development/continuing education courses). I don't believe my school has such a department to rely on.


I'm paying ~$7,000 for an online degree from a top-10 CS masters program (Georgia Tech). The degree is identical to the on-campus one. If this doesn't disrupt the system, then I don't know what will. This is as "real" as it gets. I really wish more universities would follow. In the meantime, UC Berkeley and Stanford both offer CS master's degrees for $60K and $50K respectively. I'd rather pay about a tenth of that for "just" the 9th-ranked program than pay for the 1st-ranked one with all of my kids' college fund.


Wow, I just looked up that program. I was in my alma mater's online masters program for EE, but eventually had to cut my losses, because each class was ~$2100 total, so about 3x as much as the Georgia Tech program. I hope to see more like this in the future, maybe that would be the push I need to end up finishing that EE degree!


Identical is a strong word. Can you give an overview of how group projects are structured?


Well, the given degree (I mean, the diploma) is identical. Take two students, an on-campus one and an online one (hopefully the first online students will graduate around the end of 2015 or beginning of 2016), and compare their diplomas: except for the name of the student, they will be identical.

Regarding the degree program itself, the material and rigor (assignments, tests, etc.) are also identical. However, the process is obviously different. We use Piazza for collaboration, and Google Hangouts for group projects. I had a great project last semester with a team of people from all around the US (the course staff puts people together by timezone and native language spoken to make it more feasible). It worked great! Today's workplace is distributed; working with remote teams or working from home is no longer a niche, so why should education be any different? If I had the time (and budget), I would prefer an on-campus program, mostly for the social, networking, and fun aspects of it. But I'm not in my 20s. I have kids and a day job, and what GT did was simply the only way I would go for a graduate degree. (I didn't need it per se; I do it because it's fun and interesting and cost-effective at their current price range. Had it been higher, I don't think I would have gone for it.)


Not sure about that specific school, but when I was an undergrad in higher level classes, it wasn't uncommon to have the online masters students in the same section. Only difference was that oral reports from them would be recorded in advance.

Edit: I can't remember having to do a group project with an online student though. I think they may have been able to do a smaller project by themselves for the group assignments.


I'm surprised the Georgia Tech Online Masters in Computer Science[1] isn't mentioned in the article.

[1] http://www.omscs.gatech.edu/


I was too, though it doesn't really fit with the author's thesis -- it's actually very similar to traditional distance programs. The price and the class sizes are crazy, and the impact on the market is probably going to be huge, but if anything it'll reinforce the stranglehold of traditional institutions providing traditional degrees.


The class size for the one course I enrolled in this semester actually wasn't crazy (unless by crazy you meant crazy small). It was ~30 students.


I believe they're capped at 200, but I also get the impression they're accepting at a rate that assumes a lot of attrition prior to "full" acceptance (completion of foundational courses with a B or better).

For example, this last semester saw a lot of people accepted without compsci or math experience. That would make the prospect pretty daunting.


That must be CS6505 after the drop date. I joke, I think we had at least 200 people in that class, but there was a good amount who dropped.

I am guessing you are in CS6290, or one of the newer courses.


Unless the online degrees can signal prestige, they'll never be seen as official by employers. Traditional employers don't actually care about what impractical academic stuff you learned. They like that "you're like me", and that's what college really serves as: the Harry Potter sorting hat.

It's time we do something else though. Paying 100k just to signal to employers your socioeconomic background is bankrupting everyone.


Yeah. Some other angles on this would be "people who are able to go 4-6 years without needing an income" and, below the top end, "people who have living parents and/or family members/people who receive(d) familial support." The system we have here in the US is a fantastic way to exclude potentially very smart people from non-traditional and poor backgrounds from pursuing higher education. Contrast all this with the Danish model:

http://en.wikipedia.org/wiki/Student_loans_in_Denmark

Seems like there's a lot we could learn from a system like that.


For anyone wondering about Europe, my wife is studying a BSc Psychology with the Open University, which is accredited by the British Psychological Society. The fees are pretty much the same as a regular university though, which isn't great.

http://www.open.ac.uk/courses/qualifications/q07

As per other UK universities, you can transfer the credits you've earned to brick & mortar universities, so she'll probably do that for the final year.


In Australia, I teach at a university that offers degrees both on-campus and off-campus. Many do. The courses are regular university courses (not even a different offering -- off campus students enrol in classes with the on campus students) so naturally that means they are regulated and accredited by the same national bodies as for on-campus students.

(Technically they are "off campus" rather than "online", as the exam component of the assessment is an invigilated written exam, not an online exam)


The CFA Institute provides exams of approximately degree standard for about $630/exam, times 3 exams, to get the CFA qualification. That also includes a kind of textbook. I don't know if that kind of thing is the way forward. I've done exams 1 and 2, and it's a bit tedious compared to a regular degree but fairly rigorous.


If you want to become a financial analyst it is fairly important to get. The CFA is nothing new, one of my parents got theirs years ago and a few of my friends are in the program now.


Bleedin' passive voice... "seen as official" - seen by whom? - suitable for what office?


Shameless plug. Before open badges can work we need standardization of listings.

This is what we are working on at http://www.CourseBuffet.com

Similar courses are given the same classification and fall under the same CourseBuffet title.
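
A toy sketch of the idea in Python (invented course names and classifications, not our actual data model): listings from different providers that cover the same material resolve to one standardized title.

    # Hypothetical illustration only: map equivalent courses from
    # different providers onto one standardized CourseBuffet-style title.
    CANONICAL = {
        ("edX", "Introduction to Computer Science"): "Computer Science 101",
        ("Coursera", "Programming for Everybody"): "Computer Science 101",
        ("Udacity", "Intro to Computer Science"): "Computer Science 101",
    }

    def classify(provider, listing):
        """Return the standardized title for a provider's course listing."""
        return CANONICAL.get((provider, listing), "Unclassified")

    print(classify("Udacity", "Intro to Computer Science"))  # Computer Science 101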


Even regular degrees aren't seen as "official." I'm tempted to pursue a graduate degree but I'm afraid it's just another scam.

Education in America is over. Public schools are completely underfunded, mismanaged and ignored. Universities have become businesses that offer a horrendous ROI. If you're lucky enough to get into an Ivy, or you'd like to work in academia (good luck in both cases), go. Otherwise save your money and figure out your own path, because our previous highway to the middle class has crumbled.


That's quite a cynical take. I didn't go to college but I think if people have the opportunity and they want to learn they should take the chance. The resources at a college for learning are second to none. That should be your goal. That knowledge will help you find riches, but it's up to you. You have to put the work in just like you would if you skip college. Nothing comes easy.


Education in America is bigger than ever and still seen as official by the vast majority, though arguably optional in our field.

But there's also truth in your comment, particularly about ROI (excluding prestigious or legally required degrees) and figuring out your own path.

You're wise to consider the value of a graduate degree before pursuing one.


Graduate degrees almost always make sense, as long as the prestige and quality of the graduate program are an order of magnitude better than your undergraduate alma mater's.

The calculation is trivial -- top masters programs often publish entry and mid-career salary information.
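
For example, a back-of-the-envelope sketch in Python (every number here is invented, just to show the shape of the calculation):

    # Rough break-even estimate for a graduate degree; all figures are
    # made up for illustration, not taken from any published program data.
    tuition = 60_000          # total program cost
    years_in_program = 2
    forgone_salary = 70_000   # annual salary given up while studying

    salary_without = 70_000   # expected salary without the degree
    salary_with = 95_000      # published entry salary for graduates

    total_cost = tuition + forgone_salary * years_in_program
    annual_uplift = salary_with - salary_without

    print(f"Break-even after {total_cost / annual_uplift:.1f} years")
    # -> Break-even after 8.0 years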


"The new digital credentials can solve this problem by providing exponentially more information. Think about all the work you did in college. Unless you’re a recent college graduate, how much of it was saved and archived in a way that you can access now? What about the skills you acquired in various jobs? Digital learning environments can save and organize almost everything. Here, in the “unlabeled” folder, are all of my notes, tests, homework, syllabus and grades from the edX genetics course. My “real” college courses, by contrast, are lost to history, with only an inscrutable abbreviation on a paper transcript suggesting that they ever happened at all."

Exactly. On the merits, a MOOC degree ought to be infinitely more worthwhile than any traditional college degree. MOOC completion (quite literally) demonstrates project work, practical skills, attendance, and interaction. The courses themselves are the highest quality in the world, and the rigor and impartiality are available for all the world to see. In a sane world, MOOCs will inherit global education.

But the world hasn't caught up yet. Right now the only commonly understood credential is the college degree, which everyone knows can be accomplished through registering for bullshit courses, skipping class, plagiarizing occasional essays nobody will ever read, and earning a one-word major that does nothing to prove any actual skills. Historically, this is as good as we've been able to do. No longer.

Give it time. There are entrenched interests here -- not only the massive and powerful education industry, which is now obsolete, but also every existing college graduate whose expensive credentials are now threatened by the radically superior MOOC certificate. Not to pick on teacher unions in particular, but several states require infamously pointless masters degrees in education before you can teach. Lawyers likewise REALLY don't need a three-year JD before we can practice... and the US stands alone in requiring bachelor's degrees prior to legal training or bar passage (with the exception of California). Virtually any non-STEM job that requires a bachelor's can be filled without the aid of expensive four-year certificates in poetry or pottery or communications. That's an enormous threat to basically every working professional in the United States.

Employ the D.E.N.N.I.S. system here.

First, demonstrate the value of MOOC certificates to employers.

Second, engage employers with hiring pipelines direct from Coursera/Udacity/edX.

Third, nurture these pipelines and grow them into a viable Github-for-employment.

Fourth, neglect protests from existing interests that have demanded advanced degrees in "teaching MOOCs" (seriously) or jeers that MOOC completion rates are low.

Fifth, inspire students with visible examples of success through MOOCs. Thiel fellows are the closest to this, but the project more closely resembles a lottery than a structured path.

Sixth, separate MOOCs entirely from their current aping of traditional courses. There is no reason whatsoever to release courses on a real-time basis, or to rely on artisanal videotaped lectures. (Though MOOCs have admittedly improved dramatically on the latter, with faster speeds, transcripts, open courseware, etc.)

Or... just withdraw federal subsidies for student loans and make host institutions partially liable for student defaults. I'd love to see the argument to pay $50k/year to attend no-name for-profit ripoffs in the middle of nowhere, versus free attendance in every Harvard course at any Starbucks in New York City.


I question your statement "a MOOC degree ought to be infinitely more worthwhile than any traditional college degree". (Taking "infinitely" to mean "at least twice".)

Do you have any evidence for that? That is, there's very little research to back that up. For example, http://www.irrodl.org/index.php/irrodl/article/view/1902/300... compared the two systems and found that they were roughly the same, in terms of outcomes.

However, that's based on completing the course. From the paper, "Although approximately 17,000 people signed-up for 8.MReV, most dropped out with no sign of commitment to the course; only 1500 students were “passing” or on-track to earn a certificate after the second assignment. For the IRT analysis we included only the 1,080 students who attempted more than 50% of the questions in the course, 95% of whom earned certificates. Most of those completing less than 50% of the homework and quiz problems dropped out during the course and did not take the posttest, so their learning could not be measured."

Further, "It is also important to note the many gross differences between 8.MReV and on-campus education. Our self-selected online students are interested in learning, considerably older, and generally have many more years of college education than the on-campus freshmen with whom they have been compared. The on-campus students are taking a required course that most have failed to pass in a previous attempt. Moreover, there are more dropouts in the online course (but over 50% of students making a serious attempt at the second weekly test received certificates) and these dropouts may well be students learning less than those who remained. The pre- and posttest analysis is further blurred by the fact that the MOOC students could consult resources before answering, and, in fact, did consult within course resources significantly more during the posttest than in the pretest."

This reads like: if a student finishes a course, then there isn't much difference based on how they learned the material. Hardly the sign of a 'radically superior' education system.


'Radically Superior' could mean equal quality of education for:

1) Free (or nearly free)

2) Open to anyone

3) Flexible to fit anyone's schedule

4) Easier to show the actual work completed (because there is a digital copy)

Generally, any student who needed special considerations for any of the above (likely a non-trivial number of students) had to compromise on the quality of education. For those students, I'd argue that it's a significant improvement.


I am pleased to see an increasing uptake in mass higher education, except, like the OP, I think we have confused liberal education with job training. But it would be unwise to ignore the history of distance learning, which started with correspondence colleges in the 1800s. Certainly anything which was done by post can be done now by computer, so I have no doubt that MOOCs can work. The question is how to judge if they are "radically superior".

California used to have (1). The GI bill also meant (1) for many veterans. Germany, Sweden, and some other countries still have (1). Which makes it easy to judge if current US universities are radically inferior to tuition-free universities. The Open University in the UK is a decades old example of (2) and (3), though not (1).

It's therefore hard for me to accept that MOOCs are significantly more radical and superior than existing systems which already incorporate most of the points that are supposed to make it radical and superior.

And (4)? Seriously? What, some employer is going to come and insist on seeing my individual assignments for partial differential equations before hiring me? And read through my essays for sociology class? And my term papers for introductory philosophy? Embrace the Panopticon!

For that matter, my wife's college (she takes online courses) is focused on team projects, so most of her assignments are done with 2-3 other people. How is the outside world supposed to figure out which part is hers? How much time are they willing to spend to disentangle this?

And finally, if this were useful, then a pen-and-paper correspondence college could add a small surcharge per course to hire someone to scan incoming mail and put it in a file for future reference. That this hasn't happened, nor has there been a call for it, suggests that it isn't so useful.


You forgot step four of The System :-)


The real customers of these MOOCs and other programs are employers, not students. What are these programs doing to work with employers and appeal directly to them?


There is a long tradition of evening schools.


Those are a resume stain along the lines of the paragraph beginning with "Traditional college degrees represent several different kinds of information." and ending with "tell the job market almost exactly the same thing: “This person was good enough to get into Harvard.”"

A traditional student trajectory tells the HR lady the following positive (from the point of view of corporate HR) things:

1) Came from a social / economic / racial class able to afford schooling instead of work in early years

2) Dedication to a pointless goal (graduate in 4 years vs 5+, or in the corporate world this would be pointless grinding optimization of meaningless numerical metric goals)

3) Able to take on a huge debt load (see #1 above) and has a huge debt load, thus easier to mistreat because they've got that loan payment hanging over them (see also the reason why companies love it when employees take out a mortgage or car loan; welcome to mandatory unpaid overtime LOL)

4) Applicant is a follower, not a leader, and does whatever people tell them. Once that meant taking out $100K in loans to get a piece of paper; after hiring, it'll mean helping the boss "smooth over" some regulations and accounting issues or whatever other form of corruption, as long as it comes from above. Because authority wouldn't be authority unless it was correct, right?


I'd like to think your points are unnecessarily and viciously cynical.

I'd like to think that, but actually I suspect you are dead right.



