A good portion of a degree's value is in its branding and prestige. It's not just about the knowledge; it's about the overall reputation of the institution. The material at Penn State is 99% the same as MIT's. If prestige didn't matter, MIT and Stanford would have lost their luster to public schools decades ago.
Unfortunately I don't see a degree from a website named "Udacity" garnering anywhere near the level of respect that a traditional university has. Step 1 to bringing down the university system: pick a decent name.
Udacity isn't a brand yet. Someday it, or something like it, will be. Then it won't matter.
I respect people who are resourceful and get stuff done. If you choose to go to college just to wait out the Great Recession, I'll look down on you whether you went to MIT or Stanford. If you went to college for business or technology and didn't produce anything during your time there, you'll have a hard time convincing me to hire you.
It's only a matter of time before the rest of the world catches up to my way of thinking.
"I respect people who are resourceful and get stuff done."
Does getting a college degree not count as getting stuff done? Would you rather someone list every single problem set, proof, project, etc. that they have done? I can vouch that a degree from some schools requires much more work/time/resourcefulness than making a couple of cool projects on GitHub.
Also, as a technologist, my ultimate goal is to push things forward. Think SpaceX, Google, Intel, or even something smaller like Lytro, or projects like Google's autonomous vehicles. The technology beneath these things (cheap manufacturing, PageRank, integrated circuits, etc.) isn't invented by the people who make CRUD webapps; it's made by those who master a field and have some insight that vastly improves their domain. That's what changes things.
Note: mentioning Apple or Facebook might seem like a counterpoint to my statement. I would argue Apple has pushed design forward in the same way Intel has pushed the semiconductor business forward. Facebook, however, seemed to have no extraordinary technology at its core; it capitalized on a hole in the market so effectively that it actually changed how society functions.
Also, I'm not at all discounting the hustlers who get stuff done, but I think to really change things, one must be a hustler (or know one) and also have extraordinary technology.
I don't doubt that a degree can take much more time or work. In fact I'm sure it probably does, which is all the more reason to NOT go that route.
That's awesome that you want to push things forward. We need that, but what I'm saying is that nothing is going to help you except downright resourcefulness. You're going to have to go deep and master something to do that, sure.
Five years ago, I think you would have been right. The resources to get started weren't there. Today it's all changed and now resourcefulness gets you so much further than it ever has before.
I've replied below a few times, but I need to respond directly to this comment:
It's only a matter of time before the rest of the world catches up to my way of thinking.
What's so special about this way of thinking? Do you really believe in the existence of brilliant MIT or Stanford grads who have produced nothing during their 4 years in school? Further, do you believe these hypothetical underachievers are getting hired by Google, Facebook, Apple, Intel, or any other excellent company?
You're populating this thread with straw men and false dichotomies. The two spectra you're talking about -- autodidacticism versus traditional education, and real-world experience versus theory -- are not mutually exclusive. There are students with strong theoretical training who hack on projects at AT&T Labs over the summer, and there are autodidacts who take night classes at their local university or college.
Please stop spouting off about "people who get stuff done," or these pseudo-sheeple you've dreamt up who go to Caltech, get degrees, then do nothing. Everyone gets stuff done in one area or another.
I admit that last phrase was a bit over the top. But I do believe that the long-term, industrial model of education will slowly die out. Probably a lot slower than I think. There's really nothing that special about my thinking.
Google, Facebook, Apple, Intel, etc. all hire the best. I don't know THAT much about their hiring policies or practices, but by sheer numbers there are people on either side of those spectra who are more than qualified.
In fact, you can go to college and get stuff done. It's one of the best ways to go because you can use loans and scholarships to fund yourself while getting stuff done. It's the degree itself that I don't care so much about. And to be really honest, it's probably easier to get stuff done in the right college setting than it is almost anywhere else in the world.
All I'm really saying is that I have a strong preference for self learners and people who have done more than get a degree.
This is just as poor an attitude as that of the people who only care about the degree. Like anything else a potential employee has done, a degree (especially from a good school) adds something that you can't capture with any other metric.
Why is that? You're not adding much to this conversation just by saying "That's an awful way of thinking."
Why is it so awful that I realize that technology and business can be learned much faster, much cheaper, and in more depth by actually doing them than by going to school for them?
Why is it so awful that I value someone figuring out a better way to spend a recession than sitting in school learning something that may or may not help them and that, for most, puts them solidly in debt with few options after graduation?
I can't speak to business, but I can about technology. When you're "on the job", you don't have extended periods of time where you have the luxury of studying a complicated topic in depth. Many tech jobs also don't give you the luxury of breadth, but rather are guided by an overarching practical purpose -- to get the "job" done.
Your generalization of the knowledge one gains by doing a degree is much too broad.
Here's the dirty little secret that people like you overlook - spending four years hacking around and hanging out on HN will teach you nothing about Big O notation and how to analyse it or about advanced data structures. What it will teach you is plenty about chasing low hanging fruit and very little about tackling big and difficult problems.
Sure, someone who's very driven will learn the same amount regardless - mostly by reading all the same books that someone who's gone to school will have read. But most people are not that driven. Most people are happy to hack around building webapps on the back of other people's libraries and then deluding themselves into thinking that gluing together a bunch of libraries and adding a pretty frontend counts as a significant achievement.
Frankly, with your attitude you're doomed to hire the mediocre while the people with the brains and skills to do any form of non-trivial analysis go elsewhere - as well they should.
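To make the point concrete, here's a toy sketch (Python, purely illustrative) of the kind of analysis I mean: why a membership test on a list is O(n) while a set gives you O(1) on average.

    import timeit

    n = 100_000
    as_list = list(range(n))
    as_set = set(as_list)

    # A list membership test scans every element: O(n).
    # A set is backed by a hash table, so it's O(1) on average.
    print(timeit.timeit(lambda: (n - 1) in as_list, number=100))
    print(timeit.timeit(lambda: (n - 1) in as_set, number=100))

Trivial example, but it's the habit of reasoning about growth rates, not the snippet itself, that the coursework drills into you.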
This is not a binary issue. There are many facets to software development. Knowing Big O will not teach you how to best use git in a production setting. It turns out they are both useful and important.
And that is the crux of the matter. A college graduate that is not driven to learn about the practical aspects of programming is no better for the job than the non-graduate who is not driven to learn about the theoretical side. After only four years of study, I would say that both applicants in the previous example are on equal ground. Both are lacking, just in different areas. It is not really clear at that stage which one of the two will have the drive to fill the gaps.
The dirty little secret is that you cannot apply generic processes to find the best applicant. Doing so will lead to mediocre selection no matter which bias you choose.
This is an awesome, insightful comment and really highlights what I'm getting at.
You're right that they both have weaknesses. What I'm saying is I'm willing to go all in on the person who figured out much of it by himself, using whatever he had available, rather than on the person who had a college instructor put everything together for him and then provide additional readings, PowerPoints, and web links to all their students.
A smart person has to think their way out of a box. A resourceful person can just hack their way out of the box. I don't mind a little mess, so I'll take the resourceful person over the smart person every day of the week.
You're putting together a collection of false dichotomies in this thread. Let me muddle them up for you:
CS departments are not all ivory towers.
Consider the following opportunities in my department:
(1) TA or grade production-quality software classes taught by Google engineers, with discussion on version control, code style, scalability, etc.;
(2) research and implement algorithms for diagnosing hospital patients more effectively using ML techniques;
(3) push the frontier of computer vision in concert with former telecom engineers.
Software development contains a large set of cool, non-trivial problems. Without formal CS training, you will not get to access those problems. This might change someday, but it is not close to true right now.
> Without formal CS training, you will not get to access those problems.
This is also a false dichotomy. You do not really know what people are working on outside of the college setting.
Computer vision is pretty low hanging fruit for anyone to take on. I will grant you that it may be difficult to access medical data outside of the institution, but the same ML techniques can be applied to other data that is relevant. As a hobby farmer, I see all kinds of interesting places for ML on the farm. How many CS students are working with that kind of data?
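As a rough sketch of the sort of thing I mean (the sensor readings and labels are made up, and scikit-learn is assumed to be available):

    from sklearn.ensemble import RandomForestClassifier

    # Hypothetical readings per field plot: [soil_moisture, temp_c, hours_of_sun]
    readings = [
        [0.32, 21.0, 9.5],
        [0.11, 29.0, 11.0],
        [0.45, 18.5, 7.0],
        [0.15, 27.5, 10.5],
    ]
    needs_irrigation = [0, 1, 0, 1]  # made-up labels

    model = RandomForestClassifier(n_estimators=50, random_state=0)
    model.fit(readings, needs_irrigation)
    print(model.predict([[0.20, 25.0, 10.0]]))

Nothing cutting-edge, but the data and the problems are sitting right there outside the institution.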
No, it's a reality of the labor market for doctorate holders. Startups are experiencing a shortage of coders, but the supply of CS doctoral students is robust. Just look at how quickly internships/positions at industry labs (AT&T, Yahoo, Microsoft) fill up.
Not all computer vision is "low-hanging fruit," especially if you're pushing boundaries. Similarly, applying ML techniques to "other data that is relevant" is a far cry from using ML to save lives at a hospital due to misdiagnosis. I'm not talking about regressing A/B testing results.
> Not all computer vision is "low-hanging fruit," especially if you're pushing boundaries.
Computer vision is low-hanging in the sense that you already have everything you need to make positive contributions to the field. I think the same is true of ML in general, but your example was specific about the type of ML, which hangs higher due to data availability.
> Similarly, applying ML techniques to "other data that is relevant" is a far cry from using ML to save lives at a hospital due to misdiagnosis.
Are you saying that programmers that are not working directly on saving lives are essentially wasting their time? There are a lot of interesting ML problems that do not save lives, but they are still worth working on.
"Computer vision is pretty low hanging fruit for anyone to take on."
Sure, but to be fair to achompas, that isn't what he said. What he said was
"(3) push the frontier of computer vision in concert with former telecom engineers."
Pushing the frontier of CV (as distinct from implementing/applying some CV algorithms) is hard to do outside a university or industrial research lab. Without formal CS training, it is very hard (Not impossible, but very hard) to access those problems.
Hard is quite different to impossible, which is what achompas implied. The beauty of computing is that you are only limited by your imagination. Anyone can accomplish anything they want. You do not need a CS degree to get there – though for some, it might help.
My point is that you simply cannot generalize. You have absolutely no idea what talents someone has just by looking at their history. It is simply irrelevant information if you want to hire the best of the best.
Hard is quite different to impossible, which is what achompas implied. The beauty of computing is that you are only limited by your imagination. Anyone can accomplish anything they want. You do not need a CS degree to get there – though for some, it might help.
Look, I appreciate your attitude. You have a positive outlook on what you can accomplish, and that's undoubtedly a good thing.
But, to be frank, the skills required to access my example problems above are not trivial. Let's consider the autodidactic route for computer vision:
1. You need to be cozy with linear algebra, convex optimization, calculus, and algorithmic complexity if you even want to understand prior research. This, alone, is 1-2 semesters of course load for a full-time student.
2. Then, you need to survey prior research to gain awareness about what already exists. You'll hit Google Scholar, search for papers, and have to circumvent article paywalls.
3. After that, you'll need to code your own framework (non-trivial) or convince other researchers to share their source code (very non-trivial--almost impossible, given that they might monetize or license their work, or their university owns said license).
4. Then you need to collect data to test your CV algorithm, iterate on it, etc.
Universities overcome all of these barriers. Hence, it is unrealistic to suggest that someone can produce cutting-edge CV work on their own (or without university help).
1. Anyone can take two semesters worth of time to study the material. This is not exclusive to students.
2. You have to spend the time doing the research no matter who you are. Alternatively, you can ask someone else. Either way, anyone can do it.
3. This is a fair point, but you are allowed to spend money. If it costs money to access that code, so be it. Students are paying for that access too.
4. Again, true of anyone.
But more to the point, who cares how someone achieved their accomplishments? If it was through college, great, if it wasn't, still great. Why are you immediately discounting the person who did something amazing, just because he did it by himself?
Edit: I confused you with another poster. You may not have been judging people on their past. I do agree that people are more likely to do that work in a school setting, but that remains irrelevant when it comes to hiring.
I'm inclined to agree. Since the differentiator is expected drive, one can reasonably conclude that the autodidact is more likely to have it. It is what got them this far. While many graduates also fit that description, it is difficult to filter them from those who graduated only due to social pressures and income dreams.
For the theoretical programmers out there, you may recall that the future is independent of the past, given the present. I'm not certain it is worth hiring anyone based on their history. I would look to what they can offer today, and try to discern from that where they are headed in the future. It takes more effort this way, but you get what you give.
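(That aside is the Markov property, for anyone who wants it spelled out:

    P(X_{t+1} | X_t, X_{t-1}, ..., X_0) = P(X_{t+1} | X_t)

i.e., conditioned on the present state, the history adds no further information.)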
I hate to say this, but you're really starting to sound like you have a chip on your shoulder. You admit that GP has a good point, but then you go ahead and bash one type of person based on some ridiculously flawed notion that an "instructor put together" their knowledge for them. That's as far as I'm going to read this thread.
It depends on what you're doing. If you're writing java/python/ruby plumbing, sure, some practical experience is great.
If your job is to improve PageRank, I'd much rather take the person with a strong theoretical background in linear algebra, probability, machine learning, and algorithms. That person is more than smart enough to learn the coding on the job.
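To give a flavor of why that background matters, here is a toy power-iteration sketch of the basic PageRank idea (tiny made-up link graph, numpy assumed; the real thing is this plus a mountain of linear algebra and scale):

    import numpy as np

    # Made-up link graph: links[i][j] = 1 if page i links to page j.
    links = np.array([[0, 1, 1],
                      [1, 0, 0],
                      [0, 1, 0]], dtype=float)

    n = links.shape[0]
    transition = (links / links.sum(axis=1, keepdims=True)).T  # column-stochastic
    damping = 0.85
    google = damping * transition + (1 - damping) / n

    # Power iteration: the rank vector converges to the dominant eigenvector.
    rank = np.full(n, 1.0 / n)
    for _ in range(100):
        rank = google @ rank
    print(rank / rank.sum())

Knowing why that converges, and how to make it work on billions of pages, is where the theory earns its keep.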
A lot of the learning occurs outside the classroom too. Going to a good university helps facilitate that. Not to mention the possibility of working on research with some of the best minds in the world.
> The material at Penn State is 99% the same as MIT's
As someone who attended Penn State (but not MIT), I would have to say that's some extremely wishful thinking. And even if the material is 100% identical, the way it's taught certainly isn't. The fact that the latter is far more important is undeniable. After all, why bother with a university at all when we could all purchase the same material (textbooks) on Amazon.com?
You also seem to ignore the fact that many tech companies are now far more interested in your personal website/portfolio and GitHub page than the degree listed on your resume. Even if you find a potential employer who values an expensive piece of paper more than your actual ability, would you really want to work for them anyway?
I haven't attended Penn State or MIT, but I have attended the University of North Texas and Yale, and I agree with you 100%.
While there were differences in curriculum, the biggest difference I found was in expectations for the work I produced. My Yale classes simply expected me to do more with the material: take it further, make more inferences, produce a more insightful answer/project than the classes I took at UNT did.
Regardless of branding, I could never say the experiences have been interchangeable.
You also seem to ignore the fact that many tech companies are now far more interested in your personal website/portfolio and GitHub page than the degree listed on your resume.
Isn't that just a SV thing, though?
Even if you find a potential employer who values an expensive piece of paper more than your actual ability, would you really want to work for them anyway?
For many or even most people, the answer is a resounding yes. They just see the college degree as a painful/expensive ticket to gain entry. The skills necessary to do the job will be learned in the first 6 months of working, and the degree is just the shit-sandwich/iq-test-by-proxy you have to endure to get your foot in the door in the first place.
This is even more so the case where I live, due to the Scandinavian tradition to defer to experts and promote based on seniority and credentials. Oh, and having free college helps too, I guess.
Does anyone have a pointer to how this company plans to deal with the Blackboard patents? A few years ago Blackboard patented various aspects of online classroom applications. They offer a free license to not-for-profit projects, but it's unclear whether Udacity intends to remain entirely not-for-profit.
From what I understand Blackboard lost almost every legal case they filed with those patents, so as long as they have enough money to get through a lawsuit, they should be fine.
However, they still sue everyone they possibly can in order to lock down their market. For a few years now, the barrier hasn't been the patents themselves but the cost of standing up to them in court.
Theoretically this is illegal. In practice, it's hard to draw a sharp line between "plaintiff actually feels his patent should apply here" and "plaintiff doesn't honestly believe his patent applies, but is using the courts as an anticompetitive measure".
The US patent litigation process is a horrendously bloated system that favors lawyers and companies with deep pockets, and makes it possible for a sufficiently rich and motivated company to bankrupt an upstart competitor.
It would be awesome if they could compare (or map) this course list to what would be typically taught by a University-level Computer Science program...
I'm not interested in earning a degree/certification; rather, I'd want to take these courses for general interest only.
I'm interested in understanding the depth and breadth of each course, vs. just going off the heading. If I can understand the value of each Udacity course as compared to the standard (eg: another University's program), I'm more likely to commit / buy into the whole Udacity approach.
i.e.
"At Stanford, a typical 4/5-year degree has these X topics / competencies... and these 8 coures meet 50% of these competencies."
Without understanding the value of each course as it relates to, say, another University program or competency map, I have a hard time understanding the value of Udacity.
This probably sounds confusing, so if there's anyone from Udacity that wants to chat I'm more than open to. I think what Udacity is trying to achieve is great.
As someone who already has a bachelor's in CS, I'm still looking forward to filling in some gaps in my education with these. For example, I never took a Theory of Computation class.
This sounds bold, but I think the success of Khan Academy is simply due to the fact that Sal Khan is a brilliant man, and so is Sebastian Thrun. The best teachers provide the best teaching experience, and the Internet allows everyone to have this experience.
I really wish they had chosen a name for their university that was not so Web 2.0. It makes it unclear to me whether this is a startup or an educational institution.
It's both. Did you check out their website? Looks like a classic startup.
> We are a rapidly growing company located in Palo Alto, California looking for great people to join our team. We're looking for a wide variety of backgrounds - the only thing in common is a passion for improving education.
A university offers a lot more than just classes--one of its primary functions is to bring together lots of smart people. Being able to talk to both other students and professors face to face is important, and having plenty of other like-minded students around is great for more than just classes.
On top of this, universities also offer research. Being able to work on something exciting while you're there is important. And research benefits both from having a high concentration of smart people and having the appropriate facilities. (It also sometimes benefits from having a university full of guinea pigs :).)
Also, while most CS classes are easy to move online, this does not hold for other technical subjects which need equipment and labs. An online-only EE or Bio or Physics class cannot reproduce the hands-on experience which is very important to understanding the material well.
I think it's important to distinguish that a lot of EE, biology, or physics classes could be moved into an online format, because not all of these classes are labs. The demos would be easy to replicate as part of the lecture videos.
I'm a graduate student in the sciences and every biology, chemistry and physics class that I took had a lab, except the evolution class. You could put the lectures online, but universities will never be completely replaced, as long as people want to learn science or do research.
>as long as people want to learn science or do research
Maybe this will create a large demand for amateur-level, mass-producible, cheap lab equipment, and that would eventually lead to a situation like PC vs. server, where even top-notch servers are basically commodity hardware on steroids. This would lower the barrier for innovation significantly.
Hmm, at Caltech (my school) the courses are very much divided into either lecture courses or lab courses, and the phys/chem/bio majors probably have 2/3 lecture classes and 1/3 lab classes. The lecture-only classes could very easily be put into an online format.
Blockbuster is more than videos. It provides jobs to youth and sells snacks to customers. All of which is irrelevant to its basic economics when people stop buying from it.
If 10-25% of students decide to learn online, a university's budget is seriously compromised.
This could shutter a lot of the teaching-focused universities.
It wouldn't really work the way you are thinking. If 10-25% of applicants to the college I went to decided to go to Udacity instead, my college would have had exactly the same number of freshmen and wouldn't notice a dollar difference in its revenue. They already only admitted X% of the people who wanted to come in, where 0%<X<100%.
So these online college courses aren't going to kill regular universities. They might make the odds of getting in easier as some people forego normal college for online school.
Likely the schools that currently struggle to fill their classes (lower tier schools, for-profit schools) will go out of business, but these online classes simply aren't going to hurt the big colleges.
I'd say traditional universities will most likely move to similar platforms rather than just going extinct. This is because universities provide more value than just classes, such as research.
I think it can be successful in a niche, but won't go too far.
Universities have two main components: teaching and research. Udacity has a quite competitive teaching model -- there are cons compared to traditional schools, but also some pros.
But they don't have a research model. For a professor to leave their school and work somewhere like Udacity, they have to pretty much give up their research in favor of the opportunity to teach many students (online), and the money. Some may be willing to make the tradeoff, but most of the best professors won't and probably never will.
That's why I think the most successful online teaching programs will be affiliated with traditional universities. In CS, you could argue that the monetary resources needed to do research aren't as high, and you don't need specialized facilities. Maybe so. But you still need colleagues and collaborators -- especially in other fields like biology or engineering -- and you need grad students. And trying to do an online PhD doesn't sound very viable or desirable.
And in other fields than CS, where you absolutely need facilities, centers, etc to do research, it will be even harder to attract good faculty to a place that has none of these. You could build them all and bring in grad students, but then you're just a copy of a traditional university with no undergrads, and it's not clear why you didn't just partner with a traditional university in the first place.
So, this model may attract large numbers of students, but because of the research angle I don't see schools like these gaining the prestige and reputation necessary to draw students away from most traditional colleges.
Traditional research universities will be around for a long time. Good online schools may eat into the commuter-campus types of schools that are mostly teaching centers. This would include a lot of the regional campuses of major state schools.
I've said this elsewhere. Good lectures, even good exams, must be matched with good projects and discussions with peers (refereed by a deeply knowledgeable teacher). I wonder if a cottage industry of local study groups will form around these lecture sources. Perhaps Udacity could even spearhead the creation of these and offer some kind of certification to vouch for the local teacher.
Then you have the best of both worlds: an internally consistent curriculum with good lectures and digested information paired with one-on-one instruction and project development / discussion which will serve individual students in their own unique ways.
I know of a grad CS course that is essentially doing this. Students watch the lectures and help each other out, then supplement the course by presenting papers and aiming to have a project ready for publication by the end of the semester. It's a small course consisting of grad students and upper-division undergrads. It will be an interesting experiment.
But I think this is a great model. In my experience, state schools tend to have pockets of really bright students who can easily outperform the quality of the teaching (certainly not always the case). There are huge opportunities to improve public higher education in the US.
I think community-based schools will start cropping up. They'll align with some subset of online course material and there might be a local teacher to help coordinate, but otherwise it's the students themselves helping each other. The local, tangible presence would not only allow for the interaction / project-based element I'm so fond of, but also help pool resources for equipment.
If we're going into this brave new world of private, decentralized education we may as well do it in such a fashion that your own financial status doesn't prevent you from participating.
My 16-year-old son took the online Introduction to AI class and liked it very much. I told him that Udacity plans to offer an entire online CS curriculum & certification, and he's now very interested in evaluating whether he wants to get a Udacity degree instead of going to a more traditional university. I guess I will have to do some serious thinking before I decide whether I should encourage him to get a Udacity education or a more traditional one.
At least in the short term, a Udacity education is going to be considered the same as being self-taught. A traditional education is going to open up other opportunities that the self-taught cannot easily obtain. As long as Udacity is free, however, there's no reason not to pack in some extra classes if he has the time.
I have a traditional education, a BS in CS & EE, and it seems to me that it has never been relevant in my professional life. I'm not talking about what I learned in college; that part has been useful. I'm talking about the certification aspect. I finished college 29 years ago, and in all that time I've never had the issue of showing some kind of certificate come up when being offered a job. Maybe I'm an exception and I'm underestimating the certification aspect, but what I'm more concerned about right now is that he wouldn't experience meeting many different kinds of people if he stays at home to study instead of going to some campus, and that doesn't feel right to me.
I'm also concerned about how complete the curriculum at Udacity is going to be. I'm sure all the CS classes are going to be pretty good, but what about other subjects like math & physics? I wonder what they are going to do about that and who's going to teach them.
As someone who moved to the US for work, one important benefit of 'the certification aspect' is that it'll open up travel opportunities he'd otherwise struggle with: most work visas require a degree, sometimes 'or equivalent experience' but that's harder to demonstrate. If he actually wants to immigrate somewhere the degree may also provide points for that.
I'm talking about the certification aspect. I finished college 29 years ago, and in all that time I've never had the issue of showing some kind of certificate come up when being offered a job.
Doesn't this proof of certification happen the moment Human Resources asks for a resume?
I got my first job when I went to a company that sold Atari computers to buy from them an Editor/Assembler cartridge for the Atari 800. They were also selling Altos business computers and they wanted someone to help them with sales, they thought that if I wanted to buy that cartridge then I did know about microcomputers and microprocessors. I wasn't really interested in business computers or selling them but they did have lots of Atari software I could play with, I thought that could be fun so I took the job.
Their main focus was to sell 8-bit Altos computers running MP/M, a multiuser version of CP/M. I really didn't care much about those computers, but a few months later Altos introduced the Altos 586, a 16-bit computer running Xenix. I had bought a copy of "The C Programming Language" a couple of years before, and that was the first time I could get my hands on a machine with a C compiler. From then on I spent all my time playing with that computer, 7 days a week, writing C programs and learning all I could about Unix. The Altos 586 turned out to be a very popular computer, we sold many of those, and I became well known in the local computer industry as the Unix & C guru. From then on, people who have offered me a job have always been people who know me and know what I can do, so the certification has never been an issue.
There are plenty of online schools already. They usually either have you take tests at a local third party testing center, or they watch you through your webcam while you take tests.
Additionally there's plenty of cheating going on in traditional universities. I've even heard of people being paid to take entire classes for other people.
Agreed. If you're going to cheat, you'll cheat. It might even be easier to detect, e.g. enrollment and registration coming from one IP but all coursework submissions happening from another. Or other, more subtle detection mechanisms could be developed, I'm sure.
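Something like the following, as a rough sketch (the function, field names, and threshold are all made up):

    # Flag accounts where most coursework comes from a different IP
    # than the one used to enroll.
    def looks_suspicious(enrollment_ip, submission_ips, threshold=0.8):
        if not submission_ips:
            return False
        mismatches = sum(ip != enrollment_ip for ip in submission_ips)
        return mismatches / len(submission_ips) >= threshold

    print(looks_suspicious("1.2.3.4", ["5.6.7.8", "5.6.7.8", "1.2.3.4"]))

Crude, and easy to defeat with a VPN, but it shows the sort of signal an online school could watch for.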
Taking the big tests at third party testing centers should be the way to go.
However, I would like to point out that a university does not consist merely of lectures. The research projects, internships and even the interaction with peers cannot be recreated online. Traditional universities still have an edge here.
If enrollment was high enough, local meetups for some classes would go a long way towards solving this problem.
What I would like to see eventually happen is for learning and credentialing to separate.
I'd like to see a university that was more freeform. What if instead of just offering 4 year degrees, you could take a 6 month AI program, or a 3 month probability and stats course.
I imagine your concern is "how can an employer know this cert means anything?" The short answer is they can't; employers should stop filtering on the presence of a degree/cert/buzzword. For the jobs where certain certs might matter in a legal sense (certain areas of civil engineering, I'm sure), I would hope they vet candidates above and beyond the presence of the cert and knowledge of the most basic facts.
In practice, the result of cheating for a degree is the same as lying on a job application or resume. When the person gets hired for a job in that field their performance will make that job short-lived (if they're genuinely bad). Similar for putting a school and degree they didn't go to on their resume just to get hired. If they really don't know the material, their performance will show and worst-case the company lost six months of time they could have spent taking someone else through the hiring process. (Ed: okay, worst case is that the person's incompetence causes them to blow up the Earth on the first task they're given.)
If they do know the material, and their job performance is good, do you really care that they cheated for the degree or lied about having gotten it? (Okay, it pegs their honesty level quite a bit and you should definitely start bolting things to the floor, but if they do the work well they're still a valuable bee and you can keep them on while looking for someone else if it's morally offensive enough to you.)
I've received one of these emails as well. Almost makes you wonder how much offline university degrees will be worth in a few years.
Objectively, this is not a new idea. We have been talking about revolutionizing education for years - and free online education is pretty much the ideal. There is even enough material about most scientific fields and areas of study (some are more open than others, though) to become an expert.
So what is the problem? Becoming an expert with enough authority for people to actually listen to you. Easy enough to do it on your own in programming or entrepreneurship you might say. Not so easy in medicine or biochemistry (though I will admit I have no first-hand experience).
In that regard, certifications are a significant next step, provided that they get wide enough acknowledgement. That is the next battlefront, I think - convincing industries, governments and academia that online education is the way to go.
I think this will highlight differences among fields of education (?).
Programming is a very pragmatic field. Your ability to code is at its core judged by whether you can program something that works. Don't have a formal education? Whip up several interesting programs / websites on your own, and fewer people will care about your degree. In addition to that, it's becoming increasingly easy to learn the subject on your own.
Not many other fields are like that. Medicine, for example. Sure, you can learn the names of the bones in your body and understand the use cases of different drugs on the market, but can you diagnose a patient? To learn that, you need access to training in real hospitals, which is only given to students enrolled offline.
Same thing with biochemistry (needs access to labs and direct mentorship) and law (access to courtrooms? or certification from the proper boards). Many of these other disciplines are based on initial trust. It takes time for a biochemist or a lawyer until their work results in something. Having an actual (offline) degree serves as a kind of 'proxy' for the work results until they actually appear.
I'm not that certain that they can be replaced / revolutionized by online classes. I'm all for revolutionizing education, but I don't think this new approach is able to revolutionize all fields of education.
> but can you diagnose a patient? To learn that, you need access to training in real hospitals, which is only given to students enrolled offline.
There are medical simulation tools that teach you to diagnose patients, train you in surgery, etc. They seem to be very effective as a training tool. One can imagine a certification process that tests you using these tools and verifies that you have good diagnosis and prescription skills, and maybe some of the treatment skills.
That might be a good enough basis to admit you on a trial basis as a resident, or to a pre-residency short program.
Maybe in a similar fashion, one could build simulation software that can train biochemists affordably, and test to see whether you're qualified enough to work in a lab.
Call me a skeptic, but I don't think those simulations can ever replace real-life situations.
The human body is so bewilderingly complex that it is still the subject of thousands of research projects worldwide. How can you simulate something you don't completely understand?
Plus, in the case of medicine, one also needs to know how to interact with a patient. Is he/she telling the truth about his/her symptoms? Is there something the person isn't telling you that might affect your diagnosis? What about educating the patient about the disease?
These are real day-to-day situations that can never be simulated.
How can you simulate something you don't completely understand?
Many medical diseases are reasonably well understood, at least at a level where we know how to associate symptoms with tests and treatment processes. And there is real simulation software being used to teach med students, so it must have passed some quality assurance.
Yes, simulations might have a hard time replacing human interactions, but as far as I understand, you learn patient interaction in an environment external to the university (a hospital). There's no reason well-educated online students won't have access to those experiences.
I do agree on that point, but I'm not sure if pragmatic is the right word. Need of a specialized work environment, perhaps. Programming needs one, too, I suppose, but it just happens to be a computer.
Approaching other subjects the same way we do math and programming would be quite monumentally stupid. Could you trust a "doctor" to make the right decisions if his/her experience consists of watching lecture videos on YouTube and answering quizzes?
However, should we completely give up on those fields that require specialized practice? I see it as more of a challenge than a reason to despair. For one thing, you can combine online education with offline practice (provided willingness of all parties involved to experiment with new approaches, which is certainly not a trivial thing to ask). Also, an approach I'd try is having games that simulate the real-world task as close as possible. At some point, you probably would need specialized equipment and/or test subjects. Reducing that to a minimum, though, is in my opinion a very good thing, as it would drive down the cost of education.
Read your blog Ruchit. I agree with your thoughts on human interaction.
I think in a couple of years we will discover that the key benefit to online education is the ability to consistently deliver high quality instructional content to very large numbers of students.
The social component of learning should benefit from this. My hope is that online delivery of instruction will free up time in the classroom for the human dynamic of discussion, problem-solving and shared inquiry.
It's too early to conclude what the limits of computer-mediated human socialization are. It's still a pretty young field. For example, just something like six months ago, Google came out with Google+ Hangouts, and now some people are cooking together using it, which seems pretty unexpected.
Historically, doctors and lawyers did teach themselves and many would sell their services without the appropriate papers. (Or the appropriate papers were obtainable through a test without requiring spending X amount of hours beforehand.) There's nothing fundamentally different about those fields as opposed to programming that make it so you can't learn the majority of the knowledge you'll need in the field from home. You can definitely diagnose without formal training--there's a story I heard of a guy who was having the signs and symptoms of a heart attack, googled them, realized he was having a heart attack, and called the ambulance. There's a book whose title I forget that describes in fine detail plumages, shapes, and other features about a huge variety of birds, but has no pictures, and it's commonly used by bird watchers to help them classify.
You may have an argument with biochemists since it's still a relatively new field, and specific universities are a decent place to find mentors, whereas historically you might just go seek a mentor from your local village or in the Big City for things like alchemy. The mentor may even have their own "School of Me"; we don't have those anymore. Or maybe you might send a math paper to the Big Shot in Math in the UK, and if he likes it, he'll tell his circle of other math guys and suddenly you're "in" and will be recorded in the history books written by other people in the UK.
I think online media has revolutionized all fields of education and will continue to do so, but only in the simple way of making more correct information available and easier to acquire or recall. Self-learning is as old as books and online education only changes the source material from book to computer (which has a ton of benefits of course). I do agree with you that it is unlikely online education will ever completely replace having offline classes with other people, I know I would have preferred a programming mentor when I started out but all I had and knew of was a book. (It didn't occur to me then to seek out mentors at, say, a meetup.) The main reason I agree is because I don't foresee a decoupling of the university degree for a subject Y and a certification for doing Job X in a field sort of related to subject Y any time in the near future for many jobs, such as doctors and lawyers as you mentioned. What would be truly revolutionary for the Plebs is a mandate from the government saying HR departments can't require an accredited education, just as they can't require a particular race/sex/etc. (This won't stop discrimination of course just as the other mandates don't, but it would help I think even if it just dispels the notion that "I need a degree to get a job" that follows directly after "I need a high school diploma / GED to get into college.")
From what I read, today's online learning is pretty social, not self-learning. I guess that's a big part of what enables so many people to be successful in learning at home.
Part of the problem right now is that online education has such a bad rap - for example, The University of Phoenix is widely recognized as a degree mill. It remains to be seen whether companies/institutions (like Udacity) can get the certification part good enough that it will be worth something.
UoP's bad rep comes not from being a degree mill but from having a high dropout rate (which is sort of the opposite problem -- if they were a degree mill their dropout rate would be low).
Their problem is really that they spent many years recruiting unqualified students who used financial aid to attend, telling them things like "it's no problem to take on $100,000 in debt, with a degree you'll easily be able to get a job that pays $100K." Then the student drops out after four semesters and defaults on their loans, leaving the taxpayer holding the bag.
> That is the next battlefront, I think - convincing industries, governments and academia that online education is the way to go.
If the program was an accredited degree, I think it would get us a lot closer. I'm sure that there are some online friendly institutions such as WGU and University of Phoenix who would be interested in offering a low cost CS degree program.
The main point is that if they line up credentialing that mirrors Udacity's courses, they can offload the cost of curriculum development and course management and sell the exams and degree, at a substantially higher margin. They can also offer a pure CS degree instead of the IT and MIS degree they currently offer.
It doesn't have to be free... it just has to cost less than 20k a year to be disruptive. This has the potential to be very disruptive.
One of the things that excites me the most is not only the potential to eliminate the massive load of debt that your average graduate inherits when they're finished with college, but also the way it lowers the barrier to entry for classes. So a person who has already graduated can study a cross-discipline a lot more easily. I think the number of people with expertise in very different areas is going to increase. A lot of potential opens up if/when that becomes a trend.
Believe it or not, PA has one of the highest rates for in-state tuition. The per-credit-hour rate for my Post-9/11 GI Bill is based on the highest in-state tuition at a state school. Last time I checked, the per-credit-hour rate for PA was about $900, while in California it was around $300. If I didn't already know better, I probably would have guessed it was the other way around.
When I went to Georgia State (started in 2002) it cost about $4k/yr (not that I paid any of that, I made about $6k a year after tuition and fees from scholarships).
Now that my little brother is going there, it costs him about $10k a year in tuition and fees.
As someone who is extremely interested in getting started in the e-learning space, what are some ways to offer credentials to people via your e-learning company?
I imagine there is a host of options, ranging from formal university credits (which would require teaming up with some educational institution) to just an email saying "congrats, you done it bro". I'd imagine the interesting options lie somewhere in the middle?