My Education in Machine Learning via Coursera, A Review So Far (richardminerich.com)
105 points by Rickasaurus on Dec 3, 2012 | 29 comments



One thing I've found somewhat odd is how seriously people take the timeline on Coursera courses. To me, one of the biggest weaknesses of these courses is their hard time constraints.

For all of the courses I've taken (ML, PGM, NLP, SNA) I've followed roughly the same pattern: follow the course for 50-70% of the material, doing all the homework etc. Take what I've learned and go off and do some fun projects. Once I hit a roadblock, return to the course and finish up (usually months after the course is 'officially' done), typically forgoing the homework since I've already created 'homework' for myself.

Compared to my classroom experiences, I've found that this method greatly enhances the benefit of the material I learn later in the course, as well as allowing me to more strongly reinforce material learned earlier. This is also something that physical classes simply cannot emulate. I do find it a little funny that I've never officially completed a Coursera course, despite having covered much of the material very thoroughly. Hopefully as these courses continue we'll start to see them adopt patterns that extend beyond replicating the limitations of a physical classroom online.


Just to provide a counterexample: for me, the hard deadlines are really helpful. That, along with the near-instant feedback, is what makes the experience so much better than just watching a series of lecture videos. They act as a motivating factor and help prevent me from getting too sidetracked by things where I wouldn't be learning as much.

Maybe that's an area where online classes can innovate further: hard deadlines could be optional. People like me who want them could enable them when they start a course; people like you who don't, could disable them instead.


I also like the sense of accomplishment in finishing a class with a high grade and time constraints. It's like beating a video game on hard :).


There are several studies showing that hard deadlines usually result in better performance and higher completion rates, so you're not alone[1]. They also have the advantage of creating a community that's on the same page as you are.

So there's no clear-cut answer. Making hard deadlines optional could very well cause more damage and less completion even if it's better for a few individuals. I hope that Coursera is collecting analytics on how people perform with different types of deadlines, so that they can come up with the best overall solution.

[1]: http://pss.sagepub.com/content/13/3/219.short


I've done a number of MOOCs now, and I've learned to ignore the timeline in some cases. Sure, I don't get a certificate, but that's never been the point for me.

I just learned about the Neural Net class about a week ago and it's scheduled to end today. I decided not to let that stop me and went through all the lectures in a week. I'm going to have to watch a lot of them again, and I need some time to work with the concepts, but I enjoyed 'cramming' it all in one big gulp.

On the flip side, I stopped doing the Computational Finance course at week 8 of 10, but I'll go back and finish it up over the holidays.

And on the third hand, the Modern American Poetry class I did was best done following the timeline, so that I got to participate in the forums at the same time as everyone else. That was a very forum-heavy class, though, which is rare.

Anyway, I'm just glad all this stuff is out there now. I've learned a ridiculous amount in the last year.


Same here. I already have degrees and the certificate of completion doesn't do much for me. I value the knowledge and want to use it for fun projects. I am really annoyed by the course format for Coursera and the other clones. They should let anyone access the old archival videos for all their courses. Frankly, I'd pay for that.


How are the different online-education websites in terms of being able to take courses at your own pace? I'm interested in trying an online course out to pick up some statistics and machine learning, but I probably wouldn't be able to follow it week-by-week (and not having to wait for a given start date would be great).


All Coursera courses are deadline based. Some of the courses remove their material at the end of the course. Others, such as the database one, are later opened up as self-study but do not offer a certificate of completion.

In the case of edX, MIT's 6.00x course is strictly deadline based. CS50x allows you to take the course at your own pace as long as you finish everything before the 13th of April, 2013.

But those aren't the ones you're looking for. Udacity meets all your requirements. They have open enrollment, meaning you can join in at any time. Furthermore, they have no deadlines. As long as you complete all the problem sets and take the final exam, you get a certificate.

Here are some Udacity courses that might interest you:

Statistics 101 - http://www.udacity.com/overview/Course/st101 - Taught by Sebastian Thrun

CS271 Introduction to Artificial Intelligence - http://www.udacity.com/overview/Course/cs271 - Taught by Sebastian Thrun and Peter Norvig

CS373 Artificial Intelligence for Robotics - http://www.udacity.com/overview/Course/cs373 - Taught by Sebastian Thrun

Hope that helps.

Edit: There's CS188.1x by Berkeley on edX - https://www.edx.org/courses/BerkeleyX/CS188.1x/2012_Fall/abo...

It's deadline based, but I don't know whether they will remove their material at the end of the course or not.


Something I found out recently: there are many courses that pull their course material when the course is finished. Even if you are/were enrolled, you lose access. I found this out from the Statistics One course (since you mention statistics), which cut me off in the middle.

Obviously, you can avoid this by downloading all the videos and slides as soon as you have access.


Various Coursera courses have a self-study option, which lets you go through the videos & assignments, but there's no certificate at the end.

I believe that all Udacity courses are now self-paced. You can go through the lectures/quizzes/assignments whenever you want to.


How do the courses on Coursera work? Can a person do a course merely by watching the videos after it closes?

Is it possible to review old courses and take them at my own pace, or are they completely gone once they are done?


It depends. Some courses are operating on a schedule and once they're closed you can't access the materials. Other courses operate on a schedule but have elected to keep their materials open.


Asynchronous learning will definitely play a major role in the future, as will a more fine-grained graph-navigation approach, because there's a wide spectrum of the world population for which the September-May, two-semester academic calendar doesn't make sense. The problem is that it's hard to integrate testing (and certification) into that: answers to exams will appear on the Internet at some point, so the exam questions will have to be refreshed continuously.

Academic honesty is probably the biggest hurdle that online education is going to face. Once grades for these courses start to matter (and at some point, they will) there will be people in the world who will try to cheat.


http://news.ycombinator.com/item?id=4849300

to beat a dead conversation past the point of anyone caring other than me...

if you looked at the minute-by-minute activity, an honest effort to take a class should produce a fairly detectable minute-by-minute signature

for example, someone answering the questions without having watched the whole video is unlikely to be making an honest effort

there are enough data points in there that detecting cheating should really be a question of mining the right ones
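
a toy sketch of the kind of rule I mean, in Go (the event fields and thresholds are made up for illustration, not any real Coursera schema):

  package main

  import (
      "fmt"
      "time"
  )

  // Event is a hypothetical per-lecture activity record.
  type Event struct {
      VideoWatched time.Duration // lecture time actually played
      VideoLength  time.Duration // total lecture length
      QuizOpened   time.Time     // when the quiz was opened
      QuizAnswered time.Time     // when answers were submitted
  }

  // suspicious flags a session where the quiz was answered without
  // watching most of the video, or implausibly fast.
  func suspicious(e Event) bool {
      watched := float64(e.VideoWatched) / float64(e.VideoLength)
      tooFast := e.QuizAnswered.Sub(e.QuizOpened) < 30*time.Second
      return watched < 0.5 || tooFast
  }

  func main() {
      e := Event{
          VideoWatched: 3 * time.Minute,
          VideoLength:  50 * time.Minute,
          QuizOpened:   time.Now(),
          QuizAnswered: time.Now().Add(10 * time.Second),
      }
      fmt.Println(suspicious(e)) // true: barely watched, answered instantly
  }

in practice you'd mine the thresholds from the honest population's activity rather than hard-coding them, but the signal is the same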

http://news.ycombinator.com/item?id=4413474


There are courses definitely worth trying:

  ML is the general introductory course.
  PGM is more about applied knowledge.
  NN is a specialized and very good one.
Scala is just a buzzword. Lisp is a language for AI for many reasons - it has almost no clutter, so you can read it.

There is also Berkeley CS188.1 on edX, which covers search.

Of course, courses alone are not enough. You need textbooks: AIMA, PAIP, and the PGM book are highly recommended.

But most importantly, you need practice. One cannot learn how to swim, ride a bicycle, or wrestle just by reading books and watching movies.

Very few regions of your brain learn something just by watching and listening. Those which are active when you actually do something must be trained by doing. That is how you will form your own intuitions, timing, etc. Non-verbal parts of the brain are doing most of the job. Neural networks need to be trained on real data to learn its behavior.)

This simple idea is a cornerstone of MIT's approach to teaching CS. They do lots of projects and labs, working in groups. This is real training. The recent version of 6.01 is a good example.

So, start your own project or join an open-source one. Then you will get real practice, training, and feedback from other people. It is not the meetings that matter, it is the practice.


I agree with most of what you're saying, but not this:

Scala is just a buzzword.

Disagree. There's a strong case for static typing. I am not saying that it is "right" or "wrong", but this static/dynamic trade-off is not simple and has no all-purpose answer.

Also, you will never convince most engineering managers to let you write a production system in Common Lisp. They may be wrong in their prejudice, but they're focused on risk-reduction and would prefer all production software to be written in Java. Scala has a fighting chance of getting into the "for production" language space. In 15 years, most of the good Java engineers are going to have moved to Scala and prefer it.

Out of curiosity, have you tried Clojure? It's pretty neat. I actually like it better than I like Common Lisp.


"they're focused on risk-reduction and would prefer all production software to be written in Java" - this is cancer, so to speak.)

I know only one way to reduce the risk of running software: choose appropriate hardware, and run as little software as possible.

You must know your hardware and your software. That means the software must be compact, readable, and easily modifiable, so that you can quickly adapt and fix it on the go.

So, I would choose a well-defined, mostly-functional, small, simple language with a decent compiler directly to native machine code and a very thin FFI for using specialized OS services.

It could be a solution from http://scheme.com or Gambit-C, not anything that begins with J.

So, in my opinion, a native compiler for, say, Arc to x86_64 (written in Lisp) with UNIX integration (NO Windows support) would be a better solution. Unfortunately, it doesn't exist yet.)

It seems like Go is taking the same approach: it has a very thin layer of abstraction on top of the OS (a port of the Plan 9 core libs), native compilers, and a comprehensible runtime. So this is a real, available, better solution.

There is also Erlang, with almost the same underlying principles, but complicated by a VM.

In order to reduce risks you must have a deep understanding of what you are running.


I'm loving Go. It doesn't have a lot of libraries designed for machine learning, but it's easy to write wrappers, and we have one for Bayesian classification: https://github.com/jbrukh/bayesian
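
For example, training and querying a classifier with that package looks roughly like this (going from memory of its README, so double-check the exact signatures against the repo):

  package main

  import (
      "fmt"

      "github.com/jbrukh/bayesian"
  )

  const (
      Good bayesian.Class = "Good"
      Bad  bayesian.Class = "Bad"
  )

  func main() {
      // Build a two-class naive Bayes classifier and train it
      // on a few hand-labeled word lists.
      c := bayesian.NewClassifier(Good, Bad)
      c.Learn([]string{"tall", "rich", "handsome"}, Good)
      c.Learn([]string{"poor", "smelly", "ugly"}, Bad)

      // Score a new document; 'likely' is the index of the
      // highest-scoring class.
      scores, likely, _ := c.LogScores([]string{"tall", "girl"})
      fmt.Println(scores, likely == 0) // true if "Good" wins
  }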

I'm investing my time in learning this language; I'm sure it will be more widely used in the coming years.


This is a nice overview. I'm up and down on the Coursera courses, as they seem to be hit or miss: at times incomplete, at times easy-A material, and at times intractable. There really is no good progression from A to B, and that is fine if you are a true autodidact as I am, but for those who are considering this, or any collection of OCW, as a replacement for standard education, I would strongly suggest not doing so.

The problem with Coursera, IMO, is that despite the promise of free education for all, the very large number of courses with a broad spectrum of usefulness and difficulty makes it too difficult to navigate, and can cost a student hours of wasted time watching videos that may be far beneath or above the student's abilities. I would suggest Coursera create a pre-exam section for all the courses so the student can have a slightly better understanding of what they are signing up for. Hell, they are promoting themselves as a Machine Learning repository; can't they do a basic suggest-a-course feature based on pre-exams?

With that said, I love the idea of Coursera, Udacity, edX, etc. and what they promote. I am one of those people who, despite being American, had zero chance of ever getting into college due to the quality of my inner-city high school and financial circumstances. Of course, the issue is that if you do grow up in these circumstances, you probably won't be able to take advantage of the free education because you may not have a computer. Regardless, anyone who wishes to partake can join with little more investment than a $200 computer from Walmart, so the financial excuse isn't quite the barrier that it once was.

How much opportunity do any of these courses open up? How much do employers give a hoot about a silly certificate from a pseudo-school that rides on the backs of watered-down Ivy League classes? Employers still take a degree from DeVry more seriously than the scattered layout of MOOCs, and until this changes, there will never be tangible proof of progress in the sphere.

I could tell employers all day that I read, and worked through, SICP and Cormen, but why should they care? They shouldn't, because there was no one around to tell me my code is wrong. I can show employers websites I built using exotic languages such as Clojure, but why should they care when I can't pass a basic whiteboard test? I could tell them that I can proficiently write in x, y, z languages, but they don't care, and they shouldn't. To all of those people who suggest that employers don't care about your background education and only care about the projects you worked on, I challenge you to point out how many of your coworkers don't have at least something that resembles a Formal Education (TM). Coursera is not a Formal Education (TM). The proof will come if, and only if, one day a self-taught Coursera student with a large portfolio of personal projects nails a job over a kid from MIT with no portfolio. No one, at this point, can seriously say this is happening at a convincing scale.


It's not the silly certificate that you should value from these courses, it's the learning itself.

I think you overestimate the utility of formal qualifications for getting a programming job (assuming that that's what you're talking about here). In fact I would say that if you tell a potential employer that you've worked through SICP, you show them some projects that you've done, and they reject you on the basis of not having a degree, you've dodged a bullet!

I'm completely self taught. I haven't got a degree, nor have I worked though SICP. I'm in my mid 30s and I've been professionally programming for about 5-6 years. I work at a hedge fund in London; it's doable!


You know that a Coursera-taught team won a (non-academic) machine-learning competition, right? Knock-on articles about it made the rounds here a few weeks back.


This article popped up on Google. Is this what you're talking about? http://gigaom.com/data/why-becoming-a-data-scientist-might-b...

The winners were a mechanical engineering student, an actuary, and an insurance risk analyst. This is all neat, but I was referring specifically to students with no education using these courses as a replacement for college.

Yes, as an add-on, I give it a huge Plus 1, but as a replacement, not yet. I'm not saying there is no hope, I'm saying it's not quite there yet.


I missed that one, do you mind sharing a link?


Why can't you pass a basic whiteboard test? Haven't you learnt what you need doing the projects you mentioned?

By whiteboard test, I assume you refer to a coding problem given at an interview. I can see that it can be very difficult to get an interview without formal credentials, but once there, wouldn't your experience allow you to shine?


I never did a whiteboard test, so I guess I don't know how I'd do, but I can't imagine that I would blow anyone away. I'd probably be asked a stupid-simple question for, say, a junior role, but give an algorithm or answer that would quickly expose my inexperience, lack of ability, and lack of training / guidance.

To be clear, I wasn't exactly stating my own credentials in that post; I was simply stating that even with said credentials, there is little chance of being taken seriously. I never read Cormen, but the rest is fairly accurate.


It's my experience that if you acquire the skills and talk to enough people about them, the opportunities will present themselves. It's not just big companies out there; there are a lot of smaller ones who can't compete with Google in hiring PhDs. These are a good place to build credentials that can carry you to the larger institutions, if that's where you want to be.


Nice write-up. While I might disagree with you on certain points, without any proof on my part, I completely agree that one should take open courses almost exclusively because one wants to learn something new. I get a huge kick from learning new stuff. If anything, I started with the class on algorithms in hopes of landing a job at Google; well, I never did, but it gave me a taste for knowledge that is hard to satisfy. :)


You might want to relax. I mean, these MOOC courses only became ubiquitous this year. Give it at least 5-10 years to see what the effects will be.


some well-designed pretesting would go a long way in improving the usefulness of online schools



