Artificial Intelligence Graduate Certificate (stanford.edu)
224 points by tma-1 on Jan 31, 2017 | 116 comments



Not sure if people remember, but Stanford, in collaboration with Sebastian Thrun, offered one of the first online educational courses in Machine Learning. This piqued my interest in ML, and I enrolled in and completed the course from one of the most rural parts of India (I had to create an extension antenna for my 3G phone to get better reception).

Udacity was born due to the popularity of this course.

Can't thank Sebastian and Stanford enough for this free course.


That's a [edit:perfectly fine] example to choose for a Stanford MOOC, since Coursera was started based on the success of Andrew Ng's machine learning course.

edit: People below (and above) are more knowledgeable than I am.


Actually, Thrun's AI course, Ng's ML course, and Widom's Databases course were announced and ran pretty much simultaneously. They were all successful, but only Thrun decided to split off because of it.


I noticed that Andrew's course is still available.

https://www.coursera.org/learn/machine-learning

Is it still a good introduction to machine learning? It's several years old now. Are there better courses available?


That course is a very good basic machine learning course. It covers many of the stepping stones to more complex machine learning problems, no matter which specific field.

So yes, it is a good intro to machine learning. New advances come up all the time, but you will definitely need to know most of the topics in that course if you want to properly understand the latest techniques.


It also works great psychologically - concretely, Andrew is capable of inspiring enthusiasm and confidence.


I see what you did there. :)

Someone even made a drinking game out of it: https://www.reddit.com/r/mlclass/comments/lvxuz/mlclass_drin...


It's a good course.

The only(?) real criticism of it is the reliance on Matlab/Octave.


And even then, I can see why they use it - the syntax is pretty light and largely gets out of the way of the problem-space.

The only problem is that modern applied ML is dominated by Python...
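For what it's worth, the course's vectorized Octave one-liners map almost directly onto NumPy, so the transition cost is small. A minimal sketch (the cost function here is the standard unregularized squared-error cost from the course; variable names are my own):

```python
import numpy as np

# The course's Octave idioms translate nearly one-to-one to NumPy.
# Octave:  h = X * theta;   cost = sum((h - y).^2) / (2*m);
X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])          # m=3 examples, first column is the bias
y = np.array([1.0, 2.0, 3.0])
theta = np.array([0.0, 1.0])        # a parameter vector that fits y exactly

h = X @ theta                       # Octave: X * theta
m = len(y)
cost = np.sum((h - y) ** 2) / (2 * m)   # squared-error cost, zero here
```

Either way, the linear-algebra habits the course builds carry over directly.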


I'm curious why there hasn't been a push towards more efficient compiled languages like golang?

I don't work with machine learning and AI, but I used to do a lot of server-side programming with dynamic languages. Switching to golang has been great: I'm far more productive with it, and the CPU and memory savings have been great. Isn't ML the kind of domain where CPU cycles and memory matter?


Because of GPUs.
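To unpack that one-liner: in applied ML the host language is mostly glue, and the heavy numeric work is dispatched to compiled BLAS or GPU kernels, so swapping Python for Go buys little. A toy illustration of the split (NumPy standing in for the compiled backend; the hand-written loop is the part a faster host language would speed up, and it's exactly the part you never write):

```python
import numpy as np

A = np.arange(6, dtype=float).reshape(2, 3)
B = np.arange(6, dtype=float).reshape(3, 2)

# Pure-Python triple loop: the "slow interpreted path" golang would fix.
C_loop = [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(2)]
          for i in range(2)]

# What ML code actually does: one dispatch to a compiled BLAS kernel.
C_blas = A @ B
```

Same result either way; the interpreter's overhead is a rounding error next to the kernel's work on real-sized matrices.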


Thrun's course was offered earlier* and Udacity launched earlier. What's funny?

* edit: I just learned the courses were simultaneous.

Udacity launched June 2011, Coursera April 2012, according to Wikipedia.


No - AI Class (Norvig/Thrun) and ML Class (Ng) both ran at the same time, in the Fall of 2011. I took both courses, but only completed the ML Class (I had to drop the AI Class due to personal issues at the time).

I'm not sure which MOOC company started first (in 2012), though - I do think it may have been Udacity. I know that when it did start, it wasn't able to offer the AI Class because of {reasons} (I think maybe licensing of course content or something) - and so instead it offered the CS373 course (at the time titled "How to Build Your Own Self Driving Vehicle" or something like that). By that time (Spring 2012), I had moved on past my earlier problems, and jumped at that course - I took it and completed it as well.

Coursera, meanwhile, was able to offer the original ML Class as a premier course (maybe Ng had different rights to the content, or maybe there were issues on the AI Class to do with the dual-instructor partnership of the class - I'm not sure what really happened there).

After about a year (IIRC?), Udacity was able to finally offer the original AI Class as part of their courses (I think now renamed "Introduction to Artificial Intelligence").

Today, I'm taking the Udacity Nanodegree course, as I've noted before here.

I think this offering from Stanford is interesting, but it doesn't seem to be currently available to enroll in, because the two required courses appear closed or something - maybe they're taking enrollment for a future start date. That said, while the tuition isn't outside of something I could do, I currently think the best use of my money and time - after I complete the Udacity thing - would be to pursue a BA in CompSci or something of that nature, then pursue other paths.

Should've done it a long time ago when I was younger - but I was dumb.


Let's not do anything drastic. If you've finished those courses and have decent experience, a bachelor's degree will get you little to no practical value. If there are holes in your knowledge, by all means fill them, but a degree of little value is not a good investment in this line of work.

You are better off filling holes in your knowledge and building something awesome. That's it. If you want an interview and offer at Google, Facebook, or Microsoft, it's pretty easy with a solid portfolio and a few months' study or refresher on algorithms.


Thrun & Norvig did Intro to AI, Ng did Machine Learning, Koller did PGM (another course related to AI/ML) and Widom did DB/SQL, which is how it all started. Thrun then started Udacity, Ng & Koller started Coursera.


Did that course, actually. Sebastian Thrun and Peter Norvig were the instructors. Still have the certificate from the course (back then they used to provide signed certificates from the instructors for free).


Why not take the Georgia Tech online MSCS instead? Only $7k, you get a Master's degree instead of a 'Certificate', and you have the option of covering a lot of the same material.


Signalling value. It's kind of a nihilistic answer, but that's it.

The question is only whether the HR person's neural network triggers more strongly on "master's degree" or "stanford".

None of this has anything to do with what you learned or know. Like most tertiary education.


Yep. And Sebastian Thrun's course (the Stanford professor, Google VP, Udacity founder, and Google driverless car initiator) is part of that program. You can take AI for Robotics for credit there :)


They require a previous undergraduate degree, which is a no go for me. I'm already a high level software engineer at a top tier company. I have no desire to go back and take an undergraduate degree. But I do like collecting certificates... Currently taking Udacity's self-driving nano-degree, pretty fun for someone with no robotics background.


How closely do they check the undergrad degree? Some second rate universities will trade you a degree for experience plus cash. Most of them are non-accredited though.


Do you happen to know any accredited university or institution that would do that? Maybe with some examinations?


I don't think any actual accredited university would allow you to "test out" of an entire four year degree, let alone simply buy it because you have comparable experience.


You might be interested to know that it's possible to enter a Master's program without completing a Bachelor's degree. This is not openly advertised, but I am aware that even universities with high signaling value like CMU and Oxford are open to this arrangement.


In general Masters in CS programs require either a BS in CS or coursework in areas like operating systems, data structures, algorithms, computability/complexity etc. These courses generally have prereqs such as 1-2 semesters worth of programming courses, discrete math etc. Someone without a CS background most likely would not be accepted to a standard MS program.


I'm sorry, but this is not correct. CS Masters programs care more about making money than about gatekeeping credentials. Georgia Tech's MS as well as CMU's both accept non-traditional, non-CS backgrounds.

This is trivially verifiable on LinkedIn. That said, they do frequently require high undergrad GPAs.

Source: knowing Georgia Tech PhD TAs and being a CMU TA for Master's classes


> Georgia Tech's MS as well as CMU's both accept non-CS traditional backgrounds.

There are exceptions but in general CS Masters programs look for sufficient coursework/background in Computer Science and Mathematics. An art major with 0 math/programming courses will find it difficult to be accepted to a Masters program in CS.


Yes, of course. But that's moving the goalposts a bit. When I originally made my point, it was to another commenter here who is presumably a self-taught software engineer without an undergraduate degree, not an "art major."

For other commenters reading this thread down the line: my point here is that yes, while it's an exception, it isn't exactly stunningly difficult to get admitted to a good CS graduate program without an undergraduate degree. You don't need to be a prodigy who is so inarguably talented that you're just skipping the bachelor's. You'd be shocked what you can bypass by convincing a real human with decision making ability that you can do the work, instead of relying on every rote admissions page.

Admissions pages for graduate degrees are like job posts - almost all say they want an undergraduate degree at minimum, but when you open the kimono many are willing to silently drop that requirement without advertising it to others.


Again, I disagree - I know for a fact that there are CS masters students with undergrad degrees in biology, neuroscience, and humanities (though with high GPAs in those).


One of my colleagues dropped out of his undergraduate degree and went back to do his Masters in his 40s. Given his experience as an operating systems architect in the meantime it's not surprising he was accepted without a bachelor's degree.


For posterity's sake, do you mind telling us which university it was? Or at least what "tier" university it was?

In my experience I've found (ironically) that it's generally the more reputable universities that are actually willing to do this, because "celebrity" (and I use that word very loosely) professors can talk a promising graduate student in over the admissions department's veto at those institutions.


I am directly familiar with students accepted to CMU and Oxford for CS graduate programs (and other universities, but they aren't as noteworthy) without a Bachelor's degree. It's true that they had a CS background, but they demonstrated that with experience and one or two specific Coursera courses in areas that were lacking.


My comment was aimed more for someone without a CS background entirely. If you have sufficient CS preparation, then of course you can be admitted to a MS program (assuming you satisfy whatever other departmental admission requirements). I'm sure there are exceptions out there/programs that accept non-traditional students. Even if you lack a CS background (assuming you already have an undergraduate degree), you can make it up in postbac programs and the like.


Thank you for the information. I had thought this was possible. In my initial queries to several institutions, they did not seem receptive at all even with extensive experience and solid knowledge back fill. I think I probably just didn't talk to the right people at the time.


Any undergrad degree or only CS ones?


Are ML/AI certificate holders competing for the same jobs as ML/AI PhD graduates? I have a list of follow-up questions to this, but it seems like there's a lot of hype for ML/AI/deep learning and no definitive way to track this new job market. These online programs, although more accessible, are doing the same thing as their physical-campus counterparts and not being transparent about what to expect after finishing their programs.


ML/AI is a very new field. I heard someone say, "AI is like teenage sex. Everyone is talking about it but no one is sure how to do it."

So for jobs, ML/AI is pretty much an even field, apart from niche research jobs which require PhD graduates.


According to another HN thread this is actually not true. Only having a certificate, without a CS/AI undergrad, makes it almost impossible to get a job. You need a Masters at least for most positions; for some engineering positions you can get in with an undergrad in CS and the right AI courses.


Do people in the tech industry find any more value in physics/math undergrads than CS? CS is not as rigorous. I always thought that physics/math undergrad + ML certificate might be just as competitive as a masters in ML coming from a non-relevant BS


I'm going to have to disagree.

Physics/Math is useful but your particular field/speciality in CS is relevant as well. I would bet a bit more money on a CS grad to understand convexity, comp geometry and optimization than a physics grad, and math majors don't always have the code skills to develop things for production.

Point of argument here being that frequently neither do CS PhDs, and at tech companies you pair Math/CS PhDs with 'production' engineers.

Also certificates aren't really rigorous necessarily and also are quite easy to one-off versus relevant programs/theses/research experience.

"Production" is an overloaded term depending on where you go.


It's quoted on the neural-redis README[1] and attributed to Mike Townsend[2].

[1]https://github.com/antirez/neural-redis

[2]https://twitter.com/Mikettownsend/status/780453119238955008


And before that it was being used for big data.

http://techcitynews.com/2014/08/07/big-data-is-a-lot-like-te...


Admittedly it's been a few years since I considered getting into this field but the last time I checked there weren't that many positions available and most of them required graduate degrees from good schools. Seems that even companies with a significant ML emphasis employ a lot fewer ML specialists than nuts & bolts coders.


The certificate holders may be competing for the same jobs, though I really doubt they'd win out over PhD candidates unless they had done something substantial, at least as interesting as an ML/AI PhD holder's work - or I guess if the company absolutely can't find anyone with higher qualifications.

And I think what I perceive as your doubts as to landing a job with one of these are probably founded. This is a way for the university to make a bunch of cash. As long as that happens, whether students get a related job or not is not important to the university except to help with marketing to prospects.


What will the AI expert job market be like in two years? Expectations seem to be running high, people see promising trends, but no one can see the future. At least, not yet, I think. Maybe there's a new paper.


$20K for a few online courses? Genesereth teaching logic and automated reasoning? (I took a class from him once. Exam question: "Does a rock have intentions?")


So, does it?


You have to take the class to find out!


You won't believe what happens next!

Find out for just 20k!


In the end, the answer is "it depends."


This costs about $15k - $19k. With all the great lecture and homework content out on the Internet about AI, ML, deep learning, vision and natural language, I wonder if one can put together a more comprehensive and customized version of this that one can learn for free.


EdX is strongly committed to learning being free. You can audit any of the courses in the AI MicroMasters for free.

https://www.edx.org/micromasters/columbiax-artificial-intell...

(EdX employee)


The toughest curriculum in any AI-related MOOC I've seen so far. Just CS228 - Probabilistic Graphical Models - is enough to make someone's brain bleed. Wish Daphne Koller still taught that course.


You are in luck. They have just re-released it on Coursera. https://www.coursera.org/specializations/probabilistic-graph...


That's awesome! Thanks for the link


Oh well, as someone who is currently enrolled in Udacity's self-driving car and AI nanodegrees, watching the ongoing MIT self-driving course, and considering GATech's online M.S. in ML, I should probably start planning a budget for this Stanford offering...


Awesome! I'm also currently taking Udacity's self-driving car nanodegree, but I think I only intend on doing the first of three semesters because I want to concentrate on the fundamentals, and not necessarily self-driving cars. Do you recommend enrolling or checking out the AI nanodegree? Have you looked at the ML one?


It's too early to evaluate the AI nanodegree as it has just started and so far we did a Sudoku solver ;-) As for ML, I have it on my mind after SDC & AI, though I already took the first run of Ng's ML course so I am not sure it would be that beneficial.

So far SDC is the best fun I had in a while, getting a car drive all by itself on a circuit feels absolutely cool! ;-) Have fun as well, I hope they have more cool stuff prepared for us!


Wow that's a lot of learning.

By MIT self driving course are you referring to http://selfdrivingcars.mit.edu/ ?


Yes, that one. It's more difficult than Udacity's SDC but also more real-world oriented. I've also subscribed to most of the Udemy courses related to CNNs/RNNs/reinforcement learning/ML, so I have a lot of fun ahead ;-) Hope you enjoy playing with ML as well!


$20k, or $10k per required course (there are electives, but let's assume the bare minimum), seems steep for an online certificate which seems fairly worthless from a signaling POV.

I'm not sure their brand name justifies that price (not sure about the content). The competition is probably the AI nanodegree from Udacity, which costs $800/term with the chance to earn some of that back. If the employer I want the certificate for knows what online certificates are, chances are they are familiar with Udacity (possibly more so than with Stanford in that market).


If one's goal is to work at OpenAI, FAIR, or DeepMind, which would be a better use of time — obtaining this certificate or getting quality papers into NIPS / ICML?


Certainly the latter, getting published in top journals or conferences is several orders of magnitude more difficult.


Wow tough call. The certificate would represent a baseline of understanding which can be then further trained, a good publication history in NIPS/ICML Etc. would represent a solid contribution to the field. I would not be surprised if those were, to a first approximation, equivalent.


This has to be sarcasm right?


Clearly


Stanford is just trying to cash in on people by having their cake and eating it too.

They want to charge 20k, but not let anyone have a chance of further advancing to complete a real degree, no matter how excellent their performance in this program.

The reason they do this is solely to protect their brand and exclusivity. They already offer online degrees but the acceptance rate is just as limited as the on campus program.

Yes the learning is important, but so is the credential and a certificate doesn't even come close to a degree in the job market.

Stanford should pick one:

1) Charge Stanford prices, scale up online, and let any student who can do the work pay tuition and earn a degree.

2) Charge lower prices for certificates and continue to artificially ration real degrees.


I did this program. I can say it was literally a waste of money. I planned on joining a member company (so I could complete the degree remotely) at some point but meh none were interesting enough to join. You can get all the value in this program for free by reading a couple textbooks.


Can you please be more specific? I am considering taking this program and any details why was this inadequate would be very helpful! Thank you!


You get to take the actual classes but you don't get real credit for them. Recruiters, hiring managers don't care because it's not a real degree and they have no knowledge of the program. If you just want the knowledge you can basically get it for free from youtube or buy the textbooks. The ML/AI lectures there were often better than the ones from Stanford. You also don't have real student privileges. Some of the professors request that you do not attempt to show up for lecture or office hours because you aren't a real student. They just want you to watch the videos online.

And so on. It was overall a pretty negative experience for me. I can't recommend it.


"a certificate doesn't even come close to a degree in the job market"

So? Do they say that it does? I'm already a software engineer and can tell you that having an AI certificate from an accredited University is a great stepping stone to transitioning into this line of work even if it does not make me an expert.


I believe it helped you. I also believe if given a chance to try you may have been able to earn a full masters from Stanford and been helped even more. Not everyone has the time/money/desire, but many people now are artificially limited in the name of preserving brand equity and profit.


For me it's valuable the way they've constructed it. I like the idea that I can take some night classes in addition to my full time job, and just focus on one specific area.

Okay, graduate degrees fulfill that purpose too, but with life events now and over the horizon, I just don't have the bandwidth to commit to getting my Masters or a Ph.D. right now.

I really have no problem with this.


But why pay for it then?


They're betting that a certificate with "Stanford" on it is better than a certificate with "Udacity" or "Coursera", or "I read a bunch of web pages and books", "here's my Kaggle ranking", or "look at this stuff I did on GitHub and the corresponding blog posts". [Naturally, people breaking into the field need not pick just one.]

Depending on hiring filters, that might be true. If an application ever gets to an engineer though I'd bet on the last one or two winning.

I think a sensible organisation should skip the usual hiring filters in new fields, because [edit](my experience is in security) you can scoop up really good people who happen to have "unconventional" backgrounds if you have competent people evaluating them.

Regular HR people tend to do a really bad job with career switchers and the self-taught, since they mostly work off "signalling value". But in newer or very fast-moving fields, the oddballs can be the majority of decent applicants.

My limited experience is that very technical positions are not actually overwhelmed with applicants, and that it's not hard to evaluate if people have the right stuff because in these areas it's not difficult to devise quite objective challenges without resorting to shibboleths (guess what the interviewer is thinking or, do you come from the same technical culture as me).

Arguably ML positions should be the easiest to hire for algorithmically (at least in industry, not hard-core research). Just put an automated judge with a fairly low bar on a business-relevant objective function between your careers page and your "submit job application" page :p
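That "automated judge" could be as simple as scoring a candidate's submitted predictions against a held-out labelled set. A hypothetical sketch (all names, labels, and thresholds invented):

```python
# Hypothetical applicant gate: the careers page publishes an unlabelled
# hold-out set, the candidate submits predictions, and the application
# only goes through if a business-relevant metric clears a low bar.
THRESHOLD = 0.80  # accuracy bar an applicant must clear (invented)

def judge(predictions, holdout_labels, threshold=THRESHOLD):
    """Return True iff the submission's accuracy meets the bar."""
    if len(predictions) != len(holdout_labels):
        return False  # malformed submission
    correct = sum(p == t for p, t in zip(predictions, holdout_labels))
    return correct / len(holdout_labels) >= threshold

labels = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]   # invented hold-out labels
good   = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]   # 9/10 correct -> passes
bad    = [0, 0, 0, 0, 0, 0, 0, 0, 0, 0]   # 4/10 correct -> rejected
```

A real gate would need fresh hold-out sets per applicant to prevent label leaking, but the mechanism is that small.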

Personally I find this a little bit funny, because ML driven competence evaluation in "hard" (reasonably concrete objectives) fields should eventually render credentialism and signalling obsolete. But here we are, the $20k "certificate".

All these things said, a structured course of study is super useful for the undisciplined (certainly including me) and dropping "new car money" on something has a way of focusing the mind :)


> I think a sensible organisation should skip the usual hiring filters in new fields, because much like security you can scoop up really good people who happen to have "unconventional" backgrounds if you have competent people evaluating them.

Sometimes it's quite sensible to travel in the opposite direction to everybody else.


Are you talking about guaranteed advancement into a real degree?

That indeed does not exist, but if you do well on these, you are probably pretty eligible for their part-time MS program.


I can't see any reason to believe this is true. It's not stated or even alluded to in their literature or admissions criteria.

Moreover, even if it increases your chances a bit, it's a massive investment with no guarantee. Why not simply give the students who demonstrate excellence in earning the certificate a chance to spend another $50k for a degree?

The reason is to maintain artificial scarcity. Stanford has done a lot of great things and all the grads I've worked with have been top notch. However in some ways, Stanford is to education as De Beers is to diamonds.


Darn, I thought they were preparing to give degrees to AIs.


Can you get similar knowledge through books? In general what book do you recommend? I'm specifically interested in deep learning.


I think "Artificial Intelligence: A Modern Approach" is still the best foundation book. However, it certainly is not focused on deep learning. There are chapters on pretty much everything (one on reinforcement learning), but it is built around the idea of intelligent agents first and foremost. Best-written CS book I own.


I'd love to read a book like that but I don't have any maths, and I presume those things are full of maths.


I think this book is actually very approachable without much of a mathematics background. It's structured so that you can usually skip the mathy parts (and there aren't many really). Time and space complexity might be a bit tough but it's not that hard to work through. I think the Bayes related math was explained well and should be approachable with next to no math knowledge.

The math is mostly applied and not proofs. There's a fair amount of pseudocode and algorithms but they are explained well and I think it's not hard to follow (our students of different backgrounds usually didn't have problems). I did get a bit tired of the running example of the map of Romania (essentially used for all the search related things). The diagrams for algorithms are very helpful.


Would anyone recommend this over doing a masters degree in a similar area of study but at a local university (i.e. not Stanford)?


Hard to say without having taken these classes. GT isn't Stanford, but it's a top-10 MSCS that offers an ML specialization for ~$8k-$10k.

Whether that provides more value or employment opportunity I cannot say, but it is encouraging to see more universities offering an alternative to traditional degree programs.

I think there's a path for non-MS, non-PhD backgrounds, but probably not now. Outside of the big companies, ML/AI is often a solution without a problem. So until they learn practical application I think supply will outnumber demand and most of the jobs will go to PhD AI/statistics backgrounds.


> Outside of the big companies, ML/AI is often a solution without a problem.

I would say just the opposite. There isn't a business too small to benefit from ML/AI. That is, assuming you acknowledge that there's more to ML/AI than "deep learning". Not everybody needs a deep neural network. Sometimes you just need linear regression or a random forest.

If you come at it from that point of view, the knowledge you gain from taking Andrew Ng's machine learning course is enough to create value for companies / organizations.

A lot of the discussions on this topic here on HN seem to be based on an assumption that you have to be doing cutting edge original research to be useful. I think that's very far from the case.

Now the problem is, does the mom and pop bakery on the corner know that they could benefit from ML? And perhaps the more salient question is "are they looking for somebody to do ML for them"? The answer is probably "no" in both cases, so you might have to do some work to sell to that market. But the value you can create is real. Help them optimize production so they throw away less bread on slow days, etc. and you're talking direct business impact.

I suspect that there will be, for a long time, a wide continuum in terms of how ML/AI skills can help create value. Which means there will be a lot of ways to leverage this field from a career standpoint.
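The "sometimes you just need linear regression" case above can literally be a closed-form least-squares fit. A hypothetical bakery sketch (all numbers invented): predict loaves sold from a foot-traffic count, so the shop bakes closer to demand.

```python
# Closed-form simple linear regression: slope = cov(x, y) / var(x).
def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

traffic = [100, 150, 200, 250]   # invented daily foot-traffic counts
loaves  = [55, 80, 105, 130]     # invented loaves sold those days

slope, intercept = fit_line(traffic, loaves)
predicted = slope * 300 + intercept   # forecast for a 300-visitor day
```

No deep net, no GPU - and for a shop deciding how much to bake, that forecast is the whole deliverable.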


> There isn't a business too small to benefit from ML/AI.

That's not exactly what I'm saying. There are ML problems everywhere, opportunities to create efficiencies, new products, etc. out of existing data and processes. But nobody hires an AI/ML candidate to come in and tell them what to do. Nobody says "we want to dabble in ML somehow but don't have any pressing needs, can you start at $140K?"

Those businesses need to recognize the opportunities first before the market will expand.


You might be surprised.

Not all of the opportunities are reasonable (or make any kind of sense at all) but if you take a shitty hire at a place with interesting data you can have fun.

I can hardly count the number of organisations which now want to project the idea of being "innovative" and are excited to slap the coconut shells over their ears and try waving down planes with a stick.

There's still a surprisingly huge number of companies still hiring "blockchain consultants" even though they're rapidly heading into the "despair" phase of the hype cycle.

If you're a low cap stock or even a tired brick and mortar, two ML hires and a plausible investor story gives your graph a serious bump.

The difference between blockchain bull()$& and ML bull()$& is that the ML could actually have legs. Mostly it'll just tell you stuff that someone who understands the market already knew, though.


While I believe that, it sure sounds like a miserable job for people who actually want to do cool things with data. Hired for show likely means you're still hunting for that problem (which might not exist).


Disclaimer: I do not hold any kind of real degree (just an Associates from a defunct tech school - worth little to nothing now) - so take the following for what it is...

Based on my experience, which I won't re-iterate here - the various MOOCs I've taken (and currently the Udacity Nanodegree) would not be anywhere close to a masters in the subject (unless I am severely overestimating a masters - but I don't think so).

TBH - they would probably equate closer to an Associates, at best.

This offering from Stanford? Not sure - but I still don't think it would be the equivalent. I'm not saying it wouldn't be worthwhile, but I think if your goal is a deep level of knowledge and understanding of the subject, then a quality masters program for CompSci or similar would be the better path.


Rigor is there.

I'm now working in the ML/AI research division of big-4, all I needed (and all hiring teams needed to see on my resume) were a couple of these classes (with good internal performance ratings).

Having said that, my employer paid for these classes. No way I would pay this price out of pocket. There are probably much cheaper ways to get the same knowledge.


Because I also used to make this much-too-common grammatical mistake: your degree is an "associate" (singular) degree, as in Associate of Science, etc. (An "associate's" degree could be any type of degree belonging to an associate of some type ;-)


I think this seems right. The value in going to a school, taking classes, talking to students in the same boat and talking to professors is greater than the value of a very low-commitment thing like a certificate for watching 20 videos and turning in a few assignments. For the autodidacts, their github profiles are probably worth more than some silly certificate.


I've taken a couple on this list (employer-paid as a perk; no way I would pay this price out of pocket) and I can attest it is 100x more worthwhile than a Coursera MOOC.

You actually study, partner with, and test alongside Stanford MS/PhD students. Not to mention the accountability factor: since you paid $5k for the class, you'd better take it seriously.


I'm curious about this as well.

I have interest in AI, but I'm not sure how companies would see this online degree (even through it is from Stanford).


I would think it depends on the company. Remember, the world is a LOT bigger than GoogHooBookSoftBer or whatever. Walk into Tom's Auto Salvage or City Chevrolet or Barney's Screen Doors and Small-Engine Repair and show them that you can produce value for their business, and I don't think they're going to give a hoot if your degree/certificate/whatever is from Stanford, Yale, UNC-W, ECPI, or University of Phoenix.


> Software engineers interested in acquiring a solid foundation in artificial intelligence.

Does a "solid foundation in AI" actually exist?

I'm asking because it seems that nobody really knows why many algorithms actually work, or even how they should be adjusted to cover new applications. To me it sounds more like "educated guessing".


If you are talking about ML then: Bayesian stuff, variational inference and MCMC, graphical models, deep learning, linear algebra, probability theory, statistics, multivariate calculus, optimization - as a baseline. I think the "average" person on HN would not be interested in ML if they had to learn it rigorously. It's much easier to blackbox the entire thing and label yourself as an "ML Engineer" than to learn the fundamentals listed above. Having a stats/applied math background would be very helpful.
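To make the "blackbox vs fundamentals" point concrete, here's a toy sketch (my example, not from the thread): fitting a 1-D logistic regression by hand with plain gradient descent, i.e. the calculus and probability that a one-line library call hides.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.1, steps=2000):
    """Minimize the negative log-likelihood with batch gradient descent."""
    w, b = 0.0, 0.0
    for _ in range(steps):
        dw = db = 0.0
        for x, y in zip(xs, ys):
            err = sigmoid(w * x + b) - y  # gradient of the log-loss per point
            dw += err * x
            db += err
        w -= lr * dw / len(xs)
        b -= lr * db / len(xs)
    return w, b

# A separable toy dataset: negatives left of zero, positives right of it.
xs = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
ys = [0, 0, 0, 1, 1, 1]
w, b = fit_logistic(xs, ys)
print(sigmoid(w * -2.0 + b) < 0.5, sigmoid(w * 2.0 + b) > 0.5)  # True True
```

Nothing here is specific to any course; the point is just that writing out the gradient yourself is the "fundamentals" route, while `sklearn.linear_model.LogisticRegression().fit(X, y)` is the blackbox route.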


This only makes sense for people whose work will pay for them to take it.


Anyone have suggestions on alternative programs that are less expensive, online, and self-paced? 40+ IT guy trying to stay relevant.


My advice: start with the free stuff. If you complete it, then consider a paid thing. The completion rates of online courses are super low. I take this to mean people don't really want to do it.


I think it's just difficult for people to hold themselves accountable. In standard colleges, you have other students, teachers, grades, etc., which creates a social pressure for you to continue your progress.

With a MOOC, the same social pressure to continue or excel doesn't really exist.

It's been interesting to watch how various MOOCs have tried to recreate these pressures (due dates, courses separated into weeks, peer review assignments, teacher 'office hours', etc.), and while I think they've gotten a lot better than at the beginning, I'm not sure they will be able to fully replicate the pressure of having real people in the classroom with you, who will notice if you are gone for a week.


Which free courses do you recommend?


Columbia University has an Artificial Intelligence MicroMasters on EdX; the first cohort started this month. It's free, or $1,200 if you want Columbia course credit.


According to the EdX page, course credit is only available to Columbia University Master of Computer Science students.

https://www.edx.org/micromasters/columbiax-artificial-intell...


Would you provide the actual bit you're interpreting? These are the words I see:

    -------------
    Who is this MicroMasters Program intended for?

    The MicroMasters Program in Artificial Intelligence 
    is intended for those who have a Bachelor’s degree in
    Computer Science or Mathematics and have a basic 
    understanding of statistics, college level algebra, 
    calculus and comfort with programming languages.
    -------------
I don't see it mentioning anything contrary to the above statement. The MicroMasters wouldn't be a full-fledged master's degree, but could allow admission to receive a full master's (per the below):

    -------------
    Complete, pass and earn a Verified Certificate in all
    four courses to receive your MicroMasters Credential. 
    Learners who successfully earn the MicroMasters 
    Credential are eligible to apply to the Master of 
    Computer Science program at Columbia University.
    -------------
Maybe I misunderstood you.


Other than the tagline "A series of credit-eligible courses recognized by industry", this is the only reference I found to course credit:

> Take your Credential to the Next Level

> If a student applies to the Master of Computer Science program at Columbia and is accepted, the MicroMasters Credential will count toward 25% of the coursework or 7.5 of the 30 credits required for graduation from the on campus Master of Computer Science program.

This part: "Learners who successfully earn the MicroMasters Credential are eligible to apply to the Master of Computer Science program at Columbia University" doesn't seem to mean anything. Isn't everyone already eligible to apply?


Udacity is offering an Artificial Intelligence "nanodegree" which will set you back ~ $1600.

https://www.udacity.com/course/artificial-intelligence-nanod...


How do people in industry generally view these types of certificate programs? Is it markedly "worse" than doing e.g. a Master's on campus?

Also - is this closer to a Master's level program or part of an undergraduate curriculum?


The courses are identical to the courses one would take if they were getting their Master's at Stanford - the only difference is that only four classes are required, while a Master's requires about 15.

I'm enrolled in Stanford's CS Master's program right now through the Honors Cooperative Program (which lets you get a Master's online while working in industry), and I'm currently planning on doing a dual specialization in Systems and AI. For the AI specialization I've already taken CS 221 and 229, and I'll have to take three more AI classes drawn from a list pretty similar to the Elective Courses list in the OP.


Did you take the courses online or on campus? Did you feel like a "regular student" or "divided but equal"? Did you look at alternatives like Harvard Extension ALM in Software Engineering?


Question - you piqued my interest in the Cooperative. What is your goal in coming out of the Cooperative with an MS?


The available classes align with the graduate-level ones for their MSCS.

If nothing else it's markedly worse because it's 1/3 the classes :). Beyond that I'd expect rigor ... it's Stanford after all.


My feeling is that even a non-Stanford school would offer more rigor than an online Coursera course. When I was in school, I benefited greatly from having 1-1 private conversation time with mathematicians, scientists, grad students, my friends, etc. With Coursera, that person is replaced by Google/Bing/whatever. If I don't get time to talk to the scientists or grad students (at least), I don't see how Udacity/Coursera/etc. are anything more than a glorified YouTube. Maybe I am disillusioned, but the value proposition from Coursera/Udacity/etc. looks worse and worse the more I look at it... hmm...


When you pay for Udacity/Coursera/etc., you at least get human interaction with paid instructors and other students.


If you look at the pricing, it's below their Master's level, so I guess it's more undergrad-style.



