Fast.ai releases new deep learning course, libraries, and book (fast.ai)
799 points by amardeep on Aug 21, 2020 | 81 comments



I do recommend getting the book that just came out (I did, it is fantastic)

https://www.amazon.com/Deep-Learning-Coders-fastai-PyTorch/d...

that said: fast.ai also released a draft of the book available here (including the notebooks) https://github.com/fastai/fastbook

edit: if you can afford it, getting the book is a great way to support the authors


I should mention - the `fastbook` repo is not just a draft - it's the full book in basically the final version (except it's in Jupyter Notebook format).


I do recommend getting the book to support authors who've put out some of the best deep learning content, aimed squarely at beginners, completely free of cost. I know that's the open-source culture, but doing it for education strikes me as especially worthy of appreciation!


Honestly we don't get much of a cut, so don't worry about it too much either way. Although it's nice to support O'Reilly, since they were kind enough to let us make it all available for free.

Frankly though, there are much more important areas right now that could really use some extra money, so I'd rather see folks donate to a good cause, if they don't actually need the paper or kindle book... :)

I've been giving my teaching stipend from university to The Fred Hollows Foundation: https://www.hollows.org/ . They can give sight back to people who are blind for around US$25 each.


That is a very humbling response. I will be donating to the Fred Hollows Foundation. Thanks for the fastai work.

Edit: Donated. And reported a bug in their donation process that almost prevented me from completing it.


Thank you! This is such a great service to the community, really, all of it.

While you're here: do you have advice on the course vs. the book? I'm someone who really prefers to learn by reading and writing rather than by video, but if there's stuff in the course that book readers can't really experience...


Use both. The book will encourage you to work through the notebooks. You can watch the first video and see if it adds anything for you - if it doesn't, skip the rest.

They cover basically the same material. (Except that the course covers only half the book - the rest is planned for a part 2 course later.)


Thank you!


The epub is $60 on Apple Books, which is quite a bit more. Usually Apple's prices are competitive.

https://books.apple.com/us/book/deep-learning-for-coders-wit...


Wow that's wild. Is that because they add the 30% fee or something? The Kindle version is $35


It's apparently not just an Apple thing. ebooks.com has a DRM-free version [1] (PDF or EPUB) that's also about $60.

[1]: https://www.ebooks.com/en-us/book/210066947/deep-learning-fo...


In Germany it's 40 euros, at most $45.


I'm a TensorFlow guy, but I'll read the book in O'Reilly's Safari library to expand my horizons and support the authors.


for those unfamiliar with fast.ai:

it is a practitioner-style deep learning course that, instead of starting with the fundamentals, starts with examples and results and then, over time, layer by layer, reveals what it is all about and how it works in detail, until you ask yourself "is that all there is?". A great way to make a seemingly unapproachable topic approachable.

you don't need big data, you don't need a GPU, you don't need to install a ton of dependencies, you only need a browser (to access jupyter notebooks).

last but not least: this is kind of the "definitive version" of the course, as it now comes with a book, a new version of the library (re-written in a more thoughtful way), and new recordings of the lectures/lessons based on the book, with way better audio quality than the previous ones.

If you ever were curious about deep learning but did not find the time to take a look or thought it was unapproachable: now is a great time to dive in and this is a great course (& book & library & community) to do so


> it is a practitioner-style deep learning course that, instead of starting with the fundamentals, starts with examples and results and then, over time, layer by layer, reveals what it is all about and how it works in detail, until you ask yourself "is that all there is?". A great way to make a seemingly unapproachable topic approachable.

Well said and this is exactly what I loved about the course and the way Jeremy peeled things back. If you're a 'learn-by-tinkering' person, and I suspect a lot of HN folks are, I can't recommend it enough.


Exactly! This is why I recommend this course to friends and colleagues who ask me how to get into AI/ML. I tell them to do this course first to get an idea of what's possible. The version 2 (which you do at your own pace) provides the theory/maths behind it.

While the Coursera/Andrew Ng course is (was?) the classic and has great content, I personally prefer Jeremy's style and this code-first approach to deep learning (yes, DL != ML != AI, but that's not the point).


> The version 2 (which you do at your own pace) provides the theory/maths behind it.

Which version of the course is considered version 2?


I think wadkar means part 2 - deep learning from the foundations - which you could do optionally after finishing the 'main' course.

It USED to live here: https://course.fast.ai/part2 but has disappeared, I reckon as part of the new course relaunch. Perhaps there'll be a new version of this part too?


Correct, I meant part 2 of the course and not the version 2.

Thanks for correcting me a_bonobo.

As for a revision of this part 2: based on my skimming of forums.fast.ai, I don't think it will happen anytime soon. The second part is more about theory/maths, which is not affected by the new library.

Edit: part-2 clarification


Would it teach principles of deep learning in the same depth as if I slogged through the YouTube videos of cs231n?

A legitimate question, as I'm considering embarking on one of these two paths. Like most people here, my programming skills are more honed than my math skills, so the fastai path looks like the easier road to take, but I'm not sure if they both lead to the same place.


No. I think the hands-on approach of fastai would probably help you contextualize the theory you learn in CS231N and elsewhere.


I’m considering something like this too. I need a way to keep it fun though or I probably won’t follow through.


If you get serious about data science, you're going to end up reading/watching a lot of different resources. I'm a data scientist and I'd suggest starting with fast.ai and then following your interests elsewhere. The best book or course is the one you're going to finish.


Sounds awesome. What’s the best way to approach this? Do I need the book?


Oh, neat! I went through an earlier version of the online course when I was just trying to understand what this "deep learning mumbo jumbo" was all about, and it was the clearest, easiest to follow, and most interesting one available, by a long shot! One of the assignments had you train an image recognition model on Google Image search results, and after a shockingly small amount of work and time I had a model that could distinguish a picture of a game of Go from a game of chess almost perfectly. That was a huge eye-opener for me.

That was maybe 1-2 years ago at this point and I had wanted to take another look. What a perfect opportunity! And I'm excited that it sounds like there might be a little more discussion of non-DL ML and applications to tabular data (where I'd have the most likely use for it), as well as the nitty-gritty like deployment and use in production!

Any progress on the Swift front? Is that mentioned / used / discussed at all in this new course?


Sylvain was working on fastai for Swift, but he became busy with the book and course and also has left for HuggingFace. Jeremy has not been working on Swift. No lectures on Swift in the course. The Swift4TF team is still active, though Chris Lattner has unfortunately left.

Another similar project is FastAI.jl, a port of fastai to the Julia language. It is still in active development: https://github.com/FluxML/FastAI.jl


It doesn't appear that the Swift port for fast.ai is a high priority, but Jeremy, Sylvain, and co. have rewritten the Python library from scratch, and it is much more sensibly organized now. The version 2 codebase is developed entirely (generated, along with its documentation) from Jupyter notebooks.


Not interested much in deep learning, but wanting to be somewhat on top of it to understand it well, I've done a few courses and skimmed a few books which are available.

The fastai video course was, with a big gap, the best, most understandable, most practical and most enjoyable of them.

Just wanted to say this. Thanks so much for creating it and regularly keeping it up to date!


Seconded. It was one of the few MOOC's I've actually completed because it was so engaging and fun.

I hope the top-down style of teaching spreads because for some people (such as myself) it's one of the best ways to learn and get excited about a subject


A question for Jeremy, perhaps. For the longest time the fast.ai courses have used Adam and one-cycle, at least for CV tasks. More recently Ranger and flat-cos have been dominating the Imagenette leaderboards. I guess I'm curious if fast.ai intends to switch over to teaching that policy instead of one-cycle?

I guess more generally I'm curious what criteria the fast.ai team uses for deciding what techniques to teach. My feeling is that the courses have always taught the training techniques that are a healthy mix of SOTA, generally applicable, and easy to use.

Ranger + flat-cos has seemed like a really robust combo, and easy to use. So yeah, just interested in whatever internal discussions fast.ai may have had about it and other potential replacements for Adam + one-cycle.


Yes it's a great combo. I think fastai is the only library that actually has both of these built-in.

However, because the LR warm-up is built-in to Ranger, it's actually a bit more fiddly to use - i.e. you really need to understand what it's doing. It doesn't work great with `Learner.fine_tune` and gradual unfreezing more generally, since you don't really want a full separate warmup for each phase.

So I don't see it becoming the default or main optimizer shown in the course. But it's great to learn and use.
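For readers curious what "flat-cos" actually does: it holds the learning rate flat for most of training, then anneals it along a cosine curve (fastai exposes this as `Learner.fit_flat_cos`). Here's a minimal pure-Python sketch of the schedule; the function name and the `flat_pct` default are illustrative, not fastai's internals:

```python
import math

def flat_cos_lr(step, total_steps, max_lr, flat_pct=0.75):
    """Hold the LR at max_lr for the first flat_pct of training,
    then cosine-anneal it toward zero (the 'flat-cos' policy)."""
    flat_steps = int(total_steps * flat_pct)
    if step < flat_steps:
        return max_lr
    # Progress through the cosine phase, from 0 to 1.
    t = (step - flat_steps) / max(1, total_steps - flat_steps)
    return max_lr * (1 + math.cos(math.pi * t)) / 2

# The LR stays at 1e-3 for 75% of training, then decays smoothly.
lrs = [flat_cos_lr(s, 100, 1e-3) for s in range(100)]
```

Contrast with one-cycle, which also warms *up* at the start; as noted above, it's that built-in warm-up in Ranger that makes combining it with per-phase schedules like `fine_tune` fiddly.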


I took fast.ai a few years ago, and then again a year or so ago. I like their lectures and their methodology of teaching which enabled me to meet a lot of interesting people in my city, but I ended up just building models using vanilla PyTorch instead of using their library as an added layer just because it felt like they were tweaking and revamping their code so often that at times it was kind of hard to connect the docs with the latest code.


Yeah that makes a lot of sense. It's why we took a year off from teaching to try to make the definitive version of the deep learning framework we really wanted - and even wrote a peer-reviewed academic paper about the design we came up with.

So today's fastai library really doesn't have the issues that we had a year or two back - it's a really carefully designed piece of software. Amongst other things, we've made sure it works with the book, which means it has to last for a long time.


Are you Jeremy Howard? If so, thanks a lot for your courses and framework, it’s really great!


Yup that's me. You're welcome :)


Thank you! I also took your course recently and it was outstanding.


Bought the book and trying out the lesson 1 notebook, but man, I can't seem to make this work. Colab can't import fastbook with the GPU runtime, and the TPU and CPU ones are too slow. Gradient gets a little further, but fails with "self.recorder already registered" on the "#id first_training" cell. Maybe I'm too dumb to be a data scientist, but I didn't expect to have to do this kind of debugging right off the bat.


> I didn't expect to have to do this kind of debugging right off the bat.

Stick with it, and consider setting up your own machine instead of trying to use Colab. I say this because literally the hardest part about the previous course for me was getting started and doing the setup. Once you're able to actually run the notebooks I promise it'll get much easier. I can promise this with confidence because the lectures are excellent, and I've been through what you're talking about on the previous version of this course when I was first starting with AI.


Did you `pip install fastbook`? TPU runtime is slow since there is currently no TPU support (it's upcoming). Please share your issues on forums.fast.ai and I bet the community will help you get up and running very quickly :)

EDIT: I just tried the Colab notebook and it worked successfully for me. We can discuss on the forums if you want.
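For anyone else hitting Colab trouble: as I recall, the book's notebooks are meant to begin with an install/setup cell along these lines (exact flags may vary by notebook version):

```shell
# First cell of a fastbook notebook on Colab; the leading "!" in a
# notebook cell runs this as a shell command.
pip install -Uqq fastbook
```

The notebook then runs `import fastbook` and `fastbook.setup_book()` in Python to configure the Colab environment. If the import still fails after that cell, restarting the runtime and re-running it is worth a try.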


Took this course twice, first when it used TensorFlow and later when it was based on PyTorch. I like how it is practical from early on and updated with new research. I recommend trying to build networks from scratch alongside the course, so you don't become too dependent on the fast.ai framework.


In the new course and book by the end we show you how to build everything from scratch - including implementing gradients, creating a data processing library, writing all the equations out for loss and activation functions, building a resnet from scratch, etc...
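As a taste of that from-scratch style (an illustrative sketch, not code from the book): fitting y = w*x + b by gradient descent, with the MSE gradients written out by hand instead of taken from an autograd library.

```python
def train_linear(xs, ys, lr=0.01, epochs=200):
    """Fit y = w*x + b by gradient descent on mean squared error,
    with hand-derived gradients:
        dL/dw = 2 * mean((w*x + b - y) * x)
        dL/db = 2 * mean(w*x + b - y)
    """
    w, b, n = 0.0, 0.0, len(xs)
    for _ in range(epochs):
        errs = [w * x + b - y for x, y in zip(xs, ys)]
        grad_w = 2 * sum(e * x for e, x in zip(errs, xs)) / n
        grad_b = 2 * sum(errs) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Recover y = 2x + 1 from noiseless samples.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [2 * x + 1 for x in xs]
w, b = train_linear(xs, ys, lr=0.05, epochs=2000)
```

Implementing something this small by hand, then checking it against a framework's autograd, is exactly the kind of exercise that makes the later material click.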


Jeremy, this sounds great! I now have several servers with GPUs at our loft, to run training and avoid noise from the laptop. You helped me have a good experience learning machine learning, and to enjoy learning and solving new problems.

The output of my new powers: several papers written together with others, and solved problems in the green-tech energy market. We detect and forecast usage within time-series data (energy consumption).

Keep doing what you are doing! And thanks for all the hours you've put into this.


Oh awesome! There's a really great fast.ai time series study group BTW, who have between them built a lot of great projects and (IMO) are more familiar with deep learning for time series than any other group I've come across: https://forums.fast.ai/t/time-series-sequential-data-study-g...


This course and the accompanying libraries were very good when they were released and have only improved over the past several years. I will echo what others have said - the courses are very approachable and practical.

Fast.ai changed the course of my career and helped give birth to deep learning as a practice at my place of work. Thank you Jeremy!


People have had a lot of negative things to say about fastai v1, claiming it is not very flexible or intuitive and only good for certain Kaggle-type problems. I'd recommend they check out fastai v2 as a serious competitor to other PyTorch-based frameworks like PyTorch Lightning, Catalyst, Ignite, etc. It's very easy to work with standard deep learning problems, and for more complex and unique problems, the mid-level/low-level API and callbacks make it quite painless to fit fastai into your workflow. Plus there's tons of community support (forums.fast.ai + Discord), even for a package maintained by only a few people. Check it out!


Jeremy should update his JavaScript course as well. He might be one of the few people able to make it look less messy than it is everywhere else. The fast.ai course is wonderful. Definitely recharged my own interest in deep learning.


Gosh I'm surprised (and rather pleased!) that anyone even remembers my old JavaScript stuff... :)


I scoured the web but wasn't able to find the JavaScript course that you're talking about. Can you share it here?


Here is one video on crud apps using asp.net and angular js: https://m.youtube.com/watch?v=Ja2xDrtylBw


Thank you!


Has anyone gone through a career change (to something in data science / ML) after going through courses like fast.ai? If so, how difficult / easy was that change?


Yes lots have - it's really common. Check https://forums.fast.ai for many stories. There's also some linked from https://course.fast.ai.

It's a lot of work and requires tenacity - the same amount of tenacity that's required to reach a high level of competence in any field.


This is great! Looking forward to trying it out. I explored it a while back when I was looking for a deep learning library that could take a tabular data file and build a multitask predictive model involving different datatypes (for example, some columns may be text). Uber's Ludwig library does this. Would love to check it out.


Amazing news. I pre-ordered the book a while ago and am a bit surprised (positively) it's over 600 pages now. The German Amazon page still says 350 pages btw.

Worked with fast.ai for a couple of projects starting <1.0 and with the first MOOC. You're doing great work and it's really appreciated.


Interestingly, the book I ordered from bookdepository.com also says 350 pages. Could there be two versions of the book?


Hi Jeremy, congratulations on the new releases and thank you.

I see that the original ML course[1] link has been removed from the home page. Does it mean it's been invalidated due to integration of ML lessons with the DL courses?

I was pointing those who want to learn ML but don't have good access to the Internet to the old ML course, with custom scripts to install its requirements on inexpensive SBCs like the Jetson Nano. I was planning to make that setup public, but should I refrain from doing so because of fast.ai v2? If so, is cloud compute the de facto first-class citizen now?

[1]http://course18.fast.ai/lessonsml1/lesson1.html


As one of the folks that took this course, I was thoroughly engaged. I wouldn't start masquerading as a data scientist after learning this material, but this is a highly-practical approach to deploying new engineering tools.


Jeremy Howard and Andrew Ng are the two teachers who got me into ML, and eventually into it as a career. Amazing to see so much progress! Because of fast.ai I can see ML being used around the world just like Excel or Python.


Looks great - will probably pick up the book

In Lesson 1 they talk about use-cases where Deep Learning is the best known approach. Are there any popular use-cases for which it is not the best known approach?


Yes - the lesson on tabular analysis focuses on decision tree ensembles, since that's what most people use (although we also use deep learning for it - and we ended up getting a more accurate model that way). We discuss the pros and cons of the approaches in some depth in that chapter.

Also, of course, there are many things that aren't really amenable to any kind of machine learning...
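For readers who haven't met decision tree ensembles: the idea is to train many weak trees on bootstrap resamples of the table and combine their votes. A toy pure-Python sketch using one-split "stumps" (illustrative only; in practice you'd reach for something like scikit-learn's random forests, or the tooling shown in the tabular chapter):

```python
import random
from collections import Counter

def fit_stump(rows, labels):
    """Pick the single (feature, threshold) split with the best
    training accuracy; each side predicts its majority label."""
    best = None
    for f in range(len(rows[0])):
        for thresh in {r[f] for r in rows}:
            left = [l for r, l in zip(rows, labels) if r[f] <= thresh]
            right = [l for r, l in zip(rows, labels) if r[f] > thresh]
            if not left or not right:
                continue
            lmaj = Counter(left).most_common(1)[0][0]
            rmaj = Counter(right).most_common(1)[0][0]
            acc = (sum(l == lmaj for l in left)
                   + sum(l == rmaj for l in right)) / len(labels)
            if best is None or acc > best[0]:
                best = (acc, f, thresh, lmaj, rmaj)
    _, f, thresh, lmaj, rmaj = best
    return lambda row: lmaj if row[f] <= thresh else rmaj

def bagged_stumps(rows, labels, n_trees=25, seed=0):
    """Bagging: fit each stump on a bootstrap sample, predict by vote."""
    rng = random.Random(seed)
    stumps = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(rows)) for _ in rows]
        stumps.append(fit_stump([rows[i] for i in idx],
                                [labels[i] for i in idx]))
    return lambda row: Counter(s(row) for s in stumps).most_common(1)[0][0]

# Tiny tabular example: the label is 1 when the second column exceeds 5.
rows = [[1, 2], [2, 8], [3, 1], [4, 9], [5, 3], [6, 7], [7, 4], [8, 6]]
labels = [0, 1, 0, 1, 0, 1, 0, 1]
predict = bagged_stumps(rows, labels)
```

Each individual stump is weak, but the bootstrap resampling decorrelates their errors, which is why the averaged ensemble is so hard to beat on small tabular datasets.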


Though not related to the content of the post: I noticed that the favicon of fast.ai is an H character, which has nothing to do with AI. Somebody should update it.

FYI, the letter H comes from the Hyde theme in Hugo: https://themes.gohugo.io/hyde/?search-input=menu%3Dmai#sideb...


I am trying to get into ML in general and I am having a bit of a problem. I don't know what is what and I lack a basic trajectory. Fortunately I have all the mathematics prereqs so I can just jump in. What I need is some sort of up-to-date overview of everything ML so that I know what topics to study in which order. Does anyone know of such a thing?


Yes. I would recommend the fast.ai course linked above. It covers all the essentials of deep learning and some classical machine learning. You'll have enough breadth of knowledge to know which areas you'd like to explore in more depth and all the skills you need to build practical projects.


I got a little ways through the very first course way back when. I am planning to learn ML/DS in my spare time, but I have a particular end goal - self driving cars/computer vision. Does this course cover those topics?


Hi Jeremy, thanks for your awesome library! I followed the last online course and was pretty impressed by how effective your top-down approach is.

Are multi-gpu setups supported in this version of fast.ai?


Yes, although I'm planning to work on making them simpler to use in the coming weeks.


Any recommendation on how to approach the course? Is it better to read the chapter in the book before you watch the lecture(s) covering the content of the chapter or vice versa?


Is there something similar for deep reinforcement learning?



I took the fast.ai courses and highly recommend them for anyone who really wants to learn ML.

Are there any plans for courses on reinforcement learning?


No, there aren't.


Could you elaborate on the reasons why not?


I wonder why fast.ai still sticks with U-Net for segmentation tasks.


Can anyone suggest MOOCs similar to Jeremy's teaching style? I really like the way he teaches.


@dang or someone - I wonder if you can fix the title so it's not just "Fast.ai releases new deep learning course"? The article is just as much about the release of the fastai v2 software library as it as about the course.

The original title was "fast.ai releases new deep learning course, four libraries, and 600-page book", although "fast.ai releases new deep learning course and library" would probably cover what most people are interested in, and is quite a bit shorter.


Thank you for all the work you've put in to your MOOC and all these other resources. I love your teaching style and have gotten immense value from all this.

I'm excited to get my print copy of the book delivered tomorrow!


I've just received mine and I must say I find it so much nicer to learn using the print copy book! It forces me to type out (rather than copy and paste) the code, and I like being able to scribble all over the book. Given the thousands of dollars of value I've gained from the courses, I have to say that spending another $60 for the book was well worth it.


To be more specific, this will be Version 4 of the course, with a major upgrade to the `fastai` library (v2). There's an excellent talk by Jeremy (the creator of the course) at Stanford that goes into the various layers the `fastai` library adds on top of `pytorch`:

https://www.youtube.com/watch?v=1TfI88uQNj8 (Please seek to around 30mins for the technical part)


FYI the gp is Jeremy


Wow, I don’t know if I should apologize or feel embarrassed or perhaps both!


I really appreciate all the work you have done to make this so accessible to folks.



