NLP (or specifically NLP using deep learning) seems to be having a breakout moment in the last year or so where there have been large advancements back to back.
Generalization is hard - you're often tuning millions of parameters at once, and often the easiest way for the optimizer to drive the training loss down is rote memorization rather than learning any real structure. It'll be interesting to see what comes about from this discussion.
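To make the memorization point concrete, here's a toy sketch (hypothetical, not any real model): a "model" that achieves zero training loss by acting as a pure lookup table over randomly labeled inputs. Training accuracy is perfect, yet nothing transfers to unseen inputs, which is exactly the failure mode a generalizing model has to avoid.

```python
import random

random.seed(0)

# 100 random 4-digit inputs, each assigned a random binary label.
# There is no structure to learn, so the only way to fit is memorization.
train = {
    tuple(random.choices(range(10), k=4)): random.randint(0, 1)
    for _ in range(100)
}

def memorizer(x, table):
    """A 'model' that just recalls its training set verbatim.

    Seen inputs are recalled exactly (zero training loss);
    unseen inputs fall back to a blind guess of 0.
    """
    return table.get(tuple(x), 0)

# Perfect training accuracy, by construction.
train_acc = sum(memorizer(x, train) == y for x, y in train.items()) / len(train)
```

A heavily overparameterized network can behave much the same way in practice: with enough parameters it can fit arbitrary labels, so low training loss alone tells you nothing about generalization.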
> NLP (or specifically NLP using deep learning) seems to be having a breakout moment in the last year or so where there have been large advancements back to back.
I don't know about that. Everything with deep learning seems to attract so much hype that it's hard to measure the actual progress without being a researcher.
On the other hand, "classic" AI projects seem to get no recognition, even when they deliver astounding results.
For example, how many people here heard about MIT's Genesis project?