Hacker News

>Yes, I've seen the result. They're nice but, as the article points out, not extraordinary compared to state of the art, open NLP research.

This isn't my impression.

It's not the best in any one domain, but it is a single network that is moderately decent across many domains. You can use it to summarize by appending "TL;DR:" to the end of a text. You can use it to translate by listing previous translations. And of course it blows away any state-of-the-art RNN text generation I've seen. RNNs tend to fall apart after 1 or 2 sentences, whereas this holds together for multiple paragraphs.
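The two prompting tricks mentioned above can be sketched as plain prompt construction; this is a minimal illustration (the "English/French" labels and example pairs are mine, not from the comment), with the resulting string fed to any autoregressive language model:

```python
# Sketch of the prompting tricks described above (labels and examples are
# illustrative assumptions, not a fixed API):
#  - summarization: append "TL;DR:" and let the model continue
#  - translation: list prior translation pairs, then an unfinished pair

def summarization_prompt(text: str) -> str:
    """Zero-shot summarization: the model generates text after 'TL;DR:'."""
    return text.strip() + "\nTL;DR:"

def translation_prompt(examples: list[tuple[str, str]], source: str) -> str:
    """Few-shot translation: earlier pairs prime the model, and it
    completes the final unfinished pair."""
    lines = [f"English: {en}\nFrench: {fr}" for en, fr in examples]
    lines.append(f"English: {source}\nFrench:")
    return "\n".join(lines)
```

Either prompt could then be passed to a text-generation model, e.g. via Hugging Face's `pipeline("text-generation", model="gpt2")`, and the continuation read off as the summary or translation.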




Have you been following NLP closely lately? It seems like most of the frustration and/or skepticism is coming from those closest to the field (i.e. researchers), so I'm pretty sure I'm missing a big part of the picture.

I'm trying to get a sense of just how quickly things have been advancing. I read a few NLP white papers about a year ago and never saw anything as compelling as this, but I am definitely an outsider, possibly on the left-hand side of the Dunning-Kruger graph...




