
> indistinguishable from a Markov chain generator

A Markov chain generator wouldn't:

* Capitalize the first word of each line

* Make lines of approximately the right length

* Mark text with who is to speak it

While this is just a toy example, it's powerful enough to start showing the ways RNNs can produce text that looks superficially correct.

(Generating Shakespeare is actually one of the examples given in the classic http://karpathy.github.io/2015/05/21/rnn-effectiveness/)




You're mistaking a Markov chain toy for an actual Markov chain generator.

1. Yes it would. A chain trained on the corpus would assign high probability to capitalized words as the first word of a line or after a period.

2. Obviously it could, depending on the stop condition, especially if line length is part of the state.

3. If trained on a corpus of plays, it certainly would.
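To make the point concrete, here's a minimal first-order Markov chain sketch (the toy corpus and the start/end-of-line markers are my own assumptions, not anything from the linked post). Because every line in the training data begins with a capitalized speaker tag and ends with an explicit end token, the chain reproduces all three properties from the list above with no special handling:

```python
import random
from collections import defaultdict

# Hypothetical toy corpus in "SPEAKER: line" play format (an assumption
# for illustration -- any corpus of plays would do).
corpus = """\
HAMLET: To be or not to be that is the question
HAMLET: Whether tis nobler in the mind to suffer
OPHELIA: Good my lord how does your honour
HAMLET: I humbly thank you well well well
"""

START, END = "<L>", "</L>"  # explicit line-start / line-end markers

# First-order transition table over word tokens. Line structure is
# learned simply by including the START and END markers as states.
transitions = defaultdict(list)
for line in corpus.strip().splitlines():
    tokens = [START] + line.split() + [END]
    for a, b in zip(tokens, tokens[1:]):
        transitions[a].append(b)

def generate_line(rng):
    """Sample one line; stops when the chain emits the end-of-line token."""
    out, tok = [], START
    while True:
        tok = rng.choice(transitions[tok])
        if tok == END:
            return " ".join(out)
        out.append(tok)

rng = random.Random(0)
for _ in range(3):
    print(generate_line(rng))
```

Every sampled line starts with a capitalized speaker tag (the only tokens that ever follow START), and line lengths come out roughly corpus-like because END is only reachable from words that actually ended lines in the training text.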

The strength of the RNN is supposed to be in context and memory... perhaps in handling grammar.

There are advanced hierarchical grammars, related to Markov random field models, that are about on par with RNNs on many text and music analysis workloads. (In fact, probabilistic math is often used to describe the results and workings of a deep NN anyway.)



