Hacker News | to-mi's comments

It seems that the post is comparing a predictive distribution conditioned on N data points to one conditioned on N-1 data points. The latter is a biased estimate of the former (e.g., https://users.aalto.fi/~ave/publications/VehtariLampinen_NC2...)


This. Leave-one-out doesn't turn an in-sample prediction problem into an equivalent out-of-sample problem, but into a slightly less skilful one.
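A toy closed-form check of this point (my own example, not from the linked paper): for a Gaussian observation model with known variance and a flat prior on the mean, the posterior predictive after n points is N(x̄_n, σ²(1 + 1/n)), and its expected log density on a fresh point works out to -½ log(2πσ²(1 + 1/n)) - ½, which is strictly increasing in n. So a predictive conditioned on N-1 points really is a slightly pessimistic stand-in for one conditioned on N points.

```python
import math

def elpd(n, sigma2=1.0):
    # Expected log predictive density of the posterior predictive
    # N(x_bar_n, sigma2 * (1 + 1/n)) for a Gaussian mean with known
    # variance and a flat prior, averaged over a fresh data point.
    # Derivation: E[(x_new - x_bar_n)^2] = sigma2 * (1 + 1/n), which
    # cancels against the predictive variance, leaving the -1/2 term.
    v = sigma2 * (1.0 + 1.0 / n)
    return -0.5 * math.log(2 * math.pi * v) - 0.5

# Conditioning on N points beats conditioning on N-1, by a gap that
# shrinks as N grows -- hence "slightly less skilful", not "equivalent".
print(elpd(20) > elpd(19))   # True
print(elpd(20) - elpd(19))   # small positive gap
```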


A rap lyrics generation algorithm based on deep learning was also recently published [1] at the KDD conference, and it got quite a lot of publicity in other media: see the associated website at http://deepbeat.org.

[1] Malmi et al.: DopeLearning: A Computational Approach to Rap Lyrics Generation http://www.kdd.org/kdd2016/papers/files/adf0399-malmiA.pdf


Author of DeepRhyme here. With respect to the authors of DeepBeat, what they are doing is less ambitious. They are taking full existing lines out of a rap lyric corpus and assembling them into a verse that rhymes and makes sense. There is a paper that is very similar to what I did: http://www.emnlp2015.org/proceedings/EMNLP/pdf/EMNLP221.pdf . It's hard to compare our models, because they don't give much output text. They trained on 1% of the data I did, so I'm a bit dubious about how successful they could have been.
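For readers unfamiliar with the line-assembly idea: it can be sketched in a few lines. This is a hypothetical toy, not DeepBeat's actual method — the paper scores rhymes phonetically and ranks candidate lines with a learned model, whereas here I just key existing lines by the vowel groups of their final word and emit the groups that rhyme.

```python
import re

# Toy corpus of pre-existing lines (invented for illustration).
corpus = [
    "I keep it real while they living a lie",
    "money on my mind and my eye on the prize",
    "never gonna stop till the day that I die",
    "rolling through the city under midnight skies",
]

def vowel_tail(line, n=2):
    # Crude rhyme key: the last n vowel groups of the line's final
    # word. A real system would use phonemes, not spelling.
    word = line.split()[-1].lower()
    vowels = re.findall(r"[aeiouy]+", word)
    return tuple(vowels[-n:])

def rhyme_groups(lines):
    # Bucket lines by rhyme key; keep buckets with 2+ lines,
    # i.e. sets of existing lines that could be stitched together.
    groups = {}
    for line in lines:
        groups.setdefault(vowel_tail(line), []).append(line)
    return [g for g in groups.values() if len(g) >= 2]

for group in rhyme_groups(corpus):
    print("\n".join(group))
```

The hard part DeepBeat actually tackles — choosing which of the matching lines also flows and makes semantic sense in sequence — is what the learned ranking model is for; this sketch only shows the retrieval-and-match skeleton.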

I am writing a followup post where I'm going to talk about previous work; I hope no one takes this as disrespect.


>> It's hard to compare our models, because they don't give much output text.

The referenced paper doesn't give many examples of output, but the authors have made the system available online at the URL given above.


I think you're talking about DeepBeat; as I mentioned, they aren't tackling the same problem.


Instead of completely banning actual lines from other material, it might be interesting to allow D-Prime to quote or slightly modify a phrase if it met some high threshold of notability.

