
Not necessarily. For example:

https://engineering.fb.com/2018/08/31/ai-research/unsupervis...

> Training an MT model without access to any translation resources at training time (known as unsupervised translation) was the necessary next step. Research we are presenting at EMNLP 2018 outlines our recent accomplishments with that task. Our new approach provides a dramatic improvement over previous state-of-the-art unsupervised approaches and is equivalent to supervised approaches trained with nearly 100,000 reference translations. To give some idea of the level of advancement, an improvement of 1 BLEU point (a common metric for judging the accuracy of MT) is considered a remarkable achievement in this field; our methods showed an improvement of more than 10 BLEU points.

That said, this specific method does require the relative conceptual spacing of words to be similar between languages; I don't see why that would hold for Human <-> Whale languages.
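To make the "relative conceptual spacing" idea concrete: unsupervised MT methods of this family align two monolingual word-embedding spaces by finding a linear map that makes their geometry coincide, which only works if the two spaces are roughly isometric. A minimal sketch (toy data, not Facebook's actual code): given a small seed dictionary, the map can be found in closed form as the orthogonal Procrustes solution via SVD.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "language A" embeddings: 50 words in 8 dimensions.
X = rng.normal(size=(50, 8))

# Toy "language B" embeddings: same conceptual geometry, just rotated.
# (This is the key assumption: relative spacing of concepts is shared.)
Q, _ = np.linalg.qr(rng.normal(size=(8, 8)))  # random orthogonal rotation
Y = X @ Q

# Orthogonal Procrustes: W = argmin ||X W - Y||_F over orthogonal W,
# solved by W = U V^T, where U S V^T is the SVD of X^T Y.
U, _, Vt = np.linalg.svd(X.T @ Y)
W = U @ Vt

# The recovered map carries "language A" vectors onto "language B" ones.
err = np.linalg.norm(X @ W - Y)
print(f"alignment error: {err:.2e}")
```

If whale communication doesn't share this geometric structure with human languages, no such `W` exists, and the alignment error stays large no matter how much monolingual data you have.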



