
Natural language is hardly appropriate for the things we use computers for, primarily because of its inherent ambiguity. It's not a language for expressing algorithms, which is precisely what one uses a computer for. If we expect to code in a language that is close to natural, we should also expect the error rate of natural systems.
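For instance, even a short arithmetic instruction is ambiguous in English, while a programming language forces one reading. A minimal sketch (the example phrase and function names are mine, not from the comment):

```python
# The hypothetical English instruction "multiply x plus y by two"
# admits two readings; code must commit to exactly one of them.

def reading_one(x, y):
    # "(x plus y), multiplied by two"
    return (x + y) * 2

def reading_two(x, y):
    # "x, plus (y multiplied by two)"
    return x + y * 2

# The two readings disagree for most inputs:
print(reading_one(3, 4))  # 14
print(reading_two(3, 4))  # 11
```

A listener resolves this from context and sometimes gets it wrong; a compiler resolves it from precedence rules and never varies.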

But then Dijkstra said it much better: http://www.cs.utexas.edu/users/EWD/transcriptions/EWD06xx/EW...
