Hacker News

The original thesis concerns only computable functions, which are deterministic by definition, and it says nothing about their complexity. There are extended variations of the thesis, though. The complexity-theoretic variation (the extended Church–Turing thesis) states that any reasonable model of computation can simulate any other with at most polynomial slowdown. It has been widely believed to be false since the rise of quantum computing.



Nondeterministic Turing Machines were introduced very early and became the basis for the Church/Turing Thesis.

An important complexity result is that, in important applications, a digital implementation can be hundreds of times faster than a parallel lambda expression. Consequently, the lambda calculus cannot be a practical foundation for computing.

See the following for more information:

"An Actor Application Can Be Hundreds of Times Faster Than a Parallel Lambda Expression"

https://professorhewitt.blogspot.com/2021/03/an-actor-applic...
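As a rough, hedged illustration of the kind of gap being claimed (not the Actor-based benchmark from the linked post): a tree-recursive Fibonacci, the shape a naive parallel lambda expression unfolds into, performs exponentially more work than a simple iterative loop. The counters below are hypothetical instrumentation added for the sketch.

```python
def fib_recursive(n, counter):
    """Tree-recursive Fibonacci; counter[0] counts function calls."""
    counter[0] += 1
    if n < 2:
        return n
    return fib_recursive(n - 1, counter) + fib_recursive(n - 2, counter)

def fib_iterative(n, counter):
    """Iterative Fibonacci; counter[0] counts loop steps."""
    a, b = 0, 1
    for _ in range(n):
        counter[0] += 1
        a, b = b, a + b
    return a

rec_calls, it_steps = [0], [0]
assert fib_recursive(20, rec_calls) == fib_iterative(20, it_steps) == 6765
print(rec_calls[0], it_steps[0])  # exponential vs. linear work for the same answer
```

Note this sequential sketch only shows the asymptotic work gap; the linked post's argument concerns constant factors between Actor implementations and parallel lambda expressions, which this example does not measure.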



