
You can skip type checks, avoid unboxing performance hits, avoid deoptimization from changing maps, use integers instead of doubles for math-heavy computation, and the list goes on. The potential for speedups is significant.
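To make the "changing map" point concrete: in V8-style engines, adding a property to an object after construction gives it a new hidden class (map), and optimized code specialized on the old shape can be forced to bail out. A rough sketch, with purely illustrative object shapes and names:

    // Illustrative only: the kind of shape change that can trigger a deopt.
    function Point(x, y) {
      this.x = x;
      this.y = y;
    }

    function magnitudeSquared(p) {
      // Stays fast while every p shares the same hidden class {x, y}.
      return p.x * p.x + p.y * p.y;
    }

    const a = new Point(3, 4);
    const b = new Point(5, 12);
    b.z = 0; // new property -> new hidden class -> the call site is no longer monomorphic

    magnitudeSquared(a);
    magnitudeSquared(b); // an optimizing JIT may deoptimize or fall back to a slower path here

With static type information, none of those shape checks would be needed in the first place.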



It all sounds quite complex when you put it like that. But current JS engines can already do it, and they do it very well (and efficiently). They even use integer arithmetic whenever possible.

By the speedup, I expect you mean the "preparation" speedup, not the execution speedup. It depends on what you do, but in most cases the preparation speedup will not be that significant.


I meant faster execution. Browsers don't do that, and can't, because of the dynamic nature of JS.

Chrome can do 31-bit signed integer computation when it deems it possible, but values still need to be converted to 64-bit doubles most of the time.
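A hedged sketch of what that looks like in practice (the exact small-integer width and the heuristics are engine- and version-specific, and the function name is just illustrative):

    // Illustrative only. V8 keeps small integers ("Smis") in a tagged immediate form;
    // the |0 coercion is a common hint that a value can stay in int32 range.
    function sumInt(arr) {
      let s = 0;
      for (let i = 0; i < arr.length; i++) {
        s = (s + arr[i]) | 0;
      }
      return s;
    }

    let n = 1000000000;  // fits the small-integer (Smi) representation
    n = n * n;           // 1e18 overflows it, so the result becomes a heap 64-bit double

    // To inspect the representation yourself (V8-specific, undocumented tooling):
    // run under `node --allow-natives-syntax` and call %DebugPrint(n) to see
    // whether a value is a Smi or a HeapNumber.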

I work on a very compute-intensive application and have profiled JS under Chrome down to the generated bytecode for countless hours.


I think you have no idea what you are talking about. Browsers do that and can do that. What do you mean by the "dynamic nature of JS", and why is it a problem?

Integers are not converted to doubles when it is not necessary. Can you show me which bytecode you mean?


I obviously also feel you have no idea what you are talking about, and so far you have provided no supporting evidence or justification for your argument. Nor have you elaborated on your experience with the topic.

I don't care enough about you or this argument to pursue it further. Please, do your own research.



