
Thanks for the explanation! One follow-up question: how can we still call Ignition an interpreter in this flow, where TurboFan is still used to generate machine code? Doesn't that defeat the point of having an interpreter in V8, which is to run on platforms without write access to executable memory, such as iOS or the PS4?



While the bytecode handlers (and other builtins) are generated by TurboFan, this happens when V8 itself is compiled, not at runtime. The generated code ships inside the binary as embedded builtins.
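To make that concrete, here is a minimal, purely illustrative C++ sketch (not V8's actual code; the bytecodes and handler names are invented): the handlers are ordinary functions compiled into the binary at build time, and the interpreter just dispatches through a table at runtime, so it never needs to write executable memory.

    #include <cstdint>
    #include <cstdio>
    #include <vector>

    // Three invented bytecodes, just enough to show the dispatch pattern.
    enum Bytecode : uint8_t { kLdaZero, kInc, kReturn };

    struct State { int64_t acc = 0; };

    // Each handler is an ordinary function, compiled into the binary at
    // build time -- the runtime only calls them, it never emits code.
    using Handler = void (*)(State&);
    void HandleLdaZero(State& s) { s.acc = 0; }
    void HandleInc(State& s)     { s.acc += 1; }
    void HandleReturn(State&)    { /* result stays in the accumulator */ }

    static const Handler kHandlers[] = {HandleLdaZero, HandleInc, HandleReturn};

    int64_t Interpret(const std::vector<uint8_t>& bytecode) {
      State state;
      for (uint8_t b : bytecode) {
        kHandlers[b](state);   // plain table dispatch, no JIT involved
        if (b == kReturn) break;
      }
      return state.acc;
    }

    int main() {
      std::vector<uint8_t> program = {kLdaZero, kInc, kInc, kReturn};
      std::printf("result = %lld\n", static_cast<long long>(Interpret(program)));
    }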


This suggests that a specialized app (such as a set-top box, smart TV, or game console) could push more code through the pre-JIT process to further close the performance gap. (This is interesting to me, because I haven’t seen much interest in pre-JIT compilation since the early days of Java, HotSpot, etc.)
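To illustrate that idea (continuing the toy sketch above; the bytecode names and the fused handler are entirely made up, and this is not a mechanism V8 offers): a device with a fixed workload could run a build-time pass that rewrites known-hot bytecode sequences to use extra, specialized handlers that are also compiled ahead of time, so the hot path pays fewer dispatches without any runtime code generation.

    #include <cstdint>
    #include <vector>

    // Invented bytecodes; kIncBy3 is a "fused" handler added at build time
    // for a sequence that profiling showed to be hot on this device.
    enum Bytecode : uint8_t { kLdaZero, kInc, kIncBy3, kReturn };

    // Build-time peephole pass: collapse "kInc kInc kInc" runs into the
    // fused bytecode before the program ships, so the precompiled handler
    // table covers more of the work a JIT would otherwise do at runtime.
    std::vector<uint8_t> FuseHotSequences(const std::vector<uint8_t>& in) {
      std::vector<uint8_t> out;
      for (size_t i = 0; i < in.size();) {
        if (i + 2 < in.size() &&
            in[i] == kInc && in[i + 1] == kInc && in[i + 2] == kInc) {
          out.push_back(kIncBy3);
          i += 3;
        } else {
          out.push_back(in[i++]);
        }
      }
      return out;
    }

    int main() {
      std::vector<uint8_t> program = {kLdaZero, kInc, kInc, kInc, kReturn};
      std::vector<uint8_t> fused = FuseHotSequences(program);
      // fused == {kLdaZero, kIncBy3, kReturn}: one dispatch instead of three.
      return fused.size() == 3 ? 0 : 1;
    }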



