
So we would never build a computer because back in 1950 we didn't know how to make compilers, only raw bytes of machine code (we couldn't even "assemble" assembly language). Life sometimes requires that you create a prototype of something you think will work, just to see whether it really does in the real world.



But with 1950s-era machines, programmers were expected to schedule instructions optimally by hand, because compilers simply didn't exist yet.

VLIW architectures are often proposed as a way to simplify superscalar logic, but the problem with VLIW is that it forces a static schedule, which is incompatible with any code or architecture where the optimal schedule depends on the actual data: any code with unpredictable branches, or memory accesses that may either hit or miss the cache. In general-purpose CPU terms, that describes virtually all code. VLIW architectures have persisted only in DSPs, where the set of algorithms being optimized is effectively small and closed.


> So we would never build a computer because back in 1950 we didn't know how to make compilers

No, that's different: the big idea with Itanium was specifically to shift the major scheduling work to the compiler. We didn't build the first computers with the idea that we'd build compilers later.


But we did build an awful lot of RISC machines with exactly that idea. And it worked.



