
> but this is not the same thing as being an imperative device. Almost every CPU out there is multi core these days

The interface to the CPU is imperative. Each core (or thread, for SMT) executes a sequence of instructions, one by one. Even with out-of-order execution and speculation, the results are as if the instructions had been executed one by one.
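To make the "as if one by one" part concrete, here's a tiny C sketch (the values are illustrative, nothing more): an out-of-order core is free to overlap the two independent statements, but whatever it exposes architecturally has to match the sequential program.

    #include <stdio.h>

    int main(void) {
        int x = 2, y = 3;
        int a = x * y;     /* independent of b: the core may start either first */
        int b = x + y;     /* independent of a: may execute in parallel with it */
        int c = a * b;     /* true dependency: must wait for both results       */
        printf("%d\n", c); /* always prints 30, whatever order ran internally   */
        return 0;
    }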

> and GPUs absolutely don't work in an imperative manner, despite what a GLSL script looks like.

They do. Each "core" of the GPU executes a sequence of instructions, one by one, but each instruction manipulates several separate copies of the state in parallel; the effect is like having several identical cores which operate in lockstep.
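A toy model in C of what that lockstep looks like (the lane count and the two-"instruction" program are made up for illustration; real GPU ISAs differ):

    #include <stdio.h>
    #define LANES 4

    int main(void) {
        float x[LANES] = {1, 2, 3, 4};  /* per-lane registers         */
        float y[LANES];

        /* "Instruction 1": every lane multiplies before moving on.   */
        for (int lane = 0; lane < LANES; lane++)
            y[lane] = x[lane] * 2.0f;

        /* "Instruction 2": every lane adds, again in lockstep.       */
        for (int lane = 0; lane < LANES; lane++)
            y[lane] += 1.0f;

        for (int lane = 0; lane < LANES; lane++)
            printf("%g ", y[lane]);     /* prints: 3 5 7 9            */
        printf("\n");
        return 0;
    }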

> If we had changed the mainstream programming model years ago, perhaps chip manufacturers would have had more freedom to break free of the imperative mindset, and we could have radically different architectures by now?

The cause and effect are in the opposite direction. The "imperative mindset" comes from the hardware. Even Lisp machines used imperative machine code (see https://en.wikipedia.org/wiki/Lisp_machine#Technical_overvie... for an example).




> The interface to the CPU is imperative. Each core (or thread for SMT) executes a sequence of instructions, one by one. Even with out-of-order and speculation, the instructions are executed as if they were executed one by one.

That is, in the traditional model of declarative programming, the stated semantics are guaranteed, but the actual order of operations is not. So, in a sense, the CPU takes what could be construed as imperative code but treats it as declarative rather than imperative.


Exactly my point. With out-of-order execution, we execute as if in order, making sure that an instruction which depends on the outcome of another executes in the correct order.

We end up relying heavily on compilers like LLVM, which boil down exactly what depends on what and work out how best to lay out the instructions accordingly.
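As a sketch of the kind of thing an instruction scheduler does (illustrative only; real LLVM passes are far more involved), consider hiding a load's latency by placing independent work between the load and its use:

    int f(const int *p, int a, int b) {
        int loaded = *p;       /* long-latency load                          */
        int indep  = a * b;    /* no dependency on the load: the compiler    */
                               /* may schedule it here to hide that latency  */
        return loaded + indep; /* the add must wait for both operands        */
    }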

Imagine if the dominant programming style in the last few decades had been a declarative one. We wouldn't have had any of this nonsense about working out after the fact what depends on what; we could have sent the dependency information right down to the CPU level so that it could deal with it.



