> Enough with this 'emulating a PDP-11' cliche (never seen it quoted as the PDP-7 before).
That's not a "cliche". It's the reality of how computers work today.

Internally they've been data-flow machines for quite some time. But to the outside world they need to emulate a command-stream machine (a model which did not change fundamentally since the time of the PDP-7!).
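To make that concrete, here's a toy sketch (not any real ISA or microarchitecture) of the point: a sequential command stream carries an *implicit* data-flow graph, which an out-of-order core has to recover at runtime from register names before it can execute anything in parallel.

```python
# Hypothetical mini "program": each instruction is (destination register,
# source registers). The sequential order hides the real dependencies.
instructions = [
    ("r1", []),            # 0: r1 = load ...
    ("r2", []),            # 1: r2 = load ...
    ("r3", ["r1", "r2"]),  # 2: r3 = r1 + r2
    ("r4", ["r1"]),        # 3: r4 = r1 * 2
    ("r5", ["r3", "r4"]),  # 4: r5 = r3 - r4
]

def dependencies(prog):
    """For each instruction, find which earlier instructions it truly
    depends on -- this is the data-flow graph the hardware reconstructs."""
    last_writer = {}  # register name -> index of its most recent writer
    deps = []
    for i, (dest, srcs) in enumerate(prog):
        deps.append({last_writer[s] for s in srcs if s in last_writer})
        last_writer[dest] = i
    return deps

print(dependencies(instructions))
```

Instructions 0, 1, and 3 turn out to have no mutual dependencies, so the core can run them in parallel even though the command stream pretends they are strictly ordered.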
> Explicit instruction-level parallelism has its own issues that make it difficult to use effectively, precisely because it must do statically what out-of-order processors do dynamically.
That's why nobody here ever talked about "explicit instruction-level parallelism"…
> The use of C as a scapegoat is odd.
No it isn't, as the success of C is the root of all evil in this case.
> What alternatives would have led to a different world of processors? Pascal? PL/I?
No, of course not, because Pascal and PL/I are of course also "just C" (with some minor differences that are completely irrelevant to this consideration).
> Experiments in parallelism at the time […]
Are irrelevant.
The topic is: how would things look if we started over with everything we know now, and with our current technical capabilities?
> the kind of instruction-level parallelism this line of argument calls for.
At least I have talked explicitly about data-flow (and nothing else)!
Imho the whole command-stream-based approach is a dead end.
Dynamically reconfigurable data-flow hardware (with local scratch memory instead of RAM) is imho the answer.
But of course it's almost impossible to compile sequential command streams (a.k.a. "imperative programs", which is synonymous with C and all languages that work the same way, so actually almost all languages in existence) into anything that could be executed efficiently on such data-flow hardware. That's why you would need to "rewrite the world" from scratch to get any benefit from finally sane hardware (instead of the degradation you'd expect if you tried to map our current sequential command streams onto this new world).
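For intuition, here's a toy software model (purely illustrative, all names made up) of the data-flow execution style I mean: a node "fires" as soon as all of its input tokens have arrived; there is no program counter and no instruction order at all.

```python
import operator

# Hypothetical data-flow graph: node name -> (operation, input nodes).
# Source nodes have no inputs and just produce a constant token.
graph = {
    "a":    (lambda: 3, []),
    "b":    (lambda: 4, []),
    "sum":  (operator.add, ["a", "b"]),
    "prod": (operator.mul, ["a", "b"]),
    "out":  (operator.sub, ["sum", "prod"]),
}

def run(graph):
    values = {}            # tokens produced so far
    pending = dict(graph)  # nodes that have not fired yet
    while pending:
        # Fire every node whose inputs are all available. In hardware,
        # all such nodes would fire in the same cycle, in parallel --
        # "sum" and "prod" here, for instance.
        ready = [n for n, (_, ins) in pending.items()
                 if all(i in values for i in ins)]
        for n in ready:
            op, ins = pending.pop(n)
            values[n] = op(*(values[i] for i in ins))
    return values

print(run(graph)["out"])  # computes (3 + 4) - (3 * 4)
```

The point of the sketch: parallelism falls out of the graph structure for free, whereas a compiler going the other way, from a C-style sequential program to such a graph, has to prove independence first, which is exactly the hard part.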