Do you happen to know if it has always been this way, or is this the result of a gradual squeeze toward a limit, driven by the megahertz wars, increasingly shortened product cycles, and each company's product line being structured in a way that tends to cannibalize its own sales before generating fresh profit?
And then there's the decreasing practical return for their customer base when upgrading: most basic computation tasks were solved long ago, and it's only exotic high-end workloads (plus a constant bloat scaling factor that is largely a consequence of the added performance, rather than its target) that benefit from more speed.
In short, was Intel making single-digit, or even sub-one-percent, margins back in the '80s or early '90s?
That's a great question; I'd have to dig into their financial statements to learn more. It's hard to get reliable data from that long ago for free. From what I remember as a consumer, MSFT + INTC (Wintel) absolutely cleaned up in the '90s, until AMD started the CPU price and MHz wars and Apple started making MacBooks.
For Intel, it looks like they kept one dollar out of every four until the recent crash. I don't know the reason for the decline, but I know I wouldn't want to compete against Apple.