> Intel wants Intel's GPU, not a choice of NVIDIA, S3, ARM, and soon AMD/ATI.
I haven't seen any signs of Intel trying to push their GPUs into ATI/NVidia's niche market (gamers).
NVidia and ATI (now AMD) GPUs have always had their strongest consumer base among gamers [1]. Intel GPUs have never been in the same class as their contemporary NVidia/ATI cards on any measure -- triangles per second, texture bandwidth, gigaflops. Intel also generally uses a shared-memory architecture, so the GPU draws on system RAM: its memory bandwidth is both limited and contended with the CPU.
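To put the bandwidth point in rough perspective, here's a back-of-envelope sketch (the parts and transfer rates are illustrative assumptions, not measurements of any specific product):

    # Theoretical peak memory bandwidth in GB/s:
    # channels x (bus width in bytes) x transfers per second
    def peak_bandwidth_gbps(channels, bus_width_bits, transfers_per_sec):
        return channels * (bus_width_bits / 8) * transfers_per_sec / 1e9

    # Integrated GPU sharing dual-channel DDR2-667 with the CPU
    # (hypothetical config): ~10.7 GB/s, and the CPU competes for it.
    shared = peak_bandwidth_gbps(2, 64, 667e6)

    # Discrete card with its own 256-bit GDDR3 at 1.6 GT/s
    # (hypothetical config): ~51.2 GB/s, all for the GPU.
    dedicated = peak_bandwidth_gbps(1, 256, 1.6e9)

    print(f"shared: {shared:.1f} GB/s, dedicated: {dedicated:.1f} GB/s")

Even before contention with the CPU, the integrated part starts with a fraction of the raw bandwidth.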
Intel's GPUs are focused on being a low-cost, low-power, on-board graphics solution. As long as they can run a 3D UI and play HD video, they're not going to push the performance envelope any further, for the good and simple reason that they don't want to incur additional manufacturing cost, chip area, design complexity, and power consumption for features that are irrelevant to non-gamers.
[1] By "gamers," I really mean anyone who's running applications that require a powerful GPU.