Wikipedia: "Fast Approximate Anti-Aliasing (FXAA) is an anti-aliasing algorithm created by Timothy Lottes at NVIDIA. It was first used in several games, such as The Elder Scrolls V: Skyrim."
Interesting, but he mentions all these techniques that are only available if you have low-level access to the GPU. On a PC you never have that, because everything goes through DirectX or OpenGL, so why do GPUs have all these hardware features that no PC game will ever use?
For the same reason a modern CPU has capabilities far beyond what's exposed through JavaScript or Ruby: it's easier to build a powerful general-purpose device and then implement any API through a software layer than to build a hardware implementation of every complex API.
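To make the analogy concrete, here's a minimal C sketch (my example, not from the thread) using SSE intrinsics on x86: the silicon can add four floats in a single instruction, but a JavaScript or Ruby programmer never touches that directly; the runtime decides if and when the capability gets used on their behalf.

    /* Minimal sketch, assuming an x86 CPU with SSE (compile: gcc -O2 simd.c). */
    #include <stdio.h>
    #include <xmmintrin.h>   /* SSE intrinsics */

    int main(void) {
        __m128 a = _mm_set_ps(4.0f, 3.0f, 2.0f, 1.0f);
        __m128 b = _mm_set_ps(40.0f, 30.0f, 20.0f, 10.0f);
        __m128 sum = _mm_add_ps(a, b);   /* one instruction, four additions */

        float out[4];
        _mm_storeu_ps(out, sum);
        printf("%f %f %f %f\n", out[0], out[1], out[2], out[3]);
        return 0;
    }

The hardware feature exists either way; whether a given language or API layer ever exposes it is a separate decision.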
I would think future updates to OpenGL/DX could expose more features of the cards to devs. Otherwise, to use all the features of a card, a game would need to be heavily optimised, which increases development time and cost, and it would then only support a specific configuration owned by so few people that all that development wouldn't be worth it.
Many of these features are no doubt used by the driver author(s) to implement cool stuff in GL extensions. One particularly crazy example: https://developer.nvidia.com/nv-path-rendering
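Roughly how that surfaces to an application (a rough sketch in C, not taken from the linked page; it assumes GLEW, an existing GL context, and that glewInit() has already run): the driver advertises GL_NV_path_rendering in its extension string, and only then do the extra entry points become usable.

    #include <stdio.h>
    #include <string.h>
    #include <GL/glew.h>

    void draw_star_if_supported(void) {
        /* Vendor extensions are opt-in: check the driver exposes it first. */
        if (!glewIsSupported("GL_NV_path_rendering")) {
            fprintf(stderr, "GL_NV_path_rendering not exposed by this driver\n");
            return;
        }

        /* Define a path from an SVG-style string, then stencil and cover it.
         * (A real renderer would also configure stencil test state; the point
         * here is that the heavy lifting happens in the driver/hardware.) */
        const char *svg = "M100,180 L40,10 L190,120 L10,120 L160,10 Z";
        GLuint path = glGenPathsNV(1);
        glPathStringNV(path, GL_PATH_FORMAT_SVG_NV, (GLsizei)strlen(svg), svg);

        glStencilFillPathNV(path, GL_COUNT_UP_NV, 0x1F);
        glCoverFillPathNV(path, GL_BOUNDING_BOX_NV);

        glDeletePathsNV(path, 1);
    }

So the "unused" hardware isn't wasted; it just reaches PC developers through extensions like this rather than through the core API.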
That's not the Xbox leak the current specs come from. I've managed to confirm the newest leak (8 Jaguar cores, 4 MB L2, 12 GCN CUs, 8 GB DDR3, 32 MB fast local pool) with a guy who works on a game for it and whom I trust.
A long long time ago back in the 64-bit era, a bunch of fellow conspirators got some of the videogaming news sites to wax poetic about the impending arrival of Super Multi-Uniform Rational F-Splines...
The more things change, the more they stay the same...