Easy: just think of how browsers render HTML/CSS/JavaScript, of each C compiler's specific interpretation of ANSI C, of the UNIX variations arising from what POSIX leaves to each implementation, and of many other implementation-vs-standard differences.
Just like those, OpenGL is a written standard describing how a 3D API is supposed to behave.
It happens that what those papers state and what each team of developers at every card manufacturer understands by them are not always the same.
Then there is the set of card-specific extensions, one for every nice feature a vendor wants to sell on its graphics card that has yet to be adopted into the official OpenGL standard.
There are of course certification tests available, but they are costly and don't cover 100% of API usage anyway.
So programming OpenGL turns out to be like writing web applications: you try to keep the code portable and bug-free (with workarounds) across all the graphics cards out there.
> So programming OpenGL turns out to be like writing web applications: you try to keep the code portable and bug-free (with workarounds) across all the graphics cards out there.
And it's much harder to test, because you have to find the outdated graphics card with the outdated driver, and have a machine where you can plug in the offending card.
One reason is that shaders are shipped as text, so each driver vendor has to write its own parser for them, and not everyone implements the language in the same way.
Further, the drivers have to be more permissive than the spec strictly requires: it's not an option for a GPU vendor to ship a driver update that breaks Doom/Quake/Dota/etc.
Ideally games would ship only shaders that are strictly valid GLSL, but that's not quite the case.
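For what it's worth, that shader text goes straight to whichever compiler the installed driver ships, so each vendor's parser gets the final say at run time. Here's a minimal C sketch of that hand-off, assuming a GL 2.0+ context and a loader such as GLEW (the shader string and function name are just illustrative); the `float x = 1;` line relies on an implicit int-to-float conversion that isn't valid in GLSL 1.10, yet lenient compilers accept it while strict ones reject it:

```c
#include <stdio.h>
#include <GL/glew.h>   /* any GL loader works; a current GL context is assumed */

/* Illustrative shader: strictly invalid under #version 110 because of the
 * implicit int-to-float conversion, yet some drivers compile it anyway. */
static const char *frag_src =
    "#version 110\n"
    "void main() {\n"
    "    float x = 1;\n"                       /* lenient compilers let this slide */
    "    gl_FragColor = vec4(x);\n"
    "}\n";

GLuint compile_fragment_shader(void)
{
    GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(shader, 1, &frag_src, NULL);  /* raw text handed to the driver */
    glCompileShader(shader);                     /* parsed by *this* driver's compiler */

    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (!ok) {
        char log[1024];
        glGetShaderInfoLog(shader, sizeof log, NULL, log);
        fprintf(stderr, "Shader rejected by this driver:\n%s\n", log);
        glDeleteShader(shader);
        return 0;    /* 0 means "this particular driver said no" */
    }
    return shader;
}
```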
If you're developing GL, do everyone a favor and start using the official GLSL validator in your build scripts.
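A minimal sketch of what that can look like as a pre-build step, assuming the Khronos reference validator (glslangValidator) is on the PATH; the shader paths are made up for illustration, and the build fails if any shader isn't strictly valid:

```c
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* Hypothetical project shaders; glslangValidator infers the stage
     * from the .vert/.frag extension. */
    const char *shaders[] = {
        "shaders/model.vert",
        "shaders/model.frag",
    };
    int failed = 0;

    for (size_t i = 0; i < sizeof shaders / sizeof shaders[0]; ++i) {
        char cmd[512];
        snprintf(cmd, sizeof cmd, "glslangValidator \"%s\"", shaders[i]);
        if (system(cmd) != 0) {   /* nonzero exit: the shader is not valid GLSL */
            fprintf(stderr, "GLSL validation failed: %s\n", shaders[i]);
            failed = 1;
        }
    }
    return failed;                /* fail the build on any invalid shader */
}
```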