That's stupid. Competent 3D on Windows is DirectX. Competent 3D on Mac/iOS is Metal. Competent 3D on consoles uses their proprietary APIs. So, what is the point of OpenGL portability? Between Linux and Android? You _will_ have to switch APIs if you are making anything worthy.
That's some impressive graphics fetishism on display there. It's hard not to take offense at the comment that anything less is not "worthy". Not everyone is a AAA studio with an engine development team, nor should everyone hold themselves to those standards of engine performance.
OpenGL 3.3 is a pretty nice target. It's easy to port an OpenGL 3.3 game to a lot of systems, including mobile, since OpenGL ES 2.0 is pretty similar, and Windows XP, which cannot run DirectX 10. (This matters less as time goes on, but it's been an important point in certain markets in Asia.) So you can write the graphics code once, more or less, and run it everywhere, more or less.
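To make the "write the graphics code once, more or less" part concrete: one common trick for sharing shader source between desktop GL 3.3 and GL ES is to keep the shader body API-neutral and prepend the right version/precision preamble at load time. A minimal C++ sketch (the function name is hypothetical, not from any particular engine):

```cpp
#include <string>

// Builds a complete shader source by prepending the version line each
// target expects. GLSL ES requires an explicit default precision in
// fragment shaders, so the ES path adds one; desktop GL 3.3 gets the
// 330 core profile header instead.
std::string make_shader_source(const std::string& body, bool gles) {
    const std::string preamble = gles
        ? "#version 100\nprecision mediump float;\n"
        : "#version 330 core\n";
    return preamble + body;
}
```

In practice you'd also paper over keyword differences per stage (e.g. `#define attribute in` on the desktop path) and stick to the common subset of both dialects, but the preamble idea is the core of it.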
The idea that somebody who chooses to do this is therefore not "competent" just boggles the mind. No point in deriding engineers who decide not to use the latest bells and whistles in the graphics pipeline. Maybe they just have other priorities.
(To clarify: the recommendation that everyone should just use OpenGL regardless of situation is, in fact, stupid. So I agree with that part. But it's equally stupid to say that everyone should use D3D12 / Metal regardless of circumstance.)
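For what it's worth, the "you will have to switch APIs" problem is usually handled by hiding the API behind a thin backend interface, so the choice stops being all-or-nothing. A minimal C++ sketch of the pattern (all class and function names here are hypothetical stubs, not a real engine's API):

```cpp
#include <memory>
#include <string>

// Game code talks only to this interface; the platform build picks
// the concrete backend.
struct Renderer {
    virtual ~Renderer() = default;
    virtual std::string name() const = 0;
    virtual void draw_frame() = 0;
};

struct GLRenderer : Renderer {
    std::string name() const override { return "OpenGL 3.3"; }
    void draw_frame() override { /* glDrawElements(...) etc. */ }
};

struct D3D11Renderer : Renderer {
    std::string name() const override { return "Direct3D 11"; }
    void draw_frame() override { /* context->DrawIndexed(...) etc. */ }
};

// Compile-time backend selection; a real engine might also choose at
// runtime based on driver quality.
std::unique_ptr<Renderer> make_renderer() {
#ifdef _WIN32
    return std::make_unique<D3D11Renderer>();
#else
    return std::make_unique<GLRenderer>();
#endif
}
```

The abstraction isn't free (the APIs differ in resource binding and shader models), but it's what lets a small team ship OpenGL first and add a D3D or console backend later without rewriting game code.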
> That's some impressive graphics fetishism on display there.
More like compatibility and stability fetishism. nVidia and AMD can't even agree on how to compile the same GLSL shader - I'm much more confident about my ability to ship working D3D9 or D3D11 than OpenGL without a QA team with a wide range of hardware and driver revisions. D3D? Uniform bytecode. Uniform debug layers. Nice. Being indie just makes it easier - simpler rendering pipelines, easier to port.
For Android dev, where I have no choice about using e.g. OpenGL ES 2, the compatibility mess is so bad that even as a solo dev, I've been eyeing AWS Device Farm and Xamarin Test Cloud. Maybe AAA studios can afford the QA time - but I can't even afford enough phones to test to my satisfaction. And in my heart of hearts, I blame OpenGL, even if it's really the fault of mobile GPU vendors. I have a much weaker urge for a PC device farm, where D3D mostly just works. It'd be a much stronger urge if you threatened me with the prospect of supporting desktop OpenGL.
The biggest company I've worked for had about 50 people, and I was usually on a much smaller team. I don't think that's AAA. We were wasting milliseconds left and right and didn't need much per-platform tuning - there was still lower-hanging fruit around. Still, I almost agree with euos. Aside from compat - getting OpenGL working on a console or in the WinRT sandbox is probably more work and worse results than just doing a straight-up port. Worst of all worlds.
> The idea that somebody who chooses to do this is therefore not "competent" just boggles the mind. No point in deriding engineers who decide not to use the latest bells and whistles in the graphics pipeline. Maybe they just have other priorities.
Agreed - competent engineers and managers may decide that competent 3D support is not worth their time. I welcome this restraint when it comes to Excel's pie charts and the good fight against scope creep. If you're targeting the holy trifecta of Windows, Linux, and OS X, and little else - OpenGL might be right for your MVP and your launch window.
Yet I'd still be eyeing that OpenGL on Windows as possible technical debt. Hell, I basically look at OpenGL ES on Android as unsolvable technical debt. I'd have a hard time labeling it "competent 3d". And I'd be wondering if it was really better than D3D9 + Wine.
Metal might be a little flavor-of-the-minute. I need to give it a shot sometime...
Intel HD 3000 has better OpenGL support on Linux (thanks to Mesa). It's at 4.0 with a software implementation of ARB_gpu_shader_fp64. And I doubt it supports all of DX11 properly, either.
I don't think you need to worry about that where gaming is concerned, though. It's too old and already below the minimum requirements of a huge number of games.
> That’s correct for AAA titles. Casual gamers however often play on PCs without a dedicated GPU.
Even in that case, they wouldn't commonly use a Sandy Bridge-generation GPU. And those who do use it aren't expecting recent games to support it (whether those games are demanding or not). An increasing number of games already require OpenGL 4.x, even if they are not very demanding in practice. And now with Vulkan, older Intel GPUs simply won't cut it anymore.
Until that happens, software developers need to choose whichever GPU API works best for their particular project.
Personally, I have not developed games for several years.
For the last 1.5 years, I've been working on CAD/CAM software. Traditionally people use OpenGL in this area. I picked D3D 11 instead. The renderer is reasonably sophisticated; there are many complex shaders in there. The software is now used by thousands of customers worldwide, and yet there have been very few rendering-related bugs so far.
The article you linked to is advice from people who port games to Linux on how to make games more portable. It's not advice on how to ship them on time, how to make them performant, how to minimize the amount of developer labor required to build them. Given what a small share of the market Linux is for games, developers have other priorities than what makes games easy to port to Linux. And it's not a chicken-and-egg problem; the Linux market for games isn't much smaller than the Linux market for any other commercial software.
Performance is an issue orthogonal to portability. You can make it portable and perform well too (using something like Vulkan and proper engine design).
The Linux gaming market is growing. Not sure about other commercial software in this context, but I'd guess it can be indirectly affected too, since a bigger gaming market makes Linux desktop usage grow in general.
So, "Here, use a different API with worse developer support, and in return you get access to a market that's ~1-2% the size of the Windows gaming market?"
I mean, I'm all for OpenGL. I use OpenGL. Hell, I develop OpenGL games on Linux and then port them to Windows afterwards (I've released ~10 games made that way so far). But the choice to use it depends on what market you're going after.
Vulkan has better performance than DX11, and it has literally twice the audience of DX12 - DX12 only supports Windows 10, so any W7/W8/etc users are SOL unless you use DX11 or Vulkan.
My point was, adoption in engines is relatively slow, but once it happens, things become easier. So current usage lists aren't a reflection of general progress. Unity, for example, gained Vulkan support in the latest version, which came out recently; Unreal is still not there (but already close). Same goes for Lumberyard.
Of course the tax on development imposed by lock-in freaks translates into that slowness. I.e., as you said, the need for engines to support many balkanized APIs means slower releases to the market.
In the abstract, performance may be unrelated to portability. But if tooling, drivers, and expertise are all focused on DirectX, and if getting OpenGL to do everything DirectX can do requires using vendor-specific additions to the spec that are non-portable, then in practice it may be harder to make OpenGL performant across multiple vendors' GPUs.
I'm going to agree with cwyers here. The choice of which graphics API to use is not automatic, and "just use OpenGL, dummy!" just glosses over the fact that different developers have different priorities.
OpenGL gets you desktop and mobile all in one fell swoop (well, with a lot of tinkering and hard work). If that's your market, sure, go ahead. If you're writing a higher-end game targeting consoles and PC, then you'll end up doing a lot of extra work porting your engine to consoles and in return have access to a much larger market. Those consoles don't really support OpenGL, at least not as a first-class citizen, but one of them does support Direct3D. Direct3D also has excellent developer support.
Given the good support, good tool integration, and the fact that Direct3D runs on, say, two of your three most important platforms, it's a good choice.
That rhetoric about "the dark ages" is doing nobody any favors; it almost makes it sound like Sony uses tabs in their source code, or something even worse. Is this some kind of holy war, OpenGL versus the forces of darkness?
To clarify, yes, I was talking about PS4 and Xbox One. Those two systems, plus Windows, are the primary market for most AAA games. Just do a search for "2017 video games" and you'll see a big chunk of games that only run on those three systems. OpenGL isn't a very compelling choice there.
> That rhetoric about "the dark ages" is doing nobody any favors
I'm calling it what it is. Their refusal to support Vulkan shows they intend to remain in the dark ages, and to continue forcing developers to operate with balkanized, non-portable APIs. It's as if some browser maker refused to support HTML and required you to use ActiveX, Flash, or whatever.
"I'm calling it what it is," is a poor excuse for poor rhetoric. Balkanized APIs are a historical fact. Khronos wasn't exactly quick to support modern hardware in the 2000s, everyone paid the price for it, Microsoft picked up the slack, and here we are with developers entrenched in the Direct3D ecosystem. Vulkan arrived about a year ago and that's not exactly a lot of time to sweep away all the inferior APIs before it. These things take time. The engine developers are figuring out how to use Vulkan, and Vulkan support is spreading. It's going to take more than a couple years.
Meanwhile, Microsoft is managing their profitable Windows business by supporting key app developers, including game studios, who are already happily using Direct3D. Are they going to axe it and piss off a valuable segment of developers? No.
The analogy with Flash is a pretty good one. Flash was used everywhere on the web for years and years. Then the iPhone sucked all the oxygen out of the room and Flash died.
Just like everyone switched from Flash to JavaScript so they could get their websites to run on the iPhone, maybe everyone will switch from Direct3D to Vulkan for the same reason. Flash took a long time to die. After Apple announced the iPhone would not support Flash, it was five years before YouTube stopped using it for video. And yes, there were a lot of good reasons to keep requiring Flash in the meantime, until the open web caught up.
The dark ages were long and historic too, which doesn't mean they didn't have a lot of problems :) So I find it a proper comparison. Those who insist on keeping things balkanized today (MS, Sony and Co.) are slowing down progress.
> maybe everyone will switch from Direct3D to Vulkan for the same reason.
I definitely hope so. The current messed-up state shouldn't exist forever.
As developers put it here[1]: "Don't make a game that depends on Direct3D. All the hard work is getting the thing to run with OpenGL".
1. https://www.gamingonlinux.com/articles/about-linux-games-bei...