Valve: OpenGL is faster than DirectX — even on Windows (extremetech.com)
127 points by mrsebastian on Aug 2, 2012 | 69 comments



While Valve is certainly in a position to know this, it's worth keeping in mind that they do have a horse in this race. For other legitimate reasons, Valve wants to move away from Windows as a platform, and that requires that people be convinced of the viability of the alternatives, which in turn hinges on whether or not OpenGL is competitive with DirectX.

That doesn't mean they're wrong about OpenGL being faster than DirectX, and even if they are wrong, it doesn't mean they're being intentionally misleading. But it's important to keep our eye on the priors.


The problem is that ExtremeTech is slanting their words in order to sensationalize the story.

Valve (in their blog post):

"This experience lead to the question: why does an OpenGL version of our game run faster than Direct3D on Windows 7? It appears that it’s not related to multitasking overhead. We have been doing some fairly close analysis and it comes down to a few additional microseconds overhead per batch in Direct3D which does not affect OpenGL on Windows. Now that we know the hardware is capable of more performance, we will go back and figure out how to mitigate this effect under Direct3D."

Extreme Tech:

"As for why OpenGL is faster than DirectX/Direct3D, the simple answer is that OpenGL seems to have a smoother, more efficient pipeline. At 303.4 fps, OpenGL is rendering a frame every 3.29 milliseconds; at 270.6 fps, DirectX is rendering a frame in 3.69 milliseconds. That 0.4 millisecond difference is down to how fast the DirectX pipeline can process and draw 3D data."


I think the real interesting conclusion here is: if OpenGL has parity with Direct3D on Windows in terms of features and performance, why should anyone ever use Direct3D any more? Direct3D is almost non-existent on mobiles and tablets, so anyone using Direct3D is just making their lives harder if they want to go mobile, or port to any other OS. We don't live in Windows monopoly days any more. If OpenGL can get you the same thing on Windows, and helps make it easy to port to other big platforms, I can see Direct3D really falling to the sidelines.


Because it ISN'T feature parity.

In OpenGL all the fun stuff (e.g. the stuff that's in DX9+) is in vendor specific extensions.

A bit like vendor prefixes in CSS, except they can actually be implemented differently, have different names, etc. It's BAD news, and the main reason why 99% of AAA games use DirectX on Windows.


That might have been true in the OpenGL 1.5 days.

There are few things one might miss in core OpenGL 4.2. If anything, the features exposed in extensions building on OGL 4.2 are so new or experimental that D3D doesn't have them at all.


Most players don't have OpenGL 4.2.

You'll get hundreds of bug reports about the game crashing on start.

Testing on old hardware (for <50,000-triangle games that's Intel GMA, GeForce 4 MX, and some old Radeon) is a good way to be sure it works on most computers. Of course the game only needs to support those old cards at the lowest quality settings.


Here's Valve's viewpoint on user configurations: http://store.steampowered.com/hwsurvey/videocard/

To understand a somewhat more casual audience, it's interesting to contrast the users of Unity: http://unity3d.com/webplayer/hwstats/pages/web-2012Q1-gfxcar...


The same is true for Direct3D.

If the user's GPU doesn't support D3D 11, you have to supply a fallback.


But that's just it: a fallback, not totally separate codepaths for NVidia, ATI, etc., which is pretty much where legacy OpenGL is.


You are right. OpenGL works everywhere.

But you have to be careful: my amateur OpenGL 3D game had problems running on half the computers it was tried on, especially laptops. That's because different cards support different OpenGL extensions. You can't use non-power-of-two textures, can't use textures bigger than 512x512 (or some other limit), can't generate mipmaps automatically, can't have display lists that are too big.

I thought about an optional DirectX renderer when running on Windows, because I've heard it's more standardized.


The non-power-of-two issue is not a problem. Just load the texture with padding out to the next power of two and sample only the sub-region that doesn't include the padding.
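
A minimal sketch of that workaround (the function names and RGBA layout are just illustrative; the GL upload itself is only described in comments):

    #include <string.h>

    /* Round a dimension up to the next power of two. */
    static unsigned next_pot(unsigned v) {
        unsigned p = 1;
        while (p < v) p <<= 1;
        return p;
    }

    /* Copy a w x h RGBA image into the top-left corner of a pot_w x pot_h
       buffer (pot_w = next_pot(w), pot_h = next_pot(h)), zeroing the padding.
       Upload the padded buffer with glTexImage2D and multiply your texture
       coordinates by (float)w / pot_w and (float)h / pot_h so the padding
       is never sampled. */
    void copy_with_pot_padding(const unsigned char *src, unsigned w, unsigned h,
                               unsigned char *dst, unsigned pot_w, unsigned pot_h)
    {
        memset(dst, 0, (size_t)pot_w * pot_h * 4);
        for (unsigned y = 0; y < h; ++y)
            memcpy(dst + (size_t)y * pot_w * 4, src + (size_t)y * w * 4, w * 4);
    }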

OpenGL is an awesome platform for those who want system portability (e.g. Minecraft), or for any other indie developer who wants maximum exposure.


But now you know about those limitations :-)


I've met a lot of graphics programmers who are simply in love with D3D and despise OpenGL (my own experience; I work in a video game studio doing console and PC games).


That's all well and good, but the fact remains that Direct3D is certainly a better documented API, and also that OpenGL even now has annoying remnants of its past as a hissing, clanking state-machine (glBind* garbage). Speaking from experience, trying to get OpenGL to behave sanely in a multithreaded environment is nontrivial.
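
For anyone who hasn't run into it, the "glBind* garbage" refers to the bind-to-edit pattern: calls mutate whatever object happens to be bound to a global target rather than an object you hold. A minimal sketch (it assumes a current GL context; the texture data is dummy):

    #include <GL/gl.h>

    /* Assumes an OpenGL context is already current on this thread. */
    void create_texture_bind_to_edit(void)
    {
        static unsigned char pixels[64 * 64 * 4];  /* dummy RGBA data */
        GLuint tex;
        glGenTextures(1, &tex);

        /* Everything below operates on whatever is bound to GL_TEXTURE_2D,
           not on 'tex' directly; a helper that binds another texture in
           between silently redirects the remaining calls. */
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 64, 64, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    }

Direct3D hands you an interface pointer and you call methods on it, which is a big part of why it's easier to reason about, especially across threads.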


One word: inertia.

The question development shops would be asking themselves is: will the additional revenues provided by Linux / Mac ports of these games be enough to offset the cost of retraining their people (or hiring new people) who are prepared to do serious OpenGL development?

Moreover, many game companies are using licensed game engines. These engines may not now (or ever) support OpenGL, so the game developers themselves may not have the option to change--even if they did want to.


Yes, DirectX benefits from network effects, but there are a ton of mobile game developers these days developing for Android and iOS in OpenGL ES, and I could see a day coming when these network effects are reversed in OpenGL's favor. The fact that Valve and EA are going to support Linux and OpenGL is just a bonus, although a pretty big one. And don't forget Blizzard's Diablo 3 and Starcraft 2 are also written in OpenGL for Mac OS X.


> a ton of mobile game developers these days developing for Android and iOS in OpenGL ES, and I could see a day coming when these network effects are reversed in OpenGL's favor.

I'm pretty sure we're seeing that already. And WebGL is not going to improve the situation for D3D.


Why stop at OpenGL? Let's work with GPU vendors to get a more consistent, lower-level API for high-performance 3D graphics. John Carmack observed that not being able to program the hardware directly caused an order of magnitude drop in effective performance of PC hardware. http://www.rockpapershotgun.com/2011/06/08/get-with-the-prog...


Because isolation from dozens of graphics architectures (many from the same vendor) and hiding chipset bugs (not just in the graphics hardware, btw) that would scare the jebus out of you is worth it. Not having to track down weird synchronization issues is like money in the bank to most developers.

Make every game reinvent its own chip-level drivers? Wow. You want to run 10x faster, or slip your schedule by years?


Well said. Let's be honest, John Carmack is something of a rarity even in the gaming world. While I'm certain that he could handle the challenges associated with tearing down those abstractions, I'm not sure the throngs of other developers would fare so well.

Listening to people like Carmack, Torvalds, and others of their ilk is a good thing; but, we need to be cognizant that they are, themselves, special cases in the world of engineering and what can work well for them likely won't work well for the other 90% of the engineering world.


You can get the best of both worlds with open-source drivers, though.

They imply well-documented hardware and the potential to access it directly. But you can also use, and/or help improve, the higher levels of abstraction.


You're talking about the really great out-of-the-box support that audio always enjoys on Linux, right?

No, let's be honest--we'd end up with a half-dozen opinionated projects with slight overlap implementing different graphical interfaces, and another half-dozen metaprojects on top of that. Fuck. That.

Just give me a competent implementation of OpenGL or Direct3D by leading vendors who know their hardware and quirks, and be done with it.


Audio support across the board is actually better than the graphics support.

What is changing now is that, because of the consolification of commercial OSes, there is strong commercial pressure to support Linux, just to keep an open OS around. What is happening with OpenGL on Linux now is because of this change.

So audio isn't a good counterexample: who knows how much worse it would have been with closed-source drivers.

And the problem isn't having multiple solutions/systems available: the problem is the lack of a clear winner.

Companies like Valve and Canonical will be getting more dominant in picking the "winning" systems. This is already true to such an extent that companies like NVidia and AMD go out of their way to please companies like Valve. Contrast this with the kind of get-on-our-knees-and-beg relationships the Linux community has had.

And Valve is arguing for open-source drivers. You should read Intel's blog about their cooperation; they felt like they had to constantly remind Valve that they were preaching to the choir.

And NVidia and AMD might not give Linux much priority, but they will not let Linux ruin their relationship with the gaming industry.

The point about open drivers is continuity and integration. The hardware vendors really should concern themselves with the quality. The open-source community should concern itself with adapting that support to its needs.

I want NVidia to write and maintain their driver, but I want them to publish the source. They are the ones that should maintain quality: they are the ones profiting from it through hardware sales.

Audio is a really bad example, because the actual hardware is pretty much without a meaningful profit margin, being integrated and all. And on the other side, there are few needs that reach beyond just "play this audio buffer". I don't know who writes these systems, or what motivates them. But 99% of the needs were already fulfilled with OSS.

That's very different from game developers contributing to the graphics infrastructure to fix actual problems they have.


"Thats very different from game developpers contributing to the graphics infrastructure, to fix actual problems they have."

I think you grossly overestimate the time most developers have to fix other people's problems, especially game developers.

In fact, I'll go further: if you give game developers the impression that they can go look at driver code, you are opening an entire box of sadness and compulsive, brittle micro-optimization that will consume a lot of man-hours.

It doesn't matter about the power trip Linux people can have now that the graphics vendors are coming to them. It doesn't matter about the "freedom" for overworked and underpaid developers to muck around in driver code that frankly may exceed what they are competent at doing. It doesn't even matter that the open-source community can now second-guess and snark about shitty vendor driver code.

What matters is that this innocent little idea could do very, very bad things to developers. "It Just Works, don't worry" is a solid reassurance, and having my game fail because some neckbearded jackass decides to recompile the Nvidia sources but-oh-so-different and push it upstream is unacceptable.

This is a bad idea.

And, for chrissakes, let's start small. How about you give me a stable ABI to program against, and we'll take it from there? Baby steps.


How would that work? Would it need to use C? Would this help?

http://hsafoundation.com


My impression is that part of the issue is that OpenGL has only really gotten its act together, so to speak, recently, and that fresh in the memory of many game devs is a time when DirectX really was obviously better.


This article completely misses the fact that there's way more to DirectX than just Direct3D. DirectInput, DirectPlay, DirectSound, etc. are out of scope for OpenGL, which means you have to add a lot of unrelated libraries to approach the same functionality as DirectX.


On the other hand,

> DirectInput

Officially deprecated in favor of the Windows message loop and XInput.

> DirectPlay

Deprecated in favor of GFWL; it's not even present in the DirectX SDK anymore (I believe), and some modules (voice and NAT helpers) were completely removed in Vista.

> DirectSound

Deprecated in favor of XAudio2 (Vista and 7 implement DirectSound in software, although ALchemy will dynamically intercept DirectSound3D calls and translate them to OpenAL on the fly on Audigy and X-Fi cards).


That's not really so much of a problem as you might think:

http://www.libsdl.org/


The lead developer of SDL is now working at Valve, so I guess that is a very likely solution.


Do you mean Ryan Gordon?



Sorry, but Valve has been on a smear campaign to slow Windows 8 adoption for one reason: Microsoft will own the store and the keys. So I'm immediately skeptical of any information they put out there. A lot of their grumpy comments sound very similar to Sierra Games of the '80s and '90s resisting the jump from DOS to Windows.

I do think this push into Linux is great for making Linux a richer platform overall. That way everyone wins, but I still do not see it as an overly successful monetary venture.

If Windows 8 is a failure, people will either stay with what they know (Windows 7/XP) or continue to defect to OS X. Microsoft, doing what they always do, will see where they need to improve and continue to iterate, pushing out a more polished Windows 9.


OpenGL is still available through Windows. It's hard to see this as a "smear campaign" on anything but the Direct3D batch processing.


Did you read the actual Valve blog post, or just the sensationalist ExtremeTech article? If you read the former, you'll have to point me to the part where they did the smearing. I must have missed it.


Is it time for an OpenGL revolution?

Perhaps not; let's wait until Valve has a chance to fix their D3D issues. It seems to me the blog post was more about success in ferreting out problems with drivers and APIs on Linux and Windows than pure Linux/OpenGL superiority.

I'm a longtime Linux user, and there is little reason to assume Linux is the king of graphics. One thing worth pointing out is that Valve is using an Nvidia card with proprietary drivers, which don't make use of any shared Linux graphics infrastructure (Gallium3D, etc.). So this type of performance gain compared to Windows may not be repeatable with Intel or ATI cards.


Valve has been optimizing Source on Direct3D/Windows for 10 years, so I don't think waiting any longer will help.


Valve's own blog seems to think it will: "Now that we know the hardware is capable of more performance, we will go back and figure out how to mitigate this effect under Direct3D." http://blogs.valvesoftware.com/linux/faster-zombies/


Notice he said mitigate, not eliminate. There's a difference.


Cards are very different; even if the driver is open-sourced, I don't think other cards can benefit from it.


What, only weeks after I threw away my (very) old OpenGL books? The reason DirectX is used everywhere is that graphics cards have been developed to suit it better for the past 10 years or so, I thought. Also, DirectX is a more complete solution for developers, including networking, audio, controller-handling code ...


OpenGL ES 2.0 is actually pretty clean; the monstrosity that is the rest of OpenGL is basically being held up by the CAD companies (such as Autodesk), who are very conservative and don't care much about games; they just want their old code to keep running on new versions of the spec.


This simply isn't true. A number of years ago, with the introduction of DirectX 10, Autodesk began transitioning several mainstay products to exclusive DirectX use under Windows (see http://archicad-talk.graphisoft.com/files/autodesk_inventor_...). Moreover, Autodesk has moved to using a modern, unified renderer for a large number of their products. There may be other CAD companies "holding up" the "monstrosity" of dated OpenGL versions, but Autodesk is not that company. In all likelihood, the issue lies not with the CAD companies but with their clients, who refuse to upgrade legacy applications that are 10-15+ years old.


OK, so my information must be outdated; apologies, I read about this a few years ago.

OGL still needs to get rid of the legacy fixed-function pipeline and standardize the interfaces for the more advanced functionality (like OGL ES 2 and DX10).


Ever heard of the Core Profile?

If you request an OpenGL context from your OS and ask for a core profile context, you'll get a context that's almost identical to ES2. It's shader-based, and all of the legacy fixed-function stuff is entirely gone.
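
A minimal sketch of requesting one, using SDL2's context attributes as one example (GLFW and the raw WGL/GLX extensions expose the same bits; the window size and title are arbitrary):

    #include <SDL.h>

    int main(int argc, char *argv[])
    {
        (void)argc; (void)argv;
        SDL_Init(SDL_INIT_VIDEO);

        /* Ask for a 3.2 core-profile context before creating the window. */
        SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
        SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);
        SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK,
                            SDL_GL_CONTEXT_PROFILE_CORE);

        SDL_Window *win = SDL_CreateWindow("core profile",
                                           SDL_WINDOWPOS_CENTERED,
                                           SDL_WINDOWPOS_CENTERED,
                                           640, 480, SDL_WINDOW_OPENGL);
        SDL_GLContext ctx = SDL_GL_CreateContext(win);

        /* In this context the fixed-function entry points (glBegin,
           glMatrixMode, display lists, ...) are simply gone, much like ES2. */

        SDL_GL_DeleteContext(ctx);
        SDL_DestroyWindow(win);
        SDL_Quit();
        return 0;
    }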


> Also, DirectX is a more complete solution for developers including networking, audio, controller handling code

I gather you're talking about DirectPlay, DirectSound, and DirectInput?

They've all been deprecated and removed from modern versions of DirectX. You need to pull in other libraries to replace those portions (Games for Windows Live, XAudio2, and Windows Message Loop + XInput is one way to roll up that functionality).
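
The input side of that replacement really is small; a minimal sketch of polling the first controller with XInput (error handling omitted):

    #include <windows.h>
    #include <XInput.h>   /* link against XInput.lib */

    /* Returns nonzero if controller 0 is connected and its A button is down. */
    int pad0_a_pressed(void)
    {
        XINPUT_STATE state;
        ZeroMemory(&state, sizeof(state));
        if (XInputGetState(0, &state) == ERROR_SUCCESS)
            return (state.Gamepad.wButtons & XINPUT_GAMEPAD_A) != 0;
        return 0;  /* nothing plugged into slot 0 */
    }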

Honestly DirectX is mostly just Direct3D in modern incarnations. The situation is fairly similar to modern OpenGL.


Your old OpenGL books would have been worthless, as they can only have talked about the fixed pipeline which has been deprecated (thank god).


Yes, yes it is. Now can OpenGL please get their act together and create a respectable SDK and documentation?


I'm slightly skeptical of their benchmarks -- the fact they're using 300fps numbers is a red flag. No one runs games at that speed. What's the comparison for a game running at 30 - 60fps? My hunch is that D3D has a tiny fixed per-frame cost that becomes irrelevant when frames take 10x longer to render.


When you turn on vsync, the frame rate is capped at the display's refresh rate (typically 60). When you turn it off, you certainly can have 300 fps.


Sorry, my point was that they picked a benchmark that was too easy. I suspect if the machine was straining to hit 60fps their results could be vastly different. That's how most games run, so that'd be more interesting to see.


I think the point is that benchmarking the graphics subsystem's throughput is easiest when you look at high framerate games; otherwise CPU or just rendering time might be the dominating factor.


That would change the size of the effect, but OpenGL would still be faster. Also, 300 fps is roughly 5x 60 fps, so the ~12% gap seen here shrinks to roughly 2-3% at 60 fps, which is still worth having for them.
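
To put rough numbers on that scaling (this assumes the gap really is a fixed per-frame cost, which is only a hunch here; the figures are the ones from the article):

    #include <stdio.h>

    int main(void)
    {
        double gl_ms    = 1000.0 / 303.4;   /* ~3.30 ms per frame (OpenGL)   */
        double d3d_ms   = 1000.0 / 270.6;   /* ~3.70 ms per frame (Direct3D) */
        double overhead = d3d_ms - gl_ms;   /* ~0.40 ms assumed fixed cost   */

        /* Overhead relative to the OpenGL frame time: ~12% at ~300 fps... */
        printf("at ~300 fps: %.1f%%\n", 100.0 * overhead / gl_ms);
        /* ...but only ~2.4% of a 16.7 ms frame at 60 fps. */
        printf("at   60 fps: %.1f%%\n", 100.0 * overhead / (1000.0 / 60.0));
        return 0;
    }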


"More developers use DirectX because it has a cleaner API and better documentation."

Sounds about right. If raw speed was more essential than less problematic development, we'd all be writing in assembler.


DirectX is a very good option for games. OpenGL on the other hand is a more generic 3D API (used, for example, by most CAD and other scientific applications). Additionally, of course, it is available on all platforms. I doubt that DirectX will outlive OpenGL in the long run.


> DirectX is a very good option for games. OpenGL on the other hand is a more generic 3D API

I don't know about that any more. I have heard that for about a decade, but with the massive changes that have been made to OpenGL, I don't see why that should still be true, if it even was true in the first place.


Here is another, older take from Wolfire Games (Overgrowth, Lugaru):

http://blog.wolfire.com/2010/01/Why-you-should-use-OpenGL-an...

and a follow up:

http://blog.wolfire.com/2010/01/DirectX-vs-OpenGL-revisited


This puzzles me. Last I heard, the Mac versions of Valve's games were slower than their Windows counterparts. So maybe it's more about the Linux kernel than about OpenGL? Is there any comparison between the three platforms on the same hardware?


The problem with Mac OS, one it has always had with every third-party library it bundles, is that the OpenGL version it ships is extremely outdated.

Source: the current version on the newest OS X, if you don't want to support legacy OS X, is 3.2; on legacy OS X it's 2.1. https://developer.apple.com/graphicsimaging/opengl/capabilit...

Most recent OpenGL is 4.2. http://www.opengl.org/

Most OS X coders, until the recent OS X release, have been using 2.1, which was released in 2006, back when OpenGL was inferior to DirectX in every way.


> This puzzles me. Last I heard, the Mac versions of Valve's games were slower than their Windows counterparts.

That's in part a driver issue (though Mach message traffic may also limit performance): the drivers are mostly developed in-house by Apple, and Apple is notoriously hard to work with (for third parties). So while Valve's arrival did lead to significant performance improvements following Steam on OS X, I would expect they couldn't work closely with the internal driver developers.

And game performance probably still isn't the main focus of OS X driver development, sadly.


"the drivers are mostly developed in-house by Apple"

Really? Are you sure about that? I'm pretty sure that while there's some Apple-supplied kernel-mode glue, and maybe a lot of user-space components, most of the driver code is actually supplied by NVidia/AMD/Intel.

In fact at one time or another OSX seems to have inherited funny GPU driver bugs that were common with Windows (like the way NVidia's 32-bit drivers were unhappy with living outside the 2GB-4GB memory range).


No, you're right, what I wrote was idiotic; they clearly are not going to develop the driver code from scratch, especially when there are hardware bugs to fix in the drivers. What I meant was that nobody else built/touched/integrated the drivers: Apple had pretty complete control over the binary, over what did and did not go in it, and likely over what kind of code they'd integrate. Related to that were things like their long-outdated OpenGL support (2.1 mainline until Lion, which bumped support to... 3.2; by the Lion release, OpenGL 4.1 was a year old, 4.2 would be released a month later, and 3.2 was two years old).

Although these days it looks like they write significant portions of the graphics driver subsystems: when it was released, their automatic graphics switching seemed quite markedly different from Optimus and completely independent of it[0], which is corroborated by them not wanting to restrict their ability to switch back to AMD.

[0] http://arstechnica.com/apple/2010/04/inside-apples-automatic...


"That 0.4 millisecond difference is down to how fast the DirectX pipeline can process and draw 3D data."

So let me see if I can follow this: OpenGL is faster than DirectX because...OpenGL is faster than DirectX?


OpenGL is faster than DirectX because the pipeline is shorter/faster/more efficient/etc. i.e. it's not due to texture copying, worse memory addressing, driver issues, programmer error, etc.


The post title bears little relation to the linked article's title.


True, but it better represents the linked article's body IMO.


And of course the mods have gone and messed things up again.


From the article body:

> we could be on the cusp of an OpenGL revolution

Close enough?


I coulda told everybody that.



