Direct3D to OpenGL abstraction layer (github.com/valvesoftware)
247 points by weslly on March 11, 2014 | hide | past | favorite | 106 comments



For those interested, Google's ANGLE is the opposite: https://code.google.com/p/angleproject/


Now someone needs to start a project to find a large, documented subset for which ANGLE(ToGL(x)) = x so we can all get rid of this historical two-standards-that-do-exactly-the-same-thing nonsense.


All that reminds me of is this: https://xkcd.com/927/


Yes, one must be careful not to define a new standard, but rather to observe a strict subset of both for which that "equation" holds.



Only for WebGL, though. (Edit: OpenGL ES 2)


Actually it's up to OpenGL ES 3 (which may or may not be good enough for your app)


WebGL is just an OpenGL ES 2 binding to JavaScript. OpenGL ES 2/3 is a subset of OpenGL 3/4. ANGLE translates GLSL shaders and OpenGL calls to DirectX.


Sadly it is a bit more than "just", especially if one needs to write code across plain GL and GL ES.


What hurts the most? I would think the imperative drawing stuff would be missed least.


Which OpenGL flavor to support (2.x, 3.x or 4.x), followed by which GLSL level to support, and which GL ES flavor to support (1.x, 2.0 or 3.0) and their respective GLSL levels.

Not to mention vendor-specific extensions and workarounds for driver bugs.

So one ends up with multiple code paths if several versions are to be supported. Similar to how HTML or POSIX are portable, but then again not so much.


Valve covered how these tools are used in their Porting Source to Linux: Valve’s Lessons Learned talk: https://www.youtube.com/watch?v=btNVfUygvio

slides here: http://adrienb.fr/blog/wp-content/uploads/2013/04/PortingSou...


Not sure here, but don't all major engines already support OpenGL ootb? Basically any engine that runs on PS3/PS4 does, like UnrealEngine, CryEngine, idTech, and Unity, and most indie/open source engines are OpenGL anyway. All engines that target Android/iOS are OpenGL too. Even for things like XNA, which was Xbox/DirectX only, there is MonoGame to port it to *nix platforms easily.

So if you have rolled your own DirectX-only engine this is awesome of course, but do people really do that?


> Basically any engine that runs on PS3/PS4 does, like UnrealEngine, CryEngine, idTech, and Unity, and most indie/open source engines are OpenGL anyway

That's completely wrong. PS3 never used OpenGL, and neither does PS4.

Unreal, while having a somewhat working Mac port, is usually ported via Cedega rather than the Mac version, which is largely unmaintained. The Linux port of Unreal was never merged back either. Presently there is no reasonable OpenGL renderer. CryEngine does not use OpenGL either.


> PS3 never used OpenGL

It sort of does. PS3 uses OpenGL ES 1.0 with extensions and then there is a separate graphics library, LibGCM, that is lower level.

Details here (although this post says it doesn't use OpenGL and then says that its API is based on OpenGL + extensions, so it is a bit confusing):

http://scalibq.wordpress.com/2010/05/15/sony%E2%80%99s-plays...

EDIT: Here is a PS3 developer saying that PS3 is OpenGL ES:

http://www.khronos.org/assets/uploads/developers/library/sig...


Almost nothing uses GLES, though. All engines have libgcm renderers.


> Almost nothing uses GLES, though. All engines have libgcm renderers.

Very true. The performance impact of both Direct3D and OpenGL is actually huge, so it is best to avoid it when you can using lower level interfaces that speak directly to the GPU.


Wow, what are they? A cursory Google search turns up this result: http://stackoverflow.com/questions/6345538/is-there-a-lower-...


Quote from Wikipedia >> The current release is Unreal Engine 3, designed for Microsoft's DirectX 9 (for Windows and Xbox 360), DirectX 10 (for Windows Vista) and DirectX 11 (for Windows 7, Windows RT and later); OpenGL (for OS X, Linux, PlayStation 3, Wii U, PlayStation Vita, iOS, Android); Stage 3D (for Adobe Flash Player 11 and later); and JavaScript/WebGL (for HTML5). <<

You are right that CryEngine doesn't use OpenGL it seems, but at least they are pretty close with PSGL and they also seem to be working on Linux support.


Crytek apparently read your comment ;) and an hour later announced official support for Linux.

http://venturebeat.com/2014/03/11/steamos-gets-a-powerful-ne...


PSGL is barely OpenGL, it is OpenGL ES 1.0 with Cg for shaders.

Not much of that code is portable across the respective OpenGL flavors.


Wikipedia is wrong about the usage of OpenGL on consoles.


Crytek uses libGCM on PS3, not PSGL.


PS3/PS4 commercial games do not use OpenGL and never have. This myth needs to die.

http://scalibq.wordpress.com/2010/05/15/sony%E2%80%99s-plays...


Somehow the FOSS crowd seems to propagate this myth that only Microsoft has vendor specific graphics APIs and the rest of the world runs on OpenGL.

Never understood how it got born.


It was probably born out of the fact that this is the truth for PC gaming, where Microsoft uses DirectX and "all" the other platforms use OpenGL.

For a long time, console game development was obscure and inaccessible (and to a large extent, this is still the case), so aspiring game developers had somewhat of a tunnel vision when it came to game platforms.


> It was probably born out of the fact that this is the truth for PC gaming, where Microsoft uses DirectX and "all" the other platforms use OpenGL.

Except that Acorn, Atari, Amiga, and Mac OS (<= 9) never had real OpenGL support either.


Right, but we're talking about 3D hardware APIs, which didn't really exist in a large scale until the late 90's/early 2000's.


You mean like Warp3D (Amiga) and QuickDraw3D (Apple) ?


But then add Glide (DOS), please. :-)


I remember playing Screamer 2 with that. I think my brother still has his 3dfx cards lying around (Voodoo Dragon, Banshee etc.)

Happy times. The music in Screamer 1 was better though, more noodly guitar music in the style of Satriani.


I left it out on purpose, because DOS was a Microsoft platform as well.


> I left it out on purpose, because DOS was a Microsoft platform as well.

MS-DOS was a Microsoft platform, but not DOS itself:

> http://en.wikipedia.org/wiki/DR-DOS

Also Glide was not a Microsoft API.


I just wrote DOS as I always abbreviated MS-DOS as DOS.

DR-DOS actually came with the first PC I bought, and it was for all intents and purposes an MS-DOS clone, hence a DOS clone, like PC-DOS from IBM and a few others in those days.


> DR-DOS actually came with the first PC I bought, and it was for all intents and purposes an MS-DOS clone, hence a DOS clone, like PC-DOS from IBM and a few others in those days.

By this argument, GNU/Linux or even OS X is a UNIX clone and both should be abbreviated to UNIX (or UN*X to avoid trademark violations ;-) ). Indeed both originate from UNIX clones, but then new features were added that made them better than the original in a sense. The same happened to DR-DOS. Read DR-DOS's Wikipedia article for the features that DR-DOS added over Microsoft's original.


Did you fail to read that I was a DR-DOS user?

I have been using computers since the early 80s.


I was a DR-DOS user, too (Novell DOS 7).


But I believe Glide wasn't a Microsoft API, so the example is still fine to defend your case.


I said large scale, not very many people had the accelerators required to use these APIs. Glide is perhaps the first real example to hit a scale that mattered.


yeah, the one time i tried to use psgl as a time saver it was a lesson in how broken, unmaintained and poorly documented it was...

i'm pretty sure it would have worked at some point because the docs indicated that it had. however - writing a gl like layer over the top of the rsx stuff is not a particular challenge - the functionality maps extremely well.


"the advantage of a console: hardware is fixed". YEP! Why use an abstraction layer when you can just code to the metal.


Time? Do you write all your applications in assembly?


Many games run on multiple consoles.


Still a pretty small number (2-8 max?) compared to all of the PC configurations you have to worry about in DX and OpenGL land.


Not DirectX only per se but DirectX first and OpenGL later.

I think a lot of games featured in Humble Bundles used to be that way. A lot of games used to be (and some still are) initially only released on Windows with Mac/Linux ports released later.


Unreal actually implements both, so it doesn't matter which one is available; they can still run.


I have rolled several of my own DirectX-only engines, so I'm guessing that no, most people probably do not.

I always wrote them so that all DirectX and Windows specific code was abstracted behind platform-agnostic generic interfaces. So, in theory, OpenGL support would be as simple as coding an OpenGL implementation of my generic graphics API. Probably a day or two's work, but I haven't had a compelling reason to do the OpenGL implementation yet.


This is huge, really huge, for anyone who writes DirectX games and wants to make them run on OS X/Linux. This is basically what a part of Wine does, am I wrong?


Wine is a bit different. I'm not 100% sure, but it's more like a reimplementation of native calls than anything.

But yeah, this is awesome. As a side note, developers who want to make games run on OS X and Linux should just stop targeting DirectX. This is good for porting already-existing games over, but in 2014 if you're still writing games in DirectX, you're most likely doing it wrong.


>> but in 2014 if you're still writing games in DirectX, you're most likely doing it wrong.

No, you are not. There is no wrong and right, it's just a stupid statement. OpenGL is years behind D3D (not DirectX, it's not comparable!) in many areas, and writing using the whole DX stack can be much nicer than setting up a GL stack with a lot of different libraries.


Behind in what sense?

I'm not a video games programmer, but judging by the benchmarks it seems the most efficient graphics stack is based around AMD's Mantle: http://hothardware.com/News/AMD-Mantle-vs-DirectX-Benchmarks...

John Carmack suggests the OpenGL extensions that NVidia have developed give comparable performance to Mantle: http://n4g.com/news/1376571/john-carmack-nvidias-opengl-exte...


>>OpenGL is years behind D3D (not DirectX, it's not comparable!) in many areas,

Explain please?


Complete lack of multicore rendering.

You can practically issue render commands only from one thread. And there is no way to save a bunch of commands anymore, as display lists were deprecated. Also, it's still very much state-machine based, so you have to do a lot of individual calls to set everything up for the actual draw call.

I personally love OpenGL and use it in my work. However, this is currently one of the biggest drawbacks of OpenGL.


No clue what multicore rendering D3D offers, but 'complete lack' is an overstatement.

OpenGL has context sharing, which means several contexts in different threads sharing the same objects. You can issue commands as long as you synchronise access to objects yourself. In practice, that means filling buffers, rendering to an off-screen framebuffer, etc. from other threads.


Multithreading in GL does not work consistently. If you ask the driver vendors (AMD, NVIDIA, etc) they will tell you it doesn't work at all or across platforms. If you ask engine developers (like Valve) they will tell you in talks that it doesn't work across platforms.


What's cross platform? I guess what matters in this specific argument is feature parity with D3D on all the platforms it can compete on.

If it works on Windows, then I wonder if it really matters. I've seen a lot of games (e.g. Team Fortress 2) still offer multithreading as an option in the UI, so there's already two code paths there.


I have been told by driver vendors that GL multithreading does not work on windows, yes.

D3D multithreading works (even, to a lesser extent, with D3D9!), and this is one of the reasons why our D3D9 renderer is dramatically faster than our GL one - we can offload a subset of rendering work to helper threads instead of keeping it all locked on the thread that owns the window.

TF2/etc do offer a multithreaded rendering option, but IIRC on Windows their engine still uses D3D. Also, note that it's an option, because the feature used to be very unstable (still might be, actually) - I suspect people using the GL version of those games may have to disable it on certain hardware/driver configurations.


As mentioned in the other comments it's not consistent nor always supported. And you can do the multithreading only by sharing already rendered fbo's.

In D3D it works as OpenGL display lists would, except with arbitrary commands. So you have multiple threads composing the scene and then a single thread just issues few commands to replay the command buffers created by the other threads.

It would be rather simple to implement the same in OpenGL, as we already have the display list concept, even if it was originally made to reduce the number of glVertex3f calls.

"Complete lack" was maybe a bit of an exaggeration, but in practice it is true.


I figure you're talking about DX11's multithreaded submission? It's true that GL doesn't have support for this yet, but AFAIK DX's implementation is very slow, and all the major engine developers end up implementing their own batched dispatch anyway, which ends up being faster on PC.


ARB_multi_draw_indirect (and its bindless friends such as NV_shader_buffer_load) somewhat invalidates your statements re: saving a bunch of commands.


One of the biggest problems with OpenGL currently is the lack of a good binary intermediate representation of shaders. Direct3D has a byte code representation that is vendor neutral. OpenGL has APIs that let you cache off binary shader representations (glProgramBinary), but they are configuration specific. Configuration differences may include hardware vendor, hardware version, driver version, client OS version, etc. So in practice these formats are only useful when the shader files are compiled on the client machines. They don't actually allow developers to ship only compiled shaders unless they are comfortable with their shaders not running on future hardware, for example.

This leads to developers pursuing various less optimal solutions that all involve more startup time for users and less predictable performance and robustness for developers (at least when compared to the solution D3D has offered for more than 10 years). So when people say OpenGL is years behind D3D this is one of the things they mean. D3D isn't perfect here either. There is a fair amount of configuration-specific recompilation going on, but the formats are more compact than the optimized/minimized GLSL source formats people are pursuing on OpenGL and while the startup time (shader create time) is still too long, it is still much better than OpenGL. Shader robustness is generally more predictable and better on D3D but it's hard to disentangle shader pipeline issues from driver quality.

To be fair multicore is also an issue for OpenGL, but D3D isn't great at that either. The current spec for D3D11 includes a multicore rendering feature called "deferred contexts" but performance scaling using that feature has been disappointing so it isn't a clear win for D3D. Other APIs (e.g. hardware-specific console graphics APIs) expose more of the GPU command buffer and reducing abstraction there allows for a real solution to the multicore rendering problem. There should be a vendor neutral solution here, but so far neither of the APIs has delivered one that is close to the hardware-specific solutions in performance scaling.


Note that I said in many areas. Not all. A history lesson of why many actually prefer the Windows stack can be read here: http://programmers.stackexchange.com/a/88055/51669


And you did not say which areas and why it is behind in them. I see multicore support in another comment, and maybe things related to geometry shaders. What else am I missing?


Geometry shaders are in OpenGL 3.2. Almost any complaints about "OpenGL is missing X feature" ignore the existence of the entire 4.0 generation as well.

Which is kind of understandable - I use Mesa as my "assumed" OGL level, and it just hit 3.3. So you absolutely can't assume anything from the 4 line.

Though I still couldn't use geometry shaders as an assumption, because as recently as Sandy Bridge Intel GPUs didn't support it. Add on if you want to port to mobile you need to refactor to GLES2.


I think it's not years behind; for example, the latest AAA game I've played on OS X (Elder Scrolls Online) runs natively, and it's in OpenGL. It looks as good as on PC, where it runs with D3D.


The gap between GL and D3D has nothing to do with visual quality and everything to do with framerate and load times.


I'm afraid I don't know exactly what comment to leave this under but it feels very relevant to this discussion. This user's answer to the question goes through quite a bit of history of Direct3D and OpenGL.

http://programmers.stackexchange.com/questions/60544/why-do-...


OpenGL drivers on Windows are still pretty far behind DirectX drivers, especially on the kind of low-spec hardware (read: Intel GPUs) average customers have. If you use OpenGL instead of Direct3D you might be paying as much as a 50% performance penalty (we certainly see roughly that when comparing our GL and D3D renderers for the same scenes).


In addition, with OpenGL you can access the latest features even on Windows XP, saving you from writing separate DX9/10/11 render paths.

But this lib is for those countless DX9 games already in existence. Valve is making SteamOS ports as easy as possible.


You still have to write an absurd amount of spaghetti code around testing which GL extensions are available. Especially when you consider the Intel IGPs from the last 5 years: the first gen is only 2.1, the second gen 3.1, and Ivy Bridge and above is 4.3. But besides those, the Nvidia/AMD parts have supported 3.3 and above since 2006.

Of course, if you are targeting Windows XP... you've got to think back a lot further than that. In some ways there's some elegance in how you can write GLES2 shader code and have it run everywhere, including on mobile, and just check extensions for everything else you use.


That's true if you want to support older OGL versions.

Personally I use the ES3 subset of 4.3 right now. The same code works perfectly on ES3 mobile HW and 4.3 desktop HW. The only features it lacks are geometry and tessellation shaders.

I'm also using compute shaders right now; even though they are not supported in ES3, they are going to be supported in the next ES update. And that gives me effectively everything geometry shaders etc. can ever bring.

By using those I'm saved from checking any extensions. And there is not much they could even bring to the table.


> The only features it lacks are geometry and tessellation shaders.

fp64, draw_indirect, multi_draw_indirect, shader_buffer_load, bindless_texture.

These are all killer features in writing fast renderers.


Wine does provide the D3D calls and translates them to GL, which works reasonably well.

It is better if the game does GL directly, which is why something the game can link in works better.


If nothing major has changed, I'd rather use the nice API DirectX offers than the general weirdness that is OpenGL. Any project that makes DirectX (as an interface) available for all platforms is great news in my book.


Yeah but this does not help if you want to port older games.


Not really. DirectX and OpenGL are abstractions over the graphics cards so you don't have to program directly on them (especially considering how they differ between vendors). If you allow me a very loose analogy, it would be like ODBC for graphics cards.

Wine is a reimplementation of the WinAPI on Linux, so Windows applications can use the same functions in WinAPI even if they are on Linux.

Also, most AAA engines (such as Unreal Engine) already have an abstraction layer for OpenGL and DirectX so their games can run independently of which one is missing.


You know, CPUs don't have problems using radically different architectures under one ISA (compare Pentium 2 to a core i7 or Kaveri part). The ISA evolves over time and newer instructions become more prevalent, but I never understood why graphics hardware couldn't coalesce around an extensions based ISA for hardware in general.


It is true that there is a part of Wine which receives Direct3D calls and translates to OpenGL. But, as the other posters were commenting on, Wine does much more than that.


I don't see why this is so "huge". Maybe you could explain? I think developing OpenGL games or porting them was never a big issue. Most 3D engines today have those abstraction layers included anyway.


Most of the big off-the-shelf engines (Unreal/Unity/CryTek/Source) do. If you look at first-party engines, though (Square Enix has Crystal Tools, EA has Frostbite, Monolith has LithTech, Activision has their IW engine for the CoD games), they're likely to be DirectX and console targeting. Valve is giving those kinds of engines the capability of targeting OpenGL through their existing DirectX support.


it's not a big deal... it's just that most game code is hidden and buried.

i have to say it's good to see this but i'm mildly disappointed too... DirectX --> OpenGL is one thing, but a properly cross platform rendering interface that will work with DX, its flavour on 360, the novelties of Win8 and Xbone as well as PS3 gcm and gx/gx2 on nintendo platforms whilst dealing with the quirks of desktop vs. es gl? that would still be a pretty run of the mill technical achievement of no particular note.

aside from that i don't like the code. singleton class instance? why not use the static keyword so the compiler knows what you want, can optimise accordingly, and save you the potential to shoot yourself in the foot. and when did 2500 lines become an acceptable file size for a header? also, did we forget that you can nest folders, or name files with modern, 1990s-style filenames...

i need to open source my stuff...


fyi i meant to delete 'so that most of the loud opinions and'


Wine really only "works" for 2d apps.


Not true. Plenty of DirectX games work fine under Wine.


In a thread not too long ago, I asked why CS:GO might have such terrible performance in OSX, and olegoandreev hypothesized that it might have to do with poorly compiled shaders. Maybe with enough eyes on the code, this can finally be fixed? We already know that the OSX port of the Source engine can have close-to-Windows performance based on benchmarks done in Parallels, and yet it's still shamefully only half as performant.

https://news.ycombinator.com/item?id=7303753


Limited subset of Direct3D 9.0c


Now that I'm reading that ... isn't every subset of a finite set itself finite (and thus limited)?


I'm not a native English speaker, but I think in colloquial terms, "limited" means "smaller than you might want" or something like that.


A limited subset would not be a complete implementation, yes.


kind of preaching to the choir, no?

i mean, there are more open source games that use OpenGL than DirectX, and are already cross platform as a result. similarly, i would imagine if a AAA game studio wanted to do such a thing, they would roll their own (or just use GL in the first place).

perhaps i'm being too cynical (but this is hacker news..)


So far most AAA games are still developed for DirectX only. Those are the games that make or break a platform. Doom 95 was arguably the kind of killer app that established Windows as the primary gaming OS. In the context of Valve's SteamOS strategy it makes perfect sense to encourage more major developers to build games on a portable API like OpenGL or possibly AMD's Mantle by making it as easy as possible. If they fail at this, SteamOS, Steam Boxes and gaming on Linux in general won't have a chance at mainstream success.


sorry if i wasn't clear, but i am aware that most AAA games use DX. the same is not true for open source.

there are already DX->GL compatibility layers, although most of them are in house, by companies that produce ports (ala Feral Games).

yes, i entirely agree with you.. almost. ideally, i'd like DirectX to.. go away, forever. proprietary APIs that result in lock in to a software or hardware platform are seriously holding back computing in general. and i'd never recommend anyone use AMD's Mantle for the same reason (which is a massive waste of everyone's time)


Well there are possibly smaller studios and devs who perhaps don't have the resources to roll their own translation layers, for whom this might be a tipping point towards supporting SteamOS and/or Linux.

Since Valve has already written this for their own purposes, it'll cost next to nothing for this to be published if they can stop people expecting support, and they might get a few bug reports into the bargain.


> there are already DX->GL compatibility layers, although most of them are in house, by companies that produce ports (ala Feral Games).

And this is Valve releasing their particular variant mostly targeted at other major game studios. There is no reason every studio should be forced to create their own in house compatibility layer.


This was probably done to persuade the people complaining about porting their codebase to OpenGL (and hence Linux / SteamOS).


why would they roll their own if there's a battle-tested steam-os-compatible library with a BSD(-ish) license?


"BSD(-ish) license"

In fact, it's MIT: http://opensource.org/licenses/MIT


This is a good start. I'd also like to see .NET and VCRUN conversions to Linux libraries, like Mono and GNU C++, to help port even more Windows games to GNU/Linux and Mac OS X.


Nice. Does this mean that developers can start porting their existing (pre 2013) games over to Linux? Or are we missing a DirectX to SDL translation layer?


I wonder if you could use ToGL and Wine... Wire up the D3D graphics stuff at compile time, and translate the rest of DX at runtime.


well, SDL will do the job of getting a window from the OS. when used with GL (in the way valve do), it doesn't really do much else.


It handles sound, input and a bunch of other stuff which DirectX also handles besides window management and rendering.


This seems like a new version of this previous effort:

http://en.wikipedia.org/wiki/Cedega_(software)


Cedega/Cider is the cheap way to "port" games to OS X. It results in bad performance, yet a lot of companies still pay a lot of money to do it. The sad thing is that you can do it yourself (with Wine), for free, often in under 30 minutes.

Even the latest version of Wine patched with the Command Stream WIP is 5 times faster.


Not really. Cedega is a version of Wine.

This is a library, intended to be used in your application in lieu of calling the APIs directly. You cannot take a full application and ask this library to run it.


WINE does however contain a DirectX -> OpenGL translation layer that can be used separately from the rest of it.



