Godot 3's renderer design (godotengine.org)
261 points by uyoakaoma on Sept 26, 2017 | 60 comments



While scrolling fast through the long document, my eyes immediately focused on "Wait, why not Vulkan or DirectX12?" :-D

tl;dr "At the end of the day, the use case where Vulkan and DirectX12 make the most sense is when you have hundreds of thousands of objects, which are all different (different geometry, textures, etc.), and which move. This is a difficult use case to achieve, even willingly."


It's not necessarily the wrong choice for their use case but this explanation is a little misleading. Vulkan and DirectX 12 were designed to reduce the overhead of draw calls and to enable multithreaded dispatch. That makes it feasible to render more objects with more state changes but the efficiency benefits still apply (in theory) when rendering fewer objects, particularly on a multi core CPU. Wasting less CPU on rendering API overhead is always a good thing - if you don't use the efficiency gains to render more stuff you can use the spare CPU for something else like better physics.
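To make the multithreaded-dispatch point concrete, here's a minimal sketch of the recording pattern these APIs allow. The types are stand-ins, not real API handles; in actual Vulkan each worker would own its own VkCommandPool and record into its own VkCommandBuffer, with everything handed to the GPU in one vkQueueSubmit:

    // Sketch of multithreaded command recording (stand-in types).
    #include <cstdio>
    #include <thread>
    #include <vector>

    struct CommandBuffer {                  // stand-in for VkCommandBuffer
        std::vector<int> draws;
        void record_draw(int object_id) { draws.push_back(object_id); }
    };

    int main() {
        const int num_threads = 4, num_objects = 1000;
        std::vector<CommandBuffer> buffers(num_threads);
        std::vector<std::thread> workers;

        // Each thread records its slice of the scene independently;
        // nothing is shared, so no driver-side lock is needed.
        for (int t = 0; t < num_threads; ++t) {
            workers.emplace_back([&, t] {
                for (int i = t; i < num_objects; i += num_threads)
                    buffers[t].record_draw(i);
            });
        }
        for (auto& w : workers) w.join();

        // One submit then hands all buffers to the GPU queue.
        size_t total = 0;
        for (auto& b : buffers) total += b.draws.size();
        printf("recorded %zu draws on %d threads\n", total, num_threads);
        return 0;
    }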

Both APIs are harder to use, however, and it's quite possible to end up less efficient than the older APIs by using them badly. Plus, current implementations can be poor, especially on mobile, so there are still good reasons not to use them.


> While scrolling fast through the long document, my eyes immediately focused on "Wait, why not Vulkan or DirectX12?" :-D

Well, I don't think it's architected in any way that would prevent the adoption of lower level APIs; but Godot's main markets are mobile, embedded, and web where OpenGL ES 3.0 is a low (but not the lowest) common denominator.


Vulkan should help with mobile battery life though.


This is a false assumption. Existing GL drivers do a lot of hard work to compensate for platform-specific (or even chip-specific) quirks and limitations. If you're building on top of Vulkan or D3D12 you potentially don't get that assistance (due to the lower-level nature of the API), and you'll have to deal with a lot of that nonsense yourself.

Android apps like Firefox or Dolphin already have a bunch of one-off workarounds for shader compiler bugs, etc. It's worse without a smart driver. On the other hand, the SPIR-V system used for shaders in Vulkan routes around a significant portion of bad shader compilers, so that's nice :)

If you have a Vulkan app that's not tuned for your specific device, your battery life (and framerates) could easily end up worse.

Vulkan and D3D12 shine in the hands of a company like EA or Ubisoft, and it's no coincidence that the Mantle spec (a precursor to Vulkan and D3D12) was basically completely designed by DICE, the developers of EA's Frostbite engine.


> This is a false assumption. Existing GL drivers do a lot of hard work to compensate for platform-specific (or even chip-specific) quirks and limitations.

While it may be possible for GL stacks to do this, real mobile and embedded GL stacks tend to barely function at all. I highly doubt they do much more than try to extract maximum straight line performance for synthetic benchmarks within their budget.

The reason Vulkan can have an outsize effect on power consumption is that it can allow renderer work to be spread across several threads. Running two cores at 60% of the clock rate will typically cost considerably less power than running one at a higher frequency.

This is an explicit (and accomplished) goal of mobile vendors in supporting Vulkan.
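To put rough numbers on the two-slow-cores-vs-one-fast-core claim above (assuming the common first-order CMOS model where dynamic power scales with V²·f and voltage scales roughly linearly with frequency, so per-core power goes like f³):

    // First-order model: P_dynamic ~ C * V^2 * f, with V roughly
    // proportional to f, so per-core power scales like f^3.
    #include <cstdio>

    int main() {
        double one_fast = 1.0 * (1.0 * 1.0) * 1.0;   // 1 core, 100% clock
        double two_slow = 2.0 * (0.6 * 0.6) * 0.6;   // 2 cores, 60% clock
        printf("power: %.0f%%, throughput: %.0f%%\n",
               100.0 * two_slow / one_fast,          // ~43% of the power
               100.0 * 2 * 0.6);                     // ~120% of the work
        return 0;
    }

Under that (admittedly crude) model, the two slower cores do ~20% more work for well under half the power.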


The paragraph above that is relevant as well. Vulkan is only on Linux and Windows, DX12 is on Windows, Metal is on Mac and iOS. They'd have to have multiple renderers written for different APIs to cover all the platforms that they support.
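For a sense of what "multiple renderers" implies, here's a hypothetical sketch (names made up for illustration) of the abstraction layer such an engine would need; each backend is a whole renderer's worth of work to write, debug, and maintain:

    #include <cstdio>
    #include <memory>

    class RenderBackend {
    public:
        virtual ~RenderBackend() = default;
        virtual const char* name() const = 0;
        virtual void draw_frame() = 0;
    };

    class GLES3Backend : public RenderBackend {   // mobile, web, desktop
        const char* name() const override { return "OpenGL ES 3.0"; }
        void draw_frame() override { /* GL calls */ }
    };
    class VulkanBackend : public RenderBackend {  // Windows, Linux, Android 7+
        const char* name() const override { return "Vulkan"; }
        void draw_frame() override { /* Vulkan calls */ }
    };
    class MetalBackend : public RenderBackend {   // macOS, iOS
        const char* name() const override { return "Metal"; }
        void draw_frame() override { /* Metal calls */ }
    };

    int main() {
        // ES 3.0 is the one backend that covers nearly every target today.
        std::unique_ptr<RenderBackend> r = std::make_unique<GLES3Backend>();
        printf("rendering with %s\n", r->name());
        r->draw_frame();
        return 0;
    }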


The one problem with that plan is that Apple has mostly stopped improving their OpenGL driver in favor of Metal.


Yeah, this really gets my goat. These days MoltenGL and MoltenVK are getting more popular. Good on the Molten* folks for making these layers, but I have a feeling they won't be used very widely unless they're basically free to license. Maybe somebody (Google?) will come around and buy them outright.

Even worse, Apple's GL drivers never performed very well to begin with. Their Intel drivers have been much slower than Mesa for more than half a decade, and supported fewer (super useful) extensions for about two years.


All major engines already support Metal, or are in the process of finalizing it.

Middleware makes the actual APIs kind of irrelevant.

Also, there is no OpenGL or Vulkan on the majority of game consoles. Even on the Switch, Nintendo still has its own API with much lower-level control than Vulkan, just in case.


This is simply not true. To begin with, the shader languages and bytecode formats are different across OpenGL and Direct3D. Consoles tend to have their own shader pipeline as well. Even if you have great abstractions in your middleware, the capabilities actually provided by the system differ, and you have to consider this. APIs and drivers remain extremely relevant, middleware or not.

FWIW the PS3 (among others) actually offered OpenGL, not that you'd want to use it (the implementation was pretty bad - Apple-quality).

You can also theoretically run OpenGL code on the PS4 or XBox One if you want to, since they use well-understood AMD GPUs. Fail0verflow has already booted real OpenGL on hacked PS4s (via Mesa, I believe).

You can also run OpenGL on some platforms by using ANGLE to emulate specific versions of OpenGL on top of other APIs.

So, ultimately the "there is no OpenGL or Vulkan" observation isn't that meaningful - if your codebase is built around OpenGL, that's not necessarily a problem at all.

I don't know if I'd bother, but I'd also put money on it being possible to run ANGLE or Mesa GL on top of the PS4 graphics stack (I'm familiar with it).


Regarding the PS3, OpenGL ES 1.0 with Cg as the shading language is quite different from regular OpenGL.

As for the rest of your examples, they are workarounds, not official support, which means extra development cost when problems arise.


People don't seem to understand this.

DirectX, PSGL, NVN, and Metal are already available where Vulkan will probably never be.


Graphics are more than just game engines.


Graphics libraries abstracting 3D APIs are a thing, SDL, Skia, Cairo,...


Right, but SDL at least provides either software blitting or GL context setup (i.e., it doesn't abstract hardware acceleration at all). Cairo is also software rendering. I'll check out Skia!


I can't remember exactly what it was, but I remember seeing a GL extension that was originally proposed by engineers at Apple and then never implemented. Other platforms supported it. In hindsight, maybe that happened after Apple had internally decided to bail on standards and retasked their graphics engineers to work on Metal.

Agreed that this is a sad state and has been for years. Hopefully we'll see an open source solution to Vulkan->Metal compatibility.


The Vulkan group is working on finding a subset of Vulkan which can be implemented efficiently on top of DX12 and Metal. Hopefully that goes well, as I would very much like to write my graphics code once and have it work everywhere.

For now, OpenGL ES 3.0 is the closest thing we have to a universal standard. Sadly, compute shaders weren't included until 3.1 so your options for cross-platform GPU compute are OpenCL 1.2 or OpenGL ES 3.0 with transform feedback.
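For reference, a transform-feedback "compute" pass on ES 3.0 looks roughly like this. This is a sketch, not a complete program: it assumes a current GL context and a linked program whose vertex-shader outputs were registered with glTransformFeedbackVaryings() before linking; error handling omitted:

    #include <GLES3/gl3.h>

    void run_compute_pass(GLuint program, GLuint input_vao,
                          GLuint output_buf, GLsizei count) {
        glUseProgram(program);
        glBindVertexArray(input_vao);

        // Capture vertex-shader outputs into a buffer instead of
        // rasterizing anything.
        glBindBufferBase(GL_TRANSFORM_FEEDBACK_BUFFER, 0, output_buf);
        glEnable(GL_RASTERIZER_DISCARD);    // skip fragment processing

        glBeginTransformFeedback(GL_POINTS);
        glDrawArrays(GL_POINTS, 0, count);  // one "thread" per vertex
        glEndTransformFeedback();

        glDisable(GL_RASTERIZER_DISCARD);
    }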


I find it quite interesting that they keep forgetting to mention libGCM and the NVN API.


Nintendo supports Vulkan, so implementing it on top of NVN shouldn't be necessary. Unfortunately, I'm not in touch enough with console development to understand the situation with libGCM.


Hmm, anything in here[0] ring a bell? For what it's worth they were reasonably active for a lot of years, they just completely stopped participating a few years ago and it's starting to hold things up.

[0]: https://www.khronos.org/registry/OpenGL/extensions/APPLE/


The Molten folks would likely be happy to have their code open sourced, as long as sufficient compensation is included. Also, the Vulkan Portability initiative is relevant here - https://www.khronos.org/blog/khronos-announces-the-vulkan-po...


> Vulkan is only on Linux and Windows

Also on FreeBSD :) https://github.com/myfreeweb/freebsd-ports-dank#mesa-with-vu...


> Vulkan is only on Linux and Windows

And Android. I.e., everything but macOS and iOS.


On Android it is optional for 7.0+ devices, currently about 13% of the market.

It is not available on Sony and Microsoft consoles, nor in UWP apps.

On the Nintendo Switch there is NVN, which provides even more low-level control.


Plus you can get Vulkan on macOS with MoltenVK.


I've seen that before, and while it's technically true it's also 100% useless for an open source engine.


It may be open sourced in the future; there are stakeholders who might be willing to pay for this to happen (Google, for example, might want it for their iOS and Apple deployments).


Eh, Google doesn't really need it. They don't make anything that requires both high performance and cross-platform support, they don't make games, and Chrome is only for Android so they don't need to be able to run it on iOS.


Chrome runs on OS X, where GL is deteriorating; they also have Maps and Earth on iOS.


On Windows, Vulkan is only available to desktop apps.


Same here. While I do see a place for previous-gen graphics engines for the time being, the provided justification doesn't sound too convincing.

Vulkan/DX12/Metal make more sense in many ways, not just for drawing thousands of different objects. For one, they allow explicit control over memory transfers and synchronization, which leads to fewer performance hitches and spikes and a more predictable, stable framerate.


What's not convincing about their argument about Vulkan not being supported on mobile? In my experience with embedded systems, that is true. They even discuss working with OGLES2 where OGLES3 is unavailable. Even within OGLES3, I have seen a mix of 3.0, 3.1, and very rarely 3.2 support.

"Added to that fact, Vulkan still has years to go until it's properly supported in most desktop and mobile platforms, which makes it unattractive to implement for us (as it means considerably more effort to write, debug and maintain). As for DirectX12, it's only relevant for Windows/UWP, so there is no strong incentive for us to support it as a cross-platform engine."


Sure, it's not that widely available on mobile, but where it is, the benefits of a next-gen API are greater than on desktop (since you have more, weaker CPU cores and stricter latency requirements). I'm not arguing to drop OGLES, not at all.

DirectX12 is also relevant on XB1, where you don't get many other alternatives. It would be nice to see Godot titles running on consoles eventually.


Yup. By using only OpenGL ES 3.0 and the equivalent desktop OpenGL, they achieve maximum compatibility while having nice enough visuals as well as good enough performance. A pragmatic decision; there's no point trying to compete with Unreal Engine in the visual department.

That being said, people are achieving some really nice visuals in Godot 3.0 using the PBR shader and new renderer...


Researching Godot's multiplayer support, it looks like there isn't anything very comprehensive?

In a separate discussion, will we ever see any sort of UDP support in the browser for client/server games?

https://gafferongames.com/post/why_cant_i_send_udp_packets_f...


Not sure if this is what you are looking for, but there is a high-level networking API coming in Godot 3.0: http://docs.godotengine.org/en/latest/learning/features/netw...


I have UDP semantics for my game, which has a JavaScript front end and Golang servers (though I use node-electron processes as a WebRTC proxy):

https://news.ycombinator.com/item?id=15334726

https://news.ycombinator.com/item?id=15332187


Reading the article I linked in my first post, my understanding is that it is possible to use WebRTC for UDP _today_, but it's just very difficult?

"while WebRTC makes it easy to send unreliable-unordered data from one browser to another, it falls down when data needs to be sent between a browser and a dedicated server...It falls down because WebRTC is extremely complex. This complexity is understandable, being designed primarily to support peer-to-peer communication between browsers, WebRTC needs STUN, ICE and TURN support for NAT traversal and packet forwarding in the worst case."

Given your experience with WebRTC, stcredzero, do you think it is possible to wrap WebRTC in a class to emulate UDP? What is the hold-up?


> it falls down when data needs to be sent between a browser and a dedicated server... It falls down because WebRTC is extremely complex.

The SimplePeer library works very well to hide the complexity of WebRTC. It works just fine for me. The big sticking point is the complexity/overhead of the setup. This might be mitigated by using the true headless modes in Chromium and Firefox. Given what I remember from installing dependencies, it should be possible to create a fairly lightweight headless fork that only has just enough to run WebRTC datachannels.

So, if you don't need lots of scaling and can deal with the overhead, there is no hold-up right this moment. It's just a matter of motivation.


It takes some time to develop and existing implementations aren't very mature yet, e.g., https://github.com/xhs/librtcdc and https://github.com/chadnickbok/librtcdcpp


I'm really looking forward to the proper release of Godot 3! Want to use it to get my kids into game writing, but have been holding off because Godot 2 has a custom scripting language which will be going away and I don't want to invest in that. Any idea when 3.0 ships?


You might really like UE4's blueprint system, it's the only time I've seen "visual programming" work really well.


It's really impressive. I thought I'd be using C++ a lot more when I started seriously digging into UE4, but more and more it's just to make helper functions so the primary logic and asset management can be done in blueprints.


Godot 3.0 also has visual scripting


> Godot 2 has a custom scripting language which will be going away

It's not going away; it'll still be in 3.0 and probably forever after. It's just that in 3.0 you have the option of C#, as well as C, C++, D, Nim, and others if using GDNative.

Edit - if it's for your kids, there's also going to be visual scripting in 3.0 (which basically is the same as GDScript, in visual form).


Their custom scripting language is the main reason for not choosing Godot. I didn't know they were replacing it in v3.

Any idea what they are going to use?


https://godotengine.org/article/godot-getting-more-languages

However, contrary to the parent comment:

> GDScript will always be the main supported language, and our recommended choice for all Godot users.

> To clarify things: since the Mono runtime is relatively heavy, and many Godot users will prefer to stick to GDScript, we intend to provide the Mono-enabled version of Godot as a separate download.

They're also adding their version of visual scripting...


GDScript will not be replaced in 3.0 but they will add support for other languages, like C#.

Personally, I find GDScript quite OK since you can treat it just like Python, and I think there are some interfaces that enable you to use real Python in Godot (check out their blog).


Do you know how the performance of GDScript execution compares to something like Python embedded in an engine where the more costly operations are run in compiled C or C++ code? Or even how the real Python performance in Godot compares to the GDScript performance?

I tried to find this out, but the one "benchmark" I found was somewhat dubious. The Godot documentation claims that GDScript has all sorts of advantages over using other scripting languages that sound qualitatively superior, but it lacks quantitative comparisons.


> The Godot documentation claims that GDScript has all sorts of advantages over using other scripting languages that sound qualitatively superior, but it lacks quantitative comparisons.

The advantage of GDScript is that all its primitives are the same as the C++ primitives that lie underneath. Its memory management model is the same as the engine's, etc. A GDScript function is a C++ function. GDScript is completely integrated into the engine, so if you're doing typical game-dev sorts of things, there's literally no overhead.
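A simplified stand-in for the idea (Godot's real Variant covers far more types, but the principle is the same): a script value is an engine value, so calling a script function hands the engine its own native types with no marshalling layer in between:

    #include <cstdio>
    #include <variant>

    struct Vector3 { float x, y, z; };  // the same struct the engine uses
    using Variant = std::variant<int64_t, double, Vector3>;

    // Underneath, a "script function" is just a C++ function on Variants.
    Variant translate(const Variant& pos, const Variant& offset) {
        const Vector3& p = std::get<Vector3>(pos);
        const Vector3& o = std::get<Vector3>(offset);
        return Vector3{p.x + o.x, p.y + o.y, p.z + o.z};
    }

    int main() {
        Variant v = translate(Vector3{1, 2, 3}, Vector3{0.5f, 0, 0});
        const Vector3& r = std::get<Vector3>(v);
        printf("(%g, %g, %g)\n", r.x, r.y, r.z);  // (1.5, 2, 3)
        return 0;
    }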


If perf is an issue, check out GDNative. Then you can code in Rust, D, C++, etc. with Godot.


That looks like a nice way to add support for other libraries or optimize certain operations. But I'm more interested in a comparison between dynamic languages for quickly writing game logic. It's a pattern that seems fairly common (e.g. with Lua or Python bolted onto C/C++) and Godot's implementation of that idea is very unusual. They claim benefits for it, I'm curious if those benefits actually pay off in some quantitative fashion.


The main advantages are tight integration with the engine and ease of use. As I wrote here: https://news.ycombinator.com/item?id=15310477 GDScript is not very fast, but it's rarely an issue. In 3.0, where we will have C# and GDNative, it won't be an issue at all.


That is an exceptionally useful answer for me because I'm considering this for a strategy game with tilemaps. Thank you.


It's in the ballpark of 100 times slower than C/C++ code, which is about the same performance of Python.


Anyone who wishes to learn about game development and its theory can get started here. http://gameprogrammingpatterns.com/contents.html


While I agree this is a wonderful resource, it is much less about game development than it is about design patterns. Basically, it's a gamedev-flavored book about design patterns. A much better book about game development would be Game Coding Complete [0].

[0] https://www.amazon.com/Game-Coding-Complete-Fourth-McShaffry...


Very interesting reading.

I especially like the architectural decision of a separate render process.


Here's an archive link since you probably won't be able to see the article from the original location.

http://archive.is/lvKV9



