The Next Mainstream Programming Language: A Game Developer's Perspective (2006) (lambda-the-ultimate.org)
47 points by caustic on Dec 28, 2010 | 18 comments



2006...so taking that into account.

I am not quite sure why something like OCaml has not taken off in game development yet. It seems the promise of C speed, better programming abstractions, and a functional core would be something a strong game development studio would jump all over.

Though, being completely honest, I am not fully in tune with the game development community. Can anyone else comment?


Inertia. From a technical standpoint the game industry is about customer-facing features that happen to also be technologies. You can go from one product to another carrying along many of those features, but change the language and you blow them all up.

Today you have a "legacy" group of AAA developers with huge, well-tuned C++ engines designed specifically around that dev model (typically, lots of scripted and hand-crafted static content with photorealistic graphics and in-game cutscenes), and a growing number on the indie/mobile/web/social side who are using whatever is convenient, and often rolling their own thing to fit their own projects. For many that still means C++ since they're already familiar with it, there's plenty of example and library code around, and some targets still demand (or literally require, in the case of consoles) that mix of tooling, performance and features. For others it might be Java, Objective-C, ActionScript, or JavaScript.

The value proposition of a new language isn't great for the AAA developer because the engine coding effort - on a large project - is tiny compared to the content. For tiny games most of the pressure to ship is on programming - but once you're past the core engine (resources, scene/entity/rendering management, collision, events, etc.), the benefits are still unclear, since most gameplay coding uses the same patterns in any language - lots of intermittent special-case logic, blobs of state accessible through various indexes, timing, and sequencing. These things can usually get parceled out to a scripting language for faster iteration (or, at minimum, designer-driven iteration, vs. coder+designer pairing).
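To make that concrete, here's a contrived sketch (C++, but the shape is the same in most languages; the entity fields and rules are invented for the example) of the per-frame special-casing, state blobs, and timers that gameplay code is full of:

    // Contrived illustration: state blobs reachable by id, countdown timers,
    // and intermittent special-case rules inside a per-frame update.
    #include <unordered_map>

    struct Entity {
        float health = 100.0f;
        float invulnTimer = 0.0f;   // sequencing via a countdown timer
        bool  isBoss = false;
    };

    std::unordered_map<int, Entity> entities;    // state accessible through an index

    void updateEntity(int id, float dt) {
        Entity& e = entities[id];
        if (e.invulnTimer > 0.0f) {              // timing
            e.invulnTimer -= dt;
            return;                              // special case: untouchable for now
        }
        if (e.isBoss && e.health < 25.0f)        // special case: boss last-stand phase
            e.invulnTimer = 3.0f;
    }

The point isn't that this is hard; it's that no language buys you much here, because it's the same branching and bookkeeping everywhere.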

Notice that I haven't mentioned parallelism once yet. Despite Tim Sweeney's focus on this in his slides, it's not a big concern for anyone but the AAA guys who are specifically aiming to max out the hardware. Gameplay proper is hard to parallelize because most game designs are massively stateful and interdependent. That leaves rendering and miscellaneous auxiliary processes. Rendering quality is now bottlenecked mostly by the content creation process, not by runtime concerns. We can always throw more polygons into a scene, but the details of shading, texturing, and animation have become far more important to image quality.

Look at what Valve and Blizzard are doing. They've both stayed far away from the cutting edge in their own technology, and that hasn't hurt them a bit in sales. So again, there's just very little reason to move forward on the tech if you aren't an indie, and if you are, then you have to weigh your choice against how difficult it'll be to ship the final product to consumers. The math tends to favor one of the languages mentioned earlier (C++, Obj-C, JS, AS3, Java).


If true, the industry would be vulnerable to disruption by an OCaml shop.

I think C's low-level hackability and integration with assembler is hard to beat for max performance. In other cases, a scripting language is more agile. And correctness isn't important in games (if it seems right, it is right). So OCaml's impressive attributes don't win any of these battles.

With the Xbox 360 having 3 cores and the PS3 having 8, a better concurrency language has an opportunity to have an impact. Maybe in the next next-gen... (if launched today, Moore's law predicts 8x as many cores: 24...64).


You can program Xbox 360 games in F# (a Microsoft language that's based on ML and is similar to OCaml) using XNA. A few very small indie games have been written that way.

People who write a game in OCaml tend not to write a second game in OCaml. They tend to go back to C/C++. To me this indicates that OCaml is not a good fit for game development.


C++ is very ingrained in the traditional (big game) industry. Even if the legacy/library compatibility and performance of alternatives were no longer issues, I think a lot of the more senior developers have chosen "C++ for life".


I am not a game developer, but I believe it's an issue of legacy, tooling, libraries, tutorials, documentation, and tradition. As late as 2002 I was still seeing C vs. C++ speed arguments in gamedev forums.


Lua is fairly popular in the game industry, for scripting engines written in C++ or C.
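For anyone who hasn't seen how that looks, here's a minimal sketch of the usual embedding pattern via the standard Lua C API (the script path and the on_spawn function are placeholder names for the example):

    // Minimal sketch: a C++ engine hosting Lua for gameplay scripting.
    // "scripts/enemy.lua" and on_spawn are placeholders for this example.
    #include <lua.hpp>   // lua.h / lualib.h / lauxlib.h wrapped for C++

    int main() {
        lua_State* L = luaL_newstate();
        luaL_openlibs(L);                            // expose Lua's standard libraries

        if (luaL_dofile(L, "scripts/enemy.lua") != 0) {
            lua_close(L);                            // error message sits on the Lua stack
            return 1;
        }

        lua_getglobal(L, "on_spawn");                // call a function the script defined
        lua_pushnumber(L, 42);                       // e.g. an entity id
        lua_pcall(L, 1, 0, 0);                       // 1 argument, 0 results, no error handler

        lua_close(L);
        return 0;
    }

The engine keeps the hot loops in C or C++ and lets designers iterate on the Lua side without recompiling.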


It doesn't attain C-level speed yet. Maybe for a web developer's purposes it does, but definitely not for the kind of ridiculous optimizations that console games frequently need.

Further, C/C++ runs the industry, and that's all the console manufacturers are going to provide for in their SDKs, beyond scripting languages such as Lua. Even then, use of scripting has to be limited because the games are usually so cramped by the hardware.

http://www.wiisworld.com/wii-specs.html

88 MB of memory in the Wii.

Let's not forget that you have to interface with OpenGL / DirectX for the graphics. Good luck wrangling with OCaml for something like that.

Oh, and you'll have to retrain everyone anew for it too.

It's just not a good enough tool/language to justify the switch given the immense inertia weighing against it.

Edit: This is why I don't work in the gaming industry, amongst other things.


I love checking in with this slide deck from time to time.

I'm not a language nerd by any means, but I wonder how much of the reliability issues are now "solved". It seems that Scala would do a lot of what is required, if only checked exceptions were enabled (why the Scala compiler doesn't have this as a compiler flag, I don't know). I understand why the author wants his static compilation cake, but sometimes sucking down some syntax for practicality/doing stuff now might be OK ;)


Checked exceptions, at least the way Java does them, are exceedingly verbose, potentially requiring a try/catch block at every level of a call stack. I think it would be possible to do static analysis of most possible execution paths and determine whether there's a possible exception not being caught at a certain level, without having to deal with it in every call.


This is relatively trivial, but only for whole-program compilation. The good thing about declaring exceptions on functions is that you can partially compile programs without a module system - otherwise, at a minimum you have to declare them at module boundaries.


It's a little disappointing that we didn't get to 20 cores by 2009. We didn't even make it to 8.


I think one of the big issues that needs to be tackled for the next generation of languages is inherent support for concurrency and orchestration of concurrency. Typically multithreading is something you have to explicitly program into your solutions, making a trade-off between performance and thread-safety. It would be neat if a language could be designed with concurrency supported by default, with as small a hit to performance as possible for simple scenarios.
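For what it's worth, here's roughly what "explicitly program it in" looks like today: a toy C++ sketch where safety is bought with a hand-placed lock on shared state, and the lock is exactly what eats the performance:

    // Toy illustration: concurrency is opt-in, and thread-safety means manually
    // locking the shared state -- which serializes the very loop we parallelized.
    #include <iostream>
    #include <mutex>
    #include <thread>
    #include <vector>

    int main() {
        long long total = 0;
        std::mutex m;                                // the programmer has to remember this

        std::vector<std::thread> workers;
        for (int t = 0; t < 4; ++t) {
            workers.emplace_back([&] {
                for (int i = 0; i < 1000000; ++i) {
                    std::lock_guard<std::mutex> lock(m);   // safe, but heavily contended
                    ++total;
                }
            });
        }
        for (auto& w : workers) w.join();

        std::cout << total << "\n";                  // 4000000: correct, but slow
        return 0;
    }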

There are some languages that provide some inherent support already, primarily functional languages (Haskell, F#). Some more traditional languages have started tacking on features to better handle parallelization/concurrency, for instance C# with async, PLINQ, the .NET parallel extensions, etc.

I'm a bit doubtful, however, that everyone is going to switch to a functional language; I think the mainstream will stay with an object-oriented language. But there should be an opportunity to design a new object-oriented language from the bottom up for parallelization and concurrency, one that tries to solve the performance issues with locking/collections etc. at a lower level but provides a higher level of abstraction.

Another issue that perhaps should be addressed is better support for "huge" data. In most cases with today's programming languages you have to type your integers, meaning you have to choose between an int, long, bigint, or whatnot. And if you type something as bigint, for instance, all instances take up loads of memory even if the span of the integer isn't that big. Why can't a number simply adapt to how it's used? I realize the underlying problems of the CPU and precision, but still, it should be possible with a more elegant solution.
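A contrived C++ example of the problem: you pick the width up front, and an unlucky pick silently wraps instead of the number adapting:

    // Contrived illustration: fixed-width integers force an up-front choice.
    #include <cstdint>
    #include <iostream>

    int main() {
        uint32_t a = 4000000000u;               // fits in 32 bits
        uint32_t wrapped = a + a;               // 8 billion doesn't: wraps to 3705032704
        uint64_t widened = uint64_t(a) + a;     // right answer, but only because we
                                                // hand-picked 64 bits for this spot
        std::cout << wrapped << " vs " << widened << "\n";
        return 0;
    }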


Re-reading this now, I'm struck by how similar some of Tim Sweeney's lines of discussion are to Rich Hickey's, particularly about functional by default and bounded mutability.

A dependently typed Clojure would do well against all 3 of Tim's stated topic areas (and of course most likely would need the obligatory curly bracket syntax... for better or worse).


Ruby, Python, and in the game-dev world, Lua have become quite popular without curly brackets and semicolons. I don't think syntax is all that critical to a language becoming popular[0]. What does matter is that it solves an immediate problem that simply isn't being solved by other languages, or is attached to a framework or tool that does so. Python got popular because it was a more readable replacement for Perl. Ruby got popular because it was used for a framework that eliminated half the work from the average web app[1].

[0] With certain limitations. A painful syntax will keep people away, and it remains to be seen whether Lisp is painful enough to most programmers to prevent adoption when a language otherwise solves all their problems.

[1] Compared to PHP, Java or just about any language with CGI and a hand-rolled data access layer.


Syntax does matter. The lack of expected operator precedence kept many technical non-programmers away from Lisp and Smalltalk. Python, Ruby, and Perl all succeeded because they respected the cultural expectations of *nix hackers. Cultural expectations are a big factor in HID. Programming langs are just a highly specialized case.


The lack of expected operator precedence kept many technical non-programmers away from Lisp and Smalltalk.

I've heard the opposite; where did you hear this?


Oh, Comic Sans!



