Indeed. Hopefully it can also, finally, be the stab in the back of DirectX. While DX is good and all that, I would much prefer to see an Open Standard prevail in this space.
That's cute and all, but you probably--deep down--don't actually want that open standard to be OpenGL. OGL is a crufty, messy API, and even now support is spotty for the different versions.
For a very long time, for example, OGL 2.1 was the only thing supported by Apple, despite numerous extensions to bring it up to feature parity with modern OGL. Vendor extensions made code a mess, and many of the core concepts as embodied in the API are minefields for developers.
Don't hate on DX just because it's from Microsoft--an open re-implementation would actually help everyone.
There's been some progress on one already: the d3d1x state tracker from Gallium3D is an implementation of D3D 10/11 directly on top of the GPU drivers on Linux, without using OpenGL as an intermediary.
It's funny because you think technologies that are at the mercy of a company are a good idea.
There was a post here a few weeks ago (some dude dug up an old 3D mesh of a cow displayed in a Java viewer). And the old OpenGL and Java code "just worked".
"... and in their death throes, they gave onto us, technologies to last more than 2 years at a time. Sights and wonders never before seen only told, far outreaching the merciless grasp of the Holder of Shares."
Dude, you say this like OpenGL isn't run by Khronos, in turn "at the mercy of a company".
CAD vendors were what fucked over OpenGL for a long time, and it was only the fourth-quarter ill-fated kick by 3dlabs that gave it a programmable shader pipeline, and in turn set back the API development by another half-decade at least.
Look, "open" is not some magical fucking pixie dust you sprinkle over an abortion of an API to make it breathe life and grow up and change the world. If it was, audio in Linux wouldn't be such a shitshow, and neither would X, and neither would any number of other open APIs that everyone hates.
I do agree with you, though: rest in peace, Silicon Graphics.
It's not my area of expertise but it does ring true to me - whenever I see people complain about OpenGL they always seem to be confusing API surface with implementation details. I.e. the "legacy" parts can be shimmed to the "modern" parts without harming anyone very much.
As an avid reader of Mesa3D and a user of GL 3.3 core, I can say that much cruft was removed insofar as it didn't resemble a modern GPU (e.g. the fixed-function pipeline), but OpenGL still has plenty of warts to bitch about.
The GL 3.3 samplers, e.g., are great. They don't require you to bind a sampler object in order to change its properties. However, the rest of the GL objects follow bind-change-unbind and carry a lot of implicit state, which makes writing reusable OpenGL libraries highly weird, because you never know what state some moron left the GL machine in.
Now GL 3.3+ is in an interesting position. They've got old functions that operate through bind-change-unbind and new functions that operate directly. So not only has the API not yet been fully "modernized", it's in a halfway state, and it looks like it won't be pushed in one direction or the other -- just stuck in limbo for something like 3-4 revisions.
So I guess what I'm saying is, until the inconsistencies are ironed out, OpenGL still has some cruft that directly affects even the most modern GL 4.x programs.
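To make the inconsistency concrete, here's a minimal sketch (GL 3.3 core; `tex` and `sampler` are assumed to be object names you've already created, and a context plus function loader are assumed to be in place):

```cpp
#include <GL/glew.h>  // or any other loader exposing GL 3.3 core functions

void set_linear_filtering(GLuint tex, GLuint sampler)
{
    // Old style: a texture's parameters can only be changed through whatever
    // is currently bound to the selector, so we have to bind first.
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

    // New style (GL 3.3 sampler objects): the object is named directly --
    // no bind-change-unbind, no hidden selector state touched.
    glSamplerParameteri(sampler, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
}
```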
> I can say that much cruft was removed insofar as it didn't resemble a modern GPU (e.g. the fixed-function pipeline)
Please read this again:
> they always seem to be confusing API surface with implementation details.
It does not have to work like a modern GPU. An API is an abstraction.
If you want to introduce a new API that is a better abstraction for current hardware, that's fine, but that's not an argument to remove the earlier abstraction, which is working fine for other users.
Right, I get it. I was merely stating that the parts of OpenGL that don't resemble a modern GPU have been removed. That is a fact.
EDIT: I would also like to add that only in the strictest and most painful sense was the legacy OpenGL "just an abstraction". Originally it was a simplistic cross-platform layer over actual hardware, and while abstract to a degree, it hardly attempted to shield the programmer from hardware details. It's notable that the features OpenGL "happened" to include in the "core abstract GL state machine" mapped pretty much 1-to-1 onto SGI's tiered hardware offerings at the time. In other words, what made the cut was very much based on real hardware; hence, the evolution of the API was not based on random abstractions that seemed like a good design [cough].

Legacy GL on modern hardware is what we refer to as "fixed function emulation", since the abstraction is so far removed from actual hardware nowadays. Perhaps this isn't a "good reason" to change the abstraction, but it is notable that if the entire legacy API can be emulated using the newer API, then the legacy API may not deserve to be part of the driver (which takes a lot of time and effort to develop) so much as another library. That is to say, to put it in terms of "computability", there are some things that cannot be computed using the FF pipeline, but the reverse is not true, i.e. FF can compute only a proper subset of what the programmable pipeline can. In fact, this is pretty much the exact idea that Gallium takes in Mesa -- and why you can implement D3D / GL ES / core GL / legacy GL on top of it.
Now, on to the real issue that I raised, which is a purely API design choice: direct state access vs bind-modify-unbind. The consequences of bind-modify-unbind have been lamented by developers for _ages_. As a response, the newer GL sampler objects don't use it. However, the rest of the old-style API (you know, the thing we've been lamenting about for ages) has no modern equivalent. Please tell me how this has ANYTHING to do with GPUs? It's pure and simple API cruft. The API is _not_ consistent with itself.
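And to make the library problem concrete, here's a sketch of what every "polite" reusable helper ends up doing under bind-modify-unbind (the function and its names are hypothetical, but the save/restore dance is not):

```cpp
#include <GL/glew.h>  // or any other loader exposing core GL

// A library routine that wants to fill its own buffer object without
// trampling whatever the calling application had bound.
void library_upload(GLuint buffer, const void* data, GLsizeiptr size)
{
    GLint previous = 0;
    glGetIntegerv(GL_ARRAY_BUFFER_BINDING, &previous);          // save the caller's state

    glBindBuffer(GL_ARRAY_BUFFER, buffer);                      // bind
    glBufferData(GL_ARRAY_BUFFER, size, data, GL_STATIC_DRAW);  // modify

    glBindBuffer(GL_ARRAY_BUFFER, (GLuint)previous);            // "unbind" / restore
}
```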
The problem is that it isn't working fine for other users.
It's slow and ugly, and it can barely coexist peacefully with the rest of the API. And maintaining it requires developer resources that these companies don't have to spare.
Look, if you want to run off and create libAncientGL and do all of the book-keeping yourself and call to the most recent GL API, go nuts, and godspeed. That doesn't mean that we should hold companies to that requirement.
> Indeed. Hopefully it can also, finally, be the stab in the back of DirectX.
There are plenty of reasons developers go to DirectX, support and tooling being two of them.
As for game consoles outside Microsoft's: regardless of what gets spread around, they don't support OpenGL, just variations thereof or similar APIs.
For example, the PS3 has OpenGL ES 1.0 combined with Cg, not GLSL. In the end most developers use libgcm anyway.
As for the other consoles, they also use other kinds of APIs.
In the end, the best approach is to have an API-agnostic middle layer and be independent of the underlying API.
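A rough sketch of that idea, with a hypothetical `RenderDevice` interface that the rest of the engine codes against and one backend per platform API behind it (names invented for illustration):

```cpp
#include <memory>

// The engine only ever talks to this interface.
struct RenderDevice {
    virtual ~RenderDevice() = default;
    virtual void clear(float r, float g, float b) = 0;
    virtual void drawMesh(int meshHandle) = 0;
};

// One backend per underlying API; each lives in its own platform-specific file.
struct GLDevice : RenderDevice {     // OpenGL / GL ES path
    void clear(float, float, float) override { /* glClearColor + glClear */ }
    void drawMesh(int) override { /* glDrawElements */ }
};
struct D3DDevice : RenderDevice {    // Direct3D path
    void clear(float, float, float) override { /* ClearRenderTargetView */ }
    void drawMesh(int) override { /* DrawIndexed */ }
};
struct GcmDevice : RenderDevice {    // PS3 libgcm path
    void clear(float, float, float) override { /* ... */ }
    void drawMesh(int) override { /* ... */ }
};

// Chosen once at startup; gameplay and scene code never see the difference.
std::unique_ptr<RenderDevice> makeDevice();
```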
Because the PS3 has shaders. Sony decided to support those shaders through an agreement with NVidia to use Cg, instead of adapting GLSL to their tooling.
Later on, it was asked at one GDC event whether developers cared about GLSL, but since almost everyone who cares about performance on the PS3 uses libgcm anyway, the update never happened.
I do not think that Apple takes its own NDA seriously. It may just be there to be on the safe side. When Lion came out I had a look at some of the new APIs it had - while they were still under NDA. I had a couple of problems with the new APIs, so I posted a request for help on G+. Some of Apple's own employees responded to my request.
This doesn't prove that Apple as a company doesn't care whether you publish a very short comparison of one tiny, tiny aspect of something that is still under NDA, but I doubt they do.
I agree entirely that Apple's NDA is barely enforced. They don't put anything really special in the previews. They know anyone with $100 can see it, and that the rumor sites will get "emails from anonymous developers" with any interesting specifics not mentioned publicly by Apple.
(In fact, I read a theory somewhere that the NDA is really just about corporate competitive advantage for some legal reason. I forget what it was; it might have been something with the date a technology becomes public re: patent law.)
That said, more than almost anything else, I'd be wary about posting benchmarks. Performance is subject to change wildly between the developer betas and the final product. It could get better with optimizations and taking out debug code; it could get worse if some optimizations are found to be unstable.
If it was just, "Hey, this is awesome! They've implemented through OpenGL 4.1 and their implemenation is significantly faster across all OGL versions!" I'd be less worried. I can imagine a graphics programmer getting twitchy about early numbers.
People like John Gruber, who have close relationships with Apple and are - in theory - also under the same NDA, openly talk about betas and previews on their blogs, podcasts and so on.
People like John Gruber have long been talking about unreleased software from Apple under the assumption that any information that is public is fair game (i.e. if you talk about something that you could have read on Macrumors there is no problem).
I’m not sure whether that would actually hold up legally – but that’s the theory.
(Also, Apple doesn’t seem to be very interested in actually stopping sites like Macrumors – which are filled to the brim with every tiny detail about the iOS and OS X DP – from actually publishing that. So I guess they are even less interested in going after people who do not even show videos or screenshots.)
It seems they have the NDA, but if they actually don't want you to talk about something then they specifically tell you. For example, when they demoed Lion to (I think it was) Gruber and a few select journalists, they must have used something beyond the standard NDA.
As far as I understand, this NDA is used to make big newspapers publish reviews of the new OS at the same time, as a concentrated PR effort. If every big newspaper were free to review Mavericks, by the time it got out, no one would care about it. Also, early builds always have bugs and shortcomings, and those would be discussed in the reviews. Using the NDA, Apple makes sure that major media sources release info on the latest, most tested version, and at the same time creates the necessary effect on readers.
Those Apple employees should have pointed you to devforums.apple.com, which is the correct place to discuss NDA'd material. The fact that individual employees didn't is in no way indicative of Apple's official position regarding the NDA.
While your first point bears repeating ad infinitum, and everyone who has direct access to the DP and finds something off (however minor) should be filing rdar reports before blogging, the effect of your second point about the NDA is open to interpretation. There's a clause in the confidentiality agreement that prevents it from covering anything that's public knowledge through no fault of your own. Given the extent of the Apple news machine, very few aspects of DPs are outside of public knowledge for very long.
OpenGL 4.1 means it's still 2 years behind where it should be, but it's progress, I suppose. It would be nice if Windows supported OpenGL by default, too.
Actually, graphics drivers provide the OpenGL interface, just like they provide the rendering backend to DirectX. People are going to install graphics drivers anyway if they're going to do GPU stuff.
> I've just assumed [graphics drivers are included] with whatever apple provides.
That’s correct. When you buy a Mac, you don’t need to download extra drivers. Only if you buy a Mac Pro and later install a different (or additional) PCIe graphics card might you need to download drivers.
The only PCIe expansion capability available on any Mac (including the new Mac Pro) is Thunderbolt. The new Mac Pro does not have any internal expansion.
Nominally these are to support higher-end cards for things like CUDA and OpenCL. Interestingly, though, the official drivers contained in updates from Apple have evolved support for pretty much every nVidia card since 2010, and recently started gaining support for the AMD 7xxx series. For example, I run a hackintosh with a fully enabled GTX 470. In Snow Leopard days I had to use the nVidia-supplied drivers for their workstation cards; these days it Just Works.
Maybe if you use the latest kernel, get the latest driver from the manufacturer, and recompile X, Mesa, and a bunch of other modules, you can get OpenGL 4.2 on Linux (with a ton of bugs).
No, you can't get OpenGL 4.x on Linux. You can't even get GL 3.3 -- geometry shader support is in its infancy. A quick look at mesa3d-dev or Phoronix will show you that.
I think you're missing the implicit "open source" in this statement. The OP mentioned "recompile Mesa" and getting buggy 4.x support. No open source driver supports 4.x yet. I'm aware that proprietary drivers are far ahead, see my other posts.
The "implicit" open source in that statement is more than a little important to state explicitly. You're correct (AFAIK) about open driver support, but many many people use Linux, support free software, but are willing to install the nvidia or amd drivers for the performance benefits they bring so they can get work done. Especially if you're willing to cave and just use a Mac, I think the comparison to nvidia's closed drivers is a fine one.
I don't mean this in a bad way, but if someone says "recompile Mesa", I take it for granted that the understood context is "open source drivers". I didn't mean to confuse anyone.
Would you like to elaborate on your position, or just continue smugly toeing the party line?
For example, OSX has really bad (read: no) support for hardware slightly deviating outside of its niche. It also uses an arguably-broken UI metaphor, and has no truly good package management story.
It was not a smug comment of any sort. What's a party line and how does one "toe" it?
While it is well known that Linux supports the most hardware devices out there (in absolute terms) and Windows is most likely to support currently common hardware, neither allows one to rely on a recent version of OpenGL being available.
On Windows one must install drivers to get OpenGL support because GPU manufacturers focus on the very platform-specific Direct3D at the expense of everything else; on Linux, support is spotty and only comes through binary blobs because the same GPU manufacturers don't provide full open source drivers for the hardware their customers paid for.
I was lamenting the state of standards compliance in general and OpenGL in particular; in this particular case OS X does pretty well.
"neither allows one to rely on a recent version of OpenGL..?" Erp, where are you from?
Suppose I had GL 4.3 capable hardware.
Closed Source Drivers on Windows = OpenGL 4.3
Closed Source Drivers on Linux = OpenGL 4.3
Open Source Drivers on Linux = OpenGL 3.1 [best case]
Closed Source Drivers on MacOS = OpenGL 3.2
My Game Engine Requires = OpenGL 3.3
---
As you can imagine, that means:
If you're running Linux on open source -> no
If you're running Mac at all -> no
In fact, what is ironic is that the ONLY systems on the ENTIRE PLANET that can run this game are ones using proprietary drivers, specifically on Windows and *NIX (but not MacOS).
Yeah, that's not correct. Windows keeps fairly up-to-date OGL drivers through vendors (as makes complete sense), and the Linux driver story isn't terrible.
Windows also generally will try and install driver updates as soon as they become available--this hardly makes OGL some magical burden to keep up-to-date.
If anything, OSX has been lagging behind hard until recently.
It's a marketing phrase, actually -- it's a holistic approach to how a product is presented. It took me a long time to understand it in its entirety, but the nearest equivalent here is "user experience", i.e. the drivers on Linux are quite mature, stable, and performant (with the implicit comparison usually being to open source drivers or to another platform such as Windows).
When I worked for an MMO game company I constantly got dinged for OSX's ancient OpenGL support, which limited what we could do (being cross-platform). Now that I don't need it anymore, they move way forward. Still, it's great support. And add in the Borg Trash Can's dual video cards for lots of fun!
Funnily enough, only one of those two FirePros is connected up as a 'graphics chip'.
The intent of the Mac Pro, when you look at it, is to be a small, lightweight, ridiculously powerful OpenCL workstation. It has a single Xeon to handle system tasks, two FirePros (one of which does graphics and computation, the other only computation), and PCIe SSDs to keep the OpenCL cores fed and store their data.
The more people that request it, the more resources they throw at things. So if they keep this yearly update schedule, maybe Mountain Mavericks or whatever the hell they name the next release might have it.
I submitted a bug report to help things along. I don't find posting non-useful replies on Hacker News to be very productive at getting change to occur. Apple is far from perfect, but they do pay attention to how many people complain about the same thing in their bug tracker.
Having to ask for OpenGL to be kept up to date through bug reports, just because Apple doesn't care to do it in a timely manner themselves, sounds waay far from perfect to me. But developers have to deal with what exists, of course.
What it does do is show Apple that OpenGL matters to those developers. It is likely more a case of priority shifting: if not enough people request it, it is deemed a lower priority. Until you make it known en masse, it can't be pointed to by engineers as needing to be worked on.
Would it be ideal if Apple did everything HN commentators wanted? Well, no, most of the requests here can be a bit ludicrous. But if you're a developer on Apple's platforms and you do not file a bug report specifying the need for up-to-date OpenGL, complaining here isn't productive. I will leave the hyperbole about "far from perfect" aside, as all human endeavors are far from perfect, and I hate the platform wars that proliferate here.
The only thing I dislike about Apple is that it forces you to use Cocoa to use OpenGL. As a C++ programmer and Ogre3D user, it added a lot of pain to the porting process.
Maybe OpenGL is better with Apple, but C++ is not. Shoving NeXTSTEP down programmers' throats is really a weird strategy.
Apple only expose the OpenGL bindings via Cocoa. There are many libraries that will expose OpenGL for you. [SDL](http://www.libsdl.org/) and [GLFW](http://www.glfw.org/) are the two most prominent.
Exactly. With Go and GLFW I could write 3D GL apps that run the same under Linux, Windows and Mac OSX. No need to mess with OS specifics if you don't want to. SDL could do the same, but since I didn't need audio and all kinds of "media layers" and whatnot, GLFW seemed a lot leaner and immediately worked "out of the box", too.
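For reference, the same pattern against the GLFW C API from C++ is only a handful of lines, and the window/context code below is identical on Linux, Windows and OS X (a minimal sketch, error handling mostly omitted; the core-profile hints are what get you past the legacy 2.1 context on OS X):

```cpp
#include <GLFW/glfw3.h>

int main()
{
    if (!glfwInit()) return 1;

    // Request a 3.2 core profile context (required on OS X for anything
    // newer than the legacy 2.1 context).
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);

    GLFWwindow* window = glfwCreateWindow(640, 480, "GL demo", nullptr, nullptr);
    if (!window) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(window);

    while (!glfwWindowShouldClose(window)) {
        glClear(GL_COLOR_BUFFER_BIT);  // ...your GL drawing goes here...
        glfwSwapBuffers(window);
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}
```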
You misunderstood him. He's not saying "only cocoa exposes gl", he said "cocoa only (as in just) exposes GL, but other cross-platform libs also expose it for you if you prefer not to deal with cocoa".
> Shoving NeXTSTEP down programmers' throats is really a weird strategy.
No, it is like any vendor that makes developers stick with their own technology. This has been happening in commercial software since the dawn of computing.
We're talking about a language, not just a tech. Vendors have always had C++. It's agnostic. Apple and its coolness. I doubt people would use Cocoa without NeXTSTEP.
And honestly, right-clicking on a Mac, what a nightmare, always bringing up the throbber. When I think about it, Apple delivers software that works with the hardware, but the software is just cool; it's not really nice to work with. Most features come from unix anyway. While Microsoft was hardware-agnostic and allowed the computer industry to grow, Apple just comes in with its apparatus and wants people to forget the pain of what allowed the computer industry to grow, by delivering both software and hardware.
You can't make programmers learn a language which works in so many different ways and expect to create a monopoly on such apps. Or maybe it will take time for the crowds to forget C++ and use Cocoa.
All relevant compilers on OS X allow you to mix and match C++ and Objective C. If you look at Apple stack traces, you will see plenty of C++ in there. I don't see how using a few "Objective" bits in your C++ code is any worse than using non-portable COM interfaces in C++ on Windows.
Well, what's the point of using unix then, if unix-friendly apps end up not being compatible? To attract linux nerds?
> XWindows and OpenGL are not part of POSIX.
I don't really care about POSIX, I care about being able to recompile apps for a new OS without having to rewrite core parts or change the design. I guess people would start to make new apps just for the sake of Apple.
And by the way, what's the point of Cocoa compared to other existing things? Is GUI programming that important, now that everybody is using html and js? Is it worth changing the wheels for the sake of a language paradigm? At the kernel level, I don't see why there's a need for Cocoa.
I've done a lot of C++ programming, and Objective-C is significantly easier to write code in. It takes hours to understand Objective-C compared to years for C++.
C++ is great at a lot of things, but interface glue is not one of them.
Yeah, well, [your preferred development stack] sucks because [subjective and/or political reason] that [anyone who agrees with me] knows is obviously true. Clearly, [vendor of your preferred development stack] should go spend many years rewriting everything so that my mind can continue living comfortably in the box it's in, safe from the pain of having to learn something different.
I don't prefer anything, but making things work between systems is a pain, and I thought portable languages were the thing that allowed those systems to run the same programs.
I just think that trashing portions of technologies that worked on other systems is debatable.
> Clearly, [vendor of your preferred development stack] should go spend many years rewriting everything so that my mind can continue living comfortably in the box it's in, safe from the pain of having to learn something different.
Yeah, maybe. At the time I adopted Ogre3D, I thought its code would run on any OS. I would really like to know the good reasons that push companies to break backward compatibility. Carbon comes to mind. Most other OSes think backward compatibility is important. But now we move faster than light, so screw the weak ones who won't learn all the new stuff!