Cycles X (blender.org)
415 points by Tomte on April 23, 2021 | 148 comments



I've kept an eye on Blender for the last 15 years and tried it again recently after having almost exclusively used Maya. It's continually improved in just about every way, is relatively easy to use with just a trackpad, and it performs really well.

In contrast, Maya has more features overall but is... a piece of shit in many ways. It doesn't matter what hardware I've used; it's a rule that Maya has to crash randomly when doing basic things. I used to be able to just open the Hypershade, but in Maya 2020 merely opening it causes the entire application to crash.

So I've decided to ditch it for Blender. I'm not doing character animation anymore, but I still enjoy modeling and rigging some things I want to conceptualize in 3D space, as well as create 3D printables, and Blender is absolutely up to the task.

It kind of baffles me how little interest the motion picture and television industries have had in Blender. Big studios are still using Maya, which I know also crashes for them and has performance issues. They could pour all their money into improving Blender and eventually not have to pay Autodesk anymore for licenses for poorly managed software that seems to rarely receive bug fixes.


Maya was an amazing piece of software once. It is falling behind only because Autodesk seems to be content to just let it slowly rot.

The funny thing is that there are litanies of complaints about many core Autodesk products that echo the same chorus: the software is poorly maintained, bugfixes are promised "for the next release" (a paid upgrade, of course) and yet they never appear. Customer feature requests are completely ignored, and each release feels like just a version number bump with a few token changes here and there so that sales can act as if they created a meaningful new release.


Some of us admins in the field like to joke that Autodesk isn't a software vendor, they are a licensing company. And to be honest, some of their changes with regards to Named User Licensing are going to be a PITA for many studios.

I've gotten the chance to have a few conversations with some on the Maya team before, and they are great people. But they are fairly hamstrung by a company focused more on sales and growth than on development, one where M&E is less than 10% of company revenue[0] (those numbers, which show an increase in revenue, can likely be attributed to the forcing of subscriptions), and by having to cover a lot of ground re-architecting an application with a ton of technical and legacy debt. The historical idea of Maya being a Franken-app exists for a reason.

[0] https://investors.autodesk.com/static-files/749abc95-16be-4c...


Autodesk is the Oracle of the professional media world.

Adobe is their Microsoft.


I was surprised not so long back to find out just how much of the proprietary 3D market Autodesk held. It seems like they own pretty much all of the big, studio-oriented 3D packages I had heard of from 10-15 years ago.

> Autodesk is the Oracle of the professional media world.

Autodesk’s reach in the niche of M&E 3D packages seems far more horizontal than Oracle’s in even databases. It is as though they set out to own all of the competition and largely succeeded – and this isn’t even the area that brings in most of their revenue.


Media & Entertainment being a tiny part of their revenue fits with the general wisdom that the market for general purpose 3D software is virtually non-existent. The software is costly to build and maintain; combined with a small customer base and the high licensing prices that follow from it, that's a killer.


Autodesk does an equally poor job maintaining their Architecture and Engineering software, despite it being a larger market. GP’s comments about the state of Maya hold true for AutoCAD and Revit as well.


For some reason, enough resources are infused into Blender for it to be competitive, even though it does not bring in any sales or licensing revenue.

I wonder if the market is that small.


The resources that get poured into Blender are coming out of pure desperation. The companies that pay for it have to have some tool that is suitable for their work and supporting Blender is simply cheaper than doing it from scratch.


For a complete opposite of Maya, see Side FX Houdini. An incredible masterpiece of software development, keeps getting better every year, the company is extremely competent, beloved by the community, they keep innovating even though they're so far ahead of any other alternative.

I don't know if I have a point to make, I just wanted to gush about my favorite 3D software, and also I'm sad to see what Autodesk is doing to Maya, what an absolute disaster. I loved Maya so much when I was a teenager. It was built by a brilliant company (Alias|Wavefront), and it could've been so good if it hadn't been sold to Autodesk.


SideFX is the single most pleasant and hope-restoring company to deal with in this entire field. Other vendors have amazing teams, but top to bottom SideFX outclasses them all in terms of support, feedback reception, release cycles, feature/stability improvements, and licensing.


Every year, I keep praying that Side FX doesn't sell to Autodesk - so far it seems to be working.


Autodesk and Adobe have become a problem, the lack of competition and incumbency mean they are basically milking the same cow ad nauseam.

To be a little bit fair - those systems are in fact very complex, and it is quite a big deal to do anything in them. But literally as I speak Photoshop will not go 'fullscreen' on my Mac, and Adobe blames this utterly ridiculous bug on Apple. Which may very well be the case but whoever is 'at fault' - things are retrograding a little bit.

I sense there is opportunity for a new angle on things.


> things are retrograding a little bit

Sure thing, but it seems even you are acknowledging that this specific issue seems to belong to Apple rather than Autodesk, so you're saying things are retrograding in the Apple ecosystem here, not Autodesk.

Disclaimer: I'm biased against Autodesk and love Blender, but even I see the fault in your argument here.


I said it may belong to Apple, not that it does, and frankly, I don't see this bug with any other software, so it really seems iffy. Moreover, Apple is still part of the 'old guard monopoly retrograde cash-cow', which is my point.


I mean, ultimately Autodesk doesn't give a damn about any of their other products as long as the AutoCAD money firehose continues.

Maya continues to be the industry leader because VFX studios have spent 20 years integrating it with their pipelines. Blender could beat Maya feature for feature, and places like ILM and Weta still won't adopt it because of the effort required to fit it into their pipelines.


Well, the cash cow keeps making them money and the studios are happy to keep giving them money, so why bother?

It's the same with sports games, "corporate software", etc.


I learned Maya on IRIX, then purchased a license for plug-in development on Windows around 2004, which was pre-Python support. I really got to know the DAG very well, as well as rendering custom helper tools in the viewport. It was extremely flexible to extend.

I've also kept an eye on Blender since then, and no other piece of OSS causes me so much joy to see the improvements it makes version after version, especially because they are usually linked to small movies which demonstrate the capabilities.

But the few times I've taken a look at the Python SDK, I've felt a bit turned off. It somehow felt a bit like Cinema4D, where you had this huge API to code against but never ended up with small modules (DAG nodes) that were pretty much stand-alone utilities you could then connect to anything Maya had to offer.

I've seen that Blender lately has something which feels a bit similar, but it appears to be limited to shading pipelines (Shader Nodes).

There are no such transform, group, or deformer nodes that allow you to manipulate geometry, correct?

I mean, in Maya I'd build a simple data node (in C++) which had an input, and I'd route the xyz of an object to the xzy-input of my data node, and then apply particle physics to that object and hit play, and the data node would then obtain a "data stream" in its xyz-input representing the object's position, as a very trivial example. Even vertices could be "streamed" into such data nodes. How would I do such a thing in Blender (with Python)?
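
The closest I've come up with so far is a frame-change handler in bpy, which is a callback rather than a composable node - a rough sketch (object names are made up):

    import bpy

    def stream_positions(scene, depsgraph):
        # Sample the evaluated (post-physics/animation) world position,
        # roughly like tapping a Maya node's xyz output each tick.
        src = scene.objects["Emitter"].evaluated_get(depsgraph)
        x, y, z = src.matrix_world.translation
        # "Consume" the stream: here, just drive another object with it,
        # swizzled xzy as in my Maya example.
        scene.objects["Follower"].location = (x, z, y)

    # Blender 2.91+ passes the depsgraph as the second handler argument.
    bpy.app.handlers.frame_change_post.append(stream_positions)

But that's polling from a handler, not a node I can wire into a graph.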


In addition to Geometry Nodes [0], and the "Everything Nodes" initiative [1], you might be interested in Sverchok [2] too, which implements this kind of graph-based parametric/procedural modeling technique.

[0]: https://docs.blender.org/manual/en/latest/modeling/geometry_...

[1]: https://wiki.blender.org/wiki/Source/Nodes/EverythingNodes

[2]: http://nortikin.github.io/sverchok/


There’s Geometry Nodes now which let you write smaller procedurals that you can combine together the way you’re describing: https://docs.blender.org/manual/en/latest/modeling/geometry_...

They’re extremely new (2.92 I think) so the framework is there but there aren’t many nodes yet; 2.93 introduced a whole bunch of new nodes though.
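
You can even wire one up from Python. A rough sketch against the 2.9x API - I'm writing the node and socket identifiers from memory, so treat them as assumptions:

    import bpy

    obj = bpy.context.active_object
    mod = obj.modifiers.new(name="Proceduralize", type='NODES')
    tree = bpy.data.node_groups.new("demo_nodes", 'GeometryNodeTree')
    mod.node_group = tree

    # Geometry flows in, through a Transform node, and back out.
    tree.inputs.new('NodeSocketGeometry', "Geometry")
    tree.outputs.new('NodeSocketGeometry', "Geometry")
    group_in = tree.nodes.new('NodeGroupInput')
    group_out = tree.nodes.new('NodeGroupOutput')
    transform = tree.nodes.new('GeometryNodeTransform')

    tree.links.new(group_in.outputs["Geometry"], transform.inputs["Geometry"])
    tree.links.new(transform.outputs["Geometry"], group_out.inputs["Geometry"])

The Transform node here is the closest analogue to the transform/deformer nodes the parent comment is asking about.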


Excellent! The fact that there's something as basic as the Boolean Math node [1] kind of shows that this is evolving in that direction. I will definitely look at it and search for the C code on GitHub. Thanks.

[1] https://docs.blender.org/manual/en/latest/modeling/geometry_...

Edit: Hmm, it appears to not have a C++ API.


This is precisely the point I made in my comment [1]. The lack of a C++ API is a huge letdown in every Blender release.

[1] https://news.ycombinator.com/item?id=26921528


Just an observation and a point of discussion regarding the big studios. Disclaimer, I am not in this space professionally, but I listen to a LOT of podcasts about VR/AR which spills into all of these topics.

Do you see a push away from Maya and 3ds Max toward a more platform-agnostic modeling environment? Blender seems to be mentioned all over the place for beginners. From what I am hearing on podcasts, the Unreal game engine is making a big push into film/animation production in addition to its impressive game credits. As long as you can export an FBX file, UE4/5 should be a great pipeline, or do I have it wrong?


I'm not in the field myself, though animation was my original career and I know people who work in feature animation.

To answer your question, the field has been sloooowly diversifying in some ways, as you say with Unreal (particularly since Unreal's engine is fast and good for real-time previews), but I don't see a huge push for moving away from does-it-all packages like Maya or 3DS Max. The economic incentive isn't there. Studios have figured out how to make their money and are comfortable with the pipelines that they already have. This isn't to say their pipelines are good. In fact, many of you would be shocked at just how bad most animation pipelines are at big studios. Not only are they way too vendor-locked (as is kind of the case with Maya), but they don't want to pay an adequate salary for qualified software engineers who can build better pipelines and write more software that's agnostic. Of course, I'm making sweeping generalizations, but this is the kind of story I've heard more often than not. Pixar is known for having good practices and they write (and sell) their own software, but I don't closely know anyone who has worked there.

It's going to take a decline in the field of animation before studios realize that they're basically stuck in 2006 in terms of how their artists are working to make their content. I won't mention them here, but people at certain major studios have been trying to get things like Unreal for 10+ years, and they never get approval for licenses because higher-ups don't believe they'd see a monetary return on investment. Their content is so profitable that the people in charge don't really give a fuck.

In case I didn't make it clear, I'm pretty much an outsider at this point, so I'm sure there may be some more relevant viewpoints here that would contradict mine. I don't intend to be authoritative.


> Their content is so profitable that the people in charge don't really give a fuck.

Sorry but that’s completely wrong. Almost every animation or vfx studio is barely scraping by. Margins have been reduced so much for film work it’s extremely difficult to turn a profit, which is why so much work is outsourced overseas these days. The reason software isn’t better is because they barely have the resources to improve it.

Everyone’s aware of newer tools, but things like Unreal still aren’t as good as traditional non-realtime pipelines in terms of flexibility and scalability.


In my opinion, most 3D software and renderers are starting to align in their workflows and setup. If you've spent enough time in a 3D package and have your fundamentals down, then a reasonably experienced 3D artist can make something look good in Maya, Blender, Cinema 4D, UE4, 3ds Max, Modo, or slightly more specialized programs like Katana, Mari, Nuke, etc. Most of the "learning curve" is just finding out what a particular program calls one tool or another. I think this video starts to illustrate what I mean:

https://www.youtube.com/watch?v=VkvRBvKHFek

Also, at studios, you tend to hear about the "main" pipeline where the majority of the work flows through. There are often secondary, smaller pipelines where they evaluate new tools and workflows before committing to entire rewrites. I would agree that small to medium sized studios don't have the resources or luxury to do as much exploration


Every CGI toolset has its strengths and weaknesses depending on the project. Also while the knowledge transfers between each software, they each have their own unique implementations for just about everything. It's hell to learn a new CGI toolset.

For work and also hobby I create 3D animations and experiment with all sorts of things in Maya. A recent turning point was using Redshift to render everything and suddenly I can do overnight GPU renders at high quality. Here are some of my projects released using a CC license -

https://www.jasonfletcher.info/vjloops


> Their content is so profitable that the people in charge don't really give a fuck.

This is usually a general technology smell. Consistent, profitable revenue is the technological equivalent of the resource curse [0].

I haven't figured out if it's because people never get fired (thus new ideas are extremely slow to penetrate and propagate) or because management can afford to be extremely conservative (no pressure, resistance to change).

Put the most charitably, it's because there's no need to be more efficient, and stability & consistency is more valuable than improvement.

[0] https://en.m.wikipedia.org/wiki/Resource_curse


> It's going to take a decline in the field of animation before studios realize that they're basically stuck in 2006 in terms of how their artists are working to make their content.

I’m pretty sure there has been a huge decline in feature animation in the United States. VFX and feature production has almost collapsed here, with much of it moving to places that offer subsidies and cheaper labor for this kind of work, like Canada, Europe, and India.


The problem with UE4 in film/animation is the place it should occupy in the pipeline. Being a game engine at its core, it's fundamentally designed to ingest models and render images. So its natural place is at the end of the pipeline. There are no really good roundtrip capabilities that let you e.g. export a scene that was created in UE for use in other tools. So despite the interactivity that would let you e.g. block out a scene with awesome real-time feedback, there's no good way to get the data out into a different animation or lighting tool for final touchups before it goes to an offline renderer. This is why some big animation studios have invested a lot of effort into in-house realtime preview renderers that attach to their tool pipelines with less friction.


That's an excellent point about why they would rather write their own in-house software. Of course they then get saddled with having to support all of that software and may struggle to do so, which is another problem in itself.


Maya's grip on the VFX companies is, I suspect, in part due to how Maya has been built into the pipelines and tools that many companies use, which is only possible because of Maya's fantastic API. I haven't used the API in anger in about 10 years, but it was amazing back then. Very well designed.

Now if Blender had a comparable API that VFX studios could build on, Blender would have a better shot at eating Maya's lunch. I predict that smaller companies with smaller budgets and almost no custom software will be the early adopters of Blender and will pave the way for bigger companies.


Thanks for this feedback. Maya has been around forever. I think I had the initial release crash on me back in the late 90s. I figured it was the warez back then!

Would you mind sharing what kind of character work you've done previously (hobby or pro) and what types of topics or projects drive you to use Blender for conceptualization?


Probably not because it was warez. I know people who work with Maya in productions; everybody knows that Maya is crash prone to varying extents. It's the least reliable software I've ever used. It does a TON, and it's otherwise pretty awesome, but it's embarrassingly buggy.

I never really got to work with it in a professional space, but before software development I wanted to do character animation and went to school for it. Outside of assignments, I did a bunch of hobby projects, both with animation and sculpting. I was particularly interested in mixing CGI with live action, and I did some things with modeling and rigging monsters, filming footage, resolving camera motion, and compositing rendered animation into the video. It was both a lot of fun and horribly difficult. These days it's much easier, since software for resolving camera motion has improved by orders of magnitude.

There was a studio I worked at briefly, but it was a pretty crappy deal and I decided to just ditch that field for software development, which I am much better at anyway.


:) It was hard to tell what was what back then.

Thanks for sharing those notes on your character animation background.

I'm curious what drove you to be interested in mixing CGI / live action. Jurassic Park, perhaps?

When I was a kid Roger Rabbit and Cool World were amazing to me. JP's CGI seemed different, as though the dinosaurs were real.


Oh, I also didn't catch this:

> what types of topics or projects drive you to use Blender for conceptualization?

So I often have hobby projects (that I sometimes complete, heh) where I would rather sketch it out in Blender so I could more easily figure out what it is I want to build. Sure, I could be using something like Solidworks, but I've just never felt like I needed a full on CAD suite to do what I need. The process helps me design physical objects and decide what parts I'll need to either buy or 3D print.

One such example is a 16mm film scanner I've been meaning to build. I have a 16mm projector and collect 16mm films, and some films you can find on eBay are kind of obscure. So I thought it would be fun to build something to scan them since it would use my programming talent and involve controlling stepper motors.

That was actually where I gave up on Maya completely. I had used Blender for some things already so the transition was pretty easy, but I have more experience in Maya overall. After having it crash on something basic, not even potentially weird operations like booleans (which every other 3D software package gets right), I decided to abandon Maya entirely and hopefully I won't ever have to look back.


It was The Incredibles. That and my father was and still is an animator. But The Incredibles really got me interested because it was both pretty advanced for its time and had what I still consider to be a more mature storyline than even most animated films today. I definitely know what you mean about Roger Rabbit!


Yes, Blender is an amazing project and great 3D production software for being free. However, there are many issues that remain unresolved after years, viewport performance being one of them. There seems to be, at least to an extent, a mode of development where fun new things get done but old, boring, more critical issues do not.

I give the Blender developers all credit when they do the old, boring, and important things. It's just very frustrating to wait for years and realize that very little of what was the vision a couple of years ago actually got done.


A friend of mine in the industry who is now switching from Maya to Blender for concept art once said: "yeah, Blender is free... if you don't count the $200 in add-ons you need to buy to get anything done."

I wonder what percentage of Blender enthusiasts DON'T have Boxcutter and HardOps installed at a minimum.


You can spend up to ~1500 USD a year on plugins/models/textures for Blender and it'll still be cheaper than a yearly subscription to Maya. It's not hard to understand why Blender is currently picking up a lot of momentum.


Viewport performance is one of the several major improvements demoed in the linked video by Brecht!


Brecht’s videos show off improved rendering. I think the person you’re replying to is instead referring to viewport performance in general: scene complexity and size, undo performance, framerate, responsiveness, that sort of thing.


Oh, makes sense. I do know improved dense mesh editing performance is on the roadmap for 3.1 - 3.3 era, so at least there's that :)


Presumably this happens because it’s mostly volunteer labour right? Hiring a person to solve those boring old but important-to-your-business issues seems like it could pay off at a certain number of maya licenses.

My guess is it mostly doesn’t happen because these shops aren’t in the software business and don’t want to be, so don’t know how to value engineers and don’t want to spend money on engineering they aren’t sure will pay off in this fiscal year.


Blender is mostly developed by paid devs I think.


I think there are quite a lot more developers contributing to the project than those employed by the Foundation or supported by developer grants [0]. However, the complex changes on the roadmap of Blender's development are planned and implemented by the core team.

[0]: https://www.blender.org/about/credits/


Huh, TIL, thank you


Blender has a lot of things lacking, however:

* It too is crash-happy, just in different ways. And Maya scales better with large scenes than Blender does, which is key

* Blender's licensing is a deterrent. Very few people want to deal with the GPL.

* Blender is not good for extensibility if you're looking for performance. The API is unstable and exposed only via Python. Studios need to extend the DCCs and Blender doesn't allow for it in the way they need unless they fork the whole app.

* Blender has no professional support contract. This is very important for studios.

I think Blender is awesome, but a lot of the stuff that makes it appeal to freelancers doesn't appeal to big studios. In some cases like the licensing, it pushes them away.


> * Blender's licensing is a deterrent. Very few people want to deal with the GPL.

Why would a studio care about this? A tech firm would certainly care, but the entertainment industry? They probably don't know what the GPL is and would just associate it with "free"


I can assure you that legal teams in studios are just as aware as those in the tech field, even though, for the most part, there's no apparent software distribution.

Even if it's just for internal consumption, legal likes to know what's going on and where it might impact them later.

However, the GPL and other copyleft software will for the most part not have much of an effect on studios unless they're sharing some of the IP they've created with the broader community, partners, and/or proprietary software vendors.


GPL may as well mean radioactive as far as legal is concerned.

A friend worked at a company where they banned GIMP as they feared that editing logos and other trademarks in it could invalidate them.

Even working for a tech company, I don't think I would suggest using anything GPL as it would cause a fuss.


I'm not a lawyer, but I'm pretty certain that's not how the GPL works. It has no effect on the content you create using the tool. It places restrictions on redistribution of the code itself. I don't think there are any limits on what you do with gimp or blender, unless you are modifying its code and redistributing it.


Yes, the GPL doesn't apply to work created by GPL software. I've heard people in companies worry about it though, because they misunderstand "using GPL libraries" as applying to work created as a product of them, rather than strictly to the software.


The Blender Foundation is quite upfront about this though to prevent misunderstandings. It's directly addressed on the page that explains the license: https://www.blender.org/about/license/


The case in point here though was GIMP not Blender. I'm not sure whether they provide or provided such clarification.



That is correct, but there are plenty of lawyers who interpret things incredibly overly-conservatively to save their own ass, just in case.

If you're a senior enough manager and have a legitimate business reason then you can often push back enough and get them to give in, but it's really a question of whether it's worth your time and effort internally.


Tons of widely used software -- Linux, for one -- is GPL.


Of course - the GPL thing is just FUD. The company I work for (Red Hat) sells tons of GPL software to small and large companies, including huge media/entertainment companies, and none of them is talking about how the GPL is "radioactive".


Because libc is licensed under LGPL and not GPL. GPL in and of itself isn't bad if you're not developing code that links it in. LGPL is also widely used in the CG industry (Qt etc...). If Blender licensed their API under different terms it would greatly open things up.


The current version of the GPL is 100% toxic at many places I know of - this is not overly conservative lawyers - you have all sorts of rules around releasing your encryption keys, secure boot chains, etc. - it's a no-go, and viral


Studios write a LOT of custom code for their pipeline and tools. If you were to go to ILM or Dreamworks or WDAS and watch someone use Maya there, it would look COMPLETELY different from what you see at home.

Given that, I'm not surprised that the (not-always-technical) lawyers are worried about GPL.


The current version of the GPL is an absolute no-go for many larger places - even Ubuntu ended up dropping it for parts of its stack - the risks are way too high


> even Ubuntu ended up dropping it for parts of its stack

"even" Ubuntu? Are you implying that Canonical are some kind of champion of free software?


Yes - they ship open source and are generally more comfortable with open source licensing because they have much more experience with it.

Expecting an entertainment industry org (which was specifically targeted in the latest version of the GPL, around DRM) to be comfortable seems far-fetched.


There are definitely management types that see "free" software as "I can't yell at the vendor if something goes wrong." I've seen this many times in government contracts. It's a CYA move IMHO. Even if the vendor is never going to fix the bugs you find, you can at least blame them for not delivering on time.

Of course it's even more silly when you consider that you're never going to be making patches to Maya, so in theory nothing is lost on the Blender side here.


Maya has support licenses (as do most commercial DCCs). Studios can get Autodesk to patch Maya for them and provide custom builds.


Studios are very much tech firms themselves.

Quite a few have released commercial software, share proprietary software with partner/vendor companies, or contribute to OSS.

As with any tech company, the GPL is a very unwelcome license for fear of how far reaching it can be.


> Blender's licensing is a deterrent. Very few people want to deal with the GPL.

I see this claim taken seriously all over the place, and I just don't get it. How in the world is the GPL actually an issue?

The GPL does not apply to anything made with Blender; only changes made to Blender. You can very trivially use Blender without ever dealing with the GPL.

Do studios really think they need to make changes/extensions to Blender itself, then keep those changes totally private and proprietary? How absurd!

I just can't for the life of me think of a scenario where the GPL would actually inconvenience a studio. The only potential problems I can imagine are bureaucratic FUD, like having a legal team arbitrarily demand every tiny piece of work be owned by the studio. What a silly reason not to use better tools.


When you write plugins for GPL software (and VFX studios write a lot of plugins, for various DCCs like Maya, Nuke, Katana, Houdini), are the plugins then derived works? Does Blender's license have an opt-out clause for that?

Sometimes (but not often) these plugins do need to be shared with other studios, or even the vendor - Netflix is starting to get fairly aggressive in asking for copies of the source work, which doesn't really work well with every studio having a custom pipeline and different ways of doing rigging / deforming - but it looks like things are going that way. In this scenario, is "sharing" "distributing" from the license point of view?

Large VFX/Animation studios are not going to open source critical plugins that give them potential edges. They want them to be totally private and their IP.

The large studios have a lot of research / IP stuff going on as well, they are basically tech houses (hundreds of software devs), and they really care about IP (both software and the client's material).


The current version of the GPL is an absolute no-go, especially if you need to import an API / SDK style interface into your extensions and tooling - the GPL is viral. You import a GPL library to interface with, and your stuff is now GPL.

and the latest version requires release of encryption keys etc - all major studios WANT content protection to work and the GPL is explicit in its attacks on that


> all major studios WANT content protection to work and the GPL is explicit in its attacks on that

They want content protection on what? Their tooling? They want DRM on the plugins they write for the tools they use? How or why?


If you import a GPL library into your plugin, your plug-in / add-on is now GPL. They don’t want that.

The GPL has what are called anti-tivoisation / anti-DRM provisions. This license was designed to specifically target the entertainment industry.

Even Ubuntu had to get a different license for its bootloader to avoid risks here


> They don’t want that.

Again: why? Because they have a grudge against the GPL's anti-DRM?

That's still FUD. There is nothing that a studio would be doing with Blender where they would want to implement DRM.

Sure, they will want DRM to be used in the distribution of the content they make with Blender; but that is totally separate from Blender itself and the GPL. It's not like the film itself will contain a copy of their asset creation or rendering pipeline!


They don’t want to open source the tools and programs they develop as part of their pipelines that tie into GPL code or each other - the GPL is viral, and a library import triggers it.

In terms of the tivoization / drm provisions - the GPL is viral - it only takes one screw up or chain of viral connection to blow their business up. Apple fought the govt to avoid unlocking a terrorists phone, that’s how hard they protect signing keys .

The issue is they don’t know who will use what where, and the viral aspect adds insane risk. Minecraft / roblox and other games may decide to add design pipelines or render chains. Or they may want to run the software on golden image VDI pools that are locked down.

Even Ubuntu was so worried about the chain risk they changed the bootloader license away from the latest GPL


> In terms of the tivoization / drm provisions - the GPL is viral - it only takes one screw up or chain of viral connection to blow their business up.

Did I not just finish explaining how that is not true?

The GPL is only "viral" to software that it is licensed with, and to extensions that share meaningful data structures with that software.

The GPL does not cover content created by GPL licensed software.

> The issue is they don’t know who will use what where, and the viral aspect adds insane risk.

Except it's trivial to understand the "viral" aspect of the GPL. It's clearly explained in many places. Therefore, there is no risk at all.

> Minecraft / roblox and other games

Games? We're talking about motion picture and television studios. Then again, plenty of game devs use Blender in their asset creation pipeline without getting the GPL involved in their codebase.

> Or they may want to run the software on golden image VDI pools that are locked down.

So? What does that have to do with anything?

> Even Ubuntu was so worried about the chain risk they changed the bootloader license away from the latest GPL

I don't know anything about that, but I sincerely doubt the context for that decision is in any way similar there...

All I'm seeing here is FUD. Not one of your examples actually brought up a reason - outside irrational fear - to avoid using Blender to make movies.


Dude - let me make this super simple. These rendering pipelines import the SDKs and APIs they connect with. The pipelines have tons of high-value custom code. Under the GPL, they would have to be open sourced if they used Blender - this is 101 stuff. And under the GPL it is viral: if stage 1 is forced open, the next big set of stuff that integrates with stage 1 is also forced open, and so on.

Pixar is not open sourcing their pipeline - period. Do you not understand that these companies build giant and high value software around the various engines?

Listen - I didn't realize how little you understood. This is actually covered in the Blender FAQ, because it can really bite you (even if you're a small player) if you build an add-on to Blender.

"Blender’s Python API is an integral part of the software, used to define the user interface or develop tools for example. The GNU GPL license therefore requires that such scripts (if published) are being shared under a GPL compatible license."

This is cool if you want to use stuff - they make that clear too. "Sharing Blender or Blender add-ons or scripts is always OK and not considered piracy." But for commercial players this is an absolute no go.

This is not FUD, this is hard reality, and no commercial player is going to tie into something like this.


I'm not sure why you're talking about DRM...but software licensing absolutely applies.

Studios work together and with outsourcers. Sometimes that means sharing plugins. Under GPL that would mean they'd have to share their code which is strong IP.


It's not absurd that corporate entities would make changes to their software without a desire to release those changes as code. It has serious implications for patents etc...

If Blender had a stable C API that was dual licensed, you'd see a lot more studio uptake IMHO. Heck, even libc is dual licensed.

This all comes down to the GPL not being well received by most tech corporations. People can argue whether that's got merit or not. However it's a long standing fact that tech companies do not like having dependencies on GPL code that can "infect" their code base. Entertainment studios are the same. They're largely tech companies at their core that create art.


This.

Blender has some rather embarrassing performance choke-points even doing relatively simple things in some cases, that software like Maya and Houdini don't have to the same degree.

GPL code (even if you're not distributing anything) is more of a problem than people would think, especially for places working on very strongly defended IP content (Marvel), where the content owners need guarantees that the software used is fully licensed. There have been quite a few incidents in the industry where unlicensed commercial software was used, which causes surprising issues, like films being delayed. Similarly, they need to know that the licenses of the non-commercial software used don't have "surprises" in them (some software has weird additions to its license, like "must provide credits", and even the artists / developers working on the show for the studio don't necessarily get that).

Extensibility is really the key thing: you wouldn't believe how much custom glue code is written wrapping around stuff for integration - still mostly Python 2, although 3's on the way for the industry. The fact that Blender only supported Python 3 when the industry as a whole was still happy with Python 2 (they don't really care about Unicode; they would have loved the GIL to be removed though) meant it was a non-starter. Similarly, custom plugins (in C++ for performance) are required in huge numbers for various different things - modifiers, deformers, proxy drawing code, etc. - and Blender doesn't expose anything like that.

Again, support: Autodesk are far from perfect, but they will fix critical bugs for key customers when those customers insist - i.e. when there aren't workarounds - often within a week.


Just as a counterpoint: in computational fluid dynamics there are many large commercial companies that use GPL software, and there are also commercial companies that offer paid support contracts for those GPL tools. Once it catches on in the community that these tools are good quality, that there is no legal risk, and that there is good support if you want to pay for it, they tend to spread rapidly.


I think what you're saying is very much in line with my initial comment, if we skip out on extensibility for a second.

If the Blender foundation offered paid support, such that studios used it like a black box (e.g like Maya etc), then the GPL part doesn't matter. They can get someone else to take on the burden and supply them with builds.

Blender however doesn't offer that and I think it's a missed opportunity.

However, even if that were the case, extensibility is important. And unfortunately, Blender doesn't have any API other than Python, and its API is GPL too. Studios need a C-like API for performance reasons, and using the API shouldn't affect the licensing of their own code.


I didn't look, but I'd be surprised if there wasn't a consulting company offering the latter service


I think one of the main issues for big studios is that the whole pipeline is based around Maya / Houdini and it is easy to get artists who know these tools. It's hard to switch, and it takes time (see how many studios are still using Python 2.x, as a switch to 3 will break so much). I know a lot of people are beginning to look at Blender (mainly due to cost and the new licensing models from Autodesk). It is really expensive for studios to use now.


> is relatively easy to use with just a trackpad

The last time I tried Blender - which admittedly is a while - it was practically unusable with a trackpad. Has this changed?


Try using Maya or Cinema 4D with just a trackpad and you'll realize how well Blender works with a trackpad. ;)

With Blender, it's not only a matter of learning the key combos, but it even has on-screen controls by default so you can at least use that. Maya used to have something similar, but they pretty much got rid of it. It's possible to configure Maya to make it easier to use with a trackpad, but the configuration for doing so is horribly obtuse and messes with mouse usage.


One major change is that it now defaults to select with left click, which should help things.

There's also an official Pie Menus add-on that lets you fit more controls onto a smaller laptop keyboard. The numpad is the normal way to switch between view directions, but having those in a radial menu keeps it reasonably usable.

Lack of middle-mouse drag could be awkward and I don't know the solution off the top of my head, but I imagine there's something. You need to have both pan and zoom, I'd guess one is via two-finger scrolling and the other adds a modifier key?


Pie menus are actually built into Blender now, but there's also a great Pie Menu Editor add-on that lets you easily build your own pie menus and other user interface widgets without coding in Python! Which is very important for making efficient custom workflows.

https://blendermarket.com/products/pie-menu-editor

Alias (and now Autodesk) has been abusing the patent system and spreading FUD about marking/pie/radial menus being patented for decades, which inhibited 3dsmax and Blender from supporting them for many years. But now Blender has excellent support for pie menus, and they're not encumbered by patents.

Autodesk Advertisement About “Patented Marking Menus”:

https://miro.medium.com/max/534/1*3C79dFnlhN__OJ3XmEjN9A.png

http://images.autodesk.com/adsk/files/aliasdesign10_detail_b...

Pie Menu FUD and Misconceptions: Dispelling the fear, uncertainty, doubt and misconceptions about pie menus.

https://donhopkins.medium.com/pie-menu-fud-and-misconception...

>The Alias Marking Menu Patent Discouraged the Open Source Blender Community from Using Pie Menus for Decades

>Here is another example of how that long term marketing FUD succeeded in holding back progress: the Blender community was discussing when the marking menu patent would expire, in anticipation of when they might finally be able to use marking menus in blender (even though it has always been fine to use pie menus).

>As the following discussion shows, there is a lot of purposefully sewn confusion and misunderstanding about the difference between marking menus and pie menus, and what exactly is patented, because of the inconsistent and inaccurate definitions and mistakes in the papers and patents and Alias’s marketing FUD:

https://blenderartists.org/t/when-will-marking-menu-patent-e...

>"Hi. In a recently closed topic regarding pie menus, LiquidApe said that marking menus are a patent of Autodesk, a patent that would expire shortly. The question is: When ? When could marking menus be usable in Blender ? I couldn’t find any info on internet, mabie some of you know."


Even better!


It's surprisingly good on a Mac trackpad, with support for lots of multi-touch gestures. Can't speak for other OSes.


I admit I am a bit of a masochist in that area, but I have been using Blender almost exclusively on my laptop for many years, and once you get used to it and have set up your keys correctly, it works


It's not the trackpad that gets me, it's the reliance on having a number pad that catches me out. OK, it's not a reliance per se, because you can control the view with the on-screen UI, but it's far less immediate that way.


There’s a setting for that: Edit → Preferences → Input → Emulate Numpad (‘Main 1 to 0 keys act as the numpad ones’). I don’t use Blender often, but I always enable this setting on the rare occasions when I use it on a laptop.
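
If you script your setup, the same preferences are exposed through the Python API - a sketch that would also cover the middle-mouse issue mentioned upthread:

    import bpy

    prefs = bpy.context.preferences.inputs
    prefs.use_emulate_numpad = True          # main 1..0 keys act as numpad keys
    prefs.use_mouse_emulate_3_button = True  # Alt+LMB stands in for middle-mouse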


Maya's UI is like it's stuck on Windows 95 and can't get up


> Deprecation

> OpenCL rendering kernels. The combination of the limited Cycles split kernel implementation, driver bugs, and stalled OpenCL standard has made maintenance too difficult.

I am not really up to date on the GPGPU world, but is OpenCL in such bad shape that it is not really usable? If so, that is very sad. Are there any alternative open, hardware-agnostic GPGPU APIs, or has CUDA eaten the entire market?


My perspective is that OpenCL is indeed in that bad a shape, though it does have defenders. Both AMD (ROCm) and Intel (oneAPI) have ways to run workloads originally written for CUDA, but they're nowhere near CUDA's level of polish.

I believe an open stack can and will emerge, but it will take time and effort on all levels of the stack. It's possible to do pretty amazing things with Vulkan compute shaders, but the programming model is different than CUDA (it's not single-source), and the tooling support is not quite there.

In time, I am hopeful that WebGPU will gather more momentum, and be officially supported even in places where Vulkan requires janky adapter layers. But in its current form, it's very immature and far from being usable for real workloads.


ROCm is a total mess, and is Linux only.

OneAPI is in a rather good state considering it’s barely a release candidate now. I’ll put my money on Blender supporting Intel GPUs sooner than AMD ones with Cycles X, unless AMD adopts OneAPI.


ROCm doesn't even run on every AMD card, it only supports a subset of their architectures skewed towards the HPC market

The current and previous generations of consumer AMD cards just don't work with ROCm and there's been no indication they ever will


OneAPI/SYCL also works on AMD if the card supports ROCm.


“Works” and actually works are different things. ROCm isn’t in a state that I would call actually working atm, and considering just how broken their CUDA-to-HIP stuff is, I’m not going to hold my breath.


SYCL can be targeted directly to HIP without going through Cuda first, but I agree that it's far from perfect. IMO though, it's as useable as OpenCL by now.


Yes, it’s that bad unfortunately. OK, so you got your kernel working and it performs well. That’s quite an achievement, because most tooling is atrocious, and profiling tools are nonexistent for many platforms.

Now you want to run this on the user’s machine. You are of course using an ancient OpenCL version, because very few vendors update their OpenCL drivers. The situation has gotten so bad that the consortium had to basically roll back much of the newer standard because nobody supported it.

Anyway, the user’s GPU has the right capabilities and should run your code fine. But it doesn’t. If you’re lucky, you get an error message. Often you don’t: either you get a cryptic error code with zero Google results, or the OpenCL compiler just crashes. That actually happens quite often.
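
In practice you end up probing every platform/device defensively before trusting any of them. A sketch of the kind of thing I mean, using pyopencl (the kernel is just a stand-in):

    import pyopencl as cl

    TRIVIAL_KERNEL = """
    __kernel void add_one(__global float *buf) {
        buf[get_global_id(0)] += 1.0f;
    }
    """

    for platform in cl.get_platforms():
        for device in platform.get_devices():
            print(f"{platform.name} / {device.name} ({device.version})")
            try:
                ctx = cl.Context([device])
                cl.Program(ctx, TRIVIAL_KERNEL).build()
                print("  trivial kernel builds OK")
            except cl.Error as exc:
                # Often all you get back is a bare status code.
                print(f"  build failed: {exc}")

And a successful build is no guarantee the results are right, either.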

In summary, if you want to support many different GPUs on different OSes, you’re in a world of pain, because everything is half-baked.

There was a letter by the Blender devs to Apple a couple years ago to get their shit together and fix their OpenCL driver. I don’t think they ever did, just deprecated it and told you to use Metal...


Even worse than an error message is incorrect results. I worked on the OpenCL neural net evaluation backend used in the Leela Zero (Go) and lc0 (chess) bots. We had reports of several OpenCL drivers being so broken that they gave incorrect results while appearing to work correctly, without any error messages. Intel integrated GPUs on Apple were the worst offender, and it looks like those drivers are never going to get fixed. Some older AMD cards had similar issues. We had to add a check that GPU NN evaluation matches a CPU reference to catch these broken drivers.
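
The check itself is conceptually simple - something in the spirit of this (hypothetical function names, numpy for the comparison):

    import numpy as np

    def driver_looks_sane(eval_gpu, eval_cpu, test_batch, rtol=1e-3, atol=1e-3):
        """Run the same NN evaluation on GPU and CPU and compare.

        eval_gpu / eval_cpu are callables mapping a batch of inputs to
        network outputs; stand-ins for the real backends.
        """
        # Broken drivers tend to return silently wrong numbers rather
        # than erroring out, so compare elementwise against a reference.
        return np.allclose(eval_gpu(test_batch), eval_cpu(test_batch),
                           rtol=rtol, atol=atol)

The tolerances have to allow for legitimate float differences between backends while still catching genuinely broken drivers.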


OpenCL is old, and everyone seemingly wants to abandon it, but it's your only option for a bunch of configurations (such as AMD on Windows, even though AMD has pulled all their OpenCL stuff from their website).

As someone who's done a fair bit of SIMD programming on CPU, and heard many scream that wide SIMD (like AVX512) is pointless when you have GPGPU, it's certainly eye-opening to see how poor a state cross-platform GPGPU development is in. Well, I suppose if you only care about Nvidia, CUDA does seem to be pretty good. Too bad it's Nvidia only.

I've heard people consider Vulkan Compute as an alternative. I had a quick look, and it doesn't seem like it supports integer operations (what I'm mostly doing), so doesn't seem viable for me, but I guess it could for a bunch of folk. Not familiar with Vulkan myself though, so corrections welcome.


Vulkan supports integer operations just fine, but tool support for just about everything is extremely primitive. Sizes other than 32 bits are available, but generally as an option. You can check https://vulkan.gpuinfo.org/listfeaturescore10.php to see the fraction of drivers that support the various integer sizes (and lots other optional features).


Thanks! I guess I got confused with the information out there.

Most examples seem to be using GLSL shaders for the kernel, but posts seem to indicate it uses SPIR-V as input [https://community.khronos.org/t/is-a-vulkan-compute-shader-d...]. And then you have threads like this [https://community.amd.com/t5/drivers-software/amd-dropped-sp...] saying that SPIR-V isn't supported on the AMD's Windows driver (I had similar issues myself trying to get SYCL examples to run), though I see other places running Vulkan Compute stuff on AMD+Windows. Maybe that only applies to SPIR-V with an OpenCL runtime?

Diagrams [https://www.khronos.org/assets/uploads/apis/2020-spir-landin...] seem to indicate that you can feed stuff like OpenCL and SYCL into SPIR-V (and then Vulkan Compute) instead of GLSL. For the former case, would you essentially still be using OpenCL, but just with a different runtime? (though articles like this [https://linuxreviews.org/The_State_Of_OpenCL_To_Vulkan_Compu...] seem to suggest OpenCL -> Vulkan isn't in a good state)


There's a lot here, let me try to clarify a bit.

Most of the time, when people say OpenCL, they mean that an OpenCL driver is provided by the GPU vendor. That's what's in a particularly sorry state. Many vendors ship OpenCL but have deprecated it. AMD's ROCm is based on OpenCL, but they don't support it on all cards and there are problems.

The other thing that's picking up steam lately is using a lower level API such as Vulkan as the interface to the graphics hardware, and having a layer that runs the compute workload (whether OpenCL or something else) on top of that. In my opinion, this actually has a pretty good future. On Linux, Vulkan is obviously the way to go, on Windows it's supported by all major cards, and on mac it's possible to fake it with MoltenVK. This is what clspv and clvk are about, but from the article you posted these are not in usable shape yet. I think that might say more about the level of interest around OpenCL than anything else though. It's entirely possible that running compute on Vulkan becomes mainstream through other efforts like IREE than by porting OpenCL workloads.

In a couple years or so, there's a good chance the landscape will shift again, as WebGPU might be capable of running compute workloads well, and is likely to be supported by all vendors. Note that despite "web" in the name, it doesn't require a browser or other web technology. You can run prototypes of it today, but there are still huge chunks of the ecosystem missing.


I see, thanks for explaining all that!


> And then you have threads like this [https://community.amd.com/t5/drivers-software/amd-dropped-sp...] saying that SPIR-V isn't supported on the AMD's Windows driver

That thread is talking about SPIR (the OpenCL binary shader format) in the OpenCL implementation, not SPIR-V (the Vulkan binary shader format, which has also been adopted by other APIs) in the Vulkan implementation.


I'm genuinely surprised that AMD didn't put in an effort to implement CUDA themselves. It's been almost 15 years. They should have started a skunkworks project to get CUDA running immediately.

Intel didn't take long to eat crow and ship the AMD64 instruction set in their CPUs as soon as it became clear they'd lost the 64-bit ISA game. AMD should have taken a lesson from that: if the market wants the other guy's API, you can implement it, customers will be pleased, and that gives you power over the API that you wouldn't have otherwise.


> I'm genuinely surprised that AMD didn't put in an effort to implement CUDA themselves

I think HIP is their attempt at that, as it's very similar to CUDA and is meant to be easy to port. Maybe there's reasons why they (or anyone else really) can't just adopt CUDA (licensing?), though that's beyond my knowledge.


Vulkan Compute is the alternative, and you should look again if you're serious.


Thanks. According to this post [https://community.khronos.org/t/opencl-vs-vulkan-compute/713...], Vulkan Compute seems to be mostly focused on rendering over general compute. This is probably fine for Blender's renderer, but may not be suitable for some compute loads.

Unless something has changed since then, or the post is inaccurate?


That post is 4 years old. Since then, Vulkan compute has made enormous strides forward, while OpenCL has stagnated. Examples of the things Vulkan has now are subgroup size control, explicit float16 and int8, pointers (previously you had to fake them with arrays), and a memory model. The memory model alone is a pretty big advance.

What is still a mess is the tooling situation. You have to write your shaders in GLSL or HLSL (the latter is still missing some subgroup operations) and compile them to SPIR-V. You also have to write CPU-side code to manage all resources, including memory allocation, and to put in explicit pipeline barriers and otherwise manage the asynchrony. For someone who just wants to get their code running, it's pretty painful.

My own application is 2D rendering, but a lot of the motivation for improving Vulkan Compute is machine learning workloads. One of the projects to watch is IREE (https://google.github.io/iree/).

I've got some blog posts and tutorials in the pipeline, stay tuned.


I see, thanks. I presume the end-user running the application will need to have up-to-date drivers to take advantage of these changes in the past 4 years?


For the most part, yes. It is possible to do runtime detection of these features and have compatibility paths, but that further adds to the burden of what's required of tooling and infrastructure. Depending on exactly what you're doing, it's probably best to assume recent drivers. For this type of application (a renderer for a creative app on desktop), I think it's a perfectly reasonable requirement.


Watching Blender become more and more prominent over the last several years has been extremely satisfying. This looks like another great step.


I used Cinema 4D for years, and a few months ago I gave Blender a try. I have to say I was extremely impressed. I did some digging, and it turns out Blender has amazing, product-focused management whose main goal is to serve the users. Extremely well executed product management.


Does anyone know about good books that discuss the various design decisions behind the "kernel graph"? (https://code.blender.org/wp-content/uploads/2021/04/cycles_x...)

I mean, I recognize that it's the "graphics pipeline", but there are all sorts of performance discussion points. You want the data to be in a particular form as it traverses the graph. The various bits of the graph want to be "pipelinable", and parallelized if possible (possibly executing on a 'remote' GPU). The data may or may not be local to a certain location. (Ex: data in DDR4 will need to be transferred to the GPU, and back. Or if you're in a render-farm situation, you may need to TCP-send the data to the remote computer before that computer can move forward with its rendering.)

And of course: all of those need to be balanced out with the featureset of the underlying program. Anyone can make a fast "sphere-only" renderer/raytracer, but Blender supports many different types of meshes, NURBS, and other features, which is probably the bulk of the complexity.

-----------

The deprecation of the OpenCL kernels is a bit sad, but understandable to anyone who has actually worked with OpenCL vs CUDA vs ROCm. The OpenCL kernels in Blender are particularly weirdly coded, so throwing them away might be the best option.


When I was working on a ray tracer, I found that interpolating the color from neighboring points instead of leaving it blank for in-progress elements was a huge improvement for quickly seeing what the scene is going to look like. In the video examples it doesn't seem like they're doing it. I'm interested to know the rationale.

(See "Progressive Rendering" section for an example: https://blog.vjeux.com/2012/javascript/javascript-ray-tracer... )


What do you mean by blank pixels? Cycles is a MC path tracer. This means that it will always trace some paths that don't find high contributions to the final image. When this happens for the first couple of iterations when rendering progressively, these pixels stay darker than the rest. This is just how the algorithm works. You could try to apply a denoiser, but then you're not using the computational power to shoot more rays and you're replacing noise with bias.


Sorry, what is MC an acronym for here?


Monte Carlo.


Thank you.


Cycles works a bit differently, but the same general idea is supposed to be implemented via "CPU viewport rendering with Open Image Denoiser", which live-denoises the render in a more sophisticated way. Unfortunately, the video for that section doesn't seem to be set up correctly, so you can't actually see it in action.
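
If you want to poke at it from Blender's Python console, these are, I believe, the relevant properties as of the 2.9x API (worth double-checking against your build):

    import bpy  # only available inside Blender

    scene = bpy.context.scene
    scene.render.engine = "CYCLES"
    # Viewport ("preview") denoising, as opposed to final-render denoising.
    scene.cycles.use_preview_denoising = True
    scene.cycles.preview_denoiser = "OPENIMAGEDENOISE"  # Intel OIDN, runs on the CPU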


I wondered about the same thing. But look at the other videos. In some of them, the entire viewport is drawn at once and then incrementally improves.


There are a few reasons the speedup scheme you’re talking about doesn’t get used for GPU ray tracing. On a GPU, time to first image that fills all pixels is not typically a problem, and rendering every other pixel and interpolating is more complicated and might take long enough that it doesn’t actually help.

Rendering every other pixel on a CPU in JavaScript is a huge advantage because ray tracing on that platform is incredibly slow compared to GPU ray tracing, and rendering is done sequentially, one pixel at a time. Rendering on a GPU is very different because it handles thousands of rays at a time in parallel, and the typical time to get the first complete image on screen is a fraction of a second. Today's high-end GPUs can trace tens of billions of rays per second, so with that kind of budget it's easy to get through all the pixels of a 1080p image with multiple rays per pixel. The images in your blog post can be ray traced on today's high-end GPUs at 60 Hz with tens or even hundreds of samples per pixel for antialiasing.
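
Back-of-the-envelope with those numbers:

    rays_per_second = 10e9       # take "tens of billions" as a 10B floor
    pixels = 1920 * 1080         # 1080p
    fps = 60
    rays_per_frame = rays_per_second / fps
    print(f"{rays_per_frame / pixels:.0f} rays per pixel per frame")  # ~80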

Another reason a pixel interpolation scheme isn’t used on GPUs is because you don’t have random access to neighboring pixels during a single launch. What this means in practice is you’d have to do a ray tracing launch followed by the interpolation launch. The launches have some overhead, and the UX would be that you first see the checkerboard pattern all at once, and then later you get the interpolated image. You don’t get to see partial progress as you go, unless you’re breaking the image into tiled launches, and that slows down rendering and adds more complication. (Many pro renderers on the market do have tiled rendering to give progressive feedback, BTW).

On the GPU, maybe one of the closest things to what you describe that's used in games today is DLSS: render a lower-resolution image, then upscale it to a higher resolution. Instead of interpolating neighboring pixels per se, it uses a neural network to improve the interpolation based on the image content: https://en.wikipedia.org/wiki/Deep_learning_super_sampling

There are approaches to deferred shading on the GPU that do something similar to what you're talking about: https://graphics.geometrian.com/research/dacs_in_hw.html

There is also denoising, which has the same high-level goal as what you're doing (fill in missing data to get a high-quality preview faster), but uses a much more sophisticated interpolation algorithm, and works on Monte Carlo images with low samples per pixel rather than skipping pixels: https://developer.nvidia.com/optix-denoiser


Oh man, I am pumped about the viewport improvements. Whenever I'm working on a scene, tweaking lighting or shaders is really arduous because after every tiny adjustment I have to sit and wait at least a few seconds to really see what effect it had. These previews look amazing.


Edit: I should clarify this only applies when

1) I'm using a Cycles-only shader node (which unfortunately I do a lot; see the thread below), or

2) I'm focused on lighting the full scene with all the indirect bounces, or

3) I'm adjusting volumetric effects

Otherwise, Eevee is a great stand-in.


Have you tried using Eevee for shader adjustments? A pretty common workflow is to get your lighting setup working with Eevee and then switch to Cycles for the final render.
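
In script form, the switch is a one-liner each way (the engine identifiers are from Blender's Python API):

    import bpy  # only available inside Blender

    scene = bpy.context.scene
    scene.render.engine = "BLENDER_EEVEE"  # fast iteration on lights/materials
    # ...tweak lighting, preview in the viewport...
    scene.render.engine = "CYCLES"         # then the final path-traced render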


Certain shader nodes only work in Cycles. In particular, I use the Bevel node a lot for an "edges" effect: https://docs.blender.org/manual/en/latest/render/shader_node...

You can take the difference between the surface normal and the bevel normal and use that as a "sharpness" value to mask things like wear on the corners of objects.
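
Roughly, as a node-graph sketch via the Python API (the material name and radius are arbitrary; the node identifiers are Blender's):

    import bpy  # only available inside Blender

    mat = bpy.data.materials.new("edge_wear")
    mat.use_nodes = True
    nodes, links = mat.node_tree.nodes, mat.node_tree.links

    geom = nodes.new("ShaderNodeNewGeometry")  # true surface normal
    bevel = nodes.new("ShaderNodeBevel")       # rounded normal (Cycles only)
    bevel.inputs["Radius"].default_value = 0.02

    dot = nodes.new("ShaderNodeVectorMath")
    dot.operation = "DOT_PRODUCT"
    links.new(geom.outputs["Normal"], dot.inputs[0])
    links.new(bevel.outputs["Normal"], dot.inputs[1])
    # The dot product is ~1 on flat areas and drops near edges;
    # invert it and feed it into a mix factor for the wear material.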


One other suggestion is to use a render border.

https://docs.blender.org/manual/en/2.79/editors/3dview/navig...
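
For completeness, the same thing from Python (Ctrl+B in the viewport does it interactively; coordinates are fractions of the frame):

    import bpy  # only available inside Blender

    scene = bpy.context.scene
    scene.render.use_border = True  # only render inside the border region
    scene.render.border_min_x, scene.render.border_max_x = 0.4, 0.6
    scene.render.border_min_y, scene.render.border_max_y = 0.4, 0.6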


> A pretty common workflow is to get your lighting setup working

I don't think that's common at all, because it won't always work well. Cycles and Eevee differ in lighting, materials, and rendering method (Eevee rasterizes, whereas Cycles ray traces), and such.


The GP is correct in many cases. Part of the reason Eevee was created was for this exact purpose. You can't preview certain shader nodes (though you can preview most of them), and you can't preview ray-traced-only effects like indirect lighting (unless you bake it) or certain volumetric effects. But you can preview your geometry, direct lighting, and physically based materials well enough to get an idea of how things will look, and then you can make a final pass where you preview the actual Cycles output.

If you're fixing your UVs, adjusting the depth of a bump effect, arranging a scene to see how objects' colors balance with each other, or tweaking the metalness of a material, Eevee works great as a preview even if your final render will be in Cycles. Eevee is particularly useful when working on an isolated object (minimal indirect lighting), versus seeing how an entire scene gets lit.


I've found that with complex shaders, it can be more performant to use CPU rendering in Cycles, because GPU rendering can have a substantial delay after changing parameters. In the same vein, Cycles is sometimes faster than Eevee, because the latter needs to recompile the shader after every change.
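
For reference, toggling the Cycles device is one property (a sketch for Blender's Python console):

    import bpy  # only available inside Blender

    scene = bpy.context.scene
    scene.render.engine = "CYCLES"
    scene.cycles.device = "CPU"  # or "GPU"; CPU often feels snappier while tweaking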


Awesome!!! Those initial test results are darn impressive. Good to see they are working directly with Intel, AMD and Nvidia.


For GPU rendering, anything but NVIDIA wasn't even a second-class citizen in Cycles for a long, long time.

Cycles X will likely be OptiX-only, and then OptiX-first, for quite a while, until a true cross-vendor, cross-platform stack appears.


In the linked article, benchmarks are shown for CUDA as well. Of course, that's also Nvidia-only, like OptiX.

And anecdotally, OptiX is around 1.5-3x faster than CUDA for Cycles rendering.


I wonder why they decided against Vulkan's VK_KHR_ray_tracing again. With the original Cycles they didn't go for it because of its architectural incompatibility, but now that they're doing a fresh start ...


All I know is, this had better bypass the dumb AF "macOS can't use CUDA drivers with the discrete GPU" situation.

Like, why tf did I buy the MacBook Pro with the expensive discrete GPU if I can't even use it for GPU-accelerated 3D rendering? Seems like a primary use case.


This isn't Blender's fault; Nvidia discontinued development of the CUDA Toolkit for macOS. Version 10.2 was the last version released for macOS [0].

> CUDA 10.2 (Toolkit and NVIDIA driver) is the last release to support macOS for developing and running CUDA applications. Support for macOS will not be available starting with the next release of CUDA.

[0]: https://docs.nvidia.com/cuda/archive/10.2/cuda-toolkit-relea...


This is a really weird response.

1. Nowhere did I mention Blender, although yes, this is a primary gripe

2. Even if I had, where does the inference that Blender somehow drove this come from?

3. Why leave out Metal 2 and AMD from the discussion?


Sorry for the late reply. I assumed that your comment was about Blender, given that you commented on a post about it. I also assumed the comment was directed at Blender because there have been plenty of similar comments/questions w.r.t. CUDA support on Blender's Stack Exchange in the past. Since you were addressing CUDA in particular, that's what I addressed in my reply as well.


No worries. The CUDA (non-)support on the Mac is clearly Apple's doing.


I love Cycles, but they need to fix Eevee (still no support for particle info in shaders, for instance; you're forced to use Animation Nodes for that). I never use Cycles in production because it's too slow (by design, since it's a physically based path tracer), unless I can't do otherwise; in those cases I usually just bake materials in Cycles and then render with Eevee.


Have you tried the new geometry nodes? I've found them to be an excellent alternative to particles for a bunch of use cases.


Yay, Blender: Nvidia edition! Not blaming the Blender devs (this is largely AMD's fault), but still... I guess I'll need to get a Threadripper, because there's no way I'm dealing with Nvidia on Linux again.


> supporting all major GPU hardware vendors remains an important goal

Is Apple (with their M1 GPUs) a major GPU hardware vendor?



