Hacker News

Also Intel. Being Nvidia-only is not very good from an accessibility point of view. It means that only ML researchers and about 60% of gamers can run this.



Thankfully it's a specialist software package not aimed at gamers or ML researchers.


And most people in Hollywood use rendering engines like OptiX.


No, they don't. Also, OptiX isn't a renderer; it just traces rays and runs shaders on the ray hits, on Nvidia cards. Memory limitations and immature renderers hinder GPU rendering. The makers of GPU renderers want you to think it's what most companies use, but it is not.

Also, Hollywood is a city, and most computer animation is not done there. The big movie studios aren't even in Hollywood, except for Paramount.
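To make the "traces rays and runs shaders on hits" description concrete, here is a toy CPU sketch of that trace-and-shade pattern. This is illustrative only, not the OptiX API: all names (`trace`, `closest_hit`, `miss`, `Sphere`) are made up for this example, and real OptiX runs these as GPU programs.

```python
# Toy CPU illustration of the trace-rays-then-invoke-shader pattern.
# All names here are hypothetical; this is NOT the OptiX API.
import math
from dataclasses import dataclass

@dataclass
class Sphere:
    center: tuple
    radius: float

def intersect(origin, direction, sphere):
    """Return the nearest positive hit distance along the ray, or None."""
    ox, oy, oz = (origin[i] - sphere.center[i] for i in range(3))
    b = 2 * sum(d * o for d, o in zip(direction, (ox, oy, oz)))
    c = ox * ox + oy * oy + oz * oz - sphere.radius ** 2
    disc = b * b - 4 * c  # direction assumed normalized, so a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def trace(origin, direction, scene, closest_hit, miss):
    """Core pattern: find the nearest hit, then run the matching 'shader'."""
    best = None
    for obj in scene:
        t = intersect(origin, direction, obj)
        if t is not None and (best is None or t < best[0]):
            best = (t, obj)
    return closest_hit(*best) if best else miss()

scene = [Sphere(center=(0.0, 0.0, 5.0), radius=1.0)]
color = trace((0, 0, 0), (0, 0, 1), scene,
              closest_hit=lambda t, obj: (1.0, 0.0, 0.0),  # hit shader: red
              miss=lambda: (0.0, 0.0, 0.0))                # miss shader: black
```

The renderer on top of this (materials, sampling, lights, denoising) is where the actual work lives, which is the distinction the comment is drawing.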


Except that is exactly what companies like OTOY build their products on.

https://home.otoy.com/render/octane-render/

As for the rest of the comment, usual Nvidia hate.


Octane is exactly the type of thing I'm talking about. This is not what film is rendered with. Film is mostly rendered with combinations of PRMan, Arnold, or proprietary renderers, all CPU software.

I don't know where you're getting "Nvidia hate" from; studios that use Linux usually run Nvidia, mostly because of the drivers.

None of this changes the fact that OptiX is not a renderer.



