The processors of modern video cards (GPUs, Graphics Processing Units) are so powerful that they're much faster at rendering than CPUs. Here are the first results of a test project that introduces GPU rendering in LuxRender. If they pull this off, it promises an incredible speedup!
The frames in the following sample only took seconds to render:
From the demonstration video's description:
Testing the incredibly tiny OpenCL software demo SmallptGPU (and a quick look at the OpenCL accelerated SmallLuxGPU and Bullet Physics as well). The OpenCL fun appears to be just beginning!
Path tracing, yes, the typically very slow path tracing that used to take hours per render, is now getting a helping hand from OpenCL. The frames in these animations took only seconds to render, almost entirely on my GPUs (CPU utilization was literally at 0%). Notice the soft shadows (from true spherical area lights), indirect bounced light, color bleeding, caustics, etc.
Using AMD Stream SDK v2.0 with OpenCL 1.0 support on two ATI Radeon HD 4890 GPUs (Catalyst 9.12 Hotfix drivers on Windows 7, no CCC, not Crossfired). Now I want 5970s :P. AMD's implementation of OpenCL and SmallptGPU also allow exploiting the full CPU as an OpenCL device; with SmallptGPU this helped with one GPU, but not with two. With SmallLuxGPU, you seem to be better off using native threads for the CPU and OpenCL for the GPUs.
In the spirit of this test I also used the experimental, partially OpenCL-accelerated (btBatchConstraintSolverOCL) version of the Bullet Physics engine to simulate the physics. It appears to be too early to rely on that version, though.
Note that none of the programs tested in this video are full-featured packages; they are for demonstration purposes at this stage.
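For readers wondering why path tracing maps so well onto GPUs: each pixel is an independent Monte Carlo estimate of incoming light, which is exactly the kind of embarrassingly parallel work GPUs excel at. As a rough illustration (a plain-Python sketch, not code from SmallptGPU), here is the hemispherical irradiance estimate at the heart of such renderers:

```python
import math
import random

def sample_hemisphere(rng):
    # Uniform direction on the unit hemisphere around the normal (0, 0, 1).
    # By Archimedes' hat-box theorem, z uniform in [0, 1) gives uniform area.
    u, v = rng.random(), rng.random()
    z = u                                   # cos(theta)
    r = math.sqrt(max(0.0, 1.0 - z * z))
    phi = 2.0 * math.pi * v
    return (r * math.cos(phi), r * math.sin(phi), z)

def estimate_irradiance(radiance, samples, seed=0):
    # Monte Carlo estimate of E = integral of L * cos(theta) over the hemisphere.
    # Uniform sampling has pdf 1/(2*pi), so each sample is weighted 2*pi*L*cos(theta).
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        _, _, cos_theta = sample_hemisphere(rng)
        total += 2.0 * math.pi * radiance * cos_theta
    return total / samples

# For a constant environment radiance L, the analytic answer is pi * L.
print(estimate_irradiance(1.0, 200_000))  # converges toward pi = 3.1416...
```

Every one of these samples is independent of every other, which is why a GPU with hundreds of stream processors can churn through them so much faster than a CPU.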
Link
46 Comments
Looking forward to seeing this on my desktop :D
Awesome news. This is such early-days, cutting-edge stuff that I wonder where we will be in five years' time, when this tech has become mainstream.
You may want to check Chiaroscuro's latest work with the Bullet Physics engine and SmallLuxGPU at http://www.youtube.com/watch?v=YlGVitBaaHE
Quite awesome work =)
@Dade: thanks! I added that video to this post.
great news!
Would be cool to have the same thing in the internal renderer :D
I thought Christmas was over!? Wow, fantastic news!
Really pleased to hear they're picking this up. CPU, GPU and network rendering mashups are the future of rendering! Take my word for it!
Wow, pretty amazing! Soon we will laugh at those days when it took hours to render a simple raytraced scene, while the new generation gets even more spoiled (no pun intended) and looks at us like weirdos with our "past". Technology is amazing!
I can't wait! Thanks for the amazing video too Dade & Bart!
Simple: W O W
I'm missing one piece of information: how fast does one frame render? BTW, zero % CPU utilization seems like... a waste of resources ;) Don't get me wrong, this is all fantastic, but couldn't it be even better? :)
This is mindblowing. I need a new graphics card.
My thanks to those who develop this for us!
@Melon: you can check this SmallLuxGPU video: http://www.vimeo.com/8799796 It should give you an idea of rendering times (and you can see that the CPU is used for rendering too).
This, instead, is a video of SmallptGPU (the one running only on GPUs, without using the CPU): http://www.vimeo.com/8141489
@melon: Yes, and that's supported magically with no additional work (OpenCL is awesome). You just use your CPU as another OpenCL device.
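On the practical side of mixing device types: a renderer that drives several OpenCL devices typically splits the image among them in proportion to their measured speed, so a slower CPU device still contributes without becoming the bottleneck. A hypothetical sketch of that idea (the device names and speeds below are made up for illustration; this is not SmallLuxGPU's actual scheduler):

```python
def split_rows(height, device_speeds):
    # Divide image rows among devices proportionally to measured speed
    # (e.g. samples/sec), so every device finishes at roughly the same time.
    total = sum(device_speeds.values())
    shares, assigned = {}, 0
    items = sorted(device_speeds.items())
    for i, (name, speed) in enumerate(items):
        if i == len(items) - 1:
            rows = height - assigned      # last device takes the remainder
        else:
            rows = round(height * speed / total)
        shares[name] = rows
        assigned += rows
    return shares

# Two fast GPUs and one slow CPU device sharing a 1080-row frame:
print(split_rows(1080, {"gpu0": 140.0, "gpu1": 140.0, "cpu": 20.0}))
# -> {'cpu': 72, 'gpu0': 504, 'gpu1': 504}
```

The remainder trick on the last device guarantees that the shares always sum to the full image height, regardless of rounding.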
This is very interesting.
Rendering on the GPU will be a major leap forward.
I have a Quadro, and something like this on a Quadro would be a major time saver.
Great, but Blender has very good GLSL rendering and no one uses it... (realtime and high-quality graphics, see PS3 or X360 games...)
Many, many things, and no users...
Hi Endi,
Would you have a link to glsl rendering?
Thanks
Andrea
You forgot to mention LuxRays, which could also be used in Blender's new shading pipeline.
GPU rendering in LuxRender might also mean GPU rendering for the non-raytracing renderer in Blender.
A 200x speedup would be awesome.
Can haz GPUz pls? :p
This is great... I hope you choose a better name for it than SmallLuxGPU, though.
I've been following the Luxrender team's endeavours with OpenCL ever since OctaneRender hit the news. It's very exciting to have this for Lux too!
To be honest, I've never looked much into GLSL... but I think it's appropriate only for the internal renderer. For Lux it's clear how this is of much use, and it's better due to the simple fact that what you get is the actual final render result.
Guess I need to check out the OpenCL-capable cards in the future...
@Nik, others:
GLSL is not used for rendering - it's for the realtime viewport. It's a way of describing shaders for surfaces, not a way of rendering. A renderer such as Blender Internal does not use any part of GLSL.
Christopher,
Yes, but why? Why can't we use GLSL for rendering? This is a very sad thing, I think...
With GLSL it is possible to get very good results; see next-gen games...
I think Blender users don't use what they already have... always dreaming about new features...
@endi
I think that's a valid point you're making.
EDIT: GPU rendering certainly is an interesting development, though.
Now Blender still has to go GPU. :p
What does "only seconds" mean?
5 seconds average? 10? 15?
That's pretty darn impressive.
We're getting a very competent renderer to complement Blender. Very nice work, guys, thanks for your AWESOME WORK!
GLSL is for rasterization only. OpenCL can compute global illumination (mirrors, refraction, caustics, etc.), which is raytracing; that's why. All games now use rasterization, and game developers have become VERY good at simulating effects. However, as processors become more powerful, I expect that games will eventually switch to global illumination as well.
Note, however, that NVIDIA has posted presentations (with a realtime global illumination demo!) advocating a hybrid OpenGL (GLSL) + OpenCL approach. The future remains to be seen. Personally, I really hope for Blender integration sometime after the initial 2.60 release.
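To make the rasterization-versus-raytracing distinction above concrete: the core operation a raytracer performs, and one a GLSL rasterization pipeline doesn't do natively, is intersecting a ray with scene geometry. A minimal illustrative sketch (plain Python; OpenCL renderers like SmallptGPU run this kind of per-ray work as kernels, one ray per work item):

```python
import math

def intersect_sphere(origin, direction, center, radius):
    # Solve |o + t*d - c|^2 = r^2 for the nearest positive t.
    # 'direction' is assumed to be a unit vector, so the quadratic's a = 1.
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None                         # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0        # nearer of the two roots
    return t if t > 0.0 else None

# A ray from the origin straight down -z hits a unit sphere at z = -5 at t = 4.
print(intersect_sphere((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # -> 4.0
```

A rasterizer instead projects triangles onto the screen and shades the covered pixels, which is why effects like true reflections and caustics have to be faked there rather than computed.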
So this means that GPU rendering and Blender's GLSL capabilities are two entirely different things... That makes "why aren't Blender users happy with what they have" an invalid argument.
Being able to use GLSL for rendering is a very nice thing, although it would be nowhere near as handy as *realtime raytracing*.
In any case, I think I read on the Durian blog that there were absolutely no plans to use OpenCL with the internal renderer any time soon. That's fine, considering it's still way too early. When extremely fast GPU rendering becomes a common thing in 3D packages, I doubt Blender is going to just stay behind.
@Nik,
No, you still have a misunderstanding.
GLSL is not fit for doing a final render. It's not necessary for the purpose.
It's for realtime display.
Next generation (and current) games have been using GLSL or an equivalent because that's what's available to them to do effects.
All GLSL is is a shading language; that's it. The same thing that GLSL enables can be (and is) done without explicitly using GLSL.
It's an apples and oranges situation.
Take a good look at these (leaked) nVidia Fermi GPU rendering videos. Impressive, to say the least.
http://www.youtube.com/watch?v=XFgzG9WyGNQ
http://www.youtube.com/watch?v=Cbnv_z6VDj8&fmt=18
http://www.youtube.com/watch?v=QLu8DyzoVMs
http://www.youtube.com/watch?v=uEieu-OKVvs&fmt=22
http://www.youtube.com/watch?v=jcXmV5Je_gc
A long discussion thread:
http://hardforum.com/showthread.php?s=5da1de802ab5339871bb2da1d8fe0014&t=1483646
The post with the above links in it.
http://hardforum.com/showthread.php?p=1035195143#post1035195143
Careful with the ATI series cards - the high end ones, the 5870/5970, apparently won't fit in my mid-tower Antec Sonata. Don't know about the others.
Kudos to any and all software developers that are attempting to support OpenCL.
Thank You!
I am very happy to see other GPGPU render software popping up. I have been in love with Octane since it came out, though it is still very green. Using OpenCL, even in its infancy, will open the door for people using ATI-based cards, especially as they are a little cheaper than Nvidia. I will probably spend my money on Octane though, as it seems a little farther along and I am an Nvidia user anyway, but I would love to try out a full release of SmallLuxGPU. Good work, guys.
What are the chances of seeing this kind of acceleration in the Blender internal renderer? :)
JoOngle
http://www.vimeo.com/8799796 It was posted up there. That's what they mean by seconds. Sometimes it's just one.
@Iconodast. Oh...
Thanks for the link, that was a very interesting watch!
I'm still a little confused: I've seen some questions on here about the Blender internal engine getting GPU support, but no real answers. To me it seems like the holy grail: not having to learn another renderer, not fiddling with converting particle systems and modifiers to actual objects, etc. Isn't that what we all want?
I love Blender, and have recently had the chance to actually use it in production. Despite a long time familiarity with the program, I am finally learning all the ins and outs. But I'm frustrated as hell to create something, then find out that I can't get the photorealistic render I wanted because I can only do certain effects in another program. I'm not a guru- I still have stumbling points and don't know all the tricks. And I don't have all the time to go out and find them.
Sorry to ramble. I guess what I'm saying is: I'd rather be using something that is built into Blender. I'd pay good money to do so. And I'm betting a lot of other people would as well. If we could find willing programmers, how hard would it be to actually set up a fund to get it done? It's still cheaper than unloading a few grand on commercial software!
Maybe it's just time for me to get more involved! And no offense to the Luxrender guys. What initiative.
It's a bit misleading to say that "The processors of modern videocards (GPU's – Graphical Processing Units) are SO powerful, that they're much faster at rendering than CPUs" when most people imagine that GPUs obviate CPUs. The kind of processing GPUs do is very, very constrained, so they do better than CPUs in *that particular type of task*. For many other kinds of general-purpose tasks, not so much. Even in these videos, the little bit of not-strictly-rendering computation (like the collision physics) isn't too strenuous nor too elaborate: well within the class of computations you'd expect to do well. Sort of like looking at a 2-seat sports car, comparing it to a general-purpose sedan, and wondering if everyone will dump their sedans for sports cars because the latter obviously goes faster and handles better. Well, for that particular kind of sporty driving, sure: better. But probably not so much for general-purpose use.
Now all that's needed is to combine the power of the GPU *AND* the CPU, along with a network renderer, to make some kind of monster super-renderer, capable of travelling faster than a speeding bullet and leaping tall buildings in a single bound!
At which point, Blender3D 2.5 is released, and this all combined makes Blender3D the new standard in the 3D industry?...
.. OK, nevermind that, it's just another dreamed up world domination plot. But on a serious note, this could be very good for Blender if it can be incorporated into it.
That looks very promising. Thanks for the info.
Not blender related:
Do any of you know about FPrime? I bought this renderer back in my Lightwave days :)
I don't know about the inner workings of rendering engines, but this was possible a few years ago, and with no GPU needed (at least that's what I seem to remember). Just for pure inspiration, you should take a look at the FPrime realtime videos here:
http://www.worley.com/E/Products/fprime/videos.html
Best regards, Christian Lehmann
This is incredible.
Wait, is this correct? A GPU is mainly a processor devoted to graphics? I heard that it uses parallel computation, much like the dual-core and quad-core processors we have, and I read that a GPU has as many as 25 to 100 processors, though each shares only a small amount of memory. So I think JG has a point in saying that a GPU is primarily powerful for rendering and other stuff that can be processed in parallel, but is not good at other things, especially memory-demanding ones.
But then, if it's quite capable of speeding up rendering (wow, up to 64 times faster!), wouldn't that be great? I do wonder, though, if it still works well with large images and high polygon counts. The tests are only spheres and blocks with simple materials, which doesn't amount to much memory use. On the Lux forums, it seems that LuxRender will only use GPU computation for some parts of the whole rendering process. Hmmm.
Still, its promise is amazing. Surely, if it's possible, everyone will be in love with it. Gosh, realistic, unbiased rendering in seconds! Even if it's minutes, that's a far cry from hours. I tried LuxRender before, and it took up to 60 hours just for my image to clear up, not to mention the disheartening fireflies that never seemed to disappear.
miraculous
"""Does anyone of you know about fprime? I bought this renderer back in my Lightwave days :)
I dont know about the inner workings of rendering engines — but this was possible a few years ago."""
I don't think FPrime is unbiased; there is a difference between biased and unbiased renderers. It would be kind of interesting to also see a biased GPU-accelerated renderer pop up. :p
Stunning! Can't wait for a real LuxRender build with GPU support to show up!
Definitely where the industry is going, I think. It wasn't too long ago that I saw Kun Zhou's research using GPGPU and thought it would be cool if the Blender internal renderer headed in this direction. BTW, he uses Elephant's Dream in his paper for some of his test renders (to answer ralmon's questions regarding complex renders). Check out his paper on RenderAnts - http://www.kunzhou.net/
Very amazing, I tried it on my own PC... but how do I create a *.scn file??? I would like to test it on a scene of my own (and yes, I have Googled already ^^)
hi all!
I think LuxRender is the best there is! Now we will have GPU rendering too, and I am glad the developers are sticking with OpenCL. I would donate to them, but they don't have a donate button, or I cannot find it? Anyway, this is amazing and I will continue to teach this wonderful product. TAKE THAT, INDIGO!!!
Ditto that. If anyone knows of a donations link, post it here. I'm sure Octane is a great product (except that it keeps crashing my system), but I would much rather put my money into something made to work directly with Blender. My Dad always said: throwing money at a problem doesn't always make it better. My response was always: No, but it can't hurt!
Goodness, this is awesome!
hopefully, this will be usable soon, so good!
I agree with endi that it would be nice to have machinima-style animations with OpenGL/GLSL. Currently, Blender renders the UI gizmos in the final render too.