Which one is the best render for Blender? Sunflow? YafRay? Indigo? Or Blender Internal? An artist called Hannu Kuisti asked himself the same question. He was wondering which render would best suit the needs of architectural visualization, and made a few tests with all of them.
He wrote an article about his experience, showing the results with different settings for each render, along with the respective render times. If you want to check out the article, visit this page. A must-see for anyone who hasn't decided yet which render to use.
This was some good stuff. As for the Sunflow renders, having the Java Development Kit so you can use the -server switch is a must, since it reduces render times by quite a lot. I hope he re-does it with -server enabled. He should also try using "instant gi" to see if he can cut the render time even further compared to the path tracing image.
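For anyone wondering where that switch goes: Sunflow runs on the JVM, so -server is passed to `java` itself, not to Sunflow. A minimal sketch of the invocation follows; the jar path, scene file, and 1 GB heap size are placeholder assumptions, not taken from the article:

```shell
# Build the Sunflow command line with the HotSpot server VM enabled.
# sunflow.jar, scene.sc and -Xmx1024M are placeholders - adjust to taste.
JAVA_OPTS="-server -Xmx1024M"
CMD="java $JAVA_OPTS -jar sunflow.jar -o render.png scene.sc"
echo "$CMD"   # dry run; drop the echo to actually launch the render
```

Note that a plain client JRE may refuse the -server flag, which is presumably why the JDK is mentioned above.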
He notes that comments should be posted to "the forum", but I can't find the post after a quick scan of BA. Can someone point me to it?
Very interesting. Especially because you're not a render guru :p
You seem to place lights as an architect would. I mean, where the lights physically are, and then watch the result to imagine how your room would be lit.
Whereas it's easier to work toward a specific desired lighting effect.
Do you know Kerkythea? I would like to see YOUR test with it.
And then tweak them all to make beautiful renderings :)
Great idea, I always ask myself the same question. Thanks for sharing.
3D for architecture
Could you please provide the packed .blend file, so we can test it with some other settings?
Very informative. I wouldn't mind seeing Eugene's & Eon's suggestions put into effect. Otherwise, the two best renders I've seen so far have come from Blender's internal (w/ AO) and Yafray.
something to add about Indigo:
he used version 0.8 stable.
But the current version (0.9t7) is far faster and actually very stable! (I haven't had any problems so far.)
oh, gain doesn't do anything - it is/was an artefact of Blendigo...
(I haven't checked the latest Blendigo, so I don't know if it's still there^^)
There is one important thing you have to realize when you compare renderers (not "renders" by the way) like this: The renderers are designed to be used with completely different setups.
If you use the same scene setup for two totally different renderers like Indigo and Blender Internal, for example, one of them is bound to come out ahead by miles. I recommend reading the book Digital Lighting & Rendering by Jeremy Birn for a thorough understanding of lighting and rendering techniques. (http://www.amazon.com/Digital-Lighting-Rendering-2nd-digital/dp/0321316312/)
Pah - he can complain - I used to do all my renders on a machine running at 333MHz with 96MB of (very slow) RAM... strangely enough, my new machine, running at 2.6GHz x 2 cores with 2GB of (very fast, processor-synced) RAM, only manages to cut the render times by about 1/5 - which doesn't really stack up that well... but I never could understand how my old machine could be so fast compared to other people's with similar specs lol. I know what was in it because I built it myself - so I've no idea... LOL
Shouldn't it be renderer?
Sorry to double post, but the reason Yafray has grain on the best setting is that it uses the AA settings for the grain amount, and he had AA turned off. Same with DoF.
I'd like to see Yaf(a)ray compared to Yafray, just to see the difference between them.
I can't find the Kerkythea renderer and RenderMan.
All of the renders were very poor and nowhere close to professional quality. The images were vastly different from each other, and in my opinion this test is useless since he had anti-aliasing turned off.
He never claimed they were professional. He just wanted to compare render times for himself and posted about it.
But I agree that the results are not really comparable. To make them comparable, I think you would have to have a reference image which you try to match. Record the time spent tweaking the scene and add it to the actual render time, giving you the total time. Now that would be comparable.
Another thing I didn't catch is which version of Java he actually used, and on which platform, since this can greatly influence performance - the JVMs differ between platforms. Another note on the side: the server setting is the default for Java 1.6 and up.
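Taking the version claim above at face value, here is a tiny sketch that parses a "java -version"-style string and reports whether -server still needs passing explicitly. The sample version string is hardcoded as an assumption, since the actual JVM output varies per machine:

```shell
# Parse the minor number out of a 1.x-style Java version string.
version_line='java version "1.5.0_22"'   # placeholder sample output
minor=$(echo "$version_line" | sed 's/.*"1\.\([0-9]*\)\..*/\1/')
if [ "$minor" -ge 6 ]; then
  echo "server VM is the default; -server is redundant"
else
  echo "pass -server explicitly"
fi
```

On a real machine you would feed it `java -version 2>&1 | head -n 1` instead of the hardcoded sample.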
Thanks for sharing these tests and settings, very useful info!
Hey, but where is the discussion forum you're talking about in your article?
For me all that proves is that the Internal Blender Renderer works fine for me ;)
No doubt we've all asked the same question; I certainly have for archviz, and like Hannu, I've tried all the renderers on offer. BTW, using Sunflow 0.7.2 is far from up to date - you should be building from SVN, which has many improvements; Chris hasn't made a release for a loooooooong time.
But the big problem is that all the renderers handle Blender's materials, textures and lights differently. Not only that, but because the renderers don't have a mature feature set, you end up not being able to use the same renderer for stills and animations and get photorealistic results. So you end up with two different appearances: stills done with, say, Sunflow or Indigo, and animations done with BI.
Then how many of the renderers integrate in any way with Blender's passes? Only one: BI. How many of them have passes on their roadmap? We can have motion blur and other whizz-bang stuff to prove a renderer coder's skills and standing in the community, but what about the basics? When will a renderer coder look at closer integration with Blender, rather than coding a Python script that creates an XML file which then needs hand-tweaking to get results? At least they're doing something, I hear you shout.
But this is open source, and people start coding projects for their own needs, understandably so. Still, it would be soooo good if they looked at the bigger picture as well: look at what Blender Internal is missing, like GI or good fake GI, and get those in place first, along with passes. Then Blender users could mix passes and get results, instead of waiting (yes, we're not all coders and can't damn well do it ourselves) for one particular renderer to get the features needed, or bouncing from one incomplete renderer to another trying to get results. It's simply far easier to just go proprietary with Vray or Brazil; it ain't mega bucks, and results come easier with time because the features are there.
Hopefully the render API GSoC project will bring about change; for me it's the most exciting thing that's happened to Blender in some time. Instead of talk of coding BI to do GI etc., the greatest flexibility will be the ability to render passes in different renderers based on their strengths and composite to get the results.
A lot of replies - I didn't guess the page would actually make its way here...
I originally started the thread in the blenderartists.org forums and I put the link
to the page there. I requested comments there (now I see them here :)
I have some problems with this project, because at the moment (and at the
moment of making the renders) I have a 56K modem connection to the Internet
and sooo much to download :) I'll probably get the latest Indigo when the
maker starts calling it stable. It's developing so fast that I don't get every version.
Mikkel: I know the renderers are designed to be used with different setups. Here,
I am testing which renderer works best with this setup. This is a fairly basic
setup with one sunlight and just a few UV-mapped objects. A fairly basic setup for
archi-viz. As I wrote on the page, I was mostly interested in indirect light bounces. You probably didn't read the text?
daxian: Use Google :) for real: I'll look into them.
I will also have to get the right version of Java, if it doesn't take hours to
download on my 56K :) If I even find the right version on the Java website,
which looks like someone vomited on my screen.
noen: The renders are different because that's how the renderers made them.
As I pointed out, I didn't even do any post processing. I had to make everything
quick & dirty because I have to make some actual work on my machine...
About the AA: I will have to turn it on in my next tests and see how they turn out.
And as for the tweaking, this is why I actually posted the original page! I asked
for advice on how to improve the renderings, and now I have quite a lot. Still,
some questions remain unanswered.
Another thing here is the amount of documentation on these open source/free/GPL
renderers: there's not a lot. There's a lot of testing one has to do
to really understand how the renderers work. So to everyone who thinks the images
were crap, I dare you to make better ones! I would (really) like to see different
pages comparing renderers, with different kinds of setups listed out alongside the renderings.
The more professional-looking work I can save for my clients :)
But hey! Keep the comments on coming.
Not much use with AA turned off and some unknown level of JPEG compression on the images. In summary: a good idea redundantly executed. Now, the weather...
Musk -- I was under the impression that he was presenting himself as a professional. Perhaps I misread something. If so, then I'm sorry about that, Hannu.
@Hannu: I did read the text. And if you want to do such a comparison, you are welcome of course. Your computer, your website, your time, your choice - you are free to do anything you want. All I am saying is just that I don't think such a comparison makes much sense.
I also wondered why Kerkythea isn't in the test. Couldn't find it? Try Google: http://www.google.com :P