This is not available in Blender, but boy would it solve the noise issues!
This technique is based on the paper "Spatiotemporal Variance-Guided Filtering: Real-Time Reconstruction for Path-Traced Global Illumination".
4 Comments
So where did they test it?
One thing to mention here: as far as I know, this is for animations only, not for stills.
Quote from the linked site:
'We introduce a reconstruction algorithm that generates a temporally stable sequence of images from one path-per-pixel global illumination.'
So, would it work if you rendered a 3-frame animation of your still?
Yeah, of course it works for still images too! :) It's based on accumulating more samples over time, blurring image brightness along edges, and looking at temporal variance (brightly lit areas or areas in shadow have low temporal variance -> less blurring).
It's biased and can oversmooth a bit, but it would be much better than the noisy image of an early low-sample render. I'm trying to implement something like it; I think it can be done in much less time than they presented (approx. 2.5 ms for filtering on a GTX 760, 5 ms for two passes as they do).
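To make the idea concrete, here's a toy sketch in Python/NumPy of the core intuition (temporal accumulation plus an edge-stopping blur weighted by per-pixel temporal variance). This is NOT the paper's actual algorithm (which uses an à-trous wavelet filter, motion-vector reprojection, and G-buffer normals/depth as edge-stopping functions); the threshold and blend factor below are made-up illustration values:

```python
import numpy as np

def variance_guided_blur(frames, edge_threshold=0.2):
    """Toy sketch of the SVGF idea: accumulate samples over time,
    then blur each pixel in proportion to its temporal variance,
    stopping the blur across strong brightness edges.

    `frames` is a (T, H, W) array of noisy grayscale renders of the
    same (static) scene. Values are illustrative, not from the paper.
    """
    mean = frames.mean(axis=0)   # temporal accumulation
    var = frames.var(axis=0)     # temporal variance per pixel
    out = mean.copy()
    h, w = mean.shape
    for y in range(h):
        for x in range(w):
            if var[y, x] < 1e-6:
                continue  # temporally stable pixel: keep as-is
            # Average 3x3 neighbors whose brightness is close to the
            # center (a crude edge-stopping function).
            acc, wsum = 0.0, 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        if abs(mean[ny, nx] - mean[y, x]) < edge_threshold:
                            acc += mean[ny, nx]
                            wsum += 1.0
            # Blend toward the neighborhood average, more strongly
            # for noisier (high-variance) pixels.
            blend = min(1.0, var[y, x] * 10.0)
            out[y, x] = (1 - blend) * mean[y, x] + blend * acc / wsum
    return out
```

Shadowed and brightly lit pixels converge quickly, get near-zero variance, and are left sharp; noisy penumbra pixels get blurred with their (brightness-similar) neighbors. A real implementation would run this as a GPU filter, not per-pixel Python loops.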