

  1. @jayh
    Even if there isn't a dedicated button for animated textures, a Python script that changes the picture each frame wouldn't be difficult to write.
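The kind of per-frame swap script mentioned above can be sketched in plain Python. The file-naming pattern (`frame_0001.png`, ...) and the wrap-around behavior are assumptions for illustration; inside Blender you would call something like this from a frame-change handler and assign the result to the texture's image.

```python
# Hypothetical sketch: pick which image file to show on a given frame.
# The naming scheme and looping behavior are illustration assumptions,
# not Blender's actual API.

def image_for_frame(frame, num_images, pattern="frame_%04d.png"):
    """Map a (1-based) animation frame to one of num_images files, looping."""
    index = (frame - 1) % num_images + 1
    return pattern % index

# e.g. with 10 images, frame 12 wraps around to the 2nd image
print(image_for_frame(12, 10))  # frame_0002.png
```

As the next comment notes, image-sequence textures made this kind of script unnecessary.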

  2. Yeah, but that isn't needed anymore (not that I didn't have to write that damn script once myself); now you can just set your texture as an image sequence.

  3. Apart from the fact that the coder's name is misspelled :P , this is a really cool feature, along with the whole GSoC project this guy is working on. If you don't find a fresh build on graphic-all, google for blender-zoo (download).

  4. Correct me if I'm wrong, but haven't we always been able to do this with spotlights?

    So if you're after soft textured lighting, why not just do a render pass of only the spec/diffuse of a textured spotlight hitting the floor, and feed it through a blur node? You'd get a very similar-looking render to the one in the article.

  5. redbyte: No, there is no appropriate texture coordinate generation for that. However, you can get the same effect by using an area light behind an actual half-transparent screen, with TraShadow and Translucency enabled. That's pretty slow, though... I don't know if it's slower than this.

  6. After experimenting more, I found that none of these solutions really give impressive results. The closest I got was by dupliverting a square spotlight, with the image as its texture, onto a grid on the screen :) And I guess this project does something similar, and optimizes the render time using the lightcuts stuff :)

    Funny experiment, actually. Render time: 1 minute 20 seconds...
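The grid-of-lights experiment above can be sketched as follows: treat every pixel of a small image as one coloured light placed on the screen plane. The pixel data, screen dimensions, and centring convention here are made-up illustration values, not taken from the project or from Blender's duplivert machinery.

```python
# Hypothetical sketch of the "grid of lights" idea: one coloured point
# light per pixel, laid out on a screen-sized plane centred at the origin.

def screen_lights(pixels, screen_width, screen_height):
    """Return (x, y, color) tuples, one light per pixel of a 2D colour grid."""
    rows, cols = len(pixels), len(pixels[0])
    lights = []
    for r, row in enumerate(pixels):
        for c, color in enumerate(row):
            # place each light at its pixel's centre, screen centred on origin
            x = (c + 0.5) / cols * screen_width - screen_width / 2
            y = (r + 0.5) / rows * screen_height - screen_height / 2
            lights.append((x, y, color))
    return lights

# a 2x2 "image": red, green / blue, white, on a 2x2 screen
pixels = [[(1, 0, 0), (0, 1, 0)],
          [(0, 0, 1), (1, 1, 1)]]
for light in screen_lights(pixels, 2.0, 2.0):
    print(light)
```

For a real image this explodes into thousands of lights, which is exactly the cost the lightcuts clustering is designed to tame.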

  7. @Alexander:
    It's not quite the same. This isn't about projecting a texture; it's about actual lighting coming from the image. If there's a red pixel in the image, it's equivalent to having a red light at that spot. Next to it could be a green pixel, which works as if you had a green light there too, right next to the red one, just like a real TV set ^_^

  8. RedByte: Actually, this is extremely different. The original method of mapping textures onto lights didn't allow, afaik, for proper scaling and orientation, and it only projected the texture itself. The lightcuts method colours each point on the area light, so the light emits vastly different colours depending on where it is.

    This would be more noticeable if, for example, you had an area light with a yellow-and-purple gradient on it shining on a cube. The side of the cube that both ends of the light shone on would be white/grey; the part that only the yellow side reached would be yellow, for lack of purple light, and vice versa. This would work well for a giant screen in a sports arena, or an animated advertising billboard, where the area light's coverage is large enough to make a noticeable difference in the environment depending on location.

    Also, a gorgeous bonus is the fact that the specular colour is also emitted from the screen as a whole, which gives a wonderful soft look to the specular objects themselves, instead of the single-location specular light that a spotlight would exhibit. :] All-in-all, UncleZeiv is doing fantastic work.
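The additive mixing described in the last two comments can be shown with a tiny worked example: each region of the area light contributes its own colour, and a surface point simply sums whatever contributions reach it. The colours and the visibility of each region are made-up illustration values.

```python
# Illustrative sketch of additive light mixing: a surface point receives
# the sum of the RGB contributions it can "see", clamped to 1.0.

def received_color(contributions):
    """Add up the RGB contributions reaching one surface point."""
    r = min(sum(c[0] for c in contributions), 1.0)
    g = min(sum(c[1] for c in contributions), 1.0)
    b = min(sum(c[2] for c in contributions), 1.0)
    return (r, g, b)

yellow = (1.0, 1.0, 0.0)
purple = (1.0, 0.0, 1.0)

# a point both ends of the gradient shine on tends towards white ...
print(received_color([yellow, purple]))  # (1.0, 1.0, 1.0)
# ... while a point only the yellow end reaches stays yellow
print(received_color([yellow]))          # (1.0, 1.0, 0.0)
```

This is exactly why the cube side lit by both ends of the gradient goes white/grey while the sides reached by only one end keep that end's colour.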

  9. This is good, and that point-light stuff could come in handy later for other things if the implementation isn't too specific... Think fast, good-quality radiosity approximation, or uber-fast global illumination. I've read lots of papers about this sort of thing and already have an idea of how you could use point lights to do it (possibly with spherical harmonics to aid fast, high-quality sampling)... Volumetrics could also really benefit!

    Keep up the great work on this! =]
