About Author

Bart Veldhuizen

I have a LONG history with Blender - I wrote some of the earliest Blender tutorials, worked for Not a Number and helped run the crowdfunding campaign that open sourced Blender (the first one on the internet!). I founded BlenderNation in 2006 and have been editing it every single day since then ;-) I also run the Blender Artists forum and I'm Head of Community at Sketchfab.

12 Comments

  1. That's really great! I managed the light beam effect in one of last year's projects with the Directional Blur node, but I had to tweak it a little to get the smooth effect we can see in this demo. In that project, as well as in the future Sunbeams node, I miss an input for the location of the effect. I wonder how difficult it would be to add that so we can input tracking data if the footage has some movement. Like in the image below, where the camera rises up (I had to do it manually back then).

    • Sean Kennedy

      Hey Antonio,

      I've already asked for a factor input on the x/y values. Don't worry, I'll keep bugging them about it. :)

    • Well, if you ever want to try implementing such a system, you will need to add a driver to the X/Y values on the Sun Beams node. One thing to take into account before going further is that the values in the Sun Beams node are measured in pixels, so you have to scale the position of the light source (mapped 0-1) by the resolution of your image. The expression in your drivers should look something like this (just swap out the X and Y per driver):
      LightPos(X or Y) * ScreenRes(X or Y) * RenderPercent / 100
      The LightPosX and LightPosY variables are the position of the light source on screen (going 0-1),
      the ScreenResX and ScreenResY variables should be linked to the scene's X/Y Render Resolution, and
      the RenderPercent variable should be linked to the scene's Render Resolution Percentage slider.
      Note that the X and Y Pos values should be driven by the X and Y position of the light source in screen space (i.e. if the light object is seen in the top left corner of the image, XY = 0,1; if in the bottom right, XY = 1,0). For 3D this is the hard part, as it requires some math to work out where the 3D light source sits in the camera's 2D space. If you want to use 2D camera tracker info, this should be easier; or just manually animate the Light Position values and use the driver expression to make the result resolution independent.
      I hope this helped and wasn't too confusing.
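      The driver expression above can be sketched in plain Python to check the arithmetic. This is only an illustration of the formula from the comment; the function name and argument names are my own, standing in for the driver variables (LightPosX/Y, ScreenResX/Y, RenderPercent).

      ```python
      def sunbeam_source_px(light_pos, screen_res, render_percent):
          """Convert a 0-1 screen-space light position to the pixel value
          the Sun Beams node expects, accounting for the render percentage."""
          return light_pos * screen_res * render_percent / 100.0

      # Example: a light 25% in from the left of a 1920-wide frame,
      # rendered at 50% resolution, maps to pixel 240.
      x_px = sunbeam_source_px(0.25, 1920, 50)  # -> 240.0
      ```

      Because the render percentage is part of the expression, the driver stays correct when you drop the preview resolution, which is the resolution-independence mentioned above.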

        • The Track Position node is used to track 2D camera positioning. It requires you to set up a camera track, but it is probably one of the best solutions for normalizing the light direction. I would experiment with it. (I haven't tried it yet, but it seems like it would work out perfectly in your case if you add it to the light position in your 2D shot.)
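        For the 3D case mentioned a couple of comments up, here is a hedged sketch of the projection math: taking a point in camera space and normalizing it to the 0-1 screen coordinates the driver expression expects. This assumes a simple pinhole camera looking down -Z with the given focal length and sensor size; inside Blender itself, `bpy_extras.object_utils.world_to_camera_view` does this for you, so treat this as an illustration, not the actual API.

        ```python
        def camera_space_to_screen01(point, focal_len, sensor_w, sensor_h):
            """Project a point (x, y, z) in camera space (camera looks down -Z)
            to normalized 0-1 screen coordinates, (0.5, 0.5) at the centre."""
            x, y, z = point
            if z >= 0:
                raise ValueError("point is behind the camera")
            # Perspective divide onto the sensor plane (all in millimetres).
            sx = focal_len * x / -z
            sy = focal_len * y / -z
            # Map sensor millimetres to the 0-1 range.
            return (0.5 + sx / sensor_w, 0.5 + sy / sensor_h)

        # A point straight ahead of the camera lands at the image centre.
        u, v = camera_space_to_screen01((0.0, 0.0, -5.0), 35.0, 36.0, 24.0)  # -> (0.5, 0.5)
        ```

        The resulting 0-1 pair is exactly what you would feed into the pixel-scaling expression from the earlier comment.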
