
Stereoscopic Rendering in Blender


After searching around for a stereo camera technique, Sebastian Schneider decided to create his own version. Time to get out your 3D glasses again.

"Since Blender doesn't have a stereo camera and I couldn't find a good solution to make stereoscopic animations (like Up or Ice Age 3) with our favorite 3D program, I decided to write a Python script and a short workaround to simulate an off-axis stereo camera."

You can find the result on my website.



  1. Wow - fantastic work! I produced a similar node solution on the BlenderArtists forums a while back, but your use of Python leaves mine for dust... a very well thought out method.

  2. Although I never thought about it, I'm surprised stereoscopic imagery is not possible in Blender. Will it be standard in 2.5?

  3. Stereoscopic imagery is possible using various methods in the game engine (see the game framing settings), but it's good to see a full-render approach. :-)

  4. Nice!
    But... why not automate the whole preparation process (creating the scenes, linking objects, setting the left and right cameras as active)?

  5. I had a problem rendering 3D to uncompressed AVI: the red always darkened a little too much from the filtering.
    With lossless PNG still images it works fine, but VirtualDub looks like it uses a similar compressor. I think I tried Avidemux, but if I remember right it wouldn't recognize the images, so I gave up.
    There was a really good 3D dragon walk cycle on YouTube and people were saying "finally, one that looks good!" :D Mostly they're all blurry and double-vision-ish.

  6. I haven't checked out the file, so I'm not sure how this works.

    But stereo rendering is possible: using the camera's shift option (X and Y axes) you can create a true stereo setup.

    You need the correct eye separation between the cameras, and then use the shift option with each camera's frame shifted inwards towards the centre point between the two cameras. I did some testing at work with my setup and it worked pretty well (running on a 100" dual-projection system), so Blender really can do true stereo. It's just a pain trying to get it to work right, since you have to completely re-think set design.
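    The shift-frame idea above can be sketched numerically. This is a rough back-of-the-envelope helper, not part of Sebastian's script; the function name is my own, and it assumes Blender's shift value is expressed as a fraction of the sensor width (32 mm in older Blender defaults):

    ```python
    def offaxis_shift_x(eye_separation, convergence_dist,
                        focal_length=35.0, sensor_width=32.0):
        """Horizontal lens shift, in Blender-style shift units (fractions
        of the sensor width), that skews a parallel camera so the
        zero-parallax plane lands at convergence_dist.

        eye_separation and convergence_dist share one scene unit;
        focal_length and sensor_width are in millimetres (assumed values,
        not taken from the script discussed here).
        """
        # Shifting the film by f * (s/2) / Z re-centres a point on the
        # convergence plane -- the "frame shifted inwards" effect.
        offset_mm = focal_length * (eye_separation / 2.0) / convergence_dist
        return offset_mm / sensor_width

    # Left camera shifts toward the centre (+x); the right camera mirrors it.
    shift = offaxis_shift_x(eye_separation=0.3, convergence_dist=10.0)
    left_shift, right_shift = +shift, -shift
    ```

    Note that the shift grows as the convergence plane moves closer, which is why a very near stereo window can push the value toward the limits of the slider.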


  7. Awesome effect!!!
    I viewed the "turtles" and the 3 cubes image and the stereoscopic 3d effect is crystal clear compared to other 3d images!!!
    Great work Sebastian!!!

  8. Henry Schreiner III:

    That's nice, I do hope a stereo option will be available for the camera in 2.5. It could work like the focus does now. However, I guess you'd need a way to get each eye. There is an option for stereo in the game system, but I really wish it had the option for straight/cross-eye viewing. I've trained myself for cross-eyed viewing of 3D, works really well and the colors are okay, but the L/R pairs have to be switched (L image to the right of the R one).

    I really liked the way you can control what's in front of and behind the camera here. I'm going to be trying this out, though with a node setup for side-by-side images. Great work, Mr. Schneider!

  9. I have used nodes before to make 3D videos using Pulfrich motion, parallel stereogram and anaglyph (Red-Blue) stereogram methods. In spite of many hours of tinkering with a spaghetti-bowl of nodes, I was never satisfied with my anaglyph results. Using python in the mix sounds like a great idea, so I can't wait to try this out!

  10. I did this ages ago, rendering two cameras and compositing them according to the Wikipedia information. The image seems a bit red, but that's actually because the eyes are more sensitive to one of the colors. The resulting render was pretty convincing (a bit is lost due to the Google Video resolution).

    I also tested the anaglyph rendering of the game engine back then, and that was just wrong: the colors seemed flipped and the separation was too big. Oh well, I just hope they carefully make it perfect so one eye really doesn't see the other color.

    PS. you can order free sample anaglyph (and other 3D) glasses online if you don't have some yourself.
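    The compositing described above boils down to taking the red channel from the left-eye image and the green/blue channels from the right-eye image. A minimal per-pixel sketch in plain Python (an illustration, not the node setup or any Blender API):

    ```python
    def anaglyph_pixel(left_rgb, right_rgb):
        """Combine one left-eye and one right-eye pixel into a red-cyan
        anaglyph pixel: red comes from the left eye, green and blue from
        the right eye (for red-left / cyan-right glasses)."""
        return (left_rgb[0], right_rgb[1], right_rgb[2])

    # Example: a reddish left-eye pixel and a cyan-ish right-eye pixel.
    pixel = anaglyph_pixel((0.9, 0.1, 0.1), (0.2, 0.8, 0.7))
    ```

    In the compositor the same thing is done with Separate/Combine RGBA nodes; the "one eye sees the other color" problem appears when the glasses' filters don't match the channels cleanly.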

  11. I've been doing stereoscopic rendering with Blender for many years. I use a stereo camera setup with parallel left and right cameras, estimate the separation based on the 1/30 rule, then render the left and right views separately. The rendered animations/images are rendered slightly wider than desired, then cropped to set the "stereo window" in the free applications StereoPhoto Maker or StereoMovie Maker.

    If I'm understanding Sebastian Schneider's Python script correctly, it sets the image plane (stereo window) by rotating the cameras (not parallel). This is also called "toe-in" or convergence. Most stereoscopic photographers avoid doing that because it can create "keystone" distortion.

    I also would very much encourage developers to create a built-in stereo camera and automated stereoscopic rendering that supports multiple formats (separate left-right files, side-by-side, cross-eyed, anaglyph, etc.) and hopefully the camera would have the option of parallel or converging (toe-in) alignment.

    If the Durian project were to decide to go 3D, it would certainly make a complete Blender stereoscopic system a priority. :-)
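    The 1/30 rule mentioned above is simple enough to sketch as a one-line helper (my own illustration, not part of any script here):

    ```python
    def stereo_separation(nearest_subject_dist, ratio=30.0):
        """The photographers' 1/30 rule of thumb: camera separation is
        roughly 1/30 of the distance to the nearest subject in frame.
        Units are whatever your scene uses; ratio=30 is the conventional
        value, and smaller ratios exaggerate the depth effect."""
        return nearest_subject_dist / ratio

    # A subject 6 units away suggests cameras about 0.2 units apart.
    sep = stereo_separation(6.0)
    ```

    It is only a starting estimate; the final separation is usually tuned by eye after cropping to set the stereo window.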

  12. I should have read more carefully. Sebastian Schneider's Python script doesn't use toe-in (convergence). Although there is still something to be said for leaving the setting of the stereo window (zero parallax point) until post production: the advantage is that it is easier to change if you decide later that you want different stereo window placements.

  13. @MarcoA

    You're right. I uploaded a new version of the script:

    The creation of scenes, linking of objects, and setting of the left and right cameras as active will be done automatically by the script.

    So now the only thing left to do (after you press 'Set Stereo Camera') is to set up/edit the nodes in the compositor.


    @Daniel Wray

    You're right too. Calculating the correct 'shift x' value to get zero parallax was my first attempt, but I found Blender's range (min: -2.00, max: 2.00) a little inconvenient. BUT I think there must be a solution to get a 'real' off-axis camera using this horizontal image shift in Blender. I'll try.


    And thanks for your nice comments!

  14. @shul

    I worked at Sony Pictures Imageworks on G-Force, and they use the same technique (parallel cameras with axial shift). The main difference is that they render separate left and right images, which play through projectors with polarizing filters in front that alternate left and right (corresponding to the polarized glasses).

  15. A 3D Durian is an incredibly good, different idea that would bring a lot of interest to the project.
    I think DreamWorks would be interested enough to keep an eye on it as competition, since they have some experience with 3D films, and the sponsor might be enthusiastic about it.

  16. Is there any way to do an automatic node setup?


    PS: You should submit your script as a patch to BF, this is a great help!

  17. This sounds like a great solution.

    I recently made a rig that uses 4 virtual mirrors to split one 2x frame into two (R/L) movies.
    It was a pain in the proverbial butt, but it worked (!), and I could re-order the nodes to make it x-eyed (for my review) or straight stereo; parallax as well as convergence were easily controlled.

    But I like this script *MUCH* better!


  18. John Montgomery:

    I am coordinating a project using OpenSceneGraph where we are looking at adapting Johnny Lee's Wii Remote hack, which you can see on YouTube.

    ...only we are considering using facial recognition through a screen-mounted webcam, rather than the Wii and infra-red. This *might* allow 3D stereoscopic simulation without all the eye paraphernalia, and on a standard monitor.

    It's early days as yet, but eventually it would be nice to see such a facility linked into the Blender games engine in future.

    Glassel, Scotland.

  19. Well, that sounds amazing, but you can't choose a specific camera to bind to the render pipe in Blender.
    I've looked for that, since I've seen it on other packages since 1996.
    Animation Master, 3D Studio, Maya and EIAS all have this feature.

    To enable such a thing there would only need to be another source in the node editor,

    like ADD Input > camera

    or the ability to bind a RENDERLAYER node to a specific camera.

    Also, since cameras are actual rendering sources, I don't understand why they are not represented in the Node Editor.

    That way you could change their setup quickly, instead of digging into the scene graph, changing a parameter, and going back to the node viewer...

  20. Could we have stereographic rendering with 3 colors, for those of us with 3 eyes? Also, where do I find the correct viewing specs for this?

  21. I did stereo in Blender back in the 2.4 days; it's been doable for a long time (you can do it pretty well with parenting and an empty).

    I even got it to output anaglyph 3D (since I had no access to other methods at the time).

    I'm actually confused why people think this is not already doable in Blender, though it would be cool to have a special camera type that does all the work for you, and maybe make the compositor previews and render previews output to stereo hardware if you have it (e.g. NVIDIA 3D Vision, or a 3D display over HDMI or DisplayPort). That would really help the process.
