
Rendering Elephants Dream in Stereoscopic 3D


Wolfgang Draxinger tells us he's going to re-render Elephants Dream in stereoscopic 3D. To do this he patched Blender 2.41, since it's the only version that will render Elephants Dream, so you don't have to fix the cameras, i.e. just render!

... no, not quite. He'll probably need to do some work on fake matte paintings, since not everything you saw in the movie was actually made in 3D.

In other news, Jayden Beveridge writes to tell us he has made an anaglyph video tutorial.

Wolfgang writes:

I'm currently working on rendering Elephants Dream in stereoscopic 3D - in Digital Cinema 2k resolution!

To do this I extended Blender a little bit to aid in setting up a stereoscopic camera. Now you can do stereoscopy with the press of (a few) buttons.

A screencast video showing a preliminary version can be seen here:

I asked him further about what he did to the camera.

In as few words as possible: it adds stereo capability to Blender's camera object. If enabled, upon rendering the camera is offset and the projection shifted for the selected eye, so that points within the convergence plane, whose distance to the camera can be controlled and animated, will be perceived at the screen's depth. The stereo base controls how intense the stereoscopic effect is.
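A rough sketch of the offset-and-shift step Wolfgang describes (this is not his actual patch; the function name, sign conventions and the Blender-style "fraction of sensor width" shift units are my assumptions):

```python
# Minimal sketch of an off-axis stereo rig: each eye's camera is offset
# sideways by half the stereo base, and the lens is shifted so that both
# frusta line up exactly at the convergence plane. Names and units are
# illustrative assumptions, not Wolfgang's actual patch.
def stereo_camera_params(stereo_base, convergence_dist, focal_len, sensor_width):
    """Per-eye horizontal camera offset and lens shift.

    The lens shift is expressed as a fraction of the sensor width
    (Blender's shift_x convention). All lengths use the same unit.
    """
    half = stereo_base / 2.0
    # Sensor-space shift that re-centres the convergence plane for this eye;
    # it falls off as 1/convergence_dist, so at infinity the shift vanishes.
    shift = half * focal_len / (convergence_dist * sensor_width)
    return {
        "left":  {"x_offset": -half, "lens_shift": +shift},
        "right": {"x_offset": +half, "lens_shift": -shift},
    }
```

Points on the convergence plane then land at the same image position in both eyes, which is exactly the "perceived at the screen's depth" behaviour described above.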

Wolfgang planned to port his stereoscopic patch to 2.5, and he added that he also wants to do the same for Big Buck Bunny and Sintel.

And now, Jayden Beveridge's anaglyph tutorial.

http://www.youtube.com/watch?v=MpQ67dMSuCs

  • http://www.miikahweb.com/en/ MiikaH

    Cool. I hope that Wolfgang's camera system gets into trunk!

    Such a clever idea to show a visual plane at screen depth. It makes it very easy to set proper convergence, and therefore to adjust the pop-out and depth, without even having to render anything!

  • J.

    Thing worth knowing: Pixar does the same!

  • boogi

    Check out http://www.youtube.com/watch?v=hey8_BFoEeY
    I made it with Blender too.

  • Michael Tiemann

    Nice that Blender is catching up with the stereo craze. Who is working on QUAD?

    :-)

  • Nemesis

    I want stereoscopic 3D in Blender 2.6. I could use it.

  • tblank

    Can you also do stereoscopic 3D with nodes, by separating the blue and red channels?

  • Olaf

    That's the *true* power of open-source, being able to do anything you want, like patching old versions of a program.

  • kram2301

    This is great :D

    I'd love to see something like a three-empty plane setup, which at the same time works for stereo:

    They are set up like an L: the corner one is the, err... center point of interest, the top one is the upper point of interest, and the third one is the side PoI. Those three would then define a viewing plane like that of a shift/tilt lens, and at the same time it would be the focal plane for both the 3D effect and DoF.

    Now, shifting already is implemented but controlling it currently is a little difficult at times...

    If all that became additional camera options, like in the first video but pushed a little bit further, it would be perfect :) - with some nice buttons to de/activate stereo and shift/tilt separately :)

    Nice stuff, definitely.

  • boogi

    @tblank sure you can. Make two scenes, one for each camera. Add both render layers as inputs to the nodes, then add a "Separate RGB" node for each input image, and a "Combine RGB" node which takes R from the left-eye scene's image and G and B from the right's.
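boogi's node recipe boils down to a simple per-pixel operation. Here it is sketched with NumPy (the `left`/`right` arrays stand in for the two rendered images; they are my illustration, not actual render layers):

```python
import numpy as np

def anaglyph(left, right):
    """Red/cyan anaglyph: R from the left-eye image, G and B from the right.

    left and right are H x W x 3 float arrays in RGB order.
    """
    out = right.copy()          # start from the right eye (supplies G and B)
    out[..., 0] = left[..., 0]  # overwrite R with the left eye's red channel
    return out
```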

  • Clavin12

    Isn't this already in the game engine?

  • http://albartus.com/ LOGAN

    @boogi well, better to use half-color for the red channel (feed the whole render into the red channel as greyscale) for a better result.

    @Clavin12 Yes, but very crude in my opinion. Too much 'ghosting' going on.
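LOGAN's half-colour tip, in the same sketch form (the Rec. 601 luma weights are my choice of greyscale conversion, not something LOGAN specified):

```python
import numpy as np

def half_color_anaglyph(left, right):
    """Feed a greyscale of the whole left render into the red channel."""
    grey = left @ np.array([0.299, 0.587, 0.114])  # Rec. 601 luma of left eye
    out = right.copy()   # G and B still come from the right eye
    out[..., 0] = grey
    return out
```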

  • http://yearofthecicada.blogspot.com jay

    Too cool, thanks for the videos sir.

  • Sillstaw

    That's pretty cool. Whenever I do this sort of thing, I just add another camera for the second render.

    I have thought that Elephants Dream would look pretty cool in 3D. Big Buck Bunny... maybe not quite so much. Sintel might look good, though.

  • Tim, LA

    You're kidding, aren't you!!
    Anaglyph stereoscopic??? Not for longer than 1 Minute!!!

  • MRKane

    We use a physical rig for arranging the cameras in our current project, and it's been working well for the last few months. In the end we didn't even need much scripting work just a couple of bone constraints.

    What would be fantastic is if Blender could render images from both cameras in forward or reverse stereo, instead of having to render left and right separately.

    The other fantastic ability would be for the entire setup to work with a stereo rig (i.e. glasses), similar to the expensive add-ons for Max or Maya. Because, let's face it, Blender's better :)

    I also think that reverse stereo is the best way to go if you don't have the cash for a fancy rig, plus you can arrange windows to cameras so that you can view it on a normal screen.

  • http://www.noeol.de/s3d noeol

    I made some stereoscopic render tests for 'Elephants Dream' a while ago.

    Here is my old result: http://www.noeol.de/s3d/elephants.html

    @Wolfgang Draxinger: rendering the Elephants stuff is not as easy as it looks. You need Blender 2.41 (as you mention) and you have to figure out how they did the whole compositing/file-linking at the Orange project, which is different in every scene, in order to place the stereo cameras. But your Blender extensions look good and I think this could work.

    Btw. I wrote a little Python script for Blender 2.54 beta to add an OffAxis-Stereo-Camera; you can find it on my website.

    @Tim, LA: nobody wants anaglyphs anymore. Wolfgang's extensions or my Python script give you the ability to set the 'stereo window' in Blender and render the final left and right views ... then you can combine these images side-by-side or above-under (for 'good' stereo devices) ... or as anaglyph (so everyone with a simple display can see it).

  • http://www.youtube.com/robbielosee Robbie Losee

    This seems like a good place to mention: the download page for Elephant's Dream no longer contains links to download the complete two-disc ISOs with the studio database extras. The files are available on a German mirror, but can only be downloaded one piece at a time.

    I know Big Buck Bunny has the ISOs on Internet Archive,
    http://www.archive.org/details/BigBuckBunny

    But Elephant's Dream only has various movie files there
    http://www.archive.org/details/ElephantsDream

    And sadly Blender Store seems to be all sold out of Elephants. :-(

    So I was wondering if anyone with the complete Elephants Dream ISOs might be so wonderful as to upload them to Internet Archive sometime? That would be a more permanent solution than starting a torrent that would quickly die, and (in my experience) it is relatively easy to upload to Internet Archive. It just takes a good connection and some time (although I have never uploaded something that large myself, I know it is possible).

    Thank you!

    P.S. - Sounds great Wolfgang! A stereoscopic render of Elephant's Dream is something I have been wanting to play with for months. Great to hear someone is serious about it. I am so looking forward to seeing ED-3D!

  • Wolfgang Draxinger

    @noeol: Yes, understanding how the whole linking was structured was a bit hefty, but nothing you couldn't solve using strace and grep. I.e. I opened each and every .blend file (I had to do that anyway), tracing which other file paths Blender tries to open. Then I locate that file and add hardlinks in the right places; in other words, I recreate the file system setup used back then.

    The compositing itself is no problem: due to my approach of patching stereo into Blender itself, the process is transparent. In most scenes all I have to do is set up, and where necessary animate, the stereo parameters and put the files into the rendering queue. Since I also added background-rendering command line switches to select the eye, all you have to do is send the rendering jobs twice, once for the left eye and once for the right, and after each batch copy the output files into two otherwise identically prepared directory structures, which are finally accessed by the final step, "live_edit.blend". The net result is two movies, one for the left eye and one for the right, which you put together in the end.

  • joeri67

    Nice to have the technology in place.
    But... as with all good things, it's not the technology that makes the success.

    Stereo needs to be directed. It's a movie you're looking at, not the real world.
    For example: in a tense close-up one might choose to pull the faces off the screen, but on the other hand, if it's a tender moment one might not want to give the audience tired eyes right at that moment.

    You might want to read up on the subject before starting the task.
    Pixar's Whitehill on the subject: http://news.cnet.com/8301-13772_3-20007683-52.html

  • Wolfgang Draxinger

    @joeri67: Don't worry, I'm carefully directing the stereo. Watching my screencast you might have noticed that I mention you can animate both convergence depth and stereo base, and that's for a reason. In most scenes of ED you can leave them constant, as most of the action happens at the same distance from the camera. But there are notable exceptions, like the bird chase and the elevator ride.

    At the end of the bird chase the camera flies into the telephone's earpiece. In this shot convergence depth and stereo base must be animated, for several reasons.

    Animating the stereo depth also gives you another artistic device, kind of like a dolly zoom, and you can use convergence depth in a similar manner. As with all artistic elements, you should not overuse it.

    In the case of ED, I use it in the bird chase to soften the depth range: there's a bird flying in front of Emo, very prominently on the screen. As long as it's flying there I keep the depth plane just behind its tail, so that it only slightly pops out. Then it veers away, and while the camera keeps its distance to Emo I animate the convergence depth to match up with him.

    For all those with good advice: rest assured that I'm not a newbie to the subject. I've been involved in stereoscopy for a long time (to be exact, since 1999, when I got a new graphics card that shipped with shutter glasses).

  • joeri67

    Looking forward to your work.

    Especially the Rutger Hauer scene in stereo could well become my fav.

  • http://HDPostConsulting.com Jason Gilbert

    The process for re-rendering existing scenes like Elephants Dream or BBB is so simple. I have been re-rendering 3D scenes for Full HD 3D Blu-ray for a month now. All I do is...

    1. Import scene.
    2. Replace camera originally used with this one. http://www.daz3d.com/sections/tutorial/files/2268/Stereo_camera_38x30x60.zip

    3. Render Left Eye of new camera at 1920x1080 quicktime aja v210
    4. Render Right Eye of new camera at 1920x1080 quicktime aja v210
    5. Encode MVC for 3D Blu-Ray using 2 new quicktime files.
    6. Build

    I can have E.D. or BBB re-rendered in a day.

    Hit me up.

  • Dennis

    Here's Big Buck Bunny in anaglyph 3D! 720p. Found it on youtube the other week.
    http://www.youtube.com/watch?v=7e1OB4mq1zk

    The only sad thing with this technology is that everything looks sooo green.

  • Wolfgang Draxinger

    @Jason Gilbert:
    I have to re-render ED for both eyes, since I'm going for 2k Digital Cinema Scope (2048x858 pixels), so I need additional horizontal resolution. I also have to adjust the length of the lens for each scene to preserve the visual frame due to the new aspect ratio; you can't just crop it.

    Also, just displacing the camera won't work. You have to adjust stereo width and convergence depth for each scene, a process which can't be automated. This is the main reason I patched Blender: to make this process as painless as possible.

  • joeri67

    @Dennis
    It also shows stereo needs to be directed. Some scenes are horrid (like the spikes).
    Also, aren't the green and red eyes the wrong way round?

  • Wolfgang Draxinger

    @Dennis: That anaglyph BBB is anything but stereo! You can see it very clearly at 7:30: those spikes are all placed at different depths, so in the picture overlay the separation distance should vary with depth. But it's the same for all spikes ... and the background, and the twigs, and everything.

    Someone just took the _same_ frame twice, separated by a few pixels. Whoever did this should be ashamed.

    Here's what real 3D looks like. This is one of my early render tests of an ED scene, which I did before I figured out that only Blender 2.41 will process ED correctly. I had it readily available for upload, so it's not representative of my current efforts.
    http://vimeo.com/15104199

  • Frederick D

    Nice... this will come in handy to make Sintel 3D, I guess...

  • Dennis

    @Wolfgang
    Wow, those are some jaw-dropping differences! I can't believe I fooled myself into thinking the first one was real stereo. And I can't wait to see the whole movie like that. Just wow.

  • zoooom

    Hi, thanks for sharing this tutorial.

    Is it possible to automatically render a cross-eye view with the nodes?

  • gkossowski

    http://www.youtube.com/watch?v=qAGzCulhEgk
    http://vimeo.com/15137637

    A tutorial without a Python script.
    But... I really don't know if it works, as I don't have the glasses to check...

  • gkossowski

    A new link to the video:

    http://vimeo.com/15150989

  • Frédéric Lopez

    @Wolfgang: you seem to use the word "convergence" frequently, which AFAIK is only used for toe-in projection. Are you using the (incorrect) toe-in projection or the (correct) off-axis projection for your rendering?

  • Wolfgang Draxinger

    @Frédéric Lopez:
    I'm doing off-axis, i.e. the cameras' optical axes are parallel. But you also need lens shift, so that the images converge at a certain depth. It is this depth that will later be perceived as screen level. If you didn't apply the lens shift, the whole depth volume would be compressed between the screen and the viewer, which is very uncomfortable to watch. If you do the math I use for my camera calculations, then in the limit convergence depth -> infinity the resulting lens shift -> 0, i.e. no lens shift ~ screen depth reverse-maps to infinite depth in the scene.
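Wolfgang's limiting case can be checked numerically. With off-axis projection, the lens shift needed to converge at depth d falls off as 1/d (the formula and the values below are my reconstruction from his description, not his actual code):

```python
def lens_shift(stereo_base, focal_len, sensor_width, depth):
    # Shift, as a fraction of sensor width, that makes the two images
    # converge at the given depth; proportional to 1/depth.
    return (stereo_base / 2.0) * focal_len / (depth * sensor_width)

# Illustrative values: 65 mm base, 35 mm lens, 32 mm sensor (all in metres).
shifts = [lens_shift(0.065, 0.035, 0.032, d) for d in (1.0, 10.0, 100.0, 1e6)]
# Each 10x increase in convergence depth cuts the shift by 10x; at infinity
# the shift vanishes and screen depth maps back to infinite scene depth.
```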

  • Frédéric Lopez

    Ah nice, thanks for the explanation. And good luck for the continuation of this project too...
