
360º Panorama rendering in Blender


Bernhard Millauer has written a brief post explaining how to make a 360º panorama render of a scene using Blender 2.5. He writes:

I was searching for a way to create a panorama shot, but only found some weird information about PartX properties.
Then I came across a post describing how it works in Blender 2.5.

The setup is now fairly easy:

- Place the camera, horizontally aligned, among the objects
- Set the focal length to 5mm (yes, five)
- Enable the panorama option

You can read the full post here.
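The 5 mm figure can be sanity-checked with the standard pinhole angle-of-view formula. A minimal sketch (assuming Blender's default 32 mm sensor width; panorama mode then sweeps this view horizontally across the render tiles):

```python
import math

def horizontal_fov_deg(focal_length_mm, sensor_width_mm=32.0):
    """Horizontal angle of view of a pinhole camera:
    2 * atan(sensor_width / (2 * focal_length))."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

print(horizontal_fov_deg(5.0))   # ~145 degrees for the 5 mm lens
print(horizontal_fov_deg(35.0))  # ~49 degrees for a typical 35 mm lens
```

So a single 5 mm frame is very wide, but still far short of a full 360º sweep on its own.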


  1. This is not a "360° panorama" render, since that term usually means you capture the entire environment (360x180, the whole sky, equirectangular).
    This is only a "horizon panorama" with a small vertical field of view. A true 360 panorama is impossible with Blender.

  2. @Faxrender
    When reading this article I was hoping someone had found a way to render out a true 360º by 180º panorama; sadly, that is not the case. It is, however, possible to achieve a true equirectangular panorama from Blender by stitching together a cubemap in a panorama editor such as Hugin. I've made full equirectangular panoramas myself that way with pleasing results, but it is a very convoluted workflow. A full 360 by 180 panorama directly out of Blender would truly be awesome!
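    For reference, the equirectangular ("latlong") projection the comment is aiming for simply maps the longitude and latitude of a view direction to image coordinates. A minimal sketch (the axis conventions here are an assumption for illustration):

```python
import math

def dir_to_equirect_uv(x, y, z):
    """Map a unit view direction to equirectangular (u, v) in [0, 1].
    Convention (assumed): +y is forward, +z is up."""
    lon = math.atan2(x, y)                   # longitude, -pi..pi
    lat = math.asin(max(-1.0, min(1.0, z)))  # latitude, -pi/2..pi/2
    return lon / (2.0 * math.pi) + 0.5, lat / math.pi + 0.5
```

    A stitching tool such as Hugin effectively evaluates the inverse of this mapping for every output pixel and samples the matching cube face.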

    That's because the tutorial didn't use enough vertical tiles; more vertical tiles means better quality and less pinching in the panorama.
    Also, a focal length of 5 mm won't produce a full 360º horizontal panorama, so the result is not horizontally tileable; placing an object behind the camera makes the gap pretty obvious.

    The best settings would be:
    X Tiles: 300
    Focal length: 5.05 mm
    and of course tick the panorama checkbox.

    While we're on the subject of panoramas, one rather obvious feature that I find Blender lacks is support for equirectangular or "latlong" panoramas in the world texture mapping, for backgrounds. Currently the only way to take advantage of such panoramas is to convert them to angular maps (fisheye projection) and use the "angmap" mapping setting for a full 360x180º projection. This is quite a cumbersome approach, and in most cases it results in a drastic reduction of image quality. Mapping such a texture to a shadeless sphere (dome/skybox) is always an option, but then the environment lighting can't take advantage of the texture. I guess this isn't the place for such "feature requests", but I just felt like throwing that out here :p
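    For comparison, the "angmap" (angular map) projection mentioned above puts the forward axis at the image centre and maps the angle from that axis linearly to radius, with the rim of the circle being the point directly behind the camera. A rough sketch (axis conventions assumed; the singularity at the exact backward direction is ignored):

```python
import math

def dir_to_angmap_uv(x, y, z):
    """Map a unit view direction to angular-map (u, v) in [0, 1].
    Convention (assumed): +y forward, +z up; radius = angle / pi."""
    theta = math.acos(max(-1.0, min(1.0, y)))  # angle from forward axis
    r = theta / math.pi                        # 0 at centre, 1 at the rim
    norm = math.hypot(x, z)
    if norm == 0.0:
        return 0.5, 0.5                        # straight ahead (backward is degenerate)
    return 0.5 + 0.5 * r * (x / norm), 0.5 + 0.5 * r * (z / norm)
```

    Resampling an equirectangular image through this mapping is what degrades quality: the whole 360x180º environment has to be squeezed into one circle inside a square texture.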

  3. @Artorp
    With regard to rendering equirectangular panoramas, isn't this what the "Sphere" option does in the World textures? "Sphere", presumably, because equirectangular is a type of spherical mapping. I'm not entirely sure myself, but I've tested with an equirectangular image and it seems to work, though it is only mapped to the +z halfplane (as the tooltip mentions).

  4. @scuey
    Yep, that's exactly what it does, but unfortunately, as you said, it only covers the top 90º, which is not the full range of view. For instance, an infinite flat floor will lie beneath the horizontal seam unless the camera's clip end is large enough; otherwise it can't fill in the gap between the object and the horizontal seam. For projects where the camera is surrounded by objects such as mountains, sphere mapping would work just as well.

  5. @peter

    That is a fisheye render; don't mistake it for a true 360x180º equirectangular projection.

    Besides, the distortion in the animation isn't even uniform.

    Fisheye renders can only cover so much of the scene; they cover more or less half of what an equirectangular projection can, while heavily distorting the outer edges, which reduces the relative resolution farther from the image center.

    And as far as I know, that is not possible to achieve directly from a render. Either you have to point the camera at a perfectly mirrored ball (it seems that was used here), or use a lens distortion node. The raytracing method (mirrored ball) limits the set of usable features, such as basic masking, smoke, particles and so on, and the lens distortion node won't actually increase the field of view while it reduces the real image information.
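    The coverage claim is easy to quantify: a circular fisheye with a 180º field of view captures exactly a hemisphere, i.e. half of the sphere an equirectangular image covers. A quick check using the solid-angle formula 2*pi*(1 - cos(fov/2)):

```python
import math

def fisheye_solid_angle(fov_deg):
    """Solid angle (in steradians) seen by a circular fisheye lens
    with the given field of view: 2*pi*(1 - cos(fov/2))."""
    return 2.0 * math.pi * (1.0 - math.cos(math.radians(fov_deg) / 2.0))

full_sphere = 4.0 * math.pi
print(fisheye_solid_angle(180.0) / full_sphere)  # 0.5: exactly half the sphere
```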

  6. Blender cannot do it directly, that's all.
    I use an external tool to do it. It works, but it's a waste of time.

  7. A 6- (or 5-) camera cluster can be done easily by modifying the Blender code:

    1) In Blender, enable Panorama, set Parts = 6 and the image width to 6x the height (e.g. 3000x500), and set the angle of view carefully.

    2) Modify the Blender source so that each Part gets rotated properly (±90° in all directions).

    3) Make a tool that automatically processes those images (to equirectangular or fisheye) once they are saved to disk during the animation.
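    The tool in step 3 boils down to, for each equirectangular output pixel, computing the view direction and picking which cube face (Part) that direction falls on. A minimal sketch of that lookup (face names and orientations are assumptions; a real tool must match how the six Parts were rotated):

```python
import math

def equirect_pixel_to_face(u, v):
    """Return (face, (s, t)) for an equirectangular pixel, u, v in [0, 1].
    Convention (assumed): +y forward, +z up; s, t in [0, 1] on the face."""
    lon = (u - 0.5) * 2.0 * math.pi
    lat = (v - 0.5) * math.pi
    x = math.cos(lat) * math.sin(lon)
    y = math.cos(lat) * math.cos(lon)
    z = math.sin(lat)
    ax, ay, az = abs(x), abs(y), abs(z)
    if az >= ax and az >= ay:
        face, m, s, t = ('top' if z > 0 else 'bottom'), az, x, y
    elif ay >= ax:
        face, m, s, t = ('front' if y > 0 else 'back'), ay, x, z
    else:
        face, m, s, t = ('right' if x > 0 else 'left'), ax, y, z
    # Per-face mirroring/sign flips are ignored here for brevity.
    return face, (0.5 + 0.5 * s / m, 0.5 + 0.5 * t / m)
```

    Running this for every output pixel and bilinearly sampling the selected Part produces the equirectangular image; a fisheye output only changes the pixel-to-direction step.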

  8. Alright, I just made an equirectangular option for mapping world textures, so now they can be used properly for image-based lighting. I think I'll submit a patch soon.

    @Ron Proctor
    Pretty cool stuff you guys are doing at the planetarium. I remember seeing those here before.

  9. @Artorp
    Yeah, the patch is in the patch tracker. So you can find it there if you want to test it out yourself - assuming you are building from source.
