
Oak Ridge National Laboratory: Blender on a Supercomputer!

Oak Ridge National Laboratory in Tennessee is using Blender on its nearly 300,000-core Jaguar supercomputer for scientific visualization.

Mike Matheson reports:

At Oak Ridge National Laboratory in Tennessee, the largest computing complex in the world devoted to computational science, Blender is used to support scientific visualization. Currently, three large liquid cooled Cray systems are located at the site in a half-acre computer room. The Department of Energy's Jaguar XT5, the University of Tennessee's Kraken XT5 in support of the National Science Foundation, and the National Oceanic and Atmospheric Administration's Gaea XE6 provide the leadership computational resources.

Blender runs on the Supercomputer which is Linux based. At least, most of the renderer does – we don’t build the player or game engine or features we don’t use.

Currently, Jaguar is being transformed into a new Cray XK6, which will be renamed Titan. When the last upgrades are completed, Titan will have 299,008 AMD cores and 600 terabytes of memory, along with thousands of next-generation NVIDIA Tesla GPUs. Titan and the other two systems will total more than 500,000 cores and roughly 1,000 terabytes of memory when completed. A disk infrastructure with tens of petabytes of high-bandwidth storage is critical to supporting the systems.

Blender is well represented by the work done by visualization staff at the Oak Ridge Leadership Computing Facility. In fact, at this year's Scientific Discovery through Advanced Computing (SciDAC) Electronic Visualization Night, five of the 12 awards went to ORNL researchers, and all of the winning entries were made with Blender. This annual competition draws visualization work from US national laboratories, universities, and other visualization groups.

Here is just a small subset of examples of Blender applied to scientific datasets at ORNL.

Computational fluid dynamics simulations to aid in the design of fuel efficient trailers for semi-trucks rendered with Cycles:

High-speed shock wave / boundary layer interactions from computational simulations:

Magnetic Field Outflows from Active Galactic Nuclei

Over 500 million polygons show simulation of processes related to the efficient production of ethanol from cellulose as part of the US energy-policy goals:

We normally don't render on Jaguar simply because it is busy and we have our own clusters with thousands of cores available. The way we normally use any of the compute resources is to generate frames in parallel: we assign 1 to N frames per node and use hundreds of nodes simultaneously. We try to keep the maximum time to render a single frame under an hour, and usually around 20 minutes. Since we run this way, we don't really exploit high-speed interconnects like those available on the Cray, which is another reason we don't typically use it (it's too valuable to people who need the interconnect). We use Maya/MentalRay as well, but because of license restrictions we can never match the sheer horsepower that we can employ with Blender. So our unique situation really makes Blender a great tool for us.
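The "1 to N frames per node" scheme described above can be sketched in a few lines of Python. The `.blend` filename and the helper functions are hypothetical illustrations, but the `-b`, `-s`, `-e`, and `-a` flags are Blender's real batch-rendering command-line options:

```python
# Sketch of the parallel frame-farming scheme: split an animation into
# contiguous frame ranges, one per node, and build a Blender batch
# command for each. File name and counts are hypothetical examples.

def frame_chunks(total_frames, nodes):
    """Divide frames 1..total_frames into contiguous ranges, one per node."""
    base, extra = divmod(total_frames, nodes)
    start = 1
    ranges = []
    for i in range(nodes):
        count = base + (1 if i < extra else 0)
        if count == 0:
            break
        ranges.append((start, start + count - 1))
        start += count
    return ranges

def blender_command(blend_file, start, end):
    # Blender batch flags: -b (background, no UI), -s/-e (start/end
    # frame), -a (render the animation over that frame range).
    return ["blender", "-b", blend_file,
            "-s", str(start), "-e", str(end), "-a"]

# e.g. 1440 frames spread over 128 nodes -> 11-12 frames per node
chunks = frame_chunks(1440, 128)
commands = [blender_command("scene.blend", s, e) for s, e in chunks]
```

Each command in `commands` would then be dispatched to its own node by the cluster's job scheduler, so all ranges render simultaneously.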


So a very typical render would be to generate 60 seconds of animation at 24 frames per second, or 1440 frames. I'd take 128 nodes of roughly 16 cores each (2,048 cores) and get back 3 x 128 frames every hour, so in less than 4 hours I'd have 1 minute of HD animation. It is therefore possible to generate 60-90 second clips overnight without requiring a lot of resources (compared to what we have, anyway). From the number of nodes we have, though, you can see that we could render many minutes of video in less than 30 minutes if we needed to. We (visualization staff and scientists) are the current bottleneck, since we have vast amounts of computing resources, and these short clips are usually sufficient for most scientists' needs. The most cores I can recall using simultaneously is probably around 7,500-8,000, for rendering the exact same scientific dataset with 3 or 4 different cameras.
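The back-of-envelope throughput figures above check out; the same arithmetic in code, using only numbers quoted in the paragraph:

```python
# Throughput arithmetic from the example above: 60 s of animation at
# 24 fps, rendered on 128 nodes, with each node finishing ~3 frames
# per hour (i.e. ~20 minutes per frame).
frames = 60 * 24             # 1440 frames for one minute of animation
nodes = 128
frames_per_node_hour = 3     # ~20-minute frames
hours = frames / (nodes * frames_per_node_hour)
print(hours)                 # 3.75 -> "in less than 4 hours"
```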

The most demanding use of Blender we have is for presentations on Everest, our 35-megapixel powerwall. We render a select few animations at this resolution, which is roughly 17x that of HD (1920x1080). These frames are absolutely brutal: every rendering artifact will be visible, so it takes a lot of care to create them. They take a long time to render, and this is the sole use case where we have used many nodes to render single frames, although we still usually render one frame per node. If we are doing stereo, double the effort.
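The "roughly 17x HD" figure follows directly from the pixel counts quoted above:

```python
# Everest powerwall vs. HD: the ~17x resolution figure quoted above.
everest_pixels = 35_000_000      # ~35-megapixel display
hd_pixels = 1920 * 1080          # 2,073,600 pixels
ratio = everest_pixels / hd_pixels
print(round(ratio, 1))           # 16.9 -> roughly 17x HD
```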


  1. I want to tell everyone I know about this because it's just so cool but they wouldn't understand me.  Maybe in thirty years I'll have that kind of power.

  2. Ah, you advertise on Blender, and you have to allow Project Mango to use your machines in the spirit of giving something back?



  3. Next year Intel is coming out with its new "Knights" chip for the PC, with the equivalent of 50 processors inside, which will change the world of CG.

    This will probably also change the setup they have there at Oak Ridge! So it's a field in constant motion.

    But it's interesting to see such a high-class lab using Blender for scientific rendering.

    Happy Blending!

  4. Forgot to compliment on the super cool post. One of the most interesting ones in a while. I'm pretty positive that with the help of this material I'll be able to convince some partner universities to take a serious look at Blender for their visualization needs. Cheers :).

  5. I got the chance to work at NERSC, the supercomputing center for Lawrence Berkeley National Lab and ran some Blender rendering on the Hopper Supercomputer. Awesome Stuff!

  6. It's interesting that these large, government funded labs have working conversions between Blender and tools like VisIt and ParaView.  As a researcher who could sure use such tools, I really wish they were open-sourced!

  7. Where's the tutorial? :P
    Does anyone know how to do renders that look just like these, without actually being physically accurate?

      • If they don't... then of course they should! They spend lots of money (millions?). And I see people who ask around 15k for a complete OpenCL compositing node solution. I see more... people who ask for 5 bucks for food. So imagine: they earn money using Blender, they work with Blender, and they receive government and corporate money.
        In my opinion the biggest flaw in open source is that only a few big players with real money donate.
        A case study is a different story, of course, and very interesting, but anyway they should donate, and they should be proud of that.

  8. I am a beginner in Blender and I have a problem using the Boolean tool. When I apply a Boolean with the difference option to any object, it gives the expected output in the 3D viewport, but when I render the model, the faces where I applied the Boolean don't look right. Can anybody help me solve this problem?

  9. Hi - I'm new to Blender. I'm trying to construct a jet aircraft model, or use an existing one; the model should have accurate dimensions. Can anyone point me to a mailing list or site that may be appropriate? Thanks! - Hal
