
"Next Gen" - Blender Production by Tangent Animation soon on Netflix!


A friendship with a top-secret robot turns a lonely girl's life into a thrilling adventure as they take on bullies, evil bots, and a scheming madman. Next Gen, a Netflix Film, launches Sept 7.

Last week the news broke that Tangent Animation created 'Next Gen', and that this full movie will play on Netflix! I've received confirmation that this is correct, and Tangent has offered to provide us with some more information about the production (as far as they can; there are obviously restrictions in place here). You can leave your questions below and we'll make a selection!

Jeff from Tangent Animation writes:

I can confirm or dispel a number of things that have been speculated on, such as Tangent not being the primary production facility for this movie (only storyboards were done outside of our facility), or our use of Blender in our pipeline (we’re effectively 100% Blender, other than plugging in apps in a few areas to supplement departmental workflows).

The budget was also 5x that of Ozzy, for those wondering why there is such a large difference in the quality of the movies. Ozzy bootstrapped our company, and explicit choices were made about the artistic execution of that movie so we could deliver it on that budget.

Let me know what you’d like as backstory/information to supplement your Blendernation article, and I’d be happy to provide what I can.

So you heard the man, folks! What would YOU like to know about creating a Hollywood production in Blender?

About Author

Bart Veldhuizen

I have a LONG history with Blender - I wrote some of the earliest Blender tutorials, worked for Not a Number and helped run the crowdfunding campaign that open sourced Blender (the first one on the internet!). I founded BlenderNation in 2006 and have been editing it every single day since then ;-) I also run the Blender Artists forum and I'm Head of Community at Sketchfab.

45 Comments

  1. This looks fantastic! I have a few questions: What were the biggest hurdles in terms of creating a feature-length movie in Blender? How big was the production team? Was it rendered in Cycles? Which version of Blender was used? How were the crowds achieved? I'll probably have more questions later. I'm looking forward to seeing this when it hits Netflix.

    • I can answer these questions for you (hope Jeff doesn't mind)

      I was a Senior Lighting Artist on the movie

      There were quite a few hurdles to overcome to get Blender "pipeline ready". Thankfully we had a great dev team that was supporting the production the whole way through.

      Early on, motion blur was a big problem, causing huge render times, but thanks to Stefan's implementation of Embree we were finally able to render motion blur with a predictable increase in render times instead of the random spikes we were getting before.

      Memory issues were another huge problem. Unfortunately Blender isn't very efficient with memory, with most of our shots averaging 60-70GB of RAM due to a lot of factors. While it's true that on a feature film production RAM consumption is always going to be heavy, Blender could still use a lot more efficiencies in that area (this is also why GPU rendering wasn't used on the movie).

      There were many other issues (no UDIM support, no OpenVDB support for FX sims at first, though it was introduced later, a very clunky and slow compositor, etc.), but we did our best to work around them and even used some limitations to our benefit. For example, since the compositor in Blender is too slow for most compositing tasks, we opted for doing as much "in camera" as possible. This means rendering motion blur and DOF, where most lower-budget features (even some higher-budget films) would often fake them in comp (see the sketch below this comment).

      Yes, Cycles was used for everything, though our version of Cycles was modified (with Stefan's Embree core, and Cryptomatte, which was beyond valuable for compositing). The version of Blender we used was the studio's own dev version (which I believe was using Blender 2.78 as its base?).

      The crowd workflow was another big challenge in the movie. Unfortunately I don't know a lot about how the workflow was built, so I can't answer how the crowds were set up, but I can say they were all rendered in the shot.
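
      For anyone who wants to try the "in camera" approach Justin describes, here is a minimal sketch of enabling true motion blur and depth of field through Blender's Python API. The shutter and lens values are placeholders, and the DOF properties follow the current 2.8x+ API rather than whatever Tangent's custom 2.78 build exposed:

          import bpy

          scene = bpy.context.scene
          scene.render.engine = 'CYCLES'

          # Render real motion blur instead of faking it with vector blur in comp.
          scene.render.use_motion_blur = True
          scene.render.motion_blur_shutter = 0.5   # shutter time in frames (placeholder)

          # Real depth of field on the shot camera rather than a post blur.
          cam = scene.camera.data
          cam.dof.use_dof = True                   # camera DOF settings (2.8x+ API)
          cam.dof.focus_distance = 3.0             # placeholder focus distance in metres
          cam.dof.aperture_fstop = 2.8             # placeholder f-stop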

      • Matthew Burkey

        Thanks Justin, that's really useful information :) I'd love to see a behind-the-scenes feature on the making of this film and/or some breakdowns one day. It's very inspiring to know that the tool I use every day is capable of creating a fully fledged Hollywood-grade production. Time for me to skill up! :D

      • Rombout Versluijs

        Why do all the comps in Blender if it's too slow? There are tons of other apps.

        PS: you said something about not using GPUs, which means CPU rendering and therefore tons of processors. I don't quite understand why you had problems while there are quite a few render farms around, and blender.org is making their films even using GPUs.

        Most films do post in other dedicated apps for easier and better compositing. There are quite a few apps for dealing with DOF as well. Was rendering vectors and DOF that heavy then?

      • Mario Camarillo

        There is a way to use UDIMs with color, displacement and other maps, Justin. I'm also a senior lighting and surfacing artist: you can make a node group with an offset translation, and it works for up to 10 tiles in U and 10 in V. It's a master uber node; you can customize it and propagate the group node without any issue (a rough scripted version of the idea is sketched below this thread). Take a look at the image I attached.

        • Pierre Schiller on

          @Mario Camarillo, could you please post this picture on another site so we can read it? The picture doesn't open, nor does it save in HQ. This looks interesting (UDIM hack).
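
        For anyone curious, here is a rough, untested Blender Python sketch of the offset-based UDIM workaround Mario describes: one Image Texture node per tile, with a Mapping node shifting the UVs back into 0-1 space before sampling. The node and socket names assume a recent 2.8x+ API, and "add_tile" is just an illustrative helper, not a tool from the production:

            import bpy

            # Each UDIM tile gets its own Image Texture node; a Mapping node shifts
            # the UVs so that tile lands in 0-1 space, and CLIP extension makes the
            # tile contribute nothing outside its own square.
            mat = bpy.data.materials.new("udim_offset_workaround")
            mat.use_nodes = True
            nodes, links = mat.node_tree.nodes, mat.node_tree.links

            uv = nodes.new("ShaderNodeUVMap")

            def add_tile(tile_number, image, x=0):
                """Sample one UDIM tile (1001 = column 0 / row 0, 1002 = column 1, ...)."""
                col = (tile_number - 1001) % 10
                row = (tile_number - 1001) // 10

                mapping = nodes.new("ShaderNodeMapping")
                mapping.location = (x, 0)
                mapping.inputs["Location"].default_value = (-col, -row, 0.0)

                tex = nodes.new("ShaderNodeTexImage")
                tex.location = (x + 200, 0)
                tex.image = image
                tex.extension = 'CLIP'

                links.new(uv.outputs["UV"], mapping.inputs["Vector"])
                links.new(mapping.outputs["Vector"], tex.inputs["Vector"])
                return tex

        The per-tile outputs would then be added or mixed together and fed into the shader; wrapping the whole thing in a node group is what makes it reusable across assets, as Mario suggests.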

      • I watched the film last night and was very impressed with the lighting. It looks fantastic, and is very confident and realistic. I guess whoever was leading it had a VFX background? Amazing to then find out it was Blender/Cycles and not Maya/Katana with Arnold/PRMan. Very impressed; I only noticed render artefacts (noise) in one shot, in the DOF. Well done.

  2. "The budget was also 5x that of Ozzy"..can we have a rough expenditures percentage? want to know what is the most expensive or costly part in the production...that may help new studios plan their projects.
    Congratulation and Thanks in advance!!
    Moh Lotfy

  3. Hey! Great to have this chance!
    Here we go!

    1. Do you use Alembic for lighting? If so, how do you shade it, and how do you use the hair particle system with it? (pipeline step-by-step)
    2. Do you use VDB, and in which software do you generate it? If outside Blender, then how do you import it? I heard about your own Blender build for importing Houdini's VDBs.
    3. What software do you use for texture painting?
    4. How do you simulate dynamics and store caches? I've had some rough times with caches being rewritten accidentally and so on. Do you have some in-house software for managing it?
    5. What are the average system memory requirements to render a shot? Do you use NVIDIA Quadros with a lot of VRAM, or can you only render on a CPU render farm?
    6. Do you use a wider gamut? Blender renders in Rec. 709 only, but the industry standard is DCI-P3 and it's become even wider now with ACES. Do you use Rec. 709 and Filmic, or somehow ACES and its LUTs?

    Pheeew... I've been waiting for it for so long :)

    Thanks!

    • I can answer these questions for you (hope Jeff doesn't mind)

      I was a Senior Lighting Artist on the movie

      1) We used Blender linking for almost everything; certain things were Alembic (like water sims from Houdini). I'm not going to go into a pipeline step-by-step because that would be the length of a novel, haha.

      2) We did use OpenVDB eventually, once the devs were able to implement the ability for Blender to import and render VDBs.

      3) I believe they used Substance Painter and Photoshop (I wasn't in the asset department).

      4) Caches were stored in the FX files, nothing fancy there. Dynamics (like cloth) were baked out into Alembics.

      5) Average memory usage was 60-70GB (120-140GB for bigger shots); this movie would have been impossible to render with GPUs.

      6) We rendered in linear space and used Filmic as a LUT.
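
      In case it helps anyone reading along, this is roughly what "linear working space plus Filmic as the view LUT" looks like in vanilla Blender's color management; the look and exposure values below are placeholders, not Tangent's actual settings:

          import bpy

          scene = bpy.context.scene
          scene.display_settings.display_device = 'sRGB'
          scene.view_settings.view_transform = 'Filmic'   # Filmic applied as the view transform / LUT
          scene.view_settings.look = 'None'               # placeholder; pick a contrast look as needed
          scene.view_settings.exposure = 0.0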

      • Thanks a lot, Justin!

        And one more: Blender is memory-inefficient partly because it lacks render-time texture mipmapping. Why didn't you implement it, or is that not a trivial task to do?

        • Time. The production was on a super tight schedule and we just didn't have the time to develop that part (there were bigger fish to fry at the time). Stefan did do a rough implementation, but it needed more work, so it just got put on the shelf.

          • Rombout Versluijs

            Sorry to ask, but I see the name Stefan pop up quite a few times. Who is he? I don't know him as a dev.

      • Rombout Versluijs

        PS: did you guys properly manage textures? I mean, you can save a lot of space doing that, I guess. But that would mean custom add-ons and such.

  4. Also, another question: will Tangent be doing another talk at Bcon next year about this? I'm sure I and many others would love to see a "making of" type thing.

  5. Beautiful movie. I like the crowds, and the colors are perfect, like Baymax in Big Hero 6. The character movement looks good. Cool.

  6. Did the team have close communication with the Blender developers throughout the creation of Next Gen? Are there things you suggest(ed) be added to Blender to make it more friendly for feature film production? In the future, if Tangent sticks with Blender, will they continue to use their modified version, or move onto 2.8 and implement their custom developments into 2.8? Also, is there any intent to get Embree and Cryptomatte implemented into Blender?

  7. Is it true that the head of Tangent, Jeff Bell has the strength of ten men and once killed a bear by yelling at it?

  8. That is what I heard, Kevin.

    Yegor, we used the hair particle system, and it was animated with a combination of cloth sims driving curve guides, as well as a rig to fine-tune or do specific animations. It was a fairly manual process that could be scripted or simplified in the future. It would be nice to have access to the guide hairs that are generated from the hair system itself, but we approximated it by converting the parent hairs to geo, then to curves (a rough scripted sketch of that conversion follows this comment). This area of the particle system in Blender still needs a lot of work.

    I believe our intent is to continue developing custom tools for future projects, and get our tools back into the standard build of Blender when they are ready. When and if that happens is something to be discussed with the Blender Foundation. That's not really my area, so take what I say with a grain of salt.

    BTW, I was Modeling Supervisor on the show. I jumped around a bit, did some surfacing on 7723 and I also joined the lighting department once the modeling wrapped.
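
    For readers who want to experiment, here is a rough sketch of the parent-hair approximation described above: converting a hair particle system to edge geometry and then to curves. The object and modifier lookups are my own assumptions, not Tangent's actual tool:

        import bpy

        emitter = bpy.context.object  # the emitter carrying the hair particle system

        # The particle system is exposed as a modifier; look it up instead of
        # hard-coding its name.
        psys_mod = next(m for m in emitter.modifiers if m.type == 'PARTICLE_SYSTEM')

        # Convert the hair to edge geometry (the "Convert" button on the modifier).
        bpy.ops.object.modifier_convert(modifier=psys_mod.name)

        # The converted mesh becomes the active object; turn it into curves that
        # can then be driven by cloth-simmed guides or a rig.
        bpy.ops.object.convert(target='CURVE')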

  9. Hello. I have an opportunity to ask now, thanks a lot.
    Comparing the whole workflow and overall experience to Maya, could you tell me whether it was an easier or faster experience? And what would you say are Blender's strengths that you discovered in the process of making this film?

    • Hi! I was the Surfacing Supervisor on the film!
      Substance Painter and Photoshop were great tools to create texture maps and lay out the base for a lot of the assets used in the film. We created a lot of shader node networks in house, and some custom hair and skin shaders too! The challenges came when we were doing massive sets and needed techniques to cover wide areas while retaining resolution, something we couldn't do with sets of textures from Painter alone.
      So long story short: shaders totally in Blender, created with methods that I'm sure you guys have seen, and as much procedural stuff as we could get away with. Substance Painter and Photoshop to paint details where we needed them, and for all of the human characters!
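
      Purely as an illustration of the node-group approach (not Tangent's actual shaders), here is a tiny sketch of a reusable group that overlays a detail map on a base map, the kind of building block you would propagate across a big set. Socket creation uses the pre-4.0 Python API:

          import bpy

          group = bpy.data.node_groups.new("detail_overlay", "ShaderNodeTree")
          group.inputs.new("NodeSocketColor", "Base")
          group.inputs.new("NodeSocketColor", "Detail")
          group.outputs.new("NodeSocketColor", "Color")

          nodes, links = group.nodes, group.links
          inp = nodes.new("NodeGroupInput")
          out = nodes.new("NodeGroupOutput")
          out.location = (400, 0)

          # Overlay the detail on the base; the factor could be driven by camera
          # distance or a mask to hold up resolution on wide set pieces.
          mix = nodes.new("ShaderNodeMixRGB")
          mix.location = (200, 0)
          mix.blend_type = 'OVERLAY'
          mix.inputs["Fac"].default_value = 0.5

          links.new(inp.outputs["Base"], mix.inputs["Color1"])
          links.new(inp.outputs["Detail"], mix.inputs["Color2"])
          links.new(mix.outputs["Color"], out.inputs["Color"])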

      • I thought it was really odd that Netflix didn't include Chinese as one of the language options. Excellent work, Tangent team! An incredible milestone for Blender, with great world development and animation.
