Appleseed is an open source renderer. You can export from Blender to Appleseed files using Blenderseed.
Appleseed describes itself as:
appleseed is an open source renderer with a focus on quality and correctness. It started as a platform for rendering research but the end goal has always been to develop a high-end renderer for a wider audience.
The renderer has a good set of features, and I'd like to hear how you think it stacks up against other Blender renderers!
The links aren't tagged correctly, we have to copy/paste them.
Whoops! Fixed, thanks.
It doesn't look bad at all but.... why another open source renderer, again!? What is making it really different from LuxRender, for instance? Those guys are all talented, they should gather to make an excellent renderer...
For instance, LuxRender doesn't have a "Studio" application and they are thinking of making one to avoid having to make materials in other software, while appleseed seems to have a great studio... I see them as complementary, not fundamentally different.
Appleseed predates Lux and is a ray tracing-based renderer, rather than another "unbiased, physically-based renderer".
Franz here, the creator and principal author of appleseed.
freem: this is a very legitimate question that we often get asked. The short answer is, appleseed is actually much older than LuxRender, Mitsuba, Cycles, YafaRay, etc. When we started the project, there weren't that many open source physically based renderers, and even fewer that could render flicker-free animations.
To recap very briefly:
* We are a bunch of experienced rendering engineers and VFX dudes from the industry ("we" refers to Jupiter Jazz: http://www.jupiter-jazz.com/).
* The development of appleseed is financed.
* We're building an open source renderer using a very permissive license (MIT) that allows commercial embedding (but appleseed is and will stay open source).
* We're building a physically-based renderer that permits tricks, shortcuts and non-physically-based setups to allow greater artistic flexibility and shorter rendering time at the expense of "physical correctness" (whatever that means).
* We're working on the "hard" problems of rendering: fully raytraced deformation motion blur, high quality displacement, fully programmable shading. Few open source renderers (if any) provide or plan to provide these three features in the same package.
There's much work to do, but we're dedicated, patient, and financed. We welcome any help, especially on Blenderseed, our Blender-to-appleseed translator. We made a very short test animation last year (WRECK). This year we'll be financing and making a one-minute-long animation to test appleseed "in the field" and showcase some of its features.
There's a thread on BlenderArtists with some more details and answers: http://blenderartists.org/forum/showthread.php?230882-appleseed-renderer-(now-with-experimental-Blender-exporter!)
It's commendable what you're doing. But, GPU rendering is the way of the future. If you don't jump on that bandwagon soon, you will be left in the dust.
BTW, I like the logo, very creative. :)
The logo is the work of Paolo Berto, the main founder of /*jupiter jazz*/ (the collective of VFX guys) and Jupiter Jazz Ltd. (the company behind AtomKraft, an innovative renderer for Nuke and After Effects).
Jupiter Jazz: http://www.jupiter-jazz.com/
This is very debatable. For instance, the future of production rendering is considered by many to be embodied by Arnold (Cloudy with a Chance of Meatballs, Monster House, 2012, etc.), a physically-based unidirectional path tracer running exclusively on the CPU (with no plans in the near future to implement GPU rendering for final frame rendering).
We're very open to GPU rendering, but as of today we feel it's not the right technology to achieve our goals (rendering of very large scenes using motion blur everywhere, and eventually render-time displacement and programmable shading, without hitting driver issues left and right).
Your mileage may vary. Cycles is an awesome renderer :)
This sounds awesome, especially addressing the "hard" problems! Shame I didn't hear of this until now.
Is there a roadmap about appleseed development available?
You didn't hear about it earlier because we didn't really advertise it yet. Although it has been in the works for several years, we feel it's not ready for prime time except maybe for the most adventurous open source enthusiasts :)
Hence the "alpha" tag, by the way. As you'll see if you give it a try, there are lots of obvious features missing, like the ability to change the frame resolution (!) (right now you need to edit the scene file to do that).
Lots of rough corners such as this one should be smoothed out in the next few releases, before we start the next round of "core" developments.
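For the adventurous, the hand edit mentioned above looks roughly like this. The sketch is based on the XML project files shipped with the appleseed test scenes; exact element and parameter names may differ between alpha releases, so treat it as illustrative rather than authoritative:

```xml
<!-- Inside the <output> section of a .appleseed project file:      -->
<!-- the frame's "resolution" parameter sets the output image size. -->
<frame name="beauty">
    <parameter name="camera" value="camera"/>
    <parameter name="resolution" value="1280 720"/>
</frame>
```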
Although long overdue, we don't have a publicly available roadmap yet. That being said, some posts of mine in this thread (in particular the answer to blender_and_CAD_user below) outline our ideas for the next few releases.
Why another open source renderer? It's simple: people think that that "other" piece of software, in this case an external renderer, is going to be the solution to all their problems and is going to magically make their renders come out the way they want them to. A lot of people think that Blender's internal renderer sucks or is incapable of doing what they want, so instead of *making* Blender's internal renderer do what they want, they run off to another external rendering engine in the hope that there is a magical button they can push for an excellent render. They fail to realize the problem isn't with Blender's internal renderer but with their lack of skill in using it.
Secondly, it helps the open source community to have more than just a few options out there for external rendering. More people developing open-source software actually helps the cause of free and open source software.
Third, there is no end-all be-all software solution.
Oh no, come on! Not another one! The jungle of renderers gets bigger and bigger... you'd be better off concentrating your work on helping finish ONE GOOD, fully production-usable open source renderer instead of developing a hundred unfinished, incomplete renderers.
It actually predates Cycles, Mitsuba, LuxRender, YafaRay, etc (just wasn't made public until July 2010).
To all who say "why another one" you could ask, "why don't Microsoft and Apple join forces and create the bestest operating system evar?"
Because they are production ready softwares with fundamental differences?
'Cause Linux will destroy them :)
Because they are both proprietary software corporations competing for your money instead of trying to work together to produce a useable solution unhindered by copy protection and the inevitable system-wide slowdown this entails?
Now now, competition is also fertile soil for the ambition to do better than your rival. No competition would also entail a system-wide slowdown.
Then again, I'm assuming that these companies cannot be proactive without a counterpart, which could also be wrong.
Talk about proprietary—while doing a bit of Web searching to find out about IES profiles (that someone mentioned elsewhere on this page), I came across several mentions of a lighting-simulation tool called “Lightscape”, dating back to about a decade ago. Quite a few people did research using this tool. And guess what—it was a proprietary tool, which is now dead. Looks like Autodesk bought the company making the product, then shut it down. (Happens quite a lot with company acquisitions, doesn’t it?)
So what happens to all that research work that was done with it? What happens to others trying to make use of that research, replicate it, build on it? Looks like they pretty much have to start again.
A lesson in there, about falling into proprietary traps, don’t you think?
That's exactly what I thought. Many small renderers, but none that can really handle every production feature in one package.
About the CPU/GPU development debate: many improvements are coming on that front. I think soon there will be no need to write code differently just because it has to run on special hardware (a GPU).
How would that work... would it depend on writing new compilers for existing programming languages?
Very interesting renderer!
As a new user of this renderer, what is the fastest/best way to get help if I run into bugs or need a feature that is not currently implemented?
Does the Appleseed renderer support IES photometric profiles? I ask because I could not see it in its feature list.
Lastly: please update Blenderseed for Blender 2.63, with its new BMesh (ngon) modelling capabilities. Many architects will be (very) happy!
The best way to ask for help or request features is our forums. There are a couple of us watching there and we'll usually answer within 24 hours, often in much less time. All feedback, including user frustrations and WTF moments, is always welcome. We've been working on some crazy renderers over the years (mental ray, for instance) and we know how frustrating it can be for users and how important it is to have quick answers and responsive support.
appleseed doesn't support IES light profiles yet. They are not on the short-term roadmap but our plans are definitely not cast in stone.
Blenderseed will see a major update within the next few weeks; it would indeed be nice to support ngons natively (appleseed already supports ngons).
About our short term plans: the next release (alpha-13) will bring some much needed improvements to appleseed.studio (the ability to change render settings, for a start, but also fills some of the obvious functionality gaps here and there) and to Blenderseed (export of textures). The release after that (alpha-14) will probably be an all-around improvements release where we'll be fixing bugs, fine-tuning code and improving our translators as we work on our next short animation. We'll probably also improve the texturing subsystem (in particular the texture cache) in this release.
Our plans for the alpha-15 release are fuzzier. High on our list are render-time, crack-free and artifact-free displacement, and the integration of programmable shading via OSL.
Thanks for the very informative post!
You will definitely see me in the coming weeks in your forums! I will keep my nick name there, too. :)
I would only want to add to my previous post that ALL architects around me are evaluating replacing 3ds Max with Blender, because of the ngons. I do not know how many of them would need the IES profiles. However, I do know from experience in dealing with them that they choose software according to bullet points. If ALL points are met, then they are very happy users of that software. If not, then they move along. That is why I asked about IES profiles in my previous post: for example, LuxRender and YafaRay have them half-implemented.
Note that as of yesterday, Blenderseed 1.1.6 works in Blender 2.63 and supports BMesh natively.
Now, a quick question: I saw in the features that it supports voxel data. Does this mean that I can do a smoke sim in Blender and render it out in appleseed without compositing? Because if that is the case, I will kiss Cycles goodbye.
appleseed contains a primitive (but fast) volume renderer that can render smoke represented as voxel data. At this point this is more an experiment than a real feature, especially since the shading model is rather crude, but it works, it's very fast and it could be transformed into an actual feature with reasonable efforts. Blenderseed would also need to be extended to export volume data to appleseed. This is all very doable.
If you're curious, you can download a very simple test scene (an export of a Maya fluid) and render it with appleseed. Grab the scene here: http://appleseedhq.net/stuff/smoke.zip
Proper volumetric rendering is absolutely in our plans.
Well, I for one welcome Appleseed to my lineup. Apparently, I must be one of those rare guys who thinks the best option in life is options. I mean, it's free software--what's all the complaining about? You can't have too many renderers, in my opinion. They each operate differently, and are developed with different goals in mind.
I mean, if you compare Cycles with LuxRender, or Mitsuba with YafaRay, you'll see that they each have their own aims in mind, they each have their own unique architecture fitted to those aims, and they each focus on certain strengths the others don't make their highest priority. Cycles is the least physically accurate renderer of them all, but it's the best one for cheating physical accuracy and producing some quite impressive results pretty quickly. LuxRender is probably the most physically accurate of the open-source renderers, but it's the slowest, too. YafaRay is a wonderful raytracing engine with a "faceless" interface that allows for easy conversion of Blender materials and textures. And Mitsuba is a physically-based renderer with great volume rendering and microfacet (micro-flake) rendering. Appleseed's got its own advantages and aims as well, such as being a physically accurate renderer that allows for interactivity with the scene as you render, and producing output images that contain no artifacts. If you can't see the different advantages of these distinct renderers, or can't muster enough curiosity to find out just why they each have their own architecture and aims, then that's your problem.
I wish more people in daily life were programmers, because the level of appreciation tends to be compromised among many of those who don't actually build this stuff. Some folks think we programmers can just pull this stuff out of our asses. They tend to think that every developer has to be on the same "team" (though they're typically the ones to needlessly group such developers as "competitors" to begin with), just because they might be doing something quite similar (though justifiably different) to another developer's work. Some will demand features or undertakings as if it's all owed to them. And when they find a flaw in something, the people who simply and kindly address the issue are usually the minority.
I don't see why in the world it would be a problem for Appleseed to be an additional option available to the world of open-source renderers (though, as has already been pointed out, this renderer actually predates the other open-source render engines mentioned). I mean, it's not like someone's putting a gun to anyone's head. If you don't want to use it, don't! But you don't have to put the developer(s) of this software down about it! Honestly, I don't see how this addition would contribute to a "jungle of renderers" unless you're too busy trying to keep up with them all, to your own dissatisfaction. Even if somebody wanted to create a completely redundant addition to the open-source software world (something which NONE of these renderers are), THEY CAN! People are free to do it with their lives, so why make life any more stressful than it already is?!
(Sorry to the rest of the folks this doesn't pertain to...don't mean to get so disturbed about it, but it's just the violation of simple principle, you know?)
I agree with you completely. I am of the opinion that it's better to have options with software. It's near impossible to make a renderer (or any kind of software for that matter) that satisfies every user's every need. For example, I myself use different painting programs depending on what I am painting. I believe that instead of a single 'jack of all trades' program, it's better to have many specialist programs. And honestly, why should someone be forced to modify software they don't want to work on in the first place? Many developers would rather create their own programs than maintain someone else's.
Of all the software I know, Blender plays the 'jack of all trades' role exceptionally well, but there are areas (e.g. no event-driven particles, no support for ATI GPUs) that are very weak compared to specialist software. Of course, those areas will be worked on eventually :D But that doesn't mean nobody should develop Blender-compatible options for them :)
Is it possible to assign an emission/absorption spectrum (from a data file of wavelength vs. intensity) to a surface instead of an RGB value?
Let me start with a small note: as it turns out, appleseed is a fully spectral renderer (it internally operates with 31-channel colors). We started with a fully spectral pipeline because it is easier to do so than to retrofit an existing RGB-only renderer to use spectra. We now have plans to (transparently) support RGB-only pipelines for increased performance.
We're currently finishing the first revision of the appleseed Project File Format reference (it's a Google Document, I can share it with whoever is interested in having a look while it's being worked on).
In the meantime (or in addition), I suggest you have a look at the Cornell Box test scene that is provided with appleseed, as it uses the actual spectral reflectances and emittance from the official Cornell Box specification. If you open the file scenes/cornell box/cornell box.appleseed with a text editor, you will see that colors there are defined as spectra over the 400-700 nm range (see for instance lines 19-31).
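For readers without the scene handy, a spectral color definition in that file looks roughly like the sketch below. The element layout follows the XML test scenes shipped with appleseed and the values are illustrative 31-sample reflectances, not an authoritative copy of the Cornell Box data:

```xml
<!-- A reflectance defined as 31 spectral samples over 400-700 nm -->
<!-- (10 nm spacing) instead of a 3-channel RGB triple.           -->
<color name="white">
    <parameter name="color_space" value="spectral"/>
    <parameter name="wavelength_range" value="400.0 700.0"/>
    <values>
        0.343 0.445 0.551 0.624 0.665 0.687 0.708 0.723
        0.715 0.710 0.745 0.758 0.739 0.767 0.777 0.765
        0.751 0.745 0.748 0.729 0.745 0.757 0.753 0.750
        0.746 0.747 0.735 0.732 0.739 0.734 0.725
    </values>
</color>
```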
I guess what we would need now is a way for Blenderseed to load some spectral file format (which?) so that it can emit spectra in .appleseed files. If anyone's interested in implementing that, the contribution is most welcome. I think it should be a fairly quick and easy project.
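As a starting point for such a contribution, the resampling step itself is straightforward. The sketch below linearly interpolates a list of (wavelength, intensity) pairs, assumed to have been parsed from whatever data file format gets chosen, onto the 31 uniform bands over 400-700 nm mentioned above. The function name and data shape are hypothetical, not part of any existing Blenderseed API:

```python
def resample_spectrum(samples, bands=31, lo=400.0, hi=700.0):
    """Resample (wavelength_nm, intensity) pairs onto uniform bands.

    The 31-band / 400-700 nm layout matches what the post above says
    appleseed uses internally. Wavelengths outside the measured range
    are clamped to the nearest endpoint value.
    """
    samples = sorted(samples)
    step = (hi - lo) / (bands - 1)
    out = []
    for i in range(bands):
        w = lo + i * step
        if w <= samples[0][0]:          # below measured range: clamp
            out.append(samples[0][1])
        elif w >= samples[-1][0]:       # above measured range: clamp
            out.append(samples[-1][1])
        else:
            # Linear interpolation between the two surrounding samples.
            for (w0, v0), (w1, v1) in zip(samples, samples[1:]):
                if w0 <= w <= w1:
                    t = (w - w0) / (w1 - w0)
                    out.append(v0 + t * (v1 - v0))
                    break
    return out

# A coarse measured spectrum (wavelength in nm, relative intensity):
measured = [(400.0, 0.2), (550.0, 0.8), (700.0, 0.4)]
spectrum = resample_spectrum(measured)  # 31 values, ready to emit
```

Clamping out-of-range wavelengths is a common (if crude) choice; a real importer might instead zero out bands that were never measured.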
luxrender, mitsuba, cycles, nox... and so on...
why not a biased render engine?
ok, it's the future... but this future is not so near... a good workstation for GPU rendering is expensive (at least two graphics cards... at least...)
Agreed, I hope Blender Internal will still be expanded even though everyone is now focusing on Cycles. :/
Unlike some other render engines, appleseed is not specifically targeting 100% "unbiased" setups. It's definitely possible to have a physically-based setup via careful choice of parameters, but it's also possible to trade correctness for speed.
In addition, and again unlike many other renderers, appleseed supports mixing of physically-based and non-physically-based materials. See for instance these test renders: http://appleseedhq.net/images/fast-approximate-sss-green-killeroo-test and http://appleseedhq.net/images/fast-approximate-sss-red-killeroo-test. The creature is using a very fast (interactive) subsurface scattering model which is NOT physically-based. It however integrates correctly within a scene that does use physically-based materials.
The point is, we're not trying to duplicate what LuxRender, Cycles or YafaRay already excel at. We're trying to do something different by addressing the problems that are not covered by most of the other open source renderers, like solid deformation motion blur (for a start).
Hope this helps clarify what we're aiming at.
The problem with there being so many different renderers is that few people have time to test them all (let alone master them), or to keep recreating material libraries for them. Then there is the risk that the one you liked most, and spent ages mastering and creating materials for, stops development or loses its primary coder to another project.
This is why I stuck to Blender Internal for ages, and now I will stick with Cycles. As long as Blender exists, its renderer exists. I have just spent several days reassigning all my models to use a Cycles material library, and I couldn't consider spending all that time again (and then some!) importing/exporting to an external renderer... all for a tiny difference in render quality that my audience would never notice.
Blender+Cycles will remain the quickest way to good looking output for Blender users, and my average GPU does it 5 times faster than my fast CPU.
Yes, and this is absolutely OK.
We're not trying to replace Cycles or any other renderer, and we're certainly not trying to "win" any users over another product. If you're happy with Cycles and it delivers the results you're expecting, by all means stick with it and do create awesome renders. As I said above, I think Cycles is an excellent renderer, I love its design and the roadmap.
Let me state it again: we're trying to build a renderer for those cases not covered by Cycles, Mitsuba, YafaRay, Aqsis, Nox, LuxRender, etc.
If you want ray traced deformation motion blur, none of these renderers can do it (as far as I know). appleseed can.
If you want ray tracing with fully programmable shading, only Aqsis comes close. But Aqsis is a REYES renderer and is definitely not physically-based like a modern renderer ought to be.
More and more productions are moving away from the traditional "everything is a hack" and "my shader has 600 parameters" situation. Arnold (SolidAngle / Sony ImageWorks) is filling the gap very quickly in the high end commercial market. We feel there should be an open source renderer on this segment, and that this renderer should use a very liberal license that allows commercial embedding (such as the MIT license which appleseed is using). We feel there should be a fast, robust, flexible open source renderer out there that can render frames that don't fit in memory, where everything is moving, robustly and predictably on a render farm that is GPU-less. We feel this renderer should be well documented. We realize this renderer will never be as easy to use as Cycles or YafaRay but that's OK.
It's really that simple. Cycles is faster, much more featured and probably much easier to use than appleseed. But until it caters to the animation market, it's not a competitor. Use whichever suits your needs better. And Cycles is definitely not going anywhere, so I reckon it's a safe bet :)
Well, I'd guess all those not-another-renderer sighs are not so much about competition, but more about joining forces - or at least about dreaming of that ;)
I do understand that developers are bound to their projects for a reason, and being engaged in a piece of work for a reasonably long time only strengthens that bond. If it were that easy to throw appleseed, Lux, Aqsis and a pinch of YafaRay into one big pan on low heat, and stir it every now and then to get the one and only big fat insane lightspeed renderer for everything, I'd advocate that approach, too.
And one complaint I understand from the bottom of my heart: Do I really have to learn another interface? *sigh* ;)
Anyway, agony of choice isn't that bad, and laying sources open might help to develop the one and only big fat insane lightspeed engine one of these days... or to give us even one more piece to choose.
Keep it on, and thanks for your work.
I understand the "sighs", I really do.
I guess the point I'm trying to get across is (and many people have already pointed this out very well): there probably won't ever be one "big fat insane lightspeed" renderer. It doesn't exist in the commercial space (as far as I know), and it doesn't exist in the open source space. V-Ray somewhat approaches this definition, but not quite.
Like many other products, building a renderer is about making choices and compromises. One choice that will fit the workflow of a Blender user will be the wrong choice for the guy rendering occlusion passes for a 4K animation. Too few settings / too many settings, too slow for small scenes / too slow for big scenes, too complicated / too powerful, etc. Naturally, when we have a chance to satisfy both types of users, we try hard to.
As for the interface: ideally, we would like to have a full-featured integration of appleseed within Blender. And why not support the node system of Cycles? Unfortunately our resources are too limited to tackle this by ourselves. We might apply to GSoC at one point and that could be an interesting project.
I love the fact that we are lucky enough to bask in different render engines. They ALL have different features and yield different effects, desired or otherwise. This can never be a problem; to us artists it only gives more possibilities.
Franz, awesome, thank you so much for your and your team's never-ending efforts. As an artist who makes a living producing various product visualizations for customers, I can only be really grateful for everything you and other people out there do. We're all a part of this, whether we use it, write manuals for it, participate in bug finding or whatever we choose to do with it; the point is that we DO IT. ;)
I'll most certainly look into Appleseed, and the longer my "list of renderers" in the artist's toolbox becomes, the more ways I have to solve customer issues. It takes time... but so did your effort to give us amazing opportunities. Thanks again!
Thanks for the kind words :)
Simple curiosity: is it faster than Blender Internal?
I would tend to say no, if only because Blender's internal renderer is much more mature and has probably been thoroughly optimized over the years for the kind of renders it's supposed to produce.
The comparison isn't really fair though, because the two renderers compute wildly different things. For a start, unless configured otherwise, appleseed will compute a mostly unbiased solution to the light transport problem (be it with or without global illumination) while this is definitely not the case (or the goal) of Blender's internal. appleseed also computes a fully spectral solution (31-channel colors instead of just RGB). In many situations this isn't useful and we do plan to support RGB only in the future for increased performance.
That being said, appleseed is quite optimized. The core is a well optimized, state-of-the-art ray tracer with a modern design. The whole architecture of the renderer is also designed to scale efficiently on multiple cores. Our tests show that this is the case, at least up to 8 CPU cores. (More testing is welcome, especially on beefier machines.)
However, don't expect Cycles speed at this stage. appleseed is currently a CPU-only renderer and, unless your scene is very complex, Cycles will definitely have the upper hand.
Do you mean that Appleseed is the main competitor of Arnold?
Excellent question :)
I personally don't think Arnold has any serious competitor at this point. Arnold is in the process of redefining what a modern production renderer should or could look like in a post-REYES era and, as far as I know, it's pretty much alone on this bleeding edge.
At this point, appleseed is far too immature to even pretend to compete with Arnold.
But, if we have an objective, it's certainly this one.
Does Appleseed renderer support 3d displacement at render time? I couldn't find information about it.