The Google Summer of Code Blender project acceptance results are in, and like many of you, we are shocked that so few Blender projects were accepted this year (in 2005, ten projects were accepted). That being said, some amazing new features will come out of SoC this year, and we're grateful to Google for their funding and support of these projects.
Technical limitations have prevented credit from being given where credit is due. Thank you to Eugene Reilly (etr9j) for doing this article with me. -- spiderworm
Nicholas Bishop - Interactive Sculpting with Multi-Resolution Models
Mentor - Jean-Luc Peurière
One of the neatest tools to come about in open source modeling recently has been SharpConstruct, a powerful stand-alone brush modeler built originally for Linux and now also available for Windows. This summer, Nicholas Bishop, creator and head of the SharpConstruct project, will be bringing his knowledge and experience to Blender. He plans to create native Blender brush modeling tools (not Python plugins!) similar to those available in SharpConstruct, as well as a feature not yet in Sharp called Multi-Resolution Meshes.
Brush modeling is a technique that has been around for a while, but has recently become very popular thanks to software like ZBrush. If you're new to brush modeling, imagine traditional painting on a flat white canvas, where a broad, thick brush is used to create general shapes and smaller, fine brush heads are used to detail the scene, and you've begun to understand what brush modeling is like. This technique, while less useful for mechanical or architectural modeling, will likely revolutionize organic modeling in Blender.
Despite the revolution that sculpting tools bring, they have one major disadvantage: the final mesh is often extremely dense, making further changes difficult. Multi-Resolution Meshes (also known as MRM or Multi-Resolution Models) address this because subdividing the model doesn't just change the model's topology; it adds a layer of detail on top of the existing layers. So when you're editing at a high resolution (a high subsurf level), you produce small faces which you can then manipulate freely and individually. Currently you can subsurf small faces, but you do not have the full control over the individual faces that MRM will give you when you switch from editing at a higher resolution to editing at a lower resolution. For example, say you change from resolution 4 to resolution 1: changes you make to the mesh at resolution 1 will also affect the faces created at resolution 4, but you will not lose the detail. [Corrections made to the MRM description on May 25th by spiderworm and etr9j]
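To make the idea concrete, here is a minimal sketch of the multi-resolution principle in one dimension. This is purely illustrative (our own simplification, not the project's actual code or Blender's data structures): each level stores detail *offsets* relative to the subdivided version of the level below, so a coarse edit moves the fine detail along with it instead of destroying it.

```python
def subdivide(points):
    """Midpoint-subdivide a polyline: insert the average of each adjacent pair."""
    out = []
    for a, b in zip(points, points[1:]):
        out.append(a)
        out.append((a + b) / 2.0)
    out.append(points[-1])
    return out

def reconstruct(base, detail_layers):
    """Rebuild the finest level: subdivide, then re-apply each layer's offsets."""
    pts = list(base)
    for offsets in detail_layers:
        pts = [p + d for p, d in zip(subdivide(pts), offsets)]
    return pts

# Base resolution: 3 heights on a 1-D "surface".
base = [0.0, 0.0, 0.0]
# One detail layer: a fine bump "sculpted" at the higher resolution.
detail = [[0.0, 0.0, 1.0, 0.0, 0.0]]  # 5 offsets after one subdivision

fine = reconstruct(base, detail)              # [0.0, 0.0, 1.0, 0.0, 0.0]
base_edited = [2.0, 2.0, 2.0]                 # coarse edit: raise the whole surface
fine_edited = reconstruct(base_edited, detail)  # [2.0, 2.0, 3.0, 2.0, 2.0]
# The coarse edit lifted everything by 2, but the sculpted bump survives.
```

The key point is that the detail layer is stored relative to the lower resolution, which is why editing at resolution 1 can reshape the whole mesh without flattening the resolution-4 sculpting.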
A very big thank you to Nicholas Bishop, Jean-Luc Peurière, and Google for making the creation of this tool possible this summer! More information about the goals of this project can be found in the SoC proposal draft, found here:
Benjamin John Batt - Modifier Stack Upgrade
Mentor - Daniel Dunbar
In Blender 2.40 we saw the inclusion of the mesh modifier stack, and while this great new feature has enabled a lot of new customization in how a mesh is formed and rendered, there are still a number of features that it lacks. Looking to change that is Ben Batt, a Computer Science/Computer Systems Engineering student in Melbourne, Australia. His project will involve upgrading the modifier stack, upgrading the current modifiers to work with the new modifier stack, and adding 3 new modifiers.
The upgrade to the modifier stack will address several constraints in its usage. The main one is the "original data" constraint: certain modifiers require the original mesh's data and so cannot be placed in the stack after non-deforming modifiers have been added. Another identified constraint is that modifiers cannot currently be applied to curves, a capability that would be very useful for some (but not all) of them.
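For readers unfamiliar with how a modifier stack evaluates, here is a hypothetical sketch (our own names, not Blender's internals): each modifier is a function that takes mesh data in and returns new mesh data, applied top to bottom. It also hints at why the "original data" constraint bites: once a non-deforming modifier like mirror has changed the topology, a later modifier that expected the original vertex layout no longer gets it.

```python
def translate(offset):
    """A deforming modifier: moves every vertex, topology unchanged."""
    def modifier(verts):
        return [(x + offset, y, z) for (x, y, z) in verts]
    return modifier

def mirror_x(verts):
    """A non-deforming modifier: changes topology by adding mirrored vertices.
    Any later modifier that assumed the original vertex count would now break."""
    return verts + [(-x, y, z) for (x, y, z) in verts]

def evaluate(stack, verts):
    """Run the stack top to bottom, feeding each result into the next modifier."""
    for mod in stack:
        verts = mod(verts)
    return verts

result = evaluate([translate(2.0), mirror_x], [(1.0, 0.0, 0.0)])
# [(3.0, 0.0, 0.0), (-3.0, 0.0, 0.0)]
```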
The project will also introduce three new modifiers: Autosmooth, UV Projection, and Displacement.
- The Autosmooth modifier will perform the same function as Blender's current Mesh Auto Smooth option, but the results will be visible in Blender's 3D view window without needing to perform a render.
- The UV Projection modifier will change a mesh's UV coordinates by projecting them onto the mesh using the matrix of a helper object.
- The Displacement modifier will use a texture to displace vertices of a mesh. This will act similarly to the texture displacement option already available in material settings, but the results will be viewable in the 3D viewport and subject to other mesh modifications.
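As a rough illustration of the Displacement idea (our own sketch, with made-up names, not the project's code): sample a texture at each vertex's UV coordinates and push the vertex along its normal by the sampled value times a strength factor.

```python
def checker_texture(u, v):
    """A simple procedural texture returning 0.0 or 1.0 in a checker pattern."""
    return float((int(u * 4) + int(v * 4)) % 2)

def displace(verts, normals, uvs, texture, strength=0.5):
    """Offset each vertex along its normal by the texture value at its UV."""
    out = []
    for (vx, vy, vz), (nx, ny, nz), (u, v) in zip(verts, normals, uvs):
        h = texture(u, v) * strength
        out.append((vx + nx * h, vy + ny * h, vz + nz * h))
    return out

verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
normals = [(0.0, 0.0, 1.0), (0.0, 0.0, 1.0)]
uvs = [(0.1, 0.1), (0.4, 0.1)]
displaced = displace(verts, normals, uvs, checker_texture)
# [(0.0, 0.0, 0.0), (1.0, 0.0, 0.5)] -- only the second vertex lands on a
# white checker square, so only it is pushed up along its normal.
```

Because this runs as a modifier rather than at render time, the deformed result stays live in the 3D viewport and can feed into further modifiers down the stack.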
As you might imagine, we are very excited about this SoC project as well. Thanks go out to Ben Batt, Daniel Dunbar, and once again Google, for their work and support to bring these much needed features into Blender. For more information, take a look at the project page:
Dmitriy Mazovka - Sky Generator
Mentor - Kent Mein
Creating a sky for a scene typically amounts to just throwing a big sky picture into it. But where is the control over the sky's look and feel? How do you light the scene to mimic the sky image? How do you animate the scene with a static sky image? Dmitriy Mazovka has proposed to solve the riddle of the sky with his ambitious SoC proposal.
First on the list is a controlled, physics-based simulation of skylight, sunlight, and aerial perspective effects, together with cloud simulation based on cellular automata, all of which can be stored as an environment map. Additionally, we'll be able to animate the sky and watch the sun walk across it as the clouds float by.
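To give a flavor of what animating a sun entails, here is a deliberately crude sketch (our own simplification, not the proposal's method): map the time of day to an elevation angle and derive a light direction vector, which a sky model could then use to tint skylight and drive shadows frame by frame.

```python
import math

def sun_direction(hour):
    """Sun direction for hour in [0, 24), rising at 6:00 and setting at 18:00.
    Returns a unit vector: x is horizontal, z is up (y ignored for simplicity)."""
    elevation = math.pi * (hour - 6.0) / 12.0  # 0 at 6:00, pi/2 at noon
    return (math.cos(elevation), 0.0, math.sin(elevation))

for hour in (6, 12, 18):
    x, _, z = sun_direction(hour)
    print(hour, round(x, 3), round(z, 3))
# 6:00  -> sun on the horizon
# 12:00 -> sun directly overhead
# 18:00 -> sun setting on the opposite horizon
```

A real physically based model would add latitude, date, atmospheric scattering, and color, but even this toy version shows how keyframing a single time-of-day value could drive both the sky's appearance and the scene's lighting.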
Not enough? Next is the creation of scene objects that can be used in constructing the lighting environment based on the generated sky properties. The proposal gives the example of light shafts being generated automatically from volumetric clouds based on the type of light you've added to the scene.
Integrated sky creation is certainly a sign that the Age Of Blender is upon us! Thanks in advance to Dmitriy Mazovka, Kent Mein as the mentor, and once again to Google for bringing this functionality to Blender. A bit more information can be found here:
Although many fine project proposals did not make it this year, we're very excited about those that did. All the best to the coders and everyone else helping with these projects. We will continue to keep everybody advised on the progress of these projects here at BlenderNation.com.