See the neat new Blender features from Google Summer of Code 2019


Check out the work done for GSoC this year: from updates to the Outliner to VR support.

Since 2005, Google has sponsored students worldwide to develop open-source software over their long vacations, and from the very first year, that software has included Blender. Over 15 consecutive Summers of Code, Google has funded over 100 Blender development projects, including work crucial to Blender's fluid simulation framework, its sculpting tools, and its animation system: one of the first recipients was future Cycles creator Brecht Van Lommel, then working on an inverse kinematics solver.

This year, 1,134 students completed Google Summer of Code 2019, including seven working on Blender projects. So which features were they working on and when can you expect to see them in Blender?

In Blender 2.81, object selection will sync between the Outliner and 3D viewport, thanks to Nathan Craddock's work from GSoC 2019. You will also be able to drag and drop to parent multiple objects.

Blender 2.81: improved Outliner, new shading nodes

Two of this year's projects are already scheduled for the next version: Blender 2.81, due for release in November. With the increased focus on the Outliner since Blender 2.80, Nathan Craddock, currently a student at Brigham Young University in the US, has been working on ways to improve its usability.

The biggest is synchronised selection: selecting an object or camera in the Outliner also selects it in the 3D viewport, and vice versa. It will also be possible to navigate the Outliner properly with the arrow keys, and to box-select objects by clicking and dragging. You can read a summary of the new features in the Blender 2.81 release notes, and see more demo videos on Nathan's Twitter feed.

Meanwhile, Omar Ahmad, who studies at Egypt's Helwan University, has been working on ways to improve life for technical artists by adding shading nodes for the Cycles and Eevee renderers, including new types of procedural noise, and more math operations. You can see the new workflows that this opens up in our sneak peek at Blender 2.81, and details of all the new nodes in Omar's project report.

In Blender 2.82, it may be possible to view your scenes in virtual reality, if Julian Eisel's work on supporting headsets like the Oculus Rift and Windows Mixed Reality hardware is merged into the main branch.

Blender 2.82: better bevels and the chance to see Blender in VR

Blender 2.82, currently scheduled for release in February 2020, should also include two projects from this year's Google Summer of Code, including the option to view your Blender scenes in virtual reality. The work, done by Julian Eisel, a student at Germany's University of Applied Sciences Kaiserslautern, adds support for VR headsets via OpenXR, the new open standard for virtual and augmented reality.

By the end of the summer, Julian had VR viewport rendering performing well on simple and moderately complex scenes: using solid shading, scenes with hundreds of thousands of vertices render at 100fps on "mid-range" hardware, well above the refresh rate for a head-mounted display.

It will be possible to view a scene in virtual reality on an Oculus Rift or Windows Mixed Reality headset, although if you aren't using Windows, you may have to do a bit of work: at the minute, the only OpenXR runtime (the connection between Blender and the head-mounted display) that supports Linux is Monado, which means compiling both Blender and Monado yourself. You can find more details here.

This is still very early work, so at first, the only thing it will be possible to do with your Blender scenes in virtual reality will be to look around them. If you want to manipulate objects using your VR controllers, or to use an HTC Vive, check out MARUI-Plugin's Blender XR, a separate branch of Blender that was developed as a temporary workaround in the run-up to the publication of the OpenXR standard.

If merged in Blender 2.82, Hans Goudey's work on the Bevel modifier will make it possible to define a custom profile curve for a bevel in a widget within the modifier's UI, rather than the 3D viewport itself.

If you do a lot of hard-surface modeling, stand by for an update to the Bevel modifier, courtesy of Hans Goudey, a physics and computer science major at Middlebury College in the US. The work brings Blender's modeling workflow more in line with other DCC applications by making it possible to define a profile curve for a bevel via a widget in the user interface, rather than having to create it in the viewport.

Yiming Wu's ongoing work on LANPR makes the non-photorealistic line rendering engine compatible with Blender's Grease Pencil 2D animation system, although it isn't currently scheduled for any specific release.

Further off: better line rendering and faster ray tracing

In the future (the code merges are currently scheduled for 2.82 or later) we could also see new options for generating outlines in non-photorealistic renders, thanks to ongoing work by Yiming Wu (吴奕茗), a student at China's Xi'an Jiaotong University, on the LANPR line rendering engine.

Work on the engine, which is intended for rendering anything from comics to engineering diagrams, began in 2018, with this year's work bringing the interface in line with Blender 2.8 and adding the option to export rendered outlines to Blender's Grease Pencil 2D animation toolset.

Another feature due in 2.82 or later is support for Embree when rendering on the GPU in Cycles. Intel's set of high-performance ray tracing kernels can already be used when rendering on the CPU to speed up the process of building a Bounding Volume Hierarchy (BVH), although it is turned off by default.

Work done by Quentin Matillat, studying at France's ESISAR, brings the use of Embree on the GPU – and with it, Embree support in Blender's binary packages – a step closer. There's more optimization to be done, and at the minute, Embree doesn't always outperform Blender's native BVH builder, but in Quentin's benchmark tests, it can be up to 40% faster when rendering with deformation motion blur.
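For readers unfamiliar with the term, a BVH groups a scene's geometry into a tree of nested bounding boxes, so a ray test can discard whole subtrees whose bounds it misses. The toy median-split builder below is purely illustrative (it is not Blender's or Embree's actual code, and the names are invented for this sketch):

```python
# Toy BVH builder over axis-aligned bounding boxes (AABBs).
# Illustrative only: real builders (Embree, Cycles) use far more
# sophisticated heuristics, such as the surface area heuristic (SAH).

class Node:
    def __init__(self, bounds, left=None, right=None, items=None):
        self.bounds = bounds          # ((minx, miny, minz), (maxx, maxy, maxz))
        self.left, self.right = left, right
        self.items = items            # leaf nodes store their primitive AABBs

def union(a, b):
    """Smallest AABB enclosing both input AABBs."""
    return (tuple(map(min, a[0], b[0])), tuple(map(max, a[1], b[1])))

def centroid(box):
    return tuple((lo + hi) / 2 for lo, hi in zip(box[0], box[1]))

def build(boxes, leaf_size=2):
    # Compute the bounds of everything in this subtree.
    bounds = boxes[0]
    for b in boxes[1:]:
        bounds = union(bounds, b)
    if len(boxes) <= leaf_size:
        return Node(bounds, items=boxes)
    # Split along the widest axis at the median centroid.
    axis = max(range(3), key=lambda i: bounds[1][i] - bounds[0][i])
    boxes = sorted(boxes, key=lambda b: centroid(b)[axis])
    mid = len(boxes) // 2
    return Node(bounds,
                left=build(boxes[:mid], leaf_size),
                right=build(boxes[mid:], leaf_size))
```

Because building the tree touches every primitive, it becomes a noticeable fraction of total render setup time on heavy scenes, which is why handing the job to optimized kernels like Embree pays off.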

Finally, Ish Hitesh Bosamiya, currently a student at India's PES University, has been working on adding support for adaptive remeshing to Blender's cloth simulator. Adaptive cloth simulation promises better-quality results in the same amount of computation time, but there are still stability issues to resolve, so it looks likely to be a while longer before it makes its way into an official release.

About Author

Jim Thacker

I've been writing about Blender since the mid-2000s when, as editor of 3D World magazine, I commissioned a series of on-set diaries from the Blender Foundation's first open movie. Since then, I've worked with ArtStation and Gnomon, ‘development edited’ books for Focal Press and Design Studio Press, and am currently editor of industry news website CG Channel.
