
Ptex support being added to Blender


Ptex finally made its way into Blender (or vice-versa?). As many readers have probably noticed already, it is not yet in trunk, but builds from Nicholas Bishop's branch are available on GraphicAll, ready for testing. Ptex was created by Walt Disney Animation Studios and released as open source in January; before that, support had been added to software such as Pixar's RenderMan, and later to 3D-Coat.

Ptex could make model texturing less painful, since there is much less need to deal with artifacts from UV unwrapping and seams. It also allows control over per-face resolution and works well with subdivision. These videos, taken from Nick Bishop's Vimeo site, show Ptex with overlay painting and multires. (Thank you very much! I for one have been longing for Ptex!)


  1. wow, this could and will prove really useful to me and many others.. seems a lot more effective than drawing the uv maps =O

  2. The Fatsnacker

    I'm sure the stability will come; me, I'm just amazed by all those projects bringing something new and improved into 2.5.

    as Take That once said, 'have a little patience'.....

    I'll just be happy knowing what the tool can do, as opposed to what i can do with it.


  3. @Horace: No. It's different. Ptex assigns a different texture to each face (that's why it allows "per-face resolutions").


    Well, what remains is to optimise and improve the painting tools. They're quite limited for now (and slow... on my machine, at least).

  4. yes, but with vertex painting and high enough subdivisions you basically also have a texture per undivided face. :) ptex sounds very similar to me and in the demo videos blender is in vertex painting mode. kind of confusing... :)

  5. I hope it will be fast enough on low/medium real time game models, because having to unwrap or dealing with seams is a bit annoying, especially on more complex organic models (creatures).

    If so, great. I've been looking forward to this for quite a while now.

  6. I pictured this feature as a year-long, epic programming project. What I love is that Nicholas just casually 'threw this in' in a few days. It obviously needs bug-fixing, but development has completely exceeded any expectations I had.

  7. YES!

    I am all for traditional UVs, or so I thought... and then I realized the TON of applications and uses I have for this! Go for it! :)

    Really cool and important thing.

  8. As I said in my earlier post, this feature works very well on my system (WinXP Pro 32-bit, Intel Q9450).

    Unfortunately, although I can paint with it, I have absolutely no idea of how to use the painted result in a render.

    Does anyone have any idea of how to do this ?

    yeah, but it will make my cube look awesome in the viewport... :D

    Seriously though, dude this was an unexpected surprise and one that you will be famous for for a while. Rock on Nicholas!

  10. Great :D
    I love the idea of PTex and was very keen to see a Blender implementation of it :)
    Can't wait to see it fully implemented in trunk :)

  11. YES YES YES! This was my most wanted feature on that survey a while back. PTex will allow for incredible streamlining in my workflow. When you don't have to worry about UVs and seams, so much is possible. This will also allow 3D vector displacements to be easily stored as a PTex file. If they can add something like patches at render time instead of relying on Subsurf for all geometry, this will really put Blender right up there next to the big boys.

  12. @Jan A Ptex is just a different way to store a texture file. If implemented correctly you should be able to apply it to whatever channel you want. However, the main uses of Ptex are for transferring information between a high-res sculpted model and a low-res base mesh without having to do any kind of unwrapping; I don't think it would really be feasible to hand paint a Ptex because of how it is stored. All texturing would have to be done in 3D paint mode, or in an external program that supports Ptex such as Mudbox.

  13. And how can I use this PTex thing in game engines? For internal rendering I think it's OK, but if I want to export the texture to UDK or another game engine, don't I need to UV unwrap?

  14. Nicholas Bishop

    To address confusion over the use of "vertex paint": I've taken over that mode for development purposes. I think ptex could pretty much replace the old vpaint mode entirely (assigning a 2x2 ptex to every face is essentially the same as the old vertex colors), but that's still a subject for discussion.

    @Matt: it's certainly possible to hand paint ptex (I assume you mean in a 2D painting program like GIMP or Photoshop). In essence it's just a bunch of rectangular textures packed into one file. Painting directly onto that would be confusing, but it's also possible to "flatten" a group of quads so that you could paint in 2D on a connected area of the ptex file. (I think one of Disney's demo videos shows their app doing something like this.)

    @daredemo: I don't know which renderers support ptex; I don't know of any open source ones that do.

    Regarding game-engine uses, there's no reason why ptex couldn't be used there too (although it's not currently implemented.) For painting, the ptex data is already being converted to OpenGL textures and implicit UV coords. The same could easily be done for a game engine. And once the internal renderer supports ptex, it should be easy to bake ptex into a regular UV-mapped texture as well.
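The conversion Nicholas describes (per-face ptex data turned into regular OpenGL textures with implicit UV coords) can be sketched in a few lines. This is an illustrative toy, not Blender's code; the naive row-packing strategy and all names here are invented:

```python
# Toy sketch: pack per-face Ptex textures into one atlas and hand back
# an implicit UV rectangle per face. Naive row packing, for illustration
# only; names and layout are invented, not Blender's implementation.

def pack_ptex_atlas(face_textures):
    """face_textures: list of (width, height) texel sizes, one per face.
    Returns (atlas_w, atlas_h) and a (u0, v0, u1, v1) rectangle per face."""
    atlas_w = max(w for w, h in face_textures)
    atlas_h = sum(h for w, h in face_textures)
    uvs, y = [], 0
    for w, h in face_textures:
        uvs.append((0.0, y / atlas_h, w / atlas_w, (y + h) / atlas_h))
        y += h
    return (atlas_w, atlas_h), uvs
```

A real implementation would pack rectangles more tightly and pad them against filtering bleed, but the point stands: each face gets its own rectangle, at its own resolution, with no unwrapping step.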

  15. @Nicholas yes, I've seen the ability to flatten out quads for ptex painting, but so far I haven't seen it in an application available to the public. But as for just taking the whole ptex file and editing it, it would be crazy to work with, since as far as I've seen they're laid out in a grid without any obvious continuity. What I meant is that texturing in the traditional sense (image file > image editor > reimport to Blender) would be futile. If someone could integrate the flattening and painting into Blender that would be awesome, but I feel like it will be a while yet. My main interest is that IIRC Ptex saves as a 32-bit file, which means that if someone could implement vector displacements and procedural render-time surface subdivision into Blender Internal, we could finally get rid of the awful Displace modifier and catch up with Max and Maya.


    While it wouldn't work with Unreal Engine, there's no reason someone out there couldn't write a game engine that could utilize Ptex. It would drastically change the rasterization order that almost every game engine has used for almost a decade, but it could be done, and in the end would probably save artists and modellers a lot of time. For now, however, Ptex is almost exclusively used for production animation.

  16. Nick, I've been a big fan of your work since Sharp Construct. Since I found out you were going to be developing for Blender I've been telling people to watch you. Thanks for proving me right!

    I can't wait to put this to use, I hate using UV unwrapping and if this along with the other new texturing/painting features are as fast and advanced as I've been hearing I'll be a happy camper.

    Thanks a billion!

  17. Nicholas Bishop

    @Matt: Ptex is actually nice and flexible with regard to precision; the ptex lib supports 8- and 16-bit unsigned precision, as well as 16- and 32-bit floating-point precision. (The half-precision float isn't supported in Blender, but the other three are.)
    Regarding the Unreal engine and others, I expect ptex support would be quite easy (not sure what you mean by a change to the rasterization order?) As I already pointed out, it's quite quick to convert ptex into something OpenGL or Direct3D can use.

  18. """As I already pointed out, it’s quite quick to convert ptex into something OpenGL or Direct3D can use."""

    but if it gets converted to normal uv-coordinates for opengl, how come there are no seam problems? i could imagine that it's quite tricky to align the pixels properly at all the triangle borders.

  19. @horace: it's true, triangles do make things a little tricky. Internally, ptex supports two formats, one for all quads and one for all triangles. Blender uses only the first format (all quads), and triangles are internally subdivided into ptex subfaces (similar to Catmull-Clark subdivision, each triangle gets three quad subfaces.) Using this format, triangles can either be drawn by converting the three ptex quad subfaces into a single triangle texture (or rather, a regular texture that contains a triangular area with the ptex data interpolated into it, then referenced with UV coords), or each face can be subdivided (either on the fly or as an initialization step) into quads. (Blender uses the latter method.)

    Alternatively, with either the all-quads ptex format or the all-triangles ptex format, the ptex data can be sent directly to the graphics card and a pair of vertex and fragment shaders could take care of displaying it without resorting to UV coordinates at all.
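The all-quads handling of triangles that Nicholas describes (each triangle split into three quad subfaces, as in one Catmull-Clark step) can be sketched like this. A toy illustration with invented names, not Blender's code:

```python
# Toy sketch: split a triangle into three quad subfaces, as in one
# Catmull-Clark subdivision step. Each quad joins a corner, the midpoints
# of its two adjacent edges, and the triangle's centroid; this is how
# triangle data can live in the all-quads Ptex format.

def tri_to_quad_subfaces(a, b, c):
    mid = lambda p, q: tuple((pi + qi) / 2 for pi, qi in zip(p, q))
    ab, bc, ca = mid(a, b), mid(b, c), mid(c, a)
    centroid = tuple(sum(xs) / 3 for xs in zip(a, b, c))
    # one quad per original corner: corner, edge mid, centroid, other edge mid
    return [(a, ab, centroid, ca),
            (b, bc, centroid, ab),
            (c, ca, centroid, bc)]
```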

  20. @Nick that's what I meant when I said that the raster engine would be different. Without the need to call UV texture coordinates, couldn't they potentially cram more into each clock cycle with code optimized for Ptex?

  21. I have a question. I'm not that well versed in the code of Blender. How hard would it be to implement 3D vector displacement into the internal engine?

  22. Not too hard; it's already implemented for multires. Recall that the renderer doesn't do micropoly displacement or anything fancy like that, so it's basically just a matter of moving the input vertices around. Usually you will want some level of subdivision of course, and the output vertices can be manipulated through a modifier however you like.
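Nicholas's description (no micropoly displacement, just moving the input vertices around) amounts to something like this sketch. Names are invented, and the per-vertex offsets would in practice come from a sampled vector displacement map:

```python
# Toy sketch of vector displacement without micropolys: each vertex is
# simply shifted by a 3D offset (e.g. sampled from a vector displacement
# map), scaled by a strength factor. Usually applied after subdivision.

def apply_vector_displacement(vertices, offsets, strength=1.0):
    """vertices and offsets are parallel lists of (x, y, z) tuples."""
    return [tuple(v + strength * d for v, d in zip(vert, off))
            for vert, off in zip(vertices, offsets)]
```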

  23. """ [...] or each face can be subdivided (either on the fly or as an initialization step) into quads. (Blender uses the latter method.)"""

    hm... so if i understand this correctly with this method ptex really is very similar to directly using [vertex colors / vertex painting] like i already assumed previously? is it more memory efficient?

    thanks for your explanations. all of this is very interesting! :)

  24. @Nick I didn't realize that multires already had it implemented. It makes sense now though. Hopefully some day we'll have something like micropolys in the Internal Renderer, and 32-bit vector maps supported.

    I digress though, that's not to take away from Ptex support. That's a huge step, and it's very exciting to see. Can't wait until my desktop is back in working condition to try the build out.

  25. @horace: if you were using vertex colors with the same resolution as the ptex, then for each ptex texel you'd have an additional vertex (with the associated space for coordinate, normal, etc.) as well as additional edge/face storage for all the little polygons making up a subdivided mesh. So Ptex is much more efficient here. Of course, if you have a subdivided mesh anyways, then the difference is not as great.
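The storage argument above can be made concrete with some back-of-the-envelope arithmetic (the byte counts are illustrative assumptions, not measured Blender numbers):

```python
# Rough cost of an N x N color grid on one quad face. Ptex stores only
# the texels; matching it with vertex colors needs roughly N x N vertices
# (each carrying position and normal on top of its color) plus the faces
# of the subdivided mesh. Sizes are illustrative, not Blender's actual ones.

def per_face_bytes(n, via_vertex_colors):
    color = 4                        # RGBA, one byte per channel
    if not via_vertex_colors:
        return n * n * color         # ptex: texels only
    vert = 3 * 4 + 3 * 4 + color     # float3 position + float3 normal + color
    face = 4 * 4                     # four int indices per little quad
    return n * n * vert + (n - 1) ** 2 * face
```

For n = 16 this gives 1024 bytes for ptex versus roughly 10 KB for the equivalent subdivided vertex-color mesh, matching the point that the gap shrinks only if the mesh is subdivided anyway.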

  26. Wow, this is awesome! Is there any way to actually animate the textures on an object? Say someone gets cut or something: you could draw it on at a certain keyframe and then it would appear there.

  27. >And once the internal renderer supports ptex, it should be easy to bake ptex into a regular UV-mapped texture as well.

    Any time frame for this baking? Could it be done ahead of internal renderer support?
    I would like to be able to ptex a scene in Blender for export to Octane using .obj.
    Also, could .svg be used for textures sometime in the future?

    I think Octane will support renderman and collada in the future.
    Lux and Indigo users might be interested in baking as well.

  28. It is too bad that there is currently no render support for the PTEX painting work.

    But at least let me throw up a big "Thank You!" to Nicholas Bishop for his fine efforts so far. :-)

  29. Thanks, Nicholas, for your constant effort! :)

    PTex is just a beautiful idea and I liked it from the very first moment I read about it. I can’t test it right now, ’cause I’m a guest on this computer, but I can’t wait to do it soon.

    I also hope that it will be possible to bake PTex paintings to a UV set and vice versa, ideally within the same model, as that would often be very handy. And also that further editing of an already-painted mesh won't ruin the paint unnecessarily.

    Regarding real-time rendering in game engines: isn't the problem that game engines usually use mipmaps? If UV layouts are generated automatically (I guess it would be similar to lightmaps), seams can show up with growing distance as lower mipmap levels are displayed, unless the UV generation includes an automatic gap/bleed mechanism. Isn't that true?

    Thanks and regards!

  30. PTex is a very impressive thing. Even more impressive is how fast it got implemented into Blender...

    @Omar - no, that would be Vector Displacement, although PTex really helps in properly baking a detailed Vector Displacement map.

  31. uv problems mcgee!

    I have been very excited about this since I learned of it. I have always felt that my severe lack of UV unwrapping skills has held me back.

    I think this will breathe new life into one of my favorite hobbies.
