Dalai Felinto shows the state of his Blender ViewportVR addon in this video using an HTC Vive headset. He can walk around and inspect a model in Virtual Reality, very cool! It's still work in progress in a separate branch, but this is making me very hopeful for further integration in Blender.
Dalai will be arriving at the Blender Institute this Monday. I'll try to be there next week and ask him some questions about how he sees the future of this addon (AND try it out for myself ;-)
There used to be a fairly low limit on the number of polygons in a scene, perhaps 60,000 IIRC. Does this still apply to this generation of VR?
I am not sure what currently bounds performance. I would like to use Blender and VR to appreciate how concepts for new products might look in reality, but I would need maybe 2 million faces to capture enough detail. Textures wouldn't be that important, but I might need say 64 different colours.
Perhaps you can do an in-depth report after you get to try it out.
Thanks for running this site. Always come here to check out the latest Blender news. :)
It mostly depends on your video card, but I've seen scenes of millions of polygons work very well in WebVR.
I can't wait to test this. I use Blender for VR development here in Munich. The ability to see models in VR before exporting to Unity, instead of only then finding the problems that exist in a VR version of a model, would be an incredible time saver. This is really exciting!!! Thanks.
This is incredible! I got a Samsung GearVR and have found third-party programs that allow streaming, even of HTC Vive and Oculus (up to SDK 0.8) content. It's not perfect, but it is getting better and better, and I am looking forward to trying it out with Blender. Wow. Great work, guys!
Just to confirm, I got the latest version of the Vive add-on from here: https://drive.google.com/open?id=0B-2NOYYw8dpVTHJkY0xOQXZ2SzQ and it DOES work on the GearVR using RiftCat on Windows 10 and the VRidge app on a Samsung Galaxy S7! Looking forward to being able to see the mouse cursor in VR so editing and painting can be done.
We need to use the positional data of the Vive controller to cast a ray, find the place you would be mousing over, and derive the mouse context that way.
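The idea above can be sketched in plain Python as a ray/plane intersection: rotate the controller's local forward axis by its tracked orientation, then intersect that ray with a flat "screen" plane. This is a minimal illustration, not the addon's actual code; the function names, the choice of -Z as the forward axis, and the plane setup are all assumptions.

```python
# Hypothetical sketch: turn a tracked controller pose into a "cursor"
# position by intersecting its forward ray with a flat screen plane.

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    vx, vy, vz = v
    # t = 2 * cross(q.xyz, v)
    tx = 2.0 * (y * vz - z * vy)
    ty = 2.0 * (z * vx - x * vz)
    tz = 2.0 * (x * vy - y * vx)
    # v' = v + w*t + cross(q.xyz, t)
    return (
        vx + w * tx + (y * tz - z * ty),
        vy + w * ty + (z * tx - x * tz),
        vz + w * tz + (x * ty - y * tx),
    )

def controller_ray_to_plane(origin, orientation, plane_point, plane_normal):
    """Intersect the controller's forward ray (local -Z here, an
    assumption) with a plane; returns the hit point, or None if the
    ray is parallel to the plane or points away from it."""
    direction = quat_rotate(orientation, (0.0, 0.0, -1.0))
    denom = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(denom) < 1e-6:
        return None
    diff = tuple(p - o for p, o in zip(plane_point, origin))
    t = sum(d * n for d, n in zip(diff, plane_normal)) / denom
    if t < 0.0:
        return None
    return tuple(o + t * d for o, d in zip(origin, direction))

# Controller at (0, 1, 2), identity rotation, pointing straight at
# the floor plane z = 0:
hit = controller_ray_to_plane((0.0, 1.0, 2.0), (1.0, 0.0, 0.0, 0.0),
                              (0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
print(hit)  # (0.0, 1.0, 0.0)
```

In a real integration the hit point would then be mapped into the viewport's 2D coordinates so the existing mouse-context code can run unchanged.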
Any idea if it's possible to have the Blender render use a different material depending on which eye is being rendered?
I'm trying to represent a 3D movie being shown in a theater. I have the theater model, which works nicely through the Vive. Now I would like to emulate the projection of the 3D material on screen, so the image displayed on the movie screen should be different for each eye.
This is now possible in Blender 2.8!! Is it going to be implemented, Dalai?
Guys, I actually tried the HTC Vive at the Microsoft store tonight and my gosh... I got to shoot a virtual bow and arrow, pick up a virtual turkey leg and "eat it", and (my personal favorite) pick up a virtual apple and toss it in the air AND catch it again as if it were the real thing! The tracking is so smooth and responsive! There's just one thing missing from this equation: Suzanne!!!! :P

Obviously I want to not only look through Blender's viewport in VR but also be able to interact with and edit things with tracking, even layering traditional input devices like tablets and mice in cooperation with the motion controllers. Macro editing could be done mostly with the motion controllers, while detail painting and precision editing could use the mouse and tablet for extra-precise input. Making the motion controllers and traditional input devices work seamlessly together, or at least with a smooth transition from one to the other, would be a lovely thing indeed. What do you guys think?
Yes, the Blender environment needs to include interaction, especially modeling, texturing, and above all sculpting.
Do you have experience using the Vive Pro 2 in Blender? I am trying, but the image in the headset keeps freezing.