
Immersive Interface for Blender Awarded


Tuukka Takala writes about his immersive interface project - it won a prize at the 2013 Symposium on 3D User Interfaces.

Tuukka Takala writes:

The above video shows how Blender is used with a 3D display, PlayStation Move controllers, and head tracking (Kinect or PS Move). This work won the "Best Low-cost, Minimally-Intrusive solution" prize in the annual 3DUI contest at the 2013 Symposium on 3D User Interfaces.

I created this Blender interface in 3 weeks using only open source software components: Blender, Processing, and RUIS. It's a pretty fun way to do 3D modeling once you learn the basics. I reckon this kind of interface would be especially useful in 3D animation (posing bones, doing motion capture, etc.).

I used my own Blender script, which adds stereo 3D support to Blender and can be used without Kinect or PlayStation Move. All you need is a 3D display that supports top-and-bottom stereo mode.
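Conceptually, top-and-bottom stereo packs both eye views into a single frame: each view keeps its full width but only half the vertical resolution, and the display unsquashes the two halves for each eye. A minimal sketch of that packing step (an illustration only, not the actual RUIS script):

```python
# Sketch of top-and-bottom frame packing (illustration, not the RUIS code).
# Each "image" is a list of pixel rows; both eye views drop every other
# row, so the stacked result has the same size as a single mono frame.
def make_top_bottom_frame(left_eye, right_eye):
    half = lambda img: img[::2]                 # halve vertical resolution
    return half(left_eye) + half(right_eye)     # left eye on top, right below
```

With two 4-row renders, the packed frame has 4 rows: the top two from the left eye, the bottom two from the right.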

You can get the Blender script and instructions for using Kinect and PS Move with Blender by downloading RUIS for Processing.

23 Comments

    • Plus some LEAP motion controllers. Can you imagine being able to be immersed in the rift, and being able to use your hands to naturally control everything at the same time? Now THAT would be awesome!

      • I've tried the Leap motion and ordered 3 of them. I think it would be interesting for 3D modelling/animation, sure. Leap doesn't track your fingers as well as their advertisement video lets you believe (you get only palm center point and 5 fingertip points in 3D, and it keeps losing the fingertips often). On the plus side the Leap is very responsive and precise.

  1. Johnny Lee did a similar hack with the Wii back in 2007. His solution was a fair bit more elegant in my opinion, because 1) the lights on your face are infrared and less distracting, 2) the lights mount easily on regular glasses, making them more natural to use, and 3) it uses two light sources, so it's able to judge rotation and position more quickly and accurately.

    https://www.youtube.com/watch?v=Jd3-eiid-Uw

    Still, good work. Not as elegant as Johnny Lee's but still slightly less dorky than the Oculus Rift (which I actually think is pretty cool btw).
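For reference, the two-light idea reduces to simple pinhole-camera geometry: the midpoint of the two tracked points gives the head's position in the image, their pixel separation gives depth, and their angle gives roll. A rough sketch of that estimate (the LED separation and focal-length constants here are made up for illustration):

```python
import math

# Hypothetical two-point head-pose estimate (illustration only).
KNOWN_SEP_MM = 150.0   # assumed real distance between the two IR LEDs
FOCAL_PX = 800.0       # assumed camera focal length in pixels

def head_pose(p_left, p_right):
    """Return (x, y, depth_mm, roll_deg) from two tracked image points (px)."""
    (x1, y1), (x2, y2) = p_left, p_right
    dx, dy = x2 - x1, y2 - y1
    sep_px = math.hypot(dx, dy)                  # apparent LED separation
    depth = FOCAL_PX * KNOWN_SEP_MM / sep_px     # pinhole model: closer = wider
    cx, cy = (x1 + x2) / 2.0, (y1 + y2) / 2.0    # head centre in the image
    roll = math.degrees(math.atan2(dy, dx))      # head tilt
    return cx, cy, depth, roll
```

A single tracked point, by contrast, gives only the image position, which is why the two-light setup recovers pose faster and more accurately.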


    • You can view Blender's 3D viewport in stereo 3D using my Python scripts, IF you have a 3D display that supports top-and-bottom stereo mode. You don't need Kinect, PS Move, or any additional software.

      The Blender scripts are within the RUIS for Processing package that you can download here:
      http://blog.ruisystem.net/download#ruis_for_processing

      Unzip the package and open the readme.html file located in \RUIS\examples\BlenderControl\
      Read points 1 to 3 of the "Instructions" section, then read the "Stereo 3D in Blender" section for information on how to get stereo 3D working in Blender.

      Notes:
      Currently it's best to have your desktop extended over two displays: a 3D display for the 3D viewport and a 2D display for all the other Blender UI elements.
      Buttons and other 2D elements, as well as some 3D gizmos, will not be mirrored into the left-eye view of the 3D viewport, which causes a slightly annoying discrepancy.

    • See my reply to dfelinto above for details on how to get stereo 3D working in Blender with my Python scripts.

      When I started working with this interface in November, I was excited about the idea of sculpting in 3D. But at that time there was no adaptive tessellation in Blender's sculpt mode (at least in the stable releases), so I didn't pursue it.

      Sculpting with the mouse and the stereo 3D functionality of my script should work fine (though the brush pointer and some other 3D gizmos might be visible to one eye only). Head tracking with Kinect or PS Move might cause problems for sculpting, because in my experience moving the 3D view's camera breaks off the current brush stroke (at least in texture paint), and with head tracking the camera is constantly moving. As for using 3D input devices like the PS Move for sculpting, one would have to code the sculpt tools from scratch to support the device's 3D location and rotation, and I'm not sure that's possible with a mere Python script.

  3. Pretty cool. I saw something like this during the PS4 conference from Sony earlier this year, but this one looks much more versatile. Great work on this and congrats on your recognition!

