Michael Gschwandtner uses an Android phone to control a Blender camera in real time in this proof of concept.
Michael writes:
This video shows a proof-of-concept implementation of an Android app and a Blender addon that allows you to use your smartphone as a real-world substitute for a camera in a Blender scene.
By moving the smartphone in front of the marker with the Blender logo, the app estimates the phone's pose relative to your screen. This information is sent via Wi-Fi to a server running inside Blender, which calculates the phone's position and rotation relative to the laptop and transforms the camera in the Blender scene accordingly.
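The receiving side of that pipeline could be sketched roughly like this. Note that the wire format, port, and function names here are assumptions for illustration, not Michael's actual protocol:

```python
import socket
import struct

# Assumed wire format: seven little-endian floats per packet --
# x, y, z position plus a w, x, y, z rotation quaternion.
POSE_FORMAT = "<7f"
POSE_SIZE = struct.calcsize(POSE_FORMAT)

def parse_pose(packet):
    """Unpack one UDP packet into (position, quaternion) tuples."""
    values = struct.unpack(POSE_FORMAT, packet)
    return values[:3], values[3:]

def run_receiver(apply_pose, host="0.0.0.0", port=5000):
    """Receive pose packets and hand them to apply_pose().
    Inside Blender, apply_pose would write the location and
    rotation to bpy.context.scene.camera; here it is just a
    callback so the networking stays separate from bpy."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    while True:
        packet, _addr = sock.recvfrom(POSE_SIZE)
        position, rotation = parse_pose(packet)
        apply_pose(position, rotation)
```

Keeping the parsing separate from the Blender API also makes it easy to test the networking without a running Blender instance.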
The Android app uses the Vuforia SDK from Qualcomm, which is really fast at tracking the frame markers (the border around the Blender logo).
As you can see from the video, the app is far from perfect and the whole phone-to-Blender connection is not really user-friendly, but it works and there is much room for improvement.
TODO:
- lots and lots of things ;)
- Kalman filter for the 6DOF pose of the camera
- Different frame markers on all 4 edges of the screen or maybe using an image target instead of the frame marker
- Saving the movement of the camera
- ...
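On the Kalman-filter item: a full 6-DOF filter would couple position and rotation in one state vector, but even a per-component scalar Kalman filter already damps the jitter visible in the video. A minimal sketch, where the noise parameters q and r are made-up values that would need tuning:

```python
class ScalarKalman:
    """1-D constant-value Kalman filter, applied independently to
    each of the six pose components. q = process noise, r =
    measurement noise (assumed values, to be tuned)."""
    def __init__(self, q=1e-3, r=1e-2):
        self.q, self.r = q, r
        self.x = 0.0   # state estimate
        self.p = 1.0   # estimate variance

    def update(self, z):
        # Predict: state unchanged, uncertainty grows.
        self.p += self.q
        # Correct: blend the prediction with measurement z.
        k = self.p / (self.p + self.r)
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x

def smooth_pose(filters, pose):
    """Filter a 6-DOF pose (x, y, z, rx, ry, rz) component-wise."""
    return [f.update(v) for f, v in zip(filters, pose)]
```

Filtering each axis independently ignores the coupling between rotation and translation, but it is cheap enough to run at the pose-update rate on the Blender side.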
Please leave a comment if you like it or if you have any suggestions. The code is too "fragile" to be released just now, but I will continue working on this if the community wants it to be developed further.
37 Comments
Very cool! Reminds me of the making-of for Tintin, where they used a virtual camera to see what the final scene would look like while filming motion capture.
That is so awesome. I definitely want to see this developed further! I love the idea of animating the camera with your own hands, adding a human touch to your animations.
Not sure if this is done already, but would be cool if the virtual camera path could be recorded from an actual shoot. Maybe sticking the android phone to a camera somehow and having a central receiver point? Local logging of coordinates and import would suffice. Wouldn't that make 3D object placement a lot simpler? (come to think of it, didn't George Lucas do this in reverse in 1977...?!)
Andrew Kramer did that in After Effects in this tutorial http://www.videocopilot.net/tutorials/animating_a_still/ by tracking a handheld cam shot and applying the motion to another video.
I think Blender's motion tracker can also do that: just track a shot and assign it to a Blender camera.
Running this app in the background while running a VNC session on the phone should work.
This is cool and has a lot of potential, especially if you can link it back to Blender running on the phone (or stream the video as it is rendered) so that it looks like the phone is actually looking at the object. It would also be cool if you didn't need to keep the marker on the screen.
Playing the camera view on the phone is one of my next steps. But first I need to find out how I can actually get this view :)
Wow, already looking quite nice!
If this is improved to a state with a little less random camera shake, I see many uses for this, like using it with real-time auto-keyframing to animate a scene. That would give a VERY realistic handheld camera feel, and highly interactive camera motion editing.
Keep up the good work!
That's actually what I am aiming at. Recording the motion during playback of the Blender animation.
Very nice concept. I like it that you are using the same phone as I am. LOL
So HTC isn't as bad as I initially thought.
I still wonder why people don't combine all the motion sensors that are in the Android devices themselves... and then also feed the camera view back to the phone, so it becomes a free-moving camera with a viewfinder...
Hmm, but this is a nice start!
Very nice. I'm in the middle of trying to figure out what to do with a 6DOF IMU (a real one, not a cell phone's). I think this application would be an excellent demo. I'll try emulating it. I assume the Blender side isn't too difficult, and I can handle the remote-to-PC connection fairly easily (I hope).
The Blender part is based on this code http://p.quinput.eu/qwiki/Wiki.jsp?page=Blender_remote_control but I am not using HTTP requests, since the position updates are currently made at about 10 updates per second.
But if you don't want to wait until I can release the code you can modify his server.
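For anyone adapting that server: a raw-socket sender throttled to the roughly 10 updates per second mentioned above might look like this. The host, port, and the newline-delimited text format are my assumptions, not the actual code:

```python
import socket
import time

UPDATE_HZ = 10  # matches the ~10 position updates per second

def format_pose(pose):
    """Encode a pose as one newline-terminated ASCII line."""
    return " ".join("%.4f" % v for v in pose) + "\n"

def send_pose_stream(poses, host="127.0.0.1", port=5000):
    """Send poses over a single plain TCP connection at UPDATE_HZ,
    instead of issuing one HTTP request per update."""
    interval = 1.0 / UPDATE_HZ
    with socket.create_connection((host, port)) as sock:
        for pose in poses:
            sock.sendall(format_pose(pose).encode("ascii"))
            time.sleep(interval)
```

Holding one connection open avoids the per-request overhead that makes HTTP a poor fit at this update rate.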
Yeah, that's perfect. I don't need to use HTTP either. Thanks for the pointer.
This is great! I'll definitely give it a try if you come up with a stable version of the addon. Keep up the good work!
That's a good idea, real-time motion tracking. But using a mobile camera? As an experiment, that's awesome.
I am constantly surprised at the initiative of people when they use Blender for really unique things. Please do keep it up.
there's a little genius out there :)
Android power Forever !!!!!!
This is kind of how they controlled the camera in "Monster House." I remember seeing the making-of, and the director could control the camera outside of the software to get it to look how he wanted, so it is very possible to do. But I like the thought of using my phone to make it work.
That's what James Cameron did with Avatar!
Very very cool, very very clever!
Can you do it like this?
http://www.youtube.com/watch?v=Wy7IvrizRok
This could be a really useful plugin! It would be great for modeling as an alternative to using the rotation and translation widgets while sculpting and texturing! It would be very intuitive! And let's hear it for the most popular member of our open source community: go Android!!!
Please keep developing this. Not only would it make handheld cameras more realistic (and easier), it's just plain awesome. I would be more than happy to test this with my phone (Galaxy Nexus) and Blender.
Yey, we want it :D
Impressive! Now what I was thinking the other day seems possible...
I like what 'layar' is doing with augmented reality and all (using GPS?).
But I thought it would be nice to put a few markers on a t-shirt and project a 3D model into the phone's view.
... Just checked the Layar site, and it seems they're already 'augmenting' magazines... no 3D yet, though...
Cool !
But I would prefer that Blender used the webcam to track the user's eyes and automatically rotate the object to always present the side of interest... :-)
This is just fantastic! I was wondering some time ago about digital chisel for sculpting and I think this might be an answer :)
I like it. Would be even better if the camera view could show up on the phone itself, but this suggests many exciting possibilities.
Just using phone and tablet devices as peripherals, input devices etc, this kind of interactive approach, there's all kinds of potential there.
I'm already working on that. The current viewport will be streamed to the Android device, which then becomes a kind of viewfinder for the virtual camera.
Cool. That'll be awesome.
I presume the Android device could also be set up to drive other objects in the scene, where you could pick them up, rotate them, and manipulate them as you might if they were real-world objects?
A direct idea is to not use the Blender logo on the screen, but a ball (round object) IRL to move the camera around, so you can "go around" the object.
Once everything works I might change the markers to something that can be detected in a more robust way, but for now it's governed by the library.
And if you used a ball or another structure (a cross with colored balls on the tips), you wouldn't see the actual scene through the camera viewfinder (the display of your device). But you are right, there are definitely several different ways to implement something like this.
i want this! i need this! please keep developing it!
I don't see how this could be useful for modeling, but it does have some great (aesthetic) applications for presentations at least. For now. Could be used in camera animation.
Sorry if I missed this in the comments, but have you tried running this straight off the Android yet?
Well, people are using Blender in so many different ways, and wonderful experiments at that. I am proud of our Blender community... rocking awesome. I can imagine Blender used for creating Avatar-like films without getting technically bogged down. Hats off to you, Michael.
As some people suggested, reducing handheld shake would give more steadiness to the camera motion-capture data; I would add that instead there could be a control (value range) for the degree of steadiness. Then users can choose how much of the shake to clean up, because in some instances the shake might be the desired effect.
Also, expandability to use a frame marker off the screen (like a marked cube put in the middle of a desk) would make it possible to work in a space of more than 180 degrees. In that case the marker would also have to be a kind of 3D physical marker. I know it can be far more complex than that, or very simple, but I'm just imagining a scenario. In the case of the on-screen frame marker, you can do full pan, dolly, and track, and can move about the subject constrained to less than 180 degrees; correct me if I'm wrong.
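The adjustable-steadiness idea suggested above could be as simple as an exponential moving average whose blend factor is exposed to the user. A sketch, with `steadiness` as a hypothetical user-facing slider:

```python
def smooth_path(poses, steadiness=0.8):
    """Blend each incoming pose with the previous smoothed pose.
    steadiness = 0.0 keeps the raw handheld shake; values close
    to 1.0 progressively iron it out."""
    smoothed = []
    prev = None
    for pose in poses:
        if prev is None:
            prev = list(pose)           # first sample passes through
        else:
            prev = [steadiness * p + (1.0 - steadiness) * v
                    for p, v in zip(prev, pose)]
        smoothed.append(tuple(prev))
    return smoothed
```

Unlike a Kalman filter, this has a single intuitive parameter, which fits the "how much shake do you want to keep" control the commenter describes.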
This is so promising. :)
Great work man.
It's great! Any news since 2012?
It would be very useful if this addon also worked outdoors, where we could just set out a few standard tracking objects.
It would save quite a bit of time if we could just skip the camera-tracking part.
Of course, I think everyone needs this to animate the camera better. Tracking is long and hard. Please keep developing it.