
EEG mind reading audience watching Sintel trailer


Many readers sent in stories about this use of the Sintel trailer in a demonstration of mind-reading hardware.

EEG (electroencephalography) is the process of recording the brain's electrical activity. In the video, Robert Oschler uses an Emotiv 14-electrode headset together with software called EmoRate. He shows how he can index and catalog his emotions while watching the Sintel trailer, and then seek to the emotional hot spots within the clip.
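
For those curious how the "index, then seek" part could work in principle, here is a minimal Python sketch of the idea: sample an emotion label at regular intervals, store it with a timestamp, and later jump to the moments that match. The read_emotion() function and video_player object are hypothetical stand-ins, not the actual Emotiv or EmoRate API.

    # Minimal sketch of the "index, catalog, then seek" idea from the video.
    # read_emotion() is a hypothetical stand-in for whatever the headset's SDK
    # provides; it is not the real Emotiv/EmoRate API.
    import time

    def record_emotion_log(read_emotion, duration_s, interval_s=0.5):
        """Sample an emotion label at a fixed interval, tagged with a timestamp."""
        log = []
        start = time.time()
        while time.time() - start < duration_s:
            t = time.time() - start              # seconds into the clip
            log.append((t, read_emotion()))      # e.g. (12.5, "Happy")
            time.sleep(interval_s)
        return log

    def find_hot_spots(log, emotion):
        """Return the timestamps at which the requested emotion was detected."""
        return [t for t, label in log if label == emotion]

    # Usage: seek the (hypothetical) video_player to the first "Fear" moment.
    # log = record_emotion_log(read_emotion, duration_s=52)
    # fear_spots = find_hot_spots(log, "Fear")
    # if fear_spots:
    #     video_player.seek(fear_spots[0])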

What could this technology mean for the Blender community in the future? Gathering users' emotions when Blender crashes, when the autosave folder is being browsed, and then when the lost file is found and opened? What are your ideas?

26 Comments

  1. Sintel is an emotional rollercoaster! :D

    Really cool technology. Will be interesting to see how this affects computing in the future. I'd love it if menus simplified themselves when I got angry :)

  2. Andrew Price:
    Haha, yeah, first the computer teases you by pretending to be buggy, and then, when it notices you're getting too angry, it immediately does what you want.

    Nice stuff.

  3. Technology is freaking me out!
    http://www.youtube.com/watch?v=40L3SGmcPDQ&feature=fvw

    How long until they find a way to "make things easier for us" and let us buy bubble gum from the comfort of our homes using just our thoughts over the internet?

    "Just give us every signal in your brain in real-time and you get your bubble gum with no effort."
    (Of course they won't ask for our credit card number; they wouldn't want to collect such sensitive information from us.)

    Or "wear constantly your brain monitoring device to use that very popular feature everybody is using, and you would be out of this world if you don't use".

  4. Sometimes I think my computer can already read my emotions. Something goes out of whack, it examines my emotions for a few minutes, and then, for no apparent reason, it works just like before. Do you know that feeling too? :)

    But the technology above is really amazing! I also wonder what impact it could have in the future. I guess it will become very popular in the advertising industry.

  5. It's 2010, Radical!

    It'll be interesting to see how this differs from one individual to the next, since we all bring our own memories to the table. This is just the tip of something that will only get more refined. *the computer says I'm both excited and scared*.

    I wonder what direct implications this could have for animating or 3D in general....

    I reeeeeally think this will soar if it could be hooked up to my "My Music" folder. "Oh, you're feeling sad and want to remain somber: here's some Radiohead... Oh, you're sad and want to get lifted up: here's some Bob Marley."

  6. Cool. Love that he gave such a long shoutout to the Durian project at the end. Just wish he had shown the website (or, if it *is* there, that it was more prominent... I didn't see it!)

  7. You know, with the video focusing on such heady tech, and after seeing Sintel every time I open Blender... it totally didn't register how fantastic it is that they chose this specific video. I mean, it could have been *any* movie, and this was picked! This exposure for team Sintel is quite exciting indeed... and I don't need a computer to tell me this emotion is true.

  8. C'mon, guys, let's not jump to conclusions about what the device actually does.

    What it is actually doing in this demo is reading facial muscle positions, so it requires the user to actually make a face in order for it to measure an emotion.

    Notice at 1:46 how the guy opens his mouth with an "Oh", opens his eyes wide and raises his eyebrows. The sensors detect that as Fear. Then at 2:04 he is smiling, and the sensors detect that as Happy. So the device depends on the user having a facial expression in order to detect the emotion.

    The device is not mind-reading; it is reading the position of selected muscles through EEG. Certainly an accomplishment, but it can be fooled (like the way some people smile when they are sad; that may not work very well with this device).

  9. Computers reading brain waves = scary, exciting and scary.

    "Search by emotion" feature was totally unexpected and fully mind-blowing.

    Banlu Kemiyatorn wrote:
    What could be the future use of this technology for Blender community? Gathering users’ emotions when Blender crashed, when autosave folder is being browsed and then when the file was found and opened? How about your ideas?

    Endless uses in the Game Engine, from focus testing to gameplay modification.

    Imagine you're playing a game and you're getting frustrated. The game's difficulty could be scaled back, or a hint or tutorial could appear to help you get through a tough spot. Conversely, if a game is getting boring, the difficulty could be ramped up to give you a real challenge (a rough sketch of this idea follows at the end of this comment).

    I don't like the idea of making a game harder to control because the player is feeling angry or scared, but for a statistical, turn-based RPG, some emotions might improve or reduce your party's accuracy or critical-hit ratio.

    A conversation-based game like Mass Effect could incorporate emotions and facial expressions to add depth to conversations.

    Competitive multiplayer games could show other players how you're feeling, and give you more points for going after the angry guy. There could even be gameplay applications, like a tag-based game (Juggernaut in Halo) where the person feeling the least emotion is "it."

    Even before a game is released, reading the emotions of players as they play through the game, and as they watch cinematic cutscenes, could help developers understand which parts of their game are having the desired emotional result, and which parts may need to be tweaked a bit. Reading the emotions of a focus group or beta pool would be an invaluable tool for game designers.

    Could Blender be at the forefront of such a movement? Like I said, scary, exciting and scary.

    @fbs:
    You know, that was the impression I got, but that wasn't what the article said. Still, there has to be a first step. Not quite as scary as I thought?
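
    To make the frustration-scaling idea above concrete, here is a rough Python sketch. The read_frustration() and read_boredom() functions are hypothetical stand-ins for whatever signals a headset SDK might expose; nothing here is a real Emotiv or Blender Game Engine API.

        # Rough sketch of emotion-driven difficulty scaling.
        # read_frustration()/read_boredom() are hypothetical stand-ins for an
        # EEG headset SDK; this is not a real API.

        def adjust_difficulty(difficulty, frustration, boredom,
                              step=0.1, low=0.2, high=2.0):
            """Nudge difficulty down when the player is frustrated, up when bored."""
            if frustration > 0.7:      # player is struggling: ease off
                difficulty -= step
            elif boredom > 0.7:        # player is coasting: push harder
                difficulty += step
            return max(low, min(high, difficulty))

        # Usage inside a game loop (enemy and base_speed are also hypothetical):
        # difficulty = adjust_difficulty(difficulty, read_frustration(), read_boredom())
        # enemy.speed = base_speed * difficulty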

  10. Oh great; more metrics!

    >> Job description, 2011 =

    "Your reel must pass Focus Group emo-metrics in each column, to an average minimum standard of 80. This includes modelers and riggers." :p

  11. I can see the Blender Artist posting now. In addition to,
    "You model seems too blocky -
    Your textures could use some work -
    Your lighting is all bad -
    The skin seems a bit waxy and
    Yout animation seems stiff and unnatural"
    You can add

    "Your animation failed to invoked enough fear response and no happy response at the end. - Sad from beginning to end"

  12. I have my doubts about how accurate this is just yet, given that you're kind of going on trust that the facial/muscle movements the device is actually sensing accurately reflect the stated emotion. Emotions are complicated, not just black and white (happy and sad).

    That being said, if you could develop it to make it more sensitive, I could see this being a huge tool for the filmmaking process. The bookmarking feature seems particularly useful to me. Imagine you have just finished a first cut of your film and want to show it to a test audience (which regularly occurs, by the way). And there is one scene which is supposed to be very emotional for the audience. Well, you could have your audience wear these headsets and be able to tell not only what emotion your audience is feeling at this emotional juncture (it's not good if you want them to feel sad but they're happy instead, etc.), but you could also pinpoint the exact time code where your audience starts feeling the emotion. This would let you go back in and tweak your editing to adjust for emotional missed connections, etc. (a rough sketch of this idea follows below).

    Interesting stuff.
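
    A small sketch of that test-audience idea, assuming per-viewer emotion logs in the same hypothetical (timecode, label) format as the earlier sketch; the log format and emotion labels are assumptions, not a real tool's output.

        from collections import Counter

        def dominant_emotion_at(logs, timecode, window=2.0):
            """Most common emotion across all viewers near a given timecode."""
            votes = Counter(label
                            for log in logs
                            for t, label in log
                            if abs(t - timecode) <= window)
            return votes.most_common(1)[0][0] if votes else None

        def missed_connections(logs, intended):
            """intended: list of (timecode_seconds, expected_emotion) key moments.
            Returns the moments where the audience felt something else."""
            misses = []
            for t, expected in intended:
                felt = dominant_emotion_at(logs, t)
                if felt != expected:
                    misses.append((t, expected, felt))
            return misses

        # Usage: if the scene at 84.0 s was meant to read as "Sad" but the logs
        # say "Happy", this returns [(84.0, "Sad", "Happy")] and you know
        # exactly which cut to revisit.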

  13. thx fbs and Kristopher,

    you nailed it. Getting too excited about it would be the same as doing a happy dance over the invention of the kite: "yes, now we're ready to fly to the moon!"

    The device is intended to measure brain activity (EEG), but what they are actually doing here is electromyography (EMG): recording muscle activity. From there to real emotions is a looooong way that people have been trying to figure out for ages already.

    Nice presentation though; many people actually think computers will be able to read minds (!) in the next 100 years... :))

    Oh, and great to see the Sintel trailer once again! Can't wait for the public premiere!

