
Experiencing Cosmic Rays with Blender in a Fulldome

Dalai Felinto, Mike Pan and Martins Upitis use Blender in a real-time setup with sub-atomic particle detectors and a fulldome projector. You can see their work at the Cosmic Sensation event in Nijmegen, the Netherlands, from Sept 30 to Oct 2, and they're giving away free tickets!

(By the way: it was good to meet you last night, guys. I hope to see you again at BConf 2010!)

Dalai Felinto writes:

Hello BlenderNation,

we are here to present the current Blender Game Engine project we are working on and to invite interested artists to see it live. Attend the Cosmic Sensation event, which happens from September 30th to October 2nd in Nijmegen, the Netherlands. For more information please visit: www.cosmicsensation.nl. The tickets are pretty affordable; nevertheless, we will be giving away 27 tickets to the first Blender heads who contact us at [email protected]

Our team is an international joint effort between artists, technical artists and coders. The project started one year ago with the architect Dalai Felinto (dfelinto - Brazil). During that time, a lot of effort went into making sure Blender 2.5 had the Blender Game Engine up and running. In the open-source world, that translates into a lot of committed code and reported bugs. With the bases covered, the digital artists Mike Pan (mpan3 - Canada) and Martins Upitis (martinsh - Latvia) joined the project for one intensive month of creative work.

The project is called Cosmic Sensation. It's hosted by the experimental high-energy physics department at Radboud Universiteit Nijmegen. Led by Professor Sijbrand de Jong, their research discovered new ways to detect sub-atomic particles, in particular a high-energy one known as the muon. Given their characteristics, muons don't have their trajectories affected after their separation from the proton nuclei, so being able to detect muons and trace their directions can lead to important studies on the origin and direction of the proton emissions. The original study was published in Science magazine. However, the science team wanted to reach a larger audience, so they came up with the idea of the Cosmic Sensation event.

For three nights, the largest immersive fulldome in the world will be the stage for Cosmic Sensation. This event is a blend of dancing, music and visuals in a way never seen before. Real particle sensors, installed on the 30-meter dome structure, will trigger music and visual effects whenever a new cosmic ray hits the dome. This realtime feed of events not only produces procedural music (superimposed on the DJs' work) but also feeds an interactive digital visualization projected inside the dome. As you might have guessed, the Blender Game Engine is the technology behind that part.

We don't want to spoil the surprise, but the Blender visualization is an artistic interpretation of the particles' movement across the dome, with fancy visual effects. We are using the whole dome as a canvas for digital projection, receiving the realtime data from the sensors (through OSC) and using the fulldome/fisheye mode (slightly patched) to project with the correct stitching.
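To make the data flow a bit more concrete, here is a minimal sketch of what an OSC receiver driving a visualization can look like in Python. It uses the python-osc package rather than whatever the project actually used, and the "/muon/hit" address, the port, and the handler arguments are assumptions for illustration, not the project's real schema.

```python
# Minimal OSC receiver sketch (pip install python-osc). The address
# "/muon/hit", the port, and the handler arguments are assumptions;
# the project's actual OSC schema isn't documented in this post.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import ThreadingOSCUDPServer

def on_muon_hit(address, sensor_id, energy):
    # In the real setup, a handler like this would drive the game
    # engine visualization, e.g. spawning an effect at the dome
    # position of the sensor that fired.
    print(f"{address}: sensor {sensor_id} fired with energy {energy}")

dispatcher = Dispatcher()
dispatcher.map("/muon/hit", on_muon_hit)  # hypothetical address

server = ThreadingOSCUDPServer(("0.0.0.0", 9000), dispatcher)  # assumed port
server.serve_forever()  # block and dispatch incoming sensor events
```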

For those unable to attend the main event but interested in hearing more about this project, we are going to talk about it at the Blender Conference 2010. In November, after the presentation, we will release the files under a CC license, along with photos and footage from the project. Stay tuned!

Related Links:

* News about the project and more insider information on the team's blog.

10 Comments

  1. @Dalai: Very cool. Congratulations and best of luck!

    Also: what is the effective resolution of the display? Assuming it's multi-projector -- how do you get Blender to split the live video output for the individual projectors?

    @faxrender: While fisheye is not natively supported, you could use our fisheye camera rig. We've produced a number of planetarium shows this way. (Works in 2.49 and 2.5)

    Here's the current link:

    http://www.planetarium.net/group/blendheads/forum/attachment/download?id=2096069%3AUploadedFile%3A13314

  2. I am definitely interested in seeing how they control Blender with musical instruments. Are they using MIDI with the game engine?! That would be fantastically cool, even giving game designers an extra arsenal of input possibilities using MIDI controllers.

    Just imagine: you're bug-testing and polishing a game, tweaking property values while the game is running (if that's possible) by placing the mouse cursor over an enemy spawn-point or something, and then turning a knob or raising a MIDI controller slider to adjust the property levels. It could be incredibly useful.

    From a purely musical/visual DJing standpoint, there are tons of options. With so many different kinds of input available through MIDI devices, they could trigger any number of customized effects and light shows, all running in real time on Blender's game engine.

    Just a few input possibilities (a rough sketch of this idea follows the list):
    Velocity (of a keyboard key being pressed)
    Note
    Modulation Wheel
    Pitch Bend Wheel
    Aftertouch
    Sustain Pedal
    Sliders and knobs
    Switches
    Velocity Sensitive rubber drum buttons
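
    For illustration only, a rough sketch of that knob-tweaking idea in Python, using the mido package (which is not what the event setup uses); the default input port, the CC number, and the property being tuned are all made up:

    ```python
    # Speculative sketch: tune a game property live from a MIDI knob.
    # Uses mido (pip install mido python-rtmidi); the default input
    # port, CC 1 (mod wheel), and the spawn_rate property are made up.
    import mido

    spawn_rate = 1.0  # stand-in for a game property tweaked at runtime

    with mido.open_input() as port:  # open the default MIDI input port
        for msg in port:  # blocks, yielding messages as they arrive
            if msg.type == "control_change" and msg.control == 1:
                # Map the mod wheel's 0-127 range onto 0.0-2.0.
                spawn_rate = msg.value / 127 * 2.0
                print(f"spawn_rate -> {spawn_rate:.2f}")
    ```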

  3. @Ron: we are using one output split into two 1920x1200 signals (so we have an effective 3840x1200 output). This gets projected over a 360°x40° FOV (field of view). So in the end we have approximately a half-inch pixel. It's not so bad; it looks good in the dome.

    Blender is connected to two stitching machines responsible for controlling all the input (videos, the Blender live stream and the other VJs' live stream work) and for handling the blending, color balance, etc. They are using commercial software that wasn't originally made for domes but for live events. I believe they will officially support domes in their next versions (so far a lot of experimentation is going on here).

    There were some challenges on the stitching from the Blender side. As I write this, I'm implementing an extra "dome" mode specifically for this dome, in order to give us better resolution with optimized performance.

    @Coby: the sensors are sending MIDI events to a middleware that converts MIDI into OSC signals (to broadcast to the different controllers: audio, visual, Blender, lights, etc.). Blender listens to the OSC and is indeed directly controlled by those events. At the last Blender Conference I saw at least two works doing exactly this OSC/BGE integration, so it's quite straightforward. You can find the videos and proceedings online.
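
    As a concrete picture of that middleware step, here is a minimal sketch of a MIDI-to-OSC bridge in Python. It uses mido and python-osc rather than the project's actual middleware, and the host, port and "/sensor" address are assumptions:

    ```python
    # Minimal MIDI-to-OSC bridge sketch (pip install mido python-rtmidi
    # python-osc). The host, port and "/sensor" address are assumptions;
    # the event used its own middleware, not this code.
    import mido
    from pythonosc.udp_client import SimpleUDPClient

    # One client per consumer; the real setup broadcasts to several
    # (audio, visuals, Blender, lights), as described above.
    blender = SimpleUDPClient("127.0.0.1", 9000)  # assumed host/port

    with mido.open_input() as port:  # default MIDI input port
        for msg in port:
            if msg.type == "note_on":
                # Forward each sensor hit as an OSC message for the BGE side.
                blender.send_message("/sensor", [msg.note, msg.velocity])
    ```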
