SIGGRAPH is known worldwide for showcasing some of the most innovative and groundbreaking visual and interactive content. This year, an added focus will highlight music and audio in their significant relationship with computer graphics and interactive techniques.
"Just as important as the graphics themselves are the musical elements and how they enhance the visuals and storyline in order to complete the audience experience. The creation and manipulation of sound and music provide an open challenge and creative opportunity in interactive techniques," stated Peter Braccio, SIGGRAPH 2009 Conference Industry Relations Director. "This new focus on Music & Audio aims to highlight not just the close relationship music and graphic arts have to one another, but also how the integration of music and audio enhances the overall impact of visual pieces."
The SIGGRAPH 2009 Music & Audio programming will include a keynote presentation by sound-design pioneer and two-time Academy Award®-winning sound designer Randy Thom, as well as a series of panel discussions featuring musicians and composers from around the globe. In addition to music performances, courses on topics such as "Creating New Interfaces for Musical Expression" and "Interactive Sound Rendering" will also take place.
Highlights from the SIGGRAPH 2009 Music & Audio program include:
Keynote Presentation: Designing a Movie for Sound: How to Make Sound a Full Collaborator in the Storytelling Process
Speaker: Randy Thom
Randy Thom has worked in a wide variety of creative capacities on more than 75 films, including some of Hollywood's biggest blockbusters such as "Bolt", "Forrest Gump", "Harry Potter and the Chamber of Secrets", "Harry Potter and the Goblet of Fire", "Ratatouille", "War of the Worlds", and "Wild at Heart". Thom began working for Lucasfilm in 1979 as a sound designer and re-recording mixer and is currently the Director of Sound Design at Skywalker Sound. He received two Academy Awards®: for Best Sound for "The Right Stuff" and for Best Achievement in Sound Editing for "The Incredibles". In all, Thom has shared 14 Academy Award® nominations and has worked with some of today's leading directors and producers.
Sound and Story
Moderator: Paul Lipson, Game Audio Network Guild, Pyramind, Inc.
Panelists: Lorne Lanning, Oddworld Inhabitants; Brian Schmidt, Brian Schmidt Studios, LLC, GameSoundCon; Tommy Tallarico, Tommy Tallarico Studios, Inc., Video Games Live, Game Audio Network Guild
What we hear greatly influences what we see and feel. This panel celebrates the role of sound and music in the aesthetic experience of storytelling. Experts in film and videogame sound design and composition discuss the art of combining audio with visual narrative, present highlights and favorites, and debate emerging directions for sound and story.
DIY Music & Distribution
Moderator: Scott Draves, Google Inc., ElectricSheep.org
Panelists: Eddie Codel, Geek Entertainment TV; Aaron Koblin, Google Creative Lab; Tiffiniy Cheng, Participatory Culture Foundation
A discussion of how low-cost or open-source development and distribution tools are affecting creative production. It features creative pioneers and programmers who have irretrievably altered musical composition, computer graphics, the future of journalism, and the definition of art. Like every advancement since the Stone Age, their work enlists the help of machines to improve upon what humans once made by themselves - fundamentally modern, but also timeless.
The Visual in New Interfaces for Musical Expression
Moderator: Georg Essl, University of Michigan
Panelists: Joseph Paradiso, Massachusetts Institute of Technology; Sergi Jordà, Universitat Pompeu Fabra and Reactable Systems
We are constantly creating new ways to generate and organize sound. Sometimes the result is plain fun, and sometimes it's just really nice to listen to. This panel brings together experts who have tried to create new interfaces for musical expression through very different technical means. Using tabletop interfaces, visual-sound installations, mobile music making, and circuit bending, the panelists explore what the visual means in these different approaches to musical art.
Creating New Interfaces for Musical Expression
Instructors: Sid Fels, University of British Columbia; Michael Lyons, Ritsumeikan University
Advances in digital audio technologies have led to computers playing a role in most music production and performance. Digital technologies offer unprecedented opportunities for the creation and manipulation of sound, but the flexibility of these new technologies implies an often-confusing array of choices for instrument designers, composers, and performers. This course covers the theory and practice of new musical-interface design and explores principles that are useful for designing good musical interfaces.
For complete details on all of the Music & Audio programming offered at SIGGRAPH 2009, visit http://www.siggraph.org/s2009/focus/music_audio/index.php.
###
About SIGGRAPH 2009
SIGGRAPH 2009 will bring an anticipated 20,000 computer graphics and interactive technology professionals from six continents to New Orleans, Louisiana, USA for the industry's most respected technical and creative programs focusing on research, science, art, animation, music, gaming, interactivity, education, and the web from Monday, 3 August through Friday, 7 August 2009 at the Ernest N. Morial Convention Center. SIGGRAPH 2009 includes a three-day exhibition of products and services from the computer graphics and interactive marketplace from 4-6 August 2009. More than 200 international exhibiting companies are expected. More details are available at www.siggraph.org/s2009.
About ACM
ACM, the Association for Computing Machinery (www.acm.org), is the world's largest educational and scientific computing society, uniting educators, researchers, and professionals to inspire dialogue, share resources, and address the field's challenges. ACM strengthens the computing profession's collective voice through strong leadership, promotion of the highest standards, and recognition of technical excellence. ACM supports the professional growth of its members by providing opportunities for life-long learning, career development, and professional networking.
26 Comments
Blender is very bad, because it has no music editor.
Impossible to use Blender for serious (AAA) projects, because we can't make music with it.
Blender isn't supposed to have a music editor...
Endi: I've never got it to make a nice stew. Someone really ought to work on this.
The ability to make music in Blender would be cool. :)
Music and multi-channel sound support in Blender would be very useful and would streamline the workflow.
That would allow the video and soundtrack to be worked on in the one program.
Blender is very bad because it has no spreadsheet application, no word processing, and no Outlook support.
Impossible to use Blender for serious projects, because we cannot output the documents needed for running a business with it.
Ondi, please understand my sarcasm ;) Each application has its own field.
An actual advanced music editor? No, but Blender does edit audio sequences. You can even animate the various properties of the audio (see the sketch below). It's all right there in the wiki...
http://wiki.blender.org/index.php/Doc:Manual/Sequencer/Audio
Awesome info on SIGGRAPH, by the way.
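To make that concrete, here is a rough, untested sketch of the kind of thing that wiki page covers, using Blender's Python API (bpy); the strip name, file path, and frame rate are placeholder assumptions on my part:

```python
# Rough sketch (untested): add a sound strip to the Video Sequence Editor and
# keyframe its volume. Assumes a recent bpy API; the file path is a placeholder.
import bpy

scene = bpy.context.scene
scene.sequence_editor_create()  # make sure this scene has a sequence editor

# Load a sound file as a strip on channel 1, starting at frame 1.
strip = scene.sequence_editor.sequences.new_sound(
    name="Dialogue",
    filepath="//audio/dialogue.wav",  # placeholder path, relative to the .blend
    channel=1,
    frame_start=1,
)

# Animate the strip's volume: fade in over the first second at 24 fps.
strip.volume = 0.0
strip.keyframe_insert(data_path="volume", frame=1)
strip.volume = 1.0
strip.keyframe_insert(data_path="volume", frame=24)
```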
Isn't it possible to make Blender work together with other applications (Audacity, Ardour, Avidemux... Cinelerra, if they make it easier...)?
My dream is that somebody tries to make all the open source projects interact...
It would be awesome if Blender had some better "audio to keyframe" tool, like After Effects has.
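In the meantime, here is a rough, untested sketch of what such a tool boils down to: read a mono 16-bit WAV, compute a per-frame loudness value, and key it onto a property. The object name, file path, and frame rate are all assumptions for illustration.

```python
# Sketch of a DIY "audio to keyframe" pass (untested): compute per-frame RMS
# loudness from a mono 16-bit WAV and key an object's Z scale to it.
import math
import struct
import wave

import bpy

FPS = 24                               # assumed scene frame rate
obj = bpy.data.objects["Cube"]         # hypothetical target object

with wave.open("/tmp/track.wav", "rb") as wf:   # placeholder path, mono 16-bit PCM
    samples_per_frame = wf.getframerate() // FPS
    n_frames = wf.getnframes() // samples_per_frame

    for f in range(n_frames):
        raw = wf.readframes(samples_per_frame)
        samples = struct.unpack("<%dh" % (len(raw) // 2), raw)
        rms = math.sqrt(sum(s * s for s in samples) / max(len(samples), 1)) / 32768.0
        obj.scale[2] = 1.0 + 2.0 * rms                    # louder = taller
        obj.keyframe_insert(data_path="scale", index=2, frame=f + 1)
```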
Maybe Endi's speaking of multi-track audio to go with the video sequence editor? I haven't got into that area of Blender too much, so I'm just guessing.
I'd like to see the audio tools tweaked a little and some of the issues they seem to have ironed out at some point. Adding a multi-track option with some basic tools wouldn't be that bad, but Blender's not a one-stop movie creation program. I don't necessarily think it should be, but I guess there could be some benefits. Just re-scanned the comments and caught Levi's link to the audio area.
I'd prefer johnj's idea of more cooperation or integration with other open source projects, perhaps a "movie creation package" that could include the best and most advanced open source programs. Blender, Audacity, GIMP, motion tracking software (if there is some), and a video program with capture and more advanced editing could do most, if not all, of the hard work on a movie.
D'oh! The actual point about SIGGRAPH is interesting news too. Audio is a very important part of a movie, even more so for CGI, as every sound has to be created or recorded, even those subtle little ones that might go overlooked but add to the viewer's total immersion in that world.
Blender probably should have better Ardour integration...
Maybe some sound wave simulation for animations? I think that would streamline the surround sound production.
The upcoming Modulation Curves in 2.5 will be a boon to future audio work with Blender... An integrated simple MIDI sequencer and three-part synthesizer would be the cat's meow... Ah, to trigger and control things from imported audio/MIDI events, and, the other way around, to output key-sequence events to be tracked by external devices and software... it gives me goose bumps to consider!
Please, oh, please, some lucky videographer at SIGGRAPH 2009, send us some tasty keynotes and highlights!
I am working on a script for blender that will create world peace and solve all of our problems. After that I think I can work on a music editor for blender, then I'll get it to make a stew. Any other requests?
Are we talking an "Animusic" kind of control there, Born?
We need open-source music software, as great as Blender is for 3D, that takes beats and flows from Elvis, mixes them with the Beatles, and gives you Michael Jackson as the result.
endi :))
Blender is very bad because I can't make pancakes with it for dinner, so I am forced to use a pan and a cooker. Why can't Blender be used as a cooker? Why has no one thought of making it for cuisine instead of stupid 3D animation software? :D Please don't take my post seriously, I just can't help myself. Blender already does more than most high-end, high-cost 3D software, and people say it is bad because it doesn't have an audio editor equal to Samplitude. :D Sorry, it was shocking. hehe, phew.
Okay, I think no one will disagree that Blender is amazing software that is magnificent art in itself.
It might seem unnecessary to some, but having audio- (velocity, BPM, frequency) and MIDI-driven events is a really useful feature. So perhaps incorporate a multi-data editor that could send data to anything, be it audio, MIDI, animation, etc.: a time-based NLE that handles everything. Look at Houdini's CHOPs.
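As a toy illustration of just the BPM part of that idea, here is a rough, untested sketch of my own that turns a tempo into keyframe timings; the bpm, frame rate, beat count, and object name are assumptions:

```python
# Toy sketch (untested): drive keyframes from a tempo. At 120 bpm and 24 fps
# each beat lands 12 frames apart; pulse an object's scale on every beat.
import bpy

BPM, FPS, BEATS = 120.0, 24, 16
frames_per_beat = FPS * 60.0 / BPM        # 24 * 60 / 120 = 12 frames per beat

obj = bpy.data.objects["Speaker"]         # hypothetical object to pulse

for beat in range(BEATS):
    frame = 1 + beat * frames_per_beat
    obj.scale = (1.2, 1.2, 1.2)                           # "hit" pose on the beat
    obj.keyframe_insert(data_path="scale", frame=frame)
    obj.scale = (1.0, 1.0, 1.0)                           # relax halfway to the next beat
    obj.keyframe_insert(data_path="scale", frame=frame + frames_per_beat / 2)
```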
If I had my say, I would like an Amiga-style soundtracker editing tool in Blender :D
Shouldn't they rename the conference accordingly?
The jokes about Blender making stew etc. aside, Levi, johnj, Tom, Born, and SoulVector (sorry if I missed anyone making a constructive post) make good points!
Blender, based on its name and logo, could do three things (there are three tangents/lines coming off the one ring in the logo):
1. Internal core functionality domains (sub-apps) for all parts of the animation/live-action/game process cycle: development/preview/packaging. E.g. extending the audio sequencer to allow the creation of tunes, using nodes/modifiers(?), etc.
2. Intra-functionality pipelines, facilitating functions between domains, e.g. hooking sounds to animation events.
3. External pipeline ports to import/translate/export files from related open source apps (or any app with open source file formats), e.g. Ardour to Blender.
The 2.5 focus is on separating the GUI from Blender's core functionality, refactoring code to MVC, and allowing more customisation of GUI and functionality.
Once this is done (or as it is done), then with the RNA system, etc., it should be possible for the community to look at a longer-term goal of Blender expanding, consolidating, and integrating its core media functions (visual, audio, interactive) with other application domains.
There's no reason why Blender can't be extended to do 1-3 above from version 2.5/3.0.
It would seem a natural progression.
It's not about making Blender into GIMP, but about allowing Blender to 'blend' all the domains of the production/preview/package (export) cycle, so that if an artist is making something simple, or is without access to a specific application (e.g. because of employer work policy), they can get the whole job done in Blender.
There would be a tiered approach:
1. Creation of content - 1st priority;
2. Sequencing of content - 2nd priority;
3. Inter-operation of content with other full domain specific apps (pipelining) - 3rd priority.
Perhaps 1-3 above should have equal focus in the future (from 3.0), but it seems this is the current priority list, and 1 has a historic lead over 2, and 2 over 3.
After Blender 3.0, future development may allow 2 and 3 to catch up to 1, and we'll see Blender becoming even more well-rounded with regard to functional sub-domains, e.g. audio tools may equal animation functionality.
To do so, each future functionality sub-domain (sub-app) of Blender, e.g. animation editor, video editor, music editor, etc., should:
1. Be extensible with scripts/plugins/external mini-apps (e.g. MakeHuman);
2. Intra-operate with other sub-domains internally, e.g. music with animation;
3. Inter-operate with other full applications with open source file formats, e.g. with Ardour.
It's taken a LOOOONG time for Siggraph to pay attention to music, perhaps because animation has matured a lot since its early days.
If, in the early days of Siggraph, anyone had suggested it should focus on audio too, then lots of people would have laughed, and probably made silly jokes like the ones in this forum above.
But the relationship between audio and animation can't be ignored, especially now that the graphical domain is more developed and doesn't need all the attention in order to grow.
The irony is that, as a nearly mature domain, for animation/graphics to grow, Siggraph does need to look more at its context: how it fits in with music, tactile/gestural/touch environments, etc.
It's like a child focusing on its own inner world before exploring its surroundings and working out how to live in the big world.
Siggraph, and animation, is growing up and needs to look at its outside world, or, who knows, someone else may enter its domain from a related one. (It's like Mac going to CES.)
And once Blender 2.5/3.0 launches (and, I'd argue, while refactoring for it), the Blender community needs to look at how Blender works with other applications doing related parts of the whole media production/previewing/packaging process.
By making 2.5 and later 3.0 so customisable, extensible, etc., Blender will become the 'blendering' king of multimedia... one sub-domain at a time.
It doesn't have to be as good as GIMP, or as awesome as Ardour, but it should be able to do the basics of each sub-domain as needed for making things in Blender alone, connect these media sub-domains internally, and be a good pipeline partner with external apps, e.g. GIMP, Ardour, etc.
And it should allow future versions of Blender to slowly improve the functionality of each sub-domain, first by extending it with scripts/plugins/external mini-apps (e.g. MakeHuman), and then possibly by merging some of these into core code in future versions, and/or in different builds, based on the different foci needed by different members of a production team.
A TD could have a Blender build that has more focus on coding/API/pipelining tools, e.g. a better Python editor.
An animator may have a Blender build that has more plugins for animating, editing models, etc.
And a music composer would have a Blender build that has more functionality for linking music to animation, editing it, etc.
Each build would be tweaked via scripts and plugins to begin with.
And each build would be able to inter-operate, 'think/talk/act', with the other differently focussed builds of Blender.
Just as a production team works now.
Like the three-pronged Blender logo, Blender could, at its core and by itself, be used for visual, audio, and interaction (e.g. games) work, and some day animation/film, VJ, and games teams could go to http://www.graphicall.org and download Blender scripted/plugged-in/compiled for each of their respective roles in the production process.
Some day they could even choose these options from the GraphicAll site and roll their own Blender for whatever focus they have in their team or personal project. (Like XDA Developers' ROM Kitchen for mobile phones.) This is further away, but it is perhaps the great 'end goal' of Blender's evolution. (Although I think the 'Make Art' button may be the real end goal! ;-)
Let's not be like the people years ago who would have laughed at Siggraph having an event also focusing on audio, who today would be ashamed of their stubbornness, or the people who mocked Blender having game-related tools before it was open sourced, who would be embarrassed to see Apricot kick butt today.
What has been constructively written in this forum is a sign of Blender's natural future evolution.
Let's discuss, plan, and make it happen!
The fact we can 'entertain the idea' shows that Blender is becoming ready for this!!!
Laughter is good.
Didn't a wise figure once say you have to be laughed at a thousand times before your ideas are accepted?
Let's first laugh at the shortsightedness of the past.
Blender is about ready to blend more than vertices and could be 'the' open source Media Blender.
It's how we get there that matters as much as 'where there is'.
Didn't the same figure (or whoever) say that every great journey starts with a first step?
This community began this a long time ago, much has changed, and many dated opinions of Blender have fallen as it has risen.
Let's use Siggraph's 'maturity', its 'coming of age', stepping out into the big world of media, as a sign that this is where Blender too is headed with 2.5...
and start thinking of the next versions of Blender, the next road to journey on, and where Blender should be, what it should be.
Let's be inspired, and only laugh at our own stubbornness when we refuse to blend when we should!
blndrusr, your eagerness and passion for advancement shine through wonderfully!
Tom (not Ton), I'm not sure exactly what type of "control" you mean, but yes, without a doubt, a lot of what I'd be after as either a sound engineer, a score composer, a special effects artist, a director, or a producer, etc., would be about having tight control (precise timing events) on the audio side, in perfect sync with the animation... Not just to control audio software, soft synths, etc., but also all kinds of hardware studio equipment, like vintage Moog synthesizers, mixer boards, effects processors, modded guitar pedals, etc... This could be achieved with some type of IPO-to-MIDI export script... At once, an entire music/audio production studio, from the smallest home studio to the most prestigious recording studio in the world, would have precise timing events to work offline with, always knowing that the timeline is perfectly in sync.
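Purely as an illustration of what such an IPO-to-MIDI export script might look like (untested, and not anyone's actual implementation): walk the keyframes of one F-Curve, the 2.5 equivalent of a 2.4x IPO channel, and write one MIDI note per key using the third-party mido package (a modern Python MIDI library). The object name, curve index, note number, tempo, and frame rate below are all assumptions.

```python
# Illustrative sketch (untested): export the keyframe times of one F-Curve as
# MIDI note events, so external hardware/software can sync to the animation.
# Uses the third-party "mido" package; names and constants are placeholders.
import bpy
import mido

FPS = 24
TEMPO = mido.bpm2tempo(120)                   # microseconds per quarter note
TICKS_PER_BEAT = 480
NOTE, NOTE_LEN = 60, 30                       # middle C, 30-tick note length

obj = bpy.data.objects["Cube"]                # hypothetical animated object
fcurve = obj.animation_data.action.fcurves[0] # first curve of its action

mid = mido.MidiFile(ticks_per_beat=TICKS_PER_BEAT)
track = mido.MidiTrack()
mid.tracks.append(track)
track.append(mido.MetaMessage("set_tempo", tempo=TEMPO))

last_tick = 0
for kp in fcurve.keyframe_points:
    seconds = kp.co[0] / FPS                                          # keyframe time
    tick = int(mido.second2tick(seconds, TICKS_PER_BEAT, TEMPO))
    track.append(mido.Message("note_on", note=NOTE, velocity=100,
                              time=max(0, tick - last_tick)))         # delta time
    track.append(mido.Message("note_off", note=NOTE, velocity=0, time=NOTE_LEN))
    last_tick = tick + NOTE_LEN

mid.save("/tmp/anim_events.mid")
```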
LOL at all the humorous ideas put forth earlier! Great ideas!
As to journeys, I can't think of a better place to start than right at the very beginning of a good one!
Let the adventure begin!
I want my cookie to get baked when I press 'Bake'. Why does the bake button not work? When I want it to bake some meat it doesn't want to. Instead it says I need to UV map it first.
blndrusr: You're cool. ;)