Moviesandbox

news about the 3D realtime moviemaking tool "Moviesandbox"

Friday, August 28, 2009

Verion premiere

Yesterday evening at 7 pm in Trondheim, Norway we premiered Verion. In the week before, Verion had coverage in all major newspapers, national radio and national television - in Norway...
Initial reception was good, insofar as audience applause can be a measure of that.

I plan on posting a sort of post-mortem for the production as soon as it's finished.

(Verion poster)

The characters, scenes and overall setup for the production will be made available with Moviesandbox for people to play around with.
And I'll upload some video and screenshots soon.

This is also the time to thank a couple more people than those who fit in the credits:

* Zachary Lieberman and Theo Watson for openFrameworks, an open-source C++ framework that we used in some of the content creation tools for this production and in some parts of Moviesandbox.

* Theo Watson for his videoInput library, used for video textures in Moviesandbox.

* Eyebeam Center for Art and Technology, for providing the initial resources to get Moviesandbox off the ground as an Open-Source project.

* Michael Nitsche of Georgia Tech for feedback and an initial testbed for Moviesandbox.

Monday, August 24, 2009

Verion setup

The project I am working on right now could not have been a better testbed for Moviesandbox. It's an 80-minute live theatre performance mixing two stage actors with stereoscopic 3D projection (as you might have figured out by now).
The piece is run live, to make sure that we can give the actors some room with their timing.
That means all cues for animation, lipsync, sound and camera movement have to be triggered manually.
This is a screenshot of what one scene in this play looks like:


The left screen is the node setup with all the key-press events and UDP (network) input events in place.
To stay in sync with the sound and keep the overall piece from becoming a complete trigger mess, we decided to make the sound the primary cue source: the sound computer sends start and end commands over a local network using UDP. We also send the amplitude of the character voices, so their mouths move accordingly.
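A minimal sketch of that kind of UDP cueing, assuming a simple made-up text protocol (the actual message format Moviesandbox uses isn't shown here):

```python
import socket

# Hypothetical message format: "start <cue>", "end <cue>", "amp <value>".
# This only illustrates the idea of driving cues and lipsync amplitude
# over a local network; it is not the real Moviesandbox protocol.

HOST, PORT = "127.0.0.1", 9000

def send_cue(sock, message):
    """Send one cue as a UDP datagram to the render machine."""
    sock.sendto(message.encode("utf-8"), (HOST, PORT))

def parse_cue(datagram):
    """Split a received datagram into (command, argument)."""
    command, _, arg = datagram.decode("utf-8").partition(" ")
    return command, arg

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind((HOST, PORT))
receiver.settimeout(2.0)

send_cue(sender, "start scene1")
send_cue(sender, "amp 0.42")   # voice amplitude drives the mouth
send_cue(sender, "end scene1")

for _ in range(3):
    print(parse_cue(receiver.recvfrom(1024)[0]))
```

On a local network UDP is fast and simple, which is what matters for live cueing; dropped or reordered datagrams are the price, so critical cues would need a key-press fallback.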

To get a better idea of the setup, feel free to have a look at this flickr set, which documents the production and the people involved:

www.flickr.com
The amazing thing about this project is that it was completed from start to finish, including script development and technical R&D, in under four months with a software team of two people and three additional illustrators.

Monday, August 03, 2009

the stereo effect

The project I am currently working on uses stereo projection to make the animation a bit more funky. You know, like in Beowulf or Ice Age 3.
Here are two screenshots showing you how that works:



The first image shows two almost identical images side by side. One is rendered for the left eye, the other one for the right. They are placed so that each spans exactly one full screen, and we send them to two projectors that are projecting on the same screen.
The projectors have polarization filters on them, polarizing the light from the upper projector vertically and the one from the lower projector horizontally.
The surface we project on, a silver-coated canvas, keeps the light's polarization, and if you wear glasses that have similar filters in front of each eye, we can send individual images to your eyes.
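The depth impression itself comes from the small horizontal offset between the two eye images. A quick sketch of that geometry, assuming two parallel cameras and made-up illustrative numbers (this is not the Moviesandbox renderer):

```python
# Horizontal parallax for two parallel cameras a small eye
# separation apart. All values below are illustrative assumptions.

def horizontal_parallax(eye_separation, focal_length, depth):
    """Image-space horizontal offset of a point at the given depth
    between the left-eye and right-eye views."""
    return eye_separation * focal_length / depth

eye_separation = 0.065   # ~6.5 cm, a typical interocular distance
focal_length = 1.0       # arbitrary projection distance

# Closer points shift more between the two eye images;
# that difference is what the brain reads as depth.
for depth in (1.0, 2.0, 10.0):
    print(depth, horizontal_parallax(eye_separation, focal_length, depth))
```

The falloff with depth is why near objects "pop" while the far background looks almost flat on the screen.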

The same can be done by splitting the channels into red and blue and using red and blue filters in front of our eyes to trick our brains:



If you have a pair of red/blue glasses, you will see a monochrome but seemingly three-dimensional picture composited from the one posted above.
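A minimal sketch of that red/blue compositing, assuming two grayscale eye views stored as plain arrays (not the actual Moviesandbox code):

```python
import numpy as np

# Build a red/blue anaglyph from two grayscale eye views.
# The left eye's view goes into the red channel, the right eye's
# into the blue channel; matching filter glasses then let each
# eye see only "its" image.

def anaglyph(left_gray, right_gray):
    """Combine two grayscale images (H x W, values 0-255) into an
    RGB anaglyph image."""
    h, w = left_gray.shape
    out = np.zeros((h, w, 3), dtype=np.uint8)
    out[..., 0] = left_gray    # red channel carries the left view
    out[..., 2] = right_gray   # blue channel carries the right view
    return out

# Tiny illustrative images instead of real renders:
left = np.full((2, 2), 200, dtype=np.uint8)
right = np.full((2, 2), 50, dtype=np.uint8)
print(anaglyph(left, right)[0, 0])
```

Since each eye only gets one color channel, the result is monochrome per eye, which is exactly the trade-off against the polarized two-projector setup described above.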