The successful implementation of this piece requires extensive programming knowledge and expertise: creating graphical animations, exploring projection mapping in depth to develop real-time structured-light systems, and applying audio engineering to mix and master the audio tracks. As a result, I have broken the project cycle into three phases: Animation and Movement, Projection Mapping, and Audio.
The Animation and Movement Phase consists of programming the graphics. I will program particle systems that respond and move algorithmically, with flocking tendencies and Rothko-like fluid color; these will create the color clouds that move and react to participants, inspired by my research into bioluminescent fish. I will program shaders, alpha masks, shifting colors, opacity changes, and moving graphics. During this stage I will also felt the objects/strips, which will hang from the ceiling and be projected upon. I am currently in the initial stages of this phase.
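The flocking tendencies described above are commonly implemented with boids-style rules: each particle steers toward the center of its neighbors (cohesion), matches their average velocity (alignment), and pushes away from any that crowd too close (separation). The sketch below is purely illustrative; the class, function names, and tuning parameters are my assumptions, not the project's actual code, and a real installation would run this per-frame in its graphics environment.

```python
import math
import random

class Particle:
    """One point in the flock, with a position and a velocity."""
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.vx = random.uniform(-1, 1)
        self.vy = random.uniform(-1, 1)

def step(particles, radius=50.0, cohesion=0.01,
         alignment=0.05, separation=0.5, max_speed=2.0):
    """Advance the flock one frame using the three classic boids rules.
    All parameter values are illustrative defaults, not tuned ones."""
    for p in particles:
        neighbors = [q for q in particles
                     if q is not p and math.hypot(q.x - p.x, q.y - p.y) < radius]
        if neighbors:
            n = len(neighbors)
            # Cohesion: steer toward the neighbors' center of mass.
            cx = sum(q.x for q in neighbors) / n
            cy = sum(q.y for q in neighbors) / n
            p.vx += (cx - p.x) * cohesion
            p.vy += (cy - p.y) * cohesion
            # Alignment: nudge velocity toward the neighbors' average velocity.
            p.vx += (sum(q.vx for q in neighbors) / n - p.vx) * alignment
            p.vy += (sum(q.vy for q in neighbors) / n - p.vy) * alignment
            # Separation: push away from neighbors that are too close.
            for q in neighbors:
                d = math.hypot(q.x - p.x, q.y - p.y)
                if 0 < d < radius / 3:
                    p.vx += (p.x - q.x) / d * separation
                    p.vy += (p.y - q.y) / d * separation
        # Clamp speed so the motion stays fluid rather than explosive.
        speed = math.hypot(p.vx, p.vy)
        if speed > max_speed:
            p.vx, p.vy = p.vx / speed * max_speed, p.vy / speed * max_speed
    for p in particles:
        p.x += p.vx
        p.y += p.vy

flock = [Particle(random.uniform(0, 200), random.uniform(0, 200))
         for _ in range(30)]
for _ in range(100):
    step(flock)
```

Viewer interaction could then be layered on by adding one more steering term per particle (attraction or repulsion toward sensed bodies), which is structurally identical to the separation rule.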
The Projection Mapping Phase consists of incorporating Kinects, not only to sense the viewers but also to map the projections onto the felted objects/strips in real time. Computer vision software can account for the movement of each object and re-project the animation onto its geometry according to its shape and position. This is essential because viewers moving through the space will cause the felted objects/strips to shift slightly.
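At its core, re-projecting onto a moving surface means solving, each frame, for the transform that maps the animation's texture coordinates onto the object's tracked corner points. The sketch below shows that core step for a flat patch using a 2D affine transform solved by Cramer's rule; the point coordinates are invented stand-ins for Kinect-derived tracking data, and a production system would use a full homography or mesh warp on the GPU rather than this plain-Python version.

```python
def det3(m):
    """Determinant of a 3x3 matrix given as nested lists."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def solve_affine(src, dst):
    """Return the 2x3 affine matrix mapping three src points to three dst
    points: x' = a*x + b*y + c, y' = d*x + e*y + f (Cramer's rule)."""
    rows = [[x, y, 1.0] for x, y in src]
    d = det3(rows)
    assert abs(d) > 1e-9, "source points must not be collinear"
    coeffs = []
    for k in range(2):                 # k = 0 solves for x', k = 1 for y'
        t = [p[k] for p in dst]
        row = []
        for col in range(3):
            m = [r[:] for r in rows]   # replace one column with the targets
            for i in range(3):
                m[i][col] = t[i]
            row.append(det3(m) / d)
        coeffs.append(row)
    return coeffs

def apply_affine(M, pt):
    """Map a texture-space point into projector space."""
    x, y = pt
    return (M[0][0] * x + M[0][1] * y + M[0][2],
            M[1][0] * x + M[1][1] * y + M[1][2])

# Texture-space corners of the animation frame, and the strip's corners
# as tracked this frame (hypothetical projector-space coordinates).
tex = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
tracked = [(120.0, 40.0), (300.0, 55.0), (110.0, 400.0)]
M = solve_affine(tex, tracked)
```

Recomputing `M` every frame from fresh tracking data is what keeps the animation pinned to the strip as it sways.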
The Audio Phase consists of recording sounds and then mixing and mastering them.