I'm thinking of creating a custom VJing application as a Quartz Composer composition controlled by a MIDI controller.
Tempo-aware
I want to use the track's tempo as data to affect the visuals - see below for details. That tempo will be entered manually by the VJ, who figures it out "by ear" while listening to the music. In the app, the VJ could type in the Beats Per Minute value, and could also (re)set the phase of the "tick" by hitting a key/pad.
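To make the idea concrete, here is a minimal sketch of that tempo model in plain JavaScript (the language I'd likely use inside QC's JavaScript patch). All names are illustrative, not a real Quartz Composer API: a typed-in BPM, a tap that re-anchors the beat phase, and a way to ask for the time of the next tick.

```javascript
// Hypothetical tempo clock: BPM is entered manually, and tapping a
// key/pad re-anchors the beat phase to "now".
class TempoClock {
  constructor(bpm) {
    this.bpm = bpm;    // beats per minute, typed in by the VJ
    this.anchor = 0;   // time (seconds) of the last tap, i.e. a known tick
  }
  beatDuration() {     // seconds per beat: 60 / BPM
    return 60 / this.bpm;
  }
  tap(now) {           // VJ hits a key/pad exactly on the beat
    this.anchor = now;
  }
  nextTick(now) {      // time of the first tick at or after `now`
    const beat = this.beatDuration();
    const elapsed = Math.max(0, now - this.anchor);
    return this.anchor + Math.ceil(elapsed / beat) * beat;
  }
}
```

For example, at 120 BPM a beat lasts 0.5 s, so a tap at t = 10 s puts ticks at 10.0, 10.5, 11.0, and so on.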
How this tempo would be used
- Duration of animations: the tempo value will set the duration of some of the looping animations, so that they loop at an interval of N beats, i.e. (60 / BPM) * N seconds.
- Rendering on next "tick": each time the VJ triggers a live change in the composition (for example changing from color A to color B, from image C to image D, or from animation pattern E to animation pattern F), that change is rendered on the next "tick" of the tempo.
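The "render on next tick" behavior above amounts to quantized switching: the VJ's requested change is held until the next beat boundary, then applied. A minimal sketch, again with hypothetical names (the clock is any object exposing a `nextTick(now)` method, like the tempo model described earlier):

```javascript
// Hypothetical tick-quantized switch: a requested change (e.g. color
// A -> B) is stored as pending and only takes effect at the next tick.
class QuantizedSwitch {
  constructor(clock, initial) {
    this.clock = clock;       // object with nextTick(now) -> time of next beat
    this.current = initial;   // value currently rendered
    this.pending = null;      // value requested by the VJ, not yet applied
    this.applyAt = Infinity;  // tick time at which `pending` takes effect
  }
  request(value, now) {       // VJ triggers a live change
    this.pending = value;
    this.applyAt = this.clock.nextTick(now);
  }
  value(now) {                // called once per rendered frame
    if (this.pending !== null && now >= this.applyAt) {
      this.current = this.pending;
      this.pending = null;
    }
    return this.current;
  }
}
```

In QC terms, `value(now)` is what you'd evaluate each frame (e.g. driven by the Patch Time input) to decide which color/image/pattern to render.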
I'm a total beginner with QC and VJing, but I'm an experienced programmer (Java, JavaScript). I have a decent amount of free time and I'm getting really interested in digital art. My question is the following: if you were in my situation, would you build the above custom VJing application based on QC? Or would you choose another software solution to achieve the same features?
Thanks for your time.