Just did an offline recording using two M4 Macs, each running:
- a Jitter patch compositing three streams of a “Ken Burns”-type effect on folders of abandoned-building pictures (a rough sketch of the pan/zoom idea follows below)
- Endogen (Max and SuperCollider; see previous post) noises and manipulations, tweaked in real time
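For anyone curious, here is a minimal Python sketch of the pan/zoom idea only (the actual patch is built in Jitter; the file name, crop boxes, and frame count here are made up for illustration):

```python
# Minimal "Ken Burns" sketch: interpolate a crop rectangle across a still
# image over time, then resize each crop to the output frame size.
# Not the Jitter patch itself, just the underlying idea.
from PIL import Image  # pip install Pillow

def ken_burns_frames(path, start, end, num_frames, out_size=(1280, 720)):
    """Yield frames panning/zooming from `start` to `end` crop boxes.

    `start` and `end` are (left, top, right, bottom) tuples in pixels.
    """
    img = Image.open(path)
    for i in range(num_frames):
        t = i / max(num_frames - 1, 1)  # 0.0 -> 1.0 across the move
        box = tuple(round(s + (e - s) * t) for s, e in zip(start, end))
        yield img.crop(box).resize(out_size, Image.LANCZOS)

# Hypothetical usage: a slow zoom into the centre of one picture.
for frame in ken_burns_frames("abandoned_building.jpg",
                              start=(0, 0, 1920, 1080),
                              end=(480, 270, 1440, 810),
                              num_frames=150):
    pass  # composite / display each frame here
```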
Both Macs run HDMI into a Rode Video S mixer, which luma-keys the videos and mixes the audio from both machines with external drones from the Osmose synth, routed through Poly Trails and Polyend Mess.
I don’t think the luma keying worked the way I hoped it would. Since both patches have a feedback effect that can cause a massive “bloom”, I wanted the other channel to key in at those moments. It effectively does, but at the expense of some ultimately horrid colours.
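My rough mental model of what the mixer is doing (an assumption on my part; I don’t know the Rode’s actual keying internals) is a hard per-pixel luminance threshold, something like this Python/NumPy sketch. An abrupt swap between two saturated feedback streams would go some way to explaining the harsh colour collisions:

```python
# Rough sketch of hard-threshold luma keying (assumed behaviour, not the
# mixer's documented algorithm): wherever channel A's luminance exceeds a
# threshold (e.g. during a feedback "bloom"), channel B's pixels show instead.
import numpy as np

def luma_key(a, b, threshold=0.8):
    """Key channel `b` into channel `a` where `a` is bright.

    `a` and `b` are float RGB frames of shape (H, W, 3) in [0, 1].
    """
    # Rec. 709 luma weights on the keying channel.
    luma = a @ np.array([0.2126, 0.7152, 0.0722])
    mask = (luma > threshold)[..., None]  # broadcast the mask over RGB
    return np.where(mask, b, a)          # hard per-pixel swap, no soft edge

# Hypothetical usage with random frames standing in for the two streams.
a = np.random.rand(720, 1280, 3)
b = np.random.rand(720, 1280, 3)
out = luma_key(a, b)
```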
Needs some more tweaking, but the potential is building. It could be that a more intentional, less generative set of materials or processes is required, though I’d prefer to keep it improvisational.