- jennieo posted in pixel#

  I'm doing a project that takes a live feed of a dancer, runs it through a threshold so only the silhouette remains, and projects it at different time intervals, creating multiple shadow-like figures that repeat the dance movements just performed. The threshold and the delays are no problem, but I can't figure out how to show multiple versions of the same live feed, each playing at a different delay, in the same GEM window. I tried pix_mix, but it only shows a single delayed projection. Do I need to change the opacity? How do I do that? Or is there a better way to do this?
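In Gem itself this usually comes down to rendering several chains of the same feed (roughly, one gemhead branch per delayed copy). The sketch below is not a Gem patch; it only illustrates the underlying idea in Python with OpenCV: keep a ring buffer of thresholded frames and composite several delayed copies into one output window. The delay offsets, threshold value, and camera index are assumed values for illustration.

```python
# Sketch (not Gem): threshold a live feed and composite several delayed
# copies of the silhouette into one window.
from collections import deque

import cv2
import numpy as np

DELAYS = [0, 30, 60, 90]   # delay per copy, in frames (assumed values)
THRESH = 60                # brightness cutoff for the silhouette (assumed)

cap = cv2.VideoCapture(0)                  # live camera feed
buffer = deque(maxlen=max(DELAYS) + 1)     # ring buffer of past silhouettes

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, silhouette = cv2.threshold(gray, THRESH, 255, cv2.THRESH_BINARY)
    buffer.append(silhouette)

    # Composite: take the brightest pixel across all delayed copies,
    # so every silhouette stays visible without adjusting opacity.
    out = np.zeros_like(silhouette)
    for d in DELAYS:
        if len(buffer) > d:
            out = np.maximum(out, buffer[-1 - d])

    cv2.imshow("delayed silhouettes", out)
    if cv2.waitKey(1) & 0xFF == 27:   # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

Taking the per-pixel maximum is just one way to combine the copies; averaging or alpha blending the delayed frames gives fainter, more shadow-like figures instead.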
- jennieo posted in technical issues

  Hello, I am having some issues with my interactive installation patch. I am using a blob tracker I built in Pure Data as my input data (x and y coordinates), and I want to use it to control Ableton Live, specifically:

  - triggering specific audio samples
  - changing the tempo (bpm) (I wanted to do this in Live if possible because of its tempo warping)
  - and controlling a VST plugin

  I think both programs can communicate with each other over Open Sound Control, but I'm not exactly sure how to set that up. Has anyone done this before?
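The sketch below is only an illustration of what the OSC stream between the two programs could look like, written with the python-osc library rather than Pd; the port number, address names, and mappings are all assumptions, not anything confirmed in the post.

```python
# Sketch: forward blob-tracker coordinates to Ableton Live as OSC messages.
# Live does not listen for OSC on its own, so something on the Live side
# (for example a Max for Live OSC device) has to map these messages to
# clip triggers, tempo, and plugin parameters.
from pythonosc.udp_client import SimpleUDPClient

LIVE_HOST = "127.0.0.1"   # machine running Ableton Live
LIVE_PORT = 9000          # port the OSC receiver listens on (assumed)

client = SimpleUDPClient(LIVE_HOST, LIVE_PORT)

def send_blob(x: float, y: float) -> None:
    """Send one blob position, normalised to 0..1, as three hypothetical mappings."""
    # Trigger a sample when the blob enters the left third of the frame.
    if x < 0.33:
        client.send_message("/blob/trigger", 1)
    # Map vertical position to a tempo value (60-180 bpm, an arbitrary range).
    client.send_message("/blob/tempo", 60.0 + y * 120.0)
    # Send the raw coordinates for a VST parameter mapping on the Live side.
    client.send_message("/blob/xy", [x, y])

if __name__ == "__main__":
    send_blob(0.25, 0.5)   # example: one tracked position
```

The same messages can be sent directly from the Pd patch with OSC externals (for example packOSC feeding udpsend from the mrpeach library); whichever side sends them, the receiving end in Live still needs an OSC-aware device or an OSC-to-MIDI bridge to translate them into clip launches, tempo changes, and plugin parameter moves.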
