Hi . . .
I have a Pd patch that uses multiple random objects to trigger random envelope points, random envelope times, and a random pitch - about 17 random objects in total. I'm trying to solve an issue where all of the random numbers being generated are the same across multiple Raspberry Pi computers. If I load the patch using Terminator on all eight RPIs and send a bang via OSC to trigger a note, all of the random numbers are identical, even though the patch is running on eight different machines.
So I looked into this further, set each RPI to a different time zone, and seeded the randoms in the patch using the "current system time" method. This solved the problem of the eight RPIs generating identical numbers compared to each other, but seeding all of the randoms from that same value made the random numbers within each Pd patch identical to one another! I hope I'm explaining that correctly - the micro randoms are identical within each patch (inside a given instance the envelope times and points all come out the same), but the macro randoms are different (each RPI now generates different numbers than the other RPIs). I could probably seed each of the 17 random objects in a different way in the patch, but that seems like overkill to me.
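In case a concrete picture helps, here is the same idea sketched in Python rather than Pd (the base seed, the 17 streams, and the offsets are just stand-ins for what the patch does): one shared seed locks every stream to the same sequence, while base seed plus a per-object offset keeps the machines different from each other and the 17 streams different from each other.

import time
import random

# Stand-in for the per-machine "current system time" seed used in the patch.
base_seed = int(time.time())

# All 17 "random objects" seeded with the exact same value:
# every stream produces the same numbers (the micro randoms problem).
same_seed = [random.Random(base_seed) for _ in range(17)]
print([s.randint(0, 99) for s in same_seed])    # 17 copies of one value

# Each one seeded with the base seed plus its own offset:
# the machines still differ, and the 17 streams inside one patch
# no longer track each other.
offset_seed = [random.Random(base_seed + i) for i in range(17)]
print([s.randint(0, 99) for s in offset_seed])  # values now vary per stream

In the patch itself I believe that would mean fanning the one time-derived number through small offsets and sending each random object its own seed message - which is the "overkill" option I mentioned above.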
My temporary solution is to load the patch on one RPI, trigger all of the various random objects once via OSC, and then repeat that on each of the other seven RPIs. It solves the issue of all the random numbers being identical, but it is a rather tedious process to go through to get a piece ready to perform. I guess I'm surprised that a Pd patch using random numbers, loaded on eight completely different computers, will generate identical random numbers when started at the same time!
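For what it's worth, here is roughly what that manual round of triggering would look like scripted, assuming the python-osc package and made-up hostnames, port, and OSC address (the patch would need a matching OSC receiver for this to do anything):

# Rough sketch only: send one "exercise the randoms" message to each RPI
# in turn, instead of doing it by hand. Hostnames, port, and the /trigger
# address are placeholders for whatever the patch actually listens on.
from pythonosc.udp_client import SimpleUDPClient

HOSTS = [f"raspberrypi{n}.local" for n in range(1, 9)]  # the eight RPIs
PORT = 9000  # placeholder for the patch's OSC port

for host in HOSTS:
    client = SimpleUDPClient(host, PORT)
    client.send_message("/trigger", 1)  # bang the random objects once
    print(f"triggered randoms on {host}")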
My goal is to be able to use Terminator to start the eight instances of the Pd patch with a single command (pd -nogui /home/pi/pd waveOSC10.pd), trigger the notes together via OSC, and have the random numbers be completely different both within each instance of the patch and across the running instances. Any ideas or advice is appreciated. Thanks!
-Clay