Halloween hit and I wanted to do something cool. The goal was to be able to hit buttons on a tablet and trigger a lightning-type effect. I wanted to fire DMX strobes and LED up-lighting (Color Kinetics) in time with blanking the LED video wall segments, while also firing off a cue in Pangolin Beyond containing sequenced static beam shots, terminated safely. I also wanted to be able to trigger a few other cues in each program.
CellDNA accepts MIDI input, which can switch which cue is playing. However, the laptop I was running it on (which also had the duty of spitting out scraped video over a gigabit network to the LED panels) struggled. I didn't have a ton of time to scale the video down optimally.
FreestylerDMX can talk MIDI. Beyond can talk MIDI and OSC directly.
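For the OSC side, an OSC trigger message is just an address pattern, a type-tag string, and arguments, each padded to 4-byte boundaries. The address below is a made-up placeholder, not Beyond's real OSC map (check its manual for that); this is only a sketch of what such a message looks like on the wire:

```python
import socket
import struct

def osc_pad(raw: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    raw += b"\x00"
    while len(raw) % 4:
        raw += b"\x00"
    return raw

def osc_message(address: str, *ints: int) -> bytes:
    """Build a minimal OSC message carrying int32 arguments."""
    packet = osc_pad(address.encode())
    packet += osc_pad(("," + "i" * len(ints)).encode())  # type tags, e.g. ",i"
    for value in ints:
        packet += struct.pack(">i", value)  # int32, big-endian
    return packet

# Hypothetical address and endpoint -- not Beyond's actual OSC namespace.
msg = osc_message("/beyond/cue/trigger", 1)
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(msg, ("192.168.1.10", 8000))
```

OSC rides on UDP here, so sending it is a one-line `sendto` once you know the real address and port.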
TouchOSC usually targets a single endpoint, so I aimed its MIDI output at Beyond, which has a MIDI-thru feature. From there I simply shot the output into the rtpMIDI utility, which carried the commands over the network and delivered them to both the computer running CellDNA (borrowed from MAGFest) and the one running FreestylerDMX.
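rtpMIDI speaks the full RTP-MIDI session protocol (RFC 6295, handshakes and all), so the snippet below is not a reimplementation of it; it's just a toy illustration of the fan-out idea: build a raw MIDI Note On and duplicate it to multiple hypothetical UDP listeners standing in for the CellDNA and FreestylerDMX machines.

```python
import socket

def note_on(channel: int, note: int, velocity: int) -> bytes:
    """Raw 3-byte MIDI Note On: status 0x90 | channel, then note, velocity."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

# Hypothetical listener addresses -- rtpMIDI handled this for real via sessions.
LISTENERS = [("192.168.1.20", 5004), ("192.168.1.21", 5004)]

def fan_out(message: bytes, listeners=LISTENERS) -> None:
    """Send the same MIDI bytes to every listener (naive UDP duplication)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for host, port in listeners:
        sock.sendto(message, (host, port))
    sock.close()

msg = note_on(0, 60, 127)  # e.g. the "lightning" cue trigger on channel 1
# fan_out(msg)  # uncomment on a network that actually has listeners
```

The point of the one-fan-out design is that every downstream app maps the same note number to its own cue, so one tablet button stays in sync across all three programs.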
CellDNA was the only component that had a lot of issues, and I believe that was it drowning in the video codec.
The outcome was a pretty cool-looking Halloween display that simply wasn't scary enough. The wind often blew away the output from the fog and haze machines, though some people still caught the lasers. And it was cool when kids would straight up ask, "Is that a laser?" Yep.
But it needs to be scarier.