Having worked with the GrandMA2 and its timecode programming tools, it's hard not to feel underpowered when it comes to timecoding on Eos. I've used Eos timecode with great success, but it was not as pleasant as on the GrandMA2.
I would LOVE to see a timeline implemented with markers, colour coding, and layers. These concepts go beyond even the GrandMA2, though.
What I would like to be able to do:
I would like to import a sound file, extract the waveform, and underlay it on a timeline. I would start by marking important points on the timeline (verses, choruses...). Then I would program by learning different events, with, for example, all strobe subs colour coded red, all washes blue, and all cues green. I could then look through the timeline and visually adjust the timing of certain events, nailing them down to specific high points in the waveform.
I get why audio files on Eos could be a can of worms if implemented, but I would settle for at least a timeline with some structure to it: markers, plus different subs mapped to different layers, so I could, for example, solo only the sidelights while adjusting programming.
This is probably already in development in some shape or form, since this is one of the areas where Eos is most lacking.
Makes sense?