Developing alternate human interfaces on Hog

Has anyone tried to develop tactile interfaces for Hog? I am wondering whether it would be possible to build a Max/MSP GUI, or use some other programming platform, to interface with a Hog lighting controller via MIDI or Art-Net. One general idea is a touchscreen with a visual representation of a lighting array, which you could control with gestures, like finger painting.
Also, is there an SDK for the Hog platform, or a way of modifying or creating windows?
I would like to build chases where you could drag your finger across a rig and have interactive, tactile control. The overall goal is to create a specific effect for a film, but I would like to develop it as something other people may enjoy using as well.
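For the Art-Net route, the wire format is publicly documented, so a prototype can talk to a node or console with nothing but a UDP socket. Below is a minimal sketch of building and sending one ArtDmx frame; the console IP address is a placeholder, and the universe/channel numbers are just for illustration.

```python
import socket
import struct

def artdmx_packet(universe: int, dmx: bytes) -> bytes:
    """Build an ArtDmx packet (OpCode 0x5000) carrying up to 512 channel values."""
    if not 1 <= len(dmx) <= 512:
        raise ValueError("DMX payload must be 1-512 bytes")
    header = struct.pack("<8sH", b"Art-Net\x00", 0x5000)  # ID string + OpDmx, little-endian
    protver = struct.pack(">H", 14)                       # protocol version, big-endian
    seq_phys = bytes([0, 0])                              # sequence disabled, physical port 0
    uni = struct.pack("<H", universe)                     # 15-bit port-address, little-endian
    length = struct.pack(">H", len(dmx))                  # data length, big-endian
    return header + protver + seq_phys + uni + length + dmx

# Set channel 1 of universe 0 to full; 192.168.0.100 stands in for
# whatever node or console is listening on the standard port 6454.
frame = bytearray(512)
frame[0] = 255
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(artdmx_packet(0, bytes(frame)), ("192.168.0.100", 6454))
```

Whether a given Hog build accepts Art-Net input directly, or only outputs it, is a separate question; the sketch just shows that the transmit side is a few lines of code.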

many thanks
-one small fish
joe abraham dean
set lighting console programmer
  • That would be interesting functionality as well, but it's not really what this thread is about. Vista is an operating system with its own limits and problems. What I am talking about is a method of programming that would separate the programmer from the code, and from the hard entries associated with encoders, keystrokes, and command-line entries.

    I agree with you on some level, if I understand your suggestion. I beta tested the GrandMA (I think it was serial number 11), and my strongest suggestion to A.C.T Lighting was to allow the creation of macros and commands via a script editor, so that you could recreate a complex series of commands via cut and paste. This is still not implemented in lighting consoles for some reason; why, who knows. I think every cue should have a line of code that could be modified in a text editor and reintroduced to a show via the web, or via a simple cut and paste from my iPhone (sarcasm: yuk yuk, no cut and paste on the iPhone yet). I love the MA for many reasons, but these days I am a Hog programmer for many reasons.

    And as DrEad stated: "The lighting/visual control becomes integrated into a dance performance by dance movements, rather than programming and playback via timecode or 'next cue'..."

    This would be more of a conductor's approach to programming. I could not see a conductor effectively conducting an orchestra via command-line entries or copy and paste.

    Let's think outside the code and look at a rig for what it is: objects in space with unique IDs and functions, individual players if you will. I would love to be able to grab fixtures with one gesture, point them with another, color them, and so on: make a chase that excludes the lights my fingers touch, or includes them, and maybe learn timing from those gestures.

    All of these things are possible today at an experimental level, but nothing has been packaged.
    Thank you for posting on this, Edward Hodge.
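The "objects in space" idea is easy to prototype with plain hit-testing: give each fixture an ID and a plan-view position, then select whatever a finger stroke passes near, and build the chase from the rest. Everything below (the rig layout, the touch radius, the stroke coordinates) is invented for illustration, not taken from any console's data model.

```python
import math
from dataclasses import dataclass

@dataclass
class Fixture:
    fid: int        # unique fixture ID, as patched on the console
    x: float        # plan-view position in metres (illustrative)
    y: float

# A small hypothetical rig: six fixtures spaced 1 m apart on one pipe.
rig = [Fixture(fid=i + 1, x=float(i), y=0.0) for i in range(6)]

def touched(rig, stroke, radius=0.5):
    """Return the IDs of fixtures within `radius` of any point on a finger stroke."""
    hits = set()
    for fx in rig:
        for (tx, ty) in stroke:
            if math.hypot(fx.x - tx, fx.y - ty) <= radius:
                hits.add(fx.fid)
                break
    return hits

def chase_order(rig, excluded):
    """Build a chase order that skips the fixtures a finger touched."""
    return [fx.fid for fx in rig if fx.fid not in excluded]

# Drag a finger across the middle of the pipe:
stroke = [(2.0, 0.1), (2.5, 0.1), (3.0, 0.1)]
skip = touched(rig, stroke)      # fixtures 3 and 4 fall within the touch radius
print(chase_order(rig, skip))    # the chase runs on the untouched fixtures
```

Learning timing from gestures would be a second layer on top of this (timestamping the stroke points and deriving rates), but the selection step really is this simple once fixtures are modelled as objects with positions.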