Integrated PTZ Camera (physical) control and programming with Eos and Augment3d

This observation/discussion could have found its way into a different forum, but the implications are wide enough that I thought it belonged here.

We are a performing arts academy with a black box theater shifting into multi-camera capture and live streaming of performances (in HD, for now). We have installed robust IP-based video transport using NewTek's NDI protocol, along with a collection of PTZ cameras in the black box. IP control allows a great deal of control over the various camera attributes, but the software currently available leaves much to be desired in theatrical settings.

Recently we replaced our old Smartfade 24/96 with Nomad, touchscreen monitors, and a 2x20 fader wing. The convergence of these two projects has sparked an idea which, I think, could revolutionize and greatly simplify the integration of physical PTZ cameras into both busking and cue-based productions.

PTZ cameras are in many ways analogous to moving lights: instead of projecting light, they capture it. The physical movement attributes are nearly identical. The difference is the need to adjust, often on a cue-by-cue basis, attributes like sensor gain, dynamic range, and noise reduction in order to achieve maximum image quality under different lighting conditions.

The GUI, workflow, flexibility, and moving-light control of Eos are a natural fit for PTZ camera control. Eos already contains the needed functionality (move in black, cue stacks, park, etc.), which, if implemented for PTZ camera control, would alone give organizations moving to new media distribution reason to consider it. However, there's much more.

Augment3d already relies deeply on the concept of virtual cameras to provide visualization. Imagine being able to use it not only to design and program what the live audience will see, but also what the live camera will capture.

An integrated stack of lighting and camera cues would let one step through the show during programming from the perspective of the live audience or of each camera. When live, cameras would always be in exactly the right place at the right time, triggering a video switcher as needed to provide the correct transition and handle any other audio/video-centric elements like streaming and capture.

I realize that some of the pieces are in place to do this using OSC, MIDI, etc. Perhaps there are folks out there who've already created profiles for PTZ cameras, but I'm not aware of any.

1) Are ETC thinking about this as a development/business opportunity?

2) Any forum members already dabbling with this?

Cheers all!

  • Forgive my ignorance: is there a standard for PTZ camera control? Everything I'm finding seems to be manufacturer-specific.

    Also, why not build a box that takes sACN, ArtNet, or DMX and converts it to PTZ commands, and just build a fixture profile for it as if it were any other fixture in your rig?
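A minimal sketch of that converter-box idea in Python, for illustration only: it maps two DMX-style channel values (0-255) to a standard VISCA Pan/Tilt Drive message. The two-channel layout (pan on channel 1, tilt on channel 2) and the bipolar mapping with a dead zone are assumptions, not any fixture-profile standard; the command framing and speed ranges follow Sony's VISCA documentation.

```python
# Hypothetical sACN/DMX -> VISCA bridge logic (not a real product's mapping).
PAN_SPEED_MAX = 0x18   # VISCA pan speed range is 0x01..0x18
TILT_SPEED_MAX = 0x14  # VISCA tilt speed range is 0x01..0x14

def _axis(value, speed_max):
    """Map a bipolar DMX value (0-255) to (direction, speed).

    0..119   -> move negative (left/up), faster toward 0
    120..135 -> dead zone: stop
    136..255 -> move positive (right/down), faster toward 255
    Direction codes: 0x01 negative, 0x02 positive, 0x03 stop.
    """
    if value < 120:
        return 0x01, max(1, round((120 - value) / 120 * speed_max))
    if value > 135:
        return 0x02, max(1, round((value - 135) / 120 * speed_max))
    return 0x03, 1  # stopped; a nonzero speed byte is still sent

def dmx_to_pan_tilt(pan_value, tilt_value):
    """Build a VISCA Pan/Tilt Drive payload: 8x 01 06 01 VV WW PP TT FF."""
    pan_dir, pan_speed = _axis(pan_value, PAN_SPEED_MAX)
    tilt_dir, tilt_speed = _axis(tilt_value, TILT_SPEED_MAX)
    return bytes([0x81, 0x01, 0x06, 0x01,
                  pan_speed, tilt_speed, pan_dir, tilt_dir, 0xFF])
```

A console fader at 50% on both channels would then land in the dead zone and emit a pan/tilt stop, while full and zero values drive at maximum speed in opposite directions.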

  • There are many. The predominant one seems to be VISCA (developed by Sony). The VISCA over IP implementation runs fine over NDI. For now, your suggestion is exactly what I'm planning to try.

    But... I still say this is an emerging market for ETC, and having native PTZ camera support in Eos would open up a big market for performance facilities fitting up for streaming.
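For anyone wanting to experiment with VISCA over IP directly, here is a short sketch of Sony's framing: each VISCA message is prefixed with an 8-byte header (2-byte payload type, 2-byte payload length, 4-byte sequence number, all big-endian) and sent over UDP, typically to port 52381 on Sony units. The camera IP below is a placeholder; check your camera's manual for its actual port and supported commands.

```python
import struct

VISCA_COMMAND = 0x0100  # payload type for a VISCA command message

def wrap_visca_ip(payload: bytes, sequence: int) -> bytes:
    """Prepend Sony's 8-byte VISCA-over-IP header to a raw VISCA payload."""
    return struct.pack(">HHI", VISCA_COMMAND, len(payload), sequence) + payload

# Example payload: recall camera preset 3 (VISCA memory recall: 8x 01 04 3F 02 0p FF).
recall_preset_3 = bytes([0x81, 0x01, 0x04, 0x3F, 0x02, 0x03, 0xFF])
packet = wrap_visca_ip(recall_preset_3, sequence=1)

# To actually send it (placeholder address):
# import socket
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(packet, ("192.0.2.10", 52381))
```

Triggering something like this from an Eos cue via an OSC relay script is one way to prototype the integration before any native support exists.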
