EOS as a followspot system?

This is a text about my journey. I'm not looking for suggestions or solutions, just sharing information.

We have been thinking for a while about how to replace our Robert Juliat followspot at our theater and have looked at various systems, from fully automatic tracking to manually operated ‘robot followspots’.

While travelling around to trade fairs, we started to wonder whether we could benefit from using EOS and Augment3d instead. EOS already has a 3D environment where we have placed our fixtures and can drive several fixtures together towards a certain point. By using EOS, we don't need additional systems that we have to learn and maintain, and we can train our followspot operators in EOS so that the step is not so big if they want to develop further into EOS operators.

I am not an EOS operator but a service technician. The start of my little journey was the question ‘Can I move lights in the X and Y directions together in EOS?’. I could, with ‘X Focus’ and ‘Z Focus’. The next step was to see if I could control this with OSC. I started my experimentation with the OSC addresses ‘/eos/wheel/X Focus’ and ‘/eos/wheel/Y Focus’, since sending them is like spinning the X and Y Focus wheels on the console. Of course, Z Focus works the same way. The wheel command in EOS expects -1 or 1 for each step the light beam should move, something I needed to keep in mind on my trip.
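In its simplest form, such a test is only a few lines of Python (here with the python-osc library; the IP address and port are placeholders and must match the OSC settings on your console):

```python
# Minimal sketch: nudge the selected channels one wheel 'tick' over OSC.
from pythonosc.udp_client import SimpleUDPClient

EOS_IP = "10.101.1.100"   # placeholder, use your console/Nomad address
EOS_PORT = 8000           # placeholder, use the OSC UDP RX port set in EOS

client = SimpleUDPClient(EOS_IP, EOS_PORT)

# One step on the X Focus wheel and one step back on the Y Focus wheel.
client.send_message("/eos/wheel/X Focus", 1.0)
client.send_message("/eos/wheel/Y Focus", -1.0)
```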

Since I have Raspberry Pis lying around, this became my first test with Python. Just plug in a monitor, network cable, keyboard and mouse and start programming. Sending an OSC message via Python on the Raspberry Pi requires very little programming, especially if you download a ready-made OSC library.

The first test I did was with a regular computer mouse. I had everything at work, and after downloading an OSC library it only took about 30 lines of code to send OSC commands that move the selected lights in EOS in the X and Y directions depending on how I move the mouse.

 
Python code for mouse.
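The original listing is not reproduced here, but a rough sketch of the idea looks something like this (using the python-osc and evdev libraries; the device path, IP address and port are placeholders to adjust for your own setup):

```python
# Sketch: read relative mouse motion on a Raspberry Pi and forward it
# to EOS as wheel ticks. Assumes python-osc and python-evdev are installed.
from evdev import InputDevice, ecodes
from pythonosc.udp_client import SimpleUDPClient

EOS_IP = "10.101.1.100"        # placeholder, console/Nomad address
EOS_PORT = 8000                # placeholder, OSC UDP RX port set in EOS
MOUSE = "/dev/input/event0"    # placeholder, find yours under /dev/input/

client = SimpleUDPClient(EOS_IP, EOS_PORT)
mouse = InputDevice(MOUSE)

# Relative mouse motion behaves like an encoder: each event is a small
# signed step, so it can be passed straight on as -1/1 wheel ticks.
for event in mouse.read_loop():
    if event.type == ecodes.EV_REL:
        step = 1.0 if event.value > 0 else -1.0
        if event.code == ecodes.REL_X:
            client.send_message("/eos/wheel/X Focus", step)
        elif event.code == ecodes.REL_Y:
            client.send_message("/eos/wheel/Y Focus", step)
```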

I chose not to read the position of the mouse pointer on the screen, but found a simple way to read the mouse movement directly from the incoming data from the mouse on the Raspberry Pi. Even when the pointer stops at the edge of the screen, I still get motion data as I keep moving the mouse sideways with this technique. Using a computer mouse was beneficial because the incoming data works like an encoder, with a value of 1 or -1 depending on how I move the mouse. So the code only had to note whether I moved the mouse in the X or Y direction and send the incoming value as an OSC command for X or Y Focus. This also works with a trackball.

I later used the two mouse buttons to trigger two different focus palettes. Unfortunately, I could not access the scroll wheel on top in the same easy way. I would certainly have tested the scroll wheel as a dimmer or ‘Z Focus’, to be able to change the Z height above the floor while someone is walking, or to change the intensity.
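The button part fits in the same read loop as the sketch above. One way is to send command-line text with /eos/cmd; the palette numbers and the exact command-line syntax below are only an illustration and should be checked against your own show file:

```python
# Add inside the read_loop from the sketch above: fire two focus palettes
# from the mouse buttons. '#' acts as Enter on the EOS command line;
# the palette numbers are placeholders.
if event.type == ecodes.EV_KEY and event.value == 1:      # button pressed
    if event.code == ecodes.BTN_LEFT:
        client.send_message("/eos/cmd", "Focus_Palette 1#")
    elif event.code == ecodes.BTN_RIGHT:
        client.send_message("/eos/cmd", "Focus_Palette 2#")
```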

Since we think a lot about ergonomics, we dropped the computer mouse track and looked into the possibility of a $40 flight simulation joystick.

 
Mission SGFTJ 1.0 - Flight Joystick

When it comes to joysticks and flight simulators, a problem arises: a joystick does not work as an encoder but with potentiometers. EOS does not want the position of the joystick, it wants 1 or -1 for each step the lights should move one way or the other. Perhaps this could become an extra function in EOS, the ability to send a decimal value between -1 and 1 and let the software itself treat it as a ‘moving speed’, if enough people think it's a good idea.

Since I use a Raspberry Pi, there are libraries to download for various flight simulator joysticks. This means you do not have to do much of the low-level coding yourself, but can focus on the actual code that converts all the buttons and axes into OSC commands.

The big problem is how to convert a joystick position into a stable clock pulse that matches how fast we want the moving heads to move. Running the code in a regular ‘while True’ loop isn't a stable solution, I realised quite quickly. If you move the joystick up and to the side at the same time, two commands have to be sent, and if the first one takes a little longer you notice that you do not get a stable clock pulse and a smooth movement. I am just a hobby programmer, and it is certainly possible to make a more stable solution.

When I played a little with the joystick we bought, it still did not feel like a good solution; it is ‘stiff’ and it is difficult to make nice diagonal runs. It has some easily accessible buttons and sliders, but nobody who came by and tried it got a wow feeling, even with a nice diagonal run. Maybe it's better with a more expensive joystick, but I can't afford to buy all of them to find the best one. Since the joystick has three potentiometers, I have wondered whether a twist of the handle would be more appropriate for running left-right, with up-down on the stick for front-back, but I stopped at the idea when I got new things to test.

A colleague lent me an Xbox 360 controller that I could do some tests on. There I redid the code and used threading on my Raspberry Pi so that the X and Y axes could each keep track of their own clock pulse for the movement. The controller is more comfortable and I managed to get a decent diagonal run. The problem is that it is difficult to control the intensity with the Xbox controller, and the short throw of the thumbsticks makes it hard to find your way back to a good speed.
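For anyone curious, the threading idea looks roughly like this (a sketch rather than my exact code; it assumes python-evdev and python-osc, that the controller shows up as the event device below, and that the stick range is about ±32768 as with the usual Xbox 360 driver):

```python
# Sketch: one thread per axis turns stick deflection into a stream of
# -1/1 wheel ticks, each axis with its own timing so diagonals stay smooth.
import threading, time
from evdev import InputDevice, ecodes
from pythonosc.udp_client import SimpleUDPClient

EOS_IP, EOS_PORT = "10.101.1.100", 8000       # placeholders for your setup
PAD = "/dev/input/event2"                     # placeholder device path
DEADZONE = 0.15                               # ignore stick noise around centre
MIN_INTERVAL, MAX_INTERVAL = 0.01, 0.2        # seconds between wheel ticks

client = SimpleUDPClient(EOS_IP, EOS_PORT)
deflection = {"X Focus": 0.0, "Y Focus": 0.0}  # shared state, written by main loop

def axis_worker(parameter):
    """Send wheel ticks for one parameter at a rate set by its deflection."""
    while True:
        value = deflection[parameter]
        if abs(value) < DEADZONE:
            time.sleep(0.05)
            continue
        client.send_message(f"/eos/wheel/{parameter}", 1.0 if value > 0 else -1.0)
        # Larger deflection -> shorter interval -> faster movement.
        speed = (abs(value) - DEADZONE) / (1.0 - DEADZONE)
        time.sleep(MAX_INTERVAL - speed * (MAX_INTERVAL - MIN_INTERVAL))

for parameter in deflection:
    threading.Thread(target=axis_worker, args=(parameter,), daemon=True).start()

pad = InputDevice(PAD)
for event in pad.read_loop():                 # main loop just tracks the sticks
    if event.type == ecodes.EV_ABS:
        if event.code == ecodes.ABS_X:
            deflection["X Focus"] = event.value / 32768.0
        elif event.code == ecodes.ABS_Y:
            deflection["Y Focus"] = event.value / 32768.0
```

Since each axis sleeps on its own timer, a slow send on one axis no longer delays the other, which was exactly what made the single while-loop feel uneven.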

So can you control EOS in other ways than with the controls ETC presents to us? Absolutely. If you have a Raspberry Pi, a little creativity and some programming knowledge, you can easily come up with different solutions that talk directly to EOS. Look at what ready-made libraries are available for the Raspberry Pi to minimise coding time.

Have we moved on to a joystick followspot in EOS? No. I haven't found a way that makes me comfortable running a ‘followspot’ in Augment3d via mouse or joystick; the computer mouse solution was the one I was most comfortable with, but it falls short on ergonomics. I also tested an Apple touchpad connected over Bluetooth to my Raspberry Pi, but it was difficult to get a stable connection from day to day, so I gave it back to the sound department.

After the last Prolight & Sound in Frankfurt, I got new ideas. I apologise to all the companies that develop their own systems like this, but my curiosity made me do a new test.

Have you seen RoboSpot™ or Spotrack Evolution V4? There are advantages and disadvantages to both systems in my eyes. Spotrack has the advantage of being an independent system, as they are not a moving head manufacturer. Robe is a manufacturer and built RoboSpot around their own moving heads, so you cannot choose any other brand. Without going into more pros and cons, Spotrack is more interesting to us, as we want to be able to freely choose the moving head that suits us best, and they have some cooperation with ETC if I understand things correctly. You do not need a moving head with a built-in camera; there are camera moving heads that can act as your main fixture, to which you connect any moving heads.

With the knowledge I had gained from controlling lights with a mouse and game controllers, I wondered if it was possible to build my own RoboSpot or Spotrack connected directly to my EOS. The big question was how to get the tripod head to move in the X and Y directions with all the bearings needed. I also did not want to spend a lot of money buying things, since I did not know whether it would work at all. I wanted to build the project from things I already have access to, so that the ‘big cost’ in the project would only be my working time.

As a service technician, you run into old moving heads that are too expensive to repair and are kept only as spare parts, to quickly swap a module or circuit board. I have both Robe BMFL and Martin TW1 fixtures that are still in use, and some broken ones I can pick spare parts from. The choice fell on stripping a Robe BMFL of everything except the stepper motors for pan and tilt, to see if the stepper motors could be used as encoders. Both motors are in the yoke on the BMFL, unlike the TW1 where one motor is in the base. This means I can place my Arduino/Raspberry Pi in the yoke without having to run wires down to the base. The only cable is the USB up to my Nomad via the yoke.

When I started to disassemble the fixture, I found that there is an encoder together with each stepper motor in the BMFL. It is fed with +5V, so I chose an Arduino and Stefan Staub's EOS library for USB. Since the encoder boards were already there, I did not have to buy resistors to turn the stepper motors themselves into encoders. By taking the ‘Box1’ example from the EOS library, changing which pins the encoders are connected to, and changing “const String ENCODER_1_PARAMETER = "Pan"” to run ‘X Focus’ or ‘Y Focus’ instead (and the same for “ENCODER_2”), I now had my own ‘RoboSpot’ tripod plugged straight into my Nomad running EOS over USB.

From starting to strip the BMFL of everything until I had a “finished” working product, not pretty but working, took 26 hours. This is thanks to curiosity and the tests that led me here. Instead of the moving head, there is now a Nomad and two touchscreens where I can build any magic sheet and control whatever I want in the whole rig via Augment3d. Since it is USB connected, I can also plug it into my Apex console or a laptop running EOS.

   

With EOS 3.3 and the ability to bring in a video stream, the plan once it is released is to see if I can integrate a moving head with an NDI camera that I control via my ‘followspot system’. I know I can download the beta and test it, but I have no NDI camera to mount on a moving head.

The advantage of this solution is that I have full access to the entire rig and can pick any moving head I want as a followspot. If I want a followspot from behind, I just pick a free moving head, point it at the stage, and control it with X and Y Focus as a separate user, so that only the selected moving heads are affected.
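As a rough illustration of the ‘separate user’ idea (the user-prefixed OSC addresses and the group number are assumptions to verify against the Eos show control documentation):

```python
# Sketch: drive a hand-picked 'followspot' group as its own user so the
# programmer's command line is left alone. User and group numbers are placeholders.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("10.101.1.100", 8000)    # placeholders
USER = 9                                          # dedicated followspot user

client.send_message(f"/eos/user/{USER}/cmd", "Group 99#")      # select the spare heads
client.send_message(f"/eos/user/{USER}/wheel/X Focus", 1.0)    # then nudge only them
client.send_message(f"/eos/user/{USER}/wheel/Y Focus", -1.0)
```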

I have talked to ETC and emailed Spotrack asking them to consider an EOS mode in the Evolution V4, so that with Spotrack Evolution V4 (or V5) I could control movements in my EOS, a system where I already know and have control of where all the lights are in X, Y and Z, through e.g. OSC. Hopefully this will lead to ETC making some small adjustments in the EOS software in the future, so that what is already there in Augment3d can be used for more things and it becomes easier for us EOS users to keep working in EOS. My 26-hour test shows that it is possible, and it should not be too much work to do a test between Spotrack and EOS if the will is there from both parties.

We will probably not use this live on any stage; we will buy a ready-made system. There are certainly patent issues that mean this is not a product ETC will develop themselves, even though they already have the software to handle moving heads in this way, and I have shown what you can do in 26 hours with a creative mind.

A new command in EOS could be ”GROUP 12 FOLLOW CHANNEL 163 Z-HEIGHT 0.5” to help us build more fun things with Augment3d. My new command would mean: all fixtures in group 12 use the same X and Y positioning data as channel 163, raised 0.5 m above the floor. After tips from ChrDuda in the forum, I was shown how I can solve this already today. That makes it even more interesting to exploit EOS's potential to use Augment3d as part of a followspot system.

The next ‘playground’ will be to see if I can find alternatives for fully automated tracking directly in EOS without UWB communication. This will take a little more time and will require some form of purchase.

 //Tommy

Service technician at Kulturhuset Stadsteatern in Sweden.
