Cyberlights on 110V?

Hello all
I'm currently working on building up a good set of equipment for my attempt at lighting design. After reviewing many options for used equipment (and my parents demanding that I buy American-made), I've settled on the Cyberlight classic as a good choice. However, this brings up my nightmare: Cyberlight classics do not support 110V, which is what I'll primarily be dealing with. The same applies to various other lights from various manufacturers.

I am curious if anyone knows of a way to get Cyberlights to work on 110V, even though they do not natively support it. Preferably an easy, simple way that won't cost more than the lights themselves. I've already looked through the manual for the Cyberlights, and I understand their 208/240V support. So, is there any way to do this that is not unreasonably dangerous, or overly demanding in skill or expense?

Also, I'm quite curious if anyone knows why the Cyberlights and certain other fixtures from various manufacturers require 200+V, whereas other lights that use the same lamp and similar features do not. I'm just curious about that.

Please do let me know! I am really hoping to be able to use High End systems equipment in time, but this nasty roadblock popped up and I must find a way around it to do so.

-C
  • It's really a case of electrical code and experience rather than fear-mongering.

    One critical fact you're missing is that standard circuit breakers are only rated to hold 80% of their rating with a continuous load. That means a 15-amp breaker can only supply 12 amps continuously. Drawing more than 12 amps for a few minutes is usually fine. But you can end up with nuisance trips when you're running the load for an hour or more.

    Let's assume that a 1200-watt fixture is drawing 1400 watts from the electrical service after you account for ballast losses and motor power. A 15A/120V service derated to 80% can nominally supply 1440 watts. If the step-up transformer is 95% efficient, you've gone over the limit.
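    The arithmetic above can be sketched out in a few lines; the wattages and efficiency figure are the same illustrative assumptions used in this post, not measured values:

```python
# Power budget for running a 208/240 V fixture off a 15 A / 120 V branch
# circuit through a step-up transformer (illustrative figures only).
BREAKER_AMPS = 15        # standard branch-circuit breaker
LINE_VOLTS = 120         # nominal service voltage
DERATE = 0.80            # continuous-load rating of a standard breaker
XFMR_EFFICIENCY = 0.95   # assumed step-up transformer efficiency
FIXTURE_WATTS = 1400     # 1200 W lamp plus ballast and motor losses

continuous_watts = BREAKER_AMPS * LINE_VOLTS * DERATE   # 1440 W available
wall_watts = FIXTURE_WATTS / XFMR_EFFICIENCY            # ~1474 W demanded

print(f"Continuous capacity: {continuous_watts:.0f} W")
print(f"Draw at the wall:    {wall_watts:.0f} W")
if wall_watts > continuous_watts:
    print("Over the continuous limit.")
```

    The margin is so thin that even a modest transformer loss pushes the load past what the breaker is rated to carry continuously.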

    So, you're already in a precarious situation.

    Now, consider the standard wall outlet (which is what it sounds like Chris wants to use). Let's take a 15-amp wall outlet that's 250 feet from the electrical panel. If you're lucky it's 14-gauge wire (if it was a crummy install from the '60s it may be 16-gauge, or worse, aluminum wire). 500 feet of 14-gauge (250 for the hot, 250 for the neutral) is about 1.2 ohms. From Ohm's law, this gives you about 14 volts of voltage drop. If your electrical service started out at 120 volts, it's now down to about 105. If it was 112 or 115 to start with it could be even lower, and at the lower voltage you're drawing even more current through the circuit breaker.
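    A quick back-of-the-envelope check of the voltage-drop math above, using the post's round figures (about 1.2 ohms for the 500-foot round trip, 12 A of continuous load); these are illustrative numbers, not measurements of any real circuit:

```python
# Voltage drop over a long 14-gauge run (illustrative figures from the post).
WIRE_OHMS = 1.2    # ~500 ft round trip (hot + neutral) of 14 AWG copper
LOAD_AMPS = 12     # continuous draw at the 80% limit of a 15 A breaker

drop_volts = LOAD_AMPS * WIRE_OHMS     # Ohm's law: V = I * R -> ~14 V
outlet_volts = 120 - drop_volts        # what's left at the outlet

print(f"Voltage drop: {drop_volts:.1f} V")
print(f"A 120 V service sags to about {outlet_volts:.1f} V at the outlet")
```

    That sagging voltage compounds the earlier power-budget problem: the transformer has to pull proportionally more current to deliver the same wattage.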

    Standard residential/commercial outlets just aren't intended for this kind of high-power use.

    Finally, think about the electrical systems often found in old school buildings, clubs, etc. The in-wall wiring can be 50 years old or more. A heavy load will exacerbate any problems that are already there. Every loose connection and rat-chewed wire will be that much closer to overheating or burning out.

    Yes, sometimes it will work. It sounds like you (Tim) have had good experiences with using step-up transformers for larger fixtures. But there can be problems, and it's definitely not something I'd recommend to someone just starting out.

    (Just to be clear: I am *not* a licensed electrician and cannot give advice for any particular jurisdiction. I'm speaking strictly from my own personal experience and knowledge. Electrical codes vary widely across the country/world.)