How I used open source tools to build a theater lighting system

Open source software was the star of the show for a local production of Les Mis.

The things we do for family, eh? Sometimes I wonder why I do it to myself, this not being the first time my perfectionism has led me to do far more work than a task originally required.

My father-in-law approached me a little over a year ago, because the church he attends was putting on a production of Les Misérables. To make it bigger and better than the previous year, he had purchased some lights and a lighting desk and was hoping I could operate them. "Sure!" I said. "How hard can it be?" He then described his vision for one of the scenes: a large battle with flashing lights; you know, really exciting! "Cool," I said, and unwittingly agreed.

I sat down for the first time to work the lighting desk he had purchased: a small, low-end, multichannel DMX-512. It had capacity for hundreds of scenes, fading, and even a nice jog wheel. I was excited! Despite the fact that the manual was obviously a poor translation and the desk required a veritable procession of button presses just to get it to do something, I persevered. I created a battle-like scene, thinking it was smart to start with the part I believed would be the most difficult. I showed it to my father-in-law and saw his disappointed face. "This is just a test," I was quick to point out. "The real thing will look much better, of course." He walked away. D'oh.

I started thinking about the scene's problems. It wasn't random enough ... it needed to look more random. I decided that if I could generate enough scenes where the lights moved on certain paths, I could just fade between them quickly using the built-in fader, and all would be good. Right? Generating the scene information was hard. It was at that point I enlisted my first open source tool in the job: Blender!

I used the animation curves in Blender to play with lights in a scene and generate a convincing battle look and feel. Then I wrote some Python scripts inside Blender to export the movements to a CSV, giving me column data for each scene, which I could then program into the desk. I finally felt like I was getting somewhere.
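
The exporter itself can be plain Python; the article doesn't show the actual script, but a minimal sketch might look like the following, where each channel is a function mapping a frame number to a value. Inside Blender, that function could simply be an FCurve's `evaluate()` method; here the `export_channels` name and the column layout are my own assumptions.

```python
import csv

def export_channels(channels, frame_count, path):
    """Write one row per frame, one column per channel, to a CSV file.

    `channels` maps a channel name to a function frame -> value; inside
    Blender that function could simply be an FCurve's evaluate() method,
    e.g. obj.animation_data.action.fcurves[0].evaluate.
    """
    names = sorted(channels)
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["frame"] + names)
        for frame in range(frame_count):
            writer.writerow([frame] + [round(channels[n](frame), 3) for n in names])
```

Each resulting column can then be keyed into the desk scene by scene.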

Setting aside the fact that just programming this thing was enough to induce stress (on top of the feeling that I was giving myself a repetitive stress injury), and that a single slider was supposed to control a time duration anywhere from 0.2s to 30s (I mean, come on, really?!), the results were still less than stellar. It just didn't look right. Sigh. If only I had complete control over it, I thought. If only I could program these lights using Python.

My mind drifted, and a vision of a guy relaxing in a deck chair on a beach programming his DMX-512 lighting in Python floated by. "He looks so happy," I thought. I must have what he has.

I had a chat with my brother-in-law, a fellow tech enthusiast who, I might add, had already taken on the sound mixing. We decided that it would be a worthwhile investment to purchase a USB DMX-512 interface. We bought one for probably half the cost of the desk, and I began exploring options for controlling DMX-512 on Linux. I found a few applications, but most of them seemed from the outset to be built around static scenes. I also had this niggling feeling that this project would expand, grow, and suffer from scope creep.

Finding no application that satisfied my needs for both availability on my OS and functionality, I decided to write my own. I found the Open Lighting Architecture (OLA) and, after a bit of tweaking (blacklisting a few conflicting kernel modules), I had a tiny Python script that would send data to the lights and turn them on and off. I was ecstatic. Now I just had to write an entire system for achieving the dream.
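
The article doesn't show that first script, but a sketch of the idea is below: pack channel levels into a full 512-slot DMX frame, then hand the frame to OLA's Python client. The `build_frame` helper is my own invention; the commented OLA calls follow the client's `ClientWrapper`/`SendDmx` pattern and need a running `olad` daemon.

```python
import array

def build_frame(levels):
    """Pack a {dmx_channel: level} dict into a full 512-slot DMX frame.

    Channels are 1-based (DMX convention); levels are clamped to 0-255.
    """
    frame = array.array("B", [0] * 512)
    for channel, level in levels.items():
        frame[channel - 1] = max(0, min(255, int(level)))
    return frame

# Sending the frame through OLA looks roughly like this (requires olad running):
#
#   from ola.ClientWrapper import ClientWrapper
#   wrapper = ClientWrapper()
#   wrapper.Client().SendDmx(1, build_frame({1: 255}),
#                            lambda status: wrapper.Stop())
#   wrapper.Run()
```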

I started off small, considering what was absolutely necessary. I wanted a common format to store the scenes. One thing that bothered me about the lighting desk was its lack of backup. If the desk suffered some malfunction, all my precious scenes could be wiped out. Hours and hours of work, wasted. No. Not an option. I chose the open format YAML for my scenes. I used it a lot in my day job, and it seemed to fit the bill.
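
The article doesn't show the real file layout, so the fragment below is purely a hypothetical illustration of what a YAML scene store along these lines could look like; every field name here is my own guess.

```yaml
# scenes.yaml -- hypothetical layout; the article doesn't show the real format
scenes:
  - name: battle
    fade: 2.0          # seconds to fade into this scene
    channels:
      light1: {dimmer: 128, red: 255, pan: 90}
      light2: {dimmer: 200, blue: 64, tilt: 45}
```

Being plain text, a file like this can live in version control, which solves the backup problem outright.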

Pretty soon I had the ability to fade between two scenes in a sane manner, interpolating between each channel value for each light. This meant that if the brightness channel on light 1 was at 50% in scene 1 and 75% in scene 2, then the software would linearly ramp from one to the other over the specified time period. Excellent!
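
A minimal sketch of that per-channel linear interpolation follows; the `fade` name and the dict-of-channels representation are my assumptions, and values here are treated as raw numbers (the same math applies whether they are percentages or 0-255 DMX levels).

```python
def fade(scene_a, scene_b, t):
    """Linearly interpolate every channel between two scenes.

    t runs from 0.0 (all scene_a) to 1.0 (all scene_b); a channel present
    in only one scene is treated as 0 in the other.
    """
    out = {}
    for channel in set(scene_a) | set(scene_b):
        a = scene_a.get(channel, 0)
        b = scene_b.get(channel, 0)
        out[channel] = round(a + (b - a) * t)
    return out
```

Stepping `t` from 0 to 1 over the fade duration on each tick of the output loop produces the smooth ramp described above.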

I then started to add modifiers to the channels. This was something I had seen in Blender. Animated channels of data (an arm-bone bend, a camera x-axis translation) could have keyframes, but they could also have modifiers applied to the channel to generate things like noise or other effects. I added a few rudimentary modifiers to my project, and soon I had cos/sin giving the tilt/pan channels on the light a circular motion. I then added waypoint modifiers using spline curves to enable the channel to smoothly go to different values at different points in time. All was steadily progressing.
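
As a concrete example of the cos/sin modifier, the sketch below drives pan and tilt in a circle over time; the function name, centre point, and defaults are all assumptions for illustration.

```python
import math

def circle_modifier(t, centre_pan=128, centre_tilt=128, radius=40, period=4.0):
    """Drive pan/tilt in a circle around a centre point.

    t is elapsed time in seconds; one full revolution per `period` seconds.
    Returns (pan, tilt) as rounded DMX channel values.
    """
    angle = 2 * math.pi * (t / period)
    pan = centre_pan + radius * math.cos(angle)
    tilt = centre_tilt + radius * math.sin(angle)
    return round(pan), round(tilt)
```

A waypoint modifier works the same way, except the value at time `t` comes from evaluating a spline through the waypoints instead of a trigonometric function.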

Working on the lights while not at the church was challenging. It was hard to visualize what the scenes looked like, so I took to Blender once more, adding a simple HTTP API to the application and asking Blender to routinely query and update the lights I had set up in a scene. This allowed me to demo my lighting and see what was happening.
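
The article doesn't describe the API's shape, but a minimal sketch of such an endpoint, using only the standard library, might look like this; the `/channels` path and the JSON layout are my assumptions. A timer inside Blender would then `urlopen` this URL and push the values onto the lamp objects in the scene.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

CHANNELS = {"light1.dimmer": 255, "light1.pan": 90}  # live state, simplified

class ChannelHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/channels":
            body = json.dumps(CHANNELS).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):  # keep the console quiet during shows
        pass

def start_server(port=0):
    """Serve channel state in a background thread; port 0 picks a free port."""
    server = ThreadingHTTPServer(("127.0.0.1", port), ChannelHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```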

[Video: Demo of early visualization]

A smoke machine was added to the set. "Can you control that with your thing too?" I was asked. "Yes, I guess," I replied. Due to the very clean and minimalist UI, there were only scene controls, no channel faders for the lights. So, controlling the smoke meant baking some values into the scenes. Smoke is not only a non-free commodity, but performers tend to get rather upset when you deluge them with smoke while they perform, all in the name of testing. I could have used one of the waypoint modifiers but didn't really want it triggering every time I entered that scene.

I needed another way to control it. My mind wandered. "Wouldn't it be cool if I could use a WiiMote to do just that?" Then it hit me. "Why the heck can't I?" I turned to the cwiid library, picked up a WiiMote controller that was languishing in the lounge, and put it to good use. Soon the smoke machine channel had a WiiMote modifier that allowed me to raise the controller to douse the performers in a thick cloud of smoky goodness.

The pitch of the controller is linked to the overall brightness of lights 1, 2, 3, and 4. The roll of the controller is linked to the amount of blue on lights 1 and 4, and also to the tilt on lights 2 and 4.
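
The mapping from controller attitude to a channel level can be sketched as a small pure function; the dead zone and saturation angle below are my own assumptions, not values from the article. The commented lines show roughly how the angle would be read from the accelerometer with cwiid.

```python
def tilt_to_level(angle_deg, dead_zone=10.0, full_at=60.0):
    """Map a controller tilt angle (degrees) onto a 0-255 DMX level.

    Below `dead_zone` nothing happens; the level ramps linearly and
    saturates at `full_at` degrees.
    """
    if angle_deg <= dead_zone:
        return 0
    if angle_deg >= full_at:
        return 255
    return round(255 * (angle_deg - dead_zone) / (full_at - dead_zone))

# With cwiid the angle would come from the accelerometer, roughly:
#   import cwiid, math
#   wm = cwiid.Wiimote()                 # press 1+2 on the WiiMote to pair
#   wm.rpt_mode = cwiid.RPT_ACC
#   x, y, z = wm.state['acc']
#   pitch = math.degrees(math.atan2(y - 120, z - 120))  # 120 ~ rest calibration
```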

Getting carried away, I added dynamic GTK UI generation of the channel sliders from the YAML configuration, meaning that I could switch to different channels and override them. It's a good thing I did that, as performers never seem to use the same mic every evening. Lesson learned. Now I could react to those changes and bring lights up and down manually, just like using the desk.
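
One way to structure that generation, sketched under my own assumptions about the data shape, is to derive plain slider specs from the configuration and only then build widgets from them; the commented lines show the corresponding PyGObject calls.

```python
def slider_specs(channels):
    """Turn a {channel_name: value} mapping into slider definitions.

    Every DMX channel gets a 0-255 slider preset to its current value.
    """
    return [(name, 0, 255, value) for name, value in sorted(channels.items())]

# Building the actual GTK widgets from those specs looks roughly like:
#   import gi
#   gi.require_version("Gtk", "3.0")
#   from gi.repository import Gtk
#   for name, lo, hi, value in slider_specs(channels):
#       scale = Gtk.Scale.new_with_range(Gtk.Orientation.VERTICAL, lo, hi, 1)
#       scale.set_value(value)
#       scale.connect("value-changed", on_override, name)  # hypothetical handler
```

Keeping the spec generation separate from the widget code also makes the mapping trivially testable without a display.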

Three performances came and went. The battle scene, awash with sin/cos and the new randomized modifier, looked spectacular. Coupled with the audio that my brother-in-law put together, we felt pretty good about it.

[Video: Demo of battle scene]

Fast forward to a year later. I have a performance to do tomorrow, and my brother-in-law just asked me, "Is there any way you can get your system to play the sounds so we don't have to trigger them together?" "Of course," I reply. Now it can, thanks to GStreamer. It can also control a channel based on the amplitude of the wave, giving me a way to sync the explosions in the battle scene to lighting movements. Thanks, SciPy!
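
The amplitude-to-channel idea reduces to computing an RMS envelope over chunks of samples; a sketch is below. In the real setup the samples would come from `scipy.io.wavfile.read()`; the function name, chunk scheme, and 16-bit peak value here are my assumptions.

```python
import math

def amplitude_envelope(samples, chunk_size, peak=32768):
    """Reduce PCM samples to one 0-255 DMX level per chunk via RMS.

    `samples` are signed 16-bit values; in practice they could come from
    scipy.io.wavfile.read(). One output level per `chunk_size` samples.
    """
    levels = []
    for start in range(0, len(samples), chunk_size):
        chunk = samples[start:start + chunk_size]
        rms = math.sqrt(sum(s * s for s in chunk) / len(chunk))
        levels.append(min(255, round(255 * rms / peak)))
    return levels
```

Feeding each level into a light's dimmer channel in time with playback makes the lights pulse with the explosions.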

So what's the takeaway here? Without open source I would never, ever have finished this project. I would have had to write or buy WiiMote drivers, DMX-512 drivers, DMX routing and controlling backends, visualization software, configuration standards, UI generation. Open source gave me three things, which are all invaluable.

  1. It gave me access to all these things so that I could focus on writing the glue that pulled them all together.
  2. It gave me the ability to continually update and add to my frankenproject, which no doubt one day I will rewrite and release.
  3. It keeps my father-in-law happy...for now.

Without open source I would have been left a gibbering, repetitive-stress-injury-afflicted mess, muttering something about the need to hold the scene button for three seconds before pressing a combination of the program button and the clear button.

I think open source just saved my life, and gave me one hell of a good ride while it was at it.

Peter is a passionate Open Source enthusiast who has been promoting and using Open Source products for the last 10 years. He has volunteered in many different areas, starting in the Ubuntu community, before moving off into the realms of audio production and later into writing.


Being a sound and lighting tech, and a (relatively novice) Linux user, I found this article pretty interesting, and I can certainly relate to the frustrations of programming certain lighting boards.

I'm assuming the DMX adapter you purchased also featured DMX input? It seems to me a waste to omit the hardware lighting board entirely, when it could be set up as a simple bank of dimmer channels, allowing the physical faders to supply control value inputs to whatever software functions or modifiers you assign them to. This would provide 'hands-on' operation and eliminate the need for the Wii controller, as smoke could simply be controlled (or overridden) by a fader.

I've considered doing something similar to provide more physical control to Martin M-PC lighting software, as our Martin control surface has only a handful of faders, and we have another lighting board with 48 faders, but limited programmability.

A MIDI controller could also work.

Good luck with your project. If you pursue releasing it, I'd suggest a good GUI, as your average lighting tech probably doesn't know how to code.

Right, so the controller we bought (on a limited budget, with limited availability at short notice, and seeming the most compatible) was DMX out only. I totally agree that we could have used the inputs, and indeed that could be an awesome idea for the future: utilizing techniques similar to MIDI learn, where we could just right-click a fader on the screen, click learn, move a fader on the control board, and then have it mapped to that fader.

It could also be used to allow multiple channels to be controlled by the same fader on the control surface, giving us a cheap man's gang. (Ganging faders into groups was actually something I was going to add in a new version, giving the ability to group faders and have them all controlled by a single one, which would appear either on the side or on a new page.)

As I said, once I added the GUI to control things, the Wii controlling became superfluous, but it was a cool toy to play with. You read my mind: a MIDI modifier was also one of the things I wanted to add.

There is no coding needed to use the tool; the lighting format, while verbose, is easy to use. An example of it is here,

However, making a good GUI for it is something I would love to do; the ability to set constants in the UI, apply modifiers to channels, create scenes, and even see graphs of the channel data in realtime was also in the plans. Before I do that, though, a complete rewrite of the modifier system would be in order. There are still some things it does that I really do not like ;)

Creative Commons LicenseThis work is licensed under a Creative Commons Attribution-Share Alike 4.0 International License.