Kilauea Cove Build — Part 5: Smoke & Lights
Back to the good stuff — smoke and lights! Now that I have (almost) all of the decor in place, I can see how the lights will play off of everything. First things first: let’s re-mount the lights.
Re-mounting the PaleoPixels
Way back in Part 1 of this series, I had mounted the Adafruit 12mm Diffused Thin Digital RGB LED Pixels to the underside of the unfortunately shaped curved HVAC duct that plows through the volcano nook. The duct is an awkward intrusion, but asymmetry is almost a virtue in tiki decor, so I decided to go with it.
I had removed these “PaleoPixel” LEDs before painting, so I mounted them back to the duct with a series of heavy-duty zip ties wrapped around the duct. The zip ties also formed the base for a section of reed fencing that went over the bottom surface of the duct, with the LED pixels poking through. I pushed each of the pixels into a small rattan ball that came with some old faerie lights, which hides the LEDs from direct view and also secures the reed panel above them. To hide the front face of the duct, I built a small section of random-looking bamboo facade, mounted to a scrap-wood bracket that friction-fits onto the face of the duct.
Now, all of the lights are up, and I can start programming them! The PaleoPixels in the rattan balls are arranged in five rows of ten pixels each. There are six NeoPixel strips on the shelves — each one about 40 pixels long, in the 60 pixels-per-foot version of NeoPixels. One strip is on the top of each shelf, at the front edge, shining up onto the mugs. There is also one strip on the underside of each shelf, toward the back, illuminating the reed panel behind the mugs.
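For my own sanity, I think of the rattan-ball grid in terms of (row, column) positions that map to flat pixel indices. Here's a rough sketch of that bookkeeping — the constant and function names are my own for illustration, not taken from the actual repository:

```python
# Hypothetical layout constants for the nook (illustrative names,
# not from the real code).
PALEO_ROWS = 5          # rows of rattan-ball PaleoPixels
PALEO_COLS = 10         # pixels per row
SHELF_STRIPS = 6        # NeoPixel strips on the shelves
PIXELS_PER_STRIP = 40   # roughly 40 pixels per strip at 60/ft

def paleo_index(row, col):
    """Map a (row, col) grid position to a flat pixel index,
    assuming the rows are wired end-to-end in order."""
    if not (0 <= row < PALEO_ROWS and 0 <= col < PALEO_COLS):
        raise IndexError("position off the grid")
    return row * PALEO_COLS + col
```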
The initial code that I wrote for the Raspberry Pi to control the lights (available here in my GitHub repository) was great for very regular animations, like turning a whole row a single color or fading softly between two colors. That covered two of the modes: Idle and Drink-mixing.
Idle mode is just a preset that illuminates the mugs in a pleasing way. The PaleoPixels are set in solid-color rows of gold and blue, suggesting a sunset beach. The front-edge NeoPixels illuminate the mugs in a warm white, showing off the texture and glaze on the ceramics. Meanwhile, the underside NeoPixels at the back shine on the wall in a dim, slightly blue hue — the contrast makes the mugs pop from the background. For the future: I would very much like to have the colors in this mode slowly undulate, suggesting waves on a beach.
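In spirit, the Idle preset for the PaleoPixels boils down to filling each row of the grid with one of two colors. A minimal sketch of the idea — the exact color values and the function name are my guesses, not what's in the repo:

```python
# A minimal sketch of an Idle-style preset: solid rows of gold
# above and blue below, suggesting a sunset beach. Color values
# are illustrative assumptions, not the real preset.
GOLD = (255, 160, 20)
BLUE = (20, 60, 200)

def idle_frame(rows=5, cols=10):
    """Return a rows x cols grid of (R, G, B) tuples:
    gold on the upper rows, blue on the lower ones."""
    grid = []
    for r in range(rows):
        color = GOLD if r < rows // 2 else BLUE
        grid.append([color] * cols)
    return grid
```

The resulting grid can then be handed off to whatever routine pushes colors to the physical pixels.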
Drink-mixing mode is almost exactly like Idle mode, except that the NeoPixel strip underneath the bottom shelf shines a bright white onto the bar cart. This way, I can actually see what I’m doing when I’m trying to mix a drink!
This is a good start, but there’s one mode left to program: Volcano mode! The whole impetus for this project was to put a volcano in my basement. How can I program a full-color eruption simulation without controlling each pixel by hand?
A little bit about me: I’ve been a video editor and motion graphics artist professionally for almost twenty years. So, of course my first thought was, how can I make this play a video? I figured I could produce the animation I wanted in After Effects, and export the video. How could I make the Raspberry Pi translate that video to display on the LED pixels?
After quite a bit of experimenting, I found that the best simple solution was to use OpenCV, an open-source computer vision library. OpenCV loads the video and reports the color value of each pixel, and I can then send that data to the code I already had for setting the color of each LED. You can see the code in superpixel.py, specifically the PixelPlayer class. The SuperPixel library combines control for both NeoPixels and WS2801 “PaleoPixels” into one master controller.
To save on memory and overhead, I created an animation in After Effects that was mapped exactly one-pixel-per-pixel to the layout of the LED pixels in the display. This gets read in by the nook_controller.py script, which runs the whole shebang. You’ll see in the gallery above that the original video size is 16×16 pixels, since I can’t make the video any smaller than that. The Python script reads in the relevant 5×10 pixel grid and throws away the rest.
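The crop-and-flatten step is conceptually simple: pull the relevant 5×10 window out of each 16×16 frame and flatten it into the wiring order of the grid. Here's a rough sketch using plain nested lists standing in for decoded frames (in the real script the frames would come from OpenCV's VideoCapture, which decodes in BGR order, so a channel swap to RGB is also needed; the function name and offsets here are mine):

```python
def frame_to_pixels(frame, rows=5, cols=10, row_offset=0, col_offset=0):
    """Crop a rows x cols window from a decoded video frame
    (a 2D grid of (R, G, B) tuples) and flatten it into the
    wiring order of the LED grid, row by row."""
    pixels = []
    for r in range(rows):
        for c in range(cols):
            pixels.append(frame[row_offset + r][col_offset + c])
    return pixels
```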
The delay between frames is meant to match the original video’s 30 frames per second, but it drifts slightly, since the Raspberry Pi, unlike an Arduino, is not a real-time controller. However, it’s pretty close, and a little padding with delays here and there got it to sync up with the smoke and sound effects, which we’ll get to in a bit. This is also the point in the project when I upgraded from a Raspberry Pi 2 to a Raspberry Pi 3, whose faster processor is enough to smoothly display this animation.
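One way to keep playback close to 30 fps despite scheduling jitter is to compute each frame's deadline from the start time, rather than sleeping a fixed 1/30 s between frames, so the error never accumulates. A sketch of the idea (not the actual code in my repo):

```python
import time

FPS = 30

def frame_deadline(start, frame_index, fps=FPS):
    """Absolute time at which frame_index should be shown."""
    return start + frame_index / fps

def play(frames, show, fps=FPS, clock=time.monotonic, sleep=time.sleep):
    """Show each frame at its deadline. Late frames are shown
    immediately, so timing error doesn't pile up over the video."""
    start = clock()
    for i, frame in enumerate(frames):
        delay = frame_deadline(start, i, fps) - clock()
        if delay > 0:
            sleep(delay)
        show(frame)
```

The clock and sleep functions are injected only so the loop can be exercised off the Pi; on the hardware the defaults do the work.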
So now I have lights, but where’s the smoke? When I first planned out this project, I was going to use a small party/Halloween smoke machine to provide the thick smoke and pump it out on demand through a series of PVC pipes. I even built the pipes, complete with a big cardboard-tube reservoir for the smoke, fans and all. It all worked reasonably well, but I just wasn’t able to get the smoke to pump exactly on cue. I still want to come back to this idea, since I’d like to do a second version of the show that’s closer to my original vision, with music and more intricate timing. However, with the late-summer tiki party fast approaching, I needed a different solution.
The solution came in the form of a tiny smoke generator from Seuthe, designed for use in scale model locomotives and steamboats. I wanted to use a volcano-themed tiki mug as the basis for the volcano, and this generator fits perfectly inside the mug, instead of my having to duct smoke into it from somewhere else. I 3D-printed a small support mount for the smoke unit so it would sit upright in the bottom of the mug. Over the lip of the mug ran a 5V power wire from a relay, so I could control it on cue from the Raspberry Pi. You’ll see in the gallery at the top of this page that the mount has a sort of ‘hat’ on it that I thought might help distribute the smoke more evenly, but it turned out not to help, so eventually I removed it.
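Firing the smoke unit on cue is just a timed toggle of the relay pin. Here's a hedged sketch with the GPIO object injected so it can be tested away from the Pi — the pin number and function names are made up, not my real wiring:

```python
import time

SMOKE_PIN = 17  # illustrative BCM pin number, not necessarily the real wiring

def smoke_burst(gpio, seconds, pin=SMOKE_PIN, sleep=time.sleep):
    """Energize the relay driving the Seuthe unit for `seconds`,
    then switch it off again. `gpio` is any object with an
    output(pin, state) method (RPi.GPIO qualifies, so does a mock)."""
    gpio.output(pin, True)
    try:
        sleep(seconds)
    finally:
        gpio.output(pin, False)  # always shut the smoke off, even on error
```

The try/finally matters: if anything interrupts the wait, the relay still drops out rather than leaving the smoke unit running dry.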
The smoke itself was not enough. If we’re going to make it look like an active volcano (and not just a smoldering one), it needed the red glow of fire and lava to go with it. So I extended a wire from the last strip of NeoPixels and added a small ring of NeoPixels that sits just inside the lip of the mug, turning the smoke a bright lava red. During Volcano mode, that particular mug is also spotlighted in red under-lighting, while all of the other mugs go into silhouette.
That just leaves the small matter of the fish float lamp. I wanted to make sure that it turned off during Volcano mode, so that the relatively bright light wouldn’t wash out the atmospheric effects going on in the volcano nook. I got a SparkFun PowerSwitch Tail relay, which allowed me to control the power going to the lamp with a small signal from the Raspberry Pi. In the future, I would like to swap out the bulb in the fish float lamp for a Philips Hue, so I can control the light level and color more precisely. Maybe I could then turn it a dim red during the volcano show instead of switching it completely off.
Before I created the LED pixel animation, I first edited a set of sound effects to go with it, so I knew what timings I needed to hit. I pulled together some open-source jungle/island sounds for the idle animation, and worked in some real volcano eruption sounds for Volcano mode. At first I thought I would be able to play these sounds from the Raspberry Pi itself, but a horrible electrical interference noise came out of its headphone jack whenever it sent commands to the LEDs. During Volcano mode, it’s sending commands every 1/30th of a second, so that was a no-go.
I eventually settled on the Adafruit Audio FX Sound Board, which lets you load in several sound files. Each file is associated with a trigger pin, so when you send a signal to that pin, it plays the sound. There are also options for whether the sound loops, plays all the way through or latches only while the signal is held, plays a random sound from a selection, and so on. It’s not ideal, because it doesn’t let me do any mixing — one sound abruptly stops when the next starts — but it’s a good interim solution.
What’s left? Controlling it all! I hooked up three translucent arcade buttons, backlit with single NeoPixels. These let you switch into each mode: white for Drink-mixing, amber for Idle, and red for Volcano. Only, it seemed “unsafe” to allow anyone to hit Volcano mode at any time. We need to give time for the villagers to evacuate — or gather to watch! So, I added a covered safety toggle that you have to flip on before you can trigger Volcano mode with the red button. (I’ve wanted to use one of those safety toggles ever since I saw Top Gun as a kid, so how could I not?)
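The arming logic itself is a tiny state machine: the red button only does anything when the covered toggle is armed. A sketch of that interlock — the mode names mirror this post, but the class is my own invention, not code from the repo:

```python
class ModeController:
    """Tracks the current lighting mode and the Volcano safety toggle.
    Volcano mode only fires if the covered toggle is armed first."""

    def __init__(self):
        self.mode = "idle"
        self.armed = False

    def press(self, button):
        """Handle a press of one of the three arcade buttons."""
        if button == "white":
            self.mode = "drink-mixing"
        elif button == "amber":
            self.mode = "idle"
        elif button == "red" and self.armed:
            self.mode = "volcano"
        # a red press with the safety off is simply ignored

    def set_safety(self, armed):
        """Mirror the physical covered toggle."""
        self.armed = armed
```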
The switches, Raspberry Pi, and all of the other connections were soldered and mounted into a small plastic project box. I’d still like to give it a faux-wood finish, to make it blend in a little more, but it does the job!
That about wraps it up. Above is the diagram of the final circuit connections for the controller and all of the LEDs and smoke (created in OmniGraffle). If you do something like this yourself, you don’t necessarily have to use these pin-outs on the Raspberry Pi, but it does match up with the code that you see in my GitHub.
You can see the whole Volcano mode presentation, with sound, in this video! If you have any questions, please don’t hesitate to ask — here in the comments, or reach me on Twitter or Facebook.