This Sunday afternoon I found myself without much to do. As I tend to in such situations, I turned to my box of electronics-y bits and pieces and pondered what I could cobble together quickly. The thing that first caught my eye was a 60-LED RGB strip I’d bought a few months back but hadn’t put to good use. The strip uses the very popular WS2812B LEDs, which are RGB LEDs with a little control IC built in. The result is 60 addressable, bright LEDs that are very well supported in terms of software, since Adafruit’s NeoPixel range uses the same LEDs.
I downloaded the Arduino NeoPixel library and played with some of the examples. I must have spent a good 20 minutes just staring at the colours. At a bit of a loss as to what to do with the LEDs, I played with using a potentiometer to change the brightness and colour of the strip. Now I was getting somewhere: maybe I could make it into some sort of lamp? After all, it was bright enough to light the room with all the LEDs set to white, and you could play with the colours to create a “mood” or something. But a strip of LEDs isn’t really the right format for a lamp, and at only about 1 meter long it couldn’t really light a whole wall… Then it struck me: years ago I remember seeing an advert for a clever sort of TV backlight that used the screen content to set the colour of some LEDs behind the TV. The idea was that it was supposed to make the experience more “immersive”. While I’m not sure it does, it would be a suitably fun afternoon project.
Since the LEDs are incredibly simple to work with using an Arduino and the NeoPixel library, I decided to start work on the desktop program. It would need to figure out, somehow, what colour to set each LED based on what was on screen at that moment. I settled on using Java’s Robot class to capture a specific area of the screen: a 40-pixel-high strip running the width of the screen, about 80 pixels down from the top. The reason I chose not to use the topmost pixels is that in lots of test videos the colours right at the edge of the screen are less vibrant and less visually pleasing. Really it’s just personal choice! Anyway, I end up with a 40 by screenWidth sized image, which is then split up according to how many LEDs I am using. My TV is about 55 LEDs across, so I split the image into 55 segments. For each segment I iterate through every pixel to get an average colour: literally the mean of all the R, G and B values, nothing fancy. The result is an array of 55 colours ready to be displayed. These can be transmitted via serial, again using Java, to an Arduino UNO.
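The capture-and-average step looks roughly like this. The class and method names here are my own invention for illustration, not the actual project code; the capture itself would come from `Robot.createScreenCapture`, which I’ve left as a comment since it needs a live display:

```java
import java.awt.Color;
import java.awt.image.BufferedImage;

public class StripSampler {
    // The strip image itself would come from something like:
    //   BufferedImage strip = new Robot().createScreenCapture(
    //       new Rectangle(0, 80, screenWidth, 40));
    //
    // Split the captured strip into `numLeds` segments and average the
    // R, G and B channels of each segment -- a plain mean, nothing fancy.
    public static Color[] averageSegments(BufferedImage strip, int numLeds) {
        Color[] colours = new Color[numLeds];
        int segWidth = strip.getWidth() / numLeds;
        for (int i = 0; i < numLeds; i++) {
            long r = 0, g = 0, b = 0;
            int count = 0;
            for (int x = i * segWidth; x < (i + 1) * segWidth; x++) {
                for (int y = 0; y < strip.getHeight(); y++) {
                    int rgb = strip.getRGB(x, y);
                    r += (rgb >> 16) & 0xFF; // red channel
                    g += (rgb >> 8) & 0xFF;  // green channel
                    b += rgb & 0xFF;         // blue channel
                    count++;
                }
            }
            colours[i] = new Color((int) (r / count),
                                   (int) (g / count),
                                   (int) (b / count));
        }
        return colours;
    }
}
```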
The Arduino program works as follows:
1. Wait for some serial data.
2. When data arrives, store it in an array.
3. When 165 bytes have been received (55 colours with R, G & B), set the pixel colours using the array of stored data and display the frame.
4. Reset and go to 1.
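On the Arduino the steps above boil down to a tiny state machine. Sketched here in Java for consistency with the desktop side (the real sketch does the equivalent with `Serial.read()` and the NeoPixel library’s `setPixelColor()`/`show()`; the class name and structure are mine, not the actual sketch):

```java
public class FrameParser {
    private static final int NUM_LEDS = 55;
    private static final int FRAME_SIZE = NUM_LEDS * 3; // 165 bytes: R, G, B per LED

    private final byte[] buffer = new byte[FRAME_SIZE];
    private int received = 0;

    // Feed bytes one at a time as they arrive off the serial port.
    // Returns a complete 55-entry frame of {r, g, b} triples once all
    // 165 bytes are in; returns null while a frame is still partial.
    public int[][] feed(byte b) {
        buffer[received++] = b;
        if (received < FRAME_SIZE) return null;
        int[][] frame = new int[NUM_LEDS][3];
        for (int i = 0; i < NUM_LEDS; i++) {
            frame[i][0] = buffer[i * 3] & 0xFF;     // red
            frame[i][1] = buffer[i * 3 + 1] & 0xFF; // green
            frame[i][2] = buffer[i * 3 + 2] & 0xFF; // blue
        }
        received = 0; // reset, ready for the next frame
        return frame;
    }
}
```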
Each colour is transmitted from the desktop program to the Arduino as three bytes (R, G and B), so we only need to send 165 bytes per frame. Could this be reduced? Yes, but I really don’t think it’s necessary. 99.99% of the complicated code is handled by the NeoPixel library, leaving me to deal with reading from the serial port, parsing, and writing the correct colour to the right pixel. In terms of connecting hardware, the LED strip’s ground is connected to both the Arduino and the power supply, its signal pin to the Arduino, and its positive rail to the power supply only. A big decoupling capacitor (1000 uF) is connected across the power supply, as the current draw of the LED strip changes very rapidly. The LED strip is, at the moment, just sellotaped to the top of my TV. It would look better on the back of the TV with the LEDs facing the wall, but it is easier to test everything this way.
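Packing the colours into that 165-byte frame on the desktop side is a one-liner per channel. A minimal sketch (again, names are illustrative; the resulting array is what gets written to the serial port):

```java
import java.awt.Color;

public class FrameEncoder {
    // Pack the per-LED colours into the raw byte frame sent over serial:
    // 3 bytes per LED in R, G, B order, so 55 LEDs -> 165 bytes.
    public static byte[] encode(Color[] colours) {
        byte[] frame = new byte[colours.length * 3];
        for (int i = 0; i < colours.length; i++) {
            frame[i * 3]     = (byte) colours[i].getRed();
            frame[i * 3 + 1] = (byte) colours[i].getGreen();
            frame[i * 3 + 2] = (byte) colours[i].getBlue();
        }
        return frame;
    }
}
```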
I ran the system at about 20 fps, although it can handle 40 fps. The limiting factor at higher frame rates is the screen capture method as implemented by the Java Robot class: the time it takes to return varies from under 1 ms to about 15-16 ms, which can make the LED frames lag behind what is on screen just enough to be perceptible. I might take a look at an alternative method of screen capture, but I’m not sure it is really worth it. The overall effect is really quite cool, BUT the light from the LEDs tends to blend together because they are placed so close together on the strip. I suspect you could get away with maybe 4 or 5 groups of LEDs, each group displaying a single colour, and it would possibly look better. I also think the whole project could benefit from some colour/gamma calibration, as the colours, especially dark ones, can be a bit off.
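The usual fix for that gamma problem is a small lookup table applied to each channel before transmission. A sketch of what I have in mind; the 2.2 exponent is a common starting value, not something I’ve actually tuned for this strip:

```java
public class GammaTable {
    // Build a 256-entry gamma lookup table. The WS2812B's output is
    // roughly linear in the byte value, but our eyes aren't, so dark
    // colours come out too bright without correction.
    public static int[] build(double gamma) {
        int[] table = new int[256];
        for (int i = 0; i < 256; i++) {
            table[i] = (int) Math.round(255.0 * Math.pow(i / 255.0, gamma));
        }
        return table;
    }
}
```

Each R, G and B byte would then be replaced with `table[value]` just before packing the frame, at the cost of one array lookup per channel.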
Here is a video of the “finished” system. I’m not 100% sure I like the effect, but it is interesting, it made for a fun afternoon’s tinkering, and I may spend some more time on it later.