Impressive but overkill. I have my own ambient lighting setup. It runs on a Raspberry Pi and works like this.
1) I get the video signal from the composite output of my cable box (or any video source), which outputs HDMI and composite in parallel. If that weren't the case, I could have used an HDMI splitter and an HDMI-to-composite converter. I prefer the composite signal: it's not encrypted, and you don't really need to waste processing on an HD-resolution signal. The "resolution" of the LEDs around your display doesn't come close to even an SD video signal.
2) The composite cables run into a USB video capture card, which I picked up pretty cheap on Amazon. The USB card is plugged into a Raspberry Pi running Raspbian.
3) I got the driver for the capture card working on the Pi. Then I wrote a driver for the LED strip, which communicates over SPI.
4) My main program, which is set to run on power-up, has the capture card sample the video signal as fast as it can. I do some image processing on each capture to average the pixel colors in several rectangular areas around the border of the image, each assigned to a corresponding LED, then send the SPI signal to drive each LED to that color (a rough sketch of the loop is below, after the list). The sampling, image processing, and LED driving are fast enough that the LED "frame rate" is well above perceptible limits.
5) This is all configurable with a config file that accepts parameters for LED layout, how big the area to process for each LED should be, overlapping those areas for smoother color transitions, how many frames to average the colors over (also for smoother transitions), etc.
6) The wiring is the simplest part. The power source is split: one line goes to a USB connector to power the Pi, the other is split into power and ground for the LED strip. The strip needs two lines for communication as part of the SPI protocol, which are just wired to the appropriate GPIO pins on the Pi.
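For the curious, here's roughly what the per-frame loop from step 4 looks like. This is a simplified sketch rather than my actual code: it assumes OpenCV for the capture and the py-spidev bindings for SPI, it pretends the strip just takes raw RGB bytes over SPI (WS2801-style), and it only lays LEDs along the top edge. The `top_edge_regions` helper and `NUM_LEDS` are stand-ins for what my config file provides.

```python
import cv2            # capture from the USB grabber
import numpy as np
import spidev         # py-spidev bindings for /dev/spidevX.Y

NUM_LEDS = 50         # would come from the config file in practice

spi = spidev.SpiDev()
spi.open(0, 0)                        # /dev/spidev0.0
spi.max_speed_hz = 1000000

cap = cv2.VideoCapture(0)             # the composite-to-USB capture device

def top_edge_regions(w, h, n):
    # Placeholder layout: split just the top edge into n equal boxes.
    # The real layout covers all four edges and is driven by the config.
    box_w = w // n
    return [(i * box_w, 0, box_w, h // 8) for i in range(n)]

while True:
    ok, frame = cap.read()
    if not ok:
        continue
    h, w = frame.shape[:2]
    colors = []
    for (x, y, bw, bh) in top_edge_regions(w, h, NUM_LEDS):
        # Average the pixels in this box (OpenCV frames are BGR).
        b, g, r = frame[y:y + bh, x:x + bw].reshape(-1, 3).mean(axis=0)
        colors += [int(r), int(g), int(b)]
    spi.xfer2(colors)                 # one "frame" of colors out to the strip
```

The real program also handles the overlapping regions and the frame averaging mentioned in step 5, which this sketch leaves out.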
Overkill is the best part of personal projects. :) There's nobody telling you that the technical debt has to wait or that the existing solution is "good enough". It's the chance for a developer to delve deep and solve something in an interesting way, no matter how long it takes.
You're right. And the paths that creativity follows in the unconstrained case are important. Sometimes they lead to dead ends, and sometimes to new things that never would have gotten started otherwise.
I sometimes think of a distinction between the engineering way and the hacker way. Both are good, and the two are related, but they aren't the same. If HN has a purpose in the world it surely is to champion the latter, whose mottos are "gratifying curiosity" and "just because". By "the engineering way" I mean creativity operating under economic or organizational constraints that demand justification and exclude the whimsical. You end up with different projects and products that way.
But I don't think matmann2001 was being critical—just eager to share the cool details of their own ambient lighting setup, which is an excellent thing to do and squarely on the hacker side of this distinction.
You captured my sentiments well. I found OP's project extremely impressive, and I never meant to detract from the hard work he put into it. When I was designing my own system, doing HDMI decoding on an FPGA was an option I considered, but HDCP (a form of DRM encryption on HDMI signals) was the deal breaker. Although there has been some recent research on MITM attacks to break HDCP, it's currently beyond my budget and patience.
It's much easier to sidestep HDCP altogether, which can be done with a simple AV converter.
Absolutely agree. There's a place for both pragmatism and the artistic nature of unconstrained hacking, in both the work you do for other people and in side-project hacking.
There's a balance that every hacker has to find. Sometimes it's good to just get something out the door, sometimes you've invented an excuse to learn technology X. :)
That's the "analog hole", and it closed January 1, 2014. No Blu-Ray player manufactured after that date offers analog video output.[1]
Of course, if your source is a Blu-Ray player, the HDMI output will have HDCP encryption, so the approach from the article won't work either. There are some "HDMI splitters" which don't encrypt on the output side, but most splitters now do re-encrypt.
You can download the Verilog for an HDCP "overlay" (which, for obvious legal reasons, only encrypts with HDCP so that a video overlay can be put on a digital image) from: http://kosagi.com/netv_hardware/
Your setup is what I'd call overkill! ;)
You probably got this: http://lightberry.eu/ ?
Too many single components for my taste.
Having an all-inclusive solution like this one with a dedicated FPGA makes much more sense to me.
Unfortunately it seems that a high-res solution is never going to work for all use cases due to HDCP, processor speed, etc., and it will probably never launch commercially because of Philips' Ambilight patents like EP 1379082.
I've been looking for an in-between, all-in-one solution that takes HDMI as an input for the past few years, but there's no reasonable option I'd actually buy.
The best I've seen to date is this: http://www.keiang.de/Content-pid-32.html, but unfortunately it's not publicly available (he hadn't released the circuit diagram or the list of electrical components needed, at least the last time I checked the site).
Mine isn't a Lightberry. I just use the USB video capture card plus the Pi.
Also, an HD signal as input to the ambient light setup just isn't necessary. The software (or FPGA) is going to average pixel colors around the border to effectively reduce the resolution to match your LED spacing. With an HD signal, you're just spending more processing power/time to filter out and throw away more information.
The difference you're seeing is the blending that my setup does. For each LED, not only do I average a box of pixels nearest that LED, I also average over the past X frames, where X is typically between 5 and 10 (configurable); there's a rough sketch of that blending below.
It makes color and brightness transitions smoother, which makes the overall effect less distracting, and likely less seizure inducing.
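Concretely, the blending is just a rolling average over the last X frames' per-LED colors. A minimal sketch (the deque-based history and these names are illustrative, not lifted from my code):

```python
from collections import deque
import numpy as np

HISTORY = 8                      # "X": how many past frames to blend over

history = deque(maxlen=HISTORY)  # oldest frames fall off automatically

def blended(frame_colors):
    # frame_colors: (num_leds, 3) array of this frame's per-LED colors
    history.append(frame_colors)
    return np.mean(np.stack(list(history)), axis=0).astype(np.uint8)
```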
Sounds interesting. Any chance we could have a look at the source code to see how you achieve this? Until then, I guess the next best option is the OP's link if I wanted to build something like this.
I believe these simple lights add a lot to the experience, particularly when you're watching a movie. Thanks for sharing your solution :)
Did you ever measure the latency? If a scene is strobing, you might notice a considerable phase offset, which would probably be jarring. It would be nice to see that measured.
I average colors over the past X number of frames (where X is configurable). This might be a personal preference, but the sense of immersion is lost with rapid changes in colors or brightness. That's why I do this kind of blending.
If you were putting our computing resources head-to-head (my Pi vs his Spartan FPGA), sure. But you have to consider that the LED strips in both of our systems are the most power-hungry part, potentially drawing up to 1A per meter.
Taking that into account, the difference in power draw between our respective computing resources is dwarfed by the power requirements of the LEDs, to the point of being negligible.
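To put some rough, assumed numbers on it: a 4 m strip at the worst-case 1 A/m and 5 V is around 4 A × 5 V = 20 W, while a Raspberry Pi typically draws on the order of 2-3 W, so a few watts of difference between the Pi and an FPGA board disappears into the noise.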
1) At the GPIO level, the RPi actually has a kernel-space SPI driver. What I really meant was a driver interface for my program.
2) SPI is a ridiculously simple protocol. I have a background in embedded systems, so it's really not a lot of effort. Particularly compared to the OP, who deconstructed and intercepted the entire HDMI protocol on an FPGA. I'd consider the effort required for that task as quite a bit more.
Linux also offers user space SPI via `spidev` (i.e. the "driver" consists of a registration call, and all communication is done via regular `read()`/`write()` calls).
For low performance applications, it works perfectly.
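For anyone who hasn't used it: once the device node exists (e.g. /dev/spidev0.0 on the Pi), pushing bytes out really is just an open() and a write(). A minimal sketch in Python, assuming the default SPI mode and clock speed are acceptable for the strip:

```python
import os

# The kernel's spidev driver exposes the bus as a character device.
fd = os.open("/dev/spidev0.0", os.O_WRONLY)

# Each write() clocks the bytes straight out on MOSI, using whatever
# mode/speed the device is currently configured for (ioctls can change those).
os.write(fd, bytes([255, 0, 0] * 50))   # e.g. 50 LEDs of red, protocol permitting

os.close(fd)
```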
I thought HDMI was encrypted to prevent this sort of thing? If you can do this with an FPGA, what stops you from designing one that will copy a Blu-ray?
It's just easier to rip a Blu-ray in software. Playing back a Blu-ray and capturing the video this way is going to run in real time, while a software rip will go faster than that.
And just to clarify, "offers encryption" means it's optional. The sender can require HDCP if it wishes, but it can also transmit plain unencrypted data. Some devices will work either way, but restrict what they do without HDCP. (For example, iTunes on a Mac won't play DRM'd videos if a non-HDCP connection is present.) Presumably whatever this guy is using for his video source doesn't require HDCP.
I find it strange that it's apparently desirable both to eliminate any sort of light glow/bleed on the panel itself, and manually add glow bleeding out the sides. :-P
This is amazing, but how would a person that doesn't want to hack around too much just buy something like this? Like a starter kit, with some device and a led strip?
$10 for the software maybe. The cost of the parts would be much more. Strips of individually addressable LEDs are pretty expensive, even if buying in bulk.