A number of games implement Mumble Link (https://wiki.mumble.info/wiki/Link), which exposes a memory-mapped file containing positional data. Voice-comms software uses it to provide directional sound, so when a teammate speaks you hear their voice as if you were both at your respective locations in the shared virtual space. Some games also expose other data through this, including health/mana/ammo levels, which could be used to control lighting based on game state.
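To make that concrete, here's a rough sketch of decoding the leading fields of Mumble's shared-memory struct (the layout and field names come from the Mumble wiki's LinkedMem definition, not from anything in this thread, so treat it as an assumption). It only parses the fixed-size header out of a raw byte buffer; actually opening the "MumbleLink" shared memory is platform-specific and left out.

```python
import struct

# LinkedMem begins with uiVersion and uiTick (uint32 each), followed by
# avatar position, front, and top vectors as three floats apiece.
# Layout assumed from the Mumble wiki; little-endian, no padding.
HEADER_FMT = "<II9f"
HEADER_SIZE = struct.calcsize(HEADER_FMT)  # 44 bytes

def parse_link_header(buf):
    """Decode the leading fixed-size fields of a LinkedMem buffer."""
    fields = struct.unpack_from(HEADER_FMT, buf)
    return {
        "version": fields[0],
        "tick": fields[1],                 # incremented each game frame
        "avatar_position": fields[2:5],    # metres, game-world axes
        "avatar_front": fields[5:8],       # unit vector the avatar faces
        "avatar_top": fields[8:11],        # unit vector out of the avatar's head
    }
```

The game-state extras (health/mana/ammo) aren't in these fixed fields; games that expose them typically stuff them into the later identity/context strings, so you'd parse those separately.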
Logitech devices - http://gaming.logitech.com/en-au/developers (haven't looked into this since the G15 was new).
Corsair CUE SDK - http://forum.corsair.com/v3/forumdisplay.php?f=271
Razer Chroma SDK - http://developer.razerzone.com/chroma/
This might be much, much harder, though, because you'd have to trick each SDK into thinking it's talking to the devices it expects. A good example of this is a nice hack that takes Logitech LED commands and sends them to a Corsair keyboard instead - http://forum.corsair.com/v3/showthread.php?t=140755
This is one place where some common standards / APIs would have really helped. Instead, the community is left to do all the work.
Or there's Roccat's Power Grid, which interfaces games to a phone app. No idea how open or well-documented that is, though.
As a non-hacker, some of the similar products I've found are Philips Hue and Lightpack (https://www.kickstarter.com/projects/woodenshark/lightpack-2...)
I have no idea how HDMI works, so I don't know if this is even possible. Would you see a degradation in quality even though you are not re-rendering anything? i.e. can you do this without a high-powered graphics card acting as the HDMI proxy? Is it even possible to intercept an HDMI stream, or is it a proprietary protocol?
EDIT: Some good info here. tl;dr: seems possible with < $200 hardware.
In a similar vein, here's a DIY Ambilight clone, which you can customise to fit different sizes of screen via software and a Raspberry Pi:
It's a bit trickier since I need to drive them at ~30 V and 1 A per channel :)
If you have any tips, please share, because that would be very useful for this project.
It uses chordify.net to extract the beats and chords from the song. Code is here: https://github.com/nick264/music-processor-master
It's just a matter of timing the lights with the audio output. My setup is a Raspberry Pi as a master, and an Arduino to talk to the lights (which is necessary because the RPi's pins are too slow to send signals to an addressable LED strip). The RPi sends chords and beats in real time as they happen, over USB. Every few seconds it syncs the light show clock with the audio player clock.
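The clock-sync part of that setup can be sketched in a few lines (this is my own illustration of the idea, not the author's actual code; the class and method names are made up). The light show keeps an offset between its local clock and the audio player's reported song position, and refreshes that offset every few seconds so drift never accumulates:

```python
class LightShowClock:
    """Maps song time to local time, re-synced against the audio player."""

    def __init__(self):
        # offset = local_time - song_position, updated on each sync
        self.offset = 0.0

    def sync(self, local_time, song_position):
        """Called every few seconds with the player's current position."""
        self.offset = local_time - song_position

    def local_time_for(self, song_time):
        """When (on the local clock) an event at song_time should fire."""
        return song_time + self.offset
```

In practice the RPi would look up the next beat/chord timestamp from the chordify-derived data, wait until `local_time_for(beat_time)`, and then write the event to the Arduino over the USB serial link.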
I'm just messing with you, I don't know a quick method myself.