
Can LEDs be used to create the same brightness without the heat?



Without heat? No. With less heat? Yes.

Light is energy, and filming requires a lot of light, so there's always going to be significant visible energy available to heat things up. (Unless you're filming a mirror, but in that case why are you illuminating it?)

Incandescent lights emit a lot of invisible energy in the form of infrared radiation, while LEDs are designed to emit most of their energy in the visible spectrum. This means LEDs need less energy to produce the same visual effect, and therefore impart less heat. Another option is to slap a "hot mirror" or "cold mirror" (like the one from the dentist's office) in front of your lights that separates the visible from IR.

https://ogc-jp.com/en/wp-content/uploads/2017/08/cat12.jpg

https://www.edmundoptics.com/c/hot-cold-mirrors/989/


I wonder why they don't use hot / cold mirrors to separate the visible light from the IR. You could bounce the IR energy (upwards(?) or into a heatsink) so that the scene isn't as intensely heated.


There are in fact theater lights at least as early as the 90's with cold mirrors to direct IR to a heatsink, to extend the life of gobos and gels, and put less IR on actors.

Cold mirrors that large are expensive, and they add complexity: one more thing to break or need servicing (cleaning). So I doubt they were stocked much by rental companies or saw much use in TV/movie production.


It's hard to get the same effect with LED though due to spectral differences.


The light spectrum from LEDs is undesirable.

But otherwise yes they would be less hot.


LED lights (small, flat, square panels with LEDs) are used in filming for TV.


If you're filming at high frame rates the flickering might become apparent.


So feed them with DC current? It's not like we're talking consumer lightbulbs that need to convert AC to DC as cheaply as possible to stay competitive.


It’s possible, but I’m thinking a lot of constant-current devices will have some ripple in their output, even if smoothed. The human eye won’t see it, but a high-frame-rate camera might capture at just the wrong rate and see it.

It’s a surmountable problem, but the solution can be expensive if the problem doesn’t need to be overcome often.

(All speculation on my part)


Follow your switching regulator or rectifier with some analog filtering, then a linear regulator to convert the remaining ripple to heat. It doesn't have to be very expensive, and only loses a little efficiency. I have LED light bulbs that work like this.
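As a rough sanity check of how much the passive stage alone buys you (my numbers, not the commenter's: a 1 MHz switching frequency and a single first-order RC filter with roughly a 1 kHz corner):

```python
import math

f_sw = 1e6   # Hz, assumed switching frequency
R = 16.0     # ohms, assumed filter resistance
C = 10e-6    # farads, assumed filter capacitance -> corner near 1 kHz

f_c = 1 / (2 * math.pi * R * C)
# first-order low-pass attenuation at the switching frequency
atten_db = 20 * math.log10(1 / math.sqrt(1 + (f_sw / f_c) ** 2))

print(f"corner: {f_c:.0f} Hz, ripple at {f_sw/1e3:.0f} kHz down {abs(atten_db):.0f} dB")
```

With those assumed values the RC stage alone attenuates the switching ripple by about 60 dB before the linear regulator even sees it, which is why the chain can stay cheap.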


> then a linear regulator to convert the remaining ripple to heat.

I've become much less of a fan of the "switchilinear" approach. The control loop of a linear regulator isn't fast enough to respond to fast switching frequencies; the regulator is effectively just a resistor in a filter network at that point.

Even if your switching frequency is in the regulator's loop bandwidth, the loop's gain at that frequency is almost certainly very low.


Usually they use PWM for controlling the brightness, because LEDs change their spectrum as the drive voltage changes, and this is undesirable. That's why PWM is used in laptop and smartphone screens.


You can filter that to be as small as you'd like. Also, it can intrinsically matter much less; if you're switching at, say, 1MHz (which can be desirable for smaller inductors), you're talking about thousands of cycles of ripple in each frame.

Assuming you're using a global shutter camera: even if the ripple is completely unfiltered, the only remaining intensity effect would be that some frames get 8000 "blinks" in a frame and some get 8001.
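To put numbers on that (assuming 1 MHz switching and, as my own illustrative figure, 120 fps capture on a global-shutter camera):

```python
f_sw = 1_000_000  # Hz, switching frequency from the comment above
fps = 120         # assumed high-speed frame rate (my number)

cycles_per_frame = f_sw / fps  # ripple cycles integrated into each frame
# worst case, two frames' exposure windows differ by one whole ripple cycle
worst_case_variation = 1 / int(cycles_per_frame)

print(f"{cycles_per_frame:.0f} cycles/frame, "
      f"max frame-to-frame intensity variation ~{worst_case_variation:.4%}")
```

About 8,300 ripple cycles land in every frame, so even totally unfiltered ripple changes frame brightness by roughly 0.01 percent at worst, far below what any camera would show.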

Already, stage and film lighting is a relatively small market compared to the entire lighting market, but it's not so small that NRE can't be spread across a lot of units.


Incandescent suffers the same issues if run off mains. I think the filament itself has a bit of a filtering effect (it takes time to heat and cool), but for sensitive film work I assume you'll need power conditioning either way.


> I think the filament itself has a bit of a filtering effect (it takes time to heat and cool),

I believe the time constant for most small lamps is 30ms and longer for bigger lamps. 1 / (2 * pi * 30ms) is 5Hz, so 120 Hz will be about 30 dB down. This is probably just barely big enough to matter.
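Checking that arithmetic (treating the filament as a single-pole low-pass with the 30 ms thermal time constant assumed above):

```python
import math

tau = 0.030                    # s, assumed filament thermal time constant
f_c = 1 / (2 * math.pi * tau)  # corner frequency, ~5.3 Hz

f_ripple = 120                 # Hz, rectified 60 Hz mains ripple
atten_db = 20 * math.log10(math.sqrt(1 + (f_ripple / f_c) ** 2))

print(f"corner {f_c:.1f} Hz, 120 Hz ripple down about {atten_db:.0f} dB")
```

A single-pole model gives about 27 dB of attenuation at 120 Hz, in the same ballpark as the ~30 dB figure above.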


These ones, apparently, take fifteen minutes.

I wonder what the filament looks like.


Lights that take a long time to warm up are more about establishing an arc or gas temperature for full operation. They still respond to power fluctuations more quickly than the time it takes them to warm up.

E.g. I have both fluorescent and HID lamps with this characteristic; they take a couple minutes to reach full brightness, but still will dim when a big load kicks on and turn off immediately when I flip the switch.


I feel like the film industry is big enough you should be able to engineer and market a special, non-flickering, tuned spectrum LED light that produces 80-90% less heat, and as long as the TCO isn't like an order of magnitude higher, you'd have a gangbuster success.


High CRI, tunable color balance and temperature LED lights are very widespread in tv/movie/commercial shoots and have been for years...


Do you happen to know if the lights everyone tells stories about -- both this news article, and other anecdotes -- are already LEDs, and still absurdly hot, because more light is better?


The LEDs only reach up to 800W in the case of the SkyPanelX, while regular halogen lamps are still needed for more focused, higher brightness lights (which go up to 18kW per lamp(!))


Yeah, even with the 4:1 lumen:watt advantage for LED bulbs, that's still a nearly 6x brightness advantage for halogen.

Is the limiting factor the technology to make a 4500W LED bulb in the same form-factor/focus as the 18kW halogen, or does it just become prohibitively expensive to do so?
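Running those numbers (taking the 4:1 efficacy figure and the 800 W / 18 kW wattages from upthread at face value):

```python
led_watts = 800              # SkyPanel X figure from the thread
halogen_watts = 18_000       # large halogen head figure from the thread
efficacy_advantage = 4       # assumed 4:1 lumen-per-watt advantage for LED

# 800 W of LED puts out about as much light as 3200 W of halogen
halogen_equivalent = led_watts * efficacy_advantage
brightness_ratio = halogen_watts / halogen_equivalent

print(f"the 18 kW halogen head is still ~{brightness_ratio:.2f}x brighter")
```

That works out to 5.625x, the "nearly 6x" figure above.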


LEDs produce 80% less heat, but they also fail at much lower temperatures, which is the major limiting factor for LED brightness today: the cooling solutions get larger and more esoteric the more power you concentrate in an area.


LED lights can be made small, flat, and portable, which is good for TV (e.g. for news reporting).


Flicker isn't inherent to LED lights; rather, it's inherent to making them particularly cheaply. Even just good consumer LED lights will have an extremely constant output.


I don't think it's that, I think the primary reason is color:

https://en.wikipedia.org/wiki/Color_rendering_index#Film_and...




