Return of incandescent light bulbs as MIT makes them more efficient than LEDs (telegraph.co.uk)
472 points by randomname2 on May 25, 2016 | 267 comments

This news article doesn't even mention the paper's title, the authors, or when it was published.

It was actually published in January, and has just now been picked up by the Telegraph.

Here is the citation:

    Ognjen Ilic, Peter Bermel, Gang Chen, John D. Joannopoulos, Ivan
    Celanovic, Marin Soljačić. Tailoring high-temperature radiation and
    the resurrection of the incandescent source. Nature Nanotechnology,
    2016; DOI: 10.1038/nnano.2015.309
And here is the actual paper, via Sci-Hub: http://sci-hub.cc/10.1038/nnano.2015.309

The Telegraph's explanation is terrible: "with a special crystal structure in the glass they can bounce back the energy which is usually lost in heat, while still allowing the light through."

The tungsten filament is sandwiched between two plates made up of layers of oxides, designed to selectively reflect infrared radiation and transmit visible light.

They used a numerical model to design and evaluate various candidate compositions for these plates. For a proof-of-concept, they chose one which uses layers of silicon oxide and tantalum oxide, with 90 layers in total per plate. This reflected about 90% of infrared radiation, producing a luminous efficiency of about 6.6%. This is comparable to commercial LEDs and compact fluorescents, though far from state-of-the-art.

However, the results closely matched their numerical model, and a more complex structure comprising layers of silicon dioxide, aluminium oxide, tantalum oxide and titanium dioxide, with 300 layers in total, should produce a luminous efficiency of 40%. This is significantly better than the state-of-the-art in LEDs (about 15-30%). They did not, however, actually build that one.
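To get a feel for how such a layered mirror works, here is a rough sketch using the standard transfer-matrix (characteristic-matrix) method for thin films at normal incidence. This is not the paper's design code; the refractive indices (SiO2 ≈ 1.45, Ta2O5 ≈ 2.1), the design wavelength, and the neglect of dispersion, absorption, and oblique angles are all simplifying assumptions.

```python
import numpy as np

def stack_reflectance(layer_ns, layer_ds, lam, n_in=1.0, n_out=1.0):
    """Normal-incidence reflectance of a thin-film stack via the
    standard characteristic-matrix (transfer-matrix) method."""
    M = np.eye(2, dtype=complex)
    for n, d in zip(layer_ns, layer_ds):
        delta = 2 * np.pi * n * d / lam              # phase thickness of layer
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, n_out])
    r = (n_in * B - C) / (n_in * B + C)              # amplitude reflectance
    return abs(r) ** 2

# Assumed indices (dispersion/absorption ignored): SiO2 ~1.45, Ta2O5 ~2.1
n_L, n_H = 1.45, 2.1
lam0 = 1500e-9                # assumed IR design wavelength, metres
pairs = 45                    # 90 layers per plate, as in the proof-of-concept
layer_ns = [n_H, n_L] * pairs
layer_ds = [lam0 / (4 * n) for n in layer_ns]        # quarter-wave thicknesses

print(stack_reflectance(layer_ns, layer_ds, lam0))   # ~1.0: strong IR mirror
print(stack_reflectance(layer_ns, layer_ds, 550e-9)) # green light, outside the stopband
```

The point of the sketch is just the qualitative behavior: at the IR design wavelength the quarter-wave stack reflects almost perfectly, while visible wavelengths outside the stopband mostly pass through. The real design adds more materials and layers to widen the reflected IR band.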

I have no idea how expensive this would be to commercialise. It doesn't sound like the physics is particularly complex, but manufacturing costs could be prohibitive. I think it's safe to say we won't be seeing it outside the laboratory any time soon.

"The Telegraph's explanation is terrible: "with a special crystal structure in the glass they can bounce back the energy which is usually lost in heat, while still allowing the light through."

The tungsten filament is sandwiched between two plates made up of layers of oxides, designed to selectively reflect infrared radiation and transmit visible light."

What's terrible about that? It's the same as what you said, using words that could be understood by a child.

I was thinking the same thing. When I read the Telegraph explanation and was told it was terrible, I thought "damn those reporters!".

Then I read the actual scientific explanation and thought the Telegraph summary was pretty good.

Ask the child to draw it, you'll notice the problem. One description makes me think that the glass has some new treatment, the other makes me think that there is a new structure within the bulb.

Splitting hairs. The point, as succinctly as possible, is that:

Incandescent bulbs are inefficient because they lose a lot of energy as heat instead of light.

This counters that by introducing a crystal that selectively blocks the heat and lets the light through.

Maybe the explanation is wrong in the minutiae but it gets the fundamental point across.

I don't understand how that's supposed to make the bulb more efficient. If you're trying to keep the interior of the bulb hot, mission accomplished. But the problem we're trying to solve is that when we run a current through the filament, some of the energy we supply becomes visible light and some becomes infrared light. We want more of it to become visible light and less of it to become infrared light. How does preventing infrared light from leaving the bulb help with that?

The reflected IR light will be absorbed by the filament, therefore requiring less energy over time to keep the filament at temperature. That thermal energy, while lower than the input energy, maintains a baseline temperature, so now you only need some of the new energy to output more photons, instead of all of it.
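The steady-state power balance behind this argument can be sketched numerically. This is a toy model: the filament temperature, emitting area, emissivity, and visible fraction below are made-up ballpark numbers, and conduction/convection losses are ignored entirely.

```python
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def input_power(T, area, v, R, emissivity=0.4):
    """Electrical power needed to hold the filament at temperature T when a
    fraction R of the IR radiation is reflected back and reabsorbed.
    Toy model: conduction and convection losses are ignored."""
    P_rad = emissivity * SIGMA * area * T ** 4   # total radiated power
    # Net loss = visible light (always escapes) + the IR that is not recycled
    return v * P_rad + (1 - R) * (1 - v) * P_rad

# Assumed ballpark numbers: 3000 K filament, 1 cm^2 emitting area,
# 0.7% of the radiation in the visible band, emissivity 0.4
for R in (0.0, 0.9, 0.99):
    print(R, round(input_power(3000, 1e-4, 0.007, R), 1))
```

The visible output (v × P_rad, about 1.3 W with these numbers) is the same in every row; only the electrical input needed to sustain the temperature drops as more IR is recycled, which is the whole trick.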

Bingo, this is how incandescence works: heating any solid material above a certain temperature will make it glow; all we need to worry about is finding a material that can withstand the temperature and making it into an efficient package.

Gas mantles work pretty much the same way. A cage made of ceramic metal oxides (formed by firing a cotton bag dipped in thorium or yttrium nitrate; if thorium is used, every use will release a few microrems of radioactive radon gas) is heated by a gas or oil fire to incandescence; because the temperature is much higher than in a light bulb, gas lamps actually waste less heat as infrared.

Here’s my take on this (corrections welcome): the thing you need to get light out is a hot filament. You use electricity to a) make it hot and b) keep it hot. Both are easier the better the filament is thermally isolated.

Caveat: if that argument is correct, I would think a bulb with triple glazing would help, too. Has that been tried?

The thermal mirror needs to selectively reflect infrared while being transparent to visible light, and it needs to bounce radiation back and forth, which requires a certain geometry. The novelty of this paper is creating a big flat filament sandwiched between high-temperature plates that reflect infrared and can be deposited as layered coatings.

> Caveat: if that argument is correct, I would think a bulb with triple glazing would help, too. Has that been tried?

The triple glazing is supposed to keep the infrared in (radiation losses), or supposed to keep losses from convection and conduction down?

You're trying to keep the filament very hot while spending the least amount of power. Apparently, reflecting the thermal infrared back at the filament lets you spend less power while keeping the filament equally hot and equally luminous in the visible band.

> when we run a current through the filament ... becomes ... light

Not exactly. It's actually "when the filament is hot it emits light". It has nothing to do with the current (except that the current heats it).

So it doesn't matter how you heat the filament - in this case you also heat the filament by reflecting infrared energy back at it.

In contrast, a fluorescent light does actually use electricity to make light - heating it would do nothing.

The IR is only a loss if it escapes. Their design tries to reflect it back at the filament to reuse it. A hot filament encased in a perfect window that reflects everything but visible light is highly efficient, simply because there's nowhere for waste heat to go.

Check out the wikipedia article on "luminous efficacy". Hypothetical black-body sources truncated to ≥2% photopic sensitivity range are excellent, and that's what this group is trying to approximate.

Think of it like a blanket for the filament. The IR radiation helps keep the filament at the appropriate temperature, so the amount of ohmic heating from electricity can be less.

Or greenhouse effect ;)

I also found the Telegraph's explanation lacking. The explanation they give makes me think that the interior of the glass bulb has some coating or special composition.

This impression is quite distinct from the reality of the filament between plates with alternating layers of SiO2, TiO2, Al2O3 or Ta2O5.

The Telegraph could have at least said that the advance involved sandwiching the filament between layers of transparent material that transmit visible light but reflect IR. Not to mention conflating crystal structure and glass. (Glasses can be distinguished from crystals by a lack of repeating long-range structure.)

The manufacturing costs won't be too bad, I'd imagine. At least, in terms of the normal shocking costs of manufacturing. If the design is simple enough, those 300 layers may be just 75 repetitions of four layers (no doubt not actually the case, but the machine spraying material onto the plates can adapt).

I'm a bit disappointed that the recipe wasn't included; I would have liked to see what the design entailed. All the layers are dielectrics, and they're all oxides, so the process chambers won't be too hard to control. And I say that knowing full well some poor coater engineer is actually just going to have a devil of a time with it (or not! quarter-wave filters are fairly standard stuff).

Most of all, the thing I'd be interested in is how resistant to error the design is. It sure looks like it doesn't need to be too perfect per layer. For a pair of microscope lenses, precision is everything. In a lightbulb you can probably get away with a lot of error (per layer; as a whole you could easily make the color something goofy). Of course, if its color swings wildly with minor errors in certain layers, well, that could hold the product up in R&D hell.

> Most of all, the thing I'd be interested in is how resistant to error the design is.

I feel like this is what keeps the most impressive research out of real world application. Yeah, manufacturing is expensive, but tolerances are just so much more controllable in a lab.

Manufacturing also produces revenue, whereas funding the years of research necessary just to figure out a new nanometer-precise process is a hard investment for most companies to swallow, especially if the end product isn't guaranteed to produce a profit itself.

The good news is that the processes I described above are already well established. The trick is always in the dirty, nasty details, but in this case I'd be bullishly optimistic. I haven't personally worked on a large batch coater, so I can't attest to how hard it is to tune the things in, but your typical coater engineer is a bit eccentric anyhow and will no doubt make it work.

See, the weird bit of how these coatings work is that it's fairly easy to empirically tune the damn things to be angstrom accurate (-ish), but the geometry of how the material is sprayed messes up the uniformity, which in turn ruins your yields (one square centimeter in the center is good, chuck the rest!). But if the coating can withstand some variation between layers, tuning it such that the layers sorta cancel out in the bulk, you'll lose the efficiency, but your yield can absolutely skyrocket. There will likely be key layers at the optical half, quarter, eighth, etc. thicknesses, but if you get enough layers you can probably get away with a lot. You can also just build really neat source geometries to sorta flatten out the spray over a larger area, but that's something I never worked on.

My ignorance of this sort of coating has me worried about it immediately destroying itself from the insane material stresses... Though I have no idea how well it'll anneal after 1000 hours of constant heat (or worse: on-off heating-cooling cycles!). But since they're all oxides and likely put down together while the substrate stays hot, I bet it works out ok.

> Yeah, manufacturing is expensive, but tolerances are just so much more controllable in a lab.

Not least because in a lab you are happy if one out of ten prototypes works for your experiment. You don't want 90% spoilage in manufacturing.

Those layers of oxides form "quarter wave stacks" (from the paper), which means each layer is somewhere between 200 and 1000 nanometers thick (depending on the wavelength of light they want to transmit or reflect).

So if one of those layers is the wrong thickness (say, 200 nanometers when it should be thicker), you could end up reflecting the visible and transmitting the infrared, which is the opposite of what you want to do.
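A quarter-wave layer has physical thickness t = λ/(4n). A quick sketch of the arithmetic, using approximate indices (assumptions, not the paper's values) and ignoring dispersion:

```python
def quarter_wave(lam_nm, n):
    """Physical thickness of a quarter-wave layer: t = lambda / (4 n)."""
    return lam_nm / (4 * n)

# Approximate refractive indices; dispersion ignored
print(quarter_wave(1500, 1.45))  # SiO2 layer tuned to 1500 nm IR: ~259 nm
print(quarter_wave(1500, 2.10))  # Ta2O5 layer, same wavelength: ~179 nm
print(quarter_wave(550, 1.45))   # SiO2 tuned to green light instead: ~95 nm
```

So the layer thicknesses that reflect IR and those that would reflect visible light differ by only a couple hundred nanometers, which is why a systematic thickness error can shift which band gets reflected.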

True, but when you have an awful lot of layers, you can play some silly buggers with it. Even if you botch a layer, it won't necessarily ruin the whole thing, since the interference between the sets of layers also acts as a filter. I haven't worked on these sorts of stacks, personally, but I'd be surprised if the whole stack dies because one layer was a bit screwed up (I'd expect a degradation in the efficiency, but not necessarily a total reversal - now ask me what happens if your goal is to tune to a neutral color and you make a slight error, and then I'll flip sides and say "rainbow").

What I'd really expect is that if you make one layer slightly larger, your process is almost certainly making all the repetitions of that layer larger too, and now, yeah, your bulk properties are just hosed.

> It was actually published in January, and has just now been picked up by the Telegraph.

The Telegraph article is dated 11 January 2016

It might be backdated, then? Or perhaps just slow to publish online?

The url implies March 12.

An additional and important application discussed in the paper for the (relatively) high efficiency infrared reflectors is for thermophotovoltaic systems, which many think will be generating a lot of our electricity in 50 years.

There's nothing state-of-the-art about 15-30% efficient LEDs; maybe a few years ago, but not in 2016. Blue LEDs are almost at 59%, and even the whites are at about 50%.

I'm confused. If reflecting 90% of infrared radiation only produces a luminous efficiency of 6.6%, how can any further increase in reflection lead to 40% efficiency?

Obvious but maybe wrong answer: because it's not linear, if you trap nearly all the IR it stays until converted to visible, otherwise it still escapes after a couple of bounces.

If you reflected all the infrared light, the efficiency would be 100%, since all the light leaving the bulb would be visible. Normally, the vast majority of the light leaving the bulb is infrared, but if you keep 90% of it in the bulb, then 6.6% of the light leaving the bulb is visible light.
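The nonlinearity is easy to see in a toy model: if a fraction v of the emitted radiation is visible and a fraction R of the IR is reflected back and reabsorbed, the fraction of escaping light that is visible is v / (v + (1 − R)(1 − v)). The visible fraction below (~0.7%) is an assumed ballpark for bare tungsten, chosen only for illustration, but with it R = 90% lands near the paper's 6.6% and R = 99% near 40%:

```python
def efficiency(v, R):
    """Toy luminous efficiency: fraction of *escaping* radiation that is
    visible, when a fraction R of the IR is reflected back and reabsorbed."""
    return v / (v + (1 - R) * (1 - v))

v = 0.007   # assumed visible fraction of bare tungsten emission, ~0.7%
for R in (0.0, 0.9, 0.99, 1.0):
    print(R, round(efficiency(v, R), 3))
# R=0.9 gives ~0.066 (6.6%); R=0.99 gives ~0.41. A small change in
# reflectance near 100% makes a huge difference in efficiency.
```

Going from 90% to 99% reflectance cuts the escaping IR by a factor of ten, so the visible share of what leaves the bulb jumps from ~7% to ~40%; the relationship is anything but linear.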

Can this method of internal reflection be used to enhance the efficiency of LEDs?

Why can't we apply the same technique to fluorescent filaments?

Fluorescent lights don't use filaments, they use electrically excited gasses.

The reason this technique improves the efficiency of incandescent lights is that they produce their light by heating the filament. The hotter the filament, the more light it produces.

So by reflecting the heat back towards the filament, you can achieve a hotter filament with less input energy.

Fluorescent lights do use filaments. At each end of the tube there's a hot cathode - essentially a small incandescent filament designed to emit electrons rather than light. This is the source of the electrons that excite the gasses.

While there are some designs that use cold cathodes these are not the common type as they are less efficient.

That still won't make this filament treatment help, though - sandwiching it with layers would probably block the electrons.

So it sounds like this new bulb will have the same problem people complain about with LEDs and CFLs: slow warm-up times. The filament will take longer to warm up to full operating temperature, since you're relying on thermal reflection in the glass to let the heat build up. I guess they could get around this by having a power control circuit in the bulb, to give it a large inrush current to heat up quickly, and then back off once it's warmed up.

Or around candles, while we're at it...

Upvote for your nice summary, almost went the other way for the undeserved criticism of the article.

Thank you, this was so much more informative than the article.

Is that the same oxide as used in Tantalum capacitors? Those things have a tendency to explode randomly if inrush current isn't controlled appropriately.

They're also expensive and somewhat of a conflict material, from what I understand from the little reading I've done on them (but they're awesome for power circuits).

People complaining about LEDs being too "bright" or "clinical/sterile" couldn't be bothered to look into the different color temperature options. The annoyingly bright bulbs are around 5000K color temperature, which would make a home at night feel like a fully-lit classroom. I have 2700K bulbs which give off a much cozier incandescent glow: http://blog.batteriesplus.com/2013/seeing-things-in-a-differ...

Sure, the bulbs take about 1/5 of a second to turn on, which is not instant, but the cheaper LEDs can take up to 2 seconds to reach full brightness. You have to buy a better made bulb to get the quicker "on", which means spending more money. People only trying the cheapest LEDs available are the ones complaining the most about them, I'd assume.

Color temperature isn't the only thing, CRI is important too (though less so). It's hard to find CRI specs on most bulbs, nobody mentions it unless they have a really good number to brag about (90+).

One thing never mentioned is that LED color temperature will vary over time. It's a two-part system with complementary colors, the LED itself is blue and it's coated or surrounded with a yellow phosphor. The phosphor will dim eventually.

I've never seen a LED bulb that was so slow to light up that I noticed it, unlike some fluorescents that are really atrocious.

I've already given up on fluorescents, if only for the hassle of recycling them due to the mercury content.

Normally when people complain of led bulbs being slow to turn on, it's actually the dimmer at fault. Most dimmer switches have only two wires, and so charge their caps through the bulb. Since LEDs draw far less current, this takes longer, and so the bulb takes longer to appear on.

I have some stairways I must light up and they need to light up instantly for safety reasons. I make sure one of the bulbs in the fixture is incandescent, and CFL/LED the rest.

You can also not use a dimmer, or use a (quality) dimmer intended for LED bulbs. (Many dimmers supposedly for LEDs will still not turn on instantly.)

I have no dimmers at all in my house.

Bizarre that the LEDs would have a delay then, or that including an incandescent would make any difference in that delay. (Or are you saying it doesn't make a difference in how quickly the leds turn on, but it itself turns on faster? I guess if they're cheap ones with poorly tuned internal rectifiers that could be the case.)

I need some light immediately, and the incandescent gives me some light immediately.

My Cree Type A LED replacement bulbs are over 90 CRI.

Incidentally, is CRI a totally bullshit measurement, or does it mean something? I find it oddly coincidental that an incandescent supposedly has a CRI of 100 and matches daylight. How is this possible for such an old tech? Did they just get really lucky with the first lightbulb? Or have the gases and filament material improved over time to bring CRI up to 100? I have a hard time believing CRI was the ultimate goal of incandescent development, but I could be wrong.

CRI measures how close (using a terrible metric) oranger lamps get to a black body radiator, or how close bluer lamps get to an approximation for daylight from the 60s, which each have a CRI of 100 by definition. Incandescent bulbs are a black body radiator, so unless you put some kind of filter in front, they’ll have a CRI of 100. It’s not a 0–100 scale though. 100 is an arbitrarily selected number.

There are reasons to be dissatisfied with the light produced by LEDs and fluorescent lamps, but CRI is only very marginally useful.

Better is to look at the spectral power distribution, and try to find lamps with a broad emission spectrum and no sharp spikes. For nighttime lighting, try to avoid sources with much emission at wavelengths below 500 nm, because these knock out night vision and disrupt sleep cycles.

CRI measures how natural objects' colors appear under an artificial light source compared with the same objects' colors under a natural light source (standardized daylight).

The color temperature and luminosity of natural light sources vary tremendously during the course of the day or in different parts of the world, yet objects' colors remain consistent. This is a property of human vision called chromatic adaptation. https://en.wikipedia.org/wiki/Chromatic_adaptation

Sources which emit a continuous spectrum have better CRI than tri-color sources, especially on artificially colored objects (dyed/pigmented) due to metamerism. https://en.wikipedia.org/wiki/Metamerism_(color)

If you want to get technical, the algorithm for computing CRI is:

- take a specific set of arbitrary paint chips, and record their colors when illuminated by the target lamp (using the CIE 1931 2° standard observer, and the 1960s CIEUVW color space)

- perform an outdated and not very effective type of chromatic adaptation transformation so that the white point matches either a black body radiator (like an incandescent bulb) for lamps of correlated color temperature of <5000K, or a point on the “Illuminant D” series of approximations to daylight for lamps of CCT >5000K

- measure the distances in UVW space between those target colors and the colors of the paint chips as illuminated by the reference black body radiator or D series illuminant.

- sum up all the color distances, multiply by an arbitrary factor, and subtract from 100.
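The final scoring step is the simple part; the hard part is the chromatic-adaptation pipeline that produces the per-sample distances. The sketch below just takes those ΔE values as given (the "arbitrary factor" in the last step is 4.6 in the standard general-CRI formula; the sample distances here are made up for illustration):

```python
def cri_ra(delta_es):
    """General CRI (Ra): 100 minus 4.6 times the mean UVW color
    difference of the standard test samples (8 for the general index)."""
    return 100 - 4.6 * sum(delta_es) / len(delta_es)

print(cri_ra([0.0] * 8))            # a pure black-body source: Ra = 100 by definition
# Hypothetical per-sample distances for some lamp (made-up numbers):
print(round(cri_ra([4.0] * 8), 1))  # 81.6
```

This also makes the "not a 0-100 scale" point above concrete: large enough color differences push Ra negative, which is why some low-pressure sodium lamps have negative CRI.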

Everything about this process was totally ad hoc and arbitrary 50 years ago when it was first invented, and is absurdly outdated today.

There was an attempt at the CIE to replace the CRI with something better in IIRC the late 1990s, but the lighting industry didn’t want to follow the recommendations of the color scientists, so it fell apart. (I could be misrepresenting what happened; I’m not an expert in CIE politics or the history here, and haven’t ever researched it in detail.)

> For nighttime lighting, try to avoid sources with much emission at wavelengths below 500 nm, because these knock out night vision and disrupt sleep cycles.

Moonlight has significant energy at wavelengths below 500 nm [1]. Does this light disrupt the sleep cycles of other primates, or is this something that developed in humans only after we invented "indoors" and started sleeping there away from sources of bluish light?

[1] http://www.olino.org/us/articles/2015/10/05/spectrum-of-moon...

Yes, if you stare directly at the full moon, it will be harder to see into the shadows for a while afterward until your eyes adapt back to the dark, and you can probably (I’m not sure this has been scientifically tested) push back when you start feeling sleepy.

In general, the moon is high overhead, so it won’t be directly in your field of view while it lights up your surroundings.

I would suppose that the difference would be having these <500 nm wavelength light sources pretty much right up in our faces.

> try to avoid sources with much emission at wavelengths below 500 nm, because these knock out night vision and disrupt sleep cycles.

Citation please. What does it mean "below 500 nm"? 400 nm or 600 nm? 400 nm is violet, 600 nm orange. Which of those "knocks out" what?

Light, especially from bright sources of glare, in the short-wavelength part of the spectrum saturates the "rod" light detectors in the eye, causing vision to become bright-adapted. To adapt back to fully capable night vision takes something like 20–30 minutes.

Additionally, beyond rods and cones, there is a third set of light detectors in the eye, the "photosensitive retinal ganglion cells", which regulate melatonin and sleep cycles. These are particularly sensitive to light in the 400–500 nm range. Looking at such a light source at nighttime can suppress melatonin production until about an hour after you stop the light. This is the reason that smartphone, television, and computer displays are all so disruptive to sleep when used before bed.

> matches daylight. How is this possible for such an old tech?

Kind of makes sense when you realize that the sun and an incandescent bulb are both generating light as a small slice of their overall thermal radiation envelope. It's just heat, made visible in both cases. And like sibling commenter said, the CRI was defined based on the incandescent.

Though the filament would have to be at about the temperature of the sun's visible surface (with some adjustment for atmospheric absorption and scattering) to have the same spectrum. That is about 5700K, and tungsten melts at 3695K. As was mentioned above, the scattering changes quite a lot through the day.

True, though I guess I'm explaining more the continuous nature of the spectrum. LEDs are very notchy or peaky, while light produced from glowing tungsten is a smooth output across the visible spectrum that it covers. The fact that bluer light is scattered by the atmosphere also helps bring sunlight closer to incandescent light in terms of color temperature though, right?

Incandescents work more similarly to the sun and are basically a natural full-spectrum light source. Fluorescents and LEDs are not, so they need to be tweaked/hacked/refined to produce an ideal spectrum. Fluorescents are known to use mercury to help with this, making them bad for the environment. And though the prices are coming down, high-CRI LEDs have been a trailing technology, with high-CRI 5000K bulbs becoming affordable/viable only recently.

Yep, the spectrum of light emitted is what matters. A histogram or spectrum intensity curve printed on the cardboard package would be ideal. Sadly, I've never actually seen that in consumer grade stuff.

I've found that most LED bulbs you can find in supermarkets are really quite bad in this respect.

For my home office, I got something called "True-Light" E27 bulbs. The specs are 12W, 5500K, 920lm, CRI 94. They emit a passable bright, white light.

My favourite light sources are still E11 halogens. Just pepper the ceiling with them, add dimmers, and hope it doesn't heat up the room too much…

I’m one of those goofball hipsters who is in love with those “Edison filament” bulbs. In the past year, passable LED facsimiles have been turning up everywhere. No idea what the color temperature is, but if you like that cozy, dim feel…


These look cool as hell, thanks for the link!

Hey, those look really nice. I may turn to the Victorian style ones for my lighting needs. The site advertises the colour temperature as 2200K to 2700K depending on the model.

I like these bulbs but not too close to the face. A friend has some over his kitchen table and I don't like the heat in the face.

The LED ones are hardly even warm to the touch (there are LEDs embedded in the "filaments").

Yep. I've replaced a lot of the bulbs in my house with LEDs. The key is to buy a number of them, find ones that you like, and then put the ones you don't like in places where the color doesn't matter as much (garage, storage areas, etc).

For me, a lot of the Philips LED Soft White (2700k) bulbs worked.

Agree. While tastes differ, to me the Philips models (the ones that look like an Erlenmeyer flask) give the most pleasant light by far. They also seem to be much better constructed than the knock-off cheapies you find in bulk packs at Lowes or Home Depot.

I've found that that Walmart "Great Value" brand 8.5 W, 800 lumen, 2700K, 20000 hour bulbs are nearly indistinguishable from the 60 W incandescent bulbs I had before, except that these LEDs are not dimmable.

They are nominally about $2 each, although they frequently drop the price to an astounding $0.17 each. I'm not sure if these price drops are due to some sort of automatic at-the-register rebate from the local power company, or just an internal Walmart effort to promote efficient lighting.

The same company that manufactures these for Walmart is called TCP, and they sell the bulbs on Amazon as well. I've bought about 30 of them.

Pretty happy with them so far. The only problem I have had (as with all brands so far) is led bulbs seem to interfere with my garage door receiver signal (if placed right next to it).

I've found the Philips 424382 (60w replacement, 2700K soft white) to be very nice. I haven't tried any of the replacement SKUs, though.

Philips makes lots of different bulbs, even with the same specs. I suspect they have very little in common, other than the name on the box. This is a time of very high churn in the industry.

I wanted to like 2700K, but I find it to be extremely yellow. I remember, even in tungsten times, I found lights to be annoyingly yellow. It's even more apparent when working on an LED screen, then turning around to see your devastatingly yellow lights. Then again, I find 5000K lights too blue, and I have found that 3500K-4000K is the most pleasant temperature, but the industry has settled on yellow and blue lights. It's a shame.

For me the LED bulbs Costco sells worked the best. They're also coincidentally quite inexpensive.

The Costco Feit-brand 2700K bulbs are quite nice and decently priced. My overhead CREE cans are also great.

I've tried two brands so far (GE and Hyperikon) for LED bulbs at 2700K and 3000K, and they all were incredibly white and nothing like my incandescent bulbs. On top of that, even the "dimmable" LED bulbs suck. I want to have LED all over my house, but there's honestly almost an "uncanny valley" feel to the LED bulbs I've tried, and so far I'm sticking with my incredibly inefficient bulbs.

I've tried plenty of LED bulbs and so far they've all been rubbish. The light of 2700K bulbs is definitely whiter than incandescent bulbs. Stuff in the room is not the right color. In addition, the bulbs seem to "flicker" and/or make strange noises when dimmed. Good thing some shops in smaller towns still sell incandescent bulbs.

That can happen if 1) you have an older dimmer not rated for LEDs, or 2) the dimmer is mis-wired; some are very specific about where you hook up the hot and load wires, especially if you have a 3-way circuit for the light.

Thanks for the feedback. My dimmer is probably older than I am. I'll look into getting a newer one.

> The light of 2700K bulbs is definitely whiter than incandescent bulbs.

You must be looking at some mislabeled, or carelessly packaged stuff.

Also, the actual temperature of incandescent is going to vary with the line voltage. A swing down to 110V will produce a more yellow light than 120V.
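A commonly quoted industry rule of thumb (an approximation, not exact physics) is that an incandescent's correlated color temperature scales roughly as V^0.42:

```python
def cct_at_voltage(cct_rated, volts, v_rated=120.0):
    """Rule-of-thumb scaling: incandescent CCT ~ V ** 0.42.
    The 0.42 exponent is an industry approximation, not exact physics."""
    return cct_rated * (volts / v_rated) ** 0.42

print(round(cct_at_voltage(2700, 110)))  # ~2603 K: visibly yellower at 110 V
```

So a sag from 120 V to 110 V drops a 2700K bulb by roughly 100 K, enough to notice side by side.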

> Stuff in the room is not the right color.

Stuff in the room is the right color when you take it outside on a sunny day.

Compare photos taken outside with no-flash indoor ones.

Thanks, that might explain why I find the 2700K LEDs too "white". I found some 2100K LED bulbs online, I might give them a try.

Being "whiter" won't be an issue. I meant there's some awkward green/purple nuances everywhere.

The Cree TW series are the best I've found. They were the only bulbs to meet California's voluntary requirements for LED lights when I bought them. The requirements cover color rendering, temperature, dimmability, and other important things.


I have Cree recessed modules in my kitchen, and they're fine when not dimmed. They don't change color temperature when they dim, which incandescents do.

I've recently seen adverts for LED bulbs that (intentionally) lower the color temperature when they're dimmed.

Here is a 2013 article that goes into the technicalities a bit:


I tested ~10 different bulbs for my kitchen lighting about 5 years ago. I don't know how much the situation has changed since then, but at that time I had a few key requirements - high output (150W equivalent), warmish (but not super-warm for the kitchen) color temperature, fast turn-on speed, and dimmable.

Much to my surprise, the (almost) cheapest bulbs (some Utilitech model that doesn't seem to exist anymore) ended up being the winners. Instant on (no perceptible delay, compared to all others I tried), great color temp, no flickering, wide range of dimmability. I tried many different brands, including Philips at 3x the price.

It's just a single anecdote, but I cared a lot about the performance of the bulbs, and I was surprised to find the cheap ones actually worked best. I think there's a lot of variability out there and you can't rely on cost and you definitely can't rely on brand name (Philips was the worst).

Lately, I've found Ikea's LED bulbs are good and quite cheap (with imperceptible delay on turning on).

As another anecdote, I have several LED bulbs from Philips (LPrize-PRO and Hue), GE (GE Link) and Ikea.

All of them are bright, work fine, and start instantly, but the Ikea drivers have a constant buzz. I've relegated those to bathroom fixtures that are only used briefly so that the sound won't drive me crazy.

Interesting, my IKEA ones from a few months ago don't seem to have this issue. Could be an AC frequency difference or just a different model of bulb?

Indeed. I recently replaced all the incandescent cans in my house with LED cans and was very certain to specify a color temperature. I've also got "LED filament" bulbs in one room at 2200K (designed to replace those clear 'vintage' bulbs), which give a really cozy feel, if a little too yellow. Yet for my garage, I love the high color temperature bulbs.

3000K is also very nice. White objects look more white than under 2700K, but the light is still warm; not a huge difference from 2700.

I've done countless tests with different temperatures and I've found that 3500K is the perfect one for my house, but it's really hard to find bulbs of that color temperature.

3500K is common in CFLs, but then you have to be willing to invest in CFLs.

My biggest issue with them is size limitations. Lots of places they won't go.

I've used some of the better tested LED bulbs (per http://www.ledbenchmark.com/ 2700K, 80+ CRI, "nice" looking spectrum, "flicker free"). They are very good, and have improved over bulbs from years back. Definitely better than any CFLs I've used. Still a bit less vibrant and warm compared to incandescent. I personally don't need huge amounts of light outside of bathroom/garage, and find most of the LED bulbs too bright if swapped out 1:1 per the equivalent wattage labeling in my existing fixtures.

I've tried multiple different LEDs rated at 2700K to replace the GU10 halogen lights in my kitchen - literally every single option I've tried is too "white". Nothing gives off that nice warm colour of a halogen; even expensive Philips LEDs can't match the right colour warmth.

Aren't halogens (as a subclass of incandescents) expensive, and run hot/high wattage?

They use considerably more energy, since each one uses 50W; times 8 halogens in the kitchen = 400W just for the lights, vs. 40W if they were LEDs. But then again, the energy saving isn't worth anything if the light annoys me. And they are not expensive: a halogen GU10 costs about $1 each.
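A quick sketch of that trade-off (the wattages come from the comment above; the 3 hours/day usage and $0.15/kWh rate are my own assumptions for illustration):

```python
# Hypothetical annual-cost comparison for the kitchen described above.
# Wattages are from the comment; hours/day and the electricity rate
# are assumptions for illustration.
halogen_w = 8 * 50   # eight 50W GU10 halogens
led_w = 8 * 5        # roughly 40W total for equivalent LED output

HOURS_PER_DAY = 3    # assumed usage
RATE = 0.15          # assumed $/kWh

def annual_cost(watts):
    """Yearly electricity cost for a load run HOURS_PER_DAY every day."""
    return watts / 1000 * HOURS_PER_DAY * 365 * RATE

print(f"halogen: ${annual_cost(halogen_w):.2f}/yr, LED: ${annual_cost(led_w):.2f}/yr")
```

At those assumptions the gap is around $60/year: real money, but, as the commenter says, not worth it if the light is annoying.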

Halogen bulbs (as a sub-set of incandescent bulbs) are actually more efficient than normal incandescents, because they operate at a slightly higher temperature, and the glass usually stays clear for longer. This is possible because the halogen inside the bulb prevents evaporated tungsten from condensing on the glass. This also means that an appreciable amount of ultra-violet light is produced, so most halogen lamps will have a UV filter.

The reason people need more power going into halogen lighting is usually because of all the other arrangements around the bulb that reflect the light into the wrong place. A bare halogen bulb (without a reflector) will light a room better than the same amount of power going into a bare normal incandescent bulb, but it will be a harsher bluer light.

The colour issue can be true for CFLs too, when I was living in Asia last year I had a really hard time finding warm-white bulbs. It seems like they prefer cold-white bulbs there (you even can see the difference when flying over Asia vs Europe). It's great for a kitchen where you need to see clearly what you are doing, but not so much when you want a cosy living room.

I assume most LED bulb factories are in China at the moment, so they are just targeting their own preferences for colour.

> People complaining about LEDs being too "bright" or "clinical/sterile" were too bothered to look into the different color temperature options

And where, when I'm buying these lights at the supermarket, does the information about other color temperature options present itself? I've bought lighting for camera work, but I've never seen anything about color temperature at a local store.

In Australia at least, pretty much every single different bulb has its colour temperature printed on the box, as well as usually having a little bar chart showing where in the visible spectrum it sits.

They all have descriptions like "Cool White", "White" and "Warm White" too.

It works pretty well - I haven't been disappointed in LED globes I've bought, whereas I bought a few CFLs back in the day that I couldn't stand.

I take it "cool" means "higher temperature" and "warm" means "lower temperature"? :)

You're not going to get a ton of granularity, but you should be able to find at least ~2700K and ~5000K bulbs in your local supermarket, if they've got a decent selection at all-- "soft white" bulbs are going to be ~2700K and "daylight" bulbs are ~5000K.

Stores with larger selections may have a few options in between, too; "cool white" and "bright white" designations aren't uncommon (although exactly what they mean may vary; check the packaging if you're not sure).

I guess my problem is the information discovery. How does the average person buying bulbs in Target or Walmart know to look for this new thing? "Soft White" and "Daylight" are good terms but I wonder how you know the difference? Daylight bulbs had a bit of a different meaning in my mind, but I guess I grew up in a rural area.

It's not new; it's the same terminology that's been used for the past few decades.

You could buy "soft white" and "daylight" incandescents, too.

I've seen it on Philips bulbs in the supermarket and also on Ikea bulbs (obviously not in the supermarket)

For brand names I think it is pretty common, at least here in Europe.

I've never seen a chart in a Wal-Mart or Target, and this was never a concern for the average person with incandescent bulbs.

Incandescent bulbs never had any variation, they're all around 2700K. The GE Reveal bulbs I believe use a color filter to influence the appearance, but the filament still glows the same as everybody else's.

> Incandescent bulbs never had any variation, they're all around 2700K.

That's basically my point. Of course people will complain about the light since LEDs act differently and buying options didn't properly explain how to get back to something that looks like what people had.

And while I don't know how well this works, apparently it's possible to get Bluetooth-controlled adjustable-temperature bulbs:


Seems like something it would be at least interesting to play with.

I have two different brands of 2700K LED bulbs, and they are like night and day. The ones that were shipped with a lamp are actually quite bluish. Haven't bothered to measure them, though.

> night and day

Tsk tsk.

Where do you source the bulbs? I've recently tried f.lux and I definitely get better sleep - I'd like to upgrade my home lighting story.

Philips Hue + Amazon Echo + Custom Integration = Glorious

"Alexa, I'm ready for bed." makes me feel like I live in the future (warms up the lamp colors like f.lux on my Mac). Still working on tying the coffee pot into the morning wakeup alarm though.

I'm in the process of figuring out how to talk to my color changing LEDs so I can change them over time like f.lux does, but for now I'm just turning them a dim, dull red at bedtime and I am sleeping better than I have in years.

It's striking how much of a difference it makes. There should be PSAs about night-time lighting color warmth!

I just buy them on Amazon or their third party retailers.

> 5000K brightness

That's color temperature, not brightness. Totally different.

We switched to LEDs about two years ago and have switched back to halogen bulbs recently for few reasons.

1) It didn't actually make much difference on our electricity bill. I believe this is because we use most of our lighting during the winter, and the wasted heat from the traditional bulbs isn't really wasted at all -- we just shifted more work to our electric heating system.

2) Many of the bulbs had burnt out. I am now very skeptical of all the "20 year lifetime" claims since we lost many expensive bulbs in two years.

3) The light quality just isn't great. People will argue this point, and I can't point to any evidence to support my claim, but our perception was that the light was either too blue, or too yellow, and just "looked weird." I wonder if it's related to the strobing of the LEDs.

LED slow motion: https://www.youtube.com/watch?v=zAfWKcg8Bq0

Halogen slow motion: https://www.youtube.com/watch?v=sCD5BMr6LsA

> 1) It didn't actually make much difference on our electricity bill. I believe this is because we use most of our lighting during the winter, and the wasted heat from the traditional bulbs isn't really wasted at all -- we just shifted more work to our electric heating system.

The only reason people expect them to is the, IMO, obsolete idea that lighting load is a significant driver of power usage.

> 2) Many of the bulbs had burnt out. I am now very skeptical of all the "20 year lifetime" claims since we lost many expensive bulbs in two years.

Maybe I have just been lucky, but the LED floods I put on the exterior of my house have been working like champs. I used to go through halogen and CFL bulbs like crazy out there. The Cree LED retrofits for my dimmable can lights have also been solid.

I'm not sure this is an obsolete idea. The lighting in my house is simply an astounding use of electricity. In fact, I think people underestimate how much power it consumes.

My house's baseline load is about 200W: networking/computer equipment mostly, plus the fridge that cycles on and off, and an instant-hot tap that cycles on and off to keep up to temperature.

Each bedroom had 60w floodlights in 4 cans. Sure, we don't have them on all the time, but 4x3x60 = 720watts. That's half a hairdryer! Our living room+hallway was even worse. There were a bunch of MR16s at 50w. With 16 of them in the house, that's another 800w. When we're home in the evenings, the living room lights are on all the time, because it's in the main part of the house. To be fair, the payback on the $500+ in LED bulbs I put in is way out there, but a big driver in my spending was to keep temps down during the summer. It seems so silly to artificially heat the house with wasted electricity.

In the winter, we don't use much heat, and the gas forced air heat is a hell of a lot better than using lightbulbs.

> My house's baseline load is about 200w

I don't know offhand what my baseline is. Still, 200W continuous works out to 4.8 kWh of baseline energy usage per day for you.

> Each bedroom had 60w floodlights in 4 cans. Sure, we don't have them on all the time, but 4x3x60 = 720watts. That's half a hairdryer! Our living room+hallway was even worse. There were a bunch of MR16s at 50w. With 16 of them in the house, that's another 800w. When we're home in the evenings, the living room lights are on all the time, because it's in the main part of the house.

That's a hell of a lot more lighting than I have. Our bedrooms have single-bulb overhead lights. The guest bedroom is rarely used, the master has two 25W lamps that are used infrequently, and the third is my lab, which has light on maybe two hours a day (I prefer it off unless I am at my workbench). We have a bunch of cans, but except for one that is always dimmed they are usually off. The lights that are typically on when we are home are T8 fluorescent tubes in the kitchen.

Anyway, going back to your example, the typical assumption for lighting is that a given fixture is used on average 3 hours per day. Using that assumption for all your lights gives 4.6 kWh per day for lighting, or about the same as your baseline load. Noticeable, but far from astounding.
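That arithmetic, as a sketch (wattages are the figures quoted in the thread; the 3 hours/day duty cycle is the stated rule-of-thumb assumption):

```python
# Baseline vs. lighting energy, using the figures quoted in the thread.
BASELINE_W = 200                 # always-on load, watts

fixtures = [
    (4 * 3, 60),                 # 4 cans x 3 bedrooms, 60W floods
    (16, 50),                    # sixteen 50W MR16s in living room/hallway
]
HOURS_PER_DAY = 3                # assumed average use per fixture

baseline_kwh = BASELINE_W * 24 / 1000
lighting_kwh = sum(n * w for n, w in fixtures) * HOURS_PER_DAY / 1000

print(f"baseline: {baseline_kwh:.1f} kWh/day, lighting: {lighting_kwh:.2f} kWh/day")
```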

> an instant-hot tap that cycles on and off to keep up to temperature

It doesn't get any more first-world wasteful than this.

Most electric water heaters work the same way, just further from the tap.

At much greater cost, I might add. I ran my kill-a-watt for a few days and, IIRC, it uses something negligible like 7 cents a day. I'm looking at Emerson's webpage and they're quoting 6 cents per day at $0.0986/kWh, so my 7 cents seems about right in a more temperate climate but with electricity costs more like 15-16 cents/kWh. Their numbers say 0.6 kWh a day, so call it 219 kWh in a year. Or, in other words, a small fraction of my house's normal standby power consumption when nobody is home.
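For what it's worth, the quoted Emerson figures check out (simple division, nothing assumed beyond the numbers in the comment):

```python
# Sanity-check the instant-hot figures quoted above.
COST_PER_DAY = 0.06     # Emerson's quoted 6 cents per day
RATE = 0.0986           # quoted electricity rate, $/kWh

kwh_per_day = COST_PER_DAY / RATE
print(f"{kwh_per_day:.2f} kWh/day, about {kwh_per_day * 365:.0f} kWh/year")
```

which lines up with the roughly 0.6 kWh/day figure in the comment.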

And I've got a tankless water heater for our main hot water supply, which of course saves orders of magnitude more than is 'lost' by the instant hot.

The instant hot also probably uses negligibly more energy than the other ways we heat water; microwave (for beverages) or running a burner on the stove on full blast for minutes at a time just to get a large pot of water up to boiling.

Of course, you can see from the fact that I bother measuring electricity of new devices with my kill-a-watt (and replacing all bulbs with LEDs at great cost) that I'm somewhat sensitive to electricity consumption. But I choose to spend my consumption where I think it makes the most sense.

Sorry but I can't understand your first paragraph since I can't figure out the pronouns and specifically what objects you're measuring/looking up the wattages of. Can you clarify?

Sorry, "it" was in reference to the instant-hot water tap that was referred to as a first world problem.

Good points. I still use LEDs outside and they last a very long time. I suspect it's because they stay cool, which extends the lifetime of LEDs. I'm also not as sensitive to light quality for outdoor lighting.

It's easy to do the science behind the light quality of different bulbs, but no one seems to bother. You just need to print a graph of the spectrum the bulb emits and compare it to that of whatever your favorite bulb is. I'd love to have this information on the box, but of course that would make comparison too easy.

Instead we're stuck with a simplification where a very high-dimensional property is reduced to a single number, and even that with low precision since few LED bulbs have emission spectra that resemble a black body.


Here's hoping TM-30-15 gets wide adoption. If every bulb listed its fidelity index, gamut index, and color vector diagram, you'd actually be able to tell which ones are awful.

http://energy.gov/sites/prod/files/2015/09/f26/tm30-intro-we... [PDF]

Most likely the ones not listing it.

The problem is that comparing a spectrum isn't actually meaningful. I really don't care about the spectral shape of the light if it's not perceptible to me. You want some sort of quantitative estimate of what you care about.

If you're looking at a white piece of paper with black text, you won't be able to tell the difference.

But if you're looking at color (flowers, a painting, human skin) then spectral shape becomes very important.

Look at all of the shades of color you're not seeing under a typical CFL: http://www.bealecorner.org/best/measure/cf-spectrum/spectrum...

I'm not disputing the fact that there is a difference. But whether it's perceptible depends on where in the spectrum the differences are in relation to what you are looking at and the response curves of your cones. Not to mention your expectations.

I'm a pretty picky person when it comes to color when working with photos. But I don't have a single incandescent in my house and I can't remember ever being bothered by an inability to discriminate colors.

I have to agree. Every few years I'll try again with LEDs and energy efficient bulbs. I'll look online, buy a few different brands and specifically look for the ones with a "cool" colour temperature that should look natural.

In every instance the lighting was terrible. I really haven't yet found an energy efficient bulb that didn't make my place look like a jail cell.

My current place has halogen track lighting and I really like it.

Halogen is very much not a cool color temperature. If you're trying to match your existing halogen lighting, stick to lights labeled "warm white". If they list a color temperature, 2700K is a reasonable option, 3000K works in a pinch. Halogen bulbs are typically a bit over 3000K, but they'll shift warmer as they're dimmed, which is a pretty common use case.

The vocabulary is a bit weird, since "warm white" corresponds to "low color temperature". It's because we mentally associate blue with cold/ice and orange with fire, but the color temperature scales are based on the color an object glows as you heat it. To make a black body glow blue you have to heat it pretty damn hot.
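The black-body relationship behind the naming can be illustrated with Wien's displacement law (standard physics, not something from the thread):

```python
# Wien's displacement law: the wavelength at which a black body's
# emission peaks is inversely proportional to its temperature.
WIEN_B = 2.898e-3  # Wien's displacement constant, metre-kelvins

def peak_wavelength_nm(temp_k):
    return WIEN_B / temp_k * 1e9

for t in (2700, 3000, 5000, 6500):
    print(f"{t}K black body peaks at {peak_wavelength_nm(t):.0f} nm")
```

A 2700K "warm white" source actually peaks around 1070 nm, well into the infrared (which is also why ordinary incandescents waste so much energy as heat, per the article), while a 6500K source peaks near 446 nm, in the blue.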

You don't want "natural" lighting if you want it to look like an incandescent. "Natural", ie "day" light is very blue. You specifically want a non-natural, warm color temperature.

Fire is plenty natural.

In architectural lighting, the reference spectrum depends on color temperature. Warm whites are measured against the black body spectrum, cool whites are measured against the daylight spectrum.

What do you mean? Daylight is very close to a blackbody.

Close, but not quite: atmospheric effects block some portions. We use the CIE d-series illuminants as an approximation: https://en.m.wikipedia.org/wiki/Standard_illuminant#Illumina...

That's done because it's very unusual to get a black body radiator that hot on Earth, so if you're talking about something on the cool white end odds are high it's daylight or trying to approximate it, not a tungsten filament or a campfire.

If you are looking for "natural" indoor lighting (like for a bedroom or living room), you probably want a "warm"er color, in the range of 2400K to 2600K.


Cree for type A and downlights with 90+ CRI. I'm using Soraa single-source MR16s elsewhere in my house, with a CRI of 93+.

The problem with the light quality might be to do with the color rendering:


Basically, LEDs have a much more irregular spectrum than incandescents or halogens, so objects may look weird colours under them.

I don't know if the 20 year lifetime claim will pan out, but I replaced as many lights in my apartment as I could with LEDs a couple of years ago and haven't had a single one die yet, including in the bathroom, where I used to have to replace a bulb almost once a month.

> Traditional incandescent bulbs have a ‘colour rendering index’ rating of 100, because they match the hue of objects seen in natural daylight.

> Previously researchers have warned that the blue light emitted by modern bulbs could be stopping people from getting to sleep at night

These two sentences do not go together. Daylight contains LOTS of blue light.

5 seconds of research into CRI says that the first sentence is false. CRI compares a light source to an ideal black body, not necessarily daylight.

Most photographers understand the color temperature differences between incandescent/tungsten light and daylight, and have to account for them by setting their camera's white-balance (manually, or automatically).

So it comes as a shock to me that a traditional incandescent bulb's coloring matches that of natural daylight.


I think this is a difficult concept for people because our brains do such a good job compensating for mixed lighting. Our built in auto-white balance is too good. One example I use to explain this is the classic Hollywood 'moonlight' trick, which is achieved by using a daylight balanced source like a large HMI, and then using film or a camera that is balanced for tungsten. Looks like daytime to the eye, but on camera it looks like moonlight.

There are even special gels just for correcting various light sources: CTO (color temp orange), CTB (color temp blue), and minus green (to correct the green cast of fluorescent lights). The rest are usually regarded as 'party colors' since they are for non-technical corrections.

You're conflating two concepts, color temperature and CRI. Together they roughly describe the spectrum output; color temperature tells you where the peak output is along the black-body temperature line, and CRI tells you how lumpy the response is. A low CRI can lead to metameric failure[1].

[1] https://en.wikipedia.org/wiki/Metamerism_(color)#Metameric_f...

bad reporting.

I was going to say the same thing. People are confusing "continuous spectrum" with "solar spectrum". Try comparing a "daylight" bulb with a normal incandescent. The color temperature of the sun is ~5500K. The daylight one will look insanely blue, very much like a "cool white" LED or fluorescent.

The people that are claiming that incandescent light is "natural" are out to lunch, imho. Maybe it appeals to the cave man inside them, because it's closer to the color of fire?

Our eyes can adapt pretty readily to differences in color temperature. What is harder to adapt to (or maybe theoretically impossible) is the non-continuous spectrum. Even if some things look "natural" under such lighting, other things won't, it will shift colors in weird ways. It depends on the pigment etc.

Yes, the spectral power distribution is smooth for any approximate black body radiator. That low temperature (2700K) incandescent and high temperature (5000K-6500K) daylight have a range of "color temperatures" doesn't negate the smoothness of the distribution. Human chromatic adaptation accounts for this difference automatically very nicely.

I guarantee you it's very straightforward to reproduce color mismatches with 5000K "daylight" fluorescents that have CRI 90 and higher, compared to actual daylight. The CRI is just short of complete utter nonsense. It's very useful in specific comparison contexts, but requires qualification. It's almost like your kid coming home with a report card that says B+ and you think, yeah, OK, not bad, kid! But then you realize they got A's in everything except a D in history. Oops.

CRI is next to b.s. by itself, without further qualification. This is how we get craptastic lights where socks mismatch under a CRI 90+ light but not under daylight. The whole point of CRI is to try to turn a complex evaluation into a simple one, and encourages ignoring the dirty filthy truth about certain kinds of light sources, i.e. its emission lines. If you compare the spectral power distribution of incandescent and daylight (sun light on earth filtered through the atmosphere) they don't look anything alike except that both are very smooth spectra.

Whereas LED and fluorescent have very spiky spectra. We tend not to notice these spikes, but sometimes, depending on their location and on the materials being viewed under such a light source, they can have a pronounced effect that isn't predicted by CRI.

CRI has been a pretty hot potato for some time with a lot of scientists trying to fix it, butting up against a lighting industry that doesn't give a flying crap about accurate color reproduction, they just want a high number CRI so they can sell their shitty lights.

>CRI compares a light source to an ideal black body, not necessarily daylight.

Isn't the sun a pretty-close-to-ideal black body emitter? Is there something specific you are thinking of that makes daylight deviate significantly from black-body radiation? (Rayleigh scattering in the upper atmosphere?)

The sun provides a spectrum with a temperature of roughly 6500K, domestic lightbulbs produce something more like 2700K up to 3000K. The biggest difference between incandescent and CF/LED lights is that the spectrum from an incandescent bulb is quite smooth, whereas other lights tend to have a much spikier spectrum meaning you may see some odd colour effects from them and certain materials.
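The "smooth spectrum" point is easy to see from Planck's law (standard physics; the temperatures are the ones mentioned above):

```python
import math

# Planck's law: spectral radiance of an ideal black body. A thermal
# source like a tungsten filament has this smooth shape, with no
# emission lines or spikes.
H, C, KB = 6.626e-34, 2.998e8, 1.381e-23  # Planck, speed of light, Boltzmann

def planck(wavelength_nm, temp_k):
    lam = wavelength_nm * 1e-9
    return (2 * H * C**2 / lam**5) / math.expm1(H * C / (lam * KB * temp_k))

# Relative radiance across the visible band, normalized at 550 nm:
for t in (2700, 6500):
    rel = [planck(w, t) / planck(550, t) for w in range(400, 701, 50)]
    print(t, "K:", " ".join(f"{r:.2f}" for r in rel))
```

The 2700K curve rises gradually toward the red with no bumps; a fluorescent or LED spectrum overlaid on this would show sharp spikes instead.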

Indeed it is, but that doesn't mean objects illuminated by a black body "match the hue of objects seen in natural daylight." CRI has much more to do with the ability to discriminate colors. The absolute hue is a strong function of the color temperature of the illumination, but the eye/brain is also very good at normalizing that away.

This is precisely why I don't think the complaints about "nasty cool light" are based in fact. The color of the environment around you will quickly be normalized to neutral by your brain, and it will be impossible for you to tell whether you are illuminated by a 3000K or 5500K light. You can really only tell color differences accurately.

> 5 seconds of research into CRI says that the first sentence is false. CRI compares a light source to an ideal black body, not necessarily daylight.

I'll clarify this a bit further: At color temperatures below 5000K (warm/neutral whites), it's compared against an ideal black body. For 5000K and up (cool whites) it's compared against CIE Standard Illuminant D[1], representing a daylight spectrum.

Two interesting notes on CRI:

1) This makes CRI discontinuous at 5000K. Two bulbs with very similar color spectra at 4995K and 5005K can have significantly different CRI scores.

2) CRI of 100 doesn't mean it's "perfect", it means it's the same as a black body emitter (or illuminant D for cool whites). There's some recent research suggesting that people prefer oversaturated reds relative to the black body emitter, but if you put more red in the spectrum it actually lowers your CRI.
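The reference-selection rule in note 1 can be sketched as a toy function (the function is mine for illustration; real CRI computation involves test color samples and chromatic adaptation, far more than this):

```python
# Which reference spectrum CRI compares a lamp against, per the rule
# above: Planckian radiator below 5000K, CIE daylight (D-series) at
# 5000K and up. The hard cutoff is what makes CRI discontinuous there.
def cri_reference(cct_k):
    return "Planckian black body" if cct_k < 5000 else "CIE daylight (D series)"

print(cri_reference(4995))  # Planckian black body
print(cri_reference(5005))  # CIE daylight (D series)
```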

As far as the two quotes you pulled from the article

> Traditional incandescent bulbs have a ‘colour rendering index’ rating of 100, because they match the hue of objects seen in natural daylight.

Yep, totally wrong. A tungsten filament bulb is effectively a black body radiator. Being well below 5000K color temperature, it's gauged against the black body spectrum. It's the same spectrum as the reference! Of course it scores well!

> Previously researchers have warned that the blue light emitted by modern bulbs could be stopping people from getting to sleep at night

This one's a more reasonable statement. Daylight doesn't keep you up at night because there's no daylight at night. The fact that you're running a blue pumped phosphor LED at 11 PM is the potential issue. They have a giant spike of blue in the emission spectrum[2]. Exactly how that affects melatonin levels or your circadian clock is out of my expertise, but it's definitely a real concern that we need to investigate and get a better handle on.

[1] https://en.wikipedia.org/wiki/Standard_illuminant#Illuminant...

[2] http://technologyfront.com/journalism/all-pics/true-colors-1...

> Daylight contains LOTS of blue light.

Yes, but the sun automatically goes down at night.

Most people don't turn off their lights at sunset. I believe what the author is saying is that if you have a lot of artificial light giving off a lot of blue light after sunset, it can interfere with your sleep.

You missed my point. The author simultaneously claimed that incandescent lights are like daylight, and that non-incandescent light is worse because it has lots of blue in its spectrum. All I am saying is that, given that sunlight contains significant blue light, these things cannot simultaneously be true.

There's insufficient evidence the problem results from blue light specifically, rather than the absolute amount of light. I would not be surprised if any amount of light that results in mesopic, let alone photopic, adaptation were a sleep inhibitor. The human visual system is very adaptive, and we're not nocturnal animals, so I'd expect pretty much anything brighter than moonlight to be a potential trigger for "time to be awake".

It's the scientific consensus (unless you reject all of the evidence that's out there from the past 15 years).

We have multiple receptors in our eyes doing this signalling; it's likely that only the S cone cell (short-wavelength receptor) suppresses melatonin.

This study found no melatonin suppression for higher wavelength light: http://www.ncbi.nlm.nih.gov/pubmed/14962066

This study also showed the wavelength effect (may be the first big publication on the topic?): http://www.ncbi.nlm.nih.gov/pubmed/11763987

In addition, not only is it an observed effect, but it can be harnessed. This study showed increase in salivary melatonin by using glasses that filtered lower wavelengths of light: http://www.ncbi.nlm.nih.gov/pubmed/15713707

The effect is such a strong one that it was discovered despite not being one of the measurements in an unrelated experiment on breast cancer.

And throughout antiquity, the only light at night we were exposed to was that of fire, which is a longer-wavelength source.

Melanopsin retinal ganglion, hmm. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2831986/

I think the author has somewhat conflated the concepts of CRI and color temperature in the article. An incandescent bulb or a candle has 100 CRI because they are almost by definition close to the ideal. That said, a typical incandescent bulb is going to be somewhere in the 2500-3500K color temp range. Lots of cheap fluorescent bulbs tend to be both relatively low CRI and have a color temp in the 5000-6000K range, which is more like 'daylight without benefits': they tend to have lots of blue and greens and nothing else. Lots/most LED lights fall in this range as well. My personal opinion is that lighting products should be required to list CRI and color temp, just like nutrition information on food. :)

Yeah, there are fluorescent lights with very high CRIs (for example those in professional light booths) but they cost more because of the phosphor sets.

Daylight has a CRI of 100 because the sun is as close to an ideal black body as we're going to get. Perhaps absorption and reflection within the atmosphere lowers it a bit, but I doubt it's significant.

As for daylight containing lots of blue, I think that's the point - our bodies have adapted to recognizing blue as the defining characteristic of daylight, and keep us awake for it.

This article read an awful lot like a marketing puff piece and failed to answer some obvious questions. There is way too much "those evil fluorescent bulbs are making your life miserable" in the article.

1. How long are the bulbs going to last?

2. How expensive would they be to manufacture?

3. When could we see these in stores?

Often there is a long road from laboratory prototype to production, and the supposed cost savings in electricity won't matter if they only last 1,000 hours and cost $100/bulb.

Anecdote time. Those evil fluorescent bulbs have made my life miserable.

They don't last substantially longer in the real world. I am routinely replacing them within 12 months.

They cost an order of magnitude more.

They are considered toxic waste with special clean-up and disposal considerations.

The amount of energy they save as a fraction of my total electric bill is inconsequential.

Stop buying such shitty bulbs. I moved into my house 8 years ago and replaced the incandescents as they burned out, most within the first year. Of all the replacement bulbs I've installed, not a single one has failed. The only ones that have failed me are some that came packaged with a new light fixture. Those were the pits.

Worse, the bulbs in the lamps came from my previous home, and have been going strong for a decade now. I've been wanting to upgrade to LED bulbs when they finally burn out, but they've been too damn reliable.

There is a light at the end of the tunnel (hah) with a couple of bulbs that now won't turn on unless you tap them a bit. They're roughly a decade old and have been used for many hours, so I guess it's finally time.

It also depends on the quality of electricity in one's house. I've lived in places that chewed through CFLs at a rapid pace, and yet the same sourced bulbs would last for years in a different house.

Their aging properties are different from incandescents'. An incandescent tends to just fail when the filament breaks; there isn't much change in the spectral power distribution beforehand, so the overall stability of the light source is quite good. Whereas with fluorescent and even LED, the phosphors decay, so while the bulb will (probably) produce light for a longer time, the quality of the light will suffer immensely, to the point that I think the longevity claims are approximately useless crap unless of course you don't care at all about the quality of light.

Fluorescent bulbs have been superseded; LED bulbs are the best ones to get at the moment. There's very little reason to buy fluorescent bulbs anymore: LED bulbs last longer, are more efficient, and are easily affordable.

I don't know, this may be anecdotal, much like your experience. But I have three fluorescent bulbs that have been running for at least the past five years. One of them runs all night from dusk till dawn, even through New York winters. Another is in the garage, and the third is above a stove and runs all night as well.

There you go, the reason yours last is because you're not switching them on and off 10 times a day, every day, like one would a bedroom light.

I wonder how much is also because they might be comparing apples to oranges: the guy with long-lived fluorescents sounds like he's using the traditional long tube kind, which are indeed known for very long lifespans, which is why they've been used in offices for many decades. The other guy complaining probably used CFLs, which are notorious for the cheap ones having very short lifespans, not because of the fluorescent bulb itself, but because of the cheap, crappy electronic ballast packaged inside. These bulbs almost always fail because of the electronics. On a traditional fluorescent system, the bulbs are just bulbs, with no electronics at all, and the ballast is a separate part that's built to last decades (the life of the fixture). Only the bulb is routinely replaced. When they went to CFLs, this changed, since they were trying to retrofit CFLs into sockets meant for incandescent bulbs, so they packaged a small electronic ballast into the base of the bulb assembly. And of course, to keep costs down, they massively cheaped out on the ballast electronics. With high voltages (needed to ionize the gas), it's easy for marginal components to fail early.

> They don't last substantially longer in the real world.

Yes they do, I have a lot of them and I write the installation date on the bulb - they have failed more or less inline with what the package said.

> They cost an order of magnitude more.

They cost $1. There is no reason they need to cost more than that.

> They are considered toxic waste with special clean-up and disposal considerations.

That isn't true, it's one of those urban myths that spread by email.

> The amount of energy they save as a fraction of my total electric bill is inconsequential.

What are you spending so much on electricity on?

>> They are considered toxic waste with special clean-up and disposal considerations.

>That isn't true, it's one of those urban myths that spread by email.

It can be hazardous waste, you can't categorically say that it isn't true. Both of you are technically wrong. While it's correct that low mercury bulbs can be disposed of in dumpsters according to EPA, some states have stricter rules with respect to even low mercury bulbs. I believe CA, WA, and VT all require specific recycling of low mercury bulbs, for example.

Beyond that, looking at bulbs that are NOT low mercury, those DO become regulated hazardous waste after they burn out according to federal laws. So those MUST be recycled properly to comply with federal and state laws.

In any case, 100% of fluorescent bulbs contain mercury, which is really bad for the environment. So even if you're legally allowed to throw them in dumpsters, it's really not something you should be doing if you care about the environment or the people in it.

Bottom line: Depending on the bulb and jurisdiction, a spent/broken bulb might be classified as hazardous waste legally requiring special cleanup considerations, or it might legally be able to be thrown in a dumpster. But regardless of the law you should properly recycle 100% of bulbs because they contain mercury (even if in small amounts).

Well explained, thank you for that.

This is off topic but, would you mind changing your username? I get that it's mostly harmless but it's against the usual HN guidelines.

Dongz is a surname, fyi.

> What are you spending so much on electricity on

I tried to figure this out for my household... 1) The squirrel cage on the HVAC unit uses about 600 watts. 2) Couldn't convince everyone in the house to turn off TVs when they leave their rooms. 3) Laundry (gas dryer, but again there is a motor on it). 4) Dish washer 5) Fridge/Freezer.

I added up the cost of leaving the lights on in all the rooms (10 60-watt bulbs) 24x7, and that would only cost $43 a month. Typically only half would be on, and only for a few hours in the evening (5 bulbs, 4 hours a night, 10 cents/kWh), which comes to $3.60 a month.
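That arithmetic checks out; here's a quick sketch (the rate, wattage, and usage figures are the commenter's assumptions, not measured values):

```python
# Back-of-envelope lighting cost, using the commenter's assumed
# figures: 60 W bulbs, $0.10/kWh, ~30-day month.
RATE = 0.10   # $/kWh (commenter's assumed rate)
BULB_W = 60   # watts per bulb

def monthly_cost(bulbs, hours_per_day):
    """Monthly dollar cost of running `bulbs` bulbs `hours_per_day` hours."""
    kwh = bulbs * BULB_W / 1000 * hours_per_day * 30
    return kwh * RATE

print(f"${monthly_cost(10, 24):.2f}")  # all 10 bulbs, 24x7: $43.20
print(f"${monthly_cost(5, 4):.2f}")    # 5 bulbs, 4 h/night: $3.60
```

So even in the worst case, lighting is a small slice of a typical bill.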

You have a 600-watt HVAC fan, and only 10 bulbs in your house?

I have approximately 40 bulbs and I don't think I have a large house.

And I have not been able to convince people not to turn on the light during the day, so my usage is closer to 12-18 hours per day.

Some rooms have more than one light but typically only one would be used. Examples:

Master bedroom -- 2 lamps, but one is mine, which is hardly ever turned on.

Kids rooms -- one overhead hanging (swag) light each. Plus the closet light, but I'm not counting that as it is already a low voltage bulb (code requirement).

Living room -- the foyer has hanging light with 3 candelabra bulbs, which are 20 watts each. The dining room (the other side of the great room) has a hanging light with 5 bulbs, again 20 watts each. So one room is 60 watts, the other is 100.

Kitchen/family room -- overhead long-tube fluorescent (120 watts total); the family room has a torch light.

Bathrooms have multiple bulbs (4, 6, and 10 bulbs for the 3 bathrooms), but most of them are burned out and/or unscrewed. And they were 25 watt bulbs. So I'm counting an average of 60 watts per bathroom.

Basement has 6 bulbs total, but only a couple are used on the way to the laundry.

So slightly more than the equivalent of 10 60-watt bulbs for the lights that are used most often. At first I thought that I could save a lot with the compact fluorescent bulbs, but I really didn't see any change in the electric bill after changing out all that I could -- the chandeliers take the candelabra bulbs, and while I could get fluorescents to fit they would be way too bright. Same with the bathrooms, I didn't want 10 bulbs in one bathroom that put out the same lumens as 10 60-watt incandescents, and I couldn't readily find any fluorescent or led bulbs that were equiv. to a 20-watt incandescent.

This article also doesn't mention the current U.S. law which says it is now illegal to sell incandescent bulbs across state lines. The law should have been written to mandate efficiency, not implementation. Now there must be a legal fight to allow incandescent bulbs again (for the U.S., at least).

Why should a UK newspaper cover domestic US laws?

And in the US, the law described at https://en.wikipedia.org/wiki/Energy_Independence_and_Securi... suggests that the ban is not on technology choice but on efficiency, such that a more efficient incandescent bulb would be fine.

From the legislation:

  (C) Candelabra incandescent lamps and intermediate 
                base incandescent lamps.--
                          ``(i) Candelabra base incandescent lamps.--A 
                      candelabra base incandescent lamp shall not exceed 
                      60 rated watts.
                          ``(ii) Intermediate base incandescent lamps.--
                      An intermediate base incandescent lamp shall not 
                      exceed 40 rated watts.
Anything over 60 watts was to become unavailable by 2012, with a schedule to retire the 40 and 60 watt bulbs. Ironically, this meant that to get the same lighting levels, you were permitted to buy more 40w bulbs, but not a single 100w bulb.

As to the "why cover US law?" question, I thought it relevant because the discovery was made by a US team at MIT and for there to be a "return" of incandescent bulbs would require both US and UK/European markets to be opened back up. Even if only as an irony juxtaposed with the location of the discovery, it was related.

Now the law as stated would seemingly permit incandescent bulbs of equivalent brightness to old 100w bulbs, but only if they are 27w or less. Or, you know, you could just buy a gaggle of 27 watt bulbs.

What is your point about a candelabra incandescent lamp?

That concerns only lamps which use a "candelabra screw base as described in ANSI C81.61–2006, Specifications for Electric Bases, common designations E11 and E12." Moreover, that is from a section which specifically allows them to be incandescent, so long as they don't exceed a given power limit.

You also seem to have missed the part where it says:

> The rulemaking— (I) shall not be limited to incandescent lamp technologies; and (II) shall include consideration of a minimum efficacy standard of 45 lumens per watt.

or the "Backstop requirement":

> If the Secretary fails to complete a rulemaking in accordance with clauses (i) through (iv) or if the final rule does not produce savings that are greater than or equal to the savings from a minimum efficacy standard of 45 lumens per watt, effective beginning January 1, 2020, the Secretary shall prohibit the sale of any general service lamp that emits less than 300 percent of the average lumens per watt emitted by a 100-watt incandescent general service lamp that is commercially available on the date of enactment of this clause.

This allows incandescent bulbs so long as they are significantly more efficient than those sold before 2007.
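For a sense of scale, here's a sketch of the backstop arithmetic; the 1600 lm output for a typical 100 W incandescent is my assumption, not a figure from the legislation:

```python
# Rough arithmetic on the "300 percent of the average lumens per watt
# emitted by a 100-watt incandescent" backstop clause. The 1600 lm
# baseline output is an assumed typical value.
BASELINE_LM = 1600   # assumed lumens of a 100 W general service lamp
BASELINE_W = 100

baseline_efficacy = BASELINE_LM / BASELINE_W   # lm/W of the old bulb
backstop_efficacy = 3.0 * baseline_efficacy    # "300 percent" threshold

print(baseline_efficacy)   # 16.0 lm/W
print(backstop_efficacy)   # 48.0 lm/W
```

That lands in the same ballpark as the 45 lm/W minimum mentioned in the rulemaking clause.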

As to why UK newspaper writers are supposed to understand the details of domestic US law, and why UK newspaper readers are supposed to care, in order to get an extra moment of irony - well, that seems like a lot of work for very little gain. Especially when the EU has very similar laws, so UK readers may assume by default that the US is already in the same boat, leaving little to no irony available.

> current U.S. law which says it is now illegal to sell incandescent bulbs across state lines

That's not true. They sell "high efficiency" incandescents in stores (they are basically halogen in a different shape).

That is exactly how the current US law works.

See also: https://en.wikipedia.org/wiki/Energy_Independence_and_Securi...

There seem to be many on sale here: www.amazon.com/b/ref=sr_aj?node=328865011&ajr=0

Maybe the order would fail at checkout if the relevant warehouse is outside of your state. (I once had an order for smoke detectors fail since they were illegal in California)

From the paper: "This experimental device is a proof-of-concept, at the low end of performance that could be ultimately achieved by this approach."

Somebody really needs to get MIT's PR department under control. The hype level is so high it's embarrassing to a good school. Especially in materials science articles. It's like reading the National Enquirer of science.

Publish or perish… publish at all costs without going the extra mile. A pity.

> The clinical white beam of LEDs and frustrating time-delay of ‘green’ lighting has left many hankering after the instant, bright warm glow of traditional filament bulbs.

Sheer garbage. LEDs come on instantly, and are available in various color temperatures, thanks to filtering: you can have them in 2700K. If your LED isn't coming on instantly, it has some problem with the power supply, probably because you bought the lowest bargain-bin crap you could get your hands on.

I've had Philips flood lights (3000K) in my kitchen and living room for several years. They come on instantly and put out a warmly colored light that is consistent from the center of the spot to the fringes.

And your apples and oranges are the wrong color because the color rendition on them is so poor that Philips doesn't even publish the number. The only downlight I could find on their site has a CRI of 80 -- basically garbage. CRI is imperfect but things glow green until you get to the low 90s.

I built a house a couple years ago where the lighting used is predominantly LED bulbs, and I absolutely hate it. The pre-determined lighting plan for the home was presumably based on the brightness given off by certain incandescent bulbs, and placing LED bulbs in those locations that are supposed to be wattage-equivalent has resulted in a fairly dark house with what I would consider to be low quality of light. Many of the bulbs used in certain fixtures also emit a noticeable high-pitched noise when on.

Sounds like pretty rubbish bulbs. Have you not tried replacing even one with a good quality brand?

Of course, it might also just be that the lighting design wasn't very good in the first place...

I have no idea what a good quality brand is. And if I did find better bulbs it would cost a fortune to replace all of the bulbs I just bought two years ago.

Sounds like somebody spent a lot of money without doing any research.

I must have tried samples of a half-dozen different brands before buying enough LED bulbs to retrofit all of my ceiling cans. Actually I sort of wasted my time, because the (subsidized) $2.99 65w replacements from the hardware store on the corner proved to be as good as any of them. No failures a year later (N = about 36) so they seem to be working out OK.

Similar story here, I found my favourite LED bulbs at the supermarket!

If they're on dimmer circuits, they could just be cheap LED bulbs. Even if the dimmer is all the way up, and even if the LED bulb is marketed as dimmable.

Didn't see it mentioned in other comments, but a particular failing of LED lamps is their unsuitability for critical color-matching applications. Fluorescents and CFLs are available for that use, but to date I haven't seen LEDs that are satisfactory.

This is important in art studios, museums and other venues. For many years I've used specialized 48" T12 fluorescents with good results. Incandescent/halogen lamps are also essential because color selection can depend on the target environment; there's a world of difference between indoor illumination and daylight in this respect.

The high-efficiency lamps described in the article will likely be very useful and a welcome refresh of a light source that's so far been hard to emulate with LEDs.

With IR energy being pumped back into the filament, maybe it's an option to run a hotter filament emitting a higher color temperature. More likely the input energy is just decreased so the resulting filament temp remains in conventional range. There still might be an issue re: time to filament burnout not differing from conventional lamps unless the filament is modified in some way.

In the long run I imagine LEDs will be improved, curbing the disproportionate blue output, while enhancing and smoothing the long end of the spectrum. LED dominance will probably be complete when color rendering is optimized, they work properly with electronic dimmer switches, and lamp envelopes produce illumination as diffuse as classic sources, e.g., like "frosted" incandescents.

So, if I understand dkbrk correctly, this is a glass that transmits light but reflects infrared. This seems to have a huge number of other applications (assuming it is practical). Make windows out of it, so that your house stays cooler in the summer and warmer in the winter. Keep your car cooler in the summer. Better face shields for fire fighters. Put a window in your refrigerator. Let people get closer to metal foundries/casting metal.

So, if this works and reaches 40% efficiency while LED or fluorescent bulbs manage ~14% efficiency, do we ban all LED purchases in the US?

No, the bans were stupid to begin with and should be unwound. Instead, there should be highly granular electrical billing, if the electric company cares to bill people properly for the overcapacity needed for that ~1 hour a day when demand spikes. I'd let them charge 1000% surge pricing if they wanted. People will absolutely alter their behavior, with no negative effect on people who can't afford it, so long as the typical cost per kWh is affordable.

It is interesting to think about how this commercial product might have to fight the anti-incandescent laws. Would this bulb be illegal in many states/countries right now?

In the EU, there are no incandescent laws, there are only limits on efficiency.

LEDs themselves won't likely be banned, but we'll likely see a continuing ramp up of required efficiency standards, just as we do now.

No. Just let consumers choose. If I could have super-efficient incandescent bulbs, I would prefer them precisely because of the better colors and instant-on startup. (This assumes production ramp-up has kept these new incandescent prices well below that of LEDs)

"No. Just let consumers choose."

That wasn't the previous solution, and I'm wondering if people who were fine with the demise of incandescent bulbs would be fine with a ban on LEDs because of the same efficiency concerns? I get the feeling the answer is no, which makes me wonder about the original reason.

> That wasn't the previous solution and I'm wondering if people who were fine with the demise of incandescent bulbs would be fine with a ban on LEDs because of the same efficiency concerns?

Yes, it was: the so-called "ban on incandescent bulbs" was actually an efficiency requirement which no existing incandescent bulbs met. Much of the research into high-efficiency incandescent bulbs (which IIRC actually had produced some bulbs that met it prior to the requirement going into effect, but which were not cost competitive with CFL or even LED bulbs) was directly spurred by the requirement.

> I'm wondering if people who were fine with the demise of incandescent bulbs would be fine with a ban on LEDs because of the same efficiency concerns?

I'm pretty sure that most the people in the public that were fine with the government setting efficiency targets that could be met by a variety of technologies on the market (but not, at the time, cost effectively by incandescent bulbs) would be just fine with moving the efficiency requirements up as new technologies become available.

At any given time, which particular industry players would support and oppose such a move would, of course, vary based on the relative efficiency of what each particular industry player was invested in.

Traditional incandescent lights have terrible efficiency, 2.5% or so. The gain from old incandescents to LEDs is huge (5x to 10+x), but there is just not much room between the best LEDs and 100% efficiency, making the remaining gains less important. https://en.wikipedia.org/wiki/Luminous_efficacy

E.g., 303 lm/W vs the maximum possible 683 lm/W. http://www.cree.com/News-and-Events/Cree-News/Press-Releases...
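A quick sketch converting those efficacy numbers into efficiency fractions (the 17 lm/W figure for an old incandescent is an assumed typical value; 683 lm/W is the theoretical ceiling for monochromatic 555 nm green light):

```python
# Luminous efficiency = efficacy / theoretical maximum efficacy.
MAX_LM_PER_W = 683  # ideal monochromatic 555 nm source

def luminous_efficiency(lm_per_w):
    """Fraction of the theoretical maximum a given efficacy achieves."""
    return lm_per_w / MAX_LM_PER_W

print(f"{luminous_efficiency(17):.1%}")   # old incandescent: 2.5%
print(f"{luminous_efficiency(303):.1%}")  # Cree lab LED: 44.4%
```

So even a record-setting lab LED is under half of the theoretical maximum, but the remaining headroom is far smaller than the incandescent-to-LED jump.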

PS: Power is kind of an odd thing. Direct costs are so cheap most people ignore it, but the external costs are high enough to push efficiency gains. Taxes would be the best solution, but politics is odd.

There you go. Set the "sales" tax rate on a luminous efficacy sliding scale. The lower the efficacy, the higher the tax %. It'd have to be some kind of wholesale/manufacturer tax, at least in the U.S., so that it's reflected in the sticker price before checkout.

It would not surprise me one bit if it were discovered the lobbying effort for such laws were pushed by the very lighting industry that needed a way to compel people to buy much more expensive light bulbs with a higher profit margin.

It reeks to me of the same reasons why we have ethanol mandated fuel in some cities.

I was and am perfectly fine with the old incandescents going away, and I'll be perfectly happy when LEDs are legally phased out in favor of something better.

Wouldn't big LED be happy to license the new tech and push that?

Especially with LED sales waning as more and more people have 15 year bulbs.

>Researchers at MIT have shown that by surrounding the filament with a special crystal structure in the glass they can bounce back the energy which is usually lost in heat, while still allowing the light through.

Why can't/don't they use those same crystals on LED lights?

The crystals reflect back heat in order to improve efficiency by keeping the filament hot. LEDs don't work by being hot.

They do lose energy in the form of heat, though.

Ever seen the massive heatsinks required for a high wattage LED to work?

This was posted a while back and picked apart by the comments, as it has been here too. This is all just "potential" improvements, just like we've been hearing about "crazy new battery technologies" for the past few years and have seen next to nothing come of it. Always good to see new tech and ideas, but let's not kid ourselves into thinking this will make its way back into homes anytime soon.

"Return of incandescent light bulbs" is an incredibly misleading headline that implies that due to this new tech, we're already starting to use these great new efficient bulbs- um, no.

Not to mention, the prototype/initial version is nowhere near the "potential" level of efficiency they are claiming.

Here is an interesting thread about color quality (something rarely brought up when LED gets mentioned).


> The clinical white beam of LEDs and frustrating time-delay of ‘green’ lighting

I hear a lot about the "delays" in incandescent alternatives, but all of mine are barely perceptible. My Flux bulb has like 400ms; it's palpable but how could it ever annoy me? How often were you trying to illuminate something with a light bulb within 400ms and missed it?

And of course, many of the adjustable/smart LEDs can mimic incandescent light temperature with ease. They cost more, but not necessarily over their lifetime.

As mentioned, this article was fluff.

"Green" and "energy saving" are often used to refer to florescent lighting in the UK. Florescent bulbs take some time to warm up before they reach full brightness, which is what they are complaining about. LED bulbs like your Flux don't have a delay.

Ah, thanks. The article does a poor job here, too, since it uses LED/CFL interchangeably. But if that's a known euphemism it gets a pass on this one :)

> Fluorescent bulbs take some time to warm up before they reach full brightness

They've had instant on fluorescent bulbs for a decade now, and they don't cost more. If yours are slow switch to a different supplier.

Instant start is not what the comment you're replying to refers to. Instant start is about how an arc is struck to get the gas to glow initially. The comment above is referring to how the light gets brighter as it runs longer, well after the gas begins to fluoresce.


I know, I was not referring to instant start either. (I said instant on, so I can see how that might be confusing.)

The CFL's in my house are full brightness immediately. I suppose a lab could measure a change in brightness, but it's nothing that is visible to the eye.

Zero-warmup fluorescent fixtures do exist, but I've never seen them installed in a home. They work by continuously running a small current through the filaments to keep the bulbs warm, which requires a separate always-on power feed in addition to switched power. Normal (instant-start) fluorescents start producing light instantly, and in sufficiently hot weather don't need to warm up, but if it's below freezing expect less than half of rated brightness for the first five or ten minutes. It's worse for CFL bulbs than for straight tubes, and worse still for CFL bulbs installed upside down because the mercury settles.

So it's like the "stand by" mode of a tube amplifier, which keeps the filaments on.

Even though fluorescent lamps are called cold cathodes, "[a] cold cathode does not necessarily operate at a low temperature: it is often heated to its operating temperature by other methods, such as the current passing from the cathode into the gas." (Wikipedia, Cold Cathode).

Obviously, the tech you're referring to requires a special circuit, since an ordinary light socket goes completely open-circuit when switched off. (I.e. anyone who claims to have these zero-warmup CFLs in ordinary light fixtures is confused.)

I find that the slow warm-up time of regular CFL's is excellent for bathrooms. When you have to "go" at wee hours in the morning, you don't want the full glare in your sleepy eyes.

Your notes about the ambient temperature effect on warm-up time are spot on. People who are not seeing the effect with CFL's may be living in a warm climate or well heated home.

What are the chances of seeing this at retail in the next decade?

Thank goodness! My LED Easy Bake Oven takes forever.

Philips has already had car headlight bulbs that do this "radiate heat back as light" trick.

They are VERY VERY bright. In fact, they're the brightest headlights before you go to real xenon. But they are not much more power-efficient.

9011 HIR and 9012 HIR models


Am I the only person that doesn't like "warm" light? I wish I could have all the lights in my apartment be D65 like my monitor. It makes everything look like a calm, cloudy day. At least with my monitor calibrated to that temperature, anyway.

They should rename "warm" to "nasty dingy yellow".

"However even ‘warm’ finish LED or florescent bulbs can only manage an index rating of 80 and most are far less."

This is simply not true. Current state of the art is >80. If you search a little you'll find LEDs with CRI 95.

I'm surprised no one brought this documentary up: https://www.youtube.com/watch?v=-1j0XDGIsUg

The most interesting thing I ever read about light bulbs is that they used to last 100 years, but manufacturers limited lifespans to drive sales: http://spectrum.ieee.org/geek-life/history/the-great-lightbu...

There is still one that has been active for over 114 years, and it has a 'live' feed: http://www.centennialbulb.org/photos.htm (1 million hours, and it's not that efficient anymore)

When you live in a cold climate, some light bulbs become effectively more efficient, because their heat is not waste heat: it helps to heat up the room they are in.

Heating your home with electricity is probably the most expensive and the most wasteful way.

Depends. If your electricity comes from coal/gas fired power stations, yes it is wasteful - just burn the fossils locally instead. If it comes from hydroelectricity, solar, nuclear, or wind, then no it is sensible.


Incandescent lights, in situations where heating would be used anyway, are effectively 100% efficient.

They provide 3% light output and 97% heat output. A single one, while on, provides a good amount of heat and can easily provide spot heating where the humans are.

Instead there's CFLs and LEDs. And CFLs are a great way to spread mercury pollution across a great area. Snopes has a decent article about hazards and response: http://www.snopes.com/medical/toxins/cfl.asp

You could use the 97% of electricity that was lost to heat to power a heat pump. Heat pumps beat 100% efficiency. [1]

> In heating mode, heat pumps are three to four times more efficient in their use of electric power than simple electrical resistance heaters.

Instead of just generating waste heat, you generate waste heat and move heat around in a desired way.

1: https://en.wikipedia.org/wiki/Heat_pump
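A trivial sketch of why this matters, using the COP range quoted above (3 to 4 for heat pumps; resistive heating, including an incandescent bulb, is pinned at a COP of 1):

```python
# Heat delivered per kW of electricity for a given coefficient of
# performance (COP). Resistive heating has COP 1.0 by definition;
# the 3.5 figure is a mid-range heat pump value from the quote above.
def heat_delivered_kw(electric_kw, cop):
    """kW of heat moved/produced per kW of electricity consumed."""
    return electric_kw * cop

print(heat_delivered_kw(1.0, 1.0))  # resistive (incandescent): 1.0 kW
print(heat_delivered_kw(1.0, 3.5))  # heat pump, mid-range COP: 3.5 kW
```

The same kilowatt of electricity delivers several times more heat through a heat pump, because the pump moves heat rather than generating it.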

Direct electric is a really inefficient way of heating anything. I ripped out a direct electric heater system and replaced it with a ground source heat pump which is giving 4KW heat for every 1KW electricity (used to run the pump and compressor etc) even in the cold depths of the Swedish winter.

Depends strongly on the situation.

It's true that where heating would be used, if the bulb is mounted in such a way that all that heat is available (i.e., not recessed into a ceiling fixture), the 97% is useful. (But not necessarily maximally efficient, because if you wanted heat in the first place, you'd be better off not turning your fuel into electricity, a process with ~50% efficiency at the power plant, and then incurring further distribution losses.)

On the other hand, if cooling would be used, it's even less efficient because now you have to cool away all those 97%.

The electricity for my lights comes from thermal power plants with maybe 40% efficiency and then loses a little bit more in transmission. If I use LED bulbs, then that heat is replaced by my natural gas furnace, which is somewhere around 90% efficient. Net result, even in the dead of winter it saves energy to use more efficient bulbs. In summer, when I'm using electricity to cool my house, the difference is even greater.
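A rough sketch of that fuel-to-heat comparison, using the ~40% plant and ~90% furnace figures from the comment; the 94% grid transmission efficiency is my assumption, not a figure from the comment:

```python
# Fraction of the fuel's energy that ends up as heat in the room,
# via two paths: burning fuel at a power plant and using the
# electricity resistively (e.g. incandescent bulbs), vs burning
# natural gas in a home furnace.
PLANT_EFF = 0.40    # thermal power plant (from the comment)
GRID_EFF = 0.94     # transmission/distribution (my assumption)
FURNACE_EFF = 0.90  # home gas furnace (from the comment)

heat_via_bulbs = PLANT_EFF * GRID_EFF  # all electricity becomes heat
heat_via_furnace = FURNACE_EFF

print(f"incandescent as heater: {heat_via_bulbs:.0%}")  # ~38%
print(f"gas furnace:            {heat_via_furnace:.0%}") # 90%
```

Even counting 100% of the bulb's output as useful heat, the furnace path wastes far less of the original fuel.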

Resistive heating with electricity is just terribly inefficient. If you have to do it anyway (because you need to run equipment, and that equipment is already as efficient as it'll reasonably get) then you can benefit a little by ensuring that this heat is put somewhere it's wanted. But if you don't have to do it, don't do it.

CFLs suck, but LEDs are great. I fear that the CFL interregnum may have hurt the cause of efficient lighting in general.

> CFLs suck, but LEDs are great.

I have both. There's basically no difference, except the LEDs are hot on the base and CFL's are hot on the bulb.

Power consumption is identical, CRI identical, and Cost per Hour (before failing) is basically identical (assuming the hours listed on the package is the truth).

> I fear that the CFL interregnum may have hurt the cause of efficient lighting in general.

It has not. I've been using CFLs for about 20 years (almost since the day they came out), and they work just fine.

CFLs take longer to come on, don't reach full brightness immediately, are less efficient, don't last as long, and contain mercury.

I don't know what your personal statement about using CFLs for 20 years has to do with my fear about the cause of efficient lighting being hurt. Your own preferences don't necessarily reflect those of people in general.

> CFLs take longer to come on

I have both. That isn't true. I have some LEDs that take longer, some CFL's that take longer, and some (of both) that are effectively instant.

> don't reach full brightness immediately

Also not true. It used to be that way, and perhaps some poorly designed ones. But there is no reason to buy any that are not instant.

> are less efficient

Again, not true - make sure to compare bulbs with equivalent CRI.

> don't last as long, and contain mercury.

This is true, and irrelevant. Also, LEDs cost more, and I'm not (yet) convinced the hour ratings are actually accurate.

> I don't know what your personal statement about using CFLs for 20 years has to do with my fear about the cause of efficient lighting being hurt.

Your fear is unwarranted. CFLs work perfectly fine, and I know that because I've been using them for a long time.

I don't know what to say, besides that your statements run contrary to my experience where they conflict, and I don't understand why you think it's irrelevant that CFLs die quicker and are more poisonous.

A whole lot of people experienced CFLs and came away with the conclusion that they suck. I fear that they will assume LEDs also suck.

> I don't know what to say, besides that your statements run contrary to my experience where they conflict

Did you buy CFLs recently or only more than 10 years ago? What brand? I've had excellent results with GE and GreatValue. Mixed results with Feit.

> and I don't understand why you think it's irrelevant that CFLs die quicker

Because I don't (yet) trust LEDs to live as long as they say. So that advantage doesn't exist as far as I'm concerned.

> and are more poisonous

Because it makes no difference in actual use. You can toss them in the trash if you want, or take them to Lowes/Homedepot to recycle. The amount of mercury is too small to hurt anyone if they break.

They also flash when connected to "glow in the dark" light switches, which was really annoying in my previous house.

LEDs in that type of wiring simply glow dimly. I suppose that's better in a way :)

You really need a neutral wire in the light switch box, and attach to that for the glow in the dark switch.

I ripped out the godawful CFLs in my kitchen that took 3 minutes to get up to full brightness in the winter.

LEDs are where it's at.

That only works out when heating would otherwise be running; it doesn't when heating isn't needed, or when the AC is on.

Heating isn't always electric.

Using electricity for heating is an egregious waste of a high quality energy source which can do all sorts of useful things besides generating heat.

Sure, but when air-conditioning would be used that equation flips around. Not only do you lose electricity as heat, you need to use more electricity to counteract that heat.

And in most places, far more money is spent on cooling than heating. Even in the frozen wastelands of Minnesota where I live, air-conditioning costs over the summer are about equal to heating costs over the winter.

As others have mentioned, heat pumps and gas furnaces are much more economical than resistive heating via incandescent lights.
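The penalty described above is easy to put numbers on: every watt of bulb waste heat that the AC has to pump back out costs roughly waste/COP extra watts of electricity. A minimal sketch, with illustrative figures (60 W bulb, ~3% luminous efficiency, COP of 3) that are assumptions, not measurements:

```python
# Rough sketch of the air-conditioning penalty for bulb waste heat.
# All numbers are illustrative assumptions, not measurements.

def cooling_season_power(bulb_watts, luminous_efficiency, ac_cop):
    """Total electrical draw attributable to the bulb while the AC runs.

    Waste heat from the bulb must be pumped back outdoors by the AC,
    which costs roughly waste_heat / COP in extra electricity.
    """
    waste_heat = bulb_watts * (1.0 - luminous_efficiency)
    ac_extra = waste_heat / ac_cop
    return bulb_watts + ac_extra

# 60 W incandescent at ~3% luminous efficiency, AC with a COP of 3:
print(cooling_season_power(60, 0.03, 3.0))  # -> 79.4 (W effective draw)
```

So during the cooling season the effective cost of the bulb is inflated by about a third, on top of its already poor efficiency.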

>And in most places, far more money is spent on cooling than heating.

Hmm. I've always heard the opposite. Anyway, here's one piece of research.


"Abstract Energy demand for climate control was analyzed for Miami (the warmest large metropolitan area in the US) and Minneapolis (the coldest large metropolitan area). The following relevant parameters were included in the analysis: (1) climatological deviations from the desired indoor temperature as expressed in heating and cooling degree days, (2) efficiencies of heating and cooling appliances, and (3) efficiencies of power-generating plants. The results indicate that climate control in Minneapolis is about 3.5 times as energy demanding as in Miami. This finding suggests that, in the US, living in cold climates is more energy demanding than living in hot climates."
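The abstract's method (degree days scaled by appliance and plant efficiencies) can be sketched quickly. The degree-day figures and efficiencies below are rough illustrative assumptions of mine, not the paper's actual inputs, but with numbers in this ballpark the cold city comes out a few times more energy-demanding:

```python
# Back-of-the-envelope degree-day comparison in the spirit of the
# quoted abstract. Degree days and efficiencies are rough assumptions.

def climate_control_energy(hdd, cdd, furnace_eff=0.9, ac_cop=3.0,
                           plant_eff=0.35):
    """Relative primary-energy demand for climate control.

    Heating burns fuel directly at furnace_eff; cooling uses
    electricity at ac_cop, generated at plant_eff. Returns a
    unitless figure proportional to degree days.
    """
    heating = hdd / furnace_eff
    cooling = cdd / (ac_cop * plant_eff)
    return heating + cooling

minneapolis = climate_control_energy(hdd=7800, cdd=700)
miami = climate_control_energy(hdd=150, cdd=4000)
print(minneapolis / miami)  # > 2 with these assumed inputs
```

The exact ratio depends heavily on the assumed efficiencies, but the direction matches the paper: heating dominates the total.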

Need a warmish proofing/fermentation chamber - just turn the lightbulbs (2x25W) in your oven on.

Not really that useful in places that are already hot, or heating while you want to sleep.

Of course, the bulbs would add 97% waste heat if heat isn't wanted. But most places here in the US have some sort of heating system for the winter. That would pretty much require changing out the bulbs for the winter.

During sleeping? Scraping the bottom of the barrel much? I never said that incandescents would eliminate heating, only that they are 100% efficient when used appropriately.

I think it's a fair point; these bulbs are not used for heating because that's not their primary function (despite being more efficient at it than lighting). The inverse would apply to, say, a fireplace. Light is an acceptable (even desired) side effect of the primary function, but it would not be convenient to rely on it for that side effect.

