It was actually published in January, and has just now been picked up by the Telegraph.
Here is the citation:
Ognjen Ilic, Peter Bermel, Gang Chen, John D. Joannopoulos, Ivan
Celanovic, Marin Soljačić. Tailoring high-temperature radiation and
the resurrection of the incandescent source. Nature Nanotechnology,
2016; DOI: 10.1038/nnano.2015.309
The Telegraph's explanation is terrible: "with a special crystal structure in the glass they can bounce back the energy which is usually lost in heat, while still allowing the light through."
The tungsten filament is sandwiched between two plates made up of layers of oxides, designed to selectively reflect infrared radiation and transmit visible light.
They used a numerical model to design and evaluate various candidate compositions for these plates. For a proof-of-concept, they chose one which uses layers of silicon oxide and tantalum oxide, with 90 layers in total per plate. This reflected about 90% of infrared radiation, producing a luminous efficiency of about 6.6%. This is comparable to commercial LEDs and compact fluorescents, though far from state-of-the-art.
However, the results closely matched their numerical model, and a more complex structure comprised of layers of silicon dioxide, aluminium oxide, tantalum oxide and titanium dioxide, with 300 layers in total, should produce a luminous efficiency of 40%. This is significantly better than the state-of-the-art in LEDs (about 15-30%). They did not, however, actually build that one.
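Out of curiosity, the energy balance is easy to sketch numerically. This is my own back-of-envelope calculation, not the paper's model: Planck's law plus a common Gaussian approximation to the eye's photopic sensitivity V(λ), comparing a bare filament against an idealized one where everything outside the visible band is recycled back to the filament.

```python
# Back-of-envelope sketch (my assumptions, not the paper's model):
# luminous efficiency of a black-body filament, bare vs. ideally filtered.
import numpy as np

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23  # Planck, speed of light, Boltzmann

def planck(lam, T):
    """Black-body spectral radiance at wavelength lam (m), temperature T (K)."""
    return (2 * H * C**2 / lam**5) / np.expm1(H * C / (lam * KB * T))

def photopic(lam):
    """Gaussian approximation to the CIE photopic sensitivity V(lambda)."""
    um = lam * 1e6
    return 1.019 * np.exp(-285.4 * (um - 0.559) ** 2)

def luminous_efficiency(T, lo=100e-9, hi=10e-6):
    """Fraction of escaping radiant power perceived as light (1.0 = 683 lm/W)."""
    lam = np.linspace(lo, hi, 200000)  # uniform grid, so the spacing cancels
    b = planck(lam, T)
    return (photopic(lam) * b).sum() / b.sum()

T = 2900  # roughly a bright incandescent filament
bare = luminous_efficiency(T)                      # whole thermal spectrum escapes
filtered = luminous_efficiency(T, 400e-9, 700e-9)  # only the visible band escapes
print(f"bare: {bare:.1%}, ideally filtered: {filtered:.1%}")
```

The bare case lands in the familiar couple-of-percent range, and the ideal-filter case lands in the same few-tens-of-percent ballpark as the 40% figure quoted above.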
I have no idea how expensive this would be to commercialise. It doesn't sound like the physics is particularly complex, but manufacturing costs could be prohibitive. I think it's safe to say we won't be seeing it outside the laboratory any time soon.
"The tungsten filament is sandwiched between two plates made up of layers of oxides, designed to selectively reflect infrared radiation and transmit visible light."
What's terrible about that? It's the same as what you said, using words that could be understood by a child.
Then I read the actual scientific explanation and thought the Telegraph summary was pretty good.
Incandescent bulbs are inefficient because they lose a lot of energy as heat instead of light.
This counters that by introducing a crystal that selectively blocks the heat and lets the light through.
Maybe the explanation is wrong in the minutiae but it gets the fundamental point across.
Gas mantles work pretty much the same way. A cage made of ceramic metal oxides (formed by firing a cotton bag dipped in thorium or yttrium nitrate; if thorium is used, every use releases a few microrems of radioactive radon gas) is heated to incandescence by a gas or oil flame. Because the temperature is much higher than in a light bulb, gas lamps actually waste less heat as infrared.
Caveat: if that argument is correct, I would think a bulb with triple glazing would help, too. Has that been tried?
The triple glazing is supposed to keep the infrared in (radiation losses), or supposed to keep losses from convection and conduction down?
Not exactly. It's actually "when the filament is hot it emits light". It has nothing to do with the current (except that the current heats it).
So it doesn't matter how you heat the filament - in this case you heat the filament by reflecting infrared energy back at it.
In contrast, a fluorescent light does actually use electricity to make light - heating it would do nothing.
Check out the wikipedia article on "luminous efficacy". Hypothetical black-body sources truncated to the ≥2% photopic sensitivity range are excellent, and that's what this group is trying to approximate.
This impression is quite distinct from the reality of the filament between plates with alternating layers of SiO2, TiO2, Al2O3 or Ta2O5.
The Telegraph could have at least said that the advance involved sandwiching the filament between layers of transparent material that transmit light but reflect IR. Not to mention conflating crystal structure and glass. (Glasses can be distinguished from crystals by a lack of repeating long-range structure.)
I'm a bit disappointed that the recipe wasn't included; I would have liked to see what the design entailed. All the layers are dielectrics, and they're all oxides, so the process chambers won't be too hard to control. And I say that knowing full well some poor coater engineer is actually just going to have a devil of a time with it (or not! quarter-wave filters are fairly standard stuff).
Most of all, the thing I'd be interested in is how resistant to error the design is. It sure looks like it doesn't need to be too perfect per layer. For a pair of microscope lenses, precision is everything. In a lightbulb you can probably get away with a lot of error per layer (as a whole, you could easily make the color something goofy). Of course, if its color swings wildly with minor errors in certain layers, well, that could hold the product up in R&D hell.
I feel like this is what keeps the most impressive research out of real world application. Yeah, manufacturing is expensive, but tolerances are just so much more controllable in a lab.
Manufacturing also produces revenue, whereas funding the years of research necessary just to figure out a new nanometer-precise process is a hard investment for most companies to swallow, especially if the end product isn't guaranteed to produce a profit itself.
See, the weird bit of how these coatings work is that it's fairly easy to empirically tune the damn things to be angstrom accurate (-ish), but the geometry of how the material is sprayed messes up the uniformity, which in turn ruins your yields (one square centimeter in the center is good, chuck the rest!). But if the coating can withstand some variation between layers, tuning it such that the layers sorta cancel out in the bulk, you'll lose the efficiency, but your yield can absolutely skyrocket. There will likely be key layers at the optical half, quarter, eighth, etc. thicknesses, but if you get enough layers you can probably get away with a lot. You can also just build really neat source geometries to sorta flatten out the spray over a larger area, but that's something I never worked on.
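For anyone curious, the standard quarter-wave machinery is easy to sketch. This is a generic normal-incidence transfer-matrix calculation with guessed indices for Ta2O5/SiO2 and a made-up layer count, not the stack from the paper; it also checks the tolerance point above by perturbing every layer thickness by ~1%:

```python
# Hedged sketch: reflectance of a high/low-index quarter-wave stack via
# 2x2 transfer matrices. Indices and layer count are my guesses, not the
# published design; the error test illustrates tolerance, nothing more.
import numpy as np

def stack_reflectance(n_layers, d_layers, wavelength, n_in=1.0, n_sub=1.0):
    """R at normal incidence for a stack of (index, thickness) layers."""
    M = np.eye(2, dtype=complex)
    for n, d in zip(n_layers, d_layers):
        delta = 2 * np.pi * n * d / wavelength  # phase thickness of this layer
        layer = np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
        M = M @ layer
    B, Cc = M @ np.array([1.0, n_sub])          # characteristic matrix method
    r = (n_in * B - Cc) / (n_in * B + Cc)
    return abs(r) ** 2

lam0 = 1200e-9                     # design wavelength, in the near IR
nH, nL = 2.1, 1.45                 # rough Ta2O5 / SiO2 indices (assumed)
ns = [nH, nL] * 20                 # 40 alternating layers
ds = [lam0 / (4 * n) for n in ns]  # quarter-wave optical thickness each

R_ir = stack_reflectance(ns, ds, lam0)
rng = np.random.default_rng(0)
ds_err = [d * (1 + 0.01 * rng.standard_normal()) for d in ds]  # 1% errors
R_err = stack_reflectance(ns, ds_err, lam0)
print(f"R at design wavelength: {R_ir:.4f}, with 1% thickness errors: {R_err:.4f}")
```

At the center of the stopband the reflectance barely moves even with 1% random per-layer errors, which matches the intuition that the bulk property is forgiving there; the band edges and passband ripple are where errors would actually show up.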
My ignorance of this sort of coating has me worry about it immediately destroying itself from the insane material stresses... Though no idea how well it'll anneal after 1000 hours of constant heat (or worse: on-off heat-cooling cycles!) But since they're all oxides and likely put down together while the substrate stays hot, I bet it works out ok.
Not least because, in a lab you are happy if one out of ten prototypes work for your experiment. You don't want 90% spoil in manufacturing.
So if one of those layers is, say, 200 nanometers, you could end up reflecting the visible and transmitting the infrared, which is the opposite of what you want to do.
What I'd really expect is that if you make one layer slightly larger, your process is almost certainly making all the repetitions of that layer larger too, and now yeah, your bulk properties are just hosed.
The Telegraph article is dated 11 January 2016
The url implies March 12.
The reason this technique improves the efficiency of incandescent lights is that they produce their light by heating the filament. The hotter the filament, the more light it produces.
So by reflecting the heat back towards the filament, you can achieve a hotter filament with less input energy.
While there are some designs that use cold cathodes these are not the common type as they are less efficient.
Still won't make this filament treatment help though - sandwiching with layers would probably block the electrons.
They're also expensive and somewhat of a conflict material from what I understand in the little reading I've done on them (but they're awesome for power circuits).
Sure, the bulbs take about 1/5 of a second to turn on, which is not instant, but the cheaper LEDs can take up to 2 seconds to reach full brightness. You have to buy a better made bulb to get the quicker "on", which means spending more money. People only trying the cheapest LEDs available are the ones complaining the most about them, I'd assume.
One thing never mentioned is that LED color temperature will vary over time. It's a two-part system with complementary colors: the LED itself is blue, and it's coated or surrounded with a yellow phosphor. The phosphor will dim eventually.
I've never seen a LED bulb that was so slow to light up that I noticed it, unlike some fluorescents that are really atrocious.
I've already given up on fluorescents, if only for the hassle of recycling them due to the mercury content.
Incidentally, is CRI a totally bullshit measurement, or does it mean something? I find it oddly coincidental that an incandescent supposedly has a CRI of 100 and matches daylight. How is this possible for such an old tech? Did they just get really lucky with the first lightbulb? Or have the gases and filament material improved over time to bring CRI up to 100? I have a hard time believing CRI was the ultimate goal of incandescent development, but I could be wrong.
There are reasons to be dissatisfied with the light produced by LEDs and fluorescent lamps, but CRI is only very marginally useful.
Better is to look at the spectral power distribution, and try to find lamps with a broad emission spectrum and no sharp spikes. For nighttime lighting, try to avoid sources with much emission at wavelengths below 500 nm, because these knock out night vision and disrupt sleep cycles.
The color temperature and luminosity of natural light sources vary tremendously during the course of the day or in different parts of the world, yet objects' colors appear consistent. This is a property of human vision called chromatic adaptation. https://en.wikipedia.org/wiki/Chromatic_adaptation
Sources which emit a continuous spectrum have better CRI than tri-color sources, especially on artificially colored objects (dyed/pigmented) due to metamerism. https://en.wikipedia.org/wiki/Metamerism_(color)
- take a specific set of arbitrary paint chips, and record their colors when illuminated by the target lamp (using the CIE 1931 2° standard observer, and the 1960s CIEUVW color space)
- perform an outdated and not very effective type of chromatic adaptation transformation so that the white point matches either a black body radiator (like an incandescent bulb) for lamps of correlated color temperature of <5000K, or a point on the “Illuminant D” series of approximations to daylight for lamps of CCT >5000K
- measure the distances in UVW space between those target colors and the colors of the paint chips as illuminated by the reference black body radiator or D series illuminant.
- sum up all the color distances, multiply by an arbitrary factor, and subtract from 100.
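The final scoring step above can be sketched in a few lines. The 4.6 scale factor is the standard one from CIE 13.3; the ΔE distances below are made up purely for illustration:

```python
# Miniature sketch of the CRI scoring step: each test color sample gets a
# special index R_i = 100 - 4.6 * DeltaE_i (UVW-space distance between test
# and reference illumination), and the general index R_a is their average.
# The distances below are invented, not measured.
def cri_ra(delta_e):
    """General color rendering index R_a from per-sample UVW distances."""
    per_sample = [100 - 4.6 * de for de in delta_e]  # special indices R_i
    return sum(per_sample) / len(per_sample)

# hypothetical distances for the 8 CIE test color samples
print(cri_ra([1.2, 0.8, 2.5, 3.1, 0.9, 1.5, 2.2, 4.0]))
```

Note that nothing in the formula penalizes a spiky spectrum as such; it only penalizes the average shift of those particular paint chips, which is exactly why a lamp can score well and still render some real-world objects badly.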
Everything about this process was totally ad hoc and arbitrary 50 years ago when it was first invented, and is absurdly outdated today.
There was an attempt at the CIE to replace the CRI with something better in IIRC the late 1990s, but the lighting industry didn’t want to follow the recommendations of the color scientists, so it fell apart. (I could be misrepresenting what happened; I’m not an expert in CIE politics or the history here, and haven’t ever researched it in detail.)
Moonlight has significant energy at wavelengths below 500 nm. Does this light disrupt the sleep cycles of other primates, or is this something that developed in humans only after we invented "indoors" and started sleeping there away from sources of bluish light?
In general, the moon is high overhead, so it won’t be directly in your field of view while it lights up your surroundings.
Citation please. What does it mean "below 500 nm"? 400 nm or 600 nm? 400 nm is violet, 600 nm orange. Which of those "knocks out" what?
Additionally, beyond rods and cones, there are a 3rd set of light detectors in the eye, the “photosensitive retinal ganglion cells”, which regulate melatonin and sleep cycles. These are particularly sensitive to light in the 400–500 nm range. Looking at such a light source at nighttime can suppress melatonin production until about an hour after you stop the light. This is the reason that smartphone, television, and computer displays are all so disruptive to sleep when used before bed.
Kind of makes sense when you realize that the sun and an incandescent bulb are both generating light as a small slice of their overall thermal radiation envelope. It's just heat, made visible in both cases. And like sibling commenter said, the CRI was defined based on the incandescent.
I've found that most LED bulbs you can find in supermarkets are really quite bad in this respect.
For my home office, I got something called "True-Light" E27 bulbs. The specs are 12W, 5500K, 920lm, CRI 94. They emit a passable bright, white light.
My favourite light source are still E11 halogens. Just pepper the ceiling with them, add dimmers, and hope it doesn't heat up the room too much…
For me, a lot of the Philips LED Soft White (2700k) bulbs worked.
They are nominally about $2 each, although they frequently drop the price to an astounding $0.17 each. I'm not sure if these price drops are due to some sort of automatic at-the-register rebate from the local power company, or just an internal Walmart effort to promote efficient lighting.
Pretty happy with them so far. The only problem I have had (as with all brands so far) is led bulbs seem to interfere with my garage door receiver signal (if placed right next to it).
You must be looking at some mislabeled, or carelessly packaged stuff.
Also, the actual temperature of an incandescent is going to vary with the line voltage. A swing down to 110V will produce a more yellow light than 120V.
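There's a common rule of thumb for tungsten lamps (approximate, and not from this thread) that filament color temperature scales roughly as V^0.42, which puts a rough number on that yellowing:

```python
# Rule-of-thumb sketch: tungsten filament color temperature vs. line voltage,
# using the approximate T ~ V^0.42 scaling. Rated values here are assumed
# for illustration (120 V nominal, 2800 K rated filament).
def filament_temp(v, v_rated=120, t_rated=2800, exponent=0.42):
    """Approximate filament color temperature (K) at line voltage v."""
    return t_rated * (v / v_rated) ** exponent

print(f"110 V: {filament_temp(110):.0f} K vs 120 V: {filament_temp(120):.0f} K")
```

A 10 V sag only costs on the order of 100 K of color temperature, but that's enough of a shift toward yellow to notice side by side.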
> Stuff in the room is not the right color.
Stuff in the room is the right color when you take it outside on a sunny day.
Compare photos taken outside with no-flash indoor ones.
Being "whiter" won't be an issue. I meant there's some awkward green/purple nuances everywhere.
Here is a 2013 article that goes into the technicalities a bit:
Much to my surprise, the (almost) cheapest bulbs (some Utilitech model that doesn't seem to exist anymore) ended up being the winners. Instant on (no perceptible delay, compared to all others I tried), great color temp, no flickering, wide range of dimmability. I tried many different brands, including Philips at 3× the price.
It's just a single anecdote, but I cared a lot about the performance of the bulbs, and I was surprised to find the cheap ones actually worked best. I think there's a lot of variability out there and you can't rely on cost and you definitely can't rely on brand name (Philips was the worst).
Lately, I've found Ikea's LED bulbs are good and quite cheap (with imperceptible delay on turning on).
All of them are bright, work fine, and start instantly, but the Ikea drivers have a constant buzz. I've relegated those to bathroom fixtures that are only used briefly so that the sound won't drive me crazy.
My biggest issue with them is size limitations. Lots of places they won't go.
The reason people need more power going into halogen lighting is usually because of all the other arrangements around the bulb that reflect the light into the wrong place. A bare halogen bulb (without a reflector) will light a room better than the same amount of power going into a bare normal incandescent bulb, but it will be a harsher bluer light.
I assume most LED bulb factories are in China at the moment, so they are just targeting their own preferences for colour.
And when I'm buying these lights at the supermarket, where does the information about other color temperature options present itself? I've bought lighting for camera work, but I've never seen anything about color temperature at a local store.
They all have descriptions like "Cool White", "White" and "Warm White" too.
It works pretty well - I haven't been disappointed in LED globes I've bought, whereas I bought a few CFLs back in the day that I couldn't stand.
Stores with larger selections may have a few options in between, too; "cool white" and "bright white" designations aren't uncommon (although exactly what they mean may vary; check the packaging if you're not sure).
You could buy "soft white" and "daylight" incandescents, too.
For brand names I think it is pretty common, at least here in Europe.
That's basically my point. Of course people will complain about the light since LEDs act differently and buying options didn't properly explain how to get back to something that looks like what people had.
Seems like something it would be at least interesting to play with.
Where do you source the bulbs? I've recently tried f.lux and I definitely get better sleep - I'd like to upgrade my home lighting story.
"Alexa, I'm ready for bed." is like I live in the future (warms up the lamp colors like flux on my mac). Still working on tying the coffee pot into the morning wakeup alarm though.
That's color temperature, not brightness. Totally different.
1) It didn't actually make much difference on our electricity bill. I believe this is because we use most of our lighting during the winter, and the wasted heat from the traditional bulbs isn't really wasted at all -- we just shifted more work to our electric heating system.
2) Many of the bulbs had burnt out. I am now very skeptical of all the "20 year lifetime" claims since we lost many expensive bulbs in two years.
3) The light quality just isn't great. People will argue this point, and I can't point to any evidence to support my claim, but our perception was that the light was either too blue, or too yellow, and just "looked weird." I wonder if it's related to the strobing of the LEDs.
LED slow motion: https://www.youtube.com/watch?v=zAfWKcg8Bq0
Halogen slow motion: https://www.youtube.com/watch?v=sCD5BMr6LsA
The only reason people expect them to is the, IMO, obsolete idea that lighting load is a significant driver of power usage.
> 2) Many of the bulbs had burnt out. I am now very skeptical of all the "20 year lifetime" claims since we've lost lots of expensive bulbs in two years.
Maybe I have just been lucky, but the LED floods I put on the exterior of my house have been working like champs. I used to go through halogen and CFL bulbs like crazy out there. The Cree LED retrofits for my dimmable can lights have also been solid.
My house's baseline load is about 200w; networking/computer equipment mostly. The fridge that cycles on and off, and an instant-hot tap that cycles on and off to keep up to temperature.
Each bedroom had 60w floodlights in 4 cans. Sure, we don't have them on all the time, but 4x3x60 = 720 watts. That's half a hairdryer! Our living room+hallway was even worse. There were a bunch of MR16s at 50w. With 16 of them in the house, that's another 800w. When we're home in the evenings, the living room lights are on all the time, because it's in the main part of the house. To be fair, the payback on the $500+ in LED bulbs I put in is way out there, but a big driver in my spending was to keep temps down during the summer. It seems so silly to artificially heat the house with wasted electricity.
In the winter, we don't use much heat, and the gas forced air heat is a hell of a lot better than using lightbulbs.
I don't know offhand what my baseline is. Still, this works out to 2.4 kWh baseline energy usage for you.
> Each bedroom had 60w floodlights in 4 cans. Sure, we don't have them on all the time, but 4x3x60 = 720watts. That's half a hairdryer! Our living room+hallway was even worse. There were a bunch of MR16s at 50w. With 16 of them in the house, that's another 800w. When we're home in the evenings, the living room lights are on all the time, because it's in the main part of the house.
That's a hell of a lot more lighting than I have. Our bedrooms have single-bulb overhead lights. The guest bedroom is rarely used, the master has two 25W lamps that are used infrequently, and the third is my lab with has light on maybe two hours a day (I prefer it off unless I am at my workbench). We have a bunch of cans, but except for one that is always dimmed they are usually off. The lights that are typically on when we are home are T8 fluorescent tubes in the kitchen.
Anyway, going back to your example, the typical assumption for lighting is that a given fixture is used on average 3 hours per day. Using that assumption for all your lights gives 4.6 kWh for lighting, or only about twice your baseline load. Noticeable, but far from astounding.
It doesn't get any more first-world wasteful than this.
And I've got a tankless hot water heater for our main hot water supply, which of course saves orders of magnitude more than is 'lost' by the instant hot.
The instant hot also probably uses negligibly more energy than the other ways we heat water; microwave (for beverages) or running a burner on the stove on full blast for minutes at a time just to get a large pot of water up to boiling.
Of course, you can see from the fact that I bother measuring electricity of new devices with my kill-a-watt (and replacing all bulbs with LEDs at great cost) that I'm somewhat sensitive to electricity consumption. But I choose to spend my consumption where I think it makes the most sense.
Instead we're stuck with a simplification where a very high-dimensional property is reduced to a single number, and even that with low precision since few LED bulbs have emission spectra that resemble a black body.
But if you're looking at color (flowers, a painting, human skin) then spectral shape becomes very important.
Look at all of the shades of color you're not seeing under a typical CFL: http://www.bealecorner.org/best/measure/cf-spectrum/spectrum...
I'm a pretty picky person when it comes to color when working with photos. But I don't have a single incandescent in my house and I can't even remember the time when I've been bothered by not being able to discriminate colors.
In every instance the lighting was terrible. I really haven't yet found an energy efficient bulb that didn't make my place look like a jail cell.
My current place has halogen track lighting and I really like it.
The vocabulary is a bit weird, since "warm white" corresponds to "low color temperature". It's because we mentally associate blue with cold/ice and orange with fire, but the color temperature scales are based on the color an object glows as you heat it. To make a black body glow blue you have to heat it pretty damn hot.
In architectural lighting, the reference spectrum depends on color temperature. Warm whites are measured against the black body spectrum, cool whites are measured against the daylight spectrum.
That's done because it's very unusual to get a black body radiator that hot on Earth, so if you're talking about something on the cool white end odds are high it's daylight or trying to approximate it, not a tungsten filament or a campfire.
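Wien's displacement law makes this concrete (textbook constant, nothing specific to lamps): the peak emission wavelength of a black body at "cool white" temperatures is already well into the blue.

```python
# Wien's displacement law: lambda_peak = b / T. Temperatures chosen to
# illustrate a warm bulb, the 5000 K reference crossover, and daylight-ish.
WIEN_B = 2.898e-3  # Wien's displacement constant, m*K

def peak_wavelength_nm(T):
    """Peak black-body emission wavelength in nm at temperature T (K)."""
    return WIEN_B / T * 1e9

for T in (2700, 5000, 6500):
    print(f"{T} K -> peak at {peak_wavelength_nm(T):.0f} nm")
```

A 2700 K filament peaks around 1100 nm, deep in the infrared, while a 6500 K source peaks near 450 nm; no filament survives anywhere close to the latter, which is why the cool-white reference is daylight rather than a hotter black body.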
Basically, LEDs have a much more irregular spectrum than incandescents or halogens, so objects may look weird colours under them.
> Previously researchers have warned that the blue light emitted by modern bulbs could be stopping people from getting to sleep at night
These two sentences do not go together. Daylight contains LOTS of blue light.
5 seconds of research into CRI says that the first sentence is false. CRI compares a light source to an ideal black body, not necessarily daylight.
So it comes as a shock to me that a traditional incandescent bulb's coloring matches that of natural daylight.
There are even special gels just for correcting various light sources: CTO (color temp orange), CTB (color temp blue), and minus green (to correct gross fluorescent lights). The rest are usually regarded as 'party colors' since they are for non-technical corrections.
The people that are claiming that incandescent light is "natural" are out to lunch, imho. Maybe it appeals to the cave man inside them, because it's closer to the color of fire?
I guarantee you it's very straightforward to reproduce color mismatches with 5000K "daylight" fluorescents that have CRI 90 and higher, compared to actual daylight. The CRI is just short of complete utter nonsense. It's very useful in specific comparison contexts, but requires qualification. It's almost like having your kid come home with a report card that says B+ and you think, yeah OK, not bad, kid! But then realize they got A's in everything except a D in history. Oops.
Whereas LED and fluorescent have very spikey spectra. We tend to not notice these spikes, but sometimes, depending on their location and depending on the materials being viewed under such a light source, they can have a pronounced effect that isn't predicted by CRI.
CRI has been a pretty hot potato for some time with a lot of scientists trying to fix it, butting up against a lighting industry that doesn't give a flying crap about accurate color reproduction, they just want a high number CRI so they can sell their shitty lights.
Isn't the sun a pretty-close-to-ideal black body emitter? Is there something specific you are thinking of that makes daylight deviate significantly from black-body radiation? (Rayleigh scattering in the upper atmosphere?)
This is precisely why I don't think people complaining about "nasty cool light" are based in fact. The color of the environment around you will quickly be normalized to be neutral by your brain, and it will be impossible for you to tell whether you are illuminated by a 3000K or 5500K light. You can really only tell color differences accurately.
I'll clarify this a bit further: At color temperatures below 5000K (warm/neutral whites), it's compared against an ideal black body. For 5000K and up (cool whites) it's compared against CIE Standard Illuminant D, representing a daylight spectrum.
Two interesting notes on CRI:
1) This makes CRI discontinuous at 5000K. Two bulbs with very similar color spectrums at 4995K and 5005K can have significantly different CRI scores.
2) CRI of 100 doesn't mean it's "perfect", it means it's the same as a black body emitter (or illuminant D for cool whites). There's some recent research suggesting that people prefer oversaturated reds relative to the black body emitter, but if you put more red in the spectrum it actually lowers your CRI.
As far as the two quotes you pulled from the article
> Traditional incandescent bulbs have a ‘colour rendering index’ rating of 100, because they match the hue of objects seen in natural daylight.
Yep, totally wrong. A tungsten filament bulb is effectively a black body radiator. Being well below 5000K color temperature, it's gauged against the black body spectrum. It's the same spectrum as the reference! Of course it scores well!
This one's a more reasonable statement. Daylight doesn't keep you up at night because there's no daylight at night. The fact that you're running a blue pumped phosphor LED at 11 PM is the potential issue. They have a giant spike of blue in the emission spectrum. Exactly how that affects melatonin levels or your circadian clock is out of my expertise, but it's definitely a real concern that we need to investigate and get a better handle on.
Yes, but the sun automatically goes down at night.
Most people don't turn off their lights at sunset. I believe what the author is saying is that if you have a lot of artificial light after sunset giving off a lot of blue light, it can interfere with your sleep.
We have multiple receptors in our eyes to do signalling, it's likely that only the S cone cell (low wavelength receptor) suppresses melatonin.
This study found no melatonin suppression for higher wavelength light:
This study also showed the wavelength effect (may be the first big publication on the topic?):
In addition, not only is it an observed effect, but it can be harnessed. This study showed increase in salivary melatonin by using glasses that filtered lower wavelengths of light:
The effect is such a strong one that it was discovered despite not being one of the measurements in an unrelated experiment on breast cancer.
And throughout antiquity, the only light at night we were exposed to was that of fire, which is a higher-wavelength source.
As for daylight containing lots of blue, I think that's the point - our bodies have adapted to recognizing blue as the defining characteristic of daylight, and keep us awake for it.
1. How long are the bulbs going to last?
2. How expensive would they be to manufacture?
3. When could we see these in stores?
Often there is a long road from laboratory prototype to production, and the supposed cost savings in electricity won't matter if they only last 1,000 hours and cost $100/bulb.
They don't last substantially longer in the real world. I am routinely replacing them within 12 months.
They cost an order of magnitude more.
They are considered toxic waste with special clean-up and disposal considerations.
The amount of energy they save as a fraction of my total electric bill is inconsequential.
Worse, the bulbs in the lamps came from my previous home, and have been going strong for a decade now. I've been wanting to upgrade to LED bulbs when they finally burn out, but they've been too damn reliable.
There is a light at the end of the tunnel (hah) with a couple of bulbs that now won't turn on unless you tap them a bit. They're roughly a decade old and have been used for many hours, so I guess it's finally time.
Yes they do; I have a lot of them and I write the installation date on the bulb - they have failed more or less in line with what the package said.
> They cost an order of magnitude more.
They cost $1. There is no reason they need to cost more than that.
> They are considered toxic waste with special clean-up and disposal considerations.
That isn't true, it's one of those urban myths that spread by email.
> The amount of energy they save as a fraction of my total electric bill is inconsequential.
What are you spending so much on electricity on?
>That isn't true, it's one of those urban myths that spread by email.
It can be hazardous waste, you can't categorically say that it isn't true. Both of you are technically wrong. While it's correct that low mercury bulbs can be disposed of in dumpsters according to EPA, some states have stricter rules with respect to even low mercury bulbs. I believe CA, WA, and VT all require specific recycling of low mercury bulbs, for example.
Beyond that, looking at bulbs that are NOT low mercury, those DO become regulated hazardous waste after they burn out according to federal laws. So those MUST be recycled properly to comply with federal and state laws.
In any case, 100% of fluorescent bulbs contain mercury, which is really bad for the environment. So even if you're legally allowed to throw them in dumpsters, it's really not something you should be doing if you care about the environment or the people in it.
Bottom line: Depending on the bulb and jurisdiction, a spent/broken bulb might be classified as hazardous waste legally requiring special cleanup considerations, or it might legally be able to be thrown in a dumpster. But regardless of the law you should properly recycle 100% of bulbs because they contain mercury (even if in small amounts).
This is off topic but, would you mind changing your username? I get that it's mostly harmless but it's against the usual HN guidelines.
I tried to figure this out for my household... 1) The squirrel cage on the HVAC unit uses about 600 watts. 2) Couldn't convince everyone in the house to turn off TVs when they leave their rooms. 3) Laundry (gas dryer, but again there is a motor on it). 4) Dish washer 5) Fridge/Freezer.
I added up the cost of leaving lights on in all the rooms (10 60-watt bulbs) 24x7, and that would only cost $43 a month. Typically only half would be on, and only for a few hours in the evening (5 bulbs, 4 hours a night, 10 cents/kWh), and that comes to $3.60 a month.
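That arithmetic is easy to re-run for your own rates and wattages (same assumptions as the comment above: 60 W bulbs, 10 cents/kWh, 30-day month):

```python
# Monthly lighting cost calculator, reproducing the estimate above.
# Rate and wattages are the commenter's assumptions, not universal figures.
def monthly_cost(bulbs, watts, hours_per_day, usd_per_kwh=0.10, days=30):
    """Monthly cost in USD of running `bulbs` lamps of `watts` each."""
    kwh = bulbs * watts * hours_per_day * days / 1000
    return kwh * usd_per_kwh

print(f"worst case: ${monthly_cost(10, 60, 24):.2f}")  # all 10 on, 24x7
print(f"realistic:  ${monthly_cost(5, 60, 4):.2f}")    # 5 bulbs, 4 h/night
```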
I have approximately 40 bulbs and I don't think I have a large house.
And I have not been able to convince people not to turn on the light during the day, so my usage is closer to 12-18 hours per day.
Master bedroom -- 2 lamps, but one is mine, which is hardly ever turned on.
Kids rooms -- one overhead hanging (swag) light each. Plus the closet light, but I'm not counting that as it is already a low voltage bulb (code requirement).
Living room -- the foyer has hanging light with 3 candelabra bulbs, which are 20 watts each. The dining room (the other side of the great room) has a hanging light with 5 bulbs, again 20 watts each. So one room is 60 watts, the other is 100.
Kitchen/family room -- overhead long tube fluorescent (120 watts total), family room had a torch light.
Bathrooms have multiple bulbs (4, 6, and 10 bulbs for the 3 bathrooms), but most of them are burned out and/or unscrewed. And they were 25 watt bulbs. So I'm counting an average of 60 watts per bathroom.
Basement has 6 bulbs total, but only a couple are used on the way to the laundry.
So slightly more than the equivalent of 10 60-watt bulbs for the lights that are used most often. At first I thought that I could save a lot with compact fluorescent bulbs, but I really didn't see any change in the electric bill after changing out all that I could. The chandeliers take the candelabra bulbs, and while I could get fluorescents to fit, they would be way too bright. Same with the bathrooms: I didn't want 10 bulbs in one bathroom putting out the same lumens as 10 60-watt incandescents, and I couldn't readily find any fluorescent or LED bulbs equivalent to a 20-watt incandescent.
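Matching on lumens rather than watts makes the substitution problem above concrete. A rough sketch, using ballpark efficacy figures of my own (roughly 10 lm/W for small incandescents, roughly 90 lm/W for retail LEDs; neither number is from this thread):

```python
# Ballpark efficacy assumptions -- typical values, not measured ones.
INCANDESCENT_LM_PER_W = 10  # small incandescent bulbs, approx.
LED_LM_PER_W = 90           # common retail LED bulbs, approx.

def led_watts_for_incandescent(inc_watts):
    """LED wattage that roughly matches an incandescent's light output."""
    lumens = inc_watts * INCANDESCENT_LM_PER_W
    return lumens / LED_LM_PER_W

# A 20 W candelabra incandescent puts out roughly 200 lm,
# so an LED of a couple of watts would match it.
print(round(led_watts_for_incandescent(20), 1))
```

The difficulty described above is real: LED bulbs that dim down to this ~200 lm range were scarce at the time.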
And in the US, the law described at https://en.wikipedia.org/wiki/Energy_Independence_and_Securi... suggests that the ban is not on technology choice but on efficiency, such that a more efficient incandescent bulb would be fine.
(C) Candelabra incandescent lamps and intermediate
base incandescent lamps.--
``(i) Candelabra base incandescent lamps.--A
candelabra base incandescent lamp shall not exceed
60 rated watts.
``(ii) Intermediate base incandescent lamps.--
An intermediate base incandescent lamp shall not
exceed 40 rated watts.
As to the "why cover US law?" question, I thought it relevant because the discovery was made by a US team at MIT and for there to be a "return" of incandescent bulbs would require both US and UK/European markets to be opened back up. Even if only as an irony juxtaposed with the location of the discovery, it was related.
Now the law as stated would seemingly permit incandescent bulbs of equivalent brightness to old 100 W bulbs, but only if they draw 27 W or less. Or, you know, you could just buy a gaggle of 27-watt bulbs.
That concerns only lamps which use a "candelabra screw base as described in ANSI C81.61–2006, Specifications for Electric Bases, common designations E11 and E12." Moreover, that is from a section which specifically allows them to be incandescent, so long as they don't exceed a given power limit.
You also seem to have missed the part where it says:
> The rulemaking— (I) shall not be limited to incandescent lamp technologies; and (II) shall include consideration of a minimum efficacy standard of 45 lumens per watt.
or the "Backstop requirement":
> If the Secretary fails to complete a rulemaking in accordance with clauses (i) through (iv) or if the final rule does not produce savings that are greater than or equal to the savings from a minimum efficacy standard of 45 lumens per watt, effective beginning January 1, 2020, the Secretary shall prohibit the sale of any general service lamp that emits less than 300 percent of the average lumens per watt emitted by a 100-watt incandescent general service lamp that is commercially available on the date of enactment of this clause.
This allows incandescent bulbs so long as they are significantly more efficient than those sold before 2007.
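As a back-of-envelope check on what the backstop implies, here's a sketch. It assumes a typical 100 W incandescent emits about 1700 lumens (~17 lm/W); that baseline is my assumption, not a figure from the statute:

```python
# Reading the "300 percent" backstop with ballpark numbers.
# Assumption: a typical 100 W incandescent emits ~1700 lm.
BASELINE_LM_PER_W = 1700 / 100  # ~17 lm/W

backstop = 3.0 * BASELINE_LM_PER_W  # "300 percent of" the baseline
print(backstop)  # ~51 lm/W required efficacy

# A bulb matching the ~1700 lm of an old 100 W bulb could then
# draw at most:
max_watts = 1700 / backstop
print(round(max_watts, 1))  # ~33.3 W
```

Under these assumptions the permitted wattage lands in the low 30s; the exact cutoff depends on which baseline lumen figure you pick.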
As to why UK newspaper writers are supposed to understand the details of domestic US law, and why UK newspaper readers are supposed to care, in order to get an extra moment of irony - well, that seems like a lot of work for very little gain. Especially since the EU has very similar laws, UK readers may assume by default that the US is already in the same boat, so there's little to no irony available.
That's not true. They sell "high efficiency" incandescents in stores (they are basically halogen in a different shape).
See also: https://en.wikipedia.org/wiki/Energy_Independence_and_Securi...
Maybe the order would fail at checkout if the relevant warehouse is outside of your state. (I once had an order for smoke detectors fail because they were illegal in California.)
Somebody really needs to get MIT's PR department under control. The hype level is so high it's embarrassing to a good school. Especially in materials science articles. It's like reading the National Enquirer of science.
Sheer garbage. LEDs come on instantly, and are available in various color temperatures, thanks to filtering: you can have them in 2700K. If your LED isn't coming on instantly, it has some problem with the power supply, probably because you bought the lowest bargain-bin crap you could get your hands on.
I've had Phillips flood lights (3000K) in my kitchen and living room for several years. They come on instantly and put out a warmly colored light that is consistent from the center of the spot to the fringes.
Of course, it might also just be that the lighting design wasn't very good in the first place...
I must have tried samples of a half-dozen different brands before buying enough LED bulbs to retrofit all of my ceiling cans. Actually I sort of wasted my time, because the (subsidized) $2.99 65w replacements from the hardware store on the corner proved to be as good as any of them. No failures a year later (N = about 36) so they seem to be working out OK.
This is important in art studios, museums and other venues. For many years I've used specialized 48" T12 fluorescents with good results. Incandescent/halogen lamps are also essential because color selection can depend on the target environment, there's a world of difference between indoor illumination and daylight in this respect.
The high-efficiency lamps described in the article will likely be very useful and a welcome refresh of a light source that's so far been hard to emulate with LEDs.
With IR energy being pumped back into the filament, maybe it's an option to run a hotter filament emitting a higher color temperature. More likely, the input energy is just decreased so the resulting filament temperature stays in the conventional range. There might still be an issue with time to filament burnout not differing from conventional lamps, unless the filament is modified in some way.
In the long run I imagine LEDs will be improved, curbing the disproportionate blue output, while enhancing and smoothing the long end of the spectrum. LED dominance will probably be complete when color rendering is optimized, they work properly with electronic dimmer switches, and lamp envelopes produce illumination as diffuse as classic sources, e.g., like "frosted" incandescents.
That wasn't the previous solution, and I'm wondering if people who were fine with the demise of incandescent bulbs would be fine with a ban on LEDs over the same efficiency concerns. I get the feeling the answer is no, which makes me wonder about the original reasoning.
Yes, it was: the so-called "ban on incandescent bulbs" was actually an efficiency requirement which no existing incandescent bulbs met. Much of the research into high-efficiency incandescent bulbs (which IIRC actually had produced some bulbs that met it prior to the requirement going into effect, but which were not cost competitive with CFL or even LED bulbs) was directly spurred by the requirement.
> I'm wondering if people who were fine with the demise of incandescent bulbs would be fine with a ban on LEDs because of the same efficiency concerns?
I'm pretty sure that most of the people in the public who were fine with the government setting efficiency targets that could be met by a variety of technologies on the market (but not, at the time, cost effectively by incandescent bulbs) would be just fine with moving the efficiency requirements up as new technologies become available.
At any given time, which particular industry players would support and oppose such a move would, of course, vary based on the relative efficiency of what each particular industry player was invested in.
E.g., 303 lm/W vs. the maximum possible 683 lm/W.
PS: Power is kind of an odd thing. Direct costs are so cheap most people ignore it, but the external costs are high enough to push efficiency gains. Taxes would be the best solution, but politics is odd.
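The efficacy figures above tie back to the luminous-efficiency percentages in the article. As a sketch, using the standard 683 lm/W photopic ceiling (the theoretical maximum, achieved only by monochromatic 555 nm green light):

```python
# Luminous efficiency (a fraction) vs. luminous efficacy (lm/W).
MAX_LM_PER_W = 683  # photopic maximum, monochromatic 555 nm light

def efficacy(luminous_efficiency):
    """Convert a luminous-efficiency fraction to lumens per watt."""
    return luminous_efficiency * MAX_LM_PER_W

print(round(efficacy(0.066), 1))  # the prototype's ~6.6% -> ~45.1 lm/W
print(round(efficacy(0.40), 1))   # the proposed 40% design -> ~273.2 lm/W
print(round(303 / MAX_LM_PER_W, 3))  # 303 lm/W as a fraction -> ~0.444
```

So the 303 lm/W example corresponds to roughly 44% luminous efficiency, a bit above the 40% the paper projects for the 300-layer design.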
It reeks to me of the same reasons why we have ethanol mandated fuel in some cities.
Especially with LED sales waning as more and more people have 15 year bulbs.
Why can't/don't they use those same crystals on LED lights?
Ever seen the massive heatsinks required for a high wattage LED to work?
"Return of incandescent light bulbs" is an incredibly misleading headline that implies that due to this new tech, we're already starting to use these great new efficient bulbs- um, no.
Not to mention, the prototype/initial version is nowhere near the "potential" level of efficiency they are claiming.
I hear a lot about the "delays" in incandescent alternatives, but all of mine are barely perceptible. My Flux bulb has about a 400 ms delay; it's noticeable, but how could it ever annoy me? How often were you trying to illuminate something within 400 ms of flipping the switch and missed it?
And of course, many of the adjustable/smart LEDs can mimic incandescent light temperature with ease. They cost more, but not necessarily over their lifetime.
As mentioned, this article was fluff.
They've had instant-on fluorescent bulbs for a decade now, and they don't cost more. If yours are slow, switch to a different supplier.
The CFL's in my house are full brightness immediately. I suppose a lab could measure a change in brightness, but it's nothing that is visible to the eye.
Even though fluorescent lamps are called cold cathodes, "[a] cold cathode does not necessarily operate at a low temperature: it is often heated to its operating temperature by other methods, such as the current passing from the cathode into the gas." (Wikipedia, Cold Cathode).
Obviously, the tech you're referring to requires a special circuit, since an ordinary light socket goes completely open circuit when switched off. (I.e., anyone who claims to have these zero-warmup CFLs in ordinary light fixtures is confused.)
I find that the slow warm-up time of regular CFL's is excellent for bathrooms. When you have to "go" at wee hours in the morning, you don't want the full glare in your sleepy eyes.
Your notes about the ambient temperature effect on warm-up time are spot on. People who are not seeing the effect with CFL's may be living in a warm climate or well heated home.
They are VERY VERY bright. In fact, the brightest headlights before you go to real xenon. But they are not much more power efficient.
9011 HIR and 9012 HIR models
This is simply not true. Current state of the art is >80. If you search a little you'll find LEDs with CRI 95.
There is still one that has been active for over 114 years, and it has a 'live' feed: http://www.centennialbulb.org/photos.htm (1 million hours, though it's not very efficient anymore).
As it helps to heat up the room they are in.
Incandescent lights, in settings where heating is wanted anyway, are effectively 100% efficient.
They provide about 3% light output and 97% heat output. A single bulb left on provides a fair amount of heat and can easily provide spot heat where people are.
Instead there's CFLs and LEDs. And CFLs are a great way to spread mercury pollution across a great area. Snopes has a decent article about hazards and response: http://www.snopes.com/medical/toxins/cfl.asp
> In heating mode, heat pumps are three to four times more efficient in their use of electric power than simple electrical resistance heaters.
Instead of just generating waste heat, you generate waste heat and move heat around in a desired way.
It's true that where heating would be used, if mounted in such a way that all that heat is available (i.e., not recessed into a ceiling fixture), the 97% is useful. (But not necessarily maximally efficient, because if you wanted heat in the first place you'd be better off not turning your energy into electricity, a process with ~50% efficiency at the power plant, and then incurring further distribution losses.)
On the other hand, if cooling would be used, it's even less efficient because now you have to cool away all those 97%.
Resistive heating with electricity is just terribly inefficient. If you have to do it anyway (because you need to run equipment, and that equipment is already as efficient as it'll reasonably get) then you can benefit a little by ensuring that this heat is put somewhere it's wanted. But if you don't have to do it, don't do it.
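The comparison in the last few comments can be made concrete with a heat-per-unit-fuel sketch. The numbers are ballpark assumptions of my own (~50% combined plant and grid efficiency, heat-pump COP of 3, ~90% efficient gas furnace), not figures from the thread's sources:

```python
# Heat delivered to the home per unit of primary fuel energy,
# under ballpark assumptions (see lead-in).
PLANT_EFFICIENCY = 0.50    # fuel energy -> delivered electricity
HEAT_PUMP_COP = 3.0        # heat moved per unit of electricity
FURNACE_EFFICIENCY = 0.90  # fuel energy -> heat in the home

fuel = 1.0  # one unit of primary fuel energy

resistive = fuel * PLANT_EFFICIENCY * 1.0            # resistance heater (or bulb)
heat_pump = fuel * PLANT_EFFICIENCY * HEAT_PUMP_COP  # electric heat pump
furnace = fuel * FURNACE_EFFICIENCY                  # burn the fuel directly

print(resistive, heat_pump, furnace)  # 0.5 1.5 0.9
```

So even though resistive heating is "100% efficient" at the point of use, per unit of fuel it delivers the least heat of the three options.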
CFLs suck, but LEDs are great. I fear that the CFL interregnum may have hurt the cause of efficient lighting in general.
I have both. There's basically no difference, except the LEDs are hot on the base and CFL's are hot on the bulb.
Power consumption is identical, CRI identical, and Cost per Hour (before failing) is basically identical (assuming the hours listed on the package is the truth).
> I fear that the CFL interregnum may have hurt the cause of efficient lighting in general.
It has not. I've been using CFLs for about 20 years (almost since the day they came out), and they work just fine.
I don't know what your personal statement about using CFLs for 20 years has to do with my fear about the cause of efficient lighting being hurt. Your own preferences don't necessarily reflect those of people in general.
I have both. That isn't true. I have some LEDs that take longer, some CFL's that take longer, and some (of both) that are effectively instant.
> don't reach full brightness immediately
Also not true. It used to be that way, and perhaps some poorly designed ones. But there is no reason to buy any that are not instant.
> are less efficient
Again, not true - make sure to compare bulbs with equivalent CRI.
> don't last as long, and contain mercury.
This is true, and irrelevant. Also, LEDs cost more, and I'm not (yet) convinced the hour ratings are actually accurate.
> I don't know what your personal statement about using CFLs for 20 years has to do with my fear about the cause of efficient lighting being hurt.
Your fear is unwarranted. CFLs work perfectly fine, and I know that because I've been using them for a long time.
A whole lot of people experienced CFLs and came away with the conclusion that they suck. I fear that they will assume LEDs also suck.
Did you buy CFLs recently or only more than 10 years ago? What brand? I've had excellent results with GE and GreatValue. Mixed results with Feit.
> and I don't understand why you think it's irrelevant that CFLs die quicker
Because I don't (yet) trust LEDs to live as long as they say. So that advantage doesn't exist as far as I'm concerned.
> and are more poisonous
Because it makes no difference in actual use. You can toss them in the trash if you want, or take them to Lowes/Homedepot to recycle. The amount of mercury is too small to hurt anyone if they break.
You really need a neutral wire in the light switch box, and attach to that for the glow in the dark switch.
LEDs are where it's at.
Heating isn't always electric.
Using electricity for heating is an egregious waste of a high quality energy source which can do all sorts of useful things besides generating heat.
And in most places, far more money is spent on cooling than heating. Even in the frozen wastelands of Minnesota where I live, air-conditioning costs over the summer are about equal to heating costs over the winter.
As others have mentioned, heat pumps and gas furnaces are much more economical than resistive heating via incandescent lights.
Hmm. I've always heard the opposite. Anyway, here's one piece of research.
> "Energy demand for climate control was analyzed for Miami (the warmest large metropolitan area in the US) and Minneapolis (the coldest large metropolitan area). The following relevant parameters were included in the analysis: (1) climatological deviations from the desired indoor temperature as expressed in heating and cooling degree days, (2) efficiencies of heating and cooling appliances, and (3) efficiencies of power-generating plants. The results indicate that climate control in Minneapolis is about 3.5 times as energy demanding as in Miami. This finding suggests that, in the US, living in cold climates is more energy demanding than living in hot climates."
During sleeping? Scraping the bottom of the barrel much?
I never said that incandescents would eliminate heating, only that they are 100% efficient when used appropriately.