1. Seasonal Affective Disorder is treated using just the visible light spectrum entering the retinas, and there are plenty of studies showing its effectiveness (in fact, most SAD lamps explicitly filter out ultraviolet light).
2. Some infrared passes through some glass and seems to be good for a thousand different things.  
3. UVA passes through glass and has beneficial effects on the cardiovascular system. 
This might be of interest to you.
(Edit: That sounds very depressing, and could affect your health in the long term, so if you can, please do leave!).
So I think the right course of action is a little more nuanced than simply saying, "UV increases skin cancer risk therefore ignore everything else as well as magnitudes of net risks/benefits". Should we not weigh all factors?
I'll see if I can find the link.
I've had a couple of barbecue parties this summer in Auckland and NOBODY went outside until the sun was way past 5 PM. We preferred to sit in a stuffy kitchen rather than enjoy the beautiful, beautiful outdoors. Even in the middle of winter you can feel the sun bite you as you stand at a pedestrian crossing.
"It's so lovely and sunny here, why aren't you tan?"
"Because the sun is a terrifying ray of pain and death here."
The prediction is that by 2050-2070 we will be back to the 1980 level of ozone thickness.
(2001 paper - https://pdfs.semanticscholar.org/6944/503469e4f779e99884e144...
" Sufferers of skin cancer today should more likely blame their affliction on skin type and sun exposure during their youth than any changes in ozone distributions over the last twenty years. Therefore it is safe to say that even without ozone
depletion, Australia would still have a very high rate of skin cancer.")
You had people literally baking themselves in the sun with tanning oils. Even after "Slip Slop Slap" was first promoted in the early '80s, everyone was still trying to get a tan.
> Seasonal affective disorder is a mood disorder subset in which people who have normal mental health throughout most of the year exhibit depressive symptoms at the same time each year, most commonly in the winter.
I usually get depressed in the summer though, even when I get enough unfiltered sunlight.
Most depression has nothing to do with SAD; it's merely a specific type. It's a type that has a fairly straightforward treatment, thankfully!
I often have a depressed mood in November and December when the sun is at its lowest, but my family also likes to make "the holidays" as miserable an experience as possible, so it's really no surprise and probably unrelated to light levels.
"This model also explains the otherwise confusing tendency of some SAD sufferers to get depressed in the summer. The problem isn’t amount of light, it’s circadian rhythm disruption – which summer can do just as well as winter can."
Found the answer on Quora: https://www.quora.com/Why-does-glass-block-UV
"With a band gap of 4eV, glass can't absorb any photons with less energy than UVB light; namely, it is transparent to UVA, visible light, infared, etc; but the higher energy photons can and are highly likely to be absorbed."
So it seems hard to create glass that doesn't block UVB.
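To sanity-check that 4 eV figure: photon energy in eV is roughly 1239.84 divided by the wavelength in nm, so a quick sketch (just the standard E = hc/λ relation applied to the band gap quoted above):

    # Convert a band gap in eV to the photon cutoff wavelength.
    # E (eV) ~= 1239.84 / wavelength (nm)
    BANDGAP_EV = 4.0  # value quoted in the Quora answer for common glass

    cutoff_nm = 1239.84 / BANDGAP_EV
    print(f"Cutoff wavelength: {cutoff_nm:.0f} nm")  # ~310 nm

    # UVB spans roughly 280-315 nm and UVA 315-400 nm, so a ~310 nm cutoff
    # means glass passes UVA and visible light but absorbs most of UVB.

Which lines up with the claim: the cutoff lands in the middle of the UVB band, so ordinary glass transmits UVA but blocks most UVB.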
> In Surely You’re Joking, Mr. Feynman, American physicist Richard Feynman speculates that he may have been the only person who watched the Trinity Test relatively directly, using a windshield to exclude ultraviolet light. Everyone else, he claims, was looking through something akin to welding goggles.
Did he know the main visual hazard from the bomb was UV-B? Did he just get lucky?
Lasers didn't exist yet to force us to study retinal exposure to bright non-UV light, and the flash from the bomb didn't last that long, nor were the first few bombs that bright, so he may have been fine... but obviously, if you stare at the sun through three inches of glass, you're still going to burn your retinas.
He says "I figured the only thing that could really hurt your eyes (bright light can never hurt your eyes) is ultraviolet light. I got behind a truck windshield, because the ultraviolet can’t go through glass, so that would be safe, and so I could see the damn thing."
What we know from later laser research is that unless you have a comprehensive visual field test, it's often hard to identify that part of your retina has been scorched. Your brain just filters it out as a blind-spot and you don't realize what you're failing to see.
There's every chance that Feynman totally burned a section of his retina and never realized it, and there's every chance that he was fine because the exposure at his distance wasn't that bad, but at the end of the day he was more reckless than insightful in this situation.
FYI vacuum exposure can cause blindness, visual impairment, and death.
I've edited my comment above to clarify.
Before y'all criticize that statement, keep in mind it's coming from one of the best scientists of the 20th century.
He was talking before lasers. For a purely thermal source to hurt the eyes, it would have to be:
1. insanely hot
2. quite close to the observer
The first condition is met by the nuke, but the second, obviously, is not, unless you're a victim of the explosion.
Source: Am experimental physicist; do dumb things. Sometimes my colleagues do too, even the best ones.
I don't get this type of thinking. Unless he thought it would be worse to wear the goggles, what is lost by wearing them (in that situation) just in case you were wrong? Why not reduce the chance of harm as much as you can?
- UVA penetrates deeply into the skin (the dermis) causing genetic damage to cells, photo-ageing (wrinkling, blotchiness etc) and immune-suppression.
- UVB penetrates into the epidermis (top layer of the skin) causing damage to the cells. UVB is responsible for sunburn – a significant risk factor for skin cancer, especially melanoma.
Which, contrary to what I knew, links melanoma to sunburn, not DNA damage.
I think I'm just going to use daily sunscreen on my face only.
A single quiescent skin cell with precancerous DNA can be cleaned up by the immune system. A precancerous cell that has multiplied itself to cover a patch of sunburn, activating some of the genes for rapid growth, is much harder to clean up.
But I would like to see some clarification on this topic, too.
From what I read, UVB melanoma is an "easy" type of cancer.
I just figured automotive glass doesn't block UV (or at least all of it) since window tinting places always advertise UV blocking as a feature of their films. Cynically I know it could be just empty marketing, but it didn't seem like it.
The problem is, its melting point is extremely high. To make artificial sapphire you have to melt the stuff at very high temperature and let it drip and collect onto a ceramic base or something. And then you cut, grind, and polish it into shape, which is not easy either because it's an extremely hard material.
The challenges to making a large flat plate of AlOx would be huge.
This product was originally announced in 1984 as the Nikon 105mm f/4.5 UV-Micro-Nikkor, and from September 1985 it was marketed as the Nikon UV-Nikkor. The lens sold at the time for $2,200.00 USD, then about half the cost of a full-sized car.
 - https://www.youtube.com/watch?v=V9K6gjR07Po
We always make fun of him for this idea as it's one of his strangest. I don't think he ever finished filing the patent.
Physics aside, why would you want to do that in the first place? UVB is the chief cause of skin reddening and sunburn and plays a key role in the development of skin cancer and a contributory role in tanning and photoaging. 
That little benefit of triggering vitamin D synthesis is not worth the increased risk of skin cancer, IMO. And the author lays out the alternative there too: "Those concerned about low vitamin D levels can get more of the vitamin through foods."
Previous HN discussion: https://news.ycombinator.com/item?id=18890475
IMO, there's probably some other causal factor(s), and reducing sunlight exposure is not the solution.
Sunlight exposure is important for health, not just for Vitamin D (which others have pointed out, may just be a proxy for some other factor of sun exposure). It's important for regulating circadian rhythm, as well as preventing myopia in childhood.
Why? It is not known, but even when controlling for physical exercise, vitamin D status, and other factors, the correlation still holds. Some authors suggest other chemicals produced by sunlight exposure, not only vitamin D, might be involved.
I worked on a product that had a UV sensor. It needed to be protected.
Sourcing glass that didn't block UVB, that could be used in a mass-market product, at cost, integrated into a manufacturing line, was a bit of a challenge. The mechanical engineering team eventually got hold of some. For a while, there were weekly status updates of "got another manufacturing sample, spec sheet wasn't quite honest, it blocks some UVB."
Ok, so what is the glass in tanning-booths made of? Or the glass of UVB fluorescent lamps?
This is one of those situations where I think government intervention is needed. I bet the long-term benefits of coating glass like this are very real - both for society and individually (especially in professions that involve a lot of driving time). However, the short-term economic incentives work against it - there is probably a strong first-move disadvantage. Also, what is the economic benefit to a landlord to have UV-proofed glass for their tenants?
But if government were to implement a policy of requiring glass in cars and buildings to be coated like that? That levels the playing field. I doubt it is going to happen any time soon though.
But if we were ever going to do that, I know for a fact that there are also coatings with reflective layers (invisible to us) that tell birds the glass is there, which would also save a lot of wildlife.
As someone who had a number of health issues whose underlying cause was a vitamin D deficiency I do not agree.
There is no substitute for light. Supplementing with D3 had very limited effect. Using a vitamin d lamp however was a world changer within days.
The problem with vitamin D was that up until about a year ago, official recommendations in many countries were mistakenly low (500-1000 IU), which, as it turned out, was not enough for many people. It was even hard to find pills with high enough dosages, because your local pharmacist, going by the official recommendations, stocked only 200-500 IU pills; they were not expecting anyone to need more. So an average citizen who does not do their research properly, who just went into a pharmacy and took the first vitamin D supplement they saw, was most likely getting one with basically no effect.
When the dosage is right, many people experience positive effects and serum vitamin D levels really do go up. The sun is not required.
Also, the science is starting to question the supplementation. It’s too early to tell, but one hypothesis is that it is a biomarker at the end of a pathway and not at the beginning of it, i.e. as a consequence of a proper diet and sunlight, and not a cause of a healthy system.
An analogy would be the analysis of car exhaust, finding certain levels of CO2, and trying to fix cars with "bad exhaust" by throwing more CO2 into the intake (except maybe this would even have a negative effect by hindering combustion).
This is a hilarious (and fitting) analogy for nutrition science.
Reputed and trustworthy citations from scientific sources required, please.
Industrial production of vitamin D3 irradiates the 7-dehydrocholesterol extracted from sheep's lanolin with UVB light.
There is some question as to whether ergocalciferol--a chemical mainly found in fungi that produce ergosterol and have been exposed to UVB light--is biologically equivalent to cholecalciferol in humans. It can alleviate vitamin D deficiency symptoms, but it is not known with certainty whether it can produce a sufficiency. As far as I am aware, the known cases of hypervitaminosis D have resulted from ergocalciferol supplementation, rather than from cholecalciferol.
If you are supplementing with vitamin D3, the chemical you are consuming is identical to that produced in your skin under UVB irradiation. There is no evidence whatsoever that it is destroyed in or poorly absorbed by the human digestive system. If you swallow 15000 IU of vitamin D3, that is the equivalent of standing shirtless in temperate midday sun for 15 minutes, after which time you will achieve no further benefit until some time has been spent absorbing the cholecalciferol and replenishing the 7-dehydrocholesterol in your skin.
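Taking the parent's numbers at face value, the implied synthesis rate is easy to sketch (the 1,000 IU/min rate is just that comment's 15,000 IU / 15 min figure, not a measured constant; real rates vary with latitude, season, and skin type):

    # Back-of-the-envelope from the parent's figure: 15,000 IU ~ 15 minutes
    # shirtless in temperate midday sun, i.e. roughly 1,000 IU per minute,
    # up to the point where skin synthesis saturates.
    IU_PER_MINUTE = 15_000 / 15  # parent comment's estimate, not a measured constant

    for minutes in (5, 10, 15):
        print(f"{minutes:>2} min ~ {minutes * IU_PER_MINUTE:,.0f} IU")

    # Past ~15 minutes the marginal return drops to zero until the skin's
    # 7-dehydrocholesterol is replenished, so "more sun" doesn't mean "more D".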
It is poorly known that if you have to preface a statement with "it is well known", what follows is less likely to be "known" than "unattributably rumored".
"Gravity is real."
"Reputed and trustworthy citations from scientific sources required, please."
You can get less and less ridiculous from here ad nauseam, but the point is made.
And those studies saying otherwise have been backed by a lot of questionable money:
Oh by the way sunscreen is terrible, it doesn't have a proven link to reducing cancer and it destroys reefs so do yourself a favor and get more sun without all the slimy goop so you can feel better AND save the planet!
The Mayo clinic article says:
> Taking 60,000 international units (IU) a day of vitamin D for several months has been shown to cause toxicity.
So 10,000, while well above the "recommended" dosage, is nowhere near the toxic level. Although when I was looking around, the toxic level cited was 20,000. Still, I've seen no guidance regarding vitamin D that suggests 10,000 IU/day can be toxic (especially if your BMI is over 25 and you're not getting much sunlight).
A hypothesis exists that increasing vitamin K consumption in proportion to the vitamin D, and restricting calcium intake, would eliminate the main symptoms of excessive vitamin D supplementation. Nonhuman animal experimentation confirms this, but it has not been tested in humans (likely due to the obvious ethical concerns).
Vitamin D, vitamin K, and calcium are all related.
I have seen guidance specifically for vitamin D2 (ergocalciferol), indicating that adverse symptoms may be observed at lower levels of supplementation than for vitamin D3 (cholecalciferol), but it was quite some time ago, and I don't know if the claim was confirmed or refuted scientifically.
10,000 IU is only 0.25 mg, and vitamins aren't well regulated. It's not like you'd notice if your supplement contained 100,000 IU.
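For anyone double-checking those numbers, the IU-to-mass conversion is mechanical (a minimal sketch; 1 IU of vitamin D = 0.025 µg of cholecalciferol is the standard definition):

    # Vitamin D unit conversions: 1 IU = 0.025 micrograms of cholecalciferol.
    UG_PER_IU = 0.025

    def iu_to_mg(iu):
        """Convert a vitamin D dose in IU to milligrams."""
        return iu * UG_PER_IU / 1000.0

    for iu in (500, 10_000, 60_000):
        print(f"{iu:>7,} IU = {iu_to_mg(iu):.4g} mg")

    #     500 IU = 0.0125 mg  (a typical low-dose pill)
    #  10,000 IU = 0.25 mg    (the dose discussed above)
    #  60,000 IU = 1.5 mg     (the Mayo Clinic toxicity figure)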
Others include NSF, UL, and ConsumerLab.
There's still the chance that a manufacturing mistake could result in bad dosage but that applies to all products including medicine.
This is in cloudy wintertime Seattle. In sunny times it may be completely unnecessary.
Now I try to get my vitamin D through more normal sources by eating lots of UV-exposed mushrooms, salmon, and sardines. Plus side is these foods are all quite healthy even without the vitamin D
To be clear, that's a really useful article; I just want to confirm that your intent is indeed to explain why vitamin D supplements wouldn't work but a UVB lamp would.
It's hard to disentangle all these things.
Vitamin D is something we can measure and often serves as an indicator, but large scale tests have shown limited effectiveness of supplements at resolving the issues a Vitamin D deficiency causes. e.g. We increase blood Vitamin D, but there are unknown correlative mechanisms that we aren't substituting.
So, while we wait (perhaps decades) for nutrition to become a more reliable science, I like to consider it from an evolutionary angle. That which is very dissimilar to what my ancestors experienced, like plentiful sugar or a sedentary lifestyle, is more likely to be bad for my health.
Exposure to the sun for hours a day, including the UV part, was part of our ancestors' lifestyle from the beginnings of our species, until a few decades ago. Now it is possible that this is the exception, and we should keep ourselves shielded from UV at all times even though it was what we evolved for. But...
- we don't know much about our skin microbiome, but we know we have one and it matters, and it seems unlikely that removing all UV would not impact it
- we know there's an impact on vitamin D levels, but the details of that are still unknown
- we know that eczema, psoriasis, and other skin conditions have been rising in recent decades, since the "always wear sunscreen" advice became common
Now, maybe that's all a coincidence, and UV is bad for us, and lack of UV is good for us, regardless of the fact that it's the environment we evolved for. But I'm skeptical.
Compare that to today, where UV exposure mostly means exposing the skin to massive bursts of UV for short times, at best in an effort to build skin pigmentation that isn't sufficient. That's completely different and much more damaging, leading to the general feeling that any UV is bad.
However, the change in exposure is surely true. I'm just not at all convinced that complete shielding from UV is a better choice than occasional UV.
Until recently, children would run around outside a lot during the summer, even in urban environments, and especially in suburban or rural ones.
Choosing the natural over the artificial is a reasonable default where the science is uncertain. But just remember that science has eliminated an awful lot of problems.
Also remember that natural experiences aren't always a la carte. Picking natural foods from disparate regions might be very unnatural, as is getting natural UV light after taking a shower.
Not that we would wish to have a complete absence of antibiotics. But, the attitude of "you have a virus, so take this antibiotic just in case there's also a bacteria", sounds a lot like "block all UV light from hitting your skin, at all times". I am skeptical.
Using soap has a big effect on how the UV light hits your skin, the damage it might cause, and the benefits that it might have. Of course this is all speculation on my part, but I am just warning of the idea of picking one natural thing at a time -- natural things go together.
However, anecdotally, the "mud beggars" at modern Renaissance Faires are said to have healthy skin. It could be a confusion of cause and effect, though.
I think the problem is that the visible light output tends to be harsh, the bulbs are highly inefficient, they are expensive, and they don't last very long either.
So there would be a lot of problems to solve, not the least of which is the cultural issue you raise of scaremongering around UVB exposure in the first place.
I never use it for more than 1 minute and I alternate between facing away and towards it. For the first month I used it every day, now more like every other day or every two days.
Most other full-spectrum lamps I tried at the time on Amazon didn't even come close.
If you never tried, working with these lamps is weird: turning it on in the evening or night by mistake _really_ wakes you up.
So yeah, I hope Androv kept up with the quality.
Can I ask the difference? I'm assuming a typo, but haven't experienced sun shortage for at least six years, so I'm not sure on boom / bust cycle potential with these things.
Using once every two days = use one day, then not use it the day after.
Perhaps you meant "once every three days", i.e. use it one day, then not use it for two days?
Jokes aside, skin is very photo-active and I do think lamps can help greatly against these kinds of deficiencies.
Most lamps emit a line spectrum though, so it is not like real sunlight. I wonder if that is the reason people going to solaria often seem to have weird skin tones. Ask a dermatologist and don't forget eye protection.
Apart from that it has the same side effects as sun exposure. Too much leads to skin aging, sunburn, cancer, etc.
I dabbled in light therapy and found the first lamp I tried very effective. However, I also started receiving comments from my colleagues about my nice new facial tan, in the dead of winter. I freaked out, because the lamp was in my field of view 30 minutes per day, every day, without any eye protection, and I sent the lamp back to where I bought it. Subsequently I have tried three different LED-based lamps and they seem to have no effect. I have carefully measured the light output and placed myself at the proper distance to get the requisite 10,000 lux in all cases. No dice.
Here's the kicker: that lamp appears to be the one used in some of the light-therapy research (that's why I got it). It is entirely possible that much of the light-therapy research was contaminated by badly designed lamps. There is no FDA certification for any of this as far as I know.
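On the "requisite 10,000 lux at the proper distance" point, distance matters more than people expect. A rough sketch using the inverse-square law (a point-source approximation, which is crude for a large panel; the 10,000 lux at 0.3 m reference spec is assumed here, not taken from any lamp mentioned above):

    # Estimate illuminance vs. distance, treating the lamp as a point source.
    REF_LUX = 10_000.0   # assumed spec: 10,000 lux...
    REF_DIST_M = 0.3     # ...at 0.3 m

    def lux_at(distance_m):
        return REF_LUX * (REF_DIST_M / distance_m) ** 2

    for d in (0.3, 0.5, 0.8):
        print(f"{d:.1f} m: ~{lux_at(d):,.0f} lux")

    # 0.3 m: ~10,000 lux
    # 0.5 m: ~3,600 lux
    # 0.8 m: ~1,406 lux

Sitting even a little too far back cuts the dose sharply, so distance alone can swamp differences between lamps.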
 Carex Day-Light Classic Plus Bright Light Therapy Lamp. Supposedly UV-filtered.
Also, the lamp could be putting out UVA and not much UVB, so it could tan without producing much vitamin D.
For just the vitamin D, I'd get a "5.0" UVB lamp in a bare-bulb fixture, designed and sold for indoor avian and reptile pets. Aim at arms and hands, and away from the eyes. They're about $15 per bulb, and should be replaced every 6 months, as the UVB output degrades over time. Afterward, they work perfectly fine as compact fluorescents, behind glass or under lampshades, for several more months, albeit shaped inconveniently for some fixtures.
A "10.0" bulb is for desert-species pets. The UV output is too high for the overall lamp brightness, and may damage your eyesight, as human pupils don't constrict enough. I suppose you could wear sunglasses, but that seems counterproductive to just buying a lower-UV lamp.
I'm pretty sure the answer to that is that they do not.
Question, what were your health issues and what changes did you see that quickly?
I was supplementing with a lot of magnesium. Sleep returned very quickly, my need for supplementing magnesium also dropped in days.
It has been like a permanent +5 to constitution; I am now almost through my second winter without getting sick, something inconceivable for me as a kid growing up with asthma and a whole range of puffers, always prone to debilitating sinus infections that took hold in January and never really let up until sometime around April.
I am at 43° latitude, so I only start taking it in mid-November and start weaning myself off it in late March (you never really know if/when spring will happen up here in Toronto :)
Which one do you recommend?
> Supplementing with D3 had very limited effect.
Non sequitur. Pills are not food.
Evolution will happily incorporate an abundant resource into one biological process even if it is damaging to another. Heck, even if it is damaging to the same process it is being incorporated into.
Evolution is short-sighted in a way that is hard to overstate. It exclusively cares about the current environment, it can only consider alternatives when they have actually been implemented, it can't look ahead to see if a chain of changes is really desirable, and it can't even look behind to determine if a previous design was better-suited to the current environment.
A better analogy to the way evolution operates using an engine would be: imagine that you first develop an engine that runs on a pure fuel, but then for a long span of time you can only get an inferior, highly contaminated fuel that produces a corrosive acid while it burns that slowly destroys the engine from the inside. You make a few tweaks to mitigate (but not stop) the damage from the acidic residue, but then you discover that by adding an additional afterburner with a special additive, you can get an extra 8% output from the engine. But then later when the pure fuel is available again, you discover that running the engine with the afterburner on a pure fuel will make it explode in short order. It's too late to redesign the engine without the afterburner -- most of the engines currently in existence suffer from the problem -- but it's pretty easy to re-contaminate the pure fuel so that it continues to produce the acidic residue that is both destroying the engines and expected by the afterburners. Meanwhile, most people still don't want to buy a pure-fuel engine without the afterburners since they have a lower output, and you really don't need most engines to last forever anyway.
That's the sort of bullshit you deal with in an evolved system.
This isn't quite accurate; while some evolutionary pressures are indeed exclusively short-term, not all are.
For instance, the ability of eukaryotes to carry multiple alleles of individual genes, combined with sexual reproduction, allows populations to maintain a large diversity of alleles and genes, which has the effect of allowing large-scale "split testing" and also "alternatives" for different environments that may appear over time.
It seems like you'd like for evolution to have some model, so it can test old designs in the current environment, and simulate forward the current design into future environments, without needing to burn an organism and time on the test. But how could that be possible?
Modern men call it: "There is no free lunch!"
Plus, nowadays with tech/medical advances, evolution doesn't play the same role as 10k years ago. You can be obese, diabetic, lactose intolerant, lose a kidney and a lung, and still have kids, and your life expectancy will be miles ahead of our ancestors hunting antelopes with spears and pointy stones.
Characteristics ("traits") which are potentially not helpful for direct reproductive success but have an impact on the success of the group of people you're a part of are conserved through evolution. However, the exact formulation of this idea (known as kin selection or group selection) is still quite controversial  among evolutionary biologists.
My point was, something can be good at some point (vitamin D production) and turn bad in the long term (increased cancer rate), but being ultimately bad doesn't mean evolution failed.
Or just designed for another purpose.
It appears not - http://www.bbc.co.uk/earth/story/20150420-why-do-women-have-...
> Apart from humans, most of the other menstruating animals are primates, the group that includes monkeys and apes as well as humans. Most monkeys living in Africa and Asia, such as rhesus macaques, menstruate.
> menstruation also evolved independently in two other groups: some bats and elephant shrews.
Not that any of this disproves the "evolution hackery" suggestion, mind.
I mean, UVB at least you can protect against with simple glass. But something you find traces of in most people's blood is no joke.
We should suppress it completely. Or at least ban it from schools.
Required for us to stay alive, yet ultimately kills us.
The media tend to represent everything as black and white, so it's easy to get confused, especially since people will tell you one thing and then the contrary depending on the fashion of the day.
It's always nice to have those moments, when you can highlight that reality has nuances. Let's encourage people asking for answers, we don't have enough of them.
And yes, vitamin D is required, not UVB. You can acquire vitamin D by eating organisms full of it. But the most efficient way is still to get exposed to sunlight. It's partly why some get depressed in the winter.
The sun is not "bad for the skin" any more than cold water is bad for your health. But stay in it too long and you may die.
There is some debate about this now, while sunlight certainly produces vitamin D, that's certainly not all that it does. This article showed up here recently:
E.g., I just bought a light to help with the winter blues. It does not provide any UVB, so the method of action is something else.