This is really neat. I've got a full-frame camera with a sensor that has very good dynamic range, and I'd learned to under-expose and then pull detail from the shadows in post-processing, since no information can be recovered from blown highlights. This sort of flips that on its head, except the featured overexposed shot didn't actually have blown highlights; it just looked like it did. Instead it held a wealth of low-light data.
And speaking of SNR, there are a few tricks astrophotographers have been using for years that occasionally find their way into mainstream photography. One is image stacking, where you combine multiple photographs of the same scene into a single image (useful for landscapes, if a little blur isn't a problem for you). This pushes the shot noise (the random component of the noise) down even further. I've gotten clean images out of cheap cameras, and even from cheap cell phone cameras, given enough images.
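Stacking by simple mean-averaging can be sketched in a few lines of numpy. This is a toy illustration (frame size, signal level, and noise level are all invented), using Gaussian noise as a stand-in for shot noise:

```python
import numpy as np

rng = np.random.default_rng(42)

# A synthetic "scene": constant signal per pixel (arbitrary units).
scene = np.full((100, 100), 50.0)

# Simulate 16 noisy exposures of the same scene. Real shot noise is
# Poisson; Gaussian is close enough for this illustration.
frames = [scene + rng.normal(0, 10, scene.shape) for _ in range(16)]

# Average the (already aligned) frames: random noise drops by a
# factor of sqrt(16) = 4, while the signal is unchanged.
stacked = np.mean(frames, axis=0)

print(round(np.std(frames[0] - scene), 1))  # single-frame noise, ~10
print(round(np.std(stacked - scene), 1))    # stacked noise, ~2.5
```

Real stacking software also has to align the frames first (stars drift between exposures), which is where most of the actual work lies.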
The other technique is dark frame subtraction, where you take a photo with the lens covered at the same ISO/shutter/aperture as the rest of the set. Digital sensors have readout noise and heat noise that tend to be very predictable. Subtracting this noise gives you a cleaner shot as well.
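The subtraction itself is trivial; here's a minimal sketch with an invented fixed-pattern component standing in for the predictable readout/heat noise:

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed-pattern noise: hot pixels and readout structure that repeat
# from shot to shot (values invented for illustration).
fixed_pattern = rng.exponential(2.0, (100, 100))

scene = np.full((100, 100), 30.0)

# A light frame records scene + fixed pattern (random noise omitted
# here to keep the example focused on the predictable part).
light_frame = scene + fixed_pattern

# A dark frame (lens cap on, same ISO/shutter/aperture) records
# only the fixed pattern.
dark_frame = fixed_pattern.copy()

# Subtracting the dark frame removes the predictable component.
clean = light_frame - dark_frame

print(np.allclose(clean, scene))  # True
```

In practice you'd average several dark frames first, so the dark frame's own random noise doesn't get added back into the result.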
On a side note, I do wish when talking about ETTR and even digital photography in general that the term "overexpose" wasn't used when explaining the technique of overriding the camera's default metering with exposure compensation. Overexposure would be blowing the highlights, or at least the highlights of details that the photographer considers important (e.g. blowing the highlights of street lights is usually okay). The "over" portion makes it sound like the photographer is doing too much of something, when in reality ETTR is a technique of giving enough exposure, but not too much.
Since you seem to have experimented with this, I'd like to ask: don't you need a certain "minimum" time to capture the faintest of stars, because their signal otherwise wouldn't rise above the noise floor? I've been curious about this with stacking in astrophotography for a while.
> The other technique is dark frame subtraction, where you take a photo with the lens covered at the same ISO/shutter/aperture as the rest of the set. Digital sensors have readout noise and heat noise that tend to be very predictable. Subtracting this noise gives you a cleaner shot as well.
For long exposures this heat noise becomes more problematic: long exposures heat up the sensor quite a bit. Have you tried building your own customised electric coolbox to keep temperatures down? It can be a fun project!
I actually used this in a different context: during long Skype sessions at home with my people abroad, my phone heats up a lot and the video would start to stutter. Putting it on a cooling block actually fixes that. This had more to do with the system not having to scale back CPU performance to let the phone cool down, but a side effect, according to people on the other end of the line, was that the image quality improved too.
I'm only guessing, but there must be a minimum time for the exposure; stacking won't change that. My assumption (and I'm not a statistician) is that the noise floor moves lower with stacking. Taking multiple images and stacking them is basically taking multiple samples: the random noise (Poisson shot noise) averages out across the images, while the signal (the star) is constant across them.
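That intuition can be demonstrated numerically: a constant signal plus fresh random noise per frame, averaged over more and more frames, keeps the signal while the residual noise shrinks as 1/sqrt(n). All numbers here are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

signal = 5.0        # constant star signal (arbitrary units)
noise_sigma = 20.0  # per-frame random noise, swamping the signal

for n in (1, 4, 16, 64):
    # n frames of 100,000 pixels: signal + fresh random noise each time.
    frames = signal + rng.normal(0, noise_sigma, (n, 100_000))
    stacked = frames.mean(axis=0)
    # Residual noise shrinks as 1/sqrt(n); the signal stays put.
    print(n, round(stacked.std(), 1))
```

With 64 frames the noise is down to about a quarter of the single-frame level, so a star at a fifth of the single-frame noise floor becomes detectable.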
As you can see from the linked article, there's an amazing amount of detail that can be recovered from the shadows if you know how to get to it. In a dark sky environment, you'd be limited to the camera's sensitivity (the analog gain applied using the ISO setting), but in the city environment, almost certainly the sky light is the limiting factor.
I have not experimented with custom cooling. Aside from some moon photos and photos of the recent partial (in my area) solar eclipse, I'll leave astrophotography to others.
I'm not an expert or even very knowledgeable on image stacking, but I'll share my experience.
I've taken a stacked picture of the Orion constellation. 15 exposures at 5 seconds each, and combined them to get this photo:
In none of the individual exposures is the Orion Nebula visible, but in this photo it's pretty clear. So in the end I'm fairly sure that your statement is true, but the signal required is very, very faint.
Ok, just for fun: I'll try to make a quick back-of-an-envelope calculation and see if we can make an educated guess...
Let's use the old Sunny 16 rule, and its equivalent for moonlight, which also gives us an estimate for difference in light between day and night:
> Daylight (sunny day): Correct exposure for this case is given by the Sunny 16 Rule: 1 / ISO [seconds] @ f/16. So at ISO 100, a typical exposure is 1/125 @ f/16 (that's 1/3 stop less exposure than the rule calls for, but it's the closest standard shutter speed.)
> Full moon: to get an equivalent exposure at night with a full moon, the rule is: 1 / ISO [days] @ f/4. That's right, DAYS. For ISO 100, that means 1/100 of a day @ f/4.
> That works out to something like 14.7 minutes; I just round it up to 15 minutes. So 15 minutes @ f/4, or FOUR HOURS @ f/16.
> That's 21 stops difference from sunlight to moonlight.
So on a sunny day you need f/16, 100 ISO, shutter speed 1/100.
flickr says you shot at f/5.6, unknown ISO (I'll go with quasi-conservative 6400 ISO), 5 seconds.
Aperture increased by 3 stops (f/16 → f/5.6; exposure scales with the f-number squared)
ISO increased by 6 stops (100 → 6400)
500x longer shutter time => log2(500) ≈ 8.97 => a bit less than 9 stops
So that's roughly 18 stops of extra light sensitivity.
That's three stops short of the 21 stops mentioned above. However, we're not aiming for a properly exposed single picture, we're aiming for the minimum exposure needed to capture a signal!
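For anyone who wants to double-check, the stop arithmetic can be recomputed directly (ISO 6400 is the assumed value from above; the aperture term carries a factor of 2 because exposure scales with the f-number squared):

```python
from math import log2

# Sunny 16 baseline: f/16, ISO 100, 1/100 s.
# The shot (per flickr): f/5.6, assumed ISO 6400, 5 s.
aperture_stops = 2 * log2(16 / 5.6)   # ~3 stops
iso_stops = log2(6400 / 100)          # 6 stops
shutter_stops = log2(5 / (1 / 100))   # ~9 stops

total = aperture_stops + iso_stops + shutter_stops
print(round(aperture_stops, 2), iso_stops, round(shutter_stops, 2))
print(round(total, 1))  # ~18 stops total
```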
Even the first digital SLRs had a dynamic range of five to six stops, so theoretically they should have been able to capture something in the shadow regions. Nowadays ten stops is about the norm, I believe, with some of the better cameras going up to twelve. So that should be clearly above the minimum signal required.
Also, the moonlight-rule above is very conservative: it's about exposing a scene in moonlight, not about capturing stars. Stars are light sources themselves, and probably brighter than objects reflecting moon light, even in a full moon. Also, the rule-of-thumb was based on film, which actually suffered from something called "reciprocity failure" so it might exaggerate the required exposure.
So yes, this should work out fine, and as your picture shows it clearly does :)
I feel that this is worth expanding on.
Because a raw file isn't actually image data but just straight readouts from the sensor, it doesn't make sense to show a histogram of the raw data; the data has to be developed into an image first. The histogram in the camera shows the lightness values of the JPEG that the in-camera image processor produces from the raw file, but that processor is usually not as advanced as desktop software: it's probably running older image processing algorithms and has to worry about battery life, among other things.
So generally what this means is that if your camera is telling you that parts of your image are over or under exposed and therefore clipped, it might still be possible to recover those parts of the image in a more powerful image processor. It's also worthwhile, if you've got really old raw files, to go back and reprocess them with modern software. Raw file processing has come quite a long way in the past decade!
Think back to when digital photography was just starting, when we were stuck with 10-bit RAW images (before processing them to 8-bit JPGs). That's 1024 different values per channel, not a whole lot of room to begin with. Because the sensor signal is linear while stops are logarithmic, the highest stop in the image takes up 512 of those values.
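The halving per stop is easy to tabulate; here for the 10-bit linear case described above:

```python
# A 10-bit linear RAW file has 1024 discrete levels per channel.
# Because the sensor response is linear and each stop is a halving
# of light, each stop down from saturation covers half as many
# discrete levels as the stop above it.
levels = 1024
for stop in range(1, 8):
    hi = levels // 2 ** (stop - 1)
    lo = levels // 2 ** stop
    print(f"stop {stop}: levels {lo}..{hi - 1} ({hi - lo} values)")
```

The top stop alone spans 512 values, while by the seventh stop down you're left with only 8, which is where the banding in the shadows comes from.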
EDIT: As rightfully pointed out in a reply, 8-bit JPGs are not in a linear colour space any more, so this logic does not apply there. (However, JPG compression is harsher on shadow details: it's lossy, and it tries to move the loss of information to the parts of the image where you're less likely to notice, like the shadows. But we're digressing.)
By that logic, the low-light data should show a lot more banding than the brightly exposed data. Try it with the levels tool on a RAW file and see for yourself.
So you can imagine that when the dynamic range of (part of) the scene that you want to capture is less than the dynamic range of your sensor, you want to be on the right edge of the histogram (all else being equal - if this causes motion blur due to long exposure it's pretty useless)
Caveat: this is potentially outdated knowledge, perhaps the analog/digital converter in modern cameras actually converts the voltage of a CCD to a (pseudo)logarithmic scale for the digital output, which might partially circumvent this issue (although the sensor is still linear). And nowadays many of the better cameras have 14-bit A/D converters - the usefulness of which also largely depends on the quality of the sensor - so there's a lot more detail in the shadows than before.
The same thing holds true for a processed RAW image as well.
I guess there's also the visual "symmetry" with JPG or JPEG.
When you say processed RAW, you mean after importing to image editing software, right? Which implies you wouldn't see much of a difference in posterisation in the high and low exposure areas
If you want a good website in general for explaining photography techniques plus the physics behind it, I highly recommend Cambridge in Colour.
By the way, per the example in the second linked essay: if we try to capture 10 stops in 12-bit channels, the darkest stop only has 32 discrete values available in the RAW file.
However, this might be somewhat less of an issue these days because most cameras have a really high resolution, and most images are resized for screen displays. So we can average out some banding (and image noise, for that matter).
Example: I have a 20 megapixel camera. A 1080p screen is just 2.1 megapixels. So even for a full-screen export, that's roughly ten "source pixels" per "output pixel". Which would mean that the output pixel, being an average of about ten source values, could take on around 300 different values. The distribution won't be like that of course, but it should still help.
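A toy numpy sketch of that averaging effect (the level count and block size are invented for illustration): coarsely posterised values gain many intermediate levels once blocks of them are averaged down.

```python
import numpy as np

rng = np.random.default_rng(3)

# Deep-shadow data posterised to just 4 discrete RAW levels (0..3).
src = rng.integers(0, 4, size=(1000, 10))

# Downsampling for display: average every block of ten "source
# pixels" into one "output pixel", as a resize effectively does.
out = src.mean(axis=1)

print(len(np.unique(src)))  # 4 distinct levels before
print(len(np.unique(out)))  # far more distinct levels after averaging
```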
Mandate the replacement of all outdoor night-time illumination with LEDs that are pulse-width modulated at a low duty cycle. Synchronize them all to an accurate global clock (e.g. from a GPS receiver), so that for instance, all of the lights are simultaneously turned on for the first tenth of each UTC millisecond. Then an image sensor with a sufficiently fast global shutter could disable itself during every brief pulse of light, so that it picks up 90% of the incoming starlight, but almost none of the light pollution.
More stars, less energy consumed. Win, win.
This would make the usual 'light cloud' over towns and villages practically disappear. I guess this is a remnant from times when the more light there was, the better (i.e. look at our mighty civilization and its advances, illuminating whole valleys and soon the whole earth!). Now it makes it pretty hard to find unpolluted places in western Europe, even in the Alps.
I can be on the Italian side of the Alps, and I still get orange blobs from Switzerland or France, even across a continuous wide range of frikkin' 4000m peaks!
Anyway, how's the light pollution been on your Africa trip? You must have been through some seriously dark remote areas by now.
Considering the mayor comes to every smallish local event just to remind people of his name, he may be interested in something that would make a bunch of people praise him for little effort.
All that light scattered in the atmosphere and reflecting off other things would still be around when the LEDs are off.
The poster proposed a 90% duty-cycle for capturing the image (i.e. shutter open for 90% of the time, shutter closed for 10% of the time).
Since I'm based in the UK, let's assume the simplest synchronization mechanism - the 50Hz mains signal. (I appreciate this probably isn't perfectly synchronized.)
That means that the shutter is open for 18ms and closed for 2ms, during which we want to have our light pulse.
Let's assume that we have a 1ms flash and we allow 0.5ms for synchronization error and 0.5ms for light from the flash to dissipate.
In 0.5ms, light can travel approximately 150 km. Most of Earth's atmosphere (and hence most of the scattering) is within ~15km of the surface. As I understand it, reflection takes very little time (less than 1us), and each reflection causes energy loss (and hence less brightness).
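Those timing figures are easy to sanity-check in code (every number here is one of the assumptions above, not a measured value):

```python
# Rough timing budget for the synchronised-lights idea.
c_km_per_ms = 299_792.458 / 1000  # speed of light, km per millisecond

pulse_ms = 1.0       # LED flash duration
sync_error_ms = 0.5  # allowance for clock error
dissipate_ms = 0.5   # time for stray reflections to die out

# The whole budget fits exactly in the 2 ms shutter-closed window.
print(pulse_ms + sync_error_ms + dissipate_ms)  # 2.0

# Distance light travels during the dissipation window.
travel_km = c_km_per_ms * dissipate_ms
print(round(travel_km))  # ~150 km

# Most scattering is within ~15 km of the surface, so the window
# covers on the order of ten atmosphere-height path lengths.
print(round(travel_km / 15))  # ~10
```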
I think there are engineering details to work out, but I'm not convinced that reflections or scattering are a significant problem.
I think the bigger problem is that it just doesn't seem worthwhile - how much would it cost to do this, and how much do people care?
As an aside, I wonder what the effect (on humans, wildlife, etc.) of regular 1ms-long pulse of 20x the average brightness would be - maybe it's fine, maybe not?
This article uses a 20-second exposure, so it wouldn't be helped by PWMing the lights.
Also, you'd have to do it much faster than 100Hz. Wave Christmas LED lights around and you'll see a strobe effect.
People claim to feel (not see) 100-120 Hz flicker (I wouldn't dismiss them offhand).
Finally, you'd need a powerful lighting LED able to toggle on/off at fast enough speeds. That isn't obviously available, if for no other reason than large internal capacitances.
As for synchronizing it, that's not too bad. I don't think you'd need a GPS clock. Use an accurate crystal clock and the mains as a 60*3 = 180 Hz reference. Turn the light off every 0, 120 and 240 degrees to cover all three phases.
This entire issue is already solved, and works fine.
You'd be fighting the current.
Responsible outdoor lighting should:

- Only be on when needed
- Only light the area that needs it
- Be no brighter than necessary
- Minimize blue light emissions
- Be fully shielded (pointing downward)
Light pollution is (IMO) driven by lack of knowledge of the problem & solutions, a perception that lights = wealth/luxury, and the low cost of electricity.
But engineering the light sources to be astronomy-friendly and nature-friendly in general seems like a good idea. Basically limit things to the red-orange part of the spectrum. It worked for sodium lights.
So that's 3 different offsets, depending on which phase any given light is wired into.
Plus that's probably going to be slow enough to cause visible flicker.
But IMHO, wrt incandescents they still suck.
It's all about exposing for the stuff you care about. When imaging DSOs (deep space objects), you often don't care about the relatively bright stars nearby. But you do care about that faint little object in between. So you adjust exposure so the stars are usually blown out (overexposed), but the DSO along with the light pollution background have peeled off the left hand edge of the histogram.
So now your DSO is sitting comfortably in the middle of the histogram, where you can process and extract it easily.
In practical terms, for DSOs you tweak exposure until the big light pollution hump on the histogram becomes disconnected from the left hand edge of the graph. Turn exposure up gradually until there's a gap at the left hand wall. That usually does it. Overexposure is bad, but underexposure is super-duper-evil, so you must avoid it. OTOH, don't push things too far to the right - you still want some chroma info in the star halos.
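The "gap at the left hand wall" check can be automated. This is a hypothetical sketch (the function name, threshold, and bin count are my own choices), assuming an 8-bit luminance array:

```python
import numpy as np

def exposure_ok(pixels, gap_threshold=0.001):
    """Return True if the histogram has pulled away from the left edge.

    `pixels` is an 8-bit luminance array; exposure is "OK" when almost
    nothing sits in the bottom few levels (no left-wall clipping).
    """
    hist, _ = np.histogram(pixels, bins=256, range=(0, 256))
    low_fraction = hist[:4].sum() / pixels.size
    return low_fraction < gap_threshold

# Underexposed: lots of pixels crushed against 0.
under = np.clip(np.random.default_rng(7).normal(3, 5, 100_000), 0, 255)
# Better: the light-pollution hump sits clear of the edge.
ok = np.clip(np.random.default_rng(7).normal(60, 15, 100_000), 0, 255)

print(exposure_ok(under.astype(np.uint8)))  # False
print(exposure_ok(ok.astype(np.uint8)))     # True
```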
Also, while I haven't tried this, I'd think that using an ND filter in light-polluted areas like this could help a little bit with astrophotography.
Googling for an example, amusingly enough, the first link is to the same site:
That filter is not specifically for astrophotography, just one that happens to work well. Here's a discussion of other options:
Since there's a fixed level of sensor read noise added after the analog gain stage, increasing the gain amplifies the signal in the shadows relative to that noise, essentially increasing SNR. This isn't true for all cameras, however.
This chart [http://www.photonstophotos.net/Charts/PDR_Shadow.htm#Canon%2...] demonstrates the difference between three cameras: one which is known to be largely "ISO invariant" (the Nikon D750), one which is not (the Canon 6D), and one which looks like it has a 'native' ISO of 800 (the Sony A7R II).
As you can see, the D750 doesn't gain much (heh) from increasing ISO -- the total image noise in the shadows isn't improved by increasing the ISO through the range. If you're shooting a 6D in a low-light situation, however, you see improvements in shadow quality up to about ISO 3200, after which it turns out that you will be fine if you just raise the shadows in post. If you're shooting an A7R II you might as well just leave the ISO dial on 800 all the time.
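A toy model of why cameras differ here (all noise figures are invented, not measured from any real body): noise added before the analog gain is untouched by ISO, while noise added after it shrinks, relative to the signal, as the gain rises.

```python
import math

def shadow_noise(gain, pre_noise, post_noise):
    """Total noise referred back to the sensor input, in electrons.

    pre_noise is added before the analog gain stage (unaffected by
    ISO); post_noise is added after it (ADC etc.), so its input-
    referred contribution is divided by the gain.
    """
    return math.hypot(pre_noise, post_noise / gain)

# Hypothetical "ISO-variant" camera: big downstream (ADC) noise.
for iso in (100, 400, 1600, 6400):
    gain = iso / 100
    print(iso, round(shadow_noise(gain, pre_noise=3.0, post_noise=12.0), 2))
```

The printed noise falls quickly at first and then flattens out, which is the "improvements up to about ISO 3200" shape; an ISO-invariant body is one where post_noise is already negligible, so the curve is flat from the start.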
That's not all, though, since it turns out when you increase ISO you basically lose 1EV of highlight off the top of your image, as you allude to, and you also lose around 1EV of DR. (Chart here: http://www.photonstophotos.net/Charts/PDR.htm#Canon%20EOS%20...)
It is, like all things in life, a balance.
This will change in the next few years.
If you're the kind of guy who wants to read the relatively dry writings of someone who spends a whole lot of time figuring this stuff out, the following link might interest you: http://clarkvision.com/articles/digital.sensor.performance.s...
Some ancient clusters of stars have drifted a fair bit away from the disc, but are still gravitationally bound to it and its dark matter. If you were in another galaxy looking from above, our galaxy would look like a spiral with arms. Edge-on, it would look more like a cigar.
On a note about light pollution, the mention of sodium street lamps immediately made me think of filters - stars are suns, so should have wide spectra - why not just filter out the orange bit? Apparently I'm not the first with the idea (obviously):
https://petapixel.com/2016/12/14/purenight-filter-cuts-light... (has some nice with/without filter images)
Still, I'm sure one could filter out the peaks from the LEDs (they are still phosphor-based), but it's not going to be as effective as taking out the sodium notch.
Now they are on 24/7.
Incredibly wasteful, although I suppose it is safer to go about one's business at night.
It certainly gave the opportunity to view the stars even if one lived in city areas.
I thoroughly enjoyed being able to simply look up and gaze at the glorious Milky Way unaided.
In other words, if he shot this on his 'favourite' Rokinon 24mm, an exposure time of 20 seconds is reasonable to avoid all star trails.
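That figure is presumably based on the well-known "500 rule": divide 500 by the full-frame-equivalent focal length to get the longest exposure, in seconds, before stars visibly trail. A minimal sketch:

```python
def max_exposure_500_rule(focal_length_mm, crop_factor=1.0):
    """Longest exposure (seconds) before star trails, per the 500 rule."""
    return 500 / (focal_length_mm * crop_factor)

# A 24mm lens on a full-frame body:
print(round(max_exposure_500_rule(24)))  # ~21 seconds
```

On a crop sensor you'd pass the crop factor too (e.g. 1.5 for APS-C), which shortens the allowed exposure accordingly.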