A method for improving Milky Way exposures in light pollution (lonelyspeck.com)
275 points by sndean 5 months ago | 91 comments



> Shouldn’t the image taken at ISO 6400 show more noise than the image taken at ISO 1600? If the exposures were the same brightness, yes. But these images were not taken with the same exposure. The first image started with twice as much shutter open time (30 seconds versus 15 seconds) and two stops more gain on the sensor (ISO 6400 vs ISO 1600). The result is that the first image has significantly more light data. This makes the final signal-to-noise ratio of the processed images higher on the first image. The higher the signal to noise ratio, the less noisy the resulting photo. So in this case it was actually better to overexpose and compensate later in post processing, despite the initial unprocessed images appearing unusable.

This is really neat. I've got a full frame camera with a sensor that has very good dynamic range and I'd learned to under-expose and then pull detail from the shadows in post processing as information can't be saved from blown highlights. This sort of flips that on its head, except the featured overexposed shot didn't have blown highlights, it just looked like it did. Instead it had a wealth of low-light data.


What I've learned over the years with digital cameras -- you can't trust the preview image. The histogram tells much more of the story (if not the whole truth). My photographs have become much better (in a technical sense) since I learned to expose almost to the point of blowing highlights, leaving as little blank space as possible at the right of the histogram, where the highlights are displayed (hence the term Expose To The Right). As someone downthread stated, this improves the signal-to-noise ratio of the captured data. There's an unwritten assumption that the camera's built-in auto-exposure capability isn't sophisticated enough to expose the scene properly, and that the photographer has to take control. This is often the case.

And speaking of SNR, there are a few tricks astro-photographers have been using for years that occasionally find their way into mainstream photography. One is image stacking, where you combine multiple photographs of the same scene (useful for landscapes if a little blur isn't a problem for you) into a single image. This pushes the shot noise (the random component of the noise) down even further. I've gotten clean images out of cheap cameras, and even out of cheap cell phone cameras (given enough images).
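If you want to experiment, a mean stack is only a few lines of numpy; a minimal sketch, assuming a set of already-aligned frames (the file names are made up):

    import numpy as np
    from PIL import Image

    # Load a set of already-aligned exposures of the same scene
    # (hypothetical file names; real stacking also needs alignment).
    frames = [np.asarray(Image.open(f"frame_{i:02d}.png"), dtype=np.float64)
              for i in range(15)]

    # Averaging N frames leaves the signal alone but shrinks the random
    # (shot) noise by roughly sqrt(N).
    stacked = np.mean(frames, axis=0)

    Image.fromarray(np.clip(stacked, 0, 255).astype(np.uint8)).save("stacked.png")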

The other technique is dark frame subtraction, where you take a photo with the lens covered at the same ISO/shutter/aperture as the rest of the set. Digital sensors have readout noise and heat noise which tend to be very predictable. Subtracting this noise gives you a cleaner shot as well.
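Again a rough sketch (hypothetical file names; a real "master dark" is usually an average of several dark frames so you don't add noise back in):

    import numpy as np
    from PIL import Image

    # Light frame: the actual exposure. Dark frame: same ISO/shutter/aperture,
    # lens cap on.
    light = np.asarray(Image.open("light.png"), dtype=np.float64)
    dark = np.asarray(Image.open("dark_master.png"), dtype=np.float64)

    # The fixed-pattern component (readout noise, hot pixels, thermal glow)
    # appears in both frames, so subtracting the dark removes it.
    clean = np.clip(light - dark, 0, 255).astype(np.uint8)
    Image.fromarray(clean).save("clean.png")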

On a side note, when talking about ETTR and even digital photography in general, I do wish the term "overexpose" weren't used when explaining the technique of overriding the camera's default metering with exposure compensation. Overexposure would be blowing the highlights, or at least the highlights of details that the photographer considers important (e.g. blowing the highlights of street lights is usually okay). The "over" portion makes it sound like the photographer is doing too much of something, when in reality ETTR is a technique of giving enough exposure, but not too much.


> One is image stacking

Since you seem to have experimented with this, I'd like to ask: don't you need a certain "minimum" time to capture the faintest of stars, because their signal otherwise wouldn't rise above the noise floor? I've been curious about this with stacking in astrophotography for a while.

> The other technique is dark frame subtraction, where you take a photo with the lens covered at the same ISO/shutter/aperture as the rest of the set. Digital sensors have readout noise and heat noise which tend to be very predictable. Subtracting this noise gives you a cleaner shot as well.

For long exposures this heat noise becomes more problematic: long exposures heat up the sensor quite a bit. Have you tried building your own customised electric coolbox to keep temperatures down? It can be a fun project!

I actually used this in a different context: during long Skype sessions at home with my people abroad, my phone heats up a lot and the video would start to stutter. Putting it on a cooling block actually fixes that. This had more to do with the system not having to scale back CPU performance to let the phone cool down, but a side effect, according to people on the other end of the line, was that the image quality improved too.


I don't actually know the answer to your question. My experience with image stacking is daytime landscape photography. I've found that stacking can greatly improve the shadow detail. The ETTR technique gives nice blue skies, and restrains other highlights, but naturally, with shorter exposure times, the darker portions of the scene get less exposure than they would if you allowed the skies to blow out.

I'm only guessing, but there must be a minimum time for the exposure. Stacking won't change that. My assumption (and I'm not a statistician) is that the noise floor moves lower with stacking. Taking multiple images and stacking them is basically taking multiple samples. The random noise (Poisson shot noise) is averaged out across multiple images, while the signal (the star) is constant across the images.
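A toy simulation of that intuition, assuming pure Poisson shot noise on a faint, constant signal (numbers are illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    signal = 9.0  # mean photons per pixel per frame from our faint "star"

    for n in (1, 4, 16, 64):
        # Average n frames of Poisson-noisy samples of the same signal.
        frames = rng.poisson(signal, size=(n, 100_000))
        stacked = frames.mean(axis=0)
        snr = stacked.mean() / stacked.std()
        print(f"{n:3d} frames: SNR ~ {snr:.1f}")  # grows roughly as sqrt(n)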

As you can see from the linked article, there's an amazing amount of detail that can be recovered from the shadows if you know how to get to it. In a dark sky environment, you'd be limited to the camera's sensitivity (the analog gain applied using the ISO setting), but in the city environment, almost certainly the sky light is the limiting factor.

I have not experimented with custom cooling. Aside from some moon photos and photos of the recent partial (in my area) solar eclipse, I'll leave astrophotography to others.


> don't you need a certain "minimum" time to capture the faintest of stars, because their signal otherwise wouldn't rise above the noise floor?

I'm not an expert or even very knowledgeable on image stacking, but I'll share my experience.

I've taken a stacked picture of the Orion constellation. 15 exposures at 5 seconds each, and combined them to get this photo:

https://www.flickr.com/photos/abliskovsky/31756623694/

In none of the individual exposures is the Orion Nebula visible, but in this photo it's pretty clear. So in the end I'm fairly sure that your statement is true, but the signal required is very, very faint.


Do you know of any good resources for "getting into" image stacking for a complete newbie? 2 years ago I tried stacking images I took of Jupiter in my 16" dob (taken with a Canon DSLR), but it was terrible and I'm not confident I used the app I found (I think it was AstroStack..) for image stacking correctly.


RegiStax is popular for solar and planetary imaging. I've used it several times on H-alpha solar photography. It has a fairly well defined workflow and there are lots of video tutorials on how to use it.


Awesome, I will check it out. Thanks! Also looks like it runs under Wine on Linux, which is good news for me!


I don't have any particular advice. I searched the web for astrophotography stacking and used some software I don't remember the name of to do it. Sorry I can't be more helpful!


Nice result!

Ok, just for fun: I'll try to make a quick back-of-an-envelope calculation and see if we can make an educated guess...

Let's use the old Sunny 16 rule, and its equivalent for moonlight, which also gives us an estimate for difference in light between day and night[0]:

> Daylight (sunny day): Correct exposure for this case is given by the Sunny 16 Rule: 1 / ISO [seconds] @ f/16. So at ISO 100, a typical exposure is 1/125 @ f/16 (that's 1/3 stop less exposure than the rule calls for, but it's the closest standard shutter speed.)

> Full moon: to get an equivalent exposure at night with a full moon, the rule is: 1 / ISO [days] @ f/4. That's right, DAYS. For ISO 100, that means 1/100 of a day @ f/4.

> That works out to something like 14.7 minutes; I just round it up to 15 minutes. So 15 minutes @ f/4, or FOUR HOURS @ f/16.

> That's 21 stops difference from sunlight to moonlight.

So on a sunny day you need f/16, 100 ISO, shutter speed 1/100.

flickr says you shot at f/5.6, unknown ISO (I'll go with quasi-conservative 6400 ISO), 5 seconds.

Aperture increased by 3 stops (f/16 -> f/11 -> f/8 -> f/5.6; each stop is a factor of sqrt(2) in f-number)

ISO increased by 6 stops (100 -> 6400 is a factor of 64 = 2^6)

500x longer shutter time: 2^9 = 512, so log2(500) is a bit less than 9 stops.

So that's roughly 18 stops of extra light sensitivity.

That's about three stops short of the 21 stops mentioned above. However, we're not aiming for a properly exposed single picture, we're aiming for the minimum exposure needed to capture a signal!

Even the first digital SLRs had a dynamic range of five to six stops, so theoretically they should be able to capture something in the shadow regions. Nowadays ten stops is about the norm I believe, with some of the better cameras going up to twelve. So that should clearly be above the minimum signal required.

Also, the moonlight rule above is very conservative: it's about exposing a scene lit by moonlight, not about capturing stars. Stars are light sources themselves, and probably brighter than objects reflecting moonlight, even under a full moon. Also, the rule of thumb was based on film, which suffered from something called "reciprocity failure"[1], so it might exaggerate the required exposure.

So yes, this should work out fine, and as your picture shows it clearly does :)
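The same back-of-envelope as a few lines of Python, for anyone who wants to check the arithmetic:

    import math

    # Baseline: Sunny 16 (f/16, ISO 100, 1/100 s). Shot: f/5.6, ISO 6400, 5 s.
    aperture_stops = 2 * math.log2(16 / 5.6)   # ~3.0 stops
    iso_stops = math.log2(6400 / 100)          # 6 stops
    shutter_stops = math.log2(5 / (1 / 100))   # ~9.0 stops

    total = aperture_stops + iso_stops + shutter_stops
    print(f"extra exposure: {total:.1f} stops")  # ~18.0, about 3 short of 21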

[0] https://www.flickr.com/groups/11947580@N00/discuss/721576207...

[1] https://en.wikipedia.org/wiki/Reciprocity_(photography)#Reci...


> The histogram tells much more of the story (if not the whole truth).

I feel that this is worth expanding on.

Because a raw file isn't actually image data but just straight readouts from the sensor, it doesn't make sense to show a histogram of the raw data directly; the data has to be rendered into a JPEG first. The histogram in the camera shows the lightness values of the JPEG that the in-camera image processor produces from the raw file, but that processor is usually not as advanced as a desktop image processor. It's probably running older image processing algorithms and has to worry about battery life, among other things.

So generally what this means is that if your camera is telling you that parts of your image are over or under exposed and therefore clipped, it might still be possible to recover those parts of the image in a more powerful image processor. It's also worthwhile, if you've got really old raw files, to go back and reprocess them with modern software. Raw file processing has come quite a long way in the past decade!
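If you're curious, you can inspect the actual sensor histogram yourself; a sketch assuming the rawpy library (the file name is hypothetical):

    import rawpy
    import numpy as np

    # rawpy reads most camera raw formats (CR2, NEF, ARW, ...).
    with rawpy.imread("IMG_0001.CR2") as raw:
        data = raw.raw_image.copy().astype(np.float64)
        white = raw.white_level  # the sensor's saturation value

    # Histogram of the raw sensor values, before demosaicing, white balance,
    # or any tone curve -- unlike the JPEG-based histogram on the camera.
    hist, _ = np.histogram(data, bins=256, range=(0, white))
    print(f"pixels at sensor saturation: {(data >= white).mean():.4%}")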


That is the right technique in general when overexposing would clip highlights. Here that's not really a risk, so the other direction works better.


What the article doesn't explain is the physics behind why ETTR captures more light data: our perception of light scales logarithmically, while a CCD captures it linearly. Each stop is a doubling (or halving) of the amount of light captured, and a CCD's output voltage rises linearly with the number of captured photons.

Imagine back to when digital photography had just started, when we were stuck with 10-bit RAW images (before processing them to 8-bit JPGs). That's 1024 different values per channel, not a whole lot of room to begin with. Because the sensor records linearly while stops are logarithmic, the highest stop in the image takes up 512 of those values, the next stop 256, and so on.

EDIT: As rightfully pointed out in a reply, 8-bit JPGs are not in linear colour space any more - so this logic does not apply there (however, JPG compression is more harsh on shadow details, since it's lossy, and it tries to move that loss of information to the parts of the image where you're less likely to notice, like the shadows, but we're digressing)

By that logic, the low light data should have a lot more banding than the highly exposed data. Try it with the levels tool on a RAW file and see for yourself.

So you can imagine that when the dynamic range of (part of) the scene that you want to capture is less than the dynamic range of your sensor, you want to be on the right edge of the histogram (all else being equal - if this causes motion blur due to long exposure it's pretty useless)
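To make the values-per-stop logic concrete, a quick sketch:

    # Discrete values available per stop in a linear 10-bit raw file.
    bits = 10
    top = 2 ** bits  # 1024 values total
    for stop in range(1, 6):
        lower = top // 2
        print(f"stop {stop} from the top: {top - lower} values")
        top = lower
    # -> 512, 256, 128, 64, 32: half the file describes the brightest stop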

Caveat: this is potentially outdated knowledge, perhaps the analog/digital converter in modern cameras actually converts the voltage of a CCD to a (pseudo)logarithmic scale for the digital output, which might partially circumvent this issue (although the sensor is still linear). And nowadays many of the better cameras have 14-bit A/D converters - the usefulness of which also largely depends on the quality of the sensor - so there's a lot more detail in the shadows than before.


While the color depth of a JPG is 8 bits, it is not a linear 8 bits. The sRGB color space is gamma corrected (non-linear) and is roughly equivalent to taking the square root of a linear pixel value.

The same thing holds true for a processed RAW image as well.
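For reference, the actual sRGB transfer function next to the square-root approximation (a quick sketch):

    import numpy as np

    def srgb_encode(linear):
        # The official sRGB transfer function: linear light in [0, 1]
        # to an encoded value in [0, 1].
        return np.where(linear <= 0.0031308,
                        12.92 * linear,
                        1.055 * linear ** (1 / 2.4) - 0.055)

    x = np.linspace(0, 1, 5)
    print(np.round(srgb_encode(x), 3))  # 0.25 -> ~0.537
    print(np.round(np.sqrt(x), 3))      # 0.25 -> 0.5, close enough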


I'm going to be pedantic and air a pet-peeve of mine. "Raw" isn't an acronym. It's just shorthand for raw image data. It's not even the file extension of most (any?) camera raws. Why does everyone insist on writing it in capital letters?


Another possible reason not outlined in the other replies you received to this comment - if I were to say "copy the raw file", someone less experienced may think "okay, take the fresh jpeg file and copy it", interpreting "raw" in a more conventional, non-technical sense. Capitalizing it emphasizes that it refers to the file format, even if not the actual file extension. Sure, I could say "copy the NEF file" or "copy the CRW/CR2 file", but that means I'd have to know whether the user was using Nikon or Canon. "RAW" lets the speaker eliminate ambiguity in non-technical contexts without needing to be so specific as naming the actual extension.


Never really thought about that. I think I subconsciously interpret RAW as only meaning "file of unprocessed image data from a camera", whereas raw has more possible meanings. I know this shouldn't matter, because in context it's never confusing which one is meant.

I guess there's also the visual "symmetry" with JPG or JPEG.


Probably because it's cased that way in camera menus (correctness notwithstanding).


Thanks for the reminder about JPGs, edited in!

When you say processed RAW, you mean after importing into image editing software, right? Which implies you wouldn't see much of a difference in posterisation between the high and low exposure areas.


Right, after importing to image editing software.


Still, for a very long time the RAW file itself used to be linear in storing the signal. Perhaps not any more though.


It is still linear as far as I know. And the linearity is quite good, within a few percent. There is a community of astrophotographers interested in making photometric measurements with DSLRs (American Association of Variable Star Observers). I think the harder part of the problem is getting a standard candle for calibration of your DSLR DN values to actual SI quantities.

https://www.aavso.org/dslr-observing-manual


Huh, that sounds like a very interesting (but different) problem to tackle, given the vast differences between models.


You should write an article on the subject. Seems there are a lot of interested people in the process and you've got some domain specific knowledge to lay it out nicely.


Thanks for the compliment, but I doubt I'll do better than the Wikipedia summary[0] or the original article suggesting the method from 2003, which is still really good[1]. The 2011 followup is also worth a read[2].

If you want a good website in general for explaining photography techniques plus the physics behind it, I highly recommend Cambridge in Colour[3].

By the way, per the example of the second linked essay: if we try to capture 10 stops in 12-bit channels, the darkest stop only has 32 discrete values available in the RAW file.

However, this might be somewhat less of an issue these days because most cameras have a really high resolution, and most images are resized for screen displays. So we can average out some banding (and image noise, for that matter).

Example: I have a 20 megapixel camera. A 1080p screen is just 2.1 megapixels. So even for a full-screen export, that's roughly ten "source pixels" per "output pixel". Which would mean that the output pixel, being an average of roughly ten source pixels, could have around 300 different values. The distribution won't be like that of course, but it still should help.
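A toy demonstration of that averaging effect (illustrative numbers, not from a real sensor): averaging ten pixels that each hold one of 32 levels can produce up to 311 distinct results.

    import numpy as np

    rng = np.random.default_rng(1)

    # One output pixel = the average of ten source pixels, each limited
    # to 32 discrete levels (the darkest-stop case from above).
    src = rng.integers(0, 32, size=(100_000, 10))
    out = src.mean(axis=1)

    print(len(np.unique(src)))  # 32 distinct input levels
    print(len(np.unique(out)))  # ~311 distinct output levels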

[0] https://en.wikipedia.org/wiki/Exposing_to_the_right

[1] https://web.archive.org/web/20150209012804/http://www.lumino...

[2] https://web.archive.org/web/20150213002736/http://www.lumino...

[3] http://www.cambridgeincolour.com


I agree. I’d love to read an article about this!


I'd rather see a comparison between two 30-second exposures at 1600 and 6400 than what was presented here. That would give a much better idea of how much detail you're losing from sensor gain.


If the scene has a lot of variation in brightness, then I'll always opt to save the highlights over the shadows. Shadow detail recovers remarkably well, while clipped highlights are ghastly.


ETTR is a real technique to maximize SNR. Underexposing and then boosting shadows is just a good way to guarantee that your image has no contrast, and comes out looking flat and grey all over.


While idly thinking about light pollution a while ago, I thought of a ridiculously impractical solution:

Mandate the replacement of all outdoor night-time illumination with LEDs that are pulse-width modulated at a low duty cycle. Synchronize them all to an accurate global clock (e.g. from a GPS receiver), so that for instance, all of the lights are simultaneously turned on for the first tenth of each UTC millisecond. Then an image sensor with a sufficiently fast global shutter could disable itself during every brief pulse of light, so that it picks up 90% of the incoming starlight, but almost none of the light pollution.


I think it's time to introduce fines for office buildings that are unoccupied overnight, but have lights on. Certainly in high rise towers this would be tens of thousands of lights.

More stars, less energy consumed. Win, win.


Slovenia is one of the few countries that has legislation against light pollution. For example, all outdoor lights must only emit towards the ground. It helps a lot with dark skies, but enforcing rules like this can be a problem in many cases.

http://artificiallightatnight.weebly.com/uploads/3/7/0/5/370...


Interesting. My hometown in Austria recently installed new lampposts; and all of them are downward-facing. I don't know if this was a conscious decision to reduce light pollution, but it worked wonders. The roads are better lit than before, while there is almost no glare from afar and the night sky is much more visible.


THIS so much - most public lights emit in more or less all directions, making them not only ineffective but constant light polluters at the same time. And it would be so simple - reflect the light to the ground, say with a 170° angle.

This would make the usual 'light cloud' over towns and villages practically disappear. I guess it's a holdover from times when more light was always better (i.e. look at our mighty civilization and its advances, illuminating whole valleys and soon the whole earth!). Now it's pretty hard to find unpolluted places in western Europe, even in the Alps.

I can be on the Italian side of the Alps and still get orange blobs from Switzerland or France, even across a continuous range of frikkin' 4000m peaks!


Reflecting to the ground is helpful, but it definitely wouldn't make the light pollution disappear. Plenty would bounce off lightly-colored pavement and back into the sky.


The problem is so much bigger than that. Here in Edmonton, we have regular folks who campaign and crowd-source funding to install lights on bridges, buildings, just about anything for the sake of making things look 'nicer'. Which is unfortunate for those of us who want to see the sky.

Anyway, how's the light pollution been on your Africa trip? You must have been through some seriously dark remote areas by now.


In my small city we are a few people interested in night observation. We are thinking about lobbying city hall to have a "dark night" once a month, maybe something from 1am to 3am, when public lighting would be off.

Considering the mayor comes to every smallish local event just to remind people of his name, he may be interested in something that would make a bunch of people praise him for little effort.


"Then an image sensor with a sufficiently fast global shutter could disable itself during every brief pulse of light"

All that light scattered in the atmosphere and reflecting off other things would still be around when the LEDs are out.


Hmm... is that really true? Light travels pretty fast!

The poster proposed a 90% duty-cycle for capturing the image (i.e. shutter open for 90% of the time, shutter closed for 10% of the time).

Since I'm based in the UK, let's assume the simplest synchronization mechanism - the 50Hz mains signal. (I appreciate this probably isn't perfectly synchronized.)

That means that the shutter is open for 18ms and closed for 2ms, during which we want to have our light pulse.

Let's assume that we have a 1ms flash and we allow 0.5ms for synchronization error and 0.5ms for light from the flash to dissipate.

In 0.5ms, light can travel approximately 150 km. Most of Earth's atmosphere (and hence most of the scattering) is within ~15km of the surface. As I understand it, reflection takes very little time (less than 1us), and each will cause energy loss (and hence less brightness).
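The timing budget as a quick sanity check:

    C_KM_PER_S = 299_792.458  # speed of light

    # 50 Hz mains -> 20 ms cycle: shutter open 18 ms, closed 2 ms.
    flash_ms, sync_margin_ms, decay_ms = 1.0, 0.5, 0.5
    assert flash_ms + sync_margin_ms + decay_ms <= 2.0

    # Distance light covers during the 0.5 ms decay window:
    print(C_KM_PER_S * 0.5e-3, "km")  # ~150 km, vs ~15 km of dense atmosphere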

I think there are engineering details to work out, but I'm not convinced that reflections or scattering are a significant problem.

I think the bigger problem is that it just doesn't seem worthwhile - how much would it cost to do this, and how much do people care?

As an aside, I wonder what the effect (on humans, wildlife, etc.) of regular 1ms-long pulse of 20x the average brightness would be - maybe it's fine, maybe not?


Speaking from experience, this solution would cause massive headaches for many people, myself included.


I had this idea too but...

This article uses a 20-second exposure, so it wouldn't be helped by PWMing the lights.

Also, you'd have to do it much faster than 100Hz. Wave Christmas LED lights around and you'll see a strobe effect.

People claim to feel (not see) 100-120 Hz flicker (I wouldn't dismiss them off hand).

Finally, you'd have to have a powerful lighting LED able to toggle on/off at fast enough speeds. That isn't obviously available, if for no other reason than large internal capacitances.

As for synchronizing it, that's not too bad. I don't think you'd need a GPS clock. Use an accurate crystal clock and the mains as a 60*3 = 180 Hz reference. Turn the light off every 0, 120 and 240 degrees to cover all three phases.


Fun! It's solving the wrong problem though - the problem with light pollution is its effect on the environment and human psychology, not on astrophotography.


Or we could simply go back to sodium vapor lamps for streetlights, because there are simple sodium vapor filters for astronomy that can entirely filter it out without losing any of the light required for astronomy.

This entire issue is already solved, and works fine.


It is an interesting idea, but most of the lighting industry is interested in reducing flicker, not increasing possible sources of it.

You'd be fighting the current.


It's somewhat of a known problem space. Tackling light pollution boils down to a few easy steps; you just have to be mindful in design. Lights should:

- Only be on when needed

- Only light the area that needs it

- Be no brighter than necessary

- Minimize blue light emissions

- Be fully shielded (pointing downward)

Light pollution is (IMO) driven by lack of knowledge of the problem & solutions, a perception that lights = wealth/luxury, and the low cost of electricity.


This might be enough for photography, but the speed of light would probably push this above 1 millisecond for the area that needs to be dark for large telescopes.

But engineering the light sources to be astronomy-friendly and nature-friendly in general seems like a good idea. Basically limit things to the red-orange part of the spectrum. It worked for sodium lights.


Fix the duty cycle to a particular slice of the phase of AC voltage (e.g. the first 0-pi/2) and you accomplish the same thing much more practically, at least within a single power network.


It's going to be a bit trickier than that, because power is usually distributed as 3-phase, not single-phase.

So that's 3 different offsets, depending on which phase any given light is wired into.

Plus that's probably going to be slow enough to cause visible flicker.


Have it trigger on a voltage threshold. 3-phase means you're hitting 0V six times per cycle, so that'd bring the PWM up to 6 x 60 = 360Hz.


Not a power engineer, but can't most of the power conditioning devices used in buildings mess with the phase of the incoming AC signal? GPS clock synchronization is guaranteed to be globally accurate.


They may cause harmonics, but the underlying sine wave is solid like a rock (well, at least concerning the ENTSO-E over here in Europe, other grids may be more fragile).


I heard once that it is easier to filter light that comes from LEDs out of photos, but I can't find a good resource on it. Anyone know if this is true?


That's not true. Old mercury and sodium lights are much easier to filter since their emission lines are narrow. LEDs often have a very broad and spiky spectrum, which makes them almost impossible to filter. (Source: astrophotographer.) You can buy reliable light suppression filters for sodium/mercury (google CLS or UHC filter), but not for LED.


Having large LEDs turn completely off and back on quickly may prove too costly/difficult. Having a notch of light frequency missing from every outdoor LED might be a better solution. You could even wear special filtering glasses that would filter out all artificial light.


People actually used to do this. Back when most street lighting came from sodium lamps, you could eliminate light pollution almost completely with a pair of glasses intended for soda-lime glass blowing.


Easier to filter out than incandescents, not easier than the old sodium lamps. Also, a single-color LED is not too bad. But a (good) white LED has many primary-color LEDs and phosphors to broaden the color spectrum as much as possible.

But IMHO, wrt incandescents they still suck.


If you're into deep space astrophotography, this is not really news.

It's all about exposing for the stuff you care about. When imaging DSOs (deep space objects), you often don't care about the relatively bright stars nearby. But you do care about that faint little object in between. So you adjust exposure so the stars are usually blown out (overexposed), but the DSO, along with the light pollution background, has peeled off the left hand edge of the histogram.

So now your DSO is sitting comfortably in the middle of the histogram, where you can process and extract it easily.

In practical terms, for DSOs you tweak exposure until the big light pollution hump on the histogram becomes disconnected from the left hand edge of the graph. Turn exposure up gradually until there's a gap at the left hand wall. That usually does it. Overexposure is bad, but underexposure is super-duper-evil, so you must avoid it. OTOH, don't push things too far to the right - you still want some chroma info in the star halos.


Would it be possible to filter out the pollution by subtracting an out of focus image from the in focus one? The pollution is already spread out while the stars are point light sources. So wouldn't that leave the stars untouched while greatly reducing the light pollution?


All you need to do is make a copy of your image, blur that copy enough to get rid of the stars, and then subtract it from the other image. It's going to affect larger structures like the milky way, though.
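A rough sketch of that blur-and-subtract idea with scipy (the blur radius is a guess and would need tuning per image; assumes an RGB input):

    import numpy as np
    from PIL import Image
    from scipy.ndimage import gaussian_filter

    img = np.asarray(Image.open("night_sky.png"), dtype=np.float64)

    # Blur with a radius much larger than any star; what survives is the
    # slowly varying light-pollution gradient.
    background = gaussian_filter(img, sigma=(50, 50, 0))  # sigma is a guess

    # Subtract it: point sources (stars) survive, but large diffuse
    # structure like the Milky Way gets eaten too, as noted above.
    flat = np.clip(img - background, 0, 255).astype(np.uint8)
    Image.fromarray(flat).save("flattened.png")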


Overexposing and underexposing (formally known as Bracketing) is typically a great technique in landscape photography too because it captures as much of the dynamic range as possible. Most DSLRs can bracket right out of the box. Set it up to take 3 exposures (1 underexposed, 1 normal and 1 overexposed) with about 2 or 3 stops in between. Then merge all 3 exposures in Photoshop and you'll get an image with a TON of information embedded in it for you to tweak in post-processing.
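Photoshop aside, OpenCV ships a Mertens exposure-fusion implementation that does a similar merge; a sketch with hypothetical file names:

    import cv2

    # Bracketed set: underexposed, normal, overexposed.
    imgs = [cv2.imread(f) for f in ("under.jpg", "normal.jpg", "over.jpg")]

    # Mertens exposure fusion blends the best-exposed parts of each frame;
    # unlike true HDR it needs no exposure times and yields a displayable image.
    fusion = cv2.createMergeMertens().process(imgs)

    cv2.imwrite("fused.jpg", (fusion * 255).clip(0, 255).astype("uint8"))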

Also, while I haven't tried this, I'd think that using an ND filter in light-polluted areas like this could help a little bit with astrophotography.


There are light pollution reduction filters made specifically for astrophotography. I was shocked at how good these photos were w/o using such a filter. I don't think you could ever have recovered so much of the Milky Way using film w/o a filter in such a polluted sky. (I was certainly never able to.)

Googling for an example, amusingly enough, the first link is to the same site:

https://www.lonelyspeck.com/hoya-intensifier-review-an-affor...

That filter is not specifically for astrophotography, just one that happens to work well. Here's a discussion of other options:

https://www.cloudynights.com/topic/494162-light-pollution-fi...


Adobe Lightroom can also merge bracketed RAW (DNG) images. That is, you take N DNG images and get 1 DNG image out, which preserves more image information and allows you to non-destructively edit the final image. It's available under "Photo Merge > HDR". Unfortunately, it doesn't provide a lot of settings to tweak the merging algorithm.


Here's something I've wondered about a lot, but haven't been able to find a definitive answer: Isn't ISO pretty much software gain control? Why ever increase the ISO on the camera instead of just taking photos at the native ISO and then post-processing the photo to a higher exposure? That way, you don't store the blown-out pixels (if you do overexpose).


ISO is not just software gain. On a lot of higher-end cameras (read: just about any ILC) there's a definite analog gain stage prior to ADC.

Since there's a fixed level of sensor noise, when you increase the analog gain you decrease the level of noise in the shadows relative to the sensor noise, essentially increasing SNR. This isn't true for all cameras, however.

This chart [http://www.photonstophotos.net/Charts/PDR_Shadow.htm#Canon%2... ] demonstrates the difference between three cameras, one of which is known to be largely "ISO invariant" (the Nikon D750), one which is not (the Canon 6D), and one which looks like it has a 'native' ISO of 800 (the Sony A7R II)

As you can see, the D750 doesn't gain much (heh) from increasing ISO -- the total image noise in the shadows isn't improved by increasing the ISO through the range. If you're shooting a 6D in a low-light situation, however, you see improvements in shadow quality up to about ISO 3200, after which it turns out that you will be fine if you just raise the shadows in post. If you're shooting an A7R II you might as well just leave the ISO dial on 800 all the time.

That's not all, though, since it turns out when you increase ISO you basically lose 1EV of highlight off the top of your image, as you allude to, and you also lose around 1EV of DR. (Chart here: http://www.photonstophotos.net/Charts/PDR.htm#Canon%20EOS%20...)

It is, like all things in life, a balance.


Fantastic, that's exactly what I wanted to know, thank you. This is very useful for setting my max auto ISO.


It differs between the different camera sensors. With CCD sensors, there were switchable amplifiers, but sometimes camera companies added software-pushed sensitivities. The most modern CMOS sensors are called ISO-less as the higher ISOs are implemented not by changing the amplification, but by digital post-processing. So for a wide range of exposures, there is little difference between higher ISO settings and pushing in post-processing. There might be some slight differences after all, as the in-camera processing might vary as well as some low bits skipped at low ISO values.


No, the ISO setting works in the analog domain, amplifying the signal prior to digitisation, so it is different from digital post-processing. Edit: apparently my knowledge on this is outdated.


You had it right the first time. ISO on most cameras is very much implemented via analog gain up to a certain point, beyond which "digital gain" is applied.


One thing to note is that most cameras use 12 or 14 bit sensors, so in-sensor digital gain can still be better than post-processing.


You mean if you shoot JPEG, right?


About the only camera that I know of that will currently saturate the bitdepth of a 14-bit ADC/RAW with DR at base ISO is a Sony A7S II, so right now, yeah, shooting in JPEG is the real issue.

This will change in the next few years.

If you're the kind of guy who wants to read the relatively dry writings of someone who spends a whole lot of time figuring this stuff out, the following link might interest you: http://clarkvision.com/articles/digital.sensor.performance.s...


As a photographer who's shot the Milky Way a lot, I can say this is excellent advice.


Or get an IDAS-LPS, CLS or UHC filter (either on the lens or as a sensor clip-on). Or stack a bunch of images with proper dark subtraction. If you stack enough light sub-frames you can effectively reconstruct dynamic range. Of course the ETTR advice is good too, but you can go much further even with consumer equipment.


I have trouble visualizing how the view of the Milky Way as seen from Earth reconciles with pictures showing whole galaxies. Are we seeing it edge on? Is there any cool animation showing it first from the outside, then rotated to match how we see it in the sky from Earth?


Imagine a vinyl record, and that you are sitting on the record, near, but not at, the outer edge of the record. South of you, but close to the horizon, is the centre of the record - containing a massive black hole. North of you are all the stars in the galaxy between you and the edge of the galaxy. Most of the stars in the galaxy are scattered along the disc, but with a fairly large variation. The closer you get to the centre, the greater the number of stars - that is the brightest part of our night sky.

Some ancient clusters of stars have drifted a fair bit away from the disc, but are still gravitationally bound to it, and its dark matter. If you were on another galaxy looking from above, our galaxy would look like a spiral with arms. If you looked edge on, it would look more like a cigar.


My understanding is that based on the position of the Earth, most of the time what we see is the next band over, from our position on the edge of a neighboring band. But like you I've had trouble finding a detailed explanation of our exact POV within the galaxy.


Some quick digging found


Interesting. I would've liked to see the result of ISO 1600 at 30s too - just for comparison.

On a note about light pollution, the mention of sodium street lamps immediately made me think of filters - stars are suns, so they should have wide spectra - why not just filter out the orange bit? Apparently I'm not the first with the idea (obviously):

http://www.nezumi.demon.co.uk/nonad/spectra.htm

https://petapixel.com/2016/12/14/purenight-filter-cuts-light... (has some nice with/without filter images)


The PureNight filter is coincidentally a crowdfunded product by Lonely Speck, the people (Ian Norman and Diana Southern) who contributed this article.


My town has just gone to LED, so it's a bit harder to do :/

Still, I'm sure one could filter out the peaks from the LEDs (they are still phosphor-based), but it's not going to be as effective as taking out the sodium notch.


When I was a child, all the street and business lights were turned off at about 11pm.

Now they are on 24/7

Incredibly wasteful, although I suppose it is safer to go about one's business at night.

It certainly gave the opportunity to view the stars even if one lived in city areas.


Quite recently I lived in a small town far from cities with almost no street lights.

I thoroughly enjoyed being able to simply look up and gaze at the glorious Milky Way unaided.


I'm curious if in many decades self driving vehicles will eliminate the need for most night time street lighting.


Street lighting predates cars.


I wonder where the star trails are. Wouldn't a 30s exposure make the earth's movement noticeable?


The answer is the 500 rule. To avoid star trails, the exposure time in seconds should be less than 500 divided by the 135-equivalent focal length.

In other words, if he shot this on his 'favourite' Rokinon 24mm, an exposure time of 20 seconds is reasonable to avoid all star trails.
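As a quick helper function (the crop-factor handling is the usual convention, not from the article):

    def max_exposure_s(focal_length_mm, crop_factor=1.0):
        # 500 rule: longest exposure before stars visibly trail,
        # based on the 135-equivalent focal length.
        return 500 / (focal_length_mm * crop_factor)

    print(max_exposure_s(24))       # full frame, 24mm -> ~20.8 s
    print(max_exposure_s(24, 1.5))  # APS-C, 24mm      -> ~13.9 s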


Didn't know that one. Thanks a lot!


I'm quite surprised any images of the Milky Way are possible in such a light polluted environment and I'll be using these techniques when I can to see if I can replicate the results.


This article reminded me of the wonders that surround us while we go on living our boring human lives, so easily forgetting the marvel of life and nature. Thanks for sharing this.


Resource limit reached, anyone have a cached version?




