Over time, as the technology of photography was invented and iteratively improved, film formulations came to work better for taking photographs of light-skinned people than of dark-skinned people. Inventors and engineers didn't set out to disadvantage dark-skinned people specifically. But it happened anyway.
In the relatively short history of commercial "AI", we've already seen examples of machine learning systems that followed a similar development path. Non-representative data sets, replicating existing bias, over-fitting data sets to the point where the outcomes violate non-discrimination laws, etc.
One way to help avoid making these mistakes is to think about how they were made in the past, in other engineering contexts.
I'm less certain it's such an issue today. Some cameras have face recognition built in as a focus/AF assist, and I suspect that these work better with some faces than others (most consumer cameras are developed in Asia, so I'd expect them to do best with Asian faces, but as I don't use this sort of feature I don't know). Other than that, I'm struggling to come up with a way that bias - in the way I'd define it - could creep in. The in-body JPEG colour profile, perhaps, but most serious photographers don't depend on that, and it can be rectified in post anyway.
On the other hand, there's the unfortunate technical problem that dynamic range is limited, and having a subject with very large contrast between skin and clothing brightness is technically challenging to shoot, especially in a situation like the one illustrated where the subject can move around so you have limited fine grained control over lighting.
I'm totally on board with presenting the former film chemistry issue as bias - but that's pretty much just historical at this point. The number of people shooting events on slide film is pretty close to zero these days. There's also very little if any investment into film chemistry as a consequence of this.
The second issue irks me a bit. If you have very dark skin and wear very light clothing, or very light skin and wear very dark clothing, then it's going to be more of a challenge to capture you effectively. That's physics, not racism, and conflating the two issues feels a bit disingenuous to me.
Anyway, I suspect that was an evolutionary advantage. For hunting in forest and savannah, to hide in shadows. The human rootstock was black, I gather. And lighter skin evolved where soft UV was limiting for vitamin D synthesis.
But that's solvable with the right technology. Maybe something like getting overall color value, and then tweaking color contrast. If we can do it in astronomy, we can do it with people.
My bet is on UV tolerance. Black people have lower rates of skin cancer in the US compared to whites. https://www.jaad.org/article/S0190-9622(05)02730-1/fulltext (Mortality is higher in blacks, but that's because it's diagnosed at later, more severe stages.) The effect would have been far bigger when foraging and hunting all day, every day, under the equatorial sun.
And maybe there's linked stuff that got selected for.
As someone who has the ancestry of people who lived far from the equator and now lives in a country that is much closer to the equator I wish we hadn't lost our body hair.
But now that I think of it, hairless mice have light-colored skin. Some hairless dogs too.
Pale skin can be made easier to photograph in challenging lighting conditions by using a matte powder, which diffuses specular highlights and reduces the value range. This doesn't work well on dark skin, because subtle shadows are less visible and you lose the impression of shape; you need the specular highlights, but they greatly increase the value range.
The problem with early colour films wasn't really hue (all human skin falls into a remarkably narrow range of hues), but a lack of sensitivity and dynamic range compared to monochrome films. They coped with over-exposure reasonably well, but even slight under-exposure would obliterate the detail on dark skin. This limitation of chemistry was a constant problem in all forms of colour photography, but the social burden fell disproportionately on dark-skinned portraiture subjects. Later emulsions had far better rendition of lowlights and smooth compression of highlights, which vastly improved their performance with dark skin.
All technical systems have strengths and weaknesses. When systems are conceived and developed, their goals are set through a culturally created understanding of what the system "should" do. The lack of attention to darker faces (from chemical process selection to the selection of people depicted in Kodak's system calibration cards) explains how we got this particular set of technical limitations instead of one that, say, tended to over-expose easily.
> Dark-skinned faces are inherently more difficult to photograph
I would say that dark-skinned faces are inherently poorly captured by the photographic technology we have developed, which has focused on detail rather than dynamic range.
The choice of the model does reflect some amount of bias, but it doesn't cause bias in the printing process.
Precise calibration was done with a test chart and a densitometer. Shirley was just a quick reference used throughout the day, to check at a glance whether prints were unnaturally blue or purple or washed out due to changes in the developer chemistry. Fuji usually supplied test films and prints with Asian models, even in European markets; the motion picture industry originally used porcelain dolls. A multi-ethnic test image wouldn't have provided better accuracy.
>I would say that dark-skinned faces are inherently poorly captured by the photographic technology we have developed, which has focused on detail rather than dynamic range.
Except that the opposite was true for the entire duration of the film era. The photographic film industry was acutely aware of the fact that most consumer photographs were taken on compact cameras with mediocre lenses and usually printed at 6"x4". They were also aware that the main complaints from consumers and darkroom technicians were about exposure latitude and colour accuracy. Their development efforts and marketing materials reflected this awareness.
Most consumers had never even heard the word "resolution" until the megapixel wars. Ektar, the only Kodak colour negative film specifically marketed as being especially fine-grained and high-resolution, failed commercially and was discontinued after five years.
You're correct that the film ecosystem of photography was not developed with resolution in mind. I was thinking of the way in which digital sensors capture the most bit depth in the upper tones. This would likely not have been pursued if test targets had included subjects with darker skin than their surroundings.
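The "bit depth in the upper tones" point follows from linear sensor encoding: each stop down from clipping gets half the remaining raw code values. A rough back-of-the-envelope sketch, assuming a hypothetical 12-bit linear sensor (the bit depth is illustrative, not any particular camera):

```python
# Rough illustration: a linear sensor assigns half of its raw code
# values to the brightest stop, a quarter to the next stop down, etc.
# The 12-bit figure is an assumption for the sake of the example.

BITS = 12
MAX_CODE = 2 ** BITS  # 4096 distinct raw values

def codes_in_stop(stop_below_clipping: int) -> int:
    """Raw code values available in the Nth stop below clipping (N=1 is brightest)."""
    upper = MAX_CODE // (2 ** (stop_below_clipping - 1))
    lower = MAX_CODE // (2 ** stop_below_clipping)
    return upper - lower

for stop in range(1, 7):
    print(f"stop {stop} below clipping: {codes_in_stop(stop)} code values")
```

So the brightest stop alone gets 2048 of the 4096 values, while a shadow tone six stops down has only 64 to work with, which is why detail in dark subjects is the first thing to fall apart.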
When he purchased the equipment for his lab, he didn’t have a ton of money so the equipment was only semi automated. There were two machines, one to develop the color negative film and another to make prints.
I operated the printing machine. This involved sitting at a console and feeding each strip of negatives (typically 36 frames) one image at a time over a lamp which lit the negative from below. I then punched some buttons to make each print. I was assisted by a computer. But the computer was dumb. It looked at the negative and tried to average out the exposure so that the resulting print on average would be grey. 18% grey to be exact.
So my job was to look at this inverse negative that was 1” x 1.5”, figure out what the scene actually was, then override the computer’s exposure and color balance to get a correct print. 12 minutes later when the print had developed I’d know if I guessed correctly.
On a good day, I’d print about 100 rolls of film, 3600 prints, and maybe I’d have to redo about 36 of those.
Now this was in Miami, and so I was printing pictures of every subject and skin type under the sun.
And the funny thing is, I don’t recall ever noticing, particularly, that there were any issues with prints of persons of color.
What I do remember distinctly is that my dad and I never agreed on blue/yellow color balance. I thought he made everyone look like Smurfs. I pleaded with him to look at the whites of people’s eyes. Never won that battle. It was his shop.
Also, beach scenes. Snow scenes. Cameras and film often (counterintuitively) underexposed those.
But this was long ago and maybe I didn’t notice or don’t remember.
The snow being under-exposed is because the camera is trying to balance the image out to 18% grey [usually]. This means that in a scene of mainly bright white snow, it will expose to make the scene grey unless you override it. The camera can't [with current technology, as far as I know] know that you are shooting snow. If you're shooting in snow, then you probably want to adjust exposure by +2 stops as a starting point.
Photographers used to carry around a physical grey card and would take a light meter reading from it. That would give the base standard exposure for the ambient light, so the scene should be correctly exposed. [It's more complex due to dynamic range etc., but that covers the basics.]
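The averaging behaviour can be sketched in a few lines. This is a toy model of a naive averaging meter, not any real camera's metering algorithm, and the 90% snow reflectance is an assumed round number:

```python
import math

# Toy model of a naive averaging light meter: it assumes the scene
# averages out to middle grey (18% reflectance) and scales exposure
# to hit that target, whatever the scene actually contains.

MIDDLE_GREY = 0.18

def metered_gain(mean_reflectance: float) -> float:
    """Exposure gain the meter applies to render the scene average as middle grey."""
    return MIDDLE_GREY / mean_reflectance

# A snow scene averaging ~90% reflectance (assumed value):
gain = metered_gain(0.90)            # 0.2 -> the snow gets rendered as grey
stops_under = math.log2(1.0 / gain)  # ~2.3 stops of under-exposure
print(f"gain {gain:.2f}, under-exposed by {stops_under:.1f} stops")
```

The ~2.3 stops that falls out of this toy calculation lines up with the "+2 as a starting point" rule of thumb above.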
We can debate the title a bit, but a substantive article with a slightly baity title is not one that should be flagged. If the title is your issue, you're welcome to let us know at email@example.com, or suggest a better title in the thread, which we'll happily use if we see the suggestion.
One last thing: if you're going to comment on a thread like this, can you please read up on the site guidelines first? Especially this one: Comments should get more thoughtful and substantive, not less, as a topic gets more divisive. Remember that we're going for good conversation—respectful and two-sided—rather than ideological or political battle.
I'm not sure, though, that it makes sense to focus on the idea that it was a "mistake" to flag this article. I think flagging is currently one of the weakest areas of the HN moderation system, in that it allows a minority to suppress discussion of otherwise good articles that they disagree with. Clearer guidelines can help, but I think you should also think hard about how the system can be changed to be more resilient to "mistakes".
Maybe flagging of articles with some minimum number of upvotes should act only as a "request for moderator review" rather than having any immediate visible effect?
Maybe there could be a way (either mandatory or optional) to explain why an article was flagged? I presume it would be easier to review a flag that says "Bad title, should be X" than just an opaque and universal flag.
Maybe there could be some better feedback for users who flag articles, to tell them if they are flagging articles "correctly"? As it is, I presume a lot of people persist in doing what they think is best, even if you think they are using the system incorrectly.
It's great that thanks to good moderation HN is still going strong after all these years. Obviously you are doing many things right! But I strongly feel that the current flagging system needs to be rethought for this success to continue. Explanatory comments like the one you made here are helpful, but I think some deeper changes will also be necessary.
I don't know if the flagging system needs rethinking though. The way it works has been stable for years. I appreciate that such claims are a bit irritating, because they're based on a view of the site that not everybody gets to see. But the [flagged] marker is publicly visible, and if you or anyone sees an article in [flagged] state that ought not to be flagged—say, one whose signal/noise ratio is as high as the OP or better—then please tell us (firstname.lastname@example.org is best, because that way we're sure to see it). You especially, because we know how much and how sincerely and for how long you have cared about open discussion on this site.
For example, consider a post about the political analysis of the changing voting habits of Congress. The post is great, it is filled with technical details and an explanation of the application of game theory to voting habits in Congress, yet it is titled "Racism in the American Political System along Party Lines." The title alone is enough to disqualify such a post here, IMO, as it would in itself cause a dumpster fire of a comment section, or at least make one much more likely.
I appreciate how the moderation call could seem non-obvious in this case, even though it was obvious to me. Here's the most important thing: people shouldn't just flag every divisive article. It depends on the quality of the specific article. We don't want HN to be taken over by political battle. But we also don't want to sanitize it of social issues, because it would be less intellectually interesting that way, and that is the entire point of the site. Here's a good question to ask when deciding whether to flag: what is the information-to-indignation ratio of the article? If it's high, don't flag it. I wrote about this the other day, for anyone who wants more: https://news.ycombinator.com/item?id=19720659.
What prompted me to comment publicly in the current case is that the article was high in information and low in indignation—surprisingly low, considering where the author is coming from. Which makes the article even more interesting, btw—any time an article defies expectation for its genre, that's good for HN.
Essentially what I use HN for is the comment discussion. So for me, the main thing to look at is whether an article will generate a good discussion. Saying "I don't actually care if the article is good" is not exactly true, but it's also not that far off: a well-written article that generates a bad discussion is still a bad thing IMO.
Whereas someone who uses HN as just a news aggregator may have the opposite thought -- the comments don't matter, all that matters is the article quality.
This submission is a good example to make the point; it sits just on the interesting edge. That's why I've spent so much time commenting in this thread: not because I know anything about the social implications of photography tech, but because it reveals something about how HN can get better.
While it's true that photography in the past was racially biased, it is unclear why modern photography is still racially biased. It may just be genuinely harder to photograph dark skin; after all, the article even says that attempts have been made to fix racial bias in photography.
All of this being said, even if photography is not racially biased and the issue is more technological rather than social, the problem should still be fixed.
I could also be wrong about all of this, and in reality dark skin is just as easy to photograph as light skin, but the article did not give a modern example of the racially biased "Shirley card"--except for facial recognition technology, but I didn't really consider that photography in the traditional sense.
While modern photography may not be racially biased, Kwindla does make an excellent point that machine learning is replicating the past mistakes of photography.
Perhaps if those with dark skin were a majority, far more R&D spending would go into dealing with dark skin tones than say, better zoom lenses or image processing speed. As a practical example, I suspect HDR would have arrived much faster than it did.
The idea that it was furniture companies and chocolate vendors that drove this development is absurd; that is probably just Kodak sales pitches backfiring on them 50 years later. Anyone who wanted to take a photo of a white person, let alone any other subject, in a slightly darker environment wanted the same thing.
Or for an even better way to look at it: if you can photograph fast moving subjects twice as well, you can also photograph subjects twice as dark. Again, this is something everyone that uses a camera wants, independent of race.
As for HDR: it is only relatively recently that digital sensors have started outperforming film's dynamic range, which was pretty good for a very long time. Digital camera vendors absolutely had massive financial incentive to fix this deficit. When they started to even come close, that's when the market exploded for movie production, and probably a lot of other commercial applications as well.
Cameras, on the other hand, have one sensor that captures only colour, not black and white. And in low brightness, noise becomes a huge problem in the electronics. Modern smartphone sensors are incredibly small, and even moderate lighting conditions already require some post-processing or special capture methods like HDR to cook up an acceptable image.
The issue is most likely fixable by making HDR a standard feature on all cameras and extending the dynamic range even further, so that darker skin tones are appropriately captured in all lighting conditions.
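For concreteness, in-camera HDR is usually some form of exposure fusion: per pixel, favour whichever bracketed frame is best exposed. A minimal sketch under assumed values — the Gaussian weighting and the sample pixel values are illustrative, not any vendor's actual pipeline:

```python
import math

def weight(v: float) -> float:
    """Prefer well-exposed pixels: Gaussian weight peaking at mid-grey (0.5)."""
    return math.exp(-((v - 0.5) ** 2) / (2 * 0.2 ** 2))

def fuse(exposures: list[list[float]]) -> list[float]:
    """Per-pixel weighted average of bracketed frames (pixel values in 0..1)."""
    fused = []
    for pixels in zip(*exposures):
        weights = [weight(p) for p in pixels]
        total = sum(weights) or 1.0
        fused.append(sum(w * p for w, p in zip(weights, pixels)) / total)
    return fused

# A dark face next to bright clothing: the short exposure holds the
# highlights, the long exposure holds the shadows; fusion keeps both.
short_exp = [0.05, 0.55]  # face crushed, clothing well exposed
long_exp  = [0.45, 0.98]  # face well exposed, clothing clipped
print(fuse([short_exp, long_exp]))
```

Both fused pixels land in the usable mid-tone range, which is exactly the high-contrast skin-versus-clothing case the thread describes.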
There's little "amusing" about a piece of software that re-enforces prejudices, only now in a way that is supported by a technology culture which prides itself on blind faith in machines. For victims of those prejudices, it presents a real and terrifying future.
I'd expect it to let you calibrate for the black woman's hair detail without blowing out the white dress, but this image on my screen (print? scan?) doesn't seem to have done that.
It looks like the black woman's hair is almost lost in this image, yet it looks like the photographer put a hair light on the blonde woman, and not on the black woman.
(Lighting: In the reflections in their eyes, it looks like two studio lights. A narrow hair light for the blonde, from camera left, might be a third. Also looks like a backdrop light behind, which might also be doing that small hint of rear lighting on camera right of the black woman's hair, which emphasizes that the rest of the hair isn't lit as well as it could be.)
(Outfit exposure: The gray outfit might be Gray Card gray, and you can also compare it to the white outfit. The black dress looks hopeless for exposure, at least in this print/scan.)
(Focus: The blonde head looks in a bit softer focus, which might be accidental or glamorous.)
The author says: "instead of seeking a solution, the technician had decided that my body was somehow unsuitable for the stage"
But the technician said: "We have a problem. Your jacket is lighter than your face. That’s going to be a problem for lighting"
In what sense did the technician decide the author should be removed from the stage, and seek no other solution?
Then you can look at the clothes she is wearing and realize that in every photo where she is wearing a lighter dress or jacket, the automatic exposure settings of the camera adjusted for the average lighting of the scene and underexposed her face.
That is why, because Brazile has done countless TV appearances for decades, you can see in most of the photos that she wears darker jackets or clothing.
You can barely make out Michael Jordan's facial features. Michael Jordan, in a team photograph of the 1990/91 Chicago Bulls.
The lesson here is to design technology so it works for everyone, not just people like you. Sometimes it's a matter of literal skin color, sometimes it's a question of physical size and shape or abilities.
There is no magic: photography is all about light. Darker skin reflects less light, so you need to ramp up the exposure or the sensitivity of the film, which destroys detail in the highlights. The only race-related thing here would be the photographer choosing to correctly expose one skin tone over another.
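The reflectance difference can be put in stops, the unit photographers actually work in. The figures below are assumed round numbers for illustration (light skin ~36%, dark skin ~9%, a white jacket ~90%), not measurements:

```python
import math

def stops_between(r_bright: float, r_dark: float) -> float:
    """Exposure difference, in stops, between two surface reflectances."""
    return math.log2(r_bright / r_dark)

# Illustrative (assumed) reflectances: light skin ~0.36, dark skin ~0.09,
# a white jacket ~0.90.
print(stops_between(0.36, 0.09))  # 2.0 stops between the two skin tones
print(stops_between(0.90, 0.09))  # ~3.3 stops between white jacket and dark skin
```

Under these assumed numbers, a white jacket next to dark skin spans over three stops before you even count shadows and highlights elsewhere in the frame — which is why the exposure is hard whichever tone you meter for.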
That one is similar to your pic, but even with the bad resolution you can tell it's much better: https://usercontent1.hubstatic.com/13875852_f520.jpg
Idk what type of film was used for that particular pic, but some films are made to better render white people's skin tones (Kodak Portra, for example), while others are made for the Asian market; some are better for snowy/sandy landscapes, others for blueish/greenish scenes, etc.
On the other hand, black and white film works very well on black people:
Although it’s a harder photograph to get right in terms of technical difficulty, it also gives the photographer more than a fair chance, since everyone is more or less sat down in a controlled environment. Maybe the Bulls didn’t have the best equipment or photographers in the world, but it’s not a total amateur operation, either.
You'd be surprised, especially with pre-digital photo. Each film emulsion has its own characteristics. I won't go too much into detail because it's explained all over the internet already, but here is a quick comparison of a few emulsions:
I hope no one will be offended by my comment, I mean it factually.
If darker skinned people had invented it, or had been a richer consumer group things would have been different - to think otherwise you'd have to think greedy capitalists would give up piles of cash to be racist.
Why hate on inventors who create something cool just because it doesn't quite work as well for all groups of people?
Surely this also left a gap in the market - someone could have optimized film for darker skin tones and made a lot of money?
For one, there's this: "Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith." The author of an article is someone.
Then there's this: "Eschew flamebait." Taking an HN thread further into flamewar, which is the direction your comment points in and alas even moves a little into, definitely breaks that guideline. Keep in mind that once one person goes there, a lot of others are going to go there—for and against, bashing each other along the way—so the biggest responsibility is not to be first to go there. If we all avoid that, no flamewar.
There's also: "Please don't post shallow dismissals, especially of other people's work. A good critical comment teaches us something." It seems to me that the dismissals here were indeed shallow, reminiscent of instant objections that pop into one's mind when encountering what, for whatever reason, we dislike. That mechanism is hardly unique to you—it is active in 100% of us. But the HN guidelines have been carefully written to ask all of us to slow down and inhibit that mechanism—to be more reflective and less reflexive—because this is the only way to get good conversation on the internet.
There are two other guidelines that the comment breaks, but I'll leave them as an exercise for the curious.
"Hating" is perhaps too strong a word, but the author makes multiple accusations of racism against companies, technicians etc - from my reading thinking things are far too intentional and getting angry/offended - where in reality it's a mix of money and physics - not racism against a people.
I would say there is more love in the article—consider the passages about the author's grandfather, whose humiliation she in a way dedicated her career to repairing—and later about her father. Note how she includes a moving (to me at least) moment of reconciliation at the end ("Her eyes were glassy as she said goodbye. Mine were, too, grateful for her vulnerability."). How easy it would have been to shame the woman who made the faux pas instead. Such moments of acknowledgment are hard to come by, and are worth emulating. This is not someone who's just out to hate.
> thinking things are far too intentional and getting angry/offended
A more charitable interpretation of the article is that a series of omissions can compound into a bias, even without deliberate attempts to exclude. That's interesting, and I'm a bit puzzled by some commenters' aversion to looking at it. Yes, angry accusations have been made and still get made, but that leads us to hear them even when they're not really there. We need the ability to notice when they're not really there, so as to respond in kind. That would be a de-escalating movement.
> the author makes multiple accusations of racism against companies, technicians etc
I'd urge you to read the article again and see whether those are really there, or if you haven't somehow filled those details in, perhaps because it felt that way while reading it. I just reread the whole thing myself, to see whether I had missed some accusation of racism against a company or a technician. I didn't find any. In fact, the second reading convinced me that the author must have taken great pains to restrain herself from doing that—since nothing would be easier for someone in her position to do.
The article does contain a lot of pain—the pain of being unseen, excluded. And she does do something difficult for the reader: she creates tension by never expressing the pain directly. It's there implicitly, which heightens the effect. That's a pretty effective device for making a point, and I wonder if that's really what people are reacting to: the discomfort we feel when something intense is present but not expressed. But there's also generosity in this. If someone holds back from expressing as much as they could, even when feelings are intense, it creates space for others to do the same. Those opportunities are worth noticing and acting on, because otherwise we all just repeat the cycle.
I must confess that it grinds my gears too when people rail against makers for being 'selfish' or discriminatory when the things they make end up best suited for people like them.
You could even see such accusations as implicitly insinuating that the non-catered-to groups are somehow less capable of innovating their own more suitable alternatives.
With this you've confessed to trolling. I mean, the White Man’s Burden? Come on.
Most trolls do.
Now place a different shade of brown upholstered chair, in front of this wall.
Take a photo using the Hasselblad X1D, currently considered to have the most accurate color of mainstream cameras.
Now try to print it accurately, even with the best quality paper and a high end inkjet printer.
Even master printmakers will struggle with it.
Photography is not just a system of calibrating light, but a technology of subjective decisions. Light skin became the chemical baseline for film technology, fulfilling the needs of its target dominant market. For example, developing color-film technology initially required what was called a Shirley card. When you sent off your film to get developed, lab technicians would use the image of a white woman with brown hair named Shirley as the measuring stick against which they calibrated the colors. Quality control meant ensuring that Shirley’s face looked good. This carried over into the color-balancing of digital technology. In the mid-1990s, Kodak created a multiracial Shirley card with three women, one black, one white, and one Asian, and later included a Latina model, in an attempt to help camera operators calibrate skin tones. These were not adopted by everyone, since they coincided with the rise of digital photography. The result was film emulsion technology that still carried over the social bias of earlier photographic conventions.
But once you add technology for recording photographs on film, or creating and displaying digital images, there's potential for racism. I know next to nothing about photography. But it seems pretty clear that building stuff that works well for everyone is a hard problem.
If a maker of paints in the 1800s owned slaves does that mean that painting (then, now, in the future) is racist? How ridiculous can we get?
Would you mind reviewing https://news.ycombinator.com/newsguidelines.html and doing a better job of posting substantively here? You've posted one-liner flamebait twice already, which is not cool, especially on a divisive topic.
But the issue isn't so much about those photographers. It's about the defaults designed into film, cameras, photosensors, and so on. As TFA discusses. So random users -- and even many technicians -- don't get good results when they rely on the defaults. As in "your jacket is lighter than your face". And that's even an issue for many "white" people, which is why TV makeup is such a thing.
And about your paint maker. If they didn't care very much about painting dark-skinned people, maybe they wouldn't put much effort into researching and developing paints that worked well for that. It's exactly the same issue as with Kodak in TFA.
So is that racism? One could argue that it's just not caring very much about some race(s) of people. But one could also argue that anything that's race-based is racism.