Or it's extremely basic text matching, and "African American Inventors" matches "American Inventors" more closely (it's an exact substring match) than web pages about "Inventors from the United States", which is probably the more common phrasing (see: https://www.google.com/search?q=Inventors+from+the+United+St...)
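To make that concrete, here's a toy sketch of the kind of naive lexical scoring I mean. This is not Google's actual ranker; the titles, weights, and scoring function are all made up for illustration, but they show how an exact-substring bonus can push "African American Inventors" above the more common phrasing for the query "american inventors".

```python
# Toy illustration only: a naive lexical scorer that rewards exact phrase
# containment plus token overlap. Nothing here reflects Google's real ranking.

def naive_score(query: str, title: str) -> float:
    q, t = query.lower(), title.lower()
    q_tokens, t_tokens = set(q.split()), set(t.split())
    overlap = len(q_tokens & t_tokens) / len(q_tokens)  # fraction of query tokens present
    phrase_bonus = 1.0 if q in t else 0.0               # exact substring match
    return overlap + phrase_bonus

query = "american inventors"
titles = [
    "African American Inventors",        # hypothetical page titles
    "Inventors from the United States",
]
for title in sorted(titles, key=lambda t: naive_score(query, t), reverse=True):
    print(f"{naive_score(query, title):.2f}  {title}")

# Output:
# 2.00  African American Inventors
# 0.50  Inventors from the United States
```

Under this (admittedly crude) scheme, the first title contains the query verbatim and every query token, while the second shares only "inventors", so the "biased-looking" result wins purely on text matching.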
One of the biggest problems with Google as I see it, and Facebook for that matter, is that there isn't much of a way for the end user to introspect _why_ a particular query returns a given search result. In this case, you used that lack of information to confirm your own assumptions and biases.
How many photos of white inventors actually have the word "white" near them, though? Image search isn't exactly parsing biographies, so the term largely gets lost in the noise compared to "inventor".
It's pretty common for certain words in certain image searches to be useless, and that includes a lot of very boring searches nobody has ever given a second thought.
It's pretty easily explained by the fact that people only really use the term "white inventors" when talking about black inventors (honestly, "white inventors" would probably carry racist connotations in most other contexts).
Any search algorithm will introduce some kind of bias. People will notice this bias and assume a conspiracy because humans like to read purpose and intent into everything. If all the results were white males, people would be accusing Google of sexism and racism. If the results skewed toward Asians people would think Google was in league with the Chinese government or something.
This is one of the social dangers of allowing algorithms to make important decisions in society. We might not even be aware of what biases we're introducing. They can be very non-obvious, especially if the algorithm is complex or the data set is large.
Edit: here's another possible explanation: perhaps, due to efforts to popularize African American contributions to American history, there are simply more web pages (in a purely numeric sense) about African American inventors, scientists, etc. That could affect how Google weights the search results.
> This is actually one of the dangers of AI: it might introduce all kinds of weird biases that we're not even aware of until the effect has already been felt.
This isn't a danger of AI in the future; it's a danger of Google's algorithms in the present. Most users (myself included) have come to treat Google results as some oracle of truth, so the bias introduced by Google's existing search algorithms can have tremendous impact. Most of the time it goes unnoticed, and it doesn't require any intentional manipulation. What people know, and how they act on what they know, is already heavily influenced by Google's algorithms.
In a sense, it doesn't matter whether the curious "american inventors" search results set comes out of political bias or technical inadequacy: from the perspective of many people, Google's editorial choices in certain areas have forfeited the benefit of the doubt in others.
This situation is a good example of why it's important to maintain the appearance of impartiality as well as to actually be impartial.
(Disclaimer: I work for Google. Nothing related to search.)