
There was a similar "coming to Jesus moment" with Google in China. They saw that they had to do the right thing after years of censoring and manipulating data for the Chinese government, but only after they got massively hacked, blocked, and essentially forced out of China...

However a good thing done for the wrong reasons is still a good thing.




> However a good thing done for the wrong reasons is still a good thing.

Agreed, and I try not to be too hard on them. I don't think it's a black-and-white issue personally. The only issue I have is how this implies Google always wants to do the right thing from the get-go, which very much seems not to be the case here.


You should read Daniel Kahneman's "Thinking, Fast and Slow". It's not possible to make all the policy decisions that turn out to be right in hindsight before a sentinel event occurs. Hindsight is always 20/20. Anyone making real decisions of consequence will eventually curse hindsight.


Well, Google is now framing this as a moral issue, so did morality change significantly between when they accepted this project and today?


Do you think regret is morally valid?


Sure.

But Google has no intention of doing the right thing any more than Microsoft or Disney does. These are corporations, and their executives HAVE to do what they think will be best for the corporation, not what they think is best for mankind.

This is how for profit businesses currently work. And PR saying anything to the contrary is simply not true.


This is a gross generalization that people trot out as if it were unassailable but never back it up with any support.

Corporations are run by people with a complex set of motivations and constraints in which they make decisions. Some of them make decisions with intent to harm. Some make decisions with intent to help.

No person is automatically turned into a ruthless, amoral person just by being employed at a corporation.


... and most make decisions in a space where (local) zero-sum games mean there is no option available that uniformly helps or harms.


It gets complicated, but it's more about the employees' responsibility to shareholders, not their personal morals.

https://www.reddit.com/r/law/comments/3pv8bh/is_it_really_tr...


And do you know what can happen when a person's own morals or ethics come in to conflict with their responsibility to shareholders?

They can quit. They can speak out. They can organize. They can petition for change. They could join the more ethical competition (if one exists), or start their own.

This is especially easy to do for employees of a company like Google, with excellent job prospects and often enough "fuck you money" to do whatever they want without serious financial hardship.

They are not hopelessly chained to the corporate profit machine. They can revolt -- that is, if their morals are important enough to them. Otherwise they can stay on and try not to rock the boat, or pretend they don't know or are helpless to act.

A handful of Google employees chose to act and publicly express their objections. This action got results. More employees in companies which act unethically should follow their lead.


I used to work at Google about 5 years ago. While I was there it was clear that Google employed some of the most morally conscientious people I've ever worked with. It's why I still trust them to this day with data that I would never trust anyone else with. As long as Google employees continue to have a voice in the company, I'll continue to trust them.


Google public shareholders do not have control of the company. Larry, Sergey, and Eric are the only shareholders who matter. So executives are responsible to them first and foremost.


Even if this is true, they can make the subjective judgment that doing certain things will make the company look bad in the eyes of employees (which can not only cause employees to resign but also disadvantage the company when negotiating to hire new ones) and users of the product, and can ultimately be worse for the bottom line than things that don't bring the same short-term financial benefits.

Ultimately, though, I agree with zaphar that you are overgeneralizing, since corporations are controlled by humans -- executives, other employees, and shareholders -- and human motivations can be complex.


This sort of thing gets said a lot. It's not a valid excuse, and it's not true in the black-and-white sense in which people constantly present it.


OTOH, Google tries to claim much more moral high ground than it actually has. Insincerity does rub people the wrong way.


> However a good thing done for the wrong reasons is still a good thing.

I'd say it is definitely better than not doing a good thing. For me, though, the real question is this: considering there is a pattern here (doing the right thing only after doing the wrong thing), do you trust them to do the right thing in the first place next time?


> However a good thing done for the wrong reasons is still a good thing.

Yes, but we should absolutely remember what the original intention was.


Hmm, not really. Google is bad and they should feel bad; you're just handwaving away how bad they are because you like them.

Imagine I wanted to have somebody killed, so I hired a hitman, and when I went to pay him I accidentally wired the money to the wrong place and inadvertently paid off the victim's mortgage instead of paying the hitman. Now the victim doesn't die and gets their mortgage paid off. I'm not a good guy, and what I did is not a good thing; I just fucked up, that's all. Had everything gone to plan, the guy would be dead and I would be blameless.

Similarly if everything had gone to plan Google AI would now be powering various autonomous murder bots except they realized that they didn't want to be associated with this, not because they have any morals, but because WE DO. They are still bad.


> Imagine I wanted to have somebody killed, so I hired a hitman, and when I went to pay him I accidentally wired the money to the wrong place and inadvertently paid off the victim's mortgage instead of paying the hitman. Now the victim doesn't die and gets their mortgage paid off. I'm not a good guy, and what I did is not a good thing; I just fucked up, that's all.

That's an odd analogy, considering that the would-be conspirator didn't make a decision not to go through with it. Do you believe Google published this article by accident? And really, comparing Google's actions to murder... c'mon.


I'm not comparing Google's actions to murder specifically; that's simply you not being able to see the forest for the trees. The only reason they wrote the article is to make it seem like what they did was a proactive moral choice, when in reality it was a retroactive move to reframe their realization that, in supplying AI for the DoD's murder bots, they would be part of the evil empire. I mean, it's literally mustache-twirling levels of unscrupulousness.

They didn't fess up because they realized that the outcome of their actions would be bad; they fessed up because YOU realized that the outcome of their actions would be bad.


> comparing Google's actions to murder

You think people weren't/wouldn't be killed based on intel gathered from the project?


Believe me when I say that I do not like Google. But what is a person or a group of people supposed to do when they have done something wrong? All they can do is stop doing it and try to avoid doing it in the future.

You can speculate about their motives (I personally believe it to be PR-motivated as well), but what matters in the end is that it stopped happening.


That's all fine, but that person or group shouldn't expect everybody to love them afterwards. If anything, they should expect distrust and dislike.



