Uber’s Secret Tool for Keeping the Cops in the Dark (bloomberg.com)
119 points by angpappas 6 months ago | 107 comments

I think the biggest issue here is that Uber actually has programmers on its staff who think it's OK to write a program whose sole purpose is to try to break the law.

I mean, someone actually sat down and said: OK, when (not if) law enforcement comes to raid one of our offices, how can we best block a lawful search warrant?

> actually has programmers on its staff who think it's OK to write a program whose sole purpose is to try to break the law

It's not that simple.

Team A is given the task of writing software that locks down all computers when some alarm goes off (e.g., a door opened without disarming the alarm). At the start, this alarm and all workstations are connected to an on-premises server.

Team B is given the task of moving the server into the cloud (or the home office) because HR realized they could automatically monitor which employees enter and exit the office. For security auditing.

The system has been moved to the cloud, and team C is given the task of extending the software to make it possible to block certain employees from logging in, e.g., after they've accumulated a certain number of work hours during a week. To prove that the company complies with work regulations.

Now you have built a system, all under plausible and "just" pretenses, that allows you to lock down an office anytime remotely.

Team D (managers) are given the task of paging a number when a raid happens. Team E is instructed to lock down an office when the number is paged (possibly they're told that it's a system test or something else).
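Mechanically, the pipeline described above is not exotic. A toy sketch of the flow (all class and method names are hypothetical, purely to illustrate how each team's piece composes without anyone seeing the whole):

```python
# Hypothetical sketch of the compartmentalized lockdown flow described
# above. Each "team" owns one method; nobody needs the whole picture.

class LockdownServer:
    """Cloud-side coordinator (Team B's migration target)."""

    def __init__(self):
        self.workstations = {}  # office_id -> list of registered clients

    def register(self, office_id, workstation):
        self.workstations.setdefault(office_id, []).append(workstation)

    def on_alarm(self, office_id):
        # Team A: lock every machine in an office when an alarm fires.
        for ws in self.workstations.get(office_id, []):
            ws.lock()

    def block_user(self, office_id, user):
        # Team C: block individual logins (e.g., work-hour compliance).
        for ws in self.workstations.get(office_id, []):
            ws.block_login(user)


class Workstation:
    def __init__(self):
        self.locked = False
        self.blocked_users = set()

    def lock(self):
        self.locked = True

    def block_login(self, user):
        self.blocked_users.add(user)


# Teams D/E: the paged number ultimately just calls on_alarm() remotely.
server = LockdownServer()
ws = Workstation()
server.register("montreal", ws)
server.on_alarm("montreal")  # remote, office-wide lockdown
```

Each method is innocuous on its own; it's whoever calls `on_alarm` who decides whether this is a burglar alarm or a raid response.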

If anyone is doing anything unethical here, it's the managers and, possibly, team E (executing the lockdown).

When the system is compartmentalized like this, the "anyone doing anything unethical here" is surely the senior leadership who can see the full system and leverage its capabilities for unethical purposes.

Why are we assuming this is something not already covered by law? It's pretty well defined, and it's criminal.

So, in short: if you trigger this system in reaction to a police visit, you are guilty of obstruction of justice and can face up to 10 years in jail (in the US, in my state). There is nothing the company can do to shield you from this.

Anyone who has ordered you to do this would be an accessory to obstruction of justice, punishable by up to 10 years in jail (not 100% sure).

Neither of those people can defend themselves by saying that it was in the company's interest, or that they were ordered to do so. This is criminal law, and your claim would be akin to saying that your boss asked you to murder someone. If you do it, it's your ass (of course ordering a murder is a crime too, but it doesn't absolve the murderer). This goes to quite extreme lengths and even applies in cases of coercion (so if you murder someone because your boss kidnapped your daughter, you're still a murderer and can still be convicted as such).

The same would apply here.

One hopes Uber is not just throwing employees under the bus (because the IRS, I guarantee you, would get an obstruction of justice prosecution started if Ripley happened to them; they have sent people to jail for far less).

> It's not that simple.

I agree, but in a different way. Who's to say that it's not governments that are the major criminals here? Many Westerners would agree regarding China, North Korea, Iran, and so on. But what if the laws being used against Uber are products of corruption, meant to stifle competition with the taxi industry? That seems likely.

But whatever. This is some damn impressive OPSEC, no? I mean, serious anarchocapitalist OPSEC. And that just brings a smile to my face.

Reminds me of "Snow Crash".

I am no Uber fan, but I'd have no problem with a tool like this, if I understand it correctly.

I like to believe that if there's a legitimate use for a tool, then the tool is fine. What's the difference between a lawful search warrant and robbery? Why do we focus on the text rather than the fact that the leadership created a protocol and a scheme to do this? Why does Travis get a pass?

"It should be noted that no ethically-trained software engineer would ever consent to write a HideDataFromPolice procedure. Basic professional ethics would instead require him to write a HideDataFrom procedure, to which Police could be given as a parameter."

- Nathaniel Borenstein (paraphrased)

"The only original rule in engineering ethics was not to compete against other engineers on the basis of price." (Henry Petroski, from memory)

Software engineering doesn't even follow that rule.

Not all government officials and police officers are the same. Take, for example, the Magnitsky scandal in Russia.


TL;DR: the police had given materials seized during raids to organized criminals, who used them to fraudulently reclaim $230M of taxes previously paid by the company.

Come on, software engineers love writing crypto software to hide information from the government…

In the case of a robbery you wouldn't have time to call the number. Tech offices don't tend to get robbed while people are at their computers. And if they aren't, devices should be locked, and the files with them.

Indeed, the tool was developed in an environment where government agencies and related LEOs are categorized as antagonistic and hostile entities, oftentimes operating at the behest of Uber's competitors.

Surely we're all aware that financial avarice is rampant in developing/emerging markets and, when presented with an unaccommodating adversary, these developing/emerging markets will readily resort to "fishing expeditions" and other avenues of blatant extortion.

> OK, when (not if) law enforcement comes to raid one of our offices, how can we best block a lawful search warrant?

When writing end-to-end encrypted chat, client-side encrypted hosting/backup, etc. you ask yourself the same question. And yet I consider those ethical.

By that logic, you do the same when you close your door.

This is ridiculous; clearly there's a sliding scale of how much morality versus how much immorality you enable. How far along that scale is a tool whose sole purpose is to block lawful warrants?

> a tool whose sole purpose is to block lawful warrants?

From the article:

The three people with knowledge of the program say they believe Ripley’s use was justified in some cases because police outside the U.S. didn’t always come with warrants or relied on broad orders to conduct fishing expeditions.

Talk about misuse, sure. But there's nothing at all about that tool that restricts it to blocking lawful warrants.

> a tool whose sole purpose is to block lawful warrants

There's no hint in the article that this was the sole purpose. From the article:

> Employees aware of its existence eventually took to calling it Ripley

So it was not codenamed from the start, and there's an implication that a relatively small circle of users knew of it.

> Workers in Uber’s IT department were soon tasked with creating a system to keep internal records hidden from intruders

Doesn't necessarily seem like something an ethical developer should refuse to do - it could easily be "sold" internally as security software protecting the sensitive information in case of network intrusion.

Who says that that's what the end developers were tasked with creating? Maybe they were told they were creating security protocols in case of robberies. We shouldn't make assumptions about what the lowly developer knew or didn't know going into the project, but instead focus on the bigwigs who 100% knew what this project was for.

I think you can reasonably substitute "developer" for "person" and the grandparent's point still stands.

Sure, the point I was making is this whole thread is focused on these evil developers who are unethical and immoral and horrible for what they did.

Which is the wrong approach. We don't know enough details to know what they knew or didn't know going into the project. If Uber at the highest level is okay with deceiving the government, then it's not crazy to think it wouldn't inform its developers about the potentially illegal things it was going to do. And this was all in 2015-2016, when Uber was still a "scrappy startup" in people's eyes and not the monster it's since been exposed to be.

However, we 100% know of people who did know about the project, specifically the higher ups. Our anger should be directed at them because they have no excuse.

Oh I'm firmly in the camp of "the fish stinks from the head" (I don't know if this Greek proverb translates, but I think there's a similar one in English).

I've heard it with "rots" in place of "stinks", but otherwise verbatim.

>how much morality versus how much immorality

Morality and legality are both very subjective, either to the person or to the legal system.

Depends. Are lawful warrants moral? There are any number of instances in history when the laws have been immoral.

Is the use of state power to enforce taxi monopolies immoral?

Just tell the programmer it's only for use in the company's China offices.

Many companies openly discuss their countermeasures to protect their customer and internal data from lawful requests from the Chinese government. I'm not saying it's right or wrong, just that it's not a secret.

And then once the programmer writes it, it's a piece of cake to make it do the same thing in Canada.

> Many companies openly discuss their countermeasures to protect their customer and internal data from lawful requests from the Chinese government.

This seems like a serious counterpoint to "they were using it against lawful warrants therefore it's bad". Arguing that line and simultaneously excusing defensive measures against China, Turkey, or other governments isn't easy.

The obvious resolution is "ethics aren't equivalent to law", and using this tool to dodge ethical requests is still bad. But that's a far more discretionary standard than a lot of the positions I'm seeing here.

But what are "lawful warrants"? The Chinese government must indeed consider its actions to be lawful. So who sets the standard?

Maybe Uber just wants to ensure that discovery proceeds after careful negotiation.

And yet Apple regularly receives praise here for thwarting law enforcement.

The difference seems obvious to us, but it wouldn't be to someone outside of tech, and similar justification will be used by the people building the Uber program.

I suspect the large majority of entities who are stymied by Apple's extensive encryption, but who wouldn't be stymied by trivial encryption, are law enforcement and other state entities.

I also suspect that Uber's protocol has been invoked against intruders and/or commercial visitors.

So while it's "Apple good, Uber bad," the difference is gray, not black and white.

> And yet Apple regularly receives praise here for thwarting law enforcement.

That wasn't my takeaway from the Apple/FBI interactions.

They said pretty plainly: "We'll help however we can, but even we can't get into our customers' data. We're also not going to break that so that anyone, including you, is able to."

Isn't that a big difference?

What is “trivial encryption”? This is something that keeps out “bad guys” but not “good guys”? Pretty sure that doesn’t exist.

Apple protects its customers. Encryption that stops cops also stops criminals.

Uber is only protecting its profits. The customers are incidental to its blatant lawbreaking.

Apple protects its profits. Encryption that enables privacy protections for consumers also stops cops investigating crime.

Apple is not your friend and does not have your best interest at heart. It is a company like Facebook, Microsoft, Ford, Boeing... They all just want the money. If making hard to break encryption sells more units, then that is the only motivation.

There's no evidence that Apple's ideas about privacy or encryption have any positive effect on sales. The only people who might even be aware of it are tech people, and that doesn't move the needle at all. See the tech people gripes about the Macbook Pro, which went on to sell well anyway.

Do you think Apple is motivated out of the goodness of its collective heart?

Do you propose that it is impossible to "do the right thing" and make a profit?

> OK to write a program whose sole purpose is to try to break the law

Is it 'breaking the law' to turn off computers when you see the police knocking at your door (or in the parking lot) but don't know why? Meaning they haven't presented a warrant or even exchanged words with you yet?

Even in the particular event cited: "about 10 investigators for the Quebec tax authority burst into Uber Technologies Inc.’s office in Montreal"

The act of bursting in but not (yet) saying anything? I would argue that no laws have been broken.

Edit: A similar situation might be seeing flashing lights behind you while driving. It doesn't mean the police are there to stop you; it could be someone else, or a mistake. You haven't been officially notified of an infraction.

Breaking into an office is not the same as presenting a warrant and then preventing access to the data detailed in that warrant.

Were they blocking searches with warrants, or warrantless raids? And was what they did illegal? I read the article but was left unclear on both of these key questions.

>In May 2015 about 10 investigators for the Quebec tax authority burst into Uber Technologies Inc.’s office in Montreal. The authorities believed Uber had violated tax laws and had a warrant to collect evidence.

Per my comment from a while ago, I think what they did was illegal. [1]

If they had simply kicked off users who were likely cops, that would have been legal. That’s what (some) biker bars do.

But they went beyond that and actively misled such users in a way that wasted their time. That amounts to lying to investigators and is illegal.

You can kick people out of your bar for giving off a cop vibe. You can’t keep them around and feed them fake intel on drug deals so they waste their time.

You can pull your shade down to keep people from looking in your window. You can’t make shadow puppets of a homicide with the intent to get the cop outside to investigate a spurious lead.

[1] https://news.ycombinator.com/item?id=14270214

>You can’t keep them around and feed them fake intel on drug deals so they waste their time.

Any relevant case law on this? Lying to the police isn't allowed, but lying to some new guy at the bar is legal, regardless of your suspicions of who the new guy is.

I don't; I'm just basing it on the comments in the previous megathreads [1]. But any kind of prohibition on "lying to the police" is going to be bound up with the issue of the defendant's state of mind. They're guilty if they believed, e.g., a particular man was a cop and were deceiving him on that basis. And it looks like that applies here and was what Uber was doing.

Believing that you're merely lying to "the new guy" or "an abusive user" isn't enough, as you note, so that part of the program (send spurious data to likely bots) wasn't illegal.

[1] https://news.ycombinator.com/item?id=14269708


Why can't you keep them around and feed them fake intel on drug deals so they waste their time?

>I think the biggest issue here is that Uber actually has programmers on its staff who think it's OK to write a program whose sole purpose is to try to break the law.

People who grew up in a situation where the law, and the institutions that create and enforce it, did them little good and plenty of bad (e.g., the urban poor) often have little respect for it (though they may follow it, as that's usually the least-worst option when you're moving up the economic ladder).

I don't see a problem with software that shuffles data and access around to create cross-jurisdiction paperwork headaches for the police. Forcing them to cross their t's and dot their i's before going in makes it much harder for people with political ambitions to go fishing for stuff to put on their resume. We all know that law enforcement fairly often practices saying no via similar "oh, you have to go get the info from these people" spaghetti with FOIA requests.

When a lawyer tells someone to not talk to the cops, are they doing anything wrong? Because the lawyer is basically asking 'what can I do to stop a lawful interrogation'.

Does this change if the person is suspected of having pot or of killing children?

In the U.S., people have the right to remain silent during interrogations, so there's no problem with a lawyer advising a client to exercise that legal right. A lawyer can get in trouble (e.g., get disbarred) if they advise a client to break the law, e.g., by committing perjury or destroying evidence.

> Does this change if the person is suspected of having pot or of killing children?

The U.S. legal system has a presumption of innocence, so merely being suspected of committing a crime shouldn't affect the legal rights you're entitled to. (Many innocent people become suspects in police investigations.)

Some of these things could be specced in an obfuscated or completely fragmented way that no one really had any great idea of what the complete product was. That said, there had to have been someone who integrated everything together.

Disclaimer: I'm not defending anyone who breaks the law. I'm just explaining why and how it might work from my point of view.

It just depends on how the why and the what were explained to people. In the end, these engineers might not even have had any idea that the thing they produced would be used this way.

Additionally, I know a bunch of people who developed similar systems for small companies in my country. However, the situation there is different: law enforcement institutions are often misused to extract bribes from business owners. It means the less they can find in your office, the lower the chance they take your money away for nothing.


People name internal programs badly all the time. That shouldn't be an instant "they're unethical" flag. Furthermore, the developers working on these projects could have easily been told they were building something completely different than how it was used. We shouldn't make assumptions about the developer when there are people that we know knew what was going on.

And is the last line necessary?

> a system to keep internal records hidden from intruders entering any of its hundreds of foreign offices

That’s how the engineers must have been convinced to take ownership of the requirement.

It seems generally useful against any physical intrusion - presumably it could have been constructed against crackers or intruders of any nature.

Since you need to be aware of the raid to use the tool, it seems particularly useful vs. attackers that aren't afraid of being noticed, and whom you cannot keep out otherwise. Particularly the police, in other words. Not that it's impossible it'd be useful otherwise...

>I mean, someone actually sat down and said: OK, when (not if) law enforcement comes to raid one of our offices, how can we best block a lawful search warrant?

But was that 'someone' a programmer? I'd say whoever programmed this probably did so on order of their boss. Tech isn't immune to the same pressures that exist in every field. Basically, make the boss happy.

If only programmers could form some sort of collective to advocate for their common interests and hold their bosses accountable to ensure our profession is kept to a high ethical standard while protecting programmers from backlash when they refuse to break the law for their employers.

Yes, when I contemplate the paragon of ethics, the image which invariably comes first to mind is that of a trade union.

Who's talking about a paragon of ethics? Trade unions don't exist because they are supremely ethical. They don't even exist because they are more ethical than corporations (they're probably not.)

They exist because a corporation is inherently coordinated in a way that gives them power over workers. Unions just allow workers to adopt that same inherent coordination to seize power back. Unions aren't a moral force, they're just a counter-balance.

Programmers, unlike many other professions, are in the happy position that their skills are in high demand. They can afford to have more moral fiber, because it's easier for them to find a new job. I'd also assume that a lawsuit, in case they were fired for refusing to write software whose purpose is to break the law, would go pretty well, but I'm not a lawyer.

>Programmers, unlike many other professions, are in the happy position that their skills are in high demand. They can afford to have more moral fiber, because it's easier for them to find a new job.

What about all those high-powered MBAs and other finance types? Most anyone who's received an Ivy League education is going to be in demand. Yet would we really expect more ethical behavior from them? Some would say we would expect worse. Programmers are no different from any other worker.

>I'd also assume that a lawsuit, in case they were fired for refusing to write software whose purpose is to break the law, would go pretty well, but I'm not a lawyer.

Yes but if you filed that lawsuit you wouldn't exactly be 'in high demand' anymore, would you?

Actually, there is a vast surplus of MBAs.

An MBA largely has value because it acts like a cult promoting networking between MBAs. It's arguably a great hack to extract value from organizations as opposed to customers.

>> An MBA largely has value because it acts like a cult promoting networking between MBAs. It's arguably a great hack to extract value from organizations as opposed to customers.

I don't think this applies to the average MBA (full disclosure, I have an MBA). Most MBAs are just normal people who are just trying to do their job better.

Having said that, I guess some grads of the "most elite" business schools do that cultish thing you're talking about, but I would argue it has less to do with having an MBA than being a member of that top school. Those people tend to be the most visible representatives of MBAs while only being a very small portion of all people with MBAs.

By 'high-powered' I mean to say top 5 or 10 program graduates.

You are still talking about ~5,000 new MBAs from 'top' US schools per year, even trying to be that exclusive. Over 20 years that's ~100,000 people, which does not seem insane, but you can still be very selective while drawing from that pool.

Sure, they tend to have high earnings, but this is the same pool that both wants to and can get into a 'top' school, which is a rather biased sample.

Sure but that's not really related to what I'm talking about. A top 5 MBA obviously possesses an extremely high capability to find alternative employment if they quit their current job (for ethical or other reasons.) Yet we would never expect those graduates to be more ethical than the average low-demand worker.

What you're saying seems akin to saying that a Stanford MBA is in less demand than the average programmer. (1) I think that's wrong. (2) It's really avoiding the point, which is that being in demand doesn't really imply that those individuals will act more ethically.

One way of measuring demand is to look at how long the average job search lasts. In that context, Stanford MBAs often take a rather long time between jobs; 12-18 months is not uncommon.

Various euphemisms exist, like taking a break or the ever popular consulting. But, getting a job in school or being poached while employed is the happy path. Get off that path and it can be very hard to get back.

PS: I don't think demand changes the average much, but it's often outliers that stand out in the first place. So I do think demand is meaningful, as it reduces the burden of ethical behavior, which impacts people at the edge.

>> They can afford to have more moral fiber,

Your assumption ignores the fact that morals are relative and can be built around individual life experiences. There are plenty of segments of the population who don't trust the government or law enforcement organizations because they've seen or experienced corruption.

Your assumption also ignores the fact that some people could have debt and mouths to feed other than their own. People who aren't sitting on a huge rainy day fund could be risk averse when it comes to career changes.

Sure, or maybe we can just accept that there are a lot of programmers that aren't saints or innocent victims of overbearing bosses, but actually are happy to use their skills to break the law?

You can make a group of programmers think they're pirates going against the establishment, and they'll do anything under that idea.



Pressure isn't an excuse for illegal and unethical behavior; we all have pressures and responsibilities. I'm surprised so many programmers and others were willing to participate, and to do it so reliably that Uber wasn't afraid someone would leak what was going on.

most of this sounds like good practice for any company that stores large amounts of sensitive data.

perhaps the features were used inappropriately, but I would hope Uber can remote wipe a laptop, log users out of company systems, or centrally enforce encryption policies.

this line is just silly.

--"Later versions of Ripley gave Uber the ability to selectively provide information to government agencies that searched the company’s foreign offices. At the direction of company lawyers, security engineers could select which information to share with officials who had warrants to access Uber’s systems, the people say"

What is the alternative? Giving law enforcement access to all data without any discretion? Querying ride data for 1 person is technically "selectively provide information", but that seems perfectly acceptable.

I agree.

The majority of people on this site are from countries that operate under the rule of law. I understand that they'd feel uneasy about these subterfuges.

But in places like Russia, China, Belarus, most of Africa and parts of Latin America these resources are more than justified. You should fear the police as much as the bad guys.

Actually, as a non-American, if I were to land at a U.S. airport with a computer or cellphone, I'd also take careful measures to avoid abusive searches, even if they're substantiated by law.

The headline of this is that it was implemented in response to a legal police search that resulted in Uber getting banned in a country because it was breaking the law. It was then repeatedly used to obstruct justice.

This wouldn't be news if I told you that Intel makes its engineers encrypt their hard disks and require passwords on wake and insist no one leave their computer unlocked.

The news here is that they're using this technology specifically for obstruction of justice.


It's not the technology in question, it's the decision to use it for this specific purpose.

But locking out all users by office? I can't think of a legitimate use case for that. Locking/wiping the devices of people who have left the company is standard, but why for a whole office? Robbery doesn't really apply either: either they do it when no one's in the office (devices locked anyway), or you won't be able to call HQ.

A certain segment of the population - disproportionately represented on HN - strongly dislikes both Uber and cops. It's always interesting to see how people react to stories in which two groups they dislike are pitted against one another.

I don't think people here have a problem with police. You might be confusing that with a distaste for homicide and assault by police.

Those don't just automatically go hand in hand. And police accountability is not some kind of team sport.

People here are talking about enforcement of actual laws.

Actually for me, as someone who has strong negative opinions towards both Uber and police brutality, it basically comes down to this: a strong desire for fairness.

I have a strong dislike for companies that willingly flout rules that are meant to level the playing field. And this is not the same as companies that disrupt (in fact, the taxi industry was badly in need of disruption and was NOT a level playing field before ride-sharing).

Same goes for police. If a policeman has murdered someone, they should be held to trial just like a civilian would. But we all know that's not actually what happens.

I don't think I belong to any of those groups, but Uber seems like a company that operates on the edge of the law, which I don't find all too attractive.

As one of those people, it makes me happy that an obviously sleazy company like Uber is giving the cops something useful to do, so they aren't harassing more ethical businesses.

It's like when the small-time drug dealer and the top bully at your high school get into a fistfight. The ideal outcome is that they both get a savage beating, and then get expelled.

I actually don't care for either, as classic Uber is not legal where I live; here, Uber is just a regular taxi app.

Also, the police are fine here, and so are the cab drivers.

Yeah, this sounds like a great tool. Of course it can be used for good and bad. Hell, even "cp" is a potent copyright-violating tool if used for that purpose.

I can easily see a toolkit that makes sure everything is FDE with a distributed key network, and revocation from anywhere if needed. I also see remote distributed shutdown requests, sealed storage locking, remote device nuking, and plenty of other features if a device falls into the wrong hands.... even if that is local law enforcement.

Part of this also feels like the Neuromancer universe, where companies are the state actors, and the real states have only limited jurisdiction.

(And yes, I would help build a set of tools like this. They have multiple purposes, legal and illegal. Not my fault if someone uses them illegally in a jurisdiction. )

They should have deployed this just once for a US case. I assume some of their communications travel across state lines and that would make their obstruction of justice and evidence tampering a federal crime. With our beyond vague definitions of such crimes, every Uber executive could have been in jail for decades by this point.

So prior to the next warrant served to Uber, authorities will sever all communication lines run into the suite, right?

That itself would trigger the shutdown: no heartbeat signal, shut it all down.
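A dead-man's switch like that is only a few lines. A minimal sketch (the timeout value and names are made up; the clock is injectable so the logic is testable):

```python
import time

class DeadMansSwitch:
    """Lock down if no heartbeat arrives within `timeout` seconds."""

    def __init__(self, timeout=30, clock=time.monotonic):
        self.timeout = timeout
        self.clock = clock           # injectable for testing
        self.last_beat = clock()     # treat construction as the first beat

    def heartbeat(self):
        # Called on every ping from HQ; resets the countdown.
        self.last_beat = self.clock()

    def should_lock_down(self):
        # True once the line has been silent longer than the timeout.
        return self.clock() - self.last_beat > self.timeout
```

So severing the lines trips exactly the trigger the severing was meant to avoid.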

Sounds like a fun system to make, context aside.

Is there an OSS version of this somewhere? Contact a server; the server initiates clients on all devices to perform a lockout with pre-arranged credentials. The client can clean caches, wipe partitions, etc., as required.
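Not aware of one off the shelf, but the client side is straightforward to sketch. In this hypothetical, the endpoint, command vocabulary, and cleanup actions are all invented; a real client would run `poll_once` in a loop and actually execute the actions rather than record them:

```python
# Hypothetical lockout client: poll a coordination server; if it orders a
# lockout or wipe, run pre-arranged local actions. Everything here (URL,
# commands, action names) is invented for illustration.
import urllib.request

SERVER = "https://lockout.example.invalid/status"  # hypothetical endpoint

def parse_command(raw):
    """Normalize a raw server response into a known command."""
    cmd = raw.strip().lower()
    return cmd if cmd in ("lockout", "wipe", "idle") else "idle"

def cleanup(command):
    """Pre-arranged local actions; recorded here rather than executed."""
    actions = []
    if command in ("lockout", "wipe"):
        actions.append("lock-sessions")  # e.g., log out all local sessions
    if command == "wipe":
        actions.append("wipe-scratch")   # e.g., shred scratch partitions
    return actions

def poll_once(fetch=lambda: urllib.request.urlopen(SERVER, timeout=3).read().decode()):
    try:
        command = parse_command(fetch())
    except OSError:
        return []  # unreachable server: policy decides if that's a trigger
    return cleanup(command)
```

Whether an unreachable server should itself trigger a lockout (dead-man's-switch style) or be ignored is the key policy choice.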

Nahhh, I wouldn't rely on a server. I'd rely on a distributed network, or a Tor gateway to an onion site that is HA'ed with OnionBalance. Even better if you can put in a 3G SIM and have it talk over that as a backup.

Most of the time, your adversary is a thief with low/mild tech intelligence who'd let it talk to the net. Then it's game over, the data's gone, and you have a portable spy rig if you play your cards right.

> Like managers at Uber’s hundreds of offices abroad, they’d been trained to page a number that alerted specially trained staff at company headquarters in San Francisco

Isn’t this “tipping off”? Which is a crime in itself.

In what respect is "tipping off" a crime? Is it a crime if, after seeing a speed trap, I flash my brights at the next motorist?

I'm not a lawyer, but unless there's a gag order, I don't think it's a crime to tell someone you're being raided by police, especially if that someone is your security or legal staff.

>Is it a crime if, after seeing a speed trap, I flash my brights at the next motorist?

Where I grew up, yes, that is a crime, at least in the sense that it's something to be ticketed for; they're not going to haul you to jail.

Where I grew up, they could ticket you but it wouldn't stick because I might flash my lights for any number of reasons. The fact that there was a cop nearby was mere coincidence.

In what respect is "tipping off" a crime?

In the UK, with respect to the Proceeds of Crime Act 2002. Financial regulations tend to be fairly universal in the West, so I don't doubt Canada has them. Start jailing people who send these messages and it will soon stop.

Yes, in some countries tipping off is a crime, especially warning about speed traps.

Canada passed a law against it, the Proceeds of Crime Act.

Legal in this context only means "The government gave themselves permission." It does not mean ethical, or right, or whatever. And with many police in many jurisdictions acting unethically (like stealing cash from motorists, bribery, blatant criminal conduct, and USA DHS duplicating phones/data), yes, you do have to take matters in your own hands.

Now, if they're found guilty, I have no problem "executing" the corporate charter as a death sentence. Just laws need to have teeth, and no piddly $10M fine is a deterrent.

It never fucking ends with this company, does it?

> Ripley, after Sigourney Weaver’s flamethrower-wielding hero in the Alien movies.

Sounds more like Ripley's Believe It or Not.

What the heck does that illustration even mean?

What's interesting to me is that governments are becoming less and less important in society. With Google, Facebook, Uber, and Bitcoin, we're moving closer and closer to a world where human-based systems (government) are being supplanted by computer based systems (technology).

I for one believe this can be a good thing, since humans corrupt but computers are strictly deterministic.

I'd argue that government gets more and more power with technological advances. Just a few decades ago, regimes could only dream of total surveillance; now the Chinese government is putting it into practice, and I'm sure many will follow.

That's more of an example of human corruption, IMO. Counterpoint: people use various technologies (VPN) to bypass the great wall.

I don't see how your description of the China situation restores your point that governments are becoming irrelevant. China is deploying facial recognition and AI to figure out who you know and who's helping you and in a cashless society they can penalize you by cutting off your cards and those of people who try to still help you.

I didn't make that claim with regard to China, but I see your point. I've also never been to mainland China, so I don't really know how things are there.

The rules for the computers are still made by humans.

That's true, but they are still deterministic and can be designed to be egalitarian.

Humans, on the other hand, cannot be trusted in the way that a computer can.

In theory they could be more egalitarian but the original comment was about big corporations taking over from governments with their tech.

These corporations care mainly about profit, so I don't think whatever rules they come up with will be good for society.

Right, that's the scary part. A double edged sword.

That is why I think we need collaboration on open source software rather than competition by company written and siloed software.

> computers are strictly deterministic

Maybe in theory, but I haven't found my computers to behave that way.
