> According to a lawsuit filed today in federal court in California, Waymo accuses Anthony Levandowski, an engineer who left Google to found Otto and who now serves as a top-ranking Uber executive, of stealing 14,000 highly confidential documents from Google before departing to start his own company. Among the documents were schematics of a circuit board and details about radar and LIDAR technology, Waymo says.
> The lawsuit claims that a team of ex-Google engineers used critical technology, including the LiDAR laser sensors, in the autonomous trucking startup they founded, and which Uber later acquired.
I was confused as to what stealing a patent actually meant:)
Waymo has also posted this....
From this post...
> Recently, we received an unexpected email. One of our suppliers specializing in LiDAR components sent us an attachment (apparently inadvertently) of machine drawings of what was purported to be Uber’s LiDAR circuit board — except its design bore a striking resemblance to Waymo’s unique LiDAR design.
> We found that six weeks before his resignation this former employee, Anthony Levandowski, downloaded over 14,000 highly confidential and proprietary design files for Waymo’s various hardware systems, including designs of Waymo’s LiDAR and circuit board. To gain access to Waymo’s design server, Mr. Levandowski searched for and installed specialized software onto his company-issued laptop. Once inside, he downloaded 9.7 GB of Waymo’s highly confidential files and trade secrets, including blueprints, design files and testing documentation. Then he connected an external drive to the laptop. Mr. Levandowski then wiped and reformatted the laptop in an attempt to erase forensic fingerprints.
Ooops, that does sound bad after a first read.
Fun times, fun times. We got a few people fired, but that was about it.
Then, about 2 years after they'd canceled, our support department got a strange email from a user who was having problems. At first, the support rep didn't understand the issue and asked the user for screenshots of the problem. Sure enough, the screenshots looked exactly like our old product, which had been decommissioned over a year prior. It even had our support email address and phone number on it. It turns out the version of the software they developed in-house started by scraping our site, renaming all the .html files to .aspx, and then making the dynamic parts data-driven.
When we called them on it, they basically threatened to use lawyers to put us out of business if we pursued the issue, so I guess our management decided to drop it. We did make them change the support email and phone number.
This is why we can't have nice things. Darned lawyers. Can't live with them, can't live without them
As Cardinal Richelieu said about 400 years ago: "Give me six lines written by the most honest man in the world, and I will find enough in them to hang him."[1]
1. The attribution of this quote is disputed.
There was nothing we'd done as a company that was actionable.
Patent trolls are one example of this - where they may deliberately select weaker defendants (instead of going after the bigco "infringers") to ensure they hold the upper hand in the financial power imbalance.
Some day people will say the same of software engineers.
A lawyer can screw you badly, but if you want to screw a lawyer you need another lawyer. If taken as a category, they never lose.
His mistake was that he used the company internet connection to upload said source files to something like a personal GitHub or Dropbox, so there were proxy logs of him stealing corporate secrets.
Most people would've gotten their personal-ish data off a corporate laptop with a USB key/hard drive, which probably wouldn't have triggered any alarms.
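For illustration, here's a rough sketch of the kind of proxy-log rule that could catch an upload like that. The log format, domain whitelist, and size threshold are all invented for the example, not drawn from any real proxy product:

```python
import re

# Hypothetical proxy log format: "TIMESTAMP USER METHOD URL BYTES_SENT"
LOG_RE = re.compile(r"^(\S+) (\S+) (\S+) (\S+) (\d+)$")

ALLOWED_DOMAINS = {"corp.example.com"}   # assumed corporate whitelist
UPLOAD_THRESHOLD = 50 * 1024 * 1024      # flag uploads over 50 MB (arbitrary)

def flag_suspicious_uploads(log_lines):
    """Return (user, domain, bytes) tuples for large uploads to outside domains."""
    flagged = []
    for line in log_lines:
        m = LOG_RE.match(line)
        if not m:
            continue
        _ts, user, method, url, sent = m.groups()
        if method not in ("POST", "PUT"):     # only outbound transfers
            continue
        domain = url.split("/")[2] if "//" in url else url.split("/")[0]
        if domain not in ALLOWED_DOMAINS and int(sent) > UPLOAD_THRESHOLD:
            flagged.append((user, domain, int(sent)))
    return flagged
```

A rule this simple already separates "pushed 100 MB to a personal file host" from ordinary browsing, which is all the evidence described here would require.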
Most top banks use Citrix technology (or similar virtual desktops) to stream workstations to employees accessing them both locally and remotely, so the only way you can get data out is by using your phone to take photos.
While most such companies actively MITM their HTTPS connections (I wonder whether Google does that - theoretically they are a CA, so it would be extra easy for them), you could probably get away with uploading said file to any service that is small enough to be unknown to corporate firewall providers (Symantec, etc.).
We're talking about reducing a ~$25000 part to ~$2500.
Just seems like a huge risk to save maybe $10k per vehicle (assuming price drops for everyone) for some relatively short time period.
An electrical engineer is also going to see the Uber name and expect Valley idiocy, while being a secondary concern for not being softwary enough.
This one broke many laws and needs to go to jail!
If Google has the information they claim they do, they have persuasive arguments to anchor their damages calculations on the basis of Otto's acquisition value. Times three.
This is a sword of Damocles hanging above Uber.
I do believe them: many media firms have the same setup between the journalism and advertising departments.
FWIW, David Drummond, longtime Chief Legal Officer, took a board seat after GV's $250m investment in Uber in 2013 but eventually stepped down last August.
Sounds like a rookie move to be downloading this stuff on his work laptop either way.
But I wouldn't, because I'm not evil.
"Ah you know, I forgot my charger! Dang! I'll just plug this into the USB port of this sooper secure machine connected to this sooper secure network."
Just extract the device certificate from one device and store it on another. Problem solved.
Extracting data from the TPM or equivalent stores in ARM devices is also not impossible, as the DRM-breaking community has shown with extracting keys from TPM-based DRM.
Google can only verify what hardware you're running by sending a packet via ethernet to your device. You control all software running on your device, and can send a spoofed result.
If you were crazy, you could even just emulate Google's hardware entirely and proxy all requests to that emulated hardware.
Nonetheless, while this guy certainly wouldn't be able to do it, many Google employees would.
They're already doing something similar, after all.
Handwaving away all of this as just a minor nuisance is silly. An attacker would have to find some unknown side channel or try to physically modify the TPM to get at the data, either approach means the attacker has significant resources, certainly well beyond the means of our hypothetical attacker. Heck, it's been speculated that even the NSA couldn't get data out of something like Apple's Secure Enclave without risking destroying it in an attempt.
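To illustrate why spoofing or replaying results fails against a proper design: a minimal challenge-response sketch, with an HMAC standing in for a TPM-resident signing key. The function names and key handling are simplified assumptions for the example, not any real TPM API:

```python
import hashlib
import hmac
import os

def tpm_quote(device_key: bytes, nonce: bytes) -> bytes:
    # Stands in for the TPM signing a fresh challenge with a
    # non-exportable key; without the key, the right answer
    # cannot be computed or replayed.
    return hmac.new(device_key, nonce, hashlib.sha256).digest()

def verify_device(known_key: bytes, quote_fn) -> bool:
    nonce = os.urandom(32)               # fresh challenge every time
    response = quote_fn(nonce)
    return hmac.compare_digest(response, tpm_quote(known_key, nonce))
```

A genuine device (one that can actually invoke the key) passes; an emulator or proxy that merely "sends a spoofed result" fails, because the expected answer changes with every nonce.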
I’m more surprised a Google engineer couldn’t circumvent it.
Yes, of course it can be broken; it only takes 2^256 attempts to do so.
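For scale, a quick back-of-envelope in Python (the guess rate is an arbitrary, generous assumption):

```python
# Back-of-envelope: even at a generous 10^12 guesses per second,
# exhausting a 256-bit keyspace takes astronomically long.
attempts = 2 ** 256
rate = 10 ** 12                      # guesses per second (assumed)
seconds_per_year = 60 * 60 * 24 * 365
years = attempts // (rate * seconds_per_year)
# years comes out on the order of 10^57; the age of the universe
# is roughly 1.4 * 10^10 years.
```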
Edit: much better link than the above: http://www.altium.com/documentation/3.0/display/VAULT/Altium...
Is that better than git for binaries?
Given the adversarial nature of the system and the many past comments on HN to take claims by litigants with very large grains of salt when it came to the Oracle or Facebook or Oculus lawsuits, I'm surprised at the apparent credulity on display. Google will huff and puff just as much as anyone else when it comes to lawsuits.
Even with no tenable claim at all, the absent cofounder has a gun to your head: a proposed liquidation with a civil suit pending simply isn't going to close. There are time limits on all of this stuff, and the acquirer is simply going to say "fuck it" and walk if they can't predict when the deal will close. A detail I think people who've never sold a company don't realize is that the legal costs for both sides of a deal that closes uneventfully can get close to 7 figures.
I guess this is valuable information for startup founders. It's also a reason you should run, not walk, from any early-stage business partner that wants to negotiate or complexify vesting. 1 year cliff, 4+ year vest, the way everyone does it, or go start a different company.
Is this a guess?
A representative for Guillory declined to discuss the settlement amount, but said the terms were "mutually agreeable." As part of the settlement, the parties have both agreed to dismiss their lawsuits.
Ouch, this doesn't play well for the supplier either!
Presumably if Uber wins, it'd have a solid case against the supplier. If it loses, or while Waymo v Uber is ongoing, might there be a case anyway?
I'll bet the supplier searched for the engineer's name in his inbox and responded to a big email thread about the project with many people involved. The Otto engineer's old colleagues got the email instead of him.
Also an argument that "we stole stuff but you're the one who leaked it so you should be punished" won't play well with a jury.
Can they really claim that the supplier damaged them by leaking trade secrets if the trade secrets aren't theirs?
> Can they really claim that the supplier damaged them by leaking trade secrets if the trade secrets aren't theirs?
If Uber wins, they are Uber's secrets.
Indeed. The current title of the article is
> Alphabet's Waymo Alleges Uber Stole Self-Driving Secrets
Secrets, not patents. If "patents" was in the original title, perhaps the author was confused.
How do you steal a patent? If you steal (to be clear, illegally take) information and then use that to get a patent, is that stealing a patent? Is there case law on something like that?
Google probably has the largest team of internal forensics software developers.
>>> To gain access to Waymo’s design server, Mr. Levandowski searched for and installed specialized software onto his company-issued laptop. Once inside, he downloaded 9.7 GB of Waymo’s highly confidential files and trade secrets, including blueprints, design files and testing documentation. Then he connected an external drive to the laptop. Mr. Levandowski then wiped and reformatted the laptop in an attempt to erase forensic fingerprints.
The bit about connecting the external drive is interesting, but I guess there's probably a ghost of that action somewhere on the drive (assuming you could more or less restore the drive to its state before it was wiped).
Or he didn't do a secure wipe when he reformatted the drive and google inspected the computer when he returned it.
OR google modifies their laptops and has separate chips logging this stuff, which would be fairly impressive!
Or he didn't actually do any of this alleged stuff and it's all innocent.
We will find out in the court case either way!
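On the reformat point: a toy model of why a quick format, unlike a secure wipe, leaves file contents recoverable. The "disk" layout here is invented purely for illustration, not how any real file system lays out data:

```python
# Toy disk: a quick format rewrites only the "allocation table" region
# at the front; file contents in the data region survive and can be
# carved out later by forensic tools. A secure wipe overwrites it all.

DISK_SIZE, TABLE_SIZE = 1024, 64

def new_disk():
    return bytearray(DISK_SIZE)

def write_file(disk, offset, data):
    disk[TABLE_SIZE + offset : TABLE_SIZE + offset + len(data)] = data

def quick_format(disk):
    disk[:TABLE_SIZE] = bytes(TABLE_SIZE)     # clears metadata only

def secure_wipe(disk):
    disk[:] = bytes(DISK_SIZE)                # overwrites everything

def carve(disk, needle):
    return disk.find(needle) != -1            # crude "file carving"
```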
Accessing large amounts of files you haven't previously accessed and shortly thereafter attaching an external drive should trip a competent IDS.
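A minimal sketch of such a rule, assuming per-user daily access counts are available; the thresholds are arbitrary and the event shape is an assumption, not any real IDS product's behavior:

```python
from statistics import mean

def flag_access_spike(history, today_count, factor=10, min_floor=100):
    """Flag a user whose file accesses today exceed `factor` times
    their historical daily average (with a floor so quiet users
    aren't flagged for trivial activity).

    history: list of past daily access counts for this user.
    """
    baseline = mean(history) if history else 0
    return today_count > max(baseline * factor, min_floor)
```

A user who normally touches a dozen files and suddenly pulls 14,000 trips this instantly; normal day-to-day variation does not.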
JAMF, I believe, gathers application usage data but nothing as in-depth as what's being discussed. It's also comically handicapped. It somehow manages to do a poor job of everything it tries to do, so "the world's largest online Mac administrator community" is forum post after forum post of half-understood franken-scripts. I used to think it was milquetoast, but after sitting through their sales reps crapping all over open source software (despite extensive use of OSS libs in their products) and seeing it fail to do the most basic stuff out of the box, my opinion is that it's overpriced crap.
I can see the temptation—I've always missed having access to IP after leaving a company.
What Google could be doing is remote logging on the laptops - logs uploaded to ze cloud every time you connect to the mothership. Plugging in a USB drive leaves a trace with the USB ID, volume information, etc. Windows also logs this and more: http://www.forensicswiki.org/wiki/USB_History_Viewing
Protip: to exfiltrate data with minimal trace, your best bet is taking out the drive and reading it in another computer (using a write blocker for best effect). This can still be traced if someone is logging SMART written/read data (I am, but I'm paranoid); not all HDD/SSD vendors provide this info. Second best is booting from a USB drive so the original OS never sees the plug/unplug event in the first place. I have no idea about the current state of UEFI/AMT logging, though.
Disclaimer: I used to do forensics.
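On Linux, for comparison, attach events show up in the kernel log. A rough sketch of extracting them; the exact message format varies by kernel version, so the regex is only illustrative:

```python
import re

# Matches the common dmesg/syslog "New USB device found" line, e.g.:
#   usb 1-1.2: New USB device found, idVendor=0781, idProduct=5583
USB_RE = re.compile(
    r"usb ([\d.-]+): New USB device found, idVendor=(\w{4}), idProduct=(\w{4})"
)

def usb_attach_events(lines):
    """Extract (port, vendor, product) records from kernel log lines."""
    events = []
    for line in lines:
        m = USB_RE.search(line)
        if m:
            events.append({"port": m.group(1),
                           "vendor": m.group(2),
                           "product": m.group(3)})
    return events
```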
I was asked to figure out what had happened on a system where some data had changed and 2 parties were blaming each other. After about an hour of digging around, I managed to piece together a picture of how Person X had gotten up on a Monday morning, discovered (on their mobile, home wifi) that they had made a mistake on Friday, then logged in on their desktop to fix it from home (the first time they logged in at home), then went to work and blamed someone else.
What was remarkable was how many different sources there were to pick up bits and pieces from. In isolation there wasn't much to go on, but once you start connecting the parts, it's really incriminating.
They had to know that he:
1. modified the software on his laptop
2. logged into an area he should not have had access to (this is probably standard)
3. attached an external drive (possible, but standard?)
4. and they got all this info after he deleted the drive, which means they either went in and found remaining data on the drive or else they captured the info in real time.
I suppose if the drive is clean now, and they know he downloaded data, they can infer that he wiped it.
I suppose that if they know he accessed it, and there was software on his computer preventing him from doing so, they can infer that he downloaded something to overcome it.
But knowing that he connected to an external drive implies active monitoring. That's the part I am most curious about.
I'd think that standard antivirus software detects external drives being attached and raises an alert.
> they can infer that he wiped it
For 4), Levandowski reformatted the hard drive before returning it, so there's no inference there.
Perhaps the size of the downloaded repository is larger than the physical size of the drive on the laptop?
Google also has an internal PKI CA - I think they meet and exceed that security baseline for rigor.
The threat models covering malicious actions against Google obviously worked, since they have traces of the Otto guy's activities. What I am asserting is that these forensic logs they use as evidence can be attacked in court as not being sufficiently protected from tampering by an internal Google party interested in fabricating evidence.
There are all sorts of other tidbits in the complaint that further strengthen the case in favor of Waymo. It's an interesting read, and surprisingly readable even for a non-lawyer.
Did he actually "search" on "his company-issued laptop" using company network or did he google it from his own machines at home?
Anthony is likely a very smart person, so if allegations are true, I would think the latter is more likely.
How does Alphabet know what he searched for? /G
Trade secret misappropriation can lead to criminal charges.
The post also says he talked about replicating their technology at a competitor with colleagues months before he actually stole the data too. Generally if you work on something confidential, and you start talking about taking it elsewhere, someone reports it or something.
Sidebar question if there are any armchair lawyers around: While I expect Uber to lose this lawsuit based on the type of evidence being claimed here, is it also possible for Uber to sue the supplier for leaking their confidential data back to Google? Because that seems like an incredible lapse of confidentiality in itself. Or will the notion that it wasn't legitimately their confidential data in the end, make Uber's own claim void?
User data is an entirely different matter, and is appropriately treated as such.
People should not be able to access data they don't need. If they need it, you can grant it. But the assumption should be that someone who doesn't use the design server shouldn't have access to the design server.
I don't want to know about or see information I don't need to have at work. It's not that I'm not trustworthy. It's not that I would abuse it. I just don't need the liability that it gets out through me.
Because your account credentials could get stolen. Your laptop could get stolen. Your laptop could get hacked into. Your laptop could get malware. Reducing the list of people who have access to a resource insulates against all of these things... automatically. And sure, all of those risks have other ways to mitigate them as well. But layers of security is key. And hey, it also stops employees from sneaking off with your data too.
Googlers all have to use 2-factor access via hardware tokens, so you'll need to steal their laptop and steal their token, and murder the employee before they report it to security.
Google laptops only permit the installation of software from Google, they are locked down and don't allow arbitrary installation of software, much like an iPhone. Those using Chromebooks are even safer.
Having to ask permission for every thing not only adds huge overhead, it inhibits global code gardening and technical debt reduction, and it inhibits learning, because you don't even know if you need to ask permission for something until you see it. You don't know what you don't know. If I want to learn about Google Translate because it might benefit my project, asking permission is bureaucracy, because I don't even know if what they have will help until I see it, and if I had to write a long justification for access rights, I probably either don't have a clue why I really need it, or might just not bother because of the hassle and seek out other open resources.
I feel sorry for you if you work for a company that operates internally like North Korea. One of the rewarding things about working at Google is the constant learning experience of exploring other people's stuff.
For example: Smaller companies often have the luxury of more trust / transparency. As companies such as Google grow, they have to resort to things like sending internal notice of big announcements very close to when the actual announcements come out (because leakers). If you have 10 people, you can usually trust the whole team.
Google tries very hard to allow as much transparency and openness as is allowable for its size. It fosters a culture of trust. Logging access rather than restricting it is one of those culture moves. It makes employees feel trusted, and puts responsibility on them to behave ethically. When they don't, the other end is that Google can still take legal action.
Culture isn't a fake floofy thing, it's real.
We are in phase two of that right now.
It sucked. I left.
You're talking about a hostile work environment. I'm talking about a secure one.
Security is inherently hostile, I don't see any other way of putting things. We tolerate a certain amount of hostility in order to reap the benefits that security gives us, and we tolerate a certain amount of vulnerability in order to reap the benefits that laxness gives us.
> If someone needs access to something, they request it, you grant it. Simple.
The saying goes that for every complex problem, like security, there is a simple solution, like that one, and that solution is wrong. The cost of such a process is just too high for most companies. You have to request access, explain why you need access, someone has to review it, then grant it. That's how it worked at the company I was complaining about. Processes that should take minutes took hours, those that should take hours took days.
Security, like so many things, is subject to cost-benefit analysis. Better security systems use the full triad: prevention, detection, and response. From the article, it sounds like these are working as intended. The security team detected the exfiltration and responded with a lawsuit. Trying to rely only on prevention will just lead to paralysis.
I might also be jaded, because when I hear phrases like "good practice" my instinct is that it means "omitting the cost-benefit analysis."
These are not high cost processes, these are basic, common sense practices we're talking about, that nobody really has any excuse to not have in place.
You are continuing to let your experience with a single, hostile work environment cloud your openness to something that... isn't even controversial. If you aren't locking down your files to only those who need them, you aren't equipped to be in business. If this is somehow uncommon among Silicon Valley, it explains why so many "they stole our trade secrets" lawsuits are going on right now.
I'm flattered that you want to talk about me, but really, I'm not the subject of the discussion here, and it's inappropriate to talk about what's going through my head or to try and psychoanalyze me.
> These are not high cost processes, these are basic, common sense practices we're talking about, that nobody really has any excuse to not have in place.
I've worked at a few different places on this spectrum in my career. Three of them have been fairly open, internally, like the way Google apparently operates. Maybe there are some high-value IP repositories you don't have access to, but you mostly have access to any source code you want to look at without getting access reviewed first. These companies were very open about the risks that this entailed, and openly discussed the fact that leaks were possible. The benefits became rather clear the longer I worked at each place. Whenever a system I worked on interacted with another system, I could follow what the other system was doing and even submit patches to other systems if necessary.
Saying that restrictive security is "common sense" or "not even controversial" is begging the question and argumentum ad populum, respectively. My argument here is that there are benefits to open access to most company IP, and that these benefits are important enough that the decision should be made on a company-by-company basis.
The access controls that would have prevented this particular case from happening would have to be rather draconian indeed. Anthony Levandowski's work was basically the genesis of autonomous vehicles at Google. Google purchased Levandowski's autonomous driving startup, 510 systems, in 2011. I don't know what kind of access controls you'd need to prevent a startup founder from accessing the technology built on top of his company's IP.
In this case, Waymo has a design server, and Anthony clearly didn't work on its contents, because he didn't already have the software to access that server on his computer. Therefore, regardless of the source of the IP (which isn't his; he sold it), he really shouldn't have ever been given access to it. When the server was first spun up, access should've been given to... the people who would be using it, and nobody else.
Of course, if at some point he did need to access those files, he could ask, and be granted that access. And that doesn't need to be a difficult process (granting access to things takes an IT person a minute or two), but there is now an additional person that knows that user has been recently added to access. Even informally, this is a pretty good security measure, because in most cases, it should be fairly obvious why someone needs something. And if it's not obvious, and maybe that employee has been, as the article says, talking about leaving the company and replicating the technology elsewhere... suddenly that IT person maybe has a reason to mention the issue up the chain.
So I can make a change, see that it breaks some test somewhere else (failed CI test), and peer into the diffs on the opposite side of the code base to decide what to do about that. It's proven quite useful, from time to time. I've seen weird problems like hitting pessimal access patterns for software developed halfway around the world.
> Anthony clearly didn't work on its contents, because he didn't already have the software to access that server on his computer.
That doesn't follow. I work on a source code repository every day, but I'd need special software to exfiltrate a copy. Same with the various design documents and things I work with—all stored in a private cloud. If I wanted to exfiltrate it I'd get a script to do it automatically.
Remember, this wasn't just the guy who started the autonomous driver project. He wasn't just the "department head". He's an industrial engineer who founded a Lidar startup. The idea that he should be denied access to Lidar design documents is patently absurd.
In general, when I develop, I virtually have all of Google's code base 'checked out' and can edit any of millions of files in my snapshot of the world. I don't need to check out multiple silo'ed repositories or beg for access, diving into any code in the universe has almost zero transactional overhead, it's all mapped into one giant filesystem. (https://plus.google.com/+MattUebel/posts/4dQBDF5CmdX)
On the other hand, production systems are heavily walled off from the corporate network. For all intents and purposes, the corp network your desktop is plugged into is "untrusted".
Nice try though.
My first reaction to the Waymo announcement was also about this: "apparently inadvertently" sounds like a blatant lie.
Also, how did google know he downloaded all the stuff through some software?
Did they lock down access or were they all on google docs?
> We found that six weeks before his resignation this former employee, Anthony Levandowski, downloaded over 14,000 highly confidential and proprietary design files for Waymo’s various hardware systems, including designs of Waymo’s LiDAR and circuit board. To gain access to Waymo’s design server, Mr. Levandowski searched for and installed specialized software onto his company-issued laptop. Once inside, he downloaded 9.7 GB of Waymo’s highly confidential files and trade secrets, including blueprints, design files and testing documentation. Then he connected an external drive to the laptop. Mr. Levandowski then wiped and reformatted the laptop in an attempt to erase forensic fingerprints.
Waymo/Google still had the documents, and copying isn't theft.
Theft is a different type of crime involving actual things.
You can commit theft of a rivalrous good, like an apple or a computer. You cannot, however, do the same to a file or a song or an idea. I wish more people understood this difference. Society would be better for it.
In this particular case, the reason that theft is wrong (because it deprives another person) is very different than the reason that copyright infringement is illegal (because the founders wanted to encourage invention by granting limited monopolies). If people believe that copyright infringement is theft, however, then powerful corporations like Disney can convince them that copyrights should last indefinitely.
Not necessarily. Theft is simply taking something you shouldn't. That includes taking by duplication. Or perhaps you'd like to think of it as the deprivation of potential profits.
There's no way in hell an employee contract would ever let someone copy documents and use them after leaving the company. The intellectual property belongs to Google, not the individual creators and certainly not to any other employee.
I would be cautious about discrediting intuition as paranoia.
2. Isn't this scenario incredibly suspicious?
> They could have simply moved things on the desk.
Is there any evidence suggesting something else happened?
Memory forensics would have 100% given you evidence that it was accessed and exactly what was done. There are plenty of guides on how to forensically dump memory but that depends on the OS.
But I'm also in the "this is FUD" camp. I highly doubt any company would do something as risky as mirroring your machine's data merely for an interview. This would be highly illegal and damage the company's reputation for a relatively minor benefit.
Especially when interviewing a software dev: any infosec person would assume they are dealing with a sophisticated target and a high chance of being detected, meaning they would have to use very careful cloning techniques.
It would be much easier to get you on staff on a company machine and temporarily monitor you closely after the fact. There are plenty of ways to thoroughly vet someone beforehand without taking such a risk.
This sort of penny-wise-but-pound-foolish shit is depressing, but not nearly as rare as I wish it was.
If a place is worth working for they're not going to judge you for feeling more comfortable keeping your stuff with you.
Isn't this only possible if the laptop is unlocked?
Reputation is everything.
(Except this one, of course. To be fair it was Otto before the acquisition)
Even you could do it: https://www.youtube.com/watch?v=qGPGOoJn54E Hint: there are internal USB buses in the MacBook; you can hijack one for something like a Rubber Ducky plus management circuitry to trigger only when the laptop has been powered on for a longer period but not touched (no IMU/keyboard/touch events, dimmed screen).
The implication in the filing is that Uber planned this with Levandowski, and he only created Otto as a plausible corporate vehicle for developing the LiDAR technology before Uber acquired them. Given what we know about Uber and the assertions in the complaint, this sounds entirely plausible, maybe even likely.
That could mean he downloaded an SFTP client like Cyberduck. He could have searched the internet for a client and then installed it. It doesn't say he did not have auth.
Imagine a Google security engineer being deposed for this lawsuit.
Lawyer: "Show me on the MacBook how he downloaded the files"
Engineer: "Well, he used Cyberduck"
Lawyer: "Is that part of the Mac?"
Engineer: "No, he'd have to download it separately"
Lawyer: "So, he searched for and installed specialized software onto his company-issued laptop?"
Engineer: "Um, sure"
Lawyer: "Thank you, that's all the questions I had"
They weren't trying to claim he hacked in. They're making the case he went out of his way to get his hands on these documents, and building a timeline that suggests why he went to that trouble.
He may not have installed "hacking tools" or anything like that, but he did specifically take action to access files he didn't normally use as part of his job. Which is, I think, all that this post is claiming.
See http://www.nobius.org/~dbg/practical-file-system-design.pdf, section 7.2 "How Does Journaling Work?".
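The core idea from that chapter, sketched in a few lines of Python. This is a toy key-value journal for illustration, not a real file system; the point is that changes are logged before they are applied, so state can be rebuilt (and past activity reconstructed) from the journal alone:

```python
class JournaledStore:
    """Minimal write-ahead journal: log the intent first, then apply it."""

    def __init__(self):
        self.journal = []     # append-only log of intents
        self.data = {}

    def write(self, key, value):
        self.journal.append(("write", key, value))  # log first...
        self.data[key] = value                      # ...then apply

    def delete(self, key):
        self.journal.append(("delete", key))
        self.data.pop(key, None)

    def replay(self):
        """Rebuild state from the journal alone (crash recovery)."""
        state = {}
        for entry in self.journal:
            if entry[0] == "write":
                state[entry[1]] = entry[2]
            else:
                state.pop(entry[1], None)
        return state
```

Note that even a deletion leaves a journal entry behind, which is why journaling file systems are a gift to forensics: "erased" actions persist until the journal itself is overwritten.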
The actual self-driving software, and more importantly, all of the collected data from the Waymo fleet, would have been the key.
They don't want to do this, so if they do, it's because they had to.
Practically all of the current sensor suites are expensive, bulky, and power hungry. If you want them on lots of cars, you need to reduce all 3 of those characteristics dramatically.
I think you can follow the money trail here and find some answers for sure. Now if Uber/Otto has a clause that prohibits employees from bringing in confidential data from previous companies, how can they be held liable? Does Google have to prove that those stolen documents were actually used in Uber designs?
Btw, that's one very sharp-eyed engineer, whoever that is...
It sounds like this was entirely accidental on the supplier's part.
>Waymo was recently – and apparently inadvertently – copied on an email from one of its LiDAR component vendors.
Is this going to be a legal test of that annoying lawyer email footer language?
>This message contains information from xxxxxx that may be confidential and privileged. If you are not an intended recipient, please refrain from any disclosure, copying, distribution, or use of this information and note that such actions are prohibited. If you have received this information in error, please notify the sender immediately by telephone or by replying to this transmission.
Ha! More legalese BS that never holds up.
> Otto launched publicly in May 2016, and was quickly acquired by Uber in August 2016 for $680 million.
The fact pattern here is going to be absolutely brutal for Uber. A non-technical judge is going to see the allegation: ex-Google employee downloads technical documents in December 2015, launches a company 5 months later in May 2016, and is bought for $680M (later speculated to be $1B+) for all its technical accomplishments. How much fundamental research did they do in the 3 months between May-16 and August-16?!?!? Or was it just to buy the stolen IP that Google had developed over 7 years?!? Brutal for Uber!
A public company recently settled a similar lawsuit (competitor hires exec, exec is proven to have downloaded documents) for $130M on much smaller numbers. And the defendant was run through the legal wringer first.
Expect Uber spankage, bigly.
> shortly after Mr. Levandowski received his final multi-million dollar payment from Google
Funny, given all the recent press that Google paid autonomous-driving talent so much that they left!
>Infringement of Patent No. 9,368,936 (Against All Defendants)
Real nasty. If a trade-secrets lawsuit is an arrow, throwing in a patent-infringement claim too makes it poison-tipped and barbed!
This is some good "old skool Google" where they used to show broad competence across many domains; in this case legal.
We'll take a moment to remember the salad days, when you were just a crazy college kid who showed up at the Darpa Grand Challenge with a self driving motorcycle:
- went nuclear on Uber/Otto
- revealed what they track internally to all their employees
Googlers already know this.
Like it's not even close to being close to being close to being secret.
Economist Joseph Stiglitz wrote in 2009 "...banks that are too big to fail are too big to exist..."
My theory is that "too big to exist" is now true for basically all the tech giants. Generally, everyone who knows the kind of tracking these companies do (internal and external) agrees this is true, except those who benefit from the companies' continued existence, e.g. employees, investors, shareholders.
What evidence did they present and how could it be tracked?
- downloaded 9.7 GB of waymo data -> server logs of what files were accessed and downloaded by which user
- searched for special software -> he used google while logged into a work account, so they just looked up that work accounts search history
- Connected external hard drive and wiped data -> Short of automatic backups or something, this suggests there is software explicitly tracking when data is copied, where, and how much
Most of this, besides the external hard drive part, can be done by any employer who owns your work Gmail account. What really should be alarming is how easy it is for your employer to get lots of data on you even if they aren't some tech giant.
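The "server logs" bullet above amounts to a simple aggregation: total the bytes served per user and flag anyone over a threshold. A sketch with an invented record shape (the log format is an assumption for the example):

```python
from collections import defaultdict

def heavy_downloaders(records, threshold_bytes):
    """records: iterable of (user, path, bytes) tuples from access logs.
    Returns {user: total_bytes} for users exceeding the threshold."""
    totals = defaultdict(int)
    for user, _path, nbytes in records:
        totals[user] += nbytes
    return {u: b for u, b in totals.items() if b > threshold_bytes}
```

Against a threshold of, say, 1 GB per day, a 9.7 GB pull from a design server stands out immediately.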
Actually, I think that would be the easiest, as I would assume any external USB device connected to a computer would automatically send an alert to the security team, given how easily they can infect your computer with malware. I'm not sure my company has something like that, but we have posters everywhere telling people never to plug external USB devices into our computers, so I would not be surprised.
>What really should be alarming is how easy it is for your employer to get lots of data on you even if they aren't some tech giant.
Right? So much for the principle of least privilege.