Zoom’s security woes were no secret to business partners like Dropbox (nytimes.com)
98 points by pseudolus on April 20, 2020 | hide | past | favorite | 27 comments



I feel like this is a non-story.

> Dropbox privately paid top hackers to find bugs in software by the videoconferencing company Zoom, then pressed it to fix them.

This is pretty standard. For a big company looking to purchase services from another company, commissioning a pen-test is common practice. It doesn't mean anything bad, it's about mitigating risk.

Further, everything about these contracts is about mitigating risk. Zoom had privacy/security issues for individual consumers, but large enterprises have a contract that lays out in detail exactly what is required and expected, and I fully believe that Zoom was meeting those requirements, because that's part of the enterprise software game.

Does a large enterprise care if its traffic is being routed through China? Probably not (unless there's a national security concern), because they will have insurance against any issues arising from that as part of their contract – Zoom will be liable for data leaked. Same goes for Facebook login being in the Zoom app.

Yes, it's not great for the privacy of individuals, but that's not the target customer of Zoom, and even the target _user_ is a corporate identity, not a user's personal identity tied to their Facebook account for example.


Disclaimer: I worked on Dropbox security. Not in appsec. I won't comment on the article, or the accuracy of the article, or anything that wasn't public before the article. I'm mostly just going to talk about what is broadly true for all or most companies.

I think it is worth noting that:

a) Pentesting 3rd party vendors is extremely uncommon. This is something you rely on a SOC2 for.

b) Pentesting is not what the article is talking about; it's talking about bug bounties / vulnerability reporting programs (VRPs). It is equally, if not more, uncommon for a company to bring a vendor into its VRP.

And yes, companies care greatly about traffic being routed through China.


> a) Pentesting 3rd party vendors is extremely uncommon. This is something you rely on a SOC2 for.

In my experience this is fairly common? Although that was my experience working on the pen-testing side, so I guess it was a little biased. The company I worked for did a lot of this sort of thing – pen-testing for meeting due diligence requirements.


There are two separate things here.

There's going to a company and saying "Are you SOC2? If not, we need you to be, and that requires a pentest - please go do that before we engage." There may also be, in this same vein, "We're strategic partners, we'll help you get that pentest." This is very common; I suspect the vast majority of pentests are compliance (and essentially sales) driven and would fall under this category.

That's different from "We have already engaged, you are already compliant with SOC2, you may do your own pentests, we will now separately pay for and manage a pentest of your company". This is not something I've seen too much of - perhaps that's just me not paying attention? But I'd be surprised if this were common at all.

Though I want to again restate that the article is focusing on VRP.


I find that very surprising that you uncommonly pen test 3rd party vendors. Working at a company that delivers a product to the enterprise, pretty much every publicly traded company we work with requires us to go through a pen test, or requires that we provide a recent and independent pen test report.

If you rely on SOC2 compliance then you are indirectly requiring a pen test.


> I find that very surprising that you uncommonly pen test 3rd party vendors.

Just so we're clear "you" is not Dropbox, and I'm not talking about what Dropbox does or doesn't do. I'm saying that, in general, most companies don't pentest other companies, they ask those companies instead to prove that they do their own pentests, which usually amounts to asking for their SOC2.

> we work with requires us to go through a pen test, or requires that we provide a recent and independent pen test report.

I am stating exactly this. Most companies require proof via SOC2, and that is it. Very few will actually hire a pentest firm directly for a 3rd party vendor.

> If you rely on SOC2 compliance then you are indirectly requiring a pen test.

To quote myself: "This is something you rely on a SOC2 for."


Totally agree. This is how news is able to completely color your perception of things even if they are reporting factually correct info. Compare

> Dropbox privately paid top hackers to find bugs in software by the videoconferencing company Zoom, then pressed it to fix them.

vs.

> Following technology due diligence best practices, Dropbox hired a "pen testing" company to search for security defects in the Zoom app and required fixing them as a condition of their contract.


Except that the latter results in far fewer clicks and shares on social media. That's really the primary goal of most news outlets, rather than objective reporting of the story.


One point I don't agree with the article on is that Zoom doesn't deserve the scrutiny. They absolutely do. The more I hear about their issues, the more it sounds like their entire stack is a mess. I've said it elsewhere, but security is not something you just bolt on after the fact. Security needs to be considered from day one when you're designing any sort of networked application (and even non-networked ones). It's becoming painfully obvious that for the longest time security just was not a concern for Zoom. It doesn't matter that their user base has changed due to COVID-19. The problems were there before the virus was even a thing, and they remain there.

My wife uses Zoom to communicate with other people involved with the nonprofit she runs. I'm honestly concerned that the Mac she uses for Zoom is on the same network as my PC. My concern has gotten to the point that I'm considering putting her Macs on an isolated network.


Sadly, this sort of thing seems to be the majority of media reporting on technology. Gell-Mann amnesia and all that.


(Too late to edit my post)

After reading the article, I now don't think it's as misleading as the parent portrayed it.

>As part of a novel security assessment program for its vendors and partners, Dropbox in 2018 began privately offering rewards to top hackers to find holes in Zoom’s software code and that of a few other companies. The former Dropbox engineers said they were stunned by the volume and severity of the security flaws that hackers discovered in Zoom’s code — and troubled by Zoom’s slowness in fixing them.

>After Dropbox presented the hackers’ findings from the Singapore event to Zoom Video Communications, the California company behind the videoconferencing service, it took more than three months for Zoom to fix the bug, the former engineers said. Zoom patched the vulnerability only after another hacker publicized a different security flaw with the same root cause.

This is pretty specific, and does seem to be quite unusual and well beyond the scope of a typical vendor assessment/compliance process. And the particular details about Zoom's response (delaying fixing until they received another report of a bug with the same cause) don't sound great.

I do still think it's a common pattern to see vague and sensationalized reporting about infosec matters, but I don't think that's necessarily the case here. My fault for not looking at the article myself and just trusting the parent.


> “I don’t think a lot of these things were predictable,” said Alex Stamos, a former chief security officer at Facebook

After the macOS installer bug last year I predicted exactly this. I may just be a random HN commenter, but my opinion of Mr. Stamos is greatly diminished if he actually believes this.


Somebody should create a lightweight VM to run Zoom and possibly other malware-like apps (WebEx) your employer forces you to use.


Luckily they offer web-based versions of the software, so I use my browser as that lightweight VM :)


I tried that the other day when I had to use Zoom for work. The sound was stuttering (in a consistent, regular pattern). Luckily the group was small enough that I managed to convince them to switch to Jitsi.


While it's not at all lightweight, that's exactly how I'm working from home now, using the free Windows 10 VM.

https://developer.microsoft.com/en-us/windows/downloads/virt...


Someday we will be able to run a VM in the browser that's fast enough for Zoom, something like jslinux[1].

Or... their web apps will stop sucking.

[1] https://bellard.org/jslinux/


The Zoom webapp is fine. My only issue with it is that it does not support the Brady Bunch layout.


I believe the Chrome app does, which runs inside a Chrome tab and is sandboxed from the rest of the system.

https://chrome.google.com/webstore/detail/zoom/hmbjbjdpkobdj...


I do this already. It's fairly easy. Make a Windows VM or a clone of your existing one specifically to run mandatory shitware like pretty much all enterprise conferencing apps.



What are the odds there'll be a project zero blog post on Zoom in 90 days?


I get the feeling pretty much anything that Project Zero team even glances at will have a post made about it at some point :)


Google doesn't have a competing project that I know of. From what I can tell, they only study things Google depends on or things Google competes with.


Hangouts seems like the main competitor to Zoom in casual contexts (school, families, etc.)


They've dropped the "hangouts" and are just calling it "Meet" now (in some places, at least).


> Zoom’s defenders, including big-name Silicon Valley venture capitalists, say the onslaught of criticism is unfair. They argue that Zoom, originally designed for businesses, could not have anticipated a pandemic that would send legions of consumers flocking to its service in the span of a few weeks and using it for purposes — like elementary school classes and family celebrations — for which it was never intended.

> “I don’t think a lot of these things were predictable,” said Alex Stamos, a former chief security officer at Facebook who recently signed on as a security adviser to Zoom. “It’s like everyone decided to drive their cars on water.”

So they're saying because this product was designed for business rather than consumer, it wasn't necessary to build it securely? How the fuck does that make any sense?



