
Zoom private chat text between attendees is actually public - manigandham
https://twitter.com/HJHaldanePhD/status/1244302917206708233
======
thoraway1010
This headline is SOOO garbage - seriously, what is up with HN and Zoom these
days? Just trash headlines.

Edit: In case it is removed this was the original headline "Zoom private chat
text between attendees is actually public"

From the zoom docs.

"If you save the chat locally to your computer, it will save any chats that
you can see-- those sent directly to you and those sent to everyone in the
meeting or webinar.

If you save the chat to the cloud, it will only save chats that were sent to
everyone and messages sent while you were cloud recording."

This is TOTALLY different from "all private chat text is public." 100% false.
Private chat text is available to the people who were participants in the
private chat. That would be a much more accurate headline.

Now if the recipient of a chat first saves it locally to their computer (Zoom
auto-scrubs private messages if you save to the cloud) and THEN uploads that
file somewhere shared - then whoever has access to it can see it. But this
takes a number of steps: download your chat, then publish your chat publicly.
The headline is completely wrong.
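For readers skimming the docs quote above, the two documented save rules can be modeled in a few lines. This is a hypothetical sketch, not Zoom's actual code; the type and function names are made up for illustration:

```python
# Hypothetical model of the documented chat-save rules (not real Zoom code).
from dataclasses import dataclass

@dataclass
class ChatMessage:
    sender: str
    recipient: str  # "everyone" for public chat, else a participant's name
    text: str

def save_locally(messages, me):
    # Local save: everything this participant could see, i.e. public chat
    # plus private (1:1) messages they sent or received.
    return [m for m in messages
            if m.recipient == "everyone" or me in (m.sender, m.recipient)]

def save_to_cloud(messages):
    # Cloud save: public chat only; private messages are scrubbed.
    return [m for m in messages if m.recipient == "everyone"]
```

Under this model, the leak in the headline needs the extra step described above: a participant runs the local save (which includes their own private chats) and then publishes that file themselves.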

~~~
gojomo
The headline isn't fully accurate, but if people are confused about – and
burned by – this behavior, it is a documentation & design issue in Zoom.

They need clearer warnings this will happen, or a design which clearly
separates the two info-streams, of differing sensitivity, into separate
download steps & filenames.

~~~
thoraway1010
The headline is inaccurate.

In a lot of businesses this really is not an issue. Many businesses don't want
a lot of Bob-mockery in private chat. In webinars, hosts want attendee
questions to go through a supervisor (common-question analysis, etc.). Hosts
can set appropriate controls (public chat only / host only, etc.).

I hope Zoom continues to cater to people trying to get work done - which they
seem to be doing - and not to this hyper-paranoid, totally-misleading-title
crowd.

The solution on Hacker News always seems to be to add lots of new steps,
warnings, device authentications, encryption keys, blocks on the host doing
this or that, etc. I'm already tired of the endless cookie-acceptance popups
on the public web.

~~~
viklove
Ah yes, the "totally misleading title crowd," a well known group of people
that everyone can easily identify.

> In a lot of businesses this really is not an issue.

I don't think anyone cares about the business in this situation. Individual
privacy and freedoms are more important than shareholder value and
micromanager snooping. When you force everyone to engage with their coworkers
online, you need to find a way to preserve basic human rights -- like two
individuals having a private conversation. Tech that obfuscates that, or
exposes the contents of private conversations, should be called out for doing
so.

Yes, we want business discussions to be open. But at the same time you have to
remember that it's _humans_ that conduct the business (as little as VCs may
like that, because they're so hard to control and pacify), and that those
humans have human needs and emotions, and will have private conversations
wherever conversations can be had.

~~~
thoraway1010
The use case Zoom is targeting is not encrypted private conversation focused
on user privacy.

It is conferencing: generally public / group meetings.

The permissions are _VERY_ loose by default - anyone can join your personal
meeting room via a simple set of digits after a URL if you don't set a
password. People can join with NO account, no login. In my own experience
trying to lock things down more: people don't like it, they WANT this free and
open approach.
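To put a rough number on the "simple set of digits" point: Zoom meeting IDs are reportedly 9 to 11 digit numbers. A back-of-envelope sketch of the unpassworded keyspace (the ID length and scan rate here are illustrative assumptions, not measurements):

```python
# Back-of-envelope: how big is the space of 9-digit meeting IDs?
ids = 10 ** 9              # assumed 9-digit numeric ID space
rate = 100                 # assumed guesses/second for a single slow scanner
days = ids / rate / 86400  # 86400 seconds in a day
print(f"{days:.0f} days to sweep the space")  # ~116 days for one scanner
```

A distributed or faster scanner shrinks that dramatically, which is why an unpassworded room behind a numeric ID is only weakly protected.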

If you want private conversations - lots of apps out there support that. But
many apps targeting business have a corporate / compliance / discovery-API
option. For example, Slack is very popular, and admins can get corporate
export turned on, or the discovery API, which allows the company to export
your PRIVATE direct-message conversations. So if you think Zoom letting a user
see their OWN messages is bad, Slack (a system popular with programmers) lets
admins get everything you ever typed.

The ones I hate the most are the SSL middleboxes that decrypt everything.

------
333c
It's not clear to me from this thread whether those messages only become
public if one of the parties to the messages then downloads the chat, or even
when a third party downloads the chat. One seems more serious than the other.

~~~
StrictDabbler
The consensus appears to be that if Alice is talking "privately" to Charlie
about Bob's stupid hair, and Alice is the person designated to save the
minutes of the meeting and post them, then she may accidentally include the
transcript of her Bob-mockery.

~~~
meritt
Correct. When a given user downloads chat history, it downloads everything
they can see: public chat and their own 1:1 conversations. Zoom has plenty of
security and privacy issues, but this is not one of them [1].

[1] https://support.zoom.us/hc/en-us/articles/115004792763-Saving-In-Meeting-Chat

~~~
kbenson
I wouldn't say it's not one, just that it's a very minor one. Mixing of
privacy contexts is a common problem on the social side of security. If saving
locally warrants an alert notifying you that you are saving both private
messages and public ones (and I think that is definitely warranted), then not
doing so is a (minor, most likely) security UI problem (if you accept that
actual people and their common behaviors need to be accounted for as threat
vectors).

~~~
thaumasiotes
> then not doing so is a (minor, most likely) security UI problem (if you
> accept that actual people and their common behaviors need to be accounted
> for as threat vectors)

Yes, you want to account for people's actual behavior. This isn't going to
rise above the level of "minor" if viewed from a security perspective, because
it's a self-only attack -- nobody gets any powers they didn't already have,
and Alice is hurting herself, not someone else.

(She might inadvertently hurt Charlie, if Charlie was sending her messages
making fun of Bob, but he was _allowed_ to do that anyway.)

A usability or operational perspective might object to the behavior here more
strongly.

~~~
kbenson
You're assuming it's only used for interpersonal things. If Manager A asks
Manager B in a private chat for a password to some system they are discussing,
one which other employees in the channel should not have access to, and one of
them then inadvertently exposes it by downloading and sharing the channel
chat, that is definitely a security problem. Any private channel of
communication that is easily combined with a public channel of communication,
without warning, is a security problem.

It's not about hurting people's feelings, it's about _information leakage_.
And to cut off anyone that says passwords shouldn't be shared in a private
chat, that's irrelevant. Good infosec security practices in one place do not
preclude criticism of bad practices elsewhere. Security is about layers of
protection, so any layer with problems should be noted. If that layer happens
to be a third-party application that mixes private and public channels in some
instances, without warning that this is happening, it deserves to be called
out.

Another way to look at it is that any minor information leakage can have a
major impact if the information leaked is very important.

~~~
thaumasiotes
> And to cut off anyone that says passwords shouldn't be shared in a private
> chat, that's irrelevant. Good infosec security practices in one place do not
> preclude criticism of bad practices elsewhere.

It's not irrelevant. There are phone apps with no other purpose than to
publicize your location. If you should happen to be a fugitive, using such an
app would be a bad move. Does that make the privacy leak in the app a security
problem? No, how could it? If you don't want your location publicized, the
answer isn't to remove the only feature from a location-sharing app so you can
run NOPs in peace. It's to stop using the app.

Your misuse of a feature that performs exactly as advertised can't justify
calling that _feature_ a security problem. The people responsible for the
feature don't know how you're using it. The _use pattern_ is the security
problem, and it needs to be addressed by people who (1) know what it is,
and/or (2) are responsible for it. Zoom fulfills neither criterion.

~~~
kbenson
> It's not irrelevant. ... the answer isn't to remove the only feature from a
> location-sharing app so you can run NOPs in peace. It's to stop using the
> app.

That's clearly not the case here. In the context of this specific discussion,
about an application marketed to enterprises as secure, an argument that
accidentally sharing private information through bad UI is the user's own
problem for putting private information in that private channel in the first
place is irrelevant.

> Your misuse of a feature that performs exactly as advertised can't justify
> calling that feature a security problem.

That depends on how we classify "exactly as advertised". If the overall claims
of the product as to being secure are easily and often circumvented by
accident through poor UI, then that _may_ be a security or privacy problem.
And if the place it's advertised is not somewhere most users will encounter it
during normal usage, then how it's advertised is of little consequence.

One extreme end of this would be something hidden deep in the EULA or privacy
policy that advertised how this works, and the other extreme end would be an
alert every time you use it that explains this. I think one is obviously a
problem (to the point that it looks purposeful), and the other obviously
isn't, but the only difference between them is where the information is placed
or how assured you can be that the user has encountered and hopefully
understood it. I think this clearly indicates that the problem is not whether
certain behavior is advertised, but how aware users are made of how the
application functions.

------
excitom
Personal anecdote: Use Slack for DMs.

~~~
weego
Slack DMs aren't guaranteed to be private from server admins.

------
thedance
Zoom is a wall-to-wall security disaster. When my company switched to it about
a year ago one of my colleagues showed me how to force any Zoom room, anywhere
on the planet, in any organization, to join our meeting. I just don't think it
was architected for privacy and security, and you can't add those things
later, so you should be prepared for a years-long trickle of this kind of
news.

~~~
xenonite
Out of curiosity: Could you be more specific on the issue and how it works?

~~~
thedance
It's related to https://medium.com/bugbountywriteup/zoom-zero-day-4-million-webcams-maybe-an-rce-just-get-them-to-visit-your-website-ac75c83f4ef5

~~~
xenonite
Thank you.

