
What does this have to do with E2E? I don't see how filtering HTML is harder to do client-side - even if server-side algorithms are somehow better (which this presentation seems to imply), couldn't the same algorithm be used client-side?

In a way, the situation is better client-side: when running code on the client, you can check exactly how the browser parses the HTML.
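
For instance (a minimal sketch, assuming a browser environment; the payload string is just an example), DOMParser lets you inspect the exact DOM this particular browser builds from an untrusted string before it ever touches the live page:

  // Inspect exactly how this browser parses an untrusted HTML string,
  // without attaching it to the live document.
  const dirty = '<img src=x onerror=alert(1)>';  // example payload
  const doc = new DOMParser().parseFromString(dirty, 'text/html');
  doc.body.querySelectorAll('*').forEach((el) => {
    console.log(el.tagName,
                Array.from(el.attributes).map((a) => a.name));
  });

A server-side sanitizer has to guess at this parsing behavior; the client can simply observe it.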




It's on page 18. If you have end-to-end encryption you can't sanitize in the client.

I mean, you're really just summarizing the presentation. It should be an API built into the browser. It isn't, so people need to use a library. That's OK, but not great.


"If you have end-to-end encryption you can't sanitize in the client."

I think you meant to type that you can't sanitize in the "server"? With end-to-end encryption the server has no access to the plaintext to be sanitized. Only the client can sanitize, because only the client has the plaintext.
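
A minimal sketch of that client-side flow, assuming the Web Crypto API with an already-negotiated AES-GCM key and DOMPurify for the sanitization step (the function name and message framing here are hypothetical):

  import DOMPurify from 'dompurify';

  // Hypothetical E2E receive path: the server only ever sees ciphertext,
  // so decryption and sanitization both have to happen on the client.
  async function receiveMessage(key: CryptoKey, iv: Uint8Array,
                                ciphertext: ArrayBuffer): Promise<string> {
    const plainBuf = await crypto.subtle.decrypt(
      { name: 'AES-GCM', iv }, key, ciphertext);
    const html = new TextDecoder().decode(plainBuf);
    // Sanitize the freshly decrypted HTML before it touches the DOM.
    return DOMPurify.sanitize(html);
  }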


oops yes.


"even if somehow server-side algorithms are better (which this presentation seems to imply)"

The slides provide several reasons why server-side algorithms are worse.

"the situation is better client-side, because when running code on the client's side, you can check how exactly the browser parses the HTML code."

Yes, and for this reason, DOMPurify is a client-side sanitizer.
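
For reference, the core of DOMPurify's API is a single call (the dirty string is just an example):

  import DOMPurify from 'dompurify';

  const dirty = '<p>hi</p><img src=x onerror=alert(1)>';
  const clean = DOMPurify.sanitize(dirty);
  // clean === '<p>hi</p><img src="x">' - the onerror handler is stripped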



