
Client-side content encryption - abraham
https://blog.amp.dev/2020/04/27/introducing-the-fastest-and-most-user-friendly-content-encryption/
======
p4bl0
From the title I thought it would be something along the lines of an NaCl
implementation in JavaScript.

It's actually software that helps break the web for its users. Basically
it allows paywalled websites to send users their inaccessible data even if
they won't be able to read it. The title should be "DRM for web pages that
eat up your data plan even if you can't access the content".

~~~
diggan
I agree in general that AMP is user- and web-hostile, but this move doesn't
seem to enable anything that was impossible before. The drawbacks are the
same as before: heavy reliance on AMP and Google for something that is
trivial to implement yourself.

Websites could have done this before as well, and many in fact already do.
It's a generally established pattern in webdev when you need to be able to
unlock content quickly, without having to request additional content from a
backend. Notably, many P2P networks work the same way with regard to
private-but-still-distributed content.

------
WhyNotHugo
Wow, this is just terrible. Sites using this will now be indexed by Google,
but not by any other search engine, since the content is encrypted with a key
only Google [and subscribers] can read.

I really hope AMP never gets any large scale adoption by consumers.

~~~
fastball
Is it? Now at least Google can index it and it's faster. The solution most
savvy publishers were using before was to not send the content at all to the
client, which means that search engines were not able to index it at all.

As someone who is actually subbed to a few paywalled sites, this solution
sounds great to me.

I mean, we're talking about _paywalled content_ here – ideals of an Open Web
don't really apply.

~~~
bscphil
> I mean, we're talking about paywalled content here – ideals of an Open Web
> don't really apply.

I think the scenario being imagined here is:

1. I search for a story on Google.com.

2. The paywalled story appears at #1, artificially boosted relative to other
paywalled content because Google can index it.

3. I click on the story (not knowing it's paywalled) and am greeted with a
paywall. The site not only blocked me from viewing the story, it also wasted
significantly more of my mobile data.

------
franga2000
Wasn't Google's excuse for AMP that it's a "standard" and everyone could use
it (other search engines for example)? Now they want to add even more Google-
specific crap to further lock it down. Not exactly surprising...

~~~
three_seagrass
The authorizer appears to be agnostic, so Microsoft could implement it with
Bing's AMP cache as well, without Google.

~~~
franga2000
The content needs to be encrypted with the "AMP provider"'s key too, so each
provider needs to be explicitly supported. A webmaster might go through the
trouble of supporting Google and Bing, but I doubt many would bother with any
of the lesser-known search engines, if those were to adopt AMP.

~~~
three_seagrass
Maybe, maybe not, but those are still options that are agnostic of Google.
This doesn't rope AMP into being any more dependent on Google than it was
before.

------
diggan
> All content is easily indexed by Google and ready to serve from its AMP
> cache

If the content is encrypted and can only be decrypted by users with the
right keys, how does Google get to decrypt it? Do they have a master key
everyone needs to use in order for this to work?

Took a look at [https://amp.dev/documentation/guides-and-tutorials/develop/monetization/content_encryption](https://amp.dev/documentation/guides-and-tutorials/develop/monetization/content_encryption),
which is linked as well, but got no answer.

I seem to remember that Google penalized websites that showed different
content to Googlebot (the indexer) than to a normal visitor. Does this move
go directly against that, when the premium content would be indexed but not
viewable by the visitor?

~~~
marcus_holmes
from the link you shared:

"You are required to encrypt the document key with the local environment and
Google’s public key. Including Google’s public key allows Google AMP cache to
serve your document."

So you have to encrypt the document key with their public key so they can
decrypt at will. No master key required.

~~~
diggan
Thanks! I somehow missed that. So there is indeed a master key that Google
holds: the private key corresponding to that public key is effectively the
master key.

So does that mean that Google will no longer penalize websites that show
different content to Googlebot vs. normal visitors, or is this "AMP client-
side encryption" an exception to that rule?

~~~
fastball
The same content is being served to Googlebot and users with this solution...
unlike before.

------
kohtatsu
Woo DRM for websites! Thanks Google!

~~~
mirimir
Indeed, just what we needed. /s

So why not instead come up with a workable micropayment system?

~~~
jeremiahlee
I am quite excited about the Web Monetization API and Coil’s implementation.

[https://webmonetization.org/](https://webmonetization.org/)
[https://coil.com/](https://coil.com/)
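For context, the Coil-era Web Monetization draft exposes a stream of payment events on `document.monetization`; a page listens for them and unlocks content once micropayments start flowing. A sketch of that handler logic, with `MockMonetization` as a hypothetical stand-in for the browser object so the snippet runs outside a browser:

```javascript
// Illustrative mock of the draft's document.monetization event target.
class MockMonetization {
  constructor() {
    this.state = "pending";
    this.handlers = {};
  }
  addEventListener(type, fn) {
    if (!this.handlers[type]) this.handlers[type] = [];
    this.handlers[type].push(fn);
  }
  dispatch(type, detail) {
    if (type === "monetizationstart") this.state = "started";
    (this.handlers[type] || []).forEach((fn) => fn({ type, detail }));
  }
}

const monetization = new MockMonetization();
let premiumUnlocked = false;
let totalReceived = 0;

// Handler logic as a page would write it against the draft API:
// unlock on the first payment, then tally the streamed amounts.
monetization.addEventListener("monetizationstart", () => {
  premiumUnlocked = true; // e.g. remove the paywall overlay
});
monetization.addEventListener("monetizationprogress", (ev) => {
  totalReceived += Number(ev.detail.amount);
});

// Simulate the provider starting a payment stream.
monetization.dispatch("monetizationstart", {});
monetization.dispatch("monetizationprogress", {
  amount: "764",
  assetCode: "USD",
  assetScale: 9,
});
```

The interesting contrast with the AMP scheme is that here payment, not a gatekeeper's key, is what gates the content.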

~~~
mirimir
What forms of payment does Coil accept?

~~~
contravariant
It's answered in a rather roundabout way on their FAQ page for membership
accounts [1]:

>Can I use cryptocurrency to pay for my membership?

>No, all memberships must be paid by credit card in US dollars.

[1]: [https://help.coil.com/accounts/membership-accounts](https://help.coil.com/accounts/membership-accounts)

~~~
mirimir
Oh well.

It is possible to go ~anonymously from cryptocurrency to "paid by credit card
in US dollars". Basically you barter cryptocurrency for a credit card payment.
You can do it with someone you trust. Or you can negotiate online. But then
you may get a stolen card account, which might be embarrassing.

I wonder if they accept gift cards.

~~~
jeremiahlee
Nothing stops someone from creating a competitor to Coil that also uses the
emerging Web Monetization API standard and accepts subscriptions in another
currency, including crypto. That's one reason why it's such a compelling
idea.

------
andy_ppp
The bastards behind the AMP strategy really do want to embrace and extend the
web; watch out next for a currency that competes with Libra and is even less
free. You'll pay per click on the web with Google taking 30%+, and for
advertising they'll be the only game in town.

~~~
fennecfoxen
The World Wide Web is dead. Long live the Internet.

(And on that tangent, I still don't understand why Google ever had the right
to buy the .dev TLD for its own private uses.)

~~~
loup-vaillant
It's more like the internet is dying, and the web is killing it.

Fewer and fewer networks allow anything more than outgoing TCP connections on
ports 80 and 443. Fewer still allow any incoming connections. The standard
way to send email now is to connect to a _web site_.

Continue like that, and everyone will have to tunnel UDP over HTTPS to get
anything done.

~~~
mschuster91
> Fewer still allow any incoming connections.

Tragedy of the commons: too many people abused their freedoms and attacked
other people. These days, having open ports on a machine that is not a
dedicated server is asking to get hacked. On top of that, many providers no
longer have enough IPv4 addresses to hand out to consumers.

------
freakynit
One more attempt by Google to gather more control over content publication
and consumption. The idea is nonetheless nothing new.

The only things that make this practical are Google's existing knowledge of
billions of its users, and the offloading of content encryption to the
server side using Google's services.

------
asplake
Not so much user-friendly as normalising the user-hostile

------
kreetx
It appears that sending encrypted content is a bet on whether the user will
pay for the content or not. I'd wager that for most page loads people won't
go on to pay. So if page loads become larger on average due to the encrypted
content, the real benefit here is that the original content providers will be
hit less (since AMP can fetch the content all at once), but perhaps more
importantly, Google can index this content.

edit: So in total, users' bandwidth won't be saved: Google is just serving
itself, while packaging it as a user benefit.

------
atrilumen
The web is Google's platform now, and they're locking it down.

It's time for a new web, and a new user agent with a minimal core so it isn't
impossible to implement.

~~~
loup-vaillant
Not sure how much I agree with the first statement, but I definitely agree
with the second. That's not an easy problem, though.

First, we are doomed from the start, because of network effects. The new web
will likely never gain any traction whatsoever, because it looks like
backwards compatibility is more important than simplicity, performance, and
CO2 emissions.

Second, we must agree on what that new web is for. We can display text, images,
audio and video. We can tweak the layout of the content. We can take input
from viewers (text, uploaded files…). We can make entire applications on top
of the web.

Once we agree on the purpose of the web, we need to choose how to make it
happen. Do we serve content declaratively, or procedurally? Should browsers
be readers of a well-defined, limited data format, or should they be virtual
machines? I personally prefer virtual machines (unlimited functionality on
top of a very simple core), but their natural opacity does have its problems:
screen readers, dark mode…

---

There _may_ be a way to break network effects: government web sites. Define a
new standard that serves those sites well, make sure the standard is easy to
implement pretty much everywhere (including on old computers with a crappy
connection), and mandate that all .gov sites move to it. Also maybe rethink
the whole security layer, most notably the PKI.

To move things further, we could use regulation. For instance, we could
mandate that banks provide an option to use the new web. We could regulate
our way to a critical mass, to the point where common folks can realistically
ditch the old web.

------
noizejoy
I find it quite hostile to send user-inaccessible content down the pipe,
thereby cluttering up bandwidth and local storage.

~~~
three_seagrass
Prefetching content is a W3C standard.

The user is also trying to access that content so they probably don't find
preloading it aggressive, especially if they're on a low-bandwidth connection
and they don't have to load the page twice.

~~~
SiempreViernes
Most sites aren't very clear about what content is gated until after you
click, so most visitors will get an encrypted payload they can't use and then
bounce, because they didn't want to pay in the first place. Thanks, Google.

~~~
three_seagrass
Most sites can and do already do this without Google or AMP. You have missed
that point entirely.

------
thereyougo
The article talks too much about the problem, and when it comes to solving it,
it gives very shallow information.

~~~
diggan
Indeed. It seems there are two articles, where this one is about the problem
and [https://amp.dev/documentation/guides-and-tutorials/develop/monetization/content_encryption/](https://amp.dev/documentation/guides-and-tutorials/develop/monetization/content_encryption/)
is about the solution.

------
ecmascript
Fuck AMP and fuck Google.

------
ArtRichards
While I don't appreciate the use of encryption to paywall services,
ultimately it's the responsibility of the content provider. If the author of
an article contributes their content to a publisher who uses this method of
encryption, that's up to the content creator; and then I would hope sites
like this would just stop linking to them.

In reality, this is a major paradigm shift, as it provides a way forward for
peer-to-peer hosted content. Imagine that instead of the publisher sending
the key and providing access, your access is based on some other out-of-band
validation system, such that another peer controls your access to their
content.

Very exciting, but of course this particular perversion by Google with AMP is
more abuse of technology, solving a problem for publishers at the expense of
freedom of information.

------
prophesi
This would actually be pretty cool if it were made into an open protocol. Let
there be a well-known ID for encrypted paywall content, with child divs that
specify which search engine's public key it was encrypted with. Search engine
bots would then be able to decrypt the content with their private keys.

The downside is that you're now sending duplicated content to clients.
Turning it into an event that JavaScript could fire would solve this: a bot
accesses the site and hooks into the encrypted-content event. The event
returns a list of search engines. If the bot's search engine is in the list,
it hits a well-known URL with its name and gets back encrypted content it can
decrypt with its private key.

Just spitballing a few ideas here.
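The negotiation sketched above could look something like this; every name here (the engine registry, the well-known endpoint, `encryptFor`) is hypothetical, since no such protocol exists:

```javascript
// Hypothetical registry of search engines this site supports, mapping
// crawler names to the public keys their content would be encrypted to.
const SUPPORTED_ENGINES = {
  googlebot: "google-public-key",
  bingbot: "bing-public-key",
};

// Placeholder for a real public-key encryption step (e.g. RSA-OAEP);
// here we only tag the payload with the key it would be encrypted to.
function encryptFor(pubkey, article) {
  return `${pubkey}|${article}`;
}

// Sketch of a /.well-known/encrypted-content endpoint handler: a crawler
// announces itself by name and receives content only it can decrypt.
function handleWellKnown(engineName, article) {
  const pubkey = SUPPORTED_ENGINES[engineName];
  if (!pubkey) return { status: 404, body: "" };
  return { status: 200, body: encryptFor(pubkey, article) };
}
```

This avoids the duplication problem because regular visitors never receive the per-engine ciphertexts at all; only a crawler that names itself gets one.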

------
edf13
Avoid & protest!

Google wants its own version of the Internet... a paywalled modern AOL.

------
endgame
Buzz off, Google. Stop making the web worse.

