
Protecting sites from Cryptojacking with CSP and SRI - Scott_Helme_
https://scotthelme.co.uk/protect-site-from-cyrptojacking-csp-sri/
======
userbinator
Presumably, the "[Warning] Do not copy or self host this file, you will not be
supported" is because they change the script reasonably often, meaning that
using SRI will require them to change their hashes on every linked page every
time the script changes or it will stop working --- probably not what they
want.

 _when visiting the ICO website_

That is... amusingly ironic.

~~~
Scott_Helme_
Yeah, the best way to handle this is with a version in the path and then the
host can knowingly/willingly upgrade the library version and SRI hash at the
same time.
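A sketch of that pattern (the domain, versioned path, and hash value here are illustrative placeholders, not the real BrowseAloud ones):

```html
<!-- The version is pinned in the path, and the integrity value is the hash
     of that exact file. Upgrading the library means the host changes both
     the path and the hash together, deliberately. -->
<script src="https://cdn.example.com/widget/2.5.0/widget.min.js"
        integrity="sha384-BASE64_DIGEST_OF_THAT_EXACT_FILE"
        crossorigin="anonymous"></script>
```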

~~~
cbr
That doesn't really solve this problem, it just slows down both the rollout of
the subverted version and the rollout of the fix. People aren't auditing the
(minified) javascript they put onto their sites.

SRI is good for use with a CDN, where the same entity controls both the HTML
that references the JS and the JS being referenced. In that case it keeps
someone who subverts the CDN from being able to XSS the site.

~~~
greglindahl
That depends on what "this problem" is -- lots of people here would say that
the problem is websites that depend on unaudited, untested 3rd party
resources. I can tell from your other comments that you think it's safe to
trust 3rd party resources from places like Google. So there's a disagreement
that is worth talking about explicitly.

~~~
cbr
Most sites are built on lots of unaudited, untested (by them) third-party code
server side. Adding some client side isn't great, but it also isn't a
fundamental change to the dynamic.

(I used to make web server software)

~~~
greglindahl
Uh, so that's terrible added to terrible, but it doesn't really address what I
just said!

If I were the mean person at IBM that buys startups and makes them "right",
fixing terrible added to terrible is a lot more work than fixing plain
terrible. And my last startup got bought by IBM, so I've experienced that pain
personally.

------
cbr

        Want to know how you can easily stop this attack?  What
        I've done here is add the SRI Integrity Attribute and
        that allows the browser to determine if the file has
        been modified, which allows it to reject the file.
    

SRI does not fix this problem. If you put an integrity attribute on the
script, then the next time BrowseAloud releases to prod their script will stop
working on your site.
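For reference, the integrity value is just a base64-encoded digest of the exact bytes served. A small Python sketch (file contents invented) shows why any change, whether malware or a legitimate release, breaks the pin:

```python
import base64
import hashlib

def sri_hash(content: bytes) -> str:
    """Compute the value a browser compares against the integrity attribute."""
    digest = hashlib.sha384(content).digest()
    return "sha384-" + base64.b64encode(digest).decode("ascii")

released = b"window.browseAloud = function () { /* v1 */ };"
updated = b"window.browseAloud = function () { /* v2 */ };"

# Any change to the served bytes yields a different hash, so the browser
# rejects the modified file -- malicious or not.
print(sri_hash(released) == sri_hash(updated))  # False
```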

This is a product that works by running a script on your page to make changes.
There's no option for defense in depth here: either you trust that their
processes are secure enough that they're not going to XSS you, or you
shouldn't run their code on your site at all.

The bar for including javascript from other sites should be a high one, but
there are times when the tradeoff is reasonable. For example, I have Google
Analytics [1] on my site, and I trust them to handle this responsibly.

[1] Disclosure: I now work for Google

~~~
gambler
_> The bar for including javascript from other sites should be a high one_

Yes, but that's not how most developers think these days. Try browsing the Web
with NoScript and you will routinely witness _dozens_ of domains in the block
list.

~~~
throwawaypanda
>The bar for including javascript from other sites should be a high one

>but that's not how most developers think these days.

The decision to include third party javascript is sometimes not even up to
developers these days.

"A deal has been signed with company X, put their widget on the site" is
something I've now heard a few times.

Arguments about the third party code greatly increasing page load time, page
size, introducing security vulnerabilities etc then fall on deaf ears. High
developer turnover seems to co-occur in these environments.

------
Mister_Snuggles
I use uMatrix and it’s given me a pretty good lesson in how much stuff sites
load from 3rd parties. Many sites make me play whack-a-mole to get them to
display: which 3rd-party sites do I need to allow to get the content to show
up?

I’m really torn by this sort of thing.

On one hand, when many sites use jQuery (for example), there’s huge benefits
(bandwidth, speed, etc) in having it loaded from one location relatively
infrequently and cached for many pages. This is exactly the promise of shared
libraries, just on a much wider scale.

On the other hand, why aren’t web site operators delivering all of the code
that is needed for the site to function? If they want all their users to
execute that code, why aren’t they willing to serve it themselves?

I’m not sure what the right answer is, but this incident is a pro for pulling
in all of your dependencies. The limited data plans that a lot of people have
are a big pro for hosting libraries centrally.

~~~
foepys
> On one hand, when many sites use jQuery (for example), there’s huge benefits
> (bandwidth, speed, etc) in having it loaded from one location relatively
> infrequently and cached for many pages.

Does it really decrease bandwidth usage and load times? Is there a study that
looked at that specifically in recent months, or is it just an assumption? I'm
getting the feeling that including common JS libraries from shared hosts is on
the decline, and more and more developers are bundling node.js packages into
their frontend code. Not to mention that there are quite a few CDNs providing
basically the same service for the most common frameworks, which fragments the
cache and undermines this theory.

~~~
hyperpape
Load times, I believe yes, unless you have your own CDN. But you’re right that
the odds that any particular version is in the user’s cache are low. I
gathered some links on the subject a while ago:
[http://justinblank.com/notebooks/browsercacheeffectiveness.h...](http://justinblank.com/notebooks/browsercacheeffectiveness.html).

------
mholt
Even when using integrity checksums, you have to be careful not to update them
to match the malware, like brew did with Handbrake a while ago:
[https://github.com/caskroom/homebrew-cask/pull/33354](https://github.com/caskroom/homebrew-cask/pull/33354)

------
walterbell
What percentage of mainstream sites use the "integrity" hash validation
attribute when loading 3rd-party script for a known-good library version?

Would be a useful data point for a "tech stack" web crawler to monitor.
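A crawler check like that could be as simple as counting external script tags with and without the attribute. A sketch using only the Python standard library (the sample page is invented):

```python
from html.parser import HTMLParser

class SRICounter(HTMLParser):
    """Count external <script> tags with and without an integrity attribute."""

    def __init__(self):
        super().__init__()
        self.with_sri = 0
        self.without_sri = 0

    def handle_starttag(self, tag, attrs):
        if tag != "script":
            return
        attrs = dict(attrs)
        if "src" not in attrs:
            return  # inline scripts can't carry SRI
        if "integrity" in attrs:
            self.with_sri += 1
        else:
            self.without_sri += 1

page = """
<script src="/app.js"></script>
<script src="https://cdn.example/lib.js" integrity="sha384-xyz"></script>
<script>var inline = true;</script>
"""
counter = SRICounter()
counter.feed(page)
print(counter.with_sri, counter.without_sri)  # 1 1
```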

~~~
greglindahl
Use of integrity hash validation is pretty limited -- I see 90k sites in the
top 10 million.

It's a shame this isn't more popular; I'd love to build a browser add-on that
uses the integrity hash as the name of the script and loads it from ipfs or
something.

Top sites: gov.uk, nhm.ac.uk, change.org, blogs.worldbank.org, handbrake.fr,
army.mil, genome.gov, ...

~~~
K0nserv
SRI[0] is still a fairly new technology with limited browser support. I expect
its usage to grow, but it's worth keeping in mind that the use of SRI does not
matter at all if the client doesn't support it.

0: SRI - Subresource Integrity [https://developer.mozilla.org/en-US/docs/Web/Security/Subresource_Integrity](https://developer.mozilla.org/en-US/docs/Web/Security/Subresource_Integrity)

~~~
greglindahl
As a guy who crawls, indexes, and archives websites, subresource integrity
matters to me whether or not the client supports it. You probably had end-
users in mind.

~~~
K0nserv
Yeah, exactly, I was thinking of end users. Since support in end users'
clients dictates what websites will implement, I think we'll see use of SRI
increase over the coming years.

~~~
greglindahl
I would be amazed if anyone disagreed with you about that.

Status of client support:
[https://caniuse.com/#search=integrity](https://caniuse.com/#search=integrity)

------
gambler
Content-Security-Policy, Strict-Transport-Security, X-Frame-Options, X-XSS-
Protection, X-Content-Type-Options, Access-Control-Allow-Origin...

How many hacks like these will we need before people stop to rethink the
fundamental security model underpinning the Web? It's clearly crumbling.

~~~
JetSpiegel
We are clearly at the Antivirus level of protection (blacklists) in order to
maintain backwards compatibility, instead of using a whitelist of allowed
domains or features.

~~~
ubernostrum
CSP is a whitelist-based approach; CSS/JS/etc. will only be permitted from the
sources listed in the CSP header.
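For example, a policy along these lines (origins illustrative) only lets the browser execute script from the listed sources; anything else is blocked:

```
Content-Security-Policy: script-src 'self' https://www.browsealoud.com
```

Worth noting that a whitelist alone wouldn't have helped in this incident, since the trusted origin itself served the subverted file, which is why the article pairs CSP with SRI.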

------
neals
This makes me wonder what happens when a popular nodejs library gets used in
this way. What could hackers do with thousands of compromised nodejs servers?

~~~
gboudrias
Nowadays it's just about getting as many miners as possible. No big mystery.

~~~
quickthrower2
Well, they might do other things once they're in your house, now that they
have the keys.

------
quickthrower2
Something like JS 'permissions' could help here. I.e. load a script but limit
how much CPU/GPU access it has plus determining if it has DOM access, XHR
access, etc. So you load the script and tell it what it can do. Also
developers taking security more seriously. But it is rarely a priority, since
risk reduction is not paid for today.

~~~
shrumm
I like this option; it's similar to how apps request permissions on a mobile
OS. There could be a default list of modifications an external script can
make, and any further access must be explicitly whitelisted by the developer.
Then even if the script were malicious, at least the impact would be
contained.

For instance, maybe the default is that an external script can read the DOM
but needs extra permissions to write/modify it?

------
sitkack
Isn't this literally out of the 'How I Steal Your Credit Card Number'
playbook?

~~~
bluejekyll
It’s similar. If I remember the CCN playbook correctly, it was about taking
over one of your JS dependencies via npm. This is about cross hosted JS,
loading remote code.

------
baybal2
It's a useful tech. The problem is that 3rd-party code from adnets changes all
the time, and they will never tell you about it, because they hide all kinds
of anti-clickfraud tricks in there: from intentionally broken JS syntax, to
intentionally broken Unicode, to actual 0day exploits.

~~~
rspeer
When I first read this discussion, I just assumed that BrowseAloud was some
sort of ad-tech or analytics code. But it's assistive technology for screen
readers. The web sites that were compromised were ones that were _trying to do
the right thing_ to help disabled users.

Ad-tech is a giant security hole that can't be fixed without burning it all
down, but BrowseAloud could be fixed.

------
rexbee
Here's a list of hundreds of sites using that JS library
[https://nerdydata.com/search?query=www.browsealoud.com%2Fplu...](https://nerdydata.com/search?query=www.browsealoud.com%2Fplus%2Fscripts%2Fba.js)

------
jschwartzi
How would I get Mozilla to warn me when a script is loaded without Subresource
Integrity? I'd like to avoid being caught unawares by this type of security
hole, especially if it lets third parties execute code in my browser without
any controls.

~~~
hyperpape
I think adoption is limited enough that you just need to run NoScript, uBlock
or something and whitelist JS.

------
petagonoral
> What I've done here is add the SRI Integrity Attribute and that allows the
> browser to determine if the file has been modified, which allows it to
> reject the file

Wouldn't this negate one of the benefits of a 3rd-party hosted SaaS? You'd
have to redeploy every time your provider updates their lib.

~~~
teej
It’s not uncommon to freeze 3rd party libs. Do you want a 3rd party provider
to have access to hot deploy to your website?

~~~
gokhan
Possible if the 3rd party script is versioned.

~~~
stordoff
Isn't that the same as freezing the script (except still loading it from a
remote host and using SRI to enforce that it hasn't changed)?

------
Quarrelsome
Am I weird in that I'd prefer to host a known version of a dependency rather
than effectively hot-link like this? Avoiding this sort of attack is just a
side benefit to the main premise of knowing that it's always going to work
consistently.

------
jnordwick
When will the JavaScript community learn to stop trusting 3rd party code
downloaded over the internet?

In the previous event the code disappeared. This time it isn't what you
wanted.

~~~
userbinator
_When will the JavaScript community learn to stop trusting 3rd party code
downloaded over the internet?_

Given that the majority in the JS community probably have it enabled by
default in their browsers, probably never...

~~~
floatboth
Not 3rd party in relation to the user's machine.

3rd party in relation to the site. As in <script
src="https://someone.elses.domain/something.js">

------
nodesocket
If you use 3rd parties such as analytics and chat scripts, you don't have
control over whether they legitimately change the response. I get it, they are
supposed to use versioning and bump, but you'd be surprised.

All your hard-coded hashes fail and your 3rd-party analytics, chat, etc. all
stop working. Is there a solution to that?

------
deanclatworthy
There are still large numbers of people on older browsers that don’t support
the integrity attribute. It’s not foolproof, but it’s one of those things you
can do to improve the experience and security, with no side effects for older
browsers and benefits for new(er) ones.

------
baybal2
We used to do an ajax load of a resource and simply hash it "by hand,"
sending an "alarm, lib A has suddenly changed" message back to HQ by ajax.

------
skinpop
Maybe we should throw out ads and instead let creators borrow our computers to
mine for a few minutes while we interact with their content.

~~~
stordoff
I suspect what we'd actually end up with is miners PLUS ads. There's little
incentive to not use both forms of monetisation, as neither prevents the other
from being used.

I also don't think it's a great trade-off for a variety of reasons
(environmental impact, possible wear and tear on users' machines, battery life
on mobile devices, etc.).

------
stri8ed
Proof of stake will solve this.

~~~
IncRnd
> _Proof of stake will solve this._

No. It. Won't. The issue isn't PoW vs. PoS but the loading of infected code
into browsers. PoS may remove the need for this particular JS to be inserted,
but it will do nothing to stop any other JS from being inserted.

~~~
JoachimSchipper
In principle, sites should be secure. In practice, putting an implicit bug
bounty on every widely-used Javascript library does produce more exploitation.

I think you missed an opportunity to engage your parent comment more
productively.

~~~
IncRnd
What principle states sites should be secure?

There is hardly even a nod to security, no defense in depth, and no
cryptographic protections. There is widespread loading of untrusted unvetted
code. The operating principle of the web seems to be, "it's okay to do this,
everyone else is."

------
Feniks
"On top of all of that, you could be alerted to events like this happening on
your site via CSP Reporting"

Too bad this can be abused by marketing parasites.
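For context, the reporting being discussed is driven by a response header along these lines (report endpoint hypothetical); violations are POSTed as JSON to the given URL:

```
Content-Security-Policy-Report-Only: script-src 'self'; report-uri https://csp-collector.example.com/reports
```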

~~~
orf
How?

~~~
dullgiulio
CSP reporting is an unprotected form, in itself a vulnerability.

The author of the article is biased as he made a SaaS product for CSP
reporting.

CSP reporting should be for local debugging only.

~~~
orf
I'm sorry, I'm still not following. How can CSP reporting be used by
marketers? It requires a header sent from the server, so it's not like a
tracking pixel that can be added by a third party. And I'm not sure about
local debugging only: locally it offers no benefits over just viewing the
devtools console, whereas it offers a lot of benefits when enabled for users
of your sites.

