
Looking at how many sites use vulnerable JavaScript libraries - heitortsergent
https://snyk.io/blog/77-percent-of-sites-still-vulnerable/
======
megaman22
If it's an automated scan, I'd be skeptical. Currently dealing with some
overzealous security folks who put adherence to their scan tool over common
sense, and insist that we lock down Oracle PL/SQL vulnerabilities in an
application that doesn't use any flavor of SQL...

~~~
Klathmon
I absolutely hate these kinds of "security" scans.

I once worked with a company that started using one of these. They said our
"vulnerability scores" were significantly too high.

I looked at the report, and it turns out they were just looking at HTTP headers
and throwing up every CVE that matched any version numbers they found. (One of
the "worst offenders" on the system was a CVE about a vulnerability in PHP's
"magic quotes", a feature that hadn't been used in many years and that our
application never used.)

We were officially instructed that the fix would be to hide the PHP and apache
version numbers from the headers.

If I were the one running that scan, and someone "fixed" the problem by just
hiding the version numbers, I'd be calling for that person to be fired for
trying to hide the problem. But here they were instructing us to do just that.
And once we did, the system was marked "secure"...
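For reference, hiding those version numbers is a one-line change on each side. A sketch, assuming Apache httpd and a stock php.ini:

```
# httpd.conf: report only "Apache" in the Server header, no version
ServerTokens Prod
ServerSignature Off

# php.ini: stop sending the "X-Powered-By: PHP/x.y.z" header
expose_php = Off
```

Which is exactly why it satisfies a header-matching scanner without changing the actual attack surface at all.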

~~~
styfle
There was a fundraising website, written 15 years ago, that a team I was on was
responsible for (I never actually worked on it).

There were fraudulent credit card donations for $1, which became really obvious
when the zip code was garbage.

The “solution” was to disable the credit card page until the month of the
fundraising event, when it was enabled again in the hope that the scammers
would not try during that month.

~~~
CapacitorSet
I don't understand, what were the scammers trying to achieve?

~~~
mr_toad
Trying to validate stolen or generated credit card numbers.

~~~
jacquesm
That's not always what is going on. In some cases unscrupulous operators will
run through large numbers of $1 transactions in order to lower their
chargeback rate.

~~~
nicksantamaria
This sounds interesting - can you explain further?

~~~
jacquesm
Chargebacks have a cap; if you go over that cap you get fined or lose your
merchant account. So in order to dilute the pool they'll make a lot of
low-dollar charges, usually masquerading as some charity. Those will have very
low chargeback rates, so the average chargeback percentage will drop.

------
binarymax
Needs some significant clarification on what it means to be 'vulnerable'. As
an example, many static sites (such as blogs) use jQuery; that doesn't
necessarily mean there is an attack vector for those blogs or their visitors.

------
wesleytodd
We run nsp on our production services in CI before merge. The number of false
positives I have tracked down is infinitely higher than the number of vulns
found. I mean that literally: we have never seen one disclosure which resulted
in a viable attack on our production services.

For example, a bunch of ReDoS vulns were recently reported in popular
libraries. None of them were in code paths hit by our configurations.

So needless to say, I think this is a sensationalist headline.

~~~
guypod
I think it's an absolute statement about the lack of awareness of this risk.

Of course some of these sites would not actually be vulnerable, but I would bet
the vast majority of them don't even know they're using a library with a known
vulnerability.

~~~
wesleytodd
Agreed, but the tools (nsp) are there to make it simple to know. Devs who are
not going to update/patch are not the target here, so making big claims like
this doesn't add much to the conversation, IMO.

Also, this is nothing new on the web; the number of WordPress sites with known
vulns is probably MUCH higher.

------
fenwick67
Just because the library has a vulnerable code path doesn't mean the site
itself is vulnerable. Especially with libs like jQuery, where there are many
many functions.

------
irrelative
One might even say that 100% of 333,410 sites use vulnerable JavaScript
libraries.

~~~
jjnoakes
Those mean different things.

Their wording means "we checked X sites and 77% of them met some criteria",
which can be extrapolated to higher values of X (assuming the proper
statistical care is taken, etc).

Your wording implies the same, but it doesn't hold up because you can't
extrapolate to a larger X: you chose the sites after knowing they already met
the criteria, and that changes the meaning.

~~~
spurcell93
I get the sense OP was being a bit snide

~~~
bigiain
The snide version would be "100% of 433,000". (Which is what I initially
parsed it as, and nodded in agreement...)

------
jbob2000
This means nothing unless they mention what the vulnerabilities are. We do
security scans on our front-end javascript code as well; most of the hits we
get are for "log injection". Meaning, we have a console.log somewhere and
someone _could_ fake our logs by overriding the output. Wow, such
vulnerability!

------
tyingq
This boils down mostly to a lot of sites running jQuery at a version less than
3.0.0.

Which technically counts as "vulnerabilities", but whether it matters depends
on how jQuery is used.
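A scan like this is essentially doing a major-version comparison. A minimal sketch of that check (the function name is made up; in a browser you would feed it the live `jQuery.fn.jquery` string):

```javascript
// Flag jQuery versions below 3.0.0, given a semver-style string
// like "1.12.4" — the cutoff described above.
function jqueryBelowV3(version) {
  const major = parseInt(version.split(".")[0], 10);
  return major < 3;
}

console.log(jqueryBelowV3("1.12.4")); // true  -> flagged by the scan
console.log(jqueryBelowV3("3.3.1"));  // false

// In a browser, something like:
//   window.jQuery && jqueryBelowV3(jQuery.fn.jquery)
```

Note the check says nothing about whether the site ever calls an affected code path, which is the whole point of the "depends on how it's used" caveat.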

------
mnm1
I looked at what it would take to upgrade our Angular 1.5 to 1.6: weeks' worth
of trying to untangle dependency hell and testing, for a minor version number
upgrade. Let's not pretend it's only the consumers of these libraries that are
lagging. A lot of these libraries are maintained by teams who have no business
maintaining software, open source or not. They can release all the security
patches they want; it won't make a difference if the new version isn't
backwards compatible. (Angular is egregious and by far the worst I know of:
even its minor version numbers bring huge incompatibilities, not to mention 3
major version releases in 11 months.)

------
nawitus
Even if we have better tooling to make updating dependencies easy, there's
still the fact that there are a lot of applications in production that are not
maintained. If there are no maintainers, there will be no updates.

------
dbg31415
Of course this is clickbait. But of course most companies don't invest what
they should in website maintenance, code reviews, security audits, and
upgrades. I'm OK with some clickbait if it helps raise awareness of the need
for more recurring investment in technology.

------
merb
> One of the discoveries the report mentions is that an analysis of around
> 433,000 sites found that 77% of them use at least one front-end JavaScript
> library with a known security vulnerability.

Does that even matter? No front-end JS library should actually make your
backend vulnerable.

~~~
bastawhiz
An XSS issue could make your users' data vulnerable.

~~~
oneweekwonder
But CORS[0] headers can mitigate some of the risk?

[0]: [https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS](https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS)

~~~
amenghra
You want CSP headers to mitigate XSS risks.
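For example, a restrictive policy along these lines (illustrative values, to be tuned per site) refuses inline scripts and third-party script sources, which is what most XSS payloads rely on:

```
Content-Security-Policy: default-src 'self'; script-src 'self'; object-src 'none'
```

CORS, by contrast, governs which origins may read cross-origin responses; it doesn't stop an injected script that's already running on your page.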

------
peternicky
I’d like to hear how Snyk compares to GitHub’s recently released vulnerability
notification feature.

------
JasonFruit
Is it better for a website to roll its own insecurity? I'd much rather people
use libraries with significant adoption (hopefully being aware of and avoiding
any security problems they may include) than write their own version, where
the security problems will never be exposed, at least not for good.

~~~
arca_vorago
I'd rather devs stop using JavaScript in so many places where it's not even
needed. Pure HTML5 and CSS is where it's at.

------
coin
I still question why I need to execute remote code just to _read_ web content

~~~
ben_w
You don’t. The majority of websites behave acceptably if you disable
JavaScript entirely.

------
styfle
Snyk posted “The State of Open Source Security” not long ago:

[https://news.ycombinator.com/item?id=15729811](https://news.ycombinator.com/item?id=15729811)

