
Ask HN: Has software hacking gotten easier? - IA21
I'm a college student and I don't know what the hacking scene was like 20 or 30 years ago, so please correct me if I'm wrong. I recently watched Mr. Robot and did some research on the tools and exploits used in the show. I know the show features quite advanced hacks, but what surprised me was this:

These days we have tool suites that anyone can find and download from the web (SET, Metasploit, etc.) and extensive guides & tutorials on how to use them. Almost anyone can follow a guide and do basic stuff like WiFi hacking and phishing scams. We even have ready-made hardware like the USB Rubber Ducky and Pwn Phone, which don't take a lot of time or knowledge to get started with (tutorials do just fine). While we may have developed advanced security standards and protocols, the share of mediocre developers who ignore those practices, and of the sub-par software they produce, keeps growing. This is the software that will never be updated once vulnerabilities are found and will be exploited by script kiddies (Mirai).
======
dsacco
Your question is rather more complex than simply "easier" or "harder." There
are a number of axes which have shifted over time.

I would say that it has become cheaper, faster and easier to identify software
vulnerabilities that are low-hanging fruit, and conversely more difficult,
time-consuming and expensive to identify software vulnerabilities in higher
levels of abstraction or complexity; by way of association, overall
exploitation ease ("hacking") for respective complexity classes has shifted
commensurately.

Overall, I would say that it has become _easier_ to find and exploit software
vulnerabilities in websites in the _aggregate_, and that it is significantly
more difficult than it has been in the past to find and exploit
vulnerabilities in _specific, individual websites_. I view this as being the
result of a confluence of factors, some of which have to do with the
commoditization of many parts of the security industry (including human
skillsets), and some of which have to do with the proliferation of ever-
increasing amounts of software:

1\. As you mentioned, there are many sophisticated tools for quickly
identifying and exploiting vulnerabilities, especially low-hanging fruit. This
makes it easy to find vulnerabilities in websites that have received no
security attention, and significantly more difficult to find them in websites
that have already been examined.
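The kind of low-hanging-fruit check those tools automate can be sketched in a few lines. This is a toy illustration, not any real scanner's logic: the function names and the probe marker are made up, and the "scan" just submits a payload containing markup and checks whether it is reflected back unescaped, which is the core of a basic reflected-XSS probe.

```python
import html

MARKER = "xss-probe-7f3a"  # arbitrary marker so the probe is easy to spot

def render_greeting_unsafe(name: str) -> str:
    # Vulnerable: interpolates user input into HTML without escaping.
    return f"<p>Hello, {name}!</p>"

def render_greeting_safe(name: str) -> str:
    # Escapes user input before interpolation.
    return f"<p>Hello, {html.escape(name)}!</p>"

def reflects_unescaped(render, payload: str = f"<b>{MARKER}</b>") -> bool:
    # Scanner-style check: if the markup comes back verbatim, the page
    # reflects input unescaped and is a candidate for XSS.
    return payload in render(payload)

print(reflects_unescaped(render_greeting_unsafe))  # True  (vulnerable)
print(reflects_unescaped(render_greeting_safe))    # False (escaped)
```

Real tools wrap exactly this idea in crawling, payload lists, and response diffing, which is why untouched sites fall so quickly to them.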

2\. New software is constantly produced, and existing software is constantly
augmented, both in sheer lines of code and in overall feature complexity. This
makes the overall number of vulnerabilities increase, and makes
vulnerabilities easier to find overall.

3\. Software continually undergoes changes in abstraction across the industry.
There are entire categories of software developers who know nothing about
compiled languages and who have never developed in them. This makes it both
easier to produce vulnerabilities due to poor understanding of what their code
is doing "under the hood" and harder to find someone capable of exploiting or
patching those vulnerabilities once they are introduced.

4\. The rise of bug bounties has introduced a gold rush for being the first
person to identify security vulnerabilities, which has significantly raised
the competition. This has made it harder overall to find vulnerabilities in
applications that have received previous attention.

5\. The automated prevention or identification of vulnerabilities has made it
easier to prevent them before they appear. Simple SQL injection flaws or
cross-site scripting errors can be found with fully automated software. Even
complex cross-site scripting in the DOM can be reasonably identified through
automated means. Cross-site request forgery may no longer be a serious threat
in the next five years due to simple browser changes. This has made it harder
to find vulnerabilities, but mostly because fewer of these vulnerabilities are
being introduced.
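The "simple browser changes" defusing CSRF are presumably SameSite cookie semantics: the browser stops attaching the session cookie to cross-site requests, so a forged request arrives unauthenticated. A minimal sketch of opting a session cookie into that protection with Python's standard library (the cookie name and value are made up):

```python
from http.cookies import SimpleCookie

# Build a Set-Cookie header for a hypothetical session cookie.
cookie = SimpleCookie()
cookie["session"] = "abc123"
cookie["session"]["samesite"] = "Lax"   # browser omits it on cross-site POSTs
cookie["session"]["httponly"] = True    # not readable from JavaScript
cookie["session"]["secure"] = True      # only sent over HTTPS

header = cookie["session"].OutputString()
print("Set-Cookie:", header)
```

With `SameSite=Lax` (now the default in major browsers), the classic CSRF pattern of a hostile page auto-submitting a form to a victim site largely stops working, because the session cookie never rides along.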

6\. Security has become increasingly mainstream, which means that (much like
bug bounties) there are teams constantly trawling through monolithic, open
source code to find serious deficiencies. As technical debt from outdated,
insecure modes of software development is reduced, it becomes harder to find
vulnerabilities, though it certainly appears as though vulnerabilities are
_increasing_, simply because findings are so widely publicized these days.

I still find things like cross-site scripting when I'm on security
assessments, but it's frankly harder than it used to be. In contrast, things
like insecure crypto, insecure direct object references and API auth logic
errors are on the rise and have been for a few years now.
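An insecure direct object reference of the sort mentioned here can be sketched as follows. The in-memory "invoice store" and function names are hypothetical; the point is that the insecure handler authorizes nothing beyond the client-supplied id, while the fix checks the record against the authenticated user.

```python
# Hypothetical in-memory store of invoices keyed by id.
INVOICES = {
    1: {"owner": "alice", "total": 120},
    2: {"owner": "bob", "total": 310},
}

def get_invoice_insecure(invoice_id: int, _current_user: str) -> dict:
    # IDOR: trusts the client-supplied id and never checks ownership,
    # so any logged-in user can read any invoice by guessing ids.
    return INVOICES[invoice_id]

def get_invoice_secure(invoice_id: int, current_user: str) -> dict:
    # Fix: authorize against the authenticated user, not just the id.
    invoice = INVOICES.get(invoice_id)
    if invoice is None or invoice["owner"] != current_user:
        raise PermissionError("not authorized for this invoice")
    return invoice

print(get_invoice_insecure(1, "bob"))   # bob reads alice's invoice: the leak
print(get_invoice_secure(1, "alice"))   # legitimate access still works
```

Note that no scanner payload triggers this: the response is a perfectly valid invoice, which is why flaws like this (and auth logic errors generally) resist the automation that killed off easy SQLi and XSS.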

Net competition to find vulnerabilities has increased across every sector of
the industry, from the work bug bounty hunters are doing, to the work I do in
consulting, to the work Google's Project Zero does. The tools you mention have
been disproportionately developed for the more competitive areas of security
(bug bounties and web/mobile app sec), while the more complex and specialized
vulnerabilities of the sort Tavis Ormandy finds still require a phenomenal
level of manual research and discovery work before they can be found.

~~~
IA21
Great reply.

