Oftentimes, the hack is through a web front-end. Back-end systems (such as DBs) are heavily firewalled, logged, monitored, etc. and are generally very well protected. Systems guys (OS and DB) know security pretty well and have been doing it for a long time now.
Much of the web software that powers the front-end is complex (PHP, Java, .Net, JS, CSS, SQL, includes, 3rd-party libraries from everywhere, etc). That complexity has a broad attack surface that is difficult and time consuming to test. And many devs are late to the security party (unless we're talking OpenBSD developers).
Management wants to push out new features by X date. Devs have very little time to test and are behind on security anyway. Hackers have all the time in the world to poke at the web front-end and test every possible combination of things until they finally get in.
In a nutshell, that's the problem as I've seen it.
We use SHA1_Pass. It does not store password data. It generates passwords as needed. The info required to generate the passwords is stored on our internal wiki. The secret sentence is only in our heads and never stored or written down anywhere and when an employee leaves, we change the secret sentence. Here is an example map.
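The general idea of generate-don't-store can be sketched in a few lines. This is not SHA1_Pass's actual algorithm — just an assumed minimal version of the scheme: hash the memorized sentence together with a per-site tag (the part you'd keep in the wiki map), and encode the digest as a printable password.

```python
import base64
import hashlib

def derive_password(secret_sentence: str, site_tag: str, length: int = 16) -> str:
    """Deterministically derive a per-site password; nothing is stored."""
    # Combine the memorized sentence with the per-site tag from the map.
    material = f"{secret_sentence}:{site_tag}".encode("utf-8")
    digest = hashlib.sha1(material).digest()
    # Encode to printable characters and truncate to the desired length.
    return base64.b64encode(digest).decode("ascii")[:length]

# Same inputs always regenerate the same password; change the sentence
# (e.g. when an employee leaves) and every derived password changes.
print(derive_password("correct horse battery staple", "example.com"))
```

Since the derivation is deterministic, rotating the secret sentence rotates every password at once, which matches the "change it when an employee leaves" policy.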
This is the same reason I re-wrote a lot of my Python code in C++ many years ago. Distributing one self-contained, statically linked executable just works and even the most clueless user can download and run it.
But I still use a lot of Python and I'm sure this guy still uses a lot of Ruby. Everything has its place.
Very well said. There is no one perfect programming language, no one perfect algorithm and no one perfect data structure for all problems and constraints you will face as a CS practitioner.
Really, a CS education is just preparing you to pick the right solution for the problem/constraints at hand. For example, you can loop through a list. That approach works fine. However, when you begin to scale, you may find that look-ups against a tree-based data structure or perhaps a hash table are much more time efficient at the cost of more complexity, more space and more educated programmers.
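The list-vs-hash-table trade-off above is easy to demonstrate. A rough sketch (numbers will vary by machine): membership tests against a list scan every element, while a set does a hashed lookup.

```python
import timeit

items = list(range(100_000))
item_set = set(items)

# Linear scan: O(n) per lookup, worst case when the item is at the end.
list_time = timeit.timeit(lambda: 99_999 in items, number=100)

# Hash lookup: O(1) on average, at the cost of extra memory for the table.
set_time = timeit.timeit(lambda: 99_999 in item_set, number=100)

print(f"list: {list_time:.4f}s  set: {set_time:.6f}s")
```

For small collections the list is perfectly fine — the hash table only earns its extra space and complexity once you scale, which is exactly the point.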
Ok, I will go out on a limb here and say that I don't unit test (at the moment), as I estimate it would take at least twice as long to write the test code and data as the actual code, since there are many related objects that need to be put together correctly for each test case. I am in the fortunate position of writing an in-house Django app, so basically it is in a constant beta state, and I have around 40 beta testers to tell me when things go wrong.
Now I see that unit testing would have caught a few of the bugs over the last couple of years (but not that many of them) but in our case, adding new features and adjusting the data model to the constantly changing requirements is more important. My code does get tested, just not automatically.
I am not saying that it is a bad idea to unit test, or that I never intend to use it, but for the time being the time costs don't outweigh the benefits.
Also, whenever I look at tutorials there is no advice on how to test the parts I want to test. Instead they demonstrate how to test 2 + 2 = 4. I don't see the point in that when my application is mainly outputting moderately complex SQL query results. I can generate a load of objects in the database, set up unit tests, and have them run each time I update a completely unrelated part of the application, or I can use the real data and check the results are as expected on my development machine. I know which way is more productive for me.
I have projects where I can't manage to get any kind of automated tests in, because it's just too hard to figure out how to do it, way too time consuming. And indeed they don't have tests.
I have other projects where I manage to get good automated test coverage.
Having this experience, I know for sure which projects are more enjoyable to work on, of a higher quality, with fewer bugs, better architecture and more developer productivity -- the well-tested ones every time.
If you write tests from the start, it tends to affect the architecture of the project -- creating a testable architecture, but also generally a better, more maintainable architecture. So the projects where testing is 'too expensive' are often those that were started without tests. Also, certainly, some environments/frameworks/platforms support testing better than others. And I think it's true that some domains are better suited for testing than others -- sadly, in my experience, typical web apps are actually among the hardest things to test well.
I have sympathy for not having figured out how to test in an economical and maintainable way. Sometimes that's me. But at this point I am confident from my own experience that when I can figure out how to test in an economical and maintainable way, it leads to better software and less frustration.
(I suppose there could be a correlation fallacy here, where the 'less problematic' (in some ways) projects are the ones I manage to test on, and it's because they are 'less problematic' that they are higher quality, not because they are tested. All I can say is my experience leads me to believe in tests, even though I still don't use them in every project, because in some projects I can't figure out how to do so economically.)
"Unit" and "automatic" tests are not synonymous. We have a non-trivial system, and mocking the necessary components for unit tests seems like a productivity loss to me. But we absolutely have automated tests; we don't test components in isolation, but as they will behave in production.
However, carefully crafted system tests can exercise the parts of the system you want to exercise. You know the little examples you write up yourself to convince yourself that a new piece of functionality actually works? Turn those into tests, and keep them around. I have been saved by those when a seemingly unrelated change caused an error in something I had not anticipated.
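Turning a throwaway check into a kept test can be as simple as wrapping the snippet in a function with an assertion. A minimal sketch, with entirely hypothetical names (`active_samples` stands in for whatever query helper you were poking at):

```python
# Hypothetical query helper the ad-hoc snippet was convincing you works.
def active_samples(samples):
    """Return only the records whose status is 'active'."""
    return [s for s in samples if s["status"] == "active"]

# The same throwaway check, kept around as a test instead of deleted.
def test_active_samples_filters_inactive():
    data = [
        {"id": 1, "status": "active"},
        {"id": 2, "status": "retired"},
    ]
    result = active_samples(data)
    assert [s["id"] for s in result] == [1]

test_active_samples_filters_inactive()
print("ok")
```

A test runner like pytest will discover and re-run any `test_*` function automatically, which is what catches the seemingly unrelated change later.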
I tend to have strict unit testing with near 100% coverage for libraries. For application code, I use primarily sloppy integration tests. This seems to be the right balance for me anyway. That being said, I try to extract as much app code into libraries as possible.
How do you know this? It frankly doesn't sound like you have worked on an application of similar scope that has good test coverage. I went the first decade of my career without writing tests and said the exact same things you are saying.
I do agree that learning how to write effective tests is difficult and that you cannot do it by reading the web tutorials and the scant coverage given in most books. I learned how by working on an existing project with good coverage.
On my own personal projects what I began doing was writing tests instead of writing most of the exploratory code in the REPL, or instead of writing a stub of a template to test some new code in the browser. I took all that ad-hoc scaffolding that naturally pops up and just structured a little bit and called that my tests. My tests are more integration tests ... I still do not understand the compulsion for low-level unit tests that do nothing other than prove the underlying framework works correctly.
Because I have on numerous occasions tried to get tests set up, and every time another feature request / data model change comes up before I get anywhere close to building all the related objects required for the tests I want to run. (I agree it is integration tests I need rather than low level stuff that proves the framework works as expected).
The majority of the bugs I see I would not have written a test for anyway, as they are usually subtle interactions between the objects and variations that were not anticipated in the feature request.
The design is constantly changing, as I work for a DNA sequencing centre and the technology is constantly advancing. A few bugs would have been caught by automatic testing, but not so many considering the time it seems to take me to get them put together.
It is different working for a big corporate entity that has the resources to devote to these things, but in our case we don't.
> I am in the fortunate position of writing an in house Django app, so basically it is in a constant beta state, and I have around 40 beta testers to tell me when things go wrong.
And I'm sure you can see that if you were in a position where you had thousands of external paying customers relying on your service (who will possibly leave and stop paying if it breaks) that you might appreciate having unit tests and other automation to help you worry less when it comes time to push the big green launch button.