
A combination of wasteful architecture astronomy and a legitimate need to divvy up absolutely mammoth line-of-business applications among teams with hundreds of members operating for years with wildly varying skill levels, often on different subsystems from different physical locations, with billions of dollars on the line. You can't let the least senior programmer in the Melbourne office bring down the bank if he screws something up, so instead you make it virtually impossible for him to touch anything other than the single DAO he is assigned to, and you can conclusively prove that changing that only affects the operation of the one report for the admin screen of a tier 3 analyst in the risk management group for trans-Pacific shipping insurance policies sold to US customers.
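
A minimal sketch of that kind of compartmentalization, with all names invented for illustration: the junior developer owns exactly one implementation class behind an interface someone else controls, so a mistake there can't spill over into the rest of the system.

    import java.util.List;

    // Owned by the platform team; effectively frozen by code review.
    interface PolicyReportDao {
        List<String> findPolicyIdsForRegion(String region);
    }

    // The only class the junior developer is allowed to touch.
    class TransPacificPolicyReportDao implements PolicyReportDao {
        @Override
        public List<String> findPolicyIdsForRegion(String region) {
            // A real system would go through yet another abstraction to the
            // database; hard-coded here to keep the sketch small.
            return List.of("POL-001", "POL-002");
        }
    }

    public class ReportDemo {
        public static void main(String[] args) {
            PolicyReportDao dao = new TransPacificPolicyReportDao();
            System.out.println(dao.findPolicyIdsForRegion("trans-pacific"));
        }
    }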

The tradeoff is, historically, that you're going to have to hire a team of seven developers, three business analysts, and a project manager to do what is, honestly speaking, two decent engineers' worth of work if they were working in e.g. Rails. This is a worthwhile tradeoff for many enterprises, as they care about risk much, much, much more than the salary bill.

(I spent several years in the Big Freaking Enterprise Java Web Applications salt mines. These days I generally work in Rails, and vastly prefer it for aesthetic and productivity reasons, but I'm at least intellectually capable of appreciating the advantages that the Java stack is sold as bringing to users. You can certainly ship enterprise apps in Rails, too, but "the enterprise" has a process which works for shipping Java apps and applying the same development methodology to e.g. a Rails app would result in a highly effective gatling gun for shooting oneself in the foot.)




I never understood what 'risk' in architecture design meant until your comment made me realize it is indeed meant for teams where you really cannot trust all team members (shudders). Thanks, I see more clearly now.


If the decision makers were truly smart, they would've asked to use a provable programming language like Coq, or at least Haskell or something like that (I hear Ada is very popular in military software because of its contract-based programming).


Not true. It's very difficult to unlock a big budget when you have a small team. And with Coq or Haskell, your team won't be very big...

Also, Ada is _a bit_ popular, but not too much. Most of the code is still done in C out of conformism. Object-oriented programming is still considered a dangerously modern move. The rules here are pretty strict, and it's easier to fuck up the regulation than the code... (cough A330 Neo cough)


In other words, the bureaucracy of the code reflects the environment it was developed in.


In other words,

organizations which design systems ... are constrained to produce designs which are copies of the communication structures of these organizations

—M. Conway

(http://en.wikipedia.org/wiki/Conway%27s_law)


> I spent several years in the Big Freaking Enterprise Java Web Applications salt mines [...]

I had to re-read your comment twice, as I thought there was an OTS ERP for salt production.

PS. The company I work for actually grows salt.


I wonder how this evolved to be...

I remember (vaguely) the beginnings of Java, but not how it ended up in the enterprise or why it was loved so much there.


Initially, it aimed at replacing C++ (which it mostly achieved, for LOB apps). The primary market for Sun has always been the enterprise, so it was a natural target.

Then, when the web exploded, people found out that developing against a virtual machine could be safer and more predictable than developing against the operating system, so web-app servers were born. J2EE provided a certain degree of consistency, which other platforms did not have.

As platforms evolved, they tried to generalise their use cases, which introduced a number of abstractions and forced decoupling; and because Java 2 lacked support for a lot of introspection/metaprogramming features, you had to explicitly cover a lot of cases, leading to interfaces multiplying.

It didn't help that at the time, the "Pattern" movement was being popularised and its tools (which are all about decoupling and introducing further levels of abstraction) became overused.
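
A minimal sketch of where those two forces tend to lead, with names invented for illustration: without annotations or classpath scanning, every service grows a separate interface, implementation, and factory, and the wiring is spelled out by hand (or mirrored in an external XML descriptor).

    interface CustomerService {
        String findCustomerName(long id);
    }

    class CustomerServiceImpl implements CustomerService {
        @Override
        public String findCustomerName(long id) {
            return "customer-" + id;  // stand-in for a real lookup
        }
    }

    // Nothing is discovered or generated for you, so construction and
    // wiring get their own layer.
    class CustomerServiceFactory {
        static CustomerService create() {
            return new CustomerServiceImpl();
        }
    }

    public class LegacyWiringDemo {
        public static void main(String[] args) {
            CustomerService service = CustomerServiceFactory.create();
            System.out.println(service.findCustomerName(42));
        }
    }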


Why didn't the enterprise go with a "microservices architecture" (not the current usage of microservices, but you get the idea)? Considering that SOA actually originated in Java land, wouldn't the idea of every team having its own code base be more appealing to the bureaucracy? The programming version of compartmentalization.


Thanks, that's what I was looking for.


"architecture astronomy" is my new funny-yet-insightful phrase of the week. Thanks!



