
What is up with the seemingly popular knee-jerk reaction to OO design patterns? It genuinely feels like some sort of mass hysteria... Yes, strategies like dependency injection might seem like pointless complexity if you don't examine why they exist in the first place, but they do solve genuine problems. And no, these aren't just problems that wouldn't exist if you used a functional language... (there are no silver bullets, or else we'd all be writing Scala or Haskell)

Yes, Java has a lot of problems, but they are mostly ones that have workarounds and aren't fundamental dealbreakers. A few examples: assignment isn't final by default, @Override isn't required when overriding methods, and you can't have abstract static methods (you can sort of work around this now with default interface methods). Java is basically just a really un-opinionated language that requires careful practices to wield properly (see Effective Java by Joshua Bloch), but other than that it isn't a terrible language, and you can write some really robust, easy-to-read code with it.
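
To make the @Override point concrete, here's a minimal sketch (class names are made up): without the annotation, a typo silently introduces a brand-new method instead of an override, and the compiler says nothing.

    class Base {
        void close() { System.out.println("Base.close"); }
    }

    class Derived extends Base {
        // Typo: "cloze" instead of "close". Without @Override this compiles
        // happily as a new method, and Base.close() is never overridden.
        void cloze() { System.out.println("Derived.cloze"); }

        // Annotating the same typo turns it into a compile-time error:
        // @Override void cloze() { ... }  // error: method does not override
    }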

I would also have a lot more time for this article if the author actually took the time to explain the motivation behind the *FactoryFactoryFactory class instead of just discounting it without explanation and using it as the proverbial straw man for his article. I'm sure something this unusual must have an interesting reason for existing (I'm taking him literally here and assuming he isn't just exaggerating for comic effect).




I've personally seen the kind of engineers who simply pattern-match rather than think things through cram as many GoF chapters as they can into a single commit. I was literally told during a code review, when I questioned what these layers were achieving for the product, that 'all abstraction is a benefit since you can then always work at an even higher level'.

Java for whatever reason seems to attract these ideas.


My pet speculation on why Java attracts this stuff is that it's actually a very robust and strongly backward compatible (and forward compatible) language that runs on a very robust VM.

I can take a compiled .jar file from the late 90s and in quite a few cases I can run it right now, unmodified. I can also link against it and use it in Java code pretty trivially. I can do this with modern tools and modern VMs. Then I can copy the result from an x64 box to an aarch64 box and most of the time it will run the same way. That is just amazing.

Java's strength in the areas of robustness and compatibility means it can sustain a lot of complexity. You can build really insanely complicated towers of Babel in Java and they won't topple over. Java's excellent tooling means they'll even be somewhat maintainable.

This transforms Java's great strengths into its great weakness. Its capability to sustain complexity means that nothing checks the tendency of sophomore programmers to over-engineer everything. In other languages over-engineered rococo monstrosities just fall over, but in Java you can actually get away with it to a shocking degree and still ship working robust code.

Try that kind of over-engineering in C++ and you'll have 20 minute compile times for small projects. (I've seen this, so people still try!) Try it in a dynamic language like JavaScript or Ruby and your complex design pattern monstrosity will become a non-deterministic generator of random runtime bugs. These languages can't sustain code that is riddled with FactoryFactoryFactorySingletonFactoryObserverFactory type stuff. Java can, so the Java ecosystem is where the people who love to over-engineer go.


My pet speculation is that Java programmers are culturally predisposed to, and are typically proud of creating towering architectural ziggurats with at least 3 layers of unnecessary abstraction.

It's the only language for which I've ever met "architects who didn't code". It wasn't backwards compatibility that made that happen.


Java today is used heavily in the enterprise, which thrives on utilizing marginal talent. Java didn't look that way in the 90s, and anyway, I bet you'd see drastically different kinds of code comparing enterprise and Android.


> Java's excellent tooling means they'll even be somewhat maintainable.

I think it may be the other way round: excellent tooling seems essential for Java to succeed, since the standard JDK could not even compile a project without ant/gradle/maven etc. Java has tools to parse and filter stack traces, of course, because a few thousand lines of vomit for a missing file or a wrong jar mean you need tooling just to extract the relevant error. A 10-level file hierarchy to read some config data means I can't really code without a full-blown Java IDE.


> the standard JDK could not even compile a project without ant/gradle/maven etc.

I'm sorry, but I think you have to defend that statement.

Certainly, gradle and maven do a LOT to make it easier to BUILD java projects, but compile?

These tools exist mainly to apply common-sense conventions to project source and dependency management, so that you're only specifying the things you really need to. But, like most compilers, javac just needs to know where the compile-time dependencies are and where the source files are. The JDK is quite able to compile without a build tool.
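
For example, a bare-bones invocation looks something like this (the paths and jar name are hypothetical):

    javac -cp lib/some-dep.jar -d out src/com/example/*.java

Everything the build tools add on top of that (dependency resolution, incremental builds, test running) is convenience, not a prerequisite for compilation.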


That actually makes a lot of sense.

You've got VM tech like polymorphic inline caches, and the kind of speculative devirtualization and subsequent inlining that you can only get from a JIT that'll flatten these towers of complexity at runtime.

And you have a simple static type system that can easily be tooled to flatten it in developer mind-space as well.


I guess design patterns should never be applied up front. They are simply well-known solutions to common design problems in OOP. When you encounter your own problem, derive your own optimal solution, and call it some "pattern" if it happens to resemble one.

IMO pattern names are merely for communication purposes, like saying "abstract factory" instead of having to say "I have an interface F for creating instances of interface A, where different implementations of F can be chosen at runtime to create different implementations of A". And also you are not required to call your factory "Factory".
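
As a hedged sketch of exactly that sentence in Java (all names invented):

    interface Message { String render(); }                 // interface A

    interface MessageFactory { Message create(String s); } // interface F

    class PlainFactory implements MessageFactory {
        public Message create(String s) { return () -> s; }
    }

    class ShoutingFactory implements MessageFactory {
        public Message create(String s) { return () -> s.toUpperCase(); }
    }

    class Main {
        public static void main(String[] args) {
            // Which implementation of F is used is chosen at runtime...
            MessageFactory f = args.length > 0 ? new ShoutingFactory()
                                               : new PlainFactory();
            // ...and it determines which implementation of A you get.
            System.out.println(f.create("hello").render());
        }
    }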


OOP has been a massive step in the wrong direction for programming. To take your example of DI: in Clojure, dependency injection can be done in a few lines for the whole application. In Java/C++/PHP and other languages that use constructor or setter injection, it requires all the code to be rewritten around a DI container, and requires many modifications just to get to the point of writing code that actually does something. There is no comparison. We're talking a dozen lines of code vs. dozens of classes spread over dozens of files to achieve the same thing. Minutes vs. hours spent on boilerplate.
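
For reference, the Java-side pattern being criticized is constructor injection, which looks roughly like this (a hypothetical sketch, not any particular framework's API):

    interface Mailer { void send(String to, String body); }

    class SmtpMailer implements Mailer {
        public void send(String to, String body) { /* talk to an SMTP server */ }
    }

    class SignupService {
        private final Mailer mailer;
        SignupService(Mailer mailer) { this.mailer = mailer; } // injected dependency
        void signUp(String email) { mailer.send(email, "Welcome!"); }
    }

    // Wired by hand (a DI container automates this across the whole object graph):
    // SignupService service = new SignupService(new SmtpMailer());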

I wouldn't call this a knee-jerk reaction, but rather a well-thought-out reaction to a few decades of OOP. When I started programming 25 years ago, I felt OO was a misguided idea, but I was a novice and couldn't put my finger on why. Now it's clear to me that all OO does is add unnecessary complexity, and it simply cannot model real life at all. I don't see a single problem it solves better than either the procedural or functional paradigms. Those paradigms have a lot less boilerplate, are a lot easier to understand, and can be learned much more easily.

Giving precedence to code over data is, IMO, a huge, irredeemable mistake. That's literally the mistake OOP is built on. Data should be a first-class citizen in any system, yet it's a second- or third-class citizen in OOP. It is hidden inside objects, processed only by a limited set of methods, and generally hard to work with (you can have data-only objects, of course, i.e. structs, but then you're back to a procedural paradigm). I find myself spending anywhere from 25-75% of my time in OOP languages writing boilerplate to get the right objects talking to each other, and the rest writing actual business logic that matters. In something like Clojure, time spent writing boilerplate is around 5% at most. I'm using Clojure simply as one example; I'm sure there are other functional and procedural languages that are also good examples.
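
In Java terms, the contrast being drawn is roughly this (a hedged sketch; a record stands in for the data-first style):

    // OOP style: the data is hidden and reachable only through its methods.
    class Account {
        private long cents;
        Account(long cents) { this.cents = cents; }
        void deposit(long amount) { cents += amount; }
        long balance() { return cents; }
    }

    // Data-first style: the data is plain, and any function can operate on it.
    record AccountData(long cents) {
        static AccountData deposit(AccountData a, long amount) {
            return new AccountData(a.cents() + amount);
        }
    }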


Because nested factory patterns are a legacy of an era when lambdas were not well understood, or were poorly represented by the type system.

Continued use of factory patterns doesn't deserve outright mockery, but it does deserve immediate and firm correction. It introduces confusion, complexity, and ontology in a place where a nullary lambda would do everything it does and slice away all the ambiguity.
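
To illustrate with a hedged Java sketch (names are invented): the hand-rolled factory interface below is structurally identical to java.util.function.Supplier, so a method reference replaces the whole hierarchy.

    import java.util.function.Supplier;

    interface Widget {}
    class RoundWidget implements Widget {}

    // The GoF way: a named factory type plus one implementation per product.
    interface WidgetFactory { Widget create(); }
    class RoundWidgetFactory implements WidgetFactory {
        public Widget create() { return new RoundWidget(); }
    }

    class Demo {
        public static void main(String[] args) {
            // The nullary-lambda way: same behaviour, no extra ontology.
            Supplier<Widget> makeWidget = RoundWidget::new;
            Widget w = makeWidget.get();
            System.out.println(w.getClass().getSimpleName()); // RoundWidget
        }
    }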


I don't really see why it matters what the motivation was behind it. The point is that Java for some reason seems to be an attractor for that kind of ridiculous code. It is where you end up when taking "OO design" to its logical conclusion. The OO paradigm is like religion: mostly harmless if you don't take it seriously.


I'm sorry but you can't possibly be serious. There is no sane motivation for a *FactoryFactoryFactory class. No problem on earth is complex enough to benefit from that much abstraction.


What, serious about wanting to know why it is in their codebase? Yes, I'm 100% serious. If they're currently using it in production I'd assume there's a good reason for it. Plus, I'm pretty sure the folks over at LinkedIn engineering are definitely serious. In any case I'm very curious as to why (rightly or wrongly) it's in there.


It's in their codebase because they catastrophically over-engineered it. Even the largest codebases on earth don't require FactoryFactoryFactories.


After reading this yesterday, I felt inspired to write a short article about how modern techniques make the factory pattern obsolete (at least in the form the GoF offers, with its specific ontology) and why composed lambda functions offer more functionality with less complexity.

http://extradimensional.space/posts/2017-12-18-Deindustriali...



