Yes, Java has a lot of problems, but they mostly have workarounds and aren't fundamental dealbreakers. A few example issues: assignment isn't final by default, @Override isn't required when overriding methods, and you can't have abstract static methods (though you can sort of get this now with default interface methods). Java is basically just a really un-opinionated language that requires careful practices to wield properly (see Effective Java by Joshua Bloch), but other than that it isn't a completely terrible language, and you can write some really robust, easy-to-read code with it.
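The default-interface-method workaround mentioned above, as a minimal sketch (class and method names here are illustrative, not from the article):

```java
// Java has no abstract static methods, but a default method on an interface
// lets you ship shared behavior that all implementations inherit, which
// covers many of the same use cases.
interface Shape {
    double area(); // abstract: each implementation must provide it

    // default method: shared logic, no separate abstract base class needed
    default String describe() {
        return getClass().getSimpleName() + " with area " + area();
    }
}

class Square implements Shape {
    private final double side;
    Square(double side) { this.side = side; }
    // @Override isn't required by the compiler, but catches typos if used
    @Override public double area() { return side * side; }
}

public class DefaultMethodDemo {
    public static void main(String[] args) {
        System.out.println(new Square(3).describe()); // prints "Square with area 9.0"
    }
}
```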
I would also have a lot more time for this article if the author had actually taken the time to explain the motivation behind the *FactoryFactoryFactory class instead of discounting it without explanation and using it as the proverbial straw man for his article. I'm sure something this unusual must have a very interesting reason for existing (I'm taking him literally here and assuming he isn't just exaggerating for comic effect).
Java for whatever reason seems to attract these ideas.
I can take a compiled .jar file from the late 90s and in quite a few cases run it right now, unmodified. I can also link against it and use it from Java code pretty trivially. I can do this with modern tools and modern VMs. Then I can copy the result from an x64 box to an aarch64 box and most of the time it will run the same way. That is just amazing.
Java's strength in the areas of robustness and compatibility means it can sustain a lot of complexity. You can build really insanely complicated towers of Babel in Java and they won't topple over. Java's excellent tooling means they'll even be somewhat maintainable.
This transforms Java's great strength into its great weakness. Its capacity to sustain complexity means that nothing checks the tendency of sophomore programmers to over-engineer everything. In other languages, over-engineered rococo monstrosities just fall over, but in Java you can actually get away with it to a shocking degree and still ship working, robust code.
It's the only language for which I've ever met "architects who didn't code". It wasn't backwards compatibility that made that happen.
I think it may be the other way round: excellent tooling seems essential for Java to succeed, since the standard JDK can't even compile a project without Ant/Gradle/Maven etc. Java has tools to parse and filter stack traces, of course, because a few-thousand-line vomit for a missing file or wrong jar means you need tools to extract the relevant error from the stack trace. A ten-level file hierarchy to read some config data means I can't really code without a full-blown Java IDE.
I'm sorry, but I think you have to defend that statement.
Certainly, gradle and maven do a LOT to make it easier to BUILD java projects, but compile?
These tools do more to apply some common-sense patterns to project source and dependency management, so that you're only specifying the things you really need to, but, like most compilers, javac just needs to know where the compile-time dependencies are and where the source files are. The SDK is quite able to compile without a build tool.
You've got VM tech like polymorphic inline caches, and the kind of speculative devirtualization and subsequent inlining that you can only get from a JIT that'll flatten these towers of complexity at runtime.
And you have a simple static type system that can easily be tooled to flatten it in developer mind-space as well.
IMO pattern names are merely for communication purposes, like saying "abstract factory" instead of having to say "I have an interface F for creating instances of interface A, where different implementations of F can be chosen at runtime to create different implementations of A". And you're not required to call your factory "Factory", either.
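That long sentence, spelled out as a minimal Java sketch (all names are illustrative):

```java
interface A { String name(); }   // the product interface
interface F { A create(); }      // the "abstract factory": makes instances of A

class FastA implements A { public String name() { return "fast"; } }
class SafeA implements A { public String name() { return "safe"; } }

public class AbstractFactoryDemo {
    public static void main(String[] args) {
        // Which implementation of F to use could be chosen at runtime
        // (config, DI container, etc.); hard-coded here for brevity.
        F factory = FastA::new;
        A product = factory.create();
        System.out.println(product.name()); // prints "fast"
    }
}
```

Note that the factories themselves are just constructor references here, with no "Factory" in any name.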
I wouldn't call this a knee-jerk reaction, but rather a well-thought-out reaction to a few decades of OOP. When I started programming 25 years ago, I felt OO was a misguided idea, but I was a novice and couldn't put my finger on why. Now it's clear to me that all OO does is add unnecessary complexity, and it simply cannot model real life at all. I don't see a single problem that it solves better than either procedural or functional paradigms. Those paradigms have a lot less boilerplate, are a lot easier to understand, and can be learned much more easily.
Giving precedence to code over data, IMO, is a huge, irredeemable mistake. That's literally the mistake OOP is built on. Data should be a first-class citizen in any system, yet it's a second- or third-class citizen in OOP. It is hidden in objects, processed only by certain methods, and generally hard to work with (you can have data-only objects, of course, i.e. structs, but then you're back to a procedural paradigm). I find myself spending anywhere from 25-75% of my time in OOP languages writing boilerplate to get the right objects talking to each other, and the rest writing the actual business logic that matters. In something like Clojure, time spent writing boilerplate is around 5% at most. I'm using Clojure simply as one example; I'm sure there are other functional and procedural languages that are also good examples.
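The "data-only objects" style the comment mentions, sketched in Java itself using records (Java 16+; the domain and names are illustrative):

```java
import java.util.List;

// Data-first style: a record is transparent, immutable data, and the logic
// that consumes it lives in plain functions rather than behind the object.
record Order(String customer, double amount) {}

public class DataFirstDemo {
    // A plain function over data, analogous to the procedural/functional style
    static double total(List<Order> orders) {
        return orders.stream().mapToDouble(Order::amount).sum();
    }

    public static void main(String[] args) {
        List<Order> orders = List.of(new Order("a", 10.0), new Order("b", 5.5));
        System.out.println(total(orders)); // prints 15.5
    }
}
```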
Continued use of Factory patterns doesn't deserve outright mockery, but it does deserve immediate and firm correction. It introduces confusion, complexity, and ontology in a place where a nullary lambda would do everything the pattern does and slice away all the ambiguity.
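The nullary-lambda replacement, as a sketch using the standard library's Supplier<T> (the Widget class is illustrative):

```java
import java.util.function.Supplier;

// Supplier<T> is exactly a nullary lambda, and it replaces a hand-rolled
// WidgetFactory interface wholesale: no extra type, no extra name.
class Widget {
    final String label;
    Widget(String label) { this.label = label; }
}

public class SupplierDemo {
    // Instead of accepting some WidgetFactory, accept a Supplier<Widget>
    static Widget build(Supplier<Widget> makeWidget) {
        return makeWidget.get();
    }

    public static void main(String[] args) {
        Widget w1 = build(() -> new Widget("plain"));       // nullary lambda
        Widget w2 = build(() -> new Widget("configured"));  // a different "factory"
        System.out.println(w1.label + " " + w2.label);      // prints "plain configured"
    }
}
```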