Java ain't perfect, but a lot of hard problems have been solved for the ecosystem and building on top of those solutions will save teams time and money.
I think Jodd is interesting in the same way Node.js is interesting. If you're more comfortable building your framework from the ground up, you end up adding only the pieces you require. If you're happier having an entire application server at your beck and call, start with a full application server and (if necessary) remove what you don't need.
Since the Java ecosystem is huge, this gives me a lot of confidence that most of my needs will be met by existing tools.
JRuby does not wrap primitive values, or provide values that behave like primitives, but you can add that: https://rubygems.org/gems/infraruby-java
However... What's painful with Java EE is the time it takes to understand it. Please correct me if I'm wrong, because I never fully got my head around it, but it seems to take 5 years of experience to build an app using CDI, JAX-RS, JPA, JSF, and JNDI. For each of them you need to choose the implementation provider (Hibernate, Jersey if I'm correct?) and configure it. Then you get bugs like the application not being aware of your JPA setup, and you don't know why, because the spec is convoluted enough to be powerful but then takes months to understand. Whereas a framework like Play Framework can be learnt in 2-5 months.
This is my feeling about Java EE, but have I just been unlucky in my experience? Is it supposed to be quick to get a canonical Java EE app running?
The rest is simply a set of orthogonal specs -- none in any way intellectually or technically challenging. Building a tutorial full-stack JEE app should let you see and touch most of these layers, from client to backend.
It is a good thing that the new hotness in coding is ignoring JEE ..
... or simply use those that come with your certified application server.
Exactly. Many people simply don't know what they can get out of the EE box. Even self-proclaimed EE experts have never heard of e.g. the Web Profile.
Still, I doubt I would use Jodd in a new project. The core components (MVC and data access framework) are completely new to me. It seems that they are developed by the same people behind Jodd. I don't particularly like Spring MVC or Hibernate, but Jodd libraries don't seem to be widely used (only 22 questions on SO and no forum or mailing list).
Besides the poor adoption, from a cursory look at the documentation, I couldn't find any striking feature. Actually, the DB mapping module seems quite convoluted.
Also, I don't get this strong emphasis on the framework being "lightweight". Who cares if my project uses 3MB or 20MB of libraries?
As others pointed out, also take a look at Dropwizard and Ninjaframework.
Mr Spasic seems to be something of a virtuoso!
Being concise is good the first time around. If someone needs to fix the "concise" code years later, the only option may be to throw it away and write it again.
The main place where older code in readable languages (e.g. not J, Common Lisp, or Perl) becomes hard to read is when new language conventions have replaced the old.
But there's also a lot of verbosity that adds no (or very little) value, e.g. System.out.println() instead of something like just println() or the several lines of boilerplate needed to run something in a new thread.
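To make the thread-boilerplate point concrete, here's a minimal sketch (the class and counter are my own, purely illustrative) contrasting the classic anonymous-inner-class way of running something on a new thread with the Java 8 lambda form:

```java
import java.util.concurrent.atomic.AtomicInteger;

public class ThreadBoilerplate {
    static final AtomicInteger counter = new AtomicInteger();

    public static void run() throws InterruptedException {
        // Pre-Java-8: a full anonymous inner class just to run one statement.
        Thread verbose = new Thread(new Runnable() {
            @Override
            public void run() {
                counter.incrementAndGet();
            }
        });
        verbose.start();
        verbose.join();

        // Java 8 lambdas strip away most of that ceremony:
        Thread concise = new Thread(() -> counter.incrementAndGet());
        concise.start();
        concise.join();
    }

    public static void main(String[] args) throws InterruptedException {
        run();
        System.out.println("increments: " + counter.get()); // prints 2
    }
}
```

Lambdas help with the thread case, though the `System.out.println()` verbosity itself remains.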
Discussion and project planning is happening here:
If you want to help out make sure you stop by and say hi :-)
Bloat bothers me, but it has benefits too.
Why is C++ so successful? Because it contains a lot of features. Ask random C++ developers what their favorite feature of C++ is and you will get a very wide spectrum of answers.
I see Java libraries a bit like this: I don't value them for the sheer number of features they have, but because I definitely need some of those features, and as long as I can use them without having to understand the rest of the library, I'm good.
Guava is a very good example of this: an assortment of convenience functions from a very wide range of domains that you can use without ever having to read or understand the entire Guava documentation.
Having small libraries is pretty much a guarantee for subtle bugs because the author went for the 80%/20% approach and then moved on to the next project. Joda is a good example of that: time libraries are extremely difficult to get right, and yet getting them right is absolutely critical. I will gladly pay the price of including the entire Joda library in my application even if I use 10% of it just for the certainty that it's battle tested, something I very much doubt the collection of libraries presented here does.
That's what is always said, but is it actually true? I think a couple of answers will be prevalent and at the top of the list:
- Deterministic destruction
- Manual memory management / no GC
- C interoperability
Would be fun to actually do such a survey and see if there is more spread than other languages.
In Java, as far as I know, this would be utterly impossible, because implementation of an interface must be explicitly declared. Even though Double supports addition and scalar multiplication, it could not be used in such a function, because it doesn't explicitly state that it implements such an interface.
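A minimal sketch of that point, using a made-up `Addable` interface and `Money` class: Java's typing is nominal, so even a type that has the right operations cannot satisfy an interface it never declares.

```java
// Hypothetical interface for types that support addition.
interface Addable<T> {
    T plus(T other);
}

// Our own type can opt in by declaring the interface explicitly.
class Money implements Addable<Money> {
    final long cents;
    Money(long cents) { this.cents = cents; }
    @Override
    public Money plus(Money other) { return new Money(this.cents + other.cents); }
}

class NominalTyping {
    // Only types that *declare* Addable are accepted here.
    static <T extends Addable<T>> T sum(T a, T b) {
        return a.plus(b);
    }

    public static void main(String[] args) {
        // Double has addition, but never declares `implements Addable<Double>`,
        // so this would NOT compile:
        // Double d = sum(1.0, 2.0);

        Money total = sum(new Money(200), new Money(300));
        System.out.println(total.cents); // 500
    }
}
```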
I currently work with Django and Wordpress, so I'm not totally new here. But last time I dived into this world, I could not make heads or tails of it.
Java seems like a powerful tool to have in your toolbox, but from what I've found it does not lend itself to being easily picked up.
I see that there is a Dropwizard module for templated views, which looks reasonably nice:
I'm not sure how well it supports the other aspects of a web application.
However, as soon as you want to go off the beaten path (eg to have two databases), Spring starts to become an obstacle rather than an assistance. In this respect, it's similar to Rails or any other large, opinionated framework.
The primary deficiency I've found is startup time - when I move to other systems, I remember - oh yeah, projects can start up immediately, or in < 3 seconds, not 50. I know 3 is better in this regard, and I'm looking to migrate one project to it next month.
Grails never got the mindshare because it was jumping in to a full ecosystem. Rails, by comparison, was essentially the only game in town if you wanted to do Ruby - there weren't any other major competing frameworks.
The other big strike against Grails is... "it's different!". Most shops that are large enough to see big benefit from Grails and similar frameworks also tend to be rather stodgy and slow to change (perhaps with good reasons, but slow anyway). The sort of people who champion 'new' tech largely migrated away to completely other ecosystems (rails, node, etc). Grails/Groovy are 'different' in a scary way for "slow to change" shops, and "not different enough" for the folks who are always looking for "new/shiny" stuff to dig in to and promote. It was/is too "middle of the road", I think, to benefit from the advocacy often required for adoption. And in a crowded JVM-tech field, that hurts.
That said, it's still a great stack, and I still recommend it for many situations.
I would feel more comfortable picking Kotlin despite its newness: it has everything Groovy offers but it also fixes all the problems I listed above.
They're an Apache supported project now.
As for being happy about something, if Groovy development is being effectively led by its technical people as well as not being dictated to by the applications using it, then I'd be optimistic for its future (the Codehaus-cum-Apache implementation of it anyway).
Groovy made Java slightly less painful to use, but added its own warts.
Today we have languages which are better than Java without adding all of Groovy's mistakes.
> They're an Apache supported project now.
You mean the place where software projects go to die?
Care to elaborate?
By the way, I think they went overboard with the marketing. Jodd libraries are all of: efficient - elegant - fast - lightweight - micro - powerful - slick - super - tiny - versatile. It is almost like they picked a random superlative for each piece. I get the point! I see lots of projects going this route on their marketing materials and it ends up being a bit annoying.
Sorry, I could not help myself :-).
I think it would help if people backed their claims with some actual data: a memory/CPU/lines-of-code comparison against mainstream alternatives or standard libraries.
You can really judge for yourself how it compares to the alternatives you're familiar with.
In terms of memory/CPU there are plenty of benchmarks out there, such as the benchmarks game site http://benchmarksgame.alioth.debian.org/
A small Luminus app will take around 100 megs https://groups.google.com/forum/#!topic/clojure/mQbcz82I7iI so that's pretty lightweight by most standards.
In general though while people seem to be obsessed with benchmarks of all kinds, performance is simply not going to be an issue for the vast majority of sites out there. The real question is how productive your stack is and how easy it is to work with.
The JVM doesn't automatically mean something has to be bloated. A small Clojure web app will compile to a roughly 30 meg runnable standalone jar. That's pretty light in my book. On top of that, you get excellent performance.
And to repeat some of the mistakes of the past, such as having the classes be mutable, and not separating zoned and unzoned times. It also contains some novel wackiness of its own: new instances of JDateTime take default values for some settings (time zone, whether date arithmetic should be clamped to valid dates, etc.) from fields in the class JDateTimeDefault. But these fields are static, not final, and public, so any code in the JVM can change them, thus changing the behaviour of JDateTime objects created by completely unrelated code!
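The hazard of public, static, non-final defaults can be sketched with a deliberately hypothetical pair of classes (this is not Jodd's actual API, just the pattern being described): any code anywhere in the JVM can flip the global default and silently change the behaviour of unrelated code.

```java
// Hypothetical global-defaults holder (NOT Jodd's real API).
class ClockDefaults {
    public static String timeZone = "UTC"; // public, static, NOT final
}

// Hypothetical value type that reads the global default at construction time.
class Stamp {
    final String zone;
    Stamp() { this.zone = ClockDefaults.timeZone; }
}

class GlobalDefaultHazard {
    public static void main(String[] args) {
        Stamp a = new Stamp();            // zone = "UTC"
        ClockDefaults.timeZone = "PST";   // some unrelated library mutates the default...
        Stamp b = new Stamp();            // ...and this object silently gets "PST"
        System.out.println(a.zone + " vs " + b.zone); // UTC vs PST
    }
}
```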
As of Java 8, there is a pretty comprehensive and well-designed date-time package in the JDK:
That package was designed primarily by Stephen Colebourne, who designed Joda-Time, and essentially constitutes Joda-Time: The Next Generation. I don't see any reason to use JDateTime over it.
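For example, java.time keeps zoned and unzoned values as separate immutable types, and date arithmetic clamps to valid dates without any global mutable defaults:

```java
import java.time.LocalDate;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class JavaTimeDemo {
    public static void main(String[] args) {
        // LocalDate has no time zone; ZonedDateTime does. They are distinct types.
        LocalDate date = LocalDate.of(2015, 1, 31);

        // All java.time types are immutable: plusMonths returns a NEW object,
        // clamped to the last valid day of the resulting month.
        LocalDate next = date.plusMonths(1);
        System.out.println(date); // 2015-01-31 (unchanged)
        System.out.println(next); // 2015-02-28

        // Attaching a zone is an explicit conversion, not a hidden default.
        ZonedDateTime zoned = date.atStartOfDay(ZoneId.of("Europe/Belgrade"));
        System.out.println(zoned.getZone()); // Europe/Belgrade
    }
}
```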
Curious, what do they mean by 'micro'?
Here you go:
These are framework classes that you will never have to see or use when building a Spring application.
Spring is surely not perfect, but most people don't have the slightest clue what a modern Spring application (read: Spring Boot) looks like.
Spring started life as a dependency injection framework, and that still constitutes the central part of it. Dependency injection is the mechanism by which components of an application are created and assembled, and so there was a natural expansion of Spring into providing those components. Spring now provides a web framework, plumbing for database persistence, batch processing, and all sorts of other things:
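The core idea can be sketched without Spring at all (the `MessageStore`/`Greeter` names here are made up): components receive their dependencies from outside rather than constructing them, and a container like Spring automates exactly this kind of wiring.

```java
// A dependency the component needs, expressed as an interface.
interface MessageStore {
    String fetch();
}

class InMemoryStore implements MessageStore {
    @Override
    public String fetch() { return "hello"; }
}

class Greeter {
    private final MessageStore store;

    // Constructor injection: Greeter never creates its own MessageStore.
    Greeter(MessageStore store) { this.store = store; }

    String greet() { return store.fetch() + ", world"; }
}

class Wiring {
    public static void main(String[] args) {
        // With Spring, the container does this assembly from annotations/config;
        // here we wire by hand to show what "dependency injection" means.
        Greeter g = new Greeter(new InMemoryStore());
        System.out.println(g.greet()); // hello, world
    }
}
```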
AIUI, Boost is a collection of libraries. Spring is much more frameworky than that. The general approach with Spring is to add a particular module to your project, sprinkle a few magic annotations on your code, then watch in wonder as lots of sophisticated stuff happens, exactly as Spring's designers foresaw. Usually.
There are two main alternatives to Spring. Firstly, Java EE, which is similar in spirit, smaller in extent, and the product of a decades-long standards process involving some of the world's most ponderous companies, but really not as bad as that might suggest:
And in fact, I'd describe Spring as an alternative to Java EE, rather than vice versa.
Secondly, not using a framework at all, and just putting everything together yourself. To a large extent, Dropwizard etc are the result of people doing this, then posting the reusable bits of their application to GitHub.
Java's equivalent of Boost is probably some combination of Apache Commons:
And Google Guava:
The difference is that if you're using Spring, then you are writing a Spring application. Every part of your code will be pervaded by Spring, and you will have to do things in the way Spring prescribes. If you're using Commons or Guava, your application is your own, and those libraries will crop up here and there, where you choose to use them.
Also related is the forced convention of requiring one directory nesting level per package name component, resulting in deeply-nested mostly-empty directories that have a tendency to easily exceed filesystem limits.
Here's another ridiculous class name, found in a different project (AspectJ):
AbstractProxyFactoryBeanInstanceSingletonCommand is not, because it is too abstract. Typing the name is not so bad though.
Customisation over configuration.
Composition over Inheritance.
When using maven you can pick and choose the modules you want.
Edit: The UI generates the dependencies for your pom. I know how Maven works, and it's a pain to copy all those XML snippets into your pom.
I know DropWizard is like a distro of select "best" projects out there, glued together with some configuration and instrumentation code. That makes them easier to use while leveraging the battle-tested-ness of established libraries like say Jackson for JSON.
Is this more of a "built from scratch" set of frameworks? How battle-tested are they?
Jodd itself could remove the need for an app container by using something like Winstone (Jenkins is built with it), allowing deployment models closer to node.js or Flask/Sinatra: minimal scaffolding, so it's easier to scale a system horizontally by just deploying more dumb nodes.
I've seen Jodd used in production systems and have just never seen a stack trace fail down to that level so far. I've only ever seen application code fail on its own, or fail due to a side effect or a misunderstanding of the underlying libraries. I've seen FAR more bugs due to the ugliness of the old java.util.Date/Calendar classes than Joda-Time, for example, but would we argue Joda-Time is more or less mature?
Maybe I didn't answer your question, but I'd say that if you're looking for super-stable libraries, then on track record alone you should just use core Java. The very point of minimalism, from a reliability perspective, is to reduce points of failure by simply having fewer parts. And really, given the number of deployment headaches I've had resolving random classes across an app container, I could do with one less transient dependency in my project.
Thanks for your reply but I didn't understand this part of it. Core Java doesn't have a lot of this functionality. Are you proposing it would be better to write your own JSON serializer in core Java, for example, than using a battle-tested library like Jackson?
So in a scenario like being stuck with Java 6 I'd say you could use libraries like Jodatime and Jodd to make up for certain inconveniences, but it's up to you to decide if your application is fine with the AOP in Jodd v. Guice v. Spring v. whatever J2EE standard is trying to catch up after the fact.
Disclaimer: I have nothing to do with this project.
It looks interesting though.
Man, this just sounds so lightweight!
The conspiracy, what most people know is true yet are afraid to admit, is that the Markov chains are trained entirely on past failed code projects. Like cows chewing their own cud, like dogs lapping up their own vomit. Soylent Green is People and Libraries are Pure Virtual.
____ is set of ____ micro frameworks, tools and utilities, under ____ MB. Designed with common sense to make things simple, but not simpler. Get things done! Build your Beautiful Ideas! Kickstart your Startup! And enjoy the _____.
is this the future of technology, or has the future already arrived
I decided to do a more rigid Madlib-style generator. Perhaps someone wants to use Markov chains to generate the page bodies for these headlines.
After observing myself and a couple of developer friends over the last few years, I can most definitely say yes. We are addicted to building our own things, creating layer after layer of abstractions, frameworks, and tools, because it makes us feel like we're creating something.
What saves some of us is taking greater satisfaction in seeing someone else actually use our things than in creating them. And so the goal becomes more "what do people need" rather than "what would I have fun building today".
1. A need to cut your teeth on a solved problem when diving into a new category of thinking.
2. The high cost of transferring a model of thinking between two humans, especially over a vast amount of time. I think 3 years is vast compared to maybe 100 work hours to model and implement the core of a problem.