1) Geeks _love_ to debate languages, especially ones they _know well_
2) Geeks also love to trash languages that they don't know that well
3) There's always the claim of an N-fold increase in productivity without any scientific backing (yet we always stress that we should measure everything... seems like a paradox, eh?)
4) This is how we do politics: debating languages, tools, frameworks.
Use whatever you like. That's the bottom line.
Joel used ASP.NET for FogBugz and ASP.NET MVC without an ORM for SO.
LinkedIn, TripAdvisor, Jive, and some Google properties use Java.
Facebook, Flickr, and Wikipedia use PHP.
You are far more productive using the tools you know.
Building products requires more than just syntax. Libraries and tools (build tools, packaging tools, deployment tools, documentation tools, etc.) are all as important as the programming language itself.
It's about time we asked for proof, not just blog-post claims.
To be pedantic, that's not true. As I understand it, FogBugz was originally written in VBScript (Classic ASP) and then ported to Wasabi, Fog Creek's own language and compiler, which spat out PHP and VBScript.
However, I still agree with your point and this probably makes it an even stronger argument.
Usually, fresh (and valid) criticism can only be found when someone who knows a language abandons it in favor of another. Some of my criticism of Java and C# would probably refer to features and techniques that are now outdated. Other aspects, however, would remain as they were: a tendency to depend on overly complicated tooling, verbose syntax, and relatively low productivity (headcount-wise we are mostly a Java shop here, while delivered-product-wise it's a much more even split, with most Java projects being delivered late). That part of my criticism should still be valid.
Fast forward to 2010-2012, and the criticisms themselves had become outdated.
Having said that, I do want to ask you to refrain from commenting on non-issues such as the IDE-plus-static-language debate and the LoC-inflated-by-formatting debate, because those are heavily a matter of preference, and I think discussing them further would be a waste of both your time and mine.
Side note: I don't think Eclipse is any harder to master than Vim (unless, of course, your developers use Notepad), and Maven seems... more stable and less of a moving target compared to similar tools in other environments.
I don't write much boilerplate code these days because I use Spring Data and other modern frameworks (JAX-RS, JAX-WS). So I'm still confused when people keep saying "tons of boilerplate code". Can you quantify these "tons"? 2 extra lines? 10 extra lines? In what context or situation? Can you walk through some examples so I can see your pain point?
Gone are the days of boilerplate XML as well, since frameworks these days are moving toward annotation-heavy configuration.
Much depends on context: we built a moderately complicated "portal" that communicates with 4 different data sources (2 web services, 2 scheduled DB dumps) using Java in about 6 months. The software has to be deployed in 4 different environments (LOCAL, DEV, UAT, PROD). Bending Maven to meet certain build/packaging requirements was simple.
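For illustration, per-environment builds like the one described above are typically handled with Maven build profiles. This is only a sketch, not the actual build; the profile IDs and property names are made up:

```xml
<!-- pom.xml fragment: one profile per target environment.
     Select at build time with e.g. `mvn -PUAT package`. -->
<profiles>
  <profile>
    <id>LOCAL</id>
    <activation><activeByDefault>true</activeByDefault></activation>
    <properties><env.name>local</env.name></properties>
  </profile>
  <profile>
    <id>UAT</id>
    <properties><env.name>uat</env.name></properties>
  </profile>
  <!-- DEV and PROD profiles follow the same shape. -->
</profiles>
```

Resource filtering or per-profile config files can then key off `env.name` to produce the right artifact for each environment.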
Maintenance has been a breeze so far: no downtime (we use the latest GlassFish), once the DB locked (but that was the DB), and once JPA/Hibernate bailed on us due to the size of the data we pulled (we trimmed the data down a bit and the issue was gone). DB migration was a walk in the park using Flyway.
After going through Rails, Python, Java, and lately C#, my mindset has definitely changed when it comes to the holy-grail-of-productivity debate: they all suck, just with different levels of problems.
Interesting choice: the Reddit guy gives up Lisp for Python rather than giving up the Mac desktop for a BSD/Linux desktop. I would rather do the reverse, and move to where Lisp works great on both client and server.
I also wonder whether the Reddit team's decision might have been Clojure if Clojure had been mature when they developed Reddit. In my experience, Clojure is roughly 1/3 as fast as Common Lisp and takes just 2 or 3 times the memory for a typical application - a good alternative, depending on the application.
I'm a huge Python fan, but I'm also a realist: the Python ecosystem (in terms of vendors, products, customers and professionals involved, as well as libraries and tools) is nowhere near the mainstream Java/C++/whatever-Microsoft-is-pushing-this-year. Python is not the language of choice on any mobile platform, for example; it barely registers in the humongous "enterprise" space; SDKs for most hardware devices (or anything else, really) will usually list Python as the third or fourth choice, if at all. It's doing reasonably well in scripting, automation, Linux administration, 3D and web, but that's a far cry from being a clear favorite for "general-purpose programming".
Do you think that if the Reddit developers were faced with the same decision today, the progress you mention might indeed push the pendulum the other way?
The situation is improving, though, and will continue to improve as long as people scratch CL itches instead of switching to something else.
I honestly don't understand this sentiment. For me it's nearly impossible to work in Lisp and not think "if only this were some other language..." The baseline functionality I'm used to from other languages, even type checking, is absent. It's like writing in assembly, just for the degree of control you have compared to managed code. Writing something in assembly can be fun in its own way, but the result will almost certainly be awful, and you'll end up solving problems that have built-in solutions in other languages.
7 years ago I couldn't (using available libraries) reliably parse, in Lisp, a DB dump in XML (~8GB), select some records based on a set of calculations, and convert it to CSV. Most libraries insisted on holding the whole document in memory, some leaked like a sieve and caused heap exhaustion, and others couldn't handle UTF-8. In the end, I could do it in Python (a couple of hours to develop; 45-60 minutes to run) and in Java (several hours to develop; ran in ~8 min), but I couldn't do it in Lisp without writing a custom XML parser or delving into FFI. Maybe it was an abnormal case, maybe not.
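For what it's worth, the streaming approach that makes this feasible in Python is `xml.etree.ElementTree.iterparse`, which walks the document incrementally instead of holding it all in memory. A minimal sketch of the XML-to-CSV task; the `record`/`id`/`value` element names are invented for illustration, since the real dump's schema isn't given:

```python
import csv
import io
import xml.etree.ElementTree as ET

def xml_to_csv(xml_file, csv_file, keep):
    """Stream <record> elements from xml_file and write the rows that
    pass keep() to csv_file, clearing each element once processed so
    memory use stays bounded even for multi-GB inputs."""
    writer = csv.writer(csv_file)
    writer.writerow(["id", "value"])
    for event, elem in ET.iterparse(xml_file, events=("end",)):
        if elem.tag == "record":
            row = (elem.findtext("id"), elem.findtext("value"))
            if keep(row):
                writer.writerow(row)
            elem.clear()  # free the processed subtree

# Tiny usage example with an in-memory "dump":
xml = io.StringIO(
    "<dump>"
    "<record><id>1</id><value>10</value></record>"
    "<record><id>2</id><value>99</value></record>"
    "</dump>"
)
out = io.StringIO()
xml_to_csv(xml, out, keep=lambda r: int(r[1]) > 50)
```

The `keep` predicate stands in for the "set of calculations" selecting records; the same pattern scales to the 8GB case because only one record's subtree is live at a time.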
I miss macros and the expressiveness; I miss the community; I don't miss the insanity.
These things happened because some people decided to improve CL (both implementations and libraries) instead of using something else.
Good enough is good enough.
If you approach Lisp the same way you'd approach C or Python, you'll fail.
This is the reason why the Clojure community is taking libraries very, very seriously.
reddit now runs on a few hundred Ubuntu EC2 instances.
cperciva (HN username), FreeBSD security guy, maintains some AMIs for it: http://www.daemonology.net/freebsd-on-ec2/
I don't want to become one of those diehard __-advocates, but I'm pretty impressed with Clojure. Speaking from experience, I think it solves quite a few of the pitfalls mentioned in the OP. I wrote up an extensive post on how to learn Clojure, and veteran Lispers can jump to the end and catch the links on Clojure for Lispers: