> a. Lisp isn't a single language, but a family of languages. There's Common Lisp, Scheme, Clojure and even Lisp Flavoured Erlang. They're all similar, but different enough that you can really shoot yourself in the foot if you assume code for one will do the same thing (or even work) in another. C and C++ have a long history of backward compatibility and a strong spec that ensures interoperability between different implementations (mostly). Java and C#, for all their perceived faults, have well-defined language specifications.
This is a really good point. Thinking that Common Lisp and Clojure are closely related just because they both use s-exp syntax is like thinking that Java and C# are somehow the same (or compatible) because their syntax looks oddly similar.
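To make the "similar but different" point concrete, here is a minimal sketch of one well-known divergence: how the dialects treat the empty list in a conditional. The snippet is Common Lisp; the comments note what the visually identical form would do in Scheme and Clojure.

```lisp
;; Common Lisp: NIL and the empty list are the same object, and NIL is false,
;; so this form evaluates to "empty".
(if '()
    "non-empty"
    "empty")

;; The same-looking form behaves differently in the other dialects:
;;   Scheme:  only #f is false, so '() is truthy and the result is "non-empty".
;;   Clojure: only nil and false are falsy, so '() is truthy and the result is "non-empty".
```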
So the question really is: why aren't languages that use s-exps more popular? My answer is that s-exps lend themselves to homoiconicity. The human brain is actually really good at detecting patterns, and homoiconicity doesn't play to that strength (even though it is what computers handle best). Homoiconicity means everything looks the same (code is data), but when everything looks the same, it's hard to pick out patterns. Contrast that with C code: at a glance you can tell what each part of the code is doing, because function calls look different from definitions, and so on.
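As a rough illustration of "everything looks the same" (the names here are made up for the example), a definition, a conditional, and a loop in Common Lisp are all just nested lists with an operator in head position:

```lisp
;; A definition, a conditional, and an iteration are all
;; (operator arg1 arg2 ...) forms, so nothing stands out at a glance.
(defun classify (n)
  (if (evenp n)
      :even
      :odd))

(loop for n from 1 to 10
      collect (classify n))
```

The equivalent C (a function definition, an if, and a for loop) gives each construct its own distinct surface shape, which is exactly the visual cue being described.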
In a weird way, the human brain really likes irregularities: they are the visual patterns it latches onto.