In some important cases, languages already share common syntax (such as mathematical operators, regular expressions, and string escape sequences like "\n"). And in those cases, the simplest explanation seems to be: "there was no other sane way to do it".
Some differences really are arbitrary, and could theoretically be merged. Some are historical, e.g. someone who worked on Unix for years thinks in terms of verbs like "echo" because that's what shells used.
But usually, there are very good reasons for any differences.
One reason is the inherent complexity of a parser for a full programming language. It turns out that it's fairly easy to confuse a parser, because parsers just aren't as good as human beings at correctly interpreting the "intent" of a statement. Most languages don't have a reason for being unless they introduce a bunch of unique constructs, and they're lucky if their own constructs can be parsed unambiguously, much less after adding support for extra "standard" constructs from other languages.
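To make that concrete, here's a contrived C++ sketch (the names Widget and Timer are purely illustrative) of a parser reading something differently than the human intended, the declaration C++ folks call the "most vexing parse":

    struct Timer {};

    struct Widget {
        Widget(Timer) {}
        void start() {}
    };

    int main() {
        // Intent: construct a Widget from a temporary Timer.
        // Actual parse: a declaration of a function named w that returns
        // Widget and takes a (pointer to) function returning Timer.
        Widget w(Timer());
        // w.start();          // error: w is a function, not an object

        Widget ok{Timer{}};    // C++11 brace initialization says what we meant
        ok.start();
        return 0;
    }

The grammar itself isn't ambiguous (declarations win), but the parser's reading and the programmer's intent part ways, which is exactly the problem.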
Yet another reason is that languages aren't as universal as you may think. For example, what would 'print "hello world"' mean in a makefile...when would the print-out occur? Would languages be allowed to ignore "standard" expressions that they can't logically use, or be forced to invent some interpretation of them?
A final reason is what happens when one language is embedded in another, something your own example, PHP, actually illustrates. When that happens, it's a plus that the two have different syntax: it clearly separates one from the other, and reduces the risk that you'll have to (for instance) escape or namespace every single name used by the embedded code to avoid collisions with the surrounding file.
> In some important cases, languages already have common syntax (such as mathematical operators
Except that they don't. Almost everyone agrees about the relative precedence and associativity of the two-argument forms of addition, subtraction, multiplication, and division, but that's about it.
And then there are the folks who think that multiplication is denoted by a space. They call themselves mathematicians. Don't they know that space is concatenation or ignored?
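To illustrate (my own contrived example, in C++ only, with the other notations relegated to comments), here's how quickly the "common syntax" runs out once you leave the four basic operators:

    #include <cmath>
    #include <cstdio>

    int main() {
        // C and C++ have no exponentiation operator at all. Python and
        // Fortran spell it **, R and most calculators spell it ^, and the
        // precedence and associativity rules differ between them.
        double p = std::pow(2.0, 10.0);

        // Juxtaposition, the mathematician's multiplication, is simply a
        // syntax error here:
        // double q = 2 p;   // does not compile
        std::printf("2^10 = %g\n", p);
        return 0;
    }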
> One reason is the inherent complexity of a parser for a full programming language.
For some full programming languages, maybe. Other languages are so easy to parse that people have added support for Roman numerals, Turing-complete read macros, and other things.
> It turns out that it's fairly easy to confuse a parser, because they're just not as good as human beings at correctly interpreting the "intent" of a statement.
It turns out that people aren't all that good either - they're subject to a 7+/-2 rule. Plus, very few people can reliably remember 10 levels of operator precedence.
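For a concrete (and contrived) C++ example of how precedence defeats humans as easily as it defeats parsers:

    #include <cstdio>

    int main() {
        int flags = 4;   // any value will do

        // Intent: test whether the low bit is clear.
        // Actual parse: flags & (1 == 0), i.e. flags & 0, which is always 0,
        // because == binds tighter than & in C and C++.
        if (flags & 1 == 0)
            std::printf("never printed, whatever flags is\n");

        if ((flags & 1) == 0)   // parentheses restore the intended meaning
            std::printf("low bit of flags is clear\n");
        return 0;
    }

GCC and Clang will even flag the first test when warnings are turned on, precisely because people so rarely mean what it actually parses to.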
> Most languages don't have a reason for being unless they introduce a bunch of unique constructs; and they're lucky if their own constructs can be parsed unambiguously.
Huh? Apart from some nastiness involving >> when C++ templates were introduced, how many programming languages actually have ambiguous parses? (Yes, many parse things in ways that some humans find "unintuitive", but that's different.)
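For reference, the C++ case I mean is the closing >> of nested templates (illustrative snippet, nothing more):

    #include <vector>

    int main() {
        // In C++98/03 the closing ">>" was lexed as the right-shift operator,
        // so this declaration needed a space: std::vector<std::vector<int> >.
        // C++11 changed the rule so ">>" can close two template argument
        // lists at once.
        std::vector<std::vector<int>> grid(3, std::vector<int>(3, 0));
        return grid[1][2];
    }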