Unfortunately I think it may have missed its peak opportunity to catch on because of this kind of problem - and a chronically buggy and unstable Eclipse plugin - but we'll see.
You really can start by writing Java-in-Scala and then gradually making use of the more advanced features as you need them.
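A minimal sketch of what that gradual path can look like (hypothetical snippets, not from the thread): the same function written Java-in-Scala style, then idiomatically.

```scala
// Java-in-Scala: explicit mutation and an index loop, exactly how a
// newcomer from Java might write it - and it compiles and runs fine.
def sumSquaresJavaStyle(xs: Array[Int]): Int = {
  var total = 0
  var i = 0
  while (i < xs.length) {
    total += xs(i) * xs(i)
    i += 1
  }
  total
}

// The same computation using Scala's collection combinators, the kind
// of feature you can pick up later without rewriting anything else.
def sumSquares(xs: Array[Int]): Int = xs.map(x => x * x).sum
```

Both versions behave identically; nothing forces the switch, which is the point being made above.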
Clojure is great too, of course, but I think Scala is much more likely to gain a foothold in the programming mainstream.
If I could rename Clojure and GIMP I'd do so in a heartbeat.
I'd much rather everything not also be something else.
GIMP is actually appropriate because AFAIC the program is completely crippled.
They sound identical, in fact, when Clojure is pronounced properly.
> "Dynamic" is technical jargon used by programmers, meaning "good". It derives from the Latin dyno mite, meaning "I am extremely pleased", and is first recorded in the historical work Bona Aetas of noted Roman sage and pundit J.J. Walker. Its meaning evolved in the 4th century after monks copying an obscure manuscript on programming linguistics in their ignorance tried to deduce its meaning from context. [...]
(the comment continues at the link; read the whole thing, it's pretty funny whether or not you agree with the sentiment)
Summary: No. At the time Scala was started, built-in support for XML showcased one area where object-oriented decomposition failed and functional programming proved quite useful. Today the benefits of FP are better known, and it's generally agreed that if Scala were starting from scratch it would not include XML literals. However, given that XML literals are in the language already, that large code bases have come to rely on them, and that they generally don't get in the way if you don't use them, it's unlikely that they will be removed.
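For anyone who hasn't seen the feature being discussed, a minimal sketch of Scala 2's XML literals (part of the standard library in early versions, the separate scala-xml module later, and dropped from Scala 3 entirely):

```scala
// XML is a first-class expression; braces escape back into Scala
// for interpolation of values and attributes.
val who = "world"
val greeting = <message to={who}>Hello, {who.toUpperCase}</message>

// greeting is a scala.xml.Elem, so the usual operations apply:
val text = greeting.text          // concatenated text content
val recipient = (greeting \ "@to").text  // attribute lookup
```

This is exactly the kind of feature that "doesn't get in the way if you don't use it" - it adds syntax but no runtime obligation for code that never touches it.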
Macros and reader macros ;-)
To some extent there is a continuing popular schism around the idea that "strongly typed" must be synonymous with the lack of "eval", which, at least in principle, cannot be true. In fact, Standard ML does have the ability to extend the environment of a program at run time (as does Typed Racket). All the necessary typing and metadata is retained at run time so that newly evaluated code is checked for type consistency just as the "main code" was when it was compiled. So, not only can one expand macros at compile time and verify that the expansions are sound, one can also do the same at run time.
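Scala itself can illustrate the point: the compiler's ToolBox API (an assumption here is that the scala-compiler artifact is on the classpath, since the toolbox lives there rather than in the standard library) evaluates strings of code at run time with full type checking, so ill-typed input is rejected before anything executes. A minimal sketch:

```scala
// Typed "eval" on the JVM: the ToolBox parses and typechecks code
// at run time with the same rules the batch compiler applies.
import scala.reflect.runtime.currentMirror
import scala.tools.reflect.ToolBox
import scala.util.Try

val tb = currentMirror.mkToolBox()

// Well-typed code evaluates normally.
val result = tb.eval(tb.parse("List(1, 2, 3).map(x => x * 2).sum")).asInstanceOf[Int]

// Ill-typed code fails at typechecking time, before it can run:
// eval throws a ToolBoxError, captured here in a Try.
val illTyped = Try(tb.eval(tb.parse(""""abc".noSuchMethod(1)""")))
```

The type error surfaces as a compile error inside eval, not as a runtime crash in the evaluated code - which is precisely the "strongly typed eval" the comment argues is possible.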
Macros for the win.
It is my understanding that a good macro system in a statically typed language is still an open research problem. I remember hearing that the Racket research group was still working on it for their Typed Racket system, and that it was, at least as of a year ago, not possible to directly port macros from the standard language over to the typed language.
I also remember hearing that Template Haskell is a relatively primitive macro system that very few Haskell developers currently use, but I'm not an expert in this area.
Scala is already cutting edge in a lot of ways. Expecting them to innovate in such a difficult and untrodden area on top of what they've already done is asking a little much, especially considering that it is meant to be a practical and not a research language.
In no way am I trivializing the complexity of what would be involved in bringing a useful macro system to Scala. Compiler plugins are filling this role today, but a good macro system could make metaprogramming more accessible. There are already SBT plugins that invoke FreeMarker on Scala-program templates to generate code - clearly a poor man's solution, but it's more evidence that a macro system would be put to good use were one available.
Taking Racket as an example, a good macro system would enable many kinds of experimentation in the Scala language without having to extend the core directly. In Typed Racket, a static type system was added on top of a dynamically typed core, and Typed Racket targets Haskell-level (or better) typing infrastructure; all of this was enabled essentially by macro support.
RE: Cutting edge - and Haskell isn't? While there are a number of language features where Scala may in fact outshine Haskell, I would hardly place Haskell in the old and well-trodden part of the programming language landscape. It is still evolving rapidly, and without the need to conform to any particular run-time system (e.g. the JVM), the researchers working on Haskell can practically make the language warp space-time.
In Racket, we typically use s-expressions rather than XML literals.
Despite limitations and warts, it's a pretty nice language to code in on a daily basis if you're going to code on the JVM. Macros would definitely enhance Scala's productivity. If you don't like them, don't use them - that certainly doesn't mean others couldn't put such a tool to good use.
I came across Sun's (now Oracle's?) open-source (GPL) fork of the JVM that allows for changing class internals on the fly. The following analogy comes to mind: DCE VM : JVM :: DLR : CLR.
> The class data structures of the Java HotSpot virtual machine are immutable during the execution of a program. While new classes can be loaded, it is not possible to change existing classes (e.g. add/remove methods or fields). This project tries to relax this condition to allow arbitrary changes to existing Java classes (including changes to the class hierarchy) while running a Java program.
For example DynamicMethod, which represents a method reference that can be created at runtime and garbage-collected when no longer in use, similar to the lightweight method handles in JSR 292.
That said, modifying existing classes on top of the CLR is not possible unless the class inherits from something like ExpandoObject, but you can pull the same trick in Java (though not as easily until JSR 292 comes out).
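For concreteness, a minimal sketch of the JSR 292 machinery (standardized as java.lang.invoke in Java 7), called here from Scala - method handles are the lightweight, garbage-collectible method references the comments above allude to:

```scala
import java.lang.invoke.{MethodHandles, MethodType}

// Look up a handle to String.length at run time. Unlike core reflection,
// access checking happens once at lookup time, and the resulting handle
// is a lightweight object the JIT can optimize through.
val lookup = MethodHandles.lookup()
val lengthHandle = lookup.findVirtual(
  classOf[String], "length", MethodType.methodType(classOf[Int]))

// invokeWithArguments boxes arguments and the return value, sidestepping
// the signature-polymorphic invokeExact, which is awkward to call from Scala.
val n = lengthHandle.invokeWithArguments("hello").asInstanceOf[Int]
```

The handle can be stored, passed around, and dropped like any object, at which point it becomes garbage-collectible - in contrast to classes themselves, which (per the DCE VM quote above) HotSpot treats as immutable once loaded.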
The fact that Scala uses the JVM, with a hacky workaround for type erasure, is suboptimal.
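The workaround in question is presumably the Manifest/ClassTag mechanism: the JVM erases type parameters at run time, so Scala smuggles the class information back in as an implicit argument. A minimal sketch:

```scala
import scala.reflect.ClassTag

// Without the ClassTag context bound, `case t: T` could not really be
// checked at run time - erasure would reduce it to `case t: AnyRef` and
// the compiler would warn that the match is unchecked. The implicit
// ClassTag[T] carries the runtime class so the pattern match works.
def firstOfType[T: ClassTag](xs: Seq[Any]): Option[T] =
  xs.collectFirst { case t: T => t }

val found = firstOfType[String](Seq(1, 2.0, "three", "four"))
```

It works, but it's opt-in and easy to forget, which is why it reads as a hack layered over the JVM rather than real reified generics.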
Scala's interop with Java libraries has been a huge, huge boon, and undoubtedly drove take-up where there otherwise would have been none. Scala has come this far in a decade; Python is 20 years old, Ruby is 16. Scala came a long way in a short time, and would probably have been resigned to the dustbin of "that would have been cool" languages without that interop.
They also did themselves no favours by having the Scala compiler dump out MSIL assembler code rather than a .NET assembly. You had to separately run the ilasm tool to get an executable. Obviously for a production environment you'd package this up in a build file, but for experimenting with the language it was a bit of a speed bump.
This is without even looking at the language features and libraries that were supported on the JVM but not on the CLR (e.g. structural types, parser combinators).
My experience may be out of date here, but the scala-lang page on .NET support is still datestamped July 2008, so I'm not too optimistic. A great pity, as Scala always seemed a great fit for the CLR and would provide some nice competition to C# and F#.
I'm not sure what relation his work has to the core Scala project or team, though.
Artima appears to have gone all-scala-all-the-time at some point in 2010. Weird.
All that painful refactoring and modularization that went into Rails 3 would have been a lot easier and, I suspect, less necessary if Rails had been written in Scala.
I just hope Scala gets enough traction that I can find real work in the language. After using it for a while, languages like Ruby and Python just feel primitive.
The reality is that complex features will be used and abused.
It's the first time I've seen "not lisp" as a checkbox feature. :)