Complexity is the primary issue with the Scala language... when the whole goal is a scalable language, which is in itself diametrically opposed to simplicity, your language is dead on arrival.
After using Scala, Go seemed like a dream come true.
We really loved the integration of OO and FP, and honestly we miss the FP awesomeness... but simplicity trumps doing either of those things well.
A few production incidents are enough to change this mentality. I'd rather write more verbose but "obvious" code than "clever one-liners". I feel sorry for anyone who has to understand someone's DSL on the fly during an outage, while the business is burning through $X per minute of downtime.
(On the other hand, you probably want to structure your production changes in a way that makes them easy to roll back without understanding everything, but that's another conversation. Either way, the thought that someone may want to chase me after work because my code broke and they can't understand it is enough for me to stop being clever.)
The impression I always had was that Scala was a research project into how various programming language features could live together and interact, hence their everything and the kitchen sink approach. From an academic point of view I think Scala was a huge success.
That people chose to use Scala in production speaks more to how badly people hate Java and want an alternative, not to Scala's failings.
Golang is the opposite: it takes ideas from the 1980s with some early 1990s (Pascal, Modula-2, Oberon; CSP; garbage collection) and executes them to near perfection. It even tolerates some obvious ergonomic faults (if err != nil { return err } everywhere) to keep things dead simple and reasonably performant. It's much harder to write incomprehensible code in Golang, even if you try, so a lack of mastery and restraint has much less catastrophic consequences than in a highly expressive language like Scala (or e.g. C++).
This one brilliant feature allows for easy DSLs without macros, parsing, or otherwise introducing new concepts to the language.
How so? Do you mean Java in general, or just the pattern matching part? Can you provide an example where Java is "better designed"?
This is how C++ has been successful at Google, and it's how I'd approach something like Scala if I were to go back to doing it now.
I'm going to say maybe too fast. Not that the changes aren't good (they are good), but rather that the pace of features from 8 to 11 and then 11 to 17 has been fast, and we're still trying to get old toolchains updated to 8 (yes, it pains me), much less using the features that are present in 11.
In the organization that I work in, the question of "ok, we've got old things on the computers, do we update to the latest Java now?" was postponed until September so that we can say "Java 17 is the standard" and have that be our long-term platform for a while.
But there are a lot of people who are still writing code that would compile in 7.
Looking at the version history, 12 years ago would have been Java 6... there were a lot of good changes with collections in Java 7. https://www.oracle.com/java/technologies/javase/jdk7-relnote...
Try-with-resources and multi-catch were the two big things that made 7 much better than 6 for me. And with 8 - https://www.oracle.com/java/technologies/javase/8-whats-new.... - while:
* Classes in the new java.util.stream package provide a Stream API to support functional-style operations on streams of elements.
is only a single bullet point, that's a big thing.
If you are in the Databricks Spark ecosystem -- stick to using the Databricks style and the Spark libraries. Don't bring in anything that isn't in that ecosystem, because it runs way behind standard Scala -- it's on a six-year-old version at this point.
If you don't want Java to eat the features you use, use either the Typelevel or the ZIO ecosystem of libraries, with their respective preferred built-in DI methods (tagless final and ZIO environments, respectively).
If you don't mind eventually having Java eat your features, go ahead and use the Lightbend ecosystem and buy the support license. If you don't want to buy support, the Twitter/Finagle ecosystem is basically the same without a support option. But within those there's definitely room for creative license.
Would I rather everything move to Mill? Yes. Is that ever going to happen, nope.
"I love C++." —Alice
"Oh? Which C++?" —Bob
Every successful C++ team defines its own C++ subset.
We had PhDs and very senior engineers, and they thought debugging common libraries like Slick would take them weeks of work, so they opted to avoid that entire type of Scala. But that is hard to do, because it's a lot more than a coding style.
That said, I don't want to be too down on Scala. Overall, I like the language. Scala's real weak point is its culture. There's a decent risk of cognitive dissonance when you try to wrap an ethos of design purity around a language that's always been a bit of a communal experimentation project.
What is helpful in situations like this is the motto "as simple as possible, but not simpler". During a code review, if you see something you think is too complex, ask the author: "Could this be made even simpler? If not, how exactly did a simpler approach fail?" It sometimes helps find an overlooked simplification.
I think that's missing the point.
The criterion should be: is the code clear enough to be easily testable, easy to debug, and easy to evolve?
I will take a wrong abstraction with the qualities listed above any day of the week.
The problem with wrong abstractions, though, is that they do not work, especially when you make the next step on the roadmap.
I would say that clear, testable, evolvable code with little abstraction may be fine. OTOH boilerplate and copy-paste prevent easy or well-controlled evolution. The abstractions end up inlined and fused into the code instead of being made visible, making it easy to miss a case when a concerted change is needed.
And, let's ask Rich Hickey. What language did he use to experiment with a new s-expression language? Why wasn't RH scared away? You want the brutally honest answer? Because he is smart and can grok the complexity of Java in toto.
What sort of experiments can you not do on a virtual machine based language, with open byte code/vm spec, open class loaders, and compile and runtime instrumentation and meta- capabilities?
I used to do the switch the superclass at loadtime to experiment with adaptable programs. This was circa late 90s. Loads of fun (npi).
What scares many away from Java is that it is now a very huge mental object. But not wanting to admit this, they simply pass along FUD.
This probably falls in undefined behavior territory. What kind of errors did you get when something went wrong?
Not saying it's not interesting to experiment with things like that, but there are non-trivial risks associated, if you ask me, with depending on it in production.
Anyway, this was before ASM and other bytecode engineering libraries. Today I would not directly hack the class def myself.
In my experience it's clearly considerably simpler than Rust or C++ to become productive in.
Once an object is instantiated, don't change its internal state.
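A minimal Scala sketch of that discipline (names here are illustrative): model state with an immutable case class and return updated copies instead of mutating in place.

```scala
// Immutable account: state never changes after construction.
// `deposit` returns a new instance via `copy` instead of mutating.
final case class Account(owner: String, balance: BigDecimal) {
  def deposit(amount: BigDecimal): Account =
    copy(balance = balance + amount)
}

object ImmutableDemo {
  val original = Account("alice", BigDecimal(100))
  val updated  = original.deposit(BigDecimal(50))
  // `original` is untouched; `updated` carries the new state.
}
```

The old value stays valid for anyone still holding a reference to it, which is most of what "don't change internal state" buys you.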
Just embrace Clojure then. You get all that enforced for you, plus the entire Java eco-system.
Clojure does look really nice and Datomic looks pretty slick.
Have you used Clojerl?
Clojure's ecosystem is even better, it improves on the Java one a lot, simplifying most of the warts with the Java one, and like you've brought up, it goes beyond the JVM in having quite a lot of active dialects.
You can also use ClojureCLR if you like the .Net ecosystem better.
And finally there's Babashka which deserves a mention, if you want something more like Python.
You also get to play with Clojure derivatives as well: Janet, Fennel, and Ferret can all be picked up in a day if you already know Clojure.
For me Clojure was a great investment, and it's replaced all my needs in all areas: scripting, front-end, application development (mobile and desktop), command line tools, server side, big data, batch processing, ML, data visualization, etc. Only downside is you will exist within a niche, but that niche has everything I need.
And no, core.typed and other half-hearted attempts at gradual typing from Clojure don't even come close to hitting that mark.
Higher-level organizational patterns in pure FP style aren't familiar enough to replace the ergonomics of just using a few classes and DI to make things modular. Maybe tagless final or something similar will eclipse it eventually, but for now it seems like using "anemic style" OOP, with functional in the small is still the best we have for mainstream development.
Data types are object-oriented. They are responsible for ensuring that their internal state is consistent, and nothing more. They may inherit if it makes sense, but it usually doesn't and most of the time aggregation is more appropriate.
Business rules are functional. There's just a bag of composable, pure functions that take the various data types and perform validations, transformations, etc.
External services, such as a database, are contractual. This would be an interface in Java or C# defining the required operations of the service. This allows manipulation of these external services during testing. Unit testing would mock them; integration testing would not.
Workflows or services are procedural. They combine everything else into an actual use-case. They read in a linear flow of what needs to happen when to meet the use-case.
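The four layers above can be compressed into a Scala sketch; all the names (Order, OrderRules, OrderRepo, PlaceOrder) are hypothetical illustrations, not a prescribed API.

```scala
// Data type: object-oriented, guards its own invariants, nothing more.
final case class Order(id: String, quantity: Int) {
  require(quantity > 0, "quantity must be positive")
}

// Business rules: a bag of composable, pure functions over the data types.
object OrderRules {
  def validate(o: Order): Either[String, Order] =
    if (o.quantity <= 100) Right(o) else Left("quantity too large")
}

// External service: contractual, expressed as a trait (interface).
// Unit tests mock it; integration tests back it with a real database.
trait OrderRepo {
  def save(o: Order): Unit
}

// Workflow: procedural glue that reads as a linear use-case.
class PlaceOrder(repo: OrderRepo) {
  def run(o: Order): Either[String, Order] =
    OrderRules.validate(o).map { valid => repo.save(valid); valid }
}
```

Each layer stays testable on its own: the rules need no mocks at all, and the workflow needs only a stub OrderRepo.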
Agreed. I like the Scala language, but I can't imagine a scenario where I would recommend it for a project over Java.
That's the genius (?) of Go. They gave us what we didn't know we needed.
Software people love the "manliness" that comes with "serious", "real" programming languages. Maybe it's the desire to impress our peers with our intelligence? But in the end the only thing that matters is whether you can get your shit done and on time in at least a semi-working state.
As far as it being driven by machismo, I think that's part of it, but I'd be more inclined to blame it on the following:
* complex languages/mental frameworks function as an "intellectual trap" where people more inclined to be interested in intellectual things for their own sake start to lose sight of the forest for the trees.
* FOMO that one must keep up to speed with all the coolest new developments in programming or face mounting irrelevance
* Elitism/self promotion where specializing in something difficult (and potentially keeping newcomers out) either provides internal or external motivation
I think some of these traits seem somewhat correlated with the "nerd machismo" archetype but I think the actual causes are multifaceted.
Sure, Scala is complex but I find that there are a few misunderstandings here.
Java, for instance, _requires_ a number of supporting tools to be useful these days. You need a container, quite extensive testing, and quite advanced build tools. Sure, these have all been around long enough that we don't think about them when we compare Java to other things, but they're implicitly also Java. I can't speak for Go because I've not looked at it.
But Scala has strengths that mean you don't need _external_ tools and libraries for those things. That strength is types, which means the compiler is going to "pick a fight" with you at a much higher frequency. That is a pain. The reason we're here talking about that pain with Scala is that you don't think about it in your Java project anymore, because you use heuristics and heads in the sand instead.
Another thing is that I think that Scala gives you a big fat revolver to shoot yourself in the foot with. Java has a bb-gun so no matter where you aim it, you're not going to accomplish a lot.
Over to F#:
I'm going to claim that the .NET toolchain is superior to the JVM ones. The CLI tools and Paket run circles around the POS that is SBT.
OO and FP are paradigms. You can use any of them, or both of them, in any language.
I believe Scala had the goal of offering a high level of usability for both FP and OO. It failed despite an admirable effort, because of an incredible amount of language complexity.
While most of the article rings true, I wanted to share some counter-arguments:
1) A lot of the problems mentioned are only really noticeable if you use Spark, which really lags behind in Scala versions compared to the rest of the ecosystem. As someone who doesn't use Spark, I don't really feel much of the "Scala minor versions" pain.
2) While not great, cross-compilation/publishing is not that bad (there are plugins like `sbt-release` and `sbt-crossproject` that pretty much take care of it).
3) I mostly disagree with the "Difficult to publish to Maven" point. I'm not saying that it's easy, but I enjoy the fact that it's not as trivial as in other languages. I would argue that this whole process makes namespace squatting almost impossible, which is a plus for me.
> You need to open a JIRA ticket to get a namespace, create GPG keys, register keys in a keyserver, and add SBT plugins just to get a manual publishing process working. It’s a lot more work than publishing to PyPI or RubyGems.
It's a little annoying to have to go through that, but you only have to do it once per domain, and the turnaround from the people who manage the Sonatype JIRA is usually pretty quick. In return the ecosystem gets a lot of protection from the kind of exploits you mentioned.
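For reference, once that one-time setup is done, the sbt side is roughly the following; the common route is the sbt-pgp and sbt-sonatype plugins, and the exact settings here are illustrative, not the only valid configuration.

```scala
// build.sbt (sketch): typical settings for publishing to Maven Central.
// Assumes sbt-pgp and sbt-sonatype are enabled in project/plugins.sbt.
organization := "com.example"  // the namespace claimed via the Sonatype ticket
homepage     := Some(url("https://github.com/example/project"))
licenses     := List("Apache-2.0" -> url("https://www.apache.org/licenses/LICENSE-2.0"))

// Stage artifacts locally, then release them to Central as one signed bundle.
publishTo := sonatypePublishToBundle.value
```

After that, a release is `sbt publishSigned` followed by `sbt sonatypeBundleRelease` (or the plugin's equivalent commands for your version).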
1. Open source libraries are sometimes not maintained which is absolutely true but also just a general hazard of the open source software ecosystem. The solution in most cases is to just not use open source libraries that are somebody's hobby project (which the author very sensibly recommends). But that is true whether you are using Scala or not.
2. Spark is weird. Which is true, but that seems to have more to do with the idiosyncratic nature of Spark than with Scala. That is, Spark development is sort of its own category and poses certain problems that are pretty specific to Spark, which you don't necessarily run into when you are building a non-Spark project in Scala.
1. Yea, but languages that are backwards compatible have libs that are useful for way longer. I can still use Java JAR files that were built with Java 8, and published many years ago. The open source libs get stale so much quicker in Scala.
2. I'd actually argue that Spark is less weird than some other dependencies. If you look at the cats README (https://github.com/typelevel/cats) there is this caveat: Cats relies on improved type inference via the fix for SI-2712, which is not enabled by default. For Scala 2.11.9+ or 2.12 you should add the following to your build.sbt: scalacOptions += "-Ypartial-unification"
More cats specific stuff here: https://github.com/sbt/sbt/releases/tag/v1.5.0-RC2
I use Spark a lot so used that for the examples, but think other libs cause even weirder maintenance challenges.
On the other side, though, strict backwards compat creates its own set of costs (ones that are often harder to quantify). Java backwards compatibility is great, but it seems to be at least part of the reason Java moves at such a glacial pace.
This is the tragedy of Scala. The maintenance cost is bearable. In return you get a language that dares to shed legacy cruft and fix bugs. Trash fire code bases, on the other hand, are hard to deal with.
We came up with our own linting guide, and guidelines for writing extensible and composable code.
I've never coded in Scala, but I've shipped production code in languages that support multiple paradigms.
Conventions are important. Things need to be agreed upon and done in a certain way, or your code base will become a tangled mess.
When helping start my current startup, one of the founders asked why I chose C# as he was asked by others why we'd pick that language/stack as it's not common for SF startups. I told him that it's less about the language and I could do just about anything with it. It's turned out great and of all the things to worry about, the language hasn't ever come up again.
The tooling is good, and the lang has never gotten in the way of getting work done.
Personally, I'll take an expressive, GC'ed language, but write "boring"-ish code when I'm on a short time-frame and I have few developers.
EDIT. Just realised this sounds needlessly antagonistic; the point I'm trying to make is that "boring" is too coarse a measurement. Java 1.8 is more "boring" with its required explicit type declarations, but I would /always/ pick a newer Java version to avoid the boilerplate.
Pepsi is more potent (sugar) than an apple, so it's easier to get fat on. Still the consumer's fault.
My point is the choice of language may require more energy to enforce better code practices. Scala is so feature packed that it makes it easy for a developer to... overdose.
On the other hand, I've seen my fair share of too-clever uses of runtime reflection (Java) or weird metaprogramming (Python/JS) techniques, except they were not only too-clever but actually just didn't work and would fail miserably at inopportune times.
When I started software development I had "hammer and nail syndrome" for absolutely everything. I figured out something new and then used it everywhere. Experimenting in this way is necessary but you absolutely want to avoid it in a production codebase. Over time you have run out of hammers and nails to play around with. You've developed the ability to judge where and when a specific hammer is most effective and when it shouldn't be used at all. Scala offers you a wide variety of hammers and nails. You'll have to experiment a lot.
Somehow Scala has a lot of the issues that I feared would have impacted TypeScript by now (yet haven't), so I'd be interested in comparing and contrasting why TypeScript isn't perceived as having the kinds of issues Scala does, when on paper they sound similar in features and technical challenges.
Having worked on a lot of legacy codebases both creating and inheriting I'm quite interested in practices that balance rapidly prototyping and deploying systems and gracefully degrading when projects lose people and funding. It seems tragic to see so much of the work of our lives thrown away so fast.
So I'm here scratching my head at the issues denoted in the OP, wondering if I missed something critical about what makes Scala so different in its ecosystem conventions that the community couldn't fix it over the years. Even Node.js managed to work something out via Yarn.
Java, Kotlin, C# and TypeScript are led by companies that want usage. Scala was and still is primarily an academic exercise. Same problems as with Haskell - the leadership are paid to add random optimised-for-sounding-clever ideas to the language, not design a language in the gestalt that optimises for user success.
In particular TypeScript is not a research language and therefore the documentation, marketing, leadership etc doesn't strongly emphasise convoluted FP type theory. That means the sort of people who really want to go wild with that stuff stay away.
I'm not an either/or kind of person and don't see why a language suitable for academic purposes can't be broadly useful in industry, but I would also agree that leadership that has little background in certain use cases may not design and orient the language accordingly.
I'm personally in the camp of "keep Scala huge and weird and multi-paradigm, but just make less breaking changes and focus all efforts on making the ecosystem more stable".
Also worth noting that Scala is not a "very big" language in any sense that I know of. It has a lot fewer features than C#, for instance. And probably a comparable amount to TypeScript.
One thing I find encouraging is that Scala features may become deprecated (either explicitly or de facto) if an overlapping feature is added that works 'better'. Examples I've come across are Manifest getting replaced by TypeTag and implicit coercions getting replaced by implicit classes.
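The second example is easy to show concretely: an implicit class provides extension-method syntax without defining a standalone implicit conversion (the older, now-discouraged style). A small illustration:

```scala
object Extensions {
  // Implicit class: adds `squared` to Int at the call site,
  // replacing the older `implicit def` coercion style.
  implicit class RichInt(val n: Int) extends AnyVal {
    def squared: Int = n * n
  }
}

object Demo {
  import Extensions._
  val x = 3.squared // resolves through the implicit class in scope
}
```

The upside over raw implicit conversions is that the wrapper only exists to attach methods; it can't silently coerce an Int anywhere an unrelated type is expected.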
It looked for me like someone wanted to add every existing language into one single language.
When I read the article I was shocked that 2.11/2.12/2.13 could be breaking changes that prevent existing libraries from working, and so badly that they take years to catch up.
Certain languages prioritize backwards compatibility and maintainability over new features / complicated programming stuff. The maintenance burden is dramatically different for different languages in my experience.
Scala's problem is that no unified paradigm is proposed or accepted at the community level. "Dumb" languages like Java/Go limit the style choices a programmer can take within the language, but Scala seems to encourage the opposite.
This has not been my experience in the Machine Learning space.
I'm surprised Scala couldn't solve it better than Python -- I mean, Scala's a compiled language, so it should have more wiggle room...
Scala developers aren’t willing to freeze that mapping, partly because they found out better ways to do such mappings, and partly because they keep changing the language, changing what was the best way to do that mapping.
I think it’s easier for interpreted languages to keep their internals compatible. They are willing to give up some speed for convenience, so even if they think “I wish we had done that differently”, the pressure to change it isn’t that high.
They also keep more metadata around. In some cases, that enables them to discover “this is using the old way to do Foo”, and fix that up to use the new way.
More importantly, Kotlin is designed to be as straightforwardly compatible with Java at the bytecode level as possible. Scala isn't.
I'm not sure I agree with the OP's blog post.
This line specifically stuck out to me.
"Scala should be avoided for easier problems that don’t require advance programming language features."
What are advanced programming features? Isn't this just a matter of where you are coming to Scala from?
The problem is not the language, but the team & overall approach to development (rushing to deploy bad code because your manager said so; lack of code review).
Tooling can be better, but it has massively improved over the last 2 years.
And I agree it is not the language, it is the stewardship. Who they hire, how they manage their tech, culture, pairing/mob, quality over speed etc. And the reality that all codebases rot, at different speeds depending on these factors, but eventually they all rot.
I have seen well maintained large Scala architectures, at very large organisations because of great stewardship of the projects and people.
And I have seen car-crash bundles of apps in Scala, because they hired rockstar developers who incorporated every shiny feature continuously, leaving the code unmaintainable by the rest. Or recent Java converts who were not mentored well enough, so they were still writing Java code in Scala.
I love Scala, but with bad HR processes and weak leadership it will grind to a halt. And they will blame the language, not themselves. But the same would probably have happened with any language.
It's got everything to do with discipline and time spent on creating and keeping a codebase simple and well factored. If you don't have a team doing that, you're going to have problems.
Scala is a sharp tool, in the wrong hands you're gonna slice your toes off. In the right hands, you can create simple, elegant solutions to challenging problems.
TL;DR: anyone doing it wrong anywhere breaks you, and it’s only a win if you feel a need to rip pieces out of your JVM.
No more than Java, but also why most Scala projects publish to Bintray, as it is much smoother. Oh wait...
This is the complete opposite of my experience.
I love Clojure because it enables me to build large / non-trivial applications after learning a minimal set of rules.
Scala, on the other hand, appears to have a relatively large number of syntax rules and special cases.
I like Scala. I haven't used it for anything big, though. And it's hard to argue against it having an interesting approach to discipline.
Lisp, though, I like to use. I'm not as impressed with myself for getting something working in it. I am impressed that the code in all of my books still works.
For me, lisp is bigger in the meta programming. Really pulls the covers back on how things relate. Not just in how to put the code together, but how to look at the data.
Scala was my first big intro to category theory. Helped me see relations in some higher ideas. Many, though, are hard to see a priori.
That said, a lot of the things folks reach for metaprogramming to do should, I think, get a pass. The goal shouldn't be to make the code read like a text. The goal should be to heavily leverage core data structures.
To be honest, the same people I used to see do heavy Reflection are the same that did heavy meta programming. Usually to the same end.
I do not have a hypothesis on why.
On the other hand, yes - Scala is in many ways a research language. It's got a number of novel features, starting from the blend of FP and OO, through implicits, ending with metaprogramming. The dark side of introducing features that other languages don't have, is that they sometimes need polishing or complete removal. That impacts the compiler (which evolves as well - compiling Scala code is much more complex than the rather straightforward translation of Java to bytecode) and high-level language features alike.
The bright side is that software engineering is very far from being a "done field" (or at least I refuse to believe it is such). There's a lot to be discovered as to how we can write code that is readable, performant and - yes - maintainable.
"Wait, but if you said Scala is a research language, is it safe for business usage?" Yes - while Scala has a research side to it, it also is a language used successfully in many companies. It has a lot of libraries that are maintained, the language authors pay a lot of attention to binary compatibility (with well-defined limits - you know what to expect!).
Businesses have benefited from using the newest available technology in many fields, and I doubt software engineering should be different. Of course, you shouldn't be reckless - but using technology from 20 years ago "just to be safe" isn't always a recipe for success.
Yes, you can build great systems with Java and Go. Same is true for Scala. And yes, you can build totally unmaintainable systems in Scala, but in Java and Go as well.
Finally, a side note about OSS. I don't think it's fair to expect anybody to maintain any kind of library in any language for free. You like the library, it solved your problem - great, you can use it for free. But if you want it maintained above what the library author donates, either through their time or through corporate sponsorship - well, you can't have any expectations here. I'm sure a lot of these migrations problems could be solved, given a reasonable business offer.
True! But why didn't Spark & Flink, which are backed by businesses, bother to catch up with changes in Scala? They delayed their upgrade to 2.12, not to mention 2.13! And as far as I know, 3.0 hasn't been discussed yet! Am I right?
In general I'm happy and thankful that I learned Scala! It shaped my perspective to combine programming paradigms and pick the best practices from each one rather than sticking to a single paradigm. But for work that I want to last long enough, I can't accept it, as there is no realm of calmness in view. The software industry is young and complex, and like in any other profession, a developer needs to think about lots of other things as well, not just the language and the possible patterns/syntaxes it has to keep up with!
I think the value that Scala brings to industry is exactly what prevents it from becoming a rival to the others.
Looks like you’ve chosen the wrong profession
This seems like a big turn-off. Question to any Scala devs out there: does this get in the way? And does it make upgrading Scala versions difficult?
In practice, though, it is not a major issue. It is easy for library authors to cross-compile for different versions, and SBT handles fetching the correct binary dependency for your project. And in general, binary incompatibility doesn't mean source incompatibility. So generally, upgrading your project's Scala version is as simple as changing the scalaVersion in your build.sbt. The exceptions are:
1. If you have a dependency that hasn't been cross-compiled for the new Scala version yet. So if you are trying to upgrade as soon as a new Scala minor version comes out then you may have issues if you have a lot of dependencies.
2. If you are trying to publish a library compiled for multiple older Scala versions then you can run into source compatibility issues. For instance, if you are using a method/interface that is new as of Scala 2.13 you won't be able to cross-compile to Scala 2.12.
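For context, the cross-compiling mechanics mentioned above amount to a couple of lines of sbt (the version numbers here are illustrative):

```scala
// build.sbt (sketch): cross-building one library for several Scala versions.
scalaVersion       := "2.13.8"                 // default version for development
crossScalaVersions := Seq("2.12.15", "2.13.8") // versions to build and publish for

// Then, from the sbt shell, prefix tasks with `+` to run them for every
// version in crossScalaVersions:
//   +compile
//   +publishLocal
```

Version-specific source files can additionally live under directories like src/main/scala-2.12 and src/main/scala-2.13 when the code itself has to differ.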
Some projects are easier to upgrade, but for complex projects, minor Scala version bumps can be quite challenging.
As long as the platform is relevant everything moves along without extra layers of tooling and idiomatic wrapper libraries for platform APIs.
Might not have all the bells and whistles, not so shiny, but it works and I don't need to care about my replacement having headaches.
* Stable, Well understood language.
* Rich ecosystem.
* Fast enough.
* Plenty of developers to hire from.
* Scales well to large projects.
Working from the bottom: Scaling well to large projects is only a concern if you expect your individual projects to become large. I've worked on monoliths, and I've worked on systems that follow more of a Unix philosophy. In the former case, we had one project with a 7 digit line count. In the latter case, it was rare for a single application's codebase to grow to more than a few thousand lines of code.
"Plenty of developers to hire from" is often optimizing for the wrong problem. In many lines of business, domain expertise is a much more valuable skill, because it can only be developed through years of first-hand experience. A new programming language, on the other hand, can typically be taught to a reasonably skilled programmer in no more than a couple weeks.
Also, when you're just starting out, the single best programming language you can pick is the one that the people actually starting the business already know the best. They will not have senior mentors to help them learn the best way to use a new language.
Finally, "fast enough" is not a single target or a strict hierarchy. I used to work at a shop where we used Python, C#, and C++. Only C++ was fast enough, performance-wise, for some problems. Only Python was fast enough, productivity-wise, for other problems. C# covered the (large) middle ground.
It's also not a strict hierarchy. At my current job, I'm in the middle of a project to replace some Java components with Python. Java has not proven to be fast enough, performance-wise, for this particular problem, while Python gets us there with room to spare.
This is a gross programming fallacy.
Yes, you'll learn the language at the superficial level. But learning the ecosystem, best practices, that takes years.
Unless your project is short-lived or something trivial, in which case you won't care. But if it's going to live for a long time and have a major impact, the decision to hire programmers with zero experience in your target language will bite you in the butt years later.
The Kotlin ecosystem is the Java ecosystem. It runs as fast as Java, partly because Kotlin is basically (90%) a better syntax for Java. It scaled well as the project got bigger. And of course in the beginning nobody knew Kotlin but it didn't hold back hiring at all - you just take Java devs, tell them to read the language guide and they're 90% of the way to being Kotlin devs. Some code review to show them useful little tidbits in the stdlib and you're done. It wasn't a problem.
10/10 would use for a startup again (and will)
Why haven't you learned anything new?
Frankly, I think Scala is just good at exposing weak programmers and teams with poor discipline. I've seen far worse codebases in python where there's no discipline: all methods are public, monkey patch classes at runtime, etc.
I think articles like this are so common because Scala is easy to pick on. "Implicits are bad" is more interesting than "Implicits have a very specific usefulness", but Implicits are peculiar and unique to scala so the language gets lampooned. "Private methods make code more maintainable" may be just as true, but no one says Python is universally hard to maintain -- so long as the person behind the keyboard knows what they are doing.
Look at the beautiful code examples in a Scala book and you might be seduced, but make a small change and there is nothing beautiful about it.
The only concrete case I can think of where your sentiment rings true is when you try to do some fancy dependent-typing thing for strict type safety and you end up having to write some (VERY) ugly type lambdas. But then again that has mostly gone away with how 2.13+ handles existential types and of course with the Kind Projector compiler plugin.
By the end of whatever work I was doing, there was always the added benefit of the code base looking and feeling much better. Plus it’d take maybe 1/5 as much time to do a huge refactor compared to my experience in say Java.
I really don’t understand why people have such a hard time with Scala, other than that it's very different from imperative ways of doing things.
I have more experience writing in Go, and I think the code for all would look similar-ish. Go is a bit of a grey language, but it gets the job done without a lot of refactoring and restructuring.
I'm happy to pay the complexity tax associated with the borrow checker. You get outstanding performance and you usually don't need to dive deep into lifetimes to get some basic code running.
The language itself is reasonably pleasant but is missing some key features that really annoy me:
- No overloading
- No default parameters
- No named parameters
- Awkward constructor syntax which causes a lot of copy/pasting boiler plate
But overall, I'd say Rust is safely in my #2 spot, with Kotlin a bit ahead at #1.
Wonder if they've run into these issues with the language though
Once you're used to the way it works, and if you're using well-maintained libraries, I find that it's really not such a big deal. Though it's always awkward to have transitive dependency conflicts, and those are best avoided if possible.
Is this claim backed by anything?
Maybe this is a feature, not a bug. I won't call out the language/repo, but I was able to gain admin access for a library my company published with just an email from my company's domain. There are major security issues with "easy." There are also quality issues. You're more likely to publish something if you think it has value; I've published work to Maven Central, but only the high-quality, reusable work.
Their approach is to write secondary smaller implementations of the same thing in other languages and use those to build the larger thing. The seed of their bootstrap is less than 1000 bytes of hand-written machine code (not assembly), and they are working toward eventually reaching a full Linux distro.
I’m never going to use a library written in Scala for the same reason, plus chances are I’d have to spend hours fiddling with SBT to rebuild it at some point.
Scala (with Maven) at the top please, then Java all the way down thanks.
Tbh, I've been programming in scala for 5 or 6 years now and have never worked with "proper" FP libraries like cats or zio. All the companies I've worked with have used the lightbend stack: basically play framework, which is easy to learn for someone coming from java/spring, and, as the most unfamiliar part, Akka streams.
One day one of my java colleagues started an argument saying scala was too complex etc etc. I asked: based on which code? He mentioned scalaz. So what was someone who has never programmed in scala before doing looking at scalaz code? No wonder a novice would find it complex.
The bottom line is that if you are new to scala and/or have no experienced developer nearby but still want to try out scala, just stick with the standard library and use play for your rest services. When you are confident enough with the language, if interested, have a look at the FP libraries.
One of the main slides that stood out discussed something like 20 or 30 different kinds of "types" in Scala. I cannot find it anymore; it may have disappeared with the political "drama" I've heard happened within Scala that included attempts to blackball people.
I've never personally coded in it, I use groovy on the JVM and it satisfies my needs but it has nothing to brag about in terms of avoiding spaghetti code.
It has all the same problems of bootstrapping any language that isn't a "X but distinctly better/easier/more features/fixes major problems in X". A big big lift.
I think Rust is the only language sufficiently different to have any real long-term chance.
JetBrains built Kotlin as a programming language for industrial projects - and it shows.
What makes Java and Kotlin more popular languages than Scala is that both are extremely deliberate in choosing which features to include and, more importantly, which to leave out.
Scala includes everything by default because it allows the EPFL to submit a lot of papers to conferences.
I think it's the opposite. For example null: where Kotlin has a special operator and semantics, Scala simply has the "Option" type in the std lib.
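To sketch that contrast (`User` and `displayName` are made-up names for illustration): where Kotlin would reach for `?.` and `?:`, Scala expresses the same thing with ordinary stdlib calls.

```scala
// Kotlin needs dedicated syntax: user?.name ?: "anon".
// In Scala the same idea is just the stdlib Option type; no special
// operators or semantics in the language itself.
case class User(name: String)

def displayName(user: Option[User]): String =
  user.map(_.name).getOrElse("anon")

println(displayName(Some(User("Ada")))) // Ada
println(displayName(None))              // anon
```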
For example, the most common thing people point to when talking about Scala's complexity is implicits. And that's fair because implicits can be hard to understand and can be used in wildly inappropriate ways. But I have found that what implicits are mostly doing (at least when used correctly) is taking stuff that would be implemented in Java with some crazy runtime reflection scheme and making it a compile-time construct. So then the question is not "are implicits complex?" but "are they MORE complex than the equivalent implementation using reflection?" And on that question I would say absolutely not! Runtime reflection is (to me at least) much more opaque, error prone and difficult to debug.
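As a rough sketch of that trade-off (`JsonWriter` and `toJson` are hypothetical names), this is the kind of job implicits do instead of reflection: the compiler picks the right instance per type, and a missing instance is a compile error rather than a runtime surprise.

```scala
// A tiny typeclass: one instance per type we know how to serialize.
trait JsonWriter[A] {
  def write(a: A): String
}

object JsonWriter {
  // Instances live in the companion object, part of the implicit scope.
  implicit val intWriter: JsonWriter[Int] = (a: Int) => a.toString
  implicit val stringWriter: JsonWriter[String] = (a: String) => "\"" + a + "\""
}

// The implicit parameter is resolved at compile time; there is no
// runtime reflection and nothing left to fail at runtime.
def toJson[A](a: A)(implicit w: JsonWriter[A]): String = w.write(a)

println(toJson(42))   // 42
println(toJson("hi")) // "hi"
// toJson(true) would not compile: no JsonWriter[Boolean] in scope.
```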
Having fewer features does not mean the language is easier to learn, though. Scala has fewer features, but the ones it has work well in combination with the others and are very general.
The fact that Scala's features work well together is a myth that's been debunked over and over. Just look at the number of semantic meanings for "implicit" or the underscore character.
Scala is what happens when a language adds every single feature under the sun with very little care for user productivity.
The implicits you mentioned are a good example. It is one feature and consists of two parts:
1. Implicits at use-site: they define that a not explicitly provided value can be filled in implicitly by the compiler
2. Implicits at definition-site: they mark those values that the compiler is allowed to fill in at use-site when a value is not explicitly provided
That's all there is to it and one without the other would be utterly useless, so it really is _one_ feature.
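A minimal sketch of the two parts together (`Greeting` and `greet` are invented for illustration):

```scala
case class Greeting(text: String)

// One part: a parameter list marked `implicit` says the caller may
// omit the argument and let the compiler fill it in.
def greet(name: String)(implicit g: Greeting): String =
  s"${g.text}, $name"

// The other part: an `implicit val` marks a value the compiler is
// allowed to use when filling in that argument.
implicit val polite: Greeting = Greeting("Hello")

println(greet("world"))                    // Hello, world
println(greet("world")(Greeting("Howdy"))) // an explicit argument still wins
```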
But that one feature can be used for multiple things and I assume that this is what you meant. For example, they can be used for method extension syntax (which is a distinct feature in Kotlin) or derivation (e.g. serialize a structure into json, which in Java is usually done with reflection or annotations) or ad-hoc polymorphism (emulating typeclasses similar to Haskell).
Mind that these use-cases were not foreseen when Scala was created! But because implicits are such a powerful language feature, Scala can now have method extension syntax without any changes in the language (well, besides some minor optional syntactic sugar, but that's it).
Java cannot easily have this syntax and Kotlin only has it because it was added from the beginning.
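For what it's worth, the Scala 2 encoding of extension methods is just an `implicit class` (`RichWord` here is a made-up example); Scala 3 later added a dedicated `extension` keyword as sugar for the same idea.

```scala
object StringOps {
  // The implicit class wraps a String; calling an unknown method on a
  // String makes the compiler insert the wrapper automatically.
  implicit class RichWord(val s: String) {
    def shout: String = s.toUpperCase + "!"
  }
}

import StringOps._
println("hello".shout) // HELLO!
```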
Implicits are a beautiful example of how one well-thought-out, flexible language feature can cover multiple features of other programming languages at once.
The underscore character is a different thing, because that isn't really a language feature, it is syntax. But I agree that it is overused in Scala and is quite confusing in the beginning.
This is just a thought that I had, and I wonder if that's actually a relevant issue in practice.
People took these possibilities and ran with them, creating all sorts of wonderfully illegible DSLs. It was widely recognised as a bad idea.