
There’s no rigorous academic evidence for this, but a lot of companies have been backporting typescript/mypy/sorbet onto existing dynamic codebases and the case studies have been overwhelmingly positive.

Not to be a buzzkill, but migration reports from any tech A to tech B are always overwhelmingly positive when the industry is in a newlywed period with tech B. Save for a clear, undeniable failure, the stakeholders will always claim success.



Number one, static type checking is hardly a "new tech" for which the industry is in a "newlywed period". If anything, it is the middle-aged wife that the industry's crawling back to, as the passion fades from its dynamic mistress dalliance.

Secondly, the obvious counterexample here is MongoDB. Way too hyped-up during its honeymoon, and then almost immediately crapped on by the entire industry (to the point where we've PROBABLY gone too far and are unfair now). When hype waves don't work out, this industry is pretty quick about discarding them.


Also, TODAY's type checking (and, more importantly, how types are used) is not the same as YESTERDAY's type usage.

Java, C++ and similar are terrible benchmarks for it. Back then, types were mostly for taxonomy, with limited help for actually writing CORRECT code, and, more importantly, MODELING the domain was very verbose and brought limited advantages!

Against that, having no static types makes more sense. With or without them the end result was kinda the same, only that without them you remove a lot of noise from the codebase.

Only after a bit of the ML/OCaml/Haskell/etc. type systems got in, NULL removal became truly feasible, and some of the failures of error-prone software, and how WRONG JS, C++, and to some extent Java got it, became more and more evident, did modern static types start to deliver a greater return on investment.

It also comes together with other improvements in tooling (which make type inference in IDEs more enjoyable, for example), composability, iterators/generators, etc., and you get a very nice toolset.
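To sketch what that ML-style modeling looks like in a modern mainstream language, here's a minimal TypeScript example (the payment domain and all names are made up for illustration):

```typescript
// A hypothetical payment domain, modeled as a discriminated union
// (the ML/Haskell-style "sum type" described above). There is no
// null anywhere: an absent refund is simply a state that doesn't exist.
type Payment =
  | { kind: "pending" }
  | { kind: "settled"; settledAt: string }
  | { kind: "refunded"; settledAt: string; refundedAt: string };

function describe(p: Payment): string {
  // The compiler checks that every case is handled; adding a new kind
  // surfaces every unhandled switch at compile time, not as a NULL crash.
  switch (p.kind) {
    case "pending":
      return "awaiting settlement";
    case "settled":
      return `settled at ${p.settledAt}`;
    case "refunded":
      return `refunded at ${p.refundedAt}`;
  }
}
```

The point is the terseness: three lines of type declaration model the whole state machine, which is exactly what was so verbose in older Java/C++ style typing.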

This is where everyone is converging, in some way or another.


You may believe that, but the fact is that we haven't been able to find any evidence to support the fact that "new types" significantly increase correctness and have a greater return on investment, either compared to "old types" or in general, and not for lack of trying. At this point, it's okay to believe it, but I'd be very careful about being so sure about it. Large effects are very easy to detect and are hard to hide. The fact that we haven't been able to easily see the effect in any rigorous way, suggests that even if there is some positive effect, the most reasonable assumption we can make right now is that it is most likely small. Having had experience with both "new types" and "old types," my personal impression (which is certainly not rigorous or necessarily trustworthy) is that such claims about a big bottom-line impact are exaggerated. Sometimes people feel they're writing more correct programs more easily, but the bottom line doesn't quite support it.


> we haven't been able to find any evidence to support the fact that "new types" significantly increase correctness and have a greater return on investment

Rust?

A lot of people say Rust improves the game (I'm one of them). I have coded in 12+ languages in my life. Rust totally removes tons of issues (for me) that used to show up just before shipping, and even after.

And I ported the same project. The kind of issues I get in the Rust codebase are a fraction of what I had.

And I think many studies show it?


I was expecting someone would say that. There are very specific situations where a language has an advantage over an exceptionally "bad" language in the same domain, such as Rust vs. C or TypeScript vs. JS (for which we also have evidence of a ~15% improvement). But that doesn't mean that the very concept of "new types" generally has a big impact. E.g., it's easier to write more correct software in Rust than in C, but probably not than in Java or even (the untyped) Clojure.

Rust is a special case because its main contribution is to use typing rules to solve a harmful problem that's well-known and particular to C (or C++).


Clojure is strongly typed and dynamically typed, not "untyped". Much of its core behavior is built on interfaces like ISeq.

It's not uncommon to use a spec library like clojure.spec or malli, whose benefits overlap those of static typing. I'm not sure if there is a measured improvement from their use, but they have other advantages, like facilitating generative testing, that do help one write more correct software.


The term "untyped" means anything that isn't statically typed (or just "typed"). This is because (static) types and dynamic "types" are two very different objects, and only the former is called "types" in programming language theory and formal languages in general.

I am well aware of clojure.spec, and it, along with many other techniques employed in development, is probably among the reasons why types don't actually seem to have a big relative impact on correctness.


Thanks for the explanation. What are some of the other techniques?


Another way to phrase this is that in old languages, static types were added for the compiler, while in modern languages they are added for the developers (specifically, the teammates of the developer).

This is why hybrid typing is so prevalent nowadays, as you don’t need to satisfy any internal need of the compiler, but you can still keep the documentation of models and interfaces that static typing gives, where desired. Performance is usually also a non-factor nowadays unless you have high demands.

On top of this, the only reason JavaScript grew so large was that there was literally no other way to ship code to a user. IE6 was the universe. Nobody wrote code like this by choice. And those unfortunate souls that did still have scars from it. Remember this was also before CI checking every PR for correctness became as common as it is today.

Because of all this, there will not be any “swing of the pendulum” back to the crazy era of untyped PHP, JavaScript without TS or python without mypy. This is something that is here to stay.


I'd strongly suspect that we'll see a cycle for this over time. Fast Java incremental builds, minimally typed languages like go, and IDEs which actually worked for most statically typed languages ushered in the current trend of favorable static typing views.

I'd bet this lasts for as long as static typing remains fast and comprehensible to the average dev just trying to get something done. I suspect language designers and library builders will add more typing foo until builds are either slow or the code becomes a mess of factories, traits, monads, functors, and other constructs - ushering in a new wave of dynamically typed languages which "just get out of your way".


You are right there is hardly a tech that hasn't existed in some form or shape since 1970s. Doesn't change my point (and nobody said newlyweds have to be on their first marriage).


I think the trend is more of a pendulum swinging back and forth than a single cycle of "static" => "dynamic" => "oh, dynamic was bad, so static". Prolog, Smalltalk, Lisp, etc. are likely older than most of the engineers now very enthusiastic about static typing. Static typing is fashionable right now, I suspect, because it's the first time many engineers are stumbling upon it via Typescript, Sorbet, Haskell (or Haskell-derived projects like Elm). I predict that once the costs are realized (and let's be honest: static typing is not free), there will be a re-emergence of dynamic typing as fashionable again in the next 10 years.


Popular music changes every 5 or so years, because freshmen entering high school want their own identity that's distinct from the seniors who just graduated. This is why they can reject the music wave that came just before, yet also embrace nostalgia for the prior wave that came before that. That's fine, because the previous wave of kids (from which we want a separate identity) weren't into that wave.

This teenage dynamic carries over into adulthood. Academia is a bunch of young adults chasing tenure, by publishing papers arguing that the existing batch of tenured professors are washed up and got it all wrong. And also "rediscovering" the older research that was discarded by the previous generation.

Software development is a bunch of junior devs working on bug tickets, frustrated by the tech debt they inherited, and convinced that tech debt comes from the tech rather than from the organization. It's just that in comparison to academia, technology has the generational lifespan of fruit flies. So we spin through the cycle a lot faster than other walks of life.


Static typing is net-zero cost. We don't need to be "honest": yes, static typing is obviously not free, since you need to think about types. But the real revelation was that the cost of dynamic languages far exceeds the cost of a statically typed language.


You're not wrong, but yet you are.

There were failures with Ada as well, as I recall. Statically typed. Ahead of its time. But in some critical use cases, it failed. It also cost a lot to maintain. Compilers/IDEs were super expensive. Yet there were cases of undefined behavior that you still had to (hopefully) catch in a peer review.

This was the case with the Ariane 5 rocket explosion, caused by software written in statically typed Ada. It's worth a read-through:

https://itsfoss.com/a-floating-point-error-that-caused-a-dam...

Java and the log4j mess are similar. It's a statically typed language, but a poorly reviewed code base. The static typing didn't catch the security hole. And it's cost the US millions of dollars so far to fix it.

Static typing may be nice and all, but it sure as hell ain't a silver bullet.


>Java and the log4j mess are similar. It's a statically typed language, but a poorly reviewed code base. The static typing didn't catch the security hole. And it's cost the US millions of dollars so far to fix it.

This is a ridiculous statement and just really devalues everything you are saying. How is a vulnerability in a library that is completely orthogonal to typing relevant in this discussion? Static typing never claimed to cure programmer stupidity.

Also, when I say static typing I mean languages with good type systems. As much as people like to say TypeScript is bad because it's based on JavaScript, at least null errors are impossible if you are using strict mode, unlike Java and Go where any variable could potentially be null and the compiler won't tell you if you have unhandled null cases.
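A minimal sketch of the strict-mode behavior described above (assuming `strictNullChecks` is enabled in tsconfig; the function is illustrative):

```typescript
// With strictNullChecks on, a `string | undefined` cannot be used as a
// plain `string` until the undefined case is handled.
function greet(name: string | undefined): string {
  // Calling name.toUpperCase() right here would be a compile error,
  // because name may be undefined.
  if (name === undefined) {
    return "hello, stranger";
  }
  // Inside this branch the compiler has narrowed name to `string`,
  // so the method call is safe.
  return `hello, ${name.toUpperCase()}`;
}
```

This is the contrast with Java/Go the comment is pointing at: the unhandled-null path fails at compile time rather than at runtime.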


You said:

> Static typing is net-zero cost. We don't need to be "honest": yes, static typing is obviously not free, since you need to think about types. But the real revelation was that the cost of dynamic languages far exceeds the cost of a statically typed language.

And my point is the statically typed languages we do have didn't save us anything in terms of cost.

I recommend maybe you read the hackernews posting guidelines before you post again.

https://news.ycombinator.com/newsguidelines.html


> Number one, static type checking is hardly a "new tech"

Static typing is of course not new, but migrations usually happen to relatively new languages: Go, Rust, TypeScript. I haven't seen reports recently of any large migrations to Ada, Pascal, or even C++ or other old (20yo+) languages.

> MongoDB. Way too hyped-up during its honeymoon, and then almost immediately crapped

MongoDB is still used in new projects but it is no longer getting a lot of up-votes on HN. It is somewhat similar to PHP - HN doesn't like it but the language is still widely used.


> I haven't seen reports recently of any large migrations to Ada, Pascal, or even C++ or other old (20yo+) languages.

Things are quietly rewritten in Java and C# all the time. I’m sure it happens with C++ as well since the more recent quality of life improvements have landed in that language.


Java and C# are very "corporate languages". Microsoft and Sun/Oracle have done well in their marketing and manipulations. The usual philosophy is that it's easier to find programmers that know those languages, so it should be easier to support and replace programmers as needed.

It creates a kind of "self-fulfilling prophecy". "We can only find Java and C# programmers, so everything needs to be written in or upgraded to Java or C#. Since everything is in Java or C#, we have no choice but to keep using it. Since everything is written in Java or C#, those are the languages I'd better learn." It's very hard for other languages to penetrate that bubble.

Ada, and even more so C++, will hold on thanks to the code previously written in them. But clearly, many programmers will "hedge their bets" by also learning Java or C#.

Pascal/Object Pascal/Delphi, from back in the 1980s, was a "problem" towards which big players (like AT&T, Microsoft, Sun, etc.) arguably pushed a lot of hate and disinformation. There is a good argument that these days it's really more for independents and mavericks, and for the Pascal programmers who managed to get themselves into positions of power and are calling the shots or can influence what gets used.

Delphi/Embarcadero would have made the language extinct through their short-sightedness, charging outrageous prices for its IDE/compiler to the enterprise and developing little to no Object Pascal talent, if not for the open source projects of Free Pascal/Lazarus and PascalABC. Interestingly, with a few changes in the past or future, Object Pascal could still be a contender. It is a viable alternative to Java or C#, but doesn't have the corporate push.


But this was specifically in reference to introducing statically-typed tech into something that didn't have it (the original quote was "companies have been backporting typescript/mypy/sorbet onto existing dynamic codebases"), therefore it's new in the sense that it wasn't in that place before and now it is being introduced there. So there could absolutely be a honeymoon period local to that team/project.


Software developers are goldfish, not tortoises. We retain no long-term memories. There is always - oh, look! - a brand new side of the bowl to swim toward. All change is new tech to us.

Except COBOL. We've never seen it but it's a punchline.

Now regarding number one . . .


After using Python and C++ in production for more than 10 years, I'll reach for Python for trivial things, but for something complex give me C++ over Python! Yes, C++ has awful syntax full of footguns, but at least I can change a large project. At 50k lines of code, Python becomes something you can't change for fear of some one-in-a-million path that will only break after it hits production. Static types mean that C++ won't compile in most of those cases. As a result, Python needs a lot more unit tests, and even 100% code coverage isn't assurance that you covered all the cases that can crash.

Don't take the above as a statement that C++ is a great language. I'm interested in Rust and Ada (just to name a few that keep coming up), which might or might not be better for my needs. I'm a C++ expert though, so I know what the limits are.


Python in a project == tech debt. Same for bash scripts.

For us, Golang turned out to be the optimal balance between (probably imaginary) strictness of C++ and "everything allowed until 2AM PD call" philosophy of Python.

Even for simpler things like scripts and little auxiliary services Python is bad, because these little things tend to grow and get more complex.


I too have C++ experience and would never take "it compiles" as a sign that it'll work. I don't dispute that there are languages which have sufficiently powerful and complex type systems and compilers that enable you to model significant amounts of domain constraints such that you feel you don't "need" tests, because it'll probably work if it compiles (though I'd argue that you really ought to still have tests to verify your modelling of the domain in the type system is correct, i.e. a set of programs that must typecheck and a set of programs that mustn't typecheck) -- C++ is not usually one of those.


True. I have more confidence that my C++ works with 50% code coverage in tests than Python with 100% coverage.

I try to model my domain in C++'s types. It isn't easy, but even avoiding raw int and bool helps a lot to ensure my code doesn't make stupid mistakes. Haskell would give me more power, but I'm moving my C++ in that direction.
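The "avoid raw int" trick translates to any language with a decent type system. As a hedged sketch (TypeScript rather than the commenter's C++; "branded types" are a common way to illustrate it, and all names here are made up):

```typescript
// Branded wrappers make Meters and Seconds incompatible even though
// both are plain numbers underneath, so mixing them up is a compile
// error rather than a silent runtime bug.
type Meters = number & { readonly __brand: "Meters" };
type Seconds = number & { readonly __brand: "Seconds" };

const meters = (n: number): Meters => n as Meters;
const seconds = (n: number): Seconds => n as Seconds;

function speed(distance: Meters, time: Seconds): number {
  return distance / time;
}

// speed(seconds(10), meters(100)) would not compile: arguments swapped.
const v = speed(meters(100), seconds(10)); // 10 m/s
```

The runtime representation is unchanged; only the checker sees the brands, which is the same zero-cost idea as wrapping an int in a single-member C++ struct.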


I think your expertise in C++ might be affecting your view of Python more than expected. I used to feel this way about languages other than my "primary" one, and it turned out I just wasn't as good at the others and blamed the language for it.


I agree with bluGill and I've done most of my career programming backend infra in Python. Give me C++, Rust, Go, Java, C#, TypeScript—as long as it has a type system. Eliminates a whole class of runtime errors and just makes life easier. When I'm dealing with large projects that have lots of moving parts and other developers, strict types are essential.

It's true that one can write unit tests to firm up dynamically-typed code, but doing so is not only a slog, it makes changes more fragile and deployments a lot scarier. If you aren't perfect the first time you write the tests, instead of a build failing you get restart loops, dead threads, and in the worst cases data corruption from improper silent type coercion.


I have enough experience in python to know where the limits are. I've done more C++ for sure, but I have done a fair amount of python. I still reach for python for small projects where I'm not expecting over 10k lines of code.


> C++ has awful syntax full of footguns

HN never stops delivering new funny insults about this language :-)


I mean, the sentence translates to "there's no evidence this is better, but some people are doing it and liking it".

This is not about being objectively better but only about preference. So that's not a "truth".

Sure people might prefer static over dynamic, but until we have hard evidence that one is actually better than the other, this is only preference which is heavily influenced by the current moment we are in.

Having said all that, I've worked with both and lean a lot more towards typed languages now (more than I did previously). But this is not an uncomfortable truth, because it's not objectively true.


Typing is almost a no-loss addition though. If your function takes an integer argument and someone might pass it a string, that is a bug. It is quite hard to argue that a typing mechanism won't improve code quality, and it is very lopsided in the amount of time it takes to type a function vs. debug something that would have been caught by a type system.
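A tiny sketch of the class of bug being described, in TypeScript (the function and values are illustrative):

```typescript
// The classic silent failure: in untyped JS, if `price` arrives as the
// string "100" (say, straight from a form field), "100" + 20 evaluates
// to the string "10020", not the number 120. With the annotations
// below, passing a string is a compile error, caught before the code
// ever runs.
function addFee(price: number, fee: number): number {
  return price + fee;
}

const total = addFee(100, 20); // 120
```

Annotating the signature takes seconds; tracking a "10020" through a production system does not, which is the lopsidedness the comment describes.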

There are edge cases that can be argued all day, but if some large codebase is going to be maintained long term it is difficult for me to see how types could be a handicap. Even if a coder doesn't really engage with the type system "properly", as little as capturing information that was already obvious anyway will reduce the bug count.

There are good reasons to expect typed languages to do well.


> If your function takes an integer argument and someone might pass it a string, that is a bug.

What if you pass it a float and that's converted to an integer?


What you’re describing is optional typing. What is usually derided as cumbersome is mandatory and exhaustive typing.


“Mandatory and exhaustive” typing is also good and usually not particularly difficult if you have a decent type system. With OCaml, for instance, you rarely even need to go out of your way to have 100% type coverage.


What if... we had the "best" parts of PHP or JavaScript development but for any traditionally statically-typed and precompiled language?

(sorry, I can't say that with a straight-face)

It could work though! And I don't just mean like Visual Studio's Edit-and-Continue feature...

Consider a language feature that lets you annotate a local/parameter/field's ("thing") type as "Hindley–Milner-esque" which tells the compiler to invert the thing's typing rules from prescriptive to descriptive (i.e. to mostly stop complaining and make the thing behave more like a JavaScript `object`, so the compiler only complains about 100% provably wrong usage of that thing).
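TypeScript's gradual typing is arguably the nearest existing analogue of this idea: `any` puts the checker in a mostly descriptive mode, while `unknown` keeps it prescriptive. A sketch of the contrast (not the proposed feature, just the closest thing shipping today):

```typescript
function demo(loose: any, strict: unknown): number {
  // `any` behaves like the "descriptive" mode described above: the
  // checker mostly stops complaining and trusts the programmer.
  const a = loose.length ?? 0;

  // `unknown` is the prescriptive pole: it must be narrowed with a
  // runtime check before any property access is allowed.
  const b = typeof strict === "string" ? strict.length : 0;

  return a + b;
}
```

The proposal in the comment would sit somewhere between the two, flagging only provably wrong usage rather than unproven usage.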

Another "feature" of PHP/JS dev that I want to see in compiled projects is a way to run a project with build errors, via a toggle that instructs the compiler to stub out all functions/members that contain build errors (even syntax errors!). This way we can do quick ad-hoc testing without needing the entire project to build. Think of the times when your boss asks you to make 1 or 2 "minor" alterations for an unrelated feature and you don't think it's worth making a whole new git branch and worktree for, but you can't test it right away because of unrelated breaking changes you've already made.

Just some ideas...

---------------

Unrelated additional comment: I'm looking forward to when all programming languages fully support generalized ADTs. So many issues with data modelling can be solved with ADTs and GADTs, yet thanks to OOP's legacy from the 1990s we still have to shoehorn simple and clear models into inappropriate applications of inheritance. I want my intersection types, damnit.
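For what it's worth, TypeScript already ships both halves of this wish. A sketch with illustrative names (a plain ADT-style sum type plus an intersection type, no inheritance involved):

```typescript
// A sum type (ADT): a value is exactly one of these shapes.
type Shape =
  | { tag: "circle"; radius: number }
  | { tag: "rect"; w: number; h: number };

function area(s: Shape): number {
  return s.tag === "circle" ? Math.PI * s.radius ** 2 : s.w * s.h;
}

// An intersection type: a value satisfying both contracts at once,
// composed structurally rather than through a class hierarchy.
type Named = { name: string };
type Timestamped = { createdAt: number };
type Labeled = Named & Timestamped;

const r: Labeled = { name: "example", createdAt: 0 };
```

Whether this counts as "fully supporting GADTs" is debatable, but the basic modelling the comment asks for is available without any inheritance shoehorning.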


Exactly, that kind of content is literally publication bias[0] on steroids.

[0] https://en.wikipedia.org/wiki/Publication_bias


The biggest problem is talking about 'statically typed' and 'dynamically typed' languages as if Haskell and C and Javascript, Python and Common Lisp (or Clojure) are the same.


For Typescript, I think it's safe to say the honeymoon period is over. It's been >9 years, or more than 3 internet generations.

For at least this one case, many people (including myself) believe it's a success. Whether or not the tooling for Sorbet or mypy ever reaches that point, there are compelling reasons to believe in the success of the model.


Adoption curve takes a while. Typescript definitely wasn't common back in 2012, I can tell you that.


This is true, but the adoption has been dramatic.

The question I would have is: what would it take for this adoption to reverse? Where is the dissatisfaction with the model? The rationale for Typescript and Sorbet was precisely long-term maintenance and codebases at scale.

To add to the list, there's also Clojure spec and Elixir spec. All of these tools have a similar philosophy across very different problem domains. All of them have enthusiasm and the model of type annotations for dynamic languages is decades old.

To reference another one of the author's points: people are slow learners in software engineering. The continued reinvention of this concept decade after decade shows its utility, both theoretical and practical.


Well, my point was not that all technologies are equivalent. And it could well be that a new iteration of B is better than an ancient version of A. But the way the industry celebrates new (or, if you wish, rediscovered) paradigms is too detached from reality to be useful as any objective metric.


It's not common now! The software world is huge; a niche of a niche is not some bellwether of the industry.


> It's not common now!

It is [1].

Over 60% of Javascript developers use it somewhere, and over 1/3rd of all developers use it for something. Almost all widely used npm modules have @types annotations as well. Outside of node, it's arguably the most ubiquitous voluntary extension of the JS ecosystem ever.

[1] https://redmonk.com/jgovernor/2019/05/07/typescriptexploding...


The key of my point is "niche of a niche", which is seemingly the exact point you're making in arguing that typescript is common.

Common among JS developers != common among programmers != common among professionals in the software industry


> Common among JS developers != common among programmers != common among professionals in the software industry

Not sure if you're a troll, but as I pointed out, as of 2019 over 1/3rd of all developers in the software industry were using it. It's higher now.

That's common. Find me another technology used by more developers in the software industry.


I think part of it is that there are suddenly a bunch of clear, actionable and measurable goals when doing a migration - so people can commit to progress, deliver on it and feel good about their accomplishments.


Agreed, just because more companies are adopting something and saying it helps doesn't mean it actually helps. There's too much incentive for spin and arguable claims of success.

Additionally for things like static typing, the larger the company the more risk-averse people tend to be within the company so things that give a feeling of safety like type-safety are going to be a much easier sell.


It... depends.

I like Typescript. The Typescript enthusiasts are also glib about what a pain it can be to teach. This is not a minor tradeoff.


Yes, survivorship bias too.



