I think in the current crop of languages, the "functional language" concept has shifted its purpose:
1 GP languages have adopted FP-enabling features, so we can do FP just fine in many mainstream languages, and it is in fact very common (see e.g. the popularity of React and Ramda on the frontend, and the many recent FP features in Kotlin, Java, C++, etc).
2 At the same time, FP-leaning languages are more popular than ever, to the point it would be ridiculous to claim "nobody uses functional languages" given the visible positions of Clojure, Erlang/Elixir, ReasonML/OCaml, F#, Scala, etc on the scene.
I think adding 1+2 together tells us that FP is on a real streak. People choose to use FP languages not because of their capabilities, but because of the mindsets, ecosystems and culture they promote, along with the promise of consistently paving the way for FP problem solving instead of often falling back to imperative style.
(1) keeps cropping up but these new languages are hardly "FP" or even using "FP" features. I think we give FP research way too much credit.
Pure functions? No, none of those languages can even express the concept, except maybe C++ with the const keyword (which is as old as Haskell itself).
Immutable data structures? Even very modern languages like Kotlin don't really have them. You can define a data structure whose fields are immutable after construction, but that immutability isn't transitive. There are read-only collections but, without extra work, no immutable collections. Maybe immutable collections will turn up at some point (there's a proposal to add them), but it'll just be a library, not a language feature.
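To make the non-transitivity concrete, here's a minimal Kotlin sketch (the `Config` class and field names are made up for illustration):

```kotlin
// `val` fixes the reference, not the contents: immutability is not transitive.
data class Config(val tags: MutableList<String>)

fun main() {
    val c = Config(mutableListOf("a"))
    c.tags.add("b")                    // compiles fine; the `val` field's contents mutate
    println(c.tags)                    // [a, b]

    // Read-only is not immutable: List<String> is just a view over the same storage.
    val backing = mutableListOf("x")
    val readOnly: List<String> = backing
    backing.add("y")
    println(readOnly)                  // [x, y] -- the "read only" list changed underneath
}
```

Truly immutable collections require the separate kotlinx.collections.immutable library, which is exactly the "library, not a language feature" situation described above.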
Type inference? New languages don't use full-blown Hindley-Milner inference; they do much more limited inference that's designed to strike a balance between readable error messages, documented APIs, IDE performance and so on. Meanwhile the concept itself is obvious and required no inspiration from anywhere: anyone who has ever looked at source code will be struck by the repetition required.
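A small Kotlin sketch of that more limited inference (function name `twice` is just for illustration):

```kotlin
// Kotlin infers local types, but never parameter types -- unlike Hindley-Milner,
// where `twice f x = f (f x)` gets a fully general type with zero annotations.
fun twice(f: (Int) -> Int, x: Int) = f(f(x))   // parameter annotations are mandatory

fun main() {
    val xs = listOf(1, 2, 3)         // inferred: List<Int>
    val total = xs.sum()             // inferred: Int
    println(total)                   // 6
    println(twice({ it + 1 }, 40))   // 42
}
```

This is the deliberate trade-off: local inference keeps signatures as documentation and keeps error messages anchored to declarations, at the cost of the whole-program generality HM offers.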
Generics? C++ templates started being discussed in 1985, and it's again a pretty obvious concept that would occur to anyone who wanted to offer type-safe collections.
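The "type-safe collections" motivation in one short Kotlin sketch (the `Stack` class is a made-up example, not from the thread):

```kotlin
// The obvious use case for generics: a collection that rejects wrong types at compile time.
class Stack<T> {
    private val items = ArrayDeque<T>()
    fun push(x: T) = items.addLast(x)
    fun pop(): T = items.removeLast()
}

fun main() {
    val s = Stack<String>()
    s.push("hello")
    println(s.pop())      // hello
    // s.push(42)         // rejected by the compiler -- the whole point of generics
}
```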
I honestly can't see the huge impact FP supposedly had on mainstream programming. All the ideas its advocates try to claim as their own are either not used, or are obvious but usually not implemented the same way. My guess is that if Haskell had never existed, modern languages would look exactly the same.
I think the discussion is in danger of going the way of a Möbius strip... Language features originating in FP have been adopted by GP languages, which was the point :)
But those features were found in Java 1.0 so the influence can't be that strong, given it was heavily advertised as an object oriented language at the time.
From that list, only GC was in early Java. GC was popularized by Lisp (added in the 60s?) and other FP languages like ML (which had GC from the start in 1973) decades before.