The eventual standard still stands up after more than three decades because Common Lisp was designed by people using Lisp, and for people using Lisp.
The paper is probably best understood as two insiders criticizing their own work in progress, trying to steer the community they were part of in a direction they thought would be beneficial. Gabriel was a member of the standardization committee, and Brooks was the author of papers influential in the development of the standard.
"He was one of ten founders of Lucid Inc., and worked with them until the company's closure in 1993."
There are some cool tidbits in this interview:
I don't think this passage from the abstract has stood the test of time. Common Lisp's standard library was large at the time, but nowadays it is comparable to, or small (or tiny!) next to, the standard libraries shipped with C++, Python, Java, or Haskell. Software complexity has skyrocketed since 1984.
The standard library of Python is remarkably similar to that of CL, and the issues in compiling CL are much like those in Python. JIT compilers can now beat the 'microcoded' systems of that era at handling cases like
x = y + z
People are re-discovering Lisp all the time. The nice things still seem CL-exclusive: conditions and restarts, proper macros (being adopted in many languages NOW, about 30 years later), and multiple-dispatch OO.
And first and foremost: proper interactive development. Using a Python REPL (or even a notebook), I feel I get all the frustrating bits of the REPL experience without any of the good parts, with very lackluster performance on top of that.
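For anyone who hasn't run into multiple dispatch: here is a toy sketch in Python of the idea, picking an implementation by the types of ALL arguments rather than just the receiver. The `defmethod`/`call` names and the registry are invented for illustration; CLOS does this natively and far more generally.

```python
# Toy multiple dispatch: a registry maps (function name, argument types)
# to an implementation, and dispatch inspects every argument's type.
_registry = {}

def defmethod(*types):
    """Register an implementation for an exact tuple of argument types."""
    def wrap(fn):
        _registry[(fn.__name__, types)] = fn
        return fn
    return wrap

def call(name, *args):
    """Dispatch on the runtime types of all the arguments."""
    fn = _registry[(name, tuple(type(a) for a in args))]
    return fn(*args)

@defmethod(int, int)
def combine(a, b):
    return a + b

@defmethod(str, str)
def combine(a, b):
    return a + " " + b
```

Single-dispatch OO (ordinary Python methods, `functools.singledispatch`) can only branch on the first argument; the point of generic functions in CLOS is that `combine` above specializes on both.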
I was kind of tired of not being impressed by new languages. Clojure and Erlang were the first ones in a very long time where it was the language (and not any additional libraries) that made me go "Wow. This is really cool".
Now, I still write most of my code in Guile Scheme and occasionally OCaml, but not without at least glancing once or twice at Erlang.
In one sense, it's the most common thing in the world, because Common Lisp source code is made of linked lists, so every Lisp program makes extensive use of them. But that doesn't have any necessary relationship to the efficiency of running programs, because the compiled code doesn't have to traverse linked lists just because that's what the source code is made of.
In another sense, it's still pretty common because linked lists are a standard feature of the language with extremely convenient surface-level support, and because linked lists are sufficiently flexible to represent arbitrary data structures (albeit not necessarily efficiently). Those characteristics make it really convenient to use lists for modeling data while you're figuring out how you want things to work, and I personally do that all the time.
Again, that doesn't have any necessary relationship to efficiency in your completed program. I commonly use lists to knock together speculative data models and figure out access patterns. That doesn't mean I'll stick with them once I figure out the needed access pattern. It depends on the access pattern.
If it happens to be the case that linked lists are close to optimal for the access pattern I discover, then I'll stick with them. Why wouldn't I? If not, I'll use another data structure. Lisp makes it easy to design APIs that are agnostic about the data structures they operate on, which makes it easy to swap out one structure for another once you figure out which is the right one.
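The same structure-agnostic style carries over to other languages. A hypothetical sketch in Python (the `make_store`/`lookup` names are invented for illustration): callers only ever go through two small functions, so the backing structure can be swapped from an association list to a hash table without touching any call site.

```python
# Speculative data model: an association list (a list of key/value pairs),
# the quickest structure to knock together while exploring access patterns.
def make_store(pairs):
    return list(pairs)

def lookup(store, key):
    for k, v in store:
        if k == key:
            return v
    return None

store = make_store([("a", 1), ("b", 2)])
assert lookup(store, "b") == 2

# The access pattern turned out to be pure key lookup, so swap the backing
# structure for a dict. Callers only use make_store/lookup, so nothing
# else in the program changes:
def make_store(pairs):
    return dict(pairs)

def lookup(store, key):
    return store.get(key)

store = make_store([("a", 1), ("b", 2)])
assert lookup(store, "b") == 2
```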
In Guile Scheme you gain very little by using vectors instead of lists unless you want random access. There is, however, a never-ending flood of people coming from Python using lists in exciting ways to create accidentally O(n^3) programs :) Those can almost always be rewritten to be O(n) using linked lists; otherwise you point them to vectors instead.
Not as good as a vector of course, but still decent.
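A concrete instance of the accidental-complexity trap (hypothetical code, but a pattern that comes up constantly; the O(n^3) cases usually stack two such mistakes, so here is the simplest quadratic one): inserting at the front of a Python list is O(n) per insert, while consing onto a linked list, or `collections.deque.appendleft`, is O(1).

```python
from collections import deque

def reversed_accidentally_quadratic(items):
    out = []
    for x in items:
        out.insert(0, x)   # shifts every existing element: O(n) per call
    return out             # O(n^2) overall

def reversed_linear(items):
    out = deque()
    for x in items:
        out.appendleft(x)  # O(1) per call, like consing onto a linked list
    return list(out)       # O(n) overall
```

Both produce the same result; only the growth rate differs, which is exactly the kind of rewrite being described here.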