If the big idea was messaging and not objects, I'm curious how we ended up in this primarily object-oriented world, especially since its founding father didn't have that intention, as he himself said: "I invented the term object-oriented, and I can tell you that C++ wasn't what I had in mind."
I have a guess from the following quote: "This was why I complained at the last OOPSLA that - whereas at PARC we changed Smalltalk constantly, treating it always as a work in progress - when ST hit the larger world, it was pretty much taken as 'something just to be learned', as though it were Pascal or Algol." - Alan Kay
There were people in a hurry to build something with whatever theoretical basis was available, rather than further refining the original concept. That's probably where it took a wrong turn.
> I'm curious how we ended up in this primarily object-oriented world
Hundreds of millions of lines of C and millions of C programmers.
They could, in theory, smoothly migrate into the world of object orientation using C++.
C++ has, however, held as a constant design principle that if you don't use the new stuff, it's still C. Put another way: feed your C program to a C++ compiler and it behaves the same way.
That design back pressure makes late binding and message-passing a la Smalltalk a non-starter.
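To make that point concrete, here is a toy example (names invented for illustration): this translation unit is plain C, and a conforming C++ compiler accepts it and produces the same behavior. Every call is early-bound; to get anything resembling a Smalltalk message send you have to opt in explicitly (e.g. with virtual), and even then the receiver's interface is fixed at compile time.

```cpp
#include <stdio.h>

/* A plain C struct and a free function: valid C and valid C++ alike. */
struct point { int x, y; };

void print_point(struct point p) {
    printf("(%d, %d)\n", p.x, p.y);
}

int main(void) {
    struct point p = {1, 2};
    print_point(p);  /* early-bound: the call target is fixed at compile/link time */
    return 0;
}
```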
Objective C solved this in a very different way: by making the Smalltalk-y additions syntactically orthogonal to the existing C bits. That's why you send messages [inside brackets].
C++ was backed by AT&T. Objective C was backed by some whacky tech company from California founded by that loony guy they kicked out of Apple when the grownups took over.
Not to mention that Smalltalk had serious corporate backers too (eg IBM), so what room was there for Objective C?
Eventually Java came along with the mission to rescue the world from the explosive mixture of C++ and median programmers. It took a page from the C++ book and aimed to be easy to switch. Similar model, similar basic concept of operation. And that's largely been that.
I don't think we really did end up in an object-oriented world. I think we ended up with nominally object-oriented languages used in ways that are more struct-oriented than object-oriented.
I had the good fortune to do early Java work with a bunch of Smalltalk programmers, so I know you can write Java that is reasonably well object oriented. But I don't think I ever saw an intro-to-Java book that was any good in that regard.
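For what it's worth, the struct-oriented vs object-oriented distinction isn't Java-specific. A minimal sketch in C++ (all names invented for illustration): in the first style the data is public and every caller re-implements the rules; in the second the object owns its invariants and you ask it to do things.

```cpp
#include <string>
#include <utility>

// "Struct-oriented": bare data; every caller re-implements the rules.
struct AccountRecord {
    std::string owner;
    long balance_cents;
};
// callers do: if (rec.balance_cents >= amount) rec.balance_cents -= amount;

// "Object-oriented": the object owns its invariants; you tell it what you want.
class Account {
public:
    Account(std::string owner, long opening_cents)
        : owner_(std::move(owner)), balance_cents_(opening_cents) {}

    // The object decides whether and how the operation happens.
    bool withdraw(long amount_cents) {
        if (amount_cents <= 0 || amount_cents > balance_cents_) return false;
        balance_cents_ -= amount_cents;
        return true;
    }

private:
    std::string owner_;
    long balance_cents_;
};
```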
Refining a language or programming paradigm requires a philosophical or scientific bent. Not much money in that. Most people just want to get a job done and go home at 5pm.
Edit: On the bright side, I think polyglot programming is more popular now than ever. In the past you learned one language and that was 20 years of your career before you went into management. Your identity was wrapped up in being a "VB person", a "C person", a "COBOL person", a "Java/C# person", or a "Rubyist".
There is still language identity politics, but I think Ruby might have been the last of the One True Religions; now the frontier is spread across Python, Ruby, Node, Java, Golang, C#, etc.
I think it is a perversion of computer science that we think of ourselves as users of languages instead of creators of languages, especially because it is not hard to create a language these days -- we know much more and have better tools than people did 30 years ago. Many of the problems we consider uber-complicated could be solved relatively easily by creating appropriate problem-specific languages: consider web programming, data mining, etc.
I'm with you. But it becomes a skills and NIH-attitude issue at scale, where the evolution of science is less of a concern than treating the science as constant for an engineering project.
I've seen lovely crafted DSLs that were deemed unmaintainable when the creator moved on, and management cursed the use of them if they weren't already a popular OSS project.
The reason is that DSLs are just viewed as "another" burdensome language, instead of a design for codified problem-specific knowledge. A DSL should evolve as we evolve our understanding of the problem. Think about successful DSLs like Mathematica, for example.
"Object-oriented" was like "agile" in the late 80s and early 90s. It was viewed by management as a quasi-silver-bullet that would solve the software complexity crisis -- and by consultants as easy money. The qualities of Smalltalk that were selected for in the object-oriented craze were the ones that most appealed to computer-unsavvy middle management -- the ones that helped them herd programmers by the dozens or hundreds without them stepping on one another's toes. So classes, data encapsulation, inheritance, and virtual-method polymorphism were emphasized to promote the creation of sealed components that could be extended or wrapped but not changed (Brad Cox's software IC dream) and late-bound messaging de-emphasized in order to enforce static typing constraints.
OO is still a useful way to organize programs. Even if you use, let's say, C++ templates instead of inheritance and method-table dispatch... it's still a useful way to mentally/physically organize your code.
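A minimal sketch of that trade-off (types invented for illustration): the same "shape with an area" idea expressed once with virtual dispatch, resolved at run time through a vtable, and once with a template, resolved entirely at compile time. The organizing idea is identical either way.

```cpp
#include <iostream>

// Dynamic dispatch: resolved at run time through the vtable.
struct Shape {
    virtual ~Shape() = default;
    virtual double area() const = 0;
};

struct Square : Shape {
    double side;
    explicit Square(double s) : side(s) {}
    double area() const override { return side * side; }
};

double report(const Shape& s) { return s.area(); }          // late-bound call

// Static polymorphism: the same interface expressed as a template,
// resolved entirely at compile time.
template <typename S>
double report_static(const S& s) { return s.area(); }       // early-bound call

int main() {
    Square sq{3.0};
    std::cout << report(sq) << " " << report_static(sq) << "\n";  // prints "9 9"
    return 0;
}
```

Both calls produce the same answer; only the virtual version lets you choose the concrete type at run time, which is the part the templates give up.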
Because it was much simpler to bolt on the more primitive Simula model of OO, which is a straightforward extension to procedural Algol. Beefed up structs, at their core.