
Alan Kay on the Meaning of “Object-Oriented Programming” (2003) - tosh
http://www.purl.org/stefan_ram/pub/doc_kay_oop_en
======
chrisseaton
I still don't really understand what the big idea of message passing compared
to a method call is.

Is it supposed to be asynchronous? It never seems to be so in practice. Is it
supposed to be remotable? So are method calls. Are languages like Objective C
and Ruby message passing to some extent? Why is that message passing rather
than a method call? What's the essential difference? Is Java message passing?

~~~
azeirah
The big idea of message passing is that it's not about message passing. I
think what Alan Kay has been trying to say all these years is that we should
look at how bacteria communicate, and try to mimic that.

They communicate by sending protein "messages" to their vicinity, and if a
close-by bacterium knows what to do with one, it processes it, and responds by
either sending out its own proteins (message passing) OR changing its
internals to become receptive to different kinds of messages (changing
internal state).

From this viewpoint, coding becomes about cultivating a literal society of
objects (cells). The obvious advantage is that biology scales immeasurably
better than any architecture we have ever come up with.
He often gives the example of the internet as a well-designed system, it
operates in a similar manner, objects (anything that can self-identify with an
IP-address) can join/leave at any point, and the system itself stays alive.

I'm not 100% sure if this is what Alan means, but after following his talks
and writings for a couple of years, I think that's the gist of it.

~~~
hota_mazi
> The big idea of message passing is that it's not about message passing. I
> think what Alan Kay has been trying to say all these years is that we should
> look at how bacteria communicate, and try to mimic that.

It is such a horrible programming model, though.

Making calls and sometimes they get picked up and sometimes not?!?

If we're talking remote processes, sure, that's understandable, remote
resources are not always available.

But in-process? If I call a function on an object, I want the compiler to tell
me right now if this code makes sense, and the compiler should stop me right
away if that code cannot possibly work once deployed.

~~~
function_seven
I don't think it's about sending a message into the void and "hoping" that a
compatible agent receives the message.

Rather, it's about your object not having to know _who_ is handling the
message, only _what_ the message is ("what" in both content and type).

It's still up to you as the programmer to make sure that you have implemented
the other objects to act on whatever message types your code will pass. Just
like it's up to you to make sure the method exists in the target class when
the caller tries to invoke it.

Later on, if you decide to swap out your logging infrastructure for a
different one, you don't need to update all the places that called the old
one. You just tell your new logger to listen for messages directed at Logger.
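A minimal Python sketch of that idea (all names here are hypothetical, not a real framework): senders address a role rather than a concrete object, so swapping the handler behind the role changes nothing at the call sites.

```python
# Minimal message-dispatch sketch (all names are hypothetical).
# Senders address a role ("logger"), not a concrete object, so the
# handler can be swapped without touching any call sites.

class MessageBus:
    def __init__(self):
        self.handlers = {}

    def register(self, role, handler):
        # Whoever registers last for a role handles its messages.
        self.handlers[role] = handler

    def send(self, role, message):
        # The sender only knows the role and the message shape.
        return self.handlers[role](message)

class OldLogger:
    def handle(self, message):
        return f"[old] {message}"

class NewLogger:
    def handle(self, message):
        return f"[new] {message}"

bus = MessageBus()
bus.register("logger", OldLogger().handle)
print(bus.send("logger", "disk full"))   # [old] disk full

# Swap the logging infrastructure: callers are unchanged.
bus.register("logger", NewLogger().handle)
print(bus.send("logger", "disk full"))   # [new] disk full
```

The point is that the sender's code never names `OldLogger` or `NewLogger`; it only knows the role "logger" and the message it wants handled.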

~~~
da_chicken
> I don't think it's about sending a message into the void and "hoping" that a
> compatible agent receives the message.

Except that _is_ the bacterial model. When you want something done, you signal
for it by releasing proteins, and keep releasing proteins until you're
satisfied even if the work ended up being done 50 times more than you needed.
That's why biological systems so commonly overreact to stimulus. The
biological method is to flood communication until the demanded response is
met. It's the equivalent of a user mashing a button until a program responds,
like the elevator button problem.

Actually, elevators seem like an extremely good analogy for this kind of
asynchronous service system.

I understand we're not trying to perfectly mimic the analogy, but I think it's
important to see that nature's model, while robust and asynchronous, carries
significant problems due to how it communicates. It's that very robustness
that we're trying to mimic, so we should expect to inherit some of the
problems that go along with it.

~~~
function_seven
Yeah, the biological model is much more messy. I think we can take some
lessons from it while still enforcing a "God Mode" on our local machine,
ensuring that a compatible process is always running to receive the message.

Though, as GP stated earlier, this does make some more sense from a
distributed networking perspective, where no machine has control over the
existence of its peers. In that case, having a setup where the message is sent
out on a service bus to be snatched up by a compatible listener is closer to
the biological analogy.

------
tyingq
_" I'm not against types, but I don't know of any type systems that aren't a
complete pain, so I still like dynamic typing.)"_

That's probably more controversial here than his views on OO.

~~~
yxhuvud
Arguably, it was true back in 2003 when this was written. There has been a lot
of improvement in type system usability since then.

~~~
phyrex
A lot of us still feel that way. The trade-off is still there, and in many
situations I still don't find types to be worth the trouble.

~~~
DonaldPShimoda
> The trade off is still there

Not really, in my opinion. Type inference especially has come a long way. If I
offer you a choice between language A and language B and claim that their
syntax is nearly identical and you'll use essentially the same number of
keystrokes to accomplish the same task in both, but language B can inform you
about mistakes you've made at compile-time instead of waiting until run-
time... which one are you going to choose?

While dynamic languages afford some degree of freedom, that freedom comes at a
cost. The tradeoff used to be that safety required lots of extra writing
(e.g., Java). Now that this isn't the case... the freedom of dynamic typing
can be more harmful than helpful.

(I still use Python all the time, so don't mistake me for a static-only zealot
of some sort. But I think the landscape of type systems has changed a lot
since Kay made these claims.)
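As a rough sketch of that trade-off (hypothetical example, not from any particular codebase): with dynamic typing, this kind of mistake only surfaces at run time, on the path that actually executes, whereas a checker with type inference flags it before the program runs.

```python
# Hypothetical example: a price accidentally stored as a string.
# Nothing in a dynamic language checks this before the program runs...

def total_price(prices):
    return sum(prices)

order = {"widget": "3.50"}   # the mistake: "3.50" should be 3.50

caught = False
try:
    total_price(order.values())   # ...so it blows up only at run time
except TypeError:
    caught = True

# With an annotation like `def total_price(prices: list[float])` and a
# static checker (e.g. mypy), the same mismatch is reported before the
# program ever runs, at essentially no extra keystroke cost.
print("caught at run time:", caught)
```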

~~~
ridiculous_fish
No this completely misses the point! The point is the development model, not
the language syntax.

Static types assume a "static" phase: an edit-compile-run cycle, with bright
lines between each stage.

But as a Smalltalk programmer, you inhabit the system and build it inside-out,
_while it is running._ For example, during development it is routine to query
live objects. The lines between the edit-compile-run phases disappear.

Static types don't make sense in the Smalltalk model because there's no static
phase.

~~~
ddragon
I agree with that. The interesting aspect of dynamic languages is that the
program is a living thing: you can interact with it, it can interact with
itself, and you could even effectively turn it into an entirely new program
without ever closing it, like the Ship of Theseus. Smalltalk, Lisp Machines
and the Erlang virtual machine all pushed this idea.

Which is why I feel that a dynamic language that doesn't focus on
interactivity (a great REPL-based development cycle, hot-swapping
capabilities, strong code-as-data utilities) kinda misses the point of the
trade-off between static and dynamic, sacrificing safety for too little.
Jupyter notebooks seem to have recently brought a little of the first to the
mainstream at least.
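A tiny Python sketch of that "living program" style (hypothetical names): behavior is redefined on a live class while the system keeps running, and existing objects immediately pick up the new code.

```python
# Hypothetical sketch of hot swapping in a dynamic language: redefine
# behavior on a live class and existing objects pick it up, without
# restarting the program -- the Ship of Theseus effect.

class Greeter:
    def greet(self):
        return "hello"

g = Greeter()          # a long-lived object in the running system
assert g.greet() == "hello"

# Later, at the REPL, we patch the running program:
def greet_v2(self):
    return "hello, again"

Greeter.greet = greet_v2   # swap the method on the live class

assert g.greet() == "hello, again"   # the same old instance runs new code
```

Smalltalk images and Erlang's code-change mechanism take this much further, but the flavor is the same: there is no separate edit-compile-run boundary to cross.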

------
cle
I mentally categorize these discussions of "true OOP" in the same class as
"RESTful" API discussions. Nobody is quite sure what they're even arguing
about, so the discussions go around endlessly in circles.

~~~
h8hawk
Alan Kay is a Turing Award winner and one of the main pioneers of OOP. I don't
know how he doesn't know what he's talking about.

~~~
cle
Alan Kay probably does know what he's talking about. The problem is that
nobody else knows what he's talking about.

~~~
4thaccount
I stated something similar in an older post. Essentially, we all know who he
is and have a lot of respect for his amazing accomplishments. He even has some
lectures that are awesome. One that I've watched a few times goes over the
massive technical debt we're building as a society: the Xerox PARC team built
a computer with an OS, language, editor, GUI, etc. in probably 1/50th (I'm
completely making this number up to paraphrase the point) of the code of just
Microsoft Word. So we're using overcomplicated systems and tools, and it isn't
necessary.

On the other hand, there are other papers, internet discussions, and talks of
his where I think the target audience is someone who has an advanced
understanding of computers, biology, art, philosophy, pedagogy, and psychology
and has had the time to piece it together, and I'm completely lost as to the
point he's trying to make, which is a shame. I wish those talks came with an
ELI5 so I could get the gist before diving in further. Perhaps I haven't
earned the knowledge to participate in those discussions yet.

------
userbinator
_I wanted to get rid of data._

I wonder if the OOP "style" of obfuscating program behaviour by using more
code (class hierarchies, overridden functions, etc.) instead of data (table
lookups) could be attributed to him.

The opposing viewpoint is discussed at
[https://news.ycombinator.com/item?id=4560334](https://news.ycombinator.com/item?id=4560334)
and I've personally found from experience that expressing a set of conditions
and decisions as one table is far easier to understand and modify than trying
to model it with multiple interacting objects. The latter has much more
cognitive overhead.
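A small Python sketch of the table style (the rules here are made up for illustration): one lookup table stands in for a hierarchy of classes each overriding a cost method.

```python
# Hypothetical example: shipping rules expressed as one table instead of
# a DomesticShipping/InternationalShipping class hierarchy with an
# overridden shipping_cost() method in each subclass.

SHIPPING = {
    # (region, express?) -> cost
    ("domestic", False): 5,
    ("domestic", True): 15,
    ("international", False): 20,
    ("international", True): 50,
}

def shipping_cost(region, express):
    return SHIPPING[(region, express)]

# Adding or auditing a rule means editing one row, not a new subclass,
# and the whole decision space is visible at a glance.
assert shipping_cost("domestic", True) == 15
assert shipping_cost("international", False) == 20
```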

~~~
anon1m0us
Quite the contrary: Everything should be data. Even the lines of code. The
values of variables and constants.

We can't have AI if we can't rewrite the code. It's super hard to rewrite 1's
and 0's, but if our lines of code are the data, then all we need to do is
update a row in a database to rewrite the program.

We can automate this. Even now our code is data. We could automate a program
to upload a new source code file to github and then redeploy ourselves to the
appropriate nodes.

~~~
bunderbunder
What Kay was getting at there is a bit difficult for me to follow, but I'm
pretty sure that he meant something different than that. Something more
specific and constrained.

What you're describing sounds to me like a very LISP-y view of data. Alan Kay,
by his own admission, wasn't very versed in the LISP tradition - based on what
he's describing, at the time he was pretty deeply immersed in the more ALGOL-y
side of the world, back in a time long before there was much cross-pollination
between the two camps, and that would presumably have influenced his thinking.

So, semi-wild conjecture: It seems to me like what he was really getting at is
that he wanted to get rid of the heap. Or at least shove it deeply into
"implementation detail" territory, the same way languages like Haskell try to
banish any evidence of the existence of memory registers from the programmer's
consciousness.

In so many languages in the imperative tradition, the standard practice was to
have most of your data living out there, in this giant pile of shared memory,
more or less available for anyone to see. I think that, even before the birth
of C, most academics recognized that as a source of liability; a lot of ink
was spilled dancing around possible solutions to that problem in the mid-late
20th century and beyond. What Kay is saying in TFA seems, to me, to place his
thinking clearly in the middle of that whole scene.

~~~
galaxyLogic
The way I see it, the FP school is about the microcosm, like quantum
mechanics: deciding what the best possible building blocks of programs are,
the functions, which are pure, referentially transparent, and immutable.

OO looks at the macro level: what the systems are and how they communicate in
a way that preserves the referential integrity of the system as it evolves
over time. OO is like cosmology, the structure of things on a large scale.

FP = Quantum Mechanics

OO = Relativity

~~~
kazinator
OOP is not very good on large scales; in fact, it shines at medium scales.
Even programs of a few thousand lines of code can greatly benefit from OOP.

We don't know what works at scales that are so large that they are described
with cosmological words.

------
ilovecaching
OO is a great example of Chinese whispers. The original idea behind it is more
in line with Erlang or functional programming in general than what we have
today in Java or Python.

That's why I'm adamant that existing OO languages need to die. They're
abominations of good ideas that make things worse. C++, Java, and Python are
all founded in confused interpretations of OO. OO has also become so
synonymous with these implementations that we should just let the paradigm go
entirely and start afresh.

~~~
mattnewport
C++ isn't like Java in this sense; it's a multi-paradigm language and you can
use it effectively without ever doing Java-style OOP. The STL is mostly OOP-
free because Alexander Stepanov recognized that OOP was mostly a bad idea, and
a lot of modern C++ doesn't use inheritance, virtual functions, abstract
classes, getters and setters, or a lot of the other Java-style OOP junk except
sometimes as an implementation detail.

~~~
pjmlp
Apparently it is now common to forget that Java-style OOP junk was originally
mid-90s C++.

The STL builds on OOP abstractions: class templates, static method dispatch,
type polymorphism, abstract classes for allocators, iostreams, and base
classes for data structures' common methods.

~~~
mattnewport
STL is just the containers / iterators / algorithms part of the standard
library, the part that Alexander Stepanov designed and originally implemented.
Most of the less pleasant parts of the standard library like iostreams come
from elsewhere.

Templates, static method dispatch and type polymorphism are not really part of
OOP, certainly not Java style OOP. Abstract classes for allocators are a very
recent addition with polymorphic allocators and are a good example of OOP
features being used in modern C++ as more of an implementation detail (a way
to get dynamic dispatch / type erasure).

~~~
pjmlp
It is C++ OOP, regardless of how Java does it.

Java is not the last word in what OOP means in practice, and is a younger
language, which took many OOP ideas from C++ OOP libraries that were current
when Java was designed.

In fact something like J2EE was a relief versus using CORBA or DCOM/MTS, kings
of OOP boilerplate.

Or the surviving king of C++ GUI frameworks, Qt.

As for the STL part you refer to, the original one, containers / iterators /
algorithms, it makes use of C++ classes, methods, aggregation and delegation,
which are definitely OOP.

OOP doesn't have a rule that it is only OOP when all concepts are used in
every single class.

~~~
mattnewport
The STL is more derived from Abstract Data Types than from OOP but there is
overlap between those. Basically the good parts of OOP is ADTs and are
embodied in the STL, the bad parts are most of the rest and are embodied in
Java.

~~~
pjmlp
It is still OOP from CS point of view, regardless how you want to sell it and
keep hand waving the fact that Java got it from C++.

The Gang of Four book uses Smalltalk and C++ for its pattern examples, Java
wasn't even invented when the first edition came out.

~~~
mattnewport
To bring it back to the original comment I was replying to and to the
original email, Alan Kay says in the email:

> _OOP to me means only messaging, local retention and protection and hiding
> of state-process, and extreme late-binding of all things. It can be done in
> Smalltalk and in LISP. There are possibly other systems in which this is
> possible, but I'm not aware of them._

He was writing in 2003 so presumably didn't / doesn't consider Java or C++ OOP
languages by his definition.

The comment I was originally replying to said:

> _C++, Java, Python, are all founded in confused interpretations of OO. OO
> has also become so synonymous with these implementations that we should just
> let the paradigm go entirely and start afresh._

And I was taking issue with C++ being included with Java in this (I think
Python shouldn't really be included either). C++ is a multi-paradigm language,
and while I agree that the OOP parts it has in common with Java are neither
very good nor what Alan Kay meant by OOP, C++ supports other, better
programming paradigms better than Java does.

The good parts of C++ style OOP are mostly the ADT bits. The less good parts
are inheritance and virtual functions. _Design Patterns_ makes heavy use of
the less good parts of that style of OOP and most of the problems it addresses
are better solved in other ways that modern C++ supports quite well.

------
Ericson2314
I've never heard it discussed before, but I think the "no data" part is
important. Languages without pattern matching (or a few other exotic
alternatives) really do not allow for proper elimination. Asynchronous message
passing can be thought of as a weird continuation passing variation (you store
your continuation in the message receiving code). This means all elimination
is done remotely on your behalf, seemingly hiding what is a nastier part of
the language.

But once you get proper elimination, your messages can be more complex, and
chained pipelines are very useful, so you'd better be synchronous if you
weren't already; and then you have combinators that only shallowly modify
your messages, so you might as well immutably share memory to avoid copying;
and those objects you create and tear down on the fly might as well not be
stateful because they're so short-lived, and whoa, now we have functional
programming.

------
marvel_boy
"OOP to me means only messaging, local retention and protection and hiding of
state-process, and extreme late-binding of all things."

So this means that languages like Swift, for example, are not OOP?

~~~
kybernetikos
Almost no OOP languages are actually OOP according to Alan Kay.

I think his OOP is closest to what we'd call actors today.

~~~
userbinator
Fortunately it does not matter, because the industry seems to have mostly
gotten out of the "if it's not X-oriented it's bad" dogmatic mindset that
spurred the "OOP craze" of the 90s and early 2000s.

~~~
BigJono
I dunno about that. Try being a JS dev that likes mutability and dislikes
static typing.

------
anthony_doan
I like Joe Armstrong's definition of OOP and what he did to make Erlang an OOP
language:

[http://erlang.org/pipermail/erlang-questions/2009-November/047748.html](http://erlang.org/pipermail/erlang-questions/2009-November/047748.html)

Alan Kay has gone on the record to state that current OOP misses his point,
which is that message passing is the main idea.

Joe Armstrong's definition of OOP in the mailing list gives a little more
context on the other modern features of OOP that aren't truly needed.

In Alan Kay and Joe Armstrong's talk, Alan Kay acknowledges that computer
science is a pop culture where the past isn't learned from, which is why past
mistakes are repeated.

[https://www.youtube.com/watch?v=fhOHn9TClXY](https://www.youtube.com/watch?v=fhOHn9TClXY)

~~~
hota_mazi
Pretty rich for Armstrong to lament the past not being learned from when he
himself used to think that OO sucks:

[http://harmful.cat-v.org/software/OO_programming/why_oo_suck...](http://harmful.cat-v.org/software/OO_programming/why_oo_sucks)

~~~
anthony_doan
That's an interesting article, but I don't believe it goes against his other
article. Your link shows that he hates certain aspects of OOP, and the other
article shows he decided to strip those aspects away and keep only the ones
that he believes are OOP.

At least that's my interpretation from reading the link you've provided and
comparing it to the link I posted.

------
neokantian
The Wikipedia page on "message passing" is completely disconnected from
existing practice
([https://en.wikipedia.org/wiki/Message_passing](https://en.wikipedia.org/wiki/Message_passing)):

"Synchronous message passing occurs between objects that are running at the
same time. With asynchronous message passing it is possible for the receiving
object to be busy or not running when the requesting object sends the
message."

"Objects running at the same time"? Multithreaded? Multiprocessing?
Distributed?

An instance does not "run". It is allocated in memory, and then you can invoke
methods on it.

This entire "message passing" doctrine is nebulous nonsense.

~~~
dwd
What you're referring to is a Pub/Sub model, which is probably explained
better here:

[https://en.m.wikipedia.org/wiki/Publish%E2%80%93subscribe_pa...](https://en.m.wikipedia.org/wiki/Publish%E2%80%93subscribe_pattern)

------
Verdex
You set up this great analogy about messages and cells and a bunch of other
stuff, but then exceptions and pointers/references show up and everything
falls apart.

If you have a system like Ruby or Java or C#, then you can sort of pretend
that you have different "organisms" all sending signals to each other.
However, if you throw an exception at any one place, suddenly the order of
the functions you are simulating your message calls with WILL determine the
semantics of your program. Have fun figuring things out when the order changes
based on minor details that vary wildly with different inputs.

In this scenario if methods are messages, then exceptions are like a false
vacuum in quantum mechanics collapsing and destroying reality.

[Additionally, you'll need some sort of threading and message queue system
set up, or else you'll get a similar mess with your metaphor OR stack overflow.
Really erlang is the only system that preserves the metaphor without the
programmer either having to A) understand everything without the metaphor ...
in which case why have it or B) not understand anything and only be able to
program until they hit the threshold of their comprehension and then type
random letters and hope it runs until they can find a new job.]

Pointers are also a problem. In real life we have ownership types built into
reality. If I have something, then you don't. You also don't have to worry
about cyclical references or infinite recursion in reality. Eventually all of
the physical resources are utilized and you have to wait for something to free
up before you can proceed. Once you have pointers, though, you have to worry
about one "cell" possessing another "cell" that just happens to possess the
first "cell". A system that works just fine in real life (because paradoxical
systems just can't be physically built) gives way to a system that will
hopelessly break UNLESS the programmers know that the metaphor is false and
can see through it. We run into the same problem as before. Either they can
work just fine without it (so why are we using it) OR they can't see through
it at all (or only partially) and eventually they'll run into a problem that
they can't solve.

Maybe there are scenarios where it makes sense to use an incomplete metaphor
that pretty much has to break sometime. But it seems like it's going to be for
unimportant work that we're comfortable with breaking with no good path
forward for fixing it (besides to find people who can see through the metaphor
completely).

------
dwd
I would recommend reading the chapters on Finite State Machines from Object
Oriented Design & Modelling - Rumbaugh, et al.

Given that it was written nearly 30 years ago, the concepts it puts forward
are much closer to the ideas Kay had than what passes for OO these days.

------
m0zg
IMO a more productive way of enacting change would be to call the idealized
view of what this _should be_ something else and try again, rather than
telling everyone that they're "holding it wrong". Mental inertia is a force
that's nearly impossible to overcome, especially if lots of people are making
lots of money from the status quo.

------
namelosw
I was also struggling with "the big idea is about messaging". The best reading
I can come up with is that it refers to something similar to "tell, don't
ask": isolation and communication are the essential part.

Mainstream OO languages, meanwhile, focus on higher-level concepts like
encapsulation, inheritance, and polymorphism.

------
madhadron
When a programmer today tries to understand what Kay is saying, they usually
get caught up with decades of intellectual structure labelled "object oriented
programming." You need to go back to the influences he's pointing at as
instrumental and see what the words meant in the zeitgeist of the time. And
you also need to tease out which parts have become implicit in our own
zeitgeist and which haven't. Then you need to figure out the implications of
the sum of the parts that haven't.

This isn't an issue just with Kay. It's true for basically any intellectual
structure you care to engage with. The definition of "gene" in biology is a
similar case, but that's a rant for a different time.

For example, when he says dataless programming, he points to a couple of
papers. For example, Balzer's "Dataless Programming."[1] Go read the abstract.
It's talking about the algorithms part of the C++ Standard Template Library
and the idea that you shouldn't have to rewrite your program just because
you're changing from an array to a linked list.

But that's not all. Add "the cell/whole-computer metaphor would get rid of
data, and that "<-" would be just another message token." That is, why stop
with the Standard Template Library? A program made of assignments, loops, and
conditionals is as much a formal object as a linked list. Why should we have
to rewrite it when the "data structure" is changed from running as a
traditional program on a single core messing with registers to having its
state stored in a distributed tuple space and steps in it being executed by
workers that pull them off a queue?

This is where his math background really comes in. For a mathematician, the
algebra of structured programs is the same in kind as the algebra of numbers
or the algebra of sequences of numbers or the algebra of topology preserving
mappings of a space. What Kay was trying to get at was a way of letting you
write an expression in such an algebra and be able to bind it to whatever
instantiation of the algebra you might want, including ones that don't exist
yet.

To do that, all the pieces of your language have to be of the same kind,
whether they be numbers, lists, types, or loops. And here's our big delta from
"everything is an object" in familiar languages to what Kay was talking about.
In Ruby, Python, JavaScript, or Java (let's ignore Java's primitive types for
now and pretend boxing is all smiles and rainbows) the only things that are
objects are the things that were values in structured programming languages.
That's why Kay doesn't see these as object oriented.

Also, consider his comment "I don't know of any type systems that aren't a
complete pain, so I still like dynamic typing" against this background. It
suddenly feels a lot more reasonable. A type system for this kind of work is
basically pointing Kanren at your expressions and writing a logic program to
reason about your domain. Then, next week, when you change the algebra
substantively, that logic program might be almost unchanged or may have to be
almost entirely scrapped and rewritten.

[1]:
[https://www.rand.org/pubs/research_memoranda/RM5290.html](https://www.rand.org/pubs/research_memoranda/RM5290.html)

------
profalseidol
Does anyone find it surprising that OOP is not even well defined? Alan Kay
says that messaging is an important part of it, but based on the comments
here, there is no consensus.

------
msccuba
Q: Alan, how do you suggest coding?

Alan: Well, we should not have one big source of data. We should organize
code into compartments, each doing its own thing with its own data, like
cells.

Q: How would that work?

Alan: Message passing!

A few years later:

Q: What is OOP? Inheritance, polymorphism, etc.?

Alan: It is message passing. All the other stuff, I haven't figured out yet.
They are just nice-to-have things.

Me: Actually, OOP is code organized into cell/computer-like structures that
happen to use messages to make it work.

Dear Alan: don't emphasize message passing as a way to dismiss the other
techniques as non-essential to OOP.

~~~
latch
As you distilled it down, you seem to have forgotten about the data.

I don't think there's any practical reason why you couldn't have perfect data
isolation in OOP (make everything private, only pass copies of objects around,
build an event/dispatch system), but this isn't what usually happens. As Joe
Armstrong said: You wanted a banana but what you got was a gorilla holding the
banana and the entire jungle.

That's why there's an emphasis on message passing, because it speaks directly
to data ownership. OOP, not so much.

~~~
msccuba
For you, the implications of message passing might be instantly apparent, but
for me, and maybe others, it would've been easier to grasp data ownership by
simply being told about data ownership.

You wanted to explain data ownership, but what you got was cells, computers,
networks and finally "message passing".

------
sgc
Can anyone explain the importance of the term "oriented" in OOP?

~~~
chrisseaton
Not sure if you're asking literally, but 'oriented' (or I'd normally write
'orientated' myself) means something like 'aligned around'. Object orientated
programming is a programming model aligned around objects as the central
concept. As opposed to aligned around procedures, or functions, or aspects, or
something else.

~~~
sgc
I guess I erred on the side of vagueness. I did understand the basic
significance, but I was wondering if there was some technicality of the way
OOP should be implemented per Kay or other well respected contributors to the
OOP model that is referenced, e.g. the relationship between messages and
objects in the paradigm, or some other thing. I did not see the term explained
in this thread, his AMA, or the emails.

------
jayd16
Is a message also an object by Alan Kay's definition?

------
hevi_jos
It took multiple people to conceive OOP.

Alan Kay was the dreamer, the academic who conceived it. But it took other
people to materialize it into something practical.

Those people had different skills than Alan Kay, and they used languages
derived from C, like ObjC, Java, or C++: the C world, which, by the way, Alan
hates with a passion.

The C world was much more practical for companies than the Lisp world: you
could generate executables that are static (cannot modify themselves like
Lisp can) and can be sold as a product. They are easier to audit, and at the
time hundreds of times more efficient (academics just did not care about
efficiency at all, unlike the people who had to actually buy their own
machines).

In the end OOP became different from what Alan envisioned, which is a great
thing, by the way.

------
lincpa
FP and OO are overly complicated, and neither is feasible for large-scale
industry. They are production methods that, like hand workshops, depend
heavily on personal skill; personal skill greatly affects product quality,
which makes them extremely unreliable production methods. FP and OO are
actually a detour: highly embellished, ineffectual, and the source of all
kinds of failures.

Most OO systems are just simulations of real-world surface phenomena, and the
whole system ends up a mess. I think the good OO method is not to simulate the
real world, but to design the system correctly with an abstract, refined data
model as a prototype. For example, ggplot2 in the R language is a clear
system, with an excellent data model as its prototype. So a good OO system
leans toward a data-flow system, and I think ggplot2 would more likely have
been a data-driven plotting system if OO had not been in vogue at the time.

Excessive application of OO and FP design patterns increases complexity and
error probability and reduces performance, without any benefit. The complex
network of relationships between objects in an OO system is also difficult to
maintain.

I tend to construct systems with the simplest concepts and the most basic
techniques, syntax, and functions. For implementing my ideas, the Pure
Function Pipeline Data Flow is the simplest, most stable, most reliable, and
most readable approach. In China there is a great poet, Bai Juyi, whose
poetry even the illiterate can understand and appreciate. I hope my code can
be understood by a junior programmer even in the most complicated system.

For me, programming is the process of designing a data model that is simple
and fluent to manipulate. More than 80% of the functions in my projects are
->> threading-macro code blocks; each step is simple, verifiable, replaceable,
testable, pluggable, extensible, and easy to make multithreaded. The Clojure
threading macro provides language-level support for the Pure Function
Pipeline Data Flow.

[https://github.com/linpengcheng/PurefunctionPipelineDataflow](https://github.com/linpengcheng/PurefunctionPipelineDataflow)
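The threading-macro style described here can be roughly sketched in Python (the pipeline and its steps are hypothetical, not taken from the linked repo):

```python
# Hypothetical Python analogue of a Clojure ->> threading pipeline:
# each step is a small pure function, applied left to right.

from functools import reduce

def pipeline(value, *steps):
    # Thread `value` through each step in order, like ->> does.
    return reduce(lambda acc, step: step(acc), steps, value)

result = pipeline(
    range(10),
    lambda xs: [x * x for x in xs],             # square each element
    lambda xs: [x for x in xs if x % 2 == 0],   # keep the even squares
    sum,                                        # reduce to one number
)
assert result == 120   # 0 + 4 + 16 + 36 + 64
```

Each step is independently testable and replaceable, which is the property the comment above is claiming for the style.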

~~~
empath75
> FP and OO are overly complicated, and neither is feasible for large-scale
> industry.

This is just an absurd statement on its face given how much of the internet
runs on OO languages.

~~~
lincpa
Do you think their code has reached the level of large-industry standard
parts?

Is the level of programmers as standardized as that of blue-collar workers?

I don't think their production methods meet the standards of large industrial
production: simple, consistent, standard, Pipeline, repeatable, linear
expansion.

~~~
msla
> Is the level of programmers as standardized as that of blue-collar workers?

Is the level of journalists as standardized as that of blue-collar workers?

 _It would literally be absurd to say yes._

> I don't think their production methods meet the standards of large
> industrial production: simple, consistent, standard, Pipeline, repeatable,
> linear expansion.

You certainly have a litany of buzzwords.

