

Are we still doing OOP? - graham_king_3
http://www.hackinghat.com/index.php/programming/programming-like-its-1995

======
blub
Terrible blog post: it presents a series of special cases where OOP is not
the best fit => throw OOP away except for class libraries?

Thanks to Paul Graham's language abuse, "dead" is becoming a meme: "X is
dead" = "I don't use X".

~~~
mdg
Proper name

------
barrkel
Ironically, I don't think OOP was really meant to be used for business
objects. I think inheritance doesn't work very well with objects whose backing
store needs to fit into a relational database someplace.

OO really shines with UI programming, where incrementally building up
behaviour through inheritance makes a lot of sense.
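A minimal sketch of that kind of incremental build-up, approximated in C via struct embedding (the widget and button names here are invented for illustration, not from any real toolkit): the derived type inherits the base behaviour and then specialises it.

```c
#include <string.h>

/* "Base class": a widget knows how to describe itself */
typedef struct Widget {
    const char *(*draw)(struct Widget *self);
} Widget;

static const char *widget_draw(Widget *self) { (void)self; return "widget"; }

void widget_init(Widget *w) { w->draw = widget_draw; }

/* "Derived class": embeds Widget as its first member, so a
   Button* may be treated as a Widget*, and overrides draw */
typedef struct Button {
    Widget base;
    const char *label;
} Button;

static const char *button_draw(Widget *self) {
    Button *b = (Button *)self;   /* valid: base is the first member */
    return b->label;
}

void button_init(Button *b, const char *label) {
    widget_init(&b->base);        /* inherit the default behaviour... */
    b->base.draw = button_draw;   /* ...then specialise it */
    b->label = label;
}
```

Calling through the base pointer picks up the overridden behaviour, which is what makes building up UI behaviour layer by layer convenient.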

~~~
arethuza
If the classes in your system vary a lot in behavior but not in their internal
state then mapping to a database can be straightforward.

With lots of variations in object structure things can be horrible - witness
designs that have more tables than most tables have rows.

~~~
epochwolf
> With lots of variations in object structure things can be horrible - witness
> designs that have more tables than most tables have rows.

This is the problem document databases are designed to solve. :)

~~~
arethuza
Maybe someone should create an ERP system based on one then!

------
duncanj
1. How bizarre to consider JavaScript not object-oriented. The entire page is
represented as objects.

2. I think the worst thing that happened in OOP was the development of
ER-like methods (Rumbaugh, Booch) to "model" the system before it got built. It
doesn't clarify anything and it doesn't fit with the original idea from
Smalltalk, where software is developed in a gradual and exploratory style.
When he states you don't need to develop a "complex object class hierarchy"
for a small task, I think he is falling into this trap.

3. Obviously, other approaches are good for a number of problems, and OOP is
not the be-all and end-all. I would never avoid using it, though, because it
is good for many problems, especially UI. OOP is worst in static languages
(e.g. C++), and best in dynamic languages (Smalltalk, Ruby, JS), in my opinion.

4. Spreadsheets. I wish the VBA interface to Excel were more OO, because I
think the methods are on the wrong interfaces. There is definitely room for
improvement. OTOH, the user interface is a facade around something that may or
may not be implemented in OO. I also wish that I could refer to things in the
user interface using OO methods, kind of like using Prototype in JS.

5. In financial applications, I wonder what technique allows him to
"innovate" without his model "breaking down". He doesn't say, but it sure
seems like he doesn't like objects! (for whatever reason, he also doesn't
really say.)

6. Finally, I get the feeling that he's really saying he no longer writes a
real object model for his applications. This is, apparently, because the
object libraries he is working with have gotten so good at modeling everything
he needs that he is mostly writing glue, and it is all fitting into a few
modules of mostly procedural code. So, in other words, OO is a huge success.

------
ErrantX
As with all programming methodologies, OOP is a useful tool for the
situations where it works.

Outside of that other things become useful... this is how the world has always
worked :)

~~~
loup-vaillant
Of course. The real question is, where does it work? (And why?)

Personally, I don't know.

------
hxa7241
It is not so much OO that is receding as _inheritance_. Although that is not
recent -- it must have been around 1995-2000 that doubts were becoming
popularly known. Inheritance is somewhat interesting though: it is a
formalised mechanism or pattern for changing software function -- there
doesn't seem to be much else like that. Why did it not work very well?

As to OO, Bertrand Meyer gives the best rationale: it is about organising
software around data instead of procedure, because data is more stable. That
yields a rather abstract definition, but in that sense OO is perhaps more
alive now than ever.

What has happened is that the developer's canvas has expanded enormously. We
think and write for the WWW first now. The data types are the various sub-
parts and relations of web standards. Our software is very much organised
around these; they just don't map to small language features.

------
fleitz
The reason OO exists to the degree it does, in my mind, is that it makes it
easier for Enterprise Architects to write code in Visio; this also lends
itself to directly modeling the domain so managers don't have to think. Which
leads to:

    public class Money {
        public string Currency { get; set; }
        public float Price { get; set; }
    }

"The money class is the root of all evil"

~~~
derefr
OO models exist because they reflect how we describe things when _not_ talking
to computers. "This is Spot. Spot Is-A Dog. See Spot.run(). Spot.runs ==
:fast."

They're a crutch for people who can't (or don't have the training to) think in
terms of sets, mappings, graphs, combinators, etc.

~~~
blub
...or maybe FP is a crutch for mathematicians and purists. You're making a
huge mistake: we shouldn't adapt our thinking to computers, we should adapt
computers to our thinking and make them easier to operate. According to your
logic, writing hexadecimal code is the holy grail of computer programming.

~~~
loup-vaillant
And _you_ are making an equally huge mistake: assuming that thinking in terms
of sets, mappings, graphs, combinators, etc. doesn't help. It does. And if a
programmer is incapable of understanding those concepts, he should learn them.

The fact is, we shouldn't adapt computers to our thinking, nor our thinking to
computers. We should adapt both to our problems.

~~~
blub
Sure it helps, but that's not what derefr said. He said that any other type
of thinking is a "crutch" and that the ideal is "thinking in terms of sets,
mappings, graphs, combinators, etc".

By the way, "sets, mappings, graphs, combinators" is a pretty strange
definition for a set. The element descriptions are vague and some of them seem
unrelated to the others. That may be mathematically correct, but it doesn't
make for a great argument.

~~~
loup-vaillant
OK, I misread you.

That said, I have tried both OO and FP, and my current opinion is that
thinking more mathematically (in terms of sets, mappings…) almost always
yields smaller designs, which are almost always more flexible and more
efficient.

> By the way, "sets, mappings, graphs, combinators" is a pretty strange
> definition for a set. The element descriptions are vague and some of them
> seem unrelated to the others. That may be mathematically correct, but it
> doesn't make for a great argument.

I don't understand you here. Derefr didn't make any mathematical statement.
He didn't describe the concepts, he named them. The four items he mentioned
_are_ related. And why should they be, anyway? The way I see it, derefr just
made a statement, not an argument.

------
xtho
Probably the most disturbing thing about the article is the original title:
"Programming like it's 1995". I'd feel better if he had said "OOP is like so
1980's".

~~~
loup-vaillant
The problem is, OOP as Alan Kay envisioned it isn't widely referred to any
more. Now we all talk about OOP as if Java started it.

------
DanielBMarkham
With all respect to this author and this piece -- it was well done and he
makes many salient points. There are some topics on HN that are just starting
to bore the hell out of me. Apple fanboy-ism, or not. C++: it's complicated so
it's bad. Famous person X says this about famous person/product Y. And
bitching about OOP.

Look. OOP is just the use of categorization and set theory to organize code
and data. It doesn't have to involve a lot of wiring things together, it doesn't
have to mean tremendously huge class diagrams before coding, and it is
absolutely not related to one particular language. I can use OOP techniques in
any language, and to the degree they're necessary, I should. It's just a tool.

I understand that the OOP _movement_ got way overblown and annoying, but we as
technologists are always making overly-broad generalizations about everything
we do.
There's a tremendous amount of selection bias that goes on in technology, most
of it hidden. I remember in one of my early contracting jobs we had a PM who
was a FoxPro programmer. No matter what the problem we were talking about,
somehow he would feel that FoxPro was the best answer for that problem.

At the time I thought he was unique -- sort of a joke. Looking back over many
years of observing technologists, however? He was the rule, not the exception.
At the risk of stating the obvious, we only understand those things we have
familiarity with. This means if you haven't programmed in OOP, _or if you've
had extensive experience with OOP and you know somebody at the same experience
level with different opinions_, you should try very hard to have an open mind
and learn something from somebody else. It's very easy to make broad claims
that are unsupported by anything but a small set of observations.

Saying "Are we still doing OOP?" is like saying "Are we still using
categorization and set theory" which sounds to me a LOT like "Are we still
using arithmetic?"

The question is nonsensical.

~~~
plinkplonk
"Saying "Are we still doing OOP?" is like saying "Are we still using
categorization and set theory" which sounds to me a LOT like "Are we still
using arithmetic?""

OO is hardly equivalent, and _not_ as fundamental (or anywhere near as
fundamental) as Set Theory, let alone Arithmetic. The implied equivalence is
misleading.

You can use "categorization" and Set theory to design programs with never an
Object in sight.

Types and (mathematical) functions are both defined using Set Theory as are
Relations (as in relational databases). You could program in Haskell with only
types and functions and monads and such with no "object oriented design". The
primary difficulty for people who've _only_ done (what some call) "OOP" is
forgetting all the OO technique they learned.

At best you could say that objects (for various definitions of "OO") could
_also_ be expressed with Sets and (mathematical) functions and such. Almost
every programming language feature could be so expressed; that isn't saying
much.

There is no specifically "Object Oriented" super technique that transcends all
languages (both OO and non OO) and is applicable in all of them. You'd be
cutting against the grain of many languages if you tried to think in OO while
using them. Like Fortran, you can write "OO" in any language, and it isn't
always a good thing to do.

Not defending the author of TFA (I don't read these "Ohh, technique X or
language Y is so outdated" type of articles any more), but the quoted
sentence (' "Are we using OO" is like "Are we using Set Theory" ') is as
generic and misleading as anything the author claimed. Balance in all things.

~~~
DanielBMarkham
Clarification: you don't need an object to do OOP.

Here's a wonderful example. Let's say you're in C. You start writing code and
continuously refactor as you go along. What most people find is, after a good
bit of coding and refactoring, you end up with code grouped in modules with
public and private functions, public and private data, etc. You can have data
abstraction, encapsulation, modularity, polymorphism, and inheritance -- all
to varying degrees, depending on the problem and solution.
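A small illustration of what that kind of C refactoring tends to converge on; the counter module below is a made-up example, with the struct layout playing the role of private data and the functions forming the public interface:

```c
#include <stdlib.h>

/* In a real project the struct definition would live in counter.c,
   with only "typedef struct Counter Counter;" in the header, making
   the layout private to the module. */
typedef struct Counter {
    int value;
} Counter;

/* "Public" interface: the only sanctioned way to touch a Counter */
Counter *counter_new(void) {
    Counter *c = malloc(sizeof *c);
    if (c) c->value = 0;
    return c;
}

void counter_incr(Counter *c) { c->value++; }
int  counter_get(const Counter *c) { return c->value; }
void counter_free(Counter *c) { free(c); }
```

Callers never reach inside the struct; they go through the functions, which is exactly the encapsulation that object languages later made a first-class feature.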

OOP takes that categorization and set work and makes it part of the language.
But programming in objects -- true OOP -- does not require a language that
provides objects. In fact, it's the other way around: because people
programmed in objects, languages started providing that capability. Where
folks get lost in the woods is where they think the language features are OOP
instead of the conceptual work. Languages are just varying levels of syntactic
sugar on top of the core tools.

Hope that clears it up for you

~~~
plinkplonk
" But programming in objects -- true OOP - does not require a language that
provides objects. "

Without getting into some mystical definition of "True OOP": you don't need
_any_ OOP to program or design.

"data abstraction, encapsulation, modularity, polymorphism, and inheritance"

None of these things (except inheritance maybe) is specific to OOP. All of
them occur in all kinds of languages and paradigms with no objects or "OOP".

As to

" (with C programs) after a good bit of coding and refactoring, you end up
with code grouped in modules with public and private functions, public and
private data, etc. you end up with code grouped in modules with public and
private functions, public and private data"

Sure if you refactor that way.

You could also end up with a collection of rules (as in Prolog), polymorphic
types and functions (as in Haskell), communicating processes (Occam/Erlang),
etc., with no "OOP" anywhere -- unless you want to claim that any and all use
of polymorphism or abstraction is indicative of some all-pervading mystical
"True OOP" technique. There is no "True OOP" underlying all use of (say)
abstraction or polymorphism.

I think people who've done only OO or primarily OO are as bad as people who've
never done any OO or rarely do any OO in making outrageous claims.

Grandiose claims opposing _or favoring_ specific subsets of programming
technique [example 1: "Are we still using OO? WTF"; example 2: "Saying are we
using OO is like saying are we using Arithmetic"] don't stand up to scrutiny.

Objects are a useful abstraction. So are functions, logic statements,
relations, processes, polymorphic types... Each in its place. (And balance in
claims ;-) )

~~~
DanielBMarkham
Plink, I'm not really sure what you're going on about here.

Yes, you can refactor in lots of ways. Some rely more on OOP, some on FP, some
on rules-based or constraint-based programming, etc. This is my point -- the
tools are there regardless of the language features. Some languages just make
this easier or more difficult. No mysticism required :)

Objects _are_ a useful abstraction at times, but "object" does not and should
not map directly to some language feature. If you think that then perhaps
you've done a great job of learning a language and not-so-good-a-job learning
OOP.

As for the "people who've done only OOP" part? I take it you mean me? Dude,
I've only been doing functional programming in F# for the last couple of years
or so. Found that my OOP chops came in very handy with data structures and
functional composition. But perhaps you didn't mean me. I'm not sure.

I'm done here. The only reason I took this thread deeper is that there is a
useful thing for HN'ers to learn. The tool is bigger (and more useful) than
the language. Don't confuse the concept with the application.

~~~
loup-vaillant
Looking at this thread, it seems your main disagreement is about the
definition of OOP. I won't provide one, mind you. But such disagreements are
so widespread that we may want to stop using the term OOP altogether, and only
use more narrowly defined terms.

~~~
DanielBMarkham
No rhetorical trick intended.

It's a definition thing, sure. But it's important to realize that there's no
trickery going on here. There's a very important point.

I get this all the time, and from all aspects of software engineering. As an
example, it's become quite fashionable to say something along the lines of
"use-cases suck!"

However use-case analysis is just a way of thinking about system
functionality. Most people, when pressed, actually mean "we hate these
monstrous word templates and rigorous bullshit that people make us do and call
them use cases"

To which I do not disagree. But that's confusing the application with the tool
again. To insist that they are the same is to throw out huge hunks of software
engineering simply because a few people got into them and made them overly-
complicated and onerous to use.

Or put another way: I'm sure some folks would like to describe OOP in terms of
overly-complex class diagrams, lots of wiring, un-manageable systems, yadda
yadda. Some other folks would like to describe OOP in terms of small systems
pragmatically created bottom-up. It can be any or all of that. OOP is actually
constructing solution code in a pattern that supports encapsulation,
polymorphism, information hiding, etc. The _idea_ is a conceptual tool, a code
construction pattern of thinking. The _implementation_ can be all sorts of
things. Various implementations suck to more or less degree. But that's really
not important. What's important is whether this tool is useful right now, do I
know how to use it, and which parts work and which parts get in the way of
where I'm going.

I understand that this can sound like sophistry, but it's not.

~~~
stonemetal
_OOP is actually constructing solution code in a pattern that supports
encapsulation, polymorphism, information hiding, etc. The idea is a conceptual
tool, a code construction pattern of thinking._

Yeah, except Alan Kay, you know, that guy who invented OOP, says everything
you mention there is the overly complex BS un-OOP. OOP is late binding and
message passing. It isn't the hair splitting you seem to think it is; they
are truly different philosophies of programming. What you are calling OOP is
more like Structured Programming Redux: strong on the structure and light on
the dynamic message passing and late binding.
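For contrast, a toy sketch of the message-passing style (all names invented for illustration): the method actually run is looked up by message name at call time, so the binding happens late, at runtime, rather than being fixed at compile time.

```c
#include <string.h>

/* An object that receives messages by name; which function runs is
   decided only when the message arrives (late binding). */
typedef struct Method { const char *name; int (*fn)(int); } Method;
typedef struct Object { Method *methods; int n; } Object;

static int obj_send(Object *o, const char *msg, int arg) {
    for (int i = 0; i < o->n; i++)
        if (strcmp(o->methods[i].name, msg) == 0)
            return o->methods[i].fn(arg);
    return -1;  /* "message not understood" */
}

static int double_it(int x) { return 2 * x; }
static int negate(int x)    { return -x; }

static Method table[] = { {"double", double_it}, {"negate", negate} };
static Object obj = { table, 2 };
```

The caller names a behaviour rather than a function; the receiver decides what that message means, including the case where it doesn't understand it at all.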

~~~
DanielBMarkham
Alan Kay? Well gee, you should have said something. Here I was using working
definitions and Alan Kay already had the answer.

Either you understand how silly you sound or you do not. I suspect the latter.
Perhaps you and Alan could continue the discussion.

~~~
stonemetal
What can I say, a silly response to a silly post. Information hiding,
encapsulation, and polymorphism were software engineering principles before
OOP was invented. The reasoning "using X == good software engineering,
therefore X is the true definition of OOP no matter how you program it, and
all that bad OOP code is just wolves in OOP clothing" is humorous. Good clean
structured code may look similar to OOP code, but if the author was not
oriented on objects then it isn't OOP code. OO is more about how you get
there than where you end up.

------
elblanco
Yes, next question?

