Until I hit the Java scene. Boy, is it a big mess. There I found some of the most unreadable, unmaintainable, ungrokkable, complex and brittle code I've ever seen. And that's not the exception; it's the standard. I suddenly understood why Python's logging implementation is such a disaster: it was ported from Java. I'm sorry if this offends the Java programmers, but it feels like idiomatic Java is just abstractions piled on abstractions upon patterns, in the hope of somehow making things more understandable or robust. It doesn't.
I've always thought of OO as simply abstraction and encapsulation. Functions on steroids, if you will. If something doesn't fit the OO paradigm, I don't try to shoehorn it in (which design patterns seem especially made for). If you manage to avoid doing that, OO works rather well, if you ask me.
This is my experience whenever I argue against Object Oriented Programming (OOP): no matter what evidence I bring up for consideration, it is dismissed as irrelevant. If I complain that Java is verbose, I’m told that True OOP Programmers let the IDE take care of some of the boilerplate, or perhaps I am told that Scala is better. If I complain that Scala involves too much ceremony, I’m told that Ruby lacks ceremony. If I complain about the dangers of monkey-patching in Ruby, I’m told that True OOP Programmers know how to use meta-programming to their advantage, and if I can’t do it then I am simply incompetent. I should use a language that is more pure, or a language that is more practical; I should use a language that has compile-time static type checking, or I should use a language that gives me the freedom of dynamic typing. If I complain about bugginess, I’m told that those specific bugs have been fixed in the new version (why haven’t I upgraded?), or I’m told there is a common workaround, and I’m an idiot if I didn’t know about it. If I complain that the most popular framework is bloated, I’m told that no one uses that framework any more. No True OOP Programmer ever does whatever it is that I’m complaining about.
It would be quite simple to turn the argument around by doing the exact opposite: cherry-picking nice things from OO languages and warts from FP languages, and then using the same Scotsman argument in the other direction.
Arguments about whole categories of languages, when you don't limit yourself to the least common denominator, get stupid pretty fast, since it is always possible to find one example or another that fits whatever point you want to make.
Any attempt to fuse aspects of other languages or paradigms with OOP, in order to mitigate the listed problems, is dismissed as "not OOP", which conveniently serves to illustrate that OOP sucks.
My experience is that any pure paradigm has limited context in which it's ideal and many pitfalls outside that context. OOP is no different. Foolish adherence to consistency, hobgoblin of little minds, etc.
Where I will agree, at least in spirit, is that OOP has a few different aspects, including the implementation in a given language, the current understanding of best practices for a given context, and the cult of context-independent "design correctness". That last one has major issues.
When the cult drives the implementation, you get high-ceremony languages like Java. And when the cult drives best practices, you get maintenance issues like those currently being acknowledged around invasive unit testing, etc.
The key there isn't the design paradigm, IMO, but the fallacy of trying to apply a best practice without understanding why it's useful or analyzing whether it's the right thing to do for the current situation.
An analogy would be normalizing a relational database. There's a "correct way" to do it, which in theory reduces maintenance if you go to third+ normal form. In practice, knowing when to relax normalization is the difference between success and a mess. So goes DRY, SOLID, etc.
I'll sink back into the corner now....
> Above, I have, with some humor, suggested that proponents of OOP tend to indulge the No True Scotsman fallacy when arguing for OOP. On a more serious note, some proponents of OOP might try to defend themselves by suggesting that we are facing issues of commensurability: proponents of the pure-logic paradigm, or the functional paradigm, or the OOP paradigm, talk past each other because we cannot understand the axioms on which each other’s arguments rest.
> I am willing to believe that this issue explains some of the disconnect between some of the more thoughtful proponents of the different styles. And yet, there has to be something more going on, since the empirical evidence is so overwhelmingly against OOP. Proponents of OOP are arguing against reality itself, and they continue to do so, year after year, with an inflexibility that must have either non-rational or cynically financial sources.
A: Java is too verbose.
B: Java isn't really an OOP language.
This is more like:
B: Sure, but that's a statement about Java, not a statement about OOP.
If you list a bunch of criticisms of OOP languages, that doesn't mean that those criticisms automatically become criticisms of OOP as a concept, merely those of the languages themselves. Especially when you can find OOP languages that don't match that criticism. Java is verbose, yes. But it doesn't mean OOP languages are verbose (what the article author would like to claim), and it doesn't mean that Java isn't an OOP language (what the author believes other people use to deflect the criticism). It means that Java is verbose.
And I doubt that anybody claims Java isn't verbose, or that the IDE "handles" it--regardless of who's doing the typing, Java is verbose. The only people that hold up Java as a good example of OOP are people that don't really know any other version.
You: OO is flawed because of x
Me: But OO language a doesn't have a problem with x
You: Ah but language a has problems with y
Me: Yes but language b doesn't
You conclude OO is inherently flawed
I conclude that there are no languages without flaws...
On the other hand, if you say "I'm trying to do X - what's a good language to use?" then we can have a conversation. Some of the flaws are irrelevant in some projects, but roadblocks in others.
Debating the pros and cons of languages without the context of a project is like discussing the best form of transport without having any idea of the journey.
Right now I've gotten to the point where he writes "We should note the irony that they are using Linux to explain OOP concepts, even though Linux is written in C, which is not an OOP language."
And yet... it implements OO concepts throughout the kernel. He then quotes Torvalds' comment about C++, yet does not mention that OO concepts are used.
This guy references the most random things. Some guy on HN and a Kuro5hin article from 2002 (he ought to look up my article on Buddhism, which shows that anyone could write anything on K5)...
No offense, but this massive rant could do with a copyeditor. More brevity would help me understand what point he is trying to make.
Interestingly enough, 30,000 words is also a common 'crisis point' among aspiring writers. Maybe that's what cut the article short.
Sure OO brought a lot of little code smells too, and we often end up making bad code, but it's not like we have to. Polyglot programmers that use e.g. Java+Scala or C#+F# probably make their OO code much better than others. I have almost completely stopped using mutable objects, long inheritance chains and nulls in C#, as an influence from F#.
For some scenarios, having mutable objects is a near-perfect fit (scene graphs for games or UIs are good examples of code where both FP and non-object imperative styles usually look worse).
Disregard or play down the contents of this fantastic post at your peril. Understanding it and grokking it could save you 5 years, 10 years or even a lifetime of writing software in the wrong way.
Otherwise I'd like a batch of whatever it is you're smoking.
Not only is the OP mistaken for almost a troll post, but anyone who actually warns "er, no guys, this post is for real and you need to understand it or at least respect it to move forward in your careers" is also mistaken as "just being funny".
Just to clarify. No the OP is not a troll post or anything like it. It is deadly serious. And I am also being completely serious.
The OP is just someone who reached the end of his learning path of OOP. He found all its flaws and weaknesses and then discovered there is a whole other world out there called FP and that there are even languages which combine the best parts of OO and FP into one. Why is this considered a problem? Some people almost seem insulted by it. He is just trying to help you guys realise that the floor on which you're standing is not quite as solid as you mistakenly think.
As I said, it is a story of enlightenment. Discredit it at your peril, as you are only harming your own self development as a programmer. There is nothing wrong in learning FP and adding it to your toolbox. This doesn't mean you have to "give up" on OO. It just means you have two tools to select from rather than just one.
Observation: Unless they are famous or head up some popular OSS project, it seems that software developers are incapable of identifying or even respecting those who are clearly more knowledgeable/experienced than themselves.
Oh, as for "us guys"--guess what? Some of "us guys" use the appropriate tool for the job, whether it's OOP, FP, IP, etc. and have been doing so for decades.
So do Java, Ruby, C#, ...
Why do FP folks always try to sell the idea that "OOP=bad / FP=good"? I'm fond of OOP, but I'd never say "FP=bad"; that would be ridiculous. FP concepts are interesting; I'd even say FP is fun.
Both are tools in my box.
Now if the problem is Java, good, we can talk about Java's shortcomings. But Java is only one implementation of OOP concepts.
This is something I miss from the discussion: for each computational problem there is one tool which is "best" to solve it. Sometimes it is some esoteric Haskell magic and sometimes it is some lowlevel C++ bit manipulation.
Git (C) has eaten up both Darcs (Haskell) and Mercurial (Python), even though Python and Haskell are "clearly" superior languages, according to some.
While I understand the notion of pure FP, shouldn't we say that FP is more of a toolbox that can be used in many languages, some making it easier than others? I mean, FP in Java 7 is a pipe dream, while any language that has lambdas can be considered functional. Or should lazy evaluation be another precondition?
That's a question.
I would identify a language as distinctly imperative if it has mutable state and has statements, e.g. operations with side effects that have no return values (ex: loops, ifs, void methods) -- not to be confused with expressions, which may have side effects but always have return values.
So, if you look at the matrix of possibilities based on these classifications, a language can be any combination of (OO or not OO) and (functional and/or imperative).
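The statement-vs-expression distinction above can be sketched in a toy example (Python used purely for illustration; the function names are invented):

```python
# Imperative style: statements mutate local state and have no value of
# their own -- the loop and the assignment are side effects.
def count_uppercase_imperative(s):
    count = 0
    for ch in s:          # the loop is a statement; it returns nothing
        if ch.isupper():
            count += 1    # side effect: mutates local state
    return count

# Expression style: the whole body is a single expression with a value.
def count_uppercase_functional(s):
    return sum(1 for ch in s if ch.isupper())

print(count_uppercase_imperative("Charlie Brown"))  # 2
print(count_uppercase_functional("Charlie Brown"))  # 2
```

Both compute the same thing; the classification is about *how* the computation is expressed, not what it does.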
It is trickier yet when many languages support multiple paradigms but encourage a subset. Or if you choose to restrict functional to require anonymous functions _without_ syntactic sugar.
Does that help though?
It would be interesting to do a contest where each month a jury selects one of those problems and everyone focuses on that problem. The "boilerplate" is already there.
«OOP to me means only messaging, local retention and protection and hiding of state-process, and extreme late-binding of all things. It can be done in Smalltalk and in LISP. There are possibly other systems in which this is possible, but I'm not aware of them.» (E-mail to Stefan Ram, 2003, http://www.purl.org/stefan_ram/pub/doc_kay_oop_en)
I have bookmarked this good ol' OOP bashin' and I am considering it a must-read going forward, because I love me some OOP bashin'.
if (NULL == lpDataPointer)
    return FALSE; /* Return FALSE */
return TRUE;      /* Return TRUE */
I know I'm kind of repeating the author's No True Scotsman metaphor, but my point is, you can't kill the return operator or, say, code comments just because the majority of code around us tends to misuse them. Unfortunately for the OOP paradigm, it makes multiplying even bigger entities easier, and at the same time its learning curve invites lo-fi coding.
Who's to blame?
return (NULL != lpDataPointer);
The expression problem is not solved by every language, but basically because Lisps and Haskell solve it in their own ways, it is concluded that OOP is worthless.
There are a number of practical reasons OOP could be liked, other than the marketing of it to the Enterprise. It's a natural way to model real things. The syntax lends itself to things like auto-complete (foo.ba<tab>). And being liked is enough to make it not-awful if the people who like it are skilled enough -- it's sufficient even to say it can be better than other paradigms for some people.
Don't mistake that for me saying that OOP is the best at anything, but it's not the worst at everything.
An unreasoned rant that's more parody than substance. Seems more a derivative pile-on than anything new or valuable.
This, I thought before even reading the article. I know that frustration myself too well, but I have to admit I am often in the wrong, asking the wrong questions, having false preconceptions.
Looking at different implementations, it's clear the experts aren't all that sure either. And since most implementation specifics aren't inherent to OOP, language criticism in lieu of criticism of OOP concepts is clearly a strawman. Then I stopped reading.
I would have to take a few guesses to know what an AbstractFactory is. That is a "J2EE Javaism" and always has been. So you aren't alone there ;)
I might have missed one or two points, but it doesn't really matter.
Oh, put your sense of superiority away!
I had to bail out about 1/3rd of the way through. Wow, what a rant.
I don't mind the invective, and I'm becoming more skeptical of OOP the more I see larger-scale FP apps work. Mutable state and hidden dependencies are killing us. But I do not like the way this essay is arguing its case. Please do not trot out famous people, tell me their ideas, and then show me how things did not work out for them.
Frankly, I couldn't care less what Alan Kay thinks of OOP. Or Bjarne. Or any of the others. I'm sure they're nice people, and probably a million times smarter than I am. What I'm interested in is this: why do people who know multiple ways of creating a solution pick OOP? And does it fulfill the implied promises it makes to those folks?
The classic example is the statement "OOP is required for creating and maintaining large systems, because it forces developers to think about where things go first, ahead of what the system does."
So here we are presented with a problem: large-scale systems with lots of developers have a difficult time working without a clear definition of where everything goes and how it works together. From my experience that seems like a reasonable problem to consider. So let's talk about that problem, how OOP helps, and how OOP falls down on the job.
Repeat and rinse. Then we get a list of real-world problems that choosing OOP is supposed to help with. We also can start creating some success criteria for both OOP and non-OOP projects.
Otherwise, our arguments get caught up in personalities and theory-of-the-universe crap that doesn't really go anywhere. I can line up a dozen experts, cherry-pick some quotes, and rant away. A person disagreeing with me can do the same. Then we can each talk about how the theoretical system the other guy's way of solving things is built on a shaky foundation. The other guy will demonstrate that this is incorrect by citing an example. It's not productive.
The reason it is not productive is that it is trying to make a universal case based on theory, as if programming were some sort of extension of calculus or geometry. Yes, I understand the association with category theory and such, but programming is the act of multiple humans coming together and creating some better way of doing things for another group of humans. Yes, languages are mathematical, but programming isn't. You want better programming, you'd better start listing out a bunch of ways humans screw the pooch when they're trying to make stuff, then try to help them stop doing that. You don't trot out the set theory books. Wrong answer.
Think in terms of machine language. At the end of the day, bits gotta go somewhere -- both data bits and coding bits. Instead of trying to say "well, we always do things this way because $FamousAuthorX told us", it might be much better to say "we have this process that allows us to dynamically change the method of grouping that proceeds like so"
This is much better because it starts with the problem and lets the solution evolve. Instead of starting with already knowing the solution and then just taking the problem and a big hammer and making it all work [insert long argument for mixed-mode languages here, like Ocaml or F#]
ADD: I also note that developers love making things complicated, and OOP is like a massive playland where you can run off and make complicated stuff all day long. Become your inner architecture astronaut. This is extremely difficult for many to resist.
The biggest obstacle is the anti-intellectualism which weighs down every argument. Very few people even want to learn something new, they just want the ego boost of knowing they shouldn't have to.
So now I lead a study group reading through SICP and I call it a day. Better to train up a cohort of people who want to learn than argue with those who even if proved 100% wrong will never budge.
Key question to ask yourself: assuming you had a good team that could talk about things, if the other three guys on your team wanted to try something new, but you thought it was a bad idea, would you keep an open mind and give it a good shot? Or would you dig in and come up with the 74 thousand reasons it sucks?
There are a lot of really smart guys out there that can argue any subject from any angle better than anybody else in the room. They view conversations as debates and discussions around direction as something that can be "lost" or "won". There's a right and wrong answer, and these guys are almost always right.
Don't be one of those guys.
^1 «Object oriented software construction is the building of software systems as structured collections of implementations of abstract data types.» Bertrand Meyer, Object-Oriented Software Construction, 2nd edition, 1997.
scala> def hasUppercase(s: String) = s.exists(_.isUpper)
hasUppercase: (s: String)Boolean
scala> hasUppercase("Charlie Brown")
res2: Boolean = true
scala> hasUppercase("charlie brown")
res3: Boolean = false
Please note that, below, when I refer to a multi-paradigm language, such as Scala, as an OOP language, I am specifically referring to the OOP qualities in that language. And I would like you to ask yourself, if you use a multi-paradigm language to write in the “functional” paradigm, are you actually gaining anything from the OOP qualities of that language? Could you perhaps achieve the same thing, more easily, using a language that is fundamentally “functional”, rather than object oriented?
But that's not the point, the Scala code provided there is beyond ignorant, whoever wrote that has no right to be judgemental of the language or its founding paradigms.
> Could you perhaps achieve the same thing, more easily, using a language that is fundamentally “functional”, rather than object oriented?
(That was easy. :-D)
IT'S LITERALLY JAVA SYNTAX, COMPLETELY INVALID SCALA.
Let's say that you want to design a CRUD app, but you're not going to use OOP. What are some of the ways you could choose to structure your code? Would you still use a pattern like MVC?
Modularity and design-by-contract are better implemented by module systems ( http://en.wikipedia.org/wiki/Standard_ML#Module_system )
Encapsulation is better served by lexical scope ( http://en.wikipedia.org/wiki/Scope_(computer_science)#Lexica... )
Data is better modelled by algebraic datatypes ( http://en.wikipedia.org/wiki/Algebraic_data_type )
Type-checking is better performed structurally ( http://en.wikipedia.org/wiki/Structural_type_system )
Polymorphism is better handled by first-class functions ( http://en.wikipedia.org/wiki/First-class_function ) and parametricity ( http://en.wikipedia.org/wiki/Parametric_polymorphism )
As for an alternative to "CRUD app using MVC", I'd probably recommend Functional Reactive Programming ( http://en.wikipedia.org/wiki/Functional_reactive_programming ). MVC is a way to architect interactive simulations and games, developed in the live environment provided by Smalltalk.
However, I imagine your intention was for something closer to the server-side code of a form wizard on a Web page, rather than a game. In which case, I'd avoid MVC-style approaches completely, since they're inappropriate. It's much more straightforward to model servers as UNIX pipelines turning HTTP requests into HTTP responses ( http://en.wikipedia.org/wiki/Pipeline_(Unix) )
Pipelines turn out to be a great fit for functional programming: lots of small, single-purpose pieces, composed together into a server into which requests flow and out of which responses flow.
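A toy sketch of that pipeline idea (Python for illustration; the stage names, dict fields, and the whole "protocol" are invented for the example):

```python
# Each stage is a small function from request to request (or response);
# the server is just their composition.
from functools import reduce

def parse(req):
    # turn the raw query string into a params dict
    return {**req, "params": dict(p.split("=") for p in req["query"].split("&"))}

def authenticate(req):
    # attach a user, defaulting to anonymous
    return {**req, "user": req["params"].get("user", "anonymous")}

def render(req):
    # produce the final response
    return {"status": 200, "body": f"hello, {req['user']}"}

def pipeline(*stages):
    return lambda req: reduce(lambda r, stage: stage(r), stages, req)

handle = pipeline(parse, authenticate, render)
print(handle({"query": "user=alice"}))
# {'status': 200, 'body': 'hello, alice'}
```

Each stage can be tested in isolation, and swapping or inserting a stage is just re-composing the list.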
As for "best practices", there are some interesting answers on http://stackoverflow.com/questions/842026/principles-best-pr...
There certainly aren't as many 'design patterns' style things in functional programming as there are in OO. In general, it seems to be because:
- FP systems are much smaller than OO systems (due to conciseness and not as much large-scale use)
- FP favours immutability over encapsulation, so you can re-use existing map/fold/filter/etc. functions rather than having to rewrite them in small pieces spread throughout your app. Hence you don't need to decide how they're implemented.
- Pure functions take everything in via arguments and give everything out via return values. There aren't many ways to get that wrong. In OO any method can affect the application in all kinds of ways, so there are all kinds of ways to get it wrong, requiring best-practices to avoid them.
- In FP, we're generally turning some input into some output. If we're writing code that doesn't seem to get anything from our input closer to something in our output, we should stop and think for a minute. In OO we're encouraged not to think in terms of solving a problem: instead we try to define ontologies and hierarchies, implement models of the domain, etc. Only then, once we've bootstrapped a simulation of our business entities and their interactions, do we write the actual code for the problem we have: `Order::newFromFile('input.csv').ship()`
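The map/filter/fold reuse mentioned above can be shown in a few lines (a Python sketch; the data and field names are made up):

```python
# With immutable data and plain functions, the generic filter/reduce
# machinery is reused as-is instead of being re-implemented as methods
# scattered through a class hierarchy.
from functools import reduce

orders = [
    {"id": 1, "total": 20.0, "shipped": False},
    {"id": 2, "total": 55.0, "shipped": True},
    {"id": 3, "total": 10.0, "shipped": False},
]

# pure input -> output, no hidden state to set up or tear down
unshipped_total = reduce(
    lambda acc, o: acc + o["total"],
    filter(lambda o: not o["shipped"], orders),
    0.0,
)
print(unshipped_total)  # 30.0
```

There is no decision to make about where this logic "lives"; it is just a function of the data.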
* Real-World Functional Programming - With Examples in F# and C# by Tomas Petricek
This helped me a lot because I'd been working solidly with C# for 8 years leading up to it (C++ before, and C before that), so I was very much in C mode. It presents reasons for the functional approach and shows side-by-side C# and F# examples of the various topics covered. It also takes a more practical approach, rather than 'yet another fibonacci example' which you'll see on lots of functional programming tutorials.
* Programming in Haskell by Graham Hutton
Basically a text-book on learning Haskell, but covers a lot of fundamentalist functional thinking.
* Pearls of Functional Algorithm Design
Once you have the basics of functional programming down, go for this. It teaches you how to think about reducing problems.
I've got the Haskell Craft book but haven't practiced enough. It seems good enough for me but, as people are learning more and more about using OO and FP in the real world (thus such criticism), I would like to keep current.
It would be good to define the absolute essence of OOP first, because in 30 years it has been overloaded with so many different meanings that it means everything today. For instance:
- It was hype/next-big-thing/snake-oil-salesman material: CORBA and COM used to be the silver bullets in the 90's which would solve all software development problems by building applications from reusable components which could interact with each other (but in reality, the only thing that came out of it was embedding Excel tables into Word documents). If this is the essence of OOP, then yes, it deserves to die.
- Is it the software design principle of thinking 'object oriented'? Then this is already a two-edged sword. The main problem here is falling into the trap of trying to model a software system after the real world (the animal-cat-dog problem). Are inheritance, polymorphism, encapsulation, and message passing bad things? I would say mostly not; they can be abused, but most of them solved actual real-world problems.
- Are the OOP language features the essence of OOP? 'Everything is an object'? Multiple inheritance? Here it gets fuzzy because languages (and their standard frameworks) are so radically different.
To me the pragmatic essence of OOP is that it groups the data and the code which works on that data. If there's a pure C library which has a function to create some sort of 'context' and then has functions which take a context pointer as argument, then this is object-oriented to me. Most other features, like classes, inheritance and runtime polymorphism, are just sugar on top; useful in some cases, but not the essence.
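That "context pointer" style can be sketched in a few lines (Python here, standing in for the C pattern of fopen/fread/fclose; every name is invented for the example):

```python
# Plain functions that all take the same explicit context as their first
# argument -- the data and the code that works on it are grouped by
# convention, without classes.

def counter_create(start=0):
    return {"value": start}          # the 'context': just data

def counter_increment(ctx, by=1):
    ctx["value"] += by               # code that works on that data

def counter_read(ctx):
    return ctx["value"]

ctx = counter_create(10)
counter_increment(ctx, 5)
print(counter_read(ctx))  # 15
```

Rename `ctx` to `self` and this is, in essence, what a method call desugars to in most OO languages.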
Even with this simple definition, it is easy to fall into the real-world-object trap by designing complex systems as some sort of object soup communicating through messages. That's hard to debug and optimise for performance (but it's still easier to maintain than having one big soup of global functions and variables).
Data-oriented-design is one obvious answer to fix some OOP-design-problems. First concentrate on the data, not the code, divide it into 'hot and cold' data, group by access patterns, think about cache misses. After that, the code that works on that data usually 'writes itself'.
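A minimal sketch of the hot/cold split described above (Python used only to show the shape of the idea; real cache behaviour would need a systems language, and all the names are invented):

```python
# Array-of-structs (typical object layout): hot data (positions)
# interleaved with cold data (names) in every record.
entities_aos = [
    {"name": "player", "x": 0.0, "y": 0.0},
    {"name": "enemy",  "x": 5.0, "y": 2.0},
]

# Struct-of-arrays: fields grouped by access pattern instead.
names = ["player", "enemy"]   # cold: rarely touched
xs = [0.0, 5.0]               # hot: touched every frame
ys = [0.0, 2.0]

def move_all(xs, dx):
    # the per-frame loop walks only the hot array
    return [x + dx for x in xs]

xs = move_all(xs, 1.0)
print(xs)  # [1.0, 6.0]
```

Once the data is laid out this way, the code that works on it tends to become simple bulk loops, which is the "writes itself" effect mentioned above.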
And, IMHO OOP itself is completely orthogonal to the really hard problems: memory safety and lifetime management without runtime overhead, concurrency, reducing complexity and improving maintainability of big software systems.
Your mileage may vary of course, my background is games programming, and I started with Z80 assembly, dabbled a bit with Pascal, Forth and Lisp in the 80's before coming to C, then C++ and Python, and I tend strongly towards Rust for future work. I'm only using the parts of C++ that I feel improve on raw C (there once was an object-oriented UI framework on the Amiga called Intuition which bolted classes and methods on top of C, that was the point where I thought that it's really time to learn some C++).
You nailed it. It's just more elegant to group things as a class than having fopen, fread, fclose where you pass a resource as the first argument. Polymorphism is then the consequence of that design, since you could have the same interface for a file, a socket, an archive, or whatever. In use, you can ignore the nature of the resource and just remember it can be opened, read, or closed, which allows you to write abstract logic on top of it.
> Are the OOP language features the essence of OOP? 'Everything is an object'? Multiple inheritance?
If an object can be reduced to a symbol, like a void pointer to data or a function that returns data, and this object can be related to others in a graph, then that's still general enough.
Imperative programs as well as object models can be well organized as graphs. The CPU or bytecode doesn't know about objects anyway. So it's an abstraction, and you need to apply it in a moderate fashion, depending on the complexity of the problem; for simple or efficient procedures, the imperative style is not to be replaced by OOP. fopen, fread, etc. are all abstractions of underlying system calls, but depend on speed, so light OOP is a good fit there.
As far as I can tell, the OP has seen something working in practice, and has set out to demonstrate why it doesn't work in theory.
Functional programming can be really awesome for some problems, as can OOP, imperative and declarative.
It's almost as if the site was designed to be solely read on some tiny-screened mobile device or something, with all that text squished in the middle.