Having programmed a lot of OO in Python and PHP (yeah, yeah, I know, but its OO implementation is actually rather good), I always felt OO was a pretty good way of doing things...
Until I hit the Java scene. Boy, is it a big mess. There I found some of the worst unreadable, unmaintainable, ungrokkable, complex and brittle code I've ever seen. And that's not an exception; rather, it's the standard. I suddenly understood why Python's logging implementation is such a disaster: it was ported from Java. I'm sorry if this offends the Java programmers, but it feels like idiomatic Java is just abstractions piled on abstractions upon patterns, in the hope of somehow making things more understandable or robust. It doesn't.
I've always thought of OO as simply abstraction and encapsulation. Functions on steroids, if you will. If something doesn't fit the OO paradigm, I don't try to shoehorn it in (which patterns seem to be especially made for). If you manage to avoid doing that, OO works rather well, if you ask me.
One of the major points of the article is the "no True OO Programmer" fallacy:
this is my experience whenever I argue against Object Oriented Programming (OOP): no matter what evidence I bring up for consideration, it is dismissed as irrelevant. If I complain that Java is verbose, I’m told that True OOP Programmers let the IDE take care of some of the boilerplate, or perhaps I am told that Scala is better. If I complain that Scala involves too much ceremony, I’m told that Ruby lacks ceremony. If I complain about the dangers of monkey-patching in Ruby, I’m told that True OOP Programmers know how to use the meta-programming to their advantage, and if I can’t do it then I am simply incompetent. I should use a language that is more pure, or a language that is more practical, I should use a language that has compile-time static data-type checking, or I should use a language that gives me the freedom of dynamic typing. If I complain about bugginess, I’m told that those specific bugs have been fixed in the new version, why haven’t I upgraded, or I’m told there is a common workaround, and I’m an idiot if I didn’t know about it. If I complain that the most popular framework is bloated, I’m told that no one uses that framework any more. No True OOP Programmer ever does whatever it is that I’m complaining about.
One really big problem with how your argument is structured is that you cherry pick warts from OO languages, while at the same time cherry picking good solutions from FP languages.
It would be quite simple to turn the argument around by doing the exact opposite thing: cherry-picking nice things from OO languages and warts from FP languages, and then using the same argument about Scotsmen in the other direction.
Arguments about whole categories of languages, when you don't limit yourself to the least common denominator, turn stupid pretty fast, since it is always possible to find one example or another that fits the point you want to make.
This was my take as well. As many good points as there are in the article (and there are many), the whole thing is "No True Scotsman" itself.
Any attempt to fuse aspects of other languages or paradigms with OOP in order to mitigate the listed problems are dismissed as "not OOP" to serve the purpose of illustrating that OOP sucks.
My experience is that any pure paradigm has limited context in which it's ideal and many pitfalls outside that context. OOP is no different. Foolish adherence to consistency, hobgoblin of little minds, etc.
Where I will agree, at least in spirit, is that OOP has a few different aspects, including the implementation in a given language, the current understanding of best practices for a given context, and the cult of context-independent "design correctness". That last one has major issues.
When the cult drives the implementation, you get high-ceremony languages like Java. And when the cult drives best practices, you get maintenance issues like those currently being acknowledged around invasive unit testing, etc.
The key there isn't the design paradigm, IMO, but the fallacy of trying to apply a best practice without understanding why it's useful or analyzing whether it's the right thing to do for the current situation.
An analogy would be normalizing a relational database. There's a "correct way" to do it, which in theory reduces maintenance if you go to third+ normal form. In practice, knowing when to relax normalization is the difference between success and a mess. So goes DRY, SOLID, etc.
This is nitpicking and is entirely irrelevant to the topic at hand; but you said OO (to refer to object-oriented) and FP (to refer to functional programming). You should have either stuck with OO and F, or OOP and FP.
The article pulls a Fallacy fallacy, otherwise known as an "argument from fallacy". Basically what he says is "none of the arguments against me are correct, because fallacy X". I simply ignore such statements, because they make discussion impossible.
> Above, I have, with some humor, suggested that proponents of OOP tend to indulge the No True Scottsman fallacy when arguing for OOP. On a more serious note, some proponents of OOP might try to defend themselves by suggesting that we are facing issues of commensurability: proponents of the pure-logic paradigm, or the functional paradigm, or the OOP paradigm, talk past each other because we can not understand the axioms on which each other’s arguments rest.
> I am willing to believe that this issue explains some of the disconnect between some of the more thoughtful proponents of the different styles. And yet, there has to be something more going on, since the empirical evidence is so overwhelmingly against OOP. Proponents of OOP are arguing against reality itself, and they continue to do so, year after year, with an inflexibility that must have either non-rational or cynically financial sources.
Oh, so what happens if all the arguments against him are actually fallacious? You realise that you would have ignored him based on the fallacy of the Fallacy fallacy?
No, I wouldn't have. I ignore his claim that all arguments against him are True Scotsman Fallacies. I don't ignore the actual arguments against him, nor the one he makes. Any of those arguments may or may not be fallacious themselves, but that has no bearing on me ignoring his claim that they all are.
What I'm saying is that if all the arguments against him were True Scotsman Fallacies, then it would be fallacious to ignore him based on a mistaken understanding that he is arguing based on a Fallacy fallacy. And as we all know, no true logician would make such a mistake.
I think you misunderstand me, or I you. I'm not ignoring him. The only part I'm ignoring is his claim that all counter-arguments made against him are True Scotsman Fallacies. And I was right to do so, since apparently later in the article he revealed it as a joke.
That's not really the No True Scotsman fallacy, though. An NTSF would be something like:
A: Java is too verbose.
B: Java isn't really an OOP language.
This is more like:
A: Java is too verbose.
B: Sure, but that's a statement about Java, not a statement about OOP.
If you list a bunch of criticisms of OOP languages, that doesn't mean that those criticisms automatically become criticisms of OOP as a concept, merely those of the languages themselves. Especially when you can find OOP languages that don't match that criticism. Java is verbose, yes. But it doesn't mean OOP languages are verbose (what the article author would like to claim), and it doesn't mean that Java isn't an OOP language (what the author believes other people use to deflect the criticism). It means that Java is verbose.
The same goes the opposite direction, though. While I'm sure your actual arguments are more cogent, simply complaining about something specific isn't an argument against OOP in general.
And I doubt that anybody claims Java isn't verbose, or that the IDE "handles" it--regardless of who's doing the typing, Java is verbose. The only people that hold up Java as a good example of OOP are people that don't really know any other version.
Are you referring to the OP's post? That part of it is simply incorrect. If you're referring to the comment I replied to, I don't see this argument in there.
Exactly. If you look for a language without flaws, you may as well give up programming.
On the other hand, if you say "I'm trying to do X - what's a good language to use?" then we can have a conversation. Some of the flaws are irrelevant in some projects, but roadblocks in others.
Debating the pros and cons of languages without the context of a project is like discussing the best form of transport without having any idea of the journey.
What are examples of good approaches to generic logging? In the past I've found log4j and the Python stdlib logging frustrating to work with. But when I had a shot at writing my own approaches, I found it to be one of those domains that's harder to deal with than it looks. Someone will have, though - I'd be interested to know of a good approach.
There are multiple approaches (my favourites being dependency injection or aspect-oriented). The case of a dependency affecting an entire application is called a "cross-cutting concern"; there is great literature to be found under that term that explains it better than I ever could. It highly depends on the use case, though. Just have it in your toolbox (and don't get into the habit of arguing, when you are tasked to build a house, about whether you should use the hammer OR the saw, like some peeps in this thread. Select the right tool for the right circumstance; don't just limit yourself to one).
My goodness this guy takes a long time to get to any point!
Right now I've gotten to the point where he writes "We should note the irony that they are using Linux to explain OOP concepts, even though Linux is written in C, which is not an OOP language."
And yet... it implements OO concepts throughout the kernel. He then quotes Torvalds' comment about C++, yet does not mention that OO concepts are used.
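The commenter's point is concrete in the kernel source: C has no classes, but Linux gets polymorphism from structs of function pointers such as `struct file_operations`. A toy sketch of that pattern (the type and function names here are mine, not the kernel's):

```c
#include <stddef.h>

/* A table of operations, kernel-style: each "driver" supplies its own. */
struct file_ops {
    int (*read)(void *dev, char *buf, int len);
};

/* Two "subclasses": one always at EOF, one producing endless zeros. */
static int null_read(void *dev, char *buf, int len)
{
    (void)dev; (void)buf; (void)len;
    return 0;                       /* like /dev/null: nothing to read */
}

static int zero_read(void *dev, char *buf, int len)
{
    (void)dev;
    for (int i = 0; i < len; i++)
        buf[i] = 0;                 /* like /dev/zero: zeros forever */
    return len;
}

const struct file_ops null_ops = { null_read };
const struct file_ops zero_ops = { zero_read };

struct device {
    const char *name;
    const struct file_ops *ops;     /* late-bound behaviour */
};

/* Callers dispatch through the table, never naming the concrete device. */
int dev_read(struct device *d, char *buf, int len)
{
    return d->ops->read(d, buf, len);
}
```

That's dynamic dispatch and encapsulation without a single `class` keyword, which is exactly the OO-concepts-in-C claim.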
This guy references the most random things. Some guy on HN and a Kuro5hin article from 2002 (he ought to look up my article on Buddhism, which shows that anyone could write anything on K5)...
No offense, but this massive rant could do with a copyeditor. More brevity would help me understand what point he is trying to make.
A wall of text indeed: at 30,000 words (including quotes), it's about the same length as a 100-page novella. You and the other commenters were not exaggerating.
Interestingly enough, 30,000 words is also a common 'crisis point' among aspiring writers[1]. Maybe that's what cut the article short.
What OO brought to the table (and what many advocates of FP are forgetting) is context sensitivity: functions/methods are brought into scope by the owning object. Typing dog.bark() is a thousand times more powerful in terms of tooling than bark(dog) or SomeDogPackage::bark(dog).
I prefer the FP style of coding but I can't see average devs giving up on Java/C# style tools.
Sure OO brought a lot of little code smells too, and we often end up making bad code, but it's not like we have to. Polyglot programmers that use e.g. Java+Scala or C#+F# probably make their OO code much better than others. I have almost completely stopped using mutable objects, long inheritance chains and nulls in C#, as an influence from F#.
For some scenarios mutable objects are a near-perfect fit (scene graphs for games or UIs are good examples of code where both FP and non-object imperative styles usually look worse).
The good parts of OO aren't exclusive to it though. For example, Clojure provides protocols (http://clojure.org/protocols) and multimethods (http://clojure.org/multimethods) which result in far more flexible polymorphism than a language like Java.
I agree many parts can be brought to FP but most importantly the languages of the future must be developed with tools in mind and in tandem with the tools. Too often new langs seem to be developed as command line compiled experiments which only (much) later get IDE support. At that point some decision may already have been made that makes tooling harder, such as not allowing or encouraging methods "on" types.
This is a truly fantastic post. It isn't a rant. It's a story of enlightenment that I encourage all HN'ers to read and absorb.
Disregard or play down the contents of this fantastic post at your peril. Understanding and grokking it could save you 5 years, 10 years or even a lifetime of writing software in the wrong way.
Not only is the OP mistaken for an almost-troll post, but anyone who actually warns "er, no guys, this post is for real and you need to understand it, or at least respect it, to move forward in your careers" is also mistaken for "just being funny".
Just to clarify. No the OP is not a troll post or anything like it. It is deadly serious. And I am also being completely serious.
The OP is just someone who reached the end of his learning path of OOP. He found all its flaws and weaknesses and then discovered there is a whole other world out there called FP and that there are even languages which combine the best parts of OO and FP into one. Why is this considered a problem? Some people almost seem insulted by it. He is just trying to help you guys realise that the floor on which you're standing is not quite as solid as you mistakenly think.
As I said, it is a story of enlightenment. Discredit it at your peril, as you are only harming your own self development as a programmer. There is nothing wrong in learning FP and adding it to your toolbox. This doesn't mean you have to "give up" on OO. It just means you have two tools to select from rather than just one.
Observation: Unless they are famous or head up some popular OSS project, it seems that software developers are incapable of identifying or even respecting those who are clearly more knowledgeable/experienced than themselves.
Oh, as for "us guys"--guess what? Some of "us guys" use the appropriate tool for the job, whether it's OOP, FP, IP, etc. and have been doing so for decades.
> Functional languages such as Haskell, Erlang and Clojure offer powerful approaches to the problems that software developers have always faced.
So do Java, Ruby, C#, ...
Why do FP folks always try to sell "OOP=bad / FP=good"? I'm fond of OOP, but I'd never say "FP=bad"; that would be ridiculous. FP concepts are interesting; I'd even say FP is fun.
Both are tools in my box.
Now if the problem is Java, good, we can talk about Java's shortcomings. But Java is only one implementation of OOP concepts.
What I would love to see is a monthly or quarterly "competition" between functional and object-oriented programmers. For each competition, a jury selects a problem, and then everybody can try to solve it as nicely and cleanly as possible in their preferred programming language/paradigm. The best solutions are then judged/commented on by experts and put on a website. That way you would have perfect examples of how each language/paradigm deals with a certain problem.
This is something I miss from the discussion: for each computational problem there is one tool which is "best" for solving it. Sometimes it is some esoteric Haskell magic and sometimes it is some low-level C++ bit manipulation.
More of this. This endless debate about what is better - functional or OO - is leading nowhere. The OO side is winning, since the majority of the industry is using it. The way to tear down the OO "hype" is by replacing things used in the industry with better alternatives written in a functional language.
Git (C) has eaten up both Darcs (Haskell) and Mercurial (Python), even though Python and Haskell are "clearly" superior languages, according to some.
Functional has not much to do with object oriented. Scala for example is both functional and OO. I think you mean the competition between imperative and functional languages.
It's not clear to me where the line is. I mean, it's obvious for Erlang or Haskell. But isn't Clojure imperative to some extent?
While I understand the notion of pure FP, shouldn't we say that FP is more of a toolbox that can be used in many languages, some making it easier than others? I mean, FP in Java 7 is a pipe-dream, while any language that has lambdas can be considered functional. Or should lazy evaluation be another precondition?
It seems practical to say a functional language has first-class functions and a purely functional language has first-class functions and no mutable state -- purely functional -> functional and not imperative.
I would identify a language as distinctly imperative if it has mutable state and has statements, e.g. operations with side-effects that have no return values (ex: loops, if's, void methods) -- not to be confused with expressions, which may have side effects, but always have return values.
So, if you look at the matrix of possibilities based on these classifications, a language can be any combination of (OO or not OO) and (functional and/or imperative).
It is trickier yet when many languages support multiple paradigms but encourage a subset. Or if you choose to restrict functional to require anonymous functions _without_ syntactic sugar.
It would be interesting to do a contest where each month a jury selects one of those problems and everyone focuses on that problem. The "boilerplate" is already there.
«OOP to me means only messaging, local retention and protection and hiding of state-process, and extreme late-binding of all things. It can be done in Smalltalk and in LISP. There are possibly other systems in which this is possible, but I'm not aware of them.» (Alan Kay, e-mail to Stefan Ram, 2003, http://www.purl.org/stefan_ram/pub/doc_kay_oop_en)
This is like reading John Galt's long-winded speech in Ayn Rand's Atlas Shrugged (and I shuddered...)
There are things I will never be able to 'un-read' ;)
I read this all the way through because I love reading articles that bash OOP. I haven't read an OOP-bashin' article in a while, so it is great to read a good ol' bashin' that is current. There were a number of times I giggled, like with the "fat models" debate in Rails, or the number of times he said people wasted the best moments of their careers worrying about clean OOP code.
I have bookmarked this good ol' OOP bashin' and I am considering it a must-read going forward, because I love me some OOP bashin'.
Mediocre and bad code tends to multiply entities. On a small scale it's usually redundant global and local variables, or just plain stupid code like this:
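[The snippet seems to have been lost from this comment. Judging from the replies below - which mention `return p`, `lpDataPointer`, the unnecessary braces, and ten lines that reduce to one - it was presumably something like the following; the wrapper function and names are my reconstruction, not the original:]

```c
#include <stddef.h>

#define TRUE  1
#define FALSE 0

/* Ten lines of ceremony for what `return lpDataPointer != NULL;`
   (or just `return p;`) would say in one. */
int is_valid(void *lpDataPointer)
{
    int p;
    if (lpDataPointer != NULL)
    {
        p = TRUE;
    }
    else
    {
        p = FALSE;
    }
    return p;
}
```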
Now give the same programmer an OOP language and see what happens: an even bigger disaster, masquerading as an OOP system.
I know I'm kind of repeating the author's True Scotsman metaphor, but my point is, you can't kill the return operator or, say, code comments just because the majority of code around us tends to misuse them. Unfortunately for the OOP paradigm, it makes multiplying even bigger entities easier, and at the same time its learning curve invites lo-fi coding.
What's wrong with the above code? Does it not do what it means to do? Is your personal feeling about how this construct should be written syntactically any measure of intelligence?
I know, it can also be written in many other ways. What decides which one is more correct if they are all semantically equivalent? Is it less characters? Is it using exactly as many 'e's as 'a's? Is it the unnecessary braces?
There is probably no "correct" way of writing this, but there are ways of making your next code maintainer's life easier (which might be you in a few months from now). One line is easier to read and understand than 10, especially if that one line is just "return p".
Personally, most of the time I'd rather have 10: main(int c,char**v){return!m(v[1],v[2]);}m(char*s,char*t){return*t-42?*s?63==*t|*s==*t&&m(s+1,t+1):!*t:m(s,t+1)||*s&&m(s+1,t);}
But I meant return p, not some gibberish that obviously should be formatted better. It's just that for some developers "formatting better" doesn't translate to "make it shorter and sweeter".
Depends on the language. Clearly 'lpDataPointer' suggests C or C++, but if it was C# or Java then 'return p;' would only compile if 'p' was of boolean type.
Good heavens. What a mess. This guy rambles on forever, and steps on his own points over and over again. I am desperate to leave comments on the actual site, and apparently 2 comments have been left, but I don't see the comments on the site, nor is there a way to post one. I am guessing whoever runs this site shut down the comments section because this joker made an ass of himself with his endless rambling.
Procedural programming for the win. One can write in an object-oriented style or functional style in C, but the language has enough of a barrier to entry to writing code of these varieties that most people write KISS procedural code unless the particular situation really benefits from doing otherwise.
My main problem with the proposition that "OOP is awful" is that the argument is basically "it's not the best so it's the worst".
The expression problem is not solved by every language, but basically because Lisps and Haskell solve it in their own ways, it is concluded that OOP is worthless.
There are a number of practical reasons OOP could be liked, other than the marketing of it to the Enterprise. It's a natural way to model real things. The syntax lends itself to things like auto-complete (foo.ba<tab>). And being liked is enough to make it not-awful if the people who like it are skilled enough -- it's sufficient even to say it can be better than other paradigms for some people.
Don't mistake that for me saying that OOP is the best at anything, but it's not the worst at everything.
> this is my experience whenever I argue against Object Oriented Programming (OOP): no matter what evidence I bring up for consideration, it is dismissed as irrelevant.
This, I thought before even reading the article. I know that frustration myself all too well, but I have to admit I am often in the wrong, asking the wrong questions, holding false preconceptions.
Looking at different implementations, it's clear the experts aren't all that sure either. And since most implementation specifics aren't inherent to OOP, language criticism in lieu of criticism of OOP concepts is clearly a strawman. Then I stopped reading.
Are you sure you know what DI is? It just means you pass an object's dependencies to it, instead of it constructing them itself. You don't need a framework to do DI.
I don't believe you. Are you mistaking DI for meaning a full blown IoC framework? DI is merely the abstract computer science concept. In FP land we call it partial application or P/A. But it is essentially the same thing. I honestly cannot fathom how a large project would look without DI. It must be hideous with massive amounts of coupling all over the place i.e. not something to be proud of. Refactoring must be very hard.
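To make "just pass the dependencies in" concrete, here's a minimal sketch in C (the names are invented): the component receives its logger rather than constructing it or reaching for a global. Binding the logger argument ahead of time is also exactly the partial-application view mentioned above.

```c
#include <stdio.h>

/* The dependency: anything matching this shape can serve as a logger. */
typedef void (*log_fn)(const char *msg);

static void stderr_logger(const char *msg) { fprintf(stderr, "%s\n", msg); }
static void null_logger(const char *msg)   { (void)msg; /* swallow it */ }

/* The component neither constructs nor looks up its logger:
 * whoever calls it injects one. That is all DI means. */
int transfer(int amount, log_fn log)
{
    log("transfer started");
    /* ... the actual work would go here ... */
    log("transfer finished");
    return amount;
}
```

Production code passes `stderr_logger`; tests pass `null_logger` or a recording stub. No container or framework required.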
I would have to take a few guesses to know what an AbstractFactory is. That is a "J2EE Javaism" and always has been. So you aren't alone there ;)
Agreed. If you've not used DI after 15 years as a "professional" developer, and you are also happy to "boast" about this fact on the Internet, then it means you are very likely an utter shit developer.
I think articles like these are akin to clickbait. Mature developers understand that every paradigm has its place. I mainly use functional style for data processing and pipelining, and OOP for object modeling.
Object-oriented programming is not for all developers, the same way that some excellent developers have problems with functional programming. I think OO is very well suited to a bottom-up way of thinking (http://en.wikipedia.org/wiki/Top-down_and_bottom-up_design).
I had to bail out about 1/3rd of the way through. Wow, what a rant.
I don't mind the invective, and I'm becoming more skeptical of OOP the more I see larger-scale FP apps work. Mutable state and hidden dependencies are killing us. But I do not like the way this essay is arguing its case. Please do not trot out famous people, tell me their ideas, and then show me how things did not work out for them.
Frankly, I couldn't care less what Alan Kay thinks of OOP. Or Bjarne. Or any of the others. I'm sure they're nice people, and probably a million times smarter than I am. What I'm interested in is this: why do people who know multiple ways of creating a solution pick OOP? And does it fulfill the implied promises it makes to those folks?
The classic example is the statement "OOP is required for creating and maintaining large systems, because it forces developers to think about where things go first, ahead of what the system does."
So here we are presented with a problem: large-scale systems with lots of developers have a difficult time working without a clear definition of where everything goes and how it works together. From my experience that seems like a reasonable problem to consider. So let's talk about that problem, how OOP helps, and how OOP falls down on the job.
Repeat and rinse. Then we get a list of real-world problems that choosing OOP is supposed to help with. We also can start creating some success criteria for both OOP and non-OOP projects.
Otherwise, our arguments get caught up in personalities and theory-of-the-universe crap that doesn't really go anywhere. I can line up a dozen experts, cherry-pick some quotes, and rant away. A person disagreeing with me can do the same. Then we can each talk about how the theoretical system the other guy's way of solving things is built on a shaky foundation. The other guy will demonstrate that this is incorrect by citing an example. It's not productive.
The reason it is not productive is that it is trying to make a universal case based on theory, as if programming were some sort of extension of calculus or geometry. Yes, I understand the association with category theory and such, but programming is the act of multiple humans coming together and creating some better way of doing things for another group of humans. Yes, languages are mathematical, but programming isn't. You want better programming, you'd better start listing out a bunch of ways humans screw the pooch when they're trying to make stuff, then try to help them stop doing that. You don't trot out the set theory books. Wrong answer.
Think in terms of machine language. At the end of the day, bits gotta go somewhere -- both data bits and coding bits. Instead of trying to say "well, we always do things this way because $FamousAuthorX told us", it might be much better to say "we have this process that allows us to dynamically change the method of grouping that proceeds like so".
This is much better because it starts with the problem and lets the solution evolve. Instead of starting with already knowing the solution and then just taking the problem and a big hammer and making it all work [insert long argument for mixed-mode languages here, like Ocaml or F#]
ADD: I also note that developers love making things complicated, and OOP is like a massive playland where you can run off and make complicated stuff all day long. Become your inner architecture astronaut. This is extremely difficult for many to resist.
I decided to just stop arguing and learn every style I could. As much as I want to be a white knight who "saves" software engineering by convincing everyone to use the most efficient way to work, it's never going to happen. I'm not even sure I know the most efficient way.
The biggest obstacle is the anti-intellectualism which weighs down every argument. Very few people even want to learn something new, they just want the ego boost of knowing they shouldn't have to.
So now I lead a study group reading through SICP and I call it a day. Better to train up a cohort of people who want to learn than argue with those who even if proved 100% wrong will never budge.
Key question to ask yourself: assuming you had a good team that could talk about things, if the other three guys on your team wanted to try something new, but you thought it was a bad idea, would you keep an open mind and give it a good shot? Or would you dig in and come up with the 74 thousand reasons it sucks?
There are a lot of really smart guys out there that can argue any subject from any angle better than anybody else in the room. They view conversations as debates and discussions around direction as something that can be "lost" or "won". There's a right and wrong answer, and these guys are almost always right.
In the article and elsewhere, Clojure is often used as an example of an anti-OOP language, yet by either Alan Kay's definition (quoted in my previous comment) or by Bertrand Meyer's definition^1, Clojure uses the object-oriented paradigm. Clojure is more object-oriented than PHP or C++, which only provide some OO-inspired data abstraction mechanisms but are not based around them.
^1 «Object oriented software construction is the building of software systems as structured collections of implementations of abstract data types.» Bertrand Meyer, Object-Oriented Software Construction, 2nd edition, 1997.
Please note that, below, when I refer to a multi-paradigm language, such as Scala, as an OOP language, I am specifically referring to the OOP qualities in that language. And I would like you to ask yourself, if you use a multi-paradigm language to write in the “functional” paradigm, are you actually gaining anything from the OOP qualities of that language? Could you perhaps achieve the same thing, more easily, using a language that is fundamentally “functional”, rather than object oriented?
Yes, isUpper is a method on Char; _.isUpper is a lambda that is given to exists, which is a method on the trait TraversableOnce, which String happens to implement, along with many other things, such as Option and Future. Where's the OO and where's the functional? In the Clojure example in the OP, does the author realise that all of those things are in effect Java objects? Does any of this make the code better or worse?
But that's not the point, the Scala code provided there is beyond ignorant, whoever wrote that has no right to be judgemental of the language or its founding paradigms.
As someone who is more familiar with OOP, I would love to see examples/hear more alternative approaches to organising code.
Let's say that you want to design a CRUD app, but you're not going to use OOP. What are some of the ways you could choose to structure your code? Would you still use a pattern like MVC?
As for an alternative to "CRUD app using MVC", I'd probably recommend Functional Reactive Programming ( http://en.wikipedia.org/wiki/Functional_reactive_programming ). MVC is a way to architect interactive simulations and games, developed in the live environment provided by Smalltalk.
However, I imagine your intention was for something closer to the server-side code of a form wizard on a Web page, rather than a game. In which case, I'd avoid MVC-style approaches completely, since they're inappropriate. It's much more straightforward to model servers as UNIX pipelines turning HTTP requests into HTTP responses ( http://en.wikipedia.org/wiki/Pipeline_(Unix) )
Pipelines turn out to be a great fit for functional programming: lots of small, single-purpose pieces, composed together into a server which requests flow into and responses flow out.
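As a concrete (if contrived) sketch of that shape - the types and routes here are invented - each stage is a plain function from value to value, and the "server" is just their composition:

```c
#include <string.h>

/* Hypothetical request/response values flowing through the pipeline. */
typedef struct { const char *path; int user_id; } request;
typedef struct { int status; const char *body; } response;

/* Stage 1: mark unauthenticated requests. */
static request authenticate(request r)
{
    if (r.user_id == 0)
        r.user_id = -1;             /* flag: not logged in */
    return r;
}

/* Stage 2: turn the request into a response. */
static response route(request r)
{
    if (r.user_id < 0)
        return (response){ 401, "unauthorized" };
    if (strcmp(r.path, "/hello") == 0)
        return (response){ 200, "hello" };
    return (response){ 404, "not found" };
}

/* The "server": requests flow in one end, responses come out the other. */
response handle(request r)
{
    return route(authenticate(r));
}
```

Adding logging, caching or auth means inserting another stage into the composition; nothing else has to change.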
Say I want to migrate from OOP to Functional and avoid doing things the wrong way the first time, like we supposedly did with OOP and it's too late to fix. What book/code/etc is a good reference for high-quality, practical proven Functional programming best practices?
There certainly aren't as many 'design patterns' style things in functional programming as there are in OO. In general, it seems to be because:
- FP systems are much smaller than OO systems (due to conciseness and not as much large-scale use)
- FP favours immutability over encapsulation, so you can re-use existing map/fold/filter/etc. functions rather than having to rewrite them in small pieces spread throughout your app. Hence you don't need to decide how they're implemented.
- Pure functions take everything in via arguments and give everything out via return values. There aren't many ways to get that wrong. In OO any method can affect the application in all kinds of ways, so there are all kinds of ways to get it wrong, requiring best-practices to avoid them.
- In FP, we're generally turning some input into some output. If we're writing code that doesn't seem to get anything from our input closer to something in our output, we should stop and think for a minute. In OO we're encouraged not to think in terms of solving a problem: instead we try to define ontologies and hierarchies, implement models of the domain, etc. Only then, once we've bootstrapped a simulation of our business entities and their interactions, do we write the actual code for the problem we have: `Order::newFromFile('input.csv').ship()`
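For contrast, here's how the same hypothetical order-shipping task might look as plain input-to-output functions in Python. The CSV columns and function names are invented for illustration; the point is only that each step is a pure transformation, with no Order ontology in sight:

```python
from typing import Dict, List

def parse_orders(csv_text: str) -> List[Dict[str, str]]:
    # Naive CSV parsing, enough for the sketch: first line is the header.
    lines = csv_text.strip().splitlines()
    header = lines[0].split(",")
    return [dict(zip(header, line.split(","))) for line in lines[1:]]

def shippable(order: Dict[str, str]) -> bool:
    return order["status"] == "paid"

def ship_labels(csv_text: str) -> List[str]:
    # input -> parsed rows -> filtered rows -> output labels, each step pure
    return [f"ship to {o['address']}" for o in parse_orders(csv_text) if shippable(o)]
```

Every function gets closer to the output we actually want, and each piece can be tested with nothing but a string in and a list out.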
* Real-World Functional Programming - With Examples in F# and C# by Tomas Petricek
This helped me a lot because I'd been working solidly with C# for 8 years leading up to it (C++ before, and C before that), so I was very much in C mode. It presents reasons for the functional approach and shows side-by-side C# and F# examples of the various topics covered. It also takes a more practical approach, rather than 'yet another fibonacci example' which you'll see on lots of functional programming tutorials.
* Programming in Haskell by Graham Hutton
Basically a text-book on learning Haskell, but covers a lot of fundamentalist functional thinking.
* Pearls of Functional Algorithm Design
Once you have the basics of functional programming down, go for this. It teaches you how to think about reducing problems.
It might be easier for those in the know to make a recommendation if they have an example of the kind of book/code/etc you're looking for on the OO side. What do you recommend for high-quality, practical proven OO programming best practices?
I have no idea. Programming is not my main area, so I look for advice from other people to avoid making stupid mistakes by following best practices. That being said, OO people will often point me to the design patterns book, etc. Is something similar needed for FP?
I've got the Haskell Craft book but haven't practiced enough. It seems good enough for me but, as people are learning more and more about using OO and FP in the real world (thus such criticism), I would like to keep current.
Another 'the sky is falling' article ;) Can someone who worked through the entire write-up provide a TL;DR? When I hear 'OOP must die' I can't help but think that we're throwing away the good bits too and will reinvent the wheel in a few years.
It would be good to define the absolute essence of OOP first, because in 30 years it has been overloaded with so many different meanings that it means everything today. For instance:
- It was hype/next-big-thing/snake-oil-salesman material: CORBA and COM used to be the silver bullets in the 90's which would solve all software development problems by building applications from reusable components which could interact with each other (but in reality, the only thing that came out of it was embedding Excel tables into Word documents). If this is the essence of OOP, then yes, it deserves to die.
- Is it the software design principle of thinking 'object oriented'? Then this is already a two-edged sword. The main problem here is falling into the trap of trying to model a software system after the real world (the animal-cat-dog problem). Are inheritance, polymorphism, encapsulation and message passing bad things? I would say mostly not; they can be abused, but most of them solved actual real-world problems.
- Are the OOP language features the essence of OOP? 'Everything is an object'? Multiple inheritance? Here it gets fuzzy because languages (and their standard frameworks) are so radically different.
To me the pragmatic essence of OOP is that it groups the data and the code which works on that data. If there's a pure C library which has a function to create some sort of 'context' and then has functions which take a context pointer as argument, then this is object-oriented to me. Most other features, like classes, inheritance and runtime polymorphism, are just sugar on top; useful in some cases, but not the essence.
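That 'context pointer' shape translates directly into any language. A small Python sketch of both sides (the `counter` names are made up for illustration): first the C-library style, plain functions whose first argument is the state they work on, then the class form, which is the same grouping with `self` as the implicit context argument:

```python
# C-library style: an explicit 'context' passed to every function.
def counter_new(start=0):
    return {"value": start}          # the 'context'

def counter_increment(ctx, step=1):
    ctx["value"] += step

def counter_value(ctx):
    return ctx["value"]

# The class is the same data+code grouping; 'self' is the context pointer.
class Counter:
    def __init__(self, start=0):
        self.value = start
    def increment(self, step=1):
        self.value += step
```

Both versions group the same data with the same code; the class syntax just hides the context-passing.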
Even with this simple definition, it is easy to fall into the real-world-object trap by designing complex systems as some sort of object soup communicating through messages. That's hard to debug and optimise for performance (but it's still easier to maintain than one big soup of global functions and variables).
Data-oriented-design is one obvious answer to fix some OOP-design-problems. First concentrate on the data, not the code, divide it into 'hot and cold' data, group by access patterns, think about cache misses. After that, the code that works on that data usually 'writes itself'.
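A toy Python sketch of the 'group by access pattern' idea (the `particles` data is invented; in Python this only illustrates the layout, since the real cache-miss wins come in languages with contiguous memory, or with NumPy arrays):

```python
# Array-of-structs: one record per entity, hot and cold data interleaved.
particles_aos = [
    {"x": 0.0, "y": 0.0, "name": "spark"},   # 'name' is cold data
    {"x": 1.0, "y": 2.0, "name": "ember"},
]

# Struct-of-arrays: hot fields grouped together, cold data kept apart.
particles_soa = {
    "x": [0.0, 1.0],
    "y": [0.0, 2.0],
    "name": ["spark", "ember"],
}

def move_all(soa, dx):
    # The hot loop touches only the array it needs, never the cold names.
    soa["x"] = [x + dx for x in soa["x"]]
```

Once the data is laid out this way, the per-frame code really does 'write itself': it's a loop over exactly the arrays it touches.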
And, IMHO OOP itself is completely orthogonal to the really hard problems: memory safety and lifetime management without runtime overhead, concurrency, reducing complexity and improving maintainability of big software systems.
Your mileage may vary of course, my background is games programming, and I started with Z80 assembly, dabbled a bit with Pascal, Forth and Lisp in the 80's before coming to C, then C++ and Python, and I tend strongly towards Rust for future work. I'm only using the parts of C++ that I feel improve on raw C (there once was an object-oriented UI framework on the Amiga called Intuition which bolted classes and methods on top of C, that was the point where I thought that it's really time to learn some C++).
Basic tl;dr: Some HN users were mean to the writer, so he went on a big rant about how they were all wrong, spouting wildly irrelevant technical reasons why tools designed to model certain classes of real-world systems are completely false and why everyone in the world is too dumb to acknowledge his genius.
> To me the pragmatic essence of OOP is that it groups the data and the code which works on that data. If there's a pure C library which has a function to create some sort of 'context' and then has functions which take a context pointer as argument, then this is object-oriented to me. Most other features, like classes, inheritance, runtime polymorphism is just sugar-topping; useful in some cases, but not the essence.
You nailed it. It's just more elegant to group things as a class than having fopen, fread, fclose where you pass a resource as the first argument. Polymorphism then is a consequence of that design, since you could have the same interface for a file, a socket, an archive or whatever. When in use, you can ignore the nature of the resource and just remember it can be opened, read or closed, which allows you to write abstract logic on top of it.
I see no reason pure modularization wouldn't achieve the same thing. In fact, that's what I took OOP to mean for a long time. But the code doesn't need to be bound to mutable data; it can stay functional. If member functions return objects and data, they can be used in a functional style, and polymorphism or inheritance can be constructed through currying.
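One way to sketch that in Python is closures-as-objects: a 'constructor' function captures the data, and 'methods' return a fresh object instead of mutating. The `make_account` example is hypothetical, chosen only to show the shape:

```python
# Closures as objects: the data lives in the enclosing scope, and
# 'methods' return new objects rather than mutating state.
def make_account(balance):
    def deposit(amount):
        return make_account(balance + amount)   # a new 'object', no mutation
    def value():
        return balance
    return {"deposit": deposit, "value": value}

acct = make_account(100)["deposit"](50)
```

Anything with the same `deposit`/`value` shape is interchangeable here, which is the polymorphism without classes that the comment above is pointing at.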
> Are the OOP language features the essence of OOP? 'Everything is an object'? Multiple inheritance?
If an object can be reduced to a symbol, like a void pointer to data or a function that returns data, and this object can be related to others in a graph, then that's still general enough.
Imperative programs as well as object models can be well organized as graphs. The CPU or bytecode doesn't know about objects anyway, so it's an abstraction, and you need to apply it in moderation, depending on the complexity of the problem; for simple or performance-critical procedures, the imperative style is not to be replaced by OOP. fopen, fread etc. are all abstractions of underlying system calls, but they depend on speed, so light OOP is a good fit there.
It reminds me of the time when Linux users were ranting about Windows... Somehow everybody ended up with a Mac. Hold on Lawrence, things are going to get better.