
Don't distract new programmers with OOP - angrycoder
http://prog21.dadgum.com/93.html
======
raganwald
Provocative.

The suggestion is that thinking in OO terms makes you think about
architectures instead of programs, and the result is to move you away from
thinking about the problem and the solution and towards thinking about the
program's organization. This is always assumed to be a good thing, especially
when accompanied by the usual examples of writing programs for teams of
programmers with varying levels of skill.

But if we grant that non-OO is better for someone learning to program, why
wouldn't it be better for someone reading a program for the first time?

EDIT:

Thinking further about this, I am not against the idea of design
considerations that should be kept from the beginning programmer. But OO isn't
really a design consideration, it's a _metaphor_.

If it really "worked," then new programmers would be looking all confused
trying to write a Towers of Hanoi program, and you would explain, "Think of
each tower as its own thing. What can it do? What does it have?" And slowly
you could tease out a program by designing objects from the ground up.
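By contrast, the exercise is usually taught with no objects at all: a single
recursive function. A minimal sketch in Python (purely illustrative, not from
the article):

```python
def hanoi(n, source, target, spare):
    """Return the list of moves that transfer n disks from source to target."""
    if n == 0:
        return []
    return (hanoi(n - 1, source, spare, target)     # clear the way
            + [(n, source, target)]                 # move the largest disk
            + hanoi(n - 1, spare, target, source))  # re-stack the rest on top

moves = hanoi(3, "A", "C", "B")  # 2**3 - 1 == 7 moves
```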

But if it actually doesn't work to teach programming using objects, if you
teach programming without objects and introduce them as an 'advanced' subject,
then at some level you have to wonder if the metaphor is fundamentally broken.
If the purpose of OO is design and organization rather than a fundamental way
to think about programs, then really we shouldn't say things like
"Everything's an object."

We should ask what we need to do to design well-factored programs that are
cohesive without being coupled and then design language features that directly
address those organization requirements rather than thinking that there is
this obvious "metaphor" that naturally leads to well-organized programs.

~~~
ahlatimer
I don't know that OO's main goal is to be readable, but rather to manage
complexity (which _should_ help readability, but I don't think that's the
focus). There's an inherent level of complexity in OO that isn't necessary
when you're solving basic, learner-type problems, but it can help keep
complexity down when you're solving problems that have solutions that can't be
kept in your head at any one time.

To put it differently, think of an algorithm that's constant time, but has a
very large constant and an algorithm that's O(n). The constant time algorithm
will be worse for small sets, but better for large sets (once n > that
constant). That's the problem with teaching OO to people new to programming.
You're adding an additional thing to an already difficult topic, and that
thing you're adding doesn't really benefit you at that level of problem, but
get to a bigger problem, and it starts to make sense.

~~~
simpleTruth
I agree, but it's good to step back and think why that's true.

The real value in OO programming is being able to reduce complexity by
reusing code in new ways, either by creating several instances of the same
object that each manage a complex internal state, or by inheriting from
another object and extending its functionality. "I need to add a window, I
already have lots of windows, but this one is different because of X" works
well in OO. "I need to build my first window starting from scratch" and
suddenly OO does not give you any leverage.

Which is why OO seems pointless to most beginning programmers: they can't
deal with problems complex enough to gain anything from OO programming.
However, you don't need to understand objects to use them. It takes a while
to understand why you can keep adding <<'s after cout, but most beginners
quickly prefer it over sprintf.
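The reason you can keep adding <<'s is that each operator<< call returns the
stream itself, so the next << applies to the same object. The same
return-self chaining can be sketched in Python (Out is a made-up toy class,
not a real library):

```python
class Out:
    """Toy output stream: each write returns the stream, so calls chain."""
    def __init__(self):
        self.parts = []

    def write(self, value):
        self.parts.append(str(value))
        return self  # returning self is what makes the chaining work

    def text(self):
        return "".join(self.parts)

out = Out()
out.write("x = ").write(42)  # chains exactly like cout << "x = " << 42
```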

~~~
raganwald
_Which is why OO seems pointless to most beginning programmers: they can't
deal with problems complex enough to gain anything from OO programming._

Beware, this is not biconditional. I'm sure you won't have to go very far to
find programmers who deal with complex problems but who do not feel that they
gain anything from solving them using OO.

And there is also the argument that OO as implemented in popular languages
introduces its own complexity, such that it ends up being used to solve the
problems it introduced in the first place.

------
siglesias
Speaking from personal experience, when I transitioned from C (first language
that I was three months into) to Objective-C, I couldn't STAND the number of
cutesy metaphors used to describe what objects were and how they functioned:
"Like, you can send the duck a message to swim, or to fly." "Ducks inherit
from birds."

Coming from a highly plausible programming world of data and operations on
data, this came off as nonsense. I wanted to tear my hair out finding a sober
explanation of what an object actually WAS in practice, instead of hearing
them referred to as things that could be told to do stuff. Apple's
Object-Oriented Programming was by far the most enlightening document in my
early days:

[http://developer.apple.com/library/mac/documentation/cocoa/c...](http://developer.apple.com/library/mac/documentation/cocoa/conceptual/OOP_ObjC/OOP_ObjC.pdf)
(chapter 2)

~~~
kragen
> Coming from a highly plausible programming world of data and operations on
> data, this came off as nonsense.

It _is_ nonsense.

On IRC the other day, I said this:

    
    
        19:19 < xentrac> I propose a new rule for discussions of object-oriented 
                         programming
        19:20 < xentrac> which is that anyone who brings up examples of Dog, Bike, Car, 
                         Person, or other real-world objects, unless they are talking 
                         about writing a clone of The Sims or something,
    
        19:20 < xentrac> is immediately shot.

~~~
phillco
Person _is_ shot.

~~~
possibilistic
Person _has a_ GunShotWound, which _is a_ subclass of Wound. (cf.
AbrasionWound, LacerationWound, etc.) Also note the new properties such as
bulletType, entryPoint, and fragmentationPattern.

~~~
phillco
Sorry. It's Java. The GunShotWound has to come from a GunShotWoundFactory.

~~~
bcrescimanno
Unfortunately, GunShotWoundFactory now needs to implement
AbstractGunshotWoundFactory so that we can support multiple types of gunshot
wounds.

Of course, we'll need an AbstractGunshotWound as well and a subclass for each
type of gun used.

Needless abstractions make my head HURT!

~~~
cheez
DON'T WORRY GUYS! Dependency injection to the rescue!

~~~
fabjan
The WoundFactoryFactory will take care of it.

~~~
cheez
Actually, I'll be honest: I quite like dependency injection. There are a lot
of really good things that can be done with Guice. I've written one for C++
here: <https://bitbucket.org/cheez/dicpp/wiki/Home>

------
pnathan
As a former TA, I agree.

Objects are not ways of solving problems. They are a metaphor, which gives you
_a_ handle into the problem space.

OO design builds a metaphorical data model of your system, and then you have
to carry the real data across into the metaphor and back again.

I wasted countless hours when learning programming trying to wrap my head
around coding in an OO fashion. Eventually I integrated enough OO to make my
programs better. But I never really got 'into' OO, although I tried real hard!

I don't have good words yet to explain the problem in detail, but
fundamentally, a metaphor is not the solution. Most discussions of object
orientation badly confuse themselves with that. As an example - an interface
(ala Java/C#) is not an interface as a hardware engineer would speak of it: an
interface from the OO perspective is some sort of promised type or 'protocol'.

Are you building a model of your problem? Or are you solving your problem?
I'm willing to make a small wager that building models of your problem
doesn't pay
off until you really start scaling your system in some direction. If we use
the Pareto idea, only 20% of your systems actually need the heavyweight
approach - the other 80% can be put together without the heavy object-oriented
machinery.

Anyway, I'd rather be given the opportunity to teach in Scheme or another
untyped functional language, instead of Java or C++. :-)

~~~
btilly
_I don't have good words yet to explain the problem in detail, but
fundamentally, a metaphor is not the solution._

A number of years ago I wrote
<http://www.perlmonks.org/index.pl?node_id=318257>. You may find that it
solves the problem in a reasonable amount of detail.

------
scootklein
couldn't agree more. i started with java in high school and was too caught up
in "public static void main" to realize that it was up to me to define methods
that made sense, and that things generally ran top to bottom. switching to php
for a while made everything "click" before heading back to java.

getting rid of CS terms and data types lets you learn how to think in program
flow and "what am i actually doing here". failing to capture someone's
attention at this first step is lethal: most people who give up quickly on
their first try would otherwise have been good at programming.

~~~
Groxx
That's precisely why I think Java is a horrible language to start people out
in.

Well, not language. _Library_. It's abstracted to the _extreme_, and
confusing as heck to anyone just starting to learn how to _think_ about how to
program.

Start people out in something simple. Ruby can teach bad habits, so don't tell
them about monkey patching, but it's so simple it's a great way to hook
people, and has simple console / file IO that'll let newbies actually _do_
something with their skills, instead of being glad their project doesn't
overflow its array bounds or get stuck in hideous C input handling deathtraps
( _which_ flags do you have to reset in the input stream if they start
inputting Japanese? I forget...)

Best of all, despite being OO, _you don't need a class to start doing things_.
Something Java gets wrong for beginners.

    
    
      #!/usr/bin/env ruby
      puts "hello world"
    

Classes can wait, they're a whole can of worms that should come after people
learn to think clearly enough for a computer to understand them.

~~~
phillco
If you're burnt out on using Java for introductory courses, consider using
Groovy (<http://groovy.codehaus.org/>). It takes a lot of the pain and
boilerplate code out of Java, making it easier for students to focus on what
they're actually doing and not on writing "public static void main(String[]
args)" all of the time.

Similar to your example, this is a valid groovy program:

    println "Hello!"

But, unlike Python or Ruby, you're still in the Java world, so migrating back
to pure Java is a lot easier. Though the shrieks of pain you'll endure when
you explain why they now have to write constructors or getters and setters
might hurt. :-)

~~~
Groxx
Have seen it, but never used it. It looks decently active on the forums,
despite the home page showing an award from 2007 (and looking older than
that)... how's the community / code health? On something like a .NET-to-
Ruby/Python scale? I'll keep it in mind for any future Java necessities.

------
onan_barbarian
Interesting, but one could go further with this: there still seems to be the
assumption that learning OOP concepts is indispensable and necessary in the
long run.

This may or may not be true depending on the application area, language and
intent of the programmer. Some languages and frameworks make it impossible to
get anything done unless you know how to subclass things; at the other extreme
you've got Stepanov-inflected C++.

Arguably you would have people learn at least a little more about algorithms
and data structures of greater complexity before dropping the OO-hammer. I
still have moments where mid-way into astronauting up a class hierarchy I
realize that all of this could be done more clearly in 10 lines of STL/Boost
(and yes, I'm aware of the problems of that style of coding, too).

~~~
nickik
I agree, but STL/Boost is a tiny bit too hard. Teach that stuff with Scheme,
Python or something that isn't <insert what you hate about C++>.

~~~
onan_barbarian
You misunderstand. I'm merely using STL/Boost as one example of advanced non-
OO programming. Many other such paths to sophisticated programming exist that
don't go through OO, including the ones you name.

I use STL/Boost in this case not because I think they're wonderful but because
they're realistically what I'd have to use, as our codebase is mostly C/C++.

------
jarrett
I don't think objects unavoidably lead beginning programmers to get lost in
"architecture."

My go-to language for teaching new programmers is Ruby--in part because it's
my strongest language, but also because it's capable of expressing most
programming problems intuitively.

Is Ruby object-oriented? Yes and no. Certainly, objects are core to its
design. But it's also a scripting language that can be used purely
imperatively. Or you can use it like a functional language.

What makes it so good for beginners, in my opinion, is that in Ruby, things
are objects that _naturally seem like objects_. For example, it's very natural
to think of a string as an object that can be passed around and manipulated,
and Ruby makes this totally intuitive.

Writing "Hello world".downcase.sub('hello', 'goodbye') doesn't tempt beginning
programmers to create bizarre class structures or reams of boilerplate code.
But it does make you feel empowered, and because it's so darn logical, it
makes programming seem way less scary to a beginner.
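For what it's worth, the same one-liner reads almost identically in Python,
where each string method returns a new string for the next call to operate on:

```python
# Each method returns a new string, so calls chain naturally.
result = "Hello world".lower().replace("hello", "goodbye")
# result is "goodbye world"
```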

~~~
jablan
Agreed. It's the hogs like Java, C# and C++ that gave OO a bad name. Moreover,
in dynamic OO languages like Ruby, even scary things like design patterns
usually take a lot less time and fewer lines of code to implement and, later,
to understand.

------
agentultra
I'd actually skip on Python and give them Scheme with a copy of "The Little
Schemer."

However, I agree with the OO sentiment. It brings a lot of vernacular and
concepts that are not important to the building of a program; at least not at
first. Classes, objects, meta-classes, inheritance, multiple inheritance,
method resolution order, operator over-loading, class methods, static
methods... it's all baggage for a new programmer. It's an interesting way of
organizing code and encapsulating the responsibilities of different parts of a
larger program... but it's a purely architectural tool.

A beginning programmer isn't worried about such design concepts. They just
need to understand inputs, outputs, and control flow. A single function is a
program for them. Once they have a grasp of the fundamentals, then you can
think about introducing them to OO programming.

Beginner's mind. It's hard for an experienced programmer to grasp.

------
trustfundbaby
Exactly.

It's like trying to teach someone basketball for the first time and at the
same time trying to explain a pick and roll, a full court press, the Triangle
offense, etc. You'll just confuse them.

Teach the basics and slowly ease into the big picture/strategy stuff when they
have a handle on the fundamentals.

------
6ren
It's a question of scale. Modules become increasingly important for larger
projects. It's a form of infrastructure, ridiculous for a birdhouse, essential
for a city.

------
kragen
I wonder if part of the problem is that objects and classes are additional
concepts to understand, on top of statements, variables, functions, modules,
data types, if, while, for, and so on. In a language like the ς-calculus, you
wouldn't have to introduce objects and classes separately from those other
things.

------
robee
Object Oriented programming exists because it is how humans conceptualize the
world around them. Humans think in objects and actions on, in and between
objects. OOP is a translation of natural thinking into systems and behaviors.
Why is this a bad thing, especially for learning?

I guess my question is: why do OOP and a natural translation between the
real world and the code world lead to this discussion and a certain level of
condescension around utilizing the concepts of OOP?

~~~
nickik
Why don't you use Erlang? It's much closer to your definition of OOP.

~~~
Ingaz
Exactly.

And I think that Erlang is the closest to Kay's definition of OOP (if we can
drop "everything is an object").

Erlang processes are objects.

------
pelotom
Better yet, teach them a language which doesn't even _have_ the "extra layer
of [OO] nonsense". Teach them something simple, clean, and logical. Teach them
Haskell.

~~~
warrenwilkinson
You need monads to do any IO, that's not an easy first lesson. And you are a
great distance from hardware.

~~~
brehaut

        main = print "Hello, World!"
    

Not a monad in sight. Yes, print is an IO action, but you don't have to use
monads to do IO. Add in 'interact' and you can be writing trivial stdin /
stdout programs very quickly.

------
Rhapso
A reminder: the objects are only in your mind, not in the computer. They
exist only as a metaphor to let you wrap your mind around programming more
easily, and rarely does the easier path yield any benefit except shorter
travel time.

------
sunsu
I couldn't agree more. The first "programs" I ever wrote were BASH scripts.
Definitely no OOP distractions there.

~~~
Groxx
That's hugely debatable, actually. If you ever called a command-line
application in one of those scripts, you called what's essentially an object -
it encapsulates its own behavior, hides how it does it, you can create
multiple ones (instances) without them competing or sharing information, the
command line arguments are arguments to the constructor, etc. They're nearly
identical.

~~~
wtallis
OOP concepts aren't what's getting in the way. It's the boilerplate most
languages require for their preferred object systems. A good language for
beginners would have to make it easy to progressively encapsulate things as
the program grows, without requiring substantial rewrites. It should be very
easy to go from a few named variables to an array or dictionary, and then to
add methods to that to make it a proper class.
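A sketch of that progression in Python (the names and the tiny norm example
are mine, purely illustrative):

```python
# Step 1: a few named variables.
x, y = 3.0, 4.0

# Step 2: group them in a dictionary as the program grows.
point = {"x": 3.0, "y": 4.0}

def norm(p):
    return (p["x"] ** 2 + p["y"] ** 2) ** 0.5

# Step 3: promote the dictionary to a class and attach the behavior,
# without rewriting the callers' logic.
class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

    def norm(self):
        return (self.x ** 2 + self.y ** 2) ** 0.5
```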

~~~
swift
Sounds like JavaScript or Lua.

------
edw519
People often ask what's the single biggest difference between a good
programmer and a great programmer.

I've heard a lot of good answers dealing with talent, hard work, experience,
imagination, communication, cleverness, simplicity, vision, and even laziness.

 _The shift from procedural to OO brings with it a shift from thinking about
problems and solutions to thinking about architecture._

This makes me think about my answer: You first enable yourself to become a
great programmer the moment you stop worrying about your own problems and
focus primarily on your customer's.

~~~
dwc
Despite the other comments, I think you make a very valuable point. Take out
the "customer" angle and I think they'd also agree. It's really thinking about
the problem, working the problem and solving the problem. Thinking about and
working on your toolset is not solving the problem. Whether the problem is
yours or your customer's makes little difference.

------
iandanforth
As someone who is just making the transition to OOP in Python I totally agree
with this post.

I started with PHP:

Learning programming syntax was really annoying; then learning loop and
nested loop structures was hard; then learning a set of useful built-in
methods was hard; then learning how to interact with other systems was really
hard.

All the time I was learning this basic stuff I was writing code primarily for
myself. I was the only one who maintained it and had to use it.

Now that I've moved over to python and I'm working on a code base that I
expect to last, and will get to work with others on, OOP makes a huge amount
of sense.

OO code may have more overhead to write (it does) but I find it much easier to
_read_ good OO code now that I understand the model, than I ever found
procedural code even as I grew comfortable with writing it.

As a number of others have mentioned, OO code is about architecture, and that
seems to be what matters when you're trying to grok other people's code. The
specifics of how they did it only matter when you start debugging or
optimizing. (In my limited experience to date.)

I look forward to really getting OO, writing lots of code, and then starting
to climb the functional programming mountain. But the poster is right, start
bare bones, start simple, and build from there!

------
freedrull
I think that a language for beginners should be as paradigm agnostic as
possible. Lua does not force OO or functional programming upon the user. And
unlike Ruby, there aren't a bunch of list-like datatypes like lists, arrays,
and hash tables to keep track of. There is just the table datatype which takes
care of all of those.

~~~
cageface
Lately I lament that Python, Perl, Ruby and PHP are the dominant scripting
languages instead of Lua. None of the former really justify their extra
complexity, IMO.

------
InclinedPlane
I think this is a bit silly. The problem isn't OOP, it's how OOP is taught,
and how OOP is abused.

To be honest most programming education is so terrible that it will still come
down to who has the talent and passion to figure things out on their own,
that's been true completely orthogonal to the presence or absence of OOP
education.

~~~
nickik
And that's why people write blog posts about it.

------
vrode
As a new programmer I felt blessed when I learned OOP.

I could actually write long and clean pieces of code without spending double
the time restructuring them. A clean and coherent structure is important for
understanding, and understanding your own code and remembering what it does
makes development more fruitful. Since new programmers often work iteratively,
adding new features to their code, this is positive rather than negative, and
the extra learning effort is always rewarded.

By knowing OOP you can access others' libraries in the language you
recommend. By using others' libraries and by altering them, you can create
more cool stuff faster.

------
knowtheory
Or you could use a language that uses object orientation in a more natural
way, such as Ruby or Scala, both of which are fundamentally OO languages.

You can do procedural-y things with both languages, much to their credit, but
packaging up functionality is both natural and intuitive.

Overbuilding APIs is not a sign that objects are bad. It's a hallmark of bad
design.

So, yeah, i agree, prototype a minimal solution and iterate, and teach others
to do this. This is an issue that is orthogonal to object orientation. At
least, it is in languages that don't make you jump through weird hoops (C++ &
Java, i'm looking in your general direction).

~~~
jerf
The delta between Python and Ruby is way, way smaller than the Ruby community
seems to think it is. You are aware that everything is an object in Python,
too, right? The idea that that isn't true still seems to be knocking around
the Ruby community. Everything he said about Python applies precisely to Ruby
(or fails to apply in exactly the same way if it is a bad argument), because
there's hardly any difference that matters to a newbie.

"But, but, _len_ is a function! And you can't monkeypatch the base classes
even though you can monkeypatch everything else!"

Yes, please, by all means burden your neophyte with that tirade. They'll
really appreciate it while they're trying to figure out what an "if" statement
really does or why calling a function with the wrong capitalization doesn't
work.

~~~
jonhohle
> You are aware that everything is an object in Python, too, right?

Python is missing encapsulation and message passing. Its object model seems
to be similar to something like PHP's or JavaScript's: objects are essentially
hashes. In practice, that's not a huge deal, but they aren't quite the same.

This is not important for beginning programmers, but for programmers thinking
about their toolset more information can only help.
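That "objects are essentially hashes" point is easy to see in Python, where
instance attributes normally live in a per-instance dict (the Point class is
just an illustration):

```python
class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

p = Point(1, 2)
p.__dict__              # {'x': 1, 'y': 2}: attributes are stored in a dict
p.__dict__["z"] = 3     # writing to the dict is visible as an attribute
p.z                     # and vice versa: attribute access reads the dict
```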

~~~
BobKabob
"Any traditional OOP programmer might tell you that essential elements of OOP
are encapsulation and message passing.

The Python equivalents are namespaces and methods.

Python doesn't subscribe to protecting the code from the programmer, like some
of the more BSD languages. Python does encapsulate objects as a single
namespace but it's a translucent encapsulation."

From: <http://www.voidspace.org.uk/python/articles/OOP.shtml>

~~~
GeneralMaximus
TBH, private fields are a stupid idea. It's just a cutesy concept that gives
the newbie an illusion of being in control. I've never seen a program
where a private field was actually _important_ to the integrity of the
program.

~~~
andrew1
I'd have to politely disagree; if your class has mutable state and you want to
maintain some invariant relationships amongst the elements of your class then
as far as I can see you really need to be able to specify that some operations
can only be performed within the class. If your class is immutable then I'd
agree that in most cases privacy offers you few benefits.

~~~
jerf
Yes, that's the theory. In practice it doesn't seem to work out so well.
Neither the promised benefits of using private fields nor the promised costs
of not having them actually manifest, but the costs of actually having the
private fields manifest in spades. In cost/benefit terms, the benefits are
almost entirely theoretical and the costs much higher than advertised. I
consider them a major net loss for OO.

~~~
andrew1
I wouldn't really agree that it doesn't work out in practice. For the systems
that I work on, the concept of privacy is extremely useful, perhaps we're
doing things 'the wrong way' to believe that but I don't think that's all that
likely.

What would you suggest as a better alternative, outside of a purely functional
approach? Just as an example, suppose that Java's LinkedList class didn't have
a cached size and that instead it went and iterated its elements to calculate
its size each time someone asked for it. I might write a wrapper around
LinkedList with two private variables, the LinkedList and an int to cache its
size in. Then I would update the int each time an operation was performed on
my member list. If I can't make those two members private, how can I ensure
that no one accidentally updates the list without updating the size, or vice-
versa. It mightn't be an unreasonable mistake for someone to think calling
mylist.size = 0 would empty the list. What is the 'safe' way of constructing
this concept?
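For illustration, the invariant can be sketched in Python, where privacy is a
leading-underscore convention rather than a compiler guarantee (CountedList
is a hypothetical name):

```python
class CountedList:
    """Wraps a list and caches its length. The underscore prefix marks
    _items and _size as private by convention: nothing outside the class
    should update one without the other."""
    def __init__(self):
        self._items = []
        self._size = 0

    def add(self, value):
        self._items.append(value)
        self._size += 1      # the invariant: _size == len(_items)

    def size(self):
        return self._size    # O(1), no re-count needed
```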

~~~
GeneralMaximus
> If I can't make those two members private, how can I ensure that no one
> accidentally updates the list without updating the size, or vice-versa. It
> mightn't be an unreasonable mistake for someone to think calling mylist.size
> = 0 would empty the list. What is the 'safe' way of constructing this
> concept?

Conventions. In Python, for instance, internal fields and methods start with
a leading underscore, '_' (a double leading underscore additionally triggers
name mangling). This is merely a convention, outlined in PEP 8. Documentation
tools don't pick up "private" fields, and even when they do, they don't list
them in the same section as the public fields.

You could argue about the safety of this approach, but at some point you have
to let go of the training wheels and let programmers make their own judgments.
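Concretely, both spellings can be demonstrated in a few lines (the Account
class and its values are made up). PEP 8 reserves the single leading
underscore for "internal use"; the double leading underscore additionally
triggers name mangling:

```python
class Account:
    def __init__(self):
        self._balance = 0      # single underscore: private by convention only
        self.__token = "s3cr"  # double underscore: mangled to _Account__token

acct = Account()
acct._balance          # accessible, but convention says "hands off"
acct._Account__token   # the mangled name the interpreter actually stores
```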

------
yason
This is very true. I write most of my Python code using only
procedural/functional constructs and limit the OO parts to a minimum and
that's the way I want to do it.

Now, I haven't known exactly _why I want_ to do this, except maybe that OO
constructs soon turn my code ugly but that's a gut-feeling argument which I've
kept to myself. Maybe I've unconsciously absorbed enough of that Java-hating
mentality from the atmosphere to let it show up in the way I write Python,
too. But maybe I've been on to something as well.

I'm glad to read about someone else thinking in the same way.

------
extension
_Modularity_ is a fundamental design principle that pre-dates computer
science: complex systems should be broken down into independent components
that interact through formal abstractions. It's very simple and obvious.
Everyone understands it on some level.

OOP is really just taking the principle of modularity and making it a first
class language feature. There's no reason not to teach it to a beginner. They
are going to be using the concepts right away anyway.

Also, OOP is ubiquitous in the real world and rookie coders need to get
involved with the real world as early as possible.

~~~
nickik
If you build a house you don't need to worry about the inside of the bricks.
Objects don't have that property; pure functions do.

Objects lose this property because they have inheritance. The other problem
is that everywhere in OO the data is mixed with the functions.

Rich Hickey on how it can be done right:
[http://www.infoq.com/presentations/Are-We-There-Yet-Rich-
Hic...](http://www.infoq.com/presentations/Are-We-There-Yet-Rich-Hickey)

~~~
trezor
_If you build a house you don't need to worry about the inside of the bricks.
Objects don't have that property; pure functions do._

I see this _exactly_ the other way around. Objects make sure you don't have
to worry about what's on the inside, but if you use pure functions you need to
know how the function is implemented, how it processes data, what input it
expects and what the output will be.

Using pure functions requires you to seek out lots of information and verify
implementation. Not that you shouldn't investigate this when using OOP, but in
OOP a lot more of this is formalized in class-definitions IMO making the
interaction easier to understand.

The way I see your analogy, is that if you use pure functions, the electricity
in your house needs to know what material the bricks are made of so that
current doesn't accidentally leak and cause a fire. With OOP this is a detail
you don't have to worry about as long as the pieces fit.

If that sounds somewhat flawed or too simplistic, I will take the liberty of
blaming the poor metaphor. I'm just pointing out that I see the exact same
metaphor in the exact polar opposite way ;)

~~~
merijnv
Why would I need to know the implementation and how it processes data?

The entire point of pure functions is that you don't need to know. Referential
transparency guarantees that a pure function given inputs X and Y always
returns the same output Z. So if I just follow the interface specification
about accepted inputs and accepted outputs I can switch in any arbitrary
implementation of that function and have my code keep working.
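A small illustration of that guarantee; both implementations below are
hypothetical, but either can be swapped for the other by a caller that only
knows the contract:

```python
def sum_squares_loop(xs):
    """Pure: output depends only on xs, no state is read or written."""
    total = 0
    for x in xs:
        total += x * x
    return total

def sum_squares_functional(xs):
    """A different implementation of the exact same contract."""
    return sum(x * x for x in xs)

# Referential transparency: same inputs, same output, so callers can
# switch implementations without re-reading either body.
assert sum_squares_loop([1, 2, 3]) == sum_squares_functional([1, 2, 3]) == 14
```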

~~~
trezor
_So if I just follow the interface specification about accepted inputs and
accepted outputs I can switch in any arbitrary implementation of that function
and have my code keep working._

The funny thing is that this is what I would say about OOP, while for
functional solutions I often find the specifications much looser, so that I
cannot trust this to always be true.

I'm not saying you are right and I am wrong. But I am saying that the
arguments about what holds true for what, and especially the arguments for
"OOP is bad" or "FP is bad" etc, they just seem to be highly subjective.

And unless someone can come up with empirical evidence to show that one side
is actually better/right, I find this debate both pointless and rather
amusing. Pointless in that it doesn't give more insights. Amusing in the sense
that it makes people reveal lots of the preconceptions and prejudice against
things they seemingly don't fully grasp.

~~~
nickik
Wait, how do you have to know about the implementation of the function? Think
of something like quicksort, or a partition function: (partition 2 [1 2 3 4])
=> ([1 2] [3 4]). How do you have to know anything about that function's
implementation? Sure, if the function has a name like myfunction and no docs
then you have to look at the code, but that's the same with OO.

In Clojure you can say (doc anyfunction) and you will get a description of
what the function does, and it does only that.

You describe the perfect case in OO, where you have only objects with
immutable members and pure methods. :)

In the real world you often see that something does not work anymore because
in some other object some variable changed, or that it worked at first but
broke after a variable somewhere changed. Inheritance adds a whole set of new
problems on top of that.

I'm not saying you can't make a good design with objects. I'm saying that
with an FP style it's easy to do the right thing, while with objects it's
harder to do the right thing.

That's as good as I can argue in a comment.

~~~
trezor
_Think of something like quicksort or a partition function: (partition 2 [1 2
3 4]) => ([1 2] [3 4]). Why would you need to know anything about that
function's implementation?_

Well. I need to know that it accepts two parameters, first being an integer,
the second being an array of integers.

In an OOP solution this would at least be exposed by type signatures,
something you don't always see in FP solutions (often due to type-inference).
Hence you need to check the implementation.

And this is for a simple example. What about a more complex one, where the
input data has a more complex nature? Take the following example:

    
    
        var data = [ { id: 1, value: 2 }, { id: 2, value: 3} ];
        var ordered = orderByValue(data);
    

Ignoring the "var data ="-line: Without checking the implementation, how would
you know how the input-data should be formatted? What types and properties are
needed, and in what format the function accepts the data? A seq? A list? An
object with properties? You don't.

In C# the same function would probably be contained in a relevant class and
have a signature akin to the following:

    
    
        IEnumerable<ValueHolder> OrderByValue(IEnumerable<ValueHolder> data)
    

Now I _know_ what it returns and what it expects, and I don't have to worry
about it. The type signature tells me everything; the types it expects tell me
everything. Moreover, this is probably already implemented on a specialized
collection class, so all I need to do is:

    
    
        var data = new ValueHolderList();
        // populate
        var ordered = data.OrderByValue(); // notice -pure- implementation in OOP ;)
    

Again: the interface tells me what I need to know. Details are blackboxed and
abstracted, and the objects are easy to work with. I don't have to worry about
functions, context, what they expect, or in which order the glue is expected.
In FP you are more commonly exposed to the internals of things and need to
figure them out yourself.

I'm not saying I am 100% right and you are 100% wrong. I'm saying there is
lots of grey here which this thread doesn't really seem to cover or
acknowledge.

FP is not a silver bullet, and neither is OOP. FP has strengths. So does OOP. Lots
of the "weaknesses" I see people complain about with regard to OOP here are
what I consider weaknesses in FP and strengths of OOP.

I sometimes wonder if we are living on the same planet.

~~~
swift
In both OOP and FP, to know what a function (or a method) does, you need (at a
minimum) to check its type signature. There are _many_ OOP languages which do
not use explicit type signatures, and many FP languages which do; it seems to
me that you have confused the OOP vs. FP issue with the issue of explicit vs.
implicit typing, or perhaps with static typing vs. dynamic typing. There are
languages available to suit pretty much every combination of those properties,
so you can easily avoid whatever you don't like.

Further, when you check a type signature, what you read is much more valuable
in a pure FP context than in a typical OO context, because OO languages
generally (1) allow and encourage the use of state, and (2) do not distinguish
in the type system between functions/methods that use state and those that
don't. The type signature of a pure function strongly constrains what that
function can actually do - so much so that it's possible, and effective, to
look up the function you need just by specifying the type that you expect it
to have. In a typical OO language, the type signature indicates much less
about a method's behavior, because its inputs include not only the parameters
you provide, but also the entire "world" at the time it is invoked; similarly,
its outputs include the entire "world" in addition to its return value. As an
example, a pure function that takes no parameters can only be a constant, but
an impure method that takes no parameters could play a song on the speakers,
display a window on the screen, or launch a missile.

FP and OOP certainly have both strengths and weaknesses. I suspect one reason
that OOP catches so much flak around here is that people are more familiar
with it, and the flaws in tools you're familiar with are easier to see.
Unfortunately, when you're not familiar with a tool, it can also be easy to
see flaws - flaws that aren't really flaws at all, but simply aspects of the
tool you don't yet understand. The result of this, I think, is that one should
ignore criticisms of OOP from people who aren't deeply familiar with it, and
similar for shallow criticisms of FP. Unfortunately, there are a lot of both
on Hacker News.

------
jarin
I didn't even know this was a big, controversial issue still. Coming from a
PHP, Ruby, and Obj-C background (and having taken C and Java classes in
college), I always saw it as "use objects when you want things to be easy, use
C when you want things to be fast".

Of course, being primarily a web developer and not having implemented a linked
list or a bubble sort since college, the anti-OO people probably wouldn't
consider me a "real" programmer anyway.

~~~
bigfudge
This is a serious question: is there ever an occasion when someone working on
a real-world project should be implementing linked lists? I've no experience
with C, but surely this stuff comes from libraries now?

~~~
JoachimSchipper
The C world has various reusable linked lists, but yes, there is occasionally
a reason to implement one yourself: a {data, next} tuple usually fits in a
single cache line, whereas some libraries only offer {data pointer, next}
tuples, which require loading the memory that <data pointer> points to.
The performance difference _can_ matter.

That said, some more agreement on reusable code may be useful, especially for
things that aren't as easy to implement as linked lists.

------
mtraven
Hm, nobody seems to have yet mentioned that OOP was practically _invented_ as
a teaching tool for novice programmers:
[http://samizdat.cc/shelf/documents/2004/08.02-historyOfSmall...](http://samizdat.cc/shelf/documents/2004/08.02-historyOfSmalltalk/historyOfSmalltalk.pdf)

~~~
nickik
Atomic power was invented to build bombs. Do you really think we should use
everything only for what it was invented for?

------
r00fus
This OO complexity becomes more pronounced once you start talking about
serious persistence, which today, likely means SQL.

Has the object/relational problem been solved? Last I checked, it's still a
confusing and complicated inelegant hack-job to reconcile
inheritance/polymorphism vs. relational storage schemes.

------
Tycho
I just think of it in terms of OO being important for design patterns. That's
when I'd 'use it.' I wouldn't bother trying to explain the theoretical
underpinnings to a beginner; the definitions passed around in these kinds of
debates are hilariously confusing.

------
Apocryphon
My obligatory link to Joel Spolsky's lament on JavaSchools and championing of
C:
[http://www.joelonsoftware.com/articles/ThePerilsofJavaSchool...](http://www.joelonsoftware.com/articles/ThePerilsofJavaSchools.html)

------
erehweb
I'd start with BASIC. For real beginners and kids, GOTO is their friend.
[http://erehweb.wordpress.com/2010/06/24/goto-considered-
help...](http://erehweb.wordpress.com/2010/06/24/goto-considered-helpful/)

------
nwjlyons
I completely agree. I wish I was taught Python or Ruby at University instead
of Java.

------
jpr
If only someone told me what on earth they mean by OOP...

