

Objects vs closures - prog
http://people.csail.mit.edu/gregs/ll1-discuss-archive-html/msg03277.html

======
shawndumas
I invoke the ko rule...

Situational Cycle: a move sequence that starts and ends at the same position
such that at the end of that cycle it is the same player's turn as it was at
the start. If cycles are allowed without restrictions on them, it is possible
for a game to go on indefinitely. [1]

The 'ko' situation is the most common cycle that occurs in the game of go.

Ko: a situation where two alternating single stone captures would repeat the
original board position. The alternating captures could repeat indefinitely,
preventing the game from ending. The ko rule resolves the situation. [2]

\----

[1]: <http://senseis.xmp.net/?Cycle>

[2]: <http://senseis.xmp.net/?Ko>

------
nadam
I like the approach in Scala: every function is an object, and every object
that provides an apply() method can be used as a function.

~~~
sverrejoh
Same as in Python, except apply() is called __call__()
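For illustration, a minimal sketch of what that looks like in Python (the class name is made up):

```python
class Adder:
    """An object that acts as a function via the __call__ protocol."""

    def __init__(self, n):
        self.n = n

    def __call__(self, x):
        # Invoked when the instance itself is called like a function.
        return self.n + x

add3 = Adder(3)
print(add3(4))  # → 7; the instance is called directly
print(callable(add3))  # → True
```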

------
jaekwon
I like the koan at the end. Now if only someone could hit me on the head such
that I understand the duality of electricity and magnetism...

~~~
eru
Feynman explains it nicely in his lectures on physics. The key is relativity.
(Magnetism is a relativistic effect of electricity.)

[http://en.wikipedia.org/wiki/Classical_electromagnetism_and_...](http://en.wikipedia.org/wiki/Classical_electromagnetism_and_special_relativity)
may be interesting. Especially the section "Relationship between electricity
and magnetism".

~~~
J3L2404
Can someone summarize how magnetism is a relativistic effect?

~~~
weber88
The explanation given here is pretty decent:

[http://en.wikipedia.org/wiki/Classical_electromagnetism_and_...](http://en.wikipedia.org/wiki/Classical_electromagnetism_and_special_relativity)

The main thing is to consider how charge densities change as we switch
reference frames. Because lengths contract when we switch to a reference
frame where things are in motion, we see higher charge densities (the same
charge in less length).

A simple demonstration: say we have a charge-neutral, current-carrying wire.
We model it as a bunch of positive charges staying still and some negative
ones moving. The positive and negative charges balance, but there is still a
current because only the negatives are moving. Now imagine we switch to a
different reference frame, one where the negative charges aren't moving but
the positive ones are: we are flying along parallel to the current. The
spacing between the positive charges now contracts relativistically, so the
density of positive charges is greater than that of the negatives, and hence
the wire appears to carry a net charge.
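A back-of-the-envelope version of that argument, with a wildly exaggerated drift speed just to make the numbers visible (this is my own sketch, not from the linked article):

```python
import math

def gamma(v, c=1.0):
    # Lorentz factor; moving lengths contract by 1/gamma.
    return 1.0 / math.sqrt(1.0 - (v / c) ** 2)

# In the wire's rest frame: equal and opposite line charge densities.
lam_pos = 1.0    # positive ions at rest (arbitrary units)
lam_neg = -1.0   # electrons drifting at speed v
v = 0.5          # drift speed as a fraction of c (hugely exaggerated)

# Boost to the electrons' rest frame: the positive ions now move, so
# their spacing contracts and their density rises by gamma; the
# electrons, now at rest, spread out and their density falls by gamma.
g = gamma(v)
lam_pos_boosted = lam_pos * g
lam_neg_boosted = lam_neg / g

net = lam_pos_boosted + lam_neg_boosted
print(net)  # > 0: in this frame the wire carries a net positive charge
```

That leftover electrostatic force on a nearby moving charge is exactly what the wire's rest frame describes as a magnetic force.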

------
noblethrasher
A closure is also a poor man's class (sort of).

I've often wished that C# let me implement interfaces with anonymous classes.
Since it doesn't, I get around it by creating AdHoc classes for interfaces
that I use a lot.

For instance, if I have an interface Foo defined as:

    
    
        interface Foo
        {
            string Bar(int x);
        }
    

then I also create a corresponding class called AdHocFoo defined like so:

    
    
        class AdHocFoo : Foo
        {
            Func<int, string> _bar;
    
            public AdHocFoo(Func<int, string> bar)
            {
                _bar = bar;
            }
    
            public string Bar(int x)
            {
                return _bar(x);
            }
    
        }
    

I've written a little program that creates these AdHoc class definitions from
interfaces.
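For comparison, the same wrapper idea sketched in Python rather than C# (names mirror the snippet above but the code is mine): a one-method "interface" gets an ad hoc implementation by closing over a plain function value.

```python
class AdHocFoo:
    """Wraps any int -> str function so it satisfies a Foo-like
    single-method interface. Analogous to the C# AdHocFoo above."""

    def __init__(self, bar):
        self._bar = bar  # the supplied function value

    def bar(self, x):
        # Delegate the interface method to the wrapped function.
        return self._bar(x)

foo = AdHocFoo(lambda x: f"got {x}")
print(foo.bar(42))  # → got 42
```

In Python the wrapper is rarely needed (duck typing means any object with a `bar` method would do), which is part of the commenter's point about what C#'s nominal interfaces make you write by hand.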

------
beza1e1
"so closures can be, and are, used to implement very effective objects with
multiple methods"

I don't really believe this to be efficient. The linked text by Kiselyov
implements the dispatch via Scheme's (case ...) expression. Efficient dynamic
dispatch means one indirection through a vtable, so one load instruction more
compared to a normal function call. Which compilers for functional languages
can perform this optimization?

Since we are talking about Scheme here, we could compare the dispatch to
dynamic languages like Python or Ruby, where the dispatch means looking up a
string in a hashmap. I'm willing to believe that Scheme's case can keep up
with that.
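For concreteness, the dispatch style under discussion looks roughly like this when transliterated into Python (a hypothetical closure-object, not Kiselyov's actual code): the "object" is a closure over its state, and method lookup is a conditional on the message name rather than a vtable load.

```python
def make_counter(start=0):
    """A closure-based 'object' with two methods, dispatched by name."""
    state = {"n": start}  # mutable state captured by the closure

    def dispatch(msg, *args):
        # Dispatch is a chain of comparisons -- the analogue of
        # Scheme's (case ...) that the comment is questioning.
        if msg == "inc":
            state["n"] += args[0] if args else 1
            return state["n"]
        elif msg == "get":
            return state["n"]
        raise ValueError(f"unknown message: {msg}")

    return dispatch

c = make_counter()
c("inc")
c("inc", 5)
print(c("get"))  # → 6
```

Each call pays for the comparison chain, which is the cost the comment contrasts with a single vtable indirection.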

~~~
swannodette
Clojure. Clojure has a fairly interesting feature called reify:

    
    
      (defprotocol Foo
        (bar [this])
        (baz [this]))
    
      (defn make-foo [a]
         (reify Foo
                (bar [this] a)
                (baz [this] a)))
    
      (dotimes [_ 10]
        (let [x (make-foo 'x)]
          (time
            (dotimes [_ 1e8]
              (bar x)
              (baz x))))
    

So 2 methods called 1 billion times only takes about 500ms on my 2.66GHz i7
MacBook Pro.

(of course, Clojure prefers values+behaviors, aka objects, without mucking it
up with state)

~~~
supersillyus
It seems very likely to me that the JVM is recognizing that "bar" and "baz"
don't do anything and (after some warm-up) optimizing them away.
Microbenchmarking the JVM is hard.

~~~
swannodette
Not true. Timings are completely different if you take a method out. I'm not
saying that the JVM is not doing _any_ magic here - but here is a closure that
gets method dispatch as fast as the host can provide.

~~~
supersillyus
You're probably right. I took 500ms for 1B iterations and saw that you're
looking at ~0.25ns a call, and that seemed a bit low. However, based on your
code, you ran it 100M times, not 1B (1e8 vs 1e9). That changes it to 2.5ns per
call. I ran the code on my machine (similar to yours, Macbook Pro 2.66ghz)
with Clojure 1.2.0, and I got just over 2500ms for 1e8 iterations, which is
about 12.5ns per call.

For comparison, looping 1e8 times with two calls to empty functions in a
static language takes ~639ms, which gives me ~3ns per call. So, you can see why my
first suspicion was that the JVM was doing something like just inlining the
methods and avoiding the call altogether. Considering the differences in our
reported numbers, you may have a newer JVM than me, and if it is beating
simple CALL instructions, it must be inlining them or avoiding some of the
looping.

------
doki_pen
I wish _why hadn't left potion behind. One of the coolest things about
potion is that everything is a function. For example, if you have an integer
named a with a value of 1, then a() will return 1. I hadn't seen that in
any other language. Are there other languages with this feature?
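A rough emulation of that uniform-call idea in Python (an illustration of the semantics being described, not how potion actually implements it): wrap values so that calling one with no arguments yields the value itself.

```python
class Value:
    """A wrapped value that, like a potion integer, is also callable."""

    def __init__(self, v):
        self.v = v

    def __call__(self):
        # Calling the value with no arguments returns the value.
        return self.v

a = Value(1)
print(a())  # → 1
```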

