
New release of Self programming language - russellallen
https://blog.selflanguage.org/
======
ChuckMcM
Wow. For those who don't know, there was a big 'language bake-off' at Sun
between Tcl, Java, and Self (all being funded by Sun Labs). Bert Sutherland
(then director of Sun Labs and brother of Ivan) required that language
changes stop so that we could look at each one and decide which to move
forward with. When the world sort of exploded at the WWW conference held in
Darmstadt, Germany in 1995, Java officially 'won' and both Self and Tcl were
de-committed (not canceled per se, but not getting any more funding either).

I like to think that all three languages benefited from the competition.

~~~
pron
I believe that much of the work Sun had done on Self found its way to Java in
the form of HotSpot.

~~~
hipitihop
Not to take anything away from Self, but with regard to HotSpot, a more
accurate history attributes the lineage to Smalltalk and the Animorphic
team. Some
details here
[https://en.m.wikipedia.org/wiki/HotSpot](https://en.m.wikipedia.org/wiki/HotSpot)

~~~
pron
Right. Sun incorporated Animorphic's work into Self, and later purchased
Animorphic; the technology became known as HotSpot.

------
bhk
Can we ever get away from the misnomer of "prototype-based OO"?

In Self we have two ways of specialization:

  1. Prototypes
  2. Parents

Prototypes are objects that are "cloned" to create instances, a very simple
and direct notion of inheritance. We create an object (a "prototype") with
properties that apply to a larger set of objects, and then for each instance
we copy (clone) it and add or modify properties as necessary. Self had
optimizations to deal with this, so instances did not end up being very fat.

Parents are objects that other objects "inherit from". At run-time property
lookups are delegated to a parent. The difference between a parent and a
prototype is that changes to the prototype do NOT affect the derived
instances. Changes to a parent DO affect the derived instances.
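
A minimal sketch of the distinction in TypeScript terms (not Self syntax;
the names here are illustrative), assuming the ordinary JavaScript object
model:

    const original = {
      greeting: "hello",
      greet() { return this.greeting; },
    };

    const copy = { ...original };           // "prototype" in the above sense: slots copied up front
    const child = Object.create(original);  // "parent": lookups delegated at run-time

    original.greeting = "goodbye";          // mutate the original object

    console.log(copy.greet());   // "hello"   -- the clone kept its own slot
    console.log(child.greet());  // "goodbye" -- delegation sees the change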

So, when I read about "class-based" versus "prototype-based" languages, I
cringe. It is really "class-based" versus "parent-based". How did cloning get
confused with run-time delegation?

Self introduced the notion of self-describing instances. That is the essential
coolness. The simplifying notion.

[http://www.selflanguage.org/_static/published/self-power.pdf](http://www.selflanguage.org/_static/published/self-power.pdf)

~~~
DonbunEf7
Relax already.

We say "prototype" to indicate an important feature of Self which its children
(Python, JavaScript, etc.) inherited: Unlike classes, prototypes are open for
modification. The precise mechanisms differ between the different languages in
Self's family, but they all have that feature in common.

Compare and contrast with other Smalltalk children, like Java or E, where this
isn't possible because classes are closed. (E doesn't have classes, but it has
a similar property, in that object literal scripts are closed for
modification.)

~~~
jecel
The problem is that there are two groups of languages that use "prototypes",
and using the same word for slightly different concepts causes confusion.

In the original Lieberman paper[1] you created new objects that were empty and
inherited from the prototype. Then you added local slots (name/value pairs) to
the object for anything that was different from the prototype. Languages based
on this concept often have two kinds of assignments: one that adds a new local
slot and another that just replaces the value of the old slot (either local or
inherited).

In Self, on the other hand, a prototype is a template for an object you want
and you clone it to get a new object. Once it has been cloned, there is no
longer any relation between the prototype and the new object and they evolve
separately. These kinds of languages tend to have only one way to assign
values to slots.
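
A hedged sketch of the two assignment flavors in the Lieberman model, written
in TypeScript rather than any of the languages above; assignInPlace is a
made-up helper for illustration, not a real API:

    const proto: any = { sound: "tweet" };
    const obj: any = Object.create(proto);  // starts empty, delegates to proto

    // Flavor 1: ordinary assignment adds a local slot that shadows proto.
    obj.sound = "squawk";                   // proto.sound is still "tweet"

    // Flavor 2: replace the slot wherever it actually lives, even in a parent.
    function assignInPlace(o: any, key: string, value: unknown): void {
      let holder = o;
      while (holder !== null &&
             !Object.prototype.hasOwnProperty.call(holder, key)) {
        holder = Object.getPrototypeOf(holder);
      }
      (holder ?? o)[key] = value;
    }

    const other: any = Object.create(proto);
    assignInPlace(other, "sound", "chirp"); // finds the slot in proto, replaces it there
    console.log(proto.sound);               // "chirp" -- every delegating object sees it

In the Self model there is no delegation link back to the prototype after
cloning, so only the first flavor exists: assignment always writes to the
clone's own slot.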

Many languages that claimed to follow the Self model (like NewtonScript and
Io) actually use the Lieberman model instead. In either case the prototypes
are fully functioning examples of the object you want to create new instances
of. So, unfortunately, it is natural that one word is used for both. But this
results in very confusing discussions when someone is talking about a language
with one model while thinking about the other model instead.

[1] [http://web.media.mit.edu/~lieber/Lieberary/OOP/Delegation/Delegation.html](http://web.media.mit.edu/~lieber/Lieberary/OOP/Delegation/Delegation.html)

------
krallja
Self (along with Scheme) was supposedly one of the big influences on Brendan
Eich when he was creating JavaScript. Neat to know it's still out there.

~~~
goatlover
Wonder how JS would have gone if it had fully embraced its prototypal nature
instead of hiding it behind Java-like syntax. For as long as JS has been
around, there were people clamoring for classes in the language, or thinking
that it already had classes, or implementing a class-like structure of their
own, until ES6 finally added class syntax.

But it didn't have to be that way. And now the sentiment is that OOP is bad,
and inheritance is evil, and classes are the worst, forcing one to predefine
a taxonomy that's likely to need refactoring.

But prototypal languages can be easily changed. Just change the parent
slot(s), or modify the object itself, etc.
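
For instance, a minimal TypeScript sketch of run-time re-parenting (the
names here are invented for illustration):

    const walker = { move() { return "walks"; } };
    const swimmer = { move() { return "swims"; } };

    const duck: any = Object.create(walker);  // duck initially delegates to walker
    console.log(duck.move());                 // "walks"

    Object.setPrototypeOf(duck, swimmer);     // re-parent the live object
    console.log(duck.move());                 // "swims" -- no class hierarchy to refactor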

~~~
chubot
I honestly don't get what's better about prototypes than classes. In
particular, I don't see that the prototypal style is supported empirically.

Are there any codebases around 100K to 1M lines of code written in a
prototypal style that are actually in production use?

You can claim thousands of such systems for classes. In that sense, classes
are a success. That people write horrible class-based code isn't a knock
against them. People also write horrible procedural code. Most code is bad.
But there is some code with classes that is very good.

There was sentiment in the '90s and early 2000s that OOP is bad. I think the
world has learned how to use classes since then -- e.g. no more large
inheritance chains and fragile base classes. Not everything is an object --
some things are just functions, and some things are just data with no
behavior.

As far as I can see, prototypes are worse along all dimensions than classes.

~~~
nostrademons
Prototypes and metaclasses are equivalent in power, and both are more powerful
than straight classes. With them you can do things like dynamically change the
set of methods that an object responds to, something you have to fake in a
class-based system with the State or Strategy pattern. Also, because
prototypes are just ordinary objects, you can do even fancier stuff like store
the set of possible prototypes in a data structure and dynamically change them
based on a lookup. Things like Django's ORM (where all the object has to
define is a set of fields as member variables, and then it magically gets a
bunch of methods for DB query/insert/update) are trivially easy to build with
a prototype-based object system, and you could go further and add another
object to the prototype chain to get JSON serialization, or another one for
protobufs, and swap these out at run-time.
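
A hedged sketch of that last idea in TypeScript (the behavior objects and
names are invented for illustration):

    const jsonBehavior = {
      serialize(this: any) {
        return JSON.stringify({ id: this.id, name: this.name });
      },
    };
    const debugBehavior = {
      serialize(this: any) { return `<${this.name}#${this.id}>`; },
    };

    // The candidate prototypes live in an ordinary data structure...
    const behaviors: Record<string, object> = {
      json: jsonBehavior,
      debug: debugBehavior,
    };

    // ...and an object picks one by lookup, swappable at run-time.
    const record: any = Object.create(behaviors["json"]);
    record.id = 42;
    record.name = "widget";
    console.log(record.serialize());   // {"id":42,"name":"widget"}

    Object.setPrototypeOf(record, behaviors["debug"]);
    console.log(record.serialize());   // <widget#42>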

However, the flip side of this is that more freedom does not always make for
more readable software. GOTO, for example, can express any control flow that
for/while/do-while/if/switch can, and a number that they cannot
(coroutines/exceptions/etc.), but as an industry we've moved away from GOTO
because most programmers can't hold that flow control in their heads.
Prototypes are the same way: they grant a lot of freedom to implement fancy
abstractions, but many programmers seem to be unable to understand the
resulting abstractions, so they don't find widespread use.

Asking for codebases of 100K-1M lines in prototype-based languages is the
wrong question. Because prototypes let people define abstractions that other
programmers find unreadable, they (a) let you write equivalent software in
fewer lines of code and (b) get rewritten in class-based languages as soon as
you have more than a handful of programmers working on the codebase. They're
much more likely to be used by a small team of hackers who sell their startup
for $40M or so and then vest in peace while another team rewrites all their
code in Java than by a big company.

If you broaden the question to "has anyone ever made significant money working
on or with Self", the answer is yes:

[http://merlintec.com/old-self-interest/msg01011.html](http://merlintec.com/old-self-interest/msg01011.html)

(Fun fact: Urs Hoelzle, Animorphic's CTO and sender of the second message in
that thread, later went on to become employee #9 and the first executive hire
at Google.)

~~~
chubot
Thanks for the response, but I wouldn't call that empirical support for
prototypes. They took JIT technology developed for Self and applied it to a
class-based language, Java. That doesn't mean that prototypes are good for
writing programs. It actually indicates the opposite, because the technology
from Self that ended up being deployed turned out to be something else.

(I also worked at Google and am somewhat familiar with the Self to HotSpot to
V8 heritage.)

Your second paragraph is exactly what I think is wrong with prototypes.
There's not enough structure, and not enough constraint. Constraints are
useful for reasoning about programs. You might as well just have a bunch of
structs and function pointers (and certainly many successful programs are
written that way).

Having every application roll its own class system is a terrible idea, in
theory and in practice. In practice JavaScript programs end up with multiple
class systems because of this. The Lua ecosystem also has this problem.

It's analogous to every library in C rolling its own event loop or thread
pool, leading to a fragmentation of concurrency approaches. Go and Node.js
both unify the approach to concurrency so every application doesn't end up
with 3 different concurrency abstractions.

I don't buy your 3rd paragraph. Python is class-based; it has the
characteristic you're talking about with respect to a small team of hackers;
and that has been empirically supported by hundreds or thousands of startups
being acquired (Instagram, etc.) and even huge companies created (Dropbox).

I honestly think prototypes have failed in the marketplace of ideas and
there's a good reason for that. I use metaclasses all the time but not for
dynamically changing sets of methods. That seems like a horrible idea. The way
I use them is for generating types from external sources like CSV/SQL/protobuf
schemas.

~~~
nostrademons
Python has metaclasses, and they're used extensively in many of the frameworks
that were critical for those startups (Django etc.). They're coming to
JavaScript too, in the form of ES6 proxies, which have been long awaited.
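
A minimal sketch of the kind of trick proxies allow, in TypeScript; the
findBy<Field> convention is made up here, loosely echoing the Django ORM
example above:

    const rows = [
      { id: 1, name: "ada" },
      { id: 2, name: "grace" },
    ];

    // The proxy intercepts property lookups and synthesizes finder
    // methods on demand instead of defining them ahead of time.
    const table: any = new Proxy({}, {
      get(_target, prop) {
        if (typeof prop === "string" && prop.startsWith("findBy")) {
          const field = prop.slice("findBy".length).toLowerCase();
          return (value: unknown) =>
            rows.filter(row => (row as any)[field] === value);
        }
        return undefined;
      },
    });

    console.log(table.findByName("ada"));  // [ { id: 1, name: 'ada' } ]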

I actually think metaclasses are probably a better solution than prototypes,
because they let you write the majority of your code in a class-based style
and only incorporate funky abstractions when they're really necessary, which
is typically infrequent. My point is that prototypes are strictly more
powerful than (non-metaclass) class-based systems, and that this power lets
you build abstractions that can dramatically decrease the number of lines of
code you need to write for an initial system.

------
patrec
Does anyone here have examples of things that are more neatly expressed in
Self than in JavaScript and the other obvious suspects (e.g. Python, Racket,
Common Lisp, Smalltalk)?

~~~
doublec
I'm not sure about comparing to those languages, but I did a post comparing
other prototype-based languages:
[https://bluishcoder.co.nz/2009/07/16/prototype-based-programming-languages.html](https://bluishcoder.co.nz/2009/07/16/prototype-based-programming-languages.html)

For me, though, it's not about the Self language but about the combination of
the language and environment. I did a screencast attempting to show some of
how Self development is done:
[https://bluishcoder.co.nz/2015/11/18/demo-of-programming-in-self.html](https://bluishcoder.co.nz/2015/11/18/demo-of-programming-in-self.html)

------
analognoise
No Windows release. Hmm.

~~~
russellallen
It's on the list, but it's a big task because of the Unix assumptions built
into the VM and the need to redo the graphics backend, both as primitives and
to hook it into the Self-level GUI framework.

