
The origins of Objective-C at PPI/Stepstone and its evolution at NeXT - fanf2
https://dl.acm.org/doi/10.1145/3386332
======
herodotus
Good article, so glad to find it here. I wish, though, that the authors had
talked a lot more about memory management in Objective-C, because this had a
major impact, not only on the development of Objective-C, but also on Swift.
As far as I can tell, the idea of (explicit) reference counting came from
NeXT, when Blaine Garst and others were working on remote objects. When I
joined Apple (and therefore had to learn Objective-C), the most important task
was to understand reference counting and memory management. Yes, there was a
lot more, like interfaces, properties and the many libraries, but
understanding alloc/new/release/autorelease was crucial for shippable
software. GC was introduced (and deprecated) during my time, but it was the
realization (presumably by Lattner and his team) that the LLVM analyzer could
do a better job than humans at managing reference counting that was huge. No
more hours and hours hunting down memory leaks! And, of course, only one
concept required in Swift, namely weak vs. strong references.
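The explicit reference-counting discipline described here can be sketched in plain C with a hypothetical refcounted object (illustrative names, not Apple's API): every owner balances a retain with a release, and the last release frees the memory.

```c
#include <stdlib.h>

/* A hypothetical refcounted object, illustrating the manual
   retain/release discipline (not the real NSObject machinery). */
typedef struct {
    int refcount;
    /* ... payload would go here ... */
} Object;

Object *object_alloc(void) {
    Object *obj = malloc(sizeof *obj);
    obj->refcount = 1;            /* the caller owns one reference */
    return obj;
}

void object_retain(Object *obj) {
    obj->refcount++;              /* a new owner claims a reference */
}

void object_release(Object *obj) {
    if (--obj->refcount == 0)     /* the last owner frees the memory */
        free(obj);
}
```

What ARC (and before it, the static analyzer) automates is exactly the bookkeeping of pairing every `object_retain` with an `object_release` on every code path.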

In summary, a terrific paper, but it is too bad that this crucial part of the
story is missing.

~~~
asveikau
> As far as I can tell, the idea of (explicit) reference counting came from
> NeXT,

Maybe in objc, but in the general case I kinda doubt this. It's a very common
pattern in kernel development, for example; I haven't done any archeology here,
but I have seen it in codebases rooted in the 80s at the latest, probably
earlier. (Did original Unix reference count file objects in the FD table the
same way a modern kernel would? Would not be surprised if yes.)

Autorelease is pretty unique and clever, though.
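The autorelease idea — deferring a release until a pool drains, so a function can return an object without the caller needing to retain it immediately — can be sketched in hypothetical C (illustrative names, far simpler than the real NSAutoreleasePool):

```c
#include <stdlib.h>

/* Minimal refcounted object for the sketch. */
typedef struct { int refcount; } Obj;

/* A single fixed-size pool, for illustration only; the real
   mechanism is a thread-local stack of nestable pools. */
Obj *pool[64];
int  pool_len = 0;

Obj *obj_alloc(void) {
    Obj *o = malloc(sizeof *o);
    o->refcount = 1;
    return o;
}

void obj_release(Obj *o) {
    if (--o->refcount == 0)
        free(o);
}

/* "autorelease": promise a future release without giving up the
   pointer yet -- handy for returning ownership to a caller. */
Obj *obj_autorelease(Obj *o) {
    pool[pool_len++] = o;
    return o;
}

/* Draining the pool performs all the deferred releases at once,
   typically at the end of an event-loop iteration. */
void pool_drain(void) {
    for (int i = 0; i < pool_len; i++)
        obj_release(pool[i]);
    pool_len = 0;
}
```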

~~~
pmcjones
I believe this is the first paper to describe reference counting:

George E. Collins. A method for overlapping and erasure of lists. Commun. ACM
3, 12 (Dec. 1960), 655–657.
[https://doi.org/10.1145/367487.367501](https://doi.org/10.1145/367487.367501)

------
gdubs
This is a wonderful bit of history, really grateful to have it. I remember
reading an interview with Steve Jobs in Wired magazine in the 90s where he
talked about the future being all about Objects – "The Next Insanely Great
Thing" [1]. I built my career on top of Objective-C. These days I really enjoy
Swift, but I still have a space in my heart for the dynamic beauty of
Objective-C.

1:
[https://www.wired.com/1996/02/jobs-2/](https://www.wired.com/1996/02/jobs-2/)

~~~
pixelrevision
Swift is great but is starting to suffer from feature bloat. I really loved
the simplicity of Objective-C. If you can get past the syntax, you find a very
thin wrapper on top of C that lets you do really fun things. I don’t think
I’ve ever seen another language where people followed the same patterns so
consistently either.
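That "thin wrapper" quality comes from the fact that a message send compiles down to an ordinary C function call through the runtime's dispatcher. A toy version of selector-based dispatch might look like this (hypothetical names; the real `objc_msgSend` uses hashed method caches and hand-written assembly):

```c
#include <string.h>
#include <stddef.h>

/* Toy sketch of selector-based dynamic dispatch: a class is a table
   mapping selector names to C function pointers, looked up at
   message-send time. Far simpler than the real runtime. */

typedef void *(*IMP)(void *self);   /* a method implementation */

typedef struct {
    const char *selector;
    IMP imp;
} Method;

typedef struct {
    Method methods[8];
    int method_count;
} Class;

typedef struct {
    Class *isa;   /* every object points at its class */
} ToyObject;

/* The "message send": look the selector up in the class, then
   make a plain C function call. */
void *toy_msgSend(ToyObject *self, const char *selector) {
    Class *cls = self->isa;
    for (int i = 0; i < cls->method_count; i++)
        if (strcmp(cls->methods[i].selector, selector) == 0)
            return cls->methods[i].imp(self);
    return NULL;   /* real ObjC would try message forwarding here */
}

/* An example method: just a plain C function with the IMP shape. */
void *hello_imp(void *self) { (void)self; return "hello"; }
```

Because the lookup happens at runtime by name, you get the dynamism the parent comment describes — categories, swizzling, proxies — all on top of plain C calling conventions.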

~~~
justinmeiners
> where people followed the same patterns so consistently

so true. This is one ecosystem where you could actually just include another
person's code as if it were yours, and get a great understanding of how to use
it from a header.

------
NelsonMinar
In 1994 I was on a team that had one of the few uses of Objective-C that had
nothing to do with NeXTStep: the Swarm Simulation System[1], an agent-based
simulation toolkit. We used GNU's Objective-C implementation and Tcl/Tk (via
tclobjc) for the UI.

We picked ObjC because we knew we wanted something object oriented but C++
seemed too complicated and too static. ObjC was simple and appealingly dynamic
and flexible in its type system. It worked pretty well for us other than the
costs of being an oddball language no one knew. But it was simple enough to
learn that a fair number of our target audience did.

In retrospect I wish we had the courage to use Smalltalk or a Lisp system but
at the time that felt too risky. Java was also just beginning to be an option
then, but it was still being shown as a toy for making applets and not a real
programming language. Also very slow before JITs.

I really appreciated that the ObjC runtime was open source and very small. It
was quite easy to get in deep and understand what was going on.

[1] [http://www.swarm.org](http://www.swarm.org)

~~~
jmuhlich
I have fond memories of using Swarm in an undergrad course on agent-based
simulation ~20 years ago. Just tweaking the heatbugs example in various ways
was already enlightening. Thanks for your work!

------
ChrisMarshallNY
Wow. “Software-ICs.” I remember that, and I thought it was a great concept. At
the time, it was revolutionary.

Nowadays, it’s how we do everything, but the term “Software-ICs” never climbed
out of the bassinet.

~~~
mpweiher
Yes, and I still think it is a great concept. :-)

I also don't think it's really how we do everything; the original concept
caught on only very partially, and appears to be more and more forgotten.

See _Software-ICs, Binary Compatibility, and Objective-Swift_

[https://blog.metaobject.com/2019/03/software-ics-binary-compatibility-and.html](https://blog.metaobject.com/2019/03/software-ics-binary-compatibility-and.html)

~~~
mwcampbell
IMO, the trouble with COM and its imitators is that they're prone to gross
over-use. The best example I know is Gecko, which over-used XPCOM and then had
to go through what Mozilla folks called deCOMtamination. [1] I think IE might
have over-used COM to some extent as well, but that's only speculation based
on what I saw on the outside. (Disclosure: I work at Microsoft, but I joined
well after IE became Edge, and I was never on that team.)

Then Chrome landed like a piece of alien technology, and if we took a look
inside, we found that it was one giant binary module (DLL, .framework, or
executable, depending on the platform) that internally didn't use anything
like COM, at all. It was also fairly well-known for its use of link-time
optimization. I wonder how much these things contributed to Chrome's famous
speed.

Of course, Chrome was only able to pull this off because the team had great
engineering discipline, and later, a great build system (first GYP, then GN).
I remember when I built Chromium for the first time and was awed at how it was
made up of hundreds of modules, but they were all built as static libraries
and then linked together into one monster binary module at the end. These
days, newer statically compiled languages like Go, Rust, and others are
bringing large-scale static linking of arbitrary modules within reach for the
rest of us.

If I may stretch the IC metaphor, I'm guessing something similar happened with
actual ICs; better EDA tools made it more feasible to combine more and more IP
blocks onto a single chip, giving rise to the modern SoC.

[1]: The best post I can find that talks about deCOMtamination, and then goes
on to describe how XPCOM continued to be over-used, is this:
[https://brendaneich.com/2006/02/fresh-xpcom-thinking/](https://brendaneich.com/2006/02/fresh-xpcom-thinking/)
Does anyone know of a definitive written history of this process?

~~~
pcwalton
> I remember when I built Chromium for the first time and was awed at how it
> was made up of hundreds of modules, but they were all built as static
> libraries and then linked together into one monster binary module at the
> end.

This is how Gecko is built too—essentially everything ends up in libxul unless
it has to be split out so that link.exe doesn't OOM on 32-bit (sadness). I
believe it had been this way when the first version of Chrome was released, so
this wasn't something Chrome introduced.

(Also, based on my experiences with Node and other Google projects like Skia,
I wouldn't consider gyp a great build tool—it's always been a nightmare for
me. gn is better, but Google projects still have a tendency to be difficult to
build for those who aren't Google employees.)

~~~
mwcampbell
> This is how Gecko is built too—essentially everything ends up in libxul
> unless it has to be split out so that link.exe doesn't OOM on 32-bit
> (sadness). I believe it had been this way when the first version of Chrome
> was released, so this wasn't something Chrome introduced.

Touché. And I do remember seeing this in Firefox, or maybe even the old
Mozilla suite, long before Chrome came out.

Still, I think static linking is a much more effective optimization in Chrome,
because Chrome doesn't make heavy use of internal ABI boundaries (e.g. COM or
XPCOM), so link-time optimization can be more aggressive. IIUC, Firefox and
Thunderbird still use a fair amount of XPCOM internally, because they have
lots of modules written in JavaScript, including all of the code behind the
XUL-based UI. Chrome, on the other hand, uses a lot more C++.

~~~
pcwalton
Chrome has a lot of the frontend written in JS nowadays too. At this point
XPCOM is mostly just a bindings layer between JS and C++ (that's what COM was
supposed to be to begin with—a glue layer between languages). V8 has something
similar.

I don't think you can really say static linking is more effective in Chrome or
Firefox. Both browser architectures are broadly similar these days.

------
MintelIE
Objective-C was a very interesting language, in some ways much better than C++
while filling the same type of role. It's a shame that so much effort is being
thrown away by Apple. I recommend that anybody who has an Objective-C program
they don't wish to rewrite in Swift look into GNUstep. I personally have two
Objective-C projects which I will not be porting to Swift; instead, I will
port them from Cocoa to GNUstep.

~~~
wool_gather
I'm curious why you feel the need to port them at all. ObjC is pretty far from
dead. Perhaps the ObjC runtime will be abandoned by Apple at some point, but I
can't see even deprecation (_not_ removal from their shipping systems) taking
less than a decade.

In particular, note that Swift on Darwin platforms is _intimately_ tied to the
ObjC runtime. I wouldn't be surprised if there are long-range aspirations to
change that, but it's not feasible in any short term.

------
microtherion
Very interesting paper. I was a bit puzzled, though, at the assertion related
to the Stepstone graphics libraries that X11 was still "years away". The
chronology of the paper is not 100% certain in that area, but my impression
was that this assertion was set in something like 1987/88, and X11 was
released in late 1987.

~~~
pjmlp
I guess it has more to do with the bare bones X capabilities and the
alternatives like NeWS.

------
miohtama
Eventually, the revolution in the software industry was brought about by the
open-source movement and the likes of the NPM and PyPI repositories.

~~~
anthk
CPAN existed several years before.

------
watersb
Software ICs (Integrated Circuits)

------
matchbok
Glad it's going away, the syntax is a chore to read and understand. There's no
valid reason for the use of brackets.

~~~
gmac
There’s also no valid reason for the non-use of brackets, is there? They’re
just different. It didn’t take long for them to feel natural to me when I
started coding in ObjC, and the named method parameters that go with them make
ObjC code (in my eyes) delightfully self-documenting.

~~~
andrekandre
> different. It didn’t take long for them to feel natural to me when I started
> coding in ObjC

definitely, and for people like me, who started coding when they were young
(on mac/ios) there was no bias for or against such syntax (i also learned c++,
and java at the same time and have no special love or hate for those syntaxes
either)

i think a lot of these "i hate this syntax" issues are people just not liking
what they were not used to...

