Brad Cox has died (legacy.com)
884 points by carlosrg on Jan 22, 2021 | 190 comments



> On one scuba diving excursion while in the compound having lunch, Brad engaged a couple from Germany in conversation. Brad asked about the fellow traveler's occupation and discovered he was a computer programmer. Likewise, Brad was asked about his life's work and stated, "I am also a computer programmer." "What do you do?" Brad was asked. "I wrote Objective-C." Astonished, the gentleman said, "No, Brad Cox wrote that". "Hi, I am Brad Cox", was the response and the introduction.

Wonderful story. I wish his family all the best.

I love Objective-C and consider it a beautiful language. Back in the day I re-discovered my love for programming when I started to learn this language. This was when I was still in the Java world.

As a side project I tried to build a drone (unmanned navel vehicle) powered by Objective-C. I have since abandoned the effort but posted the code on GitHub - it was a joy to work with the language, and it's the funnest side project I've worked on.

These days I work with Python and Go for job/hobby, but I am always grateful to have spent time with Objective-C. Reflecting back, if I hadn't spent time with this language, I would not be a programmer today.

Thank you Brad Cox for your work and positive influence.


Objective-C is a real object oriented programming language. Everything is messages. Reverse engineering ObjC as a security engineer over the years has been a treat. The runtime is a breeze to work with, and the language itself made substantial usability improvements over C. It's being phased out in favor of Swift now, but I really have no complaints about my time with the language, a rare thing in tech. I didn't know you, Mr. Cox, but I enjoyed your work.


Coincidentally, I've been doing RE for many years, starting in the days of DOS and then moving into Windows, but it was only very recently --- less than a week ago --- that I had my first look at an app written in ObjC, and the first thing I noticed was the amount of information in the binary that seemed to almost give away most of its source: method, field, and class names everywhere. When I saw a method named something like checkLicenseSignature, I almost thought it was misdirection.

The second thing that I found astonishing was that these strings were actually being used by the runtime to determine what to call, i.e. something that in basically every other native language I've seen would be a direct or indirect call to an address or vtable offset. All this in an application that has no exported functions. I can definitely see how it would make RE extremely straightforward! Efficiency, on the other hand...


> All this in an application that has no exported functions

This is not strictly true: On macOS/iOS the OS/frameworks/plug-ins may call your methods, likewise with support for services, input managers, distributed objects, etc.

The responder chain is a good example of this dynamism: The input manager (responsible for interpreting key strokes) translates the user's input into a message, e.g. "copy" or "insert A", and then finds the first object in the chain of objects that responds to this message.

This chain of objects may consist of standard framework objects (like a text view) or it may be your custom objects (like a view controller subclass).

In most other environments, you would have to create special interfaces for the stuff you want to make public. This means only that stuff suffers the performance overhead, but you generally pay the cost in code complexity; see e.g. Windows' Component Object Model.
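(A rough sketch of that lookup, with placeholder names; firstResponder and sender aren't defined here, and the real machinery lives in AppKit:)

    // Walk the chain until some object responds to the action.
    NSResponder *responder = firstResponder;   // e.g. the key window's first responder
    SEL action = @selector(copy:);
    while (responder != nil && ![responder respondsToSelector:action]) {
        responder = [responder nextResponder];
    }
    [responder performSelector:action withObject:sender];   // messaging nil is a safe no-op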


> these strings were actually being used by the runtime to determine what to call

Fortunately it doesn't do a full string comparison - it just compares the value of the string pointers, I understand.


Yes: selectors are interned.
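Easy to check, since selector equality is just pointer equality:

    #import <Foundation/Foundation.h>
    #import <objc/runtime.h>

    SEL a = @selector(description);
    SEL b = sel_registerName("description");
    NSLog(@"%d", a == b);   // prints 1: both names resolve to the same interned pointer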


Objective-C, at least these days, is plenty fast, with the cost of a message send being comparable to a virtual function call. The fast path of objc_msgSend is just over a dozen instructions.


Plus for critical sections you can obtain method pointers which are just C functions.


The M1 chip has specialized paths on it just to make the message sending even faster. That’s part of the M1 magic.


To be fair I think the extent of this specialization is a branch hint


It’s still a kind of neat trick. One of those paths where even a little optimization goes a long way due to it being so hot.


As I recall, `libobjc` is ~25% of app launch time. `objc_msgSend` is quite expensive when you're making hundreds of thousands to millions of calls per second.


Although Swift has the spotlight, Objective-C keeps being improved.

"Advancements in the Objective-C runtime"

https://developer.apple.com/videos/play/wwdc2020/10163/


Swift uses the Objective-C runtime.


Swift can use the Objective-C runtime for interoperability with Objective-C code, just like .NET uses COM on Windows.

They need to interoperate with the rest of the platform.


Swift is tiptoeing between static and dynamic typing, preferring the former when feasible, but often needing some of the latter when dealing with the UI.


I would guess that most of app startup time isn't going to objc_msgSend (although it is still probably slow as it hasn't warmed up yet) but rather to setting up the various data structures used by the runtime, applying initial swizzling and the associated cache invalidation that comes from that, and those pesky +loads that every analytics framework thinks it needs to have ;)


The compiler does some very fancy stuff with objc_msgSend to make it fast, and the selectors are interned.

https://www.mikeash.com/pyblog/objc_msgsends-new-prototype.h...


Objective-C had an IMP pointer, essentially a function pointer. If you needed to send a message to an object in a tight loop, you could extract the pointer before the loop and use it inside.
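(The classic pattern; array, count, and items here are stand-ins:)

    // Resolve the implementation once, before the loop...
    SEL sel = @selector(addObject:);
    void (*addObject)(id, SEL, id) =
        (void (*)(id, SEL, id))[array methodForSelector:sel];
    for (NSUInteger i = 0; i < count; i++) {
        // ...then call it as a plain C function, bypassing objc_msgSend.
        addObject(array, sel, items[i]);
    }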


Java also owes a lot to Objective-C: while it copied C++ syntax, its protocols, reflection, and dynamic loading come from Objective-C.

https://cs.gmu.edu/~sean/stuff/java-objc.html

And what many J2EE/JEE haters aren't aware of is that it started as an Objective-C framework during the OpenSTEP days, and the OS was called Spring.

https://en.wikipedia.org/wiki/Distributed_Objects_Everywhere


I think it's fair to say it owes some to Obj-C, but both Obj-C and Java owe a lot more to Smalltalk.


Also true.


Not to sidetrack the conversation, but I found your typo of calling a drone an unmanned navel vehicle quite apt!


I wonder what objective-c would look like with the square brackets turned into parenthesis...


Would probably look a little less sharp...C-flat?


unmanned belly button vehicles are the best. much better than the manned ones


I am giving a talk at work soon on method swizzling in iOS, and while delving into the history of Objective-C a bit I came across Alan Kay's talk about the power of simplicity and how we've all screwed up OOP.

In the talk, Alan talks about the ant who lives his life on a single plane of existence, the "Gulley World" or "Reality".

The ant goes to work, he finds stuff to eat, he lives his life in this Gulley World, which is depicted as a pink 2D plane. However, sometimes on this pink plane there are little spots of blue. They represent thoughts that don't belong in the pink plane.

Sometimes those blue spots turn into blue planes, and the ant we are following starts to move along that plane instead of the pink one. Everyone in the pink plane thinks he is wrong. Everyone can see the pink plane, in all its reality. It is not until you walk on the blue plane that you can see "another way".

The metaphor being that we developers built a world where we started to take the general idea of OOP and construct a lot of "reality" around it. A lot of process, a lot of formalization so we could build mechanical systems of gears that slotted together. I think Alan's idea of OOP was something more fluid, more organic than this. The world is messy and we often try to abstract the mess away in these overbearingly weighty and hierarchical programs that everyone agrees is the right way.

I think Objective-C was the most widely used and successful walk on this blue plane. Millions of developers were exposed to the idea of message passing as a form of OOP, which is an astounding accomplishment. It really is a neat language, and I had a lot of fun learning it.

Brad definitely walked on the blue plane. RIP.


What a great tribute you have written. When I first found out about swizzling through a seasoned iOS dev I was blown away. The swizzling capability in Obj-C basically helped create my first startup, InstaOps, a long time back, which allowed instrumenting an app to capture logs and network performance metrics with no code changes.
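(For anyone curious, a minimal sketch of what swizzling looks like; the replacement selector name is made up, and production instrumentation code has to be more careful than this:)

    #import <UIKit/UIKit.h>
    #import <objc/runtime.h>

    @implementation UIViewController (Instrumentation)

    + (void)load {
        // Exchange the two implementations once, at class load time.
        Method original = class_getInstanceMethod(self, @selector(viewDidAppear:));
        Method replacement = class_getInstanceMethod(self, @selector(xyz_viewDidAppear:));
        method_exchangeImplementations(original, replacement);
    }

    - (void)xyz_viewDidAppear:(BOOL)animated {
        [self xyz_viewDidAppear:animated];   // the IMPs are swapped, so this
                                             // calls the original method
        NSLog(@"viewDidAppear: %@", NSStringFromClass([self class]));
    }

    @end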


I second all of these comments. I was on the same team with him (Hi Prabhat, this is Paul D.) and concur that swizzling is a very powerful mechanism. It's definitely a good thing to know about if you're working with Objective-C.


"I think Alan's idea of OOP was something more fluid, more organic than this."

I've heard a good deal about Alan Kay's dissatisfaction with the state of OOP, but I've never seen a concise summary of his vision or the principles that Kay's 'ideal' realization of OOP would adhere to.

Does such a resource exist, written by Kay himself or otherwise? Or do I just need to go play around with Obj-C or Smalltalk to really "get it?"



Here he discusses his thoughts on the term: http://userpage.fu-berlin.de/~ram/pub/pub_jf47ht81Ht/doc_kay...


But, yes, you really need to play around with one to 'get it'. Or at least, get why the other languages fail to really uphold the goal.

Per the original anecdote, being told "it's all blue!" isn't that impressive when you've seen the spots of blue and shrugged them off as just normal little detours amongst the pink, and can't imagine everything being blue. You have to see it for yourself.


>I've heard a good deal about Alan Kay's dissatisfaction with the state of OOP, but I've never seen a concise summary of his vision or the principles that Kay's 'ideal' realization of OOP would adhere to.

Smalltalk 76-like...


>Brad definitely walked on the blue plane. RIP.

Your/Alan's "blue plane" analogy reminded me of this:

  They said, "You have a blue guitar,
  You do not play things as they are."

  The man replied, "Things as they are 
  Are changed upon the blue guitar."
https://www.writing.upenn.edu/~afilreis/88v/blueguitar.html


I met him once in the late 1990s, when his travels took him to Zurich and he asked me whether I could book a talk for him at ETH Zurich, where I was a grad student.

I did not quite share his confidence in my abilities in that area, but to my relief, Jürg Gutknecht agreed to sponsor the talk, and I got to spend lunch with Brad Cox, Niklaus Wirth, and Jürg Gutknecht. Given their highly divergent aesthetics in language syntax, I expected some fireworks, but the conversation was quite pleasant, even when they were discussing Perl.

I was at the time the maintainer of the Mac port of Perl, and had taken some classes with Wirth, but the idea of discussing Perl with him struck me as akin to discussing masturbation with the Pope. However, Wirth conceded that in the area of text processing, general purpose languages tended to be somewhat clumsy, and there had always been a successful niche for languages like Snobol and now Perl.

Brad Cox was a splendid conversationalist in many other areas as well. His talk focused on Superdistribution as the next evolution of the Software IC concept, and he very skillfully pitched to a Swiss audience the idea that a banking nation should be a natural superpower to take the lead in a micropayment world. He was very good at painting visions like this, but I'm not sure how much of it ultimately came to pass:

a) I don't think we're any closer to plug and play "Software ICs" than we were in the mid-1980s when he introduced the term. In the Objective C ecosystem, the closest there was to that was maybe Interface Builder with its Outlets and Actions, but I think that part did NOT originate with Cox (I may be mistaken, though).

b) Likewise, I don't see any move to distributed micropayments. If anything, more and more of the software revenue seems to come from centrally billed cloud services, e.g. comparing the Microsoft Office revenue model 20 years ago and now.


I think the vision of Software ICs was delivered most fully by VBX custom controls for Visual Basic and Delphi. Superseded by OCX.

There were hundreds of them you could download for free or paid, doing all kinds of GUI and non-GUI tasks.

Doesn’t seem so popular now.


Depends on which circles you move around in.

https://www.actiprosoftware.com/

https://www.grapecity.com/

https://www.telerik.com/

Just a sample, there are others.


Sure, I just think web became way more popular.

Maybe Wordpress plugins are the spiritual successor.


Those companies also sell Web components, as in for all major SPAs and SSR frameworks, the issue is only enterprises care enough to pay for them.


The book to go with is "Superdistribution: Objects as Property on the Electronic Frontier" which is a fun read.

Objective-C is still my favorite language, and I loved his writing when first learning it in '94.


>I don't think we're any closer to plug and play "Software ICs" than we were in the mid-1980s when he introduced the term.

A lot of modern infrastructure works like that -- function as a service, serverless, k8s boxes, etc.


That's an interesting perspective, but that's a bit more coarse grained than what I thought Software ICs would be, and, I suspect, more coarse grained than what Cox thought as well.


Very sad. I had the privilege of taking a class from him at George Mason University, and he was (unsurprisingly) very knowledgeable.

He worked hard to enable software reuse. No one was interested in his idea of trying to monitor component use during runtime to pay developers. That was an unworkable approach, and I told him that then. But the general world of making it easy to reuse components is a reality today, via open source software and package managers.

So, a hat-tip to him and all the other pioneers who helped make the world a better place.


> No one was interested in his idea of trying to monitor component use during runtime to pay developers.

This reminds me of Project Xanadu's ideas about transclusions and associated royalties.

What a coincidence that this was posted recently: https://news.ycombinator.com/item?id=25875386


That makes sense.

In addition, the telecoms were generally not interested in the very early Internet (TCP/IP) because they couldn't figure out how to do per-packet metering, and they assumed that was necessary.

All 3 examples show that trying to do fine-grained metering, in ways that cause tremendous overhead, often doesn't work. It is sometimes better to make something so that it is "too cheap to meter".


superdistribution was superinteresting... but i felt the same as well when reading it... it could only work if everyone agreed at the same time to use the same protocols... and of course legislation... but it was definitely a grand vision and would have been interesting to see the effect on culture and creativity if it had actually panned out


> He worked hard to enable software reuse. No one was interested in his idea of trying to monitor component use during runtime to pay developers

People are experimenting with doing this in blockchain smart contracts. It's transparent and supports micropayments as well.


Yupp! I made a little toy project for the EVM over a year ago with this exact concept, but never really did anything with it, sadly; life seems to find ways to get busy. Due to the nature of needing to send 'gas' to make function calls, it was a natural fit to add a call to send a small portion of the value to an address before returning the computation's result.

I really loved the idea of being able to create libraries of code that could just be called for a small fee or copied for free if one didn't have the funds. I hope this idea continues to catch on, it seemed to me to be a perfect incentive fit for the open source world.


> He worked hard to enable software reuse. No one was interested in his idea of trying to monitor component use during runtime to pay developers.

This is a nice idea although I never thought it could've worked; it seems like it took forever for people to stop trying though. The app-and-library organization of software is more natural than document-and-component organization because of Conway's law, which is surprisingly hard to escape.


"No one was interested in his idea of trying to monitor component use during runtime to pay developers."

The way to make people pay for software components is not to ask them for money at runtime, but to do so much much earlier in the development cycle: at design time.

In the instances where I have seen this business model work, the components are usually bought as part of a collection [1]—think source-available components similar to the model made famous by Apache Commons (Commons Codec, Commons Util, Commons Lang, etc). The Apache Commons OSS project emerged on June 20, 2007 [0] as a way of standardizing the need for reusable Java components and libraries, slowly killing the market for paid components.

Or, components are bought as part of an ongoing subscription to a large catalog containing thousands of components [1][2]—think of it as a company-wide Safari Books subscription but for software components.

As part of the business model, component designers and developers were paid royalties in addition to the one-time monetary payment for developing each component, with the top 25 royalty earners collectively making as much as $458,792.31 over a multi-year period [3].

0: https://commons.apache.org/charter.html

1: https://software.topcoder.com/catalog/c_showroom.jsp

2: https://www.topcoder.com/tc?module=Static&d1=pressroom&d2=pr...

3: https://www.topcoder.com/tc?module=ComponentRecordbook&c=roy...


>No one was interested in his idea of trying to monitor component use during runtime to pay developers.

Well, today we call it "function as a service" and Serverless...

https://en.wikipedia.org/wiki/Function_as_a_service


> No one was interested in his idea of trying to monitor component use during runtime to pay developers.

Apart from enterprises selling K8s components who call it 'metering'.


Two years ago, after a two-year sabbatical, I started writing an app in Swift. It was going to be my first app.

When using Swift, the compiler was painfully slow. Because of that, I tried Objective-C, and it quickly became clear to me that I love it. It is the best language, in my humble opinion. The dynamism clicked, and the modern features make it a real breeze to use.

Messages are so flexible. I also love how it has “gradual typing.”

My only gripe with it is that Categories can't formally conform to protocols—which I understand is an easy-to-build feature that Blaine Garst did finish but Apple never released.

I know I’m talking about the language more than I am talking about Brad Cox, but that’s because it’s the first time I really fell in love with a language. Using Objective-C to build it brings me joy. Lots and lots of joy.

Thank you Brad. My prayers to your family. May you find peace in heaven.


> Categories can’t formally conform to protocols

I wasn't aware of that limitation, so I tried it out just now only to be certain. Works fine for me. ¯\_(ツ)_/¯

   @interface UIViewBuilderSmalltalkViewController(storage) <MPWStorage>

   @end

   @implementation UIViewBuilderSmalltalkViewController(storage)


   @end

Compiler correctly complained about the missing methods and Xcode kindly offered to add stubs for me:

   /Users/marcel/programming/Projects/ViewBuilder/UIViewBuilderMockup/UIViewBuilderSmalltalkViewController.m:121:17: Category 'copying' does not conform to protocol 'MPWStorage'
   /Users/marcel/programming/Projects/ViewBuilder/UIViewBuilderMockup/UIViewBuilderSmalltalkViewController.m:121:17: Add stubs for missing protocol requirements


Maybe he meant classes (metaclasses)? Class itself is not typed, which definitely feels like a missing feature in the language.


Can you build the feature yourself?


Many moons ago I used to work with Brad in DC. He never let on that he was a world famous computer scientist. He slung code shoulder to shoulder with us plebes.

He was a Mensch.


For the other Dutch/German people out there confused at what Mensch means other than 'human', dictionary says: "a person of integrity and honor" (https://www.merriam-webster.com/dictionary/mensch)


Yes, but it also carries the connotation of being the highest compliment you can pay someone. I don't use the word lightly. It's Yiddish in this context.

https://en.wikipedia.org/wiki/Mensch


German derivation, of course, but the American English usage is more particularly Yiddish (https://en.wikipedia.org/wiki/Mensch).


plebes... a great word I don’t hear often!


There is no doubting the legacy of Objective-C (especially given the high likelihood you are reading this post on a mobile device, using an app written in Objective-C), but to truly appreciate Brad's legacy, I am curious about the appeal of using Objective-C.

Having developed only one small iOS app with Objective-C code, I was mostly turned off by its overall verbosity in the context of NS prefixes. Hence, I ask the question on behalf of myself and others who did not appreciate the language and did not give it a proper chance... what did I miss and what are its top appeals?

Nevertheless, Rest In Peace to a pioneer.


In the context of the time, C++ didn't exist yet. Objective-C was actually introduced just prior to C++, and both languages were effectively solving the same problem in different ways: C was the dominant language, and both language designers were trying to graft the OOP paradigm onto it.

Objective-C is a thin-layer on top of C, adding Smalltalk-inspired object support. That's pretty much all there is to it. C, with some new syntax for objects. In the context of a world where C is the norm, that's pretty appealing. This is before Java existed, too.

The "NSWhatever" stuff, as far as I'm aware, isn't part of the language. That's all in the frameworks Apple/NEXT developed for Objective-C. (Note that the base object is called Object, not NSObject, and the integer class is Integer.) NSString is probably named that way because Objective-C doesn't include a string class (nor does C, as a string is just an array of bytes until you write a wrapper to get fancy) and NEXT made one. They were just namespacing the NEXTStep classes.


> Note that the base object is called Object, not NSObject, and the integer class is Integer.

Objective-C actually doesn't require a base object (although these days it essentially does), but Object and NSObject are both examples of root objects. IIRC, reference counting is not in Object and was a NeXT invention.


I'm curious--what happened to Objective-C in that fight with C++? Why didn't people go for its simplicity?


As usual, platform languages win.

C++ was born at Bell Labs and quickly integrated into their workflows as C with Classes started to get adopters.

This raised the interest of the C compiler vendors, so by the early 90's, all major C compiler vendors were bundling a C++ compiler with them.

Additionally, Bjarne got convinced that C++ should follow the same path as C and be managed by ISO, so the C++ARM book was written, which is basically the first non-official standard of the language.

So C++ had ISO, the same birthplace as C, and the love of C compiler vendors, while Objective-C was initially the work of a small company and later on owned by NeXT.

So, naturally, Apple, Microsoft, and IBM decided to go with C++, and everyone else followed.

Here is an anecdote for Apple fans: Mac OS was written in Object Pascal + Assembly. When market pressure came to adopt C and C++, MPW was born, and eventually a C++ framework that mimicked the Object Pascal one (there is a longer story here though, ending with the PowerPlant framework).

Copland was based on a C++ framework, and Dylan team eventually lost the internal competition to the C++ team regarding the Newton OS.

Apple was one of the major OS vendors that never cared much about C for OS development, only the NeXT acquisition ended changing it. And even then they weren't sure about C and Objective-C, hence the Java Bridge during the first versions.


> Apple was one of the major OS vendors that never cared much about C for OS development

It's true that MacOS Classic kept providing Pascal headers for most of its APIs for a long time (I don't recall whether they ever stopped), but internally, they started switching to C by the late 1980s (as an external developer, I could tell by one bug which would never have made it through a Pascal compiler, but was typical for the kind of bugs that wouldn't get caught by a K&R C compiler), and by the late 1990s, it was all C and C++, just with Pascal calling conventions for all public-facing APIs. In my time at Apple, I never encountered a single line of Pascal code.


I bet it was actually C++ with extern "C", which was my point, especially given the MPW and PowerPlant frameworks.

I never knew anyone doing bare bones C on classic Mac.


There was a lot of extern "C" (and there still is a lot of that), but there also was a lot of extern "Pascal" back then, I seem to recall.

There was quite a bit of regular C in classic MacOS, though there was also a good deal of C++. You're right that MacApp (which I think is what you're referring to with "MPW", which was an IDE) and PowerPlant were written in C++, but I'm not talking about the clients of the MacOS APIs, but about the implementations of those APIs.


Fair enough, but are you sure it wasn't C++'s C subset?


In 68K times, the two compilers were quite different; the C++ compiler was CFront at one point, and insanely slow. It would have taken a real masochist to compile C with a C++ compiler. I can't guarantee that nobody at Apple ever did that, but the suffixes were distinct. In that situation, IDEs usually make the distinction automatically, and it's not hard to write Makefiles to invoke the right compiler.


Ok, thanks for the overview.


I tried out both Objective C and C++ in 1988, when neither were popular though C++ was more talked about.

What I remember was that with Objective C you needed to track all intermediate values and release them, so you couldn’t write an expression like [[objectA someMessage] anotherMessage] - you had to capture the intermediate in a variable so you could release it at the end.
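(Roughly like this; a sketch assuming the pre-NeXT Object root class, where a returned object was released with -free rather than reference counting:)

    /* You couldn't just write [[objectA someMessage] anotherMessage]: */
    id tmp = [objectA someMessage];      /* capture the intermediate...   */
    id result = [tmp anotherMessage];
    [tmp free];                          /* ...so it can be released here */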

So this was annoying and I didn’t like Objective C at the time. (25 years later I wrote several iOS apps in it)

C++ let you manage memory and temporary values through constructors and destructors, which was much more appealing, though pre-templates it was quite constrained.


Part of it was licensing. Probably more of it was the personalities involved at e.g. Microsoft or SGI.


Objective C loses to C++ for performance if you really start exploiting OOP a lot. The fact that you can swizzle methods in ObjC says a lot about the "weight" of the underlying implementation ("it's all messages") compared to C++.


The fact that you can swizzle methods also says a lot about its power and flexibility. When Steve Jobs was at NeXT, he was quoted numerous times bashing C++ as having 'dead objects' while pointing out that ObjC objects were 'alive'. One seldom needs to make use of swizzling, but when you do need it, it's an awesome capability.

As prabhatjha pointed out in another comment in this thread, swizzling was used to automatically capture networking calls just by adding our framework to your app and initializing it. You could then log into our app's web-based dashboard and see stats about networking calls (counts, latency, errors, etc.). This simple and elegant solution would not have been possible with C++. We also supported Android at the time (Java), and the developer was required to change his code to call our networking wrapper calls to get the same tracking for their Android apps.


Absolutely. I've used swizzling myself to fix issues with audio plugins' GUIs (to limit how fast they are allowed to redraw themselves). It's very clever and sometimes very useful.

But the ability to do that comes with certain costs, and performance is one of them. The fact that these "methods" are dynamically dispatched sometimes matters, and you can't change that any more than you can swizzle in C++.


> This simple and elegant solution would not have been possible with C++

Actually, it would, but not via any feature of the language. You can use features of the linker to accomplish this (particularly LD_PRELOAD). It's not the same thing, really, but it felt worth mentioning for the record.


In our case, the LD_PRELOAD approach would not have worked because this was on iOS devices where you can't set that variable. However, I do appreciate you mentioning it because it too is a powerful mechanism that enables some creative and non-invasive solutions in some cases.



> Having developed only one small iOS app with Objective-C code, I was mostly turned off by its overall verbosity in the context of NS prefixes.

This is actually a blessing because NS-/name prefixes are a simple approach to naming that keeps you humble. If you let programmers have namespacing they will invent enterprise software frameworks where every class is six layers deep in a namespace of random tech buzzwords they thought up.

> Hence, I ask the question on behalf of myself and others who did not appreciate the language and did not give it a proper chance... what did I miss and what are its top appeals?

It implements message-based programming, which is "real" OOP and more powerful than something like C++, where OOP just means function calls where the first parameter goes to the left of the function name instead of the right.

In particular it implements this pattern: https://wiki.c2.com/?AlternateHardAndSoftLayers which is great for UI programming and lets you define the UI in data rather than code. Although iOS programmers seem to like doing it in code anyway.
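(A tiny sketch of what "everything is messages" buys you: the method to invoke can come from data, e.g. a string in a nib file, rather than from code:)

    // The selector is chosen by name at run time; all that matters is
    // whether the receiver responds to it, not its static type.
    id receiver = [NSMutableArray array];
    SEL action = NSSelectorFromString(@"removeAllObjects");
    if ([receiver respondsToSelector:action]) {
        [receiver performSelector:action];
    }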


This plague of object wiring in code is pervasive in the Java world as well. The joy of declarative late binding and decoupled objects at compile time seems to be very lost on the vast majority of programmers.


> It implements message-based programming, which is "real" OOP

No, it's message-based programming, which is a very powerful and useful tool. It's not the one true inheritor of the fundamental OOP concept.

OOP wasn't defined by "you send messages to objects", it was defined by the idea that objects had their own semantics which in turn constrained/defined the things you could do with them. Some OOP languages implemented "doing something to an object" as "send it a message"; some didn't.

ObjC is in the former group; C++ is in the latter.


Well, considering that Alan Kay coined the term...



Alan Kay didn't invent object oriented programming though he was instrumental in expanding its scope and usage. Simula 67 was a big influence on Kay's work, and as he has noted, the "revelation" that "it's all messages" was a huge one.

But Smalltalk is just one OOP language, not the only one and not even the original one (though we could argue about how much Simula was or was not OOP, and I'd rather not).

The message from Kay you cite is strictly about Smalltalk & Squeak:

> The big idea is "messaging" - that is what the kernal of Smalltalk/Squeak is all about

He doesn't say "what OOP is all about".


Actually he did invent Object Oriented programming.

Simula was an inspiration, but was never considered object oriented. After Kay came up with the concept, Simula was identified as part of the historical background.

“I invented the term object oriented, and I can tell you that C++ wasn't what I had in mind.” —Alan Kay.


Dahl & Nygaard would not agree with you:

https://www.sciencedirect.com/science/article/pii/S089054011...

http://kristennygaard.org/FORSKNINGSDOK_MAPPE/F_OO_start.htm...

Kay came up with the term "object oriented programming", but he has made it very clear that what he had in mind has little relationship to what most people mean by that term today.

If you want to give Kay veto power over the correct application of the term, be my guest but please be consistent across all other cases where a word or phrase changes its meaning over time.

Contemporary OOP pays only lip service to Kay's ideas (something he would be the first to say), and is only tangentially influenced by Smalltalk at this point (Objective C probably being one of the few widely used counterpoints to that).


Those links don't support your claim about Dahl and Nygaard.

They are retrospectives written by other people talking about their work, not papers by Dahl and Nygaard themselves.

Just because you can find some people making the same retrospective mistake you are doesn't change the history.

OOP was defined by message passing.

What you are calling ‘contemporary OOP’ is a cargo cult based on a failure to appreciate that. The problems with this are increasingly acknowledged.

If you want to say OOP is class based programming, be my guest, but your statements about the history that I responded to are simply false.


The second link is titled:

> How Object-Oriented Programming Started
>
> by Ole-Johan Dahl and Kristen Nygaard, Dept. of Informatics, University of Oslo

The first link is "based on an invited talk" given by the author at the Dept. of Informatics, University of Oslo, with colleagues of (then-deceased) Dahl and Nygaard in attendance. That doesn't guarantee anything in particular, but it makes it likely that they are not making stuff out of thin air.

If OOP was defined by message passing, why was it necessary for Kay to note in 1998 that people had apparently lost sight of this (his) definition?

If you want to say that C++ programming is not OOP, be my guest, but your statements about the history of OOP are not part of some canon or bible.


Ok - I accept that the first piece was by Dahl and Nygaard.

Nevertheless it is just a retrospective application of a term that they didn’t invent, and doesn’t change the history or substantiate the false claim that they invented the term.

OOP was defined by message passing and as you say people lost sight of the idea, largely due to C++

Kay did invent the term. It was about message passing.

Later people used it to describe something else which has little resemblance to the original idea.

It’s fair to say that it’s not part of some canon and of course people are free to miss the point of something and cause a second definition of a word to enter circulation.

Irrespective of canon or multiple definitions, your statements about the history itself in your earlier comments are just false.


Objective-C is a very simple, clean language–very much unlike the other "object-oriented C" competitor, C++. Unlike C++ it's a 100% superset of C, and it takes its cues from Smalltalk, where objects send messages to each other rather than statically call each other's procedures. To support this, there is a very rich runtime that allows all sorts of reflection and metaprogramming atypical in a compiled language.


Tastes may differ. To me, C++ looks like an organic extension of C syntax, while Objective C looks like an alien graft on top of C.

Same with semantics: In C++ there is a continuum from POD structs to adding non-virtual methods to adding virtual methods. In Objective C there is a gaping chasm between C types and Objective C types, and weirdness occurs when you mix the two (e.g. pass a method taking an (int) to a place expecting a method taking an (NSNumber *)).

Containers (arrays and dictionaries) in Objective C, I find particularly ugly, especially in earlier (pre-2010 or so) versions of Objective C. They can contain only Objective C objects, not C objects, but can wildly mix and match objects of different types (this has been helped by Objective C generics by now). Access to elements is very verbose (this has been helped by syntactic sugar by now).

Just recently, I had to review Objective C code using a multidimensional numeric array. Even in modern syntax, it was no joy to read, and I wept for the senselessly murdered memory and CPU time. But if it had been written in pre-2010 Objective C, I might have lost my will to live for weeks.


I don't see how you can call ObjC any more of a superset of C than C++.

Object-related syntax in ObjC is completely alien to C. Object-related syntax in C++ (mostly) extends C structure syntax.

Yes, ObjC takes its cues from Smalltalk. C++ does not. And so... ?

[EDIT: ok, so people want to interpret "superset" as meaning "every valid C program is a valid Objective C program too." This is, with very few exceptions, true of C++ as well]


Because ObjC is a strict superset of C in the technical sense. That is: every valid C program is also a valid Objective-C program.

Of course idiomatic ObjC is heavily tilted toward the non-C parts of the language (OOP features), but that doesn’t mean it’s not a true superset of C.


"Yes! C++ is nearly exactly a superset of Standard C95 (C90 and the 1995 Amendment 1). With very few exceptions, every valid C95 program is also a valid C++ program with the same meaning."

https://isocpp.org/wiki/faq/c


Objective-C has no exceptions.


Whaddya mean?

    @try {
        // do something that might throw an exception
    }
    @catch (NSException *exception) {
        // deal with the exception
    }
    @finally {
        // optional block of clean-up code,
        // executed whether or not an exception occurred
    }
https://developer.apple.com/library/archive/documentation/Co...

(I'll see myself out. For at least two reasons.)


We are on C17 nowadays.


ObjC doesn't change any existing C syntax; it only adds messages. C++ is an entirely different language with a different spec that merely looks like C.


"Yes! C++ is nearly exactly a superset of Standard C95 (C90 and the 1995 Amendment 1). With very few exceptions, every valid C95 program is also a valid C++ program with the same meaning."

https://isocpp.org/wiki/faq/c


The implicit casting rules are different, it doesn't allow VLAs, you can implicitly create static constructors instead of having your program rejected for non-constants at the top level, more keywords are unavailable as variable names…
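(The implicit-casting difference is the classic one; this is valid C but a compile error in C++:)

    #include <stdlib.h>

    /* C implicitly converts malloc's void * to int *; C++ refuses. */
    int *p = malloc(10 * sizeof *p);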


VLAs are optional since ISO C11, clang and gcc are probably the only C compilers that care to support them.

C17 also has its share of keywords, and C2X plans to replace some of the _Keyword spellings with the keyword version, as enough time has passed since their introduction.


Clang and GCC being the two largest implementations.


True, but not the only ones, so good luck making that VLA code work outside BSD/Linux clones or the few OEM vendors that have forked them.

Also Google has sponsored the work to clean Linux kernel from VLAs.


Yeah, because VLAs mostly suck.


I have some perfectly safe code using them that crashes Intel icc. Useful if you ever need to, I don't know, crash icc I guess.

Among other things, this means icc doesn't run other compilers' testsuites, because I reported the same bug in the first release of clang and clattner fixed it right away.


And given that C11 made them optional, Intel can just close such bug reports as "won't fix", citing the ISO standard as justification.


Not if it claims GCC compatibility, which it does. Though I believe the frontend is licensed from EDG anyway.


It's not a modern language, so appreciating it has to be in its original context. I think it does an admirable job of augmenting C with object-oriented capabilities. It's certainly easier to master than C++.

I'm not an expert on this, but I suspect that the main reasons it was chosen for iOS were:

- The technical limitations of the original iPhone meant that you needed to use a low-level language.

- The legacy of NeXT at Apple.


Mostly the latter, I would assume. Apple didn't really use anything other than Objective-C for its application frameworks (and still generally does not, for the most part).


I read in multiple sources, usually the kind of comments that are only possible to validate with inside info, that to this day not all business units are sold on Swift.


Sold or not, it's easy to see that most of the code being written is still in Objective-C just by looking at the code that Apple ships publicly.


Objective-C is verbose, and not just because of the NS prefixes. Everything is verbose (by today's standards anyway). ObjC is a "child" of the 1980s, when verbosity was considered a merit and a norm in programming.

Two things that I used to like about it:

- Combination of static typing and at the same time pretty high-level dynamic typing: it was practically possible to call any method on any object, right or wrong, just like in dynamic languages (see the sketch after this list). For performance-critical parts you could always resort to C. Later, as a little bonus, it was also possible to resort to... C++. There was such a beast as Objective-C++.

- The method calling syntax. Quite unusual but neat. I liked it a lot.
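(A quick sketch of both points; the names below are ordinary Foundation methods, and everything is resolved at run time:)

    id thing = @"some string";              // 'id' holds any object at all
    NSUInteger n = [thing length];          // fine: NSString responds to -length
    double d = [thing doubleValue];         // also compiles; yields 0.0 here
    const char *raw = [thing UTF8String];   // and plain C is always nearby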

However, Swift ruined it for me. Now that I'm a total Swift convert and feel a 2x or even 3x boost in productivity, I can't even look at Objective-C code anymore.


> ObjC is a "child" of the 1980s, when verbosity was considered a merit and a norm in programming.

It's still considered a merit by some.


I agree with this. Verbose code is code you can come back to and understand years down the road.

The easiest projects for me to pick back up are the ones I wrote in Objective-C, hands down.


>Verbose code is code you can come back to and understand years down the road.

For ObjC, "verbose code" means "code you can come back to years down the road and hope there's still a manual to translate those message argument names into whatever current programming terminology uses".


Thankfully, in computer science terms like "array" and "string" still mean what they did many years ago.


It's not about "array" or string".

    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(appDidBecomeActive:)
                                                 name:NSApplicationDidBecomeActiveNotification
                                               object:[NSApplication sharedApplication]];
Can you explain to me what any of those terms mean without consulting a fairly extensive reference manual?


At that level you're talking about framework api's (in this case NextStep and derivatives), not the language itself. Drop me into some Haskell or Java or OCaml or Python or Ruby framework, I'm going to have to reach for a reference as well.

It's not uncommon to conflate Objective-C conversationally with its most common use case, but Objective-C is not NextStep and other Apple-ecosystem friends.


Entirely fair.


Sure, putting on my layman hat: "Add the observer 'self' to some centralized place get a notification of some kind for when the app becomes active. Something to do with a 'shared application' and a 'selector', maybe they are some additional context?" Of course, the latter two are things you'd know if you have used Objective-C even a little bit, with the former being a fairly standard nomenclature for a global and the latter being a crucial part of the language.


Yep.

AppKit has this concept called a Notification Center, which, duh, sends notifications. You want to observe the notification called NSApplicationDidBecomeActiveNotification. The way you want to observe this notification is by being sent the message appDidBecomeActive:. The "object:" parameter tends to be nil, so I did actually have to look that up: it means I only want to receive this particular notification when sent by that object. It is almost certainly redundant in this case, because nobody else has any business sending NSApplicationDidBecomeActiveNotification, and it is usually precisely what you do not want, hence it is usually nil.

Anyway:

   [prefix stringByAppendingString:suffix];
   [dictionary objectForKey:key];

In the old NeXTstep days, before our editors had code completion and other conveniences, you could very often just type a phrase describing the operation you wanted and magically the code would compile and do what you expected. Hard to both describe and probably believe if you haven't experienced it yourself.


In addition, the semantic consistency of the frameworks combined with the verbosity encouraged by the language makes it easy to jump into a new-to-you part of a 25+ year old codebase and start making useful changes very quickly.

This is much more difficult when you have to be careful about what every single operator dispatches to, or when more than just the receiver’s type determines the method that’s called. You can look at code in isolation and get a pretty good idea of its intent and the routes that its implementation will take, both of which are necessary to start making changes.


I loved the idea that the OOP world and the C worlds were syntactically different. It made the language significantly more elegant than C++, which doesn't even take into account the beauty of its Smalltalk message passing semantics.


That verbosity is exactly why I love it.

It's easy to write and easy to read (especially years later). It's just such a joy to work with.


[BradCox release]

RIP.

I owe so much to Objective-C. My early love for the language is what launched my own career, and inspired a love for programming in general. Thank you, Brad Cox.


The same thing happened to me. I was at university loving C but learning Java and hating every minute of it.

One day, with the help of another student, I managed to install Snow Leopard on my Acer notebook and the first thing I wanted to do was figure out how iPhone programming worked. However I was instantly confused at the syntax of the language and that threw me off.

I did try again two more times though, and in the last one it just clicked. That was mid-2012. I dropped out of university for a job opportunity in 2015 and have been an iPhone developer ever since.

Thank you, Brad Cox.


This is so poetic.

What a beautiful way to remember him.

Alan Kay once said that those who began to talk about objects in anthropomorphic terms got object oriented programming. Your “code” now reveals to me the cycle of life in all that I type. How delightful.

Thank you :)


It’s odd to feel sad having never known this man, and only written in the language he helped create. Perhaps it is a testament to the beauty of Objective-C that I feel moved.

Objective-C is poetic. Its patterns and clarity are the closest I’ve come to an ecological software language: I feel like I’m gardening when writing Objective-C code.


My first programming job used Objective-C to make an iOS app. Back in 2013 Xcode was pretty fast. My naive youthful self enjoyed using Objective-C's Categories and Associated Objects to share some UI code across 2 UIViewControllers.

Did I get it to work? yep. Did it make senior programmers a bit nervous? yep. I wrote a blog post about it.

Later, I got to use Java for some Android apps, and after that we got Swift. Xcode seemed to get slower with Swift, and Java (Android) was a more limited language. No complaints, but it was just not as fun and easy as using Objective-C. (in my naive beginner opinion)

Things were a lot simpler back then. I'll never forget my joy learning Objective C at my first ever programming job. RIP Brad Cox


You weren't naive. Objective-C is so much better suited to GUI applications than Java. Plus you could dip into the performance of C where it mattered most. Also, while Objective-C (when used idiomatically) was slower than Java, it was consistently and reproducibly slow around reference counting, while Java itself decided when and where it was time to do a full garbage collection cycle.

And of course, languages like Rust and Swift are only now feasible, given how much compile-time checking they do. But that makes your IDE a lot slower compared to Objective-C.


I love the quote from him where he says "languages are mere tools for building and combining parts of software." I think a lot of new developers get hung up on Language A vs. Language B (or OS A vs. OS B), so I hope this helps them realize that the languages are just tools you have in your toolbox, and that they should be open to switching between (and learning new) languages as needed.


Absolutely! ObjC's raison d'être was pragmatic. Brad Cox said that he didn't invent ObjC because he wanted to come up with a new language, he needed that type of language to solve the problems at hand.

I completely agree with your point about languages being just tools in your toolbox. With that point, I always feel that many of the folks who describe themselves as being 'passionate about language xxx' might be selling themselves short when it comes to having a toolbox that's not a one-trick pony.


:(

Objective-C was the “object oriented C” that was simple and a delight to use…words that I certainly would not use to describe competing efforts. The syntax might be a little disagreeable–a concession to strict C compatibility–but the language itself is remarkably clean and, dare I say, pretty. Brad Cox struck the balance between flexibility and practicality better than almost anyone else before or since.


Well put. The one word I would add to your description is 'pragmatic'. For many years, ObjC was the language that I would rate at the top of the pragmatic list. That stayed true for me until Nim arrived and now it's a toss-up.


Computer History Museum interview with him: http://archive.computerhistory.org/resources/access/text/201....




Objective-C is the programming language that made me fall in love with programming, and led to my career for the past 14 years.

I never met Brad Cox, but the work he did to create it has had a huge impact on my life. Watching his long interview with the computer history museum was a delight and made me feel like I knew him just a little.

Sincere condolences to his family and friends.


Sorry to hear that he has passed. His book, "Object-Oriented Programming : An Evolutionary Approach" still influences how I think about and teach OOP today. His legacy will live for a long time.


RIP. Objective-C was my first language and I enjoyed it even with manual memory management!


True! For those who might be wondering how it's even possible to enjoy working with a language that required manual memory management, I submit to you 'autorelease' and auto-release pools.
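(A pre-ARC sketch, with made-up ivar names, of why autorelease made manual memory management livable:)

    - (NSString *)fullName {
        NSString *name = [[NSString alloc] initWithFormat:@"%@ %@",
                                                          firstName, lastName];
        // The caller doesn't own the result; the enclosing autorelease
        // pool releases it later.
        return [name autorelease];
    }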


I always liked his analogy for object oriented programming as "software ICs" -- just as in hardware development, you don't have to worry about what goes on in a chip (just what it takes as input and gives as output), so too with a well-designed object.


Interestingly, I feel like this comparison to an IC and input(s) -> output(s) is more akin to functional approaches, and many people complain about OOP being the opposite.

To quote Joe Armstrong:

> I think the lack of reusability comes in object-oriented languages, not functional languages. Because the problem with object-oriented languages is they’ve got all this implicit environment that they carry around with them. You wanted a banana but what you got was a gorilla holding the banana and the entire jungle. If you have referentially transparent code, if you have pure functions — all the data comes in its input arguments and everything goes out and leave no state behind — it’s incredibly reusable.


Many ICs contain state. That's RAM's purpose, for example.

The analogy with OOP is that (ideally) you don't need to know how it works internally: you just use the external interface.


Heck, many mixed-signal ICs are much harder to use correctly than a typical object; Things like "Vdd must be brought up to +6V or higher before CLK begins toggling" are not too uncommon to see.


It's a loose analogy. You're going to find exceptions. I think it's a fantastic analogy to describe what a class is, despite some exceptions.


I was actually agreeing with the analogy.


FP comes with the same jungle, especially if you consider that when most people talk about FP, they're allowing for closures. Even if you don't, though, and you do only have pure functions, it's a similar bog.

Consider a function f that itself relies on being able to call function g. In FP, you have to either bind by name, which introduces coupling on par with OO ("the jungle"), or you make it parametric, so f doesn't know anything about g per se, but somebody does―perhaps f's caller, or its caller, etc. OO is parametric, too, but in that case they live as slots on the objects, rather than being passed as arguments.

People can try to squidge this end of the problem and find that it shquoshes up on the other side, or vice versa. There's no real escape from the problem of dealing with state.

Probably the real silver bullet boost to productivity will come when we adapt program execution environments to our dumb brains by way of VR. Think of something like a cellular automaton such as von Neumann's universal constructor:

https://en.wikipedia.org/wiki/Von_Neumann_universal_construc...

... only in 3 dimensions. When we want to debug a program, we project it into a 3-dimensional space, and we trace its execution the way you can look at something progressing through the assembly line on a shop floor, with pieces that you can reach out and touch, and even pick them up and mark some with blue dots and some without, etc. That looks a lot more like OO than FP.


My reaction to FP fundamentalists is always the same. "Jesus, what the hell have I been doing, reusing all this unreusable OOP code?"


There's an extended interview with him about Objective-C in the book "Masterminds of Programming: Conversations with the Creators of Major Programming Languages": https://amzn.to/3iEYfGh


I've spent pretty much my entire career (20 years!) writing Objective-C: first for macOS, later iOS. Of course Swift is now the new kid on the block, and has lots to recommend it. But there's something about the simplicity and purity of Objective-C that has a special place in my developer's heart (despite its flaws & imperfections).

So thank you Brad, you've influenced my entire career. RIP.


I mentioned Brad Cox's "software ICs" today on the phone in a conversation about big ideas in programming, not knowing that he'd passed away a couple weeks ago.

Here's the Objective-C paper at last year's HOPL:

"The origins of Objective-C at PPI/Stepstone and its evolution at NeXT"

https://dl.acm.org/doi/10.1145/3386332

https://news.ycombinator.com/item?id=23516334


That's a truly excellent paper (and exceptionally honest, considering some of the touchy subjects involved) and disentangles many of the origins of the various concepts in Objective C between the Stepstone and NeXT environments.


This is Apple Objective-C right? I thought it was developed in house, didn't realize it had already existed.


It was originally developed by Cox's company in the mid 1980s, and then adopted by Steve Jobs' company NeXT in the late 1980s as the official language of NextStep. The Apple connection is only that Apple bought NeXT and that its OS X is really just a Mac-skinned version of NextStep.


> OS X is really just a Mac-skinned version of NextStep.

You could probably describe Rhapsody that way, but by the time Mac OS X came out, I don't think that's an accurate characterization at all anymore.

In addition to the sizable Carbon subsystem, the NeXTStep pieces were also changed substantially, e.g. the refactor of Foundation that extracted CoreFoundation and the change of the graphics subsystem from Display Postscript to Quartz.


Yep, Cox's company was Stepstone.


The language was selected by NeXT and then later used pervasively in Mac OS X as a result. This left Apple as the main driver of its development.


Objective-C was adopted by NeXT Computer in the late 1980s for their app development framework.

The modern version of Objective-C, the one that's still in use today, was developed by NeXT and Sun and was called OpenStep. The first OpenStep specification was published in 1994.

OpenStep API implementations were created for NeXTSTEP OS, Solaris, and Windows NT, running on Motorola 68040, Intel, PA-RISC, and SPARC (and later PPC) platforms.

Sun would switch gears to Java, Apple would buy NeXT, and OpenStep would become Cocoa.


You won't find much on his Wikipedia article, but his C2 wiki page has more: http://wiki.c2.com/?BradCox


  - (void) dealloc
      {
      // :(
      }


[me say:"Oh no! This sucks."];

I loved his little book on Objective-C.


> I loved his little book on Objective-C.

Do you mean: "Object-Oriented Programming: An Evolutionary Approach"? It's 320 pages!


What an amazing career. I was curious about this:

>"The late Steve Jobs', NeXT, licensed the Objective-C language for it's new operating system, NEXTSTEP. NeXT eventually acquired Objective-C from Stepstone."

Does anyone know what NeXT paid to acquire the Objective-C license?


They were both privately held companies, so it might never have been disclosed.

Part of the acquisition was that NeXT would license Objective-C back to Stepstone for their own products, so it was more than an outright purchase anyway.


Tom Love mentioned it in a talk once (sorry, I’m not in a position to search for the reference right now). They were offered a choice between a fixed fee of some tens of thousands of dollars, or fifty cents per device. They picked the fixed fee, and felt it was the correct choice right up to the 2000s.


Interesting! Do you know what the "per device" was? Was that device with the Objective-C compiler installed? Device that shipped with the Objective-C runtime?

If it was device with the runtime, I imagine we'd have seen a rewritten "modern" runtime a few years earlier than we did.


I've found the video here https://www.youtube.com/watch?v=adI6-liGXqE; he said something along the lines of "five dollars [not fifty cents] per device running Objective-C", which makes me think he meant the runtime.

I don't necessarily agree that the result would have been a rewritten runtime, just because the NeXT runtime (and compiler based on gcc) already was their own code so the licence must have been for "IP" in broader terms. Remembering that Apple were very heavily into Java around the time of the NeXT integration (https://www.sicpers.info/2020/07/so-whats-the-plan-part-1-wh...), I imagine that they would have gone more heavily in Java's direction.


Great language. Amazing bang for the buck. RIP.


What a legacy. Objective-C feels like a fun toy you can play with, it really does make cool things quite easy that are really hard in most other languages, like the iOS animation system.


He was bold enough to create a DSL starting from C. Too many black bar-worthy losses lately.


Given the impact of his contributions, I don't understand why there isn't a black bar.



I had no idea he was living in South Carolina until I saw the news begin to appear on local channels.


Very sad, rest in peace and thank you for all your contributions.


Was it covid?


The message passer has passed.


Having lived in Manassas, I express my deep regret that this pioneer spent his final years there.


If I had known he was literally down the road from me I'd have tried to pay my respects earlier; I didn't catch this in the local news at all. It's not that bad here compared to 10 years ago, especially with the VRE train routes to DC and new stuff downtown.


I moved there in 1992 from Warrenton, predictably due to misfortune. Couldn't get out of Bobbitt-Land fast enough. Glad it's improved.

Do city cops there still obsess over teen boy genitals? https://www.techdirt.com/articles/20140709/07330027823/prose...


Manassas just elected its first Democrat mayor in like 100 years, for what it’s worth. The area isn’t as prosperous as the rest of NoVA but it’s kind of being forced to stop being looked at as a slum. Home prices YoY led the entire state this past year with the growth of remote employment. I know plenty of people still call it a NoVA ghetto similar to Herndon, but I’m happy to not be shielded from day to day reality for people that don’t have white collar jobs when I look outside my windows.

I can’t say much about law enforcement honestly one way or the other. Danica Roem won multiple elections here for focusing upon fixing the terrible traffic on 28, and it’s proof that local economic issues matter more than social issues now. The problems with that road have kept economic development from going to an otherwise welcoming community, and I blame that upon a laisse-faire style leadership for many decades that’s ironically impeded growth for the community long-term as AWS grew just down the road.


I ought to note that I grew up and lived in Lorton until the late 80s. Even with that experience, I found Manassas to be a difficult place to live.

I never considered Manassas to be a slum or ghetto. In the early 90's, I found it to be a stifling place dominated by people who seemed resigned to live in a very small world.


Allah rahmet eylesin (Turkish: may God rest his soul).


Wow, so very sad.


Does anyone know what he died of?

Given current events, my assumption is COVID-19. But I know that I'm assuming that too often. Old people do die of other things.


I was thinking about how the vaccine just started rolling out and has already killed “frail” people, but I have no knowledge of this guy other than him being a computer programmer... gulp (I am a computer programmer)


Funny thing: I just discovered the singer Brad Cox [0] while searching for the original Brad Cox Wikipedia page, and I like his songs.

[0]: https://www.youtube.com/user/bradleycoxmusic



