Java Was Strongly Influenced by Objective-C (gmu.edu)
131 points by thealphanerd on Dec 26, 2012 | 69 comments

Wow, a pointer to an old page of mine, that's weird to see on HN. This post was originally from comp.sys.newton.misc, back when I was a PhD student. Here was the thread:


The discussion was originally about Java on the Newton, and someone had made the goofy suggestion that Java was based on NewtonScript. Patrick Naughton jumping in was an interesting ending to the thread.

As a side note: four years later I actually wound up writing the only version of Java released on the Newton (Waba).


That is quite cool! I wrote a JVM for the PalmPilot in 1998:


Lots of fun fitting the 1.0.2 class libraries into 180K (out of 256K available, needed some space to run in!).

Maybe this time you can fit your proof that P == NP in there.

Well, Objective-C was influenced by Smalltalk, and nearly every modern OOP language was also heavily influenced by Smalltalk. And of course there has been a lot of cross-pollination between Smalltalk and Java over the years. Hotspot was written by the team that had originally marketed a Smalltalk VM; Gilad Bracha co-authored the second edition of the JVM specification and was also involved in the Strongtalk Smalltalk implementation; and of course one of the key developers of Hotspot, Lars Bak, is now working on Dart, which is basically Smalltalk with Javascript syntax.

My impressions of Dart are not particularly well-informed but I must raise an eyebrow at the allegation that it's Smalltalk with Javascript syntax. Care to elaborate?

The semantics are the closest to Smalltalk of any of the relatively high-profile languages. At the implementation level, everything is an object that responds to certain selectors. Objects have a static layout (unlike say Javascript or Python or Ruby which treat objects as basically hash tables of fields), and the call frames are relatively static too, compared to Python and Ruby which let you do all sorts of weird magic with the call stack. Scoping is Lisp-y, instead of the various weird scoping rules Javascript, etc, have come up with.

> Objects have a static layout (unlike say Javascript or Python or Ruby which treat objects as basically hash tables of fields)

This makes life easier for implementors of high-performance VMs and tends to reduce the memory footprint of objects.

Since we're on that topic… in Objective-C, object layout is determined at runtime (these days anyway). It's pretty efficient though, and lets you do sometimes-useful stuff like add ivars at runtime. It also kills the fragile base class binary compatibility issue that plagues other compiled languages (particularly relevant as a glue / MVC language).

Thanks. That piques my interest. I'll have to take a deeper look.

I can't help but think that ARC has pretty much killed garbage collection. Why use garbage collection at all when you have a static equivalent that works just as well?

It doesn't work nearly as well. ARC doesn't handle cycles at all, which makes it substantially more painful to use than a garbage collector. Don't get me wrong. ARC is great compared to the previous situation, but it's simply not a substitute for a real GC.

Automatic reference counting dates back to probably the 1950s. If it "works just as well" as garbage collection, don't you think it would have won by now?

One day a student came to Moon and said, "I understand how to make a better garbage collector. We must keep a reference count of the pointers to each cons." Moon patiently told the student the following story - "One day a student came to Moon and said, "I understand how to make a better garbage collector...


This makes me paranoid that you can read my thoughts.

Why? The garbage collection koan is common knowledge among compiler and PL people, I just took some initiative.

It's a joking way to indicate that this is exactly what I was thinking of when I wrote my comment.

ARC doesn't work just as well. At least theoretically, ARC retains a lot more garbage because the compiler has to be much more conservative about reachability than the garbage collector. ARC has theoretical failure modes that can cause huge amounts of retained garbage, and that's less well-understood than the failure modes of GC. Doing reference count updates is also generally slower in terms of CPU cycles than doing garbage collections. Oh, and ARC doesn't handle cycles.
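The cycle problem can be made concrete with a small sketch. This is a hypothetical, hand-rolled refcounting class (not ARC itself, which inserts retain/release at compile time), but it shows the core failure mode: two objects that retain each other never see their counts drop to zero, so neither is ever deallocated, whereas a tracing GC would reclaim both once they become unreachable.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical manual reference counting, for illustration only.
class RefCounted {
    static int liveObjects = 0;           // how many objects are still "allocated"
    private int refCount = 0;
    private final List<RefCounted> references = new ArrayList<>();

    RefCounted() { liveObjects++; }

    void retain() { refCount++; }

    void release() {
        if (--refCount == 0) {
            // "Deallocate": drop our own references, possibly cascading.
            liveObjects--;
            for (RefCounted r : references) r.release();
            references.clear();
        }
    }

    // Take an owning reference to another object.
    void addReference(RefCounted other) {
        other.retain();
        references.add(other);
    }
}

public class CycleLeak {
    public static void main(String[] args) {
        RefCounted a = new RefCounted();
        RefCounted b = new RefCounted();
        a.retain();            // our "stack" reference to a
        b.retain();            // our "stack" reference to b
        a.addReference(b);     // a -> b
        b.addReference(a);     // b -> a: the cycle

        a.release();           // drop both stack references
        b.release();

        // Each object is still retained by the other (count == 1),
        // so neither is ever deallocated: a leak a tracing GC catches.
        System.out.println(RefCounted.liveObjects); // prints 2
    }
}
```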

That said, I think reference counting warrants more attention than it has gotten in memory management research. If you're willing to punt on the leaking cycles issue, you can get a lot of mileage out of using a reference counting collector as a tenured generation with a copying nursery.

> you can get a lot of mileage out of using a reference counting collector as a tenured generation with a copying nursery.

Are there implementations one can play with?

I think MMtk might have an implementation of the technique. There are some papers on it: http://www.cs.technion.ac.il/~erez/Papers/AzatchiPetrankCCPE...

ARC has absolutely killed garbage collection.

    Important: Garbage collection is deprecated in OS X v10.8. You should use ARC
    instead—see Transitioning to ARC Release Notes.

Whether ARC is better or worse than GC overall is immaterial. Apple has moved away from garbage-collected Objective-C.

> ARC has absolutely killed garbage collection

... In Apple's Sandbox. Apple has also killed ethernet and other very not-dead in the real world things.

Killed ethernet? That's a somewhat ill-informed statement to make. The latest iMac still has an ethernet port.

They did remove it from their line of portables as most people use wireless with their laptops (and they scored some valuable real-estate on the main board!) but you can get a Thunderbolt network card if you really want to be tethered to your desk.

Ah, the iMac that you still can't buy (end of the year is a really strange time to have no inventory of your new flagship desktop)... But that's another rant.

Apple's computers (desktops included, save for the Mac Pro which is essentially EOL) no longer have an optical drive, maybe I should have used that instead. I saw a coworker hunting for his MacBook's ethernet port today, so it was fresh in my mind.

I honestly can't remember the last time I used an optical drive (and I have one in my MBP). Alternatives are superior in virtually every way.

If you have an always on, reasonably fast and unmetered broadband connection perhaps. I understand it on the laptops, but desktops have plenty of available space so they're removing them simply for the sake of removing them.

I don't use the optical drive on my iMac much, but the times I have had to were not times I would have been able to avoid. I don't want yet another accessory/dongle or one less USB port. There's tons of room, just include the drive.

This came up recently with trying to install Windows inside Parallels on my girlfriend's laptop. Their IT people had a DVD and of course her laptop has no DVD drive, so I looked up how to use Remote Disc and found out that it works great except explicitly not for installing an OS and oddly almost all the other common uses of an optical disc.


These types of discs are not supported by DVD or CD sharing: DVD movies, Audio CDs, Copy protected discs such as game discs, Install discs for an operating system such as Microsoft Windows (for use with Boot Camp), or Mac OS X.

If Apple does not take care that its frameworks behave well with a GC app, you are likely in for a masochistic rollercoaster ride.

Apple also hasn't killed Ethernet, the desktops still have it.

It's easy to get into a world of hurt even when Apple does its best to make their frameworks well-behaved with GC.

Retrofitting a conservative GC onto C is really hard, and they were never able to fully make it work. If you look at the Boehm collector's page, you see a lot of potential issues that can and do arise. Having these issues in an optional add-on that you adopt (or not) after weighing the issues is one thing, having them in a system-component that is declared to (a) be the future and (b) "Just Work™" is another.


I think he meant more generally. ARC, as a general algorithm, has killed the need for garbage collection, as a general algorithm, *anywhere*. As opposed to what you're saying: ARC, Apple's implementation of ARC, has persuaded Apple to abandon their GC implementation. These are two very different meanings (but yes, in Apple's case ARC has killed GC because of particular characteristics and needs of Apple's developer toolchain and environment).

ARC is terrible, I've had far more errors with it than I ever did without.

Do tell, please.

Curious... I've never had a problem with it.

rdar? I'm calling BS on this.

Besides memory fragmentation?

Also remember that the type-feedback breakthrough of Strongtalk was done by one Urs Hoelzle, subsequently Google employee #9 and currently an SVP of Engineering there.

...And Urs Hoelzle worked on the implementation of Self programming language when he was a PhD student at Stanford. Self was the result of a drastic simplification of Smalltalk. Applying Occam's Razor, Self used the prototype concept as the central organizational mechanism as opposed to Smalltalk's classes. Prototypes later were popularized by JavaScript (and of course, mangled in the usual JavaScript ways.)

See http://labs.oracle.com/features/tenyears/volcd/papers/intros... for a historical note. Smith & Ungar's "Self: Power of Simplicity" is a must-read for anyone interested in this topic: http://ranger.uta.edu/~nystrom/courses/cse3302-fa10/selfPowe...

Not only that, but he developed type feedback while working on Self, a heavily Smalltalk influenced language.

Which is why Objective-C++ is so much fun to have available in your Mac/iOS programming toolbelt: it's the best and worst of two completely different C-derived object-oriented language lineages in one schizophrenic package.

It's a terrible Frankenstein monster but it can be pretty useful. My most popular iOS app is a synthesizer with a UI in Obj-C and a DSP core in pure C++. Interfacing the two is pretty trivial thanks to Obj-C++.

I've been considering a port to Android but all the JNI boilerplate I'd have to write is putting me off.

I'll agree that JNI is inelegant, but having recently written a small amount of JNI glue with the NDK for a hobby/side project, I have to say, once you get going it won't be a big deal. If that's really the only barrier to an Android port, I say bite the bullet, sit down and do it.

I need to do it anyway. I'm taking on more and more Android work and I know sooner or later I'm going to have to dig into the native layer. Thanks for the words of encouragement.

So's mine (and we both have Futura-heavy UI's. :]) I looked into an Android port and walked away very quickly: there's no Core Audio, not even a HAL. Most devices have unusably high audio latency, too. It's a nightmare.

It's unfortunate really. Philosophically I'm much more aligned with Android and overall I prefer the Android API but music apps are my passion and Android is so far behind there it's not even funny.

Cool synth, by the way. I'd like to see more of these kinds of experimental instruments on the iPad.

Thanks, likewise. Been meaning to try out some grain stuff, now I have an excuse.

Actually the app name is kind of a misnomer. It started out as a grain synth but wound up as a straight subtractive synth, but I was kind of committed to the name already. I should probably change it.

My personal best/favorite use of Objective C++ was using it for code reuse between an iOS app and a WPF Windows desktop app. I was building a horribly boring CRUD app that used google Maps (basically a work order management program) and doing something a little outside of the normal helped make the project fun.

So how come they missed out on all the cool features like "performSelector" and "respondsToSelector"? Sure, the whole OOP part may come from Objective-C (SmallTalk to be precise) but the dynamic parts were poorly executed.

Well, they're a lot longer to type, but you can use [1] to get a method (and handle the error to see if it exists) and [2] to call the method on a given object.

1: http://docs.oracle.com/javase/1.5.0/docs/api/java/lang/Class...

2: http://docs.oracle.com/javase/1.5.0/docs/api/java/lang/refle...
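Those two reflection calls can be combined into a rough Java analogue of respondsToSelector / performSelector. The class and method names below are made up for the example; the API calls (Class.getMethod, Method.invoke) are the ones linked above.

```java
import java.lang.reflect.Method;

public class ReflectDemo {
    // An ordinary method we'll look up and call dynamically.
    public String greet(String name) {
        return "Hello, " + name;
    }

    public static void main(String[] args) throws Exception {
        ReflectDemo target = new ReflectDemo();

        // "respondsToSelector": getMethod throws NoSuchMethodException
        // when the method doesn't exist, so lookup doubles as the check.
        Method greet;
        try {
            greet = target.getClass().getMethod("greet", String.class);
        } catch (NoSuchMethodException e) {
            greet = null;
        }

        // "performSelector": invoke the Method on the target object.
        if (greet != null) {
            System.out.println(greet.invoke(target, "world")); // prints "Hello, world"
        }
    }
}
```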

Java takes its most basic ideas about how OOP is done from the Simula and C++ side, rather than from Smalltalk and Objective-C. In that world view you don't ask whether something responds to a selector but rather whether it's an instance of a suitable interface. There are still occasions when respondsTo/perform is a nice pattern, but it's often better done with lambdas. Perform also tends to need annoying special-casing to optimise.

The exception I'd make to that is meta-programming—especially for ORM libraries—where respondsTo, perform, and lambdas together produce some lovely patterns for marrying a schema to a dynamically generated object hierarchy.

Not to mention the fact that they completely ignored class methods.

No, static methods do not count.

The only difference I know of is that you can override inherited class methods in Smalltalk, but not in Java. Is that really an important feature that you use? I suppose you could use it to do "factory" stuff, without an actual factory class.


Imagine my surprise when I discovered Delphi has class methods too, but they were not copied to C#. I guess most of the time they're not particularly useful, but when you do need to create some factory code, having pure virtual class methods (template methods) is really useful.

Downside w/ Delphi is that the metaclasses do not implicitly exist - you have to define them manually, e.g. TMyObjectClass = class of TMyObject; They should just be implicit: when you define a class, the metaclass is defined too. I guess when they first designed it, memory constraints were still a consideration.

Out of curiosity, why do statics not count for class methods?

They are closer to global functions that are attached to classes and don't support inheritance the same way as instance methods. e.g. if a subclass implements a static function with the same signature as its parent, and you call it with the parent class, it invokes the parent class version instead of the subclass version.
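That behaviour can be sketched directly. In Java a static method with the same signature in a subclass *hides* the parent's version rather than overriding it, so dispatch follows the declared type, while an ordinary instance method dispatches on the runtime class (class names here are made up for the example):

```java
public class StaticHiding {
    static class Animal {
        static String describe() { return "animal"; } // hidden, not overridden
        String name() { return "animal"; }            // ordinary virtual method
    }

    static class Dog extends Animal {
        static String describe() { return "dog"; }
        @Override
        String name() { return "dog"; }
    }

    public static void main(String[] args) {
        Animal a = new Dog();
        // Virtual dispatch: resolved from the runtime class, Dog.
        System.out.println(a.name());          // prints "dog"
        // Static dispatch: resolved from the class you name at the
        // call site, so the parent's version runs even though the
        // only instance around is a Dog.
        System.out.println(Animal.describe()); // prints "animal"
    }
}
```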

Class-side methods in Smalltalk are inherited; working with classes (as objects) is analogous to how instance-side methods work with instances (also objects). So e.g. implementing and using factory methods is simpler in Smalltalk and similar languages.

Edit: grammar and typos.

class methods are virtual

> (SmallTalk to be precise)

Smalltalk to be precise.

That might explain its complicated syntax. Better than obj-c but worse than nice languages like Ruby or Python.

Also, if they were influenced by Obj-C then why on earth didn't they bring over the by far most useful feature of the language, categories?! Or maybe categories were a later addition?

I have been developing Java for 15 years and ObjC only for two but categories are seriously the best thing since sliced bread. Anything missing in the libraries? Just add it!

Just make sure you prefix your category methods. Somebody at Apple might have the same idea you have, and if the two implementations aren't perfectly compatible, you're in a world of hurt.

I wouldn't know how much the convention is enforced, but doesn't Apple prefix private API with an underscore?

Edit: this is from the Apple naming guidelines, but it doesn't clarify that they always follow that convention for private methods: Avoid the use of the underscore character as a prefix meaning private in method names (using an underscore character as a prefix for an instance variable name is allowed). Apple reserves the use of this convention. Use by third parties could result in name-space collisions; they might unwittingly override an existing private method with one of their own, with disastrous consequences. See “Private Methods” for suggestions on conventions to follow for private API.

They most definitely have private functions that do not have a prefix. The iOS Three20 framework used to have a bug where tapping a tab bar button a second time crashed because a category method had the same name as an existing private method.

The new method could be public.

A lot of the newer JVM languages provide better equivalents to Obj-C categories. Most of them also let you mix in instance variables along with methods. Check out Scala or Kotlin for specific examples.

Objective-C categories nowadays can include instance variables :-)

Really? Since when? Do you have a link to docs? I've seen some hacks involving associated references but it was my understanding that categories were not allowed to add instance variables because Apple would have to make member access much slower to allow that kind of extension.

My bad: they are allowed in class extensions, which are similar to but not the same as categories.

Given how the new non-fragile ivars are implemented, it should be possible to add ivars anytime before the first instance is created (so load time should be OK), but that has apparently not been implemented, class extensions only work for code compiled together.

So I guess no dice, apart from associated objects or the old indexed-instance-variable tricks:


Alas, those tricks break on bridged CF classes that have incorrect metadata:


(Checked on 10.8, instance-size for CF objects still incorrect)

>That might explain its complicated syntax.

Not at all. The syntax was modelled after C++, which was what Sun wanted Java to replace at the time (generally it is a C-family style syntax).

Smalltalk itself has better syntax than any of the bunch.

>Better than obj-c but worse than nice languages like Ruby or Python.

Ruby and Python are not statically typed languages, which covers a lot of the simplicity in their syntaxes.

And while they did exist at the time, they were far from anybody's attention at the time Java emerged. It took 3-5 years after that for Python to make a strong appearance in some circles, and another 3-5 years for Ruby to emerge.

Fascinating that this snippet is from comp.sys.newton. I've forgotten all my NewtonScript, but do remember the richness of the Newton programming model.

I'm twenty-one and the email referenced something called "Excited Live". Can anybody explain what that was? Google hasn't been much help.

Excite Live! was one of the many services (circa ~1996) offered by Excite, which was kinda-sorta like Yahoo (search, web site directory, etc, etc) but is mostly remembered today as the company that could have bought Google for pennies but passed on it.



I worked for Excite entirely by accident, and my tenure there was pretty brief due to an odd situation. I was originally hired by Novo Media Group, which at the time was in the final stages of a merger with McKinley (known for the Magellan search engine). McKinley's Sausalito offices were closer to my home (Novo was in SF and I was living in Marin), so I was located in that office under the assumption that both offices would soon be serving the same company. But the deal with Novo fell through, Excite bought McKinley instead, things got weird, and I left.

Shit I feel old.

I remember what the @home jingle ~2000-2001 should've been:

"I don't wanna grow, I wanna buy a crappy web portal..."
