Apple stops support for WebObjects (cnn.com)
99 points by secondary on May 13, 2016 | 76 comments



I like watching that 1996 WebObjects presentation [0] just to see how technically detailed Jobs could get, in a time before Apple could just dominate the consumer attention span with the slick, adjective-heavy presentations they have today. He's taking questions from the audience, people are walking around in the middle of the presentation. He may not have been a programmer but he knew how to talk about the details, not just talk about the big and beautiful ideas.

[0] https://www.youtube.com/watch?v=goNXogpwvAk


It is always somewhat amusing to see an old video explaining to a selection of experts, "here is what dynamic HTML is and what you could do with it," when you know what it has become since. I particularly like the part where the submit button on a web page fires up an Excel session on the server, which calculates a price and returns it in the HTML! I wonder why we don't do more of that today!


It could be interesting to take spreadsheets and compile them into functions.

You'd get a DSL with an incredibly powerful IDE and interpreter for development basically for free.
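A toy sketch of the idea in Python (the compile_cells helper, the cell names, and the lambda-based formula representation are all made up for illustration; a real version would parse Excel formulas rather than take callables):

    # "Cells" are either constants or callables over other cells; a real
    # compiler would translate Excel formulas into these callables.
    def compile_cells(cells, output):
        def fn(**overrides):
            cache = {}
            def get(name):
                if name not in cache:
                    cell = overrides.get(name, cells[name])
                    cache[name] = cell(get) if callable(cell) else cell
                return cache[name]
            return get(output)
        return fn

    # Toy "spreadsheet": unit price in A1, quantity in A2, total in A3 = A1 * A2
    sheet = {"A1": 9.99, "A2": 3, "A3": lambda get: get("A1") * get("A2")}
    total = compile_cells(sheet, output="A3")
    print(total())         # 29.97, using the values stored in the sheet
    print(total(A2=10))    # 99.9, overriding the quantity cell

Actual Excel formula parsing, and dynamic references like OFFSET or INDIRECT, are of course where it stops being a toy.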


There are server-side Excel engines like SpreadsheetGear. But my comment was ironic; I am not really suggesting it is a good idea.

But I like the idea of automatically converting Excel logic to a DLL. It's non-trivial, as Excel allows some pretty messy things. As soon as someone uses OFFSET or INDIRECT, all bets are off in terms of what the spreadsheet may do.


You can also use the Excel Services architecture: https://msdn.microsoft.com/en-us/library/office/ms582023.asp...


I don't think it's a bad idea either. You might not be able to reasonably support all features Excel has but you could still provide some value with a subset.

I think this could be quite useful, not so much when you're starting something from scratch, but when you already have spreadsheets that implement some algorithm. In those cases, having a compiler translate the spreadsheet would probably make more sense than having a human do it.


Chris Granger (of Light Table fame) and his bunch at Eve are working on something like that. There are a bunch of presentations on YouTube. They are more looking at it abstractly as a possible programming / data model, than purely 'compile an Excel sheet to a programming library'.

Eve: https://twitter.com/with_eve Chris Granger: https://twitter.com/ibdknox


That's interesting but I think it addresses a different problem. It seems to be directed at having a more natural and intuitive way to construct and analyse data.

The thing is, most business users are already familiar with pivot tables in Excel, clunky as they are. But Excel is not only used for data; it is also widely used for complex business logic. And that's where you have logic which IT will struggle to understand and therefore shouldn't own, and which the business needs to set up and maintain. Think complex tax calculations, or a business plan, or the logic for a quote on a big project, etc. These will be highly non-trivial in terms of the logic the spreadsheet follows, and that's where a library that could reliably swallow a spreadsheet and turn it into a function IT can use but the business can maintain has a lot of value.


If you're in .NET or PowerShell, and Excel is installed on the PC, you already have the Excel libraries. Not to mention the amazing ubiquity of functions to convert, import, manipulate, export, and otherwise work with spreadsheets. Or just work with the functions directly.
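Presumably that means COM automation; here is a minimal sketch of it from Python via pywin32, which scripts the same Excel.Application object that .NET and PowerShell would (assumes Windows with Excel installed; the workbook path and cell addresses are invented for the example):

    import win32com.client

    # Drive a locally installed Excel over COM: push an input cell,
    # let Excel recalculate its own formulas, and read the result back.
    excel = win32com.client.Dispatch("Excel.Application")
    excel.Visible = False
    wb = excel.Workbooks.Open(r"C:\models\quote.xlsx")   # example path
    ws = wb.Worksheets(1)

    ws.Range("B1").Value = 10      # example input cell
    excel.Calculate()              # Excel evaluates the sheet's formulas
    print(ws.Range("B5").Value)    # example output cell

    wb.Close(False)                # close without saving changes
    excel.Quit()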


Are you referring to Excel's COM interface?


Not directly related, but Microsoft PowerApps have a very Excel-like feel to them, except you're using formulas to wire up arbitrary controls that you've placed on the canvas, and various tabular or hierarchical data sources, rather than grid cells.


A bit of a duplicate of my response to melchebo above: looking at it quickly, it seems to address a different problem. PowerApps seems to be directed at giving business users the ability to quickly create a simple CRUD application / website. But I think the use case where the business logic has to sit in Excel and be owned by the business is where you have complex logic in Excel, like a tax calculation, or a regulatory formula, or some complex quote for a client, etc. These are problems that are intrinsically complex, that IT can't really productionise by just looking at a couple of formulas, and where having the ability to express the logic in Excel but use it on a server has value.


Good point. There's no reason we couldn't have Excel as a headless DLL and spreadsheets compiled into a nice useful bytecode except that the software itself is a complete trainwreck of legacy issues.


Not to mention one that non-developers already know and are productive in.


This is how WebWORQ and ReportWORQ operate.

Disclosure: I worked on both of them


Because Excel is a multi hundred megabyte, incredibly complex piece of software. Way too much trouble if you just want to do simple arithmetic.


Note that this is from early 1996, when Steve Jobs was CEO of NeXT. Apple didn't acquire NeXT until nearly a year later.


I like the comment about online transactions not being a technical problem but a business one (in his opinion), a problem that would be solved that very year (1996). And about sending card numbers over the Internet being safer than giving them over the phone. These days many organizations still ask for those details over the phone, or even on paper.


This is what the Dodge site that appears in that video looks like now: http://www.dodge.com/en/


Ah, WebObjects. In a lot of ways, it was the Ruby on Rails of its day. It was one of the first integrated, opinionated platforms with all the tools to build web sites backed by a database. Architecturally, it was years ahead of the first Java application server platforms; the persistence model was so much better than the first iteration of Enterprise JavaBeans.

Its major drawback, in the early days at least, was that it was designed as an Objective-C framework. By the time I started using it (around 2000), Java was supported, but it was clearly an afterthought and you dealt with a buggy bridge to ObjC components and Objective-C-isms all over.


Interesting, I must have been using it just a year or two after you. By the time I was using it (in Java), I don't even remember whether the Java side was still just a bridge to ObjC (at some point I think they rewrote the core in Java?), but there were no problems with a buggy bridge to ObjC components; it worked fine. There were sometimes frustrations with the architecture, but a buggy ObjC bridge wasn't one of them. (The statefulness of the architecture wasn't a good fit for the web as it was then; interestingly, it was a similar approach to what React does, but actually exposed in public URLs, which was kind of disastrous.)


I was there, at that announcement in 1996. It was the only time I saw Steve Jobs speak. His cult was pretty much not a cult then, and it was pretty hard to understand what he was talking about, but of course a short time afterward we were all talking about and building db-backed websites (and my career was based on it for a long time). The prescience was always there.


Not surprising at all, given the rise and fall of Java as a first-class citizen on OSX (remember how first-party Java support was touted as a huge feature in the first OSX releases), and likewise OSX Server (which, once the OS of choice for running file, mail, and web servers on Xserves, has been relegated to an App Store .app that mostly seems to be useful for running Xcode iOS build bots on Mac minis).


>Not surprising at all, given the rise and fall of Java as a first class citizen on OSX

And not only on OSX -- generally the rise and fall of Java. In the late nineties, "Java Applets" (+ the web plugin) were supposed to go big. Never happened. Then there was always the promise of Java on the consumer desktop. Never happened either [1]. Java on the enterprise desktop also didn't fare that well. As for startups and smaller companies, they mostly go with Ruby/RoR, Python, Node, Go, PHP, etc.

Nowadays it's just mainly Java on the enterprise server.

[1] Yeah, I know you can name 5 somewhat successful desktop Java apps. I doubt you can name 50 though, and that's the whole point.


Sun managed it pretty badly.

They could have gotten Java much more widespread if they hadn't been religiously opposed to AOT compilation to native code, leaving that to third-party JDKs and thus forcing the whole distribution headache.

Then the APIs for media and graphics, 3D, and desktop integration were left abandoned.

Swing is powerful, but by default it looks awful; one needs to know which APIs to make use of. The Filthy Rich Clients blog was quite helpful in this regard.

The sad thing is that it helped to displace much nicer native languages like Delphi. Borland's mismanagement also contributed a lot, though.


If you look at super early OS X documents, it's interesting to see how highly the Java development platform was regarded. For example:

https://developer.apple.com/library/mac/documentation/Portin...

Here, Java is a peer of Carbon and Cocoa.

Also, IIRC, in the original OS X reveal presentation, Jobs flagged Java as deeply, natively integrated in a way that no other OS had done before (surely this is available on YouTube, but I don't have a link handy).

It would almost seem like they were betting half the farm on Java. I find it fascinating how fast Java was alienated from the Mac given how deeply it was initially embedded. Even today, Oracle's JVM for OS X installs into the dedicated /Library/JavaVirtualMachines folder, something I've never seen anywhere else.


I remember those days.

As I also remember Apple sales people coming to CERN doing a presentation, trying to sell us the idea that since Mac OS X was built on NeXT and BSD, it was also a UNIX that could be used by the researchers.

In the early days of MacOS X, Apple wasn't sure that the Mac OS developer community would enjoy using Objective-C, so they bet the farm on both languages.

As it became clear that the majority of developers were quite happy using Objective-C, they just slowly killed the Objective-C-Java bridge, and then eventually gave the JVM code to Oracle.

This also helped kill Java's role on the desktop and, together with Microsoft, started the industry cycle of moving back to AOT-compiled languages.

I wonder how it would have turned out if they had killed Objective-C instead.

On the other hand, I also sometimes wonder what a Mac OS X based on BeOS (C++) would have been like.


>This also helped kill Java's role on the desktop and, together with Microsoft, started the industry cycle of moving back to AOT-compiled languages.

Well, it's not like Java ever did much to win the desktop. Swing was a horribly over-engineered mess, performance was crap until GC improved and CPUs caught up in the mid-to-late 00s, and the desktop saw mostly neglect from Sun.

>I wonder how it would have turned out if they had killed Objective-C instead

And have all OS X userland apps written in Java? Apple would probably have been dead.

Besides, it's not like they killed Java amid protests from developers -- the majority of developers and users "were quite happy using Objective-C", as you said.


Regarding Swing, it is quite possible to create beautiful applications with it, as I mentioned regarding the Filthy Rich Clients blog.

The big problem was that it required an effort to learn how to do it, which the majority of developers weren't willing to invest.

I still don't understand why one needs to explicitly enable the native L&F. It should have been enabled by default.


>It would almost seem like they were betting half the farm on Java. I find it fascinating how fast Java was alienated from the mac given its initial deep embed.

In those days (2000-2001) Java was the most hyped language on the planet, thanks to Sun's marketing efforts (and a huge cash influx from the dot-com era and the need for big servers).

That's what motivated Apple.

But the reality was, few developers ever cared to produce Mac apps in Java, and fewer users liked them when they did appear. Despite the deep integration (which was mostly some hooks into native APIs), it was always an alien-looking stepchild on the platform.


The only Java Cocoa app I can remember using regularly is the FTP client Cyberduck.

Looks like Cyberduck is still written in Java using the Cocoa bridge:

https://trac.cyberduck.io/browser/trunk/osx/src/main/java/ch...


Actually, the Java-everywhere thing worked out pretty well, but Oracle is suing the OS that made that idea come true.


Java-everywhere wasn't about a single platform with Java being everywhere, it was about Java being on every platform.


We are in a winner-takes-most world, so the first strategy beats the second one every single day.


I'm not so sure about that.

Microsoft, for example, is a winner in the desktop space, but they don't get most of the server market, nor most of mobile. And even their desktop profits are not that great.

And Android is a winner on the market-share front, but its profits (from all makers -- Google, Samsung, etc. -- combined) are at best equal to Apple's, which has 1/3 to 1/10 the market share (depending on the country).


I can name two that never sold very well. I built them :(


Based on adoption rates I would say Java applets were very successful. One area where Java applets were still actively used until very recently was online banking, where they were used for secure OTP entry. I'm sure there are still some sites that use them.

As a technology, it was fairly clear even in the late nineties that Java was a bad fit for the web.


Brazilian banks do that. Nearly all of them. It is a major pain.

Some are moving away from them, but installing spyware instead.


>Based on adoption rates I would say Java applets were very successful.

How successful exactly? Crappy banking sites aside, you could, even in 2000 or 2005, surf the whole web with Java disabled and not have any problem.

That was less true if you disabled Flash.


Java applets were all over the place for many years. I remember there were tons of maths, CS and physics sites that used Java to render things like simulations and algorithms.

Sometimes the Java stuff was hidden: Before XMLHttpRequest I personally used Java in several web apps as a way to do messaging with the server, because Java permitted you to do arbitrary socket connections. You inserted a 0x0 applet into the page and then used the JS bridge that both IE and Netscape provided. Aside from the JVM startup time (which could be several seconds) it was pretty great, actually. I wrote a commercial audio conferencing system that showed the talk status of each participant, for example.

Basically, Java applets were (were, mind you) highly successful before the advent of Flash and AJAX.


Very recently being when?

I don't remember any time I've used a Java applet on any banking site.


The Norwegian government's BankID system (used by all Norwegian banks, and also supported as a login mechanism across all the government's digital solutions, including the national paperwork portal where everyone files their tax returns) only migrated away from Java 18 months ago [1].

[1] http://www.dinside.no/931165/dnb-dropper-java-fra-november


In Germany, the website of the German equivalent of the IRS uses a Java applet for the Umsatzsteuervoranmeldung (don't ask me what that is in English, though).

So it's not banking per se, but it's finance-related.


VAT Return. (According to Google Translate)


Have you heard of this thing called Android?


What about it? We're talking about the historical rise (and fall) of Sun's/Oracle's Java technology.

Android uses its own runtime and its own platform APIs; they just borrowed the syntax and basic libraries and added their own (non-standard) implementation underneath, plus platform APIs unique to Android.

It's nothing like SWT/Swing, J2EE, or the "write once, run anywhere" Java of yore.


And yet the iTunes and App Stores are both built with it, as is a chunk of Apple Music.

It's a pity it's been neglected so long; there were some nice touches in there (bindings, EOF, some of the tooling when it wasn't crashing) that I'd love to see modern versions of.


Does that mean Cupertino has already rewritten iTunes, the App Store, and the Apple Online Store? I'm guessing they're going to keep the WebObjects-style URLs for compatibility with old versions of iTunes and the App Store.


I used to work on those projects a few years ago, so it all might have changed, but I doubt it.

The iTunes Store was a more traditional monolithic WebObjects app using Project Wonder. The Apple Online Store had a microservices-based architecture which really used just the HTTP routing part of WebObjects.


At least https://itunesconnect.apple.com appears to have been rewritten in AngularJS. The https://developer.apple.com/ portal also got a new UI not too long ago.


Given that your first link redirects to

https://itunesconnect.apple.com/itc/static/login?view=1&path...

...it looks like they're still using WebObjects.


Look at the DOM in the browser inspector; it's definitely Angular. (And angular.version reports 1.2 -- they need to upgrade.) The WebObjects path is probably for backward compatibility.


It's about the backend, not the UI/frontend.


Do you think they're using foundationdb?


Why would anyone think that? Do you have a reason to think so?



Pretty sure developer.apple.com has been Rails for a while. Maybe 4-5 years.


I've suspected for years that if iTunes Store/App Store were still on WO at all, it was a forked in-house customized WO.


No, it wasn't. The Apple Online Store was vanilla. But the iTunes Store at least used the Project Wonder libraries, which are almost a fork in themselves.

https://github.com/wocommunity/wonder


Ha! That's kind of hilarious that ITMS used Project Wonder. Thanks for the info, assuming you had some way to know. :)


Something tells me the iTunes and App Store backends are going to be migrated to Swift, most likely using Kitura [1] given their recent close relationship with IBM, and IBM cranking out project after project to make Swift go beyond iOS and OSX apps.

[1] https://github.com/IBM-Swift/Kitura


I highly doubt it.

The Apple Online Store at least was exceptionally well architected and I would be surprised if that codebase didn't still look good decades from now.

Plus nothing competes with the JVM's library base, which is often critical when you have to integrate with other legacy systems.


I seriously missed EOF from the NeXTSTEP days.

WebObjects wasn't the same after the Java "upgrade". I suppose a modern version in Swift would be interesting, but the web has changed in the intervening years.


I still liked the Java EOF, but there does remain a spiritual successor:

http://cayenne.apache.org


Rails ActiveRecord has _most_ of what EOF had. It comes close, a couple decades later.


I'm pretty shocked CNN would write an article about WebObjects. The news itself is barely news at all though.


I was thinking the same thing. WebObjects has been on life support for many, many years now. What's shocking is that it had any level of external support 10 years ago, not that that support is ending now. It reads like CNN was framing it as another Apple doom-and-gloom, whatever-will-they-do-without-Steve story, especially with the claim that it was one of his "favorite projects" [?], and OMG now they're cancelling it, this is proof that it's not the same company anymore, etc.

Its importance to NeXT in the '90s was more as a life raft than as something they thought would sweep the tech world and make billions of dollars for them. It was something Steve was hoping NeXT could pivot to instead of going out of business (NeXT was circling the drain in the mid-'90s, just not as dramatically as Apple was). It came out after NeXT had left the hardware business, after it became clear that going software-only wasn't going to save them, and in the early days of the dotcom bubble, so it would've made a lot of sense to try to hitch their wagon to up-and-coming web technologies.

This was before Apple paid a lot of money to get taken over by NeXT, and the rest is history! The life raft never caught on in a huge way (although it obviously found some valuable niches to fill), and after a couple more years it became clear that a life raft was no longer needed, and may have even become a distraction.


It's true. Knowing CNN, I'm surprised they didn't make the headline:

"New technologies Trump Apple"


For me personally, that's sad and good news at the same time. After school, I did a 3-year training at a company and basically learned all my programming skills from using WebObjects in enterprise applications. Somehow it felt ancient already back then (around 2007). But I learned to like it a lot. Nowadays there are much more modern and mostly streamlined web frameworks... in contrast to the huge bulk of components WebObjects brought with it. I wonder what my old company will do now. As far as I know, they're still using it.


On the heels of the "Rails is old and gray" discussions that have popped up here recently, I have to wonder: at what point is it necessary to accept that a given framework is past its prime, even if it is still being used? If the abandoned project is open-sourced and the community can take care of it from there, great, but what about developers who can't spare extra labor to fix bugs, vulnerabilities, etc.?

I'm just a little concerned about how rapidly things are changing and being forgotten in the process. You can't blame a for-profit company for abandoning a project that drags on growth, but how will future historians piece together the lives in our time when operating systems and web app frameworks are abandoned and forgotten?

Edit: I am reminded of a statement from an Economist article on this issue: http://www.economist.com/node/21553445


>I'm just a little concerned about how rapidly things are changing and being forgotten in the process. You can't blame a for-profit company for abandoning a project that drags on growth, but how will future historians piece together the lives in our time when operating systems and web app frameworks are abandoned and forgotten?

From what I've seen, technologies very rarely ever actually die. Sure, the trend-of-the-day changes, and this industry, which is WAY too fashion-driven, starts to assume that the "old stuff" is dead, but it never is. Or almost never. Look around a bit, and you'll find plenty of people still building apps in FORTRAN, COBOL, Ada, RPG, Pascal, Delphi, and all sorts of other languages that most people think of as relics. Granted, hardware is a little bit of a different story - for example, you'll probably have some difficulty finding Token Ring or ARCnet networking kit these days. But even there, I'm sure you can find a few crusty S/38 boxes or VAXen or something, out there plugging away (probably connected via Token Ring or ARCnet too!)

I won't dispute that there's a point where something new comes along that offers legitimate advantages, and that there's a time to start moving away from certain technologies for one reason or another. But I'll also argue that a lot of us are way too quick to write off "old" technologies, and are prone to re-inventing the wheel and/or failing to learn from the lessons of history.


Weird how the article opens to a huge video interviewing a guy who did acid with Jobs. Kind of lame, really.


I was a WebObjects developer back in the early 2000s. It was great for its time. Its architecture is really quite similar to Rails; I've often wondered whether DHH was exposed to it or came up with Rails' architecture separately, from shared influences.



Is there any modern web framework for Objective C? Ideally integrated with other Apple technologies like CoreData.


And on the 6th day, Apple created the Internet in their own image and likeness.




