You'd get a DSL with an incredibly powerful IDE and interpreter for development basically for free.
But I like the idea of converting Excel logic to a DLL automatically. It's non-trivial, as Excel allows you to do some pretty messy things. As soon as someone uses OFFSET or INDIRECT, all bets are off in terms of what the spreadsheet may do.
I think this could be quite useful, not so much when you're starting something from scratch, but when you already have spreadsheets that implement some algorithm. In those cases, having a compiler translate the spreadsheet would probably make more sense than having a human do it.
Chris Granger: https://twitter.com/ibdknox
The thing is, most business users are already familiar with pivot tables in Excel, as clunky as they are. But Excel is not only used for data; it is also widely used for complex business logic. And that's where you have logic that IT will struggle to understand and therefore shouldn't own, but that the business needs to set up and maintain. Think complex tax calculations, or a business plan, or the logic for a quote on a big project, etc. These will be highly non-trivial in terms of the logic the spreadsheet follows, and that's where a library that could just reliably swallow a spreadsheet and turn it into a function IT can use but the business can maintain has a lot of value.
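For what it's worth, a rough sketch of the kind of function such a translator might spit out, assuming a made-up tax rule and made-up names (this isn't taken from any actual tool):

    // Hypothetical output of a spreadsheet-to-code translator for a single cell,
    // e.g.  C2 = IF(B2 > 100000, B2 * 0.4, B2 * 0.25)
    // The cell becomes a pure function IT can call from anywhere, while the
    // business keeps maintaining the original spreadsheet.
    public final class QuoteSheet {

        // B2: taxable amount supplied by the caller; C2: the derived tax.
        public static double taxOwed(double taxableAmount) {
            return taxableAmount > 100_000 ? taxableAmount * 0.4
                                           : taxableAmount * 0.25;
        }

        public static void main(String[] args) {
            System.out.println(taxOwed(250_000)); // prints 100000.0
        }
    }

As long as every formula references fixed cells, this kind of translation is mechanical; it's exactly the OFFSET/INDIRECT cases mentioned above, where the referenced cell is computed at runtime, that make it hard.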
Disclosure: I worked on both of them
Its major drawback, in the early days at least, was that it was designed as an Objective-C framework. By the time I started using it (around 2000), Java was supported, but it was clearly an afterthought and you dealt with a buggy bridge to ObjC components and Objective-C-isms all over.
And not only on OS X -- generally the rise and fall of Java. In the late nineties, "Java Applets" (plus the web plugin) were supposed to go big. Never happened. Then there was always the promise of Java on the consumer desktop. Never happened either. Java on the enterprise desktop also didn't fare that well. As for startups and smaller companies, they mostly go with Ruby/RoR, Python, Node, Go, PHP, etc.
Nowadays it's mainly just Java on the enterprise server.
Yeah, I know you can name 5 somewhat successful desktop Java apps. I doubt you can name 50 though, and that's the whole point.
They could have gotten Java much more widespread if they hadn't been religiously opposed to AOT compilation to native code, leaving that to third-party JDKs and thus forcing the whole distribution headache.
Then the APIs for media, graphics, 3D, and desktop integration were left abandoned.
Swing is powerful, but by default it looks awful; one needs to know which APIs to make use of. The Filthy Rich Clients blog was quite helpful in this regard.
The sad thing is that it helped to displace much nicer native languages like Delphi. Borland's mismanagement also contributed a lot, though.
Here, Java is a peer of Carbon and Cocoa.
Also, IIRC, at the original OS X reveal presentation, Jobs flagged Java as deeply, natively integrated in a way that no other OS had done before (surely this is available on YouTube, but I don't have a link handy).
It would almost seem like they were betting half the farm on Java. I find it fascinating how fast Java was alienated from the Mac given its initial deep embedding. Even today, Oracle's JVM for OS X installs into the dedicated /Library/JavaVirtualMachines folder, something I've never seen anywhere else.
I also remember Apple salespeople coming to CERN to do a presentation, trying to sell us the idea that since Mac OS X was built on NeXT and BSD, it was also a UNIX that could be used by the researchers.
In the early days of Mac OS X, Apple wasn't sure that the Mac OS developer community would enjoy using Objective-C, so they bet the farm on both languages.
As it became clear that the majority of developers were quite happy using Objective-C, they slowly killed the Objective-C/Java bridge, and then eventually gave the JVM code to Oracle.
This also helped kill Java's role on the desktop and, together with Microsoft, started the industry cycle of moving back to AOT-compiled languages.
I wonder how it would have turned out if they had killed Objective-C instead.
On the other hand, I also sometimes wonder how a Mac OS X based on BeOS (C++) would have turned out.
Well, it's not like Java ever did much to win the desktop. Swing was a horrible, over-engineered mess, performance was crap until GC improved and CPUs caught up in the mid-to-late 00s, and the desktop saw mostly neglect from Sun.
>I wonder how it would have turned out if they had killed Objective-C instead
And have all OS X userland apps written in Java? Apple would probably have been dead.
Besides, it's not like they killed Java amid protests from developers -- the majority of developers and users were quite happy using Objective-C, as you said.
The big problem was that it required an effort to learn how to do it, an effort the majority of developers weren't willing to make.
I still don't understand why one needs to explicitly enable the native L&F. It should have been enabled by default.
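For reference, this is the opt-in it takes -- a minimal sketch, with the window contents being just filler:

    import javax.swing.JFrame;
    import javax.swing.JLabel;
    import javax.swing.SwingUtilities;
    import javax.swing.UIManager;

    // Opting in to the platform's native look and feel. Swing never does this by
    // default; it starts with the cross-platform "Metal" L&F instead.
    public class NativeLookAndFeelDemo {
        public static void main(String[] args) {
            try {
                // One line, but you have to know it exists and call it before building any UI.
                UIManager.setLookAndFeel(UIManager.getSystemLookAndFeelClassName());
            } catch (Exception e) {
                e.printStackTrace(); // falls back to the default Metal L&F
            }

            SwingUtilities.invokeLater(() -> {
                JFrame frame = new JFrame("Native L&F");
                frame.add(new JLabel("Hello from the system look and feel"));
                frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                frame.pack();
                frame.setVisible(true);
            });
        }
    }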
In those days (2000-2001) Java was the most hyped language on the planet, thanks to Sun's marketing efforts (and the huge cash influx from the dot-com era and its need for big servers).
That's what motivated Apple.
But the reality was that few developers ever cared to produce Mac apps in Java, and fewer users liked them when they did appear. Despite the deep integration (which was mostly some hooks into native APIs), it was always an alien-looking stepchild on the platform.
Looks like Cyberduck is still written in Java using the Cocoa bridge:
Microsoft, for example, is a winner in the desktop space, but they don't get most of the server market, nor most of mobile. And even their desktop profits are not that great.
And Android is a winner on the marketshare front, but its profits (from all makers, Google, Samsung, etc) combined are at best equal to Apple's which has 1/3 to 1/10 the market share (depending on the country).
As a technology, it was fairly clear even in the late nineties that Java was a bad fit for the web.
Some are moving away from them, but installing spyware instead.
How successful exactly? Crappy banking sites aside, you could, even in 2000 or 2005, surf the whole web with Java disabled and not have any problem.
That was less true if you disabled Flash.
Sometimes the Java stuff was hidden: Before XMLHttpRequest I personally used Java in several web apps as a way to do messaging with the server, because Java permitted you to do arbitrary socket connections. You inserted a 0x0 applet into the page and then used the JS bridge that both IE and Netscape provided. Aside from the JVM startup time (which could be several seconds) it was pretty great, actually. I wrote a commercial audio conferencing system that showed the talk status of each participant, for example.
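From memory, the trick looked roughly like this (the class name, port, and line-based protocol here are all made up); the page embedded it as a 0x0 applet and JavaScript called its public methods through the applet element:

    import java.applet.Applet;
    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.io.PrintWriter;
    import java.net.Socket;

    // Invisible 0x0 applet used purely as a socket bridge for the page's JavaScript.
    // An unsigned applet may only connect back to the host it was served from.
    public class MessagingApplet extends Applet {
        private Socket socket;
        private PrintWriter out;
        private BufferedReader in;

        @Override
        public void start() {
            try {
                socket = new Socket(getCodeBase().getHost(), 9000); // port is an assumption
                out = new PrintWriter(socket.getOutputStream(), true);
                in = new BufferedReader(new InputStreamReader(socket.getInputStream()));
            } catch (Exception e) {
                e.printStackTrace();
            }
        }

        // Called from JavaScript, e.g. document.applets['bridge'].send('hello')
        public void send(String message) {
            if (out != null) {
                out.println(message);
            }
        }

        // Polled from JavaScript; returns null when nothing is waiting.
        public String poll() {
            try {
                if (in != null && in.ready()) {
                    return in.readLine();
                }
            } catch (Exception e) {
                e.printStackTrace();
            }
            return null;
        }

        @Override
        public void stop() {
            try {
                if (socket != null) socket.close();
            } catch (Exception ignored) {
            }
        }
    }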
Basically, Java applets were (were, mind you) highly successful before the advent of Flash and AJAX.
I don't remember ever having used a Java applet on a single banking site.
So it's not banking per se, but it's finance-related.
Android uses its own runtime and its own platform APIs; they just borrowed the syntax and basic libraries and added their own (non-standard) implementation below, plus platform APIs unique to Android.
It's nothing like SWT/Swing, J2EE, or the "write once, run anywhere" Java of yore.
It's a pity it's been neglected so long; there were some nice touches in there (bindings, EOF, some of the tooling when it wasn't crashing) that I'd love to see modern versions of.
The iTunes Store was a more traditional monolithic WebObjects app using Project Wonder. The Apple Online Store was a microservices-based architecture that really just used the HTTP routing part of WebObjects.
...it looks like they're still using WebObjects.
The Apple Online Store at least was exceptionally well architected and I would be surprised if that codebase didn't still look good decades from now.
Plus, nothing competes with the JVM's library base, which is often critical when you have to integrate with other legacy systems.
WebObjects wasn't the same after the Java "upgrade". I suppose a modern version in Swift would be interesting, but the web has changed in the intervening years.
Its importance to NeXT in the '90s was more as a life raft than as something they thought would sweep the tech world and make billions of dollars for them. It was something Steve was hoping NeXT could pivot to instead of going out of business (NeXT was circling the drain in the mid-'90s, just not as dramatically as Apple was). It came out after NeXT had left the hardware business, after it became clear that going software-only wasn't going to save them, and in the early days of the dotcom bubble, so it would've made a lot of sense to try to hitch their wagon to up-and-coming web technologies.
This was before Apple paid a lot of money to get taken over by NeXT, and the rest is history! The life raft never caught on in a huge way (although it obviously found some valuable niches to fill), and after a couple more years it became clear that a life raft was no longer needed, and may have even become a distraction.
"New technologies Trump Apple"
I'm just a little concerned about how rapidly things are changing and being forgotten in the process. You can't blame a for-profit company for abandoning a project that drags on growth, but how will future historians piece together the lives of our time when operating systems and web app frameworks are abandoned and forgotten?
Edit: I am reminded of a statement from an Economist article on this issue: http://www.economist.com/node/21553445
From what I've seen, technologies very rarely ever actually die. Sure, the trend-of-the-day changes, and this industry, which is WAY too fashion-driven, starts to assume that the "old stuff" is dead, but it never is. Or almost never. Look around a bit, and you'll find plenty of people still building apps in FORTRAN, COBOL, Ada, RPG, Pascal, Delphi, and all sorts of other languages that most people think of as relics. Granted, hardware is a little bit of a different story - for example, you'll probably have some difficulty finding Token Ring or ARCnet networking kit these days. But even there, I'm sure you can find a few crusty S/38 boxes or VAXen or something, out there plugging away (probably connected via Token Ring or ARCnet too!)
I won't dispute that there's a point where something new comes along that offers legitimate advantages, and that there's a time to start moving away from certain technologies for one reason or another. But I'll also argue that a lot of us are way too quick to write off "old" technologies, and are prone to re-inventing the wheel and/or failing to learn from the lessons of history.