"We have a strategy to promise anything and everything, provided that there is no firm specification or explanation and provided that we give no date for implementation.
"This gives our customers a sense of security that if anything should happen to become important in the future, they will have purchased from a vendor that has a vision for embracing the technology even if there is no roadmap for actually implementing the technology.
"And although they may not ever end up benefitting from the technology, their careers are safe because they made a decision that, at the time, was prudent, namely to standardize on a technology that was thought to be on its way to becoming ubiquitous from a vendor that articulated a strategy for achieving ubiquity."
Oh? Gosh, I must have misread it. I admit I was in a hurry, and what he said sounded so much like what other proprietary enterprise vendors say when promising vaporware that I got the exact wording confused.
UPDATE: Look, I am NOT making fun of your post in any way. I think it's fair game to poke a little at his comments given that there has been no other mention of this, no roadmap, nothing. But I respect your post and the fact that you pushed a little further and tried to find some corroboration for his statement.
I'd like to think that somewhere in Oracle-land, there's a white paper from the Sun days talking about this idea, and maybe they're even trying to make it fly right now. And good job teasing that out and asking some intelligent questions about what it may mean.
I'm hardly an Oracle (née Sun in this context) booster; as a Clojure partisan, I'm all too aware of the benign neglect that comes with using the JVM but not Java, and of vaporware (JWebPane, anyone?).
All that said, I'm happy enough to be Charlie Brown and take a run at this football. Cynicism would mean resigning myself to using Javascript forever for client-side stuff, something I just can't swallow psychologically.
Javascript is a bad language with poor libraries, and I'm forced to use it. I've said before: if it weren't for its distribution, it simply wouldn't be used. If given other choices (emphasis on the plural there), not one developer would opt for Javascript.
"The good parts" of Javascript get a lot of play, but I don't want to have to cope by using patterns and avoiding the dull, broken bits that go along for the ride...and I'm surprised that many (most?) other competent developers who wouldn't put up with such compromises in their "real" languages aren't as irritated by Javascript as I have been.
"If given other choices (emphasis on the plural there), not one developer would opt for Javascript"
How do you explain the recent widespread enthusiasm for Node.js, including from server-side engineers with little to no previous JavaScript experience?
I don't know but this trend of making Javascript a compilation target is making me want to shoot myself (from a compiler-design perspective). JVM->Javascript seems like they're begging me to head over to Oracle and scream "Guys! You're going the WRONG WAY!"
Javascript's absolutely ubiquitous distribution characteristics have resulted in all sorts of irrational behaviour.
If the W3C or the browser makers collectively had put some thought into making it possible for other languages to be linked into the client in a sane way (i.e. rather than the existing plugin regime), or defined some proper IL that app authors could compile to, Javascript wouldn't be on its way to being the compilation target for the web.
You make it sound like they were incompetent, lazy or evil. Or perhaps I misunderstood you, sorry if so.
But the fact is, they did think a lot about using other languages on the web, and about proper ILs as well. The fact is, there has been no good way to do either, if you want a standards-based platform with multiple compatible implementations.
You can take some existing IL like Java bytecode, but (1) that stuff is patented, and (2) there is no hope of cross-vendor agreement on what that IL should be, and all of them have various issues anyhow (for example, Java and .NET bytecode were designed with static languages in mind).
JavaScript is winning simply because it's there, it works, and everybody supports it. Yes, it has frustrating limitations, but it's hard to see a practical alternative. We will simply need to overcome those limitations - and that is possible.
That was perhaps a little harsh in the phrasing; I wasn't really implying any of those things. I think I am saying that this appears to have been a blind spot on the part of the web standardization crowd and the browser makers. Maybe I'm entirely wrong though, and there were efforts that simply hit hard problems, as you say.
I'd never heard of any discussions around determining a proper IL, though; if you have any pointers, a link to the relevant MLs and such would be most appreciated.
Why do you insist on an IL? The real reason why JavaScript won out over Java and ActiveX is exactly because it's a programming language and not an IL. Inline JavaScript in HTML pages is the killer feature of JavaScript.
From a technical point of view, you also need to think about how scripts interact with other dynamically loaded scripts. That means standardizing on function calling convention, symbol tables, types and object representation. There's no way to solve these things with something like bytecode (which is why there were never any Java applets that load other Java applets), you need a dynamically-typed programming language. So R4RS Scheme may have been a better choice, but that's not what Netscape marketing wanted, so what.
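To make that concrete, here is a minimal sketch (plain Node-runnable JavaScript, with the two "scripts" inlined for illustration; the `analytics` object is hypothetical) of why independently loaded scripts interoperate without any standardized symbol tables, calling conventions, or object layout:

```javascript
// Two independently authored scripts end up in one dynamically typed
// global environment; no shared IL or agreed-on object representation needed.

// --- "script A" (imagine it loaded first via a <script> tag) ---
globalThis.analytics = {
  events: [],
  track(name) { this.events.push(name); }
};

// --- "script B" (loaded later, possibly written by someone else) ---
// It discovers and calls whatever is there at runtime; duck typing
// stands in for a standardized calling convention.
if (typeof globalThis.analytics?.track === "function") {
  globalThis.analytics.track("page-view");
}

console.log(globalThis.analytics.events); // prints [ 'page-view' ]
```

With bytecode modules, by contrast, script B could not call into script A without the two agreeing in advance on symbol resolution and object layout, which is exactly the cross-vendor standardization problem described above.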
Right, I just don't see the point. Are people so averse to learning a new language that they need this? It's logical that higher-level languages are built on top of lower-level ones, but this seems parallel to me. It's not like JavaScript is a low-level runtime; in fact, in some ways it's easier to pick up than Java for the uninitiated.
I assume that Oracle is doing this so that people can write Java in a browser. But I don't get it, Java's efforts at being a UI language have been disastrous, both web UI and desktop.
The same reason Java makes a fairly decent middleware language is the reason it's a poor choice for the UI: it is rigid, trading off flexibility for safety. That's a good thing in the middleware layer, but when it comes to writing a UI, it always seems to have its feet in the wrong place.
Javascript, despite its warts, is a very good UI development language. I was thinking about this the other day while working on a project. I use two toolkits in my day-to-day work, Dojo and jQuery, and it struck me how totally different they are and how well each deals with its problem set.
Dojo allows me to build up reusable, self-encapsulated widgets that can just be dropped on a page, and jQuery allows me to treat the DOM almost like SQL. They are very different in their approach, and yet both are built from the same building block. You just don't see that kind of diversity in Java.
Again, there are reasons for that, and those reasons are good in the middleware and systems-integration layers, but they are cumbersome in the UI. So it makes me wonder why people are so averse to learning a new language that is designed for its problem domain.
If we're talking about a JVM impl (rather than a source-to-source compiler à la GWT), being able to use better languages like Clojure, Python (via Jython), Ruby (via JRuby), et al. would be a huge reason.
Fair enough. Though if you want to write code that looks like Python or Ruby, I can't recommend CoffeeScript (or Coco, if you're feeling frisky) enough.
Hmmm, I'd assumed it was a source-to-source compiler. Are they thinking of writing a JVM in JavaScript? Can you say slow? And debugging seems like a nightmare. And if it's not that, then how else does it get into the browser?
Presumably, that's what Orto did (a post about which I linked to in the original post), apparently with reasonable performance for the examples they provided at the time (circa 2008). Javascript runtimes are becoming absurdly fast these days, so I wouldn't write off an efficient and effective JVM-in-Javascript sight unseen, at least not yet.
> Umm...are there any reasons, other than the JVM, to use Java? I'd much rather write plain old JS (or even better, CoffeeScript) than Java any day.
Existing code. I agree that writing JavaScript is better in almost all cases, but there is a lot of existing code out there and not just in Java but C, C++, and other languages. Being able to run that on the web, anywhere, would be huge.
Yeah that just seems silly. But of course that's the Java way: add layers of abstraction wherever possible. Oh, and a big abstract IDE on top of it all would be really helpful ;-)
I started out writing about how I thought this was insane and by the end of it I had changed my mind and - suspending disbelief and assuming they can pull this off in a reasonable and performant manner - it's actually a pretty clever idea.
Now that Apple, Google, and Microsoft have all put the nail in the coffin of applets by not supporting them on their mobile devices, Java is looking decidedly shaky as a client-side browser technology, especially when you consider that Flash IS shipping to most devices. However, Oracle can actually one-up Flash here if they can promise to run inside the browser on any platform. Not too long ago that would have been impossible, but with things like Canvas, Web Workers, and WebSockets they can probably make a reasonable stab at implementing at least the functions of a standard applet in Javascript.
Java as an applet host has been effectively dead for years. I would be very surprised if Oracle were trying to resurrect that. Seems more likely that they'd be trying to replace the server JRE with a server-side JS interpreter.
"Seems more likely that they'd be trying to replace the server JRE with a server-side JS interpreter"
Unless you're suggesting they are being willfully misleading I'm not sure how you could interpret that from what they said:
...we think it’s something that we need to do for the community so we can make Java available more places from tablet devices like the iPad ...
I think Java applets are a lot less dead than people realize. There are heaps of enterprises that have deployed them for internal apps (essentially a slightly less evil version of ActiveX). I could easily imagine a lot of these places now have executives wandering around with iPads complaining that the payroll app doesn't work on their computer any more.
Creating something like a VM, or a compiler that produces reliable and fast-enough Javascript from bytecode, is hard work, and there is no evidence that it can be done. Microsoft tried something similar with Volta (since discontinued), and when I tried it the results were awful, although it was cool that you could compile CLR bytecode to Javascript: http://en.wikipedia.org/wiki/Microsoft_Live_Labs_Volta
On the other hand, creating something less than Java SE would break compatibility and would not be covered by the patent grant, not to mention the difficulty of calling the thing "Java".
There is a certain amount of cluelessness that, when concentrated inside a given company, risks starting a runaway chain reaction that ends up destroying the company and the ecosystem around it.
GWT achieves reasonable speed by supporting only a subset of Java that is already basically compatible with the JavaScript runtime. So you don't get things like threads, weak references, etc. Supporting such things (if it's even possible) would require writing (at least pieces of) a JVM in JavaScript. Want weak references? You can no longer rely on the JS GC; you have to synthesize Java objects and write your own GC implementation. Good luck getting that to perform.
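To spell out what "write your own GC" would mean, here is a hypothetical toy sketch (not GWT's actual approach, and all names are made up): a JVM-in-JS that manages its own heap as an array, represents references as indices, and runs a mark-and-sweep pass so it can honor weak-reference semantics the host's GC cannot express:

```javascript
// Toy heap: every synthesized "Java object" lives at an index in this array.
const heap = [];
const roots = new Set();  // indices reachable from the running program

function alloc(fields) {
  heap.push({ fields, marked: false });
  return heap.length - 1;  // a "reference" is just an index
}

function mark(ref) {
  const obj = heap[ref];
  if (!obj || obj.marked) return;
  obj.marked = true;
  for (const v of Object.values(obj.fields)) {
    if (typeof v === "number") mark(v);  // toy rule: numeric fields are refs
  }
}

function sweep(weakRefs) {
  heap.forEach((obj, i) => {
    if (obj && !obj.marked) {
      heap[i] = null;  // "collect" the unreachable object
      weakRefs.forEach(w => { if (w.ref === i) w.ref = null; });  // clear weak refs
    } else if (obj) {
      obj.marked = false;  // reset mark bit for the next cycle
    }
  });
}

// Usage: a weakly referenced object survives only while a root points at it.
const a = alloc({});
roots.add(a);
const weak = { ref: a };

roots.forEach(mark); sweep([weak]);
console.log(weak.ref);  // prints 0: still rooted, so still alive

roots.delete(a);
roots.forEach(mark); sweep([weak]);
console.log(weak.ref);  // prints null: collected, weak ref cleared
```

Every object access now goes through this layer instead of native JS objects, which is exactly where the performance objection comes from.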
"This gives our customers a sense of security that if anything should happen to become important in the future, they will have purchased from a vendor that has a vision for embracing the technology even if there is no roadmap for actually implementing the technology.
"And although they may not ever end up benefitting from the technology, their careers are safe because they made a decision that, at the time, was prudent, namely to standardize on a technology that was thought to be on its way to becoming ubiquitous from a vendor that articulated a strategy for achieving ubiquity."
tl;dr: "F.U.D."