> ...you can't run Java code in a browser without a plugin...
> ...WASM, being memory safe and tuned for validation, also has security advantages over Java applets...
> ...explained the difference between WebAssembly and Java thus: "WebAssembly has been designed to scale from tiny devices to large server farms or CDNs...
> ..."That's how important it is. WebAssembly on the server is the future of computing. A standardized system interface was the missing link...
> ...But a write-once, run anywhere binary represents a worthwhile effort...
What's ironic is that the "tiny devices" and even "high end professional desktop workstation and server devices" that Java was originally designed to run on when it was started in 1990 were MUCH tinier than the devices considered "tiny" today.
How many times faster is a typical smartphone today (Raspberry Pi 3: 2,451 MIPS; ARM Cortex-A73: 71,120 MIPS) than a 1990 SPARCstation 2 pizzabox (28.5 MIPS, $15,000-$27,000)?
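For scale, the ratios implied by those MIPS figures can be computed directly (a back-of-the-envelope sketch; the class name `MipsRatios` is just illustrative, and MIPS is a crude, workload-dependent metric to begin with):

```java
// Rough speed ratios from the MIPS figures quoted above.
// Treat these as order-of-magnitude comparisons only.
class MipsRatios {
    public static void main(String[] args) {
        double sparcStation2 = 28.5;   // 1990 SPARCstation 2 pizzabox
        double raspberryPi3 = 2451.0;  // Raspberry Pi 3
        double cortexA73 = 71120.0;    // ARM Cortex-A73 smartphone core

        System.out.printf("Pi 3 vs SPARCstation 2: ~%.0fx%n", raspberryPi3 / sparcStation2);
        System.out.printf("A73 vs SPARCstation 2:  ~%.0fx%n", cortexA73 / sparcStation2);
    }
}
```

That works out to roughly 86x for the Raspberry Pi 3 and roughly 2,500x for a Cortex-A73 phone, relative to the SPARCstation 2.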
> Many Java language features are not supported by Java Card (in particular types char, double, float and long; the transient qualifier; enums; arrays of more than one dimension; finalization; object cloning; threads). Further, some common features of Java are not provided at runtime by many actual smart cards (in particular type int, which is the default type of a Java expression; and garbage collection of objects).
I really don't get why the community is hell bent on reinventing the wheel, especially when the main defense of npm is that you don't have to reinvent the wheel.
It's really disheartening to see that the best minds of the generation are busy spending time to reinvent the same old things instead of trying to improve upon the existing systems.
I think it might have to do with people considering 40 as very old in Silicon Valley and those in their late 20s and early 30s have barely learned this lesson.
I feel like other fields of engineering don't have such a dismissive approach to their own pasts. Show an electrical engineering class an old analog instrument and there's generally wonder and curiosity. Show a computer science class a slide comparing a 1 MB hard drive to a 1 TB SD card and there's generally ridicule and laughter. "Look how stupid they've been", not "we've learned many valuable lessons since".
Anecdotally speaking, the average web developer does not have any formal computer science education. Most are self taught or attend a bootcamp or two at most. They have excellent vocational skills, but little knowledge of anything in computing outside of their narrow path of learning.
"Progress, far from consisting in change, depends on retentiveness. When change is absolute there remains no being to improve and no direction is set for possible improvement: and when experience is not retained, infancy is perpetual. Those who cannot remember the past are condemned to repeat it." - George Santayana (1863 - 1952) [The Life of Reason (1905-1906) Vol. I, Reason in Common Sense]
This neatly dovetails with the thoughts that were floating in my head. This part, I believe, explains the current state of programming: "when experience is not retained, infancy is perpetual."
This has wide-reaching effects, one of the more obvious being that 20-year-old knowledge seems ancient to most.
Sadly, in several countries this is a thing, which also contributes to low expectations regarding the quality of delivered work.
My Informatics Engineering degree certainly did include historical perspective of previous languages, operating systems and hardware architectures.
As far as performance goes, I agree it won't matter in most trivial interactions, but large datasets with computations already see an impact (for example, we sometimes have to show million-row data tables).
It's worth pointing out that .NET Core has an assembly/executable generator that can package only the .NET libraries that are actually used.
I think the biggest issue is sending every interaction over the wire but there are ideas to separate the event handling or even split which components run server vs client-side. That would allow highly responsive inputs with no lag while the parent component can run all its logic on the server.
Asp:UpdatePanel was great for its time, and no different from doing an XHR request now and replacing the contents of a div with the response. You can still get basically the same effect with libraries like Turbolinks, which is what GitHub does.
I hear that often in regard to programming languages. But I see a lot of value in remixing a lot of existing ideas in a new package. A wheel can only be so round and its interaction with the road is quite simple, but tech is different. Systems have widely different requirements. Old implementations have shortcomings and a hard time fixing those, especially due to backwards compatibility of complex systems. I cannot imagine how hard it is to remove null from an existing programming language, as C# is currently trying. At some point it's just easier to start from a clean slate.
Is all the effort worth it? I don't know. But I wouldn't want to imply they are doing it just because they're out of touch with reality.
Reinventing the wheel takes a lot of effort to be worth doing, because improving foundational technology is hard, but it is sometimes worthwhile if you put an appropriate level of resources into it.
Null removal is entirely a compile-time feature anyway, so while it complicates the compiler, it's relatively simple to introduce as a language upgrade at a certain version, and in this case it's also opt-in.
2. Yeah, I read that and was super confused. That was literally one of the main points of the Java language itself, enforced by the JVM.
3. I think this is more a matter of expected level of abstraction, but I agree this is fairly weak.
4. This just seems like standard "the new thing is vastly superior to the old thing", with a valid touch of "the JVM is too heavily abstracted from how computers work"
5. Yeah, this was also weird.
Java was the solution that would provide a robust, symmetric (server+client), secure, highly capable, and portable platform for complex web applications.
Early Java folk weep because Java failed so badly on the client, and something else is stepping in to do what Java could not.
(I first saw Java when it was called Oak, and Sun had great people doing major things, Java only one of them. When Java applets first hit conventional Web browsers, most people thought they were for replacing animated GIFs, thanks to a demo program. I probably wrote some of the first Java desktop application code outside of Sun, partly to demonstrate that Java was a real applications development language. Well, the language was there, and in many ways a huge improvement over the C++ that most shrinkwrap and technical desktop application developers were moving to, though the library support took a while to catch up, and performance took longer.)
Modern web is a joke regarding resource usage and complexity etc., but java was a shitshow in practice, except on beefy servers.
Reminds me of client-side JS frameworks. Or Electron apps.
> Not to mention how ugly awt and swing looked even back then.
Web UI is still as ugly as ever, of course. Peak UX usability was native UIs in the 1990s and early 2000s, after that it was nothing but steady decline.
Even big-framework web apps on a Core 2 Duo running old Firefox are nowhere near as bad as early Java trying to run something as a web applet.
I remember staring at that crazy Java applet load spinner and hating Java, and this was already in the early 2000s (on the first white iBook running Mac OS 9). It would have been much worse 7 years earlier.
It was really just the JVM’s startup time and the clunky AWT-based UI that made Java lose on web clients. Flash easily dethroned Java there because it started almost instantaneously and could do fancy animations (recall dial-up speeds meant video wasn’t an option).
>Peak UX usability was native UIs in the 1990s and early 2000s
So much this. Give me Win2K and Office2K style applications - multiple tiled or overlapping windows, modeless views, regular menus and toolbars, context sensitive right click menus, etc. over this single pane with a hamburger button and search box crap any day of the week.
Also on client devices: it runs on my credit card, on factory management client screens, and on a couple of car infotainment systems.
And on client devices across many corporations still safe from Electron madness.
(I think it's maybe Java Card that's responsible for CC handling in all the magic CC features in phones?)
And as a whole (except for running in the browser without a plugin, which was never promised, AFAIR), they were all true of Java, relatively speaking, compared to what was available before.
Are you saying that we could stick a JVM in the browser and ship Java applets around to be progressively/streaming loaded and executed in browser?
I don't know much about the JVM but it seems the goal of WASM and the JVM were quite different, no? And yes, WASI has overlap with JVM, but if WASM catches on and people like it, how does WASI make any less sense?
Are you advocating we have WASM and JVM but no WASI? Is that better? Not sure why anyone is crying here..
More or less. I mean, that used to be a thing you know.
> the goal of WASM and the JVM were quite different
WASM: A universal write-once, run anywhere bytecode for heterogeneous networks of systems
JVM: A universal write-once, run anywhere bytecode for heterogeneous networks of systems
I did not know Java applets could be stream-loaded (i.e., optimized for the web).
> WASM: A universal write-once, run anywhere bytecode for heterogeneous networks of systems
Not at all.. at least, not in my view. WASM is optimized for platform issues specific to the web. That is to say, how it behaves on load is the first priority.
I never got that impression from the JVM. If you say it is, then it seems to be a failure in marketing of the JVM or something.
Since you seem to feel WASM is a waste, what are your thoughts on the failure of the JVM? Ie, why don't I have Go and Rust compile targets for the JVM, with JVM browser code running my JVM targets and etc?
If JVM truly does have all of these WASM-web oriented features then it is an impressive failure on JVMs part. Quite curious
The JVM is not a true VM. It's heavily coupled to the Java object model and the Java GC.
I'm honestly puzzled. Am I the one missing something? Why is the JVM crowd seemingly upset here?
Nobody is upset, really. Java web client lost a very long time ago to Flash.
Java certainly has a well-established place in software, no doubt about that. But its original promises had it (combined with AWT/Swing, applets, and XML, ... umm yeah) becoming the ultimate "write once run everywhere" platform that could scale up to massive servers and down to tiny embedded chips, and every client platform in between. In retrospect, C# and .NET would have done well not to attempt to emulate Java's initial scope.
And I have no particular JVM love. I use it exceedingly rarely.
It's just that you're being kind of a frustrated, pedantic tool right now.
That is more penetration than Java could ever get on Desktop and Mobile.
Although Java is actually doing ok in embedded space, I am not entirely sure how WASM would work for that segment.
Java is the most used programming language in the world (https://www.tiobe.com/tiobe-index/)
It not only powers all kinds of desktop things, it is foundational to Android, which currently powers ~85% of all mobile devices.
How can you get more penetration than "most used programming language" and "foundational to the largest mobile OS"? I mean there isn't even a category that fully encompasses the degree of dominance of Java. The next step, I assume, would be to rename "Programming" (all of it) to "Java".
>> Mozilla this week announced a project called WASI (WebAssembly System Interface) to standardize how WebAssembly code interacts with operating systems.
This is just another implementation of holes that allow web sites access to the rest of your system. It really is supposed to be the operating system's job to manage resources and what can be accessed. The problem is this functionality keeps getting re-implemented by others with different agendas.
Well, unlike systems designed in the 90s, it's designed for the modern "everything-is-a-threat" mindset rather than the optimism of the 90s that everything would be safe.
People have been experimenting with "WASM outside the browser" for a while now, and WASI is just an API for making OS calls that a runtime can implement.
Has it, though? What we have now is an MVP, and easily 90% of the features aren’t there yet.
When WASM gets garbage collection, will it still be able to run on tiny devices? We don’t know yet.
But yeah, even the specification itself is a WIP.
This provides substantial differentiation from .Net/Java/Flash/etc.
That said, this advantage risks diminishing if Google's influence continues to grow.
It’s disproportionally affected by Google.
Still doesn't compare to the Sun God and his silicon fist.
(With apologies to Steve Yegge.)
I didn’t say iron fist, I said disproportionately affected.
A few people will care if Safari does.
Edge is out of the picture and is being replaced by Chromium (aka Chrome).
On WebAssembly specifically, here's the WebAssembly working group: https://www.w3.org/2000/09/dbwg/details?group=101196&order=o...
I count 18 people from Google, and only one from Mozilla. The second largest representation is from Microsoft (8 people), but their input is now Chrome/Chromium input.
The biggest drawback in WASM to me has always been the memory model. While it is memory safe, it's currently opaque to the host environment, which means that the general implementation is to over-allocate address space and essentially implement pointers as integers. That results in pointers that can't be trivially shared between environments.
That said, I think that if I /were/ to be making an app that had plugins, or an OS from scratch, etc., I would define an interface via WASM. Screw loading unbound untrusted code into my process. My code is 100% bug free; I don't want other people to break it.
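The opaque linear-memory model described above can be sketched in plain Java (a toy model only, not any real WASM runtime API; the `LinearMemory` class and its method names are invented for illustration): the module's entire address space is one over-allocated byte array, and a "pointer" is just an integer offset into it, meaningless to anyone outside the instance.

```java
import java.nio.charset.StandardCharsets;

// Toy model of WASM linear memory: one over-allocated byte array,
// with "pointers" represented as plain int offsets into it.
class LinearMemory {
    private final byte[] memory;

    LinearMemory(int pages) {
        this.memory = new byte[pages * 65_536]; // WASM pages are 64 KiB
    }

    // Writes a string at the given offset ("pointer"); returns bytes written.
    int storeString(int ptr, String s) {
        byte[] bytes = s.getBytes(StandardCharsets.UTF_8);
        System.arraycopy(bytes, 0, memory, ptr, bytes.length);
        return bytes.length;
    }

    String loadString(int ptr, int len) {
        return new String(memory, ptr, len, StandardCharsets.UTF_8);
    }
}
```

The array bounds check keeps the host safe, but nothing stops module code from dereferencing a stale or miscomputed offset that is still inside the array, which is the use-after-free / out-of-bounds class of bug mentioned further down the thread.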
This is not remotely true :D
* I am biased here.
Haha, I'll defer to you then. But I'm still curious about this:
> you have to support arbitrary compiled code, with all the intentional or unintentional memory safety violations
But there are multiple barriers here, are there not? First, your code needs to be signed (and notarized, at some point), and then at runtime the app sandbox/entitlements/process separation ensures there's not much you can do even if you're running arbitrary code. Sure, the first "barrier" isn't necessarily all that restrictive or thorough, but the other one should ideally stop anything horrible?
There's the "are you malware that's got through whatever rules exist in the target App Store", but there's also the "is the extension code buggy".
In my experience the latter is the real killer - there are all sorts of things an App Store review process can do to prevent malware, but they're all dependent on the code doing what static analysis claims it's doing - they can't protect against terrible code -- my experience writing software has taught me that no one writes 100% correct code, 100% of the time.
Historically GPU drivers have been a terrible source of bugs (it's why WebGL has such tight restrictions on features and shader syntax), not because of intentional malfeasance, but because you're processing arbitrary untrusted content in a trusted environment (my understanding/belief is that the Windows driver model has pushed this out of the kernel? please confirm/deny), so functionally forcing that into a restricted execution environment is a win.
Obviously the memory model of WASM still allows for things like use-after-free, out-of-bounds access, etc.; but at least it's not unbounded access into the host address space.
Going wrong can mean “exploited by malware” through to “extension code trawls the host process address space to provide ‘features’”
The latter of these two used to happen all the time with “haxies” on OS X.
The run on security benefit of providing a semi-virtualised environment for third party/untrusted code is that if the VM is exploited you are able to fix and ship a fix for the VM. You can’t fix the untrusted code.
IMHO, the success of these products has very little to do with how good they might be technically. It's all about sales and marketing and, eventually, politics. Perhaps that's where the motivation to re-write recent history is coming from.
Not in this case. Java failed because it sucked from a technical point of view. It traded better 'OO purity' in exchange for worse security, portability and performance.
The end result is that users got something that was slower and buggier for no gain.
You know, the use-case it was originally designed for. (I read the hype for Java 1.0, I was there.)
Nobody at the time could imagine that Java would eventually become the enterprise COBOL replacement.
It's argued that only IDEs use Java, but Java + Swing strikes me as the most popular cross-platform language and toolkit currently in use.
Looks pretty much living the dream of portable language-agnostic VM for user-facing apps to me.
WebAssembly has its work cut out for it trying to succeed in both of these spaces. I think the project might enhance its chances of success by remembering Java and its browser plugin rather than pretending it is the first.
Do people think WASM is not better than the JVM for streaming browser based usage? Did it learn nothing from the JVM? The sarcasm in these threads always confuses me.
I'm not going to debate any points against the JVM, but all I know is that I can compile languages I want to all browsers today. Why can't I do that for the JVM?
Something must be a massive, gory failure of the JVM and/or the companies involved with the JVM for it to be an equal to that of WASM, and yet have zero traction for modern languages and browser targets.
Also 86% of the mobile OS market runs on Java.
Additionally smartcards, a big portion of electricity meters, factory automation, smart copiers, M2M gateways, Blu-ray players, ... run on Java.
Hardly a failure, only for naysayers.
The worst thing that can happen is that in 10-15 years from now nobody remembers the WASM fling, and all these man hours go to waste.
Be _the_ language (Java)
Be _the_ platform (JVM)
Wasm is a compilation target, it's better to compare it to the JVM than Java. The thing that has me excited is that _every_ language will be write once run everywhere not just some new and unproven language (like Java was at the time).
All the tools were there.
WebAssembly is still not safe from internal memory corruption, due to lack of memory tagging and bounds checking.
Unsafe applications won't ever be magically safe when compiling them to WebAssembly. Neither would this be the case when compiling to Java bytecode - if this was possible at all.
IMHO WebAssembly is a compilation target and therefore not the right layer to solve this. This is the responsibility of the language or the specific application. If you want to solve this in WASM, I predict you couldn't just compile all different languages to WASM anymore without significant changes to the codebases. If this would be feasible at all..
Rewriting those huge C/C++ codebases is simply not an option, new applications can be written in safe languages and then compiled to WASM.
CLR proves the contrary, by having C++ support, with the difference between safe Assemblies (where typical memory corruption opcodes are not allowed, compilation via /CLR) and unsafe Assemblies, where WASM like opcodes are allowed.
To load an unsafe Assembly, the host has to explicitly allow it.
Similar examples on IBM and Unisys language environments, e.g. on ClearPath, the admin must allow the execution of binaries tainted with unsafe code.
PHP is memory safe, and yet is a larger source of data breaches and security bugs than C by (rough guess) an order of magnitude.
C is not the bogeyman you're looking for.
Garbage Collected implementations of C and C++ do exist, and only thing that their GC fixes is use after free.
At a very basic level it can be summed up with: the JVM approach to GLSL was to just throw it at the GPU, whereas the browsers worked on restricting WebGL to a super constrained subset (this was my fault, but is the correct thing to do :D )
The applets also made it hard to do a bunch of the simple games that were super popular - Flash included a large amount of multimedia functions built in, and an editing environment that was geared towards interactive design.
The reason java took so long to move out of process was because the java<->browser bindings were unique to java, it did not use any of the normal plugin apis, and expected a lot of direct linkage to the host system.
- JVM doesn't do enough to separate computation and I/O
- JVM doesn't run C code very well, or it requires research-level technology to do so (Graal). This applies to both computation and I/O -- it has a completely different I/O interface than C programs rely on.
- Photoshop / Word / Excel / etc. were never ported to the JVM. The browser actually has better equivalents of them.
The reason they weren't ported is because they were legacy codebases with some code going back to the original versions including lots of assembler. It was always going to be a monumental task to rebuild them from scratch.
You can compile legacy codebases in C and C++ to WASM. It was designed for that. Doing that on the JVM requires "research-level" techniques because the bytecode has a completely different design.
JVM separation between computation and IO can be managed via classloaders and JAAS.
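One classloader trick along those lines can be sketched under assumptions (`SandboxLoader` is a hypothetical name, and this is nowhere near a full JAAS setup): a filtering ClassLoader that refuses to resolve I/O classes for code loaded through it.

```java
// Hedged sketch: a filtering ClassLoader that denies java.io/java.nio to
// classes loaded through it. Real isolation needs much more (JAAS policies,
// reflection control, etc.); this only illustrates the separation idea.
class SandboxLoader extends ClassLoader {
    SandboxLoader(ClassLoader parent) {
        super(parent);
    }

    @Override
    protected Class<?> loadClass(String name, boolean resolve) throws ClassNotFoundException {
        if (name.startsWith("java.io.") || name.startsWith("java.nio.")) {
            throw new ClassNotFoundException("I/O access denied in sandbox: " + name);
        }
        return super.loadClass(name, resolve); // normal parent-first delegation otherwise
    }
}
```

Loading `java.io.File` through such a loader fails, while non-I/O classes like `java.lang.String` resolve normally via the parent.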
Kotlin is a beautiful language and hopefully WebAssembly will help it to gain traction outside the Android world.
The article you've linked to is much better than this and deserves a better representation than that.
It was poorly suited for the browser from the earliest days, and even now the language and runtime have evolved to make it a round peg for that square hole no matter how WASM develops.
Yet because students and younger webdevs see it through the prism of trying to be a browser-side solution, it taints the brand for people who lack business application experience. Java is thought of as "insecure", even though the OVERWHELMING majority of security patches are for the browser applet plugin, which hasn't been widely used in twenty years and is now being retired. Instead of being "a thing that makes large-scale business server applications more performant, and more tenable for large teams to work on", it is often seen by newbies as "a failed React.js alternative" where it comes up short.
Actual Java developers stopped thinking of it that way decades ago, but the history unfortunately persists.