
 Why are we limited to JS in browsers - would a bytecode standard help? - AndrewDucker
http://andrewducker.livejournal.com/2233017.html
======
silentbicycle
Before speculating too much about "a bytecode standard", etc., it would
probably be helpful to understand virtual machines and instruction sets.

I know web programmers generally aren't big on assembly or writing virtual
machines, but an instruction set (group of bytecodes) design and
implementation predisposes a processor/VM to certain operations. A VM for an
OO language is going to have bytecodes (and other infrastructure) for doing
fast method lookup, because it will really hurt performance otherwise. A
functional or logic language VM will probably have tail-call optimization,
perhaps opcodes specific to pattern matching / unification, and a different
style of garbage collector.

Compare the JVM to the Lua or Erlang virtual machines; think about the issues
people run into when trying to port languages that aren't like Java to the
JVM. Unless people are very deliberate about making a really general
instruction set, a "bytecode standard" informed by Javascript could be
similarly awkward for languages that aren't just incremental improvements on
Javascript. Besides, you can't optimize for everything.

There are a LOT of details I'm glossing over (e.g. sandboxing/security
concerns, RISC vs. CISC, the DOM), but I've been meaning to point this out
since I read someone saying, "Why do people keep writing more VMs? Why don't
we just use the JVM for everything and move on?" It's not that easy.

~~~
scott_s
All true. But I think the potential benefits are real, and I don't necessarily
think it's a bad thing if the JS VM were specialized for JS. Standardizing the
bytecodes could allow a looser coupling between browsers and JS. It could
also let people play with JS optimizations and augmentations without
having to touch the VM internals themselves.

All of this makes me think that if I was a VM researcher, I'd seriously
consider going in this direction. And not being a VM researcher makes me think
maybe I should be one.

~~~
silentbicycle
I agree with you. Typical web devs probably don't have enough context about
virtual machine implementation to understand the trade-offs, though.

------
nix
My admittedly biased view: I spent two years of my life trying to make the JVM
communicate gracefully with Javascript - there were plenty of us at Netscape
who thought that bytecode was a better foundation for mobile code. But Sun
made it very difficult, building their complete bloated software stack from
scratch. They didn't _want_ Java to cooperate with anything else, let alone
make it embeddable into another piece of software. They wrote their string
handling code in an interpreted language rather than taint themselves with C!
As far as I can tell, Sun viewed Netscape - Java's only significant customer
at the time - as a mere vector for their Windows replacement fantasies.
Anybody who actually tried to use Java would just have to suffer.

Meanwhile Brendan was doing the work of ten engineers and three customer
support people, and paying attention to things that mattered to web authors,
like mixing JS code into HTML, instant loading, integration with the rest of
the browser, and working with other browser vendors to make JS an open
standard.

So now JS is the x86 assembler of the web - not as pretty as it might be, but
it gets the job done (GWT is the most hilarious case in point). It would be a
classic case of worse is better except that Java only looked better from the
bottom up. Meanwhile JS turned out to be pretty awesome. Good luck trying to
displace it.

SWF was the other interesting bytecode contender, but I don't know much about
the history there. Microsoft's x86 virtualization tech was also pretty cool
but they couldn't make it stick alone.

~~~
InclinedPlane
x86 is a perfect comparative example. An architecture that is a patch on a
patch on a patch (add several more layers here until you're tired) going back
to the 8086 a kajillion years ago (a processor which was less sophisticated
and powerful than an arduino). Intel tried to kill the architecture (replacing
it with IA64) but AMD patched it yet again and the result was successful.

Nobody sane would design an architecture like x86 (or even x86-64) from the
ground up today. Yet here we are.

~~~
Tuna-Fish
I would just like to point out that IA64 wasn't really a significant step up.
VLIW is a good idea for DSPs and GPUs and whatnot, but for the kind of
dynamic, branchy code that we all know and love, IA-64 was quite probably the
only somewhat modern CPU arch that was actually _worse_ than x86.

~~~
InclinedPlane
I'm not so sure about that. Modern IA64 compilers and systems are actually
pretty damned decent. Though the fact that they are still only comparable to a
monstrous, teetering pile of hacks and kludges (x86) is not much of a
recommendation.

------
Groxx
In all honesty... what's the difference?

Turing complete language ≈ Turing complete language. Just make something to
compile your language of choice into JavaScript, or make a new one
(CoffeeScript). Bytecode is just a slightly denser, slightly faster-executing,
_far_ harder-to-investigate language than JavaScript, which the browser can
compile for added speed anyway.

I prefer my language-of-the-web to be readable, thanks. Then I can find out
wtf it's doing. And where you _really_ need speed, native client is just about
your only option.

~~~
jules
The difference is that Javascript is about the worst possible IL for a
compiler to compile down to. Yeah, you can recover some of the speed by
spending man-years on making Javascript execute somewhat fast.

As a bytecode, you wouldn't use one targeted at handling Javascript
constructs. You'd use a bytecode that, for example, supports
machine integers (unlike Javascript). Something that allows you to allocate
objects, rather than string-indexed hash tables. Something that allows you to
allocate an array of floats, rather than an array of boxed floats.
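To make the integer point concrete, here's a hedged sketch (the `|0` idiom is a common compiler trick, not a native int type, and `int32Add` is an invented example):

```javascript
// Hedged illustration: every JS number is a double, so a compiler
// targeting JS has to emulate machine-integer semantics by hand.
// The |0 coercion truncates to a signed 32-bit integer.
function int32Add(a, b) {
  return (a + b) | 0; // wraps around like a real int32 would
}

int32Add(1, 2);          // 3
int32Add(0x7fffffff, 1); // -2147483648: int32 overflow wraps
```

In a bytecode with machine integers this would be a single add instruction; compiled to JS it becomes double arithmetic plus a coercion.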

~~~
InclinedPlane
"Worst possible"? Sub-optimal perhaps, but Javascript is becoming very fast in
the browser; perhaps it's not such a bad choice after all.

~~~
jules
With enough effort you can put lipstick on a pig. But really, an IL that
doesn't support integers? An IL where a member field access may involve
looking up a string in a hash table instead of being a single machine
instruction? Propose that as an IL to a compiler guy and they'll laugh at you.
Javascript is a historical accident. Let's look at some alternatives if
history had been different:

\- Scheme: this would have been a much better choice, because it supports
integers and has a compact object representation instead of representing
everything with hash tables. Bad point: all objects are boxed (like in JS).
Good point: tail calls are handy when compiling all kinds of control
structures.

\- Java: As an IL to compile down to this would have been better than
Javascript. Good points on top of Scheme: unboxed primitive types, static
typing means fewer runtime checks. Bad point: lack of tail calls.

\- ML: excellent choice: has both unboxed primitives and tail calls.

I challenge you to find a language that would have been a worse choice than
Javascript. It may turn out that by working really hard on smart runtimes and
compilers you can get OKish performance out of Javascript in some cases, but
that doesn't mean that it's a good choice for an intermediate language. A ML
compiler that just translates to LLVM IL as simply as possible without doing
any optimizations itself will easily beat the highly optimized Javascript
engines we have today.

~~~
Robin_Message
Just to note, though: Javascript in fact doesn't require representing objects
as hash tables. For example, the V8 javascript engine represents objects as
instances of classes, more or less exactly how C++ would do it (it builds the
class definitions automatically in the background).

You are right about integers, though, and other kinds of memory blocks in
general; they are trickier to fix in a javascript engine. Surely the best IL
is LLVM IL though, since it was designed so everything can compile down to it.
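To illustrate the hidden-class idea (a hedged sketch; `Point` is an invented example): engines like V8 give objects constructed with the same properties in the same order a shared layout, so a field access can compile to a fixed-offset load instead of a hash lookup.

```javascript
// Hedged sketch: objects built with the same property names in the
// same order share a "hidden class", so engines can treat p.x as a
// load from a fixed offset rather than a hash-table probe.
function Point(x, y) {
  this.x = x; // hidden class transition: {} -> {x}
  this.y = y; // hidden class transition: {x} -> {x, y}
}

var p = new Point(1, 2);
var q = new Point(3, 4); // same construction order, shares p's layout
```

Adding properties in a different order (or deleting them) forces a different hidden class, which is why this optimization can degrade back toward dictionary lookups.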

~~~
silentbicycle
> Just to note, though: Javascript in fact doesn't require representing objects
> as hash tables. For example, the V8 javascript engine represents objects as
> instances of classes, more or less exactly how C++ would do it (it builds
> the class definitions automatically in the background).

That sounds like what Self does, too. See the _excellent_ paper, "An Efficient
Implementation of SELF"
([http://selflanguage.org/documentation/published/implementati...](http://selflanguage.org/documentation/published/implementation.html)).

The Self bibliography page
(<http://selflanguage.org/documentation/published/index.html>) has a lot of
other content applicable to dynamic languages in general, too.

~~~
wmf
Self and V8 are connected through Lars Bak.
<http://en.wikipedia.org/wiki/Lars_Bak_(computer_programmer)>

~~~
silentbicycle
He worked on Beta / Mjølner, too. Neat!

------
hannibalhorn
Google is already heading there with native client. They're changing the
approach to serve up LLVM bytecode from the server, which the browser
translates to x86 or ARM prior to execution. For future apps that require
performance it should work well, and with Google's weight behind the tech I
think it'll be widely adopted.

There's already a version of python that runs in native client, and pretty
much anything could conceivably be ported.

~~~
AndrewDucker
Oooh, LLVM bytecode. I hadn't thought of that, but it's a perfect choice!

I hope it does get picked up by other people.

~~~
tav
<http://nativeclient.googlecode.com/svn/data/site/pnacl.pdf>

^ Overview of the LLVM-based Portable Native Client (PNaCl) architecture.

------
CodeMage
I really liked this comment (by khoth): "And once we have standardised
bytecode, the next logical step would presumably be to improve performance by
creating CPUs that can execute it directly. In 20 years we'll all be back
where we started."

~~~
AndrewDucker
I believe some work went on to produce a Java-running chip at one point. It
proved better to compile down to whatever suited the chip best.

~~~
gvb
There have been several JMs (JVMs without the "V") created[1]. None of them
really caught on other than ARM's Jazelle[2], which isn't really a JM; it is
more of a JVM accelerator: it has support for direct execution of many of the
JVM opcodes.

After investing _lots_ of time and money into creating a JM, the companies
were chagrined to find a general purpose processor with a good JVM (especially
with JIT) could run circles around a direct-execution processor.

[1] <http://en.wikipedia.org/wiki/Java_processor>

[2] <http://en.wikipedia.org/wiki/Jazelle>

~~~
tiles
And that makes an excellent case for why JavaScript should simply be the
bytecode of the future. x86 is gaudy and unsuited for many tasks--but it's
where most of the speed innovation happens, thus it's the best platform to
compile other languages to.

I'll propose an alternative: fix JavaScript by adding APIs like ByteArrays,
shorts, and a proper int to the language. Over time, JS could become an
excellent IL. We could standardize on an intermediate bytecode, but like all
things in web adoption, it will probably be the path of least resistance that
wins. (Who would ever have given HTML graphics, multimedia, and threading
abilities?)
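For what it's worth, the typed arrays coming out of WebGL are close to the ByteArray idea; a hedged sketch:

```javascript
// Hedged sketch: typed arrays give JS unboxed numeric storage, close
// to the ByteArray/int additions asked for above.
var floats = new Float32Array(3); // contiguous unboxed 32-bit floats
floats[0] = 1.5;

var bytes = new Uint8Array(2);    // a real byte array
bytes[0] = 255;
bytes[1] = 256;                   // wraps to 0, like a machine byte
```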

------
tptacek
Am I missing something, or did we all _have_ a browser bytecode standard that
everyone hopped on board with, and it turned out nobody really liked feeding
browsers compiled programs after all?

~~~
rlpb
It was never integrated into the DOM, though, was it? I am only aware of it
being used in a separate, sandboxed area on the page, unable to interact with
anything else except in a very limited way.

(Never mind the slow loading time etc)

~~~
patio11
You could integrate Java applets into the DOM. Now you have two problems.
(Keep in mind that back when this was popular IE 4 was busy kicking the pants
off of Netscape Navigator. Fancy programming against IE _4_?)

~~~
rimantas

      Fancy programming against IE 4
    

DHTML was born with IE4 and Netscape 4, and IE4 was fancier of the two.

------
BrendanEich
I'm on Ecma TC39 and in touch with JS hackers working for all the major
browser vendors. FWIW, as far as I can foresee there will be no standard JS
bytecode. Commenters here touch on good reasons:

* divergent, evolving VMs can't agree (not all NIH, see third bullet);

* IPR concerns;

* lowering from source over-constrains future implementation, optimization, and language evolution paths;

* view-source still matters (less, but still).

A binary encoding of JS ASTs, _maybe_ (arithmetic coders can achieve good
compression; see Ben Livshits' JSZap from MSR, and earlier work by Michael
Franz and Christian Stork at UCI). But this too is just a gleam in my eye, not
on TC39's radar.

Meanwhile, we are making JS a lot better as a target language, with things
like ES5 strict mode, the ES-Harmony module system and lexical scope all the
way up (no global object in scope; built on ES5 strict), and WebGL typed
arrays (shipping in Firefox 4).

We have a Mozilla hacker, Alon Zakai, building an LLVM-based C++-to-JS
compiler. Others are doing such things (along with good old SNES emulators and
the like).

So being a good mid-to-high-level, memory-safe compiler target language is on
TC39's radar. Not to the exclusion of other JS desiderata, and never in a way
that compromises mass-market usability or buy-by-the-yard rapid-prototyping
"scriptability". But one among several goals.

------
joubert
Google nativeclient might be a start...
<http://code.google.com/p/nativeclient/>

~~~
siddhant
There was a session about it this time at the Google Developer Day in Munich.
Interesting stuff. Right now they're focusing on C/C++, but C# is probably up
next.

~~~
AndrewDucker
Well, Mono supports LLVM, so it sounds feasible.

~~~
contextfree
Mono has already been made to work with PNaCl (for some experimental-
prototype-y definition of "already been made to work").

------
Hoff
Discussions of technical feasibility aside, getting a sufficiently large
installed base to make this interesting is (as usual) the "fun" part; you need
a plan for getting to critical mass against an available and "good enough"
solution.

As for previous bytecode approaches to consider, there are the Lisp Machines,
or the Burroughs B5000 Algol box, or the EFI byte code (EBC) interpreter and
the EBC I/O drivers, or UCSD pCode, or all the gonzo things you could do with
the VAX User Writable Control Store, the JITs underneath Java and Lua and
other languages, or...

And HTTP is printable, which means you're working within character-encoding
constraints or with escaping.

There are the not-inconsequential security requirements.

And then there's the question of how a provider might make money with this, if
you're not undertaking this effort for free.

Have at...

~~~
klync
Congratulations on being the only person in this thread with two feet on the
ground. Interesting discussion, nonetheless. But purely academic.

~~~
pjscott
Since Google is actually working on this, I would read their PNaCL paper
before calling the discussion "purely academic":

<http://nativeclient.googlecode.com/svn/data/site/pnacl.pdf>

Summary: Send LLVM bytecode to the browser, and enforce sandboxing by
inspecting the compiled code.

------
chrisaycock
Whenever I hear "standardized bytecode", I think of Parrot:

<http://www.parrot.org/>

~~~
robhu
That was actually what was in my mind when I originally mentioned the idea to
andrewducker!

------
alexmchale
It's an interesting theory, but ultimately irrelevant. The thing is that there
are SO MANY people working right now to make JavaScript incredibly fast. It's
not hard to imagine a future where JavaScript is the fastest reasonable way to
write software -- simply because it's the language that has the most R&D going
for it.

Did I say future? Oh, hello NodeJS.

I don't envision that we'll be writing JavaScript itself forever -- but rather
a super-syntax on top of it that compiles down to JS. CoffeeScript is the
first generation of this kind of programming language.

I do, however, believe that for the foreseeable future, JavaScript will become
the lingua franca of day-to-day programming.

~~~
silentbicycle
And yet, for all the resources being thrown at improving Javascript, LuaJIT is
still significantly faster. Oh, and it was written by one person.

The design of Javascript itself limits performance in various ways,
unfortunately. See this discussion between Mike Pall (of LuaJIT) and Brendan
Eich: <http://lambda-the-ultimate.org/node/3851#comment-57671>.

------
watt
Do you remember Internet Explorer with VBScript?

~~~
jarin
I was seriously trying to forget.

------
tlrobinson
I'd take it even further and implement the various web standards on this VM.
That way when HTML6 or CSS4 comes out you don't need to wait 4 years for
everyone to upgrade their browser, you just download it automatically the
first time it's used.

This is sort of taking Cappuccino/Objective-J's principle of "shipping the
runtime" to the extreme.

------
ithkuil
It would improve script parsing speed but perhaps hinder the evolution of the
interpreters' internals. Of course, engines could always retranslate this
standard bytecode into another internal representation and JIT it.

------
jallmann
I might be missing something here, but this implies a few things:

You won't be writing code in the browser anymore. Otherwise you'd have
fragmentation where the user doesn't have the proper interpreter installed,
and a never ending stream of "language downloads" which might be cool to devs
but horribly impractical for end users.

Because we're not writing in the browser anymore, we'd be back to offline
compilation and static code generation. I suppose it'd be possible to JIT the
bytecode (assuming that current Javascript optimizers work on the IR; I don't
know enough to qualify this any further). From my casual browsing of LtU, it
seems that tracing JITs in particular may offer optimizations difficult to
achieve with static bytecode compilation, particularly in identifying hot
spots. But the more I think of it, that's a non-issue because you're still
compiling to bytecode and JITing from there.

This feels very Java-ish to me. If you really want to script the browser in
another language, compiling to Javascript (a'la GWT) would essentially do the
same thing.

~~~
stevecooperorg
> You won't be writing code in the browser anymore

You could be. Implement a URL like <http://mydomain/scripts/MyScript.bytecode>.
This is a page which compiles, say, MyScript.rb into bytecode and delivers it
back to the client. You get a new copy by saving a file on the server and
hitting refresh, and there's only one interpreter to download.

~~~
jallmann
That just moves the bytecode compilation to the server -- so you're back to my
original point of generating the code statically. In fact it'd probably be
worse, since there is no reason to compile MyScript.rb to bytecode more than
once. It's like generating a dynamic page with Rails when all you really need
is static, cacheable HTML.

------
stevecooperorg
I'd be really interested in seeing this develop. Define a bytecode interpreter
(JSVM?) and push bytecode into it. Performance may be god-awful, but you'd be
free to use whatever language you fancy.

Parchment is an example of the approach, implementing the z-machine VM in
javascript. <http://code.google.com/p/parchment/>

~~~
stcredzero
_Performance may be god-awful_

Why, after decades of JIT bytecode VMs, do we still have this misconception?
Many of the widespread JVM implementations are faster than V8. LuaJIT and C#
Mono are also VM implementations that JIT bytecode and are faster than V8! Is
this a troll?

[http://shootout.alioth.debian.org/u32/which-programming-
lang...](http://shootout.alioth.debian.org/u32/which-programming-languages-
are-fastest.php)

~~~
stevecooperorg
As people have commented, I'm suggesting writing a javascript function which
takes pre-compiled bytecode; like so;

    
    
        function interpret(bytecode) {
            // stack machine implementation goes here
        }
    

And called like so;

    
    
       var bytecode = load("http://my.domain.com/myscript.bytecode");
       interpret(bytecode);
         

I'm not suggesting that bytecode is inherently slow -- just that I could take
a great stab at writing a slow bytecode interpreter in JavaScript. ;)

So under this scheme, you could do a server-side compilation of any language
-- let's imagine Pascal as an example -- and deliver it back to the client as
bytecode. Now you've broken the browser dependence on Javascript. At the cost
of a VM written in JavaScript.
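A toy version of such an interpreter, just to make the idea concrete (the opcodes are invented for illustration, and a real one would need jumps, locals, and so on):

```javascript
// Hedged sketch: a minimal stack-machine interpreter in JavaScript.
// The opcode set here is hypothetical.
function interpret(bytecode) {
  var stack = [];
  var pc = 0;
  while (pc < bytecode.length) {
    var op = bytecode[pc++];
    switch (op) {
      case "push": stack.push(bytecode[pc++]); break;        // push literal
      case "add":  stack.push(stack.pop() + stack.pop()); break;
      case "mul":  stack.push(stack.pop() * stack.pop()); break;
      default: throw new Error("unknown opcode: " + op);
    }
  }
  return stack.pop(); // result left on top of the stack
}

interpret(["push", 2, "push", 3, "add", "push", 4, "mul"]); // (2 + 3) * 4 = 20
```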

~~~
stcredzero
_I'm suggesting writing a javascript function which takes pre-compiled
bytecode_

Sorry, I misunderstood.

 _Now you've broken the browser dependence on Javascript. At the cost of a VM
written in JavaScript._

I keep looking at these two sentences again. On one hand I know what you mean.
On the other hand, this doesn't make any sense because it contradicts itself.
(Which is why I misunderstood the original comment, I think.)

------
Detrus
Maybe browser makers could expose a few lower level constructs in JS to make
efforts like CoffeeScript, Objective-J and GWT easier?

That could be a short term solution. Long term we have to hope that LLVM and
NaCl pick up steam.

But many languages for the same tasks will create fragmentation. Right now
there is a big pool of developers proficient in JS. They even use the same
libraries! The libraries are where a lot of productivity gains come from. The
web became an "easy" platform to develop for. With fragmentation that could be
lost.

CoffeeScript is interoperable with JS, easy to pick up because it's mainly a
cleanup, but Objective-J and GWT are too far out. GWT needs a reworked JQuery,
called GQuery.

So there are these economic considerations that are probably more important
than language preference.

~~~
tolmasky
Not sure what you mean by Objective-J being too far out. Objective-J is much
closer and more "interoperable" with JS than CoffeeScript because it is a strict
superset of JavaScript (in other words, all JavaScript _is also_ Objective-J).
Practically, what this means is that any existing JS works alongside
Objective-J with no changes whatsoever; the syntaxes do not "collide". The
difference between ObjJ and JS is comparable to that between ECMAScript 2 and 5
(additions of new features and keywords).

~~~
jashkenas
Not to be combative, but I think what he means is this: Although it's
perfectly easy to use JavaScript from Objective-J, the same is not true in
reverse. ObjJ generates code that looks like this:

    
    
        objj_msgSend(objj_msgSend(CPIndexSet,"alloc"),
          "initWithIndexSet:",_selectedRowIndexes);
    

... where objj_msgSend is how the Objective-J "interpreter" does its magic.
If I wanted to use a library that was originally written in Objective-J, from
JavaScript, I'd have a hell of a hard time calling it correctly, unless the
library author took special care to make it JavaScript-compatible in the first
place. The syntaxes may not collide, but the semantics certainly don't match
up.

On the other hand, you'd never know that you were calling a CoffeeScript
library from JS, unless you inspected the source, and vice-versa. In that
sense, they're interoperable.

~~~
tolmasky
No, not combative at all; what you say makes perfect sense. I suppose the
confusion arose because he bundled objj with gwt, saying you'd need a totally
different jquery, which is certainly not the case with objj.

------
ssp
It's not a bad idea.

In fact, why not take it a step further and make a stand-alone client that
_only_ executes the byte code? It would be different from Java for two
reasons:

(a) The bytecode format could be much simpler if it weren't tied to a
particular language (no type system, no objects, maybe not even a garbage
collector). If the bytecode was similar to a real CPU architecture, it would
be possible to target it from LLVM.

(b) There would be no humongous standard library to install, because it could
just be downloaded on demand. With almost all of the library living on the
server side, application authors could avoid a lot of the usual compatibility
nightmares with different client implementations each one with its own bugs
and workarounds.

------
pedrocr
I tried to make this point in this submission:

<http://news.ycombinator.com/item?id=1790311>

The fact that it also ranted about Javascript shifted the discussion, though.

~~~
user24
I remembered that post when I saw this headline. I think from a philosophical
viewpoint it's a good idea - though I've no idea what the implementation
difficulties might be (could google have created V8 if we had a bytecode
system?). But from a practical stance, what's the benefit? What would it
achieve?

~~~
pedrocr
>But from a practical stance, what's the benefit? What would it achieve?

The reasons I outlined in my post were things like having a single form
validation codebase, deployed both server side and client side. More
ambitiously you could have the same codebase used for the server-side online
portions of something like gmail and also retarget that to use for offline
gmail. Basically the big divide between your server-side and client-side
codebases would start to disappear.
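A minimal sketch of the shared-validation idea, assuming plain JS that runs unchanged in Node on the server and in the browser (the function and regex are invented for illustration, and the regex is deliberately naive):

```javascript
// Hedged sketch: one validation function, shared verbatim between the
// server-side and client-side codebases.
function validateEmail(s) {
  // naive pattern, purely illustrative: something@something.something
  return /^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(s);
}

validateEmail("user@example.com"); // true, on either side of the wire
validateEmail("not-an-email");     // false
```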

Seems like Google's native client system is a great step in this direction.
Soon enough it may be possible to compile, say, Lua to LLVM bytecode and deploy
that to a browser in a bundle of javascript. If the browser supports NaCl, you
execute the LLVM bytecode with the built-in VM; if not, you have a fallback
bytecode interpreter in Javascript. That way you get full coverage of browsers,
with degraded performance only when NaCl isn't available. I'm not sure,
however, how fast you could make the Javascript fallback. If it is very slow it
may not be workable.

------
stefs
imho an intermediate bytecode makes no sense, as silentbicycle explained.

3 alternatives:

1) make every browser vendor implement multiple runtimes

2) x-to-js translators

3) interpreters implemented in javascript

ad 1) multiple runtimes: not a good idea, for multiple reasons.

first, versioning hell. javascript is ~15 years old, and browser vendors are
still not able to provide 100% compatibility. you'd have versioning hell, only
worse. second, it would hurt javascript performance, because browser vendors
would have to split resources. third: which languages should be supported?
you just couldn't please everyone, so there would be ongoing "why is language
x supported but not y?" problems.

ad 2) translators: are in use now. see coffeescript, gwt and ghcjs, ...

pros:

\- almost native javascript speed

\- already possible

cons:

\- small translation overhead (depending on if it's JITted or precompiled)

\- not everything is possible. if the javascript runtime doesn't support tail
call optimization, the translation won't have it either. certain magic just
doesn't translate.
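for the tail-call point, one standard workaround a translator could emit is a trampoline (a hedged sketch; no particular translator is claimed to do this, and `even`/`odd` are invented examples):

```javascript
// Hedged sketch: tail calls return thunks instead of calling directly,
// and a driver loop unwinds them, so the JS stack never grows.
function trampoline(result) {
  while (typeof result === "function") {
    result = result(); // keep bouncing until we get a non-function value
  }
  return result;
}

function even(n) { return n === 0 ? true  : function () { return odd(n - 1); }; }
function odd(n)  { return n === 0 ? false : function () { return even(n - 1); }; }

trampoline(even(100000)); // true; direct recursion would blow the stack
```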

ad 3) interpreters implemented in js

afaik there are some, e.g. js brainfuck interpreters.

pros:

\- the sky's the limit

\- already possible

cons:

\- slow (adds an emulation layer).

that said, i'm against additional browser-provided runtimes except NACL. you
could use javascript for your day-to-day work and NACL for your special needs.
i'm not sure if NACL would be a complete stand-in for JS though - would it be
possible to communicate with the DOM (or at least JS)? if yes, there you have
it - problem solved.

~~~
nickik
JS is just not a good language as a compiler target. If you had a bytecode
with all the right low-level types and features, it would be easier for the VM
implementers (mozilla, google ...) to optimise, you could modify JS without
changing the VMs, languages would not be blocked by the speed of JS, and faster
languages would be possible. It would be easier to make languages that are
very different from JS work in the browser.

Your solution works, but a haskell interpreter in js will never have good
performance; it would if you could make a GHC backend that compiles to a "web
bytecode".

------
adamzochowski
But you can run different languages inside a browser.

Activestate had Perl running inside browser:
[http://docs.activestate.com/activeperl/5.8/Components/Window...](http://docs.activestate.com/activeperl/5.8/Components/Windows/PerlScript.html)

    
    
      <script language="PerlScript">
        $window->document->write('Hello world!');
      </script>
    
    

Also, you can run VBScript. I have seen code that avoids javascript's confirm()
and tries to check first whether it can use VBScript's msgbox function, just so
it can provide 'yes'/'no' buttons. E.g.:

    
    
      function confirmVB (text)
        confirmVB = msgbox ( Text , VBYesNo )
      end function
    

and then javascript

    
    
      function confirmYesNo(txtText)
      {
        if (window.vbSupported)
        {
          return confirmVB(txtText);
        } else {
          return confirm(txtText + " (ok = yes, cancel = no)");
        }
      }
    

Cheers

------
cpr
No, it wouldn't help at all: let me point to the overwhelming success of
standards (XML, HTML, Javascript, CSS, Postscript) that are character-based.

Think of Javascript as a (human-readable) virtual machine layer itself, below
which the implementation is free to do as it pleases as long as it meets the
JS standard semantics.

------
abecedarius
I took a whack at this 9 years ago. My code's at
<http://wry.me/~darius/software/idel> and there were other, probably worthier,
attempts around the same time and before, like Michael Franz's work with
Oberon. Basically: make a low-level VM more or less like LLVM without the
machine-dependent semantics and with a compact, easy-to-verify wire format.
Apparently PNaCl is working on fixing those infelicities now, or at least the
machine dependence.

(I had some fun but decided the obstacles to adoption were too great and we'd
end up with x86 in the browser someday. So the NaCl announcement years later
amused and gratified me.)

------
amix
Having a standard bytecode would make it much harder to steal code, which is
much needed as web and mobile applications shift towards using lots of
JavaScript.

Currently it's way too easy to steal everything, since the whole source is
exposed. You can obfuscate JavaScript and CSS, but the main semantics will
still be there, and someone who's interested will still steal your code (this
has happened twice to us, even though all of our JS is obfuscated using Google
Closure...). The same thing could happen with bytecode, but what they took
would not be maintainable. Obfuscated JavaScript is still maintainable, since
the structure and semantics are largely there.

~~~
tptacek
Bytecode-compiled languages are dreadfully easy to decompile.

~~~
amix
The point isn't that you can't decompile bytecode languages, but that
decompiled bytecode is a lot harder to maintain than obfuscated JavaScript...

------
snissn
Microsoft's silverlight plugin allows for embedding python and ruby in the
browser: <http://www.silverlight.net/learn/dynamic-languages/>

~~~
nikcub
and Javascript. It is very very fast with Javascript.

You can do what is discussed in this thread today by detecting Silverlight in
the client and if it exists use it instead of Javascript source. DOM access,
manipulation and everything else is just so much faster.

The Silverlight SDK is free, you can get it on Linux and OS X (Moonlight), and
there are a lot of Silverlight VMs out there.

------
thisrod
Why use a virtual machine when you have a real machine? It takes some
cleverness to do safely, but the AI lab folks are plenty clever, and they've
found a way:

"Vx32 is a user-mode library that can be linked into arbitrary applications
that wish to create secure, isolated execution environments in which to run
untrusted extensions or plug-ins implemented as native x86 code."

<http://pdos.csail.mit.edu/~baford/vm/>

------
DjDarkman
JavaScript is so moldable, you could easily make a compiler that translates to
JavaScript. You could even write a language and translate it in the browser,
see CoffeeScript.

------
kevinburke
If bytecode became the standard it would further obscure scripting code and
make it tougher to read other people's code.

------
kqueue
Just translate it to javascript.

------
DjDarkman
Take a look at Coffee Script.

------
logancautrell
I believe you are looking for <http://nodejs.org/>

~~~
alanh
(I didn’t downvote you but) you answered the wrong parsing of OP’s original
question: It wasn’t “why can’t we use JS elsewhere?” but rather “Why can’t we
use other languages in the browser,” a semi-tired plea from those who haven’t
learned to love JavaScript and/or don’t realize that due to the countless
number of browsers out there, JavaScript is going to be the only option for
DOM scripting for years, at minimum.

