
LLJS : Low-Level JavaScript - vmorgulis
http://mbebenita.github.io/LLJS/
======
girvo
The idea of LLJS is quite interesting, similar in ways to asm.js (in fact
there was a fork that could compile to it!) but easily human readable and
writable. It seems to be a dead project, however; I came across it a few weeks
ago and it appears abandoned, unfortunately. With wasm, hopefully we'll see more
and more languages like this one; I'm imagining a low-level C-like language
solely for doing WebGL in a fullscreen canvas.
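
For context on the asm.js relationship: asm.js encodes low-level types in
plain JS through coercion idioms, which is exactly the boilerplate LLJS let
you write as real type annotations instead (e.g. `function int avg(int x, int y)`).
A minimal sketch of the idiom (the `avg` function is my own illustrative
example, not taken from the LLJS docs):

```javascript
// asm.js-style code is ordinary JavaScript where types are spelled as
// coercions: `x|0` marks a 32-bit integer, `+x` a double. An LLJS-to-asm.js
// compiler would emit annotated code roughly of this shape.
function avg(x, y) {
  x = x | 0;                   // coerce parameter to int32
  y = y | 0;
  return ((x + y) >> 1) | 0;   // integer average, no floating point
}
```

The coercions look redundant to a human reader, but they are what lets an
asm.js-aware engine pick fixed machine types ahead of time.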

~~~
halosghost
I don't know; honestly, I think the nicest part about wasm is that it might
allow us to stop making more dialects of JS, get rid of JS altogether, and
compile whatever language we like to the web (I'm looking at you, Haskell).

~~~
woah
I have never really understood this kind of sentiment. There are a huge number
of people who like JS, to the point that it is used in a bunch of places that
are not the browser. In terms of dynamic scripting languages, it's a fast,
easy, and flexible option.

~~~
halosghost
Disclaimer: the below text is just my opinion meant to answer the Parent's
questions about my preferences. Please do not take it to be an attempt at
starting a flamewar (or, for that matter, an invitation to do so).

I have never enjoyed dynamic typing; it has always seemed like
TheWrongSolution™ to me. And Haskell really drives that home for me (i.e., it
is possible to have a static type system with all the benefits that gives you
without any of the headaches).

Additionally, I really dislike the trend of JS moving out of the web browser;
it is incredibly easy to write impenetrable JS (particularly with the
compile-to-JS languages), which makes it much harder for me to understand what
is running on my system, how to contribute to it, and how to debug it.

Furthermore, JS is fast in comparison to other interpreted dynamic languages,
but is quite slow compared to most native code (like the kind you would get
with Haskell).

I am not trying to be one of the JS naysayers who get into religious wars,
but I will say that I am generally not a fan of the language, and the
possibility of removing it from my life does seem like a net positive.

But again, my intention is not to start a flamewar, simply to answer your
questions about my preferences.

~~~
woah
Your choices are totally sensible.

What I meant was that the idea that everyone is just waiting for JS to go
away seems like extrapolating your preferences onto other people.

~~~
saosebastiao
He extrapolated to me quite well. The existence of hundreds of languages that
compile to javascript is pretty good proof that there are tons of people that
are waiting for JS to go away.

~~~
halosghost
Or, at the very least, that there are many people that feel unsatisfied with
JS in its current form.

------
bobajeff
They actually made a version that generated asm.js. Unfortunately, that was
before asm.js was stable, and they haven't updated it since, so it no longer
generates valid code.

The language and generator were simple enough to use in a browser-only
environment, so it's a shame there was never more interest in it. It would've
been nice to be able to fool around with asm.js without having to set up a dev
environment, which isn't possible in some circumstances.

------
logicallee
I asked Carmack and others; he hated my idea and said it was silly.† I still
think it's the right way to go. _I think we should physically have CPUs that
you can hand a binary you don't trust, straight off the Internet, and nothing
bad can happen._

Read that again. CPUs should be able to be given a binary blob, and nothing
bad can happen. It should be native speed because it's running on the actual
metal. Like a hardware CPU, for the web.

A multi-gigahertz processor cannot render an 8-bit (that is, 0-255 here)
processor running at 1.79 MHz with a whopping 2 kB of RAM, a processor that
is orders of magnitude slower:
[https://news.ycombinator.com/item?id=9624483](https://news.ycombinator.com/item?id=9624483)

This is insanity. It's time for new hardware that you can say "go wild" to by
typing in an HTTP address. 90% of the reason for enduring this slowness is
security. (It's okay to interpret JavaScript; what's the worst that could
happen? It's not okay to agree to run any binary regardless of what will be
in it.)

It's time for Intel to make chips that you can hand to a website and say, "go
wild". I think reinventing C in JavaScript at 10% of performance (generously)
is what is silly. Running Mario a bit too slowly to be called "perfect" is
what is silly. It's 2015. Light travels 10 centimeters between clock cycles
with cores at 3 GHz. Do we really have to spend three _billion_ cycles every
second on interpreting JavaScript?

† "Dedicated hardware would be silly – the current hardware can do secure
sandboxing just fine. Cross platform compatibility is an issue with a lot of
possible solutions; dedicated hardware would be a far worse case."

~~~
cactusface
NaCl?

~~~
lucd
[https://developer.chrome.com/native-client](https://developer.chrome.com/native-client)

Native Client is a sandbox for running compiled C and C++ code in the browser
efficiently and securely, independent of the user’s operating system.

~~~
cactusface
Oh, it was a suggestion to the guy, but thanks for elaborating.

~~~
logicallee
Yes, along those lines. I don't know how large a subset NaCl is, though. And
it's yet another target. It would be cool if you could just target x64, and if
the browser is running on a box that is new enough (has such a CPU), your
thread literally GETS an x64 core whenever it has focus. All of it, same as
if it were a desktop app. Like a coprocessor. (Exactly what was called a
silly idea.)

------
halosghost
This appears to be about a language that compiles to JS (asm.js, it seems?),
not wasm (WebAssembly), which is meant to be more like an IR (think LLVM IR)
but for the web. Am I misunderstanding something?

~~~
girvo
_> more like an IR_

Actually, it's more like a compressed abstract syntax tree. Subtle difference,
but there's no stack machine you're targeting in wasm, which I think is quite
interesting!

~~~
halosghost
I suppose, but my point is that what the OP posted appears to be completely
unrelated to wasm…

------
empyrical
I wonder if LLJS could become something like Cython for JavaScript. Imagine
writing Node or Emscripten bindings to a C library with JS!

------
bluejekyll
Can we get over all this native JavaScript junk and finally move to either
platform-independent LLVM bytecode (wasm, as I understand it) or just use the
damn JVM? There's no need other than faster dev times (which can be handled in
other ways) to keep the code in a non-compiled format anymore. It's nuts and
super inefficient. Let's move on already.

~~~
vmorgulis
If wasm is effectively adopted, the browser could become the cocoon of most
end-user applications.

It will be like a VM plus an OS API:

    
    
      - DOM replaces GUI frameworks (like Qt)
      - Canvas replaces the framebuffer (or GDI)
      - JS File API replaces fopen() (or, with Emscripten, a virtual FS)
      - ...
    

Like in the 90s, we could see a revamping of applications, but for the web
this time (instead of Windows).

We can see JS as a higher-level shell and the Web API as a cross-OS POSIX.
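
The "Canvas replaces the framebuffer" point is quite literal: a canvas 2D
context exposes raw RGBA bytes through ImageData, so drawing becomes writing
into a pixel buffer. A minimal sketch (the buffer is allocated directly here
so the pixel math is self-contained; in a browser it would come from
`ctx.getImageData(...).data`):

```javascript
// Treat an RGBA byte buffer as a framebuffer, as canvas ImageData does.
const width = 4, height = 4;
const fb = new Uint8ClampedArray(width * height * 4); // 4 bytes/pixel: R,G,B,A

function putPixel(x, y, r, g, b, a) {
  const i = (y * width + x) * 4;  // row-major offset into the buffer
  fb[i] = r; fb[i + 1] = g; fb[i + 2] = b; fb[i + 3] = a;
}

putPixel(1, 2, 255, 0, 0, 255);   // opaque red pixel at (1, 2)
// In a browser, flush the buffer to the screen with:
//   ctx.putImageData(new ImageData(fb, width, height), 0, 0);
```

Nothing in that loop needs the DOM beyond the final blit, which is why a
wasm program compiled from C can render this way with almost no Web API
surface.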

~~~
teacup50
The DOM is going to be the first thing to go if WebAssembly actually takes
off.

It's simply not a suitably performant model for a GUI event/render tree.

~~~
pdkl95
...and the open web will be the 2nd thing to go. If businesses that rely on a
walled garden approach (Facebook, a lot of "software as a service", etc) and
the paranoid cartels that rely on re-creating scarcity (the {RI,MP}AA,
"ebooks", what's left of newspapers) can bypass the DOM and render directly,
they will.

Does anybody seriously think that the people that try to disable the right-
click context menu in a futile attempt to prevent people from using "save as"
will bother to re-create copy-paste support? Is anybody delusional enough to
believe that businesses will _pay_ developers to add back in proper URL deep-
linking support into their "app" with custom rendering that loads page content
AngularJS-style?

Sure, those of us that know what we are doing can [decompile and] read the
source and bypass the whole mess. That doesn't help normal people, and worse
it's a workaround; links will still be broken. Additionally, who knows how
courts will interpret "decompiling" WebAssembly. Does that count as a
"technological measure that effectively controls access" under the DMCA?

The requirement of rendering to the DOM puts _de facto_ limits on what can be
done on the client, which is part of what has made the internet and the web so
successful. Giving those that wish to lock up the commons the tools they need
to build their own locks will be one of the worst things to happen to the
internet. Unfortunately, I suspect that a lot of the people that _should_
understand these issues will be distracted by shiny toys and promises about
better tools and faster apps when they should be thinking about how the
technologies they support will affect their future.

edit: fixed spelling typo

~~~
walterbell
We are going to need self-declared URL labels/annotations plus third-party
whitelists/blacklists, so that browsers/extensions can filter the "webs" that
will splinter. This could extend all the way to suppressing the rendering of
links which violate user preferences.

