I am 100% with Carmack. Sorry, guy. JavaScript is crap. The reasons that JavaScript sucks have been hashed to death in the past. If you are already disagreeing with me, then anything I say here won't change your mind, because you've already heard the arguments before and built up a repertoire of responses. That's fine, whatever floats your boat. Lots of fun games have been, and will be, written in silly languages like JavaScript and ActionScript and whatever. People used to write games in assembly, and they were still fun. In the end, it's the game that matters.
But don't tell John Carmack, or any of the many other people who have been writing simulation engines and 3D rendering engines since around when you were born, to use your web browser scripting engine. Seriously. (Also before I was born.)
And unsecured access to the graphics stack is a terrible idea. Flash already randomly causes my GPU to crash.
Do you not see the rhetorical problem with what you're saying? To come in here with a controversial statement and preemptively write off all oppositional viewpoints is to act in exactly the same way you deride JavaScript proponents for allegedly behaving.
...however, this is also true of every other part of web programming: HTML and CSS are similarly a mixture of academic navel-gazing nonsense and ugly kludges designed by committee.
The fact is that HTML5 is still really only version 0.1 of where web programming should be. But this doesn't mean it isn't improving. The fact is that having portable code snippets that can execute within a website is extremely powerful, and right now JavaScript is essentially the only way to accomplish this.
You don't have to love JavaScript to love what it allows you to do.
> The fact is that HTML5 is still really only version 0.1 of where web programming should be
That's one way of looking at it. Another point of view is that HTML already does far more than it needs to in order to serve as a perfectly adequate hypertext format. It could be argued that it's a mistake to try to morph the web browser into a crappy OS and/or a poor man's UI remoting technology, when we already have perfectly good operating systems and technologies for remoting out UIs or delivering applications to remote locations.
Perhaps instead of trying to make the browser a Swiss-Army-Do-Everything-Polymorphic-FrankensteinsMonster-OperatingSystem-XServer-ApplicationRuntime, we should just use the browser to browse hypertext, and hand off these other issues to purpose-specific software that was designed to handle them.
> we already have triple figures of good operating systems and technologies for remoting out UIs
We only have one web stack. It may differ a little at the bleeding edge, but at the core it's the same stack, deliverable to pretty much every user and device in the world.
(Disclaimer: I work on the aforementioned Firefox OS :)
> The reasons that JavaScript sucks have been hashed to death in the past. If you are already disagreeing with me, then anything I say here won't change your mind, because you've already heard the arguments before and built up a repertoire of responses.
You're implying I'm closed-minded because I have an opinion. I would actually like to hear your reasons for saying JS is crap/silly.
It seems to me like you missed the point of his letter to Carmack entirely. He's saying it doesn't have to be the greatest tool for the job; it's a good intro for would-be programmers and dabblers.
And don't give me that "when you were born" line. It serves only to hinder fresh, new ideas and perspectives.
Signed,
A dinosaur who tries to refrain from using the "When I was your age..." line.
Pygame can be a pain to install sometimes. Things that package dependencies into a single binary (like Love2D) are easier to set up, but often lack the out-of-the-box interactivity that JavaScript tools provide.
I've been "writing simulation engines and 3D rendering engines since around when you were born" and I love JavaScript.
Like most C/C++/ASM programmers I hated it at first, and I'm under no delusion that it's going to replace C/C++/ASM any time soon. But like many say, "the right tool for the job". By any measure Lua is complete shit, and yet Carmack made it popular by using it as the glue for most of his games.
JavaScript does many things extremely elegantly, and the environment it runs in is fun, cross-platform and accessible. It's fun to make things in JavaScript the same way it was fun to make programs on my Atari 800 and Commodore 64. You type something, click "refresh", and it's up. No downloading a 2-4 gig dev environment. No finding 4 SDKs you need to install. No figuring out how to get your image libraries to link. No worrying about how to get the initial screen up.
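To make that concrete, here's roughly the entire "dev environment" needed for a moving square. This is just an illustrative sketch of mine, not something from the letter: save it as an .html file, open it in a browser, edit, hit refresh.

    <canvas id="c" width="320" height="240"></canvas>
    <script>
      // One file, zero SDKs: a square drifting across a 2D canvas.
      var ctx = document.getElementById('c').getContext('2d');
      var x = 0;
      (function frame() {
        ctx.clearRect(0, 0, 320, 240);   // wipe the previous frame
        x = (x + 2) % 320;               // move right, wrap around
        ctx.fillRect(x, 100, 20, 20);
        requestAnimationFrame(frame);    // schedule the next frame
      })();
    </script>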
And even better, when it works you just send your friends a link. No installing. No worries if they are on OS X and you are on Windows or Linux.
Are you going to write Shadow of the Colossus in JavaScript? Probably not. But ICO? Yeah, no problem. Any Zelda to date? Sure. 80-90% of all games ever shipped would run just fine in JavaScript at this point in time. Very few games actually need every ounce of perf.
> And unsecured access to the graphics stack is a terrible idea.
WebGL does not provide unsecured access to the graphics stack so I'm not sure what this BS remark was about.
> "By any measure Lua is complete shit and yet Carmack made it popular by using as the glue for most of his games."
This just isn't the case. In recent games id Software has moved away from using scripting languages at all. Later in the same talk, Carmack praises iOS for reversing the trend toward Java and bringing back native languages, saying that native languages are the only way you can get predictable performance for things like smooth scrolling.
> And unsecured access to the graphics stack is a terrible idea.
No one gives you unsecured access to the graphics stack. WebGL shaders are parsed and modified for security (for example, inserting things like bounds checks).
(Of course, there are security risks with any new technology.)
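One illustrative case (my snippet, not anything official): GLSL makes an out-of-bounds constant array index a compile-time error, and the WebGL translator enforces that before the raw source ever reaches the GL driver.

    // Assumes a browser with WebGL support (getContext may return null otherwise).
    var gl = document.createElement('canvas').getContext('webgl');
    var src = 'precision mediump float;' +
              'uniform float u_vals[4];' +
              'void main() { gl_FragColor = vec4(u_vals[7]); }';  // index out of bounds
    var shader = gl.createShader(gl.FRAGMENT_SHADER);
    gl.shaderSource(shader, src);
    gl.compileShader(shader);
    if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
      console.log(gl.getShaderInfoLog(shader));  // the validator explains its refusal
    }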
That's the problem. We aren't even sure what attacks exist against the graphics stack, let alone how to secure it. There is a mountain of "who would ever call this maliciously?" code sitting under WebGL.

I have similar thoughts about allowing web sites to install fonts. Font rendering is hard, and font engines have not traditionally been exposed to attack. The idea that we have found all the bugs in them is absurd. There were security problems found in PNG many years after release, and that was something people knew needed to be secure from the beginning.
> That's the problem. We aren't even sure what attacks exist against the graphics stack
That could be an argument against new JavaScript JITs, too, or new video codecs. But we add those to browsers all the time, because they are important. So is being able to render 3D, I would argue.
Furthermore, of course we have an idea of the risks here, and of the ways to prevent them. A huge amount of effort has gone into that, both in speccing WebGL and in implementing it, in collaboration with GL driver vendors. And security experts have been poking at GL and WebGL for a while, just like they poke at JS engines and video codecs.
Yes, it is an argument against JITs. For a little while, OS builders were making progress on mitigation techniques; then the browsers all got together and decided it would be cool to allow attackers more control over the executable parts of the address space.
I do care about speed and 3D. But I don't think the web needs to be the delivery mechanism for all software. 900 years ago, they compiled Thunderbird once and everybody downloaded and used it. Now, every individual end user compiles Gmail on a daily basis. The SaaS model has a lot of advantages. It has disadvantages too. I'm OK with only a subset of all potential programs being viable in a SaaS model.
In particular, I think allowing malicious people to run programs on your computer by default because, hey, it's safe in the sandbox, is a terrible idea.
I didn't know that. Thanks. But I'm not so much worried about bounds checks as I am other obscure forms of overflow or corruption in graphics drivers. They barely work for things that are built specifically for them.
Another example: when you call bufferData in WebGL, it doesn't just call glBufferData with those arguments. It carefully checks those arguments for validity, including against the current state (currently bound buffer, its size, etc.). This ensures that the GL driver only sees calls with valid, reasonable info. It does incur a performance hit, but as you can see from WebGL demos on the web, it isn't terrible.
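For the curious, a conceptual sketch of the kind of checking that sits between a page's bufferData call and the driver. All the names here (state, synthesizeGLError, driver) are invented for illustration; real implementations live in C++ inside the browser:

    function checkedBufferData(gl, state, target, data, usage) {
      if (target !== gl.ARRAY_BUFFER && target !== gl.ELEMENT_ARRAY_BUFFER)
        return synthesizeGLError(gl.INVALID_ENUM);      // bad enum: driver never sees it
      var buffer = state.boundBuffer(target);            // bindings tracked browser-side
      if (!buffer)
        return synthesizeGLError(gl.INVALID_OPERATION);  // nothing bound: rejected here
      if (usage !== gl.STATIC_DRAW && usage !== gl.STREAM_DRAW &&
          usage !== gl.DYNAMIC_DRAW)
        return synthesizeGLError(gl.INVALID_ENUM);
      buffer.size = data.byteLength;  // remembered so later bufferSubData and
                                      // drawElements calls can be range-checked
      driver.glBufferData(target, data.byteLength, data, usage);
    }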
GL drivers in general are pretty complex. But the very small subset of GL that WebGL uses - both in terms of the commands used, and the content it passes in those commands as just mentioned - is much simpler to secure. Browser vendors have been working with GL driver makers on safety for a while now to improve things further.
Again, are there security risks? Sure, any new technology has them. JavaScript engines and video codecs have them, exploits for those are discovered all the time, but without those our browsers would be much less useful. The same is true for WebGL.
That only transfers the decision to run it; it doesn't make it any safer. You can override anything, but that doesn't "clean" it or make it beneficial by default.
Yeah, no one is "telling" John Carmack et al. to do anything. Unless you consider making an argument to be ordering folks around, in which case may I suggest you listen to this guy John who recently talked about JavaScript...
Since the Apple II was, all in all, just very basic, easily programmable hardware for home users, I say no.
Our generation's Apple II is the ARM, or any ARM-based development board. It's ubiquitous and it's (at least compared to the x86 architecture) rather easy to program, yet very powerful. Those are the reasons why the Apple II bred good programmers like almost no other machine in its day.
Comparing a high-level language with tons of abstractions to Applesoft BASIC or pure 6502 Assembler is rather appalling to me. The reason why people who coded for the Apple II and similar machines became such great programmers is because they got a fundamental understanding of the hardware from it, which is something that JavaScript will never give you. It's not just that tinkering with low level stuff opens up new ways to be creative with the technology, but it does in fact make you a better programmer on all levels.
Additionally, JavaScript is (in my opinion, obviously) an absolutely terrible language, which we're just stuck with because it happens to be the only one that's well supported by all the big browsers of today.
ARM is cool, but you cannot give a kid an ARM-based computer and expect them to program anything without setting up a programming environment first. Do you expect this kid to cross-compile on a PC? Or maybe to set up gcc on the device itself?
Maybe the Raspberry Pi will change that, but for now the kid won't even know how to compile anything. The best bet is probably Python, which isn't much better in "closeness to metal" than JavaScript.
I'm a bit too young to know how well the Apple II was documented for programming. However, people like Jordan Mechner (creator of the original Prince of Persia) were spitting out 6502 assembler code at a rather young age, weren't they? And it's not like that's a trivial task. Mind you, there was no internet as we know it today with StackOverflow and other resources; in fact, the internet was still pretty much in its infancy.
I agree there's a lack of ways to develop directly on ARM, but that should change pretty soon with the Raspberry Pi, as you point out, or any other such platform. For the rest of us there are development boards which usually come with quite a bit of documentation on their usage. Anybody determined enough to learn that stuff will be capable of doing so, I think.
Python again is more like a magic wand that does most things for you. When I learned coding I constantly kept wondering how those calls actually worked and what they did. I just couldn't accept that, say, print just prints stuff to the screen. I was wondering how it did that. I finally gave up on questioning at the edge of the ALU.
The reason I tell this is that I think a high-level language like Python will teach you a lot about how to structure problems into smaller parts and will improve your "algorithmic thinking", but you'll still be pretty clueless about what's actually happening behind the code.
And exactly that sort of hacking on a lower level is what comes to my mind when I think of computers like the Apple II, which were basically just glorified calculators by today's standards.
I had a C64. I "programmed" in BASIC. Which means I typed in programs from the manual (which was in German, and I couldn't understand it) and changed stuff to see what the effect would be. I never went into 6502 assembler (I had no books for it), but I learned that I wanted to be a programmer, and that was what was important. When I got a PC I tried QBasic, Turbo Pascal, Assembler, C, Delphi, and everything else. Distance from the metal was never an issue for me. The issue was the idea I had, and the end product I wanted to achieve. Hell, I wrote a small logic game in Turbo Basic that was one huge for loop (with a counter that went to 99999999 or something like that, because I didn't know of loops without a counter) with 5 screens of nested if-then-elses inside. It was about guessing words letter by letter, and I didn't know about arrays, so I had 5 variables a1$, a2$, a3$, a4$ and a5$ for the letters.
It was horrible, but I didn't know it. And it worked. I showed it at school, and to my parents, and I was a wizard. The methodology and style conventions came later.
I think of 8-bit computers as gateway drugs that invited people to become programmers. Currently the closest things we have are JavaScript+canvas, or some scripting languages in the game modding scene.
Nobody[1] really writes assembler professionally today anyway. It is too hairy now, and we have better compilers. Don't push it on kids.
[1] by nobody I mean almost nobody, and yes, ARM assembler is better, but still, show me a big application written in it
Sadly, I'm not too young to know. The Apple II had tons of documentation, including its schematics, for Christ's sake, and a built-in disassembler so you could inspect anything in the ROMs. It came with two versions of BASIC -- Woz's integer-only BASIC and a licensed Microsoft BASIC. Over the years I had one, I also wrote programs in Pascal, C (with a Z-80 card), Forth, and 6502 assembler. Plus, you could wire crap directly into the game port's A/D converters; it was our generation's Arduino. So yeah, it was a pretty great box for developers.
> There was no internet as we know it today
It was the age of BBSes. I spent thousands of hours trawling random BBSes for bits of knowledge. There were also CompuServe, The Source, and Delphi -- today you'd think of them as AOL-like walled gardens. (In the early days AOL was considered just a CompuServe clone with a GUI front end.)
> 6502 assembler... not like that's a trivial task
I wrote an Apple ][ game in 6502 assembler for Compute! Magazine when I was 15. (Yes, back then, software was distributed in magazines... people would type the code in...) I urge you to find an online Apple ][ simulator and try it. It's crazy and fun. But yes, it's like going from the big lego blocks to the tiny lego blocks. You say "Python will teach you a lot about how to structure problems into smaller parts" -- writing a simple game in 6502 assembler will teach you (force you) to structure your code into even smaller parts.
I personally like bigger lego blocks and the higher-level abstractions we have today. I think they make me ridiculously more productive, and now that's what I crave (productivity = more time spent with my kids), whereas when I was younger I loved digging into the internals just because they were there.
> Awww hell. Now it's on
In a sense all computers are "just calculators". That statement wasn't meant to mock the Apple II in any way. After all, modern computers can't do "more" in terms of computability. I was mainly referring to how instruction sets were smaller back in the day. Looking at the instruction set of an x86 CPU really makes me kinda dizzy sometimes. The 6502, on the other hand, has some 56 instructions (if my quick googling proved correct).
I do like the bigger lego blocks as well. They're great for doing any actual work. I'm also not trying to force assembler onto anyone. But as you point out, digging into the internals is a whole lot of fun and a very interesting journey through the world of computing. It kind of feels like discovering the very essence of it, in a way. As a bonus, it teaches you a whole lot of things that might be useful one day.
Bottom line, even with today's powerful languages it's never wrong to dig a bit deeper, if just for experience and insights.
The problem with JavaScript is similar to the problem with BASIC.
The wise elders back then tried to warn us that if you learned to program with BASIC it would take years (if ever) to unlearn those bad habits, but we ignored them. The result is thousands upon thousands of programmers who produce nothing but spaghetti code.
Now we have an entire generation of programmers for whom statically typed languages, and compiled languages in general, are completely alien. There is a time and a place for dynamic typing (just as there is a time and place for GOTO), but to start with a language like JavaScript inculcates habits that will take years to undo.
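To give one tiny, illustrative example of the habits in question (my snippet, not the parent's): JavaScript silently coerces types instead of complaining, so mixed-type mistakes sail straight through:

    var total = "10" + 5;   // "105" -- the string wins; silent concatenation
    var diff  = "10" - 5;   //  5    -- same operands, but now it's arithmetic
    console.log(total, diff);
    // A statically typed, compiled language would reject both lines outright.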
> The wise elders back then tried to warn us that if you learned to program with BASIC it would take years (if ever) to unlearn those bad habits, but we ignored them. The result is thousands upon thousands of programmers, and the creation of every interesting company from Google to id Software.

That's not a bug, it's a feature.
Unfortunately there's a gap between popular dynamic scripting languages (PHP, Python, Ruby) and popular static compiled languages (C++, Java, C#), where some imperative, statically and strongly typed, garbage-collected and natively compiled language should exist. For a first language, I think it shouldn't force any particular paradigm on you, nor require you to understand things at the C level of abstraction.
Legacy VB almost fills this gap, but unfortunately the syntax is a little embarrassing. ActionScript is perhaps closer.
It is not. It is closer to our generation's HyperCard (but not nearly as cool).
The iPad COULD have been our generation's Apple II if it came bundled with an un-sandboxed Xcode iOS app and Terminal... dreams of an alternate universe.
And? He's not ordering anyone around, he's urging him to consider others may find it useful. There's a world of difference between that and a direct order. Sorry you can't see that!
The post has absolutely no mention of the quality of JavaScript. It is arguing that web technologies are 'good enough' for an extremely large class of problems, and that they have massive advantages due to a lower barrier to entry and ubiquity.
You might want to argue those points, but this is a complete strawman.
He's not telling Carmack to use a web browser scripting engine. In fact he specifically says "Nobody pretends that the next AAA-title will be written in JavaScript."
You don't have to write JS though; just use something that compiles to it.
> And unsecured access to the graphics stack is a terrible idea. Flash already randomly causes my GPU to crash
Letting only safe operations through to the driver is the biggest part of a WebGL implementation's job. Hopefully HTML5, WebGL, WebRTC etc. can replace Flash. I'd much rather trust those jobs to Chrome than to Flash. (Flash provides accelerated 3D too these days, btw.)
There's lots to be done still, and you can score bug bounties from Mozilla and Google if you manage to poke holes in Chrome/Firefox :)
Talented people like Carmack have the potential to make things less crap, if they took an interest. If they don't take an interest, things will remain the way they are for a lot longer. John Carmack should join the Chrome team.
One person, even Carmack, is no match for standards bodies like the W3C. Google itself is already trying to push different web technologies like Dart and WebM, but it's a very slow burn. I wouldn't say there is a lack of vision or new ideas.
>If you are already disagreeing with me, then anything I say here won't change your mind, because you've already heard the arguments before and built up a repertoire of responses.
Really? Ever stopped to think it's you that is biased, so much that you pre-emptively null any possible responses with the above cheap trick?