John Carmack working on Scheme as a VR scripting language (groups.google.com)
555 points by minikomi on July 1, 2015 | 346 comments



Naughty Dog (Uncharted, Crash Bandicoot, Jak and Daxter) uses PLT Scheme (a LISP variant) and used "Game Oriented Assembly Lisp" (GOAL).

http://www.slideshare.net/naughty_dog/statebased-scripting-i...

https://en.wikipedia.org/wiki/Game_Oriented_Assembly_Lisp

https://news.ycombinator.com/item?id=2475639

https://en.wikipedia.org/wiki/Racket_(programming_language) (formerly named PLT Scheme)


They switched to Racket when moving to the PS3. Here's a talk discussing the use of Racket in developing The Last of Us: https://www.youtube.com/watch?v=oSmqbnhHp1c


"Racket" is the new name for PLT Scheme.


Yup! I meant that they switched from GOAL to Racket for the PS3.

Here's a summary from a presentation at the CUFP workshop (likely a shorter version of the slides above):

  Dan Liebgold from Naughty Dog Software in Santa Monica then came on stage with the
  first gaming related talk at CUFP. They produce the popular Uncharted game series for the
  Playstation, which is famous for its complex and interactive scripted scenes. Dan described
  modern game development as a major production effort where, roughly, artists produce
  data and programmers produce code.
  Naughty Dog has a history of using various Lisp dialects to handle the code and data in
  a unified way. But when making the jump from the Playstation 2 to the Playstation 3, they
  decided that maintaining a custom Lisp-based game development system was too costly,
  and instead dedicated their efforts to rebuilding the tools, engine, and game in C++ and
  assembly language.
  This decision left no scripting system for gameplay and, more importantly, no system
  for creating DSLs and the extensive glue data that is typically required to develop a major
  video game. There was no off-the-shelf scripting system that fit the stringent memory
  requirements in a Playstation 3, and no language that would allow rapid DSL creation
  that fit into the existing tool chain.
  With a bit of naivety, a penchant for the Scheme language, and a passion for functional
  programming techniques, the team dove in and put together a system to fill in the gaps!
  They used mzScheme, which can compile to fast native code. Dan reported that the results
  have been very good, but not without issues. In particular, garbage collector performance
  sometimes led to manual tuning being required, and build environment integration was
  tricky. Syntax transformations and error reporting led to confusion with new programmers
  too.
  On the other hand, the functional nature of the system was a big win, as it allowed
  them to flexibly distill game data down to just the right form to embed into the resource-
  constrained run-time environment. The final result is a system where programmers, artists,
  animators, and designers are productively programming directly in an S-expression Scheme-
  like language. Dan closed his talk by wowing the audience with the trailer for the game,
  which has now been released and is garnering extremely positive reviews.
anil.recoil.org/papers/2011-cufp-scribe-preprint.pdf


> they switched from GOAL to Racket for the PS3.

That's not completely true. GOAL was their replacement for C. When they were bought by Sony, they were required to switch to C++ so their engine team interoped better with the rest of Sony. GOAL was low-level like C (and also manually GC'd), but macros allowed them to code at a much higher level and reduce boilerplate. This actually makes for a safer low-level coding environment as a lot of little mistakes simply don't happen.

Since they love lisp and Sony didn't specify a scripting language, they decided to write a bunch of libraries on top of racket.


LuaJIT is, by far, the best option for scripting in games/RT, thanks to the incredible JIT performance and its incremental GC with deadlines.

But there's something about Lisp: once you start playing with it, you want to keep using it more and more. Suddenly the classic Ruby/JS/Python/Lua language design feels boring and stale (ironically, given the age of Lisp).

After getting my feet wet in v2, I'm doubling down on Scheme for The Spatials v3, this time using S7 Scheme, a true gem of an interpreter (first class environments, hook on symbol-access for reactive programming, super extensive C FFI support, etc.)


Going from Python to Lisp to Racket was an absolutely mind-blowing experience.

It literally changed my life, my perspective on programming, even what jobs I sought out.


Could anyone comment on the Lisp --> Racket mind-blowing experience? I have worked with Lisp and Clojure, and most of the big concepts are shared. What does Racket have on top of (Common, I assume?) Lisp?


* macro/module system with phases ( https://www.cs.utah.edu/plt/publications/macromod.pdf )

* submodules ( http://www.cs.utah.edu/plt/publications/gpce13-f-color.pdf )

* Languages as modules (i.e. the #lang mechanism ) ( http://www.ccs.neu.edu/racket/pubs/pldi11-thacff.pdf )

* Documentation system that links identifiers to documentation (respecting the scope)

  ( http://docs.racket-lang.org/ )


It is simply that I found Racket to be the most pleasant and inviting to use. An all-in-one toolset, with a large and thoroughly documented standard library, and some of the best books on programming there are.

As a sometimes-aspiring language designer who'd already done one esolang before coming to Lisp, the language dev tools that Racket provides were also far too enticing to stay away from.


> What has racket on top of (common, I assume?) lisp?

Racket is from the Scheme family rather than the Common Lisp family. (Racket used to be PLT Scheme; the name was changed because Racket isn't strictly an implementation of the Scheme standard, though it includes such an implementation among its bundled languages.)


Started coding in the mid-80's and went through a plethora of languages, OSes, frameworks, you name it. Learned ML when Caml Light was new and got a third prize in a logic programming contest between universities.

Nowadays I care mostly about JVM, .NET and first class languages from OS vendors.

Why? Not all of us have the luxury of moving around just to work with the languages, or the skilled teams, we would like to.


This is primarily why I'm focusing on strengthening my JavaScript skills... There are tons of start-ups looking for back and front-end devs. Your comment also makes me want to get back into Java and C#... I've taken the time to learn Haskell, but I know realistically I'll never get a Haskell job (at least here in Quebec...).


> It literally changed ... what jobs I sought out.

Why?


Same reason I still hate doing dishes at home after using a professional-grade dish machine at work for a decade.


Sir, this comment just made my day. I am going to screenshot it and put it up in our office for when anybody asks why I use mostly Clojure for my work.


I never understood why people consider lisps superior, mind explaining exactly what changed your mind?


It's not that there is anything particularly special about what you can do in Lisp compared to what you can do in, say, C#. It's that Lisp provides facilities for metaprogramming that most other languages lack. Most other languages make metaprogramming hard enough, or require that it all be done at runtime, that they discourage the concept in general. Lisp creates a culture around metaprogramming that fundamentally changes how you approach programming forever after.

You stop writing programs. You start writing programs that write programs. It's a lever that multiplies the power of the programmer. That can be both good and bad. For a hacker coding in earnest on her own, it can be extremely good.

Yes, if you are the type of beginner or intermediate programmer that still struggles to just write programs, advanced metaprogramming in any language is not for you. If you're an intermediate programmer looking to become an expert programmer, learn Lisp, where metaprogramming is easy, and take the lessons on to whatever other languages you eventually end up using.

That's what people mean.
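
To make that concrete, here is a minimal Common Lisp sketch of "a program that writes a program" (WITH-TIMING and the names inside it are made up for illustration, not from any particular library):

    (defmacro with-timing (label &body body)
      (let ((start (gensym "START")))
        `(let ((,start (get-internal-real-time)))
           (prog1 (progn ,@body)
             (format t "~a took ~a ticks~%"
                     ,label (- (get-internal-real-time) ,start))))))

    ;; (with-timing "query" (expensive-query db)) expands, at compile time,
    ;; into the LET/PROG1 boilerplate above -- code you'd otherwise repeat by hand.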


I've never had to change my mind about Lisp -- but to me, Common Lisp's condition system kicks the crap out of every other language because it allows you to handle conditions without unwinding the stack.

And whenever you get dropped into the debugger, you can edit and reevaluate the broken code or values or whatever -- and then continue on with the task.

In other words, you can begin a task with imperfect code -- and refine it as you go along -- figuring out how to handle corner cases as you go. All without starting the task over from the beginning.

I know of only one other language that allows you to explore a problem space like that.

Besides, the parentheses make the code less jagged-looking, and jagged-looking code, IMO, is more tiring to look at. Lisp is simply prettier.


Smalltalk also drops you into a debugger on an error, letting you inspect the state of the system, correct your code, and resume. Check out Pharo Smalltalk.


TIL.

I have edited my comment to account for smalltalk.


Smalltalk probably supports this as well, but what's cool about Common Lisp's condition system is that it doesn't depend on user interaction or a debugger to work. Conditions let you separate the code making the decision about how to handle an error from the actual error handling. The decision doesn't have to involve an interactive debugger; your code can make the choice about which restart (unit of error-handling code) to invoke. The common use case is that you install various restarts as you go down the call stack; then, when an exception happens, control goes up the stack to the appropriate decision code, which selects a restart, and control goes back down the call stack to the selected restart block.

It's an incredibly flexible system that makes exception handling much more useful.
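
A rough Common Lisp sketch of that flow (all names here are invented for illustration):

    (define-condition bad-record (error) ())

    (defun parse-record (line)
      ;; Restarts are installed on the way down the call stack.
      (restart-case (if (valid-p line)          ; VALID-P, BUILD-RECORD: placeholders
                        (build-record line)
                        (error 'bad-record))
        (skip-record () nil)                    ; restart: drop this record
        (use-value (v) v)))                     ; restart: substitute a value

    (defun parse-file (lines)
      ;; The decision code lives up here: on BAD-RECORD it picks the
      ;; SKIP-RECORD restart, and control goes back *down* to it -- no
      ;; unwinding past the point of the error.
      (handler-bind ((bad-record (lambda (c)
                                   (declare (ignore c))
                                   (invoke-restart 'skip-record))))
        (remove nil (mapcar #'parse-record lines))))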


Not a lisp user myself, so I can't speak firsthand, but Paul Graham has several essays about it. http://www.paulgraham.com/lisp.html


Can languages with other syntax have the same properties?


Tcl, I hear, has string-based homoiconicity. Otherwise, I've yet to see proof that you can get an actual, usable, and not incredibly annoying macro system working without turning your language into a Lisp. People keep trying, but it's always some inelegant subset bolted onto the core syntax.


Julia is homoiconic and, of course, has lisp-style macros. It has a syntax similar to something like array fortran.

http://julialang.org/


I love using Julia, but for me writing Julia macros came with a steeper learning curve than in Lisp / Scheme, due in part to the syntax. I kept wishing Julia were a layer on top of a Lisp, so that I could just write Lisp macros. (I know the parser is currently in Scheme, but you can't exploit that easily.)


I tend to end up writing `Expr` objects directly when I'm building larger macros as I find them much easier to reason about. It's clearly not as convenient/clean as Lisp though. (David Moon actually sent a PR last year to make hygiene much easier.. unfortunately it got stuck for a while, but the PR was recently rebased and hopefully will be merged eventually).

Regarding the learning curve: we rewrote the first half of the Metaprogramming section of the manual a few months ago to try to provide a more gradual introduction, especially for people without a Lisp background. Objectively I don't know if the change has helped, but I tell myself that the number of macro-related questions has gone down :)

We would really like to make these tools accessible to people both with and without a formal CS background (probably the majority of Julia users will fall in the latter category). So, if you have any suggestions for doc/pedagogy improvements in this area, I would be very happy to hear them!


The obvious solution is to write a julia macro that implements lisp macros as a DSL ;)


No need, Julia comes with a builtin lisp.


Dylan? Then again it is a Lisp with Algol syntax.


I believe Dylan used to be a Lisp with Lisp syntax.



Thanks for the link. The page on that site that gives the history including the original lispy syntax is here:

http://opendylan.org/history/index.html


We talk sometimes about having an additional reader to support another syntax ... but so far, it isn't something where someone's actually volunteered to step up and help. It'd be interesting to play with some ideas from David Moon's PLOT as well ...


Take a look at Converge, Nemerle, TH and many other similar languages - they're doing just fine without any homoiconicity. All you need is a decent quasiquotation.


Thanks for the pointers, I'd never heard of those languages before.

I am actually looking at the Converge documentation on macros now, and I found the perfect quote to highlight the problem I see with those approaches:

Quasi-quotes allow ITree's to be built using Converge's normal concrete syntax. Essentially a quasi-quoted expression evaluates to the ITree which represents the expression inside it. For example, whilst the raw Converge expression 4 + 2 prints 6 when evaluated, [| 4 + 2 |] evaluates to an ITree which prints out as 4 + 2. Thus the quasi-quote mechanism constructs an ITree directly from the users' input - the exact nature of the ITree is immaterial to the casual ITree user, who need not know that the resulting ITree is structured along the lines of add(int(4), int(2)).

That is, quasi-quotes take some code and turn it into a tree object. This tree object can then somehow get compiled into actual code later. Compare that with Lisp approach, in which code is data. You don't think about the code as something separate from the tree that represents it. The code is the tree. There is no ITree object.

It may seem like just "semantics", but I find this to be a significant cognitive difference.
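
A trivial Common Lisp illustration of the difference:

    ;; The expression is just a list; inspecting and rebuilding it is plain
    ;; list manipulation -- no separate ITree-style API.
    (defvar *form* '(+ 4 2))

    (first *form*)                   ; => +
    (eval *form*)                    ; => 6
    (eval (cons '* (rest *form*)))   ; => 8  (rebuild the tree, run it again)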


> Compare that with Lisp approach, in which code is data.

It's not any different. It does not matter if your `(,a ,b) is compiled into (list a b), or your ` \a\ = \b\` is compiled into make_binary('=',a,b) - both are constructors, essentially.

Take a look at what I'm doing with just a quasiquotation here: https://github.com/combinatorylogic/clike


Other languages can technically have homoiconicity, but there's something to be said for the simplicity of Lisp forms.


The property of homoiconicity is shared by all Lisps, Forths, and Prolog amongst others: https://en.wikipedia.org/wiki/Homoiconicity


Yes. See Smalltalk for a language with object-oriented homoiconicity.


That's not really homoiconicity though, is it?

Smalltalk source code does not appear to be expressed in terms of a Smalltalk data structure. Whereas Lisp source code is expressed as lists of lists.


Sure it is. All parts of a Smalltalk program are present as objects that can be manipulated by sending them messages. Classes, methods, variable bindings, closures, stack frames, even individual messages are objects.

As a practical matter, methods also have a textual representation, but that's not strictly necessary. Ten or fifteen years ago, the Squeak method browser had a mode where you could write Smalltalk code using tiles, kind of like in Etoys, no parsing necessary. That was a hassle to use, though, so people stuck with writing methods using text.

By the way, Lisp has the same sort of "impurity". S-expressions are very simple, but they are syntax. It's surely possible to create a tool that would let one construct cons cells in memory directly, but that would be a hassle. Instead we use parentheses.


The 'iconicity' part of homoiconicity refers to the textual representation.


Maybe not the same, but I feel like Nim (http://nim-lang.org/docs/manual.html#templates) can have comparable power of redefining itself. But I'm absolutely not an expert in any Lisp, nor in Nim, so I can't really tell.


Yes, take a look at Dylan. But they almost never do.


Dylan's approach works because it has lisp at its core.


Any language with compile-time metaprogramming would do.


After learning about Elixir (http://elixir-lang.org/) and playing with it some, I now want an Elixir job. (I'm skilled in Ruby, and back in the day Ruby made me want to get a Ruby job... which I finally did.)

One nice thing about Elixir for Lisp lovers is that you get "real" macros (full AST access within a quoting context) AND actual syntax. I am not sure there are any other languages which feature that combination currently.


You get full AST access in Erlang as well. I don't know how it is in Elixir, but in Erlang you don't want to touch that feature with a ten-foot pole. It's hairy and incredibly annoying to work with. I'm having a hard time imagining how it couldn't be, without homoiconicity and with that "actual syntax". I've been reading about macros in Scala recently, and they seem to suffer from the same problem - they just don't fit well with the rest of the language.


> but in Erlang you don't want to touch that feature with a ten foot pole. It's hairy and incredibly annoying to work with

It's pretty much the exact opposite in Elixir. The only thing you really have to wrap your head around is the switch between the quoting context and the unquoting context... which is pretty much no different from understanding macros period. The definition syntax looks just like a method definition, except it's "defmacro" instead of "def", and the macro body receives an AST instead of (or in addition to) your usual arguments. But I'm probably not doing it justice...

http://elixir-lang.org/getting-started/meta/macros.html

https://pragprog.com/book/cmelixir/metaprogramming-elixir

Here Dave Thomas creates a simple "returning" macro, you can just watch the screencast if you're feeling lazy: http://pragdave.me/blog/2014/11/05/a-simple-elixir-macro/


> AND actual syntax

What does "actual syntax" mean here?


Yes, I was riffing on Lisp not really having a "syntax."

The fact that Erlang's syntax is (disclaimer: subjective) awful just adds further grist to the Elixir mill, assuming you think Elixir's syntax is (disclaimer: subjective) sweet.

A conclusion a lot of people seem to be coming to.


That's their loss, then.


I don't see a loss. I see a win.

In Elixir, I have the full power of Lisp-level macros, in a functional immutable pattern-matching language, with a ridiculous level of concurrency (you can spawn a million or so processes in a second on your typical laptop), hot software upgradability (https://www.youtube.com/watch?v=96UzSHyp0F8 for a cool demo), access to the entire repository of already-existing battle-tested Erlang code...

... AND it's readable. :P

Lisp with its powerful macro facility has had literally dozens of years to find acceptance and still struggles (argumentum ad populum notwithstanding, a userbase with critical mass brings a plethora of other advantages). Ruby found enough of a niche that I think there is something to be said for Ruby's style of syntax. Elixir gives you both, and then some.


I was talking about Erlang and Elixir, not Lisp and Elixir. I don't need an introduction to Erlang/OTP, as I've used it for quite a while. You sound a lot like you're in the hyper-enthusiastic newbie phase, though.


I... guess I am. Is that OK? :) I like Erlang too... but I was one of the folks for whom the syntax turned me off originally. I can't explain why, especially if you're one of those developers (bless their pure-engineering hearts) who thinks syntax is irrelevant once you grok it. The best I can explain it is that some brains interpret computer code as a very abstract form of writing and some don't (or don't need to), and I may be one of the former, and that causes some syntaxes to "feel better" or "feel worse". It's... not very rational, sigh.


Presumably a riff on "Lisp has no syntax"


I actually thought it was a riff on Erlang's syntax, which is just as annoying a complaint in 2015.


It does have a syntax. It's just hard to find in amongst all the parenthetical shrubbery.


the thing i remember most from comp.lang.lisp was someone defining a truly great language as one that could get you to take a job simply to work with it, irrespective of the actual problem being worked on.


Can confirm.

I once took a job working on a product I had a moral objection to -- on a platform I despise (windoze embedded) -- simply because I'd be working in Common Lisp.



Funny to see that latter link on this message board, of all places.


Can you explain why? I have no experience with Lisp/Racket and I'd love to hear about your experience.


Not OP, but I find this text a good introduction to one of the most significant things you can get enlightened by when learning Lisp: http://www.defmacro.org/ramblings/lisp.html.


Learning lisp is enlightening, but to claim that it's that much more productive than some of the other well designed languages is a stretch.

- Lisp macros are powerful, so the core of the language can be kept simple. However, many languages take an alternate approach and codify the most common programming patterns into their specification. The result is a compromise that works for the majority of cases and is better understood than custom macros.

- Homoiconicity is elegant, but somewhat overrated in practice. Declaratively expressing intent does not require homoiconicity; you can do that in JSON or XML if you agree on a format. Now with Lisp, code being data, you can be infinitely more expressive; but at the same time, inferring intent out of complex code-as-data is essentially writing a very intelligent macro/parser. There's a cost to it.

- If you're not really gathering the intent from code-as-data, there are ways to do eval() in other languages as well.

- Lisp has not succeeded on a relative scale. Let's not discount that.

- Compilers can optimize well known constructs better than equivalent macros in lisp.

So again, learning lisp is a great idea. But there isn't a one programming paradigm that's universally better than others.


> but to claim that it's that much more productive than some of the other well designed languages is a stretch

Given that you can turn Lisp into any of those "other well designed languages", it's not a stretch at all.

> and is better understood than custom macros.

What can be easier than macros?

> but somewhat overrated in practice

True. You can build a decent meta-language without homoiconicity, all you need is a quasiquotation.

> there are ways to do eval() in other languages as well.

Eval is runtime, macros are compile-time. Huge difference.

> Compilers can optimize well known constructs better than equivalent macros in lisp.

No. Macros can optimise any way you fancy. There are no limits.

> But there isn't a one programming paradigm that's universally better than others.

A paradigm which contains all the others is exactly this.


> No. Macros can optimise any way you fancy. There are no limits.

For example, the Racket compiler does not need to know about the type specializing optimizations that Typed Racket makes possible.


> - Lisp has not succeeded on a relative scale. Let's not discount that.

Is there a clear reason for this? I've only ever heard good things about lisp.

My impression, as a hobbyist programmer, is that Lisp appeals to people who have a deep intellectual curiosity about the way programs work. It doesn't seem to appeal to the larger pool of programmers who are looking for a language they can pick up in a straightforward way so they can either get a job or build a project they've been thinking of.


I fell in love with Lisps and FP precisely because they were an easier, more straightforward way of just getting the job done than the alternative.

How many times have you written a dozen lines of for-loop that could've been one map/reduce? How many times have you written a whole page of Object { this.foo = ... } just to add the simplest of new features?

Literally the reason I got out of programming after high school almost 15 years ago and wrote it off as 'not for me' was that kind of tedium, and learning Lisp and FP were the point in my return when I said 'Oh, wait, actually this is pretty great; where the hell was this when I was a kid?'

Lisp didn't take off because 1) home-computer-ready implementations were largely out of reach for three decades, and 2) Lisp and FP both were embracing the importance of expressive power during an era in which most programming still worshiped doing things the hardest way possible. Shit, when I was a kid, you weren't a 'real programmer' unless you did everything in assembly. Then it was C above all, to be followed by EnterpriseFactoryJavaFactoryFactories.

By the standards of most of the programming world, where there are still real bosses who grade coder performance in KLOC, Lisp is 'wrong'. But pumping out thousands of lines of repetitive boilerplate is not equal to efficiency, it just looks like it to a work culture that only understands output of work rather than quality of work. If programmer A takes 1 hour to solve the problem with 100 LOC, and programmer B thinks for 45 minutes and then solves the same problem with 4, who's the most efficient in that scenario?

And more to the point, which of those two work environments do you want to sign on for?


Same here. I never understood Java interfaces, abstract classes, and a ton of other "features", but picking up Clojure was a breeze. I don't understand why complicating things that are supposed to be simple helps you in any way. On top of that, I have seen several cases where Java programmers tripped over their own code because of complexity they thought they understood, except in a particular case it was doing something other than expected.

Reasoning about Clojure (LISP) code is always easy because if you follow best practices you have small functions with very little context to understand.

On top of that, I see a LOC ratio of 50:1 (worst case even higher) for Java : Clojure code that does the same thing. Usually people get triggered and ask why it matters, but in reality less code is better for everybody: easier to understand, less chance of errors, etc. Correctness was lost long ago for the majority of Java developers; just put out a survey and you can see it for yourself.

It is also pretty common practice not to handle exceptions well and just let a DNS-not-found error explode as an IOException, and good luck tracking down what caused it (literally happened to me).

I understand that the average Java dev does not see any value in Lisp (Clojure), but it is silly to expect that the average of any group is going to lead the scientific advancement of any field, including computer science.

One tendency you can see, if you walk around with open eyes, is that people who have spent significant time developing procedural code in an imperative language understand the importance of functional language features and the power of Lisp. One can pretend it does not matter; check back in 5-10 years and see how much this changes.

https://twitter.com/id_aa_carmack/status/577877590070919168

https://www.youtube.com/watch?v=8X69_42Mj-g

https://www.youtube.com/watch?v=P76Vbsk_3J0


The closest I got to Xerox Parc environments was Smalltalk VisualWorks and Oberon (Wirth made it based on his Cedar experience).

Then thanks to my curiosity I delved into the Xerox's and Genera documentation.

It is sad that a PDP-11 and VMS descendant won the mainstream.

How much better could computing be if those behind those systems hadn't failed to bring them to the masses?

However, environments like the JVM, .NET and their tooling bring us somewhat close to it.

Also Swift with its Playground is helping new generations to rediscover this experience.

So maybe not all is lost.


I take it you're probably not a huge fan of Golang? :)


I would imagine not, though the parent can speak for himself. I agree with what has been said before, that Go is a bold step backwards, just another language to replace large amounts of Legacy Enterprise Code with (sometimes) slightly less large amounts of (Soon To Be Legacy) Enterprise Code. Go has going for it corporate backing and a good community, but on the technical merits alone, there is a better language for every task.


I wish I was. That little gopher is adorable.

Rust, on the other hand, gives me palpitations.


Is golang very verbose?


I don't know about "very", but it's pretty verbose and very imperative. You have to do a lot of repetition, often straight up code copy+pasting, in some cases.


In a way it's similar to frameworks. Frameworks which are more popular try to make choices for the user (like Rails). As an end-user I clearly wanna focus on my tasks, rather than choosing a toolset or perhaps building one myself.

Lisp is minimal and abstract. That's appealing to a different set of people, who aren't satisfied with off-the-shelf abstraction levels. It's also fun and challenging to work at that level, though IMO it's not always going to translate to better productivity.

For me, learning assembly and going through the 80386 reference manuals were more rewarding in terms of understanding how programs work. Sorry I have no specific insight to offer on the question you asked.


Lisp was the hot new thing in the mid-to-late 80s, when the AI Winter hit.

https://en.wikipedia.org/wiki/AI_winter

When AI couldn't live up to the hype, funding dried up. A lot of that funding was driving the companies and research projects that were doing major Lisp development. After the AI Winter, Lisp was strongly associated with the unmet promises of AI and thus gained a reputation as a poor choice for serious projects.

It's never really recovered.


Around 2000 a new generation rediscovered Lisp. SBCL, a rethinking of CMUCL with a simplified implementation and build process, was released in December 1999. From then on various implementations were improved. Clozure CL was released on multiple platforms. The commercial Lisps were moving to the then-important operating systems and architectures.

The hype moved to Clojure, Racket, Julia and other Lisp-like languages. Core Lisp may not have the same depth of commercial activity as in the 80s, but generally the implementations are in the best shape they've been in for two decades. There are still lots of open issues, but from the Lisp programmer's standpoint, it's an incredible time.


It’s the Lisp Curse: http://www.winestockwebdesign.com/Essays/Lisp_Curse.html

Lisp is so powerful that problems which are technical issues in other programming languages are social issues in Lisp.


lisp makes you an asshole programmer. you're encouraged and enabled to write your own language for each problem, thus isolating you in a world of your own views and ideas. it's a babelian tar pit, luring programmers to their doom.

being your own tin pot dictator is quite alluring. you get to pull off great feats and neat hacks to get code working. to control and manipulate the code to allow you to write what you want. every new macro and construct shapes the product in your own image and ideals, subsequently alienating other programmers.

it's like these language revisionist cranks who want to replace english with their own little concoction that's just ever so perfect and logical. a complete ignorance of social factors.

anecdotally, I know of large scale codebases and products in simpler, less elegant languages, meanwhile lisp seems to be popular with the lone hacker aesthetic.

eventually, with enough practice, you get to the smug lisp asshole stage.

this is where you wonder why lisp is unpopular, or fragmented, but assume that it's simply too good for the populace. Classics like 'worse is better' struggle with the notion that lisp maybe isn't that good. Sometimes you get a naggum complex and trot out sapir-whorf. Other people are terrible and that is why they don't use lisp.

it can't be that lisp isn't a great idea. or macros aren't a great tradeoff. at least the ruby community is coming to terms with monkey patching coming at the expense of library conflicts.

lisp is a strange beast. a simple tool that encourages complexity. purity over utility. a perfect goo waiting for the next hero to shape it and return to the mortal plane with their new, perfect macros.

http://forums.somethingawful.com/showthread.php?threadid=348...


> a complete ignorance of social factors.

Maybe that's why I like Lisp so much. Because "social factors" are so frikkin' annoying and irrelevant and I feel the world would be so much better for everyone if we stopped paying so much attention to them as we do now.


To formulate this a little more nicely, I might say instead that there is a real need for "intimate" languages, just as there is a need for "collaborative" languages.

As an example, the shells on my personal machines are often customized beyond the comprehension of anyone who isn't me, with tons of two-letter aliases and bash functions, cryptic environment variables, and a $PATH that's several terminal lines long, filled with labyrinthine symlinks and oneliners that I've accumulated over the years. Many people have similarly elaborate and impenetrable emacs configurations.

That's fine, since this is my personal environment, but at work (I'm a sysadmin, more or less) I'm still able to use more-or-less bash, and even write portable shell scripts that eschew bash-isms. Similarly, all that horrible e-lisp powering super-personalized workflows doesn't prevent someone from writing an emacs mode that they can share with others, the point being that a language that enables customization is great, because you can always just not do that and write code that others will find comprehensible.

Conversely, if your language forces you to write in a collaborative style, you can't gain efficiencies in your private use of it.


> To formulate this a little more nicely, I might say instead that there is a real need for "intimate" languages, just as there is a need for "collaborative" languages.

That's a... way of putting it I've never seen before. I'll remember the concept of "intimate" vs. "collaborative" language for the future.

Personally, even though I write a lot of Lisp and live inside Emacs, my environment seems to be quite... standard. The "collaborative" mindset is emphasized in pretty much every programming book out there, and I must have acquired this kind of weird fear of overcustomizing my environment thanks to it.


I'm not a lisp user, but I've used xml + xslt to generate xslt that processes xml to xhtml and I liked it ;)


"own languages" are much more approachable for the others than the "own libraries". For very obvious reasons.


My naive reasoning as to why is that a lot of people start by learning C-like languages and don't see the need to learn something as different as Lisp. As Lisp and its descendants were never really the dominant language, it was never the first type of language most people learned. Now that many mainstream languages have progressed to incorporate more and more Lisp features, it's becoming less foreign to many devs, and its popularity is increasing and is now higher than I think it ever was.


>> Lisp has not succeeded on a relative scale. Let's not discount that.

> Is there a clear reason for this?

Yes. In the 1980s AI was the new shiny, and at that time Lisp was almost synonymous with AI.

A bunch of people over-promised when it came to AI and expert systems, and failed to deliver. And people conflated the failure of the promise of AI with the failure of Lisp. Essentially, guilt by association -- people can be dumb that way.

Amusingly, once something in the realm of AI actually works -- we stop calling it AI. But one thing is for certain: the scruffies have been right more often than the neats.

Still, Common Lisp is pretty effin' awesome.


> Lisp macros are powerful, so the core of the language can be kept simple. However, many languages take an alternate approach and codify the most common programming patterns into their specification. The result is a compromise that works for the majority of cases and is better understood than custom macros.

The semantic core is kept simple. That doesn't mean Lisps don't provide constructs for common patterns, e.g. LOOP, WITH-OPEN-FILE, DEFCLASS, or Racket's for/ constructs.

Plus, a well-written macro is easy to understand, and I'd argue most macros are well written. In fact, I'd like to see a macro whose behavior is unclear in a widely used Lisp library.

> - Compilers can optimize well known constructs better than equivalent macros in lisp.

Look up compiler macros. Macros can help the compiler optimize the code.
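
As a hedged sketch of what that can look like in Common Lisp (POWER is a made-up function, not a standard one): the ordinary function stays available, but calls with a constant exponent get rewritten into cheaper code at compile time.

    (defun power (base exponent)
      (expt base exponent))

    (define-compiler-macro power (&whole form base exponent)
      (if (eql exponent 2)
          `(let ((b ,base)) (* b b))   ; open-code the common case
          form))                       ; otherwise leave the call alone

    ;; (power x 2) compiles as (let ((b x)) (* b b));
    ;; (power x n) compiles as a normal function call.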


99% of programming works like this: you pass parameters to a function/macro. If you screw up, the compiler spits an error. The details of the error message are rarely relevant, you mostly know what's wrong even before you read that message anyway.

This whole talk about macros being crazy dangerous and difficult is very misguided. Most of the time, if a Lisp compiler spits a weird message at you, you know what you've screwed up. In the 1% of cases where you don't, you apply macroexpand-1 (or the equivalent), see why the expansion doesn't make sense, and fix it. In the 1% of those cases where that doesn't help, you keep reading the source until you understand what's wrong. It's no different than debugging functions. Same rules apply.
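
For instance, with a toy macro (MY-UNLESS is invented here), that debugging step is just:

    (defmacro my-unless (test &body body)
      `(if ,test nil (progn ,@body)))

    (macroexpand-1 '(my-unless done (launch)))
    ;; => (IF DONE NIL (PROGN (LAUNCH)))
    ;; If the expansion looks wrong, the bug is visible right here, before
    ;; anything runs -- the same workflow as reading any other code.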


> But there isn't a one programming paradigm that's universally better than others.

Maybe that's why Racket supports functional, imperative, declarative, and object oriented programming. I'm sure I'm even missing a few.


Don't forget relational and logic programming. :)

http://minikanren.org


> there isn't a one programming paradigm that's universally better than others.

That's so true. I wish someone would come up with a language where a wide variety of programming paradigms, including imperative, functional, and message passing styles, could find convenient expression.

EDIT: This comment is almost an exact quote from somewhere. Bonus points to whoever can identify where.


The parent's humorous wish is granted by the wizards of MIT Scheme (and, I daresay, even more fulfilled by Racket):

"Scheme is a statically scoped and properly tail-recursive dialect of the Lisp programming language invented by Guy Lewis Steele Jr. and Gerald Jay Sussman. It was designed to have an exceptionally clear and simple semantics and few different ways to form expressions. A wide variety of programming paradigms, including imperative, functional, and message passing styles, find convenient expression in Scheme."

http://groups.csail.mit.edu/mac/projects/scheme/


I got it from R4RS. I imagine MIT Scheme also got it from there.


Maybe. Looks like R4RS got it from R3RS. :-)


I think you should give Elixir a look: http://elixir-lang.org

Bonus points: Elixir has compile time macros and an AST with a representation that is similar to Lisp.


> and an AST with a representation that is similar to Lisp

That's actually cheating, you know :). Lisp pretty much is AST. It's a fun fact about Erlang that it's not directly compiled to BEAM, but is first translated into a very strange, almost lovecraftian Lisp. I've worked with Erlang commercially for some time and ever since learning about parse transforms I kept wondering why they didn't just clean up the "intermediate language" syntax; they'd have a decent Lisp instead of a parallel Prolog.


There is, of course, Lisp Flavoured Erlang: http://lfe.io/


Indeed there is :). I used to sneak up some code in it on the job ;). I'm happy to see it being actively developed to this very day.


oz/mozart does a good job of that; it's a shame it never really caught on as a non-research language.

http://mozart.github.io/


You have Scala. They even try to shoehorn macros into it. The result is a powerful but IMO messy language.


What for? With any meta-language you can build and mix any paradigms you like.


Although Lisp itself may not have succeeded on a relative scale, Clojure (a Lisp dialect for the JVM) seems to have a fairly good foothold.


Also: writing good macros is (even) more difficult than writing good functions or modules.

Writing macros is doing language design and compiler implementation at the same time.

If you've ever cursed the error messages from a C++ compiler, think how much worse it would be if that compiler was written by your co-worker part-time as a side effect of writing rest of the code.

(I wrote clojure in anger for three years with brilliant co-workers)


> If you've ever cursed the error messages from a C++ compiler, think how much worse it would be if that compiler was written by your co-worker part-time as a side effect of writing rest of the code.

This is why you use proper facilities to write macros, ones that enable you to forge better error messages than most C compilers produce. This is also why you don't simply use something like `defmacro`, which will not let you supply syntactic information (syntax objects) with location and context information.

A good macro has a contract that constrains its use and informs you when you're breaking it, so that you don't have to rely on your coworkers to explain their macro to you.

See this[0][1] for a very detailed view on how you can provide the programmer with the tools they need to create abstractions that stretch well into the world of macros and still be able to make them usable.

You can set enforceable conditions for your macros to guide people, just like any language can statically check its syntax.

This is not an unsolved problem. What is unsolved, like many people have mentioned, is when the culture of a language (and the facilities of lesser languages) don't emphasize managing macros like other abstractions.

Using Lisps that don't emphasize more than "Macros are functions that modify lists of syntax at read time" will lead people to believe that's all there is to it. You won't have to be angry about macros if you use a language that gives people the tools to help you use them and then fosters that idea.

0 - http://docs.racket-lang.org/syntax/Parsing_Syntax.html?q=syn...

1 - http://docs.racket-lang.org/syntax/stxparse-specifying.html?...


No. Writing macros is less difficult than pretty much anything else. You just have to follow the proper methods.

Macros are, essentially, compilers. It is a very well-studied area; writing compilers is a totally mechanical thing which does not require any creative thinking.

Compilers are best implemented as long sequences of very trivial tree rewrites (see the Nanopass framework for an example of this approach). Such tree rewrites should be declarative, and you don't really need a Turing-complete language to do this. This approach is inherently modular and highly self-documenting, so you end up with nicely organised, readable, maintainable code, much better than whatever you'd do with plain functions.
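
A very small Common Lisp sketch of that style (one pass, one rule; this is not the actual Nanopass framework, just the flavour of it): fold (+ x 0) down to x, recursing over the tree.

    (defun simplify-add-zero (form)
      (cond ((atom form) form)
            ((and (eql (first form) '+) (member 0 (rest form)))
             (let ((args (remove 0 (mapcar #'simplify-add-zero (rest form)))))
               (if (= (length args) 1) (first args) (cons '+ args))))
            (t (mapcar #'simplify-add-zero form))))

    (simplify-add-zero '(* (+ x 0) (+ 0 y z)))
    ;; => (* X (+ Y Z))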


> However, many languages take an alternate approach and codify the most common programming patterns into their specification. The result is a compromise that works for the majority of cases, and better understood than custom macros.

That's not the case in my experience. It's very easy to look up the definition of a macro, or see what the macro-expansion of an expression looks like. Good luck trying to figure out what your interpreter/compiler/JIT is actually doing when the language documentation is lacking.

For Common Lisp at least the quality of documentation is also another huge advantage over any other language I've used - the Hyperspec (http://clhs.lisp.se/) is extremely detailed, concise, and easy to access (you can pull up the documentation page for a particular symbol with a keystroke, or browse through the index or by TOC when you aren't sure what you are looking for).

> Homoiconicity is elegant, but somewhat over rated in practice. Declaratively expressing intent does not require homoiconicity, you can do that in JSON or XML if you agree on a format. Now with lisp, code being data, you can be infinitely more expressive; but at the same time inferring intent out of complex code-as-data is essentially writing a very intelligent macro/parser. There's a cost to it.

Except that doing it with JSON or XML is usually two orders of magnitude more code than a macro solution would be. Now simple problems that could have been debugged with a macroexpand involve tens of thousands of lines of parser libraries (it's 2015, why are encoding errors still an issue for anyone?), ConfigurationFactories and class hierarchies for the internal representation of whatever it is you are trying to encode in JSON, and glue code to connect it to your current controller and model classes.

> Compilers can optimize well known constructs better than equivalent macros in lisp.

Compilers are very limited in the partial evaluation and domain-specific optimizations that they can do. This is not the case for macros.

> But there isn't a one programming paradigm that's universally better than others.

That's kind of the point of macros - to allow you to add new programming paradigms without having to change the base language or compiler.


> LuaJIT is, by far, the best option for scripting in games/RT

Much as we want to like it, we've been frustrated by Lua for a single reason: no preemptive threading library. Its lack of true prototype-style OO is more than a bit annoying too.


Threading I'll grant you, but prototype-based OO is practically Lua's defining feature. Could you expand?


Well, I guess that's not really fair. Obviously you can implement prototype-style OO, more or less, using the metatable and metamethod facilities. But as an old NewtonScript developer, I rather find

    foo = { x = 1, y = 2 }
    mt = { __index = bar }
    setmetatable(foo, mt)
to be crazy compared to

    foo := { _proto: bar, x: 1, y: 2 }
The uncleanliness of function declarations in Lua compared to NewtonScript also gets to me in this way. At any rate, to me, the metatable facility feels like a hack. That's not to denigrate it: it's a very flexible and clever gizmo which can be used for a variety of cool things. But to me proto-style OO languages are notable for their elegance and simplicity, and metatables expose a fair bit of wiring and boilerplate. [It also wasn't a good sign that the whole OO section in "Programming in Lua" was about "classes" :-( ]


There are Lisp-to-Lua solutions like l2l[1] or moonlisp[2].

(Shameless plug: I also have a WIP hy[3] inspired lisp->lua compiler hua[4])

[1]: https://github.com/meric/l2l [2]: https://github.com/leafo/moonlisp [3]: http://hylang.org [4]: https://github.com/larme/hua


I've looked into both l2l and moonlisp, and they were too incomplete to be usable in their current state, and haven't been worked on recently (or not much). The most current and interesting effort in this area, in my opinion, is Conspire[1], which is actually based on Terra[2], another amazing piece of software.

The decision not to go with LuaJIT was complicated, and depending on how it goes with S7 I probably won't make it again for a new game. For starters, I doubt I will delegate any per-frame, non-evented computation to the S7 side, while I could easily do that in LuaJIT. But I'm already seeing an at least 10:1 ratio in lines of code for things like building GUIs, and it's only going down even further now that I'm implementing event callbacks with lambdas and adding a reactive-like system for rendering updates.

Good work on hua! I read somewhere that once you learn a Lisp you are destined to write one. I've been tempted too :)

[1]: http://blog.duangle.com/2015/01/conspire-programming-environ... [2]: http://terralang.org/


[deleted]


Millions of developers got no clue about the power of meta-languages.




Explore them properly: http://playcanv.as/p/apIKHp7a

:-D


This is neat, how did you rig this up?


Pretty simple in PlayCanvas. It's a sphere with a camera in the centre. The texture is applied as an emissive map on the material. Took me about 5 mins.

Project is here: https://playcanvas.com/project/349830/overview/office


403 forbidden :( how does the texture get applied? Do you have to supply the texture coordinates?


Oops, it's public now.

The texture, as provided, maps on the sphere primitive created in PlayCanvas. So no need to worry about texture co-ordinates.


nah, the source is still hidden. I want to see the code! :D


I was noticing he was a lefty, then I noticed the backwards writing on the wall...


Yeah, the texture is applied to the outside of the sphere and the camera is in the middle, so it's all backwards...


Difficult to figure what parts are really curvy :)


  Render the metaverse - May 2015
It's happening Neil.


I've implemented a Common Lisp that interoperates with C++ and uses LLVM as the backend because I was motivated by the same shortcomings in traditional software development languages and tools (https://github.com/drmeister/clasp). I chose Common Lisp rather than Scheme because the killer feature of Lisp is macros, and they are easier to use in Common Lisp than they are in Scheme. Also, SLIME (the Emacs interface to Common Lisp) is one of the most powerful software development environments available, IMHO.


Unrelated to VR but still: if anyone here hasn't yet watched drmeister's talk about Clasp and his "molecular metaprogramming" project, please do. Absolutely fascinating stuff.

https://www.youtube.com/watch?v=8X69_42Mj-g

Also recently discussed on HN: https://news.ycombinator.com/item?id=9721801


What makes Common Lisp macros easier to use than Scheme's?


In short, Common Lisp has two namespaces, one for functions and macros and a second for variable names - Scheme has one namespace. So you have fewer naming conflicts in Common Lisp. In Scheme there is a lot of concern about hygienic macros that don't mess up the namespace. For more on that, here is an inflammatory but well-thought-out exposition: http://www.xach.com/naggum/articles/3236789642671151@naggum.... The book Let Over Lambda lays it out really well too.
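
A tiny Common Lisp illustration of the two namespaces:

    (defun sum (list)          ; LIST here is only a variable name...
      (reduce #'+ list))

    (sum (list 1 2 3))         ; ...while the function LIST still works fine
    ;; => 6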


Scheme macros are hygienic, which makes certain types of macros impossible (those that deliberately capture symbols). Also, depending on who you ask, hygienic macros are slightly more annoying to use.


syntax-rules macros can break hygiene: http://okmij.org/ftp/Scheme/macros.html#dirty-macros


Lack of enforced hygiene, for starters. Hygienic R5RS-style macros are nice and all that, but only for petty language extensions, and they're a noticeable obstacle if you want to implement more complex, deeply staged DSLs. It's possible to find workarounds for most of these things, but still, not having enforced hygiene in the first place is better.


No need for a special metaprogramming DSL; macros are written in plain Common Lisp. Hygiene is a non-issue since you have the GENSYM function, which gives you a new symbol guaranteed to be unique.
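
A minimal sketch of that point (SWAP is a made-up macro): the gensym'd temporary can't collide with any variable at the call site.

    (defmacro swap (a b)
      (let ((tmp (gensym "TMP")))
        `(let ((,tmp ,a))
           (setf ,a ,b
                 ,b ,tmp))))

    ;; (swap x y) expands to something like:
    ;; (LET ((#:TMP123 X)) (SETF X Y Y #:TMP123))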


What makes Slime so much better than e.g. Webstorm or PyCharm or Eclipse or Android Studio or vim-go, etc?


SLIME emphasizes interactivity. Coupled with the fact that it uses the full power of Emacs (which is a small Lisp VM, though a different dialect of Lisp), you can do things like inspect any object or package in your system on the fly, view class hierarchies of your image, get live documentation access, live disassembly of a function, incremental compilation into another buffer, and arglist inspection, and easily add amazing tools like paredit, autocompletion, inline macroexpansion utilities, and source location lookup for functions and global variables, with Common Lisp spec lookup of any symbol, all with two keypresses or so. While many IDEs provide some or even all of these features, they are rarely as coherent and easy to use as they are in SLIME. Plus, SLIME itself is more easily customizable than something like Eclipse, because it uses elisp and Common Lisp as the backend languages. A downside, though, is that learning Emacs is a prerequisite to using all the features of SLIME.


cool


This is pretty exciting! Also, I love this tweet from John Carmack last week: "The web gets assembly... and I'm using a Lisp for VR. Bizzaro world" https://twitter.com/ID_AA_Carmack/status/611279873852731392


These quotes make me cry:

> Doing VR GUI development in the native apps is unpleasant – you can gut it out in C++ with our current interfaces, but the iteration times and code structure don’t make it a lot of fun.

> “Remote development”, where the script is actually executed in an IDE on a PC, communicating with NetHMD over a TCP connection. This allows sub-second code-change to VR change iteration cycles, and the use of a debugger. There might also be a use for a related mode where a central server directly drives one or more clients.

Why can't people see that this is the same problem even for standard GUI development rather than just VR GUI development?

I would kill for these tools on my desktop ...


This is one reason why I don't mind building apps using Eclipse + Java. Swing sucks, but I can live-edit my code and see it running immediately.

If you've ever watched Notch live-coding a Java game, you've seen that he uses this extensively as well.


This is something you get so used to with Lisp that later any environment that doesn't support it seems unnecessarily and terribly constrained. SLIME + Common Lisp is a wonderful toolset.


swing was slightly nicer to work with in clojure. not actually pleasant, mind you, but more pleasant than it was in java.


Yet you constantly see Clojure enthusiasts droning on about their interactive development setup, all while poorly emulating what hot swapping did 10 years ago.

Kids.


I attended Northeastern University where PLT Scheme (now Racket) was the first programming language I ever learned in a classroom. Matthias Felleisen, the second poster in that thread, is a professor at NU and co-author of How to Design Programs:

http://www.ccs.neu.edu/home/matthias/HtDP2e/

I recommend this book for anyone interested in learning more about Racket.


I would also like to recommend these fantastic edX courses:

1) https://www.edx.org/course/systematic-program-design-part-1-...

2) https://www.edx.org/course/systematic-program-design-part-2-...

3) https://www.edx.org/course/systematic-program-design-part-3-...

Part 1 started 2nd of June so you can still start it and catch up (if you are familiar with programming concepts already then the first 2-3 weeks are easy).

Note that there are archived (older) courses on Coursera here:

https://www.coursera.org/course/programdesign

I emailed Professor Kiczales about having the course on both edX and Coursera and he confirmed that new course iterations will be on edX.

I cannot recommend the book and the course enough for newbie programmers, and even those 'new learners' who are stuck between newbie and hacker. I would say that if you know fewer than four languages, that is you.

Doing this course will be like taking the red pill. One last thing, the course may start off slow but if you stick at it the difficulty climbs and you will be well rewarded.


I was wondering what happened to part 2 of "Introduction to Systematic Program Design" on Coursera. I took the inaugural instantiation, and it was offered again, but then disappeared. Kiczales's ability to cut across the complexities of recursion and get to the point is just one aspect of his teaching style.

From a literary point of view, I find the style of HtDP a bit harder to engage with than Felleisen's other works and SICP. I wish it read less like an over-the-transom argument. A more overtly opinionated and passionate case for TDD and the rest of the HtDP methodology would be great for the third edition. One of the things that makes SICP great is that it is entirely unapologetic, and the reason it works is that espousing sound engineering principles needs no apology.


Ahh, I took the original one on Coursera, and had no idea about these edx courses (which I must now find the time to do.)

The course made a huge impression on me. In particular, the jaw-dropping moment of realising, "I just wrote a set of mutually-recursive functions, and they worked first time, and it was easy"

At work at the moment I'm mostly maintaining and extending a system that I've been working on for the last few years, and when I look at my old code, I can see instantly whether it was written before or after taking that course.


I can't overemphasize how good this course is! It is second only to the Abelson and Sussman video lectures.


A long time ago, Dave Taylor left id to create Crack dot Com. They then wrote the game "Abuse" using C++ and a lisp variant:

https://en.wikipedia.org/wiki/Abuse_(video_game)

http://lispgames.org/index.php/Abuse

Although, from the FAQ:

http://abuse.zoy.org/wiki/doc/faq

"Approximately seven percent of the game engine and Abuse combined is Lisp code. The rest is C++."


Abuse, good memories. It's one of the games (other being StarCraft and Unreal Tournament) which pushed me into learning programming as a kid. Funny that after 10 years I suddenly remembered that I had my first Lisp exposure over a decade before I learned about the language itself - it was by browsing Abuse data files.


That's pretty interesting. I recall that there was a project to integrate Clojure and Unity but I don't know how active it is:

https://github.com/arcadia-unity/Arcadia

Either way getting Carmack into language design has very exciting implications. He seems like an ultra pragmatic guy and is undoubtedly a great programmer.


Arcadia dev here! We're always actively working on Arcadia, and are closing in on our next release that brings with it a package manager and all kinds of other goodness. Keep an eye on the repo or jump in the gitter for updates.

Lisp in VR is super exciting! We're following Carmack's work closely.


It is very active. Search youtube for "arcadia clojure" - lots of very compelling/recent demos.


they're also super active on gitter and twitter, fyi. often fielding and answering questions about arcadia.

https://gitter.im/arcadia-unity/Arcadia

https://twitter.com/arcadiaunity


Strange Loop (2014) video: "Clojure in Unity 3D: Functional Video Game Development" by Ramsey Nasser and Tims Gardner

https://www.youtube.com/watch?v=tJr_TD1BtF0

__ and __

http://arcadia-unity.tumblr.com/post/100257212548/arcadia-0-...

HN: https://news.ycombinator.com/item?id=8472792


Also worth checking out: a SkillsMatter talk Tim (one of the Arcadia authors) gave. I really appreciate the sense of freedom/interactivity in the demo (once he gets into it): https://skillsmatter.com/skillscasts/6416-games-and-3d-graph...


"With app signing, we have no way for developers to casually share work with each other or limited communities."

This is such a sad state of affairs. I miss the excitement of handing a friend a floppy (well, a tape at one time) so they could play with code I wrote.


Luckily, adding the "web-like" scripting mode will potentially circumvent this.

  “Web like”, where the script is downloaded from the
  net for each execution and run by a single signed
  app that we provide (NetHMD).  Fully specified by
  an app-scheme URI, this allows VR experiences to
  be launched directly from web pages or app-links
  on Facebook, which I think is VERY powerful – read
  about how great something is while browsing on your
  phone, then just click the link and get the “insert
  into HMD” dialog and jump right there.  VR scripts
  can link to other VR scripts without leaving VR. 
  There is no direct monetization here, but IAP
  can eventually be supported.
I think this is a positive outcome of Facebook owning Oculus. If another hardware company owned it, they would potentially be much more myopic about the "Internet"-related possibilities of VR.


I'm not sure I understand this. I think I'm missing something so maybe you can help. I share code with friends all the time. Sometimes it's something small and I save it as a gist on github and sometimes it's something larger and I make an entire repo. I have also made tarballs and emailed them to people when necessary.


I take it all your friends are developers. Try sharing something you coded with non-developers on their devices that require app signing. It is not an easy operation.


Makes sense. I thought I was missing something. Thank you.


Oh my goodness this sounds absolutely amazing. I really-really hope that this will happen. Working on VR in lisp would be the most fun thing in the world ever.


One of the earliest advanced graphics programs was developed on Lisp Machines.


S-Graphics package or even older? Being a Nichimen fan, I'm always curious about their [pre]history.


Something like that, lispm knows certainly what I mean.


Ha, for a second I thought he was that guy http://web.archive.org/web/20110716211447/http://lemonodor.c...


Wow! This takes me back to the late-90s. I worked on the Neverworld project, which was a set of Scheme48 extensions for modeling virtual worlds in VRML: http://www-personal.umich.edu/~jeffshuo/neverworld/


This is great! Hardcore game devs need to have their idols doing this type of work to convince them to move to something higher in the chain.

As it happened with Pascal, C and C++. Or with having an OS underneath their engine.


I'm not convinced it has anything to do with who is using the language. If the best framework happens to be written in some language in some way, well I guess that is what I will use...


For me, Carmack is one of those people I will watch and maybe follow just on reputation alone.


Technical talent aside, it's a shame he's never been involved in the development of any games actually worth playing :(


That is a matter of taste, as our LAN parties would prove.


It's not like he's probably indirectly responsible for all the games you feel are worth playing.


You must be pretty young.


doom had a bigger installed base than windows 95. you sir are gravely wrong.


My experience back when I was trying to get into games a few decades ago was that some devs only change tools if the platform owners shove them down their throats without any other option, or get proven wrong by having one of their idols go alternative.


How would you know it's the "best", though?


I'm somewhat surprised he isn't basing his work on Fluxus: http://www.pawfal.org/fluxus/

> people who might be a little hostile to working in a Lisp.

Hostility is an excellent word for it.

> Doing VR GUI development in the native apps is unpleasant – you can gut it out in C++ with our current interfaces, but the iteration times and code structure don’t make it a lot of fun.

> In-editor prototyping is nice in Unity, but actual on-device testing cycle times are terrible, and you wind up with all the bulk of Unity and poor access to the native VR features like TimeWarp layers.

This is exactly the reason I've focused on WebVR with my Primrose project. Otherwise, JS sucks.

> I am a big believer in functional programming (and static types) for large projects, but there is an undeniable bit of awkwardness compared to just imperatively poking things for small projects

That's funny, because I would say the opposite is true: imperative code with global state is harder to poke around on because it's harder to reason about your state, whereas functional code [0] can be sliced and diced and put back together much more readily.

There are places where traditional for-loops, with a readily exposed counter, are extremely useful in game programming. Perhaps that is what he means? Map and for-each are great when they are the right tool, but I've personally never subscribed to the idea that they are particularly "easier" to use than for-loops, or "more functional".

> The design usage is to reference pre-processed models loaded from the net

This is a great point. Most (all?) beginner 3D programming tutorials start you off with making your own meshes by hand in code and leave loading models from files until far, far later. This, to me, is exactly backwards. The basics of modeling are not hard to teach, Blender is free, and building static, non-mathematically defined meshes in-code really sucks.

[0] with a REPL, any modern language without a REPL is defective. And that means with Readline, or something similar. A REPL without command history is a defective REPL.


> This is a great point. Most (all?) beginner 3D programming tutorials start you off with making your own meshes by hand in code and leave loading models from files until far, far later. This, to me, is exactly backwards. The basics of modeling are not hard to teach, Blender is free, and building static, non-mathematically defined meshes in-code really sucks.

Your mileage may vary, but I wouldn't necessarily say that modeling is easy, especially not trying to use Blender. Most beginner 3D programming tutorials use simple geometric shapes because it is a helluva lot easier to figure out if your lighting is off or your projections are wrong when the scene objects are simple. Loading complex models from file is not exactly a piece of cake, either - you either need a fully-featured library with support for models baked in, or to use something like Assimp, or even to write your own model format parsers.


We should have "fully-featured library with support for models baked in" already. There is no reason for a beginner to be writing stuff like this from scratch. A scene graph and model loading library are the bare minimum for a beginner.


Well, good luck if you are using raw OpenGL or DirectX and trying to learn the fundamentals, as opposed to how to use game engine X, Y or Z.


Why? I've put up a simple renderer (meshes + instances [transformation + material] + camera + a default light behind the camera) in ~450 lines of C++ code (+ 50-ish lines of GLSL).


I've done the same for C++ DX9 code, but that was before D3DX was deprecated and then removed in DX11. It's not quite so simple anymore, unless you pull in various third-ish party libraries (usually just open-sourced versions of the D3DX stuff Microsoft cut...)


I thought the industry standard for game scripting is Lua. I mean - if you want to attract the wide range of game devs, why choose lisp?


Because you're John Carmack. You'll attract devs no matter what you use, so it makes sense to use the best tools available.


Yes, he of course can afford this - to play with things he likes. But will it blend?


Does it need to blend? Were Racket the only supported scripting language for Oculus I'm sure people would just learn Racket rather than skipping out on Oculus.


As sklogic said, no need to learn Racket:

It has multiple Lispy dialects: http://docs.racket-lang.org/guide/dialects.html

And a datalog one: http://docs.racket-lang.org/datalog/index.html

It knows C: http://pkg-build.racket-lang.org/doc/c/index.html

And Javascript: https://github.com/lwhjp/ecmascript

And Algol: http://docs.racket-lang.org/algol60/index.html

And Pascal: https://github.com/soegaard/minipascal

It shouldn't be too difficult to teach it Lua ;).


No need to learn Racket, it's easy to build any language on top of it. Having a single meta-language as a base allows diversity and choice for end users that is not possible with any of the inferior (non-meta) languages.


Best? If we'd all agree on what "best" means, we wouldn't discuss anything ever.

Scheme and all the other Lisp variants look terrible to me. Having an excessively simple syntax doesn't mean that it's nice to use. They went too far.

This is probably the primary reason why Lisp dialects aren't more popular. Even the simplest code examples look like some sort of practical joke.

    (let ((hello0 (lambda() (display "Hello world") (newline))))
      (hello0))
[Edit: This example was taken from Wikipedia's Scheme article.]

Something like that in Dart:

    hello() => print('Hello world');
    hello();
Dart's syntax is way more complicated, but this is much easier to understand, isn't it?


The second example is more readable for several reasons.

1. You chose a better function name (hello0 looks a little confusing)

2. You [EDIT: actually Wikipedia] decided for whatever reason to use let rather than define in the Scheme example.

3. Dart's "print" function is more suitable for console output on a single line than Scheme's "display".

Here's the fix:

  (define (print something)
    (display something)
    (newline))

  (define (hello)
    (print "Hello world"))
All of a sudden (f a b) doesn't seem way more complicated than f(a, b);.


For the sake of completion...

  (define (hello)
    (displayln "Hello world"))


Racket has a displayln that is similar to puts or System.out.println.


The code you pasted is used to illustrate the concept of ports. You took it out of context; it's not an example of hello world like you're claiming. Also, the code on Wikipedia doesn't always match the actual practice of programming in a language.

If you want to see more realistic examples go to RosettaCode, pick interesting task and look for Scheme, Racket and Clojure entries. You can then compare them to each other and to other languages you know.

The other thing is that your code in Scheme and Dart don't do the same thing at all, unless I underestimate Dart semantics very much. A translation of your Scheme code to JS, with as much semantics preserved as possible, would look something like:

    (function (){ // let introduces new scope
      var hello0 = function (){ // hello0 is a variable holding an anonymous function
        return console.log("Hello world"); 
      };
      return hello0();
    })();
That's quite a bit less work in Scheme, isn't it? But for when you don't need these semantics you can do a translation the other way round (Dart->Lisps). In Racket your Dart example would look like:

    (define (hello) (displayln "Hello world"))
    (hello)
Now, that looks better, right? It's worth noting that the Racket version has only one more pair of parens than the Dart one.


To expand on this a little, here is a thing that you can't translate to most programming languages. By defining a simple, pattern-based macro we can get much closer to the Dart version:

   (func: hello() => (displayln "Hello world"))
   (hello)
All you need to do this - to change the language syntax to look like that - is to write this:

    (define-syntax func:
      (syntax-rules (=>)
         [(_ name (formal ...) => expr ...)
          (define name (lambda (formal ...) expr ...))]))
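(A note for readers new to syntax-rules: the (=>) argument declares => as a literal keyword, so in the pattern it matches only that identifier verbatim rather than binding a pattern variable.)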


Is this just trolling? Why a let and "hello0"? Is that just to make it look weird on purpose?

    (define (hello)
      (displayln "Hello, World!"))
    
    (hello)


I used to program in a LISP (the language predates Lisp) derivative called "Boot". (It booted a computer algebra system.) It added two rules to LISP: unparenthesized expressions bind "to the function call", using dynamic lookup to determine arity and failing if the binding was ambiguous; and ';' to delimit expressions. It's been a decade since I used the language, but it would look something like this:

    define (hello)
        displayln "Hello, World!";

    hello;
The boot 'compiler' only did name resolution and parenthesization, then it kicked everything back to an underlying LISP. (In this case, SBCL, which rocked!)

The second level language built on Boot was Spad, which used a slightly friendlier syntactic top-form:

    e : t = x
With compact forms:

    e := x
    e = x
    e : t
    e
Which lets you uniformly parse packages, modules, domains, categories, functions, annotations, etc. etc. For instance, we had a Spad-lite syntax:

    Monoid : Domain = public :
        _* : (y : Monoid), (x : Monoid)
        _0 : Monoid
Then you could assert membership:

    assert(Integer, Monoid) :=
        _* : (y : Integer), (x : Integer) = (_*:Integer)(x, y)
        _0 = (0:Integer)
An implementation would be:

    Monoid : Domain = public :
       -- see above.
    = implements :
       -- implementations.


Quite a beautiful addition to sexps. I've seen implicit rules about top-level expressions being implicitly wrapped in parens so one can write

    defun fact (n)
      (cond
         ..
         ..)
which makes lisp so much more bearable to parensophobic people.



Seriously ?

  (defn hello [] (println "hello world"))
  (hello) ;; in Clojure

  (defun hello () (format t "hello world"))
  (hello) ;; in Common Lisp.


The example was taken from Wikipedia's Scheme article.

I do not know how to write Hello World in Scheme.

By the way, I really like how everyone's Hello World looks different.


>I really like how everyone's Hello World looks different.

Because everyone is posting hello world examples in different languages. Lisp is a family, a category of languages, not a single language.


You can't effectively criticise a language without actually knowing it. That's why you're being downvoted.

The example you gave is using the lowest level primitives for printing, with zero syntactic sugar. It's equivalent to writing the following in JavaScript (assuming functions called 'display' and 'newline'):

    (function () {
        var hello = function() {
            display('Hello world');
            newline();
        };
        hello();
    })();
Practical Scheme implementations include additional macros that add syntactic sugar so you can use much more straightforward mechanisms to do things.


So, you know nothing about Scheme, but yet dare to have an opinion?


Yes, I "dare" to have an opinion on that syntax.

I like it from a theoretical point of view. It's very uniform which is convenient for implementers. It's brilliant, really.

However, I dislike it from a practical point of view, because this uniform soup is hard to scan and also hard to write.

I pretty much like any other language's syntax better than this one. Even fairly cryptic ones like Erlang's.


> However, I dislike it from a practical point of view,

How can you be disliking it from a practical point of view if you don't use it in practice? Don't you think it would be different if you tried actually learning Scheme for a week? Do you think it would still be unreadable to you after that week? Why do you think so?

Also, take a look at APL or J if you want to see "unreadable" language.


> How can you be disliking it from a practical point of view if you don't use it in practice?

I had plenty of exposure to deeply nested things. This is pretty much like nested function calls. "foo(a, b)" and "(foo a b)" aren't all that different, are they?

I prefer to have different bits of syntax for different things because it looks less uniform. I like it when different things stick out in different ways.

> Also, take a look at APL or J if you want to see "unreadable" language.

Both of these aren't even in the top 50. Can you think of a somewhat popular (non-legacy) language?


> I had plenty of exposures to deeply nested things. This is pretty much like nested function calls.

I have no idea what you mean here.

> I like it when different things stick out in different ways.

Here's your problem: the "things" are sticking out in a variety of ways in Lisps, you just don't know and can't recognize in what ways exactly.

> Both of these aren't even in the top 50. Can you think of a somewhat popular (non-legacy) language?

Again, I don't understand what you mean.


> I have no idea what you mean here.

Nested lists are similar to nested function calls. Did you skip the rest of the paragraph?

> the "things" are sticking out in a variety of ways

No, they don't. Syntax-wise, everything is the same.

"Scheme's very simple syntax is based on s-expressions, parenthesized lists in which a prefix operator is followed by its arguments. Scheme programs thus consist of sequences of nested lists."

There shall be nested lists. That's all, folks!

In other languages, you have syntax for importing stuff, syntax for declaring a variable, syntax for different loop constructs, syntax for branching, syntax for classes/methods/functions...

Each building block comes with its own syntax.

This isn't as elegant as having one universal construct for everything, but it's easier to use (once you got used to it), because the syntax itself carries a lot of the information.

  (define (square x)
    (* x x))
I'd rather write something like:

  square(x) => x * x;
"bla(...)" is a bit of syntax for defining a function. "=>" is a bit of syntax for defining a lambda. "x * x" is a bit of syntax for using x's "star" operator.

> Again, I don't understand what you mean.

J and APL aren't popular languages. There are many unpopular languages with worse syntax. They are also both array programming languages which is a rather odd niche.


This is simply the case of getting used to something. The Lisp code reads like this: (define A B) -> define A to be B. You define (square x) to mean (* x x). It requires you to know only one syntax rule - that (f a b ...) means "apply f to arguments a, b, ...". The above code basically reads "define applying square to x to mean applying * to x and x".

> They are also both array programming languages which is a rather odd niche.

Array programming may seem like an odd niche if the only thing one knows is web development, but the moment you actually start doing some maths, they become incredibly useful. Another, much more popular (and crappier, which seems to be a common correlation, but that's a different story) array programming language is MATLAB. Also you've probably heard of R, loved in statistics and sciences, which is another array language.


> This is simply the case of getting used to something.

My point was that other languages provide more visual hooks. Lisp is just words and parens. Nothing sticks out. Everything is the same.

So, you can't learn this "visual vocabulary" because there simply isn't any. There is a thing at the beginning of each list and what follows are the parameters. That's it.

"(define x 5)", "(square x)", and "(+ x x)" are syntactically the same thing.

> Array programming may seem like an odd niche if the only thing one knows is web development

You make it sound like it's either web stuff or maths.

Array programming languages aren't used for scripting (e.g. games), are they?


How many of those visual hooks are provided by syntax highlighting? You have that in Lisp as well. Moreover, you don't need that many syntax hooks; you learn to recognize words, just like when reading a book, and also indentation structure.

> Array programming languages aren't used for scripting (e.g. games), are they?

They've been used at least once, if you count one demo and one game I wrote in MATLAB. Man, you wouldn't believe how convenient array languages are when you need to do things like polygon mesh interpolation (morphing), not to mention actual array operations like multiplying matrices or vectors. MATLAB may be a crappy language, but the array operations? I wish I had them in C++/Java for games.


If you interpret "array programming language" as "language with support for arrays and matrices as first-class objects with nice operators, etc" then you get a lot more things coming into the mix beyond oddballs like apl.


> but it's easier to use (once you got used to it)

No, actually it's easier to both read and write and edit sexps once you get used to it. This whole thread is about this: you have no practical Lisp experience yet you claim Lisp is not practical. You couldn't be bothered to actually learn more of the language, but you want to tell us what the experience of using it is like.

It doesn't work that way. You can only compare things meaningfully when you have comparable knowledge of both. You apparently don't. There are many people who do know both "normal" languages and Lisps and most of them seem to agree that in practice Lisps are as readable as other syntaxes. But you don't want to believe in it for some reason and you don't even want to see for yourself.

> square(x) => x * x;

Why don't you answer my earlier posts, where I show how to make similar syntax in Lisp?

> J and APL aren't popular languages.

But that is completely irrelevant. I'm talking about language features and practice/experience of programming with it, I don't care at all about "popularity".


> You can only compare things meaningfully when you have comparable knowledge of both.

Lisp isn't the only language which allows you to nest things.

You can nest function calls, lists, objects, tuples, and whatever in any language.

  (a b (c d))

  a(b, c(d))
Impossible to imagine, eh?

> Why don't you answer my earlier posts, where I show how to make similar syntax in Lisp?

So, your solution to make Lisp usable is also to not write Lisp? And I'm supposed to take that as disagreement?

If you extend it with your own syntax, it becomes a different language.


> If you extend it with your own syntax, it becomes a different language.

...but that's exactly what programming Lisp looks like. Becoming a different language every time you need it to is business as usual in Lisp. Syntactic abstraction - ability to extend language syntax - is central to Lisp programming.


> If you extend it with your own syntax, it becomes a different language.

No. This is exactly what Lisp is for. If you're not transforming it into hundreds of small DSLs (with their own syntax and a wide variety of semantic properties), then you're not using it right, and missing on all of its expressive power. In such case, yes, you may get rightfully puzzled, what all the Lisp buzz is about if it's just all the same stuff, but with an ugly syntax.


I think this requires an obligatory image explanation for non-lispers: http://www.loper-os.org/wp-content/parphobia.png


> Yes, I "dare" to have an opinion on that syntax.

Isn't it stupid to have an opinion about something you do not know and do not understand?

> It's very uniform which is convenient for implementers.

It's not that uniform, really. It's so flexible that you can make it look any way you like. Any existing language syntax is a subset of this syntax.

> because this uniform soup is hard to scan and also hard to write

Just use a better text editor then.


> [Scheme's syntax is] not that uniform, really.

"Scheme's very simple syntax is based on s-expressions, parenthesized lists in which a prefix operator is followed by its arguments. Scheme programs thus consist of sequences of nested lists."

That's as uniform as it gets. Everything is the same.


Firstly, Racket is not a Scheme. There are reader macros and all that bells and whistles. Does this look like S-expressions to you?

https://github.com/soegaard/minipascal

Secondly, even most Schemes allow using [ ] and { } in addition to ( ).


> Firstly, Racket is not a Scheme.

Did I talk about Racket anywhere?

> Does this look like S-expressions to you?

"MiniPacal implemented in Racket"

Are you really arguing that implementing some other language is the way to go?

https://github.com/eudoxia0/cmacro http://research.swtch.com/shmacro

Totally legit "C".


> Did I talk about Racket anywhere?

This is a thread about Racket, if you did not notice.

> Are you really arguing that implementing some other language is the way to go?

This is exactly what Lisps are about. You've got a nearly raw AST at the bottom (S-expressions) and then you build up a hierarchy of languages on top of it.


Even if you use this redundant `let` form you can still make it more readable by doing the following:

    (let ([hello-func (lambda ()
                        (displayln "Hello world!"))])
      (hello-func))
This applies globally; when it makes sense, differentiate between some lists by using brackets instead of parens. Some Lisps take this further by using other bracket types to signify certain things.

I would advise not to hop on Wikipedia and assume things about languages. Scheme is not very hard to get into, and with batteries-included variants you can build useful stuff right out of the box.

Go to http://www.racket-lang.org/ if you want to see a good example of a pragmatic and useful Scheme variant that may just dispel some of your prejudices.


> Scheme and all the other Lisp variants look terrible to me

Baby duck syndrome?

> Having an excessively simple syntax doesn't mean that it's nice to use.

Having an excessively simple syntax ensures that you can add any amount of syntax on top of it, whatever you fancy. Do not like S-expressions? Fine. Code in any syntax you like.

> but this is much easier to understand, isn't it?

No, it is not.


That's mostly because such lisps are overly verbose. The same code in Clojure:

    (let [hello (fn [] 
                  (println "Hello World"))]
      (hello))
Or assuming hello is a global in your dart example:

    (defn hello []
      (println "Hello World"))

    (hello)
Nothing too complicated about that. Clojure is way less verbose than some scheme variants.


All Scheme variants are approximately as verbose as your second example, except you don't have to assume hello is global. Internal defines are part of the R4RS standard published in 1991.


The equivalent code in Racket is exactly as complex as your Clojure example.


> Or assuming hello is a global in your dart example

The way it's written implies that it's somewhere inside some block, because "main()" is always the entry point. I wrote it like that because I couldn't be bothered to figure out how this stuff works in Scheme.

With entry point:

  main() => print('Hello World');
Alternatively:

  main() {
    print('Hello World');
  }


> I wrote it like that because I couldn't be bothered to figure out how this stuff works in Scheme.

I think that sums everything up, really. This is how shallow the complaints about parens are. No one who actually bothers will make a point about what delimits expressions in a language.

To complain about parens is also doubly wrong, as they do bring something technical to the table.

What Carmack talks about when he says S-expressions are nice for reading in terms of network communication, for example, is that the messages can be read exactly like other expressions over the wire. This is true for any I/O; you can read the data just as if it was code, because the code is data. The same mechanism that reads the code you are executing is available to you to read full expressions through any medium, and use that data/code as you see fit.
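For a concrete flavor of that in Racket (the message format below is invented purely for illustration): the same `read` that parses source code will happily parse a message straight off a port, giving you ordinary list data to destructure.

    #lang racket
    ;; Hypothetical wire message, for illustration only.
    (define in (open-input-string "(set-position player (1.0 2.0 3.0))"))
    (define msg (read in))   ; the reader used for code parses the data too
    (first msg)              ; => 'set-position
    (second msg)             ; => 'player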


"This stuff" was referring to the entry point.

I didn't complain about parentheses. ASM's syntax is also extremely simple, but that doesn't make it easy to use, does it? If everything looks the same, it's hard to scan.

C-like languages provide more visual hooks than Lisp dialects.


I'm currently using Java and Javascript for work. Previously I used Scheme. I do find that scanning is important in Java and JavaScript because often you don't need to read a whole section, just a piece. Back when I worked with Scheme, if I didn't need to read a whole section, generally the part I needed to know was right at the beginning. I think all Lisp dialects work this way.

The worst example is immediate function application in JavaScript. You get to the end and all of a sudden the }(); tells you everything you've just read is a different context than you thought it was. In Scheme (and probably most Lisps) you'd recognize immediate function application at the beginning as soon as you saw ((.

I don't know how people get really good at JavaScript without learning Scheme first.


> You get to the end and all of a sudden the }();

With most code conventions, you wrap immediately invoked function expressions (IIFEs) in parens:

  var foo = (function () {
    ...
  }());
The only purpose of those extra parens is to act as marker.

Well, with block scope (`let` & `const`), IIFEs aren't really needed anymore.


>Clojure is way less verbose than some scheme variants.

Bullshit.


Eh, IMO

    (let [x (+ 1 2)
          y (+ x 3)]
      (* x y))
Is always less verbose than:

    (let ((x (+ 1 2))
          (y (+ x 3)))
      (* x y)) 
I program in Clojure every day, and that latter example makes me say "aahhhh! The parens!"


And the Clojure example makes me say "aaah the ambiguity!" The key/value pairs make much more sense to me when they are delimited as such. It's a minor point, anyhow. Clojure is just fine. I just don't think that it is significantly less verbose than Scheme. Let's just keep hacking with Lisps and be happy. :)


It's dramatically less verbose when you are dealing with code that uses associative data structures, both because of having nicer literal syntax and because the data structures can be applied directly without needing to reach for things like hash-ref. (Which makes a boatload of sense, because immutable hash maps actually have more in common with mathematical functions than Scheme procedures do.) In addition, functional nested updates are very verbose in Scheme.

However, both of these problems can be solved in Racket by third-party libraries.
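For what it's worth, here is a minimal plain-Racket sketch (with invented data) of the hash-ref and nested-update pattern being referred to; Clojure's literal maps and update-in make the same thing considerably terser.

    #lang racket
    ;; Invented example data; hash / hash-ref / hash-update are standard Racket.
    (define player (hash 'name "Ava" 'stats (hash 'hp 10)))
    (hash-ref (hash-ref player 'stats) 'hp)   ; => 10
    ;; Functional nested update: rebuild the outer hash around the updated inner one.
    (hash-update player 'stats (lambda (s) (hash-update s 'hp add1)))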


    (let* ([x (+ 1 2)]
           [y (+ x 3)])
      (* x y))
Less ambiguity, all of the clarity of what is what. I don't get why you use other delimiters for your Clojure code, but have this view that they can't be used in other Lisps/Schemes.

(You will also have to use `let*` in (most?) Schemes because plain `let` binds in parallel, so x cannot be used in the binding of y.)


>I don't get why you use other delimiters for your Clojure code

Do other delimiters mean the same thing as in Clojure, though? I only know Clojure - in Clojure [] - a vector - is a different data structure from () - a linked list - which is different from {} - a hash map. Is this similar to Racket, or is it all just lists in Racket?


In Racket all kinds of parentheses are equivalent; using one or the other is a matter of convention and readability.

I think the newest Scheme standard takes the same approach.
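A quick illustration in Racket (trivial, but it shows the reader treats the bracket kinds identically):

    #lang racket
    ;; Square brackets read as ordinary parentheses; only convention differs.
    (equal? '(1 2 3) '[1 2 3])       ; => #t
    (let ([x 1] (y 2)) (+ x y))      ; => 3, mixing styles is legal (if ugly)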


Here's his (brief) answer (from the same thread):

John Carmack:

> >An imperative API makes functional abstraction harder. What are the main selling points for Scheme/Racket now?

> I am a big believer in functional programming (and static types) for large projects, but there is an undeniable bit of awkwardness compared to just imperatively poking things for small projects. That is one of the wins for Scheme -- I can make it super-easy to get easy things working, but it isn't just a "scripting language" unsuitable for large scale development. I am going to have to sort out my Racket / Chibi module strategy sometime soon, though.

> As far as language choice goes, I don't claim to have broadly evaluated every possibility and chosen the optimal one.

> Java or C# would have been more familiar to a broader base of game industry developers, but I really didn't want to drag in all the bulk of a JVM / .NET system, and a class focused world view seems less suited for the smaller scripting tasks.

> Javascript would have been more familiar to a broader base of web industry developers, but I have basically no experience with javascript, and little desire to explore it (which is admittedly a fault of mine).

> S-expression reading and writing is a strong factor for network communication, and I like the fact that there are available options for Scheme to interpret / compile / compile-to-C. I can see valid use cases for all of them, and I'm not sure how important each will be.

> The bottom line is that I have been enjoying myself working with Racket / Scheme this year, and I have evidence that it has been objectively productive for me, so I'm going out on a bit of a limb and placing a bet on it.

> >Initial impression: 7 years after starting to program in Racket, it still surprises me how easy it is to do something useful in just a few lines of code.

> A big goal here is to make the authored code very clear and simple. I may yet get to a single #lang vr <ip address> declaration that gets rid of the boilerplate at the top and bottom of the file and flips the right switches to make Racket closer to R7RS.

> There are sets of creative-types that are learning how to put things together in Unity to accomplish fairly simple media related tasks. I think I can make it easier for them with very domain specific helper libraries and a little bit of scheme.


> I am a big believer in functional programming

> I have basically no experience with javascript

> The bottom line is that I have been enjoying myself working with Racket

So there is no logical reason behind it; he just loves Racket, ok.


Programmer love is secretly (or not so secretly) the reason for all language choice. I say why hide it.


Not really. Domain often dictates. If AAA games could get away with using a garbage collected language, I'm sure they would.


Unreal has GC support for C++.

Unfortunately, Unity's broken GC is a side effect of them not wanting to improve the stone-age runtime they are using.

I guess many AAA could actually be done in languages with GC support, but game devs tend to only change tools when the OS/console vendors force them.

All that is needed is such a vendor pushing a language with the same effort as they are pushing JavaScript JITs, let's say.


Programmer love would determine who'd make that change vs. who defended C++ to the death.

Domain is often dictated too—we're attracted to problems that fit our tools.


Not to worry, they will be forced to add javascript just to please the hordes of people who don't know any better.


Before that it had to be "VB like" scripting.


You forgot this part:

>and I have evidence that it has been objectively productive for me


He is fairly smart and has quite good taste, so I'll trust him to choose a good language.


It is logical for him to pick Racket because of these three points.


Only programmers would demand that you provide logical justification for an aesthetic preference :)


Ultimately, this is what art is. You need to go with what you love. 'logical argument' often just is a euphemism for rationalizing your convictions after the fact. You can't paint if you try to analyze every brush stroke. Let the subconscious flow.


Scheme (or any other Lisp) is the best choice for a baseline implementation, because it can be easily (and without any runtime cost) turned into any other language. You can do similar things with MetaLua, but at a greater cost.


I've worked at a couple of games companies and we didn't use Lua on any of the projects I worked on. One was starting to experiment with it when I left.

A large AAA studio I worked at actually rolled their own scripting language, compiler, and VM from scratch. I got to get my hands dirty with this. The only time since University I've had to utilise my compilers knowledge :)

Unity gives you the choice of C#, a JavaScript dialect, or a Python dialect.

Why not lisp? :D


Andy Gavin has a great series of articles on the making of Crash Bandicoot [0], [1]. (I'd recommend reading the entire series as it's very interesting.)

It seems that in order to create an immersive world on the original PlayStation and keep up the speed of development, iteration and dynamism required for a game, he wrote his own compiler (compiling to assembly for the PlayStation 1) for a lisp that he created called Game Oriented Object Lisp (GOOL) [1].

[0] http://all-things-andy-gavin.com/2011/02/02/making-crash-ban...

[1] http://all-things-andy-gavin.com/video-games/making-crash/

[2] http://all-things-andy-gavin.com/2011/03/12/making-crash-ban...


Because, in my opinion, when you create a tool for the masses, the individual preferences should be the last in line of factors when you choose scripting programming language for that.


God forbid javascript developers are being encouraged to broaden their horizons by a master of the discipline.

He is going to be forced to add javascript by whiners anyway; I am enjoying the pipe dream of keeping it away.


When you create a tool for the masses, you get a crap tool. You should focus on creating a good tool first; the masses will have to follow anyway.


> when you create a tool for the masses, the individual preferences should be the last in line

Well Javascript is Javascript because of some guy personal prefences at Netscape. The same is true for C# and Java in their respective companies, and they didn't start as popular languages either.


Javascript is literally the poster child for "accidental and completely illogical success". It's the only mainstream language that did not have to compete with anything else because of accidents of history.


Technically, it did have to compete with VBScript, back when IE had browser dominance and VBScript support.


But Netscape never had support for vbscript, so until they died Javascript/JScript was the only choice. By the time IE reached dominance, the industry had already standardized on JS.


JavaScript was supposed to be Scheme not once but twice, and the reason we use it instead of a Lisp in the browser is the insane competition with Microsoft. Source: interviews in Coders at Work.


And that guy wanted to just use Scheme, but got overruled by management.


By that line of reasoning, everything would be COBOL.


The masses don't know what they want. By choosing for them, you show them.


Lua's a minimal and easy-to-learn language that's easy to embed. But is it a good language? I'm not sure. It's weakly-typed and dynamically-typed, for example.


Lua is not weakly typed (at least not in the sense of Javascript and PHP, where "3"+4 is either 7 or 34, can't remember which).

It is dynamically typed, but so is Scheme, so that's not a reason to choose Scheme over Lua.

It has the bestest JIT currently found in any dynamically typed language, and rivaling the best JITs of statically typed languages like Java.

What's your definition of a good language that accepts scheme but not Lua?


> Lua is not weakly typed (at least not in the sense of Javascript and PHP, where "3"+4 is either 7 or 34, can't remember which).

In JS, "3" + 4 is "34", in PHP, it's 7. PHP separates concatenation and addition, unlike JS.

Anyway, huh? Lua has automatic type conversion. That's usually considered weak typing.


It's just for "+" and ".." though. For other things, such as equality and table indexing, numbers and strings are not interchangeable.


In Lua "3"+4 is an error. Concatenation is a different operator than add, specifically to avoid confusing coersions.


Try it:

    Lua 5.1.5  Copyright (C) 1994-2012 Lua.org, PUC-Rio
    > ="3" + 4
    7


Ah. Well at least there isn't a confusion between add and concat

"3"+4 is 7 4.."3" is "43"

"3hello"+4 is an error


I stand corrected, given the example. I forgot about the Lua auto type conversion - but that, on its own, does not make weak typing - Python, C and Java do that between floats and ints; would you consider them weakly typed?

I'm not familiar with a rigorous definition of weak typing - but my experience is that weak typing is usually reserved for languages like TCL, Snobol, and even Perl where there's no "type" of values to speak of - everything is equivalent and decoded according to context.


That's the problem with those terms, both 'weak' and 'strong' typing are purely subjective.


> bestest JIT currently found in any dynamically typed language

Aren't the best players in this field the JavaScript engines? Not because JS is the best language, but because Google, for example, hired people like Lars Bak to make them.


Javascript engines certainly have the best marketing going for them :) But Lua is a much simpler language, and it's much easier to write a JIT compiler for it than for Javascript.

And when it comes to quality of programmers, LuaJIT has Mike Pall, who is a bit of a legend to those who have heard of him.


Which means LuaJIT suffers from Bus factor syndrome.


Typed Racket is statically typed.

A good language is a meta-language. There is MetaLua, of course, which qualifies it as a "good" language, but Scheme has more bells and whistles.


Metalua isn't compatible with luajit. It needs the slower C based interpreter. Its problem was that it was based on a very old version of Lua and it needed things like goto that weren't in lua yet, so it goes straight to bytecodes.

I'm working on building languages on top of Lua/LuaJIT; since Lua has a good tracing JIT, there's no reason you can't write an expanded metalanguage on it that will preprocess a bit and give you a brand new language.

By the time I'm done it will have the power of scheme (including continuations, with some functions compiled internally to use continuation passing style, since Lua guarantees tail call elimination) and macros. The main difference should be that the base type will be arrays and tables not lists, but that could be an improvement. I'll probably tack on something like s-expressions too.
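For readers unfamiliar with the idea, here is a tiny Scheme/Racket sketch of continuation-passing style (not the poster's Lua implementation, just the general technique): every call is a tail call, so any language that guarantees tail-call elimination, as Lua and Scheme both do, can run such code without growing the stack.

    #lang racket
    ;; Factorial in continuation-passing style: the "rest of the work"
    ;; is passed along explicitly as the function k.
    (define (fact/k n k)
      (if (zero? n)
          (k 1)
          (fact/k (- n 1) (lambda (r) (k (* n r))))))

    (fact/k 5 (lambda (r) r))   ; => 120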


You should find this interesting http://terralang.org

Took me a couple reads to figure out what it does. Luajit is the metalanguage for Terra. But, if you want to, you can run a mixture of Lua and Terra at runtime or statically compile to a pure Terra exe or .o with no Lua runtime at all.


that reminds me a bit of lush [http://lush.sourceforge.net/], one of a small handful of languages that i really feel should have made the jump to at least haskell-level popularity.

hn actually had a lush discussion recently: https://news.ycombinator.com/item?id=9602430


Also, I've worked in both Lua and Racket.

Racket's garbage collector is the only one I've ever used that actually made programs unusable.

Also, Racket's macro system feels like a mistake. While it's more powerful than most others, it's still rather incomplete, feels like a bad design, and ... well just look at the code that's used to implement the (sadly slow) object system. The code makes assembly language look clear by comparison. Dig under the surface and despair.

Scheme is very powerful, but in the end Lua programs are a lot more readable.


[edit, I wrote this before you changed your comment to say "one-shot". In short, my continuations are not one-shot; they are delimited though, by necessity, because I don't rewrite ALL functions into continuation passing style, and you can't recast what Lua calls metafunctions.]

Continuations are not coroutines.

Continuations let you save the current position, like setjmp/longjmp. But unlike setjmp/longjmp, you can continue inside a function that has already returned, i.e. as long as there is a continuation remembering that position, the activation record can't be deleted.

That means that activation records (ie local variables) can't be just the top of the stack - it changes things deep.

Also note that continuations only save the position; they don't restore the values of variables to what they were at that position, so evaluating a continuation more than once will break any code that wasn't expecting to re-run.

Something that snapshots some local variables as well as the position will be more useful than raw continuations for things like search, backtracking, and logical programming languages.

[edit 2, I am ALSO adding snapshot continuations for these purposes]

Running an OUTER continuation rather than an inner one looks like an exception.


Doesn't Lua already have (one-shot) continuations, in the form of coroutines?


> Metalua isn't compatible with luajit.

Although there should not be any obstacles for porting (or re-implementing) MetaLua on top of the most modern luajit.

> The main difference should be that the base type will be arrays and tables not lists, but that could be an improvement.

That should only matter in compile time anyway, for your meta-language. Target may use whatever data structures you want.

> I'll probably tack on something like s-expressions too.

It's not necessary for a meta-language, as long as you have quasiquotation (that works both for constructing ASTs and for pattern-matching them). It's just the easiest way, but not the only one.


Lua, thanks to Metalua, works pretty well as a meta language, so you could easily add static typing, while keeping the benefits of its interpreter.

OTOH, the same is true for Scheme.


I think it is pretty much expected of devs to be able to learn new languages on the fly, as you need them.

Also, just because something is the "industry standard" (of popularity), doesn't mean you should always follow.


The Naughty Dog team made a lisp (GOAL) for some of their games.


When moving to PS3, they used Racket instead.

https://www.youtube.com/watch?v=oSmqbnhHp1c


Not really related, but instead of using a LISP to code VR, I will suggest to do the other way around, so code LISP in a VR environment.

The real core of LISP are pure function and simple structure.

It really makes sense to have a 3D representation of a function where you plug variable inside and get variable out...


I love the humility he displays by asking if Gear VR/Scheme is an appropriate topic for the list.


Very cool.

When I started work at Angel Studios (mostly they did work for Nintendo and Disney) I experimented with an embedded Scheme (probably SIOD). I was hired for game AI and I thought that a scripting language would facilitate rapid experiments. I ended up sticking with C++ which was not so bad because I always had a very beefed up SGI Reality Engine from either Nintendo or Disney, depending on what I was working on. Turns out that Reality Engines do C++ builds very quickly :-)


Pretty bizarre. He says he is "favoring ease of development over performance," but the best way to ease development would have been to use a well known language like JavaScript or C#. Those are the two scripting languages in Unity as well, so it would have immediately worked for Unity developers, and C# isn't far off from Java, one of the other biggest languages. Kind of sad that VR development is going to be hobbled like this.


Ease of development for a programmer is a different thing than ease of development for a company, the latter usually means you can hire tons of cheap programmers to crank out code. Carmack simply decided to choose a powerful tool instead of a popular one; a choice which I applaud and wish more people and companies would make.


"over performance" doesn't mean "with complete disregard to performance". You still need to hit >= 90fps with zero hitches and zero dropped frames.

A less performant scripting language can be a wise choice if the productivity gains are sufficient. However, that language still needs to hit 90fps with no dropped frames.


>Kind of sad VR development is going to hobbled like this.

Racket is a much more powerful, well-designed language than JavaScript or C#.


I think you vastly underestimate how capable developers are of learning new programming languages.


Finally.


And on the 2nd day The JC spoke and proclaimed: The metaverse shall be written in lisp. And it was good.



And on the 3rd day The Facebook spoke and proclaimed: All your metaverse are belong to us and shall be written in "web-native" "technologies". [Sign In With Facebook]


Somebody set up us the like.


The Like button is people!


They're making our news out of people. Next thing they'll be breeding us like cattle; For clicks. You gotta tell 'em. You gotta tell 'em.

>I promise tiger, I promise, I'll tell the exchange.

Listen to me DaveSapien, you gotta tell 'em. HN is people. We gotta stop 'em! Somehow!


D:


I didn't mean to bum you out.


no no, that was just me looking at lisp...


I can already see Lispers using "but Carmack uses it for VR stuff, therefore it's the best!" as an argument in language wars.


"James Clark uses it for SGML stuff, therefore it's the best!"

Of course James Clark is to markup languages as John Carmack is to games, or Jesus Christ is to religion. ;)

http://www.jclark.com/dsssl/

https://en.wikipedia.org/wiki/Document_Style_Semantics_and_S...


Any other language out there is just a subset of Lisp (or any other sufficiently powerful metalanguage). So it's superior simply by definition.

EDIT: downvote count = a number of people who got no clue what meta language is.


I understand your sentiment, but this isn't what 'subset' means.

Definitional arguments aside, this would mean that machine code is the best language, since all other languages are 'subsets' of it.


No, this is exactly what 'subset' means. Any code in any language can in fact be Scheme code, and there is no way you can tell whether it's Scheme or not.

And this is exactly what superiority is about: you can do in Scheme anything that is possible in any language, existing or not invented yet. There is absolutely no way one can even compare meta-languages vs. fixed ones.


You're describing Turing completeness, which is a property of all general purpose languages.

In the context of programming languages, "subset" implies that one language is identical in form and semantics to the other, plus some extra stuff. Good examples of this are Objective C and C++ to C. (Although C is not a strict subset of Objective C and C++, it is pretty close.)


> You're describing Turing completeness, which is a property of all general purpose languages.

What? It's totally irrelevant here.

> In the context of programming languages, "subset" implies that one language is identical in form and semantics to the other,

Exactly. And, say, a Pascal implemented as a thin macro layer on top of a Lisp is semantically 100% equivalent to a standalone Pascal.

In other words, you can build a Pascal (indistinguishable from the "real" one) on top of Scheme, but you cannot build Scheme on top of Pascal.


I can certainly implement Scheme using Pascal. That is, I can write a Pascal program that, when given a program written in Scheme, executes the Scheme program according to the Scheme spec.


> I can certainly implement Scheme using Pascal.

You can implement an interpreter or a compiler. But it won't turn your Pascal into Scheme.

> That is, I can write a Pascal program that, when given a program written in Scheme, executes the Scheme program according to the Scheme spec.

It's irrelevant. Can you mix a bit of Scheme code into a definition of a Pascal procedure, with all the local identifiers transparently available? No. But it's trivial the other way around.

A simple experiment for you: imagine you've got a system scriptable in Pascal. And you want to write your scripts in Scheme instead. Your actions? Implement a slo-o-ow and broken Scheme interpreter in that Pascal, right? And then think hard on how to do interop in between Scheme and all of the stuff already available for that Pascal. Funny and stupid.

Now, the other way around: you've got an embedded Scheme, but you hate all those parentheses and want to just code in Pascal. Fine. Write a little macro which will transparently translate your Pascal into Scheme. No runtime cost whatsoever, and all the interop comes for free. See the difference now?
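To give a feel for what such a macro can look like, here is a toy sketch only - nowhere near a real Pascal, and every surface keyword below (:=, while, do, begin-block) is invented for the example:

    #lang racket
    ;; A toy statement-block surface syntax translated into Scheme by one macro.
    (define-syntax begin-block
      (syntax-rules (:= while do)
        [(_ (x := e) rest ...)                      ; declaration + initialization
         (let ([x e]) (begin-block rest ...))]
        [(_ (while c do body ...) rest ...)         ; while loop via a named let
         (begin (let loop () (when c body ... (loop)))
                (begin-block rest ...))]
        [(_ e rest ...) (begin e (begin-block rest ...))]
        [(_) (void)]))

    (begin-block
      (i := 0)
      (while (< i 3) do
        (displayln i)
        (set! i (+ i 1))))              ; prints 0, 1, 2

A real Pascal front end (records, pointers, procedures, its own types) is of course a much bigger job than this.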


It's possible, for instance, to write a program in C which compiles Scheme-formatted strings according to the Scheme spec. You use one; it's whichever Scheme compiler you use (therefore making your Scheme programs a "subset" of a C program). What's more, any Scheme program can be replaced by one written in C that runs faster and with less overhead.

Does this mean that C is "better by definition" than Scheme? No, both have their places. But any Turing-complete language with string handling is a "subset" of any other; that is an interpreter or compiler for that language can be written in any other language, so long as it's Turing-complete and can parse strings. If we look at which languages are "supersets" in practice, we find that C and C++ are the best languages ever by your theory, since many, many compilers and interpreters are written in them, while Scheme is a terrible language because compilers and interpreters are almost never written in it, even in most implementations of itself.

Your poor understanding of these pretty basic theoretical concepts and your silly fanboyism are why you're being downvoted.


Again... A compiler or an interpreter written in C is not the same as implementing a language on top of C, fusing it into all the existing infrastructure.

You did not understand what static metaprogramming is, or what an extensible language is, and yet you're talking about my poor understanding.

And, by the way, I fixed an inferior C. Any inferior language can be made superior by adding a tiny bit of compile-time Turing-completeness and a bit of compile-time reflection: https://github.com/combinatorylogic/clike

Made it equally powerful to Scheme, Forth, Nemerle, TH, C++ and the other proper meta-languages.


"Write a little macro which will translate transparently your Pascal into Scheme."

How little are we talking here?


Not very - Pascal has records, pointers, arrays, etc. You can do it, but you're writing a Pascal compiler and runtime system as Scheme macros. I don't find that a practical application of macros - and macros are powerful, and they let you do things in a Lisp that are painful in other languages.

I bowed out of this conversation because the poster was talking in absolutes, and if you're willing to go to the extreme of implementing a Pascal compiler and runtime system in Scheme macros, then I think other extremes are on the table that can give similar functionality. But I didn't see such nuance getting across.


Pascal is an extreme indeed: it is a large language and, as such, impractical. What is practical is embedding an imperative language with pointers and unmanaged memory. And the practical value of such a language inside a meta-host is mostly as a building block for DSLs, not as something that end users would face directly.

It's really hard to convey the gigantic difference in expressive power between the meta-languages and the primitive ones to those who were never even exposed to the higher-level methodologies and never built elaborate DSLs.

And no, you're wrong in assuming that there are "other extremes" providing comparable expressive power at no runtime cost. There are none, provably. You mentioned implementing compilers or interpreters in any Turing-complete language - but that won't solve the interoperability issue between the host, the newly implemented language, and any languages you'd build on top, which may need to borrow semantic building blocks from your new language. As soon as you start to address these concerns, you'll end up turning your host into a proper meta-language (as I did with C, for example).


Pascal was your original example upthread, though: "Now, the other way around: you've got an embedded Scheme, but you hate all those parentheses and want to just code in Pascal. Fine. Write a little macro which will transparently translate your Pascal into Scheme. No runtime cost whatsoever, and all the interop comes for free. See the difference now?"


Yes, it was my example - exactly because there are multiple implementations of this very thing; this thread is littered with links to them. My last remark was that full, 100% standard-compliant Pascal, although impressive, is not very practical, and a lower-level language is what is usually designed for this purpose.


OK, so I'm looking at https://github.com/soegaard/minipascal right now, and it looks like there are a couple thousand lines of Racket code that translate a very limited (no records, no reals, and apparently no pointers) Pascal-like language.

So presumably for this to be exciting, it would have to be executing in a larger environment where it's surrounded by Racket code doing its own thing. But now I don't see what's so special about that, and why it's different from, say, a Python script that takes a string of fake Pascal, turns it into Python through an internal processor, and then calls eval.


The value is in mixing the semantic properties of multiple (dozens of) languages in a single environment. Building the first fundamental blocks may take time - they are fundamental for a reason - but then adding new things on top is trivial.

So this is essential for building rich DSLs. See my framework for an extreme example of such an approach.
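A much smaller illustration of the claimed difference (using a hypothetical meters->feet form written for this comment, nothing taken from minipascal or from the framework above): a macro expands in place at compile time, so it shares the lexical scope of the surrounding Racket code, which is exactly what a string handed to eval cannot do.

  #lang racket

  ;; Hypothetical one-form "DSL": a unit conversion expanded into
  ;; ordinary Racket at compile time.
  (define-syntax-rule (meters->feet e)
    (* e 3.281))

  (define (room-area width-m height-m)
    ;; The embedded form sees the local bindings directly - no marshalling.
    (* (meters->feet width-m) (meters->feet height-m)))

  (room-area 3 4)   ; => about 129 (square feet)

  ;; The string/eval route runs in a separate namespace and cannot see
  ;; the local binding; calling this raises an undefined-identifier
  ;; error for width-m.
  (define (broken width-m)
    (eval '(* width-m 3.281) (make-base-namespace)))

That lexical sharing is what lets DSL fragments borrow each other's building blocks inside one environment, rather than talking across an interpreter boundary.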


>>EDIT: downvote count = a number of people who got no clue what meta language is.

No, people simply don't like "language fanboy-isms" here.

The reality is that no language is "superior by definition." Some languages are better at certain things than others depending on the task at hand, and many other factors.


> No, people simply don't like "language fanboy-isms" here.

I said specifically that any metalanguage would do, and there are a lot of them. I do not care which one to use.

> The reality is that no language is "superior by definition."

Then you do not understand what "superior" means.

> Some languages are better at certain things than others depending on the task at hand, and many other factors.

A language A which contains all the features of a language B, plus a bit more, is superior by definition, because it can seamlessly replace language B without anyone noticing.


"Features" are not the only things that matter when it comes to weighing languages against each other.

Is it unanimously agreed that C++ is "superior by definition" to C? What about Objective-C?


C++ is not, technically, a 100% superset of C; there are some really cool features in C that are missing from C++. If that were not the case, and if you take the runtime library overhead out of consideration, then yes, C++ would be superior in terms of expressive power, since it could do everything that C can, plus a bit more (and even a tiny bit counts).


It's HN; we know very well. We just don't agree with your neophyte enthusiasm (or the inaccurate description).


Evidently you've got no clue what I'm talking about.


Evidently you talk hyper-enthusiastically (and with the wrong terminology) about things even seasoned Lispers and Schemers don't, not to mention that your "magic bullet" praises are far from being computer science.

E.g. "any other language out there is just a subset of Lisp" is just a fun statement in a Greenspun's tenth rule way. You say it like you mean it, which means you don't understand the importance of syntax, semantics, type systems, and several more concerns (not to mention pragmatic issues like ecosystem and tooling).

You also wield "metalanguage" as if it's conventional CS wisdom that being a metalanguage is some magic-bullet trait that makes a language "superior simply by definition".

(Not to mention that you don't understand the definition of "metalanguage", which would also include things like ANTLR, m4, PEGs, transpiler generators, and such, none of which is in any way "superior" to a conventional language. Not sure where you picked the terminology up, probably some book or article, but it doesn't mean what you think it means.)


> Evidently you talk hyper-enthusiastically (and with the wrong terminology) about things even seasoned Lispers and Schemers don't, not to mention that your "magic bullet" praises are far from being computer science.

And again, evidently, you've got no idea what proper meta-languages are (i.e., ones featuring static compile-time metaprogramming with full, again compile-time, reflection). And yes, I'm perfectly aware that even the majority of Lisp users have no clue how to use metaprogramming properly.

> You say it like you mean it, which means you don't understand the importance of syntax, semantics, type systems, and several more concerns

Obviously you do not understand metaprogramming. I do not care at all about the syntax, semantics, type system and the rest of a host meta-language. I can build any combination of this stuff on top, easily. Of course, that only holds as long as the host language is not trying hard to break things - this is what separates a proper metalanguage from an inferior one.

> (not to mention pragmatic issues like ecosystem and tooling).

Again, you do not understand metaprogramming at all. Once you start parasitizing a given language, you get all of its ecosystem and tooling for free.

> You also wield "metalanguage" as if it's conventional CS wisdom that being a metalanguage is some magic-bullet trait that makes a language "superior simply by definition".

I gave a definition, if you did not notice. A meta-language is one which can be turned into any possible language, statically.

> which would also include things like ANTLR, m4, PEGs, transpiler generators, and such, none of which is in any way "superior" to a conventional language.

I gave my definition, which (conveniently) includes Lisp.

> Not sure where you picked the terminology up, probably some book or article, but it doesn't mean what you think it means

Not sure you know it better.


> you've got no idea [...] Obviously you do not understand [...] you did not notice

Please don't conduct programming language flamewars on HN. Discourse here must remain civil and substantive, even when the other person is also being rude (as in fact is the case). Please read https://news.ycombinator.com/newsguidelines.html.

You're being downvoted because you're breaking this rule, not because of your argument. This community is replete with lovers of Lisp, including the person who built HN and us who work on it, so that's hardly the issue.

If there's one thing we will never allow to happen, it's HN turning out like comp.lang.lisp.

Edit: I've detached this subthread and marked it off topic. Your thoughts on Lisp are welcome here, but please be respectful from now on. The Principle of Charity is what we shoot for in arguments here: https://en.wikipedia.org/wiki/Principle_of_charity.


>You're being downvoted because you're breaking this rule, not because of your argument.

If that were the case I wouldn't be getting upvoted (since I was, as you said, "also rude"), but I am. So I maintain that the downvotes on the parent were for the content (or for the content too), not merely the tone.

That said, I take some offense at being called "rude".

I merely replied in a much kinder tone to the parent comment that wrote "downvote count = a number of people who got no clue what meta language is". That initial tone provoked my response.

When that was shot down with "you got no clue either", I came back with further explanation -- to which I got his response that you commented on.

In any case, apart from the tone, for which I can apologize, do you agree, on the technical and CS side, with what the parent wrote?


I'd be happier to have this conversation privately but there isn't much of an option when accounts have no email address. I bet I could convince you of the below if we were sitting opposite one another over a fine beer, but this channel is pretty limited. But let me try.

I like and appreciate a great many of your comments and think you're a net positive contributor to HN, who has significantly enhanced the intellectual diversity of the site over the years (on a lot of things if not programming languages!). Unfortunately, your comments have also often been abrasive in a way that at best skirts the HN guidelines and sometimes plainly violates them. I wish you would work on eliminating that.

On a large public forum, when someone is a good writer and makes cogent points and is abrasive, it gives license to everyone who isn't a good writer and doesn't make cogent points to just (begging your pardon) pee in the pool. It's a destabilizing influence. Being right makes it worse.

Considering that what we're shooting for on HN is an internet version of reversing the arrow of entropy, i.e. hard, I wish you'd join in with that and help. It would benefit you too, since the community would suck less. Other users have done so. I'm one of them; I used to deliberately mix a little abrasion into my HN comments because I thought they would be lukewarm without it and I value lively language. But that was before I understood the dynamics of a large public forum. A large public forum has to be a bit bland because otherwise the riffraff will ruin it.

Re the current thread? I doubt it makes sense to litigate the details, but FWIW I at most partly agree with the author. I think Lisp metaprogramming has significant advantages, but he overstated his case, and that plus being rude, then lashing out at downvoters, is guaranteed to attract more downvotes. Your comments were less rude and advocated a more conventional view, so it isn't surprising that users preferred them. I mentioned the rudeness on both sides not to equate the two, but to point out that the HN guidelines still apply even when one is being provoked. In fact that's when they apply most. Otherwise everyone can invoke the "I merely" defense and down we all go.


>I wish you would work on eliminating that.

I'll give it a try!

(I'm from a place where we like our debates heated and our coffee iced :-)).


And yet you keep proving that you in particular still cannot understand what static metaprogramming is, even after all my patient, kind explanations. So I was right after all.


Yeah, yet another programming language discussion/argument.


> "Javascript would have been more familiar to a broader base of web industry developers, but I have basically no experience with javascript, and little desire to explore it."

That's not a good reason to give up on the most popular scripting language, which would engage more developers with VR.


> That's not a good reason to give up on the most popular scripting language, which would engage more developers with VR.

Maybe, but there are a lot of good reasons to avoid Javascript, too.

I've actually been part of a project that used Javascript as a scripting language inside non-web software. IMO a better language should be used whenever possible.

First, even today, 98% of the Javascript code and tutorials on the internet and in books are focused squarely on website scripting, and it's difficult to tell what's actually part of Javascript and what comes from the browser. So saying "there are a ton of Javascript resources available" ends up not meaning much, because they're all telling you how to manipulate CSS or navigate the DOM. The other 2% covers node.js, which is still not useful for embedded scripting.

Second, Javascript sucks as a language. There are all kinds of problems - weird conversion rules, the lack of real arrays (and I could keep going, but won't) - that end up making things more difficult than they need to be. It's a commonly held belief that it was unfortunate Javascript became the de facto scripting language of the web. No use propagating that mistake to other fields.

Third, there's no reason to believe that web developers familiar with Javascript will be particularly good at (or interested in) scripting VR systems or whatever.

Finally, a lot of existing Javascript code is just really, really bad. It may be the most popular scripting language, but I would wager large amounts of money that as a percentage, there's more flat out bad Javascript code than there is in any other language. Missing out on that part of the developer pool helps more than it hurts.

Besides all of that, Scheme isn't very hard to learn and is a far better language than Javascript. People turned off because they can't learn Scheme probably weren't going to contribute much anyway.


How about having those developers get off their asses and learn something new for a change? Seriously, JavaScript is not the best thing since sliced bread. People will code up support for it anyway, but I suppose Carmack is aiming for power instead of popularity.



