Wait, is it global by default (Lua, Bash) or truly dynamic? The latter would be kind of mind-bending to program with as the sole or default style. Was that ever a thing? Maybe I'm just too young to have experienced that.
I think Kernel is the other side of the extreme from Picolisp, since it wants all objects to be first class but wishes to maintain lexical scope information for all of them. I think this is hard because in a certain sense the names of things in a program have no natural correspondence to the meaning of the program from the point of view of a compiler writer in particular. Code calculates a value using values or changes the state of memory or however you want to conceive of it. The names one used to tell the compiler how to do that don't have any obvious relation to the transformation and keeping them around so that the programmer can meta-program is complex and makes the generated code slower. In a way, Common Lisp and Scheme seem like two distinct local maxima, out of which I prefer the latter. Kernel is neat though.
Kernel is "mostly" just lexical unless you explicitly opt out with FEXPRs. FEXPRs are what draw most people to Kernel.
However, what is probably more important, though it doesn't immediately stick out until you poke at Kernel a lot harder, is the fully reified "environments". "Environments are copy-on-write and don't destroy older references" has very subtle consequences that seem to make dynamic scope a lot better behaved.
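A minimal Python sketch of that property (my own illustration, not Kernel's actual machinery): each environment owns a local frame chained to its parents, and defining a name only ever touches the local frame, so an environment captured earlier keeps seeing its old bindings even after a child shadows them.

```python
class Env:
    """Toy first-class environment: a local frame chained to parent
    environments. define() only ever writes to the local frame, so
    shadowing in a child never destroys what older references see."""

    def __init__(self, parents=()):
        self.frame = {}
        self.parents = tuple(parents)

    def define(self, name, value):
        self.frame[name] = value          # never mutates a parent frame

    def lookup(self, name):
        if name in self.frame:
            return self.frame[name]
        for p in self.parents:            # depth-first search of parents
            try:
                return p.lookup(name)
            except KeyError:
                continue
        raise KeyError(name)

ground = Env()
ground.define("x", 1)
child = Env([ground])
child.define("x", 2)          # shadows x without touching ground
old_view = ground.lookup("x") # 1 — the older reference is unaffected
new_view = child.lookup("x")  # 2 — the child sees its own binding
```

Anything holding `ground` can trust its bindings regardless of what descendants do, which is the "explicit signal that it can always be compiled" mentioned below.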
This also has the consequence that I can force things to explicitly evaluate in reference to the "ground environment" which is an explicit signal that it can always be compiled.
I suspect there is a lot of fertile research ground here. In addition, there is a lot of implementation subtlety that I'm not sure Kernel's author really grasped. Environments need a special data structure, or they cons up an enormous amount of garbage (I suspect they really need a bitmapped vector trie, like Clojure's persistent vectors).
People talk as if dynamic scoping was objectively a mistake, but the fact that it works well and is really useful in a complex piece of software like Emacs seems to suggest otherwise.
The original opposition to lexical binding in Lisp circles was that lexical binding would be slower. That turned out to be false.
Emacs Lisp explicitly kept dynamic binding for everything because it made for simpler overriding of functions deep inside the editor, but this resulted in lower performance and various other issues, and ultimately most of the benefit of such shadowing is now the focus of defadvice and the like.
I can understand why that objection would be raised, because lexical binding is slower in code that is interpreted rather than compiled, compared to (shallow) dynamic binding. Under shallow dynamic binding, there isn't a chained dynamic environment structure. Variables are simply global: every variable is just the value cell of the symbol that names it. The value cell can be integrated directly into the representation of the symbol, and so accessing a variable under interpretation is very fast, compared to accessing a lexical variable, which must be looked up in an environment structure.
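A sketch of shallow binding in Python (illustrative names, not any particular Lisp's internals): the symbol itself carries one value cell, so a variable reference is a single slot read, and a binding form just saves the old cell contents on entry and restores them on exit.

```python
from contextlib import contextmanager

class Symbol:
    """A symbol with its value cell embedded directly in it."""
    def __init__(self, name, value=None):
        self.name = name
        self.value = value    # the one value cell for this variable

@contextmanager
def dynamic_let(sym, value):
    """Shallow dynamic binding: stash the old cell value, install the
    new one, and restore the old one on exit. No environment chain."""
    saved = sym.value
    sym.value = value
    try:
        yield
    finally:
        sym.value = saved

x = Symbol("x", 10)

def reader():
    return x.value            # O(1): just read the symbol's cell

with dynamic_let(x, 99):
    inner = reader()          # 99 — the callee sees the dynamic binding
outer = reader()              # 10 — the old value is restored
```

Lookup cost is constant no matter how deep the call stack is; the price is paid at bind/unbind time instead, and the binding is visible to every callee in the dynamic extent.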
A rather weak argument when you consider what kinds of mechanisms (like a digital clock with a working seven-segment display) people have put together in Conway's Game of Life; to me this does not suggest in any way that GoL could ever be my favored platform for simulating a digital clock (or anything more complex than a glider, for that matter). Likewise, vacuum cleaners and toothbrushes have likely been made hosts for playing Doom, and people accomplish all kinds of stuff like quines and working software in brainf*ck. None of these feats is indicative of the respective platform being suitable, or the right tool, for a sizable number of programmers.
As someone who has used it in languages like Clipper and Emacs Lisp, and the ADL rules in C++ templates: cool for making programming tricks of wonder, a pain to debug when something goes wrong several months later.
Few deny the utility of dynamic-style variables for certain kinds of programming. But it can be helpful to segregate that behavior more carefully than in a language where it is the default.
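One way to see what "segregating that behavior" can look like: Python's contextvars makes a variable dynamic only if you explicitly declare it so, and rebinding goes through a dedicated set/reset protocol rather than ordinary assignment.

```python
from contextvars import ContextVar

# Explicitly declared as a dynamic-style variable; ordinary locals and
# globals in the same program remain lexical as usual.
indent = ContextVar("indent", default=0)

def render(msg):
    # The callee sees whatever binding is in effect dynamically.
    return " " * indent.get() + msg

def nested():
    token = indent.set(indent.get() + 4)   # rebind for this dynamic extent
    try:
        return render("inner")
    finally:
        indent.reset(token)                # old binding restored on exit

top = render("outer")     # "outer"     — default binding
deep = nested()           # "    inner" — rebound during the call
after = render("outer")   # "outer"     — rebinding didn't leak
```

The opt-in declaration plays the same role as `defvar`/special declarations in Common Lisp: the reader of the code can tell at a glance which variables are dynamic.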
> PicoLisp uses dynamic binding for symbolic variables. This means that the value of a symbol is determined by the current runtime context, not by the lexical context in the source file.

> This has advantages in practical programming. It allows you to write independent code fragments as data which can be passed to other parts of the program to be later executed as code in that context.
This amuses me, because while it's technically true, this amazing feat is accomplished only by denuding the code of substantial expressive power, namely the relation between the lexical denotation of the code and its meaning. I will say this: aesthetically, I prefer PicoLisp's approach to Common Lisp's, which is to just paper over this problem with gensyms, packages, etc. Give me hygienic macros or give me death.
Gensyms and packages are not required to make lexical scope work. Macros in an unhygienic macro system use these internally so that their expansions don't have unexpected behaviors in the scope where they are planted. The problems avoided by gensyms or packages affect both dynamic and lexical scopes. A dynamic variable can be wrongly captured by an internal macro variable, not only a lexical one.
It may be there are solutions favored in Picolisp without using macros that would be done using macros in idiomatic Common Lisp, and so those solutions don't need gensyms and whatnot.
My point is only that unless you are using a hygienic macro system, the idea that you are manipulating code in your macro is a (often white) lie. Code has semantics, a meaning, and unless the object you manipulate carries those semantics with it (that is, the syntax objects of e.g. `syntax-case`), you're just manipulating some data which has a necessarily superficial relationship with the code itself. PicoLisp resolves this by simply "eliminating" lexical scope, which means that code really is trivially related to its denotation, since the semantics of variable binding really are just "whatever is currently bound to this variable." Scheme resolves this by having syntax transformations instead of macros: functions which genuinely manipulate syntax objects that carry along with them, among other things, information about their lexical context. Common Lisp accepts that most of the issues arising from the distinction between code itself and its nude denotation can be worked around, and provides the tools to do that, but in Common Lisp one still transforms the denotation of the code, not the code itself. From my point of view, if one is purely interested in the aesthetics of the situation, the Scheme approach is much more satisfactory. From a practical point of view, it doesn't seem to be particularly onerous to program in, although Scheme macros seem to lack the immediate intelligibility of Common Lisp ones.
You are manipulating fragments of source code in a macro. Material in which tokens have been converted to objects and which has a nested structure. So, nicer than textual source code.
I mean, yes and no. In a CL macro you are manipulating lists of symbols and other atoms, and in a sense that is code. But code has some static properties (of which lexical binding is one) which are not reflected in that structure and which you can break pretty easily in a CL macro. A Scheme syntax object carries the lexical information which is so critical to the meaning of the code, and because it does, it is much harder to accidentally manipulate the code in such a way that its meaning changes. It is exactly the static lexical binding semantics of Common Lisp which introduce the conceptual tension in macro programming that requires the programmer to manually worry about gensyms. Because PicoLisp lacks lexical binding, manipulating code lacks this complication (and, in fact, the complication of a macro system almost reduces to a trivial combination of quotation and evaluation).
Programmers say that they are manipulating code when they go "vi foo.c" at their Unix prompt, so that's a bit of an upstream rhetorical paddle.
> It is exactly the static lexical binding semantics of Common Lisp which introduce the conceptual tension in macro programming that requires the programmer to manually worry about gensyms.
A dynamically scoped Lisp (like Emacs Lisp by default) with those kinds of macros needs gensyms all the same. It isn't the lexical scope.
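A structural way to see this, sketched in Python with nested lists standing in for S-expressions (my_or and this gensym are illustrative, not any real Lisp's implementation): an unhygienic macro is just a function from code to code, and the expansion physically places the user's subforms under the macro's own binder. If the names collide, the capture happens whether variables are later resolved lexically or dynamically; gensym avoids it by making the binder's name unique.

```python
import itertools

_counter = itertools.count()

def gensym(prefix="g"):
    # Produce a name no source text will contain (Lisp gensyms are
    # uninterned symbols; a funny string is enough for this sketch).
    return f"#:{prefix}{next(_counter)}"

def my_or_naive(a, b):
    # Expansion of (my-or a b) => (let ((tmp a)) (if tmp tmp b)).
    # The user's form b lands underneath the binding of "tmp".
    return ["let", [["tmp", a]], ["if", "tmp", "tmp", b]]

def my_or_hygienic(a, b):
    tmp = gensym("tmp")
    return ["let", [[tmp, a]], ["if", tmp, tmp, b]]

user_b = "tmp"                  # the user happens to use "tmp" too
bad = my_or_naive("nil", user_b)
good = my_or_hygienic("nil", user_b)

binder_bad = bad[1][0][0]       # "tmp" — same name as the user's variable,
                                # so the user's reference is captured
binder_good = good[1][0][0]     # "#:tmp…" — cannot collide with user code
```

Nothing in this demonstration mentions scope discipline at all: the collision is a fact about where the pieces of code physically end up, which is why a dynamically scoped Lisp with expansion-style macros needs gensyms too.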
When we have (let ((x 1)) (+ x x)), then regardless of whether x is lexical or dynamic, there is a lower level of binding going on. The x in (+ x x) physically belongs to the enclosing (let ...). That is not lexical scope; it's a fact about the position of the code pieces regardless of x being lexical or dynamic.
This is why in that strategy for implementing hygienic Scheme macros that you're alluding to, syntax objects, there is a different kind of closure at play: the syntactic closure. It is not a lexical closure.
The syntactic closure doesn't say that "x is bound as a variable". Only "this x expression is meant to be enclosed in this code".
Picolisp doesn't run into hygiene issues requiring gensym because it doesn't perform macro expansion:
If you don't have a code manipulating process that invisibly transplants pieces of code from here to there, then of course you don't have the issues which that entails.
You'd probably be the only person using an IDE for development in Picolisp.
The main author does (or at least did) a lot of development on a tablet, with his own software keyboard (https://play.google.com/store/apps/details?id=de.software_la... , which I've enjoyed for years on my handhelds, in part due to the tmux-arpeggio), and his own editor (https://picolisp.com/wiki/?vip ). I think most of us do something similar, using vim or vip, maybe on larger computers, but generally a pretty minimal setup.
The REPL has string-based completion; besides symbols, it will also complete file paths. Development is heavily REPL-based: you'd spend a lot more time inspecting the runtime than searching for string occurrences in files.
From the REPL you'd also read the language reference, most likely in w3m, the preferred text web browser in this community. (doc 'macro) will open the reference on this entry if you started the REPL with 'pil +', where the + is a flag denoting debug mode. You can expect the web GUI framework to work rather well in w3m.