"The best description of the LISP programming language is that it is a high level machine language. That is, it shares many of the facets of contemporary machine language --the necessity for attention to detail and the freedom to manipulate the machine's data and programs without restriction-- yet LISP is high level in that the language contains the expressive power and convenience of traditional high level languages. The contradiction is resolvable: a LISP machine is just a higher level machine whose data items are organized differently from the binary bit patterns of most machines, and the LISP programming language is the assembly language for this machine."
Consider the Emacs Lisp (Elisp) interpreter, for example. The Elisp interpreter is the Lisp machine: it understands Elisp symbolic expressions, the language of this machine. With enough code written in this machine's language, we get the fine editing and productivity software known as Emacs!
It is mainly languages built around AOT compilation, like C, Pascal, and FORTRAN, that completely separate code and data. (Though I guess you can make a C program that writes a C program, runs "cc" on it to make a shared object file, loads the library dynamically, and calls a function from it.)
which I might finish up when I'm done with the report I'm writing. It is possible to create Java expression trees with trees of static method calls that look a lot like S-expressions and stick them together into statements, methods and classes.
You should be able to do the same tricks people do with LISP macros, and it could work code-generation miracles, but it would have that "LISP curse" problem in spades.
The plan is to generate a code generator that is sufficient to generate the full DSL implementation (ferocity0) and use that to generate the full implementation (ferocity).
I have some tests for ferocity0 writing .java files to get fed to javac, and for ferocity0 running expression trees with a primitive interpreter. Already the type system is enriched over Java's own type system, because interpreted ferocity0 can handle Java expressions as a type at run time, so you get issues like quoting and unquoting in LISP.
Would also make the lives of time travelers much easier.
You can also use the new sb-simd to play around with vectorization in the interpreter. Okay, you'll break something, but it's such joy while you do it. C/C++ and other "strictly-separated compilation" languages OTOH seem so dogmatic and stupid now. It's no surprise that development is so painful in the latter. Also not surprising that the world prefers them.
"LOGO is, up to surface structure, more or less equivalent to LISP."
It further says,
"The LOGO system supports two different (by no means disjoint) environments: the Turtle, Graphics and Musicbox world (ie: peripheral devices which are controlled by a command language) and the LISP world."
And then in a later bullet point on page 22,
"Our experiences, especially with young students, indicate that programming in LOGO may serve as a bridge between natural language communication and reasoning and the formal and abstract symbols and reasoning in mathematics and programming languages."
This bullet point ends with,
"Our findings can at least be partly explained by the cleanliness by which the basic computational ideas are embodied in LISP/LOGO."
In my own life, I was fortunate to have stumbled upon Logo as my first computer programming language. The simplicity and elegance of Logo had a powerful effect on me at a very young age. It immediately turned me into a computer programmer for life!
Three years later Byte ran a special issue on LOGO which I thought was one of the best issues of Byte ever
in particular it had some great reviews of LOGO implementations for different home computers, some of which were pretty strange, like the TRS-80 CoCo and the TI-99/4A. It was very close to the cultural peak of the 1980s, before Michael Jackson dropped off the charts, the Atari 2600 went down without being immediately replaced (unless you count the C-64), etc. Byte was a lively magazine so long as the market was split up between a wide range of computers, but it never really found its niche in a world dominated by the IBM PC at the high end and the C-64 at the low end.
Speaking of which, LISP never really caught on in the 8-bit era. This issue has some articles about how you would do so, but it didn't seem to shoehorn into a tiny machine (like the 1 K of RAM on the Sinclair ZX80) as well as BASIC did. From the viewpoint of a kid who just learned BASIC, FORTH seemed to offer a lot of what LISP did and it was very available, even if it was a few weeks of assembly coding to write a FORTH.
Like how we are all aghast at Python now. (All us Lispers, anyhow :-)
Hacker News discussion of BYTE's special Logo issue: <https://news.ycombinator.com/item?id=28603556>
>In my own life, I was fortunate to have stumbled upon Logo as my first computer programming language. The simplicity and elegance of Logo had a powerful effect on me at a very young age. It immediately turned me into a computer programmer for life!
I well remember the epiphany I felt while learning Logo in elementary school, at the moment I understood what recursion is.
While I have never worked as a professional software developer, computers have been a hobby all my life. I don't think the fact that the language I have mostly written code in in recent years is Emacs Lisp is unrelated to the above moment.
Yep, this was 100% me too.
I "got" recursion through math and induction and a bit of graph paper, and the way it made my mind recurse to infinity made me feel ... funny ... at that age.
But it wasn't until I used Logo on the school's only Macintosh LC475 that I got the full psychedelic effects.
Nothing as intricate as NetLogo (e.g. see its Koch curve here: http://www.netlogoweb.org/launch#http://ccl.northwestern.edu...), but still a strong impression.
Funnily, a consequence of that was falling in love with iterated function systems and pizzabox-form desktops, which led to me being introduced to Matila Ghyka's book, and to Sun's SPARCstation 5.
Whelp, the lack of an Oxford comma there really threw me for a loop until I continued reading. Anyway...
> "Our experiences, especially with young students, indicate that programming in LOGO may serve as a bridge between natural language communication and reasoning and the formal and abstract symbols and reasoning in mathematics and programming languages."
If anyone is interested in this, there are the following books:
* Exploring Language with Logo: https://www.amazon.com/gp/product/0262570653
* Visual Modeling with Logo: A Structured Approach to Seeing: https://www.amazon.com/gp/product/0262530694
* Turtle Geometry: The Computer as a Medium for Exploring Mathematics: https://www.amazon.com/Turtle-Geometry-Mathematics-Artificia...
* Computer Science Logo Style: http://people.eecs.berkeley.edu/~bh/v1-toc2.html
It's disappointing to me that Logo died out and didn't remain around or evolve, and I find it a bit sad that many kids are getting introduced to programming via something like Python or Scratch. I feel there's still space for Logo, and it would be cool to see an easily downloadable and installable version of it.
(I've only glanced through CS Logo Style and haven't seen the other two.)
> A dozen articles about the language, with listings. The screenshots? All the same, showing recursive pictures of rectangles and circles. Great. LOGO can do that. But what else? Big empty void there.
E.g. the last chapter is an intro to general relativity, with a simulator for motion in curved spacetime.
(CS Logo Style also covers many topics, but it looked like they were all familiar to me as an experienced programmer. I haven't seen another book for programmers about most of the math in Turtle Geometry.)
Take a look at that issue of BYTE entirely dedicated to LOGO that was referenced multiple times in this discussion.
A dozen articles about the language, with listings. The screenshots? All the same, showing recursive pictures of rectangles and circles. Great. LOGO can do that. But what else? Big empty void there.
Contrast that with what BASIC could do at the time... Is it any wonder LOGO died out and BASIC thrived?
AIUI, implementations of BASIC on home computers were a lot simpler and more straightforward than LOGO or LISP - for instance, GC in BASIC was an afterthought and only applied to strings. The real competitor to BASIC back then was FORTH.
I grew up and learned programming in that era, bought magazines, typed pages and pages of listings.
98% of the listings in these magazines were BASIC, 1% were assembly, and the rest was... well, others.
LOGO was a niche language then, and Forth even more so.
See my comment upthread about Turtle Geometry.
But it did remain around and evolve.
See, e. g.:
StarLogo Nova: https://www.slnova.org/
There is so much good info packed into this one issue, it is worth terabytes of crap on stack overflow. Clearly there was less information back then, but it was far higher quality.
I'd like to understand how C came to dominate the world when LISP could replace all of the scripting languages we use today. Was it tooling on cheaper machines that made C so popular? Or that it was closer to ASM than LISP? I never really formed a good opinion on this; I'm missing lots of history.
I strongly suspect Microsoft was the catalyst which propelled C, quite apart from Unix.
Microsoft had been infatuated with Unix from early on. They had their own version called Xenix, based on AT&T licensed code. Part of the MS-DOS API consists of "Xenix functions", imitating some Unix-like things. MS-DOS, and consequently Windows feature Unixy conventions like .. for the parent directory and . for current, even though the underlying link concept isn't there.
Microsoft produced a C compiler + IDE which I believe highlighted and legitimized C as a viable language in the mass market world of IBM PC compatibles.
According to the Wikipedia page on Microsoft Visual C++, Microsoft already had a C compiler out in 1983.
"Microsoft C 1.0, based on Lattice C, was Microsoft's first C product in 1983. It was not K&R C compliant."
Microsoft picking up Lattice C and running with it is probably what caused the C explosion. Borland joined in the fray in 1987 with Turbo C.
But wait, Microsoft had a Lisp product too: why didn't that help?
[plain HTTP!] http://www.edm2.com/index.php/Microsoft_LISP
Probably they just didn't peddle it hard enough.
Microsoft also used C for developing Windows, and supported C development for Windows:
"Microsoft sold and included Windows development libraries with the C development environment, which included numerous Windows samples."
The rise of MS Windows dragged C with it, with the APIs and code samples being expressed in C.
Plus, it was probably not lost on the PC programming population that if they learn and work with C on that platform, their skills will be applicable in Unix and elsewhere.
Until Visual C++, almost no one would bother buying Microsoft compilers, going instead for Borland and Watcom offerings on MS-DOS, which was fully written in assembly.
In the MS-DOS days, C was yet another slow high-level language, sharing attention with Modula-2, Pascal dialects (mostly QuickPascal, TMT Pascal and Turbo Pascal), BASIC (QBasic, Turbo Basic and QuickBasic), and Clipper/Paradox for DB stuff.
Anyone that cared about performance was using TASM and MASM for most of their coding.
When Windows finally became usable, with version 3.0, anyone who cared about productivity was buying Borland C++ for Windows, using the Object Windows Library, or Microsoft's Visual Basic; only masochists were using pure C with the Win16 SDK.
Only the release of Visual C++, alongside MFC, made most devs take a second look at Microsoft's own compilers, and then, with Borland's mismanagement, devs eventually jumped ship from what were then Delphi and C++ Builder to Visual C++.
Microsoft was never a big C shop, as proven by the decision to ignore C past C89, a decision they only backtracked on (sadly) due to customer pressure.
LISP would've been a total non-starter for serious use, given its requirement for GC and its interpreted implementations. Quite comparable to the dog-slow Java applets of the mid-1990s, and in fact even worse on early hardware.
The folks I always had mad respect for were those who could write in assembly on their Commodores, etc… I never had one of those machines but I still bow my head to those who could write anything meaningful in pure assembly as a teenager in the pre internet era when you had to figure out so much more on your own.
Your point still stands, but the actual substance is a lot smaller than you describe.
But back then... man, these ads were cancer and they made up 80% of the weight and price of the magazine.
One interesting point about the ads is that the time it took for a new technology to appear in the ads was usually much shorter than for it to appear in articles. In that sense, the ads were opportunities to peek into the near future (and, in the case of vaporware, alternate timelines).
I like the way magazines like BYTE are preserved and I hope we preserve our current tech as well.
C could more easily access the lower level resources as well.
I still wonder why one of these (like Forth, Lisp, or later Python) did not become the standard command-line interpreter, whereas it would never have been C, and I have written very full-featured C interpreters.
> "The author's system uses the pointer reversal method, and he will testify to the unlimited number of obscure problems which can appear during the debugging phase of its implementation."
(I note that HN frequently has posts on Lisp as well as 8-bit systems.)
Other than HN, is there anything equivalent to BYTE in the modern era?
I do like magazines like Linux Format and RasPi, but they're focused on Linux and Raspberry Pi, whereas BYTE seems to have covered all "small systems" from microcontrollers to multiuser systems (so both Linux and Raspberry Pi would be in scope, as would Arduino, as well as Apple/Microsoft/Android/etc.). This issue also included a wide range of contributors, from enthusiast developers to industry professionals to teachers and researchers.
Covid, my time in this house, and the people I was with, gave me a chance to explore Scheme and CLOS derivatives (GOOPS). It's a joy to work with, and it helps me understand Ruby and other later developments. The different hooks it gives you into your hierarchy really map out the space that's possible.
It would be nice to lose myself in BYTE. It was influential. But it feels wrong somehow? Maybe that's on me. Maybe someday I'll write a BYTE random-page generator service.
But I tend to agree - the biggest advances seem to be in hardware: now you can run Unix on an internet-connected watch, voice recognition/translation/automatic transcription on a mobile phone, and real-time ray tracing on a game console.
C++ improvements are related to the bare language itself; in regards to tooling, no one seems to have cared to go beyond what something like C++ Builder has been doing for the last 30 years (more or less).
Python improves on BASIC, while still missing the compilation story from most BASICs.
Diving into Xerox PARC documentation about Interlisp-D, Smalltalk, Mesa XDE and Mesa/Cedar is quite revealing in what we already had 40 years ago, and how long it has taken to bring those ideas back into mainstream.
Byte Magazine – LISP (1979) - https://news.ycombinator.com/item?id=20008908 - May 2019 (67 comments)
BYTE Magazine's Lisp issue (1979) [pdf] - https://news.ycombinator.com/item?id=15033439 - Aug 2017 (151 comments)
The book traces out what are effectively conversations across decades (or centuries if you include Aristotle). So when you see McCarthy name drop Church's lambda calculus, you know how it ties into a conversation about Hilbert's decidability problem, and that self-reference ("recursion") is a fundamental tool.
And from the perspective of now, you see that this is the first encounter with automatic garbage collection, describing a simple mark-and-sweep algorithm. The BYTE issue covers a more advanced garbage collector, Lambdino, which assumes far more familiarity with LISP and its internals than the previous article comparing LISP and LOGO.
> Returning to the LISP theme of our current issue, VisiCalc is an example of a tree-oriented parallel data structuring problem for which LISP is a most appropriate language of expression. Due to a lack of availability of LISP as a software development tool for personal computing hardware, its authors did not use LISP. They also had to make a number of compromises and tradeoffs as a result of the small size (eg: 16 K to 48 K bytes) of the main memory of personal computers. But they did use many of the tree concepts of artificial intelligence research. This provides us with the ultimate example of the relevance of LISP-like languages and approaches to personal computing: one of the most generally useful new user software tools for small machines, VisiCalc, tackles just the sort of problem for which LISP is an appropriate tool of expression.
> Coupled with the new low cost, high density memory devices with 64 K bit capacity and with even greater density coming, the personal computer will attain or exceed the power of an IBM 360 Model 30 within the next decade.
Wow. We've come a long way...
Page 177 had an advertisement mentioning Alpha Micro Systems, where I worked at the time.
> We do not want to give the impression that all interesting uses of computers are centered around LISP. Some of the most innovative work was done by the Learning Research Group at Xerox Research Center in their development of the Dynabook and the Smalltalk language.
and the rest is history
Conveniently, some of those are on archive.org as well, e.g. https://archive.org/details/Radio-Craft . (Cheat code for search purposes: gernsback.) March 1949 is especially awesome, with articles by Sarnoff and De Forest on the Next Big Thing of the day (television) and, as an afterthought towards the back, an article on NIST's first atomic clock.
1) I should point out that my parents never bought into that whole D&D is evil craze. Star Frontiers was "cute" though.
For a mere $1595, you got a 16 K computer with Microsoft BASIC, but the best part is that it came with 2 Z80 chips. Not sure how much a Z80 chip cost, but surely the user would have preferred an extra 8 K or whatever. The other best part is that you got to learn how the computer works by building it.
You can see why user testing is so important.
> A major advance was announced in a press release dated October 25, 1975 in which the young specialty firm offered a relatively new and promising concept. The Hypercube was advertised as a four dimensional arrangement of dual 8080 processor "nodes" configured in 2x2x2x2, 3x3x3x3, and 4x4x4x4 arrays, with each node capable of communicating, via shared memory, with 8 adjacent nodes. This arrangement provided for the first processor in each node to handle system overhead and communications tasks while the second was left free to execute user code. The operating code was to be stored in ROM, and the total system promised unparalleled processing power at a fraction of the cost and overhead of mainframe machines from IBM, Honeywell, Boroughs, and other giants of the period. The advertised price of these three offerings was $80,000 for the Hypercube II, $400,000 for the Hypercube III (about 1/10th the cost of an IBM 370-168), and $1,280,000 for the Hypercube IV which was to be released in the second quarter of 1976. The concept was legitimized by publication in the December 11, 1975 issue of ELECTRONICS magazine. Ultimately, the U.S. Navy ordered a Hypercube II for installation in Huntsville, Alabama.
When did HD crashes go away? I still suffer through them.
Like stacks of thoughts that got played and worn
Used over and over till they were tired and torn
In my quick view, this is more cleaned up and easier to read at smaller sizes. (You might lose some of that "old magazine" feel, if you prefer that, though.)
But the "No Loose Ends" ad is pretty good. Probably the most modern sort of advertising, i.e. based on an emotive pull, a dream, rather than everything else that uses tech-spec numbers, bigger is better.
By the time I started reading Byte at the public library (1980) there were a few computers that were mass market like the TRS-80, Apple ][, Commodore PET, TI-99/4A, etc. Mass market computers were talked about a lot in the editorial in Byte but Byte was also full of ads for more exotic machines aimed at OEMs, for instance to build a cash register system for a supermarket. Cromemco, for instance, advertised harder than anybody, but it was rare to see Cromemco and other exotics talked about in the articles.
And this in an era where there was no such thing as package tracking. You got your stuff when you got your stuff.
I've ordered a few things mail order this year because they were not available any other way. The wait doesn't bother me. Order confirmation comes when the bank tells me the check has been cashed. But the lack of package tracking causes mild anxiety.
Also, some hand-made things. Custom paper and such.