Hacker News new | past | comments | ask | show | jobs | submit | linguae's comments login

Same here; I've been fascinated by Lisp machines for the past decade, but they are very difficult to get a hold of, and when they do show up for sale, they are prohibitively expensive. When I went to the now-defunct Living Computer Museum in Seattle back in 2019, I saw that there were no Lisp machines available, though I did get to see and use other rare machines such as the Xerox Alto, the Apple Lisa, and the original NeXT cube (I'm glad I finally got to add one to my collection a few years ago). The MIT CADR software has been open source for quite some time (https://tumbleweed.nu/lm-3/), and I'm glad that Xerox Interlisp-D is now open source (https://interlisp.org/). However, the holy grail of Lisp machine environments, Symbolics Genera, is still not available as FOSS. Funnily enough, this is fitting, since Richard Stallman's frustrations with Symbolics were one of the catalysts behind his starting the GNU Project.

One of the interesting "what could have been" moments of computing history is Apple's exploration of Lisp in the late 1980s and during the first half of the 1990s. Such projects include:

- Apple's original plans for the Newton, which included an OS written in Lisp.

- The Dylan programming language, which I've heard can be thought of as Scheme with the Common Lisp Object System. Dylan was originally designed to be the official language used to develop Newton apps. Originally Dylan had an S-expression syntax, but this was changed to a more Algol-like syntax due to the prevailing opinion among many that an Algol-like syntax would be easier for C/Pascal/C++ programmers to adopt. However, the Newton ended up using a C++-based operating system, and NewtonScript, which wasn't based on a Lisp, was created as the language for developing Newton apps.

- SK8 (https://en.wikipedia.org/wiki/SK8_(programming_language) ), which was dubbed "HyperCard on steroids," was written in Common Lisp.

In an alternate timeline, we could've been using Apple devices built on a Lisp foundation. This is probably the closest we've gotten to Lisp machines on the desktop, as opposed to specialized AI workstations that cost five figures in 1980s dollars.

Then again, things worked out well for Apple after the NeXT purchase. The OpenStep API (which became Cocoa) and Objective-C were (and still are) solid infrastructure that can be thought of as a "pragmatic Smalltalk" desktop, though in recent years I feel Apple has been moving on from NeXT influences and is doing its own thing now with Swift.


> had an S-expression syntax, but this was changed to a more Algol-like syntax due to the prevailing opinion among many that an Algol-like syntax would be easier for C/Pascal/C++ programmers to adopt.

They can be forgiven, at the time. Now we have evidence that that thinking is wrong. Today, everyone and their dog writes Web code using a syntax originally chosen by very technical systems programmers. (Bell Labs researchers -> C -> Oak -> Java -> JavaScript.)

Almost no Web developers are systems programmers, and this is just a poor syntax for the work, and needlessly cryptic, but they can pick up even this bad syntax just fine.

Now we know that whether you use curly braces, parentheses, whitespace, or something else is not the barrier to learning a programming language. It's one of the most trivial things about a language to learn.

Knowing this, the next time I hear someone say "We can't use this syntax, because it will just totally break people's brains, even though the higher grammar, semantics, libraries, domain frameworks, and everything else are different anyway, and are orders of magnitude harder to learn, we need to make it look superficially like something it's not, because people are full of poo"... I'm ready to appropriate the Lily Allen song: https://www.youtube.com/watch?v=KUHqFhnen0U


> (Bell Labs researchers -> C -> Oak -> Java -> JavaScript.)

It seems like JavaScript inherits more directly from Lisp than any of the other languages mentioned, except for the syntax. As a result, JavaScript is a Lisp without macros, which is a sad concept. (In this sense, I very much agree with you.)
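As a rough sketch of that Scheme lineage (hypothetical illustration, assuming nothing beyond standard JavaScript): closures and first-class functions behave much as in Scheme, but there is no macro layer for abstracting over the syntax itself.

```javascript
// Closures capture their environment, Scheme-style; what JS lacks is any
// way to define new syntactic forms, which is what macros would provide.
function makeCounter(start) {
  let n = start;
  return () => n++; // a closure over `n`
}

const next = makeCounter(10);
console.log(next(), next()); // prints 10 11
```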


If you're already a seasoned programmer, maybe different syntax causes little friction.

But for people new to the craft, syntax matters: Chris Okasaki found that the one thing that helped students get over the hump and really start to understand scopes and blocks was significant whitespace.


Matthias Felleisen, et al., also found that syntax is one of the barriers to new students with zero prior programming experience.

Strangely enough, they found that Lisp syntax was easier to pick up, because it was simpler. (In general, the first word in the parentheses tells you what to do with the rest. No other punctuation to remember, no precedence parsing rules, etc.)
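That "first word tells you what to do with the rest" property can be shown with a toy s-expression evaluator (a hypothetical sketch, not code from the study): the head of each nested list names the operation, so no precedence rules are needed at all.

```javascript
// Represent (+ 1 (* 2 3)) as ["+", 1, ["*", 2, 3]]: nesting alone encodes
// what infix notation needs precedence rules for.
function evalSexp(expr) {
  if (!Array.isArray(expr)) return expr;   // numbers are atoms
  const [op, ...rest] = expr;              // head names the operation
  const args = rest.map(evalSexp);         // evaluate operands recursively
  switch (op) {
    case "+": return args.reduce((a, b) => a + b, 0);
    case "*": return args.reduce((a, b) => a * b, 1);
    case "-": return args.reduce((a, b) => a - b);
    default:  throw new Error(`unknown operator: ${op}`);
  }
}

console.log(evalSexp(["+", 1, ["*", 2, 3]])); // 7
```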

We're usually not developing languages for people with zero experience, but if someone wants to twist my arm to use Lisp syntax...


Racket is now getting a new, non-s-expression syntax. The Lisp syntax is said to be a problem hindering wider adoption.

*Some* people behind Rhombus think that Lisp syntax is a problem - we will see. My prediction is that it will not even leave a dent - people get all kinds of strange ideas without having any kind of data to prove their claims. To me it looks like just another Python:

https://docs.racket-lang.org/rhombus/index.html


Python with extensible syntax and not afraid of functional programming.

Back in the day, Lisps and Schemes faced all sorts of excuses for not being used: too slow, GC is slow, too big, no libraries, too old, etc. etc. etc. All reasons that had some merit at one time.

But today, all of those excuses have been gone for a long time.

Clojure is as mainstream and box-checking as you can get, to say nothing of the zillion other projects out there.

And yet.

No great renaissance. Still talked about in hushed tones. "Only those snobby hacker guys use that."

And what single thing has remained constant and controversial about Lisps?

The syntax.

JavaScript demonstrated that a dynamic language with garbage collection, closures, functional elements, and native data structures can be used for everything from web pages to enterprise backends. Many of the things folks complained about in Lisp environments, JavaScript "suffers" from as well.

I know I'm not completely on top of things, but I think JavaScript has been reasonably successful and gained some popularity.

And behold, of all the things it does not share with Lisps: the syntax.

I'm reasonably confident that if the hackers at Netscape had come out with "S-Script" for their browser, it would be a historical curiosity. As desperate as people were to get scripting in browsers, they would likely have stuck with Explorer and VBScript, and everything would be in a VBScript clone today.

S-expressions have had their chance, and the wisdom of the crowds has not bought into them.


Ah, what would a Lisp thread be without the inevitable lisps-decline-because-of-the-syntax explainer, who will write a short novel about how bad Lisp is and how no one cares about it /s

On a more serious note, anecdotes are not data, and correlation has to be proven. Lisps may not be popular (a fate they share with most C-syntax languages without corporate money), but they are also far from dead at this point. I am fine with it. In fact, as someone who earns good money with mostly JS, I couldn't care less; I wouldn't touch JS with a stick for my personal projects.


It's a longstanding error to think Lisp would benefit from a more conventional syntax. People have been making this mistake since the very beginning (McCarthy himself, with M-expressions).

Rhombus is research. IMHO, Honu parsing stuff is interesting (e.g., maybe it helps add richer syntax extension to languages with syntax that traditionally makes that hard).

For the others: according to James Gosling, by retaining the familiar curly-brace syntax of C, Java aimed to build upon the existing skills and knowledge of the large community of developers who were already well-versed in C. This decision was meant to make the transition to Java as smooth as possible for those programmers, thereby increasing adoption of the new language and its associated ecosystem (VM etc.). By the way, JavaScript was originally an embedded Scheme and had nothing in common with Java except the Sun marketing team (Brendan Eich was brave and took no pride...).

Additionally, Java (and .NET, for historical reasons) also benefits from that Objective-C / NeXTStep lineage.

The C++-like syntax was a kind of honey trap for C++ devs; the actual runtime and language semantics are closer to those of Smalltalk/Objective-C, hence why Strongtalk and Self JIT research fit so well into HotSpot.


I still have my Dylan development kit from Apple. I was also into SK8.

There don't seem to even be many screenshots of Sk8 and Apple Dylan out there, so please, if you can, mirror this stuff online somewhere.

One day the world of computing will realise the mistakes it made and having at least maps of the forks in the road that it didn't take will help it to find its way out of the jungle.


Both Dylan and Sk8 were available on Macintosh Garden last I checked.

Even better.

I totally understand and if the next 5 years go as planned I can do it. I have all the hardware as was.

I’m pretty sure by the time I got Sk8 it was download only.


Eloquently put, as ever.

> but this was changed to a more Algol-like syntax due to the prevailing opinion among many that an Algol-like syntax would be easier for C/Pascal/C++ programmers to adopt

Did this not cripple macros?


No. Dylan has macros. They're much like Scheme's syntax-case macros rather than the ad hoc list building of Common Lisp.

It did cripple them in the sense that it took forever to actually fully implement the Algol-style syntax and the necessarily much more complex macro system that such a syntax requires.

That one to two year delay absolutely destroyed any momentum Dylan could have had, and also made implementation of a Dylan-compatible language much more (needlessly) complex for a perceived benefit that never materialized.

Instead of being able to focus on implementing optimizations, tools, and frameworks, everyone trying to participate in the Dylan ecosystem had to spend that time on syntax bullshit instead, and still do. It really pains me that the other Dylan ecosystem players didn't immediately drop the Algol-style syntax for the much simpler Lisp-style one when Apple dropped Dylan, and to this day OpenDylan uses the infix syntax.


This is one of the things I miss about Tokyo: there are plenty of small apartments there at relatively affordable prices. I personally would rather have a 400 sq ft unit to myself than to live in a shared house or apartment.

It’s not always easy living with roommates (I don’t like it myself), but we all have to make choices. Unless we’re wealthy, we can’t have it all. For many people it’s either roommates or a long commute, and I know people who have roommates and a long commute since that’s all they can afford. It still beats homelessness.

As someone who recently left an industry job to teach full-time at a community college, I have mixed feelings about the situation of this physics lecturer. I don’t want to dismiss the difficulties that many people, including academics, face in the housing market. In places like Los Angeles and Silicon Valley it’s rough for those having to pay market rates for housing unless they have extremely high incomes.

However, I feel that the lecturer, who started his position at UCLA last year, didn’t do enough homework regarding academic salaries. $70,000 for a non-tenured lecturer is normal, maybe above average, and in fact there are many tenure-track assistant professors who make about the same. To add, UCLA is in a very expensive area. He will have to compete against well-heeled people who want to live within close proximity to some of the nation’s wealthiest areas, such as Santa Monica and Beverly Hills. A physics lecturer simply can’t compete against Hollywood.

In addition, a $70,000 annual salary means that he would qualify for a rental of no more than $1,944 per month under the standard 3x-the-rent requirement a lot of California landlords have. A cursory search on Craigslist shows plenty of one-bedroom apartments in Los Angeles County that are within this budget. Granted, they may not be the nicest complexes in the nicest neighborhoods, and they may require a commute, but the Los Angeles metro area does have extensive public transportation, and nobody goes into academia expecting a luxurious lifestyle on an academic salary. An academic who wants to make serious money needs to take advantage of entrepreneurial opportunities outside the institution.
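A quick sanity check of that arithmetic (the figures come from the comment itself, not independent data):

```javascript
// The 3x rule: gross monthly income must be at least 3 times the rent.
const annualSalary = 70_000;
const monthlyIncome = annualSalary / 12;   // ≈ $5,833
const maxRent = monthlyIncome / 3;
console.log(Math.round(maxRent));          // 1944
```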

In addition, there are plenty of grad students and postdocs making less than $70,000 who attend schools like UCLA that are in very expensive areas. Once again, I don’t want to dismiss the challenges they have. I myself lived in Santa Cruz County for over a decade and I know firsthand the challenge of being a grad student at UC Santa Cruz dealing with the area’s notoriously difficult housing market.

But I have a feeling useful information about his finances is being withheld from us. Does he have massive student loans impacting his budget? Does he have other obligations, such as taking care of family members? Why wasn’t he able to take advantage of UCLA’s faculty housing?

I just feel that this guy didn’t do enough homework about the realities of being an academic, especially one living in a very expensive area. I’m an academic in Silicon Valley and I’m fully aware of the tradeoffs of choosing this lifestyle over industry. Yes, I’d like a bigger home closer to work, and I have no idea how I’d raise a family (though I’m single now so I don’t need to worry). But at the same time I enjoy the freedom I have in academia; I have far more research freedom as a community college instructor than I did as an industry researcher. I also love teaching and guiding students.


Yeah, I think this article does a massive disservice to the issue of low pay for contract faculty and postdocs by highlighting the relative out-of-touchness of McKeown while ignoring how a lot of lecturers and post-docs (at UCs like UCLA as well as other institutions) earn well below $70k depending on the field.

> Why wasn’t he able to take advantage of UCLA’s faculty housing

Lecturers are at the lowest priority for UCLA Faculty Housing.

That said, you can afford a 1bdrm on $70k in LA, just not in Westwood. He'd have to commute from the Valley if he wants to pay less than $2500/mo.


I agree, though I feel compelled to post that the BSDs learned their lesson about the use of the UNIX trademark the hard way. Part of what caused the classic lawsuit between the Regents of the University of California (the "Berkeley" in Berkeley Software Distribution) and AT&T was the fact that a spinoff company selling BSD had the phone number 1-800-ITS-UNIX:

https://en.wikipedia.org/wiki/UNIX_System_Laboratories,_Inc.....

This lawsuit happened (1992-1994) when Linux was in its infancy (Linus Torvalds started work on Linux in 1991). Had the future of BSD not been so uncertain during this lawsuit, it is quite possible that some derivative of BSD would be the dominant FOSS operating system today instead of Linux.


Linux's dominance might still have emerged due to the individuals behind it. Beyond Torvalds' work on the kernel, Richard Stallman and others had a significant influence in shaping its early trajectory. Their philosophies and actions were key in driving the open-source movement forward.

The BSD license, while appealing to corporations due to its permissive nature, allows proprietary entities to incorporate the code into their products without giving back to the community. This has happened multiple times, with examples like Apple's Darwin, NetApp, QNAP, and Sony's PlayStation systems.

The GPL, on the other hand, attracted early contributors because it ensured that their work would remain open and shared with the world, rather than being exploited for profit.


Official Unix certification may matter at workplaces and institutions where such certifications are required for meeting some type of compliance criteria. For example, if I remember correctly, early versions of Windows NT had a POSIX compatibility layer, which was crucial for getting Windows NT accepted by some US government agencies.

Still, I hope Apple resurrects their server line… hardware and software. It’d be nice to have a server oriented UNIX running on Apple Silicon without resorting to Linux.

OpenBSD works on Apple silicon, has done for the last few releases.

Cutler must have hated that layer!

As a semi-functioning adult, I am sympathetic to the argument that the term “enshittification,” while accurate, is also too vulgar a term for some settings. It’s one thing to use it on Hacker News, but I personally wouldn’t use this term at church or when talking to K-12 students. Not everything can be PG-13 all the time; sometimes we need G-rated language.

There needs to be a more professional-sounding, G-rated term that describes the degradation of quality of software services.


Why not use “degradation of software services” when you want to be staid — and let everyone else use the term they want?

Inventing jargon with the intention of being boring is just hiding the issue with euphemisms.


If I was Apple or Microsoft, convincing people to use the term “enshittification” is actually the best possible outcome.

Nobody can use it in a TV ad.

Nobody can use it in political messaging.

Nobody can use it in G-rated settings.

Nobody can use it in a party platform.

Nobody can use it on the debate stage.

Nobody can use it in marketing on why they are better.

Nobody can use it in a courtroom without being accused of bias.

Nobody can use it who is generally soft-spoken or has strong cultural inhibitions.

The term itself silences speech. Anyone who calls this out is labeled a prude, which is perfect from a corporate planning point of view.

The only possible better outcome would be to use the term “assholeification” or something stronger. Call it “companies fucking with consumers” - that’s even better from a PR perspective.


You’re just trying to shift the narrative to using a less impactful term.

Regardless of whether the word is vulgar, I often see it thrown around as a meme that has become overused. Even if the original phrase used different words, they will become less and less meaningful once they start appearing in every other comment thread.

Is it overused or does it apply to many things?

You're overstating the impact of the term. No one is going to change the world or overturn the status quo or shift the dominant paradigm by using slightly vulgar language. The only value it has is in the catharsis it provides by comparing something to shit. It isn't a technical term (even though it used to masquerade as one); it's evocative, so let's be honest. People just like saying things they don't like are shit. It's snark. It's weirdly the only kind of snark that gets past HN's filter.

And since "enshittification" is applied to everything now, and no longer refers to the specific context for which it was coined, we can say we're witnessing the enshittification of enshittification itself.


You’re underestimating how much this stuff matters. There’s an old George Carlin bit about “soft language” that is very relevant here:

“Americans have trouble facing the truth, so they invent a kind of a soft language to protect themselves from it.”

There’s a reason why clickbait is a thing, it’s because if you don’t find a way to punctuate the noise then people don’t pay attention, and people’s brains are affected by the things that grab their attention.


Except that isn't what's happening here. No one is protecting themselves from uncomfortable truths by choosing not to use the word "shit" to describe anything and everything they don't like. People use vulgar language all the time, especially online. "Enshittification" doesn't move the needle either way, but it comes off as trying too hard to be edgy and it's well overdone at this point.

I mean, you accuse people of trying to "shift the narrative" if they don't like it. As if not using it is wrongthink to you. You frame "enshittification" in terms of class warfare and self-deception, and almost imply that using it is a revolutionary act. And that's weird. That's far too much emotional and political investment in what amounts to a poop joke.

It's done, please find another meme. I know you won't, but I wish you would.


> There needs to be a more professional-sounding, G-rated term that describes the degradation of quality of software services.

Maybe so. The problem is that enshittification is much more than simple degradation.


"Value engineering"?

Exactly. The reason why I'm a fan of the BSDs in general is the cohesion of the base system, not to mention the high quality of its documentation and source code. I tend to prefer cleaner, cohesive systems that seem "designed" (Version 7 Unix, the BSDs, Plan 9, C, NeXT and early Mac OS X, Smalltalk, many functional programming languages like Scheme and Standard ML) to large, complex systems that seem "evolved" (Windows, Linux, C++, JavaScript, the Web in general). I also like the conservative, deliberate approach that the BSDs take when it comes to the inclusion of new features.

I remember reading a FreeBSD guide about 20 years ago that said something to the effect of "BSD is what you get when Unix hackers make a Unix for a PC, while Linux is what you get when PC hackers make a Unix for a PC." The spirit of that quote is that the BSDs tend to adhere to the Unix philosophy, and thus their feature additions are in line with the philosophy of the overall system. The consequence, though, is that the BSDs are slower to adopt new technologies than Linux. Then again, there is, in my opinion, a tendency in the Linux ecosystem for new technologies to be pushed before they are 100% ready, and the Linux ecosystem, as you said, feels more like an assembly of parts, each with its own design philosophy, rather than a cohesive system. Linux is certainly good, but I personally find FreeBSD more pleasant.

With that said, I use FreeBSD for my home server and also inside VMs, but I find myself using Linux more. There are many situations where I often need to use Linux instead of FreeBSD due to either software compatibility reasons or lack of drivers.


Thank you for this thoughtful explanation. I’ve been thinking a lot lately about this and your post convinced me that FreeBSD with virtualisation for occasional use of linux software is the right choice for me.

I’ve been using FreeBSD occasionally for about 20 years. I like FreeBSD; it’s a no-nonsense operating system with excellent documentation and high-quality source code.

There is a question that affects all of the BSDs: what does it mean to be a non-Linux Unix in the 2020s? 20 years ago, there were many commercial Unixes that were in use, such as Solaris, AIX, HP-UX, and IRIX. POSIX was the main interface that the Unix world, commercial and open source, had in common. The BSDs benefitted in this ecosystem by being able to run software that kept portability in mind, since there were so many Unixes to support.

20 years later, commercial Unix seems to be largely dead, and Linux has become the dominant Unix-like OS. I get the sense that some software developers are less concerned about compatibility across Linux, *BSD, and macOS and are instead singly targeting Linux. This leads to software with many “Linuxisms.”

Should there be an updated POSIX to tackle new technologies, or should the BSD world recognize that Linux has become the standard and thus focus on implementing interfaces to technology from the Linux world?

I love the BSDs, but I’m concerned that the FOSS ecosystem is increasingly ignoring them.

On a related note, the BSDs are respected by their users for their conservative, deliberate approach to new technologies. There is a tendency in the Linux ecosystem for solutions to be pushed before they are fully formed, and there is also a tendency to prioritize features over adherence to the Unix philosophy. I see pushback from the BSDs when it comes to Linux containers, systemd, and Wayland. However, if Linux technologies become the standard among application developers, then the BSD world will either be forced to write compatibility layers or will have to do without those applications.


This is excellent news, and also quite timely! I am working on a side project where I want to build an exokernel, and recently (earlier this week, in fact) I decided to use Pre-Scheme as a base for implementing this exokernel and other low-level libraries that handle functionality such as memory management. Eventually I want to be able to host a language like Common Lisp, or at the bare minimum, something like Self (which can be thought of as Smalltalk without classes; see https://selflanguage.org/) on top of Pre-Scheme. This is all part of explorations of how to build a modern OS inspired by the Lisp and Smalltalk environments of the 1980s but updated to address modern concerns such as security and multi-core processors.

I hope I'm not completely beside the point, but what you describe makes me think of the "malleable systems collective" people who I was reading more about recently:

https://malleable.systems/catalog/

I think there may be some synchronicity between the kinds of things being discussed on the forum section of that site and what you're attempting. Very cool project anyway, I hope it goes well!


Yes, this is exactly what I’m interested in, and building malleable systems is exactly why I’m interested in Lisp- and Smalltalk-style systems where metaprogramming and component-based design are infrastructure for enabling malleable software construction. Exokernels are also a nice complement since this aids in the swapping of even low-level OS components, since OS services are implemented as libraries on top of a tiny kernel that only handles hardware multiplexing and protection.

If I could nudge you a little further - as I think I'm almost following, but not quite - what's the difference between an exokernel and a VM? I mean, wouldn't you have to write a different exokernel for each architecture with this setup?

Edit: reading more about exokernels - upon reflection, maybe I can better imagine how this kind of system could be very interesting for experimenting with what you describe. You'd effectively be drawing the boundaries the programs would cover much more broadly, so then maybe they could share more with each other?


When I was learning Scheme, I made a very simple Self-ish object model in R5RS, as an exercise: https://www.neilvandyke.org/racket/protobj/

Your project sounds more interesting and challenging.

