Bits of History, Words of Advice (gbracha.blogspot.com)
110 points by miki123211 on Aug 31, 2020 | 132 comments



From my understanding, to adopt Smalltalk, I’d have to give up my editor, my SCM, my code review practises, my deployment strategies, my whole way of working. Learning a new language is already a big ask. If you then also require the would-be adopter to change everything around the language, everything about how they work, don’t be surprised if people choose something else.

I think a modern parallel is darklang.com. It might be great, but it faces the same resistance for the same reasons.

You can do some amazing things if you integrate everything into a big monolith, but that also makes it impossible to adopt a technology part-way.


So you need to make a new language similar enough to preexisting ones, but on the other hand it should give you noticeably more power to justify the change. It's a very hard problem to solve.


The activation energy required to make someone give up everything familiar and proven to (mostly) work in order to use a new language is huge, so it would need to offer a lot. OTOH, a language that only needed you to replace a few of your daily conventions and practices could offer less and still get you to give it a try. All-or-nothing is a losing proposition for convincing someone to try a new language.


I'd almost be willing, but I'm still not sure I "get" Smalltalk. I mean sure, I've started to understand why people talk so highly of the language, but the whole sandboxed environment feels like an outdated relic from the time it was created and makes it look today like nothing more than a toy. It feels like a concept that just doesn't realistically fit in the world of software today. I want to like Smalltalk but I just can't ever see it being an option even for most of my side projects.


I don't know. A bunch of people are writing Electron apps. That's surely also a sandbox, right?


Kind of... but, to quote the grandposter, I don't "have to give up my editor, my SCM, my code review practises, my deployment strategies, my whole way of working." It's just JavaScript, I get to bring my React/Vue/Vanilla.js workflow and the kajillion (for better or worse) npm libraries.

And, I get a double-clickable app that runs on Windows, MacOS, and Linux. Last I looked there wasn't an open-source way to do this with smalltalk.


But the one I replied to said that he/she would "almost" be willing to give all that up, except something-something-sandbox.


Not necessarily. There is a nice, file-based Smalltalk dialect called SOM (Simple Object Machine) by the people from Denmark who also developed JS V8. There are implementations based on C/C++, the JVM, and Truffle; see http://som-st.github.io/ for more information. I'm currently implementing yet another version based on LuaJIT, see https://github.com/rochus-keller/som/#a-som-to-lua-transpile.... You can use your favorite editor etc. and don't depend on an all-including virtual image like e.g. Smalltalk-80's. In contrast to e.g. GNU Smalltalk (which is also file-based), SOM is completely independent of the Bluebook interpreter and doesn't inherit its disadvantages. I had a close look at both and also implemented a Smalltalk-80 VM (see https://github.com/rochus-keller/Smalltalk). When it comes to performance optimization, it's much easier with SOM than with a Bluebook-based machine.


Giving up your editor is tough, so I'll come back to that. I believe that Pharo and some other Smalltalks no longer require giving up Git[1], and by extension, I assume that would allow keeping any git-based review strategies. Some deployment strategies can also be kept largely intact[2][3], but I don't know what you give up by doing so. In those links, a lot of things aren't done how I would do them, but I can see how to adapt what is shown there to my preferred deployment tools.

Getting back to giving up your editor: it should be possible to use many editors with the Smalltalk of your choice. However, in practice, there are many half-finished and abandoned tools for doing so[4], and fighting with that while learning the system is unlikely to be productive.

[1] https://github.com/pharo-vcs/iceberg [2] https://ci.inria.fr/pharo-contribution/job/EnterprisePharoBo... [3] https://www.slideshare.net/umejava/dockerizing-pharo [4] https://dmitrymatveev.co.uk/shampoo/


Indeed, for Smalltalk-80 and its cousins at least. There were/are versions of Smalltalk that integrated better with the OS, but classic Smalltalk's vision was always to _be_ the OS and surrounding tools, really.

Perhaps Smalltalk was too far ahead of its time when it initially appeared. The computing machinery of the age couldn't run it efficiently enough. If it had been able to, Smalltalk just might have displaced everything else. At the time there really were no IDEs, no good SCM, no review tools, etc. And so it could have made that world in its image, which at the time was frankly superior.

But for the broader point, yes -- Smalltalk does not play well with others. I'm sure it can be made to, but that was not its first instinct.


> I think a modern parallel is darklang.com. It might be great, but it faces the same resistance for the same reasons.

They work similarly on the surface level, but the tradeoffs are quite far apart. The deployless nature of Dark on the backend is the central big benefit, and it's a distinction that doesn't exist when you are developing local GUI apps. Also, in Dark there isn't a persistent VM image that everything is entangled with, not to mention the lack of entanglement inherent in the FP language.


I would give up everything if someone could give me a completely consistent universe. For example if everything I came across was automatable via SQL I would take it.

All the time wasted not solving my problem but spent on e.g. linux, filesystems, config files, bash, docker, kubernetes, rest, git, programming languages ...


I would say the two closest programming ecosystems I have found to this are Rust/Cargo and Node/NPM.

With either of these tools, it seems quite dependable to clone a git repo and get it up and running in minutes or even seconds with a couple of consistent commands. That is, as long as the dependencies are fully rooted within the ecosystem.


> I’d have to give up my editor, my SCM, my code review practises, my deployment strategies, my whole way of working

This is bad? Everything (besides maybe reviews) is horribly broken for most (modern) dev, so not sure it's a negative to just nuke it from orbit.


Every other language works fine with Kate and Git, my favorite editor and SCM.

Having fewer options is not inherently good.


No, but some people have to try. You are not mainstream, but the number of people who think vscode is a lightweight and stable editor is slightly painful. Working every day with 100s of people and seeing them killing/restarting it is some sign we generally just gave up. Taking the current low bar as 'good enough' (or even great, as some people seem to think it is) is definitely not a good thing imho.


When I ask why something isn't working with typescript in vscode at work people recommend restarting the language server with a keybind.

I'd send them a table flip emoji but I'm pretty sure they wouldn't get it; they legitimately think they're giving me good helpful advice.

They simply don't understand that if you need to restart something, said something is fucking broken.


Having to restart the language server is surely a disadvantage, but why make it be a dealbreaker? Maybe they see it as a cost worth paying?


Obviously it's a price worth paying, but it should not be needed. MS could trivially collect these issues and fix them, but they don't. People tolerate crappy software; the cost of bad software gets externalised, not onto the producer but amortised over many, many users. They tolerate it, the temperature goes up, and the frog boils. If you put up with shit software, guess what you're going to get.


It doesn't help that a common fix for node is `rm -rf node_modules`.

Yikes, your fix for your software is to just delete it and start over? What is so leaky about the build process?


It's crap software; I can use the vim version just fine, no restarts, no problem.

Aside from that, I don't need typescript, it's being forced on me.

I do want typescript, because it has advantages.

A frustratingly broken workflow is far too high of a cost to pay though.


Smalltalk's editor...wasn't great. And I was a vi user at the time. (Not vim, vi.)

SCM? Hah! The image was path-dependent. I seem to recall getting into a position where the code you filed out of one image couldn't be filed into a new image---you had to get a previous version and then upgrade.

Deployment strategy? Ship 'em the image. Maybe you should disable the editor first, though.

No matter how bad you think "modern" dev is, Smalltalk was worse.


And you cannot deploy it easily...


This is an interesting argument in an era where people now deploy Docker, etc. images as their deployment strategy. That looks suspiciously similar to the Smalltalk image model.

I'm not a Smalltalk fan really (though it's neat) but it was doing "containers" quite intensely long before anybody else.


A Smalltalk image runs Smalltalk.

A Docker image can run an app written in JS, C++, Python, Haskell, brainfuck, whatever.

Smalltalk was not doing containers.


I’m now curious about implementing a long running Smalltalk server that can be accessed using command line tools and traditional editors, with some graphical or web based tools to implement inspectors and other UI.


You're not far off from describing various lisp modes in emacs. Not trying to derail, just sharing some existing prior art in the space.

Take a look at SLIME+swank for Common Lisp, or Geiser for Scheme, or Cider for Clojure.

A common architectural choice is to have a REPL buffer in Emacs, connected over a socket to the language runtime which hosts the REPL. You edit a text file, sending whatever portion of your program you want to the REPL, where it is evaluated in the runtime, with results printed back to the REPL buffer. There are advanced introspection and documentation workflows supported by being connected to a live REPL.

These protocols easily and transparently cross the network, so you have the ability to troubleshoot a production issue on a (hopefully previously-quarantined) production host exhibiting the issue. You can inspect state, recompile functions on the fly and replay the failure all from the comfort of an editor and REPL that you are familiar with from dev-time.



Gorgeous, thanks.


There is GNU Smalltalk, but I don't think it has a big community at the moment.

What you are describing, though, is conceptually easy enough to implement: modern Smalltalks (Squeak, Pharo, and friends) all have robust interfaces to Unix and stdin/stdout. You could easily create a repl. It's also easy to make TCP, HTTP, or WebSocket servers, so you can create interaction that way as well. And of course there is Seaside, which has some tools already for interacting with a Smalltalk image through the browser.
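
To give a sense of scale, here's a rough sketch of such an HTTP endpoint in Pharo using the Zinc components that ship in the stock image (selectors quoted from memory, so treat them as approximate; and evaluating arbitrary input like this is obviously unsafe outside a demo):

    "serve an eval endpoint: POST Smalltalk source, get back its printString"
    (ZnServer startDefaultOn: 8080)
        onRequestRespond: [ :request |
            ZnResponse ok: (ZnEntity text:
                (Smalltalk compiler evaluate: request entity string) printString) ].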

Another approach is Amber Smalltalk, which is a web-based Smalltalk for frontend development (amber-lang.net)


Yeah, I’m thinking along these lines. Just packing it up so it works in the flow.


I think darklang at least gives you infrastructure as you adopt it, so a better value proposition than Smalltalk.


The article does not mention Smalltalk's uncommon syntax as a deal breaker. Not many languages with uncommon (not C-like) syntaxes have managed to become widely used. I think that was known to those who ordered the investment in Java and C#.

To me Smalltalk feels too integrated, which is a problem for me. I like a language that's small in its (open source) core, with lots of (competing) integrations (IDE, testing, profiling) available. It helps me not to feel locked in to an ecosystem.


I had that same reaction towards lisp. Not sure which flavor it was, but when I first saw it 10 years ago or so, the syntax put me off so hard that I wrote the whole thing off. Didn't look at another lisp for a few years, until I came across Clojure.

The syntax was again "ugly" but some people were so excited about the language that I had to look into it, which is when I learned why the syntax looks the way it does, and how powerful it is because it looks that way.

Needless to say, I felt monumentally stupid, not just for having done myself the disservice of not looking into lisps earlier, but for having the audacity to write things off out of "ewww" feelings. A perfect exhibit of programming being a pop culture and me being a mindless consumer.

Thankfully I learned my lesson, which came in handy when I came across Smalltalk. The syntax is powerful [0], and good luck getting that kind of power out of any conventional syntax language.

[0] among many other aspects.


Great comment.

At the same time, after 35 years as a programmer (mostly in C++), it is rare these days that I find myself ever thinking that my inability to get "power out of [...] syntax" is a problem when I'm working.

The problems I feel I face these days, working on a mid-size (700k LOC or so) complex, native, multithreaded, realtime, creative software project, are mostly just caused by the fact that what we need to do is hard, no matter what the language is.

I feel as though attempts to elide those difficulties through syntax really do almost nothing to address the underlying hardness of the problems. One example: it is true that there are quite a few places in the codebase where using lambdas/anonymous local functions would be handy. We wouldn't have to create actual named methods to tidy things up, even though we only use them once. But none of this relates to the actually hard stuff: managing UX and responding to changing and expanding user expectations.


Never blame the user for the problems of the system. Lisp fans know the syntax is a huge problem. It has been for decades. Their solution is to blame the user. This approach is absolutely doomed.

Clojure actually fixes quite a few of the syntax problems with previous lisps, but doesn't go nearly far enough, so they still have the same problem.


What happened is that "better syntax" turned out to universally fail in practice. It's not that we decided to blame the user. But there's 60 years of empirical data that points to M-expressions being irrelevant and a waste of time. And people who don't have experience with other languages tend to pick it up just fine, even if the formatting is sometimes atrocious (the AutoLISP community is an example).


M-expressions did fail. Their greatest failure is that they persist as a universal strawman that's useful for killing any conversation about evolving lisp's syntax.

M-expressions didn't just fail because they were unnecessary; they also failed because they just weren't a very good syntax. They were more cluttered and less readable than S-expressions, while adding no real practical value.

But Clojure has arguably demonstrated that you can evolve on Lisp syntax without going to M-expressions, or even changing S-expressions all that much, just by adding a few new bits of literal syntax and sprinkling some clever macros into the standard library. (A macro invocation may not technically be syntax from the parser's perspective, but it sure feels like syntactic sugar from a user experience perspective.)

And, while I haven't actually tried them at all, I've at least been intrigued by t-expressions. They seem to achieve some nice things like git-friendliness and easier visual scannability without sacrificing on macro-ability.


M-expressions happen to be a catch-all name for various non-Sexp syntax forms.

As for Clojure, TBQH I'm not too convinced that a few sigils make all that difference in writing, in plus or in minus. But I am far from my 10k hours in Clojure, so YMMV.


> M-expressions happen to be catch-all name

They're that, but it's also used to describe a specific syntax. IMO, the more general definition is not particularly valuable and should be discarded. Because it's just not useful to paint m-expressions, i-expressions, t-expressions, etc. with the same brush like that. Doing so just muddies the waters. If you want a general-purpose term for alternative lisp syntax, why not "alternative lisp syntax?"


Smalltalk and C were developed around the same time, which renders any argument to the effect of, "Smalltalk didn't get popular because it didn't use the C-style syntax that everyone knows and loves," uncompelling by virtue of being anachronistic.


Except that C-style syntax existed before C.


The earliest language I know of with syntax that really looks like C's is B, which is a whole three years older than C.


I remember people using the phrase Algol style, but that has faded out.


People overestimate the importance of C syntax, especially earlier on, when a lot of development was in non-C languages.


I agree. From my perspective, C syntax was just another syntax... Pascal, Lisp, BASIC, dBase, Fortran, DOS BAT, unix shell, and various assemblers.

That changed once C/C++ took over, but until then it wasn't something large proportions of programmers knew and had an affinity for.


So which language besides Python is really, really successful and doesn't have C syntax?


Fortran, Cobol, Algol, Pascal, Basic, Rexx, Ruby and Lisp have all experienced success relative to the size of the software industry during their peaks - perhaps most notably Visual Basic and Turbo Pascal.


Fortran, COBOL, BASIC, Algol, and Lisp peaked well before C was popular (in some cases peaked before C existed), and Algol (obviously), Pascal, Python, and Ruby, are all languages in the Algol-derived syntax family, like C.

Visual Basic is probably the biggest success of a non-Algol-derived general purpose language after the dominance of C as anything other than mostly-legacy.

There are some others (Erlang, some Lisp-family languages [recently Clojure], some ML family languages) but languages with Algol-derived syntax are still far and away dominant.

There's also lots of not-general-purpose languages without Algol syntax (SQL, for instance.)


SQL


Depends what you mean by successful.

If you're willing to accept heavily entrenched ("a lot of development" as in the parent), then languages like Fortran and COBOL are highly successful. They underlie an enormous amount of infrastructure in the world.


I think it might be even simpler and worse: ecosystems are important, languages aren't. The "bitter lesson," if you will, of languages.


Smalltalk is really quite small. All syntax is described in here: https://gist.github.com/jacaetevha/3795784
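
For the unfamiliar, nearly all of it boils down to three kinds of message sends; a quick illustrative sketch of my own (not taken from the gist):

    5 factorial.              "unary message: receiver, then message => 120"
    3 + 4.                    "binary message: receiver, operator, argument => 7"
    10 between: 1 and: 100.   "keyword message: the name is split around the arguments => true"
    2 + 3 factorial.          "unary binds tighter than binary => 8"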


Python is one of the most widely used languages out there and doesn't really have C-like syntax.


Python is not exceptionally C-like, but a typical Python script is readily understood by someone familiar with C. Smalltalk, on the other hand, is not. This LOC is from the Wikipedia example page:

#($a #a "a" 1 1.0)

There's no way a C programmer can know what that means without looking it up.


This comment makes me feel a little bit sad.

It doesn't take too much breadth of knowledge to recognize those all as different kinds of literal syntax, even if you don't know the exact details. #(...) for some sort of collection, probably a list or vector, $a and #a for different kinds of atomic value, even if it's hard to guess whether any one of them represents a symbol, a character literal, a perl-style variable reference, or something else.

If such prosaic syntax is really so unfamiliar-looking nowadays, it really speaks to a major gap in how people are taught about programming. Comparable to if it were common for younger mechanical engineers to not even recognize a slide rule when they see one.
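
For what it's worth, a rough decoding from memory of Smalltalk-80 (hedge accordingly): $a is a character literal, #a a symbol, 1 an integer, 1.0 a float. The sly part is the "a": double quotes delimit comments in Smalltalk (strings use single quotes), so it is arguably not an element at all. And you can always ask the elements themselves:

    #($a #a "a" 1 1.0)
        do: [ :each | Transcript show: each class name; show: ' ' ]

which prints something like Character ByteSymbol SmallInteger SmallFloat64 (exact class names vary by dialect).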


If said C programmer has truly never seen or heard of anything but C, I'm sure a similar line of Python (and the end result) would seem quite confusing as well; ['a', ({1j}, {1: 1.0})]


Is that a typical line of smalltalk? I mean if this is representative I can understand why C-like syntax won. Surely a language with descriptive names, and a call syntax resembling:

    subject.verb(arguments)
is more intuitive than the example you've given.

Of course it's also quite possible to write an equally cryptic line of C code.


No, this is not a typical line of Smalltalk. That would be considered a very sloppy array indeed. I think it's just there to demonstrate the different literal notations in the language.

Smalltalk is actually quite readable in practice, though the keyword syntax seems to throw off experienced programmers from other languages. It's worth noting that the language was originally tested on children, so it was also designed to be easy to read and learn.

The equivalent of your C code above could be something like:

    MyObject doSomethingWith: anArgument
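
A keyword message with several arguments reads almost like a sentence. Two standard collection messages, with results as comments:

    #(3 7 12) detect: [ :each | each > 10 ] ifNone: [ 0 ].    "=> 12"
    #(1 2 3 4 5) copyFrom: 2 to: 4.                           "=> #(2 3 4)"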


aka Objective C

:)))


Python, like C, has basically Algol-derived syntax (it's not far from the Pascal-inspired pseudocode that was dominant form of pseudocode for a long time); Smalltalk does not.

It's not a direct linear descendant of C the way C++, Java, C#, and lots of other relatively recent languages are, but it's still a lot closer to C than Smalltalk is.


There's not many examples beyond this though... it does seem to be the case that people either choose C syntax for successful languages, or those languages are successful possibly because of C.

Why did Rust move from an ML syntax to a C syntax?


We wanted to cater to an audience of systems programmers, so we tried to stick to a familiar syntax where possible.


I know this opinion will not be received well, but I'm really sorry this extended to things like semicolons, snake_case, and the overloading of the term vector to refer to variable length arrays. I'm a big fan of many of the decisions made in Rust, but those are things which I wish we could have let die in the systems programming language of the last decades.


It's cool. I strongly prefer those things. That's why no single language will ever rule them all.


To each his own, but out of curiosity what do you like about using the term Vec for dynamically-sized arrays? To me it just seems that it's an inaccurate usage of a term with a precise mathematical meaning (what is a vector of strings exactly?), and as somebody who works with linear algebra a lot in code, it's a frequent point of consternation to have this built into two of the most relevant languages for achieving high performance.


Meaning is inherently contextual, so different things meaning different things in different contexts is the norm, not the exception. Do I get upset that a "string" isn't made of "thread"s?

It is true that mathematics is closer to programming than tailoring is, but you can't escape this issue. So I don't get particularly worked up about it.

(This context is why this matters too; "vector" as a term for this is established, just like the linear algebra term is established.)

(I do like to joke about getting worked up about this though, but I pick on String, which should be StrBuf, instead of vectors.)


So I generally agree with your point, and I'm not trying to say this is some giant issue which makes Rust and C++ unusable - on the contrary I think it's a minor annoyance at worst when trying to carve out a naming scheme for my own components.

But for the sake of internet pedantry, you seem to be saying that calling it vector is acceptable because it's established, not that there are any inherent advantages to this right? Because I would argue that we can and should take the opportunity to turn the page on imprecise terms of art which only exist for historical reasons.

Even looking at Rust, I'm very glad that char|short|int|long has been replaced with i8|i16|i32|i64 even if the former are terms of art which everyone can understand.


I am at least saying something similar to what you're saying. The inherent advantage is familiarity. You may not consider that inherent. That's okay too :)

Please excuse my WIP CSS, but I wrote a longer bit about this over at https://steveklabnik.com/writing/the-language-strangeness-bu... . I think that, for languages that intend to be adopted broadly, you have to choose carefully where you diverge from existing practice. Sometimes, you have to change, after all, that's how progress even happens in the first place! And sometimes, you're forced to change, because you don't truly have an analogous situation.


So I agree with the thrust of the point of your article as well - I guess my only point of disagreement would be where I would have placed something like naming vectors in terms of the trade-off between progress and familiarity, which admittedly probably has a lot to do with my particular domains of programming. However it does seem like a bit of a missed opportunity given the overlap between people who care about performance, and people who are working a lot with linear algebra in fields like computer graphics, simulation and ML. But I do understand that the line has to be drawn somewhere.

(On a side note, the CSS is perfectly fine, and I've been looking for a static alternative to Ghost for blogging, so I might give next.js a try)


Totally, 100% get it :)

Ah so, next is a way to make websites in general; the template linked is a good one for a blog though. I've had pretty positive experiences with next so far; I just re-did my blog in a hurry over the weekend for unrelated reasons, so it's pretty much just the template at the moment.


> what is a vector of strings exactly?

You can have a vector of integers, a vector of real numbers, so why do you think a vector of strings doesn't make any sense? Strings can be given a well-defined ordering. You can think about strings as points along a string-line, if you want to.

For example - Java often names packages using co-ordinates (vectors) in an n-dimensional string space. Seems to make sense to me?


Maybe in some technical way this is correct, but when I'm pushing a string onto a vector of strings, I'm not thinking about conceptually adding a dimension to my vector in string space.

And how would you compute the cross product of <"foo", "bar", "baz"> and <"apple", "orange", "banana"> exactly?


> And how would you compute the cross product of <"foo", "bar", "baz"> and <"apple", "orange", "banana"> exactly?

Cross product isn't a well-defined operation. The wedge product[0] would be <a,b,c>~<x,y,z>=<<ay-bx,az-cx,bz-cy>>, assuming I didn't get the signs wrong again. That would require scalar multiplication and subtraction of strings, though, neither of which are generally well-defined.

If you substitute in string multiplication (ie concatenation) and addition of negatives (alternation and reversal, the latter of which is only vaguely justified by anticommutativity of multiplication), you get:

  << /fooorange|elpparab/ ,
     /foobanana|elppazab/ ,
     /barbanana|egnarozab/ >>
which frankly sounds to me like a pretty good argument that vectors of strings make no sense and shouldn't be supported, but maybe someone working on parsing algebras could come up with a more useful interpretation.

0: http://en.wikipedia.org/wiki/Wedge_product

Edit: FWIW, wedge product of NaN makes perfect sense: <NaN,NaN,NaN> ~ <NaN,NaN,NaN> = <<NaN,NaN,NaN>>.


> And how would you compute the cross product of <"foo", "bar", "baz"> and <"apple", "orange", "banana"> exactly?

How would you compute the cross product of <NaN, NaN, NaN> and <NaN, NaN, NaN>?

Some vectors don't have a useful result for cross product, even if they contain 'numbers' and even if you can get some kind of result out.

Same with dot product of a string. Doesn't make much sense... so probably don't do it.


> even if they contain 'numbers'

Vectors don't 'contain' numbers, a vector is a multidimensional value. The fact that a C++ vector is conceptually a container for values more than anything else is a reason why 'vector' is a poor name for this component.


Yes, they contain a number for each dimension. Not sure what you think the point of debating individual words like this is? What practical problems do you think there are?


You're the one arguing from the extremely strained logic that because ordering rules exist for strings, this somehow means that C++ vectors have anything to do with mathematical vectors in the mind of the average programmer.

The practical problem is, when you do a lot of work with linear algebra, you would really like to have the term vector to use for its canonical mathematical meaning rather than having it reserved for a data structure which is not suitable for that purpose.


Well it makes sense to converge towards something, right? Unless a programming language is using a radically different paradigm, like Lisp or Prolog, or if it is purely a syntax experiment, why not go with a C-like syntax and have every programmer in the world be able to instantly read your code and basically understand what's going on.


> There's not many examples beyond this though...

Ruby, though still in the Algol-derived syntax family, is even less C-like than Python.


That's true, and that's probably why, even though I have had to modify ruby code and write simple scripts at various points in my career, I never got to the point where I fully grok the syntax. Maybe that's why C-like syntax is so popular.


I'd say no, it is simplified C-like syntax in disguise.


I agree. Although Python is the best kinda counter example.

If you remove semicolons and use indentation over curlies in C it will look pretty pythonesque.


It failed because it was too expensive. 25 years ago I was looking at a job in an elite project that ran Smalltalk. I didn't get it, but liked the language enough to investigate using it in my next job. The cost was crazy, like 1-2 months' wages, which was hard to justify when hardly anyone else used it. If it were the same price as C++ environments/libraries (yeah, remember when you had to pay for container libraries?) it would have been a hugely successful language.


I don't think it's the main reason. I rather share these views: http://www.wirfs-brock.com/allen/posts/914. Beyond that, Smalltalk suffers from the same issues as any dynamically typed language. You quickly lose track when projects get bigger, and many problems that a compiler could easily find are discovered much later. I used ST in the nineties and then switched completely to C++ and other languages. Recently, I started using ST again when I built the two Smalltalk-80 interpreters; from my point of view, it feels quite similar to JS or Lua, but is even more inefficient, since even control structures and loops are dynamic objects.


Yeah, try to be a small developer in the 90's and deal with the Smalltalk companies. If you made it in the door, now deal with runtime fees.


And then the buyer likely had the cost of extra RAM on top of that...


I worked for a firm that tried to use Smalltalk to deliver commercial software running on Windows in the early 90's. From a learning standpoint, I learned a lot. From a business standpoint this is what we ran into compared to this article:

Lack of a Standard. Didn't really bother us. FYI we were using Digitalk Smalltalk. People in those days didn't have the expectation of changing their toolset without incurring problems.

Business Model. At that time people paid for software. The free (as in beer) software distribution model didn't yet exist. The tools were expensive but they were justified by programmer productivity. To that end, the tools made programmers very productive. It's one of the best environments I've ever worked in - and that was over 25 years ago!

Performance. Ah - this is where we started having problems. At that time Smalltalk hadn't yet optimized integer arithmetic and as a result simple numerical operations were slow. The garbage collector was also difficult - when it ran it would essentially hang your machine for several minutes. That would improve over the years but it was bad early on. Performance was the deciding factor in our not using Smalltalk.

Interaction with the outside world. Digitalk took care of this. It was workable, i.e. this isn't where we had problems or concerns.

Deployment. Digitalk was capable of producing a Windows executable file. There were two problems though: 1) it made a guess as to which classes should be included in the executable artifact (you could manually adjust); this made automated testing extremely important because you would need to test a release candidate, and 2) it employed its own windowing system which it would embed in the executable. It never looked like a native Windows application. The marketing folks and product line managers didn't like that aspect of Smalltalk.

Bottom line - poor performance and a poor end-user experience killed Smalltalk for us, as it did for many others at the time. You know the saying that you only get one chance to make a first impression? That goes for programming languages as well. A LOT of developers gave Smalltalk a go in the early 90's and ran into these issues. Though these issues would later be resolved, by then the perception damage had been done and everyone had moved on.

It's interesting that Smalltalk is one of the most influential languages that never caught on.


I never had to deal with that end, but wasn't Digitalk a per-dev fee + a runtime per-user license?


Dolphin Smalltalk was beautiful in this regard. Reasonably fast, native windows, easy deployment.


That sounds awesome! How was its garbage collection?


Never noticed a pause. But I don’t think I exercised it with lots of ephemeral creation


It never forced anyone to use it. C was forced into use by the popularity of unix. Java by its association with the browser at the time. Nobody would have used objective C if Mac OS X had been based on C++ (BeOS). Ruby would be a niche if it weren’t for rails. Javascript, obviously, is necessitated by the browser.


Java, I don't think it was association with the browser. I admittedly was still in school at the time, so maybe I'm missing something, but it always seemed to me that everyone regarded Java applets as more of a cute toy than a serious technology.

However, Java did solve one really annoying problem that's hard to properly appreciate if you didn't leave any of your own blood behind in the 1990s: Cross-platform development. There were so many operating systems and architectures back then, and none of them had really risen to prominence as a general-purpose option, so most shops were working with a pretty decent variety of different platforms. At one point I worked at a desk that had three different computers with different processor architectures and operating systems (and therefore basically no software in common) sitting on the same desk, and all three were needed for some purpose or other.

Into that environment comes a new language/platform that offers write-once-run-anywhere with decent (if not great) performance, and one of the first standard libraries that might be recognized as reasonably mature by modern standards.


The batteries included approach makes a big difference in adoption. So far as I know there was nothing in particular making people use Python, it was just fun.


From the comments on the article "This reminds of The Lisp Curse. It's so easy to make a slightly different and potentially slightly better Smalltalk that many people end up trying. The resulting noise makes it difficult for substantial improvements to be recognized as such." - this is another way of talking about being forced to use it; it might be possible to build a better JavaScript or Visual Basic or Bash, but if you try the dominant one which is tightly coupled with a large popular system will have a lot more momentum.

The force drags people to it, and stops it splintering into twisty maze of implementations all alike.


I had the impression Java really had something going for it.

It used OOP and was leaning heavier on the code side than Smalltalk, which had this in-place modification of an already running system.

I think Java went a step back so people would understand it better.


It had a lot of hype. I think a big part of the hype at the time was web related, but it was also immediately popular with schools. Schools liked that it was easy enough to spend less time teaching the language and more on the concepts. It also helped that it was going to be industrially important.


I think the two big factors for Java's success were these:

* OOP in a C-like syntax without the complexity and traps of C++, this made it attractive as a teaching language and ensured a steady supply of Java developers

* Cross-plattform-ness, reasonable performance, easy multitasking, and the Servlet API made it the ideal successor to Perl as the default option for web development, leading to ubiquitous industry adoption.


Objective-C was the language of NeXTSTEP, which was one of the most productive programming environments of its era (early 1990's).

In companies where custom, in-house software development was required, NeXT, NeXTSTEP, and Objective-C did quite well.

The whole macOS thing happened about 10 years after the first successes of Objective-C, when it was well on the path to obscurity due to the failure of NeXT's hardware, and then operating system businesses.

But this also kinda proves your point ... it was NeXTSTEP that made Objective-C succeed (both the first time, and then again with macOS/iOS).


Objective C had fewer users than Smalltalk before Apple bought NeXT. Hardly a success in the sense we’re saying.

It was a very nice system compared to its contemporaries. MacOS 10.1/Cocoa was a lot more fun than anything Microsoft offered.


So then, maybe a better question to ask is: why does Smalltalk not have a killer application? It was used to pioneer GUIs back in the 70's, so why didn't it maintain that popularity?


You need developers for that killer app and most killer apps don't come out of big companies. The vendors made sure the little developer wasn't welcome.


Partly because it couldn’t be democratized back then. HyperCard achieved a lot of the Smalltalk dream for individual programmer/users.


You could also argue Java was forced on people once more thanks to Android which makes the Oracle v. Google lawsuit funnier to me.


You mean it lacked a killer application: https://en.wikipedia.org/wiki/Killer_application


Would Smalltalk have been as successful as Ruby is now if DHH had written Rails in it?


It's possible. But the mainstream culture of computing is still so entrenched in the division of user/programmer and in the use of text editing as a primary interface that I suspect it would not have caught on in the same way.


That’s a great question. Seaside was really good at handling application flow through pages, seamlessly allowing the back button to work, by being based on closures and continuations (sketched below). If that had been paired with the new ideas about convention over configuration, and if a good object database were freely available, I think DHH might have chosen it had he been aware. DSL-like things and magical code are possible in Smalltalk, but the system is more reified in its final view. You could have class methods initializing the relationships, etc, but afterward those methods and variables would be visible parts of the system. Then tearing them down might not be as easy as removing a line of code like it is in a language like Ruby, where the state is discarded at the end of a run.
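
For readers who never saw Seaside, the flow style being described looked roughly like this (a sketch from memory of the request:/confirm:/inform: conveniences, so details may be off). Each step suspends the method in a continuation, which is why the back button just works:

    go
        | name |
        name := self request: 'Your name?'.
        (self confirm: 'Proceed as ', name, '?')
            ifTrue: [ self inform: 'Welcome, ', name ]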


Languages are more than just syntax and ideology. The entire ecosystem of the language and how it interacts with surrounding software is important.

You can have an awesome language with lousy interaction, and it's dead in the water.

Conversely, you can have a mediocre language with a great ecosystem, and it takes over the world.


But Smalltalk was likely the king of language ecosystem, and even more so of language interaction, at around the time when Java first appeared. And probably its only serious rivals for that crown were even more high-end and even more now-dead, like Lisp-machine Lisps. It didn't have great integration with native OS environments, but early Java wasn't spectacular in that department either.


As the article itself points out, Smalltalk may not have succeeded as an active language in the sense of market share, but its influence (and that of the environment from which it came) on later languages and development paradigms seems very significant and it's still regularly fondly referenced in many coding blogs and articles.

In spirit at least, it seems a resounding success.


The influence is generally overestimated. C++ and Java were mostly inspired by Simula 67, which actually was also an inspiration for Smalltalk. Although there had been a few articles before, a broader public did not have access to Smalltalk until the mid-1980s.


Smalltalk enthusiast for twenty years here. I jumped out of the balloon almost ten years ago now and sold my soul to the mediocrity of pragmatic polyglotism.

I miss Smalltalk. I miss its clean syntax. Were there times when keyword syntax got odd? Yes. But less so than list-based call syntax (and even weirder, the hybrid list(keyword) keyword messages).

Was everything done via messages all the way down always the bee's knees? There were some edge cases (zip?) where it was difficult to figure out where the behavior really should be bound. But the trade-off was simplicity and consistency. None of this why-is-len()-a-function-and-upper()-a-method nonsense.

I miss the simple syntax and pervasive use of Smalltalk closures.

I miss the tooling. The RefactoringBrowser was one of the most amazing code environments out there. I found code faster. I authored code faster. I changed code faster.

What have I appreciated as I've wandered through the wreckage of modern in-use systems?

Namespaces. There were some Smalltalk experiments here, but they were late and varied. My favorite was probably that of Smalltalk/X. I mostly like Python's namespaces: "we should do more of those". Not a fan of Python's shadowing rules though, and I have never decided whether its namespaces and shadowing are codependent or independent.

I kind of like typing. I remember I was skeptical of Strongtalk typing at first, but I've come to appreciate that style of annotation. Where typing always seems to break down for me is containers. I am just not a fan of generics. They are to typing what preprocessors end up becoming, a sort of "deep state Illuminati" of the program. Gilad hinted at a proposed solution to this a while back, but I never saw a follow-up.

And of course cheap tooling. I don't mind paying for things. But the Smalltalk business model was about platform entrapment. There was no incremental path. No choice. You had to buy into the whole party up front.


Here is the worth-reading response by Allen Wirfs-Brock, who "was there pretty much at the beginning": http://www.wirfs-brock.com/allen/posts/914


Smalltalk environments felt too much like a 'boil the ocean' approach in a world that was just starting to produce software that was distributed to and operated by non-programmers.

Most of its adherents developed on Apple's early Macintosh line. One of the core tenets of the Mac was intuitive usability. The Mac developer's 'Bible' was the Inside Macintosh series, which featured Object Pascal (Think Pascal was a really good IDE for its time) and the MacApp library and resources that made it fairly easy to produce software that adhered to Apple's then state-of-the-art UX guidelines.

Smalltalk at that crucial point in time was off in its own (academic) world, with its own windows and menus that didn't feel at all like the rest of the Mac's software. While this would soon be possible, it was too late and did not offer enough of an advantage to get the Macintosh developer community to turn away from Object Pascal. That would only be achieved by years of C/C++ programmers pushing for that language's adoption across all platforms.


The author seems to know (well, was involved in) the history of Self and Strongtalk but seems disappointed that the people involved went on to work on and perfect what they see as inferior technology (Java, JavaScript).

In reality I think if we do an honest accounting of the situation we can see that these have been the more pragmatic engineering solutions for their times, and so that's where the $$ was. I worked in Java for years, but didn't particularly like it, but it is, I think, what the marketplace needed and could handle at the time.

And I think JS/ECMAScript etc. is a reasonable practical solution for the imperfect world that evolved out of Internet/browser tech.

Unlike the author I don't think Sun's fortunes would have been any different if Self/Strongtalk or some other Smalltalk-world inspired/derived tech had been pushed hard. The market didn't need or want it.

IBM did a similar pivot with their VisualAge line, and they did it for pragmatic reasons.

Finally, let's look at the bigger picture: in the end, what we have with the browser + modern JS on V8/SpiderMonkey etc. is a dynamic object oriented programming language that allows for dynamic manipulation of its environment, is accessible to any user, can be edited and inspected live, can present interactive media to users, and has a large community of authors.

Is this really that far off from what Alan Kay and Dan Ingalls originally wanted? Purists will gag, but I think as much as I hate front end dev in JS that there's actually a lot here that is Smalltalk inspired.

EDIT: I spent much of the 90s and early 2000s cringing at what was evolving. I was really into PL research and really liked the Self-inspired world of prototype OO languages. I really dug LambdaMOO and spent time writing my own shared-world multi-author persistent-world object oriented virtual machine type systems. But there's no way any of that would pay the bills. I still avoid JS work really, but I can see the niche that this filled. It's imperfect ... but that's the world. I'm grateful to V8 & friends for making it suck a lot less. And that work in V8 is a direct descendant of some of the really neat pioneering stuff in the 80s&90s.

There's a good response the original article here: http://www.wirfs-brock.com/allen/posts/914


I have always noticed the fondness with which some people talk about smalltalk. For a language that practically doesn't exist anymore, I find that odd...especially considering the number of languages that people despise but still exist and are regularly used.

I don't really have much love for the combination of dynamic types and object orientation in general, but could someone explain what they got from smalltalk that they don't currently have via Ruby, Python, or other currently popular languages of similar discipline? I've seen lots of explanations as to why it failed...but why do people still talk about it like they miss it?


* The IDE/Debugger rocked. It was the killer app. You could explore/understand code in a way that I have yet to see rivaled in any other environment. I have used PyCharm/XCode/Studio/VSCode/Emacs/Vim at regular levels.

* BlockClosures. Reified and used ubiquitously. Did I mention reified? Since the language had no control flow builtins/keywords, it was all done in the library (see the sketch after this list). And since Closures were fully first-class objects that you could explore, understand, and extend, you could write control flow to your heart's content.

* Simplicity. You didn't have to have a PhD in parsing to deal with the AST for Smalltalk. Building your own AST walker was a mid-level exercise. I once saw a comparison between the Ruby AST and the Smalltalk AST. The Ruby one was like 10 times more complex, to deal with all of the edge cases. I added a goto implementation for a lark in an afternoon. I wouldn't even know where to start to add goto to Python. An absurd example of course, but it highlights that you didn't feel like Dan Ingalls and the other inventors of Smalltalk controlled your destiny. They wanted you to go places with it.
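
To make the control-flow point concrete (a minimal sketch in standard Smalltalk-80 style, with Transcript selectors quoted from memory): ifTrue:ifFalse: is just a keyword message sent to a Boolean, and whileTrue: is a message sent to a block, so user code can build new control structures the same way.

    | i |
    i := 1.
    [ i < 10 ] whileTrue: [ i := i + 1 ].
    i even
        ifTrue: [ Transcript show: 'even'; cr ]
        ifFalse: [ Transcript show: 'odd'; cr ]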


The big thing I find fascinating about it is the live environment. You can explore and change every single piece of code while it’s running. There’s really nothing else out there that does this except maybe lisp and even there you give up having the GUI to explore with.
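
A tiny sketch of what that feels like (compile: is the classic class-side API in Squeak/Pharo, though I'm quoting it from memory): you can add a method to a live class from a workspace and use it immediately, no restart.

    Integer compile: 'double
        ^ self * 2'.
    7 double.    "=> 14, effective immediately in the running image"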


I enjoyed his prose. Mr. Bracha is an opinionated chap who is not afraid to back up his [well-informed] opinions in a rather pithy manner.

I think that Smalltalk's contribution to ObjC was nice. It was weird to encounter, at first, but I got used to it.

He has a point about the people charging for it. The same thing happened to XSLT.

XSLT 1.0 is the free version, supported by many languages and libraries.

XSLT 2.0, however, is where the action is at, and that is only supported by Saxon, a paid library.

I'm not sure if that has changed, as I haven't messed with that Pandora's Box in a number of years.


In the mid 90s I was at an OO database company, and we were expanding into tools. The decision was made to produce a Smalltalk binding and build tools for that language.

That was well underway when Java was released. And you could hear the air go out of the Smalltalk tires. Java was another OO language, it was designed for “applets” that would run in the browser, and it was created by Sun Microsystems, who built the most popular computers of the day. Nearly everyone who was interested in Smalltalk turned to Java and never looked back.


"And yet, today Smalltalk is relegated to a small niche of true believers. Whenever two or more Smalltalkers gather over drinks, the question is debated: Why?"

"...Interaction with the outside world."

This. So. Much. This.

I was a (junior) member of the team that made (part of) the prototype for the OS/2 Workplace shell, in (IIRC) Digitalk Smalltalk. Doing anything outside the Smalltalk image was incredibly painful.

Want to share your work with the team? File it out (and make sure you get all of it) and give the file to the Keeper of the Golden Image, who will file it in to the image. Then pass the image around.

Want to send the prototype to someone who didn't have a license for a (rather useless) support library from the trainers who taught the team Smalltalk? File our code out, file it into a virgin image, run through the test suite to make sure you had everything, reimplementing anything that used that library. Shouldn't take you more than a couple of days.

Did I mention this was a prototype of a GUI? Make those FFI calls using the really sketchy api. When I started, there was a bug that would randomly cause the prototype to crash. No one could ever figure out what caused it, beyond the FFI interface. At some point, it escaped the prototype and would cause the image to crash without running the prototype.

Sure, Strongtalk and Newspeak may have improved some or all of everything, but by that point the bridge had been burned. Their one opportunity had been wasted. I never saw or heard of either in the wild.

"The Smalltalk image provided a much better Docker than Docker."

And then there's that kind of thing.


Complexity associated with those stateful images contributed as well, I believe.


Smalltalk never evolved towards the practical. It did so much on its own because it had to: there were no windowing graphical operating systems, so it was invented and included. There was no SCM, and so it was invented and included. There were no object-oriented IDEs with object inspectors, so they were invented and included. The problem is, the first implementation of an idea is rarely the best, or the one that wins in the market. There are entire parts of our industry that just peeled off a piece invented by Smalltalk and made a better general-purpose one. Great article.


I love Ruby, which happens to be a Smalltalk-inspired language, although different enough that the particular analysis in the OP doesn't apply, I don't think. (ObjC is also a Smalltalk-inspired language, but the same goes for it, I think.)

I am worried that rubyists are gonna end up debating for decades what happened that made Ruby shrink, too. :(



The formatting of this page is weird; some lines wrap, others don’t.

Even in Reader Mode there’s spurious line breaks.


Third from bottom says you shouldn't do this: https://news.ycombinator.com/newsguidelines.html


This is not helpful. It's good to know if it's just me or others have the same issue.


Ah I somehow assumed the author may have posted this because of the blogspot link.


The article may be good, but mobile-adapted the website was not.



