Oberon-2, a hi-performance alternative to C++ (1997) (ntnu.no)
72 points by abricq 3 days ago | 102 comments





One interesting thing about at least some versions of the Oberon system is that they found that it was faster to store programs as compressed abstract syntax trees on disk and compile them at load time instead of storing them on disk as native code. This was largely due to 2 factors: (1) the ratio of disk I/O bandwidth to CPU speed on the ETH Zurich workstations and (2) Wirth-style compilers generating code very quickly (with relatively few optimizations).

Wirth's graduate student Michael Franz took this a bit further and adapted it to storing Java programs as compressed SSA control-flow graphs, called SafeTSA. Franz's research group modified the Jalapeno (now JikesRVM) JVM to be able to load SafeTSA files as well as regular Java class files. They found that the time to JIT code was reduced, and the performance of the generated native code was better when using SafeTSA. The downside is that SafeTSA isn't well suited to interpretation in a HotSpot-style hybrid JIT. So, program start-up time would likely suffer unless using AOT compilation. Last time I played around with A2/Bluebottle/AOS, I believe compressed syntax trees were the on-disk program format.


Huge fan of Franz's work on this; his PhD thesis on it [1] is a thing of beauty for its simple, accessible description.

Beyond that, I just wanted to add a couple of things:

It doesn't require keeping the generated datastream at the level of the AST per se. You can generate as high- or low-level a representation as suitable, where e.g. a lower-level representation requires additional computation but remains independent of hardware architectures. The key is just that what you generate should fit a tree structure, so you can do any number of optimisation steps on the tree prior to applying the semantic dictionary encoding step if you choose. The extension to e.g. serialising SSA forms was pretty natural - there was work at ETHZ on generating SSA form around the same time as well [2].

The real beauty of the method is that the encoder and decoder each generate a dictionary the same way e.g. LZW does (the dissertation cites Welch; Lempel-Ziv-style variable-length dictionary entries are achieved by simply variable-length encoding all stored integers). But where the encoder uses the dictionary to compress and flatten the tree, the decoder does not reconstitute the original tree: it uses the dictionary to generate machine-code fragments directly, either copying them straight into the code being generated or storing them as templates to reuse when generating other occurrences of that dictionary element.

The compression/decompression and serialization/code-generation are one and the same, and avoiding a separate layer is a key reason why it achieved the speed it did.

[1] Code-Generation On-the-Fly: A Key to Portable Software, M. Franz (1994): https://oberoncore.ru/_media/library/franz_m.code-generation...

[2] Single-pass generation of static single-assignment form for structured languages, M. Brandis, H. Mössenböck (1994)


As info for others: since Oberon System 3 there has been the option of AOT- or JIT-compiling the modules, depending on the use case.

It is set as part of the compiler options when compiling a module whether it ends up in pure binary or abstract-syntax-tree format. Some flags depend on the compilation mode.


I never got the big appeal of Wirth languages. When they weren't just vanilla Algol derivatives, they were making a virtue out of a performance hack (enforced define-before-use to enable single-pass compilers) or relying on non-standard variants (Turbo Pascal/Delphi) to claim that uninspiring base languages were competitive.

As an aside, Oberon-2 isn't a C++ alternative because it apparently has mandatory garbage collection, which isn't acceptable to C++ programmers now (which is why they're going to Rust instead of Go) and damned well wasn't acceptable to them in 1997, else they would have moved to Java.


It's very simple: Simplicity.

A Wirth-style compiler for a Wirth-style language can be written fairly rapidly by a single person, and the languages, compilers and runtime systems can be easily understood. The Oberon-07 language report, including the grammar, is 17 pages.

If simplicity doesn't matter to you, then this won't appeal to you, and that's fine. I mostly use Ruby these days, whose parser (in MRI anyway) alone is likely longer than the whole Oberon-07 language report (I haven't measured, but ca. Ruby 1.8.6 the parser was 6k+ lines), so I've surrendered the simplicity, but it still stands out to me as an ideal and inspiration.

Wirth was perhaps too ascetic in his insistence on keeping the compilers minimal, but you can find less extreme versions carried forward in the extensions developed in the PhD dissertations of students in his group.


> It's very simple: Simplicity.

Well, simplicity in a programming language is IMO only a virtue as long as it doesn't come at the cost of too much functionality, which with Pascal it surely did.

There's a reason that of C and Pascal, C is the one that survived. It's the simplicity of Pascal that ultimately killed it, since it forced any practical implementation to heavily extend the language (Turbo Pascal, Apple's Clascal, etc). Even UCSD's famous P-System extended as much as it subsetted the language. What this meant is that there was in practice no Pascal standard, only these isolated ecosystems, which eventually died one at a time.

In contrast to Pascal, C (which was designed only a couple of years later) was functionally complete enough, even in original K&R form, that it was implemented as a standard, and the standard endured past the lifetime of any individual implementation.

The simplicity of Pascal also came at extreme cost in terms of performance and usability. For example most non-trivial Pascal programs (certainly any use for systems programming) end up using the "variant record hack" for type casts (e.g. pointer to int). So, rather than not allow type casts, Pascal allowed it but made it have a runtime cost of load/store rather than being free. Or another example - Pascal only supported value or var (passed by reference) function parameters, so if you want to efficiently pass a record or array type you'd have to pass it as "var" to avoid being copied, even if the caller wasn't meant to modify it.

Hmm.. or how about Pascal's "simple" for loops, which took the form "for i := x to y", which means you couldn't use it for "i = 0 to (N-1)" without the cost of calculating "N-1" and would have to use a while or repeat loop for that instead at the cost of readability/maintainability.

Or how about Pascal's (variable sized) "conformant array" function parameters which the language didn't allow you to pass to another function... Or how about the fact that Pascal didn't support length-1 strings (length 0 and 2+ ok, though), since it made things "simple" by regarding 'a' as a char vs 'ab' as a string (incompatible types).

Yes, it was simple, but very poorly thought out, and practically useless unless extended.


Nobody, including Wirth, wanted Pascal in its original form to survive, or at least to remain in widespread use, given he iterated on it multiple times. You'll note the article is not about Pascal, for a reason. Pascal is a very different beast to the Oberon family, which proved itself - in its unexpanded form - viable for implementing both an OS and applications, whether or not you like the tradeoffs of the languages.

Focusing on Pascal is also irrelevant to the point I made, which was to answer the general point of the appeal of Wirth's languages.


But even with Oberon, Wirth doesn't seem to have learnt the usability lesson, and removed stuff like enumeration types and variant records.

I get the appeal of a simple language, but was just pointing out that in addition to compromising functionality it also led to the death of Wirth's languages, by means of there effectively being no standard.

I've probably had as much exposure to Wirth's languages as anyone:

- Learnt Algol-W in college c. 1979
- Implemented ISO-Pascal (me +1) at Acorn (UK) c. 1982
- Used Modula-2 to program parts of PANOS at Acorn

I've spent the last few months programming in Acorn ISO Pascal on an 8-bit emulator (BeebEm), and I'm struck by how awful the language itself really was!


Pascal was designed for learning programming concepts, not for real world use. Others modified it for real world use but Modula-2 was designed for real world use and is a huge improvement over Pascal.

You appear to want him to have different goals for these languages than he had. That they don't have mass adoption is fine. It's not their primary value.

That said, "death" is too harsh: there are still multiple implementations of both Wirth-derived languages (the most prominent being Delphi, which is still being developed and sold) and "pure" Wirth languages available and in use. If anything, I find it more remarkable that they've seen as much adoption as they have outside of teaching. There's even a very expensive commercial ARM implementation of Oberon-07.

It's in any case largely irrelevant to the question of their appeal.

I don't want to work in Oberon (or Pascal) any more than you do (but, to tie it back to this article, I also don't want to work in C++ unless I have to), but in terms of distilling lessons of what is needed from a language, and a learning tool, it is up there with Smalltalk and Lisp (other languages I don't want to work in) as one of a small handful of the language designs that will continue to influence for a long time to come.

And that is a large part of their appeal - not for day to day use, but for the lessons you can learn from them. I don't care if nobody uses them. I care about the research they contributed to and spawned, and what I can still learn from them.

> and removed stuff like enumeration types and variant records.

Oberon replaced variant records with extended types, type tests, and procedure variables (function pointers), sufficient to do OOP. In terms of his goal of sticking to the bare essentials, it was a good tradeoff, and it sparked novel work.


> I don't want to work in Oberon (or Pascal) any more than you do (but, to tie it back to this article, I also don't want to work in C++ unless I have to), but in terms of distilling lessons of what is needed from a language, and a learning tool, it is up there with Smalltalk and Lisp (other languages I don't want to work in) as one of a small handful of the language designs that will continue to influence for a long time to come.

Other languages did simplicity better without sacrificing expressiveness by allowing natural extensions of the language. Wirth languages, by being Algol-derived, are overly-complex syntactically in ways that make them non-extensible, and possess ill-considered semantics on top of that.


I really don't see Wirth in reality as a huge influence in the history of programming language design. All of Wirth's languages are in essence tweaks on ALGOL from which they all (starting with ALGOL-W) were derived. What's so new about any of Wirth's languages, even Oberon? OOP had already been done 20 years prior in Simula, as well as more recent pre-Oberon languages.

He certainly left his mark on the computer industry in terms of products based on his work, but would, say, a hypothetical "Turbo Algol" have been much different than "Turbo Pascal"? The popularity of Turbo Pascal (& Delphi) really had more to do with the implementation (speed, IDE, language extensions) than the base language itself.


> I really don't see Wirth in reality as a huge influence in the history of programming language design.

C# is influenced by Pascal and Modula-2 because Anders Hejlsberg had a huge hand in creating it.

Go and Lua are also quite heavily influenced by the Wirthian languages.

Ada is also a direct descendant of Pascal and Modula-2. For a while in the '80s, Ada was more popular than C++ because the DoD standardized on it, and the DoD and NASA still use it today. NASA's Artemis project uses Ada. The Canadian air traffic control systems all run Ada.

Wirthian languages have been focused on safety for 50 years. The industry is just now catching up with safety-focused Rust as a first step. It's long overdue. C and C++ were dangerous enough offline but using them online is just asking to get hacked. Sure, safe code can be written in them but not many C and C++ programmers are capable of doing so. Why wouldn't you want a language that handled the bulk of that for you?


C# is really managed C, and the C lineage is more ALGOL -> BCPL -> B -> C

Sure Ada saw some success, and is even a pretty decent language, but I fail to see that there is much Wirth in there, certainly not his minimalist aesthetic!


Let's not forget that Wirth was in the ALGOL design group, and Pascal was initially designed because he saw ALGOL as too complex for teaching purposes.

So there is a little bit of Wirth in ALGOL as well.


I don't believe Wirth was involved with any of the official ALGOL versions. He submitted his own proposal for an ALGOL-60 replacement which was rejected and became ALGOL-W (Wirth), which I briefly used in college c.1979. The actual successor to ALGOL-60 was ALGOL-68 which Wirth had no input on.

There's a useful language family tree on the Wikipedia ALGOL page. ALGOL is really the grandfather of all modern structured programming languages. I guess the way it was named ("ALGOL" = "algorithmic language") about says it all.


Wirth became minimalist with Oberon, which came after Modula-2 and Ada.

> What's so new about any of Wirth's languages, even Oberon?

You've been given the answer already: Simplicity. You keep ignoring it.

Simplicity both of implementation (which Smalltalk, Lisp, Scheme, or Forth can meet as well) and of usability (which some might argue the former group offers too, but in reality most people have gravitated towards languages closer to the ALGOL family for usability).

And yes, it's "tweaks" on ALGOL, but those tweaks mattered in terms of simplifying implementation. ALGOL users expected GC (though it was not technically required); Pascal users didn't (though it wasn't forbidden). ALGOL's added complexity around type checking of variants mattered. ALGOL's file IO was bizarre (giving me horrible flashbacks to Simula's equally broken IO), and ALGOL 68's definition of formatted IO took up several dozen pages of the spec. ALGOL supported user-defined precedence rules for operators, which, while not hard to implement, is significant added complexity vs. a typical hardcoded recursive descent parser for Pascal, Modula-2 or Oberon.

Having used Simula (it was compulsory for CS classes at my university way back), the OOP in Simula is really quite different and at a far higher complexity level than Oberon.

Another comment listed influence directly on languages. But the influence is far broader, in terms of the simplicity of the languages being a facilitator for an explosion in language designs. It was seeing the EBNF for Pascal that got me to start experimenting with parsers and compilers, for example, because the simplicity made it clear that it was possible for a hobbyist to create their own from scratch. The existence of simple Wirth-language compilers made experimentation easy - I spent too much of my teen years looking at compiler source, and the Pascal compilers were easy to understand and modify. I wrote my first expression parser in Pascal, and my first code generator, and soon I ended up bootstrapping my own Pascal-inspired language.

Simplicity matters.

It's easy to forget that now when we have decades of hardware progress and orders of magnitude more RAM and CPU to work with, and decades of additional books and resources on how to build compilers using methods that were computationally infeasible at the time.

The existence of very simple compilers and languages has also been part and parcel of a lot of work at ETHZ that is not part of the Wirth languages themselves, but that was enabled by having access to simple languages as a foundation. Personal favourites of mine are works such as Franz's work on Semantic Dictionary Encoding, which led to his work on JVM JITs, etc. Franz's student Andreas Gal came up with trace trees [1]. Brandis did fantastic work on showing a wide range of optimisations could be added to Oberon at a fraction of the complexity of the similar optimisations in gcc at the time. Mössenböck and Brandis did excellent work on SSA generation. Franz did simple but excellent work on protocol extension (dynamically propagating method overrides at runtime; I used parts of that in my experimental/unfinished Ruby compiler). There were many more. Some undoubtedly would have happened with more complex compilers to work on as well, but if you read the reports and dissertations from that period at ETHZ you also see plainly the influence that Wirth's approach carried over into the students' subsequent approaches and focus on simplicity.

> a hypothetical "Turbo Algol" have been much different than "Turbo Pascal"?

It would have been slower and too big, and so lost a significant part of what made Turbo Pascal important.

In terms of implementation being more important, the speed etc. of Turbo Pascal depended heavily on the simplicity of compilation. Being able to keep the source, the compiler, the editor and the compiled object code in memory at once was a big deal facilitated by the extreme simplicity of the language.

[1] "Incremental Dynamic Code Generation with Trace Trees", Andreas Gal, Michael Franz, 2006: https://static.aminer.org/pdf/PDF/000/286/022/profile_driven...


Well, OK, and I certainly can't refute personal experience even if I don't totally understand it.

I played with C parsers (using my own parser generator ... written in Modula-2 as it happens) back in '82 and never found the grammar overly intimidating. Of course you're familiar, but just do a quick scroll up and down to remind yourself of the length (or lack of it):

https://cs.wmich.edu/~gupta/teaching/cs4850/sumII06/The%20sy...

And it seems 25% of that is the expression syntax, which really would be better (more efficiently) handled outside of the grammar via precedence climbing (which would also neatly handle Algol's user-defined precedence), and which would result in a smaller as well as faster parser.

As far as I/O, not sure that Pascal had too much to crow about there ... you could open files, but not close them, no way to associate a filename with a file, a need to put any non-temporary file as a program parameter, the odd file buffer variable concept, with an equally odd syntax treating file types as if they were pointers to the element type...

A language that fascinated me at the time, but which I never got to play with, was Occam - designed specifically for the Transputer, which was really way ahead of its time and never caught on.


> Wirth was perhaps too ascetic

Maybe too ascetic in terms of getting the most adopters, but a wonderful reminder to designers of development tools that most designs are probably far more complex than they need to be, even when optimizing for non-simplicity criteria.

So not a wasted effort!


Here is the specification for the latest version of the original Oberon language (also called Oberon-07):

http://miasap.se/obnc/oberon-report.html


It is also worth mentioning that Oberon-07 is a purely structured language in the sense that each sequence of statements is either fully executed or not executed; the only exception is ASSERT(FALSE) which halts the program.

So in Oberon:

  WHILE i < 100 DO
     ...
  END
  (*Is i >= 100 here? Yes of course, otherwise the loop guard would be lying.*)
In C:

  while (i < 100) {
     ...
     if (i >= 10) break;
     ...
  }
  /*Is i >= 100 here? No, not necessarily. Why is the loop guard lying then?*/

in 01997 it was acceptable to them and that's why most of them did move to java, once hotspot came out if not earlier

c++ projects today are very different from c++ projects in the 01990s when basically all 'serious' software was either c++ or c

a lot of c++ projects in the 01990s were what we think of now as java projects or c# projects (or django projects): line-of-business systems full of boilerplate and shitty architecture, not streamlined mega-optimized aaa game engines or web browsers

c++ projects today are the niche that was left over, which is, yeah, basically the cases where you can't afford gc

— ⁂ —

the appeal of wirth languages is that you get 90% of the value of a monstrosity like pl/i or c++ for about 5% of the complexity

language complexity hurts you three times: when you are learning the language, when you are choosing a compiler, and when you are debugging your code

so actually sometimes you get 200% of the value of a monstrosity like pl/i or c++

hopefully at some point someone will develop a wirthified version of rust, but i don't think wirth will do it this time


> c++ projects today are very different from c++ projects in the 01990s when basically all 'serious' software was either c++ or c

As an example, as late as '99, my first large web application was C++, and that was not an unusual choice at the time.


I worked on a greenfield web application in C++ last year. It still is a viable choice for web backend even today.

It depends on how much effort is put into security.

Most of the stuff I have exposed to the Internet that was written in C or C++ has been exposed via a safer managed language, only accessing the native code through libraries.

Naturally there is the issue of which language those runtimes were written in, but even so, the trusted computing base is much smaller than with the application fully written in C++, with possible C idioms while parsing data coming off the network.


My own reason for preferring lightweight languages like Python over C++ for web services is not security but flexibility: it takes more code to do things in C++, and it takes more work to debug that code when the program is small. The things you get in exchange (mostly performance, since callability from an FFI is not a consideration) are often not worth it. Most web services I've worked on have a much harder time meeting the demands on their flexibility than those on their performance.

> in 01997 it was acceptable to them

> c++ projects today are very different from c++ projects in the 01990s

> a lot of c++ projects in the 01990s

You got the year wrong with that '0' prefix. The decade you're referring to is the 11990s.



Maybe we could start the epoch from when Homo erectus invented fire two million years ago and align with the Christian era for convenience.

So this is now the year 2,002,023 FE (Fire Era).


This might be the finest criticism I've seen of attempts to extend the Christian epoch forward or backwards.

For those who don't know, my comment is based on the Holocene Era calendar, aka the Human Era calendar (HE). Basically, human civilization started roughly 12k years ago, around 10,000 BCE. The HE calendar lets us express all human-era events using positive numbers, and does so in a way that just tacks a '1' onto the popular Gregorian calendar year, so it's very easy to convert.

Back in the '90s when Modula-2 was the latest thing, the selling point was that it was as efficient as C (in theory), but safer. The Wirth languages were called 'bondage and discipline' languages for the safety they offered to beginning programmers. No buffer overflows, and you could specify a metres type and be assured that someone wouldn't accidentally cast a variable of type yards to it. It also offered modular programming with implementation hiding.

Safety wasn't fashionable back then

Computers were slow so speed meant a lot more. We should have switched back to safe languages when we started networking.

The typical excuse if we ignore what was happening outside Bell Labs in systems programming.

It is acceptable to those that use Unreal C++ and C++/CLI.

Many did move to Java (or .NET); that is one of the reasons C++ is hardly seen in distributed computing stacks 20 years later, or in most CNCF projects, other than existing OSes and RDBMSes.

Those targeting OSes like Android or ChromeOS hardly have an option, unless they feel like using their own OpenGL/Vulkan-based GUI and creating JNI wrappers (or JS callbacks) for something like 80% of the OS surface.


Well, Oberon-2 (designed and published by Hanspeter Mössenböck in 1991) was indeed closer to C++ than its predecessor Oberon 87 and 90, but definitely not a true alternative to C++ (because it only supports a fraction of the C++ features). Neither is the optimization done by the available Oberon compilers comparable to what we see in C++ compilers, not even in 1997.

> (because it only supports a fraction of the C++ features)

I'm pretty sure that both Mössenböck and Wirth would consider this a feature, not a bug.


Neither Mössenböck nor Wirth were involved with the posted paper, and neither intended to create a "high-performance alternative to C++". The authors of the paper who made this claim are different people, who apparently didn't know C++ well enough.

The title of the article is quite clickbaity. Neither is the performance of Oberon-2 discussed nor how it compares to C++.

But you are wrong if you think that Oberon-2 wasn't considered a competitor to C++. I'd say 1997 was early enough that it wasn't obvious yet that Java would snatch up such a big part of C++'s market share.

Wirth has always been very critical of software bloat and C++ is an abomination in programming language design. They certainly didn't set out to make an alternative to C++ but a good language that was an improvement over Oberon.


C++ was never a role model, neither for Oberon nor for Oberon-2. Wirth was always opposed to C++, and Mössenböck worked in his group. The official goal for Oberon-2 was "to make object-oriented programming easier without sacrificing the conceptual simplicity of Oberon" (from ETH INF Report 160, 1991). Wirth himself was involved when Apple designed Clascal and Object Pascal, and even so you see a different solution to OO in both Oberon and Oberon-2; the latter introduced the receiver syntax for its type-bound procedures, which you can also find in Go (not surprisingly, since Mössenböck was the PhD supervisor of Griesemer).

I'm currently reading Mössenböck's introduction to OOP (1992), and I'd say it's all pretty much by design. Refreshingly for that era, he doesn't recommend using OOP for everything, and he's aware of other languages' features that C++ later embraced, like generics (Ada/Eiffel), but argues that they might not be necessary for what the language aims at.

And the latter is something we don't see as much these days, where people don't like "gaps", and compiled languages are measured by what they can't do (Go being one notable exception, it seems). For the Oberon system, Oberon was enough, and anything more would probably not have been worth it. Wirth and Mössenböck didn't really aim at meteorology super computers or whatever was the HPC rage in the late 80s/early 90s.


Which makes sense in 1990's computing world, and I was a big Oberon fan.

However there is a reason why it didn't stop at Oberon-2, and Component Pascal, Active Oberon, Zonnon and others followed up.

Not everyone at ETHZ agreed on the minimalism, and for some low level coding it was proven that having stuff like untraced references (introduced in Active Oberon, based on Modula-2+/Modula-3 work) actually made sense.

Latest revisions of Active Oberon also have some kind of generics support.


That's why you would use o2c, which can use gcc-13 or clang-14.

Though I'm convinced that this ship has sailed, with Go replacing Oberon.


There is an even better alternative: just use the Oberon+ toolchain (compatible with Oberon 87, Oberon-2 and Oberon-07), so you can benefit from a decent IDE with a source-level debugger and a C99 transpiler.

Or use Modula-2, which was recently added to gcc (https://www.phoronix.com/news/Module-2-GCC-Merged).

When we can use Oberon, there is no need for Modula-2. Oberon is by far simpler and superior. Only enums and records might be missing from the default Oberon.

Enumerations and subranges (in the style of Modula-2 and Pascal) are missing from Oberon, but records are present, subject to type extension in the style of structs in Go. In Oberon(-1), we used (pointers to) procedures in record fields as methods when programming in an object-oriented style. Oberon-2 added methods which dispatch dynamically based on the type of the receiver. These were "type-bound procedures". You'll find this idea echoed in Go as well.

As a bit more comfortable alternative to Oberon I would suggest Modula-3.

https://github.com/modula3/cm3

It even comes with a GUI library which, even though developed in '90s style, works on X11. I find that quite fascinating :-)


> [Optimization not comparable to C++]

Hmm...considering Proebsting's Law[1], which is always shown to be too optimistic despite its pessimism[2][3], the arguments presented by DJ Bernstein[4] and the fact that Frances Allen got all the good ones in 1971 [5], etc. not sure how much this matters outside a few specialised domains.

[1] http://proebsting.cs.arizona.edu/law.html

[2] https://gwern.net/docs/cs/algorithm/2001-scott.pdf

[3] https://zeux.io/2022/01/08/on-proebstings-law/

[4] https://cr.yp.to/talks/2015.04.16/slides-djb-20150416-a4.pdf

[5] http://venge.net/graydon/talks/CompilerTalk-2019.pdf


Pascal is a great alternative to C. It is practical for writing operating systems. Though its use for this has declined if not vanished.

Most think of it as a "basic tinkering and learning" language, and indeed, that is probably the most common application. It does not have to be. [1]

UNIX came with C, and C became "IT".

[1] https://wiki.freepascal.org/Operating_Systems_written_in_FPC....


Pascal and C were the two most used languages in the kind of software development world I was living in around 1990 - hobby, university and consulting, UNIX and some Windows. I remember reading one Oberon and one Modula-2 paper. I didn't know anybody using them to build anything real. Pascal had its heyday on DOS and Windows with Turbo Pascal and Delphi. It slowly faded away because everybody started building GUIs with Visual BASIC instead, or Visual C++ for the very technical people. Then came the web and Java and PHP and the early Linux corporate adoption, and by 2000 it was buried under a geological layer.

> I remember reading one Oberon and one Modula 2 paper. I didn't know anybody using them to build anything real.

FWIW there was a huge following for the Wirth languages on the Amiga. I don't quite know how that came about but at times it felt like the programs written with Modula-2 or Oberon were as numerous as the ones written in C or Assembler.

That was mostly for Public Domain, Freeware, and Shareware software though.


I bought the M2 Amiga Modula-2 compiler after saving my pocket money for ages. It came with complete modules/interfaces to the native Amiga system libraries, and the type system prevented you from shooting yourself in the foot at every corner.

Amiga system development is very pointer-heavy. Modula-2 could not prevent all issues but at least caught some. Made sense to me at the time. Plus I could do my computer homework in Modula-2 on my Amiga and then quickly rewrite it in Turbo Pascal at school.


Modula-2 was definitely a niche language. Back in the late 1980s I wrote an add-on to a commercial Modula-2 library that earned me a few bucks. Literally, a few bucks. I used Logitech's Modula-2 compiler, which I don't remember being slow, but Turbo C and Turbo Pascal were far more popular, likely because of their fast compilation speed.

The operating systems written in Pascal I'm aware of (e.g. the Lisa, whose source code was published last week) used extended versions of Pascal which e.g. had the '@' operator; actually I don't know any system of significant size which used original Pascal as specified in Jensen/Wirth's User Manual and Report; do you have references?

The Lisa had two operating environments written in Pascal - the standard Lisa OS/Lisa Office System, which resembled early Mac environments but supported protected memory, and the mostly text-based Lisa Workshop development environment, which resembled the UCSD p-system, but supported native compilation, assembly language, and a Lisa GUI-based text editor.

Two big extensions to standard/Wirth Pascal that enabled Pascal to work well for the Lisa and subsequent systems were units (separate compilation modules that could be imported/referenced using the USES statement) and an early version of Object Pascal known as Clascal.

I'm not sure that Pascal's evolution from Wirth Pascal to UCSD/Lisa/Turbo/etc. Pascal to Object Pascal and Delphi is something to complain about, any more than we'd complain about C evolving from K&R to C89/.../C23 and to C++ (and its various versions.) Pascal is still a capable and relatively compact language.

I think Pascal versions that lacked the @ operator and type casts could often use case variant records (relying on technically undefined behavior) to accomplish similar things.


I just implemented a parser based on the Lisa Pascal 3 specification (see https://github.com/rochus-keller/LisaPascal) and started to analyze the published Lisa source code. So far I only saw two of the ~350 Lisa OS source files looking like Clascal (both from the Lisa toolkit). Lisa Pascal has quite a lot of essential features not present in original Pascal; an important one for system programming is the possibility to take the address of things which was not possible with original Pascal.

> an important one for system programming is the possibility to take the address of things which was not possible with original Pascal

I believe with (unsafe, packed) case variant records you could often get the address of anything that was allocated on the heap:

    ...
    type ptrAddr = packed record
        case Integer of
           0: (addr: Integer);
           1: (ptr: ^Something);
    end;
    var p: ptrAddr;
        somethingPtr: ^Something;
    begin
       new(somethingPtr); { allocate a Something }
       p.ptr := somethingPtr;  { copy its pointer }
       writeln('Allocated something at ', p.addr);
    ...
(Apologies for any errors - I don't write Pascal very often and haven't tested this.)

This obviously relies on implementation-specific behavior such as integers and pointers being the same size.

With this sort of trickery you can do all sorts of (unsafe) things including pointer arithmetic, custom memory allocators, garbage collectors, (unsafe) type casts, (unsafe) low-level memory and memory-mapped I/O access, etc..
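
For example, the same variant-record trick can fake pointer arithmetic. A hypothetical, untested sketch, doubly implementation-specific since it assumes integers and pointers share a size and that addresses are byte-granular:

    type byteCast = packed record
        case Integer of
            0: (addr: Integer);
            1: (ptr: ^char);
    end;
    var p: byteCast;
    begin
        new(p.ptr);            { allocate a char on the heap }
        p.addr := p.addr + 1;  { "advance" the pointer by one address unit }
        { p.ptr now points one past the allocation - unsafe by design }
    ...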


There were indeed loopholes in variant records you could misuse for all kinds of tricks, and Wirth was aware of these and finally replaced the concept with type extension. Instead of "not possible" I should rather have said "not officially supported". My point was instead that Lisa Pascal added concepts which were an official part of the language specification, from which we can conclude that (also from Apple's point of view) original Pascal was not suited for the endeavor. Others came to the same conclusion (see e.g. the famous paper by Kernighan), and even Wirth added features similar to the ones found e.g. in Lisa Pascal to his Modula-2 language.

If you want to pick nits, many C and C++ programs and OS kernels also rely on technically undefined and implementation/machine-specific behavior.

From which I suppose we can conclude that C/C++ without undefined and/or implementation-specific behavior has been found to be unsuitable for systems programming. Or that systems programs tend to be non-portable by nature.

Perhaps a more charitable conclusion is that languages like Wirth Pascal and K&R C (for example) contain (sometimes explicit and intentional) loopholes and unsafe/technically undefined behavior which can be useful for systems programming but problematic for memory safety and security/reliability.

Sadly Pascal standardization seems to have stalled after ISO Extended Pascal in 1991, while C is still evolving into C23.


All languages have loopholes; even Oberon has at least one, in the type case statement (uncovered by C. Szyperski).

> from which we can conclude, that (also) from the view of Apple original Pascal was not suited for the endeavor

Another conclusion might be that Apple started with UCSD Pascal (which was available for the Apple II and already included extensions to standard Pascal) and that they thought that certain extensions were useful even if they weren't technically necessary.

Clearly the Lisa didn't need Clascal or Object Pascal (the Macintosh didn't require them), but they provide advantages in terms of conciseness, modularity, reusability, etc..


> I just implemented a parser based on the Lisa Pascal 3 specification

cool!

> So far I only saw two of the ~350 Lisa OS source files looking like Clascal (both from the Lisa toolkit)

Have you looked for instances of SELF in the Lisa Toolkit and app sources?

Anything that is SELF.foo is a Clascal instance variable or instance method reference in a method. You can also use with SELF do... to make it more implicit like C++ or Java.

Also look for METHODS OF, SUPERSELF, CREATE, etc..
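
For anyone grepping along, a from-memory sketch of what those keywords looked like in context (hedged - the exact Clascal keyword details may be off):

    TYPE
        TBox = SUBCLASS OF TObject
            height, width: INTEGER;
            FUNCTION TBox.Area: INTEGER;
        END;

    METHODS OF TBox;
        FUNCTION TBox.Area: INTEGER;
        BEGIN
            Area := SELF.height * SELF.width;
        END;
    END;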


Please also note that I'm talking about the Lisa OS (subfolder LISA_OS in the source code ZIP), not the Lisa Toolkit (subfolder Lisa_Toolkit in the source code ZIP).

It's actually not a surprise I see some Clascal files in the Lisa OS tree since LIBTK in LISA_OS/LIBS is apparently just a copy of the Lisa Toolkit.


I'm still troubled by the fact that I haven't implemented the preprocessor yet and that the code is scattered across hundreds of .txt files with diverse content (i.e. many modules are parceled out over several .txt files). But from what I have seen so far, some units in the Lisa OS do indeed seem to be Clascal (an estimated 5%).

It appears that the Lisa OS kernel is primarily straight Pascal and that Clascal is mainly used in the toolkit and apps (which are of course the most interesting bit of the Lisa.)

UCSD Pascal was as much a regression/subset of (Jensen &) Wirth Pascal as an evolution of it. It did add things like an "exit" keyword for functions (cf. return), a string type, and separately compiled units, but on the other hand it only supported local goto, and it didn't bother to implement dispose(), instead opting for an odd stack-based mark() and release() scheme (i.e. there was no way of freeing an individual item allocated with new()).
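
To illustrate the mark()/release() scheme - a hedged sketch of the UCSD idiom, untested:

    var heapMark: ^Integer;  { records the current heap top }
        a, b: ^Integer;
    begin
        mark(heapMark);      { remember the heap position }
        new(a);
        new(b);
        { ... use a and b ... }
        release(heapMark)    { frees everything allocated since mark(), }
    end.                     { i.e. both a and b; no way to free just a }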

The UCSD P-System spec is here, for anyone interested:

http://pascal.hansotten.com/uploads/ucsd/ucsd/UCSD_PASCAL_II...


That's very interesting. Another vote for Wirth Pascal, which I think is an underappreciated language!

Modules are good though - hence Units, Modula-2/3, Oberon, etc..


Eh, that's fine, nobody writes OSs in standard C without compiler extensions either.

There are a couple of operating systems which indeed use standard C; Linux is rather an exception in this respect.

Can you name them? Even the academic/research ones or the barebones embedded kernels I'm familiar with use extensions to get things like context switching/multitasking/threading to work.

What OSs would those be? AFAIK none of the major ones - NT, Linux, illumos, or the BSDs are in standard C.

All of the embedded OSs. They need to support a mixture of commercial C compilers that don't support GNU extensions.

Since when are inline assembly and the couple of language extensions they use part of ISO C?

There is no need for inline assembly. Modules written in assembler can be kept separate from the C code. And if you consider "//" a language extension, then yes, everybody and their dog seems to use it, and nearly every compiler seems to support it.

When I look at those embedded OSes, I see more than just what ISO C89 allows for, beyond //.

My argument is being stretched ad absurdum in this thread anyway. My point was that Lisa Pascal made several significant language extensions compared to original Pascal, such as a module system, C-like pointer handling, compiler directives, etc. - all things Wirth himself later added to Modula-2. These were intended, significant and documented extensions under the name "Lisa Pascal", not just the usual errors, inaccuracies and "common deviations" that you find in every C compiler, and that force you to secure libraries with #ifdefs.

Many of those things were already present in ISO Extended Pascal and UCSD Pascal.

> not just the usual errors, inaccuracies and "common deviations" that you find in every C compiler, and that force you to secure libraries with #ifdefs.

And extension keywords for microcontroller-specific features, intrinsics for CPU instructions, DSP-specific capabilities, ...

Yeah, I'm already used to the fact that what counts against Pascal gets a different weight when held against C.


Apple II and III Pascal (and in consequence Lisa Pascal, though there was e.g. no UNIT or USES syntax in UCSD) were indeed based on UCSD Pascal, but ISO 10206 came a decade later as far as I remember.

There were other Pascals around with system programming extensions; quite a few of them were eventually standardized in ISO 10206 (1991).

http://pascal.hansotten.com/

And while the discussions keep going around Pascal and its extensions, there is this little detail of Modula-2 being released in 1978, where all the issues with Pascal (as originally designed by Wirth) for systems programming were already fixed.


Here I found an interesting Article by David Craig about where Lisa Pascal (and all the other Apple Pascal version) came from: https://www.applefritter.com/content/brief-history-apple-com...

Where Apple II and III Pascal was based on UCSD Pascal, Lisa Pascal was apparently directly based on Wirth's P4 compiler, not on UCSD Pascal. See page 4.


Great reference! Terrific to have a semi-original source, though I wish it included more detail and references.

> Therefore, Lisa Pascal lasted from 1981 to 1986, an eternity in the field of microcomputer languages.

;-)

> Tho (sic) Pascal will have the support of a small but vocal minority at Apple, C/C++ will be the dominant language for Apple and outsiders for the next decade.

RIP Pascal, you will not be forgotten - and you will live on as Delphi.


Yes, the first ETH Report about Modula-2 appeared in 1978, around the time Lisa Pascal was developed; the latter refers to the Jensen/Wirth 1975 Pascal edition. I don't know whether there were contacts between Apple and Wirth before Clascal. Not to forget that Tesler came from PARC where they had Mesa, so he had the same source as Wirth had.

Even without compiler extensions, the integration with the linker system is a huge strength of C.

Just any compiled language.

Turbo Pascal/Delphi/Free Pascal have evolved considerably since original Pascal, and cover quite a bit of Oberon's feature set, as far as I can tell.

The main improvement I see is the implied handling of multiple statements in an if-then-else-end. It seems like it would eliminate a lot of begin/end pairs, so I like it.
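
For illustration, a hedged side-by-side sketch: in Oberon the IF itself brackets a whole statement sequence with END, where Pascal needs an explicit begin/end block for more than one statement:

    (* Oberon: THEN..END delimits the statement sequence directly *)
    IF x > 0 THEN
        y := x;
        z := x + y
    END;

    { Pascal: multiple statements require begin/end }
    if x > 0 then
    begin
        y := x;
        z := x + y;
    end;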

Wirth's decision to revert to case sensitivity was a bad one.

With the exception of replacing "Uses" with "Import", it looks identical to Free Pascal/Delphi.


Not entirely sure why people are downvoting your perspective. There are at least two object-oriented lineages for Pascal: one leading to Oberon and the other to Delphi.

Both led to semi-obscurity, but there is a lot to like in the Pascal language family in my opinion. Personally I find the syntax simple and attractive. Even though I spend a lot of time writing C-style languages, I don't have any trouble reading Pascal-style syntax.

My complaints about Ada and VHDL, which seem to have Pascal-inspired syntax, are that they feel a bit verbose compared to C++ and Verilog. On the other hand, Ada supports concurrency and memory safety and has packages for dimensional analysis of physical units, and VHDL supports pluggable implementations.


While uppercase keywords is a bummer, many IDEs support automatic capitalization.

One thing from the Oberon lineage that Turbo Pascal/Delphi/Free Pascal lack is automatic memory management (with the exception of COM-based types).


No garbage collection, but strings... it handles those brilliantly: you never need to allocate them, they are automatically reference counted, and they just work, even with a gigabyte of ASCII in them.

It's the best of both worlds: no GC pauses, yet you can have huge strings in memory and return them from functions, all without issue.
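
A hedged sketch of what that looks like in practice (Free Pascal in Delphi mode, untested here) - an AnsiString is heap-allocated, reference counted, and released automatically when the last reference disappears:

    {$mode delphi}
    function BigString: AnsiString;
    begin
        SetLength(Result, 100 * 1000 * 1000);           { ~100 MB in one string }
        FillChar(Result[1], Length(Result), Ord('x'));
    end;

    var s: AnsiString;
    begin
        s := BigString;      { only a reference is copied, no deep copy }
        writeln(Length(s));  { storage is freed automatically when s goes away }
    end.

Returning the string from the function just transfers a reference, so there is no deep copy and no manual free.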


For screenshots of what it feels like to use an Oberon based OS, see

http://progtools.org/article.php?name=oberon&section=compile...


Related:

Oberon-2, a high performance alternative to C++ (1996) - https://news.ycombinator.com/item?id=13734438 - Feb 2017 (2 comments)

A basic forking server in Oberon-2 - https://news.ycombinator.com/item?id=11627697 - May 2016 (36 comments)

Oberon-2, a hi-performance alternative to C++ (1996) - https://news.ycombinator.com/item?id=3361469 - Dec 2011 (18 comments)


Someday Rust will go the way of Oberon.

Username checks out.

Jokes aside, I’m excited about any future additions to Fortran, esp. to make it appealing to direct data science use.


Rust daily driver here. When that happens it'll be because something obviously better came along. I look forward to that.

Genuinely curious, do you mean Rust will become equivalent to Oberon in terms of developer mindshare/use in production in 2023 (i.e. very limited); or, that someone will ‘Wirth’-ify the Rust language and produce a less complex language that still allows for writing code at Rust’s level of abstraction?

There don't seem to be any Wirths around, design-wise. We're in the Algol 68 committee's timeline.

Wirth was initially on the committee and resigned after being dissatisfied with the direction Algol 68 took.

So we can't be sure if there really is no next Wirth around.


Would academia still reward that kind of thinking? Business certainly doesn't seem to, where you're mostly sticking to 'universal' solutions. The problem is that this 'universe' has such an enormous range.

I was at Wirth's farewell lecture and the topic of the Laudatio was a friendly roast on how Wirth failed to bring in lots of money by creating very efficient solutions which weren't commercial successes.

I think it certainly wasn't the norm back in the day. It's often easier to pile onto existing solutions and try to go further by covering up problems. Look how far we've come with the tooling surrounding C (e.g. static analyzers or runtime fuzzers).

But sometimes you need to take a few steps back. It might not be obvious when looking at it today, but when Java came onto the scene, it had a tremendous effect on software engineering. It was a breath of fresh air that you could concentrate on other topics than how to prevent buffer overflows or what would be the best String class.

Java pushed modern programming practices in a way C or C++ just couldn't. I often joke that I am very grateful to Java as it saved me from working in C++.

But if you ask most programmers today, you don't get a very good opinion of Java. You probably shouldn't try to disrupt programming if you want to have rewards. :)


You mean there will be a Rust-2? Nice.


