
Project Oberon - tosh
http://www.projectoberon.com/
======
systemvoltage
I often wonder what would happen if we decided to build a completely new, well-thought-out
computing system (including the network) that is totally separate from any legacy
system and 100% backwards _incompatible_. Not one bit less: 100% backwards
incompatible, with everything fresh. There would be no JPEG support (it
would have its own image format), no standard TCP/IP stack but a
new protocol and network infrastructure, a new display format, new IO ports,
etc.

Why am I imagining this? I feel like computing systems, as they've evolved,
have been built layer upon layer, and just like biological evolution, this is
impossible (or at least difficult) to undo, because the marginal cost of
unwinding and breaking backwards compatibility is too high. We get stuck in a
local maximum at each layer, and the critical energy required to escape it
would be too large.

I am not saying backwards compatibility is bad, far from it. I am saying that
if we forced ourselves to cut the cord and build computing systems completely
from scratch, we would have the hindsight to design them better, faster, and
cheaper.

~~~
jcrawfordor
One of the overriding themes which I attempt to convey in my teaching, blog,
etc is that the modern technology landscape is something that we arrived at
largely _by chance._ Nearly every decision made in the development of
computing had a wide variety of alternatives at the time, but once something
ossifies into a norm it is difficult to imagine another way. For example,
I have recently mentioned a few times how hard it now is to see TCP/IP as
anything but the "obvious design," yet in the early '90s there was healthy
competition among network protocols offering various feature sets.

I think it is extremely important from a teaching and engineering perspective
to view the modern computing environment not as some elegant design (which it
most certainly is not) but instead as a concretion of "best option at the
time" solutions to real problems.

It is a bit intoxicating to think about how things _could_ be if computing
really were approached as a single, monolithic, elegant design. But
essentially every attempt to do so has failed. Much like how we think of
computing as layers of abstraction, the computing industry is functionally
constructed of layers of abstraction, and so it is difficult to really change
anything at lower levels.

~~~
fsloth
I think part of the problem is that software is hard.

I've programmed for two decades. I write non-trivial software modules in a
_specific_ domain. It's not as if I was averse to analyzing things. And I try.
When I can.

Yet. Design is damn hard.

One of the problems is that implementation is often an empirical process of
requirements and constraints recovery.

I.e. I cannot create an elegant design before I've implemented anything,
because some of the requirements are discovered only when the domain is
explored using the code which I write.

One of the problems is that _I'm not smart enough_ to design the system
beforehand. And _I don't have enough working memory to do so_. And I presume
many programmers work in similar environments.

I think in the 70's or 80's there was a vivid dream that one day programmers
would have advanced CASE tools to help them design better programs. I wonder
what happened to that dream. Maybe IDEs and software systems became so easy
to use that everyone prefers to prototype in code and perform empirical
constraint discovery just like me...

~~~
ivan_gammel
The dream came true. In a modern non-software startup environment people often
write code only as a last resort. I regularly see BizDev people setting up new
business process automations with services like Zapier, Jira, Zoho, AutoPilot
etc. There are plenty of CMS systems, customizable BPM, CRM, ERP etc where you
can define UI and logic without writing a line of code and it just works. The
cost reduction compared to the traditional coding is tremendous and enables
small IT teams to run quite sophisticated businesses. Developers usually don’t
see this, locked within their code-based programming paradigm, but this is
already real.

~~~
zelphirkalt
Let me tell you, mostly these solutions suck. There are several reasons for
this: (1) leaky abstractions; (2) not the correct abstraction, i.e. one that
works in all envisioned cases without limiting you; (3) often you do not want
to only use an API but instead change a tiny thing, and that needs code, or
you accept limitations; (4) the existing API is weird or has a weirdly
structured result; (5) it is inefficient; (6) you depend on a third party;
(7) you might hit rate limits; and probably more reasons.

If you have capable developers, let them do their job and create a flexible
solution (expecting changes, with a simple design, usually not an elaborate
many-class design) that actually fits your situation. Developers are (or
should be) abstraction specialists. They should be able to find working
abstractions in most cases.

Or live with the limitations and with the spread of the belief, in various
departments, that some things are not possible. Mental barriers that stop
creative thinking, simply because the tools do not exist.

~~~
ivan_gammel
Well, as a software engineer who wrote my first line of code in 1991 and as
the CTO of a medium-sized company, I would disagree here. These solutions do not suck.
They work, they are cheap and they are efficient in addressing the business
needs. You don’t always need a developer for simple automation, just like you
don’t need a computer to calculate 2+2. Abstraction leaks, third-party
solutions and the other things you mention are not problems that always need a
solution. Developers often overthink scalability and invest time in perfecting
code which will become obsolete in a year or two because of a change in the
business model. That is where you can find the art of engineering: in finding
the right problem to solve, rather than working on a problem for which you
know a conventional solution. Zapier is unconventional, but it takes 30
minutes there to do something a development team would need a week or two for.
The limitations do exist, but it often takes some courage to understand that
in your specific case you can ignore them.

~~~
vidarh
The biggest problem is that a lot of them are siloed, sometimes seemingly
intentionally.

E.g. we have a bunch of stuff in Airtable. It's great. You can do amazing
stuff with it, and it has a great API for _most things_.

Except for the change history and comments.

When you're aware of this, it is fine, but the moment your business processes
start depending on the comments and change history, you're locked in, unless
you're willing to use PhantomJS or similar to log in and use the API (of
course there _is_ one) that their web client uses to access the comments,
which is a massive pain.

I'm happy for users to start building stuff and automating stuff. The bigger
problem is when they don't design and don't understand the trade-offs. Often
it works great for years, until it doesn't and you have a massive job
untangling dependencies in some cobbled together solution.

Sometimes that is worth it.

Sometimes it causes you to incur costs far higher than what you saved
initially, because knowledge often gets lost as these systems grow in complexity
without any coherent thought behind them.

Very often, even if you end up using these tools, you'd still be far better
off if there were still a review process to get someone to say "stop, if you
do it this way you're creating a risky dependency; do it this way instead," or
to say "we really need the dev team to handle this."

~~~
ivan_gammel
It’s indeed the job of a solution architect, or of the CTO in smaller
companies, to do this review and develop a strategy for when something needs
to be replaced by custom code. It’s also important to look at the costs not
only in absolute terms, but relative to the IT budget. It’s OK to spend 200K
later instead of 50K now if company revenue is going to be 10x higher, or if
your current budget is consumed by a more important project. The cost of money
and resources can differ.

------
rahimnathwani
Previous discussion:
[https://news.ycombinator.com/item?id=9847955](https://news.ycombinator.com/item?id=9847955)

And, of course, in the same vein:
[https://www.nand2tetris.org/](https://www.nand2tetris.org/)

------
pjmlp
Although the original Project Oberon was great for its day, one should have a
look at Oberon System 3 and Bluebottle (AOS), which evolved Oberon (the
language) into Active Oberon (the language) while offering a much more modern L&F.

Some surviving info regarding System 3:

[https://www.drdobbs.com/architecture-and-design/oberon-syste...](https://www.drdobbs.com/architecture-and-design/oberon-system-3/184409324)

[https://www.semanticscholar.org/paper/Oberon-with-Gadgets-A-...](https://www.semanticscholar.org/paper/Oberon-with-Gadgets-A-Simple-Component-Framework-Gutknecht-Franz/7d938c17c5b759eba8b03f9f49cbe292875aeec6)

[https://sourceforge.net/p/nativeoberon/wiki/Home/](https://sourceforge.net/p/nativeoberon/wiki/Home/)

[https://www.semanticscholar.org/paper/Project-Oberon-the-des...](https://www.semanticscholar.org/paper/Project-Oberon-the-design-of-an-operating-system-Wirth-Gutknecht/54588af85cafce2599ccbb8d9f3a7a0c9c0eb923/figure/68)

Some surviving info regarding Bluebottle (AOS):

[https://os-projects.eu/oberon-system](https://os-projects.eu/oberon-system)

[https://www.linux.org.ru/gallery/screenshots/4163239](https://www.linux.org.ru/gallery/screenshots/4163239)

Unfortunately www.ocp.inf.ethz.ch seems to have been removed from the
Internet. It had quite a lot of information about Bluebottle, although there
are some GitHub mirrors like this one:

[https://github.com/cubranic/oberon-a2](https://github.com/cubranic/oberon-a2)

The doc directory has lots of nice info.

The latest version of the Active Oberon manual, as of 2019:

[http://cas.inf.ethz.ch/news/2](http://cas.inf.ethz.ch/news/2)

~~~
non-entity
Is Active Oberon considered a canonical (for lack of a better term) successor
to Oberon-07, or is it, like Component "Pascal", more of a spin-off from an
Oberon version?

~~~
pjmlp
The lineage is more or less as follows:

    Oberon => Oberon-2 => Component Pascal
                       => Active Oberon
                       => Oberon.NET => Zonnon
           => Oberon-07 (multiple iterations until 2016)

Feature-wise, think Go (Oberon) and D/C# (Active Oberon).

Active Oberon provides features for manual memory management (untraced
references), async/await-style concurrency (active objects, hence the name),
lightweight generics, interfaces, and exceptions.
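
The "active objects" part can be illustrated with a short, hedged sketch in
approximately Active Oberon syntax (written from memory of the language
report; the type name and body are invented):

    TYPE
      Ticker = OBJECT
        VAR count: LONGINT;
      BEGIN {ACTIVE}
        (* this body runs as the object's own thread of control *)
        WHILE count < 10 DO INC(count) END
      END Ticker;

The {ACTIVE} modifier on the object body is what makes the object a
schedulable activity; {EXCLUSIVE} blocks and AWAIT provide the monitor-style
synchronization.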

------
CharlesMerriam2
This has been done with hardware. The OLPC was a clean sheet implementation of
a computer. It created ideas such as only having a few sizes of screws,
shipping extra screws in the case, separating the light bar from the LED
screen, innovative use of polarization, new mesh networking, and a low price
tag.

About half of the innovations were picked up by other manufacturers within a
couple years.

This was done with a programming language. Ada was designed through an
iterative sequence of trials for a clean sheet implementation of a development
language. It created, or popularized, ideas such as explicit module exports,
tying directory and file names to classes, separate "to end of line" comments,
and more.

About half of the innovations were picked up by other language vendors within
a couple years.

Oberon gets mentioned on Hacker News every couple of years. It is hard to say
it's really a clean sheet.

~~~
pjmlp
Many of its ideas, and those of the Xerox PARC workstations from which Niklaus
Wirth got his inspiration, could be replicated via a COM/XPC/D-Bus/gRPC/AIDL-based
desktop; however, such efforts never go to the full extent that Xerox/Oberon
did, because most developers lack an understanding of what it could be like
and never bother to learn from history.

------
Gormisdomai
Related and potentially interesting to HN, a short memo from the creator of
the Oberon language (that backs the OS) comparing it to LISP:

[https://people.inf.ethz.ch/wirth/Miscellaneous/Styles.pdf](https://people.inf.ethz.ch/wirth/Miscellaneous/Styles.pdf)

------
mark_l_watson
I think it was about 1997 when I _ran_ Oberon on a home computer. I thought
the language was good, and the OS and its mostly text-oriented UI were interesting.

It would be good if there was a VirtualBox distribution. I looked but I saw
nothing that I could use.

~~~
cxr
[https://github.com/pdewacht/oberon-risc-emu](https://github.com/pdewacht/oberon-risc-emu)

------
newbie789
This is a lot of PDFs. Is there a basic summary of what hardware/software is
involved available in HTML form?

Sorry if I don't really understand what this is, but it seems like there's a
production-ready hardware and software description (written in 2013?) that is
applicable today in 2020?

Edit: Is this meant for academic study? And if so, is a basic newbie
description available?

~~~
alexisread
The documents outline how to build a computer, from designing the hardware in
an FPGA, to the entire software layer including UI, bootstrapping itself etc.
One person can understand the whole thing.

What's more, this is all done with a compiled language using GC at runtime,
and with a low memory overhead, i.e. Oberon can be used as a systems language.

There are several projects moving on from this software-wise, A2:

[https://www.research-collection.ethz.ch/bitstream/handle/20....](https://www.research-collection.ethz.ch/bitstream/handle/20.500.11850/147091/eth-26082-02.pdf)

Also, following on from A2, Composita introduces a component architecture to
allow managed memory without GC, and contextless switches (IMHO better than
Rust)

[http://concurrency.ch/Content/publications/Blaeser_Component...](http://concurrency.ch/Content/publications/Blaeser_Component_Operating_System_PLOS_2007.pdf)

~~~
newbie789
Thank you so much! I think this is out of my scope skill-wise but it looks
pretty cool! I appreciate this info!

------
axilmar
I don't believe in throwing out the old completely. As others have noted, the
cost of completely replacing the current infrastructure is astronomical.

What could be done instead is a parallel virtual platform inside the current
technology, one that abstracts the current technology to such a degree that
many things become trivially doable.

------
rtlfe
I don't know anything about hardware or EE. Is this something that a person
could actually build from scratch?

~~~
bri3d
From "scratch," as in raw materials, absolutely not. Even the most dedicated
home-fabrication hobbyists ([http://sam.zeloof.xyz/first-ic/](http://sam.zeloof.xyz/first-ic/))
have come nowhere near achieving the
transistor density required to construct the "OberonStation's" CPU or memory,
whether taped out (laid into a chip design using actual silicon gates) or
built using a programmable gate array.

From a circuit board etched by someone else and PCB components purchased
online and soldered together by a hobbyist, certainly, without too much
difficulty (the package used by the programmable gate array requires a bit of
specialty equipment, but is within reach).

One other point about Oberon is that I _believe_ that because the design
predates open-source FPGA toolchains, a closed toolchain is still required to
convert the Verilog (high-level design describing the processor logic) into a
netlist (FPGA configuration), and it hasn't been taped out into a chip,
either. So, the book covers the design of a computer end-to-middle - that is,
without covering how the block / HDL level design becomes logic gates, and
without covering how said gates are then manufactured. This is probably fine -
the scope has to end somewhere. The linked NAND-to-Tetris articles cover the
other end - from logic gates up to typical computing constructs (adder,
registers, shifters, etc.).

~~~
leowoo91
What's the next big limitation Zeloof hit in trying to go smaller?

------
theamk
It is very weird... this code is full of duplicative, hard-coded
constants, something which I would avoid in any software I write.

For example, take a look at the display module [0] -- this 190-line program
contains over 20 instances of the constant "128", in lines like this:

    
    
        sa := base + u0*4 + sy*128; da := base + v0*4 + dy*128;
    

What does this mean? See, there is only one display supported by this OS: a
1024 x 768 monochrome one. Since it is monochrome, you only need one bit per
pixel, and you can pack 8 of them into a byte. If the pixels are sent out
serially and lines follow each other in the buffer, then the starts of
successive lines are (1024 / 8) = 128 bytes apart.

But instead of writing something like "const LineStride = Width / 8", the code
just hardcodes the value all over the place.
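
For illustration, the suggested refactor might look roughly like this in
Oberon (Width and LineStride are invented names; only the final assignment
corresponds to the line quoted above):

    CONST
      Width = 1024;              (* the one supported display width, in pixels *)
      LineStride = Width DIV 8;  (* monochrome, 1 bit/pixel: 128 bytes per scanline *)

      ...
      sa := base + u0*4 + sy*LineStride; da := base + v0*4 + dy*LineStride;

(Oberon uses DIV for integer division.)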

[0]
[https://people.inf.ethz.ch/wirth/ProjectOberon/Sources/Displ...](https://people.inf.ethz.ch/wirth/ProjectOberon/Sources/Display.Mod.txt)

~~~
pavlov
My guess is that this is meant to emphasize how the display module is truly
single-purpose for this specific device.

If you want a different display, they'd prefer you to rewrite it from scratch?

~~~
082349872349872
Exactly. Recently, when Wirth couldn't find an OS with a device driver for his
favourite mouse, he built his own hw + OS + userland combination so he could
keep using the mouse he got as a souvenir of his Xerox sabbatical. (I believe
the monitor and keyboard were off-the-shelf.)

He's not primarily a programmer, but a computer engineer. It wasn't his first
computer system, and may not even be his last. The mouse stays the same size,
but the computers themselves have shrunk dramatically over the years. Maybe
when he turns 90 he'll ship-in-a-bottle build a computer inside his favourite
mouse?

~~~
non-entity
> Maybe when he's 90 he'll ship-in-a-bottle build a computer inside his
> favourite mouse?

Ha, I wouldn't be shocked. Apparently some mice already have ARM processors
inside; why not take it a bit further?

------
avodonosov
Any videos of it running?

------
robertlagrant
Is jcm involved?

------
tonymet
no need, we already have BeOS

------
non-entity
I've been mildly interested in Oberon (the language) as of late, after reading
about Wirth's languages. I remember reading about Pascal and the Modula family
some years ago, but I didn't know Oberon was his creation as well and thought
it sounded like some weird esolang. I almost wrote a compiler for Oberon when
I learned how simple the language is, back when I needed a compiler for an old
architecture.

What were the benefits or breakthroughs of the OS when it was created? I wish
the 2013 book were available in a physical copy; I'm not a huge fan of PDFs.

~~~
pjmlp
Basically the same as Mesa/Cedar, but in a more affordable form than Xerox
PARC hardware would require.

[http://toastytech.com/guis/cedar.html](http://toastytech.com/guis/cedar.html)

"Eric Bier Demonstrates Cedar"

[https://www.youtube.com/watch?v=z_dt7NG38V4](https://www.youtube.com/watch?v=z_dt7NG38V4)

About an hour of video with some of the Mesa/Cedar developers, done by the
Computer History Museum.

As for the breakthrough: these are full-stack graphical workstations written
in a GC-enabled systems programming language.

To put it in a modern context, imagine something like Windows being fully
written in .NET Native, but in the 80's.

~~~
082349872349872
... in the 80's, with a headcount of 2 (who had other academic duties).

> "The primary goal, to personally obtain first-hand experience, and to reach
> full understanding of every detail, inherently determined our manpower: two
> part-time programmers. We tentatively set our time-limit for completion to
> three years. As it later turned out, this had been a good estimate;
> programming was begun in early 1986, and a first version of the system was
> released in the fall of 1988."

I think of Wirth and Gutknecht as demonstrating that one need not be as far
out as Terry A. Davis to do small-team end-to-end work. (compare Carver Mead's
tall thin person)

------
SamuelAdams
Sadly this is not the same as the brew from Bell's Brewery in Kalamazoo, MI.

