The Deep History of Your Apps: Steve Jobs, NeXTSTEP, and Early OO Programming (computerhistory.org)
143 points by protomyth on March 21, 2016 | 113 comments



> “The line of code that the developer could write the fastest… maintain the cheapest… that never breaks for the user, is the line of code the developer never had to write.”

I don't think graphical programming has ever really solved this problem. The only time Interface Builder is useful is with really small applications. Once you have any amount of complexity you're much better off creating your interface programmatically. Interface Builder is just another thing I have to learn, and relearn when Apple decides to change it 2 years from now. If Apple partnered with Adobe and created an Interface Builder / Photoshop hybrid, maybe we could talk. If I could connect my designer's PSD file directly to a Core Data backend, that might provide some legitimate time savings.


I work on a product that has millions of lines of code and has been in existence for over 15 years. It uses Interface Builder and has a very complex interface. IB works just fine. It has quirks like any other tool you'll use, but dealing with it is no worse than dealing with the changes to compilers, graphics drivers, OSes, etc.


Graphical programming cannot make the activity of programming easier. Programming is encoding a system of rules. It doesn't matter whether you do it by point-and-click or by written language, the rules remain the same. The "ease" of point-and-click is discoverability of possible rules and intuitive representation of said rules, but that's nothing that a good IDE can't do at the code level just as well.



Yes it can.

Dragging and dropping to create a data processing pipeline is much faster and less error prone than writing the equivalent code.

An easy-to-see example is audio pipeline-building software, where you hook together various components - http://factmag-images.s3.amazonaws.com/wp-content/uploads/20...

Google also has something like this - https://i.imgur.com/bGTSDf5.png


Only if the code is trivial.

Max/MSP and PD are the go-to products here.

https://cycling74.com/max7/

They're incredibly horrible for anything complicated. Even a simple for loop requires multiple clicks to select the modules you need, multiple lines of typing to set their parameters, and multiple mouse movements to define the connections. It literally takes around ten times longer than typing.

Neither handles encapsulation elegantly and transparently. (Max 7 is the best effort yet, but it's still not pain free.)

They have some nice DSP features for building DSP chains, and as long as you stick with those they're useful. But they're an instant nightmare if you try to do anything with logic that would be trivial in written code.

Reaktor doesn't compare because it just does the DSP thing. It's a very simple pipeline.

The takeaway is that simple module composition is easy, but non-trivial logic flows with many data types can be very challenging indeed.
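For what it's worth, here's a minimal C sketch of the written-code side of that comparison (the gain/clip "modules" are toy examples, not any real DSP framework): composing the modules is plain function calls, and the loop/conditional logic that's painful to patch graphically is a couple of ordinary lines.

    #include <stdio.h>

    /* Two toy "modules": a gain stage and a hard clipper. */
    static float gain(float x, float g)   { return x * g; }
    static float clip(float x, float lim) { return x > lim ? lim : (x < -lim ? -lim : x); }

    int main(void) {
        float buf[] = { 0.1f, 0.5f, -0.9f, 0.3f };

        /* "Patching" the modules together is just function composition,
           and adding logic (loops, conditions) costs one line rather than
           a pile of clicks and cable-dragging. */
        for (int i = 0; i < 4; i++) {
            buf[i] = clip(gain(buf[i], 2.0f), 0.8f);
            printf("%f\n", buf[i]);
        }
        return 0;
    }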


Has anyone tried to write functional programs (not necessarily of the reactive FRP variety) this way? I'm trying to imagine how it would play with algebraic data types and pattern matching.

(I feel Haskell syntax is dreadful once you start trying to achieve anything beyond "functional pearl"-type material.)


Interestingly, Adobe used (not sure if they still do) a programmatic interface builder for the Photoshop interface, where you describe the general layout of the dialog and the various fields/interactions in text.

This library is open-sourced - http://stlab.adobe.com/group__asl__overview.html#asl_overvie...

    sheet clipping_path
    {
    output:
        result          <== { path: path, flatness: flatness };

    interface:
        unlink flatness : 0.0 <== (path == empty) ? 0.0 : flatness;
        path            : 1;
    }


I went to a talk by the author of their auto-layout engine.

He would always tell his manager, "You can give me two weeks to enhance the tool, or two weeks to do it manually. Your choice."


Isn't that Adam and Eve? I remember reading about this a while ago.


A curious question - why is Objective-C not used very widely outside of the Apple ecosystem?

I was playing with it for a while even before the iPhone was a thing, and I kind of liked it, but it's basically only for Apple devices. Nobody on the Linux or Windows side uses it for anything.

I had the misfortune of using GNUStep about a year ago, and it's, unfortunately, still very buggy. Why is there no more activity there?


Without an integrated framework, it's just an OO mechanism hung off the side of vanilla C. And while that mechanism does have some sweet features, it's also slow and really weird looking for someone who's used to C, and seeing it in the middle of C code is like in a Bollywood movie where they suddenly break out into song.


Love the analogy.

Not to veer off course from the original comment re: ObjC, but I think to a certain extent you don't see C# really being used outside the Microsoft ecosystem either, even though Mono does exist. An integrated framework is probably the programming language equivalent of having an "army and navy."


Programming is so based on trends and not on technical merits:

Dart is a great language - it's way more feature-complete than TypeScript, it comes bundled with an official package manager, build system, and dev server (huge value compared to the JS ecosystem schizophrenia), and it's designed by people who actually know their shit and have decades of experience in both language design and VM implementation/compilers (it isn't constrained by JS design decisions).

Dart started out marketing itself as a JS.next - an unrealistic goal, IMO - but has dropped that agenda for like 2 years now (?). That still doesn't prevent people from making uninformed comments about it being a failed Google JS.next language.

There's really an amazing amount of working tooling and code in that project - way better than Angular 2 with JS/TypeScript, for example - but people won't even evaluate it because of reputation.

The same is true for .NET: Microsoft took the JVM and Java and then took them to the next level - stuff that just got into Java (lambdas) or won't get there for years (value types) has been part of .NET for a decade now - and in recent years Microsoft did a 180 on cross-platform and OSS with CoreCLR. That's not even mentioning F# being an MS-sponsored and supported tool with the tooling support that implies, and just the best-in-class tooling with VS in general.

Yet most people won't even touch .NET because it's not hip.


I don't think developers should take the blame for not using a language which has from day one been a deliberate trap intended, by its very usefulness on Windows, to tempt people away from Java into creating programs which can't survive outside Windows. It began as an attempt to nobble Java with incompatibilities (Microsoft J++), and when that got sued off the market, it returned as a "different language" (C#) that looked nearly the same, trying to play on all the same strengths - except the ability to escape from Microsoft's OS. Nor has that ever ended. Microsoft has no interest in helping Mono reach parity with its own version.

You can use it, but your programs will be stuck on Windows, and your time and effort and dedicated mental resources will be joined at the hip to Windows, in a way that doesn't apply to almost any other language.


See, this is exactly what I'm talking about!

I can't blame you for not being up to date with developments on the .NET platform or in the Microsoft world - but please, oh please, refrain from making comments about an issue you are not informed about and obviously haven't looked into for the last couple of years.

Microsoft has their own cross-platform, open source .NET implementation under the MIT license, supported officially on Linux and OS X. https://github.com/dotnet/coreclr

Not only that, but they are open-sourcing and porting huge parts of their stack - build system, web framework, etc. - and are supporting it officially on the Azure cloud. https://github.com/aspnet/Mvc

They are doing pure OSS from-scratch projects such as LLILC - a static compiler for CoreCLR based on LLVM: https://github.com/dotnet/llilc

Furthermore, even before that Microsoft had started open-sourcing large parts of their implementation at the request of the Mono team, and Mono has taken those + CoreCLR and made their implementation better - showing that they actually wanted to help the Mono team reach feature parity.

And finally, Microsoft recently acquired the Mono team (Xamarin) and will probably fold that into their OSS offerings.

So please stop the FUD - it's not helping anything. I realize you have an opinion, but before voicing it please check that it matches the facts - otherwise you're just creating noise in the discussion.


Interesting, stuff changed, and I stand corrected.


So Microsoft has done all this in the past few years. Yes, they are turning their reputation around. But Java has had this from the start, so it has a 20-year head start.


Microsoft is forcing Android vendors to pay over patents for things they didn't even invent. Imagine what they would do if people used things they actually did invent.


They are giving patent exemptions for their OSS stuff (either explicitly in CoreCLR, or in the license through Apache 2 in MVC, for example).

So again, please cut the FUD and try to get informed before discussing these things - you could literally find the stuff I'm talking about in a Google search or two.


>Yet most people won't even touch .NET because it's not hip.

To be fair, .NET can't replace the Java ecosystem now and won't replace it in the coming years. It's just not the same thing. Yes, C# is way better than Java, but there is so much more going for Java. For starters, Java is truly multi-platform and the ecosystem is way richer. So no, it's not that "people won't even touch .NET because it's not hip"; people won't touch .NET because, even though it's really good, it's not that good - at least not when you take the whole ecosystem into consideration.


Exactly. OpenJDK and Oracle JDK for the most part are interoperable. In fact, the open-source version of the JRE is the reference standard. You'd be able to run a web application almost out of the box with Java. It wouldn't be so possible with Mono and .NET.


Microsoft open-sourced an official cross-platform implementation of .NET called CoreCLR - and although it's still slightly young, you can do what you just said; they support it through Azure on Linux and offer OS X builds.

Also, Mono took the better parts of CoreCLR and made their implementation better.

Also, Microsoft acquired the company behind Mono/Xamarin and, given their recent trend, will probably just fold that work into their OSS offerings (at least partially).

So please get your facts straight before commenting on the issue, because that's sort of the problem I'm getting at in my post.


See, I don't have a problem with this argument if you've really made the evaluation - that is, if you really can't find the stuff you need.

But I don't ever hear that; at best I see that argument based on assumptions (without checking), and more often it's just "ewwww Microsoft".


> Yet most people won't even touch .NET because it's not hip.

.NET isn't that well supported outside of Windows though. At least not to the same degree Java is. And a lot of that is down to Microsoft. .NET was pretty awful when it was first released. Granted, it's been solid for a great number of years now, but originally it was just MS lashing out at Sun because Sun Microsystems took MS to court over Microsoft's own (partly incompatible) version of Java (massively paraphrasing history here!!!). But that was a long time ago and .NET has diverged and morphed into so much more since. However, for a long time it felt like Microsoft wasn't giving any love to .NET outside of Windows. Yes, I appreciate Mono does exist, but there's a lot of proprietary stuff in .NET that, until recently, MS refused to port. So even once .NET really became an attractive technology to develop against, its cross-platform support was still stunted by Microsoft's own need to dominate.

It's also worth noting that .NET is still hugely popular - at least in England. But those organisations running .NET tend to buy into the entire Microsoft stack (.NET, IIS, MS SQL Server, etc) - which really just goes to reinforce Microsoft's justification for making Mono a second-class citizen.

However with Microsoft's recent headlines (SQL Server being ported to Linux, greater collaboration with Mono, etc), there's a real potential that .NET will become a lot more relevant outside of Windows too. Unfortunately for Microsoft, there's now also more competition in the Linux / UNIX ecosystem than there's ever been.


I think the main reason Dart was poorly received was that it is as complicated as a statically typed language and as unsafe as a dynamically typed one. The unsoundness is not just superficial; every assignment is a potential run-time type error waiting to happen. In practice, there's really no type safety at all, and it isn't used in the optimizer, so why bother?


Firstly, you can run Dart in checked mode, which will catch all type errors. Secondly, static type checks will catch most errors. Even if there were no type checking at all, the types are still hugely useful for documentation, navigation, and tooling. Gradual typing can be more productive than pure static typing in that you can play fast and loose when you want to prototype something and then tighten up later when you have arrived at a definition of the solution.

Most of your arguments could also be applied to TypeScript (which doesn't even have the option of runtime type checking).


Because the tooling is top-notch while the language is more flexible/expressive than, say, C#/Java.

Async/await and futures/observables are built into the std lib, the DOM API is wrapped up in them and fully typed, etc.

It's not an academically revolutionary language - it's just a really solid, practical language, with many modern ideas implemented from the start, good tools, and it should be immediately familiar to C#/Java/TS developers.

It has tradeoffs but IMO it fixes so many annoyances in TS/JS.


Most people won't touch .NET because the only deployment option outside of Windows is Mono, which depending on your use cases (for server software) ranges from "barely works" at best to "not even compiling your code using mcs" at worst.

Unless you start with Mono and port to .NET, you may as well forget it, and even though the runtime has the _potential_ to be nicer because of proper generics and value types and many other things, outside of Windows this is just not the case right now. Core CLR (or whatever it is called today) may fix this in future, but if you need to ship now or in the next year it is a poor bet.


ASP.NET MVC 6 is at RC, pretty close to stable, and will support cross-platform CoreCLR out of the box - Microsoft will probably start offering Linux packages for it on Azure when it's released.

But even if you need something right now and CoreCLR doesn't cut it, the Mono compatibility story has gotten a lot better, because they could just take stuff from CoreCLR and Microsoft's other recent OSS drops - so I would say cross-platform development with .NET is much less painful than it was even 2 years ago.

What's more, JetBrains is working on a C# IDE, so you won't be stuck with MonoDevelop/VS Code on non-Windows platforms.

I'm thinking most of this will be out during BUILD.

The last few posts here make me sound like a .NET advocate :D For the last year I've been doing C++/Python/Dart/JavaScript.

I also used Clojure/JVM before - I'm not one of those "full stack .NET" developers - I just had great experiences developing on the .NET platform, have been keeping an eye on the ecosystem, and I hate it when people dismiss it on false assumptions - there is some top-of-the-line stuff in that platform that would make my life as a developer better on a daily basis.


>as Mono does exist

Sure, but truth be told, Mono is not as good as the official .NET Framework, even more so when it comes to doing GUI applications. I don't think most people would choose Mono to do a serious cross-platform application when Qt and Java are so much better for this.


That's exactly the point I was trying to raise - without an integrated framework (like the JRE), it's hard to consider Mono and .NET a cross-platform toolkit. In fact, .NET and Mono have entirely different APIs for doing UI, while Java is more or less consistent across different JREs (including OpenJDK, Oracle, IBM, etc).

People have tried really hard to bring C# onto other platforms, even when it starts feeling forced.


C# is the main language used with the Unity game engine, which is currently the most popular game engine around.


I'm not sure I understand the analogy. Do you mean that it is an integral portion of the framework and basically the entire reason people use C?


Hollywood did that too, you know? It wasn't strange back then.

Ask Singin' in the Rain and West Side Story :)


(Some of the responses here are hilarious. I feel like I'm reading a Slashdot thread from 1998.)

I think it's interesting to compare Objective-C with Microsoft's COM. Both originated at roughly the same time (late 80s). Both aim to add dynamic, object-oriented features to plain old static C. Both provide a thin runtime library and C-based API to bring those features to C. Both use the same reference-counted memory management. Both allow you to dynamically load objects at runtime, and query about the capabilities of these dynamically-loaded objects in a general way. Both have been integrated into multiple different object-oriented programming languages. Objective-C differs in that it is both a runtime library and a language, where COM is just a runtime library.
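As a rough illustration of how thin that C-level runtime API is, here's a minimal sketch (my own example, not from the article; on macOS it should build with something like `clang demo.c -framework Foundation`) that looks a class up by name, asks what it responds to, and messages it - roughly the moral equivalent of COM's QueryInterface-then-call dance:

    #include <objc/runtime.h>
    #include <objc/message.h>
    #include <stdio.h>

    int main(void) {
        /* Look a class and selectors up by name at runtime -- no
           compile-time knowledge of the class is needed. */
        Class cls   = objc_getClass("NSObject");
        SEL alloc   = sel_registerName("alloc");
        SEL init    = sel_registerName("init");
        SEL descSel = sel_registerName("description");

        /* objc_msgSend must be cast to the right function type before calling. */
        id (*send)(id, SEL) = (id (*)(id, SEL))objc_msgSend;

        id obj = send(send((id)cls, alloc), init);

        /* Ask the object what it can do, QueryInterface-style. */
        if (class_respondsToSelector(object_getClass(obj), descSel)) {
            id desc = send(obj, descSel);
            printf("description is a %s\n", class_getName(object_getClass(desc)));
        }
        return 0;
    }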

COM became hugely popular in the late 90s. While it's natural to use COM to implement a plug-in framework for an app, people started to use COM for everything. This led to the deCOMtamination movement of the early 2000s.

I too did a bunch of COM development in the late 90s, just like everyone else. If I knew then what I know now, I would have ditched COM entirely and used Objective-C and OpenStep. But even if one had a time machine and could travel back to the 90s with knowledge of the future, there is no way you could convince anyone else to ditch COM for Objective-C. The syntax is unusual to C, C++, and Java programmers, and there are just too many misconceptions about Objective-C floating around (e.g., see some of the other responses in this thread).

Objective-C has many interesting features that make it ideal for doing the kind of large-scale client-side desktop and mobile app development that it's known for. But I think the truly compelling aspect of Objective-C is the cohesion between the language and the UI frameworks that make up Cocoa and Cocoa Touch. The frameworks and the language were developed hand in hand.

Objective-C without the Frameworks isn't compelling to people who aren't already familiar with the language, and those frameworks are (now) only available on OS X and iOS.


COM was popular, yes, but it was a nightmare. Remember all the threading models? Here's just a taste:

When a free-threaded apartment (multithreaded apartment model) in a client creates an apartment-threaded in-process server, COM spins up a single-threaded apartment model "host" thread in the client. This host thread will create the object, and the interface pointer will be marshaled back to the client's free-threaded apartment. Similarly, when a single-threaded apartment in an apartment-model client creates a free-threaded in-process server, COM spins up a free-threaded host thread (a multithreaded apartment) on which the object will be created and then marshaled back to the client's single-threaded apartment.

I remember thinking I sort of understood this back in the day, but now I'm pretty sure I didn't.
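For reference, the apartment choice being described is declared per thread at initialization time; a minimal C sketch using the real Win32 COM calls (no actual server creation shown, link against ole32.lib):

    #include <stdio.h>
    #include <objbase.h>

    int main(void) {
        /* Every thread that touches COM first declares its apartment model.
           Mixing models across client and in-proc server is what triggers
           the hidden "host" threads and marshaling described above. */
        HRESULT hr = CoInitializeEx(NULL, COINIT_APARTMENTTHREADED); /* STA */
        /* COINIT_MULTITHREADED would join the process-wide MTA instead. */
        if (FAILED(hr)) {
            fprintf(stderr, "CoInitializeEx failed: 0x%08lx\n", (unsigned long)hr);
            return 1;
        }

        /* ... CoCreateInstance and friends would go here ... */

        CoUninitialize();
        return 0;
    }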


> (Some of the responses here are hilarious. I feel like I'm reading a Slashdot thread from 1998.)

+5 Insightful


C++ was used in Windows, and Objective-C never had an official standard. It was also considered a direct descendent which got used in UNIX. Objective-C really needed a standard library.

The cranky person in me would also point out comp.lang.objective-c had an individual who really hated NeXT and made life heck on others.


I made myself learn it back when the App Store was first released, and the downsides are mostly related to the strange syntax and strange memory management model. It definitely has some good features, and I do like the verbose style of inline naming (which is great for self-documenting code), but the downsides far outweigh the upsides.

Java as a competitor is just a much easier language to read and write, though I would not say it is a better language. It gained widespread adoption because of that ease of use and ability to run on so many platforms.

Whether you agree or not, C++ is perceived as a better language and is definitely better supported across multiple platforms. As an example, making an App that shares code on both iOS and Android is impossible without the Native C++ bridge on both. So despite the failings of the language (of which there are many), it really comes down to portability.

This is why Swift is such an interesting experiment to me. As a language it is pretty good, very modern and easy to understand (though not super expressive or powerful), but, the thought of using it for both Android and iOS development is actually very appealing to me.


Just a clarification.

I think the interface can be even "lower" than C++ ...

C interface to iOS: http://stackoverflow.com/questions/10289890/how-to-write-ios...

C programs can access Android's JNI ( http://developer.android.com/ndk/samples/sample_hellojni.htm... ).
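A minimal sketch of what the C side of a JNI call looks like (the package/class/method names here are hypothetical; only the JNI naming convention and the NewStringUTF call are real, and the function is loaded and invoked from the Java side rather than run standalone):

    #include <jni.h>

    /* Exported under JNI's name-mangling convention for a hypothetical
       Java method: package com.example.hello, class MainActivity,
       native method String stringFromJNI(). */
    JNIEXPORT jstring JNICALL
    Java_com_example_hello_MainActivity_stringFromJNI(JNIEnv *env, jobject thiz)
    {
        (void)thiz;  /* unused */
        return (*env)->NewStringUTF(env, "Hello from C");
    }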


Interesting. The syntax and the memory management are something I liked almost immediately, even though they're totally alien compared to other languages.

Once I got it, I wondered why other languages aren't cool like this.

And then I never used it again in real life (since back then nobody had Apple hardware, and then I moved to JavaScript...)


At the time of NeXT, you did pay for some of the dynamic-'nice'ness of ObjC with efficiency... and efficiency was critical in the days of 16MHz 030's.

Like many technically superior technologies, it didn't find market success because of the momentum of an inferior product.


It was actually quite nice to use. Windows felt a lot more sluggish than NeXTSTEP on the same hardware.


Did Windows and NeXTSTEP run on the same hardware?


In a manner of speaking, yes. It ran on PC hardware, but when it was first released the system requirements were at the high end of the spectrum. (Local bus, 32MB RAM, fast disk, specific graphics cards, etc.) In 1993-4, I lived in a dorm over a store that focused on NeXTStep/PC hardware (Omegabyte in Austin, TX), and remember it being around a 4-5K proposition to get started, at least.

Flash forward a couple of years to the summer of 1995, and the NeXTStep system requirements are easier to attain. Unfortunately, by then both Linux and Windows 95 have made it a lot less important.


Yep, NeXTSTEP 3.3 had an Intel version (as well as versions for other hardware). I had NeXTSTEP 3.3 and 4.x on an Intel 90MHz Pentium with a SCSI drive (Microway was the company I bought it from). NeXTSTEP was smooth, and Windows was a tad bit cranky.


Apple/NeXT fought it. To the point of GPL violation in their GCC fork.


Maybe what the Objective-C runtime does isn't really needed? It doesn't make much sense to have every method patchable, it doesn't make much sense to be able to build classes at runtime, and it doesn't make much sense to have all calls go through a trampoline (with a selector that must be obtained with another call).
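Concretely, those are the bits the plain C runtime API exposes. A minimal sketch (my own example; the "HNDemo" class name is made up; on macOS it should build with something like `clang demo.c -framework Foundation`) of building a class at runtime, patching a method into it, and calling it through the trampoline:

    #include <objc/runtime.h>
    #include <objc/message.h>
    #include <stdio.h>

    /* A plain C function that will become the IMP of a new method. */
    static int answer_imp(id self, SEL _cmd) {
        (void)self; (void)_cmd;
        return 42;
    }

    int main(void) {
        /* Build a class at runtime and patch a method into it. */
        Class cls = objc_allocateClassPair(objc_getClass("NSObject"), "HNDemo", 0);
        class_addMethod(cls, sel_registerName("answer"), (IMP)answer_imp, "i@:");
        objc_registerClassPair(cls);

        /* Every call goes through the objc_msgSend trampoline, keyed by a
           selector that first has to be registered/looked up. */
        id  (*send_id)(id, SEL)  = (id  (*)(id, SEL))objc_msgSend;
        int (*send_int)(id, SEL) = (int (*)(id, SEL))objc_msgSend;

        id obj = send_id(send_id((id)cls, sel_registerName("alloc")),
                         sel_registerName("init"));
        printf("answer = %d\n", send_int(obj, sel_registerName("answer")));
        return 0;
    }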


Because people keep forgetting that languages alone are worthless.

What matters are tooling, libraries and vendor support.

GNUStep is quite old and has only ever had a couple of devs at most working on it, going back to the NeXT days.

Also, GCC never had 100% support for the Objective-C runtime; there was just the initial contribution that the FSF forced NeXT to make.


> What matters are tooling, libraries and vendor support.

Yeah, I thought so, too. Then we built a content management system in Objective-C on Sun/AIX/Linux. We had a so-so Foundation and that was it. It was amazing. Completely changed my mind.


Well, WebObjects was the genesis of J2EE actually, and it was great.

Probably only Smalltalk had a similar experience in those days, with commercial Lisp vendors already fading out.


Again, it might have changed since I last looked (about a year ago), but I was trying to port a command-line utility from OS X to Linux using GNUStep, and it was so buggy it was almost useless. So I gave up.


Why would anyone want to use it? Especially before ARC. Apple created Swift for a reason.


One quibble: I believe Dell's first e-commerce site was developed in WebObjects, the other crown jewel at NeXT, but of course written in Objective-C.


"Early OO-programming"

Hello, Smalltalk.


One of my first languages out of Uni. Still influences my thinking to this day 15 years later.


While hardware is often perceived to be "harder" than software, I feel like the NeXT team were right about the system specs for a graphically driven OO environment. It took them 20 years to really get their software environment to the point where developers could produce compelling experiences. In that time the same specs shrank from a tabletop cube to a pocket-sized computer.


I still have two NeXTstations (one is a Turbo!) sitting in my closet. Wonderful machines and a fantastic OS for its time. I don't have monitors for them anymore, but even so, I doubt I'll ever part with them.


> Naroff integrated Objective-C directly into the C compiler NeXT was using, the open source GNU C compiler, GCC, working closely with Richard Stallman

Probably #1 on Stallman's list of "things I regret most".


What's to regret? Stallman, until he consulted with a lawyer, thought it was going to be OK for NeXT to not open source the Objective-C part at all. After consulting with a lawyer he asked NeXT to open source it, so I would presume he's happy with the outcome?

https://en.m.wikipedia.org/wiki/Objective-C#cite_note-7

At least, this was the story that was presented to me as the reason that GPL results in more open source software than permissive licenses.


Ahhh, NeXT... probably the nicest environment/OS/machine that never took off....


NeXT (with a bunch of Mac baggage) is basically the current OS X


The "NS" you see all over OSX API's stands for NextStep

http://stackoverflow.com/questions/473758/what-does-the-ns-p...


Well, they removed quite a bit when they added the Mac baggage. I still would rather have the interface to OpenStep 4 than the current OS X.

The whole concept of the menu bar on the top was nice on a 9" display, but it is a major pain on a 34" monitor.


Conversely, I find it annoying to have to chase down menubars that follow their parent windows around on large, high-resolution screens. The principles of a global menubar (being able to throw your cursor to hit it) still work on large screens... I just crank up the cursor speed so hitting it takes roughly the same amount of physical movement as it did two or three decades ago.


Not useful, but I agree completely. See also Fitts's law:

https://en.m.wikipedia.org/wiki/Fitts%27s_law
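For reference, the usual (Shannon) formulation - T is the time to acquire a target of width W at distance D, with a and b device-dependent constants:

    T = a + b \log_2\left(1 + \frac{D}{W}\right)

A menubar pinned to the screen edge makes the effective target depth along the approach axis essentially unbounded, which is why you can just slam the cursor at it.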

I like the ability to slam the cursor to the top of the screen either for purposes of using a menu or finding a "null" place where I can click to abort drags, or just locate the damn cursor.

I'm also a caveman in thinking that multiple overlapping windows are ideal for programming and I hate the way Xcode has adopted the MDI style. I always end up using the "open quickly" dialog and typing the name of the document I want rather than any combination of hunting it down in a menu.


I'll reply to the thread here since I think many people have forgotten how NS menus worked.

By default the app menu was in the upper left; it went down, and submenus opened to the right.

If you didn't like it there, you just moved it wherever you liked; NS remembered where you wanted it for every individual app.

Want access to a particular submenu at any time? Just tear it off and drag it wherever you like.

Don't want to move the mouse to get to the menu? Just middle-click and the app menu appears under your mouse cursor.

The NS menu was the best of both approaches.


Mac OS X isn't the only one that uses that space. Windows 10 still has the taskbar, which is about the same size. Gnome still has whatever you call the thing at the top of the primary display. IMHO, they all serve roughly the same purpose.


stumpwm doesn't have any of that, and it's glorious. I can launch anything I want from a pop-up command line. Feels like the future of windowing systems!


Absolutely, OS X has a great heritage, but they're also backing away from the purity of the design too... (e.g. microkernel, simplicity, minimalism, etc).


NeXTStep was never running on any more/less of a microkernel than OS X.

Mach is essentially a microkernel PLUS most of a BSD kernel wrapped around it, all running in ring 0. It's not very microkernely in a true sense. This was done to achieve performant Unix compatibility, which was a major goal of the CMU research project Mach originated as.

NeXTStep used Mach 2.5, where the BSD personality was based on 4.3 BSD. OS X upgraded to Mach 3.0, with the BSD personality derived from FreeBSD. In both cases all of the drivers are running in kernel space -- NeXTStep never had, eg, filesystem drivers running in user-space like you'd expect from a "true" microkernel.


The platform was quite successful in financial and scientific circles where ease and speed of development was key. That's what led to Tim Berners-Lee having such a machine on his desk whilst at CERN and hence developing the HTTP protocol, the HTML language, and basically the first web server and browser with it.


yeah... I know... I've got a cube and 2 NeXTStations... but while they were popular in engineering circles, they deserved more mass-market appeal than that. Unfortunately, they were also pricey in their day...


At those prices they never stood a chance of becoming anything more than a niche item - and this was very clear to those who formulated the pricing strategy. They did expect a higher success rate in universities and higher education generally, and the consequent students-choosing-to-work-on-the-platform-they-learnt-on effect, but new graduates had almost no say in such choices "back then" and there was never much take-up in the educational sector to begin with.


Do you know if that work was done in (more-portable) C or in Objective-C?


The work was done in Objective-C and took around 5KLOC (including editing).

> I could do in a couple of months what would take more like a year on other platforms [1].

You can download the source code.

TBL got permission from his manager to do WWW as a side project. For NCSA Mosaic, it took a team of 5 a year to produce 100KLOC of C. In other words, it would not have been possible as a side project, and thus we would not have the WWW.

[1] http://www.w3.org/People/Berners-Lee/WorldWideWeb.html


> Ahhh, NeXT... probably the nicest environment/OS/machine that never took off....

That would be BeOS.


I was a big BeOS fan back in the day, but honestly, in retrospect I think NeXTStep was a more advanced and better-thought-out system. BeOS went with C++ for GUI programming, for example, which is a terrible language to do GUIs in. I wasted about 10 years of my life doing GUI programming in C++, and any C++ solution which didn't totally suck essentially ended up duplicating, in a crappy way, what had already been figured out in Objective-C.

BeOS's approach to multithreading was really just a dead end. Nobody does multithreading that way anymore.

NeXT also had a more sophisticated display system, based on 2D vector graphics. They had the bundle concept and a lot of other things which I think in sum made it a better system.

What I liked about BeOS was the heavy use of file system attributes, and the translators, where you could just drop in a GIF translator and then all your graphics apps could suddenly read GIF images.


"BeOS approach to multithreading was really just a dead end. Nobody does multithreading that way anymore."

I was about to start digging up more on its multithreading due to interesting things I read in architecture documents. Not to mention their multimedia demos. Read lots of positive stuff with interesting tricks like "benaphores."

So would you clarify why you think it was a bad design and why nobody will use it any more? And did their concurrency or other design efforts get anything right in your eyes?


I programmed for BeOS a little (mostly before I really understood programming, though), and what I remember is just the sheer number of threads. Each window (maybe even subregions of windows, don't remember) would spawn two, I think. You'd have to lock to do the simplest things. A lot of rote, manual management of shared mutable state.


I get you. So it's not the architecture or how threading worked under the hood, but the developer experience the APIs forced that was bad? Right?

That could be fixed easily in a knockoff.


I'm so delighted to find somebody else who remembers BeOS.


I regularly cite it as an alternative to mainstream desktops and an example of a microkernel system that performed well. Here's a post with it and a bunch of other systems with excellent properties worth copying:

https://news.ycombinator.com/item?id=10957020


I loved BeOS. I still have R4 disks lying around - very nice system.


We were discussing microkernels vs. monolithic kernels recently, with BeOS as an example. My opponent believed compilation times and apps doing many file accesses (e.g. a web server) would be really slow on such a system vs. Linux or Windows. Did you do either of those on BeOS and alternatives? If so, what was your experience with that?

Note: BeOS is a special case since the filesystem is a database. So it can't prove his point unless it has XFS or something. But still curious.


Wasn't it just a bad clone of IRIX? Why the hype?


Not at all. Irix was a fairly standard System V-derived Unix that ran on Silicon Graphics hardware, which typically was specially built for high graphics performance.

NeXTSTEP broke with Unix tradition in many ways, notably using Display PostScript rather than the X Window System for display, offering Objective-C rather than vanilla C as the primary systems language, and attempting to provide a friendly WIMP interface as the primary mode of usage rather than the somewhat primitive window managers in use on Unixes at the time. The NeXT hardware, in contrast, emphasized multimedia features (like CD audio, a megapixel display, and desktop publishing) over graphics rendering performance.


Sun's NeWS previously used PostScript extended with an event system. Of course that meant a heavyweight display client running arbitrary code in an interpreted, Turing-complete language communicating with application backends over a network, and something that ridiculous could never catch on.


That's what I said about centralized computing rooms with big iron racks that charged per account for specific amounts of data served to dumb clients over a network. Amazon, Google, and Microsoft say they've never heard of these mainframes. I swear something looks familiar about their feature sheet, though.

Probably best not to give them any more ideas. ;)


And what we have today with HTML/CSS/JS is better, right? ;)


That was the target, yes. NeWS was too big and too slow, but what we ended up with instead is ten times slower, a hundred times larger, and a thousand times uglier. The promise of the world wide web of target-independent structural markup turned out to have its fingers crossed.


interesting parallel... yes we should scrap the JS/HTML/CSS mess and go back to straight PS.... ;o)


IRIX 5 or 6 had the system settings application as a Mozilla XUL application running JavaScript. That was the early nineties.


Early 2000s, maybe. XUL wasn't around until '99 or so.


On the serious side, it would have made an interesting alternate to the web.


IRIX UI displays were based on Display PostScript, before SGI switched to X.


Huh, I did not know that! You're right, looks like DPS and X got merged in during IRIX 4.x.


SGI's X server (Xsgi) continued to support the DPS X extension up through 6.5.22. It was removed after that, either for licensing or support reasons depending on which rumors you believe. Some of the Adobe apps (Photoshop, Illustrator) used DPS.


I still remember how neat it was to type PostScript into a console window and see the result appear on the root window.


IRIX (at least circa 6.5, that's the earliest I can remember) was primarily used for graphics. I'm not sure what specifically made the architecture so graphics-friendly (maybe a co-processor daughter board?) but it could render 3d protein fold simulations from PDB files circa 1995 with little difficulty compared to the other gear in his lab. I also remember IRIX being used heavily for CGI in major motion pictures. (I want to say Lightwave or Maya was originally SGI only, due to architectural limitations on every other platform.)

AFAIK NeXT never put out any hardware that came close to allowing that sort of rendering capability. Xerox PARC's platform is probably the closest analogue.[1] Basically all of our modern-day GUI paradigms are based off of what they did in the late 70s. @pjmlp will jump in with more detail if he sees this.

[1] http://www.cnet.com/news/tracing-the-origins-of-the-macintos...


> IRIX (at least circa 6.5, that's the earliest I can remember) was primarily used for graphics. I'm not sure what specifically made the architecture so graphics-friendly

Every workstation vendor had their own UNIX back then. SGI built in what became OpenGL and had the graphics hardware to pair it with.

Later on they added things like the XFS file system (think 9TB volumes in 1994), 64-bit MIPS support, and multi-CPU support - all of which were key in allowing SGI to do the things they did with 3D.


To this day I still have an irrational love for SGI systems. I've been keeping an eye out for a case to use in a modern build, just because. SGI also pushed the industry in good directions, IMHO.


SGI is basically the company that shows amazing engineers cannot overcome psychotic management. From their Windows strategy to their spending, SGI is the snatched-defeat-from-the-jaws-of-victory company.


I wonder whether it all went wrong when SGI bought Cray Research in '96 or if things were already sliding downhill by then.


I think they were just torn between doubling down on their own RISC processors or moving with the times and going Intel. They ended up trying to do both, and that's never going to work.


I think 1995, with the Windows NT strategy, marked the end. They were sliding in '96 and had already sold patents to Microsoft by then.


Same here.

And the colors! Those purple shades, wow.

I remember seeing my first Indigo machine and being utterly amazed at the design of the computer, the big-ass monitor, and the graphical interface (my friend and I just stood there opening and closing the program groups since they had this very cool 3D effect... and of course we were younger so this was super rare for us).

Amazing machines. I would love to get my hands on one of them.


They were great machines - the Indy, the O2, the Octane, etc. At the time there was really nothing comparable from a graphics performance perspective. However, even the latest MIPS SGI machine, the Tezro, gets absolutely spanked by a modern machine in every possible way. They still have a premium price tag, probably due to rarity and legacy software in a few industries, but have poor performance vs. the alternatives.

I'm putting my hope in Raptor Engineering or another firm delivering a POWER-based system; the days of the great Unix workstation providers are in the past.


I was only 12 or 13 then but I remember the discontinuity between an AS/400 setup vs SunOS 2.6 (does that sound right for about '98?). But yeah I distinctly remember fighting against the platforms with such frustration (though it helped me a lot in my career so I can't complain too much).

Wow, I realized they had a bit of advantage over their competitors but I didn't know their engineers were of that caliber. I knew that SGI made STL, but I had totally forgotten that they standardized OGL as well. Multi-CPU support VAX/VMS style still takes the cake in my book but I remember XFS being absolutely incredible re: latency against ext2 on Linux (2.2?) for small files while running an image host in the early 2000s.

Anyways, that makes a lot more sense now as to why it'd be the platform of choice. I wish I had a chance to grab that gear before it was decommissioned :(

RIP SGI - fond memories.


"I'm not sure what specifically made the architecture so graphics-friendly (maybe a co-processor daughter board?)"

It had 64-bit processors, high-bandwidth NUMA links for CPUs/boards, plenty of RAM, interconnects from CPUs to graphics cards that removed obstacles to performance, often custom graphics cards, custom software interfaces, whole-program optimization, and the ability to scale up to 256 CPUs and hundreds of gigs of RAM in one image. Compare the original Final Fantasy movie to anything 3D-animated from that time period to see what SGI Onyx machines could do. ;)

Hell, it still looks good despite all the advances since then by gaming and rendering engines:

https://www.youtube.com/watch?v=2zQ-LhwYqdQ

That doesn't happen a lot. Quite a bit can be done in real time on modern systems. A few things in the trailer still look better to me than modern games. I recall the detail even included physics and rendering for individual strands of hair. Too bad the plot and acting sucked, because it was the first to look real enough to fool people passing through the living room. Scared the rural ones, too. It took 32 fully loaded Onyx2 machines plus tons of custom software.

For desktops, Bill Gates and SGI did this demo on a machine with two 150-200MHz processors plus their graphics cards:

https://youtu.be/zuMSk_S2ARI?t=1m22s

I start there because the first trick will impress you if you recall what multimedia, graphics, or concurrency were like on Windows PC's in 1998.


I own a working NeXTSTEP machine (Color!), if anyone wants one. I'm not using it.


I've been looking for one. Where are you based?



