Ask HN: Will we be using Unix derived OS's for the next 50 years? - vsbuffalo
======
eggestad
Depends on whether you're talking about the user-level part or the kernel-level part.

Android is a "new" OS at the user level, even if it has a Unix kernel. And
I'd claim that Android's user-level stuff is so different from Unix that you
can't claim it's Unix derived.

I'm pretty sure I'll be doing my development in Eclipse on an Android-based
workstation in the not-too-distant future.

For a new type of kernel, there has to be a real upside to switching. For the
world to really take the time and make the effort to use something different,
i.e. a new kernel based on different principles, you really need one of the
following to happen:

* You need to come up with something a kernel can do that is a MUST HAVE and can't be implemented in a Unix kernel. (In 40 years this has not happened.)

* You get better performance. (Again, it's hard to see how you'd effectively compete with the sheer amount of engineering effort put into something like Linux.)

* You can get by with a MUCH simpler kernel. (As Linux is a modular design that can be stripped down, it's hard to see how you can compete.)

* We're going to start building hardware on a different HW architecture that demands a different programming paradigm.

The last part is not as far-fetched as you might think. There is a growing gap
between how we programmers perceive machines to work (and a Unix kernel does
provide a user-space process with a virtual machine of the kind programmers
expect) and how hardware now actually works. But since a lot of people have
tried to come up with something else and failed, I'm not holding my breath.

TL;DR: There is no foreseeable benefit to users in a new non-Unix kernel.

------
zackmorris
You know, I sure hope not.

Good points:

* applications run in a sandbox (preemptive multitasking/protected memory)

* interprocess communication is well implemented (copy on write, pipes, stdin/stdout/stderr)

* everything is a file (so many data structures and APIs become superfluous once you realize this)

* atomicity is for the most part robust which allows scaling (mutexes, semaphores, file locking)

* open nature of code lends itself to better security, size and performance
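
A minimal sketch of the IPC and "everything is a file" points (in Python, and the function name is mine, not any standard API): a pipe's two ends are ordinary file descriptors, driven by the same read()/write() calls used for regular files, sockets, and stdin/stdout/stderr.

```python
import os

def pipe_roundtrip(payload: bytes) -> bytes:
    """Send bytes through a kernel pipe and read them back.

    The pipe ends are plain file descriptors: the same os.read/os.write
    calls would work on a file, a socket, or stdin/stdout/stderr.
    (Payload must fit in the pipe buffer, typically 64 KiB.)
    """
    r, w = os.pipe()       # two file descriptors backed by a kernel buffer
    os.write(w, payload)   # same syscall you'd use on any fd
    os.close(w)            # closing the write end gives the reader EOF
    data = os.read(r, 65536)
    os.close(r)
    return data

print(pipe_roundtrip(b"hello, unix"))  # b'hello, unix'
```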

Bad points:

* Hierarchical filesystems are a dead end (the future is all about metadata, hashes, diffs and relationships)

* Too much emphasis on brevity, while size becomes less important over time (acronyms, abbreviations, regular expressions, bash, perl, etc)

* Human-oriented concepts, ironically, don't work well for the use cases humans want (permissions, process priority, executable bit)

* Basing everything on source code instead of binaries needlessly increases everyone's workload

* Dependency hell

Honestly, I could come up with ten times as many examples as these. Especially
for the bad points, it's worth keeping an open mind about what could be
possible if we thought about how the world is moving toward treating data as
essentially infinite. I think computers of the future will work more like how
Google does things with MapReduce and Go.

It just kills me every time I can't find something on my hard drive when I
KNOW so much about it (what I was thinking at the time I made it), just not
its name or contents. Or when I lose hours, or even days, trying to make the
simplest command work in the shell, or to set up a config file (for BIND,
etc.).

I think UNIX reached a pinnacle with Mac OS X, but now it will enter a long
period of slow decline as multiprocessing and higher-level languages begin to
replace all of the things we used to do by hand. Especially with regard to how
we develop software today: so much of it (makefiles, even compiled code),
while not necessarily UNIX-centric, is going to go the way of the dodo. These
days the vast majority of my time, perhaps as much as 90%, goes to learning
curves, getting anything to work at all, and fumbling in the dark without
being able to see where a problem comes from. The operating systems of the
future, whatever form they take, are going to solve these problems in ways
that I think would be difficult with a command-prompt mindset.
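
The MapReduce style I mean can be sketched in a few lines (a toy word count in Python, nothing like Google's actual system; all names here are mine):

```python
from collections import defaultdict
from itertools import chain

def map_phase(doc: str):
    # map step: emit (key, value) pairs -- here, (word, 1) for each word
    return [(word, 1) for word in doc.split()]

def reduce_phase(pairs):
    # shuffle + reduce: group pairs by key, then combine each group's values
    groups = defaultdict(int)
    for key, value in pairs:
        groups[key] += value
    return dict(groups)

docs = ["the quick brown fox", "the lazy dog", "the fox"]
counts = reduce_phase(chain.from_iterable(map_phase(d) for d in docs))
print(counts["the"])  # 3
```

The point is that each phase is an independent pure function over the data, so the runtime can spread it across machines without the programmer doing anything by hand.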

~~~
sdegutis
In terms of IPC, I like Singularity's model a lot. You don't do any (slow)
hardware-enforced boxing, only software-enforced isolation, so you're already
in the same address space, which means sending data between apps is a lot
faster.
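
A toy contrast of the two models (my sketch, not Singularity's actual channel API; a thread-safe queue stands in for a software-isolated channel):

```python
import os
import pickle
import queue

big = {"payload": list(range(1000))}

# Software isolation in one address space (the Singularity idea): the
# "message" is just a reference -- nothing is copied or serialized.
chan = queue.Queue()
chan.put(big)
received = chan.get()
assert received is big      # same object, zero bytes moved

# Hardware isolation (classic Unix processes): the object must be flattened
# to bytes, copied through the kernel, and rebuilt on the far side.
r, w = os.pipe()
blob = pickle.dumps(big)    # ~a few KiB, fits in the pipe buffer
os.write(w, blob)                           # copy into the kernel buffer
copy = pickle.loads(os.read(r, len(blob)))  # copy back out, then rebuild
os.close(r)
os.close(w)
assert copy == big and copy is not big      # equal value, distinct object
```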

~~~
sdegutis
Actually, come to think of it, I like everything in the Singularity paper. We
should do way more software-based enforcement and use compile-time and
install-time verification to do things like avoid DLL hell.

------
yuhong
Ah, I ranted before about how Copland was killed in favor of Unix-based NeXT
technologies. For example, Classic Mac OS used Pascal strings, which were more
secure than C strings. What is funny is that Copland tried to move the Mac
Toolbox into kernel mode even though Copland never intended to preemptively
multitask GUI apps. I read they were once able to run the "Blue Box" on
Copland's NuKernel, which would have been a good starting point.
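
The string-safety difference is easy to sketch (in Python for brevity; function names are mine): a C string's bounds depend on finding a NUL terminator, while a Pascal Str255-style string carries its length up front.

```python
def read_c_string(buf: bytes) -> bytes:
    # C style: scan for the NUL terminator. If it's missing, real C code
    # runs off the end of the buffer -- the classic overread/overflow family.
    end = buf.find(b"\x00")
    if end == -1:
        raise ValueError("unterminated C string: would read past the buffer")
    return buf[:end]

def read_pascal_string(buf: bytes) -> bytes:
    # Pascal style: the first byte is the length, so the bounds are explicit
    # and no terminator scan is needed (max length 255 for a Str255).
    length = buf[0]
    if length > len(buf) - 1:
        raise ValueError("declared length exceeds buffer")
    return buf[1:1 + length]

print(read_pascal_string(b"\x05hello junk after"))  # b'hello'
```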

------
staunch
Reminds me of this old HN post:

    
    
      Ask HN: Will we be using Lisp derived languages for the next 50 years?
      64 points by jm 50 years ago | flag | 46 comments

------
memracom
Yes, except that I think production services are more likely to migrate to a
thin containerized layer directly over a hypervisor, something like what
Docker/LXC provide on UNIX. We already have JVM, Erlang VM, and even LISP
implementations that bypass the OS, and I think there will be even more of
that type of thing.

UNIX is still a very useful OS for general use and for developers, so I don't
see it going away for a century or two.

~~~
frou_dh
2.. hundred.. years. You're bananas to think computing will even be
recognisable as we know it by then.

~~~
bdunbar
My guess is that, while that will be true, if you dig down deep enough through
whatever interface we're using in 2214 you'll find csh.

And support for vt100.

------
LeoSolaris
Likely some evolution of them, unless something profoundly better changes the
game.

