

Gruber: The OS Opportunity - concretecode
http://daringfireball.net/2009/11/the_os_opportunity

======
GeneralMaximus
I was born in 1990. The first computer that I could call my own was a custom
built PC with Windows XP. This was in 2003.

I cannot express in words how sad that makes me. I've heard older programmers
talk fondly about the BeOS, the classic Mac OS, Amiga Workbench (IIRC),
NeXTSTEP, and even Win3.1. _Even_ older programmers talk about computers built
by Tandy and the BBC, and the legendary machines built by Atari and Commodore.
I cannot help but feel that I've missed something. I salute Gruber for making
the point I have been trying to make for a long time now.

I've grown up in a world where we have only two major families of operating
systems[1]: Windows and UNIX. That makes me a sad panda :(

Even though I hate the flimsy machines Dell makes, I would love to try out the
DellOS, if they ever decide to build one.

 _[1] I'm only talking about the desktop space here._

~~~
aristus
For what it's worth, people who are children now will have missed the wild
days of the Web when everyone had their own API, data, and UI stacks and no
two apps looked or behaved the same.

I'm not sure I agree with the OP. When there is real competitive advantage in
some newish layer of the tech stack, you will see incredible diversity. When
the advantage moves on to other parts of the stack (e.g. applications, the
web, etc.), the older layers standardize and homogenize.

Today, in 2009, neither I nor my mom give two shits what OS we're using 70% of
the time because our work is done inside a browser. I run WinXP in a virtual
machine on my Mac. At least once a day I catch myself using the browser inside
the Windows instance without realizing it.

~~~
mbreese
And I think this is why hardware vendors could come out with their own spin on
an operating system.

We didn't use to have a standard for sharing information. If you wanted to
share a file, you had to have the _exact same_ hardware _and_ software. That
is what led us to the local minimum that is PCs running DOS/Windows.

Now, however, we have a well-accepted standard. If you have a good TCP/IP
stack and can run a browser, you're golden. With just these two things you can
be productive on any operating system.

So now, the fact that the OS is the last thing you consider might just free
manufacturers to come up with their own designs.

That does however leave one crucial thing out, which might just kill Gruber's
argument: games.

------
tumult
Read this earlier. Dead on. Any company which cannot build decent software
does not have a bright future. Not just in computers -- probably any field.
Software is becoming a more pervasive part of life every year. The many who
cannot adapt will be easily replaced by the few who can. And with software, it
really is easy for one person or company to replace thousands in a single
stroke.

For now, though, this effect is probably most visible in the tech field.
There's a large and growing disparity in profits between companies which can
produce good software, such as Apple, Google and Microsoft, and those which
cannot. I suspect this disparity will more and more reflect how the economy
functions for individuals. It wouldn't surprise me if the wealth gap further
widens every year. I'm not sure if our politicians, society or culture are
prepared to deal with it.

~~~
Raphael_Amiard
I don't see how that is the case, since the main software supplier in the
world today, Microsoft, is totally dependent on hardware manufacturers.

I don't see Intel (OK, easy example) or any other hardware manufacturer dying
anytime soon. The only ones who could die are those whose job is only
assembling hardware components together. And even they have a role in the
low-end market that will be very hard to match for a company aiming to handle
the whole chain of assembly of a computer, software included.

It's easy to agree that software has a much more profound impact on people
than hardware. It's why there is a UI in GUI. But to me this point is as moot
as saying an engine manufacturer is useless because it doesn't make steering
wheels.

It seems to be a fact (I don't have enough information to be sure) that the
software economy is wealthier than the hardware economy. But I'd like facts
about that first, because I see a whole lot of software companies dying too.
And even if it is true, it doesn't mean we NEED hardware manufacturers to come
and release a ton of more-or-less compatible software and try to impose it on
users the exact same way we are stuck with Windows at the moment.

Software building is a field of expertise in itself. A company can convert
itself, or invest in a new field. But it certainly doesn't mean they should,
or that they are going to die if they don't.

My summary: basic and non-credible argumentation from the original article. It
takes a visionary tone, but in my opinion this is everything but where we are
headed at the moment.

~~~
tumult
Intel makes some very good software -- for example, their compilers.

Car companies that cannot produce good software are indeed dying.
Next-generation driving systems rely heavily on software components (traction
control, user interface feedback, gas management and hybrid drive, Tesla's
power management, etc.), and those which cannot make the leap to produce
software in tandem with their car hardware are facing irrelevance.

------
ajscherer
I don't see any reason to think a company like Dell or Sony would be well
positioned to create an operating system. What software have they created that
suggests they could do it? If Windows is so unsatisfactory doesn't that
suggest that merely having resources isn't nearly enough? If Dell decided to
create an operating system they would have little advantage over anyone else,
and would have the disadvantage of being a huge company attempting an equally
huge software project from scratch. They might as well just light piles of
cash on fire. The sensible way to do it would be to buy a company that is
building an OS.

I think Gruber is underestimating the fact that Apple has been developing not
just the Macintosh OS, but the ecosystem and community for like a quarter of a
century. I don't know that there is a shortcut to the latter two. When I
switched back to Windows from Ubuntu it wasn't because of any failing of the
operating system. It was because I was unable to find suitable replacements
for all the Windows applications (and drivers) I used.

Web applications have not yet usurped desktop apps across the board, and when
they eventually do, won't it be a worse environment for a new proprietary OS?
At that point what would a proprietary OS offer that a free OS on inexpensive
hardware couldn't?

------
jasonlbaptiste
I was thinking about this the other day. There's a lot of opportunity in terms
of other Operating Systems/Interfaces. Computers used to JUST be these beige
boxes that sat under our desks. Actually, at one point they were huge
mainframes. Now, computers are everywhere, and by computers I mean
full-fledged, system-specific GUIs tied to some piece of hardware. They're in our
pocket, in our cars, on our desks, in our laps, on our tvs, in the cloud as
servers, and probably other places I'm not naming. This is where I see other
Operating Systems taking away Microsoft's dinner. Microsoft got us to the
first billion, but I'm pretty sure someone else will get us to the first
trillion. Now, that's pretty exciting.

As far as the desktop goes? Apple will have its market share, which is the
high-end computing market (they have ~90% of that 10%), and Windows will have
the other 90%. You can make Ubuntu function/look exactly like
Windows/Mac. The problem comes down to the OEMs. Will they support it? Will
they make the strong push in marketing to get people to try it out? Your
grandmother is not going to do sudo and apt-get commands. If an OEM made
beautiful machines with tight software/hardware integration, running a
customized version of Ubuntu, marketed the hell out of them, and offered the
right amount of training like Apple does, it could possibly get off the
ground. Even then, Linux is a scary place for the average person and the apps
just aren't there.

Honestly, I'm more bullish on other Operating Systems taking us from billions
to trillions on devices that aren't beige boxes that sit under our desk.

------
Raphael_Amiard
I don't agree at all with Gruber on this one.

It seems, in fact, that software manufacturers, and for good reasons, are
taking more and more control over how hardware is designed. And that is a good
thing, because hardware ought to be designed for the software, not the other
way around.

But it doesn't mean that what we need is more monolithic entities like Apple.
The fact that hardware and software are at least somewhat decoupled is a VERY
good thing in my opinion.

What Google is doing with Android is interesting in this regard, and I wonder
if their strategy with Chrome OS is going to be similar: they release an OS
(open source, and this is quite important in the end), follow the usual
strategy of partnerships with hardware manufacturers, and in the end also
decide to produce their own smartphone. It's a harder path because your
partners are your opponents at the same time (something Apple, for example,
doesn't have to deal with concerning OS X). But for the user it's clearly the
best path: you have the freedom of using the software as you want (unlike
Apple), and you can also buy a proprietary solution that supposedly offers a
better synergy between software and hardware.

~~~
pohl
(From your other post below)

 _I don't see Intel, (ok easy example) or any other hardware manufacturer
dying anytime soon._

I don't think that, when the author said PC makers are "busy dying" that he
meant that they're necessarily on a direct trajectory towards bankruptcy. I
took it in the softer, more poetic sense that they're listless, apathetic,
lacking in the proud vigor and vibrancy of their youth. This was captured best
when he said this:

 _People today still love HP calculators made 30 or even 40 years ago. Has HP
made anything this decade that anyone will remember fondly even five years
from now?_

I agree with you that hardware and software decoupling is a very good thing. I
also do not see this as being mutually exclusive to what Gruber is suggesting.
If Dell wanted to invest in being a player in user experience, there's no
reason that they would have to tie their system software to their hardware.

The biggest flaw that I see in Gruber's article is that he might be
overestimating consumers themselves. Sure, the computer industry is different
from the days when you tried to sneakernet a 1-2-3 file to a friend's machine
only to find that he was running Visicalc on CP/M. But will consumers accept
this change without knowing what an open standard is, or a document format, or
a portable runtime, or cross-compiling, or virtual machines?

I'd guess not, given that I've talked to seasoned geeks who haven't yet fully
internalized that things are different today. Consumers, recoiling from the
fear of "incompatibility", all happily ran to one side of the boat in the 90s
and nearly capsized it. Teaching them that diversity doesn't have to mean
incompatibility seems like a daunting task.

------
Hoff
Building an OS is feasible for a small team or even a single programmer.

Providing enough of a subset of the expected APIs, the device support, the
databases, the web browser, the compilers, the file systems, and the rest of
the stack that customers and third-party partners expect? That's a bigger
project and a bigger budget.

Simply being better isn't enough.

Being faster isn't enough.

Being cheaper isn't enough.

Even running Microsoft Windows on your (non-x86) hardware isn't enough.

You need some combination of "betters" and of application and document
compatibility, and you need to get to critical mass of applications and tools
and device and hardware support, or you need to get to "massively better" in
one or more dimensions to get enough early adopters on-board, or you need
enough money to buy the tools and ports you need.

And then you have to get to big volume and enough of a profit margin to get
your prices down to where you attract application developers and resellers,
or pay for the developers.

As for competition, you're working against Microsoft Windows on x86 and Apple,
and FOSS limits your margins. Or against embedded vendors that excel in one or
more dimensions.

There are little-known and niche and embedded operating systems and vendors
all over the place. Wind River (now at Intel) is one. HP has at least three
operating systems (NSK, HP-UX, VMS) and has retired others including MPE and
Tru64 Unix and Domain/OS, and not counting embedded software platforms such as
EFI and all those HP printers. IBM has its own OS offerings.

Building an OS is comparatively easy. Anyone with enough skills or enough cash
can certainly build or buy one, and various folks can buy enough partners. But
to create a self-sustaining environment within your target market and to avoid
creating a massive write-off, you need to build an ecosystem around your
operating system. That's a much tougher and much bigger effort.

~~~
rsheridan6
Note that both litl and Chrome OS use Linux under the hood, so you don't have
to do all of that to make a "new" OS, for the values of "new" used in this
article.

------
glymor
I assume he means creating their own user experience, as all the examples he
gives of companies creating their own OS are actually using Linux.

~~~
tumult
An OS is more than a kernel and some drivers.

Unless you would consider Android, Palm's WebOS, Nokia's various OSs, Arch and
Ubuntu to all be equivalent.

~~~
glymor
All the "OS"s in the article run a GNU userland, the X Window System, and
WebKit- or Gecko-based browsers. So yes, they are the same operating system;
they just have different applications in different configurations.

Android is its own operating system. It has its own user space and GUI
technology. But the article didn't mention it.

------
TallGuyShort
Computer makers that want to succeed? I wouldn't buy a computer from someone
that locks me into their product line for ALL my needs with that computer.

------
leej
Classic Apple fanboyism at best, with missing knowledge here and there. If the
OS mattered that much, first, Google couldn't be such a success, and second,
Apple couldn't enjoy its current success, because the first two (even three)
Mac OS X releases were horribly slow and had very little app support!

~~~
cubicle67
Classic anti-Apple fanboyism at best, with missing knowledge here and there.
The OS mattering != everything else not mattering.

~~~
leej
Maybe the downvoters didn't get my point and/or Gruber's point. I'm saying
that the OS no longer matters enough to be worth developing by other players
such as HP or Dell. Asking again: what is wrong with this?

~~~
pohl
The original article makes an excellent point, and expresses it with a great
deal of coherency: the reluctance of computer makers to engage in operating
system development is based on an outmoded fear that was established in the
days when interoperability was hard because we hadn't yet figured out things
like open document formats, open network protocols, portable APIs, portable
language runtimes, and virtual machines. A lot has changed since then, so why
are they still so reluctant?

This is an excellent question.

Your response seems to have a very tenuous grasp on coherency. I'm trying to
pull out what you're trying to say. It appears to hinge on the verb "to
matter", and I can find only one introductory paragraph where the author uses
this verb; he doesn't say anything objectionable the two times that he uses
it:

He says that hardware and software both matter. I find that hard to refute.
Then he goes on to say that if you asked him to say which matters more, he'd
say software. I'm not surprised that he would say this, since he tends to be a
"user experience" guy, and I'm willing to grant him this premise for the rest
of what he wrote.

Honestly I can't tell what counter-argument you're trying to make. Yes, Google
has been successful. It certainly wasn't because of their hardware.
Regardless, you're only responding to the setup of the thesis, not the thesis
itself.

~~~
balakk
_A lot has changed since then, so why are they still so reluctant?_

It's this premise that I find inaccurate. Sure there's open everything, but
writing an operating system from scratch to support all this is still hard.
Hell, give me one commercial operating system written from scratch in the last
decade.

The point is, why do it? Maintaining an operating system is big money. And you
can't stop there - you've got to have a full-stack offering - business apps,
fun apps, drivers, the whole set. Add to that support, interoperability with
the world, backwards compatibility etc. It's a long-term commitment; you can't
back out of it that easily.

The reward-to-risk ratio is pretty small, unless you have some earth-shaking
innovation up your sleeve, and/or it significantly reinforces/supports your
business model.

~~~
cubicle67
Why from scratch? If you remove that seemingly pointless requirement, then
there are quite a few examples, starting with the Litl OS, which is based on
Ubuntu.

Heck, there's plenty of good solid starting points. What about BSD? Linux?
Android?

~~~
balakk
Still doesn't matter. Unless you create an ecosystem around your software, I
think it still isn't relevant in the larger scheme of things. For example, can
Maemo, Android, WebOS, etc. all share applications, APIs, drivers, and other
infrastructure seamlessly?

The point is you need to end up creating an ecosystem around your offering,
which is non-trivial. Even if you do, you may still end up as a niche player.
Litl is nice, but how well do you think they'll do against traditional
netbooks?

~~~
steveklabnik
Depending on how you architect your OS, it doesn't have to be impossible.

I have a little exokernel project I devote my Saturdays to, and we'll actually
be able to offer a POSIX-compliant libOS and run anything that Linux does.

Or, there's always the hypervisor route, on desktops anyway.

