

Steve Jobs: The Next Insanely Great Thing  - sayemm
http://www.wired.com/wired/archive/4.02/jobs_pr.html

======
taphangum
This was particularly interesting:

"Design is a funny word. Some people think design means how it looks. But of
course, if you dig deeper, it's really how it works. The design of the Mac
wasn't what it looked like, although that was part of it. Primarily, it was
how it worked. To design something really well, you have to get it. You have
to really grok what it's all about. It takes a passionate commitment to really
thoroughly understand something, chew it up, not just quickly swallow it. Most
people don't take the time to do that.

Creativity is just connecting things. When you ask creative people how they
did something, they feel a little guilty because they didn't really do it,
they just saw something. It seemed obvious to them after a while. That's
because they were able to connect experiences they've had and synthesize new
things. And the reason they were able to do that was that they've had more
experiences or they have thought more about their experiences than other
people.

Unfortunately, that's too rare a commodity. A lot of people in our industry
haven't had very diverse experiences. So they don't have enough dots to
connect, and they end up with very linear solutions without a broad
perspective on the problem. The broader one's understanding of the human
experience, the better design we will have."

~~~
sudont
Those diverse experiences are exactly the purpose of a “liberal arts
education,” something that’s derided here as bad job training. A degree in
Shakespeare won’t help with cache invalidation, but it’ll sure spice up the
naming.

~~~
tel
Of the underlying faults with that point of view on liberal arts education, I
think one stands above all others. As the holder of an engineering bachelor's,
I'll anecdotally attest to the widely known problem that

 _Undergraduate engineering degrees actively discourage building communication
skills._

This, above all else, is the heart of that linear-thinking problem. Many
graduating engineers in the US are communication-challenged. They can't form
principled arguments, write evocatively, or speak convincingly. They lack the
skills to critically analyze other people's communication, have no eye for
subtlety, and flat out lack an appreciation of good writing.

It's often explained away as if there is this single choice people make when
they're young that they'll either be good at math/science or
literature/history. Once you've made that choice, you're stuck socially,
psychologically, practically. _But that's okay,_ goes the argument,
_specialization is necessary and everybody is one-sided, really_.

So despite wide knowledge of this problem, schools combat it with deliberately
watered-down English classes designed not to damage engineering GPAs. My alma
mater required two English classes, both of which I got near-perfect grades in
by skipping class and writing half-minded, fantastical leaps of literary
analysis on books I'd skimmed. I got perfect scores and remarks like "This is
the best paper I've ever read". Literally. It also required a single semester
"technical communication" course where we were introduced to business letter
etiquette and the definition of _genre_.

As a firm believer that writing something down is the fastest, most powerful
method to clarifying your thoughts, I find it despicable that engineering
colleges don't provide these challenges. I find it corrosive that the dominant
societal perspective is that it's ok if you can't write because _you're an
engineer_. I think this directly leads to the sort of creativity rot I saw,
all the time, in classmates.

My experiences may be anecdotal and localized to my college, which is one of
the top engineering colleges in the US. I hope they are, and that this disgust
is misdirected, but I really think the problem is larger than that.

\---

tl;dr? I'm completely convinced that a dreadful number of engineers are
missing something central to anyone who wants to deal in complex ideas
entirely because it's considered a liberal arts speciality. They're weak at
techniques to critically and clearly articulate or analyze ideas aloud or in
writing, the art and science of human communication.

(I'll also note that so far, a huge part of the advice on how to survive
graduate school is learning to write. People tell me this with a look in their
eyes that says they've seen so many people fail miserably because of this
sudden, new expectation. I know my writing needs a lot of improvement, but I
also know that many of the people in my graduating class would require a
near-complete literary reversal before they could actually publish.)

</rant>

( _Edit: made it more clear that I'm not arguing with what sudont said.
Written communication is hard._ )

~~~
sudont
Um: _“There are only two hard things in Computer Science: cache invalidation
and naming things”. Tim Bray quoting Phil Karlton_

If you haven’t heard that quote, hopefully it’ll conceptually expand to where
my post completely agrees with you. If it doesn’t, I completely agree with
you.

My arts degree helps me think laterally in the extreme.

~~~
tel
Ah, cool. Hadn't heard the quote actually, hah, though I was pretty sure you
weren't _completely hardline serious_ that you'd summed up the benefits of
liberal arts education.

You just tipped off a complaint I had (and have) building up inside. We're in
the same boat, argumentatively. I'm just speaking (yelling, crying) from the
other side.

------
owyn
Pretty cool. Originally published in Feb 1996 for those who are curious (like
I was).

~~~
Clarity1992
Thanks, I realised it was old, but didn't bother figuring out the date.

Personally I would like it if people put the publication year in the title
when posting articles such as this.

------
Fjslfj
Jobs was entirely wrong about the web, and continues to be. WebObjects' big-
company focus ($50,000 for a license) is precisely why it is now dead.
Meanwhile Personal HomePage script (now known as PHP) powers the world's
largest website -- the little guy's tools won.

~~~
bonaldi
What WebObjects would have allowed us to do today if it hadn't been left to
bitrot was write an app's business logic once, customise the interfaces and
have a webapp, Mac app and iPhone app all from the same codebase. They would
connect to the same database, and sync seamlessly.

You can feel the lack of it every time an iOS developer complains there isn't
an out-of-the-box syncing solution. They're all using Core Data (which is a
simplified, cut-down version of WebObjects' EOF); it should be a simple matter
just to sync to a Mac app, let alone to a web site.

Instead, devs have to learn three different stacks and join them all up
manually. Frustrating to watch when you know we've had something better since
1995, and it's sitting neglected.

If you see WebObjects as competing with PHP then yes, it absolutely lost. If
you see it as creating the cloud 15 years before its time, then it's actually
the only contender that comes anywhere close.

~~~
encoderer
"What WebObjects would have allowed us to do today if it hadn't been left to
bitrot was write an app's business logic once, customise the interfaces and
have a webapp, Mac app and iPhone app all from the same codebase. They would
connect to the same database, and sync seamlessly."

I think this is pie-in-the-sky just the same way his prediction of being able
to write an app in "20% of the time" was.

"Write an apps business logic once." I've been developing software for a while
now, 10 years, and I've never seen true company-wide (let alone world-wide)
code reuse on a massive scale.

Yes, there is a lot of code reuse. And yes, Django and Zend Framework and RoR
are all examples of code reuse.

But I've worked with so many entrepreneurs and CEOs who have this same wish:
Write it once, use it everywhere.

It just doesn't work that way in practice.

~~~
bonaldi
Ever actually used WebObjects? Especially in the Objective-C days, this is
exactly what it did. (What became) Cocoa has pervasive MVC all the way
through. Turning a WO app into an NS app was a matter of working on the V, as
it should be.

~~~
encoderer
Just because that's what it CAN do doesn't mean it will ever be used that way
in practice.

There are many existing technologies that would let you "write it once and use
it over and over on different platforms," which is the generic version of what
you described, using the iPhone, etc., as specific examples.

But in practice, writing code general enough to do that takes much longer, is
much more difficult, and requires more elite developers.

It might work fine for the companies that are Meccas for grade-A development
talent. But for the bread-and-butter companies where 90% of the software in
this world is written? Please.

~~~
bonaldi
Let's separate out Jobs's utopian spin here and focus on what I'm actually
talking about, which is not "global-level code reuse" or whatever generic
strawman you're aiming for.

But let's say I'm writing a web app, perhaps in RoR. I build an entire model
for the backend and set up controllers to drive it. Have my HTML views and I'm
good to go. Then I want an iPhone client. I have to reimplement that _exact
same model_ in Objective-C, and a good portion of the controllers too.

With WO, all that wasted time vanished. The same models and controllers worked
for both, right down to the NSString level and below.

Does that let you write once and expand to everything, everywhere? No. But
does it take grade-A talent to work across its supported platforms? No.

~~~
encoderer
The exact same model. This is textbook. In practice? There are going to be
differences. The exact same database does not mean the exact same model.

Controllers? Night and day. A complete rewrite would be necessary. The app
will function entirely differently.

Generic strawman? Chill, this isn't slashdot. But to say that the only thing
stopping this awesome revolution in software development is that Web Objects
wasn't successful seems to me to ignore the grim realities of most software
development projects: There isn't the ability, talent, budget or planning to
produce truly reusable code.

~~~
bonaldi
In practice? I've done this. The controllers change only as much as needed to
accommodate the new views. If your model has differences you've coupled it too
tightly.

"only thing stopping this awesome revolution in software development"

I thought you said this wasn't slashdot? Who is talking about an awesome
revolution? I am talking about a smarter way of building software across
different platforms, a way that worked then and would work now to solve real
problems that we have.

The fact that the solution is in a space that has long been occupied by the
type of wishful thinking that has clearly hurt you in the past doesn't mean
you get to write the entire concept off.

------
nikster
I love Jobs' endorsement of Miele washing machines. Where I grew up in Austria,
Miele was all anyone had. They were ridiculously expensive. Their slogan was
"reliability for many years," which was an understatement, as they actually ran
for 20 or 30 years, no problem.

Also, his discussing the new washing machine with his family for 2 weeks!
Fascinating. I don't think many billionaires would do that.

~~~
WalterBright
My Miele never worked right, the plumber couldn't fix it. Finally I had it
ripped out and replaced with a Maytag.

~~~
LaPingvino
Just called the Miele service myself: they offer a 10-year guarantee plan.

------
cubicle67
I was most taken by his ideas on education. The voucher system will never
happen, of course, which is a shame. I also love the idea of small schools
springing up all over the place (I'm not a huge fan of the current education
system, and I'd love to see it disrupted).

As for trying to teach computing in schools, well, that's long been rant
material of mine (ask anyone who's been unfortunate enough to mention to me
how pleased they are that their school's spending money on a room full of new
computers), so don't get me started...

~~~
spydum
Interestingly enough, many states attempted voucher programs, though not to the
extent Jobs suggested. They did spur school upstarts. I know Florida had a
Pre-K voucher program, and as a result, Pre-K schools popped up all around.
This was sometime in 2006-2008. I'm not sure whether they discontinued the
program, but education levels were on par with public schools, at a per-child
cost of about half.

------
0x0
Is this quote a case of iPad foreshadowing? :)

"On the client side, there's the browser software. In the sense of making
money, it doesn't look like anybody is going to win on the browser software
side, because it's going to be free. And then there's the typical hardware.
It's possible that some people could come out with some very interesting Web
terminals and sell some hardware."

------
adammichaelc
Though Web Objects didn't take off, I think the idea behind them did. We just
call it SaaS.

In SaaS products, instead of building the same core ingredient over and over
again for multiple companies, it is built once and then made available via the
web to anybody who wants to use it.

That's what ViaWeb was, it's what WePay is, it's what Salesforce is -- it's
what the web has become. You need software to do something, you Google it, you
find a web-app and you start using it. Done.

Web Objects, Web Apps, tomato, tomahto.

~~~
nikster
WebObjects was just a really well designed app server. It's obvious things
would work that way - even though WO didn't, in the end, "win". Too lazy to
look it up, but WebObjects might have been designed around the same time as
Java.

A predecessor to Ruby on Rails at a time when websites were making extensive
use of the <blink> tag.

~~~
bonaldi
It also did things that still haven't really been replicated for web developers
in a joined-up way; things like proper developer tools (including Interface
Builder for the web), a seriously nifty ORM, and the ability to write the
back-end once and produce both a web app and a desktop app.

(I know Interface Builder doesn't appeal to the hardcore set, but at least it
is an attempt at a different way of working. Our never-ending focus on
creating interfaces programmatically makes me wonder what would have happened
to desktop publishing if we'd said "what? We made you a postscript mode for
emacs -- what more do you need?" and left it at that.)

~~~
sudont
There would be front-end developers for print design…

HTML really _is_ like postscript mode for emacs, as far as a designer is
concerned.

~~~
bonaldi
Yes, I agree. That's why I think we should do better. Dreamweaver really
doesn't come close to what we should be doing in terms of visual and coding
tools to let designers and developers work together.

~~~
sudont
I’m not smart enough to think of a solution. Interface Builder in Xcode works
because everything’s based on Cartesian coordinates, but other than pages
starting from the top-left, HTML is really based around flowing and cascading.

Maybe it’s possible, CSSEdit works great, but requires the HTML to be set up.
Espresso’s node layout works well to show hierarchy, but still requires manual
HTML editing. One solution would be similar to creating nested NSViews, but
turning off {position:absolute} by default. This would force the designer to
think in a way that’s similar to how web pages flow in the real world, instead
of throwing absolutely positioned elements into a static frame.

I’d have to say some sort of blob state file would be the best bet. The
designer works in a vector environment that emphasizes mutable viewports,
elements and element amounts. Basically Dashcode with vector built in.

------
hedgehog
I think these old articles provide valuable perspective. There's an
interesting book from 1998 analyzing Apple's fall up to that point:

<http://amazon.com/dp/0887309658>

I'm partway through and so far it's good. The detailed analysis untainted by
the company's current success is quite interesting. At that point Jobs had
come back to be involved in the company but not yet officially taken over as
CEO (publicly he said he wasn't up for it).

------
cmurphycode
Fascinating. It's particularly striking that Jobs, since moving back to Apple,
has done nothing notable with the Web, but so much with the desktop and
hardware. Sure, iTunes delivers music over the Internet, but it is
not a web application. On the other hand, he did create the iPad: "And then
there's the typical hardware. It's possible that some people could come out
with some very interesting Web terminals and sell some hardware."

------
zalew
Off topic: please stop this linking-to-the-print-version nonsense.

 _Discussion about it here:<http://news.ycombinator.com/item?id=1966724> _

Original: <http://www.wired.com/wired/archive/4.02/jobs.html> Issue 4.02 | Feb
1996

~~~
jmtame
The original story is split across 8 pages and bombarded with ads; I guess I
can see why the print version was linked instead.

~~~
wyclif
I agree. There is no reason to spread a simple interview over 8 pages so that
Wired can get additional ad impressions, and I upvoted the parent because I
hate that crap.

~~~
tzs
The regular version is much more readable, since it sets the column size
reasonably. The print version fits to the window size.

Furthermore, for those who prefer the print version, it is a single click away
from the regular version. On the other hand, there is no link on the print
version to go to the regular version. This alone is sufficient to make the
regular version the correct version to link to.

~~~
_delirium
> The regular version is much more readable, since it sets the column size
> reasonably.

Perhaps if you're _really_ good at ignoring peripheral distractions, but
personally I'd rather have the column be too wide than have it be narrow with
_a flashing GIF_ in the margin!

------
stcredzero
Prediction: Apple is going to move away from the notion of Hard Disks for
their consumer-oriented mobile devices. The Flash chips in SSDs are already
somewhat like dynamic RAM, though not quite as fast and with some extra
requirements and limitations. One could easily give the illusion of a laptop
with 256 Gigabytes of _orthogonally persistent_ RAM by using 8 Gigs or so of
dynamic RAM as a cache into Flash RAM. No need for any notion of a separate
persistent store like a Hard Drive. No need to even boot up! Laptops would
only have "hibernate" and "sleep". (You'd need ECC, wear leveling, and ways of
transparently "retiring" sections of Flash RAM that go bad.)

Go one step further, and use the cloud as a backing store with an always-on
mobile broadband connection. The 256 Gigabytes of Flash RAM would act as a
local cache to cloud storage.
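(A toy sketch of the caching idea in Swift; this is not Apple's design, the
names are made up, and the eviction policy is arbitrary rather than LRU or
anything wear-aware. It just shows a small, fast tier acting as a write-back
cache over a larger, slower persistent tier, with the caller seeing a single
flat address space.)

    // A toy two-level store: a small, fast tier (standing in for DRAM) acting as a
    // write-back cache over a larger, slower tier (standing in for Flash, or the cloud).
    final class TieredStore {
        private var cache: [Int: UInt8] = [:]     // fast tier, limited capacity
        private var dirty: Set<Int> = []           // addresses that must be flushed on eviction
        private var backing: [Int: UInt8] = [:]    // slow, persistent tier
        private let cacheCapacity: Int

        init(cacheCapacity: Int) {
            self.cacheCapacity = cacheCapacity
        }

        func read(_ address: Int) -> UInt8 {
            if let value = cache[address] { return value }   // cache hit
            let value = backing[address] ?? 0                 // miss: fetch from the slow tier
            insert(address, value, markDirty: false)
            return value
        }

        func write(_ address: Int, _ value: UInt8) {
            insert(address, value, markDirty: true)            // write-back, not write-through
        }

        private func insert(_ address: Int, _ value: UInt8, markDirty: Bool) {
            if cache[address] == nil, cache.count >= cacheCapacity {
                evictOne()
            }
            cache[address] = value
            if markDirty { dirty.insert(address) }
        }

        private func evictOne() {
            guard let victim = cache.keys.first else { return }
            if dirty.contains(victim) {
                backing[victim] = cache[victim]                 // flush dirty data before dropping it
                dirty.remove(victim)
            }
            cache.removeValue(forKey: victim)
        }
    }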

~~~
guywithabike
It's not a "prediction" if they've already started doing it, dude(tte).

Prediction: Apple will do away with optical drives.

Prediction: Apple will bring iOS features into MacOS X.

Prediction: Apple will make their laptops smaller and slimmer.

Prediction: Apple will do away with floppy drives.

~~~
stcredzero
What do you mean "already started doing it?" Please point out the OS X or iOS
device whose kernel has no notion of a "Hard Disk" or something like it. With
what I'm talking about, the kernel would see nothing but RAM.

As far as I can see, they've just started doing away with the _case_ of the
hard drive mechanism in their laptops. Even their iOS devices still have a
notion of a separate persistent store with high-latency characteristics,
completely separate from non-persistent RAM.

What I'm talking about doing is making dynamic RAM, Flash RAM, and the cloud
all just a part of the memory hierarchy with _orthogonal persistence._

This is going to entail a lot of engineering which isn't yet in evidence in
any of their current products. Such devices would never have to boot-up! I'm
not sure such devices would be capable of being booted up. They'd also need
more sophisticated error correction than current devices have. (Current
devices can just correct the rare memory corruption event by being rebooted.)

~~~
sil3ntmac
It's not an OSX or iOS device, but Apple recently removed the hard drive from
the Apple TV.

~~~
stcredzero
AFAIK, the Apple TV's kernel still has a notion of a persistent store from
which it boots up.

------
maguay
Jobs: "The desktop computer industry is dead." And here we are, nearly 15
years later, with this statement seeming truer than ever, in part thanks to
Jobs' iOS devices.

~~~
maguay
In case that was misread, I meant it as both a compliment to Jobs and as a
reminder of how incredibly hard it is to know when something's truly dead or
irrelevant.

~~~
crander
I view it as a compliment to Jobs as a visionary.

From a business perspective, the desktop industry had yet to peak, and it then
shifted to a luggable version of the same ecosystem with laptops.

------
arturadib
He was right about the "idea" of objects on the web, that is, stuff that can
be simply plugged and reused here and there. We call those APIs today.

------
grammr
"Jobs's fundamental insight - that personal computers were destined to be
connected to each other and live on networks - was just as accurate as his
earlier prophecy that computers were destined to become personal appliances."

It's interesting that common sense becomes "prophecy" when you're Steve Jobs.

~~~
blader
It's more that prophecies tend to become common sense 25 years in the future
(Jobs first spoke of this in the early 80s).

~~~
nikster
I'd argue that saying something today that will be common sense 25 years in
the future is the very definition of prophetic.

None of this was common sense back then, that's for sure. I got my first email
account in 1995.

~~~
_delirium
I'm not sure it was so common as to be common sense, but it was a reasonably
common prediction among tech people. The slogan "the network is the computer"
was coined at Sun in 1984!

------
ivanzhao
How funny that the Web and Steve's "objects" have eventually become one thing
(mashups/infrastructure/SaaS/APIs).

------
rasur
From the mouth of Steve: "And once you're in this Web-augmented space, you're
going to see that democratization takes place."

Riiight, Steve. "App-Store" as Democracy.

~~~
jason_tko
It's still a 'democracy' in the way you're describing it. If you don't like
it, vote with your wallet.

~~~
rasur
I do!

------
jfb
Web Objects: yesterday's technology, tomorrow!

~~~
cubicle67
more accurately: yesterday's technology, the day before

~~~
jfb
I don't dispute that it was years ahead of its time. But it has stalled (five
years ago?) and promised improvements are largely vaporware.

