Steve Jobs: The Next Insanely Great Thing (wired.com)
158 points by sayemm on Dec 4, 2010 | 102 comments



This was particularly interesting:

"Design is a funny word. Some people think design means how it looks. But of course, if you dig deeper, it's really how it works. The design of the Mac wasn't what it looked like, although that was part of it. Primarily, it was how it worked. To design something really well, you have to get it. You have to really grok what it's all about. It takes a passionate commitment to really thoroughly understand something, chew it up, not just quickly swallow it. Most people don't take the time to do that.

Creativity is just connecting things. When you ask creative people how they did something, they feel a little guilty because they didn't really do it, they just saw something. It seemed obvious to them after a while. That's because they were able to connect experiences they've had and synthesize new things. And the reason they were able to do that was that they've had more experiences or they have thought more about their experiences than other people.

Unfortunately, that's too rare a commodity. A lot of people in our industry haven't had very diverse experiences. So they don't have enough dots to connect, and they end up with very linear solutions without a broad perspective on the problem. The broader one's understanding of the human experience, the better design we will have."


Those diverse experiences are exactly the purpose of a “liberal arts education,” something that’s derided here as bad job training. A degree in Shakespeare won’t help with cache invalidation, but it’ll sure spice up the naming.


I think, of the underlying faults with that point of view on liberal arts education, one stands above all others. As the holder of an engineering bachelor's, I'll anecdotally attest to the widely known problem that

Undergraduate engineering degrees actively discourage building communication skills.

This above all else is the heart of that linear-thinking problem. Many graduating engineers in the US are communication challenged. They can't construct a principled argument, write evocatively, or speak convincingly. They lack the skills to critically analyze others' communication, have no eye for subtlety, and flat out lack an appreciation of good writing.

It's often explained away as if there is this single choice people make when they're young that they'll either be good at math/science or literature/history. Once you've made that choice, you're stuck socially, psychologically, practically. But that's okay, goes the argument, specialization is necessary and everybody is one-sided, really.

So despite wide knowledge of this problem, schools combat it with deliberately watered-down English classes designed not to damage engineering GPAs. My alma mater required two English classes, both of which I got near-perfect grades in by skipping class and writing half-minded, fantastical leaps of literary analysis on books I'd skimmed. I got perfect scores and remarks like "This is the best paper I've ever read". Literally. It also required a single-semester "technical communication" course where we were introduced to business letter etiquette and the definition of genre.

As a firm believer that writing something down is the fastest, most powerful method to clarifying your thoughts, I find it despicable that engineering colleges don't provide these challenges. I find it corrosive that the dominant societal perspective is that it's ok if you can't write because you're an engineer. I think this directly leads to the sort of creativity rot I saw, all the time, in classmates.

My experiences may be anecdotal and localized to my college, which is one of the top engineering colleges in the US. I hope it is, and that this disgust is misdirected, but really I think it's larger than that.

---

tl;dr? I'm completely convinced that a dreadful number of engineers are missing something central to anyone who wants to deal in complex ideas, entirely because it's considered a liberal arts specialty. They're weak at the techniques for critically and clearly articulating and analyzing ideas aloud or in writing, the art and science of human communication.

(I'll also note that so far a huge part of the advice on how to survive graduate school is "learn to write." People tell me this with the look in their eyes that says they've seen so many people fail miserably because of this sudden, new expectation. I know my writing needs a lot of improvement, but I also know that many of the people in my graduating class would require a near-complete literary reversal before they could actually publish.)

</rant>

(Edit: made it more clear that I'm not arguing with what sudont said. Written communication is hard.)


I definitely agree with that: many of the skills I've used as a CS grad student came directly from my philosophy minor, not from my CS bachelor's degree. Being able to construct or refute an argument, writing 6-to-20-page essays, etc. Writing in CS isn't the same as writing in philosophy, but at least the philosophy classes gave me some grounding in it, which was totally lacking in the CS curriculum.

The other big thing I got out of philosophy but not CS: how to frame and motivate a problem. In undergrad CS, you're usually solving problems that come prepackaged: someone has already posed this problem, developed the framework in which you're expected to come up with a solution, etc., and your job is to find and explain the solution. But in CS research, sometimes >50% of the work can be in convincing someone that something is a problem to begin with, and moreover that it's a problem they should care about, and your way of framing the problem is a good one. Only then will they even bother to look at your proposed solution to the problem!


Um: “There are only two hard things in Computer Science: cache invalidation and naming things”. Tim Bray quoting Phil Karlton

If you haven’t heard that quote, hopefully it’ll conceptually expand to where my post completely agrees with you. If it doesn’t, I completely agree with you.

My arts degree helps me think laterally in the extreme.


Ah, cool. Hadn't heard the quote actually, hah, though I was pretty sure you weren't completely hardline serious that you'd summed up the benefits of liberal arts education.

You just tipped off a complaint I had (and have) building up inside. We're in the same boat, argumentatively. I'm just speaking (yelling, crying) from the other side.


You must have had a truly great “liberal arts education”!

I for one am extremely sceptical that a large set of deeply understood, diverse experiences can be obtained in 4 or 5 years, or indeed anything less than 10, at college.

In my experience, a "liberal arts education" is a great way to spend a lot of money while having a hell of a fun time and learning about history and stuff, in an extremely safe environment.

Obviously a tiny percentage of students with true passion for knowledge will seek out challenges and get a lot more out of college.

But the difference between the average liberal arts student and the average science/engineering student is that the average science/engineering student does not need to seek out the opportunity to work hard and be wrong.

And that is why if I am forced to pick with no other information, I pick science and engineering.


Ah, thank you for describing the condescending attitude I was talking about!


I agree with sudont, bh23ha; you have more than a streak of condescension in your response right here. But more than that, I think you're wrong about, or at least I disagree with, your idea of what a college ought to be. It's worth a conversation.

I'm not strictly a liberal arts student, but my curriculum involves a series of liberal arts courses, required. I'm also in Communications, meaning that among the arts I study (art school) I'm getting a fairly diverse set of courses, ranging from film to screenwriting to poetry to interface design to a few other things that aren't strictly involved in my major. So even that study is more "liberal arts" than the average technical degree concentration.

> I for one am extremely sceptical that a large set of deeply understood diverse experiences can be obtained in 4 or 5, actually anything less then 10 years, at college.

I think you're assuming that everybody who's attending college is going there to learn a specific skillset that'll help them in their career, and work hard at developing those skills. I view college rather as a place for people to learn more about themselves, and to have the freedoms to discover things they would never discover on their own.

I didn't enter college for Communications. It took me a little while to figure out what I wanted. And in the process of searching I took a variety of classes — programming classes, religious studies, Japanese, an industrial design class — and each one helped me understand a little bit more about what I liked and what I didn't.

What's more, I find that the classes I take that offer me nothing I'll apply to my eventual business future tend to offer me unexpected flashes of insight. That religion course, for instance, taught me more about the social experiences human beings seek out than any social design course I've sat through; it helped me understand and come to terms with a phenomenon I at one point actively hated as a child. Similarly, a flamenco class I'm taking right now is unexpectedly teaching me a lot about my misconceptions of what music is and how it works socially. (If you don't know, the way flamenco musicians approach their work is much different from the way band musicians do, and the resulting society is vastly different and interesting.)

For a creative mind, all of these things aid both in specific studies — I could reel out a list of scientific discoveries formulated by scientists who were inspired by an utterly different line of thought — and in the more important study, which is: realizing that the world is interconnected in a lot of ways, and that those connections tend to mirror other connections elsewhere, and that studying all these other things will lead to a richer and deeper education/human being.

I do agree with you that my major doesn't force hard work out of people like engineering majors do. I'm a mix myself: I work hard on my out-of-school projects but my classwork depends on how interested I am in the course. I breeze through a lot of things and do pretty well. But I think that's an advantage, too. I don't like that some technical colleges sap a student's soul and make it impossible for them to do stupid college things, or to experiment on projects with their friends. I have friends who want to do things which they simply can't because freshmen-level courses are taking hours and hours out of their every week. And I'm really sad about that, because I know that a lot of people who give up creative projects in college never pick them up again.

College isn't about me getting a job. It's about me becoming a better person. Liberal arts is geared towards making me more diverse and thoughtful than I would be otherwise. It's working.


>College isn't about me getting a job. It's about me becoming a better person. Liberal arts is geared towards making me more diverse and thoughtful than I would be otherwise. It's working.

I'd really, really like to agree with you. Unfortunately, for myself and a lot of others, college is about getting a job. Why did I go to school and get a computer science degree? All of the employers in my area simply expected a B.Sc. in Computer Science or something related before they'd even consider you as a programmer.

I actually disliked a lot of my computer science curriculum -- I felt that it shortchanged the actual craft of coding and working in teams to build large systems, but I had to do what I had to do in order to get the job I wanted.

EDIT: I also believe that the necessity of liberal education in college is declining as more information becomes more widely available. Yeah, when you had to go to the university library to look up philosophy and history, liberal arts courses were necessary to make someone a well-rounded person. Today, however, there is such a rich variety of liberal education available for free on the Internet that one can make themselves well rounded without stepping foot inside a formal educational setting.


> I'd really, really like to agree with you. Unfortunately, for myself and a lot of others, college is about getting a job. Why did I go to school and get a computer science degree? All of the employers in my area simply expected a B.Sc. in Computer Science or something related before they'd even consider you as a programmer.

It's hard for me to have this conversation with you, because I can't imagine living my life with the end goal of simply finding employment. I'm incapable of going a day without creating something or adjusting something or trying to somehow change myself. So the only jobs I want are the ones that let me make things. And there are always jobs like that available, and they're really easy to find, too: You just hunt down other people that are making things and you ask them what they want you to make.

But I'm still surprised that people have trouble applying for jobs even as programmers. Isn't it relatively easy? You just spend some time making really flashy things, real show-off-y stuff that demonstrates you're more than competent at what you're doing. Write a cocky cover letter that says "I didn't go to college but I'm better at what I'm doing than most college graduates", even if you're not, and then let your work speak for itself. I'm biased also because the year I spent learning Computer Sciences was the easiest year of classes I ever took. I still don't fully grasp why the major seems so demanding for so many students when a lot of the work they have you do is fairly trivial.

> EDIT: I also believe that the necessity of liberal education in college is declining as more information becomes more widely available. Yeah, when you had to go to the university library to look up philosophy and history, liberal arts courses were necessary to make someone a well-rounded person. Today, however, there is such a rich variety of liberal education available for free on the Internet that one can make themselves well rounded without stepping foot inside a formal educational setting.

One can convince himself of his own well-roundedness. But there's a difference between theoretical knowledge and practical knowledge. It's sort of like what the parent post to my original one said about not expecting most liberal arts students to be willing to put in the effort. I too am suspicious of college students reading "liberal arts subjects" on the Internet and bothering to apply them in any practical way, to test the ideas out for themselves.

Having experienced, wise professors guide you through courses of study is a luxury we don't have yet on the Internet.


>It's hard for me to have this conversation with you, because I can't imagine living my life with the end goal of simply finding employment. I'm incapable of going a day without creating something or adjusting something or trying to somehow change myself. So the only jobs I want are the ones that let me make things. And there are always jobs like that available, and they're really easy to find, too: You just hunt down other people that are making things and you ask them what they want you to make.

It really depends on where you are and what sort of contacts you have. Yeah, if you already know a lot of people who are into programming or if you're in an area where there are lots of people making things, then you can let your work speak for itself. Unfortunately, I'm not in one of those areas. Here in the Midwest, if you don't have the degree, your resume gets placed directly in the circular file.

One of my friends is a better programmer than I am, but due to family circumstances he couldn't finish his Computer Science degree. He's stuck in a relatively dead-end sysadmin job, while I'm moving ahead in the programming world. He's doing his best to finish his degree, but it'll be a couple of years before he can do so, and he'll be that much further behind when he finally does graduate.

You may not be able to imagine a life with the end goal of simply finding employment. However, if you ended up in a situation where you were unemployed (or worse, unemployable) for a long period of time, then you would start looking for employment just for the sake of having employment. It's nice that it's unlikely for you, but it is certainly a situation that many of us have to deal with every day.

>Having experienced, wise professors guide you through courses of study is a luxury we don't have yet on the Internet.

That certainly is true. On the Internet you don't have the challenge of defending your ideas against someone who's studied the topic for most of their lifetime.


Thanks for the thoughtful response.

Let me first clarify that I'm not talking about education as a type of technical school that will teach you whatever skill is hot at the moment.

Rather, I think the original point of this discussion started with the claim that good design is not just how things look but rather a deep understanding of how the whole thing works. Knowledge that is both wide-ranging and deep is a prerequisite for good design.

That is something I strongly agree with.

A liberal arts education was suggested as the thing which provides both deep and wide-ranging knowledge. And obviously I am a bit sceptical of that.

For example:

> I view college rather as a place for people to learn more about themselves, and to have the freedoms to discover things they would never discover on their own.

Exactly! Except I don't quite agree with "never discover on their own." College happens to coincide with the time of your life when you are learning the most about yourself.

And if you decided to travel the country (or the world) on a motorcycle, I bet you would learn a lot about yourself and have incredible freedom, and discover things you'd never even dreamt about.

But I don't mean to disparage college. College also brings together other students and professors and it's very safe, so it's an absolutely great experience.

However, I do find that people who defend a liberal arts education often imply that self-discovery without it is just not as good. I strongly disagree with that; I don't think you even need college for self-discovery. Curiosity and a sense of adventure are just as good. Add travel to that mix and I think few colleges can beat it.

The other unfortunate implication is that people who don't have a liberal arts education are a bit shallow or narrow focused, or not quite as well rounded. This I find frankly offensive. But never mind how I feel about it, I think it's plainly wrong.

Science and engineering don't sap students' souls. Scientists and engineers are not all boring, grey people with no sense for art or music. I recall the Ad Council commercials with a boy telling a street musician to get a job, and a little girl asking her dad to read to her from the Federal Reserve meeting notes.

Actually, if I had to name the one area of study which saps people's souls with overbearing work loads, it would be medicine. And MDs aren't exactly known to be shallow or not well rounded.

Scientists and engineers love music and art too. I mean how can you look at the Millau Viaduct http://images.businessweek.com/ss/06/01/wonders_bigdigs/sour... and not see the art and beauty in it?

Your paragraph:

> For a creative mind, all of these things aid both in specific studies — I could reel out a list of scientific discoveries formulated by scientists who were inspired by an utterly different line of thought — and in the more important study, which is: realizing that the world is interconnected in a lot of ways, and that those connections tend to mirror other connections elsewhere, and that studying all these other things will lead to a richer and deeper education/human being.

Is a perfect example of implying that scientists and engineers are just not as aware of the nature of the universe, the subtle connections in it, the beauty in it. That scientists and engineers just aren't quite as "creative".

> College isn't about me getting a job. It's about me becoming a better person. Liberal arts is geared towards making me more diverse and thoughtful than I would be otherwise. It's working.

And amazingly science and engineering work in exactly the same way for science and engineering students.

I mean, literally no one wants to hire software engineers right out of college. So yeah, pretty much exactly like a liberal arts degree :)


Ha! One out of two isn't bad.


That was my favorite part as well, and what made me want to post this to HN.

Love this last paragraph especially: "Unfortunately, that's too rare a commodity. A lot of people in our industry haven't had very diverse experiences. So they don't have enough dots to connect, and they end up with very linear solutions without a broad perspective on the problem. The broader one's understanding of the human experience, the better design we will have."

Big point to take away from that is you can't make great consumer products unless you have a good understanding of people and human nature.

Guys like Markus Frind, Jonah Peretti, and esp Zuckerberg are great examples of good hackers who also have that rare quality of knowing what makes people tick.

No surprise that broad thinkers are able to connect the dots better. Makes me think of awesome writers on HN like PG and Derek Sivers who continually write quality posts and insights.


And yet, they are widely known to not be "people persons". Hmm...


The only way to observe a crowd is to stand apart and look from a distance.


Easiest example of this to grasp: Apple bluetooth keyboard, which rolls up in the back to provide elevation. That roll in the back also happens to be a battery compartment: aesthetic and functional. Minimalist. Not just how it looks, but also how it works.


Pretty cool. Originally published in Feb 1996 for those who are curious (like I was).


Thanks, I realised it was old, but didn't bother figuring out the date.

Personally I would like it if people put the publication year in the title when posting articles such as this.


I got to the bit where he said he was 40, then googled his birthdate (1955) to arrive at 1995


Jobs was entirely wrong about the web, and continues to be. WebObjects' big-company focus ($50,000 for a license) is precisely why it is now dead. Meanwhile, Personal Home Page script (now known as PHP) powers the world's largest website -- the little guy's tools won.


What WebObjects would have allowed us to do today if it hadn't been left to bitrot was write an app's business logic once, customise the interfaces and have a webapp, Mac app and iPhone app all from the same codebase. They would connect to the same database, and sync seamlessly.

You can feel the lack of it every time an iOS developer complains there isn't an out-of-the-box syncing solution. They're all using CoreData (which is a simplified, cut-down version of WebObjects' EOF); it should be a simple matter just to sync to a Mac app, let alone to a web site.

Instead, devs have to learn three different stacks and join them all up manually. Frustrating to watch when you know we've had something better since 1995, and it's sitting neglected.

If you see WebObjects as competing with PHP then yes, it absolutely lost. If you see it as creating the cloud 15 years before its time, then it's actually the only contender that comes anywhere close.


"What WebObjects would have allowed us to do today if it hadn't been left to bitrot was write an app's business logic once, customise the interfaces and have a webapp, Mac app and iPhone app all from the same codebase. They would connect to the same database, and sync seamlessly."

I think this is pie-in-the-sky just the same way his prediction of being able to write an app in "20% of the time" was.

"Write an apps business logic once." I've been developing software for a while now, 10 years, and I've never seen true company-wide (let alone world-wide) code reuse on a massive scale.

Yes, there is a lot of code reuse. And yes, Django and Zend Framework and RoR are all examples of code reuse.

But I've worked with so many entrepreneurs and CEOs who have this same wish: write it once, use it everywhere.

It just doesn't work that way in practice.


Ever actually used WebObjects? Especially in the Objective-C days, this is exactly what it did. (What became) Cocoa has pervasive MVC all the way through. Turning a WO app into an NS app was a matter of working on the V, as it should be.


Just because that's what it CAN do doesn't mean it will ever be used that way in practice.

There are many existing technologies that would let you "write it once and use it over and over in different platforms" which is the generic version of what you described specifically using iPhone, etc, as examples.

But in practice, writing code general enough to do that takes much longer, is much more difficult, and requires more elite developers.

Might work fine for the companies that are Meccas for grade-A development talent. But for the bread-and-butter companies where 90% of the software in this world is written? Please.


Let's separate out Jobs's utopian spin here and focus on what I'm actually talking about, which is not "global-level code reuse" or whatever generic strawman you're aiming for.

But let's say I'm writing a web app, perhaps in RoR. I build an entire model for the backend and set up controllers to drive it. Have my HTML views and I'm good to go. Then I want an iPhone client. I have to reimplement that exact same model in Objective-C, and a good portion of the controllers too.

With WO, all that wasted time vanished. The same models and controllers worked for both, right down to the NSString level and below.

Does that let you write once and expand to everything, everywhere? No. But does it take grade-A talent to work across its supported platforms? No.


I agree with you that this was the WO vision. I think what we have settled on instead is JSON over RESTful interfaces. The downside is that it takes a lot more code to hook the web logic into the interface. The good side is that it works across languages and technology stacks, you just need a JSON library and an HTTP stack.

This is probably a good trade off, because the "one language for everything" has never succeeded in practice.
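To make that trade-off concrete, here's a minimal sketch in Python, with a Flask server and a stdlib client. The endpoint and field names are invented for illustration, not taken from WO or any product mentioned above. The same JSON resource can back a web page, an iPhone app, and a desktop app, but each client has to write its own model and glue code by hand, which is exactly the duplication the WO camp is complaining about.

    # server.py -- a tiny JSON-over-REST backend (illustrative sketch only)
    from flask import Flask, jsonify

    app = Flask(__name__)

    # In a real app this would come from a database.
    POSTS = [
        {"id": 1, "title": "Hello", "body": "First post"},
        {"id": 2, "title": "WebObjects", "body": "Ahead of its time"},
    ]

    @app.route("/posts")
    def list_posts():
        # Every client (web, iPhone, desktop) hits this same endpoint.
        return jsonify({"posts": POSTS})

    if __name__ == "__main__":
        app.run(port=5000)

    # client.py -- the glue each platform ends up repeating by hand
    import json
    from urllib.request import urlopen

    class Post:
        """Client-side model, re-declared separately on every platform."""
        def __init__(self, data):
            self.id = data["id"]
            self.title = data["title"]
            self.body = data["body"]

    def fetch_posts(base_url="http://localhost:5000"):
        with urlopen(base_url + "/posts") as resp:
            return [Post(d) for d in json.load(resp)["posts"]]

The upside, as the parent says, is that nothing here cares what language or stack sits on either end; the downside is that the Post class and the fetch/parse glue get rewritten for every client.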


The exact same model? This is textbook. In practice? There are going to be differences. The exact same database does not mean the exact same model.

Controllers? Night and day. A complete rewrite would be necessary. The app will function entirely differently.

Generic strawman? Chill, this isn't Slashdot. But to say that the only thing stopping this awesome revolution in software development is that WebObjects wasn't successful seems to me to ignore the grim realities of most software development projects: there isn't the ability, talent, budget or planning to produce truly reusable code.


In practice? I've done this. The controllers change only as much as needed to accommodate the new views. If your model has differences you've coupled it too tightly.

"only thing stopping this awesome revolution in software development"

I thought you said this wasn't slashdot? Who is talking about an awesome revolution? I am talking about a smarter way of building software across different platforms, a way that worked then and would work now to solve real problems that we have.

The fact that the solution is in a space that has long been occupied by the type of wishful thinking that has clearly hurt you in the past doesn't mean you get to write the entire concept off.


I'm sitting with my iPhone in one hand, reading your post, and before this I checked my Facebook account using the native iPhone client and now I'm wondering ... what the hell are you talking about?

Do you think the hard part of building something like Facebook is that stupid native client (that can sync just fine)?


Do you think we make our lives easier by only making the "hard part" of a job easier?

Look at the number of little webapps people here make. Do you see absolutely no value in also being able to produce native apps at the same time, with dramatically less additional work, using the tools and languages you already know?

Facebook's native client "syncs just fine" because they did extra work to make it so. Why is it bad wanting to get that functionality for free?


I understand your point mostly, but I think you're wrong nonetheless.

They did get that functionality for free (mostly) with their web client optimized for mobile: which works and behaves just fine. But even so, they rolled their native client anyway, because the web client doesn't feel and behave like a native iPhone app.

A native iPhone app is properly optimized for the screen width, it does local caching (which a web app used on a desktop doesn't need to do), it behaves differently with regard to the controls used (which respond to different events), and in many cases it is desirable to get rid of the page metaphor, the request/response cycle, etc...

The business logic may be the same, but the UI logic is very different, and you don't want something autogenerated because the UI itself is usually a competitive advantage; and assembling it is not the hard part. A lot harder is coming up with a good design or making your backend stable (I don't know how it happens, but when I go to Foursquare, 6 times out of 10 their service is down).

For instance ... I'm using Twitter a whole lot more on my iPhone because the native client kicks ass. In contrast, I use Facebook more on my desktop because Twitter's web interface sucks (yeah, I'm weird like that).

What I'm trying to say is ... companies that want a cheap transition to new platforms like mobile already have the right tools: the web and its related protocols, which are more ubiquitous than ever. If you want more than that (say, to differentiate yourself from the competition, or just to keep up), autogenerating the UI won't help; you'll have to get down and dirty, because the beauty of a good design lies in the details.

There's no free lunch or silver bullet.


I think you're missing the point. The UI isn't autogenerated -- it's the only bit that isn't. (Aside from DirectToWeb, which is a whole different thing). What WO allows you to do is save on all the backend work and focus entirely on the details of the UI. Which as you say, are the bits that matter.

There may be no silver bullets, but there is having a gun when everyone else is using three different types of hammer.


WebObjects was essentially the precursor to the large J2EE app servers such as WebSphere or WebLogic, and not a priority at all after the Apple purchase. They moved WO over to Java and dropped the price to $699 (IIRC), although by that time no one cared. I can't think of anyone other than Apple that used WebObjects in 2004.

Still, I don't think he was wrong. WO/Enterprise Objects Framework was incredibly influential on server-side Java, which I believe is still the most popular 'enterprise' language currently.


> Jobs was entirely wrong about the web

For low values of 'entirely'

He's made, in this interview, probably a dozen or more statements about the future of the web. I'd say many of them were bang on, certainly a lot better than most predictions from 15 years ago.

As for WebObjects being dead (as a commercial product), well, that's because its time has passed. Apple made it free years ago, and I think even then it had had its day in the sun. It was, however, a very good framework and powered a number of very large sites, including Dell.


WebObjects was $50,000 back when that class of software was expected to cost $50,000. It was running on bigger servers (HP, Sun) when the software on those boxes was very expensive. They dropped the price pretty significantly later, when NeXT was bought by Apple.


WebObjects, dead? Can you please define "dead"? WebObjects powers the iTunes store. I think it is doing just fine generating billions of dollars/year for Apple.


I question the asserted numbers. It is a dead technology.


So it's a dead technology that powers the #1 music seller in the US?


I think I can safely say that I know more about the usage of WebObjects within Apple than (probably) every other poster here, and my knowledge in this area leads me to one inescapable conclusion: it's a dead technology.

HINT: My last job was writing WebObjects software for the aforementioned music store.


Any hints if Apple intends to replace it with something?


Yes. There are thousands of people in the world speaking Latin, but it is still a dead language.


I think he means dead as a commercial product, not dead as in no longer used


I love Jobs' endorsement of Miele washing machines. Where I grew up in Austria, all anyone had was Miele. They were ridiculously expensive. Their slogan was "reliability for many years," which was an understatement, as they actually ran for 20 or 30 years no problem.

Also, him discussing the new washing machine with his family for 2 weeks! Fascinating. I don't think many billionaires would do that.


> Their slogan was "reliability for many years" which was an understatement as they actually ran for 20 or 30 years no problem.

You say that like it's something odd or distinctive.

My family has gotten that kind of lifetime out of bargain-price "American" washing machines. The only exceptions have involved operator error and "but I want a new one NOW, I've waited long enough for it to die".


My Miele never worked right, the plumber couldn't fix it. Finally I had it ripped out and replaced with a Maytag.


I just called Miele service myself: they offer a 10-year guarantee plan.


I was most taken by his ideas on education. The voucher system will never happen, of course, which is a shame. I also love the idea of small schools springing up all over the place. (I'm not a huge fan of the current education system, and I'd love to see it disrupted.)

As for trying to teach computing in schools, well, that's long been rant material of mine (ask anyone who's been unfortunate enough to mention to me how pleased they are that their school's spending money on a room full of new computers), so don't get me started...


Interestingly enough, many states attempted voucher programs, though not to the extent Jobs suggested. It did spur new schools to start up. I know Florida had a Pre-K voucher program, and as a result, Pre-K schools popped up all around. This was sometime in 2006-2008. I'm not sure if they discontinued the program or not, but education levels were on par with public schools, at a per-child cost of about half.


Is this quote a case of iPad foreshadowing? :)

"On the client side, there's the browser software. In the sense of making money, it doesn't look like anybody is going to win on the browser software side, because it's going to be free. And then there's the typical hardware. It's possible that some people could come out with some very interesting Web terminals and sell some hardware."


Though Web Objects didn't take off, I think the idea behind them did. We just call it SaaS.

In SaaS products, instead of building the same core ingredient over and over again for multiple companies, it is built once and then made available via the web to anybody who wants to use it.

That's what ViaWeb was, it's what WePay is, it's what Salesforce is -- it's what the web has become. You need software to do something, you Google it, you find a web-app and you start using it. Done.

Web Objects, Web Apps, tomato, tomatoe.


WebObjects was just a really well-designed app server. It's obvious things would work that way - even though WO didn't, in the end, "win". Too lazy to look it up, but WebObjects might have been designed around the same time as Java.

A predecessor to Ruby on Rails at a time when websites were making extensive use of the <blink> tag.


It also did things that still haven't really been replicated for web developers in a joined-up way: things like proper developer tools (including Interface Builder for the web), a seriously nifty ORM, and the ability to write the back-end once and produce both a web app and a desktop app.

(I know Interface Builder doesn't appeal to the hardcore set, but at least it is an attempt at a different way of working. Our never-ending focus on creating interfaces programmatically makes me wonder what would have happened to desktop publishing if we'd said "what? We made you a postscript mode for emacs -- what more do you need?" and left it at that.)


There would be front-end developers for print design…

HTML really is like postscript mode for emacs, as far as a designer is concerned.


Yes, I agree. That's why I think we should do better. Dreamweaver really doesn't come close to what we should be doing in terms of visual and coding tools to let designers and developers work together.


I’m not smart enough to think of a solution. Interface Builder in Xcode works because everything’s based on Cartesian coordinates, but other than pages starting from the top-left, HTML is really based around flowing and cascading.

Maybe it’s possible. CSSEdit works great, but requires the HTML to be set up. Espresso’s node layout works well to show hierarchy, but still requires manual HTML editing. One solution would be similar to creating nested NSViews, but turning off {position:absolute} by default. This would force the designer to think in a way that’s similar to how web pages flow in the real world, instead of throwing absolutely positioned elements into a static frame.

I’d have to say some sort of blob state file would be the best bet. The designer works in a vector environment that emphasizes mutable viewports, elements and element amounts. Basically Dashcode with vector built in.


WebObjects now powers the iTunes store: http://en.wikipedia.org/wiki/WebObjects


I always thought he was foreseeing OOP in general, which, of course, took off in a huge way, even if WebObjects didn't. (NeXTStep was hugely OOP, as were OS X and iOS later, of course.)


OOP was pretty big by the time of the interview, so what he says doesn't make sense, if he's talking about OOP in general.


Nah, he was definitely talking about OOP, I feel. I mean, specifically, he was pimping the product he was launching, NeXT WebObjects, which was just an application server for OO apps.

In 1996 most software was being developed in, hmm, C, C++, and Visual Basic 5. Java was just launching. Most web stuff was Perl or old fashioned C. VBScript (Classic ASP) was being developed.


One of the things that happened after Steve's infamous Xerox PARC visit was that he saw Smalltalk, thought it was massively cool, and declared that to be the way everyone should program. When he went on to found NeXT, they were pushing Objective-C, which is Smalltalk-style message passing layered on top of C. (Trying to get the best of both worlds, though the combination is a bit awkward to me.)


I think these old articles provide valuable perspective. There's an interesting book from 1998 analyzing Apple's fall up to that point:

http://amazon.com/dp/0887309658

I'm partway through and so far it's good. The detailed analysis untainted by the company's current success is quite interesting. At that point Jobs had come back to be involved in the company but not yet officially taken over as CEO (publicly he said he wasn't up for it).


Fascinating. I find it particularly interesting that Jobs, since moving back to Apple, has done nothing interesting with the Web, but so much with the desktop and hardware. Sure, iTunes delivers music over the Internet, but it is not a web application. On the other hand, he did create the iPad: "And then there's the typical hardware. It's possible that some people could come out with some very interesting Web terminals and sell some hardware."


Offtopic: Please, stop this linking to print version nonsense.

Discussion about it here: http://news.ycombinator.com/item?id=1966724

Original: http://www.wired.com/wired/archive/4.02/jobs.html Issue 4.02 | Feb 1996


The original story was split out into 8 pages and is bombarded with ads. I guess I can see why the print version was linked instead.


To be fair, the only way to have known that this was published in Feb 1996 would have been to have had the regular link.


Yes, the site and magazine use ads for funding. What's your point?


I agree. There is no reason to spread a simple interview over 8 pages so that Wired can get additional ad impressions, and I upvoted the parent because I hate that crap.


The regular version is much more readable, since it sets the column size reasonably. The print version fits to the window size.

Furthermore, for those who prefer the print version, it is a single click away from the regular version. On the other hand, there is no link on the print version to go to the regular version. This alone is sufficient to make the regular version the correct version to link to.


> The regular version is much more readable, since it sets the column size reasonably.

Perhaps if you're really good at ignoring peripheral distractions, but for me, I'd rather have the column be too wide than have it be narrow with a flashing GIF in the margin!


Yeah, and this particular '96 HTML is so bad that the print version is all bold and unreadable after using Readability, while the original looks quite good (only the first page is bold).

btw, I guess I got downvoted for encouraging linking to the original; who knew.


Use Readability.


Why was this comment downvoted?

Besides linking to it being on shaky moral ground, the print version in this case is missing a very critical piece of information: the date the story was published. There is also no way of going from the print version to the normal version; the reverse, however, is easily possible. It’s just not nice to link to the print version.


It didn't occur to me that some HN readers might not know that Wired 4.02 was published in about 1996, but you could be right.

Linking, however, is not on shaky moral ground. If Wired didn't want the print version to be linked to, they wouldn't put it on the web.


Thanks for including the original Wired link there.

And my apologies, guys. I usually include the date of the article in the title if it's something old, and link to the original publication in my HN submissions.

I came across the link on Quora of all places, skimmed it very quickly on a late Friday night, and submitted it with no thought of seeing it get upvoted several times. Regardless, I'm glad we're getting a good discussion here on the interview; glad you guys find it interesting.


Why are you so interested in all of us going through the 8 pages of advertisements?


Prediction: Apple is going to move away from the notion of Hard Disks for their consumer-oriented mobile devices. The Flash chips in SSDs are already somewhat like dynamic RAM, though not quite as fast and with some extra requirements and limitations. One could easily give the illusion of a laptop with 256 Gigabytes of orthogonally persistent RAM by using 8 Gigs or so of dynamic RAM as a cache into Flash RAM. No need for any notion of a separate persistent store like a Hard Drive. No need to even boot up! Laptops would only have "hibernate" and "sleep". (You'd need ECC, wear leveling, and ways of transparently "retiring" sections of Flash RAM that go bad.)

Go one step further, and use the cloud as a backing store with an always-on mobile broadband connection. The 256 Gigabytes of Flash RAM would act as a local cache to cloud storage.
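Just to sketch the idea (a toy model, not anything Apple has shipped; the names and sizes are made up): treat a small amount of DRAM as a write-back cache over flash, flush dirty lines on eviction and on sleep, and the software above only ever sees one persistent address space.

    from collections import OrderedDict

    class PersistentMemory:
        """Toy sketch: small fast 'DRAM' cache over a large slow 'flash' store.

        Dirty cache lines are written back to flash on eviction and on sleep,
        so to the software above, all memory appears persistent. (A real
        design would add ECC, wear leveling, and retirement of bad flash
        sectors, as noted above.)
        """

        def __init__(self, cache_lines=4):
            self.flash = {}                # large, slow, persistent
            self.cache = OrderedDict()     # small, fast, volatile (LRU order)
            self.dirty = set()
            self.cache_lines = cache_lines

        def read(self, addr):
            if addr in self.cache:
                self.cache.move_to_end(addr)       # refresh LRU position
                return self.cache[addr]
            value = self.flash.get(addr, 0)        # miss: fill from flash
            self._install(addr, value)
            return value

        def write(self, addr, value):
            self._install(addr, value)
            self.dirty.add(addr)

        def _install(self, addr, value):
            if addr not in self.cache and len(self.cache) >= self.cache_lines:
                old_addr, old_val = self.cache.popitem(last=False)  # evict LRU
                if old_addr in self.dirty:                          # write-back
                    self.flash[old_addr] = old_val
                    self.dirty.discard(old_addr)
            self.cache[addr] = value
            self.cache.move_to_end(addr)

        def sleep(self):
            """Flush all dirty lines; losing the cache then loses nothing."""
            for addr in list(self.dirty):
                self.flash[addr] = self.cache[addr]
            self.dirty.clear()

    mem = PersistentMemory()
    mem.write(0x10, 42)
    mem.sleep()   # nothing to boot back into; the data is already in flash

The cloud-as-backing-store idea in the second paragraph is just one more level below self.flash in the same hierarchy.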


It's not a "prediction" if they've already started doing it, dude(tte).

Prediction: Apple will do away with optical drives.

Prediction: Apple will bring iOS features into MacOS X.

Prediction: Apple will make their laptops smaller and slimmer.

Prediction: Apple will do away with floppy drives.


What do you mean "already started doing it?" Please point out the OS X or iOS device whose kernel has no notion of a "Hard Disk" or something like it. With what I'm talking about, the kernel would see nothing but RAM.

As far as I can see, they just started doing away with the case of the hard drive mechanism in their laptops. As far as even their iOS devices go, they still have a notion of a separate persistent store with high latency characteristics completely separate from a non-persistent RAM.

What I'm talking about doing is making dynamic RAM, Flash RAM, and the cloud all just a part of the memory hierarchy with orthogonal persistence.

This is going to entail a lot of engineering which isn't yet in evidence in any of their current products. Such devices would never have to boot-up! I'm not sure such devices would be capable of being booted up. They'd also need more sophisticated error correction than current devices have. (Current devices can just correct the rare memory corruption event by being rebooted.)


It's not an OSX or iOS device, but Apple recently removed the hard drive from the Apple TV.


AFAIK, the Apple TV's kernel still has a notion of a persistent store from which it boots up.


Jobs: "The desktop computer industry is dead." And here we are, nearly 15 years later, with this statement seeming truer than ever, in part thanks to Jobs' iOS devices.


In case I was misread, I meant that as both a compliment to Jobs and as a reminder of how incredibly hard it is to know when something's truly dead or irrelevant.


I view it as a compliment to Jobs as a visionary.

From a business perspective, the desktop industry had yet to peak, and then it shifted to a luggable version of the same ecosystem with laptops.


He was right about the "idea" of objects on the web, that is, stuff that can be simply plugged and reused here and there. We call those APIs today.


"Jobs's fundamental insight - that personal computers were destined to be connected to each other and live on networks - was just as accurate as his earlier prophecy that computers were destined to become personal appliances."

It's interesting that common sense becomes "prophecy" when you're Steve Jobs.


It's more that prophecies tend to become common sense 25 years in the future (Jobs first spoke of this in the early 80s).


I'd argue that saying something today that will be common sense 25 years in the future is the very definition of prophetic.

None of this was common sense back then, that's for sure. I got my first email account in 1995.


I'm not sure it was so common as to be common sense, but it was a reasonably common prediction among tech people. The slogan "the network is the computer" was coined at Sun in 1984!


It was common sense in 1977, when the people I worked with were connecting their computers together.


I'd like to say that it still wasn't all that prophetic back then, but I was only a wee lad and don't seem to remember much of the early 80s...


Also, more humorously, "said this" becomes "spoke of this" when you're talking about Steve.


In the early 70s the XEROX PARC guys had defined a computer as a communications device rather than a computing device.


How funny that the Web and Steve's "objects" have eventually become one thing (mashups / infrastructure-SaaS / APIs).


From the mouth of Steve: "And once you're in this Web-augmented space, you're going to see that democratization takes place."

Riiight, Steve. "App-Store" as Democracy.


On the other hand, the App Store has brilliantly fulfilled another vision mentioned in the article: removing a lot of middlemen and thus making hierarchies flatter. If you've worked in either the mobile business or the game business, the App Store is indie paradise compared to the times before it.


It's still a 'democracy' in the way you're describing it. If you don't like it, vote with your wallet.


I do!


Web Objects: yesterday's technology, tomorrow!


more accurately: yesterday's technology, the day before


I don't dispute that it was years ahead of its time. But it has stalled (five years ago?) and promised improvements are largely vaporware.



