
The final line of this article is just astounding to me:

> Barring unforeseen obstacles, an on-line interactive computer service, provided commercially by an information utility, may be as commonplace by 2000 AD as telephone service is today. By 2000 AD man should have a much better comprehension of himself and his system, not because he will be innately any smarter than he is today, but because he will have learned to use imaginatively the most powerful amplifier of intelligence yet devised.

An "on-line interactive computer service"?...what did "on-line" even mean back then...nevermind "interactive"...The prediction in 1964 that we would have "an information utility as commonplace by 2000 AD as telephone service is today" must be one of the most prescient and accurate three-decade predictions ever made in a general interest magazine.




I agree that looking back at that statement from the present day is just mind-blowing. BUT, I think it is good to remember that (in all honesty) fundamental computing did not advance after the 1970s. PLs, OSs, computer networks, etc., were all invented in the early 1970s or earlier. Most of what has been done in computing in the last several decades has been improved versions of the concepts invented in the 50s and 60s. JGrahamC calls this "Turing's Curse". [0]

[0] https://www.youtube.com/watch?v=hVZxkFAIziA


Brilliant talk, thanks for linking to it...I love the bit (8:40) where he says, "The thing you are doing has likely been done before. And that might seem depressing, but I think it's the most wonderful thing ever. Because it means an education in computer science is worth something."


Actually, it rather fails to hold, in large part because the terminology has so changed in the intervening decades.

In 1964, the idea that computers would largely be small devices that sit on your desk or even in your pocket was unimaginable. The microcomputer--the forerunner of today's computers--wouldn't be developed for another decade. The term "computer" back then meant the big room-filling mainframes, and it would certainly have seemed infeasible for home users to own those machines.

When you look at the context of the article, it's clearly describing a future where you have access to time-shared resources on a large, central computer. "On-line" and "interactive" are meant in contrast to batch systems: in the days of punch cards, you typically operated by submitting your punch cards to be run and waiting for the results to come back--sometimes for hours.

In effect, to use more modern terminology, the article is predicting: "By 2000, you'll be able to purchase from Comcast (or other friendly telco) access to a computer that runs a suite of intelligent expert systems that will drive several professions out of business."

That prediction is more wrong than it is right:

1. You don't buy access to a computer; you own one outright.

2. Most of the intelligent expert systems it predicts don't exist.

3. Most of the professions purported to be obsolete are not. Travel planner is about the only counterexample I can think of, but that's not because computers are doing the planning. I'm not even sure if the 2000 deadline was met for that one.

4. The closest service to what it predicts is Watson, which debuted in 2011, wasn't commercially available until 2013, and certainly isn't commonplace or even easily accessible to corporate users.

About the only thing it got right was that computers would move from batch mode to interactive mode. The expert system model has completely failed--note that this article was written before automatic machine translation of natural languages was declared a failure (and it still is one, 50 years later). Even with the rise of Big Data, most of the goals for computers in that article still seem difficult or impossible to achieve. I personally believe that Big Data is going to lead to the fourth (fifth? I've lost track) big AI crash. Impressive, really, that people still have high hopes for AI when it has failed to live up to those promises for over 60 years.


Not sure if it is as wrong as it sounds. In many ways the computing world today isn't that different. We sit at home with fairly simple terminals like iPads or laptops while computers filling entire rooms at Google, Facebook, etc. perform a large part of our computation. The web is in a way just an evolved or more advanced form of the teletype terminal, allowing us to connect to more powerful clusters of computers offering us different services.


The modem was invented in 1958, so the idea of computer networks was not far-fetched in 1964. For context, the ARPANET, the network that evolved into today's (TCP/IP) Internet, went live in 1969 and had been designed before then.



