
'Fifth Generation' Became Japan's Lost Generation (1992) - luu
http://www.nytimes.com/1992/06/05/business/fifth-generation-became-japan-s-lost-generation.html
======
gumby
In 1984 I visited NTT's 5th generation effort (and later worked at MCC,
featured prominently in the article).

At the time I had a bitmapped display on my desk which I connected to a
dedicated Lisp machine (Dorado). At the 5Gen office there was a row of desks
with shared character terminals (one for every two developers) connected to
the workhorse PDP-10 (same machine we used for research in the USA). The guy I
was visiting was glad he only had to share the terminal with one other person.
It was my first introduction to the blue collar status of programmers in Japan
-- but without the respect that, say, a machinist gets in the US. It seemed
pretty unlikely they'd have the opportunity to innovate.

The MCC response wasn't much better (perhaps the nearby Sematech did better? I
don't know). It was also stupid and bureaucratic, and although they did recruit
a bunch of very smart folks from US and Mexican universities, I never saw much
impact either (though I had _two_ Symbolics 3600 terminals on my desk). I
don't think it's a government issue -- these multi-company collaborations
(Taligent, Itanium, even Power) rarely come to much.

Of course companies can work together -- standards organizations and joint
development projects are common. But IMHO a diffuse "consortium" really has no
reason to exist beyond its own existence.

~~~
lkrubner
" _these multi-company collaborations (Taligent, Itanium, even Power) rarely
come to much._ "

That is a good point. A good example that's happening right now is
[https://www.roomkey.com/](https://www.roomkey.com/)

If you read this you will be amazed by the technology that RoomKey developed:

"Against the Grain: How We Built the Next Generation Online Travel Agency
using Amazon, Clojure, and a Comically Small Team"

[http://www.colinsteele.org/post/23103789647/against-the-
grai...](http://www.colinsteele.org/post/23103789647/against-the-grain-aws-
clojure-startup)

(Hotelicopter eventually rebranded as RoomKey)

The tech is amazing, but this is a company owned by a consortium of big hotels
(Marriott, Sheraton, etc.) and they have no interest in feeding a disruptive
startup that threatens their business. Rather, they simply want RoomKey to
exist so they can use it as leverage in negotiations with Expedia and Orbitz
and maybe Kayak. In other words, they want to be able to say "Give us a better
percentage of revenue or we will fully fund RoomKey and you will be sorry."

This truth is obvious to the programmers there, and they find it immensely
frustrating.

It's the misaligned incentives that lead to the poisonous politics of these
multi-company joint efforts.

~~~
wolfgke
Perhaps I miss something for cultural reasons (I am not a US citizen nor do I
live in the USA):

> The tech is amazing, but this is a company owned by a consortium of big
> hotels (Marriott, Sheraton, etc.) and they have no interest in feeding a
> disruptive startup that threatens their business. Rather, they simply want
> RoomKey to exist so they can use it as leverage in negotiations with Expedia
> and Orbitz and maybe Kayak. In other words, they want to be able to say
> "Give us a better percentage of revenue or we will fully fund RoomKey and
> you will be sorry."

> This truth is obvious to the programmers there, and they find it immensely
> frustrating.

Why is this so frustrating to the programmers working there? They can work
with interesting technology (e.g. Clojure). Yes, because of the incentives it
will probably not become a really huge company, but again: where is the
problem if they earn a good salary?

~~~
ACow_Adonis
Because technology is the means, not the ends. Many of us work with tech not
because we're fetishists for proximity or status of technology, but because we
recognise that the technology is a means to actually do something meaningful
in the world.

Have no meaning in the work that we do, and job-wise we'll be as happy working
with tech as we were flipping burgers at McDonald's: pay and conditions aside
obviously.

~~~
wolfgke
> Have no meaning in the work that we do

To create a website that serves as an important bulwark against other travel
websites that endanger the company that pays you is not what I would call
meaningless.

------
mcguire
Back in about 1988, Bruce Porter, one of my AI professors, described the
problem statement of the Fifth Generation project as, "a list of important
research topics like natural language understanding, speech recognition, and
so forth, and the statement, 'Therefore we will build a parallel Prolog
machine.'" He wasn't a fan.

On the other hand, according to the tech press at the time, the Japanese were
going to _eat our lunch_. Put us all out of work. Rule the world with an iron
fist.

~~~
segmondy
A parallel Prolog machine is a great idea. Just because it failed to take off
doesn't mean the idea shouldn't be explored again. The difference in CPU
design between '88 and today is amazing. Prolog is one of those languages that
just automatically lends itself well to AND/OR parallelism without having to
change any code.

I personally still see a future there; von Neumann has only taken us so far.
We need alternate CPU designs.
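The clause-level parallelism being described can be sketched in a few lines. Here is a toy Python illustration (the fact table and predicate names are invented, and real OR-parallel systems like the FGCS machines did this at the runtime/hardware level, not with a thread pool):

```python
# Toy sketch of OR-parallelism: independent branches of a Prolog-style search
# share no state until their answers are merged, so they can run concurrently
# without changing the "program" (the fact table) at all.
from concurrent.futures import ThreadPoolExecutor

# Hypothetical knowledge base: parent(X, Y) facts.
PARENTS = [("tom", "bob"), ("tom", "liz"), ("bob", "ann"), ("bob", "pat")]

def clause_parent(x):
    # parent(x, Child): enumerate children of x from the fact table.
    return [child for (p, child) in PARENTS if p == x]

def clause_grandparent(x):
    # grandparent(X, Z) :- parent(X, Y), parent(Y, Z).
    return [z for y in clause_parent(x) for z in clause_parent(y)]

def solve_or_parallel(goals):
    # Run independent OR-branches concurrently, then merge their answers.
    with ThreadPoolExecutor() as pool:
        results = pool.map(lambda g: g(), goals)
    return [ans for branch in results for ans in branch]

answers = solve_or_parallel([lambda: clause_grandparent("tom"),
                             lambda: clause_parent("bob")])
print(sorted(answers))
```

The point is that the goal code never mentions concurrency; the evaluator is free to fan branches out across workers, which is exactly the property a parallel Prolog machine exploits.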

~~~
wrp
There were a few nearly identical parallelizations of Prolog. Besides KL1,
there were Concurrent Prolog, GHC, Parlog, Strand, and Aldwych. The structure
of Prolog-like concurrent systems is like the Actor model, based on message
passing, and shares the problem of indeterminacy in the ordering of messages.
These systems had to drop backtracking, because that requires access to a
global stack.

In the end, it proved difficult to implement clause invocation in these
languages as efficiently as procedure invocation in object-oriented
programming languages. The combination of efficient inheritance-based
procedure invocation together with class libraries and browsers was better
than the slower pattern-directed clause invocation of the FGCS programming
languages, so concurrent object-oriented message-passing languages like Java
and C# became the mainstream.

I would suggest that the relational database approach is simpler and about as
powerful as the rule-based approach to knowledge processing. From a practical
point of view, any row of a relational database table can be considered
as a rule, provided that at least one attribute has been selected as an
output.
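That row-as-rule reading can be made concrete with a small sketch (the table and column names below are invented for illustration):

```python
# Each row of the table reads as a ground rule once one column is picked as
# the output: IF region = R AND weight_class = W THEN rate = X. Querying with
# bound inputs is then a form of pattern-directed rule invocation.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE shipping (region TEXT, weight_class TEXT, rate REAL)")
con.executemany("INSERT INTO shipping VALUES (?, ?, ?)",
                [("domestic", "light", 5.0),
                 ("domestic", "heavy", 12.0),
                 ("overseas", "light", 20.0)])

def shipping_rate(region, weight_class):
    # Bind the input attributes; the remaining attribute is the rule's output.
    row = con.execute(
        "SELECT rate FROM shipping WHERE region = ? AND weight_class = ?",
        (region, weight_class)).fetchone()
    return row[0] if row else None

print(shipping_rate("domestic", "heavy"))
```

A rule engine adds chaining and unification on top of this, but for flat condition/action knowledge the table plus an indexed lookup already does the job.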

------
geophile
I was at Computer Corporation of America for a few years, until 1988, in the
research group. It was quite illustrious in its day, producing very
interesting research and prototypes (concurrency control, extended relational
databases, the Adaplex project). We were aware of both the 5th Generation
project, and the US "competitor", MCC.

Remember, at the time, Japan was an economic and technical powerhouse. There
were concerns about the Japanese buying up valuable real estate in New York.
We were worried about the trade imbalance. So 5GP was quite threatening to
some.

I remember meeting one of the 5GP researchers who credited MCC for scaring the
_Japanese_ into greatly increasing funding for computer science in his
country. And one of the other comments here pointed out that the fear worked
in the other direction too.

Another commenter here talked about Prolog. One of the lines of research at
the time was on "expert systems", and if I remember correctly, Prolog was one
of the major languages for research in that area. An expert system was a set
of rules for solving a problem in some domain. It was yet another failed
attempt at AI, which didn't really work beyond toy problems. (Of course, now,
those rules come out of machine learning systems, but the rules are kind of
inscrutable.)

~~~
ghaff
At the time, there were a lot of voices pushing the idea that public-private
partnerships in the Japanese MITI vein were the future. Economists like Lester
Thurow vehemently argued that the US wasn't doing nearly enough in this vein
and was going to lose out to Japan as a result.

The history of AI research is sort of messy but around the end of the 1980s
you were entering one of the AI winters. As you say, a lot of money had been
pumped into expert systems without a lot of commercial success (and some high-
profile failures). The Lisp machine market was collapsing. It probably didn't
help that a lot of the big computer companies at the time (the Route 128
minicomputer makers) were collectively struggling.

~~~
geophile
The research arm of CCA disappeared in 1988, at the same time that Symbolics
(and other Lisp machine companies) was failing. So I, along with a few
Symbolics people, were looking for the next thing to do, as OO databases were
becoming a thing. We were brought together by an entrepreneur and formed
Object Design, one of the batch of OO database companies of the late 80s/early
90s. Those Symbolics guys were absolutely brilliant.

~~~
ScottBurson
I remember ObjectStore. Tried to use it, but it didn't work that well for our
use case; we were better off doing serialization. Very interesting design,
though.

------
gwern
Speaking of which, tonight I made a scan of _Japan as Number One: Lessons for
America_, Vogel 1979:
[http://libgen.io/book/index.php?md5=10dd41d874075bdd04be0a46...](http://libgen.io/book/index.php?md5=10dd41d874075bdd04be0a46bd8a70e7)
[https://www.dropbox.com/s/rr14mgyq8cnrp32/1979-vogel-
japanas...](https://www.dropbox.com/s/rr14mgyq8cnrp32/1979-vogel-
japanasnumberone.djvu?dl=0)
[https://mega.nz/#!peoiWAya!siIb9bDw6JbX9-toqTj6bP5iW4VlVTK4b...](https://mega.nz/#!peoiWAya!siIb9bDw6JbX9-toqTj6bP5iW4VlVTK4btH-073V9MI)

------
drallison
Moto-oka Tohru and Hideo Aiso often said privately that an unstated goal in
the Japanese Fifth Generation project was the creation of a practical way to
keyboard written Japanese and that with regard to that issue it was an
unqualified success.

~~~
wrp
Linguist James Marshall Unger in his book, _The Fifth Generation Fallacy_ ,
argued that the primary motivation for FGCS research was to develop NLP for
the Japanese writing system. He claimed that the project was doomed to failure
because the high level of ambiguity in the Japanese writing system required
contextual understanding for even simple processing tasks. In that respect,
things went as he predicted. Do Japanese IMEs like ATOK have any heritage in
FGCS work?

------
amsilprotag
This reminded me of the $860MM "AI Fund" South Korea touted after the AlphaGo-
Sedol Go competition [0]. Looking for news now, it seems like they teamed up
with Singapore and are taking a less centralized approach. From [1]:

 _Gwangju Metropolitan City (GMC) plans to set aside US$920 million (1
trillion won) to support the AI Town Project, which is expected to incubate
1,000 AI-oriented startups and educate and train 5,000 for the sector within
the decade, starting from 2019.

The 3 sub-projects of the AI Town Project include: (1) Establishment of AI
Research Institute, (2) Development of AI-oriented Campus, and (3) Creation of
AI-oriented Start-up Ecosystem._

[0] [https://www.nature.com/news/south-korea-
trumpets-860-million...](https://www.nature.com/news/south-korea-
trumpets-860-million-ai-fund-after-alphago-shock-1.19595) [1]
[https://www.opengovasia.com/articles/singapore-and-south-
kor...](https://www.opengovasia.com/articles/singapore-and-south-korea-to-
collaborate-on-south-koreas-first-ai-town-project)

------
cjauvin
I've been intrigued lately by the notion of Japan's "Fifth Generation", and
wonder what was the nature of the research being done, its actual results,
etc. I'm also wondering if there are possible links that we could try to
establish with the current AI boom. One interesting book I found about the
subject is this one:

[https://www.amazon.ca/Fifth-Generation-Artificial-
Intelligen...](https://www.amazon.ca/Fifth-Generation-Artificial-Intelligence-
Challenge/dp/0201115190)

But as it's from 1983 (i.e. right in the actual period), I don't think the
historical perspective would be very present. Anyone has other interesting
reading suggestions?

~~~
gumby
There was no technical link -- neural nets were still unpopular (largely, if
unfairly, as a result of the '69 _Perceptrons_ book) and most work was on
symbolic systems and, in the later period, expert systems. But what passed for
AI systems at the time were quite powerful, if limited, just as what are
called AI systems today are quite powerful, if limited.

"Unfair" in that the book wasn't intended to destroy work on NNs, but it was
used to justify abandoning the field until multilayer networks (discussed in
_Perceptrons_) were "re-discovered" in the '80s.

------
mastazi
> these days, few people want specialized computers for artificial
> intelligence, preferring powerful general-purpose machines like those made
> by Sun Microsystems Inc., a fast-growing Silicon Valley company.

This is a striking example of how tech cycles change over time. The period in
which the article was written fell during an AI Winter (in fact, the Fifth
Generation program is listed as a notable example[1] in the Wikipedia entry
for "AI Winter"), and Sun Microsystems was still in its prime at the time
(work on the yet-to-be-released Java language had started just 12 months[2]
before).

[1]
[https://en.wikipedia.org/wiki/AI_winter#The_fizzle_of_the_fi...](https://en.wikipedia.org/wiki/AI_winter#The_fizzle_of_the_fifth_generation)

[2]
[https://en.wikipedia.org/wiki/Java_(programming_language)#Hi...](https://en.wikipedia.org/wiki/Java_\(programming_language\)#History)

~~~
gwern
Specialized hardware/generalized hardware is one of the classic 'wheels of
reincarnation' in computer tech: [http://www.cap-
lore.com/Hardware/Wheel.html](http://www.cap-lore.com/Hardware/Wheel.html)

But looking back over the last 20 years, at least for the silicon paradigm, I
think we may finally be in the last turn of the wheel. The trend back towards
specialized hardware (for audio, video, encryption, NN, servers vs
smartphones) seems fairly inexorable. There's no CPU architecture remotely on
the horizon which is general and blows away everything else for flexibility
and speed. Even specialized hardware like GPUs is starting to see its lunch
eaten by ultra-specialized ASICs and TPUs.

