
AI Winter - leonidasv
https://en.wikipedia.org/wiki/AI_winter
======
codingslave
Really, what we are in the midst of is AI clickbait. People are predicting the
next "AI Winter" the way major publications keep predicting the next recession.

~~~
Causality1
It's a new thing every year. It was blockchain. Now it's AI and machine
learning. Soon there will be a different buzzword to generate clicks and
investor funds.

~~~
sammorrowdrums
Well, IoT has come back a bit too, with "edge processing". Buzzword trends kill
me, mostly because of non-technical managers asking to use them with no real
understanding of what, how, or why, more than because of the public buzz they
generate. Blockchain is the king of that, but I do think calling ML "AI" was a
ridiculous development, and I remain steadfast in my use of "ML".

You are right too, of course, about the investor situation. That one is really
bad because nothing says "employ a technology you don't consider a good fit
for your business" like $1M...

I once had to switch prematurely to microservices due to investor pressure; the
company wasn't ready, and you can imagine the rest.

------
ngngngng
This is as good a place as any to ask this, I suppose.

Google results being as manipulated as they are these days, what is the best
course or book for teaching yourself artificial intelligence? I'd like to spend
a few months familiarizing myself with the concepts and writing programs with
what I learn, to see if anything useful comes of it.

edit: Found this, which answers the question perfectly:
[https://news.ycombinator.com/item?id=15689399](https://news.ycombinator.com/item?id=15689399)

~~~
jacquesm
fast.ai

~~~
bgorman
Fast.ai is a great resource, but it is quite narrowly focused on deep learning,
and especially on transfer learning.

~~~
jacquesm
GP wants to self-study and has only a few months; I'm not aware of a better
resource for that purpose.

------
StudentStuff
Is OP inferring that we are entering another AI winter after this most recent
hype cycle for general AI?

Considering we use neural networks to encode audio (in Opus), transcribe our
speech, secure our homes and much more, this most recent AI wave has been
quite productive.

~~~
z3phyr
All AI waves have been very productive. The last ones gave us huge advancements
in programming language design, operating systems, networking, and HCI, and
spawned a few programming paradigms...

It does not really translate to the final objective though.

~~~
selestify
What is it that AI has done for all those fields that you mentioned?

~~~
z3phyr
Research in AI directly coincided with the development of early computer
systems. Lisp and friends (along with the small but important ideas that came
with them, ranging from string interning to garbage collection to macro systems
to the fundamental ideas behind compilers and interpreters) were contributed
largely by people involved in symbolic AI research.

Early programming-language parsing research is a direct product of researchers
working on natural language processing; in fact, the grammar formalism behind
BNF was developed for natural languages and only later adapted and improved for
programming languages.

The idea of logic programming, with Prolog and friends, comes directly from AI
research.

Most of the search algorithms we use unknowingly in various machines have
origins in the first AI wave.

Human-computer interaction research dealt directly with the development of
fundamental ideas in speech synthesis, graphical user interfaces, and computer
graphics.

All in all, the fields of applied AI and computing developed together, and a
lot of early ideas spearheaded by AI transferred into fundamental general
computing: ideas so trivial we do not even think about them now. But they were
not so trivial when they were first developed, specifically for AI.

------
skunkworker
Won't we enter another AI winter unless we solve things like finding something
better than backpropagation, or a way to jump straight to the student model in
a teacher-student distillation setup? The fact that distillation works at all
suggests there are better methods out there that we haven't found yet. Though I
could be completely wrong, and Hinton could come out tomorrow with a
groundbreaking paper that fuels another 5 to 10 years of iterative improvements.
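
For readers who haven't seen it, here is roughly what the teacher-student
distillation being referenced does: the student is trained against the
teacher's softened output distribution as well as the hard labels. A minimal
sketch, assuming PyTorch; the temperature T and mixing weight alpha are
illustrative defaults, not values from any particular setup.

    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
        # Teacher's softened class probabilities ("soft targets").
        soft_targets = F.softmax(teacher_logits / T, dim=1)
        soft_student = F.log_softmax(student_logits / T, dim=1)
        # KL term pulls the student toward the teacher's output distribution;
        # the T*T factor keeps gradient magnitudes comparable across temperatures.
        kd = F.kl_div(soft_student, soft_targets, reduction="batchmean") * (T * T)
        # Ordinary cross-entropy on the hard labels.
        ce = F.cross_entropy(student_logits, labels)
        return alpha * kd + (1 - alpha) * ce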

~~~
dijksterhuis
Hinton already came out with a “groundbreaking” paper not too long ago -
capsule networks.

~~~
p1esk
The capsule papers don't strike me as particularly groundbreaking. Still
convolving over every single location in the input image, still using backprop.
Vectors instead of scalars for activations and a novel way to pass signals
between layers? Sure, why not. But if it hadn't been from Hinton, no one
would've noticed. The initial results were underwhelming, and there has been
zero progress even two years later.
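
To make the "vectors instead of scalars" point concrete: in the capsule
networks paper, each capsule outputs a vector whose length is squashed into
[0, 1) so it can act like an activation probability, while its direction
encodes pose. A minimal sketch of that squash nonlinearity, assuming PyTorch
and leaving out the routing-by-agreement step:

    import torch

    def squash(s, dim=-1, eps=1e-8):
        # s: capsule outputs, shape (..., capsule_dim)
        sq_norm = (s * s).sum(dim=dim, keepdim=True)
        scale = sq_norm / (1.0 + sq_norm)            # maps vector length into [0, 1)
        return scale * s / torch.sqrt(sq_norm + eps)  # preserve direction, rescale length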

------
timonoko
In this 1988 video, Teuvo Kohonen tells us that future AIs will be "separate
units you can attach to your regular computer". This rings true; methinks real
AIs will be treated like human personalities with certain skills and
experience: "This AI has 3 years of experience driving on congested roads in
India, 300€." They will be separate units because the hardware will be more
brain-like, with an architecture that is constructed or grown as it gains
knowledge and skills.
[https://youtu.be/Qy3h7kT3P5I?t=2479](https://youtu.be/Qy3h7kT3P5I?t=2479)

------
radagaisus
With just incrementalism, there's still enough value to extract from the
current methods of deep learning to propel a small country into a golden age.
Leaps and bounds are needed to get to AGI, but winter comes when there's no
more money on the table, not because we can't reach our sci-fi fantasies.

~~~
edoceo
AGI = Artificial General Intelligence?

~~~
fnbr
Yes!

------
Amygaz
Can’t learn to ski without snow...

Also, the first line of the wiki article mentions that the entry needs to be
updated.

The level of funding is as hot as it's ever been. NOAA predicts a winter with
temperatures above normal. So I would go with that.

------
dr_dshiv
With Elon's latest truck, we've just crawled out of the cyber winter. And that
was a cold one.

------
tapatio
Yeah, it’s winter time and we have AI.

------
makstaks
I'm rather confused by the misuse of the term "AI", to the point that I no
longer know what is true. I thought true AI needed to pass some sort of test
that demonstrates real intelligence? So did the Singularity already happen and
I didn't notice?

~~~
fouc
>In computer science, artificial intelligence (AI), sometimes called machine
intelligence, is intelligence demonstrated by machines

The term AI doesn't actually imply human-level intelligence.

------
xvilka
Missing "is coming" in the title.

------
soup10
We're in an AI Summer if anything.

~~~
brink
Idk, summer is typically something you enjoy.

~~~
soup10
I don't see any signs of funding or enthusiasm letting up, though. These types
of posts are more reactionary and goalpost-moving ("well, it's not really
'strong' AI because X"), never mind that lots of people are finding real
applications for ML. Processing power to support it is increasing, and funding
is still flowing.

------
ganitarashid
We are currently in the AI winter. What we have now is not AI; it's curve
fitting. Don't be fooled: the computer does not "understand", it's doing blind
curve fitting.

