
Google CEO: AI is a bigger deal than fire or electricity - johnshades
https://www.fastcompany.com/40519204/google-sundar-pichai-ai-is-a-bigger-deal-than-fire-or-electricity
======
dmoy
I think there's an important distinction here in word choice. Sundar said more
_profound_ , not _a bigger deal_. You could imagine that it's more profound in
the sort of "what if we make machines that can think / feel / have souls" sort
of way, which certainly can't happen with fire or electricity.

That doesn't mean it's more important than electricity. (As a bozo test - how
does your AI work without electricity?)

Disclaimer: work at google, in a role completely unrelated to AI.

~~~
Swizec
> (As a bozo test - how does your AI work without electricity?)

You design a system that takes in a store of energy of some sort and
converts it into usable energy through some kind of process. You might call
the energy source, say, food, and the conversion process could be called
nutrition.

That's how you'd do it if you lost electricity I guess. Create a generator
that takes not-electricity and turns it into electricity.

If you really have to go without electricity, you could make it run on
glucose. We have systems like that running around, just not artificial.

And since we know neurons and brains work using electric pulses, you could say
that turning glucose into electricity is also a solved problem.

Of course you could also say that without electricity (as in doesn't exist as
a thing) you can't have brains so you'd never be able to come up with AI in
the first place.

Although I wonder whether a brain that runs on something other than
electricity would have evolved if electricity didn't exist as a physical
phenomenon. That would be interesting.

~~~
ericjang
I wish we knew how to imbue robots with chemical actuation mechanisms that
actually worked (as opposed to some proof of concept demo of a simple
electrolysis reaction in a bulky 3D printed silicone balloon).

If real muscle cells / limbs serve as a sort of reference, they would be
vastly more space-efficient and compliant, and would have a higher
power-to-weight ratio than any electromagnetically actuated servo we have
today.

It makes me wonder if the "modern" androids in Westworld used electricity at
all.

------
tw1010
"Company: Thing we have sunk massive amounts of cost into is a bigger deal
than sliced bread."

------
austincheney
The application logic is currently smart enough. So long as the vision of AI
remains centered on single, closed, data-oriented, monolithic applications,
AI will continue to be about as impressive as it is now. It will get faster
as the hardware gets faster, but it won't be what tech evangelists are
hyping it up to be.

If the giant tech companies really want AI to be something _more_, they need
to reevaluate what AI is from a very foundational and simplistic starting
point. Under the current approach, AI is just a big smart application, and
it can never be smarter than a single big software application.

------
jsemrau
Yes it is. Allow me to boast, but over the last 4 years I have implemented
automated credit decision systems replacing human credit analysts, and event
recommendation systems categorizing 3,000,000 events based only on their
descriptions.
The power these new technologies have is on the same scale as fire and
electricity.
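For a sense of what description-only categorization can look like, here is a
minimal sketch (the categories, training data, and model choice are invented
for illustration, not taken from jsemrau's system): a multinomial Naive
Bayes classifier over bag-of-words features, stdlib only.

```python
import math
from collections import Counter, defaultdict

def train(examples):
    """examples: list of (description, category) pairs."""
    word_counts = defaultdict(Counter)   # category -> word frequencies
    cat_counts = Counter()               # category -> number of examples
    vocab = set()
    for text, cat in examples:
        words = text.lower().split()
        word_counts[cat].update(words)
        cat_counts[cat] += 1
        vocab.update(words)
    return word_counts, cat_counts, vocab

def classify(text, word_counts, cat_counts, vocab):
    total = sum(cat_counts.values())
    best_cat, best_lp = None, float("-inf")
    for cat, n in cat_counts.items():
        # log P(cat) + sum of log P(word | cat), Laplace-smoothed
        lp = math.log(n / total)
        denom = sum(word_counts[cat].values()) + len(vocab)
        for w in text.lower().split():
            lp += math.log((word_counts[cat][w] + 1) / denom)
        if lp > best_lp:
            best_cat, best_lp = cat, lp
    return best_cat

examples = [
    ("live jazz trio at the riverside bar", "music"),
    ("open mic night with local bands", "music"),
    ("intro to python workshop for beginners", "tech"),
    ("machine learning meetup and talks", "tech"),
]
model = train(examples)
print(classify("rock bands playing at the bar", *model))  # -> music
```

A production system over millions of events would of course use far richer
features and models; the point of the sketch is only that the description
text alone carries enough signal to separate categories.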

~~~
DrScump
Is the primary advantage of these systems over human deciders that they make
_better decisions_, or that they can be sold as being _free of bias_ in the
end?

~~~
nostrademons
I'd think that the primary advantage is they _don't require paying salaries_.

~~~
jsemrau
Same as not requiring people to manually turn on the gas lights in the
evening.

~~~
jsemrau
Maybe adding to that: from a practitioner's point of view, AI is marketed
heavily because most of the large tech companies can achieve higher
utilization rates of their infrastructure.

[1] [https://e27.co/innovative-cognitive-services-new-oil-begun-realising-impact-lives-20170410/](https://e27.co/innovative-cognitive-services-new-oil-begun-realising-impact-lives-20170410/)

------
throw7
No it's not. It's "AI" on your terms, Sundar. I doubt I'll ever be able to
tell Google Assistant to change her name to whatever I want.

~~~
candiodari
Hotword detection is a solved problem. They could implement it in, well I
don't know, a month or so, and change "hey Google" to anything.

They just don't want to let you.

~~~
Noumenon72
I'd be happy if they'd just keep on working on "hey Google" until I can train
it to understand me on the first try. It was like magic when I got my first
Galaxy S3 but the recognition has only gotten worse.

------
mnm1
Is AI still intelligence if it does completely stupid shit that people
currently do, like racial profiling? Because to me, that's not intelligence.
It is fucking scary as shit, though, that this non-intelligent software can
now amplify the worst of human impulses many times over without any oversight
or control. If that's what people like Pichai are talking about--shit
software given too much power by humans--then yeah, it's beyond terrifying.
But if we're still talking about intelligence, we're not talking about
systems like this.

~~~
emerged
If a neural network (or any other AI algorithm) includes race as one of its
input variables, and through its training optimization utilizes that trait
to deduce who may be guilty of a crime, and the cumulative error is indeed
significantly improved by allowing that trait as an input -- then we need to
rationally define _exactly_ why this is bad, and what _exactly_ the AI is
missing that is needed to address the problem.

I'd imagine the answer is something along the lines of the AI needing to incur
a strong penalty when it falsely accuses or falsely casts suspicion, or misses
a crime it would've found without that trait distracting it, etc. But the
point IMO is that the only way to stop this problem is to understand it
without being afraid of talking about it. You talk of the AI as being stupid
for this problem -- but the stupidity lies in the axioms we're feeding it, and
that only gets fixed by finding out where WE are going wrong in a rigorous and
scientific way.
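One way to make that "strong penalty for false accusations" concrete (a toy
sketch with invented scores and costs, not anything from the thread or the
article): treat the decision threshold as a knob, and pick the threshold
that minimizes an asymmetric expected cost in which a false accusation costs
several times more than a missed case.

```python
def expected_cost(threshold, scores_guilty, scores_innocent,
                  fp_cost=5.0, fn_cost=1.0):
    """Cost of flagging everyone whose score exceeds `threshold`.
    False accusations (innocent, flagged) cost fp_cost each;
    missed cases (guilty, not flagged) cost fn_cost each."""
    fp = sum(1 for s in scores_innocent if s >= threshold)
    fn = sum(1 for s in scores_guilty if s < threshold)
    return fp_cost * fp + fn_cost * fn

# Hypothetical model scores for two populations.
guilty = [0.9, 0.8, 0.75, 0.6, 0.4]
innocent = [0.7, 0.5, 0.3, 0.2, 0.1, 0.05]

# Sweep thresholds 0.00 .. 1.00 and keep the cheapest one.
best = min((t / 100 for t in range(101)),
           key=lambda t: expected_cost(t, guilty, innocent))
print(best)  # -> 0.71
```

Raising `fp_cost` pushes the chosen threshold upward: the system demands
more evidence before it casts suspicion, which is exactly the kind of
rigorously specified penalty the comment above is asking for.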

------
trhway
Fire, electricity, and life/intelligence are all results of the same thing:
entropy equalization along the gradient between a low-entropy region in
parameter space and the rest of the space, i.e., the 2nd law.

AI looks like the next step on that ladder, as it isn't limited by the
physical limitations of a protein-based body; it has every chance to
transcend the current practical limit of a 1.5 kg blob of wet living matter
with 100B elements and 100T connections.

