
The AI Index 2019 Report - smt1
https://hai.stanford.edu/news/introducing-ai-index-2019-report
======
scottlocklin
I'm pretty sure there is a ~2002 "Nanotech index report" out there which looks
a lot like this. Oh looksie; I am right:

[http://www.nanotech-now.com/CMP-reports/NOR_White_Paper-July...](http://www.nanotech-now.com/CMP-reports/NOR_White_Paper-July2002.pdf)

~~~
account73466
True, but I am not sure "nanotech" was ever used by top S&P 500 companies and
their engineers on a daily basis.

~~~
nostromo
Actually it’s quite analogous.

When nanotech became a hot concept, the goal posts were moved so that
companies could claim they were nanotech companies even though they were using
pretty basic chemistry. No nanobots required.

And now we’re seeing a lot of statistics being rebranded as AI because the
latter attracts investors and journalists.

~~~
account73466
AI is the term used by marketing divisions and a clueless public. Researchers
don't use the term without a specific need; FAANG engineers actually use ML
approaches.

------
account73466
>> Diversifying AI faculty along gender lines has not shown great progress,
with women comprising less than 20% of the new faculty hires in 2018.

I am wondering what % of women are doing PhDs in the same field, and whether
that share is growing. Without it growing, it would be hard to get a greater %
of women among new faculty hires.

~~~
grammarxcore
We'd also need to see numbers on newly created and newly vacated positions.
Even if the pool of candidates is relatively stagnant, candidates can't get
jobs if they outnumber the positions available.
Many of the doctoral candidates I knew in school couldn't find positions in
academia because of market saturation.

~~~
account73466
>> Many of the doctoral candidates I knew in school couldn't find positions in
academia because of market saturation.

The situation is now drastically different from, say, 40 years ago. That is
why so many PhDs leave for industry, where they are better paid. That option
is open to CS people, but in other domains the situation is pretty bad, since
they don't have easy access to highly paid industry jobs.

------
utopian3
> In a year-and-a-half, the time required to train a large image
> classification system on cloud infrastructure has fallen from about three
> hours in October 2017 to about 88 seconds in July 2019. During the same
> period, the cost to train such a system has fallen similarly.

3 hours to 88 seconds? Wow... I wonder if that's been reduced even further
today (December 2019).
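For scale, the implied speedup works out to roughly two orders of magnitude (a back-of-the-envelope calculation from the quoted figures, not a number taken from the report):

```python
# Rough speedup implied by the quoted numbers (illustrative arithmetic only).
before_s = 3 * 60 * 60   # ~3 hours in October 2017, in seconds
after_s = 88             # ~88 seconds in July 2019

speedup = before_s / after_s
print(f"~{speedup:.0f}x faster")  # ~123x faster
```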

------
keithyjohnson
I get that these bullet points answer What rather than Why, but for those
that are more readily discernible, like "In a year-and-a-half, the time
required to train a large image classification system on cloud infrastructure
has fallen from about three hours in October 2017 to about 88 seconds", what's
causing this? Are models getting smaller without a loss in accuracy? Is
training distributed over a greater number of cheaper machines? Personally,
I'd be more excited about the former than the latter. We can't all afford
MegatronLM-type experiments -
[https://nv-adlr.github.io/MegatronLM](https://nv-adlr.github.io/MegatronLM).

~~~
ummonk
Both. Companies are certainly building bigger and bigger clusters for
training.

At the same time, though, consumer GPUs have gotten significantly faster
(compare e.g. an Nvidia 2080 Ti to a 980 Ti), and learning algorithms keep
improving while the better ones become more widely used (e.g. Adam instead of
plain stochastic gradient descent).
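For a concrete sense of the difference, here is a minimal sketch of both update rules on a toy one-dimensional quadratic: plain SGD with a fixed step, and the standard Adam rule with its usual default betas. The loss function, learning rates, and step counts are illustrative choices, not anything from the thread:

```python
import math

def grad(x):
    # Gradient of the toy loss f(x) = (x - 3)^2, minimized at x = 3.
    return 2.0 * (x - 3.0)

# Plain SGD: a fixed step along the negative gradient.
x = 0.0
for _ in range(100):
    x -= 0.1 * grad(x)
sgd_x = x

# Adam: per-parameter step sizes from running estimates of the first
# and second moments of the gradient, with bias correction.
x, m, v = 0.0, 0.0, 0.0
beta1, beta2, lr, eps = 0.9, 0.999, 0.1, 1e-8
for t in range(1, 501):
    g = grad(x)
    m = beta1 * m + (1 - beta1) * g          # first-moment estimate
    v = beta2 * v + (1 - beta2) * g * g      # second-moment estimate
    m_hat = m / (1 - beta1 ** t)             # bias correction
    v_hat = v / (1 - beta2 ** t)
    x -= lr * m_hat / (math.sqrt(v_hat) + eps)
adam_x = x

print(sgd_x, adam_x)  # both approach the minimum at x = 3
```

The adaptive denominator is why Adam often needs less learning-rate tuning than plain SGD, which is part of what "more widely used" buys in practice.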

~~~
antpls
Also, neural architecture search has let networks use more efficient built-in
blocks with far fewer parameters, achieving the same accuracy with smaller
models (and lowering training cost).
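As one illustrative example of such a block (the depthwise-separable convolution popularized by MobileNet-style, architecture-searched networks; the layer sizes below are arbitrary), the parameter savings are easy to count:

```python
# Parameter count: a standard 3x3 convolution vs a depthwise-separable
# one with the same input/output channels. Layer sizes are arbitrary,
# chosen only to illustrate the ratio.
k, c_in, c_out = 3, 128, 128

standard = k * k * c_in * c_out            # one k x k filter per output channel
separable = k * k * c_in + c_in * c_out    # depthwise k x k, then 1x1 pointwise

print(standard, separable, f"~{standard / separable:.1f}x fewer parameters")
```

With these sizes the standard layer needs 147,456 weights versus 17,536, roughly an 8x reduction from restructuring alone, before any change in accuracy.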

