
N-Shot Learning: Learning More with Less Data - DarkContinent
https://blog.floydhub.com/n-shot-learning/
======
bitL
How does this differ from active learning? When would you use which in case
you don't have sufficiently large training dataset? Would you combine both
approaches? If so, how?

~~~
inimino
To combine few-shot learning with active learning you can

- use active learning to expand the training set, and

- revise your model training procedure to account for the sampling bias this
introduces in the training set.
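The first step above is usually done with uncertainty sampling. A minimal sketch, assuming a scikit-learn-style classifier with `predict_proba` (the data, pool sizes, and query budget here are illustrative, not from the thread; the bias correction in the second step is not shown):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy data standing in for a small labeled set and a large unlabeled pool
# (shapes and distributions are made up for illustration).
rng = np.random.default_rng(0)
X_labeled = np.vstack([rng.normal(2.0, 1.0, (10, 5)),
                       rng.normal(-2.0, 1.0, (10, 5))])
y_labeled = np.array([1] * 10 + [0] * 10)
X_unlabeled = rng.normal(0.0, 2.0, (100, 5))

# Train on the small labeled set, then score the unlabeled pool.
model = LogisticRegression().fit(X_labeled, y_labeled)
proba = model.predict_proba(X_unlabeled)[:, 1]

# Uncertainty sampling: query the points whose predicted probability is
# closest to 0.5, i.e. the ones the current model is least sure about.
uncertainty = 1.0 - 2.0 * np.abs(proba - 0.5)  # 1 = maximally uncertain
query_idx = np.argsort(uncertainty)[-10:]      # 10 points to label next
```

The queried points then get human labels and are added to the labeled set, and the loop repeats; because they were chosen by the model rather than sampled i.i.d., the second step (reweighting or otherwise correcting for that bias) matters.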

------
unimployed
Maybe paying smart people for their ideas, insights, and algorithms costs less
than throwing a bunch of data at AI training.

~~~
acollins1331
We discovered that wasn't true for some things this last decade, though.

------
Vaslo
“If AI is the new electricity, then data is the new coal.”

Ugh

~~~
amrrs
I've been pondering this analogy. I like it, but if AI is the new
electricity, do we need more Edisons or Teslas? Everyone jumping into AI
and learning deep learning seems to be learning how to create the
electricity itself, rather than creating the lightbulb and the other stuff
that runs on electricity - building user apps on top of it.

Does anyone else feel this way?

~~~
heipei
I don't know, just check out a few Kaggle competitions and how pragmatically
the winning teams approach their solutions. It's most often a combination of
tried-and-true techniques, used in an ensemble, with some smart feature
selection. Anecdotally, there's plenty of ready-to-use ML tech available
nowadays that I, as a novice, was able to go from zero to a working Gradient
Boosting classifier within a few days. For me that's the definition of
applying the techniques without trying to earn a PhD in the field.
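For what it's worth, that zero-to-working-classifier path can look something like this with scikit-learn (the built-in dataset and default parameters are just illustrative stand-ins for whatever tabular problem you have):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# A bundled dataset stands in for your own tabular data.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# The defaults are a reasonable starting point; n_estimators,
# learning_rate, and max_depth are the usual knobs to tune from here.
clf = GradientBoostingClassifier(random_state=42)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
```

No PhD required: the ensemble, loss, and tree-building details are all behind the `fit`/`score` interface.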

