
Our weird behavior during the pandemic is messing with AI models - gilad
https://www.technologyreview.com/2020/05/11/1001563/covid-pandemic-broken-ai-machine-learning-amazon-retail-fraud-humans-in-the-loop/
======
sn41
The title is backwards. It should be that the AI models are wrong, not that a
given AI model is a gold standard that human behavior is failing to live up
to.

~~~
radarsat1
That's kind of a weird reading of the title, tbh. "Messing with AI models" to
me implies that the models are trained on data of a certain nature, and
therefore, during the pandemic, they are turning out to be "wrong" in this new
context. It's not drawing normative conclusions about human nature at all.

~~~
BerislavLopac
What exactly does the word "weird" stand for, then?

------
BIackSwan
We disabled all ML enabled predictive features when the pandemic began.

This spared us the pain of “bad”, unreliable predictions, which are worse than
having “no” predictions at all. It saved us a lot of pain in maintenance and
debugging.

~~~
im3w1l
That seems very drastic. At least simple, explainable, logically sound models
for short-term dynamics should still work?

------
asquabventured
So it turns out that humans are more complex than what the black-box AI models
have hyper-optimized for.

~~~
ikeyany
Better retrain the humans.

~~~
v4dok
Welcome to Westworld.

------
IIAOPSW
Most models ("AI" or otherwise) behave very poorly in the situation known as a
"structural break" [1]. In extreme instances you can even end up with
something like Simpson's paradox, where the predicted correlation is the
opposite of the true correlation [2]. I've had to deal with this exact problem
before. The trick is to have a meta model that sits on top and tries to detect
when the recent data is highly unlikely to have been produced if the existing
model is true. When that happens you flush the older data. This might be what
our brains do [3].

[1] https://en.wikipedia.org/wiki/Structural_break
[2] https://en.wikipedia.org/wiki/Simpson%27s_paradox
[3] https://slatestarcodex.com/2017/09/06/predictive-processing-and-perceptual-control/
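
To make the idea concrete, here is a rough sketch of the kind of meta model I
mean (illustrative only, not the setup I actually used; the class name, window
sizes, and threshold are all invented for the example). Fit a simple Gaussian
to the older data, measure how surprising a recent window looks under it, and
flush the history when the surprise crosses a threshold:

    import numpy as np

    class BreakDetector:
        def __init__(self, window=30, min_history=100, z_threshold=4.0):
            self.window = window            # recent observations to test
            self.min_history = min_history  # wait for enough data before testing
            self.z_threshold = z_threshold  # how surprising the recent mean must be
            self.history = []

        def update(self, x):
            """Add one observation; return True if a break was detected."""
            self.history.append(x)
            if len(self.history) < self.min_history + self.window:
                return False

            old = np.array(self.history[:-self.window])   # data the "existing model" saw
            recent = np.array(self.history[-self.window:])

            mu, sigma = old.mean(), old.std(ddof=1)
            # z-score of the recent window's mean under the old model; if the old
            # model were still true, this should rarely exceed 3-4.
            z = abs(recent.mean() - mu) / (sigma / np.sqrt(self.window) + 1e-12)

            if z > self.z_threshold:
                # Recent data is highly unlikely under the old model: treat it as
                # a structural break and flush everything but the recent window.
                self.history = list(recent)
                return True
            return False

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        detector = BreakDetector()
        stream = np.concatenate([rng.normal(0, 1, 300),   # "normal times"
                                 rng.normal(5, 1, 100)])  # regime change
        for t, x in enumerate(stream):
            if detector.update(x):
                print(f"structural break suspected at t={t}")
                break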

~~~
enchiridion
Interesting idea. Why would your existing model not be able to detect a change
in the distribution of the input?

~~~
IIAOPSW
How do you detect a change in the input distribution before you even have a
model of the input distribution?

------
sircastor
As others have noted, the models don't represent the way things are anymore.
And being frank, I don't think they ever will again. I very much doubt that
the models will be a good match for behavior going forward, as people are
likely going to have a long, hard transition back towards normalcy.

------
im3w1l
It seems a fundamental nature of the universe that there is a tension between
continuum and discrete. You see it everywhere:

Wave vs particle, gradual heating vs phase change, but also soup with little
chunks of meat (themselves smaller continua), gas clouds coalescing into
planets. It goes on and on.

In this case the model was built for the continuum, and in came a big discrete
shock, something it wasn't built for. Shocks require a different modality of
thought: more reasoning and logic than regression.

------
SomeoneFromCA
Our neighborhood cats were seriously confused and disturbed by the sight of
our faces in masks; no wonder AI is confused too.

------
lostmsu
To be fair, the predictive ability of humans has also dropped quite a lot.

------
2019-nCoV
Is anti-fragile AI possible?

~~~
cameldrv
Yes. You don't see it too much in ad recommenders because they don't need to
be correct at the tails as long as you're rightish most of the time.

Elements of it show up in autonomous vehicles. For example, one thing that can
cause a disconnect is when sensors detect a sequence of events that is
extremely improbable. If it's improbable enough, it's probably not the world;
it's probably your model. Best to hand it off to a human or hit the brakes.

It reminds me a bit of John Meriwether's statement that the events that led
to LTCM's collapse were a "Six Sigma Event." If you are getting a six-sigma
event and you've only been in business a few years, maybe it's because your
model of the market's probability distribution is wrong once you get a little
too far from the mean.
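
As a back-of-the-envelope illustration of that last point (my numbers, not
Meriwether's): under a Gaussian model, a six-sigma daily move should
essentially never show up in a few years of data, so actually observing one is
strong evidence that the model's tails are wrong.

    from math import erf, sqrt

    def prob_exceeds(sigmas: float) -> float:
        """Two-sided probability of a move beyond +/- `sigmas` under a normal model."""
        return 1.0 - erf(sigmas / sqrt(2.0))

    days = 252 * 4                    # roughly "a few years" of trading days
    p_one_day = prob_exceeds(6.0)     # ~2e-9 if returns really were Gaussian
    p_ever = 1.0 - (1.0 - p_one_day) ** days

    print(f"P(six-sigma move on a given day): {p_one_day:.2e}")
    print(f"P(at least one in {days} days):   {p_ever:.2e}")
    # If the model says this should happen once in millions of years and it just
    # happened, the cheaper explanation is that the model is wrong in the tails.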

------
sbmthakur
It's high time those AI models caught up with humanity.

