
New Data Processing Module Makes Deep Neural Networks Smarter - rbanffy
https://news.ncsu.edu/2020/09/attentive-normalization-in-ai/
======
Datenstrom
These are pretty good results, but not SOTA; currently EfficientNet-B7
outperforms this on ImageNet-1000, where SOTA is around 84% top-1 and 97%
top-5. Gaining the last few percentage points can be much harder than the
first, so it would still be interesting to see this integrated into other
approaches.

For comparison to other methods, see the ImageNet Classification
Leaderboards[1], Object Detection on COCO test-dev[2], and Semantic
Segmentation on COCO 2017[3]. Unfortunately COCO 2017 does not have many
entries; I'm not aware of a better summary of SOTA methods.

[1]: [https://kobiso.github.io/Computer-Vision-Leaderboard/imagenet](https://kobiso.github.io/Computer-Vision-Leaderboard/imagenet)

[2]: [https://paperswithcode.com/sota/object-detection-on-coco](https://paperswithcode.com/sota/object-detection-on-coco)

[3]: [https://paperswithcode.com/sota/semantic-segmentation-on-coco-2017](https://paperswithcode.com/sota/semantic-segmentation-on-coco-2017)

------
cs702
The headline is too linkbaity for my taste. The idea is simple: to
renormalize the data, apply a mixture of multiple affine transformations,
e.g. weighted by softmax attention, instead of a single affine
transformation. The use of attention likely increases running time in both
training and testing. See also Datenstrom's comment:
[https://news.ycombinator.com/item?id=24528468](https://news.ycombinator.com/item?id=24528468)
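To make the mixture-of-affines idea concrete, here is a minimal NumPy sketch. The parameter shapes and the pooling-based attention head are my assumptions for illustration, not the paper's exact design: `K` affine pairs (`gammas`, `betas`) replace the single learned pair of standard batch norm, and per-sample softmax attention weights mix them.

```python
import numpy as np

def attentive_norm(x, gammas, betas, w_attn, b_attn, eps=1e-5):
    """Sketch of attentive normalization on an (N, C, H, W) tensor.

    Instead of one learned affine pair (gamma, beta), keep K pairs and
    mix them per sample with softmax attention weights computed from
    globally pooled features. All names here are illustrative.
    """
    # Standard batch-norm-style normalization (per channel, over N, H, W)
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + eps)

    # Attention weights: global average pool -> linear -> softmax over K
    pooled = x.mean(axis=(2, 3))                  # (N, C)
    logits = pooled @ w_attn + b_attn             # (N, K)
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    attn = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

    # Per-sample mixture of the K affine parameter sets
    gamma = attn @ gammas                         # (N, K) @ (K, C) -> (N, C)
    beta = attn @ betas                           # (N, C)
    return gamma[:, :, None, None] * x_hat + beta[:, :, None, None]
```

With `K = 1` this collapses to ordinary batch norm's affine step; the extra cost per layer is one pooled matmul and a softmax, which is where the running-time concern above comes from.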

------
MichaelZuo
“For example, top-1 accuracy in the ImageNet-1000 improved by between 0.5% and
2.7%. And Average Precision (AP) accuracy increased by up to 1.8% for bounding
box and 2.2% for semantic mask in MS-COCO.”

Just how significant are these results?

~~~
bradneuberg
Top-1 results are much harder than top-5, so the fact that they are bumping up
top-1 by a few percentage points is pretty significant IMHO. Also, getting a 1
to 2% bump on the segmentation task is nothing to sneeze at.

