Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer (2019) (arxiv.org)
12 points by yeesian 7 months ago | 1 comment

Is it just me, or are deep learning papers getting more and more hyperbolic in their use of language? Check out the first sentence in the abstract:

  Transfer learning, where a model is first pre-trained on a data-rich task
  before being fine-tuned on a downstream task, has emerged as a powerful
  technique in natural language processing (NLP).
You could rewrite that without the pomp:

Transfer learning is a technique where a model is first pre-trained on a data-rich task before being fine-tuned on a downstream task.

And you lose none of the meaning by dropping the "powerful" bombast. What is "powerful" anyway? Is this a research paper or a social media post?
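For contrast, the entire technical content of that sentence fits in a toy PyTorch sketch of the pre-train-then-fine-tune pattern (my own illustration, not from the paper: made-up model, made-up data, made-up names):

  import torch
  from torch import nn
  from torch.nn import functional as F

  # The encoder is the part whose weights "transfer".
  encoder = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 128))

  # Step 1: pre-train on a data-rich task (fake regression data here).
  pretrain_head = nn.Linear(128, 1)
  opt = torch.optim.Adam([*encoder.parameters(), *pretrain_head.parameters()], lr=1e-3)
  x, y = torch.randn(4096, 64), torch.randn(4096, 1)
  for _ in range(100):
      opt.zero_grad()
      F.mse_loss(pretrain_head(encoder(x)), y).backward()
      opt.step()

  # Step 2: fine-tune on a small downstream task, reusing the same
  # encoder with a fresh task-specific head.
  finetune_head = nn.Linear(128, 2)
  opt = torch.optim.Adam([*encoder.parameters(), *finetune_head.parameters()], lr=1e-4)
  x_small, y_small = torch.randn(64, 64), torch.randint(0, 2, (64,))
  for _ in range(20):
      opt.zero_grad()
      F.cross_entropy(finetune_head(encoder(x_small)), y_small).backward()
      opt.step()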

This kind of language is really something I've noticed more and more lately; see, e.g., the recent paper on the blindness of vision LLMs:

https://vlmsareblind.github.io/

From which I quote (the abstract):

  ... tasks absurdly easy to humans 
  ... The shockingly poor performance ... 
And many more in the body. What's with all that? Aren't results enough to draw attention to your research work anymore?



