
Ask HN: How far are we with abstractive text summarization? - trahn
I.e. how good is it with current methods? Any hints on resources are welcome as well. :)
======
unlikelymordant
Perhaps this seven-day-old paper will help:
[https://arxiv.org/abs/1707.02268v2](https://arxiv.org/abs/1707.02268v2)
(Edit: I had a closer look and it focuses mainly on extractive
summarization, but maybe it has some good references anyway.)

I found a lot of papers by searching for "text summarization survey" on Google
Scholar, even when restricting to papers from 2016 onward, e.g. "A survey on
abstractive text summarization":
[http://ieeexplore.ieee.org/abstract/document/7530193/?reload...](http://ieeexplore.ieee.org/abstract/document/7530193/?reload=true)

------
guard0g
Sequence-to-sequence attention models:
[https://github.com/tensorflow/models/tree/master/textsum](https://github.com/tensorflow/models/tree/master/textsum)
[https://arxiv.org/abs/1509.00685](https://arxiv.org/abs/1509.00685)
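
For intuition, the core idea in these models is an attention step: at each
decoding step the decoder scores every encoder hidden state, softmaxes the
scores into weights, and takes a weighted sum as the context vector. A minimal
numpy sketch of dot-product attention with toy dimensions (this is an
illustration, not the actual TextSum code, which uses learned Bahdanau-style
attention):

```python
import numpy as np

def softmax(x):
    # numerically stable softmax
    e = np.exp(x - x.max())
    return e / e.sum()

def dot_product_attention(encoder_states, decoder_state):
    """encoder_states: (src_len, hidden), decoder_state: (hidden,)"""
    # score each source position against the current decoder state
    scores = encoder_states @ decoder_state          # (src_len,)
    weights = softmax(scores)                        # sums to 1
    # context vector: attention-weighted sum of encoder states
    context = weights @ encoder_states               # (hidden,)
    return weights, context

# toy example: 4 source tokens, hidden size 3
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))   # encoder hidden states
s = rng.normal(size=3)        # current decoder state
w, c = dot_product_attention(H, s)
```

The decoder then combines `c` with its own state to predict the next summary
token, so the model can copy salient content from anywhere in the source.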

