
Deep Gate Recurrent Neural Network - vonnik
http://arxiv.org/abs/1604.02910v3
======
jean-
The comment field on arXiv reads:

> This paper has been withdrawn by the author due to lacking of enough
> experiments

------
argonaut
PSA: In case you didn't know, anyone can publish anything to arXiv.

~~~
colah3
I missed the actual paper before it got taken down. But as someone in the
field, I'd like to say that these are real researchers at a respected group.

The Helsinki group has done a lot of lovely work over the last few years --
ladder nets come to mind. This paper is by Yuan Gao and his advisor, Dorota
Glowacka. Dr. Glowacka has a pretty substantial publication record, mostly
focused on reinforcement learning:

[https://scholar.google.com/citations?user=sDZkDHQAAAAJ&hl=en](https://scholar.google.com/citations?user=sDZkDHQAAAAJ&hl=en)

Yuan Gao is a second year grad student, and this is one of his first
publications. I think it shows a lot of integrity to realize a weakness in
your work and remove it, pending refinement.

~~~
Djabbz
That's exactly why double-blind reviewing is necessary. #bias

~~~
colah3
I hope my previous comment didn't come across as suggesting that the paper
shouldn't be scrutinized because it comes from an established research group.
Of course it absolutely should be, and double-blind review is a helpful
mechanism for achieving this.

One possible reading of the parent comment was as an attack on the
researchers. Of course, the parent probably didn't mean it that way, but it
might still feel hurtful to the authors, so I wanted to clarify. :)

~~~
argonaut
It was an attack on HN'ers blindly upvoting things from arXiv, because most
HN readers do not properly scrutinize arXiv postings or take them with a
grain of salt.

I was well aware these were actual researchers. Your comment probably didn't
help very much here because most HN'ers, again, will just see that the
authors are PhDs or from a decent group and assume it's a legit paper.

------
zk00006
Where is the link to the PDF?

~~~
Houshalter
[http://arxiv.org/pdf/1604.02910v2](http://arxiv.org/pdf/1604.02910v2)

