Hacker News

Seems like a great argument for more transparency. This guy wasted a lot of time trying to guess what they had done. Given that publishing all the supporting materials is approximately free, perhaps journals could start requiring a git link that contains data, code, and paper drafts.


Many respectable journals already do this.

From Science: All data necessary to understand, assess, and extend the conclusions of the manuscript must be available to any reader of Science. All computer codes involved in the creation or analysis of data must also be available to any reader of Science. After publication, all reasonable requests for data and materials must be fulfilled. [...]

http://www.sciencemag.org/site/feature/contribinfo/prep/gen_...


Paper drafts? That's a surefire way to make sure people prepare their papers in secret. The Presidential Records Act didn't suddenly increase the transparency of the presidency; it ensured the president doesn't use email and prompted the Bush administration to move to a secret email server:

http://en.wikipedia.org/wiki/Presidential_Records_Act

http://en.wikipedia.org/wiki/Bush_White_House_e-mail_controv...

By all means, force researchers to publish all the tools necessary to reproduce their results. But you can't expect to set up surveillance in their head.


You might have a point, but I think that's a terrible analogy.

It could go as you suggest. Or it could be like a locker room: if everybody is naked, then nobody cares.

What made me add drafts to the lists is Daniel Dennett's energetic description in Consciousness Explained of how he repeatedly circulates drafts of papers to colleagues for comment. At least in philosophy, that's an important part of the process.

Having to show interim steps would make fraud much harder, and it's a zero-overhead thing if people are already backing up their work.


Yes, I often circulate drafts of my papers to colleagues too. I don't post them for the world to see in perpetuity until they reach a certain level of quality.

Your locker-room analogy doesn't work because there's no way to force people to post drafts. We already have the option of posting drafts: it's called personal websites and/or the arXiv.


Well, the mechanism I suggested for forcing was journals requiring it for publication. People would be obliged to keep a version history of some sort. Careful writers do already, and it's easily automated, so I don't think enforcement would be hard.
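Keeping a version history of drafts really is easy to automate. As a minimal sketch (not any journal's actual tooling, and simpler than real git), here is one way to snapshot a draft file into a history directory, skipping snapshots whose content is unchanged; the function name and layout are my own invention:

```python
import hashlib
import shutil
import time
from pathlib import Path
from typing import Optional

def snapshot_draft(draft: Path, history_dir: Path) -> Optional[Path]:
    """Copy the draft into history_dir, named by timestamp and content
    hash. Returns the snapshot path, or None if a previous snapshot
    already has identical content (nothing new to record)."""
    history_dir.mkdir(parents=True, exist_ok=True)
    digest = hashlib.sha256(draft.read_bytes()).hexdigest()[:12]
    # A snapshot with the same content hash means nothing changed.
    if any(history_dir.glob(f"*-{digest}{draft.suffix}")):
        return None
    stamp = time.strftime("%Y%m%d-%H%M%S")
    dest = history_dir / f"{stamp}-{digest}{draft.suffix}"
    shutil.copy2(draft, dest)
    return dest
```

Run from a cron job or an editor save hook, this gives exactly the kind of interim record a journal could require, with zero overhead for the author. In practice `git commit` in a loop does the same job with better tooling.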


"journals could start requiring a git link that contains data, code, and paper drafts."

But when the goal is to create a result rather than report it, transparency is the enemy.


> But when the goal is to create a result rather than report it, transparency is the enemy.

I strongly believe that errors are a much more pervasive problem in science and related fields than malice is.


There's also a bunch of stuff between error and malice. As we saw here yesterday, it's very obvious in pharmaceutical research, where the work getting funded and published is often heavily biased despite the best intentions of everybody (or almost everybody).

I suspect there are plenty of similar issues in economics. People with money are much more likely to support researchers whose work benefits or protects people with money. E.g., I happened to read a paper from a U of Chicago prof arguing that insider trading is actually beneficial. Boy, I wonder who the big donors are there. Probably not Mother Jones Magazine.


The economic benefit of insider trading is a well established economic possibility. It's not a pure transfer to rich people.


> I strongly believe that errors are a much more pervasive problem in science and related fields than malice is.

The problem is that economics is not a field related to science. The vast majority of public policy economics starts with a conclusion, and then creates facts to support that conclusion. This is not science, it is religion.

(insert disclaimer about this being an overgeneralization)


> I strongly believe that errors are a much more pervasive problem in science and related fields than malice is.

In science generally, probably. In areas tightly connected to perennial areas of sharp ideological policy divides, like macroeconomics, I'm less convinced.


"science and related fields "

But we are talking about economics, and Republican groups like Fox News eat it up.

My favorite example is the 2011 chart distorting the display of the unemployment rate:

http://mediamatters.org/blog/2011/12/12/today-in-dishonest-f...


Fox News is not in the business of science. I used the term "related fields" as a euphemism for Econ and Psych because I didn't want to get drawn into a demarcation debate.

Wherever the line is drawn, the dishonesty of some random Fox News chart is not related to the honesty or dishonesty of actual scientific research.

The much bigger problem in science is error.


"I didn't want to get drawn into a demarcation debate."

Econ and Psych really do lie in a gray area, and by avoiding the demarcation debate you completely miss the relevance of the issue at hand.

If policy makers weren't using this study to justify more austerity, then we probably wouldn't have such a prolonged discussion.


Distorting the truth about the state of the economy is not something confined to one partisan side, and the need for data transparency extends beyond economics regardless.


"is not something confined to one partisan side"

True, but I haven't seen a liberal think tank manipulate charts in that specific manner.


Then look harder.


At my startup, http://banyan.co, we are aiming to tackle transparency in academia using git. Our product is barely a few months old, but we're shipping new features & improvements daily.


> This guy wasted a lot of time trying to guess what they had done.

According to the article, many people wasted a lot of time attempting to recreate their results. Further, the paper is highly-cited and it probably shaped, directly or indirectly, opinions, further research directions, and possibly even policy.

We are focusing on the Excel error, which was likely a mistake. I'm really struggling, though, to justify their choice of weighting/averaging. It's confounding that two highly regarded and experienced academics would choose something so particularly bad. It absolutely should have been noted in the text.
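To see why the weighting choice matters so much, here is a toy illustration with entirely hypothetical numbers (not the paper's data): averaging each country's mean equally gives a very different answer from pooling every country-year, whenever one country contributes far fewer years than another.

```python
from statistics import mean

# Hypothetical country-year growth rates (%) within one debt bucket;
# country A has ten years in the bucket, country B only one bad year.
growth = {
    "A": [2.0, 2.5, 1.5, 2.0, 2.5, 1.5, 2.0, 2.5, 1.5, 2.0],  # 10 years
    "B": [-7.0],                                               # 1 year
}

# Equal weight per country: average the country means.
country_means = [mean(years) for years in growth.values()]
by_country = mean(country_means)

# Equal weight per year: pool all country-year observations.
pooled = [g for years in growth.values() for g in years]
by_year = mean(pooled)

print(f"country-weighted: {by_country:.2f}%")  # -2.50%
print(f"year-weighted:    {by_year:.2f}%")     # 1.18%
```

One outlier year from country B drags the country-weighted average negative even though ten of eleven observations are solidly positive. Which scheme is right is debatable, but the choice swings the headline number, which is exactly why it should have been noted in the text.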



