
Having worked with a large open-source scientific project, I find this extremely reminiscent of the kind of talk we would give to every new student or post-doc joining the project. Unfortunately, the outcome is entirely binary: either they understand both what to do and why it's important, and already happen to do it, or they understand neither and don't do it. I can't think of a single person who jumped from one category to the other over the years.

The reason is that there was rarely any visible benefit to putting in the work to follow good computing practices. Passing jumbles of broken, unreadable code on to new people certainly created a lot of pain; they basically ended up reimplementing everything from scratch (I was one such person). But there was exactly zero punishment for handing over such code, and very rarely any reward for cleaning up code and data. So why bother?

As with the open-source publishing debate, there has to be an incentive system in place before people will follow through. There are standardized (and required!) practices for things like biology protocols or reporting PV-performance data; only if computing had something similar would I expect anything to improve.
