The reason for that is that there was rarely any visible benefit to putting in the work to follow good computing practices. It certainly caused a lot of pain when jumbles of broken, unreadable code were handed off to new people, who usually ended up reimplementing everything from scratch (I was one such person), but there is exactly zero punishment for leaving that mess behind, and very rarely any reward for cleaning up code and data. So why bother?
As with the open-access publishing debate, there has to be an incentive system in place before people will do it. There are standardized (and required!) practices for things like bio protocols or reporting PV-performance data; only if computing had something similar would I expect anything to improve.