"In this scheme there is a minimum allowed acceleration which depends on a Hubble scale Θ, so, if Θ has increased in cosmic time, there should be a positive correlation between the anomalous centripetal acceleration seen in equivalent galaxies, and their distance from us, since the more distant ones are seen further back in time when, if the universe has indeed been expanding, Θ was smaller. The mass to light ratio (M/L) does seem to increase as we look further away. The M/L ratio of the Sun is 1 by definition, for nearby stars it is 2, for galaxies’ it is 50, for galaxy pairs it is 100 and for clusters it is 300. As an aside: equation (11) could be used to model inflation, since when Θ was small in the early universe the minimum acceleration is predicted to be larger." (http://arxiv.org/pdf/astro-ph/0612599v1.pdf)
If an effect were stronger in the early universe, you'd expect a strong correlation between the effect size in a galaxy and that galaxy's redshift z. It makes no sense to quote a single ratio of 50 for "galaxies", since there are galaxies at every redshift: many are nearby, with redshifts of almost zero, while the Hubble Ultra Deep Field galaxies have very large redshifts of up to ~10. If the number really is the same for "galaxies" in general, that means there is no distance dependence, but McCulloch doesn't seem to realize this.

He also seems to imply that nearby stars have a higher mass-to-light ratio than the Sun because of their distance (?!), but the light-travel-time delay for anything in the Milky Way is negligible (< 0.0005% of the universe's age). In reality, nearby regions of space have higher M/L ratios than the Sun simply because they contain many objects which, unlike the Sun, emit little light (red/brown/white dwarfs, gas and dust, etc.). Likewise, he seems to imply that "galaxy clusters" are farther away than "galaxies", but most galaxies belong to clusters, and we observe both individual galaxies and galaxy clusters at both small and large redshifts.
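The negligibility claim is easy to check numerically. The sketch below assumes standard flat ΛCDM parameters (H0 = 70 km/s/Mpc, Ωm = 0.3, ΩΛ = 0.7 — my choice of round values, not taken from the quoted paper) and compares the light-travel time across the Milky Way with the lookback time to a z ≈ 10 galaxy:

```python
import numpy as np
from scipy.integrate import quad

# Assumed flat Lambda-CDM parameters (standard round values, NOT from the quoted paper)
H0 = 70.0                # Hubble constant, km/s/Mpc
Om, OL = 0.3, 0.7        # matter and dark-energy density fractions
H0_inv_Gyr = 978.0 / H0  # Hubble time 1/H0 in Gyr (978 converts km/s/Mpc -> 1/Gyr)

def E(z):
    """Dimensionless Hubble rate H(z)/H0 for a flat Lambda-CDM universe."""
    return np.sqrt(Om * (1 + z) ** 3 + OL)

def lookback_Gyr(z):
    """Lookback time to redshift z, in Gyr: (1/H0) * integral of dz'/((1+z')E(z'))."""
    val, _ = quad(lambda zp: 1.0 / ((1 + zp) * E(zp)), 0.0, z)
    return H0_inv_Gyr * val

# Age of the universe = lookback time integrated to infinite redshift
age_Gyr, _ = quad(lambda zp: 1.0 / ((1 + zp) * E(zp)), 0.0, np.inf)
age_Gyr *= H0_inv_Gyr

# Light-travel time from the Galactic rim: radius ~50,000 ly -> ~50,000 yr delay
mw_fraction = 5.0e4 / (age_Gyr * 1e9)

print(f"Age of universe:       {age_Gyr:.2f} Gyr")
print(f"Lookback time to z=10: {lookback_Gyr(10):.2f} Gyr "
      f"({100 * lookback_Gyr(10) / age_Gyr:.0f}% of the age)")
print(f"Milky Way light delay: {100 * mw_fraction:.5f}% of the age")
```

With these parameters, the delay for anything in our own Galaxy comes out around 0.0004% of the universe's age, while a z ≈ 10 galaxy is seen more than 95% of the way back — so any cosmological-epoch effect should separate cleanly by redshift, not by object class.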