Yes, but that's true of all statistics. You have to make some assumptions to get off the ground. If you estimate parameter variance the frequentist way, you also make assumptions about the parameter distribution.



No, this is expressly untrue. In the frequentist paradigm, parameters are fixed but unknown; they are not random variables and have no implicit probability distribution associated with them.

An estimator (of a parameter) is a random variable, since it is a function of random variables; however, its distribution depends only on the data distribution. There is no other implicit distribution on which it depends.

For instance, the maximum likelihood estimator of the mean of a normal distribution is itself normally distributed, but this does not imply that the mean parameter has a normal prior. It has no prior, as it is a fixed quantity.
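To make that concrete, here's a minimal simulation sketch (my own illustration in Python/NumPy, not from anyone upthread): the true mean is a single fixed number, and the only randomness comes from the data. Repeating the experiment traces out the sampling distribution of the MLE (the sample mean) without ever placing a prior on the parameter.

  import numpy as np

  rng = np.random.default_rng(0)

  true_mu, true_sigma, n = 5.0, 2.0, 50   # fixed (in principle unknown) parameters
  n_experiments = 10_000

  # Each experiment: draw a fresh dataset, compute the MLE of the mean (the sample mean).
  mle_estimates = np.array([
      rng.normal(true_mu, true_sigma, size=n).mean()
      for _ in range(n_experiments)
  ])

  # The estimator varies because the data vary; the parameter never does.
  print("mean of MLE estimates:", mle_estimates.mean())    # ~ true_mu
  print("std of MLE estimates: ", mle_estimates.std())     # ~ true_sigma / sqrt(n)
  print("theory (sigma/sqrt n):", true_sigma / np.sqrt(n))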


But you make the assumption that the data can be generated by your model, and your variance estimate only holds asymptotically.
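In case "only holds asymptotically" isn't clear, here's a small sketch of what I mean (my own example, assuming an exponential model, not anything stated above): the variance you'd quote from the Fisher information is an asymptotic approximation, and at small n it can understate the actual spread of the MLE.

  import numpy as np

  rng = np.random.default_rng(1)
  true_rate = 2.0
  n_experiments = 20_000

  for n in (5, 20, 200):
      # MLE of the exponential rate: 1 / sample mean.
      mles = np.array([
          1.0 / rng.exponential(scale=1.0 / true_rate, size=n).mean()
          for _ in range(n_experiments)
      ])
      # Asymptotic variance from the Fisher information: rate^2 / n.
      asymptotic_var = true_rate**2 / n
      print(f"n={n:4d}  empirical var={mles.var():.4f}  asymptotic approx={asymptotic_var:.4f}")

The gap is large at n=5 and essentially gone by n=200, which is the sense in which the variance estimate is only asymptotically valid.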


> But you make the assumption that the data can be generated by your model

Yes

> you also make assumptions about the parameter distribution

No

> your variance estimate only holds asymptotically

Don't follow



