Hello,
Let Ω be our sample space. Consider a model in which we have a random vector X = (X_1, …, X_n), with the X_i all distributed according to a probability distribution P on some finite alphabet 𝒳, and independent. I have seen in several statistical courses that an estimator is any function of the random variables that does not depend on the unknown parameter. In the course, when we say that P̂ is an estimator of P, do we mean that P̂ must itself be a probability distribution on 𝒳?
Best regards,
GL
Hello Gabin,
If we are doing distribution estimation, you can indeed assume (or, if you construct the estimator yourself, you should always require) that the output lies in the simplex, i.e. it is non-negative and sums/integrates to 1. This should be considered the minimum requirement whenever we do distribution estimation.
Of course, this is only a special case of estimation. The empirical mean estimator, for example, can a priori return anything.
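As a small sketch of the point above (hypothetical function names, assuming i.i.d. samples from a finite alphabet): the empirical (plug-in) distribution estimator lands in the simplex by construction, whereas the empirical mean is just a real number with no such constraint.

```python
from collections import Counter

def empirical_distribution(samples, alphabet):
    """Plug-in estimator: relative frequencies, a point in the simplex."""
    counts = Counter(samples)
    n = len(samples)
    return {a: counts[a] / n for a in alphabet}

def empirical_mean(samples):
    """Empirical mean: a priori unconstrained, can be any real number."""
    return sum(samples) / len(samples)

p_hat = empirical_distribution(["a", "b", "a", "c"], ["a", "b", "c"])
# Simplex membership: non-negative entries summing to 1.
assert all(v >= 0 for v in p_hat.values())
assert abs(sum(p_hat.values()) - 1.0) < 1e-12
```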
Cheers,
Thomas
Thank you, Thomas!