Scientific versus statistical modelling: a unifying approach.
2020
This paper addresses two fundamental features of quantities modelled and analysed in statistical science: their dimensions (e.g., time) and their measurement scales (units). Examples show that subtle issues can arise when dimensions and measurement scales are ignored. Special difficulties arise when the models involve transcendental functions. A transcendental function of particular importance in statistics is the logarithm, which is used in likelihood calculations and is the singularity in the Box-Cox family of algebraic transformations. Yet neither the argument of the logarithm nor its value can carry units of measurement. Physical scientists have long recognized that dimension/scale difficulties can be side-stepped by nondimensionalizing the model; after all, models of natural phenomena cannot depend on the units in which those phenomena are measured, and the celebrated Buckingham Pi theorem is a consequence of this fact. The paper reviews that theorem, recognizing that the statistical invariance principle arose from similar aspirations. However, the potential relationship between the theorem and statistical invariance was not investigated until very recently. The main result of the paper is an exploration of that link, which leads to an extension of the Pi theorem that places it in a stochastic framework and thereby quantifies the uncertainty in deterministic physical models.
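For context on the Box-Cox remark above, a brief illustrative sketch (standard material, not drawn from the paper itself): the logarithm arises as the singular case, at \lambda = 0, of the Box-Cox power family,

y^{(\lambda)} =
\begin{cases}
\dfrac{y^{\lambda} - 1}{\lambda}, & \lambda \neq 0, \\
\log y, & \lambda = 0,
\end{cases}

and both branches require the argument y to be dimensionless: a quantity carrying units cannot be raised to an arbitrary real power \lambda, nor passed to the logarithm. The Buckingham Pi theorem addresses exactly this obstacle. In its standard form, a dimensionally homogeneous relation f(q_1, \dots, q_n) = 0 among n physical quantities, with k the rank of their dimensional matrix (often the number of fundamental dimensions involved), can be rewritten as F(\pi_1, \dots, \pi_{n-k}) = 0 in terms of n - k dimensionless groups \pi_i, to which transcendental functions such as the logarithm can be applied without ambiguity.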