
Big O in probability notation

The order in probability notation is used in probability theory and statistical theory in direct parallel to the big-O notation that is standard in mathematics. Where big-O notation deals with the convergence of sequences or sets of ordinary numbers, the order in probability notation deals with convergence of sets of random variables, where convergence is in the sense of convergence in probability. For a set of random variables Xn and a corresponding set of constants an (both indexed by n, which need not be discrete), the notation

    Xn = op(an)

means that the set of values Xn/an converges to zero in probability as n approaches an appropriate limit. Equivalently, Xn = op(an) can be written as Xn/an = op(1), where Xn = op(1) is defined as

    lim (n→∞) P(|Xn| ≥ ε) = 0

for every positive ε.
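As a numerical illustration of the definition, the sketch below (an assumption for illustration only; the function name `prob_exceeds` and the choice of Uniform(0,1) variables are not from the source) takes Xn to be the sample mean of n Uniform(0,1) draws minus its expectation 0.5. By the weak law of large numbers, Xn = op(1), so the estimated probability P(|Xn| ≥ ε) should shrink toward zero as n grows:

```python
import random

def prob_exceeds(n, eps=0.05, trials=2000, seed=0):
    """Monte Carlo estimate of P(|Xn| >= eps), where Xn is the
    sample mean of n Uniform(0,1) draws minus its mean 0.5."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        sample_mean = sum(rng.random() for _ in range(n)) / n
        if abs(sample_mean - 0.5) >= eps:
            hits += 1
    return hits / trials

# The exceedance probability decays as n increases,
# matching the definition of Xn = op(1).
for n in (10, 100, 1000):
    print(n, prob_exceeds(n))
```

For any fixed ε, the printed probabilities decrease with n, which is exactly the convergence-in-probability statement lim P(|Xn| ≥ ε) = 0.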
