
Probability mass function

In probability and statistics, a probability mass function (PMF) is a function that gives the probability that a discrete random variable is exactly equal to some value. The probability mass function is often the primary means of defining a discrete probability distribution, and such functions exist for either scalar or multivariate random variables whose domain is discrete.

A probability mass function differs from a probability density function (PDF) in that the latter is associated with continuous rather than discrete random variables; the values of a probability density function are not probabilities as such: a PDF must be integrated over an interval to yield a probability. The value of the random variable having the largest probability mass is called the mode.

Suppose that X : S → A, for A ⊆ ℝ, is a discrete random variable defined on a sample space S. Then the probability mass function f_X : A → [0, 1] for X is defined as

    f_X(x) = P(X = x) = P({s ∈ S : X(s) = x}).

Thinking of probability as mass helps to avoid mistakes, since physical mass is conserved just as the total probability over all hypothetical outcomes x is:

    ∑_{x ∈ A} f_X(x) = 1.

When there is a natural order among the potential outcomes x, it may be convenient to assign numerical values to them (or n-tuples in the case of a discrete multivariate random variable) and to consider also values not in the image of X. That is, f_X may be defined for all real numbers, with f_X(x) = 0 for all x ∉ X(S).

The probability mass function sums to one over a countable subset of the image of X; consequently, f_X(x) is zero for all but a countable number of values of x. The discontinuity of probability mass functions is related to the fact that the cumulative distribution function of a discrete random variable, when it is meaningful because there is a natural ordering, is also discontinuous. Where the cumulative distribution function is differentiable, its derivative is zero, just as the probability mass function is zero at all such points.

A probability mass function of a discrete random variable X can be seen as a special case of two more general measure-theoretic constructions: the distribution of X and the probability density function of X with respect to the counting measure. We make this more precise below.
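As a concrete illustration of the definitions above, the following minimal Python sketch uses a fair six-sided die (a hypothetical example, not taken from the text): it evaluates f_X(x) for values in the image of X and returns zero elsewhere, checks that the total mass sums to one, reports the mode, and evaluates the cumulative distribution function as a sum of masses.

from fractions import Fraction

# PMF of a fair six-sided die: f_X(x) = 1/6 for x in {1, ..., 6}, 0 otherwise.
support = range(1, 7)
pmf = {x: Fraction(1, 6) for x in support}

def f_X(x):
    """Probability mass function extended to all real numbers:
    zero for every x outside the image of X."""
    return pmf.get(x, Fraction(0))

# Total probability ("mass") is conserved: the values sum to one.
assert sum(pmf.values()) == 1

# The mode is an outcome with the largest probability mass
# (here every outcome ties, so max() returns the first one).
mode = max(pmf, key=pmf.get)

def F_X(x):
    """Cumulative distribution function: jumps by f_X(k) at each
    point k of the support and is flat elsewhere."""
    return sum(p for k, p in pmf.items() if k <= x)

print(f_X(3))    # 1/6
print(f_X(2.5))  # 0  (2.5 is not in the image of X)
print(mode)      # 1
print(F_X(3))    # 1/2

The dictionary keyed by support values mirrors the convention above of extending f_X to all real numbers by returning zero outside the image of X; the same pattern works for any countable support, including tuples of values for the discrete multivariate case.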

[ "Joint probability distribution", "Random variable", "Probability distribution", "Probability density function", "Factorial moment generating function", "Statistical distance", "Parabolic fractal distribution", "Regular conditional probability", "Law of the unconscious statistician" ]