Bernoulli process

In probability and statistics, a Bernoulli process (named after Jacob Bernoulli) is a finite or infinite sequence of binary random variables, so it is a discrete-time stochastic process that takes only two values, canonically 0 and 1. The component Bernoulli variables Xi are independent and identically distributed. Prosaically, a Bernoulli process is repeated coin flipping, possibly with an unfair coin (but with consistent unfairness). Every variable Xi in the sequence is associated with a Bernoulli trial or experiment, and all the trials have the same Bernoulli distribution. Much of what can be said about the Bernoulli process can also be generalized to more than two outcomes (such as the process for a six-sided die); this generalization is known as the Bernoulli scheme. The problem of determining the process, given only a limited sample of Bernoulli trials, may be called the problem of checking whether a coin is fair.

Formally, a Bernoulli process is a finite or infinite sequence of independent random variables X1, X2, X3, ..., such that for each i, the value of Xi is either 0 or 1, and for all values of i, the probability p that Xi = 1 is the same. In other words, a Bernoulli process is a sequence of independent identically distributed Bernoulli trials. Independence of the trials implies that the process is memoryless: given that the probability p is known, past outcomes provide no information about future outcomes. (If p is unknown, however, the past informs about the future indirectly, through inferences about p.)
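To illustrate the definition, here is a minimal Python sketch (using NumPy) that simulates a finite Bernoulli process and estimates p from the sample; the success probability p = 0.3 and the sample size are arbitrary assumptions chosen for the example, not values taken from the text.

```python
import numpy as np

# A Bernoulli process is a sequence of i.i.d. Bernoulli(p) variables.
# We simulate a finite sample of n trials with an assumed p = 0.3.
rng = np.random.default_rng(seed=42)

p = 0.3      # probability that each Xi equals 1 (the "unfair coin")
n = 10_000   # number of trials in the finite sample

# Each Xi is 1 with probability p and 0 otherwise, independently of the others.
x = rng.binomial(n=1, p=p, size=n)

# Because the trials are i.i.d., the sample mean is a natural estimate of p;
# recovering p from such a sample is the "checking whether a coin is fair" problem.
print("first ten trials:", x[:10])
print("estimate of p:   ", x.mean())
```

The sample mean converges to p as the number of trials grows, which is why a long enough run of trials lets one judge how unfair the coin is.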

[ "Bernoulli's principle", "Bernoulli scheme" ]
Parent Topic
Child Topic
    No Parent Topic