In Bayesian statistics, the posterior probability of a random event or an uncertain proposition is the conditional probability that is assigned after the relevant evidence or background is taken into account. Similarly, the posterior probability distribution is the probability distribution of an unknown quantity, treated as a random variable, conditional on the evidence obtained from an experiment or survey. 'Posterior', in this context, means after taking into account the relevant evidence related to the particular case being examined. For instance, there is a ('non-posterior') probability of a person finding buried treasure if they dig in a random spot, and a posterior probability of finding buried treasure if they dig in a spot where their metal detector rings.

The posterior probability is the probability of the parameters θ given the evidence X: p(θ | X). It contrasts with the likelihood function, which is the probability of the evidence given the parameters: p(X | θ).
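The two quantities are linked by Bayes' theorem: the posterior is proportional to the likelihood times the prior, p(θ | X) = p(X | θ) p(θ) / p(X). As a minimal sketch, the Python snippet below applies this to the buried-treasure example; the prior and the detector's ring rates are assumed, illustrative numbers rather than values from the text.

    # Sketch: posterior probability of treasure given that the detector rings,
    # computed with Bayes' theorem. All numbers below are assumptions.

    def posterior(prior, p_ring_if_treasure, p_ring_if_empty):
        """Return p(treasure | ring) = p(ring | treasure) * p(treasure) / p(ring)."""
        # p(ring): total probability of the detector ringing at this spot
        evidence = (p_ring_if_treasure * prior
                    + p_ring_if_empty * (1.0 - prior))
        return p_ring_if_treasure * prior / evidence

    # Assumed values: treasure lies at 1% of spots; the detector rings 95% of
    # the time over treasure and 10% of the time over empty ground.
    print(posterior(prior=0.01, p_ring_if_treasure=0.95, p_ring_if_empty=0.10))
    # ~0.088: the ring raises the probability from the 1% prior to roughly 8.8%

The point of the sketch is that the posterior updates the prior ('non-posterior') probability in light of the evidence, weighting it by how much more likely that evidence is when the proposition is true than when it is false.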