In numerical analysis and computational statistics, rejection sampling is a basic technique used to generate observations from a distribution. It is also commonly called the acceptance-rejection method or "accept-reject algorithm" and is a type of exact simulation method. The method works for any distribution in $\mathbb{R}^m$ with a density.

To sample from a target density $f$, one chooses a proposal density $g$ whose support contains the support of $f$ and a constant $M < \infty$ with $f(y) \le M g(y)$ for all $y$, draws $Y \sim g$ and $U \sim \mathrm{Unif}(0,1)$ independently, and accepts $Y$ whenever $U \le \frac{f(Y)}{M g(Y)}$. The unconditional probability of acceptance is

$$
\begin{aligned}
\mathbb{P}\left(U \le \frac{f(Y)}{M g(Y)}\right)
&= \operatorname{E}\left[\mathbf{1}\left[U \le \frac{f(Y)}{M g(Y)}\right]\right] \\
&= \operatorname{E}\left[\operatorname{E}\left[\mathbf{1}\left[U \le \frac{f(Y)}{M g(Y)}\right] \;\middle|\; Y\right]\right] && \text{(by the tower property)} \\
&= \operatorname{E}\left[\mathbb{P}\left(U \le \frac{f(Y)}{M g(Y)} \;\middle|\; Y\right)\right] \\
&= \operatorname{E}\left[\frac{f(Y)}{M g(Y)}\right] && \text{(because } \Pr(U \le u) = u \text{ when } U \text{ is uniform on } (0,1)\text{)} \\
&= \int_{y:\,g(y)>0} \frac{f(y)}{M g(y)}\, g(y)\, dy \\
&= \frac{1}{M} \int_{y:\,g(y)>0} f(y)\, dy \\
&= \frac{1}{M} && \text{(since the support of } Y \text{ includes the support of } X\text{)}
\end{aligned}
$$

so the bound $M$ equals the expected number of proposals per accepted sample:

$$
M = \frac{1}{\mathbb{P}\left(U \le \frac{f(Y)}{M g(Y)}\right)}.
$$

When the target has a finite cumulant generating function $\psi(\theta) = \log \operatorname{E}\left[e^{\theta X}\right]$, exponential tilting supplies a convenient family of proposals. The exponentially tilted distribution and its density are

$$
\begin{aligned}
F_\theta(x) &= \operatorname{E}\left[\exp(\theta X - \psi(\theta))\, \mathbb{I}(X \le x)\right] = \int_{-\infty}^{x} e^{\theta y - \psi(\theta)} f(y)\, dy, \\
g_\theta(x) &= F_\theta'(x) = e^{\theta x - \psi(\theta)} f(x),
\end{aligned}
$$

with likelihood ratio

$$
Z(x) = \frac{f(x)}{g_\theta(x)} = \frac{f(x)}{e^{\theta x - \psi(\theta)} f(x)} = e^{-\theta x + \psi(\theta)}.
$$

The tilted distribution has cumulant generating function and first two moments

$$
\begin{aligned}
\psi_\theta(\eta) &= \log\left(\operatorname{E}_\theta \exp(\eta X)\right) = \psi(\theta + \eta) - \psi(\theta) < \infty, \\
\operatorname{E}_\theta(X) &= \left.\frac{\partial \psi_\theta(\eta)}{\partial \eta}\right|_{\eta = 0}, \qquad
\operatorname{Var}_\theta(X) = \left.\frac{\partial^2 \psi_\theta(\eta)}{\partial \eta^2}\right|_{\eta = 0}.
\end{aligned}
$$

A typical application is sampling from the tail $X \mid X \ge b$. Taking the tilted density $g_{\theta^*}$ as the proposal gives

$$
\begin{aligned}
f_{X \mid X \ge b}(x) &= \frac{f(x)\, \mathbb{I}(x \ge b)}{\mathbb{P}(X \ge b)}, \\
g_{\theta^*}(x) &= f(x) \exp(\theta^* x - \psi(\theta^*)), \\
Z(x) &= \frac{f_{X \mid X \ge b}(x)}{g_{\theta^*}(x)} = \frac{\exp(-\theta^* x + \psi(\theta^*))\, \mathbb{I}(x \ge b)}{\mathbb{P}(X \ge b)}.
\end{aligned}
$$

For $\theta^* > 0$ the ratio $Z(x)$ is decreasing on $\{x \ge b\}$, so the bound is attained at $x = b$ and one may take $M = Z(b)$. For a normal target $X \sim \mathrm{N}(\mu, \sigma^2)$, for which $\psi(\theta) = \mu\theta + \sigma^2\theta^2/2$ and the tilting parameter $\theta^* = (b - \mu)/\sigma^2$ makes $\operatorname{E}_{\theta^*}(X) = b$, this gives

$$
M = Z(b) = \frac{\exp(-\theta^* b + \psi(\theta^*))}{\mathbb{P}(X \ge b)}
= \frac{\exp\left(-\frac{(b - \mu)^2}{2\sigma^2}\right)}{\mathbb{P}(X \ge b)}
= \frac{\exp\left(-\frac{(b - \mu)^2}{2\sigma^2}\right)}{\mathbb{P}\left(\mathrm{N}(0,1) \ge \frac{b - \mu}{\sigma}\right)},
$$

and a draw $x$ from $g_{\theta^*}$ is accepted when

$$
U \le \frac{Z(x)}{M} = e^{-\theta^*(x - b)}\, \mathbb{I}(x \ge b).
$$
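As a concrete illustration of the accept-reject loop and of the $1/M$ acceptance probability, the following Python sketch samples a standard normal target using a Laplace proposal; the target, proposal, bound, and function names are illustrative choices made for this sketch, not part of the derivation above.

```python
import math
import random

def rejection_sample(f, g, sample_g, M, n_samples, rng=random):
    """Generic accept-reject loop: draw Y ~ g and U ~ Unif(0,1) independently,
    and keep Y whenever U <= f(Y) / (M * g(Y))."""
    samples, proposals = [], 0
    while len(samples) < n_samples:
        y = sample_g()
        u = rng.random()
        proposals += 1
        if u <= f(y) / (M * g(y)):
            samples.append(y)
    return samples, proposals

# Illustration: standard normal target, standard Laplace proposal.
normal_pdf = lambda x: math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)
laplace_pdf = lambda x: 0.5 * math.exp(-abs(x))
sample_laplace = lambda: random.choice((-1.0, 1.0)) * random.expovariate(1.0)
M = math.sqrt(2.0 * math.e / math.pi)   # tightest bound f <= M * g, about 1.3155

samples, proposals = rejection_sample(normal_pdf, laplace_pdf, sample_laplace, M, 10_000)
print("empirical acceptance rate:", len(samples) / proposals)  # close to 1/M ~ 0.76
```

The printed acceptance rate should be close to $1/M$, in line with the calculation above.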
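For the truncated-normal case, a minimal sketch of the resulting sampler is shown below, assuming the normal target $\mathrm{N}(\mu, \sigma^2)$ and the tilting parameter $\theta^* = (b - \mu)/\sigma^2$ as above. Exponentially tilting a normal density only shifts its mean, so the proposal $g_{\theta^*}$ is simply $\mathrm{N}(b, \sigma^2)$; the function name below is a hypothetical choice for this illustration.

```python
import math
import random

def sample_normal_tail(mu, sigma, b, n_samples, rng=random):
    """Sample X ~ N(mu, sigma^2) conditioned on X >= b by rejection sampling
    with the exponentially tilted proposal N(b, sigma^2); a proposal x is
    accepted with probability exp(-theta_star * (x - b)) on {x >= b}."""
    theta_star = (b - mu) / sigma**2
    samples = []
    while len(samples) < n_samples:
        x = rng.gauss(b, sigma)             # draw from the tilted proposal
        if x >= b and rng.random() <= math.exp(-theta_star * (x - b)):
            samples.append(x)
    return samples

# Example: standard normal tail beyond b = 2
tail = sample_normal_tail(mu=0.0, sigma=1.0, b=2.0, n_samples=5_000)
print("min of accepted samples:", min(tail))               # >= 2 by construction
print("mean of accepted samples:", sum(tail) / len(tail))  # close to 2.37 for b = 2
```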
More intuitively, rejection sampling is based on the observation that to sample a random variable in one dimension, one can perform a uniformly random sampling of the two-dimensional Cartesian graph and keep the samples in the region under the graph of its density function. Note that this property extends to $N$-dimensional functions.
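The geometric picture can be made concrete with a short sketch, assuming a bounded density on a finite interval (the Beta(2, 2) target and its bound of 1.5 are illustrative choices): points $(x, y)$ are drawn uniformly in a bounding rectangle and the $x$-coordinates of the points falling under the graph of $f$ are kept.

```python
import random

def sample_under_graph(f, x_min, x_max, f_max, n_samples, rng=random):
    """Sample from a density f supported on [x_min, x_max] and bounded by f_max
    by drawing points uniformly in the rectangle [x_min, x_max] x [0, f_max]
    and keeping the x-coordinate of every point that lands under the graph of f."""
    samples = []
    while len(samples) < n_samples:
        x = rng.uniform(x_min, x_max)
        y = rng.uniform(0.0, f_max)
        if y <= f(x):
            samples.append(x)
    return samples

# Example: the Beta(2, 2) density 6x(1 - x) on [0, 1], bounded by 1.5
samples = sample_under_graph(lambda x: 6.0 * x * (1.0 - x), 0.0, 1.0, 1.5, 10_000)
print("sample mean:", sum(samples) / len(samples))  # close to 0.5 by symmetry
```

This is the same accept-reject mechanism as above with a uniform proposal on the bounding interval; the fraction of points kept equals the area under the graph divided by the area of the rectangle, i.e. $1/M$.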