
Markov random field

In the domain of physics and probability, a Markov random field (often abbreviated as MRF), Markov network or undirected graphical model is a set of random variables having a Markov property described by an undirected graph. In other words, a random field is said to be a Markov random field if it satisfies Markov properties. A Markov network or MRF is similar to a Bayesian network in its representation of dependencies; the differences being that Bayesian networks are directed and acyclic, whereas Markov networks are undirected and may be cyclic. Thus, a Markov network can represent certain dependencies that a Bayesian network cannot (such as cyclic dependencies); on the other hand, it cannot represent certain dependencies that a Bayesian network can (such as induced dependencies). The underlying graph of a Markov random field may be finite or infinite. When the joint probability density of the random variables is strictly positive, it is also referred to as a Gibbs random field, because, according to the Hammersley–Clifford theorem, it can then be represented by a Gibbs measure for an appropriate (locally defined) energy function. The prototypical Markov random field is the Ising model; indeed, the Markov random field was introduced as the general setting for the Ising model. In the domain of artificial intelligence, a Markov random field is used to model various low- to mid-level tasks in image processing and computer vision.
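To make the Ising connection concrete, here is a minimal sketch (not from the original article) of Gibbs sampling for a 2D Ising model on a periodic grid. The function name `ising_gibbs_step` and the parameters `beta` (inverse temperature) and `coupling` are illustrative choices; the key point is that each spin's conditional distribution depends only on its four grid neighbours, which is exactly the local Markov structure of the field.

```python
import math
import random

def ising_gibbs_step(spins, beta=0.4, coupling=1.0):
    """One Gibbs-sampling sweep over an n-by-n Ising model on a torus.

    Each site interacts only with its four grid neighbours, so its
    conditional distribution given the rest of the field depends on
    those neighbours alone (the local Markov property)."""
    n = len(spins)
    for i in range(n):
        for j in range(n):
            # Sum of the four neighbouring spins (periodic boundary).
            s = (spins[(i - 1) % n][j] + spins[(i + 1) % n][j]
                 + spins[i][(j - 1) % n] + spins[i][(j + 1) % n])
            # Conditional probability of spin +1 given its neighbours:
            # p(+1 | nbrs) = exp(b*J*s) / (exp(b*J*s) + exp(-b*J*s)).
            p_up = 1.0 / (1.0 + math.exp(-2.0 * beta * coupling * s))
            spins[i][j] = 1 if random.random() < p_up else -1
    return spins
```

Repeated sweeps (e.g. `for _ in range(100): ising_gibbs_step(spins)`) draw approximate samples from the Gibbs measure of the model; at low `beta` the spins stay disordered, while at high `beta` large aligned clusters form.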
Given an undirected graph G = (V, E), a set of random variables X = (X_v)_{v ∈ V} indexed by V forms a Markov random field with respect to G if it satisfies the local Markov properties:

Pairwise Markov property: any two non-adjacent variables are conditionally independent given all other variables: X_u ⊥ X_v | X_{V ∖ {u, v}} whenever {u, v} ∉ E.

Local Markov property: a variable is conditionally independent of all other variables given its neighbours: X_v ⊥ X_{V ∖ N[v]} | X_{N(v)}, where N(v) is the set of neighbours of v and N[v] = {v} ∪ N(v) is its closed neighbourhood.

Global Markov property: any two subsets of variables A and B are conditionally independent given a separating subset S: X_A ⊥ X_B | X_S, where every path from a node in A to a node in B passes through S.

The global Markov property is stronger than the local Markov property, which in turn is stronger than the pairwise one. However, the above three Markov properties are equivalent for a positive probability distribution.

As the Markov property of an arbitrary probability distribution can be difficult to establish, a commonly used class of Markov random fields are those that can be factorized according to the cliques of the graph. Given a set of random variables X = (X_v)_{v ∈ V}, let P(X = x) be the probability of a particular field configuration x in X. That is, P(X = x) is the probability of finding that the random variables X take on the particular value x. Because X is a set, the probability of x should be understood to be taken with respect to a joint distribution of the X_v. If this joint density can be factorized over the cliques of G,

P(X = x) = ∏_{C ∈ cl(G)} φ_C(x_C),

then X forms a Markov random field with respect to G. Here, cl(G) is the set of cliques of G. The definition is equivalent if only maximal cliques are used. The functions φ_C are sometimes referred to as factor potentials or clique potentials. Note, however, that conflicting terminology is in use: the word potential is often applied to the logarithm of φ_C. This is because, in statistical mechanics, log(φ_C) has a direct interpretation as the potential energy of a configuration x_C.
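The clique factorization above can be sketched directly in code. The following example (an illustration, not from the original article) uses a hypothetical 4-node graph consisting of a triangle {0, 1, 2} plus the edge {2, 3}; the clique potential `phi` is a toy choice that rewards agreement among a clique's binary variables, and any strictly positive function would serve equally well.

```python
from itertools import product

# Hypothetical 4-node graph: a triangle {0, 1, 2} plus the edge {2, 3}.
# These are its maximal cliques.
cliques = [(0, 1, 2), (2, 3)]

def phi(clique, values):
    """Toy clique potential: rewards agreement among the clique's
    binary variables. Any strictly positive function would do."""
    return 2.0 if len(set(values)) == 1 else 1.0

def unnormalized_p(x):
    """Product of clique potentials: the Gibbs factorization
    P(X = x) proportional to prod_C phi_C(x_C)."""
    p = 1.0
    for c in cliques:
        p *= phi(c, tuple(x[v] for v in c))
    return p

# Normalizing constant Z sums over all 2^4 binary configurations.
Z = sum(unnormalized_p(x) for x in product([0, 1], repeat=4))

def p(x):
    """Normalized joint probability of configuration x."""
    return unnormalized_p(x) / Z
```

Because the potentials are strictly positive, the resulting distribution is a Gibbs random field, and by the Hammersley–Clifford theorem it satisfies the Markov properties with respect to the graph: for instance, X_3 is conditionally independent of X_0 and X_1 given its sole neighbour X_2.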
