Markov random field (MRF)¶

These are essentially graphical models where we do not define edge orientations. This is useful in cases where conditional independence is not straightforward to express with directed edges (e.g. spatial models).

Formal Definition¶

A Markov random field is a probability distribution \(p\) over variables \(x_1, \cdots, x_n\) defined by an undirected graph \(G\) whose nodes correspond to the variables \(x_i\):

\[\begin{split} p(x_1, \cdots, x_n) = \frac{1}{Z} \prod_{c \in C} \phi_c(x_c) \\ Z = \sum_{x_1, \cdots, x_n}\prod_{c \in C} \phi_c(x_c) \end{split}\]

Here the product ranges over the cliques \(C\) of \(G\), the \(\phi_c\) are non-negative potential functions, and \(Z\) is the partition function that normalizes the distribution. Any distribution of this form satisfies the conditional independence properties encoded by \(G\).
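As a minimal sketch of the definition above, the snippet below builds a 3-node binary chain MRF from pairwise potentials (the values of `phi` are arbitrary, illustrative choices) and computes the partition function \(Z\) by brute-force enumeration:

```python
import itertools

# Hypothetical pairwise potential for a 3-node binary chain x1 - x2 - x3.
# phi[a][b] must be non-negative; these values favour equal neighbouring states.
phi = [[4.0, 1.0],
       [1.0, 4.0]]

def unnormalized(x1, x2, x3):
    # Product of clique potentials, one per edge (the maximal cliques of a chain).
    return phi[x1][x2] * phi[x2][x3]

# Partition function: sum the unnormalized product over all joint configurations.
Z = sum(unnormalized(*x) for x in itertools.product([0, 1], repeat=3))

def p(x1, x2, x3):
    return unnormalized(x1, x2, x3) / Z

# Probabilities sum to one by construction.
total = sum(p(*x) for x in itertools.product([0, 1], repeat=3))
print(Z, total)
```

The explicit sum over all \(2^3\) configurations is exactly what makes \(Z\) intractable for large models: the number of terms grows exponentially with the number of variables.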

Examples¶

  • Ising model

  • Hopfield model

  • Potts model
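To make the first example concrete, here is a sketch of a tiny Ising model on a 2×2 grid (coupling `J` and field `h` are illustrative parameters): each spin takes values in \(\{-1, +1\}\) and the energy sums pairwise interactions over the grid edges.

```python
import itertools

# Hypothetical 2x2 Ising model: spins s_i in {-1, +1}, coupling J, field h.
J, h = 1.0, 0.0

# Edges of the 2x2 grid (nodes numbered 0 1 / 2 3).
edges = [(0, 1), (2, 3), (0, 2), (1, 3)]

def energy(s):
    # Ising energy: E(s) = -J * sum_edges s_i s_j - h * sum_i s_i
    return -J * sum(s[i] * s[j] for i, j in edges) - h * sum(s)

# With J > 0 (ferromagnetic), the all-aligned configurations minimise the energy.
best = min(itertools.product([-1, 1], repeat=4), key=energy)
print(best, energy(best))
```

With `J > 0` both all-up and all-down configurations attain the minimum energy of \(-4\), which is why ferromagnetic Ising models favour aligned neighbours.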

Learning¶

ML and MAP estimation in MRFs is computationally expensive, because the gradient of the log-likelihood requires inference in the model (computing expectations under the current parameters, which involves the partition function). For this reason it is rare to perform full Bayesian inference over the parameters of an MRF; in practice, parameters are fit with gradient-based optimization algorithms.
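A minimal sketch of gradient-based ML fitting, assuming a log-linear 2-node binary MRF with a single illustrative parameter `theta` and agreement feature `f` (all names are hypothetical): the log-likelihood gradient is the empirical feature expectation minus the model's, and the model expectation is computed here by exact enumeration, which is the step that becomes intractable for large graphs.

```python
import itertools
import math

# Toy 2-node binary MRF: phi(x1, x2) = exp(theta * f(x1, x2)).

def f(x1, x2):
    # Sufficient statistic: indicator that the two variables agree.
    return 1.0 if x1 == x2 else 0.0

def log_Z(theta):
    # Exact partition function by enumeration (only feasible for tiny models).
    return math.log(sum(math.exp(theta * f(*x))
                        for x in itertools.product([0, 1], repeat=2)))

def model_expectation(theta):
    lZ = log_Z(theta)
    return sum(f(*x) * math.exp(theta * f(*x) - lZ)
               for x in itertools.product([0, 1], repeat=2))

# Toy data: mostly agreeing pairs, so the empirical expectation of f is 0.75.
data = [(0, 0), (1, 1), (0, 0), (1, 0)]
empirical = sum(f(*x) for x in data) / len(data)

# Gradient ascent: d/dtheta log p(data) = E_data[f] - E_model[f].
theta = 0.0
for _ in range(500):
    theta += 0.5 * (empirical - model_expectation(theta))

print(round(theta, 3), round(model_expectation(theta), 3))
```

At convergence the model expectation matches the empirical one (0.75), which here gives \(\theta = \log 3\); replacing exact enumeration with approximate inference (e.g. sampling) is what real MRF learning algorithms do.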

Statistical Physics¶

The Gibbs distribution converts an energy function into an undirected graphical model (UGM):

\[ \psi_c(y_c \mid \theta_c) = \exp(-E(y_c \mid \theta_c)) \]

This forms the basis of energy-based models: high-probability states correspond to low-energy configurations.
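A tiny sketch of this correspondence, using arbitrary illustrative energies for three states: applying \(\exp(-E)\) and normalizing yields a distribution in which the lowest-energy state is the most probable.

```python
import math

# Arbitrary energies for three states (illustrative values only).
energies = {"a": 0.0, "b": 1.0, "c": 3.0}

# Gibbs distribution: p(y) = exp(-E(y)) / Z.
Z = sum(math.exp(-E) for E in energies.values())
probs = {y: math.exp(-E) / Z for y, E in energies.items()}

# The lowest-energy state has the highest probability.
print(max(probs, key=probs.get))
```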

Bayesian networks are a special case of MRFs¶

Bayesian networks are MRFs whose clique factors are conditional probability distributions. Because each factor is already normalized, the partition function is \(Z = 1\), and the factors imply a directed acyclic structure on the graph.
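As a sketch (with illustrative numbers), viewing the Bayesian network \(p(a)\,p(b \mid a)\) as an MRF whose factors are the CPDs themselves gives a partition function of exactly 1, since each factor already sums to one over its child variable:

```python
import itertools

# Illustrative CPDs for a two-node Bayesian network a -> b.
p_a = {0: 0.3, 1: 0.7}
p_b_given_a = {0: {0: 0.9, 1: 0.1},
               1: {0: 0.2, 1: 0.8}}

# MRF view: phi1(a) = p(a), phi2(a, b) = p(b | a).
# Partition function: sum of the factor product over all configurations.
Z = sum(p_a[a] * p_b_given_a[a][b]
        for a, b in itertools.product([0, 1], repeat=2))
print(Z)  # sums to 1 (up to floating point)
```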

Factor graphs¶

A factor graph makes the factorization of an MRF explicit: it is a bipartite graph with one node per variable and one node per factor, where edges connect each factor to exactly the variables it depends on.
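A minimal sketch of this bipartite representation for a chain MRF \(p(x_1, x_2, x_3) \propto \phi_a(x_1, x_2)\,\phi_b(x_2, x_3)\) (factor and variable names are illustrative):

```python
# Factor graph for phi_a(x1, x2) * phi_b(x2, x3): variables on one side,
# factors on the other, edges only between a factor and its own variables.
variables = ["x1", "x2", "x3"]
factors = {"phi_a": ["x1", "x2"],   # each factor lists its scope
           "phi_b": ["x2", "x3"]}

edges = [(f, v) for f, scope in factors.items() for v in scope]
print(edges)
```

Note there are never variable-variable or factor-factor edges; this explicit separation is what message-passing algorithms like sum-product operate on.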