MCMC by the Metropolis-Hastings Algorithm: An Intuitive Illustration
How samples are generated by MCMC is the main question of interest.
Imagine we have a Uniform(0, 5) distribution as the proposal (dashed line)
and a target with a zigzag pattern (Supplementary Figure 4). To show how
the transition matrix is estimated, we considered a discrete target
distribution with two states. Therefore, a 2×2 transition matrix is needed
whose elements are the probabilities of moving between states. More
explicitly, four possible transitions can occur when generating samples:
if the first sample is generated in state 1, what are the probabilities
that the next sample is in state 1 or state 2, and likewise when the first
sample is in state 2. According to the proposal, the probabilities of
being in state 1 and state 2 are \(\frac{1}{5}\) and \(\frac{4}{5}\),
respectively.
Therefore, the estimated transition matrix is
\(P = \begin{bmatrix}\frac{13}{15} & \frac{1}{5}\\ \frac{2}{15} & \frac{4}{5}\\ \end{bmatrix}\)
(Table 1), where the element in row \(i\) and column \(j\) is the
probability of moving to state \(i\) given that the chain is currently in
state \(j\). If the process of sampling is repeated \(n\) times, then by
the theory of Markov chains the \(n\)-step transition matrix converges, as
\(n \to \infty\), to
\(P^{n} \to \frac{1}{\alpha + p}\begin{bmatrix}\alpha & \alpha\\ p & p\\ \end{bmatrix} = \begin{bmatrix}\frac{3}{5} & \frac{3}{5}\\ \frac{2}{5} & \frac{2}{5}\\ \end{bmatrix}\),
where \(p = \frac{2}{15}\) and \(\alpha = \frac{1}{5}\) are the
off-diagonal elements of \(P\). It can be shown that this sampling
mechanism converges to a stationary form that represents the target
distribution (Supplementary Figure 4).
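As a numerical check on this example, the short Python sketch below (not part of the original article) builds the 2×2 Metropolis-Hastings transition matrix from the Uniform(0, 5) proposal probabilities and target probabilities of 3/5 and 2/5 (read off the stationary matrix above), then raises it to a large power with NumPy to show the convergence to the stationary form.

```python
import numpy as np

# Assumed from the example: proposal mass on each state under Uniform(0, 5)
q = np.array([1/5, 4/5])        # P(propose state 1), P(propose state 2)
# Target probabilities implied by the stationary matrix [[3/5, 3/5], [2/5, 2/5]]
pi = np.array([3/5, 2/5])

# Column-stochastic transition matrix: P[i, j] = P(next state = i | current = j)
P = np.zeros((2, 2))
for j in range(2):               # current state j
    for i in range(2):           # proposed state i
        if i != j:
            # Metropolis-Hastings acceptance probability for an independence proposal
            accept = min(1.0, (pi[i] * q[j]) / (pi[j] * q[i]))
            P[i, j] = q[i] * accept
    P[j, j] = 1.0 - P[1 - j, j]  # probability of staying in the current state

print(P)                              # [[13/15, 1/5], [2/15, 4/5]]
print(np.linalg.matrix_power(P, 50))  # both columns approach [3/5, 2/5]
```

Raising P to a moderate power already reproduces the stationary matrix to several decimal places, which is the convergence illustrated in Supplementary Figure 4.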
According to the theory of stochastic processes, from this convergence
point onwards the distribution of each generated sample no longer depends
on the earlier states of the chain (5). From the Bayesian point of view,
considering the proposal as the prior distribution and the target as prior
× likelihood, this process of sampling defines the Metropolis-Hastings
algorithm, which eventually converges to a stationary posterior
distribution.
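A minimal sampler for this two-state example, written here only as an illustration (the function name and the specific proposal and target values are choices made for this sketch, not taken from the original text), shows the empirical state frequencies settling near the stationary probabilities 3/5 and 2/5.

```python
import random

# Assumed two-state example: target (prior x likelihood, up to a constant)
pi = {1: 3/5, 2: 2/5}
# Proposal mass on each state under the Uniform(0, 5) proposal
q = {1: 1/5, 2: 4/5}

def metropolis_hastings(n_samples, start=1, seed=0):
    random.seed(seed)
    state = start
    samples = []
    for _ in range(n_samples):
        # Draw a candidate state from the (independence) proposal
        proposal = random.choices([1, 2], weights=[q[1], q[2]])[0]
        # Metropolis-Hastings acceptance probability
        accept = min(1.0, (pi[proposal] * q[state]) / (pi[state] * q[proposal]))
        if random.random() < accept:
            state = proposal       # accept the move
        samples.append(state)      # otherwise keep the current state
    return samples

samples = metropolis_hastings(100_000)
print(samples.count(1) / len(samples))   # approximately 3/5
print(samples.count(2) / len(samples))   # approximately 2/5
```

After the chain has converged, the proportion of samples in each state matches the stationary (posterior) probabilities, which is the practical sense in which the Metropolis-Hastings sampler recovers the target distribution.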