
The Boltzmann Machine is a stochastic artificial neural network and a generative model. It has been a foundational model in machine learning and has inspired many later advances in deep learning.
Definition and Energy Function
The Boltzmann Machine consists of nodes, or neurons, that represent binary variables. The energy of a configuration in a Boltzmann Machine is given by:
$$\mathcal{H}(\sigma) = -\sum_{i} h_i \sigma_i - \sum_{i<j} w_{ij} \sigma_i \sigma_j$$
where:
- $\sigma_i = \pm1$ is the state of node $i$.
- $h_i$ is the bias of node $i$.
- $w_{ij}$ is the connection weight (coupling) between nodes $i$ and $j$.
The probability of a configuration follows the Boltzmann distribution:
$$p(\sigma) = \frac{1}{Z} e^{-\beta \mathcal{H}(\sigma)}$$
with $Z = \sum_{\sigma} e^{-\beta \mathcal{H}(\sigma)}$ being the partition function, which normalises the distribution over all configurations.
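For a small system the energy, partition function, and Boltzmann probabilities can be computed by exhaustive enumeration. The sketch below uses three spins with arbitrary (hypothetical) biases and couplings, purely to illustrate the definitions above:

```python
import itertools
import math

import numpy as np

# Hypothetical parameters for a 3-node machine: biases h and a
# symmetric coupling matrix w (diagonal unused).
n = 3
h = np.array([0.5, -0.2, 0.1])
w = np.array([[0.0, 1.0, -0.5],
              [1.0, 0.0, 0.3],
              [-0.5, 0.3, 0.0]])
beta = 1.0  # inverse temperature

def energy(sigma):
    """H(sigma) = -sum_i h_i sigma_i - sum_{i<j} w_ij sigma_i sigma_j."""
    pair = sum(w[i, j] * sigma[i] * sigma[j]
               for i in range(n) for j in range(i + 1, n))
    return -h @ sigma - pair

# Enumerate all 2^n configurations sigma in {-1, +1}^n.
configs = [np.array(s) for s in itertools.product([-1, 1], repeat=n)]

# Partition function Z and Boltzmann probabilities p(sigma).
Z = sum(math.exp(-beta * energy(s)) for s in configs)
probs = [math.exp(-beta * energy(s)) / Z for s in configs]
```

Because every configuration is enumerated, the probabilities sum to one exactly (up to floating-point error); for large $n$ this brute-force approach is infeasible, which is why sampling methods are used in practice.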
Boltzmann Machine Architecture
The architecture of a Boltzmann Machine is fully connected, meaning every node is connected to every other node. The connections carry weights, and each node has an associated bias. The model was introduced by Ackley, Hinton, and Sejnowski in 1985 and has been instrumental in the development of deep learning.
Restricted Boltzmann Machine (RBM)
A variant of the Boltzmann Machine, the Restricted Boltzmann Machine (RBM), has a specific architecture that restricts connections between nodes within the same layer. The RBM consists of two layers: a visible layer receiving the input data and a hidden layer for internal representation.
The free energy of RBMs can be calculated, and thermodynamic quantities related to learning can be derived. The RBM has been analyzed using statistical mechanics, providing insights into learning and data representation.
Learning proceeds by maximizing the log-likelihood of the data, and powerful physics-inspired algorithms for training RBMs have been developed, contributing substantially to the field of machine learning.
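The most widely used such algorithm is contrastive divergence (CD), which approximates the log-likelihood gradient with a short Gibbs chain. The sketch below shows one CD-1 update for a binary RBM; the function names, learning rate, and parameter shapes are illustrative assumptions, not a reference implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, W, a, b, lr=0.1, rng=None):
    """One contrastive-divergence (CD-1) update for a binary RBM.

    v0 : data batch of visible vectors, shape (batch, n_vis)
    W  : weights, shape (n_vis, n_hid); a, b : visible/hidden biases.
    Returns the updated (W, a, b) in place.
    """
    rng = rng or np.random.default_rng(0)
    # Positive phase: hidden activations driven by the data.
    ph0 = sigmoid(v0 @ W + b)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # Negative phase: one Gibbs step back to the visible layer.
    pv1 = sigmoid(h0 @ W.T + a)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + b)
    # Approximate log-likelihood gradient: data term minus model term.
    batch = v0.shape[0]
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / batch
    a += lr * (v0 - v1).mean(axis=0)
    b += lr * (ph0 - ph1).mean(axis=0)
    return W, a, b
```

Running the chain for $k > 1$ Gibbs steps (CD-$k$) gives a better gradient estimate at higher cost; CD-1 is the common default.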

