
Boltzmann machine

A Boltzmann machine (also called a stochastic Hopfield network with hidden units) is a type of stochastic recurrent neural network and Markov random field. Boltzmann machines can be seen as the stochastic, generative counterpart of Hopfield networks. They were one of the first neural networks capable of learning internal representations, and they can represent and (given sufficient time) solve difficult combinatorial problems. They were invented in 1985 by Geoffrey Hinton, then a professor at Carnegie Mellon University, and Terry Sejnowski, then a professor at Johns Hopkins University.

Boltzmann machines are theoretically intriguing because of the locality and Hebbian nature of their training algorithm (they are trained by Hebb's rule), and because of their parallelism and the resemblance of their dynamics to simple physical processes. Boltzmann machines with unconstrained connectivity have not proven useful for practical problems in machine learning or inference, but if the connectivity is properly constrained, the learning can be made efficient enough to be useful in practice. They are named after the Boltzmann distribution in statistical mechanics, which is used in their sampling function; this is also why they are called energy-based models (EBMs).

A Boltzmann machine, like a Hopfield network, is a network of units with an "energy" (Hamiltonian) defined for the overall network. Its units produce binary results. Unlike Hopfield nets, Boltzmann machine units are stochastic. The global energy E in a Boltzmann machine is identical in form to that of a Hopfield network as well as the Ising model:

E = -\left( \sum_{i<j} w_{ij}\, s_i\, s_j + \sum_i \theta_i\, s_i \right)

where w_{ij} is the connection strength between unit j and unit i, s_i \in \{0, 1\} is the state of unit i, and \theta_i is the bias of unit i in the global energy function.
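As an illustration, here is a minimal NumPy sketch of this energy function together with the standard stochastic unit update it implies: a unit i is resampled to be on with probability sigma(Delta E_i / T), where Delta E_i = sum_j w_ij s_j + theta_i is the energy gap between the unit being off and on, and T is the temperature. The function names (`energy`, `gibbs_step`) and the toy parameters are illustrative assumptions, not part of any particular library.

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(s, W, theta):
    # Global energy E = -(sum_{i<j} w_ij s_i s_j + sum_i theta_i s_i).
    # With W symmetric and zero on the diagonal, the sum over i<j
    # equals 0.5 * s @ W @ s.
    return -(0.5 * s @ W @ s + theta @ s)

def gibbs_step(s, W, theta, T=1.0):
    # Pick one unit at random and resample its binary state: the unit
    # turns on with probability sigmoid(gap / T), where
    # gap = sum_j w_ij s_j + theta_i is the energy difference between
    # the unit being off and being on.
    i = rng.integers(len(s))
    gap = W[i] @ s + theta[i]
    p_on = 1.0 / (1.0 + np.exp(-gap / T))
    s[i] = 1 if rng.random() < p_on else 0
    return s

# Tiny example: five fully connected units with random symmetric weights.
n = 5
W = rng.normal(size=(n, n))
W = (W + W.T) / 2.0          # symmetric weights, as in a Hopfield net
np.fill_diagonal(W, 0.0)     # no self-connections
theta = rng.normal(size=n)
s = rng.integers(0, 2, size=n)

for _ in range(200):         # run the network toward equilibrium
    s = gibbs_step(s, W, theta, T=1.0)
print("state:", s, "energy:", energy(s, W, theta))
```

At high temperature the updates are nearly random; gradually lowering T makes the network settle into low-energy states, which is how Boltzmann machines are applied to combinatorial problems via simulated annealing.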

[ "Artificial neural network", "Deep learning", "Convolutional Deep Belief Networks", "contrastive divergence", "restrict boltzmann machine" ]