Deterministic learning rules for Boltzmann machines

Authors
Journal
Neural Networks
0893-6080
Publisher
Elsevier
Publication Date
Volume
8
Issue
4
Identifiers
DOI: 10.1016/0893-6080(94)00112-y
Keywords
  • Mathematical And Computational Analysis
Disciplines
  • Biology
  • Computer Science
  • Engineering

Abstract

It is shown that by introducing lateral inhibition in Boltzmann machines (BMs), hybrid architectures involving different computational principles, such as feedforward mapping, unsupervised learning, and associative memory, can be modeled and analysed. This is of great advantage for gaining a better understanding of the capability of the Boltzmann machine and for the study of hybrid architectures in the context of neurobiology as well as in engineering. Analytic learning rules can be derived for these networks that allow for fast simulation on sequential machines. As a result, time-consuming Glauber dynamics need not be invoked to calculate the learning rule. Two examples of how lateral inhibition in the BM leads to fast learning rules are considered in detail: Boltzmann perceptrons (BP) and radial basis Boltzmann machines (RBBM). BPs are shown to be universal classifiers. The main differences between BPs and MLPs are indicated. For RBBMs, it is shown that noise in the system controls an interesting symmetry-breaking pattern that leads to specialization of hidden units.
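To illustrate the kind of analytic rule the abstract refers to, here is a minimal sketch of a Boltzmann-perceptron-style classifier. It assumes (as an illustrative simplification, not the paper's exact model) that lateral inhibition restricts the output units to one-hot configurations, so the class posterior reduces to a closed-form softmax over per-class negative free energies and no Glauber (Gibbs) sampling is needed. The names `W`, `b`, and `beta` are hypothetical.

```python
import numpy as np

def class_posterior(x, W, b, beta=1.0):
    """Analytic class posterior of a lateral-inhibition (one-hot output)
    Boltzmann machine: P(class k | x) = softmax(beta * (W[k] @ x + b[k])).

    W: (K, D) weights from inputs to the K mutually inhibiting class units.
    b: (K,) class-unit biases.  beta: inverse temperature (noise level).
    """
    scores = beta * (W @ x + b)   # negative free energy of each one-hot state
    scores -= scores.max()        # shift for numerical stability
    p = np.exp(scores)
    return p / p.sum()
```

Because the posterior is available in closed form, the gradient of the log-likelihood can be written down directly, which is what makes simulation on sequential machines fast compared with sampling-based BM training.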
