Algorithmica (1991) 6: 466-478
© 1991 Springer-Verlag New York Inc.

Stochastic Neural Networks
The first purpose of this paper is to present a class of algorithms for finding the global minimum of a continuous-variable function defined on a hypercube. These algorithms, based on both diffusion processes and simulated annealing, are implementable as analog integrated circuits. Such circuits can be viewed as generalizations of neural networks of the Hopfield type, and are called "diffusion machines." Our second objective is to show that "learning" in these networks can be achieved by a set of three interconnected diffusion machines: one that learns, one to model the desired behavior, and one to compute the weight changes.

Key Words. Neural network, Simulated annealing, Diffusion.

1. Hopfield Networks. It is well known [1], [2] that a neural network can be used to compute a local minimum of a function $E(x)$ defined on a hypercube $[0, 1]^n$ as follows. Let $v_i(t)$ be the state at node $i$ at time $t$ and set

(1.1)  $v_i(t) = g(u_i(t))$,

(1.2)  $\dfrac{du_i(t)}{dt} = -E_i(v(t))$,

where $E_i(v) = (\partial/\partial v_i)E(v)$ and $g$ is an increasing function. The special case of a quadratic function

(1.3)  $E(v) = -\tfrac{1}{2}\sum_{i,j} w_{ij} v_i v_j - \sum_i \theta_i v_i$,

where we assume $w_{ij} = w_{ji}$, results in

(1.4)  $E_i(v) = -\sum_j w_{ij} v_j - \theta_i$,

which is particularly well suited for realization as an analog integrated circuit. As Hopfield and Tank [3] and others (e.g., [4]) have shown, a variety of computational problems of considerable complexity can be reduced to computing the global minimum of a quadratic function. With current technology, a network with several hundred nodes and with the dynamics given by (1.1)-(1.3) can
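To make the dynamics concrete, the following is a minimal numerical sketch (not part of the paper) that Euler-integrates (1.1)-(1.2) for the quadratic energy (1.3). The network size, the logistic choice of $g$, the random symmetric weights, and the step size are all illustrative assumptions; the paper itself envisions an analog-circuit realization rather than a digital simulation.

```python
# Sketch: Euler integration of the Hopfield dynamics (1.1)-(1.4) for a
# quadratic energy E(v) = -1/2 v'Wv - theta'v. All parameter choices
# below (n, dt, steps, logistic g) are illustrative assumptions.
import numpy as np

def hopfield_descent(W, theta, u0, dt=0.01, steps=5000):
    """Integrate du_i/dt = -E_i(v), v_i = g(u_i), and return the final state v."""
    g = lambda u: 1.0 / (1.0 + np.exp(-u))   # an increasing g (logistic), eq. (1.1)
    u = u0.copy()
    for _ in range(steps):
        v = g(u)
        grad = -(W @ v) - theta              # E_i(v) for the quadratic E, eq. (1.4)
        u -= dt * grad                       # du_i/dt = -E_i(v), eq. (1.2)
    return g(u)

rng = np.random.default_rng(0)
n = 8
A = rng.standard_normal((n, n))
W = (A + A.T) / 2                            # symmetric weights, w_ij = w_ji
theta = rng.standard_normal(n)
v = hopfield_descent(W, theta, u0=np.zeros(n))
E = -0.5 * v @ W @ v - theta @ v             # energy E(v), eq. (1.3)
print("final state:", np.round(v, 3), " energy:", round(E, 3))
```

Because $g$ is increasing, $dE/dt = \sum_i E_i(v)\, g'(u_i)\, du_i/dt = -\sum_i g'(u_i) E_i(v)^2 \le 0$, so this descent converges to a local (not necessarily global) minimum of $E$; escaping such minima is what motivates the stochastic machinery developed later in the paper.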