## Neural Computation

In a recent paper (Gelenbe 1989) we introduced a new neural network model, called the Random Network, in which “negative” or “positive” signals circulate, modeling inhibitory and excitatory signals. These signals can arrive either from other neurons or from the outside world: they are summed at the input of each neuron and constitute its signal potential. The state of each neuron in this model is its signal potential, while the network state is the vector of signal potentials at all neurons. If its signal potential is positive, a neuron fires and sends signals to the other neurons of the network or to the outside world; as it does so, its signal potential is depleted. We have shown (Gelenbe 1989) that in the Markovian case this model has product form; that is, the steady-state probability distribution of its potential vector is the product of the marginal probabilities of the potential at each neuron.

The signal flow equations of the network, which describe the rate at which positive or negative signals arrive at each neuron, are nonlinear, so the existence and uniqueness of their solutions are not easily established except in the case of feedforward (or backpropagation) networks (Gelenbe 1989). In this paper we show that whenever a solution to these signal flow equations exists, it is unique. We then examine two subclasses of networks, balanced and damped networks, and obtain stability conditions in each case. In practical terms, these stability conditions guarantee that the unique solution of the signal flow equations can be found and, therefore, that the network has a well-defined steady-state behavior.
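As a minimal illustrative sketch, the signal flow equations of this model are commonly written as a fixed point q_i = λ⁺_i / (r_i + λ⁻_i), where λ⁺_i and λ⁻_i are the total positive and negative signal arrival rates at neuron i, composed of exogenous rates plus signals fired by excited neurons. The function names and the two-neuron rate values below are assumptions chosen for illustration, not taken from the paper:

```python
# Illustrative fixed-point iteration for the Random Network signal flow
# equations: q_i = lambda_plus_i / (r_i + lambda_minus_i).
# All names and numeric rates here are assumptions for the sketch.

def solve_flow(Lam, lam, r, Pplus, Pminus, iters=500):
    """Iterate the signal flow equations to a fixed point.

    Lam[i]       : exogenous positive-signal arrival rate at neuron i
    lam[i]       : exogenous negative-signal arrival rate at neuron i
    r[i]         : firing rate of neuron i
    Pplus[j][i]  : probability a signal fired by j reaches i as positive
    Pminus[j][i] : probability a signal fired by j reaches i as negative
    Returns q, where q[i] is the stationary probability that neuron i
    is excited (potential > 0); q[i] < 1 indicates stability.
    """
    n = len(r)
    q = [0.0] * n
    for _ in range(iters):
        for i in range(n):
            # Total positive and negative arrival rates at neuron i.
            lp = Lam[i] + sum(q[j] * r[j] * Pplus[j][i] for j in range(n))
            lm = lam[i] + sum(q[j] * r[j] * Pminus[j][i] for j in range(n))
            q[i] = lp / (r[i] + lm)
    return q

# Toy 2-neuron example: neuron 0 excites neuron 1, neuron 1 inhibits neuron 0.
Lam = [1.0, 0.0]
lam = [0.0, 0.0]
r = [2.0, 2.0]
Pplus = [[0.0, 1.0], [0.0, 0.0]]
Pminus = [[0.0, 0.0], [1.0, 0.0]]
q = solve_flow(Lam, lam, r, Pplus, Pminus)
```

When every q_i lies strictly below 1, the product-form result above gives the steady-state probability of the potential vector as a product of geometric marginals in the q_i; when some q_i reaches 1, the corresponding neuron saturates, which is the kind of behavior the stability conditions rule out.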