Noise effects and fault tolerance in Hopfield-type neural networks
Abstract
Research interest in neural networks has grown rapidly in recent years. Studies covering many aspects of neural networks, from new models, simulations, and theoretical analyses to implementations and applications, have been reported. Little research, however, has addressed noise effects and fault tolerance in neural networks. This dissertation investigates Hopfield-type neural networks (HNNs) through both numerical simulation and theoretical analysis. Computer simulations and a linear combination concept are employed to study HNNs quantitatively. A statistical method and associated models are then proposed for analyzing different aspects of HNNs under various conditions.

The contributions of this dissertation include the following. A complementary Hopfield neural network (CHNN) is presented for improving the performance of the original Hopfield model. A generalized three-layered model capable of systematically describing HNNs and their extensions, e.g., higher-order, exponential-order, and winner-take-all nets, is then proposed. A rigorous analysis of HNNs using a statistical technique clearly reveals their characteristics. Analyses and comparisons of first-order modifications, higher-order nets, and exponential-order nets are also performed. The differences between even- and odd-order nets, as well as between auto- and hetero-associative memories, are identified and discussed. Various models for implementation error and noise sources, including detector/thresholding device noise, 2-D matrix mask noise, and gain variation noise, are then proposed, and an "excess" noise concept is developed, leading to new results in the analysis of implementation noise effects in HNNs. This technique is then extended and successfully applied to the analysis of fault tolerance problems in HNNs, including synaptic interconnect faults and neuron stuck-at faults.
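To make the kind of faults studied here concrete, the following is a minimal sketch, not the dissertation's own models or statistical method: a standard Hopfield net with Hebbian (outer-product) storage and sign-threshold recall, in which a random fraction of synaptic interconnects is zeroed out and selected neurons can be clamped to a fixed value (stuck-at faults). All function names (store, recall, break_synapses) and parameter choices are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def store(patterns):
    """Hebbian (outer-product) weight matrix for bipolar (+/-1) patterns."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)               # no self-connections
    return W

def recall(W, probe, steps=20, stuck=None):
    """Synchronous sign-threshold updates; 'stuck' maps neuron index -> fixed value."""
    s = probe.astype(float).copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1.0                     # break ties toward +1
        if stuck:                           # neuron stuck-at faults
            for i, v in stuck.items():
                s[i] = v
    return s

def break_synapses(W, frac):
    """Zero out a random fraction of interconnects (synaptic faults)."""
    mask = rng.random(W.shape) >= frac
    return W * mask

# Demo: store a few random patterns, corrupt a probe, and measure recall
# quality (overlap with the stored pattern) as synapses are removed.
n, m = 100, 5
patterns = rng.choice([-1, 1], size=(m, n))
W = store(patterns)

probe = patterns[0].copy()
flip = rng.choice(n, size=10, replace=False)   # 10% input noise
probe[flip] *= -1

for frac in (0.0, 0.2, 0.4):
    Wf = break_synapses(W, frac)
    out = recall(Wf, probe)
    overlap = (out @ patterns[0]) / n
    print(f"synaptic fault rate {frac:.1f}: overlap = {overlap:+.2f}")
```

The overlap printed for increasing fault rates gives a simple numerical measure of graceful degradation; the dissertation's statistical analysis characterizes such degradation analytically rather than by simulation alone.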