Quantitative analyses of associative memories
Neural networks have been studied for many years in the hope of simulating human-like capabilities such as recognizing a friend in a picture. Associative memories are systems that recall stored data when presented with all, or only a portion, of a probe that has been associated or paired with that data. Most prior research derived system performance under the assumption that neuron states are equiprobable; only a few studies have considered the non-uniform case. In this dissertation, we quantitatively analyze the characteristics of a variety of sparsely encoded associative memories. Based on the observation that each neuron operates close to its threshold, a dynamic thresholding scheme is proposed. Under this dynamic scheme, the first-order sparsely encoded associative memory is shown to achieve a larger storage capacity than an ordinary associative memory. The sensitivity of storage capacity to variations in the threshold is calculated to quantify its effect on capacity. Information capacity is also investigated in order to choose the optimum activity rate. These results are then extended to higher-order systems, where properties such as storage and information capacities are again derived to evaluate system performance. Other contributions include: (1) an analysis of the retrieval of stored patterns in a noisy environment, and (2) a fault-tolerance analysis of associative memories, in which both neuron and connection fault models are analyzed in detail. Simulation results are shown to be consistent with the theoretical work.
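To make the ideas concrete, here is a minimal sketch of a sparsely encoded binary associative memory with a dynamic threshold, in the style of a Willshaw network. This is an illustrative assumption on my part, not the dissertation's exact model: patterns with exactly `k` of `n` bits active are stored by clipped Hebbian outer products, and at recall each neuron's threshold is set dynamically to the probe's current activity level rather than to a fixed value.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 256            # number of neurons
k = 8              # active bits per pattern (sparse encoding: k << n)
num_patterns = 20  # how many patterns to store

# Generate sparse binary patterns: exactly k of n bits set in each.
patterns = np.zeros((num_patterns, n), dtype=np.uint8)
for p in patterns:
    p[rng.choice(n, size=k, replace=False)] = 1

# Clipped Hebbian (outer-product) storage: a synapse is 1 if its two
# neurons were ever co-active in any stored pattern.
W = np.zeros((n, n), dtype=np.uint8)
for p in patterns:
    W |= np.outer(p, p)

def recall(probe):
    """One synchronous update with a dynamic threshold: the firing
    threshold tracks the probe's activity, so every neuron operates
    close to its threshold regardless of how degraded the cue is."""
    s = W @ probe.astype(np.int64)  # dendritic input sums
    theta = int(probe.sum())        # dynamic threshold = probe activity
    return (s >= theta).astype(np.uint8)

# Probe with a partial cue: erase half of one pattern's active bits.
cue = patterns[0].copy()
on = np.flatnonzero(cue)
cue[on[: k // 2]] = 0

out = recall(cue)  # recovers all active bits of patterns[0]
```

Because every surviving probe bit is connected to every neuron of the stored pattern, each of that pattern's neurons receives input exactly equal to the dynamic threshold and fires; a fixed threshold tuned for full probes would instead silence them. Spurious extra bits can still appear, which is where the activity-rate and capacity trade-offs analyzed in the dissertation come in.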