High Capacity Neural Network Associative Memories
One of the most attractive features of associative memories (AM) is their
ability of associative recall, especially recall from incomplete or noisy
inputs. However, most existing neural network AM models suffer
from limited storage capacities. Due to the increased representational
capacity brought in by the hidden nodes, backpropagation (BP) networks
(multi-layer feedforward networks constructed by BP learning) are able to
associate more pattern pairs than the network size if these pairs are used
as learning samples. However, the conventional use of BP networks, with
single passes of forward computing for associative recall, gives these
networks only very limited noise resistance capability.
This research project is aimed at developing a new class of AM which combines
the relaxation dynamics of the recall mechanism of traditional Hopfield
AM models and the representational power of BP networks. The resulting
networks may have significantly increased storage capacity (up to 2^n n-bit
binary patterns can be stored in a network of size O(n), according to
recent experiments) yet at the same time maintain a high level of noise
resistance capability. This project may also deepen our understanding of the
mechanism underlying BP networks.
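The relaxation-style recall described above can be sketched as follows. This is a minimal illustration, not the project's actual method: a trained feedforward network would stand in for the toy one-layer sign network used here, and the function name relax_recall is invented for this example. The essential idea is the loop, which feeds the network's output back as its next input until a fixed point is reached, instead of stopping after a single forward pass.

```python
import numpy as np

def relax_recall(forward, x0, max_iters=50):
    """Iterate the network map until the state stops changing (a fixed point)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iters):
        y = forward(x)
        if np.array_equal(y, x):
            break  # reached a stable state: recall is complete
        x = y
    return x

# Toy stand-in network: Hebbian outer-product weights storing one pattern.
# (A BP-trained multi-layer network would replace this in the actual model.)
pattern = np.array([1, -1, 1, 1, -1], dtype=float)
W = np.outer(pattern, pattern)

def forward(x):
    # Bipolar sign activation; np.where avoids sign(0) = 0 dead states.
    return np.where(W @ x >= 0, 1.0, -1.0)

# Recall from a noisy probe with two bits flipped.
noisy = pattern.copy()
noisy[0] *= -1
noisy[3] *= -1
recalled = relax_recall(forward, noisy)  # converges back to the stored pattern
```

A single forward pass may already land near the stored pattern, but the loop guarantees the returned state is self-consistent under the network map, which is what gives relaxation-based recall its noise tolerance.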
- Peng Y, Zhou Z and McLenney E: "Relaxing Backpropagation Networks as
Associative Memories", to be presented at the IEEE International Conference
on Neural Networks, Perth, Australia, Nov. 27 - Dec. 1, 1995.
- Peng Y and Zhou Z: "Turning Backpropagation Networks into High-Capacity
Associative Memories", in Proceedings of the World Congress on Neural
Networks, San Diego, CA, Sept. 15-18, 1996, 743-748.
FOR MORE INFORMATION
Contact Yun Peng, firstname.lastname@example.org.