Implementation of an Associative Memory using a Restricted Hopfield Network

Authors

  • Tet Yeap

DOI:

https://doi.org/10.34257/GJREFVOL21IS2PG1

Keywords:

hopfield network, restricted boltzmann machine, energy function, lyapunov function

Abstract

A trainable analog Restricted Hopfield Network is presented in this paper. It consists of two layers of nodes, visible and hidden, connected by weighted directional paths that form a bipartite graph with no intra-layer connections. An energy (Lyapunov) function is derived to show that the proposed network converges to stable states. The network can be trained with either a modified SPSA algorithm or BPTT so that all weights remain symmetric. Simulation results show that the presence of hidden nodes increases the network's memory capacity. Using EXOR as an example, the network can be trained to act as a dynamic classifier. Using the characters A, U, T, and S as training patterns, the network was trained to serve as an associative memory. Simulation results show that the network can perfectly re-create the stored images from noisy inputs, with higher noise tolerance than the standard Hopfield Network and the Restricted Boltzmann Machine. The results also illustrate the importance of feedback iteration when an associative memory is used to re-create images from noisy inputs.
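The recall phase described in the abstract can be illustrated with a minimal sketch: visible and hidden layers exchange activity through a single symmetric weight matrix until the state settles. The tanh activation, the random placeholder weights, and the fixed iteration count below are assumptions for illustration only, not the paper's exact analog dynamics or trained parameters.

    import numpy as np

    rng = np.random.default_rng(0)
    n_visible, n_hidden = 64, 16      # e.g. an 8x8 bipolar character image (assumed size)
    # Placeholder weights; in the paper these would come from SPSA or BPTT training.
    W = rng.normal(scale=0.1, size=(n_hidden, n_visible))

    def recall(v_noisy, n_iters=20):
        """Feedback iteration: alternate visible <-> hidden updates through W and W^T."""
        v = v_noisy.copy()
        for _ in range(n_iters):
            h = np.tanh(W @ v)        # hidden layer driven by the visible nodes
            v = np.tanh(W.T @ h)      # visible layer driven back through the symmetric weights
        return np.sign(v)             # bipolar re-created pattern

    noisy = np.sign(rng.normal(size=n_visible))   # stand-in for a noisy input image
    restored = recall(noisy)

Because there are no intra-layer connections, each half-step only needs one matrix product, and the shared matrix W (used as W and its transpose) is what keeps the weights symmetric and the energy function decreasing.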

How to Cite

Tet Yeap. (2021). Implementation of an Associative Memory using a Restricted Hopfield Network. Global Journals of Research in Engineering, 21(F2), 1–18. https://doi.org/10.34257/GJREFVOL21IS2PG1

Published

2021-03-15