Part of Advances in Neural Information Processing Systems 5 (NIPS 1992)
Isaac Meilijson, Eytan Ruppin, Moshe Sipper
We analyze in detail the performance of a Hamming network classifying inputs that are distorted versions of one of its m stored memory patterns. The activation function of the memory neurons in the original Hamming network is replaced by a simple threshold function. The resulting Threshold Hamming Network (THN) correctly classifies the input pattern, with probability approaching 1, using only O(m ln m) connections, in a single iteration. The THN drastically reduces the time and space complexity of Hamming Network classifiers.
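The following is a minimal sketch of the idea described above, not the paper's construction: each memory neuron receives the input in a single feed-forward pass, computes its overlap with a stored pattern, and fires if the overlap exceeds a fixed threshold. The pattern length n, number of memories m, distortion level flip_prob, and the threshold value are illustrative assumptions, and the sketch uses full connectivity rather than the O(ln m) connections per neuron analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m = 200, 10          # pattern length and number of stored memories (illustrative)
flip_prob = 0.1         # probability that each input bit is flipped (distortion level)

memories = rng.choice([-1, 1], size=(m, n))   # stored +/-1 memory patterns

# Build the input as a distorted copy of one stored memory.
true_idx = 3
noise = rng.choice([-1, 1], size=n, p=[flip_prob, 1 - flip_prob])
x = memories[true_idx] * noise

# Single feed-forward pass: each memory neuron sums its connections to the
# input and compares the sum to a fixed threshold (no winner-take-all layer).
similarities = memories @ x                    # equals n minus twice the Hamming distance
threshold = n * (1 - 2 * flip_prob) / 2        # assumed cutoff between the correct memory and the rest

fired = np.flatnonzero(similarities > threshold)
print("neurons above threshold:", fired)       # ideally exactly [true_idx]
```

With these parameters the correct memory's expected overlap is n(1 - 2p) = 160, while an unrelated memory's overlap is near 0 with standard deviation about sqrt(n), so a threshold halfway between the two separates them with high probability, which is the regime the abstract refers to.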