Part of Advances in Neural Information Processing Systems 7 (NIPS 1994)
Nicol Schraudolph, Terrence J. Sejnowski
Differentiation between the nodes of a competitive learning network is conventionally achieved through competition on the basis of neural activity. Simple inhibitory mechanisms are limited to sparse representations, while decorrelation and factorization schemes that support distributed representations are computationally unattractive. By letting neural plasticity mediate the competitive interaction instead, we obtain diffuse, nonadaptive alternatives for fully distributed representations. We use this technique to simplify and improve our binary information gain optimization algorithm for feature extraction (Schraudolph and Sejnowski, 1993); the same approach could be used to improve other learning algorithms.
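To make the abstract's central idea concrete, the following is a minimal, hypothetical sketch of competition mediated by plasticity rather than activity: each logistic output unit moves its response away from a running estimate of its own mean output, and the step is shared among units in proportion to their relative plasticity (the local slope of the sigmoid). The update form, the running-average estimator, the normalization, and all parameter values are illustrative assumptions, not the published algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 16, 4            # input and output dimensions (assumed)
W = rng.normal(scale=0.1, size=(n_out, n_in))
y_bar = np.full(n_out, 0.5)    # running estimate of each unit's mean output
eta, tau = 0.05, 0.01          # learning rate and estimator time constant (assumed)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def step(x):
    """One plasticity-gated update for a single input vector x."""
    global y_bar
    y = sigmoid(W @ x)
    plasticity = y * (1.0 - y)               # sigmoid slope: large only where a unit is undecided
    share = plasticity / (plasticity.sum() + 1e-12)
    # Competition acts through plasticity, not activity: the most plastic unit
    # receives the largest share of the update for this input.
    delta = (y - y_bar) * plasticity * share
    W += eta * np.outer(delta, x)
    y_bar += tau * (y - y_bar)               # track each unit's average response
    return y

for _ in range(1000):
    step(rng.normal(size=n_in))
```

Because the gating factor is nonadaptive and computed locally from each unit's own operating point, no lateral inhibitory weights or decorrelation machinery are needed, which is the computational advantage the abstract points to.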