Logical Content in the Recurrent Hopfield Network without Higher Order Connections
Recurrent single-field neural networks are essentially dynamical systems that feed signals back to themselves. Popularized by John Hopfield, these models exhibit a rich class of dynamics characterized by the existence of several stable states, each with its own basin of attraction. Higher Order Neural Networks (HONNs) have been shown to have impressive computational, storage, and learning capabilities. However, networks with higher-order connections require more computational time and storage and have higher complexity, so the authors consider networks without higher-order connections. In this paper, they explore the constraints and effects of insisting on neural networks without these higher-order connections when learning three-atom clauses.
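To make the recurrent dynamics concrete, the following is a minimal sketch of a discrete Hopfield network restricted to first-order (pairwise) connections, using the standard Hebbian storage rule and asynchronous updates. This is only an illustration of stable states and basins of attraction, not the paper's construction for three-atom clauses; the function names and pattern data are invented for the example.

```python
import numpy as np

def train_hebbian(patterns):
    """Build a symmetric weight matrix from bipolar (+1/-1) patterns.

    Only pairwise (first-order) connections are used: W[i][j] couples
    units i and j; there are no higher-order terms.
    """
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)       # Hebbian outer-product rule
    np.fill_diagonal(W, 0)        # no self-connections
    return W / n

def recall(W, state, sweeps=10):
    """Asynchronously update units; the state settles in a stable state."""
    s = state.copy()
    for _ in range(sweeps):
        for i in np.random.permutation(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Two stored patterns acting as stable states (attractors).
patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, 1, -1, -1, -1]])
W = train_hebbian(patterns)

# Start inside the first pattern's basin of attraction: flip one bit.
noisy = patterns[0].copy()
noisy[0] *= -1
print(recall(W, noisy))  # the network settles back on patterns[0]
```

Each asynchronous update can only lower the network's energy, which is why the dynamics settle into one of the stored stable states rather than cycling.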