A variety of artificial neural networks are reviewed, including feed‐forward networks, recurrent networks, associative memories such as the Hopfield network, and the self‐organizing map. Their architectures are described, as are methods that have been developed for training them. Particular emphasis is placed on links with statistical activities, especially regression, classification, and clustering, in contexts such as graphical models and latent structure models. In terms of the training procedures, attention is drawn to the implicit or explicit implementation of statistical methodological approaches such as likelihood and Bayesian methods.