Analysis and Comparison of Different Learning Algorithms for Pattern Association Problems

1987 
We investigate the behavior of different learning algorithms for networks of neuron-like units. As test cases we use simple pattern association problems, such as the XOR problem and symmetry detection problems. The algorithms considered are either versions of the Boltzmann machine learning rule or based on the backpropagation of errors. We also propose and analyze a generalized delta rule for linear threshold units. We find that the performance of a given learning algorithm depends strongly on the type of units used. In particular, we observe that networks with ±1 units quite generally exhibit significantly better learning behavior than the corresponding 0,1 versions. We also demonstrate that an adaptation of the weight structure to the symmetries of the problem can lead to a drastic increase in learning speed.
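The comparison of ±1 and 0,1 units on the XOR problem can be illustrated with a minimal backpropagation sketch. This is not the paper's original code: the network size (2 hidden units), learning rate, epoch budget, and the choice of tanh for ±1 units versus the logistic function for 0,1 units are all assumptions made for the demonstration.

```python
# Illustrative sketch (assumed setup, not the authors' implementation):
# backpropagation on XOR, comparing 0,1 units (logistic) with +/-1 units (tanh).
import numpy as np

def train_xor(bipolar, hidden=2, lr=0.5, epochs=5000, seed=0):
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)
    if bipolar:
        # map the 0,1 patterns onto +/-1 patterns and use tanh units
        X, T = 2 * X - 1, 2 * T - 1
        act = np.tanh
        dact = lambda y: 1.0 - y ** 2
    else:
        # 0,1 units with the logistic squashing function
        act = lambda z: 1.0 / (1.0 + np.exp(-z))
        dact = lambda y: y * (1.0 - y)
    W1 = rng.normal(scale=0.5, size=(2, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        # forward pass
        h = act(X @ W1 + b1)
        y = act(h @ W2 + b2)
        # backward pass: generalized delta rule / backpropagation of errors
        delta_out = (y - T) * dact(y)
        delta_hid = (delta_out @ W2.T) * dact(h)
        W2 -= lr * h.T @ delta_out; b2 -= lr * delta_out.sum(axis=0)
        W1 -= lr * X.T @ delta_hid; b1 -= lr * delta_hid.sum(axis=0)
    return float(np.mean((y - T) ** 2))  # final mean squared error

for bipolar in (False, True):
    print("+/-1 units" if bipolar else "0,1 units ", train_xor(bipolar))
```

With these assumed settings the ±1 (tanh) network typically reaches a lower error in the same number of epochs, which is the kind of qualitative difference the abstract reports; exact numbers depend on the initialization and hyperparameters.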