6. Conclusions
In this paper, we analyzed the convergence properties of perceptron learning, AdaTron learning, and Hebbian learning when the teacher is noisy. The learning curves for these rules were derived analytically by a statistical-mechanical method and are consistent with the results of our numerical simulations. Our analysis showed that the learning curves of perceptron learning and AdaTron learning exhibit an overshoot, that is, the covariance coefficient R between the teacher and the student temporarily exceeds its asymptotic value before converging, whereas Hebbian learning does not show this behavior. Using an asymptotic analysis of the corresponding dynamical systems, we showed that this difference originates in the eigenvalues and eigenvectors of the system matrix. This result may provide a way to control the learning coefficient η so as to achieve faster convergence and a smaller residual error in future work.
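To illustrate how the eigenstructure of the system matrix governs whether R overshoots, the following minimal sketch integrates a generic linearized dynamics dx/dα = -A(x - x*) near a fixed point. All values here (the matrices A_mixed and A_monotone, the fixed point x*, and the initial condition) are hypothetical and are not the system matrices or order-parameter equations derived in this paper; the sketch only demonstrates the mechanism that the asymptotic analysis identifies.

import numpy as np

# Hypothetical sketch (values not taken from the paper).  Near the fixed
# point x*, the order parameters evolve approximately as
#   dx/d(alpha) = -A (x - x*),
# so the deviation is a superposition of modes c_i * v_i * exp(-lambda_i * alpha)
# over the eigenpairs (lambda_i, v_i) of A.  The first component plays the
# role of R; it overshoots its asymptotic value when the mode coefficients
# contributing to R have mixed signs.

def trajectory(A, x0, x_star, alphas):
    """Integrate the linearized dynamics dx/d(alpha) = -A (x - x_star)."""
    lam, V = np.linalg.eig(A)
    c = np.linalg.solve(V, x0 - x_star)              # mode coefficients
    dev = (V * c) @ np.exp(-np.outer(lam, alphas))   # superpose decaying modes
    return x_star[:, None] + dev

alphas = np.linspace(0.0, 20.0, 2000)
x_star = np.array([0.8, 0.0])      # hypothetical fixed point, R* = 0.8
x0 = np.array([0.0, 1.0])          # hypothetical initial condition

# Two hypothetical system matrices with the same eigenvalues (0.5 and 1.5)
# but different eigenvectors.
A_mixed    = np.array([[0.5, -1.1], [0.0, 1.5]])   # mixed-sign mode coefficients
A_monotone = np.array([[0.5,  0.0], [0.0, 1.5]])   # decoupled, monotone approach

for name, A in [("mixed eigenvectors", A_mixed), ("diagonal", A_monotone)]:
    R = trajectory(A, x0, x_star, alphas)[0]
    print(f"{name:>18}: max R = {R.max():.3f}, "
          f"asymptotic R = {x_star[0]:.3f}, overshoot = {R.max() > x_star[0]}")

In this toy setting the slower mode contributes to R with a positive coefficient while the faster mode contributes with a negative one, so R rises above its asymptotic value and then relaxes back, whereas the diagonal matrix yields a monotone approach; this mirrors the eigenvalue/eigenvector mechanism discussed above.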