Generalizations of Oja's learning rule to non-symmetric matrices

Research output: Contribution to journal › Conference article › peer-review

Abstract

New learning rules for computing eigenspaces and eigenvectors for symmetric and nonsymmetric matrices are proposed. By applying Liapunov stability theory, these systems are shown to be globally convergent. Properties of limiting solutions of the systems and weighted versions are also examined. The proposed systems may be viewed as generalizations of Oja's and Xu's principal subspace learning rules. Numerical examples showing the convergence behavior are also presented.
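As context for the abstract, the classical Oja principal-subspace flow that the proposed systems generalize can be written as dW/dt = AW − W(WᵀAW) for a symmetric matrix A. The sketch below integrates this flow with a simple Euler step; the matrix, step size, and iteration count are illustrative assumptions, not taken from the paper, and the paper's actual contribution (the nonsymmetric generalizations) is not reproduced here.

```python
import numpy as np

# Minimal sketch of Oja's principal-subspace flow for a symmetric matrix,
#     dW/dt = A W - W (W^T A W),
# integrated with a forward-Euler step. A, eta, and the iteration count
# are illustrative choices for this sketch only.

rng = np.random.default_rng(0)
A = np.diag([3.0, 1.0, 0.5])           # symmetric matrix with known eigenvalues
W = rng.standard_normal((3, 1)) * 0.1  # small random initial weight vector

eta = 0.01  # Euler step size
for _ in range(5000):
    W = W + eta * (A @ W - W @ (W.T @ A @ W))

# W converges to a unit vector spanning the principal eigenspace of A
# (here, the eigenvector for the largest eigenvalue, 3.0).
print(np.round(W.ravel(), 3))
```

Under the Liapunov-based analysis cited in the abstract, flows of this type are globally convergent to the principal subspace; the numerical behavior above illustrates that limiting property for the symmetric special case.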

Original language: English (US)
Article number: 4253004
Pages (from-to): 1779-1782
Number of pages: 4
Journal: Proceedings - IEEE International Symposium on Circuits and Systems
State: Published - Sep 27 2007
Event: 2007 IEEE International Symposium on Circuits and Systems, ISCAS 2007 - New Orleans, LA, United States
Duration: May 27 2007 - May 30 2007

Keywords

  • Global convergence
  • Liapunov stability
  • Minor components
  • Oja's learning rule
  • Principal components
