In this paper, learning rules are proposed for the simultaneous computation of minor eigenvectors of a covariance matrix. To understand the optimality conditions of Rayleigh quotients, many related identities and properties are derived. For example, it is shown that the Hessian matrix is singular at each critical point of the Rayleigh quotient. Based on these properties, MCA learning rules are derived by optimizing a weighted inverse Rayleigh quotient, so that the optimal weights at the equilibrium points are exactly the desired eigenvectors of the covariance matrix rather than an arbitrary orthonormal basis of the minor subspace. Variations of the derived MCA learning rules are obtained by imposing orthogonality and quadratic constraints and by changes of variables. Some of the proposed algorithms can also perform PCA by merely changing the sign of the step size.
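As a rough illustration of the last point (a minimal sketch, not the paper's actual learning rules), a normalized gradient step on the Rayleigh quotient r(w) = wᵀCw / wᵀw drives w toward a minor eigenvector of C, and flipping the sign of the step size turns the same iteration into a PCA rule. The covariance matrix, step size, and iteration count below are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative symmetric positive-definite "covariance" matrix
A = rng.standard_normal((5, 5))
C = A @ A.T + 1e-3 * np.eye(5)

def rayleigh_iterate(C, eta, steps=20000):
    """Gradient iteration on the Rayleigh quotient r(w) = w'Cw / w'w.

    eta > 0 descends (MCA: minor eigenvector);
    eta < 0 ascends (PCA: principal eigenvector).
    """
    w = rng.standard_normal(C.shape[0])
    for _ in range(steps):
        r = (w @ C @ w) / (w @ w)           # current Rayleigh quotient
        grad = 2.0 / (w @ w) * (C @ w - r * w)
        w = w - eta * grad                  # sign of eta selects MCA vs PCA
        w /= np.linalg.norm(w)              # keep iterates on the unit sphere
    return w, (w @ C @ w)

evals = np.linalg.eigvalsh(C)
w_min, r_min = rayleigh_iterate(C, eta=0.01)    # MCA: smallest eigenvalue
w_max, r_max = rayleigh_iterate(C, eta=-0.01)   # PCA: largest eigenvalue
print(r_min, evals[0])
print(r_max, evals[-1])
```

At convergence the Rayleigh quotient equals the corresponding eigenvalue, which is consistent with the abstract's remark that equilibrium points of such flows are the eigenvectors themselves rather than an arbitrary basis of the invariant subspace.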