New criteria are proposed for extracting multiple minor components associated with the covariance matrix of an input process. The proposed minor component analysis (MCA) algorithms are based on optimizing a weighted inverse Rayleigh quotient, so that the optimal weights at the equilibrium points are exactly the desired eigenvectors of the covariance matrix, rather than an arbitrary orthonormal basis of the minor subspace. Variants of the derived MCA learning rules are obtained by imposing orthogonal and quadratic constraints and by changes of variables. Some of the proposed algorithms can also perform principal component analysis (PCA) merely by changing the sign of the step size. These algorithms may be viewed as MCA counterparts of Oja's and Xu's systems for computing multiple principal components. Simulation results demonstrating the performance of the algorithms are also presented.
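The abstract does not state the update rules themselves; as an illustrative sketch only (not the paper's exact algorithms), a weighted Oja/Brockett-type discrete flow shows the two ideas highlighted above: distinct diagonal weights make individual eigenvectors, rather than an arbitrary subspace basis, the stable equilibria, and flipping the sign of the step size switches between MCA and PCA. The covariance matrix, weights, and step size below are hypothetical choices for demonstration.

```python
import numpy as np

# Hypothetical test covariance with known, well-separated eigenvalues
# (a stand-in for the covariance matrix of an input process).
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((6, 6)))
C = Q @ np.diag([0.5, 1.0, 2.0, 3.0, 4.0, 5.0]) @ Q.T

def weighted_flow(C, k=2, sign=-1.0, eta=0.02, steps=4000, seed=1):
    """Discretized weighted Oja/Brockett-type flow with explicit
    re-orthonormalization. sign=-1 descends toward the minor
    eigenvectors (MCA); sign=+1 ascends toward the principal
    eigenvectors (PCA). The distinct diagonal weights in D break the
    rotational symmetry of the subspace criterion, so the columns
    converge to individual eigenvectors."""
    n = C.shape[0]
    D = np.diag(np.arange(k, 0, -1.0))   # distinct positive weights
    W, _ = np.linalg.qr(np.random.default_rng(seed).standard_normal((n, k)))
    for _ in range(steps):
        W = W + sign * eta * (C @ W @ D - W @ D @ W.T @ C @ W)
        W, _ = np.linalg.qr(W)           # keep columns orthonormal
    return W

evals, evecs = np.linalg.eigh(C)         # eigenvalues in ascending order
W_mca = weighted_flow(C, sign=-1.0)      # extract two minor components
W_pca = weighted_flow(C, sign=+1.0)      # same rule, sign flipped: PCA
minor = evecs[:, :2]                     # eigenvectors for 0.5 and 1.0
principal = evecs[:, -1:-3:-1]           # eigenvectors for 5.0 and 4.0
print(np.abs(W_mca.T @ minor))           # near identity up to sign
print(np.abs(W_pca.T @ principal))       # near identity up to sign
```

The explicit QR step is one concrete way to realize the orthogonal constraint mentioned in the abstract; the paper's own rules impose such constraints within the learning dynamics instead.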