Modern technology often generates data with complex structures in which both the response and the explanatory variables are matrix-valued. Existing methods in the literature can tackle matrix-valued predictors but are rather limited for matrix-valued responses. We study matrix variate regressions for such data, where the response Y on each experimental unit is a random matrix and the predictor X can be a scalar, a vector, or a matrix, treated as non-stochastic in terms of the conditional distribution Y|X. We propose models for matrix variate regressions and then develop envelope extensions of these models. Under the envelope framework, redundant variation can be eliminated in estimation, and the number of parameters can be notably reduced when the matrix variate dimension is large, possibly resulting in significant gains in efficiency. The proposed methods are applicable to high-dimensional settings.
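To illustrate the setting the abstract describes (a matrix-valued response Y regressed on a vector-valued predictor X), here is a minimal sketch in Python. It fits a naive ordinary-least-squares baseline on the vectorized response, ignoring the Kronecker covariance and envelope structure that the paper's methods exploit; all dimensions and names (`n`, `r`, `m`, `p`, `B_true`) are hypothetical choices for the example, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: n units, each response Y_i is r x m, predictor x_i is p x 1.
n, r, m, p = 200, 3, 4, 2

# Hypothetical ground-truth coefficients, stored as p x (r*m) for the vectorized model.
B_true = rng.normal(size=(p, r * m))

X = rng.normal(size=(n, p))
E = 0.1 * rng.normal(size=(n, r * m))    # i.i.d. noise; no Kronecker structure imposed
Y = (X @ B_true + E).reshape(n, r, m)     # matrix-valued responses, one r x m matrix per unit

# Naive fit: OLS on vec(Y_i), row by row. The envelope methods in the paper improve on
# this by removing response variation that is immaterial to the regression.
B_hat, *_ = np.linalg.lstsq(X, Y.reshape(n, r * m), rcond=None)

print(np.max(np.abs(B_hat - B_true)))
```

Even this baseline recovers the coefficients well here because n is large relative to p; the efficiency gains of the envelope approach matter most when the matrix dimensions r and m are large relative to the sample size.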
Original language: English (US)
Number of pages: 22
Journal: Journal of the Royal Statistical Society. Series B: Statistical Methodology
State: Published - Mar 2018
Bibliographical note (funding information):
We thank the Joint Editor, two Associate Editors and two referees for their valuable and insightful comments that greatly improved this work. We thank Professor Bing Li for sharing the electroencephalogram data. This work was partially supported by the General University Research Program at the University of Delaware.
- Matrix variate regression
- Matrix-valued response
- Reducing subspace
- Sufficient dimension reduction