Sufficient dimension reduction (SDR) techniques have proven to be very useful data analysis tools in various applications. Underlying many SDR techniques is a critical assumption that the predictors are elliptically contoured. When this assumption is violated, practitioners usually apply a variable transformation so that the transformed predictors become (nearly) normal. The transformation function is often chosen from the log and power transformation family, as suggested by the celebrated Box-Cox model. However, any parametric transformation can be too restrictive, raising the risk of model misspecification. We propose a nonparametric variable transformation method after which the predictors become normal. To demonstrate the main idea, we combine this flexible transformation method with two well-established SDR techniques, sliced inverse regression (SIR) and the inverse regression estimator (IRE). The resulting SDR techniques are referred to as TSIR and TIRE, respectively. Both simulation and real data results show that TSIR and TIRE have very competitive performance. Asymptotic theory is established to support the proposed method. The technical proofs are available as supplementary materials.
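As a rough illustration of the transform-then-SDR idea, the sketch below applies a simple marginal rank-based normal-scores transform (one common nonparametric way to make skewed predictors approximately normal; the paper's actual transformation estimator is not given in this abstract and may differ) and then runs a textbook sliced inverse regression on the transformed predictors. All function names, the slicing scheme, and the simulated data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.stats import norm

def normal_scores(X):
    """Marginal rank-based normal-scores transform (an illustrative
    nonparametric normalizer, not the paper's estimator)."""
    n, p = X.shape
    Z = np.empty_like(X, dtype=float)
    for j in range(p):
        ranks = np.argsort(np.argsort(X[:, j])) + 1  # ranks 1..n
        Z[:, j] = norm.ppf(ranks / (n + 1))          # map to normal quantiles
    return Z

def sir(X, y, n_slices=10, n_dirs=1):
    """Basic sliced inverse regression: eigen-decompose the weighted
    covariance of within-slice means of the standardized predictors."""
    n, p = X.shape
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ Sigma_inv_sqrt                    # whiten predictors
    slices = np.array_split(np.argsort(y), n_slices) # slice on sorted response
    M = np.zeros((p, p))
    for s in slices:
        m = Z[s].mean(axis=0)
        M += (len(s) / n) * np.outer(m, m)           # weighted slice-mean covariance
    _, v = np.linalg.eigh(M)
    dirs = Sigma_inv_sqrt @ v[:, ::-1][:, :n_dirs]   # leading directions, original scale
    return dirs / np.linalg.norm(dirs, axis=0)

# Usage: a "TSIR"-style pipeline on heavily skewed simulated predictors.
rng = np.random.default_rng(0)
n = 500
X = rng.lognormal(size=(n, 3))                       # non-elliptical predictors
y = X[:, 0] - X[:, 1] + 0.1 * rng.standard_normal(n)
B = sir(normal_scores(X), y, n_slices=10, n_dirs=1)  # estimated direction, shape (3, 1)
```

The design choice illustrated is that the linearity condition required by SIR is far more plausible after the predictors have been transformed toward normality than on the raw, skewed scale.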
Funding Information:
This work was supported in part by NSF grant DMS-0846068 and a grant from ONR. The authors thank Professor Dennis Cook for helpful discussions and Xin Zhang for kindly providing the Minneapolis elementary school data. The authors are also grateful for the insightful comments and suggestions from the Editor, Associate Editor, and the referees.
Keywords:
- Inverse regression estimator
- Linearity condition
- Sliced inverse regression
- Variable transformation