Friday, April 18, 2014, 3:30pm
Cohen Room 230, Statistics Building
Hilafu, PhD Candidate, Department of Statistics, The University of Georgia

Sufficient Dimension Reduction (SDR) is a paradigm for reducing the dimension of the predictor vector without losing regression information. Classical inverse-regression-based SDR methods, though successfully used in many applications and computationally attractive, require inverting the covariance matrix of the predictor vector. This has hindered their use in contemporary high-dimensional data analysis, where the number of predictors exceeds the available sample size, because estimating and inverting the covariance matrix then poses theoretical and operational challenges. Motivated by this problem, this dissertation presents a sequential reduction framework that avoids inverting a large covariance matrix and extends the scope of these SDR methods to the high-dimensional setting. We present three projects: the first extends these methods to multivariate regressions with categorical and quantitative predictors; the second develops the sequential reduction framework and two paths to implement it, one for a quantitative response and one for a categorical response; the third combines the framework with the idea of partial least squares to develop an SDR method for correlated high-dimensional data.
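To make the covariance-inversion bottleneck concrete, below is a minimal sketch of Sliced Inverse Regression (SIR), one of the classical inverse-regression SDR methods the abstract refers to. The function name, slicing scheme, and parameters are illustrative assumptions, not the speaker's implementation; the point is the explicit inversion of the p x p predictor covariance, which is singular once the number of predictors exceeds the sample size.

    import numpy as np

    def sir_directions(X, y, n_slices=10, n_directions=2):
        # Sliced Inverse Regression (Li, 1991), sketched for illustration only.
        n, p = X.shape
        Xc = X - X.mean(axis=0)

        # The limiting step in high dimensions: the p x p covariance has rank
        # at most n - 1 when p > n, so its inverse (square root) does not exist.
        Sigma = Xc.T @ Xc / n
        vals, vecs = np.linalg.eigh(Sigma)
        Sigma_inv_sqrt = vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T

        Z = Xc @ Sigma_inv_sqrt                      # standardized predictors

        # Slice the (quantitative) response and average Z within each slice.
        order = np.argsort(y)
        M = np.zeros((p, p))
        for idx in np.array_split(order, n_slices):
            m = Z[idx].mean(axis=0)
            M += (len(idx) / n) * np.outer(m, m)     # weighted cov. of slice means

        # Leading eigenvectors of M, mapped back to the original predictor scale,
        # estimate a basis of the dimension-reduction subspace.
        w, v = np.linalg.eigh(M)
        top = v[:, np.argsort(w)[::-1][:n_directions]]
        return Sigma_inv_sqrt @ top                  # p x n_directions basis

The sequential reduction framework described in the talk is designed precisely to sidestep the Sigma inversion above when p exceeds n.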