Presentation Name: Principal component reduction of a nonparametric additive model with variable selection
Presenter: Prof. Kejun He
Date: 2018-06-14
Location: Zibin N102
Abstract:

Additive models have been widely used as a flexible nonparametric regression method that can overcome the curse of dimensionality. By using a sparsity-inducing penalty for variable selection, several authors have developed methods for fitting additive models when the number of predictors is very large, sometimes even larger than the sample size. However, despite good asymptotic properties, the finite-sample performance of existing methods deteriorates considerably when the number of relevant predictors becomes moderately large. We propose to reduce the number of additive component functions to be estimated using principal components. To fit the reduced additive model to the data, we develop a novel algorithm to solve the penalized least-squares problem on a fixed-rank manifold with a sparsity-inducing penalty. Our asymptotic theory shows that the resulting estimator has a faster convergence rate than the estimator obtained without principal component reduction, and this holds even when the reduced model is only an approximation, provided that the approximation error is small. Moreover, the proposed method is able to consistently identify the relevant predictors. The advantage of the reduced additive model is also illustrated using a simulation study.
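For intuition only, below is a minimal illustrative sketch of the idea described in the abstract; it is not the speaker's algorithm. Each additive component function is expanded in a common spline basis, the p-by-K basis-coefficient matrix is kept at low rank (the principal component reduction), and a group soft-threshold on its rows stands in for the sparsity-inducing penalty used for variable selection. The fitting loop here is a heuristic alternation of a gradient step, a rank projection, and a group soft-threshold, not the manifold-constrained penalized least squares developed in the talk, and all names (fit_reduced_additive_model, n_components, lam, ...) are hypothetical.

import numpy as np


def spline_basis(x, n_basis=8):
    # Truncated-power spline basis for one predictor; columns centered and scaled.
    knots = np.quantile(x, np.linspace(0.1, 0.9, n_basis - 2))
    cols = [x, x ** 2] + [np.maximum(x - t, 0.0) ** 2 for t in knots]
    B = np.column_stack(cols)
    return (B - B.mean(axis=0)) / B.std(axis=0)


def fit_reduced_additive_model(X, y, n_components=2, n_basis=8, lam=0.05, n_iter=1000):
    # Model: y ~ sum_j Phi_j @ B[j], with rank(B) <= n_components and sparse rows of B.
    n, p = X.shape
    Phi = np.stack([spline_basis(X[:, j], n_basis) for j in range(p)])   # (p, n, K)
    K = Phi.shape[2]
    Z = Phi.transpose(1, 0, 2).reshape(n, p * K)                         # full design
    step = n / np.linalg.norm(Z, 2) ** 2      # 1 / Lipschitz constant of the loss
    B = np.zeros((p, K))

    for _ in range(n_iter):
        resid = sum(Phi[j] @ B[j] for j in range(p)) - y
        grad = np.stack([Phi[j].T @ resid for j in range(p)]) / n        # (p, K)
        B -= step * grad                                                 # gradient step
        # project onto the rank-(n_components) set via truncated SVD
        U, s, Vt = np.linalg.svd(B, full_matrices=False)
        B = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
        # group soft-threshold on rows of B: sparsity step for variable selection
        norms = np.linalg.norm(B, axis=1, keepdims=True)
        B *= np.maximum(1.0 - step * lam / np.maximum(norms, 1e-12), 0.0)

    selected = np.where(np.linalg.norm(B, axis=1) > 1e-10)[0]
    return B, selected


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p = 400, 10
    X = rng.uniform(-1.0, 1.0, size=(n, p))
    # only the first two predictors are relevant in this toy example
    y = np.sin(np.pi * X[:, 0]) + X[:, 1] ** 2 + rng.normal(scale=0.1, size=n)
    B, selected = fit_reduced_additive_model(X, y)
    print("selected predictors:", selected)   # ideally [0 1]

In this sketch the rank constraint means the p fitted component functions are all linear combinations of only n_components shared functions, which is the sense in which the number of component functions to be estimated is reduced.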

Poster

Annual Speech Directory: No.144
