Apr 2016-Mar 2017

Math-Fi seminar on 10 Mar.

2017.03.06 Mon up
  • Date : 10 Mar. (Fri.)
  • Place: W.W. 6th-floor, Colloquium Room
  • Time : 16:30-18:00
  • Speaker: Benjamin Poignard (Paris Dauphine University)
  • Title: Penalized M-estimators and the Sparse Group Lasso case: theory and applications.
  • Abstract: There are many statistical methods available for constructing prediction models in the presence of high-dimensional data. The key concept underlying the analysis of high-dimensional data is dimension reduction or regularization. Producing a useful forecasting model requires regularization; that is, the estimates must be constrained to manage the bias-variance trade-off. We will cover commonly used penalization methods and their theoretical properties. We will mainly focus on the so-called oracle property, both from a finite-sample and an asymptotic point of view. We will extensively study the asymptotic properties of the adaptive Sparse Group Lasso estimator within the penalized M-estimator framework for dependent variables. We prove that this sparsity-based estimator recovers the true underlying sparse model and is asymptotically normally distributed. Then we will study its asymptotic properties in a double-asymptotic framework, where the number of parameters diverges with the sample size. We show by simulations that the adaptive SGL outperforms other oracle-like methods in terms of estimation precision and variable selection.
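
For background, here is a minimal sketch of the Sparse Group Lasso penalty in its standard mixed l1/group-l2 form with group weights sqrt(p_g), together with its proximal operator (element-wise soft-thresholding followed by groupwise shrinkage). This is only an illustrative sketch of the usual SGL convention, not the speaker's implementation; the group sizes, lambda, and mixing parameter alpha below are made-up values.

  import numpy as np

  def sgl_penalty(beta, groups, lam, alpha):
      """Sparse Group Lasso penalty:
      alpha*lam*||beta||_1 + (1-alpha)*lam*sum_g sqrt(p_g)*||beta_g||_2."""
      l1 = alpha * lam * np.abs(beta).sum()
      l2 = sum((1 - alpha) * lam * np.sqrt(len(g)) * np.linalg.norm(beta[g])
               for g in groups)
      return l1 + l2

  def sgl_prox(v, groups, lam, alpha, step):
      """Proximal operator of the SGL penalty with step size `step`:
      soft-threshold each coordinate, then shrink each group vector."""
      # Element-wise soft-thresholding handles the l1 part.
      u = np.sign(v) * np.maximum(np.abs(v) - step * alpha * lam, 0.0)
      out = np.zeros_like(v)
      for g in groups:
          norm_g = np.linalg.norm(u[g])
          thresh = step * (1 - alpha) * lam * np.sqrt(len(g))
          # Groups whose (thresholded) norm falls below the group threshold
          # are set to zero; others are shrunk toward zero.
          if norm_g > thresh:
              out[g] = (1.0 - thresh / norm_g) * u[g]
      return out

  # Tiny illustrative example: 6 coefficients split into two groups of three.
  beta = np.array([1.5, -0.2, 0.0, 0.05, 0.03, -0.04])
  groups = [np.arange(0, 3), np.arange(3, 6)]
  print(sgl_penalty(beta, groups, lam=0.5, alpha=0.7))
  print(sgl_prox(beta, groups, lam=0.5, alpha=0.7, step=1.0))

In this sketch the second, small-coefficient group is driven entirely to zero while the large coefficient in the first group is only shrunk, which is the behaviour (sparsity both across and within groups) that the talk's oracle results concern.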
