Please note change of TIME - QLS guest webinar - Tuesday, 14 April 14h00

qls at ictp.it
Wed Apr 8 10:36:30 CEST 2020


Dear All,

On Tuesday, 14 April at 14h00, Alia Abbara and Cedric Gerbelot (Laboratoire de Physique, ENS Paris)
will give a webinar titled "Asymptotic errors for convex penalized linear regression beyond Gaussian matrices and extension to the generalized linear model".

Abstract:

We consider the problem of learning a coefficient vector x0 ∈ R^N from
noisy linear observations y = F x0 + w ∈ R^M in the high-dimensional limit
M, N → ∞ with α ≡ M/N fixed. We provide a rigorous derivation of an explicit
formula, first conjectured using heuristic methods from statistical
physics, for the asymptotic mean squared error obtained by penalized
convex regression estimators such as the LASSO or the elastic net, for a
class of very generic random matrices corresponding to rotationally
invariant data matrices with arbitrary spectrum. The proof is based on a
convergence analysis of an oracle version of vector approximate
message passing (oracle-VAMP) and on the properties of its state
evolution equations. Our method leverages and highlights the link
between vector approximate message passing, Douglas-Rachford splitting
and proximal descent algorithms, extending previous results obtained
with i.i.d. matrices for a large class of problems. We illustrate our
results on some concrete examples and show that, even though they are
asymptotic, our predictions agree remarkably well with numerics even for
very moderate sizes.
We then show how the same proof can be extended to the generalized 
linear model using an oracle version of generalized vector approximate 
message passing (oracle-GVAMP) and a more elaborate convergence proof 
based on Lyapunov arguments from control theory.
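
For those who would like to experiment with the setting described in the abstract before the talk, here is a minimal, purely illustrative Python sketch (our own example, not the speakers' code or formula): it builds a rotationally invariant data matrix with an arbitrary spectrum, generates observations y = F x0 + w, fits a LASSO estimator, and prints the empirical mean squared error whose large-N limit the asymptotic formula is meant to predict. All names, sizes and parameter values below are assumptions chosen for illustration.

import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

N = 400                      # signal dimension
aspect = 1.5                 # alpha = M/N, kept fixed as M, N grow
M = int(aspect * N)

def haar_orthogonal(n, rng):
    # Haar-distributed orthogonal matrix via QR with sign correction.
    q, r = np.linalg.qr(rng.standard_normal((n, n)))
    return q * np.sign(np.diag(r))

# Rotationally invariant matrix F = U diag(s) V^T with Haar U, V and a chosen
# (non-Gaussian-universality) spectrum s.
U, V = haar_orthogonal(M, rng), haar_orthogonal(N, rng)
s = rng.uniform(0.5, 1.5, size=min(M, N))
F = U[:, :len(s)] @ np.diag(s) @ V[:len(s), :] / np.sqrt(N)

# Sparse ground truth x0 and noisy observations y = F x0 + w.
x0 = rng.standard_normal(N) * (rng.random(N) < 0.25)
w = 0.1 * rng.standard_normal(M)
y = F @ x0 + w

# LASSO estimate and its empirical mean squared error (1/N) ||xhat - x0||^2.
xhat = Lasso(alpha=0.01, fit_intercept=False, max_iter=50_000).fit(F, y).coef_
print("empirical MSE:", np.mean((xhat - x0) ** 2))

Repeating this for several N at fixed aspect ratio gives the kind of finite-size numerics that, according to the abstract, already agree well with the asymptotic prediction at moderate sizes.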

Here is the Zoom meeting ID to attend the online seminar:
Meeting ID: 475-819-702
Join Zoom Meeting
https://zoom.us/j/475819702

Kind regards,

Erica

  


