7 Optimization Methods (5 Hours)
7.1 Convex Optimality Conditions
7.2 Gradient Descent and Coordinate Descent (see the sketch after this outline)
7.3 Least Angle Regression
7.4 Alternating Direction Method of Multipliers (ADMM)
7.5 Minorization-Maximization (MM) Algorithms
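As a concrete illustration of the algorithms listed under 7.2, the following minimal sketch implements gradient descent for least squares and cyclic coordinate descent for the lasso. It assumes NumPy and simulated data; the function names, step-size choice, penalty level, and problem sizes are illustrative assumptions, not material prescribed by the course.

```python
import numpy as np

def gradient_descent_ls(X, y, step=None, n_iter=500):
    """Gradient descent for least squares: minimize ||y - X b||^2 / (2n)."""
    n, p = X.shape
    if step is None:
        # 1 / (largest eigenvalue of X'X / n) gives a monotonically decreasing objective
        step = n / np.linalg.norm(X, 2) ** 2
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) / n
        b -= step * grad
    return b

def soft_threshold(z, t):
    """Soft-thresholding operator, the building block of lasso coordinate updates."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def coordinate_descent_lasso(X, y, lam, n_sweeps=100):
    """Cyclic coordinate descent for minimize ||y - X b||^2 / (2n) + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_sweeps):
        for j in range(p):
            r_j = y - X @ b + X[:, j] * b[j]   # partial residual excluding coordinate j
            rho = X[:, j] @ r_j / n
            b[j] = soft_threshold(rho, lam) / col_sq[j]
    return b

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 10))
    beta = np.zeros(10); beta[:3] = [3.0, -2.0, 1.5]
    y = X @ beta + rng.standard_normal(200)
    print(gradient_descent_ls(X, y).round(2))
    print(coordinate_descent_lasso(X, y, lam=0.1).round(2))
```

Each coordinate update applies soft-thresholding to a univariate least-squares fit on the partial residual, so one full sweep costs O(np).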
8 Penalized Likelihood Methods for Variable Selection (8 Hours)
8.1 Generalized Linear Models
8.2 Variable Selection via Penalized Likelihood
8.3 Numerical Algorithms (see the sketch after this outline)
8.4 Tuning Parameter Selection
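To make 8.1–8.3 concrete, the sketch below fits an L1-penalized logistic regression (a generalized linear model with a lasso penalty) by proximal gradient descent on simulated data. The choice of proximal gradient, the step-size rule, and the penalty level are illustrative assumptions, not necessarily the specific algorithms treated in 8.3.

```python
import numpy as np

def sigmoid(t):
    """Logistic (inverse link) function for the binomial GLM."""
    return 1.0 / (1.0 + np.exp(-t))

def lasso_logistic(X, y, lam, n_iter=2000):
    """Proximal gradient descent for the L1-penalized logistic negative log-likelihood:
    minimize (1/n) * sum_i [log(1 + exp(x_i' b)) - y_i * x_i' b] + lam * ||b||_1.
    """
    n, p = X.shape
    # The logistic loss gradient is Lipschitz with constant at most ||X||_2^2 / (4n),
    # so this fixed step size keeps the iteration stable.
    step = 4.0 * n / np.linalg.norm(X, 2) ** 2
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (sigmoid(X @ b) - y) / n   # gradient of the negative log-likelihood
        z = b - step * grad                     # gradient step on the smooth part
        b = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-threshold (proximal) step
    return b

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.standard_normal((300, 8))
    beta_true = np.array([2.0, -1.5, 0, 0, 0, 0, 0, 0])
    y = rng.binomial(1, sigmoid(X @ beta_true))
    print(lasso_logistic(X, y, lam=0.05).round(2))
```

Tuning parameter selection (8.4) would then choose lam, for example by cross-validation over a grid of penalty levels.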
Assessment
(Form of examination; grade composition; if the course is open to undergraduates, please indicate the difference.)
Grade composition: 40% + 20% + 40%
Textbook and Supplementary Readings
1. Fan, J., Li, R., Zhang, C.-H. and Zou, H. (2020). Statistical Foundations of Data Science. Chapman and Hall/CRC.
2. Hastie, T., Tibshirani, R. and Wainwright, M. (2015). Statistical Learning with Sparsity: The Lasso and Generalizations. Chapman and Hall/CRC.
3. Bühlmann, P. and van de Geer, S. (2011). Statistics for High-Dimensional Data: Methods, Theory and Applications. Springer Science & Business Media.
4. Rigollet, P. and Hütter, J.-C. (2019). High-Dimensional Statistics. MIT lecture notes.