Document type
- Conference proceeding
Year of publication
- 2021
Language
- English
Keywords
- Complexity
- Machine learning
- bias-variance
- double descent
This article aims to explain mathematically why the so-called double descent, observed by Belkin et al. (Reconciling modern machine-learning practice and the classical bias-variance trade-off, PNAS 116(32) (2019), pp. 15849-15854), occurs on the way from the classical approximation regime of machine learning to the modern interpolation regime. We argue that this phenomenon can be explained by a decomposition of the mean squared error plus complexity into bias, variance, and an irreducible error inherent to the problem. Further, in the case of normally distributed output errors, we apply this decomposition to explain why the LASSO provides reliable predictors that avoid overfitting.
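The bias-variance decomposition the abstract refers to can be illustrated with a small Monte Carlo sketch (not the paper's method; the test function, noise level, and polynomial model below are illustrative assumptions). Over many training sets drawn from the same distribution, the expected squared prediction error at a fixed test input splits into squared bias, variance, and the irreducible noise variance:

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Assumed true regression function (illustrative choice).
    return np.sin(2 * np.pi * x)

sigma = 0.3          # std of the output noise; sigma**2 is the irreducible error
n_train = 20         # training points per dataset
n_datasets = 2000    # Monte Carlo repetitions over training sets
degree = 3           # polynomial degree of the fitted model
x_test = 0.35        # fixed test input where the error is decomposed

preds = np.empty(n_datasets)
for i in range(n_datasets):
    x = rng.uniform(0.0, 1.0, n_train)
    y = f(x) + rng.normal(0.0, sigma, n_train)
    coeffs = np.polyfit(x, y, degree)      # least-squares polynomial fit
    preds[i] = np.polyval(coeffs, x_test)

bias_sq = (preds.mean() - f(x_test)) ** 2
variance = preds.var()
# The decomposition: E[(prediction - y_test)^2] = bias^2 + variance + sigma^2
decomposed_mse = bias_sq + variance + sigma ** 2

# Direct Monte Carlo estimate of the same expected squared error.
y_test = f(x_test) + rng.normal(0.0, sigma, n_datasets)
direct_mse = ((preds - y_test) ** 2).mean()

print(f"bias^2 = {bias_sq:.4f}, variance = {variance:.4f}")
print(f"decomposed MSE = {decomposed_mse:.4f}, direct MSE = {direct_mse:.4f}")
```

The two MSE estimates agree up to Monte Carlo error. Raising `degree` toward `n_train` drives the bias down while the variance grows, which is the classical side of the trade-off that the double-descent analysis extends into the interpolation regime.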