An attempt to explain double descent in modern machine learning

This article aims to explain mathematically why the so-called double descent observed by Belkin et al. ("Reconciling modern machine-learning practice and the classical bias-variance trade-off", PNAS 116(32) (2019), pp. 15849-15854) occurs on the way from the classical approximation regime of machine learning to the modern interpolation regime. We argue that this phenomenon may be explained by a decomposition of the mean squared error plus complexity into bias, variance, and an unavoidable irreducible error inherent to the problem. Further, in the case of normally distributed output errors, we apply this decomposition to explain why the LASSO provides reliable predictors that avoid overfitting.
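
For orientation, the decomposition named in the abstract refines the textbook bias-variance split of the expected squared error; the identity below is only the standard version (the additional complexity term is the paper's contribution and is not reproduced here). With y = f(x) + ε, E[ε] = 0, Var(ε) = σ², and a learned predictor f̂:

  \mathbb{E}\big[(y - \hat{f}(x))^2\big] = \underbrace{\big(f(x) - \mathbb{E}[\hat{f}(x)]\big)^2}_{\text{bias}^2} + \underbrace{\operatorname{Var}\big(\hat{f}(x)\big)}_{\text{variance}} + \underbrace{\sigma^2}_{\text{irreducible error}}

Double descent refers to the test error falling again once model complexity grows past the interpolation threshold, after the classical U-shaped rise.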

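To make the LASSO claim concrete, here is a minimal, purely illustrative Python sketch (not the authors' code; the data, the degree-20 feature map, and the penalty weight alpha=1e-3 are assumptions) comparing an unpenalized, overparameterized least-squares fit with an L1-penalized one on held-out data, using scikit-learn:

import numpy as np
from sklearn.linear_model import Lasso, LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
x_train = rng.uniform(-1.0, 1.0, size=(30, 1))
y_train = np.sin(3.0 * x_train[:, 0]) + rng.normal(scale=0.2, size=30)  # noisy targets

# 30 samples vs. degree-20 polynomial features: near the interpolation
# threshold, where classical bias-variance reasoning predicts overfitting.
ols = make_pipeline(PolynomialFeatures(degree=20), LinearRegression())
lasso = make_pipeline(PolynomialFeatures(degree=20),
                      Lasso(alpha=1e-3, max_iter=100000))
ols.fit(x_train, y_train)
lasso.fit(x_train, y_train)

x_test = rng.uniform(-1.0, 1.0, size=(200, 1))
y_test = np.sin(3.0 * x_test[:, 0]) + rng.normal(scale=0.2, size=200)
print("OLS   test MSE:", np.mean((ols.predict(x_test) - y_test) ** 2))
print("LASSO test MSE:", np.mean((lasso.predict(x_test) - y_test) ** 2))

The L1 penalty shrinks most coefficients toward zero, which typically keeps the test error of the LASSO fit well below that of the unpenalized fit in this regime.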
Metadata
Author: Jochen Merker, Gregor Schuldt
DOI: https://doi.org/10.48446/opus-12293
ISSN: 1437-7624
Parent Title (German): 26. Interdisziplinäre Wissenschaftliche Konferenz Mittweida
Publisher: Hochschule Mittweida
Place of publication: Mittweida
Document Type: Conference Proceeding
Language: English
Year of Completion: 2021
Publishing Institution: Hochschule Mittweida
Contributing Corporation: HTWK Leipzig
Release Date: 2021/05/18
Tag: bias-variance; double descent
GND Keyword: Maschinelles Lernen; Komplexität
Issue: 002
Page Number: 4
First Page: 141
Last Page: 144
Open Access: Freely accessible
Licence (German): Urheberrechtlich geschützt (protected by copyright)