Comparison of numerical properties comparing Automated Derivatives (Autograd) and explicit derivatives (Gradients) for Prototype based models

  • Differentiation is ubiquitous in mathematics and especially in machine learning, where gradient-based models depend on it. Calculating gradients can be complex and may involve many variables. Supervised Learning Vector Quantization models, which are used for classification tasks, likewise rely on Stochastic Gradient Descent to optimize their cost functions. There are several methods for computing these gradients or derivatives: Manual Differentiation, Numeric Differentiation, Symbolic Differentiation, and Automatic Differentiation. In this thesis, we evaluate each of these methods and compare their use for variants of the Generalized Learning Vector Quantization algorithm; a small illustration of such a comparison follows below.
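A minimal sketch of such a comparison, not taken from the thesis: the explicit, hand-derived chain-rule gradient of the GLVQ relative-distance cost mu = (d+ - d-) / (d+ + d-) is checked against the gradient produced by automatic differentiation. The sample x, the prototypes w_plus and w_minus, and the choice of PyTorch's autograd are illustrative assumptions.

    # Illustrative sketch (assumed setup), not code from the thesis.
    import torch

    torch.manual_seed(0)
    x = torch.randn(2)                            # one data sample
    w_plus = torch.randn(2, requires_grad=True)   # closest prototype of the correct class
    w_minus = torch.randn(2, requires_grad=True)  # closest prototype of a wrong class

    # GLVQ relative distance: mu = (d+ - d-) / (d+ + d-)
    d_plus = torch.sum((x - w_plus) ** 2)
    d_minus = torch.sum((x - w_minus) ** 2)
    mu = (d_plus - d_minus) / (d_plus + d_minus)

    # Automatic differentiation (autograd) fills w_plus.grad and w_minus.grad.
    mu.backward()

    # Explicit derivatives via the chain rule:
    #   dmu/dd+ =  2 d- / (d+ + d-)^2,  dd+/dw+ = -2 (x - w+)
    #   dmu/dd- = -2 d+ / (d+ + d-)^2,  dd-/dw- = -2 (x - w-)
    with torch.no_grad():
        s = (d_plus + d_minus) ** 2
        grad_w_plus = (2 * d_minus / s) * (-2) * (x - w_plus)
        grad_w_minus = (-2 * d_plus / s) * (-2) * (x - w_minus)

    # Both routes should agree up to floating-point tolerance.
    print(torch.allclose(w_plus.grad, grad_w_plus))    # True
    print(torch.allclose(w_minus.grad, grad_w_minus))  # True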

Metadata
Author: Venkata Sai Sandeep Yendamuri
Advisor: Thomas Villmann, Alexander Engelsberger
Document Type: Master's Thesis
Language: English
Year of Completion: 2022
Granting Institution: Hochschule Mittweida
Release Date: 2023/02/06
GND Keyword: Maschinelles Lernen (machine learning)
Page Number: 49
Institutes: Angewandte Computer- und Biowissenschaften (Applied Computer Sciences and Biosciences)
DDC classes: 006.31 Maschinelles Lernen (machine learning)
Open Access: Freely accessible
Licence (German): Urheberrechtlich geschützt (protected by copyright)