Analysis of Attention Learning Schemes and the Design of an Attention Integration into Learning Vector Quantization

  • Machine learning models for time series have long been a topic of special interest due to the unique structure of the data. Recently, the introduction of attention improved the capabilities of recurrent neural networks and transformers on learning tasks such as machine translation. However, these models are usually subsymbolic architectures, making their inner workings hard to interpret without comprehensive tools. In contrast, interpretable models such as learning vector quantization make their decision process transparent by design. This thesis aims to merge attention as a machine learning function with learning vector quantization to better handle time series data. A design for such a model is proposed and tested on a dataset used in connection with attention-based transformers. Although the proposed model did not yield the expected results, this work outlines improvements for further research on this approach.
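The abstract does not fix a concrete formulation, so the following is a minimal sketch of one way attention could be integrated into learning vector quantization for time series: softmax attention weights over time steps scale the per-step squared distances inside a GLVQ-style prototype update. All names (attention_distance, lvq_attention_step) and the choice of shared, trainable per-step scores are illustrative assumptions, not the design proposed in the thesis.

    import numpy as np

    def softmax(z):
        # Numerically stable softmax.
        e = np.exp(z - z.max())
        return e / e.sum()

    def attention_distance(x, w, scores):
        # x, w: time series and prototype, both shaped (T, d).
        # scores: per-time-step attention scores, shaped (T,).
        # Softmax turns the scores into weights that emphasize or
        # suppress individual time steps in the distance.
        alpha = softmax(scores)
        return np.sum(alpha * np.sum((x - w) ** 2, axis=1))

    def lvq_attention_step(x, y, prototypes, labels, scores, lr=0.01):
        # One LVQ-style update using the attention-weighted distance:
        # pull the nearest prototype of the correct class toward x,
        # push the nearest prototype of a wrong class away.
        d = np.array([attention_distance(x, w, scores) for w in prototypes])
        same, other = labels == y, labels != y
        p = np.flatnonzero(same)[np.argmin(d[same])]
        n = np.flatnonzero(other)[np.argmin(d[other])]
        alpha = softmax(scores)[:, None]  # broadcast weights over feature axis
        prototypes[p] += lr * alpha * (x - prototypes[p])
        prototypes[n] -= lr * alpha * (x - prototypes[n])

    # Hypothetical usage: 3 prototypes for 2 classes, series length T=50, d=4.
    rng = np.random.default_rng(0)
    prototypes = rng.normal(size=(3, 50, 4))
    labels = np.array([0, 0, 1])
    scores = np.zeros(50)  # uniform attention before any training
    x, y = rng.normal(size=(50, 4)), 1
    lvq_attention_step(x, y, prototypes, labels, scores)

In a trainable variant, the scores themselves would be updated by gradient descent alongside the prototypes, letting the model learn which time steps matter for classification.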

Metadata
Author: Thomas Davies
Advisor: Thomas Villmann, David Nebel
Document Type: Master's Thesis
Language: English
Year of Completion: 2023
Publishing Institution: Hochschule Mittweida
Granting Institution: Hochschule Mittweida
Release Date: 2024/02/06
GND Keyword: Maschinelles Lernen; Zeitreihe; Vektorquantisierung
Page Number: 74
Institutes: Angewandte Computer- und Biowissenschaften
DDC classes: 006.31 Machine learning
Open Access: Freely accessible
Licence (German): Urheberrechtlich geschützt (protected by copyright)