Many companies use machine learning techniques to support decision-making and automate business processes by learning from the data they have. In this thesis we investigate the theory behind the machine learning algorithms most widely used in practice for solving classification and regression problems.
In particular, the following algorithms were chosen for the classification problem: Logistic Regression, Decision Trees, Random Forest, Support Vector Machine (SVM), Learning Vector Quantization (LVQ). As for the regression problem, Decision Trees, Random Forest and Gradient Boosted Tree were used. We then apply those algorithms to real company data and compare their performances and results.
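As a minimal illustration of one of the algorithms listed above (a generic sketch with synthetic data, not code or results from the thesis), logistic regression can be fit with plain gradient descent on the log-loss:

```python
import numpy as np

def train_logistic_regression(X, y, lr=0.1, epochs=500):
    """Fit a binary logistic-regression classifier with plain gradient descent."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid activation
        grad_w = X.T @ (p - y) / len(y)          # gradient of the log-loss
        grad_b = np.mean(p - y)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

def predict(X, w, b):
    return (1.0 / (1.0 + np.exp(-(X @ w + b))) >= 0.5).astype(int)

# Toy linearly separable data: class is 1 when the feature sum is positive.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X.sum(axis=1) > 0).astype(int)
w, b = train_logistic_regression(X, y)
accuracy = float(np.mean(predict(X, w, b) == y))
```

The same fit/predict interface generalizes to the other listed classifiers, which is what makes a side-by-side comparison on company data straightforward.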
In the following study the properties of the superabsorbent polymer Broadleaf P4 were investigated with the aim of applying that polymer within constructed wetlands. The application of the polymer in constructed wetlands is intended to improve the removal of pesticides. To this end, the polymer was added, together with pumice, to lab-scale wetlands, which were compared to a control wetland filled with gravel. The wetlands ran for several weeks, during which the nutrient removal was recorded. The polymer was also tested for its ability to adsorb the pesticides before they were added to the wetland beds.
To investigate the effects of climate change on interactions within ecosystems, a microcosm experiment was conducted. The effects of temperature increase and predator diversity on Collembola communities and their decomposition rate were investigated. The predators used were mites and Chilopods, whose predation effects on several response variables were analysed. This data included Collembola abundance, biomass and body mass as well as basal respiration and microbial biomass carbon. These response variables were tested against the predictors in several models. Temperature showed high significance in interaction with mite abundance in almost all models. Furthermore, the results of the basal respiration and microbial biomass carbon support the suggestion of a trophic cascade within the animal interaction.
In this paper, we designed, implemented, and tested a special surveillance camera system based on a combination of classical image processing algorithms. The system’s sub-objective consists of tracking experimental vehicles driving on a defined trajectory (rail) in real time. Furthermore, it analyzes the scene to collect additional vehicle- and rail-related information. The system then uses the gathered data to reach its main objective, which consists of independently predicting vehicle collisions. Consequently, we propose a hybrid method for detecting and tracking ATLAS vehicles efficiently. To detect the vehicle at the beginning of the video, periodically every n frames, and whenever the tracked vehicle has been lost, we used histogram back-projection. A kernelized correlation filter, by contrast, is used to track the detected vehicles. Combining these two methods provides one of the best trade-offs between accuracy and speed, even on a single processing core. The proposed method achieves the best performance compared with three different approaches on a custom dataset.
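The detection step can be sketched in a few lines. The following pure-NumPy back-projection is a simplified stand-in for the full pipeline (scene and patch data are invented for illustration): each pixel is replaced by the weight of its intensity bin in a model histogram, so pixels resembling the target light up.

```python
import numpy as np

def back_project(image, model_hist, bins=16):
    """Replace every pixel by the model-histogram weight of its intensity bin."""
    idx = np.clip((image.astype(int) * bins) // 256, 0, bins - 1)
    return model_hist[idx]

# Model histogram learned from a "vehicle" patch (uniformly bright pixels).
patch = np.full((10, 10), 220, dtype=np.uint8)
hist, _ = np.histogram(patch, bins=16, range=(0, 256))
hist = hist / hist.sum()                      # normalise to probabilities

# Dark scene containing one bright vehicle-like blob.
scene = np.zeros((50, 50), dtype=np.uint8)
scene[20:30, 20:30] = 220
likelihood = back_project(scene, hist)
cy, cx = np.unravel_index(np.argmax(likelihood), likelihood.shape)
```

The maximum of the likelihood map marks the detected blob; a tracker such as KCF can then be initialised on that region.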
DropConnect (a generalization of Dropout) is a very simple regularization technique that was introduced a few years ago and has become extremely popular because of its simplicity and effectiveness. In this thesis, a suitable architecture for applying DropConnect to learning vector quantization networks is proposed, along with a reference implementation and experimental results. In many classification tasks, the uncertainty of the model is a vital piece of information for experts. Methods to extract the uncertainty and stability using DropConnect are also proposed, and the corresponding experimental results are documented.
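The DropConnect idea itself can be sketched as follows (an illustrative dense-layer example, not the thesis's LVQ architecture): during training, individual weights rather than activations are zeroed at random.

```python
import numpy as np

def dropconnect_forward(x, W, b, drop_prob=0.5, rng=None, train=True):
    """One dense layer with DropConnect: individual WEIGHTS (not activations,
    as in Dropout) are zeroed at random during training."""
    if train:
        rng = rng or np.random.default_rng()
        mask = rng.random(W.shape) >= drop_prob    # keep each weight with prob 1-p
        W_eff = (W * mask) / (1.0 - drop_prob)     # rescale to preserve expectation
    else:
        W_eff = W                                  # inference uses the full weights
    return x @ W_eff + b

rng = np.random.default_rng(42)
x = np.ones((1, 4))
W = np.ones((4, 3))
b = np.zeros(3)
out_train = dropconnect_forward(x, W, b, drop_prob=0.5, rng=rng)
out_eval = dropconnect_forward(x, W, b, train=False)
```

Repeating the stochastic forward pass at test time yields a distribution of outputs, which is one way to read off the model uncertainty mentioned above.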
This thesis deals with performance measurement during strategy development in the field of procurement, using DANIELI Austria as an example. The main goal is to present the system of Danieli Austria in order to generate an understanding of performance measurement in practice. The knowledge gained from this is then used in an analysis to show the actual savings in these countries. Due to management directives, DANIELI Austria is required to source 20% of its purchasing volume in the former Yugoslavia, which is why opportunities as well as risks in the countries of Bosnia and Herzegovina, Serbia and Croatia are presented in order to optimize the purchasing strategy.
The loss of photoreceptors is a major cause of visual impairment and blindness, with no cure currently established. Photoreceptor replacement in mouse models of retinal degeneration is currently being investigated as a potential future therapy. To evaluate visual function in mice before and after treatment, two vision-based behavioral tests (optomotor tracking and the light/dark box) were investigated, including their feasibility for distinguishing between rod and cone photoreceptor function. Both methods turned out to be an objective and reliable readout of visual ability in wildtype mice and in mice with vision impairment due to retinal degeneration. The capability of the methods to assess slight vision improvements has to be further evaluated.
Therefore, options for improving the established tests and an idea for a new test paradigm have been introduced.
In this work a new method for the prediction of Xaa-proline (where Xaa is any amino acid) cis/trans isomerization was investigated. By extracting twelve structural features (real secondary structure, inside/outside classification, properties of the environment around the proline and of the proline itself), a support vector machine (SVM) based prediction approach was devised. The Java software Xaa-PIPT for structural feature extraction was developed. Based on 4397 (2199 cis and 2198 trans) prolines extracted from non-redundant, globular proteins, a classifier was trained using the radial basis function (RBF) kernel. In ten-fold cross-validation it achieved an accuracy of 70.0478 %, a Matthews correlation coefficient (MCC) of 0.4223, a sensitivity of 0.5433 and a specificity of 0.8576. Based on this classifier, a lightweight and easy-to-use Java software tool, called mXaa-PIPT, for the prediction of Xaa-proline cis/trans isomerization was developed. It was shown that there are correlations between the environment surrounding a proline and its isomerization state. mXaa-PIPT can be used for the evaluation of low-resolution protein structures and theoretical models to improve their quality through the prediction of Xaa-proline isomerization.
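The reported quality measures can all be computed directly from confusion-matrix counts; the counts used below are illustrative, not the thesis's actual results.

```python
import math

def binary_metrics(tp, tn, fp, fn):
    """Accuracy, MCC, sensitivity and specificity from confusion-matrix counts."""
    acc = (tp + tn) / (tp + tn + fp + fn)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0  # Matthews corr. coeff.
    sens = tp / (tp + fn)          # recall on the positive (e.g. cis) class
    spec = tn / (tn + fp)          # recall on the negative (trans) class
    return acc, mcc, sens, spec

acc, mcc, sens, spec = binary_metrics(tp=40, tn=45, fp=5, fn=10)
```

MCC is preferred over raw accuracy here because it stays informative even when the cis/trans classes are unbalanced.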
As widely discussed in the literature, spatial patterns of amino acids, so-called structural motifs, play an important role in protein function. The functionally responsible part of a protein often lies in an evolutionarily highly conserved spatial arrangement of only a few amino acids, which are held tightly in place by the rest of the structure. In general, these motifs can mediate various functional interactions, such as DNA/RNA targeting and binding, ligand interactions, substrate catalysis, and stabilization of the protein structure.
Hence, characterizing and identifying such conserved structural motifs can contribute to the understanding of structure-function relationships in diverse protein families. For this reason, and because of the rapidly increasing number of solved protein structures, it is highly desirable to identify, understand and, moreover, search for structurally scattered amino acid motifs. The aim of this work was the development and implementation of a matching algorithm to search for such small structural motifs in large sets of target structures. Furthermore, motif matches were extensively analyzed, statistically assessed and functionally classified. Following a novel approach, hierarchical clustering was combined with functional classification and used to deduce evolutionary structure-function relationships. The proposed methods were combined and implemented in a feature-rich and easy-to-use command line software tool, which is freely available and contributes to the field of structural bioinformatics research.
Footage of organoids taken by means of fluorescence microscopy and segmented as well as triangulated by image analysis software such as LimeSeg and Mastodon often needs to be visualized in an aesthetic manner for presenting the results in scientific papers, talks and demonstrations. The goal of this work was to create an easy-to-use addon “Biobox” for the open-source 3D visualization package “Blender”, which allows importing triangulated 3D data animated over time (4D), produced by image analysis software, and optimizing it for efficient usage. “Biobox” offers several visualization tools for the creation of rendered images and animation videos by biologists.
The optimization of imported data was performed using Blender’s internal modifiers. The optimized data can then be visualized using several tools built for displaying the organoid in frozen, animated and semi-transparent manners. A dynamic link for object selection and dynamic data exchange between Blender and Mastodon was developed. Additionally, a user interface was developed for manually correcting segmentation errors and steering the object detection algorithms of LimeSeg. The developed addon “Biobox” was benchmarked on real scientific data. The benchmark demonstrated that the developed optimizations result in a significant (~5-fold) decrease in RAM usage and an acceleration of visualization by more than 160 times.
This master’s thesis was written in cooperation with the Spanish company sí-internships. The main objective of this work is to develop an effective promotion strategy for this startup while spending as few financial resources as possible. To this end, extensive research on the current internal, external and integral market situation follows. Building on the results of this analysis, promotional objectives are determined and a target audience is chosen. Finally, a promotion strategy is established.
Reducing costs is an important part of today’s business. Therefore, manufacturers try to reduce unnecessary work processes and storage costs. Machine maintenance is a big, complex, regular process; in addition, the spare parts required for it must be kept in stock in case a machine fails. In order to avoid a production breakdown in the event of an unexpected failure, more and more manufacturers rely on predictive maintenance for their machines. This enables more precise planning of necessary maintenance and repair work, as well as precise ordering of the required spare parts. A large amount of past as well as current information is required to create such a predictive forecast about machines. Through the classification of motors based on vibration, this paper deals with the implementation of predictive maintenance for thermal systems. It gives an overview of suitable sensors and data processing methods, as well as various classification algorithms. In the end, the best sensor-algorithm combinations are shown.
In the field of blockchain technology applications and research, non-fungible tokens (NFTs) have gained significant attention in recent years. Whilst current research focuses on NFT use cases or the purchase of NFTs from an investor’s perspective, the NFT launch (i.e. the primary market) from a creator’s perspective remains unexplored. However, the launch strategy is considered an important factor in the success of a product. Therefore, our research paper aims to explore launch strategies for NFTs. We discuss the marketing mix instruments price (i.e. pricing strategy), place (i.e. mint mechanism), and promotion. Through an empirical approach of conducting eight expert interviews, we examine the parameters used to define an NFT launch strategy and assess how different stakeholders prefer them.
This scientific work reveals the potential for the development of the renewable energy market, driven by several factors: the unstable political situation in the world, rising energy prices, environmental degradation and the growing demand of German residents for government measures to reduce the negative impact on the environment. This work is concerned with business planning and development using strategies based on the above factors. The purpose of the study is to develop methods for successfully regulating the market for renewable resources in order to address environmental pollution through the promotion of environmentally friendly products. The work explores the driving forces behind, and the problems hindering, the development of the market for renewable resources. The problems raised concern all interested parties, from consumers and producers to the state body for regulating and stimulating the industry. An analysis was also made of the methods of environmentally oriented companies and the tools they use to strengthen their positions in the market. Based on the data obtained from the conducted research, a concept and business strategy for a new environmentally oriented business consulting company, “Sun’s generation”, was created. The idea of the new company is to involve all parties using marketing tools, creating a healthy competitive environment among commercial companies and benefiting not only the companies themselves but also the end users of the products and the German government.
The purpose of this study is, in its first part, to investigate the effects of provocative marketing strategies of different companies in the fashion industry. The thesis emphasizes various strategies used by several firms and demonstrates the different modes of provocation as well as the process of a marketing strategy. The second part highlights the opportunities and risks of provocative marketing strategies based on American Apparel.
FUSO is one of Japan’s leading manufacturers of trucks and buses in the world and an integral part of Daimler AG. As a large manufacturer of trucks and buses, FUSO faces marketing issues due to corrosion, which is one of the major causes of breakdowns and of degraded vehicle performance. To counter this issue, FUSO initiated a new project called the “Anti-Corrosion Project”. The main mission of this project is to improve the corrosion resistance of the metal parts. Currently, almost 70 percent of FUSO’s parts fall under Grade III, i.e. less than one year of corrosion resistance.
In this project, corrosion issues were collected through different types of audits, both from customers and from inspecting two-year-old vehicles in the worst conditions. The listed corrosion issues were then investigated against the current specification, and new proposals were requested from suppliers. For each proposed solution the cost is estimated internally and negotiated with the supplier; afterwards it is presented to top management for approval. In the case of a higher corrosion specification, parts are taken from the production line and tested in FUSO’s in-house material lab. Finally, for an approved proposal, a drawing change is requested and the new proposal is implemented. The entire project had to be coordinated with all the different departments, and working with these teams gave deeper knowledge about the causes of the issues.
In parallel to this project, the focus was on shop-floor developments in the return parts management (RPM) area. FUSO is also responsible for after-sales service; in other words, FUSO provides a warranty for parts that break down within three years. Breakdown parts are delivered by the customers through dealers for warranty claims, and are therefore called Warranty Part Investigation (WPI) parts. Sometimes a customer wants to know the cause of a breakdown even though the warranty has expired; in this case the company investigates the cause but does not cover the part under warranty. These kinds of parts are known as Product Quality Report (PQR) parts.
The company has a separate shop floor for return parts, which are received directly by the company. RPM comprises four processes: inwarding, pre-analysis, investigation, and dispatch or scrap.
Usually the company received 30-50 parts per day, but it recently decided to accept all breakdown parts, which increased the delays in inwarding and the subsequent processes. To solve this, a standard layout and process were constructed. One of the main reasons for the inwarding delay was extensive documentation that was largely unnecessary; this work was automated and digitalized. The improvements were carried out using lean manufacturing project methodology, resulting in a higher inward rate of failure parts and less inventory.
Embeddings for Product Data
(2022)
The e-commerce industry has grown exponentially in the last decade, with giants like Amazon, eBay, Aliexpress, and Walmart selling billions of products. Machine learning techniques can be used within the e-commerce domain to improve the overall customer journey on a platform and increase sales. Product data in particular can be used for various applications, such as product similarity, clustering, recommendation, and price estimation. For product data to be used in such applications, we have to perform feature engineering: the idea is to transform the products into feature vectors before training a machine learning model on them. In this thesis, we propose an approach to create representations for heterogeneous product data from Unite’s platform in the form of structured tabular records. These tables consist of attributes holding different information, ranging from product IDs to long descriptions. Our model combines popular deep learning approaches from natural language processing to create numerical representations with mostly non-zero elements, so-called dense representations, for all products. To evaluate the quality of these feature vectors, we validate how well the similarities between products are captured by the dense representations. The evaluations are divided into two categories. The first category directly compares the similarities between individual products. The second category uses the dense vectors as inputs in one of the above-mentioned applications and evaluates their quality based on the accuracy or performance of that application. As a result, we explain the impact of different steps within our model on the quality of the learned representations.
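As a toy illustration of dense representations and similarity-based evaluation (a deterministic hashing stand-in for the learned deep embeddings; product texts and the bucket function are invented for this sketch):

```python
import numpy as np

def _bucket(token, dim):
    """Deterministic toy hash: map a token to one of `dim` vector components."""
    return sum(ord(c) for c in token) % dim

def hash_embed(text, dim=32):
    """Toy stand-in for a learned embedding: products sharing words
    end up with similar unit-length vectors."""
    v = np.zeros(dim)
    for token in text.lower().split():
        v[_bucket(token, dim)] += 1.0
    norm = np.linalg.norm(v)
    return v / norm if norm else v

def cosine_similarity(a, b):
    return float(a @ b)  # vectors are already unit-normalised

screw_a = hash_embed("hex head screw steel m8")
screw_b = hash_embed("hex head screw steel m6")
chair = hash_embed("ergonomic office chair black")
sim_screws = cosine_similarity(screw_a, screw_b)
sim_mixed = cosine_similarity(screw_a, chair)
```

The first evaluation category mentioned above boils down to checks like this: near-duplicate products should score much higher than unrelated ones.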
In the practice of software engineering, project managers often face the software project scheduling problem (SPSP), a variant of the resource-constrained project scheduling problem. In software project scheduling, the main resources are the employees, each with a skill set and a required salary. The purpose of software project scheduling is to assign the tasks of a project to the available employees such that the total cost and duration of the project are minimized, while ensuring that the constraints of the schedule are fulfilled. SPSP is a complex combinatorial optimization problem whose search space grows exponentially with the number of tasks and employees, which makes it NP-hard. Since the goal is to minimize both the total cost and the duration of the project, it is also a multi-objective problem. Many algorithms have been proposed that claim to give near-optimal results for NP-hard problems, but only a few yield feasible sets of solutions for SPSP, so more efficient algorithms that produce feasible results are still sought.
Nowadays, many such problems are solved using nature-inspired algorithms, because these algorithms balance exploration and exploitation. For solving SPSP, several nature-inspired algorithms have been used, e.g. genetic algorithms, Ant Colony Optimization (ACO), and the Firefly algorithm. Nature-inspired algorithms such as particle swarm optimization, genetic algorithms and ACO provide more promising results than naive and greedy algorithms; however, there is always room for improvement. The main purpose of this research is to use the bat algorithm to obtain efficient results and solutions for the software project scheduling problem. In this work a modified bat algorithm is implemented in which a different random-walk approach is used. The contributions of this thesis are: (1) to adapt and apply a modified multi-objective bat algorithm for solving SPSP efficiently, (2) to adapt and apply other nature-inspired algorithms, such as genetic algorithms, for solving SPSP, and (3) to compare and analyze the results obtained by the applied nature-inspired algorithms and draw conclusions.
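A minimal single-objective bat algorithm in the style of Yang's original formulation might look as follows (the thesis's modified multi-objective variant and its random walk differ; the objective and all parameters here are illustrative, minimizing the sphere function):

```python
import numpy as np

def bat_algorithm(objective, dim=2, n_bats=20, iters=200, seed=1):
    """Minimal single-objective bat algorithm: frequency-tuned velocity
    updates toward the global best, plus a local random walk."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5, 5, (n_bats, dim))
    vel = np.zeros((n_bats, dim))
    fit = np.array([objective(p) for p in pos])
    best = pos[np.argmin(fit)].copy()
    loudness, pulse_rate = 0.9, 0.5
    for _ in range(iters):
        for i in range(n_bats):
            freq = rng.uniform(0, 2)                      # frequency tuning
            vel[i] = np.clip(vel[i] + (pos[i] - best) * freq, -5, 5)
            cand = np.clip(pos[i] + vel[i], -5, 5)
            if rng.random() > pulse_rate:                 # local random walk
                cand = best + 0.1 * rng.normal(size=dim)
            f_cand = objective(cand)
            if f_cand < fit[i] and rng.random() < loudness:
                pos[i] = cand                             # accept the move
                fit[i] = f_cand
            if f_cand < objective(best):
                best = cand.copy()                        # update global best
    return best, objective(best)

sphere = lambda x: float(np.sum(x ** 2))
best, best_val = bat_algorithm(sphere)
```

For SPSP, `objective` would instead score a task-to-employee assignment by cost and duration, e.g. via a weighted sum or Pareto ranking.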
Financial fraud can cause huge monetary losses for banks. Studies have shown that, if not mitigated, financial fraud can lead to bankruptcy for big financial institutions and even insolvency for individuals. Credit card fraud is an ever-growing type of financial fraud. These numbers are expected to increase exponentially in the future, which is why many researchers are focusing on machine learning techniques for detecting fraud. This, however, is not a simple task, for mainly two reasons:
• varying behaviour in committing fraud
• high level of imbalance in the dataset (the majority of normal or genuine cases largely outnumbers the number of fraudulent cases)
When an unbalanced dataset is provided as input, a predictive model usually tends to be biased towards the majority class.
In this thesis, this problem is tackled by implementing a data-level approach in which different resampling methods, such as undersampling, oversampling, and hybrid strategies, along with bagging and boosting algorithmic approaches, have been applied to a highly skewed dataset with 492 identified frauds out of 284,807 transactions.
Predictive modelling algorithms like Logistic Regression, Random Forest, and XGBoost have been implemented along with different resampling techniques to predict fraudulent transactions.
The performance of the predictive models was evaluated based on the area under the Receiver Operating Characteristic curve (AUC-ROC), the area under the Precision-Recall curve (AUC-PR), and the precision, recall, and F1-score metrics.
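The data-level idea can be sketched with random undersampling (a miniature invented dataset, not the credit card data): majority-class samples are randomly discarded until the classes are balanced.

```python
import random

def undersample(X, y, seed=0):
    """Balance a binary dataset by randomly discarding majority-class samples."""
    rng = random.Random(seed)
    pos = [i for i, label in enumerate(y) if label == 1]
    neg = [i for i, label in enumerate(y) if label == 0]
    major, minor = (neg, pos) if len(neg) > len(pos) else (pos, neg)
    keep = sorted(minor + rng.sample(major, len(minor)))
    return [X[i] for i in keep], [y[i] for i in keep]

# 492 frauds among 284,807 transactions is ~0.17 % positives; in miniature:
X = list(range(1000))
y = [1] * 5 + [0] * 995
Xb, yb = undersample(X, y)
```

Oversampling and hybrid strategies work the same way in reverse: instead of discarding majority samples, minority samples are duplicated or synthesized.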
The aim of this bachelor thesis is to find out how the use of artificial intelligence, specifically the one used in combat situations, can increase the playing time or even the replay value of games in the action role-playing genre. Thereby, it focuses mainly on combat situations between a player and an artificial intelligence.
To begin with, this bachelor thesis examines the action role-playing genre in order to find a suitable definition for it. Accordingly, action role-playing games involve titles that send the player on a hero’s journey-like adventure in which they must prove their skills in combat against virtual opponents. The greatest challenge of these real-time battles comes from the required quick reflexes, skill queries and hand-eye coordination.
Next, six means of increasing the replayability of a game are explored: Experience and Nostalgia, Variety and Randomness, Goals and Completion, Difficulty, Learning, and Social Aspect. The paper then proceeds to give an explanation for the term Artificial Intelligence and examines the various methods used to create intelligent behavior as well as the general advancement of the research field. Special attention is given to the implementation methods of Finite State Machines and Behavior Trees, as they are the most widely used methods for creating behavioral patterns of virtual characters.
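A finite state machine of the kind mentioned above can be sketched as a transition table; the states and events below are invented for illustration.

```python
# Hypothetical enemy states and events; the transition table drives behaviour.
TRANSITIONS = {
    ("patrol", "player_spotted"): "attack",
    ("attack", "low_health"): "flee",
    ("attack", "player_lost"): "patrol",
    ("flee", "health_restored"): "patrol",
}

class CombatFSM:
    def __init__(self, start="patrol"):
        self.state = start

    def handle(self, event):
        # Unknown (state, event) pairs leave the state unchanged.
        self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state

enemy = CombatFSM()
trace = [enemy.handle(e) for e in
         ["player_spotted", "low_health", "health_restored"]]
```

Behavior trees generalize this idea: instead of a flat table, conditions and actions are composed hierarchically, which scales better for complex characters.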
Finally, a study conducted as part of the bachelor thesis is described, which compares a mathematically balanced artificial intelligence with a behaviorally balanced one in terms of game performance regarding the willingness of test subjects to purchase and play through the game as well as its replay value. The thesis concludes with the findings that while the behavioral approach is more promising than the mathematical approach, a combination of the two methods ultimately leads to the best outcome. Furthermore, the study shows that the use of artificial intelligence to individualize gaming experiences is promising for the future of the gaming industry.
In this thesis two novel methods for removing undesired background illumination are developed: a wavelet-analysis-based approach and an enhancement of a deep learning method. These methods have been compared with conventional methods using real confocal microscopy images and synthetically generated microscopy images. The synthetic images were created using a generator introduced in this thesis.
Global challenges like climate change, food security, and infectious diseases such as the COVID-19 pandemic are nearly impossible to tackle when established experts and upstart innovators work in silos. If research organizations, governments, universities, NGOs, and the private sector could collaborate on these challenges more easily, lasting solutions would certainly come more quickly. Aligned with the United Nations’ Sustainable Development Goals, SAIRA connects key players in different arenas: scientists and engineers at research and technology organizations (RTOs) looking to collaborate on sustainable development projects, companies seeking R&D support to tackle their most challenging problems, and startups with innovative ideas and a desire to scale. The platform is a blockchain-secured open innovation platform, anchored on the Max Planck Digital Library's blockchain network bloxberg, that assures the authenticity and integrity of all user-generated content and collaboration processes.
Computationally solving eigenvalue problems is a central problem in numerical analysis and as such has been the subject of extensive study. In this thesis we present four different methods to compute eigenvalues, each with its own characteristics, strengths and weaknesses. After formally introducing the methods we use them in various numerical experiments to test speed of convergence, stability as well as performance when used to compute eigenfaces, denoise images and compute the eigenvector centrality measure of a graph.
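One classic method in this family is power iteration, which repeatedly multiplies a vector by the matrix so that the dominant eigenpair emerges (a generic sketch; the thesis does not name its four methods here, so this is illustrative):

```python
import numpy as np

def power_iteration(A, iters=500, seed=0):
    """Approximate the dominant eigenpair of a square matrix A."""
    rng = np.random.default_rng(seed)
    v = rng.normal(size=A.shape[0])
    for _ in range(iters):
        v = A @ v                 # amplify the dominant eigendirection
        v /= np.linalg.norm(v)    # renormalise to avoid overflow
    eigenvalue = v @ A @ v        # Rayleigh quotient at the converged vector
    return eigenvalue, v

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
lam, vec = power_iteration(A)
```

The convergence rate depends on the gap between the two largest eigenvalue magnitudes, which is exactly the kind of property the numerical experiments above probe; eigenfaces and eigenvector centrality both reduce to finding such dominant eigenvectors.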
This thesis provides an overview of Generation Z with a focus on Mittweida University of Applied Sciences students. It explores the general issues of students' behavior in life, as well as their attitudes toward the financial and banking sectors. It also examines the German banking market, its strengths and weaknesses in attracting new clients. At the end, possible strategies for the development of the bank in terms of attractiveness for young people are provided.
While blockchain technology is still in an early stage of its development, it is already of surging economic importance.
In the literature, blockchain is referred to as either a disruptive, institutional, foundational, or general purpose technology. There is still no consensus about the economic theory that should apply for analyzing its economic effects. This article draws on use cases from the coffee supply chain to explore which theories could potentially apply to an emerging blockchain economy.
Aminoacyl-tRNA synthetases (aaRSs) are key enzymes in the process of protein biosynthesis, charging tRNA molecules with their corresponding amino acid. Whereas adenosine phosphate fixation is common to all aaRSs, recognition of the respective amino acid to ensure correct translation poses a complex task, which is still not understood to its full extent. Using all aaRS structures in the Protein Data Bank (PDB), this thesis reveals further details about the specificity-determining interactions of each aaRS. Moreover, inspection of the similarities between these enzymes using the structure-derived interaction data reinforces the sequence-based evolutionary trace of aaRSs to a certain degree: the concurrent development of two distinct classes of aaRS is apparent at the functional level, and previously determined evolutionary subclasses coincide with specific aminoacyl recognition in each aaRS type. Still, the discrimination of amino acids in aaRSs involves a multitude of further relevant mechanisms. Eventually, the analysis of specificity-relevant binding site interactions sheds light on how aaRSs evolved to distinguish different amino acids.
We use machine learning for the selection and classification of single-molecule trajectories to replace commonly used user-dependent sorting algorithms. Measured fluorescence time series of labelled single molecules need to be sorted into ‘good’ and ‘bad’ molecules before further kinetic and thermodynamic analysis.
Currently, processing, sorting and analysis of the data is mainly done with the help of laboratory specific programs.
Although there are freely available programs for processing smFRET data, they either do not offer ‘molecular sorting’ or it is purely empirical. Only recently have new approaches emerged to solve this problem by means of machine learning. Here, we describe a sound terminology for molecular sorting of smFRET data and present an efficient workflow for manual annotation followed by the training of the ML algorithm. Descriptive statistics of our generated dataset are provided and will serve as the basis for supervised ML-based molecular sorting algorithms yet to be developed.
This paper examines the communication channels used by innovation projects at the ProtoSpace Hamburg when engaging with stakeholders, and tries to answer the thesis question of whether new media channels, when used for this communication, improve the chances of success for innovation projects. Interviews with eight experts in communication, innovation and stakeholder management were conducted and then analyzed through the application of Mayring’s qualitative content analysis in order to answer the posed question.
This Bachelor thesis provides an experimental validation of the “si-Fi” software, which was designed for RNAi off-target searches and silencing efficiency predictions. The experimental approach is based on using synthetic DNA as both the RNAi target and the RNAi trigger sequence. The data were generated by two different types of experiments using a transient gene silencing system in bombarded barley epidermal cells. The efficiency of RNAi was estimated by scoring the effect of silencing the susceptibility-related gene Mlo on the resistance of transformed cells to the powdery mildew fungus Blumeria graminis f. sp. hordei, and by observing the reduction of fluorescent signals coming from an RNAi target fused to the green fluorescent protein. The aim of this work was a comparison between the in silico prediction of RNAi efficiency and off-target effects in barley and the experimental data.
Robust soft learning vector quantization (RSLVQ) is a probabilistic variant of the Learning Vector Quantization (LVQ) algorithm. The RSLVQ approach is based on a Gaussian mixture model, and its cost function is defined in terms of a likelihood ratio. This thesis modifies standard RSLVQ to use non-Gaussian density functions such as the logistic, lognormal, and Cauchy densities (referred to as PLVQ). In this approach, we derive new update rules for the prototypes from the gradient of the cost function with respect to the non-Gaussian density functions. We also derive new learning rules for the model parameters by differentiating the cost function with respect to them. The main goal of the thesis is to compare the performance of the PLVQ model with the Gaussian RSLVQ model. Therefore, the performance of these classification models has been tested on the Iris and Seeds datasets. To visualize the results of the classification models in an adequate way, principal component analysis (PCA) has been used.
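To make the idea concrete, here is a minimal sketch (not the thesis code) of one stochastic RSLVQ-style update in which the Gaussian component density is swapped for a logistic one. The density form, the scale parameter `s`, and the simplified gradient factor `(x - w)` are illustrative assumptions.

```python
import numpy as np

def logistic_density(x, w, s=1.0):
    """Logistic density evaluated on the distance between sample x and prototype w."""
    d = np.linalg.norm(x - w)
    e = np.exp(-d / s)
    return e / (s * (1.0 + e) ** 2)

def rslvq_step(x, y, prototypes, labels, lr=0.1, s=1.0):
    """One stochastic update of the likelihood-ratio cost: prototypes of
    the correct class are attracted, all prototypes are repelled in
    proportion to their overall responsibility for x."""
    dens = np.array([logistic_density(x, w, s) for w in prototypes])
    correct = (labels == y)
    p_all = dens / dens.sum()                 # P(prototype | x)
    p_cor = np.where(correct, dens, 0.0)
    p_cor = p_cor / p_cor.sum()               # P(prototype | x, y)
    return np.array([w + lr * (pc - pa) * (x - w)
                     for w, pc, pa in zip(prototypes, p_cor, p_all)])
```

The difference `p_cor - p_all` is positive only for well-responsible prototypes of the correct class, which reproduces the attraction/repulsion structure of the likelihood-ratio cost.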
For monitoring laser beam welding processes and detecting or actively avoiding process defects, acoustics-based measurements can be used in addition to optical measurement methods such as pyrometry. To reliably detect process events, it is essential to position the respective sensors in such a way that specific signal characteristics are reproducible and significant. However, there are only a few investigations regarding the positioning of airborne sound sensors, especially for the detection of process emissions in the ultrasonic range. Therefore, in this research, the influence of the process distance as well as the angle and orientation of the microphone relative to a laser beam deep penetration welding process is investigated with respect to the detectability of process emissions in different frequency bands. It is shown that, for a wide ultrasonic range, a flat sensor angle with respect to the sample surface leads to an increased signal strength of the acoustic emissions compared to steep angles.
The theoretical foundations of enterprise management using information technology were reviewed, the effectiveness of the use of information systems in the enterprise was analyzed, and ways of improving the enterprise management mechanism using information systems (using the example of Mars Wrigley Confectionery Belarus) were developed.
Standard assembly time is an important piece of data in product development that is used to compare different product variants or manufacturing variants. In the presented approach, standard time is determined with the use of a decision tree covering standard manual and machine-manual operations, taking into consideration product characteristics and typical tools, equipment, and layout. The analysed features include, among others, information determined during product development, such as product structure, part characteristics (e.g. weight, size), and connection type, as well as information determined during assembly planning: tools (e.g. hand screwdriver, power screwdriver, pliers), equipment (e.g. press, heater), and workstation layout (e.g. distance, way of feeding). The object-attribute-value (OAV) framework was applied to the assembly characteristics. An example of applying the decision tree to predict standard assembly time is presented for a mechanical subassembly; the case study is dedicated to standard time prediction for a bearing assembly. The presented approach is particularly important for enterprises that offer customized products.
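A decision tree over OAV-style features can be sketched as nested splits. The attribute names, thresholds, and times below are invented for illustration and do not come from the paper.

```python
def standard_assembly_time(part):
    """Hypothetical decision tree predicting standard assembly time (seconds).

    `part` is an object-attribute-value (OAV) style dict; every split
    threshold and leaf time here is illustrative only."""
    if part["connection"] == "screw":
        if part["tool"] == "power screw driver":
            # light parts are faster to handle than heavy ones
            return 6.0 if part["weight_kg"] < 1.0 else 9.0
        return 12.0          # hand screw driver
    if part["connection"] == "press_fit":
        return 8.0 if part["equipment"] == "press" else 15.0
    return 20.0              # fallback for other connection types
```

In practice such a tree would be learned from recorded assembly data rather than written by hand.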
Cryptocurrencies are characterized by high volatility, both in the short and long term. Experienced traders exploit this to profit from price fluctuations by swing trading. However, this requires closely observing and analyzing the prices and taking trading positions at the right time. Only a few specialists who devote their time to this, or optimized trading bots, are able to actually make continuous profits. The autradix protocol is a self-optimizing and self-learning parametric trading algorithm that analyzes price actions in real time and adaptively optimizes the algorithm's parameters to realize the user's investment objective. Embedded in an adaptive genetic algorithm, possible parameterizations are simulated and the optimal ones for the investigated trading pairs are determined. The generic trading protocol API enables coupling with various crypto exchanges and decentralized protocols. A smart-contract-based decentralized, trustless, and tokenized fund, controlled by a DAO, enables users to invest, operate trading agents, and participate in the profits generated according to their share.
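The genetic optimization of trading parameters can be illustrated with a generic sketch. The population size, mutation scale, and truncation selection scheme are assumptions, and the fitness function would in practice be a backtest over historical price data rather than the toy function used below.

```python
import random

def evolve(fitness, bounds, pop_size=20, generations=30, seed=0):
    """Minimal genetic loop: truncation selection with Gaussian mutation.

    `fitness` maps a parameter vector to a score to be maximized;
    `bounds` is a list of (lo, hi) ranges, one per parameter. Keeping
    the parents unchanged gives simple elitism."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = [[min(max(g + rng.gauss(0, 0.1 * (hi - lo)), lo), hi)
                     for g, (lo, hi) in zip(p, bounds)]
                    for p in parents]
        pop = parents + children
    return max(pop, key=fitness)
```

With a backtest as the fitness function, the returned vector would be the parameterization deployed for the trading pair.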
This thesis was written in order to demonstrate the expediency of startup ecosystem support and to develop practical recommendations for the Belarusian government based on an analysis of successful practices in the U.S. and Lithuania.
It covers the essence of a "startup company" and a "startup ecosystem" and provides an analysis of the socioeconomic impact of startup companies, with a particular focus on job creation. It sheds light on the best startup support policies in the U.S., where the most prominent startup ecosystems are operating, and in Lithuania, a country with preconditions similar to Belarus's and a rapidly developing ecosystem. Furthermore, this paper deals with Belarus's peculiarities regarding fostering startup ecosystem growth. It assesses the recent economic development of the Belarusian IT sector and gives an insight into its competitive advantages and challenges.
The paper is based on internet research using articles, presentations, reports and studies, websites, and official legal documents.
Purpose: This study addresses issues of occupational mental health among nurses. The research is guided by factors such as role, work, and social factors, stress, burnout, depression, absenteeism, and turnover intention. The purpose of this research paper is therefore to answer the question: "To what extent do nurses experience symptoms or risk of depression as a result of factors such as individual or demographic factors, emotional exhaustion, and stressful working situations?"
Design: Data were collected from 9 nurses working for major hospitals located in Germany (Mannheim and Heidelberg in Baden-Württemberg, and Bremen), Ukraine, and Ghana.
Methods: The study combines qualitative and quantitative research methods in a survey consisting of a questionnaire and biographical interviews.
The questionnaire was used to collect data on demographic and job characteristics, job-related stress, emotional labor, and depressive symptoms. The PHQ-9 serves to measure the depressive symptoms of the participants and acts as an instrument to back up the interviews conducted. The questionnaire was evaluated with SPSS version 21.0. Descriptive statistics, correlations, and frequencies were used to analyze and evaluate the data.
Results: The study found that all the participants who took part in this survey are depressed, ranging from minimal to moderate depression. The questionnaire classified approximately 20% of the participants as minimally depressed, 40% as mildly depressed, and 30% as moderately depressed. The questions targeted factors such as occupational stress and work strain, as well as recognition and appreciation from patients and the organization. 77.80% stated that they receive no recognition and appreciation from colleagues and patients. 44.40% turned out to be very stressed by their daily work routine, and the other 55.60% find it merely stressful. 100% identified work disruptions as a stress factor. All the participants are dissatisfied with their salary, which leads to job dissatisfaction.
Conclusion: The results show that it is necessary to implement programs that help nurses reduce job-related stress. Preventive and suitable methods should be considered to reduce mental strain before depression manifests itself.
Clinical Relevance: Introducing programs that may help nurses and their organizations is the task of human resource management in nursing organizations. Nursing administrators have to understand that the pace at which nurses have to work and deal with other stressful situations might cause them to suffer depressive symptoms. In order to address this, they can aspire to improve stressful working conditions and develop programs that reduce job-related stress and minimize the risk of depressive symptoms.
To enable smart devices of the Internet of Things to be connected to a blockchain, a blockchain client needs to run on this hardware. With the Trustless Incentivized Remote Node Network, Incubed for short, it is possible to establish a decentralized and secure network of remote nodes that enables trustworthy and fast access to a blockchain for a large number of low-performance IoT devices. Currently, Incubed supports the verification of Ethereum data. To serve a wider audience and more applications, this paper proposes the verification of Bitcoin data as well, which can be achieved thanks to the modularity of Incubed. This paper describes the proof data that is necessary for a client to prove the correctness of a node's response, as well as the process of verifying the response using this proof data. A proof object containing the proof data will be part of every response in addition to the actual result. We design, implement, and evaluate Bitcoin verification for Incubed: the creation of the proof data for the supported methods (on the server side) and the verification process using this proof data (on the client side) have been demonstrated. This enables the verification of Bitcoin in Incubed.
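A standard building block of such Bitcoin proof data is the Merkle branch that links a transaction to the transaction root committed in a block header. A minimal sketch of the client-side check, assuming Bitcoin's double-SHA256 over raw 32-byte digests (the paper's actual proof object may carry additional data such as the header chain):

```python
import hashlib

def dsha(b):
    """Bitcoin's double SHA-256."""
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

def verify_merkle(txid, branch, index, root):
    """Hash the txid up the tree, choosing the left/right position at
    each level from the bits of `index`, and compare with the committed
    Merkle root. All arguments are raw 32-byte digests except `index`."""
    h = txid
    for sibling in branch:
        if index & 1:
            h = dsha(sibling + h)   # our node is the right child
        else:
            h = dsha(h + sibling)   # our node is the left child
        index >>= 1
    return h == root
```

A low-performance client only needs the branch (log-sized in the number of transactions), not the full block, which is what makes this verification suitable for IoT devices.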
As the cryptocurrency ecosystem rapidly grows, interoperability has become increasingly crucial, enabling assets and data to interact seamlessly across multiple chains. This work describes the concept and implementation of a trustless connection between the Bitcoin Lightning Network and EVM-compatible blockchains, allowing the transfer of assets between the two ecosystems. Establishing such a connection can significantly contribute to the growth of both ecosystems, as they can benefit from each other's advantages and open up new possibilities.
The cryptocurrency ecosystem has seen significant growth, with Ethereum and Bitcoin as foundational pillars. Ethereum introduced smart contracts, revolutionizing decentralized applications (dApps) across various domains. Scalability challenges led to alternative ecosystems such as Binance Smart Chain and Polygon, which maintain compatibility through the Ethereum Virtual Machine (EVM). Bitcoin also faces scalability issues, leading to the development of the Lightning Network, an off-chain solution with payment channels for scalable instant transactions. Interoperability is increasingly crucial as the cryptocurrency ecosystem continues to grow, enabling seamless interactions between assets and data across multiple blockchain platforms. EVM-compatible blockchains and the Lightning Network offer unique advantages in their respective use cases. This paper utilizes atomic swaps to create a secure, fast, and user-friendly trustless bridge between the Lightning Network and EVM-compatible blockchains, fostering the growth of both ecosystems and unlocking novel opportunities.
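The core of an atomic swap is a hashed timelock contract (HTLC): the same secret preimage unlocks funds on both chains, and a timeout guarantees a refund if the counterparty disappears. A minimal sketch of the HTLC decision logic (the function shape and string recipients are illustrative, not the paper's contracts; Lightning uses the same hashlock/timelock pattern inside its payment channels):

```python
import hashlib

def make_htlc(hashlock, recipient, refund_to, timeout):
    """Return a claim function modelling one HTLC: funds go to `recipient`
    on a valid preimage before `timeout` (a unix timestamp), or back to
    `refund_to` once the timeout has passed."""
    def claim(preimage, now):
        if now >= timeout:
            return refund_to                      # refund path
        if hashlib.sha256(preimage).digest() == hashlock:
            return recipient                      # successful claim
        return None                               # nothing spendable yet
    return claim
```

Revealing the preimage to claim on one chain makes it public, which is exactly what lets the counterparty claim on the other chain, so the swap is atomic without a trusted intermediary.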
The number of Internet of Things (IoT) devices is increasing rapidly. The Trustless Incentivized Remote Node Network, IN3 (Incubed) for short, enables trustworthy and fast access to a blockchain for a large number of low-performance IoT devices. Although IN3 currently only supports the verification of Ethereum data, its modularity means it is not limited to one blockchain. This thesis describes the fundamentals, the concept, and the implementation of Bitcoin verification in IN3.
Humans started using the principles of insurance thousands of years ago, when they lived in tribes in small villages. If one of the tribe members was injured, the others would take care of him and his family. The basic principle of insurance is a group of people covering each other against a particular risk. Today, most people in regions like Europe have access to insurance, while many people worldwide still have no access at all. Cost and accessibility may be improved with a blockchain-based parametric approach. The insurance process in a parametric approach is based exclusively on data, and decisions are made objectively. Blockchain is a necessary and integral part of the approach, creating transparency and connecting the customer's and the investor's risk capital. The paper offers an overview of the opportunities and challenges of blockchain-based parametric insurance, a catalog of criteria for such insurance, a description of all components and their interaction for an implementation on Ethereum, and a reference implementation of a train delay insurance in Germany.
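The parametric principle reduces settlement to a pure function of oracle-reported data, with no claims assessor. The delay thresholds and payout fractions below are made up for illustration and are not taken from the paper's reference implementation:

```python
def settle_policy(delay_minutes, tiers=((120, 1.0), (60, 0.5))):
    """Hypothetical parametric train-delay policy: the payout fraction of
    the insured sum is determined purely by the oracle-reported delay.

    `tiers` lists (threshold_minutes, payout_fraction) pairs in
    descending threshold order; the first matching tier applies."""
    for threshold, fraction in tiers:
        if delay_minutes >= threshold:
            return fraction
    return 0.0
```

On Ethereum, the same logic would live in a smart contract fed by a delay oracle, so every party can verify the settlement decision.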
Massive multiple-input multiple-output (MIMO), a technique in which the base station of a mobile radio cell is equipped with a large number of antennas, is currently regarded as a promising key technology for meeting the requirements of future fifth-generation wireless communication networks. However, the confident claims about the performance of such systems rest on a theoretical assumption that has hardly been verified in practice so far, namely that the wireless transmission channels of different users are mutually independent due to the large number of antennas; that is, that so-called favorable propagation conditions prevail. This master's thesis investigates these novel systems from two different perspectives.
In the first part of this thesis, the influence of realistic propagation conditions on the performance of massive MIMO systems is evaluated. For this purpose, corresponding numerical system simulations are carried out and compared with the results of practical massive MIMO measurement campaigns.
The investigations show that the so-called favorable propagation conditions can only be observed to a limited extent in realistic environments. Traditional channel models therefore lead to an inaccurate estimate of the performance of practical massive MIMO systems. To address this problem, a novel parameterization of the traditional Kronecker model is proposed, so that relevant characteristics of realistic channels are precisely reflected by this model.
Subsequently, various channel estimation methods for massive MIMO systems are examined under the different channel models by means of numerical simulations. The experiments show that estimation methods derived specifically for massive MIMO under the assumption of favorable propagation conditions suffer a significant performance degradation under realistic channel models.
The second part of this thesis focuses on the application of massive MIMO systems in so-called Internet of Things (IoT) networks. The typically large number of active IoT devices makes efficient scheduling algorithms necessary. A downlink scheduling algorithm is therefore presented that exploits the properties of massive MIMO systems and the typical data-rate requirements of IoT devices. In particular, it is proposed to divide the IoT users into groups and to serve the different groups one after another. The group size is derived using asymptotic properties of massive MIMO systems.
To select the group members, a modified version of the popular semi-orthogonal user selection (SUS) algorithm is proposed. The subsequent numerical simulations confirm that the modified version of SUS eliminates the drawbacks of the original algorithm, which in turn leads to improved data rates in the considered system.
This Bachelor thesis deals with connected systems consisting of a multitude of similar electronic devices (often referred to as agents) endowed with information processing abilities. These so-called multi-agent systems are required to solve a certain task with high reliability, while the individual components are not able to solve the problem on their own in a satisfactory manner. A central control unit cannot or shall not be used in such systems for a variety of reasons: for example, a significant drawback of a central control unit is the vulnerability of the system, since if the central control unit fails, the whole system breaks down. Therefore, multi-agent systems require special algorithms enabling the agents to solve a common, global problem in a suitable manner by local interaction only.
In this thesis, distributed algorithms are investigated which can be used for distributed information processing and control of such multi-agent systems. In the first part of this work, it is assumed that each agent possesses a private information state about a common parameter of interest. The described consensus algorithm enables all agents to reach a system-wide identical information state by local information exchanges only. Subsequently, the case is considered in which every agent has access to streaming data containing information about an a priori unknown parameter. The diffusion strategy described in the second part enables the agents to estimate this parameter and to minimize a global cost function which depends on it. Both algorithms are described in a general framework and can therefore be applied to a variety of different problems. One application of these strategies, described in the third part of this work, is the simulation of swarming behavior.
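The consensus algorithm of the first part can be sketched as a linear averaging iteration, in which each agent repeatedly moves its state toward those of its neighbors. The step size and graph below are illustrative; for convergence the step size must be smaller than the inverse of the maximum node degree.

```python
import numpy as np

def consensus(values, neighbors, eps=0.2, steps=100):
    """Linear average consensus: agent i updates its state using only the
    states of its direct neighbors, yet all agents converge to the mean
    of the initial values (on a connected graph with suitable eps)."""
    x = np.array(values, dtype=float)
    for _ in range(steps):
        x = x + eps * np.array(
            [sum(x[j] - x[i] for j in neighbors[i]) for i in range(len(x))])
    return x
```

Note that no agent ever sees the global mean; it emerges purely from repeated local exchanges, which is exactly the property that makes the scheme robust to the loss of any single node.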
Neural networks have become one of the most powerful algorithms for learning from big data sets and are used extensively for classification. However, the deeper the network models become, the less interpretable they are. Although many methods exist to explain the output of such networks, the lack of interpretability makes them black boxes. Prototype-based machine learning algorithms, on the other hand, are known to be interpretable and robust.
Therefore, the aim of this thesis is to find a way to interpret the functioning of neural networks by introducing a prototype layer into the network architecture. This prototype layer is trained alongside the neural network and helps us interpret the model. We present neural network architectures consisting of autoencoders and prototypes that perform activity recognition from heart rates extracted from ECG signals. These prototypes represent the different activity groups that the heart rates belong to and thereby aid interpretability.
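The distance head of such a prototype layer can be sketched as follows; the use of squared Euclidean distance in the autoencoder's latent space is an assumption for illustration, not necessarily the thesis's exact formulation.

```python
import numpy as np

def prototype_layer(z, prototypes):
    """Distance head of a prototype layer: for each encoded sample z
    (a row of latent codes from the autoencoder), return the squared
    distances to every prototype and the index of the nearest one.

    The nearest prototype is what makes the prediction interpretable:
    it is a concrete latent point representing one activity group."""
    d = ((z[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)
    return d, d.argmin(axis=1)
```

During training, a loss term would pull the prototypes toward clusters of encoded samples, so that decoding a prototype yields a human-inspectable "typical" example of its activity group.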
This master's thesis studies customer behavior using the example of the skin care brand Nivea. It presents the theoretical basis for the subsequent research on marketing, customer behavior, and the proper conduct of marketing research, followed by an analysis of the German market. Since Nivea is a brand of the Beiersdorf company, Beiersdorf's activities and operations are described. The main idea of the thesis is to analyze the customer behavior of Nivea; the work therefore contains extensive research on the brand along with its micro- and macroenvironment. An in-depth interview and a survey were also conducted to understand customers' current needs. Based on the results, the author proposes some ideas for the Nivea brand.
A variety of methods have been used to describe natural systems and cellular functions. Most use continuous systems with differential equations. Based upon the neighbourhood relations in graphs and the complex interactions in cellular automata, a mathematical model was designed and implemented as an application with a user interface. This discrete approach, called graph automata, was utilised to simulate diffusion processes and chemical kinetics. The progression of diffusion in cellular environments was described and resulted in a discrepancy of 20% in comparison to experimental results. Different chemical kinetics were simulated and found to be as accurate as their continuous counterparts. The proposed model appears to be a highly scalable and modular approach to simulating natural systems.
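The diffusion rule of such a graph automaton can be sketched as a local, mass-conserving update on the graph's edges; the transport rate below is an illustrative parameter, not a value from the study.

```python
def diffuse(state, edges, rate=0.1, steps=1):
    """Discrete graph-automaton diffusion: in each step, every edge (i, j)
    transports a fraction `rate` of the concentration difference between
    its endpoint cells. Total mass is conserved by construction, since
    whatever leaves cell i arrives in cell j."""
    s = list(state)
    for _ in range(steps):
        flux = [0.0] * len(s)
        for i, j in edges:
            f = rate * (s[i] - s[j])
            flux[i] -= f
            flux[j] += f
        s = [a + b for a, b in zip(s, flux)]
    return s
```

This is the discrete counterpart of the continuous diffusion equation: the neighbourhood relations of the graph replace the spatial derivatives, which is what makes the approach modular and scalable.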
not available
Smart ultrafast laser processing with rotating beam – Laser micro drilling, cutting and turning
(2021)
Current micro drilling, cutting, and turning processes are mainly based on EDM, milling, stamping, honing, or grinding. All these technologies use a tool with a predefined geometry that is transferred to the workpiece. In contrast, the laser is a highly flexible tool which can adapt its size very quickly by changing only a software setting. Thanks to the efforts in laser development during the last years, stable ultrafast lasers with sufficient average power and high repetition rates have become industrially available. To use as many pulses as possible, cost-efficient production demands innovative processes and machining setups with fast axis movement and special optics for beam manipulation. GFH has developed a helical drilling optics which rotates the beam at up to 30,000 rpm in a very precise circle and furthermore allows adjusting the diameter and the incidence angle. This enables the laser to be used for high-precision drilling and cutting as well as for micro turning processes.
In laser drilling, one challenge is to achieve high drilling quality at high aspect ratios. Ultra-short pulsed lasers use different concepts such as thin disks, fibers, and rods. The slab technology is employed because of its flexibility and characteristics: it brings together the advantages of both and delivers high pulse energies at high repetition rates. Materials with a thickness > 1.5 mm demand specialized optics handling the high power and pulse energies, with adapted processing strategies, integrated into a machine setup. In this contribution, we focus on all the components and strategies necessary for drilling high-precision holes with aspect ratios up to 1:40.
As new sensors are added to VR headsets, more data can be collected. This introduces a new potential threat to user privacy. We focused on the feasibility of extracting personal information from eye tracking. To achieve this, we designed a preliminary user study focusing on the pupil response to audio stimuli. We tested the collected data with a variety of machine learning models to determine the feasibility of obtaining information such as the age or gender of the participant. Several of the experiments show promise for obtaining this information. We were able to extract with reasonable certainty whether caffeine had been consumed, as well as the gender of the participant. This demonstrates the unknown threat that embedded sensors pose to users. Further studies are planned to verify the results.
Cyanobacteria, prokaryotic microorganisms with essentially the same oxygenic photosynthesis as higher plants, are becoming excellent green cell factories for the sustainable generation of renewable chemicals and fuels from solar energy and carbon dioxide. In the presentation I will illustrate the concept of green cell factories by introducing and discussing two examples: (i) engineering cyanobacteria to produce the important bulk chemical and potential blend-in biofuel butanol from sunlight and carbon dioxide, so-called photosynthetic butanol, and (ii) the generation of a functional semisynthetic [FeFe]-hydrogenase linked to the native metabolism in living cells of the unicellular cyanobacterium Synechocystis PCC 6803.
Over the past few years, wind and solar power plants have increasingly contributed to energy production. However, due to fluctuating energy sources, the energy production data contain disruptions. Such disrupted data degrade prediction performance and need to be estimated from other values. In this thesis, we provide a comparative study for estimating online disrupted data based on the data of similar groups of power plants. We apply three estimation techniques (mean, interpolation, and k-nearest neighbor, KNN) to estimate the disruptions in the training data. We then apply four clustering algorithms (k-means, neural gas, hierarchical agglomerative, and affinity propagation) with two similarity measures (Euclidean distance and dynamic time warping, DTW) to form groups of power plants and compare the results. Experimental results show that when KNN estimation is applied to the data, and neural gas or agglomerative clustering with DTW is used to cluster it, the cluster quality scores and execution times are better than for the other combinations. We therefore choose KNN estimation to reconstruct the online disrupted data in each group of similar power plants.
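The KNN estimation step can be sketched as follows; measuring similarity by Euclidean distance on the jointly observed positions only is an illustrative simplification of the approach described above (the thesis also considers DTW as a similarity measure).

```python
import numpy as np

def knn_estimate(series, reference, k=3):
    """Fill the disrupted (NaN) entries of one plant's production series
    using the k most similar reference series from the same group.

    Similarity is Euclidean distance computed on the observed positions;
    the estimate at each missing position is the mean of the k nearest
    neighbors' values there."""
    series = np.asarray(series, dtype=float)
    reference = [np.asarray(r, dtype=float) for r in reference]
    missing = np.isnan(series)
    observed = ~missing
    dists = [np.linalg.norm(series[observed] - r[observed]) for r in reference]
    nearest = np.argsort(dists)[:k]
    filled = series.copy()
    filled[missing] = np.mean([reference[i][missing] for i in nearest], axis=0)
    return filled
```

Restricting the neighbor search to plants in the same cluster is what makes the reconstruction plausible: similar plants see similar weather and therefore similar production profiles.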
The shape-memory alloy Nitinol, a nickel-titanium alloy, is widely used in actuator and medical applications. However, the connection of a flange to a rod is a critical point. Laser rod end melting enables material accumulations that generate a preform at the end of a rod, followed by die forming, so that the flange can be produced. This process has been successfully applied to 1.4301 steel. This study aims to investigate laser rod end melting of shape-memory Nitinol with regard to the resultant surface quality of the preforms. The results showed that spherical preforms could be generated without visible surface discoloration due to oxidation. Different scan rates produced different solidification conditions, which led to significantly different surface structures. These findings show that laser rod end melting can in principle be applied to Nitinol to generate preforms for flanges, whereby the surface quality depends on the solidification conditions.
Decentralization is one of the key attributes associated with blockchain technology. Among the different developments in recent years, decentralized autonomous organizations (DAOs) have been of growing interest. DAOs are currently a key part of another emerging use case, namely decentralized science (DeSci). Given the novelty of the field, an integrative definition of DeSci has not been established, but some inherent concepts and ideas can be traced back to the Open Science movement. Although the DeSci movement has the potential to benefit the public, for example through funding underrepresented research areas or more inclusive and transparent research in general, some negative aspects of decentralization should not be neglected. Due to the novelty of blockchain and emerging use cases, research can and should precede mass adoption, to which this paper aims to contribute.
The object of the thesis is the material and information flows in the production systems of enterprises, in particular, of LLC "Kolibri".
As a subject of thesis, the improvement of the management of material and related information flows of the company LLC "Kolibri" was chosen.
The purpose of the thesis is theoretical substantiation and development of practical recommendations for the effective management of the flows of material and information resources of the enterprise on the principles of logistics.
In machine learning, Learning Vector Quantization (LVQ) is well known as a supervised vector quantization method. LVQ has been studied as a way to generate optimal reference vectors because of its simple and fast learning algorithm [2]. In many classification tasks, different variants are considered while training a model, and considering large-margin variants of LVQ helps to obtain significant results [20]. Large margin LVQ (LMLVQ) maximizes the distance between the decision hyperplane and the data points. In this thesis, a comparison of different variants of Generalized Learning Vector Quantization (GLVQ) and large-margin LVQ is presented, along with visualization, implementation, and experimental results.
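The classifier measure that GLVQ and its variants build on can be stated compactly; this sketch uses Euclidean distance and assumes at least one prototype per class.

```python
import numpy as np

def glvq_mu(x, prototypes, labels, y):
    """GLVQ relative-distance measure mu(x) = (d+ - d-) / (d+ + d-),
    where d+ is the distance to the closest prototype of the correct
    class y and d- the distance to the closest prototype of any other
    class. mu < 0 means x is classified correctly, and |mu| relates to
    the (hypothesis) margin that large-margin variants try to enlarge."""
    d = np.linalg.norm(prototypes - x, axis=1)
    d_plus = d[labels == y].min()
    d_minus = d[labels != y].min()
    return (d_plus - d_minus) / (d_plus + d_minus)
```

GLVQ training minimizes a monotone function of mu summed over the training data, pulling correct prototypes toward each sample and pushing wrong ones away.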
With the advancement of cryptography and emerging internet technology, electronic voting is gaining popularity, since it ensures ballot secrecy, voter security, and integrity. Many commercial startups and e-voting systems have been proposed, but due to lack of trust, privacy, transparency, and hacking issues, many solutions have been suspended. Blockchain, along with cryptographic primitives, has emerged as a promising solution due to its transparent, immutable, and decentralized nature. In this paper, we summarize the properties that existing solutions should satisfy and explain some cryptographic primitives, such as zero-knowledge proofs (ZKP) and ring signatures, along with their security limitations. We give a comprehensive review of some blockchain-based e-voting systems and discuss their strengths and weaknesses with respect to the given properties, together with a comparison table.
Mathematics Behind the Zcash
(2020)
Among all the cryptocurrencies newly developed since Bitcoin, Zcash stands out as the strongest cryptocurrency, providing both transparency and anonymity to transactions and their users by deploying the strong mathematics of zk-SNARKs.
We discuss zero-knowledge proofs, which are the basic building block providing the functionality of zk-SNARKs, and cover the Schnorr and sigma protocols in their interactive and non-interactive versions. Non-interactive proofs are further used in Zcash transactions, where the validity of a sent transaction is established by a cryptographic proof.
Further, we deploy zk-SNARK proofs using a common reference string as a public parameter when a transaction is made. The proof allows the sender to prove that she knows a secret for an instance such that the proof is succinct, can be verified very efficiently, and does not leak the secret. Non-malleability, small proofs, and very efficient verification make zk-SNARKs a classic tool in Zcash. Since we deal with NP problems, we have considered elliptic curve cryptography, which provides the same security as RSA but with smaller parameter sizes.
Lastly, we explain the Zcash transaction process after minting a coin; the corresponding transaction completely hides the sender, the receiver, and the amount of the transaction using a zero-knowledge proof.
As future considerations, we discuss the improvements that can be made in terms of decentralization and efficiency by comparing with the top-ranked cryptocurrencies Ethereum and Monero, privacy preservation against the threat of quantum computers, and enhancements to shielded transactions.
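The Schnorr protocol discussed above, made non-interactive with the Fiat-Shamir heuristic, can be sketched over a toy group. The parameters below are far too small to be secure (real deployments use elliptic-curve groups) and serve only to make the algebra executable.

```python
import hashlib

# Toy Schnorr group: p = 2q + 1 with p, q prime, and g of order q.
p, q, g = 1019, 509, 4   # NOT secure sizes; illustration only

def prove(x, r):
    """Non-interactive Schnorr proof of knowledge of x with h = g^x mod p.
    The challenge c is derived by hashing (Fiat-Shamir); r is the random
    commitment nonce in [1, q-1]."""
    h = pow(g, x, p)
    t = pow(g, r, p)                                    # commitment
    c = int.from_bytes(hashlib.sha256(f"{h}{t}".encode()).digest(), "big") % q
    s = (r + c * x) % q                                 # response
    return h, t, s

def verify(h, t, s):
    """Accept iff g^s = t * h^c (mod p), which holds exactly when the
    prover knew x, since s = r + c*x in the exponent (mod q)."""
    c = int.from_bytes(hashlib.sha256(f"{h}{t}".encode()).digest(), "big") % q
    return pow(g, s, p) == (t * pow(h, c, p)) % p
```

The verifier learns that the prover knows x without learning x itself; zk-SNARKs generalize this principle from discrete logarithms to arbitrary NP statements.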
Mathematics behind the Zcash
(2020)
Among all newly developed cryptocurrencies, Zcash stands out as the strongest cryptocurrency, providing both transparency and anonymity to transactions and their users by deploying the strong mathematics of zk-SNARKs. We discuss zero-knowledge proofs as a building block providing the functionality of zk-SNARKs, covering the Schnorr protocol, which is further used in Zcash transactions, where the validity of a sent transaction is established by a cryptographic proof. Further, we deploy zk-SNARKs using a common reference string that allows the sender to prove that she knows a secret such that the proof is succinct, can be verified, and does not leak the secret. Non-malleability, small proofs, and effective verification make zk-SNARKs a classic tool in Zcash. Since we deal with NP problems, we have considered elliptic curve cryptography to provide the security. Lastly, we explain the Zcash transaction, which completely hides the sender, the receiver, and the amount of the transaction using a zero-knowledge proof.
In response to prevailing environmental conditions, Arabidopsis thaliana plants must increase their photosynthetic capacity to acclimate to potentially harmful high-light stress. In order to measure these changes in acclimation capacity, different high-throughput imaging-based methods can be used. In this master thesis, we studied different Arabidopsis thaliana knockout mutants and accessions regarding their capacity to acclimate to potentially harmful high-light and cold-temperature conditions, using a high-throughput phenotyping system with an integrated chlorophyll fluorescence measurement system. In order to determine the acclimation capacity, Arabidopsis thaliana knockout mutants of genes not previously assigned to high light, as well as accessions of two different haplotype groups with a reference and an alternative allele from different countries of origin, were grown under switching high-light and temperature conditions. Photosynthetic analysis showed that the knockout mutant plants differed from the wild type in their Photosystem II operating efficiency during a switch to increased light irradiance, but no longer differed significantly a week later under the same circumstances. High-throughput phenotyping of the haplotype accessions revealed significantly better acclimation capacity, in terms of non-photochemical quenching and steady-state photosynthetic efficiency, in accessions of Russian origin carrying an altered SPPA gene during high-light and cold stress.
Beam shaping and splitting with diffractive optics for high performance laser scanning systems
(2021)
Diffractive optical elements (DOEs) enable novel high performance and process-tailored scanning strategies for galvanometer-based scan heads. Here we present several such concepts integrating DOEs with laser scanners and the respective application use cases. Beam shaping DOEs providing a homogeneous fluence over a custom defined profile, such as a rectangular Top-Hat, enable increased process quality in Laser-Induced Forward Transfer (LIFT) compared to the Gaussian beam of the laser source. We show that aberrations which occur over the necessary large wafer-sized image field can be eliminated through the use of a synchronous XY-stage motion. Another application that benefits from the use of DOEs is laser drilling. Drilling in display and electronics manufacturing demands high throughput that can only be achieved through the use of beam splitting DOEs for parallel processing. To this end, the joint MULTISCAN project is developing a variable multi-beam tool capable of scanning and switching each individual beamlet for increased control.
This thesis proposes a solution to the practical problem of supervising relatively basic mechanical processes in robotics by means of computer vision. Supervision happens by comparing the tracked movement with a known, ideal recording of the movement that acts as a model.
First, this thesis analyzes possible approaches to the problem regarding data structures and representation, ways of extracting the data from the recording and ways to compare the data sets of two recordings. Then, a specific solution is implemented in C++ and explained.
This paper analyses the status quo of large-scale decision making, combined with the possibility of blockchain as an underlying decentralized architecture for governing common-pool resources collectively, and evaluates these approaches according to their technical and non-technical requirements and features. Due to an increasing trend towards the distribution of knowledge and an increasing amount of information, the combination of these decentralized technologies and approaches can not only be beneficial for consortial governance using blockchain but can also help communities to govern common goods and resources. Blockchain and its trust-enhancing properties can potentially be a catalyst for more collaborative behavior among participants and may lead to new insights about collective action and CPRs.
This article aims to explain mathematically why the so-called double descent observed by Belkin et al., Reconciling modern machine-learning practice and the classical bias-variance trade-off, PNAS 116(32) (2019), pp. 15849-15854, occurs on the way from the classical approximation regime of machine learning to the modern interpolation regime. We argue that this phenomenon may be explained by a decomposition of mean squared error plus complexity into bias, variance, and an unavoidable irreducible error inherent to the problem. Further, in the case of normally distributed output errors, we apply this decomposition to explain why LASSO provides reliable predictors that avoid overfitting.
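The decomposition the article refers to is, in its standard form (the article augments it with a complexity term), the classical bias-variance split for a model y = f(x) + ε with E[ε] = 0 and Var(ε) = σ²:

```latex
\mathbb{E}\!\left[\big(y - \hat f(x)\big)^2\right]
  = \underbrace{\big(f(x) - \mathbb{E}[\hat f(x)]\big)^2}_{\text{bias}^2}
  + \underbrace{\mathbb{E}\!\left[\big(\hat f(x) - \mathbb{E}[\hat f(x)]\big)^2\right]}_{\text{variance}}
  + \underbrace{\sigma^2}_{\text{irreducible error}}
```

In the interpolation regime the bias term vanishes, so double descent hinges on how the variance term behaves as model complexity grows past the interpolation threshold.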
This work deals with the construction of a microscope for combined total internal reflection fluorescence (TIRF) and confocal microscopy. It is especially designed for single-molecule fluorescence spectroscopy. The design of the microscope body is based on the miCube (Hohlbein lab, Wageningen University, NL). The excitation and detection pathways were adapted to allow both TIRF and confocal illumination, as well as camera and point detection for two color channels, enabling single-molecule Förster resonance energy transfer measurements.
Convolutional neural networks (CNNs) have been among the most powerful and popular preprocessing techniques employed for image classification problems. Here, we use other signal processing techniques, such as the Fourier transform and the wavelet transform, to preprocess the images in conjunction with different classifiers such as MLP, LVQ, GLVQ, and GMLVQ, and compare their performance with CNNs.
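As a sketch of the Fourier-transform preprocessing step described above (the image size and data are illustrative, not taken from the thesis): each image is replaced by the magnitude of its 2-D FFT before being flattened into a feature vector for a classifier such as an MLP or (G)LVQ.

```python
import numpy as np

# Illustrative 28x28 "image" (e.g. MNIST-sized); random data stands in
# for a real sample.
rng = np.random.default_rng(0)
image = rng.random((28, 28))

spectrum = np.abs(np.fft.fft2(image))   # magnitude spectrum (phase dropped)
spectrum = np.fft.fftshift(spectrum)    # move low frequencies to the center
features = spectrum.ravel()             # flat feature vector for MLP/LVQ

print(features.shape)                   # (784,)
```

Taking only the magnitude makes the features invariant to image translation, which is one motivation for this kind of spectral preprocessing.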
Classification of time series has received a considerable amount of interest over the past years due to many real-life applications, such as environmental modeling, speech recognition, and computer vision.
In my thesis, I focus on the classification of time series by LVQ classifiers. To learn a classifier, we need a training set. In our case, every data point in the training set contains a sequence (an ordered set) of feature vectors. Thus, the first task is to construct a new feature vector (or matrix) for each sequence.
Inspired by [2], I use Hankel matrices to construct the new feature vectors. This choice comes from a basic assumption that each time series is generated by a single or a set of unknown Linear Time Invariant (LTI) systems.
After generating new feature vectors by Hankel matrices, I use two approaches to learn a classifier: Generalized Learning Vector Quantization (GLVQ) and the median variant of Generalized Learning Vector Quantization (mGLVQ).
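The Hankel-matrix construction above can be sketched as follows. The window size L is a hypothetical choice for illustration; the point of the LTI assumption is that a series generated by a low-order LTI system yields a low-rank Hankel matrix.

```python
import numpy as np
from scipy.linalg import hankel

def hankel_features(series, L=4):
    """Build an L-row Hankel matrix from a 1-D time series."""
    series = np.asarray(series, dtype=float)
    # First column: x[0..L-1]; last row: x[L-1..n-1]
    return hankel(series[:L], series[L - 1:])

x = np.sin(0.3 * np.arange(20))     # a single sinusoid
H = hankel_features(x, L=4)
print(H.shape)                      # (4, 17)

# A sinusoid obeys the 2nd-order recurrence
# x[n+1] = 2*cos(0.3)*x[n] - x[n-1], so H has numerical rank 2.
print(np.linalg.matrix_rank(H, tol=1e-8))   # 2
```

The rank (or a distance between column spaces of such matrices) is what makes Hankel representations comparable across sequences of different lengths.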
Soft Learning Vector Quantisation (SLVQ) and Robust Soft Learning Vector Quantisation (RSLVQ) are supervised data classification methods that have been applied successfully to real-world classification problems. The performance of SLVQ and RSLVQ, however, degrades when they are applied to more complicated classification problems. In this thesis, we introduce modifications to SLVQ and RSLVQ in order to obtain more capable versions of them. Several possibilities to modify SLVQ and RSLVQ are considered; some of them are not successful enough and have been included for the sake of completeness. The fruits of the thesis are plenty, including Tangent Soft Learning Vector Quantisation-Strong (TSLVQ-S), together with its more stable version Tangent Robust Soft Learning Vector Quantisation-Strong (TRSLVQ-S), Attraction Soft Learning Vector Quantisation (ASLVQ), and Grassmannian Soft Learning Vector Quantisation (GSLVQ).
The computer-based calculation of the sound insulation between dwellings, or the analysis of the transmission in a building, is common practice for a building acoustics engineer. With the release of the revised DIN 4109 in July 2016, a whole new calculation model was introduced to the German users of this standard. The calculation model and its input data now need to be included in existing calculation software, such as the software SONarchitect ISO of the Spanish developer Sound of Numbers. To this end, this thesis compares the input parameters and the airborne and impact sound transmission of DIN 4109:2016-07 and the European standard EN 12354:2000. With the help of this comparison, it is now possible to specify all necessary parameters and calculation procedures for the calculation of the airborne and impact sound insulation between dwellings.
Marker-based systems can digitally record human movements in detail. Using the digital biomechanical human model Dynamicus, which was developed by the Institut für Mechatronik, it is possible to model joint angles and their velocities so accurately that it can be used to improve motion analysis in competitive sports or for the ergonomic evaluation of motion sequences. In this paper, we use interpretable machine learning techniques to analyze gait. Here, the focus is on the classification between foot touchdown and drop-off during normal walking. The motion data for training the model is labeled using force plates. We analyze how our machine learning models could be applied directly to new motion data recorded in a different scenario from the initial training, more precisely on a treadmill. We use the properties of the interpretable model to detect drift and to transfer our model if necessary.
In the past few years, generative models have become an interesting topic in the field of Machine Learning (ML). The Variational Autoencoder (VAE) is one of the popular frameworks of generative models, based on the work of D. P. Kingma and M. Welling [6] [7]. As an alternative to the VAE, the authors in [12] proposed and implemented an Information Theoretic Learning (ITL) based autoencoder. The VAE and the ITL autoencoder are a combination of neural networks and probabilistic graphical models (PGM) [7]. In modern statistics it is difficult to compute approximations of probability densities. In this work we make use of the Variational Inference (VI) technique from machine learning, which approximates distributions through optimization. The closeness between the distributions is measured by information-theoretic divergence measures such as the Kullback-Leibler, Euclidean, and Cauchy-Schwarz divergences. In this thesis, we study theoretical and experimental results of two different frameworks of generative models, which generate images of MNIST handwritten characters [8] and the Yale Face Database B [3]. The results obtained show that the proposed VAE and ITL autoencoder are capable of generating the underlying structure of the example datasets.
Data streams change their statistical behaviour over time. These changes can occur gradually or abruptly for unforeseen reasons, which may affect the expected outcome. Thus it is important to detect concept drift as soon as it occurs. In this thesis we chose a distance-based methodology to detect the presence of concept drift in data streams. We used Generalized Learning Vector Quantization (GLVQ) and Generalized Matrix Learning Vector Quantization (GMLVQ) classifiers for the distance calculation between prototypes and data points. Chi-square and Kolmogorov-Smirnov tests are used to compare the distance distributions of the test and training data sets to indicate the presence of drift.
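The distance-based drift check described above can be sketched as follows: compute each sample's distance to its nearest prototype (as in (G)LVQ), then compare the training-time and stream-time distance distributions with the two-sample Kolmogorov-Smirnov test. The prototypes, data shapes, and threshold here are illustrative assumptions, not the thesis's actual setup.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)

# Hypothetical LVQ prototypes (one per class)
prototypes = np.array([[0.0, 0.0], [3.0, 3.0]])

def nearest_prototype_distances(X, prototypes):
    """Distance of each sample to its closest prototype."""
    d = np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=2)
    return d.min(axis=1)

train   = rng.normal(0.0, 1.0, size=(500, 2))   # training window
drifted = rng.normal(1.5, 1.0, size=(500, 2))   # shifted stream window

d_train = nearest_prototype_distances(train, prototypes)
d_drift = nearest_prototype_distances(drifted, prototypes)

stat, p_value = ks_2samp(d_train, d_drift)
print(p_value < 0.01)   # True: distance distributions differ -> drift flagged
```

Comparing distances rather than raw features reduces the test to one dimension, which is what makes the KS statistic directly applicable.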
Generating electricity from wind power is one of the fastest growing methods in the world. The kinetic energy of the moving air is converted into electricity by wind turbines that are installed in places where the weather conditions are most favorable.
Wind turbines can be used individually, but are often grouped together to form wind parks, also called wind farms. Electricity generated from wind parks can be used to meet local needs or to supply an electricity distribution network for homes and businesses further away.
Energy obtained from the wind can also be converted into hydrogen and used as transport fuel, or stored for subsequent electricity generation. The use of this form of energy reduces the impact of electricity generation on the environment, as it does not require fuel and does not produce any pollutants or greenhouse gases.
Wind energy is growing significantly, and since 1994 the world market has grown by around 30% per year. The installed capacity worldwide rose from 17,400 MW to 650,560 MW between 2000 and the end of 2019. In the European market, which concentrates most of the world's wind farms, Germany remains the leader with almost half of the total capacity. Spain recorded the strongest growth in the last three years, with an annual growth rate of 28%. Europe also concentrates industrial and technological activities: eight European manufacturers are among the top ten in the world, with 70% of devices sold in 2018.
Anomaly detection is a very acute technical problem in various business enterprises. In this thesis, a combination of the Growing Neural Gas and the Generalized Matrix Learning Vector Quantization is presented as a solution, based on collected theoretical and practical knowledge. The whole network is described and implemented along with references and experimental results. The proposed model is carefully documented, and the remaining open research questions are stated for future investigation.
Assessment of COI and 16S for insect species identification to determine the diet of city bats
(2023)
Despite the numerous benefits of urbanization for human living conditions, urbanization has also negatively affected humans, their environment, and other organisms that share urban habitats with humans. While some wild animals avoid living in urban areas, others are more tolerant or even prefer life in urban habitats. There are more than 1,400 species of bats in the world.
Therefore, they have the potential to contribute significantly to mammalian biodiversity in urban areas. Insectivorous bat species play a key role in agriculture by improving yields and reducing chemical pesticide costs. Using metabarcoding, it is possible to determine the prey consumed by these nocturnal mammals based on the DNA fragments in their fecal pellets. This study
aimed to evaluate COI and 16S metabarcodes for insect species identification to determine the diet of metropolitan bats. For this purpose, COI and 16S metabarcodes were extracted, amplified, and sequenced from 65 bat feces collected in the Berlin metropolitan area. Following taxonomic annotation, I found that 73% of all identified insects could only be detected using the COI method, while 15% could only be recovered using the 16S approach. Just 12% of all detected insects were identified simultaneously by both markers. According to this result, COI is more suitable for the taxonomic identification of insects from bat feces. However, given the bias of COI primers, it is recommended to use both markers for a more precise estimation of species diversity. Additionally, based on the insect species identified, I noticed that urban bats fed mainly on Diptera, Coleoptera, and Lepidoptera. The bat species Nyctalus noctula was most abundant in the samples. Its diet analysis revealed that 91% of the samples contained the insect species Chironomus plumosus. Fourteen pest insect species were also found in its diet.
Simulating complex physical systems involves solving nonlinear partial differential equations (PDEs), which can be very expensive. Generative Adversarial Networks (GANs) have recently been used to generate solutions to PDE-governed complex systems without having to solve them numerically.
However, concerns have been raised that the standard GAN system cannot capture some important physical and statistical properties of a complex PDE-governed system, alongside other concerns about difficult and unstable training, the noisy appearance of generated samples, and the lack of robust methods for assessing sample quality apart from visual examination. In this thesis, a standard GAN system is trained on a data set of heat transfer images. We show that the generated data set can capture the true distribution of the training data with respect to both visual and statistical properties, specifically the vertical statistical profile. Furthermore, we construct a GAN model which can be conditioned using a variance-induced class label. We show that the variance threshold t = 0.01 constructs a good conditional class label, such that the generated images achieve a 96% accuracy rate in complying with the given conditions.
The GeoFlow II experiment aims to replicate Earth's core dynamics using a rotating spherical container with controlled temperature differences and simulated gravity. During the GeoFlow II campaign, a massive dataset of images was collected, necessitating an automated system for image processing and fluid flow visualization in the northern hemisphere of the spherical container. Building on this, we aim to detect the special structures appearing in the post-processed images. Recognizing YOLOv5's proficiency in object detection, we apply the YOLOv5 model to this task.
In this thesis, we focus on using machine learning to automate manual or rule-based processes for the deduplication task of the data integration process in an enterprise customer experience program. We study the underlying theoretical foundations of the most widely used machine learning algorithms, including logistic regression, random forests, extreme gradient boosting trees, support vector machines, and generalized matrix learning vector quantization. We then apply those algorithms to a real, private data set and use standard evaluation metrics for classification, such as the confusion matrix, precision and recall, the area under the precision-recall curve, and the area under the receiver operating characteristic curve, to compare their performances and results.
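The evaluation pipeline listed above can be sketched with scikit-learn. Since the thesis's company data set is private, the example runs on synthetic imbalanced data (class weights and sizes are illustrative assumptions):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import (average_precision_score, confusion_matrix,
                             precision_score, recall_score, roc_auc_score)
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a deduplication data set: 20% positives
X, y = make_classification(n_samples=1000, n_features=10,
                           weights=[0.8, 0.2], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
pred = clf.predict(X_te)                 # hard labels for precision/recall
score = clf.predict_proba(X_te)[:, 1]    # scores for the curve-based metrics

print(confusion_matrix(y_te, pred))
print("precision:", precision_score(y_te, pred))
print("recall:   ", recall_score(y_te, pred))
print("PR-AUC:   ", average_precision_score(y_te, score))  # area under PR curve
print("ROC-AUC:  ", roc_auc_score(y_te, score))
```

On imbalanced deduplication data, the PR-AUC is usually the more informative of the two curve metrics, since the ROC curve can look deceptively good when negatives dominate.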
The impact of organisational structure and organisational culture on the efficiency of a business
(2020)
The fear of losing flexibility and effectiveness due to an increased organisational structure induced by personnel growth is causing SMEs to defer structural changes. The purpose of this work is to examine whether the structural and cultural demands of employees match the structure and predominant culture within such a medium-sized company. As part of this, a survey was conducted to evaluate the current status and, furthermore, to suggest where and how changes would make sense to regain or even improve organisational efficiency.
In this work, we discuss the key role that “conflict minerals” (Gold, Coltan, Cobalt, Tin, Tungsten) play in global supply chains and high-technology industries, and the issues surrounding their extraction and trade in origin
countries, particularly in the African Congo Basin and the Great Lakes Region. We discuss ongoing international efforts to combat violence, child labour and human rights violations at mineral extraction areas, particularly in the Democratic Republic of the Congo (DRC), where very large mineral reserves have been discovered. We present the OECD Due Diligence Guidance for Responsible Supply Chains of Minerals from Conflict-Affected and High-Risk Areas, and the
GOTS MineralTrace mineral proof-of-origin and trade chain certification solution developed by ibes AG in Germany, which automates and simplifies the implementation of the OECD Guidance. We discuss a pilot project in DRC involving the GOTS GoldTrace application, based on the MineralTrace platform. We point out MineralTrace’s benefits and its limitations. We analyse possible solutions to said limitations, including an analysis of blockchain-based transactional information exchange and record keeping systems, and finally we propose a new MineralTrace Application Programming Interface (API) that solves current limitations, introduces configuration flexibility for client applications, introduces workflow flexibility to adapt MineralTrace to any country or region, and simplifies data export functionality.
Large bone defects are a major clinical problem affecting the elderly disproportionately, particularly in developed countries where this population is the fastest growing. Current treatments include autologous and allogeneic bone grafts, bone elongation with the Ilizarov technique, bone graft substitutes, and electrical stimulation. Each of these approaches enjoys varying degrees of success; however, each also has its associated problems and complications. A new, still experimental treatment is tissue engineering, which combines scaffolds, osteogenic stem cells, and growth factors, and is showing encouraging early results in preclinical and initial clinical studies.
Electrical stimulation has been shown to enhance bone healing by promoting mesenchymal stem cell migration, proliferation, and differentiation. In the present study we combine tissue engineering with electrical stimulation and hypothesize that this combined approach will have a synergistic effect resulting in enhanced new bone formation. In our in vitro experiments we observed that the levels of electrical stimulation we tested had no cytotoxic effect but instead increased osteogenic differentiation, as determined by enhanced expression of the osteogenic marker alkaline phosphatase. These findings support our hypothesis by demonstrating that in the tissue-engineering environment electrical stimulation promotes bone formation. The bioinformatics part of this project consisted of gene network analysis, identification of the top 10 osteogenic markers, and analysis of gene-gene interactions. We observed that in studies of stem cells from both human and rat, the genes BMPR1A, BMP5, TGFßR1, SMAD4, SMAD2, BMP4, BMP7, RUNX3, and CDKN1A are associated with osteogenesis and interact with each other. We observed a total of 31 interactions for human and 29 interactions for rat stem cells. While this approach needs to be proven experimentally, we believe that these in vitro and in silico analyses could complement each other and in doing so contribute to the field of bone healing research.
Currently, the Internet of Things (IoT) is connected to the virtual world through the Web of Things (WoT), allowing efficient utilization of real-world objects with Internet technologies. The WoT facilitates abstract interaction between applications and connected IoT devices, allowing owners to switch between devices while using multiple ones. To achieve this, virtual assets in WoT devices can be tokenized through smart contracts and transferred using hashed proof as transactions within blockchain networks that support virtual currencies. The goal of Web of Things is to establish connectivity, interoperability, and integration among IoT devices using web standards and protocols, reducing reliance on device manufacturers. This enables easy integration of Web 3.0 cryptocurrency for device management. This study proposes a solution for WoT applications involving different cryptocurrency definitions. Finally, simulation results are presented to demonstrate the tokenization-based ownership transfer in the Web of Things.
Dynamic object roles and corresponding contexts can model complex applications with higher-level abstraction. These abstracted applications can be used in wide-ranging areas such as financial institutions, health care, and supply chain networks. Role management, which consists of the creation of role objects and the binding of role objects to core objects, still suffers from the lack of non-intrusive logging and monitoring, auditing, and a resilient data source for role-based applications. Moreover, immutable smart contracts cause problems concerning bug fixing and maintenance without dynamic binding to new smart contract objects. An object that is created from a smart contract (contract class) can be transparently attached to a role object utilizing the Role Object Pattern (ROP). However, ROP itself does not contain a context definition and context-specific role assignment grouping the definition of smart contract relationships in abstracted data types. In this study, we implement an extended version of the role object pattern called the Context-based Role Object Pattern (ContextROP) with the on-chain smart contract language Solidity to solve these fundamental problems. To evaluate the proposal, we implement a use case and proceed with qualitative and quantitative analysis.
Protein structures are essential elements in every biological system evolved on Earth, where they function as stabilizing elements, signal transducers, or replication machineries. They consist of linearly bonded amino acids, which determine the three-dimensional structure of the protein, whereas the structure in turn determines the function. The native and biologically active structure of a protein can be understood as the folding state of a polypeptide chain at the global minimum of free energy.
By means of protein energy profiling, which is an approach derived from statistical physics, it is possible to assign a so-called energy profile to a protein structure. Such an energy profile describes the local energetic interaction features of every amino acid within the structure and introduces an energetic point of view on proteins, instead of a structural or sequential one.
This work aims to offer a perspective on the question of how we may extract pattern information from energy profiles. The concrete subjects are energy-mapped Pfam family alignments and investigations on finding motifs or patterns in discretized energy profile segments.
Classification label security determines the extent to which predicted labels from classification results can be trusted. The uncertainty surrounding classification labels is resolved by the security with which the classification is made. Therefore, classification label security is very significant for decision-making whenever we encounter a classification task. This thesis investigates the determination of classification label security by utilizing the fuzzy probabilistic assignments of fuzzy c-means. The investigation is accompanied by implementation, experimentation, visualization, and documentation of the results.
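A minimal sketch of the idea above: the fuzzy c-means membership of a point in its winning cluster can serve as the security of the predicted label. The cluster centers here are assumed given (e.g. class prototypes), and the fuzzifier m = 2 is the usual default; both are illustrative assumptions.

```python
import numpy as np

def fcm_memberships(X, centers, m=2.0):
    """Fuzzy c-means membership u[i, k] of sample i in cluster k:
    u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)                 # avoid division by zero
    ratio = (d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0))
    return 1.0 / ratio.sum(axis=2)

centers = np.array([[0.0, 0.0], [4.0, 4.0]])
X = np.array([[0.1, 0.0],      # very close to center 0
              [2.0, 2.0]])     # exactly equidistant from both centers

u = fcm_memberships(X, centers)
labels = u.argmax(axis=1)      # predicted label = winning cluster
security = u.max(axis=1)       # label security = winning membership
print(security)                # near 1.0 for the first point, 0.5 for the second
```

The equidistant point gets security 0.5, the least trustworthy value for two classes, which matches the intuition that points on the decision boundary carry maximal label uncertainty.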
This thesis investigated the generation of laser induced periodic surface structures (LIPSS) using femtosecond laser irradiation at a central wavelength of 775 nm.
The metals stainless steel and copper, as well as a semiconducting thin film (ITO on a glass substrate), were investigated. The impact of the processing parameters was studied for single and multiple pulse irradiation to determine the ablation threshold of the materials
and the different types of LIPSS. These observations allowed the optimisation of area structuring with regards to processing speed and LIPSS quality.
The feasibility of LIPSS generation with dynamic, real-time polarisation control was then explored. By using a fast-response, liquid-crystal polarisation rotation device, the direction of the linear polarisation of the laser beam could be dynamically controlled and synchronised with the scanning during laser processing. As a result, a range of complex micro- and nano-scale patterns with orthogonal directions of LIPSS were created. The samples were analysed using optical and electron microscopy. The orientation of the LIPSS was also determined from detection of light diffracted by the LIPSS.
Finally, two applications of large-area LIPSS patterning were demonstrated: information encoding on metals and periodic structuring of a thin-film conducting oxide for solar cells.
The subject of the following paper is the mental well-being of employees at work and how leaders can improve this well-being using positive psychology. The paper is compilatory in nature, as it uses the research and literature of experts to analyse how employee mental well-being can be further stimulated. The expert literature is used to present tools, but also to demonstrate the effectiveness of these tools through real-life case studies and evidence. The paper aims to inform individuals, leaders, and entire organizations how positive psychology can benefit organizational members' well-being in the long term. Using a compilation of positive psychology literature and the analysis of real-life case studies, the informative purpose of the thesis can be achieved.
Recently, deep neural network architectures designed to work on graph-structured data have been attracting notice and are being implemented in various domains and applications. However, learning representations (feature embeddings) from graph data is picking up pace in research, and constructing graphs from datasets remains a challenge. The ability to map the data to lower dimensions further makes the task easier while providing convenience in applying many operations. The graph neural network (GNN) is one of the novel neural network models that is attracting attention as it outperforms alternatives in various applications like recommender systems, social networks, chemical synthesis, and many more. This thesis discusses a unique approach to a fundamental task on graphs: node classification. The feature embedding for a node is aggregated by applying a recurrent neural network (RNN), then a GNN model is trained to classify a node with the help of the aggregated features, and Q-learning supports optimizing the shape of the neural networks. This thesis starts with the working principles of the feedforward neural network and recurrent units like the simple RNN, Long Short-Term Memory (LSTM), and the Gated Recurrent Unit (GRU), followed by concepts of reinforcement learning (RL) and the Q-learning algorithm. An overview of the fundamentals of graphs, followed by the GNN architecture and workflow, is discussed subsequently. Some basic GNN models are discussed briefly before the thesis approaches the technical implementation details, the output of the model, and a comparison with a few other models such as GraphSAGE and the graph attention network (GAT).
Pollinating insects are of vital importance for the ecosystem, and their drastic decline imposes severe consequences for the environment and humankind. The comprehension of their interaction networks is the first step towards preserving these highly complex systems. For that purpose, the following study describes a protocol for the investigation of honey bee pollen samples from different agro-environmental areas by DNA extraction, PCR amplification, and nanopore sequencing of the barcode regions rbcL and ITS. It was shown that the most abundant species were classified consistently by both DNA barcodes, while species richness was enhanced by single-barcode detection of less abundant species. The analysis of the different landscape variables exhibited a decline in species richness, Shannon diversity index, and species evenness with increasing organic crop area. However, sampling was only carried out in August, and further investigations are suggested to display a more complete picture of honey bee foraging throughout the seasons.
This diploma thesis deals with the analysis, evaluation, and recommendation of a calculation method for the axial clamping force and shaft-nut tightening torque of high-precision axial angular-contact ball bearings for screw drives. Furthermore, the influence of the clamping force and shaft-nut tightening torque on different bearing sets and arrangements is examined, and correction factors are derived.
Genetic sequence variations at the level of gene promoters influence the binding of transcription factors. In plants, this often leads to differential gene expression across natural accessions and crop cultivars. Some of these differences are propagated through molecular networks and lead to macroscopic phenotypes. However, the link between promoter sequence variation and the variation of its activity is not yet well understood. In this project, we use the power of deep learning in 728 genotypes of Arabidopsis thaliana to shed light on some aspects of that link. Convolutional neural networks were successfully implemented to predict the likelihood of a gene being expressed from its promoter sequence. These networks were also capable of highlighting known and putative new sequence motifs causal for the expression of genes. We tested our algorithms in various scenarios, including single and multiple point mutations, as well as indels on synthetic and real promoter sequences and the respective performance characteristics of the algorithm have been estimated. Finally, we showed that the decision boundary to classify genes as expressed and non-expressed depends on the sensitivity of the transcriptome profiling assay and changing it has an impact on the algorithm’s performance.
Current research in identity management is focusing on decentralized trust establishment for distributed identities. One of these decentralized trust models is Self-Sovereign Identities (SSI). With SSI each entity should be able to independently present and manage provable information about itself as well as request and review evidence from other entities. Using a distributed blockchain, information for verifying the authenticity of this evidence can be obtained from any other entity. This concept can be used not only for people, but also for authentication and authorization during the life cycle of devices in the Internet of Things (IoT). This paper presents an SSI-based concept for authentication and authorization of IoT devices among each other, intended to contribute to the change in trust on the internet. The SSI methodology employing a blockchain offers the possibility to establish mutual trust and proof of ownership without relying on any third party. The paper describes the concept, offers a reference implementation, and gives a discussion of the approach.
This thesis focuses on the introduction of a process for the fracture toughness testing of epoxy resin systems in the light of the linear elastic fracture mechanics approach. Based on the requirements of ISO 13586, SENB specimens were designed; in particular, the pre-cracking process was analysed and the tapping process was optimized by designing and testing a drop-weight device. After successfully validating the test process using specimens made of Araldite LY556, the influence of GNP loading on the fracture toughness was analysed. The pure epoxy showed a KIc of 0.73 MPa·√m, perfectly in line with the manufacturer's datasheet. A peak fracture toughness of 0.83 MPa·√m was achieved at 1 wt% and a loading rate of 10 mm/min, with a decreasing trend as the loading is increased further. As the loading rate is increased, the fracture toughness decreases slightly for 0.5 wt% and 2 wt% GNP, but
drops significantly for 1 wt% GNP, obliterating the peak. The load vs. displacement curves showed quasi-brittle material behaviour. The fracture surfaces were analysed using SEM; while the neat resin did not show any features, the reinforced samples showed patterns of crack pinning in connection with bridging and pull-out. The resulting improvement is less significant than that observed by other researchers for larger GNPs. This is in line with the general idea that small particles cannot yield as high improvements, but the significant decrease at higher loading rates has not been observed or described so far. It is suspected that tests at lower loading rates (e.g. 1 or 0.5 mm/min) would show an even higher fracture toughness.
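The KIc values reported above follow from the critical load and the SENB specimen geometry via the calibration used in ISO 13586 (the same geometry function appears in ASTM D5045). A short sketch of that evaluation, with illustrative (not measured) specimen dimensions and load:

```python
import math

# Stress-intensity factor for an SENB specimen in three-point bending.
# F: critical load [N], B: thickness [m], W: width [m], a: crack length [m].
def geometry_factor(x):
    # Calibration function f(a/W) for SENB, valid for 0 < a/W < 1.
    num = 6.0 * math.sqrt(x) * (1.99 - x * (1.0 - x) * (2.15 - 3.93 * x + 2.7 * x * x))
    den = (1.0 + 2.0 * x) * (1.0 - x) ** 1.5
    return num / den

def k_ic(F, B, W, a):
    return F / (B * math.sqrt(W)) * geometry_factor(a / W)

# Hypothetical specimen: B = 5 mm, W = 10 mm, a/W = 0.5, F = 100 N.
K = k_ic(F=100.0, B=0.005, W=0.010, a=0.005)
print(round(K / 1e6, 2), "MPa*sqrt(m)")  # 2.13 MPa*sqrt(m)
```

For a/W = 0.5 the geometry factor is about 10.65, which is why precise crack-length measurement after tapping matters: errors in a/W enter KIc through a steep calibration curve.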
Abstract: Blockchain Technology has become an innovative, mature tool for digital transformation, disrupting more and more application areas in their business processes, values, or even economic models. This paper leverages more than 30 academic publications on prototypes and their Blockchain-based use cases for transacting certificates in the context of public education. The conceptual design and guiding ideas are reflected in the practical application development for the Federal Ministry of Education and Research ECHT! project within the showcase region WIR! in Mittweida and are used for the research design. In this approach we applied agile methods and the current certificate process to propose a comprehensive disclosure of a new software prototype, including a three-layered architecture with multi-stakeholder components. The artefact instantiation contributes to the practical knowledge base within Information System Research, specifically in digital certificate processes ranging from creation, searching, and proving up to revoking, taking into consideration an existing IT landscape as well as the organizational hierarchy.
Digital Power of Attorney catalyzed by Software Requirements for Blockchain-based Applications
(2022)
Blockchain Technology (BT) with so-called web3 has been at an inflection point between new sub-theme hypes and world-wide industrialization over the last three years, thanks to large companies like MicroStrategy [1] and Facebook [2] and several Venture-Capital formations [3] that are already fighting over market share and community growth. Our work presents insights from literature-based Software Requirement (SR) elicitation for a specific Blockchain-based Application: the creation, management, and control of a digital Power of Attorney (POA). The POA context is not only a finance-driven use case; it is by far a heavyweight universal legal transaction. We use a morphological box and a reduced PRIMS-P to synthesize a generic specification for further Blockchain-based Application development. The formulated SRs in the POA context are reflected on our core actors, which are the Grantor and authorized, trusted, external Entities. The proposed characteristics for relationships and effects are visualized in a reference model originally used in digital platform ecosystems [4]. This design and modelling approach facilitates a closing discussion of BT and its future eCommerce perspective.
This paper looks at current projects in the field of Blockchain in education, their specific areas of application, and their possible advantages and weaknesses. Three examples developed by the team of authors are introduced in detail. First: Gallery-Defender, a Serious Game, which was adapted to serve as a demonstrator in a stand-alone version to show the possibility of carrying out exams directly from within the game and storing the grades and metadata on the Blockchain. Second: Art-Quiz, an e-learning tool, which can be integrated into existing LMS systems and map exam results and further data using Blockchain technologies. Both were developed following an iterative design process. And third: the results of a focus group, which simulated the assignment of grades after an oral online exam. The three examples presented here are based on the Blockchain system Ardor/Childchain Ignis, but each demonstrator has a different set of features and approaches.
In addition, the integration of various Blockchain solutions was conceptually designed to make a Multi-Chain model possible.
PICC Modulation Analysis
(2014)
This diploma thesis discusses interoperability problems between certified RFID readers and transponders based on the ISO/IEC 14443 standard, and their root causes.
The main goal is the definition of new test methods and parameters that can supplement the existing test regime for such systems and allow the identification of those problems beforehand.
Noise in the oceans is a constantly increasing factor. The growing industrialisation due to shipping, offshore wind parks, seismic studies and other anthropogenic noise is putting the ecosystem under immense stress. The focus of this thesis is on the assessment of continuous underwater noise from ships. Based on existing strategies in air as well as underwater, and a comparison of both, an alternative strategy for the assessment of continuous noise from ships is given. The concept developed is based on published, scientifically observed responses of animals to ship passes with an indication of an effect range. A model is created to describe the strategy, using publicly available data for cargo ships as an example. The results are summarized in maps depicting the affected area for an MRU of the OSPAR II region and the MPA "Borkum Riffgrund". The strategy is discussed and evaluated on the basis of these results. From this, further improvements and the need for additional information in publicly available data on vessel traffic are derived.
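An effect range of the kind mapped above can be estimated by inverting a propagation-loss model: the received level at range r falls below a published response threshold at some radius around the ship. A minimal sketch under simple spherical spreading, RL = SL - 20·log10(r); the source level and threshold are illustrative values, not figures from the thesis:

```python
import math

# Received level [dB re 1 uPa] of ship noise at range r [m] under
# spherical spreading loss, RL = SL - 20*log10(r).
def received_level(source_level_db, r_m):
    return source_level_db - 20.0 * math.log10(r_m)

# Radius at which the received level drops to a behavioural-response
# threshold, obtained by inverting RL(r) = threshold.
def effect_range(source_level_db, threshold_db):
    return 10.0 ** ((source_level_db - threshold_db) / 20.0)

# Illustrative cargo-ship source level and response threshold.
r = effect_range(source_level_db=180.0, threshold_db=120.0)
print(round(r), "m")  # 1000 m
```

Real assessments would substitute a frequency-dependent propagation model and measured source spectra; the sketch only shows how a per-ship effect radius, drawn around AIS positions, yields the affected-area maps.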