DropConnect, a generalization of Dropout, is a very simple regularization technique that was introduced a few years ago and has become extremely popular because of its simplicity and effectiveness. In this thesis, a suitable architecture for applying DropConnect to Learning Vector Quantization networks is proposed, along with a reference implementation and experimental results. In many classification tasks, the uncertainty of the model is a vital piece of information for experts. Methods to extract uncertainty and stability estimates using DropConnect are also proposed, and the corresponding experimental results are documented.
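The core idea can be sketched in a few lines of NumPy; this is a generic illustration of DropConnect (masking individual weights rather than whole units, as Dropout does), not the architecture proposed in the thesis:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropconnect_forward(x, W, drop_prob=0.5):
    """One DropConnect forward pass: zero out individual weights
    (not whole units, as Dropout would) with probability drop_prob."""
    mask = rng.random(W.shape) >= drop_prob   # keep-mask over weights
    return x @ (W * mask)

x = np.ones((1, 4))
W = np.full((4, 3), 0.5)
out = dropconnect_forward(x, W, drop_prob=0.5)
```

Repeating the forward pass with fresh masks at test time yields a distribution of outputs, which is one common way to read off model uncertainty.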
The subject of the following paper is the mental well-being of employees at work and how leaders can improve this well-being using positive psychology. The paper is compilatory in nature, as it uses the research and literature of experts to analyse how employee mental well-being can be further stimulated. The expert literature is used to present tools, but also to demonstrate the effectiveness of these tools through real-life case studies and evidence. The paper aims to inform individuals, leaders, and entire organizations how positive psychology can benefit organizational members’ well-being in the long term. Through a compilation of positive psychology literature and the analysis of real-life case studies, the informative purpose of the thesis is achieved.
This thesis focuses on the optimization and improvement of IP network and IP transit operations, strategy, and service offerings. It offers suggestions in engineering, business, strategy, and operational contexts. The thesis is written in English, as the topic itself is mainly handled in the English language. The first part identifies and evaluates methods which are helpful for improving the practical work on which the second part of this work focuses.
Internationalization and business expansion appear to be the most challenging processes in conducting business today. Every step of the foreign market entry process and the establishment of overseas operations is full of obvious risks and hidden pitfalls. Theoretical background, combined with vital practical experience, plays the key role in such a complicated business process; this information can serve as a guideline for further market entrants and players. At present, Germany with its well-developed engineering industry offers broad scope for research on the internationalization process in its different forms, and can show both successful and negative outcomes of foreign market entries.
Object detection and classification is an active field of research in machine learning and computer vision. Depending on the application, there are different limitations to adjust to, but also possibilities to take advantage of. In this thesis, we focus on the classification and detection of video sequences during night-time. The proposed method is robust since it does not rely on image thresholding [8], which is commonly used in other methods; instead, the thesis uses histograms of oriented gradients (HOG) [37] as features and a support vector machine (SVM) [74] as classifier. It is of great importance that the features extracted from the images are robust and distinct enough to help the classifier distinguish between high beams and low beams. The classifier is part of the object detection and predicts whether or not a test image matches one group or the other; in our case, whether or not an image belongs to a high-beam or low-beam sequence.
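As a rough illustration of orientation-based features, a heavily reduced stand-in for the full HOG descriptor [37] used in the thesis, one can build a single gradient-orientation histogram:

```python
import numpy as np

def hog_like_features(img, n_bins=9):
    """Very reduced HOG-style descriptor: a single histogram of
    gradient orientations weighted by gradient magnitude.
    (Real HOG uses cells, blocks and normalization on top of this.)"""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)   # unsigned orientation
    hist, _ = np.histogram(ang, bins=n_bins, range=(0, np.pi), weights=mag)
    s = hist.sum()
    return hist / s if s > 0 else hist

# toy stand-ins for two image classes: a bright blob vs. vertical stripes
blob = np.zeros((16, 16)); blob[6:10, 6:10] = 1.0
stripes = np.zeros((16, 16)); stripes[:, ::2] = 1.0
f1, f2 = hog_like_features(blob), hog_like_features(stripes)
```

The resulting histograms differ between the two patterns, which is exactly the separability a downstream SVM would exploit.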
In the following study, the properties of the superabsorbent polymer Broadleaf P4 were investigated with the aim of applying the polymer within constructed wetlands. The application of the polymer in constructed wetlands should result in improved removal of pesticides. For this, the polymer was placed into lab-scale wetlands together with pumice, and these were compared to a control wetland filled with gravel. The wetlands ran for several weeks, during which the nutrient removal was recorded. The polymer was also tested for its ability to adsorb the pesticides before the pesticides were added to the wetland beds.
FUSO is one of Japan's leading manufacturers of trucks and buses and an integral part of Daimler AG. As a large truck and bus manufacturer, FUSO faces marketing problems caused by corrosion, which is a major factor in the breakdown and degraded performance of vehicles. To counter this, FUSO initiated a new project called the "Anti-Corrosion Project". Its main mission is to improve the corrosion resistance of metal parts. Currently, almost 70 percent of FUSO's parts fall under Grade III, i.e. they offer less than one year of corrosion resistance.
In this project, corrosion issues are collected through different types of audits, from customers as well as from inspecting two-year-old vehicles in the worst condition. The listed corrosion issues are then investigated against the current specification, and new proposals are requested from suppliers. The cost of each proposed solution is estimated internally and negotiated with the supplier, and the result is then presented to top management for approval. In the case of a higher corrosion specification, parts are taken from the production line and tested in FUSO's in-house materials laboratory. Finally, the approved proposal triggers the release of a drawing change, after which the new specification is implemented. The entire project has to be coordinated across different departments, and working with these teams gives deeper insight into the root causes of the issues.
In parallel, this project focused on shop-floor improvements in the return-parts management (RPM) area. FUSO is also responsible for after-sales service; in other words, it provides a warranty for parts that fail within three years. Failed parts are delivered by customers through dealers for warranty claims, so these parts are called Warranty Part Investigation (WPI) parts. Sometimes a customer wants to know the cause of a failure even though the warranty has expired; in this case the company investigates the cause but does not grant a warranty claim. Such parts are known as Product Quality Report (PQR) parts.
The company runs a dedicated shop floor for return parts, which are received directly from the field. RPM consists of four processes: inwarding, pre-analysis, investigation, and dispatch or scrap.
The company used to receive 30-50 parts per day, but it recently decided to accept all failed parts, which increased delays in inwarding and the downstream processes. To solve this, a standard layout and process were defined. One of the main reasons for the inwarding delay was excessive documentation that was largely unnecessary; this was automated and digitalized. The improvements were carried out using lean manufacturing project methodology, resulting in a higher throughput of failed parts and lower inventory.
Many companies use machine learning techniques to support decision-making and automate business processes by learning from the data they have. In this thesis we investigate the theory behind the machine learning algorithms most widely used in practice for solving classification and regression problems.
In particular, the following algorithms were chosen for the classification problem: Logistic Regression, Decision Trees, Random Forest, Support Vector Machine (SVM), Learning Vector Quantization (LVQ). As for the regression problem, Decision Trees, Random Forest and Gradient Boosted Tree were used. We then apply those algorithms to real company data and compare their performances and results.
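As a generic illustration of such a comparison (on synthetic data, not the company data, and with two deliberately simple classifiers standing in for the full list above):

```python
import numpy as np

rng = np.random.default_rng(42)
# two well-separated Gaussian blobs, binary labels
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

def logreg_accuracy(X, y, lr=0.1, epochs=200):
    """Logistic regression trained by plain gradient descent."""
    w = np.zeros(X.shape[1]); b = 0.0
    for _ in range(epochs):
        p = 1 / (1 + np.exp(-(X @ w + b)))
        w -= lr * X.T @ (p - y) / len(y)
        b -= lr * np.mean(p - y)
    pred = 1 / (1 + np.exp(-(X @ w + b))) > 0.5
    return np.mean(pred == y)

def nearest_mean_accuracy(X, y):
    """One prototype per class: a minimal LVQ-like classifier."""
    protos = np.array([X[y == c].mean(axis=0) for c in (0, 1)])
    pred = np.argmin(((X[:, None, :] - protos) ** 2).sum(-1), axis=1)
    return np.mean(pred == y)

acc_lr = logreg_accuracy(X, y)
acc_nm = nearest_mean_accuracy(X, y)
```

On data this clean both methods score highly; the interesting differences only emerge on real, messy company data as studied in the thesis.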
The application described in this thesis has been designed and built to help nurses and other medical personnel around the world access a real-time database for storing patient records such as patient name, patient ID, patient age, date of birth, and the symptoms the patient is experiencing. A real-time database is a live database in which all changes are immediately reflected across all devices accessing it. This application will be beneficial especially in countries where access to a computer or medical equipment is not always possible. A phone is always ready for use and within reach, so users of this application will be able to access the data at any time and place, add a new patient, or search for existing patients. In addition, the application allows us to take RAW medical images that can be used to identify anomalies in a blood sample. RAW images are important for this application because they are uncompressed and therefore lose no quality or detail. The users of the application are the medical personnel who take care of the patients. These users have to create a profile on the database in order to use the application, since their data, such as the user ID, is used to control how data is retrieved and stored. We also discuss the current and future features of the application, its benefits for medical personnel as well as patients, and finally its implementation from both a hardware and a software perspective.
Implementation of a customised business model for innovative engineering consultancy services
(2019)
Business development is vital for every organisation that intends to grow. Growth follows from expansion through organic and inorganic means, and there are many innovative business styles which help organisations expand. This thesis shows how an engineering-services organisation chose its form of business expansion.
The following thesis explains how an engineering-service-sector company uses its expertise to expand its business into the consultancy market, demonstrated by a real-life executed business model.
The thesis provides a solution for the following issues:
1) What is the best in-house strategy to develop for business expansion in the service industry?
2) How did niche-market experience help with business expansion?
Social media platforms play an increasing role in marketing, politics and police affairs, because they can strongly influence opinions. So-called “opinion leaders” exert their influence in a given network and shape the opinions of other users. Identifying central nodes in a social graph has been of interest for decades. However, not all centrality measures were developed for social media platforms: they were built for social graphs that did not include additional metrics (e.g. “likes” or “shares”). Nevertheless, these metrics play a crucial role on modern platforms. Hence, outdated measures need to be adjusted and additional metrics need to be integrated to ensure the best possible results.
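One possible way of folding engagement metrics into a centrality measure can be sketched as follows; the node names, weights, and the specific weighting scheme are purely illustrative assumptions, not the measures developed in the thesis:

```python
# Classic degree centrality only counts edges; a platform-aware variant
# can weight each interaction by engagement (likes + shares).
edges = [                       # (source, target, likes, shares)
    ("alice", "bob",   10, 2),
    ("alice", "carol", 50, 8),
    ("dave",  "alice",  5, 1),
]

def engagement_centrality(edges, like_w=1.0, share_w=2.0):
    """Sum engagement-weighted interactions touching each node."""
    score = {}
    for src, dst, likes, shares in edges:
        w = like_w * likes + share_w * shares
        score[src] = score.get(src, 0.0) + w   # influence exerted
        score[dst] = score.get(dst, 0.0) + w   # influence received
    return score

scores = engagement_centrality(edges)
```

Under this toy weighting, a node with few but highly shared posts can outrank a node with many low-engagement connections, which is the adjustment the abstract argues for.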
Prototype-based classification methods like Generalized Matrix Learning Vector Quantization (GMLVQ) are simple and easy to implement. An appropriate choice of the activation function plays an important role in the performance of (deep) multilayer perceptrons (MLPs), which rely on a non-linearity for classification and regression learning. In this thesis, non-linear activation functions known to be successful in MLPs are investigated for application in GMLVQ to realize a non-linear mapping. The influence of these non-linear activation functions on the performance of the model with respect to accuracy and convergence rate is analyzed, and experimental results are documented.
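For context, the GLVQ classifier function and the role of a squashing non-linearity can be sketched as follows; the logistic sigmoid and the steepness `theta` are illustrative choices, not values from the thesis:

```python
import numpy as np

def glvq_mu(d_plus, d_minus):
    """GLVQ classifier function: relative distance difference in [-1, 1].
    d_plus is the distance to the closest correct prototype, d_minus to
    the closest wrong one; mu > 0 indicates misclassification."""
    return (d_plus - d_minus) / (d_plus + d_minus)

def cost(mu, activation="identity", theta=5.0):
    """Per-sample loss contribution; a squashing non-linearity such as
    the logistic sigmoid changes how strongly border cases drive learning."""
    if activation == "sigmoid":
        return 1.0 / (1.0 + np.exp(-theta * mu))
    return mu   # identity / linear activation

mu = glvq_mu(d_plus=2.0, d_minus=1.0)   # a misclassified sample: mu > 0
```

Swapping the sigmoid for other non-linearities at this point is, in spirit, what the thesis investigates for GMLVQ.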
This master thesis covers the formation of customer relationships in the IT outsourcing market, using the example of the “ABC” company. Most works related to IT outsourcing cover the problems of implementing IT services and the process of providing them to customers, and almost all issues are covered from the perspective of consumers. Thus, the problems and results of outsourcing providers of IT services remain almost uncovered. This master thesis reveals the specific features of the IT outsourcing business in Belarus and develops an approach to the formation and construction of a system of relationships between the company and its clients as a source of increased competitiveness.
In this thesis, the changes in the economy and society and the resulting effects on the labor market are outlined. Current studies show that the shrinking labor market and increasing digitalization result in a lack of skilled tech talent and a transition from an employer market to a clear employee market. Derived from the findings of scientific research on this topic and from expert interviews, practical recommendations for recruitment actions within the scope of employer branding are defined in order to help corporations gain the needed tech skill set and drive innovation.
This bachelor thesis introduces the theoretical concept of Human Resources Management (HRM), analyzes the work of the human resources department of LLC Tavria-V, and proposes recommended actions to improve the productivity of the personnel. First, an overview of the theoretical and methodological aspects of HRM is presented, and theories that have shaped today's treatment of the workforce are described. Secondly, the concept of the organizational work of the enterprise and its main indexes and types of activities are identified and analyzed in tables and diagrams. The main object of the thesis, the process of personnel management with its qualitative characteristics, is described and presented. Using a survey of employees, the advantages and disadvantages of the present HRM system are identified. In the last part, taking into account all data about the current situation, recommended actions for LLC Tavria-V and their expected effect are presented on the basis of the personnel management analysis.
Cryptorchidism describes a disease in which one or both testes do not descend into the scrotum properly. With a prevalence of up to 10%, cryptorchidism is one of the most common birth defects of the male genital tract. Despite its associated health risks and the accompanying economic damage resulting from surgery and losses in breeding, studies on canine cryptorchidism and its causes are relatively rare. In this study, a relational database for genetic causes of cryptorchidism was established and used as a basis for the identification of candidate genes. Associated regions were analysed by nanopore sequencing with the goal of identifying genetic variants correlated with cryptorchidism in the German Sheep Poodle.
The aim of this bachelor thesis was to establish extracytoplasmic function (ECF) σ factors as synthetic genetic regulators for biotechnological and synthetic biology applications in the newly emerging model organism Vibrio natriegens. To this end, synthetic genetic circuits were engineered on plasmids as a test set-up for the investigated ECFs and their target promoters. The resulting plasmid library consisted of reporter plasmids with the target promoter fused to a lux cassette, a set of high-copy ECF plasmids, and a backup set of lower-copy ECF plasmids. First, the high-copy plasmids were transformed into V. natriegens to test their functionality at different inducer levels, which yielded good inducibility for a few strains but showed too high ECF expression in most. For this reason, the set of lower-copy plasmids was used for combinatorial co-transformation to investigate the cross-talk of the ECFs to unspecific ECF target promoters. Switching to the lower-copy plasmid set proved partly helpful, while much room for fine-tuning of the circuits remains. The knowledge gained can be used to achieve higher success rates when engineering synthetic circuits for various applications in V. natriegens by using the ECFs recommended here as suitable synthetic genetic regulators.
The epithelial membrane proteins (EMP1-3), which belong to the family of peripheral myelin proteins 22-kDa (PMP22), are involved in epithelial differentiation. EMP2 was found to be a downstream target gene of the tumor suppressor gene HOPX, a homeobox-containing gene. Additionally, a dysregulation of EMP2 has been observed in various cancers, but the function of EMP2 in human lung cancer has not yet been clarified.
In this study, a real-time RT-PCR, Western blot and cytoblock analysis were performed to analyze the expression of EMP2. Gain-of-function was achieved by stable transfection with an EMP2 expression vector and loss-of-function by siRNA knockdown. Stable transfection led to overexpression of EMP2 at both mRNA and protein levels in the transfected cell lines H1299 and H2170.
Functional assays including proliferation, colony formation, migration and invasion assays as well as cell cycle analyses were performed after stable transfection, and it was found that ectopic EMP2 expression resulted in reduced cell proliferation, migration and invasion as well as a G1 cell cycle arrest. After the EMP2 gene was silenced by siRNA knockdown, inhibition of the cells' invasive property was observed. These phenomena were accompanied by reduced AKT, mTOR and p38 activities.
Taken together, the data suggest that the epithelial membrane protein 2 (EMP2) is a tumor suppressor and exerts its tumor suppressive function by inhibiting AKT and MAPK signaling pathways in human lung cancer cells.
In the present bachelor thesis, nanopore sequencing and Illumina sequencing were compared using pollen DNA collected from honeybees and bumblebees. For this, nanopore sequencing was performed with MinION sequencers and the generated reads were analysed with bash programming. A quantitative and a qualitative (based on ITS2 sequences) BLAST run were performed. The results confirm the error probability of nanopore sequencing described in the literature. Nevertheless, similar sample preferences of the bees were observed with both sequencing methods, allowing ecological conclusions.
In today’s market, the process of dealing with textual data for internal and external processes has become increasingly important and more complex for certain companies. In this context, the thesis aims to support the analysis of similarities among textual documents by analyzing the relationships among them. The proposed analysis process includes discovering similarities among financial documents as well as possible patterns. The proposal is based on the exploitation and extension of existing approaches as well as on their combination with well-known clustering analysis techniques. Moreover, a software tool has been implemented for the evaluation of the proposed approach and evaluated on EDGAR filings on the basis of qualitative criteria.
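A minimal sketch of the kind of document-similarity computation involved — plain TF-IDF with cosine similarity on toy documents; the thesis's actual approach extends and combines existing techniques well beyond this:

```python
import math
from collections import Counter

docs = [  # toy stand-ins for financial filings
    "revenue increased due to higher sales",
    "revenue decreased due to lower sales",
    "the board approved a new dividend policy",
]

def tfidf_vectors(docs):
    """Term-frequency * smoothed inverse-document-frequency vectors."""
    tokenized = [d.split() for d in docs]
    vocab = sorted({t for toks in tokenized for t in toks})
    n = len(docs)
    df = {t: sum(t in toks for toks in tokenized) for t in vocab}
    vecs = []
    for toks in tokenized:
        tf = Counter(toks)
        vecs.append([tf[t] / len(toks) * math.log((1 + n) / (1 + df[t]))
                     for t in vocab])
    return vecs, vocab

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

vecs, vocab = tfidf_vectors(docs)
sim_01 = cosine(vecs[0], vecs[1])   # thematically similar filings
sim_02 = cosine(vecs[0], vecs[2])   # unrelated filings
```

Such pairwise similarities are exactly the input a clustering step would group into patterns.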
This master thesis covers two main topics, the sharing economy and risk management, and combines them within the frame of this paper in order to provide a methodology (with Uber chosen as an example) for how a risk management process may be applied to a sharing economy business, as well as which types of risks are of special relevance for those types of businesses.
In this thesis, novel proteases from psychrotolerant bacterial strains were isolated and characterized with respect to their biochemical properties. Furthermore, genes of S8-family proteases were amplified, and differences in their amino acid sequences could be linked to the biochemical properties of the proteases.
A relatively new research field of the neurosciences, called connectomics, aims to achieve a full understanding and mapping of the neural circuits and fine neuronal structures of the nervous system in a variety of organisms. This detailed information will provide insight into how our brain is influenced by different genetic and psychiatric diseases, how memory traces are stored, and how ageing influences our brain structure. It is beyond question that new methods for data acquisition will produce large amounts of neuronal image data. This data will exceed the zettabyte range and is impossible to annotate manually for visualization and analysis. Nowadays, machine learning algorithms, and especially deep convolutional neural networks, are heavily used in medical imaging and computer vision, which brings the opportunity of designing fully automated pipelines for image analysis. This work presents a new automated workflow based on three major parts: image processing using consecutive deep convolutional networks, a pixel-grouping step called connected components, and 3D visualization via neuroglancer, to achieve a dense three-dimensional reconstruction of neurons from EM image data.
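The connected-components step can be illustrated on a tiny 2D example; real pipelines operate on 3D volumes, e.g. with scipy.ndimage.label, but the grouping idea is the same:

```python
import numpy as np

def connected_components(mask):
    """Label 4-connected foreground regions of a binary image via
    iterative flood fill; each label would correspond to one
    reconstructed neuron fragment."""
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for i, j in zip(*np.nonzero(mask)):
        if labels[i, j]:
            continue                     # already assigned to a component
        current += 1
        stack = [(i, j)]
        while stack:
            a, b = stack.pop()
            if (0 <= a < mask.shape[0] and 0 <= b < mask.shape[1]
                    and mask[a, b] and not labels[a, b]):
                labels[a, b] = current
                stack += [(a + 1, b), (a - 1, b), (a, b + 1), (a, b - 1)]
    return labels, current

seg = np.array([[1, 1, 0, 0],
                [0, 0, 0, 1],
                [0, 1, 0, 1]])          # toy binary segmentation mask
labels, n = connected_components(seg)
```

In the workflow described above, `seg` would be the thresholded output of the convolutional networks, and each label becomes one object handed to the 3D visualization.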
In the following bachelor thesis, the current trends and potential applications of digitalization in the service industry are discussed. With today's surging demand for digitalization in all industries, there are branches of the service industry where digitalization is yet to be exploited to its full potential. However, it is difficult to decide which branches of the industry should be fully digitalized and which only partially. The result of this work should therefore facilitate the application of digitalization in consulting services, where face-to-face human interaction has been the key to the industry for years. For this purpose, essential factors were identified that are to be considered in the analysis, in the specification of the system requirements, and in the performance of a utility value analysis.
Digital innovation in the quality management system from supply chain to final product conformity
(2019)
As the Industry 4.0 revolution unfolds, digitalization has become the new trend in innovation. The goal is therefore to digitalize the process from the supply chain to the final product conformity of the aircraft.
Every document received from the supplier (e.g. CoC, inspection report, concession) is handled digitally. When a part is received at the OEM's warehouse, the warehouse personnel use a system that confirms, with the help of a QR code, that part A with serial no. X is the correct fit for part no. Y, and books the part into the ERP.
The biggest challenge is to reduce the in-production inspection work done by humans. We want to go one step further with automation, adding IoT to the process for better data processing, reducing the overall inspection time, creating a proper visual automated control system, making the process more accurate with the help of gauge R&R, and certifying the traceability of the process. Finally, given the volume of data involved, data security is needed: a proper data source and data storage for supplier data as well as for internal data.
In the practice of software engineering, project managers often face the problem of software project scheduling, which is related to the resource-constrained project scheduling problem. In software project scheduling, the main resources are the employees, each with a skill set and a required salary. The main purpose of software project scheduling is to assign the tasks of a project to the available employees such that the total cost and duration of the project are minimized, while ensuring that the constraints of software project scheduling are fulfilled. The software project scheduling problem (SPSP) involves complex combinatorial optimization, and its search space grows exponentially as the number of tasks and employees increases, which makes it NP-hard. Since the goal is to minimize both the total cost and the duration of the project, it is a multi-objective problem. Many algorithms have been proposed that claim to give near-optimal results for NP-hard problems, but only a few produce feasible sets of solutions for the SPSP, and more efficient algorithms yielding feasible results are still sought.
Nowadays, many such problems are solved with nature-inspired algorithms, because these algorithms balance exploration and exploitation. Some of them, e.g. genetic algorithms, Ant Colony Optimization (ACO) and the Firefly algorithm, have been applied to the SPSP. Nature-inspired algorithms like particle swarm optimization, genetic algorithms and Ant Colony Optimization provide more promising results than naive and greedy algorithms; however, there is always room for improvement. The main purpose of this research is to use the bat algorithm to obtain efficient results and solutions for the software project scheduling problem. In this work a modified bat algorithm is implemented in which a different random-walk approach is used. The contributions of this thesis are: (1) to adapt and apply a modified multi-objective bat algorithm for solving the SPSP efficiently, (2) to adapt and apply other nature-inspired algorithms, such as genetic algorithms, to the SPSP, and (3) to compare and analyze the results obtained by the applied nature-inspired algorithms and draw conclusions.
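For illustration, evaluating one candidate SPSP solution (a dedication matrix mapping employees to tasks) against the two objectives might look like this; the instance data are made up, and task precedence constraints are ignored for simplicity:

```python
import numpy as np

# Toy SPSP instance: 2 employees, 3 tasks. Efforts in person-months,
# salaries per month; the dedication matrix is one candidate solution,
# as a bat-algorithm individual would encode it.
effort = np.array([2.0, 4.0, 3.0])           # per task
salary = np.array([3000.0, 4000.0])          # per employee, monthly
dedication = np.array([[1.0, 0.5, 0.0],      # employee 0's share per task
                       [0.0, 0.5, 1.0]])     # employee 1's share per task

def evaluate(dedication):
    """Return (total_cost, total_duration) of one candidate schedule."""
    workforce = dedication.sum(axis=0)        # total dedication per task
    duration = effort / workforce             # months per task
    # each employee is paid for the time they spend on each task
    cost = (salary[:, None] * dedication * duration).sum()
    return cost, duration.sum()

cost, duration = evaluate(dedication)
```

A multi-objective search such as the modified bat algorithm would generate many such matrices and keep the Pareto-optimal (cost, duration) trade-offs.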
The theoretical foundations of enterprise management using information technology were reviewed; the effectiveness of the use of information systems in the enterprise was analysed; and ways of improving the enterprise management mechanism using information systems (using the example of Mars Wrigley Confectionery Belarus) were developed.
Neural networks have become one of the most powerful algorithms for learning from big data sets and are used extensively for classification. But the deeper the network model, the lower its interpretability. Although many methods exist to explain the output of such networks, the lack of interpretability makes them black boxes. On the other hand, prototype-based machine learning algorithms are known to be interpretable and robust.
Therefore, the aim of this thesis is to find a way to interpret the functioning of neural networks by introducing a prototype layer into the neural network architecture. This prototype layer is trained alongside the neural network and helps us interpret the model. We present architectures of neural networks consisting of autoencoders and prototypes that perform activity recognition from heart rates extracted from ECG signals. The prototypes represent the different activity groups that the heart rates belong to and thereby aid interpretability.
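The prototype layer's core operation, computing distances from latent codes to prototype vectors, can be sketched as follows; the latent codes and prototypes are made-up toy values, not the trained model from the thesis:

```python
import numpy as np

def prototype_layer(z, prototypes):
    """Squared Euclidean distances from encoded samples z (n, d) to the
    prototype vectors (k, d). A small distance means high similarity, so
    the nearest prototype gives an interpretable class explanation."""
    return ((z[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=-1)

# made-up autoencoder latent codes and 3 prototypes for 3 activity groups
z = np.array([[0.1, 0.0],
              [2.0, 2.1]])
prototypes = np.array([[0.0, 0.0],
                       [2.0, 2.0],
                       [5.0, 5.0]])
d = prototype_layer(z, prototypes)
pred = d.argmin(axis=1)     # nearest-prototype assignment
```

In a trained model one would decode each prototype back through the autoencoder to see what a "typical" member of each activity group looks like, which is where the interpretability comes from.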
Workload Optimization Techniques for Password Guessing Algorithms on Distributed Computing Platforms
(2019)
The following thesis covers several ways to optimize distributed computing platforms for cryptanalytic purposes. After an introduction to password storage, password guessing attacks, and distributed computing in general, a set of initial benchmark results for a variety of different devices is analyzed. The results shown are mainly based on the open source password recovery tool Hashcat. The second part of this work presents an algorithmic implementation for information retrieval and workload generation. This thesis can be used for the conception of a distributed computing system, inventory analysis of available hardware devices, runtime and cost estimations for specific jobs, and finally strategic workload distribution.
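One simple form of strategic workload distribution, splitting a keyspace across devices in proportion to their benchmarked speeds, can be sketched as follows; the device names and rates are invented, and this is not the thesis's algorithm:

```python
# Toy benchmark results in hashes/second; names are illustrative only.
bench = {"gpu-a": 9000, "gpu-b": 6000, "cpu-1": 1000}
keyspace = 16_000_000           # total candidates to test

def split_keyspace(keyspace, bench):
    """Assign each device a contiguous (offset, size) slice of the
    keyspace proportional to its benchmarked speed (cf. Hashcat's
    --skip/--limit options), so all devices finish at roughly the
    same time."""
    total = sum(bench.values())
    chunks, offset = {}, 0
    devices = list(bench)
    for i, dev in enumerate(devices):
        size = (keyspace * bench[dev]) // total
        if i == len(devices) - 1:      # last device absorbs rounding
            size = keyspace - offset
        chunks[dev] = (offset, size)
        offset += size
    return chunks

chunks = split_keyspace(keyspace, bench)
```

Each slice could then be dispatched as a separate job with the corresponding skip/limit values.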
The Infinica product suite consists of multiple individual microservice applications, mainly gathered around the Infinica Process Engine, which allows the execution of highly individualised process definitions. To estimate process performance, a layered queuing network approach has been applied. The first step required the implementation of a basic modelling framework. Subsequently, the implemented framework was used to evaluate the applicability of the approach by creating two models and comparing them with actual performance measurements. Although the calculated results deviated from the expected results, analysis showed that the differences may derive from an inaccurate model. Nevertheless, the general approach seems to be appropriate for the given application as well as for microservices in general, especially when extended with advanced modelling techniques, as the analysed model results appear consistent.
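The simplest building block of such a queuing model, the mean response time of an M/M/1 service station, can be illustrated as follows; the rates are made-up examples, and real layered queuing network models compose many such stations across layers:

```python
def mm1_response_time(arrival_rate, service_rate):
    """Mean response time R = 1 / (mu - lambda) of an M/M/1 queue,
    the elementary station inside layered queuing network models."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable queue: utilization >= 1")
    return 1.0 / (service_rate - arrival_rate)

# e.g. 50 req/s arriving at a microservice that can serve 80 req/s
util = 50 / 80                      # utilization = lambda / mu
r = mm1_response_time(50, 80)       # mean response time in seconds
```

As utilization approaches 1, the response time grows without bound, which is why such models are useful for spotting the bottleneck service before it saturates.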
This thesis provides an overview of Generation Z with a focus on Mittweida University of Applied Sciences students. It explores the general issues of students' behavior in life, as well as their attitudes toward the financial and banking sectors. It also examines the German banking market, its strengths and weaknesses in attracting new clients. At the end, possible strategies for the development of the bank in terms of attractiveness for young people are provided.
Tokenization projects are currently very prominent when it comes to new blockchain technologies. After explaining the fundamentals of cross-chain interaction, this bachelor thesis focuses on technology for tokenizing Bitcoin on Ethereum. To provide a more practical context, the implementation of the currently most successful decentralized tokenization project is described.
Vicia faba leaves and calli were transformed using CRISPR-Cas RNPs. Two kinds of CPP-fused SpyCas9 were used with sgRNA7, sgRNA5 or sgRNA13, targeting PDS exon 1, PDS exon 2 or MgCh exon 3, respectively. The RNPs were applied using high-pressure spraying, biolistic delivery, incubation in RNP solution, and infiltration of leaf tissue. A PCR- and restriction-enzyme-based approach was used for the detection of mutations. Screening of 679 E. coli colonies containing the cloned fragments resulted in the detection of 14 mutations, most of which were deletions of 150, 500 or 730 bp. Five of the 14 mutations were point mutations located two to three bp upstream of the PAM.
In bioinformatics one important task is to distinguish between native and mirror protein models based on the structural information. This information can be obtained from the atomic coordinates of the protein backbone. This thesis tackles the problem of distinction of these conformations, looking at the statistics of the dihedral angles’ distribution regarding the protein backbone. This distribution is visualized in Ramachandran plots. By means of an interpretable machine learning classification method – Generalized Matrix Learning Vector Quantization – we are able to distinguish between native and mirror protein models with high accuracy. Further, the classifier model supplies supplementary information on the important distributional regions for distinction, like α-helices and β-strands.
A protein is a large molecule that consists of a vast number of atoms; one can only imagine the complexity of such a molecule. A protein is a series of amino acids that bind to each other to form specific sequences known as peptide chains. Proteins fold into three-dimensional conformations (their so-called native structures) to perform their functions. However, not every protein folds into a correct structure, as mutations can occur in the amino acid sequence, and such mutations cause many protein misfolding diseases. Protein folding is a hard problem in biology. Predicting changes in protein stability free energy upon amino acid mutation (ΔΔG) helps to better comprehend the driving forces underlying how proteins fold into their native structures. Therefore, measuring the difference in Gibbs free energy provides more insight into how protein folding occurs, and this knowledge might prove beneficial in designing new drugs to treat protein-misfolding-related diseases. The protein energy profile aids in understanding the sequential, structural, and functional relationship by assigning an energy profile to a protein structure. Additionally, measuring the change in the protein energy profile consequent to a mutation (ΔΔE) using an approach derived from statistical physics will lead us to a thorough understanding of the protein structure. In this work, we attempt to show that ΔΔE values approximate ΔΔG values, suggesting that future studies may consider the energy profile to be as good a predictor of protein binding affinity as Gibbs free energy for tackling the protein folding problem.
The automatic comparison of RNA/DNA or rather nucleotide sequences is a complex task requiring careful design due to its computational complexity. While alignment-based models suffer from high computational costs in time, alignment-free models have to deal with appropriate data preprocessing and consistently designed mathematical data comparison. This work deals with the latter strategy. In particular, a systematic categorization is proposed, which emphasizes two key concepts that have to be combined for a successful comparison analysis: 1) the data transformation, comprising adequate mathematical sequence coding and feature extraction, and 2) the subsequent (dis-)similarity evaluation of the transformed data by means of problem-specific but mathematically consistent proximity measures. Respective approaches of different categories
of the introduced scheme are examined with regard to their suitability to distinguish natural RNA virus sequences from artificially generated ones encompassing varying degrees of biological feature preservation. The challenge in this application is the limited additional biological information available, such that the decision has to be made solely on the basis of the sequences and their
inherent structural characteristics. To address this, the present work focuses on interpretable, dissimilarity-based classification models of machine learning, namely variants of Learning Vector Quantizers. These methods are known to be robust and highly interpretable, and therefore allow the applied data transformations, together with the chosen proximity measure, to be evaluated with respect to the given discrimination task. First analysis results are provided and discussed, serving as a starting point for a more in-depth analysis of this problem in the future.
The convolutional neural network (CNN) is one of the most powerful and popular techniques employed for image classification problems. Here, we use other signal processing techniques, namely the Fourier transform and the wavelet transform, to preprocess the images in conjunction with different classifiers such as MLP, LVQ, GLVQ and GMLVQ, and compare their performance with a CNN.
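The spectral-preprocessing idea can be sketched in a few lines: the log-magnitude Fourier spectrum of each image serves as the feature vector for a simple distance-based classifier. The toy images, their dominant frequencies, and the nearest-class-mean classifier below are illustrative assumptions, not the actual pipeline of the thesis.

```python
import numpy as np

def fft_features(image):
    """Flatten the log-magnitude Fourier spectrum of a 2-D image into a feature vector."""
    spectrum = np.fft.fftshift(np.fft.fft2(image))
    return np.log1p(np.abs(spectrum)).ravel()

# Toy data: two classes of 8x8 "images" dominated by different spatial frequencies.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0 * np.pi, 8)
low = [np.outer(np.sin(x), np.sin(x)) + 0.1 * rng.normal(size=(8, 8)) for _ in range(20)]
high = [np.outer(np.sin(3 * x), np.sin(3 * x)) + 0.1 * rng.normal(size=(8, 8)) for _ in range(20)]

features = np.array([fft_features(img) for img in low + high])
labels = np.array([0] * 20 + [1] * 20)

# Nearest-class-mean classifier in the spectral feature space.
means = np.array([features[labels == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((features[:, None, :] - means[None, :, :]) ** 2).sum(axis=2), axis=1)
accuracy = (pred == labels).mean()
```

Because the two classes concentrate their spectral energy in different frequency bins, even this trivial classifier separates them well.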
Anomaly detection is a pressing technical problem for many business enterprises. In this thesis, a combination of the Growing Neural Gas and the Generalized Matrix Learning Vector Quantization is presented as a solution based on collected theoretical and practical knowledge. The whole network is described and implemented along with references and experimental results. The proposed model is carefully documented, and the remaining open research questions are stated for future investigation.
Genetic sequence variations at the level of gene promoters influence the binding of transcription factors. In plants, this often leads to differential gene expression across natural accessions and crop cultivars. Some of these differences are propagated through molecular networks and lead to macroscopic phenotypes. However, the link between promoter sequence variation and the variation of its activity is not yet well understood. In this project, we use the power of deep learning in 728 genotypes of Arabidopsis thaliana to shed light on some aspects of that link. Convolutional neural networks were successfully implemented to predict the likelihood of a gene being expressed from its promoter sequence. These networks were also capable of highlighting known and putative new sequence motifs causal for the expression of genes. We tested our algorithms in various scenarios, including single and multiple point mutations, as well as indels on synthetic and real promoter sequences and the respective performance characteristics of the algorithm have been estimated. Finally, we showed that the decision boundary to classify genes as expressed and non-expressed depends on the sensitivity of the transcriptome profiling assay and changing it has an impact on the algorithm’s performance.
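The core operation of such promoter-scanning networks, sliding a learned filter over a one-hot-encoded sequence, can be sketched as follows. The example sequence, the "TATA" filter, and the match/mismatch scoring are hypothetical illustrations, not the trained motifs from the project.

```python
import numpy as np

BASES = "ACGT"

def one_hot(seq):
    """One-hot encode a DNA string into a (len(seq), 4) matrix."""
    m = np.zeros((len(seq), 4))
    for i, base in enumerate(seq):
        m[i, BASES.index(base)] = 1.0
    return m

def motif_scan(seq, filt):
    """Slide a motif filter over the one-hot sequence: a minimal 1-D convolution."""
    x = one_hot(seq)
    k = filt.shape[0]
    return np.array([(x[i:i + k] * filt).sum() for i in range(len(seq) - k + 1)])

# Hypothetical filter that rewards the motif "TATA" and penalises mismatches.
tata_filter = one_hot("TATA") * 2.0 - 1.0

scores = motif_scan("GGCGTATAAGGC", tata_filter)
best_position = int(np.argmax(scores))   # position where the motif fires strongest
```

A CNN learns many such filters jointly, but the per-filter activation map is exactly this kind of sliding dot product, which is why the filters can later be inspected as candidate sequence motifs.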
Data streams change their statistical behaviour over time. These changes can occur gradually or abruptly for unforeseen reasons, which may affect the expected outcome. It is therefore important to detect concept drift as soon as it occurs. In this thesis, we chose a distance-based methodology to detect the presence of concept drift in data streams. We used generalized learning vector quantization (GLVQ) and generalized matrix learning vector quantization (GMLVQ) classifiers for distance calculation between prototypes and data points. Chi-square and Kolmogorov–Smirnov tests are used to compare the distance distributions of test and train data sets to indicate the presence of drift.
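The drift-indication step can be illustrated with a small sketch: compute the two-sample Kolmogorov–Smirnov statistic between the prototype-distance distributions of the training and test windows. The Gaussian toy distances below stand in for actual GLVQ/GMLVQ distances and are purely illustrative.

```python
import random
from bisect import bisect_right

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: largest gap between empirical CDFs."""
    a, b = sorted(a), sorted(b)
    return max(abs(bisect_right(a, x) / len(a) - bisect_right(b, x) / len(b))
               for x in a + b)

random.seed(0)
# Distances of a window to the nearest prototype, before and after a simulated drift.
train_dist = [random.gauss(1.0, 0.2) for _ in range(500)]
drifted_dist = [random.gauss(1.6, 0.2) for _ in range(500)]
stable_dist = [random.gauss(1.0, 0.2) for _ in range(500)]

drift_score = ks_statistic(train_dist, drifted_dist)   # large: drift indicated
stable_score = ks_statistic(train_dist, stable_dist)   # small: no drift indicated
```

A shift in the distance distribution pushes the KS statistic towards 1, while windows from the same regime stay near 0, which is the signal thresholded in practice.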
In response to prevailing environmental conditions, Arabidopsis thaliana plants must increase their photosynthetic capacity to acclimate to potentially harmful high light stress. To measure these changes in acclimation capacity, different high-throughput imaging-based methods can be used. In this master thesis, we studied different Arabidopsis thaliana knockout mutants and accessions in their capacity to acclimate to potentially harmful high light and cold temperature conditions using a high-throughput phenotyping system with an integrated chlorophyll fluorescence measurement system. To determine the acclimation capacity, Arabidopsis thaliana knockout mutants of genes not previously associated with high light, as well as accessions of two different haplotype groups with a reference and an alternative allele from different countries of origin, were grown under switching high light and temperature conditions. Photosynthetic analysis showed that knockout mutant plants differed from the wildtype in their Photosystem II operating efficiency during an increased light irradiance switch, but no longer differed significantly a week later under the same circumstances. High-throughput phenotyping of haplotype accessions revealed significantly better acclimation capacity in non-photochemical quenching and steady-state photosynthetic efficiency in Russian accessions with an altered SPPA gene during high light and cold stress.
This thesis deals with the development of a methodology and concept to analyse targeted attacks against IIoT/IoT devices. Building on established background knowledge about honeypots, fileless malware and injection techniques, a methodology is created that leads to a concept for a honeypot analysis system. The system is designed to analyse and detect novel threats such as fileless attacks, which are often utilized by Advanced Persistent Threats. The system is partially implemented and later evaluated by performing a simulated attack utilizing fileless techniques. Its effectiveness is discussed and rated based on the results.
Financial fraud can cause huge monetary losses for banks. Studies have shown that, if not mitigated, financial fraud can lead to bankruptcy for big financial institutions and even insolvency for individuals. Credit card fraud is an ever-growing type of financial fraud. These numbers are expected to increase exponentially in the future, which is why many researchers are focusing on machine learning techniques for detecting fraud. This, however, is not a simple task, for two main reasons:
• varying behaviour in committing fraud
• a high level of imbalance in the dataset (normal or genuine cases vastly outnumber fraudulent cases)
When such an imbalanced dataset is provided as input, a predictive model usually tends to be biased towards the majority class.
In this thesis, this problem is tackled by implementing a data-level approach in which different resampling methods such as undersampling, oversampling, and hybrid strategies, along with bagging and boosting algorithmic approaches, have been applied to a highly skewed dataset with 492 identified frauds out of 284,807 transactions.
Predictive modelling algorithms like Logistic Regression, Random Forest, and XGBoost have been implemented along with different resampling techniques to predict fraudulent transactions.
The performance of the predictive models was evaluated based on the area under the receiver operating characteristic curve (AUC-ROC), the area under the precision-recall curve (AUC-PR), precision, recall, and F1 score.
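A minimal sketch of the random undersampling strategy mentioned above, applied to a toy transaction set with the same kind of skew; the data and the balance-to-the-minority rule are illustrative assumptions, not the thesis's exact resampling configuration.

```python
import random

def undersample(samples, labels, seed=0):
    """Randomly drop majority-class samples until every class is balanced."""
    rng = random.Random(seed)
    by_class = {}
    for sample, label in zip(samples, labels):
        by_class.setdefault(label, []).append(sample)
    n_min = min(len(group) for group in by_class.values())
    balanced = [(sample, label)
                for label, group in by_class.items()
                for sample in rng.sample(group, n_min)]
    rng.shuffle(balanced)
    return balanced

# Skewed toy set: 990 genuine (label 0) vs. 10 fraudulent (label 1) transactions.
samples = list(range(1000))
labels = [0] * 990 + [1] * 10
balanced = undersample(samples, labels)
counts = {0: 0, 1: 0}
for _, label in balanced:
    counts[label] += 1
```

Undersampling trades information (most genuine transactions are discarded) for an unbiased class ratio, which is why it is usually combined with ensembling such as the bagging approaches applied in the thesis.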
Drought is one of the most common and dangerous threats plants have to face, costing the global agricultural sector billions of dollars every year and leading to the loss of tons of harvest. Until people drastically reduce their consumption of animal products or cellular agriculture comes of age, more and more crops will need to be produced to sustain the ever-growing human population. Even then, as more areas on Earth are becoming prone to drought due to climate change, we may still have to find or breed plant varieties more suitable to grow and prosper in these changing environments.
Plants respond to drought stress with a complex interplay of hormones, transcription factors, and many other functional or regulatory proteins, and mapping out this web of agents is no trivial task. Over the last two to three decades, machine learning has become immensely popular and is increasingly used to find patterns in situations that are too complex for the human mind to grasp. Even though much of the hype is focused on the latest developments in deep learning, relatively simple methods often yield superior results, especially when data is limited and expensive to gather.
This master's thesis, conducted at the IPK in Gatersleben, develops an approach for shedding light on the phenotypic and transcriptomic processes that occur when a plant is subjected to stress. It centers on a random forest feature selection algorithm, and although it is used here to illuminate the drought stress response in Arabidopsis thaliana, it can be applied to all kinds of stresses in all kinds of plants.
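The feature-selection idea can be sketched with scikit-learn's impurity-based importances; the synthetic data and the top-k cutoff below are illustrative assumptions rather than the thesis's actual transcriptomic setup.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic "expression matrix": 200 samples, 10 features, only the first 3 informative.
X, y = make_classification(n_samples=200, n_features=10, n_informative=3,
                           n_redundant=0, shuffle=False, random_state=0)

forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Rank features by impurity-based importance and keep the top three.
importances = forest.feature_importances_
ranking = sorted(range(10), key=lambda i: importances[i], reverse=True)
selected = ranking[:3]
```

Because the forest splits preferentially on the informative columns, their importances dominate the noise columns, and the ranking can serve as a candidate-gene shortlist.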
We present dimensionality reduction methods such as autoencoders and t-SNE for visualizing high-dimensional data in a two-dimensional map. In this thesis, we initially implement basic and deep autoencoders using breast cancer and mushroom datasets. Next, we apply another dimensionality reduction method, t-SNE, to the same datasets. The visualization results obtained with these methods are documented in the experiments section of the thesis. The classification and clustering performance of the dimensionality reduction techniques is also evaluated. The visualization and evaluation results of t-SNE are significantly better than those of the other dimensionality reduction techniques.
The impact of organisational structure and organisational culture on the efficiency of a business
(2020)
The fear of losing flexibility and effectiveness due to an increased organisational structure induced by company growth is causing SMEs to defer structural changes. The purpose of this work is to examine whether the structural and cultural demands of employees match the structure and predominant culture within such a medium-sized company. As part of this, a survey was conducted to evaluate the current status and to suggest where and how changes would make sense to regain or even improve organisational efficiency.
The subject of the following paper is the analysis of global company motives for taking on sport sponsorships as a corporate social responsibility (CSR) initiative. This work is compilatory in nature because it is derived from literature released by experts as well as real-life case studies. The expert literature provides a basis of theories and models regarding the fundamental motives for CSR and sport sponsoring and visualizes them by means of statistics and real-life case studies. This paper aims to inform individuals, leaders and specifically global organizations about the benefits that taking on a sport sponsorship may have for fulfilling a company’s CSR objectives.
This paper looks at current projects in the field of Blockchain in education, their specific areas of application, and their possible advantages and weaknesses. Three examples developed by the team of authors are introduced in detail. First: Gallery-Defender, a serious game, which was adapted to serve as a demonstrator in a stand-alone version to show the possibility of carrying out exams directly from within the game and storing the grades and metadata on Blockchain. Second: Art-Quiz, an e-learning tool, which can be integrated into existing LMS systems and maps exam results and further data using Blockchain technologies. Both were developed following an iterative design process. And third: the results of a focus group, which simulated the assignment of grades after an oral online exam. The three examples presented here are based on the Blockchain system Ardor/Childchain Ignis, but each demonstrator has a different set of features and approaches.
In addition, the integration of various Blockchain solutions was conceptually designed to make a Multi-Chain model possible.
With the increasing usage of blockchain technology, legal challenges such as GDPR compliance arise. Especially the right to erasure is considered challenging, as blockchains are tamper-proof by design. Several approaches have investigated
possibilities to weaken the tamper-proof aspect of blockchains in favor of GDPR compliance. This paper presents several such approaches and then focuses on chameleon hash functions, evaluating the possibility of using these specific functions in a private blockchain. The goal of the built system is to take a step towards the digitization of the bill of lading used in international trade. This paper describes the developed software as well as the core considerations around the system, such as network design and block structure.
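A chameleon hash can be sketched with toy discrete-log parameters: anyone can evaluate the hash, but the trapdoor holder can compute collisions, which is exactly the property used to make individual blocks editable without breaking the chain of hashes. The parameters below are deliberately tiny and insecure, for illustration only.

```python
# Toy discrete-log chameleon hash (deliberately tiny, insecure parameters).
p, q, g = 467, 233, 4      # q is prime, q divides p - 1, g generates the order-q subgroup
x = 57                     # trapdoor (secret key)
h = pow(g, x, p)           # public key

def chameleon_hash(m, r):
    """Anyone can evaluate CH(m, r) = g^m * h^r mod p."""
    return (pow(g, m % q, p) * pow(h, r % q, p)) % p

def find_collision(m, r, m_new):
    """Only the trapdoor holder can pick r' so that CH(m_new, r') == CH(m, r)."""
    return (r + (m - m_new) * pow(x, -1, q)) % q

m, r = 42, 99
m_new = 123                            # "edited" block content
r_new = find_collision(m, r, m_new)    # the hash stays unchanged despite the edit
```

The collision follows from m + x·r ≡ m_new + x·r_new (mod q); without the trapdoor x, finding such an r_new is as hard as the discrete logarithm problem.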
Both cryptocurrency researchers and early adopters of cryptocurrencies agree that they possess a special kind of materiality, based on the laborious productive process of digital ‘mining’ [1]. This idea first appears in the Bitcoin White Paper [2] that encourages Bitcoin adopters to construct and justify its value in metaphoric comparison to gold mining. In
this paper, I explore three material aspects of blockchain: physical infrastructure, human language and computer code. I apply the concept of 'continuous materiality' [3] to show how these three aspects interact in practical implementations of blockchain such as Bitcoin and Ethereum. I start from the concept of ‘digital metallism’ that stands for ‘fundamental value’ of cryptocurrencies, and end with the move of Ethereum to ‘proof-of-stake’, partially as a countermeasure against ‘evil miners’. I conclude that ignoring material aspects of blockchain technology can only further problematize complicated relations between their technical, semiotic and social materiality.
After creating a new blockchain transaction, the next step usually is to make miners aware of it by having it propagated through the blockchain’s peer-to-peer network. We study an unintended alternative to peer-to-peer propagation: Exclusive mining. Exclusive mining is a type of collusion between a transaction initiator and a single miner (or mining pool). The initiator sends transactions through a private channel directly to the miner instead of propagating them through the peer-to-peer network. Other blockchain users only become aware of these transactions once they have been included in a block by the miner. We identify three possible motivations for engaging in exclusive mining: (i) reducing transaction cost volatility (“confirmation as a service”), (ii) hiding unconfirmed transactions from the network to prevent frontrunning and (iii) camouflaging wealth transfers as transaction costs to evade taxes or launder money. We further outline why exclusive mining is difficult to prevent and introduce metrics which can be used to identify mining pools engaging in exclusive mining activity.
The set of transactions that occurs on the public ledger of an Ethereum network in a specific time frame can be represented as a directed graph, with vertices representing addresses and an edge indicating the interaction between two addresses.
While there exists preliminary research on analyzing an Ethereum network by the means of graph analysis, most existing work is focused on either the public Ethereum Mainnet or on analyzing the different semantic transaction layers using
static graph analysis in order to carve out the different network properties (such as interconnectivity, degrees of centrality, etc.) needed to characterize a blockchain network. By analyzing the consortium-run bloxberg Proof-of-Authority (PoA) Ethereum network, we show that we can identify suspicious and potentially malicious behaviour of network participants by employing statistical graph analysis. We thereby show that it is possible to identify the potentially malicious
exploitation of an unmetered and weakly secured blockchain network resource. In addition, we show that Temporal Network Analysis is a promising technique to identify the occurrence of anomalies in a PoA Ethereum network.
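Degree-based screening of a transaction graph, one of the simplest statistical graph analyses, can be sketched as follows; the edge list and the three-sigma threshold are illustrative assumptions, not the bloxberg dataset or the paper's actual metrics.

```python
from collections import Counter
from statistics import mean, pstdev

# Toy transaction edge list (sender, receiver); one address floods the network.
edges = [("a", "b"), ("b", "c"), ("c", "a"), ("d", "a")]
edges += [("spam", f"addr{i}") for i in range(50)]

vertices = {v for edge in edges for v in edge}
out_degree = Counter(src for src, _ in edges)
degrees = [out_degree.get(v, 0) for v in vertices]

# Flag addresses whose out-degree exceeds the mean by more than three standard deviations.
mu, sigma = mean(degrees), pstdev(degrees)
suspicious = [v for v, d in out_degree.items() if d > mu + 3 * sigma]
```

Temporal network analysis extends this idea by computing such statistics per time window, so a sudden deviation from an address's historical degree profile also becomes visible.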
In this work, a second version of the Python implementation of an algorithm called Probabilistic Regulation of Metabolism (PROM) was created and applied to the metabolic model iSynCJ816 of the organism Synechocystis sp. PCC 6803. A cross-validation was performed to determine the minimal amount of expression data needed to produce meaningful results with the PROM algorithm. The failed reproduction of the results of a method called Integrated and Deduced Regulation of Metabolism (IDREAM) is documented, and causes for the failed reproduction are discussed.
In an era of global climate change and fast-growing cities, local governments are in urgent need of sustainable urban growth concepts to secure a liveable and prosperous urban future. Against this background, the smart city notion has progressively gained popularity as an urban development concept which relies heavily on technology and urban data to foster sustainable urban growth. However, so far, the understanding of the smart city term is ambiguous, and little scientific research has been done on developing comprehensive conceptual frameworks to support local governments in the making of smarter cities. This paper aims at presenting the current state of the art of smart city research in order to support the making of smart city best practices and to promote a comprehensive understanding of the smart city notion. In doing so, the role of technology in the making of smarter cities and critical success factors in transforming cities are elaborated, following the methodological approach of a multidimensional conceptual framework. The research findings and an expert interview with a representative of the state capital then serve for the assessment of weak points and best practices in the smart city pursuit of the German city of Munich, providing urban policymaking with valuable insights and fostering the development of a comprehensive smart city conceptualization.
Mathematics Behind Zcash
(2020)
Among the cryptocurrencies newly developed since Bitcoin, Zcash stands out as the strongest cryptocurrency, providing both transparency and anonymity to transactions and their users by deploying the strong mathematics of zk-SNARKs.
We discuss zero-knowledge proofs, a basic building block providing the functionality of zk-SNARKs. This covers Schnorr and sigma protocols in interactive and non-interactive versions. Non-interactive proofs are further used in Zcash transactions, where the validity of a sent transaction is established by a cryptographic proof.
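The interactive Schnorr protocol mentioned above can be sketched with toy parameters (deliberately insecure; real deployments use large primes or elliptic curve groups):

```python
import random

# Toy Schnorr identification protocol (deliberately tiny, insecure parameters).
p, q, g = 467, 233, 4              # g generates the order-q subgroup of Z_p*
random.seed(1)
x = random.randrange(1, q)         # prover's secret
h = pow(g, x, p)                   # prover's public key

# 1. Commit: the prover picks a nonce r and sends t = g^r.
r = random.randrange(1, q)
t = pow(g, r, p)

# 2. Challenge: the verifier sends a random c.
c = random.randrange(1, q)

# 3. Response: the prover sends s = r + c*x mod q.
s = (r + c * x) % q

# Verification: g^s must equal t * h^c mod p.
valid = pow(g, s, p) == (t * pow(h, c, p)) % p
```

The check works because g^s = g^(r + c·x) = g^r · (g^x)^c = t · h^c; replacing the verifier's random challenge with a hash of the commitment yields the non-interactive variant used in practice.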
Further, we deploy zk-SNARK proofs using a common reference string as a public parameter when a transaction is made. The proof allows the sender to prove that she knows a secret for an instance such that the proof is succinct, can be verified very efficiently, and does not leak the secret. Non-malleability, small proofs and very efficient verification make zk-SNARKs a classic tool in Zcash. Since we deal with NP problems, we have considered elliptic curve cryptography to provide the same security as RSA but with a smaller parameter size.
Lastly, we explain the Zcash transaction process after minting a coin; the corresponding transaction completely hides the sender, receiver, and amount of the transaction using a zero-knowledge proof.
As future considerations, we discuss possible improvements in decentralization and efficiency by comparison with the top-ranked cryptocurrencies Ethereum and Monero, privacy preservation against the threat of quantum computers, and enhancements to shielded transactions.
Decentralizing Smart Energy Markets - tamper-proof documentation of flexibility market processes
(2020)
The evolving granularity and structural decentralization of the energy system lead to a need for new tools for the efficient operation of electricity grids. Local Flexibility Markets (or "Smart Markets") provide platform concepts for market-based congestion management. In this context, there is a distinct need for a secure, reliable and tamper-resistant market design, which requires transparent and independent monitoring of platform operation. In the following paper, different concepts for blockchain-based documentation of relevant processes on the proposed market platform are described. On this basis, potential technical realizations are discussed. Finally, the implementation of one setup using Merkle tree operations with open source libraries is presented.
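One possible Merkle-tree documentation step can be sketched as follows: hash the logged market events into a single root that can be anchored on chain, so that any later tampering with an event changes the root. The event strings are hypothetical and not taken from the paper's platform.

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Pairwise-hash a list of byte strings up to a single root (odd tails are duplicated)."""
    level = [sha256(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [sha256(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

# Hypothetical market-platform events to be documented on chain.
events = [b"bid:42kW@10ct", b"ask:17kW@9ct", b"match:12kW"]
root = merkle_root(events)

# Tampering with any single documented event changes the anchored root.
tampered_root = merkle_root([b"bid:43kW@10ct", b"ask:17kW@9ct", b"match:12kW"])
```

Only the 32-byte root needs to be written to the blockchain; the full event log can stay off chain and still be audited against it.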
Procurement processes are deemed to lack supporting digital technologies that raise efficiency and automation.
Blockchain solutions are piloted in procurement in order to offer a decentralized IT infrastructure covering these needs. This paper aims at identifying current blockchain approaches in the field of procurement and presenting affected business processes. In order to get an overview of the current state of the art, a systematic literature mapping is conducted.
Moreover, the outcomes are gathered and categorized in a classification scheme. Based on the analysis, systematic maps are presented to showcase relevant findings. Within the findings, several blockchain use cases in the field of procurement are identified, and information about addressed challenges, utilized blockchain frameworks and affected business processes is extracted.
The financial world of blockchains is mostly dominated by Bitcoin, which accounts for about 210 billion dollars in market capitalization. Despite the great security and independence the technology offers its users, it is not easy to adapt it to upcoming applications due to its restrictive underlying infrastructure. For small-scale transactions, everyday applications, or access to the variety of crypto technologies and projects, Bitcoin is relatively limited in future development. Most of those applications instead support currencies from more development-driven blockchains like Ethereum, which want to reach the user base already holding Bitcoin and offer them a seamless transition to new applications without the risk of losing their funds. In this article, atomic swaps and tokenization are covered and current approaches compared. Both mechanisms are used to achieve this symbiosis between Bitcoin and Ethereum.
To give a more practical view, an example of how to implement such tokenization within an app is shown. This provides deeper insights and offers inspiration for digital identity-based app development.
To enable smart devices of the Internet of Things to be connected to a blockchain, a blockchain client needs to run on this hardware. With the Trustless Incentivized Remote Node Network, in short Incubed, it will be possible to establish a decentralized and secure network of remote nodes, which enables trustworthy and fast access to a blockchain for a large number of low-performance IoT devices. Currently, Incubed supports the verification of Ethereum data. To serve a wider audience and more applications, this paper proposes the verification of Bitcoin data as well, which can be achieved thanks to the modularity of Incubed. This paper describes the proof data that is necessary for a client to prove the correctness of a node’s response, as well as the process of verifying the response using this proof data. A proof object which contains the proof data will be part of every response in addition to the actual result. We design, implement and evaluate Bitcoin verification for Incubed: the creation of the proof data for supported methods (on the server side) and the verification process using this proof data (on the client side) have been demonstrated. This enables the verification of Bitcoin data in Incubed.
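The client-side verification idea can be illustrated with a Bitcoin-style Merkle inclusion proof: the client recomputes the block's Merkle root from a transaction id and the branch of sibling hashes supplied in the proof data. This is a generic SPV-style sketch, not Incubed's actual proof-object format.

```python
import hashlib

def dsha(data: bytes) -> bytes:
    """Bitcoin-style double SHA-256."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def verify_inclusion(txid, root, branch, index):
    """Recompute the Merkle root from a txid and its branch of sibling hashes."""
    node = txid
    for sibling in branch:
        node = dsha(node + sibling) if index % 2 == 0 else dsha(sibling + node)
        index //= 2
    return node == root

# Hypothetical four-transaction block.
txids = [dsha(bytes([i])) for i in range(4)]
left = dsha(txids[0] + txids[1])
right = dsha(txids[2] + txids[3])
block_root = dsha(left + right)

# Proof for transaction 2: its sibling txid 3, then the hash of the left subtree.
ok = verify_inclusion(txids[2], block_root, [txids[3], left], index=2)
```

A low-performance device thus only needs the block header and a logarithmic number of hashes to check that a transaction really is part of a block, without downloading the block itself.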
Mathematics behind Zcash
(2020)
Among the newly developed cryptocurrencies, Zcash stands out as the strongest cryptocurrency, providing both transparency and anonymity to transactions and their users by deploying the strong mathematics of zk-SNARKs. We discussed zero-knowledge proofs as a building block providing the functionality of zk-SNARKs. This covers the Schnorr protocol, which is further used in Zcash transactions, where the validity of a sent transaction is established by a cryptographic proof. Further, we deploy zk-SNARKs using a common reference string that allows the sender to prove that she knows a secret such that the proof is succinct, can be verified, and does not leak the secret. Non-malleability, small proofs and effective verification make zk-SNARKs a classic tool in Zcash. Since we deal with NP problems, we have considered elliptic curve cryptography to provide the security. Lastly, we explain the Zcash transaction; the corresponding transaction completely hides the sender, receiver and amount of the transaction using a zero-knowledge proof.
This paper analyses the status quo of large-scale decision making combined with the possibility of blockchain as an underlying decentralized architecture to govern common pool resources in a collective manner, and evaluates them according to their requirements and features (technical and non-technical). Due to an increasing trend in the distribution of knowledge and an increasing amount of information, the combination of these decentralized technologies and approaches can not only be beneficial for consortial governance using blockchain but can also help communities to govern common goods and resources. Blockchain and its trust-enhancing properties can potentially be a catalyst for more collaborative behavior among participants and may lead to new insights about collective action and CPRs.
Glycans play an important role in the intercellular interactions of pathogenic bacteria. Pathogenic bacteria possess binding proteins capable of recognizing certain sugar motifs on other cells, which are found in glycan structures. Artificial carbohydrate synthesis allows scientists to recreate those sugar motifs in a rational, precise, and pure form. However, due to the high specificity of sugar-binding proteins, known as lectins, to glycan structures, methods for identifying suitable binding agents need to be developed. To overcome this hurdle, the Fraunhofer Institute for Cell Therapy and Immunology (Fraunhofer IZI) and the Max-Planck Institute of Colloids and Interfaces (MPIKG) developed a binding assay for the high-throughput testing of sugar motifs that are presented on modular scaffolds formed by the assembly of four DNA strands into simple, branched DNA nanostructures. The first generation of this assay was used in combination with bacteria that express a fluorescent protein as a proof of concept. Here, the assay was optimized to be used with bacteria not possessing a marker gene for a fluorescent protein by staining their genomic DNA with SYBR® Green. For the binding assay, DNA nanostructures were combined with artificially synthesized mannose polymers, typical targets for many lectins on the surface of bacteria, presenting them in a defined constellation to bind bacteria strongly due to multivalent cooperativity. The testing of multiple mannose polymers identified monomeric mannose with a 5’-carbon linker and 1,2-linked dimeric mannose with linker as the best binding candidates for E. coli, presumably due to binding with the FimH protein on the surface. Despite similarities between the FimH proteins of E. coli and K. pneumoniae, binding was only observed between E. coli and the different sugar molecules on DNA structures.
Furthermore, the degree of free movement seemed to affect the binding of mannose polymers to the targeted proteins, since an increase in binding could be observed when utilizing a more flexible DNA nanostructure. An alternative to the simple DNA nanostructures described above is the use of larger, more complex DNA origami structures consisting of several hundred strands. DNA origami structures are capable of carrying dozens of modifications at the same time. The results for the DNA origami structure showed a successful functionalization with up to 71 1,2-linked dimeric mannose with linker molecules. These results point towards a solution for the high-throughput analysis of potential binding agents for pathogenic bacteria, e.g. as an alternative treatment for antibiotic-resistant bacteria.
The emerging Internet of Things (IoT) technology interconnects billions of embedded devices with each other. These embedded devices are internet-enabled and collect, share, and analyze data without any human intervention. The integration of IoT technology into the human environment, such as industry, agriculture, and the health sector, is expected to improve the way of life and business. The emerging technology also poses challenges and numerous security threats. On these grounds, it is imperative to strengthen the security of IoT technology to avoid any compromise affecting human life. Instead of implementing traditional cryptosystems on IoT devices, an elliptic curve cryptosystem (ECC) is used to meet the limited resources of the devices. ECC is an elliptic curve-based public-key cryptography which provides equivalent security with a shorter key size compared to other cryptosystems such as Rivest–Shamir–Adleman (RSA). The security of ECC hinges on the hardness of solving the elliptic curve discrete logarithm problem (ECDLP). ECC is faster and easier to implement and also consumes less power and bandwidth. Due to these benefits, ECC is incorporated in internationally recognized standards for lightweight applications.
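The group operation underlying ECC can be sketched on a small textbook curve (y² = x³ + 2x + 2 over F₁₇, insecure and for illustration only): point addition plus double-and-add scalar multiplication is all that a key-pair computation needs.

```python
# Toy elliptic curve y^2 = x^3 + 2x + 2 over F_17 (textbook example; real ECC
# uses standardized curves over fields of roughly 256 bits).
P_MOD, A = 17, 2

def ec_add(p1, p2):
    """Add two curve points; None represents the point at infinity."""
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None                                  # inverse points cancel out
    if p1 == p2:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD)   # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P_MOD)          # chord slope
    x3 = (lam * lam - x1 - x2) % P_MOD
    return (x3, (lam * (x1 - x3) - y1) % P_MOD)

def ec_mul(k, point):
    """Double-and-add scalar multiplication k * point."""
    result = None
    while k:
        if k & 1:
            result = ec_add(result, point)
        point = ec_add(point, point)
        k >>= 1
    return result

G = (5, 1)                   # generator of a cyclic group of order 19 on this curve
secret = 7
public = ec_mul(secret, G)   # recovering `secret` from `public` is the ECDLP
```

Computing `public` takes a handful of group operations even for 256-bit scalars, while inverting the map is believed to require exponential effort, which is the asymmetry ECC builds on.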
This paper examines the communication channels used by innovation projects at the ProtoSpace Hamburg when engaging with stakeholders, and tries to answer the thesis question of whether new media channels improve the chances of success for innovation projects when used for this communication. Expert interviews with eight experts in communication, innovation and stakeholder management were conducted and then analyzed through the application of Mayring's qualitative content analysis in order to answer the posed question.
Footage of organoids taken by means of fluorescence microscopy and segmented as well as triangulated by image analysis software such as LimeSeg and Mastodon often needs to be visualized in an aesthetic manner for the presentation of results in scientific papers, talks and demonstrations. The goal of this work was to create a simple-to-use add-on "Biobox" for the open source 3D visualization package "Blender", which allows importing triangulated 3D data with animation over time (4D), produced by image analysis software, and optimizing it for efficient usage. "Biobox" offers biologists several visualization tools for the creation of rendered images and animation videos.
The optimization of imported data was performed using Blender's internal modifiers. The optimized data can then be visualized using several tools built for visualizing the organoid in frozen, animated and semi-transparent manners. A dynamic link for object selection and dynamic data exchange between Blender and Mastodon was developed. Additionally, a user interface was developed for the manual correction of segmentation errors and for steering the object detection algorithms of LimeSeg. The benchmark of the developed add-on "Biobox" was performed on real scientific data. The benchmark test demonstrated that the developed optimizations result in a significant (~5-fold) decrease in RAM usage and accelerate visualization by a factor of more than 160.
Robust soft learning vector quantization (RSLVQ) is a probabilistic variant of the learning vector quantization (LVQ) algorithm. RSLVQ describes the data by a Gaussian mixture model, and its cost function is defined in terms of a likelihood ratio. Our thesis work modifies standard RSLVQ with non-Gaussian density functions such as the logistic, lognormal, and Cauchy densities (referred to as PLVQ). In this approach, we derive new update rules for the prototypes from the gradient of the cost function with respect to the non-Gaussian density functions. We also derive new learning rules for the parameters of these densities by differentiating the cost function with respect to them. The main goal of the thesis is to compare the performance of the PLVQ models with the Gaussian RSLVQ model. Therefore, the performance of these classification models has been tested on the Iris and Seeds datasets. To visualize the results of the classification models in an adequate way, principal component analysis (PCA) has been used.
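The attraction-repulsion structure of an RSLVQ-style prototype update, with the density left as a pluggable function, can be sketched as follows. This is a hypothetical one-dimensional simplification for illustration, not the thesis's implementation; real RSLVQ works on vectors in R^d.

```python
import math

# Densities as interchangeable functions of the squared distance d2
# (sigma is a hypothetical fixed width parameter).
def gaussian(d2, sigma=1.0):
    return math.exp(-d2 / (2 * sigma ** 2))

def cauchy(d2, sigma=1.0):
    return 1.0 / (1.0 + d2 / sigma ** 2)

def rslvq_step(x, label, prototypes, density, lr=0.1):
    # prototypes: list of (position w, class c); returns updated list.
    f = [density((x - w) ** 2) for w, _ in prototypes]
    total = sum(f)
    correct = sum(fj for fj, (_, c) in zip(f, prototypes) if c == label)
    updated = []
    for fj, (w, c) in zip(f, prototypes):
        p_all = fj / total                 # P(j | x) over all prototypes
        if c == label:
            p_y = fj / correct             # P_y(j | x) over the correct class
            w = w + lr * (p_y - p_all) * (x - w)   # attract toward x
        else:
            w = w - lr * p_all * (x - w)           # repel from x
        updated.append((w, c))
    return updated
```

Swapping `gaussian` for `cauchy` (or any other density) changes only the soft assignment probabilities, which is the essence of the PLVQ idea described above.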
The number of Internet of Things (IoT) devices is increasing rapidly. The Trustless Incentivized Remote Node Network, in short IN3 (Incubed), enables trustworthy and fast access to a blockchain for a large number of low-performance IoT devices. Although IN3 currently only supports the verification of Ethereum data, it is not limited to one blockchain thanks to its modular design. This thesis describes the fundamentals, the concept, and the implementation of Bitcoin verification in IN3.
In this thesis, two novel methods for removing undesired background illumination are developed: a wavelet-analysis-based approach and an enhancement of a deep learning method. These methods were compared with conventional methods using real confocal microscopy images and synthetically generated microscopy images. The synthetic images were created with a generator introduced in this thesis.
Gold cyanidation is a process by which gold is extracted from low-grade ore. Due to its efficiency, it has found widespread application around the world, including in Peru. The process requires free cyanide in high concentration. After the gold extraction is completed, free cyanide as well as metal cyanide complexes remain in the effluent of gold mines and refineries. Often these effluents are kept in storage ponds, where they pose considerable risk to health and environment. Thus, it is preferable to degrade the cyanide to minimize the risk of exposure. In the context of this thesis, cyanide degradation was explored in a UV-light-based prototype. Degradation with a combination of hydrogen peroxide and UV light proved to be very effective at degrading cyanide concentrations of 100 mg/L and 1000 mg/L. Furthermore, the presence of ammonia as a degradation product could also be confirmed. Membrane distillation may provide an alternative to cyanide destruction in the form of cyanide recovery. Promising results were gathered from several membrane experiments.
A classical topic in the theory of random graphs is the probability that a given random graph contains at least one isolated vertex. Isolated nodes have a considerable impact on social networks, which can be modeled as random graphs. We present a distribution of the number of isolated vertices using the probability generating function. We discuss the relationship between isolated edges and extended cut polynomials as well as extended matching polynomials using the principle of inclusion-exclusion. We introduce an algorithm based on colored graphs for general graphs and apply it to the components of a graph as well. Finally, we implement the idea on special classes of graphs such as cycles, bipartite graphs, paths, and others, and discuss a recursive procedure based on analogous coloring rules for ladder and fan graphs.
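The classical identity behind such results can be checked directly. The following sketch (standard inclusion-exclusion over the events "vertex i is isolated", not the thesis's own code) computes the probability that the Erdős–Rényi graph G(n, p) has at least one isolated vertex, and verifies it by brute-force enumeration of all labeled graphs on n vertices:

```python
from itertools import combinations
from math import comb

def p_isolated_ie(n, p):
    # Inclusion-exclusion: a fixed set of k vertices is entirely isolated
    # iff all C(k,2) internal and k*(n-k) outgoing edges are absent.
    q = 1 - p
    return sum((-1) ** (k + 1) * comb(n, k) * q ** (comb(k, 2) + k * (n - k))
               for k in range(1, n + 1))

def p_isolated_bruteforce(n, p):
    # Enumerate every labeled graph, weighting it by its probability.
    edges = list(combinations(range(n), 2))
    total = 0.0
    for mask in range(1 << len(edges)):
        deg = [0] * n
        m = 0
        for i, (u, v) in enumerate(edges):
            if mask >> i & 1:
                deg[u] += 1; deg[v] += 1; m += 1
        if min(deg) == 0:  # graph contains an isolated vertex
            total += p ** m * (1 - p) ** (len(edges) - m)
    return total
```

The brute-force check is only feasible for small n (2^C(n,2) graphs), which is exactly why closed-form tools such as generating functions and inclusion-exclusion matter.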
Introducing natural adversarial observations to a Deep Reinforcement Learning agent for Atari Games
(2021)
Deep learning methods are known to be vulnerable to adversarial attacks. Since deep reinforcement learning agents are based on these methods, they are susceptible to tiny changes in their input data. Three methods for adversarial example generation are introduced and applied to agents trained to play Atari games. The attacks either target single inputs or can be applied universally to all possible inputs of the agents. They were able to successfully shift the predictions towards a single action or to lower the agent's confidence in certain actions, respectively. All proposed methods had a severe impact on the agents' performance while producing invisible adversarial perturbations. Since natural-looking adversarial observations should be completely hidden from a human evaluator, the negative impact on the performance of the agents should additionally be undetectable. Several variants of the proposed methods were tested to fulfil all posed criteria. Overall, seven generated observations for two of the three Atari games are classified as natural-looking adversarial observations.
We investigate the folding and thermodynamic stability of a tertiary contact of baker's yeast ribosomal ribonucleic acid (rRNA), which is supposed to be essential for the maturation process of ribosomes in eukaryotes at lower temperatures [1]. Ribosomes are cellular machines essential for all living organisms. RNA is at the center of these machines and responsible for the translation of genetic information into proteins [2,3]. Only recently, the rRNA tertiary contact of interest was discovered in Zurich by the research group of Vikram Govind Panse. Gerhardy et al. [1] showed in vitro that, within the 60S pre-ribosome under defined metal ion concentrations, the tertiary contact becomes visible between a GAAA tetraloop and a kissing-loop motif. Our aim is now to understand this RNA structure, especially the formation of the rRNA tertiary contact, in terms of thermodynamics and kinetics at various experimental conditions, such as temperature and the metal ion concentration of K(I), Na(I), and Mg(II). To this end, we use optical spectroscopy such as UV/VIS spectroscopy and ensemble Förster (fluorescence) resonance energy transfer (FRET) folding studies. Our findings will help to further characterize this newly discovered ribosomal RNA contact and to elucidate its function within the ribosomal maturation process.
Several algorithms have been proposed for testing series-parallel graphs in linear time. We give alternative algorithms for testing series-parallel graphs, computing their tree decompositions, and determining the independence number when the input is an undirected biconnected series-parallel graph; these algorithms run in (approximately) linear time.
VQ-VAE is a successful generative model which can perform lossy compression. It combines deep learning with vector quantization to achieve a discrete compressed representation of the data. We explore using different vector quantization techniques with VQ-VAE, mainly neural gas and fuzzy c-means. Moreover, VQ-VAE contains a non-differentiable discrete mapping, which we explore, and we propose changes to the original VQ-VAE loss to fit the alternative vector quantization techniques.
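The non-differentiable discrete mapping mentioned above is a nearest-neighbour lookup in a learned codebook. A minimal sketch of that quantization step (assumed list-of-vectors shapes; a real VQ-VAE adds codebook and commitment losses and a straight-through gradient around this argmin):

```python
def quantize(z, codebook):
    # z: list of d-dimensional encoder outputs,
    # codebook: list of K d-dimensional codebook vectors.
    # Each vector is snapped to its nearest codebook entry.
    def sqdist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    out = []
    for v in z:
        idx = min(range(len(codebook)), key=lambda k: sqdist(v, codebook[k]))
        out.append((idx, codebook[idx]))  # (discrete code, quantized vector)
    return out
```

Replacing this hard argmin, e.g. with neural gas ranking or fuzzy c-means memberships, is exactly where alternative vector quantization techniques plug into the model.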
Cryptocurrencies are characterized by high volatility, both in the short and the long term. Experienced traders exploit this by swing trading to profit from price fluctuations. However, this requires closely observing and analyzing the prices and taking trading positions at the right time. Only a few specialists who devote time to this, or optimized trading bots, are able to actually make profits continuously. The autradix protocol is a self-optimizing and self-learning parametric trading algorithm that analyzes price action in real time and adaptively optimizes the algorithm's parameters to realize the user's investment objective. Embedded in an adaptive genetic algorithm, possible parameterizations are simulated and the optimal ones for the investigated trading pairs are calculated. The generic trading protocol API enables coupling with various crypto exchanges and decentralized protocols. A smart-contract-based decentralized, trustless, and tokenized fund, controlled by a DAO, enables users to invest, operate trading agents, and participate in the profits generated according to their share.
With the advances in cryptography and emerging internet technology, electronic voting is gaining popularity since it ensures ballot secrecy, voter security, and integrity. Many commercial startups and e-voting systems have been proposed, but due to lack of trust, privacy, and transparency, as well as hacking issues, many solutions have been suspended. Blockchain, along with cryptographic primitives, has emerged as a promising solution due to its transparent, immutable, and decentralized nature. In this paper, we summarize the properties that existing solutions should satisfy and explain some cryptographic primitives, such as zero-knowledge proofs (ZKPs) and ring signatures, along with their security limitations. We give a comprehensive review of some blockchain-based e-voting systems and discuss their strengths and weaknesses based on the given properties, including a comparison table.
Blockchain technology has become an innovative, mature tool for digital transformation, disrupting more and more application areas in their business processes, values, or even economic models. This paper leverages more than 30 academic publications on prototypes and their blockchain-based use cases for transacting certificates in the context of public education. The conceptual design and guiding ideas are reflected in the practical application development for the Federal Ministry of Education and Research ECHT! project within the showcase region WIR! in Mittweida and are used for the research design. In this approach, we applied agile methods and the current certificate process to propose a comprehensive disclosure of a new software prototype, including a three-layered architecture with multi-stakeholder components. The artefact instantiation contributes to the practical knowledge base within information systems research, specifically in digital certificate processes ranging from creation, searching, and proofing up to revoking, taking into account an existing IT landscape as well as the organizational hierarchy.
Over the last two decades, rapid advances in digitization methods have put us on the cusp of the fourth industrial era. It is an era of connectivity and interactivity between various industrial processes that needs a new, trusted environment for exchanging and sharing information and data without relying on third parties. Blockchain technologies can provide such a trusted environment. This paper focuses on utilizing the blockchain and its characteristics to build machine-to-machine (M2M) communication and digital twin solutions. We propose a conceptual design for a system that uses smart contracts to construct digital twins for machines and products and executes manufacturing processes inside the blockchain. Our solution also employs the Decentralized Identifiers (DID) standard to provide self-sovereign digital identities for machines and products. To validate the approach and demonstrate its applicability, the paper presents an actual implementation of the proposed design in a simulated case study carried out with the help of a Fischertechnik factory model.
Bitcoin's energy consumption and social costs in relation to its capacity as a settlement layer
(2021)
Bitcoin runs on energy. The decentralized network's energy consumption has resulted in multifaceted discussions about its efficiency and environmental impact. To put Bitcoin's energy consumption into perspective, we propose to relate (a) the energy consumption in TWh and (b) the resulting social costs in the form of carbon emissions to the Dollar value settled on the Bitcoin network. Both metrics allow us to relate and quantify the capacity of Bitcoin as a settlement layer against the network's energy consumption and the resulting carbon emissions, or social costs. We find that in early 2021 Bitcoin (a) settles between $2,333 and $7,555 for each Dollar spent on energy and (b) that, on average, a Dollar settled on the Bitcoin blockchain causes social costs of between 0.007% and 0.01%, depending on the estimated energy consumption converted into the costs of carbon emissions. These results help to assess the efficiency, cost, and sustainability of Bitcoin and may allow a comparison of Bitcoin with existing settlement base layers such as Fedwire or gold.
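The first metric is a simple ratio of settled value to energy expenditure. A sketch of the computation with hypothetical placeholder figures (not the paper's actual data) looks like this:

```python
def dollars_settled_per_energy_dollar(settled_usd, energy_twh, usd_per_mwh):
    # Convert annual energy use to a dollar cost, then relate it to
    # the dollar value settled on-chain over the same period.
    energy_cost_usd = energy_twh * 1e6 * usd_per_mwh  # 1 TWh = 1e6 MWh
    return settled_usd / energy_cost_usd

# Hypothetical example: $3 trillion settled per year, 120 TWh/year,
# electricity at $50/MWh (illustrative numbers only).
ratio = dollars_settled_per_energy_dollar(3e12, 120, 50)  # → 500.0
```

The social-cost metric follows the same pattern, with the energy figure converted into carbon emissions and priced at an assumed cost of carbon instead of an electricity price.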
Blockchain and other distributed ledger technologies are evolving into enabling infrastructures for innovative ICT solutions. Numerous features, such as decentralization, programmability, and immutability of data, have led to a multitude of use cases ranging from cryptocurrencies and tracking and tracing to automated business protocols and decentralized autonomous systems. For organizations that seek blockchain adoption, the overwhelming spectrum of potential application areas requires guidance that reduces complexity and supports the development of blockchain-based concepts. This paper introduces a classification approach to provide design and implementation guidance that goes beyond current textbook classifications. As an outcome, a typology for management and business architects is developed, before the paper concludes with an instantiation of existing use cases and a discussion of their classes.
Mapping identities, digital assets, and people's profiles on the internet has lately been gaining much traction in the blockchain cosmos. The new technology is currently forming architectures that will further pave new ways toward fundamental mechanisms for interacting in a decentralized, user-centered manner. These schemes are often declared the next generation of the web. The article shows how the internet has evolved in managing identities, what problems arose, and how new data architectures help build applications on top of privacy rights. Both technological and ethical perspectives are considered in order to answer which guidelines should be followed for the upcoming branch of decentralized services and what we can learn from historical schemes regarding their privacy, accounting, and user data.
Global challenges like climate change, food security, and infectious diseases such as the COVID-19 pandemic are nearly impossible to tackle when established experts and upstart innovators work in silos. If research organizations, governments, universities, NGOs, and the private sector could collaborate on these challenges more easily, lasting solutions would certainly come more quickly. Aligned with the United Nations' Sustainable Development Goals, SAIRA connects key players in different arenas: scientists and engineers at research and technology organizations (RTOs) looking to collaborate on sustainable development projects, companies seeking R&D support to tackle their most challenging problems, and startups with innovative ideas and a desire to scale. The platform is a blockchain-secured open innovation platform, anchored on the Max Planck Digital Library's blockchain network bloxberg, that assures the authenticity and integrity of all user-generated content and collaboration processes.
While blockchain technology is still at an early stage of its development, it is already of surging economic importance.
In the literature, blockchain is referred to as a disruptive, institutional, foundational, or general-purpose technology. There is still no consensus about the economic theory that should be applied when analyzing its economic effects. This article draws on use cases from the coffee supply chain to explore which theories could potentially apply to an emerging blockchain economy.
Smart ultrafast laser processing with rotating beam – Laser micro drilling, cutting and turning
(2021)
Current micro drilling, cutting, and turning processes are mainly based on EDM, milling, stamping, honing, or grinding. All these technologies use a tool with a predefined geometry that is transferred to the workpiece. In contrast, the laser is a highly flexible tool which can adapt its size very quickly by changing only a software setting. Thanks to the efforts in laser development over the last years, stable ultrafast lasers with sufficient average power and high repetition rates have become industrially available. To use as many pulses as possible, cost-efficient production demands innovative processes and machining setups with fast axis movement and special optics for beam manipulation. GFH has developed a helical drilling optics which rotates the beam at up to 30,000 rpm in a very precise circle and furthermore allows adjusting the diameter and the angle of incidence. This enables the laser to be used for high-precision drilling and cutting as well as for micro turning processes.
The shape-memory alloy Nitinol, a nickel-titanium alloy, is widely used in actuator and medical applications. However, the connection of a flange to a rod is a critical point. Laser rod end melting enables material accumulations that generate a preform at the end of a rod, followed by die forming, so that the flange can be produced. This process has been successfully applied to 1.4301 steel. This study aims to investigate laser rod end melting of shape-memory Nitinol with regard to the resulting surface quality of the preforms. The results showed that spherical preforms could be generated without visible surface discoloration due to oxidation. Using different scan rates, different solidification conditions occurred, which led to significantly different surface structures. These findings show that laser rod end melting can in principle be applied to Nitinol to generate preforms for flanges, whereby the surface quality depends on the solidification conditions.
Beam shaping and splitting with diffractive optics for high performance laser scanning systems
(2021)
Diffractive optical elements (DOEs) enable novel high performance and process-tailored scanning strategies for galvanometer-based scan heads. Here we present several such concepts integrating DOEs with laser scanners and the respective application use cases. Beam shaping DOEs providing a homogeneous fluence over a custom defined profile, such as a rectangular Top-Hat, enable increased process quality in Laser-Induced Forward Transfer (LIFT) compared to the Gaussian beam of the laser source. We show that aberrations which occur over the necessary large wafer-sized image field can be eliminated through the use of a synchronous XY-stage motion. Another application that benefits from the use of DOEs is laser drilling. Drilling in display and electronics manufacturing demands high throughput that can only be achieved through the use of beam splitting DOEs for parallel processing. To this end, the joint MULTISCAN project is developing a variable multi-beam tool capable of scanning and switching each individual beamlet for increased control.
The increase in laser processing speed is driven by the development of high-power lasers into ranges of more than 1 kW. Additionally, a proper distribution of this laser power is required to achieve high-quality processing results. In the case of high pulse repetition rates, a proper distribution of the pulses can be obtained by ultrafast beam deflection at speeds of several hundred m/s. A two-dimensional polygon mirror scanner has been used to distribute a nanosecond pulsed laser with up to 1 kW average power at a wavelength of 1064 nm for multi-pass laser engraving. The pulse duration of this laser can be varied between 30 ns and 240 ns, and the pulse repetition rate is set between 1 and 4 MHz. The depth information is encoded in greyscale bitmaps, which are used to modulate the laser during scanning according to the lateral position and the depth. The process allows high processing rates and thus high throughput.
Pulsed laser processing of vacuum component surfaces is a promising method for electron cloud mitigation in particle accelerators. By generating a hierarchically structured surface, the escape probability of secondary electrons is reduced. The choice of laser treatment parameters, such as laser power, scanning speed, and line distance, has an influence on the resulting surface morphology as well as on its performance. The impact of the processing parameters on the surface properties of copper is investigated by secondary electron yield (SEY) measurements, scanning electron microscopy (SEM), ablation depth measurements in an optical microscope, and particle release analysis. Independent of the laser wavelength (532 nm or 1064 nm), it was found that the surface morphology changes when the processing parameters are varied. The ablation depth increases and the SEY decreases with increasing laser fluence. The final application requires the capability to treat tens of meters of vacuum pipes. The limiting factors of this type of surface treatment for its applicability in particle accelerators are discussed.
We demonstrate a thulium-based fiber amplifier delivering pulses tunable between <120 fs and 2 ps in duration at up to 228 μJ of pulse energy, at a center wavelength of 1940 nm and a repetition rate of 500 kHz. Due to its excellent long-term stability, this system proves the ability of this technology to be integrated into ultrafast material processing machines.
The wind energy sector is undergoing digitalization processes that span multi-tier supply chains of turbine components and wind farm maintenance, amongst others. In an industrial use case that includes Siemens Gamesa Renewable Energy, Vestas, and APQP4Wind, the processes of producing, fastening, and servicing bolts in turbines are mapped to a digital model. The model follows the lifetime of turbine bolts from the manufacturing phase through fastening in turbines and maintenance until their replacement and recycling. The development of the digital model is iteratively addressed in a design science research approach, as the authors actively contribute to the project. Distributed ledgers (DLs) support the notarial documentation of the bolts and turbines, from their registration phase to the assembly, technical service verification, and recycling phases. The immutable and decentralized nature of DLs secures the data against tampering and prevents unilateral changes by engaging the service stakeholders and component providers in a blockchain consortium.
There are multiple ways to gain information about an individual and their health status, but an increasingly popular field in medicine is the analysis of human breath, which carries a lot of information about metabolic processes within the individual's body. The information in exhaled breath consists of volatile (organic) compounds (VOCs). These VOCs are products of metabolic processes within the individual's body and thus might be indicators for diseases disturbing those processes. The compounds can be detected by mass spectrometric (MS) or ion mobility spectrometric (IMS) techniques, so the analysis of these compounds is not bound to exhaled breath alone. The resulting data is spectral data, capturing the concentrations of the VOCs indirectly through intensities. About 3000 VOCs [1] have already been identified in human exhaled breath, and the number of research papers about VOC analysis and detection has risen nearly constantly over the last decade [1]. Furthermore, the technique for identifying VOCs can also be used to capture biomarkers of foreign organisms within the individual's body. Extracting VOCs from an individual can be done by non- or minimally invasive techniques. However, the manual identification of VOCs and biomarkers related to a certain disease or infection is not feasible due to the complexity of the samples and often unknown metabolic products; thus automated techniques are needed [1-4]. To establish breath analysis as a diagnostic tool, machine learning methods can be used. Machine learning has become a popular and common technique when dealing with medical data due to the rapid analysis it enables. Taking this advantage, breath analysis using machine learning could become the method of choice for diagnosis, keeping in mind that conventional methods are laboratory-based and, when trying to detect bacterial infections, sometimes need several days to identify the organism [5].
The objective of this bachelor project is the creation of a tool to support forensic investigators during IT-forensic interventions. It uses Kismet as the base program and adds functionality to it via the plugin interface. The installation of the plugin is explained, along with how the plugin works and a recommendation on how to use it. To understand the underlying basics, an introduction to WLAN and Bluetooth is given. The tests that were performed with the new plugin are described, as well as their results. It is then briefly discussed why the tool is applicable for locating Wi-Fi devices, especially access points, but not Bluetooth devices. Finally, a few ideas on how to improve the tool and what could be researched in this area are provided.
In this work, a protocol for portable nanopore sequencing of DNA from pollen collected from honey bees, bumble bees, and wild bees was developed. DNA metabarcoding is applied to identify genera within the mixed DNA samples. The DNA extraction and ITS and ITS2 PCR parameters tested for this purpose were applied to the collected pollen samples, and the amplicons were then sequenced using the Flongle flow cell adapter from Oxford Nanopore Technologies. It is shown that the main pollinator resources at the different sites can be identified in percentage proportions. The protocol generated in this study can be used for further ecological questions.
We propose a method for edge detection in images with multiplicative noise based on the Ant Colony System (ACS). To adapt the Ant Colony System algorithm to multiplicative noise, the global pheromone matrix is computed using the coefficient of variation. We carried out a performance comparison of the edge-detection Ant Colony System algorithm against several techniques; the best results were obtained with the gradient and the coefficient of variation.
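The coefficient of variation (standard deviation divided by mean) is scale-invariant, which is what makes it a suitable edge cue under multiplicative noise. A toy sketch of the per-pixel cue that such a pheromone matrix could be built from (a hypothetical 3x3-window version, not the paper's exact scheme):

```python
def coeff_variation(window):
    # std/mean of a list of pixel values; invariant to scaling the
    # whole window, hence robust to multiplicative (speckle-like) noise
    n = len(window)
    mean = sum(window) / n
    var = sum((v - mean) ** 2 for v in window) / n
    return (var ** 0.5) / mean if mean else 0.0

def cv_map(img):
    # compute the coefficient of variation over a 3x3 window at each
    # interior pixel; high values indicate likely edges
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            win = [img[y + dy][x + dx] for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            out[y][x] = coeff_variation(win)
    return out
```

In the ACS setting, such a map would serve as the heuristic information biasing where the artificial ants deposit pheromone.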
At a global level, different studies indicate that transport systems are responsible for 25% of CO2 emissions. In the context of sustainable mobility, one of the short-term challenges is the research into and improvement of alternative fuels, which should allow a fast decrease in the generation of greenhouse gases by sustainable means of transport. In this sense, green hydrogen can play a fundamental role. Green hydrogen is the basis for producing synthetic fuels, which can replace oil and its derivatives. Synthetic fuels, or e-fuels, are hydrocarbons produced from carbon dioxide (CO2) and green hydrogen (H2) as the only raw materials. H2 or e-fuels could be used in many sectors (manufacturing, residential, transportation, mining, and other industries). In this study, different applications of hydrogen are evaluated by techno-economic analysis. The main variable that affects the production of hydrogen and its derivatives is the cost of electricity. Considering Chile's renewable energy potential, it is feasible to develop green hydrogen production in Chile as an energy vector that would be technically and economically viable, together with the environmental benefits.
In this paper, we designed, implemented, and tested a special surveillance camera system based on a combination of classical image processing algorithms. The system's sub-objective consists of tracking experimental vehicles driving on defined trajectories (rails) in real time. Furthermore, it analyzes the scene to collect additional vehicle- and rail-related information. The system then uses the gathered data to reach its main objective, which consists of independently predicting vehicle collisions. Consequently, we propose a hybrid method for detecting and tracking ATLAS vehicles efficiently. To detect the vehicle at the beginning of the video, periodically every n-th frame, and in cases where the tracked vehicle has been lost, we use histogram back-projection. A kernelized correlation filter (KCF), by contrast, is used to track the detected vehicles. Combining these two methods provides one of the best trade-offs between accuracy and speed, even on a single processing core. The proposed method achieves the best performance compared with three different approaches on a custom dataset.
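Histogram back-projection, the detection cue named above, replaces every image pixel by the probability of its value under the target's histogram. A minimal grayscale sketch (a simplification for illustration; the paper's system presumably works on color histograms and couples this with the KCF tracker):

```python
def back_project(image, target_patch, bins=16):
    # Build a normalized intensity histogram of the target patch
    # (pixel values assumed in 0..255).
    hist = [0] * bins
    for row in target_patch:
        for v in row:
            hist[v * bins // 256] += 1
    total = sum(hist)
    hist = [h / total for h in hist]
    # Replace each image pixel by the probability of its intensity bin;
    # peaks in the resulting map indicate likely target locations.
    return [[hist[v * bins // 256] for v in row] for row in image]
```

Because the histogram ignores spatial layout, back-projection is cheap and robust for re-detection, while the correlation filter handles precise frame-to-frame localization.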
Long-range tertiary interactions between RNA tetraloops and their receptors stabilize the folding of ribosomal RNA and support the maturation of the ribosome. Here, we use FRET-assisted structure prediction to develop a structural model of the GAAA tetraloop receptor (TLR) interaction and its dynamics. We build the docked TLR de novo, label the RNA in silico and compute FRET histograms based on MD simulations. The predicted mean FRET efficiency is remarkably consistent with single-molecule experiments of the docked tetraloop. This hybrid approach of experiment and simulation will promote the elucidation of dynamic RNA tertiary contacts and accelerate the discovery of novel RNA and RNA-protein interactions as potential future drug targets.