We report on our recent progress in creating a new type of compact laser based on thulium fiber CPA technology, emitting at a central wavelength of 2 μm. The laser produces pulse energies of >100 μJ and an average power of >15 W. It is designed for long-term, industrial-grade operation, making it well suited for integration into laser machines used for materials processing. These laser parameters are ideal for processing semiconductors such as silicon, enabling tasks such as micro-welding, filament cutting, dicing, bonding and more.
As part of the research project Trusted Blockchains for the Open, Smart Energy Grid of the Future (tbiEnergy), one of the objectives is to investigate how a holistic blockchain approach for the realization of a local energy market could be accomplished and how corresponding hardware security mechanisms can be integrated. This paper provides an overview of the implemented prototype and describes the system and its processes.
This study analyzes the coverage by the newspapers El País (Spain), Folha de S. Paulo (Brazil) and Süddeutsche Zeitung (Germany) of the protests in Brazil against the 2013 Confederations Cup and the 2014 FIFA World Cup, in order to compare them and determine which topics the newspapers emphasized and which tone they used in their reporting. Based on the research questions, four categories were developed for the analysis: article structure, topic of the article, actors/groups of persons, and tone of the reporting, each comprising several subcategories. It was concluded that the themes highlighted by the European newspapers differed from those stressed by the Brazilian daily. Nonetheless, all the reviewed newspapers covered the protests in a neutral tone.
This paper examines the communication channels used by innovation projects at ProtoSpace Hamburg when engaging with stakeholders, and addresses the question of whether new media channels improve the chances of success for innovation projects when used for this communication. Interviews with eight experts in communication, innovation and stakeholder management were conducted and then analyzed using Mayring's qualitative content analysis in order to answer the posed question.
The functions summarized under the keyword Internet Protocol Television (IPTV) describe the transmission of video services to users via the Internet Protocol (IP). Alongside this new television transmission path, Home Theatre PCs (HTPCs) running so-called Media Center platforms are increasingly entering living rooms as companions to the popular LCD and plasma displays. Ease of use and visual integration, both on the screen and in the living room, are raising their acceptance. These HTPCs form a central node for multimedia services such as TV, radio and email within the networked household. There are therefore good preconditions for using an HTPC as an end device for IPTV and telecommunication services driven by telco operators. In the context of this diploma thesis, possibilities for provisioning IPTV and Next Generation Network (NGN) services on a converged multimedia home entertainment platform for the living room are investigated, in particular Vista Media Center platforms. To this end, standardization activities dealing with the integration of IPTV and telecommunication services into NGN are examined. The results are validated by the design and implementation of a Vista Media Center Add-In, which can be integrated as an IP Multimedia Subsystem (IMS) based User Agent (UA) into ETSI TISPAN Release 2 IPTV infrastructures. Additionally, a cross-domain messaging service for IMS-based UAs is created, which enables cross-network communication between users.
Drought is one of the most common and dangerous threats plants have to face, costing the global agricultural sector billions of dollars every year and leading to the loss of tons of harvest. Until people drastically reduce their consumption of animal products or cellular agriculture comes of age, more and more crops will need to be produced to sustain the ever-growing human population. Even then, as more areas on Earth become prone to drought due to climate change, we may still have to find or breed plant varieties better suited to grow and prosper in these changing environments.
Plants respond to drought stress with a complex interplay of hormones, transcription factors, and many other functional or regulatory proteins, and mapping out this web of agents is no trivial task. Over the last two to three decades, machine learning has become immensely popular and is increasingly used to find patterns in situations that are too complex for the human mind to grasp. Even though much of the hype focuses on the latest developments in deep learning, relatively simple methods often yield superior results, especially when data is limited and expensive to gather.
This Master's thesis, conducted at the IPK in Gatersleben, develops an approach for shedding light on the phenotypic and transcriptomic processes that occur when a plant is subjected to stress. It centers on a random forest feature selection algorithm, and although it is used here to illuminate the drought stress response in Arabidopsis thaliana, it can be applied to all kinds of stresses in all kinds of plants.
The voluntary international blog VaultingNews has existed for two years now. In the meantime, the team has grown and costs have increased. This thesis is a collection of tools that can help improve communication among team members spread all over the world, and it introduces monetization ideas with a focus on establishing an online fan shop based in Germany. It concludes with a checklist of the laws that have to be observed.
There are multiple ways to gain information about an individual and their health status, but an increasingly popular field in medicine is the analysis of human breath, which carries a lot of information about metabolic processes within the individual's body. The information in exhaled breath is carried by volatile (organic) compounds (VOCs). These VOCs are products of metabolic processes within the individual's body and might therefore be indicators of diseases disturbing those processes. The compounds can be detected by mass spectrometry (MS) or ion mobility spectrometry (IMS), so their analysis is not bound to exhaled breath alone. The resulting data is spectral data, capturing the concentrations of the VOCs indirectly through intensities. About 3,000 VOCs [1] have already been identified in human exhaled breath, and the number of research papers on VOC analysis and detection has risen almost steadily over the last decade. Furthermore, the technique used to identify VOCs can also capture biomarkers of foreign organisms within the individual's body. Extracting VOCs from an individual can be done with non-invasive or minimally invasive techniques. However, the manual identification of VOCs and biomarkers related to a certain disease or infection is not feasible due to the complexity of the sample and often unknown metabolic products, so automated techniques are needed. [1–4] To establish breath analysis as a diagnostic tool, machine learning methods can be used. Machine learning has become a popular and common technique when dealing with medical data due to its rapid analysis. Exploiting this advantage, breath analysis using machine learning could become the method of choice for diagnosis, keeping in mind that conventional methods are laboratory-based and, when trying to detect bacterial infections, sometimes need several days to identify the organism. [5]
The main purpose of this thesis is to investigate the factors influencing the buying decision of cigarette smokers. To this end, different theories concerning consumer buying behavior and its influencing factors are discussed to achieve a deeper understanding of consumer behaviour. To identify the factors that influence the buying decision of a smoker as a consumer, a questionnaire survey was conducted. The results of the survey indicate that brand awareness, quality of the tobacco, price, packaging, advertisement, influence by others and availability are the major factors influencing a smoker's buying decision, with availability, quality, price and brand awareness having the strongest influence.
A Systematic Literature Review on Blockchain Oracles: State of Research, Challenges, and Trends
(2023)
To enable data exchange between a blockchain protocol (on-chain) and the real world (off-chain), e.g., non-blockchain-based applications and systems, software called an oracle is used [3]. Blockchain oracles are an important component in the use of off-chain data by on-chain smart contracts. However, there is limited scientific literature available on this important blockchain topic. Therefore, in this paper, a novel systematic literature review based on intelligent methods, e.g., information linking, topic clustering and focus identification through frequency calculations, is proposed. In this way, the current state of scientific research interest, content and challenges, and future research directions for blockchain oracles are identified. This paper shows that there is little unbiased literature that does not call oracles a problem. From the results of this new literature review framework, relevant areas of data handling and verification with blockchain oracles are identified for future research.
Humans started using the principles of insurance thousands of years ago when they lived in tribes in smaller villages. If one of the tribe members were injured, the others would take care of him and his family. The basic principle of insurance is several people covering each other against a particular risk. Today, most people in regions like Europe have access to insurance, while many people worldwide still have no access at all. The cost and accessibility may be improved with a blockchain-based parametric approach. The insurance process in a parametric approach is exclusively based on data, and decisions are made objectively. Blockchain is a necessary and integral part of the approach to create transparency and connect the customer’s and investor’s risk capital. The paper offers an overview of the opportunities and challenges of blockchain-based parametric insurance, a catalog of criteria for such insurance, a description of all components and their interaction for implementation on Ethereum, and a reference implementation of a train delay insurance in Germany.
Derived from the Ancient Greek word τραῦμα (English: wound, damage), the word trauma refers to either physical or emotional wounds. Nowadays, it is mostly used in the context of psychological wounds inflicted by an identity-shattering event: an event that leaves the traumatised unable to reconcile their lived reality with the expectation of a universal human experience. The last decade, the last two years in particular, and the last two weeks ad absurdum, have scarred the global landscape of human existence beyond recognition. From Putin's unexpected reimposition of mutually assured destruction doctrines, via the global SARS-CoV-2 pandemic, to the lingering threat of climate doom, people all over the globe have been faced with persistent threats to their most basic perceptions of ontological safety. This article seeks to examine the impact of the SARS-CoV-2 pandemic and to what degree it is justified to speak of a pandemic trauma. In addition, it engages with the liminality of pandemic trauma as both a shared, collective and an isolated, individual experience, and with potential mitigation strategies for building community resilience.
The objective of this diploma thesis is to analyse the results of functional tests carried out on hydraulic valve blocks at the Wujin Plant of Bosch Rexroth (Changzhou) Co., Ltd., China. Based on this analysis, tests could be checked for systematic errors and root causes of failures identified. This ultimately helped increase the first pass yield of testing and release resources previously bound in inefficient testing processes. Furthermore, a tracking mechanism was established to monitor the function of crucial sensors at the test benches.
In the past few years, social media has become the most popular communication medium, replacing phone calls, text messages, television and even advertisements, and it has become the most important channel for spreading opinions. As a result of this trend, many politicians have also started to operate social media accounts (Wang, Tsai, & Chen 2019). This study was conducted to understand whether there was an inter-candidate agenda-setting effect between the Facebook posts of legislative candidates and presidential candidates during the election period, and whether the legislative candidates' Facebook posts were influenced by the presidential candidates' Facebook posts. The target population of this study comprised the three presidential candidates in Taiwan's 2020 presidential election, Dr. Tsai Ing-Wen, Mr. Han Kuo-Yu, and Mr. James Soong, as well as the 36 legislative candidates in Taipei, Taichung, and Kaohsiung.
The study focused on Facebook posts from 1 November 2019 to 10 January 2020, the 10 weeks before voting day. Text mining and cosine similarity were used to organize the posts and compare the similarity between them. Finally, the similarity between posts was presented as a line graph.
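As an illustration of the comparison step, the following minimal Python sketch computes the cosine similarity between the term vectors of two posts; the post texts and the simple bag-of-words weighting are placeholders, not the study's actual (Chinese-language) preprocessing pipeline.

```python
# Minimal sketch: cosine similarity between the term vectors of two posts.
# The post texts below are hypothetical stand-ins for the candidates' posts.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.metrics.pairwise import cosine_similarity

post_a = "housing policy and energy prices dominated the campaign rally"
post_b = "the candidate spoke about energy policy and housing for young families"

vectors = CountVectorizer().fit_transform([post_a, post_b])
similarity = cosine_similarity(vectors[0], vectors[1])[0, 0]
print(f"cosine similarity: {similarity:.3f}")   # 1.0 = identical term profiles
```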
The study revealed that there was an inter-candidate agenda-setting effect between legislative candidate posts and presidential candidate posts, and that Dr. Tsai Ing-Wen, who was also the incumbent president during the campaign, was the most influential Facebook poster during the entire election.
Future research on the inter-candidate agenda-setting effect is proposed, analyzing only the similarity of posts among candidates in order to discuss the influence of the candidates' Facebook agenda-setting during a specific election period.
This is the first study in which the Facebook posts of Taiwanese politicians are analyzed and their relationships systematically compared across multiple degrees, which opens up a whole new subject for future elections in Taiwan.
The games industry has significantly grown over the last 30 years. Projects are getting bigger and more expensive, making it essential to plan, structure and track them more efficiently.
The growth of projects has increased the administrative workload for producers, project managers and leads, who have to monitor and control progress in order to maintain a permanent overview of the project. This is often accompanied by a lack of insight into the project and a lack of basic communication within the team. Therefore, the goal of this thesis is to enhance conventional project management methods with process structures that occur frequently in game development.
This thesis first elaborates on what project management in the games industry actually is: methods are considered, especially agile methods and progress-tracking practices, which were created for software development and have become a standard in game development. Subsequently, an example is used to demonstrate how process management can function within the development of video games. Based on this, an ideal is depicted, which is implemented and used in a tool at the German games studio KING Art GmbH. This ideal is compared with expert interviews in order to verify its general validity in the industry.
By integrating process structures, the administrative effort can be reduced and communication within game development can be simplified, while the current project status can be permanently presented. This benefits both project management and leads, as well as the entire team. Further application tests of this theory would have to be organized to check scalability and to draw comparisons to other applications.
Due to the intractability of the Discrete Logarithm Problem (DLP), it has been widely used in the field of cryptography, and the security of several cryptosystems is based on the hardness of computing discrete logarithms. In this paper, we start with topics from number theory and abstract algebra, as these enable one to study the nature of discrete logarithms in a comprehensive way, and then concentrate on the application and computation of discrete logarithms. Applications of discrete logarithms such as the Diffie-Hellman key exchange and the ElGamal signature scheme, as well as several attacks on the DLP such as the Baby-step Giant-step method and the Silver-Pohlig-Hellman algorithm, are analyzed. We also focus on elliptic curves and the discrete logarithm over an elliptic curve. Attacks on the elliptic curve discrete logarithm problem (ECDLP) are discussed. Moreover, the extension of several discrete-logarithm-based protocols to elliptic curves, such as the Elliptic Curve Digital Signature Algorithm (ECDSA), is also discussed.
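As a small illustration of one of the attacks surveyed above, the following Python sketch implements the Baby-step Giant-step method for a toy prime modulus; the group and the instance are chosen for demonstration only.

```python
# Baby-step Giant-step: solves g^x = h (mod p) in about sqrt(p) group operations.
from math import isqrt

def bsgs(g, h, p):
    m = isqrt(p - 1) + 1                         # ceiling of sqrt of the group order
    baby = {pow(g, j, p): j for j in range(m)}   # baby steps: store g^j
    factor = pow(g, -m, p)                       # g^(-m) mod p (modular inverse, Python 3.8+)
    gamma = h
    for i in range(m):                           # giant steps: h * g^(-i*m)
        if gamma in baby:
            return i * m + baby[gamma]           # x = i*m + j
        gamma = (gamma * factor) % p
    return None

# toy instance: find x with 3^x = 13 (mod 17); the answer is 4
print(bsgs(3, 13, 17))
```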
This article aims to explain mathematically why the so-called double descent observed by Belkin et al., Reconciling modern machine-learning practice and the classical bias-variance trade-off, PNAS 116(32) (2019), pp. 15849-15854, occurs on the way from the classical approximation regime of machine learning to the modern interpolation regime. We argue that this phenomenon may be explained by a decomposition of mean squared error plus complexity into bias, variance and an unavoidable irreducible error inherent to the problem. Further, in the case of normally distributed output errors, we apply this decomposition to explain why LASSO provides reliable predictors that avoid overfitting.
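For orientation, the decomposition referred to above can be written in its standard textbook form (the article's additional complexity term and exact notation are not reproduced here):

```latex
\mathbb{E}\!\left[(y - \hat{f}(x))^2\right]
  = \underbrace{\bigl(\mathbb{E}[\hat{f}(x)] - f(x)\bigr)^2}_{\text{bias}^2}
  + \underbrace{\mathbb{E}\!\left[\bigl(\hat{f}(x) - \mathbb{E}[\hat{f}(x)]\bigr)^2\right]}_{\text{variance}}
  + \underbrace{\sigma^2}_{\text{irreducible error}}
```

where f is the true regression function, f-hat the fitted predictor and sigma squared the variance of the output noise.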
In the field of blockchain technology applications and research, non-fungible tokens (NFTs) have gained significant attention in recent years. While current research focuses on NFT use cases or the purchase of NFTs from an investor's perspective, the NFT launch (i.e. the primary market) from a creator's perspective remains largely unexamined. However, the launch strategy is considered an important factor for the success of a product. Therefore, our research paper aims to explore launch strategies for NFTs. In doing so, we discuss the marketing mix instruments price (i.e. pricing strategy), place (i.e. mint mechanism), and promotion. Through an empirical approach of conducting eight expert interviews, we examine the parameters used to define an NFT launch strategy and assess the preferences of different stakeholders.
This Bachelor thesis deals with connected systems consisting of a multitude of similar electronic devices (often referred to as agents) endowed with information processing abilities. These so-called multi-agent systems are required to solve a certain task with high reliability, while the individual components are not able to solve the problem on their own in a satisfying manner. A central control unit cannot or shall not be used in such systems for a variety of reasons: for example, a significant drawback of a central control unit is the vulnerability of the system; if the central control unit fails, the whole system breaks down. Therefore, multi-agent systems require special algorithms enabling the agents to solve a common, global problem in a suitable manner by local interaction only.
In this thesis, distributed algorithms are investigated which can be used for distributed information processing and control of such multi-agent systems. In the first part of this work, it is assumed that each agent possesses a private information state about a common parameter of interest. The described consensus algorithm enables all agents to reach a system-wide identical information state by local information exchanges only. Subsequently, the case is considered in which every agent has access to streaming data containing information about an a priori unknown parameter. The diffusion strategy described in the second part enables the agents to estimate this parameter and to minimize a global cost function which depends on it. Both algorithms are described in a general framework and can therefore be applied to a variety of different problems. One application of these strategies, which is described in the third part of this work, is the simulation of swarming behavior.
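The consensus idea described in the first part can be illustrated with a minimal average-consensus sketch; the ring topology, step size and initial values below are hypothetical and not taken from the thesis.

```python
# Average consensus on an undirected graph: every agent repeatedly moves its
# value toward its neighbours' values and all agents converge to the mean.
import numpy as np

neighbours = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}   # hypothetical 4-agent ring
x = np.array([1.0, 4.0, 2.0, 7.0])                          # private initial states
epsilon = 0.3                                               # step size < 1 / max degree

for _ in range(100):
    x = x + epsilon * np.array(
        [sum(x[j] - x[i] for j in neighbours[i]) for i in range(len(x))]
    )

print(x)   # every entry approaches the network-wide average 3.5
```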
Cancer is one of the main causes of death in developed countries, and cancer treatment depends heavily on successful early detection and diagnosis. Tumor biomarkers are helpful for early diagnosis. The goal of this discovery method is to identify genetic variations as well as changes in gene expression or activity that can be linked to a typical cancer state.
First, several cancer gene signaling pathways were introduced and then combined, and 27 candidate genes were selected. Through the analysis of several data sets in the GEO database, a number of expression difference matrices were established. The candidate genes were tested against these matrices, and five genes, PLA1A, MMP14, CCND1, BIRC5 and MYC, were found to have the potential to be tumor biomarkers. Two of these genes are discussed further: PLA1A is a potential biomarker for prostate cancer, and MMP14 can be considered a biomarker for non-small-cell lung cancer.
Finally, the significance of this study and the potential value of the two genes are discussed, and prospects for future research in this direction are outlined.
This scientific work deals with current opportunities in business development. The purpose of the work is the study and analysis of an organization's development strategy and how it evolves. The subject of the study is the mechanism by which an organization's development strategy is formed, together with an understanding of business development and its core methodologies and branches. The thesis is based on the operations of a real engineering company, and the main part of the research could be applied in practice. The main goal of the thesis is to derive recommendations for implementing strategic changes to the organization's development strategy.
Blockchain is a technology with the potential to change the way the world operates. As promising as this may be, there are still many challenges that do not exist, or are far simpler to solve, in conventional software solutions. Services offered over a blockchain suffer from so-called block confirmation times, during which the customer simply has to wait until the transaction is confirmed. In this paper, possible solutions to that problem are examined, and the challenges that arise from the specific characteristics of the Ethereum blockchain are analyzed.
Machine learning models for time series have always been a topic of special interest due to the unique structure of the data. Recently, the introduction of attention improved the capabilities of recurrent neural networks and transformers on learning tasks such as machine translation. However, these models are usually subsymbolic architectures, making their inner workings hard to interpret without comprehensive tools. In contrast, interpretable models such as learning vector quantization are more transparent, making their decision process easier to interpret. This thesis attempts to merge attention as a machine learning function with learning vector quantization to better handle time series data. A design for such a model is proposed and tested on a dataset used in connection with attention-based transformers. Although the proposed model did not yield the expected results, this work outlines improvements for further research on this approach.
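One way such a combination might look, purely as an illustration and not the thesis's actual architecture, is an attention weighting over time steps inside a prototype-based (LVQ-style) distance, so that informative time steps contribute more to the classification:

```python
# Illustrative only: attention-weighted squared distance between a time series
# and a class prototype. All values below are hypothetical.
import numpy as np

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def attended_distance(series, prototype, scores):
    weights = softmax(scores)                       # attention over time steps
    return float(np.sum(weights * (series - prototype) ** 2))

series    = np.array([0.1, 0.9, 0.8, 0.2])          # 4-step input series
prototype = np.array([0.0, 1.0, 1.0, 0.0])          # learned class prototype
scores    = np.array([0.5, 2.0, 2.0, 0.5])          # relevance scores per time step

print(attended_distance(series, prototype, scores)) # smaller = closer to the class
```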
Analysis of Continuous Learning Strategies at the Example of Replay-Based Text Classification
(2023)
Continuous learning is a research field that has gained significant momentum in recent years due to highly complex machine and deep learning models. Whereas static models need to be retrained entirely from scratch when new data become available, continuous models progressively adapt to new data, saving computational resources. In this context, this work analyzes parameters that impact replay-based continuous learning approaches, using the example of a data-incremental text classification task with an MLP and an LSTM. In general, it was found that replay improves the results compared to naive approaches but does not reach the performance of a static model. Performance mainly increased with more replayed examples, and the number of training iterations has a significant influence, as it can partly control the stability-plasticity trade-off. In contrast, balancing the buffer and the strategy for selecting examples to store in the replay buffer were found to have only a minor impact on the results in the present case.
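The replay mechanism analyzed here can be sketched with a small reservoir-sampled buffer; the buffer size, sample size and toy data stream below are placeholders and not the work's actual configuration.

```python
# Replay buffer sketch: keep a bounded sample of earlier examples and mix a few
# of them into each new training step so old data keeps being rehearsed.
import random

class ReplayBuffer:
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = []
        self.seen = 0

    def add(self, example):
        self.seen += 1
        if len(self.items) < self.capacity:
            self.items.append(example)
        else:
            j = random.randrange(self.seen)      # reservoir sampling keeps every
            if j < self.capacity:                # seen example with equal probability
                self.items[j] = example

    def sample(self, k):
        return random.sample(self.items, min(k, len(self.items)))

stream = [(f"document {i}", i % 2) for i in range(1000)]   # toy (text, label) stream
buffer = ReplayBuffer(capacity=100)
for example in stream:
    replayed = buffer.sample(k=16)   # would be mixed into the current training batch
    buffer.add(example)
print(len(buffer.items))
```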
Stability of control systems is one of the central subjects of control theory. The classical asymptotic stability theorem states that the norm of the residual between the state trajectory and the equilibrium tends to zero in the limit. Unfortunately, it does not in general allow computing a concrete rate of convergence, particularly due to algorithmic uncertainty related to the numerical imperfections of floating-point arithmetic. This work proposes to revisit asymptotic stability theory with the aim of computing convergence rates using constructive analysis, a mathematical tool that realizes an equivalence between certain theorems and computation algorithms. Consequently, it also offers a framework that allows controlling numerical imperfections in a coherent and formal way. The overall goal of the current study also matches the trend of introducing formal verification tools into control theory. Besides existing approaches, constructive analysis, as suggested in this work, can also be considered for formal verification of control systems. A computational example is provided that demonstrates the extraction of a convergence certificate for example dynamical systems.
This paper analyses the status quo of large-scale decision making combined with the possibility of using blockchain as an underlying decentralized architecture to govern common pool resources (CPRs) in a collective manner, and evaluates them according to their requirements and features (technical and non-technical). Due to an increasing trend in the distribution of knowledge and an increasing amount of information, the combination of these decentralized technologies and approaches can not only be beneficial for consortial governance using blockchain but can also help communities to govern common goods and resources. Blockchain and its trust-enhancing properties can potentially be a catalyst for more collaborative behavior among participants and may lead to new insights about collective action and CPRs.
Analysis of the Forensic Preparation of Biometric Facial Features for Digital User Authentication
(2023)
Biometrics has become a popular method of securing access to data, as it eliminates the need for users to remember a password. Although exploitation of the vulnerabilities of biometric systems has increased with their usage, these vulnerabilities can also be helpful during criminal casework.
This thesis aims to evaluate approaches to bypass electronic devices with forged faces to access data for law enforcement. Here, obtaining the necessary data in a timely manner is critical. However, unlocking the devices with a password can take several years with a brute force attack. Consequently, biometrics could be a quicker alternative for unlocking.
Various approaches were examined to bypass current face recognition technologies. The first approaches included printing the user's face on regular paper and aimed to unlock devices performing face recognition in the visible spectrum. Further approaches consisted of printing the user's infrared image and creating three-dimensional masks to bypass devices performing face recognition in the near-infrared. Additionally, the underlying software responsible for face recognition was reverse-engineered to get information about its operation mode.
The experiments demonstrate that forged faces can partly bypass face recognition and obtain secured data. Devices performing face recognition in the visible spectrum can be unlocked with a printed image of the user's face. Regarding devices with advanced near-infrared face recognition, only one could be bypassed with a three-dimensional face mask. In addition, its underlying software provided evidence about the demands of face recognition. Other devices under attack remained locked, and their software provided no clues.
Abstract not available.
This bachelor thesis aims to introduce the theoretical concept of Human Resources Management (HRM), to analyze the work of the human resources department of LLC Tavria-V, and to propose recommended actions to improve the productivity of the personnel. To prepare the implementation of actions for improving personnel management, an overview of the theoretical and methodological aspects of HRM is first presented, and theories that have shaped today's management of workers are described. Secondly, the organization of the enterprise, its main indicators and types of activities are identified and analyzed in the form of tables and diagrams. The main object of the thesis, the process of personnel management together with its qualitative characteristics, is described and presented. Using an employee survey, the advantages and disadvantages of the present HRM system are identified. In the last part, taking into account all data about the current situation, recommended actions and their expected effect for LLC Tavria-V are presented on the basis of the personnel management analysis.
In this thesis, we focus on using machine learning to automate manual or rule-based processes for the deduplication task of the data integration process in an enterprise customer experience program. We study the theoretical foundations of the most widely used machine learning algorithms, including logistic regression, random forests, extreme gradient boosting trees, support vector machines, and generalized matrix learning vector quantization. We then apply those algorithms to a real, private data set and use standard evaluation metrics for classification, such as the confusion matrix, precision and recall, the area under the precision-recall curve, and the area under the receiver operating characteristic curve, to compare their performance and results.
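The evaluation step can be sketched as follows; the labels and scores are made-up stand-ins for the private data set, and scikit-learn is assumed as the tooling.

```python
# Sketch of the classification evaluation metrics named above on toy data.
from sklearn.metrics import (confusion_matrix, precision_score, recall_score,
                             average_precision_score, roc_auc_score)

y_true  = [0, 0, 1, 1, 1, 0, 1, 0]                   # hypothetical duplicate labels
y_score = [0.1, 0.4, 0.8, 0.7, 0.3, 0.2, 0.9, 0.6]   # hypothetical model scores
y_pred  = [int(s >= 0.5) for s in y_score]           # threshold at 0.5

print(confusion_matrix(y_true, y_pred))
print("precision:", precision_score(y_true, y_pred))
print("recall:   ", recall_score(y_true, y_pred))
print("PR-AUC:   ", average_precision_score(y_true, y_score))
print("ROC-AUC:  ", roc_auc_score(y_true, y_score))
```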
As new sensors are added to VR headsets, more data can be collected, which introduces a new potential threat to user privacy. We focused on the feasibility of extracting personal information from eye tracking. To this end, we designed a preliminary user study focusing on the pupil response to audio stimuli. We tested the collected data with a variety of machine learning models to determine the feasibility of obtaining information such as the age or gender of the participant. Several of the experiments show promise for obtaining this information: we were able to determine with reasonable certainty whether caffeine had been consumed as well as the gender of the participant. This demonstrates the unknown threat that embedded sensors pose to users. Further studies are planned to verify the results.
FUSO is one of Japan's leading manufacturers of trucks and buses worldwide and an integral part of Daimler AG. As a large truck and bus manufacturer, FUSO faces market problems due to corrosion. Corrosion is one of the major causes of breakdowns and degraded vehicle performance. To counter this issue, FUSO initiated a new project called the "Anti-Corrosion Project". The main mission of this project is to improve the corrosion resistance and performance of the metal parts. Currently, almost 70 percent of FUSO's parts fall under Grade III, i.e. less than one year of corrosion resistance.
In this project, corrosion issues are collected through different types of audits, for example from customers as well as from inspecting two-year-old vehicles kept in the worst conditions. The listed corrosion issues are further investigated against the current specification, and new proposals are requested from the suppliers. The cost of the proposed solution is then estimated internally and negotiated with the supplier, before being forwarded to a meeting with top management for approval. In the case of a higher corrosion specification, parts are taken from the production line and tested in FUSO's in-house material lab. Finally, a drawing change is requested for the approved proposal, and the new proposal is implemented. The entire project has to be coordinated with the different departments, and working with the teams gives deeper knowledge about the causes of the issues.
In parallel with this project, the work also focused on shop floor developments in the return parts management (RPM) area. FUSO is also responsible for after-sales services; in other words, FUSO provides a warranty for parts that break down within three years. Breakdown parts are delivered directly by the customers through dealers for warranty claims, so these parts are called Warranty Part Investigation (WPI) parts. Sometimes a customer wants to know the cause of a breakdown even though the warranty has expired; in this case the company investigates the cause but does not provide warranty coverage. These parts are known as Product Quality Report (PQR) parts.
The company has a separate shop floor for return parts, and these parts are received directly by the company. RPM comprises four processes: inwarding, pre-analysis, investigation, and dispatch or scrap.
Usually the company received 30-50 parts per day; recently it decided to accept all breakdown parts, which increased the delay in inwarding and the other processes. To solve this, a standard layout and process were constructed. One of the main reasons for the inwarding delay is extensive documentation that is largely unnecessary; this work was automated or digitalized. The improvements were carried out using a lean manufacturing project methodology, resulting in a higher inward throughput of failure parts and less inventory.
Safety, quality, and sustainability concerns have arisen from global supply chains. Stakeholders incur risk regarding these factors, given their significance and complexity. Thus, each business's supply chain risk management must prioritize product characteristics. Accordingly, an effective traceability solution that can monitor and regulate product and supply chain aspects is crucial, especially in a given scenario. This research paper elucidates the potential of smart contracts on a blockchain to enhance the efficacy of business transactions and to ensure comprehensive traceability within the supply chain of paper-based coffee cups. Improved levels of transaction transparency and security compared to traditional supply chains have been achieved through the digitization of supply chain ecosystem interactions and transactions. This approach makes it easier to verify sources, manufacturing procedures, and quality standards in complex supply chains. Accordingly, the integration helps stakeholders monitor and track the whole ecosystem, promoting transparency, predictability, and dependability.
At a global level, different studies indicate that transport systems are responsible for 25% of CO2 emissions. In the context of sustainable mobility, one of the short-term challenges is the research and improvement of alternative fuels, which should allow a fast decrease in the generation of greenhouse gases by sustainable means of transport. In this sense, green hydrogen can play a fundamental role. Green hydrogen is the basis for producing synthetic fuels, which can replace oil and its derivatives. Synthetic fuels, or e-fuels, are hydrocarbons produced from carbon dioxide (CO2) and green hydrogen (H2) as the only raw materials. H2 or e-fuels could be used in many sectors (manufacturing, residential, transportation, mining and other industries). In this study, different applications of hydrogen are evaluated by techno-economic analysis. The main variable that affects the production of hydrogen and its derivatives is the cost of electricity. Considering the renewable energy potential of Chile, it is feasible to develop green hydrogen production in Chile as an energy vector, which would be technically and economically viable, together with the associated environmental benefits.
Many companies use machine learning techniques to support decision-making and automate business processes by learning from the data they have. In this thesis, we investigate the theory behind the machine learning algorithms most widely used in practice for solving classification and regression problems.
In particular, the following algorithms were chosen for the classification problem: Logistic Regression, Decision Trees, Random Forest, Support Vector Machine (SVM), Learning Vector Quantization (LVQ). As for the regression problem, Decision Trees, Random Forest and Gradient Boosted Tree were used. We then apply those algorithms to real company data and compare their performances and results.
Applications and Potential Impacts of Blockchain Technology in Logistics and Supply Chain Areas
(2022)
The motive of the present thesis is to analyze the applications and potential impacts of blockchain technology in the logistics and supply chain areas. For this purpose, the literature from different sources has been used to analyze and get an overview of the current status and role of blockchain technology within the logistics and supply chain areas. Different use cases, as well as pilot projects from organizations all over the world and also from Germany, have been included. Suggestions for further applications and implementations of blockchain technology along with their potential impacts have been made. Additionally, the cost of implementing blockchain-based solutions and applications has been estimated along with providing recommendations and suggestions for important and key points to be considered before preparing and deciding to implement blockchain-based solutions in any organization.
Laser welding of hidden T-joints, connecting the web sheet through the face sheet of the joint, can provide advantages such as increased lightweight potential in manufacturing sandwich structures with thin-walled cores. However, maintaining the correct positioning of the beam relative to the joint is challenging. One method to reduce the positioning effort is optical coherence tomography (OCT), which interferometrically measures the reflection distance inside the keyhole during laser deep penetration welding. In this study, new approaches for targeted data processing of the OCT signal to automatically detect misalignments are presented. It is shown that considering multiple components of the interference pattern and the respective signal intensities improves the detection accuracy for misalignments.
Aspects of Mindful Leadership Upon the Psychological Health of Employees in an Intercultural Context
(2023)
Across the globe, organizations are in the midst of rapid transformation; immigration, digitalization and the push for sustainability are just a few examples. Organizational structures are being pushed toward more agility, co-opetition, integration, and tenable, resilient workplaces. The social structures of companies are being reformed, and the weight of cooperation and integration lies with leaders and employees. But what psychological effects does this weight of integration have on the engagement of migrant and domestic employees at work? What role does leadership style play in mental health and engagement in the cross-cultural workplace? Previous work has shown the importance of workplace integration; however, the impact on the mental health of domestic employees needs more attention from scholars in this new context. The objective of this research is to define the connection between mindful leadership and the psychological health of employees within a cross-cultural workplace and to develop strategies to improve workplace engagement.
Assessment of COI and 16S for insect species identification to determine the diet of city bats
(2023)
Despite the numerous benefits of urbanization for human living conditions, urbanization has also negatively affected humans, their environment, and other organisms that share urban habitats with humans. While some wild animals avoid living in urban areas, others are more tolerant of, or even prefer, life in urban habitats. There are more than 1,400 species of bats in the world. They therefore have the potential to contribute significantly to mammalian biodiversity in urban areas. Insectivorous bat species play a key role in agriculture by improving yields and reducing chemical pesticide costs. Using metabarcoding, it is possible to determine the prey consumed by these nocturnal mammals based on the DNA fragments in their fecal pellets. This study aimed to evaluate COI and 16S metabarcodes for insect species identification to determine the diet of metropolitan bats. For this purpose, COI and 16S metabarcodes were extracted, amplified, and sequenced from 65 bat feces collected in the Berlin metropolitan area. Following taxonomic annotation, I found that 73% of all identified insects could only be detected using the COI method, while 15% could only be recovered using the 16S approach. Just 12% of all detected insects were identified simultaneously by both markers. According to this result, COI is more suitable for the taxonomic identification of insects from bat feces. However, given the bias of COI primers, it is recommended to use both markers for a more precise estimation of species diversity. Additionally, based on the insect species identified, I noticed that urban bats fed mainly on Diptera, Coleoptera, and Lepidoptera. The bat species Nyctalus noctula was most abundant in the samples; its diet analysis revealed that 91% of the samples contained the insect species Chironomus plumosus, and 14 pest insect species were also found in its diet.
Noise in the oceans is a constantly increasing factor. Growing industrialisation due to shipping, offshore wind parks, seismic studies and other anthropogenic noise is putting the ecosystem under immense stress. The focus of this thesis is the assessment of continuous underwater noise from ships. Based on existing strategies in air as well as underwater, and a comparison of both, an alternative strategy for the assessment of continuous noise from ships is proposed. The concept developed is based on published, scientifically observed responses of animals to ship passes with an indication of an effect range. A model is created to describe the strategy, using publicly available data for cargo ships as an example. The results are summarized in maps depicting the affected area for an MRU of the OSPAR II region and the MPA "Borkum Riffgrund". The strategy is discussed and evaluated on the basis of these results, and from this, further improvements and the need for additional information in publicly available data on vessel traffic are derived.
The GeoFlow II experiment aims to replicate Earth's core dynamics using a rotating spherical container with controlled temperature differences and simulated gravity. During the GeoFlow II campaign, a massive dataset of images was collected, necessitating an automated system for image processing and fluid flow visualization in the northern hemisphere of the spherical container. On this basis, we aim to detect the special structures appearing in the post-processed images. Recognizing YOLOv5's proficiency in object detection, we apply a YOLOv5 model to this task.
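A minimal sketch of applying a YOLOv5 model to such images is shown below; the image file name is a placeholder, and in practice the model would be trained or fine-tuned on the experiment's own annotated structures rather than the pretrained COCO weights used here.

```python
# Sketch: run a (pretrained) YOLOv5 model on a post-processed experiment frame.
import torch

model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)
results = model(["processed_frame_0001.png"])    # hypothetical image path
results.print()                                  # summary of detections
boxes = results.xyxy[0]                          # tensor: [x1, y1, x2, y2, confidence, class]
```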
Cryptocurrencies are characterized by high volatility, both in the short and in the long term. Experienced traders exploit this to profit from price fluctuations by swing trading. However, this requires closely observing and analyzing the prices and taking trading positions at the right time. Only a few specialists who spend time focusing on this, or optimized trading bots, are able to make profits continuously. The autradix protocol is a self-optimizing and self-learning parametric trading algorithm that analyzes price actions in real time and adaptively optimizes the algorithm's parameters to realize the user's investment objective. Embedded in an adaptive genetic algorithm, possible parameterizations are simulated and the optimal ones for the investigated trading pairs are calculated. The generic trading protocol API enables coupling with various crypto exchanges and decentralized protocols. A smart-contract-based decentralized, trustless, and tokenized fund, controlled by a DAO, enables users to invest, operate trading agents, and participate in the profits generated according to their share.
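The genetic parameter search described above can be illustrated with a toy sketch; the two parameters, the fitness function and all numbers are hypothetical and stand in for a backtest of a real trading strategy.

```python
# Toy genetic search over strategy parameters: score candidates with a simulated
# backtest, keep the fittest, and create offspring by mutation.
import random

def backtest(params):
    """Hypothetical fitness: profit of a simulated run with (stop_loss, take_profit)."""
    stop_loss, take_profit = params
    return -(stop_loss - 0.02) ** 2 - (take_profit - 0.05) ** 2   # toy objective

def mutate(params, scale=0.005):
    return tuple(max(0.0, p + random.gauss(0, scale)) for p in params)

population = [(random.uniform(0, 0.1), random.uniform(0, 0.1)) for _ in range(20)]
for _ in range(50):
    population.sort(key=backtest, reverse=True)
    parents = population[:5]                                       # survivors
    population = parents + [mutate(random.choice(parents)) for _ in range(15)]

population.sort(key=backtest, reverse=True)
print("best parameters found:", population[0])
```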
A variety of methods have been used to describe natural systems and cellular functions; most use continuous systems with differential equations. Based upon the neighbourhood relations in graphs and the complex interactions in cellular automata, a mathematical model was designed and implemented with an application user interface. This discrete approach, called graph automata, was utilised to simulate diffusion processes and chemical kinetics. The progression of diffusion in cellular environments was described and resulted in a discrepancy of 20% in comparison to experimental results. Different chemical kinetics were simulated and found to be as accurate as their continuous counterparts. The proposed model appears to be a highly scalable and modular approach for simulating natural systems.
Beam shaping and splitting with diffractive optics for high performance laser scanning systems
(2021)
Diffractive optical elements (DOEs) enable novel high performance and process-tailored scanning strategies for galvanometer-based scan heads. Here we present several such concepts integrating DOEs with laser scanners and the respective application use cases. Beam shaping DOEs providing a homogeneous fluence over a custom defined profile, such as a rectangular Top-Hat, enable increased process quality in Laser-Induced Forward Transfer (LIFT) compared to the Gaussian beam of the laser source. We show that aberrations which occur over the necessary large wafer-sized image field can be eliminated through the use of a synchronous XY-stage motion. Another application that benefits from the use of DOEs is laser drilling. Drilling in display and electronics manufacturing demands high throughput that can only be achieved through the use of beam splitting DOEs for parallel processing. To this end, the joint MULTISCAN project is developing a variable multi-beam tool capable of scanning and switching each individual beamlet for increased control.
Gold cyanidation is a process by which gold is extracted from low-grade ore. Due to its efficiency, it has found widespread application around the world, including in Peru. The process requires free cyanide in high concentrations. After gold extraction is completed, free cyanide as well as metal cyanide complexes remain in the effluent of gold mines and refineries. Often these effluents are kept in storage ponds, where they pose considerable risk to health and the environment. It is therefore preferable to degrade the cyanide to minimize the risk of exposure. In the context of this thesis, cyanide degradation was explored in a UV-light-based prototype. Degradation with a combination of hydrogen peroxide and UV light proved to be very effective at degrading cyanide concentrations of 100 mg/L and 1000 mg/L. Furthermore, the presence of ammonia as a degradation product could be confirmed. Membrane distillation may provide an alternative to cyanide destruction in the form of cyanide recovery; promising results were gathered from several membrane experiments.
Biological ammonium oxidation is a central component of the global nitrogen cycle. Given the enormous amounts of nitrogen of anthropogenic origin in the environment, the removal of reactive nitrogen is in the interest of both the environment and public health. The following work investigates conditions for anaerobic ammonium oxidation with nitrate in an anammox reactor. Two laboratory reactors containing only ammonium and nitrate as electron donor and acceptor were operated and monitored for a total of 116 days. In addition, batch cultures were grown with cells from one of the reactors and their gas composition was analyzed as a function of different properties. A range of analytical quantification methods was used, and it could be shown that degradation takes place under these conditions.
Current research on this reaction is sparse, which lends this Bachelor's thesis its relevance.
Bitcoin's energy consumption and social costs in relation to its capacity as a settlement layer
(2021)
Bitcoin runs on energy. The decentralized network's energy consumption has resulted in multifaceted discussions about its efficiency and environmental impact. To put Bitcoin's energy consumption into perspective, we propose to relate (a) the energy consumption in TWh and (b) the resulting social costs in the form of carbon emissions to the Dollar value settled on the Bitcoin network. Both metrics relate and quantify the capacity of Bitcoin as a settlement layer against the network's energy consumption and the resulting carbon emissions, or social costs. We find that in early 2021 Bitcoin (a) settles between $2,333 and $7,555 for each Dollar spent on energy and (b) that, on average, a Dollar settled on the Bitcoin blockchain causes social costs of between 0.007% and 0.01%, depending on the estimated energy consumption converted into the costs of carbon emissions. These results help to assess the efficiency, cost and sustainability of Bitcoin and may allow a comparison of Bitcoin with existing settlement base layers such as Fedwire or gold.
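The first metric is a simple ratio and can be reproduced schematically as follows; the round numbers used are hypothetical placeholders, not the paper's estimates.

```python
# Back-of-the-envelope version of metric (a): USD settled per USD spent on energy.
annual_settled_usd   = 3_000_000_000_000   # hypothetical value settled on-chain per year
annual_energy_twh    = 100                 # hypothetical network energy consumption
energy_price_per_mwh = 50                  # hypothetical electricity price in USD/MWh

energy_cost_usd = annual_energy_twh * 1_000_000 * energy_price_per_mwh  # TWh -> MWh
print(f"USD settled per USD of energy: {annual_settled_usd / energy_cost_usd:,.0f}")
```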
This desk research initiates an exploration of present and potential blockchain applications in the European higher education sector. The aim of this research is to create a theoretical base for further postgraduate research and analysis, in order to create an effective model/framework to augment the integration of blockchain technology into existing organizational processes, initially in higher education institutions, but in a way that may be adaptable and generalizable to other specific uses. Due to the novelty of the topic, academic resources related to the research area are limited. Most studies seem to focus on blockchain-based applications in industries such as finance, healthcare, and supply chain management, and there is little evidence on the impact of blockchain technology on education. This paper discusses present blockchain-based applications in education in Europe and beyond and suggests some potential ones. The research provides groundwork for education and academia stakeholders, policymakers and researchers to exploit the potential of blockchain in different functions of an education system.
Over the last two decades, rapid advances in digitization methods have put us on the cusp of the fourth industrial era. It is an era of connectivity and interactivity between various industrial processes that need a new, trusted environment for exchanging and sharing information and data without relying on third parties. Blockchain technologies can provide such a trusted environment. This paper focuses on utilizing the blockchain and its characteristics to build machine-to-machine (M2M) communication and digital twin solutions. We propose a conceptual design for a system that uses smart contracts to construct digital twins for machines and products and executes manufacturing processes inside the blockchain. Our solution also employs the Decentralized Identifiers (DIDs) standard to provide self-sovereign digital identities for machines and products. To validate the approach and demonstrate its applicability, the paper presents an actual implementation of the proposed design in a simulated case study carried out with the help of a Fischertechnik factory model.
This paper looks at current projects in the field of blockchain in education, their specific areas of application, and their possible advantages and weaknesses. Three examples developed by the team of authors are introduced in detail. First, Gallery-Defender, a serious game adapted to serve as a stand-alone demonstrator showing the possibility of carrying out exams directly from within the game and storing the grades and metadata on a blockchain. Second, Art-Quiz, an e-learning tool that can be integrated into existing LMS systems and maps exam results and further data using blockchain technologies. Both were developed following an iterative design process. And third, the results of a focus group that simulated the assignment of grades after an oral online exam. The three examples presented here are based on the blockchain system Ardor/Childchain Ignis, but each demonstrator has a different set of features and approaches.
In addition, the integration of various Blockchain solutions was conceptually designed to make a Multi-Chain model possible.
Procurement processes are deemed to lack supporting digital technologies that raise efficiency and automation.
Blockchain solutions are piloted in procurement in order to offer a decentralized IT infrastructure covering these needs. This paper aims at identifying current blockchain approaches in the field of procurement and presenting affected business processes. In order to get an overview of the current state of the art, a systematic literature mapping is conducted.
Moreover, the outcomes are gathered and categorized in a classification scheme. Based on the analysis, systematic maps are presented to showcase relevant findings. Within the findings, several blockchain use cases in the field of procurement are identified, and information about addressed challenges, utilized blockchain frameworks and affected business processes is extracted.
The cryptocurrency ecosystem has seen significant growth with Ethereum and Bitcoin as foundational pillars. Ethereum introduced smart contracts revolutionizing decentralized applications (dApps) across various domains. Scalability challenges led to alternative ecosystems like Binance Smart Chain and Polygon, maintaining compatibility through the Ethereum Virtual Machine (EVM). Bitcoin also faces scalability issues, leading to the Lightning Network's development—an off-chain solution with payment channels for scalable instant transactions. Interoperability is increasingly crucial as the cryptocurrency ecosystem continues to grow, enabling seamless interactions between assets and data across multiple blockchain platforms. EVM-compatible blockchains and the Lightning Network offer unique advantages in their respective use cases. This paper utilizes atomic swaps to create a secure, fast, and user-friendly trustless bridge between the Lightning Network and EVM-compatible blockchains, fostering the growth of both ecosystems and unlocking novel opportunities.
As the cryptocurrency ecosystem rapidly grows, interoperability has become increasingly crucial, enabling assets and data to interact seamlessly across multiple chains. This work describes the concept and implementation of a trustless connection between the Bitcoin Lightning Network and EVM-compatible blockchains, allowing the transfer of assets between the two ecosystems. Establishing such a connection can contribute significantly to the growth of both ecosystems, as they can benefit from each other's advantages and open up new possibilities.
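The trustless connection in both works above rests on atomic swaps via hash locks and timeouts; the following toy sketch illustrates only that core idea (preimage reveal before a timeout) and is not the implemented bridge or any particular chain's contract API.

```python
# Toy hash-time-locked contract (HTLC) logic: funds locked against a hash can be
# claimed only with the matching preimage before the timeout; revealing the
# preimage on one chain lets the counterparty claim the lock on the other chain.
import hashlib, os, time

secret = os.urandom(32)                               # preimage chosen by the initiator
htlc = {
    "hash": hashlib.sha256(secret).hexdigest(),       # hash lock shared by both chains
    "timeout": time.time() + 3600,                    # refund possible after timeout
    "amount": 0.01,                                   # hypothetical locked amount
}

def claim(contract, preimage):
    if time.time() >= contract["timeout"]:
        return "expired: sender can reclaim the funds"
    if hashlib.sha256(preimage).hexdigest() == contract["hash"]:
        return "claimed: preimage is now public, counterparty can claim on the other chain"
    return "rejected: wrong preimage"

print(claim(htlc, secret))
```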
The following thesis contains a detailed business plan for a Formula Student combustion race car. This includes the evaluation of existing knowledge about the car combined with the required information about the market and seed capital. Subsequently, the presented plan is discussed and interpreted with regard to future business plans. In this connection, the acceptance of electromobility is evaluated and first ideas for the presentation of an electric car are developed.
Reputation is indispensable for online business since it supports customers in their buying decisions and allows sellers to justify premium prices. While IS research has investigated reputation systems mainly as review systems on online platforms for business-to-consumer (B2C) transactions, no proper solutions have been developed for business-to-business (B2B) transactions yet. We use blockchain technology to propose a new class of reputation systems that apply ratings as voluntary bonus payments: before a transaction is performed, customers commit to paying a bonus that is granted if a service provider has performed the service properly. As opposed to rival reputation systems that build on cumulated ratings or reviews, our system enables monetized reputation mechanisms that are inextricably linked with online transactions. We expect this system class to provide more trustworthy ratings, which might reduce agency costs and help quality providers establish a reputation with new customers.
In response to prevailing environmental conditions, Arabidopsis thaliana plants must increase their photosynthetic capacity to acclimate to potentially harmful high-light stress. To measure these changes in acclimation capacity, different high-throughput imaging-based methods can be used. In this master thesis we studied different Arabidopsis thaliana knockout mutants and accessions with respect to their capacity to acclimate to potentially harmful high-light and cold-temperature conditions, using a high-throughput phenotyping system with an integrated chlorophyll fluorescence measurement system. To determine the acclimation capacity, Arabidopsis thaliana knockout mutants of genes not previously associated with high light, as well as accessions of two different haplotype groups carrying a reference or alternative allele and originating from different countries, were grown under alternating high-light and temperature conditions. Photosynthetic analysis showed that the knockout mutant plants differed from the wildtype in their Photosystem II operating efficiency during a switch to increased light irradiance but did not differ significantly a week later under the same circumstances. High-throughput phenotyping of the haplotype accessions revealed a significantly better acclimation capacity in non-photochemical quenching and steady-state photosynthetic efficiency in Russian accessions carrying an altered SPPA gene during high-light and cold stress.
We investigate the folding and thermodynamic stability of a tertiary contact of baker's yeast ribosomal ribonucleic acid (rRNA), which is supposed to be essential for the maturation process of ribosomes in eukaryotes at lower temperatures [1]. Ribosomes are cellular machines essential for all living organisms. RNA is at the center of these machines and responsible for the translation of genetic information into proteins [2,3]. Only recently, the rRNA tertiary contact of interest was discovered in Zurich by the research group of Vikram Govind Panse. Gerhardy et al. [1] showed in vitro that, within the 60S pre-ribosome and under defined metal ion concentrations, the tertiary contact becomes visible between a GAAA tetraloop and a kissing-loop motif. Our aim is now to understand this RNA structure, especially the formation of the rRNA tertiary contact, in terms of thermodynamics and kinetics under various experimental conditions, such as temperature and the metal ion concentration of K(I), Na(I) and Mg(II). To this end, we use optical spectroscopy such as UV/VIS spectroscopy and ensemble Förster (fluorescence) resonance energy transfer (FRET) folding studies. Our findings will help to further characterize this newly discovered ribosomal RNA contact and to elucidate its function within the ribosomal maturation process.
The larval zebrafish mutant Knörf carries a not yet identified gene that is lethal after 14 dpf in the homozygous state. The mutation causes various degenerations and the loss of the regeneration ability. One of these degenerations was first discovered in the retina by histological sectioning. The mutants' retinas show gaps in the IPL at 7 and 8 dpf, whose number increases during the maturation of the larva. In recent studies a Pax6 staining was performed, which showed that amacrine cells are affected. Different types of amacrine cells were tested and it was shown that the parvalbuminergic amacrine cells disappear. The staining was performed as a time course. At 5 dpf there is no difference between the number of parvalbuminergic amacrine cells in siblings and mutants, but then the degeneration starts. At 2 dpa there is the first significant difference, which increases at later stages and leads to a nearly complete disappearance of these cells in the eye. Parvalbumin is not only present in the retina; therefore the brain, as another central nervous system structure, was examined. In the telencephalon these cells disappear already at 2 dpa. The parvalbuminergic cells are also present in the skeletal muscle of the tail. Here the degeneration starts approximately at the middle of the tail and intensifies towards distal areas. It was shown that parvalbuminergic cells in the muscle disappear by 4 dpa. The role of parvalbumin is seen in the binding of calcium: in the central nervous system it supports the re-establishment of the resting potential after an excitation, and in muscles it assists in slowing relaxation after a contraction.
In today's market, the process of dealing with textual data for internal and external processes has become increasingly important and more complex for certain companies. In this context, the thesis aims to support the analysis of similarities among textual documents by analyzing the relationships among them. The proposed analysis process includes discovering similarities among these financial documents as well as possible patterns. The proposal is based on the exploitation and extension of already existing approaches as well as on their combination with well-known clustering analysis techniques. Moreover, a software tool has been implemented for the evaluation of the proposed approach and experimented on EDGAR filings on the basis of qualitative criteria.
It is possible to obtain a common updating rule for the k-means and Neural Gas algorithms by using a generalized Expectation Maximization method. This result is used to derive two variants of these methods. The use of a similarity measure, specifically the Gaussian function, provides another clustering alternative to the aforementioned methods. The main benefit of using the Gaussian function is that it inherently looks for a common cluster center for similar data points (depending on the value of the parameter s). In different experiments we report similar behaviour of the batch and the proposed variants. We also show some useful results for the “alternative” similarity method, specifically when there is no clue about the number of clusters in the data sets.
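As a rough illustration of the similarity-based alternative described above, the following NumPy sketch performs a batch prototype update with a Gaussian similarity kernel; the bandwidth parameter stands in for the parameter s mentioned in the abstract, and the code is an assumption-laden simplification rather than the authors' derivation.

```python
import numpy as np

def gaussian_similarity_clustering(X, n_prototypes=3, sigma=1.0, n_iter=50, seed=0):
    """Batch prototype update with a Gaussian similarity kernel:
    each prototype moves to the similarity-weighted mean of all points.
    Illustrative sketch, not the algorithm derived in the paper."""
    rng = np.random.default_rng(seed)
    W = X[rng.choice(len(X), n_prototypes, replace=False)]  # initial prototypes
    for _ in range(n_iter):
        # squared distances between every point and every prototype
        d2 = ((X[:, None, :] - W[None, :, :]) ** 2).sum(-1)
        # Gaussian similarities; sigma controls how "wide" each cluster is
        s = np.exp(-d2 / (2.0 * sigma ** 2))
        # responsibility-style normalization over prototypes (EM-like E-step)
        r = s / s.sum(axis=1, keepdims=True)
        # M-step: prototypes become weighted means of the data
        W = (r.T @ X) / r.sum(axis=0)[:, None]
    return W

X = np.vstack([np.random.randn(100, 2) + c for c in ([0, 0], [5, 5], [0, 5])])
print(gaussian_similarity_clustering(X))
```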
In this paper, we conduct experiments to optimize the learning rates for the Generalized Learning Vector Quantization (GLVQ) model. Our approach leverages insights from cognitive science rooted in the profound intricacies of human thinking. Recognizing that human-like thinking has propelled humankind to its current state, we explore the applicability of cognitive science principles in enhancing machine learning. Prior research has demonstrated promising results when applying learning rate methods inspired by cognitive science to Learning Vector Quantization (LVQ) models. In this study, we extend this approach to GLVQ models. Specifically, we examine five distinct cognitive science-inspired GLVQ variants: Conditional Probability (CP), Dual Factor Heuristic (DFH), Middle Symmetry (MS), Loose Symmetry (LS), and Loose Symmetry with Rarity (LSR). Our experiments involve a comprehensive analysis of the performance of these cognitive science-derived learning rate techniques across various datasets, aiming to identify optimal settings and variants of cognitive science GLVQ model training. Through this research, we seek to unlock new avenues for enhancing the learning process in machine learning models by drawing inspiration from the rich complexities of human cognition.
Keywords: machine learning, GLVQ, cognitive science, cognitive bias, learning rate optimization, optimizers, human-like learning, Conditional Probability (CP), Dual Factor Heuristic (DFH), Middle Symmetry (MS), Loose Symmetry (LS), Loose Symmetry with Rarity (LSR).
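For readers unfamiliar with GLVQ, the cost function whose gradient steps the learning-rate schemes above modulate is usually written as follows (standard notation from the GLVQ literature, not quoted from the paper):

```latex
E_{\mathrm{GLVQ}} \;=\; \sum_{i=1}^{N} f\bigl(\mu(\mathbf{x}_i)\bigr),
\qquad
\mu(\mathbf{x}) \;=\; \frac{d^{+}(\mathbf{x}) - d^{-}(\mathbf{x})}{d^{+}(\mathbf{x}) + d^{-}(\mathbf{x})} \;\in\; [-1, 1]
```

Here d^+ and d^- denote the distances to the closest prototype of the correct class and of any other class, respectively, f is a monotonically increasing squashing function, and the learning rate scales the stochastic gradient updates of the two winning prototypes.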
We present dimensionality reduction methods such as autoencoders and t-SNE for the visualization of high-dimensional data in a two-dimensional map. In this thesis, we first implement basic and deep autoencoders using the breast cancer and mushroom datasets. Next, we apply another dimensionality reduction method, t-SNE, to the same datasets. The visualization results obtained with these dimensionality reduction methods are documented in the experiments section of the thesis. An evaluation of classification and clustering for the dimensionality reduction techniques is also performed. The visualization and evaluation results of t-SNE are significantly better than those of the other dimensionality reduction techniques.
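A minimal scikit-learn sketch of the t-SNE step on one of the datasets named above might look as follows; the library, parameters and scaling step are assumptions for illustration, not necessarily those used in the thesis.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.manifold import TSNE
from sklearn.preprocessing import StandardScaler

# Load the breast cancer dataset (one of the datasets named in the thesis)
X, y = load_breast_cancer(return_X_y=True)
X = StandardScaler().fit_transform(X)

# Project the 30-dimensional feature space onto a 2-D map with t-SNE
embedding = TSNE(n_components=2, perplexity=30, random_state=42).fit_transform(X)
print(embedding.shape)  # (569, 2) -> ready for a scatter plot coloured by y
```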
Convolutional neural networks (CNNs) have been one of the most powerful and popular preprocessing techniques employed for image classification problems. Here, we use other signal processing techniques, namely the Fourier transform and the wavelet transform, to preprocess the images in conjunction with different classifiers such as MLP, LVQ, GLVQ and GMLVQ, and compare their performance with a CNN.
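The following hypothetical Python sketch illustrates the idea of Fourier-transform preprocessing followed by a conventional classifier; it uses the small scikit-learn digits dataset and an MLP purely for illustration, not the datasets or LVQ-family classifiers evaluated in the thesis.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Fourier-transform preprocessing: use the magnitude spectrum of each image
# as the feature vector fed to a conventional classifier (here an MLP).
digits = load_digits()
images, labels = digits.images, digits.target  # 8x8 grayscale images

fft_features = np.abs(np.fft.fft2(images)).reshape(len(images), -1)

X_tr, X_te, y_tr, y_te = train_test_split(fft_features, labels, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0).fit(X_tr, y_tr)
print("accuracy on FFT features:", clf.score(X_te, y_te))
```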
Adversarial robustness of a nearest prototype classifier assures safe deployment in sensitive application fields. Much research has been conducted on the robustness of artificial neural networks against adversarial attacks, whereas nearest prototype classifiers have not seen similar success. This thesis presents the learning dynamics and numerical stability of the Crammer normalization and the Hein normalization for adversarial robustness of nearest prototype classifiers. The results of the conducted experiments are documented and analyzed to ascertain the bounds given by Saralajew et al. and Hein et al. for the adversarial robustness of nearest prototype classifiers.
Differentiation is ubiquitous in mathematics and especially in machine learning, where it is needed for the calculations in gradient-based models. Calculating gradients can be complex and may require handling multiple variables. Supervised Learning Vector Quantization models, which are used for classification tasks, also use the Stochastic Gradient Descent method for optimizing their cost functions. There are various methods to calculate these gradients or derivatives, namely Manual Differentiation, Numeric Differentiation, Symbolic Differentiation, and Automatic Differentiation. In this thesis, we evaluate each of these methods for calculating derivatives and also compare their use for the variants of Generalized Learning Vector Quantization algorithms.
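To make the distinction concrete, the toy Python example below contrasts forward-mode automatic differentiation (via dual numbers) with numeric differentiation on a simple polynomial; it is an illustrative sketch, not code from the thesis.

```python
class Dual:
    """Forward-mode automatic differentiation via dual numbers:
    carry the value and the derivative through every operation."""
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__


def f(x):
    return 3 * x * x + 2 * x + 1   # f'(x) = 6x + 2


# Automatic differentiation: exact to machine precision
print(f(Dual(2.0, 1.0)).deriv)           # 14.0

# Numeric differentiation: finite differences introduce truncation error
h = 1e-6
print((f(2.0 + h) - f(2.0)) / h)         # approximately 14.000003
```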
The computer-based calculation of sound insulation between dwellings or the analysis of sound transmission in a building is common practice for a building acoustics engineer. With the release of the revised DIN 4109 in July 2016, a whole new calculation model was introduced to the German users of this standard. The calculation model and its input data now need to be included in existing calculation software, such as the software SONarchitect ISO of the Spanish developers Sound of Numbers. For this reason, this thesis compares the input parameters and the airborne and impact sound transmission of DIN 4109:2016-07 and the European standard EN 12354:2000. With the help of this comparison it is now possible to specify all necessary parameters and calculation procedures for the calculation of airborne and impact sound insulation between dwellings.
In the past few years, generative models have become an interesting topic in the field of Machine Learning (ML). The Variational Autoencoder (VAE) is one of the popular frameworks for generative models, based on the work of D. P. Kingma and M. Welling [6] [7]. As an alternative to the VAE, the authors in [12] proposed and implemented an Information Theoretic Learning (ITL) based autoencoder. The VAE and the ITL autoencoder are a combination of neural networks and probabilistic graphical models (PGM) [7]. In modern statistics it is difficult to compute the approximation of probability densities. In this work we make use of the Variational Inference (VI) technique from machine learning, which approximates the distributions through optimization. The closeness between the distributions is measured by information theoretic divergence measures such as the Kullback-Leibler, Euclidean and Cauchy-Schwarz divergences. In this thesis, we study theoretical and experimental results of two different frameworks of generative models which generate images of MNIST handwritten characters [8] and the Yale face database B [3]. The results obtained show that the VAE and the ITL autoencoder are capable of generating the underlying structure of the example datasets.
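For reference, variational inference in the VAE maximizes the evidence lower bound (ELBO), in which the Kullback-Leibler term appears explicitly; the ITL autoencoder replaces this divergence with the Euclidean or Cauchy-Schwarz divergence. The standard formulation reads:

```latex
\log p_{\theta}(\mathbf{x}) \;\ge\;
\mathcal{L}(\theta,\phi;\mathbf{x})
\;=\;
\mathbb{E}_{q_{\phi}(\mathbf{z}\mid\mathbf{x})}\!\bigl[\log p_{\theta}(\mathbf{x}\mid\mathbf{z})\bigr]
\;-\;
D_{\mathrm{KL}}\!\bigl(q_{\phi}(\mathbf{z}\mid\mathbf{x}) \,\|\, p(\mathbf{z})\bigr)
```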
This bachelor thesis examines two main topics: Corporate Social Responsibility (CSR) and Corporate Philanthropy as an integral part of it. It was written in order to demonstrate the high importance of business philanthropy in today's global market and to encourage companies to strengthen their CSR policy so as to contribute to the resolution of social problems. This paper reviews the theoretical framework of CSR, its evolution, types and theories relating to Corporate Philanthropy. It also presents a comparative analysis of successful practices of corporate philanthropy in the pharmaceutical and other global industries, predominantly in Europe and the USA. This work underlines the competitive advantages and important socio-economic impact of CP and suggests recommendations for companies on developing their CSR activities. The paper is based on internet research using articles, presentations, reports and studies, websites and official legal documents.
Studying and understanding the metabolism of plants is essential to better adapt them to future climate conditions. Computational models of plant metabolism can guide this process by providing a platform for fast and resource-saving in silico analyses. The reconstruction of these models can follow kinetic or stoichiometric approaches, with Flux Balance Analysis being one of the most common approaches for stoichiometric models. Advances in metabolic modelling over the years include the increasing number of compartments, the automation of the reconstruction process, the modelling of plant-environment interactions and genetic variants, and temporally and spatially resolved models. In addition, there is a growing focus on introducing synthetic pathways into plants to increase their agricultural potential regarding yield, growth and nutritional value. One example is the β-hydroxyaspartate cycle (BHAC) to bypass photorespiration. After its implementation in a stoichiometric C3 plant model, in silico flux analyses can help to understand the resulting metabolic changes. When compared with in vivo experiments on BHAC plants, the metabolic model can reproduce most results, with exceptions regarding growth and oxaloacetate. To evaluate whether the BHAC is suitable for establishing a synthetic C4 cycle, the pathway is implemented in a two-cell-type model that is capable of running a C4 cycle. The results show that the BHAC is only beneficial under light limitation in the bundle sheath cell. An additional engineering target for improved plant performance is malate synthase. This work serves as the basis for further analyses combining the different factors boosting the advantages of the BHAC and for in vivo experiments in C3 and C4 plants.
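As background, Flux Balance Analysis determines the flux distribution of such a stoichiometric model by solving a linear program of the standard form

```latex
\max_{\mathbf{v}} \; \mathbf{c}^{\mathsf{T}} \mathbf{v}
\quad \text{subject to} \quad
\mathbf{S}\,\mathbf{v} = \mathbf{0},
\qquad
\mathbf{v}_{\min} \le \mathbf{v} \le \mathbf{v}_{\max}
```

where S is the stoichiometric matrix, v the flux vector, c the objective weights (typically biomass production), and the bounds encode reaction reversibility and capacity.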
Many people take part in more than one competition, and these competitions are of different kinds, ranging from local events with a small number of participants to international tournaments watched by many viewers. Naturally, a system becomes necessary to assess and compare success across various competitions.
The existing ranking systems are usually specialized to fit their application area. More general ranking methods also exist and can be applied to a wide spectrum of competition fields. However, these ranking methods are still not universal and do not cover some important features of the competitions.
A totally new ranking system has been developed within the present master thesis. Its primary purpose is to evaluate and measure prestige gained by participants in competitions. The main contribution of the thesis consists of an original mathematical model that makes the ranking system unique.
The developed ranking system claims to be universal and interdisciplinary. It is based on the fundamental element that distinguishes competitive from non-competitive areas, namely standings that rank the participants according to their performance. The universality and interdisciplinarity of the ranking system make cross-disciplinary comparisons possible, which are usually very subjective and difficult to implement.
The contribution of the master thesis extends beyond the theoretical area. A ranking software that fully implements this novel ranking system has been designed and developed. The software makes the practical benefits of the ranking system immediately available to potential application areas such as sports clubs and universities.
Finally, the developed ranking system offers a new viewpoint on competitions: as a way of gaining prestige, rather than the traditional viewpoint of demonstrating mastery.
In machine learning, Learning Vector Quantization (LVQ) is well known as supervised vector quantization. LVQ has been studied to generate optimal reference vectors because of its simple and fast learning algorithm [2]. In many classification tasks, different variants are considered while training a model, and a consideration of large-margin variants of LVQ helps to obtain significant results [20]. Large margin LVQ (LMLVQ) aims to maximize the distance between the decision hyperplane and the data points. In this thesis, a comparison of different variants of Generalized Learning Vector Quantization (GLVQ) and large-margin LVQ is proposed, along with visualization, implementation and experimental results.
Dynamic object roles and corresponding contexts can model complex applications at a higher level of abstraction. Such abstracted applications can be used in wide-ranging areas such as financial institutions, health care, and supply chain networks. Role management, which consists of the creation of role objects and the binding of role objects to core objects, still lacks non-intrusive logging and monitoring, auditing, and a resilient data source for role-based applications. Moreover, immutable smart contracts cause problems concerning bug fixing and maintenance without dynamic binding to new smart contract objects. An object that is created from a smart contract (contract class) can be transparently attached to a role object utilizing the Role Object Pattern (ROP). However, the ROP itself does not contain a context definition and context-specific role assignment grouping the definitions of smart contract relationships into abstracted data types. In this study, we would like to implement an extended version of the role object pattern called the Context-based Role Object Pattern (ContextROP) with the on-chain smart contract language Solidity to solve these fundamental problems. To evaluate the proposal, we will implement a use case with the design pattern, proceeding with qualitative and quantitative analysis.
In this work, we identify similarities between adversarial examples and counterfactual explanations, extend differences already stated in previous works to other aspects such as dimensionality and transferability, and try to observe these similarities and differences in different classifiers with tabular and image data. We note that this topic is an open discussion; the work here is not definitive and can be further extended or modified in the future if new discoveries are made.
In an era of global climate change and fast-growing cities, local governments are in urgent need of adopting sustainable urban growth concepts to achieve a liveable and prosperous urban future. Against this background, the smart city notion has progressively gained popularity as an urban development concept that relies heavily on technology and the use of urban data for fostering sustainable urban growth. However, so far the understanding of the smart city term is ambiguous, and little scientific research has been done on developing comprehensive conceptual frameworks to support local governments in the making of smarter cities. This paper aims at presenting the current state of the art of smart city research in order to support the making of smart city best practices and to promote a comprehensive understanding of the smart city notion. In doing so, the role of technology in the making of smarter cities and critical success factors in transforming cities are elaborated, following the methodological approach of a multidimensional conceptual framework. The research findings and an expert interview with a representative of the state capital then serve for the assessment of weak points and best practices in the smart city pursuit of the German city of Munich, providing urban policymaking with valuable insights and fostering the development of a comprehensive smart city conceptualization.
Crowd-Powered Medical Diagnosis: The Potential of Crowdsourcing for Patients with Rare Diseases
(2023)
With the recent rise in medical crowdsourcing platforms, patients with chronic illnesses increasingly broadcast their medical records to obtain an explanation for their complex health conditions. By providing access to a vast pool of diverse medical knowledge, crowdsourcing platforms have the potential to change the way patients receive a medical diagnosis. We developed a conceptual model that details a set of variables. To further the understanding of crowdsourcing as an emerging phenomenon in health care, we provide a contextualization of the various factors that drive participants to exert effort. For this purpose, we used CrowdMed.com as a platform from which we gathered and examined a unique dataset that involves tasks of diagnosing rare medical conditions. By promoting crowdsourcing as a robust and non-discriminatory alternative to seeking help from traditional physicians, we contribute to the acceptance and adoption of crowdsourcing services in health economics.
Over recent years, Maximal Extractable Value (MEV) has gained significant importance within the decentralized finance (DeFi) ecosystem. Remarkably, within just two years of its emergence, approximately 600 million USD of MEV has been extracted, a phenomenon that has sparked concerns regarding potential threats to blockchain stability.
With growing interest in the Ethereum network and the growing DeFi sector, research surrounding MEV has substantially increased. This work aims to offer a comprehensive understanding of MEV. Additionally, this research quantifies the largest types of MEV (Arbitrage, Sandwich and Liquidations) from March 2022 to March 2023. The data are then compared to other sources, revealing a general upward trend, with a particularly noticeable increase in Sandwich Attacks.
This work examines the impact Web 2.0 has on CRM in journalism. For this purpose, the communication strategies of one international, one New Zealand and one exclusively online men's magazine are compared. Through this comparison, changes in the magazines' approach to CRM are identified, and expert interviews with editors give further insight into the dynamics of the evolution that CRM and the journalism industry are going through. Finally, the conclusion illuminates the effects this evolution has on CRM in journalism.
This master thesis covers the formation of customer relationships in the IT outsourcing market using the example of the “ABC” company. Most works related to IT outsourcing cover the problems of implementing IT services and the process of providing them to customers, and almost all issues are covered from the perspective of consumers. Thus, the problems and results of outsourcing providers of IT services remain almost uncovered. This master thesis aims to reveal the specific features of the IT outsourcing business in Belarus and to develop an approach to the formation and construction of a system of relationships between the company and its clients as a source of increased competitiveness.
This work emphasises the synergy between anthropological research on human skeletal remains and suitable documentation strategies. Highlighting the significance of data recording and the use of digital databases in various aspects of anthropological work on bones, including scientific standards, skeletal collections, analysis of research results, ethical considerations, and curation, it provides a comprehensive examination of these topics to demonstrate the value of investing time and resources in this field, countering the existing lack of funding that has led to significant deficiencies. Additionally, the paper outlines the requirements and challenges associated with standard data protocoling and suggests that digital data management frameworks and technologies such as ontologies and semantic web technologies for anthropological information should be a central focus in developing solutions.
Decentralizing Smart Energy Markets - tamper-proof documentation of flexibility market processes
(2020)
The evolving granularity and structural decentralization of the energy system lead to a need for new tools for the efficient operation of electricity grids. Local Flexibility Markets (or "Smart Markets") provide platform concepts for market-based congestion management. In this context there is a distinct need for a secure, reliable and tamper-resistant market design, which requires transparent and independent monitoring of platform operation. In the following paper, different concepts for blockchain-based documentation of relevant processes on the proposed market platform are described. On this basis, potential technical realizations are discussed. Finally, the implementation of one setup using Merkle tree operations is presented, using open source libraries.
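As an illustration of the Merkle tree operations mentioned above, the hypothetical Python sketch below condenses a batch of serialized market-process records into a single root hash that could be anchored on a blockchain; the record contents and structure are invented for the example and are not taken from the paper.

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(records: list[bytes]) -> bytes:
    """Build a Merkle root over a list of serialized market-process records.
    Illustrative sketch of Merkle tree documentation, not the paper's setup."""
    if not records:
        raise ValueError("no records to document")
    level = [sha256(r) for r in records]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate the last hash on odd levels
            level.append(level[-1])
        level = [sha256(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

# Only this single root hash needs to be anchored on the blockchain; any
# individual record can later be proven against it with a short Merkle proof.
records = [b"flex-offer:GridOp-1:2020-05-01T10:00:50kW",
           b"flex-bid:DSO-A:2020-05-01T10:05:30kW",
           b"match:offer-1/bid-7:2020-05-01T10:07"]
print(merkle_root(records).hex())
```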
In the swiftly changing world of academic publishing, the Sea of Wisdom platform seizes the opportunity to innovate. By combining the technologies of blockchain, decentralized finance (DeFi), and Non-Fungible Tokens (NFTs) with traditional scholarly communication, we present a groundbreaking, decentralized solution. Our design, although adaptable, primarily uses Ethereum's Virtual Machine, tapping into its robust scientific community.
A relatively new research field in the neurosciences, called connectomics, aims to achieve a full understanding and mapping of the neural circuits and fine neuronal structures of the nervous system in a variety of organisms. This detailed information will provide insight into how our brain is influenced by different genetic and psychiatric diseases, how memory traces are stored, and how ageing influences our brain structure. It is beyond question that new methods for data acquisition will produce large amounts of neuronal image data. This data will exceed the zettabyte range and is impossible to annotate manually for visualization and analysis. Nowadays, machine learning algorithms and especially deep convolutional neural networks are heavily used in medical imaging and computer vision, which brings the opportunity of designing fully automated pipelines for image analysis. This work presents a new automated workflow based on three major parts, including image processing using consecutive deep convolutional networks, a pixel-grouping step called connected components, and 3D visualization via neuroglancer, to achieve a dense three-dimensional reconstruction of neurons from EM image data.
In this master thesis, we define a new bivariate polynomial which we call the defensive alliance polynomial and denote by da(G; x; y). It is a generalization of the alliance polynomial and the strong alliance polynomial. We show the relation between da(G; x; y) and the alliance, strong alliance and induced connected subgraph polynomials, as well as the cut vertex sets polynomial. We investigate which information about G is encoded in da(G; x; y). We discuss the defensive alliance polynomial for path graphs, cycle graphs, star graphs, double star graphs, complete graphs, complete bipartite graphs, regular graphs, wheel graphs, open wheel graphs, friendship graphs, triangular book graphs and quadrilateral book graphs. Also, we prove that the above classes of graphs are characterized by their defensive alliance polynomials. We present the defensive alliance polynomial of the graph formed by attaching a vertex to a complete graph. We show two pairs of graphs which are not characterized by the alliance polynomial but are characterized by the defensive alliance polynomial.
In addition, we present three notes on results in the literature: the first improves a bound, and the other two are counterexamples.
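For readers unfamiliar with the underlying notion, the brute-force Python sketch below enumerates defensive alliances under the standard definition (every vertex of the set has, counting itself, at least as many neighbours inside the set as outside); the thesis' bivariate polynomial da(G; x; y) is its own construction and is not reproduced here.

```python
from itertools import combinations

def is_defensive_alliance(adj, S):
    """S is a defensive alliance if every v in S has at least as many
    neighbours inside S (counting v itself) as outside S."""
    S = set(S)
    return all(len(adj[v] & S) + 1 >= len(adj[v] - S) for v in S)

def defensive_alliances_by_size(adj):
    """Brute-force enumeration of all non-empty defensive alliances,
    grouped by cardinality (exponential; small graphs only)."""
    nodes = list(adj)
    counts = {}
    for k in range(1, len(nodes) + 1):
        counts[k] = sum(is_defensive_alliance(adj, S)
                        for S in combinations(nodes, k))
    return counts

# Path graph P4 with edges 1-2, 2-3, 3-4
P4 = {1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
print(defensive_alliances_by_size(P4))
```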
In this thesis two novel methods for removing undesired background illumination are developed: a wavelet-analysis-based approach and an enhancement of a deep learning method. These methods have been compared with conventional methods, using real confocal microscopy images and synthetically generated microscopy images. The synthetic images were created using a generator introduced in this thesis.
In recent years the term Cloud has become popular in the world of technology. It is used to describe many different information technology offerings, but people are adopting this word without truly understanding it. "Demystifying the Cloud – Drawing the Lines between Technologies and Concepts" by Kevin Arnot takes a look at the many levels of the Cloud and gives a comprehensive overview of the technologies and ideas that make it a paradigm shift. The author analyzes the term methodically by leveraging appropriate information from the Internet as well as from experts. An important milestone in understanding the Cloud accurately is differentiating between its components. These include the underlying technologies, the three Cloud service models (SaaS, IaaS and PaaS), and how it is deployed, publicly or privately. The result is an understanding that a Cloud can be composed in different ways and therefore serves exactly the needs of its users. Furthermore, the author describes challenges that individuals and businesses have to deal with equally and reviews possible solutions. Cloud technology will continue to evolve; however, the future business value of the term "Cloud" will depend on how companies continue using or misusing it.
In the field of satellites it is common practice to combine multiple ground stations into one network in order to increase communication times with satellites. This work focuses on TIM, an international academic collaborative project. Important criteria for this project are elaborated and used to evaluate existing ground station networks. The work concludes that there is no appropriate solution available for this specific use case and establishes a proposed solution. The proposed ground station network software is elaborated and evaluated.
The H.323 umbrella standard describes audiovisual communication over packet-switched networks. This thesis illustrates the standard in detail with regard to architecture and implementation. The second part of this dissertation is dedicated to examining the Gmail Voice and Video plug-in, an Internet-based audiovisual communication platform. In the course of this thesis, a secured kiosk environment for the Gmail Voice and Video plug-in is developed.
Blockchain technology has become an innovative, mature tool for digital transformation, disrupting more and more application areas in their business processes, values, or even economic models. This paper leverages more than 30 academic publications on prototypes and their blockchain-based use cases for transacting certificates in the context of public education. The conceptual design and guiding ideas are reflected in the practical application development for the Federal Ministry of Education and Research ECHT! project within the showcase region WIR! in Mittweida and are used for the research design. During this approach we applied agile methods and the current certificate process to propose a comprehensive disclosure of a new software prototype, including a three-layered architecture with multi-stakeholder components. The artefact instantiation contributes to the practical knowledge base within Information Systems Research and specifically to digital certificate processes, from creation, searching and proofing up to revoking, by consideration of an existing IT landscape as well as the organizational hierarchy.
The design of a competency-based interview model arises from the need to have highly qualified people who contribute to the achievement of organizational objectives. It intends to shape the human resources department into a strategic area of the company. To achieve this, organizational competencies are defined, and guidelines for the elaboration of a portfolio of questions, as well as the design of a competency dictionary, are established. These serve as tools for the human resources processes of the company Visbal Moreno y Sucesores Ltd. Through this work, the importance of the human factor is highlighted as part of the organizational strategy.
Traditional user management on the Internet has historically required individuals to give up control over their identities. In contrast, decentralized solutions promise to empower users and foster decentralized interactions. Over the last few years, the development of decentralized accounts and tokens has significantly increased, aiming at broader user adoption and shared social economies.
This thesis delves into smart contract standards and social infrastructure for Ethereum-based blockchains to enable identity-based data exchange between abstracted blockchain accounts. In this regard, the standardization landscapes of account and social token developments were analyzed in-depth to form guidelines that allow users to retain complete control over their data and grant access selectively.
Based on the evaluations, a pioneering Solidity standard is presented, natively integrating consensual restrictive on-chain assets for abstracted blockchain accounts. Further, the architecture of a decentralized messaging service has been defined to outline how new token and account concepts can be intertwined with efficient and minimal data-sharing principles to ensure security and privacy, while merging traditional server environments with global ledgers.
The objective of this bachelor project is the creation of a tool that supports forensic investigators during IT forensic interventions. It uses Kismet as the base program and adds functionalities to it via the plugin interface. The installation of the plugin, how it works, and a recommendation on how to use it are explained. To convey the underlying basics, an introduction to WLAN and Bluetooth is given. The tests that were performed with the new plugin are described, as well as their results. It is then briefly discussed why the tool is applicable for locating Wi-Fi devices, especially access points, but not Bluetooth devices. Based on all of this, a few ideas on how to improve the tool and what can be researched in this area are provided.
Object detection and classification is an active field of research in machine learning and computer vision. Depending on the application there are different limitations to adjust to, but also possibilities to take advantage of. In this thesis, we focus on the classification and detection of video sequences at night-time. The proposed method is robust since it does not rely on image thresholding [8], which is commonly used in other methods; instead, the thesis uses histograms of oriented gradients (HOG) [37] as features and a support vector machine (SVM) [74] as the classifier. It is of great importance that the features extracted from the images are robust and distinct enough to help the classifier distinguish between a high beam and a low beam. The classifier is part of the object detection and predicts whether or not a test image matches one group or the other, in our case whether an image belongs to a high-beam or a low-beam sequence.
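A minimal sketch of the HOG-plus-SVM pipeline named above, using scikit-image and scikit-learn on a stand-in dataset, could look like this; the parameters and data are illustrative assumptions, not those of the thesis.

```python
import numpy as np
from skimage.feature import hog
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# HOG features + linear SVM, the feature/classifier pair named in the thesis.
# The digits dataset only stands in for the night-time head-lamp frames.
digits = load_digits()
features = np.array([
    hog(img, orientations=8, pixels_per_cell=(4, 4), cells_per_block=(2, 2))
    for img in digits.images
])

X_tr, X_te, y_tr, y_te = train_test_split(features, digits.target, random_state=0)
clf = SVC(kernel="linear").fit(X_tr, y_tr)
print("HOG + SVM accuracy:", clf.score(X_te, y_te))
```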
The almost complete transcription of the human genome yielded a high number of transcripts that do not encode proteins. However, the functional elucidation of especially long non-coding RNAs is still difficult. Secondary structure analysis is assumed to be a possible method to detect functional relationships of lncRNAs on a large scale, but it is still time consuming and error-prone. GRAPHCLUST, currently the most suitable clustering tool based on RNA secondary structure analysis, mainly lacks an efficient method for the interpretation of its results. Hence, an independent and interactive RNA clustering interpretation tool was developed to allow visualisation and an efficient analysis of RNA clustering results.
In the present bachelor thesis, nanopore sequencing and Illumina sequencing were compared using pollen DNA collected from honeybees and bumble bees. To this end, nanopore sequencing was performed with MinION sequencers and the generated reads were analysed with bash programming. A quantitative and a qualitative (based on ITS2 sequences) BLAST run were performed. The results confirm the error probability of nanopore sequencing that is described in the literature. Nevertheless, with both sequencing methods similar sample preferences of the bees could be observed, allowing ecological conclusions.
Classification label security determines the extent to which predicted labels from classification results can be trusted. The uncertainty surrounding classification labels is resolved by the security with which the classification is made. Therefore, classification label security is very significant for decision-making whenever we are faced with a classification task. This thesis investigates the determination of the classification label security by utilizing the fuzzy probabilistic assignments of Fuzzy c-means. The investigation is accompanied by implementation, experimentation, visualization and documentation of the results.
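A small NumPy sketch of the fuzzy probabilistic assignments in question is given below: the standard Fuzzy c-means membership formula yields, for each sample, a membership degree per cluster, and the maximum membership can be read as a label security value. The code is illustrative and not taken from the thesis.

```python
import numpy as np

def fcm_memberships(X, centers, m=2.0):
    """Fuzzy c-means membership matrix: u[i, j] is the degree to which
    sample i belongs to cluster j; rows sum to 1.  The maximum membership
    of a sample can serve as a security value for its predicted label."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=-1)
    d = np.fmax(d, 1e-12)                      # avoid division by zero
    ratio = (d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0))
    return 1.0 / ratio.sum(axis=2)

X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [2.5, 2.5]])
centers = np.array([[0.0, 0.0], [5.0, 5.0]])
u = fcm_memberships(X, centers)
labels = u.argmax(axis=1)
security = u.max(axis=1)   # close to 1.0 -> confident; near 1/c -> ambiguous
print(labels, security.round(3))
```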