We report on our recent progress in creating a new type of compact laser that uses thulium-based fiber CPA technology to emit at a central wavelength of 2 μm. This laser can produce pulse energies of >100 μJ and an average power of >15 W. It is designed to be long-lasting and built for industrial use, making it a good fit for integration into laser machines used for materials processing. These laser parameters are ideal for working with semiconductors such as silicon, allowing for tasks such as micro-welding, filament cutting, dicing, bonding and more.
As part of the research project Trusted Blockchains for the Open, Smart Energy Grid of the Future (tbiEnergy), one of the objectives is to investigate how a holistic blockchain approach for the realization of a local energy market could be accomplished and how corresponding hardware security mechanisms can be integrated. This paper provides an overview of the implemented prototype and describes the system and its processes.
This study presents an analysis of the coverage by the journals El País (Spain), Folha de S. Paulo (Brazil) and Süddeutsche Zeitung (Germany) of the protests in Brazil against the 2013 Confederations Cup and the 2014 FIFA World Cup, in order to compare them and to see which topics were emphasized by the newspapers and which tone they used in their reporting. Based on the research questions, four categories were developed for the analysis of the journals: article structure; topic of the article; actors/groups of persons; and tone of the reporting, each composed of several subcategories. It was concluded that the themes highlighted by the European newspapers were different from those stressed in the Brazilian daily. Nonetheless, all the reviewed newspapers covered the protests neutrally.
This paper examines the communication channels used by innovation projects at the ProtoSpace Hamburg when engaging with stakeholders, and tries to answer the thesis question of whether new media channels improve the chances of success for innovation projects when used for this communication. Interviews with eight experts in communication, innovation and stakeholder management were conducted and then analyzed using Mayring's qualitative content analysis in order to answer the posed question.
Functions that can be summarized under the keyword Internet Protocol Television (IPTV) describe the transmission of video services to users via the Internet Protocol (IP). Accompanying this new television transmission path, Home Theatre PCs (HTPCs) running a so-called Media Center platform are increasingly entering living rooms as companions to the popular LCD and plasma displays. Ease of use and visual integration on the screen as well as into the living room are raising their acceptance. These HTPCs form a central node for multimedia services such as TV, radio and email within the networked household. Thus, there are good preconditions for using an HTPC as an end device for Telco-operator-driven IPTV and telecommunication services. In the context of this diploma thesis, possibilities for the provisioning of IPTV and Next Generation Network (NGN) services on a converged multimedia home entertainment platform for the living room will be investigated, especially Vista Media Center platforms. For this reason, standardization activities dealing with the integration of IPTV and telecommunication services into NGN will be examined. The validation of the results will be achieved by the design and implementation of a Vista Media Center Add-In, which can be integrated as an IP Multimedia Subsystem (IMS) based User Agent (UA) into ETSI TISPAN Release 2 IPTV infrastructures. Additionally, a cross-domain messaging service for IMS-based UAs is created, which enables cross-network communication between users.
Drought is one of the most common and dangerous threats plants have to face, costing the global agricultural sector billions of dollars every year and leading to the loss of tons of harvest. Until people drastically reduce their consumption of animal products or cellular agriculture comes of age, more and more crops will need to be produced to sustain the ever-growing human population. Even then, as more areas on Earth become prone to drought due to climate change, we may still have to find or breed plant varieties better suited to grow and prosper in these changing environments.
Plants respond to drought stress with a complex interplay of hormones, transcription factors, and many other functional or regulatory proteins, and mapping out this web of agents is no trivial task. Over the last two to three decades, machine learning has become immensely popular and is increasingly used to find patterns in situations that are too complex for the human mind to survey. Even though much of the hype is focused on the latest developments in deep learning, relatively simple methods often yield superior results, especially when data is limited and expensive to gather.
This Master's thesis, conducted at the IPK in Gatersleben, develops an approach for shedding light on the phenotypic and transcriptomic processes that occur when a plant is subjected to stress. It centers around a random forest feature selection algorithm, and although it is used here to illuminate the drought stress response in Arabidopsis thaliana, it can be applied to all kinds of stresses in all kinds of plants.
The voluntary international blog VaultingNews has existed for two years now. In the meantime, the team has grown and costs have increased. This thesis compiles tools that can help improve communication among the team members, who are spread all over the world, and introduces monetization ideas, with a focus on establishing an online fan shop based in Germany. It concludes with a checklist of the laws that have to be observed.
There are multiple ways to gain information about an individual and their health status, but an increasingly popular field in medicine is the analysis of human breath, which carries a lot of information about metabolic processes within the individual's body. The information in exhaled breath is carried by volatile (organic) compounds (VOCs). These VOCs are products of metabolic processes within the individual's body and thus may be indicators of diseases disturbing those processes. The compounds are detected by mass-spectrometric (MS) or ion-mobility-spectrometric (IMS) techniques, so the analysis of these compounds is not limited to exhaled breath. The resulting data are spectral data, capturing the concentrations of the VOCs indirectly through intensities. About 3000 VOCs [1] have already been identified in human exhaled breath, and the number of research papers on VOC analysis and detection has risen nearly constantly over the last decade. Furthermore, the same techniques can also be used to capture biomarkers from foreign species within the individual's body. Extracting VOCs from an individual can be done by non- or minimally invasive techniques. However, the manual identification of VOCs and biomarkers related to a certain disease or infection is not feasible due to the complexity of the samples and the often unknown metabolic products, so automated techniques are needed. [1–4] To establish breath analysis as a diagnostic tool, machine learning methods could be used. Machine learning has become a popular and common technique for dealing with medical data due to its rapid analysis. Taking this advantage, breath analysis using machine learning could become the method of choice for diagnosis, keeping in mind that conventional methods are laboratory-based and, when trying to detect a bacterial infection, sometimes need several days to identify the organism. [5]
The main purpose of this thesis is to investigate factors influencing the buying decisions of cigarette smokers. To achieve this, different theories concerning consumer buying behavior and its influencing factors are discussed to gain a deeper understanding of consumer behaviour. To understand the factors that influence the buying decision of a smoker as a consumer, a questionnaire survey was conducted. The results of the survey indicate that brand awareness, tobacco quality, price, packaging, advertisement, influence by others and availability are the major factors influencing a smoker's buying decision, with availability, quality, price and brand awareness having the strongest influence.
A Systematic Literature Review on Blockchain Oracles: State of Research, Challenges, and Trends
(2023)
To enable data exchange between the blockchain protocol (on-chain) and the real world (off-chain), e.g., non-blockchain-based applications and systems, software called an oracle is used [3]. The blockchain oracle is an important component in the use of off-chain data for on-chain smart contracts. However, there is limited scientific literature available on this important blockchain topic. Therefore, in this paper, a novel systematic literature review based on intelligent methods, e.g., information linking, topic clustering and focus identification through frequency calculations, is proposed. Thus, the current state of scientific research interest, content and challenges, and future research directions for blockchain oracles are identified. This paper shows that there is little unbiased literature that does not call oracles a problem. From the results of this new literature review framework, relevant areas of data handling and verification with blockchain oracles are identified for future research.
Humans started using the principles of insurance thousands of years ago, when they lived in tribes in small villages. If one of the tribe members was injured, the others would take care of him and his family. The basic principle of insurance is several people covering each other against a particular risk. Today, most people in regions like Europe have access to insurance, while many people worldwide still have no access at all. Cost and accessibility may be improved with a blockchain-based parametric approach. The insurance process in a parametric approach is based exclusively on data, and decisions are made objectively. Blockchain is a necessary and integral part of the approach, creating transparency and connecting the customer's and investor's risk capital. The paper offers an overview of the opportunities and challenges of blockchain-based parametric insurance, a catalog of criteria for such insurance, a description of all components and their interaction for implementation on Ethereum, and a reference implementation of a train delay insurance in Germany.
Derived from the Ancient Greek word τραῦμα (Engl. wound, damage), the word trauma refers to either physical or emotional wounds. Nowadays, it is mostly used in the context of psychological wounds inflicted by an identity-shattering event, one that leaves the traumatised unable to reconcile their lived reality with the expectation of a universal human experience. The last decade, the last two years in particular, and the last two weeks ad absurdum have scarred the global landscape of human existence beyond recognition. From Putin's unexpected reimposition of mutually-assured-destruction doctrines via the global SARS-CoV-2 pandemic to the lingering threat of climate doom, people all over the globe have been faced with persistent threats to their most basic perceptions of ontological safety. This article seeks to examine the impact of the SARS-CoV-2 pandemic and to what degree it is justified to speak of a pandemic trauma. In addition, it engages with the liminality of pandemic trauma as a shared, collective and an isolated, individual experience, and with potential mitigation strategies for building community resilience.
The objective of this diploma thesis is to analyse the results of functional tests carried out on hydraulic valve blocks at the Wujin Plant of Bosch Rexroth (Changzhou) Co., Ltd. (China). Based on this analysis, tests could be checked for systematic errors and root causes of failures identified. Finally, this helped increase the first pass yield of testing and release resources so far bound in inefficient testing processes. Furthermore, a tracking mechanism was established to monitor the function of crucial sensors at test benches.
In the past few years, social media has become the most popular communication medium, replacing phone calls, text messages, television and even advertisements. Social media has become the most important channel for spreading opinions. As a result of this trend, many politicians have also started to operate social media accounts (Wang, Tsai, & Chen, 2019). This study was conducted in order to understand whether there was an inter-candidate agenda-setting effect between the Facebook posts of legislative candidates and presidential candidates during the election period, and whether the legislative candidates' Facebook posts were influenced by the presidential candidates' Facebook posts. The target population of this study was the three presidential candidates in Taiwan's 2020 presidential election (Dr. Tsai Ing-Wen, Mr. Han Kuo-Yu, and Mr. James Soong) as well as the 36 legislative candidates in Taipei, Taichung, and Kaohsiung.
The study focused on Facebook posts from 1st November 2019 to 10th January 2020, the 10 weeks before voting day. Text mining and cosine similarity were used to organize the posts and compare the similarity between them. Finally, the similarity between posts was presented as a line graph.
The study revealed that there was an inter-candidate agenda-setting effect between legislative candidate posts and presidential candidate posts, and that Dr. Tsai Ing-Wen, who was also the incumbent president during the campaign, was the most influential Facebook poster during the entire election.
Future research is proposed on the inter-candidate agenda-setting effect only analyzing the similarity of posts among the candidates to discuss the influence of the candidates' Facebook agenda-setting during a specific election period.
This is the first study in which the Facebook posts of Taiwanese politicians are analyzed and their relationships systematically compared across multiple degrees, which opens up a whole new subject for future elections in Taiwan.
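The similarity measure used in the study can be sketched in a few lines: posts are turned into term-frequency vectors and compared via cosine similarity. The sketch below is purely illustrative (whitespace tokenization on English toy posts; the study's actual preprocessing of Chinese-language Facebook posts is not reproduced here).

```python
from collections import Counter
from math import sqrt

def cosine_similarity(text_a, text_b):
    """Cosine similarity between the term-frequency vectors of two texts."""
    a, b = Counter(text_a.split()), Counter(text_b.split())
    dot = sum(a[t] * b[t] for t in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Two hypothetical posts sharing four of five terms
post_1 = "energy policy and economic growth"
post_2 = "economic policy and energy reform"
print(round(cosine_similarity(post_1, post_2), 2))  # → 0.8
```

Computing this pairwise over time windows yields the similarity series that the study plots as a line graph.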
The games industry has significantly grown over the last 30 years. Projects are getting bigger and more expensive, making it essential to plan, structure and track them more efficiently.
The growth of projects has increased the administrative workload for producers, project managers and leads, who have to monitor and control progress in order to maintain a permanent overview of the project. This is often accompanied by a lack of insight into the project and a lack of basic communication within the team. Therefore, the goal of this thesis is to enhance conventional project management methods with process structures that occur frequently in game development.
This thesis initially elaborates on what project management in the games industry actually is: here, methods are considered, especially agile methods and progress tracking practices, which were created for software development and have become a standard in game development. Subsequently, an example is used to demonstrate how process management can function within the development of video games. Based on this, an ideal is depicted, which is implemented and used in a tool at the German games studio KING Art GmbH. This ideal is compared with expert interviews in order to verify its general validity in the industry.
By integrating process structures, the administrative effort can be reduced and communication within game development simplified, while the current project status remains permanently visible. This benefits project management and leads as well as the entire team. Further application tests of this theory would have to be organized to check scalability and to draw comparisons to other applications.
Due to the intractability of the Discrete Logarithm Problem (DLP), it has been widely used in the field of cryptography, and the security of several cryptosystems is based on the hardness of computing the DLP. In this paper, we start with topics from number theory and abstract algebra, as they enable one to study the nature of discrete logarithms in a comprehensive way, and then we concentrate on the application and computation of discrete logarithms. Applications of discrete logarithms such as the Diffie-Hellman key exchange and the ElGamal signature scheme, as well as several attacks on the DLP such as the baby-step giant-step method and the Silver-Pohlig-Hellman algorithm, have been analyzed. We also focus on elliptic curves along with the discrete logarithm over an elliptic curve. Attacks on the elliptic curve discrete logarithm problem (ECDLP) have been discussed. Moreover, the extension of several discrete-logarithm-based protocols to elliptic curves, such as the elliptic curve digital signature algorithm (ECDSA), has also been discussed.
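As a concrete illustration of one of the attacks mentioned above, here is a minimal baby-step giant-step implementation for small, didactic parameters (a generic sketch, not code from the paper):

```python
from math import isqrt

def baby_step_giant_step(g, h, p):
    """Solve g^x = h (mod p) for x in O(sqrt(p)) time and memory."""
    m = isqrt(p) + 1
    # Baby steps: store g^j mod p for j = 0 .. m-1
    table = {pow(g, j, p): j for j in range(m)}
    # Giant steps: search for h * (g^-m)^i in the table
    factor = pow(g, -m, p)  # modular inverse power (Python 3.8+)
    gamma = h
    for i in range(m):
        if gamma in table:
            return i * m + table[gamma]
        gamma = gamma * factor % p
    return None  # no solution (h not in the subgroup generated by g)

# Example: find x with 2^x = 9 (mod 23); since 2^5 = 32 = 9 mod 23, x = 5
print(baby_step_giant_step(2, 9, 23))  # → 5
```

The attack's O(sqrt(p)) cost is exactly why real-world DLP parameters use groups of order around 2^256, where sqrt(p) is still astronomically large.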
This article aims to explain mathematically why the so-called double descent observed by Belkin et al. (Reconciling modern machine-learning practice and the classical bias-variance trade-off, PNAS 116(32) (2019), pp. 15849-15854) occurs on the way from the classical approximation regime of machine learning to the modern interpolation regime. We argue that this phenomenon may be explained by a decomposition of mean squared error plus complexity into bias, variance and an unavoidable irreducible error inherent to the problem. Further, in the case of normally distributed output errors, we apply this decomposition to explain why LASSO provides reliable predictors avoiding overfitting.
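For orientation, the classical part of the decomposition referred to above can be written in its standard form for squared loss with independent noise, where y = f(x) + ε and Var(ε) = σ² (the article's own decomposition additionally involves a complexity term, which is not reproduced here):

```latex
\mathbb{E}\big[(y - \hat f(x))^2\big]
  = \underbrace{\big(\mathbb{E}[\hat f(x)] - f(x)\big)^2}_{\text{bias}^2}
  + \underbrace{\mathbb{E}\big[(\hat f(x) - \mathbb{E}[\hat f(x)])^2\big]}_{\text{variance}}
  + \underbrace{\sigma^2}_{\text{irreducible error}}
```

The double-descent picture arises because, past the interpolation threshold, the variance term can fall again as model capacity grows further.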
In the field of blockchain technology applications and research, non-fungible tokens (NFTs) have gained significant attention in recent years. Whilst current research focuses on NFT use cases or the purchase of NFTs from an investor's perspective, the NFT launch (i.e. the primary market) from a creator's perspective remains unexplored. However, the launch strategy is considered an important factor for the success of a product. Therefore, our research paper aims to explore launch strategies for NFTs. We discuss the marketing mix instruments price (i.e. pricing strategy), place (i.e. mint mechanism), and promotion. Through an empirical approach of conducting eight expert interviews, we examine the parameters used to define an NFT launch strategy and assess the preferences of different stakeholders.
This Bachelor thesis deals with connected systems consisting of a multitude of similar electronic devices (often referred to as agents) endowed with information processing abilities. These so-called multi-agent systems are required to solve a certain task with high reliability, while the individual components are not able to solve the problem on their own in a satisfactory manner. A central control unit cannot or shall not be used in such systems for a variety of reasons: for example, a significant drawback of a central control unit is the vulnerability of the system, because if the central control unit fails, the whole system breaks down. Therefore, multi-agent systems require special algorithms enabling the agents to solve a common, global problem in a suitable manner by local interaction only.
In this thesis, distributed algorithms are investigated which can be used for distributed information processing and control of such multi-agent systems. In the first part of this work, it is assumed that each agent possesses a private information state about a common parameter of interest. The described consensus algorithm enables all agents to reach a system-wide identical information state by local information exchanges only. Subsequently, the case is considered in which every agent has access to streaming data containing information about an a priori unknown parameter. The diffusion strategy described in the second part enables the agents to estimate this parameter and to minimize a global cost function which depends on it. Both algorithms are described in a general framework and can therefore be applied to a variety of different problems. One application of these strategies, which is described in the third part of this work, is the simulation of swarming behavior.
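The consensus idea of the first part can be sketched in a few lines: each agent repeatedly moves its state toward those of its neighbors, and all states converge to the common average. This toy example on a 4-agent ring is illustrative only; the thesis treats the algorithm in a much more general framework.

```python
def consensus_step(states, neighbors, step_size=0.3):
    """One round of local averaging: each agent nudges its state
    toward the states of its graph neighbors."""
    return [
        x + step_size * sum(states[j] - x for j in neighbors[i])
        for i, x in enumerate(states)
    ]

# 4 agents on a ring, each starting from a private measurement
states = [1.0, 5.0, 3.0, 7.0]
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
for _ in range(100):
    states = consensus_step(states, ring)
print([round(x, 3) for x in states])  # all agents converge to the mean, 4.0
```

Because the update weights are symmetric and sum to one, the average of the states is conserved at every step, which is why the common limit is exactly the initial mean.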
Cancer is one of the main causes of death in developed countries, and cancer treatment heavily depends on successful early detection and diagnosis. Tumor biomarkers are helpful for early diagnosis. The goal of this discovery method is to identify genetic variations as well as changes in gene expression or activity that can be linked to a typical cancer state.
First, several cancer gene signaling pathways were introduced and then combined, and 27 candidate genes were selected. Through the analysis of several data sets in the GEO database, a few expression difference matrices were established. The candidate genes were tested against these matrices, and five genes (PLA1A, MMP14, CCND1, BIRC5 and MYC) were found to have the potential to be tumor biomarkers. Two of these genes are discussed further: PLA1A is a potential biomarker for prostate cancer, and MMP14 can be considered a biomarker for non-small-cell lung cancer.
Finally, the significance of this study and the potential value of the two genes are discussed, and prospects for future research in this direction are outlined.
This scientific work deals with current opportunities in business development. The purpose of the work is the study and analysis of an organization's development strategy and its formation. The subject of the study is the mechanism of forming an organization's development strategy, together with an understanding of business development and its core methodologies and branches. This thesis is based on the operations of a real engineering company, and the main part of the research could be applied in practice. The main goal of the thesis is to derive recommendations on the implementation of strategic changes to the organization's development strategy.
Blockchain is a technology which has the capability to change the way the world operates. As promising as this may be, there are still many challenges which do not exist, or are far simpler to solve, in conventional software solutions. Services offered over the blockchain suffer from so-called block confirmation times, where the customer simply has to wait until the transaction is confirmed. In this paper, possible solutions to that problem will be examined, and challenges that arise from the specific characteristics of the Ethereum blockchain will be analyzed.
Machine learning models for time series have always been a special topic of interest due to their unique data structure. Recently, the introduction of attention improved the capabilities of recurrent neural networks and transformers with respect to learning tasks such as machine translation. However, these models are usually subsymbolic architectures, making their inner workings hard to interpret without comprehensive tools. In contrast, interpretable models such as learning vector quantization are more transparent in their decision process. This thesis tries to merge attention as a machine learning function with learning vector quantization to better handle time series data. A design for such a model is proposed and tested on a dataset used in connection with attention-based transformers. Although the proposed model did not yield the expected results, this work outlines improvements for further research on this approach.
Analysis of Continuous Learning Strategies at the Example of Replay-Based Text Classification
(2023)
Continuous learning is a research field that has grown significantly in recent years due to highly complex machine and deep learning models. Whereas static models need to be retrained entirely from scratch when new data become available, continuous models progressively adapt to new data, saving computational resources. In this context, this work analyzes parameters impacting replay-based continuous learning approaches using the example of a data-incremental text classification task with an MLP and an LSTM. Generally, it was found that replay improves the results compared to naive approaches but does not achieve the performance of a static model. Mainly, performance increased with more replayed examples, and the number of training iterations has a significant influence, as it can partly control the stability-plasticity trade-off. In contrast, balancing the buffer and the strategy for selecting which examples to store in the replay buffer were found to have a minor impact on the results in the present case.
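One common way to select which examples to store, mentioned above as the buffer selection strategy, is reservoir sampling, which keeps a uniform sample over the whole stream in a fixed-size buffer. A minimal sketch (generic, not the thesis code):

```python
import random

class ReplayBuffer:
    """Fixed-size replay buffer filled by reservoir sampling, so every
    example seen so far has an equal chance of being retained."""

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.buffer = []
        self.seen = 0
        self.rng = random.Random(seed)

    def add(self, example):
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        else:
            # Replace a stored example with probability capacity / seen
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.buffer[j] = example

    def sample(self, k):
        """Draw replay examples to mix into the current training batch."""
        return self.rng.sample(self.buffer, min(k, len(self.buffer)))

buf = ReplayBuffer(capacity=5)
for i in range(100):
    buf.add(i)
print(len(buf.buffer), buf.seen)  # → 5 100
```

During training on new data, each batch would be augmented with `buf.sample(k)` examples from earlier tasks, which is the mechanism whose parameters (buffer size, number of replayed examples) the work analyzes.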
Stability of control systems is one of the central subjects in control theory. The classical asymptotic stability theorem states that the norm of the residual between the state trajectory and the equilibrium is zero in the limit. Unfortunately, it does not in general allow computing a concrete rate of convergence, particularly due to algorithmic uncertainty related to the numerical imperfections of floating-point arithmetic. This work proposes to revisit asymptotic stability theory with the aim of computing convergence rates using constructive analysis, a mathematical tool that realizes an equivalence between certain theorems and computation algorithms. Consequently, it also offers a framework which allows controlling numerical imperfections in a coherent and formal way. The overall goal of the current study also matches the trend of introducing formal verification tools into control theory. Besides existing approaches, the constructive analysis suggested within this work can also be considered for formal verification of control systems. A computational example is provided that demonstrates the extraction of a convergence certificate for example dynamical systems.
This paper analyses the status quo of large-scale decision making combined with the possibility of using blockchain as an underlying decentralized architecture to govern common pool resources in a collective manner, and evaluates them according to their requirements and features (technical and non-technical). Due to an increasing trend in the distribution of knowledge and an increasing amount of information, the combination of these decentralized technologies and approaches can be beneficial not only for consortial governance using blockchain but can also help communities to govern common goods and resources. Blockchain and its trust-enhancing properties can potentially be a catalyst for more collaborative behavior among participants and may lead to new insights about collective action and CPRs.
Analysis of the Forensic Preparation of Biometric Facial Features for Digital User Authentication
(2023)
Biometrics has become a popular method of securing access to data, as it eliminates the need for users to remember a password. Although the exploitation of the vulnerabilities of biometric systems has increased with their usage, these vulnerabilities could also be helpful during criminal casework.
This thesis aims to evaluate approaches to bypass electronic devices with forged faces to access data for law enforcement. Here, obtaining the necessary data in a timely manner is critical. However, unlocking a password-protected device via a brute-force attack can take several years. Consequently, biometrics could be a quicker alternative for unlocking.
Various approaches were examined to bypass current face recognition technologies. The first approaches included printing the user's face on regular paper and aimed to unlock devices performing face recognition in the visible spectrum. Further approaches consisted of printing the user's infrared image and creating three-dimensional masks to bypass devices performing face recognition in the near-infrared. Additionally, the underlying software responsible for face recognition was reverse-engineered to get information about its operation mode.
The experiments demonstrate that forged faces can partly bypass face recognition and obtain secured data. Devices performing face recognition in the visible spectrum can be unlocked with a printed image of the user's face. Regarding devices with advanced near-infrared face recognition, only one could be bypassed with a three-dimensional face mask. In addition, its underlying software provided evidence about the demands of face recognition. Other devices under attack remained locked, and their software provided no clues.
No abstract available.
This bachelor thesis sets out to introduce the theoretical concepts of Human Resources Management (HRM), to analyze the work of the human resources department of LLC Tavria-V, and to recommend actions to improve the productivity of the personnel. To prepare the implementation of actions for improving personnel management, first an overview of theoretical and methodological aspects of HRM is presented, and theories that have shaped today's management of workers are described. Secondly, the organizational structure of the enterprise, its main indicators and types of activities are identified and analyzed in the form of tables and diagrams. The main object of the thesis, the process of personnel management with its qualitative characteristics, is described and presented. Using an employee survey, the advantages and disadvantages of the present HRM system are identified. In the last part, taking into account all data about the current situation, recommended actions for LLC Tavria-V and their expected effect, based on the personnel management analysis, are presented.
In this thesis, we focus on using machine learning to automate manual or rule-based processes for the deduplication task of the data integration process in an enterprise customer experience program. We study the underlying theoretical foundations of the most widely used machine learning algorithms, including logistic regression, random forests, extreme gradient boosting trees, support vector machines, and generalized matrix learning vector quantization. We then apply these algorithms to a real, private data set and use standard classification evaluation metrics, such as the confusion matrix, precision, recall, area under the precision-recall curve, and area under the receiver operating characteristic curve, to compare their performance and results.
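The core metrics mentioned above reduce to counts from the confusion matrix. A minimal, self-contained sketch for the binary deduplication setting (toy labels, not the private data set from the thesis):

```python
def precision_recall(y_true, y_pred, positive=1):
    """Precision and recall from binary labels (positive = duplicate pair)."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0  # how many flagged pairs are real duplicates
    recall = tp / (tp + fn) if tp + fn else 0.0     # how many real duplicates were found
    return precision, recall

# 1 = duplicate pair, 0 = distinct pair (hypothetical labels)
y_true = [1, 1, 1, 0, 0, 0, 0, 1]
y_pred = [1, 1, 0, 0, 1, 0, 0, 1]
print(precision_recall(y_true, y_pred))  # → (0.75, 0.75)
```

Sweeping the classifier's decision threshold and recomputing these two numbers at each point is what produces the precision-recall curve whose area the thesis uses for comparison.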
As new sensors are added to VR headsets, more data can be collected. This introduces a new potential threat to user privacy. We focused on the feasibility of extracting personal information from eye tracking. To achieve this, we designed a preliminary user study focusing on the pupil response to audio stimuli. We used a variety of machine learning models on the collected data to determine the feasibility of obtaining information such as the age or gender of the participant. Several of the experiments show promise for obtaining this information. We were able to extract with reasonable certainty whether caffeine had been consumed, as well as the gender of the participant. This demonstrates the unknown threat that embedded sensors pose to users. Further studies are planned to verify the results.
FUSO is one of Japan's leading manufacturers of trucks and buses and an integral part of Daimler AG. As a large truck and bus manufacturer, FUSO faces marketing problems caused by corrosion, which is one of the major causes of vehicle breakdowns and degraded performance. To counter this, FUSO initiated a new project called the "Anti-Corrosion Project". Its main mission is to improve the corrosion resistance of metal parts. Currently, almost 70 percent of FUSO's parts fall under Grade III, i.e. less than one year of corrosion resistance.
In this project, corrosion issues are collected through different types of audits, both from customers and by examining two-year-old vehicles in the worst condition. The listed corrosion issues are then investigated against the current specification, and new proposals are requested from suppliers. The cost of each proposed solution is estimated internally and negotiated with the supplier before being presented to top management for approval. When a higher corrosion specification is required, parts are taken from the production line and tested in FUSO's in-house materials laboratory. Finally, for each approved proposal a drawing change is released and the new specification is implemented. The entire project must be coordinated across departments, and working in these teams provides deeper insight into the causes of the issues.
In parallel with this project, the focus was on shop-floor improvements in the return-parts management (RPM) area. FUSO is also responsible for after-sales service; in other words, FUSO provides a warranty for parts that fail within three years. Failed parts are delivered by customers through dealers for warranty claims, and these are called Warranty Part Investigation (WPI) parts. Sometimes a customer wants to know the cause of a failure even though the warranty has expired; in this case the company investigates the cause but does not cover it under warranty. Such parts are known as Product Quality Report (PQR) parts.
The company operates a dedicated shop floor for return parts, which are received directly. RPM comprises four processes: inwarding, pre-analysis, investigation, and dispatch or scrap.
The company used to receive 30-50 parts per day, but recently decided to accept all failed parts, which increased delays in inwarding and the downstream processes. To solve this, a standard layout and process were defined. One of the main causes of inwarding delay was excessive documentation that was largely unnecessary; this work was automated and digitalized. The improvements were carried out using a lean-manufacturing project methodology and resulted in a higher inward rate of failed parts and lower inventory.
Global supply chains have raised safety, quality, and sustainability concerns. Given the significance and complexity of these factors, stakeholders incur risk regarding them, so each business's supply chain risk management must prioritize product characteristics. Accordingly, an effective traceability solution that can monitor and regulate product and supply chain aspects is crucial. This research paper elucidates the potential of smart contracts on a blockchain to enhance the efficacy of business transactions and to ensure comprehensive traceability within the supply chain of paper-based coffee cups. Improved transaction transparency and security over traditional supply chains have been achieved through the digitization of supply chain ecosystem interactions and transactions. This approach makes it easier to verify sources, manufacturing procedures, and quality standards in complex supply chains. Accordingly, the integration helps stakeholders monitor and track the whole ecosystem, promoting transparency, predictability, and dependability.
At a global level, different studies report that transport systems are responsible for 25% of CO2 emissions. In the context of sustainable mobility, one of the short-term challenges is the research and improvement of alternative fuels, which should allow a fast decrease in greenhouse gas generation through sustainable means of transport. In this sense, green hydrogen can play a fundamental role: it is the basis for producing synthetic fuels, which can replace oil and its derivatives. Synthetic fuels, or e-fuels, are hydrocarbons produced from carbon dioxide (CO2) and green hydrogen (H2) as the only raw materials. H2 or e-fuels could be used in many sectors (manufacturing, residential, transportation, mining, and other industries). In this study, different applications of hydrogen are evaluated by techno-economic analysis. The main variable affecting the production of hydrogen and its derivatives is the cost of electricity. Considering Chile's renewable energy potential, it is feasible to develop green hydrogen production in Chile as an energy vector, which would be technically and economically viable and would also bring environmental benefits.
Many companies use machine learning techniques to support decision-making and automate business processes by learning from the data they have. In this thesis we investigate the theory behind the machine learning algorithms most widely used in practice for solving classification and regression problems.
In particular, the following algorithms were chosen for the classification problem: Logistic Regression, Decision Trees, Random Forest, Support Vector Machines (SVM), and Learning Vector Quantization (LVQ). For the regression problem, Decision Trees, Random Forest, and Gradient Boosted Trees were used. We then apply those algorithms to real company data and compare their performances and results.
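Of the classifiers listed, Learning Vector Quantization has the most compact learning rule, which can be sketched in a few lines. This is a minimal, illustrative LVQ1 update in one dimension; the thesis does not specify its actual implementation, and the data below are invented:

```python
# Minimal LVQ1 training sketch: one prototype per class in 1-D.
# The nearest prototype is attracted to a sample of its own class
# and repelled by a sample of the other class.

def lvq1_step(prototypes, x, label, lr=0.1):
    """Apply one LVQ1 update for sample (x, label) in place."""
    # Find the nearest prototype by squared distance.
    nearest = min(prototypes, key=lambda p: (p["w"] - x) ** 2)
    sign = 1.0 if nearest["label"] == label else -1.0
    nearest["w"] += sign * lr * (x - nearest["w"])
    return prototypes

# Two well-separated classes around 0 and 10 (toy data).
prototypes = [{"w": 0.0, "label": 0}, {"w": 10.0, "label": 1}]
for x, y in [(1.0, 0), (9.0, 1), (2.0, 0), (8.0, 1)]:
    lvq1_step(prototypes, x, y)
```

After these four updates each prototype has drifted toward the mean of its own class, which is the behaviour the full algorithm relies on.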
Applications and Potential Impacts of Blockchain Technology in Logistics and Supply Chain Areas
(2022)
The motive of the present thesis is to analyze the applications and potential impacts of blockchain technology in the logistics and supply chain areas. For this purpose, literature from different sources has been used to analyze and gain an overview of the current status and role of blockchain technology within these areas. Different use cases, as well as pilot projects from organizations around the world, including Germany, are included. Suggestions for further applications and implementations of blockchain technology, along with their potential impacts, are made. Additionally, the cost of implementing blockchain-based solutions is estimated, and recommendations are given on the key points to consider before deciding to implement blockchain-based solutions in any organization.
Laser welding of hidden T-joints, connecting the web sheet through the face sheet of the joint, can provide advantages such as increased lightweight potential when manufacturing sandwich structures with thin-walled cores. However, maintaining the correct position of the beam relative to the joint is challenging. One method to reduce the positioning effort is optical coherence tomography (OCT), which interferometrically measures the reflection distance inside the keyhole during laser deep penetration welding. In this study, new approaches for targeted processing of the OCT signal to automatically detect misalignments are presented. It is shown that considering multiple components of the interference pattern, together with the respective signal intensities, improves the detection accuracy for misalignments.
Aspects of Mindful Leadership Upon the Psychological Health of Employees in an Intercultural Context
(2023)
Across the globe, organizations are in the midst of rapid transformation; immigration, digitalization, and the push for sustainability are just a few of the drivers. Organizational structures are being pushed toward more agility, co-opetition, integration, and tenable, resilient workplaces. The social structures of companies are being reformed, and the weight of cooperation and integration rests upon leaders and employees. But what psychological effects does this weight of integration have on the engagement of migrant and domestic employees at work? How does leadership style affect mental health and engagement in the cross-cultural workplace? Previous work has shown the importance of workplace integration; however, the impact on the mental health of domestic employees needs more scholarly attention in this new context. The object of the research is to define the connection between mindful leadership and the psychological health of employees within a cross-cultural workplace and to develop strategies to improve workplace engagement.
Assessment of COI and 16S for insect species identification to determine the diet of city bats
(2023)
Despite its numerous benefits for human living conditions, urbanization has also negatively affected humans, their environment, and the other organisms that share urban habitats with them. While some wild animals avoid living in urban areas, others tolerate or even prefer life in urban habitats. There are more than 1,400 species of bats in the world.
Therefore, they have the potential to contribute significantly to mammalian biodiversity in urban areas. Insectivorous bat species play a key role in agriculture by improving yields and reducing chemical pesticide costs. Using metabarcoding, it is possible to determine the prey consumed by these nocturnal mammals based on the DNA fragments in their fecal pellets. This study aimed to evaluate COI and 16S metabarcodes for insect species identification to determine the diet of metropolitan bats. For this purpose, COI and 16S metabarcodes were extracted, amplified, and sequenced from 65 bat feces collected in the Berlin metropolitan area. Following taxonomic annotation, I found that 73% of all identified insects could only be detected using the COI method, while 15% could only be recovered using the 16S approach. Just 12% of all detected insects were identified simultaneously by both markers. According to this result, COI is more suitable for the taxonomic identification of insects from bat feces. However, given the bias of COI primers, it is recommended to use both markers for a more precise estimation of species diversity. Additionally, based on the insect species identified, I observed that urban bats fed mainly on Diptera, Coleoptera, and Lepidoptera. The bat species Nyctalus noctula was most abundant in the samples. Its diet analysis revealed that 91% of the samples contained the insect species Chironomus plumosus. Fourteen pest insect species were also found in its diet.
Noise in the oceans is constantly increasing. Growing industrialisation through shipping, offshore wind parks, seismic studies, and other anthropogenic noise sources is putting the ecosystem under immense stress. The focus of this thesis is the assessment of continuous underwater noise from ships. Based on existing strategies in air as well as underwater, and a comparison of both, an alternative strategy for the assessment of continuous noise from ships is given. The concept developed is based on published, scientifically observed responses of animals to ship passes with an indication of an effect range. A model is created to describe the strategy, using publicly available data for cargo ships as an example. The results are summarized in maps depicting the affected area for an MRU of the OSPAR II region and the MPA "Borkum Riffgrund". The strategy is discussed and evaluated on the basis of these results. From this, further improvements and the need for additional information in publicly available vessel-traffic data are derived.
The GeoFlow II experiment aims to replicate Earth's core dynamics using a rotating spherical container with controlled temperature differences and simulated gravity. During the GeoFlow II campaign, a massive dataset of images was collected, necessitating an automated system for image processing and fluid-flow visualization in the northern hemisphere of the spherical container. Building on this, we aim to detect the characteristic structures appearing in the post-processed images. Recognizing YOLOv5's proficiency in object detection, we apply a YOLOv5 model for this task.
Cryptocurrencies are characterized by high volatility, both in the short and the long term. Experienced traders exploit this to profit from price fluctuations by swing trading. However, this requires closely observing and analyzing prices and entering trading positions at the right time. Only a few specialists who dedicate time to this, or optimized trading bots, are able to make profits continuously. The autradix protocol is a self-optimizing and self-learning parametric trading algorithm that analyzes price action in real time and adaptively optimizes the algorithm's parameters to realize the user's investment objective. Embedded in an adaptive genetic algorithm, possible parameterizations are simulated, and the optimal one for the investigated trading pairs is calculated. The generic trading protocol API enables coupling with various crypto exchanges and decentralized protocols. A smart-contract-based decentralized, trustless, and tokenized fund, controlled by a DAO, enables users to invest, operate trading agents, and participate in the generated profits according to their share.
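The kind of genetic parameter search described in this abstract can be sketched compactly. The fitness function below is an invented stand-in (the real autradix protocol evaluates parameterizations against simulated price action, which is not reproduced here):

```python
# Toy genetic-algorithm sketch: evolve a single trading parameter to
# maximize a hypothetical "profit" function that peaks at 0.3.
# This is illustrative only, not the autradix implementation.
import random

def profit(threshold):
    """Invented fitness landscape with a single optimum at 0.3."""
    return -(threshold - 0.3) ** 2

def evolve(generations=60, pop_size=20, seed=42):
    rng = random.Random(seed)
    # Initial population: random parameter values in [0, 1].
    pop = [rng.uniform(0.0, 1.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=profit, reverse=True)
        parents = pop[: pop_size // 2]          # selection: keep the top half
        children = [
            # Mutation: Gaussian perturbation of a random parent, clamped.
            min(1.0, max(0.0, rng.choice(parents) + rng.gauss(0.0, 0.05)))
            for _ in range(pop_size - len(parents))
        ]
        pop = parents + children                # elitism: parents survive
    return max(pop, key=profit)

best = evolve()
```

Because the parents always survive, the best candidate is never lost, and repeated mutation around it drives the population toward the optimum.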
A variety of methods have been used to describe natural systems and cellular functions, most of them continuous systems of differential equations. Based upon the neighbourhood relations in graphs and the complex interactions in cellular automata, a mathematical model was designed and implemented with an application user interface. This discrete approach, called graph automata, was utilised to simulate diffusion processes and chemical kinetics. The progression of diffusion in cellular environments was described, resulting in a discrepancy of 20% compared to experimental results. Different chemical kinetics were simulated and found to be as accurate as their continuous counterparts. The proposed model appears to be a highly scalable and modular approach to simulating natural systems.
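A discrete diffusion step of the kind this graph-automata approach builds on can be sketched in a few lines. This is an illustrative toy, not the thesis implementation; the graph, rate, and step count are assumptions:

```python
# Sketch of discrete diffusion on a graph: per time step, each edge
# carries a fixed fraction of the concentration difference between its
# endpoints from the higher-concentration node to the lower one.

def diffuse(conc, edges, rate=0.1, steps=50):
    """Return node concentrations after `steps` discrete diffusion steps."""
    conc = list(conc)
    for _ in range(steps):
        flux = [0.0] * len(conc)
        for i, j in edges:
            f = rate * (conc[i] - conc[j])   # flow along edge (i, j)
            flux[i] -= f
            flux[j] += f
        conc = [c + f for c, f in zip(conc, flux)]
    return conc

# Path graph 0-1-2-3 with all mass initially at node 0.
result = diffuse([1.0, 0.0, 0.0, 0.0], edges=[(0, 1), (1, 2), (2, 3)])
```

The update conserves total mass exactly and relaxes the profile toward the uniform state, mirroring what a continuous diffusion equation would do on the same geometry.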
Beam shaping and splitting with diffractive optics for high performance laser scanning systems
(2021)
Diffractive optical elements (DOEs) enable novel high performance and process-tailored scanning strategies for galvanometer-based scan heads. Here we present several such concepts integrating DOEs with laser scanners and the respective application use cases. Beam shaping DOEs providing a homogeneous fluence over a custom defined profile, such as a rectangular Top-Hat, enable increased process quality in Laser-Induced Forward Transfer (LIFT) compared to the Gaussian beam of the laser source. We show that aberrations which occur over the necessary large wafer-sized image field can be eliminated through the use of a synchronous XY-stage motion. Another application that benefits from the use of DOEs is laser drilling. Drilling in display and electronics manufacturing demands high throughput that can only be achieved through the use of beam splitting DOEs for parallel processing. To this end, the joint MULTISCAN project is developing a variable multi-beam tool capable of scanning and switching each individual beamlet for increased control.
Gold cyanidation is a process by which gold is removed from low-grade ore. Due to its efficiency it has found widespread application around the world, including in Peru. The process requires free cyanide in high concentration. After the gold extraction is completed, free cyanide as well as metal cyanide complexes remain in the effluent of gold mines and refineries. Often these effluents are kept in storage ponds, where they pose considerable risk to health and the environment. Thus, it is preferable to degrade cyanide to minimize the risk of exposure. In the context of this thesis, cyanide degradation was explored in a UV-light-based prototype. Degradation with a combination of hydrogen peroxide and UV light proved very effective at cyanide concentrations of 100 mg/L and 1000 mg/L. Furthermore, the presence of ammonia as a degradation product was confirmed. Membrane distillation may provide an alternative to cyanide destruction in the form of cyanide recovery; promising results were obtained from several membrane experiments.
Biological ammonium oxidation is a central component of the global nitrogen cycle. Given the extreme amounts of nitrogen of anthropogenic origin in the environment, the removal of reactive nitrogen is in the interest of both the environment and public health. This thesis investigates conditions for anaerobic ammonium oxidation with nitrate in an anammox reactor. Two laboratory reactors containing only ammonium and nitrate as electron donors and acceptors were operated and monitored for a total of 116 days. In addition, batch cultures were grown with cells from one reactor, and their gas composition was analyzed as a function of different properties. A range of analytical quantification methods was used, and it could be shown that degradation takes place under these conditions.
Current research on this reaction is sparse, which lends the bachelor thesis its relevance.
Bitcoin's energy consumption and social costs in relation to its capacity as a settlement layer
(2021)
Bitcoin runs on energy. The decentralized network's energy consumption has prompted multifaceted discussions about its efficiency and environmental impact. To put Bitcoin's energy consumption into perspective, we propose to relate (a) the energy consumption in TWh and (b) the resulting social costs in the form of carbon emissions to the Dollar value settled on the Bitcoin network. Both metrics allow us to relate and quantify Bitcoin's capacity as a settlement layer to the network's energy consumption and resulting carbon emissions, or social costs. We find that in early 2021 Bitcoin (a) settles between $2,333 and $7,555 for each Dollar spent on energy and (b) that, on average, a Dollar settled on the Bitcoin blockchain causes social costs of between 0.007% and 0.01% of the settled value, depending on the estimated energy consumption converted into the costs of carbon emissions. These results help to assess the efficiency, cost, and sustainability of Bitcoin and may allow a comparison of Bitcoin with existing settlement base layers such as Fedwire or gold.
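The two ratios can be made concrete with a back-of-the-envelope calculation using only the headline figures quoted in the abstract; the $10,000 example amount below is an arbitrary illustration:

```python
# Sanity computation of the two metrics from the abstract's figures.

# (a) Dollars settled on Bitcoin per Dollar spent on energy (early 2021)
settled_per_energy_dollar = (2333.0, 7555.0)

# (b) social cost as a fraction of the settled value: 0.007% to 0.01%
social_cost_fraction = (0.007 / 100, 0.01 / 100)

# Implied social cost of settling an illustrative $10,000 on-chain:
cost_low = social_cost_fraction[0] * 10_000    # about $0.70
cost_high = social_cost_fraction[1] * 10_000   # about $1.00
```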
This desk research initiates an exploration of present and potential blockchain applications in the European higher education sector. The aim of this research is to create a theoretical base for further postgraduate research and analysis, in order to create an effective model/framework to augment the integration of blockchain technology into existing organizational processes, initially in higher education institutions, but one that may be adaptable and generalizable to other specific uses. Due to the novelty of the topic, academic resources related to the research area are limited. Most studies seem to focus on blockchain-based applications in industries such as finance, healthcare, and supply chain management, and there is little evidence of the impact of blockchain technology on education. This paper discusses present, and suggests some potential, blockchain-based applications in education in Europe and beyond. This research provides groundwork for education and academia stakeholders, policymakers, and researchers to exploit the potential of blockchain in different functions of an education system.