This bachelor thesis was executed for the Interpipe company and concentrates on its business strategy in international markets, especially the Middle Eastern pipe market. Since choosing an inappropriate entry strategy can lead to significant negative consequences, the selection of a business strategy for international markets is one of the most critical decisions in international trade. The theoretical framework of the thesis is provided in the second chapter and was mainly collected through desk study. The theory review contains descriptions of various foreign market strategies, methods and mechanisms of decision-making, and levels and types of business environment. A combination of theories is adopted to facilitate the process of gathering the required information. The third chapter contains information about the Interpipe Company and its economic activity in the home country and abroad. General information about the Interpipe Company, its current position, and its business development strategy for the years 2015-2016 is presented. The situation in the Ukrainian pipe and wheel market during the period 2014-2015 was analyzed, and on this basis the reasons for the decline in profits and sales were deduced. At the same time, the forms of Interpipe's penetration into foreign countries were considered. In this regard, the most successful entry forms are suggested as the key strategy for penetrating the international market. The fourth chapter provides information about a specific approach for the penetration of the Interpipe Company into the Middle Eastern pipe market. The purpose is to increase the number of deliveries to oil and gas companies in this region and to continue establishing relations with key agents and distributors. The project aims to improve the enterprise's current position in the Middle Eastern pipe market and to establish advantageous international relations for both counterparts.
Data is collected from various sources: books and journals for the theoretical framework; newspapers, the company's published reports, press releases, catalogues, bulletins, brochures, presentations, Internet resources, etc. for the empirical study.
This thesis aims to research the platform YouTube, whether “being a YouTuber” qualifies as a profession or not, and what leads to this. The author combines existing scientific data with information provided by YouTubers who do this as a job, using the compilation method. The author merges that material and uses it to create a bachelor thesis that covers both the theoretical and the practical approach. The aim was to find out whether there is a recipe for success that can be followed and that leads to the views and clicks which are essential for the profession of a YouTuber. To do this, the author created two channels to see how the factors mentioned in this thesis are applied and whether the approach leads to success. The findings of this thesis showed that although the profession of a YouTuber can be classified as a job, it needs to be viewed differently from commonly known and socially accepted careers. Becoming a YouTuber and making money from this business therefore cannot be guaranteed.
Workload Optimization Techniques for Password Guessing Algorithms on Distributed Computing Platforms
(2019)
The following thesis covers several ways to optimize distributed computing platforms for cryptanalytic purposes. After an introduction to password storage, password guessing attacks and distributed computing in general, a set of initial benchmark results for a variety of different devices is analyzed. The results shown are mainly based on the use of the open-source password recovery tool Hashcat. The second part of this work presents an algorithmic implementation for information retrieval and workload generation. This thesis can be used for the conception of a distributed computing system, inventory analysis of available hardware devices, runtime and cost estimations for specific jobs and, finally, strategic workload distribution.
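The core of strategic workload distribution can be sketched as proportional keyspace splitting, in the spirit of Hashcat's `--skip`/`--limit` options. This is only an illustrative sketch, not the thesis's implementation; the device names and speeds below are hypothetical benchmark values.

```python
def split_keyspace(keyspace: int, speeds: dict) -> dict:
    """Assign each device a contiguous (skip, limit) slice of the keyspace,
    proportional to its benchmarked guessing speed (hashes per second)."""
    total_speed = sum(speeds.values())
    shares = {}
    offset = 0
    devices = list(speeds)
    for i, dev in enumerate(devices):
        if i == len(devices) - 1:
            limit = keyspace - offset          # last device takes the remainder
        else:
            limit = keyspace * speeds[dev] // total_speed
        shares[dev] = (offset, limit)          # maps to --skip and --limit
        offset += limit
    return shares

def runtime_estimate(keyspace: int, speeds: dict) -> float:
    """Wall-clock estimate when all devices run their slices in parallel."""
    shares = split_keyspace(keyspace, speeds)
    return max(limit / speeds[dev] for dev, (_, limit) in shares.items())

# Hypothetical benchmark speeds in H/s:
speeds = {"gpu-a": 50_000, "gpu-b": 30_000, "cpu-c": 20_000}
shares = split_keyspace(10_000_000, speeds)
```

With proportional slicing, every device finishes at roughly the same time, which is the point of benchmark-driven workload generation.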
In this thesis, the changes in the economy and society and the resulting effects on the labor market are outlined. Current studies show that the shrinking labor market and increasing digitalization result in a lack of skilled tech talent and a transition from an employer market to a clear employee market. Derived from the findings of scientific research on this topic and from conducted expert interviews, practical recommendations for recruitment actions within the scope of employer branding are defined in order to help corporations gain the needed tech skill set and drive innovation.
This work concentrates on the frequently used marketing instrument of brand personality. Its effect on the consumer and how it drives consumer behaviour through TV advertising are the focus. Scientific material, utilising research results of the last 20 years, has been analysed to investigate this subject. Furthermore, the example of Southern Comfort provides insight into how brand personality is applied in the real world of the marketing business.
This paper explores the origins of Maori images in New Zealand film history. Discussing the history of Maori and their society brings us closer to a people, once almost extinct, and their struggle for self-representation and self-governance. Taking an in-depth look at New Zealand's film history, we come to understand how Maori were the subject of the earliest films and when they started making their own films. Combining those elements gives us the opportunity to understand how early images of Maori were created by Pakeha directors. Looking at different films throughout film history shows how Maori images evolved over time, especially once Maori started depicting themselves. This paper not only answers questions about Maori images in film but also tries to make people realise what odds Maori had to overcome in their daily struggle for self-determination.
The following is a description and outline of the work done at the Cornell Lab of Ornithology developing Nation Feathers VR, a virtual reality game for learning about bird calls and songs. The goal was to develop a game which is intuitive, educational and entertaining. Furthermore, the software needed to be structured in a way that allows for feasible future expansion. This required careful data saving and retrieval. The game gives the player an opportunity to learn and apply that knowledge, all while maintaining a shorter runtime in order to reduce the total time spent in the virtual world. This is meant to prevent any discomfort to the player that may result from extended use of the VR headset.
VQ-VAE is a successful generative model that can perform lossy compression. It combines deep learning with vector quantization to achieve a discrete compressed representation of the data. We explore using different vector quantization techniques with VQ-VAE, mainly neural gas and fuzzy c-means. Moreover, VQ-VAE contains a non-differentiable discrete mapping, which we explore, and we propose changes to the original VQ-VAE loss to fit the alternative vector quantization techniques.
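The discrete mapping at the heart of VQ-VAE, and the fuzzy c-means style soft assignment explored as an alternative, can be sketched as follows. This is a toy pure-Python illustration of the two assignment rules, not the thesis's actual model code.

```python
def squared_dist(x, c):
    """Squared Euclidean distance between two equal-length vectors."""
    return sum((xi - ci) ** 2 for xi, ci in zip(x, c))

def quantize(z, codebook):
    """Hard VQ-VAE mapping: replace a latent vector z by the index and
    vector of its nearest codebook entry."""
    k = min(range(len(codebook)), key=lambda i: squared_dist(z, codebook[i]))
    return k, codebook[k]

def fuzzy_memberships(z, codebook, m=2.0):
    """Fuzzy c-means style soft assignment: membership of z in each code,
    a differentiable-friendly alternative to the hard nearest-neighbour map."""
    d = [max(squared_dist(z, c), 1e-12) ** 0.5 for c in codebook]
    u = []
    for i in range(len(codebook)):
        s = sum((d[i] / d[j]) ** (2.0 / (m - 1.0)) for j in range(len(codebook)))
        u.append(1.0 / s)
    return u  # memberships sum to 1
```

The hard mapping is what makes the original VQ-VAE non-differentiable; soft memberships like these are one way alternative quantizers can sidestep that.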
Protein structures are essential elements in every biological system that has evolved on earth, where they function as stabilizing elements, signal transducers or replication machineries. They consist of linearly bonded amino acids, which determine the three-dimensional structure of the protein, whereas the structure in turn determines the function. The native and biologically active structure of a protein can be understood as the folding state of a polypeptide chain at the global minimum of free energy.
By means of protein energy profiling, an approach derived from statistical physics, it is possible to assign a so-called energy profile to a protein structure. Such an energy profile describes the local energetic interaction features of every amino acid within the structure and introduces an energetic point of view onto proteins, instead of a structural or sequential one.
This work aims to offer a perspective on the question of how we may extract pattern information from energy profiles. The concrete subjects are energy-mapped Pfam family alignments and investigations into finding motifs or patterns in discretized energy profile segments.
To enable smart devices of the Internet of Things to be connected to a blockchain, a blockchain client needs to run on this hardware. With the Trustless Incentivized Remote Node Network, in short Incubed, it will be possible to establish a decentralized and secure network of remote nodes, which enables trustworthy and fast access to a blockchain for a large number of low-performance IoT devices. Currently, Incubed supports the verification of Ethereum data. To serve a wider audience and more applications, this paper proposes the verification of Bitcoin data as well, which can be achieved thanks to the modularity of Incubed. The paper describes the proof data that is necessary for a client to prove the correctness of a node's response, as well as the process of verifying the response using this proof data. A proof object containing the proof data will be part of every response in addition to the actual result. We design, implement and evaluate Bitcoin verification for Incubed. The creation of the proof data for supported methods (on the server side) and the verification process using this proof data (on the client side) have been demonstrated. This enables the verification of Bitcoin data in Incubed.
The number of Internet of Things (IoT) devices is increasing rapidly. The Trustless Incentivized Remote Node Network, in short IN3 (Incubed), enables trustworthy and fast access to a blockchain for a large number of low-performance IoT devices. Although currently IN3 only supports the verification of Ethereum data, it is not limited to one blockchain due to modularity. This thesis describes the fundamentals, the concept and the implementation of the Bitcoin verification in IN3.
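The Merkle-proof verification that underlies such Bitcoin proof data can be sketched as follows. This is a generic SPV-style illustration using Bitcoin's double SHA-256; it is not the actual IN3 proof-object format.

```python
import hashlib

def dsha256(b: bytes) -> bytes:
    """Bitcoin's double SHA-256 hash."""
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

def verify_merkle_proof(txid: bytes, merkle_root: bytes, proof: list) -> bool:
    """Recompute the Merkle root from a transaction hash and a proof path.
    `proof` is a list of (sibling_hash, sibling_is_left) pairs, one per
    tree level from leaf to root."""
    h = txid
    for sibling, sibling_is_left in proof:
        if sibling_is_left:
            h = dsha256(sibling + h)
        else:
            h = dsha256(h + sibling)
    return h == merkle_root
```

A client holding only block headers can thus check that a transaction is included in a block without trusting the responding node.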
not available
This study explores the opportunities and risks associated with user-generated content (UGC) in the communication strategies of marketing departments from a business perspective. With the rise of social media and online platforms, UGC has become a powerful tool for brands to engage with their audience, build trust, and enhance brand awareness. However, implementing UGC also comes with inherent risks, including the loss of control over brand messaging, potential negative user-generated content, and legal implications.
To investigate these dynamics, an empirical mixed-methods approach was employed, including expert interviews and a comprehensive literature review. The findings indicate that UGC offers significant opportunities for marketing departments, such as increased customer loyalty, enhanced authenticity, brand awareness, as well as a diverse set of possible content. However, the study also reveals the potential risks associated with UGC, highlighting the importance of managing these risks effectively.
In the following study, the properties of the superabsorbent polymer Broadleaf P4 were investigated with the aim of applying the polymer in constructed wetlands. Applying the polymer in constructed wetlands is intended to improve the removal of pesticides. For this, the polymer was placed into lab-scale wetlands together with pumice, and these were compared to a control wetland filled with gravel. The wetlands were run for several weeks, during which the nutrient removal was recorded. The polymer was also tested for its ability to adsorb the pesticides before the pesticides were added to the wetland beds.
In this thesis, we implement, correct, and modify the compartmental model described in “Transmission Dynamics of Large Coronavirus Disease Outbreak in Homeless Shelter, Chicago, Illinois, USA, 2020”. Our objective is to engage in reading and understanding scientific literature, reproduce the results, and modify or generalize an existing mathematical model. We provide an overview of epidemiological models, focusing on simple compartmental SEIR models. We correct inaccuracies and misprints in the original implementation and use the limited-memory Broyden–Fletcher–Goldfarb–Shanno algorithm to fit the model’s parameters. Furthermore, we modify the model by introducing an additional compartment. The resulting model has a more intuitive interpretation and relies on fewer assumptions. We also perform the fitting process for this alternative model. Finally, we demonstrate the advantages of our modified implementations and discuss other possible approaches.
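A minimal compartmental SEIR step of the kind discussed can be sketched as follows. The parameter values are illustrative defaults, not the fitted parameters from the paper, and the simple Euler integration stands in for whatever solver the implementation actually uses.

```python
def seir_step(s, e, i, r, beta, sigma, gamma, dt):
    """One explicit Euler step of the SEIR dynamics.
    beta: transmission rate, sigma: 1/incubation period, gamma: 1/infectious period."""
    n = s + e + i + r
    new_e = beta * s * i / n * dt   # susceptible -> exposed
    new_i = sigma * e * dt          # exposed -> infectious
    new_r = gamma * i * dt          # infectious -> recovered
    return s - new_e, e + new_e - new_i, i + new_i - new_r, r + new_r

def simulate(days, s0, e0, i0, r0, beta=0.5, sigma=1/5, gamma=1/7, dt=0.1):
    """Integrate the SEIR model and return the trajectory of states."""
    state = (s0, e0, i0, r0)
    history = [state]
    for _ in range(int(days / dt)):
        state = seir_step(*state, beta, sigma, gamma, dt)
        history.append(state)
    return history
```

Adding an extra compartment, as the thesis does, amounts to adding one more flow term of the same form; the total population stays conserved either way.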
In machine learning, Learning Vector Quantization (LVQ) is well known as a supervised learning method. LVQ has been studied as a way to generate optimal reference vectors because of its simple and fast learning algorithm [12]. In many classification tasks, different variants of LVQ are considered when training a model. In this thesis, two variants of LVQ, Generalized Matrix Learning Vector Quantization (GMLVQ) and Generalized Tangent Learning Vector Quantization (GTLVQ), are discussed. A transfer learning technique for these variants of LVQ is then implemented and visualized, and the results are compared on different datasets.
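The basic LVQ update that GMLVQ and GTLVQ generalize can be sketched as plain LVQ1: the winning prototype moves toward a sample of its own class and away from a sample of another class. This is a toy illustration, not the thesis's implementation.

```python
def squared_dist(x, w):
    """Squared Euclidean distance (GMLVQ would use a learned matrix here)."""
    return sum((a - b) ** 2 for a, b in zip(x, w))

def lvq1_update(prototypes, labels, x, y, lr=0.1):
    """One LVQ1 step: find the nearest prototype to sample x with label y,
    then attract it (same class) or repel it (different class).
    Returns the index of the winning prototype."""
    j = min(range(len(prototypes)), key=lambda i: squared_dist(x, prototypes[i]))
    sign = 1.0 if labels[j] == y else -1.0
    prototypes[j] = [w + sign * lr * (xi - w) for w, xi in zip(prototypes[j], x)]
    return j
```

GMLVQ replaces the fixed Euclidean distance with a learned quadratic form, and GTLVQ replaces point prototypes with affine subspaces, but the attract/repel scheme stays the same.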
Community-acquired pneumonia (CAP) is a very common, infectious, and sometimes lethal disease. It is therefore associated with high costs of diagnosis and treatment. To reduce health care costs in this area, diagnosis and treatment must become cheaper to conduct with no loss in predictive accuracy. One effective way of doing so would be the identification of easily detectable and highly specific transcriptomic markers, which would reduce the amount of laboratory work required while possibly enhancing diagnostic capability.
Transcriptomic whole-blood data derived from the PROGRESS study was combined with several documented features such as age, smoking status, and the SOFA score. The analysis pipeline included processing by self-organizing maps for dimensionality and noise reduction, as well as diffusion pseudotime (DPT). Pseudotime enabled modelling a disease course of CAP, where each sample represented a state/time in the modelled course. Both methods combined resulted in a proposed disease course of CAP, described by 1476 marker genes. An additional gene-set analysis also provided information about the immune-related functions of these marker genes.
Blockchain and other distributed ledger technologies are evolving into enabling infrastructures for innovative ICT solutions. Numerous features, such as decentralization, programmability, and immutability of data, have led to a multitude of use cases that range from cryptocurrencies, tracking and tracing to automated business protocols or decentralized autonomous systems. For organizations that seek blockchain adoption, the overwhelming spectrum of potential application areas requires guidance that reduces complexity and supports the development of blockchain-based concepts. This paper introduces a classification approach to provide design and implementation guidance that goes beyond current textbook classifications. As an outcome, a typology for management and business architects is developed, before the paper concludes with an instantiation of existing use cases and a discussion of their classes.
Influenza A viruses are responsible for the outbreak of epidemics as well as pandemics worldwide. The surface protein neuraminidase of this virus is responsible, among other things, for the release of virions from the cell and is thus of interest in pharmacological research. The aim of this work is to gain knowledge about evolutionary changes in sequences of influenza A neuraminidase through different methods. First, EVcouplings is used with the goal of identifying evolutionary couplings within the protein sequences, but this analysis was unsuccessful, probably due to the great sequence length of neuraminidase. Second, the natural vector method is used for sequence embedding, in the hope of visualizing the sequential progression of the virus protein over time. Last, interpretable machine learning methods are applied to examine whether the data can be classified by year and whether the extracted information conforms to the results of the EVcouplings analysis. In addition to the class label year, other labels such as groups or subtypes are used in classification, with varying results. For balanced classes the machine learning models performed adequately, but this was not the case for imbalanced data. Groups and subtypes can be classified with high accuracy, which was not the case for years, continents, or hosts. To identify the minimal number of features necessary for linear separation of neuraminidase group 1 subtypes, a logistic regression was finally performed, resulting in the identification of 15 combinations of nine amino acid frequencies. Since neither the sequence embedding nor the machine learning methods showed neuraminidase evolution over time, further research is necessary, for example with a focus on one subtype with balanced data.
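The final step, a logistic regression on amino-acid frequency features, can be sketched generically as follows. The toy data and plain gradient descent below are illustrative assumptions; they are not the thesis's actual features, dataset, or solver.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logreg(X, y, lr=0.5, epochs=200):
    """Stochastic gradient descent for binary logistic regression.
    X: list of feature vectors (e.g. amino-acid frequencies), y: 0/1 labels."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi                                  # gradient of log loss
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    return 1 if sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b) >= 0.5 else 0
```

If a small subset of frequency features suffices for perfect training accuracy, the classes are linearly separable in those features, which is the criterion behind the minimal-feature search.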
Proteins are involved in almost every aspect of life, mediating a wide range of cellular tasks. The protein sequence dictates the spatial arrangement of the residues and thus ultimately the function of a protein. Huge effort is put into cumbersome structure elucidation experiments that yield models describing the observed spatial conformation of a protein, enabling users to predict its function, to understand its mode of action, or to design tailored drugs to cure diseases caused by misfolded or misregulated proteins.
However, the results of structure determination experiments are merely models of reality, made under simplifying assumptions and sometimes containing major undetected errors. Moreover, such experiments are resource-demanding and cannot meet the actual demand.
Thus, scientists predict the structure of proteins in silico, resulting in models that are even more prone to error.
In consequence, structural biologists have searched for a practicable definition of structure quality, and over the last two decades several model quality assessment programs have emerged, measuring the local and global quality of particular structures. Seven representatives were studied with regard to the paradigms they follow and the features they use to describe the quality of residues. Their predictions were compared, showing that there is almost no common ground among the tools.
Is there a way to combine their statements anyway?
Finally, the accumulated knowledge was used to design a novel evaluation tool that addresses the problems previously spotted. High quality of its predictions as well as superior usability were key. The strategy was compared to existing approaches and evaluated on suitable datasets.
Currently, the Internet of Things (IoT) is connected to the virtual world through the Web of Things (WoT), allowing efficient utilization of real-world objects with Internet technologies. The WoT facilitates abstract interaction between applications and connected IoT devices, allowing owners to switch between devices while using multiple ones. To achieve this, virtual assets in WoT devices can be tokenized through smart contracts and transferred using hashed proof as transactions within blockchain networks that support virtual currencies. The goal of Web of Things is to establish connectivity, interoperability, and integration among IoT devices using web standards and protocols, reducing reliance on device manufacturers. This enables easy integration of Web 3.0 cryptocurrency for device management. This study proposes a solution for WoT applications involving different cryptocurrency definitions. Finally, simulation results are presented to demonstrate the tokenization-based ownership transfer in the Web of Things.
Tokenization projects are currently very prominent when it comes to new blockchain technologies. After explaining the fundamentals of cross-chain interaction, this bachelor thesis focuses on tokenization technology for Bitcoin on Ethereum. To provide a more practical context, the implementation of the currently most successful decentralized tokenization project is described.
The financial world of blockchains is mostly covered by Bitcoin, which accounts for about 210 billion dollars in market capitalization. Despite the great security and independence the technology offers its users, it is not easy to adapt it to upcoming applications due to its rigid underlying infrastructure. For small-scale transactions, everyday applications, or access to the variety of crypto technologies and projects, Bitcoin is relatively limited in future development. Compatibility for most of those applications covers currencies from more development-driven blockchains such as Ethereum. These want to reach the user base already holding Bitcoin and offer it a seamless transition to new applications without the risk of losing funds. Within the article, atomic swaps and tokenization are covered and current approaches are compared. Both mechanisms are used to create this symbiosis between Bitcoin and Ethereum.
To get a more practical view, an example of how to implement such a tokenization within an app is shown. This gives deeper insights and offers inspiration for digital identity-based app development.
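The hashed-timelock mechanism at the core of atomic swaps can be sketched as a toy in-memory contract: funds unlock only with the preimage of an agreed hash, and only before a timeout. This is illustrative only; real HTLCs live in Bitcoin script or Ethereum smart contracts.

```python
import hashlib

class Htlc:
    """Toy hashed timelock contract (HTLC)."""

    def __init__(self, hashlock: bytes, timeout: int):
        self.hashlock = hashlock   # sha256 of the secret preimage
        self.timeout = timeout     # block height or timestamp limit
        self.claimed = False

    def claim(self, preimage: bytes, now: int) -> bool:
        """Claim succeeds only before the timeout and with the right preimage."""
        if (not self.claimed and now < self.timeout
                and hashlib.sha256(preimage).digest() == self.hashlock):
            self.claimed = True
            return True
        return False
```

In an atomic swap, both chains hold an HTLC with the same hashlock; revealing the preimage to claim one side automatically lets the counterparty claim the other, which is what makes the swap atomic.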
The target of this diploma thesis is the development of a thermal simulation card to analyze the thermal behavior of an LTE PCIe Mini data card for GSM/UMTS-based wireless networks in different environments. The power consumption of modern wireless communication systems has increased dramatically during the last years. Especially for the next generation of wireless modem cards, thermal dissipation will be slightly at or even beyond the official guidelines for the components and the whole card. To gain knowledge about the behavior of the data card, it shall be simulated with software as well as with real hardware. As the ASIC components are not yet available, a hardware emulation shall be developed. The thesis covers the whole development process from the idea, the conception, and the layout to the assembly and the measurements. It starts with finding a way of emulating the mounted components, measuring, and powering. Afterwards, a card incorporating the principles found before will be developed. An additional software simulation provides comparative values for the measurements. After assembling the emulation cards and running reference measurements, trials for temperature improvements will be run and compared with the simulations.
In this work a new method for the prediction of the Xaa-proline (where Xaa is any amino acid) cis/trans isomerization was investigated. By extraction of twelve structural features (real secondary structure, inside/outside classification, properties of the environment around proline and of proline itself), a support vector machine (SVM) based prediction approach was developed. The Java software Xaa-PIPT for structural feature extraction was developed. Based on 4397 (2199 cis and 2198 trans) prolines extracted from non-redundant, globular proteins, a classifier was trained using the radial basis function (RBF) kernel. In ten-fold cross-validation it achieved an accuracy of 70.0478 %, a Matthews correlation coefficient (MCC) of 0.4223, a sensitivity of 0.5433 and a specificity of 0.8576. Based on this classifier, a lightweight and easy-to-use Java software tool, called mXaa-PIPT, for the prediction of the Xaa-proline cis/trans isomerization was developed. It was shown that there are correlations between the environment surrounding a proline and its isomerization state. mXaa-PIPT can be used for the evaluation of low-resolution protein structures and theoretical models, improving their quality through the prediction of the Xaa-proline isomerization.
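The reported quality measures all follow from a binary confusion matrix. The counts below are approximate reconstructions from the reported sensitivity and specificity (with cis taken as the positive class); they are not taken from the thesis itself.

```python
import math

def mcc(tp, tn, fp, fn):
    """Matthews correlation coefficient from a binary confusion matrix."""
    num = tp * tn - fp * fn
    den = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return num / den if den else 0.0

def sensitivity(tp, fn):
    return tp / (tp + fn)

def specificity(tn, fp):
    return tn / (tn + fp)

# Approximate counts reconstructed from the reported figures
# (2199 cis / 2198 trans, sensitivity 0.5433, specificity 0.8576):
tp, fn = 1195, 1004   # cis prolines predicted correctly / incorrectly
tn, fp = 1885, 313    # trans prolines predicted correctly / incorrectly
```

Plugging these counts in yields an MCC of about 0.42, consistent with the reported 0.4223, which is a useful sanity check on the published figures.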
The theoretical foundations of enterprise management using information technology were reviewed; the effectiveness of the use of information systems in the enterprise was analyzed; and ways of improving the enterprise management mechanism using information systems (using the example of Mars Wrigley Confectionery Belarus) were developed.
The Media System of Malawi
(2010)
not available
The impact of organisational structure and organisational culture on the efficiency of a business
(2020)
The fear of losing flexibility and effectiveness due to an increased organisational structure induced by personnel growth causes SMEs to defer structural changes. The purpose of this work is to examine whether the structural and cultural demands of employees match the structure and predominant culture within such a medium-sized company. As part of this, a survey was conducted to evaluate the current status and to suggest where and how changes would make sense to regain or even improve organisational efficiency.
The research in this thesis aims to analyze how a specific CSR approach of the Adidas Group on sustainability is perceived globally, based on an analysis of stock market movements combined with a sentiment analysis of tweet activity on Twitter. The thesis analyzed both positive feedback and criticism from customers worldwide regarding the approach and other initiatives of the Adidas Group and its partner Parley for the Oceans, a non-governmental organization working towards a more sustainable world.
The topic of soulbound, non-transferable tokens is getting lots of interest within the blockchain space lately as decentralized societies become more tangible with Web3 social media applications and DAOs. In this article, I want to outline how such tokens function, their problems for adoption and standardization, and how they differ from verifiable credentials in the SSI field. As such soulbound assets will likely rely on extended recovery and asset management schemes to become viable identities that safely gain reputation and trust, features like social recovery and contract-based accounting are incorporated. By combining those new technologies and the theoretical crypto-native identity construct, the paper will give an impression of the future user-centric data economy.
This paper set out to determine the effect of daily internet usage on attention span and whether this has an effect on academic performance. As described in the introduction, the paper consists of laying the groundwork by defining the relevant terminology, applying the methodology to the hypotheses, and making conclusive statements.
Two hypotheses were presented to give the paper its aim. While Hypothesis 1 can be proven true through the two-step terminology applied, Hypothesis 2 does not stand up to scrutiny. For lack of sufficient and specific evidence, the only conclusive statement that can be made regarding it is that it is untrue.
Approximately 80% of the population sample analysed were between the ages of 19 and 30, which automatically restricts the analysis, extrapolations, and scientific statements to a more specific age group. The other ages represented were almost all above this range, meaning that the findings cannot accurately be applied to older age groups.
Nonetheless, the data collected was accurate and could be applied to prove Hypothesis 1, meaning that daily internet usage breeds and invites a short attention span. Physical, social, and mental factors, along with an individual's motivation, also shape academic performance; for lack of a fitting data collection method, these factors could not be taken into consideration.
Conclusively, the author predicts that, with ubiquitous internet connections and the growing popularity of digital technology, attention spans will continue to stay as short as they are. Individuals will find ways to direct their short attention span where it is needed and apply it as necessary.
Both cryptocurrency researchers and early adopters of cryptocurrencies agree that they possess a special kind of materiality, based on the laborious productive process of digital ‘mining’ [1]. This idea first appears in the Bitcoin White Paper [2], which encourages Bitcoin adopters to construct and justify its value in metaphoric comparison to gold mining. In this paper, I explore three material aspects of blockchain: physical infrastructure, human language and computer code. I apply the concept of ‘continuous materiality’ [3] to show how these three aspects interact in practical implementations of blockchain such as Bitcoin and Ethereum. I start from the concept of ‘digital metallism’ that stands for the ‘fundamental value’ of cryptocurrencies, and end with the move of Ethereum to ‘proof-of-stake’, partially as a countermeasure against ‘evil miners’. I conclude that ignoring the material aspects of blockchain technology can only further problematize the complicated relations between its technical, semiotic and social materiality.
not available
Where does the cocoa, which we consume on a regular basis, come from? Supply chains are not always transparent, much less easily comprehensible. The cocoa industry faces ongoing challenges. Whether it be the chocolate manufacturers’ promise to maintain a sustainable and ethical supply chain, the minimal impact on the environment or the maximum adherence to human rights in their production process. This paper revises important steps which lead to the compliance with UN standards and questions the role of consumers in the construct of ethical chocolate products.
The shape-memory Nitinol as a nickel-titanium alloy is widely used in actuator and medical applications. However, the connection of a flange to the rod is a critical point. Therefore, laser rod end melting enables material accumulations to generate a preform at the end of a rod, followed by die forming, so that the flange can be generated. This process has been successfully applied on 1.4301 steel. This study is aimed to investigate laser rod end melting of shape-memory Nitinol regarding the resultant surface quality of the preforms. The results showed that spherical preforms could be generated without visible surface discoloration due to oxidation. By using different scan rates, different solidification conditions occurred which led to significantly different surface structures. These findings show that laser rod end melting can principally be applied on Nitinol to generate preforms for flanges whereby the surface quality depends on the solidification conditions.
As economies become more and more interconnected, the importance of the global logistics sector has grown accordingly. However, both structural challenges and current events have led to recent supply chain disruptions, exposing the vulnerabilities of the sector. Simultaneously, blockchain has emerged as a key innovative technology with use cases going far beyond the exchange of virtual currencies. This paper aims to analyze how the technology is transforming global logistics and its challenges. To this end, six use cases are presented to give an overview of the technological possibilities of blockchain and smart contracts. The analysis takes theoretical approaches from scientific journals and combines them with findings from real-world implementations. The paper finds that the technology can change supply chain design fundamentally, with processes and decisions being automated and power within supply chain structures shifting. However, implementations also face technological, environmental, and organizational challenges that need to be solved for widespread adoption.
This thesis proposes a solution to the practical problem of supervising relatively basic mechanical processes in robotics by means of computer vision. Supervision happens by comparing the tracked movement with a known, ideal recording of the movement that acts as a model.
First, this thesis analyzes possible approaches to the problem regarding data structures and representation, ways of extracting the data from the recording and ways to compare the data sets of two recordings. Then, a specific solution is implemented in C++ and explained.
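One plausible way to compare a tracked movement with an ideal reference recording is dynamic time warping (DTW), which tolerates differences in timing. The sketch below is illustrative only: the function name and the toy trajectories are invented here, and the thesis's actual C++ implementation and data structures may differ.

```python
# Minimal DTW sketch: compare a tracked 2D trajectory against an
# ideal reference recording. Illustrative only; the thesis's actual
# C++ implementation may use other representations.
def dtw_distance(a, b):
    """Dynamic-time-warping distance between two point sequences."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = ((a[i-1][0]-b[j-1][0])**2 + (a[i-1][1]-b[j-1][1])**2) ** 0.5
            d[i][j] = cost + min(d[i-1][j], d[i][j-1], d[i-1][j-1])
    return d[n][m]

ideal   = [(0, 0), (1, 1), (2, 2), (3, 3)]
tracked = [(0, 0), (1, 1), (1, 1), (2, 2), (3, 3)]  # same path, slower
print(dtw_distance(ideal, tracked))  # 0.0: only the timing differs
```

A large DTW distance would then indicate that the tracked process deviates from the model recording.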
Success story DAB in the UK
(2017)
The popularity of digital audio broadcasting in different countries can be explained mainly by their historical development. In this work, the general technical background is described and the mode of operation is explained. In addition, advantages, disadvantages and alternatives are presented. After that, the development of digital radio in Germany and the UK is compared with the current situation in order to show how the differences have led to a different distribution and acceptance of the medium.
This master's thesis studies customer behavior using the example of the skin care brand Nivea. It presents the theoretical basis for the subsequent research on marketing, customer behavior, and the proper conduct of marketing research, followed by an analysis of the German market. Since Nivea is a brand of the Beiersdorf company, Beiersdorf's activities and operations are also described. The main idea of the paper is to analyze the customer behavior of Nivea; accordingly, the work contains extensive research on the brand along with its micro- and macroenvironment. An in-depth interview and a survey were also conducted to understand customers' current needs. Based on all the results, the author proposes some ideas for the Nivea brand.
This study shows the potential of the make-or-buy theory in several scenarios: production, assembly and development. The evaluation of these possibilities is conducted based on Bosch's core competencies. A decision model is developed to support the decision-making process. Based on these results, the serial production at RBAC in China is planned and suggestions for setting up the assembly line are given.
The expression of the titin-Hsp27 construct and the subsequent purification yielded no satisfactory results, making atomic force microscopy measurements impossible. Instead, a structure model for the protein sequence was developed using different bioinformatic methods. These methods comprise template searches via different BLAST runs and freely available software such as SwissModel, Pcons, ModWeb and other tools. Nevertheless, the generated model does not represent the native conformation and has to be analyzed with further software until a stable conformation of the structure can be predicted. Given the available time, the generated model is a good approximation of the aim of this master's thesis.
More than 10 years after the invention of Bitcoin, the underlying blockchain technology is having an increasing effect on today's society. Although one of the most popular application areas of blockchain is still the field of cryptocurrencies, the technological concepts are crossing into further application domains such as international supply chains. Fast-changing markets, high costs of time and risk management, as well as biased relationships between the actors pose major challenges to appropriate supply chain management. Based on a case study about sensor tracking, this paper explores the potential impact of blockchain on small and medium enterprises within an international supply chain. We will show that blockchain technology offers a high potential to reduce inequalities in power relations between the actors involved in supply chains. To this end, the requirements for the use of blockchain in supply chain management are analyzed by means of a conducted case study and an expert survey of the companies concerned.
The set of transactions that occurs on the public ledger of an Ethereum network in a specific time frame can be represented as a directed graph, with vertices representing addresses and an edge indicating the interaction between two addresses.
While there exists preliminary research on analyzing an Ethereum network by means of graph analysis, most existing work is focused either on the public Ethereum Mainnet or on analyzing the different semantic transaction layers using
static graph analysis in order to carve out the different network properties (such as interconnectivity, degrees of centrality, etc.) needed to characterize a blockchain network. By analyzing the consortium-run bloxberg Proof-of-Authority (PoA) Ethereum network, we show that we can identify suspicious and potentially malicious behaviour of network participants by employing statistical graph analysis. We thereby show that it is possible to identify the potentially malicious
exploitation of an unmetered and weakly secured blockchain network resource. In addition, we show that Temporal Network Analysis is a promising technique to identify the occurrence of anomalies in a PoA Ethereum network.
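The transaction-graph representation described above can be sketched in a few lines. The addresses and the crude degree-based anomaly heuristic below are invented for illustration and are much simpler than the statistical and temporal graph analyses used in the paper.

```python
# Toy transaction graph: vertices are addresses, a directed edge
# (sender -> receiver) records an interaction. A crude anomaly
# heuristic flags addresses whose out-degree is far above average.
from collections import defaultdict

transactions = [  # (sender, receiver) pairs; invented sample data
    ("0xA", "0xB"), ("0xA", "0xC"), ("0xA", "0xD"),
    ("0xA", "0xE"), ("0xA", "0xF"), ("0xB", "0xC"),
]

out_deg = defaultdict(int)
in_deg = defaultdict(int)
for s, r in transactions:
    out_deg[s] += 1
    in_deg[r] += 1

addresses = set(out_deg) | set(in_deg)
mean_out = sum(out_deg.values()) / len(addresses)

# flag addresses sending far more than the network average
suspicious = [a for a in addresses if out_deg[a] > 3 * mean_out]
print(suspicious)  # ['0xA']
```

In practice such a heuristic would be replaced by proper statistical measures (centrality distributions, temporal snapshots), but the graph construction step is the same.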
Standard assembly time is an important piece of data in product development that is used to compare different product variants or manufacturing variants. In the presented approach, standard time is created with the use of a decision tree covering standard manual and machine-manual operations, taking into consideration product characteristics and typical tools, equipment and layout. The analysed features include, among others, information determined during product development, such as product structure, parts characteristics (e.g. weight, size) and connection type, as well as information determined during assembly planning: tools (e.g. hand screw driver, power screw driver, pliers), equipment (e.g. press, heater), and workstation layout (e.g. distance, way of feeding). The object-attribute-value (OAV) framework was applied for the assembly characteristics. An example of the decision tree's application to predict standard assembly time is presented for a mechanical subassembly. The case study is dedicated to standard time prediction for a bearing assembly. The presented approach is particularly important for enterprises which offer customized products.
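A decision tree over OAV-style features can be sketched as nested rules mapping operation attributes to time estimates. All attribute names and time values (in seconds) below are invented stand-ins, not the tree from the paper.

```python
# Hedged sketch of a decision tree over object-attribute-value (OAV)
# features for standard assembly time. All time values (seconds)
# and attribute names are invented for illustration.
def standard_time(op):
    """Estimate standard time for one assembly operation."""
    if op["connection"] == "screw":
        if op["tool"] == "power screw driver":
            return 4.0 if op["weight_kg"] <= 1.0 else 6.0
        return 9.0  # hand screw driver is slower
    if op["connection"] == "press_fit":
        return 12.0  # requires equipment (press)
    return 7.0  # default for other connection types

bearing = {"connection": "press_fit", "tool": None, "weight_kg": 0.3}
screwing = {"connection": "screw", "tool": "power screw driver", "weight_kg": 0.5}
print(standard_time(bearing) + standard_time(screwing))  # 16.0
```

Summing the per-operation estimates over a product structure yields the standard assembly time for the whole subassembly.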
The subject of the following paper is the analysis of global companies' motives for taking on sport sponsorships as a corporate social responsibility (CSR) initiative. This work is compilatory in nature because it is derived from literature released by experts as well as real-life case studies. The expert literature provides a basis of theories and models regarding the fundamental motives for CSR and sport sponsorship and visualizes them by means of statistics and real-life case studies. This paper aims to inform individuals, leaders and specifically global organizations about the benefits that taking on a sport sponsorship may have for fulfilling a company's CSR objectives.
As widely discussed in the literature, spatial patterns of amino acids, so-called structural motifs, play an important role in protein function. The functionally responsible part of a protein often lies in an evolutionarily highly conserved spatial arrangement of only a few amino acids, which are held in place tightly by the rest of the structure. In general, these motifs can mediate various functional interactions, such as DNA/RNA targeting and binding, ligand interactions, substrate catalysis, and stabilization of the protein structure.
Hence, characterizing and identifying such conserved structural motifs can contribute to the understanding of structure-function relationships in diverse protein families. For this reason, and because of the rapidly increasing number of solved protein structures, it is highly desirable to identify, understand and, moreover, search for structurally scattered amino acid motifs. The aim of this work was the development and implementation of a matching algorithm to search for such small structural motifs in large sets of target structures. Furthermore, motif matches were extensively analyzed, statistically assessed and functionally classified. Following a novel approach, hierarchical clustering was combined with functional classification and used to deduce evolutionary structure-function relationships. The proposed methods were combined and implemented in a feature-rich and easy-to-use command-line software tool, which is freely available and contributes to the field of structural bioinformatics research.
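The core matching step can be illustrated with a brute-force sketch: a motif is a small set of typed residues with fixed pairwise distances, and a match is any same-typed residue combination in the target whose pairwise distances agree within a tolerance. The coordinates, residue types, and tolerance below are invented; the actual algorithm of the work is certainly more efficient than this exhaustive search.

```python
# Brute-force spatial motif matching (illustrative sketch).
# A motif is [(residue_type, (x, y, z)), ...]; a match in the target
# is a same-typed residue tuple whose pairwise distances agree
# within a tolerance. Coordinates below are invented.
from itertools import permutations

def dist(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def find_matches(motif, target, tol=0.5):
    types = [t for t, _ in motif]
    matches = []
    for cand in permutations(range(len(target)), len(motif)):
        if [target[i][0] for i in cand] != types:
            continue
        ok = all(
            abs(dist(motif[i][1], motif[j][1])
                - dist(target[cand[i]][1], target[cand[j]][1])) <= tol
            for i in range(len(motif)) for j in range(i + 1, len(motif))
        )
        if ok:
            matches.append(cand)
    return matches

# catalytic-triad-like motif: Ser, His, Asp with fixed geometry
motif = [("SER", (0, 0, 0)), ("HIS", (3, 0, 0)), ("ASP", (0, 4, 0))]
target = [("SER", (10, 0, 0)), ("HIS", (13, 0, 0)),
          ("ASP", (10, 4, 0)), ("GLY", (20, 20, 20))]
print(find_matches(motif, target))  # [(0, 1, 2)]
```

Matching by pairwise distances makes the search invariant to translation and rotation of the target structure, which is the property a structural motif search needs.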
not available
We use machine learning for the selection and classification of single-molecule trajectories to replace commonly used user-dependent sorting algorithms. Measured fluorescence time series of labelled single molecules need to be sorted into 'good' and 'bad' molecules before further kinetic and thermodynamic analysis.
Currently, processing, sorting and analysis of the data are mainly done with the help of laboratory-specific programs.
Although there are freely available programs for processing smFRET data, they either do not offer 'molecular sorting' or do so purely empirically. Only recently have new approaches emerged to solve this problem by means of machine learning. Here, we describe a sound terminology for molecular sorting of smFRET data and present an efficient workflow for manual annotation followed by training of the ML algorithm. Descriptive statistics of our generated dataset are provided and will serve as the basis for supervised ML-based molecular sorting algorithms yet to be developed.
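The sorting task can be sketched minimally: extract simple features from each fluorescence trace and apply a rule in place of a trained classifier. The thresholds, feature choice, and traces below are invented for illustration; a supervised model trained on annotated traces would replace the hand-set rule.

```python
# Minimal sketch of 'molecular sorting': extract simple features from
# fluorescence time traces and apply a hand-set rule in place of a
# trained ML classifier. Thresholds and traces are invented.
def features(trace):
    n = len(trace)
    mean = sum(trace) / n
    var = sum((x - mean) ** 2 for x in trace) / n
    return mean, var

def sort_molecule(trace, min_mean=0.2, max_var=0.05):
    """Label a trace 'good' or 'bad' based on intensity and noise."""
    mean, var = features(trace)
    return "good" if mean >= min_mean and var <= max_var else "bad"

steady = [0.5, 0.52, 0.48, 0.5, 0.51]      # stable, bright: good
noisy  = [0.9, 0.1, 0.8, 0.05, 0.95]       # erratic: bad
print(sort_molecule(steady), sort_molecule(noisy))  # good bad
```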
The intention of this thesis is to examine the beneficial impact of renewable energies in general, and biogas technologies in particular, on the socioeconomic status of people, considering all relevant factors affecting their development: political, cultural, environmental, and institutional. As energy and development are closely correlated, biogas technologies figure prominently as part of a decentralized, sustainable, renewable energy network, especially in rural areas of Nepal.
This thesis was written in order to prove the expediency of startup ecosystem support and to develop practical recommendations for the Belarusian government based on an analysis of successful practices in the U.S. and Lithuania.
It covers the essence of a “startup company” and a “startup ecosystem” and provides an analysis of the socioeconomic impact of startup companies, with a particular focus on job creation. It sheds light on the best startup support policies in the U.S., where the most prominent startup ecosystems operate, and in Lithuania, a country with preconditions similar to Belarus's and a rapidly developing ecosystem. Furthermore, this paper deals with Belarus's peculiarities regarding fostering startup ecosystem growth. It assesses the recent economic development of the Belarusian IT sector and gives an insight into its competitive advantages and challenges.
The paper is based on internet research using articles, presentations, reports and studies, websites and official legal documents.
Smart ultrafast laser processing with rotating beam – Laser micro drilling, cutting and turning
(2021)
Current micro drilling, cutting and turning processes are mainly based on EDM, milling, stamping, honing or grinding. All these technologies use a tool with a predefined geometry that is transferred to the workpiece. In contrast, the laser is a highly flexible tool that can adapt its size very quickly by changing only a software setting. Thanks to the efforts in laser development in recent years, stable ultrafast lasers with sufficient average power and high repetition rates have become industrially available. To use as many pulses as possible, cost-efficient production demands innovative processes and machining setups with fast axis movement and special optics for beam manipulation. GFH has developed a helical drilling optics that rotates the beam at up to 30,000 rpm in a very precise circle and furthermore allows the diameter and the incidence angle to be adjusted. This enables the laser to be used for high-precision drilling and cutting as well as for micro turning processes.
Generating electricity from wind power is one of the fastest-growing methods of power generation in the world. The kinetic energy of the moving air is converted into electricity by wind turbines that are installed in places where the weather conditions are most favorable.
Wind turbines can be used individually, but are often grouped together to form wind parks, also called wind farms. Electricity generated by wind parks can be used to meet local needs or to supply an electricity distribution network for homes and businesses further away.
Energy obtained from the wind can also be converted into hydrogen and used as transport fuel or stored for subsequent electricity generation. The use of this form of energy reduces the impact of electricity generation on the environment, as it does not require fuel and does not produce any pollutants or greenhouse gases.
Wind energy is growing significantly, and since 1994 the world market has grown by around 30% per year. The installed capacity worldwide rose from 17,400 to 650,560 MW between 2000 and the end of 2019. In the European market, which concentrates most of the world's wind farms, Germany remains the leader with almost half of the total capacity. Spain recorded the strongest growth in the last three years, with an annual growth rate of 28%. Europe also concentrates industrial and technological activities: eight European manufacturers are among the top ten in the world, with 70% of devices sold in 2018.
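The quoted capacity figures imply a compound annual growth rate that can be checked directly. The short calculation below uses only the numbers stated in the text, taking 19 full years between the end of 2000 and the end of 2019 (an assumption, since the exact reference dates are not given).

```python
# Compound annual growth rate of installed wind capacity,
# from 17,400 MW (2000) to 650,560 MW (end of 2019).
start_mw, end_mw, years = 17_400, 650_560, 19
cagr = (end_mw / start_mw) ** (1 / years) - 1
print(f"{cagr:.1%}")  # about 21% per year
```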
Sensor fusion is an important and crucial topic in many industrial applications. One of the challenging problems is to find an appropriate sensor combination for the application at hand, or to weight the sensors' information adequately. In our contribution, we focus on the application of the sensor fusion concept together with distance-based learning for object classification purposes. The developed machine learning model has a bi-functional architecture, which learns, on the one hand, to discriminate the data with respect to their classes and, on the other hand, the importance of the individual signals, i.e., the contribution of each sensor to the decision. We show that the resulting bi-functional model is interpretable, sparse, and simple to integrate into many standard artificial neural networks.
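The bi-functional idea, class discrimination plus per-sensor importance, can be sketched with a relevance-weighted distance, as used in relevance-learning LVQ variants. The weights, prototypes, and sample below are invented; in the actual model the relevance values are learned rather than set by hand.

```python
# Relevance-weighted nearest-prototype classification (sketch).
# Each feature corresponds to one sensor; the relevance weights
# express each sensor's contribution to the decision. Weights,
# prototypes, and samples below are invented for illustration.
def weighted_dist(x, w, relevance):
    return sum(r * (a - b) ** 2 for r, a, b in zip(relevance, x, w))

def classify(x, prototypes, relevance):
    """Return the class label of the nearest prototype."""
    return min(prototypes, key=lambda p: weighted_dist(x, p[1], relevance))[0]

# two sensors; sensor 1 is informative, sensor 2 is mostly noise
relevance = [0.9, 0.1]
prototypes = [("class_A", [0.0, 0.5]), ("class_B", [1.0, 0.5])]

print(classify([0.2, 0.9], prototypes, relevance))  # class_A
```

Reading off the learned relevance vector is what makes the model interpretable: a near-zero weight means the corresponding sensor barely contributes to the decision.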
Current research in identity management is focusing on decentralized trust establishment for distributed identities. One of these decentralized trust models is Self-Sovereign Identity (SSI). With SSI, each entity should be able to independently present and manage provable information about itself as well as request and review evidence from other entities. Using a distributed blockchain, information for verifying the authenticity of this evidence can be obtained from any other entity. This concept can be used not only for people, but also for authentication and authorization during the life cycle of devices in the Internet of Things (IoT). This paper presents an SSI-based concept for authentication and authorization of IoT devices among each other, intended to contribute to the change in trust on the internet. The SSI methodology, employing a blockchain, offers the possibility to establish mutual trust and proof of ownership without relying on any third party. The paper describes the concept, offers a reference implementation, and discusses the approach.
The emerging Internet of Things (IoT) technology interconnects billions of embedded devices with each other. These embedded devices are internet-enabled and collect, share, and analyze data without any human intervention. The integration of IoT technology into the human environment, such as industries, agriculture, and health sectors, is expected to improve the way of life and businesses. At the same time, the emerging technology poses challenges and faces numerous security threats. On these grounds, the security of IoT technology must be strengthened to avoid any compromise that affects human life. Instead of implementing traditional cryptosystems on IoT devices, an elliptic curve cryptosystem (ECC) is used to accommodate the devices' limited resources. ECC is elliptic-curve-based public-key cryptography that provides equivalent security with a shorter key size compared to other cryptosystems such as Rivest–Shamir–Adleman (RSA). The security of ECC hinges on the hardness of solving the elliptic curve discrete logarithm problem (ECDLP). ECC is faster and easier to implement and also consumes less power and bandwidth. Due to these benefits, ECC is incorporated in internationally recognized standards for lightweight applications.
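The group operation underlying the ECDLP can be illustrated on a textbook toy curve over a small prime field. This is a demonstration only: real deployments use standardized curves with roughly 256-bit primes, and the tiny parameters below offer no security.

```python
# Toy elliptic curve y^2 = x^3 + 2x + 2 over GF(17) (illustration
# only; real ECC uses standardized curves with ~256-bit primes).
P_MOD, A = 17, 2
O = None  # point at infinity (group identity)

def add(p, q):
    """Elliptic-curve point addition over GF(P_MOD)."""
    if p is O: return q
    if q is O: return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return O  # p + (-p) = O
    if p == q:
        s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD)
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, P_MOD)
    x3 = (s * s - x1 - x2) % P_MOD
    return (x3, (s * (x1 - x3) - y1) % P_MOD)

def mul(k, p):
    """Double-and-add scalar multiplication k * p."""
    r = O
    while k:
        if k & 1:
            r = add(r, p)
        p = add(p, p)
        k >>= 1
    return r

G = (5, 1)            # generator; the group has order 19
pub = mul(7, G)       # public key for toy private key k = 7
print(pub)            # (0, 6)
```

Recovering k = 7 from G and pub is exactly the ECDLP; it is trivial here but infeasible at cryptographic parameter sizes, which is what ECC's security rests on.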
Global challenges like climate change, food security, and infectious diseases such as the COVID-19 pandemic are nearly impossible to tackle when established experts and upstart innovators work in silos. If research organizations, governments, universities, NGOs, and the private sector could collaborate on these challenges more easily, lasting solutions would certainly come more quickly. Aligned with the United Nations' Sustainable Development Goals, SAIRA connects key players in different arenas: scientists and engineers at research and technology organizations (RTOs) looking to collaborate on sustainable development projects, companies seeking R&D support to tackle their most challenging problems, and startups with innovative ideas and a desire to scale. The platform is a blockchain-secured open innovation platform, anchored on the Max Planck Digital Library's blockchain network bloxberg, that assures the authenticity and integrity of all user-generated content and collaboration processes.
With advances in cryptography and emerging internet technology, electronic voting is gaining popularity, since it ensures ballot secrecy, voter security, and integrity. Many commercial startups and e-Voting systems have been proposed, but due to lack of trust, privacy, transparency, and hacking issues, many solutions have been suspended. Blockchain, along with cryptographic primitives, has emerged as a promising solution due to its transparent, immutable, and decentralized nature. In this paper, we summarize the properties that e-Voting solutions should satisfy and explain cryptographic primitives such as zero-knowledge proofs (ZKPs) and ring signatures, along with their security limitations. We give a comprehensive review of several blockchain-based e-Voting systems and discuss their strengths and weaknesses with respect to the given properties, including a comparison table.
With the increasing usage of blockchain technology, legal challenges such as GDPR compliance arise. In particular, the right to erasure is considered challenging, as blockchains are tamper-proof by design. Several approaches have investigated possibilities to weaken the tamper-proof property of blockchains in favor of GDPR compliance. This paper presents several such approaches and then focuses on chameleon hash functions, evaluating the possibility of using these functions in a private blockchain. The goal of the resulting system is to take a step towards the digitization of the bill of lading used in international trade. The paper describes the developed software as well as the core design considerations, such as network design and block structure.
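The chameleon-hash mechanism can be demonstrated with a discrete-log construction in the style of Krawczyk-Rabin: whoever holds the trapdoor can compute a collision, i.e. rewrite the hashed content without changing the hash value, which is exactly the controlled "editability" needed for erasure. The parameters below are tiny and insecure, chosen for illustration only, and this is a generic textbook construction, not necessarily the one used in the paper's system.

```python
# Toy Krawczyk-Rabin-style chameleon hash (insecure demo parameters).
# CH(m, r) = g^m * h^r mod p with h = g^x; knowing the trapdoor x
# lets one compute a collision, i.e. rewrite m without changing the hash.
p, q, g = 23, 11, 2          # g has order q modulo p
x = 7                        # trapdoor (held by the authorized editor)
h = pow(g, x, p)

def ch(m, r):
    return (pow(g, m, p) * pow(h, r, p)) % p

def collide(m, r, m_new):
    """Find r_new so that ch(m_new, r_new) == ch(m, r), using x."""
    return (r + (m - m_new) * pow(x, -1, q)) % q

m, r = 5, 3                  # original block content and randomness
m_new = 9                    # edited content (e.g. after an erasure request)
r_new = collide(m, r, m_new)
print(ch(m, r) == ch(m_new, r_new))  # True
```

Replacing the ordinary hash in a block header with such a function keeps the chain's hash links valid after an authorized edit, while parties without the trapdoor still cannot forge collisions.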
Recently, deep neural network architectures designed to work on graph-structured data have been attracting notice and are being implemented in various domains and applications. Learning representations (feature embeddings) from graph data is picking up pace in research, while constructing graphs from datasets remains a challenge. The ability to map the data to lower dimensions further simplifies the task and makes many operations easier to apply. The graph neural network (GNN) is one of the novel neural network models attracting attention, as it performs strongly in applications like recommender systems, social networks, chemical synthesis, and many more. This thesis discusses an approach for a fundamental task on graphs: node classification. The feature embedding for a node is aggregated by applying a recurrent neural network (RNN); a GNN model is then trained to classify a node with the help of the aggregated features, and Q-learning supports the optimization of the shape of the neural networks. This thesis starts with the working principles of the feedforward neural network and of recurrent units like the simple RNN, long short-term memory (LSTM), and the gated recurrent unit (GRU), followed by the concepts of reinforcement learning (RL) and the Q-learning algorithm. An overview of the fundamentals of graphs, followed by the GNN architecture and workflow, is discussed subsequently. Some basic GNN models are discussed briefly before the thesis turns to the technical implementation details, the output of the model, and a comparison with a few other models such as GraphSAGE and the graph attention network (GAT).
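One round of the neighborhood aggregation at the heart of a GNN layer can be sketched in plain Python. The graph, the features, and the mean aggregator below are illustrative; the thesis aggregates with an RNN instead of a mean, and adds learned weights on top.

```python
# One GNN message-passing round: each node's new embedding is the
# mean of its own and its neighbors' features. Graph and features
# are invented; the thesis aggregates with an RNN instead.
graph = {0: [1, 2], 1: [0], 2: [0], 3: []}       # adjacency lists
feats = {0: [1.0, 0.0], 1: [0.0, 1.0], 2: [0.0, 1.0], 3: [1.0, 1.0]}

def aggregate(graph, feats):
    new = {}
    for node, nbrs in graph.items():
        group = [feats[node]] + [feats[n] for n in nbrs]
        new[node] = [sum(col) / len(group) for col in zip(*group)]
    return new

emb = aggregate(graph, feats)
print(emb[0])  # node 0's features mixed with neighbors 1 and 2
```

Stacking several such rounds lets each node's embedding absorb information from increasingly distant parts of the graph before a classifier maps embeddings to node labels.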
Learning Vector Quantization (LVQ) methods have been popular choices of classification models ever since their introduction by T. Kohonen in the 1990s. These days, LVQ is combined with deep learning methods to provide powerful yet interpretable machine-learning solutions to some of the most challenging computational problems.
However, techniques to model recurrent relationships in the data using prototype methods still remain quite unsophisticated. In particular, we are not aware of any modification of LVQ that allows the input data to have different lengths. Needless to say, such data are abundant in today's digital world and demand new processing techniques to extract useful information. In this paper, we propose the use of a Siamese architecture not only to model recurrent relationships within the prototypes but also to handle prototypes of various dimensions simultaneously.
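The basic LVQ prototype update that such extensions build on fits in a few lines (an LVQ1-style rule, with invented data): the nearest prototype is attracted to the sample if their classes match and repelled otherwise.

```python
# LVQ1-style prototype update (sketch): attract the nearest prototype
# if its class matches the sample, repel it otherwise. Data invented.
def nearest(protos, x):
    return min(range(len(protos)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(protos[i][1], x)))

def lvq1_step(protos, x, label, lr=0.1):
    i = nearest(protos, x)
    cls, w = protos[i]
    sign = 1.0 if cls == label else -1.0   # attract or repel
    protos[i] = (cls, [wi + sign * lr * (xi - wi) for wi, xi in zip(w, x)])

protos = [("A", [0.0, 0.0]), ("B", [1.0, 1.0])]
lvq1_step(protos, [0.2, 0.0], "A")        # nearest is A, same class
print(protos[0])  # prototype 'A' moved slightly toward the sample
```

The Siamese extension proposed in the paper replaces the fixed-dimensional Euclidean comparison above with a learned embedding, which is what allows inputs and prototypes of different lengths.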
Over the past few years, wind and solar power plants have increasingly contributed to energy production. However, due to fluctuating energy sources, the energy production data contain disruptions. Such disrupted data degrade prediction performance and need to be replaced by estimated values. In this thesis, we provide a comparative study for estimating online disrupted data based on the data of similar groups of power plants. We apply three estimation techniques, namely mean, interpolation, and k-nearest neighbor (KNN), to estimate the disruptions in the training data. We then apply four clustering algorithms, namely k-means, neural gas, hierarchical agglomerative clustering, and affinity propagation, with two similarity measures, Euclidean distance and dynamic time warping (DTW), to form groups of power plants and compare the results. Experimental results show that when KNN estimation is applied to the data, and neural gas and agglomerative clustering with DTW are used to cluster the data, the cluster quality scores and execution time give better results compared to the others. Therefore, we choose KNN estimation to reconstruct the online disrupted data in each group of similar power plants.
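The KNN estimation step can be sketched as follows: a disrupted reading of one plant is replaced by the mean of the readings of its k most similar plants at the same time step, with similarity measured over the undisrupted history. The plant data below are invented, and plain Euclidean similarity stands in for the DTW variant used in the thesis.

```python
# KNN-style estimation of a disrupted value (sketch, invented data):
# replace plant 0's missing reading at time t by the mean reading of
# its k nearest plants, with similarity over the undisrupted history.
def euclid(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def knn_estimate(history, current, target, k=2):
    """history[p] = past series of plant p; current[p] = value at t."""
    others = [p for p in history if p != target]
    others.sort(key=lambda p: euclid(history[p], history[target]))
    return sum(current[p] for p in others[:k]) / k

history = {0: [5.0, 6.0, 5.5], 1: [5.1, 6.1, 5.4],
           2: [5.0, 5.9, 5.6], 3: [9.0, 1.0, 8.0]}
current = {0: None, 1: 5.8, 2: 6.0, 3: 2.0}   # plant 0 disrupted
print(round(knn_estimate(history, current, target=0), 3))  # 5.9
```

Clustering the plants first, as the thesis does, restricts the neighbor search to a group of genuinely similar plants and keeps dissimilar plants (like plant 3 above) from distorting the estimate.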
The application described in this thesis has been created, built and designed to help nurses and other medical personnel around the world access a real-time database storing patient records such as patient name, patient ID, patient age and date of birth, and the symptoms the patient is experiencing. A real-time database is a live database in which all changes are reflected across all devices accessing it. This application will be beneficial especially in countries where access to a computer or medical equipment is not always possible. A phone is always ready to use and within reach, so users of this application can access the data at any time and place. They can add a new patient or search for existing patients. In addition, the application allows taking RAW medical images that can be used to identify anomalies in a blood sample. RAW images are important for this application because they are uncompressed, which means they do not lose any quality or detail. The users of this application are the medical personnel who will be taking care of the patients. These users have to create a profile in the database in order to use the application, since their data, such as the user ID, are used to control how data are retrieved and stored. We also discuss the current and future features of this application, as well as its benefits for medical personnel and patients. Finally, we go over the implementation of such an application from a hardware perspective as well as a software one.
Prototype-based vector quantization is one of the key methods in data processing for tasks like data compression or interpretable classification learning. Prototype vectors serve as references for data and data classes. The data are given as vectors representing objects by numerical features. Famous approaches are the neural gas vector quantizer (NGVQ) for data compression and learning vector quantizers (LVQ) for classification tasks. Frequently, training of those models is time-consuming. In this contribution we discuss modifications of these algorithms adopting ideas from quantum computing. The aim is at least twofold: First, quantum computing provides ideas for enormous speedups, making use of quantum-mechanical systems and inherent parallelization. Second, considering data and prototype vectors in terms of quantum systems, implicit data processing is performed, which frequently results in better data separation. We will highlight respective ideas and difficulties in equipping vector quantizers with quantum computing features.
This bachelor's thesis is about the cis-trans isomerization of Xaa-Pro bonds (Xaa = any amino acid), their quantitative acquisition, and the selection of 3D structure information for prediction with a support vector machine (SVM). The quantitative occurrence of cis, trans, and cis/trans conformations in membrane proteins is examined and evaluated. The 3D structure information comprises 12 features describing the amino acids around, and including, the proline. These include the inside/outside classification, the real secondary structure, energy considerations, as well as five further amino acid occurrence properties within a defined radius of the proline. From this information, a dataset was created for the SVM, which is used for the prediction of unknown and known Xaa-Pro isomerisms. The methods for the analysis were implemented in the platform-independent programming language Java. Two programs emerged from the work: Xaa-PIPT for the quantitative detection and extraction of structural information, and m-Xaa-PIPT for the pure prediction of Xaa-Pro isomerism in protein structures. 389 membrane proteins from the PDB (Protein Data Bank) served as the basis. The data were also statistically analysed and evaluated.
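The construction of a numeric dataset from the sequence window around a proline, as fed to an SVM, can be sketched roughly like this. The tiny hydrophobicity scale and the three derived features are invented stand-ins for the thesis's 12 structural features.

```python
# Sketch: turn the sequence window around a proline into a numeric
# feature vector for an SVM. The hydrophobicity scale below is a
# tiny invented stand-in for the thesis's 12 structural features.
HYDRO = {"A": 1.8, "G": -0.4, "P": -1.6, "L": 3.8, "S": -0.8}

def window_features(seq, pro_idx, radius=2):
    """Features for the residues around (and including) the proline."""
    window = seq[max(0, pro_idx - radius): pro_idx + radius + 1]
    values = [HYDRO.get(aa, 0.0) for aa in window]
    return [sum(values) / len(values), min(values), max(values)]

seq = "GALPSG"
assert seq[3] == "P"
print(window_features(seq, 3))  # mean/min/max hydrophobicity of 'ALPSG'
```

Each proline site in the 389 structures would yield one such vector plus its cis/trans label, forming the training set for the classifier.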
not available
The first part of this study investigates the effects of provocative marketing strategies of different companies in the fashion industry. The thesis emphasizes various strategies used by several firms and furthermore demonstrates the different modes of provocation as well as the process of a marketing strategy. The second part highlights the opportunities and risks of provocative marketing strategies based on American Apparel.
Sequences are an important data structure in molecular biology, but unfortunately it is difficult for most machine learning algorithms to handle them, as they rely on vectorial data. Recent approaches include methods that rely on proximity data, such as median and relational Learning Vector Quantization. However, many of them are limited in the size of the data they are able to handle. A standard method to generate vectorial features for sequence data does not exist yet. Consequently, a way to make sequence data accessible to preferably interpretable machine learning algorithms needs to be found. This thesis will therefore investigate a new approach called the Sensor Response Principle, which is being adapted to protein sequences. Accordingly, sequence similarity is measured via pairwise sequence alignments with different sequence alignment algorithms and various substitution matrices. The measurements are then used as input for learning with the Generalized Learning Vector Quantization algorithm. A special focus lies on sequence length variability as it is suspected to affect the sequence alignment score and therefore the discriminative quality of the generated feature vectors. Specific datasets were generated from the Pfam protein family database to address this question. Further, the impact of the number of references and choice of substitution matrices is examined.
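The alignment-score feature construction, i.e. the "sensor response" of a sequence to a set of reference sequences, can be sketched with a minimal Needleman-Wunsch global alignment score. The match/mismatch/gap scores and the sequences below are invented, whereas the thesis uses proper substitution matrices.

```python
# Minimal Needleman-Wunsch global alignment score, used to turn a
# sequence into a vector of similarities to reference sequences
# ('sensor responses'). Scores and sequences are invented; the thesis
# uses real substitution matrices instead of +1/-1 scoring.
def nw_score(a, b, match=1, mismatch=-1, gap=-2):
    prev = [j * gap for j in range(len(b) + 1)]
    for i in range(1, len(a) + 1):
        cur = [i * gap]
        for j in range(1, len(b) + 1):
            sub = match if a[i - 1] == b[j - 1] else mismatch
            cur.append(max(prev[j - 1] + sub, prev[j] + gap, cur[j - 1] + gap))
        prev = cur
    return prev[-1]

references = ["HEAGAWGHEE", "PAWHEAE", "AWGH"]
query = "HEAGAWGH"
feature_vector = [nw_score(query, ref) for ref in references]
print(feature_vector)
```

The resulting fixed-length vector can then be fed to GLVQ regardless of the original sequence lengths, which is exactly the point of the Sensor Response Principle; the suspected length-dependence of the raw alignment scores is what the thesis's datasets are designed to probe.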
The study “Proteomic and systems biological database analysis of changed proteins from rat brain tissue after diving” concerns the systems biological analysis of proteomic data obtained from rat brain after experimental diving in a pressure chamber. Brain tissue from animals with decompression sickness (DCS) was analyzed by mass spectrometry, yielding two larger sets of modified proteins. The resulting up- and down-regulated proteins were identified and subsequently compared by means of systems biological databases, in this case GeneGo MetaCore™, in order to find similarly or differently affected cell-biological signaling pathways when the two mass-spectrometry methods were compared.
The occurrence of prostate cancer (PCa) has been rising consistently for three decades, and it remains the third leading cause of cancer-related deaths after lung and bowel cancer in Germany. Despite new methods of early detection, such as prostate-specific antigen (PSA) testing, it remains the most common cancer in German men, with over 63,400 new diagnoses in Germany every year, and exhibits high prevalence in other countries of Northern and Western Europe as well [64]. Men over the age of 70 are most commonly affected by the lethal disease, whereas onset before 50 is rare. The malignant prostate tumor can be cured through operation or irradiation as long as the cancer has not reached the stage of metastasis, in which other therapeutic methods have to be employed [14] [15]. In the metastatic phase, the patient usually exhibits symptoms when the tumor's size affects the urethra or the cancer spreads to other tissue, often the bones [16].
The high prevalence of this disease underscores the importance of further research into prognosis and diagnosis methods, whereby the identification of further biomarkers in PCa poses a major topic of scientific analysis. For this task, the effectiveness of high-throughput RNA sequencing of the transcriptome (the RNA molecules of an organism or specific cell type) is frequently exploited [66]. RNA sequencing, or RNA-Seq for short, offers the possibility of transcriptome assessment, enabling the identification of transcriptional aberrations in diseases as well as uncharacterized RNA species such as non-coding RNAs (ncRNAs), which remain undetected by conventional methods [49]. To facilitate interpretation of the sequenced reads, they are assembled to reconstruct the transcriptome as close to the original state as possible, thus enabling rapid detection of relevant biomolecules in the data [49]. Transcriptomic studies often require highly accurate and complete gene annotations on the reference genome of the examined organism. However, most gene annotations and reference genomes are far from complete, containing a multitude of unidentified protein-coding and non-coding genes and transcripts. Therefore, refinement of reference genomes and annotations by inclusion of novel sequences, discovered in high-quality transcriptome assemblies, is necessary [24].
Several algorithms have been proposed for testing series-parallel graphs in linear time. We give alternative algorithms for recognizing series-parallel graphs, computing their tree decompositions, and determining the independence number when the input is an undirected biconnected series-parallel graph; all of them run in (approximately) linear time.
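The classical recognition route behind such algorithms can be sketched concretely: a biconnected multigraph is series-parallel exactly when repeated series and parallel reductions collapse it to a single edge. The sketch below is a simple illustration of the reduction rules (not the linear-time algorithm of the thesis) and assumes the input is given as an edge list.

```python
from collections import defaultdict

def is_series_parallel(edges):
    """Reduce a biconnected undirected multigraph with the two classical
    rules; it is series-parallel iff it collapses to a single edge."""
    count = defaultdict(int)              # multiset of undirected edges
    for u, v in edges:
        count[tuple(sorted((u, v)))] += 1
    changed = True
    while changed:
        changed = False
        # parallel reduction: merge multi-edges into one
        for e in list(count):
            if count[e] > 1:
                count[e] = 1
                changed = True
        # series reduction: splice out a vertex of degree 2
        deg = defaultdict(int)
        for (u, v), c in count.items():
            deg[u] += c
            deg[v] += c
        for w, d in deg.items():
            inc = [e for e in count if w in e]
            if d == 2 and len(inc) == 2:
                (a, b), (c2, d2) = inc
                u = a if b == w else b
                v = c2 if d2 == w else d2
                del count[inc[0]], count[inc[1]]
                count[tuple(sorted((u, v)))] += 1
                changed = True
                break                     # degrees changed, recompute
    return len(count) == 1
```

A triangle reduces to a single edge and is accepted, while K4 admits no reduction at all, matching the forbidden-minor characterization of series-parallel graphs.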
In this thesis, novel proteases from psychrotolerant bacterial strains were isolated and characterized with respect to their biochemical properties. Furthermore, genes of S8 family proteases were amplified, and differences in their amino acid sequences could be linked to the biochemical properties of the proteases.
Probabilistic Micropayments
(2022)
Probabilistic micropayments are an important cryptographic research topic in electronic commerce: they promise efficient payment schemes with low transaction costs and modest computational requirements. To delve into the topic, it is vital to scrutinize cryptographic preliminaries such as hash functions and digital signatures. This thesis investigates the most important probabilistic methods based on centralized or decentralized networks. First, centralized schemes such as lottery-based tickets, PayWord, coin-flipping, and MR2 are described, and an approach based on blind signatures is also discussed. Then, decentralized methods such as MICROPAY3 and a transferable scheme on the blockchain, along with an efficient model for cryptocurrencies, are explained. We then compare the different probabilistic micropayment methods and address their drawbacks with a new technique. To set the results of the theoretical analysis of the different methods into context, we analyze attacks that reduce the security and therefore the efficiency of such systems. In particular, we discuss various methods for detecting double-spending and eclipse attacks.
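The lottery-ticket idea mentioned first can be illustrated concretely: instead of transferring a tiny amount per message, the payer issues a ticket worth a full unit that wins only with probability p, so the expected payment matches. The sketch below is a hypothetical toy in which hash-based commit-and-combine randomness stands in for a real fair coin-flipping protocol.

```python
import hashlib
import secrets

P_INVERSE = 4          # ticket wins with probability 1/4 (toy value)

def commit(value: bytes):
    """Payee commits to its random value; only the hash is revealed first."""
    nonce = secrets.token_bytes(16)
    return nonce, hashlib.sha256(nonce + value).digest()

def open_ok(nonce: bytes, value: bytes, commitment: bytes) -> bool:
    """Payer later checks that the revealed value matches the commitment."""
    return hashlib.sha256(nonce + value).digest() == commitment

def ticket_wins(payer_rand: bytes, payee_rand: bytes) -> bool:
    """The ticket pays out iff the combined randomness hits a 1/P_INVERSE slice."""
    h = hashlib.sha256(payer_rand + payee_rand).digest()
    return int.from_bytes(h, "big") % P_INVERSE == 0
```

With ticket value V and win probability p = 1/P_INVERSE, the expected transfer per message is pV, while only roughly one in P_INVERSE tickets ever touches the costly settlement layer, which is where the low transaction costs come from.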
Robust soft learning vector quantization (RSLVQ) is a probabilistic variant of the learning vector quantization (LVQ) algorithm. RSLVQ describes the data by a Gaussian mixture model, and its cost function is defined in terms of a likelihood ratio. This thesis modifies standard RSLVQ by replacing the Gaussian with non-Gaussian density functions such as the logistic, lognormal, and Cauchy densities (referred to as PLVQ). In this approach, we derive new update rules for the prototypes from the gradient of the cost function with respect to the non-Gaussian density functions. We also derive new learning rules for the model parameters by differentiating the cost function with respect to them. The main goal of the thesis is to compare the performance of the PLVQ models with the Gaussian RSLVQ model. Therefore, the performance of these classification models has been tested on the Iris and Seeds datasets. To visualize the results of the classification models adequately, principal component analysis (PCA) has been used.
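One stochastic RSLVQ update can be sketched as follows. This is a minimal illustration assuming squared Euclidean distance; the density argument shows where the PLVQ idea plugs in, but note that the prototype gradient below uses the Gaussian-specific factor (x - w)/sigma^2, and it is exactly this factor that changes when non-Gaussian densities are used, which is what the thesis derives.

```python
import numpy as np

def rslvq_step(x, y, W, c, sigma=1.0, lr=0.05, density=None):
    """One stochastic RSLVQ update for sample x with label y.
    W: (k, dim) prototype matrix, c: (k,) prototype labels.
    density maps a squared distance to an unnormalised density value;
    Gaussian by default -- a PLVQ variant would swap in e.g. a Cauchy
    density and adjust the gradient factor accordingly."""
    if density is None:
        density = lambda d2: np.exp(-d2 / (2.0 * sigma ** 2))
    d2 = ((W - x) ** 2).sum(axis=1)
    f = density(d2)
    correct = (c == y)
    Py = np.where(correct, f, 0.0) / f[correct].sum()   # P(j | x, y)
    P = f / f.sum()                                     # P(j | x)
    # gradient ascent on the log likelihood ratio (Gaussian form)
    W = W + lr * (Py - P)[:, None] * (x - W) / sigma ** 2
    return W
```

The sign of (Py - P) makes prototypes of the correct class attract the sample while all others are repelled in proportion to their responsibility.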
This thesis deals with the possible integration of social media communication in the marketing of the International Rectifier Corporation. The basis for the implementation of the new communication channel is set through a detailed description of basic, theoretical functions and features of business-to-business communication as well as social media communication. Based on this knowledge the marketing communication of International Rectifier is analyzed and compared to the competitors. The theoretical lessons in combination with the analysis will then be used to develop a competitive and effective social media strategy for International Rectifier.
This thesis investigated the generation of laser induced periodic surface structures (LIPSS) using femtosecond laser irradiation at a central wavelength of 775 nm.
The metals stainless steel and copper as well as a semiconducting thin film, ITO on a glass substrate, were investigated. The impact of the processing parameters was studied for single and multiple pulse irradiation to determine the ablation thresholds of the materials
and the different types of LIPSS. These observations allowed the optimisation of area structuring with regard to processing speed and LIPSS quality.
The feasibility of LIPSS generation under dynamic, real-time polarisation control was then explored. By using a fast-response, liquid-crystal polarisation rotation device, the direction of the linear polarisation of the laser beam could be dynamically controlled and synchronised with the scanning during laser processing. As a result, a range of complex micro- and nano-scale patterns with orthogonal directions of LIPSS were created. The samples were analysed using optical and electron microscopy. The orientation of the LIPSS was also determined from detection of light diffracted by the LIPSS.
Finally, two applications of large area LIPSS patterning were demonstrated, information encoding on metals and periodic structuring of a thin film conducting oxide for solar cells.
PICC Modulation Analysis
(2014)
This diploma thesis discusses interoperability problems between certified RFID readers and transponders based on the ISO/IEC 14443 standard and their root causes.
The main goal is the definition of new test methods and parameters that can supplement the existing test regime for such systems and allow the identification of such problems beforehand.
The Infinica product suite consists of multiple individual microservice applications, mainly gathered around the Infinica Process Engine, which allows the execution of highly individualised process definitions. For estimating process performance, a layered queuing network approach has been applied. In the first step this required the implementation of a basic modelling framework. Subsequently, the implemented framework was used to evaluate the applicability of the approach by creating two models and comparing them with actual performance measurements. Although the calculated results deviated from the expected results, analysis showed that the differences may derive from an inaccurate model. Nevertheless, the general approach seems to be appropriate for the given application as well as for microservices in general, especially when extended with advanced modelling techniques, as the modelled results appear consistent.
This thesis deals with performance measurement during strategy development in purchasing, using DANIELI Österreich as an example. The main objective is to present the system of DANIELI Österreich in order to generate an understanding of performance measurement in practice. The knowledge gained from this is then used in an analysis to show the actual savings in these countries. Due to requirements set by the management, DANIELI Österreich is obliged to source 20% of its purchasing volume in the former Yugoslavia, which is why opportunities and risks in Bosnia and Herzegovina, Serbia, and Croatia are presented in order to optimize the purchasing strategy.
Path decomposition of a graph has received a substantial amount of interest over the past decades because of its applications in algorithmic graph theory and in real-life problems. For the computation of a path decomposition of small width, different heuristic approaches are used. One of the most useful methods is that of Bodlaender and Kloks. In this thesis, we focus on the computation, applications, transformation, and approximation of path decompositions of small width.
It is easy to convert a path decomposition into a nice path decomposition of the same width, which is more convenient for computing graph parameters such as independent sets and chromatic polynomials. Inspired by [28], we give an algorithm to compute the chromatic polynomial of a graph via a nice path decomposition of small width.
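As a small concrete reference for the definitions used above, the sketch below (a straightforward checker written for illustration) verifies the three path-decomposition axioms for a given sequence of bags and returns the width of the decomposition.

```python
def pathwidth_of(bags, edges, vertices):
    """Validate a path decomposition (a sequence of bags) and return
    its width, i.e. the maximum bag size minus one."""
    bags = [set(b) for b in bags]
    # axiom 1: every vertex occurs in some bag
    assert set().union(*bags) == set(vertices)
    # axiom 2: both endpoints of every edge share some bag
    for u, v in edges:
        assert any(u in b and v in b for b in bags)
    # axiom 3: the bags containing a vertex form a contiguous interval
    for v in vertices:
        idx = [i for i, b in enumerate(bags) if v in b]
        assert idx == list(range(idx[0], idx[-1] + 1))
    return max(len(b) for b in bags) - 1
```

For the 4-cycle, for example, the decomposition ({1,2,3}, {1,3,4}) satisfies all three axioms and has width 2.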
In the practice of software engineering, project managers often face the problem of software project scheduling, which is related to the resource-constrained project scheduling problem. In software project scheduling, the main resources are the employees, each with a certain skill set and salary. The main purpose of software project scheduling is to assign the tasks of a project to the available employees such that the total cost and duration of the project are minimized, while ensuring that the constraints of software project scheduling are fulfilled. The software project scheduling problem (SPSP) is a complex combinatorial optimization problem whose search space grows exponentially as the number of tasks and employees increases, which makes it NP-hard. The goal of SPSP is to minimize both the total cost and the duration of the project, which makes it a multi-objective problem. Many algorithms have been proposed that claim to give near-optimal results for NP-hard problems, but only a few yield feasible sets of solutions for SPSP, and more efficient algorithms producing feasible results are still sought.
Nowadays, many such problems are solved with nature-inspired algorithms, because these algorithms balance exploration and exploitation. For solving SPSP, several nature-inspired algorithms have been used, e.g. genetic algorithms, ant colony optimization (ACO), and the firefly algorithm. Nature-inspired algorithms such as particle swarm optimization, genetic algorithms, and ant colony optimization provide more promising results than naive and greedy algorithms; however, there is always room for improvement. The main purpose of this research is to use the bat algorithm to obtain efficient solutions for the software project scheduling problem. In this work a modified bat algorithm is implemented, in which a different approach to the random walk is used. The contributions of this thesis are: (1) to adapt and apply a modified multi-objective bat algorithm for solving SPSP efficiently, (2) to adapt and apply other nature-inspired algorithms, such as genetic algorithms, for solving SPSP, and (3) to compare and analyze the results obtained by the applied nature-inspired algorithms and draw conclusions.
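The basic single-objective bat algorithm that such modified multi-objective versions build on can be sketched as follows, here minimising a toy quadratic. The parameter names follow Yang's original formulation; a multi-objective SPSP version would replace the scalar objective with the (cost, duration) pair plus Pareto bookkeeping, and the constants here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def bat_minimize(obj, dim=2, n=20, iters=200, f_min=0.0, f_max=2.0,
                 alpha=0.9, gamma=0.9, lo=-5.0, hi=5.0):
    """Minimal bat algorithm (after Yang, 2010) for a scalar objective."""
    x = rng.uniform(lo, hi, (n, dim))      # bat positions
    v = np.zeros((n, dim))                 # velocities
    A = np.ones(n)                         # loudness
    r0 = rng.uniform(0.0, 1.0, n)          # maximal pulse rate
    r = np.zeros(n)                        # current pulse rate
    fit = np.apply_along_axis(obj, 1, x)
    best = x[fit.argmin()].copy()
    for t in range(1, iters + 1):
        freq = f_min + (f_max - f_min) * rng.uniform(0.0, 1.0, n)
        v += (x - best) * freq[:, None]
        cand = np.clip(x + v, lo, hi)
        # local random walk around the current best solution
        walk = rng.uniform(0.0, 1.0, n) > r
        noise = rng.standard_normal((int(walk.sum()), dim))
        cand[walk] = np.clip(best + 0.01 * A.mean() * noise, lo, hi)
        cand_fit = np.apply_along_axis(obj, 1, cand)
        accept = (cand_fit < fit) & (rng.uniform(0.0, 1.0, n) < A)
        x[accept], fit[accept] = cand[accept], cand_fit[accept]
        A[accept] *= alpha                 # quieter after success
        r[accept] = r0[accept] * (1.0 - np.exp(-gamma * t))
        if fit.min() < obj(best):
            best = x[fit.argmin()].copy()
    return best, obj(best)
```

The loudness A and pulse rate r schedule the shift from exploration (large jumps via the frequency-tuned velocities) to exploitation (small random walks around the current best), which is the exploration/exploitation behaviour the abstract refers to.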
This work deals with the construction of a microscope for combined total internal reflection fluorescence (TIRF) and confocal microscopy. It is especially designed for single-molecule fluorescence spectroscopy. The design of the microscope body is based on the miCube (Hohlbein lab, Wageningen University, NL). The excitation and detection pathways were adapted to allow both TIRF and confocal illumination as well as camera and point detection for two color channels, enabling single-molecule Förster resonance energy transfer measurements.
A number of real-time PCR approaches have been published in the literature. In this thesis, the suitability of different real-time PCR approaches using hydrolysis probes has been evaluated regarding PCR performance, cost effectiveness, and handling. The effect of double-quenched probes as well as the impact of an increased relative Flap endonuclease amount in quantitative real-time PCR has been examined. In terms of genotyping, a TaqMan™ assay, considered the gold standard in this application, has been tested and compared to phosphorothioate-modified probes, allele-specific primers, SNAKE primers, an allele-specific probe and primer assay, as well as an assay using minor groove binder probes. Promising observations have been made in the case of double-quenched probes, phosphorothioate-modified probes, SNAKE primers, and minor groove binder probes.
The cultivation of mammalian cells in the third dimension has great potential for wide application in regenerative medicine, the pharmaceutical industry, and cancer research. This thesis gives an overview of current 3-D cultivation techniques, such as hydrogels and porous scaffolds, as well as their various materials and modifications. Different products and their implementation for a new application of 3-D cell culture in a laboratory are also described.
This diploma thesis deals with the analysis, evaluation, and recommendation of a calculation method for the axial clamping force and shaft-nut tightening torque of high-precision axial angular-contact ball bearings for screw drives. Furthermore, the influence of the clamping force and shaft-nut tightening torque on different bearing sets and arrangements is examined, and correction factors for them are derived.
Decentralization is one of the key attributes associated with blockchain technology. Among the different developments in recent years, decentralized autonomous organizations (DAOs) have been of growing interest. DAOs are currently a key part of another emerging use case, namely decentralized science (DeSci). Given the novelty of the field, an integrative definition of DeSci has not been established, but some inherent concepts and ideas can be traced back to the Open Science movement. Although the DeSci movement has the potential to benefit the public, for example through funding underrepresented research areas or more inclusive and transparent research in general, some negative aspects of decentralization should not be neglected. Due to the novelty of blockchain and emerging use cases, research can and should precede mass adoption, to which this paper aims to contribute.
Massive multiple-input multiple-output (MIMO), a technique in which the base station of a mobile radio cell is equipped with a large number of antennas, is currently regarded as a promising key technology for meeting the requirements of future fifth-generation wireless communication networks. However, the confident claims about the performance of such systems rest on a theoretical and so far barely practically verified assumption that, owing to the large number of antennas, the wireless channels of different users are mutually independent, i.e. that so-called favourable propagation conditions hold. This master's thesis examines these novel systems from two different perspectives.
In the first part of this thesis, the influence of realistic propagation conditions on the performance of massive MIMO systems is evaluated. To this end, corresponding numerical system simulations are carried out and compared with the results of practical massive MIMO measurement campaigns.
The investigations show that the so-called favourable propagation conditions can be observed only to a limited extent in realistic environments. Traditional channel models therefore lead to an inaccurate estimate of the performance of practical massive MIMO systems. To address this problem, a novel parameterization of the traditional Kronecker model is proposed, so that relevant characteristics of realistic channels are accurately reflected by this model.
This is followed by an investigation of various channel estimation methods for massive MIMO systems under the different channel models by means of numerical simulations. The experiments show that estimation methods derived specifically for massive MIMO under the assumption of favourable propagation conditions suffer a significant performance degradation under realistic channel models.
The second part of this thesis focuses on the application of massive MIMO systems in so-called Internet of Things (IoT) networks. The typically large number of active IoT devices makes efficient scheduling algorithms necessary. A downlink scheduling algorithm is therefore presented that exploits the properties of massive MIMO systems and the typical data-rate requirements of IoT devices. Specifically, it is proposed to divide the IoT users into groups and to serve the groups one after another. The group size is derived using asymptotic properties of massive MIMO systems.
To select the group members, a modified version of the popular semi-orthogonal user selection (SUS) algorithm is proposed. The subsequent numerical simulations confirm that the modified version of SUS eliminates the drawbacks of the original algorithm, which in turn leads to improved data rates in the considered system.
Computationally solving eigenvalue problems is a central problem in numerical analysis and as such has been the subject of extensive study. In this thesis we present four different methods to compute eigenvalues, each with its own characteristics, strengths and weaknesses. After formally introducing the methods we use them in various numerical experiments to test speed of convergence, stability as well as performance when used to compute eigenfaces, denoise images and compute the eigenvector centrality measure of a graph.
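A natural example of such a method, power iteration, fits in a few lines: repeated multiplication by A drives a random start vector toward the dominant eigenvector, with the Rayleigh quotient as the eigenvalue estimate. The sketch below is a minimal illustration and is not claimed to be one of the thesis's four methods.

```python
import numpy as np

def power_iteration(A, iters=1000, tol=1e-12):
    """Dominant eigenpair of a square matrix A via power iteration."""
    rng = np.random.default_rng(1)
    v = rng.standard_normal(A.shape[0])
    v /= np.linalg.norm(v)
    lam = 0.0
    for _ in range(iters):
        w = A @ v
        v = w / np.linalg.norm(w)            # renormalise each step
        lam_new = v @ A @ v                  # Rayleigh quotient estimate
        if abs(lam_new - lam) < tol:         # converged
            return lam_new, v
        lam = lam_new
    return lam, v
```

Convergence is geometric with ratio |lambda_2 / lambda_1|, which is also the method's main weakness when the two leading eigenvalues are close, one of the trade-offs such numerical experiments expose.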
The primary objective of this work and the research at the “Helmholtz-Zentrum für Umweltforschung” was to gain a deeper understanding of the basic transformation processes, especially for nitrogen species, in constructed wetlands. Therefore, two different types of laboratory-scale model systems, run with two different artificial wastewaters, were observed for about four months. Data on three nitrogen species (ammonium, nitrate, nitrite), the physical condition of the pore water, and the carbon sources contained in the water were collected and compared. The present work provides a summary of the current knowledge of the microbial processes in constructed wetlands and the general character of such constructions. It explains the different methods used to obtain the data, which are afterwards discussed with the aid of the created graphs in the final argumentation.
This Bachelor thesis investigates the learning rules of the Hebbian, Oja and BCM neuron models for their convergence to, and the stability of, the fixed points. Existing research is presented in a structured manner using consistent notation. Hebbian learning is neither convergent nor stable. Oja learning converges to a stable fixed point, which is the eigenvector corresponding to the largest eigenvalue of the covariance matrix of the input data. BCM learning converges to a fixed point which is stable, when assuming a discrete distribution of orthogonal inputs that occur with equal probability. Hebbian learning can therefore not be used in further applications, where convergence to a stable fixed point is required. Furthermore, this Bachelor thesis came to the conclusion that determining the fixed points of the BCM learning rule explicitly involves extensive calculation and other methods for verifying the stability of possible fixed points should be considered.
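The convergence claim for Oja's rule can be reproduced numerically in a few lines. The sketch below uses hypothetical parameter and data choices: it trains a single linear neuron with the update w <- w + lr * y * (x - y * w) and checks that w aligns with the principal eigenvector of the input covariance, as stated above.

```python
import numpy as np

rng = np.random.default_rng(0)

def oja_train(X, lr=0.005, epochs=50):
    """Single linear neuron trained with Oja's learning rule."""
    w = rng.standard_normal(X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(epochs):
        for x in X:
            y = w @ x                    # neuron output
            w += lr * y * (x - y * w)    # Hebbian term minus Oja's decay
    return w

# synthetic data whose principal axis is the first coordinate
X = rng.standard_normal((500, 2)) * np.array([3.0, 0.3])
w = oja_train(X)
```

Unlike plain Hebbian learning, the decay term -y^2 w keeps the weight vector bounded, which is why the iteration settles on a unit-norm vector instead of diverging.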
Cryptorchidism describes a disease, in which one or both testes do not descend into the scrotum properly. With a prevalence of up to 10%, cryptorchidism is one of the most common birth defects of the male genital tract. Despite its associated health risks and accompanying economic damage, resulting from surgery and losses in breeding, studies on canine cryptorchidism and its causes are relatively rare. In this study a relational database for genetic causes of cryptorchidism was established and used as a basis for the identification of candidate genes. Associated regions were analysed by nanopore sequencing with the goal to identify genetic variants correlated with cryptorchidism in German Sheep Poodle.
A protein is a large molecule consisting of a vast number of atoms; one can only imagine the complexity of such a molecule. A protein is a series of amino acids that bind to each other to form a specific sequence known as a peptide chain. Proteins fold into three-dimensional conformations (the so-called native structure) to perform their functions. However, not every protein folds into the correct structure, as mutations may occur in its amino acid sequence, and such mutations cause many protein misfolding diseases. Protein folding is thus a central problem in biology. Predicting the change in protein stability free energy caused by an amino acid mutation (ΔΔG) helps to better comprehend the driving forces underlying how proteins fold into their native structures. Measuring the difference in Gibbs free energy therefore provides more insight into how protein folding occurs, and this knowledge might prove beneficial in designing new drugs to treat protein misfolding related diseases. The protein energy profile aids in understanding the relationship between sequence, structure, and function by assigning an energy profile to a protein structure. Additionally, measuring the change in the protein energy profile caused by a mutation (ΔΔE), using an approach derived from statistical physics, leads to a more thorough understanding of the protein structure. In this work, we attempt to show that ΔΔE values approximate ΔΔG values, which suggests that future studies may consider the energy profile to be as good a predictor of protein binding affinity as Gibbs free energy in addressing the protein folding problem.
Anomaly detection is a pressing technical problem in many business enterprises. In this thesis, a combination of the growing neural gas and generalized matrix learning vector quantization is presented as a solution, based on the collected theoretical and practical knowledge. The whole network is described and implemented, along with references and experimental results. The proposed model is carefully documented, and open research questions are stated for future investigations.
In the context of globalization and the internationalization of markets, mergers and acquisitions are becoming increasingly important for transnational corporations and national economies as a form of internationalization and integration and as a way to attract foreign investment. In this paper, the theoretical aspects of mergers and acquisitions are analyzed, the experience of Germany, China, and Russia in attracting investment through mergers and acquisitions is examined, and the success of this method is assessed for each country.
Purpose: This study addresses issues of occupational mental health among nurses. Factors such as role-related, work, and social factors, stress, burnout, depression, absenteeism, and turnover intention guide the research. The purpose of this research paper is therefore to answer the question: "How can we measure the extent to which nurses experience symptoms or risk of depression through various factors such as individual or demographic factors, emotional exhaustion, and stressful working situations?"
Design: Data were collected from nine nurses working for major hospitals located in Germany (Baden-Württemberg: Mannheim and Heidelberg; Bremen), Ukraine, and Ghana.
Methods: The design combines qualitative and quantitative research methods in a survey, which consists of a questionnaire and biographical interviews.
The questionnaire was used to collect data on demographic and job characteristics, job-related stress, emotional labor, and depressive symptoms. The PHQ-9 serves to measure the depressive symptoms of the participants and acts as an instrument to back up the interviews conducted. The questionnaire was evaluated with SPSS version 21.0. Descriptive statistics, correlations, and frequencies were used to analyze and evaluate the data.
Results: The study found that all participants who took part in this survey are depressed, ranging from minimal to moderate depression. The questionnaire classified approximately 20% of the participants as minimally depressed, 40% as mildly depressed, and 30% as moderately depressed. The questions targeted factors such as occupational stress and work strain, as well as recognition and appreciation from patients and the organization. 77.80% stated that they receive no recognition and appreciation from colleagues and patients. 44.40% turned out to be very stressed by their daily work routine, and the other 55.60% find it stressful. 100% identified labor disturbances as a stress factor. None of the participants are pleased with their salary, which leads to job dissatisfaction.
Conclusion: The results show that it is necessary to implement programs for nurses to help reduce job-related stress. Preventive and suitable methods should be considered to reduce mental strain before depression manifests itself.
Clinical Relevance: Introducing programs that may help nurses and their organizations is the work of human resource management in nursing organizations. Nursing administrators have to understand that the pace at which nurses have to work and deal with other stressful situations might cause them to suffer depressive symptoms. In order to help this situation, they can aspire to improve stressful work conditions, develop programs that subdue job-related stress, and minimize the risk of depressive symptoms.
Mathematics Behind the Zcash
(2020)
Among the cryptocurrencies newly developed since Bitcoin, Zcash stands out as one of the strongest, providing both transparency and anonymity to transactions and their users by deploying the strong mathematics of zk-SNARKs.
We discuss zero-knowledge proofs, which are a basic building block providing the functionality of zk-SNARKs, and cover the Schnorr and sigma protocols in their interactive and non-interactive versions. Non-interactive proofs are used in Zcash transactions, where the validity of a sent transaction is established by a cryptographic proof.
Further, zk-SNARK proofs rely on a common reference string as a public parameter when a transaction is made. The proof allows the sender to prove that she knows a secret for an instance such that the proof is succinct, can be verified very efficiently, and does not leak the secret. Non-malleability, small proofs, and very efficient verification make zk-SNARKs a classic tool in Zcash. Since we deal with NP statements, we consider elliptic curve cryptography, which provides the same security as RSA with smaller parameter sizes.
Lastly, we explain the Zcash transaction process after minting a coin; the corresponding transaction completely hides the sender, the receiver, and the amount of the transaction using a zero-knowledge proof.
As future considerations, we discuss possible improvements in terms of decentralization and efficiency by comparing Zcash with the top-ranked cryptocurrencies Ethereum and Monero, privacy preservation against the threat of quantum computers, and enhancements to shielded transactions.
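The Schnorr protocol mentioned above is small enough to sketch completely. The toy below works in a tiny Schnorr group (p = 23, q = 11, chosen purely for illustration) and applies the Fiat-Shamir heuristic to make the proof non-interactive; real systems use elliptic-curve groups of roughly 256-bit order, and zk-SNARKs add succinctness on top of this basic zero-knowledge pattern.

```python
import hashlib
import secrets

# toy Schnorr group: q = 11 divides p - 1 = 22, and g = 4 has order 11 mod 23
p, q, g = 23, 11, 4

def challenge(r: int) -> int:
    """Fiat-Shamir: derive the verifier's challenge by hashing the commitment."""
    return int.from_bytes(hashlib.sha256(str(r).encode()).digest(), "big") % q

def keygen():
    x = secrets.randbelow(q - 1) + 1       # secret key
    return x, pow(g, x, p)                 # public key y = g^x mod p

def prove(x: int):
    k = secrets.randbelow(q - 1) + 1       # ephemeral nonce
    r = pow(g, k, p)                       # commitment
    s = (k + challenge(r) * x) % q         # response
    return r, s

def verify(y: int, r: int, s: int) -> bool:
    # g^s == r * y^c  holds because  g^(k + c*x) = g^k * (g^x)^c
    return pow(g, s, p) == (r * pow(y, challenge(r), p)) % p
```

The proof reveals nothing about x beyond the validity of the statement, because r blinds the response s with the fresh nonce k; this is the pattern that the non-interactive proofs in Zcash transactions generalize.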
Soft Learning Vector Quantisation (SLVQ) and Robust Soft Learning Vector Quantisation (RSLVQ) are supervised data classification methods that have been applied successfully to real-world classification problems. The performance of SLVQ and RSLVQ, however, degrades when they are applied to more complicated classification problems. In this thesis, we introduce modifications to SLVQ and RSLVQ in order to obtain more capable versions of them. Several possibilities to modify SLVQ and RSLVQ are considered; some of them are not successful enough and are included for the sake of completeness. The fruits of the thesis are plenty, including Tangent Soft Learning Vector Quantisation-Strong (TSLVQ-S), together with its more stable version Tangent Robust Soft Learning Vector Quantisation-Strong (TRSLVQ-S), Attraction Soft Learning Vector Quantisation (ASLVQ), and Grassmannian Soft Learning Vector Quantisation (GSLVQ).
With the growing market of cryptocurrencies, blockchain is becoming central to various research areas relevant from a mathematical and cryptographic point of view. Moreover, it is capable of transforming the traditional methods involving centralized network operations into decentralized peer-to-peer functionalities. At the same time, it provides an alternative to digital payments in a robust and tamperproof manner by adding the element of cryptography, consequently making it traversable for an individual who is a part of the blockchain network. Furthermore, for a blockchain to be optimal and efficient, it must handle the blockchain trilemma of security, decentralization, and scalability constraints in an effective manner. Algorand, a blockchain cryptocurrency protocol intended to solve blockchain’s trilemma, has been studied and discussed. It is a permissionless (public) blockchain protocol and uses pure proof of stake as its consensus mechanism.
This master thesis covers two main topics, the sharing economy and risk management, and combines them within the scope of this paper in order to provide a methodology (with Uber chosen as an example) of how a risk management process may be applied to a sharing economy business, as well as to identify which types of risks are of special relevance for such businesses.