Sensor fusion is a crucial topic in many industrial applications. One of the challenging problems is to find an appropriate sensor combination for the dedicated application, or to weight the sensors' information adequately. In our contribution, we apply the sensor fusion concept together with distance-based learning for object classification purposes. The developed machine learning model has a bi-functional architecture: it learns, on the one hand, to discriminate the data regarding their classes and, on the other hand, the importance of the single signals, i.e., the contribution of each sensor to the decision. We show that the resulting bi-functional model is interpretable, sparse, and simple to integrate into many standard artificial neural networks.
The study “Proteomic and systems biological database analysis of changed proteins from rat brain tissue after diving” concerns the systems biological analysis of proteomic data obtained from rat brain after experimental diving in a pressure chamber. Brain tissue from animals with decompression sickness (DCS) was analyzed by mass spectrometry, yielding two larger sets of modified proteins. The resulting up- and down-regulated proteins were identified and then compared by means of systems biology databases, in this case GeneGo MetaCore™, in order to find similarly or differently affected cell biological signaling pathways when the two different mass spectrometry methods were compared.
Evolution of Game Music: a look at characteristic elements of music in video games across time
(2015)
Music in video games is a subject worth studying; nevertheless, it is not yet fully explored. This thesis presents and explains characteristics that all video game music shares and examines them with regard to developments in the history of video games. The thesis contains information about video games that inspired the musical evolution of games or that feature music as a key part, as well as about technological advances that influenced this musical evolution.
Community-acquired pneumonia (CAP) is a very common infectious and sometimes lethal disease. Consequently, it is associated with high costs for diagnosis and treatment. To actually reduce health care costs in this area, diagnosis and treatment must become cheaper to conduct with no loss in predictive accuracy. One effective way of doing so would be the identification of easily detectable and highly specific transcriptomic markers, which would reduce the amount of laboratory work required while possibly enhancing diagnostic capability.
Transcriptomic whole-blood data derived from the PROGRESS study were combined with several documented features such as age, smoking status, or the SOFA score. The analysis pipeline included processing by self-organizing maps for dimensionality and noise reduction, as well as diffusion pseudotime (DPT). Pseudotime enabled modelling a disease course of CAP, where each sample represents a state/time point in the modelled course. Both methods combined resulted in a proposed disease course of CAP, described by 1476 marker genes. An additional gene set analysis also provided information about the immune-related functions of these marker genes.
Decentralizing Smart Energy Markets - tamper-proof-documentation of flexibility market processes
(2020)
The evolving granularity and structural decentralization of the energy system lead to a need for new tools for the efficient operation of electricity grids. Local Flexibility Markets (or "Smart Markets") provide platform concepts for market-based congestion management. In this context there is a distinct need for a secure, reliable, and tamper-resistant market design, which requires transparent and independent monitoring of platform operation. In the following paper, different concepts for blockchain-based documentation of relevant processes on the proposed market platform are described. On this basis, potential technical realizations are discussed. Finally, the implementation of one setup using Merkle tree operations, based on open-source libraries, is presented.
In the context of globalization and the internationalization of markets, mergers and acquisitions are becoming increasingly important for transnational corporations and national economies as a form of internationalization and integration and as a way to attract foreign investment. Within the framework of this paper, the theoretical aspects of mergers and acquisitions are analyzed, the experience of Germany, China, and Russia in attracting investments through mergers and acquisitions is examined, and the success of this method for each country is assessed.
This master thesis covers two main topics, the sharing economy and risk management, and combines them within this paper in order to provide a methodology (with Uber chosen as an example) for how a risk management process may be applied to a sharing economy business, as well as to identify which types of risks are of special relevance for such businesses.
Differentiation is ubiquitous in mathematics and especially in machine learning, where it underlies the calculations in gradient-based models. Calculating gradients can be complex and may require handling multiple variables. Supervised Learning Vector Quantization models, which are used for classification tasks, also rely on stochastic gradient descent to optimize their cost functions. There are various methods to calculate these gradients or derivatives, namely manual differentiation, numeric differentiation, symbolic differentiation, and automatic differentiation. In this thesis, we evaluate each of these methods and compare their use for variants of the Generalized Learning Vector Quantization algorithm.
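The idea behind forward-mode automatic differentiation, one of the four approaches compared, can be sketched with dual numbers; the class and function names below are illustrative and not taken from the thesis:

```python
class Dual:
    """Dual number a + b*eps with eps**2 == 0; b carries the derivative."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def derivative(f, x):
    """Exact derivative of f at x from a single forward pass."""
    return f(Dual(x, 1.0)).der

# d/dx (3x^2 + 2x) at x = 2 is 6*2 + 2 = 14
assert derivative(lambda x: 3 * x * x + 2 * x, 2.0) == 14.0
```

Unlike numeric differentiation, this yields machine-precision derivatives; unlike symbolic differentiation, no expression swell occurs.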
Over recent years, Maximal Extractable Value (MEV) has gained significant importance within the decentralized finance (DeFi) ecosystem. Remarkably, within just two years of its emergence, MEV has seen an extraction of approximately 600 million USD - a phenomenon that has sparked concerns regarding potential threats to blockchain stability.
With growing interest in the Ethereum network and the growing DeFi sector, research surrounding MEV has substantially increased. This work aims to offer a comprehensive understanding of MEV. Additionally, this research quantifies the largest types of MEV (Arbitrage, Sandwich and Liquidations) from March 2022 to March 2023. The data are then compared to other sources, revealing a general upward trend, with a particularly noticeable increase in Sandwich Attacks.
This thesis examines the platform YouTube and whether "being a YouTuber" qualifies as a profession, and what leads to this conclusion. The author combines existing scientific data with information provided by YouTubers doing this as a job, using the compilation method. The author merges that material to create a bachelor thesis that covers both the theoretical and the practical approach. The aim was to find out whether there is a recipe for success that leads to the views and clicks that are essential for the profession of a YouTuber. To this end, the author created two channels to see how the factors mentioned in this thesis are applied and whether the approach leads to success. The findings of this thesis showed that, although the profession of a YouTuber can be classified as a job, it needs to be viewed differently from commonly known and socially accepted careers. Becoming a YouTuber and making money from this business, therefore, cannot be guaranteed.
In this work, a transgenic zebrafish line that expresses the fluorophore dsRed under the endogenous zebrafish cochlin promoter is to be established using the CRISPR/Cas9 system. dsRed was cloned into a pBluescript vector, followed by the cloning of the cochlin locus into this vector. This bait construct was then to be microinjected into wild-type AB zebrafish embryos. The microinjection of Cas9 mRNA, single guide RNA, and a bait construct was practiced with the tyrosinase gene, which was disrupted using CRISPR/Cas9.
The digital transformation of higher education demands effective and efficient methods for learning support and for the assessment of learning processes. This paper relates learning support and assessment to each other in the context of learning management systems. It refers to previous studies carried out in multiple introductory economics courses at the University of Applied Sciences Mittweida, which examine possible connections between the use of digital tests and learning success and investigate students' acceptance and self-perceived learning success with respect to the web-based portion of a blended course and a purely online-based course. Based on a survey (n = 71) and a quantitative analysis (n = 214) of logging and exam assessment data, the previous work shows that students approached the web-based course portion with rather reserved attitudes. Still, they perceived the individual course elements, namely videos, podcasts, interactive worksheets, online tests, and a comprehensive PDF file, to be beneficial to their learning experience. In particular, we could show a positive correlation between the points students achieved in the online tests and their exam results.
The set of transactions that occurs on the public ledger of an Ethereum network in a specific time frame can be represented as a directed graph, with vertices representing addresses and an edge indicating the interaction between two addresses.
While there exists preliminary research on analyzing Ethereum networks by means of graph analysis, most existing work focuses either on the public Ethereum Mainnet or on analyzing the different semantic transaction layers using static graph analysis in order to carve out the network properties (such as interconnectivity, degrees of centrality, etc.) needed to characterize a blockchain network. By analyzing the consortium-run bloxberg Proof-of-Authority (PoA) Ethereum network, we show that we can identify suspicious and potentially malicious behaviour of network participants by employing statistical graph analysis. We thereby show that it is possible to identify the potentially malicious exploitation of an unmetered and weakly secured blockchain network resource. In addition, we show that Temporal Network Analysis is a promising technique for identifying the occurrence of anomalies in a PoA Ethereum network.
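The flavor of statistical graph analysis described above can be sketched in miniature: flagging addresses whose transaction out-degree is a statistical outlier. This is an illustrative z-score test of our own devising, not the method actually used in the paper:

```python
from collections import Counter
from statistics import mean, stdev

def out_degrees(edges):
    """edges: list of (sender, receiver) address pairs from transactions."""
    return Counter(sender for sender, _ in edges)

def suspicious_senders(edges, z_threshold=3.0):
    """Flag addresses whose out-degree z-score exceeds the threshold."""
    degrees = out_degrees(edges)
    values = list(degrees.values())
    mu, sigma = mean(values), stdev(values)
    return {addr for addr, d in degrees.items()
            if sigma and (d - mu) / sigma > z_threshold}

# One address spamming an unmetered resource stands out immediately:
edges = [(f"addr{i}", "contract") for i in range(20)]
edges += [("spammer", "contract")] * 100
assert suspicious_senders(edges) == {"spammer"}
```

On a real network the same idea would be applied per time window (temporal network analysis) so that bursts of anomalous activity become visible.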
We investigate the folding and thermodynamic stability of a tertiary contact of baker's yeast ribosomal ribonucleic acid (rRNA), which is thought to be essential for the maturation process of ribosomes in eukaryotes at lower temperatures [1]. Ribosomes are cellular machines essential for all living organisms. RNA is at the center of these machines and responsible for the translation of genetic information into proteins [2,3]. Only recently, the rRNA tertiary contact of interest was discovered in Zurich by the research group of Vikram Govind Panse. Gerhardy et al. [1] showed in vitro that, within the 60S pre-ribosome under defined metal ion concentrations, the tertiary contact becomes visible between a GAAA tetraloop and a kissing-loop motif. Our aim is now to understand this RNA structure, especially the formation of the rRNA tertiary contact, in terms of thermodynamics and kinetics under various experimental conditions, such as temperature and the metal ion concentrations of K(I), Na(I), and Mg(II). To this end, we use optical spectroscopy, namely UV/VIS spectroscopy and ensemble Förster (fluorescence) resonance energy transfer (FRET) folding studies. Our findings will help to further characterize this newly discovered ribosomal RNA contact and to elucidate its function within the ribosomal maturation process.
The almost complete transcription of the human genome yields a high number of transcripts that do not encode proteins. However, the functional elucidation of especially long non-coding RNAs is still difficult. Secondary structure analysis is assumed to be a possible method for detecting functional relationships of lncRNAs on a large scale, but it is still time-consuming and error-prone. GRAPHCLUST, currently the most suitable clustering tool based on RNA secondary structure analysis, mainly lacks an efficient method for interpreting its results. Hence, an independent and interactive RNA clustering interpretation tool was developed to allow visualisation and efficient analysis of RNA clustering results.
The aim of this master thesis is to describe the key factors of successful energy efficiency projects. In particular, local conditions of such projects in Kazakhstan will be emphasized and a country-specific guideline will be provided at the end. The following topics will be covered in this thesis: energy efficiency technologies, financing, and capacities. The first part examines the energy efficiency approaches and their potential in the local industry. The second part deals with available financing methods, their specific characteristics and appropriateness for overcoming investment barriers in Kazakhstan. The third part of the master thesis concerns necessary project capacities. The application of the three elements for successful project implementation is described in the end.
As economies become more and more interconnected, the importance of the global logistics sector has grown accordingly. However, both structural challenges and current events have led to recent supply chain disruptions, exposing the vulnerabilities of the sector. Simultaneously, blockchain has emerged as a key innovative technology, with use cases going far beyond the exchange of virtual currencies. This paper aims to analyze how the technology is transforming global logistics and its challenges. To this end, six use cases are presented to give an overview of the technological possibilities of blockchain and smart contracts. The analysis takes theoretical approaches from scientific journals and combines them with findings from real-world implementations. The paper finds that the technology can change supply chain design fundamentally, with processes and decisions being automated and power within supply chain structures shifting. However, implementations also face technological, environmental, and organizational challenges that need to be solved for widespread adoption.
The Media System of Malawi
(2010)
A relatively new research field of the neurosciences, called connectomics, aims to achieve a full understanding and mapping of the neural circuits and fine neuronal structures of the nervous system in a variety of organisms. This detailed information will provide insight into how our brain is influenced by different genetic and psychiatric diseases, how memory traces are stored, and how ageing influences our brain structure. It is beyond question that new methods for data acquisition will produce large amounts of neuronal image data. These data will exceed the zettabyte range and are impossible to annotate manually for visualization and analysis. Nowadays, machine learning algorithms, and especially deep convolutional neural networks, are heavily used in medical imaging and computer vision, which brings the opportunity of designing fully automated pipelines for image analysis. This work presents a new automated workflow based on three major parts, including image processing using consecutive deep convolutional networks, a pixel-grouping step called connected components, and 3D visualization via neuroglancer, to achieve a dense three-dimensional reconstruction of neurons from EM image data.
Biological ammonium oxidation is a central component of the global nitrogen cycle. Given the enormous amounts of nitrogen of anthropogenic origin in the environment, the removal of reactive nitrogen is in the interest of both the environment and public health. The following thesis investigates conditions for anaerobic ammonium oxidation with nitrate in an anammox reactor. Two laboratory reactors, containing exclusively ammonium and nitrate as electron donors and acceptors, were operated and monitored for a total of 116 days. In addition, batch cultures were grown with cells from one reactor, and their gas composition was examined as a function of different properties. A number of different analytical quantification methods were used, and it could be shown that degradation takes place under these conditions.
Current research on this reaction is sparse, which lends the bachelor thesis its relevance.
It is possible to obtain a common update rule for the k-means and Neural Gas algorithms by using a generalized Expectation Maximization method. This result is used to derive two variants of these methods. The use of a similarity measure, specifically the Gaussian function, provides another clustering alternative to the aforementioned methods. The main benefit of using the Gaussian function is that it inherently looks for a common cluster center for similar data points (depending on the value of the parameter s). In different experiments we report similar behaviour of the batch and the proposed variants. We also show some useful results for the "alternative" similarity method, specifically when there is no clue about the number of clusters in the data sets.
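The EM-style batch update underlying k-means can be sketched in one dimension, with the E-step (assignment) and M-step (center update) made explicit; this is a generic illustration, not the authors' implementation:

```python
import random

def batch_kmeans(data, k, iters=50, seed=0):
    """EM-style batch k-means: E-step assigns points, M-step moves centers."""
    rng = random.Random(seed)
    centers = rng.sample(data, k)
    for _ in range(iters):
        # E-step: assign each point to its nearest center
        clusters = [[] for _ in range(k)]
        for x in data:
            nearest = min(range(k), key=lambda j: (x - centers[j]) ** 2)
            clusters[nearest].append(x)
        # M-step: each center becomes the mean of its assigned points
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return sorted(centers)

# Two well-separated 1-D clusters are recovered exactly:
assert batch_kmeans([0, 1, 2, 10, 11, 12], 2) == [1.0, 11.0]
```

The Neural Gas variant differs only in the E/M steps: every center receives a rank-based soft weight instead of the hard nearest-center assignment used here.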
The objective of this bachelor project is the creation of a tool to support forensic investigators during IT forensic interventions. It uses Kismet as the base program and adds functionalities to it via the plugin interface. The thesis explains the installation of the plugin and how it works, and gives a recommendation on how to use it. To convey the underlying basics, an introduction to WLAN and Bluetooth is given. The tests performed with the new plugin are described, as well as their results. It is also briefly discussed why the tool is applicable for locating Wi-Fi devices, especially access points, but not Bluetooth devices. Building on all this, a few ideas on how to improve the tool and what could be researched further in this area are provided.
Sequences are an important data structure in molecular biology, but unfortunately it is difficult for most machine learning algorithms to handle them, as they rely on vectorial data. Recent approaches include methods that rely on proximity data, such as median and relational Learning Vector Quantization. However, many of them are limited in the size of the data they are able to handle. A standard method to generate vectorial features for sequence data does not exist yet. Consequently, a way to make sequence data accessible to preferably interpretable machine learning algorithms needs to be found. This thesis therefore investigates a new approach, called the Sensor Response Principle, which is adapted to protein sequences. Accordingly, sequence similarity is measured via pairwise sequence alignments with different sequence alignment algorithms and various substitution matrices. The measurements are then used as input for learning with the Generalized Learning Vector Quantization algorithm. A special focus lies on sequence length variability, as it is suspected to affect the sequence alignment score and therefore the discriminative quality of the generated feature vectors. Specific datasets were generated from the Pfam protein family database to address this question. Further, the impact of the number of references and of the choice of substitution matrices is examined.
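The pairwise alignment scores used as input features can be computed, for instance, with the classic Needleman-Wunsch dynamic program; the sketch below uses a flat match/mismatch score and a linear gap penalty for simplicity, whereas the thesis compares various substitution matrices:

```python
def global_alignment_score(a: str, b: str, match=1, mismatch=-1, gap=-2):
    """Needleman-Wunsch global alignment score, row-by-row DP table."""
    n, m = len(a), len(b)
    prev = [j * gap for j in range(m + 1)]   # first row: all-gap prefix
    for i in range(1, n + 1):
        cur = [i * gap] + [0] * m            # first column: all-gap prefix
        for j in range(1, m + 1):
            sub = match if a[i - 1] == b[j - 1] else mismatch
            cur[j] = max(prev[j - 1] + sub,  # align a[i-1] with b[j-1]
                         prev[j] + gap,      # gap in b
                         cur[j - 1] + gap)   # gap in a
        prev = cur
    return prev[m]

# Identical sequences score len(seq) * match:
assert global_alignment_score("ACGT", "ACGT") == 4
```

Such scores against a fixed set of reference sequences ("sensors") form the feature vector that a distance-based classifier like GLVQ can then learn from.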
In the field of satellites, it is common practice to combine multiple ground stations into one network in order to increase communication times with satellites. This work focuses on TIM, an international academic collaborative project. Important criteria for this project are elaborated and used to evaluate existing ground station networks. The work concludes that there is no appropriate solution available for this specific use case and proposes a solution. The proposed ground station network software is elaborated and evaluated.
Prototype-based vector quantization is one of the key methods in data processing tasks such as data compression and interpretable classification learning. Prototype vectors serve as references for data and data classes. The data are given as vectors representing objects by numerical features. Famous approaches are the Neural Gas Vector Quantizer (NGVQ) for data compression and Learning Vector Quantizers (LVQ) for classification tasks. Frequently, training of those models is time-consuming. In this contribution we discuss modifications of these algorithms adopting ideas from quantum computing. The aim of this is at least twofold: First, quantum computing provides ideas for enormous speedup, making use of quantum mechanical systems and inherent parallelization. Second, when considering data and prototype vectors in terms of quantum systems, implicit data processing is performed, which frequently results in better data separation. We will highlight respective ideas and difficulties when equipping vector quantizers with quantum computing features.
This work examines the impact Web 2.0 has on CRM in journalism. For this purpose, the communication strategies of one international, one New Zealand, and one exclusively online men's magazine are compared. Through this comparison, changes in the magazines' approach to CRM are identified, and expert interviews with editors give further insight into the dynamics of the evolution that CRM and the journalism industry are going through. Finally, the conclusion illuminates the effects this evolution has on CRM in journalism.
Internationalization and business expansion appear to be among the most challenging processes in conducting business today. Every step of the foreign market entry process and of establishing overseas operations is full of obvious risks and hidden pitfalls. A theoretical background, combined with vital practice, plays the key role in such a complicated business process; such information can be used as a guideline by further market entrants and players. At present, Germany, with its well-developed engineering industry, represents a broad space for research on the internationalization process in its different forms, and can show both successful and negative results of foreign market entries.
Different small-molecule kinase inhibitors, which have an influence on cell growth, proliferation, and cell survival, were tested alone and in combination with Erlotinib in the Erlotinib-resistant non-small cell lung cancer cell line PC-9ER and with Cisplatin in the K-Ras-mutant cell line H358. The aim was to find out which combinations produce the best antiproliferative effects in non-small cell lung cancer cell lines.
This bachelor thesis was executed for the Interpipe company and concentrates on its business strategy on international markets, especially on the Middle Eastern pipes market. Choosing an inappropriate entry business strategy can lead to significant negative consequences; business strategy selection on international markets is one of the most critical decisions in the international trade system. The theoretical framework of the bachelor thesis is provided in the second chapter, which was mainly compiled through desk study. The theory review contains a description of various foreign market strategies, methods and mechanisms of decision-making, and levels and types of business environment. A combination of theories is adopted to facilitate the process of gathering the requested information. The third chapter contains information about the Interpipe Company and its economic activity in the host country and abroad. General information about the Interpipe Company, its current position, and its business development strategy for the years 2015-2016 is presented. The situation on the pipes and wheels market in Ukraine during the period 2014-2015 was analyzed, and on this basis the reasons for the decline in profits and sales were educed. At the same time, the forms of penetration of the Interpipe Company into foreign countries were considered. In this regard, the most successful entry forms are suggested to be accepted as the main key strategy of penetration into the international market. The fourth chapter provides information about the particular approach of the Interpipe Company to penetrating the Middle Eastern pipe market. The purpose is to increase the number of deliveries to oil and gas companies in this region and to continue establishing its relations with key agents and distributors. The project aims to elevate the current position of the enterprise on the Middle Eastern pipes market and to build advantageous international relations for both counterparts.
Data is collected from various sources, including books and journals for the theoretical framework, and newspapers, the company's published reports, press releases, catalogues, bulletins, brochures, presentations, Internet resources, etc. for the empirical study.
The research of this thesis aims to analyze how a specific CSR approach of the Adidas Group on sustainability is perceived globally, based on an analysis of the movements on the stock market combined with a sentiment analysis of tweet activities on Twitter. The thesis analyzed both positive feedback and criticism from customers worldwide regarding the approach and other initiatives of the Adidas Group and their partner Parley for the Oceans, a non-governmental organization working towards a more sustainable world.
This master thesis was developed based on public information about Linde AG. It analyzed and evaluated macroeconomic factors influencing the performance of the company. Microeconomic and macroeconomic indicators play a central role in the financial management of every global company. Thus, performance measurement is important for understanding the value and extent of the environment. The study aims at estimating the extent to which a company may operate on the global market and which factors contribute most to its performance.
Firstly, the thesis examines the theoretical background based on previous research. It defines the specific macroeconomic and microeconomic factors and their role in the company's performance. Afterwards, the thesis analyses Linde AG's activities on domestic and foreign markets. The present structure, the current position in the markets, and financial indicators are analyzed. A correlation and regression analysis was carried out with the aim of finding links between the company's performance and the macroeconomic environment. It is believed that inflation, exchange and interest rates, as well as the stock market index, have a significant influence on Linde's performance.
The results showed that the inflation rate and the stock market index play a significant role in Linde's performance. However, when it comes to exchange rates, more data need to be evaluated in order to derive concrete conclusions.
In this work, we identify similarities between adversarial examples and counterfactual explanations, extend differences already stated in previous works to further aspects such as dimensionality and transferability, and try to observe these similarities and differences in different classifiers with tabular and image data. We note that this topic is an open discussion; the work here is not definitive and can be further extended or modified in the future if new discoveries are made.
In the past few years, social media has become the most popular communication medium, replacing phone calls, text messages, television, and even advertisements. Social media has become the most important channel for spreading opinions. As a result of this trend, many politicians have also started to operate social media accounts (Wang, Tsai, & Chen 2019). This study was conducted in order to understand whether there was an inter-candidate agenda-setting effect between the Facebook posts of legislative candidates and presidential candidates during the election period, and whether the legislative candidates' Facebook posts were influenced by the presidential candidates' Facebook posts. The target population of this study was the three presidential candidates in Taiwan's 2020 presidential election — Dr. Tsai Ing-Wen, Mr. Han Kuo-Yu, and Mr. James Soong — as well as the 36 legislative candidates in Taipei, Taichung, and Kaohsiung.
The study focused on Facebook posts from 1 November 2019 to 10 January 2020, the 10 weeks before voting day. Text mining and cosine similarity were used to organize the posts and compare the similarity between them. Finally, the similarity between posts was presented as a line graph.
The study revealed that there was an inter-candidate agenda-setting effect between legislative candidate posts and presidential candidate posts, and that Dr. Tsai Ing-Wen, who was also the incumbent president during the campaign, was the most influential Facebook poster during the entire election.
Future research is proposed on the inter-candidate agenda-setting effect, analyzing only the similarity of posts among the candidates in order to discuss the influence of the candidates' Facebook agenda-setting during a specific election period.
This is the first study in which the Facebook posts of Taiwanese politicians are analyzed and the relationships between them systematically compared across multiple degrees, which opens up a whole new subject for future elections in Taiwan.
A Systematic Literature Review on Blockchain Oracles: State of Research, Challenges, and Trends
(2023)
To enable data exchange between a blockchain protocol (on-chain) and the real world (off-chain), e.g., non-blockchain-based applications and systems, a piece of software called an oracle is used [3]. The blockchain oracle is an important component in the use of off-chain data for on-chain smart contracts. However, there is limited scientific literature available on this important blockchain topic. Therefore, in this paper, a novel systematic literature review based on intelligent methods, e.g., information linking, topic clustering, and focus identification through frequency calculations, is proposed. Thus, the current state of scientific research interest, content and challenges, and future research directions for blockchain oracles are identified. This paper shows that there is little unbiased literature that does not call oracles a problem. From the results of this new literature review framework, relevant areas of data handling and verification with blockchain oracles are identified for future research.
In this work, a novelty detection framework proposed by M. Filippone and G. Sanguinetti is considered, which is useful especially when only a few training samples are available. It is restricted to Gaussian mixture models and makes use of information theory, applying the Kullback-Leibler divergence. In this work, two variations of the framework are presented, applying the symmetric Hellinger divergence and a statistical likelihood approach.
After creating a new blockchain transaction, the next step usually is to make miners aware of it by having it propagated through the blockchain’s peer-to-peer network. We study an unintended alternative to peer-to-peer propagation: Exclusive mining. Exclusive mining is a type of collusion between a transaction initiator and a single miner (or mining pool). The initiator sends transactions through a private channel directly to the miner instead of propagating them through the peer-to-peer network. Other blockchain users only become aware of these transactions once they have been included in a block by the miner. We identify three possible motivations for engaging in exclusive mining: (i) reducing transaction cost volatility (“confirmation as a service”), (ii) hiding unconfirmed transactions from the network to prevent frontrunning and (iii) camouflaging wealth transfers as transaction costs to evade taxes or launder money. We further outline why exclusive mining is difficult to prevent and introduce metrics which can be used to identify mining pools engaging in exclusive mining activity.
Derived from the Ancient Greek word τραῦμα (engl. wound, damage), the word trauma refers to either physical or emotional wounds. Nowadays, it is mostly used in the context of psychological wounds, inflicted by an identity-shattering event – an event that causes the traumatised to no longer be able to reconcile their lived reality with the expectation of a universal human experience. The last decade, the last two years in particular, and the last two weeks ad absurdum, have scarred the global landscape of human existence beyond recognition. From Putin’s unexpected reimposition of mutually assured destruction doctrines via the global SARS-CoV-2 pandemic to the lingering threat of climate doom, people all over the globe have been faced with persistent threats to their most basic perceptions of ontological safety. This article seeks to examine the impact of the SARS-CoV-2 pandemic and to which degree it is justified to speak of a pandemic trauma. In addition, it engages with the liminality of pandemic trauma as a shared, collective and an isolated, individual experience, and potential mitigation strategies for building community resilience.
The Tutte polynomial is an important tool in graph theory. This thesis provides an introduction to the two-variable polynomial using the spanning-subgraph and rank-generating polynomials. The equivalence of the definitions is shown in detail, as are evaluations and derivatives. Properties and examples of the polynomial, i.e., universality, coefficient relations, closed forms and recurrence relations, are covered. Moreover, the thesis examines the connection between the dichromate and other significant polynomials.
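As a small illustration of the rank-generating (Whitney) definition mentioned above, the Tutte polynomial of a small graph can be evaluated by summing over all spanning subgraphs A ⊆ E. The graph representation and function names below are my own illustrative choices, not the thesis's:

```python
from itertools import combinations

def components(vertices, edges):
    """Number of connected components, via a small union-find."""
    parent = {v: v for v in vertices}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v
    for u, w in edges:
        parent[find(u)] = find(w)
    return len({find(v) for v in vertices})

def tutte(vertices, edges, x, y):
    """Evaluate T_G(x, y) = sum over A of (x-1)^(r(E)-r(A)) (y-1)^(|A|-r(A)),
    where the rank r(A) = |V| - c(A) counts vertices minus components."""
    n = len(vertices)
    r_E = n - components(vertices, edges)
    total = 0
    for k in range(len(edges) + 1):
        for subset in combinations(edges, k):
            r_A = n - components(vertices, list(subset))
            total += (x - 1) ** (r_E - r_A) * (y - 1) ** (k - r_A)
    return total

# Triangle K3: T(x, y) = x^2 + x + y, so T(1, 1) = 3 spanning trees.
K3 = ([1, 2, 3], [(1, 2), (2, 3), (1, 3)])
print(tutte(*K3, 1, 1))  # 3
```

Well-known evaluations serve as sanity checks: T(1,1) counts spanning trees, T(2,1) counts forests, and T(2,2) = 2^|E|. The brute-force enumeration is exponential in |E| and only meant for tiny examples.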
More than 10 years after the invention of Bitcoin, the underlying blockchain technology is having an increasing effect on today’s society. Although one of the most popular application areas of blockchain is still the field of cryptocurrencies, the technological concepts are crossing into further application domains such as international supply chains. Fast-changing markets, high costs of time and risk management, as well as biased relationships between the actors pose big challenges to appropriate supply chain management. Based on a case study about sensor tracking, this paper explores the potential impact of blockchain on small and medium enterprises within an international supply chain. We show that blockchain technology offers high potential to reduce inequalities in power relations between the actors involved in supply chains. To this end, the requirements for the use of blockchain in supply chain management are analyzed by means of a conducted case study and an expert survey of the companies concerned.
This thesis focuses on the optimization and improvement of IP network and IP transit operations, strategy and service offerings. To this end, it offers suggestions in the areas of engineering, business, strategy and operations. The thesis is written in English, as the topic itself is mainly handled in the English language. The first part identifies and evaluates methods that are helpful for improving the practical work that is the focus of the second part.
Since its foundation as an application of algebra, coding theory has been gaining importance day by day. For instance, any communication system needs the concepts of coding theory to function efficiently. In this thesis, the reader will find an introductory explanation of linear codes and binary Hamming codes, including some of the algebraic tools devised in their applications. All the described software applications are verified with SageMath 9.0 on Hochschule Mittweida’s JupyterHub.
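Although the thesis works in SageMath, the core mechanics of a binary Hamming code can be sketched in a few lines of plain Python. The (7,4) bit layout and helper names below are illustrative assumptions, not taken from the thesis:

```python
def hamming74_encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a (7,4) Hamming codeword.
    Parity bits sit at positions 1, 2 and 4 (1-based), each covering the
    positions whose binary index contains that power of two."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # covers positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4   # covers positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4   # covers positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Recompute the three parity checks; the syndrome read as a binary
    number is the 1-based position of a single bit error (0 = no error)."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3
    if pos:
        c[pos - 1] ^= 1
    return c

word = hamming74_encode([1, 0, 1, 1])
corrupted = list(word)
corrupted[2] ^= 1                      # flip one arbitrary bit
assert hamming74_correct(corrupted) == word
```

The same construction is what SageMath exposes more generally through its linear-code machinery; the point here is only the syndrome-equals-error-position trick.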
This master's thesis covers the formation of customer relationships in the IT outsourcing market, using the example of the “ABC” company. Most works related to IT outsourcing cover the problems of implementing IT services and the process of providing them to customers, and almost all issues are covered from the perspective of consumers. Thus, the problems and results of outsourcing providers of IT services remain almost unexamined. This master's thesis reveals the specific features of the IT outsourcing business in Belarus and develops an approach to the formation and construction of a system of relationships between the company and its clients as a source of increased competitiveness.
Applications and Potential Impacts of Blockchain Technology in Logistics and Supply Chain Areas
(2022)
The aim of the present thesis is to analyze the applications and potential impacts of blockchain technology in the logistics and supply chain areas. For this purpose, literature from different sources has been used to analyze and obtain an overview of the current status and role of blockchain technology within the logistics and supply chain areas. Different use cases, as well as pilot projects from organizations all over the world and also from Germany, have been included. Suggestions for further applications and implementations of blockchain technology, along with their potential impacts, have been made. Additionally, the cost of implementing blockchain-based solutions and applications has been estimated, along with recommendations and suggestions for key points to be considered before deciding to implement blockchain-based solutions in any organization.
In today’s market, the process of dealing with textual data for internal and external processes has become increasingly important and more complex for certain companies. In this context, the thesis aims to support the analysis of similarities among textual documents by analyzing the relationships among them. The proposed analysis process includes discovering similarities among these financial documents as well as possible patterns. The proposal is based on the exploitation and extension of already existing approaches as well as on their combination with well-known clustering analysis techniques. Moreover, a software tool has been implemented for the evaluation of the proposed approach and evaluated on the EDGAR filings on the basis of qualitative criteria.
Safety, quality, and sustainability concerns have arisen from global supply chains. Stakeholders incur risk regarding these factors, given their significance and complexity. Thus, each business's supply chain risk management must prioritize product characteristics. Accordingly, an effective traceability solution that can monitor and regulate product and supply chain aspects is crucial, especially in the given scenario. This research paper elucidates the potential of smart contracts on a blockchain to enhance the efficacy of business transactions and to ensure comprehensive traceability within the supply chain of paper-based coffee cups. Improved levels of transaction transparency and security over traditional supply chains have been achieved through the digitization of supply chain ecosystem interactions and transactions. This approach makes verifying sources, manufacturing procedures, and quality standards easier in complex supply chains. Accordingly, the integration helps stakeholders monitor and track the whole ecosystem, promoting transparency, predictability, and dependability.
We present dimensionality reduction methods such as autoencoders and t-SNE for the visualization of high-dimensional data in a two-dimensional map. In this thesis, we first implement basic and deep autoencoders using the breast cancer and mushroom datasets. Next, we apply another dimensionality reduction method, t-SNE, to the same datasets. The obtained visualization results are documented in the experiments section of the thesis. The classification and clustering performance of the dimensionality reduction techniques is also evaluated. The visualization and evaluation results of t-SNE are significantly better than those of the other dimensionality reduction techniques.
Abstract not available
The occurrence of prostate cancer (PCa) has been rising consistently for three decades, and it remains the third leading cause of cancer-related deaths after lung and bowel cancer in Germany. Despite new methods of early detection, such as prostate-specific antigen (PSA) testing, it remains the most common cancer in German men, with over 63,400 new diagnoses in Germany every year, and exhibits high prevalence in other countries of Northern and Western Europe as well [64]. Men over the age of 70 are most commonly affected by the lethal disease, whereas an onset before 50 is rare. The malignant prostate tumor can be cured through operation or irradiation as long as the cancer has not reached the stage of metastasis, in which other therapeutic methods have to be employed [14] [15]. In the metastatic phase, the patient usually exhibits symptoms when the tumor's size affects the urethra or the cancer spreads to other tissue, often the bones [16].
The high prevalence of this disease underlines the importance of further research into prognosis and diagnosis methods, whereby the identification of further biomarkers in PCa poses a major topic of scientific analysis. For this task, high-throughput RNA sequencing of the transcriptome (the RNA molecules of an organism or specific cell type) is frequently exploited [66]. RNA sequencing, or RNA-Seq for short, offers the possibility of transcriptome assessment, enabling the identification of transcriptional aberrations in diseases as well as of uncharacterized RNA species such as non-coding RNAs (ncRNAs), which remain undetected by conventional methods [49]. To facilitate interpretation of the sequenced reads, they are assembled to reconstruct the transcriptome as close to the original state as possible, thus enabling rapid detection of relevant biomolecules in the data [49]. Transcriptomic studies often require highly accurate and complete gene annotations on the reference genome of the examined organism. However, most gene annotations and reference genomes are far from complete, containing a multitude of unidentified protein-coding and non-coding genes and transcripts. Therefore, refinement of reference genomes and annotations by the inclusion of novel sequences, discovered in high-quality transcriptome assemblies, is necessary [24].
The following is a description and outline of the work done at the Cornell Lab of Ornithology developing Nation Feathers VR, a virtual reality game for learning about bird calls and songs. The goal was to develop a game which is intuitive, educational and entertaining. Furthermore, the software needed to be structured in a way that allows for feasible future expansion. This required careful data saving and retrieval. The game gives the player an opportunity to learn and apply that knowledge, all while maintaining a shorter runtime in order to reduce the total time spent in the virtual world. This is meant to prevent any discomfort to the player that may result from extended use of the VR headset.
Both cryptocurrency researchers and early adopters of cryptocurrencies agree that they possess a special kind of materiality, based on the laborious productive process of digital ‘mining’ [1]. This idea first appears in the Bitcoin White Paper [2], which encourages Bitcoin adopters to construct and justify its value in metaphoric comparison to gold mining. In this paper, I explore three material aspects of blockchain: physical infrastructure, human language and computer code. I apply the concept of ‘continuous materiality’ [3] to show how these three aspects interact in practical implementations of blockchain such as Bitcoin and Ethereum. I start from the concept of ‘digital metallism’, which stands for the ‘fundamental value’ of cryptocurrencies, and end with the move of Ethereum to ‘proof-of-stake’, partially as a countermeasure against ‘evil miners’. I conclude that ignoring the material aspects of blockchain technology can only further problematize the complicated relations between their technical, semiotic and social materiality.
In this thesis, the changes in economy and society and the resulting effects on the labor market are outlined. Current studies show that the shrinking labor market and increasing digitalization result in a lack of skilled tech talent and a transition from an employer market to a clear employee market. Derived from the findings of the scientific research on this topic and from conducted expert interviews, practical recommendations for recruitment actions within the scope of employer branding are defined in order to help corporations gain the needed tech skill set and drive innovation.
Glycans play an important role in the intracellular interactions of pathogenic bacteria. Pathogenic bacteria possess binding proteins capable of recognizing certain sugar motifs on other cells, which are found in glycan structures. Artificial carbohydrate synthesis allows scientists to recreate those sugar motifs in a rational, precise, and pure form. However, due to the high specificity of sugar-binding proteins, known as lectins, to glycan structures, methods for identifying suitable binding agents need to be developed. To tackle this hurdle, the Fraunhofer Institute for Cell Therapy and Immunology (Fraunhofer IZI) and the Max-Planck Institute of Colloids and Interfaces (MPIKG) developed a binding assay for the high throughput testing of sugar motifs that are presented on modular scaffolds formed by the assembly of four DNA strands into simple, branched DNA nanostructures. The first generation of this assay was used in combination with bacteria that express a fluorescent protein as a proof-of-concept. Here, the assay was optimized to be used with bacteria not possessing a marker gene for a fluorescent protein by staining their genomic DNA with SYBR® Green. For the binding assay, DNA nanostructures were combined with artificially synthesized mannose polymers, typical targets for many lectins on the surface of bacteria, presenting them in a defined constellation to bind bacteria strongly due to multivalent cooperativity. The testing of multiple mannose polymers identified monomeric mannose with a 5’-carbon linker and 1,2-linked dimeric mannose with linker as the best binding candidates for E. coli, presumably due to binding with the FimH protein on the surface. Despite similarities between the FimH proteins of E. coli and K. pneumoniae, binding was only observed between E. coli and the different sugar molecules on DNA structures. 
Furthermore, the degree of free movement seemed to affect the binding of mannose polymers to the targeted proteins, since an increase in binding could be observed when utilizing a more flexible DNA nanostructure. An alternative to the simple DNA nanostructures described above is the use of larger, more complex DNA origami structures consisting of several hundred strands. DNA origami structures are capable of carrying dozens of modifications at the same time. The results for the DNA origami structure showed a successful functionalization with up to 71 1,2-linked dimeric mannose-with-linker molecules. These results point towards a solution for the high-throughput analysis of potential binding agents for pathogenic bacteria, e.g., as an alternative treatment for antibiotic-resistant bacteria.
In this work, a second version of the Python implementation of an algorithm called Probabilistic Regulation of Metabolism (PROM) was created and applied to the metabolic model iSynCJ816 of the organism Synechocystis sp. PCC 6803. A cross-validation was performed to determine the minimal amount of expression data needed to produce meaningful results with the PROM algorithm. The failed reproduction of the results of a method called Integrated and Deduced Regulation of Metabolism (IDREAM) is documented, and causes for the failed reproduction are discussed.
Studying and understanding the metabolism of plants is essential to better adapt them to future climate conditions. Computational models of plant metabolism can guide this process by providing a platform for fast and resource-saving in silico analyses. The reconstruction of these models can follow kinetic or stoichiometric approaches, with Flux Balance Analysis being one of the most common ones for stoichiometric models. Advances in metabolic modelling over the years include the increasing number of compartments, the automation of the reconstruction process, the modelling of plant-environment interactions and genetic variants, or temporally and spatially resolved models. In addition, there is a growing focus on introducing synthetic pathways in plants to increase their agricultural potential regarding yield, growth and nutritional value. One example is the β-hydroxyaspartate cycle (BHAC) to bypass photorespiration. After its implementation in a stoichiometric C3 plant model, in silico flux analyses can help to understand the resulting metabolic changes. When compared with in vivo experiments on BHAC plants, the metabolic model can reproduce most results, with exceptions regarding growth and oxaloacetate. To evaluate whether the BHAC is suitable for establishing a synthetic C4 cycle, the pathway is implemented in a two-cell-type model that is capable of running a C4 cycle. The results show that the BHAC is only beneficial under light limitation in the bundle sheath cell. An additional engineering target for improved performance of plants is malate synthase. This work serves as the basis for further analyses combining the different factors boosting the advantages of the BHAC and for in vivo experiments in C3 and C4 plants.
The larval zebrafish mutant Knörf carries a not-yet-identified gene that is lethal after 14 dpf in the homozygous state. The mutation causes various degenerations and the loss of the ability to regenerate. One of these degenerations was first discovered in the retina in a histological section. The mutants' retinas show gaps in the IPL at 7 and 8 dpf, the number of which increases during maturation of the larva. In recent studies, a Pax6 staining was performed, which showed that amacrine cells are affected. Different types of amacrine cells were tested, and it was shown that the parvalbuminergic amacrine cells disappear. The staining was performed as a time course. At 5 dpf there is no difference between the number of parvalbuminergic amacrine cells in siblings and mutants, but then the degeneration starts. At 2 dpa there is the first significant difference, which increases at later stages and leads to a nearly complete disappearance of these cells in the eye. Parvalbumin is not only present in the retina; therefore, the brain, as another central nervous system structure, was examined. In the telencephalon these cells disappear already at 2 dpa. The parvalbuminergic cells are also present in the skeletal muscle of the tail. Here the degeneration starts approximately at the half of the tail and intensifies towards distal areas. It was shown that the parvalbuminergic cells in the muscle disappear by 4 dpa. The role of parvalbumin appears to lie in the binding of calcium; it thereby supports the re-establishment of the resting potential after an excitation in the central nervous system. In muscles, it assists in slowing relaxation after a muscle contraction.
The aim of this bachelor thesis was to establish extracytoplasmic function (ECF) σ factors as synthetic genetic regulators for biotechnological and synthetic biology applications in the newly emerging model organism Vibrio natriegens. Therefore, synthetic genetic circuits were engineered on plasmids as a test set-up for the investigated ECFs and their target promoters. The resulting plasmid library consisted of the reporter plasmids with the target promoter fused to a lux cassette, a set of high-copy ECF plasmids and a backup set of lower-copy ECF plasmids. First, the high-copy plasmids were transformed into V. natriegens to test their functionality at different inducer levels, which yielded good inducibility for a few strains but showed too high ECF expression in most. For this reason, the set of lower-copy plasmids was used for combinatorial co-transformation to investigate the ECFs for cross-talk to unspecific ECF target promoters. The switch to the lower-copy plasmid set proved partly helpful, while much room for fine-tuning of the circuits remains. The knowledge gained can be used to achieve higher success rates when engineering synthetic circuits for various applications in V. natriegens, using the ECFs recommended here as suitable synthetic genetic regulators.
Cryptorchidism is the most common disorder of sex development in dogs. It describes the failure of one or both testes to descend into the scrotum in due time and is a heritable multifactorial disease. In this work, selected dogs of a German sheep-poodle breed were sequenced with nanopore sequencing and subsequently examined for genetic variations correlating with cryptorchidism. The relationships of the studied dogs were also analyzed and visually processed.
The Infinica product suite consists of multiple individual microservice applications, mainly gathered around the Infinica Process Engine, which allows the execution of highly individualised process definitions. For estimating process performance, a layered queuing network approach was applied. In the first step, this required the implementation of a basic modelling framework. Subsequently, the implemented framework was used to evaluate the applicability of the approach by creating two models and comparing them with actual performance measurements. Although the calculated results deviated from the expected results, analysis showed that the differences may derive from an inaccurate model. Nevertheless, the general approach seems to be appropriate for the given application as well as for microservices in general, especially when extended with advanced modelling techniques, as the analysed model results appear consistent.
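A layered queuing network composes many elementary service stations. As a minimal illustration of the kind of quantity such a model predicts (this is not the thesis's modelling framework), the standard M/M/1 formulas relate arrival rate, service rate, utilization and mean response time:

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state metrics of a single M/M/1 queue (requests per second).

    Returns (utilization, mean number in system, mean response time).
    Only valid for utilization < 1; an overloaded station has no steady state.
    """
    rho = arrival_rate / service_rate          # server utilization
    if rho >= 1:
        raise ValueError("queue is unstable: arrival rate >= service rate")
    num_in_system = rho / (1 - rho)            # mean jobs queued + in service
    response_time = 1 / (service_rate - arrival_rate)  # mean time in system
    return rho, num_in_system, response_time

# A service handling 10 req/s receiving 8 req/s of load.
print(mm1_metrics(8, 10))
```

Little's law (N = λ·W) ties the last two numbers together and is a useful consistency check when calibrating any queuing model against measurements, which mirrors the comparison against actual performance measurements described above.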
Our current research aims to establish a complete ribonucleic acid (RNA) production line, from plasmid design to the purification of in vitro transcribed RNA and the labeling of RNA. RNA is the central molecule within the central dogma of molecular biology and is involved in most essential processes within a cell [1]. In many cases, only the compact three-dimensional structure of the respective RNA is able to fulfill its function. In this context, RNA tertiary contacts such as kissing loops and pseudoknots are essential to stabilize three-dimensional folding [2]. We will produce a tertiary contact consisting of a kissing loop and a GAAA tetraloop that occurs in eukaryotic ribosomal RNA [3,4]. The RNA sequence is integrated into a vector plasmid. Subsequently, the plasmid is amplified in E. coli. After subsequent plasmid purification steps, the RNA sequence will be transcribed in vitro [5,6]. In order for the RNA to be used for Förster resonance energy transfer (FRET) experiments at the single-molecule level, fluorescent dyes must be coupled to the RNA molecule [7].
How Covid-19 impacts the workplace of knowledge workers in a pandemic and post pandemic world
(2021)
The following master's thesis covers the topic of the workplace. The focus lies on the corona pandemic and how it has affected and will continue to affect the workplaces of knowledge workers. To this end, the workplace as a research area is described holistically, followed by the presentation of gathered secondary data and the in-depth interviews conducted by the author. The secondary and primary data agree that the workplace as people know it will be changed after the pandemic. The most likely outcome is the hybrid workplace concept, which mixes the home office, the office and, alternatively, third places. Companies have to be equipped and prepared for these changes. The meaning of the office will increase, and it has to be redesigned in order to meet the needs of the knowledge workers who will eventually come back to the office.
There are multiple ways to gain information about an individual and their health status, but an increasingly popular field in medicine has become the analysis of human breath, which carries a lot of information about metabolic processes within the individual's body. The information in exhaled breath consists of volatile (organic) compounds (VOCs). These VOCs are products of metabolic processes within the individual's body and thus might be indicators for diseases disturbing those processes. The compounds are detected by mass-spectrometric (MS) or ion-mobility-spectrometric (IMS) techniques, so the analysis of these compounds is not bound to exhaled breath alone. The resulting data is spectral data, capturing the concentrations of the VOCs indirectly through intensities. About 3000 VOCs [1] have already been determined in human exhaled breath, and the number of research papers about VOC analysis and detection has risen nearly constantly over the last decade. Furthermore, the technique used to identify VOCs can also be used to capture biomarkers from alien species within the individual's body. Extracting VOCs from an individual can be done by non- or minimally invasive techniques. However, the manual identification of VOCs and biomarkers related to a certain disease or infection is not feasible due to the complexity of the sample and the often unknown metabolic products; thus, automated techniques are needed [1–4]. To establish breath analysis as a diagnostic tool, machine learning methods could be used. Machine learning has become a popular and common technique when dealing with medical data due to its rapid analysis. Taking this advantage, breath analysis using machine learning could become the method of choice for diagnosis, keeping in mind that conventional methods are laboratory-based and, when trying to detect a bacterial infection, sometimes need several days to identify the organism [5].
The epithelial membrane proteins (EMP1-3), which belong to the family of peripheral myelin proteins 22-kDa (PMP22), are involved in epithelial differentiation. EMP2 was found to be a downstream target gene of the tumor suppressor gene HOPX, a homeobox-containing gene. Additionally, a dysregulation of EMP2 has been observed in various cancers, but the function of EMP2 in human lung cancer has not yet been clarified.
In this study, a real-time RT-PCR, Western blot and cytoblock analysis were performed to analyze the expression of EMP2. Gain-of-function was achieved by stable transfection with an EMP2 expression vector and loss-of-function by siRNA knockdown. Stable transfection led to overexpression of EMP2 at both mRNA and protein levels in the transfected cell lines H1299 and H2170.
Functional assays including proliferation, colony formation, migration and invasion assays as well as cell cycle analyses were performed after stable transfection, and it was found that ectopic EMP2 expression resulted in reduced cell proliferation, migration and invasion as well as a G1 cell cycle arrest. After the EMP2 gene was silenced by siRNA knockdown, inhibition of the cells' invasive property was observed. These phenomena were accompanied by reduced AKT, mTOR and p38 activities.
Taken together, the data suggest that the epithelial membrane protein 2 (EMP2) is a tumor suppressor and exerts its tumor suppressive function by inhibiting AKT and MAPK signaling pathways in human lung cancer cells.
Workload Optimization Techniques for Password Guessing Algorithms on Distributed Computing Platforms
(2019)
The following thesis covers several ways to optimize distributed computing platforms for cryptanalytic purposes. After an introduction to password storage, password guessing attacks and distributed computing in general, a set of initial benchmark results for a variety of different devices is analyzed. The shown results are mainly based on the use of the open-source password recovery tool Hashcat. The second part of this work presents an algorithmic implementation for information retrieval and workload generation. This thesis can be used for the conception of a distributed computing system, inventory analysis of available hardware devices, runtime and cost estimations for specific jobs and, finally, strategic workload distribution.
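One simple form of strategic workload distribution is to split a keyspace into per-device chunks proportional to benchmarked hash rates; each chunk can then be handed to Hashcat via its --skip/--limit options. The function below is an illustrative sketch under that assumption, not the thesis's implementation:

```python
def distribute_keyspace(keyspace, rates):
    """Split [0, keyspace) into (skip, limit) chunks proportional to each
    device's benchmarked hash rate (hashes per second).

    Returns one (start_offset, chunk_size) pair per device; the last
    device absorbs any rounding remainder so the chunks tile the keyspace
    exactly with no gaps or overlaps.
    """
    total = sum(rates)
    chunks, start = [], 0
    for i, rate in enumerate(rates):
        if i == len(rates) - 1:
            size = keyspace - start          # remainder goes to last device
        else:
            size = round(keyspace * rate / total)
        chunks.append((start, size))
        start += size
    return chunks

# A 4-character lowercase keyspace (26^4) over three devices of differing speed.
chunks = distribute_keyspace(26**4, [1000, 3000, 4000])
print(chunks)
```

Proportional splitting makes all devices finish at roughly the same time, which is the usual goal when the job's total runtime is bounded by its slowest worker.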
The objective of this diploma thesis is to analyse the results of functional tests carried out on hydraulic valve blocks at the Wujin Plant of Bosch Rexroth (Changzhou) Co., Ltd. (China). Based on this analysis, tests could be checked for systematic errors and the root causes of failures identified. Finally, this helped increase the first-pass yield of testing, releasing resources so far bound in inefficient testing processes. Furthermore, a tracking mechanism was established to monitor the function of crucial sensors at the test benches.
The thesis presents an investigation of the question of whether it is viable for the English company Essential Care to introduce a direct selling channel in the United Kingdom. The thesis provides an outline of the direct selling and labour market in the United Kingdom, including organisations and legislation for direct selling. A SWOT analysis illustrates the external and internal factors that could influence the feasibility of the project. The main part of the thesis focuses on a market research survey which was conducted in the United Kingdom. Following an analysis of the results, it provides a detailed outline of the findings. At the end of the thesis, the overall findings are summarised and recommendations for Essential Care are presented.
This thesis deals with the possible integration of social media communication into the marketing of the International Rectifier Corporation. The basis for the implementation of the new communication channel is set through a detailed description of the basic theoretical functions and features of business-to-business communication as well as social media communication. Based on this knowledge, the marketing communication of International Rectifier is analyzed and compared to that of its competitors. The theoretical lessons, in combination with the analysis, are then used to develop a competitive and effective social media strategy for International Rectifier.
For the first time, it was discovered that ultraviolet radiation with a wavelength of 200 to 400 nm (maximum at 365 nm), radiated from a distance of 40 cm (intensity: 3500 mW/cm²) onto PMMA, altered its surface wettability as well as its roughness at the nanoscale, as observed with an atomic force microscope (AFM). The roughness rises and falls again within a short time (1-2 days) after 75 min and 180 min of irradiation. However, during the next 10 days the roughness stabilized, and there was no influence of UV whether the PMMA was stored in air or in a glass Petri dish.
In an era of global climate change and fast-growing cities, local governments are in urgent need of adopting sustainable urban growth concepts for achieving a liveable and prosperous urban future. Against this background, the smart city notion has progressively gained popularity as an urban development concept which heavily relies on technology and the use of urban data for fostering sustainable urban growth. However, so far, the understanding of the smart city term is ambiguous, and little scientific research has been done on developing comprehensive conceptual frameworks to support local governments in the making of smarter cities. This paper aims at presenting the current state of the art of smart city research in order to support the making of smart city best practices and to promote a comprehensive understanding of the smart city notion. In doing so, the role of technology in the making of smarter cities and critical success factors in transforming cities are elaborated, following the methodological approach of a multidimensional conceptual framework. The research findings and an expert interview with a representative of the state capital then serve for the assessment of weak points and best practices in the smart city pursuit of the German city of Munich, providing urban policymaking with valuable insights and fostering the development of a comprehensive smart city conceptualisation.
In the swiftly changing world of academic publishing, the Sea of Wisdom platform seizes the opportunity to innovate. By combining the technologies of blockchain, decentralized finance (DeFi), and Non-Fungible Tokens (NFTs) with traditional scholarly communication, we present a groundbreaking, decentralized solution. Our design, although adaptable, primarily uses Ethereum's Virtual Machine, tapping into its robust scientific community.
Genetic sex determination of ancient DNA samples is based on one simple mathematical algorithm, which considers the number of reads mapped to the autosomes and the X and Y chromosomes. The algorithm is implemented in a command-line tool, SiD. SiD is used to determine the sex of 16 samples, which were shotgun sequenced and captured with a 1240k panel.
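The read-count criterion can be sketched as follows; the ratio and the thresholds used here are illustrative assumptions, not SiD's actual implementation:

```python
# Illustrative sketch of read-count-based sex determination.
# The threshold values are hypothetical; SiD's actual criteria may differ.

def assign_sex(x_reads: int, y_reads: int,
               xy_threshold: float = 0.075) -> str:
    """Classify a sample from reads mapped to the X and Y chromosomes.

    R_y = Y / (X + Y): a very low ratio suggests XX, a high ratio XY,
    anything in between stays undetermined.
    """
    total = x_reads + y_reads
    if total == 0:
        return "undetermined"
    r_y = y_reads / total
    if r_y < xy_threshold / 2:
        return "XX"
    if r_y > xy_threshold:
        return "XY"
    return "undetermined"

print(assign_sex(100_000, 120))    # very few Y reads
print(assign_sex(100_000, 9_000))  # substantial Y fraction
```

In practice the thresholds are calibrated against the expected mapping noise (female samples still accumulate some spurious Y-mapped reads), which is why an undetermined zone between the two cut-offs is kept.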
Object detection and classification is an active field of research in machine learning and computer vision. Depending on the application, there are different limitations to adjust to, but also possibilities to take advantage of. In this thesis, we focus on the classification and detection of video sequences during night-time. The proposed method is robust since it does not rely on image thresholding [8], which is commonly used in other methods; instead, the thesis uses histograms of oriented gradients (HOG) [37] as features and a support vector machine (SVM) [74] as classifier. It is of great importance that the features extracted from the images are robust and distinct enough to help the classifier distinguish between high-beam and low-beam. The classifier is part of the object detection pipeline, which predicts whether or not a test image matches one group or the other; in our case, whether or not an image belongs to a high-beam or low-beam sequence.
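A minimal sketch of such a HOG-plus-SVM pipeline, assuming scikit-image and scikit-learn; the synthetic stripe patches below merely stand in for real high-beam/low-beam image patches, and the HOG parameters are illustrative choices:

```python
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

def make_patch(vertical: bool) -> np.ndarray:
    """Synthetic stand-in for a headlight image patch: the stripe
    orientation plays the role of the visual difference between classes."""
    img = rng.normal(0.1, 0.02, (64, 64))
    if vertical:
        img[:, ::8] += 0.8
    else:
        img[::8, :] += 0.8
    return np.clip(img, 0.0, 1.0)

def extract_hog(image: np.ndarray) -> np.ndarray:
    # HOG descriptor over the whole patch; cell/block sizes are illustrative.
    return hog(image, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2))

X = np.array([extract_hog(make_patch(i % 2 == 0)) for i in range(40)])
y = np.array([i % 2 for i in range(40)])  # 0/1 stand for the two beam classes

clf = LinearSVC(C=1.0).fit(X, y)
print(clf.score(X, y))  # training accuracy on the separable toy data
```

The linear SVM operates on the HOG descriptor vector only, so the same pipeline transfers to real night-time patches once `make_patch` is replaced by actual image crops.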
In this paper, we conduct experiments to optimize the learning rates for the Generalized Learning Vector Quantization (GLVQ) model. Our approach leverages insights from cognitive science rooted in the profound intricacies of human thinking. Recognizing that human-like thinking has propelled humankind to its current state, we explore the applicability of cognitive science principles in enhancing machine learning. Prior research has demonstrated promising results when applying learning rate methods inspired by cognitive science to Learning Vector Quantization (LVQ) models. In this study, we extend this approach to GLVQ models. Specifically, we examine five distinct cognitive science-inspired GLVQ variants: Conditional Probability (CP), Dual Factor Heuristic (DFH), Middle Symmetry (MS), Loose Symmetry (LS), and Loose Symmetry with Rarity (LSR). Our experiments involve a comprehensive analysis of the performance of these cognitive science-derived learning rate techniques across various datasets, aiming to identify optimal settings and variants for cognitive science GLVQ model training. Through this research, we seek to unlock new avenues for enhancing the learning process in machine learning models by drawing inspiration from the rich complexities of human cognition. Keywords: machine learning, GLVQ, cognitive science, cognitive bias, learning rate optimization, optimizers, human-like learning, Conditional Probability (CP), Dual Factor Heuristic (DFH), Middle Symmetry (MS), Loose Symmetry (LS), Loose Symmetry with Rarity (LSR).
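For reference, a single GLVQ update on the relative distance measure mu = (d+ - d-)/(d+ + d-) can be sketched as below. This is a plain gradient step with a fixed learning rate; the cognitive-science-inspired learning rate schedules studied in the paper are not reproduced here:

```python
import numpy as np

def glvq_step(x, label, prototypes, proto_labels, lr=0.05):
    """One GLVQ update: pull the closest correct prototype toward x and
    push the closest incorrect one away, scaled by the derivative of
    mu = (d+ - d-)/(d+ + d-) with squared Euclidean distances."""
    d = np.linalg.norm(prototypes - x, axis=1) ** 2
    correct = proto_labels == label
    jp = np.where(correct)[0][np.argmin(d[correct])]    # closest correct
    jm = np.where(~correct)[0][np.argmin(d[~correct])]  # closest incorrect
    dp, dm = d[jp], d[jm]
    denom = (dp + dm) ** 2
    prototypes[jp] += lr * (4 * dm / denom) * (x - prototypes[jp])
    prototypes[jm] -= lr * (4 * dp / denom) * (x - prototypes[jm])
    return (dp - dm) / (dp + dm)  # mu < 0 means x was classified correctly

prototypes = np.array([[0.0, 0.0], [1.0, 1.0]])
proto_labels = np.array([0, 1])
mu = glvq_step(np.array([0.1, 0.0]), 0, prototypes, proto_labels)
print(mu)  # negative: the sample already lies closer to its class prototype
```

The learning rate `lr` is exactly the quantity the paper's CP, DFH, MS, LS, and LSR variants would adapt instead of keeping fixed.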
When entering waterways that are restricted in height, in width or by another vessel, the behaviour of a ship changes. The most evident effect of navigating in shallow water is the squat, which has led to several groundings: because of pressure differences the vessel is pulled down into the water and its trim is changed. Another shallow water effect is the speed loss due to an increase in resistance, which can reduce the maximum speed by up to 50 percent. In general the behaviour of a ship in shallow water is said to be sluggish, meaning that it is more difficult to navigate, which affects, among other things, the radius of the turning circle. Sailing parallel to a close-by bank affects the lateral force and the yaw moment. The interaction with other ships has similar effects as bank effects, but is more complex since more parameters play a major role. In this thesis each of these effects is researched by studying several papers by renowned researchers.
Several models are developed that correspond to the inherent force-and-moment model of the simulation program. The challenges and obstacles that arose during modelling and implementation are pointed out, and solutions or approaches are given.
Increasing speed in laser processing is driven by the development of high-power lasers into ranges of more than 1 kW. Additionally, a proper distribution of this laser power is required to achieve high-quality processing results. In the case of high pulse repetition rates, a proper distribution of the pulses can be obtained from ultrafast beam deflection in the range of several 100 m/s. A two-dimensional polygon mirror scanner has been used to distribute a nanosecond pulsed laser with up to 1 kW average power at a wavelength of 1064 nm for multi-pass laser engraving. The pulse duration of this laser can be varied between 30 ns and 240 ns, and the pulse repetition rate is set between 1 and 4 MHz. The depth information is encoded in greyscale bitmaps, which are used to modulate the laser during scanning according to the lateral position and the depth. The process allows high processing rates and thus high throughput.
Laser engraving requires a precise ablation per pulse through all layers of a depth map. Scaling this process to areas of a square meter and more within an acceptable time requires high-power ultra-short pulsed lasers for the precision and a high scan speed for the beam distribution. Scan speeds in the range of several 100 m/s can be achieved with a polygon scanner. In this work, a polygon scanner has been utilized within a roll-engraving machine to treat an 800 x 220 mm² (L x Dia) roll with 0.55 m² of surface area in a laser engraving process. The machine setup, the processing strategy and the data handling have been investigated and result in an efficient large-area process. Pre-tests were performed with a multi-MHz-frequency nanosecond-pulsed laser to investigate the processing strategy. A method to overcome the duty cycle of the polygon scanner was found in the synchronization of two polygons, enabling the use of a single laser source in a time-sharing concept. The throughput and the utilization of the laser source can thereby be increased by a factor of two.
In this work, Direct Laser Interference Patterning (DLIP) is used in conjunction with the polygon scanner technique to fabricate textured polystyrene and nickel surfaces through ultra-fast beam deflection. For polystyrene, the impact of scanning speed and repetition rate on the structure formation is studied, obtaining periodic features with a spatial period of 21 μm and reaching structure heights up to 23 μm. By applying scanning speeds of up to 350 m/s, a structuring throughput of 1.1 m²/min has been reached. Additionally, the optical configuration was used to texture nickel electrode foils with line-like patterns with a spatial period of 25 μm and a maximum structure depth of 15 μm. Subsequently, the structured nickel electrodes were assessed in terms of their performance for the Hydrogen Evolution Reaction (HER). The findings revealed a significant improvement in HER efficiency, with a 22% increase compared to the untreated reference electrode.
This bachelor thesis examines two main topics: Corporate Social Responsibility and Corporate Philanthropy as an integral part of it. It was written in order to prove the high importance of business philanthropy in today's global market and to encourage companies to strengthen their CSR policy so as to contribute to the resolution of social problems. This paper reviews the theoretical framework of CSR, its evolution, and the types and theories relating to Corporate Philanthropy. It also presents a comparative analysis of successful practices of corporate philanthropy in pharmaceutical and other global industries, predominantly in Europe and the USA. This work underlines the competitive advantages and important socio-economic impact of CP and suggests recommendations for companies in developing their CSR activities. The paper is based on internet research using articles, presentations, reports and studies, websites and official legal documents.
In this thesis, we implement, correct, and modify the compartmental model described in “Transmission Dynamics of Large Coronavirus Disease Outbreak in Homeless Shelter, Chicago, Illinois, USA, 2020”. Our objective is to engage in reading and understanding scientific literature, reproduce the results, and modify or generalize an existing mathematical model. We provide an overview of epidemiological models, focusing on simple compartmental SEIR models. We correct inaccuracies and misprints in the original implementation and use the limited-memory Broyden–Fletcher–Goldfarb–Shanno algorithm to fit the model’s parameters. Furthermore, we modify the model by introducing an additional compartment. The resulting model has a more intuitive interpretation and relies on fewer assumptions. We also perform the fitting process for this alternative model. Finally, we demonstrate the advantages of our modified implementations and discuss other possible approaches.
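The fitting procedure described above can be sketched as follows, assuming a standard SEIR right-hand side and synthetic observations in place of the shelter outbreak data; compartment structure, loss, and bounds are illustrative choices, not the thesis's exact setup:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

def seir(t, y, beta, sigma, gamma):
    """Standard SEIR right-hand side with frequency-dependent transmission."""
    S, E, I, R = y
    N = S + E + I + R
    new_inf = beta * S * I / N
    return [-new_inf, new_inf - sigma * E, sigma * E - gamma * I, gamma * I]

def simulate(params, t_eval, y0):
    beta, sigma, gamma = params
    sol = solve_ivp(seir, (t_eval[0], t_eval[-1]), y0, t_eval=t_eval,
                    args=(beta, sigma, gamma))
    return sol.y[2]  # infectious compartment over time

# Synthetic "observed" prevalence data (stand-in for the real outbreak data).
t = np.linspace(0, 30, 31)
y0 = [995.0, 0.0, 5.0, 0.0]
obs = simulate([0.5, 0.2, 0.1], t, y0) \
      + np.random.default_rng(1).normal(0.0, 1.0, t.shape)

def loss(p):
    # Sum of squared residuals between simulated and observed prevalence.
    return float(np.sum((simulate(p, t, y0) - obs) ** 2))

fit = minimize(loss, x0=[0.3, 0.3, 0.2], method="L-BFGS-B",
               bounds=[(0.01, 2.0)] * 3)
print(fit.x)  # fitted (beta, sigma, gamma)
```

L-BFGS-B handles the box constraints on the rate parameters directly, which is why it is a natural choice when epidemiological rates must stay positive and bounded.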
This thesis deals with the development of a methodology/concept to analyse targeted attacks against IIoT/IoT devices. Building on established background knowledge about honeypots, fileless malware and injection techniques, a methodology is created that leads to the concept of a honeypot analysis system. The system is designed to analyse and detect novel threats such as fileless attacks, which are often utilized by Advanced Persistent Threats. The system is partially implemented and later evaluated by performing a simulated attack utilizing fileless techniques. The effectiveness is discussed and rated based on the results.
This master thesis investigates a new method for the feature extraction of grayscale images, the so-called "Non-Euclidean Principal Component Analysis". Here, the standard inner product of the Euclidean space is substituted by a semi-inner product in the well-known learning rules of Oja and Sanger. The new method is compared with standard principal component analysis (PCA) by extracting features (feature vectors) from different labelled databases and judging the resulting accuracies of "Border-Sensitive Generalized Learning Vector Quantization" (BSGLVQ), "Feed-Forward Neural Networks" (FFNN) and "Support Vector Machines" (SVM).
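Oja's rule with a pluggable inner product can be sketched as below; the default `np.dot` reproduces standard (Euclidean) PCA, and the thesis's semi-inner product would be passed in its place (its concrete form is not reproduced here):

```python
import numpy as np

def oja_first_component(X, inner=np.dot, eta=0.01, epochs=50, seed=0):
    """Oja's learning rule for the first principal component.
    `inner` is the pluggable product: substituting a semi-inner product
    here is the core idea; the default gives standard PCA behaviour."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(epochs):
        for x in X:
            y = inner(x, w)              # projection of the sample onto w
            w += eta * y * (x - y * w)   # Hebbian term with Oja's decay
    return w / np.linalg.norm(w)

# Data stretched along the direction (1, 1): the learned w should align with it.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])
X = X @ np.array([[1.0, 1.0], [-1.0, 1.0]]) / np.sqrt(2)
w = oja_first_component(X - X.mean(axis=0))
print(abs(float(w @ np.array([1.0, 1.0])) / np.sqrt(2)))  # close to 1
```

Sanger's rule extends the same update to multiple components by deflating each component's contribution before updating the next.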
Obesity is a major public health issue in many countries, and its development leads to many severe conditions. Adipose tissue (AT) is simply called fat; in males, visceral adipose tissue (VAT) is the dominant type. Estrogens play an important role in many pathological processes.
In this study, one of the subtypes of the estrogen receptor, ER-beta, is activated by treating VAT with KB (a specific ligand).
In this study, I investigated the metabolic effect of KB treatment on VAT using bioinformatics methods.
In this thesis, I applied several bioinformatics methods, such as differential gene expression analysis, pathway analysis, RNA splicing analysis and SNP calling, to predict the effect of KB treatment on VAT. A list of candidate genes, pathways and SNPs was identified in this study, which could provide clues to the genetic mechanism underlying the KB treatment effect. The results of my study show that the KB treatment has a significant effect on VAT.
At a global level, different studies disclose that transport systems are responsible for 25% of CO2 emissions. In the context of sustainable mobility, one of the short-term challenges is the research and improvement of alternative fuels, which should allow a fast decrease in the generation of greenhouse gases through sustainable means of transport. In this sense, green hydrogen can play a fundamental role. Green hydrogen is the basis for producing synthetic fuels, which can replace oil and its derivatives. Synthetic fuels, or e-fuels, are hydrocarbons produced from carbon dioxide (CO2) and green hydrogen (H2) as the only raw materials. H2 or e-fuels could be used in many sectors (manufacturing, residential, transportation, mining and other industries). In this study, different applications of hydrogen are evaluated by techno-economic analysis. The main variable affecting the production of hydrogen and its derivatives is the cost of electricity. Considering the country's renewable energy potential, it is feasible to develop green hydrogen production in Chile as an energy vector that is technically and economically viable, together with its environmental benefits.
Influenza A viruses are responsible for the outbreak of epidemics as well as pandemics worldwide. The surface protein neuraminidase of this virus is responsible, among other things, for the release of virions from the cell and is thus of interest in pharmacological research. The aim of this work is to gain knowledge about evolutionary changes in sequences of influenza A neuraminidase through different methods. First, EVcouplings is used with the goal of identifying evolutionary couplings within the protein sequences, but this analysis was unsuccessful, probably due to the great sequence length of neuraminidase. Second, the natural vector method is used for sequence embedding, in the hope of visualizing the sequential progression of the virus protein over time. Last, interpretable machine learning methods are applied to examine whether the data can be classified by year and whether the extracted information conforms to the results of the EVcouplings analysis. In addition to the class label year, other labels such as group or subtype are used in classification, with varying results. For balanced classes the machine learning models performed adequately, but this was not the case for imbalanced data. Groups and subtypes can be classified with high accuracy, which was not the case for years, continents or hosts. To identify the minimal number of features necessary for the linear separation of neuraminidase group 1 subtypes, a logistic regression was finally performed, resulting in the identification of 15 combinations of nine amino acid frequencies. Since neither the sequence embedding nor the machine learning methods showed neuraminidase evolution over time, further research is necessary, for example with a focus on one subtype with balanced data.
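The amino-acid-frequency features used for the logistic regression step can be sketched as follows; the toy sequences and labels below are stand-ins, not actual neuraminidase data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def aa_frequencies(seq: str) -> np.ndarray:
    """Relative frequency of each of the 20 amino acids in a sequence."""
    counts = np.array([seq.count(a) for a in AMINO_ACIDS], dtype=float)
    return counts / max(len(seq), 1)

# Toy sequences standing in for two neuraminidase subtypes
# (real subtype separation uses curated sequence data).
group_a = ["ACDKKKLM" * 10, "ACDKKKIM" * 10]
group_b = ["ACDEEELM" * 10, "ACDEEEFM" * 10]
X = np.array([aa_frequencies(s) for s in group_a + group_b])
y = np.array([0, 0, 1, 1])

clf = LogisticRegression().fit(X, y)
print(clf.score(X, y))  # the toy groups are linearly separable
```

Restricting `AMINO_ACIDS` to a candidate subset of frequencies and refitting is the mechanism by which a minimal linearly separating feature set can be searched for.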
Digital innovation in the quality management system from supply chain to final product conformity
(2019)
With Industry 4.0, a new revolution is taking place in the form of digitalization, and a new trend in innovation is emerging. We therefore want to digitalize the process from the supply chain to the final product conformity of the aircraft.
Every document received from the supplier (e.g. CoC, inspection report, concession) is to be received digitally. When a part arrives at the OEM's warehouse, the warehouse personnel have a system that confirms, with the help of a QR code, that part A with serial number X is the correct fit for part number B, and books the part into the ERP.
The biggest challenge is to reduce the amount of in-production inspection performed by humans. We want to take one step further towards automation, combined with IoT, to provide better data processing for the automation process and to reduce the overall inspection time. What is needed is a proper visual automation control system; with the help of gauge R&R, the process can be made more accurate and its traceability certified. Finally, as large amounts of data accumulate, data security is required, with a proper data source and data storage for supplier data as well as internal data.
Learning Vector Quantization (LVQ) methods have been popular choices of classification models ever since their introduction by T. Kohonen in the 90s. These days, LVQ is combined with Deep Learning methods to provide powerful yet interpretable machine-learning solutions to some of the most challenging computational problems.
However, techniques to model recurrent relationships in the data using prototype methods still remain quite unsophisticated. In particular, we are not aware of any modification of LVQ that allows the input data to have different lengths. Needless to say, such data is abundant in today's digital world and demands new processing techniques to extract useful information. In this paper, we propose the use of a Siamese architecture that not only models recurrent relationships within the prototypes but is also able to handle prototypes of various dimensions simultaneously.
A number of real-time PCR approaches have been published in the literature. In this thesis, the suitability of different real-time PCR approaches using hydrolysis probes has been evaluated regarding PCR performance, cost effectiveness and handling. The effect of double-quenched probes as well as the impact of an increased relative Flap endonuclease amount in quantitative real-time PCR has been examined. In terms of genotyping, a TaqMan™ assay, considered the gold standard in this application, has been tested and compared to phosphorothioate-modified probes, allele-specific primers, SNAKE primers, allele-specific probe and primer assays, as well as an assay using minor groove binder probes. Promising observations have been made in the case of double-quenched probes, phosphorothioate-modified probes, SNAKE primers and minor groove binder probes.
The subject of the following paper is the analysis of global companies' motives for taking on sport sponsorships as a corporate social responsibility (CSR) initiative. This work is compilatory in nature, as it is derived from expert literature as well as real-life case studies. The expert literature provides a basis of theories and models regarding the fundamental motives for CSR and sport sponsorship and visualizes them by means of statistics and real-life case studies. This paper aims to inform individuals, leaders and specifically global organizations about the benefits that taking on a sport sponsorship may have for fulfilling a company's CSR objectives.
The design of an interview model based on competencies arises from the need for highly qualified people who contribute to the achievement of organizational objectives. It intends to shape the human resources department into a strategic area of the company. To achieve this, organizational competencies are defined, and guidelines for the elaboration of a portfolio of questions, as well as the design of a competency dictionary, are established. These serve as tools for the human resources processes of the company Visbal Moreno y Sucesores Ltd. Through this work, the importance of the human factor is exposed as part of the organizational strategy.
Drought is one of the most common and dangerous threats plants have to face, costing the global agricultural sector billions of dollars every year and leading to the loss of tons of harvest. Until people drastically reduce their consumption of animal products or cellular agriculture comes of age, more and more crops will need to be produced to sustain the ever growing human population. Even then, as more areas on earth are becoming prone to drought due to climate change, we may still have to find or breed plant varieties more suitable to grow and prosper in these changing environments.
Plants respond to drought stress with a complex interplay of hormones, transcription factors, and many other functional or regulatory proteins, and mapping out this web of agents is no trivial task. Over the last two to three decades, machine learning has become immensely popular and is increasingly used to find patterns in situations that are too complex for the human mind to grasp. Even though much of the hype is focused on the latest developments in deep learning, relatively simple methods often yield superior results, especially when data is limited and expensive to gather.
This Master Thesis, conducted at the IPK in Gatersleben, develops an approach for shedding light on the phenotypic and transcriptomic processes that occur when a plant is subjected to stress. It centers around a random forest feature selection algorithm and although it is used here to illuminate drought stress response in Arabidopsis thaliana, it can be applied to all kinds of stresses in all kinds of plants.
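The core idea of random-forest-based feature selection can be sketched as below, with synthetic data standing in for the expression matrix; `RandomForestRegressor` and the simple importance ranking are illustrative choices, not necessarily the thesis's exact pipeline:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Synthetic stand-in for expression data: 200 samples x 50 "genes",
# where only genes 0 and 1 actually drive the stress-response target.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(0.0, 0.1, 200)

forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
ranking = np.argsort(forest.feature_importances_)[::-1]
print(ranking[:5])  # the informative genes should lead the ranking
```

Because the forest captures non-linear and interaction effects, the importance ranking can surface regulators that a purely linear screen would miss, which is what makes it attractive for transcriptomic stress-response data.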
Development of a genetic biomonitoring test for the investigation of pollinator-plant-interactions
(2021)
A worldwide decline in biodiversity has been recorded. Insects, and with them pollinators, are especially threatened. When the foraging behaviour of pollinators is understood in detail, future crop and floral pollination services can be sustained, and projects for the conservation of pollinators and plant biodiversity can be established. With nanopore sequencing methods it is possible to identify, from their genetic information, the pollen species collected by pollinators. In this study, a protocol for portable nanopore sequencing of DNA from pollen collected by honey bees, bumble bees and wild bees is designed. DNA metabarcoding is used to identify species within the mixed DNA sample. The ITS2 region will be used as a barcode. We will investigate the pollen preferences of three pollinator species by placing their hives or nests at the same site. Based on the results, landscape management schemes are developed that target the pollen preferences and nutritional requirements of managed and wild social bee species as well as solitary wild bees.
In this work, a protocol for portable nanopore sequencing of DNA from pollen collected from honey bees, bumble bees, and wild bees was developed. DNA metabarcoding is applied to identify genera within the mixed DNA samples. The DNA extraction and ITS/ITS2 PCR parameters tested for this purpose were applied to the collected pollen samples, and the amplicons were then sequenced using the Flongle adapter from Oxford Nanopore Technologies. It is shown that the main pollinator resources at the different sites can be identified in percentage proportions. The protocol generated in this study can be used for further ecological questions.
In the present bachelor thesis, nanopore sequencing and Illumina sequencing were compared using pollen DNA collected from honeybees and bumble bees. For this purpose, nanopore sequencing was performed with MinION sequencers, and the generated reads were analysed with bash programming. A quantitative and a qualitative (based on ITS2 sequences) BLAST run were performed. The results confirm the error probability of nanopore sequencing described in the literature. Nevertheless, similar sample preferences of the bees could be observed with both sequencing methods, allowing ecological conclusions.
With the increasing usage of blockchain technology, legal challenges such as GDPR compliance arise. Especially the right to erasure is considered challenging, as blockchains are tamper-proof by design. Several approaches have investigated possibilities to weaken the tamper-proof aspect of blockchains in favor of GDPR compliance. This paper presents several such approaches, then focuses on chameleon hash functions by evaluating the possibility of using these specific functions in a private blockchain. The goal of the built system is to take a step towards the digitization of the bill of lading used in international trade. This paper describes the developed software as well as the core considerations around the system, such as network design and block structure.
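The key property of chameleon hash functions, trapdoor collisions that allow block content to be rewritten without changing its hash, can be sketched with a classic discrete-log construction (toy parameters, not secure sizes, and not necessarily the scheme evaluated in the paper):

```python
# Sketch of a discrete-log chameleon hash.
# CH(m, r) = g^m * h^r mod p, with public key h = g^x and secret trapdoor x.
# Whoever holds x can compute an r' so that CH(m', r') == CH(m, r),
# i.e. rewrite a block's content without changing its hash.

p, q, g = 2039, 1019, 4          # q | p-1; g generates the order-q subgroup
x = 123                          # trapdoor (kept secret by the authority)
h = pow(g, x, p)                 # public key

def ch(m: int, r: int) -> int:
    return (pow(g, m % q, p) * pow(h, r % q, p)) % p

def collide(m: int, r: int, m_new: int) -> int:
    """Use the trapdoor to find r' with ch(m_new, r') == ch(m, r):
    solve m + x*r = m_new + x*r' (mod q)."""
    x_inv = pow(x, -1, q)
    return (r + (m - m_new) * x_inv) % q

m, r = 314, 271
m_new = 999
r_new = collide(m, r, m_new)
print(ch(m, r) == ch(m_new, r_new))  # True: the edited block keeps its hash
```

In a GDPR-compliant private chain, the block's hash links stay valid after an erasure because only the holder of the trapdoor can produce such collisions.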
Cryptorchidism describes a disease in which one or both testes do not descend into the scrotum properly. With a prevalence of up to 10%, cryptorchidism is one of the most common birth defects of the male genital tract. Despite its associated health risks and accompanying economic damage, resulting from surgery and losses in breeding, studies on canine cryptorchidism and its causes are relatively rare. In this study, a relational database for genetic causes of cryptorchidism was established and used as a basis for the identification of candidate genes. Associated regions were analysed by nanopore sequencing with the goal of identifying genetic variants correlated with cryptorchidism in the German Sheep Poodle.
The main purpose of this Bachelor thesis was to find and compile comprehensive information on barley genes expressed in the context of pollen embryogenesis. In the present study, this approach was confined to genes that were previously known to be associated with the initiation of embryogenesis in different plant species. First, candidate transcript sequences were identified in barley. Second, transcript and associated genomic sequences were analyzed in silico to provide suitable structural and functional annotations. Finally, the results of one representative example are presented and interpreted in detail. This work aims to contribute to a significantly improved understanding of pollen embryogenesis, a biological phenomenon broadly used for haploid technology in crop improvement.
Success story DAB in the UK
(2017)
The popularity of digital audio broadcasting in different countries can be explained mainly by its historical development. In this work, the general technical conditions and the mode of operation are explained. In addition, advantages, disadvantages and alternatives are presented. After that, the development of digital radio in Germany and the UK is compared with the current situation in order to show how the differences have led to a different distribution and acceptance of the medium.
This bachelor thesis aims to introduce the theoretical concept of Human Resources Management (HRM), to analyze the work of the human resources department of LLC Tavria-V, and to offer recommendations for improving the productivity of the personnel. As a starting point for improving personnel management, an overview of the theoretical and methodological aspects of HRM is first presented, and theories that have shaped today's management of the workforce are described. Secondly, the concept of the organizational work of the enterprise and its main indicators and types of activities are identified and analyzed in the form of tables and diagrams. The main object of the thesis, the process of personnel management with its qualitative characteristics, is described and presented. Using a survey of employees, the advantages and disadvantages of the present HRM system are identified. In the last part, taking into account all data on the current situation, recommended actions and their expected effect for LLC Tavria-V, based on the personnel management analysis, are presented.