Noise in the oceans is a constantly increasing factor. The growing industrialisation due to shipping, offshore wind farms, seismic surveys and other anthropogenic noise sources is putting the ecosystem under immense stress. The focus of this thesis is on the assessment of continuous underwater noise from ships. Based on existing strategies in air as well as underwater, and a comparison of both, an alternative strategy for the assessment of continuous noise from ships is given. The concept developed is based on published, scientifically observed responses of animals to ship passes with an indication of an effect range. A model is created to describe the strategy, using publicly available data for cargo ships as an example. The results are summarized in maps depicting the affected area for an MRU of the OSPAR II region and the MPA “Borkum Riffgrund”. The strategy is discussed and evaluated on the basis of these results. From this, further improvements and the need for additional information in publicly available data on vessel traffic are derived.
As part of the research project Trusted Blockchains for the Open, Smart Energy Grid of the Future (tbiEnergy), one of the objectives is to investigate how a holistic blockchain approach for the realization of a local energy market could be accomplished and how corresponding hardware security mechanisms can be integrated. This paper provides an overview of the implemented prototype and describes the system and its processes.
Due to the intractability of the Discrete Logarithm Problem (DLP), it is widely used in cryptography, and the security of several cryptosystems rests on the hardness of computing discrete logarithms. In this paper, we start with topics from number theory and abstract algebra, as these enable a comprehensive study of the nature of discrete logarithms, and then concentrate on the application and computation of discrete logarithms. Applications of discrete logarithms such as the Diffie–Hellman key exchange and the ElGamal signature scheme, and several attacks on the DLP such as the baby-step giant-step method and the Silver–Pohlig–Hellman algorithm, are analyzed. We also focus on elliptic curves and the discrete logarithm over an elliptic curve. Attacks on the elliptic curve discrete logarithm problem (ECDLP) are discussed, as are extensions of several discrete-logarithm-based protocols to elliptic curves, such as the elliptic curve digital signature algorithm (ECDSA).
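As a hedged illustration of one of the analyzed attacks, the following minimal Python sketch implements the baby-step giant-step method for g^x ≡ h (mod p); defaulting the group order to n = p - 1 is an assumption for prime moduli, not a detail taken from the paper.

```python
# Baby-step giant-step: solve g^x = h (mod p) in O(sqrt(n)) group ops.
from math import isqrt

def bsgs(g, h, p, n=None):
    n = n if n is not None else p - 1            # assumed group order
    m = isqrt(n) + 1
    baby = {pow(g, j, p): j for j in range(m)}   # baby steps: g^j
    c = pow(g, -m, p)                            # g^(-m) mod p
    gamma = h
    for i in range(m):                           # giant steps: h * g^(-i*m)
        if gamma in baby:
            return i * m + baby[gamma]
        gamma = (gamma * c) % p
    return None

# e.g. bsgs(5, 3, 23) == 16, since 5**16 % 23 == 3
```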
This master thesis studies customer behavior using the example of the skin care brand Nivea. It presents the theoretical basis for the subsequent research on marketing, customer behavior, and the proper conduct of marketing research, followed by an analysis of the German market. Since Nivea is a brand of the Beiersdorf company, Beiersdorf's activities and operations are also described. The main idea of the work is to analyze the behavior of Nivea's customers; the work therefore contains extensive research on the brand along with its micro- and macroenvironment. An in-depth interview and a survey were also conducted to understand customers' current needs. Based on the results, the author proposes some ideas for the Nivea brand.
Probabilistic Micropayments
(2022)
Probabilistic micropayments are an important research topic in cryptography for electronic commerce, as they promise efficient algorithms with low transaction costs and modest computational demands. To delve into the topic, it is vital to scrutinize cryptographic preliminaries such as hash functions and digital signatures. This thesis investigates the important probabilistic methods based on centralized or decentralized networks. First, centralized schemes such as lottery-based tickets, PayWord, coin-flipping, and MR2 are described, and an approach based on blind signatures is also discussed. Then, decentralized methods such as MICROPAY3 and a transferable scheme on the blockchain, along with an efficient model for cryptocurrencies, are explained. We then compare the different probabilistic micropayment methods and improve on their drawbacks with a new technique. To set the results from the theoretical analysis of the different methods into context, we analyze attacks that reduce the security and therefore the efficiency of the system; in particular, we discuss various methods for detecting double-spending and eclipse attacks.
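To make the lottery-based idea concrete, here is a hedged toy sketch (function names and protocol details are illustrative, not taken from the thesis): a payment of value v is replaced by a ticket worth V = v/p that wins with probability p, so the expected transfer is still v.

```python
# Toy lottery-style probabilistic micropayment: the win/lose coin flip is
# derived from a hash over both parties' nonces (illustrative only).
import hashlib, secrets

def ticket_wins(payer_nonce: bytes, payee_nonce: bytes, p: float) -> bool:
    h = hashlib.sha256(payer_nonce + payee_nonce).digest()
    draw = int.from_bytes(h[:8], "big") / 2**64   # uniform in [0, 1)
    return draw < p

# Expected value check: 1000 tickets at p = 0.01 win roughly 10 times.
wins = sum(ticket_wins(secrets.token_bytes(16), secrets.token_bytes(16), 0.01)
           for _ in range(1000))
print(wins, "winning tickets")
```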
Classification label security determines the extent to which predicted labels from classification results can be trusted: it quantifies, and thereby resolves, the uncertainty surrounding a predicted label. Classification label security is therefore very significant for decision-making whenever we face a classification task. This thesis investigates the determination of classification label security using the fuzzy probabilistic assignments of fuzzy c-means. The investigation is accompanied by implementation, experimentation, visualization and documentation of the results.
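A minimal sketch of the underlying idea, assuming standard fuzzy c-means with fuzzifier m = 2: the membership values serve as probabilistic assignments, and the winning membership can be read as the security of the predicted label.

```python
# Fuzzy c-means memberships: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1)).
import numpy as np

def fcm_memberships(X, centers, m=2.0, eps=1e-12):
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + eps
    ratio = (d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0))
    return 1.0 / ratio.sum(axis=2)

X = np.array([[0.1, 0.2], [0.9, 1.1], [0.5, 0.5]])
C = np.array([[0.0, 0.0], [1.0, 1.0]])          # assumed cluster centers
U = fcm_memberships(X, C)
labels, security = U.argmax(axis=1), U.max(axis=1)
print(labels, security)   # a high max-membership means a trustworthy label
```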
Vicia faba leaves and calli were transformed using CRISPR-Cas RNPs. Two kinds of CPP-fused SpyCas9 were used with sgRNA7, sgRNA5 or sgRNA13, targeting PDS exon 1, PDS exon 2 or MgCh exon 3, respectively. RNPs were applied by high-pressure spraying, biolistic delivery, incubation in RNP solution and infiltration of leaf tissue. A PCR- and restriction-enzyme-based approach was used for the detection of mutations. Screening of 679 E. coli colonies containing the cloned fragments resulted in the detection of 14 mutations, most of which were deletions of 150, 500 or 730 bp. Five of the 14 mutations were point mutations located two to three bp upstream of the PAM.
Tokenization projects feature prominently among new blockchain technologies. After explaining the fundamentals of cross-chain interaction, this bachelor thesis focuses on technology for tokenizing Bitcoin on Ethereum. For practical context, the implementation of the currently most successful decentralized tokenization project is described.
Introducing natural adversarial observations to a Deep Reinforcement Learning agent for Atari Games
(2021)
Deep Learning methods are known to be vulnerable to adversarial attacks. Since Deep Reinforcement Learning agents are based on these methods, they are susceptible to tiny changes in their input data. Three methods for adversarial example generation are introduced and applied to agents trained to play Atari games. The attacks either target single inputs or can be applied universally to all possible inputs of an agent; they successfully shift the predictions towards a single action or lower the agent’s confidence in certain actions, respectively. All proposed methods had a severe impact on the agents’ performance while producing invisible adversarial perturbations. Since natural-looking adversarial observations should be completely hidden from a human evaluator, the negative impact on the agents’ performance should additionally be undetectable. Several variants of the proposed methods were tested to fulfil all posed criteria. Overall, seven generated observations for two of the three Atari games are classified as natural-looking adversarial observations.
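As a hedged sketch of how such perturbations are typically generated (a standard targeted FGSM-style attack, not necessarily one of the thesis's three methods), where `policy`, `frame` and `target_action` are illustrative placeholders:

```python
# Targeted FGSM-style perturbation of an agent's input frame (PyTorch).
import torch
import torch.nn.functional as F

def fgsm_observation(policy, frame, target_action, eps=1/255):
    frame = frame.clone().requires_grad_(True)
    logits = policy(frame)                    # action logits for the frame
    loss = F.cross_entropy(logits, target_action)
    loss.backward()
    # step *against* the gradient to make the target action more likely
    adv = frame - eps * frame.grad.sign()
    return adv.clamp(0.0, 1.0).detach()       # keep a valid pixel range
```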
Fermat proposed Fermat’s little theorem in 1640, but a proof was not officially published until 1736. In this thesis, we focus on different proofs of Fermat’s little theorem, such as the combinatorial proof by counting necklaces, multinomial proofs, proof by modular arithmetic, a dynamical-systems proof, and a group-theoretic proof. We also consider the generalizations of Fermat’s little theorem given by Euler and Laplace. Euler was the first to prove Fermat’s little theorem, and we go through three different proofs he gave. The theorem has many applications in mathematics and cryptography; we focus on its applications in cryptography, namely primality testing and public-key cryptography. A primality test determines whether a given number n is prime or composite. We also concentrate on the Fermat primality test and the Miller–Rabin primality test, which extends it. Finally, we discuss the most widely used public-key cryptosystem, the RSA algorithm, named after its developers R. Rivest, A. Shamir, and L. Adleman; the algorithm was invented in 1978 and depends heavily on Fermat’s little theorem.
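A minimal Python sketch of the two tests discussed, with the interesting cases being odd n > 3: the Fermat test checks a^(n-1) ≡ 1 (mod n), and Miller–Rabin additionally inspects the chain of squarings leading to 1.

```python
import random

def fermat_test(n, rounds=20):
    # probably prime if a^(n-1) = 1 (mod n) for random bases a
    return all(pow(random.randrange(2, n - 1), n - 1, n) == 1
               for _ in range(rounds)) if n > 3 else n in (2, 3)

def miller_rabin(n, rounds=20):
    if n < 4:
        return n in (2, 3)
    if n % 2 == 0:
        return False
    d, s = n - 1, 0
    while d % 2 == 0:
        d, s = d // 2, s + 1            # n - 1 = d * 2^s with d odd
    for _ in range(rounds):
        x = pow(random.randrange(2, n - 1), d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False                # composite witness found
    return True
```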
In machine learning, Learning Vector Quantization (LVQ) is a well-known supervised learning method. LVQ has been studied as a means to generate optimal reference vectors because of its simple and fast learning algorithm [12]. Different variants of LVQ are considered when training models for many classification tasks. In this thesis, two variants of LVQ, Generalized Matrix Learning Vector Quantization (GMLVQ) and Generalized Tangent Learning Vector Quantization (GTLVQ), are discussed. A transfer learning technique for the different LVQ variants is then implemented and visualized, and the results are compared across different datasets.
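For orientation, a hedged sketch of the GMLVQ dissimilarity that the first variant builds on, d(x, w) = (x - w)^T Λ (x - w) with Λ = Ω^T Ω learned alongside the prototypes (shapes and the decision rule below are illustrative):

```python
import numpy as np

def gmlvq_distance(x, w, omega):
    diff = omega @ (x - w)     # map the difference with Ω
    return float(diff @ diff)  # = (x - w)^T Ω^T Ω (x - w)

def classify(x, prototypes, labels, omega):
    d = [gmlvq_distance(x, w, omega) for w in prototypes]
    return labels[int(np.argmin(d))]   # nearest-prototype decision
```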
VQ-VAE is a successful generative model capable of lossy compression. It combines deep learning with vector quantization to achieve a discrete compressed representation of the data. We explore different vector quantization techniques within VQ-VAE, mainly neural gas and fuzzy c-means. Moreover, VQ-VAE contains a non-differentiable discrete mapping, which we examine, and we propose changes to the original VQ-VAE loss to fit the alternative vector quantization techniques.
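A hedged PyTorch sketch of the non-differentiable mapping in question: nearest-codeword quantization with the straight-through estimator and the two auxiliary loss terms of the original VQ-VAE (shapes and loss weighting are assumptions):

```python
import torch

def quantize(z_e, codebook):
    # z_e: (N, D) encoder outputs; codebook: (K, D) codewords
    idx = torch.cdist(z_e, codebook).argmin(dim=1)   # nearest codeword
    z_q = codebook[idx]
    z_q_st = z_e + (z_q - z_e).detach()     # gradients bypass the argmin
    commitment = ((z_e - z_q.detach()) ** 2).mean()
    codebook_loss = ((z_q - z_e.detach()) ** 2).mean()
    return z_q_st, commitment, codebook_loss
```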
Over the last two decades, rapid advances in digitization methods have put us on the cusp of the fourth industrial era. It is an era of connectivity and interactivity between various industrial processes that needs a new, trusted environment for exchanging and sharing information and data without relying on third parties. Blockchain technologies can provide such a trusted environment. This paper focuses on utilizing the blockchain and its characteristics to build machine-to-machine (M2M) communication and digital twin solutions. We propose a conceptual design for a system that uses smart contracts to construct digital twins for machines and products and executes manufacturing processes inside the blockchain. Our solution also employs the decentralized identifiers (DIDs) standard to provide self-sovereign digital identities for machines and products. To validate the approach and demonstrate its applicability, the paper presents an actual implementation of the proposed design in a simulated case study carried out with the help of a Fischertechnik factory model.
Blockchain technology has become an innovative, mature tool for digital transformation, disrupting more and more application areas in their business processes, values, or even economic models. This paper draws on more than 30 academic publications on prototypes and blockchain-based use cases for transacting certificates in the context of public education. The conceptual design and guiding ideas are reflected in the practical application development for the Federal Ministry of Education and Research ECHT! project within the showcase region WIR! in Mittweida and inform the research design. In this approach we applied agile methods and the current certificate process to propose a comprehensive disclosure of a new software prototype, including a three-layered architecture with multi-stakeholder components. The artefact instantiation contributes to the practical knowledge base of Information Systems Research, specifically for digital certificate processes from creation, searching, and proofing through to revocation, taking into account an existing IT landscape as well as the organizational hierarchy.
With the advancement of cryptography and emerging internet technology, electronic voting is gaining popularity, since it ensures ballot secrecy, voter security, and integrity. Many commercial startups and e-voting systems have been proposed, but due to lack of trust, privacy, and transparency, as well as hacking issues, many solutions have been suspended. Blockchain, along with cryptographic primitives, has emerged as a promising solution due to its transparent, immutable, and decentralized nature. In this paper, we summarize the properties that existing solutions should satisfy and explain some cryptographic primitives, such as ZKPs and ring signatures, along with their security limitations. We give a comprehensive review of some blockchain-based e-voting systems and discuss their strengths and weaknesses based on the given properties, with a comparison table.
Cryptocurrencies are characterized by high volatility, both in the short and the long term. Experienced traders exploit this through swing trading, profiting from price fluctuations. However, this requires closely observing and analyzing prices and entering trading positions at the right time. Only a few specialists who spend their time focusing on this, or optimized trading bots, are able to make profits continuously. The autradix protocol is a self-optimizing and self-learning parametric trading algorithm that analyzes price action in real time and adaptively optimizes the algorithm’s parameters to realize the user’s investment objective. Embedded in an adaptive genetic algorithm, possible parameterizations are simulated and the optimal ones for the investigated trading pairs are calculated. The generic trading protocol API enables coupling with various crypto exchanges and decentralized protocols. A smart-contract-based decentralized, trustless, and tokenized fund, controlled by a DAO, enables users to invest, operate trading agents, and participate in the profits generated according to their share.
While blockchain technology is still in an early stage of its development, it is already of surging economic importance. In the literature, blockchain is referred to as either a disruptive, institutional, foundational, or general purpose technology, and there is still no consensus about the economic theory that should apply for analyzing its economic effects. This article draws on use cases from the coffee supply chain to explore which theories could potentially apply to an emerging blockchain economy.
The wind energy sector is undergoing digitalization processes that span multi-tier supply chains of turbine components and wind farm maintenance, amongst others. In an industrial use case that includes Siemens Gamesa Renewable Energy, Vestas and APQP4Wind, the processes of producing, fastening, and servicing bolts in turbines are mapped to a digital model. The model follows the lifetime of turbine bolts from the manufacturing phase, to fastening in turbines and maintenance, until their replacement and recycling. The development of the digital model is iteratively addressed in a design science research approach, as the authors actively contribute to the project. Distributed ledgers (DLs) support the notary documentation of the bolts and turbines, from their registration phase through the assembly, technical service verification, and recycling phases. The immutable and decentralized nature of DLs secures the data against tampering and prevents unilateral changes by engaging the service stakeholders and component providers in a blockchain consortium.
Global challenges like climate change, food security, and infectious diseases such as the COVID-19 pandemic are nearly impossible to tackle when established experts and upstart innovators work in silos. If research organizations, governments, universities, NGOs, and the private sector could collaborate on these challenges more easily, lasting solutions would certainly come more quickly. Aligned with the United Nations’ Sustainable Development Goals, SAIRA connects key players in different arenas: scientists and engineers at research and technology organizations (RTOs) looking to collaborate on sustainable development projects, companies seeking R&D support to tackle their most challenging problems, and startups with innovative ideas and a desire to scale. The platform is a blockchain-secured open innovation platform, anchored on the Max Planck Digital Library's blockchain network bloxberg, which assures the authenticity and integrity of all user-generated content and collaboration processes.
Mapping identities, digital assets, and people’s profiles on the internet has been gaining much traction in the blockchain cosmos lately. The new technology is currently forming architectures that will pave new ways towards fundamental mechanisms for interacting in a decentralized, user-centered manner; these schemes are often declared the next generation of the web. The article shows how the internet has evolved in managing identities, what problems arose, and how new data architectures help build applications on top of privacy rights. Both technological and ethical perspectives are considered to answer which guidelines should be followed by the upcoming branch of decentralized services, and what we can learn from historical schemes regarding their privacy, accounting, and user data.
Blockchain and other distributed ledger technologies are evolving into enabling infrastructures for innovative ICT solutions. Numerous features, such as decentralization, programmability, and immutability of data, have led to a multitude of use cases that range from cryptocurrencies and tracking and tracing to automated business protocols and decentralized autonomous systems. For organizations that seek blockchain adoption, the overwhelming spectrum of potential application areas calls for guidance that reduces complexity and supports the development of blockchain-based concepts. This paper introduces a classification approach to provide design and implementation guidance that goes beyond current textbook classifications. As an outcome, a typology for management and business architects is developed, before the paper concludes with an instantiation of existing use cases and a discussion of their classes.
Bitcoin's energy consumption and social costs in relation to its capacity as a settlement layer
(2021)
Bitcoin runs on energy. The decentralized network’s energy consumption has prompted multifaceted discussions about its efficiency and environmental impact. To put Bitcoin’s energy consumption into perspective, we propose to relate (a) the energy consumption in TWh and (b) the resulting social costs in the form of carbon emissions to the dollar value settled on the Bitcoin network. Both metrics allow us to relate and quantify the capacity of Bitcoin as a settlement layer against the network’s energy consumption and the resulting carbon emissions, or social costs. We find that in early 2021 Bitcoin (a) settles between $2,333 and $7,555 for each dollar spent on energy and (b) that, on average, a dollar settled on the Bitcoin blockchain causes social costs of between 0.007% and 0.01%, depending on the estimated energy consumption converted into the costs of carbon emissions. These results help to assess the efficiency, cost, and sustainability of Bitcoin and may allow a comparison of Bitcoin with existing settlement base layers such as Fedwire or gold.
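A hedged back-of-the-envelope version of metric (a); every number below is a placeholder, not an estimate from the paper:

```python
# Dollars settled per dollar spent on energy (all inputs assumed).
energy_twh = 100          # assumed annual network consumption in TWh
usd_per_mwh = 50          # assumed average electricity price in $/MWh
settled_usd = 3.0e12      # assumed annual settled volume in $

energy_cost_usd = energy_twh * 1e6 * usd_per_mwh   # TWh -> MWh -> $
print(settled_usd / energy_cost_usd)  # $ settled per $ spent on energy
```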
We investigate the folding and thermodynamic stability of a tertiary contact of baker's yeast ribosomal ribonucleic acid (rRNA), which is supposed to be essential for the maturation process of ribosomes in eukaryotes at lower temperatures [1]. Ribosomes are cellular machines essential for all living organisms. RNA is at the center of these machines and responsible for the translation of genetic information into proteins [2,3]. Only recently, the rRNA tertiary contact of interest was discovered in Zurich by the research group of Vikram Govind Panse. Gerhardy et al. [1] showed in vitro that, within the 60S pre-ribosome under defined metal ion concentrations, the tertiary contact becomes visible between a GAAA tetraloop and a kissing-loop motif. Our aim is now to understand this RNA structure, especially the formation of the rRNA tertiary contact, in terms of thermodynamics and kinetics under various experimental conditions, such as temperature and the metal ion concentrations of K(I), Na(I) and Mg(II). To this end, we use optical spectroscopy, namely UV/VIS spectroscopy and ensemble Förster (fluorescence) resonance energy transfer (FRET) folding studies. Our findings will help to further characterize this newly discovered ribosomal RNA contact and to elucidate its function within the ribosomal maturation process.
Several algorithms have been proposed for testing series-parallel graphs in linear time. We give alternative algorithms for testing series-parallel graphs and computing their tree decompositions and independence number when the input is an undirected biconnected series-parallel graph; the algorithms run in (approximately) linear time.
In bioinformatics, one important task is to distinguish between native and mirror protein models based on structural information, which can be obtained from the atomic coordinates of the protein backbone. This thesis tackles the problem of distinguishing these conformations by looking at the statistics of the dihedral-angle distribution of the protein backbone, visualized in Ramachandran plots. By means of an interpretable machine learning classification method, Generalized Matrix Learning Vector Quantization, we are able to distinguish between native and mirror protein models with high accuracy. Further, the classifier model supplies supplementary information on the distributional regions important for the distinction, such as α-helices and β-strands.
A protein is a large molecule that consists of a vast number of atoms; one can only imagine the complexity of such a molecule. A protein is a series of amino acids that bind to each other in specific sequences known as peptide chains. Proteins fold into three-dimensional conformations (the so-called native structure) to perform their functions. However, not every protein folds into the correct structure, as mutations occur in amino acid sequences, and such mutations cause many protein misfolding diseases. Protein folding is thus a serious problem in biology. Predicting changes in protein stability free energy upon amino acid mutation (ΔΔG) helps to better comprehend the driving forces underlying how proteins fold into their native structures. Measuring the difference in Gibbs free energy therefore provides more insight into how protein folding occurs, and this knowledge might prove beneficial in designing new drugs to treat protein-misfolding-related diseases. The protein energy profile aids in understanding the sequential, structural, and functional relationships by assigning an energy profile to a protein structure. Additionally, measuring the change in the protein energy profile upon mutation (ΔΔE) using an approach derived from statistical physics will help us to comprehend protein structure thoroughly. In this work, we attempt to show that ΔΔE values approximate ΔΔG values, which could lead future studies to consider the energy profile as good a predictor of protein binding affinity as the Gibbs free energy in tackling the protein folding problem.
A classical topic in the theory of random graphs is the probability of at least one isolated vertex in a given random graph. An isolated node has a huge impact on social networks, which can be modeled by random graphs. We present a distribution for the number of isolated vertices using the probability generating function. We discuss the relationship between isolated edges and extended cut polynomials and extended matching polynomials using the principle of inclusion–exclusion. We introduce an algorithm based on colored graphs for general graphs and apply it to the components of a graph as well. Finally, we implement the idea on special classes of graphs such as cycles, bipartite graphs, paths, and others, and discuss a recursive procedure based on analogous coloring rules for ladder and fan graphs.
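As a hedged worked instance of the kind of quantity involved, for the Erdős–Rényi model G(n, p) (the thesis's exact setting may differ):

```latex
% A fixed vertex is isolated iff all n-1 incident edges are absent:
\Pr[v \text{ isolated}] = (1-p)^{n-1},
\qquad
\mathbb{E}[X] = n\,(1-p)^{n-1},
% where X counts isolated vertices; the full distribution of X can be
% recovered from its probability generating function via
% inclusion-exclusion over vertex subsets.
```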
The automatic comparison of RNA/DNA, or rather nucleotide, sequences is a complex task requiring careful design due to the computational complexity. While alignment-based models suffer from computational costs in time, alignment-free models have to deal with appropriate data preprocessing and consistently designed mathematical data comparison. This work deals with the latter strategy. In particular, a systematic categorization is proposed, which emphasizes two key concepts that have to be combined for a successful comparison analysis: 1) the data transformation, comprising adequate mathematical sequence coding and feature extraction, and 2) the subsequent (dis-)similarity evaluation of the transformed data by means of problem-specific but mathematically consistent proximity measures. Respective approaches from different categories of the introduced scheme are examined with regard to their suitability for distinguishing natural RNA virus sequences from artificially generated ones encompassing varying degrees of biological feature preservation. The challenge in this application is the limited additional biological information available, such that the decision has to be made solely on the basis of the sequences and their inherent structural characteristics. To address this, the present work focuses on interpretable, dissimilarity-based classification models of machine learning, namely variants of Learning Vector Quantizers. These methods are known to be robust and highly interpretable and therefore allow evaluation of the applied data transformations together with the chosen proximity measure with respect to the given discrimination task. First analysis results are provided and discussed, serving as a starting point for more in-depth analysis of this problem in the future.
Smart ultrafast laser processing with rotating beam – Laser micro drilling, cutting and turning
(2021)
Current micro drilling, cutting and turning processes are mainly based on EDM, milling, stamping, honing or grinding. All these technologies use a tool with a predefined geometry that is transferred to the workpiece. In contrast, the laser is a highly flexible tool, which can adapt its size very quickly by changing only a software setting. Thanks to the efforts in laser development in recent years, stable ultrafast lasers with sufficient average power and high repetition rates have become industrially available. To use as many pulses as possible, cost-efficient production demands innovative processes and machining setups with fast axis movement and special optics for beam manipulation. GFH has developed a helical drilling optics that rotates the beam at up to 30,000 rpm in a very precise circle and furthermore allows the diameter and the incidence angle to be adjusted. This enables the laser to be used for high-precision drilling and cutting as well as for micro turning processes.
Pulsed laser processing of vacuum component surfaces is a promising method for electron cloud mitigation in particle accelerators. By generating a hierarchically structured surface, the escape probability of secondary electrons is reduced. The choice of laser treatment parameters, such as laser power, scanning speed and line distance, has an influence on the resulting surface morphology as well as on its performance. The impact of the processing parameters on the surface properties of copper is investigated by Secondary Electron Yield (SEY) measurements, Scanning Electron Microscopy (SEM), ablation depth measurements in an optical microscope and particle release analysis. Independent of the laser wavelength (532 nm and 1064 nm), it was found that the surface morphology changes when the processing parameters are varied. The ablation depth increases and the SEY decreases with increasing laser fluence. The final application requires the capability to treat tens of meters of vacuum pipes. The limiting factors of this type of surface treatment for applicability in particle accelerators are discussed.
Increasing speed in laser processing is driven by the development of high-power lasers into ranges of more than 1 kW. Additionally, a proper distribution of this laser power is required to achieve high-quality processing results. In the case of high pulse repetition rates, a proper distribution of the pulses can be obtained by ultrafast beam deflection in the range of several 100 m/s. A two-dimensional polygon mirror scanner has been used to distribute a nanosecond pulsed laser with up to 1 kW average power at a wavelength of 1064 nm for multi-pass laser engraving. The pulse duration of this laser can be varied between 30 ns and 240 ns, and the pulse repetition rate is set between 1 and 4 MHz. The depth information is encoded in greyscale bitmaps, which are used to modulate the laser during scanning according to the lateral position and the depth. The process allows high processing rates and thus high throughput.
Beam shaping and splitting with diffractive optics for high performance laser scanning systems
(2021)
Diffractive optical elements (DOEs) enable novel high performance and process-tailored scanning strategies for galvanometer-based scan heads. Here we present several such concepts integrating DOEs with laser scanners and the respective application use cases. Beam shaping DOEs providing a homogeneous fluence over a custom defined profile, such as a rectangular Top-Hat, enable increased process quality in Laser-Induced Forward Transfer (LIFT) compared to the Gaussian beam of the laser source. We show that aberrations which occur over the necessary large wafer-sized image field can be eliminated through the use of a synchronous XY-stage motion. Another application that benefits from the use of DOEs is laser drilling. Drilling in display and electronics manufacturing demands high throughput that can only be achieved through the use of beam splitting DOEs for parallel processing. To this end, the joint MULTISCAN project is developing a variable multi-beam tool capable of scanning and switching each individual beamlet for increased control.
The shape-memory alloy Nitinol, a nickel-titanium alloy, is widely used in actuator and medical applications. However, the connection of a flange to a rod is a critical point. Laser rod end melting enables material accumulation to generate a preform at the end of a rod, followed by die forming, so that the flange can be generated. This process has been successfully applied to 1.4301 steel. This study aims to investigate laser rod end melting of shape-memory Nitinol with regard to the resulting surface quality of the preforms. The results showed that spherical preforms could be generated without visible surface discoloration due to oxidation. Different scan rates produced different solidification conditions, which led to significantly different surface structures. These findings show that laser rod end melting can in principle be applied to Nitinol to generate preforms for flanges, whereby the surface quality depends on the solidification conditions.
We demonstrate a thulium-based fiber amplifier delivering pulses tunable between <120 fs and 2 ps duration at up to 228 μJ of pulse energy, at a center wavelength of 1940 nm and a 500 kHz repetition rate. Due to its excellent long-term stability, this system proves the ability of this technology to be integrated into ultrafast material processing machines.
The emerging Internet of Things (IoT) technology interconnects billions of embedded devices with each other. These embedded devices are internet-enabled and collect, share, and analyze data without any human intervention. The integration of IoT technology into the human environment, such as industry, agriculture, and the health sector, is expected to improve the way of life and business. The emerging technology, however, faces challenges and numerous security threats. On these grounds, it is imperative to strengthen the security of IoT technology to avoid any compromise that affects human life. In contrast to implementing traditional cryptosystems on IoT devices, an elliptic curve cryptosystem (ECC) is used to meet the limited resources of the devices. ECC is an elliptic-curve-based public-key cryptography which provides equivalent security with shorter key sizes compared to other cryptosystems such as Rivest–Shamir–Adleman (RSA). The security of an ECC hinges on the hardness of solving the elliptic curve discrete logarithm problem (ECDLP). ECC is faster and easier to implement and also consumes less power and bandwidth. Due to these benefits, ECC is incorporated in internationally recognized standards for lightweight applications.
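A hedged sketch of the kind of lightweight key agreement the abstract alludes to, using the Python `cryptography` package and the P-256 curve (curve choice and KDF parameters are assumptions):

```python
# ECDH key agreement: short keys (P-256 ~ RSA-3072 security level).
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

a = ec.generate_private_key(ec.SECP256R1())
b = ec.generate_private_key(ec.SECP256R1())
shared_a = a.exchange(ec.ECDH(), b.public_key())
shared_b = b.exchange(ec.ECDH(), a.public_key())
assert shared_a == shared_b            # both sides derive the same secret
key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
           info=b"handshake").derive(shared_a)
```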
We present dimensionality reduction methods such as autoencoders and t-SNE for visualizing high-dimensional data as a two-dimensional map. In this thesis, we first implement basic and deep autoencoders using the breast cancer and mushroom datasets. Next, we apply t-SNE, a further dimensionality reduction method, to the same datasets. The resulting visualizations are documented in the experiments section of the thesis, and the dimensionality reduction techniques are also evaluated with respect to classification and clustering. The visualization and evaluation results of t-SNE are significantly better than those of the other dimensionality reduction techniques.
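A minimal sketch of the t-SNE step on one of the named datasets, using scikit-learn (hyperparameters are assumptions, not the thesis's settings):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.manifold import TSNE
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X2 = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(
    StandardScaler().fit_transform(X))
print(X2.shape)   # (569, 2) map for plotting, colored by the labels y
```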
This bachelor thesis discusses the current trends and potential applications of digitalization in the service industry. Despite today's surging demand for digitalization in all industries, there are branches of the service industry where digitalization has yet to be exploited to its full potential. However, it is difficult to decide which branches of the industry should be fully digitized and which only partially. The result of this work should therefore facilitate the process of applying digitization in consulting services, where face-to-face human interaction has been key to the industry for years. For this purpose, essential factors to be taken into account were identified, which are pursued through the analysis, the specification of the system requirements, and the performance of a utility value analysis.
We propose a method for edge detection in images with multiplicative noise based on the Ant Colony System (ACS). To adapt the Ant Colony System algorithm to multiplicative noise, the global pheromone matrix is computed from the coefficient of variation. We carried out a performance comparison of the edge detection Ant Colony System algorithm against several techniques; the best results were obtained with the gradient and the coefficient of variation.
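A hedged sketch of the coefficient-of-variation computation that could seed such a pheromone matrix (window size and normalization are assumptions):

```python
# Per-pixel coefficient of variation (std/mean) over a local window;
# high values hint at edges under multiplicative (speckle-like) noise.
import numpy as np
from scipy.ndimage import uniform_filter

def coefficient_of_variation(img, size=3, eps=1e-9):
    img = img.astype(float)
    mean = uniform_filter(img, size)
    mean_sq = uniform_filter(img ** 2, size)
    var = np.maximum(mean_sq - mean ** 2, 0.0)   # local variance
    return np.sqrt(var) / (mean + eps)
```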
In this work, novel proteases from psychrotolerant bacterial strains were isolated and characterized with respect to their biochemical properties. Furthermore, genes of S8-family proteases were amplified, and differences in their amino acid sequences could be linked to the biochemical properties of the proteases.
This master thesis covers two main topics, the sharing economy and risk management, and combines them within this paper in order to provide a methodology (with Uber chosen as an example) for how a risk management process may be applied to a sharing economy business, as well as which types of risks are of special relevance for such businesses.
A relatively new research field of neuroscience, called connectomics, aims to achieve a full understanding and mapping of the neural circuits and fine neuronal structures of the nervous system in a variety of organisms. This detailed information will provide insight into how our brain is influenced by different genetic and psychiatric diseases, how memory traces are stored, and how ageing influences our brain structure. It is beyond question that new methods for data acquisition will produce large amounts of neuronal image data. This data will reach the zettabyte range and is impossible to annotate manually for visualization and analysis. Nowadays, machine learning algorithms, and especially deep convolutional neural networks, are heavily used in medical imaging and computer vision, which brings the opportunity of designing fully automated pipelines for image analysis. This work presents a new automated workflow based on three major parts: image processing using consecutive deep convolutional networks, a pixel-grouping step called connected components, and 3D visualization via neuroglancer, to achieve a dense three-dimensional reconstruction of neurons from EM image data.
In an era of global climate change and fast-growing cities, local governments are in urgent need of sustainable urban growth concepts to secure a liveable and prosperous urban future. Against this background, the smart city notion has progressively gained popularity as an urban development concept that relies heavily on technology and the use of urban data to foster sustainable urban growth. However, so far the understanding of the smart city term is ambiguous, and little scientific research has been done on developing comprehensive conceptual frameworks to support local governments in the making of smarter cities. This paper aims at presenting the current state of the art of smart city research in order to support the making of smart city best practices and to promote a comprehensive understanding of the smart city notion. In doing so, the role of technology in the making of smarter cities and critical success factors in transforming cities are elaborated, following the methodological approach of a multidimensional conceptual framework. The research findings and an expert interview with a representative of the state capital then serve for the assessment of the weak points and best practices in the smart city pursuit of the German city of Munich, providing urban policymaking with valuable insights and fostering the development of a comprehensive smart city conceptualization.
In this work, a second version of the Python implementation of an algorithm called Probabilistic Regulation of Metabolism (PROM) was created and applied to the metabolic model iSynCJ816 for the organism Synechocystis sp. PCC 6803. A cross-validation was performed to determine the minimal amount of expression data needed to produce meaningful results with the PROM algorithm. The failed reproduction of the results of a method called Integrated and Deduced Regulation of Metabolism (IDREAM) is documented, and causes for the failed reproduction are discussed.
This master thesis covers the formation of customer relationships in the IT outsourcing market, using the example of the “ABC” company. Most works related to IT outsourcing cover the problems of implementing IT services and the process of providing them to customers, and almost all issues are covered from the perspective of consumers. Thus, the problems and results of outsourcing providers of IT services remain almost uncovered. This master thesis reveals the specific features of the IT outsourcing business in Belarus and develops an approach to the formation and construction of a system of relationships between the company and its clients as a source of increased competitiveness.
At a global level, different studies disclose that transport systems are responsible for 25% of CO2 emissions. In the context of sustainable mobility, one of the short-term challenges is the research into and improvement of alternative fuels, which should allow a fast decrease in greenhouse gas generation through sustainable means of transport. In this sense, green hydrogen can play a fundamental role. Green hydrogen is the basis for producing synthetic fuels, which can replace oil and its derivatives. Synthetic fuels, or e-fuels, are hydrocarbons produced from carbon dioxide (CO2) and green hydrogen (H2) as the only raw materials. H2 or e-fuels could be used in many sectors (manufacturing, residential, transportation, mining, and other industries). In this study, different applications of hydrogen are evaluated by techno-economic analysis. The main variable that affects the production of hydrogen and its derivatives is the cost of electricity. Considering Chile's renewable energy potential, it is feasible to develop green hydrogen production in Chile as an energy vector, which would be technically and economically viable, together with the environmental benefits.
In this paper, we designed, implemented, and tested a special surveillance camera system based on a combination of classical image processing algorithms. One sub-objective of the system is to track experimental vehicles driving on defined trajectories (rails) in real time. Furthermore, it analyzes the scene to collect additional vehicle- and rail-related information. The system then uses the gathered data to reach its main objective, which consists of independently predicting vehicle collisions. To this end, we propose a hybrid method for detecting and tracking ATLAS vehicles efficiently. To detect a vehicle at the beginning of the video, periodically every n frames, and whenever a tracked vehicle has been lost, we use histogram back-projection; in between, a kernelized correlation filter is used to track the detected vehicles. Combining these two methods provides one of the best trade-offs between accuracy and speed, even on a single processing core. The proposed method achieves the best performance compared with three different approaches on a custom dataset.
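A hedged OpenCV sketch of the two building blocks (parameter choices are illustrative; the KCF tracker's location in the cv2 namespace varies by OpenCV version, hence the fallback):

```python
import cv2

def hue_backprojection(frame_bgr, roi_hist):
    # roi_hist: hue histogram of the vehicle template (assumed normalized)
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    return cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)

def make_kcf_tracker():
    try:
        return cv2.TrackerKCF_create()          # classic namespace
    except AttributeError:
        return cv2.legacy.TrackerKCF_create()   # opencv-contrib >= 4.5.1
```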
Standard assembly time is an important piece of data in product development that is used to compare different product variants or manufacturing variants. In the presented approach, standard time is created with the use of a decision tree regarding standard manual and machine-manual operations, taking into consideration product characteristics and typical tools, equipment and layout. The analysed features include, among others: information determined during product development, such as product structure, parts characteristics (e.g. weight, size), connection type, as well as the information determined during assembly planning: tools (e.g. hand screw driver, power screw driver, pliers), equipment (e.g. press, heater), workstation layout (e.g. distance, way of feeding). The object-attribute-value (OAV) framework was applied for the assembly characteristic. An example of the decision tree application to predict standard assembly time was presented for a mechanical subassembly. The case study was dedicated to standard time prediction for a bearing assembly. The presented approach is particularly important for the enterprises which offer customized products.
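A hedged sketch of the kind of model described, a decision tree regressor over illustrative OAV-style assembly features (all feature names and values below are invented for illustration):

```python
# Predicting standard assembly time from object-attribute-value features.
import numpy as np
from sklearn.preprocessing import OrdinalEncoder
from sklearn.tree import DecisionTreeRegressor

conn_tool = [["screw", "power_driver"], ["screw", "hand_driver"],
             ["press_fit", "press"], ["snap", "none"]]
weight_kg = [[0.2], [0.2], [1.5], [0.05]]
time_s = [12.0, 20.0, 35.0, 4.0]           # observed standard times

enc = OrdinalEncoder().fit(conn_tool)
X = np.hstack([enc.transform(conn_tool), weight_kg])
tree = DecisionTreeRegressor(max_depth=3).fit(X, time_s)

query = np.hstack([enc.transform([["screw", "power_driver"]]), [[0.3]]])
print(tree.predict(query))                  # predicted standard time [s]
```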
Sensor fusion is an important and crucial topic in many industrial applications. One of the challenging problems is to find an appropriate sensor combination for the dedicated application, or to weight the sensors' information adequately. In our contribution, we focus on the application of the sensor fusion concept in combination with distance-based learning for object classification purposes. The developed machine learning model has a bi-functional architecture, which learns, on the one hand, the discrimination of the data regarding their classes and, on the other hand, the importance of the single signals, i.e., the contribution of each sensor to the decision. We show that the resulting bi-functional model is interpretable, sparse, and simple to integrate into many standard artificial neural networks.
Learning Vector Quantization (LVQ) methods have been popular choices of classification models ever since their introduction by T. Kohonen in the 1990s. These days, LVQ is combined with Deep Learning methods to provide powerful yet interpretable machine-learning solutions to some of the most challenging computational problems.
However, techniques to model recurrent relationships in the data using prototype methods still remain quite unsophisticated. In particular, we are not aware of any modification of LVQ that allows the input data to have different lengths. Needless to say, such data is abundant in today's digital world and demands new processing techniques to extract useful information. In this paper, we propose the use of the Siamese architecture not only to model recurrent relationships within the prototypes but also to handle prototypes of various dimensions simultaneously.
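A hedged PyTorch sketch of the core idea: a shared (Siamese) encoder maps sequences of any length to a fixed-size embedding in which distances to prototypes can be computed; the GRU choice and sizes are assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class SeqEncoder(nn.Module):
    def __init__(self, in_dim, emb_dim):
        super().__init__()
        self.gru = nn.GRU(in_dim, emb_dim, batch_first=True)

    def forward(self, x):            # x: (batch, time, in_dim), any time
        _, h = self.gru(x)
        return h.squeeze(0)          # fixed-size embedding (batch, emb_dim)

enc = SeqEncoder(4, 16)
a, b = torch.randn(1, 7, 4), torch.randn(1, 12, 4)   # different lengths
d = torch.norm(enc(a) - enc(b))     # distance usable in an LVQ-style cost
```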